BioVLAB-MMIA: a cloud environment for microRNA and mRNA integrated analysis (MMIA) on Amazon EC2.
Lee, Hyungro; Yang, Youngik; Chae, Heejoon; Nam, Seungyoon; Choi, Donghoon; Tangchaisin, Patanachai; Herath, Chathura; Marru, Suresh; Nephew, Kenneth P; Kim, Sun
2012-09-01
MicroRNAs, by regulating the expression of hundreds of target genes, play critical roles in developmental biology and the etiology of numerous diseases, including cancer. As a vast amount of microRNA expression profile data is now publicly available, the integration of microRNA expression data sets with gene expression profiles is a key research problem in life science research. However, genome-wide microRNA-mRNA (gene) integration currently requires sophisticated, high-end informatics tools and significant expertise in bioinformatics and computer science to carry out the complex integration analysis. In addition, increased computing infrastructure capabilities are essential in order to accommodate large data sets. In this study, we have extended the BioVLAB cloud workbench to develop an environment for the integrated analysis of microRNA and mRNA expression data, named BioVLAB-MMIA. The workbench facilitates computations on Amazon EC2 and S3 resources orchestrated by the XBaya Workflow Suite. The advantages of BioVLAB-MMIA over the web-based MMIA system include: 1) it is readily extended as new computational tools become available; 2) it is easily modified by re-configuring graphic icons in the workflow; 3) on-demand cloud computing resources can be used on an "as needed" basis; 4) distributed orchestration supports complex and long-running workflows asynchronously. We believe that BioVLAB-MMIA will be an easy-to-use computing environment for researchers who plan to perform genome-wide microRNA-mRNA (gene) integrated analysis tasks.
Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources
NASA Astrophysics Data System (ADS)
Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.
2011-12-01
Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand", as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a university, and we conclude that it is most cost-effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage occur.
Marathe, Aniruddha P.; Harris, Rachel A.; Lowenthal, David K.; ...
2015-12-17
The use of clouds to execute high-performance computing (HPC) applications has greatly increased recently. Clouds provide several potential advantages over traditional supercomputers and in-house clusters. The most popular cloud is currently Amazon EC2, which provides fixed-cost and variable-cost, auction-based options. The auction market trades lower cost for potential interruptions that necessitate checkpointing; if the market price exceeds the bid price, a node is taken away from the user without warning. We explore techniques to maximize performance per dollar given a time constraint within which an application must complete. Specifically, we design and implement multiple techniques to reduce expected cost by exploiting redundancy in the EC2 auction market. We then design an adaptive algorithm that selects a scheduling algorithm and determines the bid price. We show that our adaptive algorithm executes programs up to seven times cheaper than using the on-demand market and up to 44 percent cheaper than the best non-redundant, auction-market algorithm. We extend our adaptive algorithm to incorporate application scalability characteristics for further cost savings. In conclusion, we show that the adaptive algorithm informed with scalability characteristics of applications achieves up to 56 percent cost savings compared to the expected cost for the base adaptive algorithm run at a fixed, user-defined scale.
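The tradeoff the abstract describes, cheaper auction-market (spot) nodes versus the risk of losing work to interruptions, can be illustrated with a toy expected-cost model. All prices, probabilities, and overheads below are assumed for illustration and are not the paper's algorithm:

```python
# Toy cost model for the EC2 spot (auction) market vs. on-demand pricing.
# Every numeric input here is a hypothetical assumption, not a measured value.

def expected_spot_cost(base_hours, spot_price, p_interrupt, restart_overhead):
    """Expected cost of a checkpointed job on spot instances.

    Each interruption (probability p_interrupt per running hour) forces a
    restart from the last checkpoint, costing restart_overhead extra hours.
    """
    expected_interruptions = p_interrupt * base_hours
    total_hours = base_hours + expected_interruptions * restart_overhead
    return spot_price * total_hours

def on_demand_cost(base_hours, on_demand_price):
    """Cost of the same job on uninterruptible on-demand instances."""
    return on_demand_price * base_hours

# A 100-hour job: spot at $0.03/h with a 5% hourly interruption risk and a
# 2-hour redo penalty, versus on-demand at $0.10/h.
spot = expected_spot_cost(100, 0.03, 0.05, 2.0)   # 0.03 * (100 + 5 * 2)
od = on_demand_cost(100, 0.10)
print(spot, od)
```

Under these assumed numbers the spot market wins despite interruptions; as `p_interrupt` or `restart_overhead` grows, the balance shifts back toward on-demand, which is the decision the paper's adaptive algorithm automates.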
Di Tommaso, Paolo; Orobitg, Miquel; Guirado, Fernando; Cores, Fernado; Espinosa, Toni; Notredame, Cedric
2010-08-01
We present the first parallel implementation of the T-Coffee consistency-based multiple aligner. We benchmark it on the Amazon Elastic Cloud (EC2) and show that the parallelization procedure is reasonably effective. We also conclude that for a web server with moderate usage (10K hits/month) the cloud provides a cost-effective alternative to in-house deployment. T-Coffee is a freeware open source package available from http://www.tcoffee.org/homepage.html
Uncover the Cloud for Geospatial Sciences and Applications to Adopt Cloud Computing
NASA Astrophysics Data System (ADS)
Yang, C.; Huang, Q.; Xia, J.; Liu, K.; Li, J.; Xu, C.; Sun, M.; Bambacus, M.; Xu, Y.; Fay, D.
2012-12-01
Cloud computing is emerging as the future infrastructure for providing computing resources to support and enable scientific research, engineering development, and application construction, as well as workforce education. On the other hand, there is considerable doubt about the readiness of cloud computing to support a variety of scientific research, development, and education activities. This research, a project funded by NASA SMD, investigates through holistic studies how ready cloud computing is to support geosciences. Four applications with different computing characteristics, including data, computing, concurrent, and spatiotemporal intensities, are used to test the readiness of cloud computing to support geosciences. Three popular and representative cloud platforms, Amazon EC2, Microsoft Azure, and NASA Nebula, as well as a traditional cluster, are utilized in the study. Results illustrate that the cloud is ready to some degree, but more research is needed to fully realize the cloud benefits as advertised by many vendors and defined by NIST. Specifically, 1) most cloud platforms can stand up new computing instances, i.e., a new computer, in a few minutes as envisioned, and are therefore ready to support most computing needs in an on-demand fashion; 2) load balancing and elasticity, a defining characteristic, is ready in some cloud platforms, such as Amazon EC2, to support bigger jobs that need responses in minutes, while others are not yet ready to support elasticity and load balancing well; all cloud platforms need further research and development to support real-time applications at the subminute level; 3) the user interface and functionality of cloud platforms vary widely; some are professional and well supported/documented, such as Amazon EC2, while others need significant improvement before the general public can adopt cloud computing without professional training or knowledge of computing infrastructure; 4) the security is a big concern in
NASA Astrophysics Data System (ADS)
Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock
2017-01-01
The suites of numerical models used for simulating climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach, i.e. carrying out climate model simulations on commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Service (AWS) EC2, the cloud computing environment by Amazon.com, Inc. StarCluster is used to create virtual computing cluster on the AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent for the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster environment up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50% and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup becomes nearly unchanged.
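The reported CESM scaling behavior (more than a 50% wall-clock reduction from 16 to 64 cores, nearly linear, with diminishing returns beyond 64 cores) corresponds to the standard speedup and parallel-efficiency calculations. The wall-clock times below are illustrative assumptions consistent with that description, not figures from the paper:

```python
# Speedup and parallel efficiency for a strong-scaling run.
# The example times are hypothetical, chosen only to match the qualitative
# description: >50% reduction from 16 to 64 cores, nearly linear scaling.

def speedup(t_base, t_scaled):
    """How many times faster the scaled run is than the baseline run."""
    return t_base / t_scaled

def parallel_efficiency(t_base, n_base, t_scaled, n_scaled):
    """Speedup divided by the ideal speedup implied by the core-count ratio."""
    return speedup(t_base, t_scaled) / (n_scaled / n_base)

t16, t64 = 40.0, 12.0   # assumed hours of wall-clock per simulated year
s = speedup(t16, t64)                       # ~3.3x on 4x the cores
e = parallel_efficiency(t16, 16, t64, 64)   # ~0.83, i.e. nearly linear
print(s, e)
```

An efficiency near 1.0 means nearly linear scaling; the drop past 64 cores reported in the abstract corresponds to this efficiency falling as communication latency starts to dominate.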
Fienen, Michael N.; Kunicki, Thomas C.; Kester, Daniel E.
2011-01-01
This report documents cloudPEST, a Python module with functions to facilitate deployment of the model-independent parameter estimation code PEST on a cloud-computing environment. cloudPEST makes use of low-level, freely available command-line tools that interface with the Amazon Elastic Compute Cloud (EC2™) and that are unlikely to change dramatically. This report describes the preliminary setup for both Python and EC2 tools and subsequently describes the functions themselves. The code and guidelines have been tested primarily on the Windows® operating system but are extensible to Linux®.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giangrande, Scott E.; Feng, Zhe; Jensen, Michael P.; ...
2017-12-06
Routine cloud, precipitation and thermodynamic observations collected by the Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF) and Aerial Facility (AAF) during the 2-year US Department of Energy (DOE) ARM Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) campaign are summarized. These observations quantify the diurnal to large-scale thermodynamic regime controls on the clouds and precipitation over the undersampled, climatically important Amazon basin region. The extended ground deployment of cloud-profiling instrumentation enabled a unique look at multiple cloud regimes at high temporal and vertical resolution. This longer-term ground deployment, coupled with two short-term aircraft intensive observing periods, allowed new opportunities to better characterize cloud and thermodynamic observational constraints as well as cloud radiative impacts for modeling efforts within typical Amazon wet and dry seasons.
Bao, Shunxing; Damon, Stephen M; Landman, Bennett A; Gokhale, Aniruddha
2016-02-27
Adopting high-performance cloud computing for medical image processing is a popular trend given the pressing needs of large studies. Amazon Web Services (AWS) provides reliable, on-demand, and inexpensive cloud computing services. Our research objective is to implement an affordable, scalable and easy-to-use AWS framework for the Java Image Science Toolkit (JIST). JIST is a plugin for Medical-Image Processing, Analysis, and Visualization (MIPAV) that provides a graphical pipeline implementation allowing users to quickly test and develop pipelines. JIST is DRMAA-compliant, allowing it to run on portable batch system grids. However, as new processing methods are implemented and developed, memory may often be a bottleneck not only for lab computers, but also possibly for some local grids. Integrating JIST with the AWS cloud alleviates these possible restrictions and does not require users to have deep knowledge of programming in Java. Workflow definition/management and cloud configuration are two key challenges in this research. Using a simple unified control panel, users can set the number of nodes and select from a variety of pre-configured AWS EC2 nodes with different numbers of processors and amounts of memory. We configured Amazon S3 storage to be mounted by pay-for-use Amazon EC2 instances, so that S3 storage is recognized as a shared cloud resource. The Amazon EC2 instances come with all packages necessary to run JIST pre-installed. This work presents an implementation that facilitates the integration of JIST with AWS. We describe theoretical cost/benefit formulae for deciding between local serial execution and cloud computing, and apply this analysis to an empirical diffusion tensor imaging pipeline.
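The cost/benefit comparison between local serial execution and parallel cloud execution mentioned in the abstract can be sketched as a simple break-even calculation. The job counts, runtimes, node count, and $0.10/node-hour price below are illustrative assumptions, not the paper's formulae:

```python
import math

# Toy comparison of serial local execution vs. parallel pay-per-hour cloud
# execution for an embarrassingly parallel batch of image-processing jobs.
# All numbers are hypothetical assumptions for illustration.

def local_wall_time(n_jobs, t_job_hours):
    """Wall-clock hours to run all jobs serially on one lab machine."""
    return n_jobs * t_job_hours

def cloud_cost_and_time(n_jobs, t_job_hours, n_nodes, price_per_node_hour):
    """Dollar cost and wall-clock hours on n_nodes, billed per started hour."""
    jobs_per_node = math.ceil(n_jobs / n_nodes)
    wall_time = jobs_per_node * t_job_hours
    billed_hours = n_nodes * math.ceil(wall_time)
    return billed_hours * price_per_node_hour, wall_time

# 400 half-hour jobs on 20 nodes at an assumed $0.10 per node-hour.
cost, t_cloud = cloud_cost_and_time(400, 0.5, 20, 0.10)
t_local = local_wall_time(400, 0.5)   # 200 h serially vs 10 h on the cloud
print(cost, t_cloud, t_local)
```

The decision rule is then whether the dollar cost is worth the wall-clock hours saved, which is the shape of the tradeoff the paper formalizes for its diffusion tensor imaging pipeline.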
Galaxy CloudMan: delivering cloud compute clusters.
Afgan, Enis; Baker, Dannon; Coraor, Nate; Chapman, Brad; Nekrutenko, Anton; Taylor, James
2010-12-21
Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is "cloud computing", which, in principle, offers on-demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate "as is" use by experimental biologists. We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon's EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add to or customize an otherwise available cloud system to better meet their needs. The knowledge and effort required to deploy a compute cluster in the Amazon EC2 cloud are not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge.
Exploiting parallel R in the cloud with SPRINT.
Piotrowski, M; McGilvary, G A; Sloan, T M; Mewissen, M; Lloyd, A D; Forster, T; Mitchell, L; Ghazal, P; Hill, J
2013-01-01
Advances in DNA microarray devices and next-generation massively parallel DNA sequencing platforms have led to an exponential growth in data availability, but the arising opportunities require adequate computing resources. High Performance Computing (HPC) in the cloud offers an affordable way of meeting this need. Bioconductor, a popular tool for high-throughput genomic data analysis, is distributed as add-on modules for the R statistical programming language, but R has no native capabilities for exploiting multi-processor architectures. SPRINT is an R package that enables easy access to HPC for genomics researchers. This paper investigates setting up and running SPRINT-enabled genomic analyses on Amazon's Elastic Compute Cloud (EC2), the advantages of submitting applications to EC2 from different parts of the world, and whether resource underutilization can improve application performance. The SPRINT parallel implementations of correlation, permutation testing, partitioning around medoids and the multi-purpose papply have been benchmarked on data sets of various sizes on Amazon EC2. Jobs have been submitted from both the UK and Thailand to investigate monetary differences. It is possible to obtain good, scalable performance, but the level of improvement is dependent upon the nature of the algorithm. Resource underutilization can further improve the time to result. The end user's location impacts costs due to factors such as local taxation. Although not designed to satisfy HPC requirements, Amazon EC2 and cloud computing in general provide an interesting alternative and new possibilities for smaller organisations with limited funds.
Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization
Malawski, Maciej; Figiela, Kamil; Bubak, Marian; ...
2015-01-01
This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, with a limited number of instances per cloud and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs, as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and the benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from the synthetic workflows and from general purpose cloud benchmarks, as well as from the data measured in our own experiments with Montage, an astronomical application, executed on the Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
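A minimal sketch of the kind of deadline-constrained cost minimization described above: enumerate instance-type assignments per level of identical tasks and keep the cheapest plan whose makespan meets the deadline. The instance types, prices, runtimes, instance cap, and hourly billing model below are illustrative assumptions, not the paper's AMPL/CMPL model:

```python
import itertools
import math

# Hypothetical hourly prices and per-task runtimes for two instance types.
prices = {"small": 0.10, "large": 0.40}
# Each level: (number of identical tasks, {instance type: task runtime hours}).
levels = [
    (10, {"small": 1.0, "large": 0.3}),
    (4,  {"small": 2.0, "large": 0.6}),
]

def plan_cost(assignment, max_instances=10):
    """Makespan and cost when each level runs on its assigned instance type,
    using up to max_instances parallel VMs, billed per started hour."""
    total_time, total_cost = 0.0, 0.0
    for (n_tasks, runtimes), vm_type in zip(levels, assignment):
        t = runtimes[vm_type]
        waves = math.ceil(n_tasks / max_instances)   # task batches per level
        level_time = waves * t
        total_time += level_time
        n_vms = min(n_tasks, max_instances)
        total_cost += n_vms * math.ceil(level_time) * prices[vm_type]
    return total_time, total_cost

def cheapest_under_deadline(deadline):
    """Brute-force search over per-level instance-type assignments."""
    best = None
    for assignment in itertools.product(prices, repeat=len(levels)):
        t, c = plan_cost(assignment)
        if t <= deadline and (best is None or c < best[1]):
            best = (assignment, c, t)
    return best

print(cheapest_under_deadline(2.0))
```

With a 2-hour deadline this picks cheap instances for the wide first level and fast instances for the slow second level; a real model such as the paper's also handles multiple clouds and instance limits per cloud, which this sketch omits.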
Cloud computing for comparative genomics.
Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J
2010-05-18
Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high-capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation took just under 70 hours and cost a total of $6,302 USD. The effort required to transform existing comparative genomics algorithms to run outside local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
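From the figures reported above (100 nodes, just under 70 hours, $6,302 total, more than 300,000 processes), rough unit costs follow directly:

```python
# Back-of-the-envelope unit costs from the reported RSD-cloud run.
# Inputs are the figures stated in the abstract; the division is the only
# thing added here.

def cost_per_node_hour(total_cost, n_nodes, hours):
    """Average dollar cost per node-hour of the run."""
    return total_cost / (n_nodes * hours)

def cost_per_process(total_cost, n_processes):
    """Average dollar cost per individual RSD-cloud process."""
    return total_cost / n_processes

rate = cost_per_node_hour(6302.0, 100, 70)    # roughly $0.90 per node-hour
per_job = cost_per_process(6302.0, 300000)    # roughly $0.02 per process
print(rate, per_job)
```

These averages are approximate (the run was "just under" 70 hours and "more than" 300,000 processes), but they convey the order of magnitude of per-job cloud cost that the paper argues is manageable.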
Exploration of cloud computing late start LDRD #149630 : Raincoat. v. 2.1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Echeverria, Victor T.; Metral, Michael David; Leger, Michelle A.
This report contains documentation from an interoperability study conducted under the Late Start LDRD 149630, Exploration of Cloud Computing. A small late-start LDRD from last year resulted in a study (Raincoat) on using Virtual Private Networks (VPNs) to enhance security in a hybrid cloud environment. Raincoat initially explored the use of OpenVPN on IPv4 and demonstrated that it is possible to secure the communication channel between two small 'test' clouds (a few nodes each) at New Mexico Tech and Sandia. We extended the Raincoat study to add IPSec support via Vyatta routers, to interface with a public cloud (Amazon Elastic Compute Cloud (EC2)), and to be significantly more scalable than the previous iteration. The study contributed to our understanding of interoperability in a hybrid cloud.
BlueSky Cloud - rapid infrastructure capacity using Amazon's Cloud for wildfire emergency response
NASA Astrophysics Data System (ADS)
Haderman, M.; Larkin, N. K.; Beach, M.; Cavallaro, A. M.; Stilley, J. C.; DeWinter, J. L.; Craig, K. J.; Raffuse, S. M.
2013-12-01
During peak fire season in the United States, many large wildfires often burn simultaneously across the country. Smoke from these fires can produce air quality emergencies. It is vital that incident commanders, air quality agencies, and public health officials have smoke impact information at their fingertips for evaluating where fires and smoke are and where the smoke will go next. To address the need for this kind of information, the U.S. Forest Service AirFire Team created the BlueSky Framework, a modeling system that predicts concentrations of particle pollution from wildfires. During emergency response, decision makers use BlueSky predictions to make public outreach and evacuation decisions. The models used in BlueSky predictions are computationally intensive, and the peak fire season requires significantly more computer resources than off-peak times. Purchasing enough hardware to run the number of BlueSky Framework runs that are needed during fire season is expensive and leaves idle servers running the majority of the year. The AirFire Team and STI developed BlueSky Cloud to take advantage of Amazon's virtual servers hosted in the cloud. With BlueSky Cloud, as demand increases and decreases, servers can be easily spun up and spun down at a minimal cost. Moving standard BlueSky Framework runs into the Amazon Cloud made it possible for the AirFire Team to rapidly increase the number of BlueSky Framework instances that could be run simultaneously without the costs associated with purchasing and managing servers. In this presentation, we provide an overview of the features of BlueSky Cloud, describe how the system uses Amazon Cloud, and discuss the costs and benefits of moving from privately hosted servers to a cloud-based infrastructure.
Castaño-Díez, Daniel
2017-06-01
Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and long-term software maintenance.
Evaluating the Efficacy of the Cloud for Cluster Computation
NASA Technical Reports Server (NTRS)
Knight, David; Shams, Khawaja; Chang, George; Soderstrom, Tom
2012-01-01
Computing requirements vary by industry, and it follows that NASA and other research organizations have computing demands that fall outside the mainstream. While cloud computing made rapid inroads for tasks such as powering web applications, performance issues on highly distributed tasks hindered early adoption for scientific computation. One venture to address this problem is Nebula, NASA's homegrown cloud project tasked with delivering science-quality cloud computing resources. However, another industry development is Amazon's high-performance computing (HPC) instances on Elastic Compute Cloud (EC2) that promise improved performance for cluster computation. This paper presents results from a series of benchmarks run on Amazon EC2 and discusses the efficacy of current commercial cloud technology for running scientific applications across a cluster. In particular, a 240-core cluster of cloud instances achieved 2 TFLOPS on High-Performance Linpack (HPL) at 70% of theoretical computational performance. The cluster's local network also demonstrated sub-100 μs inter-process latency with sustained inter-node throughput in excess of 8 Gbps. Beyond HPL, a real-world Hadoop image processing task from NASA's Lunar Mapping and Modeling Project (LMMP) was run on a 29-instance cluster to process lunar and Martian surface images with sizes on the order of tens of gigapixels. These results demonstrate that while not a rival of dedicated supercomputing clusters, commercial cloud technology is now a feasible option for moderately demanding scientific workloads.
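The reported HPL result (2 TFLOPS at 70% of theoretical peak on 240 cores) follows from the usual benchmark efficiency calculation. The per-core peak below is a back-calculated assumption consistent with those figures, not a number stated in the abstract:

```python
# HPL efficiency: achieved FLOPS divided by the theoretical peak of the
# cluster. The 11.9 GFLOPS/core peak is an assumed value chosen so that
# 240 cores give roughly the reported ~70% efficiency at 2 TFLOPS.

def hpl_efficiency(achieved_tflops, cores, peak_gflops_per_core):
    """Fraction of theoretical peak achieved on the HPL benchmark."""
    peak_tflops = cores * peak_gflops_per_core / 1000.0
    return achieved_tflops / peak_tflops

eff = hpl_efficiency(2.0, 240, 11.9)   # ~0.70 under the assumed peak
print(eff)
```

Dedicated supercomputers often report HPL efficiencies well above this, which is consistent with the abstract's conclusion that cloud clusters were feasible but not rivals of dedicated machines.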
NASA Astrophysics Data System (ADS)
Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao
To support the maximum number of users and elastic services with minimal resources, Internet service providers invented cloud computing. Within a few years, cloud computing became the hottest emerging technology. From the publication of Google's core papers beginning in 2003, to the commercialization of Amazon EC2 in 2006, to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to public services, from cost-saving tools to revenue generators, and from ISPs to telecoms. This paper introduces the concept, history, and pros and cons of cloud computing, as well as its value chain and standardization efforts.
Processing Shotgun Proteomics Data on the Amazon Cloud with the Trans-Proteomic Pipeline*
Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W.; Moritz, Robert L.
2015-01-01
Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of the cloud-enabled Trans-Proteomic Pipeline by processing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at a very low cost. PMID:25418363
Processing shotgun proteomics data on the Amazon cloud with the trans-proteomic pipeline.
Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W; Moritz, Robert L
2015-02-01
Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of the cloud-enabled Trans-Proteomic Pipeline by processing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at a very low cost. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.
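The resource-and-cost estimation described in the abstract above can be sketched generically: given file counts, per-file processing time, and a fleet of instances, estimate wall-clock time and billed cost. Every rate and parameter below is a hypothetical placeholder, not a value from the Trans-Proteomic Pipeline tools or actual AWS pricing.

```python
# Illustrative cloud-cost estimate for a batch search job. All numbers are
# hypothetical placeholders, not actual AWS prices or TPP estimates.
import math

def estimate_job(n_files, minutes_per_file, n_instances, usd_per_instance_hour):
    total_core_minutes = n_files * minutes_per_file
    wall_minutes = total_core_minutes / n_instances
    # This simple model bills each instance per started hour.
    billed_hours = n_instances * math.ceil(wall_minutes / 60)
    return wall_minutes / 60, billed_hours * usd_per_instance_hour

# Hypothetical scenario loosely shaped like the abstract's workload.
hours, cost = estimate_job(n_files=1100, minutes_per_file=4,
                           n_instances=10, usd_per_instance_hour=0.50)
print(f"~{hours:.1f} h wall clock, ~${cost:.2f} compute cost")
```

Such a model makes the key trade-off visible: adding instances shortens wall-clock time roughly linearly while total billed hours stay nearly constant, apart from the rounding to whole billed hours.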
Exploiting Parallel R in the Cloud with SPRINT
Piotrowski, M.; McGilvary, G.A.; Sloan, T. M.; Mewissen, M.; Lloyd, A.D.; Forster, T.; Mitchell, L.; Ghazal, P.; Hill, J.
2012-01-01
Background Advances in DNA Microarray devices and next-generation massively parallel DNA sequencing platforms have led to an exponential growth in data availability but the arising opportunities require adequate computing resources. High Performance Computing (HPC) in the Cloud offers an affordable way of meeting this need. Objectives Bioconductor, a popular tool for high-throughput genomic data analysis, is distributed as add-on modules for the R statistical programming language but R has no native capabilities for exploiting multi-processor architectures. SPRINT is an R package that enables easy access to HPC for genomics researchers. This paper investigates: setting up and running SPRINT-enabled genomic analyses on Amazon’s Elastic Compute Cloud (EC2), the advantages of submitting applications to EC2 from different parts of the world and whether resource underutilization can improve application performance. Methods The SPRINT parallel implementations of correlation, permutation testing, partitioning around medoids and the multi-purpose papply have been benchmarked on data sets of various sizes on Amazon EC2. Jobs have been submitted from both the UK and Thailand to investigate monetary differences. Results It is possible to obtain good, scalable performance but the level of improvement is dependent upon the nature of the algorithm. Resource underutilization can further improve the time to result. The end-user’s location impacts costs due to factors such as local taxation. Conclusions: Although not designed to satisfy HPC requirements, Amazon EC2 and cloud computing in general provide an interesting alternative and new possibilities for smaller organisations with limited funds. PMID:23223611
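The finding that the level of improvement depends on the nature of the algorithm can be illustrated with Amdahl's law; the serial fractions below are hypothetical textbook values, not measurements from the SPRINT benchmarks.

```python
# Amdahl's law: speedup on n workers is limited by the serial fraction s.
# The two algorithms compared here are hypothetical illustrations, not
# results from the SPRINT paper.

def amdahl_speedup(serial_fraction, n_workers):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_workers)

for name, s in [("mostly parallel (5% serial)", 0.05),
                ("half serial (50% serial)", 0.50)]:
    print(name, "-> speedup on 32 workers:", round(amdahl_speedup(s, 32), 2))
```

An algorithm that is 95% parallelizable gains over 12x on 32 workers, while one that is half serial barely doubles, which is why algorithms such as permutation testing and papply scale so differently on the same cluster.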
Castaño-Díez, Daniel
2017-01-01
Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphics processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and long-term software maintenance.
Cloud Condensation Nuclei Activity of Aerosols during GoAmazon 2014/15 Field Campaign Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, J.; Martin, S. T.; Kleinman, L.
2016-03-01
Aerosol indirect effects, which represent the impact of aerosols on climate through influencing the properties of clouds, remain one of the main uncertainties in climate predictions (Stocker et al. 2013). Reducing this large uncertainty requires both improved understanding and representation of aerosol properties and processes in climate models, including the cloud activation properties of aerosols. The Atmospheric System Research (ASR) science program plan of January 2010 states that: “A key requirement for simulating aerosol-cloud interactions is the ability to calculate cloud condensation nuclei and ice nuclei (CCN and IN, respectively) concentrations as a function of supersaturation from the chemical and microphysical properties of the aerosol.” The Observations and Modeling of the Green Ocean Amazon (GoAmazon 2014/15) study seeks to understand how aerosol and cloud life cycles are influenced by pollutant outflow from a tropical megacity (Manaus)—in particular, the differences in cloud-aerosol-precipitation interactions between polluted and pristine conditions. One key question of GoAmazon2014/5 is: “What is the influence of the Manaus pollution plume on the cloud condensation nuclei (CCN) activities of the aerosol particles and the secondary organic material in the particles?” To address this question, we measured size-resolved CCN spectra, a critical measurement for GoAmazon2014/5.
Security-aware Virtual Machine Allocation in the Cloud: A Game Theoretic Approach
2015-01-13
predecessor, however, this paper used empirical evidence and actual data from running experiments on the Amazon EC2 cloud. They began by running all 5... is through effective VM allocation management of the cloud provider to ensure delivery of maximum security for all cloud users. The negative...
NASA Astrophysics Data System (ADS)
Oishi, Y.; Kamei, A.; Murakami, K.; Dupuy, E.; Yokota, Y.; Hiraki, K.; Ninomiya, K.; Saito, M.; Yoshida, Y.; Morino, I.; Nakajima, T. Y.; Yokota, T.; Matsunaga, T.
2013-12-01
The Greenhouse gases Observing SATellite (GOSAT) was launched in 2009 to measure global atmospheric CO2 and CH4 concentrations. GOSAT is equipped with two sensors: the Thermal And Near-infrared Sensor for carbon Observation-Fourier Transform Spectrometer (TANSO-FTS) and the Cloud and Aerosol Imager (TANSO-CAI). The presence of clouds in the instantaneous field-of-view (IFOV) of the FTS leads to incorrect estimates of CO2 or CH4 concentrations. To deal with this problem, FTS data suspected of cloud contamination must be identified and rejected. As a result, very few FTS data remain in tropical rainforest regions such as the Amazon. Meanwhile, feasibility studies for GOSAT-2, intended to monitor atmospheric greenhouse gases more precisely than GOSAT, started in 2011. To improve the accuracy of estimates of the column-averaged dry-air mole fraction of atmospheric CO2 (XCO2), we need to understand the present situation regarding cloud screening in rainforest regions and to examine the cloud-contaminated data whose processing might become possible with improved instruments or algorithms. In this study we evaluated the impact of thin clouds on XCO2 estimates using an atmospheric radiative transfer code that can simulate the top-of-atmosphere spectrum under thin-cloud conditions. First, we determined the input parameters, including the relative position of the sun and satellite with respect to the observation point and the surface reflectance, using cloud-free GOSAT data products in the Amazon: FTS L1B data products (radiance spectra), FTS L2 data products (CO2 column abundance data), and CAI L3 data products (clear-sky reflectance data). The evaluation was performed by comparing the depths of the CO2 absorption lines in the output radiation spectra for varying CO2 concentrations and cloud conditions (cloud type, cloud optical depth, and cloud-top altitude). We will present our latest results.
Smoke Invigoration Versus Inhibition of Clouds over the Amazon
NASA Technical Reports Server (NTRS)
Koren, Ilan; Martins, J. Vanderlei; Remer, Lorraine A.; Afargan, Hila
2008-01-01
The effect of anthropogenic aerosols on clouds is one of the most important and least understood aspects of human-induced climate change. Small changes in the amount of cloud coverage can produce a climate forcing equivalent in magnitude and opposite in sign to that caused by anthropogenic greenhouse gases, and changes in cloud height can shift the effect of clouds from cooling to warming. Focusing on the Amazon, we show a smooth transition between two opposing effects of aerosols on clouds: the microphysical and the radiative. We show how a feedback between the optical properties of aerosols and the cloud fraction can modify the aerosol forcing, changing the total radiative energy and redistributing it over the atmospheric column.
Thermodynamics and Cloud Radiative Effect from the First Year of GoAmazon
NASA Technical Reports Server (NTRS)
Collow, Allie Marquardt; Miller, Mark; Trabachino, Lynne
2015-01-01
Deforestation is an ongoing concern for the Amazon Rainforest of Brazil, and associated changes to the land surface have been hypothesized to alter the climate in the region. A comprehensive set of meteorological observations at the surface and within the lower troposphere above Manacapuru, Brazil and data from the Modern Era Retrospective Analysis for Research and Applications Version 2 (MERRA-2) are used to evaluate the seasonal cycle of cloudiness, thermodynamics, and the radiation budget. While ample moisture is present in the Amazon Rainforest year round, the northward progression of the Hadley circulation during the dry season contributes to a drying of the middle troposphere and inhibits the formation of deep convection. This results in a reduction in cloudiness and precipitation as well as an increase in the height of the lifting condensation level, which is shown to be negatively correlated with the fraction of low clouds. Frequent cloudiness prevents solar radiation from reaching the surface, and clouds are often reflective, with high values of shortwave cloud radiative effect at the surface and top of the atmosphere. Cloud radiative effect is reduced during the dry season; however, the dry-season surface shortwave cloud radiative effect is still double that observed during the wet season in other tropical locations. Within the column, the impact of clouds on the radiation budget is more prevalent in the longwave part of the spectrum, with a net warming in the wet season.
Menu-driven cloud computing and resource sharing for R and Bioconductor.
Bolouri, Hamid; Dulepet, Rajiv; Angerman, Michael
2011-08-15
We report CRdata.org, a cloud-based, free, open-source web server for running analyses and sharing data and R scripts with others. In addition to using the free, public service, CRdata users can launch their own private Amazon Elastic Computing Cloud (EC2) nodes and store private data and scripts on Amazon's Simple Storage Service (S3) with user-controlled access rights. All CRdata services are provided via point-and-click menus. CRdata is open-source and free under the permissive MIT License (opensource.org/licenses/mit-license.php). The source code is in Ruby (ruby-lang.org/en/) and available at: github.com/seerdata/crdata. hbolouri@fhcrc.org.
NASA Astrophysics Data System (ADS)
Collow, A.; Miller, M. A.
2015-12-01
The Amazon Rainforest of Brazil is a region with potential climate sensitivities, especially with ongoing land surface changes and biomass burning aerosols due to deforestation. Ubiquitous moisture in the area makes clouds a common feature over the Amazon Rainforest, and together with the influences of deforestation they have a significant impact on the radiation budget. This region experiences a seasonal contrast in clouds, precipitation, and aerosols, making it an ideal location to study the relationship between these variables and the radiation budget. An internationally sponsored campaign entitled GOAmazon2014/15 included a deployment of an Atmospheric Radiation Measurement (ARM) Mobile Facility, which collected comprehensive measurements using in situ and remote sensors. Observations of clouds, aerosols, and radiative fluxes from the first year of the deployment are analyzed in conjunction with top-of-the-atmosphere (TOA) observations from the Clouds and the Earth's Radiant Energy System (CERES) and analyses from the newly released Modern Era Retrospective Analysis for Research and Applications Version 2 (MERRA-2). The combination of surface and TOA observations allows for the calculation of radiative flux divergence and cloud radiative effect (CRE) within the column, while the comparison to MERRA-2 enables the verification of a new reanalysis product and a view of the spatial variation of the radiation budget. Clouds are very reflective in the area, creating a cooling effect in the shortwave (SW) at the surface, with some seasonality present due to the reduction of optically thick clouds in the dry season. Clouds have little effect on the column itself in the SW due to the balance between the reflective and absorbing properties of the clouds, with the majority of the impact on the atmosphere coming from cloud warming in the longwave. Influences of aerosols are seen in the dry season, and an increase in moisture above the Amazon River and its tributaries enhances the CRE.
Galaxy CloudMan: delivering cloud compute clusters
2010-01-01
Background Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is “cloud computing”, which, in principle, offers on-demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate “as is” use by experimental biologists. Results We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon’s EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add to or customize an otherwise available cloud system to better meet their needs. Conclusions The knowledge and effort required to deploy a compute cluster in the Amazon EC2 cloud are not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge. PMID:21210983
Biomedical cloud computing with Amazon Web Services.
Fusaro, Vincent A; Patil, Prasad; Gafni, Erik; Wall, Dennis P; Tonellato, Peter J
2011-08-01
In this overview to biomedical computing in the cloud, we discussed two primary ways to use the cloud (a single instance or cluster), provided a detailed example using NGS mapping, and highlighted the associated costs. While many users new to the cloud may assume that entry is as straightforward as uploading an application and selecting an instance type and storage options, we illustrated that there is substantial up-front effort required before an application can make full use of the cloud's vast resources. Our intention was to provide a set of best practices and to illustrate how those apply to a typical application pipeline for biomedical informatics, but also general enough for extrapolation to other types of computational problems. Our mapping example was intended to illustrate how to develop a scalable project and not to compare and contrast alignment algorithms for read mapping and genome assembly. Indeed, with a newer aligner such as Bowtie, it is possible to map the entire African genome using one m2.2xlarge instance in 48 hours for a total cost of approximately $48 in computation time. In our example, we were not concerned with data transfer rates, which are heavily influenced by the amount of available bandwidth, connection latency, and network availability. When transferring large amounts of data to the cloud, bandwidth limitations can be a major bottleneck, and in some cases it is more efficient to simply mail a storage device containing the data to AWS (http://aws.amazon.com/importexport/). More information about cloud computing, detailed cost analysis, and security can be found in references.
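The Bowtie example above quotes 48 hours on one m2.2xlarge instance for roughly $48 in computation time; the implied hourly rate follows directly from those two numbers (actual historical EC2 pricing may have differed slightly).

```python
# Implied hourly rate from the figures quoted in the abstract above:
# ~$48 of computation over 48 hours on a single m2.2xlarge instance.
total_cost_usd = 48.0
runtime_hours = 48.0
implied_rate = total_cost_usd / runtime_hours
print(f"implied on-demand rate: ${implied_rate:.2f}/hour")
```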
Cloud Computing Value Chains: Understanding Businesses and Value Creation in the Cloud
NASA Astrophysics Data System (ADS)
Mohammed, Ashraf Bany; Altmann, Jörn; Hwang, Junseok
Based on the promising developments in Cloud Computing technologies in recent years, commercial computing resource services (e.g. Amazon EC2) or software-as-a-service offerings (e.g. Salesforce.com) came into existence. However, the relatively weak business exploitation, participation, and adoption of other Cloud Computing services remain the main challenges. The vague value structures seem to be hindering business adoption and the creation of sustainable business models around its technology. Using an extensive analysis of existing Cloud business models, Cloud services, stakeholder relations, market configurations and value structures, this chapter develops a reference model for value chains in the Cloud. Although this model is theoretically based on Porter's value chain theory, the proposed Cloud value chain model is adapted to fit the diversity of business service scenarios in Cloud computing markets. Using this model, different service scenarios are explained. Our findings suggest new services, business opportunities, and policy practices for realizing more adoption and value creation paths in the Cloud.
Rainforest aerosols as biogenic nuclei of clouds and precipitation in the Amazon.
Pöschl, U; Martin, S T; Sinha, B; Chen, Q; Gunthe, S S; Huffman, J A; Borrmann, S; Farmer, D K; Garland, R M; Helas, G; Jimenez, J L; King, S M; Manzi, A; Mikhailov, E; Pauliquevis, T; Petters, M D; Prenni, A J; Roldin, P; Rose, D; Schneider, J; Su, H; Zorn, S R; Artaxo, P; Andreae, M O
2010-09-17
The Amazon is one of the few continental regions where atmospheric aerosol particles and their effects on climate are not dominated by anthropogenic sources. During the wet season, the ambient conditions approach those of the pristine pre-industrial era. We show that the fine submicrometer particles accounting for most cloud condensation nuclei are predominantly composed of secondary organic material formed by oxidation of gaseous biogenic precursors. Supermicrometer particles, which are relevant as ice nuclei, consist mostly of primary biological material directly released from rainforest biota. The Amazon Basin appears to be a biogeochemical reactor, in which the biosphere and atmospheric photochemistry produce nuclei for clouds and precipitation sustaining the hydrological cycle. The prevailing regime of aerosol-cloud interactions in this natural environment is distinctly different from polluted regions.
Rainforest Aerosols as Biogenic Nuclei of Clouds and Precipitation in the Amazon
NASA Astrophysics Data System (ADS)
Pöschl, U.; Martin, S. T.; Sinha, B.; Chen, Q.; Gunthe, S. S.; Huffman, J. A.; Borrmann, S.; Farmer, D. K.; Garland, R. M.; Helas, G.; Jimenez, J. L.; King, S. M.; Manzi, A.; Mikhailov, E.; Pauliquevis, T.; Petters, M. D.; Prenni, A. J.; Roldin, P.; Rose, D.; Schneider, J.; Su, H.; Zorn, S. R.; Artaxo, P.; Andreae, M. O.
2010-09-01
The Amazon is one of the few continental regions where atmospheric aerosol particles and their effects on climate are not dominated by anthropogenic sources. During the wet season, the ambient conditions approach those of the pristine pre-industrial era. We show that the fine submicrometer particles accounting for most cloud condensation nuclei are predominantly composed of secondary organic material formed by oxidation of gaseous biogenic precursors. Supermicrometer particles, which are relevant as ice nuclei, consist mostly of primary biological material directly released from rainforest biota. The Amazon Basin appears to be a biogeochemical reactor, in which the biosphere and atmospheric photochemistry produce nuclei for clouds and precipitation sustaining the hydrological cycle. The prevailing regime of aerosol-cloud interactions in this natural environment is distinctly different from polluted regions.
A Cost-Benefit Study of Doing Astrophysics On The Cloud: Production of Image Mosaics
NASA Astrophysics Data System (ADS)
Berriman, G. B.; Good, J. C.; Deelman, E.; Singh, G.; Livny, M.
2009-09-01
Utility grids such as the Amazon EC2 and Amazon S3 clouds offer computational and storage resources that can be used on demand for a fee by compute- and data-intensive applications. The cost of running an application on such a cloud depends on the compute, storage and communication resources it will provision and consume. Different execution plans of the same application may result in significantly different costs. We studied via simulation the cost-performance trade-offs of different execution and resource provisioning plans by creating, under the Amazon cloud fee structure, mosaics with the Montage image mosaic engine, a widely used data- and compute-intensive application. Specifically, we studied the cost of building mosaics of 2MASS data that have sizes of 1, 2 and 4 square degrees, and a 2MASS all-sky mosaic. These are examples of mosaics commonly generated by astronomers. We also studied these trade-offs in the context of the storage and communication fees of Amazon S3 when used for long-term application data archiving. Our results show that by provisioning the right amount of storage and compute resources, cost can be significantly reduced with no significant impact on application performance.
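The trade-off study described above rests on a simple additive cost model: total cost is the sum of compute, storage, and data-transfer fees, and different execution plans shift spending between those terms. The sketch below illustrates that model; the rates and the two example plans are hypothetical placeholders, not Amazon's actual fee schedule or plans from the paper.

```python
# Minimal sketch of an additive cloud cost model for a mosaic run:
# total = compute + storage + transfer. All rates are hypothetical
# placeholders, not Amazon's actual fees.

def mosaic_run_cost(cpu_hours, storage_gb_months, transfer_out_gb,
                    cpu_rate=0.10, storage_rate=0.15, transfer_rate=0.17):
    compute = cpu_hours * cpu_rate
    storage = storage_gb_months * storage_rate
    transfer = transfer_out_gb * transfer_rate
    return compute + storage + transfer

# Two hypothetical execution plans for the same mosaic: one compute-heavy
# with little intermediate storage, one that stages more data in storage.
plan_a = mosaic_run_cost(cpu_hours=100, storage_gb_months=5, transfer_out_gb=20)
plan_b = mosaic_run_cost(cpu_hours=80, storage_gb_months=40, transfer_out_gb=20)
print(f"plan A: ${plan_a:.2f}, plan B: ${plan_b:.2f}")
```

Comparing such plans under a given fee structure is exactly the kind of provisioning decision the simulation study explores: the cheaper plan depends on the relative prices of compute and storage, not on either alone.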
Martin, S. T.; Artaxo, P.; Machado, L.; ...
2017-05-15
The Observations and Modeling of the Green Ocean Amazon 2014–2015 (GoAmazon2014/5) experiment took place around the urban region of Manaus in central Amazonia across 2 years. The urban pollution plume was used to study the susceptibility of gases, aerosols, clouds, and rainfall to human activities in a tropical environment. Many aspects of air quality, weather, terrestrial ecosystems, and climate work differently in the tropics than in the more thoroughly studied temperate regions of Earth. GoAmazon2014/5, a cooperative project of Brazil, Germany, and the United States, employed an unparalleled suite of measurements at nine ground sites and on board two aircraft to investigate the flow of background air into Manaus, the emissions into the air over the city, and the advection of the pollution downwind of the city. Here in this paper, to visualize this train of processes and its effects, observations aboard a low-flying aircraft are presented. Comparative measurements within and adjacent to the plume followed the emissions of biogenic volatile organic carbon compounds (BVOCs) from the tropical forest, their transformations by the atmospheric oxidant cycle, alterations of this cycle by the influence of the pollutants, transformations of the chemical products into aerosol particles, the relationship of these particles to cloud condensation nuclei (CCN) activity, and the differences in cloud properties and rainfall for background compared to polluted conditions. The observations of the GoAmazon2014/5 experiment illustrate how the hydrologic cycle, radiation balance, and carbon recycling may be affected by present-day as well as future economic development and pollution over the Amazonian tropical forest.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, S. T.; Artaxo, P.; Machado, L.
The Observations and Modeling of the Green Ocean Amazon 2014–2015 (GoAmazon2014/5) experiment took place around the urban region of Manaus in central Amazonia across 2 years. The urban pollution plume was used to study the susceptibility of gases, aerosols, clouds, and rainfall to human activities in a tropical environment. Many aspects of air quality, weather, terrestrial ecosystems, and climate work differently in the tropics than in the more thoroughly studied temperate regions of Earth. GoAmazon2014/5, a cooperative project of Brazil, Germany, and the United States, employed an unparalleled suite of measurements at nine ground sites and on board two aircraft to investigate the flow of background air into Manaus, the emissions into the air over the city, and the advection of the pollution downwind of the city. Here in this paper, to visualize this train of processes and its effects, observations aboard a low-flying aircraft are presented. Comparative measurements within and adjacent to the plume followed the emissions of biogenic volatile organic carbon compounds (BVOCs) from the tropical forest, their transformations by the atmospheric oxidant cycle, alterations of this cycle by the influence of the pollutants, transformations of the chemical products into aerosol particles, the relationship of these particles to cloud condensation nuclei (CCN) activity, and the differences in cloud properties and rainfall for background compared to polluted conditions. The observations of the GoAmazon2014/5 experiment illustrate how the hydrologic cycle, radiation balance, and carbon recycling may be affected by present-day as well as future economic development and pollution over the Amazonian tropical forest.
Menu-driven cloud computing and resource sharing for R and Bioconductor
Bolouri, Hamid; Angerman, Michael
2011-01-01
Summary: We report CRdata.org, a cloud-based, free, open-source web server for running analyses and sharing data and R scripts with others. In addition to using the free, public service, CRdata users can launch their own private Amazon Elastic Computing Cloud (EC2) nodes and store private data and scripts on Amazon's Simple Storage Service (S3) with user-controlled access rights. All CRdata services are provided via point-and-click menus. Availability and Implementation: CRdata is open-source and free under the permissive MIT License (opensource.org/licenses/mit-license.php). The source code is in Ruby (ruby-lang.org/en/) and available at: github.com/seerdata/crdata. Contact: hbolouri@fhcrc.org PMID:21685055
Atlas2 Cloud: a framework for personal genome analysis in the cloud
2012-01-01
Background Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data, to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation remains a serious bottleneck. To this end, the cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. Results We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via the Amazon Web Services using Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. Conclusions We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms. PMID:23134663
Atlas2 Cloud: a framework for personal genome analysis in the cloud.
Evani, Uday S; Challis, Danny; Yu, Jin; Jackson, Andrew R; Paithankar, Sameer; Bainbridge, Matthew N; Jakkamsetti, Adinarayana; Pham, Peter; Coarfa, Cristian; Milosavljevic, Aleksandar; Yu, Fuli
2012-01-01
Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high-quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation, remain serious bottlenecks. To this end, cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via Amazon Web Services using the Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms.
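The storage/compute/I/O cost breakdown described above can be sketched as a simple projection function. The per-unit rates and usage figures below are illustrative placeholders, not the prices or projections reported in the paper:

```python
def cloud_run_cost(storage_gb: float, cpu_hours: float, io_gb: float,
                   storage_rate: float = 0.10, cpu_rate: float = 0.68,
                   io_rate: float = 0.10) -> float:
    """Project the cost of one cloud analysis run as the sum of the three
    components the abstract names: storage, compute and I/O.
    All rates are hypothetical $/unit values, not actual AWS pricing."""
    return storage_gb * storage_rate + cpu_hours * cpu_rate + io_gb * io_rate

# e.g. a hypothetical exome run: 30 GB stored, 80 CPU-hours, 60 GB transferred
print(round(cloud_run_cost(storage_gb=30, cpu_hours=80, io_gb=60), 2))  # 63.4
```

Scaling such a projection over a large cohort is exactly the kind of exercise the paper's cost structure supports.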
Assessing the Amazon Cloud Suitability for CLARREO's Computational Needs
NASA Technical Reports Server (NTRS)
Goldin, Daniel; Vakhnin, Andrei A.; Currey, Jon C.
2015-01-01
In this document we compare the performance of the Amazon Web Services (AWS), also known as Amazon Cloud, with the CLARREO (Climate Absolute Radiance and Refractivity Observatory) cluster and assess its suitability for computational needs of the CLARREO mission. A benchmark executable to process one month and one year of PARASOL (Polarization and Anisotropy of Reflectances for Atmospheric Sciences coupled with Observations from a Lidar) data was used. With the optimal AWS configuration, adequate data-processing times, comparable to the CLARREO cluster, were found. The assessment of alternatives to the CLARREO cluster continues and several options, such as a NASA-based cluster, are being considered.
CE-ACCE: The Cloud Enabled Advanced sCience Compute Environment
NASA Astrophysics Data System (ADS)
Cinquini, L.; Freeborn, D. J.; Hardman, S. H.; Wong, C.
2017-12-01
Traditionally, Earth Science data from NASA remote sensing instruments has been processed by building custom data processing pipelines (often based on a common workflow engine or framework) which are typically deployed and run on an internal cluster of computing resources. This approach has some intrinsic limitations: it requires each mission to develop and deploy a custom software package on top of the adopted framework; it makes use of dedicated hardware, network and storage resources, which must be specifically purchased, maintained and re-purposed at mission completion; and computing services cannot be scaled on demand beyond the capability of the available servers. More recently, the rise of Cloud computing, coupled with other advances in containerization technology (most prominently, Docker) and micro-services architecture, has enabled a new paradigm, whereby space mission data can be processed through standard system architectures, which can be seamlessly deployed and scaled on demand on either on-premise clusters, or commercial Cloud providers. In this talk, we will present one such architecture named CE-ACCE ("Cloud Enabled Advanced sCience Compute Environment"), which we have been developing at the NASA Jet Propulsion Laboratory over the past year. CE-ACCE is based on the Apache OODT ("Object Oriented Data Technology") suite of services for full data lifecycle management, which are turned into a composable array of Docker images, and complemented by a plug-in model for mission-specific customization. We have applied this infrastructure to both flying and upcoming NASA missions, such as ECOSTRESS and SMAP, and demonstrated deployment on the Amazon Cloud, either using simple EC2 instances, or advanced AWS services such as Amazon Lambda and ECS (EC2 Container Services).
Benchmarking undedicated cloud computing providers for analysis of genomic datasets.
Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W
2014-01-01
A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.
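The reported wall-clock and cost differences can be read as simple relative percentages. The helpers below show one plausible definition of such a comparison; the numbers in the usage lines are invented for illustration and are not the paper's measurements:

```python
def pct_slower(t_emr: float, t_gce: float) -> float:
    """Percent by which EMR's wall-clock time exceeds GCE's; one plausible
    reading of the reported 52.9% / 53.5% differences."""
    return (t_emr - t_gce) / t_gce * 100.0

def pct_more_expensive(c_emr: float, c_gce: float) -> float:
    """Percent by which the EMR bill exceeds the GCE bill."""
    return (c_emr - c_gce) / c_gce * 100.0

# Illustrative numbers only (not the paper's data):
print(round(pct_slower(15.3, 10.0), 1))          # EMR 15.3 h vs GCE 10 h -> 53.0
print(round(pct_more_expensive(35.7, 10.0), 1))  # $35.70 vs $10.00 -> 257.0
```

Under this reading, a 257.3% figure means the EMR run cost roughly 3.6 times the equivalent GCE run.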
A Cloud-Based Simulation Architecture for Pandemic Influenza Simulation
Eriksson, Henrik; Raciti, Massimiliano; Basile, Maurizio; Cunsolo, Alessandro; Fröberg, Anders; Leifler, Ola; Ekberg, Joakim; Timpka, Toomas
2011-01-01
High-fidelity simulations of pandemic outbreaks are resource consuming. Cluster-based solutions have been suggested for executing such complex computations. We present a cloud-based simulation architecture that utilizes computing resources both locally available and dynamically rented online. The approach uses the Condor framework for job distribution and management of the Amazon Elastic Compute Cloud (EC2) as well as local resources. The architecture has a web-based user interface that allows users to monitor and control simulation execution. In a benchmark test, the best cost-adjusted performance was recorded for the EC2 High-CPU Medium instance, while a field trial showed that the job configuration had significant influence on the execution time and that the network capacity of the master node could become a bottleneck. We conclude that it is possible to develop a scalable simulation environment that uses cloud-based solutions, while providing an easy-to-use graphical user interface. PMID:22195089
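Condor-style job distribution as described above is driven by submit description files that fan one executable out over many workers. A hedged sketch of generating such a file (the executable name and arguments are hypothetical, and this is not the paper's configuration):

```python
def condor_submit(executable: str, args: str, n_jobs: int) -> str:
    """Render a minimal HTCondor submit description that queues n_jobs
    copies of one simulation binary, each with a distinct $(Process) id,
    on whatever workers (local or EC2-backed) the pool provides."""
    return "\n".join([
        "universe   = vanilla",
        f"executable = {executable}",
        f"arguments  = {args} --seed $(Process)",
        "output     = sim.$(Process).out",
        "error      = sim.$(Process).err",
        "log        = sim.log",
        f"queue {n_jobs}",
    ])

print(condor_submit("run_outbreak", "--scenario pandemic.cfg", 100))
```

The rendered text would be handed to `condor_submit` on the master node; each queued job then lands on a free slot regardless of whether the slot is local or a rented cloud instance.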
Study of the thermodynamic phase of hydrometeors in convective clouds in the Amazon Basin
NASA Astrophysics Data System (ADS)
Ferreira, W. C.; Correia, A. L.; Martins, J.
2012-12-01
Aerosol-cloud interactions are responsible for large uncertainties in climatic models. One key factor when studying clouds perturbed by aerosols is determining the thermodynamic phase of hydrometeors as a function of temperature or height in the cloud. Conventional remote sensing can provide information on the thermodynamic phase of clouds over large areas, but it lacks the precision needed to understand how a single, real cloud evolves. Here we present mappings of the thermodynamic phase of droplets and ice particles in individual convective clouds in the Amazon Basin, by analyzing the emerging infrared radiance on cloud sides (Martins et al., 2011). In flights over the Amazon Basin with a research aircraft, Martins et al. (2011) used imaging radiometers with spectral filters to record the emerging radiance on cloud sides at the wavelengths of 2.10 and 2.25 μm. Due to differential absorption and scattering of these wavelengths by hydrometeors in liquid or solid phases, the intensity ratio between images recorded at the two wavelengths can be used as a proxy for the thermodynamic phase of these hydrometeors. In order to analyze the acquired dataset we used the MATLAB tools package, developing scripts to handle data files and derive the thermodynamic phase. In some cases parallax effects due to aircraft movement required additional data processing before calculating ratios. Only well-illuminated scenes were considered, i.e. images acquired as close as possible to the backscatter vector from the incident solar radiation. Note that the intensity ratio values corresponding to a given thermodynamic phase can vary from cloud to cloud (Martins et al., 2011); however, inside the same cloud the distinction between ice, water and mixed-phase is clear. Analyzing histograms of reflectance ratios 2.10/2.25 μm in selected cases, we found averages typically between 0.3 and 0.4 for ice phase hydrometeors, and between 0.5 and 0.7 for water phase droplets, consistent
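The ratio-based phase discrimination can be sketched as a small classifier. The cut-offs below sit between the quoted averages (ice ~0.3-0.4, liquid ~0.5-0.7) and are illustrative only, since the abstract notes that ratio values for a given phase vary from cloud to cloud:

```python
def hydrometeor_phase(ratio_210_225: float) -> str:
    """Classify thermodynamic phase from the 2.10/2.25 um intensity ratio.
    Thresholds are illustrative, chosen between the averages quoted in the
    abstract; per-cloud calibration would be needed in practice."""
    if ratio_210_225 < 0.45:
        return "ice"
    if ratio_210_225 > 0.50:
        return "liquid"
    return "mixed"

print(hydrometeor_phase(0.35), hydrometeor_phase(0.60), hydrometeor_phase(0.47))
# -> ice liquid mixed
```

Applied pixel by pixel to a ratio image of a cloud side, such a rule yields the phase mappings the abstract describes.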
Oh, Jeongsu; Choi, Chi-Hwan; Park, Min-Kyu; Kim, Byung Kwon; Hwang, Kyuin; Lee, Sang-Heon; Hong, Soon Gyu; Nasir, Arshan; Cho, Wan-Sup; Kim, Kyung Mo
2016-01-01
High-throughput sequencing can produce hundreds of thousands of 16S rRNA sequence reads corresponding to different organisms present in the environmental samples. Typically, analysis of microbial diversity in bioinformatics starts from pre-processing followed by clustering 16S rRNA reads into relatively fewer operational taxonomic units (OTUs). The OTUs are reliable indicators of microbial diversity and greatly accelerate the downstream analysis time. However, existing hierarchical clustering algorithms that are generally more accurate than greedy heuristic algorithms struggle with large sequence datasets. To keep pace with the rapid rise in sequencing data, we present CLUSTOM-CLOUD, which is the first distributed sequence clustering program based on In-Memory Data Grid (IMDG) technology, a distributed data structure that stores all data in the main memory of multiple computing nodes. The IMDG technology helps CLUSTOM-CLOUD to enhance both its capability of handling larger datasets and its computational scalability better than its ancestor, CLUSTOM, while maintaining high accuracy. Clustering speed of CLUSTOM-CLOUD was evaluated on published 16S rRNA human microbiome sequence datasets using a small laboratory cluster (10 nodes) and under the Amazon EC2 cloud-computing environment. Under the laboratory environment, it required only ~3 hours to process a dataset of 200 K reads regardless of the complexity of the human microbiome data. In turn, one million reads were processed in approximately 20, 14, and 11 hours when utilizing 20, 30, and 40 nodes on the Amazon EC2 cloud-computing environment. The running time evaluation indicates that CLUSTOM-CLOUD can handle much larger sequence datasets than CLUSTOM and is also a scalable distributed processing system. The comparative accuracy test using 16S rRNA pyrosequences of a mock community shows that CLUSTOM-CLOUD achieves higher accuracy than DOTUR, mothur, ESPRIT-Tree, UCLUST and Swarm. CLUSTOM-CLOUD is written in JAVA
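The OTU clustering step that CLUSTOM-CLOUD distributes across nodes can be illustrated with a toy single-linkage grouping at the conventional 3% distance threshold. This pure-Python sketch is a stand-in for the real hierarchical algorithm, not the IMDG implementation, and the reads are invented:

```python
def otu_clusters(reads, max_dist=0.03):
    """Count OTUs among equal-length reads by single-linkage clustering:
    reads within max_dist pairwise distance are merged via union-find."""
    def dist(a, b):  # proportion of mismatching positions
        return sum(x != y for x, y in zip(a, b)) / len(a)

    parent = list(range(len(reads)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(len(reads)):
        for j in range(i + 1, len(reads)):
            if dist(reads[i], reads[j]) <= max_dist:
                parent[find(i)] = find(j)
    return len({find(i) for i in range(len(reads))})

reads = ["ACGTACGTACGTACGTACGTACGTACGTACGTACGTACGT",  # 40 bp toy reads
         "ACGTACGTACGTACGTACGTACGTACGTACGTACGTACGA",  # 1 mismatch (2.5%)
         "TTTTTTTTACGTACGTACGTACGTACGTACGTACGTACGT"]  # 6 mismatches (15%)
print(otu_clusters(reads))  # -> 2
```

The O(n^2) pairwise loop is exactly what makes hierarchical methods struggle at scale, which motivates distributing the distance matrix across the memory of many nodes as CLUSTOM-CLOUD does.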
Park, Min-Kyu; Kim, Byung Kwon; Hwang, Kyuin; Lee, Sang-Heon; Hong, Soon Gyu; Nasir, Arshan; Cho, Wan-Sup; Kim, Kyung Mo
2016-01-01
High-throughput sequencing can produce hundreds of thousands of 16S rRNA sequence reads corresponding to different organisms present in the environmental samples. Typically, analysis of microbial diversity in bioinformatics starts from pre-processing followed by clustering 16S rRNA reads into relatively fewer operational taxonomic units (OTUs). The OTUs are reliable indicators of microbial diversity and greatly accelerate the downstream analysis time. However, existing hierarchical clustering algorithms that are generally more accurate than greedy heuristic algorithms struggle with large sequence datasets. To keep pace with the rapid rise in sequencing data, we present CLUSTOM-CLOUD, which is the first distributed sequence clustering program based on In-Memory Data Grid (IMDG) technology, a distributed data structure that stores all data in the main memory of multiple computing nodes. The IMDG technology helps CLUSTOM-CLOUD to enhance both its capability of handling larger datasets and its computational scalability better than its ancestor, CLUSTOM, while maintaining high accuracy. Clustering speed of CLUSTOM-CLOUD was evaluated on published 16S rRNA human microbiome sequence datasets using a small laboratory cluster (10 nodes) and under the Amazon EC2 cloud-computing environment. Under the laboratory environment, it required only ~3 hours to process a dataset of 200 K reads regardless of the complexity of the human microbiome data. In turn, one million reads were processed in approximately 20, 14, and 11 hours when utilizing 20, 30, and 40 nodes on the Amazon EC2 cloud-computing environment. The running time evaluation indicates that CLUSTOM-CLOUD can handle much larger sequence datasets than CLUSTOM and is also a scalable distributed processing system. The comparative accuracy test using 16S rRNA pyrosequences of a mock community shows that CLUSTOM-CLOUD achieves higher accuracy than DOTUR, mothur, ESPRIT-Tree, UCLUST and Swarm. CLUSTOM-CLOUD is written in
Benchmarking Undedicated Cloud Computing Providers for Analysis of Genomic Datasets
Yazar, Seyhan; Gooden, George E. C.; Mackey, David A.; Hewitt, Alex W.
2014-01-01
A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5–78.2) for E.coli and 53.5% (95% CI: 34.4–72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5–303.1) and 173.9% (95% CI: 134.6–213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE. PMID:25247298
Boverhof's App Earns Honorable Mention in Amazon's Web Services Competition
Boverhof's app earned an honorable mention in a competition run by Amazon Web Services (AWS), which officially announced the winners of its EC2 Spotathon on Monday.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, S. T.; Artaxo, P.; Machado, L.
The Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) experiment took place around the urban region of Manaus in central Amazonia across two years. The urban pollution plume was used to study the susceptibility of gases, aerosols, clouds, and rainfall to human activities in a tropical environment. Many aspects of air quality, weather, terrestrial ecosystems, and climate work differently in the tropics than in the more thoroughly studied USA. The experiment employed an unparalleled suite of measurements at nine ground sites and onboard two aircraft to investigate the flow of background air into Manaus, the emissions into the air over the city, and the advection of the pollution downwind of the city. Herein, to visualize this train of processes and its effects, observations aboard a low-flying aircraft are presented. Comparative measurements within and adjacent to the plume followed the emissions of biogenic volatile organic carbon compounds (BVOCs) from the tropical forest, their transformations by the atmospheric oxidant cycle, alterations of this cycle by the influence of the pollutants, transformations of the chemical products into aerosol particles, the relationship of these particles to cloud condensation nuclei (CCN) activity, and the differences in cloud properties and rainfall for background compared to polluted conditions. The observations of the GoAmazon2014/5 experiment illustrate how the hydrologic cycle, radiation balance, and carbon recycling may be affected by present-day as well as future economic development and pollution over the Amazonian tropical forest.
Are Cloud Environments Ready for Scientific Applications?
NASA Astrophysics Data System (ADS)
Mehrotra, P.; Shackleford, K.
2011-12-01
Cloud computing environments are becoming widely available both in the commercial and government sectors. They provide flexibility to rapidly provision resources in order to meet dynamic and changing computational needs without the customers incurring capital expenses and/or requiring technical expertise. Clouds also provide reliable access to resources even though the end-user may not have in-house expertise for acquiring or operating such resources. Consolidation and pooling in a cloud environment allow organizations to achieve economies of scale in provisioning or procuring computing resources and services. Because of these and other benefits, many businesses and organizations are migrating their business applications (e.g., websites, social media, and business processes) to cloud environments, as evidenced by the commercial success of offerings such as Amazon EC2. In this paper, we focus on the feasibility of utilizing cloud environments for scientific workloads and workflows particularly of interest to NASA scientists and engineers. There is a wide spectrum of such technical computations. These applications range from small workstation-level computations to mid-range computing requiring small clusters to high-performance simulations requiring supercomputing systems with high bandwidth/low latency interconnects. Data-centric applications manage and manipulate large data sets such as satellite observational data and/or data previously produced by high-fidelity modeling and simulation computations. Most of the applications are run in batch mode with static resource requirements. However, there do exist situations that have dynamic demands, particularly ones with public-facing interfaces providing information to the general public, collaborators and partners, as well as to internal NASA users. In the last few months we have been studying the suitability of cloud environments for NASA's technical and scientific workloads. We have ported several applications to
Cloud computing and validation of expandable in silico livers
2010-01-01
Background In Silico Livers (ISLs) are works in progress. They are used to challenge multilevel, multi-attribute, mechanistic hypotheses about the hepatic disposition of xenobiotics coupled with hepatic responses. To enhance ISL-to-liver mappings, we added discrete time metabolism, biliary elimination, and bolus dosing features to a previously validated ISL and initiated re-validation experiments that required scaling experiments to use more simulated lobules than previously, more than could be achieved using the local cluster technology. Rather than dramatically increasing the size of our local cluster we undertook the re-validation experiments using the Amazon EC2 cloud platform. Doing so required demonstrating the efficacy of scaling a simulation to use more cluster nodes and assessing the scientific equivalence of local cluster validation experiments with those executed using the cloud platform. Results The local cluster technology was duplicated in the Amazon EC2 cloud platform. Synthetic modeling protocols were followed to identify a successful parameterization. Experiment sample sizes (number of simulated lobules) on both platforms were 49, 70, 84, and 152 (cloud only). Experimental indistinguishability was demonstrated for ISL outflow profiles of diltiazem using both platforms for experiments consisting of 84 or more samples. The process was analogous to demonstration of results equivalency from two different wet-labs. Conclusions The results provide additional evidence that disposition simulations using ISLs can cover the behavior space of liver experiments in distinct experimental contexts (there is in silico-to-wet-lab phenotype similarity). The scientific value of experimenting with multiscale biomedical models has been limited to research groups with access to computer clusters. The availability of cloud technology coupled with the evidence of scientific equivalency has lowered the barrier and will greatly facilitate model sharing as well as provide
Cloud computing and validation of expandable in silico livers.
Ropella, Glen E P; Hunt, C Anthony
2010-12-03
In Silico Livers (ISLs) are works in progress. They are used to challenge multilevel, multi-attribute, mechanistic hypotheses about the hepatic disposition of xenobiotics coupled with hepatic responses. To enhance ISL-to-liver mappings, we added discrete time metabolism, biliary elimination, and bolus dosing features to a previously validated ISL and initiated re-validation experiments that required scaling experiments to use more simulated lobules than previously, more than could be achieved using the local cluster technology. Rather than dramatically increasing the size of our local cluster we undertook the re-validation experiments using the Amazon EC2 cloud platform. Doing so required demonstrating the efficacy of scaling a simulation to use more cluster nodes and assessing the scientific equivalence of local cluster validation experiments with those executed using the cloud platform. The local cluster technology was duplicated in the Amazon EC2 cloud platform. Synthetic modeling protocols were followed to identify a successful parameterization. Experiment sample sizes (number of simulated lobules) on both platforms were 49, 70, 84, and 152 (cloud only). Experimental indistinguishability was demonstrated for ISL outflow profiles of diltiazem using both platforms for experiments consisting of 84 or more samples. The process was analogous to demonstration of results equivalency from two different wet-labs. The results provide additional evidence that disposition simulations using ISLs can cover the behavior space of liver experiments in distinct experimental contexts (there is in silico-to-wet-lab phenotype similarity). The scientific value of experimenting with multiscale biomedical models has been limited to research groups with access to computer clusters. The availability of cloud technology coupled with the evidence of scientific equivalency has lowered the barrier and will greatly facilitate model sharing as well as provide straightforward tools for scaling
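The cross-platform "experimental indistinguishability" claim above is a statistical equivalence judgment between two sets of outflow profiles. One standard distribution-free comparison of that kind is a two-sample Kolmogorov-Smirnov statistic; the sketch below uses invented sample values, and the paper's actual acceptance criterion is not reproduced here:

```python
def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap between
    the empirical CDFs of two samples. Small values support treating the
    two platforms' results as draws from the same distribution."""
    a, b = sorted(a), sorted(b)
    points = sorted(set(a) | set(b))
    def ecdf(s, x):
        return sum(v <= x for v in s) / len(s)
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in points)

local = [0.9, 1.1, 1.0, 1.2, 0.8]   # illustrative outflow values
cloud = [0.9, 1.1, 1.0, 1.2, 0.8]
print(ks_statistic(local, cloud))    # identical samples -> 0.0
```

Against a critical value for the chosen sample size, such a statistic supports exactly the "two wet-labs agree" analogy drawn in the abstract.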
NASA Astrophysics Data System (ADS)
Andreae, M. O.; Afchine, A.; Albrecht, R. I.; Artaxo, P.; Borrmann, S.; Cecchini, M. A.; Costa, A.; Dollner, M.; Fütterer, D.; Järvinen, E.; Klimach, T.; Konemann, T.; Kraemer, M.; Krüger, M. L.; Machado, L.; Mertes, S.; Pöhlker, C.; Poeschl, U.; Sauer, D. N.; Schnaiter, M.; Schneider, J.; Schulz, C.; Spanu, A.; Walser, A.; Weinzierl, B.; Wendisch, M.
2015-12-01
The German-Brazilian cooperative aircraft campaign ACRIDICON-CHUVA (Aerosol, Cloud, Precipitation, and Radiation Interactions and Dynamics of Convective Cloud Systems) on the German research aircraft HALO took place over the Amazon Basin in September/October 2014, with the objective of studying tropical deep convective clouds over the Amazon rainforest and their interactions with trace gases, aerosol particles, and atmospheric radiation. The aircraft was equipped with about 30 remote sensing and in-situ instruments for meteorological, trace gas, aerosol, cloud, precipitation, and solar radiation measurements. Fourteen research flights were conducted during this campaign. Observations during ACRIDICON-CHUVA showed high aerosol concentrations in the upper troposphere (UT) over the Amazon Basin, with concentrations after normalization to standard conditions often exceeding those in the boundary layer (BL). This behavior was consistent between several aerosol metrics, including condensation nuclei (CN), cloud condensation nuclei (CCN), and chemical species mass concentrations. These UT aerosols were different in their composition and size distribution from the aerosol in the BL, making convective transport of particles unlikely as a source. The regions in the immediate outflow of deep convective clouds were found to be depleted in aerosol particles, whereas enhanced aerosol number and mass concentrations were found in UT regions that had experienced outflow from deep convection in the preceding 24-48 hours. This suggests that aerosol production takes place in the UT based on volatile and condensable material brought up by deep convection. Subsequently, downward mixing and transport of upper tropospheric aerosol may be a source of particles to the BL, where they increase in size by the condensation of biogenic volatile organic carbon (BVOC) oxidation products. This may be an important source of aerosol particles in the Amazonian BL, where aerosol nucleation and new
Angiuoli, Samuel V; White, James R; Matalka, Malcolm; White, Owen; Fricke, W Florian
2011-01-01
The widespread popularity of genomic applications is threatened by the "bioinformatics bottleneck" resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S rRNA amplicon sequencing, microbial single
Angiuoli, Samuel V.; White, James R.; Matalka, Malcolm; White, Owen; Fricke, W. Florian
2011-01-01
Background The widespread popularity of genomic applications is threatened by the “bioinformatics bottleneck” resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. Results We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. Conclusions Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S r
The effect of atmospheric aerosol particles and clouds on net ecosystem exchange in the Amazon
NASA Astrophysics Data System (ADS)
Cirino, G. G.; Souza, R. A. F.; Adams, D. K.; Artaxo, P.
2014-07-01
Carbon cycling in the Amazon is closely linked to atmospheric processes and climate in the region as a consequence of the strong coupling between the atmosphere and biosphere. This work examines the effects of changes in net radiation due to atmospheric aerosol particles and clouds on the net ecosystem exchange (NEE) of CO2 in the Amazon region. Some of the major environmental factors affecting the photosynthetic activity of plants, such as air temperature and relative humidity, were also examined. An algorithm for clear-sky irradiance was developed and used to determine the relative irradiance, f, which quantifies the percentage of solar radiation absorbed and scattered due to atmospheric aerosol particles and clouds. Aerosol optical depth (AOD) was calculated from irradiances measured with the MODIS (Moderate Resolution Imaging Spectroradiometer) sensor, onboard the Terra and Aqua satellites, and was validated with ground-based AOD measurements from AERONET (Aerosol Robotic Network) sun photometers. Carbon fluxes were measured using the eddy covariance technique at the Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) flux towers. Two sites were studied: the Jaru Biological Reserve (RBJ), located in Rondonia, and the Cuieiras Biological Reserve at the K34 LBA tower (located in a preserved region in the central Amazon). Analysis was performed continuously from 1999 to 2009 at K34 and from 1999 to 2002 at RBJ, and includes wet, dry and transition seasons. In the Jaru Biological Reserve, a 29% increase in carbon uptake (NEE) was observed when the AOD ranged from 0.10 to 1.5 at 550 nm. In the Cuieiras Biological Reserve, the aerosol effect on NEE was smaller, accounting for an approximate 20% increase in NEE. High aerosol loading (AOD above 3 at 550 nm) or high cloud cover leads to reductions in solar flux and strong decreases in photosynthesis up to the point where NEE approaches zero. The observed increase in NEE is attributed to an enhancement (~50%) in
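The relative irradiance f described above is, in essence, the ratio of measured surface flux to the modeled clear-sky flux, so 1 - f is the share removed by aerosol and cloud. A minimal sketch (the clear-sky algorithm itself is not reproduced here, and the flux values are illustrative):

```python
def relative_irradiance(measured_wm2: float, clear_sky_wm2: float) -> float:
    """Relative irradiance f: fraction of the modeled clear-sky solar flux
    actually reaching the surface. 1 - f is the fraction absorbed and
    scattered by aerosol particles and clouds."""
    return measured_wm2 / clear_sky_wm2

f = relative_irradiance(720.0, 900.0)      # W/m^2, invented example values
print(round(f, 2), round((1 - f) * 100))   # 0.8 20  (i.e. 20% attenuated)
```

Binning tower NEE measurements by f (or by the MODIS-derived AOD) is what yields the aerosol-enhancement curves the abstract reports.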
Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing
Balasubramaniam, S.; Kavitha, V.
2015-01-01
Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns about how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptography-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud. PMID:25767826
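Geometric data perturbation, in its standard form, hides raw values by applying a secret rotation, translation, and small additive noise to the record matrix; rotation and translation preserve pairwise distances, so distance-based analysis still works on the outsourced data. A minimal sketch of that general technique (not the authors' exact scheme):

```python
import numpy as np

def geometric_perturbation(X, rng, noise_scale=0.01):
    """Perturb records (rows of X) with a random rotation, a random
    translation, and small additive noise. The rotation and translation
    preserve pairwise distances up to the noise term, so distance-based
    mining remains possible while raw attribute values are hidden."""
    d = X.shape[1]
    # Random orthogonal (rotation) matrix via QR decomposition of a Gaussian matrix
    Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    t = rng.standard_normal(d)                      # secret translation vector
    noise = noise_scale * rng.standard_normal(X.shape)
    return X @ Q + t + noise
```

The data owner keeps (Q, t) secret and uploads only the perturbed matrix to the cloud; this sidesteps per-record key management, which is the motivation the abstract gives for moving away from conventional encryption.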
SeReNA Project: studying aerosol interactions with cloud microphysics in the Amazon Basin
NASA Astrophysics Data System (ADS)
Correia, A. L.; Catandi, P. B.; Frigeri, F. F.; Ferreira, W. C.; Martins, J.; Artaxo, P.
2012-12-01
Cloud microphysics and its interaction with aerosols is a key atmospheric process for weather and climate. Interactions between clouds and aerosols can impact Earth's radiative balance and its hydrological and energetic cycles, and are responsible for a large fraction of the uncertainty in climatic models. On a planetary scale, the Amazon Basin is one of the most significant land sources of moisture and latent heat energy. Moreover, every year this region undergoes marked seasonal shifts in its atmospheric state, transitioning from clean to heavily polluted conditions due to seasonal biomass burning fires, which emit large amounts of smoke into the atmosphere. These conditions make the Amazon Basin a special place to study aerosol-cloud interactions. The SeReNA Project ("Remote sensing of clouds and their interaction with aerosols", from the acronym in Portuguese, @SerenaProject on Twitter) is an ongoing effort to experimentally investigate the impact of aerosols on cloud microphysics in Amazonia. Vertical profiles of the droplet effective radius of water and ice particles in single convective clouds can be derived from measurements of the radiation emerging from cloud sides. Aerosol optical depth, cloud top properties, and meteorological parameters retrieved from satellites will be correlated with microphysical properties derived for single clouds. Maps of cloud brightness temperature will allow building temperature vs. effective radius profiles for hydrometeors in single clouds. Figure 1 shows an example extracted from Martins et al. (2011), illustrating a proof of concept for the kind of result expected within the framework of the SeReNA Project. The results to be obtained will help advance quantitative knowledge of interactions between aerosols and clouds at the microphysical level. These interactions are a fundamental process in the context of global climatic changes; they are key to understanding basic processes within clouds and how aerosols
An Architecture for Cross-Cloud System Management
NASA Astrophysics Data System (ADS)
Dodda, Ravi Teja; Smith, Chris; van Moorsel, Aad
The emergence of the cloud computing paradigm promises flexibility and adaptability through on-demand provisioning of compute resources. As the utilization of cloud resources extends beyond a single provider, for business as well as technical reasons, the issue of effectively managing such resources comes to the fore. Different providers expose different interfaces to their compute resources utilizing varied architectures and implementation technologies. This heterogeneity poses a significant system management problem, and can limit the extent to which the benefits of cross-cloud resource utilization can be realized. We address this problem through the definition of an architecture to facilitate the management of compute resources from different cloud providers in a homogeneous manner. This preserves the flexibility and adaptability promised by the cloud computing paradigm, whilst enabling the benefits of cross-cloud resource utilization to be realized. The practical efficacy of the architecture is demonstrated through an implementation utilizing compute resources managed through different interfaces on the Amazon Elastic Compute Cloud (EC2) service. Additionally, we provide empirical results highlighting the performance differential of these different interfaces, and discuss the impact of this performance differential on efficiency and profitability.
NASA Astrophysics Data System (ADS)
Hogenson, K.; Arko, S. A.; Buechler, B.; Hogenson, R.; Herrmann, J.; Geiger, A.
2016-12-01
A problem often faced by Earth science researchers is how to scale algorithms that were developed against a few datasets up to regional or global scales. One significant hurdle can be the processing and storage resources available for such a task, not to mention the administration of those resources. As a processing environment, the cloud offers nearly unlimited potential for compute and storage, with limited administration required. The goal of the Hybrid Pluggable Processing Pipeline (HyP3) project was to demonstrate the utility of the Amazon cloud to process large amounts of data quickly and cost effectively, while remaining generic enough to incorporate new algorithms with limited administration time or expense. Principally built by three undergraduate students at the ASF DAAC, the HyP3 system relies on core Amazon services such as Lambda, the Simple Notification Service (SNS), Relational Database Service (RDS), Elastic Compute Cloud (EC2), Simple Storage Service (S3), and Elastic Beanstalk. The HyP3 user interface was written using Elastic Beanstalk, and the system uses SNS and Lambda to handle creating, instantiating, executing, and terminating EC2 instances automatically. Data are sent to S3 for delivery to customers and removed using standard data lifecycle management rules. In HyP3 all data processing is ephemeral; there are no persistent processes taking compute and storage resources or generating added cost. When complete, HyP3 will leverage the automatic scaling up and down of EC2 compute power to respond to event-driven demand surges correlated with natural disasters or reprocessing efforts. Massive simultaneous processing within EC2 will be able to match demand spikes in ways conventional physical computing power never could, and then tail off, incurring no costs when not needed. This presentation will focus on the development techniques and technologies that were used in developing the HyP3 system. Data and process flow will be shown
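The event-driven pattern the abstract describes (SNS message in, Lambda launches an EC2 worker, product lands in S3, instance terminates) can be sketched as a job message that a Lambda subscriber would act on. The field names below are illustrative, not the actual HyP3 schema:

```python
import json

def make_job_message(granule, algorithm, bucket):
    """Build a JSON job description of the kind an SNS-driven pipeline could
    publish. A Lambda subscriber would parse it, launch an EC2 worker, run
    the named algorithm on the granule, write the product to S3, and then
    terminate the instance so nothing persistent accrues cost."""
    return json.dumps({
        "granule": granule,
        "algorithm": algorithm,
        "output": "s3://{}/{}/{}.zip".format(bucket, algorithm, granule),
        "ephemeral": True,  # worker instance is terminated when the job ends
    })

# Hypothetical Sentinel-1 granule name and bucket
msg = make_job_message("S1A_IW_GRDH_20160101T000000", "rtc", "results-bucket")
```

Outputs in the S3 bucket would then be expired automatically by a lifecycle rule, matching the "ephemeral processing, lifecycle-managed delivery" design the abstract emphasizes.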
A high performance scientific cloud computing environment for materials simulations
NASA Astrophysics Data System (ADS)
Jorissen, K.; Vila, F. D.; Rehr, J. J.
2012-09-01
We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and performance monitoring, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a Java-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.
NASA Astrophysics Data System (ADS)
Kobayashi, H.; Dye, D. G.
2004-12-01
Normalized difference vegetation index (NDVI) derived from the National Oceanic and Atmospheric Administration (NOAA)/Advanced Very High Resolution Radiometer (AVHRR) is a unique measurement of long-term variations in global vegetation dynamics. The NDVI data have been used for the detection of seasonal and interannual variations in vegetation. However, as reported in several studies, NDVI decreases with increasing cloud and/or smoke aerosol contamination in the pixels. This study assesses the smoke and cloud effects on long-term Global Inventory Modeling and Mapping Studies (GIMMS) and Pathfinder AVHRR Land (PAL) NDVI data in the Amazon. This knowledge will help in developing correction methods for the tropics in the future. To assess the smoke and cloud effects on GIMMS and PAL, we used other satellite-derived data sets: NDVI derived from SPOT/VEGETATION (VGT) data and the Aerosol Index (AI) derived from the Total Ozone Mapping Spectrometer (TOMS). Since April 1998, VGT has measured the earth's surface globally, including the Amazon. The advantage of VGT is that it has a blue channel with which smoke and clouds can be easily detected. By analyzing the VGT NDVI and comparing it with the AVHRR-based NDVI, we inferred the smoke and cloud effects on the AVHRR-based NDVI. From the results of the VGT analysis, we found large NDVI seasonality in the southern and southeastern Amazon. In these areas, the NDVI gradually increased from April to July and decreased from August to October. However, sufficient NDVI data did not exist from August to November when the smoke and cloud pixels were masked using blue reflectance. Thus smoke and clouds mainly cause the large decreases in NDVI between August and November, and NDVI carries little vegetation signature in these months. We also examined the interannual variations in NDVI and smoke aerosol, and found that the decrease in NDVI is consistent with the increase in AI. Our results suggest that the months between April
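NDVI itself is a simple band ratio of near-infrared and red reflectances, which makes the smoke sensitivity discussed above easy to see: smoke scatters extra light into the red band, pushing the index down even when vegetation is unchanged. A minimal sketch (the reflectance values are illustrative, not from the study):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and red
    reflectances (each in [0, 1]); the index ranges from -1 to 1."""
    return (nir - red) / (nir + red)

# Dense canopy reflects strongly in NIR and weakly in red:
dense = ndvi(0.50, 0.10)   # high NDVI (about 0.67)
# Smoke adds scattered light to the red band, depressing NDVI:
smoky = ndvi(0.50, 0.25)   # lower NDVI (about 0.33)
```

This is why AVHRR-based NDVI drops under smoke and cloud; VGT's blue channel, which AVHRR lacks, lets such contaminated pixels be flagged and masked.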
Cloud BioLinux: pre-configured and on-demand bioinformatics computing for the genomics community.
Krampis, Konstantinos; Booth, Tim; Chapman, Brad; Tiwari, Bela; Bicak, Mesude; Field, Dawn; Nelson, Karen E
2012-03-19
A steep drop in the cost of next-generation sequencing during recent years has made the technology affordable to the majority of researchers, but downstream bioinformatic analysis still poses a resource bottleneck for smaller laboratories and institutes that do not have access to substantial computational resources. Sequencing instruments are typically bundled with only the minimal processing and storage capacity required for data capture during sequencing runs. Given the scale of sequence datasets, scientific value cannot be obtained from acquiring a sequencer unless it is accompanied by an equal investment in informatics infrastructure. Cloud BioLinux is a publicly accessible Virtual Machine (VM) that enables scientists to quickly provision on-demand infrastructures for high-performance bioinformatics computing using cloud platforms. Users have instant access to a range of pre-configured command line and graphical software applications, including a full-featured desktop interface, documentation and over 135 bioinformatics packages for applications including sequence alignment, clustering, assembly, display, editing, and phylogeny. Each tool's functionality is fully described in the documentation directly accessible from the graphical interface of the VM. Besides the Amazon EC2 cloud, we have started instances of Cloud BioLinux on a private Eucalyptus cloud installed at the J. Craig Venter Institute, and demonstrated access to the bioinformatic tools interface through a remote connection to EC2 instances from a local desktop computer. Documentation for using Cloud BioLinux on EC2 is available from our project website, while a Eucalyptus cloud image and VirtualBox Appliance are also publicly available for download and use by researchers with access to private clouds. Cloud BioLinux provides a platform for developing bioinformatics infrastructures on the cloud. An automated and configurable process builds Virtual Machines, allowing the development of highly
Long-term observations of cloud condensation nuclei in the Amazon rain forest
NASA Astrophysics Data System (ADS)
Pöhlker, Mira L.; Pöhlker, Christopher; Ditas, Florian; Klimach, Thomas; Hrabe de Angelis, Isabella; Brito, Joel; Carbone, Samara; Cheng, Yafang; Martin, Scot T.; Moran-Zuloaga, Daniel; Rose, Diana; Saturno, Jorge; Su, Hang; Thalman, Ryan; Walter, David; Wang, Jian; Barbosa, Henrique; Artaxo, Paulo; Andreae, Meinrat O.; Pöschl, Ulrich
2017-04-01
Size-resolved long-term measurements of atmospheric aerosol and cloud condensation nuclei (CCN) concentrations and hygroscopicity were conducted at the remote Amazon Tall Tower Observatory (ATTO) in the central Amazon Basin over a full seasonal cycle (Mar 2014 - Feb 2015). The measurements provide a climatology of CCN properties characteristic of a remote central Amazonian rain forest site [1,2]. The CCN measurements were continuously cycled through 10 levels of supersaturation (S = 0.11 to 1.10 %) and span the aerosol particle size range from 20 to 245 nm. The particle hygroscopicity exhibits a pronounced size dependence with lower values for the Aitken mode (κAit = 0.14 ± 0.03), higher values for the accumulation mode (κAcc = 0.22 ± 0.05), and an overall mean value of κmean = 0.17 ± 0.06, consistent with high fractions of organic aerosol. The hygroscopicity parameter, κ, exhibits remarkably little temporal variability: no pronounced diurnal cycles, only weak seasonal trends, and few short-term variations during long-range transport events. In contrast, the CCN number concentrations exhibit a pronounced seasonal cycle, tracking the pollution-related seasonality in total aerosol concentration. We find that the variability in the CCN concentrations in the central Amazon is mostly driven by aerosol particle number concentration and size distribution, while variations in aerosol hygroscopicity and chemical composition matter only during a few episodes. For modelling purposes, we compare different approaches of predicting CCN number concentration and present a novel parameterization, which allows accurate CCN predictions based on a small set of input data. In addition, we analyzed the CCN short-term variability in relation to air mass changes as well as aerosol emission and transformation processes. The CCN short-term variability is presented for selected case studies, which analyze particularly interesting and characteristic events/conditions in the Amazon
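The hygroscopicity parameter κ reported above feeds directly into κ-Köhler theory, which relates a particle's dry diameter and κ to the supersaturation at which it activates as a CCN. A sketch of the standard Petters and Kreidenweis (2007) approximation; the constants and the 100 nm example are generic illustration, not values computed in the campaign:

```python
from math import exp, sqrt

def critical_supersaturation(d_dry_m, kappa, T=298.15):
    """Critical supersaturation (in %) for a dry particle of diameter
    d_dry_m (m) with hygroscopicity kappa, using the kappa-Koehler
    approximation. Constants: water surface tension 0.072 J m^-2,
    molar mass 0.018 kg mol^-1, density 1000 kg m^-3, R = 8.314."""
    A = 4 * 0.072 * 0.018 / (8.314 * T * 1000.0)   # Kelvin term (m)
    return (exp(sqrt(4 * A**3 / (27 * kappa * d_dry_m**3))) - 1) * 100

# 100 nm accumulation-mode particle with the reported mean kappa ~ 0.17
sc = critical_supersaturation(100e-9, 0.17)
```

With κ nearly constant, as observed at ATTO, the activated fraction at a given supersaturation is set almost entirely by the size distribution, which is why the abstract finds CCN variability driven by number and size rather than composition.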
Reducing and Analyzing the PHAT Survey with the Cloud
NASA Astrophysics Data System (ADS)
Williams, Benjamin F.; Olsen, Knut; Khan, Rubab; Pirone, Daniel; Rosema, Keith
2018-05-01
We discuss the technical challenges we faced and the techniques we used to overcome them when reducing the Panchromatic Hubble Andromeda Treasury (PHAT) photometric data set on the Amazon Elastic Compute Cloud (EC2). We first describe the architecture of our photometry pipeline, which we found particularly efficient for reducing the data in multiple ways for different purposes. We then describe the features of EC2 that make this architecture both efficient to use and challenging to implement. We describe the techniques we adopted to process our data, and suggest ways these techniques may be improved for those interested in trying such reductions in the future. Finally, we summarize the output photometry data products, which are now hosted publicly in two places in two formats: as simple FITS tables in the high-level science products on MAST, and as a queryable database available through the NOAO Data Lab.
Understanding the Performance and Potential of Cloud Computing for Scientific Applications
Sadooghi, Iman; Martin, Jesus Hernandez; Li, Tonglin; ...
2015-02-19
Commercial clouds bring a great opportunity to the scientific computing area. Scientific applications usually require significant resources; however, not all scientists have access to sufficiently high-end computing systems, many of which can be found in the Top500 list. Cloud computing has gained the attention of scientists as a competitive resource to run HPC applications at a potentially lower cost. But as a different infrastructure, it is unclear whether clouds are capable of running scientific applications with a reasonable performance per money spent. This work studies the performance of public clouds and places this performance in context to price. We evaluate the raw performance of different services of the AWS cloud in terms of the basic resources, such as compute, memory, network and I/O. We also evaluate the performance of scientific applications running in the cloud. This paper aims to assess the ability of the cloud to perform well, as well as to evaluate the cost of the cloud for running scientific applications. We developed a full set of metrics and conducted a comprehensive performance evaluation over the Amazon cloud. We evaluated EC2, S3, EBS and DynamoDB among the many Amazon AWS services. We evaluated the memory sub-system performance with CacheBench, the network performance with iperf, processor and network performance with the HPL benchmark application, and shared storage with NFS and PVFS in addition to S3. We also evaluated a real scientific computing application through the Swift parallel scripting system at scale. Armed with both detailed benchmarks to gauge expected performance and a detailed monetary cost analysis, we expect this paper will be a recipe cookbook for scientists to help them decide where to deploy and run their scientific applications: public clouds, private clouds, or hybrid clouds.
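Placing benchmark performance "in context to price", as this abstract sets out to do, usually reduces to a figure of merit such as sustained work per dollar. A minimal sketch; the instance numbers and prices below are hypothetical, not measured AWS figures from the paper:

```python
def cost_efficiency(gflops_sustained, hourly_price_usd):
    """GFLOP-hours delivered per dollar: sustained benchmark throughput
    (e.g. from an HPL run) divided by the hourly on-demand price."""
    if hourly_price_usd <= 0:
        raise ValueError("price must be positive")
    return gflops_sustained / hourly_price_usd

# Hypothetical comparison of two instance types
small = cost_efficiency(80.0, 0.10)    # small, cheap instance
large = cost_efficiency(500.0, 1.00)   # large, fast instance
```

Note the two rankings can disagree: the large instance wins on time-to-solution while the small one wins per dollar, which is exactly the trade-off a cost-aware deployment decision has to weigh.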
Satellite-Observed Vertical Structures of Clouds over the Amazon Basin
NASA Astrophysics Data System (ADS)
Wu, M.; Lee, J. E.
2017-12-01
The long wet season of the Amazon basin currently plays a critical role in the terrestrial ecosystem, regulating carbon balance and supporting high biodiversity. It has been argued that land surface processes are important in maintaining high precipitation; yet, how land-atmosphere interactions modulate atmospheric processes is not completely understood. As a first step toward solving this problem, here we examine the vertical structures of clouds and the thermodynamics of the atmosphere over the entire basin at different times of the year. We combine the vertical distribution of cloud water content from CloudSat and the atmospheric thermodynamic conditions from the ECMWF ERA-Interim reanalysis to compare and contrast the atmospheric conditions at different times of the year (the wet, dry, and dry-to-wet transition seasons) and in different regions (ever-wet evergreen broadleaf forests, wet evergreen broadleaf forests with a dry season, and dry wooded grasslands/woodlands), following a water stress gradient. In the ever-wet and wet regions, a large amount of cloud ice water is present in the upper atmosphere (above 11 km) and convective available potential energy (CAPE) is high during the transition season, supporting the claim that convective activity is strongest during the transition season. In the dry region, there is more cloud water above 8 km over woodlands than over wooded grasslands during the dry and transition seasons, indicating the influence of land cover. We also classified our data following the large-scale circulation pattern, and the CloudSat data support more deep convective activity in the wet and dry regions when the wind blows from the east during the wet and transition seasons. As a next step, we will focus on linking the cloud structure to the large-scale circulation and surface processes.
Effect of Amazon Smoke on Cloud Microphysics and Albedo-Analysis from Satellite Imagery.
NASA Astrophysics Data System (ADS)
Kaufman, Yoram J.; Nakajima, Teruyuki
1993-04-01
NOAA Advanced Very High Resolution Radiometer images taken over the Brazilian Amazon Basin during the biomass burning season of 1987 are used to study the effect of smoke aerosol particles on the properties of low cumulus and stratocumulus clouds. The reflectance at a wavelength of 0.64 µm and the drop size, derived from the cloud reflectance at 3.75 µm, are studied for tens of thousands of clouds. The opacity of the smoke layer adjacent to each cloud is also monitored simultaneously. Though from satellite data it is impossible to derive all the parameters that influence cloud properties and smoke-cloud interaction (e.g., detailed aerosol particle size distribution and chemistry, liquid water content, etc.), satellite data can be used to generate large-scale statistics of the properties of clouds and surrounding aerosol (e.g., smoke optical thickness, cloud-drop size, and cloud reflection of solar radiation) from which the interaction of aerosol with clouds can be surmised. In order to minimize the effect of variations in the precipitable water vapor and in other smoke and cloud properties, biomass burning in the tropics is chosen as the study topic, and the results are averaged for numerous clouds with the same ambient smoke optical thickness. It is shown in this study that the presence of dense smoke (an increase in the optical thickness from 0.1 to 2.0) can reduce the remotely sensed drop size of continental cloud drops from 15 to 9 µm. Due to both the high initial reflectance of clouds in the visible part of the spectrum and the presence of graphitic carbon, the average cloud reflectance at 0.64 µm is reduced from 0.71 to 0.68 for an increase in smoke optical thickness from 0.1 to 2.0. The measurements are compared to results from other years, and it is found that, as predicted, high concentrations of aerosol particles cause a decrease in the cloud-drop size and that smoke darkens the bright Amazonian clouds. Comparison with theoretical computations based
Introduction: Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5)
NASA Astrophysics Data System (ADS)
Martin, S. T.; Artaxo, P.; Machado, L. A. T.; Manzi, A. O.; Souza, R. A. F.; Schumacher, C.; Wang, J.; Andreae, M. O.; Barbosa, H. M. J.; Fan, J.; Fisch, G.; Goldstein, A. H.; Guenther, A.; Jimenez, J. L.; Pöschl, U.; Silva Dias, M. A.; Smith, J. N.; Wendisch, M.
2016-04-01
The Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) Experiment was carried out in the environs of Manaus, Brazil, in the central region of the Amazon basin for 2 years from 1 January 2014 through 31 December 2015. The experiment focused on the complex interactions among vegetation, atmospheric chemistry, and aerosol production on the one hand and their connections to aerosols, clouds, and precipitation on the other. The objective was to understand and quantify these linked processes, first under natural conditions to obtain a baseline and second when altered by the effects of human activities. To this end, the pollution plume from the Manaus metropolis, superimposed on the background conditions of the central Amazon basin, served as a natural laboratory. The present paper, as the introduction to the special issue of GoAmazon2014/5, presents the context and motivation of the GoAmazon2014/5 Experiment. The nine research sites, including the characteristics and instrumentation of each site, are presented. The sites range from time point zero (T0) upwind of the pollution, to T1 in the midst of the pollution, to T2 just downwind of the pollution, to T3 furthest downwind of the pollution (70 km). In addition to the ground sites, a low-altitude G-159 Gulfstream I (G-1) observed the atmospheric boundary layer and low clouds, and a high-altitude Gulfstream G550 (HALO) operated in the free troposphere. During the 2-year experiment, two Intensive Operating Periods (IOP1 and IOP2) also took place that included additional specialized research instrumentation at the ground sites as well as flights of the two aircraft. GoAmazon2014/5 IOP1 was carried out from 1 February to 31 March 2014 in the wet season. GoAmazon2014/5 IOP2 was conducted from 15 August to 15 October 2014 in the dry season. The G-1 aircraft flew during both IOP1 and IOP2, and the HALO aircraft flew during IOP2. In the context of the Amazon basin, the two IOPs also correspond to the clean and
NASA Astrophysics Data System (ADS)
Arko, S. A.; Hogenson, R.; Geiger, A.; Herrmann, J.; Buechler, B.; Hogenson, K.
2016-12-01
In the coming years there will be an unprecedented amount of SAR data available on a free and open basis to research and operational users around the globe. The Alaska Satellite Facility (ASF) DAAC hosts, through an international agreement, data from the Sentinel-1 spacecraft and will be hosting data from the upcoming NASA ISRO SAR (NISAR) mission. To more effectively manage and exploit these vast datasets, ASF DAAC has begun moving portions of the archive to the cloud and utilizing cloud services to provide higher-level processing on the data. The Hybrid Pluggable Processing Pipeline (HyP3) project is designed to support higher-level data processing in the cloud and extend the capabilities of researchers to larger scales. Built upon a set of core Amazon cloud services, the HyP3 system allows users to request data processing using a number of canned algorithms or their own algorithms once they have been uploaded to the cloud. The HyP3 system automatically accesses the ASF cloud-based archive through the DAAC RESTful application programming interface and processes the data on Amazon's Elastic Compute Cloud (EC2). Final products are distributed through Amazon's Simple Storage Service (S3) and are available for user download. This presentation will provide an overview of ASF DAAC's activities moving the Sentinel-1 archive into the cloud and developing the integrated HyP3 system, covering both the benefits and difficulties of working in the cloud. Additionally, we will focus on the utilization of HyP3 for higher-level processing of SAR data. Two example algorithms, for sea-ice tracking and change detection, will be discussed, as well as the mechanism for integrating new algorithms into the pipeline for community use.
An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center
NASA Astrophysics Data System (ADS)
Gleason, J. L.; Little, M. M.
2013-12-01
NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead, it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. NASA science computing is therefore a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and it potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS aims to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the needs of the Langley Science Directorate must be evaluated by integrating it with real-world operational workloads across NASA and gaining the associated maturity that such integration would bring. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications have been demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Science Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level, pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective, specifically using a processing scenario involving the Clouds and the Earth's Radiant Energy System (CERES) project.
Introduction: Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5)
NASA Astrophysics Data System (ADS)
Martin, S. T.; Artaxo, P.; Machado, L. A. T.; Manzi, A. O.; Souza, R. A. F.; Schumacher, C.; Wang, J.; Andreae, M. O.; Barbosa, H. M. J.; Fan, J.; Fisch, G.; Goldstein, A. H.; Guenther, A.; Jimenez, J. L.; Pöschl, U.; Silva Dias, M. A.; Smith, J. N.; Wendisch, M.
2015-11-01
The Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) Experiment was carried out in the environs of Manaus, Brazil, in the central region of the Amazon basin during two years from 1 January 2014 through 31 December 2015. The experiment focused on the complex interactions among vegetation, atmospheric chemistry, and aerosol production on the one hand and their connections to aerosols, clouds, and precipitation on the other. The objective was to understand and quantify these linked processes, first under natural conditions to obtain a baseline and second when altered by the effects of human activities. To this end, the pollution plume from the Manaus metropolis, superimposed on the background conditions of the central Amazon basin, served as a natural laboratory. The present paper, as the Introduction to the GoAmazon2014/5 Special Issue, presents the context and motivation of the GoAmazon2014/5 Experiment. The nine research sites, including the characteristics and instrumentation of each site, are presented. The sites range from time point zero (T0) upwind of the pollution, to T1 in the midst of the pollution, to T2 just downwind of the pollution, to T3 furthest downwind of the pollution (70 km). In addition to the ground sites, a low-altitude G-159 Gulfstream I (G1) observed the atmospheric boundary layer and low clouds, and a high-altitude Gulfstream G550 (HALO) operated in the free troposphere. During the two-year experiment, two Intensive Operating Periods (IOP1 and IOP2) also took place that included additional specialized research instrumentation at the ground sites as well as flights of the two aircraft. GoAmazon2014/5 IOP1 was carried out from 1 February to 31 March 2014 in the wet season. GoAmazon2014/5 IOP2 was conducted from 15 August to 15 October 2014 in the dry season. The G1 aircraft flew during both IOP1 and IOP2, and the HALO aircraft flew during IOP2. In the context of the Amazon basin, the two IOPs also correspond to the clean
NASA Cloud-Based Climate Data Services
NASA Astrophysics Data System (ADS)
McInerney, M. A.; Schnase, J. L.; Duffy, D. Q.; Tamkin, G. S.; Strong, S.; Ripley, W. D., III; Thompson, J. H.; Gill, R.; Jasen, J. E.; Samowich, B.; Pobre, Z.; Salmon, E. M.; Rumney, G.; Schardt, T. D.
2012-12-01
Cloud-based scientific data services are becoming an important part of NASA's mission. Our technological response is built around the concept of specialized virtual climate data servers, repetitive cloud provisioning, image-based deployment and distribution, and virtualization-as-a-service (VaaS). A virtual climate data server (vCDS) is an Open Archive Information System (OAIS) compliant, iRODS-based data server designed to support a particular type of scientific data collection. iRODS is data grid middleware that provides policy-based control over collection-building, managing, querying, accessing, and preserving large scientific data sets. We have deployed vCDS Version 1.0 in the Amazon EC2 cloud using S3 object storage and are using the system to deliver a subset of NASA's Intergovernmental Panel on Climate Change (IPCC) data products to the latest CentOS federated version of Earth System Grid Federation (ESGF), which is also running in the Amazon cloud. vCDS-managed objects are exposed to ESGF through FUSE (Filesystem in User Space), which presents a POSIX-compliant filesystem abstraction to applications such as the ESGF server that require such an interface. A vCDS manages data as a distinguished collection for a person, project, lab, or other logical unit. A vCDS can manage a collection across multiple storage resources using rules and microservices to enforce collection policies. And a vCDS can federate with other vCDSs to manage multiple collections over multiple resources, thereby creating what can be thought of as an ecosystem of managed collections. With the vCDS approach, we are trying to enable the full information lifecycle management of scientific data collections and make tractable the task of providing diverse climate data services. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. (Figures: (A) vCDS/ESG system stack; (B) conceptual architecture for NASA cloud-based data services.)
Introduction: Observations and modeling of the Green Ocean Amazon (GoAmazon2014/5)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, S. T.; Artaxo, P.; Machado, L. A. T.
The Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) Experiment was carried out in the environs of Manaus, Brazil, in the central region of the Amazon basin for 2 years from 1 January 2014 through 31 December 2015. The experiment focused on the complex interactions among vegetation, atmospheric chemistry, and aerosol production on the one hand and their connections to aerosols, clouds, and precipitation on the other. The objective was to understand and quantify these linked processes, first under natural conditions to obtain a baseline and second when altered by the effects of human activities. To this end, the pollution plume from the Manaus metropolis, superimposed on the background conditions of the central Amazon basin, served as a natural laboratory. The present paper, as the introduction to the special issue of GoAmazon2014/5, presents the context and motivation of the GoAmazon2014/5 Experiment. The nine research sites, including the characteristics and instrumentation of each site, are presented. The sites range from time point zero (T0) upwind of the pollution, to T1 in the midst of the pollution, to T2 just downwind of the pollution, to T3 furthest downwind of the pollution (70 km). In addition to the ground sites, a low-altitude G-159 Gulfstream I (G-1) observed the atmospheric boundary layer and low clouds, and a high-altitude Gulfstream G550 (HALO) operated in the free troposphere. During the 2-year experiment, two Intensive Operating Periods (IOP1 and IOP2) also took place that included additional specialized research instrumentation at the ground sites as well as flights of the two aircraft. GoAmazon2014/5 IOP1 was carried out from 1 February to 31 March 2014 in the wet season. GoAmazon2014/5 IOP2 was conducted from 15 August to 15 October 2014 in the dry season. In addition, the G-1 aircraft flew during both IOP1 and IOP2, and the HALO aircraft flew during IOP2. In the context of the Amazon basin, the two IOPs
Introduction: Observations and modeling of the Green Ocean Amazon (GoAmazon2014/5)
Martin, S. T.; Artaxo, P.; Machado, L. A. T.; ...
2016-04-19
The Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) Experiment was carried out in the environs of Manaus, Brazil, in the central region of the Amazon basin for 2 years from 1 January 2014 through 31 December 2015. The experiment focused on the complex interactions among vegetation, atmospheric chemistry, and aerosol production on the one hand and their connections to aerosols, clouds, and precipitation on the other. The objective was to understand and quantify these linked processes, first under natural conditions to obtain a baseline and second when altered by the effects of human activities. To this end, the pollution plume from the Manaus metropolis, superimposed on the background conditions of the central Amazon basin, served as a natural laboratory. The present paper, as the introduction to the special issue of GoAmazon2014/5, presents the context and motivation of the GoAmazon2014/5 Experiment. The nine research sites, including the characteristics and instrumentation of each site, are presented. The sites range from time point zero (T0) upwind of the pollution, to T1 in the midst of the pollution, to T2 just downwind of the pollution, to T3 furthest downwind of the pollution (70 km). In addition to the ground sites, a low-altitude G-159 Gulfstream I (G-1) observed the atmospheric boundary layer and low clouds, and a high-altitude Gulfstream G550 (HALO) operated in the free troposphere. During the 2-year experiment, two Intensive Operating Periods (IOP1 and IOP2) also took place that included additional specialized research instrumentation at the ground sites as well as flights of the two aircraft. GoAmazon2014/5 IOP1 was carried out from 1 February to 31 March 2014 in the wet season. GoAmazon2014/5 IOP2 was conducted from 15 August to 15 October 2014 in the dry season. In addition, the G-1 aircraft flew during both IOP1 and IOP2, and the HALO aircraft flew during IOP2. In the context of the Amazon basin, the two IOPs
MarDRe: efficient MapReduce-based removal of duplicate DNA reads in the cloud.
Expósito, Roberto R; Veiga, Jorge; González-Domínguez, Jorge; Touriño, Juan
2017-09-01
This article presents MarDRe, a de novo cloud-ready duplicate and near-duplicate removal tool that can process single- and paired-end reads from FASTQ/FASTA datasets. MarDRe takes advantage of the widely adopted MapReduce programming model to fully exploit Big Data technologies on cloud-based infrastructures. Written in Java to maximize cross-platform compatibility, MarDRe is built upon the open-source Apache Hadoop project, the most popular distributed computing framework for scalable Big Data processing. On a 16-node cluster deployed on the Amazon EC2 cloud platform, MarDRe is up to 8.52 times faster than a representative state-of-the-art tool. Source code in Java and Hadoop as well as a user's guide are freely available under the GNU GPLv3 license at http://mardre.des.udc.es.
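MarDRe's MapReduce formulation of duplicate removal can be illustrated with a toy in-memory map/reduce pass: the mapper keys each read by its sequence so duplicates collide on the same key, and the reducer keeps one representative per key. This is a sketch of the general technique only, not MarDRe's Java/Hadoop implementation, which also handles near-duplicates and paired-end reads.

```python
from collections import defaultdict

def map_phase(reads):
    """Mapper: emit (sequence, read_id) so identical reads share a key."""
    for read_id, sequence in reads:
        yield sequence, read_id

def reduce_phase(mapped):
    """Reducer: group by sequence and keep one read per group."""
    groups = defaultdict(list)
    for sequence, read_id in mapped:
        groups[sequence].append(read_id)
    # Emit the first read of each group, discarding exact duplicates.
    return [(ids[0], seq) for seq, ids in groups.items()]

reads = [("r1", "ACGT"), ("r2", "ACGT"), ("r3", "TTGA")]
unique = reduce_phase(map_phase(reads))
print(unique)  # two unique sequences remain
```

On Hadoop, the shuffle between the two phases performs the grouping, so the reducer never holds the whole dataset in one process.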
NASA Astrophysics Data System (ADS)
Hayley, Kevin; Schumacher, J.; MacMillan, G. J.; Boutin, L. C.
2014-05-01
Expanding groundwater datasets collected by automated sensors, and improved groundwater databases, have caused a rapid increase in calibration data available for groundwater modeling projects. Improved methods of subsurface characterization have increased the need for model complexity to represent geological and hydrogeological interpretations. The larger calibration datasets and the need for meaningful predictive uncertainty analysis have both increased the degree of parameterization necessary during model calibration. Due to these competing demands, modern groundwater modeling efforts require a massive degree of parallelization in order to remain computationally tractable. A methodology for the calibration of highly parameterized, computationally expensive models using the Amazon EC2 cloud computing service is presented. The calibration of a regional-scale model of groundwater flow in Alberta, Canada, is provided as an example. The model covers a 30,865-km2 domain and includes 28 hydrostratigraphic units. Aquifer properties were calibrated to more than 1,500 static hydraulic head measurements and 10 years of measurements during industrial groundwater use. Three regionally extensive aquifers were parameterized (with spatially variable hydraulic conductivity fields), as was the areal recharge boundary condition, leading to 450 adjustable parameters in total. The PEST-based model calibration was parallelized on up to 250 computing nodes located on Amazon's EC2 servers.
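The parallelization pattern here is embarrassingly parallel: each calibration iteration requires many independent forward-model runs (one per perturbed parameter), which a PEST-style tool farms out to worker nodes. The sketch below uses a toy quadratic objective and invented parameter names purely for illustration; on EC2, each evaluation would be a full groundwater-model run on its own node.

```python
from concurrent.futures import ThreadPoolExecutor

def forward_model(params):
    """Stand-in for one expensive groundwater-model run; a real run would
    launch the simulator and return a misfit against observed heads."""
    conductivity, recharge = params
    return (conductivity - 3.0) ** 2 + (recharge - 1.5) ** 2

def evaluate_parameter_sets(parameter_sets, max_workers=4):
    """Evaluate independent parameter perturbations concurrently, the way
    a PEST-style calibration distributes runs across compute nodes."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(forward_model, parameter_sets))

misfits = evaluate_parameter_sets([(3.0, 1.5), (0.0, 0.0)])
print(misfits)  # the first parameter set matches the toy optimum exactly
```

Because runs are independent, throughput scales almost linearly with node count until the per-iteration synchronization in the calibration algorithm dominates.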
GATE Monte Carlo simulation of dose distribution using MapReduce in a cloud computing environment.
Liu, Yangchuan; Tang, Yuguo; Gao, Xin
2017-12-01
The GATE Monte Carlo simulation platform has good application prospects for treatment planning and quality assurance. However, accurate dose calculation using GATE is time consuming. The purpose of this study is to implement a novel cloud computing method for accurate GATE Monte Carlo simulation of dose distribution using MapReduce. An Amazon Machine Image installed with Hadoop and GATE is created to set up Hadoop clusters on Amazon Elastic Compute Cloud (EC2). Macros, the input files for GATE, are split into a number of self-contained sub-macros. Through Hadoop Streaming, the sub-macros are executed by GATE in Map tasks and the sub-results are aggregated into final outputs in Reduce tasks. As an evaluation, GATE simulations were performed in a cubical water phantom for X-ray photons of 6 and 18 MeV. The parallel simulation on the cloud computing platform is as accurate as the single-threaded simulation on a local server. The cloud-based simulation time is approximately inversely proportional to the number of worker nodes. For the simulation of 10 million photons on a cluster with 64 worker nodes, time decreases of 41× and 32× were achieved compared to the single worker node case and the single-threaded case, respectively. The test of Hadoop's fault tolerance showed that the simulation correctness was not affected by the failure of some worker nodes. The results verify that the proposed method provides a feasible cloud computing solution for GATE.
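The macro-splitting step amounts to dividing the photon budget into self-contained sub-simulations whose tallies are summed in the Reduce phase. A minimal sketch of that split (the macro command mirrors GATE's primary-count setting, but treat the exact syntax as an assumption rather than a verified GATE reference):

```python
def split_photon_budget(total_photons, n_tasks):
    """Divide a simulation's photon count into near-equal sub-macro
    budgets that sum exactly to the original total."""
    base, remainder = divmod(total_photons, n_tasks)
    return [base + (1 if i < remainder else 0) for i in range(n_tasks)]

def write_sub_macro(photons):
    # Each sub-macro reuses the same geometry/physics setup but simulates
    # only its share of the primaries (command syntax is an assumption).
    return f"/gate/application/setTotalNumberOfPrimaries {photons}"

budgets = split_photon_budget(10_000_000, 64)
print(len(budgets), sum(budgets), max(budgets) - min(budgets))
```

Because Monte Carlo histories are statistically independent (given distinct random seeds per task), summing the per-task dose tallies in the reducer reproduces the single-run result.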
Amazon boundary layer aerosol concentration sustained by vertical transport during rainfall
NASA Astrophysics Data System (ADS)
Wang, Jian; Krejci, Radovan; Giangrande, Scott; Kuang, Chongai; Barbosa, Henrique M. J.; Brito, Joel; Carbone, Samara; Chi, Xuguang; Comstock, Jennifer; Ditas, Florian; Lavric, Jost; Manninen, Hanna E.; Mei, Fan; Moran-Zuloaga, Daniel; Pöhlker, Christopher; Pöhlker, Mira L.; Saturno, Jorge; Schmid, Beat; Souza, Rodrigo A. F.; Springston, Stephen R.; Tomlinson, Jason M.; Toto, Tami; Walter, David; Wimmer, Daniela; Smith, James N.; Kulmala, Markku; Machado, Luiz A. T.; Artaxo, Paulo; Andreae, Meinrat O.; Petäjä, Tuukka; Martin, Scot T.
2016-11-01
The nucleation of atmospheric vapours is an important source of new aerosol particles that can subsequently grow to form cloud condensation nuclei in the atmosphere. Most field studies of atmospheric aerosols over continents are influenced by atmospheric vapours of anthropogenic origin (for example, ref. 2) and, in consequence, aerosol processes in pristine, terrestrial environments remain poorly understood. The Amazon rainforest is one of the few continental regions where aerosol particles and their precursors can be studied under near-natural conditions, but the origin of small aerosol particles that grow into cloud condensation nuclei in the Amazon boundary layer remains unclear. Here we present aircraft- and ground-based measurements under clean conditions during the wet season in the central Amazon basin. We find that high concentrations of small aerosol particles (with diameters of less than 50 nanometres) in the lower free troposphere are transported from the free troposphere into the boundary layer during precipitation events by strong convective downdrafts and weaker downward motions in the trailing stratiform region. This rapid vertical transport can help to maintain the population of particles in the pristine Amazon boundary layer, and may therefore influence cloud properties and climate under natural conditions.
Amazon boundary layer aerosol concentration sustained by vertical transport during rainfall.
Wang, Jian; Krejci, Radovan; Giangrande, Scott; Kuang, Chongai; Barbosa, Henrique M J; Brito, Joel; Carbone, Samara; Chi, Xuguang; Comstock, Jennifer; Ditas, Florian; Lavric, Jost; Manninen, Hanna E; Mei, Fan; Moran-Zuloaga, Daniel; Pöhlker, Christopher; Pöhlker, Mira L; Saturno, Jorge; Schmid, Beat; Souza, Rodrigo A F; Springston, Stephen R; Tomlinson, Jason M; Toto, Tami; Walter, David; Wimmer, Daniela; Smith, James N; Kulmala, Markku; Machado, Luiz A T; Artaxo, Paulo; Andreae, Meinrat O; Petäjä, Tuukka; Martin, Scot T
2016-11-17
The nucleation of atmospheric vapours is an important source of new aerosol particles that can subsequently grow to form cloud condensation nuclei in the atmosphere. Most field studies of atmospheric aerosols over continents are influenced by atmospheric vapours of anthropogenic origin (for example, ref. 2) and, in consequence, aerosol processes in pristine, terrestrial environments remain poorly understood. The Amazon rainforest is one of the few continental regions where aerosol particles and their precursors can be studied under near-natural conditions, but the origin of small aerosol particles that grow into cloud condensation nuclei in the Amazon boundary layer remains unclear. Here we present aircraft- and ground-based measurements under clean conditions during the wet season in the central Amazon basin. We find that high concentrations of small aerosol particles (with diameters of less than 50 nanometres) in the lower free troposphere are transported from the free troposphere into the boundary layer during precipitation events by strong convective downdrafts and weaker downward motions in the trailing stratiform region. This rapid vertical transport can help to maintain the population of particles in the pristine Amazon boundary layer, and may therefore influence cloud properties and climate under natural conditions.
Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing.
Zhao, Shanrong; Prenger, Kurt; Smith, Lance; Messina, Thomas; Fan, Hongtao; Jaeger, Edward; Stephens, Susan
2013-06-27
Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log the running metrics in data processing and monitoring multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box. Rainbow is available
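Improvement (2), splitting large sequence files for load balance, reduces to chunking FASTQ input on 4-line record boundaries so each downstream worker gets a similar share. A minimal sketch (not Rainbow's actual code, which streams multi-gigabyte files rather than in-memory lists):

```python
def split_fastq(lines, reads_per_chunk):
    """Split FASTQ content (4 lines per read) into chunks without ever
    cutting a record in half, keeping downstream workers load-balanced."""
    if len(lines) % 4 != 0:
        raise ValueError("truncated FASTQ: records are 4 lines each")
    reads = [lines[i:i + 4] for i in range(0, len(lines), 4)]
    return [sum(reads[i:i + reads_per_chunk], [])
            for i in range(0, len(reads), reads_per_chunk)]

records = ["@r1", "ACGT", "+", "IIII", "@r2", "TTGA", "+", "IIII"]
chunks = split_fastq(records, 1)
print(len(chunks))  # one chunk per read when reads_per_chunk=1
```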
Adopting Cloud Computing in the Pakistan Navy
2015-06-01
…administrative aspect is required to operate optimally, provide synchronized delivery of cloud services, and integrate a multi-provider cloud environment… also adopted cloud computing as an integral component of military operations conducted either locally or remotely. With the use of cloud services
Integration of Cloud resources in the LHCb Distributed Computing
NASA Astrophysics Data System (ADS)
Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel
2014-06-01
This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it has seamlessly integrated Grid resources and computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack), and it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. With this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs, based on the idea of a Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we also describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.
Toward a web-based real-time radiation treatment planning system in a cloud computing environment.
Na, Yong Hum; Suh, Tae-Suk; Kapp, Daniel S; Xing, Lei
2013-09-21
To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (named m2.xlarge containing 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an 'on-demand' basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer's constraints. The output plan file from the EC2 is sent to the simple storage service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm²) from the Varian TrueBeam™ STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. The resultant plans from the cloud computing are
Toward a web-based real-time radiation treatment planning system in a cloud computing environment
NASA Astrophysics Data System (ADS)
Hum Na, Yong; Suh, Tae-Suk; Kapp, Daniel S.; Xing, Lei
2013-09-01
To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (named m2.xlarge containing 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an 'on-demand' basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer's constraints. The output plan file from the EC2 is sent to the simple storage service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm²) from the Varian TrueBeam™ STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. The resultant plans from the cloud computing are identical
CloVR-Comparative: automated, cloud-enabled comparative microbial genome sequence analysis pipeline.
Agrawal, Sonia; Arze, Cesar; Adkins, Ricky S; Crabtree, Jonathan; Riley, David; Vangala, Mahesh; Galens, Kevin; Fraser, Claire M; Tettelin, Hervé; White, Owen; Angiuoli, Samuel V; Mahurkar, Anup; Fricke, W Florian
2017-04-27
The benefit of increasing genomic sequence data to the scientific community depends on easy-to-use, scalable bioinformatics support. CloVR-Comparative combines commonly used bioinformatics tools into an intuitive, automated, and cloud-enabled analysis pipeline for comparative microbial genomics. CloVR-Comparative runs on annotated complete or draft genome sequences that are uploaded by the user or selected via a taxonomic tree-based user interface and downloaded from NCBI. CloVR-Comparative runs reference-free multiple whole-genome alignments to determine unique, shared and core coding sequences (CDSs) and single nucleotide polymorphisms (SNPs). Output includes short summary reports and detailed text-based results files, graphical visualizations (phylogenetic trees, circular figures), and a database file linked to the Sybil comparative genome browser. Data up- and download, pipeline configuration and monitoring, and access to Sybil are managed through CloVR-Comparative web interface. CloVR-Comparative and Sybil are distributed as part of the CloVR virtual appliance, which runs on local computers or the Amazon EC2 cloud. Representative datasets (e.g. 40 draft and complete Escherichia coli genomes) are processed in <36 h on a local desktop or at a cost of <$20 on EC2. CloVR-Comparative allows anybody with Internet access to run comparative genomics projects, while eliminating the need for on-site computational resources and expertise.
Giovanni in the Cloud: Earth Science Data Exploration in Amazon Web Services
NASA Astrophysics Data System (ADS)
Hegde, M.; Petrenko, M.; Smit, C.; Zhang, H.; Pilone, P.; Zasorin, A. A.; Pham, L.
2017-12-01
Giovanni (https://giovanni.gsfc.nasa.gov/giovanni/) is a popular online data exploration tool at the NASA Goddard Earth Sciences Data Information Services Center (GES DISC), providing 22 analysis and visualization services for over 1600 Earth Science data variables. Owing to its popularity, Giovanni has experienced a consistent growth in overall demand, with periodic usage spikes attributed to trainings by education organizations, extensive data analysis in response to natural disasters, preparations for science meetings, etc. Furthermore, the new generation of spaceborne sensors and high resolution models have resulted in an exponential growth in data volume with data distributed across the traditional boundaries of datacenters. Seamless exploration of data (without users having to worry about data center boundaries) has been a key recommendation of the GES DISC User Working Group. These factors have required new strategies for delivering acceptable performance. The cloud-based Giovanni, built on Amazon Web Services (AWS), evaluates (1) AWS native solutions to provide a scalable, serverless architecture; (2) open standards for data storage in the Cloud; (3) a cost model for operations; and (4) end-user performance. Our preliminary findings indicate that the use of serverless architecture has a potential to significantly reduce development and operational cost of Giovanni. The combination of using AWS managed services, storage of data in open standards, and schema-on-read data access strategy simplifies data access and analytics, in addition to making data more accessible to the end users of Giovanni through popular programming languages.
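The serverless, schema-on-read approach described above means each analysis service can be a small stateless function invoked on demand. The handler below is an illustrative AWS Lambda-style sketch of an area-averaging service; the event schema and field names are invented for illustration and are not Giovanni's actual interface.

```python
def handler(event, context=None):
    """Lambda-style entry point: average a list of gridded values that an
    upstream step has already subset for the user's region and time range."""
    values = event["values"]
    result = {
        "variable": event.get("variable"),
        "mean": sum(values) / len(values),
        "count": len(values),
    }
    return {"statusCode": 200, "body": result}

response = handler({"variable": "AOD_550nm", "values": [0.1, 0.2, 0.3]})
print(response["statusCode"], response["body"]["count"])
```

Because the function holds no state, the platform can scale instances with demand and the operator pays only per invocation, which is the cost advantage the abstract reports.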
Observational constraints on mixed-phase clouds imply higher climate sensitivity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Ivy; Storelvmo, Trude; Zelinka, Mark D.
Global climate model (GCM) estimates of the equilibrium global mean surface temperature response to a doubling of atmospheric CO2, measured by the equilibrium climate sensitivity (ECS), range from 2.0° to 4.6°C. Clouds are among the leading causes of this uncertainty. Here we show that the ECS can be up to 1.3°C higher in simulations where mixed-phase clouds consisting of ice crystals and supercooled liquid droplets are constrained by global satellite observations. The higher ECS estimates are directly linked to a weakened cloud-phase feedback arising from a decreased cloud glaciation rate in a warmer climate. Finally, we point out the need for realistic representations of the supercooled liquid fraction in mixed-phase clouds in GCMs, given the sensitivity of the ECS to the cloud-phase feedback.
The role of dedicated data computing centers in the age of cloud computing
NASA Astrophysics Data System (ADS)
Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr
2017-10-01
Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost-benefit analysis performed, including the types of computing activities that benefit most from a local data center versus cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.
Halligan, Brian D.; Geiger, Joey F.; Vallejos, Andrew K.; Greene, Andrew S.; Twigger, Simon N.
2009-01-01
One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of currently available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center website (http://proteomics.mcw.edu/vipdac). PMID:19358578
Redundancy and Replication Help Make Your Systems Stress-Free
ERIC Educational Resources Information Center
Mitchell, Erik
2011-01-01
In mid-April, Amazon EC2 services had a small problem. Apparently, a large swath of its cloud computing environment had such substantial trouble that a number of customers experienced server outages. High-profile sites including Reddit, Evite, and Foursquare went down when Amazon experienced issues in its US East 1a region (Justinb 2011).…
NASA Astrophysics Data System (ADS)
Weeden, R.; Horn, W. B.; Dimarchi, H.; Arko, S. A.; Hogenson, K.
2017-12-01
A problem often faced by Earth science researchers is the question of how to scale algorithms that were developed against few datasets and take them to regional or global scales. This problem only gets worse as we look to a future with larger and larger datasets becoming available. One significant hurdle can be having the processing and storage resources available for such a task, not to mention the administration of those resources. As a processing environment, the cloud offers nearly unlimited potential for compute and storage, with limited administration required. The goal of the Hybrid Pluggable Processing Pipeline (HyP3) project was to demonstrate the utility of the Amazon cloud to process large amounts of data quickly and cost effectively. Principally built by three undergraduate students at the ASF DAAC, the HyP3 system relies on core Amazon cloud services such as Lambda, Relational Database Service (RDS), Elastic Compute Cloud (EC2), Simple Storage Service (S3), and Elastic Beanstalk. HyP3 provides an Application Programming Interface (API) through which users can programmatically interface with the HyP3 system, allowing them to monitor and control processing jobs running in HyP3 and retrieve the generated HyP3 products when completed. This presentation will focus on the development techniques and enabling technologies that were used in developing the HyP3 system. Data and process flow, from new subscription through order completion, will be shown, highlighting the benefits of the cloud for each step. Because the HyP3 system can be accessed directly from a user's Python scripts, powerful applications leveraging SAR products can be put together fairly easily. This is the true power of HyP3: allowing people to programmatically leverage the power of the cloud.
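The abstract above emphasizes scripted access to HyP3 through its API. The pattern can be sketched as a minimal job-submission client; the endpoint paths, JSON fields, and status values below are illustrative assumptions, not the actual HyP3 interface, and a fake transport stands in for HTTP so the sketch runs locally:

```python
class HyP3Client:
    """Minimal sketch of a HyP3-style job client.  The endpoint paths,
    JSON fields, and status values are illustrative assumptions, not
    the real HyP3 API."""

    def __init__(self, transport):
        # transport: callable(path, payload_or_None) -> dict, so the
        # HTTP layer can be swapped out (or faked, as below)
        self.transport = transport

    def submit(self, granule):
        """Submit a processing job for one data granule."""
        reply = self.transport("/jobs", {"granule": granule})
        return reply["job_id"]

    def is_done(self, job_id):
        """Poll a job's status."""
        reply = self.transport("/jobs/" + job_id, None)
        return reply["status"] == "SUCCEEDED"


def fake_transport(path, payload):
    """Stands in for HTTP calls so the sketch runs without a network."""
    if payload is not None:          # submission request
        return {"job_id": "job-1"}
    return {"status": "SUCCEEDED"}   # status poll
```

With a real transport built on an HTTP library, the same two calls would submit a processing job and poll it to completion from a user's own scripts.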
Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing
2013-01-01
Background Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Results Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log the running metrics in data processing and monitoring multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Conclusions Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of
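Improvement (2) above, splitting large sequence files for downstream load balance, reduces to chunking reads without breaking records. A simplified stand-in for Rainbow's splitter (not its actual code), assuming uncompressed FASTQ where every read occupies exactly four lines:

```python
def split_fastq_records(lines, chunk_size):
    """Split FASTQ text (a list of lines) into chunks of at most
    `chunk_size` reads, never splitting a 4-line record.  A toy
    stand-in for Rainbow's file-splitting step."""
    # group lines into 4-line FASTQ records
    records = [lines[i:i + 4] for i in range(0, len(lines), 4)]
    chunks = []
    for i in range(0, len(records), chunk_size):
        # flatten each group of records back into lines
        chunk = [line for rec in records[i:i + chunk_size] for line in rec]
        chunks.append(chunk)
    return chunks
```

Each chunk can then be dispatched to a separate EC2 instance, which is what makes the downstream alignment step balance well.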
AceCloud: Molecular Dynamics Simulations in the Cloud.
Harvey, M J; De Fabritiis, G
2015-05-26
We present AceCloud, an on-demand service for molecular dynamics simulations. AceCloud is designed to facilitate the secure execution of large ensembles of simulations on an external cloud computing service (currently Amazon Web Services). The AceCloud client, integrated into the ACEMD molecular dynamics package, provides an easy-to-use interface that abstracts all aspects of interaction with the cloud services. This gives the user the experience that all simulations are running on their local machine, minimizing the learning curve typically associated with the transition to using high performance computing services.
Atmosphere-biosphere exchange of CO2 and O3 in the Central Amazon Forest
NASA Technical Reports Server (NTRS)
Fan, Song-Miao; Wofsy, Steven C.; Bakwin, Peter S.; Jacob, Daniel J.; Fitzjarrald, David R.
1990-01-01
An eddy correlation measurement of O3 deposition and CO2 exchange at a level 10 m above the canopy of the Amazon forest, conducted as part of the NASA/INPE ABLE2b mission during the wet season of 1987, is presented. It was found that the ecosystem exchange of CO2 undergoes a well-defined diurnal variation driven by the input of solar radiation. A curvilinear relationship was found between solar irradiance and uptake of CO2, with net CO2 uptake at a given solar irradiance equal to rates observed over forests in other climate zones. The carbon balance of the system appeared sensitive to cloud cover on the time scale of the experiment, suggesting that global carbon storage might be affected by changes in insolation associated with tropical climate fluctuations. The forest was found to be an efficient sink for O3 during the day, and evidence indicates that the Amazon forests could be a significant sink for global ozone during the nine-month wet period and that deforestation could dramatically alter O3 budgets.
ERIC Educational Resources Information Center
Fredette, Michelle
2012-01-01
"Rent or buy?" is a question people ask about everything from housing to textbooks. It is also a question universities must consider when it comes to high-performance computing (HPC). With the advent of Amazon's Elastic Compute Cloud (EC2), Microsoft Windows HPC Server, Rackspace's OpenStack, and other cloud-based services, researchers now have…
NASA Technical Reports Server (NTRS)
Habermann, Ted; Gallagher, James; Jelenak, Aleksandar; Potter, Nathan; Lee, Joe; Yang, Kent
2017-01-01
This study explored three candidate architectures with different types of objects and access paths for serving NASA Earth Science HDF5 data via Hyrax running on Amazon Web Services (AWS). We studied the cost and performance for each architecture using several representative use cases. The objectives of the study were to: (1) conduct a trade study to identify one or more high-performance integrated solutions for storing and retrieving NASA HDF5 and netCDF4 data in a cloud (web object store) environment, with the target environment being Amazon Web Services (AWS) Simple Storage Service (S3); (2) carry out the level of software development needed to properly evaluate the solutions in the trade study and to obtain the benchmarking metrics required as input to a government decision on potential follow-on prototyping; and (3) develop a cloud cost model for the preferred data storage solution (or solutions) that accounts for different granulation and aggregation schemes as well as cost and performance trades. We will describe the three architectures and the use cases, along with performance results and recommendations for further work.
Leveraging the Cloud for Robust and Efficient Lunar Image Processing
NASA Technical Reports Server (NTRS)
Chang, George; Malhotra, Shan; Wolgast, Paul
2011-01-01
The Lunar Mapping and Modeling Project (LMMP) is tasked to aggregate lunar data, from the Apollo era to the latest instruments on the LRO spacecraft, into a central repository accessible by scientists and the general public. A critical function of this task is to provide users with the best solution for browsing the vast amounts of imagery available. The image files LMMP manages range from a few gigabytes to hundreds of gigabytes in size with new data arriving every day. Despite this ever-increasing amount of data, LMMP must make the data readily available in a timely manner for users to view and analyze. This is accomplished by tiling large images into smaller images using Hadoop, a distributed computing software platform implementation of the MapReduce framework, running on a small cluster of machines locally. Additionally, the software is implemented to use Amazon's Elastic Compute Cloud (EC2) facility. We also developed a hybrid solution to serve images to users by leveraging cloud storage using Amazon's Simple Storage Service (S3) for public data while keeping private information on our own data servers. By using Cloud Computing, we improve upon our local solution by reducing the need to manage our own hardware and computing infrastructure, thereby reducing costs. Further, by using a hybrid of local and cloud storage, we are able to provide data to our users more efficiently and securely. This paper examines the use of a distributed approach with Hadoop to tile images, an approach that provides significant improvements in image processing time, from hours to minutes. This paper describes the constraints imposed on the solution and the resulting techniques developed for the hybrid solution of a customized Hadoop infrastructure over local and cloud resources in managing this ever-growing data set. It examines the performance trade-offs of using the more plentiful resources of the cloud, such as those provided by S3, against the bandwidth limitations such use
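The tiling step described above, cutting large images into smaller ones keyed by tile coordinates, can be illustrated with a toy single-machine analogue. In Hadoop each tile would be emitted as a key-value pair to the MapReduce framework; this sketch is not the LMMP code, only the decomposition it relies on:

```python
def tile_image(pixels, tile_h, tile_w):
    """Cut a 2-D pixel grid (a list of rows) into tiles keyed by
    (tile_row, tile_col).  A toy analogue of the MapReduce tiling
    step: each key-value pair could be emitted independently."""
    h, w = len(pixels), len(pixels[0])
    tiles = {}
    for ty in range(0, h, tile_h):
        for tx in range(0, w, tile_w):
            tiles[(ty // tile_h, tx // tile_w)] = [
                row[tx:tx + tile_w] for row in pixels[ty:ty + tile_h]
            ]
    return tiles
```

Because each tile depends only on its own pixel window, the work partitions cleanly across mappers, which is why the paper reports processing time dropping from hours to minutes.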
STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.
Karczewski, Konrad J; Fernald, Guy Haskin; Martin, Alicia R; Snyder, Michael; Tatonetti, Nicholas P; Dudley, Joel T
2014-01-01
The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5-10 hours to process a full exome sequence and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Z; Gao, M
Purpose: Monte Carlo (MC) simulation plays an important role in the proton Pencil Beam Scanning (PBS) technique. However, MC simulation demands high computing power and is limited to the few large proton centers that can afford a computer cluster. We study the feasibility of utilizing cloud computing in the MC simulation of PBS beams. Methods: A GATE/GEANT4-based MC simulation software was installed on a commercial cloud computing virtual machine (Linux 64-bit, Amazon EC2). Single-spot Integral Depth Dose (IDD) curves and in-air transverse profiles were used to tune the source parameters to simulate an IBA machine. With the use of the StarCluster software developed at MIT, a Linux cluster with 2-100 nodes can be conveniently launched in the cloud. A proton PBS plan was then exported to the cloud, where the MC simulation was run. Results: The simulated PBS plan has a field size of 10×10 cm², 20 cm range, 10 cm modulation, and contains over 10,000 beam spots. EC2 instance type m1.medium was selected considering the CPU/memory requirements, and 40 instances were used to form a Linux cluster. To minimize cost, the master node was created as an on-demand instance and worker nodes were created as spot instances. The hourly cost for the 40-node cluster was $0.63 and the projected cost for a 100-node cluster was $1.41. Ten million events were simulated to plot the PDD and profile, with each job containing 500k events. The simulation completed within 1 hour and an overall statistical uncertainty of <2% was achieved. Good agreement between MC simulation and measurement was observed. Conclusion: Cloud computing is a cost-effective and easy-to-maintain platform on which to run proton PBS MC simulation. When proton MC packages such as GATE and TOPAS are combined with cloud computing, it will greatly facilitate the pursuit of PBS MC studies, especially for newly established proton centers or individual researchers.
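The quoted cluster costs ($0.63/hour for 40 nodes, $1.41 projected for 100) are consistent with a fixed-rate on-demand master plus identically priced spot workers. Under that assumed linear cost model (our reading, not stated explicitly in the abstract), the implied per-node rates can be recovered from the two data points:

```python
def implied_rates(cost_a, nodes_a, cost_b, nodes_b):
    """Given hourly costs for two cluster sizes, solve the assumed
    linear model cost = master + (nodes - 1) * worker for the implied
    on-demand master rate and per-node spot worker rate."""
    worker = (cost_b - cost_a) / (nodes_b - nodes_a)
    master = cost_a - (nodes_a - 1) * worker
    return master, worker

master, worker = implied_rates(0.63, 40, 1.41, 100)
```

This yields roughly $0.123/hour for the master and $0.013/hour per spot worker, which seems plausible for m1.medium spot pricing of that era, though the split is our inference.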
Accelerating statistical image reconstruction algorithms for fan-beam x-ray CT using cloud computing
NASA Astrophysics Data System (ADS)
Srivastava, Somesh; Rao, A. Ravishankar; Sheinin, Vadim
2011-03-01
Statistical image reconstruction algorithms potentially offer many advantages to x-ray computed tomography (CT), e.g. lower radiation dose. But, their adoption in practical CT scanners requires extra computation power, which is traditionally provided by incorporating additional computing hardware (e.g. CPU-clusters, GPUs, FPGAs etc.) into a scanner. An alternative solution is to access the required computation power over the internet from a cloud computing service, which is orders-of-magnitude more cost-effective. This is because users only pay a small pay-as-you-go fee for the computation resources used (i.e. CPU time, storage etc.), and completely avoid purchase, maintenance and upgrade costs. In this paper, we investigate the benefits and shortcomings of using cloud computing for statistical image reconstruction. We parallelized the most time-consuming parts of our application, the forward and back projectors, using MapReduce, the standard parallelization library on clouds. From preliminary investigations, we found that a large speedup is possible at a very low cost. But, communication overheads inside MapReduce can limit the maximum speedup, and a better MapReduce implementation might become necessary in the future. All the experiments for this paper, including development and testing, were completed on the Amazon Elastic Compute Cloud (EC2) for less than $20.
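The parallelization described above, expressing the projectors as map and reduce steps, can be illustrated with a toy operator. Here the "forward projection" is just the column sums of an image (a 0-degree parallel-beam view) split across mappers by row blocks; the real CT operators are far more elaborate, so this shows only the MapReduce decomposition:

```python
from functools import reduce

def map_project(row_block):
    """Map step: partial column sums (a toy 0-degree parallel-beam
    forward projection) for one block of image rows."""
    return [sum(col) for col in zip(*row_block)]

def reduce_project(a, b):
    """Reduce step: combine partial projections elementwise."""
    return [x + y for x, y in zip(a, b)]

image = [[1, 2], [3, 4], [5, 6], [7, 8]]
blocks = [image[:2], image[2:]]   # the work split handed to mappers
projection = reduce(reduce_project, map(map_project, blocks))
```

Because the reduce step is associative, partial results can be combined in any order across worker nodes; that property is what MapReduce exploits, and the communication between steps is where the paper's reported overhead arises.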
Amazon boundary layer aerosol concentration sustained by vertical transport during rainfall
Wang, Jian; Krejci, Radovan; Giangrande, Scott; ...
2016-10-24
The nucleation of atmospheric vapours is an important source of new aerosol particles that can subsequently grow to form cloud condensation nuclei in the atmosphere. Most field studies of atmospheric aerosols over continents are influenced by atmospheric vapours of anthropogenic origin and, in consequence, aerosol processes in pristine, terrestrial environments remain poorly understood. The Amazon rainforest is one of the few continental regions where aerosol particles and their precursors can be studied under near-natural conditions, but the origin of small aerosol particles that grow into cloud condensation nuclei in the Amazon boundary layer remains unclear. Here we present aircraft- and ground-based measurements under clean conditions during the wet season in the central Amazon basin. We find that high concentrations of small aerosol particles (with diameters of less than 50 nanometres) in the lower free troposphere are transported from the free troposphere into the boundary layer during precipitation events by strong convective downdrafts and weaker downward motions in the trailing stratiform region. This rapid vertical transport can help to maintain the population of particles in the pristine Amazon boundary layer, and may therefore influence cloud properties and climate under natural conditions.
NASA Astrophysics Data System (ADS)
Montalto, F. A.; Yu, Z.; Soldner, K.; Israel, A.; Fritch, M.; Kim, Y.; White, S.
2017-12-01
Urban stormwater utilities are increasingly using decentralized green stormwater infrastructure (GSI) systems to capture stormwater and achieve compliance with regulations. Because environmental conditions and design vary by GSI facility, monitoring of GSI systems under a range of conditions is essential. Conventional monitoring efforts can be costly because in-field data logging requires frequent data transmission. The Internet of Things (IoT) can be used to collect, store, and publish GSI monitoring data more cost-effectively. Using 3G mobile networks, a cloud-based database was built on an Amazon Web Services (AWS) EC2 virtual machine to store and publish data collected with environmental sensors deployed in the field. This database can store multi-dimensional time series data, as well as photos and other observations logged by citizen scientists through a public-engagement mobile app, via a new Application Programming Interface (API). Also on the AWS EC2 virtual machine, a real-time QAQC flagging algorithm was developed to validate the sensor data streams.
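The abstract does not detail the QAQC algorithm, but a typical real-time flagging pass for environmental sensor streams combines a range check with a spike (rate-of-change) check. A minimal sketch under those assumed rules, with illustrative thresholds:

```python
def qaqc_flags(series, lo, hi, max_step):
    """Flag each reading: 'range' if outside [lo, hi], 'spike' if it
    jumps more than max_step from the previous in-range reading,
    otherwise 'ok'.  The rules and thresholds are illustrative
    assumptions, not the deployed algorithm."""
    flags, prev = [], None
    for v in series:
        if not (lo <= v <= hi):
            flags.append("range")  # out-of-range readings never update the baseline
        elif prev is not None and abs(v - prev) > max_step:
            flags.append("spike")
            prev = v               # a spike becomes the new comparison baseline
        else:
            flags.append("ok")
            prev = v
    return flags
```

For a water-depth stream in metres, for instance, `qaqc_flags(depths, 0, 10, 1.0)` would flag negative or >10 m readings as range failures and jumps of more than 1 m between samples as spikes; flagged values can then be stored alongside the raw data rather than silently discarded.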
Mouths of the Amazon River, Brazil, South America
1992-08-08
STS046-80-009 (31 July-8 Aug. 1992) --- A view of the mouth of the Amazon River and the Amazon Delta shows a large sediment plume expanding outward into the Atlantic Ocean. The sediment plume can be seen hugging the coast north of the Delta. This is caused by the west-northwest flowing Guyana Current. The large island of Marajo is partially visible through the clouds.
SIMPLEX: Cloud-Enabled Pipeline for the Comprehensive Analysis of Exome Sequencing Data
Fischer, Maria; Snajder, Rene; Pabinger, Stephan; Dander, Andreas; Schossig, Anna; Zschocke, Johannes; Trajanoski, Zlatko; Stocker, Gernot
2012-01-01
In recent studies, exome sequencing has proven to be a successful screening tool for the identification of candidate genes causing rare genetic diseases. Although underlying targeted sequencing methods are well established, necessary data handling and focused, structured analysis still remain demanding tasks. Here, we present a cloud-enabled autonomous analysis pipeline, which comprises the complete exome analysis workflow. The pipeline combines several in-house developed and published applications to perform the following steps: (a) initial quality control, (b) intelligent data filtering and pre-processing, (c) sequence alignment to a reference genome, (d) SNP and DIP detection, (e) functional annotation of variants using different approaches, and (f) detailed report generation during various stages of the workflow. The pipeline connects the selected analysis steps, exposes all available parameters for customized usage, performs required data handling, and distributes computationally expensive tasks either on a dedicated high-performance computing infrastructure or on the Amazon cloud environment (EC2). The presented application has already been used in several research projects including studies to elucidate the role of rare genetic diseases. The pipeline is continuously tested and is publicly available under the GPL as a VirtualBox or Cloud image at http://simplex.i-med.ac.at; additional supplementary data is provided at http://www.icbi.at/exome. PMID:22870267
Automating NEURON Simulation Deployment in Cloud Resources.
Stockton, David B; Santamaria, Fidel
2017-01-01
Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Compute Cloud, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software called NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model.
Deploying Crowd-Sourced Formal Verification Systems in a DoD Network
2013-09-01
In 2014 cyber attacks on critical infrastructure are expected to increase...CSFV systems on the Internet, possibly using cloud infrastructure (Dean, 2013). By using Amazon Elastic Compute Cloud (EC2) systems, DARPA will use ordinary...through standard access methods. Those clients could be mobile phones, laptops, netbooks, tablet computers or personal digital assistants (PDAs) (Smoot
Cloudiness over the Amazon rainforest: Meteorology and thermodynamics
NASA Astrophysics Data System (ADS)
Collow, Allison B. Marquardt; Miller, Mark A.; Trabachino, Lynne C.
2016-07-01
Comprehensive meteorological observations collected during GOAmazon2014/15 using the Atmospheric Radiation Measurement Mobile Facility no. 1 and assimilated observations from the Modern-Era Retrospective Analysis for Research and Applications, Version 2 are used to document the seasonal cycle of cloudiness, thermodynamics, and precipitation above the Amazon rainforest. The reversal of synoptic-scale vertical motions modulates the transition between the wet and dry seasons. Ascending moist air during the wet season originates near the surface of the Atlantic Ocean and is advected into the Amazon rainforest, where it experiences convergence and, ultimately, precipitates. The dry season is characterized by weaker winds and synoptic-scale subsidence with little or no moisture convergence accompanying moisture advection. This combination results in the drying of the midtroposphere during June through October as indicated by a decrease in liquid water path, integrated water, and the vertical profile of water vapor mixing ratio. The vertical profile of cloud fraction exhibits a relatively consistent decline in cloud fraction from the lifting condensation level (LCL) to the freezing level where a minimum is observed, unlike many other tropical regions. Coefficients of determination between the LCL and cloud fractional coverage suggest a relatively robust relationship between the LCL and cloudiness beneath 5 km during the dry season (R2 = 0.42) but a weak relationship during the wet season (0.12).
NASA Astrophysics Data System (ADS)
Gehrcke, Jan-Philip; Kluth, Stefan; Stonjek, Stefan
2010-04-01
We show how the ATLAS offline software was ported to the Amazon Elastic Compute Cloud (EC2). We prepared an Amazon Machine Image (AMI) based on the standard ATLAS platform, Scientific Linux 4 (SL4). An instance of the SL4 AMI was then started on EC2, on which we installed and validated a recent release of the ATLAS offline software distribution kit. The installed software is archived as an image on the Amazon Simple Storage Service (S3) and can be quickly retrieved and connected to new SL4 AMI instances using the Amazon Elastic Block Store (EBS). ATLAS jobs can then configure against the release kit using the ATLAS configuration management tool (cmt) in the standard way. The output of jobs is exported to S3 before the SL4 AMI is terminated, and job status information is transferred to the Amazon SimpleDB service. The whole process of launching instances of our AMI, starting, monitoring, and stopping jobs, and retrieving job output from S3 is controlled from a client machine using Python scripts that implement the Amazon EC2/S3 API via the boto library, working together with small scripts embedded in the SL4 AMI. We report our experience with setting up and operating the system using standard ATLAS job transforms.
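The control loop described above (launch an instance, wait for the job, export output to S3, terminate) can be sketched as follows. This is an illustrative skeleton only: the `CloudAPI` stub stands in for the boto EC2/S3/SimpleDB calls mentioned in the abstract, and none of the names correspond to the actual ATLAS scripts.

```python
class CloudAPI:
    """Stand-in for the boto EC2/S3/SimpleDB calls; illustrative only."""
    def __init__(self):
        self.storage = {}            # stands in for an S3 bucket
        self.state = "stopped"

    def launch_instance(self, ami):  # real code: launch via the boto EC2 API
        self.state = "running"
        return "i-0001"

    def job_status(self, instance_id):  # real code: query Amazon SimpleDB
        return "done"

    def export_to_s3(self, key, data):  # real code: upload via the boto S3 API
        self.storage[key] = data

    def terminate(self, instance_id):   # real code: terminate via the boto EC2 API
        self.state = "terminated"


def run_job(api, ami, job_output):
    # launch -> wait for the job -> export output -> terminate, as in the abstract
    iid = api.launch_instance(ami)
    while api.job_status(iid) != "done":
        pass                         # real scripts would sleep and re-poll here
    api.export_to_s3("results/job1.root", job_output)
    api.terminate(iid)
    return api.storage


api = CloudAPI()
out = run_job(api, "ami-sl4-atlas", b"histograms")
```

Swapping the stub methods for real API calls preserves the same orchestration logic on the client machine.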
Spectrometry of Pasture Condition and Biogeochemistry in the Central Amazon
NASA Technical Reports Server (NTRS)
Asner, Gregory P.; Townsend, Alan R.; Bustamante, Mercedes M. C.
1999-01-01
Regional analyses of Amazon cattle pasture biogeochemistry are difficult due to the complexity of human, edaphic, biotic and climatic factors and persistent cloud cover in satellite observations. We developed a method to estimate key biophysical properties of Amazon pastures using hyperspectral reflectance data and photon transport inverse modeling. Remote estimates of live and senescent biomass were strongly correlated with plant-available forms of soil phosphorus and calcium. These results provide a basis for monitoring pasture condition and biogeochemistry in the Amazon Basin using spaceborne hyperspectral sensors.
An interactive web-based system using cloud for large-scale visual analytics
NASA Astrophysics Data System (ADS)
Kaseb, Ahmed S.; Berry, Everett; Rozolis, Erik; McNulty, Kyle; Bontrager, Seth; Koh, Youngsol; Lu, Yung-Hsiang; Delp, Edward J.
2015-03-01
Network cameras have grown rapidly in number in recent years. Thousands of public network cameras provide a tremendous amount of visual information about the environment, and there is a need to analyze this valuable information for a better understanding of the world around us. This paper presents an interactive web-based system that enables users to execute image analysis and computer vision techniques at large scale on the data from more than 65,000 worldwide cameras. The paper focuses on how to use both the system's website and its Application Programming Interface (API). Given a computer program that analyzes a single frame, the user needs to make only slight changes to the existing program and choose the cameras to analyze. The system handles the heterogeneity of the geographically distributed cameras (e.g., different brands and resolutions) and allocates and manages Amazon EC2 and Windows Azure cloud resources to meet the analysis requirements.
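The programming model described, a user-supplied single-frame analysis function applied across many cameras, can be illustrated with a minimal sketch; the function names here are hypothetical and are not the system's actual API:

```python
def analyze_cameras(cameras, analyze_frame):
    # apply a user-supplied single-frame analysis to every camera;
    # `cameras` maps camera id -> a callable that fetches one frame
    return {cam_id: analyze_frame(fetch()) for cam_id, fetch in cameras.items()}


# example user program: mean pixel brightness of a grayscale frame
def mean_brightness(frame):
    return sum(frame) / len(frame)


cameras = {
    "cam-paris": lambda: [10, 20, 30],     # stub frames; the real system
    "cam-tokyo": lambda: [200, 200, 200],  # fetches frames over HTTP
}
results = analyze_cameras(cameras, mean_brightness)
```

The point of the design is that the per-frame function stays unchanged while the system supplies the frames and the cloud resources.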
A Tale Of 160 Scientists, Three Applications, a Workshop and a Cloud
NASA Astrophysics Data System (ADS)
Berriman, G. B.; Brinkworth, C.; Gelino, D.; Wittman, D. K.; Deelman, E.; Juve, G.; Rynge, M.; Kinney, J.
2013-10-01
The NASA Exoplanet Science Institute (NExScI) hosts the annual Sagan Workshops, thematic meetings aimed at introducing researchers to the latest tools and methodologies in exoplanet research. The theme of the Summer 2012 workshop, held from July 23 to July 27 at Caltech, was to explore the use of exoplanet light curves to study planetary system architectures and atmospheres. A major part of the workshop was to use hands-on sessions to instruct attendees in the use of three open source tools for the analysis of light curves, especially from the Kepler mission. Each hands-on session involved the 160 attendees using their laptops to follow step-by-step tutorials given by experts. One of the applications, PyKE, is a suite of Python tools designed to reduce and analyze Kepler light curves; these tools can be invoked from the Unix command line or a GUI in PyRAF. The Transit Analysis Package (TAP) uses Markov Chain Monte Carlo (MCMC) techniques to fit light curves under the Interactive Data Language (IDL) environment, and Transit Timing Variations (TTV) uses IDL tools and Java-based GUIs to confirm and detect exoplanets from timing variations in light curve fitting. Rather than attempt to run these diverse applications on the inevitably wide range of environments on attendees' laptops, they were run instead on the Amazon Elastic Compute Cloud (EC2). The cloud offers features ideal for this type of short-term need: computing and storage services are made available on demand for as long as needed, and a processing environment can be customized and replicated as needed. The cloud environment included an NFS file server virtual machine (VM), 20 client VMs for use by attendees, and a VM to enable ftp downloads of the attendees' results. The file server was configured with a 1 TB Elastic Block Storage (EBS) volume (network-attached storage mounted as a device) containing the application software and attendees' home directories. The clients were configured to mount the applications and
An Automatic Prediction of Epileptic Seizures Using Cloud Computing and Wireless Sensor Networks.
Sareen, Sanjay; Sood, Sandeep K; Gupta, Sunil Kumar
2016-11-01
Epilepsy is one of the most common neurological disorders and is characterized by the spontaneous and unforeseeable occurrence of seizures. Automatic seizure prediction can protect patients from accidents and save lives. In this article, we propose a mobile-based framework that automatically predicts seizures using the information contained in electroencephalography (EEG) signals. Wireless sensor technology is used to capture the EEG signals of patients, and cloud-based services collect and analyze the EEG data from the patient's mobile phone. Features are extracted from the EEG signal using the fast Walsh-Hadamard transform (FWHT), and higher-order spectral analysis (HOSA) is applied to the FWHT coefficients in order to select the feature set relevant to the normal, preictal, and ictal seizure states. We subsequently exploit the selected features as input to a k-means classifier to detect epileptic seizure states in a reasonable time. The performance of the proposed model was tested on the Amazon EC2 cloud and compared in terms of execution time and accuracy. The findings show that with the selected HOS-based features, we were able to achieve a classification accuracy of 94.6%.
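The fast Walsh-Hadamard transform used for feature extraction can be computed in O(n log n) with the standard butterfly recursion. A minimal pure-Python sketch (unnormalized, input length must be a power of two); this is a generic illustration of the transform, not the paper's implementation:

```python
def fwht(signal):
    # unnormalized fast Walsh-Hadamard transform; len(signal) must be 2**k
    a = list(signal)
    n = len(a)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y   # butterfly step
        h *= 2
    return a
```

Because the Hadamard matrix satisfies H·H = n·I, applying the transform twice returns the input scaled by its length, which makes a convenient correctness check.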
Calibration of radio-astronomical data on the cloud. LOFAR, the pathway to SKA
NASA Astrophysics Data System (ADS)
Sabater, J.; Sánchez-Expósito, S.; Garrido, J.; Ruiz, J. E.; Best, P. N.; Verdes-Montenegro, L.
2015-05-01
The radio interferometer LOFAR (LOw Frequency ARray) is now fully operational. This Square Kilometre Array (SKA) pathfinder allows observation of the sky at frequencies between 10 and 240 MHz, a relatively unexplored region of the spectrum. LOFAR is a software-defined telescope: the data are mainly processed using specialized software running on common computing facilities. This means that the capabilities of the telescope are virtually defined by software and mainly limited by the available computing power. However, the quantity of data produced can quickly reach huge volumes (several petabytes per day). After the correlation and pre-processing of the data in a dedicated cluster, the final dataset (typically several terabytes) is handed over to the user. The calibration of these data requires a powerful computing facility in which the specific state-of-the-art software, under heavy continuous development, can be easily installed and updated. That makes this case a perfect candidate for a cloud infrastructure, which adds the advantages of an on-demand, flexible solution. We present our approach to the calibration of LOFAR data using Ibercloud, the cloud infrastructure provided by Ibergrid. With the calibration workflow adapted to the cloud, we can explore calibration strategies for the SKA and show how private or commercial cloud infrastructures (Ibercloud, Amazon EC2, Google Compute Engine, etc.) can help to solve the problems with big datasets that will be prevalent in the future of astronomy.
Cloud-based Predictive Modeling System and its Application to Asthma Readmission Prediction
Chen, Robert; Su, Hang; Khalilia, Mohammed; Lin, Sizhe; Peng, Yue; Davis, Tod; Hirsh, Daniel A; Searles, Elizabeth; Tejedor-Sojo, Javier; Thompson, Michael; Sun, Jimeng
2015-01-01
The predictive modeling process is time consuming and requires clinical researchers to handle complex electronic health record (EHR) data in restricted computational environments. To address this problem, we implemented a cloud-based predictive modeling system via a hybrid setup combining a secure private server with the Amazon Web Services (AWS) Elastic MapReduce platform. EHR data is preprocessed on a private server and the resulting de-identified event sequences are hosted on AWS. Based on user-specified modeling configurations, an on-demand web service launches a cluster of Elastic Compute 2 (EC2) instances on AWS to perform feature selection and classification algorithms in a distributed fashion. Afterwards, the secure private server aggregates results and displays them via interactive visualization. We tested the system on a pediatric asthma readmission task on a de-identified EHR dataset of 2,967 patients. We also conducted a larger-scale experiment on the CMS Linkable 2008–2010 Medicare Data Entrepreneurs' Synthetic Public Use File dataset of 2 million patients, achieving over a 25-fold speedup compared to sequential execution. PMID:26958172
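The distributed feature-construction step, counting clinical event occurrences and aggregating them across patients, follows the classic MapReduce pattern that Elastic MapReduce executes across EC2 instances. A single-process sketch of that pattern (the event names are invented for illustration, not taken from the dataset):

```python
from collections import Counter


def map_phase(patient_events):
    # emit (feature, 1) pairs, as each mapper would for its shard of patients
    for events in patient_events:
        for e in events:
            yield (e, 1)


def reduce_phase(pairs):
    # sum counts per feature, as the reducers would after the shuffle
    counts = Counter()
    for feature, n in pairs:
        counts[feature] += n
    return dict(counts)


shard = [["asthma_dx", "steroid_rx"], ["asthma_dx", "ed_visit"]]
features = reduce_phase(map_phase(shard))
```

On the real platform each mapper runs on a different EC2 instance over its own shard; the logic per record is the same.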
NASA Astrophysics Data System (ADS)
Pauliquevis, T.; Alves, C. F.; Barbosa, H. M.
2016-12-01
Previous studies in the Amazon have shown a clear discrepancy between models and observations of convection. From the observational standpoint, convection in Amazonia has a typical diurnal cycle, characterized by shallow convection followed by a shallow-to-deep transition (usually in the early afternoon) and rain. In contrast, numerical models based on cumulus parameterizations place heavy rain in the early hours of the morning. In this context, observations are crucial both to constrain models and to validate improvements in them. In this study we investigated statistical properties of clouds, precipitation, and convection using several instruments operated during GoAmazon2014/5-DOE/ARM at Manacapuru, AM (Brazil), combined with cloud top temperature (CTT) data obtained by GOES. Previous studies (e.g., Adams et al., 2013) defined deep convection events as connected to rapid CTT decrease, PWV increase (convergence), and precipitation. They also observed that the average deep convection event has two characteristic formation time scales, in the sense that water vapor convergence begins to build 12 h before precipitation, with an invigoration 4 h before rain occurs. In this study we revisited this approach using GoAmazon2014/5 measurements, with special focus on their statistical variability. Preliminary results for the wet season of 2014 showed that events with a rapid decrease in CTT were associated with 60% of the observed precipitation at the ground. Defining t0 as the central time of the rapid CTT decrease and analyzing only events with rain volume > 10 mm, precipitation maxima were distributed around t0 with a mean difference Δ = 24 ± 82 minutes. Most events presented several maxima (up to 16), and the general structure was similar to beatings in oscillatory systems. In several cases even the first maximum of rain rate was shifted 1 hour from t0. In this presentation, the above results will be discussed combined with radiometer measurements (T
Bootstrapping and Maintaining Trust in the Cloud
2016-12-01
simultaneous cloud nodes. 1. INTRODUCTION The proliferation and popularity of infrastructure-as-a-service (IaaS) cloud computing services such as … Amazon Web Services and Google Compute Engine means more cloud tenants are hosting sensitive, private, and business-critical data and applications in the … thousands of IaaS resources as they are elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features
NASA Astrophysics Data System (ADS)
Yu, H.; Chin, M.; Yuan, T.; Bian, H.; Prospero, J. M.; Omar, A. H.; Remer, L. A.; Winker, D. M.; Yang, Y.; Zhang, Y.; Zhang, Z.
2014-12-01
The productivity of the Amazon rainforest is constrained by the availability of nutrients, in particular phosphorus (P). Deposition of transported African dust in boreal winter and spring is considered an important nutrient input for the Amazon Basin, though its magnitude is not well quantified. This study provides a remote sensing observation-based estimate of dust deposition in the Amazon Basin using a 7-year (2007-2013) record of three-dimensional (3D) distributions of aerosol in both cloud-free and above-cloud conditions from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP). It is estimated that the 7-year average of dust deposition into the Amazon Basin amounts to 15.1 ~ 32.1 Tg a-1 (Tg = 10^12 g). This imported dust could provide 0.012 ~ 0.025 Tg P a-1, equivalent to 12 ~ 26 g P ha-1 a-1, to fertilize the Amazon rainforest, which largely compensates the hydrological loss of P. The CALIOP-based estimate agrees better with estimates from in-situ measurements and model simulations than what has previously been reported in the literature. The closer agreement benefits from a more realistic geographic definition of the Amazon Basin and the inclusion of meridional dust transport in the calculation, in addition to the 3D nature of CALIOP aerosol measurements. The trans-Atlantic transport and deposition of dust show strong interannual variations that are found to correlate with the North Atlantic Oscillation index in the winter season and anticorrelate with the prior-year Sahel Precipitation Index on an annual basis. Uncertainties associated with the estimate will also be discussed.
NASA Astrophysics Data System (ADS)
Gallagher, J. H. R.; Jelenak, A.; Potter, N.; Fulker, D. W.; Habermann, T.
2017-12-01
Providing data services based on cloud computing technology that are equivalent to those developed for traditional computing and storage systems is critical for successful migration to cloud-based architectures for data production, scientific analysis, and storage. OPeNDAP Web-service capabilities (comprising the Data Access Protocol (DAP) specification plus open-source software for realizing DAP in servers and clients) are among the most widely deployed means for achieving data-as-service functionality in the Earth sciences. OPeNDAP services are especially common in traditional data center environments where servers offer access to datasets stored in (very large) file systems, and a preponderance of the source data for these services is stored in the Hierarchical Data Format Version 5 (HDF5). Three candidate architectures for serving NASA satellite Earth Science HDF5 data via Hyrax running on Amazon Web Services (AWS) were developed and their performance examined for a set of representative use cases. Performance was evaluated in terms of both runtime and incurred cost. The three architectures differ in how HDF5 files are stored in the Amazon Simple Storage Service (S3) and how the Hyrax server (as an EC2 instance) retrieves their data. Results for both serial and parallel access to HDF5 data in S3 will be presented. While the study focused on HDF5 data, OPeNDAP, and the Hyrax data server, the architectures are generic and the analysis can be extrapolated to many different data formats, web APIs, and data servers.
NASA Astrophysics Data System (ADS)
Pouchard, L. C.; Depriest, A.; Huhns, M.
2012-12-01
system on Amazon's Elastic Compute Cloud (EC2), where ESIP maintains an account. Our approach had three phases: 1) set up a private cloud environment at the University of South Carolina to become familiar with the complex architecture of the system and enable some basic customization, 2) coordinate the production of a Virtual Appliance for the system with NCBO and deploy it on the Amazon cloud, and 3) outreach to the ESIP community to solicit participation, populate the repository, and develop new use cases. Phase 2 is nearing completion and Phase 3 is underway. Ontologies were gathered during updates to the ESIP cluster. Discussion points included the criteria for a shareable ontology and how to determine the best size for an ontology to be reusable. Outreach highlighted that the system can start addressing the integration of discovery frameworks by linking data and services in a pull model (data and service casting), a key issue for the Discovery cluster. This work thus presents several contributions: 1) technology injection from another domain into the earth sciences, 2) the deployment of a mature knowledge platform on the EC2 cloud, and 3) the successful engagement of the community through the ESIP clusters and Testbed model.
CCN numerical simulations for the GoAmazon with the OLAM model
NASA Astrophysics Data System (ADS)
Ramos-da-Silva, R.; Haas, R.; Barbosa, H. M.; Machado, L.
2015-12-01
Manaus is a large city in the center of the Amazon rainforest. The GoAmazon field project is exploring the region through various data collection and modeling efforts to investigate the impacts of the urban polluted plume on the surrounding pristine areas. In this study a numerical model was applied to simulate the atmospheric dynamics and the evolution of Cloud Condensation Nuclei (CCN) concentrations. Simulations with and without the urban plume were performed to identify its dynamics and local impacts. The results show that land surface characteristics play an important role in the CCN distribution and rainfall over the region. South of Manaus, the atmospheric dynamics are dominated by cloud streets that are aligned with the trade winds and the Amazon River. North of Manaus, the Negro River produces the advection of a more stable atmosphere, causing a higher CCN concentration in the boundary layer. Assuming a locally high CCN concentration in the Manaus boundary layer region, the simulations show that land-atmosphere interaction imposes important dynamics on the plume. The model shows that the CCN plume moves with the flow toward the southwest of Manaus, following the cloud streets and the river direction, with the highest concentrations over the most stable water surface regions.
Cloud Computing Security Issue: Survey
NASA Astrophysics Data System (ADS)
Kamal, Shailza; Kaur, Rajpreet
2011-12-01
Cloud computing has been a growing field in the IT industry since it was proposed by IBM in 2007. Other companies, such as Google, Amazon, and Microsoft, provide further cloud computing products. Cloud computing is internet-based computing that shares resources and information on demand, providing services such as SaaS, IaaS, and PaaS. The services and resources are shared through virtualization, which runs multiple applications on the cloud. This paper surveys the challenges and security issues in cloud computing and describes some standards and protocols that show how security can be managed.
NASA Astrophysics Data System (ADS)
Siahaan, P.; Wuning, S.; Manna, A.; Prasasty, V. D.; Hudiyanti, D.
2018-04-01
A deep understanding of the intermolecular interactions between molecules on the paracellular pathway gives insight into its microscopic and macroscopic properties. In the paracellular pathway, the synthetic cyclic peptide ADTC1 (Ac-CADTPPVC-NH2) has been studied computationally as a modulator of the EC1-EC2 domain using molecular docking. The aim of this research is to probe the effect of the alanine (A) amino acid of ADTC1 on its interaction properties. The study was carried out in two steps: 1) optimization using the GROMACS v4.6.5 program and 2) determination of the interaction properties using the AutoDock 4.2 program. The interaction was computed for boxes A-J, and the best binding-site position and binding energy of the OC and CC ADTC1 peptides against the EC1-EC2 domain of E-cadherin were selected. The results showed that the CC form of ADTC1 in the F box had the best interaction, with a binding energy of -26.36 kJ/mol, lower than that of ADTC5, which lacks the alanine amino acid. ADTC1 interacted with EC1 of EC1-EC2 at the Asp1, Trp2, Val3, Ile4, Ile24, Lys25, Ser26, Asn27, and Met92 residues.
Cloud CPFP: a shotgun proteomics data analysis pipeline using cloud and high performance computing.
Trudgian, David C; Mirzaei, Hamid
2012-12-07
We have extended the functionality of the Central Proteomics Facilities Pipeline (CPFP) to allow use of remote cloud and high performance computing (HPC) resources for shotgun proteomics data processing. CPFP has been modified to include modular local and remote scheduling for data processing jobs. The pipeline can now be run on a single PC or server, a local cluster, a remote HPC cluster, and/or the Amazon Web Services (AWS) cloud. We provide public images that allow easy deployment of CPFP in its entirety in the AWS cloud. This significantly reduces the effort necessary to use the software, and allows proteomics laboratories to pay for compute time ad hoc, rather than obtaining and maintaining expensive local server clusters. Alternatively the Amazon cloud can be used to increase the throughput of a local installation of CPFP as necessary. We demonstrate that cloud CPFP allows users to process data at higher speed than local installations but with similar cost and lower staff requirements. In addition to the computational improvements, the web interface to CPFP is simplified, and other functionalities are enhanced. The software is under active development at two leading institutions and continues to be released under an open-source license at http://cpfp.sourceforge.net.
Consolidation of cloud computing in ATLAS
NASA Astrophysics Data System (ADS)
Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration
2017-10-01
Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.
NASA Astrophysics Data System (ADS)
Yu, Hongbin; Chin, Mian; Yuan, Tianle; Bian, Huisheng; Remer, Lorraine A.; Prospero, Joseph M.; Omar, Ali; Winker, David; Yang, Yuekui; Zhang, Yan; Zhang, Zhibo; Zhao, Chun
2015-03-01
The productivity of the Amazon rainforest is constrained by the availability of nutrients, in particular phosphorus (P). Deposition of long-range transported African dust is recognized as a potentially important but poorly quantified source of phosphorus. This study provides a first multiyear satellite-based estimate of dust deposition into the Amazon Basin using three-dimensional (3-D) aerosol measurements over 2007-2013 from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP). The 7 year average of dust deposition into the Amazon Basin is estimated to be 28 (8-48) Tg a-1 or 29 (8-50) kg ha-1 a-1. The dust deposition shows significant interannual variation that is negatively correlated with the prior-year rainfall in the Sahel. The CALIOP-based multiyear mean estimate of dust deposition matches better with estimates from in situ measurements and model simulations than a previous satellite-based estimate does. The closer agreement benefits from a more realistic geographic definition of the Amazon Basin and inclusion of meridional dust transport calculation in addition to the 3-D nature of CALIOP aerosol measurements. The imported dust could provide about 0.022 (0.006-0.037) Tg P of phosphorus per year, equivalent to 23 (7-39) g P ha-1 a-1 to fertilize the Amazon rainforest. This out-of-basin phosphorus input is comparable to the hydrological loss of phosphorus from the basin, suggesting an important role of African dust in preventing phosphorus depletion on timescales of decades to centuries.
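As a consistency check, the figures quoted above can be reconciled with a few lines of unit arithmetic (1 Tg = 10^12 g, 1 ha = 10^-2 km^2); note the basin area below is implied by the per-hectare rates, not stated in the abstract:

```python
dust_g = 28e12                   # 28 Tg a-1 of dust deposited (central estimate)
dust_per_ha = 29e3               # 29 kg ha-1 a-1, expressed in g ha-1 a-1
basin_ha = dust_g / dust_per_ha  # implied basin area in hectares
basin_km2 = basin_ha * 1e-2      # roughly 9.7 million km2

p_g = 0.022e12                   # 0.022 Tg P a-1 of phosphorus
p_per_ha = p_g / basin_ha        # recovers the quoted ~23 g P ha-1 a-1
p_fraction = p_g / dust_g        # implied P mass fraction of dust, ~0.08%
```

The recovered per-hectare phosphorus rate matches the abstract's 23 g P ha-1 a-1, and the implied P mass fraction is in the range typically assumed for mineral dust.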
The Integration of CloudStack and OCCI/OpenNebula with DIRAC
NASA Astrophysics Data System (ADS)
Méndez Muñoz, Víctor; Fernández Albor, Víctor; Graciani Diaz, Ricardo; Casajús Ramo, Adriàn; Fernández Pena, Tomás; Merino Arévalo, Gonzalo; José Saborido Silva, Juan
2012-12-01
The increasing availability of cloud resources is emerging as a realistic alternative to the Grid as a paradigm for enabling scientific communities to access large distributed computing resources. The DIRAC framework for distributed computing provides an easy way to access resources from both systems efficiently. This paper explains the integration of DIRAC with two open-source cloud managers: OpenNebula (taking advantage of the OCCI standard) and CloudStack. These are computing tools to manage the complexity and heterogeneity of distributed data center infrastructures, allowing virtual clusters to be created on demand across public, private, and hybrid clouds. This approach required developing an extension to the previous DIRAC Virtual Machine engine, which was developed for Amazon EC2, allowing the connection with these new cloud managers. In the OpenNebula case, the development has been based on the CernVM Virtual Software Appliance with appropriate contextualization, while in the case of CloudStack, the infrastructure has been kept more general, which permits other Virtual Machine sources and operating systems to be used. In both cases, the CernVM File System has been used to facilitate software distribution to the computing nodes. With the resulting infrastructure, the cloud resources are transparent to the users through a friendly interface, such as the DIRAC Web Portal. The main purpose of this integration is to obtain a system that can manage cloud and grid resources at the same time. This particular feature pushes DIRAC toward a new conceptual denomination as interware, integrating different middleware. Users from different communities do not need to care about the installation of the standard software that is available at the nodes, nor about the operating system of the host machine, which is transparent to the user. This paper presents an analysis of the overhead of the virtual layer, with tests comparing the proposed approach with the existing Grid solution.
Using Cloud-based Storage Technologies for Earth Science Data
NASA Astrophysics Data System (ADS)
Michaelis, A.; Readey, J.; Votava, P.
2016-12-01
Cloud-based infrastructure may offer several key benefits, including scalability, built-in redundancy, and reduced total cost of ownership, compared with a traditional data center approach. However, most of the tools and software systems developed for NASA data repositories were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Object storage services are provided by all the leading public clouds (Amazon Web Services, Microsoft Azure, Google Cloud, etc.) and private clouds (OpenStack), and may provide a more cost-effective means of storing large data collections online. We describe a system that utilizes object storage rather than traditional file-system-based storage to vend earth science data. The system described is not only cost-effective but also shows superior performance for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API-compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
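Object stores serve data through HTTP GETs rather than file seeks, so a chunked format such as HDF5 maps naturally onto byte-range requests. A minimal sketch of that idea, with a local bytes object standing in for the S3 object; a real client would send the same Range header over HTTP:

```python
def range_header(offset, length):
    # HTTP Range header an object-store client sends to fetch one chunk
    return "bytes={}-{}".format(offset, offset + length - 1)


def read_chunk(obj, offset, length):
    # local stand-in for a ranged GET: retrieve only the bytes of one chunk
    return obj[offset:offset + length]


blob = bytes(range(256))        # pretend this is a data file held in S3
hdr = range_header(64, 16)      # header for a 16-byte chunk at offset 64
chunk = read_chunk(blob, 64, 16)
```

Reading only the chunks a query touches, instead of downloading whole files, is what makes object storage competitive with file systems for analytics workloads.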
Hoo, Henny; Hashidoko, Yasuyuki; Islam, Md. Tofazzal; Tahara, Satoshi
2004-01-01
Mg2+ is one of the essential elements for bacterial cell growth. The presence of the magnesium cation (Mg2+) in various concentrations often affects cell growth restoration in plant-associating bacteria. This study attempted to determine whether Mg2+ levels in Sphingomonas yanoikuyae EC-S001 affected cell growth restoration in the host plant and what the threshold level is. S. yanoikuyae EC-S001, isolated from the rhizoplane of spinach seedlings grown from surface-sterilized seeds under aseptic conditions, displayed uniform dispersion and attachment throughout the rhizoplane and phylloplane of the host seedlings. S. yanoikuyae EC-S001 did not grow in potato-dextrose broth medium but grew well in an aqueous extract of spinach leaves. Chemical investigation of the growth factor in the spinach leaf extract led to identification of the active principle as the magnesium cation. A concentration of ca. 0.10 mM Mg2+ or more allowed S. yanoikuyae EC-S001 to grow in potato-dextrose broth medium. Some saprophytic and/or diazotrophic bacteria used in our experiment were found to have diverse threshold levels for their Mg2+ requirements. For example, Burkholderia cepacia EC-K014, originally isolated from the rhizoplane of a Melastoma sp., could grow even in Mg2+-free Hoagland's no. 2 medium with saccharose and glutamine (HSG medium) and requires a trace level of Mg2+ for its growth. In contrast, S. yanoikuyae EC-S001, together with Bacillus subtilis IFO12113, showed the most drastic restoring responses to subsequent addition of 0.98 mM Mg2+ to Mg2+-free HSG medium. Our studies concluded that Mg2+ is more than just the essential trace element needed for cell growth restoration in S. yanoikuyae EC-S001 and that certain nonculturable bacteria may require a higher concentration of Mg2+ or another specific essential element for their growth. PMID:15345402
NASA Astrophysics Data System (ADS)
Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt; Larson, Krista; Sfiligoi, Igor; Rynge, Mats
2014-06-01
Scientific communities have been at the forefront of adopting new technologies and methodologies in computing. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible several decades ago. For the past decade, several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows and effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of science driven by "Big Data" will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud, and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes, and lessons learned while enabling support for cloud infrastructures in GlideinWMS.
Cross-VM Side Channels and Their Use to Extract Private Keys
2012-10-16
clouds such as Amazon EC2 and Rackspace, but also by other Xen use cases. For example, many virtual desktop infrastructure (VDI) solutions (e.g. ... whose bit length is, for example, 337, 403, or 457 when κ is 2048, 3072, or 4096, respectively. We note that this deviates from standard ElGamal, in
Green Ocean Amazon 2014/15 Manaus Pollution Study Field Campaign Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keutsch, Frank N.
This work was part of the larger Green Ocean Amazon 2014/15 (GOAmazon 2014/15) experiment, which extended through the wet and dry seasons from January 2014 through December 2015 and which took place around the urban region of Manaus, Brazil, in central Amazonia. This work was conducted as part of this experiment at the main U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility ground research site “T3”, circa 100 km west of Manaus, during two intensive operational periods, “IOP1” and “IOP2” (February 1 to March 31, 2014, and August 15 to October 15, 2014, respectively). Funding for this work was provided by the National Science Foundation AGS 1321987/1628491. The GOAmazon experiment was designed to enable the study of how aerosols and surface fluxes influence cloud cycles under clean conditions, as well as how aerosol and cloud life cycles, including cloud-aerosol-precipitation interactions, are influenced by pollutant outflow from a tropical megacity. These observations provide a data set vital to constrain tropical rain forest model parameterizations for organic aerosols, cloud and convection schemes, and terrestrial vegetation components, and how these are perturbed by pollution. Research objectives specific to this work and the T3 ground site included studies of how outflow of pollution from Manaus modulated the photochemically driven conversion of emitted precursors to aerosol precursors and aerosol.
Campaign datasets for Observations and Modeling of the Green Ocean Amazon (GOAMAZON)
Martin, Scot; Mei, Fan; Alexander, Lizabeth; Artaxo, Paulo; Barbosa, Henrique; Bartholomew, Mary Jane; Biscaro, Thiago; Buseck, Peter; Chand, Duli; Comstock, Jennifer; Dubey, Manvendra; Goldstein, Allen; Guenther, Alex; Hubbe, John; Jardine, Kolby; Jimenez, Jose-Luis; Kim, Saewung; Kuang, Chongai; Laskin, Alexander; Long, Chuck; Paralovo, Sarah; Petaja, Tuukka; Powers, Heath; Schumacher, Courtney; Sedlacek, Arthur; Senum, Gunnar; Smith, James; Shilling, John; Springston, Stephen; Thayer, Mitchell; Tomlinson, Jason; Wang, Jian; Xie, Shaocheng
2016-05-30
The hydrologic cycle of the Amazon Basin is one of the primary heat engines of the Southern Hemisphere. Any accurate climate model must describe the Basin well, both in its natural state and in states perturbed by regional and global human activities. At the present time, however, tropical deep convection in its natural state is poorly understood and modeled, with insufficient observational data sets for model constraint. Furthermore, future climate scenarios resulting from global human activities suggest possible drying and the eventual conversion of rain forest to savanna in response to global climate change. Based on our current state of knowledge, the governing conditions of this catastrophic change are not defined. Local human activities, including the economic development activities that are growing the population and industry within the Basin, also have the potential to shift regional climate, most immediately through an increase in aerosol number and mass concentrations, and this shift spans the range of values to which cloud properties are most sensitive. The ARM Climate Research Facility in the Amazon Basin seeks to understand aerosol and cloud life cycles, particularly the susceptibility to cloud-aerosol-precipitation interactions, within the Amazon Basin.
Electron-Cloud Build-Up: Theory and Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furman, M. A.
We present a broad-brush survey of the phenomenology, history and importance of the electron-cloud effect (ECE). We briefly discuss the simulation techniques used to quantify the electron-cloud (EC) dynamics. Finally, we present in more detail an effective theory to describe the EC density build-up in terms of a few effective parameters. For further details, the reader is encouraged to refer to the proceedings of many prior workshops, either dedicated to EC or with significant EC content, including the entire 'ECLOUD' series. In addition, the proceedings of the various flavors of Particle Accelerator Conferences contain a large number of EC-related publications. The ICFA Beam Dynamics Newsletter series contains one dedicated issue, and several occasional articles, on EC. An extensive reference database is the LHC website on EC.
Operating Dedicated Data Centers - Is It Cost-Effective?
NASA Astrophysics Data System (ADS)
Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.
2014-06-01
The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.
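The dedicated-versus-cloud comparison described above reduces to an amortized cost model. A hedged sketch follows (all dollar figures in the example are invented for illustration and are not RACF or Amazon numbers): a dedicated cluster is cheaper than on-demand cloud only above a break-even utilization.

```python
# Hypothetical cost model (parameters and prices are assumptions, not
# data from this study): amortized cost of a dedicated cluster versus
# an on-demand cloud rate, per core-hour.
def dedicated_cost_per_core_hour(capex, annual_opex, years, cores,
                                 utilization):
    """Total lifetime cost divided by the core-hours actually used."""
    total_cost = capex + annual_opex * years
    used_core_hours = cores * 24 * 365 * years * utilization
    return total_cost / used_core_hours

def breakeven_utilization(capex, annual_opex, years, cores, cloud_rate):
    """Utilization above which the dedicated cluster is cheaper."""
    total_cost = capex + annual_opex * years
    return total_cost / (cores * 24 * 365 * years * cloud_rate)

# Example (invented numbers): $1M capex, $200k/yr opex, 4-year life,
# 1000 cores, cloud at $0.10 per core-hour -> the dedicated cluster
# wins above roughly 51% average utilization.
```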
NASA Astrophysics Data System (ADS)
Khan, Kashif A.; Wang, Qi; Luo, Chunbo; Wang, Xinheng; Grecos, Christos
2014-05-01
Mobile cloud computing is gaining worldwide momentum, with ubiquitous on-demand cloud services for mobile users provided by Amazon, Google and others at low capital cost. However, Internet-centric clouds introduce wide area network (WAN) delays that are often intolerable for real-time applications such as video streaming. One promising approach to addressing this challenge is to deploy decentralized mini-cloud facilities, known as cloudlets, to enable localized cloud services. When supported by local wireless connectivity, a wireless cloudlet is expected to offer low-cost, high-performance cloud services for its users. In this work, we implement a realistic framework that comprises both a popular Internet cloud (Amazon Cloud) and a real-world cloudlet (based on Ubuntu Enterprise Cloud (UEC)) for mobile cloud users in a wireless mesh network. We focus on real-time video streaming over the HTTP standard and implement a typical application. We further perform a comprehensive comparative analysis and empirical evaluation of the application's performance when it is delivered over the Internet cloud and the cloudlet, respectively. The study quantifies the influence of the two different cloud networking architectures on supporting real-time video streaming. We also enable movement of the users in the wireless mesh network and investigate the effect of user mobility on mobile cloud computing over the cloudlet and the Amazon cloud, respectively. Our experimental results demonstrate the advantages of the cloudlet paradigm over its Internet cloud counterpart in supporting the quality of service of real-time applications.
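The cloudlet trade-off this abstract quantifies (low WAN delay versus the larger capacity of a distant cloud) can be captured with a toy selection rule; the numbers in the example are assumptions for illustration, not measurements from the study.

```python
# Toy offload-target selection (illustrative only): the expected
# per-request service time is the network round trip plus the server
# processing time; the client picks the target minimizing that sum.
def best_target(targets):
    """targets: {name: (rtt_ms, processing_ms)} -> name with lowest sum."""
    return min(targets, key=lambda name: sum(targets[name]))
```

For instance, `best_target({"cloudlet": (5, 30), "internet_cloud": (120, 20)})` picks the cloudlet: its 35 ms total beats the WAN cloud's 140 ms even though the cloudlet's processing is slower.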
Large-Scale, Multi-Sensor Atmospheric Data Fusion Using Hybrid Cloud Computing
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E. J.
2015-12-01
NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, MODIS, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over decades. Moving to multi-sensor, long-duration analyses presents serious challenges for large-scale data mining and fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over 10 years of data. HySDS is a Hybrid-Cloud Science Data System that has been developed and applied under NASA AIST, MEaSUREs, and ACCESS grants. HySDS uses the SciFlow workflow engine to partition analysis workflows into parallel tasks (e.g. segmenting by time or space) that are pushed into a durable job queue. The tasks are "pulled" from the queue by worker Virtual Machines (VMs) and executed in an on-premise Cloud (Eucalyptus or OpenStack) or at Amazon in the public Cloud or GovCloud. In this way, years of data (millions of files) can be processed in a massively parallel way. Input variables (arrays) are pulled on demand into the Cloud using OPeNDAP URLs or other subsetting services, thereby minimizing the size of the transferred data. We are using HySDS to automate the production of multiple versions of a ten-year A-Train water vapor climatology under a MEaSUREs grant. We will present the architecture of HySDS, describe the achieved "clock time" speedups in fusing datasets on our own nodes and in the Amazon Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer. Our system demonstrates how one can pull A-Train variables (Levels 2 & 3) on demand into the Amazon Cloud, and cache only those variables that are heavily used, so that any number of compute jobs can be
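The push/pull pattern described above (partition a workflow into independent tasks, push them to a durable queue, let workers pull and execute) can be sketched with Python's standard library; the names are illustrative, not the HySDS or SciFlow API.

```python
# Minimal push/pull sketch: a workflow is split into independent
# time-slice tasks pushed to a queue; worker threads (standing in for
# worker VMs) pull tasks and execute them in parallel.
import queue
import threading

def run_workflow(years, process, n_workers=4):
    tasks, results = queue.Queue(), []
    for y in years:                      # partition the job by time
        tasks.put(y)
    lock = threading.Lock()

    def worker():
        while True:
            try:
                y = tasks.get_nowait()   # workers *pull* work items
            except queue.Empty:
                return                   # queue drained: worker exits
            r = process(y)
            with lock:
                results.append(r)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sorted(results)
```

A ten-year climatology run would then be `run_workflow(range(2003, 2013), process_one_year)`, with each year handled by whichever worker is free.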
NASA Astrophysics Data System (ADS)
Campos Braga, Ramon; Rosenfeld, Daniel; Weigel, Ralf; Jurkat, Tina; Andreae, Meinrat O.; Wendisch, Manfred; Pöschl, Ulrich; Voigt, Christiane; Mahnke, Christoph; Borrmann, Stephan; Albrecht, Rachel I.; Molleker, Sergej; Vila, Daniel A.; Machado, Luiz A. T.; Grulich, Lucas
2017-12-01
We have investigated how aerosols affect the height above cloud base of rain and ice hydrometeor initiation and the subsequent vertical evolution of cloud droplet size and number concentrations in growing convective cumulus. For this purpose we used in situ data of hydrometeor size distributions measured with instruments mounted on the HALO aircraft during the ACRIDICON-CHUVA campaign over the Amazon during September 2014. The results show that the height of rain initiation by collision and coalescence processes (Dr, in units of meters above cloud base) is linearly correlated with the number concentration of droplets (Nd in cm-3) nucleated at cloud base (Dr ≈ 5 · Nd). Additional cloud processes associated with Dr, such as GCCN, cloud, and mixing with ambient air and other processes, produce deviations of ~21% in the linear relationship, but they do not mask the clear relationship between Dr and Nd, which was also found in different regions around the globe (e.g., Israel and India). When Nd exceeded values of about 1000 cm-3, Dr became greater than 5000 m, and the first observed precipitation particles were ice hydrometeors. Therefore, no liquid water raindrops were observed within growing convective cumulus during polluted conditions. Furthermore, the formation of ice particles also took place at higher altitudes in the clouds in polluted conditions because the resulting smaller cloud droplets froze at colder temperatures compared to the larger drops in the unpolluted cases. The measured vertical profiles of droplet effective radius (re) were close to those estimated by assuming adiabatic conditions (rea), supporting the hypothesis that the entrainment and mixing of air into convective clouds is nearly inhomogeneous. Additional CCN activation on aerosol particles from biomass burning and air pollution reduced re below rea, which further inhibited the formation of raindrops and ice particles and resulted in even higher altitudes for rain and ice initiation.
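The reported relationship lends itself to a short worked example. This is a sketch of the abstract's Dr ≈ 5 · Nd rule and its ~5000 m ice-onset figure, not additional analysis of the campaign data.

```python
# Worked example of the reported linear fit: rain initiation height
# above cloud base, Dr (m), scales with cloud-base droplet number
# concentration, Nd (cm^-3), as Dr ~ 5 * Nd (with ~21% scatter).
def rain_initiation_height_m(nd_cm3):
    return 5.0 * nd_cm3

def first_precip_phase(nd_cm3):
    # Per the abstract: once Dr exceeds ~5000 m (Nd above ~1000 cm^-3),
    # the first observed precipitation particles are ice, not raindrops.
    return "ice" if rain_initiation_height_m(nd_cm3) > 5000 else "liquid"
```

For a polluted cloud base with Nd = 1200 cm^-3 this gives Dr = 6000 m and an ice-phase onset, matching the polluted cases described above.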
Cloud-based MOTIFSIM: Detecting Similarity in Large DNA Motif Data Sets.
Tran, Ngoc Tam L; Huang, Chun-Hsi
2017-05-01
We developed the cloud-based MOTIFSIM on the Amazon Web Services (AWS) cloud. The tool is an extended version of our web-based tool version 2.0, which was developed based on a novel algorithm for detecting similarity in multiple DNA motif data sets. This cloud-based version further allows researchers to exploit the computing resources available from AWS to detect similarity in multiple large-scale DNA motif data sets resulting from next-generation sequencing technology. The tool is highly scalable with the expandable computing resources of AWS.
Bootstrapping and Maintaining Trust in the Cloud
2016-12-01
proliferation and popularity of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services and Google Compute Engine means ... IaaS trusted computing system: • Secure Bootstrapping – the system should enable the tenant to securely install an initial root secret into each cloud ... elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features, but none achieve all. Excalibur [31] sup
Anber, Usama; Gentine, Pierre; Wang, Shuguang; Sobel, Adam H.
2015-01-01
The diurnal and seasonal water cycles in the Amazon remain poorly simulated in general circulation models, exhibiting peak evapotranspiration in the wrong season and rain too early in the day. We show that those biases are not present in cloud-resolving simulations with parameterized large-scale circulation. The difference is attributed to the representation of the morning fog layer, and to more accurate characterization of convection and its coupling with large-scale circulation. The morning fog layer, present in the wet season but absent in the dry season, dramatically increases cloud albedo, which reduces evapotranspiration through its modulation of the surface energy budget. These results highlight the importance of the coupling between the energy and hydrological cycles and the key role of cloud albedo feedback for climates over tropical continents. PMID:26324902
NASA Astrophysics Data System (ADS)
Marinos, Alexandros; Briscoe, Gerard
Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.
The Impacts of Amazon Deforestation on Pacific Climate
NASA Astrophysics Data System (ADS)
Lindsey, Leah
Variability in eastern Pacific sea surface temperatures (SSTs) associated with the El Nino Southern Oscillation is known to affect Amazonian precipitation, but to what extent do changing Amazonian vegetation and rainfall impact eastern Pacific SSTs? The Amazon rainforest is threatened by many factors, including climate change and clearing for agriculture. Forest fires and dieback are more likely due to the increased frequency and intensity of droughts in the region. It is possible that extensive Amazon deforestation can enhance El Nino conditions by weakening the Walker circulation. Correlations between annual rainfall rates over the Amazon and other atmospheric parameters (global precipitation, surface air temperature, low cloud amount, 500 hPa vertical velocity, surface winds, and 200 hPa winds) over the eastern Pacific indicate strong relationships among these fields. Maps of these correlations (teleconnection maps) reveal that when the Amazon is rainy, SSTs in the central and eastern Pacific are cold, rainfall is suppressed over the central and eastern Pacific, low clouds are prominent over the eastern and southeastern Pacific, and subsidence over the central and eastern Pacific is enhanced. Precipitation in the Amazon is also consistent with a strong Walker circulation (La Nina conditions), manifest as strong correlations with the easterly surface and westerly 200 hPa zonal winds. Coupling between Amazon rainfall and these fields is seen in observations and model data. Correlations were calculated using data from observations, reanalysis data, two models under the Coupled Model Intercomparison Project/Atmospheric Model Intercomparison Project (CMIP5/AMIP), and an AMIP run with the model used in this study, the Community Earth System Model (CESM1.1.1). Although the correlations between Amazon precipitation and the aforementioned fields are strong, they do not show causality. In order to investigate the impact of tropical South American deforestation on the
Unidata's Vision for Transforming Geoscience by Moving Data Services and Software to the Cloud
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.; Fisher, W.; Yoksas, T.
2014-12-01
Universities are facing many challenges: shrinking budgets, rapidly evolving information technologies, exploding data volumes, multidisciplinary science requirements, and high student expectations. These changes are upending traditional approaches to accessing and using data and software. It is clear that Unidata's products and services must evolve to support new approaches to research and education. After years of hype and ambiguity, cloud computing is maturing in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Cloud services aimed at providing any resource, at any time, from any place, using any device are increasingly being embraced by all types of organizations. Given this trend and the enormous potential of cloud-based services, Unidata is moving to augment its products, services, data delivery mechanisms and applications to align with the cloud-computing paradigm. Specifically, Unidata is working toward establishing a community-based development environment that supports the creation and use of software services to build end-to-end data workflows. The design encourages the creation of services that can be broken into small, independent chunks that provide simple capabilities. Chunks could be used individually to perform a task, or chained into simple or elaborate workflows. The services will also be portable, allowing their use in researchers' own cloud-based computing environments. In this talk, we present a vision for Unidata's future in cloud-enabled data services and discuss our initial efforts to deploy a subset of Unidata data services and tools in the Amazon EC2 and Microsoft Azure cloud environments, including the transfer of real
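The chunk-and-chain design sketched above reduces to plain function composition: each service does one simple thing, and a workflow is just their composition. This is a generic sketch, not Unidata software.

```python
# Generic sketch of chaining small, independent service "chunks" into a
# workflow: each service is a function taking data and returning data.
from functools import reduce

def chain(*services):
    """Compose services left-to-right into a single workflow callable."""
    return lambda data: reduce(lambda d, svc: svc(d), services, data)
```

For example, `chain(subset, regrid, plot)` would apply `subset`, then `regrid`, then `plot` to the incoming data, and each chunk remains usable on its own.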
NASA Technical Reports Server (NTRS)
Pickering, Kenneth E.; Thompson, Anne M.; Tao, Wei-Kuo; Simpson, Joanne; Scala, John R.
1991-01-01
The role of convection in trace gas transport and ozone production was examined in a tropical dry-season squall line sampled on August 3, 1985, during the NASA Global Tropospheric Experiment/Amazon Boundary Layer Experiment 2A (NASA GTE/ABLE 2A) in Amazonia, Brazil. Two types of analyses were performed. Transient effects within the cloud were examined with a combination of two-dimensional cloud and one-dimensional photochemical modeling. Tracer analyses using the cloud model wind fields yield a series of cross sections of NO(x), CO, and O3 distribution during the lifetime of the cloud; these fields are used in the photochemical model to compute the net rate of O3 production. At noon, when the cloud was mature, the instantaneous ozone production potential in the cloud was between 50 and 60 percent less than in no-cloud conditions, due to reduced photolysis and cloud scavenging of radicals. Analysis of cloud inflows and outflows was used to differentiate between air that was undisturbed and air that had been modified by the storm. These profiles were used in the photochemical model to examine the aftereffects of convective redistribution in the 24-hour period following the storm. Total tropospheric column O3 production changed little due to convection because so little NO(x) was available in the lower troposphere. However, the integrated O3 production potential in the 5- to 13-km layer changed from net destruction to net production as a result of the convection. The conditions of the August 3, 1985, event may be typical of the early part of the dry season in Amazonia, when only minimal amounts of pollution from biomass burning have been transported into the region.
Modeling Optical and Radiative Properties of Clouds Constrained with CARDEX Observations
NASA Astrophysics Data System (ADS)
Mishra, S. K.; Praveen, P. S.; Ramanathan, V.
2013-12-01
Carbonaceous aerosols (CA) have important effects on climate by directly absorbing solar radiation and indirectly changing cloud properties. These particles tend to be a complex mixture of graphitic carbon and organic compounds. The graphitic component, called elemental carbon (EC), is characterized by significant absorption of solar radiation. Recent studies showed that organic carbon (OC) aerosols absorb strongly near the UV region, and this fraction is known as brown carbon (BrC). The indirect effect of CA can occur in two ways: first, by changing the thermal structure of the atmosphere, which further affects the dynamical processes governing the cloud life cycle; second, by acting as cloud condensation nuclei (CCN), which can change cloud radiative properties. In this work, cloud optical properties have been numerically estimated by accounting for CARDEX (Cloud Aerosol Radiative Forcing Dynamics Experiment) observed cloud parameters and the physico-chemical and optical properties of aerosols. The aerosol inclusions in the cloud drop have been treated as core-shell structures, with EC as the core and a shell comprising ammonium sulfate, ammonium nitrate, sea salt and organic carbon (organic acids, OA, and brown carbon, BrC). The EC/OC ratio of the inclusion particles has been constrained based on observations. Moderate and heavy pollution events were defined based on the aerosol number and BC concentration. The cloud drop's co-albedo at 550 nm was found to be nearly identical for pure EC sphere inclusions and core-shell inclusions with all non-absorbing organics in the shell. However, the co-albedo was found to increase for drops having all BrC in the shell. The co-albedo of a cloud drop was found to be maximum when all aerosol was present as interstitial, compared to 50% and 0% of inclusions existing as interstitial aerosols. The co-albedo was found to be ~9.87e-4 for the drop with 100% of inclusions existing as interstitial aerosols externally mixed with micron size mineral dust with 2
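The core-shell geometry invoked above invites a small worked example. This helper is an assumption-laden sketch (concentric spheres, density differences between core and shell ignored), not part of the study's model.

```python
# Illustrative core-shell geometry helper (assumptions: concentric
# spheres, equal densities, so the EC volume fraction alone fixes the
# core radius relative to the whole inclusion).
def core_shell_radii(r_total, ec_volume_fraction):
    """Return (core radius, shell thickness) for a given total radius."""
    r_core = r_total * ec_volume_fraction ** (1.0 / 3.0)
    return r_core, r_total - r_core
```

For instance, an EC volume fraction of 12.5% puts the core radius at exactly half the particle radius, with the other half of the radius taken up by the organic/inorganic shell.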
Secure and Resilient Cloud Computing for the Department of Defense
2015-11-16
platform as a service (PaaS), and software as a service (SaaS) that target system administrators, developers, and end-users respectively (see Table 2). PaaS: application programming interfaces (API) and services; Medium; examples: Amazon Elastic MapReduce, MathWorks Cloud, Red Hat OpenShift. SaaS: full-fledged applications; Low; example: Google gMail
Rail-dbGaP: analyzing dbGaP-protected data in the cloud with Amazon Elastic MapReduce.
Nellore, Abhinav; Wilks, Christopher; Hansen, Kasper D; Leek, Jeffrey T; Langmead, Ben
2016-08-15
Public archives contain thousands of trillions of bases of valuable sequencing data. More than 40% of the Sequence Read Archive is human data protected by provisions such as dbGaP. To analyse dbGaP-protected data, researchers must typically work with IT administrators and signing officials to ensure all levels of security are implemented at their institution. This is a major obstacle, impeding reproducibility and reducing the utility of archived data. We present a protocol and software tool for analyzing protected data in a commercial cloud. The protocol, Rail-dbGaP, is applicable to any tool running on Amazon Web Services Elastic MapReduce. The tool, Rail-RNA v0.2, is a spliced aligner for RNA-seq data, which we demonstrate by running on 9662 samples from the dbGaP-protected GTEx consortium dataset. The Rail-dbGaP protocol makes explicit for the first time the steps an investigator must take to develop Elastic MapReduce pipelines that analyse dbGaP-protected data in a manner compliant with NIH guidelines. Rail-RNA automates implementation of the protocol, making it easy for typical biomedical investigators to study protected RNA-seq data, regardless of their local IT resources or expertise. Rail-RNA is available from http://rail.bio. Technical details on the Rail-dbGaP protocol, as well as an implementation walkthrough, are available at https://github.com/nellore/rail-dbgap. Detailed instructions on running Rail-RNA on dbGaP-protected data using Amazon Web Services are available at http://docs.rail.bio/dbgap/. Contact: anellore@gmail.com or langmea@cs.jhu.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
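The spirit of such a protocol (security prerequisites checked before a protected analysis may launch) can be illustrated with a pre-launch checklist. The measure names below are hypothetical stand-ins, not the actual Rail-dbGaP steps.

```python
# Hypothetical pre-launch checklist (measure names are invented for
# illustration; they are NOT the actual Rail-dbGaP protocol steps):
# a pipeline refuses to start until every required measure is in place.
REQUIRED_MEASURES = {"encrypted_storage", "restricted_ssh",
                     "audit_logging", "private_subnet"}

def may_launch(enabled):
    """Return (ok, sorted list of missing security measures)."""
    missing = REQUIRED_MEASURES - set(enabled)
    return len(missing) == 0, sorted(missing)
```

Automating such a check is what lets an investigator without security expertise run compliant pipelines: the tool, not the user, enforces the checklist.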
Measurements of the light-absorbing material inside cloud droplets and its effect on cloud albedo
NASA Technical Reports Server (NTRS)
Twohy, C. H.; Clarke, A. D.; Warren, Stephen G.; Radke, L. F.; Charleson, R. J.
1990-01-01
Most of the measurements of light-absorbing aerosol particles made previously have been in non-cloudy air and therefore provide no insight into aerosol effects on cloud properties. Here, researchers describe an experiment designed to measure light absorption exclusively due to substances inside cloud droplets, compare the results to related light absorption measurements, and evaluate possible effects on the albedo of clouds. The results of this study validate those of Twomey and Cocks and show that the measured levels of light-absorbing material are negligible for the radiative properties of realistic clouds. For the measured clouds, which appear to have been moderately polluted, the amount of elemental carbon (EC) present was insufficient to affect albedo. Much higher contaminant levels or much larger droplets than those measured would be necessary to significantly alter the radiative properties. The effect of the concentrations of EC actually measured on the albedo of snow, however, would be much more pronounced since, in contrast to clouds, snowpacks are usually optically semi-infinite and have large particle sizes.
NASA Astrophysics Data System (ADS)
Albrecht, Rachel I.; Morales, Carlos A.; Silva Dias, Maria A. F.
2011-04-01
This study investigated the physical processes involved in the development of thunderstorms over the southwestern Amazon by hypothesizing causalities for the observed cloud-to-ground lightning variability and the local environmental characteristics. The southwestern Amazon experiences every year a large variety of environmental factors, such as a gradual increase in atmospheric moisture, extremely high pollution due to biomass burning, and intense deforestation, which directly affects cloud development through differential surface energy partition. At the end of the dry period, higher percentages of positive cloud-to-ground (+CG) lightning were observed, due to a relative increase in +CG-dominated thunderstorms (positive thunderstorms). Positive (negative) thunderstorms initiated preferentially over deforested (forest) areas with higher (lower) cloud base heights, shallower (deeper) warm cloud depths, and higher (lower) convective available potential energy. These features characterized the positive (negative) thunderstorms as deeper (relatively shallower) clouds, with stronger (relatively weaker) updrafts and enhanced (decreased) mixed and cold vertically integrated liquid. No significant difference between thunderstorms (negative and positive) and nonthunderstorms was observed in terms of atmospheric pollution, as the atmosphere was overwhelmed by pollution, leading to an updraft-limited regime. However, in the wet season both negative and positive thunderstorms occurred during periods of relatively higher aerosol concentration and differentiated size distributions, suggesting an aerosol-limited regime in which cloud electrification could depend on the aerosol concentration to suppress the warm phase and enhance the ice phase. The suggested causalities are consistent with the invoked hypotheses, but they are not observed facts; they are hypotheses based on plausible physical mechanisms.
cryoem-cloud-tools: A software platform to deploy and manage cryo-EM jobs in the cloud.
Cianfrocco, Michael A; Lahiri, Indrajit; DiMaio, Frank; Leschziner, Andres E
2018-06-01
Access to streamlined computational resources remains a significant bottleneck for new users of cryo-electron microscopy (cryo-EM). To address this, we have developed tools that will submit cryo-EM analysis routines and atomic model building jobs directly to Amazon Web Services (AWS) from a local computer or laptop. These new software tools ("cryoem-cloud-tools") have incorporated optimal data movement, security, and cost-saving strategies, giving novice users access to complex cryo-EM data processing pipelines. Integrating these tools into the RELION processing pipeline and graphical user interface we determined a 2.2 Å structure of ß-galactosidase in ∼55 hours on AWS. We implemented a similar strategy to submit Rosetta atomic model building and refinement to AWS. These software tools dramatically reduce the barrier for entry of new users to cloud computing for cryo-EM and are freely available at cryoem-tools.cloud. Copyright © 2018. Published by Elsevier Inc.
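The cost trade-off the abstract alludes to (spot versus on-demand pricing for a roughly 55-hour refinement job) comes down to simple arithmetic. The sketch below illustrates it; the hourly rates are invented placeholders, not actual AWS prices:

```python
# Hedged cost sketch: on-demand vs. spot pricing for a long cloud job.
# The 55-hour figure comes from the abstract; the hourly rates below are
# invented placeholders, NOT real AWS prices.

def job_cost(hours, hourly_rate, n_instances=1):
    """Total cost ($) of n_instances running for `hours` at `hourly_rate` $/h."""
    return hours * hourly_rate * n_instances

on_demand = job_cost(55, 3.06)   # assumed on-demand GPU instance rate
spot = job_cost(55, 0.95)        # assumed spot rate for the same instance
savings = 1 - spot / on_demand   # fractional savings from spot pricing
```

With these assumed rates, spot pricing cuts the bill by roughly two thirds, which is why spot-capable pipelines matter for multi-day jobs.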
Cumulus cloud model estimates of trace gas transports
NASA Technical Reports Server (NTRS)
Garstang, Michael; Scala, John; Simpson, Joanne; Tao, Wei-Kuo; Thompson, A.; Pickering, K. E.; Harris, R.
1989-01-01
Draft structures in convective clouds are examined with reference to the results of the NASA Amazon Boundary Layer Experiments (ABLE IIa and IIb) and calculations based on a multidimensional time dependent dynamic and microphysical numerical cloud model. It is shown that some aspects of the draft structures can be calculated from measurements of the cloud environment. Estimated residence times in the lower regions of the cloud based on surface observations (divergence and vertical velocities) are within the same order of magnitude (about 20 min) as model trajectory estimates.
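The order-of-magnitude residence-time estimate quoted above is just layer depth divided by mean vertical velocity. The values below are illustrative numbers chosen to land on the ~20 min scale, not the ABLE measurements:

```python
# Back-of-envelope residence time in the lower cloud: depth / updraft speed.
# Depth and speed are illustrative values, not ABLE observations.

def residence_time_minutes(layer_depth_m, updraft_speed_ms):
    """Time (min) an air parcel spends crossing a layer at a given speed."""
    return layer_depth_m / updraft_speed_ms / 60.0

t = residence_time_minutes(3000.0, 2.5)  # 3 km layer, 2.5 m/s updraft -> 20.0 min
```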
Geenen, I L A; Molin, D G M; van den Akker, N M S; Jeukens, F; Spronk, H M; Schurink, G W H; Post, M J
2015-05-01
Primary endothelial cells (ECs) are the preferred cellular source for luminal seeding of tissue-engineered (TE) vascular grafts. Research into the potential of ECs for vascular TE has focused particularly on venous rather than arterial ECs. In this study we evaluated the functional characteristics of arterial and venous ECs, relevant for vascular TE. Porcine ECs were isolated from femoral artery (PFAECs) and vein (PFVECs). The proliferation rate was comparable for both EC sources, whereas migration, determined through a wound-healing assay, was less profound for PFVECs. EC adhesion was lower for PFVECs on collagen I, measured after 10 min of arterial shear stress. Gene expression was analysed by qRT-PCR for ECs cultured under static conditions and after exposure to arterial shear stress and revealed differences in gene expression, with lower expression of EphrinB2 and VCAM-1 and higher levels of vWF and COUP-TFII in PFVECs than in PFAECs. PFVECs exhibited diminished platelet adhesion under flow and cell-based thrombin generation was delayed for PFVECs, indicating diminished tissue factor (TF) activity. After stimulation, prostacyclin secretion, but not nitric oxide (NO), was lower in PFVECs. Our data support the use of venous ECs for TE because of their beneficial antithrombogenic profile. Copyright © 2012 John Wiley & Sons, Ltd.
Electron Cloud Effects in Accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furman, M.A.
We present a brief summary of various aspects of the electron-cloud effect (ECE) in accelerators. For further details, the reader is encouraged to refer to the proceedings of many prior workshops, either dedicated to EC or with significant EC contents, including the entire "ECLOUD" series [1-22]. In addition, the proceedings of the various flavors of Particle Accelerator Conferences [23] contain a large number of EC-related publications. The ICFA Beam Dynamics Newsletter series [24] contains one dedicated issue, and several occasional articles, on EC. An extensive reference database is the LHC website on EC [25].
Cloud draft structure and trace gas transport
NASA Technical Reports Server (NTRS)
Scala, John R.; Tao, Wei-Kuo; Thompson, Anne M.; Simpson, Joanne; Garstang, Michael; Pickering, Kenneth E.; Browell, Edward V.; Sachse, Glen W.; Gregory, Gerald L.; Torres, Arnold L.
1990-01-01
During the second Amazon Boundary Layer Experiment (ABLE 2B), meteorological observations, chemical measurements, and model simulations are utilized in order to interpret convective cloud draft structure and to analyze its role in transport and vertical distribution of trace gases. One-dimensional photochemical model results suggest that the observed poststorm changes in ozone concentration can be attributed to convective transports rather than photochemical production and the results of a two-dimensional time-dependent cloud model simulation are presented for the May 6, 1987 squall system. The mesoscale convective system exhibited evidence of significant midlevel detrainment in addition to transports to anvil heights. Chemical measurements of O3 and CO obtained in the convective environment are used to predict photochemical production within the troposphere and to corroborate the cloud model results.
Daytime turbulent exchange between the Amazon forest and the atmosphere
NASA Technical Reports Server (NTRS)
Fitzjarrald, David R.; Moore, Kathleen E.; Cabral, Osvaldo M. R.; Scolar, Jose; Manzi, Antonio O.; Deabreusa, Leonardo D.
1989-01-01
Detailed observations of turbulence just above and below the crown of the Amazon rain forest during the wet season are presented. The forest canopy is shown to remove high frequency turbulent fluctuations while passing lower frequencies. Filter characteristics of turbulent transfer into the Amazon rain forest canopy are quantified. Simple empirical relations that relate observed turbulent heat fluxes to horizontal wind variance are presented. Changes in the amount of turbulent coupling between the forest and the boundary layer associated with deep convective clouds are presented both as statistical averages and as a series of case studies. These convective processes during the rainy season are shown to alter the diurnal course of turbulent fluxes. In the wake of giant coastal systems, no significant heat or moisture fluxes occur for up to a day after the event. Radar data are used to demonstrate that even small raining clouds are capable of evacuating the canopy of substances normally trapped by persistent static stability near the forest floor. Recovery from these events can take more than an hour, even during mid-day. In spite of the ubiquitous presence of clouds and frequent rain during this season, the average horizontal wind speed spectrum is well described by dry CBL similarity hypotheses originally found to apply in flat terrain.
Cloud services for the Fermilab scientific stakeholders
Timm, S.; Garzoglio, G.; Mhashilkar, P.; ...
2015-12-23
As part of the Fermilab/KISTI cooperative research project, Fermilab has successfully run an experimental simulation workflow at scale on a federation of Amazon Web Services (AWS), FermiCloud, and local FermiGrid resources. We used the CernVM-FS (CVMFS) file system to deliver the application software. We established Squid caching servers in AWS as well, using the Shoal system to let each individual virtual machine find the closest squid server. We also developed an automatic virtual machine conversion system so that we could transition virtual machines made on FermiCloud to Amazon Web Services. We used this system to successfully run a cosmic ray simulation of the NOvA detector at Fermilab, making use of both AWS spot pricing and network bandwidth discounts to minimize the cost. On FermiCloud we also were able to run the workflow at the scale of 1000 virtual machines, using a private network routable inside of Fermilab. As a result, we present in detail the technological improvements that were used to make this work a reality.
A PACS archive architecture supported on cloud services.
Silva, Luís A Bastião; Costa, Carlos; Oliveira, José Luis
2012-05-01
Diagnostic imaging procedures have continuously increased over the last decade and this trend may continue in coming years, creating a great impact on the storage and retrieval capabilities of current PACS. Moreover, many smaller centers do not have the financial resources or requirements that justify the acquisition of a traditional infrastructure. Alternative solutions, such as cloud computing, may help address this emerging need. A tremendous amount of ubiquitous computational power, such as that provided by Google and Amazon, is used every day as a normal commodity. Taking advantage of this new paradigm, an architecture for a cloud-based PACS archive that provides data privacy, integrity, and availability is proposed. The solution is independent of the cloud provider, and the core modules were successfully instantiated on two example cloud computing providers. Operational metrics for several medical imaging modalities were tabulated and compared for Google Storage, Amazon S3, and LAN PACS. A PACS-as-a-Service archive that provides storage of medical studies using the cloud was developed. The results show that the solution is robust and that it is possible to store, query, and retrieve all desired studies in a similar way as in a local PACS approach. Cloud computing is an emerging solution that promises high scalability of infrastructures, software, and applications, according to a "pay-as-you-go" business model. The presented architecture uses the cloud to set up medical data repositories and can have a significant impact on healthcare institutions by reducing IT infrastructures.
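The provider-independent design described in this abstract can be sketched as a minimal storage abstraction with swappable backends. `BlobStore`, `LocalStore`, and the DICOM-style keys below are hypothetical names for illustration, not the paper's actual modules; a real deployment would add an S3 or Google Storage subclass behind the same interface:

```python
# Minimal sketch of a provider-independent blob store. BlobStore is an
# assumed interface; LocalStore is an in-memory stand-in for an Amazon S3
# or Google Storage backend. Names and keys are hypothetical.

class BlobStore:
    def put(self, key, data):
        raise NotImplementedError

    def get(self, key):
        raise NotImplementedError

    def query(self, prefix):
        raise NotImplementedError

class LocalStore(BlobStore):
    def __init__(self):
        self._blobs = {}

    def put(self, key, data):
        self._blobs[key] = bytes(data)

    def get(self, key):
        return self._blobs[key]

    def query(self, prefix):
        # List stored objects whose key starts with `prefix`, as a cloud
        # provider's prefix-listing API would.
        return sorted(k for k in self._blobs if k.startswith(prefix))

store = LocalStore()
store.put("study42/series1/img001.dcm", b"\x00\x01")
found = store.query("study42/")
```

Keeping application code against `BlobStore` rather than a concrete provider SDK is what lets the same archive run on either cloud or a LAN PACS.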
NASA Astrophysics Data System (ADS)
Wendisch, Manfred; Pöschl, Ulrich; Andreae, Meinrat O.; Machado, Luiz A. T.; Albrecht, Rachel; Schlager, Hans; Rosenfeld, Daniel; Krämer, Martina
2015-04-01
An extensive airborne/ground-based measurement campaign to study tropical convective clouds is introduced. It was performed in Brazil with focus on the Amazon rainforest from 1 September to 4 October 2014. The project combined the joint German-Brazilian ACRIDICON (Aerosol, Cloud, Precipitation, and Radiation Interactions and Dynamics of Convective Cloud Systems) and CHUVA (Machado et al. 2014) projects. ACRIDICON aimed at the quantification of aerosol-cloud-precipitation interactions and their thermodynamic, dynamic and radiative effects in convective cloud systems by in-situ aircraft observations and indirect measurements (aircraft, satellite, and ground-based). The ACRIDICON-CHUVA campaign was conducted in cooperation with the second Intensive Operational Phase (IOP) of the GOAmazon (Green Ocean Amazon) program. The focus in this presentation is on the airborne observations within ACRIDICON-CHUVA. The German HALO (High Altitude and Long-Range Research Aircraft) was based in Manaus (Amazonas State); it carried out 14 research flights (96 flight hours in total). HALO was equipped with remote sensing and in-situ instrumentation for meteorological, trace gas, aerosol, cloud, and precipitation measurements. Five mission objectives were pursued: (1) cloud vertical evolution (cloud profiling), (2) aerosol processing (inflow and outflow), (3) satellite validation, (4) vertical transport and mixing (tracer experiment), and (5) clouds over forested and deforested areas. The five cloud missions collected data in clean atmospheric conditions and in contrasting polluted (urban and biomass burning) environments.
Managing a tier-2 computer centre with a private cloud infrastructure
NASA Astrophysics Data System (ADS)
Bagnasco, Stefano; Berzano, Dario; Brunetti, Riccardo; Lusso, Stefano; Vallero, Sara
2014-06-01
In a typical scientific computing centre, several applications coexist and share a single physical infrastructure. An underlying Private Cloud infrastructure eases the management and maintenance of such heterogeneous applications (such as multipurpose or application-specific batch farms, Grid sites, interactive data analysis facilities and others), allowing dynamic allocation resources to any application. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques. Such infrastructures are being deployed in some large centres (see e.g. the CERN Agile Infrastructure project), but with several open-source tools reaching maturity this is becoming viable also for smaller sites. In this contribution we describe the Private Cloud infrastructure at the INFN-Torino Computer Centre, that hosts a full-fledged WLCG Tier-2 centre, an Interactive Analysis Facility for the ALICE experiment at the CERN LHC and several smaller scientific computing applications. The private cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem and the OpenWRT Linux distribution (used for network virtualization); a future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and OCCI.
Do Southern Ocean Cloud Feedbacks Matter for 21st Century Warming?
NASA Astrophysics Data System (ADS)
Frey, W. R.; Maroon, E. A.; Pendergrass, A. G.; Kay, J. E.
2017-12-01
Cloud phase improvements in a state-of-the-art climate model produce a large 1.5 K increase in equilibrium climate sensitivity (ECS, the surface warming in response to instantaneously doubled CO2) via extratropical shortwave cloud feedbacks. Here we show that the same model improvements produce only a small surface warming increase in a realistic 21st century emissions scenario. The small 21st century warming increase is attributed to extratropical ocean heat uptake. Southern Ocean mean-state circulation takes up heat while a slowdown in North Atlantic circulation acts as a feedback to slow surface warming. Persistent heat uptake by extratropical oceans implies that extratropical cloud biases may not be as important to 21st century warming as biases in other regions. Observational constraints on cloud phase and shortwave radiation that produce a large ECS increase do not imply large changes in 21st century warming.
NASA Astrophysics Data System (ADS)
Moreira, D. S.; Longo, K.; Freitas, S.; Mercado, L. M.; Miller, J. B.; Rosario, N. M. E. D.; Gatti, L.; Yamasoe, M. A.
2017-12-01
The Amazon region is characterized by high cloudiness, mainly from convective clouds, during most of the year owing to the high humidity and heat availability. During the Austral winter, however, the inter-tropical convergence zone (ITCZ) moves northward from its climatological position, significantly reducing cloudiness and precipitation and facilitating vegetation fires. Consequently, during these dry months, biomass burning aerosols contribute to relatively high values of aerosol optical depth (AOD) in Amazonia, typically exceeding 1.0 at the 550 nm wavelength. Both clouds and aerosols scatter solar radiation, reducing the direct irradiance and increasing the diffuse fraction that reaches the surface, which decreases near-surface temperature and increases the availability of photosynthetically active radiation (PAR). This, in turn, affects energy and CO2 fluxes within the vegetation canopy. We applied an atmospheric model fully coupled to a terrestrial carbon cycle model to assess the relative impact of biomass burning aerosols and clouds on CO2 fluxes in the Amazon region. Our results indicate that during most of the year, gross primary productivity (GPP) is high mainly due to high soil moisture and high values of the diffuse fraction of solar irradiation caused by cloudiness. Therefore, heterotrophic and autotrophic respiration are both high, increasing NEE values (i.e. reducing the net land sink). On the other hand, during the dry season, with a significant reduction of cloudiness, biomass burning aerosol is mainly responsible for the increase in the diffuse fraction of solar irradiation and in the GPP of the forest. However, the low soil moisture during the dry season, especially in the eastern Amazon, reduces heterotrophic and autotrophic respiration and thus compensates for the reduced GPP compared to the wet season. Different reasons, an anthropogenic one (human-induced fires during the dry season) and a natural one (cloudiness), lead to a somewhat stable value
The Amazon Boundary Layer Experiment (ABLE 2A) - Dry season 1985
NASA Technical Reports Server (NTRS)
Harriss, R. C.; Browell, E. V.; Hoell, J. M., Jr.; Bendura, R. J.; Beck, S. M.; Wofsy, S. C.; Mcneal, R. J.; Navarro, R. L.; Riley, J. T.; Snell, R. L.
1988-01-01
The Amazon Boundary Layer Experiment (ABLE 2A) used data from aircraft, ground-based, and satellite platforms to characterize the chemistry and dynamics of the lower atmosphere over the Amazon Basin during the early-to-middle dry season, July and August 1985. This paper reports the conceptual framework and experimental approach used in ABLE 2A and serves as an introduction to the detailed papers which follow in this issue. The results of ABLE 2A demonstrate that isoprene, methane, carbon dioxide, nitric oxide, dimethylsulfide, and organic aerosol emissions from soils and vegetation play a major role in determining the chemical composition of the atmospheric mixed layer over undisturbed forest and wetland environments. As the dry season progresses, emissions from both local and distant biomass burning become an important source of carbon monoxide, nitric oxide and ozone in the atmosphere over the central Amazon Basin.
Unidata's Vision for Transforming Geoscience by Moving Data Services and Software to the Cloud
NASA Astrophysics Data System (ADS)
Ramamurthy, Mohan; Fisher, Ward; Yoksas, Tom
2015-04-01
services and tools in the Amazon EC2 and Microsoft Azure cloud environments, including the transfer of real-time meteorological data into its cloud instances, product generation using those data, and the deployment of TDS, McIDAS ADDE and AWIPS II data servers and the Integrated Data Server visualization tool.
Cloudbus Toolkit for Market-Oriented Cloud Computing
NASA Astrophysics Data System (ADS)
Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian
This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.
Libraries in the Cloud: Making a Case for Google and Amazon
ERIC Educational Resources Information Center
Buck, Stephanie
2009-01-01
As news outlets create headlines such as "A Cloud & A Prayer," "The Cloud Is the Computer," and "Leveraging Clouds to Make You More Efficient," many readers have been left with cloud confusion. Many definitions exist for cloud computing, and a uniform definition is hard to find. In its most basic form, cloud…
Cloud Surprises in Moving NASA EOSDIS Applications into Amazon Web Services
NASA Technical Reports Server (NTRS)
Mclaughlin, Brett
2017-01-01
NASA ESDIS has been moving a variety of data ingest, distribution, and science data processing applications into a cloud environment over the last 2 years. As expected, there have been a number of challenges in migrating primarily on-premises applications into a cloud-based environment, related to architecture and taking advantage of cloud-based services. What was not expected is a number of issues that went beyond purely technical application re-architecture. We ran into surprising network policy limitations, billing challenges in a government-based cost model, and difficulty in obtaining certificates in a NASA security-compliant manner. On the other hand, this approach has allowed us to move a number of applications from local hosting to the cloud in a matter of hours (yes, hours!!), and our CMR application now services 95% of granule searches and an astonishing 99% of all collection searches in under a second. And most surprising of all, well, you'll just have to wait and see the realization that caught our entire team off guard!
The Shallow-to-Deep Transition in Convective Clouds During GoAmazon 2014/5
NASA Astrophysics Data System (ADS)
Jensen, M. P.; Gostic, C.; Giangrande, S. E.; Mechem, D. B.; Ghate, V. P.; Toto, T.
2016-12-01
Nearly two years of observations from the ARM Mobile Facility (AMF) deployed at Manacapuru, Brazil during the GOAmazon 2014/5 campaign are analyzed to investigate the environmental conditions controlling the transition from shallow to deep convective clouds. The Active Remote Sensing of Clouds (ARSCL) product, which combines radar and lidar observations to produce best estimates of cloud locations in the vertical column, is used to qualitatively define four subsets of convective cloud conditions: (1, 2) transition cases (wet season, dry season), where a period of shallow convective clouds is followed by a period of deep convective clouds, and (3, 4) non-transition cases (wet season, dry season), where shallow convective clouds persist without any subsequent development. For these subsets, observations of the time-varying thermodynamic properties of the atmosphere, including the surface heat and radiative fluxes, the profiles of atmospheric state variables, and the ECMWF-derived large-scale advective tendencies, are composited to define averaged properties for each transition state. Initial analysis indicates that the transition state strongly depends on the pre-dawn free-tropospheric humidity, the convective inhibition, and surface temperature and humidity, with little dependence on the convective available potential energy and surface heat fluxes. The composited environmental thermodynamics are then used to force large-eddy simulations for the four transition states to further evaluate the sensitivity of the transition to the composite thermodynamics versus the importance of larger-scale forcing.
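The compositing step this abstract describes, averaging atmospheric properties over all cases in a transition class, can be sketched in a few lines. The case profiles and class labels below are fabricated toy values, not GoAmazon observations:

```python
import numpy as np

# Toy compositing sketch: average a humidity-like profile over cases
# grouped by transition class. Values and labels are fabricated for
# illustration, not GoAmazon observations.

profiles = np.array([
    [80.0, 60.0, 40.0],   # case 1: transition day
    [78.0, 58.0, 42.0],   # case 2: transition day
    [65.0, 45.0, 25.0],   # case 3: non-transition day
])
labels = np.array(["transition", "transition", "non-transition"])

def composite(profiles, labels, which):
    """Mean profile over all cases carrying label `which`."""
    return profiles[labels == which].mean(axis=0)

transition_mean = composite(profiles, labels, "transition")
```

The same boolean-mask averaging extends directly to four classes and to any other composited field (fluxes, advective tendencies, and so on).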
Security Risks of Cloud Computing and Its Emergence as 5th Utility Service
NASA Astrophysics Data System (ADS)
Ahmad, Mushtaq
Cloud Computing is being projected by the major cloud service provider IT companies, such as IBM, Google, Yahoo, Amazon and others, as the fifth utility, where clients will have access for processing those applications and/or software projects which need very high processing speed for compute-intensive work and huge data capacity for scientific and engineering research problems, as well as e-business and data content network applications. These services for different types of clients are provided under DASM (Direct Access Service Management), based on virtualization of hardware, software and very high bandwidth Internet (Web 2.0) communication. The paper reviews these developments for Cloud Computing and the hardware/software configuration of the cloud paradigm. The paper also examines the vital aspects of security risks projected by IT industry experts and cloud clients, and highlights the cloud providers' response to cloud security risks.
De Paris, Renata; Frantz, Fábio A.; Norberto de Souza, Osmar; Ruiz, Duncan D. A.
2013-01-01
Molecular docking simulations of fully flexible protein receptor (FFR) models are coming of age. In our studies, an FFR model is represented by a series of different conformations derived from a molecular dynamic simulation trajectory of the receptor. For each conformation in the FFR model, a docking simulation is executed and analyzed. An important challenge is to perform virtual screening of millions of ligands using an FFR model in a sequential mode, since it can become computationally very demanding. In this paper, we propose a cloud-based web environment, called web Flexible Receptor Docking Workflow (wFReDoW), which reduces the CPU time in the molecular docking simulations of FFR models to small molecules. It is based on the new workflow data pattern called self-adaptive multiple instances (P-SaMI) and on a middleware built on Amazon EC2 instances. P-SaMI reduces the number of molecular docking simulations while the middleware speeds up the docking experiments using a High Performance Computing (HPC) environment on the cloud. The experimental results show a reduction in the total elapsed time of docking experiments and the quality of the new reduced receptor models produced by discarding the nonpromising conformations from an FFR model ruled by the P-SaMI data pattern. PMID:23691504
The Green Ocean: Precipitation Insights from the GoAmazon2014/5 Experiment
Wang, Die; Giangrande, Scott E.; Bartholomew, Mary Jane; ...
2018-02-07
This study summarizes the precipitation properties collected during the GoAmazon2014/5 campaign near Manaus in central Amazonia, Brazil. Precipitation breakdowns, summary radar rainfall relationships, and self-consistency concepts from coupled disdrometer and radar wind profiler measurements are presented. The properties of Amazon cumulus and associated stratiform precipitation are discussed, including segregations according to seasonal (Wet/Dry regime) variability, cloud echo-top height and possible aerosol influences on the apparent oceanic characteristics of the precipitation drop size distributions. Overall, we observe that the Amazon precipitation straddles behaviors found during previous U.S. Department of Energy Atmospheric Radiation Measurement program (ARM) tropical deployments, with distributions favoring higher concentrations of smaller drops than ARM continental examples. Oceanic-type precipitation characteristics are predominantly observed during the Amazon Wet seasons. Finally, an exploration of the controls on Wet season precipitation properties reveals that wind direction, as compared with other standard radiosonde thermodynamic parameters or aerosol count/regime classifications performed at the ARM site, provides a good indicator for those Wet season Amazon events having an oceanic character for their precipitation drop size distributions.
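The radar rainfall relationships mentioned in this abstract are conventionally power laws of the form Z = aR^b. The sketch below inverts one to estimate rain rate from reflectivity, using the classic Marshall-Palmer coefficients (a = 200, b = 1.6) as stand-ins rather than the relationships actually derived in the GoAmazon study:

```python
# Power-law radar rainfall relationship Z = a * R**b, inverted to get
# rain rate R (mm/h) from reflectivity. Coefficients are the classic
# Marshall-Palmer values, NOT those fitted in the GoAmazon study.

def rain_rate_mm_per_h(reflectivity_dbz, a=200.0, b=1.6):
    z_linear = 10.0 ** (reflectivity_dbz / 10.0)  # dBZ -> linear mm^6 m^-3
    return (z_linear / a) ** (1.0 / b)

r = rain_rate_mm_per_h(40.0)  # ~11.5 mm/h for a 40 dBZ echo
```

Campaign-specific fits change a and b (oceanic drop size distributions, with many small drops, push the coefficients away from continental values), which is exactly why disdrometer-based self-consistency checks matter.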
NASA Astrophysics Data System (ADS)
Oishi, Yu; Ishida, Haruma; Nakajima, Takashi Y.; Nakamura, Ryosuke; Matsunaga, Tsuneo
2018-05-01
The Greenhouse Gases Observing Satellite (GOSAT) was launched in 2009 to measure global atmospheric CO2 and CH4 concentrations. GOSAT is equipped with two sensors: the Thermal And Near infrared Sensor for carbon Observations (TANSO)-Fourier transform spectrometer (FTS) and TANSO-Cloud and Aerosol Imager (CAI). The presence of clouds in the instantaneous field of view of the FTS leads to incorrect estimates of the concentrations. Thus, the FTS data suspected to have cloud contamination must be identified by a CAI cloud discrimination algorithm and rejected. Conversely, overestimating clouds reduces the amount of FTS data that can be used to estimate greenhouse gas concentrations. This is a serious problem in tropical rainforest regions, such as the Amazon, where the amount of useable FTS data is small because of cloud cover. Preparations are continuing for the launch of the GOSAT-2 in fiscal year 2018. To improve the accuracy of the estimates of greenhouse gases concentrations, we need to refine the existing CAI cloud discrimination algorithm: Cloud and Aerosol Unbiased Decision Intellectual Algorithm (CLAUDIA1). A new cloud discrimination algorithm using a support vector machine (CLAUDIA3) was developed and presented in another paper. Although the use of visual inspection of clouds as a standard for judging is not practical for screening a full satellite data set, it has the advantage of allowing for locally optimized thresholds, while CLAUDIA1 and -3 use common global thresholds. Thus, the accuracy of visual inspection is better than that of these algorithms in most regions, with the exception of snow- and ice-covered surfaces, where there is not enough spectral contrast to identify cloud. In other words, visual inspection results can be used as truth data for accuracy evaluation of CLAUDIA1 and -3. For this reason visual inspection can be used for the truth metric for the cloud discrimination verification exercise. In this study, we compared CLAUDIA1-CAI and
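A threshold-based cloud test in the spirit of CLAUDIA1 can be sketched as follows; the feature names and threshold values are invented for illustration and are not the algorithm's actual global thresholds:

```python
# Toy threshold-based cloud discrimination: a pixel is flagged cloudy if
# any reflectance/temperature test fires. Feature names and thresholds
# are invented placeholders, not CLAUDIA1's actual global values.

def is_cloudy(pixel, refl_thresh=0.4, bt_thresh_k=270.0):
    tests = [
        pixel["vis_reflectance"] > refl_thresh,    # bright in the visible
        pixel["brightness_temp_k"] < bt_thresh_k,  # cold cloud top
    ]
    return any(tests)

clear_pixel = {"vis_reflectance": 0.12, "brightness_temp_k": 295.0}
cloudy_pixel = {"vis_reflectance": 0.65, "brightness_temp_k": 230.0}
```

Fixed global thresholds like these are exactly what breaks down over snow- and ice-covered surfaces, and what a learned classifier such as CLAUDIA3's support vector machine is meant to improve on.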
Covariability in the Monthly Mean Convective and Radiative Diurnal Cycles in the Amazon
NASA Technical Reports Server (NTRS)
Dodson, Jason B.; Taylor, Patrick C.
2015-01-01
The diurnal cycle of convective clouds greatly influences the radiative energy balance in convectively active regions of Earth, both through their direct presence and through the production of anvil and stratiform clouds. Previous studies show that the frequency and properties of convective clouds can vary on monthly timescales as a result of variability in the monthly mean atmospheric state. Furthermore, the radiative budget varies by up to 7 W m-2 in convectively active regions. These facts suggest that convective clouds connect atmospheric state variability and radiation variability beyond clear-sky effects alone. Previous research has identified monthly covariability between the diurnal cycle of CERES-observed top-of-atmosphere radiative fluxes and multiple atmospheric state variables (ASVs) from reanalysis over the Amazon region. ASVs that enhance (reduce) deep convection, such as CAPE (LTS), tend to shift the daily OLR and cloud albedo maxima earlier (later) in the day by 2-3 hr. We first test the analysis method using multiple reanalysis products for both the dry and wet seasons to further investigate the robustness of the preliminary results. We then use CloudSat data as an independent cloud observing system to further evaluate the relationships of cloud properties to variability in radiation and atmospheric states. While CERES can decompose OLR variability into clear-sky and cloud effects, it cannot determine which variability in cloud properties leads to variability in the radiative cloud effects. Cloud frequency, cloud top height, and cloud microphysics all contribute to the cloud radiative effect, and all are observable by CloudSat. In addition, CloudSat can observe the presence and variability of the deep convective cores responsible for the production of anvil clouds. We use these capabilities to determine the covariability of convective cloud properties and the radiative diurnal cycle.
Towards Efficient Scientific Data Management Using Cloud Storage
NASA Technical Reports Server (NTRS)
He, Qiming
2013-01-01
A software prototype allows users to back up and restore data to/from both public and private cloud storage, such as Amazon's S3 and NASA's Nebula. Unlike other off-the-shelf tools, this software ensures user data security in the cloud (through encryption) and minimizes users' operating costs by using space- and bandwidth-efficient compression and incremental backup. Parallel data processing utilities have also been developed by using massively scalable cloud computing in conjunction with cloud storage. One of the innovations in this software is using modified open-source components to work with a private cloud like NASA Nebula. Another is porting the complex backup-to-cloud software to embedded Linux, running on home networking devices, in order to benefit more users.
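The space- and bandwidth-efficient incremental backup described above can be sketched in a few lines. The following is a minimal, illustrative Python sketch (not the actual prototype): it hashes each file, re-uploads only content that changed since the last run, and gzip-compresses before transfer. The `upload` callable is a hypothetical stand-in for an S3 or Nebula client.

```python
import gzip
import hashlib
import os

def file_sha256(path):
    """Stream a file through SHA-256 so large files need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def incremental_backup(root, manifest, upload):
    """Upload only files whose content changed since the previous run.

    `manifest` maps path -> sha256 from the last backup; `upload(path, blob)`
    is a hypothetical stand-in for an S3/Nebula client call.
    Returns the new manifest to persist for the next run.
    """
    new_manifest = {}
    for dirpath, _dirs, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            checksum = file_sha256(path)
            new_manifest[path] = checksum
            if manifest.get(path) != checksum:  # new or modified file
                with open(path, "rb") as handle:
                    upload(path, gzip.compress(handle.read()))  # compress before transfer
    return new_manifest
```

On a second run with an unchanged tree, no uploads occur; only files whose hash differs from the stored manifest are re-sent, which is the bandwidth saving the abstract refers to.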
Cloud-Based Numerical Weather Prediction for Near Real-Time Forecasting and Disaster Response
NASA Technical Reports Server (NTRS)
Molthan, Andrew; Case, Jonathan; Venner, Jason; Schroeder, Richard; Checchi, Milton; Zavodsky, Bradley; O'Brien, Raymond
2015-01-01
Cloud computing capabilities have rapidly expanded within the private sector, offering new opportunities for meteorological applications. Collaborations between NASA Marshall, NASA Ames, and contractor partners led to evaluations of private (NASA) and public (Amazon) resources for executing short-term NWP systems. These activities helped the Marshall team further understand cloud capabilities and benchmark the use of cloud resources for NWP and other applications.
High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing
NASA Astrophysics Data System (ADS)
Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.
2015-12-01
Next-generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, where data volumes and data throughput rates are orders of magnitude larger than in present-day missions. Additionally, traditional means of procuring hardware on premise are already limited due to facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require fast turnaround times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of their Level-2 full-physics data products. We explore optimization approaches to getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment driven by market forces. We present how we enabled fault-tolerant computing in order to achieve large-scale computing as well as operational cost savings.
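The fault tolerance needed on the spot market boils down to re-queuing work lost when an instance is reclaimed. A minimal sketch, assuming a simple per-task retry budget (the names and retry policy are illustrative, not HySDS's actual implementation):

```python
def run_with_retries(tasks, attempt, max_retries=3):
    """Run each task, retrying those interrupted by a (simulated) spot-market
    preemption.

    `attempt(task)` returns True on success and False if the worker was
    reclaimed mid-run; `max_retries` bounds the retry budget per task.
    Returns (completed, abandoned) task lists.
    """
    done, failed = [], []
    for task in tasks:
        for _ in range(max_retries):
            if attempt(task):
                done.append(task)
                break
        else:
            failed.append(task)  # give up after exhausting the retry budget
    return done, failed
```

In a real system the retry would re-bid for a new spot instance and resume from a checkpoint; here the key point is only that preempted work is never silently dropped.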
NASA Astrophysics Data System (ADS)
Ten Hoeve, J. E.; Jacobson, M. Z.
2010-12-01
Satellite observational studies have found an increase in cloud fraction (CF) and cloud optical depth (COD) with increasing aerosol optical depth (AOD), followed by a decreasing CF/COD with increasing AOD at higher AODs over the Amazon Basin. The shape of this curve is similar to that of a boomerang, and thus the effect has been dubbed the "boomerang effect." The increase in CF/COD with increasing AOD at low AODs is ascribed to the first and second indirect effects and is referred to as a microphysical effect of aerosols on clouds. The decrease in CF/COD at higher AODs is ascribed to enhanced warming of clouds due to absorbing aerosols, either as inclusions in drops or interstitially between drops. This is referred to as a radiative effect. To date, the interaction of the microphysical and radiative effects has not been simulated with a regional or global computer model. Here, we simulate the boomerang effect with the nested global-through-urban climate, air pollution, and weather forecast model GATOR-GCMOM for the Amazon biomass burning season of 2006. We also compare the model with an extensive set of data, including satellite data from MODIS, TRMM, and CALIPSO, in situ surface observations, upper-air data, and AERONET data. Biomass burning emissions are obtained from the Global Fire Emissions Database (GFEDv2) and are combined with MODIS land cover data along with biomass burning emission factors. A high-resolution domain, nested within three increasingly coarser domains, is employed over the heaviest biomass burning region within the arc of deforestation. Modeled trends in cloud properties with aerosol loading compare well with MODIS-observed trends, allowing the causes of these observed correlations, including the boomerang effect, to be determined from model results. The impacts of aerosols on various cloud parameters, such as cloud optical thickness, cloud fraction, cloud liquid water/ice content, and precipitation, are shown through differences between
Giangrande, Scott E.; Toto, Tami; Jensen, Michael P.; ...
2016-11-15
A radar wind profiler data set collected during the 2-year Department of Energy Atmospheric Radiation Measurement Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) campaign is used to estimate convective cloud vertical velocity, area fraction, and mass flux profiles. Vertical velocity observations are presented using cumulative frequency histograms and weighted mean profiles to provide insights in a manner suitable for global climate model scale comparisons (spatial domains from 20 km to 60 km). Convective profile sensitivity to changes in environmental conditions and seasonal regime controls is also considered. Aggregate and ensemble average vertical velocity, convective area fraction, and mass flux profiles, as well as magnitudes and relative profile behaviors, are found consistent with previous studies. Updrafts and downdrafts increase in magnitude with height to midlevels (6 to 10 km), with updraft area also increasing with height. Updraft mass flux profiles similarly increase with height, showing a peak in magnitude near 8 km. Downdrafts are observed to be most frequent below the freezing level, with downdraft area monotonically decreasing with height. Updraft and downdraft profile behaviors are further stratified according to environmental controls. These results indicate stronger vertical velocity profile behaviors under higher convective available potential energy and lower low-level moisture conditions. Sharp contrasts in convective area fraction and mass flux profiles are most pronounced when retrievals are segregated according to Amazonian wet and dry season conditions. During this deployment, wet season regimes favored higher domain mass flux profiles, attributed to more frequent convection that offsets weaker average convective cell vertical velocities.
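The bulk relation underlying such mass flux estimates is M = ρσw: density times convective area fraction times mean in-cloud vertical velocity. A small illustrative helper (the numeric values in the usage example are assumptions for illustration, not campaign numbers):

```python
def convective_mass_flux(rho, area_fraction, w_mean):
    """Bulk convective mass flux M = rho * sigma * w (kg m^-2 s^-1).

    rho: air density (kg m^-3); area_fraction: convective area fraction
    sigma (dimensionless); w_mean: mean in-cloud vertical velocity (m s^-1).
    """
    return rho * area_fraction * w_mean

# Illustrative (assumed) values near 8 km: rho ~ 0.53 kg m^-3,
# sigma ~ 0.01, w ~ 5 m s^-1.
flux = convective_mass_flux(0.53, 0.01, 5.0)
```

Because σ and w are both height-dependent, a mass flux profile peaking near 8 km, as reported, can arise even where neither factor peaks there on its own.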
Application of microarray analysis on computer cluster and cloud platforms.
Bernau, C; Boulesteix, A-L; Knaus, J
2013-01-01
Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
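The embarrassingly parallel structure the authors exploit, independent permutation iterations split across workers, can be sketched as follows. `map_fn` is a hypothetical hook where a cluster- or cloud-backed parallel map (e.g. `multiprocessing.Pool.map`) would be plugged in; the statistical details are a generic two-sample test, not the specific microarray procedures of the paper.

```python
import random
from statistics import mean

def _chunk_hits(args):
    """Count permutations at least as extreme as the observed difference.

    Chunks are mutually independent, so they can run on separate cluster
    nodes or cloud instances without coordination.
    """
    values, n_a, observed, n_perm, seed = args
    rng = random.Random(seed)
    pool = list(values)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pool)
        diff = mean(pool[:n_a]) - mean(pool[n_a:])
        if abs(diff) >= abs(observed):
            hits += 1
    return hits

def permutation_test(group_a, group_b, n_perm=2000, n_chunks=4, map_fn=map):
    """Two-sample permutation test for a difference in means.

    Pass a parallel map as `map_fn` to spread the independent chunks over
    workers; the serial and parallel results are identical by construction.
    """
    observed = mean(group_a) - mean(group_b)
    values = list(group_a) + list(group_b)
    per_chunk = n_perm // n_chunks
    jobs = [(values, len(group_a), observed, per_chunk, seed)
            for seed in range(n_chunks)]
    hits = sum(map_fn(_chunk_hits, jobs))
    return hits / (per_chunk * n_chunks)
```

This is the computational independence the abstract highlights: nothing is shared between chunks except the input data and the observed statistic, so efficiency depends mainly on data transfer, which is the cloud-vs-cluster trade-off the paper measures.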
Inexpensive and Highly Reproducible Cloud-Based Variant Calling of 2,535 Human Genomes
Shringarpure, Suyash S.; Carroll, Andrew; De La Vega, Francisco M.; Bustamante, Carlos D.
2015-01-01
Population scale sequencing of whole human genomes is becoming economically feasible; however, data management and analysis remains a formidable challenge for many research groups. Large sequencing studies, like the 1000 Genomes Project, have improved our understanding of human demography and the effect of rare genetic variation in disease. Variant calling on datasets of hundreds or thousands of genomes is time-consuming, expensive, and not easily reproducible given the myriad components of a variant calling pipeline. Here, we describe a cloud-based pipeline for joint variant calling in large samples using the Real Time Genomics population caller. We deployed the population caller on the Amazon cloud with the DNAnexus platform in order to achieve low-cost variant calling. Using our pipeline, we were able to identify 68.3 million variants in 2,535 samples from Phase 3 of the 1000 Genomes Project. By performing the variant calling in a parallel manner, the data was processed within 5 days at a compute cost of $7.33 per sample (a total cost of $18,590 for completed jobs and $21,805 for all jobs). Analysis of cost dependence and running time on the data size suggests that, given near linear scalability, cloud computing can be a cheap and efficient platform for analyzing even larger sequencing studies in the future. PMID:26110529
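The reported costs are consistent with a simple linear model: $7.33 per sample across 2,535 samples reproduces roughly the quoted $18,590 for completed jobs (the small gap comes from rounding in the per-sample figure). A quick check:

```python
def total_cost(cost_per_sample, n_samples):
    """Linear cost model implied by the abstract: total = per-sample cost * N."""
    return cost_per_sample * n_samples

def per_sample_cost(total, n_samples):
    """Invert the model to recover the per-sample rate."""
    return total / n_samples

total = total_cost(7.33, 2535)        # ~ $18,581.55, vs. reported $18,590
rate = per_sample_cost(18590, 2535)   # ~ $7.33 per sample
```

Near-linear scalability, as the abstract argues, means this per-sample rate should stay roughly constant as the cohort grows, which is what makes the cloud cost projection straightforward.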
NASA Astrophysics Data System (ADS)
Lapola, D. M.
2015-12-01
The existence, magnitude and duration of a supposed "CO2 fertilization" effect in tropical forests remains largely undetermined, despite being suggested for nearly 20 years as a key knowledge gap for understanding the future resilience of Amazonian forests and their impact on the global carbon cycle. Reducing this uncertainty is critical for assessing the future of the Amazon region as well as its vulnerability to climate change. The AmazonFACE (Free-Air CO2 Enrichment) research program is an integrated model-experiment initiative of unprecedented scope in an old-growth Amazon forest near Manaus, Brazil - the first of its kind in a tropical forest. The experimental treatment will simulate an atmospheric CO2 concentration [CO2] of the future in order to address the question: "How will rising atmospheric CO2 affect the resilience of the Amazon forest, the biodiversity it harbors, and the ecosystem services it provides, in light of projected climatic changes?" AmazonFACE is divided into three phases: (I) pre-experimental ecological characterization of the research site; (II) a pilot experiment comprising two 30-m diameter plots, with one treatment plot maintained at elevated [CO2] (ambient +200 ppmv) and the other control plot at ambient [CO2]; and (III) a fully replicated long-term experiment comprising four pairs of control/treatment FACE plots maintained for 10 years. A team of scientists from Brazil, the USA, Australia and Europe will employ state-of-the-art methods to study the forest inside these plots in terms of carbon metabolism and cycling, water use, nutrient cycling, forest community composition, and interactions with environmental stressors. All project phases also encompass ecosystem-modeling activities, such that models provide hypotheses to be verified in the experiment, which in turn will feed the models to ultimately produce more accurate projections of the environment. Resulting datasets and analyses will be a valuable resource for a broad community
NASA Astrophysics Data System (ADS)
Andreae, M. O.; Afchine, A.; Albrecht, R. I.; Artaxo, P.; Borrmann, S.; Cecchini, M. A.; Costa, A.; Fütterer, D.; Järvinen, E.; Klimach, T.; Konemann, T.; Kraemer, M.; Machado, L.; Mertes, S.; Pöhlker, C.; Pöhlker, M. L.; Poeschl, U.; Sauer, D. N.; Schnaiter, M.; Schneider, J.; Schulz, C.; Spanu, A.; Walser, A.; Wang, J.; Weinzierl, B.; Wendisch, M.
2016-12-01
Observations during ACRIDICON-CHUVA showed high aerosol concentrations in the upper troposphere (UT) over the Amazon Basin, with aerosol number concentrations after normalization to STP often exceeding those in the boundary layer (BL) by one or two orders of magnitude. The measurements were made during the German-Brazilian cooperative aircraft campaign ACRIDICON-CHUVA (Aerosol, Cloud, Precipitation, and Radiation Interactions and Dynamics of Convective Cloud Systems) on the German research aircraft HALO. The campaign took place over the Amazon Basin in September/October 2014, with the objective of studying tropical deep convective clouds over the Amazon rainforest and their interactions with trace gases, aerosol particles, and atmospheric radiation. Aerosol enhancements were consistently observed on all flights, using several aerosol metrics, including condensation nuclei (CN), cloud condensation nuclei (CCN), and chemical species mass concentrations. These UT aerosols were different in their composition and size distribution from the aerosol in the BL, making convective transport of particles unlikely as a source. The regions in the immediate outflow of deep convective clouds were depleted in aerosol particles, whereas dramatically enhanced small (<90 nm diameter) aerosol number concentrations were found in UT regions that had experienced outflow from deep convection in the preceding 24-48 hours. We also found elevated concentrations of larger (>90 nm) particles in the UT, which consisted mostly of organic matter and nitrate and were very effective CCN. Our findings suggest that aerosol production takes place in the UT from volatile material brought up by deep convection, which is converted to condensable species in the UT. Subsequently, downward mixing and transport of upper tropospheric aerosol may be a source of particles to the BL, where they increase in size by the condensation of biogenic volatile organic carbon (BVOC) oxidation products. This may be an
Automating NEURON Simulation Deployment in Cloud Resources
Santamaria, Fidel
2016-01-01
Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Compute Cloud, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software, NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user can recruit private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model. PMID:27655341
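The speedup and efficiency metrics examined in such studies are the standard ones, S = T1/Tp and E = S/p. A minimal sketch (the timings in the usage example are made-up, not numbers from the paper):

```python
def speedup(t_serial, t_parallel):
    """Speedup S = T_1 / T_p: serial runtime over parallel runtime
    for the same batch of simulations."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_workers):
    """Parallel efficiency E = S / p: the fraction of ideal linear
    scaling actually achieved (1.0 = perfect)."""
    return speedup(t_serial, t_parallel) / n_workers

# Made-up example: a 120 s serial batch finishing in 30 s on 8 workers
s = speedup(120.0, 30.0)          # 4x speedup
e = efficiency(120.0, 30.0, 8)    # 0.5, i.e. half of ideal scaling
```

Efficiency below 1.0 on cloud instances typically reflects provisioning and data-staging overhead, which is exactly the "total session time" cost the study accounts for separately.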
NASA Astrophysics Data System (ADS)
Rammig, A.; Fleischer, K.; Lapola, D.; Holm, J.; Hoosbeek, M.
2017-12-01
Increasing atmospheric CO2 concentration is assumed to have a stimulating effect ("CO2 fertilization effect") on forest growth and resilience. Empirical evidence, however, for the existence and strength of such a tropical CO2 fertilization effect is scarce and thus a major impediment for constraining the uncertainties in Earth System Model projections. The implications of the tropical CO2 effect are far-reaching, as it strongly influences the global carbon and water cycle, and hence future global climate. In the scope of the Amazon Free Air CO2 Enrichment (FACE) experiment, we addressed these uncertainties by assessing the CO2 fertilization effect at ecosystem scale. AmazonFACE is the first FACE experiment in an old-growth, highly diverse tropical rainforest. Here, we present a priori model-based hypotheses for the experiment derived from a set of 12 ecosystem models. Model simulations identified key uncertainties in our understanding of limiting processes and derived model-based hypotheses of expected ecosystem responses to elevated CO2 that can directly be tested during the experiment. Ambient model simulations compared satisfactorily with in-situ measurements of ecosystem carbon fluxes, as well as carbon, nitrogen, and phosphorus stocks. Models consistently predicted an increase in photosynthesis with elevated CO2, which declined over time due to developing limitations. The conversion of enhanced photosynthesis into biomass, and hence ecosystem carbon sequestration, varied strongly among the models due to different assumptions on nutrient limitation. Models with flexible allocation schemes consistently predicted an increased investment in belowground structures to alleviate nutrient limitation, in turn accelerating turnover rates of soil organic matter. The models diverged on the prediction for carbon accumulation after 10 years of elevated CO2, mainly due to contrasting assumptions in their phosphorus cycle representation. These differences define the expected
Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic.
Sanduja, S; Jewell, P; Aron, E; Pharai, N
2015-09-01
Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources to expedite analysis and reporting. Cloud-based computing environments can be set up with a fraction of the time and effort required by traditional local datacenter-based solutions. This tutorial explains how to get started with building your own personal cloud computer cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic.
Biogenic cloud nuclei in the central Amazon during the transition from wet to dry season
NASA Astrophysics Data System (ADS)
Whitehead, James D.; Darbyshire, Eoghan; Brito, Joel; Barbosa, Henrique M. J.; Crawford, Ian; Stern, Rafael; Gallagher, Martin W.; Kaye, Paul H.; Allan, James D.; Coe, Hugh; Artaxo, Paulo; McFiggans, Gordon
2016-08-01
The Amazon basin is a vast continental area in which atmospheric composition is relatively unaffected by anthropogenic aerosol particles. Understanding the properties of the natural biogenic aerosol particles over the Amazon rainforest is key to understanding their influence on regional and global climate. While there have been a number of studies during the wet season, and of biomass burning particles in the dry season, there has been relatively little work on the transition period - the start of the dry season in the absence of biomass burning. As part of the Brazil-UK Network for Investigation of Amazonian Atmospheric Composition and Impacts on Climate (BUNIAACIC) project, aerosol measurements, focussing on unpolluted biogenic air masses, were conducted at a remote rainforest site in the central Amazon during the transition from wet to dry season in July 2013. This period marks the start of the dry season but before significant biomass burning occurs in the region. Median particle number concentrations were 266 cm-3, with size distributions dominated by an accumulation mode of 130-150 nm. During periods of low particle counts, a smaller Aitken mode could also be seen around 80 nm. While the concentrations were similar in magnitude to those seen during the wet season, the size distributions suggest an enhancement in the accumulation mode compared to the wet season, but not yet to the extent seen later in the dry season, when significant biomass burning takes place. Submicron nonrefractory aerosol composition, as measured by an aerosol chemical speciation monitor (ACSM), was dominated by organic material (around 81 %). Aerosol hygroscopicity was probed using measurements from a hygroscopicity tandem differential mobility analyser (HTDMA), and a quasi-monodisperse cloud condensation nuclei counter (CCNc). The hygroscopicity parameter, κ, was found to be low, ranging from 0.12 for Aitken-mode particles to 0.18 for accumulation-mode particles. This was consistent
NASA Astrophysics Data System (ADS)
Cochrane, S.; Schmidt, S.; Massie, S. T.; Iwabuchi, H.; Chen, H.
2017-12-01
Analysis of multiple partially cloudy scenes as observed by OCO-2 in nadir and target mode (published previously and reviewed here) revealed that XCO2 retrievals are systematically biased in the presence of scattered clouds. The bias can only partially be removed by applying more stringent filtering, and it depends on the degree of scene inhomogeneity as quantified with collocated MODIS/Aqua imagery. The physical reason behind this effect has so far not been well understood because, in contrast to cloud-mediated biases in imagery-derived aerosol retrievals, passive gas absorption spectroscopy products do not depend on the absolute radiance level and should therefore be less sensitive to 3D cloud effects and surface albedo variability. However, preliminary evidence from 3D radiative transfer calculations suggested that clouds in the vicinity of an OCO-2 footprint not only offset the reflected radiance spectrum, but introduce a spectrally dependent perturbation that affects absorbing channels disproportionately, and therefore bias the spectroscopy products. To understand the nature of this effect for a variety of scenes, we developed the OCO-2 radiance simulator, which uses the available information on a scene (e.g., MODIS-derived surface albedo, cloud distribution, and other parameters) as the basis for 3D radiative transfer calculations that can predict the radiances observed by OCO-2. We present this new tool and show examples of its utility for a few specific scenes. More importantly, we draw conclusions about the physical mechanism behind this 3D cloud effect on radiances and ultimately OCO-2 retrievals, which involves not only the clouds themselves but also the surface. Armed with this understanding, we can now detect cloud vicinity effects in the OCO-2 spectra directly, without actually running the 3D radiance simulator. Potentially, it is even possible to mitigate these effects and thus increase data harvest in regions with ubiquitous cloud cover such as the Amazon.
NASA Astrophysics Data System (ADS)
Freud, E.; Rosenfeld, D.; Andreae, M. O.; Costa, A. A.; Artaxo, P.
2008-03-01
In-situ measurements in convective clouds (up to the freezing level) over the Amazon basin show that smoke from deforestation fires prevents clouds from precipitating until they acquire a vertical development of at least 4 km, compared to only 1-2 km in clean clouds. The average cloud depth required for the onset of warm rain increased by ~350 m for each additional 100 cloud condensation nuclei per cm3 at a super-saturation of 0.5% (CCN0.5%). In polluted clouds, the diameter of modal liquid water content grows much slower with cloud depth (at least by a factor of ~2), due to the large number of droplets that compete for available water and to the suppressed coalescence processes. Contrary to what other studies have suggested, we did not observe this effect to reach saturation at 3000 or more accumulation mode particles per cm3. The CCN0.5% concentration was found to be a very good predictor for the cloud depth required for the onset of warm precipitation and other microphysical factors, leaving only a secondary role for the updraft velocities in determining the cloud drop size distributions. The effective radius of the cloud droplets (re) was found to be a quite robust parameter for a given environment and cloud depth, showing only a small effect of partial droplet evaporation from the cloud's mixing with its drier environment. This supports one of the basic assumptions of satellite analysis of cloud microphysical processes: the ability to look at different cloud top heights in the same region and regard their re as if they had been measured inside one well developed cloud. The dependence of re on the adiabatic fraction decreased higher in the clouds, especially for cleaner conditions, and disappeared at re≥~10 μm. We propose that droplet coalescence, which is at its peak when warm rain is formed in the cloud at re=~10 μm, continues to be significant during the cloud's mixing with the entrained air, cancelling out the decrease in re due to evaporation.
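The linear relationship reported above, roughly 350 m of additional cloud depth per extra 100 CCN cm^-3 at 0.5% supersaturation, can be expressed directly. The clean-air baseline depth and the zero-CCN intercept below are simplifying assumptions for illustration, not values from the paper:

```python
def warm_rain_onset_depth(ccn_05, base_depth_m=1500.0, slope_m=350.0):
    """Cloud depth (m) needed for warm-rain onset, from the abstract's
    linear relation: ~350 m of extra depth per additional 100 CCN cm^-3
    at 0.5% supersaturation (ccn_05).

    The 1500 m clean-air baseline (middle of the reported 1-2 km) and the
    zero-CCN intercept are illustrative assumptions.
    """
    return base_depth_m + slope_m * ccn_05 / 100.0

clean = warm_rain_onset_depth(0)      # ~1.5 km for clean clouds
smoky = warm_rain_onset_depth(700)    # ~3.95 km for a smoky air mass
```

With ~700 CCN cm^-3, a plausible smoke-affected concentration, the required depth approaches the ≥4 km the measurements report for polluted clouds, versus 1-2 km in clean air.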
Eun, So Young; Park, Sang Won; Lee, Jae Heun; Chang, Ki Churl; Kim, Hye Jung
2014-04-01
Lipoprotein oxidation, inflammation, and immune responses involving the vascular endothelium and immune cells contribute to the pathogenesis of atherosclerosis. In an atherosclerotic animal model, P2Y2 receptor (P2Y2R) upregulation and stimulation were previously shown to induce intimal hyperplasia and increased intimal monocyte infiltration. Thus, we investigated the role of P2Y2R in oxidized low-density lipoprotein (oxLDL)-mediated oxidative stress and the subsequent interaction between endothelial cells (ECs) and immune cells. The treatment of human ECs with oxLDL caused the rapid release of ATP (maximum after 5 min). ECs treated with oxLDL or the P2Y2R agonists ATP/UTP for 1h exhibited significant reactive oxygen species (ROS) production, but this effect was not observed in P2Y2R siRNA-transfected ECs. In addition, oxLDL and ATP/UTP both induced RAGE expression, which was P2Y2R dependent. Oxidized LDL- and ATP/UTP-mediated ROS production was diminished in RAGE siRNA-transfected ECs, suggesting that RAGE is an important mediator in P2Y2R-mediated ROS production. Treatment with oxLDL for 24h induced P2Y2R expression in the human monocyte cell line THP-1 and increased THP-1 cell migration toward ECs. The addition of apyrase, an enzyme that hydrolyzes nucleotides, or diphenyleneiodonium (DPI), a well-known inhibitor of NADPH oxidase, significantly inhibited the increase in cell migration caused by oxLDL. P2Y2R siRNA-transfected THP-1 cells did not migrate in response to oxLDL or ATP/UTP treatment, indicating a critical role for P2Y2R and nucleotide release in oxLDL-induced monocyte migration. Last, oxLDL and ATP/UTP effectively increased ICAM-1 and VCAM-1 expression and the subsequent binding of THP-1 cells to ECs, which was inhibited by pretreatment with DPI or by siRNA against P2Y2R or RAGE, suggesting that P2Y2R is an important mediator in oxLDL-mediated monocyte adhesion to ECs through the regulation of ROS-dependent adhesion molecule expression in ECs. Taken
Madduri, Ravi K; Sulakhe, Dinanath; Lacinski, Lukasz; Liu, Bo; Rodriguez, Alex; Chard, Kyle; Dave, Utpal J; Foster, Ian T
2014-09-10
We describe Globus Genomics, a system that we have developed for rapid analysis of large quantities of next-generation sequencing (NGS) genomic data. This system achieves a high degree of end-to-end automation that encompasses every stage of data analysis including initial data retrieval from remote sequencing centers or storage (via the Globus file transfer system); specification, configuration, and reuse of multi-step processing pipelines (via the Galaxy workflow system); creation of custom Amazon Machine Images and on-demand resource acquisition via a specialized elastic provisioner (on Amazon EC2); and efficient scheduling of these pipelines over many processors (via the HTCondor scheduler). The system allows biomedical researchers to perform rapid analysis of large NGS datasets in a fully automated manner, without software installation or a need for any local computing infrastructure. We report performance and cost results for some representative workloads.
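The elastic-provisioner idea described above (watch the HTCondor queue, acquire EC2 instances on demand) can be sketched as a simple scaling decision. This is hypothetical logic for illustration only, not Globus Genomics' actual provisioner; the job-per-instance and pool-cap figures are invented:

```python
# Hypothetical scaling rule for an elastic EC2 provisioner driven by
# scheduler queue depth (illustrative only; not the Globus Genomics code).

def desired_instances(idle_jobs: int, running_instances: int,
                      jobs_per_instance: int = 8, max_instances: int = 20) -> int:
    """Return how many EC2 instances the worker pool should have."""
    if idle_jobs == 0:
        return running_instances          # nothing queued; leave pool unchanged
    needed = running_instances + -(-idle_jobs // jobs_per_instance)  # ceil division
    return min(needed, max_instances)     # never exceed the configured cap

print(desired_instances(17, 2))    # 2 running + ceil(17/8) = 5
print(desired_instances(500, 0))   # demand spike, capped at 20
```

A real provisioner would also retire idle instances near billing-hour boundaries to control cost; the sketch covers only the scale-up side.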
Cecchini, Micael A.; Machado, Luiz A. T.; Comstock, Jennifer M.; ...
2016-06-09
The remote atmosphere over the Amazon can be similar to oceanic regions in terms of aerosol conditions and cloud type formations. This is especially true during the wet season. The main aerosol-related disturbances over the Amazon have both natural sources, such as dust transport from Africa, and anthropogenic sources, such as biomass burning or urban pollution. The present work considers the impacts of the latter on the microphysical properties of warm-phase clouds by analyzing observations of the interactions between the Manaus pollution plume and its surroundings, as part of the GoAmazon2014/5 Experiment. The analyzed period corresponds to the wet season (specifically from February to March 2014 and corresponding to the first Intensive Operating Period (IOP1) of GoAmazon2014/5). The droplet size distributions reported are in the range 1 µm ≤ D ≤ 50 µm in order to capture the processes leading up to the precipitation formation. The wet season largely presents a clean background atmosphere characterized by frequent rain showers. As such, the contrast between background clouds and those affected by the Manaus pollution can be observed and detailed. The focus is on the characteristics of the initial microphysical properties in cumulus clouds predominantly at their early stages. The pollution-affected clouds are found to have smaller effective diameters and higher droplet number concentrations. The differences range from 10 to 40 % for the effective diameter and are as high as 1000 % for droplet concentration for the same vertical levels. The growth rates of droplets with altitude are slower for pollution-affected clouds (2.90 compared to 5.59 µm km⁻¹), as explained by the absence of bigger droplets at the onset of cloud development. Clouds under background conditions have higher concentrations of larger droplets (> 20 µm) near the cloud base, which would contribute significantly to the growth rates through the collision–coalescence process. The

Cloud-based Web Services for Near-Real-Time Web access to NPP Satellite Imagery and other Data
NASA Astrophysics Data System (ADS)
Evans, J. D.; Valente, E. G.
2010-12-01
We are building a scalable, cloud computing-based infrastructure for Web access to near-real-time data products synthesized from the U.S. National Polar-Orbiting Environmental Satellite System (NPOESS) Preparatory Project (NPP) and other geospatial and meteorological data. Given recent and ongoing changes in the NPP and NPOESS programs (now Joint Polar Satellite System), the need for timely delivery of NPP data is urgent. We propose an alternative to a traditional, centralized ground segment, using distributed Direct Broadcast facilities linked to industry-standard Web services by a streamlined processing chain running in a scalable cloud computing environment. Our processing chain, currently implemented on Amazon.com's Elastic Compute Cloud (EC2), retrieves raw data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) and synthesizes data products such as Sea-Surface Temperature, Vegetation Indices, etc. The cloud computing approach lets us grow and shrink computing resources to meet large and rapid fluctuations (twice daily) in both end-user demand and data availability from polar-orbiting sensors. Early prototypes have delivered various data products to end-users with latencies between 6 and 32 minutes. We have begun to replicate machine instances in the cloud, so as to reduce latency and maintain near-real time data access regardless of increased data input rates or user demand -- all at quite moderate monthly costs. Our service-based approach (in which users invoke software processes on a Web-accessible server) facilitates access into datasets of arbitrary size and resolution, and allows users to request and receive tailored and composite (e.g., false-color multiband) products on demand. To facilitate broad impact and adoption of our technology, we have emphasized open, industry-standard software interfaces and open source software. Through our work, we envision the widespread establishment of similar, derived, or interoperable systems for
Mouths of the Amazon River, Brazil, South America
NASA Technical Reports Server (NTRS)
1991-01-01
Huge sediment loads from the interior of the country flow through the Mouths of the Amazon River, Brazil (0.5S, 50.0W). The river current carries hundreds of tons of sediment through the multiple outlets of the great river over 100 miles from shore before it is carried northward by oceanic currents. The characteristic 'fair weather cumulus' pattern of low clouds over the land but not over water may be observed in this scene.
Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic
Sanduja, S; Jewell, P; Aron, E; Pharai, N
2015-01-01
Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources to expedite analysis and reporting. Cloud-based computing environments can be set up with a fraction of the time and effort required by traditional local datacenter-based solutions. This tutorial explains how to get started building your own personal cloud computing cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic. PMID:26451333
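A cluster build along the lines of the tutorial above starts by launching an EC2 head node. The parameter dictionary below is a minimal, hypothetical sketch of such a launch request (the AMI ID, key name, security group, and instance type are placeholders, not the tutorial's actual values):

```python
# Hypothetical EC2 launch parameters for a Grid Engine head node
# (placeholder identifiers; substitute a real AMI, key pair, and
# security group before use).
head_node_params = {
    "ImageId": "ami-0123456789abcdef0",   # placeholder machine image with NONMEM/PsN preinstalled
    "InstanceType": "c4.2xlarge",          # compute-optimized type for estimation runs
    "MinCount": 1,
    "MaxCount": 1,
    "KeyName": "pmx-cluster-key",          # placeholder SSH key pair name
    "SecurityGroupIds": ["sg-0example"],   # must allow SSH and Grid Engine ports
}

# In a real session this dict would be passed to boto3, e.g.:
#   boto3.client("ec2").run_instances(**head_node_params)
```

Worker nodes would typically reuse the same image with a different startup role, so the scheduler configuration stays uniform across the cluster.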
On the controls of deep convection and lightning in the Amazon
NASA Astrophysics Data System (ADS)
Albrecht, R. I.; Giangrande, S. E.; Wang, D.; Morales, C. A.; Pereira, R. F. O.; Machado, L.; Silva Dias, M. A. F.
2017-12-01
Local observations and remote sensing have been extensively used to unravel cloud distribution and life cycle, yet their representation in cloud-resolving models (CRMs) and global climate models (GCMs) is still very poor. In addition, the complex cloud-aerosol-precipitation interactions (CAPI), as well as thermodynamic, dynamic and large-scale controls on convection, have been the focus of many studies in the last two decades, but still no final answer has been reached on the overall impacts of these interactions and controls on clouds, especially on deep convection. To understand the environmental and CAPI controls of deep convection, cloud electrification and lightning activity in the pristine region of the Amazon basin, in this study we use long-term satellite and field campaign measurements to depict the characteristics of deep convection and the relationships between lightning and convective fluxes in this region. Precipitation and lightning activity from the Tropical Rainfall Measuring Mission (TRMM) satellite are combined with estimates of aerosol concentrations and reanalysis data to delineate the overall controls on thunderstorms. A more detailed analysis is obtained by studying these controls on the relationship between lightning activity and convective mass fluxes using a radar wind profiler and 3D total lightning during the GoAmazon 2014/15 field campaign. We find evidence that large-scale conditions control the distribution of the precipitation, with widespread and more frequent mass fluxes of moderate intensity during the wet season, resulting in less vigorous convection and lower lightning activity. Under higher convective available potential energy, lightning is enhanced in polluted and background aerosol conditions. The relationships found in this study can be used in model parameterizations and ensemble evaluations of both lightning activity and lightning NOx from seasonal forecasting to climate projections and in a broader sense to Earth Climate
NASA Astrophysics Data System (ADS)
Lin, Qinhao; Zhang, Guohua; Peng, Long; Bi, Xinhui; Wang, Xinming; Brechtel, Fred J.; Li, Mei; Chen, Duohong; Peng, Ping'an; Sheng, Guoying; Zhou, Zhen
2017-07-01
To investigate how atmospheric aerosol particles interact with the chemical composition of cloud droplets, a ground-based counterflow virtual impactor (GCVI) coupled with a real-time single-particle aerosol mass spectrometer (SPAMS) was used to assess the chemical composition and mixing state of individual cloud residue particles in the Nanling Mountains (1690 m a.s.l.), southern China, in January 2016. The cloud residues were classified into nine particle types: aged elemental carbon (EC), potassium-rich (K-rich), amine, dust, Pb, Fe, organic carbon (OC), sodium-rich (Na-rich) and Other. The largest fraction of the total cloud residues was the aged EC type (49.3 %), followed by the K-rich type (33.9 %). Abundant aged EC cloud residues that mixed internally with inorganic salts were found in air masses from northerly polluted areas. The number fraction (NF) of the K-rich cloud residues increased within southwesterly air masses from fire activities in Southeast Asia. When air masses changed from northerly polluted areas to southwesterly ocean and livestock areas, the amine particles increased from 0.2 to 15.1 % of the total cloud residues. The dust, Fe, Pb, Na-rich and OC particle types had a low contribution (0.5-4.1 %) to the total cloud residues. A higher fraction of nitrate (88-89 %) was found in the dust and Na-rich cloud residues relative to sulfate (41-42 %) and ammonium (15-23 %). A higher intensity of nitrate was found in the cloud residues relative to the ambient particles. Compared with nonactivated particles, nitrate intensity decreased in all cloud residues except for the dust type. To our knowledge, this study is the first report on in situ observation of the chemical composition and mixing state of individual cloud residue particles in China.
Machado, Luiz A. T.; Calheiros, Alan J. P.; Biscaro, Thiago; ...
2018-05-07
This study provides an overview of precipitation processes and their sensitivities to environmental conditions in the Central Amazon Basin near Manaus during the GoAmazon2014/5 and ACRIDICON-CHUVA experiments. This study takes advantage of the numerous measurement platforms and instrument systems operating during both campaigns to sample cloud structure and environmental conditions during 2014 and 2015; the rainfall variability among seasons, aerosol loading, land surface type, and topography has been carefully characterized using these data. Differences between the wet and dry seasons were examined from a variety of perspectives. The rainfall rates distribution, total amount of rainfall, and raindrop size distribution (the mass-weighted mean diameter) were quantified over both seasons. The dry season generally exhibited higher rainfall rates than the wet season and included more intense rainfall periods. However, the cumulative rainfall during the wet season was 4 times greater than that during the total dry season rainfall, as shown in the total rainfall accumulation data. The typical size and life cycle of Amazon cloud clusters (observed by satellite) and rain cells (observed by radar) were examined, as were differences in these systems between the seasons. Moreover, monthly mean thermodynamic and dynamic variables were analysed using radiosondes to elucidate the differences in rainfall characteristics during the wet and dry seasons. The sensitivity of rainfall to atmospheric aerosol loading was discussed with regard to mass-weighted mean diameter and rain rate. This topic was evaluated only during the wet season due to the insignificant statistics of rainfall events for different aerosol loading ranges and the low frequency of precipitation events during the dry season. The impacts of aerosols on cloud droplet diameter varied based on droplet size. For the wet season, we observed no dependence between land surface type and rain rate. However
NASA Astrophysics Data System (ADS)
Machado, Luiz A. T.; Calheiros, Alan J. P.; Biscaro, Thiago; Giangrande, Scott; Silva Dias, Maria A. F.; Cecchini, Micael A.; Albrecht, Rachel; Andreae, Meinrat O.; Araujo, Wagner F.; Artaxo, Paulo; Borrmann, Stephan; Braga, Ramon; Burleyson, Casey; Eichholz, Cristiano W.; Fan, Jiwen; Feng, Zhe; Fisch, Gilberto F.; Jensen, Michael P.; Martin, Scot T.; Pöschl, Ulrich; Pöhlker, Christopher; Pöhlker, Mira L.; Ribaud, Jean-François; Rosenfeld, Daniel; Saraiva, Jaci M. B.; Schumacher, Courtney; Thalman, Ryan; Walter, David; Wendisch, Manfred
2018-05-01
This study provides an overview of precipitation processes and their sensitivities to environmental conditions in the Central Amazon Basin near Manaus during the GoAmazon2014/5 and ACRIDICON-CHUVA experiments. This study takes advantage of the numerous measurement platforms and instrument systems operating during both campaigns to sample cloud structure and environmental conditions during 2014 and 2015; the rainfall variability among seasons, aerosol loading, land surface type, and topography has been carefully characterized using these data. Differences between the wet and dry seasons were examined from a variety of perspectives. The rainfall rates distribution, total amount of rainfall, and raindrop size distribution (the mass-weighted mean diameter) were quantified over both seasons. The dry season generally exhibited higher rainfall rates than the wet season and included more intense rainfall periods. However, the cumulative rainfall during the wet season was 4 times greater than that during the total dry season rainfall, as shown in the total rainfall accumulation data. The typical size and life cycle of Amazon cloud clusters (observed by satellite) and rain cells (observed by radar) were examined, as were differences in these systems between the seasons. Moreover, monthly mean thermodynamic and dynamic variables were analysed using radiosondes to elucidate the differences in rainfall characteristics during the wet and dry seasons. The sensitivity of rainfall to atmospheric aerosol loading was discussed with regard to mass-weighted mean diameter and rain rate. This topic was evaluated only during the wet season due to the insignificant statistics of rainfall events for different aerosol loading ranges and the low frequency of precipitation events during the dry season. The impacts of aerosols on cloud droplet diameter varied based on droplet size. For the wet season, we observed no dependence between land surface type and rain rate. However, during the dry
Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing
NASA Astrophysics Data System (ADS)
Chen, A.; Pham, L.; Kempler, S.; Theobald, M.; Esfandiari, A.; Campino, J.; Vollmer, B.; Lynnes, C.
2011-12-01
Cloud Computing technology has been used to offer high-performance and low-cost computing and storage resources for both scientific problems and business services. Several cloud computing services have been implemented in the commercial arena, e.g. Amazon's EC2 & S3, Microsoft's Azure, and Google App Engine. There are also some research and application programs being launched in academia and governments to utilize Cloud Computing. NASA launched the Nebula Cloud Computing platform in 2008, which is an Infrastructure as a Service (IaaS) to deliver on-demand distributed virtual computers. Nebula users can receive required computing resources as a fully outsourced service. NASA Goddard Earth Science Data and Information Service Center (GES DISC) migrated several GES DISC's applications to the Nebula as a proof of concept, including: a) The Simple, Scalable, Script-based Science Processor for Measurements (S4PM) for processing scientific data; b) the Atmospheric Infrared Sounder (AIRS) data process workflow for processing AIRS raw data; and c) the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (GIOVANNI) for online access to, analysis, and visualization of Earth science data. This work aims to evaluate the practicability and adaptability of the Nebula. The initial work focused on the AIRS data process workflow to evaluate the Nebula. The AIRS data process workflow consists of a series of algorithms being used to process raw AIRS level 0 data and output AIRS level 2 geophysical retrievals. Migrating the entire workflow to the Nebula platform is challenging, but practicable. After installing several supporting libraries and the processing code itself, the workflow is able to process AIRS data in a similar fashion to its current (non-cloud) configuration. We compared the performance of processing 2 days of AIRS level 0 data through level 2 using a Nebula virtual computer and a local Linux computer. The result shows that Nebula has significantly
Fischer, Nina M; Favrot, Claude; Birkmann, Katharina; Jackson, Michele; Schwarzwald, Colin C; Müller, Martin; Tobler, Kurt; Geisseler, Marco; Lange, Christian E
2014-06-01
The DNA of equine papillomavirus type 2 (EcPV2) is consistently found in equine papillomas and squamous cell carcinomas, indicating a causal association of EcPV2 in the pathogenesis of these tumours; however, little is known about the prevalence of this virus. The aim of this study was to determine the geno- and seroprevalence of EcPV2 in clinically healthy horses in Switzerland. Fifty horses presented to the equine department of the university clinic, displaying no skin or mucous membrane lesions or severe signs of other diseases, were sampled. Cytobrush samples from the penis or vulva and serum samples were collected. To determine the genoprevalence of EcPV2, DNA was extracted from cytobrush samples and tested for viral DNA with a PCR assay amplifying a 338 bp fragment of the E7/E1 region of the viral genome. Seroprevalence was tested using an enzyme-linked immunosorbent assay aimed to detect antibodies against the major capsid protein (L1) of EcPV2. In five of 50 horses (10%), EcPV2-specific DNA was amplified but no antibodies could be detected, whereas in 14 of 50 horses (28%), antibodies against EcPV2 but no DNA were demonstrated. Both antibodies and viral DNA were detected in four of 50 horses (8%). Neither antibodies nor viral DNA were found in 27 of 50 horses (54%). The seroprevalence suggests that EcPV2 is prevalent in the Swiss equine population, while the genoprevalence indicates that currently ongoing infections are less common. The discrepancy between geno- and seroprevalence probably indicates different stages of infection in the tested cohort. © 2014 ESVD and ACVD.
HEPCloud, a New Paradigm for HEP Facilities: CMS Amazon Web Services Investigation
Holzman, Burt; Bauerdick, Lothar A. T.; Bockelman, Brian; ...
2017-09-29
Historically, high energy physics computing has been performed on large purpose-built computing systems. These began as single-site compute facilities, but have evolved into the distributed computing grids used today. Recently, there has been an exponential increase in the capacity and capability of commercial clouds. Cloud resources are highly virtualized and intended to be able to be flexibly deployed for a variety of computing tasks. There is a growing interest among the cloud providers to demonstrate the capability to perform large-scale scientific computing. In this paper, we discuss results from the CMS experiment using the Fermilab HEPCloud facility, which utilized both local Fermilab resources and virtual machines in the Amazon Web Services Elastic Compute Cloud. We discuss the planning, technical challenges, and lessons learned involved in performing physics workflows on a large-scale set of virtualized resources. Additionally, we will discuss the economics and operational efficiencies when executing workflows both in the cloud and on dedicated resources.
A Highly Scalable Data Service (HSDS) using Cloud-based Storage Technologies for Earth Science Data
NASA Astrophysics Data System (ADS)
Michaelis, A.; Readey, J.; Votava, P.; Henderson, J.; Willmore, F.
2017-12-01
Cloud-based infrastructure may offer several key benefits, including scalability, built-in redundancy, security mechanisms, and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and legacy software systems developed for online data repositories within the federal government were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Moreover, services based on object storage are well established and provided by all the leading cloud service providers (Amazon Web Services, Microsoft Azure, Google Cloud, etc.), which can often provide unmatched "scale-out" capabilities and data availability to a large and growing consumer base at a price point unachievable with in-house solutions. We describe a system that utilizes object storage rather than traditional file system based storage to vend earth science data. The system described is not only cost effective, but shows a performance advantage for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
Sharing Planetary-Scale Data in the Cloud
NASA Astrophysics Data System (ADS)
Sundwall, J.; Flasher, J.
2016-12-01
On 19 March 2015, Amazon Web Services (AWS) announced Landsat on AWS, an initiative to make data from the U.S. Geological Survey's Landsat satellite program freely available in the cloud. Because of Landsat's global coverage and long history, it has become a reference point for all Earth observation work and is considered the gold standard of natural resource satellite imagery. Within the first year of Landsat on AWS, the service served over a billion requests for Landsat imagery and metadata, globally. Availability of the data in the cloud has led to new product development by companies and startups including Mapbox, Esri, CartoDB, MathWorks, Development Seed, Trimble, Astro Digital, Blue Raster and Timbr.io. The model of staging data for analysis in the cloud established by Landsat on AWS has since been applied to high resolution radar data, European Space Agency satellite imagery, global elevation data and EPA air quality models. This session will provide an overview of lessons learned throughout these projects. It will demonstrate how cloud-based object storage is democratizing access to massive publicly-funded data sets that have previously only been available to people with access to large amounts of storage, bandwidth, and computing power. Technical discussion points will include: The differences between staging data for analysis using object storage versus file storage Using object stores to design simple RESTful APIs through thoughtful file naming conventions, header fields, and HTTP Range Requests Managing costs through data architecture and Amazon S3's "requester pays" feature Building tools that allow users to take their algorithm to the data in the cloud Using serverless technologies to display dynamic frontends for massive data sets
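The "thoughtful file naming conventions" and HTTP Range Requests mentioned above can be illustrated with a short sketch. The key layout follows the widely described landsat-pds scheme, but treat both the layout and the scene ID below as illustrative assumptions rather than a normative reference:

```python
# Sketch: predictable object keys let a client address any band of any
# scene directly, and a Range header fetches just a byte slice of it.
# Key layout assumed from public descriptions of the landsat-pds bucket.

def landsat_key(path: int, row: int, scene_id: str, band: int) -> str:
    """Object key for one band of a Landsat 8 scene, e.g. L8/139/045/<scene>/<scene>_B4.TIF."""
    return f"L8/{path:03d}/{row:03d}/{scene_id}/{scene_id}_B{band}.TIF"

def range_header(offset: int, length: int) -> dict:
    """HTTP byte-range header: first..last byte, inclusive (RFC 7233)."""
    return {"Range": f"bytes={offset}-{offset + length - 1}"}

print(landsat_key(139, 45, "LC81390452014295LGN00", 4))
print(range_header(0, 512))
```

Because the key is computable from path/row/scene metadata alone, no directory listing is needed before a request; combined with ranged GETs, this is what lets users "take their algorithm to the data" without downloading whole scenes.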
Stable estimate of primary OC/EC ratios in the EC tracer method
NASA Astrophysics Data System (ADS)
Chu, Shao-Hang
In fine particulate matter studies, the primary OC/EC ratio plays an important role in estimating the secondary organic aerosol contribution to PM2.5 concentrations using the EC tracer method. In this study, numerical experiments are carried out to test and compare various statistical techniques in the estimation of primary OC/EC ratios. The influence of random measurement errors in both primary OC and EC measurements on the estimation of the expected primary OC/EC ratios is examined. It is found that random measurement errors in EC generally create an underestimation of the slope and an overestimation of the intercept of the ordinary least-squares regression line. The Deming regression analysis performs much better than the ordinary regression, but it tends to overcorrect the problem by slightly overestimating the slope and underestimating the intercept. Averaging the ratios directly is usually undesirable because the average is strongly influenced by unrealistically high values of OC/EC ratios resulting from random measurement errors at low EC concentrations. The errors generally result in a skewed distribution of the OC/EC ratios even if the parent distributions of OC and EC are close to normal. When measured OC contains a significant amount of non-combustion OC, Deming regression is a much better tool and should be used to estimate both the primary OC/EC ratio and the non-combustion OC. However, if the non-combustion OC is negligibly small, the best and most robust estimator of the OC/EC ratio turns out to be the simple ratio of the OC and EC averages. It not only reduces random errors by averaging individual variables separately but also acts as a weighted average of ratios to minimize the influence of unrealistically high OC/EC ratios created by measurement errors at low EC concentrations. The median of OC/EC ratios ranks a close second, and the geometric mean of ratios ranks third. This is because their estimations are insensitive to questionable extreme
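The abstract's central point, that the mean of individual OC/EC ratios is inflated by low-EC samples while the ratio of averages is robust, is easy to reproduce numerically. The data below are a synthetic illustration (true primary OC/EC ratio of 2, with small invented measurement errors), not values from the study:

```python
# Synthetic illustration of EC tracer estimators (invented data, true ratio = 2).
from statistics import mean, median

ec = [0.1, 0.5, 1.0, 2.0, 4.0]    # measured EC (ug/m3)
oc = [0.5, 1.1, 2.0, 3.9, 8.0]    # primary OC ~ 2*EC plus small errors

ratios = [o / e for o, e in zip(oc, ec)]
mean_of_ratios = mean(ratios)          # skewed upward by the low-EC sample (0.5/0.1 = 5)
ratio_of_means = mean(oc) / mean(ec)   # the robust estimator favored in the text
median_of_ratios = median(ratios)      # also insensitive to the extreme ratio

# Secondary OC for a hypothetical new sample via the EC tracer relation:
#   OC_sec = OC_total - (OC/EC)_pri * EC
oc_sec = 10.0 - ratio_of_means * 3.0
```

Running this, the mean of ratios (2.63) overshoots the ratio of means (about 2.04) because the single low-EC sample contributes a ratio of 5, exactly the skew mechanism the abstract describes; the median (2.0) also resists it.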
Scientific Data Storage for Cloud Computing
NASA Astrophysics Data System (ADS)
Readey, J.
2014-12-01
Traditionally, data storage for geophysical software systems has centered on file-based formats and libraries such as NetCDF and HDF5. In contrast, cloud infrastructure providers such as Amazon AWS, Microsoft Azure, and the Google Cloud Platform generally provide storage technologies based on an object storage service (for large binary objects) complemented by a database service (for small objects that can be represented as key-value pairs). These systems have been shown to be highly scalable, reliable, and cost effective. We will discuss a proposed system that leverages these cloud-based storage technologies to provide an API-compatible library for traditional NetCDF and HDF5 applications. This system will enable cloud storage suitable for geophysical applications that can scale up to petabytes of data and thousands of users. We'll also cover other advantages of this system, such as enhanced metadata search.
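The proposed split between an object store for bulk data and a database for small metadata records can be illustrated with a toy sketch. Everything below is hypothetical: a plain dictionary stands in for S3 or Azure Blob storage, and the chunk-key layout is invented for illustration, not taken from the proposed library.

```python
import json

# Stand-in for a cloud object store: a flat key -> bytes namespace.
object_store = {}

def write_chunk(dataset, chunk_index, values):
    # Each chunk of the dataset becomes one object under a key prefix.
    object_store[f"{dataset}/chunk/{chunk_index}"] = json.dumps(values).encode()

def read_chunk(dataset, chunk_index):
    return json.loads(object_store[f"{dataset}/chunk/{chunk_index}"].decode())

def read_all(dataset):
    # Small metadata record plays the role of the database service.
    meta = json.loads(object_store[f"{dataset}/meta"].decode())
    out = []
    for i in range(meta["length"] // meta["chunk_size"]):
        out.extend(read_chunk(dataset, i))
    return out

object_store["temps/meta"] = json.dumps({"length": 6, "chunk_size": 3}).encode()
write_chunk("temps", 0, [21.0, 21.5, 22.1])
write_chunk("temps", 1, [22.4, 22.0, 21.8])
```

The design point is that chunked access maps naturally onto object keys, so many clients can read different chunks in parallel without a shared file lock.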
NASA Astrophysics Data System (ADS)
Shilling, J.; Pekour, M. S.; Fortner, E.; Hubbe, J. M.; Longo, K.; Martin, S. T.; Mei, F.; Springston, S. R.; Tomlinson, J. M.; Wang, J.
2014-12-01
The Green Ocean Amazon (GoAmazon) campaign, conducted from January 2014 to December 2015 in the vicinity of Manaus, Brazil, was designed to study the aerosol lifecycle and aerosol-cloud interactions in both pristine and anthropogenically influenced conditions. As part of this campaign, the DOE G-1 research aircraft was deployed from February 17 to March 25, 2014 and from September 6 to October 5, 2014 to investigate aerosol and cloud properties aloft. An Aerodyne High Resolution Aerosol Mass Spectrometer (HR-AMS) and an Ionicon Proton Transfer Reaction Mass Spectrometer (PTR-MS) were part of the G-1 payload and were used to investigate gas- and particle-phase chemical composition. Here we present preliminary analysis of the aerosol and gas-phase chemical composition. PTR-MS measurements show that isoprene and its oxidation products are the dominant VOCs during research flights. HR-AMS measurements reveal that the particle phase is dominated by organic material, with smaller concentrations of sulfate and nitrate observed. Organic particle concentrations are enhanced when encountering the urban plume from Manaus. During the wet season, we observe increased concentrations of organic particles when passing through low-altitude clouds. PMF analysis of the organic mass spectra shows that the chemical composition of particles observed in-cloud is distinctly different from that of particles observed outside clouds. We will also compare measurements made during the wet and dry seasons.
The Ethics of Cloud Computing.
de Bruin, Boudewijn; Floridi, Luciano
2017-02-01
Cloud computing is rapidly gaining traction in business. It offers businesses online services on demand (such as Gmail, iCloud and Salesforce) and allows them to cut costs on hardware and IT support. This is the first paper in business ethics dealing with this new technology. It analyzes the informational duties of hosting companies that own and operate cloud computing datacentres (e.g., Amazon). It considers the cloud services providers leasing 'space in the cloud' from hosting companies (e.g., Dropbox, Salesforce). And it examines the business and private 'clouders' using these services. The first part of the paper argues that hosting companies, services providers and clouders have mutual informational (epistemic) obligations to provide and seek information about relevant issues such as consumer privacy, reliability of services, data mining and data ownership. The concept of interlucency is developed as an epistemic virtue governing ethically effective communication. The second part considers potential forms of government restrictions on or proscriptions against the development and use of cloud computing technology. Referring to the concept of technology neutrality, it argues that interference with hosting companies and cloud services providers is hardly ever necessary or justified. It is argued, too, however, that businesses using cloud services (e.g., banks, law firms, hospitals etc. storing client data in the cloud) will have to follow rather more stringent regulations.
NASA Astrophysics Data System (ADS)
Li, Xing; Xiao, Jingfeng; He, Binbin
2018-04-01
Amazon forests play an important role in the global carbon cycle and Earth's climate. The vulnerability of Amazon forests to drought remains highly controversial. Here we examine the impacts of the 2015 drought on the photosynthesis of Amazon forests to understand how solar radiation and precipitation jointly control forest photosynthesis during a severe drought. We use a variety of gridded vegetation and climate datasets, including solar-induced chlorophyll fluorescence (SIF), photosynthetically active radiation (PAR), the fraction of absorbed PAR (fPAR), leaf area index (LAI), precipitation, soil moisture, cloud cover, and vapor pressure deficit (VPD) in our analysis. Satellite-derived SIF observations provide a direct diagnosis of plant photosynthesis from space. The decomposition of SIF into SIF yield (SIFyield) and APAR (the product of PAR and fPAR) reveals the relative effects of precipitation and solar radiation on photosynthesis. We found that the drought significantly reduced SIFyield, the emitted SIF per photon absorbed. The higher APAR resulting from lower cloud cover and higher LAI partly offset the negative effects of water stress on the photosynthesis of Amazon forests, leading to a smaller reduction in SIF than in SIFyield and precipitation. We further found that SIFyield anomalies were more sensitive to precipitation and VPD anomalies in the southern regions of the Amazon than in the central and northern regions. Our findings shed light on the relative and combined effects of precipitation and solar radiation on photosynthesis, and can improve our understanding of the responses of Amazon forests to drought.
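The decomposition described above, SIF = SIFyield × APAR with APAR = PAR × fPAR, can be written out as a minimal calculation. The numbers below are illustrative placeholders, not satellite values from the study.

```python
# Minimal form of the decomposition: SIF = SIF_yield * APAR,
# with APAR = PAR * fPAR, so SIF_yield = SIF / (PAR * fPAR).

def sif_yield(sif, par, fpar):
    """Emitted SIF per unit of absorbed PAR."""
    return sif / (par * fpar)

# A drought-like scenario (invented numbers): APAR rises because of
# lower cloud cover and higher LAI, while SIF_yield falls under water
# stress, so SIF itself drops less than SIF_yield does.
baseline_yield = sif_yield(sif=1.0, par=500.0, fpar=0.80)
drought_yield = sif_yield(sif=0.9, par=550.0, fpar=0.82)
```

Here SIF falls by only 10% while SIFyield falls by about 20%, which is the compensation mechanism the abstract describes.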
NASA Astrophysics Data System (ADS)
Fraund, M. W.; Pham, D.; Harder, T.; O'Brien, R.; Wang, B.; Laskin, A.; Gilles, M. K.; Moffet, R.
2015-12-01
The role that anthropogenic aerosols play in cloud formation is uncertain and contributes largely to the uncertainty in predicting future climate. One region of particular importance is the Amazon rainforest, which accounts for over half of the world's rainforest. During GoAmazon2014/15 IOP2, aerosol samples were collected at multiple sites in and around the rapidly growing industrial city of Manaus in the Amazon basin. Manaus is of scientific interest due to the pristine nature of the surrounding rainforest and the high levels of pollution coming from the city in the form of SO2, NOx, and soot. Some sites, such as the Terrestrial Ecosystem Science center (TES, also designated ZF2) located to the north of Manaus, represent air masses which have not interacted with emissions from the city. The comparison of pristine atmosphere with heavy pollution allows both for the determination of a natural baseline level of pollutants and for the study of pollutants' impact on the conversion of biogenic volatile organic compounds to secondary organic aerosols. Towards this goal, samples from ZF2 and other unpolluted sites will be compared to samples from the Atmospheric Radiation Measurement (ARM) climate research facility in Manacapuru (T3), which is southwest (downwind) of Manaus. Spatially resolved spectra were recorded at the sub-particle level using scanning transmission X-ray microscopy (STXM) at the carbon, nitrogen, and oxygen K-absorption edges. Scanning electron microscopy coupled with energy dispersive X-ray spectroscopy (SEM/EDX) was also performed to characterize higher-Z elements. These two techniques together will allow the mass fraction of atmospherically relevant elements to be determined on a per-particle basis. We will apply established procedures to determine the mixing state index for samples collected at ZF2 and T3 using elemental mass fractions. Preliminary results will be presented which focus on investigating the difference between mixing states at the pristine and polluted sites.
Cloud-Based Numerical Weather Prediction for Near Real-Time Forecasting and Disaster Response
NASA Technical Reports Server (NTRS)
Molthan, Andrew; Case, Jonathan; Venners, Jason; Schroeder, Richard; Checchi, Milton; Zavodsky, Bradley; Limaye, Ashutosh; O'Brien, Raymond
2015-01-01
The use of cloud computing resources continues to grow within the public and private sector components of the weather enterprise as users become more familiar with cloud-computing concepts, and competition among service providers continues to reduce costs and other barriers to entry. Cloud resources can also provide capabilities similar to high-performance computing environments, supporting multi-node systems required for near real-time, regional weather predictions. Referred to as "Infrastructure as a Service", or IaaS, the use of cloud-based computing hardware in an on-demand payment system allows for rapid deployment of a modeling system in environments lacking access to a large, supercomputing infrastructure. Use of IaaS capabilities to support regional weather prediction may be of particular interest to developing countries that have not yet established large supercomputing resources, but would otherwise benefit from a regional weather forecasting capability. Recently, collaborators from NASA Marshall Space Flight Center and Ames Research Center have developed a scripted, on-demand capability for launching the NOAA/NWS Science and Training Resource Center (STRC) Environmental Modeling System (EMS), which includes pre-compiled binaries of the latest version of the Weather Research and Forecasting (WRF) model. The WRF-EMS provides scripting for downloading appropriate initial and boundary conditions from global models, along with higher-resolution vegetation, land surface, and sea surface temperature data sets provided by the NASA Short-term Prediction Research and Transition (SPoRT) Center. This presentation will provide an overview of the modeling system capabilities and benchmarks performed on the Amazon Elastic Compute Cloud (EC2) environment. In addition, the presentation will discuss future opportunities to deploy the system in support of weather prediction in developing countries supported by NASA's SERVIR Project, which provides capacity building
Mouth of the Amazon River as seen from STS-58
1993-10-30
STS058-107-083 (18 Oct.-1 Nov. 1993) --- A near-nadir view of the mouth of the Amazon River, which shows all signs of being a relatively healthy system, breathing and exhaling. The well-developed cumulus field over the forested areas on both the north and south sides of the river (the view is slightly to the west) shows that good evapotranspiration is underway. The change in the cloud field from the moisture influx from the Atlantic (the cloud fields over the ocean are parallel to the wind direction) to perpendicular cloud fields over the land surface is normal. This change in direction is caused by the increased surface roughness over the land area. The plume of the river, although turbid, is no more or less turbid than it has been reported since the Portuguese first rounded Brasil's coast at the end of the 15th Century.
Mobile healthcare information management utilizing Cloud Computing and Android OS.
Doukas, Charalampos; Pliakas, Thomas; Maglogiannis, Ilias
2010-01-01
Cloud Computing provides functionality for managing information data in a distributed, ubiquitous and pervasive manner supporting several platforms, systems and applications. This work presents the implementation of a mobile system that enables electronic healthcare data storage, update and retrieval using Cloud Computing. The mobile application is developed using Google's Android operating system and provides management of patient health records and medical images (supporting DICOM format and JPEG2000 coding). The developed system has been evaluated using Amazon's S3 cloud service. This article summarizes the implementation details and presents initial results of the system in practice.
NASA Technical Reports Server (NTRS)
O'Brien, Raymond
2017-01-01
In 2016, Ames supported the NASA CIO in delivering an initial operating capability for Agency use of commercial cloud computing. This presentation provides an overview of the project, the services approach followed, and the major components of the capability that was delivered. The presentation is being given at the request of Amazon Web Services to a contingent representing the Brazilian Federal Government and Defense Organization that is interested in the use of Amazon Web Services (AWS). NASA is currently a customer of AWS and delivered the Initial Operating Capability using AWS as its first commercial cloud provider. The IOC, however, was designed to support other cloud providers in the future as well.
Cloud computing for comparative genomics with the Windows Azure platform.
Kim, Insik; Jung, Jae-Yoon; Deluca, Todd F; Nelson, Tristan H; Wall, Dennis P
2012-01-01
Cloud computing services have emerged as a cost-effective alternative to cluster systems as the number of genomes, and the computational power required to analyze them, has increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services.
Making the most of cloud storage - a toolkit for exploitation by WLCG experiments
NASA Astrophysics Data System (ADS)
Alvarez Ayllon, Alejandro; Arsuaga Rios, Maria; Bitzes, Georgios; Furano, Fabrizio; Keeble, Oliver; Manzi, Andrea
2017-10-01
Understanding how cloud storage can be effectively used, either standalone or in support of its associated compute, is now an important consideration for WLCG. We report on a suite of extensions to familiar tools targeted at enabling the integration of cloud object stores into traditional grid infrastructures and workflows. Notable updates include support for a number of object store flavours in FTS3, Davix and gfal2, including mitigations for lack of vector reads; the extension of Dynafed to operate as a bridge between grid and cloud domains; protocol translation in FTS3; the implementation of extensions to DPM (also implemented by the dCache project) to allow 3rd party transfers over HTTP. The result is a toolkit which facilitates data movement and access between grid and cloud infrastructures, broadening the range of workflows suitable for cloud. We report on deployment scenarios and prototype experience, explaining how, for example, an Amazon S3 or Azure allocation can be exploited by grid workflows.
Comparing Amazon Basin CO2 fluxes from an atmospheric inversion with TRENDY biosphere models
NASA Astrophysics Data System (ADS)
Diffenbaugh, N. S.; Alden, C. B.; Harper, A. B.; Ahlström, A.; Touma, D. E.; Miller, J. B.; Gatti, L. V.; Gloor, M.
2015-12-01
Net exchange of carbon dioxide (CO2) between the atmosphere and the terrestrial biosphere is sensitive to environmental conditions, including extreme heat and drought. Of particular importance for local and global carbon balance and climate are the expansive tracts of tropical rainforest located in the Amazon Basin. Because of the Basin's size and ecological heterogeneity, net biosphere CO2 exchange with the atmosphere remains largely unconstrained. In particular, the response of net CO2 exchange to changes in environmental conditions such as temperature and precipitation is not yet well known. However, proper representation of these relationships in biosphere models is a necessary constraint for accurately modeling future climate and climate-carbon cycle feedbacks. In an effort to compare biosphere response to climate across different biosphere models, the TRENDY model intercomparison project coordinated the simulation of CO2 fluxes between the biosphere and atmosphere, in response to historical climate forcing, by nine different Dynamic Global Vegetation Models. We examine the TRENDY model results in the Amazon Basin, and compare this "bottom-up" method with fluxes derived from a "top-down" approach to estimating net CO2 fluxes, obtained through atmospheric inverse modeling using CO2 measurements sampled by aircraft above the basin. We compare the "bottom-up" and "top-down" fluxes in 5 sub-regions of the Amazon basin on a monthly basis for 2010-2012. Our results show important periods of agreement between some models in the TRENDY suite and atmospheric inverse model results, notably the simulation of increased biosphere CO2 loss during wet season heat in the Central Amazon. During the dry season, however, model ability to simulate the observed response of net CO2 exchange to drought was varied, with few models able to reproduce the "top-down" inversion flux signals. Our results highlight the value of atmospheric trace gas observations for helping to narrow the
Lambs, L; Horwath, A; Otto, T; Julien, F; Antoine, P-O
2012-04-15
The Amazon River is a huge network of long tributaries, and little is known about the headwaters. Here we present a study of one wet tropical Amazon forest side and one dry, cold Altiplano plateau, both originating from the same cordillera. The aim is to see how this difference affects the water characteristics. Different kinds of water (spring, lake, river, rainfall) were sampled to determine their stable isotope ratios (oxygen 18/16 and hydrogen 2/1) by continuous flow isotope ratio mass spectrometry (IRMS). These ratios, coupled with chemical analyses, enabled us to determine the origin of the water, the evaporation process, and the water recycling over the Amazon plain forest and montane cloud forest. Our study shows that the water flowing in the upper Madre de Dios basin comes mainly from the foothill humid forest, with a characteristic water recycling process signature, and not from higher glacier melt. By contrast, the water flowing in the Altiplano rivers comes mainly from glacier melt, with strong evaporation. These snowfields and glaciers are fed mainly by Atlantic moisture which transits over the large Amazon forest. The Atlantic moisture and its recycling over this huge tropical forest display a progressive isotopic gradient as a function of distance from the ocean. At the level of the montane cloud forest and on the Altiplano, respectively, additional water recycling and evaporation occur, but they are insignificant in the total water discharge. Copyright © 2012 John Wiley & Sons, Ltd.
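Stable isotope ratios like those measured here are conventionally reported in delta notation relative to a standard (for water oxygen, VSMOW). The helper below is a generic sketch of that convention, not the paper's processing code.

```python
# Delta notation for stable isotope ratios:
#   delta = (R_sample / R_standard - 1) * 1000, in per mil.
# For water oxygen the reference is VSMOW.

VSMOW_18O_16O = 0.0020052  # accepted 18O/16O ratio of the VSMOW standard

def delta_permil(r_sample, r_standard=VSMOW_18O_16O):
    """Delta value in per mil relative to the chosen standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Rainfall moving inland from the Atlantic becomes progressively
# depleted in 18O (more negative delta), while evaporation enriches
# the residual water; these are the signals the study traces.
```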
Aerosol characteristics and particle production in the upper troposphere over the Amazon Basin
NASA Astrophysics Data System (ADS)
Andreae, Meinrat O.; Afchine, Armin; Albrecht, Rachel; Amorim Holanda, Bruna; Artaxo, Paulo; Barbosa, Henrique M. J.; Borrmann, Stephan; Cecchini, Micael A.; Costa, Anja; Dollner, Maximilian; Fütterer, Daniel; Järvinen, Emma; Jurkat, Tina; Klimach, Thomas; Konemann, Tobias; Knote, Christoph; Krämer, Martina; Krisna, Trismono; Machado, Luiz A. T.; Mertes, Stephan; Minikin, Andreas; Pöhlker, Christopher; Pöhlker, Mira L.; Pöschl, Ulrich; Rosenfeld, Daniel; Sauer, Daniel; Schlager, Hans; Schnaiter, Martin; Schneider, Johannes; Schulz, Christiane; Spanu, Antonio; Sperling, Vinicius B.; Voigt, Christiane; Walser, Adrian; Wang, Jian; Weinzierl, Bernadett; Wendisch, Manfred; Ziereis, Helmut
2018-01-01
Airborne observations over the Amazon Basin showed high aerosol particle concentrations in the upper troposphere (UT) between 8 and 15 km altitude, with number densities (normalized to standard temperature and pressure) often exceeding those in the planetary boundary layer (PBL) by 1 or 2 orders of magnitude. The measurements were made during the German-Brazilian cooperative aircraft campaign ACRIDICON-CHUVA (where ACRIDICON stands for Aerosol, Cloud, Precipitation, and Radiation Interactions and Dynamics of Convective Cloud Systems, and CHUVA is the acronym for Cloud Processes of the Main Precipitation Systems in Brazil: A Contribution to Cloud Resolving Modeling and to the GPM (Global Precipitation Measurement)), on the German High Altitude and Long Range Research Aircraft (HALO). The campaign took place in September-October 2014, with the objective of studying tropical deep convective clouds over the Amazon rainforest and their interactions with atmospheric trace gases, aerosol particles, and atmospheric radiation. Aerosol enhancements were observed consistently on all flights during which the UT was probed, using several aerosol metrics, including condensation nuclei (CN) and cloud condensation nuclei (CCN) number concentrations and chemical species mass concentrations. The UT particles differed sharply in their chemical composition and size distribution from those in the PBL, ruling out convective transport of combustion-derived particles from the boundary layer (BL) as a source. The air in the immediate outflow of deep convective clouds was depleted of aerosol particles, whereas strongly enhanced number concentrations of small particles (< 90 nm diameter) were found in UT regions that had experienced outflow from deep convection in the preceding 5-72 h. We also found elevated concentrations of larger (> 90 nm) particles in the UT, which consisted mostly of organic matter and nitrate and were very effective CCN. Our findings suggest a
Hybrid cloud and cluster computing paradigms for life science applications
2010-01-01
Background: Clouds and MapReduce have shown themselves to be a broadly useful approach to scientific computing, especially for parallel, data-intensive applications. However, they have limited applicability to some areas, such as data mining, because MapReduce performs poorly on problems with the iterative structure present in the linear algebra that underlies much data analysis. Such problems can be run efficiently on clusters using MPI, leading to a hybrid cloud and cluster environment. This motivates the design and implementation of an open-source iterative MapReduce system, Twister. Results: Comparisons of Amazon, Azure, and traditional Linux and Windows environments on common applications have shown encouraging performance and usability in several important non-iterative cases. These are linked to MPI applications for the final stages of the data analysis. Further, we have released the open-source Twister iterative MapReduce system and benchmarked it against basic MapReduce (Hadoop) and MPI in information retrieval and life sciences applications. Conclusions: The hybrid cloud (MapReduce) and cluster (MPI) approach offers an attractive production environment, while Twister promises a uniform programming environment for many life sciences applications. Methods: We used the commercial clouds Amazon and Azure and the NSF resource FutureGrid to perform detailed comparisons and evaluations of different approaches to data-intensive computing. Several applications were developed in MPI, MapReduce and Twister in these different environments. PMID:21210982
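The iterative pattern the abstract describes, the same map and reduce functions applied repeatedly over static input data, can be sketched in miniature. The 1-D k-means below is an illustrative stand-in, not Twister code; each pass of the loop corresponds to one MapReduce job, which is exactly the per-iteration overhead an iterative MapReduce runtime is designed to avoid.

```python
# Minimal sketch of an iterative map/reduce computation (1-D k-means).

def map_assign(point, centers):
    # map: emit (index of nearest center, point)
    idx = min(range(len(centers)), key=lambda i: abs(point - centers[i]))
    return idx, point

def reduce_mean(values):
    # reduce: a center moves to the mean of its assigned points
    return sum(values) / len(values)

points = [1.0, 1.2, 0.8, 8.0, 8.4, 7.6]
centers = [0.0, 10.0]

for _ in range(10):  # each iteration = one full map/reduce round
    groups = {i: [] for i in range(len(centers))}
    for p in points:
        i, v = map_assign(p, centers)
        groups[i].append(v)
    centers = [reduce_mean(groups[i]) if groups[i] else centers[i]
               for i in range(len(centers))]
```

In Hadoop-style MapReduce, every one of those loop passes re-reads the static `points` data and pays full job-startup cost; caching static data across iterations is the optimization Twister targets.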
Usability evaluation of cloud-based mapping tools for the display of very large datasets
NASA Astrophysics Data System (ADS)
Stotz, Nicole Marie
The elasticity and on-demand nature of cloud services have made it easier to create web maps. Users only need access to a web browser and the Internet to utilize cloud-based web maps, eliminating the need for specialized software. To reach a wide variety of users, a map must be well designed; usability is a very important concept in web map design. Fusion Tables, a new product from Google, is one example of newer cloud-based distributed GIS services. It allows for easy spatial data manipulation and visualization within the Google Maps framework. ESRI has also introduced a cloud-based version of its software, called ArcGIS Online, built on Amazon's EC2 cloud. Using a user-centered design framework, two prototype maps were created with data from the San Diego East County Economic Development Council. One map was built on Fusion Tables, and another on ESRI's ArcGIS Online. A usability analysis was conducted and used to compare both map prototypes in terms of design and functionality. Load tests were also run, and performance metrics were gathered on both map prototypes. The usability analysis was taken by 25 geography students and consisted of time-based tasks and questions on map design and functionality. Survey participants completed the time-based tasks for the Fusion Tables map prototype quicker than those of the ArcGIS Online map prototype. While responses were generally positive toward the design and functionality of both prototypes, overall the Fusion Tables map prototype was preferred. For the load tests, the data set was broken into 22 groups for a total of 44 tests. While the Fusion Tables map prototype performed more efficiently than the ArcGIS Online prototype, the differences are almost unnoticeable. A SWOT analysis was conducted for each prototype. The results from this research point to the Fusion Tables map prototype. A redesign of this prototype would incorporate design suggestions from the usability survey, while some functionality would
Evaluating the cloud radiative forcing over East Asia during summer simulated by CMIP5 models
NASA Astrophysics Data System (ADS)
Lin, Z.; Wang, Y.; Liu, X.
2017-12-01
A large degree of uncertainty in global climate models (GCMs) can be attributed to the representation of clouds and their radiative forcing (CRF). In this study, the simulated CRFs, total cloud fraction (CF) and cloud properties over East Asia from 20 CMIP5 AMIP models are evaluated and compared with multiple satellite observations, and the possible causes of the CRF biases in the CMIP5 models are then investigated. Based on the satellite observations, strong longwave CRF (LWCRF) and shortwave CRF (SWCRF) are found over Southwestern China, with minimum SWCRF less than -130 W m-2, associated with the large cloud amount in the region. By contrast, weak CRFs are located over Northwest China and the Western Pacific region because of smaller cloud amounts. In Northeastern China, strong SWCRF and weak LWCRF are found due to the dominance of low-level cloud. In Eastern China, the CRFs are moderate due to the co-existence of multi-layer cloud. The CMIP5 models can basically capture the structure of CRFs in East Asia, with spatial correlation coefficients between 0.5 and 0.9, but most models underestimate CRFs in East Asia, which is strongly associated with the underestimation of cloud amount in the region. The performance of the CMIP5 models varies across the East Asian region, with larger deviations in Eastern China (EC). Further investigation suggests that underestimation of cloud amount in EC can lead to a weak bias in the CRFs there; however, this CRF bias can be cancelled out by the overestimation of CRF due to the excessive cloud optical depth (COD) simulated by the models. The annual cycle of simulated CRF over Eastern China is also examined, and it is found that the CMIP5 models are unable to reproduce the northward migration of CRF in the summer monsoon season, which is closely related to the northward shift of the East Asian summer monsoon rain belt.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dzib, Sergio; Loinard, Laurent; Rodriguez, Luis F.
2010-08-01
Using the Very Long Baseline Array, we observed the young stellar object EC 95 in the Serpens cloud core at eight epochs from 2007 December to 2009 December. Two sources are detected in our field and are shown to form a tight binary system. The primary (EC 95a) is a 4-5 M_sun proto-Herbig AeBe object (arguably the youngest such object known), whereas the secondary (EC 95b) is most likely a low-mass T Tauri star. Interestingly, both sources are non-thermal emitters. While T Tauri stars are expected to power a corona because they are convective as they move down the Hayashi track, intermediate-mass stars approach the main sequence on radiative tracks. Thus, they are not expected to have strong superficial magnetic fields and should not be magnetically active. We review several mechanisms that could produce the non-thermal emission of EC 95a and argue that its observed properties might be most readily interpreted if it possessed a corona powered by a rotation-driven convective layer. Using our observations, we show that the trigonometric parallax of EC 95 is π = 2.41 ± 0.02 mas, corresponding to a distance of 414.9 +4.4/-4.3 pc. We argue that this implies a distance to the Serpens core of 415 ± 5 pc and a mean distance to the Serpens cloud of 415 ± 25 pc. This value is significantly larger than previous estimates (d ≈ 260 pc) based on measurements of the extinction suffered by stars in the direction of Serpens. A possible explanation for this discrepancy is that these previous observations picked out foreground dust clouds associated with the Aquila Rift system rather than Serpens itself.
NASA Astrophysics Data System (ADS)
Mishra, S.; Shukla, A.; Sahu, R.; Kota, V. K. B.
2008-08-01
The β+/EC half-lives of medium-heavy N ≈ Z nuclei with mass number A ≈ 64-80 are calculated within the deformed shell model (DSM) based on Hartree-Fock states, employing a modified Kuo interaction in the (2p3/2, 1f5/2, 2p1/2, 1g9/2) space. The DSM has been quite successful in predicting many spectroscopic properties of N ≈ Z medium-heavy nuclei with A ≈ 64-80. The calculated β+/EC half-lives, for prolate and oblate shapes, compare well with the predictions of calculations with the Skyrme force by Sarriguren. Going further, following recent searches, half-lives for 2ν β+β+/β+EC/ECEC decay of the nucleus 78Kr are calculated using the DSM, and the results compare well with QRPA predictions.
Amazon Deforestation Fires Increase Plant Productivity through Changes in Diffuse Radiation
NASA Astrophysics Data System (ADS)
Rap, A.; Reddington, C.; Spracklen, D. V.; Mercado, L.; Haywood, J. M.; Bonal, D.; Butt, N.; Phillips, O.
2013-12-01
Over the past few decades a large increase in carbon storage has been observed in undisturbed forests across Amazonia. The reason for such a sink is unclear, although many possible mechanisms have been suggested, including changes in temperature, carbon dioxide, precipitation, clouds, and solar radiation. In this work we focus on one such mechanism, namely the increase in plant photosynthesis due to changes in diffuse radiation caused by atmospheric aerosols from large-scale deforestation fires that now occur throughout the Amazon region. We estimate that this mechanism has increased dry season (August-September) net primary productivity (NPP) by up to 30% across wide regions of the Amazon. We conclude that aerosol from deforestation fires may be responsible for a substantial fraction of the Amazon carbon sink that has been observed. Our approach is based on the combined use of three models: (i) the Global Model of Aerosol Processes (GLOMAP), (ii) the Edwards-Slingo radiation model, and (iii) the UK Met Office JULES land-surface scheme, constrained against in-situ aerosol and radiation observation datasets from several Amazonian sites. A 10 year (1999-2008) GLOMAP simulation using GFED3 biomass burning emissions is first evaluated against aerosol observations, indicating that the model is able to capture the Amazon aerosol seasonality, with enhanced concentrations during the dry season driven by biomass burning. The radiation scheme is then shown to be in good agreement with total and diffuse radiation in-situ observations, the model being able to capture the high total and low diffuse radiation flux in the dry season, as well as the low total and high diffuse radiation flux in the wet season. We then use our modelling framework to quantify the contribution of deforestation fires to diffuse/direct radiation fraction and forest productivity. We calculate that deforestation fires increase dry season diffuse radiation by up to 60% or 30 Wm-2. Finally, we use the JULES
Task 28: Web Accessible APIs in the Cloud Trade Study
NASA Technical Reports Server (NTRS)
Gallagher, James; Habermann, Ted; Jelenak, Aleksandar; Lee, Joe; Potter, Nathan; Yang, Muqun
2017-01-01
This study explored three candidate architectures for serving NASA Earth Science Hierarchical Data Format Version 5 (HDF5) data via Hyrax running on Amazon Web Services (AWS). We studied the cost and performance of each architecture using several representative use cases. The objectives of the project were to: conduct a trade study to identify one or more high-performance integrated solutions for storing and retrieving NASA HDF5 and Network Common Data Format Version 4 (netCDF4) data in a cloud (web object store) environment, the target environment being the AWS Simple Storage Service (S3); conduct the level of software development needed to properly evaluate the solutions in the trade study and to obtain the benchmarking metrics required as input to a government decision on potential follow-on prototyping; and develop a cloud cost model for the preferred data storage solution (or solutions) that accounts for different granulation and aggregation schemes as well as cost and performance trades.
Virtual Facility at Fermilab: Infrastructure and Services Expand to Public Clouds
Timm, Steve; Garzoglio, Gabriele; Cooper, Glenn; ...
2016-02-18
In preparation for its new Virtual Facility Project, Fermilab has launched a program of work to determine the requirements for running a computation facility on-site, in public clouds, or a combination of both. This program builds on the work we have done to successfully run experimental workflows of 1000-VM scale both on an on-site private cloud and on Amazon AWS. To do this at scale we deployed dynamically launched and discovered caching services on the cloud. We are now testing the deployment of more complicated services on Amazon AWS using the native load balancing and auto scaling features they provide. The Virtual Facility Project will design and develop a facility, including infrastructure and services, that can live on the site of Fermilab, off-site, or a combination of both. We expect to need this capacity to meet the peak computing requirements in the future. The Virtual Facility is intended to provision resources on the public cloud on behalf of the facility as a whole instead of having each experiment or Virtual Organization do it on their own. We will describe the policy aspects of a distributed Virtual Facility, the requirements, and plans to make a detailed comparison of the relative cost of the public and private clouds. Furthermore, this talk will present the details of the technical mechanisms we have developed to date, and the plans currently taking shape for a Virtual Facility at Fermilab.
Impact of Amazon deforestation on climate simulations using the NCAR CCM2/BATS model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hahmann, A.N.; Dickinson, R.E.
Model validation and results are briefly presented for a simulation of deforestation of the Amazon rainforest. This initial study uses assumptions regarding deforestation similar to those in earlier studies with several versions of the NCAR Community Climate Model (CCM) coupled to the Biosphere-Atmosphere Transfer Scheme (BATS). The model used is a revised version of the NCAR CCM Version 2 coupled to BATS Version 1e. This paper discusses the portion of the validation dealing with the distribution of precipitation; the simulation displays very good agreement with observed rainfall rates for the austral summer. Preliminary results from an 8-year simulation of deforestation are similar to those of previous studies. Annual precipitation and evaporation are reduced, while surface air temperatures show a slight increase. A substantial bimodal pattern appears in the results, with the Amazon decrease in precipitation and increase in temperature accompanied by changes of the opposite sign to the southeast of the Amazon. Similar patterns have occurred in other studies, but not always in exactly the same locations. Evidently, how much of the region of rainfall increase occurs in the deforested area over the Amazon strongly affects the inferred statistics. It is likely that this pattern depends on the model control climatology and possibly on other features. 16 refs., 2 figs., 2 tabs.
Atmospheric Science Data Center
2018-06-20
... V1 Level: L2 Platform: DEEP SPACE CLIMATE OBSERVATORY Instrument: Enhanced Polychromatic ... assuming ice phase Cloud Optical Thickness – assuming liquid phase EPIC Cloud Mask Oxygen A-band Cloud Effective Height (in ...
Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.
Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P
2010-12-22
Comparative genomics resources, such as ortholog detection tools and repositories, are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource, Roundup, using cloud computing, describe the operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal cost. Utilizing the comparative genomics tool Roundup as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service Elastic MapReduce and to maximize use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared, which determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
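The runtime-aware job ordering described above can be sketched as a longest-processing-time-first (LPT) greedy schedule: given per-comparison runtime estimates, submitting the longest jobs first and always assigning each to the least-loaded worker keeps the rented cluster busy until the end and shrinks paid-for idle time. This is a minimal sketch of the general technique; the runtime figures and worker count are hypothetical, not values from the paper.

```python
import heapq

def greedy_makespan(runtimes, n_workers):
    """Assign jobs in the given order, always to the least-loaded worker,
    and return the makespan (hours the cluster must be kept running)."""
    loads = [0.0] * n_workers            # current finish time per worker
    heapq.heapify(loads)
    for t in runtimes:
        heapq.heappush(loads, heapq.heappop(loads) + t)
    return max(loads)

# Hypothetical per-comparison runtime estimates (hours), in arrival order.
jobs = [0.5, 9.5, 1.0, 8.0, 0.5, 6.0, 2.0, 4.0, 1.5, 3.5]

lpt = greedy_makespan(sorted(jobs, reverse=True), n_workers=4)  # longest first
naive = greedy_makespan(jobs, n_workers=4)                      # as submitted
print(lpt, naive)   # LPT finishes no later than the naive order
```

On this toy input the LPT ordering ends earlier than submitting jobs as they arrive, which is the billing-hours saving the abstract's model exploits at scale.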
The Green Ocean Over the Amazon: Implications for Cloud Electrification
NASA Technical Reports Server (NTRS)
Williams, E.; Blakeslee, R.; Boccippio, D.; Arnold, James E. (Technical Monitor)
2001-01-01
A convective regime with distinct maritime characteristics (weak updraft, low CCN, prevalent coalescence and rainout, weak mixed phase reflectivity, low glaciation temperature, and little if any lightning) is documented over the Amazon basin of the South American continent, and dubbed the "green ocean". Radar, lightning, thermodynamic and AVHRR satellite observations are examined to shed light on the roles of updraft and aerosol in providing departures from the green ocean regime toward continental behavior. Extreme case studies are identified in which the updraft control is dominant and in which the aerosol control is dominant. The tentative conclusion gives importance to both updrafts and aerosol in shaping the electrification of tropical convection.
Lidar observed seasonal variation of vertical canopy structure in the Amazon evergreen forests
NASA Astrophysics Data System (ADS)
Tang, H.; Dubayah, R.
2017-12-01
Both light and water are important environmental factors governing tree growth. Responses of tropical forests to changes in them are complicated and can vary substantially across different spatial and temporal scales. Of particular interest is the dry-season green-up of Amazon forests, a phenomenon under considerable debate as to whether it is real or a "light illusion" caused by artifacts of passive optical remote sensing techniques. Here we analyze seasonal dynamic patterns of vertical canopy structure in the Amazon forests using lidar observations from NASA's Ice, Cloud, and land Elevation Satellite (ICESat). We found that the net greening of the canopy layer coincides with the wet-to-dry transition period, and its net browning occurs mostly in the late dry season. The understory also shows a seasonal cycle, but with an opposite variation to the canopy and minimal correlation with seasonal variations in rainfall or radiation. Our results further suggest a potential interaction between canopy layers in the light regime that can optimize the growth of Amazon forests during the dry season. This light regime variability, which exists in both spatial and temporal domains, can better reveal the dry-season green-up phenomenon, which appears less obvious when treating the Amazon forests as a whole.
Pöhlker, Mira L.; Pöhlker, Christopher; Ditas, Florian; ...
2016-12-20
Size-resolved long-term measurements of atmospheric aerosol and cloud condensation nuclei (CCN) concentrations and hygroscopicity were conducted at the remote Amazon Tall Tower Observatory (ATTO) in the central Amazon Basin over a 1-year period and full seasonal cycle (March 2014–February 2015). Our measurements provide a climatology of CCN properties characteristic of a remote central Amazonian rain forest site. The CCN measurements were continuously cycled through 10 levels of supersaturation (S = 0.11 to 1.10 %) and span the aerosol particle size range from 20 to 245 nm. The mean critical diameters of CCN activation range from 43 nm at S = 1.10 % to 172 nm at S = 0.11 %. Furthermore, the particle hygroscopicity exhibits a pronounced size dependence, with lower values for the Aitken mode (κ_Ait = 0.14 ± 0.03), higher values for the accumulation mode (κ_Acc = 0.22 ± 0.05), and an overall mean value of κ_mean = 0.17 ± 0.06, consistent with high fractions of organic aerosol. The hygroscopicity parameter, κ, exhibits remarkably little temporal variability: no pronounced diurnal cycles, only weak seasonal trends, and few short-term variations during long-range transport events. In contrast, the CCN number concentrations exhibit a pronounced seasonal cycle, tracking the pollution-related seasonality in total aerosol concentration. Here, we find that the variability in the CCN concentrations in the central Amazon is mostly driven by aerosol particle number concentration and size distribution, while variations in aerosol hygroscopicity and chemical composition matter only during a few episodes. For modeling purposes, we compare different approaches of predicting CCN number concentration and present a novel parametrization, which allows accurate CCN predictions based on a small set of input data.
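The reported critical diameters and κ values are mutually consistent under the standard single-parameter κ-Köhler approximation, s_c ≈ sqrt(4A³ / (27 κ D³)) with Kelvin parameter A = 4σM_w/(R T ρ_w) (Petters and Kreidenweis). The relation is standard, but the temperature and surface tension chosen below are assumptions for illustration, not values stated in the abstract:

```python
import math

def critical_supersaturation(d_dry_m, kappa, T=298.15):
    """kappa-Koehler critical supersaturation (in percent) for a dry
    particle of diameter d_dry_m with hygroscopicity parameter kappa."""
    sigma = 0.072      # surface tension of water, J m^-2 (assumed)
    M_w = 0.018015     # molar mass of water, kg mol^-1
    rho_w = 997.0      # density of water, kg m^-3
    R = 8.314          # gas constant, J mol^-1 K^-1
    A = 4 * sigma * M_w / (R * T * rho_w)   # Kelvin parameter, m
    return 100.0 * math.sqrt(4 * A**3 / (27 * kappa * d_dry_m**3))

# Aitken-mode kappa at the small end of the measured critical diameters:
print(critical_supersaturation(43e-9, 0.14))    # close to S = 1.10 %
# Accumulation-mode kappa at the large end:
print(critical_supersaturation(172e-9, 0.22))   # close to S = 0.11 %
```

With these assumed constants, a 43 nm particle at κ_Ait = 0.14 activates near 1.1 % supersaturation and a 172 nm particle at κ_Acc = 0.22 near 0.11 %, matching the measured endpoints of the activation range.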
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Palikonda, R.; Smith, W. L., Jr.; Bedka, K. M.; Spangenberg, D.; Vakhnin, A.; Lutz, N. E.; Walter, J.; Kusterer, J.
2017-12-01
Cloud Computing offers new opportunities for large-scale scientific data producers to utilize Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) IT resources to process and deliver data products in an operational environment where timely delivery, reliability, and availability are critical. The NASA Langley Research Center Atmospheric Science Data Center (ASDC) is building and testing a private and public facing cloud for users in the Science Directorate to utilize as an everyday production environment. The NASA SatCORPS (Satellite ClOud and Radiation Property Retrieval System) team processes and derives near real-time (NRT) global cloud products from operational geostationary (GEO) satellite imager datasets. To deliver these products, we will utilize the public facing cloud and OpenShift to deploy a load-balanced webserver for data storage, access, and dissemination. The OpenStack private cloud will host data ingest and computational capabilities for SatCORPS processing. This paper will discuss the SatCORPS migration towards, and usage of, the ASDC Cloud Services in an operational environment. Detailed lessons learned from use of prior cloud providers, specifically the Amazon Web Services (AWS) GovCloud and the Government Cloud administered by the Langley Managed Cloud Environment (LMCE) will also be discussed.
Satellite-based estimation of cloud-base updrafts for convective clouds and stratocumulus
NASA Astrophysics Data System (ADS)
Zheng, Y.; Rosenfeld, D.; Li, Z.
2017-12-01
Updraft speeds of thermals have always been notoriously difficult to measure, despite the significant roles they play in transporting pollutants and in cloud formation and precipitation. To our knowledge, no attempt to date has been made to estimate updraft speed from satellite information. In this study, we introduce three methods of retrieving updraft speeds at cloud base (Wb) for convective clouds and marine stratocumulus with VIIRS onboard the Suomi-NPP satellite. The first method uses the ground-air temperature difference to characterize the surface sensible heat flux, which is found to be correlated with updraft speeds measured by the Doppler lidar over the Southern Great Plains (SGP). Based on this relationship, we use the satellite-retrieved surface skin temperature and reanalysis surface air temperature to estimate the updrafts. The second method is based on a good linear correlation between cloud base height and updrafts, which was found over the SGP, the central Amazon, and on board a ship sailing between Honolulu and Los Angeles; we found a universal relationship for both land and ocean. The third method is for marine stratocumulus. A statistically significant relationship between Wb and the cloud-top radiative cooling rate (CTRC) is found from measurements over the northeastern Pacific and Atlantic. Based on this relation, satellite- and reanalysis-derived CTRC is utilized to infer the Wb of stratocumulus clouds. Evaluations against ground-based Doppler lidar measurements show estimation errors of 24%, 21%, and 22% for the three methods, respectively.
Shen, Yuan; Wang, Xiaoyu; Xu, Jianping; Lu, Lin
2017-07-04
The SerpinE2 pathway is evolutionarily conserved and plays an important role in tumorigenesis. SerpinE2 (a small ubiquitin-related modifier), like ubiquitin, conjugates SerpinE2 proteins onto lysine residues of target proteins. SerpinE2 over-expression has been found in several tumors. Here, we detected the level of SerpinE2 in 72 samples of EC tissue using immunohistochemistry to assess the role of SerpinE2 in EC prognosis. Meanwhile, we knocked down SerpinE2 by siRNA in the HTB-111 and Ishikawa EC cell lines and analyzed the viability and mobility change using an MTT assay, an annexin V/PI apoptosis assay, a wound scratch test and a transwell assay. A Kaplan-Meier analysis indicated a negative correlation between the level of SerpinE2 and the EC prognosis. Silencing SerpinE2 induced cell apoptosis and reduced the migration ability. Our data suggest SerpinE2 works as an oncogene in EC.
Bai, Wei-li; Yan, Ting-yuan; Wang, Zhi-xiang; Huang, De-chun; Yan, Ting-xuan; Li, Ping
2015-01-01
Curcumin-ethyl-cellulose (EC) sustained-release composite particles were prepared using supercritical CO2 anti-solvent technology. With drug loading and yield of the inclusion complex as evaluation indexes, and building on single-factor tests, an orthogonal experimental design was used to optimize the preparation process of the curcumin-EC sustained-release composite particles. Drug loading, yield, particle size distribution, scanning electron microscopy (SEM), infrared spectroscopy (IR), differential scanning calorimetry (DSC), and in vitro dissolution experiments were used to analyze the optimal process combination. The optimized process conditions from the orthogonal experiment were as follows: crystallization temperature 45 °C, crystallization pressure 10 MPa, curcumin concentration 8 g/L, solvent flow rate 0.9 mL/min, and CO2 velocity 4 L/min. Under the optimal conditions, the average drug loading and yield of the curcumin-EC sustained-release composite particles were 33.01% and 83.97%, respectively, and the average particle size was 20.632 μm. IR and DSC analyses showed that curcumin might form a complex with EC. The in vitro dissolution experiments showed that the curcumin-EC composite particles had a good sustained-release effect. Curcumin-EC sustained-release composite particles can thus be prepared by supercritical CO2 anti-solvent technology.
Cloud fraction and cloud base measurements from scanning Doppler lidar during WFIP-2
NASA Astrophysics Data System (ADS)
Bonin, T.; Long, C.; Lantz, K. O.; Choukulkar, A.; Pichugina, Y. L.; McCarty, B.; Banta, R. M.; Brewer, A.; Marquis, M.
2017-12-01
The second Wind Forecast Improvement Project (WFIP-2) consisted of an 18-month field deployment of a variety of instrumentation with the principal objective of validating and improving NWP forecasts for wind energy applications in complex terrain. As part of the instrumentation suite, several scanning Doppler lidars were installed across the study domain, primarily to measure profiles of the mean wind and turbulence at high resolution within the planetary boundary layer. In addition to these measurements, Doppler lidar observations can be used to directly quantify the cloud fraction and cloud base, since clouds appear as a high backscatter return. These supplementary cloud measurements can then be used to validate cloud cover and other properties in NWP output. Herein, statistics of the cloud fraction and cloud base height for the duration of WFIP-2 are presented. Additionally, these cloud fraction estimates from Doppler lidar are compared with similar measurements from a Total Sky Imager and with Radiative Flux Analysis (RadFlux) retrievals at the Wasco site. During mostly cloudy to overcast conditions, estimates of the cloud radiating temperature from the RadFlux methodology are also compared with Doppler lidar measured cloud base heights.
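The lidar cloud retrieval described above can be sketched with a simple threshold rule: a cloud appears as the first range gate whose backscatter exceeds a threshold, the height of that gate is the cloud base, and cloud fraction is the share of profiles containing such a gate. The threshold and profile values below are illustrative placeholders, not WFIP-2 numbers.

```python
import numpy as np

def cloud_base_index(profile, threshold):
    """Index of the first range gate whose backscatter exceeds the
    threshold, or None if the profile is cloud-free."""
    hits = np.flatnonzero(profile > threshold)
    return int(hits[0]) if hits.size else None

def cloud_fraction(profiles, heights, threshold):
    """Fraction of profiles with a detected cloud, plus the cloud-base
    heights of the cloudy profiles."""
    idx = [cloud_base_index(p, threshold) for p in profiles]
    bases = [heights[i] for i in idx if i is not None]
    return len(bases) / len(profiles), bases

heights = np.arange(0, 3000, 100)            # range-gate heights, m AGL
clear = np.full(heights.size, 1e-7)          # weak aerosol return
cloud = clear.copy()
cloud[12:15] = 1e-4                          # strong return at 1200-1400 m
frac, bases = cloud_fraction([clear, cloud, cloud, clear], heights, 1e-5)
print(frac, bases)   # half the profiles cloudy, bases at 1200 m
```

A real retrieval would also correct for attenuation and screen precipitation, but the fraction/base bookkeeping is this simple.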
Elastic Cloud Computing Infrastructures in the Open Cirrus Testbed Implemented via Eucalyptus
NASA Astrophysics Data System (ADS)
Baun, Christian; Kunze, Marcel
Cloud computing realizes the advantages and overcomes some restrictions of the grid computing paradigm. Elastic infrastructures can easily be created and managed by cloud users. In order to accelerate research on data center management and cloud services, the OpenCirrus™ research testbed has been started by HP, Intel and Yahoo!. Although commercial cloud offerings are proprietary, Open Source solutions exist in the field of IaaS with Eucalyptus, PaaS with AppScale and at the applications layer with Hadoop MapReduce. This paper examines the I/O performance of cloud computing infrastructures implemented with Eucalyptus in contrast to Amazon S3.
CloudNeo: a cloud pipeline for identifying patient-specific tumor neoantigens.
Bais, Preeti; Namburi, Sandeep; Gatti, Daniel M; Zhang, Xinyu; Chuang, Jeffrey H
2017-10-01
We present CloudNeo, a cloud-based computational workflow for identifying patient-specific tumor neoantigens from next generation sequencing data. Tumor-specific mutant peptides can be detected by the immune system through their interactions with the human leukocyte antigen complex, and neoantigen presence has recently been shown to correlate with anti T-cell immunity and efficacy of checkpoint inhibitor therapy. However computing capabilities to identify neoantigens from genomic sequencing data are a limiting factor for understanding their role. This challenge has grown as cancer datasets become increasingly abundant, making them cumbersome to store and analyze on local servers. Our cloud-based pipeline provides scalable computation capabilities for neoantigen identification while eliminating the need to invest in local infrastructure for data transfer, storage or compute. The pipeline is a Common Workflow Language (CWL) implementation of human leukocyte antigen (HLA) typing using Polysolver or HLAminer combined with custom scripts for mutant peptide identification and NetMHCpan for neoantigen prediction. We have demonstrated the efficacy of these pipelines on Amazon cloud instances through the Seven Bridges Genomics implementation of the NCI Cancer Genomics Cloud, which provides graphical interfaces for running and editing, infrastructure for workflow sharing and version tracking, and access to TCGA data. The CWL implementation is at: https://github.com/TheJacksonLaboratory/CloudNeo. For users who have obtained licenses for all internal software, integrated versions in CWL and on the Seven Bridges Cancer Genomics Cloud platform (https://cgc.sbgenomics.com/, recommended version) can be obtained by contacting the authors. jeff.chuang@jax.org. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment
Muthurajan, Vinothkumar; Narayanasamy, Balaji
2016-01-01
Cloud computing requires security upgrades in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure secure data transfer between devices. Symmetric key mechanisms (pseudorandom functions) provide a lower protection level than asymmetric key schemes (RSA, AES, and ECC). The presence of expired content and irrelevant resources adversely enables unauthorized data access. This paper investigates how integrity and secure data transfer can be improved using an Elliptic Curve based Schnorr scheme. It proposes a virtual machine based cloud model with a Hybrid Cloud Security Algorithm (HCSA) to remove expired content. HCSA-based auditing improves malicious activity prediction during data transfer. Duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes; this paper therefore utilizes the Bloom filter concept to avoid cloud server duplication. The combination of EC-Schnorr and the Bloom filter efficiently improves security performance. A comparative analysis between the proposed HCSA and the existing Distributed Hash Table (DHT) approach regarding execution time, computational overhead, and auditing time with varying numbers of auditing requests and servers confirms the effectiveness of HCSA in building the cloud security model. PMID:26981584
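The duplication-avoidance idea rests on a standard Bloom filter: a fixed bit array probed by k hash functions answers "definitely not stored" or "possibly stored" without keeping the data itself, so the server can skip redundant uploads cheaply. A minimal sketch of the general data structure; the salted-SHA-256 hashing scheme and sizes are illustrative, not taken from the paper:

```python
import hashlib

class BloomFilter:
    def __init__(self, n_bits=1024, n_hashes=4):
        self.n_bits, self.n_hashes = n_bits, n_hashes
        self.bits = bytearray(n_bits)    # one byte per bit, for clarity

    def _positions(self, item):
        # Derive k bit positions from salted SHA-256 digests.
        for i in range(self.n_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.n_bits

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def might_contain(self, item):
        # False means definitely absent; True means possibly present.
        return all(self.bits[pos] for pos in self._positions(item))

store = BloomFilter()
store.add("block-42")
print(store.might_contain("block-42"))   # True
print(store.might_contain("block-99"))   # almost certainly False
```

False positives (a "possibly present" answer for an absent item) are the price of the constant-size structure; their rate is tuned by the bit-array size and hash count.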
NASA Astrophysics Data System (ADS)
Tosca, M. G.; Diner, D. J.; Garay, M. J.; Kalashnikova, O. V.
2012-12-01
Fire-emitted aerosols modify cloud and precipitation dynamics by acting as cloud condensation nuclei in what are known as the first and second aerosol indirect effects. The cloud response to the indirect effect varies regionally and is not well understood in the highly convective tropics. We analyzed nine years (2003-2011) of aerosol data from the Multi-angle Imaging SpectroRadiometer (MISR) and fire emissions data from the Global Fire Emissions Database, version 3 (GFED3) over southeastern tropical Asia (Indonesia), and identified scenes that contained both a high atmospheric aerosol burden and large surface fire emissions. We then collected scenes from the Cloud Profiling Radar (CPR) on board the CLOUDSAT satellite that corresponded both spatially and temporally to the high-burning scenes from MISR, and identified differences in convective cloud dynamics over areas with varying aerosol optical depths. Differences in overpass times (MISR in the morning, CLOUDSAT in the afternoon) improved our ability to infer that changes in cloud dynamics were a response to increased or decreased aerosol emissions. Our results extend conclusions from initial studies over the Amazon that used remote sensing techniques to identify cloud fraction reductions in high burning areas (Koren et al., 2004; Rosenfeld, 1999). References: Koren, I., Y.J. Kaufman, L.A. Remer and J.V. Martins (2004), Measurement of the effect of Amazon smoke on inhibition of cloud formation, Science, 303, 1342-1345. Rosenfeld, D. (1999), TRMM observed first direct evidence of smoke from forest fires inhibiting rainfall, Geophys. Res. Lett., 26, 3105.
Performance, Agility and Cost of Cloud Computing Services for NASA GES DISC Giovanni Application
NASA Astrophysics Data System (ADS)
Pham, L.; Chen, A.; Wharton, S.; Winter, E. L.; Lynnes, C.
2013-12-01
The NASA Goddard Earth Science Data and Information Services Center (GES DISC) is investigating the performance, agility, and cost of Cloud computing for GES DISC applications. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure), one of the core applications at the GES DISC for online climate-related Earth science data access, subsetting, analysis, visualization, and downloading, was used to evaluate the feasibility and effort of porting an application to the Amazon Cloud Services platform. The performance and the cost of running Giovanni on the Amazon Cloud were compared to similar parameters for the GES DISC local operational system. A Giovanni time-series analysis of aerosol absorption optical depth (388 nm) from OMI (Ozone Monitoring Instrument)/Aura was selected for these comparisons. All required data were pre-cached in both the Cloud and the local system to avoid data transfer delays. The 3-, 6-, 12-, and 24-month data sets were used for analysis on both the Cloud and the local system, and the processing times for the analysis were used to evaluate system performance. To investigate application agility, Giovanni was installed and tested on multiple Cloud platforms. The cost of using a Cloud computing platform mainly consists of computing, storage, data requests, and data transfer in/out. The Cloud computing cost is calculated from an hourly rate, and the storage cost from a rate per Gigabyte per month. Incoming data transfer is free; for data transfer out, the cost is based on a per-Gigabyte rate. The costs for a local server system consist of buying hardware/software, system maintenance/updating, and operating costs. The results showed that the Cloud platform had 38% better performance and cost 36% less than the local system. This investigation shows the potential of cloud computing to increase system performance and lower the overall cost of system management.
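The cost components named above (hourly compute, per-GB-month storage, free ingress, per-GB egress) combine into a simple additive model. The sketch below uses that structure; the rates and usage figures are invented for illustration and are not the actual AWS prices or the study's numbers.

```python
def monthly_cloud_cost(instance_hours, hourly_rate,
                       storage_gb, storage_rate_gb_month,
                       egress_gb, egress_rate_gb):
    """Sum the cost components from the abstract:
    compute (hourly rate), storage (per GB per month),
    and data transfer out (per GB). Incoming transfer is free."""
    compute = instance_hours * hourly_rate
    storage = storage_gb * storage_rate_gb_month
    egress = egress_gb * egress_rate_gb
    return compute + storage + egress

# Hypothetical month: 720 instance-hours at $0.10/h,
# 500 GB stored at $0.03/GB-month, 50 GB transferred out at $0.09/GB.
cost = monthly_cloud_cost(720, 0.10, 500, 0.03, 50, 0.09)
print(f"${cost:.2f}")  # $91.50
```

A local-server comparison would replace the usage-proportional terms with amortized hardware, maintenance, and operating costs, which is what makes the 36% cost difference reported above meaningful only for a stated workload.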
H2O2 modulates the energetic metabolism of the cloud microbiome
NASA Astrophysics Data System (ADS)
Wirgot, Nolwenn; Vinatier, Virginie; Deguillaume, Laurent; Sancelme, Martine; Delort, Anne-Marie
2017-12-01
Chemical reactions in clouds lead to oxidation processes driven by radicals (mainly HO•, NO3•, or HO2•) or strong oxidants such as H2O2, O3, nitrate, and nitrite. Among those species, hydrogen peroxide plays a central role in the cloud chemistry by driving its oxidant capacity. In cloud droplets, H2O2 is transformed by microorganisms which are metabolically active. Biological activity can therefore impact the cloud oxidant capacity. The present article aims at highlighting the interactions between H2O2 and microorganisms within the cloud system. First, experiments were performed with selected strains studied as a reference isolated from clouds in microcosms designed to mimic the cloud chemical composition, including the presence of light and iron. Biotic and abiotic degradation rates of H2O2 were measured and results showed that biodegradation was the most efficient process together with the photo-Fenton process. H2O2 strongly impacted the microbial energetic state as shown by adenosine triphosphate (ATP) measurements in the presence and absence of H2O2. This ATP depletion was not due to the loss of cell viability. Secondly, correlation studies were performed based on real cloud measurements from 37 cloud samples collected at the PUY station (1465 m a.s.l., France). The results support a strong correlation between ATP and H2O2 concentrations and confirm that H2O2 modulates the energetic metabolism of the cloud microbiome. The modulation of microbial metabolism by H2O2 concentration could thus impact cloud chemistry, in particular the biotransformation rates of carbon compounds, and consequently can perturb the way the cloud system is modifying the global atmospheric chemistry.
NASA Astrophysics Data System (ADS)
Luo, G.; Yu, F.
2016-12-01
GEOS-Chem, presently used by many research groups, is a state-of-the-art global 3-D model of atmospheric composition driven by assimilated meteorology from the Goddard Earth Observing System. Our comparisons of GEOS-5 cloud properties, used in photolysis rate calculations, with MODIS retrievals show that GEOS-5 underestimates cloud optical depth (COD) by a factor of more than 2 over most regions. Our further analysis indicates that the COD underestimation in the released GEOS-5 meteorology products is likely associated with the fact that GEOS-5 does not take into account the impact of aerosol on cloud microphysics and optical properties. A new COD parameterization (called NewC hereafter), which re-calculates COD from GEOS-5 water content and GEOS-Chem simulated size-resolved particle properties, cloud condensation nuclei abundance, and cloud droplet number concentration, has been developed and incorporated into GEOS-Chem. NewC increases the GEOS-5 derived annual mean COD averaged between 60°S and 60°N from 2.0 to 4.3, in much better agreement with the corresponding MODIS value of 4.3. The enhanced COD based on the NewC scheme reduces the global average boundary layer OH concentration by 9.8%. Zonally averaged OH is increased 3-7% above clouds due to backscattering and decreased 10-15% below clouds due to the attenuation of solar radiation. The global mean OH concentrations simulated by GEOS-Chem driven by GEOS-5 COD and NewC COD are 12.9 × 10^5 molec cm^-3 and 12.2 × 10^5 molec cm^-3, respectively, with the latter closer to the ACCMIP multi-model estimate (11.7 ± 1.0 × 10^5 molec cm^-3). Surface OH concentrations over major anthropogenic regions such as the Eastern US, Europe, and East Asia decrease by up to 18.6%, 14.4%, and 19.9%, respectively. The relative change of surface OH concentration appears to be the largest over the Amazon rainforest, reaching up to -30%. After switching from the old COD to the NewC COD, GEOS-Chem simulated column HCHO and surface isoprene over Amazon
Cai, Jin-Yuan; Huang, De-Chun; Wang, Zhi-Xiang; Dang, Bei-Lei; Wang, Qiu-Ling; Su, Xin-Guang
2012-06-01
Ibuprofen/ethyl-cellulose (EC)-polyvinylpyrrolidone (PVP) sustained-release composite particles were prepared using supercritical CO2 anti-solvent technology. With drug loading as the main evaluation index, an orthogonal experimental design was used to optimize the preparation process of EC-PVP/ibuprofen composite particles. Encapsulation efficiency, particle size distribution, electron microscopy, infrared spectroscopy (IR), differential scanning calorimetry (DSC), and in vitro dissolution experiments were used to analyze the optimal process combination. The optimized process conditions from the orthogonal experiment were: crystallization temperature 40 °C, crystallization pressure 12 MPa, PVP concentration 4 mg·mL(-1), and CO2 velocity 3.5 L·min(-1). Under the optimal conditions, the drug loading and encapsulation efficiency of ibuprofen/EC-PVP composite particles were 12.14% and 52.21%, and the average particle size was 27.621 µm. IR and DSC analysis showed that PVP might complex with EC. The in vitro dissolution experiments showed that ibuprofen/EC-PVP composite particles had a good sustained-release effect. The results show that ibuprofen/EC-PVP sustained-release composite particles can be prepared by supercritical CO2 anti-solvent technology.
Effects of CO2 Physiological Forcing on Amazon Climate
NASA Astrophysics Data System (ADS)
Halladay, K.; Good, P.; Kay, G.; Betts, R.
2014-12-01
Earth system models provide us with an opportunity to examine the complex interactions and feedbacks between land surface, vegetation and atmosphere. A more thorough understanding of these interactions is essential in reducing uncertainty surrounding the potential impacts of climate and environmental change on the future state and extent of the Amazon rainforest. This forest is an important resource for the region and globally in terms of ecosystem services, hydrology and biodiversity. We aim to investigate the effect of CO2 physiological forcing on the Amazon rainforest and its feedback on regional climate by using the CMIP5 idealised 1% CO2 simulations with a focus on HadGEM2-ES. In these simulations, the atmospheric CO2 concentration is increased by 1% per year for 140 years, reaching around 1150 ppm at the end of the simulation. The use of idealised simulations allows the effect of CO2 to be separated from other forcings and the sensitivities to be quantified. In particular, it enables non-linear feedbacks to be identified. In addition to the fully coupled 1% CO2 simulation, in which all schemes respond to the forcing, we use simulations in which (a) only the biochemistry scheme sees the rising CO2 concentration, and (b) rising CO2 is only seen by the radiation scheme. With these simulations we examine the degree to which CO2 effects are additive or non-linear when in combination. We also show regional differences in climate and vegetation response, highlighting areas of increased sensitivity.
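The quoted end-of-run concentration is plain annual compounding. Assuming a pre-industrial-like starting value of about 285 ppm (our assumption; the abstract does not state the initial concentration), the arithmetic checks out:

```python
# 1% per year compounding of atmospheric CO2 over 140 years.
start_ppm = 285.0                 # assumed initial concentration, not from the abstract
conc = start_ppm * 1.01 ** 140    # multiply by 1.01 each year, 140 times
print(round(conc))                # 1148, i.e. roughly the 1150 ppm quoted
```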
Research on Key Technologies of Cloud Computing
NASA Astrophysics Data System (ADS)
Zhang, Shufen; Yan, Hongcan; Chen, Xuebin
With the development of multi-core processors, virtualization, distributed storage, broadband Internet, and automatic management, a new computing mode named cloud computing has emerged. It distributes computation tasks across a resource pool consisting of massive numbers of computers, so that application systems can obtain computing power, storage space, and software services on demand. It concentrates all computing resources and manages them automatically through software, without human intervention, freeing application providers from tedious operational details so they can concentrate on their business; this favors innovation and reduces cost. The ultimate goal of cloud computing is to provide computation, services, and applications as a public utility, so that people can use computing resources just as they use water, electricity, gas, and the telephone. Currently, the understanding of cloud computing is still developing and changing, and it has no unanimous definition. This paper describes the three main service forms of cloud computing (SaaS, PaaS, IaaS), compares the definitions of cloud computing given by Google, Amazon, IBM, and other companies, summarizes the basic characteristics of cloud computing, and emphasizes key technologies such as data storage, data management, virtualization, and programming models.
The Monoceros R2 Molecular Cloud
NASA Astrophysics Data System (ADS)
Carpenter, J. M.; Hodapp, K. W.
2008-12-01
The Monoceros R2 region was first recognized as a chain of reflection nebulae illuminated by A- and B-type stars. These nebulae are associated with a giant molecular cloud that is one of the closest massive star forming regions to the Sun. This chapter reviews the properties of the Mon R2 region, including the namesake reflection nebulae, the large scale molecular cloud, global star formation activity, and properties of prominent star forming regions in the cloud.
Genotyping in the cloud with Crossbow.
Gurtowski, James; Schatz, Michael C; Langmead, Ben
2012-09-01
Crossbow is a scalable, portable, and automatic cloud computing tool for identifying SNPs from high-coverage, short-read resequencing data. It is built on Apache Hadoop, an implementation of the MapReduce software framework. Hadoop allows Crossbow to distribute read alignment and SNP calling subtasks over a cluster of commodity computers. Two robust tools, Bowtie and SOAPsnp, implement the fundamental alignment and variant calling operations respectively, and have demonstrated capabilities within Crossbow of analyzing approximately one billion short reads per hour on a commodity Hadoop cluster with 320 cores. Through protocol examples, this unit will demonstrate the use of Crossbow for identifying variations in three different operating modes: on a Hadoop cluster, on a single computer, and on the Amazon Elastic MapReduce cloud computing service.
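The throughput figure quoted above (about one billion reads per hour on 320 cores) implies a simple per-core rate, which is what one would use to size a Hadoop cluster for a target workload. The sizing question below is an invented example, not from the unit:

```python
# Crossbow throughput from the abstract: ~1 billion reads/hour on 320 cores.
reads_per_hour = 1_000_000_000
cores = 320
per_core = reads_per_hour / cores
print(f"{per_core:,.0f} reads/core/hour")   # 3,125,000 reads/core/hour

# Hypothetical sizing question: cores needed to call SNPs on
# 5 billion reads within 2 hours, at the same per-core rate.
needed = (5_000_000_000 / 2) / per_core
print(round(needed))                        # 800
```

This linear scaling is the idealized MapReduce case; in practice alignment and SNP-calling stages shuffle data between nodes, so real clusters are provisioned with some headroom.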
Crop classification and mapping based on Sentinel missions data in cloud environment
NASA Astrophysics Data System (ADS)
Lavreniuk, M. S.; Kussul, N.; Shelestov, A.; Vasiliev, V.
2017-12-01
Availability of high resolution satellite imagery (Sentinel-1/2/3, Landsat) over large territories opens new opportunities in agricultural monitoring. In particular, it becomes feasible to solve crop classification and crop mapping task at country and regional scale using time series of heterogenous satellite imagery. But in this case, we face with the problem of Big Data. Dealing with time series of high resolution (10 m) multispectral imagery we need to download huge volumes of data and then process them. The solution is to move "processing chain" closer to data itself to drastically shorten time for data transfer. One more advantage of such approach is the possibility to parallelize data processing workflow and efficiently implement machine learning algorithms. This could be done with cloud platform where Sentinel imagery are stored. In this study, we investigate usability and efficiency of two different cloud platforms Amazon and Google for crop classification and crop mapping problems. Two pilot areas were investigated - Ukraine and England. Google provides user friendly environment Google Earth Engine for Earth observation applications with a lot of data processing and machine learning tools already deployed. At the same time with Amazon one gets much more flexibility in implementation of his own workflow. Detailed analysis of pros and cons will be done in the presentation.
Comparison between SAGE II and ISCCP high-level clouds. 2: Locating cloud tops
NASA Technical Reports Server (NTRS)
Liao, Xiaohan; Rossow, William B.; Rind, David
1995-01-01
A comparison is made of the vertical distribution of high-level cloud tops derived from the Stratospheric Aerosol and Gas Experiment II (SAGE II) occultation measurements and from the International Satellite Cloud Climatology Project (ISCCP) for all Julys and Januarys from 1985 to 1990. The results suggest that ISCCP overestimates the pressure of high-level clouds by 50-150 mbar, particularly at low latitudes. This is caused by the frequent presence of clouds with diffuse tops (more than 50% of the time when cloudy events are observed). The averaged vertical extent of the diffuse top is about 1.5 km. At midlatitudes, where the SAGE II and ISCCP cloud top pressures agree best, clouds with distinct tops reach their maximum relative proportion of the total high-level cloud amount (about 30-40%), and diffuse-topped clouds are reduced to their minimum (30-40%). The ISCCP-defined cloud top pressure should be regarded not as the material physical height of the clouds but as the level which emits the same infrared radiance as observed. SAGE II and ISCCP cloud top pressures agree for clouds with distinct tops. There is also an indication that the cloud top pressures of optically thin clouds not overlying thicker clouds are poorly estimated by ISCCP at middle latitudes. The average vertical extent of these thin clouds is about 2.5 km.
Confluence of the Amazon and Tapajos Rivers, Brazil, South America
1991-08-11
This view shows the confluence of the Amazon and the Tapajos Rivers at Santarem, Brazil (2.0S, 55.0W). The Amazon flows from lower left to upper right of the photo. Below the river juncture of the Amazon and Tapajos, there is considerable deforestation activity along the Trans-Amazon Highway.
Ocean Heat Uptake Slows 21st Century Surface Warming Driven by Extratropical Cloud Feedbacks
NASA Astrophysics Data System (ADS)
Frey, W.; Maroon, E.; Pendergrass, A. G.; Kay, J. E.
2017-12-01
Equilibrium climate sensitivity (ECS), the warming in response to instantaneously doubled CO2, has long been used to compare climate models. In many models, ECS is well correlated with warming produced by transient forcing experiments. Modifications to cloud phase at high latitudes in a state-of-the-art climate model, the Community Earth System Model (CESM), produce a large increase in ECS (1.5 K) via extratropical cloud feedbacks. However, only a small surface warming increase occurs in a realistic 21st century simulation including a full-depth dynamic ocean and the "business as usual" RCP8.5 emissions scenario. In fact, the increase in surface warming is only barely above the internal variability-generated range in the CESM Large Ensemble. The small change in 21st century warming is attributed to subpolar ocean heat uptake in both hemispheres. In the Southern Ocean, the mean-state circulation takes up heat while in the North Atlantic a slowdown in circulation acts as a feedback to slow surface warming. These results show the importance of subpolar ocean heat uptake in controlling the pace of warming and demonstrate that ECS cannot be used to reliably infer transient warming when it is driven by extratropical feedbacks.
Privacy authentication using key attribute-based encryption in mobile cloud computing
NASA Astrophysics Data System (ADS)
Mohan Kumar, M.; Vijayan, R.
2017-11-01
Mobile cloud computing is becoming more popular as the number of smartphone users increases, so the security level of cloud computing has to be increased. Privacy authentication using key attribute-based encryption helps users in business settings share data with an organization through the cloud in a secured manner. In privacy authentication, the sender of the data has permission to add the receivers to whom data access is granted; for all others, access is denied. In the sender application, the user chooses the file to be sent to the receivers, and the data are then encrypted using key attribute-based encryption with the AES algorithm. The resulting cipher is stored in the Amazon cloud along with the key value and the receiver list.
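Structurally, the scheme described stores three things together: the ciphertext, a key value, and the receiver list, and decryption is only attempted for listed receivers. The sketch below shows that structure only; a hash-derived XOR keystream stands in for AES purely to keep the example self-contained (an assumption for brevity, NOT a secure cipher), and all names are hypothetical.

```python
import hashlib, os

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream via chained SHA-256 -- a stand-in for AES, NOT secure.
    out, block = b"", key
    while len(out) < n:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:n]

def encrypt_for(data: bytes, key: bytes, receivers: list) -> dict:
    cipher = bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))
    # The record stored in the cloud: cipher, key value, and receiver list.
    return {"cipher": cipher, "key": key, "receivers": receivers}

def decrypt(record: dict, user: str) -> bytes:
    if user not in record["receivers"]:      # access denied for others
        raise PermissionError("user not in receiver list")
    ks = keystream(record["key"], len(record["cipher"]))
    return bytes(a ^ b for a, b in zip(record["cipher"], ks))

key = os.urandom(16)
record = encrypt_for(b"quarterly report", key, ["alice", "bob"])
print(decrypt(record, "alice"))   # b'quarterly report'
```

In a real deployment the membership check would be enforced cryptographically by the attribute-based encryption itself, not by an application-level list test as in this sketch.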
Hybrid Cloud Computing Environment for EarthCube and Geoscience Community
NASA Astrophysics Data System (ADS)
Yang, C. P.; Qin, H.
2016-12-01
The NSF EarthCube Integration and Test Environment (ECITE) has built a hybrid cloud computing environment that provides cloud resources from private cloud environments using the cloud system software OpenStack and Eucalyptus, and also manages a public cloud, Amazon Web Services, allowing resources to be synchronized and burst between private and public clouds. On the ECITE hybrid cloud platform, the EarthCube and geoscience communities can deploy and manage applications using base or customized virtual machine images, analyze big datasets using virtual clusters, and monitor virtual resource usage on the cloud in real time. A number of EarthCube projects have deployed or started migrating to this platform, such as CHORDS, BCube, CINERGI, OntoSoft, and other EarthCube building blocks. To accomplish a deployment or migration, the administrator of the ECITE hybrid cloud platform prepares the specific needs of each project (e.g., images, port numbers, usable cloud capacity) in advance, based on communications between ECITE and the participating projects; the scientists or IT technicians in those projects then launch one or more virtual machines, access them to set up the computing environment if need be, and migrate their code, documents, or data without having to deal with the heterogeneity in structure and operations among different cloud platforms.
Zhang, Cheng; Wang, Jinju; Ma, Xiaotang; Wang, Wenjun; Zhao, Bin; Chen, Yanfang; Chen, Can; Bihl, Ji C
2018-03-01
Oxidative stress is one of the mechanisms of ageing-associated vascular dysfunction. Angiotensin-converting enzyme 2 (ACE2) and microRNA (miR)-18a have been shown to be down-regulated in ageing cells. Our previous study has shown that ACE2-primed endothelial progenitor cells (ACE2-EPCs) have protective effects on endothelial cells (ECs), which might be due to their released exosomes (EXs). Here, we aimed to investigate whether ACE2-EPC-EXs could attenuate hypoxia/reoxygenation (H/R)-induced injury in ageing ECs through their carried miR-18a. Young and angiotensin II-induced ageing ECs were subjected to H/R and co-cultured with vehicle (medium), EPC-EXs, ACE2-EPC-EXs, ACE2-EPC-EXs + DX600, or ACE2-EPC-EXs with miR-18a deficiency (ACE2-EPC-EXs^(anti-miR-18a)). Results showed (1) ageing ECs displayed increased senescence, apoptosis and ROS production, but decreased ACE2 and miR-18a expression and tube formation ability; (2) under H/R conditions, ageing ECs showed a higher rate of apoptosis, ROS overproduction and nitric oxide reduction, up-regulation of Nox2, down-regulation of ACE2, miR-18a and eNOS, and compromised tube formation ability; (3) compared with EPC-EXs, ACE2-EPC-EXs were more effective at protecting ECs from H/R-induced changes; (4) the protective effects were less evident in the ACE2-EPC-EXs + DX600 and ACE2-EPC-EXs^(anti-miR-18a) groups. These data suggest that ACE2-EPC-EXs have better protective effects on H/R injury in ageing ECs, which could act through their carried miR-18a and subsequent down-regulation of the Nox2/ROS pathway. © 2018 The Authors. Journal of Cellular and Molecular Medicine published by John Wiley & Sons Ltd and Foundation for Cellular and Molecular Medicine.
40 CFR Table 2 of Subpart Ec to... - Toxic Equivalency Factors
Code of Federal Regulations, 2010 CFR
2010-07-01
... Ec to Part 60—Toxic Equivalency Factors

Dioxin/furan congener: Toxic equivalency factor
2,3,7,8-tetrachlorinated dibenzo-p-dioxin: 1
1,2,3,7,8-pentachlorinated dibenzo-p-dioxin: 0.5
1,2,3,4,7,8-hexachlorinated dibenzo-p-dioxin: 0.1
1,2,3,7,8,9-hexachlorinated dibenzo-p-dioxin: 0.1
1,2,3,6,7,8-hexachlorinated dibenzo...
Large-Scale, Multi-Sensor Atmospheric Data Fusion Using Hybrid Cloud Computing
NASA Astrophysics Data System (ADS)
Wilson, Brian; Manipon, Gerald; Hua, Hook; Fetzer, Eric
2014-05-01
achieved "clock time" speedups in fusing datasets on our own nodes and in the Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer. We will also present a concept and prototype for staging NASA's A-Train atmospheric datasets (Levels 2 & 3) in the Amazon Cloud so that any number of compute jobs can be executed "near" the multi-sensor data. Given such a system, multi-sensor climate studies over 10-20 years of data could be performed
NASA Astrophysics Data System (ADS)
Rifai, S. W.; Anderson, L. O.; Bohlman, S.
2015-12-01
Blowdowns, which are large tree mortality events caused by downbursts, create large pulses of carbon emissions in the short term and alter successional dynamics and species composition of forests, thus affecting long term biogeochemical cycling of tropical forests. Changing climate, especially increasing temperatures and frequency of extreme climate events, may cause changes in the frequency of blowdowns, but there has been little spatiotemporal analysis to associate the interannual variation in the frequency of blowdowns with annual climate parameters. We mapped blowdowns greater than 25 ha using a time series of Landsat images from 1984-2012 in the northwestern Amazon to estimate the annual size distribution of these blowdowns. The difference in forest area affected by blowdowns between the years with the highest and lowest blowdown activity was on the order of 10-30 times depending on location. Spatially, we found the probability of large blowdowns to be higher in regions with higher annual rainfall. Temporally, we found a positive correlation between the probability of large blowdown events and maximum dry season air temperature (R^2 = 0.1-0.46). Mean and maximum blowdown size also increased with maximum dry season air temperature. The strength of these relationships varied between scene locations, which may be related to cloud cover obscuring the land surface in the satellite images, or biophysical characteristics of the sites. Potentially, elevated dry season temperatures during the transition from the dry season to the wet season (October - December) may exacerbate atmospheric instabilities, which promote downburst occurrences. Most global circulation models predict dry season air temperatures to increase 2-5 °C in the northwestern Amazon by 2050. Should the blowdown disturbance regime continue increasing with elevated dry season temperatures, the northwestern Amazon is likely to experience more catastrophic tree mortality events which has direct
The 3-D Tropical Convective Cloud Spectrum in AMIE Radar Observations and Global Climate Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schumacher, Courtney
2015-08-31
During the three years of this grant performance, the PI and her research group have made a number of significant contributions towards determining properties of tropical deep convective clouds and how models depict and respond to the heating associated with tropical convective systems. The PI has also been an active ARM/ASR science team member, including playing a significant role in AMIE and GoAmazon2014/5. She served on the DOE ASR radar science steering committee and was a joint chair of the Mesoscale Convective Organization group under the Cloud Life Cycle working group. This grant has funded a number of graduate students, many of them women, and the PI and her group have presented their DOE-supported work at various universities and national meetings. The PI and her group participated in the AMIE (2011-12) and GoAmazon2014/5 (2014-15) DOE field deployments that occurred in the tropical Indian Ocean and Brazilian Amazon, respectively. AMIE observational results (DePasquale et al. 2014, Feng et al. 2014, Ahmed and Schumacher 2015) focus on the variation and possible importance of Kelvin waves in various phases of the Madden-Julian Oscillation (MJO), on the synergy of the different wavelength radars deployed on Addu Atoll, and on the importance of humidity thresholds in the tropics on stratiform rain production. Much of the PI's GoAmazon2014/5 results to date relate to overviews of the observations made during the field campaign (Martin et al. 2015, 2016; Fuentes et al. 2016), but also include the introduction of the descending arm and its link to ozone transport from the mid-troposphere to the surface (Gerken et al. 2016). Vertical motion and mass flux profiles from GoAmazon (Giangrande et al. 2016) also show interesting patterns between seasons and provide targets for model simulations. Results from TWP-ICE (Schumacher et al. 2015), which took place in Darwin, Australia in 2006, show that vertical velocity retrievals from the profilers provide structure
Alden, Caroline B; Miller, John B; Gatti, Luciana V; Gloor, Manuel M; Guan, Kaiyu; Michalak, Anna M; van der Laan-Luijkx, Ingrid T; Touma, Danielle; Andrews, Arlyn; Basso, Luana S; Correia, Caio S C; Domingues, Lucas G; Joiner, Joanna; Krol, Maarten C; Lyapustin, Alexei I; Peters, Wouter; Shiga, Yoichi P; Thoning, Kirk; van der Velde, Ivar R; van Leeuwen, Thijs T; Yadav, Vineet; Diffenbaugh, Noah S
2016-10-01
Understanding tropical rainforest carbon exchange and its response to heat and drought is critical for quantifying the effects of climate change on tropical ecosystems, including global climate-carbon feedbacks. Of particular importance for the global carbon budget is net biome exchange of CO2 with the atmosphere (NBE), which represents nonfire carbon fluxes into and out of biomass and soils. Subannual and sub-Basin Amazon NBE estimates have relied heavily on process-based biosphere models, despite lack of model agreement with plot-scale observations. We present a new analysis of airborne measurements that reveals monthly, regional-scale (~1-8 × 10^6 km^2) NBE variations. We develop a regional atmospheric CO2 inversion that provides the first analysis of geographic and temporal variability in Amazon biosphere-atmosphere carbon exchange and that is minimally influenced by biosphere model-based first guesses of seasonal and annual mean fluxes. We find little evidence for a clear seasonal cycle in Amazon NBE but do find NBE sensitivity to aberrations from long-term mean climate. In particular, we observe increased NBE (more carbon emitted to the atmosphere) associated with heat and drought in 2010, and correlations between wet season NBE and precipitation (negative correlation) and temperature (positive correlation). In the eastern Amazon, pulses of increased NBE persisted through 2011, suggesting legacy effects of 2010 heat and drought. We also identify regional differences in postdrought NBE that appear related to long-term water availability. We examine satellite proxies and find evidence for higher gross primary productivity (GPP) during a pulse of increased carbon uptake in 2011, and lower GPP during a period of increased NBE in the 2010 dry season drought, but links between GPP and NBE changes are not conclusive. These results provide novel evidence of NBE sensitivity to short-term temperature and moisture extremes in the Amazon, where monthly and sub
NASA Technical Reports Server (NTRS)
Alden, Caroline B.; Miller, John B.; Gatti, Luciana V.; Gloor, Manuel M.; Guan, Kaiyu; Michalak, Anna M.; van der Laan-Luijkx, Ingrid; Touma, Danielle; Andrews, Arlyn; Basso, Luana G.;
2016-01-01
Understanding tropical rainforest carbon exchange and its response to heat and drought is critical for quantifying the effects of climate change on tropical ecosystems, including global climate carbon feedbacks. Of particular importance for the global carbon budget is net biome exchange of CO2 with the atmosphere (NBE), which represents nonfire carbon fluxes into and out of biomass and soils. Subannual and sub-Basin Amazon NBE estimates have relied heavily on process-based biosphere models, despite lack of model agreement with plot-scale observations. We present a new analysis of airborne measurements that reveals monthly, regional-scale (approx. 1-8 × 10^6 km^2) NBE variations. We develop a regional atmospheric CO2 inversion that provides the first analysis of geographic and temporal variability in Amazon biosphere-atmosphere carbon exchange and that is minimally influenced by biosphere model-based first guesses of seasonal and annual mean fluxes. We find little evidence for a clear seasonal cycle in Amazon NBE but do find NBE sensitivity to aberrations from long-term mean climate. In particular, we observe increased NBE (more carbon emitted to the atmosphere) associated with heat and drought in 2010, and correlations between wet season NBE and precipitation (negative correlation) and temperature (positive correlation). In the eastern Amazon, pulses of increased NBE persisted through 2011, suggesting legacy effects of 2010 heat and drought. We also identify regional differences in postdrought NBE that appear related to long-term water availability. We examine satellite proxies and find evidence for higher gross primary productivity (GPP) during a pulse of increased carbon uptake in 2011, and lower GPP during a period of increased NBE in the 2010 dry season drought, but links between GPP and NBE changes are not conclusive. These results provide novel evidence of NBE sensitivity to short-term temperature and moisture extremes in the Amazon, where monthly and sub
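The sign conventions reported above (NBE negatively correlated with wet-season precipitation, positively with temperature) are ordinary Pearson correlations. A minimal sketch follows; all monthly values are invented for illustration, not data from the study.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented wet-season monthly values: NBE rises as rainfall drops
# and temperature climbs, mimicking the reported sign pattern.
nbe    = [0.2, 0.5, 0.1, 0.9, 0.4, 0.8]          # arbitrary flux units
precip = [210, 150, 240, 90, 180, 110]           # mm/month
temp   = [25.1, 26.0, 24.8, 27.2, 25.6, 26.8]    # deg C

print(pearson(nbe, precip) < 0)   # True: negative correlation with precipitation
print(pearson(nbe, temp) > 0)     # True: positive correlation with temperature
```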
Lidar Comparison for GoAmazon 2014/15 Field Campaign Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barbosa, Henrique MJ; Barja, B; Landulfo, E
2016-04-01
The Observations and Modeling of the Green Ocean Amazon 2014/15 (GoAmazon 2014/15) experiment uses the city of Manaus, Amazonas (AM), Brazil, in the setting of the surrounding green ocean as a natural laboratory for understanding the effects of present and future anthropogenic pollution on the aerosol and cloud life cycle in the tropics. The U.S. Department of Energy (DOE) supported this experiment through the deployment of the Atmospheric Radiation Measurement (ARM) Climate Research Facility’s first Mobile Facility (AMF-1) in the city of Manacapuru, which is 100 km downwind of Manaus, from January 1, 2014, to December 31, 2015. During the second Intensive Operational Period (IOP) from August 15 to October 15, 2014, three lidar systems were operated simultaneously at different experimental sites, and an instrument comparison campaign was carried out during the period October 4 to 10, during which the mobile lidar system from Instituto de Pesquisas Energéticas e Nucleares-Universidade de São Paulo was brought from the T2 site (Iranduba) to the other sites (T3 [Manacapuru] and then T0e-Embrapa). In this report we present the data collected by the mobile lidar system at the DOE-ARM site and compare its measurements with those from the micro-pulse lidar system running at that site.
Scaling predictive modeling in drug development with cloud computing.
Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola
2015-01-26
Growing data sets and the increased time needed to analyze them are hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Compute Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.
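The parallelization described is embarrassingly parallel: independent training tasks farmed out to workers (cloud instances). A toy sketch of the pattern, assuming nothing about the paper's actual ligand-based models (the ridge-regression task and all data here are synthetic stand-ins):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Toy illustration of embarrassingly parallel model building: each task
# fits a ridge regression on a bootstrap resample, standing in for one
# cloud instance's job. (Synthetic data; not the paper's pipeline.)
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
w_true = np.array([1.0, 0.0, -2.0, 0.5, 3.0])
y = X @ w_true + rng.normal(scale=0.05, size=200)

def fit_ridge(seed, lam=1e-3):
    """One independent training task (bootstrap resample + closed-form ridge)."""
    r = np.random.default_rng(seed)
    idx = r.integers(0, len(X), size=len(X))
    Xb, yb = X[idx], y[idx]
    return np.linalg.solve(Xb.T @ Xb + lam * np.eye(X.shape[1]), Xb.T @ yb)

# Independent tasks parallelize trivially across workers (or instances).
with ThreadPoolExecutor(max_workers=4) as pool:
    models = list(pool.map(fit_ridge, range(8)))

w_avg = np.mean(models, axis=0)  # simple ensemble average of the 8 models
```

Because the tasks share no state, the same map call scales from local threads to a pool of cloud machines, which is where the speed/cost trade-off discussed above arises.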
Clustering, randomness, and regularity in cloud fields: 2. Cumulus cloud fields
NASA Astrophysics Data System (ADS)
Zhu, T.; Lee, J.; Weger, R. C.; Welch, R. M.
1992-12-01
During the last decade a major controversy has been brewing concerning the proper characterization of cumulus convection. The prevailing view has been that cumulus clouds form in clusters, in which cloud spacing is closer than that found for the overall cloud field and which maintain their identity over many cloud lifetimes. This "mutual protection hypothesis" of Randall and Huffman (1980) has been challenged by the "inhibition hypothesis" of Ramirez and Bras (1990), which strongly suggests that the spatial distribution of cumuli must tend toward a regular distribution. A dilemma has resulted because observations have been reported to support both hypotheses. The present work reports a detailed analysis of cumulus cloud field spatial distributions based upon Landsat, Advanced Very High Resolution Radiometer, and Skylab data. Both nearest-neighbor and point-to-cloud cumulative distribution function statistics are investigated. The results show unequivocally that when both large and small clouds are included in the cloud field distribution, the cloud field always has a strong clustering signal. The strength of clustering is largest at cloud diameters of about 200-300 m, diminishing with increasing cloud diameter. In many cases, clusters of small clouds are found which are not closely associated with large clouds. As the small clouds are eliminated from consideration, the cloud field typically tends towards regularity. Thus it would appear that the "inhibition hypothesis" of Ramirez and Bras (1990) has been verified for the large clouds. However, these results are based upon the analysis of point processes. A more exact analysis also is made which takes into account the cloud size distributions. Since distinct clouds are by definition nonoverlapping, cloud size effects place a restriction upon the possible locations of clouds in the cloud field. The net effect of this analysis is that the large clouds appear to be randomly distributed, with only weak tendencies towards
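The nearest-neighbor statistics mentioned above distinguish clustered, random, and regular point patterns. One classical example is the Clark-Evans index: the observed mean nearest-neighbor distance divided by its expectation under complete spatial randomness. A sketch (illustrative of the statistic's behavior, not necessarily the authors' exact implementation; edge effects are ignored):

```python
import numpy as np

# Clark-Evans nearest-neighbor index for a 2-D point pattern.
# R < 1 indicates clustering, R ~ 1 randomness, R > 1 regularity.
def clark_evans(points, area):
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                # exclude self-distances
    mean_nn = d.min(axis=1).mean()             # observed mean NN distance
    expected = 0.5 / np.sqrt(len(pts) / area)  # expectation under randomness
    return mean_nn / expected

rng = np.random.default_rng(2)
random_field = rng.uniform(0, 10, size=(200, 2))   # 10 x 10 domain
clustered = np.concatenate([rng.normal(c, 0.2, size=(50, 2))
                            for c in [(2, 2), (8, 3), (5, 8), (3, 7)]])
r_random = clark_evans(random_field, 100.0)
r_clustered = clark_evans(clustered, 100.0)
```

Applied to the two synthetic fields, the clustered pattern scores well below 1 and the uniform pattern near 1, mirroring the clustered-versus-regular discrimination the abstract describes.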
C-13/C-12 of atmospheric CO2 in the Amazon basin - Forest and river sources
NASA Technical Reports Server (NTRS)
Quay, Paul; King, Stagg; Wilbur, Dave; Richey, Jeffrey; Wofsy, Steven
1989-01-01
Results are presented of measurements of the CO2 concentrations and C-13/C-12 ratios in CO2 in air samples collected from within the Amazonian rain forest and over the Amazon river between 1982 and 1987. Results indicate the presence of a diurnal cycle in the CO2 concentration and the C-13/C-12 ratio. It was found that the CO2 input to air in the forest was derived from the soil respiration, and the CO2 input to air over the Amazon river was derived from the degassing of CO2 from the river. It was also found that plants growing at heights lower than 7 m assimilate soil-derived CO2 with a low C-13/C-12 ratio.
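Attributing CO2 added to background air (soil respiration in the forest, river degassing over the water) from paired concentration and C-13/C-12 measurements is commonly done with a Keeling plot: delta-13C regressed against 1/[CO2], with the intercept estimating the source signature. The abstract does not state the authors' exact method, so the sketch below only illustrates the standard technique, with invented numbers:

```python
import numpy as np

# Keeling-plot sketch: regress delta-13C of air against 1/[CO2]; the
# intercept estimates the isotopic signature of the added CO2 source.
# Background and source values below are synthetic placeholders.
bg_co2, bg_d13 = 350.0, -7.8    # hypothetical background air (ppm, per mil)
src_d13 = -27.0                 # hypothetical respiration-like source
added = np.linspace(5, 80, 12)  # ppm of source CO2 mixed in

co2 = bg_co2 + added
d13 = (bg_co2 * bg_d13 + added * src_d13) / co2  # isotopic mass balance

slope, intercept = np.polyfit(1.0 / co2, d13, 1)
# The intercept recovers the assumed source signature of -27 per mil.
```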
Amazon Rain Forest Classification Using J-ERS-1 SAR Data
NASA Technical Reports Server (NTRS)
Freeman, A.; Kramer, C.; Alves, M.; Chapman, B.
1994-01-01
The Amazon rain forest is a region of the earth that is undergoing rapid change. Man-made disturbance, such as clear cutting for agriculture or mining, is altering the rain forest ecosystem. For many parts of the rain forest, seasonal changes from the wet to the dry season are also significant. Changes in the seasonal cycle of flooding and draining can cause significant alterations in the forest ecosystem. Because much of the Amazon basin is regularly covered by thick clouds, optical and infrared coverage from the LANDSAT and SPOT satellites is sporadic. Imaging radar offers a much better potential for regular monitoring of changes in this region. In particular, the J-ERS-1 satellite carries an L-band HH SAR system, which via an on-board tape recorder, can collect data from almost anywhere on the globe at any time of year. In this paper, we show how J-ERS-1 radar images can be used to accurately classify different forest types (i.e., forest, hill forest, flooded forest), disturbed areas such as clear cuts and urban areas, and river courses in the Amazon basin. J-ERS-1 data has also shown significant differences between the dry and wet season, indicating a strong potential for monitoring seasonal change. The algorithm used to classify J-ERS-1 data is a standard maximum-likelihood classifier, using the radar image local mean and standard deviation of texture as input. Rivers and clear cuts are detected using edge detection and region-growing algorithms. Since this classifier is intended to operate successfully on data taken over the entire Amazon, several options are available to enable the user to modify the algorithm to suit a particular image.
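A maximum-likelihood classifier on the two features named above (local mean and texture standard deviation) can be sketched with per-class Gaussian statistics: assign each pixel to the class whose statistics make its feature vector most likely. The class names echo the paper, but the numbers are invented for illustration and are not trained on J-ERS-1 data:

```python
import numpy as np

# Per-pixel maximum-likelihood classification on two features:
# [local mean backscatter, local texture std]. Independent-Gaussian
# class models; the class statistics below are hypothetical.
def ml_classify(feature_vec, class_stats):
    """Pick the class maximizing an independent-Gaussian log-likelihood."""
    best, best_ll = None, -np.inf
    for name, (mu, sigma) in class_stats.items():
        mu, sigma = np.asarray(mu), np.asarray(sigma)
        ll = -np.sum(np.log(sigma) + 0.5 * ((feature_vec - mu) / sigma) ** 2)
        if ll > best_ll:
            best, best_ll = name, ll
    return best

# Hypothetical per-class (mean, std) for the two features.
stats = {
    "flooded forest": ([0.30, 0.08], [0.05, 0.02]),
    "forest":         ([0.18, 0.04], [0.04, 0.01]),
    "clear cut":      ([0.08, 0.02], [0.03, 0.01]),
}
label = ml_classify(np.array([0.29, 0.07]), stats)
```

Letting the user swap in per-image class statistics is one simple way to realize the "several options" for tuning the classifier to a particular scene.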
Finer, Matt; Jenkins, Clinton N.
2012-01-01
Due to rising energy demands and abundant untapped potential, hydropower projects are rapidly increasing in the Neotropics. This is especially true in the wet and rugged Andean Amazon, where regional governments are prioritizing new hydroelectric dams as the centerpiece of long-term energy plans. However, the current planning for hydropower lacks adequate regional and basin-scale assessment of potential ecological impacts. This lack of strategic planning is particularly problematic given the intimate link between the Andes and Amazonian flood plain, together one of the most species rich zones on Earth. We examined the potential ecological impacts, in terms of river connectivity and forest loss, of the planned proliferation of hydroelectric dams across all Andean tributaries of the Amazon River. Considering data on the full portfolios of existing and planned dams, along with data on roads and transmission line systems, we developed a new conceptual framework to estimate the relative impacts of all planned dams. There are plans for 151 new dams greater than 2 MW over the next 20 years, more than a 300% increase. These dams would include five of the six major Andean tributaries of the Amazon. Our ecological impact analysis classified 47% of the potential new dams as high impact and just 19% as low impact. Sixty percent of the dams would cause the first major break in connectivity between protected Andean headwaters and the lowland Amazon. More than 80% would drive deforestation due to new roads, transmission lines, or inundation. We conclude with a discussion of three major policy implications of these findings. 1) There is a critical need for further strategic regional and basin scale evaluation of dams. 2) There is an urgent need for a strategic plan to maintain Andes-Amazon connectivity. 3) Reconsideration of hydropower as a low-impact energy source in the Neotropics. PMID:22529979
Investigating the Use of Cloudbursts for High-Throughput Medical Image Registration
Kim, Hyunjoo; Parashar, Manish; Foran, David J.; Yang, Lin
2010-01-01
This paper investigates the use of clouds and autonomic cloudbursting to support medical image registration. The goal is to enable a virtual computational cloud that integrates local computational environments and public cloud services on-the-fly, and support image registration requests from different distributed researcher groups with varied computational requirements and QoS constraints. The virtual cloud essentially implements shared and coordinated task-spaces, which coordinate the scheduling of jobs submitted by a dynamic set of research groups to their local job queues. A policy-driven scheduling agent uses the QoS constraints along with performance history and the state of the resources to determine the appropriate size and mix of the public and private cloud resources that should be allocated to a specific request. The virtual computational cloud and the medical image registration service have been developed using the CometCloud engine and have been deployed on a combination of private clouds at Rutgers University and the Cancer Institute of New Jersey and Amazon EC2. An experimental evaluation is presented and demonstrates the effectiveness of autonomic cloudbursts and policy-based autonomic scheduling for this application. PMID:20640235
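The sizing decision such a scheduling agent makes can be sketched as: estimate the node count needed to meet the deadline, fill from free private nodes, and burst the remainder to the public cloud. The linear-speedup assumption and all numbers are illustrative, not CometCloud's actual policy logic:

```python
import math

# Sketch of a deadline-driven cloudburst sizing policy (illustrative;
# assumes independent tasks and speedup linear in node count).
def plan_burst(n_tasks, est_task_secs, deadline_secs,
               free_private_nodes, public_cost_per_node_hour):
    """Return (private_nodes, public_nodes, estimated_public_cost)."""
    nodes_needed = math.ceil(n_tasks * est_task_secs / deadline_secs)
    private = min(free_private_nodes, nodes_needed)
    public = max(0, nodes_needed - private)        # overflow -> cloudburst
    billed_hours = math.ceil(deadline_secs / 3600)
    return private, public, public * billed_hours * public_cost_per_node_hour

# 1000 registration jobs of ~90 s each, 1-hour deadline, 10 private nodes
private, public, cost = plan_burst(1000, 90, 3600, 10, 0.10)
```

A real policy agent would additionally fold in performance history and per-group QoS, as the abstract describes; this sketch shows only the capacity/cost arithmetic at the core of the decision.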
The Role of Intraseasonal Variability in Supporting the Shallow-to-Deep Transition in the Amazon
NASA Astrophysics Data System (ADS)
Serra, Y. L.; Rowe, A.; Adams, D. K.; Barbosa, H. M.; Kiladis, G. N.
2016-12-01
The shallow-to-deep convective transition over land typically refers to the growth of the convective boundary layer after sunrise, followed by the development of cumulus congestus clouds in the late morning/early afternoon and transitioning to deep convective clouds in the late afternoon and early evening. Under favorable conditions, this diurnal convection can result in organized mesoscale convective systems (MCSs) that last through the following morning. While many studies have focused on improving this process in models, the shallow-to-deep transition remains poorly represented, especially over land. The recent DOE ARM mobile facility deployment in the Amazon, launched as part of GoAmazon, together with a dense GNSS network supported by Universidade do Estado do Amazonas (UEA)/Instituto Nacional de Pesquisas Espaciais (INPE) and co-located with the CHUVA Project sites for GoAmazon, is used here to examine land-based convective processes in the tropics. In particular, this aspect of a larger study of the shallow-to-deep transition explores the role of large-scale intraseasonal wave activity in supporting the growth of MCSs over the GoAmazon region. These results will be placed in the context of local forcing mechanisms for convective growth over the region in ongoing work.
NASA Astrophysics Data System (ADS)
McLaughlin, B. D.; Pawloski, A. W.
2017-12-01
NASA ESDIS has been moving a variety of data ingest, distribution, and science data processing applications into a cloud environment over the last 2 years. As expected, there have been a number of challenges in migrating primarily on-premises applications into a cloud-based environment, related to architecture and to taking advantage of cloud-based services. What was not expected was a number of issues that went beyond purely technical application re-architectures. From surprising network policy limitations, billing challenges in a government-based cost model, and obtaining certificates in a NASA security-compliant manner to working with multiple applications in a shared and resource-constrained AWS account, these have been the relevant challenges in taking advantage of a cloud model. And most surprising of all… well, you'll just have to wait and see the "gotcha" that caught our entire team off guard!
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larson, T.K.; Anderson, J.L.; Condie, K.G.
Experiments designed to investigate surface dryout in a heated, ribbed annulus test section simulating one of the annular coolant channels of a Savannah River Plant production reactor Mark 22 fuel assembly have been conducted at the Idaho National Engineering Laboratory. The inner surface of the annulus was constructed of aluminum and was electrically heated to provide an axial cosine power profile and a flat azimuthal power shape. Data presented in this report are from the ECS-2, WSR, and ECS-2cE series of tests. These experiments were conducted to examine the onset of wall thermal excursion for a range of flow, inlet fluid temperature, and annulus outlet pressure. Hydraulic boundary conditions on the test section represent flowrates (0.1--1.4 1/s), inlet fluid temperatures (293--345 K), and outlet pressures (-18--139.7 cm of water relative to the bottom of the heated length (61--200 cm of water relative to the bottom of the lower plenum)) expected to occur during the Emergency Coolant System (ECS) phase of a postulated Loss-of-Coolant Accident in a production reactor. The onset of thermal excursion based on the present data is consistent with data gathered in test rigs with flat axial power profiles. The data indicate that wall dryout is primarily a function of liquid superficial velocity. Air entrainment rate was observed to be a strong function of the boundary conditions (primarily flowrate and liquid temperature), but had a minor effect on the power at the onset of thermal excursion for the range of conditions examined. 14 refs., 33 figs., 13 tabs.
Unidata Cyberinfrastructure in the Cloud
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.; Young, J. W.
2016-12-01
, Python scientific libraries, scripts, and workflows; * Exploring end-to-end modeling and prediction capabilities in the cloud; * Partnering with NOAA and public cloud vendors (e.g., Amazon and OCC) on the NOAA Big Data Project to harness their capabilities and resources for the benefit of the academic community.
Heinstra, P W; Geer, B W; Seykens, D; Langevin, M
1989-01-01
Both aldehyde dehydrogenase (ALDH, EC 1.2.1.3) and the aldehyde dehydrogenase activity of alcohol dehydrogenase (ADH, EC 1.1.1.1) were found to coexist in Drosophila melanogaster larvae. The enzymes, however, showed different inhibition patterns with respect to pyrazole, cyanamide and disulphiram. ALDH-1 and ALDH-2 isoenzymes were detected in larvae by electrophoretic methods. Nonetheless, in tracer studies in vivo, more than 75% of the acetaldehyde converted to acetate by the ADH ethanol-degrading pathway appeared to be also catalysed by the ADH enzyme. The larval fat body probably was the major site of this pathway. PMID:2499314
Aircraft measurements of aerosol properties during GoAmazon - G1 and HALO inter-comparison
NASA Astrophysics Data System (ADS)
Mei, F.; Cecchini, M. A.; Wang, J.; Tomlinson, J. M.; Comstock, J. M.; Hubbe, J. M.; Pekour, M. S.; Machado, L.; Wendisch, M.; Longo, K.; Martin, S. T.; Schmid, B.; Weinzierl, B.; Krüger, M. L.; Zöger, M.
2015-12-01
Currently, the indirect effects of atmospheric aerosols remain the most uncertain components in forcing of climate change over the industrial period (IPCC, 2013). This large uncertainty is partially a result of our incomplete understanding of the ability of particles to form cloud droplets under atmospherically relevant supersaturations. One objective of the US Department of Energy (DOE) Green Ocean Amazon Project (GoAmazon2014/5) is to understand the influence of the emission from Manaus, a tropical megacity, on aerosol size, concentration, and chemical composition, and their impact on the cloud condensation nuclei (CCN) spectrum. The GoAmazon2014/5 study was an international campaign with collaborative efforts from the US, Brazil, and Germany. During the intensive operation period, in the dry season (Sep. 1st - Oct. 10th, 2014), aerosol concentration, size distributions, and CCN spectra, both under pristine conditions and inside the Manaus plume, were characterized in-situ from the DOE Gulfstream-1 (G-1) research aircraft and the German HALO aircraft during 4 coordinated flights on Sep. 9th, Sep. 16th, Sep. 21st and Oct. 1st, 2014. During those four flights, aerosol number concentrations and CCN concentrations at two supersaturations (0.25% and 0.5%) were measured by condensation particle counters (CPCs) and a DMT dual column CCN counter onboard both the G-1 and HALO. Aerosol size distribution was also measured by a Fast Integrated Mobility Spectrometer (FIMS) aboard the G-1 and is compared with the size distribution from the Ultra High Sensitivity Aerosol Spectrometer - Airborne (UHSAS-A, DMT), which was deployed on both the G-1 and the HALO. Good agreement between the aerosol properties measured from the two aircraft has been achieved. The vertical profiles of aerosol size distribution and CCN spectrum will be discussed.
NASA Astrophysics Data System (ADS)
Perez, G. L.; Larour, E. Y.; Halkides, D. J.; Cheng, D. L. C.
2015-12-01
The Virtual Ice Sheet Laboratory (VISL) is a cryosphere outreach effort by scientists at the Jet Propulsion Laboratory (JPL) in Pasadena, CA, Earth and Space Research (ESR) in Seattle, WA, and the University of California at Irvine (UCI), with the goal of providing interactive lessons for K-12 and college level students, while conforming to STEM guidelines. At the core of VISL is the Ice Sheet System Model (ISSM), an open-source project developed jointly at JPL and UCI whose main purpose is to model the evolution of the polar ice caps in Greenland and Antarctica. By using ISSM, VISL students have access to state-of-the-art modeling software that is being used to conduct scientific research by users all over the world. However, providing this functionality is by no means simple. The modeling of ice sheets in response to sea and atmospheric temperatures, among many other possible parameters, requires significant computational resources. Furthermore, this service needs to be responsive and capable of handling burst requests produced by classrooms of students. Cloud computing providers represent a burgeoning industry. With major investments by tech giants like Amazon, Google and Microsoft, it has never been easier or more affordable to deploy computational elements on-demand. This is exactly what VISL needs and ISSM is capable of. Moreover, this is a promising alternative to investing in expensive and rapidly devaluing hardware.
NASA Astrophysics Data System (ADS)
Smith, J. N.; Park, J. H.; Kuang, C.; Bustillos, J. O. V.; Souza, R. A. F. D.; Wiedemann, K. T.; Munger, J. W.; Wofsy, S. C.; Rizzo, L. V.; Artaxo, P.; Martin, S. T.; Seco, R.; Kim, S.; Guenther, A. B.; Batalha, S. S. A.; Alves, E. G.; Tota, J.
2014-12-01
The Amazon rainforest is a unique and important place for studying aerosol formation and its impacts on atmospheric chemistry and climate. In remote areas, the atmosphere is characterized by low particle number concentrations and high humidity; perturbations in the particle number concentrations and climate-relevant physical and chemical properties could therefore have a great impact on cloud formation and thus on regional climate and precipitation. While it was previously believed that new particle formation occurs rarely in the Amazon, observations in the Amazon of a sustained steady-state particle number concentration, along with an abundance of dry and wet surfaces upon which particles may deposit, imply that sources of new particles must exist in this region. We present observations from two studies, GOAmazon2014 and the Tapajos Upwind Forest Flux Study (TUFFS), which seek to identify and quantify the sources of aerosol particles in the Amazon. Measurements of the chemical composition of 20 - 100 nm diameter aerosol particles at the T3 measurement site during the wet and dry season campaigns of GOAmazon2014 show the presence of inorganic ions such as potassium ion and sulfate, as well as organic ions such as oxalate, in ambient nanoparticles. These observations, combined with 1.5 - 300 nm diameter particle number size distributions and trace gas measurements of organic compounds and sulfuric acid, are used to determine the relative importance of sulfuric acid, organic compounds, and primary biological particle emissions to nanoparticle formation and growth. Observations of 3 - 100 nm diameter particle number size distributions at the KM67 tower site during TUFFS show frequent new particle formation events during the wet season in April, transitioning to a scenario of less frequent events in July at the onset of the dry season. These observations highlight the regional nature of new particle formation in the Amazon, and suggest that additional observations at a
GEWEX Cloud System Study (GCSS) Working Group on Cirrus Cloud Systems (WG2)
NASA Technical Reports Server (NTRS)
Starr, David
2002-01-01
Status, progress and plans will be given for current GCSS (GEWEX Cloud System Study) WG2 (Working Group on Cirrus Cloud Systems) projects, including: (a) the Idealized Cirrus Model Comparison Project, (b) the Cirrus Parcel Model Comparison Project (Phase 2), and (c) the developing Hurricane Nora extended outflow model case study project. Past results will be summarized and plans for the upcoming year described. Issues and strategies will be discussed. Prospects for developing improved cloud parameterizations derived from results of GCSS WG2 projects will be assessed. Plans for NASA's CRYSTAL-FACE (Cirrus Regional Study of Tropical Anvils and Layers - Florida Area Cirrus Experiment) and potential opportunities for use of those data for WG2 model simulations (future projects) will be briefly described.
Low cost, high performance processing of single particle cryo-electron microscopy data in the cloud.
Cianfrocco, Michael A; Leschziner, Andres E
2015-05-08
The advent of a new generation of electron microscopes and direct electron detectors has realized the potential of single particle cryo-electron microscopy (cryo-EM) as a technique to generate high-resolution structures. Calculating these structures requires high performance computing clusters, a resource that may be limiting to many likely cryo-EM users. To address this limitation and facilitate the spread of cryo-EM, we developed a publicly available 'off-the-shelf' computing environment on Amazon's elastic cloud computing infrastructure. This environment provides users with single particle cryo-EM software packages and the ability to create computing clusters with 16-480+ CPUs. We tested our computing environment using a publicly available 80S yeast ribosome dataset and estimate that laboratories could determine high-resolution cryo-EM structures for $50 to $1500 per structure within a timeframe comparable to local clusters. Our analysis shows that Amazon's cloud computing environment may offer a viable computing environment for cryo-EM.
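The per-structure cost range quoted above is, at heart, instances x hours x hourly price. A back-of-envelope sketch of that arithmetic, with hypothetical instance counts, runtimes, and prices rather than the paper's measured figures:

```python
# Back-of-envelope cloud cost model: cost = instances * hours * price.
# Instance counts, runtimes, and hourly prices below are hypothetical,
# chosen only to illustrate how a cost range like $50-$1500 arises.
def cluster_cost(n_instances, hours, price_per_instance_hour):
    return n_instances * hours * price_per_instance_hour

small_run = cluster_cost(4, 24, 0.50)   # modest refinement job
large_run = cluster_cost(30, 48, 1.00)  # large classification/refinement job
```

The ability to price a computation up front this way is what makes the speed-versus-economy choice explicit in a pay-per-hour model.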
Automatic cloud coverage assessment of Formosat-2 image
NASA Astrophysics Data System (ADS)
Hsu, Kuo-Hsien
2011-11-01
The Formosat-2 satellite is equipped with a high-spatial-resolution (2 m ground sampling distance) remote sensing instrument. It has been operated on a daily-revisiting mission orbit by the National Space Organization (NSPO) of Taiwan since May 21, 2004. NSPO also serves as one of the ground receiving stations, processing the received Formosat-2 images daily. The current cloud coverage assessment of Formosat-2 images in the NSPO Image Processing System consists of two major steps. First, an unsupervised K-means method automatically estimates the cloud statistics of a Formosat-2 image. Second, cloud coverage is estimated from the image by manual examination. A more accurate Automatic Cloud Coverage Assessment (ACCA) method would clearly increase the efficiency of the second step by providing a good prediction of the cloud statistics. In this paper, based mainly on research results from Chang et al., Irish, and Gotoh, we propose a modified Formosat-2 ACCA method that incorporates pre-processing and post-processing analysis. For pre-processing analysis, cloud statistics are determined using unsupervised K-means classification, Sobel's method, Otsu's method, non-cloudy pixel reexamination, and a cross-band filter method. A box-counting fractal method is used as a post-processing tool to double-check the pre-processing results, increasing the efficiency of manual examination.
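Of the building blocks named above, Otsu's method is easy to sketch in isolation: choose the threshold that maximizes the between-class variance of the gray-level histogram. The synthetic "image" below is two Gaussian populations standing in for dark surface and bright cloud; this covers only the Otsu step, not the full ACCA pipeline:

```python
import numpy as np

# Minimal Otsu threshold: maximize between-class variance of the
# histogram. Synthetic bimodal data stand in for a cloudy scene.
def otsu_threshold(gray, nbins=256):
    hist, edges = np.histogram(gray, bins=nbins, range=(0.0, 1.0))
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)              # probability of class 0 (below threshold)
    m = np.cumsum(p * centers)     # cumulative first moment
    mt = m[-1]                     # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mt * w0 - m) ** 2 / (w0 * (1.0 - w0))
    between[np.isnan(between)] = 0.0
    return centers[np.argmax(between)]

rng = np.random.default_rng(3)
surface = rng.normal(0.2, 0.03, size=5000).clip(0, 1)  # dark ground pixels
cloud = rng.normal(0.8, 0.05, size=1000).clip(0, 1)    # bright cloud pixels
img = np.concatenate([surface, cloud])
t = otsu_threshold(img)
cloud_fraction = (img > t).mean()  # cloud coverage estimate
```

On real imagery the pipeline combines this with K-means, Sobel edges, cross-band filtering, and the box-counting check, since cloud/surface histograms are rarely this cleanly bimodal.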
GATECloud.net: a platform for large-scale, open-source text processing on the cloud.
Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina
2013-01-28
Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research--GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.
Tropical forest response to elevated CO2: Model-experiment integration at the AmazonFACE site.
NASA Astrophysics Data System (ADS)
Frankenberg, C.; Berry, J. A.; Guanter, L.; Joiner, J.
2014-12-01
The terrestrial biosphere's response to current and future elevated atmospheric carbon dioxide (eCO2) is a large source of uncertainty in future projections of the C cycle, climate and ecosystem functioning. In particular, the sensitivity of tropical rainforest ecosystems to eCO2 is largely unknown even though the importance of tropical forests for biodiversity, carbon storage and regional and global climate feedbacks is unambiguously recognized. The AmazonFACE (Free-Air Carbon Enrichment) project will be the first ecosystem scale eCO2 experiment undertaken in the tropics, as well as the first to be undertaken in a mature forest. AmazonFACE provides the opportunity to integrate ecosystem modeling with experimental observations right from the beginning of the experiment, harboring a two-way exchange, i.e. models provide hypotheses to be tested, and observations deliver the crucial data to test and improve ecosystem models. We present preliminary exploration of observed and expected process responses to eCO2 at the AmazonFACE site from the dynamic global vegetation model LPJ-GUESS, highlighting opportunities and pitfalls for model integration of tropical FACE experiments. The preliminary analysis provides baseline hypotheses, which are to be further developed with a follow-up multiple model inter-comparison. The analysis builds on the recently undertaken FACE-MDS (Model-Data Synthesis) project, which was applied to two temperate FACE experiments and exceeds the traditional focus on comparing modeled end-target output. The approach has proven successful in identifying well (and less well) represented processes in models, which are separated for six clusters also here; (1) Carbon fluxes, (2) Carbon pools, (3) Energy balance, (4) Hydrology, (5) Nutrient cycling, and (6) Population dynamics. Simulation performance of observed conditions at the AmazonFACE site (a.o. from Manaus K34 eddy flux tower) will highlight process-based model deficiencies, and aid the separation
Cloud-based uniform ChIP-Seq processing tools for modENCODE and ENCODE.
Trinh, Quang M; Jen, Fei-Yang Arthur; Zhou, Ziru; Chu, Kar Ming; Perry, Marc D; Kephart, Ellen T; Contrino, Sergio; Ruzanov, Peter; Stein, Lincoln D
2013-07-22
Funded by the National Institutes of Health (NIH), the aim of the Model Organism ENCyclopedia of DNA Elements (modENCODE) project is to provide the biological research community with a comprehensive encyclopedia of functional genomic elements for both model organisms C. elegans (worm) and D. melanogaster (fly). With a total size of just under 10 terabytes of data collected and released to the public, one of the challenges faced by researchers is to extract biologically meaningful knowledge from this large data set. While the basic quality control, pre-processing, and analysis of the data has already been performed by members of the modENCODE consortium, many researchers will wish to reinterpret the data set using modifications and enhancements of the original protocols, or to combine modENCODE data with other data sets. Unfortunately, this can be a time-consuming and logistically challenging proposition. In recognition of this challenge, the modENCODE DCC has released uniform computing resources for analyzing modENCODE data on Galaxy (https://github.com/modENCODE-DCC/Galaxy), on the public Amazon Cloud (http://aws.amazon.com), and on the private Bionimbus Cloud for genomic research (http://www.bionimbus.org). In particular, we have released Galaxy workflows for interpreting ChIP-seq data which use the same quality control (QC) and peak-calling standards adopted by the modENCODE and ENCODE communities. For convenience, we have created Amazon and Bionimbus Cloud machine images containing Galaxy along with all the modENCODE data, software and other dependencies. Using these resources provides a framework for running consistent and reproducible analyses on modENCODE data, ultimately allowing researchers to spend more of their time analyzing modENCODE data, and less time moving it around.
NASA Technical Reports Server (NTRS)
Viudez-Mora, Antonio; Kato, Seiji
2015-01-01
This work evaluates the multilayer cloud (MCF) algorithm, based on CO2-slicing techniques, against CALIPSO-CloudSat (CLCS) measurements. The evaluation showed that the MCF underestimates the presence of multilayered clouds compared with CLCS and is restricted to cloud emissivities below 0.8 and cloud optical depths no larger than 0.3.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soto-Garcia, Lydia L.; Andreae, Meinrat O.; Andreae, Tracey W.
2011-06-03
Aerosol samples were collected at a pasture site in the Amazon Basin as part of the project LBA-SMOCC-2002 (Large-Scale Biosphere-Atmosphere Experiment in Amazonia - Smoke Aerosols, Clouds, Rainfall and Climate: Aerosols from Biomass Burning Perturb Global and Regional Climate). Sampling was conducted during the late dry season, when the aerosol composition was dominated by biomass burning emissions, especially in the submicron fraction. A 13-stage Dekati low-pressure impactor (DLPI) was used to collect particles with nominal aerodynamic diameters (D_p) ranging from 0.03 to 10 µm. Gravimetric analyses of the DLPI substrates and filters were performed to obtain aerosol mass concentrations. The concentrations of total, apparent elemental, and organic carbon (TC, EC_a, and OC) were determined using thermal and thermal-optical analysis (TOA) methods. A light transmission method (LTM) was used to determine the concentration of equivalent black carbon (BC_e), or the absorbing fraction at 880 nm, for the size-resolved samples. During the dry period, due to the pervasive presence of fires in the region upwind of the sampling site, concentrations of fine aerosols (D_p < 2.5 µm: average 59.8 µg m^-3) were higher than coarse aerosols (D_p > 2.5 µm: 4.1 µg m^-3). Carbonaceous matter, estimated as the sum of the particulate organic matter (i.e., OC x 1.8) plus BC_e, comprised more than 90% of the total aerosol mass. Concentrations of EC_a (estimated by thermal analysis with a correction for charring) and BC_e (estimated by LTM) averaged 5.2 ± 1.3 and 3.1 ± 0.8 µg m^-3, respectively. The determination of EC was improved by extracting water-soluble organic material from the samples, which reduced the average light absorption Ångström exponent of particles in the size range of 0.1 to 1.0 µm from > 2.0 to approximately 1.2. The size-resolved BC_e measured by the LTM showed a clear maximum between
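The carbonaceous-mass estimate above can be reproduced arithmetically from the reported averages. A back-of-the-envelope sketch, using the 1.8 OC-to-POM conversion factor stated in the abstract; the fine-mode OC value below is a placeholder, since it is not given here:

```python
# Back-of-the-envelope check of the carbonaceous mass fraction reported above.
# All concentrations in ug m^-3.
OC_TO_POM = 1.8  # OC-to-particulate-organic-matter factor stated in the abstract

def carbonaceous_fraction(oc, bc_e, total_mass):
    """Particulate organic matter (OC x 1.8) plus equivalent black carbon,
    as a fraction of the total aerosol mass."""
    pom = OC_TO_POM * oc
    return (pom + bc_e) / total_mass

# Fine-mode total mass 59.8 and BC_e 3.1 are from the abstract;
# oc=29.0 is a hypothetical placeholder (the abstract omits the OC average).
frac = carbonaceous_fraction(oc=29.0, bc_e=3.1, total_mass=59.8)
print(round(frac, 2))  # a value above 0.9, consistent with the ">90%" claim
```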
2D Radiative Processes Near Cloud Edges
NASA Technical Reports Server (NTRS)
Varnai, T.
2012-01-01
Because of the importance and complexity of dynamical, microphysical, and radiative processes taking place near cloud edges, the transition zone between clouds and cloud free air has been the subject of intense research both in the ASR program and in the wider community. One challenge in this research is that the one-dimensional (1D) radiative models widely used in both remote sensing and dynamical simulations become less accurate near cloud edges: The large horizontal gradients in particle concentrations imply that accurate radiative calculations need to consider multi-dimensional radiative interactions among areas that have widely different optical properties. This study examines the way the importance of multidimensional shortwave radiative interactions changes as we approach cloud edges. For this, the study relies on radiative simulations performed for a multiyear dataset of clouds observed over the NSA, SGP, and TWP sites. This dataset is based on Microbase cloud profiles as well as wind measurements and ARM cloud classification products. The study analyzes the way the difference between 1D and 2D simulation results increases near cloud edges. It considers both monochromatic radiances and broadband radiative heating, and it also examines the influence of factors such as cloud type and height, and solar elevation. The results provide insights into the workings of radiative processes and may help better interpret radiance measurements and better estimate the radiative impacts of this critical region.
NGScloud: RNA-seq analysis of non-model species using cloud computing.
Mora-Márquez, Fernando; Vázquez-Poletti, José Luis; López de Heredia, Unai
2018-05-03
RNA-seq analysis usually requires large computing infrastructures. NGScloud is a bioinformatic system developed to analyze RNA-seq data using the cloud computing services of Amazon, which permit access to ad hoc computing infrastructure scaled to the complexity of the experiment, so that its costs and run times can be optimized. The application provides a user-friendly front-end to operate Amazon's hardware resources and to control a workflow of RNA-seq analysis oriented to non-model species, incorporating the cluster concept, which allows parallel runs of common RNA-seq analysis programs in several virtual machines for faster analysis. NGScloud is freely available at https://github.com/GGFHF/NGScloud/. A manual detailing installation and how-to-use instructions is available with the distribution. unai.lopezdeheredia@upm.es.
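As a sketch of the kind of provisioning such a system automates, the snippet below builds the request parameters one might pass to boto3's `run_instances` to start a small analysis cluster. The helper function, AMI ID, and instance type are hypothetical illustrations, not NGScloud's actual API:

```python
# Hypothetical helper: build EC2 launch parameters for an RNA-seq cluster node.
# The AMI ID and instance type below are placeholders, not NGScloud values.
def build_launch_request(ami_id, instance_type, node_count):
    """Return the keyword arguments for boto3's EC2 run_instances call."""
    return {
        "ImageId": ami_id,
        "InstanceType": instance_type,
        "MinCount": node_count,   # launch exactly node_count instances
        "MaxCount": node_count,
    }

params = build_launch_request("ami-0123456789abcdef0", "m5.xlarge", 4)
print(params["MaxCount"])
# with boto3: boto3.client("ec2").run_instances(**params)
```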
NASA Astrophysics Data System (ADS)
Casey, K. S.; Hausman, S. A.
2016-02-01
In the last year, the NOAA National Oceanographic Data Center (NODC) and its siblings, the National Climatic Data Center and National Geophysical Data Center, were merged into one organization, the NOAA National Centers for Environmental Information (NCEI). Combining its expertise under one management has helped NCEI accelerate its efforts to embrace and integrate private, public, and hybrid cloud environments into its range of data stewardship services. These services span a range of tiers, from basic long-term preservation and access, through enhanced access and scientific quality control, to authoritative product development and international-level services. Throughout these tiers of stewardship, partnerships and pilot projects have been launched to identify technological and policy-oriented challenges, to establish solutions to these problems, and to highlight success stories for emulation during operational integration of the cloud into NCEI's data stewardship activities. Some of these pilot activities include data storage, access, and reprocessing in Amazon Web Services; the OneStop data discovery and access framework project; and a set of Cooperative Research and Development Agreements under the Big Data Project with Amazon, Google, IBM, Microsoft, and the Open Cloud Consortium. Progress in these efforts will be highlighted along with a future vision of how NCEI could leverage hybrid cloud deployments and federated systems across NOAA to enable effective data stewardship for its oceanographic, atmospheric, climatic, and geophysical Big Data.
Cao, Cen; Huang, Ying; Tang, Qingming; Zhang, Chenguang; Shi, Lei; Zhao, Jiajia; Hu, Li; Hu, Zhewen; Liu, Yun; Chen, Lili
2018-07-01
Co-transplantation of endothelial cells (ECs) and mesenchymal stem cells (MSCs) is an important strategy for repairing complex and large bone defects. However, the ways in which ECs and MSCs interact remain to be fully clarified. We found that forward ephrinB2/Ephs signaling from hBMSCs to hUVECs promoted the tube formation of hUVECs by activating the PI3K/AKT/mTOR pathway. Reverse ephrinB2/Ephs signaling from hUVECs to hBMSCs promoted the proliferation and maintenance of hBMSCs self-renewal via upregulation of OCT4, SOX2, and YAP1. Subcutaneous co-transplantation of ECs and MSCs in nude mice confirmed that forward ephrinB2/Ephs signaling could increase the cross-sectional area of blood vessels in the transplanted area, and reverse ephrinB2/Ephs signaling could maintain the self-renewal of transplanted hBMSCs in vivo. Based on these results, ephrinB2/Ephs bidirectional juxtacrine regulation between ECs and MSCs plays a pivotal role in improving the healing of bone defects by promoting angiogenesis and achieving a sufficient number of MSCs. Copyright © 2018 Elsevier Ltd. All rights reserved.
Abstracting application deployment on Cloud infrastructures
NASA Astrophysics Data System (ADS)
Aiftimiei, D. C.; Fattibene, E.; Gargana, R.; Panella, M.; Salomoni, D.
2017-10-01
Deploying a complex application on a Cloud-based infrastructure can be a challenging task. In this contribution we present an approach for Cloud-based deployment of applications and its present or future implementation in the framework of several projects, such as “!CHAOS: a cloud of controls” [1], a project funded by MIUR (Italian Ministry of Research and Education) to create a Cloud-based deployment of a control system and data acquisition framework; “INDIGO-DataCloud” [2], an EC H2020 project targeting, among other things, high-level deployment of applications on hybrid Clouds; and “Open City Platform” [3], an Italian project aiming to provide open Cloud solutions for Italian Public Administrations. We chose to use an orchestration service to hide the complex deployment of the application components, and to build an abstraction layer on top of the orchestration one. Through the Heat [4] orchestration service, we prototyped a dynamic, on-demand, scalable platform of software components, based on OpenStack infrastructures. On top of the orchestration service we developed a prototype of a web interface exploiting the Heat APIs. The user can start an instance of the application without any knowledge of the underlying Cloud infrastructure and services. Moreover, the platform instance can be customized by choosing parameters related to the application, such as the size of a file system or the number of instances in a NoSQL DB cluster. As soon as the desired platform is running, the web interface offers the possibility to scale some infrastructure components. In this contribution we describe the solution design and implementation, based on the application requirements, the details of the development of both the Heat templates and the web interface, together with possible exploitation strategies of this work in Cloud data centers.
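A minimal Heat template of the kind the abstract describes might look like the fragment below. The resource names, image, and flavor are illustrative placeholders, not taken from the cited projects; the template simply exposes the two user-facing parameters the abstract mentions (a volume size and a cluster instance count):

```yaml
# Illustrative Heat orchestration template: a parameterized group of servers.
heat_template_version: 2016-10-14

parameters:
  instance_count:
    type: number
    default: 1
    description: number of NoSQL DB cluster instances (example parameter)
  volume_size:
    type: number
    default: 10
    description: size in GB of the attached file system volume (example parameter)

resources:
  db_group:
    type: OS::Heat::ResourceGroup
    properties:
      count: { get_param: instance_count }
      resource_def:
        type: OS::Nova::Server
        properties:
          image: example-image   # placeholder image name
          flavor: m1.small       # placeholder flavor
```

A web front-end like the one prototyped in the paper would collect `instance_count` and `volume_size` from the user and pass them to the Heat API when creating the stack.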
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giangrande, Scott E.; Feng, Zhe; Jensen, Michael P.
Routine cloud, precipitation and thermodynamic observations collected by the Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF) and Aerial Facility (AAF) during the 2-year US Department of Energy (DOE) ARM Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) campaign are summarized. These observations quantify the diurnal to large-scale thermodynamic regime controls on the clouds and precipitation over the undersampled, climatically important Amazon basin region. The extended ground deployment of cloud-profiling instrumentation enabled a unique look at multiple cloud regimes at high temporal and vertical resolution. This longer-term ground deployment, coupled with two short-term aircraft intensive observing periods, allowed new opportunities to better characterize cloud and thermodynamic observational constraints as well as cloud radiative impacts for modeling efforts within typical Amazon wet and dry seasons.
Measuring ECS Interaction with Biomembranes.
Angelucci, Clotilde B; Sabatucci, Annalaura; Dainese, Enrico
2016-01-01
Understanding the correct interaction among the different components of the endocannabinoid system (ECS) is fundamental for a proper assessment of the function of endocannabinoids (eCBs) as signaling molecules. Knowing how the membrane environment modulates the intracellular trafficking of eCBs and their interacting proteins holds huge potential for unraveling new mechanisms of ECS modulation. Here, the fluorescence resonance energy transfer (FRET) technique is applied to measure the binding affinity of ECS proteins to model membranes (i.e., large unilamellar vesicles, LUVs). In particular, we describe in detail the paradigmatic example of the interaction of recombinant rat FAAH-ΔTM with LUVs constituted by 1-palmitoyl-2-oleoyl-sn-glycero-3-phosphocholine (POPC).
NASA Technical Reports Server (NTRS)
Randall, David A.; Fowler, Laura D.; Lin, Xin
1998-01-01
In order to improve our understanding of the interactions between clouds, radiation, and the hydrological cycle simulated in the Colorado State University General Circulation Model (CSU GCM), we focused our research on the analysis of the diurnal cycle of precipitation, top-of-the-atmosphere and surface radiation budgets, and cloudiness, using 10-year-long Atmospheric Model Intercomparison Project (AMIP) simulations. Comparisons of the simulated diurnal cycle were made against the diurnal cycle of Earth Radiation Budget Experiment (ERBE) radiation budget and International Satellite Cloud Climatology Project (ISCCP) cloud products. This report summarizes our major findings over the Amazon Basin.
Amazon River investigations, reconnaissance measurements of July 1963
Oltman, Roy Edwin; Sternberg, H. O'R.; Ames, F.C.; Davis, L.C.
1964-01-01
The first measurements of the flow of the Amazon River were made in July 1963 as a joint project of the University of Brazil, the Brazilian Navy, and the U.S. Geological Survey. The discharge of the Amazon River at Obidos was 7,640,000 cfs at an annual flood stage somewhat lower than the average. For comparison the maximum known discharge of the Mississippi River at Vicksburg is about 2,300,000 cfs. Dissolved-solids concentrations and sediment loads of the Amazon River and of several major tributaries were found to be low.
NASA Astrophysics Data System (ADS)
Viudez-Mora, A.; Kato, S.; Smith, W. L., Jr.; Chang, F. L.
2016-12-01
Knowledge of the vertical cloud distribution is important for a variety of climate and weather applications. Cloud overlapping variations greatly influence atmospheric heating/cooling rates, with implications for the surface-troposphere radiative balance, global circulation and precipitation. Additionally, accurate knowledge of the multi-layer cloud distribution in real time can be used in applications such as assessing safety conditions for aviation through storms and adverse weather. In this study, we evaluate a multi-layered cloud algorithm (Chang et al. 2005) based on MODIS measurements aboard the Aqua satellite (MCF). This algorithm uses the CO2-slicing technique combined with cloud properties determined from VIS, IR and NIR channels to locate high thin clouds over low-level clouds, and retrieves the τ of each layer. We use CALIPSO (Winker et al., 2010) and CloudSat (Stephens et al., 2002) (CLCS) derived cloud vertical profiles included in the C3M data product (Kato et al. 2010) to evaluate MCF-derived multi-layer cloud properties. We focus on 2-layer overlapping and 1-layer clouds identified by the active sensors and investigate how well these systems are identified by the MODIS multi-layer technique. The results show that for multi-layered clouds identified by CLCS, the MCF correctly identifies about 83% of the cases as multi-layer. However, the upper CTH is underestimated by about 2.6±0.4 km, because the CO2-slicing technique is not as sensitive to the cloud physical top as the CLCS. The lower CTH agrees better, with differences of about 1.2±0.5 km. Another outstanding issue for the MCF approach is the large number of multi-layer false alarms that occur in single-layer conditions. References: Chang, F.-L., and Z. Li, 2005: A new method for detection of cirrus overlapping water clouds and determination of their optical properties. J. Atmos. Sci., 62. Kato, S., et al. (2010), Relationships among cloud occurrence frequency
Amazon Forests Response to Droughts: A Perspective from the MAIAC Product
NASA Technical Reports Server (NTRS)
Bi, Jian; Myneni, Ranga; Lyapustin, Alexei; Wang, Yujie; Park, Taejin; Chi, Chen; Yan, Kai; Knyazikhin, Yuri
2016-01-01
Amazon forests experienced two severe droughts at the beginning of the 21st century: one in 2005 and the other in 2010. How Amazon forests responded to these droughts is critical for the future of the Earth's climate system. It is only possible to assess Amazon forests' response to the droughts in large areal extent through satellite remote sensing. Here, we used the Multi-Angle Implementation of Atmospheric Correction (MAIAC) Moderate Resolution Imaging Spectroradiometer (MODIS) vegetation index (VI) data to assess Amazon forests' response to droughts, and compared the results with those from the standard (Collection 5 and Collection 6) MODIS VI data. Overall, the MAIAC data reveal more realistic Amazon forests inter-annual greenness dynamics than the standard MODIS data. Our results from the MAIAC data suggest that: (1) the droughts decreased the greenness (i.e., photosynthetic activity) of Amazon forests; (2) the Amazon wet season precipitation reduction induced by El Niño events could also lead to reduced photosynthetic activity of Amazon forests; and (3) in the subsequent year after the water stresses, the greenness of Amazon forests recovered from the preceding decreases. However, as previous research shows droughts cause Amazon forests to reduce investment in tissue maintenance and defense, it is not clear whether the photosynthesis of Amazon forests will continue to recover after future water stresses, because of the accumulated damages caused by the droughts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marathe, Aniruddha P.; Harris, Rachel A.; Lowenthal, David K.
The use of clouds to execute high-performance computing (HPC) applications has greatly increased recently. Clouds provide several potential advantages over traditional supercomputers and in-house clusters. The most popular cloud is currently Amazon EC2, which provides fixed-cost and variable-cost, auction-based options. The auction market trades lower cost for potential interruptions that necessitate checkpointing; if the market price exceeds the bid price, a node is taken away from the user without warning. We explore techniques to maximize performance per dollar given a time constraint within which an application must complete. Specifically, we design and implement multiple techniques to reduce expected cost by exploiting redundancy in the EC2 auction market. We then design an adaptive algorithm that selects a scheduling algorithm and determines the bid price. We show that our adaptive algorithm executes programs up to seven times cheaper than using the on-demand market and up to 44 percent cheaper than the best non-redundant, auction-market algorithm. We extend our adaptive algorithm to incorporate application scalability characteristics for further cost savings. In conclusion, we show that the adaptive algorithm informed with scalability characteristics of applications achieves up to 56 percent cost savings compared to the expected cost for the base adaptive algorithm run at a fixed, user-defined scale.
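The redundancy idea above can be illustrated with a toy cost model (an illustrative model of our own, not the paper's algorithm): launch k redundant spot instances, each surviving the hour with probability p; the work is lost only if all k are revoked, in which case the hour is redone on an on-demand node.

```python
# Toy expected-cost model for redundant spot (auction-market) execution.
# Assumptions (not from the paper): each of k spot nodes independently
# survives the hour with probability p; if all are revoked, the hour is
# redone at the on-demand price.
def expected_cost(k, p, spot_price, ondemand_price):
    """Expected cost of one instance-hour of useful work with k redundant spot nodes."""
    p_all_revoked = (1.0 - p) ** k
    return k * spot_price + p_all_revoked * ondemand_price

# One spot node vs. two redundant ones, spot priced at 1/10 of on-demand:
single = expected_cost(k=1, p=0.5, spot_price=0.10, ondemand_price=1.0)
double = expected_cost(k=2, p=0.5, spot_price=0.10, ondemand_price=1.0)
print(round(single, 3), round(double, 3))  # redundancy wins here
```

With these illustrative prices, doubling up costs more in spot fees but sharply cuts the chance of paying the on-demand fallback, lowering the expected total.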
NASA Astrophysics Data System (ADS)
Asencio-Cortés, G.; Morales-Esteban, A.; Shang, X.; Martínez-Álvarez, F.
2018-06-01
Earthquake magnitude prediction is a challenging problem that has been widely studied during the last decades. Statistical, geophysical and machine learning approaches can be found in the literature, with no particularly satisfactory results. In recent years, powerful computational techniques to analyze big data have emerged, making possible the analysis of massive datasets. These new methods make use of physical resources such as cloud-based architectures. California is known for being one of the regions with the highest seismic activity in the world, and many data are available. In this work, the use of several regression algorithms combined with ensemble learning is explored in the context of big data (a 1 GB catalog is used), in order to predict earthquake magnitudes within the next seven days. The Apache Spark framework, the H2O library in the R language and Amazon cloud infrastructure were used, reporting very promising results.
An observational search for CO2 ice clouds on Mars
NASA Technical Reports Server (NTRS)
Bell, James F., III; Calvin, Wendy M.; Pollack, James B.; Crisp, David
1993-01-01
CO2 ice clouds were first directly identified on Mars by the Mariner 6 and 7 infrared spectrometer limb scans. These observations provided support for early theoretical modeling efforts of CO2 condensation. Mariner 9 IRIS temperature profiles of north polar hood clouds were interpreted as indicating that these clouds were composed of H2O ice at lower latitudes and CO2 ice at higher latitudes. The role of CO2 condensation on Mars has recently received increased attention because (1) Kasting's model results indicated that CO2 cloud condensation limits the magnitude of the proposed early Mars CO2/H2O greenhouse, and (2) Pollack et al.'s GCM results indicated that the formation of CO2 ice clouds is favorable at all polar latitudes during the fall and winter seasons. These latter authors have shown that CO2 clouds play an important role in the polar energy balance, as the amount of CO2 contained in the polar caps is constrained by a balance between latent heat release, heat advected from lower latitudes, and thermal emission to space. The polar hood clouds reduce the amount of CO2 condensation on the polar caps because they reduce the net emission to space. There have been many extensive laboratory spectroscopic studies of H2O and CO2 ices and frosts. In this study, we use results from these and other sources to search for the occurrence of diagnostic CO2 (and H2O) ice and/or frost absorption features in ground-based near-infrared imaging spectroscopic data of Mars. Our primary goals are (1) to try to confirm the previous direct observations of CO2 clouds on Mars; (2) to determine the spatial extent, temporal variability, and composition (H2O/CO2 ratio) of any clouds detected; and (3) through radiative transfer modeling, to try to determine the mean particle size and optical depth of polar hood clouds, thus assessing their role in the polar heat budget.
Carbon Emissions from Deforestation in the Brazilian Amazon Region
NASA Technical Reports Server (NTRS)
Potter, C.; Klooster, S.; Genovese, V.
2009-01-01
A simulation model based on satellite observations of monthly vegetation greenness from the Moderate Resolution Imaging Spectroradiometer (MODIS) was used to estimate monthly carbon fluxes in terrestrial ecosystems of the Brazilian Amazon and Cerrado regions over the period 2000-2002. The NASA-CASA (Carnegie Ames Stanford Approach) model estimates of annual forest production were used for the first time as the basis to generate a prediction for the standing pool of carbon in above-ground biomass (AGB; gC/sq m) for forested areas of the Brazilian Amazon region. Plot-level measurements of the residence time of carbon in wood in Amazon forest from Malhi et al. (2006) were interpolated by inverse distance weighting algorithms and used with CASA to generate a new regional map of AGB. Data from the Brazilian PRODES (Estimativa do Desflorestamento da Amazonia) project were used to map deforested areas. Results show that net primary production (NPP) sinks for carbon varied between 4.25 and 4.34 Pg C/yr (1 Pg = 10^15 g) for the region and were highest across the eastern and northern Amazon areas, whereas deforestation sources of CO2 flux from decomposition of residual woody debris were higher and less seasonal in the central Amazon than in the eastern and southern areas. Increased woody debris from past deforestation events was predicted to alter the net ecosystem carbon balance of the Amazon region to generate annual CO2 source fluxes at least two times higher than previously predicted by CASA modeling studies. Variations in climate, land cover, and forest burning were predicted to release carbon at rates of 0.5 to 1 Pg C/yr from the Brazilian Amazon. When direct deforestation emissions of CO2 from forest burning of between 0.2 and 0.6 Pg C/yr in the Legal Amazon are overlooked in regional budgets, the year-to-year variations in this net biome flux may appear to be large, whereas our model results imply that net biome fluxes had actually been relatively consistent from
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Minnis, P.; Spangenberg, D.; Ayers, J. K.; Palikonda, R.; Vakhnin, A.; Dubois, R.; Murphy, P. R.
2014-12-01
The processing, storage and dissemination of satellite cloud and radiation products produced at NASA Langley Research Center are key activities for the Climate Science Branch. A constellation of systems operates in sync to accomplish these goals. Because of the complexity involved with operating such intricate systems, there are both high failure rates and high costs for hardware and system maintenance. Cloud computing has the potential to ameliorate cost and complexity issues. Over time, the cloud computing model has evolved and hybrid systems comprising off-site as well as on-site resources are now common. Towards our mission of providing the highest quality research products to the widest audience, we have explored the use of the Amazon Web Services (AWS) Cloud and Storage and present a case study of our results and efforts. This project builds upon NASA Langley Cloud and Radiation Group's experience with operating large and complex computing infrastructures in a reliable and cost effective manner to explore novel ways to leverage cloud computing resources in the atmospheric science environment. Our case study presents the project requirements and then examines the fit of AWS with the LaRC computing model. We also discuss the evaluation metrics, feasibility, and outcomes and close the case study with the lessons we learned that would apply to others interested in exploring the implementation of the AWS system in their own atmospheric science computing environments.
Flooding dynamics on the lower Amazon floodplain
NASA Astrophysics Data System (ADS)
Rudorff, C.; Melack, J. M.; Bates, P. D.
2013-05-01
We analyzed flooding dynamics of a large floodplain lake in the lower reach of the Amazon River for the period 1995 through 2010. Floodplain inundation was simulated using the LISFLOOD-FP model, which combines one-dimensional river routing with two-dimensional overland flow, and a local hydrological model. Accurate representation of floodplain flows and inundation extent depends on the quality of the digital elevation model (DEM). We combined digital topography (derived from the Shuttle Radar Topography Mission) with extensive floodplain echo-sounding data to generate a hydraulically sound DEM. Analysis of daily water balances revealed that the dominant source of inflow alternated seasonally among direct rain and local runoff (October through January), the Amazon River (March through August), and seepage (September). As inflows from the Amazon River increase during the rising limb of the hydrograph, regional floodwaters encounter the floodplain partially inundated from local hydrological inputs. At peak flow the floodplain routes, on average, 2.5% of the total discharge for this reach. The falling limb of the hydrograph coincides with the locally dry period, allowing seepage of water stored in sediments to become a dominant source. The average annual inflow from the Amazon River was 58.8 km3 (SD = 33.5), representing more than three quarters (80%) of inputs from all sources, with substantial inter-annual variability. The average annual net export of water from the floodplain to the Amazon River was 7.9 km3 (SD = 2.7).
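The reported shares imply a simple consistency check on the overall water balance, using only the abstract's numbers:

```python
# Arithmetic check on the floodplain water balance reported above.
amazon_inflow_km3 = 58.8   # average annual inflow from the Amazon River
amazon_share = 0.80        # its stated share of inputs from all sources
net_export_km3 = 7.9       # average annual net export back to the river

total_inflow_km3 = amazon_inflow_km3 / amazon_share        # all sources combined
other_sources_km3 = total_inflow_km3 - amazon_inflow_km3   # rain, runoff, seepage
print(round(total_inflow_km3, 1), round(other_sources_km3, 1))  # 73.5 14.7
```

So local rain, runoff, and seepage together supply roughly 15 km3 per year, about twice the 7.9 km3 net export.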
Cloud-based adaptive exon prediction for DNA analysis
Putluri, Srinivasareddy; Zia Ur Rahman, Md; Fathima, Shaik Yasmeen
2018-01-01
Cloud computing offers significant research and economic benefits to healthcare organisations. Cloud services provide a safe place for storing and managing large amounts of sensitive data. Under the conventional flow of gene information, gene sequence laboratories send out raw and inferred information via the Internet to several sequence libraries. DNA sequencing storage costs can be minimised by use of a cloud service. In this study, the authors put forward a novel genomic informatics system using Amazon Cloud Services, where genomic sequence information is stored and accessed for processing. True identification of exon regions in a DNA sequence is a key task in bioinformatics, which helps in disease identification and drug design. The three-base periodicity property of exons forms the basis of all exon identification techniques. Adaptive signal processing techniques were found to be promising in comparison with several other methods. Several adaptive exon predictors (AEPs) are developed using variable normalised least mean square and its maximum normalised variants to reduce computational complexity. Finally, performance evaluation of the various AEPs is done based on measures such as sensitivity, specificity and precision using various standard genomic datasets taken from the National Center for Biotechnology Information genomic sequence database. PMID:29515813
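The three-base periodicity named above is conventionally measured as the DFT power of a sequence's base-indicator signals at frequency N/3. A minimal sketch of that measure (illustrative background, not one of the paper's adaptive predictors):

```python
# Period-3 spectral power of a DNA sequence: the classic three-base-periodicity
# measure underlying exon detection (illustrative; not the paper's AEPs).
import cmath

def period3_power(seq):
    """Sum over bases of |DFT of the base-indicator signal at k = N/3|^2."""
    n = len(seq)
    assert n % 3 == 0, "use a length divisible by 3"
    k = n // 3
    total = 0.0
    for base in "ACGT":
        # DFT coefficient at k of the 0/1 indicator sequence for this base
        x = sum(cmath.exp(-2j * cmath.pi * k * i / n)
                for i, b in enumerate(seq) if b == base)
        total += abs(x) ** 2
    return total

periodic = period3_power("ATG" * 20)   # strongly 3-periodic (codon-like)
flat = period3_power("A" * 60)         # no 3-periodic structure
print(periodic > flat)                 # prints True
```

Exon regions tend to show a pronounced peak at N/3; the paper's adaptive predictors exploit the same property while reducing computational cost.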
Cloud-based NEXRAD Data Processing and Analysis for Hydrologic Applications
NASA Astrophysics Data System (ADS)
Seo, B. C.; Demir, I.; Keem, M.; Goska, R.; Weber, J.; Krajewski, W. F.
2016-12-01
The real-time and full historical archive of NEXRAD Level II data, covering the entire United States from 1991 to present, recently became available on Amazon cloud S3. This provides a new opportunity to rebuild the Hydro-NEXRAD software system, which enabled users to access vast amounts of NEXRAD radar data in support of a wide range of research. The system processes basic radar data (Level II) and delivers radar-rainfall products based on the user's custom selection of features such as space and time domain, river basin, rainfall product space and time resolution, and rainfall estimation algorithms. The new cloud-based system can eliminate challenges previously faced by Hydro-NEXRAD in data acquisition and processing: (1) temporal and spatial limitations arising from limited data storage; (2) archive (past) data ingestion and format conversion; and (3) separate data processing flows for past and real-time Level II data. To enhance massive data processing and computational efficiency, the new system is implemented and tested for the Iowa domain. This pilot study begins by ingesting rainfall metadata and implementing Hydro-NEXRAD capabilities on the cloud using the new polarimetric features, as well as the existing algorithm modules and scripts. The authors address the reliability and feasibility of cloud computation and processing, followed by an assessment of response times from an interactive web-based system.
NASA Astrophysics Data System (ADS)
Holanda, Bruna; Pöhlker, Mira; Klimach, Thomas; Saturno, Jorge; Ditas, Florian; Ditas, Jeannine; Ma, Nan; Zhang, Yuxuan; Cheng, Yafang; Wendisch, Manfred; Machado, Luiz; Barbosa, Henrique; Pöhlker, Christopher; Artaxo, Paulo; Pöschl, Ulrich; Andreae, Meinrat
2017-04-01
Black carbon (BC) particles are emitted directly into the atmosphere by processes of incomplete combustion and can therefore be used as a tracer of atmospheric pollution. BC is considered one of the drivers of global warming due to its efficient absorption of solar and infrared radiation (Bond et al., 2013). Depending on abundance and size, aerosols can also modify the characteristics of clouds and enhance or suppress precipitation (Pöschl et al., 2010). BC particles can gain surface coatings by condensation of low- and semi-volatile compounds, coagulation, and cloud processing. The inclusion of a non-absorbing coating influences the way BC particles act as cloud nuclei and may increase their absorption through the lensing effect (Fuller et al., 1999). These aging processes significantly change the optical, chemical and physical properties of the particles, as well as their atmospheric lifetime, making BC a source of large uncertainties in current atmospheric models. Taking into account the complex dynamics of BC particles in the atmosphere, we are analyzing data from the ACRIDICON-CHUVA aircraft campaign, which took place in the Amazon basin, Brazil, during the dry season of 2014 (Wendisch et al., 2016). A detailed characterization of BC particles was done using the Single Particle Soot Photometer (SP2) instrument, which directly measures the mass of individual refractory BC particles (rBC). Additionally, the SP2 provides information about the size distribution of rBC cores and their associated coatings. These properties were measured covering a wide geographic area with different pollution conditions and at several levels of the atmosphere at high time resolution. The rBC concentrations change significantly with altitude and with the source of pollution, being a few nanograms per cubic meter at altitudes higher than 5 km. In the surroundings of Manaus city, the mean BC concentration was 0.7 μg/m3, with core sizes peaking at 180 nm. The highest BC mass
Hydrogeochemistry of the Overland Flow in Soil at Agroecosystems in Eastern Amazon
NASA Astrophysics Data System (ADS)
Costa, C. F. G. D.; Figueiredo, R. O.; Oliveira, F. D. A.
2014-12-01
In the watershed of the Timboteua and Buiuna streams, northeast Pará state, Amazon, dissolved material in overland flow was characterized using several hydrogeochemical variables: electrical conductivity (EC), pH, chloride (Cl-), nitrate (NO3-), phosphate (PO43-), and sulfate (SO42-). On two smallholder properties, three overland flow experimental plots (1 m2) were placed in each of the six evaluated ecosystems under similar biophysical conditions, totaling 18 plots. Three rainwater collectors and two rain gauges were also installed in a nearby area. During the rainy season, 234 samples of rainwater and overland flow were collected. Evaluation of the measured variables enabled hydrogeochemical characterization of overland flow over soil under chop-and-mulch and slash-and-burn practices in the different ecosystems found in the family agriculture of this watershed, and some distinct hydrogeochemical characteristics of the overland flow were identified. The lowest losses of NO3- (range 0.07 to 2.57 μM) were found in the chop-and-mulch agroecosystem; this nutrient showed higher values in the slash-and-burn agroecosystem (RQ). In the RQ agroecosystem there were initially high values of PO43- (8.87 μM) and EC (121 μS cm-1), followed by a sharp decline. A 20-year secondary successional forest (CP) presented overland flow with pH 4.8 and EC 25 μS cm-1 (6-month averages), low losses of NO3- (0.2 μM) and PO43- (0.05 μM), and a large range of variation in SO42- (0.7 to 21.5 μM). While Cl- and SO42- overland flow concentrations were affected by rainfall variation, increases in NO3- and PO43- concentrations were more related to ecosystem management, with the first element responding to the presence of nitrogen-fixing species and the second to burning practices. In summary, this study effectively characterized the hydrogeochemistry of overland flow and its relation to ecosystems altered by Amazonian family farming.
NASA Astrophysics Data System (ADS)
Nelson, R. R.; Taylor, T.; O'Dell, C.; Cronk, H. Q.; Partain, P.; Frankenberg, C.; Eldering, A.; Crisp, D.; Gunson, M. R.; Chang, A.; Fisher, B.; Osterman, G. B.; Pollock, H. R.; Savtchenko, A.; Rosenthal, E. J.
2015-12-01
Effective cloud and aerosol screening is critically important to the Orbiting Carbon Observatory-2 (OCO-2), which can accurately determine the column-averaged dry air mole fraction of carbon dioxide (XCO2) only when scenes are sufficiently clear of scattering material. It is crucial to avoid sampling biases in order to maintain a globally unbiased XCO2 record for inversion modeling to determine sources and sinks of carbon dioxide. This work presents analysis from the current operational B7 data set, which identifies as clear approximately 20% of the order of one million daily soundings. Of those soundings that are passed to the L2 retrieval algorithm, we find that almost 80% yield XCO2 estimates that converge. Two primary preprocessor algorithms are used to cloud screen the OCO-2 soundings. The A-Band Preprocessor (ABP) uses measurements in the Oxygen-A band near 0.76 microns (μm) to determine scenes with large photon path length modifications due to scattering by aerosol and clouds. The Iterative Maximum A-Posteriori (IMAP) Differential Optical Absorption Spectroscopy (DOAS) algorithm (IDP) computes ratios of retrieved CO2 (and H2O) in the 1.6 μm (weak CO2) and 2.0 μm (strong CO2) spectral bands to determine scenes with spectral differences indicating contamination by scattering materials. We demonstrate that applying these two algorithms in tandem provides robust cloud screening of the OCO-2 data set. We compare the OCO-2 cloud screening results to collocated Moderate Resolution Imaging Spectroradiometer (MODIS) cloud mask data and show that agreement between the two sensors is approximately 85-90%. A detailed statistical analysis is performed on a winter and a spring 16-day repeat cycle for the nadir-land, glint-land and glint-water viewing geometries. No strong seasonal, spatial or footprint dependencies are found, although the agreement tends to be worse at high solar zenith angles and for snow- and ice-covered surfaces.
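The 85-90% sensor agreement quoted above is simply the fraction of collocated soundings whose cloud flags match. A minimal sketch (toy flags, not OCO-2/MODIS data):

```python
import numpy as np

def screening_agreement(mask_a, mask_b):
    """Fraction of collocated soundings where two boolean cloud
    screens (True = flagged cloudy) give the same answer."""
    mask_a = np.asarray(mask_a, dtype=bool)
    mask_b = np.asarray(mask_b, dtype=bool)
    return np.mean(mask_a == mask_b)

# Toy example: 8 of 10 collocated flags agree
a = [1, 1, 0, 0, 1, 0, 1, 0, 1, 1]
b = [1, 0, 0, 0, 1, 0, 1, 1, 1, 1]
print(screening_agreement(a, b))  # 0.8
```

A fuller comparison would break the disagreements into the two asymmetric cases (one screen cloudy, the other clear), since those biases matter differently for XCO2 sampling.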
The future of the Amazon: new perspectives from climate, ecosystem and social sciences.
Betts, Richard A; Malhi, Yadvinder; Roberts, J Timmons
2008-05-27
The potential loss or large-scale degradation of the tropical rainforests has become one of the iconic images of the impacts of twenty-first century environmental change and may be one of our century's most profound legacies. In the Amazon region, the direct threat of deforestation and degradation is now strongly intertwined with an indirect challenge we are just beginning to understand: the possibility of substantial regional drought driven by global climate change. The Amazon region hosts more than half of the world's remaining tropical forests, and some parts have among the greatest concentrations of biodiversity found anywhere on Earth. Overall, the region is estimated to host about a quarter of all global biodiversity. It acts as one of the major 'flywheels' of global climate, transpiring water and generating clouds, affecting atmospheric circulation across continents and hemispheres, and storing substantial reserves of biomass and soil carbon. Hence, the ongoing degradation of Amazonia is a threat to local climate stability and a contributor to the global atmospheric climate change crisis. Conversely, the stabilization of Amazonian deforestation and degradation would be an opportunity for local adaptation to climate change, as well as a potential global contributor towards mitigation of climate change. However, addressing deforestation in the Amazon raises substantial challenges in policy, governance, sustainability and economic science. This paper introduces a theme issue dedicated to a multidisciplinary analysis of these challenges.
Ketone EC50 values in the Microtox test.
Chen, H F; Hee, S S
1995-03-01
The Microtox EC50 values for the following ketones are reported in the following homologous series: straight-chain methyl ketones (acetone, 2-butanone, 2-pentanone, 2-heptanone, 2-octanone, 2-decanone, and 2-tridecanone); methyl ketones substituted at one alpha carbon (3-methyl-2-butanone; 3,3-dimethyl-2-butanone); methyl ketones substituted at two alpha carbons (2,4-dimethyl-3-pentanone; 2,2,4,4-tetramethyl-3-pentanone); phenyl groups replacing methyl in acetone (acetophenone; benzophenone); methyl groups substituted at the alpha carbons of cyclohexanone; and 2,3-, 2,4-, and 2,5-hexanediones, most for the first time. While there were linear relationships between log EC50 and MW for the straight-chain methyl ketones, and for methyl substitution at the alpha carbon of methyl ketones, there were no other linear relationships. As molecular weight increased, the EC50 values of soluble ketones decreased; as the distance between two carbonyl groups decreased, so too did EC50 values. Thus, for the ketones, the geometry around the carbonyl group is an important determinant of toxicity, as well as MW, water solubility, and the octanol/water partition coefficient.
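The reported linear relation between log EC50 and MW for the straight-chain series can be sketched with an ordinary least-squares fit. The (MW, EC50) pairs below are illustrative placeholders, not the paper's measurements:

```python
import numpy as np

# Hypothetical (MW, EC50) pairs for a homologous methyl ketone series --
# illustrative values only, not the Microtox data from the paper.
mw   = np.array([58.1, 72.1, 86.1, 114.2, 128.2, 156.3])   # g/mol
ec50 = np.array([2.0e4, 8.0e3, 3.2e3, 5.1e2, 2.0e2, 3.3e1])  # mg/L

# Fit log10(EC50) = a*MW + b; a negative slope reproduces the trend
# that toxicity increases (EC50 falls) as molecular weight rises.
a, b = np.polyfit(mw, np.log10(ec50), 1)
print(f"slope = {a:.4f} log10(mg/L) per g/mol")
```

A roughly constant log-decrement per CH2 unit is exactly what a straight line in this plot expresses; the paper's point is that branching and diketone geometry break that linearity.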
Low cost, high performance processing of single particle cryo-electron microscopy data in the cloud
Cianfrocco, Michael A; Leschziner, Andres E
2015-01-01
The advent of a new generation of electron microscopes and direct electron detectors has realized the potential of single particle cryo-electron microscopy (cryo-EM) as a technique to generate high-resolution structures. Calculating these structures requires high performance computing clusters, a resource that may be limiting to many likely cryo-EM users. To address this limitation and facilitate the spread of cryo-EM, we developed a publicly available ‘off-the-shelf’ computing environment on Amazon's elastic cloud computing infrastructure. This environment provides users with single particle cryo-EM software packages and the ability to create computing clusters with 16–480+ CPUs. We tested our computing environment using a publicly available 80S yeast ribosome dataset and estimate that laboratories could determine high-resolution cryo-EM structures for $50 to $1500 per structure within a timeframe comparable to local clusters. Our analysis shows that Amazon's cloud computing environment may offer a viable computing environment for cryo-EM. DOI: http://dx.doi.org/10.7554/eLife.06664.001 PMID:25955969
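The $50-$1500-per-structure estimate follows from straightforward on-demand pricing arithmetic. A sketch with a hypothetical hourly rate (real EC2 prices vary by instance type, region, and era):

```python
def cluster_cost(n_instances, cpus_per_instance, hourly_rate, hours):
    """Total CPU count and cost of an ephemeral cloud cluster.
    The rate here is an assumed placeholder, not actual EC2 pricing."""
    total_cpus = n_instances * cpus_per_instance
    cost = n_instances * hourly_rate * hours
    return total_cpus, cost

# e.g. 30 instances x 16 CPUs at an assumed $0.50/h, run for 48 hours
cpus, usd = cluster_cost(30, 16, 0.50, 48)
print(cpus, usd)  # 480 720.0
```

At these assumed numbers a 480-CPU cluster for two days costs $720, which sits comfortably inside the cited $50-$1500 range; the key economic point is that the cluster is deleted (and billing stops) the moment the refinement finishes.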
NASA Astrophysics Data System (ADS)
Grossel, Agnes; Bureau, Jordan; Loubet, Benjamin; Laville, Patricia; Massad, Raia; Haas, Edwin; Butterbach-Bahl, Klaus; Guimbaud, Christophe; Hénault, Catherine
2017-04-01
The objective of this study was to develop and evaluate an attribution method based on a combination of eddy covariance (EC) and chamber measurements to map N2O emissions over a 3-km2 area of croplands and forests in France. During two months of spring 2015, N2O fluxes were measured (i) by EC at 15 m height and (ii) at discrete times with a mobile chamber at 16 places within 1 km of the EC mast. The attribution method was based on coupling the EC measurements, information on footprints (Loubet et al., 2010), and emission ratios for the different crops and fertilizations calculated from the chamber measurements. The results were evaluated against an independent flux dataset measured by automatic chambers in a wheat field within the area. At the landscape scale, the method estimated a total emission of 114-271 kg N-N2O during the campaign. This new approach allowed continuous estimation of N2O emissions and better accounting for the spatial variability of N2O emissions at the landscape scale.
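One way to read the attribution idea (a hypothetical formulation for illustration, not necessarily the authors' exact scheme) is as a footprint-weighted partition of the landscape-scale EC flux among land-use patches, using per-patch emission ratios from the chamber surveys:

```python
import numpy as np

def attribute_flux(ec_flux, footprint_weights, emission_ratios):
    """Partition a landscape-scale EC flux among land-use patches.
    Each patch's share is its footprint weight times its chamber-derived
    emission ratio, renormalised so the parts sum to the measured flux.
    Hypothetical formulation; the campaign's scheme may differ."""
    w = np.asarray(footprint_weights, dtype=float) * np.asarray(emission_ratios, dtype=float)
    return ec_flux * w / w.sum()

# 3 patches: fertilised crop, unfertilised crop, forest (toy numbers)
parts = attribute_flux(10.0, [0.5, 0.3, 0.2], [3.0, 1.0, 0.2])
print(parts.round(2))  # per-patch contributions summing to the measured 10.0
```

The renormalisation step is what lets sparse chamber surveys constrain only the *relative* strengths of the patches while the EC mast constrains the absolute total.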
Adventures in Private Cloud: Balancing Cost and Capability at the CloudSat Data Processing Center
NASA Astrophysics Data System (ADS)
Partain, P.; Finley, S.; Fluke, J.; Haynes, J. M.; Cronk, H. Q.; Miller, S. D.
2016-12-01
Since the beginning of the CloudSat Mission in 2006, the CloudSat Data Processing Center (DPC) at the Cooperative Institute for Research in the Atmosphere (CIRA) has been ingesting data from the satellite and other A-Train sensors, producing data products, and distributing them to researchers around the world. The computing infrastructure was specifically designed to fulfill the requirements as specified at the beginning of what was nominally a two-year mission. The environment consisted of servers dedicated to specific processing tasks in a rigid workflow to generate the required products. To the benefit of science, and with credit to the mission engineers, CloudSat has lasted well beyond its planned lifetime and is still collecting data ten years later. Over that period the requirements of the data processing system have greatly expanded and opportunities for providing value-added services have presented themselves. But while demands on the system have increased, the initial design allowed for very little expansion in terms of scalability and flexibility. The design did change to include virtual machine processing nodes and distributed workflows, but infrastructure management was still a time-consuming task whenever the system had to be modified to run new tests or implement new processes. To address the scalability, flexibility, and manageability of the system, cloud computing methods and technologies are now being employed. The use of a public cloud like Amazon Elastic Compute Cloud or Google Compute Engine was considered but, among other issues, data transfer and storage cost becomes a problem, especially when demand fluctuates as a result of reprocessing and the introduction of new products and services. Instead, the existing system was converted to an on-premises private cloud using the OpenStack computing platform and Ceph software-defined storage to reap the benefits of the cloud computing paradigm. This work details the decisions that were made, the benefits that
NASA Astrophysics Data System (ADS)
Perez Montes, Diego A.; Añel Cabanelas, Juan A.; Wallom, David C. H.; Arribas, Alberto; Uhe, Peter; Caderno, Pablo V.; Pena, Tomas F.
2017-04-01
Cloud computing is a technological option that offers great possibilities for modelling in geosciences. We have studied how two different climate models, HadAM3P-HadRM3P and CESM-WACCM, can be adapted in two different ways to run on cloud computing environments from three different vendors: Amazon, Google and Microsoft. We have also evaluated qualitatively how the use of cloud computing can affect the allocation of resources by funding bodies, as well as issues related to computing security, including scientific reproducibility. Our first experiments were developed using the well-known ClimatePrediction.net (CPDN), which uses BOINC, over the infrastructure of two cloud providers, namely Microsoft Azure and Amazon Web Services (hereafter AWS). For this comparison we ran a set of thirteen-month climate simulations for CPDN in Azure and AWS using a range of different virtual machines (VMs) for HadRM3P (50 km resolution over the South America CORDEX region) nested in the global atmosphere-only model HadAM3P. These simulations were run on a single processor and took between 3 and 5 days to compute depending on the VM type. The last part of our simulation experiments was to run WACCM on different VMs on the Google Compute Engine (GCE) and compare the results with the supercomputer (SC) Finisterrae1 from the Centro de Supercomputacion de Galicia. It was shown that GCE gives better performance than the SC for smaller numbers of cores/MPI tasks, but the model throughput shows clearly how the SC performance is better beyond approximately 100 cores (related to network speed and latency differences). From a cost point of view, cloud computing moves researchers from a traditional approach, where experiments were limited by the available hardware resources, to one limited by monetary resources (how many resources can be afforded). As there is an increasing movement and recommendation for budgeting HPC projects on this technology (budgets can be calculated in a more realistic way) we could see a shift on
Developing cloud applications using the e-Science Central platform.
Hiden, Hugo; Woodman, Simon; Watson, Paul; Cala, Jacek
2013-01-28
This paper describes the e-Science Central (e-SC) cloud data processing system and its application to a number of e-Science projects. e-SC provides both software as a service (SaaS) and platform as a service for scientific data management, analysis and collaboration. It is a portable system and can be deployed on both private (e.g. Eucalyptus) and public clouds (Amazon AWS and Microsoft Windows Azure). The SaaS application allows scientists to upload data, edit and run workflows and share results in the cloud, using only a Web browser. It is underpinned by a scalable cloud platform consisting of a set of components designed to support the needs of scientists. The platform is exposed to developers so that they can easily upload their own analysis services into the system and make these available to other users. A representational state transfer-based application programming interface (API) is also provided so that external applications can leverage the platform's functionality, making it easier to build scalable, secure cloud-based applications. This paper describes the design of e-SC, its API and its use in three different case studies: spectral data visualization, medical data capture and analysis, and chemical property prediction.
On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data
NASA Astrophysics Data System (ADS)
Hua, H.
2016-12-01
Next generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are orders of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require rapid turnaround for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments. At large cloud scales, we need to deal with scaling and cost issues. We present our experiences in deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We explore optimization approaches to getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but with an unpredictable computing environment driven by market forces.
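The quoted 75%-90% spot-market savings translate directly into effective hourly rates. A sketch with an assumed $1/h on-demand baseline (hypothetical numbers, not actual AWS pricing):

```python
def fleet_cost(on_demand_rate, savings_fraction, hours, n_instances):
    """Cost of a compute fleet at a given discount relative to the
    on-demand rate.  Rates are assumed placeholders; real spot prices
    float with market demand and instances can be interrupted."""
    return on_demand_rate * (1.0 - savings_fraction) * hours * n_instances

on_demand = fleet_cost(1.0, 0.00, 100, 10)  # full-price baseline
spot      = fleet_cost(1.0, 0.75, 100, 10)  # low end of the cited savings
print(on_demand, spot)  # 1000.0 250.0
```

The catch the abstract notes is that the discount is paid for in reliability: spot instances can be reclaimed mid-job, so the data system must checkpoint and reschedule work to realize these savings.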
Volunteered Cloud Computing for Disaster Management
NASA Astrophysics Data System (ADS)
Evans, J. D.; Hao, W.; Chettri, S. R.
2014-12-01
Disaster management relies increasingly on interpreting earth observations and running numerical models, which require significant computing capacity - usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however, some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity -- if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern / trend detection, or large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context. Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects
NASA Astrophysics Data System (ADS)
Giangrande, S. E.; WANG, D.; Hardin, J. C.; Mitchell, J.
2017-12-01
As part of the two-year Department of Energy Atmospheric Radiation Measurement (ARM) Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) campaign, the ARM Mobile Facility (AMF) collected a unique set of observations in a region of strong climatic significance near Manacapuru, Brazil. An important example of the beneficial observational record obtained by ARM during this campaign is that of the Radar Wind Profiler (RWP). This dataset has been previously documented as providing critical convective cloud vertical air velocity retrievals and precipitation properties (e.g., calibrated reflectivity factor Z, rainfall rates) under a wide variety of atmospheric conditions. Vertical air motion estimates within deep convective cores, such as those available from this RWP system, have previously been identified as critical constraints for ongoing global climate modeling activities and deep convective cloud process studies. As an extended deployment within this 'green ocean' region, the RWP site and collocated AMF surface gauge instrumentation experienced a unique hybrid of tropical and continental precipitation conditions, including multiple wet and dry season precipitation regimes, convective and organized stratiform storm dynamics and contributions to rainfall accumulation, the pristine aerosol conditions of the locale, as well as the effects of the Manaus, Brazil, megacity pollution plume. For hydrological applications and potential ARM products, machine learning methods developed using this dataset are explored to demonstrate advantages in geophysical retrievals when compared to traditional methods. Emphasis is on performance improvements when providing additional information on storm structure and regime or echo type classifications. Since deep convective cloud dynamic insights (core updraft/downdraft properties) are difficult to obtain directly with conventional radars that also observe radar reflectivity factor profiles similar to RWP systems, we also
Consistency of vegetation index seasonality across the Amazon rainforest
NASA Astrophysics Data System (ADS)
Maeda, Eduardo Eiji; Moura, Yhasmin Mendes; Wagner, Fabien; Hilker, Thomas; Lyapustin, Alexei I.; Wang, Yujie; Chave, Jérôme; Mõttus, Matti; Aragão, Luiz E. O. C.; Shimabukuro, Yosio
2016-10-01
Vegetation indices (VIs) calculated from remotely sensed reflectance are widely used tools for characterizing the extent and status of vegetated areas. Recently, however, their capability to monitor the Amazon forest phenology has been intensely scrutinized. In this study, we analyze the consistency of VIs seasonal patterns obtained from two MODIS products: the Collection 5 BRDF product (MCD43) and the Multi-Angle Implementation of Atmospheric Correction algorithm (MAIAC). The spatio-temporal patterns of the VIs were also compared with field measured leaf litterfall, gross ecosystem productivity and active microwave data. Our results show that significant seasonal patterns are observed in all VIs after the removal of view-illumination effects and cloud contamination. However, we demonstrate inconsistencies in the characteristics of seasonal patterns between different VIs and MODIS products. We demonstrate that differences in the original reflectance band values form a major source of discrepancy between MODIS VI products. The MAIAC atmospheric correction algorithm significantly reduces noise signals in the red and blue bands. Another important source of discrepancy is caused by differences in the availability of clear-sky data, as the MAIAC product allows increased availability of valid pixels in the equatorial Amazon. Finally, differences in VIs seasonal patterns were also caused by MODIS collection 5 calibration degradation. The correlation of remote sensing and field data also varied spatially, leading to different temporal offsets between VIs, active microwave and field measured data. We conclude that recent improvements in the MAIAC product have led to changes in the characteristics of spatio-temporal patterns of VIs seasonality across the Amazon forest, when compared to the MCD43 product. Nevertheless, despite improved quality and reduced uncertainties in the MAIAC product, a robust biophysical interpretation of VIs seasonality is still missing.
NASA Astrophysics Data System (ADS)
Paralovo, Sarah L.; Borillo, Guilherme C.; Barbosa, Cybelli G. G.; Godoi, Ana Flavia L.; Yamamoto, Carlos I.; de Souza, Rodrigo A. F.; Andreoli, Rita V.; Costa, Patrícia S.; Almeida, Gerson P.; Manzi, Antonio O.; Pöhlker, Christopher; Yáñez-Serrano, Ana M.; Kesselmeier, Jürgen; Godoi, Ricardo H. M.
2016-03-01
The Amazon region is one of the most significant natural ecosystems on the planet. Of special interest as a major study area is the interface between the forest and Manaus city, a state capital in Brazil embedded in the heart of the Amazon forest. In view of the interactions between natural and anthropogenic processes, an integrated experiment was conducted measuring the concentrations of the volatile organic compounds (VOCs) benzene, toluene, ethylbenzene and meta, ortho, para-xylene (known as BTEX), all of them regarded as pollutants with harmful effects on human health and vegetation and acting also as important precursors of tropospheric ozone. Furthermore, these compounds also take part in the formation of secondary organic aerosols, which can influence the pattern of cloud formation, and thus the regional water cycle and climate. The samples were collected in 2012/2013 at three different sites: (i) The Amazon Tall Tower Observatory (ATTO), a pristine rain forest region in the central Amazon Basin; (ii) Manacapuru, a semi-urban site located southwest and downwind of Manaus as a preview of the Green Ocean Amazon Experiment (GoAmazon 2014/15); and (iii) the city of Manaus (distributed over three sites). Results indicate that pollutant concentrations increase with proximity to urban areas. For instance, the benzene concentration ranges were 0.237-19.6 (Manaus), 0.036-0.948 (Manacapuru) and 0.018-0.313 μg m-3 (ATTO). Toluene ranges were 0.700-832 (Manaus), 0.091-2.75 (Manacapuru) and 0.011-4.93 μg m-3 (ATTO). For ethylbenzene, they were 0.165-447 (Manaus), 0.018-1.20 (Manacapuru) and 0.047-0.401 μg m-3 (ATTO). Some indication was found that toluene is released from the forest. No significant difference was found between the BTEX levels measured in the dry and wet seasons. Furthermore, it was observed that, in general, the city of Manaus seems to be less impacted by these pollutants than other cities in Brazil and in other
Development of GK-2A cloud optical and microphysical properties retrieval algorithm
NASA Astrophysics Data System (ADS)
Yang, Y.; Yum, S. S.; Um, J.
2017-12-01
Cloud and aerosol radiative forcing is known to be one of the largest uncertainties in climate change prediction. To reduce this uncertainty, remote sensing observations of cloud radiative and microphysical properties have been used since the 1970s, and the corresponding remote sensing techniques and instruments have been developed. As part of this effort, Geo-KOMPSAT-2A (Geostationary Korea Multi-Purpose Satellite-2A, GK-2A) will be launched in 2018. On GK-2A, the Advanced Meteorological Imager (AMI) is the primary instrument, with 3 visible, 3 near-infrared, and 10 infrared channels. To retrieve optical and microphysical properties of clouds using AMI measurements, a preliminary version of the new cloud retrieval algorithm for GK-2A was developed and several validation tests were conducted. The algorithm retrieves cloud optical thickness (COT), cloud effective radius (CER), liquid water path (LWP), and ice water path (IWP), so we named it the Daytime Cloud Optical thickness, Effective radius and liquid and ice Water path (DCOEW) algorithm. DCOEW uses cloud reflectance at visible and near-infrared channels as input data. An optimal estimation (OE) approach that requires appropriate a priori values and measurement error information is used to retrieve COT and CER. LWP and IWP are calculated using previously determined empirical relationships between COT/CER and cloud water path. To validate the retrieved cloud properties, we compared DCOEW output data with other operational satellite data. For COT and CER validation, we used two different data sets. To compare algorithms that use cloud reflectance at visible and near-IR channels as input data, the MODIS MYD06 cloud product was selected. For validation against cloud products based on microwave measurements, COT (2B-TAU) and CER (2C-ICE) data retrieved from the CloudSat cloud profiling radar (W-band, 94 GHz) were used. For cloud water path validation, AMSR-2 Level-3 cloud liquid water data were used.
[CII] observations of H2 molecular layers in transition clouds
NASA Astrophysics Data System (ADS)
Velusamy, T.; Langer, W. D.; Pineda, J. L.; Goldsmith, P. F.; Li, D.; Yorke, H. W.
2010-10-01
We present the first results on the diffuse transition clouds observed in [CII] line emission at 158 μm (1.9 THz) towards Galactic longitudes near 340° (5 LOSs) & 20° (11 LOSs) as part of the HIFI tests and GOT C+ survey. Out of the total 146 [CII] velocity components detected by profile fitting, we identify 53 as diffuse molecular clouds with associated 12CO emission but without 13CO emission, characterized by AV < 5 mag. We estimate the fraction of the [CII] emission in the diffuse HI layer in each cloud and then determine the [CII] emitted from the molecular layers in the cloud. We show that the excess [CII] intensities detected in a few clouds are indicative of a thick H2 layer around the CO core. The wide range of clouds in our sample with thin to thick H2 layers suggests that these are at various evolutionary states characterized by the formation of H2 and CO layers from HI and C+, respectively. In about 30% of the clouds, the H2 column density (“dark gas”) traced by [CII] is 50% or more of that traced by 12CO emission. On average, ~25% of the total H2 in these clouds is in an H2 layer which is not traced by CO. We use the HI, [CII], and 12CO intensities in each cloud along with simple chemical models to obtain constraints on the FUV fields and cosmic ray ionization rates. Herschel is an ESA space observatory with science instruments provided by European-led Principal Investigator consortia and with important participation from NASA.
Smoke, Clouds, and Radiation-Brazil (SCAR-B) Experiment
NASA Technical Reports Server (NTRS)
Kaufman, Y. J.; Hobbs, P. V.; Kirchoff, V. W. J. H.; Artaxo, P.; Remer, L. A.; Holben, B. N.; King, M. D.; Ward, D. E.; Prins, E. M.; Longo, K. M.;
1998-01-01
The Smoke, Clouds, and Radiation-Brazil (SCAR-B) field project took place in the Brazilian Amazon and cerrado regions in August-September 1995 as a collaboration between Brazilian and American scientists. SCAR-B, a comprehensive experiment to study biomass burning, emphasized measurements of surface biomass, fires, smoke aerosol and trace gases, clouds, and radiation, their climatic effects, and remote sensing from aircraft and satellites. It included aircraft and ground-based in situ measurements of smoke emission factors and the compositions, sizes, and optical properties of the smoke particles; studies of the formation of ozone; the transport and evolution of smoke; and smoke interactions with water vapor and clouds. This overview paper introduces SCAR-B and summarizes some of the main results obtained so far. (1) Fires: measurements of the size distribution of fires, using the 50 m resolution MODIS Airborne Simulator, show that most of the fires are small (e.g., 0.005 square km), but the satellite sensors (e.g., AVHRR and MODIS with 1 km resolution) can detect the fires in Brazil which are responsible for 60-85% of the burned biomass; (2) Aerosol: smoke particles emitted from fires increase their radius by as much as 60% during their first three days in the atmosphere due to condensation and coagulation, reaching a mass median radius of 0.13-0.17 microns; (3) Radiative forcing: estimates of the globally averaged direct radiative forcing due to smoke worldwide, based on the properties of smoke measured in SCAR-B (-0.1 to -0.3 W m(exp -2)), are smaller than previously modeled due to a lower single-scattering albedo (0.8 to 0.9), smaller scattering efficiency (3 square meters g(exp -1) at 550 nm), and a low humidification factor; and (4) Effect on clouds: a good relationship was found between cloud condensation nuclei and smoke volume concentrations, thus an increase in smoke emission is expected to affect cloud properties. In SCAR-B, new techniques were developed.
Pastor, Antoni; Farré, Magí; Fitó, Montserrat; Fernandez-Aranda, Fernando; de la Torre, Rafael
2014-05-01
The analysis of peripheral endocannabinoids (ECs) is a good biomarker of the EC system. Their concentrations, from clinical studies, strongly depend on sample collection and time processing conditions taking place in clinical and laboratory settings. The analysis of 2-monoacylglycerols (MGs) (i.e., 2-arachidonoylglycerol or 2-oleoylglycerol) is a particularly challenging issue because of their ex vivo formation and chemical isomerization that occur after blood sample collection. We provide evidence that their ex vivo formation can be minimized by adding Orlistat, an enzymatic lipase inhibitor, to plasma. Taking into consideration the low cost of Orlistat, we recommend its addition to plasma collecting tubes while maintaining sample cold chain until storage. We have validated a method for the determination of the EC profile of a range of MGs and N-acylethanolamides in plasma that preserves the original isomer ratio of MGs. Nevertheless, the chemical isomerization of 2-MGs can only be avoided by an immediate processing and analysis of samples due to their instability during conservation. We believe that this new methodology can aid in the harmonization of the measurement of ECs and related compounds in clinical samples.
Using CDOM optical properties for estimating DOC concentrations and pCO2 in the Lower Amazon River
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valerio, Aline de Matos; Kampel, Milton; Vantrepotte, Vincent
Colored dissolved organic matter (CDOM) is largely responsible for the optical properties of freshwaters and coastal areas and can be used as a proxy to assess non-optical carbon content such as the dissolved organic carbon (DOC) and the partial pressure of carbon dioxide (pCO2). Nevertheless, riverine studies that explore the former relationship are more challenging due to the spectral mixture caused by the high content of inorganic materials in the suspended sediment. Here we evaluate the spatial-temporal variability of CDOM, DOC and pCO2, as well as the potential of the CDOM absorption coefficient (aCDOM(412)) for estimating DOC concentration and pCO2 along the lower Amazon River. Our results revealed differences in the dissolved organic matter (DOM) quality between clear water (CW) tributaries and the Amazon River mainstem. A linear relationship between DOC and CDOM was observed when tributaries and mainstem are evaluated separately (Amazon waters: N = 42, R2 = 0.74; CW: N = 13, R2 = 0.57). However, this linear relationship was not observed during periods of higher rainfall and river discharge, requiring a model specific to these time periods to be developed (N = 25, R2 = 0.58). A strong positive linear relation was found between aCDOM(412) and pCO2 (N = 69, R2 = 0.65) along the lower river. pCO2 was less affected by the optical difference between tributary and mainstem water, or by the presence of higher hygrometric conditions, than the CDOM to DOC relationships were. Including the river water temperature in the model improves our ability to estimate pCO2 (N = 69; R2 = 0.80). Our results also illustrate the complexity of DOM temporal dynamics in the lower Amazon River, where the occurrence of extremely high and low discharge due to factors such as El Niño can significantly alter the expected seasonal oscillation, as was the case during this study period. The ability to remotely assess both DOC and pCO2 from CDOM optical properties highlights the importance
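The DOC retrieval described above is an ordinary least-squares linear regression of DOC on aCDOM(412). A minimal sketch of such a fit follows; the paired values are made up for illustration and are not the study's data or coefficients:

```python
import numpy as np

# Hypothetical paired measurements: CDOM absorption at 412 nm (m^-1)
# and DOC concentration (mg L^-1); values are illustrative only.
acdom_412 = np.array([1.2, 1.8, 2.4, 3.1, 3.9, 4.6])
doc = np.array([3.0, 3.9, 4.7, 5.8, 6.9, 7.8])

# Ordinary least-squares fit of the linear DOC ~ aCDOM(412) relationship
slope, intercept = np.polyfit(acdom_412, doc, 1)

# Coefficient of determination (R^2), the goodness-of-fit metric
# quoted in the abstract
pred = slope * acdom_412 + intercept
ss_res = np.sum((doc - pred) ** 2)
ss_tot = np.sum((doc - doc.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

print(f"DOC = {slope:.2f} * aCDOM(412) + {intercept:.2f}, R^2 = {r2:.2f}")
```

The abstract's finding that tributaries and mainstem need separate fits corresponds to running this regression per water type rather than on the pooled sample.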
NASA Astrophysics Data System (ADS)
Wang, Zhe; Wang, Zhenhui; Cao, Xiaozhong; Tao, Fa
2018-01-01
Clouds are currently observed by both ground-based and satellite remote sensing techniques. Each technique has its own strengths and weaknesses depending on the observation method, instrument performance and the methods used for retrieval. It is important to study synergistic cloud measurements to improve the reliability of the observations and to verify the different techniques. The FY-2 geostationary orbiting meteorological satellites continuously observe the sky over China. Their cloud top temperature product can be processed to retrieve the cloud top height (CTH). The ground-based millimeter wavelength cloud radar can acquire information about the vertical structure of clouds, such as the cloud base height (CBH), CTH and cloud thickness, and can continuously monitor changes in the vertical profiles of clouds. The CTHs were retrieved using both cloud top temperature data from the FY-2 satellites and the cloud radar reflectivity data for the same time period (June 2015 to May 2016), and the resulting datasets were compared in order to evaluate the accuracy of CTH retrievals using FY-2 satellites. The results show that the concordance rate of cloud detection between the two datasets was 78.1%. Higher consistencies were obtained for thicker clouds with larger echo intensity and for more continuous clouds. The average difference in the CTH between the two techniques was 1.46 km. The difference in CTH for low- and mid-level clouds was less than that for high-level clouds. The attenuation threshold of the cloud radar for rainfall was 0.2 mm/min; a rainfall intensity below this threshold had no effect on the CTH. The satellite CTH can be used to compensate for the attenuation error in the cloud radar data.
NASA Astrophysics Data System (ADS)
Yamasoe, M. A.; do Rosário, N. M. E.; Barros, K. M.
2017-01-01
We analyzed the variability of downward solar irradiance reaching the surface at São Paulo city, Brazil, and estimated the climatological aerosol and cloud radiative effects. Eleven years of irradiance were analyzed, from 2005 to 2015. To distinguish the aerosol from the cloud effect, the radiative transfer code LibRadtran was used to calculate downward solar irradiance. Two runs were performed, one considering only ozone and water vapor daily variability, with AOD set to zero, and the second allowing the three variables to change according to mean climatological values. The difference between the 24 h mean irradiances calculated with and without aerosol gave the shortwave aerosol direct radiative effect, while the difference between the measured irradiance and that calculated including the aerosol represented the cloud effect. Results showed that, climatologically, clouds can be 4 times more effective than aerosols. The cloud shortwave radiative effect presented a maximum reduction of about -170 W m-2 in January and a minimum in July, of -37 W m-2. The aerosol direct radiative effect was maximum in spring, when the transport of smoke from the Amazon and central parts of South America toward São Paulo is frequent. Around mid-September, the 24 h radiative effect due to aerosol alone was estimated to be -50 W m-2. Throughout the rest of the year, the mean aerosol effect was around -20 W m-2 and was attributed to local urban sources. The effect of the cloud fraction on the cloud modification factor, defined as the ratio of all-sky irradiation to cloudless-sky irradiation, showed dependence on the cloud height. Low clouds presented the highest impact, while the presence of only high clouds barely affected solar transmittance, even in overcast conditions.
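The decomposition described above reduces to two subtractions plus a ratio, sketched here with entirely hypothetical daily-mean irradiances (the function names and numbers are illustrative, not the study's):

```python
# Aerosol direct effect: clear-sky irradiance modeled with aerosol minus
# clear-sky irradiance modeled without aerosol (AOD = 0).
# Cloud effect: measured irradiance minus the clear-sky-with-aerosol run.
def radiative_effects(measured, clear_no_aerosol, clear_with_aerosol):
    """Return (aerosol_effect, cloud_effect), 24 h means in W m^-2."""
    aerosol_effect = clear_with_aerosol - clear_no_aerosol
    cloud_effect = measured - clear_with_aerosol
    return aerosol_effect, cloud_effect

def cloud_modification_factor(all_sky, cloudless):
    """CMF: ratio of all-sky to cloudless-sky irradiation."""
    return all_sky / cloudless

# Hypothetical daily means (W m^-2)
aer, cld = radiative_effects(measured=180.0,
                             clear_no_aerosol=270.0,
                             clear_with_aerosol=250.0)
print(aer, cld)  # -20.0 -70.0  (both effects reduce surface irradiance)
print(cloud_modification_factor(180.0, 250.0))  # 0.72
```

Negative values indicate a reduction of surface irradiance, matching the sign convention of the -170 and -20 W m-2 figures in the abstract.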
Analysis of cloud-based solutions on EHRs systems in different scenarios.
Fernández-Cardeñosa, Gonzalo; de la Torre-Díez, Isabel; López-Coronado, Miguel; Rodrigues, Joel J P C
2012-12-01
Nowadays, with the growth of wireless connections, people can access resources hosted in the Cloud almost everywhere. In this context, organizations can take advantage of this fact in terms of e-Health by deploying Cloud-based solutions for e-Health services. In this paper, two Cloud-based solutions for different scenarios of Electronic Health Records (EHRs) management systems are proposed. We researched articles published in Medline between 2005 and 2011 about the implementation of e-Health services based on the Cloud. In order to analyze the best scenario for the deployment of Cloud Computing, two solutions, for a large hospital and for a network of Primary Care Health centers, have been studied. An economic estimation of the implementation cost for both scenarios has been done via the Amazon calculator tool. As a result of this analysis, two solutions are suggested depending on the scenario: to deploy a Cloud solution for a large hospital, a typical Cloud solution in which only the needed services are hired has been assumed. On the other hand, to work with several Primary Care centers, we suggest the implementation of a network which interconnects these centers with a single Cloud environment. Finally, a hybrid solution is considered, in which EHRs with images will be hosted in the hospital or Primary Care centers and the rest will be migrated to the Cloud.
NASA Technical Reports Server (NTRS)
Yu, Hongbin; Chin, Mian; Yuan, Tianle; Bian, Huisheng; Remer, Lorraine A.; Prospero, Joseph M.; Omar, Ali; Winker, David; Yang, Yuekui; Zhang, Yan;
2015-01-01
The productivity of the Amazon rainforest is constrained by the availability of nutrients, in particular phosphorus (P). Deposition of long-range transported African dust is recognized as a potentially important but poorly quantified source of phosphorus. This study provides a first multiyear satellite-based estimate of dust deposition into the Amazon Basin using three-dimensional (3D) aerosol measurements over 2007-2013 from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP). The 7-year average of dust deposition into the Amazon Basin is estimated to be 28 (8 to approximately 48) Tg a(exp -1) or 29 (8 to approximately 50) kg ha(exp -1) a(exp -1). The dust deposition shows significant interannual variation that is negatively correlated with the prior-year rainfall in the Sahel. The CALIOP-based multi-year mean estimate of dust deposition matches better with estimates from in-situ measurements and model simulations than a previous satellite-based estimate does. The closer agreement benefits from a more realistic geographic definition of the Amazon Basin and inclusion of meridional dust transport calculation, in addition to the 3D nature of CALIOP aerosol measurements. The imported dust could provide about 0.022 (0.006 to approximately 0.037) Tg P of phosphorus per year, equivalent to 23 (7 to approximately 39) g P ha(exp -1) a(exp -1), to fertilize the Amazon rainforest. This out-of-Basin P input largely compensates for the hydrological loss of P from the Basin, suggesting an important role of African dust in preventing phosphorus depletion on time scales of decades to centuries.
Park, Keun Hyung; Lee, Tae Hoon; Kim, Chan Woo; Kim, Jiyoung
2013-06-15
CCL15, a member of the CC chemokine family, is a potent chemoattractant for leukocytes and endothelial cells (ECs). Given that chemokines play key roles in vascular inflammation, we investigated the effects of hypoxia/reoxygenation (H/R) on expression of human CCL15 and a role of CCL15 in upregulating ICAM-1 in ECs. We found that exposure of ECs to H/R increased expression of CCL15 and ICAM-1, which resulted in an increase in monocyte adhesivity to the ECs. Further studies revealed that knockdown of CCL15 or CCR1 attenuated expression of ICAM-1 in ECs after H/R, suggesting that expression of ICAM-1 is upregulated by CCL15. Stimulation of ECs with CCL15 significantly increased expression of ICAM-1 predominantly via the CCR1 receptor. We observed that phosphorylation of JAK2 and STAT3 was stimulated by CCL15 treatment of ECs. Results from reporter and chromatin immunoprecipitation assays revealed that CCL15 activates transcription from the IFN-γ activation site promoter and stimulates binding of STAT3 to the ICAM-1 promoter. Our data also showed that CCL15 increased cell adhesion of human monocytes to ECs under static and shear-stress conditions. Pretreatment of these cells with inhibitors for JAK, PI3K, and AKT prevented the CCL15-induced expression of ICAM-1 and monocyte adhesion to ECs, suggesting the involvement of those signaling molecules in ICAM-1 gene activation by CCL15. The results suggest that CCR1 and its ligands may be a potential target for treating inflammatory diseases involving upregulation of cell adhesion molecules.
NASA Astrophysics Data System (ADS)
Bogota-Angel, Raul; Chemale Junior, Farid; Davila, Roberto; Soares, Emilson; Pinto, Ricardo; Do Carmo, Dermeval; Hoorn, Carina
2014-05-01
The origin and development of the highly diverse Amazon tropical forest have mostly been inferred from continental sites. However, sediment records in the marine Foz do Amazonas Basin can provide important information to better understand the influence of the Andes uplift and climate change on the evolution of its plant biomes since the Neogene. Sediment analyses of samples from BP-Petrobras wells 1 and 2, drilled in the Amazon Fan, allowed us to infer the onset of the transcontinental Amazon river and the fan phase during the middle to late Miocene (c. 10.5 Ma). As part of the CLIMAMAZON research programme we performed pollen analysis on the 10.5 to 0.4 Ma time interval. 76 ditch cutting samples from the upper 4165 m of sediments of well 2 permitted us to infer changes in floral composition in the Amazon Basin. The palynological spectra across this interval (nannofossil-based age model) include pollen, fern spores, dinocysts and foraminiferal linings. Where possible, pollen and fern spores were grouped into four vegetation types: estuarine, tropical, mountain forest and high mountain open treeless vegetation. Pollen is generally corroded and reflects the effects of sediment transportation, while reworked material is also common. Good pollen producers such as Poaceae, Asteraceae and Cyperaceae are common and reflect indistinctive vegetation types, particularly those associated with riverine systems. Rhizophora/Zonocostites spp. indicate "close-distance" mangrove development. Tropical forest biomes are represented by pollen resembling Moraceae-Urticaceae, Melastomataceae-Combretaceae, Sapotaceae, Alchornea, Euphorbiaceae, Rubiaceae, Bignoniaceae, Mauritia and Arecaceae. Myrica, and particularly sporadic occurrences of fossil fern spores like Lophosoria and Cyathea, suggest the development of a moist Andean forest in areas above 1000 m. The first indicators of high altitudes appear in the last part of the late Miocene, with taxa associated with present-day Valeriana and particularly Polylepis, a neotropical taxon
NASA Astrophysics Data System (ADS)
Cak, A. D.
2017-12-01
The Amazon Basin has faced innumerable pressures in recent years, including logging, mining and resource extraction, agricultural expansion, road building, and urbanization. These changes have drastically altered the landscape, transforming a predominantly forested environment into a mosaic of different types of land cover. The resulting fragmentation has caused dramatic and negative impacts on its structure and function, including on biodiversity and the transfer of water and energy to and from soil, vegetation, and the atmosphere (e.g., evapotranspiration). Because evapotranspiration from forested areas, which is affected by factors including temperature and water availability, plays a significant role in water dynamics in the Amazon Basin, measuring land surface temperature (LST) across the region can provide a dynamic assessment of hydrological, vegetation, and land use and land cover changes. It can also help to identify widespread urban development, which often has a higher LST signal relative to surrounding vegetation. Here, we discuss results from work to measure and identify drivers of change in LST across the entire Amazon Basin through analysis of past and current thermal and infrared satellite imagery. We leverage cloud computing resources in new ways to allow for more efficient analysis of imagery over the Amazon Basin across multiple years and multiple sensors. We also assess potential drivers of change in LST using spatial and multivariate statistical analyses with additional data sources of land cover, urban development, and demographics.
Cloud Based Earth Observation Data Exploitation Platforms
NASA Astrophysics Data System (ADS)
Romeo, A.; Pinto, S.; Loekken, S.; Marin, A.
2017-12-01
and the Amazon Web Services cloud. This work will present an overview of the TEPs and the multi-cloud EO data processing platform, and discuss their main achievements and their impacts in the context of distributed Research Infrastructures such as EPOS and EOSC.
NASA Astrophysics Data System (ADS)
Taylor, T. E.; O'Dell, C. W.; Frankenberg, C.; Partain, P.; Cronk, H. Q.; Savtchenko, A.; Nelson, R. R.; Rosenthal, E. J.; Chang, A. Y.; Fisher, B.; Osterman, G.; Pollock, R. H.; Crisp, D.; Eldering, A.; Gunson, M. R.
2015-12-01
The objective of the National Aeronautics and Space Administration's (NASA) Orbiting Carbon Observatory-2 (OCO-2) mission is to retrieve the column-averaged carbon dioxide (CO2) dry air mole fraction (XCO2) from satellite measurements of reflected sunlight in the near-infrared. These estimates can be biased by clouds and aerosols within the instrument's field of view (FOV). Screening of the most contaminated soundings minimizes unnecessary calls to the computationally expensive Level 2 (L2) XCO2 retrieval algorithm. Hence, robust cloud screening methods have been an important focus of the OCO-2 algorithm development team. Two distinct, computationally inexpensive cloud screening algorithms have been developed for this application. The A-Band Preprocessor (ABP) retrieves the surface pressure using measurements in the 0.76 μm O2 A-band, neglecting scattering by clouds and aerosols, which introduce photon path-length (PPL) differences that can cause large deviations between the expected and retrieved surface pressure. The Iterative Maximum A-Posteriori (IMAP) Differential Optical Absorption Spectroscopy (DOAS) Preprocessor (IDP) retrieves independent estimates of the CO2 and H2O column abundances using observations taken at 1.61 μm (weak CO2 band) and 2.06 μm (strong CO2 band), while neglecting atmospheric scattering. The CO2 and H2O column abundances retrieved in these two spectral regions differ significantly in the presence of cloud and scattering aerosols. The combination of these two algorithms, which key off of different features in the spectra, provides the basis for cloud screening of the OCO-2 data set. To validate the OCO-2 cloud screening approach, collocated measurements from NASA's Moderate Resolution Imaging Spectrometer (MODIS), aboard the Aqua platform, were compared to results from the two OCO-2 cloud screening algorithms. With tuning to allow throughputs of ≃ 30 %, agreement between the OCO-2 and MODIS cloud screening methods is found to be
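The combined screen described above can be caricatured as two independent clear-sky tests whose conjunction passes a sounding on to the L2 retrieval. The thresholds, field names, and functions below are hypothetical illustrations of the idea, not the operational ABP/IDP logic:

```python
# Sketch of a two-test cloud screen in the spirit of ABP + IDP.
# All tolerances here are made-up placeholders.

def abp_clear(retrieved_psurf_hpa, prior_psurf_hpa, tol_hpa=10.0):
    """A-band test: cloud/aerosol scattering changes the photon path
    length, so a large retrieved-minus-prior surface-pressure deviation
    flags a contaminated sounding."""
    return abs(retrieved_psurf_hpa - prior_psurf_hpa) <= tol_hpa

def idp_clear(co2_weak_ppm, co2_strong_ppm, tol_ratio=0.04):
    """IDP test: CO2 columns retrieved in the weak (1.61 um) and strong
    (2.06 um) bands should agree when scattering is negligible."""
    return abs(co2_weak_ppm - co2_strong_ppm) / co2_strong_ppm <= tol_ratio

def sounding_is_clear(psurf_ret, psurf_prior, co2_weak, co2_strong):
    """Pass the sounding to the expensive L2 retrieval only if both
    inexpensive tests agree it is clear."""
    return abp_clear(psurf_ret, psurf_prior) and idp_clear(co2_weak, co2_strong)

print(sounding_is_clear(1010.0, 1008.0, 398.0, 400.0))  # True
print(sounding_is_clear(980.0, 1008.0, 398.0, 400.0))   # False: path-length bias
```

Because the two tests key off different spectral features, a sounding must fool both to slip through, which is the rationale for combining them.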
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burleyson, Casey D.; Feng, Zhe; Hagos, Samson M.
The Amazon rainforest is one of a few regions of the world where continental tropical deep convection occurs. The Amazon’s isolation makes it challenging to observe, but also creates a unique natural laboratory to study anthropogenic impacts on clouds and precipitation in an otherwise pristine environment. Extensive measurements were made upwind and downwind of the large city of Manaus, Brazil during the Observations and Modeling of the Green Ocean Amazon 2014-2015 (GoAmazon2014/5) field campaign. In this study, 15 years of high-resolution satellite data are analyzed to examine the spatial and diurnal variability of convection occurring around the GoAmazon2014/5 sites. Interpretation of anthropogenic differences between the upwind (T0) and downwind (T1-T3) sites is complicated by naturally occurring spatial variability between the sites. During the rainy season, the inland propagation of the previous day’s sea-breeze front happens to be in phase with the background diurnal cycle near Manaus, but is out of phase elsewhere. Enhanced convergence between the river breezes and the easterly trade winds generates up to 10% more frequent deep convection at the GoAmazon2014/5 sites east of the river (T0a, T0t/k, and T1) compared to the T3 site, which was located near the western bank. In general, the annual and diurnal cycles during 2014 were representative of the 2000-2013 distributions. The only exceptions were in March, when the monthly mean rain rate was above the 95th percentile, and September, when both rain frequency and intensity were suppressed. The natural spatial variability must be accounted for before interpreting anthropogenically induced differences among the GoAmazon2014/5 sites.
Nitroprusside and ECS-induced retrograde amnesia.
Sudha, S; Andrade, C; Anand, A; Guido, S; Venkataraman, B V
2001-03-01
Previous research found that the administration of verapamil and felodipine immediately before electroconvulsive shocks (ECS) attenuated ECS-induced retrograde amnesia. This study examined whether sodium nitroprusside, an antihypertensive drug that does not affect calcium channels, has a similar action. Adult male Sprague-Dawley rats received nitroprusside (0.5 mg/kg ip) or saline 3 minutes before each of three once-daily true or sham ECS. Retention of pre-ECS learning was studied 1 day after ECS using a passive avoidance task. Nitroprusside was associated with increased seizure duration in ECS-treated rats, and with enhanced recall in both true and sham ECS groups. The latter finding suggests that nitroprusside nonspecifically improves cognitive functions, and does not support the hypothesis that ECS-induced cognitive impairment is a result of blood-brain barrier breach. Nitric oxide mechanisms may underlie the benefits conferred by nitroprusside.
Deforestation effects on Amazon forest resilience
NASA Astrophysics Data System (ADS)
Zemp, D. C.; Schleussner, C.-F.; Barbosa, H. M. J.; Rammig, A.
2017-06-01
Through vegetation-atmosphere feedbacks, rainfall reductions as a result of Amazon deforestation could reduce the resilience of the remaining forest to perturbations and potentially lead to large-scale Amazon forest loss. We track observation-based water fluxes from sources (evapotranspiration) to sinks (rainfall) to assess the effect of deforestation on continental rainfall. By studying 21st century deforestation scenarios, we show that deforestation can reduce dry season rainfall by up to 20% far from the deforested area, namely, over the western Amazon basin and the La Plata basin. As a consequence, forest resilience is systematically eroded in the southwestern region covering a quarter of the current Amazon forest. Our findings suggest that the climatological effects of deforestation can lead to permanent forest loss in this region. We identify hot spot regions where forest loss should be avoided to maintain the ecological integrity of the Amazon forest.
Hydrologic resilience and Amazon productivity.
Ahlström, Anders; Canadell, Josep G; Schurgers, Guy; Wu, Minchao; Berry, Joseph A; Guan, Kaiyu; Jackson, Robert B
2017-08-30
The Amazon rainforest is disproportionately important for global carbon storage and biodiversity. The system couples the atmosphere and land, with moist forest that depends on convection to sustain gross primary productivity and growth. Earth system models that estimate future climate and vegetation show little agreement in Amazon simulations. Here we show that biases in internally generated climate, primarily precipitation, explain most of the uncertainty in Earth system model results; models, empirical data and theory converge when precipitation biases are accounted for. Gross primary productivity, above-ground biomass and tree cover align on a hydrological relationship with a breakpoint at ~2000 mm annual precipitation, where the system transitions between water and radiation limitation of evapotranspiration. The breakpoint appears to be fairly stable in the future, suggesting resilience of the Amazon to climate change. Changes in precipitation and land use are therefore more likely to govern biomass and vegetation structure in Amazonia. Earth system model simulations of future climate in the Amazon show little agreement. Here, the authors show that biases in internally generated climate explain most of this uncertainty and that the balance between water-saturated and water-limited evapotranspiration controls the Amazon resilience to climate change.
Observations and Modeling of the Green Ocean Amazon: Sounding Enhancement Field Campaign Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schumacher, Courtney
2016-05-01
The goal of this campaign was to provide higher temporal sampling of the vertical structure of the atmosphere during the two intensive observational periods (IOPs) of the GoAmazon 2014/15 campaign. The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility’s first ARM Mobile Facility (AMF1) baseline launches for 2014 and 2015 were 4 sondes/day at 2 am, 8 am, 2 pm, and 8 pm local time (LT) (6, 12, 18 and 0 Coordinated Universal Time [UTC]). However, rapid changes in boundary layer and free tropospheric temperature, humidity, and wind profiles happen throughout the diurnal cycle over Manaus, Brazil's complex forest canopy, with resulting responses in aerosol, cloud, and precipitation characteristics. This campaign increased sampling to 5 sondes/day for the 2014 wet and dry season IOPs by adding a launch at 11 am (15 UTC) to capture rapid changes in boundary layer properties and convective cloud growth during that time. The extra launch also corresponded to the time of day the ARM Gulfstream (G-1) and German HALO aircraft most often flew, thus providing useful measurements of the large-scale environment during the flights. In addition, the extra launch will significantly add to the quality of AMF1 instrument retrievals and the variational analysis forcing data set during the IOPs.
OCRA radiometric cloud fractions for GOME-2 on MetOp-A/B
NASA Astrophysics Data System (ADS)
Lutz, R.; Loyola, D.; Gimeno García, S.; Romahn, F.
2015-12-01
This paper describes an approach for cloud parameter retrieval (radiometric cloud fraction estimation) using the polarization measurements of the Global Ozone Monitoring Experiment-2 (GOME-2) on-board the MetOp-A/B satellites. The core component of the Optical Cloud Recognition Algorithm (OCRA) is the calculation of monthly cloud-free reflectances for a global grid (resolution of 0.2° in longitude and 0.2° in latitude) to derive radiometric cloud fractions. These cloud fractions will serve as a priori information for the retrieval of cloud top height (CTH), cloud top pressure (CTP), cloud top albedo (CTA) and cloud optical thickness (COT) with the Retrieval Of Cloud Information using Neural Networks (ROCINN) algorithm. This approach is already being implemented operationally for the GOME/ERS-2 and SCIAMACHY/ENVISAT sensors and here we present version 3.0 of the OCRA algorithm applied to the GOME-2 sensors. Based on more than six years of GOME-2A data (February 2007-June 2013), reflectances are calculated for ≈ 35 000 orbits. For each measurement a degradation correction as well as a viewing angle dependent and latitude dependent correction is applied. In addition, an empirical correction scheme is introduced in order to remove the effect of oceanic sun glint. A comparison of the GOME-2A/B OCRA cloud fractions with co-located AVHRR geometrical cloud fractions shows generally good agreement with a mean difference of -0.15±0.20. From an operational point of view, an advantage of the OCRA algorithm is its extremely fast computational time and its straightforward transferability to similar sensors like OMI (Ozone Monitoring Instrument), TROPOMI (TROPOspheric Monitoring Instrument) on Sentinel 5 Precursor, as well as Sentinel 4 and Sentinel 5. In conclusion, it is shown that a robust, accurate and fast radiometric cloud fraction estimation for GOME-2 can be achieved with OCRA by using the polarization measurement devices (PMDs).
OCRA radiometric cloud fractions for GOME-2 on MetOp-A/B
NASA Astrophysics Data System (ADS)
Lutz, Ronny; Loyola, Diego; Gimeno García, Sebastián; Romahn, Fabian
2016-05-01
This paper describes an approach for cloud parameter retrieval (radiometric cloud-fraction estimation) using the polarization measurements of the Global Ozone Monitoring Experiment-2 (GOME-2) onboard the MetOp-A/B satellites. The core component of the Optical Cloud Recognition Algorithm (OCRA) is the calculation of monthly cloud-free reflectances for a global grid (resolution of 0.2° in longitude and 0.2° in latitude) to derive radiometric cloud fractions. These cloud fractions will serve as a priori information for the retrieval of cloud-top height (CTH), cloud-top pressure (CTP), cloud-top albedo (CTA) and cloud optical thickness (COT) with the Retrieval Of Cloud Information using Neural Networks (ROCINN) algorithm. This approach is already being implemented operationally for the GOME/ERS-2 and SCIAMACHY/ENVISAT sensors and here we present version 3.0 of the OCRA algorithm applied to the GOME-2 sensors. Based on more than five years of GOME-2A data (April 2008 to June 2013), reflectances are calculated for ≈ 35 000 orbits. For each measurement a degradation correction as well as a viewing-angle-dependent and latitude-dependent correction is applied. In addition, an empirical correction scheme is introduced in order to remove the effect of oceanic sun glint. A comparison of the GOME-2A/B OCRA cloud fractions with colocated AVHRR (Advanced Very High Resolution Radiometer) geometrical cloud fractions shows generally good agreement with a mean difference of -0.15 ± 0.20. From an operational point of view, an advantage of the OCRA algorithm is its very fast computational time and its straightforward transferability to similar sensors like OMI (Ozone Monitoring Instrument), TROPOMI (TROPOspheric Monitoring Instrument) on Sentinel 5 Precursor, as well as Sentinel 4 and Sentinel 5. In conclusion, it is shown that a robust, accurate and fast radiometric cloud-fraction estimation for GOME-2 can be achieved with OCRA using polarization measurement devices (PMDs).
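The scaling at the heart of a radiometric cloud fraction can be illustrated with a minimal sketch: a measured reflectance is placed between a cloud-free background and a fully cloudy reference. This is an assumption-laden illustration, not the operational OCRA code; the function name and reference values are invented for the example.

```python
import numpy as np

def radiometric_cloud_fraction(reflectance, clear_reflectance, cloudy_reflectance):
    """Scale a measured reflectance between a cloud-free background
    and a fully cloudy reference, clipped to the physical range [0, 1]."""
    f = (reflectance - clear_reflectance) / (cloudy_reflectance - clear_reflectance)
    return np.clip(f, 0.0, 1.0)

# A measurement halfway between the clear and cloudy references
print(radiometric_cloud_fraction(0.45, 0.1, 0.8))  # 0.5
```

The monthly cloud-free reflectance maps described in the abstract would supply `clear_reflectance` per grid cell; the clipping step guards against reflectances brighter than the cloudy reference.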
Carbon uptake by mature Amazon forests has mitigated Amazon nations' carbon emissions.
Phillips, Oliver L; Brienen, Roel J W
2017-12-01
Several independent lines of evidence suggest that Amazon forests have provided a significant carbon sink service, and also that the Amazon carbon sink in intact, mature forests may now be threatened as a result of different processes. There has however been no work done to quantify non-land-use-change forest carbon fluxes on a national basis within Amazonia, or to place these national fluxes and their possible changes in the context of the major anthropogenic carbon fluxes in the region. Here we present a first attempt to interpret results from ground-based monitoring of mature forest carbon fluxes in a biogeographically, politically, and temporally differentiated way. Specifically, using results from a large long-term network of forest plots, we estimate the Amazon biomass carbon balance over the last three decades for the different regions and nine nations of Amazonia, and evaluate the magnitude and trajectory of these differentiated balances in relation to major national anthropogenic carbon emissions. The sink of carbon into mature forests has been remarkably geographically ubiquitous across Amazonia, being substantial and persistent in each of the five biogeographic regions within Amazonia. Between 1980 and 2010, it has more than mitigated the fossil fuel emissions of every single national economy, except that of Venezuela. For most nations (Bolivia, Colombia, Ecuador, French Guiana, Guyana, Peru, Suriname) the sink has probably additionally mitigated all anthropogenic carbon emissions due to Amazon deforestation and other land use change. While the sink has weakened in some regions since 2000, our analysis suggests that Amazon nations which are able to conserve large areas of natural and semi-natural landscape still contribute globally-significant carbon sequestration. Mature forests across all of Amazonia have contributed significantly to mitigating climate change for decades. Yet Amazon nations have not directly benefited from providing this global scale
The MSG-SEVIRI-based cloud property data record CLAAS-2
NASA Astrophysics Data System (ADS)
Benas, Nikos; Finkensieper, Stephan; Stengel, Martin; van Zadelhoff, Gerd-Jan; Hanschmann, Timo; Hollmann, Rainer; Fokke Meirink, Jan
2017-07-01
Clouds play a central role in the Earth's atmosphere, and satellite observations are crucial for monitoring clouds and understanding their impact on the energy budget and water cycle. Within the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) Satellite Application Facility on Climate Monitoring (CM SAF), a new cloud property data record was derived from geostationary Meteosat Spinning Enhanced Visible and Infrared Imager (SEVIRI) measurements for the time frame 2004-2015. The resulting CLAAS-2 (CLoud property dAtAset using SEVIRI, Edition 2) data record is publicly available via the CM SAF website (https://doi.org/10.5676/EUM_SAF_CM/CLAAS/V002). In this paper we present an extensive evaluation of the CLAAS-2 cloud products, which include cloud fractional coverage, thermodynamic phase, cloud top properties, liquid/ice cloud water path and corresponding optical thickness and particle effective radius. Data validation and comparisons were performed on both level 2 (native SEVIRI grid and repeat cycle) and level 3 (daily and monthly averages and histograms) with reference datasets derived from lidar, microwave and passive imager measurements. The evaluation results show very good overall agreement with matching spatial distributions and temporal variability and small biases attributed mainly to differences in sensor characteristics, retrieval approaches, spatial and temporal samplings and viewing geometries. No major discrepancies were found. Underpinned by the good evaluation results, CLAAS-2 demonstrates that it is fit for the envisaged applications, such as process studies of the diurnal cycle of clouds and the evaluation of regional climate models. The data record is planned to be extended and updated in the future.
NASA Astrophysics Data System (ADS)
Taylor, Thomas E.; O'Dell, Christopher W.; Frankenberg, Christian; Partain, Philip T.; Cronk, Heather Q.; Savtchenko, Andrey; Nelson, Robert R.; Rosenthal, Emily J.; Chang, Albert Y.; Fisher, Brenden; Osterman, Gregory B.; Pollock, Randy H.; Crisp, David; Eldering, Annmarie; Gunson, Michael R.
2016-03-01
The objective of the National Aeronautics and Space Administration's (NASA) Orbiting Carbon Observatory-2 (OCO-2) mission is to retrieve the column-averaged carbon dioxide (CO2) dry air mole fraction (XCO2) from satellite measurements of reflected sunlight in the near-infrared. These estimates can be biased by clouds and aerosols, i.e., contamination, within the instrument's field of view. Screening of the most contaminated soundings minimizes unnecessary calls to the computationally expensive Level 2 (L2) XCO2 retrieval algorithm. Hence, robust cloud screening methods have been an important focus of the OCO-2 algorithm development team. Two distinct, computationally inexpensive cloud screening algorithms have been developed for this application. The A-Band Preprocessor (ABP) retrieves the surface pressure using measurements in the 0.76 µm O2 A band, neglecting scattering by clouds and aerosols, which introduce photon path-length differences that can cause large deviations between the expected and retrieved surface pressure. The Iterative Maximum A Posteriori (IMAP) Differential Optical Absorption Spectroscopy (DOAS) Preprocessor (IDP) retrieves independent estimates of the CO2 and H2O column abundances using observations taken at 1.61 µm (weak CO2 band) and 2.06 µm (strong CO2 band), while neglecting atmospheric scattering. The CO2 and H2O column abundances retrieved in these two spectral regions differ significantly in the presence of cloud and scattering aerosols. The combination of these two algorithms, which are sensitive to different features in the spectra, provides the basis for cloud screening of the OCO-2 data set. To validate the OCO-2 cloud screening approach, collocated measurements from NASA's Moderate Resolution Imaging Spectrometer (MODIS), aboard the Aqua platform, were compared to results from the two OCO-2 cloud screening algorithms. With tuning of algorithmic threshold parameters that allows for processing of ≃ 20-25 % of all OCO-2 soundings
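The two preprocessor tests described above can be combined into a simple pass/fail screen. The sketch below is illustrative only: the function name, threshold values, and the reduction of each test to a single scalar comparison are assumptions, not the operational OCO-2 algorithms.

```python
def passes_cloud_screen(p_retrieved, p_prior, co2_weak, co2_strong,
                        dp_max=25.0, ratio_tol=0.02):
    """Flag a sounding as 'clear' only if both screening tests agree.

    ABP-like test: the retrieved surface pressure should be close to the
    prior; cloud/aerosol scattering lengthens photon paths and biases it.
    IDP-like test: CO2 columns retrieved from the weak and strong bands
    should agree; scattering makes them diverge.
    Thresholds (hPa and relative ratio) are made up for illustration.
    """
    abp_clear = abs(p_retrieved - p_prior) <= dp_max
    idp_clear = abs(co2_weak / co2_strong - 1.0) <= ratio_tol
    return abp_clear and idp_clear

print(passes_cloud_screen(1008.0, 1010.0, 400.0, 401.0))  # True
print(passes_cloud_screen(960.0, 1010.0, 400.0, 401.0))   # False (pressure deviation)
```

The point of combining the two tests, as the abstract notes, is that they are sensitive to different spectral features, so a sounding must look clear to both before being passed to the expensive L2 retrieval.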
NASA Astrophysics Data System (ADS)
Trepte, Qing; Minnis, Patrick; Sun-Mack, Sunny; Trepte, Charles
Clouds and aerosol play important roles in the global climate system. Accurately detecting their presence, altitude, and properties using satellite radiance measurements is a crucial first step in determining their influence on surface and top-of-atmosphere radiative fluxes. This paper presents a comparison analysis of a new version of the Clouds and Earth's Radiant Energy System (CERES) Edition 3 cloud detection algorithms using Aqua MODIS data with the recently released Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) Version 2 Vertical Feature Mask (VFM). Improvements in CERES Edition 3 cloud mask include dust detection, thin cirrus tests, enhanced low cloud detection at night, and a smoother transition from mid-latitude to polar regions. For the CALIPSO Version 2 data set, changes to the lidar calibration can result in significant improvements to its identification of optically thick aerosol layers. The Aqua and CALIPSO satellites, part of the A-train satellite constellation, provide a unique opportunity for validating passive sensor cloud and aerosol detection using an active sensor. In this paper, individual comparison cases will be discussed for different types of clouds and aerosols over various surfaces, for daytime and nighttime conditions, and for regions ranging from the tropics to the poles. Examples will include an assessment of the CERES detection algorithm for optically thin cirrus, marine stratus, and polar night clouds as well as its ability to characterize Saharan dust plumes off the African coast. With the CALIPSO lidar's unique ability to probe the vertical structure of clouds and aerosol layers, it provides an excellent validation data set for cloud detection algorithms, especially for polar nighttime clouds.
The HEPiX Virtualisation Working Group: Towards a Grid of Clouds
NASA Astrophysics Data System (ADS)
Cass, Tony
2012-12-01
The use of virtual machine images, as for example with Cloud services such as Amazon's Elastic Compute Cloud, is attractive for users as they have a guaranteed execution environment, something that cannot today be provided across sites participating in computing grids such as the Worldwide LHC Computing Grid. However, Grid sites often operate within computer security frameworks which preclude the use of remotely generated images. The HEPiX Virtualisation Working Group was set up with the objective of enabling the use of remotely generated virtual machine images at Grid sites and, to this end, has introduced the idea of trusted virtual machine images which are guaranteed to be secure and configurable by sites such that security policy commitments can be met. This paper describes the requirements and details of these trusted virtual machine images and presents a model for their use to facilitate the integration of Grid- and Cloud-based computing environments for High Energy Physics.
Detection and monitoring of H2O and CO2 ice clouds on Mars
Bell, J.F.; Calvin, W.M.; Ockert-Bell, M. E.; Crisp, D.; Pollack, James B.; Spencer, J.
1996-01-01
We have developed an observational scheme for the detection and discrimination of Mars atmospheric H2O and CO2 clouds using ground-based instruments in the near infrared. We report the results of our cloud detection and characterization study using Mars near IR images obtained during the 1990 and 1993 oppositions. We focused on specific wavelengths that have the potential, based on previous laboratory studies of H2O and CO2 ices, of yielding the greatest degree of cloud detectability and compositional discriminability. We have detected and mapped absorption features at some of these wavelengths in both the northern and southern polar regions of Mars. Compositional information on the nature of these absorption features was derived from comparisons with laboratory ice spectra and with a simplified radiative transfer model of a CO2 ice cloud overlying a bright surface. Our results indicate that both H2O and CO2 ices can be detected and distinguished in the polar hood clouds. The region near 3.00 μm is most useful for the detection of water ice clouds because there is a strong H2O ice absorption at this wavelength but only a weak CO2 ice band. The region near 3.33 μm is most useful for the detection of CO2 ice clouds because there is a strong, relatively narrow CO2 ice band at this wavelength but only broad "continuum" H2O ice absorption. Weaker features near 2.30 μm could arise from CO2 ice at coarse grain sizes, or surface/dust minerals. Narrow features near 2.00 μm, which could potentially be very diagnostic of CO2 ice clouds, suffer from contamination by Mars atmospheric CO2 absorptions and are difficult to interpret because of the rather poor knowledge of surface elevation at high latitudes. These results indicate that future ground-based, Earth-orbital, and spacecraft studies over a more extended span of the seasonal cycle should yield substantial information on the style and timing of volatile transport on Mars, as well as a more detailed understanding of
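The band-depth logic described above, a deep 3.00 μm absorption pointing to H2O ice versus a deep, narrow 3.33 μm band pointing to CO2 ice, can be sketched as a crude classifier. The function names, the continuum treatment, and the threshold are illustrative assumptions, not the authors' retrieval method.

```python
def band_depth(continuum, band):
    """Relative absorption depth: 0 = no absorption, 1 = total absorption."""
    return 1.0 - band / continuum

def classify_ice(r_300, r_333, continuum, threshold=0.1):
    """Crude two-band discriminator: r_300 and r_333 are reflectances
    near 3.00 um and 3.33 um; 'continuum' is a nearby band-free reflectance."""
    labels = []
    if band_depth(continuum, r_300) > threshold:
        labels.append("H2O ice")   # strong 3.00 um band -> water ice
    if band_depth(continuum, r_333) > threshold:
        labels.append("CO2 ice")   # strong 3.33 um band -> CO2 ice
    return labels or ["no ice detected"]

print(classify_ice(r_300=0.2, r_333=0.38, continuum=0.4))  # ['H2O ice']
```

A real analysis would, as the abstract notes, compare full spectra against laboratory ice measurements and a radiative transfer model rather than thresholding two bands.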
The Amazon Boundary-Layer Experiment (ABLE 2B) - A meteorological perspective
NASA Technical Reports Server (NTRS)
Garstang, Michael; Greco, Steven; Scala, John; Swap, Robert; Ulanski, Stanley; Fitzjarrald, David; Martin, David; Browell, Edward; Shipman, Mark; Connors, Vickie
1990-01-01
The Amazon Boundary-Layer Experiments (ABLE) 2A and 2B, which were performed near Manaus, Brazil in July-August, 1985, and April-May, 1987 are discussed. The experiments were performed to study the sources, sinks, concentrations, and transports of trace gases and aerosols in rain forest soils, wetlands, and vegetation. Consideration is given the design and preliminary results of the experiment, focusing on the relationships between meteorological scales of motion and the flux, transports, and reactions of chemical species and aerosols embedded in the atmospheric fluid. Meteorological results are presented and the role of the meteorological results in the atmospheric chemistry experiment is examined.
Extratropical Responses to Amazon Deforestation
NASA Astrophysics Data System (ADS)
Badger, A.; Dirmeyer, P.
2014-12-01
Land-use change (LUC) is known to impact local climate conditions through modifications of land-atmosphere interactions. Large-scale LUC, such as Amazon deforestation, could have a significant effect on the local and regional climates. The question remains as to what the global impact of large-scale LUC could be, as previous modeling studies have shown non-local responses due to Amazon deforestation. A common shortcoming in many previous modeling studies is the use of prescribed ocean conditions, which can act as a boundary condition to dampen the global response with respect to changes in the mean and variability. Using fully coupled modeling simulations with the Community Earth System Model version 1.2.0, the Amazon rainforest has been replaced with a distribution of representative tropical crops. Through the modifications of local land-atmosphere interactions, a significant change in the region, both at the surface and throughout the atmosphere, can be quantified. Accompanying these local changes are significant changes to the atmospheric circulation across all scales, thus modifying regional climates in other locales. Notable impacts include significant changes in precipitation, surface fluxes, basin-wide sea surface temperatures and ENSO behavior.
Soriano, Marlene; Mohren, Frits; Ascarrunz, Nataly; Dressler, Wolfram; Peña-Claros, Marielos
2017-01-01
The Bolivian Amazon holds a complex configuration of people and forested landscapes in which communities hold secure tenure rights over a rich ecosystem offering a range of livelihood income opportunities. A large share of this income is derived from Amazon nut (Bertholletia excelsa). Many communities also have long-standing experience with community timber management plans. However, livelihood needs and desires for better living conditions may continue to place these resources under considerable stress as income needs and opportunities intensify and diversify. We aim to identify the socioeconomic and biophysical factors determining the income from forests, husbandry, off-farm and two keystone forest products (i.e., Amazon nut and timber) in the Bolivian Amazon region. We used structural equation modelling tools to account for the complex inter-relationships between socioeconomic and biophysical factors in predicting each source of income. The potential exists to increase incomes from existing livelihood activities in ways that reduce dependency upon forest resources. For example, changes in off-farm income sources can act to increase or decrease forest incomes. Market accessibility, social, financial, and natural and physical assets determined the amount of income community households could derive from Amazon nut and timber. Factors related to community households' local ecological knowledge, such as the number of non-timber forest products harvested and the number of management practices applied to enhance Amazon nut production, defined the amount of income these households could derive from Amazon nut and timber, respectively. The (inter) relationships found among socioeconomic and biophysical factors over income shed light on ways to improve forest-dependent livelihoods in the Bolivian Amazon. We believe that our analysis could be applicable to other contexts throughout the tropics as well.
Integrating multiple scientific computing needs via a Private Cloud infrastructure
NASA Astrophysics Data System (ADS)
Bagnasco, S.; Berzano, D.; Brunetti, R.; Lusso, S.; Vallero, S.
2014-06-01
In a typical scientific computing centre, diverse applications coexist and share a single physical infrastructure. An underlying Private Cloud facility eases the management and maintenance of heterogeneous use cases such as multipurpose or application-specific batch farms, Grid sites catering to different communities, parallel interactive data analysis facilities and others. It makes it possible to allocate resources to any application dynamically and efficiently, and to tailor the virtual machines to the applications' requirements. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques; for example, rolling updates can be performed easily while minimizing downtime. In this contribution we describe the Private Cloud infrastructure at the INFN-Torino Computer Centre, that hosts a full-fledged WLCG Tier-2 site and a dynamically expandable PROOF-based Interactive Analysis Facility for the ALICE experiment at the CERN LHC and several smaller scientific computing applications. The Private Cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem (used in two different configurations for worker- and service-class hypervisors) and the OpenWRT Linux distribution (used for network virtualization). A future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and by using mainstream contextualization tools like CloudInit.
Security of the Brazilian Amazon Area
1992-04-01
effect in Amazonia". Brazil’s Institute for Space Research. São Paulo, April 1991: 5-6. Thompson, Dick. "A Global Agenda for the Amazon." Time, 18...to be overcome as Brazil pursues settlement and development of the Amazon. The natural ecologic systems of the Amazon must be defended with...agricultural techniques appropriate to the region and developed within the context of a comprehensive, responsible program that meets Brazil’s needs for
Extraction of Profile Information from Cloud Contaminated Radiances. Appendixes 2
NASA Technical Reports Server (NTRS)
Smith, W. L.; Zhou, D. K.; Huang, H.-L.; Li, Jun; Liu, X.; Larar, A. M.
2003-01-01
Clouds act to reduce the signal level and may produce noise, depending on the complexity of the cloud properties and the manner in which they are treated in the profile retrieval process. There are essentially three ways to extract profile information from cloud contaminated radiances: (1) cloud-clearing using spatially adjacent cloud contaminated radiance measurements, (2) retrieval based upon the assumption of opaque cloud conditions, and (3) retrieval or radiance assimilation using a physically correct cloud radiative transfer model which accounts for the absorption and scattering of the radiance observed. Cloud clearing extracts the radiance arising from the clear air portion of partly clouded fields of view, permitting soundings to the surface or the assimilation of radiances as in the clear field of view case. However, the accuracy of the clear air radiance signal depends upon the cloud height and optical property uniformity across the two fields of view used in the cloud clearing process. The assumption of opaque clouds within the field of view permits relatively accurate profiles to be retrieved down to near cloud top levels, the accuracy near the cloud top level being dependent upon the actual microphysical properties of the cloud. The use of a physically correct cloud radiative transfer model enables accurate retrievals down to cloud top levels and below semi-transparent cloud layers (e.g., cirrus). It should also be possible to assimilate cloudy radiances directly into the model given a physically correct cloud radiative transfer model using geometric and microphysical cloud parameters retrieved from the radiance spectra as initial cloud variables in the radiance assimilation process. This presentation reviews the above three ways to extract profile information from cloud contaminated radiances. NPOESS Airborne Sounder Testbed-Interferometer radiance spectra and Aqua satellite AIRS radiance spectra are used to illustrate how cloudy radiances can be used
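The first approach, cloud clearing from two adjacent fields of view, is classically written as a linear extrapolation of the radiance pair. The sketch below assumes both fields of view share the same cloud height and optical properties, exactly the uniformity condition the abstract identifies as the accuracy limit; names and numbers are illustrative.

```python
def cloud_clear(r1, r2, eta):
    """Two-FOV cloud clearing: extrapolate a pair of partly cloudy radiances
    (r1 from the less cloudy field of view) to the clear-sky radiance.
    eta = n1 / (n2 - n1), where n1 < n2 are the cloud fractions of the two
    fields of view, assumed to contain the same cloud type and height."""
    return r1 + eta * (r1 - r2)

# FOVs with 20% and 60% cloud cover: eta = 0.2 / (0.6 - 0.2) = 0.5.
# With clear-sky radiance 100 and overcast radiance 40, the observed
# pair is r1 = 88 and r2 = 64, and the extrapolation recovers 100.
print(cloud_clear(88.0, 64.0, 0.5))  # 100.0
```

In practice eta is not known a priori; operational schemes estimate it from channels where the clear radiance can be predicted independently, which is where the cloud-uniformity assumption enters the error budget.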
Using S3 cloud storage with ROOT and CvmFS
NASA Astrophysics Data System (ADS)
Arsuaga-Ríos, María; Heikkilä, Seppo S.; Duellmann, Dirk; Meusel, René; Blomer, Jakob; Couturier, Ben
2015-12-01
Amazon S3 is a widely adopted web API for scalable cloud storage that could also fulfill storage requirements of the high-energy physics community. CERN has been evaluating this option using some key HEP applications such as ROOT and the CernVM filesystem (CvmFS) with S3 back-ends. In this contribution we present an evaluation of two versions of the Huawei UDS storage system stressed with a large number of clients executing HEP software applications. The performance of concurrently storing individual objects is presented alongside more complex data access patterns as produced by the ROOT data analysis framework. Both Huawei UDS generations scale successfully and support multiple byte-range requests, in contrast with Amazon S3 and Ceph, which do not support these commonly used HEP operations. We further report on the S3 integration with recent CvmFS versions and summarize the experience with CvmFS/S3 for publishing daily releases of the full LHCb experiment software stack.
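The byte-range capability at issue can be illustrated with the HTTP Range header that multi-range reads use: a single request asking for several disjoint byte spans, the pattern a ROOT analysis generates when it reads scattered TTree baskets from one file. This sketch only builds the header string; it is not the actual ROOT or CvmFS I/O code.

```python
def multirange_header(ranges):
    """Build an HTTP/1.1 Range header requesting multiple byte ranges
    in one round trip. Each range is an inclusive (first, last) byte pair."""
    return "bytes=" + ",".join(f"{first}-{last}" for first, last in ranges)

# Two scattered reads, e.g. bytes 0-1023 and 4096-8191 of a remote file
print(multirange_header([(0, 1023), (4096, 8191)]))
# bytes=0-1023,4096-8191
```

A server that supports this replies with a multipart/byteranges body; a server that only honors single ranges (as the abstract reports for the S3 API at the time) forces the client into one round trip per basket, which is exactly the latency penalty the evaluation measures.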
The impact of precipitation evaporation on the atmospheric aerosol distribution in EC-Earth v3.2.0
NASA Astrophysics Data System (ADS)
de Bruine, Marco; Krol, Maarten; van Noije, Twan; Le Sager, Philippe; Röckmann, Thomas
2018-04-01
The representation of aerosol-cloud interaction in global climate models (GCMs) remains a large source of uncertainty in climate projections. Due to its complexity, precipitation evaporation is either ignored or taken into account in a simplified manner in GCMs. This research explores various ways to treat aerosol resuspension and determines the possible impact of precipitation evaporation and subsequent aerosol resuspension on global aerosol burdens and distribution. The representation of aerosol wet deposition by large-scale precipitation in the EC-Earth model has been improved by utilising additional precipitation-related 3-D fields from the dynamical core, the Integrated Forecasting System (IFS) general circulation model, in the chemistry and aerosol module Tracer Model, version 5 (TM5). A simple approach of scaling aerosol release with evaporated precipitation fraction leads to an increase in the global aerosol burden (+7.8 to +15 % for different aerosol species). However, when taking into account the different sizes and evaporation rate of raindrops following Gong et al. (2006), the release of aerosols is strongly reduced, and the total aerosol burden decreases by -3.0 to -8.5 %. Moreover, inclusion of cloud processing based on observations by Mitra et al. (1992) transforms scavenged small aerosol to coarse particles, which enhances removal by sedimentation and hence leads to a -10 to -11 % lower aerosol burden. Finally, when these two effects are combined, the global aerosol burden decreases by -11 to -19 %. Compared to the Moderate Resolution Imaging Spectroradiometer (MODIS) satellite observations, aerosol optical depth (AOD) is generally underestimated in most parts of the world in all configurations of the TM5 model and although the representation is now physically more realistic, global AOD shows no large improvements in spatial patterns. Similarly, the agreement of the vertical profile with Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP
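The simplest scheme described above, scaling aerosol release with the fraction of precipitation that evaporates, can be sketched as follows. Variable names and units are illustrative assumptions; the size-dependent raindrop treatment following Gong et al. (2006) and the cloud-processing step are not shown.

```python
def resuspended_aerosol(scavenged_mass, precip_flux_in, precip_flux_out):
    """Release scavenged aerosol in proportion to the fraction of the
    precipitation flux that evaporates within a model layer.
    Fluxes in kg m-2 s-1, with precip_flux_out <= precip_flux_in."""
    if precip_flux_in <= 0.0:
        return scavenged_mass  # all precipitation evaporated upstream
    evap_fraction = max(0.0, 1.0 - precip_flux_out / precip_flux_in)
    return scavenged_mass * evap_fraction

# Half of the rain flux evaporates in this layer -> half the aerosol returns
print(resuspended_aerosol(2.0e-9, 1.0e-4, 0.5e-4))
```

Because this linear scaling returns aerosol wherever any evaporation occurs, it yields the burden increase the abstract reports; the size-resolved treatment evaporates small drops preferentially and so resuspends far less aerosol mass.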
The effects of CO2 on phytoplankton community structure in the Amazon River Plume
NASA Astrophysics Data System (ADS)
Chen, T. L.; Goes, J. I.; Gomes, H. R.; McKee, K. T.
2013-12-01
The Amazon River Plume results from an enormous discharge of freshwater and organic matter into the Atlantic Ocean. It is a unique environment with a natural pCO2 gradient in the surface waters of the plume that range from 130-950 μatm. The response of coastal marine phytoplankton to increased anthropogenic CO2 emission is still unknown, hence the Amazon River Plume gradient can serve as a natural laboratory to examine the potential influence of atmospheric CO2 increases and ocean acidification on phytoplankton community composition. A two pronged study was undertaken: the first in which shipboard samples from a 2010 cruise to the Amazon River Plume were analyzed to examine the distribution of 3 major phytoplankton groups (diatoms, diatom-diazotroph associations [DDAs], and the diazotroph Trichodesmium spp.) with respect to the natural pCO2 gradient; the second in which the growth response of Thalassiosira weisflogii, a representative diatom species, was examined under experimentally manipulated CO2 conditions. Cruise data analysis showed that diatoms were found with higher cell counts around 150 μatm; DDAs seemed to dominate waters within the narrow range of 350-400 μatm CO2; and the diazotroph Trichodesmium spp. grew in a wide range of pCO2 conditions, but with higher cell counts at upwards of 500 μatm. Phytoplankton group distributions along the CO2 gradient may be due to differences in their carbon-concentrating mechanism (CCMs) efficiencies. The CO2 manipulation apparatus was assembled such that the cells were grown under three different CO2 environments. Differential growth of T. weisflogii was observed at 150, 400, and 800 ppm CO2 treatment. T. weisflogii grew at all three CO2 concentrations, reflecting diatoms' physiological flexibility and efficient CCMs. Absorption spectra analysis of pigments and Fast Repetition Rate Fluorometer analysis indicate potential changes in photosynthetic machinery with different CO2 treatments. Future CO2 manipulation
CATS Cloud and Aerosol Level 2 Heritage Edition Data Products.
NASA Astrophysics Data System (ADS)
Rodier, S. D.; Vaughan, M.; Yorks, J. E.; Palm, S. P.; Selmer, P. A.; Hlavka, D. L.; McGill, M. J.; Trepte, C. R.
2017-12-01
The Cloud-Aerosol Transport System (CATS) instrument was developed at NASA's Goddard Space Flight Center (GSFC) and deployed to the International Space Station (ISS) in January 2015. The CATS elastic backscatter lidars have been operating continuously in one of two science modes since February 2015. One of the primary science objectives of CATS is to continue the CALIPSO aerosol and cloud profile data record, providing continuity of lidar climate observations during the transition from CALIPSO to EarthCARE. To accomplish this, the CATS project at GSFC and the CALIPSO project at NASA's Langley Research Center (LaRC) closely collaborated to develop and deliver a full suite of CALIPSO-like level 2 data products, using the latest version of the CALIPSO level 2 Version 4 algorithms, for the CATS data acquired while operating in science mode 1 (multi-beam backscatter detection at 1064 and 532 nm, with depolarization measurement at both wavelengths). In this work, we present the current status of the CATS Heritage (i.e., CALIPSO-like) level 2 data products derived from the recently released CATS Level 1B V2-08 data. Extensive comparisons are performed between the three data sets (CALIPSO V4.10 Level 2, CATS Level 2 Operational V2-00, and CATS Heritage V1.00) for cloud and aerosol measurements (e.g., cloud-top height, cloud phase, cloud-layer occurrence frequency, and cloud-aerosol discrimination) along the ISS path. In addition, global comparisons (between 52°S and 52°N) of aerosol extinction profiles derived from the CATS Level 2 Operational products and CALIOP V4 Level 2 products are presented. Comparisons of aerosol optical depths retrieved from active sensors (CATS and CALIOP) and passive sensors (MODIS) will provide context for the extinction profile comparisons.
Cuenca-Alba, Jesús; Del Cano, Laura; Gómez Blanco, Josué; de la Rosa Trevín, José Miguel; Conesa Mingo, Pablo; Marabini, Roberto; S Sorzano, Carlos Oscar; Carazo, Jose María
2017-10-01
New instrumentation for cryo-electron microscopy (cryoEM) has significantly increased data collection rates as well as data quality, creating bottlenecks at the image processing level. The current image processing model of moving the acquired images from the data source (electron microscope) to desktops or local clusters for processing is encountering many practical limitations. However, computing may also take place in distributed and decentralized environments, and the cloud is a new form of accessing computing and storage resources on demand. Here, we evaluate how this new computational paradigm can be effectively used by extending our current integrative framework for image processing, creating ScipionCloud. This new development has resulted in a full installation of Scipion in both public and private clouds, accessible as public "images" with all the required cryoEM software preinstalled, requiring just a Web browser to access all graphical user interfaces. We have profiled the performance of different configurations on Amazon Web Services and the European Federated Cloud, always on architectures incorporating GPUs, and compared them with a local facility. We have also analyzed the economic trade-offs of different scenarios, so that cryoEM scientists have a clearer picture of the setup best suited to their needs and budgets. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Wu, Jin; Kobayashi, Hideki; Stark, Scott C; Meng, Ran; Guan, Kaiyu; Tran, Ngoc Nguyen; Gao, Sicong; Yang, Wei; Restrepo-Coupe, Natalia; Miura, Tomoaki; Oliviera, Raimundo Cosme; Rogers, Alistair; Dye, Dennis G; Nelson, Bruce W; Serbin, Shawn P; Huete, Alfredo R; Saleska, Scott R
2018-03-01
Satellite observations of Amazon forests show seasonal and interannual variations, but the underlying biological processes remain debated. Here we combined radiative transfer models (RTMs) with field observations of Amazon forest leaf and canopy characteristics to test three hypotheses for satellite-observed canopy reflectance seasonality: seasonal changes in leaf area index, in canopy-surface leafless crown fraction and/or in leaf demography. Canopy RTMs (PROSAIL and FLiES), driven by these three factors combined, simulated satellite-observed seasonal patterns well, explaining c. 70% of the variability in a key reflectance-based vegetation index (MAIAC EVI, which removes artifacts that would otherwise arise from clouds/aerosols and sun-sensor geometry). Leaf area index, leafless crown fraction and leaf demography independently accounted for 1, 33 and 66% of FLiES-simulated EVI seasonality, respectively. These factors also strongly influenced modeled near-infrared (NIR) reflectance, explaining why both modeled and observed EVI, which is especially sensitive to NIR, captures canopy seasonal dynamics well. Our improved analysis of canopy-scale biophysics rules out satellite artifacts as significant causes of satellite-observed seasonal patterns at this site, implying that aggregated phenology explains the larger scale remotely observed patterns. This work significantly reconciles current controversies about satellite-detected Amazon phenology, and improves our use of satellite observations to study climate-phenology relationships in the tropics. No claim to original US Government works New Phytologist © 2017 New Phytologist Trust.
Black carbon over the Amazon during SAMBBA: it gets everywhere
NASA Astrophysics Data System (ADS)
Morgan, W.; Allan, J. D.; Flynn, M.; Darbyshire, E.; Liu, D.; Szpek, K.; Langridge, J.; Johnson, B. T.; Haywood, J.; Longo, K.; Artaxo, P.; Coe, H.
2014-12-01
Biomass burning represents a major source of Black Carbon (BC) aerosol to the atmosphere, which can result in major perturbations to weather, climate and ecosystem development. Large uncertainties in these impacts prevail, particularly on regional scales. One such region is the Amazon Basin, where large, intense and frequent burning occurs on an annual basis during the dry season. Absorption by atmospheric aerosols is underestimated by models over South America, which points to significant uncertainties relating to BC aerosol properties. Results from the South American Biomass Burning Analysis (SAMBBA) field experiment, which took place during September and October 2012 over Brazil on-board the UK Facility for Airborne Atmospheric Measurement (FAAM) BAe-146 research aircraft, are presented here. Aerosol chemical composition was measured by a DMT Single Particle Soot Photometer (SP2) and an Aerodyne Aerosol Mass Spectrometer (AMS). The physical, chemical and optical properties of BC-containing particles across the region will be characterised, with particular emphasis on the vertical distribution. BC was ubiquitous across the region, with measurements extending from heavily deforested regions in the Western Amazon Basin, through to agricultural fires in the Cerrado (Savannah-like) region and more pristine areas over the Amazon Rainforest. Measurements in the vicinity of Manaus (a city located deep into the jungle) were also conducted. BC concentrations peaked within the boundary layer at a height of around 1.5km. BC-containing particles were found to be rapidly coated in the near-field, with little evidence for additional coating upon advection and dilution. Biomass burning layers within the free troposphere were routinely observed. BC-containing particles within such layers were typically associated with less coating than those within the boundary layer, suggestive of wet removal of more coated BC particles. The importance of such properties in relation to the
The effects of clouds on CO2 forcing
NASA Technical Reports Server (NTRS)
Randall, David A.
1990-01-01
The cloud radiative forcing (CRF) is the difference between the radiative flux (at the top of the atmosphere) which actually occurs in the presence of clouds, and that which would occur if the clouds were removed but the atmospheric state were otherwise unchanged. The CO2 forcing is defined, in analogy with the cloud forcing, as the difference in fluxes and/or infrared heating rates obtained by instantaneously changing (doubling) the CO2 concentration without changing anything else, i.e., without allowing any feedback. An increased CO2 concentration leads to a reduced net upward longwave flux at the Earth's surface. This reduced net upward flux is due to an increased downward emission by the CO2 in the atmosphere above. The negative increment to the net upward flux becomes more intense at higher levels in the troposphere, reaching a peak intensity roughly at the tropopause; it then weakens with height in the stratosphere. This profile implies a warming of the troposphere and a cooling of the stratosphere. The CSU GCM was recently used to make some preliminary CO2 forcing calculations for a single simulation under July conditions. The longwave radiation routine was called twice, to determine the radiative fluxes and heating rates for both 2 x CO2 and 1 x CO2. As diagnostics, the 2-D distributions of the longwave fluxes at the surface and the top of the atmosphere, as well as the 3-D distribution of the longwave cooling in the interior, were saved. In addition, the pressure level (near the tropopause) at which the difference in the longwave flux due to CO2 doubling has its largest magnitude was saved; for convenience, this level is referred to as the CO2 tropopause. The actual difference in the flux at that level was also saved. Finally, all of these fields were duplicated for the hypothetical case of no cloudiness (clear sky), so that the effects of the clouds can be isolated.
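The "CO2 tropopause" diagnostic described above reduces to locating the level of maximum flux-difference magnitude in a vertical profile. A minimal sketch follows; the profile values are hypothetical illustrations, not CSU GCM output.

```python
import numpy as np

def co2_tropopause(pressure_hpa, flux_1x, flux_2x):
    """Locate the 'CO2 tropopause': the pressure level where the
    longwave flux difference due to CO2 doubling has its largest
    magnitude. Returns that pressure and the flux difference there."""
    dflux = np.asarray(flux_2x) - np.asarray(flux_1x)
    k = int(np.argmax(np.abs(dflux)))
    return pressure_hpa[k], dflux[k]

# Idealized flux-difference profile (W m^-2) peaking near 200 hPa;
# a negative increment means a reduced net upward longwave flux.
p = np.array([1000, 850, 700, 500, 300, 200, 100, 50])
dF = np.array([-1.0, -1.5, -2.0, -2.8, -3.5, -4.2, -2.0, 1.0])
level, dmax = co2_tropopause(p, np.zeros_like(dF), dF)
```

With this idealized profile the routine identifies 200 hPa as the CO2 tropopause, mirroring the diagnostic saved from the two radiation calls.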
Amazon Forest Responses to Drought and Fire
NASA Astrophysics Data System (ADS)
Morton, D. C.
2015-12-01
Deforestation and agricultural land uses provide a consistent source of ignitions along the Amazon frontier during the dry season. The risk of understory fires in Amazon forests is amplified by drought conditions, when fires at the forest edge may spread for weeks before rains begin. Fire activity also impacts the regional response of intact forests to drought through diffuse light effects and nutrient redistribution, highlighting the complexity of feedbacks in this coupled human and natural system. This talk will focus on recent advances in our understanding of fire-climate feedbacks in the Amazon, building on research themes initiated under NASA's Large-scale Biosphere-Atmosphere Experiment in Amazonia (LBA). NASA's LBA program began in the wake of the 1997-1998 El Niño, a strong event that exposed the vulnerability of Amazon forests to drought and fire under current climate and projections of climate change. With forecasts of another strong El Niño event in 2015-2016, this talk will provide a multi-scale synthesis of Amazon forest responses to drought and fire based on field measurements, airborne lidar data, and satellite observations of fires, rainfall, and terrestrial water storage. These studies offer new insights into the mechanisms governing fire season severity in the southern Amazon and regional variability in carbon losses from understory fires. The contributions from remote sensing to our understanding of drought and fire in Amazon forests reflect the legacy of NASA's LBA program and the sustained commitment to interdisciplinary research across the Amazon region.
Calibration of LOFAR data on the cloud
NASA Astrophysics Data System (ADS)
Sabater, J.; Sánchez-Expósito, S.; Best, P.; Garrido, J.; Verdes-Montenegro, L.; Lezzi, D.
2017-04-01
New scientific instruments are starting to generate an unprecedented amount of data. The Low Frequency Array (LOFAR), one of the Square Kilometre Array (SKA) pathfinders, is already producing data on a petabyte scale. The calibration of these data presents a huge challenge for final users: (a) extensive storage and computing resources are required; (b) the installation and maintenance of the software required for the processing is not trivial; and (c) the requirements of calibration pipelines, which are experimental and under development, are quickly evolving. After encountering some limitations in classical infrastructures like dedicated clusters, we investigated the viability of cloud infrastructures as a solution. We found that the installation and operation of LOFAR data calibration pipelines is not only possible, but can also be efficient in cloud infrastructures. The main advantages were: (1) the ease of software installation and maintenance, and the availability of standard APIs and tools, widely used in the industry; this reduces the requirement for significant manual intervention, which can have a highly negative impact in some infrastructures; (2) the flexibility to adapt the infrastructure to the needs of the problem, especially as those demands change over time; (3) the on-demand consumption of (shared) resources. We found that a critical factor (also in other infrastructures) is the availability of scratch storage areas of an appropriate size. We found no significant impediments associated with the speed of data transfer, the use of virtualization, the use of external block storage, or the memory available (provided a minimum threshold is reached). Finally, we considered the cost-effectiveness of a commercial cloud like Amazon Web Services. While a cloud solution is more expensive than the operation of a large, fully-utilized cluster completely dedicated to LOFAR data reduction, we found that its costs are competitive if the number of datasets to be
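The cost comparison sketched above, pay-per-use cloud versus a dedicated cluster, hinges on utilisation: the cluster's fixed cost must be amortised over the datasets actually processed. A minimal illustration follows; all rates and counts are invented, not AWS or LOFAR figures.

```python
def cost_per_dataset(cloud_rate_per_hour, hours_per_dataset,
                     cluster_annual_cost, datasets_per_year):
    """Per-dataset cost on a pay-per-use cloud versus an owned cluster
    whose fixed annual cost is spread over the datasets actually
    processed (all figures hypothetical)."""
    cloud = cloud_rate_per_hour * hours_per_dataset
    cluster = cluster_annual_cost / max(datasets_per_year, 1)
    return cloud, cluster

# Low utilisation: the cloud is cheaper; high utilisation: the
# dedicated cluster wins once its fixed cost is amortised.
cloud_low, cluster_low = cost_per_dataset(2.0, 50, 40_000, 50)
cloud_high, cluster_high = cost_per_dataset(2.0, 50, 40_000, 500)
```

This is the sense in which cloud costs are "competitive if the number of datasets" is modest, while a fully-utilized cluster remains cheaper per dataset.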
NASA Astrophysics Data System (ADS)
Li, Wenhong; Fu, Rong; Dickinson, Robert E.
2006-01-01
The global climate models for the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) predict very different changes of rainfall over the Amazon under the SRES A1B scenario for global climate change. Five of the eleven models predict an increase of annual rainfall, three models predict a decrease of rainfall, and the other three models predict no significant changes in the Amazon rainfall. We have further examined two models. The UKMO-HadCM3 model predicts an El Niño-like sea surface temperature (SST) change and warming in the northern tropical Atlantic which appear to enhance atmospheric subsidence and consequently reduce clouds over the Amazon. The resultant increase of surface solar absorption causes a stronger surface sensible heat flux and thus reduces relative humidity of the surface air. These changes decrease the rate and length of wet season rainfall and surface latent heat flux. This decreased wet season rainfall leads to drier soil during the subsequent dry season, which in turn can delay the transition from the dry to wet season. GISS-ER predicts a weaker SST warming in the western Pacific and the southern tropical Atlantic which increases moisture transport and hence rainfall in the Amazon. In the southern Amazon and Nordeste where the strongest rainfall increase occurs, the resultant higher soil moisture supports a higher surface latent heat flux during the dry and transition season and leads to an earlier wet season onset.
Martian equatorial CO2 clouds: a complementary OMEGA and HRSC data analysis
NASA Astrophysics Data System (ADS)
Määttänen, A.; Montmessin, F.; Gondet, B.; Hoffmann, H.; Scholten, F.; Hauber, E.; Bibring, J.-P.; Neukum, G.
2009-04-01
One of the unique features of the Martian climate is the existence of CO2 ice clouds formed from the main atmospheric constituent. These clouds were thought to form only in the polar night, where CO2 condenses on the winter pole. Recently, Mars Express has observed several occurrences of high-altitude CO2 clouds, mainly in the equatorial areas. We use observations by OMEGA (Bibring et al., 2004) and HRSC (Jaumann et al., 2007) to analyse these high-altitude CO2 cloud occurrences. As shown by Montmessin et al. (2007), the spectral signature of CO2 clouds seen in OMEGA spectra exhibits one or two distinct peaks that appear inside a strong CO2 gas absorption band centered at 4.3 microns. We have mapped this spectral signature with a 3-sigma detection method. The mapping of the clouds in three Martian years of OMEGA data has provided a dataset of about 60 cloud occurrences. These observations provide information on the spatial and seasonal distribution of CO2 cloud formation in the equatorial region, and on variations of cloud particle size, related to variations in the spectral signature of the clouds. The clouds exhibit variable morphology, from clearly convective, rounded structures (about 15% of all cases) to more filamentary, cirrus-type clouds. We have also analysed some properties of the clouds (altitude, particle size, opacity) using two shadow observations by OMEGA. We will present the results acquired so far using the datasets of the two instruments. OMEGA shows that the clouds exhibit interannual variations, but in general they are concentrated in specific spatial and seasonal bins, mainly around the equator and around Ls = 45 and Ls = 135, before and after the northern summer solstice. Most high-altitude clouds are observed in a longitudinally limited area, between 150°W and 30°E. During the first year of observations the cloud shadow was also observed on two orbits. The analysis of the cloud observations has revealed that the clouds
Electron temperatures within magnetic clouds between 2 and 4 AU: Voyager 2 observations
NASA Astrophysics Data System (ADS)
Sittler, E. C.; Burlaga, L. F.
1998-08-01
We have performed an analysis of Voyager 2 plasma electron observations within magnetic clouds between 2 and 4 AU identified by Burlaga and Behannon [1982]. The analysis has been confined to three of the magnetic clouds identified by Burlaga and Behannon that had high-quality data. The general properties of the plasma electrons within a magnetic cloud are that (1) the moment electron temperature anticorrelates with the electron density within the cloud, (2) the ratio Te/Tp tends to be >1, and (3) on average, Te/Tp ~ 7.0. All three results are consistent with previous electron observations within magnetic clouds. Detailed analyses of the core and halo populations within the magnetic clouds show no evidence of either an anticorrelation between the core temperature TC and the electron density Ne or an anticorrelation between the halo temperature TH and the electron density. Within the magnetic clouds the halo component can contribute more than 50% of the electron pressure. The anticorrelation of Te relative to Ne can be traced to the density of the halo component relative to the density of the core component. The core electrons dominate the electron density. When the density goes up, the halo electrons contribute less to the electron pressure, so we get a lower Te. When the electron density goes down, the halo electrons contribute more to the electron pressure, and Te goes up. We find a relation between the electron pressure and density of the form Pe = αNe^γ with γ ~ 0.5.
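The power-law relation Pe = αNe^γ reported above can be recovered from moment data by a straight-line fit in log-log space, since log Pe = γ log Ne + log α. The arrays below are synthetic stand-ins for the Voyager 2 moments, with α = 2 and γ = 0.5 built in by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
ne = rng.uniform(0.01, 1.0, 200)   # electron density (arbitrary units)
pe = 2.0 * ne**0.5                 # pressure obeying Pe = alpha * Ne**gamma

# Ordinary least-squares line in log-log space yields both parameters:
# the slope is gamma, the intercept is log(alpha).
gamma, log_alpha = np.polyfit(np.log(ne), np.log(pe), 1)
alpha = np.exp(log_alpha)
```

On noise-free synthetic data the fit returns γ ≈ 0.5 and α ≈ 2.0 exactly; with real moments the scatter of the points about the line indicates how well a single polytrope describes the cloud.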
The CM SAF CLAAS-2 cloud property data record
NASA Astrophysics Data System (ADS)
Benas, Nikos; Finkensieper, Stephan; Stengel, Martin; van Zadelhoff, Gerd-Jan; Hanschmann, Timo; Hollmann, Rainer; Fokke Meirink, Jan
2017-04-01
A new cloud property data record was recently released by the EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF), based on measurements from the geostationary Meteosat Spinning Enhanced Visible and Infrared Imager (SEVIRI) sensors and spanning the period 2004-2015. The CLAAS-2 (CLoud property dAtAset using SEVIRI, Edition 2) data record includes cloud fractional coverage, thermodynamic phase, cloud top height, water path, and the corresponding optical thickness and particle effective radius separately for liquid and ice clouds. These variables are available at high spatial resolution on a 15-minute, daily, and monthly basis. In this presentation the main improvements in the retrieval algorithms compared to the first edition of the data record (CLAAS-1) are highlighted, along with their impact on the quality of the data record. Subsequently, the results of extensive validation and inter-comparison efforts against ground observations, as well as active and passive satellite sensors, are summarized. Overall good agreement is found, with similar spatial and temporal characteristics, along with small biases caused mainly by differences in retrieval approaches, spatial/temporal sampling and viewing geometries.
Longo, Marcos; Knox, Ryan G; Levine, Naomi M; Alves, Luciana F; Bonal, Damien; Camargo, Plinio B; Fitzjarrald, David R; Hayek, Matthew N; Restrepo-Coupe, Natalia; Saleska, Scott R; da Silva, Rodrigo; Stark, Scott C; Tapajós, Raphael P; Wiedemann, Kenia T; Zhang, Ke; Wofsy, Steven C; Moorcroft, Paul R
2018-05-22
The impact of increases in drought frequency on the Amazon forest's composition, structure and functioning remains uncertain. We used a process- and individual-based ecosystem model (ED2) to quantify the forest's vulnerability to increased drought recurrence. We generated meteorologically realistic, drier-than-observed rainfall scenarios for two Amazon forest sites, Paracou (wetter) and Tapajós (drier), to evaluate the impacts of more frequent droughts on forest biomass, structure and composition. The wet site was insensitive to the tested scenarios, whereas at the dry site biomass declined when the average rainfall reduction exceeded 15%, due to high mortality of large-sized evergreen trees. Biomass losses persisted when year-long drought recurrence was shorter than 2-7 yr, depending upon soil texture and leaf phenology. From the site-level scenario results, we developed regionally applicable metrics to quantify the Amazon forest's climatological proximity to rainfall regimes likely to cause biomass loss > 20% in 50 yr according to ED2 predictions. Nearly 25% (1.8 million km²) of the Amazon forests could experience frequent droughts and biomass loss if mean annual rainfall or interannual variability changed by 2σ. At least 10% of the high-emission climate projections (CMIP5/RCP8.5 models) predict critically dry regimes over 25% of the Amazon forest area by 2100. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.
The Amazon Region; A Vision of Sovereignty
1998-04-06
and SPOT remote sensing satellite images, about 90% of the Amazon jungle remains almost untouched. These 280 million hectares of vegetation hold...increasing energy needs, remain unanswered. Indian rights: Has the Indian population been jeopardized by the development of the Amazon Region...or government agency. Strategy Research Project: The Amazon Region; A Vision of Sovereignty, by Lieutenant Colonel Eduardo Jose Barbosa
Schaffer, Débora P H; de Araújo, Nayone L L C; Raposo, Ana Cláudia S; Filho, Emanoel F Martins; Vieira, João Victor R; Oriá, Arianne P
2017-09-01
Safe and effective sedation protocols are important for chemical restraint of birds in clinical and diagnostic procedures, such as clinical evaluations, radiographic positioning, and blood collection. These protocols may reduce stress and ease the management of wild-caught birds, which are susceptible to injury or death when exposed to stressful situations. We compared the sedative effect of intranasal midazolam in wild-caught blue-fronted (Amazona aestiva) and orange-winged (Amazona amazonica) Amazon parrots. Ten adult parrots of each species (n = 20), of unknown sex, weighing 0.337 ± 0.04 kg (blue-fronted) and 0.390 ± 0.03 kg (orange-winged), were used. Midazolam (2 mg/kg) was administered intranasally, with the total volume of the drug divided equally between the 2 nostrils. Onset time and total sedation time were assessed. Satisfactory sedation for clinical evaluation was induced in all birds. Onset and total sedation times were similar in both species: 5.36 ± 1.16 and 25.40 ± 5.72 minutes, respectively, for blue-fronted Amazons, and 5.09 ± 0.89 and 27.10 ± 3.73 minutes, respectively, for orange-winged Amazons. A total of 15 animals showed absence of vocalization, with moderate muscle relaxation and wing movement upon handling, and 2 animals presented with lateral recumbency, with intense muscle relaxation and no wing movement, requiring no restraint. Three blue-fronted Amazons had no effective sedation. Intranasally administered midazolam at a dose of 2 mg/kg effectively promoted sedation with a short latency time and fast recovery in wild-caught parrots.
Glisić, Radmila; Koko, Vesna; Todorović, Vera; Drndarević, Neda; Cvijić, Gordana
2006-09-11
The aim of our study was to investigate the morphological, immunohistochemical and ultrastructural changes of serotonin-producing enterochromaffin (EC) cells of the gastrointestinal mucosa in dexamethasone-treated (D) rats. After 12 days of daily intraperitoneal administration of 2 mg/kg dexamethasone, rats developed diabetes similar to human type 2 diabetes. The stomach, small intestine and large intestine were examined. Large serotonin-positive EC cells appeared in the corpus mucosa epithelium of the D group of rats, although these cells were not present in control (C) rats. Both the volume fraction and the number of EC cells per mm(2) of mucosa were significantly increased only in the duodenum. However, the number of EC cells per circular section of both the antrum and the small intestine was increased, but reduced in both the ascending and descending colon in the D group. The dexamethasone treatment caused a strong reduction in the number of granules in the antral EC cells, while the number gradually increased from the jejunum to the descending colon. The mean granular content was reduced in the antral EC cells but increased in the jejunal EC cells in the D group. In conclusion, the present study showed that morphological changes in gut serotonin-producing EC cells occur in diabetic rats.
Analysis of cloud top height and cloud coverage from satellites using the O2 A and B bands
NASA Technical Reports Server (NTRS)
Kuze, Akihiko; Chance, Kelly V.
1994-01-01
Cloud height and cloud coverage detection are important for total ozone retrieval using ultraviolet and visible scattered light. Use of the O2 A and B bands, around 761 and 687 nm, by a satellite-borne instrument of moderately high spectral resolution viewing in the nadir makes it possible to detect cloud top height and related parameters, including fractional coverage. The measured values of a satellite-borne spectrometer are convolutions of the instrument slit function and the atmospheric transmittance between the cloud top and the satellite. Studies here determine, to high accuracy, the optical depth between the satellite orbit and the Earth's surface or cloud top, using FASCODE 3. Cloud top height and a cloud coverage parameter are determined by least-squares fitting to calculated radiance ratios in the oxygen bands. A grid search method is used to search the parameter space of cloud top height and the coverage parameter to minimize an appropriate sum of squared deviations. For this search, the nonlinearity of the atmospheric transmittance (i.e., leverage based on varying amounts of saturation in the absorption spectrum) is important for distinguishing between cloud top height and fractional coverage. Using this method, an operational cloud detection algorithm requiring minimal computation time can be implemented.
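The grid search described above can be sketched in a few lines. The forward model below is a hypothetical toy, standing in for the FASCODE-based transmittance calculations, but it preserves the key property: channels with different absorption depths saturate differently, which is what separates cloud top height from fractional coverage.

```python
import numpy as np

def simulated_ratios(cloud_top_km, coverage, channels):
    """Toy forward model: radiance ratio per O2-band channel for a given
    cloud top height and fractional coverage (stand-in for FASCODE)."""
    depth = np.asarray(channels, dtype=float)
    clear = np.exp(-depth)                                # clear-sky path
    cloudy = np.exp(-depth * (1.0 - cloud_top_km / 20.0))  # shorter path
    return coverage * cloudy + (1.0 - coverage) * clear

def grid_search(observed, channels):
    """Minimize the sum of squared deviations over a (height, coverage) grid."""
    best = None
    for h in np.arange(0.0, 15.01, 0.5):
        for f in np.arange(0.0, 1.001, 0.05):
            resid = observed - simulated_ratios(h, f, channels)
            cost = float(np.sum(resid**2))
            if best is None or cost < best[0]:
                best = (cost, h, f)
    return best[1], best[2]

channels = [0.5, 1.0, 2.0, 4.0]             # relative absorption depths
obs = simulated_ratios(8.0, 0.6, channels)  # synthetic "measurement"
h_fit, f_fit = grid_search(obs, channels)
```

Because the four channels respond nonlinearly, the search recovers the (8 km, 0.6) pair used to generate the synthetic observation rather than a trade-off between the two parameters.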
NASA Astrophysics Data System (ADS)
de Oliveira, Cleber Gonzales; Paradella, Waldir Renato; da Silva, Arnaldo de Queiroz
The Brazilian Amazon is a vast territory with an enormous need for mapping and monitoring of renewable and non-renewable resources. Due to adverse environmental conditions (rain, cloud, dense vegetation) and difficult access, topographic information is still poor and, where available, needs to be updated or re-mapped. In this paper, the feasibility of using Digital Surface Models (DSMs) extracted from TerraSAR-X Stripmap stereo-pair images for detailed topographic mapping was investigated for a mountainous area in the Carajás Mineral Province, located on the easternmost border of the Brazilian Amazon. The quality of the radargrammetric DSMs was evaluated against field altimetric measurements. Precise topographic field information acquired with a Global Positioning System (GPS) was used as Ground Control Points (GCPs) for the modeling of the stereoscopic DSMs and as Independent Check Points (ICPs) for the calculation of elevation accuracies. The analysis was performed in two ways: (1) the use of the Root Mean Square Error (RMSE) and (2) calculation of the systematic error (bias) and precision. The test for significant systematic error was based on the Student's t-distribution, and the test of precision was based on the Chi-squared distribution. The investigation has shown that the accuracy of the TerraSAR-X Stripmap DSMs met the requirements for a 1:50,000 map (Class A) as specified by the Brazilian Standard for Cartographic Accuracy. Thus, the use of TerraSAR-X Stripmap images can be considered a promising alternative for detailed topographic mapping in similar environments of the Amazon region, where available topographic information is rare or of low quality.
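The two-way accuracy assessment described above reduces to summary statistics of the DSM-minus-ICP elevation differences: the RMSE, the bias with its t-statistic (compared against Student's t critical values with n-1 degrees of freedom), and the sample standard deviation as the precision measure (tested against Chi-squared bounds). The six check-point values below are invented for illustration.

```python
import math

def accuracy_stats(dsm_elev, gps_elev):
    """RMSE, bias (systematic error) with its t-statistic, and the
    sample standard deviation (precision) of DSM-minus-GPS differences."""
    diffs = [d - g for d, g in zip(dsm_elev, gps_elev)]
    n = len(diffs)
    rmse = math.sqrt(sum(e * e for e in diffs) / n)
    bias = sum(diffs) / n
    s = math.sqrt(sum((e - bias) ** 2 for e in diffs) / (n - 1))
    t_stat = bias / (s / math.sqrt(n))   # compare to Student's t, n-1 dof
    return rmse, bias, s, t_stat

# Hypothetical ICP elevations (metres): DSM readings vs GPS truth
dsm = [102.1, 250.4, 318.9, 471.2, 529.8, 610.3]
gps = [101.8, 250.9, 318.5, 471.6, 529.3, 610.8]
rmse, bias, s, t_stat = accuracy_stats(dsm, gps)
```

A |t_stat| below the tabulated critical value indicates no significant systematic error; the 1:50,000 Class A verdict then follows from comparing the RMSE against the standard's tolerance.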
Modeling CO 2 ice clouds with a Mars Global Climate Model
NASA Astrophysics Data System (ADS)
Audouard, Joachim; Määttänen, Anni; Listowski, Constantino; Millour, Ehouarn; Forget, Francois; Spiga, Aymeric
2016-10-01
Since the first claimed detection of CO2 ice clouds by the Mariner campaign (Herr and Pimentel, 1970), more recent observations and modelling work have put new constraints on their altitude, region, time and mechanisms of formation (Clancy and Sandor, 1998; Montmessin et al., 2007; Colaprete et al., 2008; Määttänen et al., 2010; Vincendon et al., 2011; Spiga et al., 2012; Listowski et al., 2014). CO2 clouds are observed at the poles at low altitudes (< 20 km) during the winter and at high altitudes (60-110 km) in the equatorial regions during the first half of the year. However, the variability and dynamics of Martian CO2 clouds remain somewhat elusive. Towards an understanding of Martian CO2 clouds, and especially of their precise radiative impact on the climate throughout the history of the planet, it is necessary to include their formation and evolution in a Global Climate Model (GCM). Adapting the CO2 cloud microphysics modeling work of Listowski et al. (2013; 2014), we aim to implement a complete CO2 cloud scheme in the GCM of the Laboratoire de Météorologie Dynamique (LMD, Forget et al., 1999). It covers CO2 microphysics, growth, evolution and dynamics, with a methodology inspired by the water ice cloud scheme recently included in the LMD GCM (Navarro et al., 2014). Two main factors control the formation and evolution of CO2 clouds in the Martian atmosphere: sufficient supersaturation of CO2 is needed, and condensation nuclei must be available. Topography-induced gravity waves (GWs) are expected to propagate to the upper atmosphere, where they produce cold pockets of supersaturated CO2 (Spiga et al., 2012), thus allowing the formation of clouds provided enough condensation nuclei are present. Such supersaturations have been observed by various instruments, in situ (Schofield et al., 1997) and from orbit (Montmessin et al., 2006, 2011; Forget et al., 2009). Using a GW-induced temperature profile and the 1-D version of the GCM, we simulate the formation of CO2
Geochemistry of the Amazon Estuary
Smoak, Joseph M.; Krest, James M.; Swarzenski, Peter W
2006-01-01
The Amazon River supplies more freshwater to the ocean than any other river in the world. This enormous volume of freshwater forces the estuarine mixing out of the river channel and onto the continental shelf. On the continental shelf, the estuarine mixing occurs in a very dynamic environment unlike that of a typical estuary. The tides, the wind, and the boundary current that sweeps the continental shelf have a pronounced influence on the chemical and biological processes occurring within the estuary. This dynamic environment, along with the enormous supply of water, solutes and particles, makes the Amazon estuary unique. This chapter describes the unique features of the Amazon estuary and how these features influence the processes occurring within it. Examined are the supply and cycling of major and minor elements, and the use of naturally occurring radionuclides to trace processes including water movement, scavenging, sediment-water interaction, and sediment accumulation rates. The biogeochemical cycling of carbon, nitrogen, and phosphorus, and the significance of the Amazon estuary in the global mass balance of these elements, are also examined.
EMCS EC Connector Inspection Imagery
2018-02-02
iss054e026863 (Feb. 2, 2018) --- The Plant Gravity Perception experiment in a centrifuge before its second run on the European Modular Cultivation System (EMCS) Experiment Container (EC) to test the gravity-sensing ability of plants in microgravity.
Methods for estimating 2D cloud size distributions from 1D observations
Romps, David M.; Vogelmann, Andrew M.
2017-08-04
The two-dimensional (2D) size distribution of clouds in the horizontal plane plays a central role in the calculation of cloud cover, cloud radiative forcing, convective entrainment rates, and the likelihood of precipitation. Here, a simple method is proposed for calculating the area-weighted mean cloud size and for approximating the 2D size distribution from the 1D cloud chord lengths measured by aircraft and vertically pointing lidar and radar. This simple method (which is exact for square clouds) compares favorably against the inverse Abel transform (which is exact for circular clouds) in the context of theoretical size distributions. Both methods also perform well when used to predict the size distribution of real clouds from a Landsat scene. When applied to a large number of Landsat scenes, the simple method is able to accurately estimate the mean cloud size. Finally, as a demonstration, the methods are applied to aircraft measurements of shallow cumuli during the RACORO campaign, which then allow for an estimate of the true area-weighted mean cloud size.
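The measurement step underlying this approach, extracting 1D chord lengths from a transect, can be sketched as below. The run-length extraction is standard; the length-weighted mean chord is used here only as a simple proxy for the area-weighted mean cloud size, and is an assumption for illustration, not the paper's exact estimator.

```python
import numpy as np

def chord_lengths(cloud_mask, dx):
    """Lengths (m) of contiguous cloudy runs along a 1D transect.

    cloud_mask: 1D boolean array (True = cloudy), e.g. from aircraft
    or vertically pointing lidar/radar; dx: sample spacing in metres.
    """
    padded = np.concatenate(([False], np.asarray(cloud_mask, bool),
                             [False])).astype(np.int8)
    edges = np.flatnonzero(np.diff(padded))  # cloud-edge indices
    starts, ends = edges[::2], edges[1::2]   # paired run boundaries
    return (ends - starts) * dx

def length_weighted_mean_chord(chords):
    """Chord-length-weighted mean chord (illustrative proxy)."""
    chords = np.asarray(chords, dtype=float)
    return (chords ** 2).sum() / chords.sum()

# A toy transect: three cloudy runs of 2, 4 and 1 samples at 100 m spacing
mask = np.array([0, 1, 1, 0, 0, 1, 1, 1, 1, 0, 1, 0], dtype=bool)
print(chord_lengths(mask, dx=100.0))                         # [200. 400. 100.]
print(length_weighted_mean_chord(chord_lengths(mask, 100.0)))  # 300.0
```

Weighting by chord length mimics the fact that a random transect intersects large clouds more often, which is why area-weighted (rather than number-weighted) statistics are the natural target for transect data.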
Future drying of the southern Amazon and central Brazil
NASA Astrophysics Data System (ADS)
Yoon, J.; Zeng, N.; Cook, B.
2008-12-01
Recent climate modeling suggests that the Amazon rainforest could exhibit considerable dieback under future climate change, a prediction that has raised considerable interest as well as controversy. To determine the likelihood and causes of such changes, we analyzed the output of 15 models from the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC/AR4) and a dynamic vegetation model, VEGAS, driven by these climate outputs. Our results suggest that the core of the Amazon rainforest should remain largely stable. However, the periphery, notably the southern edge, is in danger of drying out, driven by two main processes. First, a decline in precipitation of 24% in the southern Amazon lengthens the dry season and reduces soil moisture, despite an increase in precipitation during the wet season, due to the nonlinear response in hydrology and ecosystem dynamics. Two dynamical mechanisms may explain the lower dry season precipitation: (1) a stronger north-south tropical Atlantic sea surface temperature gradient; (2) a general subtropical drying under global warming when the dry season southern Amazon is under the control of the subtropical high pressure. Second, evaporation will increase due to the general warming, also reducing soil moisture. As a consequence, the median of the models projects a reduction of vegetation by 20%, and enhanced fire carbon flux by 10-15%, in the southern Amazon, central Brazil, and parts of the Andean Mountains. Because the southern Amazon is also under intense human influence, the double pressure of deforestation and climate change may subject the region to dramatic changes in the 21st century.
An explicit GIS-based river basin framework for aquatic ecosystem conservation in the Amazon
NASA Astrophysics Data System (ADS)
Venticinque, Eduardo; Forsberg, Bruce; Barthem, Ronaldo; Petry, Paulo; Hess, Laura; Mercado, Armando; Cañas, Carlos; Montoya, Mariana; Durigan, Carlos; Goulding, Michael
2016-11-01
Despite large-scale infrastructure development, deforestation, mining and petroleum exploration in the Amazon Basin, relatively little attention has been paid to the management scale required for the protection of wetlands, fisheries and other aspects of aquatic ecosystems. This is due, in part, to the enormous size, multinational composition and interconnected nature of the Amazon River system, as well as to the absence of an adequate spatial model for integrating data across the entire Amazon Basin. In this data article we present a spatially uniform multi-scale GIS framework that was developed especially for the analysis, management and monitoring of various aspects of aquatic systems in the Amazon Basin. The Amazon GIS-Based River Basin Framework is accessible as an ESRI geodatabase at doi:10.5063/F1BG2KX8.