Sample records for production cloud services

  1. A cloud-based production system for information and service integration: an internet of things case study on waste electronics

    NASA Astrophysics Data System (ADS)

    Wang, Xi Vincent; Wang, Lihui

    2017-08-01

    Cloud computing is the new enabling technology that offers centralised computing, flexible data storage and scalable services. In the manufacturing context, it is possible to utilise Cloud technology to integrate and provide industrial resources and capabilities in terms of Cloud services. In this paper, a function block-based integration mechanism is developed to connect various types of production resources. A Cloud-based architecture is also deployed to offer a service pool which maintains these resources as production services. The proposed system provides a flexible and integrated information environment for the Cloud-based production system. As a specific type of manufacturing, Waste Electrical and Electronic Equipment (WEEE) remanufacturing experiences difficulties in system integration, information exchange and resource management. In this research, WEEE is selected as an example of the Internet of Things to demonstrate how the obstacles and bottlenecks are overcome with the help of a Cloud-based informatics approach. In the case studies, the WEEE recycling/recovery capabilities are also integrated and deployed as flexible Cloud services. Supporting mechanisms and technologies are presented and evaluated towards the end of the paper.
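
    A minimal sketch of the "service pool" idea described above, assuming a toy in-memory registry: production resources are registered as named Cloud services and invoked on demand. The class and method names are illustrative only, not the paper's function-block API.

    ```python
    # Hypothetical sketch: production resources exposed as named services in a pool.
    class ServicePool:
        """Toy registry standing in for the paper's Cloud service pool."""

        def __init__(self):
            self._services = {}

        def register(self, name, capability):
            # Expose a production resource (machine, robot, tester) as a service.
            self._services[name] = capability

        def invoke(self, name, *args):
            # Look up a registered production service and call it on demand.
            return self._services[name](*args)

    pool = ServicePool()
    pool.register("disassembly", lambda item: f"disassembled {item}")
    print(pool.invoke("disassembly", "LCD panel"))  # -> disassembled LCD panel
    ```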

  2. Satellite Cloud and Radiative Property Processing and Distribution System on the NASA Langley ASDC OpenStack and OpenShift Cloud Platform

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Palikonda, R.; Smith, W. L., Jr.; Bedka, K. M.; Spangenberg, D.; Vakhnin, A.; Lutz, N. E.; Walter, J.; Kusterer, J.

    2017-12-01

    Cloud Computing offers new opportunities for large-scale scientific data producers to utilize Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) IT resources to process and deliver data products in an operational environment where timely delivery, reliability, and availability are critical. The NASA Langley Research Center Atmospheric Science Data Center (ASDC) is building and testing a private and public-facing cloud for users in the Science Directorate to utilize as an everyday production environment. The NASA SatCORPS (Satellite ClOud and Radiation Property Retrieval System) team processes and derives near real-time (NRT) global cloud products from operational geostationary (GEO) satellite imager datasets. To deliver these products, we will utilize the public-facing cloud and OpenShift to deploy a load-balanced webserver for data storage, access, and dissemination. The OpenStack private cloud will host data ingest and computational capabilities for SatCORPS processing. This paper will discuss the SatCORPS migration towards, and usage of, the ASDC Cloud Services in an operational environment. Detailed lessons learned from the use of prior cloud providers, specifically the Amazon Web Services (AWS) GovCloud and the Government Cloud administered by the Langley Managed Cloud Environment (LMCE), will also be discussed.

  3. Evaluating Commercial and Private Cloud Services for Facility-Scale Geodetic Data Access, Analysis, and Services

    NASA Astrophysics Data System (ADS)

    Meertens, C. M.; Boler, F. M.; Ertz, D. J.; Mencin, D.; Phillips, D.; Baker, S.

    2017-12-01

    UNAVCO, in its role as an NSF facility for geodetic infrastructure and data, has succeeded for over two decades using on-premises infrastructure, and while the promise of cloud-based infrastructure is well-established, significant questions about the suitability of such infrastructure for facility-scale services remain. Primarily through the GeoSciCloud award from NSF EarthCube, UNAVCO is investigating the costs, advantages, and disadvantages of providing its geodetic data and services in the cloud versus using UNAVCO's on-premises infrastructure. (IRIS is a collaborator on the project and is performing its own suite of investigations.) In contrast to the 2-3 year time scale of the research cycle, the time scale of operation and planning for NSF facilities is a minimum of five years and for some services extends to a decade or more. Planning for on-premises infrastructure is deliberate, and migrations typically take months to years to fully implement. Migrations to a cloud environment can only go forward with similar deliberate planning and understanding of all costs and benefits. The EarthCube GeoSciCloud project is intended to address the uncertainties of facility-level operations in the cloud. Investigations are being performed in a commercial cloud environment (Amazon AWS) during the first year of the project and in a private cloud environment (NSF XSEDE resource at the Texas Advanced Computing Center) during the second year. These investigations are expected to illuminate the potential as well as the limitations of running facility-scale production services in the cloud. The work includes running cloud-based services parallel and equivalent to on-premises services: data serving via ftp from a large data store, operation of a metadata database, production-scale processing of multiple months of geodetic data, web services delivery of quality-checked data and products, large-scale compute services for event post-processing, and serving real-time data from a network of 700-plus GPS stations. The evaluation is based on a suite of metrics that we have developed to elucidate the effectiveness of cloud-based services in price, performance, and management. Services are currently running in AWS and evaluation is underway.

  4. Unidata's Vision for Transforming Geoscience by Moving Data Services and Software to the Cloud

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.; Fisher, W.; Yoksas, T.

    2014-12-01

    Universities are facing many challenges: shrinking budgets, rapidly evolving information technologies, exploding data volumes, multidisciplinary science requirements, and high student expectations. These changes are upending traditional approaches to accessing and using data and software. It is clear that Unidata's products and services must evolve to support new approaches to research and education. After years of hype and ambiguity, cloud computing is maturing in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Cloud services aimed at providing any resource, at any time, from any place, using any device are increasingly being embraced by all types of organizations. Given this trend and the enormous potential of cloud-based services, Unidata is moving to augment its products, services, data delivery mechanisms and applications to align with the cloud-computing paradigm. Specifically, Unidata is working toward establishing a community-based development environment that supports the creation and use of software services to build end-to-end data workflows. The design encourages the creation of services that can be broken into small, independent chunks that provide simple capabilities. Chunks could be used individually to perform a task, or chained into simple or elaborate workflows. The services will also be portable, allowing their use in researchers' own cloud-based computing environments. In this talk, we present a vision for Unidata's future in cloud-enabled data services and discuss our initial efforts to deploy a subset of Unidata data services and tools in the Amazon EC2 and Microsoft Azure cloud environments, including the transfer of real-time meteorological data into its cloud instances, product generation using those data, and the deployment of TDS, McIDAS ADDE and AWIPS II data servers and the Integrated Data Viewer visualization tool.
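
    The "small chunks chained into workflows" design lends itself to a compact illustration: each service does one thing, and a workflow is just a composition of services. The sketch below is a generic illustration under that assumption, not Unidata code; the service names and record shapes are invented.

    ```python
    # Illustrative sketch of chaining small, independent service chunks into a workflow.
    from functools import reduce

    def subset(records):
        # Keep only records in a (hypothetical) region of interest.
        return [r for r in records if r["lat"] > 30.0]

    def kelvin_to_celsius(records):
        # Convert the temperature field from K to degrees C.
        return [{**r, "t": r["t"] - 273.15} for r in records]

    def chain(*services):
        # Compose service chunks left-to-right into a single workflow.
        return lambda data: reduce(lambda d, svc: svc(d), services, data)

    workflow = chain(subset, kelvin_to_celsius)
    print(workflow([{"lat": 40.0, "t": 288.15}, {"lat": 10.0, "t": 300.0}]))
    # -> one surviving record with t of about 15.0
    ```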

  5. Unidata's Vision for Transforming Geoscience by Moving Data Services and Software to the Cloud

    NASA Astrophysics Data System (ADS)

    Ramamurthy, Mohan; Fisher, Ward; Yoksas, Tom

    2015-04-01

    Universities are facing many challenges: shrinking budgets, rapidly evolving information technologies, exploding data volumes, multidisciplinary science requirements, and high expectations from students who have grown up with smartphones and tablets. These changes are upending traditional approaches to accessing and using data and software. Unidata recognizes that its products and services must evolve to support new approaches to research and education. After years of hype and ambiguity, cloud computing is maturing in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Cloud services aimed at providing any resource, at any time, from any place, using any device are increasingly being embraced by all types of organizations. Given this trend and the enormous potential of cloud-based services, Unidata is moving to augment its products, services, data delivery mechanisms and applications to align with the cloud-computing paradigm. Specifically, Unidata is working toward establishing a community-based development environment that supports the creation and use of software services to build end-to-end data workflows. The design encourages the creation of services that can be broken into small, independent chunks that provide simple capabilities. Chunks could be used individually to perform a task, or chained into simple or elaborate workflows. The services will also be portable in the form of downloadable Unidata-in-a-box virtual images, allowing their use in researchers' own cloud-based computing environments. In this talk, we present a vision for Unidata's future in cloud-enabled data services and discuss our ongoing efforts to deploy a suite of Unidata data services and tools in the Amazon EC2 and Microsoft Azure cloud environments, including the transfer of real-time meteorological data into its cloud instances, product generation using those data, and the deployment of TDS, McIDAS ADDE and AWIPS II data servers and the Integrated Data Viewer visualization tool.

  6. A European Federated Cloud: Innovative distributed computing solutions by EGI

    NASA Astrophysics Data System (ADS)

    Sipos, Gergely; Turilli, Matteo; Newhouse, Steven; Kacsuk, Peter

    2013-04-01

    The European Grid Infrastructure (EGI) is the result of pioneering work that has, over the last decade, built a collaborative production infrastructure of uniform services through the federation of national resource providers that supports multi-disciplinary science across Europe and around the world. This presentation will provide an overview of the recently established 'federated cloud computing services' that the National Grid Initiatives (NGIs), operators of EGI, offer to scientific communities. The presentation will explain the technical capabilities of the 'EGI Federated Cloud' and the processes whereby earth and space science researchers can engage with it. EGI's resource centres have been providing services for collaborative, compute- and data-intensive applications for over a decade. Besides the well-established 'grid services', several NGIs already offer privately run cloud services to their national researchers. Many of these researchers recently expressed the need to share these cloud capabilities within their international research collaborations - a model similar to the way the grid emerged through the federation of institutional batch computing and file storage servers. To facilitate the setup of a pan-European cloud service from the NGIs' resources, the EGI-InSPIRE project established a Federated Cloud Task Force in September 2011. The Task Force has a mandate to identify and test technologies for a multinational federated cloud that could be provisioned within EGI by the NGIs. A guiding principle for the EGI Federated Cloud is to remain technology neutral and flexible for both resource providers and users:
    • Resource providers are allowed to use any cloud hypervisor and management technology to join virtualised resources into the EGI Federated Cloud, as long as the site is subscribed to the user-facing interfaces selected by the EGI community.
    • Users can integrate high-level services - such as brokers, portals and customised Virtual Research Environments - with the EGI Federated Cloud, as long as these services access cloud resources through the user-facing interfaces selected by the EGI community.
    The Task Force will be closed in May 2013. It has already:
    • Identified key enabling technologies by which a multinational, federated 'Infrastructure as a Service' (IaaS) type cloud can be built from the NGIs' resources;
    • Deployed a test bed to evaluate the integration of virtualised resources within EGI and to engage with early adopter use cases from different scientific domains;
    • Integrated cloud resources into the EGI production infrastructure through cloud-specific bindings of the EGI information system, monitoring system, authentication system, etc.;
    • Collected and catalogued requirements concerning the federated cloud services from the feedback of early adopter use cases;
    • Provided feedback and requirements to relevant technology providers on their implementations and worked with these providers to address those requirements;
    • Identified issues that need to be addressed by other areas of EGI (such as portal solutions, resource allocation policies, marketing and user support) to reach a production system.
    The Task Force will publish a blueprint in April 2013. The blueprint will drive the establishment of a production-level EGI Federated Cloud service after May 2013.

  7. Cloud Computing Security Issue: Survey

    NASA Astrophysics Data System (ADS)

    Kamal, Shailza; Kaur, Rajpreet

    2011-12-01

    Cloud computing has been a growing field in the IT industry since 2007, when it was proposed by IBM. Other companies, such as Google, Amazon, and Microsoft, provide further cloud computing products. Cloud computing is internet-based computing that shares resources and information on demand. It provides services such as SaaS, IaaS and PaaS. The services and resources are shared through virtualization, which allows multiple operating systems and applications to run on the cloud. This discussion surveys the challenges and security issues in cloud computing and describes some standards and protocols that present how security can be managed.

  9. Impact of office productivity cloud computing on energy consumption and greenhouse gas emissions.

    PubMed

    Williams, Daniel R; Tang, Yinshan

    2013-05-07

    Cloud computing is usually regarded as being energy efficient and thus emitting less greenhouse gases (GHG) than traditional forms of computing. When the energy consumption of Microsoft's cloud computing Office 365 (O365) and traditional Office 2010 (O2010) software suites was tested and modeled, some cloud services were found to consume more energy than the traditional form. The model developed in this research took into consideration the energy consumption at the three main stages of data transmission: data center, network, and end-user device. Comparable products from each suite were selected and activities were defined for each product to represent a different computing type. Microsoft provided highly confidential data for the data center stage, while the networking and user device stages were measured directly. A new measurement and software apportionment approach was defined and utilized, allowing the power consumption of cloud services to be directly measured for the user device stage. Results indicated that cloud computing is more energy efficient for Excel and Outlook, which consumed less energy and emitted less GHG than their standalone counterparts. The power consumption of the cloud-based Outlook (8%) and Excel (17%) was lower than their traditional counterparts. However, the power consumption of the cloud version of Word was 17% higher than its traditional equivalent. A third, mixed access method was also measured for Word, which emitted 5% more GHG than the traditional version. It is evident that cloud computing may not provide a unified way forward to reduce energy consumption and GHG. Direct conversion from the standalone package into the cloud provision platform can now consider energy and GHG emissions at the software development and cloud service design stage using the methods described in this research.
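
    The three-stage accounting is easy to make concrete. The sketch below sums hypothetical per-task energy figures for each stage; the numbers are invented placeholders, not the study's (confidential) measurements, and only illustrate how a cloud service can win or lose depending on where the energy is spent.

    ```python
    # Worked example of the three-stage model: total = data center + network + device.
    def total_energy_wh(datacenter_wh, network_wh, device_wh):
        return datacenter_wh + network_wh + device_wh

    # Invented per-task figures (Wh) for one editing session.
    cloud = total_energy_wh(datacenter_wh=2.0, network_wh=1.5, device_wh=6.0)
    standalone = total_energy_wh(datacenter_wh=0.0, network_wh=0.0, device_wh=11.0)

    change = (cloud - standalone) / standalone * 100
    print(f"cloud vs standalone: {change:+.1f}%")  # negative means the cloud uses less
    ```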

  10. Unidata cyberinfrastructure in the cloud: A progress report

    NASA Astrophysics Data System (ADS)

    Ramamurthy, Mohan

    2016-04-01

    Data services, software, and committed support are critical components of geosciences cyber-infrastructure that can help scientists address problems of unprecedented complexity, scale, and scope. Unidata is currently working on innovative ideas, new paradigms, and novel techniques to complement and extend its offerings. Our goal is to empower users so that they can tackle major, heretofore difficult problems. Unidata recognizes that its products and services must evolve to support new approaches to research and education. After years of hype and ambiguity, cloud computing is maturing in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Cloud services aimed at providing any resource, at any time, from any place, using any device are increasingly being embraced by all types of organizations. Given this trend and the enormous potential of cloud-based services, Unidata is moving to augment its products, services, data delivery mechanisms and applications to align with the cloud-computing paradigm. To realize the above vision, Unidata is working toward:
    * Providing access to many types of data from a cloud (e.g., TDS, RAMADDA and EDEX);
    * Deploying data-proximate tools to easily process, analyze and visualize those data in a cloud environment for consumption by anyone, on any device, from anywhere, at any time;
    * Developing and providing a range of pre-configured and well-integrated tools and services that can be deployed by any university in its own private or public cloud setting. Specifically, Unidata has adopted Docker for "containerized applications", making them easy to deploy. Docker helps to create "disposable" installs and eliminates many configuration challenges. Containerized applications include tools for data transport, access, analysis, and visualization: THREDDS Data Server, Integrated Data Viewer, GEMPAK, Local Data Manager, RAMADDA Data Server, and Python tools;
    * Fostering partnerships with NOAA and public cloud vendors (e.g., Amazon) to harness their capabilities and resources for the benefit of the academic community.
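
    As a concrete illustration of the containerized-application approach, the sketch below launches a THREDDS Data Server container with the Docker SDK for Python. The image tag and port mapping are assumptions; check Unidata's Docker Hub repositories for the current image names.

    ```python
    # Hedged sketch: run a containerized THREDDS Data Server via the Docker SDK.
    import docker

    client = docker.from_env()
    tds = client.containers.run(
        "unidata/thredds-docker:latest",  # assumed image name; verify on Docker Hub
        ports={"8080/tcp": 8080},         # map Tomcat's port to the host
        detach=True,
        name="thredds",
    )
    print(tds.name, tds.status)  # container starts in the background
    ```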

  11. The thinking of Cloud computing in the digital construction of the oil companies

    NASA Astrophysics Data System (ADS)

    CaoLei, Qizhilin; Dengsheng, Lei

    To speed up the digital construction of oil companies and to enhance productivity and decision-support capabilities, while avoiding the waste and duplicated development of earlier, separate digitalisation efforts, this paper presents a cloud-based model for the digital construction of oil companies. National oil companies connect, through a private network, the companies' cloud data and service center equipment into one integrated cloud system; each department can then provision its own virtual service center according to its needs, giving the oil companies strong industry services and computing power.

  12. Scaling the CERN OpenStack cloud

    NASA Astrophysics Data System (ADS)

    Bell, T.; Bompastor, B.; Bukowiec, S.; Castro Leon, J.; Denis, M. K.; van Eldik, J.; Fermin Lobo, M.; Fernandez Alvarez, L.; Fernandez Rodriguez, D.; Marino, A.; Moreira, B.; Noel, B.; Oulevey, T.; Takase, W.; Wiebalck, A.; Zilli, S.

    2015-12-01

    CERN has been running a production OpenStack cloud since July 2013 to support physics computing and infrastructure services for the site. In the past year, the CERN Cloud Infrastructure has seen a constant increase in nodes, virtual machines, users and projects. This paper will present what has been done in order to make the CERN cloud infrastructure scale out.
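
    A minimal sketch of what "scaling out" looks like programmatically, using the OpenStack SDK to launch a batch of worker VMs. The cloud name, image, flavor, and network are site-specific placeholders, not CERN's actual configuration.

    ```python
    # Hedged sketch: batch-create servers on an OpenStack cloud with openstacksdk.
    import openstack

    conn = openstack.connect(cloud="mycloud")  # credentials read from clouds.yaml

    image = conn.compute.find_image("CC7-base")      # placeholder image name
    flavor = conn.compute.find_flavor("m1.medium")   # placeholder flavor name
    network = conn.network.find_network("internal")  # placeholder network name

    for i in range(4):  # scale out by a small batch of identical workers
        conn.compute.create_server(
            name=f"worker-{i}",
            image_id=image.id,
            flavor_id=flavor.id,
            networks=[{"uuid": network.id}],
        )
    ```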

  13. Integration of Cloud resources in the LHCb Distributed Computing

    NASA Astrophysics Data System (ADS)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack), and it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.
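
    The "one interface, many clouds" pattern that VMDIRAC implements can be illustrated with Apache Libcloud (shown here instead of DIRAC itself): the same call shape provisions a VM whether the backend is EC2, OpenStack, or CloudStack. The credentials, node name, and the choice of the first size/image are placeholders.

    ```python
    # Illustrative sketch of provider-agnostic VM provisioning (Libcloud, not VMDIRAC).
    from libcloud.compute.types import Provider
    from libcloud.compute.providers import get_driver

    def start_vm(provider, *args, **kwargs):
        # Instantiate one VM on whichever backend the chosen driver targets.
        driver = get_driver(provider)(*args, **kwargs)
        size = driver.list_sizes()[0]    # smallest offering, for illustration
        image = driver.list_images()[0]  # first available image, for illustration
        return driver.create_node(name="lhcb-worker", size=size, image=image)

    # The same call shape works for Provider.EC2, Provider.OPENSTACK, Provider.CLOUDSTACK, ...
    # node = start_vm(Provider.EC2, "ACCESS_KEY", "SECRET_KEY", region="us-east-1")
    ```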

  14. Heterogeneous access and processing of EO-Data on a Cloud based Infrastructure delivering operational Products

    NASA Astrophysics Data System (ADS)

    Niggemann, F.; Appel, F.; Bach, H.; de la Mar, J.; Schirpke, B.; Dutting, K.; Rucker, G.; Leimbach, D.

    2015-04-01

    To address the challenges of effective data handling faced by Small and Medium-Sized Enterprises (SMEs), a cloud-based infrastructure for accessing and processing Earth Observation (EO) data has been developed within the project APPS4GMES (www.apps4gmes.de). To provide homogeneous multi-mission data access, an Input Data Portal (IDP) has been implemented on this infrastructure. The IDP consists of an Open Geospatial Consortium (OGC) conformant catalogue, a consolidation module for format conversion and an OGC-conformant ordering framework. Metadata from various EO sources with different standards are harvested, transferred to the OGC-conformant Earth Observation Product standard and inserted into the catalogue by a Metadata Harvester. The IDP can be accessed for search and ordering of the harvested datasets by the services implemented on the cloud infrastructure. Different land-surface services have been realised by the project partners using the implemented IDP and cloud infrastructure. Results of these are customer-ready products, as well as pre-products (e.g. atmospherically corrected EO data) serving as a basis for other services. Within the IDP, automated access to ESA's Sentinel-1 Scientific Data Hub has been implemented. Searching and downloading of the SAR data can be performed in an automated way. With the implementation of the Sentinel-1 Toolbox and our own software, processing of the datasets for further use (for example, Vista's snow monitoring, which delivers input for the flood forecast services) can also be performed in an automated way. For performance tests of the cloud environment, a sophisticated model-based atmospheric correction and pre-classification service has been implemented. Tests comprised the automated, synchronised processing of one entire Landsat 8 (LS-8) coverage of Germany and performance comparisons with standard desktop systems. Results of these tests, showing a performance improvement by a factor of six, proved the high flexibility and computing power of the cloud environment. To make full use of the cloud capabilities, a facility for automated upscaling of the hardware resources has been implemented. Together with the IDP infrastructure, fast and automated processing of various satellite sources to deliver market-ready products can be realised, so growing customer needs and numbers can be satisfied without loss of accuracy or quality.
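
    The automated Sentinel-1 access can be sketched with the sentinelsat client for the Copernicus SciHub API that the abstract references (the hub has since been retired in favour of newer Copernicus services). The account credentials, AOI file, and query values are placeholders.

    ```python
    # Hedged sketch: automated Sentinel-1 search and download via sentinelsat.
    from sentinelsat import SentinelAPI, read_geojson, geojson_to_wkt

    api = SentinelAPI("user", "password", "https://scihub.copernicus.eu/dhus")
    footprint = geojson_to_wkt(read_geojson("germany_aoi.geojson"))  # assumed AOI file

    products = api.query(
        footprint,
        date=("20150101", "20150131"),
        platformname="Sentinel-1",
        producttype="GRD",
    )
    api.download_all(products)  # fetch matching scenes for automated processing
    ```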

  15. AIRS Version 6 Products and Data Services at NASA GES DISC

    NASA Astrophysics Data System (ADS)

    Ding, F.; Savtchenko, A. K.; Hearty, T. J.; Theobald, M. L.; Vollmer, B.; Esfandiari, E.

    2013-12-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is the home of processing, archiving, and distribution services for data from the Atmospheric Infrared Sounder (AIRS) mission. The AIRS mission is entering its 11th year of global observations of the atmospheric state, including temperature and humidity profiles, outgoing longwave radiation, cloud properties, and trace gases. The GES DISC, in collaboration with the AIRS Project, released data from the Version 6 algorithm in early 2013. The new algorithm represents a significant improvement over previous versions in terms of greater stability, yield, and quality of products. Among the most substantial advances are: improved soundings of Tropospheric and Sea Surface Temperatures; larger improvements with increasing cloud cover; improved retrievals of surface spectral emissivity; near-complete removal of spurious temperature bias trends seen in earlier versions; substantially improved retrieval yield (i.e., number of soundings accepted for output) for climate studies; AIRS-Only retrievals with comparable accuracy to AIRS+AMSU (Advanced Microwave Sounding Unit) retrievals; and more realistic hemispheric seasonal variability and global distribution of carbon monoxide. The GES DISC is working to bring the distribution services up-to-date with these new developments. Our focus is on popular services, like variable subsetting and quality screening, which are impacted by the new elements in Version 6. Other developments in visualization services, such as Giovanni, Near-Real Time imagery, and a granule-map viewer, are progressing along with the introduction of the new data; each service presents its own challenge. This presentation will demonstrate the most significant improvements in Version 6 AIRS products, such as newly added variables (higher resolution outgoing longwave radiation, new cloud property products, etc.), the new quality control schema, and improved retrieval yields. We will also demonstrate the various distribution and visualization services for AIRS data products. The cloud properties, model physics, and water and energy cycles research communities are invited to take advantage of the improvements in Version 6 AIRS products and the various services at GES DISC which provide them.
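
    Quality screening of the kind the abstract mentions usually reduces to masking retrievals by their quality flags. The sketch below shows the idea with NumPy; the variable, the flag convention (0 = best, 1 = good, 2 = do not use), and the fill value mirror common AIRS usage but should be checked against the Version 6 product documentation.

    ```python
    # Hedged sketch: screen retrieved values by quality flag before averaging.
    import numpy as np

    temp = np.array([285.2, 290.1, -9999.0, 288.4])  # retrievals plus a fill value
    qual = np.array([0, 1, 2, 0])                    # 0=best, 1=good, 2=do not use

    screened = np.where(qual <= 1, temp, np.nan)     # drop "do not use" soundings
    screened[screened == -9999.0] = np.nan           # mask any remaining fill values
    print(np.nanmean(screened))                      # statistic over accepted soundings
    ```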

  16. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    NASA Astrophysics Data System (ADS)

    Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.

    2011-12-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand", as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and conclude that it is most cost effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage occur.
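
    The "bursting" pattern the authors describe maps naturally onto a short boto3 call: request a batch of transient worker nodes when demand spikes, with the cap on MaxCount echoing the usage limits the authors encountered. The AMI ID, instance type, and counts are placeholders.

    ```python
    # Hedged sketch: burst extra worker nodes on EC2 with boto3.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder worker-node AMI
        InstanceType="c5.xlarge",
        MinCount=1,
        MaxCount=50,  # EC2 may grant fewer than requested; capacity is not unlimited
        InstanceMarketOptions={"MarketType": "spot"},  # cheaper, preemptible capacity
    )
    print(len(response["Instances"]), "worker nodes launched")
    ```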

  17. The North Alabama Lightning Warning Product

    NASA Technical Reports Server (NTRS)

    Buechler, Dennis E.; Blakeslee, R. J.; Stano, G. T.

    2009-01-01

    The North Alabama Lightning Mapping Array (NALMA) has been collecting total lightning data on storms in the Tennessee Valley region since 2001. Forecasters from nearby National Weather Service (NWS) offices have been ingesting this data for display with other AWIPS products. The current lightning product used by the offices is the lightning source density plot. The new product provides a short-term graphical forecast of the probability of lightning activity occurring at 5-minute intervals over the next 30 minutes. One of the uses of the current lightning source density product by the Huntsville National Weather Service Office is to identify areas of potential for cloud-to-ground flashes based on where LMA total lightning is occurring. This product quantifies that observation. The Lightning Warning Product is derived from total lightning observations from the Washington, D.C. (DCLMA) and North Alabama Lightning Mapping Arrays and cloud-to-ground lightning flashes detected by the National Lightning Detection Network (NLDN). Probability predictions are provided for both intracloud and cloud-to-ground flashes. The gridded product can be displayed on AWIPS workstations in a manner similar to that of the lightning source density product.

  18. Cloud-based Web Services for Near-Real-Time Web access to NPP Satellite Imagery and other Data

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Valente, E. G.

    2010-12-01

    We are building a scalable, cloud computing-based infrastructure for Web access to near-real-time data products synthesized from the U.S. National Polar-Orbiting Environmental Satellite System (NPOESS) Preparatory Project (NPP) and other geospatial and meteorological data. Given recent and ongoing changes in the NPP and NPOESS programs (now the Joint Polar Satellite System), the need for timely delivery of NPP data is urgent. We propose an alternative to a traditional, centralized ground segment, using distributed Direct Broadcast facilities linked to industry-standard Web services by a streamlined processing chain running in a scalable cloud computing environment. Our processing chain, currently implemented on Amazon.com's Elastic Compute Cloud (EC2), retrieves raw data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) and synthesizes data products such as Sea-Surface Temperature, Vegetation Indices, etc. The cloud computing approach lets us grow and shrink computing resources to meet large and rapid fluctuations (twice daily) in both end-user demand and data availability from polar-orbiting sensors. Early prototypes have delivered various data products to end users with latencies between 6 and 32 minutes. We have begun to replicate machine instances in the cloud, so as to reduce latency and maintain near-real-time data access regardless of increased data input rates or user demand, all at quite moderate monthly costs. Our service-based approach (in which users invoke software processes on a Web-accessible server) facilitates access into datasets of arbitrary size and resolution, and allows users to request and receive tailored and composite (e.g., false-color multiband) products on demand. To facilitate broad impact and adoption of our technology, we have emphasized open, industry-standard software interfaces and open source software. Through our work, we envision the widespread establishment of similar, derived, or interoperable systems for processing and serving near-real-time data from NPP and other sensors. A scalable architecture based on cloud computing ensures cost-effective, real-time processing and delivery of NPP and other data. Access via standard Web services maximizes its interoperability and usefulness.

  19. Lost in Cloud

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Shetye, Sandeep D.; Chilukuri, Sri; Sturken, Ian

    2012-01-01

    Cloud computing can reduce cost significantly because businesses can share computing resources. In recent years Small and Medium Businesses (SMB) have used the Cloud effectively for cost saving and for sharing IT expenses. With the success of SMBs, many perceive that larger enterprises ought to move into the Cloud environment as well. Government agencies' stove-piped environments are being considered as candidates for potential use of the Cloud, either as an enterprise entity or as pockets of small communities. Cloud Computing is the delivery of computing as a service rather than as a product, whereby shared resources, software, and information are provided to computers and other devices as a utility over a network. Underneath the offered services, there exists a modern infrastructure, the cost of which is often spread across its services or its investors. As NASA is considered an enterprise-class organization, a shift has been occurring, as in other enterprises, in perceiving its IT services as candidates for Cloud services. This paper discusses market trends in cloud computing from an enterprise angle and then addresses the topic of Cloud Computing for NASA in two possible forms. First, in the form of a public Cloud to support it as an enterprise, as well as to share it with the commercial sector and the public at large. Second, as a private Cloud wherein the infrastructure is operated solely for NASA, whether managed internally or by a third party and hosted internally or externally. The paper addresses the strengths and weaknesses of both paradigms of public and private Clouds, in both internally and externally operated settings. The content of the paper is from a NASA perspective but is applicable to any large enterprise with thousands of employees and contractors.

  20. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    PubMed Central

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication of bioinformatics software products based on web services are presented, and the basic function of biological information processing is realized in the cloud computing environment. In the encapsulation phase, the workflow and functions of the bioinformatics software are studied, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. Functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally, a basic prototype system of the biological cloud is achieved. PMID:24078906

  1. Secure encapsulation and publication of biological services in the cloud computing environment.

    PubMed

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication of bioinformatics software products based on web services are presented, and the basic function of biological information processing is realized in the cloud computing environment. In the encapsulation phase, the workflow and functions of the bioinformatics software are studied, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. Functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally, a basic prototype system of the biological cloud is achieved.

  2. iRODS-Based Climate Data Services and Virtualization-as-a-Service in the NASA Center for Climate Simulation

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D. Q.; Tamkin, G. S.; Strong, S.; Ripley, D.; Gill, R.; Sinno, S. S.; Shen, Y.; Carriere, L. E.; Brieger, L.; Moore, R.; Rajasekar, A.; Schroeder, W.; Wan, M.

    2011-12-01

    Scientific data services are becoming an important part of the NASA Center for Climate Simulation's mission. Our technological response to this expanding role is built around the concept of specialized virtual climate data servers (vCDSs), repetitive cloud provisioning, image-based deployment and distribution, and virtualization-as-a-service. A virtual climate data server is an OAIS-compliant, iRODS-based data server designed to support a particular type of scientific data collection. iRODS is data grid middleware that provides policy-based control over collection-building, managing, querying, accessing, and preserving large scientific data sets. We have developed prototype vCDSs to manage NetCDF, HDF, and GeoTIFF data products. We use RPM scripts to build vCDS images in our local computing environment, our local Virtual Machine Environment, NASA's Nebula Cloud Services, and Amazon's Elastic Compute Cloud. Once provisioned into these virtualized resources, multiple vCDSs can use iRODS's federation and realized object capabilities to create an integrated ecosystem of data servers that can scale and adapt to changing requirements. This approach enables platform- or software-as-a-service deployment of the vCDSs and allows the NCCS to offer virtualization-as-a-service, a capacity to respond in an agile way to new customer requests for data services, and a path for migrating existing services into the cloud. We have registered MODIS Atmosphere data products in a vCDS that contains 54 million registered files, 630 TB of data, and over 300 million metadata values. We are now assembling IPCC AR5 data into a production vCDS that will provide the platform upon which NCCS's Earth System Grid (ESG) node publishes to the extended science community. In this talk, we describe our approach, experiences, lessons learned, and plans for the future.
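
    A vCDS interaction can be sketched with the python-irodsclient: connect to an iRODS server, open a collection, and list its registered objects and metadata. The host, zone, and collection path are invented placeholders for a site-specific deployment.

    ```python
    # Hedged sketch: browse an iRODS collection as a vCDS client might.
    from irods.session import iRODSSession

    with iRODSSession(host="vcds.example.nasa.gov", port=1247, user="reader",
                      password="...", zone="nccsZone") as session:
        coll = session.collections.get("/nccsZone/home/modis_atmosphere")
        for obj in coll.data_objects:       # files registered in the collection
            print(obj.name, obj.size)
        for meta in coll.metadata.items():  # collection-level metadata values
            print(meta.name, meta.value)
    ```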

  3. Integration of High-Performance Computing into Cloud Computing Services

    NASA Astrophysics Data System (ADS)

    Vouk, Mladen A.; Sills, Eric; Dreher, Patrick

    High-Performance Computing (HPC) projects span a spectrum of computer hardware implementations ranging from peta-flop supercomputers, high-end tera-flop facilities running a variety of operating systems and applications, to mid-range and smaller computational clusters used for HPC application development, pilot runs and prototype staging clusters. What they all have in common is that they operate as a stand-alone system rather than a scalable and shared user re-configurable resource. The advent of cloud computing has changed the traditional HPC implementation. In this article, we will discuss a very successful production-level architecture and policy framework for supporting HPC services within a more general cloud computing infrastructure. This integrated environment, called Virtual Computing Lab (VCL), has been operating at NC State since fall 2004. Nearly 8,500,000 HPC CPU-Hrs were delivered by this environment to NC State faculty and students during 2009. In addition, we present and discuss operational data that show that integration of HPC and non-HPC (or general VCL) services in a cloud can substantially reduce the cost of delivering cloud services (down to cents per CPU hour).
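
    The closing cost claim is easy to put in perspective with the article's own delivery figure. The per-hour rates below are hypothetical; only the 8,500,000 CPU-hour figure comes from the text.

    ```python
    # Worked example: annual cost of the delivered HPC hours at assumed per-hour rates.
    cpu_hours = 8_500_000  # HPC CPU-hours delivered by VCL in 2009 (from the article)

    for cents_per_hour in (1, 5, 10):  # hypothetical "cents per CPU hour" rates
        cost = cpu_hours * cents_per_hour / 100
        print(f"{cents_per_hour} cents/CPU-hr -> ${cost:,.0f} per year")
    ```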

  4. Grids, virtualization, and clouds at Fermilab

    DOE PAGES

    Timm, S.; Chadwick, K.; Garzoglio, G.; ...

    2014-06-11

    Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud & GPCF) and core computing (Virtual Services). Lastly, this work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.

  5. Grids, virtualization, and clouds at Fermilab

    NASA Astrophysics Data System (ADS)

    Timm, S.; Chadwick, K.; Garzoglio, G.; Noh, S.

    2014-06-01

    Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud & GPCF) and core computing (Virtual Services). This work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.

  6. Archive Management of NASA Earth Observation Data to Support Cloud Analysis

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark A.

    2017-01-01

    NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order of magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify accommodating such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system, with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring the archive software to a cloud-native architecture, virtualizing data products by computing on demand, and reorganizing data to be more analysis-friendly.
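
    One way to picture the subsetting challenge: open the object over S3 and read only the slab of interest instead of downloading the whole file. The bucket, file, variable, and coordinate names below are hypothetical, and the snippet assumes a NetCDF-capable engine such as h5netcdf is installed.

    ```python
    # Hedged sketch: subset a NetCDF granule stored in S3 object storage.
    import s3fs
    import xarray as xr

    fs = s3fs.S3FileSystem(anon=True)  # public bucket assumed
    with fs.open("example-eo-archive/granule_20170101.nc") as f:
        ds = xr.open_dataset(f)  # lazy open; only requested bytes are pulled
        subset = ds["surface_temperature"].sel(
            lat=slice(20, 50), lon=slice(-130, -60)  # region of interest only
        )
        print(float(subset.mean()))
    ```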

  7. Cloud-Based Automated Design and Additive Manufacturing: A Usage Data-Enabled Paradigm Shift

    PubMed Central

    Lehmhus, Dirk; Wuest, Thorsten; Wellsandt, Stefan; Bosse, Stefan; Kaihara, Toshiya; Thoben, Klaus-Dieter; Busse, Matthias

    2015-01-01

    Integration of sensors into various kinds of products and machines provides access to in-depth usage information as a basis for product optimization. Presently, this large potential for more user-friendly and efficient products is not being realized because (a) sensor integration and thus usage information is not available on a large scale and (b) product optimization requires considerable effort in terms of manpower and adaptation of production equipment. However, with the advent of cloud-based services and highly flexible additive manufacturing techniques, these obstacles are currently crumbling away at a rapid pace. The present study explores the state of the art in gathering and evaluating product usage and life cycle data, additive manufacturing and sensor integration, automated design and cloud-based services in manufacturing. By joining and extrapolating development trends in these areas, it delimits the foundations of a manufacturing concept that will allow continuous and economically viable product optimization on a general, user group or individual user level. This projection is checked against three different application scenarios, each of which stresses different aspects of the underlying holistic concept. The following discussion identifies critical issues and research needs by adopting the relevant stakeholder perspectives. PMID:26703606

  8. Cloud-Based Automated Design and Additive Manufacturing: A Usage Data-Enabled Paradigm Shift.

    PubMed

    Lehmhus, Dirk; Wuest, Thorsten; Wellsandt, Stefan; Bosse, Stefan; Kaihara, Toshiya; Thoben, Klaus-Dieter; Busse, Matthias

    2015-12-19

    Integration of sensors into various kinds of products and machines provides access to in-depth usage information as a basis for product optimization. Presently, this large potential for more user-friendly and efficient products is not being realized because (a) sensor integration and thus usage information is not available on a large scale and (b) product optimization requires considerable effort in terms of manpower and adaptation of production equipment. However, with the advent of cloud-based services and highly flexible additive manufacturing techniques, these obstacles are currently crumbling away at a rapid pace. The present study explores the state of the art in gathering and evaluating product usage and life cycle data, additive manufacturing and sensor integration, automated design and cloud-based services in manufacturing. By joining and extrapolating development trends in these areas, it delimits the foundations of a manufacturing concept that will allow continuous and economically viable product optimization on a general, user group or individual user level. This projection is checked against three different application scenarios, each of which stresses different aspects of the underlying holistic concept. The following discussion identifies critical issues and research needs by adopting the relevant stakeholder perspectives.

  9. MagCloud: magazine self-publishing for the long tail

    NASA Astrophysics Data System (ADS)

    Koh, Kok-Wei; Chatow, Ehud

    2010-02-01

    In June of 2008, Hewlett-Packard Labs launched MagCloud, a print-on-demand web service for magazine self-publishing. MagCloud enables anyone to publish their own magazine by simply uploading a PDF file to the site. There are no setup fees, minimum print runs, storage requirements or waste due to unsold magazines. Magazines are only printed when an order is placed, and are shipped directly to the end customer. In the course of building this web service, a number of technological challenges were encountered. In this paper, we will discuss these challenges and the methods used to overcome them. Perhaps the most important decision in enabling the successful launch of MagCloud was the choice to offer a single product. This simplified the PDF validation phase and streamlined the print fulfillment process such that orders can be printed, folded and trimmed in batches, rather than one by one. In a sense, MagCloud adopted the Ford Model T approach to manufacturing, where having just a single model with few or no options allows for efficiencies in the production line, enabling a lower product price and opening the market to a much larger customer base. This platform has resulted in a number of new niche publications - the long tail of publishing.

  10. SeaDataCloud - further developing the pan-European SeaDataNet infrastructure for marine and ocean data management

    NASA Astrophysics Data System (ADS)

    Schaap, Dick M. A.; Fichaut, Michele

    2017-04-01

    SeaDataCloud marks the third phase of developing the pan-European SeaDataNet infrastructure for marine and ocean data management. The SeaDataCloud project is funded by the EU and runs for 4 years from 1st November 2016. It succeeds the successful SeaDataNet II (2011 - 2015) and SeaDataNet (2006 - 2011) projects. SeaDataNet has set up and operates a pan-European infrastructure for managing marine and ocean data and is undertaken by National Oceanographic Data Centres (NODCs) and oceanographic data focal points from 34 coastal states in Europe. The infrastructure comprises a network of interconnected data centres and a central SeaDataNet portal. The portal provides users a harmonised set of metadata directories and controlled access to the large collections of datasets, managed by the interconnected data centres. The population of directories has increased considerably through cooperation with and involvement in many associated EU projects and initiatives such as EMODnet. SeaDataNet at present gives overview and access to more than 1.9 million data sets for physical oceanography, chemistry, geology, geophysics, bathymetry and biology from more than 100 connected data centres from 34 countries riparian to European seas. SeaDataNet is also active in setting and governing marine data standards, and in exploring and establishing interoperability solutions to connect to other e-infrastructures on the basis of standards of ISO (19115, 19139) and OGC (WMS, WFS, CS-W and SWE). Standards and associated SeaDataNet tools are made available at the SeaDataNet portal for wide uptake by data handling and managing organisations. SeaDataCloud aims at further developing standards, innovating services & products, adopting new technologies, and giving more attention to users. Moreover, it is about implementing a cooperation between the SeaDataNet consortium of marine data centres and the EUDAT consortium of e-infrastructure service providers. SeaDataCloud aims at considerably advancing services and increasing their usage by adopting cloud and High Performance Computing technology. SeaDataCloud will empower researchers with a packaged collection of services and tools, tailored to their specific needs, supporting research and enabling the generation of added-value products from marine and ocean data. Substantial activities will be focused on developing added-value services, such as data subsetting, analysis, visualisation, and publishing workflows for users, both regular and advanced, as part of a Virtual Research Environment (VRE). SeaDataCloud aims at a number of leading user communities that have new challenges for upgrading and expanding the SeaDataNet standards and services: science, EMODnet, the Copernicus Marine Environmental Monitoring Service (CMEMS) and EuroGOOS, and international scientific programmes. The presentation will give information on the present SeaDataNet infrastructure and services, the new challenges in SeaDataCloud, and a number of key achievements in SeaDataCloud so far.
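
    Discovery against an OGC CS-W catalogue such as the one the SeaDataNet portal exposes can be sketched with OWSLib. The endpoint URL and search term are illustrative placeholders.

    ```python
    # Hedged sketch: full-text search of an OGC CS-W catalogue with OWSLib.
    from owslib.csw import CatalogueServiceWeb
    from owslib.fes import PropertyIsLike

    csw = CatalogueServiceWeb("https://csw.example-seadatanet.org/csw")  # assumed endpoint
    query = PropertyIsLike("csw:AnyText", "%bathymetry%")
    csw.getrecords2(constraints=[query], maxrecords=10)

    for rec_id, rec in csw.records.items():
        print(rec_id, rec.title)  # matching dataset records
    ```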

  11. Adventures in Private Cloud: Balancing Cost and Capability at the CloudSat Data Processing Center

    NASA Astrophysics Data System (ADS)

    Partain, P.; Finley, S.; Fluke, J.; Haynes, J. M.; Cronk, H. Q.; Miller, S. D.

    2016-12-01

    Since the beginning of the CloudSat Mission in 2006, the CloudSat Data Processing Center (DPC) at the Cooperative Institute for Research in the Atmosphere (CIRA) has been ingesting data from the satellite and other A-Train sensors, producing data products, and distributing them to researchers around the world. The computing infrastructure was specifically designed to fulfill the requirements as specified at the beginning of what was nominally a two-year mission. The environment consisted of servers dedicated to specific processing tasks in a rigid workflow to generate the required products. To the benefit of science, and with credit to the mission engineers, CloudSat has lasted well beyond its planned lifetime and is still collecting data ten years later. Over that period the requirements of the data processing system have greatly expanded and opportunities for providing value-added services have presented themselves. But while demands on the system have increased, the initial design allowed for very little expansion in terms of scalability and flexibility. The design did change to include virtual machine processing nodes and distributed workflows, but infrastructure management was still a time-consuming task when system modification was required to run new tests or implement new processes. To address the scalability, flexibility, and manageability of the system, Cloud computing methods and technologies are now being employed. The use of a public cloud like Amazon Elastic Compute Cloud or Google Compute Engine was considered but, among other issues, data transfer and storage cost becomes a problem, especially when demand fluctuates as a result of reprocessing and the introduction of new products and services. Instead, the existing system was converted to an on-premises private Cloud using the OpenStack computing platform and Ceph software-defined storage to reap the benefits of the Cloud computing paradigm. This work details the decisions that were made, the benefits that have been realized, the difficulties that were encountered, and the issues that still exist.

  12. Analyzing the requirements for a robust security criteria and management of multi-level security in the clouds

    NASA Astrophysics Data System (ADS)

    Farroha, Bassam S.; Farroha, Deborah L.

    2011-06-01

    The new corporate approach to efficient processing and storage is migrating from in-house service-center services to the newly coined approach of Cloud Computing. This approach advocates thin clients and the provision of services by the service provider over time-shared resources. The concept is not new; however, the implementation approach presents a strategic shift in the way organizations provision and manage their IT resources. The requirements on some of the data sets targeted to be run on the cloud vary depending on the data type, originator, user, and confidentiality level. Additionally, the systems that fuse such data would have to deal with classifying the product and clearing the computing resources prior to allowing a new application to be executed. This indicates that we could end up with a multi-level security system that needs to follow specific rules and can send the output to a protected network and systems in order to avoid data spills or contaminated resources. The paper discusses these requirements and their potential impact on the cloud architecture. Additionally, the paper discusses the unexpected advantages of the cloud framework in providing a sophisticated environment for information sharing and data mining.

  13. Hybrid Pluggable Processing Pipeline (HyP3): Programmatic Access to Cloud-Based Processing of SAR Data

    NASA Astrophysics Data System (ADS)

    Weeden, R.; Horn, W. B.; Dimarchi, H.; Arko, S. A.; Hogenson, K.

    2017-12-01

    A problem often faced by Earth science researchers is how to scale algorithms that were developed against a few datasets to regional or global scales. This problem only gets worse as we look to a future in which larger and larger datasets become available. One significant hurdle can be having the processing and storage resources available for such a task, not to mention the administration of those resources. As a processing environment, the cloud offers nearly unlimited potential for compute and storage, with limited administration required. The goal of the Hybrid Pluggable Processing Pipeline (HyP3) project was to demonstrate the utility of the Amazon cloud to process large amounts of data quickly and cost effectively. Principally built by three undergraduate students at the ASF DAAC, the HyP3 system relies on core Amazon cloud services such as Lambda, Relational Database Service (RDS), Elastic Compute Cloud (EC2), Simple Storage Service (S3), and Elastic Beanstalk. HyP3 provides an Application Programming Interface (API) through which users can programmatically interface with the HyP3 system, allowing them to monitor and control processing jobs running in HyP3 and to retrieve the generated HyP3 products when completed. This presentation will focus on the development techniques and enabling technologies that were used in developing the HyP3 system. Data and process flow, from new subscription through to order completion, will be shown, highlighting the benefits of the cloud for each step. Because the HyP3 system can be accessed directly from a user's Python scripts, powerful applications leveraging SAR products can be put together fairly easily. This is the true power of HyP3: allowing people to programmatically leverage the power of the cloud.
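
    As a hedged illustration of what such scripted access might look like, the sketch below submits a job and polls for products over HTTP; the base URL, routes, field names, and process name are hypothetical placeholders, not the documented HyP3 API.

        import time
        import requests

        API = "https://hyp3.example.org/api"           # hypothetical base URL
        auth = {"Authorization": "Bearer MY_TOKEN"}    # hypothetical credential

        # Submit a processing job for one granule (field names are placeholders).
        job = requests.post(f"{API}/jobs",
                            json={"process": "rtc", "granule": "S1A_GRANULE_ID"},
                            headers=auth).json()

        # Poll until the job finishes, then list product download links.
        while True:
            status = requests.get(f"{API}/jobs/{job['id']}", headers=auth).json()["status"]
            if status in ("SUCCEEDED", "FAILED"):
                break
            time.sleep(30)  # poll politely
        products = requests.get(f"{API}/jobs/{job['id']}/products", headers=auth).json()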

  14. Providing Access and Visualization to Global Cloud Properties from GEO Satellites

    NASA Astrophysics Data System (ADS)

    Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Palikonda, R.; Ayers, J. K.

    2015-12-01

    Providing public access to cloud macro- and microphysical properties is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a tool and method that allow end users to easily browse and access cloud information that is otherwise difficult to acquire and manipulate. The core of the tool is an application programming interface (API) that is made available to the public. One goal of the tool is to provide a demonstration to end users so that they can use the dynamically generated imagery as an input into their own workflows for both image generation and cloud product requisition. This project builds upon the NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite cloud product imagery accessible and easily searchable. As virtual supply chains that add value at each link become more common, there is value in making satellite-derived cloud product information available through a simple access method, and in allowing users to browse and view that imagery as they need it rather than in the manner most convenient for the data provider. Using the Open Geospatial Consortium's Web Processing Service as our access method, we describe a system that uses a hybrid local and cloud-based parallel processing back end to return both satellite imagery and cloud product imagery, as well as the binary data used to generate them, in multiple formats. The images and cloud products are sourced from multiple satellites and also from "merged" datasets created by temporally and spatially matching satellite sensors. Finally, the tool and API allow users to access information spanning the full time range for which our group has data; in the case of satellite imagery, the temporal range can span the entire lifetime of the sensor.
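
    For readers unfamiliar with the access method, a WPS Execute call can be issued as a plain HTTP request. In the sketch below, only the service, request, version, identifier, and datainputs keys are standard WPS 1.0.0 KVP parameters; the server URL, process name, and input fields are hypothetical.

        import requests

        resp = requests.get("https://example.org/wps", params={
            "service": "WPS",
            "request": "Execute",
            "version": "1.0.0",
            "identifier": "CloudProductImage",  # hypothetical process name
            "datainputs": "satellite=GOES-13;product=cloud_phase;time=2015-07-01T12:00Z",
        })
        print(resp.status_code, resp.headers.get("Content-Type"))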

  15. A Framework and Improvements of the Korea Cloud Services Certification System.

    PubMed

    Jeon, Hangoo; Seo, Kwang-Kyu

    2015-01-01

    Cloud computing service is an evolving paradigm that affects a large part of the ICT industry and provides new opportunities for ICT service providers, such as the deployment of new business models and the realization of economies of scale through increased efficiency of resource utilization. However, despite the benefits of cloud services, there are obstacles to adoption, such as the lack of means for assessing and comparing the service quality of cloud services with regard to availability, security, and reliability. For cloud services to be adopted successfully and widely, it is necessary to establish a cloud service certification system that ensures their quality and performance. This paper proposes a framework and improvements for the Korea certification system for cloud services. To develop it, the critical issues related to service quality, performance, and certification of cloud services are identified, and a systematic framework covering the certification system and service provider domains is developed. Improvements to the developed Korea certification system for cloud services are also proposed.

  17. Archive Management of NASA Earth Observation Data to Support Cloud Analysis

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark

    2017-01-01

    NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order-of-magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify accommodating such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system, with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring the archive software to a cloud-native architecture, virtualizing data products by computing them on demand, and reorganizing data to be more analysis-friendly.
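
    The subsetting challenge comes down to replacing file seeks with ranged object reads. A minimal sketch, assuming S3-style WOS accessed via boto3, with the bucket, key, and byte offsets as placeholders:

        import boto3

        s3 = boto3.client("s3")
        resp = s3.get_object(
            Bucket="eo-archive-bucket",            # placeholder bucket
            Key="granules/EXAMPLE_GRANULE.nc",     # placeholder object key
            Range="bytes=4096-8191",               # fetch only the needed chunk
        )
        chunk = resp["Body"].read()                # 4096 bytes of the object
        print(len(chunk))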

  18. Analysis of data for LANDSAT (ERTS) follow-on

    NASA Technical Reports Server (NTRS)

    Sizer, J. E. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. Daily weather service satellite photographs of the Midwest were found to be of great assistance before ordering EROS Data Center products. These weather satellite images are a quick and inexpensive record of the location of cloud masses, and they supplement the percent-of-cloud-cover catalogues. Time and money were saved because the location of cloud cover was known before any imagery was ordered.

  19. Geo-spatial Service and Application based on National E-government Network Platform and Cloud

    NASA Astrophysics Data System (ADS)

    Meng, X.; Deng, Y.; Li, H.; Yao, L.; Shi, J.

    2014-04-01

    With the acceleration of China's informatization, the government has taken substantive strides in advancing the development and application of digital technology, promoting the evolution of e-government. Meanwhile, cloud computing, a service model that connects huge resource pools to provide a variety of IT services, has become a relatively mature technical pattern through further study and large-scale practical application. Based on cloud computing technology and the national e-government network platform, the "National Natural Resources and Geospatial Database" (NRGD) project integrated and transformed natural resources and geospatial information dispersed across various sectors and regions, established a logically unified and physically dispersed fundamental database, and developed a national integrated information database system supporting the main e-government applications. Cross-sector e-government applications and services are realized to provide long-term, stable and standardized natural resources and geospatial fundamental information products and services for national e-government and public users.

  20. The StratusLab cloud distribution: Use-cases and support for scientific applications

    NASA Astrophysics Data System (ADS)

    Floros, E.

    2012-04-01

    The StratusLab project is integrating an open cloud software distribution that enables organizations to set up and provide their own private or public IaaS (Infrastructure as a Service) computing clouds. The StratusLab distribution capitalizes on popular infrastructure virtualization solutions such as KVM, the OpenNebula virtual machine manager, the Claudia service manager and the SlipStream deployment platform, which are further enhanced and expanded with additional components developed within the project. The StratusLab distribution covers the core aspects of a cloud IaaS architecture, namely Computing (life-cycle management of virtual machines), Storage, Appliance management and Networking. The resulting software stack provides a packaged turn-key solution for deploying cloud computing services. The cloud computing infrastructures deployed using StratusLab can support a wide range of scientific and business use cases. Grid computing has been the primary use case pursued by the project, and for this reason the initial priority has been support for the deployment and operation of fully virtualized production-level grid sites; this goal has already been achieved by operating such a site as part of EGI's (European Grid Initiative) pan-European grid infrastructure. In this area the project is currently working to provide non-trivial capabilities such as elastic and autonomic management of grid site resources. Although grid computing has been the motivating paradigm, StratusLab's cloud distribution can support a wider range of use cases. To this end, we have developed and currently provide support for setting up general-purpose computing solutions such as Hadoop, MPI and Torque clusters. Regarding scientific applications, the project is collaborating closely with the bioinformatics community to prepare VM appliances and deploy optimized services for bioinformatics applications. In a similar manner, additional scientific disciplines such as Earth Science can take advantage of StratusLab cloud solutions. Interested users are welcome to join StratusLab's user community by getting access to the reference cloud services deployed by the project and offered to the public.

  1. A service brokering and recommendation mechanism for better selecting cloud services.

    PubMed

    Gui, Zhipeng; Yang, Chaowei; Xia, Jizhe; Huang, Qunying; Liu, Kai; Li, Zhenlong; Yu, Manzhu; Sun, Min; Zhou, Nanyin; Jin, Baoxuan

    2014-01-01

    Cloud computing is becoming the new-generation computing infrastructure, and many cloud vendors provide different types of cloud services. How to choose the best cloud services for specific applications is very challenging. Addressing this challenge requires balancing multiple factors, such as business demands, technologies, policies and preferences, in addition to the computing requirements. This paper recommends a mechanism for selecting the best public cloud service at the levels of Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). A systematic framework and associated workflow include cloud service filtration, solution generation, evaluation, and selection of public cloud services. Specifically, we propose the following: a hierarchical information model for integrating heterogeneous cloud information from different providers, and a corresponding cloud information collecting mechanism; a cloud service classification model for categorizing and filtering cloud services, and an application requirement schema providing rules for creating application-specific configuration solutions; and a preference-aware solution evaluation model for evaluating and recommending solutions according to the preferences of application providers. To test the proposed framework and methodologies, a cloud service advisory tool prototype was developed, after which relevant experiments were conducted. The results show that the proposed system collects/updates/records the cloud information from multiple mainstream public cloud services in real time; generates feasible cloud configuration solutions according to user specifications and acceptable cost prediction; assesses solutions from multiple aspects (e.g., computing capability, potential cost and Service Level Agreement, SLA); offers rational recommendations based on user preferences and practical cloud provisioning; and visually presents and compares solutions through an interactive web Graphical User Interface (GUI).
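
    The preference-aware evaluation step can be pictured as weighted multi-criteria scoring; the sketch below is a minimal stand-in (not the paper's exact model), with metric names, weights, and candidate values invented for the demo.

        def rank(solutions, weights, higher_is_better):
            # Normalize each metric to [0, 1] across candidates, flip
            # "lower is better" metrics, then compute a weighted sum.
            def norm(metric, value):
                vals = [s[metric] for s in solutions.values()]
                lo, hi = min(vals), max(vals)
                x = 0.5 if hi == lo else (value - lo) / (hi - lo)
                return x if higher_is_better[metric] else 1.0 - x

            scores = {name: sum(w * norm(m, s[m]) for m, w in weights.items())
                      for name, s in solutions.items()}
            return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

        solutions = {
            "providerA.m1": {"cost": 0.45, "cpu": 80, "sla": 99.9},
            "providerB.s2": {"cost": 0.30, "cpu": 65, "sla": 99.5},
        }
        weights = {"cost": 0.5, "cpu": 0.3, "sla": 0.2}
        higher = {"cost": False, "cpu": True, "sla": True}
        print(rank(solutions, weights, higher))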

  3. Performance Evaluation of Cloud Service Considering Fault Recovery

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Tan, Feng; Dai, Yuan-Shun; Guo, Suchang

    In cloud computing, cloud service performance is an important issue. To improve cloud service reliability, fault recovery may be used. However, the use of fault recovery can itself affect the performance of the cloud service. In this paper, we conduct a preliminary study of this issue. Cloud service performance is quantified by the service response time, whose probability density function and mean are derived.
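
    As a minimal illustration of the flavor of such a derivation (not the paper's model), suppose a request's fault-free service time is S, a fault occurs with probability p, and each fault adds a recovery time R followed by one re-execution S'; the response time and its mean are then:

        T = \begin{cases} S, & \text{with probability } 1-p,\\ S + R + S', & \text{with probability } p, \end{cases}
        \qquad
        \mathbb{E}[T] = \mathbb{E}[S] + p\,\bigl(\mathbb{E}[R] + \mathbb{E}[S']\bigr).

    Under these assumptions, the probability density function of T is the corresponding mixture of the density of S and the convolution of the densities of S, R and S'.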

  4. Returns to Scale in the Production of Hospital Services

    PubMed Central

    Berry, Ralph E.

    1967-01-01

    The primary purpose of this article is to investigate whether or not economies of scale exist in the production of hospital services. In previous studies the results have implied the existence of economies of scale, but the question has not been satisfactorily resolved. The factor most responsible for clouding the issue is the overwhelming prevalence of product differences in the outputs of hospitals. In this study a method which avoids the problem of product differentiation is developed. The analysis strongly supports the conclusion that hospital services are produced subject to economies of scale. PMID:6054380

  5. Generic-distributed framework for cloud services marketplace based on unified ontology.

    PubMed

    Hasan, Samer; Valli Kumari, V

    2017-11-01

    Cloud computing is a pattern for delivering ubiquitous and on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find an adequate service. Unfortunately, general-purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic distributed framework for a cloud services marketplace to automate the cloud service discovery and selection process and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms: a dominant-and-recessive-attributes algorithm borrowed from genetics, and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to it. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced execution time by 20% while maintaining the same values for all other parameters; the dominant-and-recessive-attributes approach reduced execution time by 57% but showed lower recall.
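
    Neither of the paper's two algorithms is reproduced here, but the matching task itself can be sketched with a simple set-overlap score; the attribute vocabulary and offers below are invented for the demo.

        def jaccard(a: set, b: set) -> float:
            # Overlap of requested vs. advertised attributes, in [0, 1].
            return len(a & b) / len(a | b) if a | b else 1.0

        request = {"iaas", "linux", "ssd", "eu-region", "hourly-billing"}
        offers = {
            "offer1": {"iaas", "linux", "ssd", "us-region", "hourly-billing"},
            "offer2": {"iaas", "windows", "hdd", "eu-region", "monthly-billing"},
        }
        scores = {name: jaccard(request, attrs) for name, attrs in offers.items()}
        print(max(scores, key=scores.get), scores)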

  6. Improvements in Cloud Remote Sensing from Fusing VIIRS and CrIS data

    NASA Astrophysics Data System (ADS)

    Heidinger, A. K.; Walther, A.; Lindsey, D. T.; Li, Y.; NOH, Y. J.; Botambekov, D.; Miller, S. D.; Foster, M. J.

    2016-12-01

    In the fall of 2016, NOAA began the operational production of cloud products from the S-NPP Visible and Infrared Imaging Radiometer Suite (VIIRS) using the NOAA Enterprise Algorithms. VIIRS, while providing unprecedented spatial resolution and imaging clarity, lacks certain IR channels that are beneficial to cloud remote sensing. At the UW Space Science and Engineering Center (SSEC), tools were written to generate the missing IR channels from the Cross-track Infrared Sounder (CrIS) and to map them into the VIIRS swath. The NOAA Enterprise Algorithms are also implemented in the NESDIS CLAVR-x system, and CLAVR-x has been modified to use the fused VIIRS and CrIS data. This presentation will highlight the benefits the CrIS data offer to the NOAA Enterprise Algorithms. These benefits have also enabled the generation of 3D cloud retrievals to support the National Weather Service (NWS) request for a Cloud Cover Layers product. Lastly, the benefits of using VIIRS and CrIS for achieving consistency with GOES-R will also be demonstrated.

  7. Government Cloud Computing Policies: Potential Opportunities for Advancing Military Biomedical Research.

    PubMed

    Lebeda, Frank J; Zalatoris, Jeffrey J; Scheerer, Julia B

    2018-02-07

    This position paper summarizes the development and present status of Department of Defense (DoD) and other government policies and guidance regarding cloud computing services. Because biomedical big datasets are heterogeneous and growing, cloud computing services offer an opportunity to mitigate the associated storage and analysis requirements. Having on-demand network access to a shared pool of flexible computing resources creates a consolidated system that should reduce potential duplications of effort in military biomedical research. Interactive, online literature searches were performed with Google, at the Defense Technical Information Center, and at two National Institutes of Health research portfolio information sites. References cited within some of the collected documents also served as literature resources. We gathered, selected, and reviewed DoD and other government cloud computing policies and guidance published from 2009 to 2017. These policies were intended to consolidate computer resources within the government and reduce costs by decreasing the number of federal data centers and by migrating electronic data to cloud systems. Initial White House Office of Management and Budget information technology guidelines were developed for cloud usage, followed by policies and other documents from the DoD, the Defense Health Agency, and the Armed Services. Security standards from the National Institute of Standards and Technology, the General Services Administration, the DoD, and the Army were also developed. The General Services Administration and DoD Inspectors General monitored cloud usage by the DoD. A 2016 Government Accountability Office report characterized cloud computing as economical, flexible and fast. A congressionally mandated independent study reported that the DoD was active in offering a wide selection of commercial cloud services in addition to its milCloud system. Our findings from the Department of Health and Human Services indicated that the security infrastructure in cloud services may be more compliant with the Health Insurance Portability and Accountability Act of 1996 regulations than traditional methods. To gauge the DoD's adoption of cloud technologies, proposed metrics included cost factors, ease of use, automation, availability, accessibility, security, and policy compliance. Since 2009, plans and policies have been developed for the use of cloud technology to help consolidate and reduce the number of data centers, which was expected to reduce costs, improve environmental factors, enhance information technology security, and maintain mission support for service members. Cloud technologies were also expected to improve employee efficiency and productivity. Federal cloud computing policies within the last decade also offered increased opportunities to advance military healthcare. It was assumed that these opportunities would benefit consumers of healthcare and health science data by allowing more access to centralized cloud computing facilities to store, analyze, search and share relevant data, to enhance standardization, and to reduce potential duplications of effort. We recommend that cloud computing be considered by DoD biomedical researchers for increasing connectivity, presumably by facilitating communications and data sharing, among the various intra- and extramural laboratories. We also recommend that policies and other guidance be updated to include developing additional metrics that will help stakeholders evaluate the above-mentioned assumptions and expectations.

  8. Prototyping manufacturing in the cloud

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2017-08-01

    This paper attempts a theoretical approach to cloud systems and their impact on production systems. Cloud computing is a relatively new concept in informatics, representing distributed computing services, applications, access to information, and data storage without the user needing to know the physical location and configuration of the systems. The advantages of this approach are above all computing speed and storage capacity without investment in additional configurations, synchronization of user data, and data processing using web applications. The disadvantage is data security, for which a satisfactory solution is still being sought, leading to user mistrust. The case study is applied to one module of the production system, because the full system is complex.

  9. Future of Department of Defense Cloud Computing Amid Cultural Confusion

    DTIC Science & Technology

    2013-03-01

    enterprise cloud-computing environment and transition to a public cloud service provider. Services have started the development of individual cloud-computing environments ... endorsing cloud computing. It addresses related issues in matters of service culture changes and how strategic leaders will dictate the future of cloud ... through data center consolidation and individual Service-provided cloud computing.

  10. A study on strategic provisioning of cloud computing services.

    PubMed

    Whaiduzzaman, Md; Haque, Mohammad Nazmul; Rejaul Karim Chowdhury, Md; Gani, Abdullah

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models "everything-as-a-service." Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for customers to choose the best services. By employing successful service provisioning, the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics, can be guaranteed. Hence, continuous service provisioning that satisfies the user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provisioning techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified.

  12. A sustainability model based on cloud infrastructures for core and downstream Copernicus services

    NASA Astrophysics Data System (ADS)

    Manunta, Michele; Calò, Fabiana; De Luca, Claudio; Elefante, Stefano; Farres, Jordi; Guzzetti, Fausto; Imperatore, Pasquale; Lanari, Riccardo; Lengert, Wolfgang; Zinno, Ivana; Casu, Francesco

    2014-05-01

    The upcoming Sentinel missions have been designed to be the first remote sensing satellite system devoted to operational services. In particular, the Synthetic Aperture Radar (SAR) Sentinel-1 sensor, dedicated to acquiring globally over land in interferometric mode, guarantees an unprecedented capability to investigate and monitor Earth surface deformations related to natural and man-made hazards. Thanks to the global coverage strategy and 12-day revisit time, together with the free and open access data policy, such a system will allow extensive application of Differential Interferometric SAR (DInSAR) techniques. In such a framework, the European Commission has been funding several projects through the GMES and Copernicus programs, aimed at preparing the user community for the operational and extensive use of Sentinel-1 products for risk mitigation and management purposes. Among them, FP7-DORIS, an advanced GMES downstream service coordinated by the Italian National Research Council (CNR), is based on the full exploitation of advanced DInSAR products in landslide and subsidence contexts. In particular, the DORIS project (www.doris-project.eu) has developed innovative scientific techniques and methodologies to support Civil Protection Authorities (CPA) during the pre-event, event, and post-event phases of the risk management cycle. Nonetheless, the huge data stream expected from the Sentinel-1 satellite may jeopardize the effective use of such data in emergency response and security scenarios. This potential bottleneck can be overcome through the development of modern infrastructures able to efficiently provide computing resources as well as advanced services for big data management, processing and dissemination. In this framework, CNR and ESA have established a cooperation to foster the use of GRID and cloud computing platforms for remote sensing data processing, and to make advanced and innovative tools for DInSAR product generation and exploitation available to a large audience. In particular, CNR is porting the multi-temporal DInSAR technique referred to as the Small Baseline Subset (SBAS) into the ESA G-POD (Grid Processing On Demand) and CIOP (Cloud Computing Operational Pilot) platforms (Elefante et al., 2013) within the SuperSites Exploitation Platform (SSEP) project, whose aim is to contribute to the development of an ecosystem for big geo-data processing and dissemination. This work focuses on the main results achieved by the DORIS project concerning the use of advanced DInSAR products for supporting CPA during the risk management cycle. Furthermore, based on the DORIS experience, a sustainability model for Core and Downstream Copernicus services based on the effective exploitation of cloud platforms is proposed. In this framework, the remote sensing community, both service providers and users, can significantly benefit from the Helix Nebula (The Science Cloud) initiative, created by European scientific institutions, agencies, SMEs and enterprises to pave the way for the development and exploitation of a cloud computing infrastructure for science. REFERENCES: Elefante, S., Imperatore, P., Zinno, I., Manunta, M., Mathot, E., Brito, F., Farres, J., Lengert, W., Lanari, R., Casu, F., 2013. "SBAS-DInSAR time series generation on cloud computing platforms." IEEE IGARSS Conference, Melbourne, Australia, July 2013.

  13. A Hybrid Cloud Computing Service for Earth Sciences

    NASA Astrophysics Data System (ADS)

    Yang, C. P.

    2016-12-01

    Cloud computing is becoming the norm for providing computing capabilities for advancing Earth sciences, including big Earth data management, processing, analytics, model simulations, and many other aspects. A hybrid spatiotemporal cloud computing service has been built at the George Mason NSF Spatiotemporal Innovation Center to meet these demands. This paper reports on several aspects of the service: 1) the hardware includes 500 computing servers and close to 2 PB of storage, as well as connections to the XSEDE Jetstream and Caltech experimental cloud computing environments for resource sharing; 2) the cloud service is geographically distributed across the east coast, the west coast, and the central region; 3) the cloud includes private clouds managed using OpenStack and Eucalyptus; DC2 is used to bridge these with the public AWS cloud for interoperability and for sharing computing resources when high demand surges; 4) the cloud service supports the NSF EarthCube program through the ECITE project, and ESIP through the ESIP cloud computing cluster, the semantics testbed cluster, and other clusters; 5) the cloud service is also available to the Earth science community for conducting geoscience research. A brief introduction on how to use the cloud service is included.

  14. A Cloud-Based Infrastructure for Near-Real-Time Processing and Dissemination of NPP Data

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Valente, E. G.; Chettri, S. S.

    2011-12-01

    We are building a scalable cloud-based infrastructure for generating and disseminating near-real-time data products from a variety of geospatial and meteorological data sources, including the new National Polar-orbiting Operational Environmental Satellite System (NPOESS) Preparatory Project (NPP). Our approach relies on linking Direct Broadcast and other data streams to a suite of scientific algorithms coordinated by NASA's International Polar-Orbiter Processing Package (IPOPP). The resulting data products are directly accessible to a wide variety of end-user applications via industry-standard protocols such as OGC Web Services, Unidata Local Data Manager, or OPeNDAP, using open source software components. The processing chain employs on-demand computing resources from Amazon.com's Elastic Compute Cloud and NASA's Nebula cloud services. Our current prototype targets short-term weather forecasting, in collaboration with NASA's Short-term Prediction Research and Transition (SPoRT) program and the National Weather Service. Direct Broadcast is especially crucial for NPP, whose current ground segment is unlikely to deliver data quickly enough for short-term weather forecasters and other near-real-time users. Direct Broadcast also allows full local control over data handling, from the receiving antenna to end-user applications: this provides opportunities to streamline processes for data ingest, processing, and dissemination, and thus to make interpreted data products (Environmental Data Records) available to practitioners within minutes of data capture at the sensor. Cloud computing lets us grow and shrink computing resources to meet large and rapid fluctuations in data availability (twice daily for polar orbiters) and similarly large fluctuations in demand from our target (near-real-time) users. This offers a compelling business case for cloud computing: the processing or dissemination systems can grow arbitrarily large to sustain near-real-time data access despite surges in data volumes or user demand, and that computing capacity (and its hourly costs) can be dropped almost instantly once the surge passes. Cloud computing also allows low-risk experimentation with a variety of machine architectures (processor types; bandwidth, memory, and storage capacities, etc.) and of system configurations (including massively parallel computing patterns). Finally, our service-based approach (in which user applications invoke software processes on a Web-accessible server) facilitates access into datasets of arbitrary size and resolution, and allows users to request and receive tailored products on demand. To maximize the usefulness and impact of our technology, we have emphasized open, industry-standard software interfaces. We are also using and developing open source software to facilitate the widespread adoption of similar, derived, or interoperable systems for processing and serving near-real-time data from NPP and other sources.
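
    The grow-and-shrink pattern described above can be sketched in a few lines against the EC2 API; in this hedged example the AMI ID, instance type, and counts are placeholders rather than values from the prototype.

        import boto3

        ec2 = boto3.client("ec2", region_name="us-east-1")

        # Surge: a satellite pass has arrived; add worker capacity.
        run = ec2.run_instances(ImageId="ami-0123456789abcdef0",  # placeholder AMI
                                InstanceType="c5.xlarge",
                                MinCount=4, MaxCount=4)
        ids = [inst["InstanceId"] for inst in run["Instances"]]

        # ... ingest, process, and disseminate the pass ...

        # Surge over: release the capacity (and its hourly cost) at once.
        ec2.terminate_instances(InstanceIds=ids)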

  15. Integration of Cloud Technologies for Data Stewardship at the NOAA National Centers for Environmental Information (NCEI)

    NASA Astrophysics Data System (ADS)

    Casey, K. S.; Hausman, S. A.

    2016-02-01

    In the last year, the NOAA National Oceanographic Data Center (NODC) and its siblings, the National Climatic Data Center and the National Geophysical Data Center, were merged into one organization, the NOAA National Centers for Environmental Information (NCEI). Combining their expertise under one management has helped NCEI accelerate its efforts to embrace and integrate private, public, and hybrid cloud environments into its range of data stewardship services. These services span a range of tiers, from basic long-term preservation and access, through enhanced access and scientific quality control, to authoritative product development and international-level services. Throughout these tiers of stewardship, partnerships and pilot projects have been launched to identify technological and policy-oriented challenges, to establish solutions to these problems, and to highlight success stories for emulation during operational integration of the cloud into NCEI's data stewardship activities. These pilot activities include data storage, access, and reprocessing in Amazon Web Services; the OneStop data discovery and access framework project; and a set of Cooperative Research and Development Agreements under the Big Data Project with Amazon, Google, IBM, Microsoft, and the Open Cloud Consortium. Progress in these efforts will be highlighted, along with a future vision of how NCEI could leverage hybrid cloud deployments and federated systems across NOAA to enable effective data stewardship for its oceanographic, atmospheric, climatic, and geophysical Big Data.

  16. Testing as a Service with HammerCloud

    NASA Astrophysics Data System (ADS)

    Medrano Llamas, Ramón; Barrand, Quentin; Elmsheuser, Johannes; Legger, Federica; Sciacca, Gianfranco; Sciabà, Andrea; van der Ster, Daniel

    2014-06-01

    HammerCloud was designed and built around the needs of the grid community to test resources and automate operations from a user perspective. Recent developments in the IT space propose a shift to software-defined data centres, in which every layer of the infrastructure can be offered as a service. Testing and monitoring are an integral part of the development, validation, and operation of big systems like the grid. This area is not escaping the paradigm shift, and Testing-as-a-Service (TaaS) offerings, which allow testing of any infrastructure service, such as the Infrastructure-as-a-Service (IaaS) platforms being deployed at many grid sites, from both functional and stress perspectives, are starting to seem natural. This work reviews the recent developments in HammerCloud and its evolution to a TaaS conception, in particular its deployment on the Agile Infrastructure platform at CERN and the testing of many IaaS providers across Europe in the context of experiment requirements. The first section reviews the architectural changes that a service running in the cloud needs, such as an orchestration service or new storage requirements, in order to provide functional and stress testing. The second section reviews the first tests of infrastructure providers in light of the challenges discovered from the architectural point of view. Finally, the third section evaluates future requirements for scalability and features to increase testing productivity.

  17. Sentinel-1 Archive and Processing in the Cloud using the Hybrid Pluggable Processing Pipeline (HyP3) at the ASF DAAC

    NASA Astrophysics Data System (ADS)

    Arko, S. A.; Hogenson, R.; Geiger, A.; Herrmann, J.; Buechler, B.; Hogenson, K.

    2016-12-01

    In the coming years there will be an unprecedented amount of SAR data available on a free and open basis to research and operational users around the globe. The Alaska Satellite Facility (ASF) DAAC hosts, through an international agreement, data from the Sentinel-1 spacecraft and will host data from the upcoming NASA-ISRO SAR (NISAR) mission. To more effectively manage and exploit these vast datasets, ASF DAAC has begun moving portions of the archive to the cloud and utilizing cloud services to provide higher-level processing on the data. The Hybrid Pluggable Processing Pipeline (HyP3) project is designed to support higher-level data processing in the cloud and extend the capabilities of researchers to larger scales. Built upon a set of core Amazon cloud services, the HyP3 system allows users to request data processing using a number of canned algorithms or their own algorithms once they have been uploaded to the cloud. The HyP3 system automatically accesses the ASF cloud-based archive through the DAAC RESTful application programming interface and processes the data on Amazon's Elastic Compute Cloud (EC2). Final products are distributed through Amazon's Simple Storage Service (S3) and are available for user download. This presentation will provide an overview of ASF DAAC's activities moving the Sentinel-1 archive into the cloud and developing the integrated HyP3 system, covering both the benefits and difficulties of working in the cloud. Additionally, we will focus on the utilization of HyP3 for higher-level processing of SAR data. Two example algorithms, for sea-ice tracking and change detection, will be discussed, as well as the mechanism for integrating new algorithms into the pipeline for community use.

  18. Climate Analytics-As-a-Service (CAaas), Advanced Information Systems, and Services to Accelerate the Climate Sciences.

    NASA Astrophysics Data System (ADS)

    McInerney, M.; Schnase, J. L.; Duffy, D.; Tamkin, G.; Nadeau, D.; Strong, S.; Thompson, J. H.; Sinno, S.; Lazar, D.

    2014-12-01

    The climate sciences represent a big data domain that is experiencing unprecedented growth. In our efforts to address the big data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics because it is the knowledge gained from our interactions with big data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by cloud computing. Within this framework, cloud computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics-as-a-service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the big data challenges in this domain. This poster will highlight specific examples of CAaaS using climate reanalysis data, high-performance cloud computing, MapReduce, and the Climate Data Services API.
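
    The MapReduce pattern mentioned above can be illustrated with a toy example (not NASA's implementation): each data chunk maps to a partial (sum, count), and a reduce step combines the partials into a global mean.

        from functools import reduce

        # Hypothetical temperature chunks (kelvin), e.g. one list per file.
        chunks = [[271.3, 272.8, 270.1], [268.9, 274.2], [273.0, 269.5, 275.1]]

        def mapper(chunk):
            return (sum(chunk), len(chunk))      # partial sum and count

        def reducer(a, b):
            return (a[0] + b[0], a[1] + b[1])    # combine partials

        total, count = reduce(reducer, map(mapper, chunks))
        print(f"global mean: {total / count:.2f} K")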

  19. AceCloud: Molecular Dynamics Simulations in the Cloud.

    PubMed

    Harvey, M J; De Fabritiis, G

    2015-05-26

    We present AceCloud, an on-demand service for molecular dynamics simulations. AceCloud is designed to facilitate the secure execution of large ensembles of simulations on an external cloud computing service (currently Amazon Web Services). The AceCloud client, integrated into the ACEMD molecular dynamics package, provides an easy-to-use interface that abstracts all aspects of interaction with the cloud services. This gives the user the experience that all simulations are running on their local machine, minimizing the learning curve typically associated with the transition to using high performance computing services.

  20. NASA Wrangler: Automated Cloud-Based Data Assembly in the RECOVER Wildfire Decision Support System

    NASA Technical Reports Server (NTRS)

    Schnase, John; Carroll, Mark; Gill, Roger; Wooten, Margaret; Weber, Keith; Blair, Kindra; May, Jeffrey; Toombs, William

    2017-01-01

    NASA Wrangler is a loosely coupled, event-driven, highly parallel data aggregation service designed to take advantage of the elastic resource capabilities of cloud computing. Wrangler automatically collects Earth observational data, climate model outputs, derived remote sensing data products, and historic biophysical data for pre-, active-, and post-wildfire decision making. It is a core service of the RECOVER decision support system, which provides rapid-response GIS analytic capabilities to state and local government agencies. Wrangler reduces to minutes the time needed to assemble and deliver crucial wildfire-related data.

  1. A Cloud-Based System for Automatic Hazard Monitoring from Sentinel-1 SAR Data

    NASA Astrophysics Data System (ADS)

    Meyer, F. J.; Arko, S. A.; Hogenson, K.; McAlpin, D. B.; Whitley, M. A.

    2017-12-01

    Despite the all-weather capabilities of Synthetic Aperture Radar (SAR) and its high performance in change detection, the application of SAR to operational hazard monitoring was limited in the past. This has largely been due to high data costs, slow product delivery, and the limited temporal sampling associated with legacy SAR systems. Only since the launch of ESA's Sentinel-1 sensors have routinely acquired, free-of-charge SAR data become available, allowing, for the first time, a meaningful contribution of SAR to disaster monitoring. In this paper, we present recent technical advances of the Sentinel-1-based SAR processing system SARVIEWS, which was originally built to generate hazard products for volcano monitoring centers. We outline the main functionalities of SARVIEWS, including its automatic database interface to the Sentinel-1 holdings of the Alaska Satellite Facility (ASF) and its set of automatic processing techniques. Subsequently, we present recent system improvements that were added to SARVIEWS and allowed for a vast expansion of its hazard services; specifically: (1) in early 2017, the SARVIEWS system was migrated into the Amazon cloud, providing access to cloud capabilities such as elastic scaling of compute resources and cloud-based storage; (2) we co-located SARVIEWS with ASF's cloud-based Sentinel-1 archive, enabling the efficient and cost-effective processing of large data volumes; (3) we integrated SARVIEWS with ASF's HyP3 system (http://hyp3.asf.alaska.edu/), providing functionality such as subscription creation via API or map interface as well as automatic email notification; (4) we automated the production chains for seismic and volcanic hazards by integrating SARVIEWS with the USGS earthquake notification service (ENS) and the USGS eruption alert system; email notifications from both services are parsed, and subscriptions are automatically created when certain event criteria are met; (5) finally, SARVIEWS-generated hazard products are now being made available to the public via the SARVIEWS hazard portal. These improvements have led to the expansion of SARVIEWS toward a broader set of hazard situations, now including volcanoes, earthquakes, and severe weather. We provide details on newly developed techniques and show examples of disasters for which SARVIEWS was invoked.
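
    Point (4) amounts to a simple rule engine over parsed notifications. A hedged sketch of what such a trigger might look like, with the event fields and thresholds invented (the paper does not state SARVIEWS' actual criteria):

        def should_create_subscription(event: dict) -> bool:
            # e.g., only shallow, strong earthquakes are likely to produce
            # surface deformation detectable by SAR change detection.
            return event["magnitude"] >= 6.0 and event["depth_km"] <= 30.0

        # Fields parsed from a hypothetical notification email.
        event = {"id": "us7000abcd", "magnitude": 6.4, "depth_km": 12.0}
        if should_create_subscription(event):
            print(f"creating SAR processing subscription for event {event['id']}")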

  2. Cloud Infrastructure & Applications - CloudIA

    NASA Astrophysics Data System (ADS)

    Sulistio, Anthony; Reich, Christoph; Doelitzscher, Frank

    The idea behind Cloud Computing is to deliver Infrastructure-as-a-Service and Software-as-a-Service over the Internet on an easy pay-per-use business model. To harness the potential of Cloud Computing for e-learning and research purposes, as well as for small and medium-sized enterprises, Hochschule Furtwangen University has established a new project called Cloud Infrastructure & Applications (CloudIA). The CloudIA project is a market-oriented cloud infrastructure that leverages different virtualization technologies by supporting Service-Level Agreements for various service offerings. This paper describes the CloudIA project in detail and mentions our early experiences in building a private cloud using existing infrastructure.

  3. Cloud Computing Value Chains: Understanding Businesses and Value Creation in the Cloud

    NASA Astrophysics Data System (ADS)

    Mohammed, Ashraf Bany; Altmann, Jörn; Hwang, Junseok

    Based on the promising developments in Cloud Computing technologies in recent years, commercial computing resource services (e.g. Amazon EC2) and software-as-a-service offerings (e.g. Salesforce.com) came into existence. However, the relatively weak business exploitation, participation, and adoption of other Cloud Computing services remain the main challenges. The vague value structures seem to be hindering business adoption and the creation of sustainable business models around the technology. Using an extensive analysis of existing Cloud business models, Cloud services, stakeholder relations, market configurations and value structures, this chapter develops a reference model for value chains in the Cloud. Although this model is theoretically based on Porter's value chain theory, the proposed Cloud value chain model is adapted to fit the diversity of business service scenarios in Cloud computing markets. Using this model, different service scenarios are explained. Our findings suggest new services, business opportunities, and policy practices for realizing more adoption and value creation paths in the Cloud.

  4. Green Cloud on the Horizon

    NASA Astrophysics Data System (ADS)

    Ali, Mufajjul

    This paper proposes a Green Cloud model for mobile Cloud computing. The proposed model leverages the current trends of IaaS (Infrastructure as a Service), PaaS (Platform as a Service) and SaaS (Software as a Service), and looks at a new paradigm called "Network as a Service" (NaaS). The Green Cloud model proposes various telco revenue-generating streams and services built on CaaS (Cloud as a Service) for the near future.

  5. Decadal GPS Time Series and Velocity Fields Spanning the North American Continent and Beyond: New Data Products, Cyberinfrastructure and Case Studies from the EarthScope Plate Boundary Observatory (PBO) and Other Regional Networks

    NASA Astrophysics Data System (ADS)

    Phillips, D. A.; Herring, T.; Melbourne, T. I.; Murray, M. H.; Szeliga, W. M.; Floyd, M.; Puskas, C. M.; King, R. W.; Boler, F. M.; Meertens, C. M.; Mattioli, G. S.

    2017-12-01

    The Geodesy Advancing Geosciences and EarthScope (GAGE) Facility, operated by UNAVCO, provides a diverse suite of geodetic data, derived products and cyberinfrastructure services to support community Earth science research and education. GPS data and products, including decadal station position time series and velocities, are provided for 2000+ continuous GPS stations from the Plate Boundary Observatory (PBO) and other networks distributed throughout the high Arctic, North America, and the Caribbean. The position time series contain a multitude of signals in addition to the secular motions, including coseismic and postseismic displacements, interseismic strain accumulation, and transient signals associated with hydrologic and other processes. We present our latest velocity field solutions, new time series offset estimate products, and new time series examples associated with various phenomena. Position time series, and the signals they contain, are inherently dependent upon analysis parameters such as network scaling and reference frame realization. The estimation of scale changes, for example, a common practice, has large impacts on vertical motion estimates. GAGE/PBO velocities and time series are currently provided in IGS (IGb08) and North America (NAM08, IGb08 rotated to a fixed North America Plate) reference frames. We are reprocessing all data (1996 to present) as part of the transition from IGb08 to IGS14 that began in 2017, and the new NAM14 and IGS14 data products are discussed. GAGE/PBO GPS data products are currently generated using onsite computing clusters. As part of an NSF-funded EarthCube Building Blocks project called "Deploying Multi-Facility Cyberinfrastructure in Commercial and Private Cloud-based Systems (GeoSciCloud)", we are investigating performance, cost, and efficiency differences between local computing resources and cloud-based resources. Test environments include a commercial cloud provider (Amazon/AWS), NSF cloud-like infrastructures within XSEDE (TACC, the Texas Advanced Computing Center), and in-house cyberinfrastructure. Preliminary findings from this effort are presented. Web services developed by UNAVCO to facilitate the discovery, customization and dissemination of GPS data and products are also presented.

  6. Operational Estimation of Accumulated Precipitation using Satellite Observation, by Eumetsat Satellite Application facility in Support to Hydrology (H-SAF Consortium).

    NASA Astrophysics Data System (ADS)

    di Diodato, A.; de Leonibus, L.; Zauli, F.; Biron, D.; Melfi, D.

    2009-04-01

    Satellite Application Facilities (SAFs) are specialised development and processing centres of the EUMETSAT Distributed Ground Segment. SAFs process level 1b data from meteorological satellites (geostationary and polar) in conjunction with all other relevant sources of data and appropriate models to generate services and level 2 products. Each SAF is a consortium of EUMETSAT European partners led by a host institute responsible for the management of the complete SAF project. The Meteorological Service of the Italian Air Force is the host institute for the Satellite Application Facility on Support to Operational Hydrology and Water Management (H-SAF). H-SAF is committed to developing and providing, operationally after 2010, products for precipitation, soil moisture and snow. H-SAF will also provide information on the error structure of its products and validate them through their impact on hydrological models; a specific subgroup has been established for this purpose. Accumulated precipitation is computed by temporal integration of the instantaneous rain rate obtained from the blended LEO/MW and GEO/IR precipitation-rate products generated by the Rapid Update method, available every 15 minutes. The algorithm provides four outputs, consisting of precipitation accumulated over 3, 6, 12 and 24 hours, delivered every 3 hours at the synoptic hours. These outputs are our precipitation background fields. Satellite estimates can cover most of the globe; however, they suffer from errors due to the lack of a direct relationship between the observed parameters and precipitation, poor sampling, and algorithm imperfections. For this reason the 3-hour accumulated precipitation is screened against climatic thresholds derived from the "Climate Atlas of Europe" project, led by Meteo France within the ECSN (European Climate Support Network) project of EUMETNET. To reduce the bias introduced by the satellite estimates, rain gauge data obtained through the GTS network are used for intercalibration with the satellite estimates. Precipitation increments are estimated at each observation location from the observation and the interpolated background field. A field of increments is produced by a standard Kriging method. The final precipitation analysis is obtained as the sum of the increments and the precipitation estimate at each grid point. Major error sources in retrieving 15-minute instantaneous precipitation from cloud-top temperature are high (cold) non-precipitating clouds and the use of the same regression coefficients for both warm (stratiform) and cold (convective) clouds. Since this error is intrinsic to the blending technique, performance will be improved using cloud-type-specific retrievals: clouds are first discriminated into convective and stratiform classes, precipitation is retrieved in parallel for the two classes, and the two outputs are merged into one product, with doubly retrieved pixels resolved in favour of the convective retrieval. The basic tool for this is the computation of two different lookup tables that associate precipitation with brightness temperature for the two kinds of cloudiness. Cloud discrimination will be done with the NWC-SAF "cloud type" product for stratiform clouds and with NEFODINA, an application running operationally at the Italian Met Service, for automatic detection of convective phenomena. Results of studies to further improve the accumulated precipitation product are also presented. These studies exploit other sources of information, such as quantitative precipitation forecasts (QPF) from numerical weather prediction models, to improve the algorithm where the density of ground observations is low, or to serve as a background field for generating a precipitation analysis by an optimal interpolation technique.
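
    The increment-based analysis above has a compact illustration. The sketch below is a simplified stand-in, not the H-SAF code: SciPy's linear interpolation replaces the Kriging step, and the background field and gauge data are invented. It computes gauge-minus-background increments, grids them, and adds them back to the satellite background.

```python
# Simplified sketch of the increment-based precipitation analysis:
# analysis = satellite background + interpolated (gauge - background) increments.
# NOTE: linear interpolation stands in for the Kriging step described above;
# the gauge locations and background field are invented for illustration.
import numpy as np
from scipy.interpolate import griddata

# Hypothetical 3-hour accumulated satellite background on a 50x50 grid (mm).
lon = np.linspace(5.0, 20.0, 50)
lat = np.linspace(35.0, 48.0, 50)
glon, glat = np.meshgrid(lon, lat)
background = 5.0 + 3.0 * np.sin(glon / 3.0) * np.cos(glat / 4.0)

# Hypothetical rain gauges: (lon, lat, observed accumulation in mm).
gauges = np.array([[7.1, 44.9, 9.2], [12.5, 41.9, 3.1], [16.4, 39.2, 6.8],
                   [9.2, 45.5, 11.0], [14.3, 37.6, 1.5]])

# Increment at each gauge = observation minus background interpolated there.
bg_at_gauges = griddata((glon.ravel(), glat.ravel()), background.ravel(),
                        gauges[:, :2], method="linear")
increments = gauges[:, 2] - bg_at_gauges

# Spread the increments over the grid (Kriging in the paper; linear here),
# falling back to 0 outside the gauges' convex hull.
inc_field = griddata(gauges[:, :2], increments, (glon, glat),
                     method="linear", fill_value=0.0)

analysis = np.clip(background + inc_field, 0.0, None)  # no negative rain
print(analysis.mean(), analysis.max())
```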

  7. Home Page - Satellite Products and Services Division/Office of Satellite

    Science.gov Websites

    Products: CLAVR-x cloud products, surface oil analysis (Deepwater Horizon), East Coast IR (GOES East), West Coast IR (GOES West), selected image of the day, and volcano information (Washington VAAC).

  8. The Geo Data Portal an Example Physical and Application Architecture Demonstrating the Power of the "Cloud" Concept.

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Booth, N.; Walker, J.; Kunicki, T.

    2012-12-01

    The U.S. Geological Survey Center for Integrated Data Analytics (CIDA), in keeping with the President's Digital Government Strategy and the Department of the Interior's IT Transformation initiative, has evolved its data center and application architecture toward the "cloud" paradigm. In this case, "cloud" refers to a goal of developing services that may be distributed to infrastructure anywhere on the Internet. This transition has taken place across the entire data management spectrum, from data center location to physical hardware configuration to software design and implementation. In CIDA's case, physical hardware resides in Madison at the Wisconsin Water Science Center, in South Dakota at the Earth Resources Observation and Science Center (EROS), and in the near future at a DOI-approved commercial vendor. Tasks normally conducted on desktop-based GIS software with local copies of data in proprietary formats are now done using browser-based interfaces to web processing services drawing on a network of standard data-source web services. Organizations are gaining economies of scale through data center consolidation and the creation of private cloud services, as well as taking advantage of the commoditization of data processing services. Leveraging open standards for data and data management takes advantage of this commoditization and provides the means to reliably build distributed, service-based systems. This presentation will use CIDA's experience as an illustration of the benefits and hurdles of moving to the cloud. Replicating, reformatting, and processing large data sets, such as downscaled climate projections, traditionally present a substantial challenge to environmental science researchers who need access to data subsets and derived products. The USGS Geo Data Portal (GDP) project uses cloud concepts to help earth system scientists access subsets, spatial summaries, and derivatives of commonly needed very large datasets. The GDP project has developed a reusable architecture and advanced processing services that currently access archives hosted at Lawrence Livermore National Lab, Oregon State University, the University Corporation for Atmospheric Research, and the U.S. Geological Survey, among others. Several examples of how the GDP project uses cloud concepts will be highlighted in this presentation: 1) The high-bandwidth network connectivity of large data centers reduces the need for data replication and storage local to processing services. 2) Standard data-serving web services, like OPeNDAP, Web Coverage Services, and Web Feature Services, allow GDP services to remotely access custom subsets of data in a variety of formats, further reducing the need for data replication and reformatting. 3) The GDP services use standard web service APIs to allow browser-based user interfaces to run complex and compute-intensive processes for users from any computer with an Internet connection. The combination of physical infrastructure and application architecture implemented for the Geo Data Portal project offers an operational example of how distributed data and processing on the cloud can be used to aid earth system science.
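
    Point 2 above is easy to make concrete. A minimal sketch of standards-based remote subsetting with xarray over OPeNDAP follows; the dataset URL and variable name are placeholders, not an actual GDP endpoint.

```python
# Remote subsetting over OPeNDAP, in the spirit of the GDP's standards-based
# access: only the requested slab crosses the network, so no local replica of
# the full archive is needed. The URL and variable name are hypothetical.
import xarray as xr

url = "https://example.org/thredds/dodsC/downscaled/tasmax.nc"  # placeholder
ds = xr.open_dataset(url)  # lazy: reads metadata only

# Request a spatial/temporal subset; values are fetched only on access.
subset = ds["tasmax"].sel(time=slice("2050-01-01", "2050-12-31"),
                          lat=slice(42.0, 47.0), lon=slice(-93.0, -86.0))
annual_mean = subset.mean(dim="time")
print(annual_mean)
```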

  9. Key Lessons in Building "Data Commons": The Open Science Data Cloud Ecosystem

    NASA Astrophysics Data System (ADS)

    Patterson, M.; Grossman, R.; Heath, A.; Murphy, M.; Wells, W.

    2015-12-01

    Cloud computing technology has created a shift around data and data analysis by allowing researchers to push computation to data as opposed to having to pull data to an individual researcher's computer. Subsequently, cloud-based resources can provide unique opportunities to capture computing environments used both to access raw data in its original form and also to create analysis products which may be the source of data for tables and figures presented in research publications. Since 2008, the Open Cloud Consortium (OCC) has operated the Open Science Data Cloud (OSDC), which provides scientific researchers with computational resources for storing, sharing, and analyzing large (terabyte and petabyte-scale) scientific datasets. OSDC has provided compute and storage services to over 750 researchers in a wide variety of data intensive disciplines. Recently, internal users have logged about 2 million core hours each month. The OSDC also serves the research community by colocating these resources with access to nearly a petabyte of public scientific datasets in a variety of fields also accessible for download externally by the public. In our experience operating these resources, researchers are well served by "data commons," meaning cyberinfrastructure that colocates data archives, computing, and storage infrastructure and supports essential tools and services for working with scientific data. In addition to the OSDC public data commons, the OCC operates a data commons in collaboration with NASA and is developing a data commons for NOAA datasets. As cloud-based infrastructures for distributing and computing over data become more pervasive, we ask, "What does it mean to publish data in a data commons?" Here we present the OSDC perspective and discuss several services that are key in architecting data commons, including digital identifier services.

  10. Analysis of the Security and Privacy Requirements of Cloud-Based Electronic Health Records Systems

    PubMed Central

    Fernández, Gonzalo; López-Coronado, Miguel

    2013-01-01

    Background The Cloud Computing paradigm offers eHealth systems the opportunity to enhance the features and functionality that they offer. However, moving patients’ medical information to the Cloud implies several risks in terms of the security and privacy of sensitive health records. In this paper, the risks of hosting Electronic Health Records (EHRs) on the servers of third-party Cloud service providers are reviewed. To protect the confidentiality of patient information and facilitate the process, some suggestions for health care providers are made. Moreover, security issues that Cloud service providers should address in their platforms are considered. Objective To show that, before moving patient health records to the Cloud, security and privacy concerns must be considered by both health care providers and Cloud service providers. Security requirements of a generic Cloud service provider are analyzed. Methods To study the latest in Cloud-based computing solutions, bibliographic material was obtained mainly from Medline sources. Furthermore, direct contact was made with several Cloud service providers. Results Some of the security issues that should be considered by both Cloud service providers and their health care customers are role-based access, network security mechanisms, data encryption, digital signatures, and access monitoring. Furthermore, to guarantee the safety of the information and comply with privacy policies, the Cloud service provider must be compliant with various certifications and third-party requirements, such as SAS70 Type II, PCI DSS Level 1, ISO 27001, and the US Federal Information Security Management Act (FISMA). Conclusions Storing sensitive information such as EHRs in the Cloud means that precautions must be taken to ensure the safety and confidentiality of the data. A relationship built on trust with the Cloud service provider is essential to ensure a transparent process. Cloud service providers must make certain that all security mechanisms are in place to avoid unauthorized access and data breaches. Patients must be kept informed about how their data are being managed. PMID:23965254

  11. Analysis of the security and privacy requirements of cloud-based electronic health records systems.

    PubMed

    Rodrigues, Joel J P C; de la Torre, Isabel; Fernández, Gonzalo; López-Coronado, Miguel

    2013-08-21

    The Cloud Computing paradigm offers eHealth systems the opportunity to enhance the features and functionality that they offer. However, moving patients' medical information to the Cloud implies several risks in terms of the security and privacy of sensitive health records. In this paper, the risks of hosting Electronic Health Records (EHRs) on the servers of third-party Cloud service providers are reviewed. To protect the confidentiality of patient information and facilitate the process, some suggestions for health care providers are made. Moreover, security issues that Cloud service providers should address in their platforms are considered. To show that, before moving patient health records to the Cloud, security and privacy concerns must be considered by both health care providers and Cloud service providers. Security requirements of a generic Cloud service provider are analyzed. To study the latest in Cloud-based computing solutions, bibliographic material was obtained mainly from Medline sources. Furthermore, direct contact was made with several Cloud service providers. Some of the security issues that should be considered by both Cloud service providers and their health care customers are role-based access, network security mechanisms, data encryption, digital signatures, and access monitoring. Furthermore, to guarantee the safety of the information and comply with privacy policies, the Cloud service provider must be compliant with various certifications and third-party requirements, such as SAS70 Type II, PCI DSS Level 1, ISO 27001, and the US Federal Information Security Management Act (FISMA). Storing sensitive information such as EHRs in the Cloud means that precautions must be taken to ensure the safety and confidentiality of the data. A relationship built on trust with the Cloud service provider is essential to ensure a transparent process. Cloud service providers must make certain that all security mechanisms are in place to avoid unauthorized access and data breaches. Patients must be kept informed about how their data are being managed.
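
    Of the precautions listed in these two records, client-side data encryption is the simplest to illustrate. The sketch below is a minimal example rather than any provider's actual mechanism: it encrypts a record before it ever reaches the Cloud, using Fernet from the Python cryptography package. Real deployments would add key management (KMS, HSM, escrow), which is the hard part.

```python
# One of the precautions listed above -- encrypting records before they ever
# reach the Cloud provider -- in a minimal form. Fernet (AES-128-CBC + HMAC
# in the "cryptography" package) is a convenient authenticated scheme; key
# management is out of scope here and the record content is invented.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # must be stored outside the Cloud provider
cipher = Fernet(key)

ehr_record = b'{"patient_id": "12345", "diagnosis": "hypertension"}'
ciphertext = cipher.encrypt(ehr_record)   # safe to hand to the provider
# ... upload ciphertext; the provider never sees plaintext ...
assert cipher.decrypt(ciphertext) == ehr_record
```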

  12. Architecture Design of Healthcare Software-as-a-Service Platform for Cloud-Based Clinical Decision Support Service.

    PubMed

    Oh, Sungyoung; Cha, Jieun; Ji, Myungkyu; Kang, Hyekyung; Kim, Seok; Heo, Eunyoung; Han, Jong Soo; Kang, Hyunggoo; Chae, Hoseok; Hwang, Hee; Yoo, Sooyoung

    2015-04-01

    To design a cloud computing-based Healthcare Software-as-a-Service (SaaS) Platform (HSP) for delivering healthcare information services with low cost, high clinical value, and high usability. We analyzed the architecture requirements of an HSP, including the interface, business services, cloud SaaS, quality attributes, privacy and security, and multi-lingual capacity. For cloud-based SaaS services, we focused on Clinical Decision Service (CDS) content services, basic functional services, and mobile services. Microsoft's Azure cloud computing for Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) was used. The functional and software views of an HSP were designed in a layered architecture. External systems can be interfaced with the HSP using SOAP and REST/JSON. The multi-tenancy model of the HSP was designed as a shared database, with a separate schema for each tenant through a single application, although healthcare data can be physically located on a cloud or in a hospital, depending on regulations. The CDS services were categorized into rule-based services for medications, alert registration services, and knowledge services. We expect that cloud-based HSPs will allow small and mid-sized hospitals, in addition to large-sized hospitals, to adopt information infrastructures and health information technology with low system operation and maintenance costs.
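
    The multi-tenancy model described here (shared database, separate schema per tenant, single application) can be sketched in a few lines. The example below uses PostgreSQL schemas via psycopg2 purely as an illustration of the pattern; table, schema, and connection names are invented, and the actual HSP runs on Microsoft Azure services.

```python
# Minimal sketch of "shared database, separate schema per tenant": one
# application routes each request to the tenant's own schema. All names
# (DSN, schema prefix, table) are hypothetical.
import psycopg2

def fetch_alerts(dsn: str, tenant: str):
    """Run one tenant's query against its dedicated schema."""
    conn = psycopg2.connect(dsn)
    try:
        with conn.cursor() as cur:
            # Each hospital (tenant) owns a schema such as "tenant_h001".
            cur.execute("SET search_path TO %s", (f"tenant_{tenant}",))
            cur.execute("SELECT alert_id, message FROM cds_alerts")
            return cur.fetchall()
    finally:
        conn.close()

# fetch_alerts("dbname=hsp user=hsp", "h001")
```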

  13. Cloud Computing Fundamentals

    NASA Astrophysics Data System (ADS)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  14. NASA Cloud-Based Climate Data Services

    NASA Astrophysics Data System (ADS)

    McInerney, M. A.; Schnase, J. L.; Duffy, D. Q.; Tamkin, G. S.; Strong, S.; Ripley, W. D., III; Thompson, J. H.; Gill, R.; Jasen, J. E.; Samowich, B.; Pobre, Z.; Salmon, E. M.; Rumney, G.; Schardt, T. D.

    2012-12-01

    Cloud-based scientific data services are becoming an important part of NASA's mission. Our technological response is built around the concept of specialized virtual climate data servers, repetitive cloud provisioning, image-based deployment and distribution, and virtualization-as-a-service (VaaS). A virtual climate data server (vCDS) is an Open Archival Information System (OAIS) compliant, iRODS-based data server designed to support a particular type of scientific data collection. iRODS is data grid middleware that provides policy-based control over collection-building, managing, querying, accessing, and preserving large scientific data sets. We have deployed vCDS Version 1.0 in the Amazon EC2 cloud using S3 object storage and are using the system to deliver a subset of NASA's Intergovernmental Panel on Climate Change (IPCC) data products to the latest CentOS federated version of the Earth System Grid Federation (ESGF), which is also running in the Amazon cloud. vCDS-managed objects are exposed to ESGF through FUSE (Filesystem in Userspace), which presents a POSIX-compliant filesystem abstraction to applications, such as the ESGF server, that require such an interface. A vCDS manages data as a distinguished collection for a person, project, lab, or other logical unit. A vCDS can manage a collection across multiple storage resources using rules and microservices to enforce collection policies. And a vCDS can federate with other vCDSs to manage multiple collections over multiple resources, thereby creating what can be thought of as an ecosystem of managed collections. With the vCDS approach, we are trying to enable full information lifecycle management of scientific data collections and make tractable the task of providing diverse climate data services. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. Figures: (A) vCDS/ESG system stack; (B) conceptual architecture for NASA cloud-based data services.

  15. Architectural Implications of Cloud Computing

    DTIC Science & Technology

    2011-10-24

    Cloud computing types: Public Cloud, Infrastructure-as-a-Service (IaaS), Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS). Based on type of... Software-as-a-Service (SaaS): model of software deployment in which a third-party... and System Solutions (RTSS) Program. Her current interests and projects are in service-oriented architecture (SOA), cloud computing, and context

  16. Implementation of cloud computing in higher education

    NASA Astrophysics Data System (ADS)

    Asniar; Budiawan, R.

    2016-04-01

    Cloud computing research is a new trend in distributed computing, where people have developed services and SOA (Service-Oriented Architecture) based applications. This technology is very useful to implement, especially in higher education. This research studies the need for and feasibility of cloud computing in higher education, then proposes a model of cloud computing services for higher education in Indonesia that can be implemented to support academic activities. A literature study is used as the research methodology to arrive at the proposed model. Finally, SaaS and IaaS are the cloud computing services proposed for implementation in higher education in Indonesia, and a hybrid cloud is the recommended service model.

  17. Global Software Development with Cloud Platforms

    NASA Astrophysics Data System (ADS)

    Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya

    Offshore and outsourced distributed software development models and processes are facing previously unknown challenges with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design, and our first implementation results for three cloud forms - a compute cloud, a storage cloud and a cloud-based software service - in the context of global distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, the "storage cloud" offers block- or file-based storage services with an on-line virtual storage service, and on-line virtual labs represent the cloud-based software service. We note some of the use cases for clouds in GSD and the lessons learned with our prototypes, and identify challenges that must be overcome before the full business benefits can be realized. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means of supporting an ecosystem of clients, developers, and other key stakeholders.

  18. A Novel Deployment Method for Communication-Intensive Applications in Service Clouds

    PubMed Central

    Liu, Chuanchang; Yang, Jingqi

    2014-01-01

    Service platforms are migrating to clouds to address long construction periods, low resource utilization, and isolated construction of service platforms. However, previous deployment methods have paid little attention to deploying communication-intensive applications in service clouds. To address this problem, this paper proposes a combination of online deployment and offline deployment for deploying communication-intensive applications in service clouds. Firstly, a system architecture is designed for implementing the communication-aware deployment method. Secondly, in the online-deployment algorithm and the offline-deployment algorithm, service instances are deployed on an optimal cloud node based on the communication overhead, which is determined by the communication traffic between services as well as the communication performance between cloud nodes. Finally, experimental results demonstrate that the proposed methods deploy communication-intensive applications effectively, with lower latency and lower load than existing algorithms. PMID:25140331
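
    The objective driving both algorithms, communication overhead as traffic between services weighted by inter-node link cost, can be illustrated with a toy greedy placement. The sketch below shows only the objective, not the paper's actual online/offline algorithms; all traffic figures, nodes, and capacities are invented.

```python
# Toy greedy version of communication-aware deployment: place each service on
# the node that minimizes communication overhead, defined (as above) by the
# traffic between services weighted by the link cost between nodes. This is
# an illustration of the objective, not the paper's exact algorithms.
# traffic[(a, b)]: messages/s between services a and b (symmetric, invented)
traffic = {("web", "db"): 120.0, ("web", "cache"): 300.0, ("db", "cache"): 40.0}

# link_cost[(i, j)]: per-message cost between cloud nodes (0 on the same node)
link_cost = {("n1", "n1"): 0.0, ("n1", "n2"): 1.0,
             ("n2", "n1"): 1.0, ("n2", "n2"): 0.0}

capacity = {"n1": 2, "n2": 2}          # max service instances per node

def t(a, b):
    return traffic.get((a, b)) or traffic.get((b, a)) or 0.0

placement = {}
for svc in ["web", "cache", "db"]:     # e.g. ordered by total traffic
    best = min(
        (n for n in capacity if capacity[n] > 0),
        key=lambda n: sum(t(svc, other) * link_cost[(n, placement[other])]
                          for other in placement),
    )
    placement[svc] = best
    capacity[best] -= 1

print(placement)  # {'web': 'n1', 'cache': 'n1', 'db': 'n2'}
```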

  19. A novel deployment method for communication-intensive applications in service clouds.

    PubMed

    Liu, Chuanchang; Yang, Jingqi

    2014-01-01

    Service platforms are migrating to clouds to address long construction periods, low resource utilization, and isolated construction of service platforms. However, previous deployment methods have paid little attention to deploying communication-intensive applications in service clouds. To address this problem, this paper proposes a combination of online deployment and offline deployment for deploying communication-intensive applications in service clouds. Firstly, a system architecture is designed for implementing the communication-aware deployment method. Secondly, in the online-deployment algorithm and the offline-deployment algorithm, service instances are deployed on an optimal cloud node based on the communication overhead, which is determined by the communication traffic between services as well as the communication performance between cloud nodes. Finally, experimental results demonstrate that the proposed methods deploy communication-intensive applications effectively, with lower latency and lower load than existing algorithms.

  20. A Multilateral Negotiation Model for Cloud Service Market

    NASA Astrophysics Data System (ADS)

    Yoo, Dongjin; Sim, Kwang Mong

    Trading cloud services between consumers and providers is a complicated issue of cloud computing. Since a consumer can negotiate with multiple providers to acquire the same service, and each provider can receive many requests from multiple consumers, a multilateral negotiation model for the cloud market is necessary to facilitate the trading of cloud services among multiple consumers and providers. The contribution of this work is the proposal of a business model supporting multilateral price negotiation for trading cloud services. The design of the proposed system for the cloud service market includes a many-to-many negotiation protocol and a price-determining factor derived from service-level features. Two negotiation strategies are implemented and proposed for the cloud service market: (1) MDA (Market-Driven Agent) and (2) adaptive concession making responding to changes in bargaining position. Empirical results show that the MDA achieved better performance than the adaptive concession-making strategy in some cases; it is noted that, unlike the MDA, the adaptive concession-making strategy does not assume that an agent has information about the number of competitors (e.g., a consumer agent adopting the adaptive concession-making strategy need not know the number of consumer agents competing for the same service).
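
    For a feel of what a concession-making strategy computes, the sketch below implements the classic time-dependent concession family from the negotiation literature: an agent starts at its initial price and concedes toward its reserve price as the deadline approaches. This is a generic textbook illustration, not the paper's MDA or adaptive model.

```python
# Generic time-dependent concession for a consumer agent: start at the
# initial price IP and concede toward the reserve price RP as the deadline
# nears. The exponent beta shapes the curve (beta < 1 concedes late,
# beta > 1 concedes early). Shown only to illustrate the kind of strategy
# being compared above; not the paper's MDA or adaptive model.
def consumer_offer(t: float, deadline: float, ip: float, rp: float,
                   beta: float = 2.0) -> float:
    frac = min(t / deadline, 1.0) ** beta        # concession fraction in [0, 1]
    return ip + (rp - ip) * frac

for t in range(0, 11, 2):                        # offers over a 10-round deadline
    print(t, round(consumer_offer(t, 10, ip=1.0, rp=5.0), 3))
```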

  1. Adopting Cloud Computing in the Pakistan Navy

    DTIC Science & Technology

    2015-06-01

    administrative aspect is required to operate optimally, provide synchronized delivery of cloud services, and integrate a multi-provider cloud environment... also adopted cloud computing as an integral component of military operations conducted either locally or remotely. With the use of cloud services

  2. MERRA Analytic Services: Meeting the Big Data Challenges of Climate Science through Cloud-Enabled Climate Analytics-as-a-Service

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D.; Tamkin, G. S.; Nadeau, D.; Thompson, J. H.; Grieg, C. M.; McInerney, M.; Webster, W. P.

    2013-12-01

    Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global, temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data-proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.
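
    The shape of such a MapReduce analytic can be shown in pure Python: map each record of a variable to a keyed partial aggregate, then reduce by key. The records below are invented stand-ins; MERRA/AS performs this pattern over the full reanalysis archive on a cluster.

```python
# Shape of a MapReduce climatology like those MERRA/AS runs at scale, reduced
# to pure Python: map each (time, lat, lon, value) record to a
# (month, (sum, count)) pair, then reduce by key to monthly means.
# The records are invented for illustration.
from collections import defaultdict

records = [  # (iso_time, lat, lon, 2 m air temperature in K) -- hypothetical
    ("1980-01-03T00:00", 40.0, -105.0, 268.1),
    ("1980-01-17T06:00", 40.0, -105.0, 265.4),
    ("1980-02-02T12:00", 40.0, -105.0, 271.9),
]

def mapper(rec):
    time, _lat, _lon, value = rec
    return time[:7], (value, 1)          # key is "YYYY-MM"

def reducer(acc, pair):
    s, c = pair
    acc[0] += s                          # running sum
    acc[1] += c                          # running count
    return acc

# Shuffle/aggregate phase
groups = defaultdict(lambda: [0.0, 0])
for key, pair in map(mapper, records):
    reducer(groups[key], pair)

monthly_mean = {k: s / c for k, (s, c) in groups.items()}
print(monthly_mean)   # {'1980-01': 266.75, '1980-02': 271.9}
```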

  3. MERRA Analytic Services: Meeting the Big Data Challenges of Climate Science Through Cloud-enabled Climate Analytics-as-a-service

    NASA Technical Reports Server (NTRS)

    Schnase, John L.; Duffy, Daniel Quinn; Tamkin, Glenn S.; Nadeau, Denis; Thompson, John H.; Grieg, Christina M.; McInerney, Mark A.; Webster, William P.

    2014-01-01

    Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global, temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data-proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.

  4. The Ethics of Cloud Computing.

    PubMed

    de Bruin, Boudewijn; Floridi, Luciano

    2017-02-01

    Cloud computing is rapidly gaining traction in business. It offers businesses online services on demand (such as Gmail, iCloud and Salesforce) and allows them to cut costs on hardware and IT support. This is the first paper in business ethics dealing with this new technology. It analyzes the informational duties of hosting companies that own and operate cloud computing datacentres (e.g., Amazon). It considers the cloud services providers leasing 'space in the cloud' from hosting companies (e.g., Dropbox, Salesforce). And it examines the business and private 'clouders' using these services. The first part of the paper argues that hosting companies, services providers and clouders have mutual informational (epistemic) obligations to provide and seek information about relevant issues such as consumer privacy, reliability of services, data mining and data ownership. The concept of interlucency is developed as an epistemic virtue governing ethically effective communication. The second part considers potential forms of government restrictions on or proscriptions against the development and use of cloud computing technology. Referring to the concept of technology neutrality, it argues that interference with hosting companies and cloud services providers is hardly ever necessary or justified. It is argued, too, however, that businesses using cloud services (e.g., banks, law firms, hospitals etc. storing client data in the cloud) will have to follow rather more stringent regulations.

  5. Services for domain specific developments in the Cloud

    NASA Astrophysics Data System (ADS)

    Schwichtenberg, Horst; Gemuend, André

    2015-04-01

    We will discuss and demonstrate the possibilities of new Cloud services in which the complete development of code takes place in the Cloud, covering the whole development cycle from programming to testing. This can also be combined with dedicated research-domain-specific services, hiding the burden of accessing the available infrastructures. As an example, we will show a service intended to complement the VERCE project's infrastructure, a service that utilizes Cloud resources to offer simplified execution of data pre- and post-processing scripts. It offers users access to the ObsPy seismological toolbox for processing data with the Python programming language, executed on virtual Cloud resources in a secured sandbox. The solution encompasses a frontend with a modern graphical user interface, a messaging infrastructure, and Python worker nodes for background processing. All components are deployable in the Cloud and have been tested on different environments based on OpenStack and OpenNebula. Deployments on commercial, public Clouds will be tested in the future.
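
    The kind of short ObsPy script such a service executes on its Cloud workers looks like the following. ObsPy's read() with no argument loads a bundled example waveform, so the sketch is self-contained; a real job would instead read user-supplied data from the sandboxed storage.

```python
# A short ObsPy pre-processing script of the kind the service runs on Cloud
# workers. read() with no argument loads ObsPy's bundled example stream, so
# this runs anywhere ObsPy is installed.
from obspy import read

st = read()                                       # example 3-component stream
st.detrend("demean")                              # remove the mean offset
st.filter("bandpass", freqmin=1.0, freqmax=10.0)  # keep the 1-10 Hz band
print(st)
for tr in st:
    print(tr.id, "peak amplitude:", abs(tr.data).max())
```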

  6. Architecture Design of Healthcare Software-as-a-Service Platform for Cloud-Based Clinical Decision Support Service

    PubMed Central

    Oh, Sungyoung; Cha, Jieun; Ji, Myungkyu; Kang, Hyekyung; Kim, Seok; Heo, Eunyoung; Han, Jong Soo; Kang, Hyunggoo; Chae, Hoseok; Hwang, Hee

    2015-01-01

    Objectives To design a cloud computing-based Healthcare Software-as-a-Service (SaaS) Platform (HSP) for delivering healthcare information services with low cost, high clinical value, and high usability. Methods We analyzed the architecture requirements of an HSP, including the interface, business services, cloud SaaS, quality attributes, privacy and security, and multi-lingual capacity. For cloud-based SaaS services, we focused on Clinical Decision Service (CDS) content services, basic functional services, and mobile services. Microsoft's Azure cloud computing for Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) was used. Results The functional and software views of an HSP were designed in a layered architecture. External systems can be interfaced with the HSP using SOAP and REST/JSON. The multi-tenancy model of the HSP was designed as a shared database, with a separate schema for each tenant through a single application, although healthcare data can be physically located on a cloud or in a hospital, depending on regulations. The CDS services were categorized into rule-based services for medications, alert registration services, and knowledge services. Conclusions We expect that cloud-based HSPs will allow small and mid-sized hospitals, in addition to large-sized hospitals, to adopt information infrastructures and health information technology with low system operation and maintenance costs. PMID:25995962

  7. Department of Defense Use of Commercial Cloud Computing Capabilities and Services

    DTIC Science & Technology

    2015-11-01

    models (Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS)), and four deployment models (Public... NIST defines three main models for cloud computing: IaaS, PaaS, and SaaS. These models help differentiate the implementation responsibilities that fall... and SaaS. 3. Public, Private, Community, and Hybrid Clouds. Cloud services come in different forms, depending on the customer's specific needs

  8. A new data collaboration service based on cloud computing security

    NASA Astrophysics Data System (ADS)

    Ying, Ren; Li, Hua-Wei; Wang, Li na

    2017-09-01

    With the rapid development of cloud computing, the storage and usage of data have undergone revolutionary changes. Data owners can store data in the cloud. While bringing convenience, this also brings many new challenges to cloud data security. A key issue is how to support a secure data collaboration service that supports access to and updates of cloud data. This paper proposes a secure, efficient, and extensible data collaboration service, which prevents data leaks in cloud storage, supports a one-to-many encryption mechanism, and also enables cloud data writing and fine-grained access control.
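
    One concrete way to realize a one-to-many encryption mechanism is envelope encryption: encrypt the data once under a symmetric key, then wrap that key separately for each authorized collaborator. The sketch below illustrates this pattern with RSA-OAEP and Fernet from the Python cryptography package; the paper's actual scheme may differ (e.g., attribute-based or proxy re-encryption).

```python
# Envelope encryption as one realization of "one-to-many" sharing: the
# document is encrypted once with a symmetric key, and that key is wrapped
# separately under each collaborator's RSA public key. Illustration only;
# the paper's actual mechanism may differ.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

collaborators = {name: rsa.generate_private_key(public_exponent=65537,
                                                key_size=2048)
                 for name in ("alice", "bob")}

data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(b"shared project dataset")

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped = {name: priv.public_key().encrypt(data_key, oaep)
           for name, priv in collaborators.items()}

# Any collaborator unwraps the data key with their private key and decrypts.
bob_key = collaborators["bob"].decrypt(wrapped["bob"], oaep)
print(Fernet(bob_key).decrypt(ciphertext))
```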

  9. Evaluating the Usage of Cloud-Based Collaboration Services through Teamwork

    ERIC Educational Resources Information Center

    Qin, Li; Hsu, Jeffrey; Stern, Mel

    2016-01-01

    With the proliferation of cloud computing for both organizational and educational use, cloud-based collaboration services are transforming how people work in teams. The authors investigated the determinants of the usage of cloud-based collaboration services including teamwork quality, computer self-efficacy, and prior experience, as well as its…

  10. Bootstrapping and Maintaining Trust in the Cloud

    DTIC Science & Technology

    2016-12-01

    simultaneous cloud nodes. 1. INTRODUCTION The proliferation and popularity of infrastructure-as-a-service (IaaS) cloud computing services such as... Amazon Web Services and Google Compute Engine means more cloud tenants are hosting sensitive, private, and business-critical data and applications in the... thousands of IaaS resources as they are elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features

  11. Service Mediation and Negotiation Bootstrapping as First Achievements Towards Self-adaptable Cloud Services

    NASA Astrophysics Data System (ADS)

    Brandic, Ivona; Music, Dejan; Dustdar, Schahram

    Nowadays, novel computing paradigms such as Cloud Computing are gaining more and more importance. In Cloud Computing, users pay for the usage of computing power provided as a service, and can negotiate beforehand specific functional and non-functional requirements relevant for the application execution. However, providing computing power as a service poses various research challenges. On the one hand, dynamic, versatile, and adaptable services are required, which can cope with system failures and environmental changes. On the other hand, human interaction with the system should be minimized. In this chapter we present the first results in establishing adaptable, versatile, and dynamic services, considering negotiation bootstrapping and service mediation, achieved in the context of the Foundations of Self-Governing ICT Infrastructures (FoSII) project. We discuss novel meta-negotiation and SLA mapping solutions for Cloud services, bridging the gap between current QoS models and Cloud middleware and representing important prerequisites for the establishment of autonomic Cloud services.

  12. 75 FR 65383 - Notice Pursuant to the National Cooperative Research and Production Act of 1993-Telemanagement Forum

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-22

    ..., CA; Compunet Services, Inc., Stockbridge, GA; Cordys, Putten, THE NETHERLANDS; Cosmo Bulgaria Mobile... Inc. to Cloud.com, Cupertino, CA; Globul to Cosmo Bulgaria Mobile EAD (GloBul), Sofia, BULGARIA; CTBC...

  13. Supporting the scientific lifecycle through cloud services

    NASA Astrophysics Data System (ADS)

    Gensch, S.; Klump, J. F.; Bertelmann, R.; Braune, C.

    2014-12-01

    Cloud computing has made resources and applications available for numerous use cases, ranging from business processes in the private sector to scientific applications. Developers have created tools for data management, collaborative writing, social networking, data access and visualization, project management, and many more, either for free or as paid premium services with additional or extended features. Scientists have begun to incorporate tools that fit their needs into their daily work. To satisfy specialized needs, some cloud applications specifically address the needs of scientists for sharing research data, literature search, laboratory documentation, or data visualization. Cloud services may vary in extent, user coverage, and inter-service integration, and are also at risk of being abandoned or changed by service providers making changes to their business model, or leaving the field entirely. Within the project Academic Enterprise Cloud we examine cloud-based services that support the research lifecycle, using feature models to describe key properties in the areas of infrastructure and service provision, compliance with legal regulations, and data curation. Emphasis is put on the term Enterprise, the aim being to establish an academic cloud service provider infrastructure that satisfies the demands of the research community through continuous provision across the whole cloud stack. This could enable the research community to be independent from service providers regarding changes to terms of service, ensuring full control of its extent and usage. This shift towards a self-empowered scientific cloud provider infrastructure and its community raises questions about the feasibility of provision and overall costs. Legal aspects and licensing issues have to be considered when moving data into cloud services, especially when personal data is involved. Educating researchers about cloud-based tools is important to help in the transition towards effective and safe use. Scientists can benefit from the provision of standard services, like weblog and website creation, virtual machine deployments, and groupware provision, using cloud-based app-store-like portals. And, unlike in an industrial environment, researchers will want to keep their existing user profile when moving from one institution to another.

  14. Applications integration in a hybrid cloud computing environment: modelling and platform

    NASA Astrophysics Data System (ADS)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds as well as their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed, to improve the feasibility of ISs under hybrid cloud computing environments.

  15. Analysis on the security of cloud computing

    NASA Astrophysics Data System (ADS)

    He, Zhonglin; He, Yuhua

    2011-02-01

    Cloud computing is a new technology arising from the fusion of computer technology and Internet development, and it will lead a revolution in IT and the information field. However, in cloud computing, data and application software are stored at large data centers, and the management of data and services is not completely trustworthy, resulting in security problems that are the key obstacle to improving the quality of cloud services. This paper briefly introduces the concept of cloud computing. Considering the characteristics of cloud computing, it constructs a security architecture for cloud computing. At the same time, with an eye toward the security threats cloud computing faces, several corresponding strategies are provided from the perspectives of cloud computing users and service providers.

  16. Impact of different cloud deployments on real-time video applications for mobile video cloud users

    NASA Astrophysics Data System (ADS)

    Khan, Kashif A.; Wang, Qi; Luo, Chunbo; Wang, Xinheng; Grecos, Christos

    2015-02-01

    The latest trend of accessing mobile cloud services through wireless network connectivity has grown globally among both entrepreneurs and home end users. Although existing public cloud service vendors such as Google, Microsoft Azure, etc. provide on-demand cloud services at affordable cost for mobile users, there are still a number of challenges in achieving high-quality mobile cloud-based video applications, especially due to the bandwidth-constrained and error-prone mobile network connectivity, which is the communication bottleneck for end-to-end video delivery. In addition, existing accessible cloud networking architectures differ in terms of their implementation, services, resources, storage, pricing, support and so on, and these differences have varied impacts on the performance of cloud-based real-time video applications. Nevertheless, these challenges and impacts have not been thoroughly investigated in the literature. In our previous work, we implemented a mobile cloud network model that integrates localized and decentralized cloudlets (mini-clouds) and wireless mesh networks. In this paper, we deploy a real-time framework consisting of various existing Internet cloud networking architectures (Google Cloud, Microsoft Azure and Eucalyptus Cloud) and a cloudlet based on Ubuntu Enterprise Cloud over wireless mesh networking technology for mobile cloud end users. Real-time video streaming over HTTP/HTTPS is gaining popularity among both the research and industrial communities as a way to leverage the existing web services and HTTP infrastructure in the Internet. To study the performance under different deployments using different public and private cloud service providers, we employ real-time video streaming over the HTTP/HTTPS standard, and conduct experimental evaluation and in-depth comparative analysis of the impact of different deployments on the quality of service for mobile video cloud users. Empirical results are presented and discussed to quantify and explain the different impacts resulting from the various cloud deployments, video applications, wireless/mobile network settings, and user mobility. Additionally, this paper analyses the advantages, disadvantages, limitations, and optimization techniques of the various cloud networking deployments, in particular the cloudlet approach compared with the Internet cloud approach, with recommendations for optimized deployments highlighted. Finally, federated clouds and inter-cloud collaboration challenges and opportunities are discussed in the context of supporting real-time video applications for mobile users.
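
    The client-side measurements behind such a comparison are straightforward to sketch: fetch consecutive HTTP video segments and record per-segment latency and throughput. The URL pattern below is a placeholder, not one of the paper's actual testbeds.

```python
# Per-segment latency/throughput measurement of the kind used to compare
# cloud deployments for HTTP video streaming. The segment URL pattern is
# a hypothetical placeholder.
import time
import requests

BASE = "https://example.org/stream/segment_{:03d}.ts"   # placeholder

for i in range(3):
    t0 = time.perf_counter()
    r = requests.get(BASE.format(i), timeout=10)
    elapsed = time.perf_counter() - t0
    r.raise_for_status()
    mbits = len(r.content) * 8 / 1e6
    print(f"segment {i}: {elapsed:.3f} s, {mbits / elapsed:.2f} Mbit/s")
```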

  17. Bigdata Driven Cloud Security: A Survey

    NASA Astrophysics Data System (ADS)

    Raja, K.; Hanifa, Sabibullah Mohamed

    2017-08-01

    Cloud Computing (CC) is a fast-growing technology for performing massive-scale and complex computing. It eliminates the need to maintain expensive computing hardware, dedicated space, and software. Recently, massive growth has been observed in the scale of data, or big data, generated through cloud computing. CC consists of a front end, which includes the users' computers and the software required to access the cloud network, and a back end, which consists of the various computers, servers, and database systems that create the cloud. The cloud ecosystem delivers its services through SaaS (Software-as-a-Service, where end users utilize outsourced software), PaaS (Platform-as-a-Service, where a platform is provided), IaaS (Infrastructure-as-a-Service, where the physical environment is outsourced), and DaaS (Database-as-a-Service, where data can be housed within a cloud), and has become a powerful and popular architecture. Many challenges and issues lie in security, or threats, the most vital barrier for the cloud computing environment. The main barrier to the adoption of CC in health care relates to data security: when placing and transmitting data over public networks, cyber attacks in any form are anticipated. Hence, cloud service users need to understand the risk of data breaches and the choice of service delivery model during deployment. This survey covers CC security issues in depth (including data security in health care) so that researchers can develop robust security application models using Big Data (BD) on CC that can be created and deployed easily. BD evaluation is driven by fast-growing cloud-based applications developed using virtualized technologies; in this purview, MapReduce [12] is a good example of big data processing in a cloud environment, and a model for Cloud providers.

  18. Improving oceanographic data delivery through pipeline processing in a Commercial Cloud Services environment: the Australian Integrated Marine Observing System

    NASA Astrophysics Data System (ADS)

    Besnard, Laurent; Blain, Peter; Mancini, Sebastien; Proctor, Roger

    2017-04-01

    The Integrated Marine Observing System (IMOS) is a national project funded by the Australian government, established to deliver ocean observations to the marine and climate science community. Now in its 10th year, its mission is to undertake systematic and sustained observations and to turn them into data, products, and analyses that can be freely used and reused for broad societal benefit. As IMOS has matured as an observing system, expectations of the system's availability and reliability have also increased, and IMOS is now seen as delivering 'operational' information. In response to this expectation, IMOS has relocated its services to the commercial cloud service Amazon Web Services. This has enabled IMOS to improve the system architecture, utilizing more advanced capabilities such as object storage (S3, the Simple Storage Service) and autoscaling, and introducing new checking procedures in a pipeline approach. This has improved data availability and resilience while protecting against human errors in data handling and providing a more efficient ingestion process.
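
    A pipeline-style ingestion step with an integrity check can be sketched with boto3. For single-part uploads without KMS encryption, the ETag S3 returns is the hex MD5 of the object, which gives a cheap end-to-end verification; the bucket and key names below are invented, and the real IMOS pipelines perform far richer compliance checks.

```python
# Pipeline-style ingestion in miniature: upload a file to S3 with boto3 and
# verify integrity before publishing. For single-part, non-KMS uploads the
# returned ETag is the hex MD5 of the object. Bucket/key names are invented.
import hashlib
import boto3

def ingest(path: str, bucket: str, key: str) -> None:
    with open(path, "rb") as f:
        body = f.read()
    local_md5 = hashlib.md5(body).hexdigest()

    s3 = boto3.client("s3")
    resp = s3.put_object(Bucket=bucket, Key=key, Body=body)

    etag = resp["ETag"].strip('"')        # ETag is quoted in the response
    if etag != local_md5:
        s3.delete_object(Bucket=bucket, Key=key)   # quarantine on mismatch
        raise RuntimeError(f"integrity check failed for {key}")
    print(f"published s3://{bucket}/{key}")

# ingest("profile_20170101.nc", "imos-data-demo", "moorings/profile_20170101.nc")
```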

  19. Deploying the ATLAS Metadata Interface (AMI) on the cloud with Jenkins

    NASA Astrophysics Data System (ADS)

    Lambert, F.; Odier, J.; Fulachier, J.; ATLAS Collaboration

    2017-10-01

    The ATLAS Metadata Interface (AMI) is a mature application of more than 15 years of existence. Mainly used by the ATLAS experiment at CERN, it consists of a very generic tool ecosystem for metadata aggregation and cataloguing. AMI is used by the ATLAS production system, therefore the service must guarantee a high level of availability. We describe our monitoring and administration systems, and the Jenkins-based strategy used to dynamically test and deploy cloud OpenStack nodes on demand.
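
    A Jenkins build step that provisions a disposable OpenStack test node might invoke something like the openstacksdk sketch below. The cloud, image, flavor, and network names are placeholders (a clouds.yaml entry must define them); this shows the pattern, not the AMI team's actual deployment scripts.

```python
# Provision a disposable OpenStack test node via openstacksdk, of the kind
# a Jenkins job could create on demand. All resource names are hypothetical.
import openstack

conn = openstack.connect(cloud="ami-ci")      # reads credentials from clouds.yaml

server = conn.create_server(
    name="ami-test-node",
    image="centos-7-ami",                     # hypothetical image name
    flavor="m1.medium",
    network="ci-net",
    wait=True,                                # block until the node is ACTIVE
    timeout=600,
)
print(server.name, server.status)

# ... run the test suite against the node, then reclaim the resources ...
conn.delete_server(server.id, wait=True)
```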

  20. Does Cloud Computing in the Atmospheric Sciences Make Sense? A case study of hybrid cloud computing at NASA Langley Research Center

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.; Spangenberg, D.; Ayers, J. K.; Palikonda, R.; Vakhnin, A.; Dubois, R.; Murphy, P. R.

    2014-12-01

    The processing, storage and dissemination of satellite cloud and radiation products produced at NASA Langley Research Center are key activities for the Climate Science Branch. A constellation of systems operates in sync to accomplish these goals. Because of the complexity involved with operating such intricate systems, there are both high failure rates and high costs for hardware and system maintenance. Cloud computing has the potential to ameliorate cost and complexity issues. Over time, the cloud computing model has evolved and hybrid systems comprising off-site as well as on-site resources are now common. Towards our mission of providing the highest quality research products to the widest audience, we have explored the use of the Amazon Web Services (AWS) Cloud and Storage and present a case study of our results and efforts. This project builds upon NASA Langley Cloud and Radiation Group's experience with operating large and complex computing infrastructures in a reliable and cost effective manner to explore novel ways to leverage cloud computing resources in the atmospheric science environment. Our case study presents the project requirements and then examines the fit of AWS with the LaRC computing model. We also discuss the evaluation metrics, feasibility, and outcomes and close the case study with the lessons we learned that would apply to others interested in exploring the implementation of the AWS system in their own atmospheric science computing environments.

  1. Toward ubiquitous healthcare services with a novel efficient cloud platform.

    PubMed

    He, Chenguang; Fan, Xiaomao; Li, Ye

    2013-01-01

    Ubiquitous healthcare services are becoming more and more popular, especially under the urgent demands of the global aging issue. Cloud computing has a pervasive, on-demand, service-oriented nature that fits the characteristics of healthcare services very well. However, the ability to deal with multimodal, heterogeneous, and nonstationary physiological signals to provide persistent personalized services, while sustaining highly concurrent online analysis for the public, is a challenge to the general cloud. In this paper, we propose a private cloud platform architecture which includes six layers according to these specific requirements. The platform utilizes a message queue as its cloud engine, and each layer thereby achieves relative independence through this loosely coupled means of communication with a publish/subscribe mechanism. Furthermore, a plug-in algorithm framework is also presented, and massive semistructured or unstructured medical data are accessed adaptively by this cloud architecture. As the testing results show, the proposed cloud platform, with its robust, stable, and efficient features, can satisfy highly concurrent requests from ubiquitous healthcare services.
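
    The loosely coupled, publish/subscribe layering described here can be shown in miniature with a message queue. The sketch below uses RabbitMQ via pika as the cloud engine purely for illustration; the paper does not name its queue product, and the exchange, host, and payload are invented.

```python
# Publish/subscribe layering in miniature: layers communicate only through
# a message queue (RabbitMQ via pika here, for illustration). Exchange,
# host, and payload are hypothetical.
import pika

conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
ch = conn.channel()
ch.exchange_declare(exchange="vitals", exchange_type="fanout")

# Subscriber side: e.g. the analysis layer binds a private queue first.
q = ch.queue_declare(queue="", exclusive=True)      # broker-named queue
ch.queue_bind(exchange="vitals", queue=q.method.queue)

# Publisher side: e.g. the signal-acquisition layer pushes a sample.
ch.basic_publish(exchange="vitals", routing_key="",
                 body=b'{"patient": "p42", "ecg_mv": 1.02}')

def on_message(channel, method, properties, body):
    print("analysis layer received:", body)

ch.basic_consume(queue=q.method.queue, on_message_callback=on_message,
                 auto_ack=True)
ch.start_consuming()                                 # blocks; Ctrl-C to stop
```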

  2. Cloud Service Selection Using Multicriteria Decision Analysis

    PubMed Central

    Anuar, Nor Badrul; Shiraz, Muhammad; Haque, Israat Tanzeena

    2014-01-01

    Cloud computing (CC) has recently been receiving tremendous attention from the IT industry and academic researchers. CC leverages its unique services to cloud customers in a pay-as-you-go, anytime, anywhere manner. Cloud services provide dynamically scalable services through the Internet on demand. Therefore, service provisioning plays a key role in CC. The cloud customer must be able to select appropriate services according to his or her needs. Several approaches have been proposed to solve the service selection problem, including multicriteria decision analysis (MCDA). MCDA enables the user to choose from among a number of available choices. In this paper, we analyze the application of MCDA to service selection in CC. We identify and synthesize several MCDA techniques and provide a comprehensive analysis of this technology for general readers. In addition, we present a taxonomy derived from a survey of the current literature. Finally, we highlight several state-of-the-art practical aspects of MCDA implementation in cloud computing service selection. The contributions of this study are four-fold: (a) focusing on the state-of-the-art MCDA techniques, (b) highlighting the comparative analysis and suitability of several MCDA methods, (c) presenting a taxonomy through extensive literature review, and (d) analyzing and summarizing the cloud computing service selections in different scenarios. PMID:24696645
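
    The selection problem in its simplest MCDA form is simple additive weighting (SAW): normalize each criterion, weight it, sum, and rank. The sketch below uses invented providers, scores, and weights; the survey itself covers far richer methods (AHP, TOPSIS, outranking, and others).

```python
# Simple additive weighting (SAW), the most basic MCDA method, applied to
# cloud service selection. Providers, scores, and weights are invented.
import numpy as np

providers = ["CloudA", "CloudB", "CloudC"]
criteria  = ["cost", "latency_ms", "availability", "security"]
benefit   = [False, False, True, True]     # False = lower is better
weights   = np.array([0.35, 0.25, 0.25, 0.15])

scores = np.array([[0.12,  80.0, 0.999, 7.0],   # CloudA
                   [0.10, 120.0, 0.995, 8.0],   # CloudB
                   [0.15,  60.0, 0.999, 9.0]])  # CloudC

# Normalize: benefit criteria scaled by the max, cost criteria by the min.
norm = np.empty_like(scores)
for j, is_benefit in enumerate(benefit):
    col = scores[:, j]
    norm[:, j] = col / col.max() if is_benefit else col.min() / col

ranking = sorted(zip(providers, norm @ weights), key=lambda p: -p[1])
for name, s in ranking:
    print(f"{name}: {s:.3f}")
```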

  3. Cloud service selection using multicriteria decision analysis.

    PubMed

    Whaiduzzaman, Md; Gani, Abdullah; Anuar, Nor Badrul; Shiraz, Muhammad; Haque, Mohammad Nazmul; Haque, Israat Tanzeena

    2014-01-01

    Cloud computing (CC) has recently been receiving tremendous attention from the IT industry and academic researchers. CC leverages its unique services to cloud customers in a pay-as-you-go, anytime, anywhere manner. Cloud services provide dynamically scalable services through the Internet on demand. Therefore, service provisioning plays a key role in CC. The cloud customer must be able to select appropriate services according to his or her needs. Several approaches have been proposed to solve the service selection problem, including multicriteria decision analysis (MCDA). MCDA enables the user to choose from among a number of available choices. In this paper, we analyze the application of MCDA to service selection in CC. We identify and synthesize several MCDA techniques and provide a comprehensive analysis of this technology for general readers. In addition, we present a taxonomy derived from a survey of the current literature. Finally, we highlight several state-of-the-art practical aspects of MCDA implementation in cloud computing service selection. The contributions of this study are four-fold: (a) focusing on the state-of-the-art MCDA techniques, (b) highlighting the comparative analysis and suitability of several MCDA methods, (c) presenting a taxonomy through extensive literature review, and (d) analyzing and summarizing the cloud computing service selections in different scenarios.

  4. Implementation and use of a highly available and innovative IaaS solution: the Cloud Area Padovana

    NASA Astrophysics Data System (ADS)

    Aiftimiei, C.; Andreetto, P.; Bertocco, S.; Biasotto, M.; Dal Pra, S.; Costa, F.; Crescente, A.; Dorigo, A.; Fantinel, S.; Fanzago, F.; Frizziero, E.; Gulmini, M.; Michelotto, M.; Sgaravatto, M.; Traldi, S.; Venaruzzo, M.; Verlato, M.; Zangrando, L.

    2015-12-01

    While in the business world the cloud paradigm is typically implemented by purchasing resources and services from third-party providers (e.g. Amazon), in the scientific environment there is usually a need for on-premises IaaS infrastructures that allow efficient usage of the hardware distributed among (and owned by) different scientific administrative domains. In addition, the requirement of open source adoption has led to the choice of products like OpenStack by many organizations. We describe a use case of the Italian National Institute for Nuclear Physics (INFN) which resulted in the implementation of a unique cloud service, called 'Cloud Area Padovana', which encompasses resources spread over two different sites: the INFN Legnaro National Laboratories and the INFN Padova division. We describe how this IaaS has been implemented, which technologies have been adopted, and how services have been configured in high-availability (HA) mode. We also discuss how identity and authorization management were implemented, adopting a widely accepted standard architecture based on SAML2 and OpenID: by leveraging the versatility of those standards, the integration with authentication federations like IDEM was implemented. We also discuss some other innovative developments, such as a pluggable scheduler, implemented as an extension of the native OpenStack scheduler, which allows the allocation of resources according to a fair-share model and provides a persistent queuing mechanism for handling user requests that cannot be immediately served. The tools, technologies, and procedures used to install, configure, monitor, and operate this cloud service are also discussed. Finally, we present some examples that show how this IaaS infrastructure is being used.

  5. What CFOs should know before venturing into the cloud.

    PubMed

    Rajendran, Janakan

    2013-05-01

    There are three major trends in the use of cloud-based services for healthcare IT: Cloud computing involves the hosting of health IT applications in a service provider cloud. Cloud storage is a data storage service that can involve, for example, long-term storage and archival of information such as clinical data, medical images, and scanned documents. Data center colocation involves renting secure space in a vendor's data center, an approach that allows a hospital to share power capacity and proven security protocols, reducing costs.

  6. IRIS Product Recommendations

    NASA Technical Reports Server (NTRS)

    Short, David A.

    2000-01-01

    This report presents the Applied Meteorology Unit's (AMU) evaluation of SIGMET Inc.'s Integrated Radar Information System (IRIS) Product Generator and recommendations for products emphasizing lightning and microburst tools. The IRIS Product Generator processes radar reflectivity data from the Weather Surveillance Radar, model 74C (WSR-74C), located on Patrick Air Force Base. The IRIS System was upgraded from version 6.12 to version 7.05 in late December 1999. A statistical analysis of atmospheric temperature variability over the Cape Canaveral Air Force Station (CCAFS) Weather Station provided guidance for the configuration of radar products that provide information on the mixed-phase (liquid and ice) region of clouds, between 0°C and -20°C. Mixed-phase processes at these temperatures are physically linked to electrification and the genesis of severe weather within convectively generated clouds. Day-to-day variations in the atmospheric temperature profile are of sufficient magnitude to warrant periodic reconfiguration of radar products intended for the interpretation of lightning and microburst potential of convectively generated clouds. The AMU also examined the radar volume-scan strategy to determine the scales of vertical gaps within the altitude range of the 0°C to -20°C isotherms over the Kennedy Space Center (KSC)/CCAFS area. This report presents two objective strategies for designing volume scans and proposes a modified scan strategy that reduces the average vertical gap by 37% as a means for improving radar observations of cloud characteristics in the critical 0°C to -20°C layer. The AMU recommends a total of 18 products, including 11 products that require use of the IRIS programming language and the IRIS User Product Insert feature. Included is a cell trends product and display, modeled after the WSR-88D cell trends display in use by the National Weather Service.

  7. Trust-Enhanced Cloud Service Selection Model Based on QoS Analysis.

    PubMed

    Pan, Yuchen; Ding, Shuai; Fan, Wenjuan; Li, Jing; Yang, Shanlin

    2015-01-01

    Cloud computing technology plays a very important role in many areas, such as the construction and development of the smart city. Meanwhile, numerous cloud services appear on cloud-based platforms. Therefore, how to select trustworthy cloud services remains a significant problem on such platforms, one extensively investigated owing to the ever-growing needs of users. However, trust relationships in social networks have not been taken into account in existing methods of cloud service selection and recommendation. In this paper, we propose a cloud service selection model based on trust-enhanced similarity. Firstly, the direct, indirect, and hybrid trust degrees are measured based on the interaction frequencies among users. Secondly, we estimate the overall similarity by combining the experience usability measured by the Jaccard coefficient and the numerical distance computed by the Pearson correlation coefficient. Then, by using the trust degree to modify the basic similarity, we obtain a trust-enhanced similarity. Finally, we utilize the trust-enhanced similarity to find similar trusted neighbors and predict the missing QoS values as the basis of cloud service selection and recommendation. The experimental results show that our approach is able to obtain optimal results by adjusting parameters and exhibits high effectiveness. The cloud service rankings produced by our model also have better QoS properties than those of other methods in the comparison experiments.

  8. Trust-Enhanced Cloud Service Selection Model Based on QoS Analysis

    PubMed Central

    Pan, Yuchen; Ding, Shuai; Fan, Wenjuan; Li, Jing; Yang, Shanlin

    2015-01-01

    Cloud computing technology plays a very important role in many areas, such as the construction and development of the smart city. Meanwhile, numerous cloud services appear on cloud-based platforms. Therefore, how to select trustworthy cloud services remains a significant problem on such platforms, one extensively investigated owing to the ever-growing needs of users. However, trust relationships in social networks have not been taken into account in existing methods of cloud service selection and recommendation. In this paper, we propose a cloud service selection model based on trust-enhanced similarity. Firstly, the direct, indirect, and hybrid trust degrees are measured based on the interaction frequencies among users. Secondly, we estimate the overall similarity by combining the experience usability measured by the Jaccard coefficient and the numerical distance computed by the Pearson correlation coefficient. Then, by using the trust degree to modify the basic similarity, we obtain a trust-enhanced similarity. Finally, we utilize the trust-enhanced similarity to find similar trusted neighbors and predict the missing QoS values as the basis of cloud service selection and recommendation. The experimental results show that our approach is able to obtain optimal results by adjusting parameters and exhibits high effectiveness. The cloud service rankings produced by our model also have better QoS properties than those of other methods in the comparison experiments. PMID:26606388
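
    As a rough sketch of how such a trust-enhanced similarity can be assembled, the Python below combines a Jaccard overlap of co-invoked services with a Pearson correlation on shared QoS values and scales the result by a trust degree. The blending weight alpha and the example data are assumptions, not the authors' exact formulation:

      from math import sqrt

      def jaccard(a, b):
          sa, sb = set(a), set(b)
          return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

      def pearson(x, y):
          n = len(x)
          mx, my = sum(x) / n, sum(y) / n
          cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
          sx = sqrt(sum((xi - mx) ** 2 for xi in x))
          sy = sqrt(sum((yi - my) ** 2 for yi in y))
          return cov / (sx * sy) if sx and sy else 0.0

      def trust_enhanced_similarity(qos_u, qos_v, trust_uv, alpha=0.5):
          shared = sorted(set(qos_u) & set(qos_v))   # services both users rated
          if len(shared) < 2:
              return 0.0
          usability = jaccard(qos_u, qos_v)          # experience usability
          distance = pearson([qos_u[s] for s in shared],
                             [qos_v[s] for s in shared])
          basic = alpha * usability + (1 - alpha) * distance
          return trust_uv * basic                    # trust modifies similarity

      u = {"s1": 0.9, "s2": 0.4, "s3": 0.7}
      v = {"s1": 0.8, "s2": 0.5, "s4": 0.6}
      print(trust_enhanced_similarity(u, v, trust_uv=0.8))   # -> 0.6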

  9. Security Risks of Cloud Computing and Its Emergence as 5th Utility Service

    NASA Astrophysics Data System (ADS)

    Ahmad, Mushtaq

    Cloud computing is being projected by the major cloud service provider IT companies such as IBM, Google, Yahoo, Amazon and others as the fifth utility, where clients will have access to processing for those applications and/or software projects which need very high processing speed for compute-intensive workloads and huge data capacity for scientific and engineering research problems, as well as for e-business and data content network applications. These services for different types of clients are provided under DASM (Direct Access Service Management), based on virtualization of hardware, software and very high bandwidth Internet (Web 2.0) communication. The paper reviews these developments in cloud computing and the hardware/software configuration of the cloud paradigm. The paper also examines the vital aspects of the security risks projected by IT industry experts and cloud clients, and highlights cloud providers' responses to those security risks.

  10. Rheticus: a cloud-based Geo-Information Service for the Detection and Monitoring of Geohazards and Infrastructural Instabilities

    NASA Astrophysics Data System (ADS)

    Chiaradia, M. T.; Samarelli, S.; Massimi, V.; Nutricato, R.; Nitti, D. O.; Morea, A.; Tijani, K.

    2017-12-01

    Geospatial information is today essential for organizations and professionals working in several industries. Increasingly, vast amounts of information are collected from multiple data sources and are freely available to anyone as open data. Rheticus® is an innovative cloud-based data and services hub able to deliver Earth Observation added-value products through automated complex processes and, where appropriate, minimal interaction with human operators. This target is achieved by means of programmable components working as different software layers in a modern enterprise system which relies on the SOA (Service-Oriented Architecture) model. Due to its distributed architecture, where every function is defined and encapsulated in a standalone component, Rheticus is highly scalable and distributable, allowing different configurations depending on user needs. This approach makes the system very flexible with respect to service implementation, ensuring the ability to rethink and redesign the whole process with little effort. In this work, we outline the overall cloud-based platform and focus on the "Rheticus Displacement" service, aimed at providing accurate information to monitor movements occurring across landslide features or structural instabilities that could affect buildings or infrastructures. Using Sentinel-1 (S1) open data images and Multi-Temporal SAR Interferometry (MTInSAR) techniques, the service is complementary to traditional survey methods, providing a long-term solution to slope instability monitoring. Rheticus automatically browses and accesses (on a weekly basis) the products of the rolling archive of the ESA S1 Scientific Data Hub. S1 data are then processed by SPINUA (Stable Point Interferometry even in Unurbanized Areas), a robust MTInSAR algorithm, which is responsible for producing displacement maps immediately usable to measure movements of point and distributed scatterers with sub-centimetric precision. We outline the automatic generation process of displacement maps and provide examples of the detection and monitoring of geohazards and infrastructure instabilities. ACK: Rheticus® is a registered trademark of Planetek Italia srl. Study carried out in the framework of the FAST4MAP project (ASI Contract n. 2015-020-R.0). Sentinel-1A products provided by ESA.

  11. Cloud-based robot remote control system for smart factory

    NASA Astrophysics Data System (ADS)

    Wu, Zhiming; Li, Lianzhong; Xu, Yang; Zhai, Jingmei

    2015-12-01

    With the development of internet technologies and the wide application of robots, there is a clear trend toward the integration of networks and robots. A cloud-based robot remote control system over networks for smart factories is proposed, which enables remote users to control robots and thereby realize intelligent production. To achieve this, a three-layer system architecture is designed, comprising a user layer, a service layer and a physical layer. Remote control applications running on the cloud server are developed on Microsoft Azure. Moreover, DIV+CSS technologies are used to design the human-machine interface, lowering maintenance costs and improving development efficiency. Finally, an experiment is implemented to verify the feasibility of the approach.

  12. A New User Interface for On-Demand Customizable Data Products for Sensors in a SensorWeb

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Sullivan, Don

    2011-01-01

    A SensorWeb is a set of sensors, which can consist of ground, airborne and space-based sensors interoperating in an automated or autonomous collaborative manner. The NASA SensorWeb toolbox, developed at NASA/GSFC in collaboration with NASA/JPL, NASA/Ames and other partners, is a set of software and standards that (1) enables users to create virtual private networks of sensors over open networks; (2) provides the capability to orchestrate their actions; (3) provides the capability to customize the output data products and (4) enables automated delivery of the data products to the user's desktop. A recent addition to the SensorWeb Toolbox is a new user interface, together with web services co-resident with the sensors, to enable rapid creation, loading and execution of new algorithms for processing sensor data. The web service along with the user interface follows the Open Geospatial Consortium (OGC) standard called Web Coverage Processing Service (WCPS). This presentation will detail the prototype that was built and how the WCPS was tested against a HyspIRI flight testbed and an elastic computation cloud on the ground with EO-1 data. HyspIRI is a future NASA decadal mission. The elastic computation cloud stores EO-1 data and runs software similar to Amazon online shopping.
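
    For flavor, a WCPS request can be as simple as an HTTP call carrying a coverage-processing expression. In the hedged Python sketch below, the endpoint URL, coverage name, and query-parameter convention are all hypothetical placeholders; real SensorWeb or WCPS deployments may expose the operation differently:

      import requests

      endpoint = "https://example.org/sensorweb/wcps"   # hypothetical server
      query = (
          'for $c in (EO1_Hyperion_Scene) '             # hypothetical coverage
          'return encode($c[Lat(34.0:34.5), Long(-118.5:-118.0)], "image/tiff")'
      )
      # Submit the processing expression and save the customized data product.
      resp = requests.get(endpoint, params={"query": query}, timeout=60)
      resp.raise_for_status()
      with open("subset.tif", "wb") as f:
          f.write(resp.content)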

  13. Context-aware distributed cloud computing using CloudScheduler

    NASA Astrophysics Data System (ADS)

    Seuster, R.; Leavett-Brown, CR; Casteels, K.; Driemel, C.; Paterson, M.; Ring, D.; Sobie, RJ; Taylor, RP; Weldon, J.

    2017-10-01

    The distributed cloud using the CloudScheduler VM provisioning service is one of the longest-running systems for HEP workloads. It has run millions of jobs for ATLAS and Belle II over the past few years using private and commercial clouds around the world. Our goal is to scale the distributed cloud to the 10,000-core level, with the ability to run any type of application (low I/O, high I/O and high memory) on any cloud. To achieve this goal, we have been implementing changes that utilize context-aware computing designs that are currently employed in the mobile communication industry. Context-awareness makes use of real-time and archived data to respond to user or system requirements. In our distributed cloud, we have many opportunistic clouds with no local HEP services, software or storage repositories. A context-aware design significantly improves the reliability and performance of our system by locating the nearest instance of the required services. We describe how we are collecting and managing contextual information from our workload management systems, the clouds, the virtual machines and our services. This information is used not only to monitor the system but also to carry out automated corrective actions. We are incrementally adding new alerting and response services to our distributed cloud. This will enable us to scale the number of clouds and virtual machines. Further, a context-aware design will enable us to run analysis or high I/O applications on opportunistic clouds. We envisage an open-source HTTP data federation (for example, the DynaFed system at CERN) as a service that would provide us access to existing storage elements used by the HEP experiments.

  14. Processing Shotgun Proteomics Data on the Amazon Cloud with the Trans-Proteomic Pipeline*

    PubMed Central

    Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W.; Moritz, Robert L.

    2015-01-01

    Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of the cloud-enabled Trans-Proteomic Pipeline by processing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at a very low cost. PMID:25418363

  15. Processing shotgun proteomics data on the Amazon cloud with the trans-proteomic pipeline.

    PubMed

    Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W; Moritz, Robert L

    2015-02-01

    Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of the cloud-enabled Trans-Proteomic Pipeline by processing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at a very low cost. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  16. Earth Science Data Fusion with Event Building Approach

    NASA Technical Reports Server (NTRS)

    Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.

    2015-01-01

    Objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve the efficiency of Earth Science data fusion, big data processing and analytics. The key components of NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident data Predictor, fast in-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a workflow with minimized I/O), and on-line data processing control and analytics services. The NAIADS project leverages the CLARA framework, developed at Jefferson Lab, integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis will be used for the NAIADS demonstration and performance tests in compute Cloud and Cluster environments.

  17. Cloud GIS Based Watershed Management

    NASA Astrophysics Data System (ADS)

    Bediroğlu, G.; Colak, H. E.

    2017-11-01

    In this study, we generated a Cloud GIS based watershed management system using a cloud computing architecture. Cloud GIS is used as SaaS (Software as a Service) and DaaS (Data as a Service). We ran GIS analyses in the cloud to test SaaS and deployed GIS datasets in the cloud to test DaaS. We used a hybrid cloud computing model, drawing on ready-made web-based mapping services hosted in the cloud (world topology, satellite imagery). After creating geodatabases covering hydrology (rivers, lakes), soil maps, climate maps, rain maps, geology and land use, we uploaded them to the system. The watershed of the study area was determined in the cloud using the hosted topology maps. After uploading all the datasets to the system, we applied various GIS analyses and queries. The results show that Cloud GIS technology brings speed and efficiency to watershed management studies. Moreover, the system can easily be adapted for similar land analysis and management studies.

  18. Comparative study of internet cloud and cloudlet over wireless mesh networks for real-time applications

    NASA Astrophysics Data System (ADS)

    Khan, Kashif A.; Wang, Qi; Luo, Chunbo; Wang, Xinheng; Grecos, Christos

    2014-05-01

    Mobile cloud computing is gaining worldwide momentum for ubiquitous on-demand cloud services for mobile users, provided by Amazon, Google and others at low capital cost. However, Internet-centric clouds introduce wide area network (WAN) delays that are often intolerable for real-time applications such as video streaming. One promising approach to addressing this challenge is to deploy decentralized mini-cloud facilities known as cloudlets to enable localized cloud services. When supported by local wireless connectivity, a wireless cloudlet is expected to offer low cost and high performance cloud services for the users. In this work, we implement a realistic framework that comprises both a popular Internet cloud (Amazon Cloud) and a real-world cloudlet (based on Ubuntu Enterprise Cloud (UEC)) for mobile cloud users in a wireless mesh network. We focus on real-time video streaming over the HTTP standard and implement a typical application. We further perform a comprehensive comparative analysis and empirical evaluation of the application's performance when it is delivered over the Internet cloud and the cloudlet respectively. The study quantifies the influence of the two different cloud networking architectures on supporting real-time video streaming. We also enable movement of the users in the wireless mesh network and investigate the effect of user mobility on mobile cloud computing over the cloudlet and the Amazon cloud respectively. Our experimental results demonstrate the advantages of the cloudlet paradigm over its Internet cloud counterpart in supporting the quality of service of real-time applications.

  19. Cloud Computing: An Overview

    NASA Astrophysics Data System (ADS)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with minimum resources, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become the hottest technology. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to a public service, from a cost-saving tool to a revenue generator, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing, as well as the value chain and standardization efforts.

  20. Open Source Surrogate Safety Assessment Model, 2017 Enhancement and Update: SSAM Version 3.0 [Tech Brief

    DOT National Transportation Integrated Search

    2016-11-17

    The ETFOMM (Enhanced Transportation Flow Open Source Microscopic Model) Cloud Service (ECS) is a software product sponsored by the U.S. Department of Transportation in conjunction with the Microscopic Traffic Simulation Models and SoftwareAn Op...

  1. Chemical Transformation System: Cloud Based Cheminformatic Services to Support Integrated Environmental Modeling

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) systems that account for the fate/transport of organics frequently require physicochemical properties as well as transformation products. A myriad of chemical property databases exist but these can be difficult to access and often do not co...

  2. Decision Support for Personalized Cloud Service Selection through Multi-Attribute Trustworthiness Evaluation

    PubMed Central

    Ding, Shuai; Xia, Chen-Yi; Zhou, Kai-Le; Yang, Shan-Lin; Shang, Jennifer S.

    2014-01-01

    Facing a customer market with rising demands for cloud service dependability and security, trustworthiness evaluation techniques are becoming essential to cloud service selection. But these methods are out of reach for most customers, as they require considerable expertise. Additionally, since cloud service evaluation is often a costly and time-consuming process, it is not practical to measure the trustworthiness attributes of all candidates for each customer. Many existing models also cannot easily deal with cloud services which have very few historical records. In this paper, we propose a novel service selection approach in which missing value prediction and multi-attribute trustworthiness evaluation are jointly taken into account. By simply collecting limited historical records, the approach is able to support personalized trustworthy service selection. The experimental results also show that our approach performs much better than competing ones with respect to customer preference and expectation in trustworthiness assessment. PMID:24972237

  3. Decision support for personalized cloud service selection through multi-attribute trustworthiness evaluation.

    PubMed

    Ding, Shuai; Xia, Cheng-Yi; Xia, Chen-Yi; Zhou, Kai-Le; Yang, Shan-Lin; Shang, Jennifer S

    2014-01-01

    Facing a customer market with rising demands for cloud service dependability and security, trustworthiness evaluation techniques are becoming essential to cloud service selection. But these methods are out of reach for most customers, as they require considerable expertise. Additionally, since cloud service evaluation is often a costly and time-consuming process, it is not practical to measure the trustworthiness attributes of all candidates for each customer. Many existing models also cannot easily deal with cloud services which have very few historical records. In this paper, we propose a novel service selection approach in which missing value prediction and multi-attribute trustworthiness evaluation are jointly taken into account. By simply collecting limited historical records, the approach is able to support personalized trustworthy service selection. The experimental results also show that our approach performs much better than competing ones with respect to customer preference and expectation in trustworthiness assessment.

  4. Hybrid cloud: bridging of private and public cloud computing

    NASA Astrophysics Data System (ADS)

    Aryotejo, Guruh; Kristiyanto, Daniel Y.; Mufadhol

    2018-05-01

    Cloud computing has quickly emerged as a promising paradigm in recent years, especially for the business sector. In addition, through cloud service providers, cloud computing is widely used by Information Technology (IT) based startup companies to grow their business. However, most businesses' awareness of data security issues is low, since some Cloud Service Providers (CSPs) could decrypt their data. The Hybrid Cloud Deployment Model (HCDM) is open source in character and is one of the more secure cloud computing models, so HCDM may solve these data security issues. The objective of this study is to design, deploy and evaluate a HCDM as Infrastructure as a Service (IaaS). In the implementation process, the Metal as a Service (MAAS) engine was used as a base to build an actual server and node, followed by installation of the vsftpd application, which serves as the FTP server. For comparison with HCDM, a public cloud was adopted through a public cloud interface. As a result, the design and deployment of HCDM were conducted successfully; besides offering good security, HCDM was able to transfer data significantly faster than the public cloud. To the best of our knowledge, the Hybrid Cloud Deployment Model is one of the more secure cloud computing models due to its open source character. Furthermore, this study will serve as a base for future studies of the Hybrid Cloud Deployment Model, which may be relevant to solving major security issues of IT-based startup companies, especially in Indonesia.

  5. Integration of hybrid wireless networks in cloud services oriented enterprise information systems

    NASA Astrophysics Data System (ADS)

    Li, Shancang; Xu, Lida; Wang, Xinheng; Wang, Jue

    2012-05-01

    This article presents a hybrid wireless network integration scheme for cloud services-based enterprise information systems (EISs). With emerging hybrid wireless networks and cloud computing technologies, it is necessary to develop a scheme that can seamlessly integrate these new technologies into existing EISs. By combining hybrid wireless networks and cloud computing in EISs, a new framework is proposed that includes a frontend layer, a middle layer and a backend layer connected to IP-based EISs. Based on a collaborative architecture, a cloud services management framework and process diagram are presented. As a key feature, the proposed approach integrates access control functionalities within the hybrid framework, providing users with filtered views of available cloud services based on cloud service access requirements and user security credentials. In future work, we will implement the proposed framework over the SwanMesh platform by integrating the UPnP standard into an enterprise information system.
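
    The access-control idea, filtering each user's view of the service pool by security credentials, can be illustrated with a toy Python example; the clearance levels and catalogue entries below are hypothetical, not the article's actual scheme:

      # Each user sees only the services their clearance level permits.
      LEVELS = {"public": 0, "internal": 1, "confidential": 2}

      services = [
          {"name": "inventory-query", "required": "public"},
          {"name": "order-tracking",  "required": "internal"},
          {"name": "payroll-report",  "required": "confidential"},
      ]

      def visible_services(user_clearance):
          """Return the filtered view of services this user may access."""
          level = LEVELS[user_clearance]
          return [s["name"] for s in services if LEVELS[s["required"]] <= level]

      print(visible_services("internal"))   # ['inventory-query', 'order-tracking']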

  6. Benefits of cloud computing for PACS and archiving.

    PubMed

    Koch, Patrick

    2012-01-01

    The goal of cloud-based services is to provide easy, scalable access to computing resources and IT services. The healthcare industry requires a private cloud that adheres to government mandates designed to ensure privacy and security of patient data while enabling access by authorized users. Cloud-based computing in the imaging market has evolved from a service that provided cost effective disaster recovery for archived data to fully featured PACS and vendor neutral archiving services that can address the needs of healthcare providers of all sizes. Healthcare providers worldwide are now using the cloud to distribute images to remote radiologists while supporting advanced reading tools, deliver radiology reports and imaging studies to referring physicians, and provide redundant data storage. Vendor managed cloud services eliminate large capital investments in equipment and maintenance, as well as staffing for the data center--creating a reduction in total cost of ownership for the healthcare provider.

  7. Tourism guide cloud service quality: What actually delights customers?

    PubMed

    Lin, Shu-Ping; Yang, Chen-Lung; Pi, Han-Chung; Ho, Thao-Minh

    2016-01-01

    The emergence of advanced IT and cloud services has beneficially supported the information-intensive tourism industry, while simultaneously causing intense competition to attract customers through efficient service platforms. In response, numerous nations have implemented cloud platforms to provide value-added sightseeing information and personalized intelligent service experiences. Despite these efforts, customers' actual perspectives have not yet been sufficiently understood. To bridge the gap, this study investigates which aspects of tourism cloud services actually drive customers' satisfaction and loyalty. 336 valid survey questionnaire responses were analyzed using structural equation modeling. The results confirm the positive impacts of function quality, enjoyment, multiple visual aids, and information quality on customer satisfaction, as well as of enjoyment and satisfaction on use loyalty. The findings provide helpful references on customer use behavior for enhancing cloud service quality and achieving better organizational competitiveness.

  8. Bootstrapping and Maintaining Trust in the Cloud

    DTIC Science & Technology

    2016-12-01

    proliferation and popularity of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services and Google Compute Engine means... IaaS trusted computing system: • Secure Bootstrapping – the system should enable the tenant to securely install an initial root secret into each cloud ...elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features, but none achieve all. Excalibur [31] sup

  9. A Highly Scalable Data Service (HSDS) using Cloud-based Storage Technologies for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Readey, J.; Votava, P.; Henderson, J.; Willmore, F.

    2017-12-01

    Cloud-based infrastructure may offer several key benefits of scalability, built-in redundancy, security mechanisms and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and legacy software systems developed for online data repositories within the federal government were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Moreover, services based on object storage are well established and provided by all the leading cloud service providers (Amazon Web Services, Microsoft Azure, Google Cloud, etc.), and can often provide unmatched "scale-out" capabilities and data availability to a large and growing consumer base at a price point unachievable with in-house solutions. We describe a system that utilizes object storage rather than traditional file-system-based storage to vend earth science data. The system described is not only cost effective, but shows a performance advantage for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.

  10. Cloud computing basics for librarians.

    PubMed

    Hoy, Matthew B

    2012-01-01

    "Cloud computing" is the name for the recent trend of moving software and computing resources to an online, shared-service model. This article briefly defines cloud computing, discusses different models, explores the advantages and disadvantages, and describes some of the ways cloud computing can be used in libraries. Examples of cloud services are included at the end of the article. Copyright © Taylor & Francis Group, LLC

  11. Efficiently Serving HDF5 Products via OPeNDAP

    NASA Technical Reports Server (NTRS)

    Yang, Kent

    2017-01-01

    Hyrax OPeNDAP services are widely used by Earth Science data centers at NASA, NOAA and other organizations to serve end users. In this talk, we will present some key features added to the HDF5 Hyrax OPeNDAP handler that can help data centers better serve HDF5/netCDF-4 data products. Among these new features, we will focus on the following: (1) DAP4 support; (2) memory cache and disk cache support that can reduce service access time; (3) enhancements that allow swath-like HDF5 products to be visualized by CF client tools. We will also discuss the role of the HDF5 handler in depth in the recent study of the Hyrax service in the cloud environment.

  12. Establishing a Cloud Computing Success Model for Hospitals in Taiwan.

    PubMed

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services.

  13. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    PubMed Central

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services. PMID:28112020

  14. Chemical Transformation System: Cloud Based Cheminformatic Services to Support Integrated Environmental Modeling (proceedings)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) systems that account for the fate/transport of organics frequently require physicochemical properties as well as transformation products. A myriad of chemical property databases exist but these can be difficult to access and often do not co...

  15. Atmospheric parameterization schemes for satellite cloud property retrieval during FIRE IFO 2

    NASA Technical Reports Server (NTRS)

    Titlow, James; Baum, Bryan A.

    1993-01-01

    Satellite cloud retrieval algorithms generally require atmospheric temperature and humidity profiles to determine such cloud properties as pressure and height. For instance, the CO2 slicing technique called the ratio method requires the calculation of theoretical upwelling radiances both at the surface and at a prescribed number (40) of atmospheric levels. This technique has been applied to data from, for example, the High Resolution Infrared Radiometer Sounder (HIRS/2, henceforth HIRS) flown aboard the NOAA series of polar orbiting satellites and the High Resolution Interferometer Sounder (HIS). In this particular study, four NOAA-11 HIRS channels in the 15-μm region are used. The ratio method may be applied to various channel combinations to estimate cloud top heights using channels in the 15-μm region. Presently, the multispectral, multiresolution (MSMR) scheme uses 4 HIRS channel combination estimates for mid- to high-level cloud pressure retrieval and Advanced Very High Resolution Radiometer (AVHRR) data for low-level (> 700 mb) cloud level retrieval. In order to determine theoretical upwelling radiances, atmospheric temperature and water vapor profiles must be provided as well as profiles of other radiatively important gas absorber constituents such as CO2, O3, and CH4. The assumed temperature and humidity profiles have a large effect on transmittance and radiance profiles, which in turn are used with HIRS data to calculate cloud pressure, and thus cloud height and temperature. For large spatial scale satellite data analysis, atmospheric parameterization schemes for cloud retrieval algorithms are usually based on a gridded product such as that provided by the European Centre for Medium-Range Weather Forecasts (ECMWF) or the National Meteorological Center (NMC). These global, gridded products prescribe temperature and humidity profiles for a limited number of pressure levels (up to 14) in a vertical atmospheric column. The FIRE IFO 2 experiment provides an opportunity to investigate current atmospheric profile parameterization schemes, compare satellite cloud height results using both gridded products (ECMWF) and high vertical resolution sonde data from the National Weather Service (NWS) and Cross Chain Loran Atmospheric Sounding System (CLASS), and suggest modifications in atmospheric parameterization schemes based on these results.
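
    For reference, the CO2-slicing ratio method mentioned above is commonly written as follows (a standard textbook formulation, not quoted from this report): the cloud-top pressure P_c is found by matching the observed radiance ratio of two nearby channels to its modeled counterpart,

      \frac{R_{clr}(\nu_1) - R(\nu_1)}{R_{clr}(\nu_2) - R(\nu_2)}
        \approx
      \frac{\int_{P_c}^{P_s} \tau(\nu_1,p)\, \frac{\partial B[\nu_1,T(p)]}{\partial p}\, dp}
           {\int_{P_c}^{P_s} \tau(\nu_2,p)\, \frac{\partial B[\nu_2,T(p)]}{\partial p}\, dp}

    where R is the measured radiance, R_clr the computed clear-sky radiance, τ the level-to-space transmittance, B the Planck radiance evaluated on the temperature profile T(p), and P_s the surface pressure. The right-hand side's dependence on the assumed temperature and humidity profiles is exactly why the parameterization schemes studied here matter.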

  16. Bioinformatics clouds for big data manipulation.

    PubMed

    Dai, Lin; Gao, Xin; Guo, Yan; Xiao, Jingfa; Zhang, Zhang

    2012-11-28

    As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor.

  17. A price and performance comparison of three different storage architectures for data in cloud-based systems

    NASA Astrophysics Data System (ADS)

    Gallagher, J. H. R.; Jelenak, A.; Potter, N.; Fulker, D. W.; Habermann, T.

    2017-12-01

    Providing data services based on cloud computing technology that are equivalent to those developed for traditional computing and storage systems is critical for successful migration to cloud-based architectures for data production, scientific analysis and storage. OPeNDAP Web-service capabilities (comprising the Data Access Protocol (DAP) specification plus open-source software for realizing DAP in servers and clients) are among the most widely deployed means for achieving data-as-service functionality in the Earth sciences. OPeNDAP services are especially common in traditional data center environments where servers offer access to datasets stored in (very large) file systems, and a preponderance of the source data for these services is stored in the Hierarchical Data Format Version 5 (HDF5). Three candidate architectures for serving NASA satellite Earth Science HDF5 data via Hyrax running on Amazon Web Services (AWS) were developed and their performance examined for a set of representative use cases. Performance was assessed in terms of both runtime and incurred cost. The three architectures differ in how HDF5 files are stored in the Amazon Simple Storage Service (S3) and how the Hyrax server (as an EC2 instance) retrieves their data. Results for both serial and parallel access to HDF5 data in S3 will be presented. While the study focused on HDF5 data, OPeNDAP and the Hyrax data server, the architectures are generic and the analysis can be extrapolated to many different data formats, web APIs, and data servers.
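
    As a usage illustration, a DAP-aware client can read a server-side subset without downloading the whole granule. The sketch below uses netCDF4-python, which supports OPeNDAP URLs when the underlying netCDF-C library is built with DAP support; the URL and variable name are hypothetical placeholders:

      from netCDF4 import Dataset

      url = "https://example.org/opendap/hyrax/granule.h5"   # hypothetical endpoint
      with Dataset(url) as ds:                               # DAP handles transport
          # Only the requested slice crosses the network, not the full file.
          temps = ds.variables["Temperature"][0, :10, :10]
          print(temps.shape, float(temps.mean()))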

  18. Research on cloud-based remote measurement and analysis system

    NASA Astrophysics Data System (ADS)

    Gao, Zhiqiang; He, Lingsong; Su, Wei; Wang, Can; Zhang, Changfan

    2015-02-01

    The promising potential of cloud computing and its convergence with technologies such as cloud storage, cloud push and mobile computing allows for the creation and delivery of new types of cloud service. Building on cloud computing concepts, this paper presents a cloud-based remote measurement and analysis system. The system mainly consists of three parts: a signal acquisition client, a web server deployed as a cloud service, and a remote client. The system is a special website developed using asp.net and Flex RIA technology, which resolves the tension between the two monitoring modes, B/S and C/S. The platform supplies customers with condition monitoring and data analysis services over the Internet, deployed on the cloud server. The signal acquisition device is responsible for collecting data (sensor data, audio, video, etc.) and pushes the monitoring data to the cloud storage database regularly. Data acquisition equipment in this system needs only data collection and networking functions, such as a smartphone or smart sensor. The system's scale adjusts dynamically according to the number of applications and users, avoiding wasted resources. As a representative case study, we developed a prototype system based on the Ali cloud service, using a rotor test rig as the research object. Experimental results demonstrate that the proposed system architecture is feasible.
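
    A minimal sketch of the acquisition client's periodic upload, assuming a hypothetical HTTP endpoint and payload schema (the paper does not specify its actual push interface):

      import time
      import requests

      ENDPOINT = "https://example.org/api/measurements"   # hypothetical cloud API

      def push_reading(sensor_id, values):
          """Upload one batch of readings to the cloud storage service."""
          payload = {"sensor": sensor_id, "ts": time.time(), "values": values}
          resp = requests.post(ENDPOINT, json=payload, timeout=10)
          resp.raise_for_status()                         # surface upload failures

      # e.g. periodic vibration data from the rotor test rig
      push_reading("rotor-rig-accel-1", [0.12, 0.10, 0.15])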

  19. The Education Value of Cloud Computing

    ERIC Educational Resources Information Center

    Katzan, Harry, Jr.

    2010-01-01

    Cloud computing is a technique for supplying computer facilities and providing access to software via the Internet. Cloud computing represents a contextual shift in how computers are provisioned and accessed. One of the defining characteristics of cloud software service is the transfer of control from the client domain to the service provider.…

  20. Cloud Computing. Technology Briefing. Number 1

    ERIC Educational Resources Information Center

    Alberta Education, 2013

    2013-01-01

    Cloud computing is Internet-based computing in which shared resources, software and information are delivered as a service that computers or mobile devices can access on demand. Cloud computing is already used extensively in education. Free or low-cost cloud-based services are used daily by learners and educators to support learning, social…

  1. Research on private cloud computing based on analysis on typical opensource platform: a case study with Eucalyptus and Wavemaker

    NASA Astrophysics Data System (ADS)

    Yu, Xiaoyuan; Yuan, Jian; Chen, Shi

    2013-03-01

    Cloud computing is one of the most popular topics in the IT industry and has recently been adopted by many companies. It has four deployment models: public cloud, community cloud, hybrid cloud and private cloud. Among these, the private cloud can be implemented on a private network and delivers some benefits of cloud computing without its pitfalls. This paper compares typical open source platforms through which a private cloud can be implemented. After this comparison, we choose Eucalyptus and Wavemaker for a case study on the private cloud. We also estimate the performance of cloud platform services and develop prototype software delivered as cloud services.

  2. Detecting Abnormal Machine Characteristics in Cloud Infrastructures

    NASA Technical Reports Server (NTRS)

    Bhaduri, Kanishka; Das, Kamalika; Matthews, Bryan L.

    2011-01-01

    In the cloud computing environment, resources are accessed as services rather than as a product. Monitoring this system for performance is crucial because of the typical pay-per-use packages bought by users for their jobs. With the huge number of machines currently in the cloud system, it is often extremely difficult for system administrators to keep track of all machines using distributed monitoring programs such as Ganglia, which lack system health assessment and summarization capabilities. To overcome this problem, we propose a technique for automated anomaly detection using machine performance data in the cloud. Our algorithm is entirely distributed and runs locally on each computing machine in the cloud in order to rank the machines by their anomalous behavior for given jobs. There is no need to centralize any of the performance data for the analysis, and at the end of the analysis our algorithm generates error reports, thereby allowing the system administrators to take corrective actions. Experiments performed on real data sets collected for different jobs confirm that our algorithm has low overhead for tracking anomalous machines in a cloud infrastructure.
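
    A minimal sketch of the local-scoring idea, not the authors' algorithm: each machine scores its own latest metrics against its local history, and the per-machine scores are then sorted to rank anomalous behavior. The metric names and values are invented:

      from statistics import mean, stdev

      def anomaly_score(history, latest):
          """Max absolute z-score of the latest metrics vs. local history."""
          score = 0.0
          for metric, samples in history.items():
              mu, sigma = mean(samples), stdev(samples)
              if sigma > 0:
                  score = max(score, abs(latest[metric] - mu) / sigma)
          return score

      # Each node computes its score locally; only the scores are compared.
      machines = {
          "node-01": anomaly_score({"cpu": [40, 42, 41, 43], "io_wait": [2, 3, 2, 2]},
                                   {"cpu": 44, "io_wait": 2}),
          "node-02": anomaly_score({"cpu": [38, 40, 39, 41], "io_wait": [2, 2, 3, 2]},
                                   {"cpu": 95, "io_wait": 30}),
      }
      for name in sorted(machines, key=machines.get, reverse=True):
          print(name, round(machines[name], 1))   # most anomalous machine first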

  3. The vacuum platform

    NASA Astrophysics Data System (ADS)

    McNab, A.

    2017-10-01

    This paper describes GridPP’s Vacuum Platform for managing virtual machines (VMs), which has been used to run production workloads for WLCG and other HEP experiments. The platform provides a uniform interface between VMs and the sites they run at, whether the site is organised as an Infrastructure-as-a-Service cloud system such as OpenStack, or an Infrastructure-as-a-Client system such as Vac. The paper describes our experience in using this platform, in developing and operating VM lifecycle managers Vac and Vcycle, and in interacting with VMs provided by LHCb, ATLAS, ALICE, CMS, and the GridPP DIRAC service to run production workloads.

  4. Cloud computing applications for biomedical science: A perspective.

    PubMed

    Navale, Vivek; Bourne, Philip E

    2018-06-01

    Biomedical research has become a digital data-intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches. Cloud computing offers users pay-as-you-go access to services such as hardware infrastructure, platforms, and software for solving common biomedical computational problems. Cloud computing services offer secure on-demand storage and analysis and are differentiated from traditional high-performance computing by their rapid availability and scalability of services. As such, cloud services are engineered to address big data problems and enhance the likelihood of data and analytics sharing, reproducibility, and reuse. Here, we provide an introductory perspective on cloud computing to help the reader determine its value to their own research.

  5. Cloud computing applications for biomedical science: A perspective

    PubMed Central

    2018-01-01

    Biomedical research has become a digital data-intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches. Cloud computing offers users pay-as-you-go access to services such as hardware infrastructure, platforms, and software for solving common biomedical computational problems. Cloud computing services offer secure on-demand storage and analysis and are differentiated from traditional high-performance computing by their rapid availability and scalability of services. As such, cloud services are engineered to address big data problems and enhance the likelihood of data and analytics sharing, reproducibility, and reuse. Here, we provide an introductory perspective on cloud computing to help the reader determine its value to their own research. PMID:29902176

  6. Scheduling multimedia services in cloud computing environment

    NASA Astrophysics Data System (ADS)

    Liu, Yunchang; Li, Chunlin; Luo, Youlong; Shao, Yanling; Zhang, Jing

    2018-02-01

    Currently, security is a critical factor for multimedia services running in the cloud computing environment. As an effective mechanism, trust can improve the security level and mitigate attacks within cloud computing environments. Unfortunately, existing scheduling strategies for multimedia services in the cloud computing environment do not integrate a trust mechanism when making scheduling decisions. In this paper, we propose a scheduling scheme for multimedia services in multiple clouds. First, a novel scheduling architecture is presented. Then, we build a trust model, including both subjective and objective trust, to evaluate the trust degree of multimedia service providers. By employing Bayesian theory, the subjective trust degree between multimedia service providers and users is obtained. According to the attributes of QoS, the objective trust degree of multimedia service providers is calculated. Finally, a scheduling algorithm integrating the trust of entities is proposed by considering the deadline, cost and trust requirements of multimedia services. The scheduling algorithm heuristically searches for reasonable resource allocations that satisfy the trust requirements and meet the deadlines of the multimedia services. Detailed simulation experiments demonstrate the effectiveness and feasibility of the proposed trust scheduling scheme.
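
    One standard way to realize the Bayesian subjective-trust component is a Beta-distribution update over interaction outcomes; the sketch below is illustrative, and its prior and example outcomes are assumptions, not the paper's parameters:

      class BetaTrust:
          """Subjective trust as the mean of a Beta(alpha, beta) posterior."""
          def __init__(self, alpha=1.0, beta=1.0):   # uniform prior
              self.alpha, self.beta = alpha, beta

          def update(self, success):
              if success:
                  self.alpha += 1                    # positive interaction
              else:
                  self.beta += 1                     # negative interaction

          @property
          def degree(self):
              return self.alpha / (self.alpha + self.beta)

      provider = BetaTrust()
      for outcome in [True, True, False, True, True]:
          provider.update(outcome)
      print(round(provider.degree, 2))               # subjective trust ~0.71

      # A scheduler could then admit a provider only if its trust degree,
      # estimated cost, and completion time all meet the job's requirements.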

  7. Personalized cloud-based bioinformatics services for research and education: use cases and the elasticHPC package

    PubMed Central

    2012-01-01

    Background: Bioinformatics services have traditionally been provided in the form of a web-server that is hosted at institutional infrastructure and serves multiple users. This model, however, is not flexible enough to cope with the increasing number of users, increasing data size, and new requirements in terms of speed and availability of service. The advent of cloud computing suggests a new service model that provides an efficient solution to these problems, based on the concepts of "resources-on-demand" and "pay-as-you-go". However, cloud computing has not yet been introduced within bioinformatics servers due to the lack of usage scenarios and software layers that address the requirements of the bioinformatics domain. Results: In this paper, we provide different use case scenarios for providing cloud computing based services, considering both the technical and financial aspects of the cloud computing service model. These scenarios are for individual users seeking computational power as well as bioinformatics service providers aiming at provision of personalized bioinformatics services to their users. We also present elasticHPC, a software package and a library that facilitates the use of high performance cloud computing resources in general and the implementation of the suggested bioinformatics scenarios in particular. Concrete examples that demonstrate the suggested use case scenarios with whole bioinformatics servers and major sequence analysis tools like BLAST are presented. Experimental results with large datasets are also included to show the advantages of the cloud model. Conclusions: Our use case scenarios and the elasticHPC package are steps towards the provision of cloud based bioinformatics services, which would help in overcoming the data challenge of recent biological research. All resources related to elasticHPC and its web-interface are available at http://www.elasticHPC.org. PMID:23281941

  8. Personalized cloud-based bioinformatics services for research and education: use cases and the elasticHPC package.

    PubMed

    El-Kalioby, Mohamed; Abouelhoda, Mohamed; Krüger, Jan; Giegerich, Robert; Sczyrba, Alexander; Wall, Dennis P; Tonellato, Peter

    2012-01-01

    Bioinformatics services have been traditionally provided in the form of a web-server that is hosted at institutional infrastructure and serves multiple users. This model, however, is not flexible enough to cope with the increasing number of users, increasing data size, and new requirements in terms of speed and availability of service. The advent of cloud computing suggests a new service model that provides an efficient solution to these problems, based on the concepts of "resources-on-demand" and "pay-as-you-go". However, cloud computing has not yet been introduced within bioinformatics servers due to the lack of usage scenarios and software layers that address the requirements of the bioinformatics domain. In this paper, we provide different use case scenarios for providing cloud computing based services, considering both the technical and financial aspects of the cloud computing service model. These scenarios are for individual users seeking computational power as well as bioinformatics service providers aiming at provision of personalized bioinformatics services to their users. We also present elasticHPC, a software package and a library that facilitates the use of high performance cloud computing resources in general and the implementation of the suggested bioinformatics scenarios in particular. Concrete examples that demonstrate the suggested use case scenarios with whole bioinformatics servers and major sequence analysis tools like BLAST are presented. Experimental results with large datasets are also included to show the advantages of the cloud model. Our use case scenarios and the elasticHPC package are steps towards the provision of cloud based bioinformatics services, which would help in overcoming the data challenge of recent biological research. All resources related to elasticHPC and its web-interface are available at http://www.elasticHPC.org.

  9. Cloud Computing E-Communication Services in the University Environment

    ERIC Educational Resources Information Center

    Babin, Ron; Halilovic, Branka

    2017-01-01

    The use of cloud computing services has grown dramatically in post-secondary institutions in the last decade. In particular, universities have been attracted by the low cost and flexibility of acquiring cloud software services from Google, Microsoft and others to implement e-mail, calendar, document management and other basic office software.…

  10. Identity-Based Authentication for Cloud Computing

    NASA Astrophysics Data System (ADS)

    Li, Hongwei; Dai, Yuanshun; Tian, Ling; Yang, Haomiao

    Cloud computing is a recently developed technology for complex systems that share massive-scale services among numerous users. Authentication of both users and services is therefore a significant issue for the trust and security of cloud computing. The SSL Authentication Protocol (SAP), once applied to cloud computing, becomes so complicated that users bear a heavy load in both computation and communication. This paper, based on the identity-based hierarchical model for cloud computing (IBHMCC) and its corresponding encryption and signature schemes, presents a new identity-based authentication protocol for cloud computing and services. Simulation testing shows that the authentication protocol is more lightweight and efficient than SAP, especially on the user side. This merit, together with great scalability, makes the model well suited to the massive-scale cloud.

  11. Cloud Service Provider Methods for Managing Insider Threats: Analysis Phase 1

    DTIC Science & Technology

    2013-11-01

    National Institute of Standards and Technology (NIST) Special Publication 800-145 (NIST SP 800-145) defines three types of cloud services: Software as a Service (SaaS) ... among these three models. NIST SP 800-145 describes the three service models as follows: SaaS—The capability provided to the consumer is to use the ... (Porter, Greg. Cloud Service Provider Methods for Managing Insider Threats: Analysis Phase I. Technical Note CMU/SEI-2013-TN-020, November 2013.)

  12. Bioinformatics clouds for big data manipulation

    PubMed Central

    2012-01-01

    As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. Reviewers: This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor. PMID:23190475

  13. Synergistic use of MODIS cloud products and AIRS radiance measurements for retrieval of cloud parameters

    NASA Astrophysics Data System (ADS)

    Li, J.; Menzel, W.; Sun, F.; Schmit, T.

    2003-12-01

    The Moderate-Resolution Imaging Spectroradiometer (MODIS) and Atmospheric Infrared Sounder (AIRS) measurements from the Earth Observing System's (EOS) Aqua satellite will enable global monitoring of the distribution of clouds. MODIS is able to provide at high spatial resolution (1-5 km) the cloud mask, surface and cloud types, cloud phase, cloud-top pressure (CTP), effective cloud amount (ECA), cloud particle size (CPS), and cloud water path (CWP). AIRS is able to provide CTP, ECA, CPS, and CWP within the AIRS footprint with much better accuracy using its greatly enhanced hyperspectral remote sensing capability. The combined MODIS/AIRS system offers the opportunity for cloud products improved over those possible from either system alone. The algorithm developed was applied to process the AIRS longwave cloudy radiance measurements; results are compared with MODIS cloud products, as well as with the Geostationary Operational Environmental Satellite (GOES) sounder cloud products, to demonstrate the advantage of synergistic use of high spatial resolution MODIS cloud products and high spectral resolution AIRS sounder radiance measurements for optimal cloud retrieval. Data from ground-based instrumentation at the Atmospheric Radiation Measurement (ARM) Program Cloud and Radiation Test Bed (CART) in Oklahoma were used for the validation; results show that AIRS improves the MODIS cloud products in certain cases such as low-level clouds.

  14. Cloud Based Educational Systems and Its Challenges and Opportunities and Issues

    ERIC Educational Resources Information Center

    Paul, Prantosh Kr.; Lata Dangwal, Kiran

    2014-01-01

    Cloud Computing (CC) is a set of hardware, software, networks, storage, services and interfaces combined to deliver aspects of computing as a service. Cloud Computing (CC) uses central remote servers to maintain data and applications. Practically, Cloud Computing (CC) is an extension of grid computing with independency and…

  15. Using Cloud-based Storage Technologies for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Readey, J.; Votava, P.

    2016-12-01

    Cloud based infrastructure may offer several key benefits of scalability, built in redundancy and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and software systems developed for NASA data repositories were not developed with a cloud based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Object storage services are provided through all the leading public (Amazon Web Services, Microsoft Azure, Google Cloud, etc.) and private (OpenStack) clouds, and may provide a more cost-effective means of storing large data collections online. We describe a system that utilizes object storage rather than traditional file system based storage to vend earth science data. The system described is not only cost effective, but shows superior performance for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
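
    One common way to realize the HDF5-compatible access pattern described above is to hand an object-store file handle to h5py; the sketch below assumes the s3fs and h5py libraries and uses placeholder bucket, object, and dataset names.

    ```python
    # Read an HDF5 granule directly from S3 object storage. h5py accepts
    # any seekable file-like object, so the s3fs wrapper stands in for a
    # local file. Bucket, object, and dataset names are placeholders.
    import h5py
    import s3fs

    fs = s3fs.S3FileSystem(anon=True)  # public bucket, no credentials needed
    with fs.open("example-bucket/granule.h5", "rb") as f:
        with h5py.File(f, "r") as h5:
            temps = h5["/science/temperature"][:]  # illustrative dataset path
            print(temps.shape, temps.dtype)
    ```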

  16. Cloud based intelligent system for delivering health care as a service.

    PubMed

    Kaur, Pankaj Deep; Chana, Inderveer

    2014-01-01

    The promising potential of cloud computing and its convergence with technologies such as mobile computing, wireless networks, and sensor technologies allows for the creation and delivery of newer types of cloud services. In this paper, we advocate the use of cloud computing for the creation and management of cloud-based health care services. As a representative case study, we design a Cloud Based Intelligent Health Care Service (CBIHCS) that performs real-time monitoring of user health data for the diagnosis of chronic illness such as diabetes. Advanced body sensor components are utilized to gather user-specific health data and store them in cloud-based storage repositories for subsequent analysis and classification. In addition, infrastructure-level mechanisms are proposed to provide dynamic resource elasticity for CBIHCS. Experimental results demonstrate that classification accuracy of 92.59% is achieved with our prototype system and that the predicted patterns of CPU usage offer better opportunities for adaptive resource elasticity.
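
    The abstract does not detail the classifier behind the 92.59% figure, so the sketch below is only a generic illustration of the classification step such a service might run over sensor data pulled from cloud storage; the features, labels, and model choice are invented.

    ```python
    # Toy classification of body-sensor readings; features and labels are
    # synthetic stand-ins, not the CBIHCS data or model.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))      # e.g. glucose, heart rate, activity
    y = (X[:, 0] > 0.5).astype(int)    # synthetic "at risk" label

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
    print(f"accuracy: {clf.score(X_te, y_te):.2%}")
    ```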

  17. OpenID connect as a security service in Cloud-based diagnostic imaging systems

    NASA Astrophysics Data System (ADS)

    Ma, Weina; Sartipi, Kamran; Sharghi, Hassan; Koff, David; Bak, Peter

    2015-03-01

    The evolution of cloud computing is driving the next generation of diagnostic imaging (DI) systems. Cloud-based DI systems are able to deliver better services to patients without being constrained to their own physical facilities. However, privacy and security concerns have consistently been regarded as the major obstacle to the adoption of cloud computing in healthcare domains. Furthermore, the traditional computing models and interfaces employed by DI systems are not ready for accessing diagnostic images through mobile devices. REST is an ideal technology for provisioning both mobile services and cloud computing. OpenID Connect, combining OpenID and OAuth, is an emerging REST-based federated identity solution. It is one of the most promising open standards and may become the de facto standard for securing cloud computing and mobile applications; it has even been called the "Kerberos of the Cloud". We introduce OpenID Connect as an identity and authentication service in cloud-based DI systems and propose enhancements that allow this technology to be incorporated within a distributed enterprise environment. The objective of this study is to offer solutions for secure radiology image sharing among a DI-r (Diagnostic Imaging Repository), heterogeneous PACS (Picture Archiving and Communication Systems), and mobile clients in the cloud ecosystem. By using OpenID Connect as an open-source identity and authentication service, deploying a DI-r and PACS to private or community clouds should achieve a security level equivalent to that of the traditional computing model.
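
    For concreteness, the heart of the OpenID Connect authorization-code flow is a single request to the provider's token endpoint; the sketch below uses the requests library, and the endpoint URL, client identifiers, and redirect URI are placeholders.

    ```python
    # Exchange an authorization code for OpenID Connect tokens. The URL,
    # client credentials, and redirect URI are placeholders.
    import requests

    resp = requests.post(
        "https://idp.example.org/connect/token",
        data={
            "grant_type": "authorization_code",
            "code": "AUTH_CODE_FROM_REDIRECT",
            "redirect_uri": "https://pacs.example.org/callback",
            "client_id": "di-r-client",
            "client_secret": "CLIENT_SECRET",
        },
        timeout=10,
    )
    tokens = resp.json()
    id_token = tokens["id_token"]          # JWT asserting the user's identity
    access_token = tokens["access_token"]  # presented to DI-r / PACS APIs
    ```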

  18. Research on the application in disaster reduction for using cloud computing technology

    NASA Astrophysics Data System (ADS)

    Tao, Liang; Fan, Yida; Wang, Xingling

    Cloud Computing technology has been rapidly applied in different domains recently, promoting the progress of informatization in those domains. Based on an analysis of the application requirements in disaster reduction and the characteristics of Cloud Computing technology, we present research on the application of Cloud Computing technology in disaster reduction. First of all, we give the architecture of the disaster reduction cloud, which consists of disaster reduction Infrastructure as a Service (IaaS), a disaster reduction cloud application Platform as a Service (PaaS) and disaster reduction Software as a Service (SaaS). Secondly, we discuss the standard system of disaster reduction in five aspects. Thirdly, we describe the security system of the disaster reduction cloud. Finally, we conclude that the use of cloud computing technology will help solve the problems of disaster reduction and promote its development.

  19. Smart learning services based on smart cloud computing.

    PubMed

    Kim, Svetlana; Song, Su-Mi; Yoon, Yong-Ik

    2011-01-01

    Context-aware technologies can make e-learning services smarter and more efficient since context-aware services are based on the user's behavior. To add those technologies into existing e-learning services, a service architecture model is needed to transform the existing e-learning environment, which is situation-aware, into an environment that understands context as well. The context-awareness in e-learning may include awareness of the user profile and terminal context. In this paper, we propose a new notion of service that provides context-awareness to smart learning content in a cloud computing environment. We suggest the elastic four smarts (E4S)--smart pull, smart prospect, smart content, and smart push--concept to the cloud services so smart learning services are possible. The E4S focuses on meeting users' needs by collecting and analyzing their behavior, prospecting future services, building corresponding contents, and delivering the contents through the cloud computing environment. Users' behavior can be collected through mobile devices such as smart phones that have built-in sensors. As a result, the proposed smart e-learning model in the cloud computing environment provides personalized and customized learning services to its users.

  20. Smart Learning Services Based on Smart Cloud Computing

    PubMed Central

    Kim, Svetlana; Song, Su-Mi; Yoon, Yong-Ik

    2011-01-01

    Context-aware technologies can make e-learning services smarter and more efficient since context-aware services are based on the user’s behavior. To add those technologies into existing e-learning services, a service architecture model is needed to transform the existing e-learning environment, which is situation-aware, into an environment that understands context as well. The context-awareness in e-learning may include awareness of the user profile and terminal context. In this paper, we propose a new notion of service that provides context-awareness to smart learning content in a cloud computing environment. We suggest the elastic four smarts (E4S)—smart pull, smart prospect, smart content, and smart push—concept to the cloud services so smart learning services are possible. The E4S focuses on meeting users’ needs by collecting and analyzing their behavior, prospecting future services, building corresponding contents, and delivering the contents through the cloud computing environment. Users’ behavior can be collected through mobile devices such as smart phones that have built-in sensors. As a result, the proposed smart e-learning model in the cloud computing environment provides personalized and customized learning services to its users. PMID:22164048

  1. Service Migration from Cloud to Multi-tier Fog Nodes for Multimedia Dissemination with QoE Support

    PubMed Central

    Camargo, João; Rochol, Juergen; Gerla, Mario

    2018-01-01

    A wide range of multimedia services is expected to be offered to mobile users via various wireless access networks. Even the integration of Cloud Computing in such networks does not support an adequate Quality of Experience (QoE) in areas with high demand for multimedia content. Fog computing has been conceptualized to facilitate the deployment of new services that cloud computing cannot provide, particularly those demanding QoE guarantees. These services are provided using fog nodes located at the network edge, which are capable of virtualizing their functions/applications. Service migration from the cloud to fog nodes can be actuated by request patterns and timing issues. To the best of our knowledge, existing works on fog computing focus on architecture and fog node deployment issues. In this article, we describe the operational impacts and benefits associated with service migration from the cloud to multi-tier fog computing for video distribution with QoE support. We also evaluate such a migration for video services. Finally, we present potential research challenges and trends. PMID:29364172

  2. Service Migration from Cloud to Multi-tier Fog Nodes for Multimedia Dissemination with QoE Support.

    PubMed

    Rosário, Denis; Schimuneck, Matias; Camargo, João; Nobre, Jéferson; Both, Cristiano; Rochol, Juergen; Gerla, Mario

    2018-01-24

    A wide range of multimedia services is expected to be offered to mobile users via various wireless access networks. Even the integration of Cloud Computing in such networks does not support an adequate Quality of Experience (QoE) in areas with high demand for multimedia content. Fog computing has been conceptualized to facilitate the deployment of new services that cloud computing cannot provide, particularly those demanding QoE guarantees. These services are provided using fog nodes located at the network edge, which are capable of virtualizing their functions/applications. Service migration from the cloud to fog nodes can be actuated by request patterns and timing issues. To the best of our knowledge, existing works on fog computing focus on architecture and fog node deployment issues. In this article, we describe the operational impacts and benefits associated with service migration from the cloud to multi-tier fog computing for video distribution with QoE support. We also evaluate such a migration for video services. Finally, we present potential research challenges and trends.
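
    The migration trigger that the article alludes to ("actuated by request patterns and timing") can be sketched as a sliding-window threshold rule; the window length and threshold below are illustrative assumptions, not values from the article.

    ```python
    # Toy migration trigger: start moving a video service from the cloud to
    # a fog tier once recent demand crosses a threshold. Values are invented.
    import time
    from collections import deque

    WINDOW_S = 60      # sliding-window length in seconds
    THRESHOLD = 100    # requests per window that justify migration

    request_log = deque()

    def on_request(now=None):
        """Record one request; return True when migration should start."""
        now = time.time() if now is None else now
        request_log.append(now)
        while request_log and request_log[0] < now - WINDOW_S:
            request_log.popleft()
        return len(request_log) >= THRESHOLD
    ```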

  3. Security Architecture and Protocol for Trust Verifications Regarding the Integrity of Files Stored in Cloud Services.

    PubMed

    Pinheiro, Alexandre; Dias Canedo, Edna; de Sousa Junior, Rafael Timoteo; de Oliveira Albuquerque, Robson; García Villalba, Luis Javier; Kim, Tai-Hoon

    2018-03-02

    Cloud computing is considered an interesting paradigm due to its scalability, availability and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client point-of-view and to implement this CSS in public clouds since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should trust their data to be in the cloud for a long period of time, without the burden of keeping copies of the original data, nor of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance.
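
    The general flavor of such integrity verification, without the burden of keeping a full copy, can be illustrated with per-block keyed hashes: the owner stores only small tags at upload time and later spot-checks random blocks. This is a sketch of the generic technique, not the authors' exact protocol.

    ```python
    # Sketch of spot-check integrity auditing with per-block HMAC tags.
    # The owner keeps only the key and the tags, never a full data copy.
    import hashlib
    import hmac
    import random

    BLOCK = 4096

    def tag_blocks(path, key):
        """At upload time: one small MAC per block of the file."""
        tags = []
        with open(path, "rb") as f:
            while chunk := f.read(BLOCK):
                tags.append(hmac.new(key, chunk, hashlib.sha256).digest())
        return tags

    def audit(tags, key, fetch_block, rounds=3):
        """Later: fetch a few random blocks from the service and re-check."""
        for i in random.sample(range(len(tags)), k=min(rounds, len(tags))):
            tag = hmac.new(key, fetch_block(i), hashlib.sha256).digest()
            if not hmac.compare_digest(tag, tags[i]):
                return False  # service no longer holds the data intact
        return True
    ```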

  4. Security Architecture and Protocol for Trust Verifications Regarding the Integrity of Files Stored in Cloud Services †

    PubMed Central

    2018-01-01

    Cloud computing is considered an interesting paradigm due to its scalability, availability and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client point-of-view and to implement this CSS in public clouds since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should trust their data to be in the cloud for a long period of time, without the burden of keeping copies of the original data, nor of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance. PMID:29498641

  5. Military clouds: utilization of cloud computing systems at the battlefield

    NASA Astrophysics Data System (ADS)

    Süleyman, Sarıkürk; Volkan, Karaca; İbrahim, Kocaman; Ahmet, Şirzai

    2012-05-01

    Cloud computing is known as a novel information technology (IT) concept, which involves facilitated and rapid access to networks, servers, data storage media, applications and services via the Internet with minimum hardware requirements. The use of information systems and technologies on the battlefield is not new. Information superiority is a force multiplier and is crucial to mission success. Recent advances in information systems and technologies provide new means for decision makers and users to gain information superiority. These developments in information technologies have led to a new term, known as network centric capability. Similar to network centric capable systems, cloud computing systems are operational today. In the near future, extensive use of military clouds on the battlefield is predicted. Integrating cloud computing logic into network centric applications will increase the flexibility, cost-effectiveness, efficiency and accessibility of network-centric capabilities. In this paper, cloud computing and network centric capability concepts are defined. Some commercial cloud computing products and applications are mentioned. Network centric capable applications are covered. Cloud computing supported battlefield applications are analyzed. The effects of cloud computing systems on network centric capability and on the information domain in future warfare are discussed. Battlefield opportunities and novelties which might be introduced to network centric capability by cloud computing systems are researched. The role of military clouds in future warfare is proposed in this paper. It is concluded that military clouds will be indispensable components of the future battlefield. Military clouds have the potential to improve network centric capabilities, increase situational awareness on the battlefield and facilitate the establishment of information superiority.

  6. Secure Cloud-Based Solutions for Different eHealth Services in Spanish Rural Health Centers.

    PubMed

    de la Torre-Díez, Isabel; Lopez-Coronado, Miguel; Garcia-Zapirain Soto, Begonya; Mendez-Zorrilla, Amaia

    2015-07-27

    The combination of eHealth applications and/or services with cloud technology provides health care staff with sufficient mobility and accessibility to transparently check any data they may need without having to worry about its physical location. The main aim of this paper is to put forward secure cloud-based solutions for a range of eHealth services such as electronic health records (EHRs), telecardiology, teleconsultation, and telediagnosis. The scenario chosen for introducing the services is a set of four rural health centers located within the same Spanish region. iCanCloud software was used to perform simulations in the proposed scenario. We chose online traffic and the cost per unit in terms of time as the parameters for choosing the secure solution on the optimal cloud for each service. We suggest that load balancers always be fitted for all solutions in communication, together with several Internet service providers, and that smartcards be used to maintain identity to an appropriate extent. The solutions offered via private cloud for EHR, teleconsultation, and telediagnosis services require a volume of online traffic calculated at being able to reach 2 Gbps per consultation. This may entail an average cost of €500/month. The security solutions put forward for each eHealth service constitute an attempt to centralize all information on the cloud, thus offering greater accessibility to medical information in the case of EHRs alongside more reliable diagnoses and treatment for telecardiology, telediagnosis, and teleconsultation services. Therefore, better health care for the rural patient can be obtained at a reasonable cost.

  7. A cloud system for mobile medical services of traditional Chinese medicine.

    PubMed

    Hu, Nian-Ze; Lee, Chia-Ying; Hou, Mark C; Chen, Ying-Ling

    2013-12-01

    Many medical centers in Taiwan have started to provide Traditional Chinese Medicine (TCM) services for hospitalized patients. Due to the complexity of the TCM modality and the increasing need to provide TCM services for patients in different wards at widely separated locations within the hospital, it is becoming difficult to manage the service in the traditional way. A computerized system with mobile capability can therefore provide a practical solution to this challenge. This study develops a cloud system equipped with mobile devices to integrate electronic medical records, facilitate communication between medical workers, and improve the quality of TCM services for hospitalized patients in a medical center. The system developed in the study includes mobile devices running the Android operating system and a PC acting as a cloud server. All the devices use the same TCM management system developed by the study. A website database is set up for information sharing. The cloud system allows users to access and update patients' medical information, which is of great help to medical workers in verifying patients' identities and giving proper treatments to patients. The information can then be wirelessly transmitted between medical personnel through the cloud system. Several quantitative and qualitative evaluation indexes are developed to measure the effectiveness of the cloud system on the quality of the TCM service. The cloud system is tested and verified on a sample of hospitalized patients receiving acupuncture treatment at the Lukang Branch of Changhua Christian Hospital (CCH) in Taiwan. The result shows a great improvement in the operating efficiency of the TCM service, in that a significant saving in labor time can be attributed to the cloud system. In addition, the cloud system makes it easy to confirm a patient's identity by taking a picture of the patient upon any medical treatment. The result also shows that the cloud system achieves significant improvement in the acupuncture treatment: all acupuncture needles can now be removed at the time they are expected to be removed. Furthermore, through the cloud system, medical workers can access and update patients' medical information on-site, which provides a means of effective communication between medical workers. These functions allow us to make the most of the portability of the acupuncture service. The result shows that the contribution made by the cloud system to the TCM service is multi-dimensional: it is cost-effective, environmentally protective, and performance-enhancing. Developing and implementing such a cloud system for the TCM service in Taiwan symbolizes a pioneering effort. We believe that the work done here can serve as a stepping-stone toward advancing the quality of the TCM service in the future.

  8. Legal issues in clouds: towards a risk inventory.

    PubMed

    Djemame, Karim; Barnitzke, Benno; Corrales, Marcelo; Kiran, Mariam; Jiang, Ming; Armstrong, Django; Forgó, Nikolaus; Nwankwo, Iheanyi

    2013-01-28

    Cloud computing technologies have reached a high level of development, yet a number of obstacles still exist that must be overcome before widespread commercial adoption can become a reality. In a cloud environment, end users requesting services and cloud providers negotiate service-level agreements (SLAs) that provide explicit statements of all expectations and obligations of the participants. If cloud computing is to experience widespread commercial adoption, then incorporating risk assessment techniques is essential during SLA negotiation and service operation. This article focuses on the legal issues surrounding risk assessment in cloud computing. Specifically, it analyses risk regarding data protection and security, and presents the requirements of an inherent risk inventory. The usefulness of such a risk inventory is described in the context of the OPTIMIS project.

  9. A Secure and Efficient Audit Mechanism for Dynamic Shared Data in Cloud Storage

    PubMed Central

    2014-01-01

    With the popularization of cloud services, multiple users easily share and update their data through cloud storage. To ensure data integrity and consistency in cloud storage, audit mechanisms have been proposed. However, existing approaches have some security vulnerabilities and incur substantial computational overhead. This paper proposes a secure and efficient audit mechanism for dynamic shared data in cloud storage. The proposed scheme prevents a malicious cloud service provider from deceiving an auditor. Moreover, it devises a new index table management method and reduces the auditing cost by employing less complex operations. We prove its resistance to several attacks and show lower computation cost and shorter auditing time compared with conventional approaches. The results show that the proposed scheme is secure and efficient for cloud storage services managing dynamic shared data. PMID:24959630

  10. A secure and efficient audit mechanism for dynamic shared data in cloud storage.

    PubMed

    Kwon, Ohmin; Koo, Dongyoung; Shin, Yongjoo; Yoon, Hyunsoo

    2014-01-01

    With the popularization of cloud services, multiple users easily share and update their data through cloud storage. To ensure data integrity and consistency in cloud storage, audit mechanisms have been proposed. However, existing approaches have some security vulnerabilities and incur substantial computational overhead. This paper proposes a secure and efficient audit mechanism for dynamic shared data in cloud storage. The proposed scheme prevents a malicious cloud service provider from deceiving an auditor. Moreover, it devises a new index table management method and reduces the auditing cost by employing less complex operations. We prove its resistance to several attacks and show lower computation cost and shorter auditing time compared with conventional approaches. The results show that the proposed scheme is secure and efficient for cloud storage services managing dynamic shared data.

  11. Virtual Business Operating Environment in the Cloud: Conceptual Architecture and Challenges

    NASA Astrophysics Data System (ADS)

    Nezhad, Hamid R. Motahari; Stephenson, Bryan; Singhal, Sharad; Castellanos, Malu

    Advances in service oriented architecture (SOA) have brought us close to the once imaginary vision of establishing and running a virtual business, a business in which most or all of its business functions are outsourced to online services. Cloud computing offers a realization of SOA in which IT resources are offered as services that are more affordable, flexible and attractive to businesses. In this paper, we briefly study advances in cloud computing, and discuss the benefits of using cloud services for businesses and trade-offs that they have to consider. We then present 1) a layered architecture for the virtual business, and 2) a conceptual architecture for a virtual business operating environment. We discuss the opportunities and research challenges that are ahead of us in realizing the technical components of this conceptual architecture. We conclude by giving the outlook and impact of cloud services on both large and small businesses.

  12. Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane

    2012-01-01

    Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, disaster response requires rapid access to large data volumes, substantial storage space and high performance processing capability. The processing and distribution of this data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation covers work being conducted by the Applied Sciences Program Office at NASA Stennis Space Center. A prototypical set of image manipulation and transformation processes that incorporate sample Unmanned Airborne System data were developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing open source process code developed on a local prototype platform, and then transitioning this code with its associated environment requirements into an analogous, but memory and processor enhanced, cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar data processes. Cloud infrastructure service providers were evaluated by taking these locally tested processing functions and then applying them to a given cloud-enabled infrastructure to assess and compare environment setup options and enabled technologies. This project reviews findings observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
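
    The vegetation and moisture indices named above are simple per-pixel band arithmetic, which is what makes them such good candidates for bulk cloud processing. For example, NDVI is (NIR - Red) / (NIR + Red); the sketch below uses synthetic stand-ins for the two bands.

    ```python
    # NDVI computed per pixel from red and near-infrared reflectance bands.
    # The tiny arrays are synthetic stand-ins for real imagery.
    import numpy as np

    red = np.array([[0.10, 0.20], [0.30, 0.40]])
    nir = np.array([[0.50, 0.60], [0.35, 0.45]])

    ndvi = (nir - red) / (nir + red + 1e-9)  # epsilon avoids division by zero
    print(ndvi)  # values near +1 indicate dense, healthy vegetation
    ```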

  13. The AppScale Cloud Platform

    PubMed Central

    Krintz, Chandra

    2013-01-01

    AppScale is an open source distributed software system that implements a cloud platform as a service (PaaS). AppScale makes cloud applications easy to deploy and scale over disparate cloud fabrics, implementing a set of APIs and architecture that also makes apps portable across the services they employ. AppScale is API-compatible with Google App Engine (GAE) and thus executes GAE applications on-premise or over other cloud infrastructures, without modification. PMID:23828721
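
    Because AppScale is API-compatible with Google App Engine, the same handler code runs on either platform without modification; a minimal sketch using webapp2, the framework shipped with the Python 2.7-era GAE runtime.

    ```python
    # Minimal Google App Engine / AppScale request handler (webapp2, as
    # shipped with the Python 2.7-era GAE runtime). The same code deploys
    # to GAE or to an on-premise AppScale installation unchanged.
    import webapp2

    class MainPage(webapp2.RequestHandler):
        def get(self):
            self.response.headers["Content-Type"] = "text/plain"
            self.response.write("Hello from GAE or AppScale")

    app = webapp2.WSGIApplication([("/", MainPage)], debug=True)
    ```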

  14. A Novel Market-Oriented Dynamic Collaborative Cloud Service Platform

    NASA Astrophysics Data System (ADS)

    Hassan, Mohammad Mehedi; Huh, Eui-Nam

    In today's world, the emerging Cloud computing (Weiss, 2007) offers a new computing model where resources such as computing power, storage, online applications and networking infrastructures can be shared as "services" over the Internet. Cloud providers (CPs) are incentivized by the profits to be made by charging consumers for accessing these services. Consumers, such as enterprises, are attracted by the opportunity to reduce or eliminate costs associated with "in-house" provision of these services.

  15. Can Clouds replace Grids? Will Clouds replace Grids?

    NASA Astrophysics Data System (ADS)

    Shiers, J. D.

    2010-04-01

    The world's largest scientific machine - comprising dual 27 km circular proton accelerators cooled to 1.9 K and located some 100 m underground - currently relies on major production Grid infrastructures for the offline computing needs of the 4 main experiments that will take data at this facility. After many years of sometimes difficult preparation the computing service has been declared "open" and ready to meet the challenges that will come shortly when the machine restarts in 2009. But the service is not without its problems: reliability - as seen by the experiments, as opposed to that measured by the official tools - still needs to be significantly improved. Prolonged downtimes or degradations of major services or even complete sites are still too common and the operational and coordination effort to keep the overall service running is probably not sustainable at this level. Recently "Cloud Computing" - in terms of pay-per-use fabric provisioning - has emerged as a potentially viable alternative but with rather different strengths and no doubt weaknesses too. Based on the concrete needs of the LHC experiments - where the total data volume that will be acquired over the full lifetime of the project, including the additional data copies that are required by the Computing Models of the experiments, approaches 1 Exabyte - we analyze the pros and cons of Grids versus Clouds. This analysis covers not only technical issues - such as those related to demanding database and data management needs - but also sociological aspects, which cannot be ignored, neither in terms of funding nor in the wider context of the essential but often overlooked role of science in society, education and economy.

  16. Model-as-a-service (MaaS) using the cloud service innovation platform (CSIP)

    USDA-ARS?s Scientific Manuscript database

    Cloud infrastructures for modelling activities such as data processing, performing environmental simulations, or conducting model calibrations/optimizations provide a cost effective alternative to traditional high performance computing approaches. Cloud-based modelling examples emerged into the more...

  17. Educational Cloud Services and the Mathematics Confidence, Affective Engagement, and Behavioral Engagement of Mathematics Education Students in Public University in Benue State, Nigeria

    ERIC Educational Resources Information Center

    Iji, Clement Onwu; Abah, Joshua Abah; Anyor, Joseph Wuave

    2018-01-01

    This study investigated the impact of cloud services on mathematics education students' mathematics confidence, affective engagement, and behavioral engagement in public universities in Benue State, Nigeria. Ex-post facto research design was adopted for the study. The instrument for the study was the researcher-developed Cloud Services Mathematics…

  18. Impact of Cloud Services on Students' Attitude towards Mathematics Education in Public Universities in Benue State, Nigeria

    ERIC Educational Resources Information Center

    Iji, Clement Onwu; Abah, Joshua Abah; Anyor, Joseph Wuave

    2017-01-01

    This study focused on the impact of cloud services on students' attitude towards mathematics education in public universities in Benue State, Nigeria. Ex-post facto research design was adopted for the study. The instrument for the study is the researcher-developed Cloud Service Impact Questionnaire--CSIQ (Cronbach Alpha Coefficient = 0.92). The…

  19. Services for Emodnet-Chemistry Data Products

    NASA Astrophysics Data System (ADS)

    Santinelli, Giorgio; Hendriksen, Gerrit; Barth, Alexander

    2016-04-01

    In the framework of the Emodnet Chemistry lot, data products from regional leaders were made available in order to transform the information into a database. This has been done by using functions and scripts that read so-called enriched ODV files and insert the data directly into a cloud relational geodatabase. The main table is the observations table, which contains the main data and metadata associated with the enriched ODV files. A particular data-loading implementation is used in order to improve on-the-fly computational speed. Data from the Baltic Sea, North Sea, Mediterranean, Black Sea and part of the Atlantic region have been entered into the geodatabase and are consequently instantly available from the OceanBrowser Emodnet portal. Furthermore, Deltares has developed an application that provides additional visualisation services for the aggregated and validated data collections. The visualisations are produced by making use of part of the OpenEarth tool stack (http://www.openearth.eu), by the integration of Web Feature Services and by the implementation of Web Processing Services. The goal is the generation of server-side plots of timeseries, profiles, timeprofiles and maps of selected parameters from data sets of selected stations. Regional data collections are retrieved using the Emodnet Chemistry cloud relational geodatabase. The spatial resolution in time and the intensity of data availability for selected parameters are shown using Web Service requests via the OceanBrowser Emodnet Web portal. OceanBrowser also shows station reference codes, which are used to establish a link to additional metadata, further data shopping and download.
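
    The Web Feature Service integration mentioned above follows the standard OGC request pattern; a sketch of a GetFeature call, where the endpoint and layer name are placeholders rather than the actual Emodnet Chemistry service addresses.

    ```python
    # Standard OGC WFS 2.0 GetFeature request; the endpoint URL, layer
    # name, and output format are placeholders.
    import requests

    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": "chemistry:stations",   # illustrative layer name
        "outputFormat": "application/json",
        "count": 10,
    }
    resp = requests.get("https://example.org/geoserver/wfs", params=params)
    for feature in resp.json().get("features", []):
        print(feature["id"], feature["properties"])
    ```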

  20. GES DISC Data Recipes in Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Li, A.; Banavige, B.; Garimella, K.; Rice, J.; Shen, S.; Liu, Z.

    2017-12-01

    The Earth Science Data and Information System (ESDIS) Project manages twelve Distributed Active Archive Centers (DAACs) which are geographically dispersed across the United States. The DAACs are responsible for ingesting, processing, archiving, and distributing Earth science data produced from various sources (satellites, aircraft, field measurements, etc.). In response to projections of an exponential increase in data production, there has been a recent effort to prototype various DAAC activities in the cloud computing environment. This, in turn, led to the creation of an initiative, called the Cloud Analysis Toolkit to Enable Earth Science (CATEES), to develop a Python software package in order to transition Earth science data processing to the cloud. This project, in particular, supports CATEES and has two primary goals. One, transition data recipes created by the Goddard Earth Science Data and Information Service Center (GES DISC) DAAC into an interactive and educational environment using Jupyter Notebooks. Two, acclimate Earth scientists to cloud computing. To accomplish these goals, we create Jupyter Notebooks to compartmentalize the different steps of data analysis and help users obtain and parse data from the command line. We also develop a Docker container, comprised of Jupyter Notebooks, Python library dependencies, and command line tools, and configure it into an easy to deploy package. The end result is an end-to-end product that simulates the use case of end users working in the cloud computing environment.
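
    A data-recipe cell of the kind described usually reduces to fetching a granule over HTTP and opening it with a standard reader; the URL and variable name in the sketch below are placeholders, not an actual GES DISC granule.

    ```python
    # Minimal "data recipe" cell: download a granule, then inspect it.
    # The URL and the commented variable name are placeholders.
    import netCDF4
    import requests

    url = "https://example.org/data/sample_granule.nc4"
    with open("granule.nc4", "wb") as f:
        f.write(requests.get(url, timeout=60).content)

    with netCDF4.Dataset("granule.nc4") as ds:
        print(list(ds.variables))  # see what the file contains
        # print(ds.variables["precipitation"][:].mean())  # illustrative
    ```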

  1. An Efficient Virtual Machine Consolidation Scheme for Multimedia Cloud Computing.

    PubMed

    Han, Guangjie; Que, Wenhui; Jia, Gangyong; Shu, Lei

    2016-02-18

    Cloud computing has innovated the IT industry in recent years, as it can deliver subscription-based services to users in the pay-as-you-go model. Meanwhile, multimedia cloud computing is emerging, based on cloud computing, to provide a variety of media services on the Internet. However, with the growing popularity of multimedia cloud computing, its large energy consumption not only contributes to greenhouse gas emissions but also raises cloud users' costs. Therefore, multimedia cloud providers should try to minimize their energy consumption as much as possible while satisfying consumers' resource requirements and guaranteeing quality of service (QoS). In this paper, we propose a remaining utilization-aware (RUA) algorithm for virtual machine (VM) placement and a power-aware (PA) algorithm to find proper hosts to shut down for energy saving. These two algorithms are combined and applied to cloud data centers to complete the process of VM consolidation. Simulation results show that there exists a trade-off between the cloud data center's energy consumption and service-level agreement (SLA) violations. Besides, the RUA algorithm is able to deal with variable workloads to prevent hosts from overloading after VM placement and to reduce SLA violations dramatically.
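
    The abstract does not spell the algorithms out, but a remaining-utilization-aware placement can be sketched as choosing, for each VM, the host that still fits it while keeping the most headroom against overload; the capacities and safety margin below are invented for the sketch.

    ```python
    # Toy remaining-utilization-aware VM placement. Capacities, loads, and
    # the overload margin are invented; this is not the paper's RUA algorithm.
    HEADROOM = 0.1  # keep 10% of capacity free to absorb load spikes

    def place(vm_demand, hosts):
        """hosts: dicts with 'capacity' and 'used'; returns the chosen host."""
        candidates = [
            h for h in hosts
            if h["used"] + vm_demand <= h["capacity"] * (1 - HEADROOM)
        ]
        if not candidates:
            return None  # no fit: a sleeping host would have to be powered on
        best = max(candidates,
                   key=lambda h: h["capacity"] - h["used"] - vm_demand)
        best["used"] += vm_demand
        return best

    hosts = [{"capacity": 1.0, "used": 0.6}, {"capacity": 1.0, "used": 0.2}]
    print(place(0.3, hosts))  # picks the lightly loaded second host
    ```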

  2. An Efficient Virtual Machine Consolidation Scheme for Multimedia Cloud Computing

    PubMed Central

    Han, Guangjie; Que, Wenhui; Jia, Gangyong; Shu, Lei

    2016-01-01

    Cloud computing has innovated the IT industry in recent years, as it can deliver subscription-based services to users in the pay-as-you-go model. Meanwhile, multimedia cloud computing is emerging, based on cloud computing, to provide a variety of media services on the Internet. However, with the growing popularity of multimedia cloud computing, its large energy consumption not only contributes to greenhouse gas emissions but also raises cloud users’ costs. Therefore, multimedia cloud providers should try to minimize their energy consumption as much as possible while satisfying consumers’ resource requirements and guaranteeing quality of service (QoS). In this paper, we propose a remaining utilization-aware (RUA) algorithm for virtual machine (VM) placement and a power-aware (PA) algorithm to find proper hosts to shut down for energy saving. These two algorithms are combined and applied to cloud data centers to complete the process of VM consolidation. Simulation results show that there exists a trade-off between the cloud data center’s energy consumption and service-level agreement (SLA) violations. Besides, the RUA algorithm is able to deal with variable workloads to prevent hosts from overloading after VM placement and to reduce SLA violations dramatically. PMID:26901201

  3. Mobile Learning on the Basis of the Cloud Services

    ERIC Educational Resources Information Center

    Makarchuk, Tatyana

    2017-01-01

    The spread of interactive applications for mobile devices became one of the trends of IT development in 2015-2017. In higher education, mobile applications are being used to advance the productivity of professors and students, which raises the overall quality of education. In the article, the SkyDrive and GoogleDisk mobile applications' features for group…

  4. Secure Cloud-Based Solutions for Different eHealth Services in Spanish Rural Health Centers

    PubMed Central

    2015-01-01

    Background: The combination of eHealth applications and/or services with cloud technology provides health care staff with sufficient mobility and accessibility to transparently check any data they may need without having to worry about its physical location. Objective: The main aim of this paper is to put forward secure cloud-based solutions for a range of eHealth services such as electronic health records (EHRs), telecardiology, teleconsultation, and telediagnosis. Methods: The scenario chosen for introducing the services is a set of four rural health centers located within the same Spanish region. iCanCloud software was used to perform simulations in the proposed scenario. We chose online traffic and the cost per unit in terms of time as the parameters for choosing the secure solution on the optimal cloud for each service. Results: We suggest that load balancers always be fitted for all solutions in communication, together with several Internet service providers, and that smartcards be used to maintain identity to an appropriate extent. The solutions offered via private cloud for EHRs, teleconsultation, and telediagnosis services require a volume of online traffic calculated at being able to reach 2 Gbps per consultation. This may entail an average cost of €500/month. Conclusions: The security solutions put forward for each eHealth service constitute an attempt to centralize all information on the cloud, thus offering greater accessibility to medical information in the case of EHRs alongside more reliable diagnoses and treatment for telecardiology, telediagnosis, and teleconsultation services. Therefore, better health care for the rural patient can be obtained at a reasonable cost. PMID:26215155

  5. BlueSky Cloud Framework: An E-Learning Framework Embracing Cloud Computing

    NASA Astrophysics Data System (ADS)

    Dong, Bo; Zheng, Qinghua; Qiao, Mu; Shu, Jian; Yang, Jie

    Currently, E-Learning has grown into a widely accepted way of learning. With the huge growth in users, services, education contents and resources, E-Learning systems face the challenges of optimizing resource allocation, dealing with dynamic concurrency demands, handling rapid storage growth requirements and controlling costs. In this paper, an E-Learning framework based on cloud computing, namely the BlueSky cloud framework, is presented. In particular, the architecture and core components of the BlueSky cloud framework are introduced. In the BlueSky cloud framework, physical machines are virtualized and allocated on demand for E-Learning systems. Moreover, the BlueSky cloud framework combines traditional middleware functions (such as load balancing and data caching) to serve E-Learning systems as a general architecture. It delivers reliable, scalable and cost-efficient services to E-Learning systems, and E-Learning organizations can establish systems through these services in a simple way. The BlueSky cloud framework solves the challenges faced by E-Learning and improves the performance, availability and scalability of E-Learning systems.

  6. A Location-Based Interactive Model of Internet of Things and Cloud (IoT-Cloud) for Mobile Cloud Computing Applications.

    PubMed

    Dinh, Thanh; Kim, Younghan; Lee, Hyukjoon

    2017-03-01

    This paper presents a location-based interactive model of Internet of Things (IoT) and cloud integration (IoT-cloud) for mobile cloud computing applications, in comparison with the periodic sensing model. In the latter, sensing collections are performed without awareness of sensing demands. Sensors are required to report their sensing data periodically regardless of whether or not there are demands for their sensing services. This leads to unnecessary energy loss due to redundant transmission. In the proposed model, IoT-cloud provides sensing services on demand based on the interest and location of mobile users. By taking advantage of the cloud as a coordinator, the sensing schedule of the sensors is controlled by the cloud, which knows when and where mobile users request sensing services. Therefore, when there is no demand, sensors are put into an inactive mode to save energy. Through extensive analysis and experimental results, we show that the location-based model achieves a significant improvement in network lifetime compared to the periodic model.
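
    The coordinator's core decision, activating only the sensors near a user who has requested a service, can be sketched as a simple distance filter; the coordinates, the radius, and the flat-earth distance approximation are all illustrative.

    ```python
    # Toy cloud coordinator: wake only sensors within RADIUS_M of the
    # requesting user; all values and the distance formula are illustrative.
    import math

    RADIUS_M = 500

    def distance_m(a, b):
        """Rough planar distance in metres between (lat, lon) pairs."""
        dlat = (a[0] - b[0]) * 111_000
        dlon = (a[1] - b[1]) * 111_000 * math.cos(math.radians(a[0]))
        return math.hypot(dlat, dlon)

    def schedule(user_pos, sensors):
        """Return ids of sensors to activate; the rest stay inactive."""
        return [s["id"] for s in sensors
                if distance_m(user_pos, s["pos"]) <= RADIUS_M]

    sensors = [{"id": "s1", "pos": (37.5665, 126.9780)},
               {"id": "s2", "pos": (37.6000, 127.0200)}]
    print(schedule((37.5670, 126.9785), sensors))  # ['s1']
    ```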

  7. A Location-Based Interactive Model of Internet of Things and Cloud (IoT-Cloud) for Mobile Cloud Computing Applications †

    PubMed Central

    Dinh, Thanh; Kim, Younghan; Lee, Hyukjoon

    2017-01-01

    This paper presents a location-based interactive model of Internet of Things (IoT) and cloud integration (IoT-cloud) for mobile cloud computing applications, in comparison with the periodic sensing model. In the latter, sensing collections are performed without awareness of sensing demands. Sensors are required to report their sensing data periodically regardless of whether or not there are demands for their sensing services. This leads to unnecessary energy loss due to redundant transmission. In the proposed model, IoT-cloud provides sensing services on demand based on the interest and location of mobile users. By taking advantage of the cloud as a coordinator, the sensing schedule of the sensors is controlled by the cloud, which knows when and where mobile users request sensing services. Therefore, when there is no demand, sensors are put into an inactive mode to save energy. Through extensive analysis and experimental results, we show that the location-based model achieves a significant improvement in network lifetime compared to the periodic model. PMID:28257067

  8. Sentinel-1 Interferometry from the Cloud to the Scientist

    NASA Astrophysics Data System (ADS)

    Garron, J.; Stoner, C.; Johnston, A.; Arko, S. A.

    2017-12-01

    Big data problems and solutions are growing in the technological and scientific sectors daily. Cloud computing is a vertically and horizontally scalable solution available now for archiving and processing large volumes of data quickly, without significant on-site computing hardware costs. Be that as it may, the conversion of scientific data processors to these powerful platforms requires not only proof of concept, but the demonstration of credibility in an operational setting. The Alaska Satellite Facility (ASF) Distributed Active Archive Center (DAAC), in partnership with NASA's Jet Propulsion Laboratory, is exploring the functional architecture of the Amazon Web Services cloud computing environment for the processing, distribution and archival of Synthetic Aperture Radar data in preparation for the NASA-ISRO Synthetic Aperture Radar (NISAR) mission. Leveraging built-in AWS services for logging, monitoring and dashboarding, the GRFN (Getting Ready for NISAR) team has built a scalable processing, distribution and archival system for Sentinel-1 L2 interferograms produced using the ISCE algorithm. This cloud-based functional prototype provides interferograms over selected global land deformation features (volcanoes, land subsidence, seismic zones), which are accessible to scientists via NASA's EarthData Search client and the ASF DAAC's primary SAR interface, Vertex, for direct download. The interferograms are produced using nearest-neighbor logic for identifying pairs of granules for interferometric processing, creating deep stacks of BETA products from almost every satellite orbit for scientists to explore. This presentation highlights the functional lessons learned to date from this exercise, including a cost analysis of various data lifecycle policies as implemented through AWS. While demonstrating the architecture choices in support of efficient big science data management, we invite feedback and questions about the process and products from the InSAR community.
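
    The nearest-neighbor pairing logic mentioned above can be illustrated by sorting granules from one track by acquisition date and pairing each with its successor; the granule names and dates are invented.

    ```python
    # Sketch of nearest-neighbor interferometric pairing: sort granules by
    # acquisition date, then pair consecutive passes. Names/dates invented.
    from datetime import date

    granules = [
        ("S1_A", date(2017, 1, 3)),
        ("S1_C", date(2017, 1, 27)),
        ("S1_B", date(2017, 1, 15)),
    ]

    granules.sort(key=lambda g: g[1])
    for (ref, _), (sec, _) in zip(granules, granules[1:]):
        print(f"interferogram: {ref} (reference) x {sec} (secondary)")
    ```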

  9. Unidata Cyberinfrastructure in the Cloud

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.; Young, J. W.

    2016-12-01

    Data services, software, and user support are critical components of geosciences cyberinfrastructure that help researchers advance science. With the maturity of and significant advances in cloud computing, it has recently emerged as an alternative new paradigm for developing and delivering a broad array of services over the Internet. Cloud computing is now mature enough in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Given the enormous potential of cloud-based services, Unidata has been moving to align its software, services, and data delivery mechanisms with the cloud-computing paradigm. To realize the above vision, Unidata has worked toward:
    * Providing access to many types of data from a cloud (e.g., via the THREDDS Data Server, RAMADDA and EDEX servers);
    * Deploying data-proximate tools to easily process, analyze, and visualize those data in a cloud environment for consumption by anyone, on any device, from anywhere, at any time (see the sketch after this list);
    * Developing and providing a range of pre-configured and well-integrated tools and services that can be deployed by any university in their own private or public cloud settings. Specifically, Unidata has developed Docker images for "containerized applications", making them easy to deploy. Docker helps to create "disposable" installs and eliminates many configuration challenges. Containerized applications include tools for data transport, access, analysis, and visualization: THREDDS Data Server, Integrated Data Viewer, GEMPAK, Local Data Manager, RAMADDA Data Server, and Python tools;
    * Leveraging Jupyter as a central platform and hub with its powerful set of interlinking tools to interactively connect data servers, Python scientific libraries, scripts, and workflows;
    * Exploring end-to-end modeling and prediction capabilities in the cloud;
    * Partnering with NOAA and public cloud vendors (e.g., Amazon and OCC) on the NOAA Big Data Project to harness their capabilities and resources for the benefit of the academic community.
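
    As one concrete example of the data-proximate pattern above, a THREDDS Data Server exposes datasets over OPeNDAP, so a remote variable can be subset without downloading whole files. The server URL and variable name below are placeholders, and the netCDF4 library must be built with DAP support.

    ```python
    # Subset a remote dataset through a THREDDS server's OPeNDAP endpoint;
    # only the requested slice crosses the network. The URL and variable
    # name are placeholders.
    from netCDF4 import Dataset

    url = "https://thredds.example.edu/thredds/dodsC/grib/sample_dataset"
    with Dataset(url) as ds:
        temp = ds.variables["Temperature_surface"]  # illustrative variable
        print(temp.shape)
        subset = temp[0, :10, :10]  # server sends just this hyperslab
    ```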

  10. Usability evaluation of cloud-based mapping tools for the display of very large datasets

    NASA Astrophysics Data System (ADS)

    Stotz, Nicole Marie

    The elasticity and on-demand nature of cloud services have made it easier to create web maps. Users only need access to a web browser and the Internet to utilize cloud-based web maps, eliminating the need for specialized software. To encourage a wide variety of users, a map must be well designed; usability is a very important concept in designing a web map. Fusion Tables, a new product from Google, is one example of the newer cloud-based distributed GIS services. It allows for easy spatial data manipulation and visualization within the Google Maps framework. ESRI has also introduced a cloud-based version of its software, called ArcGIS Online, built on Amazon's EC2 cloud. Utilizing a user-centered design framework, two prototype maps were created with data from the San Diego East County Economic Development Council: one built on Fusion Tables and another on ESRI's ArcGIS Online. A usability analysis was conducted and used to compare both map prototypes in terms of design and functionality. Load tests were also run, and performance metrics were gathered on both map prototypes. The usability survey was completed by 25 geography students and consisted of timed tasks and questions on map design and functionality. Survey participants completed the timed tasks for the Fusion Tables map prototype more quickly than those of the ArcGIS Online map prototype. While responses were generally positive towards the design and functionality of both prototypes, overall the Fusion Tables map prototype was preferred. For the load tests, the data set was broken into 22 groups for a total of 44 tests. While the Fusion Tables map prototype performed more efficiently than the ArcGIS Online prototype, the differences were almost unnoticeable. A SWOT analysis was conducted for each prototype. The results from this research favor the Fusion Tables map prototype. A redesign of this prototype would incorporate design suggestions from the usability survey, while some functionality would need to be dropped. Fusion Tables is a free product and would therefore be the best option if cost is an issue, but the map may not be supported in the future.

  11. Opportunities and challenges of cloud computing to improve health care services.

    PubMed

    Kuo, Alex Mu-Hsing

    2011-09-21

    Cloud computing is a new way of delivering computing resources and services. Many managers and experts believe that it can improve health care services, benefit health care research, and change the face of health information technology. However, as with any innovation, cloud computing should be rigorously evaluated before its widespread adoption. This paper discusses the concept and its current place in health care, and uses 4 aspects (management, technology, security, and legal) to evaluate the opportunities and challenges of this computing model. Strategic planning that could be used by a health organization to determine its direction, strategy, and resource allocation when it has decided to migrate from traditional to cloud-based health services is also discussed.

  12. Off the Shelf Cloud Robotics for the Smart Home: Empowering a Wireless Robot through Cloud Computing.

    PubMed

    Ramírez De La Pinta, Javier; Maestre Torreblanca, José María; Jurado, Isabel; Reyes De Cozar, Sergio

    2017-03-06

    In this paper, we explore the possibilities offered by the integration of home automation systems and service robots. In particular, we examine how advanced computationally expensive services can be provided by using a cloud computing approach to overcome the limitations of the hardware available at the user's home. To this end, we integrate two wireless low-cost, off-the-shelf systems in this work, namely, the service robot Rovio and the home automation system Z-wave. Cloud computing is used to enhance the capabilities of these systems so that advanced sensing and interaction services based on image processing and voice recognition can be offered.
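
    The offloading pattern the authors describe can be pictured with a short sketch: the robot posts a camera frame to a remote service and gets the recognition result back, so the heavy computation never runs on the robot's limited hardware. The endpoint URL and response format below are hypothetical, not the services used in the paper.

      import requests  # pip install requests

      CLOUD_ENDPOINT = "https://example.com/api/recognize"  # hypothetical service

      def offload_frame(jpeg_bytes):
          """Send one camera frame to the cloud and return the recognition result."""
          resp = requests.post(
              CLOUD_ENDPOINT,
              files={"image": ("frame.jpg", jpeg_bytes, "image/jpeg")},
              timeout=5,  # the robot should not block forever on the network
          )
          resp.raise_for_status()
          return resp.json()  # e.g. {"objects": [...], "transcript": "..."}

      # Usage, with a frame grabbed from the robot's camera:
      # result = offload_frame(open("frame.jpg", "rb").read())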

  13. Off the Shelf Cloud Robotics for the Smart Home: Empowering a Wireless Robot through Cloud Computing

    PubMed Central

    Ramírez De La Pinta, Javier; Maestre Torreblanca, José María; Jurado, Isabel; Reyes De Cozar, Sergio

    2017-01-01

    In this paper, we explore the possibilities offered by the integration of home automation systems and service robots. In particular, we examine how advanced computationally expensive services can be provided by using a cloud computing approach to overcome the limitations of the hardware available at the user’s home. To this end, we integrate two wireless low-cost, off-the-shelf systems in this work, namely, the service robot Rovio and the home automation system Z-wave. Cloud computing is used to enhance the capabilities of these systems so that advanced sensing and interaction services based on image processing and voice recognition can be offered. PMID:28272305

  14. Cloudbus Toolkit for Market-Oriented Cloud Computing

    NASA Astrophysics Data System (ADS)

    Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian

    This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.

  15. A vital signs telemonitoring system - interoperability supported by a personal health record system and a cloud service.

    PubMed

    Gutiérrez, Miguel F; Cajiao, Alejandro; Hidalgo, José A; Cerón, Jesús D; López, Diego M; Quintero, Víctor M; Rendón, Alvaro

    2014-01-01

    This article presents the development process of an acquisition and data storage system that manages clinical variables through a cloud storage service and a Personal Health Record (PHR) system. First, the paper explains how a Wireless Body Area Network (WBAN) that captures data from two sensors, corresponding to arterial pressure and heart rate, is designed. Second, the paper illustrates how data collected by the WBAN are transmitted to a cloud storage service; this cloud service stores the data persistently in an online database system. Finally, the paper describes how the data stored in the cloud service are sent to the Indivo PHR system, where they are registered and charted for future review by health professionals. The research demonstrated the feasibility of implementing WBANs for the acquisition of clinical data, and particularly of using Web technologies and standards to provide interoperability with PHR systems at the technical and syntactic levels.
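
    The transmission step can be sketched as a plain JSON post of one vital-signs sample to a cloud endpoint. The URL, field names, and record layout below are hypothetical illustrations, not the actual interfaces used by the authors.

      import json
      import time
      import urllib.request

      CLOUD_URL = "https://example.com/api/vitals"  # hypothetical storage endpoint

      def push_reading(patient_id, systolic, diastolic, heart_rate):
          """Post one vital-signs sample to the cloud store as JSON."""
          record = {
              "patient": patient_id,
              "timestamp": time.time(),
              "arterial_pressure": {"systolic": systolic, "diastolic": diastolic},
              "heart_rate_bpm": heart_rate,
          }
          req = urllib.request.Request(
              CLOUD_URL,
              data=json.dumps(record).encode("utf-8"),
              headers={"Content-Type": "application/json"},
              method="POST",
          )
          with urllib.request.urlopen(req, timeout=5) as resp:
              return resp.status  # 200/201 on success

      # push_reading("patient-042", systolic=118, diastolic=76, heart_rate=68)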

  16. Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)

    NASA Astrophysics Data System (ADS)

    Nebert, D. D.; Huang, Q.; Yang, C.

    2013-12-01

    Twenty-first century geoscience faces the challenges of Big Data, spikes in computing requirements (e.g., when natural disasters happen), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution to provide core capabilities to address these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate scientific research and discovery. This presentation uses GeoCloud to illustrate the process and strategies used in building a common platform for geoscience communities to enable the sharing and integration of geospatial data, information, and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. To achieve this objective, multiple projects are nominated each year by federal agencies from existing public-facing geospatial data services. From the initial candidate projects, a set of common operating system and software requirements was identified as the baseline for platform-as-a-service (PaaS) packages. Based on these common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information. This paper presents the background, architectural design, and activities of GeoCloud in support of the Geospatial Platform Initiative. System security strategies and approval processes for migrating federal geospatial data, information, and applications into the cloud, as well as cost estimation for cloud operations, are covered. Finally, some lessons learned from the GeoCloud project are discussed as reference for geoscientists to consider in the adoption of cloud computing.

  17. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based and designed to work on a single computer, which is a major limitation in many ways, starting with limited processing power, storage, accessibility, and availability. The only feasible solution lies in the web and the cloud. This abstract presents the research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The cloud application is developed using free and open source software, open standards, and prototype code, and it presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. It serves as a collaboration platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, and works in a distributed computing environment; it creates a real-time multiuser collaboration platform; its programming-language code and components are interoperable; and it is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will focus on distributing the cloud application over additional VMs and testing the scalability and availability of the services.

  18. Modelling operations and security of cloud systems using Z-notation and Chinese Wall security policy

    NASA Astrophysics Data System (ADS)

    Basu, Srijita; Sengupta, Anirban; Mazumdar, Chandan

    2016-11-01

    Enterprises are increasingly using cloud computing to host their applications. The availability of fast Internet and cheap bandwidth is leading greater numbers of people to use cloud-based services, which offer lower costs and minimal maintenance. However, ensuring the security of user data and proper management of cloud infrastructure remain major areas of concern. Existing techniques are either too complex or fail to properly represent the actual cloud scenario. This article presents a formal cloud model using the constructs of Z-notation. Principles of the Chinese Wall security policy have been applied to design secure cloud-specific operations. The proposed methodology will enable users to safely host their services, as well as process sensitive data, on the cloud.
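
    The Chinese Wall rule at the heart of those operations fits in a few lines: a subject may access a dataset only if it has never accessed a competing dataset in the same conflict-of-interest class. The classes, dataset names, and tenants below are illustrative, and this sketch shows only the access rule, not the paper's full Z-notation model.

      # Conflict-of-interest classes mapped to their company datasets (illustrative).
      COI = {
          "banks": {"bank_a", "bank_b"},
          "oil":   {"oil_x", "oil_y"},
      }

      history = {}  # subject -> set of datasets already accessed

      def may_access(subject, dataset):
          """Grant access iff no competing dataset in the same COI class
          has previously been accessed by this subject (the 'wall')."""
          accessed = history.setdefault(subject, set())
          for members in COI.values():
              if dataset in members and accessed & (members - {dataset}):
                  return False  # a competitor was touched earlier: deny
          accessed.add(dataset)
          return True

      assert may_access("tenant1", "bank_a")      # first access: allowed
      assert not may_access("tenant1", "bank_b")  # competitor: denied
      assert may_access("tenant1", "oil_x")       # different class: allowed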

  19. A price-and-time-slot-negotiation mechanism for Cloud service reservations.

    PubMed

    Son, Seokho; Sim, Kwang Mong

    2012-06-01

    When making reservations for Cloud services, consumers and providers need to establish service-level agreements through negotiation. Whereas it is essential for both a consumer and a provider to reach an agreement on the price of a service and when to use the service, to date there is little or no negotiation support for both price and time-slot negotiations (PTNs) for Cloud service reservations. This paper presents a multi-issue negotiation mechanism to facilitate: 1) PTNs between Cloud agents and 2) tradeoffs between price and time-slot utilities. Unlike many existing negotiation mechanisms, in which a negotiation agent can only make one proposal at a time, agents in this work are designed to concurrently make multiple proposals in a negotiation round that generate the same aggregated utility, differing only in terms of individual price and time-slot utilities. Another contribution of this work is a novel time-slot utility function that characterizes preferences for different time slots. These ideas are implemented in an agent-based Cloud testbed. Using the testbed, experiments were carried out to compare this work with related approaches. Empirical results show that PTN agents reach faster agreements and achieve higher utilities than agents in other related approaches. A case study was carried out to demonstrate the application of the PTN mechanism for pricing Cloud resources.
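
    The concurrent-proposals idea can be sketched with a toy aggregated utility. The weights, the linear utility form, and the target value below are illustrative assumptions rather than the paper's actual functions; the point is that an agent can offer several (price, time-slot) combinations that aggregate to the same utility.

      W_PRICE, W_SLOT = 0.6, 0.4  # illustrative issue weights

      def proposals_at(target_utility, steps=5):
          """Enumerate (price-utility, slot-utility) pairs whose weighted sum
          equals the target, so an agent can offer them all in one round."""
          out = []
          for i in range(steps):
              u_price = i / (steps - 1)                       # sweep over [0, 1]
              u_slot = (target_utility - W_PRICE * u_price) / W_SLOT
              if 0.0 <= u_slot <= 1.0:                        # feasibility check
                  out.append((round(u_price, 3), round(u_slot, 3)))
          return out

      # Three interchangeable offers, all aggregating to 0.5:
      print(proposals_at(0.5))  # [(0.25, 0.875), (0.5, 0.5), (0.75, 0.125)]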

  20. Cloud-Based Speech Technology for Assistive Technology Applications (CloudCAST).

    PubMed

    Cunningham, Stuart; Green, Phil; Christensen, Heidi; Atria, José Joaquín; Coy, André; Malavasi, Massimiliano; Desideri, Lorenzo; Rudzicz, Frank

    2017-01-01

    The CloudCAST platform provides a series of speech recognition services that can be integrated into assistive technology applications. The platform and the services provided by the public API are described. Several exemplar applications have been developed to demonstrate the platform to potential developers and users.

  1. IoT-based flood embankments monitoring system

    NASA Astrophysics Data System (ADS)

    Michta, E.; Szulim, R.; Sojka-Piotrowska, A.; Piotrowski, K.

    2017-08-01

    This paper presents a concept for a flood embankment monitoring system based on an Internet of Things approach and cloud computing technologies. The proposed system consists of sensors, IoT nodes, Gateways, and cloud-based services. The nodes communicate with sensors that measure physical parameters describing the state of the embankments and pass the readings to the Gateways. Gateways are specialized active devices responsible for direct communication with the nodes; they collect sensor data, preprocess the data, apply local rules, and communicate with the cloud services using communication APIs delivered by cloud service providers. An architecture for all of the system components is proposed, covering the functionality of the IoT devices, their communication model, and the software modules and services based on a public cloud computing platform such as Microsoft Azure. The most important aspects of maintaining the communication in a secure way are also shown.
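
    A minimal sketch of the Gateway role described above: take a sensor reading, apply a local rule, and forward only meaningful messages to the cloud. The threshold, field names, and the publish stub are placeholders, not a specific cloud provider's API.

      PORE_PRESSURE_LIMIT = 85.0  # illustrative local-rule threshold

      def preprocess(reading):
          """Apply a local rule at the gateway before anything goes upstream."""
          return {**reading, "alert": reading["pore_pressure"] > PORE_PRESSURE_LIMIT}

      def publish_to_cloud(message):
          # Placeholder: a real gateway would call the provider's SDK here,
          # e.g. an MQTT or IoT Hub client supplied by the cloud platform.
          print("publishing:", message)

      sample = {"node": "embankment-07", "pore_pressure": 91.2, "tilt_deg": 0.4}
      processed = preprocess(sample)
      if processed["alert"]:
          publish_to_cloud(processed)  # only forward readings that matter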

  2. The cloud services innovation platform - enabling service-based environmental modelling using infrastructure-as-a-service cloud computing

    USDA-ARS?s Scientific Manuscript database

    Service-oriented architectures allow modelling engines to be hosted over the Internet, abstracting physical hardware configuration and software deployments from model users. Many existing environmental models are deployed as desktop applications running on users' personal computers (PCs). Migration ...

  3. A Survey on Personal Data Cloud

    PubMed Central

    Wang, Jiaqiu; Wang, Zhongjie

    2014-01-01

    Personal data represent the e-history of a person and are of great significance to that person, but they are essentially produced and governed by various distributed services, and a global, centralized view is lacking. In recent years, researchers have paid attention to the Personal Data Cloud (PDC), which aggregates the heterogeneous personal data scattered across different clouds into one cloud, so that a person can effectively store, acquire, and share their data. This paper makes a short survey of PDC research by summarizing related papers published in recent years. The concept, classification, and significance of personal data are elaborately introduced, and then the semantic correlation and semantic representation of personal data are discussed. A multilayer reference architecture of the PDC, including its core components and a real-world operational scenario showing how the reference architecture works, is introduced in detail. Existing commercial PDC products/prototypes are listed and compared from several perspectives. Five open issues to address the shortcomings of current PDC research are put forward. PMID:25165753

  4. A New Heuristic Anonymization Technique for Privacy Preserved Datasets Publication on Cloud Computing

    NASA Astrophysics Data System (ADS)

    Aldeen Yousra, S.; Mazleena, Salleh

    2018-05-01

    Recent advances in Information and Communication Technologies (ICT) have created great demand for cloud services that share users' private data. Data from various organizations are a vital information source for analysis and research. Generally, this sensitive or private information involves medical, census, voter registration, social network, and customer-service data. The primary concern of cloud service providers in data publishing is to hide the sensitive information of individuals. One of the cloud services that addresses these confidentiality concerns is Privacy Preserving Data Mining (PPDM). The PPDM service in Cloud Computing (CC) enables data publishing with minimized distortion and absolute privacy. In this method, datasets are anonymized via generalization to accomplish the privacy requirements. However, the well-known privacy-preserving data mining technique called K-anonymity suffers from several limitations. To surmount those shortcomings, we propose a new heuristic anonymization framework for preserving the privacy of sensitive datasets when published on the cloud. The advantages of the K-anonymity, L-diversity and (α, k)-anonymity methods for efficient information utilization and privacy protection are emphasized. Experimental results revealed that the developed technique outperforms the K-anonymity, L-diversity, and (α, k)-anonymity measures.
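
    The k-anonymity property that the framework generalizes can be checked with a few lines: every combination of quasi-identifier values must occur at least k times in the published table. The records and attribute names below are illustrative.

      from collections import Counter

      def is_k_anonymous(records, quasi_identifiers, k):
          """True iff each quasi-identifier combination appears at least k times."""
          combos = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
          return all(count >= k for count in combos.values())

      data = [  # already generalized: age ranges and masked ZIP codes
          {"age": "30-40", "zip": "476**", "disease": "flu"},
          {"age": "30-40", "zip": "476**", "disease": "asthma"},
          {"age": "20-30", "zip": "479**", "disease": "flu"},
      ]

      print(is_k_anonymous(data, ["age", "zip"], k=2))  # False: one group has size 1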

  5. Cloud Based Drive Forensic and DDoS Analysis on Seafile as Case Study

    NASA Astrophysics Data System (ADS)

    Bahaweres, R. B.; Santo, N. B.; Ningsih, A. S.

    2017-01-01

    The rapid development of the Internet, with increasing data rates through both broadband cable networks and 4G wireless mobile, makes it easy for everyone to connect to the internet. Storage as a Service (StaaS) has grown popular, as many users want to store their data in one place in the cloud so that they can access it easily anywhere, anyplace, and anytime. This also makes such services attractive for committing crimes, or a target for Denial of Service (DoS) attacks on cloud storage services. Criminals can use cloud storage services to store, upload, and download illegal files or documents. In this study, we implement a private cloud storage service using Seafile on a Raspberry Pi and perform simulations in Local Area Network and Wi-Fi environments, analyzing them forensically to show that a criminal act can be discovered, traced, and proved. We also identify, collect, and analyze the artifacts of the server and client, such as the desktop client's registry, the file system, the Seafile logs, the browser cache, and the database.

  6. If the MODIS Aerosol Product is so Infested with Cloud Contamination, Why Does Everybody Use the Product?

    NASA Technical Reports Server (NTRS)

    Remer, Lorraine A.

    2011-01-01

    The MODIS aerosol cloud mask is based on a spatial variability test, using the assumption that aerosols are more homogeneous than clouds. On top of this first line of defense is a series of additional tests based on threshold values and ratios of various MODIS channels. The goal is to eliminate clouds and keep the aerosol. How well have we succeeded? There have been several studies showing cloud contamination in the MODIS aerosol product, and several alternative cloud masks have been proposed. There are even "competing" MODIS aerosol products that offer an alternative "cloud free" world. Are these alternative products an improvement on the old standard product? We find there is a trade-off between retrieval availability and cloud contamination, and for many applications it is better to have a little bit of cloud in the product than to not have enough product. I will review the decisions that led us to the present MODIS cloud mask, show how it is simultaneously too liberal and too conservative, offer some ideas on how to make it better, and explain why in the end it doesn't matter. I hope to inspire a spirited discussion and will be very willing to take your complaints and suggestions.
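
    A schematic version of such a spatial-variability screen is easy to write down: compute the local standard deviation of reflectance in a small window and flag pixels that are too variable to be aerosol. The 3x3 window and the threshold below are illustrative, not the operational MODIS values.

      import numpy as np

      def variability_mask(reflectance, threshold=0.01):
          """Flag a pixel as cloudy when its 3x3 neighborhood is too variable."""
          rows, cols = reflectance.shape
          mask = np.zeros((rows, cols), dtype=bool)
          for i in range(1, rows - 1):
              for j in range(1, cols - 1):
                  window = reflectance[i - 1:i + 2, j - 1:j + 2]
                  mask[i, j] = window.std() > threshold  # True = likely cloud
          return mask

      scene = np.full((5, 5), 0.10)   # homogeneous aerosol background
      scene[2, 2] = 0.40              # one bright, spatially variable feature
      print(variability_mask(scene))  # the 3x3 block around (2, 2) is flagged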

  7. Sharing Planetary-Scale Data in the Cloud

    NASA Astrophysics Data System (ADS)

    Sundwall, J.; Flasher, J.

    2016-12-01

    On 19 March 2015, Amazon Web Services (AWS) announced Landsat on AWS, an initiative to make data from the U.S. Geological Survey's Landsat satellite program freely available in the cloud. Because of Landsat's global coverage and long history, it has become a reference point for all Earth observation work and is considered the gold standard of natural resource satellite imagery. Within the first year of Landsat on AWS, the service served over a billion requests for Landsat imagery and metadata, globally. Availability of the data in the cloud has led to new product development by companies and startups including Mapbox, Esri, CartoDB, MathWorks, Development Seed, Trimble, Astro Digital, Blue Raster and Timbr.io. The model of staging data for analysis in the cloud established by Landsat on AWS has since been applied to high resolution radar data, European Space Agency satellite imagery, global elevation data and EPA air quality models. This session will provide an overview of lessons learned throughout these projects. It will demonstrate how cloud-based object storage is democratizing access to massive publicly-funded data sets that have previously only been available to people with access to large amounts of storage, bandwidth, and computing power. Technical discussion points will include: the differences between staging data for analysis using object storage versus file storage; using object stores to design simple RESTful APIs through thoughtful file naming conventions, header fields, and HTTP Range Requests; managing costs through data architecture and Amazon S3's "requester pays" feature; building tools that allow users to take their algorithm to the data in the cloud; and using serverless technologies to display dynamic frontends for massive data sets.
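
    The HTTP Range Request point deserves a concrete sketch: because the imagery sits in plain HTTP object storage, a client can pull just the first kilobyte of a scene (enough for a GeoTIFF header) instead of the whole file. The URL below follows the landsat-pds naming convention of that era and is illustrative; substitute a live object URL before running.

      import urllib.request

      # Illustrative object URL following the original landsat-pds layout:
      # <bucket>/L8/<path>/<row>/<scene_id>/<scene_id>_<band>.TIF
      url = ("https://landsat-pds.s3.amazonaws.com/L8/139/045/"
             "LC81390452014295LGN00/LC81390452014295LGN00_B1.TIF")

      req = urllib.request.Request(url, headers={"Range": "bytes=0-1023"})
      with urllib.request.urlopen(req) as resp:
          header_bytes = resp.read()  # only the first 1 KB travels the wire
          print(resp.status, len(header_bytes))  # expect 206 Partial Content, 1024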

  8. Scaling up ATLAS Event Service to production levels on opportunistic computing platforms

    NASA Astrophysics Data System (ADS)

    Benjamin, D.; Caballero, J.; Ernst, M.; Guan, W.; Hover, J.; Lesny, D.; Maeno, T.; Nilsson, P.; Tsulaia, V.; van Gemmeren, P.; Vaniachine, A.; Wang, F.; Wenaus, T.; ATLAS Collaboration

    2016-10-01

    Continued growth in public cloud and HPC resources is on track to exceed the dedicated resources available for ATLAS on the WLCG. Examples of such platforms are Amazon AWS EC2 Spot Instances, Edison Cray XC30 supercomputer, backfill at Tier 2 and Tier 3 sites, opportunistic resources at the Open Science Grid (OSG), and ATLAS High Level Trigger farm between the data taking periods. Because of specific aspects of opportunistic resources such as preemptive job scheduling and data I/O, their efficient usage requires workflow innovations provided by the ATLAS Event Service. Thanks to the finer granularity of the Event Service data processing workflow, the opportunistic resources are used more efficiently. We report on our progress in scaling opportunistic resource usage to double-digit levels in ATLAS production.

  9. Making Spatial Statistics Service Accessible On Cloud Platform

    NASA Astrophysics Data System (ADS)

    Mu, X.; Wu, J.; Li, T.; Zhong, Y.; Gao, X.

    2014-04-01

    Web services can bring together applications running on diverse platforms; users can access and share various data, information, and models more effectively and conveniently from a web service platform. Cloud computing has emerged as a paradigm of Internet computing in which dynamic, scalable, and often virtualized resources are provided as services. With the rapid growth of massive data and the constraints of the network, traditional web service platforms have prominent problems such as computational efficiency, maintenance cost, and data security. In this paper, we offer a spatial statistics service based on the Microsoft cloud. An experiment was carried out to evaluate the availability and efficiency of this service. The results show that this spatial statistics service is conveniently accessible to the public and achieves high processing efficiency.

  10. Cloud Computing

    DTIC Science & Technology

    2009-11-12

    Carnegie Mellon University. Cloud computing types, classified by type of capability and by access: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), and Software-as-a-Service (SaaS). SaaS provides application-specific capabilities (e.g., a service that provides customer management) and is a model of software deployment in which a provider licenses an application to customers for use as a service on demand.

  11. Avoidable Software Procurements

    DTIC Science & Technology

    2012-09-01

    Keywords: software license, software usage, ELA, Software as a Service, SaaS, Software Asset Management. Acronyms: PaaS (Platform as a Service), SaaS (Software as a Service), SAM (Software Asset Management), SMS (System Management Server), SEWP (Solutions for Enterprise-Wide Procurement). With the delivery of full cloud services, we will see the transition of the cloud computing service model from IaaS to SaaS, or Software as a Service.

  12. cadcVOFS: A FUSE Based File System Layer for VOSpace

    NASA Astrophysics Data System (ADS)

    Kavelaars, J.; Dowler, P.; Jenkins, D.; Hill, N.; Damian, A.

    2012-09-01

    The CADC is now making extensive use of the VOSpace protocol for user-managed storage. The VOSpace standard allows a diverse set of rich data services to be delivered to users via a simple protocol. We have recently developed cadcVOFS, a FUSE-based filesystem layer for VOSpace. cadcVOFS provides a filesystem layer on top of VOSpace so that standard Unix tools (such as ‘find’, ‘emacs’, ‘awk’, etc.) can be used directly on the data objects stored in VOSpace. Once mounted, the VOSpace appears as a network storage volume inside the operating system. Within the CADC Cloud Computing project (CANFAR) we have used VOSpace as the method for retrieving and storing processing inputs and products. The abstraction of storage is an important component of Cloud Computing, and the high level of use of our VOSpace service reflects this.
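
    The filesystem-layer concept can be illustrated with a skeletal FUSE implementation in Python using the third-party fusepy package. The VOSpace client and its methods here are hypothetical stubs standing in for real protocol calls; the actual cadcVOFS maps many more operations.

      from fuse import FUSE, Operations  # pip install fusepy

      class VOSpaceFS(Operations):
          """Expose a VOSpace tree as a mounted filesystem (sketch only)."""

          def __init__(self, vospace_client):
              self.vos = vospace_client  # hypothetical VOSpace protocol client

          def readdir(self, path, fh):
              # Present a VOSpace container node as a directory listing.
              return [".", ".."] + self.vos.list_nodes(path)

          def read(self, path, size, offset, fh):
              # Serve reads from the corresponding VOSpace data node.
              return self.vos.read_node(path, offset, size)

      # Mounting (with a real client) makes VOSpace usable by any Unix tool:
      # FUSE(VOSpaceFS(client), "/mnt/vospace", foreground=True)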

  13. Trust Model to Enhance Security and Interoperability of Cloud Environment

    NASA Astrophysics Data System (ADS)

    Li, Wenjuan; Ping, Lingdi

    Trust is one of the most important means to improve security and enable interoperability among current heterogeneous, independent cloud platforms. This paper first analyzes several trust models used in large distributed environments and then introduces a novel cloud trust model to solve security issues in cross-cloud environments, in which cloud customers can choose services from different providers and resources in heterogeneous domains can cooperate. The model is domain-based: it groups one cloud provider's resource nodes into a domain and assigns each domain a trust agent. It distinguishes two roles, cloud customer and cloud server, and designs different strategies for each. In our model, trust recommendation is treated as one type of cloud service, just like computation or storage. The model achieves both identity authentication and behavior authentication. The results of emulation experiments show that the proposed model can efficiently and safely construct trust relationships in cross-cloud environments.

  14. Putting Order Into the Cloud: Object-oriented UML-based Rule Enforcement for Document and Application Organization

    DTIC Science & Technology

    2010-09-01

    Cloud computing describes a new distributed computing paradigm for IT data and services that involves over-the-Internet provision of dynamically scalable and often virtualized resources. While cost reduction and flexibility in storage, services, and maintenance are important considerations when deciding on whether or how to migrate data and applications to the cloud, large organizations like the Department of Defense need to consider the organization and structure of data on the cloud, and the operations on such data, in order to reap the full benefit of cloud computing.

  15. Privacy-Preserving Location-Based Service Scheme for Mobile Sensing Data.

    PubMed

    Xie, Qingqing; Wang, Liangmin

    2016-11-25

    With the wide use of mobile sensing applications, more and more location-embedded data are collected and stored in mobile clouds, such as iCloud, Samsung cloud, etc. Using these data, the cloud service provider (CSP) can provide location-based service (LBS) for users. However, the mobile cloud is untrustworthy; privacy concerns force the sensitive locations to be stored on the mobile cloud in an encrypted form, which poses a great challenge to utilizing these data to provide efficient LBS. To solve this problem, we propose a privacy-preserving LBS scheme for mobile sensing data, based on the RSA (for Rivest, Shamir and Adleman) algorithm and a ciphertext-policy attribute-based encryption (CP-ABE) scheme. The mobile cloud can perform location distance computation and comparison efficiently for authorized users, without location privacy leakage. Finally, theoretical security analysis and experimental evaluation demonstrate that our scheme is secure against the chosen plaintext attack (CPA) and efficient enough for practical applications in terms of user-side computation overhead.

  16. Privacy-Preserving Location-Based Service Scheme for Mobile Sensing Data †

    PubMed Central

    Xie, Qingqing; Wang, Liangmin

    2016-01-01

    With the wide use of mobile sensing applications, more and more location-embedded data are collected and stored in mobile clouds, such as iCloud, Samsung cloud, etc. Using these data, the cloud service provider (CSP) can provide location-based service (LBS) for users. However, the mobile cloud is untrustworthy; privacy concerns force the sensitive locations to be stored on the mobile cloud in an encrypted form, which poses a great challenge to utilizing these data to provide efficient LBS. To solve this problem, we propose a privacy-preserving LBS scheme for mobile sensing data, based on the RSA (for Rivest, Shamir and Adleman) algorithm and a ciphertext-policy attribute-based encryption (CP-ABE) scheme. The mobile cloud can perform location distance computation and comparison efficiently for authorized users, without location privacy leakage. Finally, theoretical security analysis and experimental evaluation demonstrate that our scheme is secure against the chosen plaintext attack (CPA) and efficient enough for practical applications in terms of user-side computation overhead. PMID:27897984

  17. NASA Enterprise Managed Cloud Computing (EMCC): Delivering an Initial Operating Capability (IOC) for NASA use of Commercial Infrastructure-as-a-Service (IaaS)

    NASA Technical Reports Server (NTRS)

    O'Brien, Raymond

    2017-01-01

    In 2016, Ames supported the NASA CIO in delivering an initial operating capability for Agency use of commercial cloud computing. This presentation provides an overview of the project, the services approach followed, and the major components of the capability that was delivered. The presentation is being given at the request of Amazon Web Services to a contingent representing the Brazilian Federal Government and Defense Organization that is interested in the use of Amazon Web Services (AWS). NASA is currently a customer of AWS and delivered the Initial Operating Capability using AWS as its first commercial cloud provider. The IOC, however, was designed to also support other cloud providers in the future.

  18. Opportunities and Challenges of Cloud Computing to Improve Health Care Services

    PubMed Central

    2011-01-01

    Cloud computing is a new way of delivering computing resources and services. Many managers and experts believe that it can improve health care services, benefit health care research, and change the face of health information technology. However, as with any innovation, cloud computing should be rigorously evaluated before its widespread adoption. This paper discusses the concept and its current place in health care, and uses 4 aspects (management, technology, security, and legal) to evaluate the opportunities and challenges of this computing model. Strategic planning that could be used by a health organization to determine its direction, strategy, and resource allocation when it has decided to migrate from traditional to cloud-based health services is also discussed. PMID:21937354

  19. A Simple Technique for Securing Data at Rest Stored in a Computing Cloud

    NASA Astrophysics Data System (ADS)

    Sedayao, Jeff; Su, Steven; Ma, Xiaohao; Jiang, Minghao; Miao, Kai

    "Cloud Computing" offers many potential benefits, including cost savings, the ability to deploy applications and services quickly, and the ease of scaling those application and services once they are deployed. A key barrier for enterprise adoption is the confidentiality of data stored on Cloud Computing Infrastructure. Our simple technique implemented with Open Source software solves this problem by using public key encryption to render stored data at rest unreadable by unauthorized personnel, including system administrators of the cloud computing service on which the data is stored. We validate our approach on a network measurement system implemented on PlanetLab. We then use it on a service where confidentiality is critical - a scanning application that validates external firewall implementations.

  20. The JASMIN Cloud: specialised and hybrid to meet the needs of the Environmental Sciences Community

    NASA Astrophysics Data System (ADS)

    Kershaw, Philip; Lawrence, Bryan; Churchill, Jonathan; Pritchard, Matt

    2014-05-01

    Cloud computing provides enormous opportunities for the research community. The large public cloud providers offer near-limitless scaling capability. However, adapting the Cloud to scientific workloads is not without its problems. The commodity nature of public cloud infrastructure can be at odds with the specialist requirements of the research community, and issues such as trust, ownership of data, WAN bandwidth, and costing models pose additional barriers to more widespread adoption. Alongside the application of the public cloud for scientific applications, a number of private cloud initiatives are underway in the research community, of which the JASMIN Cloud is one example. Here, cloud service models are being effectively super-imposed over more established services such as data centres, compute cluster facilities, and Grids. These have the potential to deliver the specialist infrastructure needed for the science community coupled with the benefits of a Cloud service model. The JASMIN facility, based at the Rutherford Appleton Laboratory, was established in 2012 to support the data analysis requirements of the climate and Earth Observation community. In its first year of operation, the 5PB of available storage capacity was filled and the hosted compute capability used extensively. JASMIN has modelled the concept of a centralised large-volume data analysis facility. Key characteristics have enabled success: peta-scale fast disk connected via low latency networks to compute resources, and the use of virtualisation for effective management of the resources for a range of users. A second phase is now underway, funded through NERC's (Natural Environment Research Council) Big Data initiative. This will see a significant expansion of the resources available, with a doubling of disk-based storage to 12PB and an increase of compute capacity by a factor of ten to over 3000 processing cores. This expansion is accompanied by a broadening of the scope for JASMIN, as a service available to the entire UK environmental science community. Experience with the first phase demonstrated the range of user needs; a trade-off is needed between access privileges to resources, flexibility of use, and security, and this has influenced the form and types of service under development for the new phase. JASMIN will deploy a specialised private cloud organised into "Managed" and "Unmanaged" components. In the Managed Cloud, users have direct access to the storage and compute resources for optimal performance but, for reasons of security, via a more restrictive PaaS (Platform-as-a-Service) interface. The Unmanaged Cloud is deployed in an isolated part of the network but co-located with the rest of the infrastructure. This gives tenants greater liberty - full IaaS (Infrastructure-as-a-Service) capability to provision customised infrastructure - whilst at the same time protecting more sensitive parts of the system from direct access using these elevated privileges. The private cloud will be augmented with cloud-bursting capability so that it can exploit the resources available from public clouds, making it effectively a hybrid solution. A single interface will overlay the functionality of both the private cloud and external interfaces to public cloud providers, giving users the flexibility to migrate resources between infrastructures as requirements dictate.

  1. Cloud computing in pharmaceutical R&D: business risks and mitigations.

    PubMed

    Geiger, Karl

    2010-05-01

    Cloud computing provides information processing power and business services, delivering these services over the Internet from centrally hosted locations. Major technology corporations aim to supply these services to every sector of the economy. Deploying business processes 'in the cloud' requires special attention to the regulatory and business risks assumed when running on both hardware and software that are outside the direct control of a company. The identification of risks at the correct service level allows a good mitigation strategy to be selected. The pharmaceutical industry can take advantage of existing risk management strategies that have already been tested in the finance and electronic commerce sectors. In this review, the business risks associated with the use of cloud computing are discussed, and mitigations achieved through knowledge from securing services for electronic commerce and from good IT practice are highlighted.

  2. Key Technology Research on Open Architecture for The Sharing of Heterogeneous Geographic Analysis Models

    NASA Astrophysics Data System (ADS)

    Yue, S. S.; Wen, Y. N.; Lv, G. N.; Hu, D.

    2013-10-01

    In recent years, the development of cloud computing technologies has laid a critical foundation for efficiently solving complicated geographic issues. However, it is still difficult to realize the cooperative operation of massive heterogeneous geographical models. Traditional cloud architecture tends to provide a centralized solution to end users, while all the required resources are often offered by large enterprises or special agencies; it is thus a closed framework from the perspective of resource utilization. Solving comprehensive geographic issues requires integrating multifarious heterogeneous geographical models and data. In this case, an open computing platform is needed, with which model owners can conveniently package and deploy their models into the cloud, while model users can search, access, and utilize those models with the cloud facility. Based on this concept, open cloud service strategies for the sharing of heterogeneous geographic analysis models are studied in this article. The key technologies (a unified cloud interface strategy, a sharing platform based on cloud services, and a computing platform based on cloud services) are discussed in detail, and related experiments are conducted for further verification.

  3. Reviews on Security Issues and Challenges in Cloud Computing

    NASA Astrophysics Data System (ADS)

    An, Y. Z.; Zaaba, Z. F.; Samsudin, N. F.

    2016-11-01

    Cloud computing is an Internet-based computing service, provided by third parties, that allows sharing of resources and data among devices. It is widely used in many organizations nowadays and is becoming more popular because it changes the way the Information Technology (IT) of an organization is organized and managed. It provides many benefits, such as simplicity and lower costs, almost unlimited storage, minimal maintenance, easy utilization, backup and recovery, continuous availability, quality of service, automated software integration, scalability, flexibility and reliability, easy access to information, elasticity, quick deployment, and a lower barrier to entry. While the use of cloud computing services increases in this new era, their security issues become a challenge. Cloud computing must be safe and secure enough to ensure the privacy of its users. This paper first outlines the architecture of cloud computing, then discusses the most common security issues of using the cloud and some solutions to them, since security is one of the most critical aspects of cloud computing due to the sensitivity of users' data.

  4. Point Cloud Based Change Detection - an Automated Approach for Cloud-based Services

    NASA Astrophysics Data System (ADS)

    Collins, Patrick; Bahr, Thomas

    2016-04-01

    The fusion of stereo photogrammetric point clouds with LiDAR data or terrain information derived from SAR interferometry has significant potential for 3D topographic change detection. The present case study uses the latest point cloud generation and analysis capabilities to examine a landslide that occurred in the village of Malin in Maharashtra, India, on 30 July 2014, and affected an area of ca. 44,000 m2. It focuses on Pléiades high resolution satellite imagery and the Airbus DS WorldDEM™ as a product of the TanDEM-X mission. The case study was performed using the COTS software package ENVI 5.3. Integration of custom processes and automation is supported by IDL (Interactive Data Language); thus, ENVI analytics runs via the object-oriented and IDL-based ENVITask API. The pre-event topography is represented by the WorldDEM™ product, delivered on a 12 m x 12 m raster and based on the EGM2008 geoid (called the pre-DEM). For the post-event situation, a Pléiades 1B stereo image pair of the affected AOI was obtained. The ENVITask "GeneratePointCloudsByDenseImageMatching" was implemented to extract passive point clouds in LAS format from the panchromatic stereo datasets: • A dense image-matching algorithm is used to identify corresponding points in the two images. • A block adjustment is applied to refine the 3D coordinates that describe the scene geometry. • Additionally, the WorldDEM™ was input to constrain the range of heights in the matching area, and subsequently the length of the epipolar line. The "PointCloudFeatureExtraction" task was executed to generate the post-event digital surface model from the photogrammetric point clouds (called the post-DEM). Post-processing consisted of the following steps: • Adding the geoid component (EGM2008) to the post-DEM. • Reprojecting the pre-DEM to the UTM Zone 43N (WGS-84) coordinate system and resizing it. • Subtracting the pre-DEM from the post-DEM. • Filtering and threshold-based classification of the DEM difference to analyze the surface changes in 3D. The automated point cloud generation and analysis introduced here can be embedded in virtually any existing geospatial workflow for operational applications. Three integration options were implemented in this case study: • Integration within any ArcGIS environment, whether deployed on the desktop, in the cloud, or online. Execution uses a customized ArcGIS script tool. A Python script file retrieves the parameters from the user interface and runs the precompiled IDL code; that IDL code interfaces between the Python script and the relevant ENVITasks. • Publishing the point cloud processing tasks as services via the ENVI Services Engine (ESE). ESE is a cloud-based image analysis solution to publish and deploy advanced ENVI image and data analytics to existing enterprise infrastructures. For this purpose the entire IDL code can be encapsulated in a single ENVITask. • Integration in an existing geospatial workflow using the Python-to-IDL Bridge. This mechanism allows calling IDL code within Python on a user-defined platform. The results of this case study allow a 3D estimation of the topographic changes within the tectonically active and anthropogenically invaded Malin area after the landslide event. Accordingly, the point cloud analysis was correlated successfully with modelled displacement contours of the slope.
Based on optical satellite imagery, such point clouds of high precision and density distribution can be obtained in a few minutes to support the operational monitoring of landslide processes.
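
    The differencing and threshold-classification step at the end of that workflow reduces to simple array arithmetic. The toy grids and the 2 m significance threshold below are illustrative; the case study performed these operations on the full rasters via ENVITasks.

      import numpy as np

      pre_dem = np.array([[512.0, 510.5], [509.8, 511.2]])   # pre-event heights (m)
      post_dem = np.array([[509.1, 510.6], [512.9, 511.0]])  # post-event heights (m)

      diff = post_dem - pre_dem       # positive = surface raised, negative = lowered
      threshold = 2.0                 # metres of change considered significant

      change = np.zeros_like(diff, dtype=int)
      change[diff > threshold] = 1    # accumulation (e.g. deposited material)
      change[diff < -threshold] = -1  # loss (e.g. scarp or depletion zone)

      print(change)  # [[-1  0] [ 1  0]]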

  5. Cloud Computing with iPlant Atmosphere.

    PubMed

    McKay, Sheldon J; Skidmore, Edwin J; LaRose, Christopher J; Mercer, Andre W; Noutsos, Christos

    2013-10-15

    Cloud Computing refers to distributed computing platforms that use virtualization software to provide easy access to physical computing infrastructure and data storage, typically administered through a Web interface. Cloud-based computing provides access to powerful servers, with specific software and virtual hardware configurations, while eliminating the initial capital cost of expensive computers and reducing the ongoing operating costs of system administration, maintenance contracts, power consumption, and cooling. This eliminates a significant barrier to entry into bioinformatics and high-performance computing for many researchers. This is especially true of free or modestly priced cloud computing services. The iPlant Collaborative offers a free cloud computing service, Atmosphere, which allows users to easily create and use instances on virtual servers preconfigured for their analytical needs. Atmosphere is a self-service, on-demand platform for scientific computing. This unit demonstrates how to set up, access and use cloud computing in Atmosphere. Copyright © 2013 John Wiley & Sons, Inc.

  6. Migrating To The Cloud: Preparing The USMC CDET For MCEITS

    DTIC Science & Technology

    2016-03-01

    Acronyms: SAAR (System Authorization Access Request), SaaS (Software as a Service). The cloud computing service models examined are Infrastructure as a Service (IaaS), Platform as a Service (PaaS), Software as a Service (SaaS), and Data as a Service (DaaS) (Takai, 2012). NIST described SaaS as a model of cloud computing in which the service provider offers its customers fee-based access.

  7. Cloud Property Retrieval Products for Graciosa Island, Azores

    DOE Data Explorer

    Dong, Xiquan

    2014-05-05

    The motivation for developing this product was to use the Dong et al. 1998 method to retrieve cloud microphysical properties, such as cloud droplet effective radius, cloud droplet number concentration, and optical thickness. These retrieved properties have been used to validate satellite retrievals and to evaluate climate simulations and reanalyses. We had been using this method to retrieve cloud microphysical properties over the ARM SGP and NSA sites, and we also modified the method for the AMF at Shouxian, China and for some IOPs, e.g. the ARM IOP at SGP in March 2000. The ARSCL data from the ARM data archive over the SGP and NSA have been used to determine the cloud boundaries and cloud phase. For these ARM permanent sites, the ARSCL data were developed based on MMCR measurements; however, no such data were available for the Azores field campaign. We followed the same steps to generate this derived product and also included the MPLCMASK cloud retrievals to determine the most accurate cloud boundaries, including the thin cirrus clouds that the WACR may under-detect. We use these as input to retrieve the cloud microphysical properties. Due to the different temporal resolutions of the derived cloud boundary heights product and the cloud properties product, we submit them as two separate netCDF files.

  8. Technical Challenges and Lessons from the Migration of the GLOBE Data and Information System to Utilize Cloud Computing Service

    NASA Technical Reports Server (NTRS)

    Moses, John F.; Memarsadeghi, Nargess; Overoye, David; Littlefield, Brain

    2017-01-01

    The Global Learning and Observation to Benefit the Environment (GLOBE) Data and Information System supports an international science and education program with capabilities to accept local environment observations and to archive, display, and visualize them along with global satellite observations. Since its inception twenty years ago, the Web and database system has been upgraded periodically to accommodate changes in technology and the steady growth of GLOBE's education community and collection of observations. Recently, near the end of life of the system hardware, new commercial computer platform options were explored and a decision was made to utilize Cloud services. The GLOBE DIS has now been fully deployed and maintained using Amazon Cloud services for over two years. This paper reviews the early risks, actual challenges, and some unexpected findings from the GLOBE DIS migration. We describe the plans, cost drivers and estimates, highlight adjustments that were made, and suggest improvements. We present the trade studies for provisioning, load balancing, networks, processing, and storage, as well as for the production, staging, and backup systems. We outline the migration team's skills and required level of effort for the transition, and the resulting changes in overall maintenance and operations activities. Examples include incremental adjustments to processing capacity and frequency of backups, and efforts previously expended on hardware maintenance that were refocused onto application-specific enhancements.

  9. Technical Challenges and Lessons from the Migration of the GLOBE Data and Information System to Utilize Cloud Computing Service

    NASA Technical Reports Server (NTRS)

    Moses, John F.; Memarsadeghi, Nargess; Overoye, David; Littlefield, Bryan

    2016-01-01

    The Global Learning and Observation to Benefit the Environment (GLOBE) Data and Information System supports an international science and education program with capabilities to accept local environment observations and to archive, display, and visualize them along with global satellite observations. Since its inception twenty years ago, the Web and database system has been upgraded periodically to accommodate changes in technology and the steady growth of GLOBE's education community and collection of observations. Recently, near the end of life of the system hardware, new commercial computer platform options were explored and a decision was made to utilize Cloud services. The GLOBE DIS has now been fully deployed and maintained using Amazon Cloud services for over two years. This paper reviews the early risks, actual challenges, and some unexpected findings from the GLOBE DIS migration. We describe the plans, cost drivers and estimates, highlight adjustments that were made, and suggest improvements. We present the trade studies for provisioning, load balancing, networks, processing, and storage, as well as for the production, staging, and backup systems. We outline the migration team's skills and required level of effort for the transition, and the resulting changes in overall maintenance and operations activities. Examples include incremental adjustments to processing capacity and frequency of backups, and efforts previously expended on hardware maintenance that were refocused onto application-specific enhancements.

  10. Technical Challenges and Lessons from the Migration of the GLOBE Data and Information System to Utilize Cloud Computing Service

    NASA Astrophysics Data System (ADS)

    Moses, J. F.; Memarsadeghi, N.; Overoye, D.; Littlefield, B.

    2016-12-01

    The Global Learning and Observation to Benefit the Environment (GLOBE) Data and Information System supports an international science and education program with capabilities to accept local environment observations and to archive, display, and visualize them along with global satellite observations. Since its inception twenty years ago, the Web and database system has been upgraded periodically to accommodate changes in technology and the steady growth of GLOBE's education community and collection of observations. Recently, near the end of life of the system hardware, new commercial computer platform options were explored and a decision was made to utilize Cloud services. The GLOBE DIS has now been fully deployed and maintained using Amazon Cloud services for over two years. This paper reviews the early risks, actual challenges, and some unexpected findings from the GLOBE DIS migration. We describe the plans, cost drivers and estimates, highlight adjustments that were made, and suggest improvements. We present the trade studies for provisioning, load balancing, networks, processing, and storage, as well as for the production, staging, and backup systems. We outline the migration team's skills and required level of effort for the transition, and the resulting changes in overall maintenance and operations activities. Examples include incremental adjustments to processing capacity and frequency of backups, and efforts previously expended on hardware maintenance that were refocused onto application-specific enhancements.

  11. Cloud occurrences and cloud radiative effects (CREs) from CERES-CALIPSO-CloudSat-MODIS (CCCM) and CloudSat radar-lidar (RL) products

    NASA Astrophysics Data System (ADS)

    Ham, Seung-Hee; Kato, Seiji; Rose, Fred G.; Winker, David; L'Ecuyer, Tristan; Mace, Gerald G.; Painemal, David; Sun-Mack, Sunny; Chen, Yan; Miller, Walter F.

    2017-08-01

    Two kinds of cloud products obtained from Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO), CloudSat, and Moderate Resolution Imaging Spectroradiometer (MODIS) are compared and analyzed in this study: the Clouds and the Earth's Radiant Energy System (CERES)-CALIPSO-CloudSat-MODIS (CCCM) product and CloudSat radar-lidar products such as GEOPROF-LIDAR and FLXHR-LIDAR. Compared to GEOPROF-LIDAR, low-level (<1 km) cloud occurrences in CCCM are larger over tropical oceans because the CCCM algorithm uses a more relaxed cloud-aerosol discrimination score threshold for the CALIPSO Vertical Feature Mask product. In contrast, midlevel (1-8 km) cloud occurrences in GEOPROF-LIDAR are larger than in CCCM at high latitudes (>40°). The difference occurs when hydrometeors are detected by the CALIPSO lidar but missed by the CloudSat radar. In the comparison of cloud radiative effects (CREs), global mean differences between CCCM and FLXHR-LIDAR are mostly smaller than 5 W m-2, while noticeable regional differences are found. For example, CCCM shortwave (SW) and longwave (LW) CREs are larger than FLXHR-LIDAR along the west coasts of Africa and America because the GEOPROF-LIDAR algorithm misses shallow marine boundary layer clouds. In addition, FLXHR-LIDAR SW CREs are larger than the CCCM counterpart over tropical oceans away from the west coasts of America. Over midlatitude storm-track regions, CCCM SW and LW CREs are larger than the FLXHR-LIDAR counterpart.

  12. Integrating Cloud-Computing-Specific Model into Aircraft Design

    NASA Astrophysics Data System (ADS)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud computing is becoming increasingly relevant, as it will enable the companies involved in spreading this technology to open the door to Web 3.0. The new categories of services it introduces will gradually replace many types of computational resources currently in use. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services are provided. This paper integrates a cloud-computing-specific model into aircraft design, and the work has achieved good results in sharing licenses for large-scale, expensive software such as CFD (Computational Fluid Dynamics), UG, CATIA, and so on.

  13. The Role of Networks in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Lin, Geng; Devine, Mac

    The confluence of technology advancements and business developments in Broadband Internet, Web services, computing systems, and application software over the past decade has created a perfect storm for cloud computing. The "cloud model" of delivering and consuming IT functions as services is poised to fundamentally transform the IT industry and rebalance the inter-relationships among end users, enterprise IT, software companies, and the service providers in the IT ecosystem (Armbrust et al., 2009; Lin, Fu, Zhu, & Dasmalchi, 2009).

  14. Rheticus Displacement: an Automatic Geo-Information Service Platform for Ground Instabilities Detection and Monitoring

    NASA Astrophysics Data System (ADS)

    Chiaradia, M. T.; Samarelli, S.; Agrimano, L.; Lorusso, A. P.; Nutricato, R.; Nitti, D. O.; Morea, A.; Tijani, K.

    2016-12-01

    Rheticus® is an innovative cloud-based data and services hub able to deliver Earth Observation value-added products through automated complex processes and minimal interaction with human operators. This target is achieved by means of programmable components working as different software layers in a modern enterprise system that relies on the SOA (service-oriented architecture) model. Because every functionality is well defined and encapsulated in a standalone component, Rheticus is highly scalable and distributable, allowing different configurations depending on user needs. Rheticus offers a portfolio of services, ranging from the detection and monitoring of geohazards and infrastructural instabilities to marine water quality monitoring, wildfire detection and land cover monitoring. In this work, we outline the overall cloud-based platform and focus on the "Rheticus Displacement" service, aimed at providing accurate information to monitor movements occurring across landslide features or structural instabilities that could affect buildings or infrastructure. Using Sentinel-1 (S1) open data images and Multi-Temporal SAR Interferometry techniques (i.e., SPINUA), the service is complementary to traditional survey methods, providing a long-term solution for slope instability monitoring. Rheticus automatically browses and accesses (on a weekly basis) the products of the rolling archive of the ESA S1 Scientific Data Hub; S1 data are then handled by a mature processing chain, which is responsible for producing displacement maps immediately usable to measure the movements of coherent points with sub-centimetric precision. Examples are provided concerning the automatic displacement map generation process, the integration of point and distributed scatterers, the integration of multi-sensor displacement maps (e.g., Sentinel-1 IW and COSMO-SkyMed HIMAGE), and the combination of displacement rate maps acquired along both ascending and descending passes. ACK: Study carried out in the framework of the FAST4MAP project and co-funded by the Italian Space Agency (Contract n. 2015-020-R.0). Sentinel-1A products provided by ESA. CSK® Products, ASI, provided by ASI under a license to use. Rheticus® is a registered trademark of Planetek Italia srl.

  15. An efficient approach for improving virtual machine placement in cloud computing environment

    NASA Astrophysics Data System (ADS)

    Ghobaei-Arani, Mostafa; Shamsi, Mahboubeh; Rahmanian, Ali A.

    2017-11-01

    The ever-increasing demand for cloud services requires more data centres. Power consumption in data centres is a challenging problem for cloud computing that has not been properly considered by data centre developers. Large data centres in particular struggle with power costs and greenhouse gas production, so power-efficient mechanisms are necessary to mitigate these effects. Virtual machine (VM) placement can be used as an effective method to reduce power consumption in data centres. In this paper, by grouping both virtual and physical machines and taking the maximum absolute deviation into account during VM placement, both power consumption and service level agreement (SLA) violations in data centres are reduced. To this end, the best-fit decreasing algorithm is used in the simulation, reducing power consumption by about 5% compared with the modified best-fit decreasing algorithm while improving SLA violations by 6%. Finally, learning automata are used to strike a trade-off between power consumption reduction on the one hand and the SLA violation percentage on the other.
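
    As a concrete illustration of the placement step, the sketch below implements plain best-fit decreasing packing of VM CPU demands onto hosts. The single-resource capacity model, demands and host count are invented for illustration; the paper's actual method additionally groups machines and uses the maximum absolute deviation criterion.

    ```python
    def best_fit_decreasing(vm_demands, host_capacity, n_hosts):
        """Assign VM CPU demands to hosts, always choosing the tightest fit."""
        remaining = [host_capacity] * n_hosts        # spare capacity per host
        placement = [[] for _ in range(n_hosts)]
        for vm in sorted(vm_demands, reverse=True):  # the "decreasing" step
            # Best fit: the host with the least leftover capacity that still fits.
            fits = [(rem, i) for i, rem in enumerate(remaining) if rem >= vm]
            if not fits:
                raise RuntimeError(f"no host can fit a VM demand of {vm}")
            _, host = min(fits)
            remaining[host] -= vm
            placement[host].append(vm)
        return placement

    # Example: five VM demands packed onto three 60-unit hosts.
    print(best_fit_decreasing([30, 10, 45, 20, 25], host_capacity=60, n_hosts=3))
    ```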

  16. Three-Dimensional Space to Assess Cloud Interoperability

    DTIC Science & Technology

    2013-03-01

    1. Portability and Mobility ... collection of network-enabled services that guarantees to provide a scalable, easily accessible, reliable, and personalized computing infrastructure, based on ... are used in research to describe cloud models, such as SaaS (Software as a Service), PaaS (Platform as a Service), IaaS (Infrastructure as a Service)

  17. Integration of Satellite-Derived Cloud Phase, Cloud Top Height, and Liquid Water Path into an Operational Aircraft Icing Nowcasting System

    NASA Technical Reports Server (NTRS)

    Haggerty, Julie; McDonough, Frank; Black, Jennifer; Landolt, Scott; Wolff, Cory; Mueller, Steven; Minnis, Patrick; Smith, William, Jr.

    2008-01-01

    Operational products used by the U.S. Federal Aviation Administration to alert pilots to hazardous icing provide nowcast and short-term forecast estimates of the potential presence of supercooled liquid water and supercooled large droplets. The Current Icing Product (CIP) system employs basic satellite-derived information, including a cloud mask and cloud top temperature estimates, together with multiple other data sources to produce a gridded, three-dimensional, hourly depiction of icing probability and severity. Advanced satellite-derived cloud products developed at the NASA Langley Research Center (LaRC) provide a more detailed description of cloud properties (primarily at cloud top) than the basic satellite-derived information currently used in CIP. Cloud hydrometeor phase, liquid water path, cloud effective temperature, and cloud top height as estimated by the LaRC algorithms are integrated into the CIP fuzzy logic scheme, and a confidence value is determined. Examples of CIP products before and after the integration of the LaRC satellite-derived products will be presented at the conference.
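
    The fuzzy-logic combination can be pictured with the toy interest functions below; the membership shapes, thresholds and weights are illustrative assumptions, not the operational CIP curves.

    ```python
    import numpy as np

    def ramp(x, lo, hi):
        """Linear fuzzy membership rising from 0 at lo to 1 at hi."""
        return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

    def icing_interest(ctt_c, lwp_gm2, liquid_phase_conf):
        """Blend satellite-derived fields into a 0-1 icing interest (toy weights)."""
        # Supercooled-liquid range: interest vanishes above 0 C and below about -40 C.
        temp_int = ramp(-ctt_c, 0.0, 5.0) * ramp(ctt_c, -40.0, -25.0)
        lwp_int = ramp(lwp_gm2, 0.0, 200.0)   # more liquid water, more interest
        return liquid_phase_conf * (0.5 * temp_int + 0.5 * lwp_int)

    print(icing_interest(ctt_c=-12.0, lwp_gm2=150.0, liquid_phase_conf=0.9))
    ```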

  18. Enhancement of the MODIS Snow and Ice Product Suite Utilizing Image Segmentation

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Hall, Dorothy K.; Riggs, George A.

    2006-01-01

    A problem has been noticed with the current MODIS Snow and Ice Product in that fringes of certain snow fields are labeled as "cloud," whereas close inspection of the data indicates that the correct label is a non-cloud category such as snow or land. This occurs because the current MODIS Snow and Ice Product generation algorithm relies solely on the MODIS Cloud Mask Product for the labeling of image pixels as cloud. It is proposed here that information obtained from image segmentation can be used to determine when it is appropriate to override the cloud indication from the cloud mask product. Initial tests show that this approach can significantly reduce cloud "fringing" in the modified snow cover labeling. More comprehensive testing is required to determine whether or not this approach consistently improves the accuracy of the snow and ice product.
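
    A minimal sketch of the override idea: if an image-segmentation region is overwhelmingly labeled clear, its stray "cloud" fringe pixels are relabeled. The 0.6 threshold and the boolean-mask interface are assumptions for illustration, not the paper's values.

    ```python
    import numpy as np

    def override_cloud_fringe(segment_labels, cloud_mask, clear_frac_threshold=0.6):
        """Clear 'cloud' pixels in segments that are mostly labeled clear."""
        out = cloud_mask.copy()
        for seg in np.unique(segment_labels):
            in_seg = segment_labels == seg
            clear_frac = 1.0 - cloud_mask[in_seg].mean()
            # A mostly-clear segment suggests its cloudy fringe is mislabeled.
            if clear_frac >= clear_frac_threshold:
                out[in_seg] = False
        return out
    ```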

  19. An open science cloud for scientific research

    NASA Astrophysics Data System (ADS)

    Jones, Bob

    2016-04-01

    The Helix Nebula initiative was presented at EGU 2013 (http://meetingorganizer.copernicus.org/EGU2013/EGU2013-1510-2.pdf) and has continued to expand with more research organisations, providers and services. The hybrid cloud model deployed by Helix Nebula has grown to become a viable approach for provisioning ICT services for research communities from both public and commercial service providers (http://dx.doi.org/10.5281/zenodo.16001). The relevance of this approach for all those communities facing societal challenges is explained in a recent EIROforum publication (http://dx.doi.org/10.5281/zenodo.34264). This presentation will describe how this model brings together a range of stakeholders to implement a common platform for data-intensive services that builds upon existing publicly funded e-infrastructures and commercial cloud services to promote open science. It explores the essential characteristics of a European Open Science Cloud if it is to address the big data needs of the latest generation of Research Infrastructures. The high-level architecture and key services, as well as the role of standards, are described, together with a governance and financial model and the roles of the stakeholders, including commercial service providers and downstream business sectors, that will ensure a European Open Science Cloud can innovate, grow and be sustained beyond the current project cycles.

  20. SLA-based optimisation of virtualised resource for multi-tier web applications in cloud data centres

    NASA Astrophysics Data System (ADS)

    Bi, Jing; Yuan, Haitao; Tie, Ming; Tan, Wei

    2015-10-01

    Dynamic virtualised resource allocation is the key to quality-of-service assurance for multi-tier web application services in cloud data centres. In this paper, we develop a self-management architecture of cloud data centres with a virtualisation mechanism for multi-tier web application services. Based on this architecture, we establish a flexible hybrid queueing model to determine the number of virtual machines for each tier of the virtualised application service environment. Besides, we propose a non-linear constrained optimisation problem with restrictions defined in the service level agreement. Furthermore, we develop a heuristic mixed optimisation algorithm to maximise the profit of cloud infrastructure providers while meeting the performance requirements of different clients. Finally, we compare the effectiveness of our dynamic allocation strategy with two other allocation strategies. The simulation results show that the proposed resource allocation method is efficient in improving overall performance and reducing the resource energy cost.
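
    The flavour of the tier-sizing step can be shown with a plain M/M/m queueing calculation: find the smallest VM count whose mean response time meets the SLA. This is a simplified stand-in for the paper's hybrid queueing model; the rates and the SLA bound below are invented.

    ```python
    import math

    def erlang_c(m, a):
        """Queueing probability for an M/M/m system with offered load a = lambda/mu."""
        base = sum(a**k / math.factorial(k) for k in range(m))
        tail = a**m / math.factorial(m) * m / (m - a)
        return tail / (base + tail)

    def min_vms_for_sla(arrival_rate, service_rate, max_mean_response_s):
        """Smallest VM count m whose mean response time satisfies the SLA."""
        a = arrival_rate / service_rate
        m = max(1, math.ceil(a))
        while True:
            if m > a:  # required for a stable queue
                wait = erlang_c(m, a) / (m * service_rate - arrival_rate)
                if wait + 1.0 / service_rate <= max_mean_response_s:
                    return m
            m += 1

    # Example: 50 req/s arriving, 12 req/s per VM, SLA of 0.25 s mean response.
    print(min_vms_for_sla(50.0, 12.0, 0.25))
    ```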

  1. Space Science Cloud: a Virtual Space Science Research Platform Based on Cloud Model

    NASA Astrophysics Data System (ADS)

    Hu, Xiaoyan; Tong, Jizhou; Zou, Ziming

    Through independent and co-operational science missions, the Strategic Pioneer Program (SPP) on Space Science, the new space science initiative in China approved by CAS and implemented by the National Space Science Center (NSSC), is dedicated to seeking new discoveries and breakthroughs in space science, thereby deepening our understanding of the universe and planet Earth. In the framework of this program, in order to support the operations of space science missions and satisfy the demand of related research activities for e-Science, NSSC is developing a virtual space science research platform based on the cloud model, namely the Space Science Cloud (SSC). To support mission demonstration, SSC integrates an interactive satellite orbit design tool, a satellite structure and payload layout design tool, a payload observation coverage analysis tool, etc., to help scientists analyze and verify space science mission designs. Another important function of SSC is supporting mission operations, which run through the space satellite data pipelines. Mission operators can acquire and process observation data, then distribute the data products to other systems or issue the data and archives through SSC's services. In addition, SSC provides useful data, tools and models for space researchers. Several databases in the field of space science are integrated, and an efficient retrieval system is being developed. Common tools for data visualization, deep processing (e.g., smoothing and filtering tools), analysis (e.g., an FFT analysis tool and a minimum variance analysis tool) and mining (e.g., a proton event correlation analysis tool) are also integrated to help researchers make better use of the data. The space weather models on SSC include a magnetic storm forecast model, a multi-station middle and upper atmospheric climate model, a solar energetic particle propagation model and so on. All the above-mentioned services are based on the e-Science infrastructures of CAS, e.g., cloud storage and cloud computing. SSC also provides its users with self-service storage and computing resources. At present, the prototyping of SSC is underway and the platform is expected to be put into trial operation in August 2014. We hope that as SSC develops, our vision of Digital Space may come true someday.

  2. IAServ: an intelligent home care web services platform in a cloud for aging-in-place.

    PubMed

    Su, Chuan-Jun; Chiang, Chang-Yu

    2013-11-12

    As the elderly population has been rapidly expanding and the core tax-paying population has been shrinking, the need for adequate elderly health and housing services continues to grow while the resources to provide such services are becoming increasingly scarce. Thus, increasing the efficiency of healthcare service delivery through the use of modern technology is a pressing issue. The seamless integration of such enabling technologies as ontology, intelligent agents, web services, and cloud computing is transforming healthcare from hospital-based treatments to home-based self-care and preventive care. A ubiquitous healthcare platform based on this technological integration, which synergizes service providers with patients' needs, needs to be developed to provide personalized healthcare services at the right time, in the right place, and in the right manner. This paper presents the development and overall architecture of IAServ (the Intelligent Aging-in-place Home care Web Services Platform), which provides personalized healthcare service ubiquitously in a cloud computing setting to support the most desirable and cost-efficient method of care for the aged: aging in place. IAServ is expected to offer intelligent, pervasive, accurate and contextually-aware personal care services. Architecturally, the implemented IAServ leverages web services and cloud computing to provide economical, scalable, and robust healthcare services over the Internet.

  3. IAServ: An Intelligent Home Care Web Services Platform in a Cloud for Aging-in-Place

    PubMed Central

    Su, Chuan-Jun; Chiang, Chang-Yu

    2013-01-01

    As the elderly population has been rapidly expanding and the core tax-paying population has been shrinking, the need for adequate elderly health and housing services continues to grow while the resources to provide such services are becoming increasingly scarce. Thus, increasing the efficiency of healthcare service delivery through the use of modern technology is a pressing issue. The seamless integration of such enabling technologies as ontology, intelligent agents, web services, and cloud computing is transforming healthcare from hospital-based treatments to home-based self-care and preventive care. A ubiquitous healthcare platform based on this technological integration, which synergizes service providers with patients' needs, needs to be developed to provide personalized healthcare services at the right time, in the right place, and in the right manner. This paper presents the development and overall architecture of IAServ (the Intelligent Aging-in-place Home care Web Services Platform), which provides personalized healthcare service ubiquitously in a cloud computing setting to support the most desirable and cost-efficient method of care for the aged: aging in place. IAServ is expected to offer intelligent, pervasive, accurate and contextually-aware personal care services. Architecturally, the implemented IAServ leverages web services and cloud computing to provide economical, scalable, and robust healthcare services over the Internet. PMID:24225647

  4. Processing ARM VAP data on an AWS cluster

    NASA Astrophysics Data System (ADS)

    Martin, T.; Macduff, M.; Shippert, T.

    2017-12-01

    The Atmospheric Radiation Measurement (ARM) Data Management Facility (DMF) manages over 18,000 processes and 1.3 TB of data each day. This includes many Value-Added Products (VAPs) that make use of multiple instruments to produce scientifically relevant derived products. A thermodynamic and cloud profile VAP is being developed to provide input to the ARM Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) project (https://www.arm.gov/capabilities/vaps/lasso-122). This algorithm is CPU-intensive, and its processing requirements exceeded the available DMF computing capacity. Amazon Web Services (AWS), along with CfnCluster, was investigated to see how it would perform. This cluster environment is cost-effective and scales dynamically based on demand. We were able to take advantage of autoscaling, which allowed the cluster to grow and shrink based on the size of the processing queue. We were also able to take advantage of the AWS spot market to further reduce cost. Our test was very successful and showed that cloud resources can be used to process time series data efficiently and effectively. This poster presents the resources and methodology used to successfully run the algorithm.
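
    Sizing such a cluster typically starts from the spot market price history; a minimal boto3 sketch is shown below. The region, instance type and result limit are illustrative choices, and AWS credentials are assumed to be configured in the environment.

    ```python
    import boto3

    ec2 = boto3.client("ec2", region_name="us-west-2")
    resp = ec2.describe_spot_price_history(
        InstanceTypes=["c4.4xlarge"],            # illustrative compute-heavy type
        ProductDescriptions=["Linux/UNIX"],
        MaxResults=20,
    )
    for entry in resp["SpotPriceHistory"]:
        print(entry["AvailabilityZone"], entry["SpotPrice"], entry["Timestamp"])
    ```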

  5. A Comparison of Several Techniques to Assign Heights to Cloud Tracers.

    NASA Astrophysics Data System (ADS)

    Nieman, Steven J.; Schmetz, Johannes; Menzel, W. Paul

    1993-09-01

    Satellite-derived cloud-motion vector (CMV) production has been troubled by inaccurate height assignment of cloud tracers, especially for thin semitransparent clouds. This paper presents the results of an intercomparison of current operational height assignment techniques. Currently, heights are assigned by one of three techniques when the appropriate spectral radiance measurements are available. The infrared window (IRW) technique compares measured brightness temperatures to forecast temperature profiles and thus infers opaque cloud levels. For semitransparent or small subpixel clouds, the carbon dioxide (CO2) technique uses the ratio of radiances from different layers of the atmosphere to infer the correct cloud height. In the water vapor (H2O) technique, radiances influenced by upper-tropospheric moisture and IRW radiances are measured for several pixels viewing different cloud amounts, and their linear relationship is used to extrapolate the correct cloud height. The results presented in this paper suggest that the H2O technique is a viable alternative to the CO2 technique for inferring the heights of semitransparent cloud elements. This is important since future National Environmental Satellite, Data, and Information Service (NESDIS) operations will have to rely on H2O-derived cloud-height assignments in the wind field determinations with the next operational geostationary satellite. On a given day, the heights from the two approaches compare to within 60-110 hPa rms; drier atmospheric conditions tend to reduce the effectiveness of the H2O technique. By inference one can conclude that the present height algorithms used operationally at NESDIS (with the CO2 technique) and at the European Satellite Operations Centre (ESOC) (with their version of the H2O technique) are providing similar results. Sample wind fields produced with the ESOC and NESDIS algorithms using Meteosat-4 data show good agreement.
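
    The H2O-intercept idea lends itself to a compact sketch: radiance pairs from pixels viewing varying cloud cover are fit with a line, and the cloud pressure is taken where that line meets the forecast-based opaque-cloud curve. The input names and the nearest-point intersection below are illustrative simplifications of the operational algorithm.

    ```python
    import numpy as np

    def h2o_intercept_height(irw_rad, h2o_rad, p_levels, irw_opaque, h2o_opaque):
        """Estimate cloud pressure with the H2O-intercept method (sketch)."""
        # Pixels viewing different cloud fractions fall on a line in
        # (IRW, H2O) radiance space; fit it by least squares.
        slope, intercept = np.polyfit(irw_rad, h2o_rad, 1)
        # The cloud resides where this line crosses the opaque-cloud curve,
        # approximated here by the curve point closest to the fitted line.
        residual = np.abs(h2o_opaque - (slope * irw_opaque + intercept))
        return p_levels[np.argmin(residual)]
    ```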

  6. Storm-based Cloud-to-Ground Lightning Probabilities and Warnings

    NASA Astrophysics Data System (ADS)

    Calhoun, K. M.; Meyer, T.; Kingfield, D.

    2017-12-01

    A new cloud-to-ground (CG) lightning probability algorithm has been developed using machine-learning methods. With storm-based inputs of Earth Networks' in-cloud lightning, Vaisala's CG lightning, Multi-Radar/Multi-Sensor (MRMS) radar-derived products including the Maximum Expected Size of Hail (MESH) and Vertically Integrated Liquid (VIL), and near-storm environmental data including lapse rate and CAPE, a random forest algorithm was trained to produce probabilities of CG lightning up to one hour in advance. As part of the Prototype Probabilistic Hazard Information experiment in the Hazardous Weather Testbed in 2016 and 2017, National Weather Service forecasters were asked to use this CG lightning probability guidance to create rapidly updating probability grids and warnings for the threat of CG lightning for 0-60 minutes. The output from forecasters was shared with end users, including emergency managers and broadcast meteorologists, as part of an integrated warning team.
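
    A minimal scikit-learn sketch of the training step is shown below; the CSV file, column names and hyperparameters are hypothetical, with features mirroring those listed in the abstract.

    ```python
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("storm_samples.csv")    # hypothetical labeled storm table
    features = ["mesh_mm", "vil_kg_m2", "ic_flash_rate", "lapse_rate_c_km", "cape_j_kg"]
    X_train, X_test, y_train, y_test = train_test_split(
        df[features], df["cg_within_60min"], test_size=0.25, random_state=0)

    model = RandomForestClassifier(n_estimators=300, random_state=0)
    model.fit(X_train, y_train)

    # Per-storm probability of a CG flash within the next hour.
    probs = model.predict_proba(X_test)[:, 1]
    print("mean predicted CG probability:", probs.mean())
    ```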

  7. Research on Key Technologies of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Zhang, Shufen; Yan, Hongcan; Chen, Xuebin

    With the development of multi-core processors, virtualization, distributed storage, broadband Internet and automatic management, a new computing model named cloud computing has emerged. It distributes computation tasks over a resource pool consisting of massive numbers of computers, so application systems can obtain computing power, storage space and software services according to demand. It concentrates all the computing resources and manages them automatically through software, without human intervention. This frees application providers from tedious details and lets them concentrate on their business, which benefits innovation and reduces cost. The ultimate goal of cloud computing is to provide computation, services and applications as a public facility, so that people can use computing resources just as they use water, electricity, gas and the telephone. Currently, the understanding of cloud computing is still developing and changing, and cloud computing has no unanimous definition. This paper describes the three main service forms of cloud computing (SaaS, PaaS, IaaS), compares the definitions of cloud computing given by Google, Amazon, IBM and other companies, summarizes the basic characteristics of cloud computing, and focuses on key technologies such as data storage, data management, virtualization and the programming model.

  8. OpenID Connect as a security service in cloud-based medical imaging systems.

    PubMed

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-04-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have consistently been regarded as the major obstacles to the adoption of cloud computing in healthcare domains. OpenID Connect, combining OpenID and OAuth, is an emerging representational-state-transfer-based federated identity solution. It is one of the most widely adopted open standards, poised to become the de facto standard for securing cloud computing and mobile applications, and has been called the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow this technology to be incorporated within distributed enterprise environments. The objective of this study is to offer solutions for the secure sharing of medical images among diagnostic imaging repositories (DI-r) and heterogeneous picture archiving and communication systems (PACS), as well as Web-based and mobile clients, in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner, so that deploying DI-r and PACS to private or community clouds provides security levels equivalent to the traditional computing model.
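
    For flavour, the core of the OpenID Connect authorization-code flow is the token exchange sketched below; the endpoint, client identifiers and redirect URI are placeholders, and a real deployment would take them from the provider's discovery document (/.well-known/openid-configuration).

    ```python
    import requests

    TOKEN_ENDPOINT = "https://idp.example.org/connect/token"   # placeholder IdP

    def exchange_code_for_tokens(code):
        """Trade an authorization code for access and ID tokens."""
        resp = requests.post(TOKEN_ENDPOINT, data={
            "grant_type": "authorization_code",
            "code": code,                                  # returned via redirect
            "redirect_uri": "https://pacs.example.org/callback",
            "client_id": "di-r-viewer",
            "client_secret": "kept-server-side-only",
        })
        resp.raise_for_status()
        tokens = resp.json()
        # access_token authorizes image requests; id_token identifies the user.
        return tokens["access_token"], tokens["id_token"]
    ```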

  9. Compression-based aggregation model for medical web services.

    PubMed

    Al-Shammary, Dhiah; Khalil, Ibrahim

    2010-01-01

    Many organizations such as hospitals have adopted Cloud Web services for their network services to avoid investing heavily in computing infrastructure. SOAP (Simple Object Access Protocol), the basic communication protocol of Cloud Web services, is an XML-based protocol. Web services often suffer congestion and bottlenecks as a result of the high network traffic caused by the large XML overhead. At the same time, the massive load on Cloud Web services, in terms of the large volume of client requests, results in the same problem. In this paper, two XML-aware aggregation techniques based on compression concepts are proposed to aggregate medical Web messages and achieve greater message size reduction.
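
    The intuition behind compression-based aggregation can be demonstrated generically: near-identical SOAP envelopes compress far better as one batch than individually, because the shared XML structure is encoded once. The zlib sketch below is a generic illustration, not the paper's XML-aware algorithm.

    ```python
    import zlib

    soap = ("<soap:Envelope><soap:Body><GetRecord><PatientId>{}</PatientId>"
            "</GetRecord></soap:Body></soap:Envelope>")
    messages = [soap.format(i) for i in range(100)]

    separate = sum(len(zlib.compress(m.encode())) for m in messages)
    batched = len(zlib.compress("".join(messages).encode()))
    print(f"compressed one-by-one: {separate} bytes, as one batch: {batched} bytes")
    ```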

  10. Retrieval of ice cloud properties from Himawari-8 satellite measurements by Voronoi ice particle model

    NASA Astrophysics Data System (ADS)

    Letu, H.; Nagao, T. M.; Nakajima, T. Y.; Ishimoto, H.; Riedi, J.; Shang, H.

    2017-12-01

    Ice cloud property products from satellite measurements are applicable to climate change studies, numerical weather prediction, and atmospheric research. Ishimoto et al. (2010) and Letu et al. (2016) developed the single-scattering properties of a highly irregular ice particle model, called the Voronoi model, for the ice cloud products of the GCOM-C satellite program. The Voronoi model has been shown to perform well in retrieving ice cloud properties when compared with other well-known scattering models. The cloud property algorithm (Nakajima et al., 1995; Ishida and Nakajima, 2009; Ishimoto et al., 2009; Letu et al., 2012, 2014, 2016) of the GCOM-C satellite program is improved to produce the Himawari-8/AHI cloud products, accounting for the variation of the solar zenith angle. Himawari-8 is the new-generation geostationary meteorological satellite successfully launched by the Japan Meteorological Agency (JMA) on 7 October 2014. In this study, ice cloud optical and microphysical properties are simulated with the RSTAR radiative transfer code using various models. The scattering properties of the Voronoi model are investigated for developing the AHI ice cloud products. Furthermore, optical and microphysical properties of ice clouds are retrieved from Himawari-8/AHI satellite measurements. Finally, retrieval results from Himawari-8/AHI are compared with MODIS-C6 cloud property products for validation of the AHI cloud products.

  11. Preparing the remote sensing community toward the NPP/NPOESS era

    NASA Astrophysics Data System (ADS)

    Kuciauskas, A. P.; Lee, T. F.; Turk, F. J.; Richardson, K. A.; Hawkins, J. D.; Kent, J. E.; Miller, S. D.; McWilliams, G.

    2008-12-01

    Under the auspices of the National Polar-orbiting Operational Environmental Satellite System (NPOESS) Integrated Program Office (IPO), the Naval Research Laboratory in Monterey (NRLMRY) was tasked to develop NexSat, a weather-satellite web-based resource, to illustrate future sensing capabilities of the Visible/Infrared Imager Radiometer Suite (VIIRS) sensor onboard the NPOESS Preparatory Project (NPP) and NPOESS-era platforms. NexSat acquires and processes data from polar orbiters (AVHRR, MODIS, SeaWiFS, DMSP, and TRMM) that serve as heritage instruments for VIIRS. Geostationary sensors and numerical weather prediction (NWP) overlays supplement the image product suite, making NexSat a one-stop shop for current and future environmental monitoring. NRLMRY collaborates with the Cooperative Institute for Research in the Atmosphere (CIRA) and the Cooperative Institute for Meteorological Satellite Studies (CIMSS) for product development. Together with the Cooperative Program for Operational Meteorology, Education and Training (COMET®), NRLMRY provides educational outreach to research and development communities as well as to the general public. This paper describes the products within the NexSat webpage and its training resources. The product suite consists of generic and state-of-the-art images. Along with the standard visible, IR, and water vapor products, NexSat also includes dust enhancement, cloud properties, cloud profiling, snow/cloud discrimination, volcanic ash plumes, hot spots, and aerosol content over land and water. NexSat training resources are described, including on-line product tutorials, a course module, and outreach efforts to the National Weather Service, government agencies, academic institutions, and international organizations.

  12. The Metadata Cloud: The Last Piece of a Distributed Data System Model

    NASA Astrophysics Data System (ADS)

    King, T. A.; Cecconi, B.; Hughes, J. S.; Walker, R. J.; Roberts, D.; Thieman, J. R.; Joy, S. P.; Mafi, J. N.; Gangloff, M.

    2012-12-01

    Distributed data systems have existed ever since computer systems were first networked together. Over the years the model for distributed data systems has evolved from basic file transfer to client-server, to multi-tiered, to grid, and finally to cloud-based systems. Initially metadata was tightly coupled to the data, either by embedding the metadata in the same file containing the data or by co-locating the metadata in commonly named files. As the sources of data multiplied, data volumes increased and services specialized to improve efficiency, a cloud system model emerged. In a cloud system, computing and storage are provided as services, with accessibility emphasized over physical location. Computation and data clouds are common implementations. Effectively using these data and computation capabilities requires metadata. When metadata is stored separately from the data, a metadata cloud is formed. With a metadata cloud, information and knowledge about data resources can migrate efficiently from system to system, enabling services and allowing the data to remain efficiently stored until used. This is especially important with "Big Data," where movement of the data is limited by bandwidth. We examine how the metadata cloud completes a general distributed data system model and how standards play a role, and relate this to the existing types of cloud computing. We also look at the major science data systems in existence and compare each to the generalized cloud system model.

  13. Analysis of cloud-based solutions on EHRs systems in different scenarios.

    PubMed

    Fernández-Cardeñosa, Gonzalo; de la Torre-Díez, Isabel; López-Coronado, Miguel; Rodrigues, Joel J P C

    2012-12-01

    Nowadays, with the growth of wireless connectivity, people can access resources hosted in the Cloud almost anywhere. In this context, organizations can take advantage of this fact, in terms of e-Health, by deploying Cloud-based solutions for e-Health services. In this paper, two Cloud-based solutions for different scenarios of Electronic Health Records (EHRs) management systems are proposed. We surveyed Medline articles published between 2005 and 2011 on the implementation of Cloud-based e-Health services. In order to analyze the best scenario for the deployment of Cloud Computing, two solutions, for a large hospital and for a network of Primary Care health centers, have been studied. An economic estimation of the implementation cost for both scenarios was made via the Amazon calculator tool. As a result of this analysis, two solutions are suggested depending on the scenario: for a large hospital, a typical Cloud solution is assumed in which only the needed services are hired; to serve several Primary Care centers, we suggest implementing a network that interconnects these centers with a single Cloud environment. Finally, a hybrid solution is considered, in which EHRs with images are hosted in the hospital or Primary Care centers and the rest are migrated to the Cloud.

  14. Validation of AIRS/AMSU Cloud Retrievals Using MODIS Cloud Analyses

    NASA Technical Reports Server (NTRS)

    Molnar, Gyula I.; Susskind, Joel

    2005-01-01

    The AIRS/AMSU (flying on the EOS-AQUA satellite) sounding retrieval methodology allows for the retrieval of key atmospheric/surface parameters under partially cloudy conditions (Susskind et al.). In addition, cloud parameters are derived from the AIRS/AMSU observations. Within each AIRS footprint, cloud parameters at up to 2 cloud layers are determined, with differing cloud top pressures and effective cloud fractions (the product of the infrared emissivity at 11 microns and the physical cloud fraction). However, the AIRS cloud product has so far not been rigorously evaluated and validated. Fortunately, collocated, coincident radiances measured by MODIS/AQUA (at a much lower spectral resolution but roughly an order-of-magnitude higher spatial resolution than that of AIRS) are used to determine analogous cloud products from MODIS. This allows a rather rare and interesting possibility: the intercomparison and mutual validation of imager- versus sounder-based cloud products obtained from the same satellite positions. First, we present results of small-scale (granule) instantaneous intercomparisons. Next, we evaluate differences in temporally averaged (monthly) means as well as the representation of the inter-annual variability of cloud parameters in the two cloud data sets. In particular, we present statistical differences in the retrieved cloud fraction and cloud top pressure. We investigate which types of cloud systems are retrieved most consistently (if any) with both retrieval schemes, and attempt to assess the reasons behind statistically significant differences.

  15. Example MODIS Global Cloud Optical and Microphysical Properties: Comparisons between Terra and Aqua

    NASA Technical Reports Server (NTRS)

    Hubanks, P. A.; Platnick, S.; King, M. D.; Ackerman, S. A.; Frey, R. A.

    2003-01-01

    MODIS observations from the NASA EOS Terra spacecraft (launched in December 1999, 1030 local time equatorial crossing) have provided a unique data set of Earth observations. With the launch of the NASA Aqua spacecraft in May 2002 (1330 local time), two MODIS daytime (sunlit) and nighttime observations are now available in a 24 hour period, allowing for some measure of diurnal variability. We report on an initial analysis of several operational global (Level-3) cloud products from the two platforms. The MODIS atmosphere Level-3 products, which include clear-sky and aerosol products in addition to cloud products, are available as three separate files providing daily, eight-day, and monthly aggregations; each temporal aggregation is spatially aggregated to a 1 degree grid. The files contain approximately 600 statistical datasets (from simple means and standard deviations to 1- and 2-dimensional histograms). Operational cloud products include detection (cloud fraction), cloud-top properties, and daytime-only cloud optical thickness and particle effective radius for both water and ice clouds. We compare example global Terra and Aqua cloud fraction, optical thickness, and effective radius aggregations.

  16. A General Cross-Layer Cloud Scheduling Framework for Multiple IoT Computer Tasks.

    PubMed

    Wu, Guanlin; Bao, Weidong; Zhu, Xiaomin; Zhang, Xiongtao

    2018-05-23

    The diversity of IoT services and applications brings enormous challenges to improving the performance of the scheduling of multiple computer tasks in cross-layer cloud computing systems. Unfortunately, the commonly employed frameworks fail to adapt to the new patterns of the cross-layer cloud. To solve this issue, we design a new computer task scheduling framework for multiple IoT services in cross-layer cloud computing systems. Specifically, we first analyze the features of the cross-layer cloud and of computer tasks. Then, we design the scheduling framework based on this analysis and present detailed models to illustrate the procedures for using the framework. With the proposed framework, the IoT services deployed in cross-layer cloud computing systems can dynamically select suitable algorithms and use resources more effectively to finish computer tasks with different objectives. Finally, algorithms are given based on the framework, and extensive experiments validate its effectiveness as well as its superiority.
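
    The "dynamically select suitable algorithms" idea reduces, at its simplest, to a dispatch table keyed by objective; a toy sketch follows, with the two placeholder policies and the task fields invented for illustration.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Task:
        cpu: float        # demand in arbitrary units
        deadline_s: float

    def min_energy_schedule(tasks):   # placeholder policy: small tasks first
        return sorted(tasks, key=lambda t: t.cpu)

    def min_latency_schedule(tasks):  # placeholder policy: earliest deadline first
        return sorted(tasks, key=lambda t: t.deadline_s)

    SCHEDULERS = {"energy": min_energy_schedule, "latency": min_latency_schedule}

    def schedule(tasks, objective):
        """Dispatch to the scheduling algorithm matching the service objective."""
        return SCHEDULERS[objective](tasks)

    print(schedule([Task(2.0, 5.0), Task(1.0, 1.0)], "latency"))
    ```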

  17. Design for Run-Time Monitor on Cloud Computing

    NASA Astrophysics Data System (ADS)

    Kang, Mikyung; Kang, Dong-In; Yun, Mira; Park, Gyung-Leen; Lee, Junghoon

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications, as well as infrastructure, as services over the Internet. A cloud is a type of parallel and distributed system consisting of a collection of interconnected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. Large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design the Run-Time Monitor (RTM), system software that monitors application behavior at run-time, analyzes the collected information, and optimizes resources on cloud computing. RTM monitors application software through library instrumentation, and the underlying hardware through performance counters, optimizing the computing configuration based on the analyzed data.

  18. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for comprehensive and innovative evaluation of climate models through the synergistic use of global satellite observations, in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations of fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs effort has created several key global observational datasets that are readily usable for model evaluation. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making it both computationally and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented it as a web-service-based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named the Climate Model Diagnostic Analyzer (CMDA), and is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) a diagnostic analysis methodology; (2) web-service-based, cloud-enabled technology; and (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution functions, conditional sampling, and time-lagged correlation maps. We have implemented the new methodology as web services and incorporated the system into the Cloud. We have also developed a provenance management system for CMDA, in which CMDA service semantics modeling, service search and recommendation, and service execution history management are designed and implemented.

  19. Developing and Evaluating RGB Composite MODIS Imagery for Applications in National Weather Service Forecast Offices

    NASA Technical Reports Server (NTRS)

    Oswald, Hayden; Molthan, Andrew L.

    2011-01-01

    Satellite remote sensing has gained widespread use in the field of operational meteorology. Although raw satellite imagery is useful, several techniques exist which can convey multiple types of data in a more efficient way. One of these techniques is multispectral compositing. The NASA Short-term Prediction Research and Transition (SPoRT) Center has developed two multispectral satellite imagery products which utilize data from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard NASA's Terra and Aqua satellites, based upon products currently generated and used by the European Organization for the Exploitation of Meteorological Satellites (EUMETSAT). The nighttime microphysics product allows users to identify clouds occurring at different altitudes, but emphasizes fog and low cloud detection. This product improves upon current spectral difference and single channel infrared techniques. Each of the current products has its own set of advantages for nocturnal fog detection, but each also has limiting drawbacks which can hamper the analysis process. The multispectral product combines each current product with a third channel difference. Since the final image is enhanced with color, it simplifies the fog identification process. Analysis has shown that the nighttime microphysics imagery product represents a substantial improvement to conventional fog detection techniques, as well as provides a preview of future satellite capabilities to forecasters.
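
    The compositing technique itself is straightforward: three channel combinations are range-scaled into red, green and blue. The sketch below follows the widely published EUMETSAT nighttime microphysics recipe; treat the exact scaling ranges as indicative rather than as SPoRT's operational settings.

    ```python
    import numpy as np

    def nighttime_microphysics_rgb(bt039, bt108, bt120):
        """Compose an RGB from brightness temperatures (K) at 3.9/10.8/12.0 um."""
        def scale(x, lo, hi):
            return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

        r = scale(bt120 - bt108, -4.0, 2.0)   # split-window (optical depth) signal
        g = scale(bt108 - bt039, 0.0, 10.0)   # nocturnal fog/low-cloud signal
        b = scale(bt108, 243.0, 293.0)        # cloud-top temperature
        return np.dstack([r, g, b])           # H x W x 3 image in [0, 1]
    ```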

  20. iMAGE cloud: medical image processing as a service for regional healthcare in a hybrid cloud environment.

    PubMed

    Liu, Li; Chen, Weiping; Nie, Min; Zhang, Fengjuan; Wang, Yu; He, Ailing; Wang, Xiaonan; Yan, Gen

    2016-11-01

    To handle the emergence of the regional healthcare ecosystem, physicians and surgeons in various departments and healthcare institutions must process medical images securely, conveniently, and efficiently, and must integrate them with electronic medical records (EMRs). In this manuscript, we propose a software-as-a-service (SaaS) cloud called the iMAGE cloud. A three-layer hybrid cloud was created to provide medical image processing services in the smart city of Wuxi, China, in April 2015. In the first step, medical images and EMR data were received and integrated via the hybrid regional healthcare network. Then, traditional and advanced image processing functions were provided and computed in a unified manner in the high-performance cloud units. Finally, the image processing results were delivered to regional users using virtual desktop infrastructure (VDI) technology. Security infrastructure was also taken into consideration. Integrated information query and many advanced medical image processing functions, such as coronary extraction, pulmonary reconstruction, vascular extraction, intelligent detection of pulmonary nodules, image fusion, and 3D printing, were available to local physicians and surgeons in various departments and healthcare institutions. Implementation results indicate that the iMAGE cloud can provide convenient, efficient, compatible, and secure medical image processing services in regional healthcare networks. The iMAGE cloud has proven valuable in regional healthcare applications and could have a promising future in healthcare systems worldwide.

  1. The Impact of Cloud Correction on the Redistribution of Reactive Nitrogen Species

    NASA Astrophysics Data System (ADS)

    Pour Biazar, A.; McNider, R. T.; Doty, K.; Cameron, R.

    2007-12-01

    Clouds are particularly important to air quality, yet correct prediction of clouds in time and space remains a great challenge for air quality models. One aspect of the cloud impact on air quality is the modification of photolysis reaction rates by clouds. Clouds can significantly alter the solar radiation at the wavelengths affecting photolysis rates. Such modifications significantly impact atmospheric photochemistry and alter the chemical composition of the boundary layer, as well as the partitioning of chemical compounds, by creating a new equilibrium state. Since air quality models are often used for air quality and emission reduction assessment, understanding the uncertainty caused by inaccurate cloud prediction is imperative. In this study we investigate the radiative impact of clouds in altering the partitioning of nitrogen species in emission source regions. Such alterations affect the local nitrogen budget and thereby alter the atmospheric composition within the boundary layer. Results from two model simulations were analyzed: a control run using model-predicted clouds, and a run in which satellite-observed clouds were assimilated into the model. We use satellite-retrieved cloud transmissivity, cloud top height, and observed cloud fraction to correct photolysis rates for cloud cover in the Community Multiscale Air Quality (CMAQ) modeling system. The simulations were performed on 4- and 12-km resolution domains over Texas, extending east to Mississippi, for the period of August 24 to August 31, 2000. The results clearly indicate that not using cloud observations in the model can drastically alter the predicted atmospheric chemical composition within the boundary layer and exaggerate or under-predict ozone concentrations. The cloud impact is acute and more pronounced over emission source regions and can lead to drastic errors in model predictions of ozone and its precursors. Clouds also increased the lifetime of ozone precursors, leading to their transport out of the source regions and causing further ozone production downwind. The longer lifetime of NOx and its transport over regions high in biogenic hydrocarbon emissions (in the eastern part of the domain) led to increased ozone production that was missing in the control simulation. An indirect impact of the clouds in the emission source areas is the alteration of the partitioning of nitrogen oxides and the impact on the nitrogen budget due to surface removal. This is caused by the disparity between the deposition velocity of NOx and that of the nitrates produced from NOx oxidation. Under clear skies, NOx undergoes chemical transformation and produces nitrates such as HNO3 and PAN. In the presence of thick clouds, due to the reduction in photochemical activity, nitrogen monoxide (NO) rapidly consumes ozone (O3) and produces nitrogen dioxide (NO2), while the production of HNO3 and the loss of NOx to chemical transformation are reduced. Therefore, in the clear-sky case more nitrogen is lost in the vicinity of emission sources. A detailed analysis of two emission source regions, Houston-Galveston and the New Orleans area, will be presented. Acknowledgments. This work was accomplished with partial support from a Cooperative Agreement between the University of Alabama in Huntsville and the Minerals Management Service on Gulf of Mexico issues.
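
    The photolysis adjustment at the heart of such studies can be sketched as a cloud correction factor built from the satellite-derived cloud fraction and transmissivity, in the spirit of the Chang et al. (1987) correction used in CMAQ-like models; the coefficients below are illustrative, not the study's exact implementation.

    ```python
    def photolysis_correction(j_clear, cloud_frac, transmissivity, cos_zenith,
                              below_cloud=True):
        """Scale a clear-sky photolysis rate for observed cloud (illustrative)."""
        if below_cloud:
            # Below cloud: attenuated direct beam plus enhanced diffuse light.
            factor = 1.0 + cloud_frac * (1.6 * transmissivity * cos_zenith - 1.0)
        else:
            # Above cloud: brightening from light reflected off the cloud top.
            factor = 1.0 + cloud_frac * (1.0 - transmissivity) * cos_zenith
        return j_clear * factor
    ```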

  2. Embracing the Cloud: Six Ways to Look at the Shift to Cloud Computing

    ERIC Educational Resources Information Center

    Ullman, David F.; Haggerty, Blake

    2010-01-01

    Cloud computing is the latest paradigm shift for the delivery of IT services. Where previous paradigms (centralized, decentralized, distributed) were based on fairly straightforward approaches to technology and its management, cloud computing is radical in comparison. The literature on cloud computing, however, suffers from many divergent…

  3. An approximate dynamic programming approach to resource management in multi-cloud scenarios

    NASA Astrophysics Data System (ADS)

    Pietrabissa, Antonio; Priscoli, Francesco Delli; Di Giorgio, Alessandro; Giuseppi, Alessandro; Panfili, Martina; Suraci, Vincenzo

    2017-03-01

    The programmability and virtualisation of network resources are crucial to deploying scalable Information and Communications Technology (ICT) services. The increasing demand for cloud services, mainly for storage and computing, requires a new functional element, the Cloud Management Broker (CMB), aimed at managing multiple cloud resources to meet customers' requirements and, simultaneously, to optimise their usage. This paper proposes a multi-cloud resource allocation algorithm that manages resource requests with the aim of maximising the CMB revenue over time. The algorithm is based on Markov decision process modelling and relies on reinforcement learning techniques to find an approximate solution online.
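
    A toy tabular Q-learning loop conveys the reinforcement-learning core of such a controller; the state encoding, two-cloud action set and reward are invented, and the paper's actual algorithm is an approximate method over a richer Markov decision process.

    ```python
    import random

    ACTIONS = ["reject", "cloud_A", "cloud_B"]   # invented CMB action set
    q = {}                                       # (state, action) -> value table

    def choose(state, eps=0.1):
        """Epsilon-greedy action selection over the learned values."""
        if random.random() < eps:
            return random.choice(ACTIONS)
        return max(ACTIONS, key=lambda a: q.get((state, a), 0.0))

    def update(state, action, reward, next_state, alpha=0.1, gamma=0.95):
        """One Q-learning backup after observing a transition and its revenue."""
        best_next = max(q.get((next_state, a), 0.0) for a in ACTIONS)
        old = q.get((state, action), 0.0)
        q[(state, action)] = old + alpha * (reward + gamma * best_next - old)
    ```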

  4. Implementing a New Cloud Computing Library Management Service: A Symbiotic Approach

    ERIC Educational Resources Information Center

    Dula, Michael; Jacobsen, Lynne; Ferguson, Tyler; Ross, Rob

    2012-01-01

    This article presents the story of how Pepperdine University migrated its library management functions to the cloud using what is now known as OCLC's WorldShare Management Services (WMS). The story of implementing this new service is told from two vantage points: (1) that of the library; and (2) that of the service provider. The authors were the…

  5. DICOM relay over the cloud.

    PubMed

    Silva, Luís A Bastião; Costa, Carlos; Oliveira, José Luis

    2013-05-01

    Healthcare institutions worldwide have adopted picture archiving and communication systems (PACS) for enterprise access to images, relying on Digital Imaging and Communications in Medicine (DICOM) standards for data exchange. However, communication across a wider domain of independent medical institutions is not well standardized. A DICOM-compliant bridge was developed for extending and sharing DICOM services across healthcare institutions without requiring complex network setups or dedicated communication channels. A set of DICOM routers interconnected through a public cloud infrastructure was implemented to support medical image exchange among institutions. Despite the advantages of cloud computing, new challenges were encountered regarding data privacy, particularly when medical data are transmitted across different domains. To address this issue, a ciphered data channel was created between the entities sharing DICOM services. Two main DICOM services were implemented in the bridge: Storage and Query/Retrieve. The performance measures demonstrated that it is quite simple to exchange information and processes between several institutions. The solution can be integrated with any currently installed PACS-DICOM infrastructure and works transparently with well-known cloud service providers. Cloud computing was introduced to augment enterprise PACS by providing standard medical imaging services across different institutions, offering communication privacy and enabling the creation of wider PACS scenarios with suitable technical solutions.

  6. A New Approach to Integrate Internet-of-Things and Software-as-a-Service Model for Logistic Systems: A Case Study

    PubMed Central

    Chen, Shang-Liang; Chen, Yun-Yao; Hsu, Chiang

    2014-01-01

    Cloud computing is changing the way software is developed and managed in enterprises, and with it the way of doing business: dynamically scalable and virtualized resources are provided as services over the Internet. Traditional manufacturing systems such as supply chain management (SCM), customer relationship management (CRM), and enterprise resource planning (ERP) are often developed case by case, but researchers have called for effective collaboration between different systems, platforms, programming languages, and interfaces. In cloud-computing-based systems, distributed resources are encapsulated into cloud services and centrally managed, which allows high automation, flexibility, fast provisioning, and ease of integration at low cost. The integration between physical resources and cloud services can be improved by combining Internet of Things (IoT) technology and Software-as-a-Service (SaaS) technology. This study proposes a new approach for developing cloud-based manufacturing systems based on a four-layer SaaS model. There are three main contributions of this paper: (1) enterprises can develop their own cloud-based logistic management information systems based on the proposed approach; (2) a case study based on literature reviews, with experimental results, is presented to verify that the system performance is remarkable; (3) challenges encountered and feedback collected from T Company in the case study are discussed for the purpose of enterprise deployment. PMID:24686728

  7. A new approach to integrate Internet-of-things and software-as-a-service model for logistic systems: a case study.

    PubMed

    Chen, Shang-Liang; Chen, Yun-Yao; Hsu, Chiang

    2014-03-28

    Cloud computing is changing the way software is developed and managed in enterprises, and with it the way of doing business: dynamically scalable and virtualized resources are provided as services over the Internet. Traditional manufacturing systems such as supply chain management (SCM), customer relationship management (CRM), and enterprise resource planning (ERP) are often developed case by case, but researchers have called for effective collaboration between different systems, platforms, programming languages, and interfaces. In cloud-computing-based systems, distributed resources are encapsulated into cloud services and centrally managed, which allows high automation, flexibility, fast provisioning, and ease of integration at low cost. The integration between physical resources and cloud services can be improved by combining Internet of Things (IoT) technology and Software-as-a-Service (SaaS) technology. This study proposes a new approach for developing cloud-based manufacturing systems based on a four-layer SaaS model. There are three main contributions of this paper: (1) enterprises can develop their own cloud-based logistic management information systems based on the proposed approach; (2) a case study based on literature reviews, with experimental results, is presented to verify that the system performance is remarkable; (3) challenges encountered and feedback collected from T Company in the case study are discussed for the purpose of enterprise deployment.

  8. Comparison Between CCCM and CloudSat Radar-Lidar (RL) Cloud and Radiation Products

    NASA Technical Reports Server (NTRS)

    Ham, Seung-Hee; Kato, Seiji; Rose, Fred G.; Sun-Mack, Sunny

    2015-01-01

    To enhance cloud property retrievals, LaRC and CIRA each developed algorithms that combine properties obtained from the passive, active, and imager sensors of the A-Train satellite constellation. When global cloud fractions are compared, the LaRC-produced CERES-CALIPSO-CloudSat-MODIS (CCCM) product shows a larger low-level cloud fraction over the tropical ocean, while the CIRA-produced Radar-Lidar (RL) product shows a larger mid-level cloud fraction at high latitudes. The difference in low-level cloud fraction is due to the different methods used to filter lidar-detected cloud layers, whereas the difference in mid-level clouds arises from the different priorities given to cloud boundaries from lidar and radar.

  9. Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Klems, Markus; Nimis, Jens; Tai, Stefan

    On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability of Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measuring the costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and in comparing these costs to those of conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real-world scenarios.
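
    As a toy illustration of the kind of cost comparison such a framework supports (not the authors' actual model), the following sketch computes how pay-per-use cloud capacity compares with a conventionally hosted server across utilization levels; all prices and lifetimes are invented.

```python
# Toy cost comparison in the spirit of a cloud-valuation framework.
# All figures are invented for illustration.
HOURS_PER_MONTH = 730

def on_premises_monthly(capex, lifetime_months, opex_per_month):
    """Amortized monthly cost of a conventionally hosted server."""
    return capex / lifetime_months + opex_per_month

def cloud_monthly(price_per_hour, utilization):
    """Pay-per-use cost: you pay only for the hours actually used."""
    return price_per_hour * HOURS_PER_MONTH * utilization

on_prem = on_premises_monthly(capex=6000, lifetime_months=36, opex_per_month=120)
for util in (0.1, 0.3, 0.5, 0.8, 1.0):
    cloud = cloud_monthly(price_per_hour=0.40, utilization=util)
    cheaper = "cloud" if cloud < on_prem else "on-premises"
    print(f"utilization {util:4.0%}: cloud ${cloud:7.2f} "
          f"vs on-prem ${on_prem:7.2f} -> {cheaper}")
```

    The pattern the framework formalizes is visible even in this toy: pay-per-use wins at low or bursty utilization, while steady high utilization favors conventional hosting.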

  10. An Overview of Cloud Computing in Distributed Systems

    NASA Astrophysics Data System (ADS)

    Divakarla, Usha; Kumari, Geetha

    2010-11-01

    Cloud computing is the emerging trend in the field of distributed computing. Cloud computing evolved from grid computing and distributed computing. The cloud plays an important role in large organizations by maintaining huge volumes of data with limited resources. The cloud also helps in resource sharing through virtual machines provided by the cloud service provider. This paper gives an overview of cloud organization and some of the basic security issues pertaining to the cloud.

  11. GSKY: A scalable distributed geospatial data server on the cloud

    NASA Astrophysics Data System (ADS)

    Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben

    2017-04-01

    Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. The ability to combine information coming from different geospatial collections is in increasing demand by the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojections, resampling and other transformations. Due to the large data volume inherent in these collections, storing multiple copies of them is unfeasible, so such data manipulation must be performed on the fly using efficient, high-performance techniques. Ideally this should be done using a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door to such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), holds over 10 Petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve a large number of concurrent users. Typical geospatial workflows handling different file formats and data types, or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY. This is achieved by decoupling the data ingestion and indexing process into an independent service: an indexing service crawls data collections, either locally or remotely, extracting, storing and indexing all spatio-temporal metadata associated with each individual record. GSKY gives the user the ability to specify how ingested data should be aggregated, transformed and presented. It presents an OGC standards-compliant interface, allowing ready access to the data via Web Map Services (WMS), Web Processing Services (WPS) or, for raw data arrays, Web Coverage Services (WCS). The presentation will show cases where we have used this new capability to provide a significant improvement over previous approaches.
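
    Because GSKY exposes OGC standards-compliant interfaces, it can be queried with an ordinary WMS request. The sketch below shows a generic WMS 1.3.0 GetMap call; the endpoint URL and layer name are hypothetical, and the parameter names come from the OGC WMS standard rather than GSKY's own documentation.

```python
# Generic OGC WMS 1.3.0 GetMap request against an OGC-compliant server
# such as GSKY. The URL and layer name are hypothetical placeholders.
import requests

params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "landsat8_nbar",          # hypothetical layer name
    "styles": "",
    "crs": "EPSG:4326",
    "bbox": "-44.0,112.0,-10.0,154.0",  # lat/lon axis order in WMS 1.3.0
    "width": 1024,
    "height": 768,
    "format": "image/png",
    "time": "2017-01-01T00:00:00Z",     # temporal slice to render
}
resp = requests.get("https://gsky.example.org/ows", params=params, timeout=60)
resp.raise_for_status()
with open("map.png", "wb") as f:
    f.write(resp.content)               # rendered on demand by the server
```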

  12. Production experience with the ATLAS Event Service

    NASA Astrophysics Data System (ADS)

    Benjamin, D.; Calafiura, P.; Childers, T.; De, K.; Guan, W.; Maeno, T.; Nilsson, P.; Tsulaia, V.; Van Gemmeren, P.; Wenaus, T.; ATLAS Collaboration

    2017-10-01

    The ATLAS Event Service (AES) has been designed and implemented for efficient running of ATLAS production workflows on a variety of computing platforms, ranging from conventional Grid sites to opportunistic, often short-lived resources such as spot-market commercial clouds, supercomputers and volunteer computing. The Event Service architecture allows real-time delivery of fine-grained workloads to running payload applications, which process dispatched events or event ranges and immediately stream the outputs to highly scalable object stores. Thanks to its agile and flexible architecture, the AES is currently being used by grid sites for assigning low-priority workloads to otherwise idle computing resources; for harvesting HPC resources in an efficient back-fill mode; and for massively scaling out to the 50-100k concurrent-core level on the Amazon spot market to efficiently utilize those transient resources for peak production needs. Platform ports in development include ATLAS@Home (BOINC), the Google Compute Engine, and a growing number of HPC platforms. After briefly reviewing the concept and the architecture of the Event Service, we will report the status and experience gained in AES commissioning and production operations on supercomputers, and our plans for extending ES applications beyond Geant4 simulation to other workflows, such as reconstruction and data analysis.
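
    The fine-grained pattern described here, i.e. pull a small event range, process it, and stream the output to an object store immediately so preemption loses little work, can be sketched schematically. This is a sketch of the pattern only, not the actual ATLAS event service interfaces: the dispatcher endpoint and bucket name are hypothetical, and boto3 is assumed for the object store.

```python
# Schematic of the event-service pattern: fetch fine-grained event
# ranges, process them, and immediately stream outputs to an object
# store. Dispatcher URL and bucket are hypothetical placeholders.
import io
import boto3
import requests

DISPATCHER = "https://dispatcher.example.org/next-range"   # hypothetical
s3 = boto3.client("s3")

def process(events):
    # Placeholder for the payload application (e.g. simulation).
    return ("\n".join(f"processed {e}" for e in events)).encode()

while True:
    resp = requests.get(DISPATCHER, timeout=30)
    if resp.status_code == 204:          # dispatcher has no more work
        break
    rng = resp.json()                    # e.g. {"id": "job1-007", "events": [...]}
    output = process(rng["events"])
    # Stream the finished range to the object store right away, so a
    # preempted spot instance loses at most one event range of work.
    s3.put_object(Bucket="aes-outputs", Key=f"{rng['id']}.out",
                  Body=io.BytesIO(output))
```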

  13. Assessment of the NPOESS/VIIRS Nighttime Infrared Cloud Optical Properties Algorithms

    NASA Astrophysics Data System (ADS)

    Wong, E.; Ou, S. C.

    2008-12-01

    In this paper we will describe two NPOESS VIIRS IR algorithms used to retrieve microphysical properties of water and ice clouds under nighttime conditions. Both algorithms employ four VIIRS IR channels: M12 (3.7 μm), M14 (8.55 μm), M15 (10.7 μm) and M16 (12 μm). The physical basis for the two algorithms is similar in that the Cloud Top Temperature (CTT) is derived from M14 and M16 for ice clouds, while the Cloud Optical Thickness (COT) and Cloud Effective Particle Size (CEPS) are derived from M12 and M15. The two algorithms differ in the radiative transfer parameterization equations used for ice and water clouds. Both the VIIRS nighttime IR algorithms and the CERES split-window method employ the 3.7 μm and 10.7 μm bands for cloud optical property retrievals, apparently based on similar physical principles but with different implementations. It is therefore reasonable to expect the VIIRS and CERES IR algorithms to have comparable performance and similar limitations. To demonstrate the VIIRS nighttime IR algorithm performance, we will select a number of test cases using NASA MODIS L1b radiance products as proxy input data for VIIRS. The VIIRS-retrieved COT and CEPS will then be compared to cloud products available from the MODIS, NASA CALIPSO, CloudSat and CERES sensors. For the MODIS product, the nighttime cloud emissivity will serve as an indirect comparison to VIIRS COT. For the CALIPSO and CloudSat products, the layered COT will be used for direct comparison. Finally, the CERES products will provide a direct comparison with COT as well as CEPS. Due to the large uncertainties in these cloud products, this study can only provide a qualitative assessment of the VIIRS IR algorithms.

  14. Development and clinical study of mobile 12-lead electrocardiography based on cloud computing for cardiac emergency.

    PubMed

    Fujita, Hideo; Uchimura, Yuji; Waki, Kayo; Omae, Koji; Takeuchi, Ichiro; Ohe, Kazuhiko

    2013-01-01

    To improve emergency services for accurate diagnosis of cardiac emergencies, we developed a low-cost mobile electrocardiography system, "Cloud Cardiology®", based upon cloud computing for prehospital diagnosis. It comprises a compact 12-lead ECG unit equipped with Bluetooth and an Android smartphone with an application for transmission. A cloud server enables ECGs to be shared simultaneously inside and outside the hospital. We evaluated the clinical effectiveness of this system by conducting a clinical trial with historical comparison in real emergency service settings, using a rapid response car. We found that the system can shorten the onset-to-balloon time of patients with acute myocardial infarction, resulting in better clinical outcomes. We propose that cloud-computing-based simultaneous data sharing could be a powerful solution for emergency cardiology services, given its significant clinical benefit.

  15. Applying analytic hierarchy process to assess healthcare-oriented cloud computing service systems.

    PubMed

    Liao, Wen-Hwa; Qiu, Wan-Li

    2016-01-01

    Numerous differences exist between the healthcare industry and other industries. Difficulties in the business operation of the healthcare industry have continually increased because of the volatility and importance of health care, changes to and requirements of health insurance policies, and the statuses of healthcare providers, which are typically considered not-for-profit organizations. Moreover, because of the financial risks associated with constant changes in healthcare payment methods and constantly evolving information technology, healthcare organizations must continually adjust their business operation objectives; therefore, cloud computing presents both a challenge and an opportunity. As a response to aging populations and the prevalence of the Internet in fast-paced contemporary societies, cloud computing can be used to facilitate the task of balancing the quality and costs of health care. To evaluate cloud computing service systems for use in health care, providing decision makers with a comprehensive assessment method for prioritizing decision-making factors is highly beneficial. Hence, this study applied the analytic hierarchy process, compared items related to cloud computing and health care, executed a questionnaire survey, and then classified the critical factors influencing healthcare cloud computing service systems on the basis of statistical analyses of the questionnaire results. The results indicate that the primary factor affecting the design or implementation of optimal cloud computing healthcare service systems is cost effectiveness, with the secondary factors being practical considerations such as software design and system architecture.
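
    To make the method concrete, here is a small sketch of the core AHP computation: deriving a priority vector and a consistency ratio from a pairwise-comparison matrix. The criteria echo the study's findings, but the judgment values are invented, not the study's survey data.

```python
# Core AHP computation: priority weights and consistency ratio from a
# pairwise-comparison matrix. Criteria judgments below are invented.
import numpy as np

criteria = ["cost effectiveness", "software design", "system architecture"]
# A[i, j] = how strongly criterion i is preferred over j (Saaty 1-9 scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority vector: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random-index table
cr = ci / ri

for name, weight in zip(criteria, w):
    print(f"{name}: {weight:.3f}")
print(f"consistency ratio: {cr:.3f} (judgments acceptable if < 0.10)")
```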

  16. Proto-Examples of Data Access and Visualization Components of a Potential Cloud-Based GEOSS-AI System

    NASA Technical Reports Server (NTRS)

    Teng, William; Lynnes, Christopher

    2014-01-01

    Once a research or application problem has been identified, one logical next step is to search for available relevant data products. Thus, an early component of a potential GEOSS-AI system, in the continuum between observations and end-point research, applications, and decision making, would be one that enables transparent data discovery and access by users. Such a component might be effected via the system's data agents. Presumably, some kind of data cataloging has already been implemented, e.g., in the GEOSS Common Infrastructure (GCI). Both the agents and cataloging could also leverage existing resources external to the system. The system would have some means to accept and integrate user-contributed agents. The need or desirability for some data format internal to the system should be evaluated. Another early component would be one that facilitates browsing and visualization of the data, as well as some basic analyses. Three ongoing projects at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) provide possible proto-examples of potential data access and visualization components of a cloud-based GEOSS-AI system. 1. Reorganizing data archived as time-step arrays into point-time series ("data rods"), as well as leveraging the NASA Simple Subset Wizard (SSW), to significantly increase the number of data products available, at multiple NASA data centers, for production as on-the-fly (virtual) data rods. SSW's data discovery is based on OpenSearch. Both pre-generated and virtual data rods are accessible via Web services. 2. Developing Web Feature Services to publish the metadata, and expose the locations, of pre-generated and virtual data rods in the GEOSS Portal, and to enable direct access to the data via Web services. SSW is also leveraged to increase the availability of both NASA and non-NASA data. 3. Federating NASA Giovanni (Geospatial Interactive Online Visualization and Analysis Interface), for multi-sensor data exploration, which would allow each cooperating data center, currently the NASA Distributed Active Archive Centers (DAACs), to configure its own Giovanni deployment, while also allowing all the deployments to incorporate each other's data. A federated Giovanni comprises Giovanni Virtual Machines, which can be run on local servers or in the cloud.
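
    The "data rods" reorganization in item 1 is essentially a transpose: data archived as one spatial grid per time step is rewritten as one time series per grid point, so a point query reads a single contiguous series instead of touching every time-step file. A minimal numpy sketch of the idea (the array shapes and values are invented):

```python
# Minimal sketch of the "data rods" idea: turn time-step-major arrays
# (one 2-D grid per time) into point-major time series (one series per
# grid cell). Shapes and values are invented for illustration.
import numpy as np

n_time, n_lat, n_lon = 8, 4, 5
archive = np.random.rand(n_time, n_lat, n_lon)   # as archived: grid per time step

# One transpose + reshape yields a "rod" (full time series) per pixel.
rods = archive.transpose(1, 2, 0).reshape(n_lat * n_lon, n_time)

def rod_for(lat_idx, lon_idx):
    """Return the complete time series at one grid point."""
    return rods[lat_idx * n_lon + lon_idx]

print(rod_for(2, 3))
```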

  17. Contextual cloud-based service oriented architecture for clinical workflow.

    PubMed

    Moreno-Conde, Jesús; Moreno-Conde, Alberto; Núñez-Benjumea, Francisco J; Parra-Calderón, Carlos

    2015-01-01

    Given that, for the acceptance of systems within the healthcare domain, multiple papers have highlighted the importance of integrating tools with the clinical workflow, this paper analyses how clinical context management could be deployed to promote the adoption of advanced cloud services within the clinical workflow. This deployment can be integrated with the specifications promoted by the eHealth European Interoperability Framework. The paper proposes a cloud-based service-oriented architecture that implements a context management system aligned with the HL7 standard known as CCOW.

  18. The Namibia Early Flood Warning System, A CEOS Pilot Project

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Frye, Stuart; Cappelaere, Pat; Sohlberg, Robert; Handy, Matthew; Grossman, Robert

    2012-01-01

    Over the past few years, an international collaboration has developed a pilot project under the auspices of the Committee on Earth Observation Satellites (CEOS) Disasters team. The overall team consists of civilian satellite agencies. For this pilot effort, the development team consists of NASA, the Canadian Space Agency, the Univ. of Maryland, the Univ. of Colorado, the Univ. of Oklahoma, the Ukraine Space Research Institute and the Joint Research Centre (JRC) of the European Commission. This development team collaborates with regional, national and international agencies to deliver end-to-end disaster coverage. In particular, the team is collaborating on this effort with the Namibia Department of Hydrology, beginning in Namibia; the ultimate goal, however, is to expand the functionality to provide early warning over the southern African region. The initial collaboration was initiated by the United Nations Office for Outer Space Affairs and the CEOS Working Group on Information Systems and Services (WGISS). The initial driver was to demonstrate international interoperability using various space agency sensors and models along with regional in-situ ground sensors. In 2010, the team created a preliminary semi-manual system to demonstrate moving and combining key data streams and delivering the data to the Namibia Department of Hydrology during the flood season, which typically runs January through April. In this pilot, a variety of moderate- and high-resolution satellite flood imagery was rapidly delivered and used in conjunction with flood predictive models in Namibia. This was collected in conjunction with ground measurements and was used to examine how to create a customized flood early warning system. During the first year, the team made use of SensorWeb technology to gather various sensor data, which were used to monitor flood waves traveling down basins originating in Angola and eventually flooding villages in Namibia. The team made use of standardized interfaces such as those articulated under the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) set of web services [1][2]. However, it was discovered that making a system like this functional raised many performance issues. Data sets were large, located in a variety of locations behind firewalls, and had to be accessed across open networks, so security was an issue. Furthermore, network access acted as a bottleneck for transferring map products to where they were needed. Finally, during disasters, many users and computer processes act in parallel, so it was very easy to overload the single string of computers stitched together in the virtual system that was initially developed. To address some of these performance issues, the team partnered with the Open Cloud Consortium (OCC), which supplied a Computation Cloud located at the University of Illinois at Chicago and some manpower to administer it. The Flood SensorWeb [3] system was interfaced to the Cloud to provide a high-performance user interface and product development engine. Figure 1 shows the functional diagram of the Flood SensorWeb. Figure 2 shows some of the functionality of the Computation Cloud that was integrated.
A significant portion of the original system was ported to the Cloud, and during the past year technical issues were resolved, including web access to the Cloud, security over the open Internet, initial experiments on handling surge capacity by using virtual machines in the cloud in parallel, tiling techniques for rendering large data sets as layers on a map, interfaces allowing users to customize the data processing/product chain, and other performance-enhancing techniques. The conclusion reached from this effort, and of this presentation, is that defining the interoperability standards is a small fraction of the work. For example, once open web service standards were defined, many users could not make use of them due to security restrictions. Furthermore, once an interoperable system is functional, a surge of users can render it unusable, especially in the disaster domain.

  19. Migrating EO/IR sensors to cloud-based infrastructure as service architectures

    NASA Astrophysics Data System (ADS)

    Berglie, Stephen T.; Webster, Steven; May, Christopher M.

    2014-06-01

    The Night Vision Image Generator (NVIG), a product of US Army RDECOM CERDEC NVESD, is a visualization tool used widely throughout Army simulation environments to provide fully attributed, synthesized, full-motion video using physics-based sensor and environmental effects. The NVIG relies heavily on contemporary hardware-based acceleration and GPU processing techniques, which push the envelope of both enterprise and commodity-level hypervisor support for providing virtual machines with direct access to hardware resources. The NVIG has successfully been integrated into fully virtual environments where system architectures leverage cloud-based technologies to various extents in order to streamline infrastructure and service management. This paper details the challenges presented to engineers seeking to migrate GPU-bound processes, such as the NVIG, to virtual machines and, ultimately, cloud-based IaaS architectures. In addition, it presents the path that led to success for the NVIG. A brief overview of cloud-based infrastructure management tool sets is provided, and several virtual desktop solutions are outlined. A distinction is made between general-purpose virtual desktop technologies and technologies that expose GPU-specific capabilities, including direct rendering and hardware-based video encoding. Candidate hypervisor/virtual machine configurations that nominally satisfy the virtualized hardware-level GPU requirements of the NVIG are presented, and each is subsequently reviewed in light of its implications for higher-level cloud management techniques. Implementation details are included from the hardware level, through the operating system, to the 3D graphics APIs required by the NVIG and similar GPU-bound tools.

  20. Accessing Cloud Properties and Satellite Imagery: A tool for visualization and data mining

    NASA Astrophysics Data System (ADS)

    Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Palikonda, R.

    2016-12-01

    Providing public access to imagery of cloud macro- and microphysical properties and the underlying satellite imagery is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a tool and system that allows end users to easily browse cloud information and satellite imagery that is otherwise difficult to acquire and manipulate. The tool has two uses: one to visualize the data, and the other to access the data directly. It uses a widely adopted access protocol, the Open Geospatial Consortium's Web Map and Processing Services, to encourage users to access the data we produce. Internally, we leverage our practical experience with large, scalable applications to develop a system that has the greatest potential for scalability as well as the ability to be deployed on the cloud. One goal of the tool is to provide a demonstration of the back-end capability to end users so that they can use the dynamically generated imagery and data as input to their own workflows or to set up data mining constraints. We build upon the NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite cloud product information and satellite imagery accessible and easily searchable. Increasingly, information is used in a "mash-up" form where multiple sources of information are combined to add value to disparate but related information. In support of NASA strategic goals, our group aims to make as much cutting-edge scientific knowledge, observations and products available to the citizen science, research and interested communities for these kinds of "mash-ups" as possible, and to provide a means for automated systems to data mine our information. This tool and access method provide a valuable research tool to a wide audience, both as a standalone research tool and as an easily accessed data source that can be mined or used with existing tools.

  1. Progress in Near Real-Time Volcanic Cloud Observations Using Satellite UV Instruments

    NASA Astrophysics Data System (ADS)

    Krotkov, N. A.; Yang, K.; Vicente, G.; Hughes, E. J.; Carn, S. A.; Krueger, A. J.

    2011-12-01

    Volcanic clouds from explosive eruptions can wreak havoc in many parts of the world, as exemplified by the 2010 eruption at the Eyjafjöll volcano in Iceland, which caused widespread disruption to air traffic and resulted in economic impacts across the globe. A suite of satellite-based systems offer the most effective means to monitor active volcanoes and to track the movement of volcanic clouds globally, providing critical information for aviation hazard mitigation. Satellite UV sensors, as part of this suite, have a long history of making unique near-real time (NRT) measurements of sulfur dioxide (SO2) and ash (aerosol index) in volcanic clouds to supplement operational volcanic ash monitoring. Recently a NASA application project has shown that the use of near real-time (NRT, i.e., not older than 3 h) Aura/OMI satellite data produces a marked improvement in volcanic cloud detection using SO2 combined with Aerosol Index (AI) as a marker for ash. An operational online NRT OMI AI and SO2 image and data product distribution system was developed in collaboration with the NOAA Office of Satellite Data Processing and Distribution. Automated volcanic eruption alarms, and the production of volcanic cloud subsets for multiple regions, are provided through the NOAA website. The data provide valuable information in support of the U.S. Federal Aviation Administration goal of a safe and efficient National Air Space. In this presentation, we will highlight the advantages of UV techniques and describe the advances in volcanic SO2 plume height estimation and enhanced volcanic ash detection using hyper-spectral UV measurements, illustrated with Aura/OMI observations of recent eruptions. We will share our plan to provide a near-real-time volcanic cloud monitoring service using the Ozone Mapping and Profiler Suite (OMPS) on the Joint Polar Satellite System (JPSS).

  2. Using Radar, Lidar, and Radiometer measurements to Classify Cloud Type and Study Middle-Level Cloud Properties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhien

    2010-06-29

    The project is mainly focused on the characterization of cloud macrophysical and microphysical properties, especially for mixed-phase clouds and middle-level ice clouds, by combining radar, lidar, and radiometer measurements available from the ACRF sites. First, an advanced mixed-phase cloud retrieval algorithm will be developed to cover all mixed-phase clouds observed at the ACRF NSA site. The algorithm will be applied to the ACRF NSA observations to generate a long-term arctic mixed-phase cloud product for model validations and arctic mixed-phase cloud process studies. To improve the representation of arctic mixed-phase clouds in GCMs, an advanced understanding of mixed-phase cloud processes is needed. By combining retrieved mixed-phase cloud microphysical properties with in situ data and large-scale meteorological data, the project aims to better understand the generation of ice crystals in supercooled water clouds, the maintenance mechanisms of arctic mixed-phase clouds, and their connections with large-scale dynamics. The project will also try to develop a new retrieval algorithm to study the more complex mixed-phase clouds observed at the ACRF SGP site. Compared with optically thin ice clouds, optically thick middle-level ice clouds are less studied because of limited available tools. The project will develop a new two-wavelength radar technique for studying optically thick ice clouds at the SGP site by combining the MMCR with W-band radar measurements. With this new algorithm, the SGP site will have a better capability to study all ice clouds. Another area of the proposal is to generate a long-term cloud type classification product for the multiple ACRF sites. The cloud type classification product will not only facilitate the generation of the integrated cloud product, by applying different retrieval algorithms to different types of clouds operationally, but will also support other research to better understand cloud properties and to validate model simulations. The ultimate goal is to evolve our cloud classification algorithm into a VAP.

  3. Security Certification Challenges in a Cloud Computing Delivery Model

    DTIC Science & Technology

    2010-04-27

    Relevant security standards, certifications, and guidance include the NIST SP 800 series, the ISO/IEC 27001 framework, the Cloud Security Alliance (CSA), and the Statement of... CSA domains / cloud features are mapped against ISO 27001, cloud service provider responsibility, and government agency responsibility in order to analyze security gaps and compensating controls.

  4. EduCloud: PaaS versus IaaS Cloud Usage for an Advanced Computer Science Course

    ERIC Educational Resources Information Center

    Vaquero, L. M.

    2011-01-01

    The cloud has become a widely used term in academia and the industry. Education has not remained unaware of this trend, and several educational solutions based on cloud technologies are already in place, especially for software as a service cloud. However, an evaluation of the educational potential of infrastructure and platform clouds has not…

  5. Auspice: Automatic Service Planning in Cloud/Grid Environments

    NASA Astrophysics Data System (ADS)

    Chiu, David; Agrawal, Gagan

    Recent scientific advances have fostered a mounting number of services and data sets available for utilization. These resources, though scattered across disparate locations, are often loosely coupled both semantically and operationally. This loosely coupled relationship implies the possibility of linking together operations and data sets to answer queries. This task, generally known as automatic service composition, therefore abstracts the process of complex scientific workflow planning from the user. We have been exploring a metadata-driven approach toward automatic service workflow composition, among other enabling mechanisms, in our system, Auspice: Automatic Service Planning in Cloud/Grid Environments. In this paper, we present a complete overview of our system's unique features and its outlook for future deployment as the Cloud computing paradigm becomes increasingly prominent in enabling scientific computing.

  6. Secure and Resilient Cloud Computing for the Department of Defense

    DTIC Science & Technology

    2015-11-16

    platform as a service (PaaS), and software as a service (SaaS), which target system administrators, developers, and end-users, respectively (see Table 2...). From the accompanying table fragment: PaaS offers application programming interfaces (APIs) and services (medium; examples: Amazon Elastic MapReduce, MathWorks Cloud, Red Hat OpenShift), while SaaS offers full-fledged applications (low; example: Google Gmail).

  7. Cloud Applications in Language Teaching: Examining Pre-Service Teachers' Expertise, Perceptions and Integration

    ERIC Educational Resources Information Center

    Aburezeq, Ibtehal Mahmoud; Dweikat, Fawzi Fayez Ishtaiwa

    2017-01-01

    This study examined pre-service teachers' expertise, perceptions and integration of cloud applications in the teaching of Arabic and English. Questionnaires and semi-structured interviews were used as data collection methods. The findings of the study indicated that pre-service teachers did not possess sufficient expertise for effective integration of…

  8. Pre-Service Teachers' Opinions on Cloud Supported Social Network

    ERIC Educational Resources Information Center

    Ozcan, Seher; Gokcearslan, Sahin; Kukul, Volkan

    2015-01-01

    Pre-service teachers are expected to make effective use in their lessons of new technologies such as Google+, which, with the help of cloud support, facilitates contacting others, sharing in certain environments, and working collaboratively. This study aims to examine pre-service teachers' opinions regarding the use of Google+ to support lesson activities. In this…

  9. OpenID Connect as a security service in cloud-based medical imaging systems

    PubMed Central

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-01-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have consistently been regarded as the major obstacles to the adoption of cloud computing by healthcare domains. OpenID Connect, combining OpenID and OAuth, is an emerging representational state transfer-based federated identity solution. It is one of the most widely adopted open standards and may become the de facto standard for securing cloud computing and mobile applications; it has been called the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow this technology to be incorporated within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repositories (DI-r) and heterogeneous picture archiving and communication systems (PACS), as well as Web-based and mobile clients, in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner, while deploying DI-r and PACS to private or community clouds that provide security levels equivalent to the traditional computing model. PMID:27340682
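
    The authorization-code exchange at the heart of OpenID Connect can be sketched as follows. This is a generic illustration of the protocol, not the paper's deployment: the provider URL, client credentials, and the received code are hypothetical placeholders, while the parameter names come from the OAuth 2.0 / OIDC specifications.

```python
# Sketch of the OpenID Connect authorization-code flow as a DI-r or
# PACS client might use it. Provider URL, client credentials, and the
# authorization code below are hypothetical placeholders.
import requests

ISSUER = "https://openid.example-hospital.org"   # hypothetical OP

# 1. Discover the provider's endpoints from its well-known document.
conf = requests.get(ISSUER + "/.well-known/openid-configuration").json()

# 2. The user authenticates at conf["authorization_endpoint"] in a
#    browser; the OP redirects back with an authorization code.
auth_code = "RECEIVED-FROM-REDIRECT"             # placeholder

# 3. Exchange the code for tokens at the token endpoint.
tokens = requests.post(conf["token_endpoint"], data={
    "grant_type": "authorization_code",
    "code": auth_code,
    "redirect_uri": "https://di-r.example-hospital.org/callback",
    "client_id": "dir-client",
    "client_secret": "dir-secret",
}).json()

# 4. The ID token asserts the user's identity; the access token
#    authorizes calls to image-sharing services via a Bearer header.
headers = {"Authorization": "Bearer " + tokens["access_token"]}
```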

  10. Signal and image processing algorithm performance in a virtual and elastic computing environment

    NASA Astrophysics Data System (ADS)

    Bennett, Kelly W.; Robertson, James

    2013-05-01

    The U.S. Army Research Laboratory (ARL) supports the development of classification, detection, tracking, and localization algorithms using multiple sensing modalities including acoustic, seismic, E-field, magnetic field, PIR, and visual and IR imaging. Multimodal sensors collect large amounts of data in support of algorithm development, and the resulting data volume, with its associated high-performance computing needs, increasingly challenges existing computing infrastructures. Purchasing computing power as a commodity from a Cloud service offers low-cost, pay-as-you-go pricing, scalability, and elasticity that may make it possible to develop and optimize algorithms without having to procure additional hardware and resources. This paper provides a detailed look at using a commercial cloud service provider, such as Amazon Web Services (AWS), to develop and deploy simple signal and image processing algorithms in a cloud and run the algorithms on a large set of data archived in the ARL Multimodal Signatures Database (MMSDB). Analytical results will provide performance comparisons with existing infrastructure. A discussion on using cloud computing with government data will cover the security best practices that exist within cloud services such as AWS.

  11. Abstracting application deployment on Cloud infrastructures

    NASA Astrophysics Data System (ADS)

    Aiftimiei, D. C.; Fattibene, E.; Gargana, R.; Panella, M.; Salomoni, D.

    2017-10-01

    Deploying a complex application on a Cloud-based infrastructure can be a challenging task. In this contribution we present an approach for Cloud-based deployment of applications and its present and future implementation in the framework of several projects, such as "!CHAOS: a cloud of controls" [1], a project funded by MIUR (Italian Ministry of Research and Education) to create a Cloud-based deployment of a control system and data acquisition framework; "INDIGO-DataCloud" [2], an EC H2020 project targeting, among other things, high-level deployment of applications on hybrid Clouds; and "Open City Platform" [3], an Italian project aiming to provide open Cloud solutions for Italian Public Administrations. We chose to use an orchestration service to hide the complex deployment of the application components, and to build an abstraction layer on top of the orchestration one. Through the Heat [4] orchestration service, we prototyped a dynamic, on-demand, scalable platform of software components, based on OpenStack infrastructures. On top of the orchestration service we developed a prototype of a web interface exploiting the Heat APIs. The user can start an instance of the application without any knowledge of the underlying Cloud infrastructure and services. Moreover, the platform instance can be customized by choosing parameters related to the application, such as the size of a file system or the number of instances in a NoSQL DB cluster. As soon as the desired platform is running, the web interface offers the possibility to scale some infrastructure components. In this contribution we describe the solution design and implementation, based on the application requirements, the details of the development of both the Heat templates and the web interface, together with possible exploitation strategies of this work in Cloud data centers.
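
    The abstraction layer described above ultimately drives Heat through its REST API. Below is a hedged sketch of the kind of stack-create call the web interface would issue on the user's behalf; the endpoint, auth token, and template content are placeholders, and the request shape follows the OpenStack Orchestration API rather than the project's actual code.

```python
# Sketch of creating a Heat stack through the Orchestration REST API.
# Endpoint, Keystone token, and the template are placeholders.
import requests

HEAT = "https://cloud.example.org:8004/v1/TENANT_ID"   # placeholder
TOKEN = "gAAAA..."                                     # placeholder token

template = {
    "heat_template_version": "2016-04-08",
    "parameters": {"db_cluster_size": {"type": "number", "default": 3}},
    "resources": {},   # server groups, volumes, etc. would go here
}

resp = requests.post(
    f"{HEAT}/stacks",
    headers={"X-Auth-Token": TOKEN, "Content-Type": "application/json"},
    json={
        "stack_name": "chaos-platform-demo",
        "template": template,
        # User-chosen customization, e.g. the NoSQL cluster size.
        "parameters": {"db_cluster_size": 5},
    },
)
resp.raise_for_status()
print(resp.json())   # contains the new stack's id and links
```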

  12. Application of the CERES Flux-by-Cloud Type Simulator to GCM Output

    NASA Technical Reports Server (NTRS)

    Eitzen, Zachary; Su, Wenying; Xu, Kuan-Man; Loeb, Norman G.; Sun, Moguo; Doelling, David R.; Bodas-Salcedo, Alejandro

    2016-01-01

    The CERES Flux-By-Cloud-Type data product provides CERES top-of-atmosphere (TOA) fluxes by region and cloud type. Here, the cloud types are defined by cloud optical depth (τ) and cloud top pressure (pc), with bins similar to those used by ISCCP (International Satellite Cloud Climatology Project). This data product has the potential to be a powerful tool for the evaluation of the clouds produced by climate models by helping to identify which physical parameterizations have problems (e.g., boundary-layer parameterizations, convective clouds, processes that affect surface albedo). Also, when the flux-by-cloud-type and frequency of cloud types are used simultaneously to evaluate a model, the results can determine whether an unrealistically large or small occurrence of a given cloud type has an important radiative impact for a given region. A simulator of the flux-by-cloud-type product has been applied to three-hourly data from the year 2008 from the UK Met Office HadGEM2-A model, using the Langley Fu-Liou radiative transfer model to obtain TOA SW and LW fluxes.

  13. Filtering and Gridding Satellite Observations of Cloud Variables to Compare with Climate Model Output

    NASA Astrophysics Data System (ADS)

    Pitts, K.; Nasiri, S. L.; Smith, N.

    2013-12-01

    Global climate models have improved considerably over the years, yet clouds still represent a large factor of uncertainty for these models. Comparisons of model-simulated cloud variables with equivalent satellite cloud products are the best way to start diagnosing the differences between model output and observations. Gridded (level 3) cloud products from many different satellites and instruments are required for a full analysis, but these products are created by different science teams using different algorithms and filtering criteria to create similar, but not directly comparable, cloud products. This study makes use of a recently developed uniform space-time gridding algorithm to create a new set of gridded cloud products from each satellite instrument's level 2 data of interest which are each filtered using the same criteria, allowing for a more direct comparison between satellite products. The filtering is done via several variables such as cloud top pressure/height, thermodynamic phase, optical properties, satellite viewing angle, and sun zenith angle. The filtering criteria are determined based on the variable being analyzed and the science question at hand. Each comparison of different variables may require different filtering strategies as no single approach is appropriate for all problems. Beyond inter-satellite data comparison, these new sets of uniformly gridded satellite products can also be used for comparison with model-simulated cloud variables. Of particular interest to this study are the differences in the vertical distributions of ice and liquid water content between the satellite retrievals and model simulations, especially in the mid-troposphere where there are mixed-phase clouds to consider. This presentation will demonstrate the proof of concept through comparisons of cloud water path from Aqua MODIS retrievals and NASA GISS-E2-[R/H] model simulations archived in the CMIP5 data portal.
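
    The uniform filter-then-grid step can be illustrated with a tiny sketch: apply the same pixel-level filters to each instrument's level 2 data, then aggregate the surviving pixels onto a common lat/lon grid. The field names, filter thresholds, and synthetic values below are invented; the study's actual algorithm and criteria are not reproduced.

```python
# Sketch of uniform filtering and gridding of level 2 pixels onto a
# 1-degree grid. Field names, thresholds, and values are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
lat = rng.uniform(-90, 90, n)
lon = rng.uniform(-180, 180, n)
cwp = rng.gamma(2.0, 60.0, n)              # cloud water path, g/m^2
view_zenith = rng.uniform(0, 70, n)
phase = rng.integers(1, 3, n)              # 1 = liquid, 2 = ice

# Identical filtering criteria applied to every instrument's L2 data,
# so the resulting gridded products are directly comparable.
keep = (view_zenith < 40) & (phase == 1)

# Grid to 1x1 degree by accumulating per-cell sums and counts.
i = (lat[keep] + 90).astype(int).clip(0, 179)
j = (lon[keep] + 180).astype(int).clip(0, 359)
sums = np.zeros((180, 360))
counts = np.zeros((180, 360))
np.add.at(sums, (i, j), cwp[keep])
np.add.at(counts, (i, j), 1)
gridded_cwp = np.divide(sums, counts, out=np.full_like(sums, np.nan),
                        where=counts > 0)   # NaN where no pixels survive
```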

  14. Cardiovascular imaging environment: will the future be cloud-based?

    PubMed

    Kawel-Boehm, Nadine; Bluemke, David A

    2017-07-01

    In cardiovascular CT and MR imaging large datasets have to be stored, post-processed, analyzed and distributed. Beside basic assessment of volume and function in cardiac magnetic resonance imaging e.g., more sophisticated quantitative analysis is requested requiring specific software. Several institutions cannot afford various types of software and provide expertise to perform sophisticated analysis. Areas covered: Various cloud services exist related to data storage and analysis specifically for cardiovascular CT and MR imaging. Instead of on-site data storage, cloud providers offer flexible storage services on a pay-per-use basis. To avoid purchase and maintenance of specialized software for cardiovascular image analysis, e.g. to assess myocardial iron overload, MR 4D flow and fractional flow reserve, evaluation can be performed with cloud based software by the consumer or complete analysis is performed by the cloud provider. However, challenges to widespread implementation of cloud services include regulatory issues regarding patient privacy and data security. Expert commentary: If patient privacy and data security is guaranteed cloud imaging is a valuable option to cope with storage of large image datasets and offer sophisticated cardiovascular image analysis for institutions of all sizes.

  15. OMMYDCLD: a New A-train Cloud Product that Co-locates OMI and MODIS Cloud and Radiance Parameters onto the OMI Footprint

    NASA Technical Reports Server (NTRS)

    Fisher, Brad; Joiner, Joanna; Vasilkov, Alexander; Veefkind, Pepijn; Platnick, Steven; Wind, Galina

    2014-01-01

    Clouds cover approximately 60% of the earth's surface. When obscuring the satellite's field of view (FOV), clouds complicate the retrieval of ozone, trace gases and aerosols from data collected by earth observing satellites. Cloud properties associated with optical thickness, cloud pressure, water phase, drop size distribution (DSD), cloud fraction, vertical and areal extent can also change significantly over short spatio-temporal scales. The radiative transfer models used to retrieve column estimates of atmospheric constituents typically do not account for all these properties and their variations. The OMI science team is preparing to release a new data product, OMMYDCLD, which combines the cloud information from sensors on board two earth observing satellites in the NASA A-Train: Aura/OMI and Aqua/MODIS. OMMYDCLD co-locates high resolution cloud and radiance information from MODIS onto the much larger OMI pixel and combines it with parameters derived from the two other OMI cloud products: OMCLDRR and OMCLDO2. The product includes histograms for MODIS scientific data sets (SDS) provided at 1 km resolution. The statistics of key data fields - such as effective particle radius, cloud optical thickness and cloud water path - are further separated into liquid and ice categories using the optical and IR phase information. OMMYDCLD offers users of OMI data cloud information that will be useful for carrying out OMI calibration work, multi-year studies of cloud vertical structure and in the identification and classification of multi-layer clouds.

  16. Comparison of Monthly Mean Cloud Fraction and Cloud Optical depth Determined from Surface Cloud Radar, TOVS, AVHRR, and MODIS over Barrow, Alaska

    NASA Technical Reports Server (NTRS)

    Uttal, Taneil; Frisch, Shelby; Wang, Xuan-Ji; Key, Jeff; Schweiger, Axel; Sun-Mack, Sunny; Minnis, Patrick

    2005-01-01

    A one year comparison is made of mean monthly values of cloud fraction and cloud optical depth over Barrow, Alaska (71 deg 19.378 min North, 156 deg 36.934 min West) between 35 GHz radar-based retrievals, the TOVS Pathfinder Path-P product, the AVHRR APP-X product, and a MODIS based cloud retrieval product from the CERES-Team. The data sets represent largely disparate spatial and temporal scales, however, in this paper, the focus is to provide a preliminary analysis of how the mean monthly values derived from these different data sets compare, and determine how they can best be used separately, and in combination to provide reliable estimates of long-term trends of changing cloud properties. The radar and satellite data sets described here incorporate Arctic specific modifications that account for cloud detection challenges specific to the Arctic environment. The year 2000 was chosen for this initial comparison because the cloud radar data was particularly continuous and reliable that year, and all of the satellite retrievals of interest were also available for the year 2000. Cloud fraction was chosen as a comparison variable as accurate detection of cloud is the primary product that is necessary for any other cloud property retrievals. Cloud optical depth was additionally selected as it is likely the single cloud property that is most closely correlated to cloud influences on surface radiation budgets.

  17. Implementation of Online Veterinary Hospital on Cloud Platform.

    PubMed

    Chen, Tzer-Shyong; Chen, Tzer-Long; Chung, Yu-Fang; Huang, Yao-Min; Chen, Tao-Chieh; Wang, Huihui; Wei, Wei

    2016-06-01

    Pet markets involve in great commercial possibilities, which boost thriving development of veterinary hospital businesses. The service tends to intensive competition and diversified channel environment. Information technology is integrated for developing the veterinary hospital cloud service platform. The platform contains not only pet medical services but veterinary hospital management and services. In the study, QR Code andcloud technology are applied to establish the veterinary hospital cloud service platform for pet search by labeling a pet's identification with QR Code. This technology can break the restriction on veterinary hospital inspection in different areas and allows veterinary hospitals receiving the medical records and information through the exclusive QR Code for more effective inspection. As an interactive platform, the veterinary hospital cloud service platform allows pet owners gaining the knowledge of pet diseases and healthcare. Moreover, pet owners can enquire and communicate with veterinarians through the platform. Also, veterinary hospitals can periodically send reminders of relevant points and introduce exclusive marketing information with the platform for promoting the service items and establishing individualized marketing. Consequently, veterinary hospitals can increase the profits by information share and create the best solution in such a competitive veterinary market with industry alliance.

  18. Estimating Precipitation Susceptibility in Warm Marine Clouds Using Multi-sensor Aerosol and Cloud Products from A-Train Satellites

    NASA Astrophysics Data System (ADS)

    Bai, H.; Gong, C.; Wang, M.; Zhang, Z.

    2017-12-01

    Precipitation susceptibility to aerosol perturbations plays a key role in understanding aerosol-cloud interactions and constraining aerosol indirect effects. However, large discrepancies exist among previous satellite estimates of precipitation susceptibility. In this paper, multi-sensor aerosol and cloud products, including those from CALIPSO, CloudSat, MODIS, and AMSR-E from June 2006 to April 2011, are analyzed to estimate precipitation susceptibility (including precipitation frequency susceptibility SPOP, precipitation intensity susceptibility SI, and precipitation rate susceptibility SR) in warm marine clouds. Our results show that SPOP is relatively robust across independent LWP products and diverse rain products. In contrast, the behavior of SI is more sensitive to the choice of LWP or rain products. Our results further show that SPOP depends strongly on atmospheric stability, with larger values in more stable environments. Precipitation susceptibility calculated with respect to cloud droplet number concentration (CDNC) is generally much larger than that estimated with respect to aerosol index (AI), which results from the weak dependence of CDNC on AI.
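
    For reference, precipitation susceptibility is commonly defined as a logarithmic derivative of a precipitation statistic with respect to the aerosol proxy. The forms below follow the usual convention in the literature; the paper's exact definitions may differ.

```latex
% Common definitions of precipitation susceptibility. POP is the
% probability (frequency) of precipitation, I the intensity, R the
% rate, and N_d the cloud droplet number concentration (replaced by
% the aerosol index AI when susceptibility is estimated with respect
% to AI); derivatives are typically evaluated within fixed LWP bins.
S_{\mathrm{POP}} = -\frac{d \ln \mathrm{POP}}{d \ln N_d}, \qquad
S_{I} = -\frac{d \ln I}{d \ln N_d}, \qquad
S_{R} = -\frac{d \ln R}{d \ln N_d}
```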

  19. The Use of OMPS Near Real Time Products in Volcanic Cloud Risk Mitigation and Smoke/Dust Air Quality Assessments

    NASA Astrophysics Data System (ADS)

    Seftor, C. J.; Krotkov, N. A.; McPeters, R. D.; Li, J. Y.; Durbin, P. B.

    2015-12-01

    Near real time (NRT) SO2 and aerosol index (AI) imagery from Aura's Ozone Monitoring Instrument (OMI) has proven invaluable in mitigating the risk posed to air traffic by SO2 and ash clouds from volcanic eruptions. The OMI products, generated as part of NASA's Land, Atmosphere Near real-time Capability for EOS (LANCE) NRT system and available through LANCE and both NOAA's NESDIS and ESA's Support to Aviation Control Service (SACS) portals, are used to monitor the current location of volcanic clouds and to provide input into Volcanic Ash (VA) advisory forecasts. NRT products have recently been developed using data from the Ozone Mapping and Profiler Suite onboard the Suomi NPP platform; they are currently being made available through the SACS portal and will shortly be incorporated into the LANCE NRT system. We will show examples of the use of OMPS NRT SO2 and AI imagery to monitor recent volcanic eruption events. We will also demonstrate the usefulness of OMPS AI imagery to detect and track dust storms and smoke from fires, and how this information can be used to forecast their impact on air quality in areas far removed from their source. Finally, we will show SO2 and AI imagery generated from our OMPS Direct Broadcast data to highlight the capability of our real time system.

  20. Using Google Applications as Part of Cloud Computing to Improve Knowledge and Teaching Skills of Faculty Members at the University of Bisha, Bisha, Saudi Arabia

    ERIC Educational Resources Information Center

    Alshihri, Bandar A.

    2017-01-01

    Cloud computing is a recent computing paradigm that has been integrated into the educational system. It provides numerous opportunities for delivering a variety of computing services in a way that has not been experienced before. The Google Company is among the top business companies that afford their cloud services by launching a number of…

  1. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, compared with traditional hosting, cloud computing is at a disadvantage in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
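
    A minimal discrete-event model of the kind described, stochastic service requests contending for a fixed pool of virtualized servers, can be written in a few lines with the simpy library. This is an illustration of the modeling approach, not the authors' model; the arrival and service rates and the pool size are invented.

```python
# Minimal discrete-event simulation of service requests contending for
# a fixed server pool (simpy assumed; all rates are invented).
import random
import simpy

random.seed(42)
waits = []

def request(env, servers):
    arrived = env.now
    with servers.request() as slot:        # queue for a server slot
        yield slot
        waits.append(env.now - arrived)    # time spent waiting in queue
        yield env.timeout(random.expovariate(1 / 3.0))  # service time

def generator(env, servers):
    for _ in range(500):
        env.process(request(env, servers))
        yield env.timeout(random.expovariate(1 / 1.0))  # inter-arrival

env = simpy.Environment()
servers = simpy.Resource(env, capacity=4)  # "cloud" pool size to vary
env.process(generator(env, servers))
env.run()
print(f"mean wait: {sum(waits) / len(waits):.2f} time units")
```

    Sweeping the pool capacity against the demand distribution is exactly the kind of provisioning question such a model answers quantitatively.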

  2. Atlas2 Cloud: a framework for personal genome analysis in the cloud

    PubMed Central

    2012-01-01

    Background Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data, to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation remains a serious bottleneck. To this end, the cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. Results We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via the Amazon Web Services using Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. Conclusions We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms. PMID:23134663

  3. Atlas2 Cloud: a framework for personal genome analysis in the cloud.

    PubMed

    Evani, Uday S; Challis, Danny; Yu, Jin; Jackson, Andrew R; Paithankar, Sameer; Bainbridge, Matthew N; Jakkamsetti, Adinarayana; Pham, Peter; Coarfa, Cristian; Milosavljevic, Aleksandar; Yu, Fuli

    2012-01-01

    Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data, to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation remains a serious bottleneck. To this end, the cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via the Amazon Web Services using Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms.

  4. Opportunities and challenges provided by cloud repositories for bioinformatics-enabled drug discovery.

    PubMed

    Dalpé, Gratien; Joly, Yann

    2014-09-01

    Healthcare-related bioinformatics databases are increasingly offering the possibility to maintain, organize, and distribute DNA sequencing data. Different national and international institutions are currently hosting such databases that offer researchers website platforms where they can obtain sequencing data on which they can perform different types of analysis. Until recently, this process remained mostly one-dimensional, with most analysis concentrated on a limited amount of data. However, newer genome sequencing technology is producing a huge amount of data that current computer facilities are unable to handle. An alternative approach has been to start adopting cloud computing services for combining the information embedded in genomic and model system biology data, patient healthcare records, and clinical trials' data. In this new technological paradigm, researchers use virtual space and computing power from existing commercial or not-for-profit cloud service providers to access, store, and analyze data via different application programming interfaces. Cloud services are an alternative to the need of larger data storage; however, they raise different ethical, legal, and social issues. The purpose of this Commentary is to summarize how cloud computing can contribute to bioinformatics-based drug discovery and to highlight some of the outstanding legal, ethical, and social issues that are inherent in the use of cloud services. © 2014 Wiley Periodicals, Inc.

  5. Privacy-preserving public auditing for data integrity in cloud

    NASA Astrophysics Data System (ADS)

    Shaik Saleem, M.; Murali, M.

    2018-04-01

    Cloud computing, which has attracted extensive attention from both the research community and industry, offers a large pool of computing resources, such as storage, processing power, applications and services, through virtualized sharing. Cloud users are provided with these resources on demand. However, a file outsourced by a cloud user can easily be tampered with, since it is stored in a third-party service provider's databases; the user retains no control over the data and therefore has no direct assurance of its integrity. Providing security assurance for user data has thus become a primary concern for cloud service providers, since cloud servers themselves do not guarantee against data loss. Remote data integrity checking (RDIC) allows a data owner to verify that a storage server is truthfully storing the owner's data. RDIC comprises a security model and an ID-based variant, which is responsible for the security of every server and ensures the data privacy of the cloud user against the third-party verifier. In general, by running a two-party RDIC protocol, clients can themselves check the trustworthiness of the data held in their cloud; however, in the two-party scenario the verification result, coming from either the data holder or the cloud server, may be considered biased. The public verifiability feature of RDIC gives all users the ability to verify whether the original data have been modified. To ensure the transparency of publicly verifiable RDIC protocols, we assume the existence of a third-party auditor (TPA) with the knowledge and capability to carry out this verification.
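
    A two-party check of this kind can be made concrete with a simple challenge-response sketch (an illustrative hash-based scheme, not the ID-based RDIC construction discussed above; the block size and key handling are assumptions):

      # Minimal two-party remote data integrity check: the owner keeps the
      # data (a real scheme would keep only small tags), and the server must
      # answer challenges over randomly chosen blocks under a fresh nonce.
      import hashlib, hmac, os, secrets

      BLOCK = 4096  # assumed block size

      def blocks(data):
          return [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]

      class Server:
          def __init__(self, data):
              self.blocks = blocks(data)
          def respond(self, index, nonce):
              return hashlib.sha256(nonce + self.blocks[index]).digest()

      class Owner:
          def __init__(self, data):
              self.blocks = blocks(data)
          def challenge(self, server):
              i = secrets.randbelow(len(self.blocks))
              nonce = os.urandom(16)
              expected = hashlib.sha256(nonce + self.blocks[i]).digest()
              return hmac.compare_digest(expected, server.respond(i, nonce))

      data = os.urandom(100_000)
      owner, server = Owner(data), Server(data)
      print(owner.challenge(server))  # True while the server stores data intact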

  6. Dynamic virtual machine allocation policy in cloud computing complying with service level agreement using CloudSim

    NASA Astrophysics Data System (ADS)

    Aneri, Parikh; Sumathy, S.

    2017-11-01

    Cloud computing provides services over the internet, delivering application resources and data to users on demand. Cloud computing is based on a consumer-provider model: the cloud provider provisions resources which consumers access in order to build applications according to their needs. A cloud data center is a bulk of resources on a shared-pool architecture for cloud users to access. Virtualization is the heart of the cloud computing model; it provides virtual machines matched to application-specific configurations, and applications are free to choose their own configuration. On one hand there is a huge number of resources, and on the other hand a huge number of requests must be served effectively. Therefore, the resource allocation and scheduling policies play a very important role in allocating and managing resources in this cloud computing model. This paper proposes a load balancing policy based on the Hungarian algorithm. The Hungarian algorithm provides a dynamic load balancing policy with a monitor component; the monitor component helps increase cloud resource utilization by tracking the algorithm's state and altering it using artificial intelligence. CloudSim, used in this proposal, is an extensible toolkit that simulates the cloud computing environment.
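
    For intuition, the assignment step the Hungarian algorithm performs can be sketched with SciPy's implementation (the cost matrix of estimated completion times is an invented example, not the paper's model; a square matrix is shown, and unequal request/VM counts would need padding):

      # Assign each request to a VM so the total estimated completion
      # time is minimal (Hungarian method via SciPy).
      import numpy as np
      from scipy.optimize import linear_sum_assignment

      # cost[i, j]: estimated time for request i on VM j (invented values)
      cost = np.array([[4.0, 2.0, 8.0],
                       [4.0, 3.0, 7.0],
                       [3.0, 1.0, 6.0]])

      rows, cols = linear_sum_assignment(cost)
      for req, vm in zip(rows, cols):
          print(f"request {req} -> VM {vm}")
      print("total cost:", cost[rows, cols].sum())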

  7. Applications for Near-Real Time Satellite Cloud and Radiation Products

    NASA Technical Reports Server (NTRS)

    Minnis, Patrick; Palikonda, Rabindra; Chee, Thad L.; Bedka, Kristopher M.; Smith, W.; Ayers, Jeffrey K.; Benjamin, Stanley; Chang, F.-L.; Nguyen, Louis; Norris, Peter

    2012-01-01

    At NASA Langley Research Center, a variety of cloud, clear-sky, and radiation products are being derived at scales from regional to global using geostationary satellite (GEOSat) and low Earth-orbiting (LEOSat) imager data. With growing availability, these products are becoming increasingly valuable for weather forecasting and nowcasting. These products include, but are not limited to, cloud-top and base heights, cloud water path and particle size, cloud temperature and phase, surface skin temperature and albedo, and top-of-atmosphere radiation budget. Some of these data products are currently assimilated operationally in a numerical weather prediction model. Others are used unofficially for nowcasting, while testing is underway for other applications. These applications include the use of cloud water path in an NWP model, cloud optical depth for detecting convective initiation in cirrus-filled skies, and aircraft icing condition diagnoses, among others. This paper briefly describes a currently operating system that analyzes data from GEOSats around the globe (GOES, Meteosat, MTSAT, FY-2) and LEOSats (AVHRR and MODIS) and makes the products available in near-real time through a variety of media. Current and potential future uses of these products are discussed.

  8. Design and Development of a Run-Time Monitor for Multi-Core Architectures in Cloud Computing

    PubMed Central

    Kang, Mikyung; Kang, Dong-In; Crago, Stephen P.; Park, Gyung-Leen; Lee, Junghoon

    2011-01-01

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet, as well as the infrastructure itself. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. Large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM), which is system software that monitors application behavior at run-time, analyzes the collected information, and optimizes cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation, as well as the underlying hardware through performance counters, optimizing its computing configuration based on the analyzed data. PMID:22163811

  9. Genomic cloud computing: legal and ethical points to consider

    PubMed Central

    Dove, Edward S; Joly, Yann; Tassé, Anne-Marie; Burton, Paul; Chisholm, Rex; Fortier, Isabel; Goodwin, Pat; Harris, Jennifer; Hveem, Kristian; Kaye, Jane; Kent, Alistair; Knoppers, Bartha Maria; Lindpaintner, Klaus; Little, Julian; Riegman, Peter; Ripatti, Samuli; Stolk, Ronald; Bobrow, Martin; Cambon-Thomsen, Anne; Dressler, Lynn; Joly, Yann; Kato, Kazuto; Knoppers, Bartha Maria; Rodriguez, Laura Lyman; McPherson, Treasa; Nicolás, Pilar; Ouellette, Francis; Romeo-Casabona, Carlos; Sarin, Rajiv; Wallace, Susan; Wiesner, Georgia; Wilson, Julia; Zeps, Nikolajs; Simkevitz, Howard; De Rienzo, Assunta; Knoppers, Bartha M

    2015-01-01

    The biggest challenge in twenty-first century data-intensive genomic science is developing vast computer infrastructure and advanced software tools to perform comprehensive analyses of genomic data sets for biomedical research and clinical practice. Researchers are increasingly turning to cloud computing both as a solution to integrate data from genomics, systems biology and biomedical data mining and as an approach to analyze data to solve biomedical problems. Although cloud computing provides several benefits such as lower costs and greater efficiency, it also raises legal and ethical issues. In this article, we discuss three key ‘points to consider' (data control; data security, confidentiality and transfer; and accountability) based on a preliminary review of several publicly available cloud service providers' Terms of Service. These ‘points to consider' should be borne in mind by genomic research organizations when negotiating legal arrangements to store genomic data on a large commercial cloud service provider's servers. Diligent genomic cloud computing means leveraging security standards and evaluation processes as a means to protect data and entails many of the same good practices that researchers should always consider in securing their local infrastructure. PMID:25248396

  10. Design and development of a run-time monitor for multi-core architectures in cloud computing.

    PubMed

    Kang, Mikyung; Kang, Dong-In; Crago, Stephen P; Park, Gyung-Leen; Lee, Junghoon

    2011-01-01

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet, as well as the infrastructure itself. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. Large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM), which is system software that monitors application behavior at run-time, analyzes the collected information, and optimizes cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation, as well as the underlying hardware through performance counters, optimizing its computing configuration based on the analyzed data.
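
    As a rough illustration of such a monitoring loop (a sketch using the psutil library in place of the paper's library instrumentation and hardware performance counters; the 80% CPU threshold is an arbitrary assumption):

      # Sketch of a run-time monitoring loop: sample process metrics,
      # analyze them, and flag when reconfiguration might be warranted.
      import os
      import psutil

      def monitor(pid, samples=10, cpu_threshold=80.0):
          proc = psutil.Process(pid)
          for _ in range(samples):
              cpu = proc.cpu_percent(interval=1.0)   # % over the last second
              mem = proc.memory_info().rss / 2**20   # resident set size, MiB
              print(f"cpu={cpu:5.1f}%  mem={mem:8.1f} MiB")
              if cpu > cpu_threshold:
                  # Placeholder for the adaptation step, e.g. migrating the
                  # workload or changing core affinity.
                  print("high load: reconfiguration candidate")

      monitor(os.getpid())  # monitor this process as a demo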

  11. Genomic cloud computing: legal and ethical points to consider.

    PubMed

    Dove, Edward S; Joly, Yann; Tassé, Anne-Marie; Knoppers, Bartha M

    2015-10-01

    The biggest challenge in twenty-first century data-intensive genomic science is developing vast computer infrastructure and advanced software tools to perform comprehensive analyses of genomic data sets for biomedical research and clinical practice. Researchers are increasingly turning to cloud computing both as a solution to integrate data from genomics, systems biology and biomedical data mining and as an approach to analyze data to solve biomedical problems. Although cloud computing provides several benefits such as lower costs and greater efficiency, it also raises legal and ethical issues. In this article, we discuss three key 'points to consider' (data control; data security, confidentiality and transfer; and accountability) based on a preliminary review of several publicly available cloud service providers' Terms of Service. These 'points to consider' should be borne in mind by genomic research organizations when negotiating legal arrangements to store genomic data on a large commercial cloud service provider's servers. Diligent genomic cloud computing means leveraging security standards and evaluation processes as a means to protect data and entails many of the same good practices that researchers should always consider in securing their local infrastructure.

  12. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    PubMed Central

    Balasubramaniam, S.; Kavitha, V.

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns about how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud. PMID:25767826

  13. Geometric data perturbation-based personal health record transactions in cloud computing.

    PubMed

    Balasubramaniam, S; Kavitha, V

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns about how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.
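
    The core idea of geometric data perturbation is to transform numeric records with a random rotation, a random translation, and additive noise, so that inter-record geometry is approximately preserved while raw values are hidden. A minimal sketch of the idea (illustrative only, not the authors' exact scheme or parameters):

      # Perturb a numeric record matrix X (rows = records) with a random
      # rotation Q, translation t, and noise D:  Y = X @ Q.T + t + D.
      import numpy as np

      rng = np.random.default_rng(0)

      def geometric_perturbation(X, noise_scale=0.05):
          d = X.shape[1]
          # Random orthogonal matrix: the Q factor of a Gaussian matrix.
          Q, _ = np.linalg.qr(rng.normal(size=(d, d)))
          t = rng.normal(size=d)                           # translation
          D = rng.normal(scale=noise_scale, size=X.shape)  # additive noise
          return X @ Q.T + t + D, (Q, t)

      X = rng.normal(size=(5, 3))   # stand-in for PHR attribute vectors
      Y, secret = geometric_perturbation(X)
      print(Y)  # what would be outsourced to the cloud (e.g., EC2)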

  14. Results from the Two-Year Infrared Cloud Imager Deployment at ARM's NSA Observatory in Barrow, Alaska

    NASA Astrophysics Data System (ADS)

    Shaw, J. A.; Nugent, P. W.

    2016-12-01

    Ground-based longwave-infrared (LWIR) cloud imaging can provide continuous cloud measurements in the Arctic. This is of particular importance during the Arctic winter, when visible-wavelength cloud imaging systems cannot operate. The method uses a thermal infrared camera to observe clouds and produce measurements of cloud amount and cloud optical depth. The Montana State University Optical Remote Sensor Laboratory deployed an infrared cloud imager (ICI) at the Atmospheric Radiation Measurement (ARM) North Slope of Alaska site at Barrow, AK from July 2012 through July 2014. This deployment was used both to understand the long-term operation of an ICI in the Arctic and to study the consistency of the ICI data products in relation to co-located active and passive sensors. The ICI was found to have a high correlation (> 0.92) with collocated cloud instruments and to produce an unbiased data product. However, the ICI also detects thin clouds that are not detected by most operational cloud sensors. Comparisons with high-sensitivity actively sensed cloud products confirm the existence of these thin clouds. Infrared cloud imaging systems can serve a critical role in developing our understanding of cloud cover in the Arctic by providing continuous year-round measurements of clouds at sites of interest.

  15. The Satellite Data Thematic Core Service within the EPOS Research Infrastructure

    NASA Astrophysics Data System (ADS)

    Manunta, Michele; Casu, Francesco; Zinno, Ivana; De Luca, Claudio; Buonanno, Sabatino; Zeni, Giovanni; Wright, Tim; Hooper, Andy; Diament, Michel; Ostanciaux, Emilie; Mandea, Mioara; Walter, Thomas; Maccaferri, Francesco; Fernandez, Josè; Stramondo, Salvatore; Bignami, Christian; Bally, Philippe; Pinto, Salvatore; Marin, Alessandro; Cuomo, Antonio

    2017-04-01

    EPOS, the European Plate Observing System, is a long-term plan to facilitate the integrated use of data, data products, software and services, available from distributed Research Infrastructures (RI), for solid Earth science in Europe. Indeed, EPOS integrates a large number of existing European RIs belonging to several fields of Earth science, from seismology to geodesy, near-fault and volcanic observatories, as well as anthropogenic hazards. The EPOS vision is that the integration of the existing national and trans-national research infrastructures will increase access to and use of the multidisciplinary data recorded by the solid Earth monitoring networks, acquired in laboratory experiments and/or produced by computational simulations. The establishment of EPOS will foster the interoperability of products and services in the Earth science field for a worldwide community of users. Accordingly, the EPOS aim is to integrate the diverse and advanced European Research Infrastructures for solid Earth science, and build on new e-science opportunities to monitor and understand the dynamic and complex solid-Earth system. One of the EPOS Thematic Core Services (TCS), referred to as Satellite Data, aims at developing, implementing and deploying advanced satellite data products and services, mainly based on Copernicus data (namely Sentinel acquisitions), for the Earth science community. This work presents the technological enhancements, fostered by EPOS, to deploy effective satellite services in a harmonized and integrated way. In particular, the Satellite Data TCS will deploy five services, EPOSAR, GDM, COMET, 3D-Def and MOD, which are mainly based on the exploitation of SAR data acquired by the Sentinel-1 constellation and designed to provide information on Earth surface displacements. These services will provide both advanced DInSAR products (deformation maps, velocity maps, deformation time series) and value-added measurements (source models, 3D displacement maps, seismic hazard maps). Moreover, the services will release both on-demand and systematic products. The latter will be generated and made available to users on a continuous basis, by processing each Sentinel-1 acquisition once acquired, over a defined number of areas of interest; the former will allow users to select data, areas, and time periods to carry out their own analyses via an on-line platform. The satellite components will be integrated within the EPOS infrastructure through a common and harmonized interface that will allow users to search, process and share remote sensing images and results. This gateway to the satellite services will be the ESA Geohazards Exploitation Platform (GEP), a new cloud-based platform for satellite Earth observation designed to support the scientific community in the understanding of high-impact natural disasters. The Satellite Data TCS will use GEP as the common interface toward the main EPOS portal, providing EPOS users not only with data products but also with relevant processing and visualisation software, thus allowing users to gather and process large datasets on a cloud-computing infrastructure without any need to download them locally.

  16. Probabilistic verification of cloud fraction from three different products with CALIPSO

    NASA Astrophysics Data System (ADS)

    Jung, B. J.; Descombes, G.; Snyder, C.

    2017-12-01

    In this study, we present how Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) can be used for probabilistic verification of cloud fraction, and apply this probabilistic approach to three cloud fraction products: a) The Air Force Weather (AFW) World Wide Merged Cloud Analysis (WWMCA), b) Satellite Cloud Observations and Radiative Property retrieval Systems (SatCORPS) from NASA Langley Research Center, and c) Multi-sensor Advection Diffusion nowCast (MADCast) from NCAR. Although they differ in their details, both WWMCA and SatCORPS retrieve cloud fraction from satellite observations, mainly of infrared radiances. MADCast utilizes in addition a short-range forecast of cloud fraction (provided by the Model for Prediction Across Scales, assuming cloud fraction is advected as a tracer) and a column-by-column particle filter implemented within the Gridpoint Statistical Interpolation (GSI) data-assimilation system. The probabilistic verification considers the retrieved or analyzed cloud fractions as predicting the probability of cloud at any location within a grid cell and the 5-km vertical feature mask (VFM) from CALIPSO level-2 products as a point observation of cloud.
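
    Treating an analyzed cloud fraction as a forecast probability and the CALIPSO vertical feature mask as a binary observation invites standard probabilistic scores. One such score, the Brier score, can be sketched as follows (an illustration of the general approach, not necessarily the study's exact metric):

      # Brier score: mean squared difference between predicted cloud
      # probability (analyzed cloud fraction) and observed occurrence.
      import numpy as np

      def brier_score(p_cloud, obs_cloud):
          p = np.asarray(p_cloud, dtype=float)    # cloud fraction in [0, 1]
          o = np.asarray(obs_cloud, dtype=float)  # CALIPSO VFM: 1 cloud, 0 clear
          return np.mean((p - o) ** 2)

      # Invented matched samples along the CALIPSO track:
      p = np.array([0.9, 0.2, 0.6, 0.0, 1.0])
      o = np.array([1.0, 0.0, 1.0, 0.0, 1.0])
      print(brier_score(p, o))  # 0 is perfect; 0.25 matches a constant 0.5 forecast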

  17. Towards a Cloud Computing Environment: Near Real-time Cloud Product Processing and Distribution for Next Generation Satellites

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.; Palikonda, R.; Smith, W. L., Jr.; Spangenberg, D.

    2016-12-01

    The NASA LaRC Satellite ClOud and Radiative Property retrieval System (SatCORPS) processes and derives near real-time (NRT) global cloud products from operational geostationary satellite imager datasets. These products are being used in NRT to improve forecast models, aircraft icing warnings, and support for aircraft field campaigns. Next-generation satellites, such as the Japanese Himawari-8 and the upcoming NOAA GOES-R, present challenges for NRT data processing and product dissemination due to the increase in temporal and spatial resolution: the volume of data is expected to increase roughly tenfold. This increase in data volume will require additional IT resources to keep up with the processing demands of NRT requirements, and such resources are not readily available due to cost and other technical limitations. To anticipate and meet these computing resource requirements, we have employed a hybrid cloud computing environment to augment the generation of SatCORPS products. This paper will describe the workflow to ingest, process, and distribute SatCORPS products and the technologies used. Lessons learned from working on both the AWS Cloud and GovCloud will be discussed: benefits, similarities, and differences that could impact the decision to use cloud computing and storage. A detailed cost analysis will be presented. In addition, future cloud utilization, parallelization, and architecture layout will be discussed for GOES-R.

  18. If It's in the Cloud, Get It on Paper: Cloud Computing Contract Issues

    ERIC Educational Resources Information Center

    Trappler, Thomas J.

    2010-01-01

    Much recent discussion has focused on the pros and cons of cloud computing. Some institutions are attracted to cloud computing benefits such as rapid deployment, flexible scalability, and low initial start-up cost, while others are concerned about cloud computing risks such as those related to data location, level of service, and security…

  19. Cloud-based hospital information system as a service for grassroots healthcare institutions.

    PubMed

    Yao, Qin; Han, Xiong; Ma, Xi-Kun; Xue, Yi-Feng; Chen, Yi-Jun; Li, Jing-Song

    2014-09-01

    Grassroots healthcare institutions (GHIs) are the smallest administrative levels of medical institutions, where most patients access health services. The latest report from the National Bureau of Statistics of China showed that 96.04 % of 950,297 medical institutions in China were at the grassroots level in 2012, including county-level hospitals, township central hospitals, community health service centers, and rural clinics. In developing countries, these institutions face challenges involving a shortage of funds and talent, inconsistent medical standards, inefficient information sharing, and difficulties in management during the adoption of health information technologies (HIT). Given the needs and importance of GHIs, our aim is to provide hospital information services for GHIs using Cloud computing technologies and service modes. In this medical scenario, the computing resources are pooled by means of a Cloud-based Virtual Desktop Infrastructure (VDI) to serve multiple GHIs, with different hospital information systems dynamically assigned and reassigned according to demand. This paper is concerned with establishing a Cloud-based Hospital Information Service Center to provide hospital information software as a service (HI-SaaS), with the aim of offering GHIs an attractive and high-performance medical information service. Compared with individually establishing all hospital information systems, this approach is more cost-effective and affordable for GHIs and does not compromise HIT performance.

  20. Satellite Derived Volcanic Ash Product Inter-Comparison in Support to SCOPE-Nowcasting

    NASA Astrophysics Data System (ADS)

    Siddans, Richard; Thomas, Gareth; Pavolonis, Mike; Bojinski, Stephan

    2016-04-01

    In support of aeronautical meteorological services, WMO organized a satellite-based volcanic ash retrieval algorithm inter-comparison activity to improve the consistency of quantitative volcanic ash products from satellites, under the Sustained, Coordinated Processing of Environmental Satellite Data for Nowcasting (SCOPE-Nowcasting) initiative (http://www.wmo.int/pages/prog/sat/scope-nowcasting_en.php). The aims of the inter-comparison were as follows: 1. Select cases (Sarychev Peak 2009, Eyjafjallajökull 2010, Grímsvötn 2011, Puyehue-Cordón Caulle 2011, Kirishimayama 2011, Kelut 2014), and quantify the differences between satellite-derived volcanic ash cloud properties derived from different techniques and sensors; 2. Establish a basic validation protocol for satellite-derived volcanic ash cloud properties; 3. Document the strengths and weaknesses of different remote sensing approaches as a function of satellite sensor; 4. Standardize the units and quality flags associated with volcanic cloud geophysical parameters; 5. Provide recommendations to Volcanic Ash Advisory Centers (VAACs) and other users on how best to utilize quantitative satellite products in operations; 6. Create a "road map" for future volcanic ash related scientific developments and inter-comparison/validation activities that can also be applied to SO2 clouds and emergent volcanic clouds. Volcanic ash satellite remote sensing experts from operational and research organizations were encouraged to participate in the inter-comparison activity, to establish the plans for the inter-comparison and to submit data sets. RAL was contracted by EUMETSAT to perform a systematic inter-comparison of all submitted datasets, and results were reported at the WMO International Volcanic Ash Inter-comparison Meeting held on 29 June - 2 July 2015 in Madison, WI, USA (http://cimss.ssec.wisc.edu/meetings/vol_ash14). In total, 26 different data sets were submitted, from a range of passive imagers and spectrometers, and these were inter-compared against each other and against validation data such as CALIPSO lidar, ground-based lidar and aircraft observations. Results of the comparison exercise will be presented together with the conclusions and recommendations arising from the activity.

  1. Privacy-Aware Relevant Data Access with Semantically Enriched Search Queries for Untrusted Cloud Storage Services.

    PubMed

    Pervez, Zeeshan; Ahmad, Mahmood; Khattak, Asad Masood; Lee, Sungyoung; Chung, Tae Choong

    2016-01-01

    Privacy-aware search of outsourced data ensures relevant data access in the untrusted domain of a public cloud service provider. A subscriber of a public cloud storage service can determine the presence or absence of a particular keyword by submitting a search query in the form of a trapdoor. However, these trapdoor-based search queries are limited in functionality and cannot be used to identify secure outsourced data which contains semantically equivalent information. In addition, trapdoor-based methodologies are confined to pre-defined trapdoors and prevent subscribers from searching outsourced data with arbitrarily defined search criteria. To solve the problem of relevant data access, we have proposed an index-based privacy-aware search methodology that ensures semantic retrieval of data from an untrusted domain. This method ensures oblivious execution of a search query and enables authorized subscribers to model conjunctive search queries without relying on predefined trapdoors. A security analysis of our proposed methodology shows that, in a conspired attack, unauthorized subscribers and untrusted cloud service providers cannot deduce any information that can lead to the potential loss of data privacy. A computational time analysis on commodity hardware demonstrates that our proposed methodology requires moderate computational resources to model a privacy-aware search query and for its oblivious evaluation on a cloud service provider.

  2. Privacy-Aware Relevant Data Access with Semantically Enriched Search Queries for Untrusted Cloud Storage Services

    PubMed Central

    Pervez, Zeeshan; Ahmad, Mahmood; Khattak, Asad Masood; Lee, Sungyoung; Chung, Tae Choong

    2016-01-01

    Privacy-aware search of outsourced data ensures relevant data access in the untrusted domain of a public cloud service provider. A subscriber of a public cloud storage service can determine the presence or absence of a particular keyword by submitting a search query in the form of a trapdoor. However, these trapdoor-based search queries are limited in functionality and cannot be used to identify secure outsourced data which contains semantically equivalent information. In addition, trapdoor-based methodologies are confined to pre-defined trapdoors and prevent subscribers from searching outsourced data with arbitrarily defined search criteria. To solve the problem of relevant data access, we have proposed an index-based privacy-aware search methodology that ensures semantic retrieval of data from an untrusted domain. This method ensures oblivious execution of a search query and enables authorized subscribers to model conjunctive search queries without relying on predefined trapdoors. A security analysis of our proposed methodology shows that, in a conspired attack, unauthorized subscribers and untrusted cloud service providers cannot deduce any information that can lead to the potential loss of data privacy. A computational time analysis on commodity hardware demonstrates that our proposed methodology requires moderate computational resources to model a privacy-aware search query and for its oblivious evaluation on a cloud service provider. PMID:27571421
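
    For context, the pre-defined trapdoor search that both records describe as limited can be sketched minimally as follows (an illustrative keyed-hash index, not the proposed semantically enriched scheme):

      # Trapdoor-style keyword search over an outsourced index: the server
      # stores only keyed keyword digests, never the keywords themselves.
      import hmac, hashlib, os

      key = os.urandom(32)  # shared by data owner and authorized subscribers

      def trapdoor(keyword):
          return hmac.new(key, keyword.encode(), hashlib.sha256).digest()

      # Owner builds the index before outsourcing the documents.
      index = {
          trapdoor("genome"): ["doc1", "doc3"],
          trapdoor("cloud"): ["doc2"],
      }

      # A subscriber searches by submitting a trapdoor, not the keyword.
      def search(index, keyword):
          return index.get(trapdoor(keyword), [])

      print(search(index, "cloud"))    # ['doc2']
      print(search(index, "privacy"))  # [] -- only pre-defined keywords match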

  3. Narrative medicine and death in the ICU: word clouds as a visual legacy.

    PubMed

    Vanstone, Meredith; Toledo, Feli; Clarke, France; Boyle, Anne; Giacomini, Mita; Swinton, Marilyn; Saunders, Lois; Shears, Melissa; Zytaruk, Nicole; Woods, Anne; Rose, Trudy; Hand-Breckenridge, Tracey; Heels-Ansdell, Diane; Anderson-White, Shelley; Sheppard, Robert; Cook, Deborah

    2016-11-24

    The Word Cloud is a frequent wish in the 3 Wishes Project developed to nurture peace and ease the grieving process for dying critically ill patients. The objective was to examine whether Word Clouds can act as a heuristic approach to encourage a narrative orientation to medicine. Narrative medicine is an approach which can strengthen relationships, compassion and resilience. Word Clouds were created for 42 dying patients, and we interviewed 37 family members and 73 clinicians about their impact. We conducted a directed qualitative content analysis, using the 3 stages of narrative medicine (attention, representation, affiliation) to examine the narrative medicine potential of Word Clouds. The elicitation of stories for the Word Cloud promotes narrative attention to the patient as a whole person. The distillation of these stories into a list of words and the prioritisation of those words for arrangement in the collage encourages a representation that did not enforce a beginning, middle or end to the story of the patient's life. Strong affiliative connections were achieved through the honouring of patients, caring for families and sharing of memories encouraged through the creation, sharing and discussion of Word Clouds. In the 3 Wishes Project, Word Clouds are 1 way that families and clinicians honour a dying patient. Engaging in the process of making a Word Cloud can promote a narrative orientation to medicine, forging connections, making meaning through reminiscence and leaving a legacy of a loved one. Documenting and displaying words to remember someone in death reaffirms their life.

  4. Aviation response to a widely dispersed volcanic ash and gas cloud from the August 2008 eruption of Kasatochi, Alaska, USA

    USGS Publications Warehouse

    Guffanti, Marianne; Schneider, David J.; Wallace, Kristi L.; Hall, Tony; Bensimon, Dov R.; Salinas, Leonard J.

    2010-01-01

    The extensive volcanic cloud from Kasatochi's 2008 eruption caused widespread disruptions to aviation operations along Pacific oceanic, Canadian, and U.S. air routes. Based on aviation hazard warnings issued by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, the Federal Aviation Administration, and Meteorological Service of Canada, air carriers largely avoided the volcanic cloud over a 5 day period by route modifications and flight cancellations. Comparison of time coincident GOES thermal infrared (TIR) data for ash detection with Ozone Monitoring Instrument (OMI) ultraviolet data for SO2 detection shows congruent areas of ash and gas in the volcanic cloud in the 2 days following onset of ash production. After about 2.5 days, the area of SO2 detected by OMI was more extensive than the area of ash indicated by TIR data, indicating significant ash depletion by fall out had occurred. Pilot reports of visible haze at cruise altitudes over Canada and the northern United States suggested that SO2 gas had converted to sulfate aerosols. Uncertain about the hazard potential of the aging cloud, airlines coped by flying over, under, or around the observed haze layer. Samples from a nondamaging aircraft encounter with Kasatochi's nearly 3 day old cloud contained volcanic silicate particles, confirming that some fine ash is present in predominantly gas clouds. The aircraft's exposure to ash was insufficient to cause engine damage; however, slightly damaging encounters with volcanic clouds from eruptions of Reventador in 2002 and Hekla in 2000 indicate the possibility of lingering hazards associated with old and/or diffuse volcanic clouds.

  5. Parameterization of cloud glaciation by atmospheric dust

    NASA Astrophysics Data System (ADS)

    Nickovic, Slobodan; Cvetkovic, Bojan; Madonna, Fabio; Pejanovic, Goran; Petkovic, Slavko

    2016-04-01

    The exponential growth of research interest in ice nucleation (IN) is motivated, inter alia, by the need to improve the generally unsatisfactory representation of cold cloud formation in atmospheric models, and thereby to increase the accuracy of weather and climate predictions, including better forecasting of precipitation. Research shows that mineral dust contributes significantly to cloud ice nucleation. Samples of residual particles in cloud ice crystals, collected by aircraft measurements performed in the upper troposphere of regions distant from desert sources, indicate that dust particles dominate over other known ice nuclei such as soot and biological particles. In the nucleation process, dust chemical aging had minor effects. The observational evidence on IN processes has substantially improved over the last decade and clearly shows a significant correlation between IN concentrations and the concentrations of coarser aerosol at a given temperature and moisture. Most recently, owing to the recognized dominant role of dust as ice nuclei, parameterizations for immersion and deposition icing specifically due to dust have been developed. Based on these achievements, we have developed a real-time coupled atmosphere-dust forecasting system capable of operationally predicting the occurrence of cold clouds generated by dust. We have thoroughly validated the model simulations against available remote sensing observations. We used the CNR-IMAA Potenza lidar and cloud radar observations to explore the model's capability to represent the vertical features of cloud and aerosol profiles. We also utilized MSG-SEVIRI and MODIS satellite data to examine the accuracy of the simulated horizontal distribution of cold clouds. Based on the encouraging verification scores obtained, operational experimental prediction of ice clouds nucleated by dust has been introduced at the Serbian Hydrometeorological Service as a publicly available product.

  6. Verification of NWP Cloud Properties using A-Train Satellite Observations

    NASA Astrophysics Data System (ADS)

    Kucera, P. A.; Weeks, C.; Wolff, C.; Bullock, R.; Brown, B.

    2011-12-01

    Recently, the NCAR Model Evaluation Tools (MET) has been enhanced to incorporate satellite observations for the verification of Numerical Weather Prediction (NWP) cloud products. We have developed tools that match fields spatially (both in the vertical and horizontal dimensions) to compare NWP products with satellite observations. These matched fields provide diagnostic evaluation of cloud macro attributes such as vertical distribution of clouds, cloud top height, and the spatial and seasonal distribution of cloud fields. For this research study, we have focused on using CloudSat, CALIPSO, and MODIS observations to evaluate cloud fields for a variety of NWP fields and derived products. We have selected cases ranging from large, mid-latitude synoptic systems to well-organized tropical cyclones. For each case, we matched the observed cloud field with gridded model and/or derived product fields. CloudSat and CALIPSO observations and model fields were matched and compared in the vertical along the orbit track. MODIS data and model fields were matched and compared in the horizontal. We then use MET to compute the verification statistics to quantify the performance of the models in representing the cloud fields. In this presentation we will give a summary of our comparison and show verification results for both synoptic and tropical cyclone cases.

  7. Design and implementation of a reliable and cost-effective cloud computing infrastructure: the INFN Napoli experience

    NASA Astrophysics Data System (ADS)

    Capone, V.; Esposito, R.; Pardi, S.; Taurino, F.; Tortone, G.

    2012-12-01

    Over the last few years we have seen an increasing number of services and applications needed to manage and maintain cloud computing facilities. This is particularly true for computing in high energy physics, which often requires complex configurations and distributed infrastructures. In this scenario, a cost-effective rationalization and consolidation strategy is the key to success in terms of scalability and reliability. In this work we describe an IaaS (Infrastructure as a Service) cloud computing system, with high availability and redundancy features, which is currently in production at the INFN-Napoli ATLAS Tier-2 data centre. The main goal we intended to achieve was a simplified method to manage our computing resources and deliver reliable user services, reusing existing hardware without incurring heavy costs. A combined usage of virtualization and clustering technologies allowed us to consolidate our services on a small number of physical machines, reducing electric power costs. As a result of our efforts we developed a complete solution for data and computing centres that can be easily replicated using commodity hardware. Our architecture consists of two main subsystems: a clustered storage solution, built on top of disk servers running the GlusterFS file system, and a virtual machine execution environment. GlusterFS is a network file system able to perform parallel writes on multiple disk servers, providing in this way live replication of data. High availability is also achieved via a network configuration using redundant switches and multiple paths between hypervisor hosts and disk servers. We also developed a set of management scripts to easily perform basic system administration tasks such as automatic deployment of new virtual machines, adaptive scheduling of virtual machines on hypervisor hosts, live migration and automated restart in case of hypervisor failures.
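
    The management scripts mentioned above automate tasks such as deployment, live migration, and restart after hypervisor failure. The restart logic might look roughly like this (illustrative only: it shells out to the standard virsh CLI, and the host names, guest inventory and health check are assumptions, not the INFN implementation):

      # Sketch: restart guests of a failed hypervisor on a healthy one
      # via virsh, assuming shared storage (e.g., GlusterFS) and guest
      # definitions present on all hosts.
      import subprocess

      # Assumed inventory: which guests normally run on which hypervisor.
      INVENTORY = {
          "hv1.example.org": ["web01", "db01"],
          "hv2.example.org": ["batch01"],
      }

      def alive(host):
          # Crude health check (placeholder for real monitoring).
          return subprocess.run(["ping", "-c", "1", "-W", "2", host],
                                capture_output=True).returncode == 0

      def start_guest(host, dom):
          # `virsh start` on the remote libvirt daemon over SSH.
          subprocess.run(["virsh", "-c", f"qemu+ssh://{host}/system",
                          "start", dom], check=False)

      for hv, doms in INVENTORY.items():
          if not alive(hv):
              # Raises StopIteration if no healthy host remains (sketch only).
              target = next(h for h in INVENTORY if h != hv and alive(h))
              print(f"{hv} down; restarting {doms} on {target}")
              for dom in doms:
                  start_guest(target, dom)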

  8. Ten Years of Cloud Properties from MODIS: Global Statistics and Use in Climate Model Evaluation

    NASA Technical Reports Server (NTRS)

    Platnick, Steven E.

    2011-01-01

    The NASA Moderate Resolution Imaging Spectroradiometer (MODIS), launched onboard the Terra and Aqua spacecraft, began Earth observations on February 24, 2000 and June 24, 2002, respectively. Among the algorithms developed and applied to this sensor, a suite of cloud products includes cloud masking/detection, cloud-top properties (temperature, pressure), and optical properties (optical thickness, effective particle radius, water path, and thermodynamic phase). All cloud algorithms underwent numerous changes and enhancements leading to the latest Collection 5 production version; this process continues with the current Collection 6 development. We will show example MODIS Collection 5 cloud climatologies derived from global spatial and temporal aggregations provided in the archived gridded Level-3 MODIS atmosphere team product (product names MOD08 and MYD08 for MODIS Terra and Aqua, respectively). Data sets in this Level-3 product include scalar statistics as well as 1- and 2-D histograms of many cloud properties, allowing for higher-order information and correlation studies. In addition to these statistics, we will show trends and statistical significance in annual and seasonal means for a variety of the MODIS cloud properties, as well as the time required for detection given assumed trends. To assist in climate model evaluation, we have developed a MODIS cloud simulator with an accompanying netCDF file containing subsetted monthly Level-3 statistical data sets that correspond to the simulator output. Correlations of cloud properties with ENSO offer the potential to evaluate model cloud sensitivity; initial results will be discussed.

  9. Method for validating cloud mask obtained from satellite measurements using ground-based sky camera.

    PubMed

    Letu, Husi; Nagao, Takashi M; Nakajima, Takashi Y; Matsumae, Yoshiaki

    2014-11-01

    Error propagation in the Earth atmospheric, oceanic, and land surface parameters of satellite products caused by misclassification in the cloud mask is a critical issue for improving the accuracy of satellite products. Thus, characterizing the accuracy of the cloud mask is important for investigating its influence on satellite products. In this study, we proposed a method for validating cloud masks derived from multiwavelength satellite data using ground-based sky camera (GSC) data. First, a cloud cover algorithm for GSC data was developed using a sky index and a bright index. Then, cloud masks derived from Moderate Resolution Imaging Spectroradiometer (MODIS) satellite data by two cloud-screening algorithms (i.e., MOD35 and CLAUDIA) were validated using the GSC cloud mask. The results indicate that MOD35 is likely to classify ambiguous pixels as "cloudy," whereas CLAUDIA is likely to classify them as "clear." Furthermore, the influence of error propagation caused by misclassification in the MOD35 and CLAUDIA cloud masks on MODIS-derived reflectance, brightness temperature, and normalized difference vegetation index (NDVI) in clear and cloudy pixels was investigated using the sky camera data. The influence of error propagation by the MOD35 cloud mask on the MODIS-derived monthly mean reflectance, brightness temperature, and NDVI for clear pixels is significantly smaller than that by the CLAUDIA cloud mask; conversely, the influence of error propagation by the CLAUDIA cloud mask on MODIS-derived monthly mean cloud products for cloudy pixels is significantly smaller than that by the MOD35 cloud mask.
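
    Once GSC and satellite pixels are matched in space and time, validating one binary cloud mask against another reduces to a contingency-table comparison, sketched below (illustrative; the co-location step itself is omitted):

      # Compare a satellite cloud mask against a ground-based sky-camera
      # mask on matched pixels: 1 = cloudy, 0 = clear.
      import numpy as np

      def agreement_stats(sat, gsc):
          sat, gsc = np.asarray(sat), np.asarray(gsc)
          hits          = np.sum((sat == 1) & (gsc == 1))
          false_alarms  = np.sum((sat == 1) & (gsc == 0))
          misses        = np.sum((sat == 0) & (gsc == 1))
          correct_clear = np.sum((sat == 0) & (gsc == 0))
          n = sat.size
          return {
              "accuracy": (hits + correct_clear) / n,
              "hit_rate": hits / max(hits + misses, 1),
              "false_alarm_rate": false_alarms / max(false_alarms + correct_clear, 1),
          }

      sat = np.array([1, 1, 0, 0, 1, 0])  # e.g., MOD35 over matched pixels
      gsc = np.array([1, 0, 0, 0, 1, 1])  # sky-camera reference
      print(agreement_stats(sat, gsc))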

  10. Level 4 Global and European Chl-a Daily Analyses for End Users and Data Assimilation in the Frame of the Copernicus-Marine Environment Monitoring Service

    NASA Astrophysics Data System (ADS)

    Saulquin, Bertrand; Gohin, Francis; Garnesson, Philippe; Demaria, Julien; Mangin, Antoine; Fanton d'Andon, Odile

    2016-08-01

    The Level-4 daily chl-a products are a combination of a water-type-based merging of chl-a estimates and an optimal interpolation based on the kriging method with regional anisotropic models [1, 2]. The Level-4 products provide a global, continuous (cloud-free) estimate of the surface chl-a concentration at 4 km resolution over the world and 1 km resolution over Europe. The Level-4 products gather MODIS, MERIS, SeaWiFS, VIIRS and OLCI daily observations from 1998 to the present. The Level-4 product spares end users from dealing with the typical lack of data observed during cloudy conditions, and with the historical multiplicity of available algorithms arising from case 1 (oligotrophic) and case 2 (turbid) water issues in ocean colour [3, 4]. A total product uncertainty, i.e. a combination of the interpolation and estimation errors, is provided for each daily product. The L4 products are freely distributed in the frame of the Copernicus - Marine environment monitoring service.
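
    The optimal interpolation step fills cloud-induced gaps from nearby valid retrievals. Kriging proper requires fitted variogram models (here, regionally anisotropic ones); as a deliberately simplified stand-in that shows the gap-filling idea, an inverse-distance-weighted fill might look like this (not the product's actual algorithm):

      # Fill missing (cloudy) grid cells from nearby valid chl-a values
      # with inverse-distance weighting -- a crude stand-in for kriging.
      import numpy as np

      def idw_fill(grid, power=2.0):
          filled = grid.copy()
          valid = ~np.isnan(grid)
          vy, vx = np.nonzero(valid)
          vals = grid[valid]
          for y, x in zip(*np.nonzero(~valid)):
              d2 = (vy - y) ** 2 + (vx - x) ** 2   # squared pixel distances
              w = 1.0 / d2 ** (power / 2)
              filled[y, x] = np.sum(w * vals) / np.sum(w)
          return filled

      chl = np.array([[0.2, 0.3, np.nan],
                      [0.4, np.nan, 0.5],
                      [0.6, 0.7, 0.8]])
      print(idw_fill(chl))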

  11. AIRS Data Subsetting Service at the Goddard Earth Sciences (GES) DISC/DAAC

    NASA Technical Reports Server (NTRS)

    Vicente, Gilberto A.; Qin, Jianchun; Li, Jason; Gerasimov, Irina; Savtchenko, Andrey

    2004-01-01

    The AIRS mission, as a combination of the Atmospheric Infrared Sounder (AIRS), the Advanced Microwave Sounding Unit (AMSU) and the Humidity Sounder for Brazil (HSB), brings climate research and weather prediction into the 21st century. From NASA's Aqua spacecraft, the AIRS/AMSU/HSB instruments measure humidity, temperature, cloud properties and the amounts of greenhouse gases. The AIRS also reveals land and sea-surface temperatures. Measurements from these three instruments are analyzed jointly to filter out the effects of clouds from the IR data in order to derive clear-column air-temperature profiles and surface temperatures with high vertical resolution and accuracy. Together, they constitute an advanced operational sounding data system that has contributed to improved global modeling efforts and numerical weather prediction; enhanced studies of the global energy and water cycles, the effects of greenhouse gases, and atmosphere-surface interactions; and facilitated monitoring of climate variations and trends. The high data volume generated by the AIRS/AMSU/HSB instruments and the complexity of their data format (Hierarchical Data Format, HDF) are barriers to AIRS data use. Although many researchers are interested in only a fraction of the data they receive or request, they are forced to run their algorithms on a much larger data set to extract the information of interest. In order to better serve its users, the GES DISC/DAAC, provider of long-term archives and distribution services as well as science support for the AIRS/AMSU/HSB data products, has developed various tools for performing channel, variable, parameter, spatial and derived-product subsetting, resampling and reformatting operations. This presentation mainly describes the web-enabled subsetting services currently available at the GES DISC/DAAC, which provide subsetting functions for all Level 1B and Level 2 data products from the AIRS/AMSU/HSB instruments.
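
    Spatial and channel subsetting amounts to slicing the multidimensional granule arrays before delivery. The idea can be sketched on an in-memory array (hypothetical array names and random stand-in data; real AIRS granules are HDF files read with an HDF library):

      # Subset a (channel, scanline, footprint) radiance cube by channel
      # list and lat/lon bounding box, keeping only what the user asked for.
      import numpy as np

      nchan, nscan, nfoot = 2378, 135, 90             # AIRS-like dimensions
      radiance = np.random.rand(nchan, nscan, nfoot)  # stand-in for granule data
      lat = np.random.uniform(-90, 90, (nscan, nfoot))
      lon = np.random.uniform(-180, 180, (nscan, nfoot))

      def subset(radiance, lat, lon, channels, bbox):
          south, north, west, east = bbox
          inside = (lat >= south) & (lat <= north) & (lon >= west) & (lon <= east)
          sub = radiance[channels][:, inside]         # (n_channels, n_pixels)
          return sub, lat[inside], lon[inside]

      sub, slat, slon = subset(radiance, lat, lon,
                               channels=[100, 200, 300],
                               bbox=(-10, 10, 60, 100))
      print(sub.shape)  # e.g., (3, N) -- a small fraction of the full granule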

  12. Developing cloud-based Business Process Management (BPM): a survey

    NASA Astrophysics Data System (ADS)

    Mercia; Gunawan, W.; Fajar, A. N.; Alianto, H.; Inayatulloh

    2018-03-01

    In today's highly competitive business environment, modern enterprises face difficulties in cutting unnecessary costs, eliminating waste and delivering substantial benefits to the organization. Companies are increasingly turning to a more flexible IT environment to help them realize this goal. For this reason, this article examines cloud-based Business Process Management (BPM), which enables a focus on modeling, monitoring and process management. Cloud-based BPM consists of business processes, business information and IT resources, which help build real-time intelligence systems based on business management and cloud technology. Cloud computing is a paradigm that involves procuring dynamically scalable resources over the internet as an IT resource service. A cloud-based BPM service addresses common problems faced by traditional BPM, especially in promoting flexible, event-driven business processes that exploit opportunities in the marketplace.

  13. Cloud Surprises in Moving NASA EOSDIS Applications into Amazon Web Services

    NASA Technical Reports Server (NTRS)

    Mclaughlin, Brett

    2017-01-01

    NASA ESDIS has been moving a variety of data ingest, distribution, and science data processing applications into a cloud environment over the last 2 years. As expected, there have been a number of challenges in migrating primarily on-premises applications into a cloud-based environment, related to architecture and taking advantage of cloud-based services. What was not expected was a number of issues that went beyond purely technical application re-architecture. We ran into surprising network policy limitations, billing challenges in a government-based cost model, and difficulty in obtaining certificates in a NASA security-compliant manner. On the other hand, this approach has allowed us to move a number of applications from local hosting to the cloud in a matter of hours (yes, hours!!), and our CMR application now services 95% of granule searches and an astonishing 99% of all collection searches in under a second. And most surprising of all, well, you'll just have to wait and see the realization that caught our entire team off guard!

  14. Service-oriented Software Defined Optical Networks for Cloud Computing

    NASA Astrophysics Data System (ADS)

    Liu, Yuze; Li, Hui; Ji, Yuefeng

    2017-10-01

    With the development of big data and cloud computing technology, the traditional software-defined network is facing new challenges (e.g., ubiquitous accessibility, higher bandwidth, more flexible management and greater security). This paper proposes a new service-oriented software defined optical network architecture, comprising a resource layer, a service abstraction layer, a control layer and an application layer. We then describe the corresponding service-provisioning method, in which a distinct service ID identifies the service a device can offer. Finally, we experimentally demonstrate that the proposed method can be applied to transmit different services based on the service ID in the service-oriented software defined optical network.

  15. A cloud computing based 12-lead ECG telemedicine service

    PubMed Central

    2012-01-01

    Background: Due to the great variability of 12-lead ECG instruments and medical specialists’ interpretation skills, it remains a challenge to deliver rapid and accurate 12-lead ECG reports with senior cardiologists’ decision making support in emergency telecardiology. Methods: We create a new cloud and pervasive computing based 12-lead Electrocardiography (ECG) service to realize ubiquitous 12-lead ECG tele-diagnosis. Results: This developed service enables ECG to be transmitted and interpreted via mobile phones. That is, tele-consultation can take place while the patient is on the ambulance, between the onsite clinicians and the off-site senior cardiologists, or among hospitals. Most importantly, this developed service is convenient, efficient, and inexpensive. Conclusions: This cloud computing based ECG tele-consultation service expands the traditional 12-lead ECG applications onto the collaboration of clinicians at different locations or among hospitals. In short, this service can greatly improve medical service quality and efficiency, especially for patients in rural areas. This service has been evaluated and proved to be useful by cardiologists in Taiwan. PMID:22838382

  16. A cloud computing based 12-lead ECG telemedicine service.

    PubMed

    Hsieh, Jui-Chien; Hsu, Meng-Wei

    2012-07-28

    Due to the great variability of 12-lead ECG instruments and medical specialists' interpretation skills, it remains a challenge to deliver rapid and accurate 12-lead ECG reports with senior cardiologists' decision making support in emergency telecardiology. We create a new cloud and pervasive computing based 12-lead Electrocardiography (ECG) service to realize ubiquitous 12-lead ECG tele-diagnosis. This developed service enables ECG to be transmitted and interpreted via mobile phones. That is, tele-consultation can take place while the patient is on the ambulance, between the onsite clinicians and the off-site senior cardiologists, or among hospitals. Most importantly, this developed service is convenient, efficient, and inexpensive. This cloud computing based ECG tele-consultation service expands the traditional 12-lead ECG applications onto the collaboration of clinicians at different locations or among hospitals. In short, this service can greatly improve medical service quality and efficiency, especially for patients in rural areas. This service has been evaluated and proved to be useful by cardiologists in Taiwan.

  17. CloudSat Level 3 Gridded Products: Introduction and Application to Evaluation of the MERRA-2 Reanalysis

    NASA Astrophysics Data System (ADS)

    Haynes, J. M.; Miller, S. D.; Partain, P.

    2016-12-01

    CloudSat mission data are currently available to the science community in the form of granule-level, single-orbit Level 2 products. Although this is useful for process-level studies and investigation of individual radar profiles, it is less convenient for regional studies or investigations requiring that cloud properties be aggregated over long periods of time. This aggregation process is not necessarily straightforward: it must be tailored to the specific data product and scientific data contained therein, it requires large amounts of data transfer, and care must be taken to perform the aggregation only on statistically significant spatial and temporal scales. To make CloudSat data more accessible to the broader scientific community, and to better preserve the environmental data record, a suite of Level 3 (L3) gridded data products is being developed by the CloudSat Data Processing Center (DPC). These products are being developed in four broad categories: (1) radiation budget, (2) radar reflectivity, (3) precipitation incidence and type, and (4) microphysics. L3 products will be generated both on standard (i.e., fixed-resolution) grids and dynamically, with user-configurable grid spacing and timescales, via an online user interface. An important distinction of the current L3 development is its usage of dynamically configurable histograms, allowing for representation of the detailed, non-Gaussian characteristics of the data distribution. This work serves both to introduce these products to the wider scientific community and to demonstrate their utility for model and reanalysis evaluation. Toward the latter goal, an analysis of the new Modern Era Retrospective-Analysis for Research and Applications version 2 (MERRA-2) cloud products is performed using a development version of the CloudSat L3 products. L3 products are used to evaluate near-global cloud fraction, optical depth, cloud liquid and ice water content, shortwave and longwave cloud radiative effects, and precipitation occurrence. These results are then contrasted against the corresponding MERRA-2 fields, and the differences are explored in terms of potential improvements and/or shortcomings in both the reanalysis and observational products.
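
    Aggregating orbit granules onto a fixed grid is, at its core, weighted binning of along-track samples by latitude and longitude. A minimal sketch of a mean on a configurable grid (illustrative; production L3 code must also handle quality flags, minimum-sample thresholds, and the configurable histograms mentioned above):

      # Grid along-track samples (lat, lon, value) to mean maps on an
      # arbitrary lat/lon resolution, Level-3 style.
      import numpy as np

      def grid_mean(lat, lon, val, res_deg=2.5):
          lat_edges = np.arange(-90, 90 + res_deg, res_deg)
          lon_edges = np.arange(-180, 180 + res_deg, res_deg)
          sums, _, _ = np.histogram2d(lat, lon, bins=[lat_edges, lon_edges],
                                      weights=val)
          counts, _, _ = np.histogram2d(lat, lon, bins=[lat_edges, lon_edges])
          with np.errstate(invalid="ignore"):
              return sums / counts   # NaN where no samples fell in a cell

      # Synthetic along-track samples standing in for a CloudSat granule:
      n = 10_000
      lat = np.random.uniform(-80, 80, n)
      lon = np.random.uniform(-180, 180, n)
      cf = np.random.rand(n)                # e.g., cloud fraction per profile
      print(grid_mean(lat, lon, cf).shape)  # (72, 144) at 2.5 degrees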

  18. The Development of an Educational Cloud for IS Curriculum through a Student-Run Data Center

    ERIC Educational Resources Information Center

    Hwang, Drew; Pike, Ron; Manson, Dan

    2016-01-01

    The industry-wide emphasis on cloud computing has created a new focus in Information Systems (IS) education. As the demand for graduates with adequate knowledge and skills in cloud computing is on the rise, IS educators are facing a challenge to integrate cloud technology into their curricula. Although public cloud tools and services are available…

  19. Bootstrapping and Maintaining Trust in the Cloud

    DTIC Science & Technology

    2016-03-16

    of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services, Google Compute Engine, Rackspace, et al. means that...Implementation: We implemented keylime in ∼3.2k lines of Python in four components: registrar, node, CV, and tenant. The registrar offers a REST-based web...bootstrap key K. It provides an unencrypted REST-based web service for these two functions. As described earlier, the protocols for exchanging data

  20. e-Infrastructures interoperability: the Geohazards Exploitation Platform for the use of satellite Earth observations in Geosciences

    NASA Astrophysics Data System (ADS)

    Caumont, Herve; Brito, Fabrice; Mathot, Emmanuel; Barchetta, Francesco; Loeschau, Frank

    2015-04-01

    We present recent achievements with the Geohazards Exploitation Platform (GEP), a European contribution to the GEO SuperSites, and its interoperability with the MEDiterranean SUpersite Volcanoes (MED-SUV) e-infrastructure. The GEP is a catalyst for the use of satellite Earth observation missions, providing data to initiatives such as the GEO Geohazard Supersites and Natural Laboratories (GSNL), the Volcano and Seismic Hazards CEOS Pilots, and the European Plate Observing System (EPOS). As satellite sensors deliver increasing amounts of data, researchers need more computational science tools and services. The GEP contribution in this regard allows scientists to access different data types, relevant to the same area and phenomena, and to directly stage selected inputs to scalable processing applications that deliver EO-based science products. With the GEP concept of operation for improved collaboration, a partner can bring its own processing tools, use shared toolboxes from its workspace, and access large data repositories. GEP is based on open-source software components and on a cloud services architecture inheriting a range of ESA- and EC-funded innovations, and it associates the scientific community and SMEs in implementing new capabilities. Via MED-SUV, we are making discoverable and accessible a large number of products over the Mt. Etna and Vesuvius/Campi Flegrei volcanic areas, which are of broader interest to Geosciences researchers, so they can process ENVISAT MERIS, ENVISAT ASAR, and ERS SAR data (both Level 1 and Level 2) hosted in the ESA clusters and in ESA's Virtual Archive, TerraSAR-X data hosted in DLR's Virtual Archive, as well as data hosted in other dedicated MED-SUV Virtual Archives (e.g. for LANDSAT, EO-1). GEP will gradually access Sentinel-1A data, other space agencies' data, and value-added products. Processed products can also be published and archived on the MED-SUV e-infrastructure. In this effort, data policy rules applied to the acquisitions are verified against the GEOSS Data Collection of Open Resources for Everyone (GEOSS Data-CORE) principles. The resulting infrastructure repositories include connectivity to the GEOSS Data Access Broker (DAB), through the "OGC CS-W OpenSearch Geo and Time extensions" interface standard, a key interoperability arrangement used by the MED-SUV systems, making EO data products available to both the project partners and the broader initiatives. GEP is also proposing and further developing hosted processing, aimed at MED-SUV researchers' work on new methods to integrate in-situ and satellite sensor data: a set of user services (the concept of Platform-as-a-Service, or PaaS) for generating value-added products, including tools to design and develop Hadoop-enabled processing chains. The PaaS core engine is the Developer Cloud Sandboxes service, where scalable processing chains are prepared and validated. The PaaS makes use of virtual machine technology and of middleware for scaling out processing tasks via interfaces to commercial cloud providers, or through research agreements to academic resources such as EGI.eu. After integration, processors are deployed and invoked 'as-a-Service' by partners via the OGC Web Processing Service standard interface, or shared as reusable virtualized resources. Recent integration work covered, e.g., the ROI_PAC, GMTSAR and DORIS ADORE toolboxes, along with supporting processing services such as DEM generation. Such an approach has also been discussed with the MARSite project, ensuring the adopted solutions are aligned. As part of the MED-SUV project, we are developing tools and services supporting researchers working on new data fusion methods, and fostering collaboration between different end users and partners, including towards the GEO communities. Overall, the approach provides an integrated European contribution for the exploitation of decades of scientific data gathered from Earth observation satellites.

  1. Pathfinder Sea Surface Temperature Climate Data Record

    NASA Astrophysics Data System (ADS)

    Baker-Yeboah, S.; Saha, K.; Zhang, D.; Casey, K. S.

    2016-02-01

    Global sea surface temperature (SST) fields are important in understanding ocean and climate variability. The NOAA National Centers for Environmental Information (NCEI) develops and maintains a high-resolution, long-term climate data record (CDR) of global satellite SST. These SST values are generated at approximately 4 km resolution using Advanced Very High Resolution Radiometer (AVHRR) instruments aboard NOAA polar-orbiting satellites, going back to 1981. The Pathfinder SST algorithm is based on the Non-Linear SST algorithm using the modernized NASA SeaWiFS Data Analysis System (SeaDAS). Coefficients for this SST product were generated using regression analyses with co-located in situ and satellite measurements. Previous versions of Pathfinder included level 3 collated (L3C) products. Pathfinder Version 5.3 includes level 2 pre-processed (L2P), level 3 uncollated (L3U), and L3C products. Notably, the data were processed in the cloud using Amazon Web Services and are made available through all of the modern web visualization and subset services provided by the THREDDS Data Server, the Live Access Server, and the OPeNDAP Hyrax Server. In this version of Pathfinder SST, anomalous hot spots at land-water boundaries are better identified, and the dataset includes updated land masks and sea ice data over the Antarctic ice shelves. All quality levels of SST values are generated, giving the user greater flexibility and the option to apply their own cloud-masking procedures. Additional improvements include consistent cloud tree tests for NOAA-07 and NOAA-19 with respect to the other sensors, improved SSTs in sun glint areas, and netCDF file format improvements to ensure consistency with the latest Group for High Resolution SST (GHRSST) requirements. This quality-controlled satellite SST field is a reference environmental data record utilized as a primary resource of SST for numerous regional and global marine efforts.
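
    The Non-Linear SST retrieval mentioned above follows the well-known NLSST functional form, in which the 11-12 µm brightness-temperature difference is scaled by a first-guess SST and by a satellite-zenith-angle term. A minimal sketch follows; the coefficients a..d are placeholders for the regression-derived values, which are not given here.

```python
import numpy as np

def nlsst(t11, t12, sst_ref, sat_zen_deg, a, b, c, d):
    """NLSST-style retrieval from 11 and 12 micron brightness temperatures.
    sst_ref is a first-guess SST; a..d are placeholders for the
    regression-derived coefficients (which depend on sensor and matchups)."""
    sec_term = 1.0 / np.cos(np.radians(sat_zen_deg)) - 1.0
    return (a
            + b * t11
            + c * (t11 - t12) * sst_ref
            + d * (t11 - t12) * sec_term)
```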

  2. Lessons Learned while Exploring Cloud-Native Architectures for NASA EOSDIS Applications and Systems

    NASA Astrophysics Data System (ADS)

    Pilone, D.

    2016-12-01

    As new, high-data-rate missions begin collecting data, NASA's Earth Observing System Data and Information System (EOSDIS) archive is projected to grow roughly 20x, to over 300 PB, by 2025. To prepare for the dramatic increase in data and enable broad scientific inquiry into larger time series and datasets, NASA has been exploring the impact of applying cloud technologies throughout EOSDIS. In this talk we will provide an overview of NASA's prototyping and lessons learned in applying cloud architectures to: (1) highly scalable and extensible ingest and archive of EOSDIS data; (2) going "all-in" on cloud-based application architectures, including "serverless" data processing pipelines (a sketch follows this record), and evaluating approaches to vendor lock-in; (3) rethinking data distribution and approaches to analysis in a cloud environment; and (4) incorporating and enforcing security controls while minimizing the barrier for research efforts to deploy to NASA-compliant, operational environments. NASA's Earth Observing System (EOS) is a coordinated series of satellites for long-term global observations. EOSDIS is a multi-petabyte-scale archive of environmental data that supports global climate change research by providing end-to-end services, from EOS instrument data collection to science data processing to full access to EOS and other Earth science data. On a daily basis, EOSDIS ingests, processes, archives, and distributes over 3 terabytes of data from NASA's Earth Science missions, representing over 6000 data products across various science disciplines. EOSDIS has continually evolved to improve the discoverability, accessibility, and usability of high-impact NASA data spanning its multi-petabyte-scale archive.
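
    As a rough illustration of the "serverless" ingest pattern referenced above, the sketch below shows an AWS Lambda-style handler that reacts to a new object in an archive bucket and forwards granule metadata for cataloguing. The event wiring is the standard S3 notification structure, but the queue URL and the overall flow are assumptions, not EOSDIS code.

```python
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")
sqs = boto3.client("sqs")
CATALOG_QUEUE = "https://sqs.example.invalid/catalog-queue"  # placeholder URL

def handler(event, context):
    """Hypothetical Lambda entry point: fired on S3 object creation,
    it inspects the new granule and forwards its metadata for cataloguing."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        head = s3.head_object(Bucket=bucket, Key=key)
        granule = {
            "bucket": bucket,
            "key": key,
            "size_bytes": head["ContentLength"],
            "etag": head.get("ETag", "").strip('"'),
        }
        sqs.send_message(QueueUrl=CATALOG_QUEUE,
                         MessageBody=json.dumps(granule))
```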

  3. MC-GenomeKey: a multicloud system for the detection and annotation of genomic variants.

    PubMed

    Elshazly, Hatem; Souilmi, Yassine; Tonellato, Peter J; Wall, Dennis P; Abouelhoda, Mohamed

    2017-01-20

    Next-generation genome sequencing has become affordable for massive sequencing efforts devoted to the clinical characterization of human diseases. However, the cost of cloud-based analysis of the mounting datasets remains a concerning bottleneck for providing cost-effective clinical services. To address this computational problem, it is important to optimize the variant analysis workflow and the analysis tools used, so as to reduce the overall processing time and, concomitantly, the processing cost. It is also important to capitalize on recent developments in the cloud computing market, which has seen more providers competing in terms of products and prices. In this paper, we present a new package called MC-GenomeKey (Multi-Cloud GenomeKey) that efficiently executes the variant analysis workflow for detecting and annotating mutations using cloud resources from different commercial cloud providers. Our package supports the Amazon, Google, and Azure clouds, as well as any other cloud platform based on OpenStack. It allows different scenarios of execution with different levels of sophistication, up to one in which a workflow is executed using a cluster whose nodes come from different clouds. MC-GenomeKey also supports scenarios that exploit Amazon's spot instance model in combination with other cloud platforms to provide significant cost reduction. To the best of our knowledge, this is the first solution that optimizes the execution of the workflow using computational resources from different cloud providers. MC-GenomeKey provides an efficient multicloud-based solution to detect and annotate mutations. The package can run on different commercial cloud platforms, which enables the user to seize the best offers, and it provides a reliable means of using Amazon's low-cost spot instance model, as it handles the sudden termination of spot machines that results from a sudden price increase. The package has a web interface and is available free for academic use.
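
    One way to picture the handling of sudden spot-instance termination described above is a checkpoint-and-fall-back loop across providers. The provider interface and exception below are hypothetical; the abstract does not describe the package's actual API.

```python
import time

class SpotLostError(Exception):
    """Raised by a provider adapter when its spot/preemptible VMs vanish."""

def run_stage_with_fallback(stage, providers):
    """Run one workflow stage on the cheapest pool first; on sudden spot
    termination, resume from the last checkpoint on the next provider.
    The provider objects and their run() method are hypothetical."""
    checkpoint = None
    for provider in providers:
        try:
            return provider.run(stage, resume_from=checkpoint)
        except SpotLostError as err:
            checkpoint = getattr(err, "checkpoint", checkpoint)
            time.sleep(5)           # brief backoff before re-provisioning
    raise RuntimeError(f"all providers failed for stage {stage!r}")

# e.g. run_stage_with_fallback("variant-calling",
#                              [aws_spot_pool, gce_pool, azure_pool])
```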

  4. Where the Cloud Meets the Commons

    ERIC Educational Resources Information Center

    Ipri, Tom

    2011-01-01

    Changes presented by cloud computing--shared computing services, applications, and storage available to end users via the Internet--have the potential to seriously alter how libraries provide services, not only remotely, but also within the physical library, specifically concerning challenges facing the typical desktop computing experience.…

  5. Models of evaluating efficiency and risks on integration of cloud-base IT-services of the machine-building enterprise: a system approach

    NASA Astrophysics Data System (ADS)

    Razumnikov, S.; Kurmanbay, A.

    2016-04-01

    The present paper suggests a systems approach to evaluating the effectiveness and risks resulting from the integration of cloud-based services in a machine-building enterprise. This approach makes it possible to assess a set of enterprise IT applications and choose which of them to migrate to the cloud with regard to specific business requirements, the technological strategy, and the willingness to take risk.

  6. Building a Cloud Computing and Big Data Infrastructure for Cybersecurity Research and Education

    DTIC Science & Technology

    2015-04-17

    [Snippet: a hardware-inventory table fragment listing Hadoop and cloud production/integration node specifications (Dell R720xd and VRTX M620 nodes, IBM HS22 blades) with per-class CPU, core, memory, and storage subtotals.] Exploring New Opportunities (Cybersecurity

  7. Clouds and the Earth's Radiant Energy System (CERES) algorithm theoretical basis document. volume 4; Determination of surface and atmosphere fluxes and temporally and spatially averaged products (subsystems 5-12); Determination of surface and atmosphere fluxes and temporally and spatially averaged products

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A. (Principal Investigator); Barkstrom, Bruce R. (Principal Investigator); Baum, Bryan A.; Charlock, Thomas P.; Green, Richard N.; Lee, Robert B., III; Minnis, Patrick; Smith, G. Louis; Coakley, J. A.; Randall, David R.

    1995-01-01

    The theoretical bases for the Release 1 algorithms that will be used to process satellite data for investigation of the Clouds and the Earth's Radiant Energy System (CERES) are described. The architecture for software implementation of the methodologies is outlined. Volume 4 details the advanced CERES techniques for computing surface and atmospheric radiative fluxes (using the coincident CERES cloud property and top-of-the-atmosphere (TOA) flux products) and for averaging the cloud properties and TOA, atmospheric, and surface radiative fluxes over various temporal and spatial scales. CERES attempts to match the observed TOA fluxes with radiative transfer calculations that use as input the CERES cloud products and NOAA National Meteorological Center analyses of temperature and humidity. Slight adjustments in the cloud products are made to obtain agreement of the calculated and observed TOA fluxes. The computed products include shortwave and longwave fluxes from the surface to the TOA. The CERES instantaneous products are averaged on a 1.25-deg latitude-longitude grid, then interpolated to produce global, synoptic maps of TOA fluxes and cloud properties by using 3-hourly, normalized radiances from geostationary meteorological satellites. Surface and atmospheric fluxes are computed by using these interpolated quantities. Clear-sky and total fluxes and cloud properties are then averaged over various scales.

  8. Intercomparisons of marine boundary layer cloud properties from the ARM CAP-MBL campaign and two MODIS cloud products

    NASA Astrophysics Data System (ADS)

    Zhang, Zhibo; Dong, Xiquan; Xi, Baike; Song, Hua; Ma, Po-Lun; Ghan, Steven J.; Platnick, Steven; Minnis, Patrick

    2017-02-01

    From April 2009 to December 2010, the Department of Energy Atmospheric Radiation Measurement (ARM) program carried out an observational field campaign on Graciosa Island, targeting the marine boundary layer (MBL) clouds over the Azores region. In this paper, we present an intercomparison of MBL cloud properties, namely cloud liquid water path (LWP), cloud optical thickness (COT), and cloud-droplet effective radius (CER), among retrievals from the ARM mobile facility and two Moderate Resolution Imaging Spectroradiometer (MODIS) cloud products (Goddard Space Flight Center (GSFC)-MODIS and Clouds and the Earth's Radiant Energy System-MODIS). A total of 63 daytime single-layer MBL cloud cases are selected for intercomparison. Comparison of collocated retrievals indicates that the two MODIS cloud products agree well on both COT and CER retrievals, with correlation coefficient R > 0.95, despite their significant difference in spatial sampling. In both MODIS products, the CER retrievals based on the 2.1 µm band (CER2.1) are significantly larger than those based on the 3.7 µm band (CER3.7). The GSFC-MODIS cloud product is collocated and compared with ground-based ARM observations at several temporal-spatial scales. In general, the correlation increases with more precise collocation. For the 63 selected MBL cloud cases, the GSFC-MODIS LWP and COT retrievals agree reasonably well with the ground-based observations, with no apparent bias and correlation coefficients R of around 0.85 and 0.70, respectively. However, the GSFC-MODIS CER3.7 and CER2.1 retrievals have a lower correlation (R ≈ 0.5) with the ground-based retrievals. For the 63 selected cases, they are on average larger than ground observations by about 1.5 µm and 3.0 µm, respectively. Taking into account that MODIS CER retrievals are sensitive only to the cloud top reduces the bias by only 0.5 µm.
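
    Comparisons like these reduce, at their core, to computing correlation and bias over collocated, quality-screened retrieval pairs. A minimal sketch, assuming the retrievals have already been matched in time and space:

```python
import numpy as np

def collocated_stats(ground, satellite):
    """Pearson correlation and mean bias over collocated retrieval pairs,
    ignoring pairs where either side is missing (NaN)."""
    g = np.asarray(ground, dtype=float)
    s = np.asarray(satellite, dtype=float)
    ok = np.isfinite(g) & np.isfinite(s)
    r = np.corrcoef(g[ok], s[ok])[0, 1]
    bias = float(np.mean(s[ok] - g[ok]))
    return r, bias

# e.g. r, bias = collocated_stats(arm_lwp, modis_lwp)
```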

  9. Evaluating the Influence of the Client Behavior in Cloud Computing.

    PubMed

    Souza Pardo, Mário Henrique; Centurion, Adriana Molina; Franco Eustáquio, Paulo Sérgio; Carlucci Santana, Regina Helena; Bruschi, Sarita Mazzini; Santana, Marcos José

    2016-01-01

    This paper proposes a novel approach for the implementation of simulation scenarios, providing a client entity for cloud computing systems. The client entity allows the creation of scenarios in which the client behavior has an influence on the simulation, making the results more realistic. The proposed client entity is based on several characteristics that affect the performance of a cloud computing system, including different modes of submission and their behavior when the waiting time between requests (think time) is considered. The proposed characterization of the client enables the sending of either individual requests or group of Web services to scenarios where the workload takes the form of bursts. The client entity is included in the CloudSim, a framework for modelling and simulation of cloud computing. Experimental results show the influence of the client behavior on the performance of the services executed in a cloud computing system.
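
    CloudSim itself is a Java framework, but the think-time and burst behavior of the proposed client entity can be sketched with any discrete-event simulator. Below is a Python analogue using simpy; the burst probability and parameter values are illustrative assumptions, not those used in the paper.

```python
import random

import simpy   # discrete-event simulator standing in for CloudSim (Java)

def client(env, queue, mean_think, burst_size, burst_prob=0.2):
    """Client entity: submits a single request or a burst of requests,
    then waits an exponentially distributed think time before the next
    submission round."""
    while True:
        n = burst_size if random.random() < burst_prob else 1
        for _ in range(n):
            queue.put(env.now)                 # enqueue a request timestamp
        yield env.timeout(random.expovariate(1.0 / mean_think))

env = simpy.Environment()
requests_queue = simpy.Store(env)
env.process(client(env, requests_queue, mean_think=3.0, burst_size=10))
env.run(until=100)
print(f"{len(requests_queue.items)} requests submitted")
```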

  10. Evaluating the Influence of the Client Behavior in Cloud Computing

    PubMed Central

    Centurion, Adriana Molina; Franco Eustáquio, Paulo Sérgio; Carlucci Santana, Regina Helena; Bruschi, Sarita Mazzini; Santana, Marcos José

    2016-01-01

    This paper proposes a novel approach for the implementation of simulation scenarios, providing a client entity for cloud computing systems. The client entity allows the creation of scenarios in which the client behavior has an influence on the simulation, making the results more realistic. The proposed client entity is based on several characteristics that affect the performance of a cloud computing system, including different modes of submission and their behavior when the waiting time between requests (think time) is considered. The proposed characterization of the client enables the sending of either individual requests or group of Web services to scenarios where the workload takes the form of bursts. The client entity is included in the CloudSim, a framework for modelling and simulation of cloud computing. Experimental results show the influence of the client behavior on the performance of the services executed in a cloud computing system. PMID:27441559

  11. Integrated Model to Assess Cloud Deployment Effectiveness When Developing an IT-strategy

    NASA Astrophysics Data System (ADS)

    Razumnikov, S.; Prankevich, D.

    2016-04-01

    Developing an IT-strategy of cloud deployment is a complex issue since even the stage of its formation necessitates revealing what applications will be the best possible to meet the requirements of a company business-strategy, evaluate reliability and safety of cloud providers and analyze staff satisfaction. A system of criteria, as well an integrated model to assess cloud deployment effectiveness is offered. The model makes it possible to identify what applications being at the disposal of a company, as well as new tools to be deployed are reliable and safe enough for implementation in the cloud environment. The data on practical use of the procedure to assess cloud deployment effectiveness by a provider of telecommunication services is presented. The model was used to calculate values of integral indexes of services to be assessed, then, ones, meeting the criteria and answering the business-strategy of a company, were selected.
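
    The integral indexes mentioned above are, in the simplest reading, weighted sums of per-criterion scores compared against a threshold. A minimal sketch with made-up criteria, weights, and scores (not the paper's actual values):

```python
def integral_index(scores, weights):
    """Weighted-sum integral index for one candidate IT application;
    scores and weights are dicts keyed by criterion, weights summing to 1."""
    return sum(weights[k] * scores[k] for k in weights)

# Illustrative criteria, weights, and scores -- not the paper's values.
weights = {"reliability": 0.3, "security": 0.3, "cost": 0.2, "staff": 0.2}
crm_scores = {"reliability": 0.8, "security": 0.7, "cost": 0.9, "staff": 0.6}

# Migrate the application if its index clears a chosen threshold.
print(integral_index(crm_scores, weights))   # 0.75
```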

  12. Data Protection-Aware Design for Cloud Services

    NASA Astrophysics Data System (ADS)

    Creese, Sadie; Hopkins, Paul; Pearson, Siani; Shen, Yun

    The Cloud is a relatively new concept and so it is unsurprising that the information assurance, data protection, network security and privacy concerns have yet to be fully addressed. This paper seeks to begin the process of designing data protection controls into clouds from the outset so as to avoid the costs associated with bolting on security as an afterthought. Our approach is firstly to consider cloud maturity from an enterprise level perspective, describing a novel capability maturity model. We use this model to explore privacy controls within an enterprise cloud deployment, and explore where there may be opportunities to design in data protection controls as exploitation of the Cloud matures. We demonstrate how we might enable such controls via the use of design patterns. Finally, we consider how Service Level Agreements (SLAs) might be used to ensure that third party suppliers act in support of such controls.

  13. Platform for High-Assurance Cloud Computing

    DTIC Science & Technology

    2016-06-01

    to create today's standard cloud computing applications and services. Additionally, our SuperCloud (a related but distinct project under the same MRC funding) reduces vendor lock-in and permits applications to migrate, to follow...managing key-value storage with strong assurance properties. This first accomplishment allows us to climb the cloud technical stack, by offering

  14. Information Security: Governmentwide Guidance Needed to Assist Agencies in Implementing Cloud Computing

    DTIC Science & Technology

    2010-07-01

    Cloud computing, an emerging form of computing in which users have access to scalable, on-demand capabilities that are provided through Internet...cloud computing, (2) the information security implications of using cloud computing services in the Federal Government, and (3) federal guidance and...efforts to address information security when using cloud computing. The complete report is titled Information Security: Federal Guidance Needed to

  15. Supporting reputation based trust management enhancing security layer for cloud service models

    NASA Astrophysics Data System (ADS)

    Karthiga, R.; Vanitha, M.; Sumaiya Thaseen, I.; Mangaiyarkarasi, R.

    2017-11-01

    In existing systems, the trust between cloud providers and consumers is inadequate for establishing service level agreements, even though consumer feedback is a good basis for assessing the overall reliability of cloud services. Investigators have recognized that trust can be managed, and security provided, based on feedback collected from participants. This work includes a face recognition system that identifies users effectively: an image comparison algorithm captures the user's face at registration time and stores it in a database, and at authentication time the stored original image is compared with a newly captured sample; if the two images match, the user is identified. When confidential data are subcontracted to the cloud, data holders become worried about the confidentiality of their data there. Encrypting the data before outsourcing has been regarded as an important means of keeping user data private from the cloud server, so to keep the data secure we use the AES algorithm. Symmetric-key algorithms use a shared key, so keeping the data secret requires keeping this key secret; only a user holding the key can decrypt the data.
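
    For the encryption step, the abstract names AES with a shared symmetric key. A minimal client-side sketch using the Python cryptography library is shown below; the choice of the authenticated GCM mode is ours, since the text does not specify a mode of operation.

```python
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_for_cloud(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt client-side before subcontracting data to the cloud;
    AES-GCM also authenticates, so tampering is detected on decryption."""
    nonce = os.urandom(12)                        # unique per message
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_from_cloud(blob: bytes, key: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)         # never stored in the cloud
blob = encrypt_for_cloud(b"confidential record", key)
assert decrypt_from_cloud(blob, key) == b"confidential record"
```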

  16. Developing cloud applications using the e-Science Central platform.

    PubMed

    Hiden, Hugo; Woodman, Simon; Watson, Paul; Cala, Jacek

    2013-01-28

    This paper describes the e-Science Central (e-SC) cloud data processing system and its application to a number of e-Science projects. e-SC provides both software as a service (SaaS) and platform as a service for scientific data management, analysis and collaboration. It is a portable system and can be deployed on both private (e.g. Eucalyptus) and public clouds (Amazon AWS and Microsoft Windows Azure). The SaaS application allows scientists to upload data, edit and run workflows and share results in the cloud, using only a Web browser. It is underpinned by a scalable cloud platform consisting of a set of components designed to support the needs of scientists. The platform is exposed to developers so that they can easily upload their own analysis services into the system and make these available to other users. A representational state transfer-based application programming interface (API) is also provided so that external applications can leverage the platform's functionality, making it easier to build scalable, secure cloud-based applications. This paper describes the design of e-SC, its API and its use in three different case studies: spectral data visualization, medical data capture and analysis, and chemical property prediction.
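
    To give a feel for the kind of REST interaction the paper describes - upload data, then invoke a workflow on it - here is a hedged sketch. The host, paths, field names, and bearer-token scheme are assumptions for illustration; they are not the documented e-SC API.

```python
import requests

BASE = "https://esc.example.org/api"    # placeholder host, not the real e-SC

def run_workflow(workflow_id, input_path, token):
    """Upload a data file, then invoke a workflow on it. The endpoints and
    JSON fields used here are illustrative assumptions."""
    auth = {"Authorization": f"Bearer {token}"}
    with open(input_path, "rb") as fh:
        doc = requests.post(f"{BASE}/data", files={"file": fh}, headers=auth)
    doc.raise_for_status()
    run = requests.post(f"{BASE}/workflows/{workflow_id}/invoke",
                        json={"documentId": doc.json()["id"]}, headers=auth)
    run.raise_for_status()
    return run.json()["invocationId"]
```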

  17. Developing cloud applications using the e-Science Central platform

    PubMed Central

    Hiden, Hugo; Woodman, Simon; Watson, Paul; Cala, Jacek

    2013-01-01

    This paper describes the e-Science Central (e-SC) cloud data processing system and its application to a number of e-Science projects. e-SC provides both software as a service (SaaS) and platform as a service for scientific data management, analysis and collaboration. It is a portable system and can be deployed on both private (e.g. Eucalyptus) and public clouds (Amazon AWS and Microsoft Windows Azure). The SaaS application allows scientists to upload data, edit and run workflows and share results in the cloud, using only a Web browser. It is underpinned by a scalable cloud platform consisting of a set of components designed to support the needs of scientists. The platform is exposed to developers so that they can easily upload their own analysis services into the system and make these available to other users. A representational state transfer-based application programming interface (API) is also provided so that external applications can leverage the platform's functionality, making it easier to build scalable, secure cloud-based applications. This paper describes the design of e-SC, its API and its use in three different case studies: spectral data visualization, medical data capture and analysis, and chemical property prediction. PMID:23230161

  18. Virtualized Multi-Mission Operations Center (vMMOC) and its Cloud Services

    NASA Technical Reports Server (NTRS)

    Ido, Haisam Kassim

    2017-01-01

    This presentation will cover the current and future technical and organizational opportunities and challenges of virtualizing a multi-mission operations center. The full deployment of Goddard Space Flight Center's (GSFC) Virtualized Multi-Mission Operations Center (vMMOC) is nearly complete. The Space Science Mission Operations (SSMO) organization's spacecraft ACE, Fermi, LRO, MMS(4), OSIRIS-REx, SDO, SOHO, Swift, and Wind are in the process of being fully migrated to the vMMOC. The benefits of the vMMOC will be the normalization and standardization of IT services, mission operations, maintenance, and development, as well as ancillary services and policies such as collaboration tools, change management systems, and IT security. The vMMOC will also provide operational efficiencies regarding hardware, IT domain expertise, training, maintenance, and support. The presentation will also cover SSMO's secure Situational Awareness Dashboard, delivered in an integrated, fleet-centric, cloud-based web services fashion. Additionally, the SSMO Telemetry as a Service (TaaS) will be covered, which allows authorized users and processes to access telemetry for the entire SSMO fleet, and for the entirety of each spacecraft's history. Both services leverage cloud services in a secure FISMA High and FedRAMP environment, and also leverage distributed object stores to house and provide the telemetry. The services are also in the process of leveraging the elasticity and horizontal scalability of cloud computing. In the design phase is Navigation as a Service (NaaS), which will provide a standardized, efficient, and normalized service for the fleet's space flight dynamics operations. Additional future services that may be considered are Ground Segment as a Service (GSaaS), Telemetry and Command as a Service (TCaaS), Flight Software Simulation as a Service, etc.

  19. State of the Art of Network Security Perspectives in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Oh, Tae Hwan; Lim, Shinyoung; Choi, Young B.; Park, Kwang-Roh; Lee, Heejo; Choi, Hyunsang

    Cloud computing is now regarded as a social phenomenon that satisfies customers' needs. Arguably, cloud computing realizes both customers' needs and the primary principle of economy - gaining maximum benefit from minimum investment. We live in a connected society with a flood of information, and without computers connected to the Internet, our activities and work of daily living would be impossible. Cloud computing can provide customers with custom-tailored application software and user environments based on the customer's needs, by adopting on-demand outsourcing of computing resources through the Internet. It also provides cloud computing users with high-end computing power and expensive application software packages; accordingly, users access their data and application software, wherever they are located, on the remote system. As cloud computing systems are connected to the Internet, the network security issues of cloud computing must be considered before real-world service. In this paper, a survey of, and issues on, network security in cloud computing are discussed from the perspective of real-world service environments.

  20. Acceptance of Cloud Services in Face-to-Face Computer-Supported Collaborative Learning: A Comparison between Single-User Mode and Multi-User Mode

    ERIC Educational Resources Information Center

    Wang, Chia-Sui; Huang, Yong-Ming

    2016-01-01

    Face-to-face computer-supported collaborative learning (CSCL) was used extensively to facilitate learning in classrooms. Cloud services not only allow a single user to edit a document, but they also enable multiple users to simultaneously edit a shared document. However, few researchers have compared student acceptance of such services in…

  1. Cyber Capability Development Centre (CCDC) Private Cloud Design

    DTIC Science & Technology

    2014-11-01

    8.4 Shared Services Canada (SSC) Controlled Firewall... 9 Cloud...opposed to east-west traffic (VM to VM). With north-south traffic, Shared Services Canada will want to ensure that the lab environment is contained. One-way traffic flow into the lab should be acceptable; Shared Services Canada will need to ensure that traffic doesn't flow north, or out of the CCDC

  2. The Adoption of Cloud Computing in the Field of Genomics Research: The Influence of Ethical and Legal Issues

    PubMed Central

    Charlebois, Kathleen; Palmour, Nicole; Knoppers, Bartha Maria

    2016-01-01

    This study aims to understand the influence of the ethical and legal issues on cloud computing adoption in the field of genomics research. To do so, we adapted Diffusion of Innovation (DoI) theory to enable understanding of how key stakeholders manage the various ethical and legal issues they encounter when adopting cloud computing. Twenty semi-structured interviews were conducted with genomics researchers, patient advocates and cloud service providers. Thematic analysis generated five major themes: 1) Getting comfortable with cloud computing; 2) Weighing the advantages and the risks of cloud computing; 3) Reconciling cloud computing with data privacy; 4) Maintaining trust and 5) Anticipating the cloud by creating the conditions for cloud adoption. Our analysis highlights the tendency among genomics researchers to gradually adopt cloud technology. Efforts made by cloud service providers to promote cloud computing adoption are confronted by researchers’ perpetual cost and security concerns, along with a lack of familiarity with the technology. Further underlying those fears are researchers’ legal responsibility with respect to the data that is stored on the cloud. Alternative consent mechanisms aimed at increasing patients’ control over the use of their data also provide a means to circumvent various institutional and jurisdictional hurdles that restrict access by creating siloed databases. However, the risk of creating new, cloud-based silos may run counter to the goal in genomics research to increase data sharing on a global scale. PMID:27755563

  3. The Adoption of Cloud Computing in the Field of Genomics Research: The Influence of Ethical and Legal Issues.

    PubMed

    Charlebois, Kathleen; Palmour, Nicole; Knoppers, Bartha Maria

    2016-01-01

    This study aims to understand the influence of the ethical and legal issues on cloud computing adoption in the field of genomics research. To do so, we adapted Diffusion of Innovation (DoI) theory to enable understanding of how key stakeholders manage the various ethical and legal issues they encounter when adopting cloud computing. Twenty semi-structured interviews were conducted with genomics researchers, patient advocates and cloud service providers. Thematic analysis generated five major themes: 1) Getting comfortable with cloud computing; 2) Weighing the advantages and the risks of cloud computing; 3) Reconciling cloud computing with data privacy; 4) Maintaining trust and 5) Anticipating the cloud by creating the conditions for cloud adoption. Our analysis highlights the tendency among genomics researchers to gradually adopt cloud technology. Efforts made by cloud service providers to promote cloud computing adoption are confronted by researchers' perpetual cost and security concerns, along with a lack of familiarity with the technology. Further underlying those fears are researchers' legal responsibility with respect to the data that is stored on the cloud. Alternative consent mechanisms aimed at increasing patients' control over the use of their data also provide a means to circumvent various institutional and jurisdictional hurdles that restrict access by creating siloed databases. However, the risk of creating new, cloud-based silos may run counter to the goal in genomics research to increase data sharing on a global scale.

  4. A Cloud Mask for AIRS

    NASA Technical Reports Server (NTRS)

    Brubaker, N.; Jedlovec, G. J.

    2004-01-01

    With the preliminary release of AIRS Level 1 and 2 data to the scientific community, there is a growing need for an accurate AIRS cloud mask for data assimilation studies and for producing products derived from cloud-free radiances. The current cloud information provided with the AIRS data is limited or based on simplified threshold tests. A multispectral cloud detection approach has been developed for AIRS that utilizes its hyperspectral capabilities to detect clouds based on specific cloud signatures across the shortwave and longwave infrared window regions. This new AIRS cloud mask has been validated against the existing AIRS Level 2 cloud product and cloud information derived from MODIS. Preliminary results for both day and night applications over the continental U.S. are encouraging. Details of the cloud detection approach and validation results will be presented at the conference.
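
    A toy version of such multispectral threshold testing is sketched below: a cold-window test against an expected clear-sky brightness temperature, plus a shortwave/longwave window difference test. The thresholds are illustrative only, not the actual AIRS values.

```python
import numpy as np

def simple_ir_cloud_mask(bt_sw, bt_lw, bt_lw_clear):
    """Toy two-test infrared mask: a cold-window test against the expected
    clear-sky longwave-window brightness temperature, plus a shortwave/
    longwave window difference test. Thresholds (10 K, 2 K) are invented."""
    cold_test = bt_lw < (bt_lw_clear - 10.0)
    diff_test = np.abs(bt_sw - bt_lw) > 2.0
    return cold_test | diff_test          # True where cloud is suspected
```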

  5. Extending MODIS Cloud Top and Infrared Phase Climate Records with VIIRS and CrIS

    NASA Astrophysics Data System (ADS)

    Heidinger, A. K.; Platnick, S. E.; Ackerman, S. A.; Holz, R.; Meyer, K.; Frey, R.; Wind, G.; Li, Y.; Botambekov, D.

    2015-12-01

    The MODIS imagers on the NASA EOS Terra and Aqua satellites have generated accurate and well-used cloud climate data records for 15 years. Both missions are expected to continue until the end of this decade and perhaps beyond. The Visible Infrared Imaging Radiometer Suite (VIIRS) imagers on the Suomi-NPP (SNPP) mission (launched in October 2011) and future NOAA Joint Polar Satellite System (JPSS) platforms are the successors for imager-based cloud climate records from polar-orbiting satellites after MODIS. To ensure product continuity across a broad suite of EOS products, NASA has funded a SNPP science team to develop EOS-like algorithms that can be used with SNPP and JPSS observations, including two teams working on cloud products. Cloud data record continuity between MODIS and VIIRS is particularly challenging due to the lack of VIIRS CO2-slicing channels, which reduces the information content for cloud detection and cloud-top property products, as well as downstream cloud optical products that rely on both. Here we report on our approach to providing continuity specifically for the MODIS/VIIRS cloud-top and infrared-derived thermodynamic phase products by combining elements of the NASA MODIS science team (MOD) and NOAA Algorithm Working Group (AWG) algorithms. The combined approach is referred to as the MODAWG processing package. In collaboration with the NASA Atmospheric SIPS located at the University of Wisconsin Space Science and Engineering Center, the MODAWG code has been exercised on one year of SNPP VIIRS data. In addition to cloud-top and phase, MODAWG provides a full suite of cloud products that are physically consistent with MODIS and have a similar data format. Further, the SIPS has developed tools to allow use of Cross-track Infrared Sounder (CrIS) observations in the MODAWG processing that can ameliorate the loss of the CO2 absorption channels on VIIRS. Examples will be given that demonstrate the positive impact that the CrIS data can provide when combined with VIIRS for cloud-height and IR-phase retrievals.

  6. Space Situational Awareness Data Processing Scalability Utilizing Google Cloud Services

    NASA Astrophysics Data System (ADS)

    Greenly, D.; Duncan, M.; Wysack, J.; Flores, F.

    Space Situational Awareness (SSA) is a fundamental and critical component of current space operations. The term SSA encompasses the awareness, understanding, and predictability of all objects in space. As the population of orbital space objects and debris increases, the number of collision avoidance maneuvers grows and prompts the need for accurate and timely process measures. The SSA mission continually evolves toward near-real-time assessment and analysis, demanding higher processing capabilities. By conventional methods, meeting these demands requires the integration of new hardware to keep pace with the growing complexity of maneuver planning algorithms. SpaceNav has implemented a highly scalable architecture that will track satellites and debris by utilizing powerful virtual machines on the Google Cloud Platform. SpaceNav algorithms for processing CDMs outpace conventional means. A robust processing environment for tracking data, collision avoidance maneuvers, and various other aspects of SSA can be created and deleted on demand. The migration of SpaceNav tools and algorithms onto the Google Cloud Platform will be discussed, along with the trials and tribulations involved. Information will be shared on how and why certain cloud products were used, as well as the integration techniques that were implemented. Key items to be presented are:
    1. Scientific algorithms and SpaceNav tools integrated into a scalable architecture: (a) maneuver planning; (b) parallel processing; (c) Monte Carlo simulations (sketched after this record); (d) optimization algorithms; (e) software application development/integration on the Google Cloud Platform.
    2. Compute Engine processing: (a) Application Engine automated processing; (b) performance testing and performance scalability; (c) Cloud MySQL databases and database scalability; (d) cloud data storage; (e) redundancy and availability.
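
    As a minimal illustration of the Monte Carlo item referenced in the list above, the sketch below estimates a collision probability by sampling the relative-position error in the encounter plane; the geometry, covariance, and hard-body radius are invented values and the method is a textbook approach, not SpaceNav's algorithm.

```python
import numpy as np

rng = np.random.default_rng(42)

def collision_probability(miss_km, cov_km2, hard_body_km, n=200_000):
    """Monte Carlo probability of collision in the 2-D encounter plane:
    sample the relative-position error from the combined covariance and
    count the samples falling inside the hard-body radius."""
    samples = rng.multivariate_normal(miss_km, cov_km2, size=n)
    hits = np.hypot(samples[:, 0], samples[:, 1]) < hard_body_km
    return hits.mean()

# Invented encounter: 1 km miss, 0.5 km^2 isotropic uncertainty, 20 m body.
pc = collision_probability([1.0, 0.0], [[0.5, 0.0], [0.0, 0.5]], 0.02)
print(f"Pc ~ {pc:.2e}")
```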

  7. Satellite remote sensing of dust aerosol indirect effects on ice cloud formation.

    PubMed

    Ou, Steve Szu-Cheng; Liou, Kuo-Nan; Wang, Xingjuan; Hansell, Richard; Lefevre, Randy; Cocks, Stephen

    2009-01-20

    We undertook a new approach to investigate the aerosol indirect effect of the first kind on ice cloud formation by using available data products from the Moderate Resolution Imaging Spectroradiometer (MODIS), and obtained physical understanding of the interaction between aerosols and ice clouds. Our analysis focused on the examination of the variability in the correlation between ice cloud parameters (optical depth, effective particle size, cloud water path, and cloud particle number concentration) and aerosol optical depth and number concentration that were inferred from available satellite cloud and aerosol data products. Correlation results for a number of selected scenes containing dust and ice clouds are presented, and dust aerosol indirect effects on ice clouds are directly demonstrated from satellite observations.

  8. NCI's Distributed Geospatial Data Server

    NASA Astrophysics Data System (ADS)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach to batch data processing and storing all the output for later analysis rapidly becomes unfeasible, and often requires additional work to publish for others to use. Recent developments on distributed computing using interactive access to significant cloud infrastructure opens the door for new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. This system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems such as handling different file formats and data types, or harmonising the coordinate projections or temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under different conventions. We will show some cases where we have used this new capability to provide a significant improvement over previous approaches.
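
    The deferred, on-demand evaluation model described above can be illustrated with a chunked-array library such as dask: the user composes transformations lazily, and computation is distributed across workers only when a result is requested. The array shape and operations below are illustrative stand-ins, not NCI's implementation.

```python
import dask.array as da

# Stand-in for a large gridded product: a lazy, chunked array. A real
# deployment would open stored data; this synthetic array just
# illustrates deferred, distributed evaluation.
field = da.random.random((365, 180, 360), chunks=(30, 90, 90))  # day, lat, lon

# A "user-defined function": anomaly against the long-term mean, then a
# regional average -- still lazy, nothing has been computed yet.
anomaly = field - field.mean(axis=0)
regional = anomaly[-30:, 60:90, 120:160].mean()

print(regional.compute())   # the scheduler distributes work only now
```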

  9. Photochemical ozone production in tropical squall line convection during NASA Global Tropospheric Experiment/Amazon Boundary Layer Experiment 2A

    NASA Technical Reports Server (NTRS)

    Pickering, Kenneth E.; Thompson, Anne M.; Tao, Wei-Kuo; Simpson, Joanne; Scala, John R.

    1991-01-01

    The role of convection was examined in trace gas transport and ozone production in a tropical dry season squall line sampled on August 3, 1985, during NASA Global Tropospheric Experiment/Amazon Boundary Layer Experiment 2A (NASA GTE/ABLE 2A) in Amazonia, Brazil. Two types of analyses were performed. Transient effects within the cloud are examined with a combination of two-dimensional cloud and one-dimensional photochemical modeling. Tracer analyses using the cloud model wind fields yield a series of cross sections of NO(x), CO, and O3 distribution during the lifetime of the cloud; these fields are used in the photochemical model to compute the net rate of O3 production. At noon, when the cloud was mature, the instantaneous ozone production potential in the cloud is between 50 and 60 percent less than in no-cloud conditions due to reduced photolysis and cloud scavenging of radicals. Analysis of cloud inflows and outflows is used to differentiate between air that is undisturbed and air that has been modified by the storm. These profiles are used in the photochemical model to examine the aftereffects of convective redistribution in the 24-hour period following the storm. Total tropospheric column O3 production changed little due to convection because so little NO(x) was available in the lower troposphere. However, the integrated O3 production potential in the 5- to 13-km layer changed from net destruction to net production as a result of the convection. The conditions of the August 3, 1985, event may be typical of the early part of the dry season in Amazonia, when only minimal amounts of pollution from biomass burning have been transported into the region.

  10. CEMS: Building a Cloud-Based Infrastructure to Support Climate and Environmental Data Services

    NASA Astrophysics Data System (ADS)

    Kershaw, P. J.; Curtis, M.; Pechorro, E.

    2012-04-01

    CEMS, the facility for Climate and Environmental Monitoring from Space, is a new joint collaboration between academia and industry to bring together their collective expertise to support research into climate change and provide a catalyst for growth in related Earth Observation (EO) technologies and services in the commercial sector. A recent major investment by the UK Space Agency has made possible the development of a dedicated facility at ISIC, the International Space Innovation Centre at Harwell in the UK. CEMS has a number of key elements: the provision of access to large-volume EO and climate datasets co-located with high performance computing facilities, and a flexible infrastructure to support the needs of research projects in the academic community and new business opportunities for commercial companies. Expertise and tools for scientific data quality and integrity are another essential component, giving users confidence and transparency in its data, services and products. Central to the development of this infrastructure is the utilisation of cloud-based technology: multi-tenancy and the dynamic provision of resources are key characteristics to exploit in order to support the range of organisations using the facilities and the varied use cases. The hosting of processing services and applications next to the data within the CEMS facility is another important capability. With the expected exponential increase in data volumes within the climate science and EO domains, it is becoming increasingly impracticable for organisations to retrieve this data over networks and provide the necessary storage. Consider, for example, the roughly 20-fold increase in data volumes expected for the ESA Sentinel missions over the equivalent Envisat instruments. We explore the options for the provision of a hybrid community/private cloud, looking at offerings from the commercial sector and developments in the Open Source community. Building on this virtualisation layer, a further core services tier will support and serve applications as part of a service-oriented architecture. We consider the constituent services in this layer to support access to the data, data processing and the orchestration of workflows.

  11. A comparison of Aqua MODIS ice and liquid water cloud physical and optical properties between collection 6 and collection 5.1: Pixel-to-pixel comparisons

    NASA Astrophysics Data System (ADS)

    Yi, Bingqi; Rapp, Anita D.; Yang, Ping; Baum, Bryan A.; King, Michael D.

    2017-04-01

    We compare differences in ice and liquid water cloud physical and optical properties between the Aqua Moderate Resolution Imaging Spectroradiometer (MODIS) collection 6 (C6) and collection 5.1 (C51) products. The C6 cloud products changed significantly due to improved calibration, improvements based on comparisons with the Cloud-Aerosol Lidar with Orthogonal Polarization, treatment of subpixel liquid water clouds, the introduction of a roughened ice habit in C6 rather than the smooth ice particles used in C51, and more. The MODIS cloud products form a long-term data set for analysis, modeling, and other purposes, so it is important to understand the impact of the changes. Two cases are considered for C6-to-C51 comparisons: case 1 considers pixels with valid cloud retrievals in both C6 and C51, while case 2 compares all valid cloud retrievals in each collection. One year (2012) of level-2 MODIS cloud products is examined, including cloud effective radius (CER), optical thickness (COT), water path, cloud top pressure (CTP), cloud top temperature, and cloud fraction. Large C6-C51 differences are found in the ice CER (regionally, as large as 15 μm) and COT (a decrease in the annual average of approximately 25%). Liquid water clouds have higher CTP in marine stratocumulus regions in C6 but lower CTP globally (-5 hPa), and there are 66% more valid pixels in C6 (case 2) due to the treatment of pixels with subpixel clouds. Simulated total cloud radiative signatures from C51 and C6 are compared to the Clouds and the Earth's Radiant Energy System Energy Balanced And Filled (EBAF) product. The C6 cloud radiative effects (CREs) compare more closely with EBAF than their C51 counterparts.

  12. A cloud-ozone data product from Aura OMI and MLS satellite measurements

    NASA Astrophysics Data System (ADS)

    Ziemke, Jerald R.; Strode, Sarah A.; Douglass, Anne R.; Joiner, Joanna; Vasilkov, Alexander; Oman, Luke D.; Liu, Junhua; Strahan, Susan E.; Bhartia, Pawan K.; Haffner, David P.

    2017-11-01

    Ozone within deep convective clouds is controlled by several factors involving photochemical reactions and transport. Gas-phase photochemical reactions and heterogeneous surface chemical reactions involving ice, water particles, and aerosols inside the clouds all contribute to the distribution and net production and loss of ozone. Ozone in clouds is also dependent on convective transport that carries low-troposphere/boundary-layer ozone and ozone precursors upward into the clouds. Characterizing ozone in thick clouds is an important step for quantifying relationships of ozone with tropospheric H2O, OH production, and cloud microphysics/transport properties. Although measuring ozone in deep convective clouds from either aircraft or balloon ozonesondes is largely impossible due to extreme meteorological conditions associated with these clouds, it is possible to estimate ozone in thick clouds using backscattered solar UV radiation measured by satellite instruments. Our study combines Aura Ozone Monitoring Instrument (OMI) and Microwave Limb Sounder (MLS) satellite measurements to generate a new research product of monthly-mean ozone concentrations in deep convective clouds between 30° S and 30° N for October 2004-April 2016. These measurements represent mean ozone concentration primarily in the upper levels of thick clouds and reveal key features of cloud ozone including: persistent low ozone concentrations in the tropical Pacific of ˜ 10 ppbv or less; concentrations of up to 60 ppbv or greater over landmass regions of South America, southern Africa, Australia, and India/east Asia; connections with tropical ENSO events; and intraseasonal/Madden-Julian oscillation variability. Analysis of OMI aerosol measurements suggests a cause and effect relation between boundary-layer pollution and elevated ozone inside thick clouds over landmass regions including southern Africa and India/east Asia.

  13. A Cloud-Ozone Data Product from Aura OMI and MLS Satellite Measurements.

    PubMed

    Ziemke, Jerald R; Strode, Sarah A; Douglass, Anne R; Joiner, Joanna; Vasilkov, Alexander; Oman, Luke D; Liu, Junhua; Strahan, Susan E; Bhartia, Pawan K; Haffner, David P

    2017-01-01

    Ozone within deep convective clouds is controlled by several factors involving photochemical reactions and transport. Gas-phase photochemical reactions and heterogeneous surface chemical reactions involving ice, water particles, and aerosols inside the clouds all contribute to the distribution and net production and loss of ozone. Ozone in clouds is also dependent on convective transport that carries low troposphere/boundary layer ozone and ozone precursors upward into the clouds. Characterizing ozone in thick clouds is an important step for quantifying relationships of ozone with tropospheric H2O, OH production, and cloud microphysics/transport properties. Although measuring ozone in deep convective clouds from either aircraft or balloon ozonesondes is largely impossible due to extreme meteorological conditions associated with these clouds, it is possible to estimate ozone in thick clouds using backscattered solar UV radiation measured by satellite instruments. Our study combines Aura Ozone Monitoring Instrument (OMI) and Microwave Limb Sounder (MLS) satellite measurements to generate a new research product of monthly-mean ozone concentrations in deep convective clouds from 30°S to 30°N for October 2004 - April 2016. These measurements represent mean ozone concentration primarily in the upper levels of thick clouds and reveal key features of cloud ozone including: persistent low ozone concentrations in the tropical Pacific of ~10 ppbv or less; concentrations of up to 60 ppbv or greater over landmass regions of South America, southern Africa, Australia, and India/east Asia; connections with tropical ENSO events; and intra-seasonal/Madden-Julian Oscillation variability. Analysis of OMI aerosol measurements suggests a cause and effect relation between boundary layer pollution and elevated ozone inside thick clouds over landmass regions including southern Africa and India/east Asia.

  14. Cloud computing for comparative genomics with windows azure platform.

    PubMed

    Kim, Insik; Jung, Jae-Yoon; Deluca, Todd F; Nelson, Tristan H; Wall, Dennis P

    2012-01-01

    Cloud computing services have emerged as a cost-effective alternative to cluster systems as the number of genomes, and the computation power required to analyze them, have increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services.

  15. Cloud Computing for Comparative Genomics with Windows Azure Platform

    PubMed Central

    Kim, Insik; Jung, Jae-Yoon; DeLuca, Todd F.; Nelson, Tristan H.; Wall, Dennis P.

    2012-01-01

    Cloud computing services have emerged as a cost-effective alternative to cluster systems as the number of genomes, and the computation power required to analyze them, have increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services. PMID:23032609

  16. Retrieval of radiative and microphysical properties of clouds from multispectral infrared measurements

    NASA Astrophysics Data System (ADS)

    Iwabuchi, Hironobu; Saito, Masanori; Tokoro, Yuka; Putri, Nurfiena Sagita; Sekiguchi, Miho

    2016-12-01

    Satellite remote sensing of the macroscopic, microphysical, and optical properties of clouds is useful for studying spatial and temporal variations of clouds at various scales and for constraining cloud physical processes in climate and weather prediction models. Instead of using separate independent algorithms for different cloud properties, a unified, optimal estimation-based cloud retrieval algorithm is developed and applied to Moderate Resolution Imaging Spectroradiometer (MODIS) observations using ten thermal infrared bands. The model considers sensor configurations, the background surface and atmospheric profile, microphysical and optical models of ice and liquid cloud particles, and radiative transfer in a plane-parallel, multilayered atmosphere. Measurement and model errors are thoroughly quantified from direct comparisons of clear-sky observations over the ocean with model calculations. Performance tests by retrieval simulations show that ice cloud properties are retrieved with high accuracy when cloud optical thickness (COT) is between 0.1 and 10. Cloud-top pressure is inferred with uncertainty lower than 10% when COT is larger than 0.3. Applying the method to a tropical cloud system and comparing the results with the MODIS Collection 6 cloud product shows good agreement for ice cloud optical thickness when COT is less than about 5. Cloud-top height agrees well with estimates obtained by the CO2-slicing method used in the MODIS product. The present algorithm detects optically thin parts at the edges of high clouds well in comparison with the MODIS product, in which these parts are recognized as low clouds by the infrared window method. The cloud thermodynamic phase in the present algorithm is constrained by cloud-top temperature, which tends not to produce an ice cloud that is too warm or a liquid cloud that is too cold.
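
    The optimal estimation machinery underlying such a unified retrieval is the standard Gauss-Newton update (Rodgers form). A minimal sketch, assuming a forward model F and its Jacobian K are available; the paper's actual state vector, priors, and error covariances are not reproduced here.

```python
import numpy as np

def oe_step(x, y, forward, jacobian, x_a, S_a_inv, S_e_inv):
    """One Gauss-Newton step of optimal estimation (Rodgers form):
    x' = x_a + (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 (y - F(x) + K (x - x_a)),
    where forward(x) is the radiative-transfer model and jacobian(x) = K,
    x_a the prior state, S_a/S_e the prior and measurement covariances."""
    K = jacobian(x)
    A = K.T @ S_e_inv @ K + S_a_inv
    b = K.T @ S_e_inv @ (y - forward(x) + K @ (x - x_a))
    return x_a + np.linalg.solve(A, b)
```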

  17. A Geospatial Information Grid Framework for Geological Survey.

    PubMed

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Informatization in geological surveys should therefore not stop at the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information remains an open problem in the geological domain. Applications and services use geological spatial data with many demanding features: the data are cross-region and cross-domain and require real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) have difficulty meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic is an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. The geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of distributed resources; and provides a distributed spatial metadata service, a spatial information catalog service, a multi-mode geological data service and a spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and some geological data services and geological processing services were developed on top of it. Furthermore, an iron-mine resource forecast and evaluation service is introduced in this paper.

  18. A Geospatial Information Grid Framework for Geological Survey

    PubMed Central

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Informatization in geological surveys should therefore not stop at the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information remains an open problem in the geological domain. Applications and services use geological spatial data with many demanding features: the data are cross-region and cross-domain and require real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) have difficulty meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic is an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. The geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of distributed resources; and provides a distributed spatial metadata service, a spatial information catalog service, a multi-mode geological data service and a spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and some geological data services and geological processing services were developed on top of it. Furthermore, an iron-mine resource forecast and evaluation service is introduced in this paper. PMID:26710255

  19. Intercomparisons of Marine Boundary Layer Cloud Properties from the ARM CAP-MBL Campaign and Two MODIS Cloud Products

    NASA Technical Reports Server (NTRS)

    Zhang, Zhibo; Dong, Xiquan; Xi, Baike; Song, Hua; Ma, Po-Lun; Ghan, Steven J.; Platnick, Steven; Minnis, Patrick

    2017-01-01

    From April 2009 to December 2010, the Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) program carried out an observational field campaign on Graciosa Island, targeting the marine boundary layer (MBL) clouds over the Azores region. In this paper, we present an inter-comparison of MBL cloud properties, namely cloud liquid water path (LWP), cloud optical thickness (COT) and cloud-droplet effective radius (CER), among retrievals from the ARM mobile facility (AMF) and two Moderate Resolution Imaging Spectroradiometer (MODIS) cloud products (GSFC-MODIS and CERES-MODIS). A total of 63 daytime single-layer MBL cloud cases are selected for inter-comparison. Comparison of collocated retrievals indicates that the two MODIS cloud products agree well on both COT and CER retrievals, with a correlation coefficient R greater than 0.95, despite their significant difference in spatial sampling. In both MODIS products, the CER retrieval based on the 2.1 micrometer band (CER_2.1) is significantly smaller than that based on the 3.7 micrometer band (CER_3.7). The GSFC-MODIS cloud product is collocated and compared with ground-based ARM observations at several temporal and spatial scales. In general, the correlation increases with more precise collocation. For the 63 selected MBL cloud cases, the GSFC-MODIS LWP and COT retrievals agree reasonably well with the ground-based observations, with no apparent bias and correlation coefficients R of around 0.85 and 0.70, respectively. However, the GSFC-MODIS CER_3.7 and CER_2.1 retrievals have a lower correlation (R approximately 0.5) with the ground-based retrievals. For the 63 selected cases, they are on average larger than ground observations by about 1.5 micrometers and 3.0 micrometers, respectively. Taking into account that the MODIS CER retrievals are only sensitive to cloud top reduces the bias by only 0.5 micrometers.

  20. Towards Consistent Aerosol and Cloud Climate Data Records from the Along Track Scanning Radiometer instruments

    NASA Astrophysics Data System (ADS)

    Thomas, G.

    2015-12-01

    The ESA Climate Change Initiative (CCI) programme has provided a mechanism for the production of new long-term data records of essential climate variables (ECVs) defined by the WMO Global Climate Observing System (GCOS). These include consistent cloud (from the MODIS, AVHRR, ATSR-2 and AATSR instruments) and aerosol (from ATSR-2 and AATSR) products produced using the Optimal Retrieval of Aerosol and Cloud (ORAC) scheme. This talk will present an overview of the newly produced ORAC cloud and aerosol datasets, their evaluation, and a joint aerosol-cloud product produced for the 1995-2012 ATSR-2/AATSR data record.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Hyunwoo; Timm, Steven

    We present a summary of how X.509 authentication and authorization are used with OpenNebula in FermiCloud. We also describe a history of why the X.509 authentication was needed in FermiCloud, and review X.509 authorization options, both internal and external to OpenNebula. We show how these options can be and have been used to successfully run scientific workflows on federated clouds, which include OpenNebula on FermiCloud and Amazon Web Services as well as other community clouds. We also outline federation options being used by other commercial and open-source clouds and cloud research projects.

  2. SnowCloud - a Framework to Predict Streamflow in Snowmelt-dominated Watersheds Using Cloud-based Computing

    NASA Astrophysics Data System (ADS)

    Sproles, E. A.; Crumley, R. L.; Nolin, A. W.; Mar, E.; Lopez-Moreno, J. J.

    2017-12-01

    Streamflow in snowy mountain regions is extraordinarily challenging to forecast, and prediction efforts are hampered by the lack of timely snow data, particularly in data-sparse regions. SnowCloud is a prototype web-based framework that integrates remote sensing, cloud computing, interactive mapping tools, and a hydrologic model to offer a new paradigm for delivering key data to water resource managers. We tested the skill of SnowCloud to forecast monthly streamflow with one month lead time in three snow-dominated headwaters. These watersheds represent a range of precipitation/runoff schemes: the Río Elqui in northern Chile (200 mm/yr, entirely snowmelt); the John Day River, Oregon, USA (635 mm/yr, primarily snowmelt); and the Río Aragon in northern Spain (850 mm/yr, snowmelt dominated). Model skill corresponded to snowpack contribution, with Nash-Sutcliffe Efficiencies of 0.86, 0.52, and 0.21, respectively. SnowCloud does not require the user to possess advanced programming skills or proprietary software. We access NASA's MOD10A1 snow cover product to calculate the snow metrics globally using Google Earth Engine's geospatial analysis and cloud computing service. The analytics and forecast tools are provided through a web-based portal that requires only internet access and minimal training. To test the efficacy of SnowCloud we provided the tools and a series of tutorials in English and Spanish to water resource managers in Chile, Spain, and the United States. Participants assessed their user experience and provided feedback, and the results of our multi-cultural assessment are also presented. While our results focus on SnowCloud, they outline methods for developing cloud-based tools that function effectively across cultures and languages. Our approach also addresses the primary challenges of science-based computing: human resource limitations, infrastructure costs, and expensive proprietary software. These challenges are particularly problematic in developing countries.
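
    The skill scores quoted above are Nash-Sutcliffe Efficiencies. For reference, a minimal NumPy sketch of the standard NSE definition (not SnowCloud's own code):

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe Efficiency: 1 - SSE / variance of the observations.
    1.0 is a perfect forecast; 0.0 is no better than the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) \
               / np.sum((observed - observed.mean()) ** 2)

# Toy example: a forecast tracking the observations closely scores near 1.
obs = np.array([10.0, 14.0, 22.0, 18.0, 12.0])
sim = np.array([11.0, 13.5, 21.0, 19.0, 12.5])
print(nash_sutcliffe(obs, sim))
```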

  3. The OCareCloudS project: Toward organizing care through trusted cloud services.

    PubMed

    De Backere, Femke; Ongenae, Femke; Vannieuwenborg, Frederic; Van Ooteghem, Jan; Duysburgh, Pieter; Jansen, Arne; Hoebeke, Jeroen; Wuyts, Kim; Rossey, Jen; Van den Abeele, Floris; Willems, Karen; Decancq, Jasmien; Annema, Jan Henk; Sulmon, Nicky; Van Landuyt, Dimitri; Verstichel, Stijn; Crombez, Pieter; Ackaert, Ann; De Grooff, Dirk; Jacobs, An; De Turck, Filip

    2016-01-01

    The increasing elderly population and the shift from acute to chronic illness make it difficult to care for people in hospitals and rest homes. Moreover, elderly people, if given a choice, want to stay at home as long as possible. In this article, the methodologies used to develop a cloud-based semantic system, offering valuable information and knowledge-based services, are presented. The information and services are related to the different personal living hemispheres of the patient, namely daily care-related needs, social needs and daily life assistance. Ontologies are used to facilitate the integration, analysis, aggregation and efficient use of all the available data in the cloud. By using an interdisciplinary research approach, where user researchers, (ontology) engineers, researchers and domain stakeholders are at the forefront, a platform can be developed that is of great added value for patients who want to grow old in their own homes and for their caregivers.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garzoglio, Gabriele

    The Fermilab Grid and Cloud Computing Department and the KISTI Global Science experimental Data hub Center are working on a multi-year Collaborative Research and Development Agreement. Building on the knowledge developed in the first year on how to provision and manage a federation of virtual machines through cloud management systems, in this second year we expanded the work on provisioning and federation, increasing both the scale and the diversity of solutions, and we started to build on-demand services on the established fabric, introducing the paradigm of Platform as a Service to assist with the execution of scientific workflows. We have enabled scientific workflows of stakeholders to run on multiple cloud resources at the scale of 1,000 concurrent machines. The demonstrations have been in the areas of (a) Virtual Infrastructure Automation and Provisioning, (b) Interoperability and Federation of Cloud Resources, and (c) On-demand Services for Scientific Workflows.

  5. Game Theory Based Trust Model for Cloud Environment

    PubMed Central

    Gokulnath, K.; Uthariaraj, Rhymend

    2015-01-01

    The aim of this work is to propose a method to establish trust at the bootload level in a cloud computing environment. This work proposes a game-theoretic approach for achieving trust at the bootload level from the perspective of both resources and users. Nash equilibrium (NE) enhances the trust evaluation of first-time users and providers. It also restrains service providers and users from violating the service level agreement (SLA). Significantly, the cold start and whitewashing problems are addressed by the proposed method. In addition, appropriate mapping of a cloud user's application to a cloud service provider according to trust level is achieved. Thus, time complexity and space complexity are handled efficiently. Experiments were carried out to compare and contrast the performance of the conventional methods and the proposed method. Several metrics like execution time, accuracy, error identification, and undecidability of the resources were considered. PMID:26380365
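
    To make the Nash-equilibrium idea concrete, here is a toy pure-strategy equilibrium finder for a two-player trust game. The payoff matrices are invented for illustration and are not from the paper; note how such a game can have both a mutual-trust and a mutual-distrust equilibrium, which is exactly the cold-start ambiguity the method must resolve:

```python
import numpy as np

# Hypothetical 2x2 trust game: rows = provider (honor SLA / violate),
# columns = user (trust / distrust). Payoffs are illustrative only.
P = np.array([[3, 1], [0, 2]])   # provider payoffs
U = np.array([[3, 0], [1, 2]])   # user payoffs

def pure_nash(P, U):
    """Return all pure-strategy Nash equilibria of a bimatrix game:
    cells where each player's action is a best response to the other's."""
    eqs = []
    for i in range(P.shape[0]):
        for j in range(P.shape[1]):
            if P[i, j] == P[:, j].max() and U[i, j] == U[i, :].max():
                eqs.append((i, j))
    return eqs

print(pure_nash(P, U))  # -> [(0, 0), (1, 1)]: mutual trust and mutual distrust
```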

  6. Exploiting geo-distributed clouds for a e-health monitoring system with minimum service delay and privacy preservation.

    PubMed

    Shen, Qinghua; Liang, Xiaohui; Shen, Xuemin; Lin, Xiaodong; Luo, Henry Y

    2014-03-01

    In this paper, we propose an e-health monitoring system with minimum service delay and privacy preservation by exploiting geo-distributed clouds. In the system, the resource allocation scheme enables the distributed cloud servers to cooperatively assign servers to requesting users under a load-balance condition, minimizing the service delay for users. In addition, a traffic-shaping algorithm is proposed that disguises user health data traffic as non-health data traffic, largely reducing the effectiveness of traffic analysis attacks. Through numerical analysis, we show the efficiency of the proposed traffic-shaping algorithm in terms of service delay and privacy preservation. Furthermore, through simulations, we demonstrate that the proposed resource allocation scheme significantly reduces the service delay compared to two alternatives that jointly use shortest-queue and distributed control laws.
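
    The load-balancing intuition behind such schemes can be sketched with a greedy join-the-shortest-queue rule; this is a simplified stand-in, not the paper's cooperative allocation algorithm, and all names are invented:

```python
import heapq

def assign_requests(num_servers, requests):
    """Send each request to the server with the least accumulated load,
    a toy approximation of cooperative allocation under load balancing."""
    heap = [(0.0, s) for s in range(num_servers)]   # (load, server id)
    heapq.heapify(heap)
    placement = []
    for cost in requests:                            # cost ~ service demand
        load, server = heapq.heappop(heap)           # least-loaded server
        placement.append(server)
        heapq.heappush(heap, (load + cost, server))
    return placement

print(assign_requests(3, [5, 3, 4, 2, 6]))  # -> [0, 1, 2, 1, 2]
```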

  7. A secure EHR system based on hybrid clouds.

    PubMed

    Chen, Yu-Yi; Lu, Jun-Chao; Jan, Jinn-Ke

    2012-10-01

    Information and communication technologies have been applied to medical services and healthcare for a number of years to resolve problems in medical management. Consequently, application services rendering remote medical services and electronic health records (EHRs) have become a hot topic and have stimulated increased interest in recent years. Sharing EHR information to provide professional medical programs with consultancy, evaluation, and tracing services can certainly improve accessibility to medical services or medical information for the public at remote sites. With the widespread use of EHRs, building a secure EHR sharing environment has attracted a lot of attention in both the healthcare industry and the academic community. The cloud computing paradigm is one of the popular health-IT infrastructures for facilitating EHR sharing and integration. In this paper, we propose an EHR sharing and integration system in healthcare clouds and analyze the arising security and privacy issues in access and management of EHRs.

  8. Cloud Service Provider Methods for Managing Insider Threats: Analysis Phase 2, Expanded Analysis and Recommendations

    DTIC Science & Technology

    2014-01-01

    ...and software as a service (SaaS)) for staff's abnormal behavior that may indicate an insider incident. As mentioned above, combining SIEM data... Mellon Software Engineering Institute, contacted commercial and government cloud service providers (CSPs) to better understand the administrative and... availability services. We have observed a number of scenarios in which a customer leaves a CSP's IaaS, PaaS, or SaaS, but its data remains online for some...

  9. ARM KAZR-ARSCL Value Added Product

    DOE Data Explorer

    Jensen, Michael

    2012-09-28

    The Ka-band ARM Zenith Radars (KAZRs) have replaced the long-serving Millimeter Cloud Radars, or MMCRs. Accordingly, the primary MMCR Value Added Product (VAP), the Active Remote Sensing of CLouds (ARSCL) product, is being replaced by a KAZR-based version, the KAZR-ARSCL VAP. KAZR-ARSCL provides cloud boundaries and best-estimate time-height fields of radar moments.

  10. XMPP for cloud computing in bioinformatics supporting discovery and invocation of asynchronous web services

    PubMed Central

    Wagener, Johannes; Spjuth, Ola; Willighagen, Egon L; Wikberg, Jarl ES

    2009-01-01

    Background Life sciences make heavy use of the web for both data provision and analysis. However, the increasing amount of available data and the diversity of analysis tools call for machine-accessible interfaces in order to be effective. HTTP-based Web service technologies, like the Simple Object Access Protocol (SOAP) and REpresentational State Transfer (REST) services, are today the most common technologies for this in bioinformatics. However, these methods have severe drawbacks, including lack of discoverability and the inability for services to send status notifications. Several complementary workarounds have been proposed, but the results are ad-hoc solutions of varying quality that can be difficult to use. Results We present a novel approach based on the open standard Extensible Messaging and Presence Protocol (XMPP), consisting of an extension (IO Data) that covers discovery, asynchronous invocation, and definition of data types in the service. Because XMPP cloud services are capable of asynchronous communication, clients do not have to poll repetitively for status; the service sends the results back to the client upon completion. Implementations for Bioclipse and Taverna are presented, as are various XMPP cloud services in bio- and cheminformatics. Conclusion XMPP with its extensions is a powerful protocol for cloud services that demonstrates several advantages over traditional HTTP-based Web services: 1) services are discoverable without the need of an external registry, 2) asynchronous invocation eliminates the need for ad-hoc solutions like polling, and 3) input and output types defined in the service allow for the generation of clients on the fly without the need of an external semantics description. The many advantages over existing technologies make XMPP a highly interesting candidate for next-generation online services in bioinformatics. PMID:19732427

  11. XMPP for cloud computing in bioinformatics supporting discovery and invocation of asynchronous web services.

    PubMed

    Wagener, Johannes; Spjuth, Ola; Willighagen, Egon L; Wikberg, Jarl E S

    2009-09-04

    Life sciences make heavy use of the web for both data provision and analysis. However, the increasing amount of available data and the diversity of analysis tools call for machine-accessible interfaces in order to be effective. HTTP-based Web service technologies, like the Simple Object Access Protocol (SOAP) and REpresentational State Transfer (REST) services, are today the most common technologies for this in bioinformatics. However, these methods have severe drawbacks, including lack of discoverability and the inability for services to send status notifications. Several complementary workarounds have been proposed, but the results are ad-hoc solutions of varying quality that can be difficult to use. We present a novel approach based on the open standard Extensible Messaging and Presence Protocol (XMPP), consisting of an extension (IO Data) that covers discovery, asynchronous invocation, and definition of data types in the service. Because XMPP cloud services are capable of asynchronous communication, clients do not have to poll repetitively for status; the service sends the results back to the client upon completion. Implementations for Bioclipse and Taverna are presented, as are various XMPP cloud services in bio- and cheminformatics. XMPP with its extensions is a powerful protocol for cloud services that demonstrates several advantages over traditional HTTP-based Web services: 1) services are discoverable without the need of an external registry, 2) asynchronous invocation eliminates the need for ad-hoc solutions like polling, and 3) input and output types defined in the service allow for the generation of clients on the fly without the need of an external semantics description. The many advantages over existing technologies make XMPP a highly interesting candidate for next-generation online services in bioinformatics.
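
    The push-versus-poll contrast at the heart of this argument can be sketched without XMPP at all. The following conceptual Python asyncio example (not IO Data, all names invented) shows a long-running service delivering its result to the client on completion, so the client never polls for status:

```python
import asyncio

async def analysis_service(job, notify):
    """Toy stand-in for an asynchronous cloud service: run a long job,
    then push the result to the client instead of making it poll."""
    await asyncio.sleep(2)              # pretend this is a long analysis run
    await notify(f"result of {job}")

async def main():
    done = asyncio.Event()

    async def on_result(result):        # callback invoked by the service
        print("received:", result)
        done.set()

    task = asyncio.create_task(analysis_service("job-42", on_result))
    await done.wait()                   # the client simply waits; no polling
    await task

asyncio.run(main())
```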

  12. Estimating precipitation susceptibility in warm marine clouds using multi-sensor aerosol and cloud products from A-Train satellites

    NASA Astrophysics Data System (ADS)

    Bai, Heming; Gong, Cheng; Wang, Minghuai; Zhang, Zhibo; L'Ecuyer, Tristan

    2018-02-01

    Precipitation susceptibility to aerosol perturbations plays a key role in understanding aerosol-cloud interactions and constraining aerosol indirect effects. However, large discrepancies exist among previous satellite estimates of precipitation susceptibility. In this paper, multi-sensor aerosol and cloud products, including those from the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO), CloudSat, Moderate Resolution Imaging Spectroradiometer (MODIS), and Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E) from June 2006 to April 2011, are analyzed to estimate the precipitation frequency susceptibility SPOP, precipitation intensity susceptibility SI, and precipitation rate susceptibility SR in warm marine clouds. We find that SPOP strongly depends on atmospheric stability, with larger values under more stable environments. Our results show that precipitation susceptibility for drizzle (with a -15 dBZ rainfall threshold) is significantly different from that for rain (with a 0 dBZ rainfall threshold). The onset of drizzle is not as readily suppressed in warm clouds as that of rain, while precipitation intensity susceptibility is generally smaller for rain than for drizzle. We find that SPOP derived with respect to aerosol index (AI) is about one-third of SPOP derived with respect to cloud droplet number concentration (CDNC). Overall, SPOP shows relatively robust behavior across independent liquid water path (LWP) products and diverse rain products. In contrast, the behaviors of SI and SR depend on the LWP and rain products used to derive them. Recommendations are further made for how to better use these metrics to quantify aerosol-cloud-precipitation interactions in observations and models.
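
    Susceptibility metrics of this family are commonly defined as S = -d ln(X) / d ln(N), where X is a precipitation metric (frequency, intensity, or rate) and N is CDNC or a related aerosol proxy. A minimal sketch of estimating such a slope by log-log regression (the paper's actual binning and averaging procedure is more involved; the numbers below are toy data):

```python
import numpy as np

def susceptibility(precip_metric, cdnc):
    """Estimate S = -d ln(X) / d ln(N) as the least-squares slope in
    log-log space, the usual definition behind S_POP, S_I and S_R."""
    x = np.log(np.asarray(cdnc, dtype=float))
    y = np.log(np.asarray(precip_metric, dtype=float))
    slope = np.polyfit(x, y, 1)[0]
    return -slope

# Toy example: precipitation frequency falling as CDNC rises gives S > 0.
cdnc = np.array([20.0, 50.0, 100.0, 200.0])   # droplets per cm^3
pop  = np.array([0.60, 0.45, 0.33, 0.25])     # probability of precipitation
print(susceptibility(pop, cdnc))              # ~0.38
```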

  13. Evaluation of the Cloud Fields in the UK Met Office HadGEM3-UKCA Model Using the CCCM Satellite Data Product to Advance Our Understanding of the Influence of Clouds on Tropospheric Composition and Chemistry

    NASA Technical Reports Server (NTRS)

    Varma, Sunil; Voulgarakis, Apostolos; Liu, Hongyu; Crawford, James H.; White, James

    2016-01-01

    To determine the role of clouds in driving inter-annual and inter-seasonal variability of trace gases in the troposphere and lower stratosphere with a particular focus on the importance of cloud modification of photolysis. To evaluate the cloud fields and their vertical distribution in the HadGEM3 model utilizing CCCM, a unique 3-D cloud data product merged from multiple A-Train satellites (CERES, CloudSat, CALIPSO, and MODIS) developed at the NASA Langley Research Center.

  14. A Quantitative Risk Analysis Framework for Evaluating and Monitoring Operational Reliability of Cloud Computing

    ERIC Educational Resources Information Center

    Islam, Muhammad Faysal

    2013-01-01

    Cloud computing offers the advantage of on-demand, reliable and cost-efficient computing solutions without the capital investment and management resources needed to build and maintain in-house data centers and network infrastructures. The scalability of cloud solutions enables consumers to upgrade or downsize their services as needed. In a cloud environment,…

  15. 77 FR 68122 - Formations of, Acquisitions by, and Mergers of Savings and Loan Holding Companies

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-15

    ..., Minneapolis, Minnesota 55480-0291: 1. The Miller Family 2012 Trust U/A Dated December 21, 2012, St. Cloud... Services of Saint Cloud, Inc., Saint Cloud, MN, and thereby indirectly acquire control of Liberty Savings Bank, FSB, Saint Cloud, Minnesota. Board of Governors of the Federal Reserve System, November 9, 2012...

  16. Mobile 3d Mapping with a Low-Cost Uav System

    NASA Astrophysics Data System (ADS)

    Neitzel, F.; Klonowski, J.

    2011-09-01

    In this contribution it is shown how a UAV system can be built at low cost. The components of the system, the equipment, and the control software are presented. Furthermore, an implemented programme for photogrammetric flight planning and its execution are described. The main focus of this contribution is on the generation of 3D point clouds from digital imagery. For this, web services and free software solutions are presented which automatically generate 3D point clouds from arbitrary image configurations. Possibilities for georeferencing are described, and the achieved accuracy is determined. The presented workflow is finally used for the acquisition of 3D geodata. Using the example of a landfill survey, it is shown that marketable products can be derived using a low-cost UAV.

  17. Covariability in the Monthly Mean Convective and Radiative Diurnal Cycles in the Amazon

    NASA Technical Reports Server (NTRS)

    Dodson, Jason B.; Taylor, Patrick C.

    2015-01-01

    The diurnal cycle of convective clouds greatly influences the radiative energy balance in convectively active regions of Earth, through both their direct presence and the production of anvil and stratiform clouds. Previous studies show that the frequency and properties of convective clouds can vary on monthly timescales as a result of variability in the monthly mean atmospheric state. Furthermore, the radiative budget in convectively active regions also varies by up to 7 Wm-2. These facts suggest that convective clouds connect atmospheric state variability and radiation variability beyond clear-sky effects alone. Previous research has identified monthly covariability between the diurnal cycle of CERES-observed top-of-atmosphere radiative fluxes and multiple atmospheric state variables (ASVs) from reanalysis over the Amazon region. ASVs that enhance (reduce) deep convection, such as CAPE (LTS), tend to shift the daily OLR and cloud albedo maxima earlier (later) in the day by 2-3 hr. We first test the analysis method using multiple reanalysis products for both the dry and wet seasons to further investigate the robustness of the preliminary results. We then use CloudSat data as an independent cloud observing system to further evaluate the relationships of cloud properties to variability in radiation and atmospheric states. While CERES can decompose OLR variability into clear-sky and cloud effects, it cannot determine which variability in cloud properties leads to variability in the radiative cloud effects. Cloud frequency, cloud-top height, and cloud microphysics all contribute to the cloud radiative effect, and all are observable by CloudSat. In addition, CloudSat can observe the presence and variability of the deep convective cores responsible for the production of anvil clouds. We use these capabilities to determine the covariability of convective cloud properties and the radiative diurnal cycle.

  18. Performance Evaluation of Resource Management in Cloud Computing Environments.

    PubMed

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  19. Performance Evaluation of Resource Management in Cloud Computing Environments

    PubMed Central

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price. PMID:26555730
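
    The on-the-fly rescaling with a corresponding price change described above can be caricatured in a few lines. This is a toy rule under assumed inputs (an SLA response-time limit and a per-CPU price), not the module evaluated in the paper:

```python
def rescale(allocated_cpus, response_time, sla_limit, price_per_cpu):
    """Toy QoS-driven scaling rule: grow the allocation when the SLA
    response-time limit is breached, shrink it when there is ample
    headroom, and reprice according to the new allocation."""
    if response_time > sla_limit:                         # QoS violated
        allocated_cpus += 1                               # scale up
    elif response_time < 0.5 * sla_limit and allocated_cpus > 1:
        allocated_cpus -= 1                               # scale down
    return allocated_cpus, allocated_cpus * price_per_cpu

print(rescale(4, response_time=2.5, sla_limit=2.0, price_per_cpu=0.05))
# -> (5, 0.25): one more CPU, with the price updated to match
```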

  20. Towards an e-Health Cloud Solution for Remote Regions at Bahia-Brazil.

    PubMed

    Sarinho, V T; Mota, A O; Silva, E P

    2017-12-19

    This paper presents CloudMedic, an e-Health Cloud solution that manages health care services in remote regions of Bahia, Brazil. Six main modules (Clinic, Hospital, Supply, Administrative, Billing and Health Business Intelligence) were developed to control the health flow among health actors at health institutions. They provide a database model and procedures for health business rules, a standard gateway for data maintenance between web views and the database layer, and a multi-front-end framework based on web view and web command configurations. These resources were used by 2042 health actors in 261 health posts covering health demands from 118 municipalities in the state of Bahia. They also managed approximately 2.4 million health service orders and approximately 13.5 million health exams for more than 1.3 million registered patients. As a result, a collection of health functionalities available in a cloud infrastructure was successfully developed, deployed and validated in more than 28% of Bahia's municipalities: a viable e-Health Cloud solution that, despite municipality limitations in remote regions, decentralizes and improves access to health care services in the state of Bahia.

  1. Cloud-based adaptive exon prediction for DNA analysis.

    PubMed

    Putluri, Srinivasareddy; Zia Ur Rahman, Md; Fathima, Shaik Yasmeen

    2018-02-01

    Cloud computing offers significant research and economic benefits to healthcare organisations. Cloud services provide a safe place for storing and managing large amounts of sensitive data. Under the conventional flow of gene information, gene sequence laboratories send out raw and inferred information via the Internet to several sequence libraries. DNA sequencing storage costs can be minimised by use of a cloud service. In this study, the authors put forward a novel genomic informatics system using Amazon Cloud Services, where genomic sequence information is stored and accessed for processing. True identification of exon regions in a DNA sequence is a key task in bioinformatics, which helps in disease identification and drug design. The three-base periodicity property of exons forms the basis of all exon identification techniques. Adaptive signal processing techniques have been found to be promising in comparison with several other methods. Several adaptive exon predictors (AEPs) are developed using variable normalised least mean square and its maximum normalised variants to reduce computational complexity. Finally, performance evaluation of the various AEPs is done based on measures such as sensitivity, specificity and precision using various standard genomic datasets taken from the National Center for Biotechnology Information genomic sequence database.
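
    To illustrate the idea of an adaptive exon predictor exploiting three-base periodicity, here is a plain normalized-LMS sketch (the paper's variable- and maximum-normalized variants differ; the DNA string and all parameters below are toy values):

```python
import numpy as np

def nlms_period3(signal, order=8, mu=0.5, eps=1e-6):
    """Normalized LMS adaptive filter over a numeric DNA indicator
    signal; the period-3 power of the filter output is the usual
    exon cue, since exon regions show three-base periodicity."""
    w = np.zeros(order)
    y = np.zeros(len(signal))
    for n in range(order, len(signal)):
        x = signal[n - order:n][::-1]        # most recent samples first
        y[n] = w @ x                          # filter prediction
        e = signal[n] - y[n]                  # prediction error
        w += mu * e * x / (eps + x @ x)       # normalized LMS update
    spectrum = np.abs(np.fft.fft(y)) ** 2
    return spectrum[len(y) // 3]              # power at frequency 1/3

# Binary indicator sequence for base 'G' of a toy DNA string:
dna = "ATGGCGGTGGCAGTGGAATGGCCGTAG" * 8
sig = np.array([1.0 if b == "G" else 0.0 for b in dna])
print(nlms_period3(sig))
```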

  2. Forensic Investigation of Cooperative Storage Cloud Service: Symform as a Case Study.

    PubMed

    Teing, Yee-Yang; Dehghantanha, Ali; Choo, Kim-Kwang Raymond; Dargahi, Tooska; Conti, Mauro

    2017-05-01

    Researchers envisioned Storage as a Service (StaaS) as an effective solution to the distributed management of digital data. Cooperative storage cloud forensics is relatively new and an under-explored area of research. Using Symform as a case study, we seek to determine the data remnants from the use of cooperative cloud storage services. In particular, we consider both mobile devices and personal computers running various popular operating systems, namely Windows 8.1, Mac OS X Mavericks 10.9.5, Ubuntu 14.04.1 LTS, iOS 7.1.2, and Android KitKat 4.4.4. Potential artifacts recovered during the research include data relating to the installation and uninstallation of the cloud applications, log-in to and log-out from a Symform account using the client application, and file synchronization, as well as their timestamp information. This research contributes to an in-depth understanding of the types of terrestrial artifacts that are likely to remain after the use of a cooperative storage cloud on client devices. © 2016 American Academy of Forensic Sciences.

  3. Symmetrical compression distance for arrhythmia discrimination in cloud-based big-data services.

    PubMed

    Lillo-Castellano, J M; Mora-Jiménez, I; Santiago-Mozos, R; Chavarría-Asso, F; Cano-González, A; García-Alberola, A; Rojo-Álvarez, J L

    2015-07-01

    The current development of cloud computing is completely changing the paradigm of knowledge extraction in huge databases. An example of this technology in the cardiac arrhythmia field is the SCOOP platform, a national-level scientific cloud-based big-data service for implantable cardioverter defibrillators. In this scenario, we propose a new methodology for automatic classification of intracardiac electrograms (EGMs) in a cloud computing system, designed for minimal signal preprocessing. A new compression-based similarity measure (CSM), the so-called weighted fast compression distance, is created for low computational burden and provides better performance when compared with other CSMs in the literature. Using simple machine learning techniques, a set of 6848 EGMs extracted from the SCOOP platform were classified into seven cardiac arrhythmia classes and one noise class, reaching nearly 90% accuracy when previous patient arrhythmia information was available and 63% otherwise, in all cases exceeding the classification provided by the majority class. Results show that this methodology can be used as a high-quality cloud computing service, providing support to physicians for improving knowledge on patient diagnosis.
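
    For context, the classic compression-based similarity measure that such weighted variants build upon is the normalized compression distance (NCD). A minimal sketch using zlib (illustrative only; the byte strings stand in for EGM signals and this is not the paper's weighted fast compression distance):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
    where C(.) is the compressed length. Near 0 means very similar."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"0101100101" * 50                 # toy stand-ins for two EGM records
b = b"0101100101" * 49 + b"11"
print(ncd(a, b))                       # small: highly similar sequences
print(ncd(a, bytes(range(256)) * 2))   # larger: dissimilar content
```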

  4. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    NASA Astrophysics Data System (ADS)

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-07-01

    The cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, the optical network and the processing-unit cloud have been decoupled from each other, so that their resources are controlled independently. With the growing number of mobile internet users, traditional architectures cannot implement the resource optimization and scheduling needed for high-level service guarantees, due to the communication obstacles among these domains. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in a cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. The MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load scenario is also quantitatively evaluated in terms of resource utilization, path blocking probability, network cost and path provisioning latency, compared with other provisioning schemes, to demonstrate the efficiency of the proposal based on the MDRI architecture.
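
    The auxiliary-graph idea can be sketched as follows: radio, optical and processing resources become weighted edges of a single graph, so one shortest-path computation provisions across all three domains at once. All node names and weights below are invented, and this is a caricature of the RIP scheme, not the paper's actual graph construction (assumes the networkx package):

```python
import networkx as nx

# Toy auxiliary graph spanning radio, optical and processing domains.
G = nx.Graph()
G.add_edge("RRH-1", "BBU-pool", weight=2)   # radio segment
G.add_edge("BBU-pool", "OXC-A", weight=1)   # optical fronthaul, path A
G.add_edge("OXC-A", "DC-cloud", weight=3)   # optical to processing cloud
G.add_edge("BBU-pool", "OXC-B", weight=2)   # optical fronthaul, path B
G.add_edge("OXC-B", "DC-cloud", weight=1)

# One shortest-path query provisions an end-to-end service across domains.
path = nx.shortest_path(G, "RRH-1", "DC-cloud", weight="weight")
print(path)  # ['RRH-1', 'BBU-pool', 'OXC-B', 'DC-cloud'] (total cost 5)
```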

  5. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network.

    PubMed

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-07-28

    The cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, the optical network and the processing-unit cloud have been decoupled from each other, so that their resources are controlled independently. With the growing number of mobile internet users, traditional architectures cannot implement the resource optimization and scheduling needed for high-level service guarantees, due to the communication obstacles among these domains. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in a cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. The MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load scenario is also quantitatively evaluated in terms of resource utilization, path blocking probability, network cost and path provisioning latency, compared with other provisioning schemes, to demonstrate the efficiency of the proposal based on the MDRI architecture.

  6. Adaptive Resource Utilization Prediction System for Infrastructure as a Service Cloud.

    PubMed

    Zia Ullah, Qazi; Hassan, Shahzad; Khan, Gul Muhammad

    2017-01-01

    Infrastructure as a Service (IaaS) clouds provide resources as a service from a pool of compute, network, and storage resources. Cloud providers can manage their resource usage by predicting future usage demand from current and past usage patterns. Resource usage prediction is of great importance for the dynamic scaling of cloud resources, achieving efficiency in terms of cost and energy consumption while maintaining quality of service. The purpose of this paper is to present a real-time resource usage prediction system. The system takes real-time utilization measurements of resources and feeds the utilization values into several buffers, based on the type of resource and the time-span size. The buffers are read by an R-language-based statistical system, which checks whether the buffered data follow a Gaussian distribution. If they do, an Autoregressive Integrated Moving Average (ARIMA) model is applied; otherwise an Autoregressive Neural Network (AR-NN) is applied. In the ARIMA process, the model with the minimum Akaike Information Criterion (AIC) value is selected. Similarly, in the AR-NN process, the network with the lowest Network Information Criterion (NIC) value is selected. We have evaluated our system with real traces of CPU utilization from an IaaS cloud of one hundred and twenty servers.
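
    The branching logic described above (normality test, then minimum-AIC ARIMA selection) can be sketched in Python in place of the paper's R system; the AR-NN branch is left out for brevity and all data and candidate orders are toy values:

```python
import numpy as np
from scipy import stats
from statsmodels.tsa.arima.model import ARIMA

def forecast_cpu(window, steps=5):
    """Shapiro-Wilk normality test on a buffer of CPU-utilization
    samples; if roughly Gaussian, fit a few ARIMA orders and keep the
    minimum-AIC model, then forecast the next `steps` values."""
    _, pvalue = stats.shapiro(window)
    if pvalue < 0.05:
        raise NotImplementedError("non-Gaussian buffer: use an AR neural net")
    candidates = (ARIMA(window, order=(p, d, q)).fit()
                  for p in (1, 2) for d in (0, 1) for q in (0, 1))
    best = min(candidates, key=lambda res: res.aic)   # minimum-AIC model
    return best.forecast(steps=steps)

# Toy CPU-utilization trace: a slow oscillation plus Gaussian noise.
cpu = 50 + 10 * np.sin(np.arange(60) / 5) + np.random.randn(60)
print(forecast_cpu(cpu))
```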

  7. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    PubMed Central

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-01-01

    Cloud radio access network (C-RAN) becomes a promising scenario to accommodate high-performance services with ubiquitous user coverage and real-time cloud computing in 5G area. However, the radio network, optical network and processing unit cloud have been decoupled from each other, so that their resources are controlled independently. Traditional architecture cannot implement the resource optimization and scheduling for the high-level service guarantee due to the communication obstacle among them with the growing number of mobile internet users. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. The MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on OpenFlow-based enhanced SDN testbed. The performance of RIP scheme under heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on MDRI architecture in terms of resource utilization, path blocking probability, network cost and path provisioning latency, compared with other provisioning schemes. PMID:27465296

  8. Adaptive Resource Utilization Prediction System for Infrastructure as a Service Cloud

    PubMed Central

    Hassan, Shahzad; Khan, Gul Muhammad

    2017-01-01

    Infrastructure as a Service (IaaS) clouds provide resources as a service from a pool of compute, network, and storage resources. Cloud providers can manage their resource usage by predicting future usage demand from current and past usage patterns. Resource usage prediction is of great importance for the dynamic scaling of cloud resources, achieving efficiency in terms of cost and energy consumption while maintaining quality of service. The purpose of this paper is to present a real-time resource usage prediction system. The system takes real-time utilization measurements of resources and feeds the utilization values into several buffers, based on the type of resource and the time-span size. The buffers are read by an R-language-based statistical system, which checks whether the buffered data follow a Gaussian distribution. If they do, an Autoregressive Integrated Moving Average (ARIMA) model is applied; otherwise an Autoregressive Neural Network (AR-NN) is applied. In the ARIMA process, the model with the minimum Akaike Information Criterion (AIC) value is selected. Similarly, in the AR-NN process, the network with the lowest Network Information Criterion (NIC) value is selected. We have evaluated our system with real traces of CPU utilization from an IaaS cloud of one hundred and twenty servers. PMID:28811819

  9. Streamlining On-Demand Access to Joint Polar Satellite System (JPSS) Data Products for Weather Forecasting

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Tislin, D.

    2017-12-01

    Observations from the Joint Polar Satellite System (JPSS) support National Weather Service (NWS) forecasters, whose Advanced Weather Interactive Processing System (AWIPS) Data Delivery (DD) will access JPSS data products on demand from the National Environmental Satellite, Data, and Information Service (NESDIS) Product Distribution and Access (PDA) service. Based on the Open Geospatial Consortium (OGC) Web Coverage Service, this on-demand service promises broad interoperability and frugal use of data networks by serving only the data that a user needs. But the volume, velocity, and variety of JPSS data products impose several challenges on such a service. It must be efficient, to handle large volumes of complex, frequently updated data and to fulfill many concurrent requests. It must offer flexible data handling and delivery, to work with a diverse and changing collection of data and to tailor its outputs into products that users need, with minimal coordination between provider and user communities. It must support 24x7 operation, with no pauses in incoming data or user demand; and it must scale to rapid changes in data volume, variety, and demand as new satellites launch, more products come online, and users rely increasingly on the service. We are addressing these challenges in order to build an efficient and effective on-demand JPSS data service. For example, on-demand subsetting by many users at once may overload a server's processing capacity or its disk bandwidth, unless alleviated by spatial indexing, geolocation transforms, or pre-tiling and caching. Filtering by variable (/band/layer) may also alleviate network loads and provide fine-grained variable selection; to that end we are investigating how best to provide random access into the variety of spatiotemporal JPSS data products. Finally, producing tailored products (derivatives, aggregations) can boost flexibility for end users, but some tailoring operations may impose significant server loads. Operating this service in a cloud computing environment allows cost-effective scaling during the development and early deployment phases, and perhaps beyond. We will discuss how NESDIS and NWS are assessing and addressing these challenges to provide timely and effective access to JPSS data products for weather forecasters throughout the country.
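
    To make the on-demand subsetting concrete, here is what a client-side OGC WCS 2.0 GetCoverage request with spatial subsetting looks like. The endpoint URL, coverage identifier, and output format are hypothetical placeholders, not the actual PDA interface; only the KVP parameter names follow the WCS 2.0 specification:

```python
import requests

PDA_WCS = "https://pda.example.gov/wcs"   # hypothetical service endpoint

params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "GetCoverage",
    "coverageId": "VIIRS_SST_GRANULE_20170901",  # made-up JPSS coverage id
    "subset": ["Lat(25,35)", "Long(-95,-85)"],   # spatial subsetting
    "format": "application/netcdf",              # assumed output format
}
r = requests.get(PDA_WCS, params=params, timeout=60)
r.raise_for_status()
with open("subset.nc", "wb") as f:
    f.write(r.content)   # only the requested region crosses the network
```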

  10. The effect of medication therapy management service combined with a national PharmaCloud system for polypharmacy patients.

    PubMed

    Chen, Chang-Ming; Kuo, Li-Na; Cheng, Kuei-Ju; Shen, Wan-Chen; Bai, Kuan-Jen; Wang, Chih-Chi; Chiang, Yi-Chun; Chen, Hsiang-Yin

    2016-10-01

    This study evaluated a medication therapy management service using the Taiwan National Health Insurance Administration's PharmaCloud system in a medical center in Taiwan. The PharmaCloud system, launched in 2013, links a complete list of prescribed and dispensed medications from different hospitals, clinics, and pharmacies for all insured patients. The study included patients with polypharmacy (≥5 drugs) in a medication therapy management service from March 2013 to March 2014. A structured questionnaire was designed to collect patients' baseline data and record patients' knowledge, attitude, and practice scores before and after the service intervention. Phone follow-ups for practice and adherence scores on medication use were performed after 3 months. A total of 152 patients were recruited in the study. Scores for medication-use attitudes and practice significantly increased after the service (attitudes: 40.06 ± 0.26 to 43.07 ± 0.19, p < 0.001; practice: 33.42 ± 0.30 to 40.37 ± 0.30, p < 0.001). Scores for medication adherence also increased, from 3.02 ± 0.07 to 3.92 ± 0.02 (p < 0.001). The PharmaCloud system facilitates accurate and efficient medication reconciliation for pharmacists in the medication therapy management service. The model improved patients' attitudes toward and practice of the rational use of medications and their adherence with medications. Further studies are warranted to evaluate the human resources, operating costs, and cost-benefit ratio of this medication therapy management service with the PharmaCloud system. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. A Cloud-Computing Service for Environmental Geophysics and Seismic Data Processing

    NASA Astrophysics Data System (ADS)

    Heilmann, B. Z.; Maggi, P.; Piras, A.; Satta, G.; Deidda, G. P.; Bonomi, E.

    2012-04-01

    Cloud computing is establishing itself worldwide as a new high-performance computing paradigm that offers formidable possibilities to industry and science. The presented cloud-computing portal, part of the Grida3 project, provides an innovative approach to seismic data processing by combining open-source, state-of-the-art processing software and cloud-computing technology, making possible the effective use of distributed computation and data management with administratively distant resources. We substituted the demanding user-side hardware and software requirements with remote access to high-performance grid-computing facilities. As a result, data processing can be done quasi in real time, controlled ubiquitously via the Internet through a user-friendly web-browser interface. Besides the obvious advantages over locally installed seismic-processing packages, the presented cloud-computing solution creates completely new possibilities for scientific education, collaboration, and the presentation of reproducible results. The web-browser interface of our portal is based on the commercially supported grid portal EnginFrame, an open framework based on Java, XML, and Web Services. We selected the hosted applications with the objective of allowing the construction of typical 2D time-domain seismic-imaging workflows as used for environmental studies and, originally, for hydrocarbon exploration. For data visualization and pre-processing, we chose the free software package Seismic Un*x. We ported tools for trace balancing, amplitude gaining, muting, frequency filtering, dip filtering, deconvolution and rendering, with a customized choice of options, as services onto the cloud-computing portal. For structural imaging and velocity-model building, we developed a grid version of the Common-Reflection-Surface (CRS) stack, a data-driven imaging method that requires no user interaction at run time, such as manual picking in prestack volumes or velocity spectra. Due to its high level of automation, CRS stacking can benefit largely from the hardware parallelism provided by the cloud deployment. The resulting output, post-stack section, coherence, and NMO-velocity panels are used to generate a smooth migration-velocity model. Residual static corrections are calculated as a by-product of the stack and can be applied iteratively. As a final step, a time-migrated subsurface image is obtained by a parallelized Kirchhoff time migration scheme. Processing can be done step by step or using a graphical workflow editor that can launch a series of pipelined tasks. The status of the submitted jobs is monitored by a dedicated service. All results are stored in project directories, where they can be downloaded or viewed directly in the browser. Currently, the portal has access to three research clusters with a total of 70 nodes of 4 cores each, shared with four other cloud-computing applications bundled within the GRIDA3 project. To demonstrate the functionality of our "seismic cloud lab", we will present results obtained for three different types of data, all taken from hydrogeophysical studies: (1) a seismic reflection data set, made of compressional waves from explosive sources, recorded in Muravera, Sardinia; (2) a shear-wave data set from Sardinia; (3) a multi-offset Ground-Penetrating-Radar data set from Larreule, France. The presented work was funded by the government of the Autonomous Region of Sardinia and by the Italian Ministry of Research and Education.

  12. Virtual Facility at Fermilab: Infrastructure and Services Expand to Public Clouds

    DOE PAGES

    Timm, Steve; Garzoglio, Gabriele; Cooper, Glenn; ...

    2016-02-18

    In preparation for its new Virtual Facility Project, Fermilab has launched a program of work to determine the requirements for running a computation facility on-site, in public clouds, or a combination of both. This program builds on the work we have done to successfully run experimental workflows of 1000-VM scale both on an on-site private cloud and on Amazon AWS. To do this at scale we deployed dynamically launched and discovered caching services on the cloud. We are now testing the deployment of more complicated services on Amazon AWS using native load balancing and auto scaling features they provide. The Virtual Facility Project will design and develop a facility including infrastructure and services that can live on the site of Fermilab, off-site, or a combination of both. We expect to need this capacity to meet the peak computing requirements in the future. The Virtual Facility is intended to provision resources on the public cloud on behalf of the facility as a whole instead of having each experiment or Virtual Organization do it on their own. We will describe the policy aspects of a distributed Virtual Facility, the requirements, and plans to make a detailed comparison of the relative cost of the public and private clouds. Furthermore, this talk will present the details of the technical mechanisms we have developed to date, and the plans currently taking shape for a Virtual Facility at Fermilab.

  13. Virtual Facility at Fermilab: Infrastructure and Services Expand to Public Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timm, Steve; Garzoglio, Gabriele; Cooper, Glenn

    In preparation for its new Virtual Facility Project, Fermilab has launched a program of work to determine the requirements for running a computation facility on-site, in public clouds, or a combination of both. This program builds on the work we have done to successfully run experimental workflows of 1000-VM scale both on an on-site private cloud and on Amazon AWS. To do this at scale we deployed dynamically launched and discovered caching services on the cloud. We are now testing the deployment of more complicated services on Amazon AWS using native load balancing and auto scaling features they provide. The Virtual Facility Project will design and develop a facility including infrastructure and services that can live on the site of Fermilab, off-site, or a combination of both. We expect to need this capacity to meet the peak computing requirements in the future. The Virtual Facility is intended to provision resources on the public cloud on behalf of the facility as a whole instead of having each experiment or Virtual Organization do it on their own. We will describe the policy aspects of a distributed Virtual Facility, the requirements, and plans to make a detailed comparison of the relative cost of the public and private clouds. Furthermore, this talk will present the details of the technical mechanisms we have developed to date, and the plans currently taking shape for a Virtual Facility at Fermilab.

  14. The MODIS Cloud Optical and Microphysical Products: Collection 6 Up-dates and Examples From Terra and Aqua

    NASA Technical Reports Server (NTRS)

    Platnick, Steven; Meyer, Kerry G.; King, Michael D.; Wind, Galina; Amarasinghe, Nandana; Marchant, Benjamin G.; Arnold, G. Thomas; Zhang, Zhibo; Hubanks, Paul A.; Holz, Robert E.

    2016-01-01

    The MODIS Level-2 cloud product (Earth Science Data Set names MOD06 and MYD06 for Terra and Aqua MODIS, respectively) provides pixel-level retrievals of cloud-top properties (day and night pressure, temperature, and height) and cloud optical properties (optical thickness, effective particle radius, and water path for both liquid water and ice cloud thermodynamic phases; daytime only). Collection 6 (C6) reprocessing of the product was completed in May 2014 and March 2015 for MODIS Aqua and Terra, respectively. Here we provide an overview of major C6 optical property algorithm changes relative to the previous Collection 5 (C5) product. Notable C6 optical and microphysical algorithm changes include: (i) new ice cloud optical property models and a more extensive cloud radiative transfer code lookup table (LUT) approach, (ii) improvement in the skill of the shortwave-derived cloud thermodynamic phase, (iii) separate cloud effective radius retrieval datasets for each spectral combination used in previous collections, (iv) separate retrievals for partly cloudy pixels and those associated with cloud edges, (v) failure metrics that provide diagnostic information for pixels having observations that fall outside the LUT solution space, and (vi) enhanced pixel-level retrieval uncertainty calculations. The C6 algorithm changes collectively can result in significant changes relative to C5, though the magnitude depends on the dataset and the pixel's retrieval location in the cloud parameter space. Example Level-2 granule and Level-3 gridded dataset differences between the two collections are shown. While the emphasis is on the suite of cloud optical property datasets, other MODIS cloud datasets are discussed when relevant.

  15. The MODIS cloud optical and microphysical products: Collection 6 updates and examples from Terra and Aqua.

    PubMed

    Platnick, Steven; Meyer, Kerry G; King, Michael D; Wind, Galina; Amarasinghe, Nandana; Marchant, Benjamin; Arnold, G Thomas; Zhang, Zhibo; Hubanks, Paul A; Holz, Robert E; Yang, Ping; Ridgway, William L; Riedi, Jérôme

    2017-01-01

    The MODIS Level-2 cloud product (Earth Science Data Set names MOD06 and MYD06 for Terra and Aqua MODIS, respectively) provides pixel-level retrievals of cloud-top properties (day and night pressure, temperature, and height) and cloud optical properties (optical thickness, effective particle radius, and water path for both liquid water and ice cloud thermodynamic phases-daytime only). Collection 6 (C6) reprocessing of the product was completed in May 2014 and March 2015 for MODIS Aqua and Terra, respectively. Here we provide an overview of major C6 optical property algorithm changes relative to the previous Collection 5 (C5) product. Notable C6 optical and microphysical algorithm changes include: (i) new ice cloud optical property models and a more extensive cloud radiative transfer code lookup table (LUT) approach, (ii) improvement in the skill of the shortwave-derived cloud thermodynamic phase, (iii) separate cloud effective radius retrieval datasets for each spectral combination used in previous collections, (iv) separate retrievals for partly cloudy pixels and those associated with cloud edges, (v) failure metrics that provide diagnostic information for pixels having observations that fall outside the LUT solution space, and (vi) enhanced pixel-level retrieval uncertainty calculations. The C6 algorithm changes collectively can result in significant changes relative to C5, though the magnitude depends on the dataset and the pixel's retrieval location in the cloud parameter space. Example Level-2 granule and Level-3 gridded dataset differences between the two collections are shown. While the emphasis is on the suite of cloud optical property datasets, other MODIS cloud datasets are discussed when relevant.

  16. The MODIS cloud optical and microphysical products: Collection 6 updates and examples from Terra and Aqua

    PubMed Central

    Platnick, Steven; Meyer, Kerry G.; King, Michael D.; Wind, Galina; Amarasinghe, Nandana; Marchant, Benjamin; Arnold, G. Thomas; Zhang, Zhibo; Hubanks, Paul A.; Holz, Robert E.; Yang, Ping; Ridgway, William L.; Riedi, Jérôme

    2018-01-01

    The MODIS Level-2 cloud product (Earth Science Data Set names MOD06 and MYD06 for Terra and Aqua MODIS, respectively) provides pixel-level retrievals of cloud-top properties (day and night pressure, temperature, and height) and cloud optical properties (optical thickness, effective particle radius, and water path for both liquid water and ice cloud thermodynamic phases–daytime only). Collection 6 (C6) reprocessing of the product was completed in May 2014 and March 2015 for MODIS Aqua and Terra, respectively. Here we provide an overview of major C6 optical property algorithm changes relative to the previous Collection 5 (C5) product. Notable C6 optical and microphysical algorithm changes include: (i) new ice cloud optical property models and a more extensive cloud radiative transfer code lookup table (LUT) approach, (ii) improvement in the skill of the shortwave-derived cloud thermodynamic phase, (iii) separate cloud effective radius retrieval datasets for each spectral combination used in previous collections, (iv) separate retrievals for partly cloudy pixels and those associated with cloud edges, (v) failure metrics that provide diagnostic information for pixels having observations that fall outside the LUT solution space, and (vi) enhanced pixel-level retrieval uncertainty calculations. The C6 algorithm changes collectively can result in significant changes relative to C5, though the magnitude depends on the dataset and the pixel’s retrieval location in the cloud parameter space. Example Level-2 granule and Level-3 gridded dataset differences between the two collections are shown. While the emphasis is on the suite of cloud optical property datasets, other MODIS cloud datasets are discussed when relevant. PMID:29657349

  17. Evaluation of Decision Trees for Cloud Detection from AVHRR Data

    NASA Technical Reports Server (NTRS)

    Shiffman, Smadar; Nemani, Ramakrishna

    2005-01-01

    Automated cloud detection and tracking is an important step in assessing changes in radiation budgets associated with global climate change via remote sensing. Data products based on satellite imagery are available to the scientific community for studying trends in the Earth's atmosphere. The data products include pixel-based cloud masks that assign cloud-cover classifications to pixels. Many cloud-mask algorithms have the form of decision trees. The decision trees employ sequential tests that scientists designed based on empirical studies and simulations. Limitations of existing cloud masks restrict our ability to accurately track changes in cloud patterns over time. In a previous study we compared automatically learned decision trees to cloud masks included in Advanced Very High Resolution Radiometer (AVHRR) data products from the year 2000. In this paper we report the replication of the study for five years of data, and for a gold standard based on surface observations performed by scientists at weather stations in the British Isles. For our sample data, the accuracy of automatically learned decision trees was greater than that of the cloud masks (p < 0.001).
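
    As a hedged illustration of the approach, the sketch below trains a decision tree on synthetic per-pixel features standing in for AVHRR channel data; the features, labels, and tree depth are all hypothetical, not the study's actual configuration.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.metrics import accuracy_score

      # Synthetic stand-ins: three per-pixel features (e.g. visible
      # reflectance, thermal brightness temperature, channel difference)
      # and binary cloud/clear labels from a hypothetical gold standard.
      rng = np.random.default_rng(0)
      X = rng.random((1000, 3))
      y = (X[:, 0] + 0.5 * X[:, 2] > 0.8).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                random_state=0)
      tree = DecisionTreeClassifier(max_depth=5).fit(X_tr, y_tr)
      print("held-out accuracy:", accuracy_score(y_te, tree.predict(X_te)))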

  18. Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms.

    PubMed

    Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel

    2014-01-01

    With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs), and workflow execution on VM instances. To make the system scalable and easy to extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their executions in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture, and analyze the differences and best usages of the algorithms in terms of performance, cost, and the price/performance ratio via experimental studies.
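
    A minimal sketch of one such heuristic, under assumptions of our own (the VM catalog, prices, and speeds below are hypothetical, and the paper's four algorithms are not reproduced here): choose the cheapest VM type whose predicted runtime still meets the request's deadline.

      # name: (hourly price in USD, relative speed) -- all toy values
      VM_TYPES = {
          "small":  (0.05, 1.0),
          "medium": (0.10, 2.1),
          "large":  (0.20, 4.0),
      }

      def schedule(base_runtime_h, deadline_h):
          feasible = [
              (price * base_runtime_h / speed, name)
              for name, (price, speed) in VM_TYPES.items()
              if base_runtime_h / speed <= deadline_h
          ]
          return min(feasible)[1] if feasible else None   # cheapest feasible

      print(schedule(base_runtime_h=6.0, deadline_h=2.0))  # -> 'large'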

  19. Menu-driven cloud computing and resource sharing for R and Bioconductor.

    PubMed

    Bolouri, Hamid; Dulepet, Rajiv; Angerman, Michael

    2011-08-15

    We report CRdata.org, a cloud-based, free, open-source web server for running analyses and sharing data and R scripts with others. In addition to using the free, public service, CRdata users can launch their own private Amazon Elastic Compute Cloud (EC2) nodes and store private data and scripts on Amazon's Simple Storage Service (S3) with user-controlled access rights. All CRdata services are provided via point-and-click menus. CRdata is open-source and free under the permissive MIT License (opensource.org/licenses/mit-license.php). The source code is in Ruby (ruby-lang.org/en/) and available at: github.com/seerdata/crdata. Contact: hbolouri@fhcrc.org.

  20. Cloud based emergency health care information service in India.

    PubMed

    Karthikeyan, N; Sukanesh, R

    2012-12-01

    A hospital is a health care organization providing patient treatment by expert physicians, surgeons, and equipment. A report from a health care accreditation group says that miscommunication between patients and health care providers is the reason for the gap in providing emergency medical care to people in need. In developing countries, illiteracy is a major root cause of deaths resulting from undiagnosed diseases, constituting a serious public health problem. Mentally affected, differently abled, and unconscious patients cannot communicate their medical history to medical practitioners. Also, medical practitioners cannot edit or view DICOM images instantly. Our aim is to provide a palm vein pattern recognition based medical record retrieval system, using cloud computing, for the above-mentioned people. Distributed computing technology is emerging in new forms such as Grid computing and Cloud computing, which promise to deliver Information Technology (IT) as a service. In this paper, we describe how these new forms of distributed computing can help modern health care industries. Cloud Computing is extending its benefits to industrial sectors, especially in medical scenarios. In Cloud Computing, IT-related capabilities and resources are provided as services, via distributed computing, on demand. This paper is concerned with developing software as a service (SaaS) by means of Cloud computing, with an aim to bring the emergency health care sector under one umbrella with physically secured patient records. In framing emergency healthcare treatment, the crucial information needed for decisions about patients is their previous health records, so ubiquitous access to appropriate records is essential. Palm vein pattern recognition promises secure patient record access. Our paper likewise presents an efficient means to view, edit, or transfer DICOM images instantly, which has been a challenging task for medical practitioners in past years. We have developed two services for health care: 1. a cloud-based palm vein recognition system, and 2. distributed medical image processing tools for medical practitioners.

  1. Lessons learned in deploying a cloud-based knowledge platform for the Earth Science Information Partners Federation (ESIP)

    NASA Astrophysics Data System (ADS)

    Pouchard, L. C.; Depriest, A.; Huhns, M.

    2012-12-01

    Ontologies and semantic technologies are an essential infrastructure component of systems supporting knowledge integration in the Earth Sciences. Numerous earth science ontologies exist, but they are hard to discover because they tend to be hosted with the projects that develop them. There are often few quality measures and sparse metadata associated with these ontologies, such as modification dates, versioning, purpose, number of classes, and properties. Projects often develop ontologies for their own needs without considering existing ontology entities or derivations from formal and more basic ontologies. The result is mostly orthogonal ontologies, and ontologies that are not modular enough to reuse in part or adapt for new purposes, in spite of existing standards for ontology representation. Additional obstacles to sharing and reuse include a lack of maintenance once a project is completed. These obstacles prevent the full exploitation of semantic technologies in a context where they could become needed enablers for service discovery and for matching data with services. To start addressing this gap, we have deployed BioPortal, a mature, domain-independent ontology and semantic service system developed by the National Center for Biomedical Ontologies (NCBO), on the ESIP Testbed under the governance of the ESIP Semantic Web cluster. ESIP provides a forum for a broad-based, distributed community of data and information technology practitioners and stakeholders to coordinate their efforts and develop new ideas for interoperability solutions. The Testbed provides an environment where innovations and best practices can be explored and evaluated. One objective of this deployment is to provide a community platform that harnesses the organizational and cyber infrastructure provided by ESIP at minimal cost. Another objective is to host ontology services on a scalable, public cloud and investigate the business case for crowd-sourcing of ontology maintenance. We deployed the system on Amazon's Elastic Compute Cloud (EC2), where ESIP maintains an account. Our approach had three phases: 1) set up a private cloud environment at the University of South Carolina to become familiar with the complex architecture of the system and enable some basic customization, 2) coordinate the production of a Virtual Appliance for the system with NCBO and deploy it on the Amazon cloud, and 3) reach out to the ESIP community to solicit participation, populate the repository, and develop new use cases. Phase 2 is nearing completion and Phase 3 is underway. Ontologies were gathered during updates to the ESIP cluster. Discussion points included the criteria for a shareable ontology and how to determine the best size for an ontology to be reusable. Outreach highlighted that the system can start addressing the integration of discovery frameworks via linking data and services in a pull model (data and service casting), a key issue of the Discovery cluster. This work thus presents several contributions: 1) technology injection from another domain into the earth sciences, 2) the deployment of a mature knowledge platform on the EC2 cloud, and 3) the successful engagement of the community through the ESIP clusters and Testbed model.

  2. A computational- And storage-cloud for integration of biodiversity collections

    USGS Publications Warehouse

    Matsunaga, A.; Thompson, A.; Figueiredo, R. J.; Germain-Aubrey, C.C; Collins, M.; Beeman, R.S; Macfadden, B.J.; Riccardi, G.; Soltis, P.S; Page, L. M.; Fortes, J.A.B

    2013-01-01

    A core mission of the Integrated Digitized Biocollections (iDigBio) project is the building and deployment of a cloud computing environment customized to support the digitization workflow and integration of data from all U.S. nonfederal biocollections. iDigBio chose to use cloud computing technologies to deliver a cyberinfrastructure that is flexible, agile, resilient, and scalable to meet the needs of the biodiversity community. In this context, this paper describes the integration of open source cloud middleware, applications, and third party services using standard formats, protocols, and services. In addition, this paper demonstrates the value of the digitized information from collections in a broader scenario involving multiple disciplines.

  3. Extraction of convective cloud parameters from Doppler Weather Radar MAX(Z) product using Image Processing Technique

    NASA Astrophysics Data System (ADS)

    Arunachalam, M. S.; Puli, Anil; Anuradha, B.

    2016-07-01

    In the present work, an online retrieval technique for continuous extraction of convective cloud optical information and reflectivity (MAX(Z), in dBZ) from the Doppler Weather Radar (DWR) located at the Indian Meteorological Department, Chennai, has been developed in MATLAB for time-series data production. Reflectivity measurements for different locations within the DWR's 250 km radius of circular coverage can be retrieved using this technique. It provides both time-series reflectivity for a point location and Range-Time Intensity (RTI) maps of reflectivity for the corresponding location. The Graphical User Interface (GUI) developed for the cloud reflectivity is user friendly; it also provides convective cloud optical information such as cloud base height (CBH), cloud top height (CTH), and cloud optical depth (COD). This technique is also applicable to retrieving other DWR products such as Plan Position Indicator (Z, in dBZ), Plan Position Indicator (Z, in dBZ)-Close Range, Volume Velocity Processing (V, in knots), Plan Position Indicator (V, in m/s), Surface Rainfall Intensity (SRI, mm/hr), and Precipitation Accumulation (PAC) 24 hrs at 0300 UTC. Keywords: Reflectivity, cloud top height, cloud base, cloud optical depth
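
    The core image-processing step can be sketched as follows (in Python rather than the authors' MATLAB; the colour palette, dBZ mapping, pixel location, and file names are all hypothetical): map each product-image pixel back to reflectivity via its nearest palette colour, then repeat over an image sequence to build a time series.

      import numpy as np
      from PIL import Image

      # Hypothetical mapping from the product image's colour scale to dBZ.
      PALETTE = {(0, 0, 255): 10, (0, 255, 0): 20,
                 (255, 255, 0): 35, (255, 0, 0): 50}

      def dbz_at(png_path, row, col):
          rgb = np.asarray(Image.open(png_path).convert("RGB"))[row, col]
          # nearest palette colour wins
          nearest = min(PALETTE, key=lambda c: sum((int(a) - int(b)) ** 2
                                                   for a, b in zip(c, rgb)))
          return PALETTE[nearest]

      # Placeholder file names for a sequence of MAX(Z) images.
      series = [dbz_at(f"maxz_{t:02d}.png", 120, 200) for t in range(4)]
      print(series)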

  4. Performance, Agility and Cost of Cloud Computing Services for NASA GES DISC Giovanni Application

    NASA Astrophysics Data System (ADS)

    Pham, L.; Chen, A.; Wharton, S.; Winter, E. L.; Lynnes, C.

    2013-12-01

    The NASA Goddard Earth Science Data and Information Services Center (GES DISC) is investigating the performance, agility, and cost of Cloud computing for GES DISC applications. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure), one of the core applications at the GES DISC for online climate-related Earth science data access, subsetting, analysis, visualization, and downloading, was used to evaluate the feasibility and effort of porting an application to the Amazon Cloud Services platform. The performance and cost of running Giovanni on the Amazon Cloud were compared to similar parameters for the GES DISC local operational system. A Giovanni time-series analysis of aerosol absorption optical depth (388 nm) from OMI (Ozone Monitoring Instrument)/Aura was selected for these comparisons. All required data were pre-cached in both the Cloud and the local system to avoid data transfer delays. Analyses over 3-, 6-, 12-, and 24-month data spans were run on both the Cloud and the local system, and the processing times were used to evaluate system performance. To investigate application agility, Giovanni was installed and tested on multiple Cloud platforms. The cost of using a Cloud computing platform mainly consists of computing, storage, data requests, and data transfer in/out. The Cloud computing cost is calculated based on an hourly rate, and the storage cost is calculated per Gigabyte per month. Incoming data transfer is free, while data transfer out is charged per Gigabyte. The costs for a local server system consist of buying hardware/software, system maintenance/updating, and operating costs. The results showed that the Cloud platform had 38% better performance and cost 36% less than the local system. This investigation shows the potential of cloud computing to increase system performance and lower the overall cost of system management.
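
    The cost structure described above can be made concrete with a back-of-envelope sketch (all rates and quantities below are hypothetical placeholders, not the study's figures):

      # Hypothetical monthly rates, mirroring the cost components above:
      # hourly compute, GB-month storage, free ingress, per-GB egress.
      def monthly_cloud_cost(vm_hours, vm_rate=0.10,
                             storage_gb=500, storage_rate=0.03,
                             egress_gb=200, egress_rate=0.09):
          compute = vm_hours * vm_rate
          storage = storage_gb * storage_rate
          egress = egress_gb * egress_rate       # transfer in is free
          return compute + storage + egress

      print(f"${monthly_cloud_cost(vm_hours=720):.2f} per month")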

  5. Distance Learning and Cloud Computing: "Just Another Buzzword or a Major E-Learning Breakthrough?"

    ERIC Educational Resources Information Center

    Romiszowski, Alexander J.

    2012-01-01

    "Cloud computing is a model for the enabling of ubiquitous, convenient, and on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and other services) that can be rapidly provisioned and released with minimal management effort or service provider interaction." This…

  6. GABE: A Cloud Brokerage System for Service Selection, Accountability and Enforcement

    ERIC Educational Resources Information Center

    Sundareswaran, Smitha

    2014-01-01

    Much like its meteorological counterpart, "Cloud Computing" is an amorphous agglomeration of entities. It is amorphous in that the exact layout of the servers, the load balancers, and their functions are neither known nor fixed. It's an agglomerate in that multiple service providers and vendors often coordinate to form a multitenant system…

  7. InSAR Deformation Time Series Processed On-Demand in the Cloud

    NASA Astrophysics Data System (ADS)

    Horn, W. B.; Weeden, R.; Dimarchi, H.; Arko, S. A.; Hogenson, K.

    2017-12-01

    During this past year, ASF has developed a cloud-based on-demand processing system known as HyP3 (http://hyp3.asf.alaska.edu/), the Hybrid Pluggable Processing Pipeline, for Synthetic Aperture Radar (SAR) data. The system makes it easy for a user who doesn't have the time or inclination to install and use complex SAR processing software to leverage SAR data in their research or operations. One such processing algorithm is the generation of a deformation time series product, a series of images representing ground displacements over time, which can be computed from a time series of interferometric SAR (InSAR) products. The set of software tools necessary to generate this useful product is difficult to install, configure, and use. Moreover, for a long time series with many images, the processing of just the interferograms can take days. Principally built by three undergraduate students at the ASF DAAC, the deformation time series processing relies on the new Amazon Batch service, which enables processing of jobs with complex interconnected dependencies in a straightforward and efficient manner. In the case of generating a deformation time series product from a stack of single-look complex SAR images, the system uses Batch to serialize the up-front processing, interferogram generation, optional tropospheric correction, and deformation time series generation. The most time-consuming portion is the interferogram generation, because even for a fairly small stack of images many interferograms need to be processed. By using AWS Batch, the interferograms are all generated in parallel; the entire process completes in hours rather than days. Additionally, the individual interferograms are saved in Amazon's cloud storage, so that when new data are acquired in the stack, an updated time series product can be generated with minimal additional processing. This presentation will focus on the development techniques and enabling technologies that were used in developing the time series processing in the ASF HyP3 system. Data and process flow from job submission through to order completion will be shown, highlighting the benefits of the cloud for each step.
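
    The fan-out/fan-in dependency pattern described here maps naturally onto the AWS Batch API. The sketch below (job names, queue, and job definition are hypothetical, and the real HyP3 internals surely differ) shows parallel interferogram jobs that all depend on one preparation job, with a final inversion job depending on all of them.

      import boto3

      batch = boto3.client("batch")

      def submit(name, depends_on=None, **params):
          """Submit one Batch job, optionally gated on earlier job ids."""
          return batch.submit_job(
              jobName=name,
              jobQueue="insar-queue",            # hypothetical queue
              jobDefinition="insar-jobdef",      # hypothetical job definition
              dependsOn=[{"jobId": j} for j in (depends_on or [])],
              parameters=params,
          )["jobId"]

      prep = submit("prep-stack")
      # Interferograms fan out in parallel, each waiting only on the prep job.
      ifgs = [submit(f"ifg-{i}", depends_on=[prep], pair=str(i))
              for i in range(12)]
      # The time-series inversion fans back in, waiting on every interferogram.
      submit("timeseries-inversion", depends_on=ifgs)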

  8. The NOAA Big Data Project

    NASA Astrophysics Data System (ADS)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  9. Spatial and Temporal Distribution of Clouds Observed by MODIS Onboard the Terra and Aqua Satellites

    NASA Technical Reports Server (NTRS)

    King, Michael D.; Platnick, Steven; Menzel, W. Paul; Ackerman, Steven A.; Hubanks, Paul A.

    2012-01-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) was developed by NASA and launched aboard the Terra spacecraft on December 18, 1999 and the Aqua spacecraft on May 4, 2002. A comprehensive set of remote sensing algorithms for the retrieval of cloud physical and optical properties has enabled over twelve years of continuous observations of cloud properties from Terra and over nine years from Aqua. The archived products from these algorithms include 1 km pixel-level (Level-2) and global gridded Level-3 products. In addition to an extensive cloud mask, products include cloud-top properties (temperature, pressure, effective emissivity), cloud thermodynamic phase, cloud optical and microphysical parameters (optical thickness, effective particle radius, water path), as well as derived statistics. Results include the latitudinal distribution of cloud optical and radiative properties for both liquid water and ice clouds, as well as latitudinal distributions of cloud top pressure and cloud top temperature. MODIS finds that the cloud fraction, as derived by the cloud mask, is nearly identical during the day and night, with only modest diurnal variation. Globally, the cloud fraction derived by the MODIS cloud mask is approximately 67%, with somewhat more clouds over land during the afternoon and fewer clouds over ocean in the afternoon, and very little difference in global cloud cover between Terra and Aqua. Overall, cloud fraction over land is approximately 55%, with a distinctive seasonal cycle, whereas ocean cloudiness is much higher, around 72%, with much reduced seasonal variation. Cloud top pressure and temperature have distinct spatial and temporal patterns, and clearly reflect our understanding of the global cloud distribution. High clouds are especially prevalent over the northern hemisphere continents between 30° and 50° latitude. Aqua and Terra have comparable zonal cloud top pressures, with Aqua having somewhat higher clouds (cloud top pressures lower by 100 hPa) over land due to afternoon deep convection. The coldest cloud tops (colder than 230 K) generally occur over Antarctica and in the high clouds of the tropics (the ITCZ and the deep convective clouds over the western tropical Pacific and the Indian subcontinent).

  10. Analysis of the VIIRS cloud mask, comparison with the NAVOCEANO cloud mask, and how they complement each other

    NASA Astrophysics Data System (ADS)

    Cayula, Jean-François P.; May, Douglas A.; McKenzie, Bruce D.

    2014-05-01

    The Visible Infrared Imaging Radiometer Suite (VIIRS) Cloud Mask (VCM) Intermediate Product (IP) has been developed for use with Suomi National Polar-orbiting Partnership (NPP) VIIRS Environmental Data Record (EDR) products. In particular, the VIIRS Sea Surface Temperature (SST) EDR relies on VCM to identify cloud contaminated observations. Unfortunately, VCM does not appear to perform as well as cloud detection algorithms for SST. This may be due to similar but different goals of the two algorithms. VCM is concerned with detecting clouds while SST is interested in identifying clear observations. The result is that in undetermined cases VCM defaults to "clear," while the SST cloud detection defaults to "cloud." This problem is further compounded because classic SST cloud detection often flags as "cloud" all types of corrupted data, thus making a comparison with VCM difficult. The Naval Oceanographic Office (NAVOCEANO), which operationally produces a VIIRS SST product, relies on cloud detection from the NAVOCEANO Cloud Mask (NCM), adapted from cloud detection schemes designed for SST processing. To analyze VCM, the NAVOCEANO SST process was modified to attach the VCM flags to all SST retrievals. Global statistics are computed for both day and night data. The cases where NCM and/or VCM tag data as cloud-contaminated or clear can then be investigated. By analyzing the VCM individual test flags in conjunction with the status of NCM, areas where VCM can complement NCM are identified.
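
    The global agreement statistics described here can be sketched as a simple 2x2 contingency table between the two masks; the flag arrays below are random stand-ins for the real VCM and NCM flags attached to each SST retrieval.

      import numpy as np

      rng = np.random.default_rng(1)
      vcm = rng.integers(0, 2, 10000)   # 0 = clear, 1 = cloud (stand-in)
      ncm = rng.integers(0, 2, 10000)

      # Cross-tabulate: rows = VCM, cols = NCM.
      table = np.zeros((2, 2), dtype=int)
      np.add.at(table, (vcm, ncm), 1)
      print(table)
      print("agreement:", (vcm == ncm).mean())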

  11. MISR CMV New Data

    Atmospheric Science Data Center

    2016-10-31

    Cloud Motion Vector (CMV) Product The MISR Level 3 Products are global or ... field campaigns at daily and monthly time scales. The CMV product provides conveniently organized, high quality retrievals of cloud ...

  12. A Mobility Management Using Follow-Me Cloud-Cloudlet in Fog-Computing-Based RANs for Smart Cities.

    PubMed

    Chen, Yuh-Shyan; Tsai, Yi-Ting

    2018-02-06

    Mobility management supporting location tracking and location-based services (LBS) is an important issue for smart cities, providing the means for the smooth transportation of people and goods. Mobility contributes to innovation in both public and private transportation infrastructures for smart cities. With the assistance of edge/fog computing, this paper presents a new mobility management scheme using the proposed follow-me cloud-cloudlet (FMCL) approach in fog-computing-based radio access networks (Fog-RANs) for smart cities. The proposed follow-me cloud-cloudlet approach is an integration strategy of follow-me cloud (FMC) and follow-me edge (FME, also called cloudlet). A user equipment (UE) receives data, transmitted from the original cloud, through the original edge cloud before the handover operation. After the handover operation, a UE searches for a new cloud, called the migrated cloud, and a new edge cloud near the UE, called the migrated edge cloud; the remaining data are migrated from the original cloud to the migrated cloud and received through the new edge cloud. Existing FMC work does not provide VM migration between cloudlets to reduce transmission latency, and existing FME work does not provide service migration between data centers to reduce transmission latency. Our proposed FMCL approach can simultaneously provide VM migration between cloudlets and service migration between data centers to significantly reduce transmission latency. The proposed mobility management using the FMCL approach aims to reduce the total transmission time by pre-scheduling and pre-storing some data packets in the cache of the cloudlet when a UE switches from the previous Fog-RAN to the serving Fog-RAN. To illustrate the performance achieved, mathematical analysis and simulation results are examined in terms of total transmission time, throughput, probability of packet loss, and number of control messages.
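
    The core latency argument can be illustrated with a toy model (all link rates and data sizes below are hypothetical, and this is not the paper's analytical model): data pre-staged in the target cloudlet's cache is served over the fast edge link, while the rest must still traverse the slower core path.

      # Toy latency model: data already pre-stored in the target cloudlet's
      # cache is served over the fast edge link; the remainder still crosses
      # the slower core network. All rates/sizes are hypothetical (MB, MB/s).
      def total_time(remaining_mb, cached_fraction,
                     core_mb_s=6.0, edge_mb_s=40.0):
          cached = remaining_mb * cached_fraction
          rest = remaining_mb - cached
          return cached / edge_mb_s + rest / core_mb_s   # seconds

      print(total_time(400, 0.0))   # no pre-caching before handover
      print(total_time(400, 0.6))   # 60% pre-stored in the cloudlet cache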

  13. A Mobility Management Using Follow-Me Cloud-Cloudlet in Fog-Computing-Based RANs for Smart Cities

    PubMed Central

    Tsai, Yi-Ting

    2018-01-01

    Mobility management supporting location tracking and location-based services (LBS) is an important issue for smart cities, providing the means for the smooth transportation of people and goods. Mobility contributes to innovation in both public and private transportation infrastructures for smart cities. With the assistance of edge/fog computing, this paper presents a new mobility management scheme using the proposed follow-me cloud-cloudlet (FMCL) approach in fog-computing-based radio access networks (Fog-RANs) for smart cities. The proposed follow-me cloud-cloudlet approach is an integration strategy of follow-me cloud (FMC) and follow-me edge (FME, also called cloudlet). A user equipment (UE) receives data, transmitted from the original cloud, through the original edge cloud before the handover operation. After the handover operation, a UE searches for a new cloud, called the migrated cloud, and a new edge cloud near the UE, called the migrated edge cloud; the remaining data are migrated from the original cloud to the migrated cloud and received through the new edge cloud. Existing FMC work does not provide VM migration between cloudlets to reduce transmission latency, and existing FME work does not provide service migration between data centers to reduce transmission latency. Our proposed FMCL approach can simultaneously provide VM migration between cloudlets and service migration between data centers to significantly reduce transmission latency. The proposed mobility management using the FMCL approach aims to reduce the total transmission time by pre-scheduling and pre-storing some data packets in the cache of the cloudlet when a UE switches from the previous Fog-RAN to the serving Fog-RAN. To illustrate the performance achieved, mathematical analysis and simulation results are examined in terms of total transmission time, throughput, probability of packet loss, and number of control messages. PMID:29415510

  14. Optical properties of aerosol contaminated cloud derived from MODIS instrument

    NASA Astrophysics Data System (ADS)

    Mei, Linlu; Rozanov, Vladimir; Lelli, Luca; Vountas, Marco; Burrows, John P.

    2016-04-01

    The presence of absorbing aerosols above or within cloud can reduce the amount of up-welling radiation in the visible (VIS) and short-wave infrared and darken the spectral reflectance when compared with the spectrum of a clean cloud observed by satellite instruments (Jethva et al., 2013). Cloud property retrieval for aerosol-contaminated cases is a great challenge. Even a small additional injection of aerosol particles into clouds in the cleanest regions of Earth's atmosphere will have a significant effect on those clouds and on climate forcing (Koren et al., 2014; Rosenfeld et al., 2014), because the microphysical cloud processes are non-linear with respect to the aerosol loading. Current cloud products, such as those from the Moderate Resolution Imaging Spectroradiometer (MODIS), ignore the aerosol effect in the retrieval, which may cause significant error in the satellite-derived cloud properties. In this paper, a new cloud property retrieval method that accounts for the aerosol effect, based on the weighting-function (WF) method, is presented. The retrieval results show that the WF-retrieved cloud properties (e.g., COT) agree quite well with the MODIS COT product for a relatively clear atmosphere (AOT ≤ 0.4), while there is a large difference for large aerosol loading. The MODIS COT product is underestimated by a factor of at least 2-3 for AOT > 0.4, and this underestimation increases with increasing AOT.

  15. Observing relationships between lightning and cloud profiles by means of a satellite-borne cloud radar

    NASA Astrophysics Data System (ADS)

    Buiat, Martina; Porcù, Federico; Dietrich, Stefano

    2017-01-01

    Cloud electrification and the related lightning activity in thunderstorms have their origin in the charge separation and resulting distribution of charged ice particles within the cloud. So far, the ice distribution within convective clouds has been investigated mainly by means of ground-based meteorological radars. In this paper we show how the products from the Cloud Profiling Radar (CPR) on board CloudSat, a polar satellite of NASA's Earth System Science Pathfinder (ESSP) program, can be used to obtain information from space on the vertical distribution of ice particles and ice content and to relate them to lightning activity. The analysis focuses on 12 convective events over Italy that coincided with CloudSat overpasses during significant lightning activity. The CPR products considered here are the vertical profiles of cloud ice water content (IWC) and the effective radius (ER) of ice particles, which are compared with the number of strokes as measured by a ground lightning network (LINET). Results show a strong correlation between the number of strokes and the vertical distribution of ice particles as depicted by the 94 GHz CPR products: in particular, at the cloud's upper and middle levels, high IWC and relatively large ER appear to be favourable contributory causes for CG (cloud-to-ground) stroke occurrence.

  16. Universal Keyword Classifier on Public Key Based Encrypted Multikeyword Fuzzy Search in Public Cloud

    PubMed Central

    Munisamy, Shyamala Devi; Chokkalingam, Arun

    2015-01-01

    Cloud computing has pioneered the emerging world by manifesting itself as a service through the internet, facilitating third-party infrastructure and applications. While customers have no visibility into how their data are stored on the service provider's premises, cloud computing offers great benefits in lowering infrastructure costs and delivering more flexibility and simplicity in managing private data. The opportunity to use cloud services on a pay-per-use basis provides comfort for private data owners in managing costs and data. With the pervasive usage of the internet, the focus has now shifted towards effective data utilization on the cloud without compromising security concerns. In the pursuit of increasing data utilization on public cloud storage, the key is to enable effective data access through fuzzy searching techniques. In this paper, we discuss existing fuzzy searching techniques and focus on reducing the searching time on the cloud storage server for effective data utilization. Our proposed Asymmetric Classifier Multikeyword Fuzzy Search method provides a classifier search server that creates a universal keyword classifier for multiple-keyword requests, which greatly reduces the searching time by learning the search-path pattern for all the keywords in the fuzzy keyword set. The objective of using a BTree fuzzy searchable index is to resolve typos and representation inconsistencies and also to facilitate effective data utilization. PMID:26380364
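
    As background for how such fuzzy keyword sets are commonly built, here is a hedged sketch of the standard wildcard construction for edit distance 1; the paper's exact BTree index layout and classifier are not reproduced.

      # Wildcard-based fuzzy keyword set for edit distance 1: each
      # substitution, insertion, and deletion position collapses to '*'.
      def fuzzy_set(word):
          variants = {word}
          for i in range(len(word)):
              variants.add(word[:i] + "*" + word[i + 1:])  # substitution at i
              variants.add(word[:i] + "*" + word[i:])      # insertion before i
              variants.add(word[:i] + word[i + 1:])        # deletion of i
          variants.add(word + "*")                         # insertion at end
          return variants

      print(sorted(fuzzy_set("cloud")))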

  17. Universal Keyword Classifier on Public Key Based Encrypted Multikeyword Fuzzy Search in Public Cloud.

    PubMed

    Munisamy, Shyamala Devi; Chokkalingam, Arun

    2015-01-01

    Cloud computing has pioneered the emerging world by manifesting itself as a service through the internet, facilitating third-party infrastructure and applications. While customers have no visibility into how their data are stored on the service provider's premises, cloud computing offers great benefits in lowering infrastructure costs and delivering more flexibility and simplicity in managing private data. The opportunity to use cloud services on a pay-per-use basis provides comfort for private data owners in managing costs and data. With the pervasive usage of the internet, the focus has now shifted towards effective data utilization on the cloud without compromising security concerns. In the pursuit of increasing data utilization on public cloud storage, the key is to enable effective data access through fuzzy searching techniques. In this paper, we discuss existing fuzzy searching techniques and focus on reducing the searching time on the cloud storage server for effective data utilization. Our proposed Asymmetric Classifier Multikeyword Fuzzy Search method provides a classifier search server that creates a universal keyword classifier for multiple-keyword requests, which greatly reduces the searching time by learning the search-path pattern for all the keywords in the fuzzy keyword set. The objective of using a BTree fuzzy searchable index is to resolve typos and representation inconsistencies and also to facilitate effective data utilization.

  18. Cloud-Hosted Real-time Data Services for the Geosciences (CHORDS)

    NASA Astrophysics Data System (ADS)

    Daniels, M. D.; Graves, S. J.; Vernon, F.; Kerkez, B.; Chandra, C. V.; Keiser, K.; Martin, C.

    2014-12-01

    Access, utilization, and management of real-time data continue to be challenging for decision makers as well as researchers in several scientific fields. This presentation will highlight infrastructure aimed at addressing some of the gaps in handling real-time data, particularly in increasing the accessibility of these data to the scientific community through cloud services. The Cloud-Hosted Real-time Data Services for the Geosciences (CHORDS) system addresses the ever-increasing importance of real-time scientific data, particularly in mission-critical scenarios where informed decisions must be made rapidly. Advances in the distribution of real-time data are allowing many new transient phenomena in space-time to be observed; however, real-time decision-making is infeasible in many cases that require streaming scientific data, as these data are locked down and sent only to proprietary in-house tools or displays. This lack of accessibility to the broader scientific community prohibits algorithm development and workflows initiated by these data streams. As part of NSF's EarthCube initiative, CHORDS proposes to make real-time data available to the academic community via cloud services. The CHORDS infrastructure will enhance the role of real-time data within the geosciences, specifically expanding the potential of streaming data sources in enabling adaptive experimentation and real-time hypothesis testing. Adherence to community data and metadata standards will promote the integration of CHORDS real-time data with existing standards-compliant analysis, visualization, and modeling tools.
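
    To give a flavor of what cloud-hosted real-time ingestion can look like, here is a hedged sketch of pushing a single measurement to a CHORDS-style portal over HTTP; the host, endpoint path, instrument id, variable names, and API key are all placeholders, not a documented CHORDS interface.

      import requests

      # Placeholder portal and parameters; a real deployment would define
      # its own instruments, variable short names, and access key.
      resp = requests.get(
          "http://my-chords-portal.example.org/measurements/url_create",
          params={"instrument_id": 25, "temp": 24.5, "rh": 71.0,
                  "key": "API_KEY"},
          timeout=10,
      )
      resp.raise_for_status()
      print("ingested:", resp.status_code)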

  19. The Generation of Near-Real Time Data Products for MODIS

    NASA Astrophysics Data System (ADS)

    Teague, M.; Schmaltz, J. E.; Ilavajhala, S.; Ye, G.; Masuoka, E.; Murphy, K. J.; Michael, K.

    2010-12-01

    The GSFC Terrestrial Information Systems Branch (614.5) operates the Land and Atmospheres Near-real-time Capability for EOS (LANCE-MODIS) system. Other LANCE elements include -AIRS, -MLS, -OMI, and -AMSR-E. LANCE-MODIS incorporates the former Rapid Response system and will, in early 2011, include the Fire Information for Resource Management System (FIRMS). The purpose of LANCE is to provide applications users with a variety of products on a near-real-time basis. The LANCE-MODIS data products include Level 1 (L1), L2 fire, snow, sea ice, cloud mask/profiles, aerosols, clouds, land surface reflectance, land surface temperature, and L2G and L3 gridded, daily, land surface reflectance products. Data are available either by ftp access (pull) or by subscription (push), and the L1 and L2 data products are available within an average of 2.5 hours of the observation time. The use of ancillary data products input to the standard science algorithms has been modified in order to achieve these latencies. The resulting products have been approved for applications use by the MODIS Science Team. The http://lance.nasa.gov site provides registration information and extensive information concerning the MODIS data products and imagery, including a comparison between the LANCE-MODIS and the standard science-quality products generated by the MODAPS system. The LANCE-MODIS system includes a variety of tools that enable users to manipulate the data products, including parameter, band, and geographic subsetting, re-projection, mosaicking, and generation of data in the GeoTIFF format. In most instances the data resulting from use of these tools have a latency of less than 3 hours. Access to these tools is available through a Web Coverage Service. A Google Earth/Web Mapping Service is available to access image products. LANCE-MODIS supports a wide variety of applications users in civilian, military, and foreign agencies as well as universities and the private sector. Examples of applications are flood mapping, famine relief, food and agriculture, hazards and disasters, and weather.

  20. Maintaining Enterprise Resiliency via Kaleidoscopic Adaption and Transformation of Software Services (MEERKATS)

    DTIC Science & Technology

    2016-04-01

    infrastructure. The work is motivated by the fact that today's clouds are very static, uniform, and predictable, allowing attackers who identify a ... vulnerability in one of the services or infrastructure components to spread their effect to other, mission-critical services. Our goal is to integrate into ... clouds by elevating continuous change, evolution, and misinformation as first-rate design principles of the cloud's infrastructure. Our work is ...

  1. WE-B-BRD-01: Innovation in Radiation Therapy Planning II: Cloud Computing in RT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, K; Kagadis, G; Xing, L

    As defined by the National Institute of Standards and Technology, cloud computing is "a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction." Despite the omnipresent role of computers in radiotherapy, cloud computing has yet to achieve widespread adoption in clinical or research applications, though the transition to such "on-demand" access is underway. As this transition proceeds, new opportunities for aggregate studies and efficient use of computational resources are set against new challenges in patient privacy protection, data integrity, and management of clinical informatics systems. In this Session, current and future applications of cloud computing and distributed computational resources will be discussed in the context of medical imaging, radiotherapy research, and clinical radiation oncology applications. Learning Objectives: 1. Understand basic concepts of cloud computing. 2. Understand how cloud computing could be used for medical imaging applications. 3. Understand how cloud computing could be employed for radiotherapy research. 4. Understand how clinical radiotherapy software applications would function in the cloud.

  2. A Distributed Parallel Genetic Algorithm of Placement Strategy for Virtual Machines Deployment on Cloud Platform

    PubMed Central

    Dong, Yu-Shuang; Xu, Gao-Chao; Fu, Xiao-Dong

    2014-01-01

    The cloud platform provides various services to users. More and more cloud centers provide infrastructure as their main mode of operation. To improve the utilization rate of the cloud center and to decrease the operating cost, the cloud center provides services according to the requirements of users by sharding the resources through virtualization. Considering both QoS for users and cost saving for cloud computing providers, we try to maximize performance and minimize energy cost as well. In this paper, we propose a distributed parallel genetic algorithm (DPGA) as a placement strategy for virtual machine deployment on the cloud platform. In the first stage, it executes the genetic algorithm in parallel and in a distributed manner on several selected physical hosts. It then continues with a second-stage genetic algorithm, using the solutions obtained from the first stage as the initial population. The solution produced by the second-stage genetic algorithm is the final result of the proposed approach. The experimental results show that the proposed placement strategy for VM deployment can ensure QoS for users and is more effective and more energy efficient than other placement strategies on the cloud platform. PMID:25097872
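
    For intuition, below is a minimal single-population genetic algorithm for VM-to-host placement; the load model, fitness function, and rates are toy stand-ins, and the paper's two-stage distributed scheme, QoS terms, and energy model are not reproduced.

      import random

      N_VMS, N_HOSTS, POP, GENS = 20, 5, 30, 100
      load = [random.uniform(0.1, 0.9) for _ in range(N_VMS)]  # toy VM loads

      def fitness(plan):
          # Smaller spread of per-host load = better balance = higher fitness.
          per_host = [0.0] * N_HOSTS
          for vm, host in enumerate(plan):
              per_host[host] += load[vm]
          return -(max(per_host) - min(per_host))

      pop = [[random.randrange(N_HOSTS) for _ in range(N_VMS)]
             for _ in range(POP)]
      for _ in range(GENS):
          pop.sort(key=fitness, reverse=True)
          elite = pop[: POP // 2]                   # keep the better half
          children = []
          while len(elite) + len(children) < POP:
              a, b = random.sample(elite, 2)
              cut = random.randrange(1, N_VMS)
              child = a[:cut] + b[cut:]             # one-point crossover
              if random.random() < 0.1:             # point mutation
                  child[random.randrange(N_VMS)] = random.randrange(N_HOSTS)
              children.append(child)
          pop = elite + children

      best = max(pop, key=fitness)
      print("best load-balance gap:", -fitness(best))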

  3. A distributed parallel genetic algorithm of placement strategy for virtual machines deployment on cloud platform.

    PubMed

    Dong, Yu-Shuang; Xu, Gao-Chao; Fu, Xiao-Dong

    2014-01-01

    The cloud platform provides various services to users. More and more cloud centers provide infrastructure as their main mode of operation. To improve the utilization rate of the cloud center and to decrease the operating cost, the cloud center provides services according to the requirements of users by sharding the resources through virtualization. Considering both QoS for users and cost saving for cloud computing providers, we try to maximize performance and minimize energy cost as well. In this paper, we propose a distributed parallel genetic algorithm (DPGA) as a placement strategy for virtual machine deployment on the cloud platform. In the first stage, it executes the genetic algorithm in parallel and in a distributed manner on several selected physical hosts. It then continues with a second-stage genetic algorithm, using the solutions obtained from the first stage as the initial population. The solution produced by the second-stage genetic algorithm is the final result of the proposed approach. The experimental results show that the proposed placement strategy for VM deployment can ensure QoS for users and is more effective and more energy efficient than other placement strategies on the cloud platform.

  4. Investigating the Use of Cloudbursts for High-Throughput Medical Image Registration

    PubMed Central

    Kim, Hyunjoo; Parashar, Manish; Foran, David J.; Yang, Lin

    2010-01-01

    This paper investigates the use of clouds and autonomic cloudbursting to support a medical image registration service. The goal is to enable a virtual computational cloud that integrates local computational environments and public cloud services on the fly, and to support image registration requests from different distributed research groups with varied computational requirements and QoS constraints. The virtual cloud essentially implements shared and coordinated task-spaces, which coordinate the scheduling of jobs submitted by a dynamic set of research groups to their local job queues. A policy-driven scheduling agent uses the QoS constraints, along with performance history and the state of the resources, to determine the appropriate size and mix of the public and private cloud resources that should be allocated to a specific request. The virtual computational cloud and the medical image registration service have been developed using the CometCloud engine and have been deployed on a combination of private clouds at Rutgers University and the Cancer Institute of New Jersey and Amazon EC2. An experimental evaluation is presented and demonstrates the effectiveness of autonomic cloudbursts and policy-based autonomic scheduling for this application. PMID:20640235
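
    A hedged sketch of the policy idea (capacities, job sizes, and the deadline rule are hypothetical, not CometCloud's actual policy engine): keep as many jobs as the private cloud can finish by the deadline, and burst the remainder to the public cloud.

      # Toy policy agent: fill the private cloud up to what it can finish
      # by the deadline, then burst the remaining registration jobs to the
      # public cloud. All numbers are hypothetical.
      def allocate(jobs, deadline_h, private_free_nodes, job_hours=1.0):
          private_capacity = int(private_free_nodes * deadline_h / job_hours)
          n_private = min(jobs, private_capacity)
          return {"private": n_private, "public": jobs - n_private}

      print(allocate(jobs=120, deadline_h=4, private_free_nodes=16))
      # -> {'private': 64, 'public': 56}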

  5. Daytime variations of absorbing aerosols above clouds in the southeast Atlantic

    NASA Astrophysics Data System (ADS)

    Chang, Y. Y.; Christopher, S. A.

    2016-12-01

    The daytime variation of aerosol optical depth (AOD) above maritime stratocumulus clouds in the southeast Atlantic is investigated by merging geostationary data from the Spinning Enhanced Visible and Infrared Imager (SEVIRI) with NASA A-Train data sets. SEVIRI's 15-minute retrievals of above-cloud AOD and of the optical depth (COD) of the cloud below the aerosol provide the opportunity to assess the direct radiative forcing using actual cloud and aerosol properties instead of fixed values from polar-orbiting measurements. The impact of overlying aerosols on the cloud mask products is examined by comparison with active spaceborne lidar to assess the performance of the product. Uncertainty analyses addressing the effect of aerosol properties on the estimation of optical properties and radiative forcing are also presented.

  6. Spectral Longwave Cloud Radiative Forcing as Observed by AIRS

    NASA Technical Reports Server (NTRS)

    Blaisdell, John M.; Susskind, Joel; Lee, Jae N.; Iredell, Lena

    2016-01-01

    AIRS V6 products contain the spectral contributions to Outgoing Longwave Radiation (OLR), clear-sky OLR (OLR_CLR), and Longwave Cloud Radiative Forcing (LWCRF) in 16 bands from 100 cm^-1 to 3260 cm^-1. We show climatologies of selected spectrally resolved AIRS V6 products over the period of September 2002 through August 2016. Spectrally resolved LWCRF can better describe the response of the Earth system to cloud and cloud feedback processes. The spectral LWCRF enables us to estimate the fraction of each contributing factor to cloud forcing, i.e., surface temperature, mid- to upper-tropospheric water vapor, and tropospheric temperature. This presentation also compares the spatial characteristics of LWCRF from AIRS, CERES_EBAF Edition-2.8, and MERRA-2. The AIRS and CERES LWCRF products show good agreement. The OLR bias between AIRS and CERES is very close to that of OLR_CLR. This implies that both the AIRS and CERES OLR products accurately account for the effect of clouds on OLR.

  7. Intelligent cloud computing security using genetic algorithm as a computational tools

    NASA Astrophysics Data System (ADS)

    Razuky AL-Shaikhly, Mazin H.

    2018-05-01

    An essential change has occurred in the field of Information Technology with cloud computing: the cloud provides virtual assets by means of the web, yet poses great difficulties in the fields of information security and privacy assurance. Currently, the main problem with cloud computing is how to improve privacy and security for the cloud, since security is critical in the cloud. This paper attempts to address cloud security by using an intelligent system with a genetic algorithm as a wall to keep cloud data secure; all services provided by the cloud must detect who receives them and register users to create a list of users (trusted or un-trusted) depending on behavior. The execution of the present proposal has shown good outcomes.

  8. Comparison of Cloud Properties from CALIPSO-CloudSat and Geostationary Satellite Data

    NASA Technical Reports Server (NTRS)

    Nguyen, L.; Minnis, P.; Chang, F.; Winker, D.; Sun-Mack, S.; Spangenberg, D.; Austin, R.

    2007-01-01

    Cloud properties are being derived in near-real time from geostationary satellite imager data for a variety of weather and climate applications and research. Assessment of the uncertainties in each of the derived cloud parameters is essential for confident use of the products. Determination of cloud amount, cloud top height, and cloud layering is especially important for using these real-time products for applications such as aircraft icing condition diagnosis and numerical weather prediction model assimilation. Furthermore, the distribution of clouds as a function of altitude has become a central component of efforts to evaluate climate model cloud simulations. Validation of those parameters has been difficult except over limited areas where ground-based active sensors, such as cloud radars or lidars, have been available on a regular basis. Retrievals of cloud properties are sensitive to the surface background, time of day, and the clouds themselves. Thus, it is essential to assess the geostationary satellite retrievals over a variety of locations. The availability of cloud radar data from CloudSat and lidar data from CALIPSO makes it possible to perform those assessments over each geostationary domain at 0130 and 1330 LT. In this paper, CloudSat and CALIPSO data are matched with contemporaneous Geostationary Operational Environmental Satellite (GOES), Multi-functional Transport Satellite (MTSAT), and Meteosat-8 data. Unlike comparisons with cloud products derived from A-Train imagers, this study considers comparisons of nadir active sensor data with off-nadir retrievals. These matched data are used to determine the uncertainties in cloud-top heights and cloud amounts derived from the geostationary satellite data using the Clouds and the Earth's Radiant Energy System (CERES) cloud retrieval algorithms. The CERES multi-layer cloud detection method is also evaluated to determine its accuracy and limitations in the off-nadir mode. The results will be useful for constraining the use of the passive retrieval data in models and for improving the accuracy of the retrievals.
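
    The matching step described here amounts to space-time collocation. The sketch below (arrays and the 5 km window are hypothetical stand-ins) finds, for each active-sensor footprint, the nearest geostationary pixel within a distance threshold.

      import numpy as np

      def haversine_km(lat1, lon1, lat2, lon2):
          """Great-circle distance in km; accepts NumPy arrays."""
          r = np.radians
          a = (np.sin(r(lat2 - lat1) / 2) ** 2 +
               np.cos(r(lat1)) * np.cos(r(lat2)) *
               np.sin(r(lon2 - lon1) / 2) ** 2)
          return 2 * 6371.0 * np.arcsin(np.sqrt(a))

      def collocate(track_lat, track_lon, pix_lat, pix_lon, max_km=5.0):
          """Index of nearest imager pixel per footprint, or None if too far."""
          matches = []
          for lat, lon in zip(track_lat, track_lon):
              d = haversine_km(lat, lon, pix_lat, pix_lon)
              i = int(np.argmin(d))
              matches.append(i if d[i] <= max_km else None)
          return matches

      # Toy stand-ins for a CALIPSO/CloudSat track and GOES pixel centers.
      print(collocate(np.array([10.0]), np.array([-75.0]),
                      np.array([9.98, 10.4]), np.array([-75.03, -74.6])))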

  9. GOME and Sciamachy data access using the Netherlands Sciamachy Data Center

    NASA Astrophysics Data System (ADS)

    Som de Cerff, Wim; de Vreede, Ernst; van de Vegte, John; van Hees, Ricard; van der Neut, Ian; Stammes, Piet; Pieters, Ankie; van der A, Ronald

    2010-05-01

    The Netherlands Sciamachy Data Center (NL-SCIA-DC) has provided access to satellite data from the GOME and Sciamachy instruments for over 10 years. GOME and Sciamachy both measure trace gases such as ozone, methane, NO2, and aerosols, which are important for climate and air-quality monitoring. Recently (February 2010), a new release of the NL-SCIA-DC introduced an improved processing and archiving structure and an improved user interface. This Java Webstart application allows the user to browse, query, and download GOME and Sciamachy data products, including KNMI and SRON GOME and Sciamachy products (cloud products, CH4, NO2, CO). Data can be searched at the file and pixel level, and can be displayed graphically. The huge database containing all pixel information from GOME and Sciamachy is unique and allows specific selections, e.g., selecting cloud-free pixels. Ordered data are delivered by FTP or email. The available data span the mission lifetimes of GOME and Sciamachy, and are constantly updated as new data become available. Future upgrades to the data services include additional functionality for end-users of Sciamachy data. One of the functionalities provided will be the ability to select and process Sciamachy products using different data processors, using Grid technology. This technology has been successfully researched and will be made operationally available in the near future.

  10. Can cloud computing benefit health services? - a SWOT analysis.

    PubMed

    Kuo, Mu-Hsing; Kushniruk, Andre; Borycki, Elizabeth

    2011-01-01

    In this paper, we discuss cloud computing, its current state in healthcare, and the challenges and opportunities of adopting it in that domain. A Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis was used to evaluate the feasibility of adopting this computing model in healthcare. The paper concludes that cloud computing could have huge benefits for healthcare, but a number of issues will need to be addressed before its widespread adoption.

  11. Spatial and Temporal Distribution of Clouds as Observed by MODIS Onboard the Terra and Aqua Satellites

    NASA Technical Reports Server (NTRS)

    King, Michael D.; Platnick, Steven; Menzel, Paul; Ackerman, Steven A.

    2006-01-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) was developed by NASA and launched onboard the Terra spacecraft on December 18, 1999 and the Aqua spacecraft on May 4, 2002. It achieved its final orbit and began Earth observations on February 24, 2000 for Terra and June 24, 2002 for Aqua. A comprehensive set of remote sensing algorithms for cloud masking and the retrieval of cloud physical and optical properties has been developed by members of the MODIS atmosphere science team. The archived products from these algorithms have applications in climate change studies, climate modeling, numerical weather prediction, and fundamental atmospheric research. In addition to an extensive cloud mask, products include cloud-top properties (temperature, pressure, effective emissivity), cloud thermodynamic phase, cloud optical and microphysical parameters (optical thickness, effective particle radius, water path), as well as derived statistics. Over the last year, extensive improvements and enhancements to the global cloud products have been implemented, and reprocessing of all Terra MODIS data, back to first light in February 2000, has commenced. In the cloud mask algorithm, the most extensive improvements were in distinguishing clouds at nighttime, including the challenging polar darkness regions of the world. Additional improvements have been made to properly distinguish sunglint from clouds in tropical ocean regions, and to improve the discrimination of clouds from snow during daytime in polar regions. We will show global monthly mean cloud fraction for both Terra and Aqua, and show how similar the global daytime cloud fraction is from these morning and afternoon orbits, respectively. We will also show the zonal distribution of cloud fraction over land and ocean regions for both Terra and Aqua, and show the time series of global cloud fraction from July 2002 through June 2006.

  12. AIRS Subpixel Cloud Characterization Using MODIS Cloud Products.

    NASA Astrophysics Data System (ADS)

    Li, Jun; Menzel, W. Paul; Sun, Fengying; Schmit, Timothy J.; Gurka, James

    2004-08-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) and the Atmospheric Infrared Sounder (AIRS) measurements from the Earth Observing System's (EOS's) Aqua satellite enable improved global monitoring of the distribution of clouds. MODIS is able to provide, at high spatial resolution (1-5 km), a cloud mask, surface and cloud types, cloud phase, cloud-top pressure (CTP), effective cloud amount (ECA), cloud particle size (CPS), and cloud optical thickness (COT). AIRS is able to provide CTP, ECA, CPS, and COT at coarser spatial resolution (13.5 km at nadir) but with much better accuracy using its high-spectral-resolution measurements. The combined MODIS/AIRS system offers the opportunity for improved cloud products over those possible from either system alone. The key steps for synergistic use of imager and sounder radiance measurements are 1) collocation in space and time and 2) determination of imager cloud amount, type, and phase within the sounder pixel. The MODIS and AIRS measurements from the EOS Aqua satellite provide the opportunity to study the synergistic use of advanced imager and sounder measurements. As a first step, the MODIS classification procedure is applied to identify various surface and cloud types within an AIRS footprint. Cloud-layer information (low, midlevel, or high clouds) and phase information (water, ice, or mixed-phase clouds) within the AIRS footprint are sorted and characterized using MODIS 1-km-spatial-resolution data. The combined MODIS and AIRS data for various scenes are analyzed to study the utility of the synergistic use of high-spatial-resolution imager products and high-spectral-resolution sounder radiance measurements. There is relevance to the optimal use of data from the Advanced Baseline Imager (ABI) and Hyperspectral Environmental Suite (HES) systems, which are to fly on the Geostationary Operational Environmental Satellite (GOES)-R.
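
    Step 2 above, characterizing the imager pixels inside a sounder footprint, can be pictured with a short sketch; the mask and phase encodings below are assumptions for illustration, not the MODIS product conventions:

      # Hypothetical sketch: summarize the ~1-km MODIS pixels collocated
      # within one ~13.5-km AIRS footprint.
      import numpy as np
      from collections import Counter

      def characterize_footprint(modis_mask, modis_phase):
          """modis_mask: 1 = cloudy, 0 = clear. modis_phase: 'water',
          'ice', or 'mixed' for cloudy pixels, None for clear ones."""
          cloud_fraction = float(np.mean(modis_mask))
          phases = Counter(p for p, m in zip(modis_phase, modis_mask) if m == 1)
          dominant = phases.most_common(1)[0][0] if phases else "clear"
          return cloud_fraction, dominant

      cf, phase = characterize_footprint(
          np.array([1, 1, 0, 1, 0, 1]),
          ["ice", "ice", None, "water", None, "ice"])
      print(round(cf, 2), phase)  # 0.67 ice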


  13. The virtual machine (VM) scaler: an infrastructure manager supporting environmental modeling on IaaS clouds

    USDA-ARS?s Scientific Manuscript database

    Infrastructure-as-a-service (IaaS) clouds provide a new medium for deployment of environmental modeling applications. Harnessing advancements in virtualization, IaaS clouds can provide dynamic scalable infrastructure to better support scientific modeling computational demands. Providing scientific m...

  14. The GOES-R/JPSS Approach for Identifying Hazardous Low Clouds: Overview and Operational Impacts

    NASA Astrophysics Data System (ADS)

    Calvert, Corey; Pavolonis, Michael; Lindstrom, Scott; Gravelle, Chad; Terborg, Amanda

    2017-04-01

    Low ceiling and visibility is a weather hazard that nearly every forecaster, in nearly every National Weather Service (NWS) Weather Forecast Office (WFO), must regularly address. In addition, national forecast centers such as the Aviation Weather Center (AWC), Alaska Aviation Weather Unit (AAWU) and the Ocean Prediction Center (OPC) are responsible for issuing low ceiling and visibility related products. As such, reliable methods for detecting and characterizing hazardous low clouds are needed. Traditionally, hazardous areas of Fog/Low Stratus (FLS) are identified using a simple stand-alone satellite product that is constructed by subtracting the 3.9 and 11 μm brightness temperatures. However, the 3.9-11 μm brightness temperature difference (BTD) has several major limitations. In an effort to address the limitations of the BTD product, the GOES-R Algorithm Working Group (AWG) developed an approach that fuses satellite, Numerical Weather Prediction (NWP) model, Sea Surface Temperature (SST) analyses, and other data sets (e.g. digital surface elevation maps, surface emissivity maps, and surface type maps) to determine the probability that hazardous low clouds are present using a naïve Bayesian classifier. In addition, recent research has focused on blending geostationary (e.g. GOES-R) and low earth orbit (e.g. JPSS) satellite data to further improve the products. The FLS algorithm has adopted an enterprise approach in that it can utilize satellite data from a variety of current and future operational sensors and NWP data from a variety of models. The FLS products are available in AWIPS/N-AWIPS/AWIPS-II and have been evaluated within NWS operations over the last four years as part of the Satellite Proving Ground. Forecaster feedback has been predominantly positive, and references to these products within Area Forecast Discussions (AFDs) indicate that the products are influencing operational forecasts. At the request of the NWS, the FLS products are currently being transitioned to NOAA/NESDIS operations, which will ensure that users have long-term access to these products. This paper will provide an overview of the FLS products and illustrate how they are being used to improve transportation safety and efficiency.
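
    The probabilistic fusion at the heart of this approach can be sketched in a few lines. This is a generic naive Bayes combination under an independence assumption; the two predictors and their likelihood-ratio values are invented placeholders, not the operational FLS lookup tables:

      # Minimal naive-Bayes sketch: combine independent predictors
      # (e.g. a satellite BTD signal, an NWP humidity signal) into a
      # probability of hazardous low cloud.
      def fls_probability(prior, likelihood_ratios):
          """prior: climatological P(FLS); likelihood_ratios: one
          P(x_i | FLS) / P(x_i | no FLS) per predictor."""
          odds = prior / (1.0 - prior)
          for lr in likelihood_ratios:
              odds *= lr
          return odds / (1.0 + odds)

      # A supportive BTD signal (LR = 6) plus moderately supportive NWP
      # humidity (LR = 2) lifts a 10% prior to a 57% FLS probability.
      print(round(fls_probability(0.10, [6.0, 2.0]), 2))  # 0.57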

  15. Using Cloud-Computing Applications to Support Collaborative Scientific Inquiry: Examining Pre-Service Teachers' Perceived Barriers to Integration

    ERIC Educational Resources Information Center

    Donna, Joel D.; Miller, Brant G.

    2013-01-01

    Technology plays a crucial role in facilitating collaboration within the scientific community. Cloud-computing applications, such as Google Drive, can be used to model such collaboration and support inquiry within the secondary science classroom. Little is known about pre-service teachers' beliefs related to the envisioned use of collaborative,…

  16. A City Parking Integration System Combined with Cloud Computing Technologies and Smart Mobile Devices

    ERIC Educational Resources Information Center

    Yeh, Her-Tyan; Chen, Bing-Chang; Wang, Bo-Xun

    2016-01-01

    The current study applied cloud computing technology and smart mobile devices combined with a streaming server for parking lots to plan a city parking integration system. It is also equipped with a parking search system, parking navigation system, parking reservation service, and car retrieval service. With this system, users can quickly find…

  17. Safety & Service in the Skies

    ERIC Educational Resources Information Center

    Grush, Mary

    2009-01-01

    As colleges and universities rely more heavily on software as a service (SaaS), they're putting more critical data in the cloud. What are the security issues, and how are cloud providers responding? "Campus Technology" ("CT") went to three higher ed SaaS vendors--Google, IBM, and TopSchool--and asked them to share their thoughts about the state of…

  18. The Globus Galaxies Platform. Delivering Science Gateways as a Service

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madduri, Ravi; Chard, Kyle; Chard, Ryan

    The use of public cloud computers to host sophisticated scientific data and software can transform scientific practice by enabling broad access to capabilities previously available only to the few. The primary obstacle to more widespread use of public clouds to host scientific software (‘cloud-based science gateways’) has thus far been the considerable gap between the specialized needs of science applications and the capabilities provided by cloud infrastructures. We describe here a domain-independent, cloud-based science gateway platform, the Globus Galaxies platform, which overcomes this gap by providing a set of hosted services that directly address the needs of science gateway developers. The design and implementation of this platform leverages our several years of experience with Globus Genomics, a cloud-based science gateway that has served more than 200 genomics researchers across 30 institutions. Building on that foundation, we have also implemented a platform that leverages the popular Galaxy system for application hosting and workflow execution; Globus services for data transfer, user and group management, and authentication; and a cost-aware elastic provisioning model specialized for public cloud resources. We describe here the capabilities and architecture of this platform, present six scientific domains in which we have successfully applied it, report on user experiences, and analyze the economics of our deployments. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
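
    The cost-aware elastic provisioning idea mentioned above can be illustrated with a toy decision rule; the instance classes, prices, and risk threshold below are assumptions made for the sketch, not the platform's actual policy:

      # Hedged sketch of a cost-aware provisioning decision: prefer
      # cheaper spot capacity unless the job is deadline-bound or the
      # interruption risk outweighs the savings.
      def choose_instance(queue_depth, on_demand_price, spot_price,
                          spot_interruption_risk, deadline_tight):
          if queue_depth == 0:
              return None  # scale to zero when idle
          savings = (on_demand_price - spot_price) / on_demand_price
          if deadline_tight or spot_interruption_risk > savings:
              return "on-demand"
          return "spot"

      print(choose_instance(12, 0.40, 0.12, 0.15, deadline_tight=False))  # spot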

  19. Menu-driven cloud computing and resource sharing for R and Bioconductor

    PubMed Central

    Bolouri, Hamid; Angerman, Michael

    2011-01-01

    Summary: We report CRdata.org, a cloud-based, free, open-source web server for running analyses and sharing data and R scripts with others. In addition to using the free, public service, CRdata users can launch their own private Amazon Elastic Compute Cloud (EC2) nodes and store private data and scripts on Amazon's Simple Storage Service (S3) with user-controlled access rights. All CRdata services are provided via point-and-click menus. Availability and Implementation: CRdata is open-source and free under the permissive MIT License (opensource.org/licenses/mit-license.php). The source code is in Ruby (ruby-lang.org/en/) and available at: github.com/seerdata/crdata. Contact: hbolouri@fhcrc.org PMID:21685055

  20. Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms

    PubMed Central

    Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel

    2017-01-01

    With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs) and workflow execution on VM instances. To make the system scalable and easy to extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their executions in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture, and analyze the differences and best usages of the algorithms in terms of performance, cost and the price/performance ratio via experimental studies. PMID:29399237
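
    One plausible shape for such a heuristic, sketched here purely for illustration (it is not one of the paper's four algorithms): place each arriving task on the cheapest running VM that can still meet its deadline, and start a new VM only when none qualifies.

      # Illustrative price/performance scheduling heuristic for a
      # WFaaS-style service; times are in hours, prices in $/hour.
      def schedule(task_runtime, deadline, vms, new_vm_price):
          """vms: list of (available_at, price_per_hour) tuples."""
          candidates = [(avail + task_runtime, price)
                        for avail, price in vms
                        if avail + task_runtime <= deadline]
          if candidates:
              return min(candidates, key=lambda c: c[1])  # cheapest on time
          return ("new-vm", new_vm_price)

      print(schedule(2.0, 6.0, [(1.0, 0.50), (3.5, 0.10), (5.0, 0.05)], 0.40))
      # (5.5, 0.1): the $0.10/h VM finishes at t = 5.5, inside the deadline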

  1. Scientific Data Storage for Cloud Computing

    NASA Astrophysics Data System (ADS)

    Readey, J.

    2014-12-01

    Traditionally, data storage used for geophysical software systems has centered on file-based systems and libraries such as NetCDF and HDF5. In contrast, cloud-based infrastructure providers such as Amazon AWS, Microsoft Azure, and the Google Cloud Platform generally provide storage technologies based on an object storage service (for large binary objects) complemented by a database service (for small objects that can be represented as key-value pairs). These systems have been shown to be highly scalable, reliable, and cost effective. We will discuss a proposed system that leverages these cloud-based storage technologies to provide an API-compatible library for traditional NetCDF and HDF5 applications. This system will enable cloud storage suitable for geophysical applications that can scale up to petabytes of data and thousands of users. We'll also cover other advantages of this system, such as enhanced metadata search.
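
    The core mapping such a library rests on can be sketched simply: each dataset chunk becomes one object whose key encodes its chunk indices, while the small, frequently read metadata lives in a key-value record. The key scheme and JSON layout below are assumptions for illustration, not the proposed system's actual design:

      # Sketch: map array chunks to object-store keys plus a small
      # key-value metadata record.
      import json

      def chunk_key(dataset, chunk_index):
          return f"{dataset}/chunks/{'.'.join(map(str, chunk_index))}"

      def metadata_record(dataset, shape, chunk_shape, dtype):
          return json.dumps({"dataset": dataset, "shape": shape,
                             "chunks": chunk_shape, "dtype": dtype})

      print(chunk_key("sst", (3, 0, 12)))  # sst/chunks/3.0.12
      print(metadata_record("sst", [365, 180, 360], [1, 180, 360], "f4"))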

  2. Spatially Varying Spectral Thresholds for MODIS Cloud Detection

    NASA Technical Reports Server (NTRS)

    Haines, S. L.; Jedlovec, G. J.; Lafontaine, F.

    2004-01-01

    The EOS science team has developed an elaborate global MODIS cloud detection procedure, and the resulting MODIS product (MOD35) is used in the retrieval process of several geophysical parameters to mask out clouds. While the global application of the cloud detection approach appears quite robust, the product has some shortcomings on the regional scale, often over-determining clouds in a variety of settings, particularly at night. This over-determination of clouds can cause a reduction in the spatial coverage of MODIS-derived clear-sky products. To minimize this problem, a new regional cloud detection method for use with MODIS data has been developed at NASA's Global Hydrology and Climate Center (GHCC). The approach is similar to that used by the GHCC for GOES data over the continental United States. Several spatially varying thresholds are applied to MODIS spectral data to produce a set of tests for detecting clouds. The thresholds are valid for each MODIS orbital pass and are derived from 20-day composites of GOES channels with wavelengths similar to MODIS. This paper and accompanying poster introduce the GHCC MODIS cloud mask, provide some examples, and present some preliminary validation.
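
    A single test of this kind is straightforward to picture: a pixel is flagged cloudy when its observed brightness temperature falls a set margin below the clear-sky composite for that location. The 6 K offset here is an illustrative value, not one of the GHCC thresholds:

      # Hypothetical composite-based threshold test: compare observed
      # 11-um brightness temperature against a 20-day clear-sky composite.
      import numpy as np

      def cold_cloud_test(bt11, clear_composite_bt11, offset_k=6.0):
          """Boolean cloud flag per pixel."""
          return bt11 < (clear_composite_bt11 - offset_k)

      bt = np.array([[285.0, 262.0], [290.5, 288.0]])
      composite = np.array([[288.0, 287.5], [291.0, 289.0]])
      print(cold_cloud_test(bt, composite))  # [[False True] [False False]]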

  3. A Model for Trust-based Access Control and Delegation in Mobile Clouds (Post Print)

    DTIC Science & Technology

    2013-10-01

    the access-granter knowing the identity of access requester beforehand and authenticating the requester, can no longer be applied. Mobile Wallet Cloud...(TktC) for a reservation and contacts the user’s mobile wallet provider (MobWC) to purchase the ticket from TktC. For accessing different services...receiving regular services. For example, the human user in our scenario can be an elite member with the mobile wallet service provider that

  4. Near-Real-Time Cloud Auditing for Rapid Response

    DTIC Science & Technology

    2013-10-01

    cloud auditing, which provides timely evaluation results and rapid response, is the key to assuring the cloud. In this paper, we discuss security and...providers with possible automation of the audit, assertion, assessment, and assurance of their services. The Cloud Security Alliance (CSA [15]) was formed...monitoring tools, research literature, standards, and other resources related to IA (Information Assurance) metrics and IT auditing. In the following

  5. The Role of Standards in Cloud-Computing Interoperability

    DTIC Science & Technology

    2012-10-01

    services are not shared outside the organization. CloudStack, Eucalyptus, HP, Microsoft, OpenStack, Ubuntu, and VMWare provide tools for building...center requirements • Developing usage models for cloud vendors • Independent IT consortium OpenStack http://www.openstack.org • Open-source...software for running private clouds • Currently consists of three core software projects: OpenStack Compute (Nova), OpenStack Object Storage (Swift

  6. Future Naval Use of COTS Networking Infrastructure

    DTIC Science & Technology

    2009-07-01

    user to benefit from Google’s vast databases and computational resources. Obviously, the ability to harness the full power of the Cloud could be... Computing Impact Findings Action Items Take-Aways Appendices: Pages 54-68 A. Terms of Reference Document B. Sample Definitions of Cloud ...and definition of Cloud Computing. While Cloud Computing is developing in many variations – including Infrastructure as a Service (IaaS), Platform as

  7. Proposing Telecardiology Services on Cloud for Different Medical Institutions: A Model of Reference.

    PubMed

    de la Torre-Díez, Isabel; Garcia-Zapirain, Begoña; López-Coronado, Miguel; Rodrigues, Joel J P C

    2017-08-01

    For a cloud-based telecardiology solution to be established in any scenario, it is necessary to ensure optimum levels of security, as patients' data will not be stored in the same place from which they are accessed. The main objective of this article is to present a secure, cloud-based solution for a telecardiology service in three scenarios: a hospital, a health center in a city, and a group of health centers in a rural area. iCanCloud software is used to simulate the scenarios. The first scenario is a city hospital with over 220,000 patients at its emergency services and ∼1 million outpatient consultations. The city health center serves ∼107,000 medical consultations and 16,700 pediatric consultations per year. In the last scenario, a group of health centers in a rural area serves an average of 437.08 consultations per month, around 15.6 a day. The solutions proposed share common features, including secure authentication through smart cards, the use of StorageGRID technology, and load balancers. In all cases the cloud is private, and the estimated cost of the solution is around 450 €/month. The research conducted in this work has made it possible to provide an adapted telecardiology solution for a hospital, a city health center, and rural health centers that offers security, privacy, and robustness, and is also optimal for a large number of cloud requests.

  8. Evolving the Land Information System into a Cloud Computing Service

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houser, Paul R.

    The Land Information System (LIS) was developed to use advanced flexible land surface modeling and data assimilation frameworks to integrate extremely large satellite- and ground-based observations with advanced land surface models to produce continuous high-resolution fields of land surface states and fluxes. The resulting fields are extremely useful for drought and flood assessment, agricultural planning, disaster management, weather and climate forecasting, water resources assessment, and the like. We envisioned transforming the LIS modeling system into a scientific cloud computing-aware web and data service that would allow clients to easily set up and configure it for use in addressing large water management issues. The focus of this Phase 1 project was to determine the scientific, technical, and commercial merit and feasibility of the proposed LIS-cloud innovations addressing what are currently barriers to broad LIS applicability. We (a) quantified the barriers to broad LIS utility and commercialization (high performance computing, big data, user interface, and licensing issues); (b) designed the proposed LIS-cloud web service, model-data interface, database services, and user interfaces; (c) constructed a prototype LIS user interface including abstractions for simulation control, visualization, and data interaction; (d) used the prototype to conduct a market analysis and survey to determine potential market size and competition; (e) identified LIS software licensing and copyright limitations and developed solutions; and (f) developed a business plan for development and marketing of the LIS-cloud innovation. While some significant feasibility issues were found in the LIS licensing, overall a high degree of LIS-cloud technical feasibility was found.

  9. Towards a Multi-Mission, Airborne Science Data System Environment

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Hardman, S.; Law, E.; Freeborn, D.; Kay-Im, E.; Lau, G.; Oswald, J.

    2011-12-01

    NASA earth science instruments are increasingly relying on airborne missions. However, traditionally there has been limited common infrastructure support available to principal investigators in the area of science data systems. As a result, each investigator has been required to develop their own computing infrastructure for the science data system. Typically there is little software reuse, and many projects lack sufficient resources to provide a robust infrastructure to capture, process, distribute and archive the observations acquired from airborne flights. At NASA's Jet Propulsion Laboratory (JPL), we have been developing a multi-mission data system infrastructure for airborne instruments called the Airborne Cloud Computing Environment (ACCE). ACCE encompasses the end-to-end lifecycle covering planning, provisioning of data system capabilities, and support for scientific analysis in order to improve the quality, cost effectiveness, and capabilities that enable new scientific discovery and research in earth observation. This includes improving data system interoperability across each instrument. A principal characteristic is an agile infrastructure architected to allow a variety of configurations, from locally installed compute and storage services to services provisioned via the "cloud" from vendors such as Amazon.com. Investigators often have different needs that require a flexible configuration. The data system infrastructure is built on Apache's Object Oriented Data Technology (OODT) suite of components, which has been used for a number of spaceborne missions and provides a rich set of open source software components and services for constructing science processing and data management systems. In 2010, a partnership was formed between the ACCE team and the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission to support its data processing and data management needs. A principal goal is to provide support for the Fourier Transform Spectrometer (FTS) instrument, which will produce over 700,000 soundings over the life of the three-year mission. The cost to purchase and operate a cluster-based system in order to generate Level 2 Full Physics products from these data was prohibitive. Through an evaluation of cloud computing solutions, Amazon's Elastic Compute Cloud (EC2) was selected for the CARVE deployment. As the ACCE infrastructure is developed and extended to form an infrastructure for airborne missions, the experience of working with CARVE has provided a number of lessons learned and has proven important in reinforcing the unique aspects of airborne missions and the importance of the ACCE infrastructure in developing a cost-effective, flexible multi-mission capability that leverages emerging capabilities in cloud computing, workflow management, and distributed computing.

  10. dCache, Sync-and-Share for Big Data

    NASA Astrophysics Data System (ADS)

    Millar, AP; Fuhrmann, P.; Mkrtchyan, T.; Behrmann, G.; Bernardt, C.; Buchholz, Q.; Guelzow, V.; Litvintsev, D.; Schwank, K.; Rossi, A.; van der Reest, P.

    2015-12-01

    The availability of cheap, easy-to-use sync-and-share cloud services has split the scientific storage world into the traditional big data management systems and the very attractive sync-and-share services. With the former, the location of data is well understood, while the latter are mostly operated in the Cloud, resulting in a rather complex legal situation. Besides legal issues, these two worlds have little overlap in user authentication and access protocols. While traditional storage technologies popular in HEP are based on X.509, cloud services and sync-and-share software technologies are generally based on username/password authentication or on mechanisms like SAML or OpenID Connect. Similarly, the data access models offered by the two are somewhat different, with sync-and-share services often using proprietary protocols. As both approaches are very attractive, dCache.org developed a hybrid system providing the best of both worlds. To avoid reinventing the wheel, dCache.org decided to embed another open source project: OwnCloud. This offers the required modern access capabilities but does not support the managed data functionality needed for large-capacity data storage. With this hybrid system, scientists can share files and synchronize their data with laptops or mobile devices as easily as with any other cloud storage service. On top of this, the same data can be accessed via established mechanisms, like GridFTP, to serve the Globus Transfer Service or the WLCG FTS3 tool, or the data can be made available to worker nodes or HPC applications via a mounted filesystem. As dCache provides a flexible authentication module, the same user can access its storage via different authentication mechanisms, e.g., X.509 and SAML. Additionally, users can specify the desired quality of service or trigger media transitions as necessary, thus tuning data access latency to the planned access profile. Such features are a natural consequence of using dCache. We will describe the design of the hybrid dCache/OwnCloud system, report on several months of operational experience running it at DESY, and elucidate the future road-map.

  11. Clouds over the summertime Sahara: an evaluation of Met Office retrievals from Meteosat Second Generation using airborne remote sensing

    NASA Astrophysics Data System (ADS)

    Kealy, John C.; Marenco, Franco; Marsham, John H.; Garcia-Carreras, Luis; Francis, Pete N.; Cooke, Michael C.; Hocking, James

    2017-05-01

    Novel methods of cloud detection are applied to airborne remote sensing observations from the unique Fennec aircraft dataset, to evaluate the Met Office-derived products on cloud properties over the Sahara based on the Spinning Enhanced Visible and InfraRed Imager (SEVIRI) on-board the Meteosat Second Generation (MSG) satellite. Two cloud mask configurations are considered, as well as the retrievals of cloud-top height (CTH), and these products are compared to airborne cloud remote sensing products acquired during the Fennec campaign in June 2011 and June 2012. Most detected clouds (67 % of the total) have a horizontal extent that is smaller than a SEVIRI pixel (3 km × 3 km). We show that, when partially cloud-contaminated pixels are included, a match between the SEVIRI and aircraft datasets is found in 80 ± 8 % of the pixels. Moreover, under clear skies the datasets are shown to agree for more than 90 % of the pixels. The mean cloud field, derived from the satellite cloud mask acquired during the Fennec flights, shows that areas of high surface albedo and orography are preferred sites for Saharan cloud cover, consistent with published theories. Cloud-top height retrievals however show large discrepancies over the region, which are ascribed to limiting factors such as the cloud horizontal extent, the derived effective cloud amount, and the absorption by mineral dust. The results of the CTH analysis presented here may also have further-reaching implications for the techniques employed by other satellite applications facilities across the world.

  12. Assessment of Cloud Screening with Apparent Surface Reflectance in Support of the ICESat-2 Mission

    NASA Technical Reports Server (NTRS)

    Yang, Yuekui; Marshak, Alexander; Palm, Stephen P.; Wang, Zhuosen; Schaaf, Crystal

    2011-01-01

    The separation of cloudy and clear scenes is usually one of the first steps in satellite data analysis. Before deriving a geophysical product, almost every satellite mission requires a cloud mask to label a scene as either clear or cloudy through a cloud detection procedure. For clear scenes, products such as surface properties may be retrieved; for cloudy scenes, scientists can focus on studying the cloud properties. Hence the quality of cloud detection directly affects the quality of most satellite operational and research products. This is certainly true for the Ice, Cloud, and land Elevation Satellite-2 (ICESat-2), the successor to ICESat-1. As a top-priority mission, ICESat-2 will continue to provide measurements of ice sheet and sea ice elevation on a global scale. Studies have shown that clouds can significantly affect the accuracy of the retrieved results. For example, some of the photons (a photon is a basic unit of light) in the laser beam will be scattered by cloud particles on their way; instead of traveling in a straight line, these photons are scattered sideways and travel a longer path, which results in biases in ice sheet elevation measurements. Hence cloud screening must be done, and done accurately, before the retrievals.

  13. Multi-Spectral Cloud Retrievals from the Moderate Resolution Imaging Spectroradiometer (MODIS)

    NASA Technical Reports Server (NTRS)

    Platnick, Steven

    2004-01-01

    MODIS observations from the NASA EOS Terra spacecraft (1030 local time equatorial sun-synchronous crossing), launched in December 1999, have provided a unique set of Earth observation data. With the launch of the NASA EOS Aqua spacecraft (1330 local time crossing) in May 2002, two MODIS daytime (sunlit) and nighttime observations are now available in a 24-hour period, allowing some measure of diurnal variability. A comprehensive set of remote sensing algorithms for cloud masking and the retrieval of cloud physical and optical properties has been developed by members of the MODIS atmosphere science team. The archived products from these algorithms have applications in climate modeling, climate change studies, numerical weather prediction, as well as fundamental atmospheric research. In addition to an extensive cloud mask, products include cloud-top properties (temperature, pressure, effective emissivity), cloud thermodynamic phase, cloud optical and microphysical parameters (optical thickness, effective particle radius, water path), as well as derived statistics. An overview of the instrument and cloud algorithms will be presented along with various examples, including an initial analysis of several operational global gridded (Level-3) cloud products from the two platforms. Statistics of cloud optical and microphysical properties as a function of latitude for land and ocean regions will be shown. Current algorithm research efforts will also be discussed.

  14. Verifying Operational and Developmental Air Force Weather Cloud Analysis and Forecast Products Using Lidar Data from Department of Energy Atmospheric Radiation Measurement (ARM) Sites

    NASA Astrophysics Data System (ADS)

    Hildebrand, E. P.

    2017-12-01

    Air Force Weather has developed various cloud analysis and forecast products designed to support global Department of Defense (DoD) missions. A World-Wide Merged Cloud Analysis (WWMCA) and short term Advected Cloud (ADVCLD) forecast is generated hourly using data from 16 geostationary and polar-orbiting satellites. Additionally, WWMCA and Numerical Weather Prediction (NWP) data are used in a statistical long-term (out to five days) cloud forecast model known as the Diagnostic Cloud Forecast (DCF). The WWMCA and ADVCLD are generated on the same polar stereographic 24 km grid for each hemisphere, whereas the DCF is generated on the same grid as its parent NWP model. When verifying the cloud forecast models, the goal is to understand not only the ability to detect cloud, but also the ability to assign it to the correct vertical layer. ADVCLD and DCF forecasts traditionally have been verified using WWMCA data as truth, but this might over-inflate the performance of those models because WWMCA also is a primary input dataset for those models. Because of this, in recent years, a WWMCA Reanalysis product has been developed, but this too is not a fully independent dataset. This year, work has been done to incorporate data from external, independent sources to verify not only the cloud forecast products, but the WWMCA data itself. One such dataset that has been useful for examining the 3-D performance of the cloud analysis and forecast models is Atmospheric Radiation Measurement (ARM) data from various sites around the globe. This presentation will focus on the use of the Department of Energy (DoE) ARM data to verify Air Force Weather cloud analysis and forecast products. Results will be presented to show relative strengths and weaknesses of the analyses and forecasts.
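
    The layer-aware verification described above ultimately reduces to contingency-table statistics computed per vertical layer. A minimal sketch, assuming a boolean time-by-layer encoding of cloud presence (POD and FAR are the standard probability of detection and false alarm ratio):

      # Sketch: score forecast vs. lidar-observed cloud occurrence.
      import numpy as np

      def verify(forecast, observed):
          """forecast, observed: boolean arrays (time x layer)."""
          hits = np.sum(forecast & observed)
          misses = np.sum(~forecast & observed)
          false_alarms = np.sum(forecast & ~observed)
          pod = hits / (hits + misses) if hits + misses else np.nan
          far = false_alarms / (hits + false_alarms) if hits + false_alarms else np.nan
          return pod, far

      f = np.array([[True, False], [True, True], [False, False]])
      o = np.array([[True, False], [False, True], [True, False]])
      print(verify(f, o))  # POD = 2/3, FAR = 1/3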

  15. Cloud removing method for daily snow mapping over Central Asia and Xinjiang, China

    NASA Astrophysics Data System (ADS)

    Yu, Xiaoqi; Qiu, Yubao; Guo, Huadong; Chen, Lijuan

    2017-02-01

    Central Asia and Xinjiang, China are adjoining regions located in the hinterland of the Eurasian continent, where snowfall is an important form of water resource replenishment. The resulting seasonal snow cover is a vital factor in the regional energy and water balance, and remote sensing plays a key role in the snow mapping field. Daily remote sensing products, however, are routinely contaminated by cloud, which clearly limits the utility of the snow cover parameters. In this paper, based on the daily snow product from the Moderate Resolution Imaging Spectroradiometer (MODIS), a cloud removal method was developed that considers the regional snow distribution characteristics as functions of latitude and altitude, respectively. Finally, the daily cloud-free products were compared with the corresponding eight-day MODIS standard product, showing that the cloud-free snow products are reasonable while providing higher temporal resolution and more detail over Central Asia and Xinjiang, China.
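
    A deliberately simplified sketch of elevation-based gap filling (not the paper's actual method) conveys the idea: estimate a regional snowline from the pixels the cloud left visible, then reclassify cloud-obscured pixels by elevation. The class encoding (0 = snow-free land, 1 = snow, 2 = cloud) and the snowline estimate are assumptions:

      # Hedged sketch: altitude-dependent cloud removal for a daily
      # snow-cover map.
      import numpy as np

      def fill_cloud_by_snowline(classes, elevation):
          snow_elev = elevation[classes == 1]
          land_elev = elevation[classes == 0]
          if snow_elev.size == 0 or land_elev.size == 0:
              return classes  # no snowline can be estimated here
          snowline = 0.5 * (snow_elev.min() + land_elev.max())
          filled = classes.copy()
          cloudy = classes == 2
          filled[cloudy & (elevation >= snowline)] = 1  # snow
          filled[cloudy & (elevation < snowline)] = 0   # snow-free land
          return filled

      cls = np.array([0, 0, 1, 2, 2])
      elev = np.array([500.0, 800.0, 2400.0, 2600.0, 600.0])
      print(fill_cloud_by_snowline(cls, elev))  # [0 0 1 1 0]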

  16. NeuronDepot: keeping your colleagues in sync by combining modern cloud storage services, the local file system, and simple web applications

    PubMed Central

    Rautenberg, Philipp L.; Kumaraswamy, Ajayrama; Tejero-Cantero, Alvaro; Doblander, Christoph; Norouzian, Mohammad R.; Kai, Kazuki; Jacobsen, Hans-Arno; Ai, Hiroyuki; Wachtler, Thomas; Ikeno, Hidetoshi

    2014-01-01

    Neuroscience today deals with a “data deluge” derived from the availability of high-throughput sensors of brain structure and brain activity, and increased computational resources for detailed simulations with complex output. We report here (1) a novel approach to data sharing between collaborating scientists that brings together file system tools and cloud technologies, (2) a service implementing this approach, called NeuronDepot, and (3) an example application of the service to a complex use case in the neurosciences. The main drivers for our approach are to facilitate collaborations with a transparent, automated data flow that shields scientists from having to learn new tools or data structuring paradigms. Using NeuronDepot is simple: one-time data assignment from the originator and cloud based syncing—thus making experimental and modeling data available across the collaboration with minimum overhead. Since data sharing is cloud based, our approach opens up the possibility of using new software developments and hardware scalability that are associated with elastic cloud computing. We provide an implementation that relies on existing synchronization services and is usable from all devices via a reactive web interface. We motivate our solution by solving the practical problems of the GinJang project, a collaboration of three universities across eight time zones with a complex workflow encompassing data from electrophysiological recordings, imaging, morphological reconstructions, and simulations. PMID:24971059

  18. A browser-based 3D Visualization Tool designed for comparing CERES/CALIOP/CloudSAT level-2 data sets.

    NASA Astrophysics Data System (ADS)

    Chu, C.; Sun-Mack, S.; Chen, Y.; Heckert, E.; Doelling, D. R.

    2017-12-01

    At NASA Langley, Clouds and the Earth's Radiant Energy System (CERES) and Moderate Resolution Imaging Spectroradiometer (MODIS) data are merged with Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) data from the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) and with CloudSat Cloud Profiling Radar (CPR) data. The CERES merged product (C3M) matches up to three CALIPSO footprints with each MODIS pixel along its ground track and then assigns the nearest CloudSat footprint to each of those MODIS pixels. The cloud properties from MODIS, retrieved using the CERES algorithms, are included in C3M with the matched CALIPSO and CloudSat products along with radiances from 18 MODIS channels. The dataset is used to validate the CERES-retrieved MODIS cloud properties and the TOA and surface flux differences computed using MODIS versus CALIOP/CloudSat retrieved clouds. This information is then used to tune the computed fluxes to match the CERES observed TOA flux. A visualization tool will be invaluable for determining the cause of large cloud and flux differences in order to improve the methodology. This effort is part of a larger effort to allow users to order the CERES C3M product subsetted by time and parameter, as well as the previously mentioned visualization capabilities. This presentation will show a new graphical 3D interface, 3D-CERESVis, that allows users to view both passive remote sensing satellites (MODIS and CERES) and active satellites (CALIPSO and CloudSat), such that the detailed vertical structures of cloud properties from CALIPSO and CloudSat are displayed side by side with horizontally retrieved cloud properties from MODIS and CERES. Similarly, the CERES computed profile fluxes, whether using MODIS or CALIPSO and CloudSat clouds, can also be compared. 3D-CERESVis is a browser-based visualization tool that makes use of techniques such as multiple synchronized cursors, COLLADA-format data and Cesium.

  19. Navigating the Challenges of the Cloud

    ERIC Educational Resources Information Center

    Ovadia, Steven

    2010-01-01

    Cloud computing is increasingly popular in education. Cloud computing is "the delivery of computer services from vast warehouses of shared machines that enables companies and individuals to cut costs by handing over the running of their email, customer databases or accounting software to someone else, and then accessing it over the internet."…

  20. A Tale of Two Clouds

    ERIC Educational Resources Information Center

    Gray, Terry

    2010-01-01

    The University of Washington (UW) adopted a dual-provider cloud-computing strategy, focusing initially on software as a service. The original project--to replace an obsolete alumni e-mail system--resulted in a cloud solution that soon grew to encompass the entire campus community. The policies and contract terms UW developed, focusing on…

  1. Using Clouds for MapReduce Measurement Assignments

    ERIC Educational Resources Information Center

    Rabkin, Ariel; Reiss, Charles; Katz, Randy; Patterson, David

    2013-01-01

    We describe our experiences teaching MapReduce in a large undergraduate lecture course using public cloud services and the standard Hadoop API. Using the standard API, students directly experienced the quality of industrial big-data tools. Using the cloud, every student could carry out scalability benchmarking assignments on realistic hardware,…

  2. A Contextual Information Acquisition Approach Based on Semantics and Mashup Technology

    NASA Astrophysics Data System (ADS)

    He, Yangfan; Li, Lu; He, Keqing; Chen, Xiuhong

    Pay-per-use is an essential feature of cloud computing: users can draw on parts of a large-scale service to satisfy their requirements at the cost of a small payment. A good understanding of users' requirements is a prerequisite for choosing the needed service precisely. Context implies users' potential requirements and can complement the requirements delivered explicitly. However, traditional context-aware computing research typically demands specific kinds of sensors to acquire contextual information, which sets too high a threshold for an application to become context-aware. This paper proposes an approach that combines contextual information obtained directly and indirectly from cloud services. Semantic relationships between different kinds of context lay the foundation for searching the cloud services, and mashup technology is adopted to compose the heterogeneous services. Abundant contextual information may lend strong support to a comprehensive understanding of users' context and a better abstraction of contextual requirements.

  3. Cloud-based adaptive exon prediction for DNA analysis

    PubMed Central

    Putluri, Srinivasareddy; Fathima, Shaik Yasmeen

    2018-01-01

    Cloud computing offers significant research and economic benefits to healthcare organisations, and cloud services provide a safe place for storing and managing large amounts of sensitive data. Under the conventional flow of gene information, gene sequence laboratories send raw and inferred information via the Internet to several sequence libraries; DNA sequencing storage costs can be minimised by the use of a cloud service. In this study, the authors put forward a novel genomic informatics system using Amazon Cloud Services, where genomic sequence information is stored and accessed for processing. Accurate identification of exon regions in a DNA sequence is a key task in bioinformatics, which helps in disease identification and drug design. The three-base periodicity property of exons forms the basis of all exon identification techniques, and adaptive signal processing techniques have been found promising in comparison with several other methods. Several adaptive exon predictors (AEPs) are developed using variable normalised least mean square and its maximum normalised variants to reduce computational complexity. Finally, performance evaluation of the various AEPs is done using measures such as sensitivity, specificity and precision on standard genomic datasets taken from the National Center for Biotechnology Information genomic sequence database. PMID:29515813
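
    At the center of such predictors is the normalized LMS weight update. The sketch below shows the update on a stand-in signal; the filter length, step size, and period-3 target are illustrative assumptions, not the paper's configuration:

      # Minimal normalized-LMS (NLMS) sketch: the weights learn the lag
      # relating the input to the desired signal.
      import numpy as np

      def nlms(x, d, taps=8, mu=0.5, eps=1e-8):
          w = np.zeros(taps)
          for n in range(taps, len(x)):
              window = x[n - taps:n][::-1]  # most recent sample first
              e = d[n] - w @ window
              w += mu * e * window / (window @ window + eps)  # normalized step
          return w

      x = np.random.default_rng(0).standard_normal(512)
      d = np.roll(x, 3)  # stand-in target: each sample is x three steps back
      w = nlms(x, d)
      print(np.argmax(np.abs(w)))  # 2 -> the weight on x[n-3], a 3-sample lag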

  4. Cloud Computing for Complex Performance Codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure; Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  5. Exploring the factors influencing the cloud computing adoption: a systematic study on cloud migration.

    PubMed

    Rai, Rashmi; Sahoo, Gadadhar; Mehfuz, Shabana

    2015-01-01

    Today, most organizations rely on their age-old legacy applications to support their business-critical systems. However, there are several critical concerns, such as maintainability and scalability issues, associated with legacy systems. Against this background, cloud services offer a more agile and cost-effective platform to support business applications and IT infrastructure. The adoption of cloud services has been increasing recently, and so has academic research in cloud migration; however, there is a genuine need for secondary studies to further strengthen this research. The primary objective of this paper is to scientifically and systematically identify, categorize and compare the existing research work in the area of legacy-to-cloud migration. The paper has also endeavored to consolidate the research on security issues, a prime factor hindering the adoption of the cloud, by classifying the studies on secure cloud migration. An SLR (Systematic Literature Review) of thirty selected papers, published from 2009 to 2014, was conducted to properly understand the nuances of the security framework. To categorize the selected studies, the authors have proposed a conceptual model for cloud migration, which has resulted in a resource base of existing solutions for cloud migration. This study concludes that cloud migration research is at a seminal stage but is also evolving and maturing, with increasing participation from academia and industry alike. The paper also identifies the need for a secure migration model that can fortify organizations' trust in cloud migration and provide the necessary tool support to automate the migration process.

  6. Generation of Classical DInSAR and PSI Ground Motion Maps on a Cloud Thematic Platform

    NASA Astrophysics Data System (ADS)

    Mora, Oscar; Ordoqui, Patrick; Romero, Laia

    2016-08-01

    This paper presents the experience of ALTAMIRA INFORMATION in deploying InSAR (Synthetic Aperture Radar Interferometry) services on the Geohazard Exploitation Platform (GEP), supported by ESA. Two different processing chains are presented together with ground motion maps obtained from cloud computing: DIAPASON for classical DInSAR, and SPN (Stable Point Network) for PSI (Persistent Scatterer Interferometry) processing. The product obtained from DIAPASON is the interferometric phase related to ground motion (phase fringes from a SAR pair). SPN provides motion data (mean velocity and time series) for high-quality pixels from a stack of SAR images. DIAPASON is already implemented, and SPN is under development to be exploited with historical data from the ERS-1/2 and ENVISAT satellites and current acquisitions of SENTINEL-1 in SLC and TOPSAR modes.

  7. NPOESS Preparatory Project Validation Program for Atmosphere Data Products from VIIRS

    NASA Astrophysics Data System (ADS)

    Starr, D.; Wong, E.

    2009-12-01

    The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Program, in partnership with the National Aeronautics and Space Administration (NASA), will launch the NPOESS Preparatory Project (NPP), a risk reduction and data continuity mission, prior to the first operational NPOESS launch. The NPOESS Program, in partnership with Northrop Grumman Aerospace Systems (NGAS), will execute the NPP Validation program to ensure the data products comply with the requirements of the sponsoring agencies. Data from the NPP Visible/Infrared Imager/Radiometer Suite (VIIRS) will be used to produce Environmental Data Records (EDRs) for aerosols and clouds, specifically Aerosol Optical Thickness (AOT), Aerosol Particle Size Parameter (APSP), and Suspended Matter (SM); and Cloud Optical Thickness (COT), Cloud Effective Particle Size (CEPS), Cloud Top Temperature (CTT), Height (CTH) and Pressure (CTP), and Cloud Base Height (CBH). The Aerosol and Cloud EDR Validation Program is a multifaceted effort to characterize and validate these data products. The program involves systematic comparison to heritage data products, e.g., MODIS, and ground-based correlative data, such as AERONET and ARM data products, and potentially airborne field measurements. To the extent possible, the domain is global. The program leverages various investments that have been and continue to be made by national funding agencies in such resources, as well as the operational user community and the broad Earth science user community. This presentation will provide an overview of the approaches, data and schedule for the validation of the NPP VIIRS Aerosol and Cloud environmental data products.

  8. MISR RICO Products

    Atmospheric Science Data Center

    2016-11-25

    ... microphysics of the transition to a mature rainshaft, organization of trade wind clouds, water budget of trade wind cumulus, and the ... (MISR) mission objectives involve providing accurate information on cloud cover, cloud-track winds, stereo-derived cloud-top ...

  9. A Novel College Network Resource Management Method using Cloud Computing

    NASA Astrophysics Data System (ADS)

    Lin, Chen

    At present, the informatization of colleges mainly involves the construction of campus networks and management information systems, and many problems arise during this process. Cloud computing is a development of distributed processing, parallel processing and grid computing: data are stored in the cloud, software and services are placed in the cloud and built on top of various standards and protocols, and they can be accessed through all kinds of devices. This article introduces cloud computing and its functions, then analyzes the existing problems of college network resource management; cloud computing technology and methods are then applied in the construction of a college information sharing platform.

  10. JPSS-1 Data and the EOSDIS System: It's seamless

    NASA Astrophysics Data System (ADS)

    Hall, A.; Behnke, J.; Ho, E.

    2017-12-01

    The continuity of climate and environmental data is key to the NASA Earth science program's goal of developing a scientific understanding of Earth's system and its response to change. NASA has made a long-term investment in processing, archiving and distributing Earth science data through the Earth Observing System (EOS) Data and Information System (EOSDIS). The use of the EOSDIS infrastructure and services provides seamless integration of Suomi National Polar-Orbiting Partnership (SNPP) and future Joint Polar Satellite System (JPSS-1) products, as it does for the entire NASA Earth science data collection. This continuity of measurements across all the missions is supported by the use of common data structures and standards in the generation of products and in the subsequent services, tools and access to those products. As with the EOS missions, five Science Investigator-led Processing Systems (SIPS) were established for SNPP (Land, Ocean, Atmosphere, Ozone, and Sounder); together with NASA's Clouds and the Earth's Radiant Energy System and Ozone Mapper/Profiler Suite Limb systems, they now produce the NASA SNPP standard Level 1, Level 2, and Level 3 products developed by the NASA science teams.

  11. Near-Cloud Aerosol Properties from the 1 Km Resolution MODIS Ocean Product

    NASA Technical Reports Server (NTRS)

    Varnai, Tamas; Marshak, Alexander

    2014-01-01

    This study examines aerosol properties in the vicinity of clouds by analyzing high-resolution atmospheric correction parameters provided in the MODIS (Moderate Resolution Imaging Spectroradiometer) ocean color product. The study analyzes data from a two-week period of September in each of 10 years, covering a large area in the northeast Atlantic Ocean. The results indicate that, on the one hand, the Quality Assessment (QA) flags of the ocean color product successfully eliminate cloud-related uncertainties in ocean parameters such as chlorophyll content, but, on the other hand, using the flags introduces a sampling bias in atmospheric products such as aerosol optical thickness (AOT) and Angstrom exponent. Therefore, researchers need to select QA flags by balancing the risks of increased retrieval uncertainties and sampling biases. Using an optimal set of QA flags, the results reveal substantial increases in optical thickness near clouds: on average the increase is 50% for the roughly half of pixels within 5 km of clouds and is accompanied by a roughly matching increase in particle size. Theoretical simulations show that the 50% increase in 550 nm AOT changes instantaneous direct aerosol radiative forcing by up to 8 W/m2, and that the radiative impact is significantly larger if observed near-cloud changes are attributed to aerosol particles as opposed to undetected cloud particles. These results underline that accounting for near-cloud areas and understanding the causes of near-cloud particle changes are critical for accurate calculations of direct aerosol radiative forcing.

  12. Towards large-scale data analysis: challenges in the design of portable systems and use of Cloud computing.

    PubMed

    Diaz, Javier; Arrizabalaga, Saioa; Bustamante, Paul; Mesa, Iker; Añorga, Javier; Goya, Jon

    2013-01-01

    Portable systems and global communications open a broad spectrum of new health applications. In the framework of electrophysiological applications, several challenges are faced when developing portable systems embedded in cloud computing services. In order to guide new developers in this area based on our experience, five areas of interest are presented in this paper where strategies can be applied to improve the performance of portable systems: transducers and conditioning, processing, wireless communications, battery, and power management. Likewise, for cloud services, scalability, portability, privacy and security guidelines have been highlighted.

  13. An adaptive process-based cloud infrastructure for space situational awareness applications

    NASA Astrophysics Data System (ADS)

    Liu, Bingwei; Chen, Yu; Shen, Dan; Chen, Genshe; Pham, Khanh; Blasch, Erik; Rubin, Bruce

    2014-06-01

    Space situational awareness (SSA) and defense space control capabilities are top priorities for groups that own or operate man-made spacecraft. Moreover, the growing amount of space debris increases the demand for contextual understanding, which in turn requires the capability to collect and process vast amounts of sensor data. Cloud computing, which features scalable and flexible storage and computing services, has been recognized as an ideal candidate to meet the large-data contextual challenges of SSA. Cloud computing consists of physical service providers and middleware virtual machines, together with infrastructure, platform, and software as a service (IaaS, PaaS, SaaS) models. However, the typical Virtual Machine (VM) abstraction is defined per operating system, which is too low-level and limits the flexibility of a mission application architecture. In response to this technical challenge, a novel adaptive, process-based cloud infrastructure for SSA applications is proposed in this paper, and the design rationale and a prototype are examined in detail. The SSA Cloud (SSAC) conceptual capability will potentially support space situation monitoring and tracking, object identification, and threat assessment. Lastly, the benefits of more granular and flexible allocation of cloud computing resources are illustrated for data processing and implementation considerations within a representative SSA system environment. We show that container-based virtualization performs better than hypervisor-based virtualization in an SSA scenario.
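
    The granularity argument, allocating compute per task process rather than per guest operating system, can be made concrete with a small sketch. This is not the paper's SSAC implementation; it is a generic process-pool pattern with hypothetical names (TrackTask, propagate) standing in for per-object SSA work items.

```python
from concurrent.futures import ProcessPoolExecutor
from dataclasses import dataclass

@dataclass
class TrackTask:
    object_id: str   # catalog ID of the tracked space object (hypothetical)
    obs_count: int   # number of sensor observations to process

def propagate(task: TrackTask) -> str:
    # Placeholder for an orbit-determination step. In a process-based
    # cloud, each such task gets a lightweight worker process instead
    # of a dedicated VM carrying a full guest operating system.
    return f"{task.object_id}: processed {task.obs_count} observations"

if __name__ == "__main__":
    tasks = [TrackTask(f"SAT-{i:04d}", 100 + i) for i in range(8)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        for result in pool.map(propagate, tasks):
            print(result)
```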

  14. Evaluating NASA S-NPP continuity cloud products for climate research using CALIPSO, CATS and Level-3 analysis

    NASA Astrophysics Data System (ADS)

    Holz, R.; Platnick, S. E.; Meyer, K.; Frey, R.; Wind, G.; Ackerman, S. A.; Heidinger, A. K.; Botambekov, D.; Yorks, J. E.; McGill, M. J.

    2016-12-01

    The launch of VIIRS and CrIS on Suomi NPP in the fall of 2011 introduced the next generation of U.S. operational polar orbiting environmental observations. Similar to MODIS, VIIRS provides visible and IR observations at moderate spatial resolution and has a 1:30 pm equatorial crossing time consistent with MODIS on the Aqua platform. However, unlike MODIS, VIIRS lacks the water vapor and CO2 absorption channels that the MODIS cloud algorithms use both for cloud detection and for retrieving cloud top height and cloud emissivity for ice clouds. Given the different spectral and spatial characteristics of VIIRS, we seek to understand the extent to which the 15-year MODIS climate record can be continued with VIIRS/CrIS observations while maintaining consistent sensitivities across the observational systems. This presentation will focus on the evaluation of the latest version of the NASA-funded cloud retrieval algorithms being developed for climate research. We will present collocated inter-comparisons of the imagers (VIIRS and MODIS Aqua) with CALIPSO and Cloud Aerosol Transport System (CATS) lidar observations, as well as long-term statistics based on a new Level-3 (L3) product being developed as part of the project. The CALIPSO inter-comparisons will focus on cloud detection (the cloud mask), in particular on recent modifications to the cloud mask and how these changes affect the global statistics. For the first time, we will provide inter-comparisons between two different cloud lidar systems (CALIOP and CATS) and investigate how the lidars' different sensitivities affect the cloud mask and cloud comparisons. Using CALIPSO and CATS as the reference, and applying the same algorithms to VIIRS and MODIS, we will discuss the consistency between the products from both imagers. The L3 analysis will focus on the regional and seasonal consistency of the suite of MODIS and VIIRS continuity cloud products. Do systematic biases remain when consistent algorithms are applied to different observations (MODIS or VIIRS)?
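
    Once imager pixels have been collocated with lidar footprints, the cloud-mask evaluation reduces to contingency-table statistics. The sketch below shows the standard scores one would compute; the function and argument names are assumptions for illustration, not the project's actual evaluation code.

```python
import numpy as np

def mask_agreement(imager_cloudy, lidar_cloudy):
    """Contingency scores for collocated imager vs. lidar cloud flags.

    Both arguments are boolean arrays over collocated footprints
    (True = cloudy). Returns hit rate, false-alarm rate, and the
    overall fraction of footprints where the two masks agree.
    """
    hits = np.sum(imager_cloudy & lidar_cloudy)
    misses = np.sum(~imager_cloudy & lidar_cloudy)
    false_alarms = np.sum(imager_cloudy & ~lidar_cloudy)
    correct_clear = np.sum(~imager_cloudy & ~lidar_cloudy)
    hit_rate = hits / max(hits + misses, 1)
    false_alarm_rate = false_alarms / max(false_alarms + correct_clear, 1)
    agreement = (hits + correct_clear) / imager_cloudy.size
    return hit_rate, false_alarm_rate, agreement
```

    Running the same function with CALIOP flags and then CATS flags as the reference makes the two lidars' differing sensitivities show up directly as shifts in the hit and false-alarm rates.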

  15. Aerosols and their influence on radiation partitioning and savanna productivity in northern Australia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanniah, K. D.; Beringer, J.; Tapper, N. J.

    2010-05-01

    We investigated the effect of aerosols and clouds on the Net Ecosystem Productivity (NEP) of savannas in northern Australia using aerosol optical depth, cloud and radiation data from the Atmospheric Radiation Measurement (ARM) site in Darwin, and carbon flux data measured with eddy covariance techniques at a site at Howard Springs, 35 km southeast of Darwin. We found that the concentration of aerosols in this region was generally low compared with other sites, so the proportion of diffuse radiation reaching the earth's surface was only ~30%. As a result, we observed only a modest change in carbon uptake under aerosol-laden skies, and there was no significant difference in dry-season Radiation Use Efficiency (RUE) between clear sky, aerosols, or thin clouds. On the other hand, thick clouds in the wet season produce much more diffuse radiation than aerosols or thin clouds, and the initial canopy quantum efficiency was seen to increase 45 and 2.5 times more than under thin clouds and aerosols, respectively. The normalized carbon uptake under thick clouds was 57% and 50% higher than under aerosols and thin clouds, respectively, even though the total irradiance received under thick clouds was reduced by 59% and 50% relative to aerosols and thin clouds, respectively. However, the reduction in total irradiance decreased the mean absolute carbon uptake by as much as 22% under heavy cloud cover compared with thin clouds or aerosols. Thus, any increase in aerosol concentration or cloud cover that enhances the diffuse component may have large impacts on productivity in this region.
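
    The "initial canopy quantum efficiency" in studies of this kind is typically the initial slope of a light-response curve fitted to flux data for each sky class. The sketch below fits the common rectangular-hyperbola form to synthetic data; it is a generic illustration under assumed parameter values, not the authors' analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit

def light_response(q, alpha, p_max, r_d):
    # Rectangular hyperbola: uptake rises with PAR (q) at initial slope
    # alpha (the canopy quantum efficiency) and saturates toward p_max;
    # r_d is ecosystem respiration.
    return (alpha * q * p_max) / (alpha * q + p_max) - r_d

# Synthetic flux data for one sky class; fitting each class separately
# (clear, aerosol, thin cloud, thick cloud) lets the fitted alpha
# values be compared across classes.
q = np.linspace(50.0, 2000.0, 40)                     # PAR, umol m-2 s-1
nep = light_response(q, 0.05, 25.0, 2.0)
nep += np.random.default_rng(1).normal(0.0, 0.5, q.size)
(alpha, p_max, r_d), _ = curve_fit(light_response, q, nep, p0=(0.03, 20.0, 1.0))
print(f"fitted canopy quantum efficiency: {alpha:.3f}")
```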

  16. BetterThanPin: Empowering Users to Fight Phishing (Poster)

    NASA Astrophysics Data System (ADS)

    Tan, Teik Guan

    The BetterThanPin concept is an online security service that allows users to protect almost any Cloud or Web-based account (e.g., Gmail, MSN, Yahoo) with "almost" two-factor authentication (2FA). Users can thus protect their online accounts with stronger authentication without waiting for the service or cloud provider to offer it.
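
    The abstract does not describe BetterThanPin's internal mechanism. Purely as an illustration of how a third-party layer can add a time-based second factor on top of a password, here is a minimal RFC 6238 (TOTP) sketch; the secret value and the gateway scenario in the comments are assumptions, not the author's design.

```python
import hashlib, hmac, struct, time

def totp(secret, for_time=None, step=30, digits=6):
    """Time-based one-time password (RFC 6238, HMAC-SHA1 variant)."""
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                 # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# A gateway holding this per-account secret could demand the matching
# code before completing the login flow to the real provider.
secret = b"per-account secret provisioned at enrollment"
print("one-time code:", totp(secret))
```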

  17. Data Intensive Scientific Workflows on a Federated Cloud: CRADA Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garzoglio, Gabriele

    The Fermilab Scientific Computing Division and the KISTI Global Science Experimental Data Hub Center have built a prototypical large-scale infrastructure to handle scientific workflows of stakeholders to run on multiple cloud resources. The demonstrations have been in the areas of (a) Data-Intensive Scientific Workflows on Federated Clouds, (b) Interoperability and Federation of Cloud Resources, and (c) Virtual Infrastructure Automation to enable On-Demand Services.

  18. Secure and Resilient Cloud Computing for the Department of Defense

    DTIC Science & Technology

    2015-07-21

    ... that addresses that threat model, and (3) integrate the technology into a usable, secure, resilient cloud test bed. Underpinning this work is the ... risks for the DoD's acquisition of secure, resilient cloud technology by providing proofs of concept, technology maturity, integration demonstrations ... we need a strategy for integrating LLSRC technology with the cloud services and applications that need to be secured. The LLSRC integration ...

  19. Arctic Clouds

    Atmospheric Science Data Center

    2013-04-19

    Stratus clouds are common in the Arctic during the summer months, ... Image credit: ... (Acro Service Corporation/Jet Propulsion Laboratory), David J. Diner (Jet Propulsion Laboratory).

  20. Obfuscatable multi-recipient re-encryption for secure privacy-preserving personal health record services.

    PubMed

    Shi, Yang; Fan, Hongfei; Xiong, Guoyue

    2015-01-01

    With the rapid development of cloud computing techniques, it is attractive for personal health record (PHR) service providers to deploy their PHR applications and store the personal health data in the cloud. However, a serious privacy leakage could occur if the cloud-based system is intruded by attackers, which makes it necessary for the PHR service provider to encrypt all patients' health data on cloud servers. Existing techniques are insufficiently secure when advanced threats are considered, or are inefficient when many recipients are involved. Therefore, the objectives of our solution are (1) to provide a secure implementation of re-encryption in white-box attack contexts and (2) to ensure the efficiency of the implementation even in multi-recipient cases. We designed the multi-recipient re-encryption functionality by reusing randomness and protected the implementation by obfuscation. The proposed solution is secure even in white-box attack contexts. Furthermore, a comparison with related work shows that the computational cost of the proposed solution is lower. The proposed technique can serve as a building block for secure, efficient and privacy-preserving personal health record service systems.
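
    The paper's obfuscated re-encryption construction is not spelled out in the abstract. The toy below illustrates only the randomness-reusing idea itself, using plain multi-recipient ElGamal: one ephemeral exponent is generated once and shared across all recipients' ciphertext components, so each additional recipient costs a single exponentiation rather than a full encryption. The parameters are deliberately toy-sized and insecure; this is a sketch of the idea, not the paper's scheme.

```python
import secrets

# Toy parameters: P is the Mersenne prime 2**521 - 1, which makes the
# modular arithmetic concrete but is NOT a secure ElGamal setup.
P = 2**521 - 1
G = 5

def keygen():
    x = secrets.randbelow(P - 2) + 1           # private key
    return x, pow(G, x, P)                     # (private, public)

def encrypt_multi(m, pubkeys):
    r = secrets.randbelow(P - 2) + 1           # ONE ephemeral value ...
    c0 = pow(G, r, P)                          # ... shared by all recipients
    return c0, [m * pow(y, r, P) % P for y in pubkeys]

def decrypt(c0, ci, x):
    return ci * pow(c0, P - 1 - x, P) % P      # m = ci / c0^x (Fermat inverse)

x1, y1 = keygen()
x2, y2 = keygen()
c0, (c1, c2) = encrypt_multi(42, [y1, y2])
assert decrypt(c0, c1, x1) == decrypt(c0, c2, x2) == 42
```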
