Sample records for bluesky cloud framework

  1. BlueSky Cloud Framework: An E-Learning Framework Embracing Cloud Computing

    NASA Astrophysics Data System (ADS)

    Dong, Bo; Zheng, Qinghua; Qiao, Mu; Shu, Jian; Yang, Jie

    Currently, E-Learning has grown into a widely accepted way of learning. With the huge growth in users, services, educational content and resources, E-Learning systems face challenges in optimizing resource allocation, dealing with dynamic concurrency demands, handling rapid storage growth and controlling costs. In this paper, an E-Learning framework based on cloud computing, namely the BlueSky cloud framework, is presented. In particular, the architecture and core components of the BlueSky cloud framework are introduced. In the BlueSky cloud framework, physical machines are virtualized and allocated on demand to E-Learning systems. Moreover, the BlueSky cloud framework incorporates traditional middleware functions (such as load balancing and data caching) to serve E-Learning systems as a general architecture. It delivers reliable, scalable and cost-efficient services to E-Learning systems, and E-Learning organizations can build systems from these services in a simple way. The BlueSky cloud framework addresses the challenges faced by E-Learning and improves the performance, availability and scalability of E-Learning systems.

  2. BlueSky Cloud - rapid infrastructure capacity using Amazon's Cloud for wildfire emergency response

    NASA Astrophysics Data System (ADS)

    Haderman, M.; Larkin, N. K.; Beach, M.; Cavallaro, A. M.; Stilley, J. C.; DeWinter, J. L.; Craig, K. J.; Raffuse, S. M.

    2013-12-01

    During peak fire season in the United States, many large wildfires often burn simultaneously across the country. Smoke from these fires can produce air quality emergencies. It is vital that incident commanders, air quality agencies, and public health officials have smoke impact information at their fingertips for evaluating where fires and smoke are and where the smoke will go next. To address the need for this kind of information, the U.S. Forest Service AirFire Team created the BlueSky Framework, a modeling system that predicts concentrations of particle pollution from wildfires. During emergency response, decision makers use BlueSky predictions to make public outreach and evacuation decisions. The models used in BlueSky predictions are computationally intensive, and the peak fire season requires significantly more computer resources than off-peak times. Purchasing enough hardware to run the number of BlueSky Framework runs that are needed during fire season is expensive and leaves idle servers running the majority of the year. The AirFire Team and STI developed BlueSky Cloud to take advantage of Amazon's virtual servers hosted in the cloud. With BlueSky Cloud, as demand increases and decreases, servers can be easily spun up and spun down at a minimal cost. Moving standard BlueSky Framework runs into the Amazon Cloud made it possible for the AirFire Team to rapidly increase the number of BlueSky Framework instances that could be run simultaneously without the costs associated with purchasing and managing servers. In this presentation, we provide an overview of the features of BlueSky Cloud, describe how the system uses Amazon Cloud, and discuss the costs and benefits of moving from privately hosted servers to a cloud-based infrastructure.
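
    The abstract describes the elasticity pattern but not its implementation. As a purely illustrative sketch, on-demand capacity of this kind can be driven through the AWS boto3 SDK roughly as follows (the AMI ID, instance type, region, and worker count are hypothetical placeholders, not values from BlueSky Cloud):

    ```python
    # Illustrative only: add capacity before a busy forecast cycle, release it afterwards.
    # The AMI ID, instance type, and region below are hypothetical placeholders.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-west-2")

    def scale_up(n_workers: int) -> list[str]:
        """Launch on-demand instances to run additional BlueSky Framework jobs."""
        resp = ec2.run_instances(
            ImageId="ami-0123456789abcdef0",  # placeholder image with the modeling stack
            InstanceType="c5.xlarge",
            MinCount=n_workers,
            MaxCount=n_workers,
        )
        return [inst["InstanceId"] for inst in resp["Instances"]]

    def scale_down(instance_ids: list[str]) -> None:
        """Terminate the workers so idle servers stop accruing cost off-season."""
        ec2.terminate_instances(InstanceIds=instance_ids)

    ids = scale_up(4)            # e.g., extra capacity during peak fire season
    # ... dispatch BlueSky Framework runs to the workers ...
    scale_down(ids)
    ```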

  3. Investigating fire emissions and smoke transport during the Summer of 2013 using an operational smoke modeling system and chemical transport model

    NASA Astrophysics Data System (ADS)

    ONeill, S. M.; Chung, S. H.; Wiedinmyer, C.; Larkin, N. K.; Martinez, M. E.; Solomon, R. C.; Rorig, M.

    2014-12-01

    Emissions from fires in the Western US are substantial and can impact air quality and regional climate. Many methods exist that estimate the particulate and gaseous emissions from fires, including those run operationally for use with chemical forecast models. The US Forest Service Smartfire2/BlueSky modeling framework uses satellite data and reported information about fire perimeters to estimate emissions of pollutants to the atmosphere. The emission estimates are used as inputs to dispersion models, such as HYSPLIT, and chemical transport models, such as CMAQ and WRF-Chem, to assess the chemical and physical impacts of fires on the atmosphere. Here we investigate the use of Smartfire2/BlueSky and WRF-Chem to simulate emissions from the summer 2013 fire season, with special focus on the Rim Fire in northern California. The 2013 Rim Fire ignited on August 17 and eventually burned more than 250,000 total acres before being contained on October 24. Large smoke plumes and pyro-convection events were observed. In this study, the Smartfire2/BlueSky operational emission estimates are compared to other estimation methods, such as the Fire INventory from NCAR (FINN) and other global databases to quantify variations in emission estimation methods for this wildfire event. The impact of the emissions on downwind chemical composition is investigated with the coupled meteorology-chemistry WRF-Chem model. The inclusion of aerosol-cloud and aerosol-radiation interactions in the model framework enables the evaluation of the downwind impacts of the fire plume. The emissions and modeled chemistry can also be evaluated with data collected from the Studies of Emissions and Atmospheric Composition, Clouds and Climate Coupling by Regional Surveys (SEAC4RS) aircraft field campaign, which intersected the fire plume.

  4. The BlueSky Smoke Modeling Framework: Recent Developments

    NASA Astrophysics Data System (ADS)

    Sullivan, D. C.; Larkin, N.; Raffuse, S. M.; Strand, T.; ONeill, S. M.; Leung, F. T.; Qu, J. J.; Hao, X.

    2012-12-01

    BlueSky systems—a set of decision support tools including SmartFire and the BlueSky Framework—aid public policy decision makers and scientific researchers in evaluating the air quality impacts of fires. Smoke and fire managers use BlueSky systems in decisions about prescribed burns and wildland firefighting. Air quality agencies use BlueSky systems to support decisions related to air quality regulations. We will discuss a range of recent improvements to the BlueSky systems, as well as examples of applications and future plans. BlueSky systems have the flexibility to accept basic fire information from virtually any source and can reconcile multiple information sources so that duplication of fire records is eliminated. BlueSky systems currently apply information from (1) the National Oceanic and Atmospheric Administration's (NOAA) Hazard Mapping System (HMS), which represents remotely sensed data from the Moderate Resolution Imaging Spectroradiometer (MODIS), Advanced Very High Resolution Radiometer (AVHRR), and Geostationary Operational Environmental Satellites (GOES); (2) the Monitoring Trends in Burn Severity (MTBS) interagency project, which derives fire perimeters from Landsat 30-meter burn scars; (3) the Geospatial Multi-Agency Coordination Group (GeoMAC), which produces helicopter-flown burn perimeters; and (4) ground-based fire reports, such as the ICS-209 reports managed by the National Wildfire Coordinating Group. Efforts are currently underway to streamline the use of additional ground-based systems, such as states' prescribed burn databases. BlueSky systems were recently modified to address known uncertainties in smoke modeling associated with (1) estimates of biomass consumption derived from sparse fuel moisture data, and (2) models of plume injection heights. Additional sources of remotely sensed data are being applied to address these issues as follows:
    - The National Aeronautics and Space Administration's (NASA) Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis Real-Time (TMPA-RT) data set is being used to improve dead fuel moisture estimates.
    - EastFire live fuel moisture estimates, which are derived from NASA's MODIS direct broadcast, are being used to improve live fuel moisture estimates.
    - NASA's Multi-angle Imaging Spectroradiometer (MISR) stereo heights are being used to improve estimates of plume injection heights.
    Further, the Fire Location and Modeling of Burning Emissions (FLAMBÉ) model was incorporated into the BlueSky Framework as an alternative means of calculating fire emissions. FLAMBÉ directly estimates emissions on the basis of fire detections and radiance measures from NASA's MODIS and NOAA's GOES satellites. (The authors gratefully acknowledge NASA's Applied Sciences Program [Grant Nos. NN506AB52A and NNX09AV76G], the USDA Forest Service, and the Joint Fire Science Program for their support.)

  5. Validation of BlueSky Smoke Prediction System using surface and satellite observations during major wildland fire events in Northern California

    Treesearch

    Lesley Fusina; Sharon Zhong; Julide Koracin; Tim Brown; Annie Esperanza; Leland Tarney; Haiganoush Preisler

    2007-01-01

    The BlueSky Smoke Prediction System developed by the U.S. Department of Agriculture, Forest Service, AirFire Team under the National Fire Plan is a modeling framework that integrates tools, knowledge of fuels, moisture, combustion, emissions, plume dynamics, and weather to produce real-time predictions of the cumulative impacts of smoke from wildfires, prescribed fires...

  6. Wild Fire Emissions for the NOAA Operational HYSPLIT Smoke Model

    NASA Astrophysics Data System (ADS)

    Huang, H. C.; ONeill, S. M.; Ruminski, M.; Shafran, P.; McQueen, J.; DiMego, G.; Kondragunta, S.; Gorline, J.; Huang, J. P.; Stunder, B.; Stein, A. F.; Stajner, I.; Upadhayay, S.; Larkin, N. K.

    2015-12-01

    Particulate Matter (PM) generated from forest fires often leads to degraded visibility and unhealthy air quality in nearby and downstream areas. To provide near-real time PM information to the state and local agencies, the NOAA/National Weather Service (NWS) operational HYSPLIT (Hybrid Single Particle Lagrangian Integrated Trajectory Model) smoke modeling system (NWS/HYSPLIT smoke) provides forecasts of smoke concentrations resulting from fire emissions, driven by the NWS North American Model 12 km weather predictions. The NWS/HYSPLIT smoke incorporates the U.S. Forest Service BlueSky Smoke Modeling Framework (BlueSky) to provide fire smoke emissions, along with input fire locations from the NOAA National Environmental Satellite, Data, and Information Service (NESDIS)'s Hazard Mapping System fire and smoke detection system. Experienced analysts inspect satellite imagery from multiple sensors onboard geostationary and orbital satellites to identify the location, size and duration of smoke emissions for the model. NWS/HYSPLIT smoke is being updated to use a newer version of USFS BlueSky. The updated BlueSky incorporates the Fuel Characteristic Classification System version 2 (FCCS2) over the continental U.S. and Alaska. FCCS2 includes a more detailed description of fuel loadings with additional plant type categories. The updated BlueSky also utilizes an improved fuel consumption model and fire emission production system. For the period of August 2014 and June 2015, NWS/HYSPLIT smoke simulations show that fire smoke emissions with the updated BlueSky are stronger than those from the current operational BlueSky in the Northwest U.S. For the same comparisons, weaker fire smoke emissions from the updated BlueSky were observed over the middle and eastern parts of the U.S. A statistical evaluation of NWS/HYSPLIT smoke predicted total column concentration compared to NOAA NESDIS GOES EAST Aerosol Smoke Product retrievals is underway. Preliminary results show that using the newer version of BlueSky leads to improved performance of NWS/HYSPLIT-smoke for June 2015. These results are partially due to the default fuel loading selected for Canadian fires that leads to stronger fire emissions there. The use of more realistic Canadian fuel loading may improve NWS/HYSPLIT smoke forecasts.

  7. The global blue-sky albedo change between 2000 - 2015 seen from MODIS

    NASA Astrophysics Data System (ADS)

    Chrysoulakis, N.; Mitraka, Z.; Gorelick, N.

    2016-12-01

    The land surface albedo is a critical physical variable, which influences the Earth's climate by affecting the energy budget and distribution in the Earth-atmosphere system. Blue-sky albedo estimates provide a quantitative means for better constraining global and regional scale climate models. The Moderate Resolution Imaging Spectroradiometer (MODIS) albedo product includes parameters for the estimation of both the directional-hemispherical surface reflectance (black-sky albedo) and the bi-hemispherical surface reflectance (white-sky albedo). This dataset was used here for the blue-sky albedo estimation over the globe on an 8-day basis at 0.5 km spatial resolution for the whole time period covered by MODIS acquisitions (i.e. 2000 until today). To estimate the blue-sky albedo, the fraction of diffuse radiation is needed, which is a function of the Aerosol Optical Thickness (AOT). Required AOT information was acquired from the MODIS AOT product at 1° × 1° spatial resolution. Since the blue-sky albedo depends on the solar zenith angle (SZA), the 8-day mean blue-sky albedo values were computed as averages of the corresponding values for the representative SZAs covering the 24-hour day. The estimated blue-sky albedo time series was analyzed to capture changes during the 15-year period. All computations were performed using the Google Earth Engine (GEE). The GEE provided access to all the MODIS products needed for the analysis without the need for searching or downloading. Moreover, the combination of MODIS products in both temporal and spatial terms was fast and effective using the GEE API (Application Program Interface). All the products covering the globe and for the time period of 15 years were processed via a single collection. Most importantly, GEE allowed for including the calculation of SZAs covering the 24-hour day, which improves the quality of the overall product. The 8-day global products of land surface albedo are available through http://www.rslab.gr/downloads.html
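
    For reference, the blue-sky albedo described above is conventionally approximated as a diffuse-fraction-weighted combination of the black-sky and white-sky albedos; the notation below is generic rather than taken from this record:

    ```latex
    \alpha_{\mathrm{blue}}(\theta_s) \;\approx\;
    \bigl[1 - f_{\mathrm{diff}}(\theta_s,\mathrm{AOT})\bigr]\,\alpha_{\mathrm{black}}(\theta_s)
    \;+\; f_{\mathrm{diff}}(\theta_s,\mathrm{AOT})\,\alpha_{\mathrm{white}}
    ```

    where θ_s is the solar zenith angle and f_diff is the fraction of diffuse irradiance, which depends on the aerosol optical thickness.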

  8. (EDMUNDS, WA) WILDLAND FIRE EMISSIONS MODELING: INTEGRATING BLUESKY AND SMOKE

    EPA Science Inventory

    This presentation is a status update of the BlueSky emissions modeling system. BlueSky-EM has been coupled with the Sparse Matrix Operator Kernel Emissions (SMOKE) system, and is now available as a tool for estimating emissions from wildland fires.

  9. Evaluation of a wildfire smoke forecasting system as a tool for public health protection.

    PubMed

    Yao, Jiayun; Brauer, Michael; Henderson, Sarah B

    2013-10-01

    Exposure to wildfire smoke has been associated with cardiopulmonary health impacts. Climate change will increase the severity and frequency of smoke events, suggesting a need for enhanced public health protection. Forecasts of smoke exposure can facilitate public health responses. We evaluated the utility of a wildfire smoke forecasting system (BlueSky) for public health protection by comparing its forecasts with observations and assessing their associations with population-level indicators of respiratory health in British Columbia, Canada. We compared BlueSky PM2.5 forecasts with PM2.5 measurements from air quality monitors, and BlueSky smoke plume forecasts with plume tracings from National Oceanic and Atmospheric Administration Hazard Mapping System remote sensing data. Daily counts of the asthma drug salbutamol sulfate dispensations and asthma-related physician visits were aggregated for each geographic local health area (LHA). Daily continuous measures of PM2.5 and binary measures of smoke plume presence, either forecasted or observed, were assigned to each LHA. Poisson regression was used to estimate the association between exposure measures and health indicators. We found modest agreement between forecasts and observations, which was improved during intense fire periods. A 30-μg/m3 increase in BlueSky PM2.5 was associated with an 8% increase in salbutamol dispensations and a 5% increase in asthma-related physician visits. BlueSky plume coverage was associated with 5% and 6% increases in the two health indicators, respectively. The effects were similar for observed smoke, and generally stronger in very smoky areas. BlueSky forecasts showed modest agreement with retrospective measures of smoke and were predictive of respiratory health indicators, suggesting they can provide useful information for public health protection.
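
    A minimal sketch of the Poisson regression described above, written against synthetic stand-in data (the column names, coefficient values, and data are hypothetical; this is not the authors' code):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Synthetic stand-in for the daily exposure and health-indicator series of one LHA.
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "pm25_forecast": rng.gamma(2.0, 5.0, size=365),   # BlueSky PM2.5 forecast (ug/m3)
        "plume": rng.integers(0, 2, size=365),            # forecast plume over the LHA (0/1)
    })
    # Coefficient chosen only so the synthetic effect is near the reported ~8% per 30 ug/m3.
    df["dispensations"] = rng.poisson(np.exp(1.5 + 0.0026 * df["pm25_forecast"]))

    # Poisson regression of the health indicator on the exposure measures.
    X = sm.add_constant(df[["pm25_forecast", "plume"]])
    fit = sm.GLM(df["dispensations"], X, family=sm.families.Poisson()).fit()

    # Rate ratio for a 30 ug/m3 increase in forecast PM2.5.
    print(np.exp(30.0 * fit.params["pm25_forecast"]))
    ```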

  10. Evaluation of a Wildfire Smoke Forecasting System as a Tool for Public Health Protection

    PubMed Central

    Brauer, Michael; Henderson, Sarah B.

    2013-01-01

    Background: Exposure to wildfire smoke has been associated with cardiopulmonary health impacts. Climate change will increase the severity and frequency of smoke events, suggesting a need for enhanced public health protection. Forecasts of smoke exposure can facilitate public health responses. Objectives: We evaluated the utility of a wildfire smoke forecasting system (BlueSky) for public health protection by comparing its forecasts with observations and assessing their associations with population-level indicators of respiratory health in British Columbia, Canada. Methods: We compared BlueSky PM2.5 forecasts with PM2.5 measurements from air quality monitors, and BlueSky smoke plume forecasts with plume tracings from National Oceanic and Atmospheric Administration Hazard Mapping System remote sensing data. Daily counts of the asthma drug salbutamol sulfate dispensations and asthma-related physician visits were aggregated for each geographic local health area (LHA). Daily continuous measures of PM2.5 and binary measures of smoke plume presence, either forecasted or observed, were assigned to each LHA. Poisson regression was used to estimate the association between exposure measures and health indicators. Results: We found modest agreement between forecasts and observations, which was improved during intense fire periods. A 30-μg/m3 increase in BlueSky PM2.5 was associated with an 8% increase in salbutamol dispensations and a 5% increase in asthma-related physician visits. BlueSky plume coverage was associated with 5% and 6% increases in the two health indicators, respectively. The effects were similar for observed smoke, and generally stronger in very smoky areas. Conclusions: BlueSky forecasts showed modest agreement with retrospective measures of smoke and were predictive of respiratory health indicators, suggesting they can provide useful information for public health protection. Citation: Yao J, Brauer M, Henderson SB. 2013. Evaluation of a wildfire smoke forecasting system as a tool for public health protection. Environ Health Perspect 121:1142–1147; http://dx.doi.org/10.1289/ehp.1306768 PMID:23906969

  11. Can We Build an Open-Science Model to Fund Young, Risky, Blue-Sky Research? First Insights into Funding Geoscientists Via Thinkable.Org

    NASA Astrophysics Data System (ADS)

    McNeil, B.

    2014-12-01

    Some of the biggest discoveries and advances in geoscience research have come from purely curiosity-driven, blue-sky research. Marine biologist Osamu Shimomura's discovery of Green Fluorescent Protein (GFP) in the 1960s during his postdoc is just one example; it came about through his interest in how certain jellyfish bioluminesce. His discovery would eventually revolutionise medicine, culminating in a Nobel Prize in Chemistry in 2008. Despite the known importance of "blue-sky" research that doesn't have immediate commercial or social applications, it continues to struggle for funding from both government and industry. Success rates for young scientists also continue to decline within the government competitive granting models due to the importance of track records, yet history tells us that young scientists tend to come up with science's greatest discoveries. The digital age, however, gives us a new opportunity to create an alternative and sustainable funding model for young, risky, blue-sky science that tends not to be supported by governments and industry anymore. Here I will discuss how new digital platforms empower researchers and organisations to showcase their research using video, allowing wider community engagement and funding that can be used to directly support young, risky, blue-sky research that is so important to the future of science. I will then talk about recent experience with this model from some ocean researchers who used a new platform called thinkable.org to showcase and raise funding from the public.

  12. Applications of Satellite Remote Sensing Products to Enhance and Evaluate the AIRPACT Regional Air Quality Modeling System

    NASA Astrophysics Data System (ADS)

    Herron-Thorpe, F. L.; Mount, G. H.; Emmons, L. K.; Lamb, B. K.; Jaffe, D. A.; Wigder, N. L.; Chung, S. H.; Zhang, R.; Woelfle, M.; Vaughan, J. K.; Leung, F. T.

    2013-12-01

    The WSU AIRPACT air quality modeling system for the Pacific Northwest forecasts hourly levels of aerosols and atmospheric trace gases for use in determining potential health and ecosystem impacts by air quality managers. AIRPACT uses the WRF/SMOKE/CMAQ modeling framework, derives dynamic boundary conditions from MOZART-4 forecast simulations with assimilated MOPITT CO, and uses the BlueSky framework to derive fire emissions. A suite of surface measurements and satellite-based remote sensing data products across the AIRPACT domain are used to evaluate and improve model performance. Specific investigations include anthropogenic emissions, wildfire simulations, and the effects of long-range transport on surface ozone. In this work we synthesize results for multiple comparisons of AIRPACT with satellite products such as IASI ammonia, AIRS carbon monoxide, MODIS AOD, OMI tropospheric ozone and nitrogen dioxide, and MISR plume height. Features and benefits of the newest version of AIRPACT's web-interface are also presented.

  13. WILDFIRE EMISSION MODELING: INTEGRATING BLUESKY AND SMOKE

    EPA Science Inventory

    Atmospheric chemical transport models are used to simulate historic meteorological episodes for developing air quality management strategies. Wildland fire emissions need to be characterized accurately to achieve these air quality management goals. The temporal and spatial esti...

  14. Idiosyncrasies of volcanic sulfur viscosity and the triggering of unheralded volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Scolamacchia, Teresa; Cronin, Shane

    2016-03-01

    Unheralded "blue-sky" eruptions from dormant volcanoes cause serious fatalities, such as at Mt. Ontake (Japan) on 27 September 2014. Could these events result from magmatic gas being trapped within hydrothermal system aquifers by elemental sulfur (Se) clogging pores, due to sharp increases in its viscosity when heated above 159oC? This mechanism was thought to prime unheralded eruptions at Mt. Ruapehu in New Zealand. Impurities in sulfur (As, Te, Se) are known to modify S-viscosity and industry experiments showed that organic compounds, H2S, and halogens dramatically influence Se viscosity under typical hydrothermal heating/cooling rates and temperature thresholds. However, the effects of complex sulfur compositions are currently ignored at volcanoes, despite its near ubiquity in long-lived volcano-hydrothermal systems. Models of impure S behavior must be urgently formulated to detect pre-eruptive warning signs before the next "blue-sky" eruption

  15. WILDLAND FIRE EMISSION MODELING: INTEGRATING BLUESKY AND SMOKE

    EPA Science Inventory

    This presentation is a summary of an improved method to estimate emissions from wildland fires. An interagency agreement between the US Forest Service and the US EPA has made it possible for these two agencies to collaborate in the study of wildland fires.

  16. A General Cross-Layer Cloud Scheduling Framework for Multiple IoT Computer Tasks.

    PubMed

    Wu, Guanlin; Bao, Weidong; Zhu, Xiaomin; Zhang, Xiongtao

    2018-05-23

    The diversity of IoT services and applications brings enormous challenges to improving the performance of multiple computer tasks' scheduling in cross-layer cloud computing systems. Unfortunately, the commonly-employed frameworks fail to adapt to the new patterns on the cross-layer cloud. To solve this issue, we design a new computer task scheduling framework for multiple IoT services in cross-layer cloud computing systems. Specifically, we first analyze the features of the cross-layer cloud and computer tasks. Then, we design the scheduling framework based on the analysis and present detailed models to illustrate the procedures of using the framework. With the proposed framework, the IoT services deployed in cross-layer cloud computing systems can dynamically select suitable algorithms and use resources more effectively to finish computer tasks with different objectives. Finally, the algorithms are given based on the framework, and extensive experiments are also given to validate its effectiveness, as well as its superiority.

  17. Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Klems, Markus; Nimis, Jens; Tai, Stefan

    On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability for Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measure costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and to compare these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real world scenarios.
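
    As a toy illustration of the kind of cost comparison such a valuation framework formalizes (all prices, demand figures, and amortization assumptions below are hypothetical, not from the paper):

    ```python
    # Compare a pay-per-use cloud deployment against on-premises servers sized for peak demand.
    # Every number here is a made-up assumption for illustration.
    hours_per_month = 730
    on_demand_rate = 0.20                                     # $ per instance-hour
    monthly_demand = [2, 2, 3, 8, 12, 10, 4, 2, 2, 2, 3, 2]   # average instances needed

    cloud_cost = sum(n * on_demand_rate * hours_per_month for n in monthly_demand)

    peak = max(monthly_demand)
    server_capex = 4000.0      # $ per server, amortized over 36 months
    server_opex = 60.0         # $ per server-month (power, admin, hosting)
    onprem_cost = peak * (server_capex / 36 * 12 + server_opex * 12)

    print(f"cloud: ${cloud_cost:,.0f}/yr   on-premises (sized for peak): ${onprem_cost:,.0f}/yr")
    ```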

  18. Higher Education, Knowledge for Its Own Sake, and an African Moral Theory

    ERIC Educational Resources Information Center

    Metz, Thaddeus

    2009-01-01

    I seek to answer the question of whether publicly funded higher education ought to aim intrinsically to promote certain kinds of "blue-sky" knowledge, knowledge that is unlikely to result in "tangible" or "concrete" social benefits such as health, wealth and liberty. I approach this question in light of an African moral theory, which contrasts…

  19. 77 FR 17534 - Self-Regulatory Organizations; The Depository Trust Company; Notice of Filing of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-26

    ... Association (``SIFMA'') formed the MMI Blue-Sky Task Force (``Task Force'') to address systemic and unique... processing. The Task Force, along other money market industry members,\\8\\ determined that DTC's current MMI... amount or proceeds after the 3 p.m. E.T. deadline for RTP instructions.\\9\\ Accordingly, DTC is proposing...

  20. Contribution of regional-scale fire events to ozone and PM2.5 air quality estimated by photochemical modeling approaches

    EPA Science Inventory

    Two specific fires from 2011 are tracked for local to regional scale contribution to ozone (O3) and fine particulate matter (PM2.5) using a freely available regulatory modeling system that includes the BlueSky wildland fire emissions tool, Sparse Matrix Operator Kernel Emissions (...

  1. Large-eddy simulation of subtropical cloud-topped boundary layers: 1. A forcing framework with closed surface energy balance

    NASA Astrophysics Data System (ADS)

    Tan, Zhihong; Schneider, Tapio; Teixeira, João.; Pressel, Kyle G.

    2016-12-01

    Large-eddy simulation (LES) of clouds has the potential to resolve a central question in climate dynamics, namely, how subtropical marine boundary layer (MBL) clouds respond to global warming. However, large-scale processes need to be prescribed or represented parametrically in the limited-area LES domains. It is important that the representation of large-scale processes satisfies constraints such as a closed energy balance in a manner that is realizable under climate change. For example, LES with fixed sea surface temperatures usually do not close the surface energy balance, potentially leading to spurious surface fluxes and cloud responses to climate change. Here a framework for forcing LES of subtropical MBL clouds is presented that enforces a closed surface energy balance by coupling atmospheric LES to an ocean mixed layer with a sea surface temperature (SST) that depends on radiative fluxes and sensible and latent heat fluxes at the surface. A variety of subtropical MBL cloud regimes (stratocumulus, cumulus, and stratocumulus over cumulus) are simulated successfully within this framework. However, unlike in conventional frameworks with fixed SST, feedbacks between cloud cover and SST arise, which can lead to sudden transitions between cloud regimes (e.g., stratocumulus to cumulus) as forcing parameters are varied. The simulations validate this framework for studies of MBL clouds and establish its usefulness for studies of how the clouds respond to climate change.
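
    One generic way to write the slab-ocean closure described above is an energy balance for an ocean mixed layer of depth h (the paper's exact formulation, e.g. any imposed ocean heat transport term, may differ):

    ```latex
    \rho_w c_w h \,\frac{d\,\mathrm{SST}}{dt}
    \;=\; (1-\alpha_s)\,R_{\mathrm{SW}}^{\downarrow}
    \;+\; R_{\mathrm{LW}}^{\downarrow}
    \;-\; \varepsilon\sigma\,\mathrm{SST}^{4}
    \;-\; \mathrm{SH} \;-\; \mathrm{LH}
    ```

    where ρ_w and c_w are the density and specific heat of seawater, α_s the surface albedo, and SH and LH the sensible and latent heat fluxes, so that the SST responds to the surface radiative and turbulent fluxes and the surface energy balance closes.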

  2. A Framework and Improvements of the Korea Cloud Services Certification System.

    PubMed

    Jeon, Hangoo; Seo, Kwang-Kyu

    2015-01-01

    Cloud computing is an evolving paradigm that affects a large part of the ICT industry and provides new opportunities for ICT service providers, such as the deployment of new business models and the realization of economies of scale by increasing the efficiency of resource utilization. However, despite the benefits of cloud services, there are obstacles to adoption, such as the lack of means for assessing and comparing the service quality of cloud services regarding availability, security, and reliability. To support the successful adoption and uptake of cloud services, it is necessary to establish a cloud service certification system that ensures the quality and performance of cloud services. This paper proposes a framework and improvements for the Korea certification system for cloud services. To develop it, the critical issues related to service quality, performance, and certification of cloud services are identified, and a systematic framework for the certification system of cloud services and service provider domains is developed. Improvements to the developed Korea certification system for cloud services are also proposed.

  3. A Framework and Improvements of the Korea Cloud Services Certification System

    PubMed Central

    Jeon, Hangoo

    2015-01-01

    Cloud computing is an evolving paradigm that affects a large part of the ICT industry and provides new opportunities for ICT service providers, such as the deployment of new business models and the realization of economies of scale by increasing the efficiency of resource utilization. However, despite the benefits of cloud services, there are obstacles to adoption, such as the lack of means for assessing and comparing the service quality of cloud services regarding availability, security, and reliability. To support the successful adoption and uptake of cloud services, it is necessary to establish a cloud service certification system that ensures the quality and performance of cloud services. This paper proposes a framework and improvements for the Korea certification system for cloud services. To develop it, the critical issues related to service quality, performance, and certification of cloud services are identified, and a systematic framework for the certification system of cloud services and service provider domains is developed. Improvements to the developed Korea certification system for cloud services are also proposed. PMID:26125049

  4. A lightweight distributed framework for computational offloading in mobile cloud computing.

    PubMed

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by the limited processing power, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.

  5. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    PubMed Central

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by the limited processing power, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245

  6. Integration of hybrid wireless networks in cloud services oriented enterprise information systems

    NASA Astrophysics Data System (ADS)

    Li, Shancang; Xu, Lida; Wang, Xinheng; Wang, Jue

    2012-05-01

    This article presents a hybrid wireless network integration scheme for cloud services-based enterprise information systems (EISs). With the emergence of hybrid wireless networks and cloud computing technologies, it is necessary to develop a scheme that can seamlessly integrate these new technologies into existing EISs. By combining hybrid wireless networks and cloud computing in EIS, a new framework is proposed, which includes a frontend layer, a middle layer and backend layers connected to IP EISs. Based on a collaborative architecture, a cloud services management framework and process diagram are presented. As a key feature, the proposed approach integrates access control functionalities within the hybrid framework, providing users with filtered views of available cloud services based on cloud service access requirements and user security credentials. In future work, we will implement the proposed framework on the SwanMesh platform by integrating the UPnP standard into an enterprise information system.

  7. A clear picture of smoke: Bluesky smoke forecasting.

    Treesearch

    Valerie Rapp

    2006-01-01

    Over the last several decades, the overall air quality goal in the United States has been to protect public health and clear skies by reducing emissions. At the same time, however, the risk of catastrophic fire has been rising in forests around the country as overly dense trees and understory brush crowd the stands. Prescribed fire—planned, controlled burning within...

  8. RBioCloud: A Light-Weight Framework for Bioconductor and R-based Jobs on the Cloud.

    PubMed

    Varghese, Blesson; Patel, Ishan; Barker, Adam

    2015-01-01

    Large-scale ad hoc analytics of genomic data is commonly performed using the R programming language, supported by over 700 software packages provided by Bioconductor. More recently, analytical jobs have benefitted from on-demand computing and storage, their scalability and their low maintenance cost, all of which are offered by the cloud. While biologists and bioinformaticists can take an analytical job and execute it on their personal workstations, it remains challenging to seamlessly execute the job on the cloud infrastructure without extensive knowledge of the cloud dashboard. This paper explores how analytical jobs can be executed on the cloud with minimal effort, and how both the resources and data required by a job can be managed. An open-source light-weight framework for executing R-scripts using Bioconductor packages, referred to as `RBioCloud', is designed and developed. RBioCloud offers a set of simple command-line tools for managing the cloud resources, the data and the execution of the job. Three biological test cases validate the feasibility of RBioCloud. The framework is available from http://www.rbiocloud.com.

  9. Generalized Intelligent Framework for Tutoring (GIFT) Cloud/Virtual Open Campus Quick Start Guide (Revision 1)

    DTIC Science & Technology

    2017-06-01

    for GIFT Cloud, the web-based application version of the Generalized Intelligent Framework for Tutoring (GIFT). GIFT is a modular, open-source...external applications. GIFT is available to users with a GIFT Account at no cost. GIFT Cloud is an implementation of GIFT. This web-based application...section. Requirements for GIFT Cloud: GIFT Cloud is accessed via a web browser

  10. PhD and the Manager's Dream: Professionalising the Students, the Degree and the Supervisors?

    ERIC Educational Resources Information Center

    Matos, Frederico

    2013-01-01

    This article has two main aims: to analyse relevant literature on the doctoral degree, and to assess whether recent funding changes in the UK have changed the nature of the PhD in the social sciences in a research-intensive and prestigious UK university. Data were collected at BlueSkies University where interviews with social sciences PhD…

  11. Bridging EOS remote sensing measurements and fire emissions, smoke dispersion, and air quality DSS in the Eastern US

    Treesearch

    John J. Qu; Xianjun Hao; Ruixin Yang; Swarvanu Dasgupta; Sanjeeb Bhoi; Menas Kafatos

    1999-01-01

    Fire emissions, smoke dispersion, and air quality are very important for fire fighting and planning of prescribed burning. BlueSkyRAINS (BSR) is a comprehensive and state-of-the-art Decision Support System (DSS) for fire managers and air quality managers to plan fuels treatments and support state air quality smoke regulatory actions, especially related to...

  12. 76 FR 47144 - In the Matter of: Jianwei Ding, 51 Bukit Batok Crescent, #0828 Unity Centre, Singapore 658077...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-04

    ... the China Academy of Space Technology (``CAST''), participated in a scheme whereby he directed... instructions and informed Ding that he ``had to make up the story [when] I call for [a] rate quote.'' On or... kilograms of Toray M40 material to New Bluesky Technology Co. Ltd. in Hong Kong. These exports were destined...

  13. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    DOE PAGES

    Hagos, Samson; Feng, Zhe; Plant, Robert S.; ...

    2018-02-20

    A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach to constructing a master equation for representing the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces observed behavior of convective cell populations and CPM simulated cloud-base mass flux variability under diurnally varying forcing. Finally, in addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.
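
    The master equation referred to above can be sketched, in its simplest birth-death form and with our own notation, as follows (the STOMP formulation additionally tracks cell size and cloud-base mass flux):

    ```latex
    \frac{\partial P(n,t)}{\partial t}
    \;=\; \lambda_{n-1}\,P(n-1,t) \;+\; \mu_{n+1}\,P(n+1,t)
    \;-\; \bigl(\lambda_n + \mu_n\bigr)\,P(n,t)
    ```

    where P(n,t) is the probability of having n convective cells of a given size, and λ_n and μ_n are the growth and decay rates modulated by the large-scale forcing.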

  14. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    NASA Astrophysics Data System (ADS)

    Hagos, Samson; Feng, Zhe; Plant, Robert S.; Houze, Robert A.; Xiao, Heng

    2018-02-01

    A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach to constructing a master equation for representing the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces observed behavior of convective cell populations and CPM simulated cloud-base mass flux variability under diurnally varying forcing. In addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.

  15. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagos, Samson; Feng, Zhe; Plant, Robert S.

    A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach to constructing a master equation for representing the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces observed behavior of convective cell populations and CPM simulated cloud-base mass flux variability under diurnally varying forcing. Finally, in addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.

  16. Accumulo/Hadoop, MongoDB, and Elasticsearch Performance for Semi Structured Intrusion Detection (IDS) Data

    DTIC Science & Technology

    2016-11-01

    Contents: Introduction; Background; Yahoo! Cloud Serving Benchmark (YCSB); Data Loading and Performance Testing Framework. ...When originally setting out to perform the...that referred to a data loading and performance testing framework, the Yahoo! Cloud Serving Benchmark (YCSB). This framework is freely available and

  17. Generic-distributed framework for cloud services marketplace based on unified ontology.

    PubMed

    Hasan, Samer; Valli Kumari, V

    2017-11-01

    Cloud computing is a pattern for delivering ubiquitous and on demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find an adequate service. Unfortunately, general purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud services discovery and selection process, and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms: a dominant and recessive attributes algorithm borrowed from gene science, and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.
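
    Neither of the paper's two matching algorithms is reproduced here, but the following toy sketch conveys the general flavor of ranking advertised services against a consumer request by weighted attribute similarity (all class names, attributes, and weights are hypothetical):

    ```python
    # Toy service-matching sketch: score each advertised offer against a consumer request.
    from dataclasses import dataclass

    @dataclass
    class Offer:
        name: str
        attrs: dict  # e.g. {"vcpus": 4, "ram_gb": 16, "region": "eu", "price": 0.19}

    def score(request: dict, offer: Offer, weights: dict) -> float:
        s = 0.0
        for key, wanted in request.items():
            have = offer.attrs.get(key)
            if have is None:
                continue
            if isinstance(wanted, (int, float)):
                # closer numeric values score higher
                s += weights.get(key, 1.0) * (1.0 - abs(have - wanted) / max(have, wanted))
            else:
                s += weights.get(key, 1.0) * (1.0 if have == wanted else 0.0)
        return s

    offers = [Offer("A", {"vcpus": 4, "ram_gb": 16, "region": "eu", "price": 0.19}),
              Offer("B", {"vcpus": 8, "ram_gb": 32, "region": "us", "price": 0.35})]
    request = {"vcpus": 4, "ram_gb": 16, "region": "eu"}
    ranked = sorted(offers, key=lambda o: score(request, o, {"region": 2.0}), reverse=True)
    print([o.name for o in ranked])
    ```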

  18. Proceedings of the First International Research Workshop for Process Improvement in Small Settings, 2005

    DTIC Science & Technology

    2006-01-01

    at TrialStat Corporation, a software company that develops applications for clinical research and drug development, and a senior consultant with...Research." Proceedings of the conference on Designing interactive systems: processes, practices, methods, and techniques. New York, NY: ACM Press, 2000...Rogers 97] Rogers, Yvonne & Victoria Bellotti. "Grounding Blue-Sky Research: How can Ethnography Help." ACM Interactions Magazine (May-June 1997

  19. A framework based on 2-D Taylor expansion for quantifying the impacts of subpixel reflectance variance and covariance on cloud optical thickness and effective radius retrievals based on the bispectral method

    NASA Astrophysics Data System (ADS)

    Zhang, Z.; Werner, F.; Cho, H.-M.; Wind, G.; Platnick, S.; Ackerman, A. S.; Di Girolamo, L.; Marshak, A.; Meyer, K.

    2016-06-01

    The bispectral method retrieves cloud optical thickness (τ) and cloud droplet effective radius (re) simultaneously from a pair of cloud reflectance observations, one in a visible or near-infrared (VIS/NIR) band and the other in a shortwave infrared (SWIR) band. A cloudy pixel is usually assumed to be horizontally homogeneous in the retrieval. Ignoring subpixel variations of cloud reflectances can lead to a significant bias in the retrieved τ and re. In the literature, the retrievals of τ and re are often assumed to be independent and considered separately when investigating the impact of subpixel cloud reflectance variations on the bispectral method. As a result, the impact on τ is contributed only by the subpixel variation of VIS/NIR band reflectance and the impact on re only by the subpixel variation of SWIR band reflectance. In our new framework, we use the Taylor expansion of a two-variable function to understand and quantify the impacts of subpixel variances of VIS/NIR and SWIR cloud reflectances and their covariance on the τ and re retrievals. This framework takes into account the fact that the retrievals are determined by both VIS/NIR and SWIR band observations in a mutually dependent way. In comparison with previous studies, it provides a more comprehensive understanding of how subpixel cloud reflectance variations impact the τ and re retrievals based on the bispectral method. In particular, our framework provides a mathematical explanation of how the subpixel variation in VIS/NIR band influences the re retrieval and why it can sometimes outweigh the influence of variations in the SWIR band and dominate the error in re retrievals, leading to a potential contribution of positive bias to the re retrieval. We test our framework using synthetic cloud fields from a large-eddy simulation and real observations from Moderate Resolution Imaging Spectroradiometer. The predicted results based on our framework agree very well with the numerical simulations. Our framework can be used to estimate the retrieval uncertainty from subpixel reflectance variations in operational satellite cloud products and to help understand the differences in τ and re retrievals between two instruments.
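
    The expansion underlying this framework can be written generically as a second-order Taylor expansion of a retrieval x ∈ {τ, re} about the pixel-mean reflectances (the notation here is ours, not necessarily the paper's):

    ```latex
    E\bigl[x(R_1,R_2)\bigr] \;\approx\; x(\bar R_1,\bar R_2)
    \;+\; \tfrac{1}{2}\,\frac{\partial^2 x}{\partial R_1^2}\,\mathrm{Var}(R_1)
    \;+\; \tfrac{1}{2}\,\frac{\partial^2 x}{\partial R_2^2}\,\mathrm{Var}(R_2)
    \;+\; \frac{\partial^2 x}{\partial R_1\,\partial R_2}\,\mathrm{Cov}(R_1,R_2)
    ```

    where R_1 and R_2 are the VIS/NIR and SWIR reflectances and the derivatives are evaluated at the pixel means; the variance and covariance terms are the subpixel contributions to retrieval bias discussed above.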

  20. A Framework Based on 2-D Taylor Expansion for Quantifying the Impacts of Sub-Pixel Reflectance Variance and Covariance on Cloud Optical Thickness and Effective Radius Retrievals Based on the Bi-Spectral Method

    NASA Technical Reports Server (NTRS)

    Zhang, Z.; Werner, F.; Cho, H. -M.; Wind, G.; Platnick, S.; Ackerman, A. S.; Di Girolamo, L.; Marshak, A.; Meyer, Kerry

    2016-01-01

    The bi-spectral method retrieves cloud optical thickness (τ) and cloud droplet effective radius (re) simultaneously from a pair of cloud reflectance observations, one in a visible or near-infrared (VIS/NIR) band and the other in a shortwave infrared (SWIR) band. A cloudy pixel is usually assumed to be horizontally homogeneous in the retrieval. Ignoring sub-pixel variations of cloud reflectances can lead to a significant bias in the retrieved τ and re. In the literature, the retrievals of τ and re are often assumed to be independent and considered separately when investigating the impact of sub-pixel cloud reflectance variations on the bi-spectral method. As a result, the impact on τ is contributed only by the sub-pixel variation of VIS/NIR band reflectance and the impact on re only by the sub-pixel variation of SWIR band reflectance. In our new framework, we use the Taylor expansion of a two-variable function to understand and quantify the impacts of sub-pixel variances of VIS/NIR and SWIR cloud reflectances and their covariance on the τ and re retrievals. This framework takes into account the fact that the retrievals are determined by both VIS/NIR and SWIR band observations in a mutually dependent way. In comparison with previous studies, it provides a more comprehensive understanding of how sub-pixel cloud reflectance variations impact the τ and re retrievals based on the bi-spectral method. In particular, our framework provides a mathematical explanation of how the sub-pixel variation in VIS/NIR band influences the re retrieval and why it can sometimes outweigh the influence of variations in the SWIR band and dominate the error in re retrievals, leading to a potential contribution of positive bias to the re retrieval. We test our framework using synthetic cloud fields from a large-eddy simulation and real observations from Moderate Resolution Imaging Spectroradiometer. The predicted results based on our framework agree very well with the numerical simulations. Our framework can be used to estimate the retrieval uncertainty from sub-pixel reflectance variations in operational satellite cloud products and to help understand the differences in τ and re retrievals between two instruments.

  1. A Framework Based on 2-D Taylor Expansion for Quantifying the Impacts of Subpixel Reflectance Variance and Covariance on Cloud Optical Thickness and Effective Radius Retrievals Based on the Bispectral Method

    NASA Technical Reports Server (NTRS)

    Zhang, Z.; Werner, F.; Cho, H.-M.; Wind, G.; Platnick, S.; Ackerman, A. S.; Di Girolamo, L.; Marshak, A.; Meyer, K.

    2016-01-01

    The bispectral method retrieves cloud optical thickness (τ) and cloud droplet effective radius (re) simultaneously from a pair of cloud reflectance observations, one in a visible or near-infrared (VIS/NIR) band and the other in a shortwave infrared (SWIR) band. A cloudy pixel is usually assumed to be horizontally homogeneous in the retrieval. Ignoring subpixel variations of cloud reflectances can lead to a significant bias in the retrieved τ and re. In the literature, the retrievals of τ and re are often assumed to be independent and considered separately when investigating the impact of subpixel cloud reflectance variations on the bispectral method. As a result, the impact on τ is contributed only by the subpixel variation of VIS/NIR band reflectance and the impact on re only by the subpixel variation of SWIR band reflectance. In our new framework, we use the Taylor expansion of a two-variable function to understand and quantify the impacts of subpixel variances of VIS/NIR and SWIR cloud reflectances and their covariance on the τ and re retrievals. This framework takes into account the fact that the retrievals are determined by both VIS/NIR and SWIR band observations in a mutually dependent way. In comparison with previous studies, it provides a more comprehensive understanding of how subpixel cloud reflectance variations impact the τ and re retrievals based on the bispectral method. In particular, our framework provides a mathematical explanation of how the subpixel variation in VIS/NIR band influences the re retrieval and why it can sometimes outweigh the influence of variations in the SWIR band and dominate the error in re retrievals, leading to a potential contribution of positive bias to the re retrieval. We test our framework using synthetic cloud fields from a large-eddy simulation and real observations from Moderate Resolution Imaging Spectroradiometer. The predicted results based on our framework agree very well with the numerical simulations. Our framework can be used to estimate the retrieval uncertainty from subpixel reflectance variations in operational satellite cloud products and to help understand the differences in τ and re retrievals between two instruments.

  2. A holistic image segmentation framework for cloud detection and extraction

    NASA Astrophysics Data System (ADS)

    Shen, Dan; Xu, Haotian; Blasch, Erik; Horvath, Gregory; Pham, Khanh; Zheng, Yufeng; Ling, Haibin; Chen, Genshe

    2013-05-01

    Atmospheric clouds are commonly encountered phenomena affecting visual tracking from air-borne or space-borne sensors. Generally, clouds are difficult to detect and extract because they are complex in shape and interact with sunlight in a complex fashion. In this paper, we propose a clustering-game-theoretic image segmentation approach to identify, extract, and patch clouds. In our framework, the first step is to decompose a given image containing clouds. The problem of image segmentation is considered as a "clustering game". Within this context, the notion of a cluster is equivalent to a classical equilibrium concept from game theory, as the game equilibrium reflects both the internal and external (e.g., two-player) cluster conditions. To obtain the evolutionary stable strategies, we explore three evolutionary dynamics: fictitious play, replicator dynamics, and infection and immunization dynamics (InImDyn). Secondly, we use the boundary and shape features to refine the cloud segments. This step can lower the false alarm rate. In the third step, we remove the detected clouds and patch the empty spots by performing background recovery. We demonstrate our cloud detection framework on a video clip; the results support the effectiveness of the proposed approach.
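
    As a rough, hypothetical sketch of the clustering-game idea (replicator dynamics only; the paper also explores fictitious play and InImDyn, and its payoff construction differs), one cluster can be extracted from a pairwise similarity matrix as the support of an evolutionarily stable state:

      import numpy as np

      def replicator_cluster(A, iters=500, tol=1e-8):
          # A: symmetric, non-negative pixel/superpixel similarity matrix with zero
          # diagonal (assumed not all zeros).  The stationary distribution of the
          # replicator dynamics concentrates on one internally coherent, externally
          # isolated group (a "dominant set").
          n = A.shape[0]
          x = np.full(n, 1.0 / n)              # start at the simplex barycenter
          for _ in range(iters):
              Ax = A @ x
              x_new = x * Ax / (x @ Ax)        # discrete-time replicator update
              if np.abs(x_new - x).sum() < tol:
                  return x_new
              x = x_new
          return x

      # Elements whose weight stays well above zero form the extracted segment;
      # removing them and re-running peels off the next cluster.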

  3. The Simulations of Wildland Fire Smoke PM2.5 in the NWS Air Quality Forecasting Systems

    NASA Astrophysics Data System (ADS)

    Huang, H. C.; Pan, L.; McQueen, J.; Lee, P.; ONeill, S. M.; Ruminski, M.; Shafran, P.; Huang, J.; Stajner, I.; Upadhayay, S.; Larkin, N. K.

    2017-12-01

    The increase in wildland fire intensity and frequency in the United States (U.S.) has led to property loss, human fatalities, and poor air quality due to elevated particulate matter and surface ozone concentrations. The NOAA/National Weather Service (NWS) built the National Air Quality Forecast Capability (NAQFC) on the U.S. Environmental Protection Agency (EPA) Community Multi-scale Air Quality (CMAQ) Modeling System, driven by NCEP North American Mesoscale Forecast System meteorology, to provide ozone and fine particulate matter (PM2.5) forecast guidance to the public. State and local forecasters use the NWS air quality forecast guidance to issue air quality alerts in their areas. The NAQFC PM2.5 predictions include emissions from anthropogenic and biogenic sources, as well as natural sources such as dust storms and wildland fires. The wildland fire emission inputs to the NAQFC are derived from the NOAA National Environmental Satellite, Data, and Information Service Hazard Mapping System fire and smoke detection product and the emission module of the U.S. Forest Service (USFS) BlueSky Smoke Modeling Framework. Wildland fires are unpredictable and can be ignited by natural causes such as lightning or be human-caused. Predicting the future occurrence and behavior of wildland fires, as well as the available biofuel to be burned, is extremely difficult for real-time air quality predictions; assumptions about the next day's wildland fire behavior often have to be made from older observed fire information. Comparisons between the NAQFC modeled PM2.5 and EPA AirNow surface observations show that large errors in PM2.5 prediction can occur when fire smoke emissions are placed at the wrong location and/or time. A configuration of the NAQFC CMAQ system that re-runs the previous 24 hours, during which wildland fires were observed from satellites, has recently been added. This study focuses on the effort to minimize errors in NAQFC PM2.5 predictions by incorporating fire smoke emissions from a recently updated version of the USFS BlueSky system, and it will show how the new approaches have improved PM2.5 predictions in areas both near and downstream of fire sources. Furthermore, Environment and Climate Change Canada (ECCC) fire emissions data are being tested.

  4. Sensitivities of simulated satellite views of clouds to subgrid-scale overlap and condensate heterogeneity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hillman, Benjamin R.; Marchand, Roger T.; Ackerman, Thomas P.

    Satellite simulators are often used to account for limitations in satellite retrievals of cloud properties in comparisons between models and satellite observations. The purpose of the simulator framework is to enable more robust evaluation of model cloud properties, so that differences between models and observations can more confidently be attributed to model errors. However, these simulators are subject to uncertainties themselves. A fundamental uncertainty exists in connecting the spatial scales at which cloud properties are retrieved with those at which clouds are simulated in global models. In this study, we create a series of sensitivity tests using 4 km global model output from the Multiscale Modeling Framework to evaluate the sensitivity of simulated satellite retrievals when applied to climate models whose grid spacing is many tens to hundreds of kilometers. In particular, we examine the impact of cloud and precipitation overlap and of condensate spatial variability. We find the simulated retrievals are sensitive to these assumptions. Specifically, using maximum-random overlap with homogeneous cloud and precipitation condensate, which is often used in global climate models, leads to large errors in MISR and ISCCP-simulated cloud cover and in CloudSat-simulated radar reflectivity. To correct for these errors, an improved treatment of unresolved clouds and precipitation is implemented for use with the simulator framework and is shown to substantially reduce the identified errors.
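
    For readers unfamiliar with the overlap assumption being tested, a minimal, hypothetical subcolumn generator for maximum-random overlap (in the style of commonly used stochastic generators, not the simulator code evaluated in the study) looks like this:

      import numpy as np

      def subcolumns_max_random(cf, n_sub=1000, rng=None):
          # cf: layer cloud fractions ordered top to bottom, values in [0, 1].
          # Returns an (n_sub, nlev) boolean cloud mask: vertically contiguous cloudy
          # layers overlap maximally, layers separated by clear air overlap randomly.
          rng = np.random.default_rng() if rng is None else rng
          nlev = cf.size
          mask = np.zeros((n_sub, nlev), dtype=bool)
          x = rng.random(n_sub)                        # rank for the top layer
          mask[:, 0] = x > 1.0 - cf[0]
          for k in range(1, nlev):
              cloudy_above = x > 1.0 - cf[k - 1]
              # keep the rank where the layer above is cloudy (maximum overlap);
              # redraw within the clear fraction otherwise (random overlap)
              x = np.where(cloudy_above, x, rng.random(n_sub) * (1.0 - cf[k - 1]))
              mask[:, k] = x > 1.0 - cf[k]
          return mask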

  5. Building of a Disaster Recovery Framework for E-Learning Environment Using Private Cloud Collaboration

    ERIC Educational Resources Information Center

    Togawa, Satoshi; Kanenishi, Kazuhide

    2014-01-01

    In this research, we have built a disaster recovery framework for an e-Learning environment, protecting against disasters such as earthquakes, tsunamis, and heavy floods. In particular, our proposed framework is based on private cloud collaboration. We build a prototype system based on IaaS architecture, and this prototype system is constructed by several private…

  6. Towards a true aerosol-and-cloud retrieval scheme

    NASA Astrophysics Data System (ADS)

    Thomas, Gareth; Poulsen, Caroline; Povey, Adam; McGarragh, Greg; Jerg, Matthias; Siddans, Richard; Grainger, Don

    2014-05-01

    The Optimal Retrieval of Aerosol and Cloud (ORAC) - formerly the Oxford-RAL Aerosol and Cloud retrieval - offers a framework that can provide consistent and well characterised properties of both aerosols and clouds from a range of imaging satellite instruments. Several practical issues stand in the way of achieving the potential of this combined scheme, however; in particular, the sometimes conflicting priorities and requirements of aerosol and cloud retrieval problems, and the question of the unambiguous identification of aerosol and cloud pixels. This presentation will describe recent developments made to the ORAC scheme for both aerosol and cloud, and detail how these are being integrated into a single retrieval framework. The implementation of a probabilistic method for pixel identification will also be presented, for both cloud detection and aerosol/cloud type selection. The method is based on Bayesian methods applied to the optimal estimation retrieval output of ORAC and is particularly aimed at providing additional information in the so-called "twilight zone", where pixels cannot be unambiguously identified as either aerosol or cloud and traditional cloud or aerosol products do not provide results.
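
    The Bayesian pixel-identification step can be sketched as follows; this is an illustrative reading of the approach (treating each class's optimal-estimation cost as a log-likelihood), not ORAC's actual code or interface:

      import numpy as np

      def class_posteriors(costs, priors):
          # costs[c]:  best-fit optimal-estimation cost J_c obtained when the retrieval
          #            assumes class c (e.g. "clear", "aerosol", "liquid cloud", "ice cloud")
          # priors[c]: prior probability of class c for this scene
          classes = list(costs)
          log_post = np.array([-0.5 * costs[c] + np.log(priors[c]) for c in classes])
          log_post -= log_post.max()                   # guard against underflow
          p = np.exp(log_post)
          return dict(zip(classes, p / p.sum()))

      # A pixel whose largest posterior is only marginally ahead of the others falls
      # in the "twilight zone" and can be flagged rather than forced into one class.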

  7. Security Certification Challenges in a Cloud Computing Delivery Model

    DTIC Science & Technology

    2010-04-27

    Relevant Security Standards, Certifications, and Guidance: NIST SP 800 series; ISO/IEC 27001 framework; Cloud Security Alliance; Statement of... CSA Domains / Cloud Features; ISO 27001; Cloud Service Provider Responsibility; Government Agency Responsibility; Analyze Security Gaps; Compensating

  8. The Community Cloud retrieval for CLimate (CC4CL) - Part 1: A framework applied to multiple satellite imaging sensors

    NASA Astrophysics Data System (ADS)

    Sus, Oliver; Stengel, Martin; Stapelberg, Stefan; McGarragh, Gregory; Poulsen, Caroline; Povey, Adam C.; Schlundt, Cornelia; Thomas, Gareth; Christensen, Matthew; Proud, Simon; Jerg, Matthias; Grainger, Roy; Hollmann, Rainer

    2018-06-01

    We present here the key features of the Community Cloud retrieval for CLimate (CC4CL) processing algorithm. We focus on the novel features of the framework: the optimal estimation approach in general, explicit uncertainty quantification through rigorous propagation of all known error sources into the final product, and the consistency of our long-term, multi-platform time series provided at various resolutions, from 0.5 to 0.02°. By describing all key input data and processing steps, we aim to inform the user about important features of this new retrieval framework and its potential applicability to climate studies. We provide an overview of the retrieved and derived output variables. These are analysed for four, partly very challenging, scenes collocated with CALIOP (Cloud-Aerosol lidar with Orthogonal Polarization) observations in the high latitudes and over the Gulf of Guinea-West Africa. The results show that CC4CL provides very realistic estimates of cloud top height and cover for optically thick clouds but, where optically thin clouds overlap, returns a height between the two layers. CC4CL is a unique, coherent, multi-instrument cloud property retrieval framework applicable to passive sensor data of several EO missions. Through its flexibility, CC4CL offers the opportunity for combining a variety of historic and current EO missions into one dataset, which, compared to single sensor retrievals, is improved in terms of accuracy and temporal sampling.

  9. Cloud Feedbacks on Greenhouse Warming in a Multi-Scale Modeling Framework with a Higher-Order Turbulence Closure

    NASA Technical Reports Server (NTRS)

    Cheng, Anning; Xu, Kuan-Man

    2015-01-01

    Five-year simulation experiments with a multiscale modeling framework (MMF) with an advanced intermediately prognostic higher-order turbulence closure (IPHOC) in its cloud-resolving model (CRM) component, also known as SPCAM-IPHOC (superparameterized Community Atmosphere Model), are performed to understand the fast tropical (30S-30N) cloud response to an instantaneous doubling of the CO2 concentration with SST held fixed at present-day values. SPCAM-IPHOC has substantially improved the representation of low-level clouds compared with SPCAM, so the cloud responses to greenhouse warming in SPCAM-IPHOC are expected to be more realistic. The changes in rising motion, surface precipitation, cloud cover, and shortwave and longwave cloud radiative forcing in SPCAM-IPHOC under the greenhouse warming will be presented.

  10. Cloud computing strategic framework (FY13 - FY15).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arellano, Lawrence R.; Arroyo, Steven C.; Giese, Gerald J.

    This document presents an architectural framework (plan) and roadmap for the implementation of a robust Cloud Computing capability at Sandia National Laboratories. It is intended to be a living document and serve as the basis for detailed implementation plans, project proposals and strategic investment requests.

  11. Proposal for a Security Management in Cloud Computing for Health Care

    PubMed Central

    Dzombeta, Srdan; Brandis, Knud

    2014-01-01

    Cloud computing is currently one of the most popular themes of information systems research. Considering the nature of the information they process, health care organizations in particular need to assess and treat the specific risks of cloud computing in their information security management system. Therefore, in this paper we propose a framework that includes the most important security processes regarding cloud computing in the health care sector. Starting with a framework of general information security management processes derived from standards of the ISO 27000 family, the most important information security processes for health care organizations using cloud computing are identified, considering the main risks of cloud computing and the type of information processed. The identified processes will help a health care organization using cloud computing to focus on the most important ISMS processes and to establish and operate them at an appropriate level of maturity given its limited resources. PMID:24701137

  12. Proposal for a security management in cloud computing for health care.

    PubMed

    Haufe, Knut; Dzombeta, Srdan; Brandis, Knud

    2014-01-01

    Cloud computing is currently one of the most popular themes of information systems research. Considering the nature of the information they process, health care organizations in particular need to assess and treat the specific risks of cloud computing in their information security management system. Therefore, in this paper we propose a framework that includes the most important security processes regarding cloud computing in the health care sector. Starting with a framework of general information security management processes derived from standards of the ISO 27000 family, the most important information security processes for health care organizations using cloud computing are identified, considering the main risks of cloud computing and the type of information processed. The identified processes will help a health care organization using cloud computing to focus on the most important ISMS processes and to establish and operate them at an appropriate level of maturity given its limited resources.

  13. Security and Cloud Outsourcing Framework for Economic Dispatch

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarker, Mushfiqur R.; Wang, Jianhui; Li, Zuyi

    The computational complexity and problem sizes of power grid applications have increased significantly with the advent of renewable resources and smart grid technologies. The current paradigm for solving these problems consists of in-house high-performance computing infrastructures, which have the drawbacks of high capital expenditures, maintenance costs, and limited scalability. Cloud computing is an ideal alternative due to its powerful computational capacity, rapid scalability, and high cost-effectiveness. A major challenge, however, remains in that the highly confidential grid data is susceptible to potential cyberattacks when outsourced to the cloud. In this work, a security and cloud outsourcing framework is developed for the Economic Dispatch (ED) linear programming application. The security framework transforms the ED linear program into a confidentiality-preserving linear program that masks both the data and the problem structure, thus enabling secure outsourcing to the cloud. Results show that, for large grid test cases, the performance gains and costs outperform those of the in-house infrastructure.
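
    The masking idea can be illustrated with a toy dispatch problem; this is a simplified sketch of affine masking of a linear program (the paper's transformation is more elaborate), and all numbers are hypothetical:

      import numpy as np
      from scipy.optimize import linprog

      rng = np.random.default_rng(0)

      # Toy economic-dispatch-like LP: minimise c^T x subject to A x <= b and bounds.
      c = np.array([20.0, 25.0])                 # generator costs
      A = np.array([[-1.0, -1.0]])               # -(g1 + g2) <= -demand  (meet 150 MW)
      b = np.array([-150.0])

      # Mask the problem before sending it to the cloud: substitute x = M y with a
      # random invertible M and rescale each inequality by a random positive factor.
      M = rng.normal(size=(2, 2)) + 2 * np.eye(2)      # invertible with high probability
      d = rng.uniform(0.5, 2.0, size=A.shape[0])
      c_m, A_m, b_m = M.T @ c, (d[:, None] * A) @ M, d * b
      # Simple variable bounds 0 <= x <= 100 become general constraints after masking.
      A_bnd = np.vstack([M, -M])
      b_bnd = np.array([100.0, 100.0, 0.0, 0.0])

      res = linprog(c_m, A_ub=np.vstack([A_m, A_bnd]), b_ub=np.concatenate([b_m, b_bnd]),
                    bounds=[(None, None)] * 2)          # the "cloud" solves the masked LP
      x = M @ res.x                                     # owner recovers the true dispatch
      print(x, c @ x)                                   # about [100, 50] MW at cost 3250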

  14. Security and Cloud Outsourcing Framework for Economic Dispatch

    DOE PAGES

    Sarker, Mushfiqur R.; Wang, Jianhui; Li, Zuyi; ...

    2017-04-24

    The computational complexity and problem sizes of power grid applications have increased significantly with the advent of renewable resources and smart grid technologies. The current paradigm for solving these problems consists of in-house high-performance computing infrastructures, which have the drawbacks of high capital expenditures, maintenance costs, and limited scalability. Cloud computing is an ideal alternative due to its powerful computational capacity, rapid scalability, and high cost-effectiveness. A major challenge, however, remains in that the highly confidential grid data is susceptible to potential cyberattacks when outsourced to the cloud. In this work, a security and cloud outsourcing framework is developed for the Economic Dispatch (ED) linear programming application. The security framework transforms the ED linear program into a confidentiality-preserving linear program that masks both the data and the problem structure, thus enabling secure outsourcing to the cloud. Results show that, for large grid test cases, the performance gains and costs outperform those of the in-house infrastructure.

  15. Data provenance assurance in the cloud using blockchain

    NASA Astrophysics Data System (ADS)

    Shetty, Sachin; Red, Val; Kamhoua, Charles; Kwiat, Kevin; Njilla, Laurent

    2017-05-01

    The ever-increasing adoption of cloud technology scales up activities such as the creation, exchange, and alteration of cloud data objects, which creates challenges in tracking malicious activities and security violations. Addressing this issue requires the implementation of a data provenance framework so that each data object in the federated cloud environment can be tracked and recorded but cannot be modified. Blockchain technology provides a promising decentralized platform for building tamper-proof systems; its incorruptible distributed ledger complements the need to maintain cloud data provenance. In this paper, we present a cloud-based data provenance framework using blockchain, which traces data record operations and generates provenance data. We anchor provenance data records into blockchain transactions, which provides validation of the provenance data while preserving user privacy. Once the provenance data is uploaded to the global blockchain network, it is extremely challenging to tamper with it. In addition, the provenance data uses hashed user identifiers prior to uploading, so the blockchain nodes cannot link the operations to a particular user. The framework thus ensures that privacy is preserved. We implemented the architecture on ownCloud, uploaded records to the blockchain network, stored records in a provenance database, and developed a prototype in the form of a web service.
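
    A minimal sketch of the record structure (illustrative only; the paper's implementation anchors these hashes into blockchain transactions via ownCloud hooks, which is omitted here):

      import hashlib, json, time

      def sha256(data: bytes) -> str:
          return hashlib.sha256(data).hexdigest()

      def provenance_record(user_id, operation, object_id, prev_hash):
          # The user identifier is hashed (with a salt) before it ever leaves the
          # provider, and each record carries the hash of the previous one so the
          # chain cannot be silently rewritten.
          record = {
              "user": sha256(("salt:" + user_id).encode()),  # pseudonymous identifier
              "op": operation,                                # e.g. "create", "read", "update"
              "object": object_id,
              "time": time.time(),
              "prev": prev_hash,
          }
          record["hash"] = sha256(json.dumps(record, sort_keys=True).encode())
          return record

      r1 = provenance_record("alice", "create", "doc-42", prev_hash="0" * 64)
      r2 = provenance_record("bob", "update", "doc-42", prev_hash=r1["hash"])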

  16. Evaluating and Improving Cloud Processes in the Multi-Scale Modeling Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ackerman, Thomas P.

    2015-03-01

    The research performed under this grant was intended to improve the embedded cloud model in the Multi-scale Modeling Framework (MMF) for convective clouds by using a two-moment microphysics scheme rather than the single-moment scheme used in all the MMF runs to date. The technical report and associated documents describe the results of testing the cloud-resolving model with fixed boundary conditions and the evaluation of model results against data. The overarching conclusion is that such model evaluations are problematic because errors in the forcing fields control the results so strongly that variations in parameterization values cannot be usefully constrained.

  17. Interactive Classification of Construction Materials: Feedback Driven Framework for Annotation and Analysis of 3d Point Clouds

    NASA Astrophysics Data System (ADS)

    Hess, M. R.; Petrovic, V.; Kuester, F.

    2017-08-01

    Digital documentation of cultural heritage structures is increasingly common through the application of different imaging techniques. Many works have focused on the application of laser scanning and photogrammetry techniques for the acquisition of three-dimensional (3D) geometry detailing cultural heritage sites and structures. With an abundance of these 3D data assets, there must be a digital environment where these data can be visualized and analyzed. Presented here is a feedback-driven visualization framework that seamlessly enables interactive exploration and manipulation of massive point cloud data. The focus of this work is on the classification of different building materials with the goal of building more accurate as-built information models of historical structures. User-defined functions have been tested within the interactive point cloud visualization framework to evaluate automated and semi-automated classification of 3D point data. These functions include decisions based on observed color, laser intensity, normal vector, or local surface geometry. Multiple case studies are presented to demonstrate the flexibility and utility of the presented point cloud visualization framework to achieve classification objectives.
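
    A toy example of the kind of user-defined classification function described above (the thresholds and material names are invented; real rules would be tuned per dataset):

      import numpy as np

      def classify_points(rgb, intensity, normals):
          # rgb: (N, 3) colours in [0, 1]; intensity: (N,) laser return intensity in [0, 1];
          # normals: (N, 3) unit normal vectors.  Returns one label per point.
          labels = np.full(len(rgb), "unknown", dtype=object)
          upward = normals[:, 2] > 0.9                      # near-horizontal surfaces
          bright = intensity > 0.6
          reddish = (rgb[:, 0] > rgb[:, 1]) & (rgb[:, 0] > rgb[:, 2])
          labels[reddish & ~upward] = "brick"
          labels[bright & ~reddish] = "plaster"
          labels[upward & ~bright] = "stone floor"
          return labels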

  18. A distributed cloud-based cyberinfrastructure framework for integrated bridge monitoring

    NASA Astrophysics Data System (ADS)

    Jeong, Seongwoon; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.

    2017-04-01

    This paper describes a cloud-based cyberinfrastructure framework for the management of the diverse data involved in bridge monitoring. Bridge monitoring involves various hardware systems, software tools, and laborious activities, including, for example, structural health monitoring (SHM) sensor networks, engineering analysis programs, and visual inspection. Very often, these monitoring systems, tools, and activities are not coordinated, and the collected information is not shared. A well-designed integrated data management framework can support the effective use of the data and thereby enhance bridge management and maintenance operations. The cloud-based cyberinfrastructure framework presented herein is designed to manage not only sensor measurement data acquired from the SHM system but also other relevant information, such as the bridge engineering model and traffic videos, in an integrated manner. For scalability and flexibility, cloud computing services and distributed database systems are employed. The stored information can be accessed through standard web interfaces. For demonstration, the cyberinfrastructure system is implemented for the monitoring of bridges located along the I-275 corridor in the state of Michigan.

  19. A Strategic Approach to Network Defense: Framing the Cloud

    DTIC Science & Technology

    2011-03-10

    accepted network defensive principles, to reduce risks associated with emerging virtualization capabilities and the scalability of cloud computing. This expanded... defensive framework can assist enterprise networking and cloud computing architects in designing more secure systems.

  20. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagos, Samson; Feng, Zhe; Plant, Robert S.

    A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The approach follows non-equilibrium statistical mechanics through a master equation. The aim is to represent the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as the STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection-permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters, and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and mass flux is a non-linear function of convective cell area, mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces the observed behavior of convective cell populations and CPM-simulated mass flux variability under diurnally varying forcing. Besides its use in developing an understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to be capable of providing alternative, non-equilibrium closure formulations for spectral mass flux parameterizations.
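
    A heavily simplified, hypothetical sketch of the population-dynamics idea (a single size category with stochastic triggering and decay; STOMP itself evolves a full size distribution through a master equation):

      import numpy as np

      def simulate_cell_population(n0, p_decay, birth_rate, steps, rng=None):
          # n0: initial number of convective cells; p_decay: per-step decay probability;
          # birth_rate: mean number of newly triggered cells per step (forcing dependent).
          rng = np.random.default_rng() if rng is None else rng
          n, history = n0, [n0]
          for _ in range(steps):
              survivors = rng.binomial(n, 1.0 - p_decay)   # cells that persist or grow
              n = survivors + rng.poisson(birth_rate)      # plus newly triggered cells
              history.append(n)
          return np.array(history)

      # Cloud-base mass flux could then be diagnosed from the surviving cells, e.g. as a
      # non-linear function of their areas, reproducing recharge-discharge behaviour.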

  1. Toward Realistic Simulation of low-Level Clouds Using a Multiscale Modeling Framework With a Third-Order Turbulence Closure in its Cloud-Resolving Model Component

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man; Cheng, Anning

    2010-01-01

    This study presents preliminary results from a multiscale modeling framework (MMF) with an advanced third-order turbulence closure in its cloud-resolving model (CRM) component. In the original MMF, the Community Atmosphere Model (CAM3.5) is used as the host general circulation model (GCM), and the System for Atmospheric Modeling with a first-order turbulence closure is used as the CRM for representing cloud processes in each grid box of the GCM. The results of annual and seasonal means and diurnal variability are compared between the modified and original MMFs and the CAM3.5. The global distributions of low-level cloud amounts and precipitation and the amounts of low-level clouds in the subtropics and middle-level clouds in mid-latitude storm track regions in the modified MMF show substantial improvement relative to the original MMF when both are compared to observations. Some improvements can also be seen in the diurnal variability of precipitation.

  2. A multiscale modeling framework model (superparameterized CAM5) with a higher-order turbulence closure: Model description and low-cloud simulations

    DOE PAGES

    Wang, Minghuai; Larson, Vincent E.; Ghan, Steven; ...

    2015-04-18

    In this study, a higher-order turbulence closure scheme, called Cloud Layers Unified by Binormals (CLUBB), is implemented into a Multi-scale Modeling Framework (MMF) model to improve low cloud simulations. The performance of CLUBB in MMF simulations with two different microphysics configurations (one-moment cloud microphysics without aerosol treatment and two-moment cloud microphysics coupled with aerosol treatment) is evaluated against observations and further compared with results from the Community Atmosphere Model, Version 5 (CAM5) with conventional cloud parameterizations. CLUBB is found to improve low cloud simulations in the MMF, and the improvement is particularly evident in the stratocumulus-to-cumulus transition regions. Compared to the single-moment cloud microphysics, CLUBB with two-moment microphysics produces clouds that are closer to the coast and agrees better with observations. In the stratocumulus-to-cumulus transition regions, CLUBB with two-moment cloud microphysics produces shortwave cloud forcing in better agreement with observations, while CLUBB with single-moment cloud microphysics overestimates shortwave cloud forcing. CLUBB is further found to produce quantitatively similar improvements in the MMF and CAM5, with slightly better performance in the MMF simulations (e.g., MMF with CLUBB generally produces low clouds that are closer to the coast than CAM5 with CLUBB). As a result, improved low cloud simulations in the MMF make it an even more attractive tool for studying aerosol-cloud-precipitation interactions.

  3. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    PubMed Central

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers, a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012
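
    The map/reduce pattern used for the parallel-processing step can be illustrated with a tiny, self-contained example (plain Python stand-ins for the Hadoop/HBase machinery described in the paper):

      from collections import defaultdict
      from functools import reduce

      # Toy MapReduce-style aggregation: compute a mean value per grid cell from
      # chunked records of the form (cell_id, value).
      def map_chunk(chunk):
          out = defaultdict(lambda: [0.0, 0])
          for cell_id, value in chunk:
              acc = out[cell_id]
              acc[0] += value          # running sum
              acc[1] += 1              # running count
          return dict(out)

      def reduce_partials(a, b):
          for cell_id, (s, n) in b.items():
              if cell_id in a:
                  a[cell_id][0] += s
                  a[cell_id][1] += n
              else:
                  a[cell_id] = [s, n]
          return a

      chunks = [[("A", 1.0), ("B", 2.0)], [("A", 3.0)]]
      totals = reduce(reduce_partials, map(map_chunk, chunks), {})
      means = {cell: s / n for cell, (s, n) in totals.items()}   # {'A': 2.0, 'B': 2.0}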

  4. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    PubMed

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers, a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.

  5. Teachers' Cloud-Based Learning Designs: The Development of a Guiding Rubric Using the TPACK Framework

    ERIC Educational Resources Information Center

    Al-Harthi, Aisha Salim Ali; Campbell, Chris; Karimi, Arafeh

    2018-01-01

    This study aimed to develop, validate, and trial a rubric for evaluating the cloud-based learning designs (CBLD) that were developed by teachers using virtual learning environments. The rubric was developed using the technological pedagogical content knowledge (TPACK) framework, with rubric development including content and expert validation of…

  6. A collaborative computing framework of cloud network and WBSN applied to fall detection and 3-D motion reconstruction.

    PubMed

    Lai, Chin-Feng; Chen, Min; Pan, Jeng-Shyang; Youn, Chan-Hyun; Chao, Han-Chieh

    2014-03-01

    As cloud computing and wireless body sensor network technologies gradually mature, ubiquitous healthcare services can prevent accidents instantly and effectively, as well as provide relevant information to reduce related processing time and cost. This study proposes a co-processing intermediary framework integrating cloud and wireless body sensor networks, which is mainly applied to fall detection and 3-D motion reconstruction. The main focuses of this study include distributed computing and resource allocation for processing sensing data over the computing architecture, network conditions, and performance evaluation. Through this framework, the transmission and computing time of sensing data are reduced to enhance overall performance for the services of fall event detection and 3-D motion reconstruction.

  7. A service brokering and recommendation mechanism for better selecting cloud services.

    PubMed

    Gui, Zhipeng; Yang, Chaowei; Xia, Jizhe; Huang, Qunying; Liu, Kai; Li, Zhenlong; Yu, Manzhu; Sun, Min; Zhou, Nanyin; Jin, Baoxuan

    2014-01-01

    Cloud computing is becoming the new generation computing infrastructure, and many cloud vendors provide different types of cloud services. How to choose the best cloud services for specific applications is very challenging. Addressing this challenge requires balancing multiple factors, such as business demands, technologies, policies, and preferences in addition to the computing requirements. This paper recommends a mechanism for selecting the best public cloud service at the levels of Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). A systematic framework and associated workflow include cloud service filtration, solution generation, evaluation, and selection of public cloud services. Specifically, we propose the following: a hierarchical information model for integrating heterogeneous cloud information from different providers and a corresponding cloud information collecting mechanism; a cloud service classification model for categorizing and filtering cloud services and an application requirement schema for providing rules for creating application-specific configuration solutions; and a preference-aware solution evaluation mode for evaluating and recommending solutions according to the preferences of application providers. To test the proposed framework and methodologies, a cloud service advisory tool prototype was developed, after which relevant experiments were conducted. The results show that the proposed system collects/updates/records the cloud information from multiple mainstream public cloud services in real time, generates feasible cloud configuration solutions according to user specifications and acceptable cost prediction, assesses solutions from multiple aspects (e.g., computing capability, potential cost, and Service Level Agreement (SLA)), and offers rational recommendations based on user preferences and practical cloud provisioning; and visually presents and compares solutions through an interactive web Graphical User Interface (GUI).
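
    The preference-aware evaluation step can be sketched as a weighted scoring of candidate configurations; the criteria names, weights, and scores below are invented for illustration:

      def rank_solutions(solutions, weights):
          # solutions: name -> {criterion: normalised score in [0, 1], higher is better}
          # weights:   criterion -> relative importance set by the application provider
          total_w = sum(weights.values())
          scored = [(sum(weights[c] * s.get(c, 0.0) for c in weights) / total_w, name)
                    for name, s in solutions.items()]
          return sorted(scored, reverse=True)          # best solution first

      solutions = {
          "provider-A-small": {"compute": 0.4, "cost": 0.9, "sla": 0.7},
          "provider-B-large": {"compute": 0.9, "cost": 0.3, "sla": 0.8},
      }
      weights = {"compute": 0.5, "cost": 0.3, "sla": 0.2}
      print(rank_solutions(solutions, weights))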

  8. Generalized Intelligent Framework for Tutoring (GIFT) Cloud/Virtual Open Campus Quick-Start Guide

    DTIC Science & Technology

    2016-03-01

    This document serves as the quick-start guide for GIFT Cloud, the web-based... to users with a GIFT Account at no cost. GIFT Cloud is a new implementation of GIFT. This web-based application allows learners, authors, and... Requirements for GIFT Cloud: GIFT Cloud is accessed via a web browser. Officially, GIFT Cloud has been tested to work on

  9. A Quantitative Risk Analysis Framework for Evaluating and Monitoring Operational Reliability of Cloud Computing

    ERIC Educational Resources Information Center

    Islam, Muhammad Faysal

    2013-01-01

    Cloud computing offers the advantage of on-demand, reliable and cost efficient computing solutions without the capital investment and management resources to build and maintain in-house data centers and network infrastructures. Scalability of cloud solutions enable consumers to upgrade or downsize their services as needed. In a cloud environment,…

  10. A compressive sensing based secure watermark detection and privacy preserving storage framework.

    PubMed

    Qia Wang; Wenjun Zeng; Jun Tian

    2014-03-01

    Privacy is a critical issue when the data owners outsource data storage or processing to a third party computing service, such as the cloud. In this paper, we identify a cloud computing application scenario that requires simultaneously performing secure watermark detection and privacy preserving multimedia data storage. We then propose a compressive sensing (CS)-based framework using secure multiparty computation (MPC) protocols to address such a requirement. In our framework, the multimedia data and secret watermark pattern are presented to the cloud for secure watermark detection in a CS domain to protect the privacy. During CS transformation, the privacy of the CS matrix and the watermark pattern is protected by the MPC protocols under the semi-honest security model. We derive the expected watermark detection performance in the CS domain, given the target image, watermark pattern, and the size of the CS matrix (but without the CS matrix itself). The correctness of the derived performance has been validated by our experiments. Our theoretical analysis and experimental results show that secure watermark detection in the CS domain is feasible. Our framework can also be extended to other collaborative secure signal processing and data-mining applications in the cloud.
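
    A bare-bones sketch of why detection still works in the CS domain (random projections approximately preserve inner products); the multiparty-computation protocols that protect the CS matrix and the watermark in the actual framework are omitted here, and all data are synthetic:

      import numpy as np

      rng = np.random.default_rng(1)
      n, m = 4096, 512                       # signal length, number of CS measurements

      image = rng.normal(size=n)             # host signal (flattened image), toy data
      watermark = rng.choice([-1.0, 1.0], size=n)
      marked = image + 0.2 * watermark       # additive spread-spectrum watermark

      Phi = rng.normal(size=(m, n)) / np.sqrt(m)   # random CS measurement matrix

      # Both parties only ever exchange CS-domain vectors; inner products are roughly
      # preserved under the random projection, so correlation detection still works
      # without revealing the raw image.
      y_marked = Phi @ marked
      y_wm = Phi @ watermark

      stat = y_marked @ y_wm / n             # detection statistic in the CS domain
      print("detector response:", stat)      # roughly 0.2 for marked content, near 0 otherwise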

  11. Smoke and Emissions Model Intercomparison Project (SEMIP)

    NASA Astrophysics Data System (ADS)

    Larkin, N. K.; Raffuse, S.; Strand, T.; Solomon, R.; Sullivan, D.; Wheeler, N.

    2008-12-01

    Fire emissions and smoke impacts from wildland fire are a growing concern due to increasing fire season severity, dwindling tolerance of smoke by the public, tightening air quality regulations, and their role in climate change issues. Unfortunately, while a number of models and modeling system solutions are available to address these issues, the lack of quantitative information on the limitations of and differences between smoke and emissions models impedes the use of these tools for real-world applications (JFSP, 2007). We describe a new, open-access project to directly address this issue, the Smoke and Emissions Model Intercomparison Project (SEMIP), and invite the community to participate. Preliminary work utilizing the modular BlueSky framework to directly compare fire location and size information, fuel loading amounts, fuel consumption rates, and fire emissions from a number of current models has found model-to-model variability as high as two orders of magnitude for an individual fire. Fire emissions inventories also show significant variability on both regional and national scales that is dependent on the fire location information used (ground report vs. satellite), the fuel loading maps assumed, and the fuel consumption models employed. SEMIP expands on this work and creates an open-access database of model results and observations with the goal of furthering model development and model prediction usability for real-world decision support.

  12. Smart Point Cloud: Definition and Remaining Challenges

    NASA Astrophysics Data System (ADS)

    Poux, F.; Hallot, P.; Neuville, R.; Billen, R.

    2016-10-01

    Dealing with coloured point clouds acquired from terrestrial laser scanners, this paper identifies remaining challenges for a new data structure: the smart point cloud. This concept arises from the observation that massive and discretized spatial information from active remote sensing technology is often underused due to data mining limitations. The generalisation of point cloud data, combined with the heterogeneity and temporality of such datasets, is the main issue regarding structure, segmentation, classification, and interaction for an immediate understanding. We propose to use both point cloud properties and human knowledge through machine learning to rapidly extract pertinent information, using user-centered information (smart data) rather than raw data. A review of feature detection, machine learning frameworks, and database systems indexed both for mining queries and data visualisation is presented. Based on existing approaches, we propose a new 3-block flexible framework around device expertise, analytic expertise, and domain-based reflexion. This contribution serves as the first step towards the realisation of a comprehensive smart point cloud data structure.

  13. Capturing and analyzing wheelchair maneuvering patterns with mobile cloud computing.

    PubMed

    Fu, Jicheng; Hao, Wei; White, Travis; Yan, Yuqing; Jones, Maria; Jan, Yih-Kuen

    2013-01-01

    Power wheelchairs have been widely used to provide independent mobility to people with disabilities. Despite great advancements in power wheelchair technology, research shows that wheelchair-related accidents occur frequently. To ensure safe maneuverability, capturing wheelchair maneuvering patterns is fundamental to enabling other research, such as safe robotic assistance for wheelchair users. In this study, we propose to record, store, and analyze wheelchair maneuvering data by means of mobile cloud computing. Specifically, the accelerometer and gyroscope sensors in smart phones are used to record wheelchair maneuvering data in real time. Then, the recorded data are periodically transmitted to the cloud for storage and analysis. The analyzed results are then made available to various types of users, such as mobile phone users and traditional desktop users. The combination of mobile computing and cloud computing leverages the advantages of both techniques and extends the smart phone's capabilities of computing and data storage via the Internet. We performed a case study to implement the mobile cloud computing framework using Android smart phones and Google App Engine, a popular cloud computing platform. Experimental results demonstrated the feasibility of the proposed mobile cloud computing framework.

  14. A framework for quantifying the impacts of sub-pixel reflectance variance and covariance on cloud optical thickness and effective radius retrievals based on the bi-spectral method

    NASA Astrophysics Data System (ADS)

    Zhang, Z.; Werner, F.; Cho, H.-M.; Wind, G.; Platnick, S.; Ackerman, A. S.; Di Girolamo, L.; Marshak, A.; Meyer, Kerry

    2017-02-01

    The so-called bi-spectral method retrieves cloud optical thickness (τ) and cloud droplet effective radius (re) simultaneously from a pair of cloud reflectance observations, one in a visible or near infrared (VIS/NIR) band and the other in a shortwave-infrared (SWIR) band. A cloudy pixel is usually assumed to be horizontally homogeneous in the retrieval. Ignoring sub-pixel variations of cloud reflectances can lead to a significant bias in the retrieved τ and re. In this study, we use the Taylor expansion of a two-variable function to understand and quantify the impacts of sub-pixel variances of VIS/NIR and SWIR cloud reflectances and their covariance on the τ and re retrievals. This framework takes into account the fact that the retrievals are determined by both VIS/NIR and SWIR band observations in a mutually dependent way. In comparison with previous studies, it provides a more comprehensive understanding of how sub-pixel cloud reflectance variations impact the τ and re retrievals based on the bi-spectral method. In particular, our framework provides a mathematical explanation of how the sub-pixel variation in VIS/NIR band influences the re retrieval and why it can sometimes outweigh the influence of variations in the SWIR band and dominate the error in re retrievals, leading to a potential contribution of positive bias to the re retrieval.

  15. A Framework for Quantifying the Impacts of Sub-Pixel Reflectance Variance and Covariance on Cloud Optical Thickness and Effective Radius Retrievals Based on the Bi-Spectral Method.

    NASA Technical Reports Server (NTRS)

    Zhang, Z; Werner, F.; Cho, H. -M.; Wind, Galina; Platnick, S.; Ackerman, A. S.; Di Girolamo, L.; Marshak, A.; Meyer, Kerry

    2017-01-01

    The so-called bi-spectral method retrieves cloud optical thickness (t) and cloud droplet effective radius (re) simultaneously from a pair of cloud reflectance observations, one in a visible or near infrared (VIS/NIR) band and the other in a shortwave-infrared (SWIR) band. A cloudy pixel is usually assumed to be horizontally homogeneous in the retrieval. Ignoring sub-pixel variations of cloud reflectances can lead to a significant bias in the retrieved t and re. In this study, we use the Taylor expansion of a two-variable function to understand and quantify the impacts of sub-pixel variances of VIS/NIR and SWIR cloud reflectances and their covariance on the t and re retrievals. This framework takes into account the fact that the retrievals are determined by both VIS/NIR and SWIR band observations in a mutually dependent way. In comparison with previous studies, it provides a more comprehensive understanding of how sub-pixel cloud reflectance variations impact the t and re retrievals based on the bi-spectral method. In particular, our framework provides a mathematical explanation of how the sub-pixel variation in VIS/NIR band influences the re retrieval and why it can sometimes outweigh the influence of variations in the SWIR band and dominate the error in re retrievals, leading to a potential contribution of positive bias to the re retrieval.

  16. A Service Brokering and Recommendation Mechanism for Better Selecting Cloud Services

    PubMed Central

    Gui, Zhipeng; Yang, Chaowei; Xia, Jizhe; Huang, Qunying; Liu, Kai; Li, Zhenlong; Yu, Manzhu; Sun, Min; Zhou, Nanyin; Jin, Baoxuan

    2014-01-01

    Cloud computing is becoming the new generation computing infrastructure, and many cloud vendors provide different types of cloud services. How to choose the best cloud services for specific applications is very challenging. Addressing this challenge requires balancing multiple factors, such as business demands, technologies, policies, and preferences in addition to the computing requirements. This paper recommends a mechanism for selecting the best public cloud service at the levels of Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). A systematic framework and associated workflow include cloud service filtration, solution generation, evaluation, and selection of public cloud services. Specifically, we propose the following: a hierarchical information model for integrating heterogeneous cloud information from different providers and a corresponding cloud information collecting mechanism; a cloud service classification model for categorizing and filtering cloud services and an application requirement schema for providing rules for creating application-specific configuration solutions; and a preference-aware solution evaluation mode for evaluating and recommending solutions according to the preferences of application providers. To test the proposed framework and methodologies, a cloud service advisory tool prototype was developed, after which relevant experiments were conducted. The results show that the proposed system collects/updates/records the cloud information from multiple mainstream public cloud services in real time, generates feasible cloud configuration solutions according to user specifications and acceptable cost prediction, assesses solutions from multiple aspects (e.g., computing capability, potential cost, and Service Level Agreement (SLA)), and offers rational recommendations based on user preferences and practical cloud provisioning; and visually presents and compares solutions through an interactive web Graphical User Interface (GUI). PMID:25170937

  17. Performance of the Goddard Multiscale Modeling Framework with Goddard Ice Microphysical Schemes

    NASA Technical Reports Server (NTRS)

    Chern, Jiun-Dar; Tao, Wei-Kuo; Lang, Stephen E.; Matsui, Toshihisa; Li, J.-L.; Mohr, Karen I.; Skofronick-Jackson, Gail M.; Peters-Lidard, Christa D.

    2016-01-01

    The multiscale modeling framework (MMF), which replaces traditional cloud parameterizations with cloud-resolving models (CRMs) within a host atmospheric general circulation model (GCM), has become a new approach for climate modeling. The embedded CRMs make it possible to apply CRM-based cloud microphysics directly within a GCM. However, most such schemes have never been tested in a global environment for long-term climate simulation. The benefits of using an MMF to evaluate rigorously and improve microphysics schemes are here demonstrated. Four one-moment microphysical schemes are implemented into the Goddard MMF and their results validated against three CloudSat/CALIPSO cloud ice products and other satellite data. The new four-class (cloud ice, snow, graupel, and frozen drops/hail) ice scheme produces a better overall spatial distribution of cloud ice amount, total cloud fractions, net radiation, and total cloud radiative forcing than earlier three-class ice schemes, with biases within the observational uncertainties. Sensitivity experiments are conducted to examine the impact of recently upgraded microphysical processes on global hydrometeor distributions. Five processes dominate the global distributions of cloud ice and snow amount in long-term simulations: (1) allowing for ice supersaturation in the saturation adjustment, (2) three additional correction terms in the depositional growth of cloud ice to snow, (3) accounting for cloud ice fall speeds, (4) limiting cloud ice particle size, and (5) new size-mapping schemes for snow and graupel. Despite the cloud microphysics improvements, systematic errors associated with subgrid processes, cyclic lateral boundaries in the embedded CRMs, and momentum transport remain and will require future improvement.

  18. Cloud Computing for Teaching Practice: A New Design?

    ERIC Educational Resources Information Center

    Saadatdoost, Robab; Sim, Alex Tze Hiang; Jafarkarimi, Hosein; Hee, Jee Mei; Saadatdoost, Leila

    2014-01-01

    Recently, researchers have shown an increased interest in cloud computing technology. It is becoming increasingly difficult to ignore cloud computing technology in the education context. However, rapid changes in information technology are having a serious effect on teaching framework designs. So far, however, there has been little discussion about…

  19. New framework for extending cloud chemistry in the Community Multiscale Air Quality (CMAQ) modeling

    EPA Science Inventory

    Clouds and fogs significantly impact the amount, composition, and spatial distribution of gas and particulate atmospheric species, not least of which through the chemistry that occurs in cloud droplets. Atmospheric sulfate is an important component of fine aerosol mass and in an...

  20. Cloud Computing Services for Seismic Networks

    NASA Astrophysics Data System (ADS)

    Olson, Michael

    This thesis describes a compositional framework for developing situation awareness applications: applications that provide ongoing information about a user's changing environment. The thesis describes how the framework is used to develop a situation awareness application for earthquakes. The applications are implemented as Cloud computing services connected to sensors and actuators. The architecture and design of the Cloud services are described and measurements of performance metrics are provided. The thesis includes results of experiments on earthquake monitoring conducted over a year. The applications developed by the framework are (1) the CSN---the Community Seismic Network---which uses relatively low-cost sensors deployed by members of the community, and (2) SAF---the Situation Awareness Framework---which integrates data from multiple sources, including the CSN, CISN---the California Integrated Seismic Network, a network consisting of high-quality seismometers deployed carefully by professionals in the CISN organization and spread across Southern California---and prototypes of multi-sensor platforms that include carbon monoxide, methane, dust and radiation sensors.

  1. Analytic Closed-Form Solution of a Mixed Layer Model for Stratocumulus Clouds

    NASA Astrophysics Data System (ADS)

    Akyurek, Bengu Ozge

    Stratocumulus clouds play an important role in climate cooling and are hard to predict using global climate and weather forecast models. Thus, previous studies in the literature use observations and numerical simulation tools, such as large-eddy simulation (LES), to solve the governing equations for the evolution of stratocumulus clouds. In contrast to the previous works, this work provides an analytic closed-form solution to the cloud thickness evolution of stratocumulus clouds in a mixed-layer model framework. With a focus on application over coastal lands, the diurnal cycle of cloud thickness and whether or not clouds dissipate are of particular interest. An analytic solution enables the sensitivity analysis of implicitly interdependent variables and extrema analysis of cloud variables that are hard to achieve using numerical solutions. In this work, the sensitivity of inversion height, cloud-base height, and cloud thickness with respect to initial and boundary conditions, such as Bowen ratio, subsidence, surface temperature, and initial inversion height, are studied. A critical initial cloud thickness value that can be dissipated pre- and post-sunrise is provided. Furthermore, an extrema analysis is provided to obtain the minima and maxima of the inversion height and cloud thickness within 24 h. The proposed solution is validated against LES results under the same initial and boundary conditions. Then, the proposed analytic framework is extended to incorporate multiple vertical columns that are coupled by advection through wind flow. This enables a bridge between the micro-scale and the mesoscale relations. The effect of advection on cloud evolution is studied and a sensitivity analysis is provided.
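
    As a generic illustration (not the thesis's actual equations) of how a closed form can arise in a mixed-layer setting, suppose the inversion height z_i is driven by a constant entrainment rate w_e and large-scale subsidence with divergence D:

      \frac{dz_i}{dt} = w_e - D\,z_i
      \quad\Longrightarrow\quad
      z_i(t) = \frac{w_e}{D} + \left(z_i(0) - \frac{w_e}{D}\right) e^{-D t}

    Cloud thickness then follows as h(t) = z_i(t) - z_b(t) once the cloud-base height (lifting condensation level) z_b is obtained from the mixed-layer heat and moisture budgets; the closed-form solution developed in the thesis couples those budgets and time-varying forcings rather than assuming them constant.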

  2. New Concepts for Refinement of Cumulus Parameterization in GCM's the Arakawa-Schubert Framework

    NASA Technical Reports Server (NTRS)

    Sud, Y. C.; Walker, G. K.; Lau, William (Technical Monitor)

    2002-01-01

    Several state-of-the-art models, including the one employed in this study, use the Arakawa-Schubert framework for moist convection and the Sundqvist formulation of stratiform clouds for moist physics, in-cloud condensation, and precipitation. Despite a variety of cloud parameterization methodologies developed by several modelers, including the authors, most of the parameterized cloud models have similar deficiencies. These consist of: (a) not enough shallow clouds; (b) too many deep clouds; (c) several layers of clouds in a vertically discretized model as opposed to only a few levels of observed clouds; and (d) a higher than normal incidence of a double ITCZ (Inter-tropical Convergence Zone). Even after several upgrades, consisting of sophisticated cloud microphysics and sub-grid scale orographic precipitation, to the Data Assimilation Office (DAO) atmospheric model (the GEOS-2 GCM) at two different resolutions, we found that the above deficiencies remained persistent. The two empirical solutions often used to counter the aforestated deficiencies consist of (a) diffusion of moisture and heat within the lower troposphere to artificially force shallow clouds, and (b) arbitrarily invoking evaporation of in-cloud water for low-level clouds. Even though helpful, these implementations lack a strong physical rationale. Our research shows that two missing physical conditions can ameliorate the aforestated cloud-parameterization deficiencies. First, requiring an ascending cloud airmass to be saturated at its starting point will not only make the cloud instantly buoyant all through its ascent, but also provide the essential work function (buoyancy energy) that would promote more shallow clouds. Second, we argue that entraining clouds that are unstable to a finite vertical displacement, even if neutrally buoyant in their ambient environment, must continue to rise and entrain, causing evaporation of in-cloud water. These concepts have not been invoked in any of the cloud parameterization schemes so far. We introduced them into the DAO GEOS-2 GCM with McRAS (Microphysics of Clouds with the Relaxed Arakawa-Schubert Scheme).

  3. Progress in Understanding the Impacts of 3-D Cloud Structure on MODIS Cloud Property Retrievals for Marine Boundary Layer Clouds

    NASA Technical Reports Server (NTRS)

    Zhang, Zhibo; Werner, Frank; Miller, Daniel; Platnick, Steven; Ackerman, Andrew; DiGirolamo, Larry; Meyer, Kerry; Marshak, Alexander; Wind, Galina; Zhao, Guangyu

    2016-01-01

    Theory: A novel framework based on 2-D Taylor expansion for quantifying the uncertainty in MODIS retrievals caused by sub-pixel reflectance inhomogeneity (Zhang et al. 2016). How cloud vertical structure influences MODIS LWP retrievals (Miller et al. 2016). Observation: Analysis of failed MODIS cloud property retrievals (Cho et al. 2015). Cloud property retrievals from 15 m resolution ASTER observations (Werner et al. 2016). Modeling: LES-satellite observation simulator (Zhang et al. 2012, Miller et al. 2016).

  4. Sideloading - Ingestion of Large Point Clouds Into the Apache Spark Big Data Engine

    NASA Astrophysics Data System (ADS)

    Boehm, J.; Liu, K.; Alis, C.

    2016-06-01

    In the geospatial domain we have now reached the point where data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore naturally lucrative to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not naturally supported by the existing big data frameworks. Instead such file formats are supported by software libraries that are restricted to single CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster and we discuss the implications on scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.
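
    A minimal sketch of the map-based ingestion idea described above, assuming PySpark is available on the cluster and that each worker has a single-node point cloud reader installed (laspy 2.x is used here as one possible choice); the file paths, helper name, and schema are illustrative, not the authors' implementation.

      # Sketch: distribute point cloud file ingestion across a Spark cluster.
      from pyspark.sql import SparkSession
      import laspy  # assumed single-node LAS/LAZ reader, installed on every worker

      def read_las_points(path):
          """Read one LAS file on a worker and yield (x, y, z) tuples."""
          las = laspy.read(path)
          for x, y, z in zip(las.x, las.y, las.z):
              yield (float(x), float(y), float(z))

      spark = SparkSession.builder.appName("pointcloud-ingestion").getOrCreate()
      # Driver-side list of input files; each file is read by whichever worker gets it.
      paths = ["/data/tile_%03d.las" % i for i in range(256)]
      points = (spark.sparkContext
                .parallelize(paths, numSlices=len(paths))  # one partition per file
                .flatMap(read_las_points))                 # map-side ingestion
      df = points.toDF(["x", "y", "z"])
      print(df.count())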

  5. Understanding the tropical cloud feedback from an analysis of the circulation and stability regimes simulated from an upgraded multiscale modeling framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Kuan-Man; Cheng, Anning

    As revealed from studies using conventional general circulation models (GCMs), the thermodynamic contribution to the tropical cloud feedback dominates the dynamic contribution, but these models have difficulty in simulating the subsidence regimes in the tropics. In this study, we analyze the tropical cloud feedback from a 2 K sea surface temperature (SST) perturbation experiment performed with a multiscale modeling framework (MMF). The MMF explicitly represents cloud processes using 2-D cloud-resolving models with an advanced higher-order turbulence closure in each atmospheric column of the host GCM. We sort the monthly mean cloud properties and cloud radiative effects according to circulation and stability regimes. Here, we find that the regime-sorted dynamic changes dominate the thermodynamic changes in terms of the absolute magnitude. The dynamic changes in the weak subsidence regimes exhibit strong negative cloud feedback due to increases in shallow cumulus and deep clouds while those in strongly convective and moderate-to-strong subsidence regimes have opposite signs, resulting in a small contribution to cloud feedback. On the other hand, the thermodynamic changes are large due to decreases in stratocumulus clouds in the moderate-to-strong subsidence regimes with small opposite changes in the weak subsidence and strongly convective regimes, resulting in a relatively large contribution to positive cloud feedback. The dynamic and thermodynamic changes contribute equally to positive cloud feedback and are relatively insensitive to stability in the moderate-to-strong subsidence regimes. But they are sensitive to stability changes from the SST increase in convective and weak subsidence regimes. Lastly, these results have implications for interpreting cloud feedback mechanisms.

  6. Understanding the tropical cloud feedback from an analysis of the circulation and stability regimes simulated from an upgraded multiscale modeling framework

    DOE PAGES

    Xu, Kuan-Man; Cheng, Anning

    2016-11-15

    As revealed from studies using conventional general circulation models (GCMs), the thermodynamic contribution to the tropical cloud feedback dominates the dynamic contribution, but these models have difficulty in simulating the subsidence regimes in the tropics. In this study, we analyze the tropical cloud feedback from a 2 K sea surface temperature (SST) perturbation experiment performed with a multiscale modeling framework (MMF). The MMF explicitly represents cloud processes using 2-D cloud-resolving models with an advanced higher-order turbulence closure in each atmospheric column of the host GCM. We sort the monthly mean cloud properties and cloud radiative effects according to circulation and stability regimes. Here, we find that the regime-sorted dynamic changes dominate the thermodynamic changes in terms of the absolute magnitude. The dynamic changes in the weak subsidence regimes exhibit strong negative cloud feedback due to increases in shallow cumulus and deep clouds while those in strongly convective and moderate-to-strong subsidence regimes have opposite signs, resulting in a small contribution to cloud feedback. On the other hand, the thermodynamic changes are large due to decreases in stratocumulus clouds in the moderate-to-strong subsidence regimes with small opposite changes in the weak subsidence and strongly convective regimes, resulting in a relatively large contribution to positive cloud feedback. The dynamic and thermodynamic changes contribute equally to positive cloud feedback and are relatively insensitive to stability in the moderate-to-strong subsidence regimes. But they are sensitive to stability changes from the SST increase in convective and weak subsidence regimes. Lastly, these results have implications for interpreting cloud feedback mechanisms.

  7. The Dependence of Cloud Property Trend Detection on Absolute Calibration Accuracy of Passive Satellite Sensors

    NASA Astrophysics Data System (ADS)

    Shea, Y.; Wielicki, B. A.; Sun-Mack, S.; Minnis, P.; Zelinka, M. D.

    2016-12-01

    Detecting trends in climate variables on global, decadal scales requires highly accurate, stable measurements and retrieval algorithms. Trend uncertainty depends on its magnitude, natural variability, and instrument and retrieval algorithm accuracy and stability. We applied a climate accuracy framework to quantify the impact of absolute calibration on cloud property trend uncertainty. The cloud properties studied were cloud fraction, effective temperature, optical thickness, and effective radius retrieved using the Clouds and the Earth's Radiant Energy System (CERES) Cloud Property Retrieval System, which uses Moderate-resolution Imaging Spectroradiometer measurements (MODIS). Modeling experiments from the fifth phase of the Climate Model Intercomparison Project (CMIP5) agree that net cloud feedback is likely positive but disagree regarding its magnitude, mainly due to uncertainty in shortwave cloud feedback. With the climate accuracy framework we determined the time to detect trends for instruments with various calibration accuracies. We estimated a relationship between cloud property trend uncertainty, cloud feedback, and Equilibrium Climate Sensitivity and also between effective radius trend uncertainty and aerosol indirect effect trends. The direct relationship between instrument accuracy requirements and climate model output provides the level of instrument absolute accuracy needed to reduce climate model projection uncertainty. Different cloud types have varied radiative impacts on the climate system depending on several attributes, such as their thermodynamic phase, altitude, and optical thickness. Therefore, we also conducted these studies by cloud types for a clearer understanding of instrument accuracy requirements needed to detect changes in their cloud properties. Combining this information with the radiative impact of different cloud types helps to prioritize among requirements for future satellite sensors and understanding the climate detection capabilities of existing sensors.
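
    As a rough illustration of how measurement uncertainty translates into time-to-detect, the sketch below uses a commonly cited approximation (in the style of Weatherhead et al. 1998) for the number of years needed to detect a linear trend in an autocorrelated series; the variable names and numbers are illustrative assumptions, not values from this study.

      import math

      def years_to_detect(trend_per_yr, sigma_var, autocorr, factor=3.3):
          """Approximate years to detect a trend of magnitude trend_per_yr given
          interannual variability sigma_var and lag-1 autocorrelation autocorr."""
          noise = sigma_var * math.sqrt((1 + autocorr) / (1 - autocorr))
          return (factor * noise / abs(trend_per_yr)) ** (2.0 / 3.0)

      # Larger calibration/measurement uncertainty inflates sigma_var and
      # therefore lengthens the detection time.
      print(years_to_detect(trend_per_yr=0.1, sigma_var=0.5, autocorr=0.2))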

  8. A Semantic Based Policy Management Framework for Cloud Computing Environments

    ERIC Educational Resources Information Center

    Takabi, Hassan

    2013-01-01

    Cloud computing paradigm has gained tremendous momentum and generated intensive interest. Although security issues are delaying its fast adoption, cloud computing is an unstoppable force and we need to provide security mechanisms to ensure its secure adoption. In this dissertation, we mainly focus on issues related to policy management and access…

  9. Multi-scale Modeling of Arctic Clouds

    NASA Astrophysics Data System (ADS)

    Hillman, B. R.; Roesler, E. L.; Dexheimer, D.

    2017-12-01

    The presence and properties of clouds are critically important to the radiative budget in the Arctic, but clouds are notoriously difficult to represent in global climate models (GCMs). The challenge stems partly from a disconnect between the scales at which these models are formulated and the scale of the physical processes important to the formation of clouds (e.g., convection and turbulence). Because of this, these processes are parameterized in large-scale models. Over the past decades, new approaches have been explored in which a cloud system resolving model (CSRM), or in the extreme a large eddy simulation (LES), is embedded into each gridcell of a traditional GCM to replace the cloud and convective parameterizations and explicitly simulate more of these important processes. This approach is attractive in that it allows for more explicit simulation of small-scale processes while also allowing for interaction between the small and large-scale processes. The goal of this study is to quantify the performance of this framework in simulating Arctic clouds relative to a traditional global model, and to explore the limitations of such a framework using coordinated high-resolution (eddy-resolving) simulations. Simulations from the global model are compared with satellite retrievals of cloud fraction partitioned by cloud phase from CALIPSO, and limited-area LES simulations are compared with ground-based and tethered-balloon measurements from the ARM Barrow and Oliktok Point measurement facilities.

  10. Cloud-based crowd sensing: a framework for location-based crowd analyzer and advisor

    NASA Astrophysics Data System (ADS)

    Aishwarya, K. C.; Nambi, A.; Hudson, S.; Nadesh, R. K.

    2017-11-01

    Cloud computing is an emerging field of computer science that integrates and explores large, powerful computing systems and storage for personal as well as enterprise requirements. Mobile Cloud Computing extends this concept to mobile handheld devices. Crowdsensing, or more precisely Mobile Crowdsensing, is the process of sharing resources such as data, memory and bandwidth across an available group of mobile handheld devices to perform a single task for collective benefit. In this paper, we propose a framework that uses crowdsensing to analyze the crowd at a location and advise the user on whether or not to go there. This is ongoing research on a concept towards which cloud computing has shifted, and it is open to further expansion in the near future.

  11. Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework

    NASA Astrophysics Data System (ADS)

    Wang, C.; Hu, F.; Sha, D.; Han, X.

    2017-10-01

    Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and other fields. However, it is challenging to efficiently store, query and analyze high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, which takes advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experiment results show that the proposed framework can efficiently manage and process big LiDAR data.

  12. A Comprehensive Review on Adaptability of Network Forensics Frameworks for Mobile Cloud Computing

    PubMed Central

    Abdul Wahab, Ainuddin Wahid; Han, Qi; Bin Abdul Rahman, Zulkanain

    2014-01-01

    Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through the cloud have made Mobile Cloud Computing (MCC) a prime target for network attacks. However, carrying out forensics in MCC is constrained by the autonomous cloud hosting companies and their policies restricting access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC. PMID:25097880

  13. A comprehensive review on adaptability of network forensics frameworks for mobile cloud computing.

    PubMed

    Khan, Suleman; Shiraz, Muhammad; Wahab, Ainuddin Wahid Abdul; Gani, Abdullah; Han, Qi; Rahman, Zulkanain Bin Abdul

    2014-01-01

    Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through the cloud have made Mobile Cloud Computing (MCC) a prime target for network attacks. However, carrying out forensics in MCC is constrained by the autonomous cloud hosting companies and their policies restricting access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC.

  14. Towards a bulk approach to local interactions of hydrometeors

    NASA Astrophysics Data System (ADS)

    Baumgartner, Manuel; Spichtinger, Peter

    2018-02-01

    The growth of small cloud droplets and ice crystals is dominated by the diffusion of water vapor. Usually, Maxwell's approach to growth for isolated particles is used in describing this process. However, recent investigations show that local interactions between particles can change diffusion properties of cloud particles. In this study we develop an approach for including these local interactions into a bulk model approach. For this purpose, a simplified framework of local interaction is proposed and governing equations are derived from this setup. The new model is tested against direct simulations and incorporated into a parcel model framework. Using the parcel model, possible implications of the new model approach for clouds are investigated. The results indicate that for specific scenarios the lifetime of cloud droplets in subsaturated air may be longer (e.g., for an initially water supersaturated air parcel within a downdraft). These effects might have an impact on mixed-phase clouds, for example in terms of riming efficiencies.
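
    To make the Maxwell-type baseline concrete, here is a minimal, hedged sketch of diffusional droplet growth in which r dr/dt is proportional to the supersaturation; the growth coefficient G lumps together the thermodynamic terms and its value here is purely illustrative, and the local-interaction correction that is the point of the paper is not included.

      import math

      def grow_droplet(r0, supersat, G=1.0e-10, dt=0.01, t_end=60.0):
          """Integrate r dr/dt = G * s (classic Maxwell-type diffusional growth).
          r0 in metres, supersaturation s dimensionless, G in m^2/s (illustrative)."""
          r, t = r0, 0.0
          while t < t_end:
              # r dr/dt = G*s  =>  d(r^2)/dt = 2*G*s, integrated exactly over dt
              r = math.sqrt(max(r * r + 2.0 * G * supersat * dt, 0.0))
              t += dt
          return r

      # A 1 micrometre droplet growing at 0.5 % supersaturation for one minute
      # (illustrative numbers only).
      print(grow_droplet(r0=1.0e-6, supersat=0.005))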

  15. Applying super-droplets as a compact representation of warm-rain microphysics for aerosol-cloud-aerosol interactions

    NASA Astrophysics Data System (ADS)

    Arabas, S.; Jaruga, A.; Pawlowska, H.; Grabowski, W. W.

    2012-12-01

    Clouds may influence the aerosol characteristics of their environment. The relevant processes include wet deposition (rainout or washout) and cloud condensation nuclei (CCN) recycling through evaporation of cloud droplets and drizzle drops. Recycled CCN physicochemical properties may be altered if the evaporated droplets go through collisional growth or irreversible chemical reactions (e.g. SO2 oxidation). The key challenge of representing these processes in a numerical cloud model stems from the need to track properties of activated CCN throughout the cloud lifecycle. Lack of such "memory" characterises the so-called bulk, multi-moment as well as bin representations of cloud microphysics. In this study we apply the particle-based scheme of Shima et al. 2009. Each modelled particle (aka super-droplet) is a numerical proxy for a multiplicity of real-world CCN, cloud, drizzle or rain particles of the same size, nucleus type, and position. Tracking cloud nucleus properties is an inherent feature of particle-based frameworks, making them suitable for studying aerosol-cloud-aerosol interactions. The super-droplet scheme is furthermore characterized by linear scalability in the number of computational particles, and no numerical diffusion in the condensational and in the Monte-Carlo type collisional growth schemes. The presentation will focus on processing of aerosol by a drizzling stratocumulus deck. The simulations are carried out using a 2D kinematic framework and a VOCALS experiment inspired set-up (see http://www.rap.ucar.edu/~gthompsn/workshop2012/case1/).

  16. Phenotype Instance Verification and Evaluation Tool (PIVET): A Scaled Phenotype Evidence Generation Framework Using Web-Based Medical Literature.

    PubMed

    Henderson, Jette; Ke, Junyuan; Ho, Joyce C; Ghosh, Joydeep; Wallace, Byron C

    2018-05-04

    Researchers are developing methods to automatically extract clinically relevant and useful patient characteristics from raw healthcare datasets. These characteristics, often capturing essential properties of patients with common medical conditions, are called computational phenotypes. Being generated by automated or semiautomated, data-driven methods, such potential phenotypes need to be validated as clinically meaningful (or not) before they are acceptable for use in decision making. The objective of this study was to present Phenotype Instance Verification and Evaluation Tool (PIVET), a framework that uses co-occurrence analysis on an online corpus of publicly available medical journal articles to build clinical relevance evidence sets for user-supplied phenotypes. PIVET adopts a conceptual framework similar to the pioneering prototype tool PheKnow-Cloud that was developed for the phenotype validation task. PIVET completely refactors each part of the PheKnow-Cloud pipeline to deliver vast improvements in speed without sacrificing the quality of the insights PheKnow-Cloud achieved. PIVET leverages indexing in NoSQL databases to efficiently generate evidence sets. Specifically, PIVET uses a succinct representation of the phenotypes that corresponds to the index on the corpus database and an optimized co-occurrence algorithm inspired by the Aho-Corasick algorithm. We compare PIVET's phenotype representation with PheKnow-Cloud's by using PheKnow-Cloud's experimental setup. In PIVET's framework, we also introduce a statistical model trained on domain expert-verified phenotypes to automatically classify phenotypes as clinically relevant or not. Additionally, we show how the classification model can be used to examine user-supplied phenotypes in an online, rather than batch, manner. PIVET maintains the discriminative power of PheKnow-Cloud in terms of identifying clinically relevant phenotypes for the same corpus with which PheKnow-Cloud was originally developed, but PIVET's analysis is an order of magnitude faster than that of PheKnow-Cloud. Not only is PIVET much faster, it can be scaled to a larger corpus and still retain speed. We evaluated multiple classification models on top of the PIVET framework and found ridge regression to perform best, realizing an average F1 score of 0.91 when predicting clinically relevant phenotypes. Our study shows that PIVET improves on the most notable existing computational tool for phenotype validation in terms of speed and automation and is comparable in terms of accuracy.
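
    A toy sketch of the kind of phenotype/article co-occurrence counting such a framework relies on, using plain Python sets rather than PIVET's actual NoSQL index or Aho-Corasick-style matcher; the corpus, phenotype terms, and scoring choice are invented for illustration.

      from collections import Counter
      from itertools import combinations

      # Invented corpus and phenotype terms, for illustration only.
      articles = [
          "heart failure patients often receive diuretics and ace inhibitors",
          "diuretics are prescribed for hypertension and heart failure",
          "metformin is first line therapy for type 2 diabetes",
      ]
      phenotype_terms = ["heart failure", "diuretics", "metformin"]

      # Count in how many articles each term, and each pair of terms, occurs.
      term_counts, pair_counts = Counter(), Counter()
      for text in articles:
          present = {t for t in phenotype_terms if t in text}
          term_counts.update(present)
          pair_counts.update(frozenset(p) for p in combinations(sorted(present), 2))

      # Simple lift-like evidence score: pair frequency relative to term frequencies.
      n = len(articles)
      for pair, c in pair_counts.items():
          a, b = tuple(pair)
          score = (c / n) / ((term_counts[a] / n) * (term_counts[b] / n))
          print(sorted(pair), round(score, 2))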

  17. Cloud Privacy Audit Framework: A Value-Based Design

    ERIC Educational Resources Information Center

    Coss, David Lewis

    2013-01-01

    The rapid expansion of cloud technology provides enormous capacity, which allows for the collection, dissemination and re-identification of personal information. It is the cloud's resource capabilities such as these that fuel the concern for privacy. The impetus of these concerns is not too far removed from those expressed by Mason in 1986…

  18. Long Read Alignment with Parallel MapReduce Cloud Platform

    PubMed Central

    Al-Absi, Ahmed Abdulhakim; Kang, Dae-Ki

    2015-01-01

    Genomic sequence alignment is an important technique to decode genome sequences in bioinformatics. Next-Generation Sequencing technologies produce genomic data of longer reads. Cloud platforms are adopted to address the problems arising from storage and analysis of large genomic data. Existing gene sequencing tools for cloud platforms predominantly consider short read gene sequences and adopt the Hadoop MapReduce framework for computation. However, serial execution of the map and reduce phases is a problem in such systems. Therefore, in this paper, we introduce the Burrows-Wheeler Aligner's Smith-Waterman Alignment on Parallel MapReduce (BWASW-PMR) cloud platform for long sequence alignment. The proposed cloud platform adopts the widely accepted and accurate BWA-SW algorithm for long sequence alignment. A custom MapReduce platform is developed to overcome the drawbacks of the Hadoop framework. A parallel execution strategy of the MapReduce phases and optimization of the Smith-Waterman algorithm are considered. Performance evaluation results exhibit an average speed-up of 6.7 for BWASW-PMR compared with the state-of-the-art Bwasw-Cloud. An average reduction of 30% in the map phase makespan is reported across all experiments comparing BWASW-PMR with Bwasw-Cloud. Optimization of Smith-Waterman reduces the execution time by 91.8%. The experimental study proves the efficiency of BWASW-PMR for aligning long genomic sequences on cloud platforms. PMID:26839887
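
    For readers unfamiliar with the core kernel being parallelized, below is a compact, single-threaded Smith-Waterman local-alignment score in plain Python; it illustrates the dynamic programming recurrence only and is not the optimized or MapReduce-parallel version described in the paper. Scoring parameters are illustrative.

      def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-1):
          """Return the best local-alignment score between sequences a and b
          using the Smith-Waterman dynamic programming recurrence."""
          cols = len(b) + 1
          prev = [0] * cols
          best = 0
          for i in range(1, len(a) + 1):
              curr = [0] * cols
              for j in range(1, cols):
                  diag = prev[j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                  curr[j] = max(0, diag, prev[j] + gap, curr[j - 1] + gap)
                  best = max(best, curr[j])
              prev = curr
          return best

      print(smith_waterman_score("ACACACTA", "AGCACACA"))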

  19. Long Read Alignment with Parallel MapReduce Cloud Platform.

    PubMed

    Al-Absi, Ahmed Abdulhakim; Kang, Dae-Ki

    2015-01-01

    Genomic sequence alignment is an important technique to decode genome sequences in bioinformatics. Next-Generation Sequencing technologies produce genomic data of longer reads. Cloud platforms are adopted to address the problems arising from storage and analysis of large genomic data. Existing gene sequencing tools for cloud platforms predominantly consider short read gene sequences and adopt the Hadoop MapReduce framework for computation. However, serial execution of the map and reduce phases is a problem in such systems. Therefore, in this paper, we introduce the Burrows-Wheeler Aligner's Smith-Waterman Alignment on Parallel MapReduce (BWASW-PMR) cloud platform for long sequence alignment. The proposed cloud platform adopts the widely accepted and accurate BWA-SW algorithm for long sequence alignment. A custom MapReduce platform is developed to overcome the drawbacks of the Hadoop framework. A parallel execution strategy of the MapReduce phases and optimization of the Smith-Waterman algorithm are considered. Performance evaluation results exhibit an average speed-up of 6.7 for BWASW-PMR compared with the state-of-the-art Bwasw-Cloud. An average reduction of 30% in the map phase makespan is reported across all experiments comparing BWASW-PMR with Bwasw-Cloud. Optimization of Smith-Waterman reduces the execution time by 91.8%. The experimental study proves the efficiency of BWASW-PMR for aligning long genomic sequences on cloud platforms.

  20. Simultaneous and synergistic profiling of cloud and drizzle properties using ground-based observations

    NASA Astrophysics Data System (ADS)

    Rusli, Stephanie P.; Donovan, David P.; Russchenberg, Herman W. J.

    2017-12-01

    Despite the importance of radar reflectivity (Z) measurements in the retrieval of liquid water cloud properties, it remains nontrivial to interpret Z due to the possible presence of drizzle droplets within the clouds. So far, there has been no published work that utilizes Z to identify the presence of drizzle above the cloud base in an optimized and physically consistent manner. In this work, we develop a retrieval technique that exploits the synergy of different remote sensing systems to carry out this task and to subsequently profile the microphysical properties of the cloud and drizzle in a unified framework. This is accomplished by using ground-based measurements of Z, lidar attenuated backscatter below as well as above the cloud base, and microwave brightness temperatures. Fast physical forward models coupled to cloud and drizzle structure parameterization are used in an optimal-estimation-type framework in order to retrieve the best estimate for the cloud and drizzle property profiles. The cloud retrieval is first evaluated using synthetic signals generated from large-eddy simulation (LES) output to verify the forward models used in the retrieval procedure and the vertical parameterization of the liquid water content (LWC). From this exercise it is found that, on average, the cloud properties can be retrieved within 5% of the mean truth. The full cloud-drizzle retrieval method is then applied to a selected ACCEPT (Analysis of the Composition of Clouds with Extended Polarization Techniques) campaign dataset collected in Cabauw, the Netherlands. An assessment of the retrieval products is performed using three independent methods from the literature; each was specifically developed to retrieve only the cloud properties, the drizzle properties below the cloud base, or the drizzle fraction within the cloud. One-to-one comparisons, taking into account the uncertainties or limitations of each retrieval, show that our results are consistent with what is derived using the three independent methods.
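
    A schematic of the optimal-estimation update that such a synergistic retrieval typically iterates (Gauss-Newton form with a prior), written with NumPy; the forward model, Jacobian, and covariances below are invented stand-ins, not the cloud/drizzle forward models used in the paper.

      import numpy as np

      def oe_step(x, xa, y, forward, jacobian, Sa_inv, Se_inv):
          """One Gauss-Newton optimal-estimation update:
          x_new = xa + (Sa^-1 + K^T Se^-1 K)^-1 K^T Se^-1 (y - F(x) + K (x - xa))."""
          K = jacobian(x)
          A = Sa_inv + K.T @ Se_inv @ K
          b = K.T @ Se_inv @ (y - forward(x) + K @ (x - xa))
          return xa + np.linalg.solve(A, b)

      # Toy linear "forward model" y = H x with invented numbers, for illustration.
      H = np.array([[1.0, 0.5], [0.2, 1.0], [0.3, 0.3]])
      forward = lambda x: H @ x
      jacobian = lambda x: H
      xa = np.zeros(2)                          # prior state
      Sa_inv = np.linalg.inv(np.eye(2) * 4.0)   # inverse prior covariance
      Se_inv = np.linalg.inv(np.eye(3) * 0.1)   # inverse measurement covariance
      y = forward(np.array([1.0, 2.0])) + 0.05  # synthetic measurement
      x = xa.copy()
      for _ in range(5):
          x = oe_step(x, xa, y, forward, jacobian, Sa_inv, Se_inv)
      print(x)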

  1. Narrowing the Gap in Quantification of Aerosol-Cloud Radiative Effects

    NASA Astrophysics Data System (ADS)

    Feingold, G.; McComiskey, A. C.; Yamaguchi, T.; Kazil, J.; Johnson, J. S.; Carslaw, K. S.

    2016-12-01

    Despite large advances in our understanding of aerosol and cloud processes over the past years, uncertainty in the aerosol-cloud radiative effect/forcing is still of major concern. In this talk we will advocate a methodology for quantifying the aerosol-cloud radiative effect that considers the primacy of fundamental cloud properties such as cloud amount and albedo alongside the need for process level understanding of aerosol-cloud interactions. We will present a framework for quantifying the aerosol-cloud radiative effect, regime-by-regime, through process-based modelling and observations at the large eddy scale. We will argue that understanding the co-variability between meteorological and aerosol drivers of the radiative properties of the cloud system may be as important an endeavour as attempting to untangle these drivers.

  2. The diurnal cycle of clouds and precipitation at the ARM SGP site: Cloud radar observations and simulations from the multiscale modeling framework

    DOE PAGES

    Zhao, Wei; Marchand, Roger; Fu, Qiang

    2017-07-08

    Millimeter Wavelength Cloud Radar (MMCR) data from December 1996 to December 2010, collected at the U.S. Department of Energy Atmospheric Radiation Measurement (ARM) program Southern Great Plains (SGP) site, are used to examine the diurnal cycle of hydrometeor occurrence. These data are categorized into clouds (-40 dBZe ≤ reflectivity < -10 dBZe), drizzle and light precipitation (-10 dBZe ≤ reflectivity < 10 dBZe), and heavy precipitation (reflectivity ≥ 10 dBZe). The same criteria are implemented for the observation-equivalent reflectivity calculated by feeding outputs from a Multiscale Modeling Framework (MMF) climate model into a radar simulator. The MMF model consists of the National Center for Atmospheric Research Community Atmosphere Model with conventional cloud parameterizations replaced by a cloud-resolving model. We find that a radar simulator combined with the simple reflectivity categories can be an effective approach for evaluating diurnal variations in model hydrometeor occurrence. It is shown that the MMF only marginally captures observed increases in the occurrence of boundary layer clouds after sunrise in spring and autumn and does not capture diurnal changes in boundary layer clouds during the summer. Above the boundary layer, the MMF captures reasonably well diurnal variations in the vertical structure of clouds and light and heavy precipitation in the summer but not in the spring.
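
    The three reflectivity categories above map directly onto a simple thresholding function; a minimal sketch (the dBZe thresholds are exactly as quoted, everything else is illustrative):

      def classify_reflectivity(dbze):
          """Classify a radar reflectivity value (dBZe) into the hydrometeor
          categories used above; values below -40 dBZe are treated as no-echo."""
          if dbze < -40:
              return "clear"
          if dbze < -10:
              return "cloud"
          if dbze < 10:
              return "drizzle/light precipitation"
          return "heavy precipitation"

      # Works the same whether dbze comes from MMCR observations or from a
      # radar simulator applied to model output.
      for value in (-45, -25, 0, 20):
          print(value, classify_reflectivity(value))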

  3. Numerical Coupling and Simulation of Point-Mass System with the Turbulent Fluid Flow

    NASA Astrophysics Data System (ADS)

    Gao, Zheng

    A computational framework that combines the Eulerian description of the turbulence field with a Lagrangian point-mass ensemble is proposed in this dissertation. Depending on the Reynolds number, the turbulence field is simulated using Direct Numerical Simulation (DNS) or an eddy viscosity model. Meanwhile, the particle systems, such as spring-mass systems and cloud droplets, are modeled using ordinary differential equations, which are stiff and hence pose a challenge to the stability of the entire system. This computational framework is applied to the numerical study of parachute deceleration and cloud microphysics. These two distinct problems can be uniformly modeled with Partial Differential Equations (PDEs) and Ordinary Differential Equations (ODEs), and numerically solved in the same framework. For the parachute simulation, a novel porosity model is proposed to simulate the porous effects of the parachute canopy. This model is easy to implement with the projection method and is able to reproduce Darcy's law observed in the experiment. Moreover, the impacts of using different versions of the k-epsilon turbulence model in the parachute simulation have been investigated; the standard and Re-Normalisation Group (RNG) models may overestimate the turbulence effects when the Reynolds number is small, while the Realizable model performs consistently at both large and small Reynolds numbers. For the other application, cloud microphysics, the cloud entrainment-mixing problem is studied in the same numerical framework. Three sets of DNS are carried out with both decaying and forced turbulence. The numerical results suggest a new way to parameterize the degree of cloud mixing using dynamical measures. The numerical experiments also verify the negative relationship between the droplet number concentration and the vorticity field. The results imply that gravity has less impact on forced turbulence than on decaying turbulence. In summary, the proposed framework can be used to solve physics problems that involve a turbulence field and a point-mass system, and therefore has broad applications.

  4. Privacy and Data Security under Cloud Computing Arrangements: The Legal Framework and Practical Do's and Don'ts

    ERIC Educational Resources Information Center

    Buckman, Joel; Gold, Stephanie

    2012-01-01

    This article outlines privacy and data security compliance issues facing postsecondary education institutions when they utilize cloud computing and concludes with a practical list of do's and don'ts. Cloud computing does not change an institution's privacy and data security obligations. It does involve reliance on a third party, which requires an…

  5. Rand Arroyo Center 2014

    DTIC Science & Technology

    2015-01-01

    field effective command and control systems within the framework of current policies and processes. Cost Considerations in Cloud Computing ... www.rand.org/t/PE113 finds that cloud provider costs can vary compared with traditional information system alternatives because of different cost structures ... for analysts evaluating new cloud investments.

  6. The impact of cloud vertical profile on liquid water path retrieval based on the bispectral method: A theoretical study based on large-eddy simulations of shallow marine boundary layer clouds.

    PubMed

    Miller, Daniel J; Zhang, Zhibo; Ackerman, Andrew S; Platnick, Steven; Baum, Bryan A

    2016-04-27

    Passive optical retrievals of cloud liquid water path (LWP), like those implemented for Moderate Resolution Imaging Spectroradiometer (MODIS), rely on cloud vertical profile assumptions to relate optical thickness (τ) and effective radius (re) retrievals to LWP. These techniques typically assume that shallow clouds are vertically homogeneous; however, an adiabatic cloud model is plausibly more realistic for shallow marine boundary layer cloud regimes. In this study a satellite retrieval simulator is used to perform MODIS-like satellite retrievals, which in turn are compared directly to the large-eddy simulation (LES) output. This satellite simulator creates a framework for rigorous quantification of the impact that vertical profile features have on LWP retrievals, and it accomplishes this while also avoiding sources of bias present in previous observational studies. The cloud vertical profiles from the LES are often more complex than either of the two standard assumptions, and the favored assumption was found to be sensitive to cloud regime (cumuliform/stratiform). Confirming previous studies, drizzle and cloud top entrainment of dry air are identified as physical features that bias LWP retrievals away from adiabatic and toward homogeneous assumptions. The mean bias induced by drizzle-influenced profiles was shown to be on the order of 5-10 g/m². In contrast, the influence of cloud top entrainment was found to be smaller by about a factor of 2. A theoretical framework is developed to explain variability in LWP retrievals by introducing modifications to the adiabatic re profile. In addition to analyzing bispectral retrievals, we also compare results with the vertical profile sensitivity of passive polarimetric retrieval techniques.
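
    To make the two vertical-profile assumptions concrete: the commonly used bispectral relations are LWP ≈ (2/3) ρ_w τ r_e for a vertically homogeneous cloud and LWP ≈ (5/9) ρ_w τ r_e for an adiabatic profile. The sketch below uses these standard textbook prefactors, which are not quoted from this paper; the example numbers are illustrative.

      RHO_W = 1000.0  # density of liquid water, kg/m^3

      def lwp_homogeneous(tau, re_m):
          """LWP (kg/m^2) for a vertically homogeneous cloud: (2/3) * rho_w * tau * re."""
          return (2.0 / 3.0) * RHO_W * tau * re_m

      def lwp_adiabatic(tau, re_m):
          """LWP (kg/m^2) for an adiabatic profile: (5/9) * rho_w * tau * re."""
          return (5.0 / 9.0) * RHO_W * tau * re_m

      # Example: tau = 10, re = 12 micrometres; print both estimates in g/m^2.
      tau, re_m = 10.0, 12.0e-6
      print(1000.0 * lwp_homogeneous(tau, re_m), 1000.0 * lwp_adiabatic(tau, re_m))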

  7. The impact of cloud vertical profile on liquid water path retrieval based on the bispectral method: A theoretical study based on large-eddy simulations of shallow marine boundary layer clouds

    PubMed Central

    Miller, Daniel J.; Zhang, Zhibo; Ackerman, Andrew S.; Platnick, Steven; Baum, Bryan A.

    2018-01-01

    Passive optical retrievals of cloud liquid water path (LWP), like those implemented for Moderate Resolution Imaging Spectroradiometer (MODIS), rely on cloud vertical profile assumptions to relate optical thickness (τ) and effective radius (re) retrievals to LWP. These techniques typically assume that shallow clouds are vertically homogeneous; however, an adiabatic cloud model is plausibly more realistic for shallow marine boundary layer cloud regimes. In this study a satellite retrieval simulator is used to perform MODIS-like satellite retrievals, which in turn are compared directly to the large-eddy simulation (LES) output. This satellite simulator creates a framework for rigorous quantification of the impact that vertical profile features have on LWP retrievals, and it accomplishes this while also avoiding sources of bias present in previous observational studies. The cloud vertical profiles from the LES are often more complex than either of the two standard assumptions, and the favored assumption was found to be sensitive to cloud regime (cumuliform/stratiform). Confirming previous studies, drizzle and cloud top entrainment of dry air are identified as physical features that bias LWP retrievals away from adiabatic and toward homogeneous assumptions. The mean bias induced by drizzle-influenced profiles was shown to be on the order of 5–10 g/m2. In contrast, the influence of cloud top entrainment was found to be smaller by about a factor of 2. A theoretical framework is developed to explain variability in LWP retrievals by introducing modifications to the adiabatic re profile. In addition to analyzing bispectral retrievals, we also compare results with the vertical profile sensitivity of passive polarimetric retrieval techniques. PMID:29637042

  8. Evaluating the Performance of the Goddard Multi-Scale Modeling Framework against GPM, TRMM and CloudSat/CALIPSO Products

    NASA Astrophysics Data System (ADS)

    Chern, J. D.; Tao, W. K.; Lang, S. E.; Matsui, T.; Mohr, K. I.

    2014-12-01

    Four six-month (March-August 2014) experiments with the Goddard Multi-scale Modeling Framework (MMF) were performed to study the impacts of different Goddard one-moment bulk microphysical schemes and large-scale forcings on the performance of the MMF. Recently a new Goddard one-moment bulk microphysics scheme with four ice classes (cloud ice, snow, graupel, and frozen drops/hail) has been developed based on cloud-resolving model simulations with large-scale forcings from field campaign observations. The new scheme has been successfully implemented into the MMF, and two MMF experiments were carried out with this new scheme and the old three-ice-class (cloud ice, snow, graupel) scheme. The MMF has global coverage and can rigorously evaluate microphysics performance for different cloud regimes. The results show that the MMF with the new scheme outperformed the old one. The MMF simulations are also strongly affected by the interaction between large-scale and cloud-scale processes. Two MMF sensitivity experiments with and without nudging large-scale forcings to those of the ERA-Interim reanalysis were carried out to study the impacts of large-scale forcings. The model-simulated mean and variability of surface precipitation, cloud types, and cloud properties (such as cloud amount, hydrometeor vertical profiles, and cloud water content) in different geographic locations and climate regimes are evaluated against GPM, TRMM, and CloudSat/CALIPSO satellite observations. The Goddard MMF has also been coupled with the Goddard Satellite Data Simulation Unit (G-SDSU), a system with multi-satellite, multi-sensor, and multi-spectrum satellite simulators. The statistics of MMF-simulated radiances and backscattering can be directly compared with satellite observations to assess the strengths and/or deficiencies of the MMF simulations and provide guidance on how to improve the MMF and its microphysics.

  9. ooi: OpenStack OCCI interface

    NASA Astrophysics Data System (ADS)

    López García, Álvaro; Fernández del Castillo, Enol; Orviz Fernández, Pablo

    In this document we present an implementation of the Open Grid Forum's Open Cloud Computing Interface (OCCI) for OpenStack, namely ooi (Openstack occi interface, 2015) [1]. OCCI is an open standard for management tasks over cloud resources, focused on interoperability, portability and integration. ooi aims to implement this open interface for the OpenStack cloud middleware, promoting interoperability with other OCCI-enabled cloud management frameworks and infrastructures. ooi focuses on being non-invasive with a vanilla OpenStack installation, not tied to a particular OpenStack release version.

  10. Radiative-convective equilibrium model intercomparison project

    NASA Astrophysics Data System (ADS)

    Wing, Allison A.; Reed, Kevin A.; Satoh, Masaki; Stevens, Bjorn; Bony, Sandrine; Ohno, Tomoki

    2018-03-01

    RCEMIP, an intercomparison of multiple types of models configured in radiative-convective equilibrium (RCE), is proposed. RCE is an idealization of the climate system in which there is a balance between radiative cooling of the atmosphere and heating by convection. The scientific objectives of RCEMIP are three-fold. First, clouds and climate sensitivity will be investigated in the RCE setting. This includes determining how cloud fraction changes with warming and the role of self-aggregation of convection in climate sensitivity. Second, RCEMIP will quantify the dependence of the degree of convective aggregation and tropical circulation regimes on temperature. Finally, by providing a common baseline, RCEMIP will allow the robustness of the RCE state across the spectrum of models to be assessed, which is essential for interpreting the results found regarding clouds, climate sensitivity, and aggregation, and more generally, determining which features of tropical climate a RCE framework is useful for. A novel aspect and major advantage of RCEMIP is the accessibility of the RCE framework to a variety of models, including cloud-resolving models, general circulation models, global cloud-resolving models, single-column models, and large-eddy simulation models.

  11. Enterprise application architecture development based on DoDAF and TOGAF

    NASA Astrophysics Data System (ADS)

    Tao, Zhi-Gang; Luo, Yun-Feng; Chen, Chang-Xin; Wang, Ming-Zhe; Ni, Feng

    2017-05-01

    For the purpose of supporting the design and analysis of enterprise application architecture, here, we report a tailored enterprise application architecture description framework and its corresponding design method. The presented framework can effectively support service-oriented architecting and cloud computing by creating the metadata model based on the architecture content framework (ACF), the DoDAF metamodel (DM2) and the Cloud Computing Modelling Notation (CCMN). The framework also makes an effort to extend and improve the mapping between The Open Group Architecture Framework (TOGAF) application architectural inputs/outputs, deliverables and Department of Defense Architecture Framework (DoDAF)-described models. The roadmap of 52 DoDAF-described models is constructed by creating the metamodels of these described models and analysing the constraint relationships among metamodels. By combining the tailored framework and the roadmap, this article proposes a service-oriented enterprise application architecture development process. Finally, a case study is presented to illustrate the results of implementing the tailored framework in the Southern Base Management Support and Information Platform construction project using the development process proposed by the paper.

  12. Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework

    NASA Astrophysics Data System (ADS)

    Gannon, C.

    2017-12-01

    As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
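
    A minimal sketch of the kind of distributed netCDF analysis described, assuming xarray and Dask are installed and that local tasmax_*.nc files with a daily maximum temperature variable named tasmax (in Kelvin) are available; the file names and the chosen indicator are illustrative, not the consultancy's actual metrics.

      import xarray as xr
      from dask.distributed import Client

      # Connect to a Dask cluster (local here; in a cloud deployment the address
      # of a remote scheduler would be passed instead).
      client = Client()

      # Lazily open many netCDF files as one chunked dataset.
      ds = xr.open_mfdataset("tasmax_*.nc", combine="by_coords",
                             chunks={"time": 365})

      # Example indicator: annual count of days above 35 degC at every grid cell.
      hot_days = (ds["tasmax"] > 35.0 + 273.15).groupby("time.year").sum("time")

      # Trigger the distributed computation and save the result.
      hot_days.compute().to_netcdf("hot_days_per_year.nc")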

  13. High-Resolution Global Modeling of the Effects of Subgrid-Scale Clouds and Turbulence on Precipitating Cloud Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bogenschutz, Peter; Moeng, Chin-Hoh

    2015-10-13

    The PIs at the National Center for Atmospheric Research (NCAR), Chin-Hoh Moeng and Peter Bogenschutz, have primarily focused their time on the implementation of the Simplified-Higher Order Turbulence Closure (SHOC; Bogenschutz and Krueger 2013) into the Multi-scale Modeling Framework (MMF) global model and on testing SHOC on deep convective cloud regimes.

  14. Mixed phase clouds: observations and theoretical advances (overview)

    NASA Astrophysics Data System (ADS)

    Korolev, Alexei

    2013-04-01

    Mixed phase clouds play an important role in precipitation formation and the radiation budget of the Earth. Microphysical measurements in mixed phase clouds are notoriously difficult due to many technical challenges. The airborne instrumentation for characterization of the microstructure of mixed phase clouds is discussed. The results of multiyear airborne observations, including measurements of the frequency of occurrence of mixed phase, characteristic spatial scales, and humidity in mixed phase and ice clouds, are presented. A theoretical framework describing the thermodynamics and phase transformation of a three-phase system consisting of ice particles, liquid droplets and water vapor is discussed. It is shown that the Wegener-Bergeron-Findeisen process plays different roles in clouds with different dynamics. The problem of maintenance and longevity of mixed phase clouds is discussed.

  15. Integrating Measurement Based New Knowledge on Wildland Fire Emissions and Chemistry into the AIRPACT Air Quality Forecasting for the Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Nergui, T.; Lee, Y.; Chung, S. H.; Lamb, B. K.; Yokelson, R. J.; Barsanti, K.

    2017-12-01

    A number of chamber and field measurements have shown that atmospheric organic aerosols and their precursors produced from wildfires are significantly underestimated in the emission inventories used for air quality models for various applications such as regulatory strategy development, impact assessments of air pollutants, and air quality forecasting for public health. The AIRPACT real-time air quality forecasting system consistently underestimates surface-level fine particulate matter (PM2.5) concentrations in the summer at both urban and rural locations in the Pacific Northwest, primarily as a result of errors in organic particulate matter. In this work, we implement updated chemical speciation and emission factors based on FLAME-IV (Fourth Fire Lab at Missoula Experiment) and other measurements in the BlueSky fire emission model and the SMOKE emission preprocessor, and modified parameters for the secondary organic aerosol (SOA) module in the CMAQ chemical transport model of the AIRPACT modeling system. Simulation results from CMAQ version 5.2, which has a better treatment for anthropogenic SOA formation (as a base case), and from the modified parameterization used for fire emissions and chemistry in the model (fire-SOA case) are evaluated against airborne measurements downwind of the Big Windy Complex Fire and the Colockum Tarps Fire, both of which occurred in the Pacific Northwest in summer 2013. Using the observed aerosol chemical composition and mass loadings for organics, nitrate, sulfate, ammonium, and chloride from aircraft measurements during the Studies of Emissions and Atmospheric Composition, Clouds, and Climate Coupling by Regional Surveys (SEAC4RS) and the Biomass Burning Observation Project (BBOP), we assess how new knowledge gained from wildfire measurements improves model predictions for SOA and its contribution to the total mass of PM2.5 concentrations.

  16. An efficient framework for modeling clouds from Landsat8 images

    NASA Astrophysics Data System (ADS)

    Yuan, Chunqiang; Guo, Jing

    2015-03-01

    Clouds play an important role in creating realistic outdoor scenes for video game and flight simulation applications. Classic methods have been proposed for cumulus cloud modeling. However, these methods are not flexible for modeling large cloud scenes with hundreds of clouds, in that the user must repeatedly model each cloud and adjust its various properties. This paper presents a meteorologically based method to reconstruct cumulus clouds from high resolution Landsat8 satellite images. From these input satellite images, the clouds are first segmented from the background. Then, the cloud top surface is estimated from the temperature of the infrared image. After that, under a mild assumption of a flat base for cumulus clouds, the base height of each cloud is computed by averaging the top height for pixels on the cloud edge. Then, the extinction is generated from the visible image. Finally, we enrich the initial shapes of the clouds using a fractal method and represent the recovered clouds as a particle system. The experimental results demonstrate that our method can yield realistic cloud scenes resembling those in the satellite images.
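
    A schematic of the geometric bookkeeping described above (top height from infrared brightness temperature via an assumed lapse rate, a flat base height from the mean top height over cloud-edge pixels), using NumPy and SciPy; the lapse rate, surface temperature, and edge-detection choice are illustrative assumptions, not the paper's exact procedure.

      import numpy as np
      from scipy.ndimage import binary_erosion

      def cloud_top_base(bt_kelvin, cloud_mask, t_surface=300.0, lapse_rate=6.5e-3):
          """Estimate per-pixel cloud top height (m) from brightness temperature and a
          single flat base height (m) from the mean top height on the cloud edge."""
          top = np.where(cloud_mask, (t_surface - bt_kelvin) / lapse_rate, np.nan)
          edge = cloud_mask & ~binary_erosion(cloud_mask)  # pixels on the cloud outline
          base = float(np.nanmean(top[edge]))
          return top, base

      # Tiny synthetic example: a 5x5 scene with one cold (i.e. elevated) cloud.
      bt = np.full((5, 5), 300.0)
      mask = np.zeros((5, 5), dtype=bool)
      mask[1:4, 1:4] = True
      bt[mask] = 290.0
      bt[2, 2] = 285.0          # colder core -> higher top
      top, base = cloud_top_base(bt, mask)
      print(base, np.nanmax(top))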

  17. An Adaptive Multilevel Security Framework for the Data Stored in Cloud Environment

    PubMed Central

    Dorairaj, Sudha Devi; Kaliannan, Thilagavathy

    2015-01-01

    Cloud computing is renowned for delivering information technology services based on the internet. Nowadays, organizations are interested in moving their massive data and computations into the cloud to reap its significant benefits of on demand service, resource pooling, and rapid elasticity, which help satisfy dynamically changing infrastructure demand without the burden of owning, managing, and maintaining it. Since the data needs to be secured throughout its life cycle, security of the data in the cloud is a major challenge to be concentrated on because the data resides on a third party's premises. Any uniform simple or high level security method for all the data either compromises the sensitive data or proves to be too costly with increased overhead. Any common multiple method for all data becomes vulnerable when the common security pattern is identified at the event of a successful attack on any information, and also encourages more attacks on all other data. This paper suggests an adaptive multilevel security framework based on cryptography techniques that provides adequate security for the classified data stored in the cloud. The proposed security system acclimates well to the cloud environment and is also customizable and more reliable in meeting the required level of security for data with different sensitivity that changes with business needs and commercial conditions. PMID:26258165
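
    To illustrate the "different protection for different sensitivity" idea in a concrete but much simplified way, the sketch below maps sensitivity levels to AES-GCM key sizes using the cryptography package; the level names, key lengths, and policy are assumptions for illustration only, not the scheme proposed in the paper.

      import os
      from cryptography.hazmat.primitives.ciphers.aead import AESGCM

      # Illustrative policy: stronger keys for more sensitive data (assumed levels).
      KEY_BITS = {"public": 128, "internal": 192, "confidential": 256}

      def encrypt_for_level(data: bytes, level: str):
          """Encrypt data with an AES-GCM key sized according to its sensitivity level."""
          key = AESGCM.generate_key(bit_length=KEY_BITS[level])
          nonce = os.urandom(12)  # 96-bit nonce, as AES-GCM expects
          ciphertext = AESGCM(key).encrypt(nonce, data, None)
          return key, nonce, ciphertext

      key, nonce, blob = encrypt_for_level(b"patient record 42", "confidential")
      print(len(key) * 8, "bit key,", len(blob), "byte ciphertext")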

  18. An Adaptive Multilevel Security Framework for the Data Stored in Cloud Environment.

    PubMed

    Dorairaj, Sudha Devi; Kaliannan, Thilagavathy

    2015-01-01

    Cloud computing is renowned for delivering information technology services based on the internet. Nowadays, organizations are interested in moving their massive data and computations into the cloud to reap its significant benefits of on demand service, resource pooling, and rapid elasticity, which help satisfy dynamically changing infrastructure demand without the burden of owning, managing, and maintaining it. Since the data needs to be secured throughout its life cycle, security of the data in the cloud is a major challenge to be concentrated on because the data resides on a third party's premises. Any uniform simple or high level security method for all the data either compromises the sensitive data or proves to be too costly with increased overhead. Any common multiple method for all data becomes vulnerable when the common security pattern is identified at the event of a successful attack on any information, and also encourages more attacks on all other data. This paper suggests an adaptive multilevel security framework based on cryptography techniques that provides adequate security for the classified data stored in the cloud. The proposed security system acclimates well to the cloud environment and is also customizable and more reliable in meeting the required level of security for data with different sensitivity that changes with business needs and commercial conditions.

  19. Final Technical Report for "High-resolution global modeling of the effects of subgrid-scale clouds and turbulence on precipitating cloud systems"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, Vincent

    2016-11-25

    The Multiscale Modeling Framework (MMF) embeds a cloud-resolving model in each grid column of a General Circulation Model (GCM). A MMF model does not need to use a deep convective parameterization, and thereby dispenses with the uncertainties in such parameterizations. However, MMF models grossly under-resolve shallow boundary-layer clouds, and hence those clouds may still benefit from parameterization. In this grant, we successfully created a climate model that embeds a cloud parameterization (“CLUBB”) within a MMF model. This involved interfacing CLUBB’s clouds with microphysics and reducing computational cost. We have evaluated the resulting simulated clouds and precipitation with satellite observations. The chief benefit of the project is to provide a MMF model that has an improved representation of clouds and that provides improved simulations of precipitation.

  20. Aerosol-cloud interactions in a multi-scale modeling framework

    NASA Astrophysics Data System (ADS)

    Lin, G.; Ghan, S. J.

    2017-12-01

    Atmospheric aerosols play an important role in changing the Earth's climate through scattering/absorbing solar and terrestrial radiation and interacting with clouds. However, quantification of the aerosol effects remains one of the most uncertain aspects of current and future climate projection. Much of the uncertainty results from the multi-scale nature of aerosol-cloud interactions, which is very challenging to represent in traditional global climate models (GCMs). In contrast, the multi-scale modeling framework (MMF) provides a viable solution, which explicitly resolves the cloud/precipitation in the cloud resolved model (CRM) embedded in the GCM grid column. In the MMF version of community atmospheric model version 5 (CAM5), aerosol processes are treated with a parameterization, called the Explicit Clouds Parameterized Pollutants (ECPP). It uses the cloud/precipitation statistics derived from the CRM to treat the cloud processing of aerosols on the GCM grid. However, this treatment treats clouds on the CRM grid but aerosols on the GCM grid, which is inconsistent with the reality that cloud-aerosol interactions occur on the cloud scale. To overcome the limitation, here, we propose a new aerosol treatment in the MMF: Explicit Clouds Explicit Aerosols (ECEP), in which we resolve both clouds and aerosols explicitly on the CRM grid. We first applied the MMF with ECPP to the Accelerated Climate Modeling for Energy (ACME) model to have an MMF version of ACME. Further, we also developed an alternative version of ACME-MMF with ECEP. Based on these two models, we have conducted two simulations: one with the ECPP and the other with ECEP. Preliminary results showed that the ECEP simulations tend to predict higher aerosol concentrations than ECPP simulations, because of the more efficient vertical transport from the surface to the higher atmosphere but the less efficient wet removal. We also found that the cloud droplet number concentrations are also different between the two simulations due to the difference in the cloud droplet lifetime. Next, we will explore how the ECEP treatment affects the anthropogenic aerosol forcing, particularly the aerosol indirect forcing, by comparing present-day and pre-industrial simulations.

  1. Earthdata Cloud Analytics Project

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Lynnes, Chris

    2018-01-01

    This presentation describes a nascent project in NASA to develop a framework to support end-user analytics of NASA's Earth science data in the cloud. The chief benefit of migrating EOSDIS (Earth Observation System Data and Information Systems) data to the cloud is to position the data next to enormous computing capacity to allow end users to process data at scale. The Earthdata Cloud Analytics project will use a service-based approach to facilitate the infusion of evolving analytics technology and the integration with non-NASA analytics or other complementary functionality at other agencies and in other nations.

  2. The design of an m-Health monitoring system based on a cloud computing platform

    NASA Astrophysics Data System (ADS)

    Xu, Boyi; Xu, Lida; Cai, Hongming; Jiang, Lihong; Luo, Yang; Gu, Yizhi

    2017-01-01

    Compared to traditional medical services provided within hospitals, m-Health monitoring systems (MHMSs) face more challenges in personalised health data processing. To achieve personalised and high-quality health monitoring by means of new technologies, such as mobile network and cloud computing, in this paper, a framework of an m-Health monitoring system based on a cloud computing platform (Cloud-MHMS) is designed to implement pervasive health monitoring. Furthermore, the modules of the framework, which are Cloud Storage and Multiple Tenants Access Control Layer, Healthcare Data Annotation Layer, and Healthcare Data Analysis Layer, are discussed. In the data storage layer, a multiple tenant access method is designed to protect patient privacy. In the data annotation layer, linked open data are adopted to augment health data interoperability semantically. In the data analysis layer, the process mining algorithm and similarity calculating method are implemented to support personalised treatment plan selection. These three modules cooperate to implement the core functions in the process of health monitoring, which are data storage, data processing, and data analysis. Finally, we study the application of our architecture in the monitoring of antimicrobial drug usage to demonstrate the usability of our method in personal healthcare analysis.

  3. Abstracting application deployment on Cloud infrastructures

    NASA Astrophysics Data System (ADS)

    Aiftimiei, D. C.; Fattibene, E.; Gargana, R.; Panella, M.; Salomoni, D.

    2017-10-01

    Deploying a complex application on a Cloud-based infrastructure can be a challenging task. In this contribution we present an approach for Cloud-based deployment of applications and its present or future implementation in the framework of several projects, such as “!CHAOS: a cloud of controls” [1], a project funded by MIUR (Italian Ministry of Research and Education) to create a Cloud-based deployment of a control system and data acquisition framework, “INDIGO-DataCloud” [2], an EC H2020 project targeting among other things high-level deployment of applications on hybrid Clouds, and “Open City Platform” [3], an Italian project aiming to provide open Cloud solutions for Italian Public Administrations. We chose to use an orchestration service to hide the complexity of deploying the application components, and to build an abstraction layer on top of the orchestration layer. Using the Heat [4] orchestration service, we prototyped a dynamic, on-demand, scalable platform of software components, based on OpenStack infrastructures. On top of the orchestration service we developed a prototype of a web interface exploiting the Heat APIs. The user can start an instance of the application without having knowledge of the underlying Cloud infrastructure and services. Moreover, the platform instance can be customized by choosing parameters related to the application, such as the size of a File System or the number of instances of a NoSQL DB cluster. As soon as the desired platform is running, the web interface offers the possibility to scale some infrastructure components. In this contribution we describe the solution design and implementation, based on the application requirements, the details of the development of both the Heat templates and the web interface, together with possible exploitation strategies of this work in Cloud data centers.

  4. Retrieving and Indexing Spatial Data in the Cloud Computing Environment

    NASA Astrophysics Data System (ADS)

    Wang, Yonggang; Wang, Sheng; Zhou, Daliang

    In order to address the drawbacks of spatial data storage on common Cloud Computing platforms, we design and present a framework for retrieving, indexing, accessing and managing spatial data in the Cloud environment. An interoperable spatial data object model is provided based on the Simple Feature Coding Rules from the OGC, such as Well Known Binary (WKB) and Well Known Text (WKT). The classic spatial indexing algorithms, such as Quad-Tree and R-Tree, are re-designed for the Cloud Computing environment. Finally, we develop prototype software based on Google App Engine to implement the proposed model.
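
    A minimal sketch of the kind of spatial encoding the abstract alludes to: mapping a WKT point to a quadtree cell key that could serve as a row key in a cloud datastore. The parsing helper, depth, and bounds are illustrative assumptions, not the authors' implementation.

    ```python
    # Hedged sketch: encode a WKT point into a quadtree cell key (one quadrant
    # digit per level), usable as a sortable row key in a key-value datastore.
    def wkt_point(wkt: str) -> tuple[float, float]:
        """Parse a simple 'POINT (x y)' WKT string; assumes well-formed input."""
        body = wkt.strip().removeprefix("POINT").strip(" ()")
        x, y = (float(v) for v in body.split())
        return x, y

    def quadtree_key(x: float, y: float, depth: int = 16,
                     bounds=(-180.0, -90.0, 180.0, 90.0)) -> str:
        """Return the quadrant string identifying the cell containing (x, y)."""
        minx, miny, maxx, maxy = bounds
        key = []
        for _ in range(depth):
            midx, midy = (minx + maxx) / 2.0, (miny + maxy) / 2.0
            quad = (2 if y >= midy else 0) + (1 if x >= midx else 0)
            key.append(str(quad))
            minx, maxx = (midx, maxx) if x >= midx else (minx, midx)
            miny, maxy = (midy, maxy) if y >= midy else (miny, midy)
        return "".join(key)

    if __name__ == "__main__":
        x, y = wkt_point("POINT (116.39 39.91)")
        print(quadtree_key(x, y, depth=8))  # 8-character cell key
    ```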

  5. Towards a Cloud Based Smart Traffic Management Framework

    NASA Astrophysics Data System (ADS)

    Rahimi, M. M.; Hakimpour, F.

    2017-09-01

    Traffic big data has brought many opportunities for traffic management applications. However, several challenges, such as the heterogeneity, storage, management, processing, and analysis of traffic big data, may hinder its efficient and real-time application. All these challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining, and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud-computing-based framework addresses the technical challenges of efficient, real-time storage, management, processing, and analysis of traffic big data. For evaluation of the framework, we have used OpenStreetMap (OSM) real trajectories and road network on a distributed environment. Our evaluation results indicate that the data import speed exceeds 8,000 records per second when the dataset size approaches 5 million records. We also evaluate the performance of data retrieval in our proposed framework; the retrieval speed exceeds 15,000 records per second at the same dataset size. We have also evaluated the scalability and performance of our proposed framework using parallelisation of a critical pre-analysis in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.

  6. CLARUS as a Cloud Security Framework: e-Health Use Case.

    PubMed

    Vidal, David; Iriso, Santiago; Mulero, Rafael

    2017-01-01

    Maintaining Passive Medical Health Records (PMHR) is an increasing cost and resource-consumption problem. Moving to the cloud is the clearest way to solve the problem, as it offers a large amount of storage space and computational power. But the cloud is not safe enough when dealing with this kind of information, because it can be easily accessed by attackers. The European Commission-funded research project CLARUS contributes to protecting healthcare-sensitive information in a secure way.

  7. The application of data mining and cloud computing techniques in data-driven models for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Khazaeli, S.; Ravandi, A. G.; Banerji, S.; Bagchi, A.

    2016-04-01

    Recently, data-driven models for Structural Health Monitoring (SHM) have been of great interest among many researchers. In data-driven models, the sensed data are processed to determine the structural performance and evaluate the damage of an instrumented structure without necessitating mathematical modeling of the structure. A framework of data-driven models for online assessment of the condition of a structure has been developed here. The developed framework is intended for automated evaluation of the monitoring data and structural performance using Internet technologies and resources. The main challenges in developing such a framework include: (a) utilizing the sensor measurements to estimate and localize the induced damage in a structure by means of signal processing and data mining techniques, and (b) optimizing the computing and storage resources with the aid of cloud services. The main focus in this paper is to demonstrate the efficiency of the proposed framework for real-time damage detection of a multi-story shear-building structure in two damage scenarios (change in mass and stiffness) in various locations. Several features are extracted from the sensed data by signal processing techniques and statistical methods. Machine learning algorithms are deployed to select damage-sensitive features as well as to classify the data to trace anomalies in the response of the structure. Here, the cloud computing resources from Amazon Web Services (AWS) have been used to implement the proposed framework.
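
    A hedged sketch of the feature-extraction-plus-classification step described above, using synthetic vibration signals; the features, signal parameters, and the random-forest classifier are illustrative choices, not the authors' pipeline.

    ```python
    # Minimal sketch: summary statistics are extracted from simulated sensor
    # signals and a classifier separates "healthy" from "damaged" responses.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    def extract_features(signal: np.ndarray) -> np.ndarray:
        """Damage-sensitive summary statistics of one sensor record."""
        fft_mag = np.abs(np.fft.rfft(signal))
        return np.array([
            signal.std(),           # RMS-like energy
            np.abs(signal).max(),   # peak amplitude
            fft_mag.argmax(),       # dominant frequency bin
            fft_mag.mean(),         # average spectral magnitude
        ])

    rng = np.random.default_rng(0)
    X, y = [], []
    for label, freq in [(0, 5.0), (1, 4.2)]:   # stiffness loss shifts the natural frequency
        for _ in range(200):
            t = np.linspace(0, 1, 256)
            sig = np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal(t.size)
            X.append(extract_features(sig))
            y.append(label)

    X_tr, X_te, y_tr, y_te = train_test_split(np.array(X), np.array(y), random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    ```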

  8. Automated integration of wireless biosignal collection devices for patient-centred decision-making in point-of-care systems

    PubMed Central

    Menychtas, Andreas; Tsanakas, Panayiotis

    2016-01-01

    The proper acquisition of biosignal data from various biosensor devices and their remote accessibility are still issues that prevent the wide adoption of point-of-care systems in the routine monitoring of chronic patients. This Letter presents an advanced framework for enabling patient monitoring that utilises a cloud computing infrastructure for data management and analysis. The framework also introduces a local mechanism for uniform biosignal collection from wearables and biosignal sensors, along with decision support modules, in order to enable prompt and essential decisions. A prototype smartphone application and the related cloud modules have been implemented to demonstrate the value of the proposed framework. Initial results regarding the performance of the system and the effectiveness in data management and decision-making have been quite encouraging. PMID:27222731

  9. Automated integration of wireless biosignal collection devices for patient-centred decision-making in point-of-care systems.

    PubMed

    Menychtas, Andreas; Tsanakas, Panayiotis; Maglogiannis, Ilias

    2016-03-01

    The proper acquisition of biosignal data from various biosensor devices and their remote accessibility are still issues that prevent the wide adoption of point-of-care systems in the routine monitoring of chronic patients. This Letter presents an advanced framework for enabling patient monitoring that utilises a cloud computing infrastructure for data management and analysis. The framework also introduces a local mechanism for uniform biosignal collection from wearables and biosignal sensors, along with decision support modules, in order to enable prompt and essential decisions. A prototype smartphone application and the related cloud modules have been implemented to demonstrate the value of the proposed framework. Initial results regarding the performance of the system and the effectiveness in data management and decision-making have been quite encouraging.

  10. SnowCloud - a Framework to Predict Streamflow in Snowmelt-dominated Watersheds Using Cloud-based Computing

    NASA Astrophysics Data System (ADS)

    Sproles, E. A.; Crumley, R. L.; Nolin, A. W.; Mar, E.; Lopez-Moreno, J. J.

    2017-12-01

    Streamflow in snowy mountain regions is extraordinarily challenging to forecast, and prediction efforts are hampered by the lack of timely snow data—particularly in data-sparse regions. SnowCloud is a prototype web-based framework that integrates remote sensing, cloud computing, interactive mapping tools, and a hydrologic model to offer a new paradigm for delivering key data to water resource managers. We tested the skill of SnowCloud to forecast monthly streamflow with one month lead time in three snow-dominated headwaters. These watersheds represent a range of precipitation/runoff schemes: the Río Elqui in northern Chile (200 mm/yr, entirely snowmelt); the John Day River, Oregon, USA (635 mm/yr, primarily snowmelt); and the Río Aragon in northern Spain (850 mm/yr, snowmelt dominated). Model skill corresponded to snowpack contribution, with Nash-Sutcliffe Efficiencies of 0.86, 0.52, and 0.21, respectively. SnowCloud does not require the user to possess advanced programming skills or proprietary software. We access NASA's MOD10A1 snow cover product to calculate the snow metrics globally using Google Earth Engine's geospatial analysis and cloud computing service. The analytics and forecast tools are provided through a web-based portal that requires only internet access and minimal training. To test the efficacy of SnowCloud, we provided the tools and a series of tutorials in English and Spanish to water resource managers in Chile, Spain, and the United States. Participants assessed their user experience and provided feedback, and the results of our multi-cultural assessment are also presented. While our results focus on SnowCloud, they outline methods to develop cloud-based tools that function effectively across cultures and languages. Our approach also addresses the primary challenges of science-based computing: human resource limitations, infrastructure costs, and expensive proprietary software. These challenges are particularly problematic in developing countries.
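
    For illustration only, a hedged sketch of the kind of Google Earth Engine query such a tool might issue to compute a watershed-average snow metric from MOD10A1; the asset ID, band name, date range, and watershed rectangle are assumptions, not taken from the paper.

    ```python
    # Hedged sketch of a server-side snow-cover query; the computation runs in
    # Earth Engine's cloud, so no imagery is downloaded locally.
    import ee

    ee.Initialize()  # assumes prior ee.Authenticate() on this machine

    watershed = ee.Geometry.Rectangle([-119.0, 44.0, -118.0, 45.0])  # hypothetical extent
    snow = (ee.ImageCollection("MODIS/006/MOD10A1")   # assumed asset ID
            .filterDate("2017-01-01", "2017-02-01")
            .select("NDSI_Snow_Cover")                # assumed band name
            .mean())

    # Mean snow-cover index over the watershed for the month.
    stats = snow.reduceRegion(reducer=ee.Reducer.mean(),
                              geometry=watershed, scale=500)
    print(stats.getInfo())
    ```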

  11. Phenotype Instance Verification and Evaluation Tool (PIVET): A Scaled Phenotype Evidence Generation Framework Using Web-Based Medical Literature

    PubMed Central

    Ke, Junyuan; Ho, Joyce C; Ghosh, Joydeep; Wallace, Byron C

    2018-01-01

    Background Researchers are developing methods to automatically extract clinically relevant and useful patient characteristics from raw healthcare datasets. These characteristics, often capturing essential properties of patients with common medical conditions, are called computational phenotypes. Being generated by automated or semiautomated, data-driven methods, such potential phenotypes need to be validated as clinically meaningful (or not) before they are acceptable for use in decision making. Objective The objective of this study was to present Phenotype Instance Verification and Evaluation Tool (PIVET), a framework that uses co-occurrence analysis on an online corpus of publicly available medical journal articles to build clinical relevance evidence sets for user-supplied phenotypes. PIVET adopts a conceptual framework similar to the pioneering prototype tool PheKnow-Cloud that was developed for the phenotype validation task. PIVET completely refactors each part of the PheKnow-Cloud pipeline to deliver vast improvements in speed without sacrificing the quality of the insights PheKnow-Cloud achieved. Methods PIVET leverages indexing in NoSQL databases to efficiently generate evidence sets. Specifically, PIVET uses a succinct representation of the phenotypes that corresponds to the index on the corpus database and an optimized co-occurrence algorithm inspired by the Aho-Corasick algorithm. We compare PIVET’s phenotype representation with PheKnow-Cloud’s by using PheKnow-Cloud’s experimental setup. In PIVET’s framework, we also introduce a statistical model trained on domain expert–verified phenotypes to automatically classify phenotypes as clinically relevant or not. Additionally, we show how the classification model can be used to examine user-supplied phenotypes in an online, rather than batch, manner. Results PIVET maintains the discriminative power of PheKnow-Cloud in terms of identifying clinically relevant phenotypes for the same corpus with which PheKnow-Cloud was originally developed, but PIVET’s analysis is an order of magnitude faster than that of PheKnow-Cloud. Not only is PIVET much faster, it can be scaled to a larger corpus and still retain speed. We evaluated multiple classification models on top of the PIVET framework and found ridge regression to perform best, realizing an average F1 score of 0.91 when predicting clinically relevant phenotypes. Conclusions Our study shows that PIVET improves on the most notable existing computational tool for phenotype validation in terms of speed and automation and is comparable in terms of accuracy. PMID:29728351
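
    A toy sketch of the co-occurrence idea behind evidence-set building (PIVET itself uses NoSQL indexing and an Aho-Corasick-inspired matcher); the phenotype items and corpus below are invented.

    ```python
    # Illustrative only: count how often pairs of phenotype items co-occur in a
    # corpus of article texts, as crude clinical-relevance evidence.
    from collections import Counter
    from itertools import combinations

    phenotype = ["heart failure", "diuretic", "edema"]          # user-supplied items
    corpus = [
        "Diuretic therapy reduced edema in patients with heart failure.",
        "Edema resolved after diuretic treatment.",
        "Unrelated article about cloud computing frameworks.",
    ]

    pair_counts: Counter = Counter()
    for doc in corpus:
        text = doc.lower()
        present = [item for item in phenotype if item in text]
        pair_counts.update(combinations(sorted(present), 2))

    for (a, b), n in pair_counts.most_common():
        print(f"{a!r} co-occurs with {b!r} in {n} article(s)")
    ```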

  12. A Framework for Collaborative and Convenient Learning on Cloud Computing Platforms

    ERIC Educational Resources Information Center

    Sharma, Deepika; Kumar, Vikas

    2017-01-01

    The depth of learning resides in collaborative work with more engagement and fun. Technology can enhance collaboration with a higher level of convenience and cloud computing can facilitate this in a cost effective and scalable manner. However, to deploy a successful online learning environment, elementary components of learning pedagogy must be…

  13. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
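
    A minimal discrete-event sketch in the spirit of the approach described, not NASA's model: requests arrive at random, queue for a fixed pool of servers, and waiting times are recorded. All rates and counts are illustrative.

    ```python
    # Toy discrete-event simulation of service requests contending for servers.
    import heapq
    import random

    random.seed(1)
    N_SERVERS, N_REQUESTS = 4, 10_000
    ARRIVAL_RATE, SERVICE_RATE = 3.0, 1.0   # requests/s, completions/s per server

    arrivals, t = [], 0.0
    for _ in range(N_REQUESTS):
        t += random.expovariate(ARRIVAL_RATE)
        arrivals.append(t)

    server_free_at = [0.0] * N_SERVERS      # min-heap of times each server becomes idle
    heapq.heapify(server_free_at)
    waits = []
    for t_arr in arrivals:
        t_free = heapq.heappop(server_free_at)        # earliest available server
        t_start = max(t_arr, t_free)
        waits.append(t_start - t_arr)
        heapq.heappush(server_free_at, t_start + random.expovariate(SERVICE_RATE))

    print(f"mean wait: {sum(waits) / len(waits):.3f} s")
    ```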

  14. Evaluating cloud retrieval algorithms with the ARM BBHRP framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mlawer, E.; Dunn, M.

    2008-03-10

    Climate and weather prediction models require accurate calculations of vertical profiles of radiative heating. Although heating rate calculations cannot be directly validated due to the lack of corresponding observations, surface and top-of-atmosphere measurements can indirectly establish the quality of computed heating rates through validation of the calculated irradiances at the atmospheric boundaries. The ARM Broadband Heating Rate Profile (BBHRP) project, a collaboration of all the working groups in the program, was designed with these heating rate validations as a key objective. Given the large dependence of radiative heating rates on cloud properties, a critical component of BBHRP radiative closure analyses has been the evaluation of cloud microphysical retrieval algorithms. This evaluation is an important step in establishing the necessary confidence in the continuous profiles of computed radiative heating rates produced by BBHRP at the ARM Climate Research Facility (ACRF) sites that are needed for modeling studies. This poster details the continued effort to evaluate cloud property retrieval algorithms within the BBHRP framework, a key focus of the project this year. A requirement for the computation of accurate heating rate profiles is a robust cloud microphysical product that captures the occurrence, height, and phase of clouds above each ACRF site. Various approaches to retrieve the microphysical properties of liquid, ice, and mixed-phase clouds have been processed in BBHRP for the ACRF Southern Great Plains (SGP) and the North Slope of Alaska (NSA) sites. These retrieval methods span a range of assumptions concerning the parameterization of cloud location, particle density, size, shape, and involve different measurement sources. We will present the radiative closure results from several different retrieval approaches for the SGP site, including those from Microbase, the current 'reference' retrieval approach in BBHRP. At the NSA, mixed-phase clouds and clouds with low optical depth are prevalent; the radiative closure studies using Microbase demonstrated significant residuals. As an alternative to Microbase at NSA, the Shupe-Turner cloud property retrieval algorithm, aimed at improving the partitioning of cloud phase and incorporating more constrained, conditional microphysics retrievals, has also been evaluated using the BBHRP data set.

  15. Polyphony: A Workflow Orchestration Framework for Cloud Computing

    NASA Technical Reports Server (NTRS)

    Shams, Khawaja S.; Powell, Mark W.; Crockett, Tom M.; Norris, Jeffrey S.; Rossi, Ryan; Soderstrom, Tom

    2010-01-01

    Cloud Computing has delivered unprecedented compute capacity to NASA missions at affordable rates. Missions like the Mars Exploration Rovers (MER) and Mars Science Lab (MSL) are enjoying the elasticity that enables them to leverage hundreds, if not thousands, of machines for short durations without making any hardware procurements. In this paper, we describe Polyphony, a resilient, scalable, and modular framework that efficiently leverages a large set of computing resources to perform parallel computations. Polyphony can employ resources on the cloud, excess capacity on local machines, as well as spare resources at the supercomputing center, and it enables these resources to work in concert to accomplish a common goal. Polyphony is resilient to node failures, even if they occur in the middle of a transaction. We will conclude with an evaluation of a production-ready application built on top of Polyphony to perform image-processing operations on images from around the solar system, including Mars, Saturn, and Titan.
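
    A rough sketch of the master-worker pattern with retry-on-failure that the abstract describes; this is not Polyphony itself, and the task function and retry policy are placeholders.

    ```python
    # Sketch: a master farms out independent image-processing tasks to worker
    # processes and retries any task whose worker raised an exception.
    from concurrent.futures import ProcessPoolExecutor, as_completed

    def process_image(name: str) -> str:
        """Stand-in for an expensive per-image computation."""
        return f"{name}: processed"

    def run_master(tasks, max_workers=4, max_retries=2):
        results, attempts = {}, {t: 0 for t in tasks}
        pending = set(tasks)
        while pending:
            with ProcessPoolExecutor(max_workers=max_workers) as pool:
                futures = {pool.submit(process_image, t): t for t in pending}
                for fut in as_completed(futures):
                    task = futures[fut]
                    try:
                        results[task] = fut.result()
                        pending.discard(task)
                    except Exception:
                        attempts[task] += 1
                        if attempts[task] > max_retries:
                            pending.discard(task)   # give up on this task
        return results

    if __name__ == "__main__":
        images = [f"mars_{i:03d}.img" for i in range(8)]
        print(run_master(images))
    ```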

  16. CSNS computing environment Based on OpenStack

    NASA Astrophysics Data System (ADS)

    Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu

    2017-10-01

    Cloud computing allows for more flexible configuration of IT resources and optimized hardware utilization, and it can provide computing services according to actual need. We are applying this computing model to the China Spallation Neutron Source (CSNS) computing environment. First, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Second, the design and practice of a cloud computing platform based on OpenStack are demonstrated in terms of system framework, network, storage, and so on. Third, some improvements we made to OpenStack are discussed. Finally, the current status of the CSNS cloud computing environment is summarized at the end of the paper.

  17. Real-time video streaming in mobile cloud over heterogeneous wireless networks

    NASA Astrophysics Data System (ADS)

    Abdallah-Saleh, Saleh; Wang, Qi; Grecos, Christos

    2012-06-01

    Recently, the concept of Mobile Cloud Computing (MCC) has been proposed to offload the resource requirements in computational capabilities, storage and security from mobile devices into the cloud. Internet video applications such as real-time streaming are expected to be ubiquitously deployed and supported over the cloud for mobile users, who typically encounter a range of wireless networks of diverse radio access technologies during their roaming. However, real-time video streaming for mobile cloud users across heterogeneous wireless networks presents multiple challenges. The network-layer quality of service (QoS) provision to support high-quality mobile video delivery in this demanding scenario remains an open research question, and this in turn affects the application-level visual quality and impedes mobile users' perceived quality of experience (QoE). In this paper, we devise a framework to support real-time video streaming in this new mobile video networking paradigm and evaluate the performance of the proposed framework empirically through a lab-based yet realistic testing platform. One particular issue we focus on is the effect of users' mobility on the QoS of video streaming over the cloud. We design and implement a hybrid platform comprising a test-bed and an emulator, on which our concepts of mobile cloud computing, video streaming and heterogeneous wireless networks are implemented and integrated to allow the testing of our framework. As representative heterogeneous wireless networks, the popular WLAN (Wi-Fi) and MAN (WiMAX) networks are incorporated in order to evaluate the effects of handovers between these different radio access technologies. The H.264/AVC (Advanced Video Coding) standard is employed for real-time video streaming from a server to mobile users (client nodes) in the networks. Mobility support is introduced to enable a continuous streaming experience for a mobile user across the heterogeneous wireless network. Real-time video stream packets are captured for analytical purposes on the mobile user node. Experimental results are obtained and analysed. Future work is identified towards further improvement of the current design and implementation. With this new mobile video networking concept and paradigm implemented and evaluated, results and observations obtained from this study would form the basis of a more in-depth, comprehensive understanding of various challenges and opportunities in supporting high-quality real-time video streaming in mobile cloud over heterogeneous wireless networks.

  18. Novel cloud and SOA-based framework for e-health monitoring using wireless biosensors.

    PubMed

    Benharref, Abdelghani; Serhani, Mohamed Adel

    2014-01-01

    Various independent studies show that an exponential increase in chronic diseases (CDs) is exhausting governmental and private healthcare systems, to the extent that some countries allocate half of their budget to healthcare. Among efforts to benefit from IT development, e-health monitoring and prevention approaches have proven to be among the most promising solutions. In fact, well-implemented monitoring and prevention schemes have reported a notable reduction of CD risk and have narrowed their effects, on both patients' health conditions and on government budgets spent on healthcare. In this paper, we propose a framework to collect patients' data in real time, perform appropriate nonintrusive monitoring, and propose medical and/or lifestyle engagements, whenever needed and appropriate. The framework, which relies on service-oriented architecture (SOA) and the Cloud, allows a seamless integration of different technologies, applications, and services. It also integrates mobile technologies to smoothly collect and communicate vital data from a patient's wearable biosensors while considering the mobile devices' limited capabilities and power drainage in addition to intermittent network disconnections. Then, data are stored in the Cloud and made available via SOA to allow easy access by physicians, paramedics, or any other authorized entity. A case study has been developed to evaluate the usability of the framework, and the preliminary results that have been analyzed are very promising.

  19. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model

    PubMed Central

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to the misperception, aggressive behavior, and uncertain preferences and goals about their opponents. Existing research work focuses on the prerequest context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research work introduces a novel probabilistic decision making model for optimizing the negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision using the stochastic decision tree scenario can maximize the revenue among the participants available in the cloud service negotiation framework. PMID:26543899

  20. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model.

    PubMed

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to the misperception, aggressive behavior, and uncertain preferences and goals about their opponents. Existing research work focuses on the prerequest context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research work introduces a novel probabilistic decision making model for optimizing the negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision using the stochastic decision tree scenario can maximize the revenue among the participants available in the cloud service negotiation framework.
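
    A toy sketch of the finite-horizon (multistage) Markov decision idea described above, solved by backward induction; the states, actions, probabilities, and revenues are invented for illustration and are not the paper's model.

    ```python
    # transitions[state][action] = list of (probability, next_state, reward)
    transitions = {
        "conflict": {
            "concede":   [(0.8, "agree", 5.0), (0.2, "conflict", 0.0)],
            "hold_firm": [(0.3, "agree", 9.0), (0.7, "conflict", 0.0)],
        },
        "agree": {"stop": [(1.0, "agree", 0.0)]},
    }

    def backward_induction(horizon: int):
        value = {s: 0.0 for s in transitions}          # value at the final stage
        policy = []
        for _ in range(horizon):
            new_value, stage_policy = {}, {}
            for state, actions in transitions.items():
                best_action, best_val = None, float("-inf")
                for action, outcomes in actions.items():
                    expected = sum(p * (r + value[nxt]) for p, nxt, r in outcomes)
                    if expected > best_val:
                        best_action, best_val = action, expected
                new_value[state], stage_policy[state] = best_val, best_action
            value = new_value
            policy.append(stage_policy)
        return value, list(reversed(policy))

    value, policy = backward_induction(horizon=3)
    print("expected revenue from 'conflict':", round(value["conflict"], 2))
    print("stage-by-stage policy:", policy)
    ```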

  1. Large-scale urban point cloud labeling and reconstruction

    NASA Astrophysics Data System (ADS)

    Zhang, Liqiang; Li, Zhuqiang; Li, Anjian; Liu, Fangyu

    2018-04-01

    The large number of object categories and many overlapping or closely neighboring objects in large-scale urban scenes pose great challenges in point cloud classification. In this paper, a novel framework is proposed for classification and reconstruction of airborne laser scanning point cloud data. To label point clouds, we present a rectified linear unit neural network, named ReLu-NN, in which rectified linear units (ReLU) are taken as the activation function instead of the traditional sigmoid in order to speed up convergence. Since the features of the point cloud are sparse, we reduce the number of active neurons via dropout to avoid over-fitting during training. The set of feature descriptors for each 3D point is encoded through self-taught learning, and forms a discriminative feature representation which is taken as the input of the ReLu-NN. The segmented building points are consolidated through an edge-aware point set resampling algorithm, and then they are reconstructed into 3D lightweight models using the 2.5D contouring method (Zhou and Neumann, 2010). Compared with deep learning approaches, the ReLu-NN introduced here can easily classify unorganized point clouds without rasterizing the data, and it does not need a large number of training samples. Most of the parameters in the network are learned, and thus the intensive parameter tuning cost is significantly reduced. Experimental results on various datasets demonstrate that the proposed framework achieves better performance than other related algorithms in terms of classification accuracy and reconstruction quality.
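
    A small NumPy sketch of the two ingredients the abstract emphasizes, ReLU activations and dropout; the layer sizes and the per-point descriptor are illustrative, not the paper's network or training procedure.

    ```python
    # One hidden-layer forward pass with ReLU and inverted dropout.
    import numpy as np

    rng = np.random.default_rng(0)

    def relu(x):
        return np.maximum(0.0, x)

    def forward(features, weights, biases, drop_p=0.5, train=True):
        """Hidden layer with ReLU, inverted dropout on its activations, linear output."""
        h = relu(features @ weights[0] + biases[0])
        if train:
            mask = rng.random(h.shape) >= drop_p
            h = h * mask / (1.0 - drop_p)        # keep expected activation unchanged
        return h @ weights[1] + biases[1]

    n_features, n_hidden, n_classes = 32, 64, 8   # per-point descriptor -> class scores
    weights = [rng.normal(0, 0.1, (n_features, n_hidden)),
               rng.normal(0, 0.1, (n_hidden, n_classes))]
    biases = [np.zeros(n_hidden), np.zeros(n_classes)]

    point_descriptor = rng.normal(size=(1, n_features))   # stand-in for a learned descriptor
    print(forward(point_descriptor, weights, biases).shape)   # (1, 8) class scores
    ```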

  2. GRDC. A Collaborative Framework for Radiological Background and Contextual Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quiter, Brian J.; Ramakrishnan, Lavanya; Bandstra, Mark S.

    The Radiation Mobile Analysis Platform (RadMAP) is unique in its capability to collect high-quality radiological data from both gamma-ray and fast neutron detectors, along with a broad array of contextual data that includes positioning and stance data and high-resolution 3D data from weather sensors, LiDAR, and visual and hyperspectral cameras. The datasets obtained from RadMAP are both voluminous and complex and require analyses from highly diverse communities within both the national laboratory and academic communities. Maintaining a high level of transparency will enable analysis products to further enrich the RadMAP dataset. It is in this spirit of open and collaborative data that the RadMAP team proposed to collect, calibrate, and make available online data from the RadMAP system. The Berkeley Data Cloud (BDC) is a cloud-based data management framework that enables web-based data browsing and visualization, and connects curated datasets to custom workflows such that analysis products can be managed and disseminated while maintaining user access rights. BDC enables cloud-based analyses of large datasets in a manner that simulates real-time data collection, such that BDC can be used to test algorithm performance on real and source-injected datasets. Using the BDC framework, a subset of the RadMAP datasets has been disseminated via the Gamma Ray Data Cloud (GRDC), hosted through the National Energy Research Scientific Computing (NERSC) Center, enabling data access to over 40 users at 10 institutions.

  3. Analytical modeling and feasibility study of a multi-GPU cloud-based server (MGCS) framework for non-voxel-based dose calculations.

    PubMed

    Neylon, J; Min, Y; Kupelian, P; Low, D A; Santhanam, A

    2017-04-01

    In this paper, a multi-GPU cloud-based server (MGCS) framework is presented for dose calculations, exploring the feasibility of remote computing power for parallelization and acceleration of computationally and time-intensive radiotherapy tasks in moving toward online adaptive therapies. An analytical model was developed to estimate theoretical MGCS performance acceleration and intelligently determine workload distribution. Numerical studies were performed with a computing setup of 14 GPUs distributed over 4 servers interconnected by a 1 Gigabit per second (Gbps) network. Inter-process communication methods were optimized to facilitate resource distribution and minimize data transfers over the server interconnect. The analytically predicted computation times matched experimental observations within 1-5%. MGCS performance approached a theoretical limit of acceleration proportional to the number of GPUs utilized when computational tasks far outweighed memory operations. The MGCS implementation reproduced ground-truth dose computations with negligible differences, by distributing the work among several processes and implementing optimization strategies. The results showed that a cloud-based computation engine was a feasible solution for enabling clinics to make use of fast dose calculations for advanced treatment planning and adaptive radiotherapy. The cloud-based system was able to exceed the performance of a local machine even for optimized calculations, and provided significant acceleration for computationally intensive tasks. Such a framework can provide access to advanced technology and computational methods to many clinics, providing an avenue for standardization across institutions without the requirements of purchasing, maintaining, and continually updating hardware.
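
    A hedged illustration of the kind of analytical performance model described (the paper's actual model is not reproduced here): total runtime is split into a compute term that scales with the number of GPUs and a transfer term that does not.

    ```python
    # Simple speedup model: compute is divided across GPUs, transfer overhead is not.
    def estimated_runtime(compute_s: float, transfer_s: float, n_gpus: int) -> float:
        return compute_s / n_gpus + transfer_s

    baseline = estimated_runtime(compute_s=120.0, transfer_s=6.0, n_gpus=1)
    for n in (1, 2, 4, 8, 14):
        t = estimated_runtime(120.0, 6.0, n)
        print(f"{n:2d} GPUs: {t:6.1f} s  speedup x{baseline / t:.2f}")
    ```

    Under this toy model the speedup approaches the GPU count only while the compute term dominates the fixed transfer term, which is consistent with the behavior the abstract reports.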

  4. Initial Results from CALIPSO

    NASA Technical Reports Server (NTRS)

    Winker, David M.; Pelon, Jacques; McCormick, M. Patrick

    2006-01-01

    CALIPSO will carry the first polarization lidar in orbit, along with infrared and visible passive imagers, and will fly in formation as part of the Afternoon Constellation (A-train). The acquisition of observations which are simultaneous and coincident with observations from other instruments of the A-train will allow numerous synergies to be realized from combining CALIPSO observations with observations from other platforms. In particular, cloud observations from the CALIPSO lidar and the CloudSat radar will complement each other, together encompassing the variety of clouds found in the atmosphere, from thin cirrus to deep convective clouds. CALIPSO has been developed within the framework of a collaboration between NASA and CNES and is currently scheduled to launch, along with the CloudSat satellite, in spring 2006. This paper will present an overview of the CALIPSO mission, including initial results.

  5. Integration of cloud-based storage in BES III computing environment

    NASA Astrophysics Data System (ADS)

    Wang, L.; Hernandez, F.; Deng, Z.

    2014-06-01

    We present on-going work that aims to evaluate the suitability of cloud-based storage as a supplement to the Lustre file system for storing experimental data for the BES III physics experiment and as a backend for storing files belonging to individual members of the collaboration. In particular, we discuss our findings regarding the support of cloud-based storage in the software stack of the experiment. We report on our development work that improves the support of CERN's ROOT data analysis framework and allows efficient remote access to data through several cloud storage protocols. We also present our efforts to provide the experiment with efficient command line tools for navigating and interacting with cloud storage-based data repositories, both from interactive sessions and grid jobs.

  6. Study of Huizhou architecture component point cloud in surface reconstruction

    NASA Astrophysics Data System (ADS)

    Zhang, Runmei; Wang, Guangyin; Ma, Jixiang; Wu, Yulu; Zhang, Guangbin

    2017-06-01

    Surface reconstruction software has many problems, such as complicated operations on point cloud data, too many interaction definitions, and overly stringent requirements on input data; thus, it has not been widely adopted so far. This paper selects the distinctive chuandou wooden beam framework of Huizhou Architecture as the research object and presents a complete implementation covering point cloud data acquisition, preprocessing, and surface reconstruction. First, the acquired point cloud data are preprocessed, including segmentation and filtering. Second, the surface normals are deduced directly from the point cloud dataset. Finally, surface reconstruction is studied using the Greedy Projection Triangulation Algorithm. Comparing the reconstructed model with those from three-dimensional surface reconstruction software packages, the results show that the proposed scheme is smoother, more time efficient, and more portable.

  7. Evaluating the Influence of the Client Behavior in Cloud Computing.

    PubMed

    Souza Pardo, Mário Henrique; Centurion, Adriana Molina; Franco Eustáquio, Paulo Sérgio; Carlucci Santana, Regina Helena; Bruschi, Sarita Mazzini; Santana, Marcos José

    2016-01-01

    This paper proposes a novel approach for the implementation of simulation scenarios, providing a client entity for cloud computing systems. The client entity allows the creation of scenarios in which the client behavior has an influence on the simulation, making the results more realistic. The proposed client entity is based on several characteristics that affect the performance of a cloud computing system, including different modes of submission and their behavior when the waiting time between requests (think time) is considered. The proposed characterization of the client enables the sending of either individual requests or groups of Web service requests, allowing scenarios where the workload takes the form of bursts. The client entity is included in CloudSim, a framework for modelling and simulation of cloud computing. Experimental results show the influence of the client behavior on the performance of the services executed in a cloud computing system.
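
    The client entity described is implemented in CloudSim, a Java toolkit; the following is only a small Python sketch of the workload idea, contrasting think-time-separated requests with bursts. All rates are invented.

    ```python
    # Two illustrative request-arrival patterns for a simulated client.
    import random

    random.seed(0)

    def single_request_times(n, mean_think_s=2.0):
        """Arrival times when the client waits a random think time between requests."""
        t, times = 0.0, []
        for _ in range(n):
            t += random.expovariate(1.0 / mean_think_s)
            times.append(t)
        return times

    def burst_request_times(n_bursts, burst_size, gap_s=10.0, intra_gap_s=0.05):
        """Arrival times when requests are submitted in tight bursts."""
        t, times = 0.0, []
        for _ in range(n_bursts):
            t += gap_s
            for _ in range(burst_size):
                t += intra_gap_s
                times.append(t)
        return times

    print(single_request_times(5))
    print(burst_request_times(2, 3))
    ```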

  8. Applications integration in a hybrid cloud computing environment: modelling and platform

    NASA Astrophysics Data System (ADS)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds and their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and in intra-enterprise ISs. A run-time platform is developed and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.

  9. Evaluating the Influence of the Client Behavior in Cloud Computing

    PubMed Central

    Centurion, Adriana Molina; Franco Eustáquio, Paulo Sérgio; Carlucci Santana, Regina Helena; Bruschi, Sarita Mazzini; Santana, Marcos José

    2016-01-01

    This paper proposes a novel approach for the implementation of simulation scenarios, providing a client entity for cloud computing systems. The client entity allows the creation of scenarios in which the client behavior has an influence on the simulation, making the results more realistic. The proposed client entity is based on several characteristics that affect the performance of a cloud computing system, including different modes of submission and their behavior when the waiting time between requests (think time) is considered. The proposed characterization of the client enables the sending of either individual requests or groups of Web service requests, allowing scenarios where the workload takes the form of bursts. The client entity is included in CloudSim, a framework for modelling and simulation of cloud computing. Experimental results show the influence of the client behavior on the performance of the services executed in a cloud computing system. PMID:27441559

  10. 3D Viewer Platform of Cloud Clustering Management System: Google Map 3D

    NASA Astrophysics Data System (ADS)

    Choi, Sung-Ja; Lee, Gang-Soo

    A new management system framework for cloud environments is needed as platforms converge in response to changing computing environments. ISVs and small businesses find it hard to adopt the platform management systems offered by large vendors. This article suggests a clustering management system for cloud computing environments aimed at ISVs and small-business enterprises. It applies a 3D viewer adapted from Google Map 3D and Google Earth, and is called 3DV_CCMS as an extension of the CCMS [1].

  11. Contextual cloud-based service oriented architecture for clinical workflow.

    PubMed

    Moreno-Conde, Jesús; Moreno-Conde, Alberto; Núñez-Benjumea, Francisco J; Parra-Calderón, Carlos

    2015-01-01

    Regarding the acceptance of systems within the healthcare domain, multiple papers have highlighted the importance of integrating tools with the clinical workflow. This paper analyses how clinical context management could be deployed in order to promote the adoption of advanced cloud services within the clinical workflow. This deployment can be integrated with the specifications promoted by the eHealth European Interoperability Framework. This paper proposes a cloud-based service-oriented architecture that implements a context management system aligned with the HL7 standard known as CCOW.

  12. QoS-aware health monitoring system using cloud-based WBANs.

    PubMed

    Almashaqbeh, Ghada; Hayajneh, Thaier; Vasilakos, Athanasios V; Mohd, Bassam J

    2014-10-01

    Wireless Body Area Networks (WBANs) are amongst the best options for remote health monitoring. However, as standalone systems WBANs have many limitations due to the large amount of processed data, the mobility of monitored users, and the network coverage area. Integrating WBANs with cloud computing provides effective solutions to these problems and improves the performance of WBAN-based systems. Accordingly, in this paper we propose a cloud-based real-time remote health monitoring system for tracking the health status of non-hospitalized patients while they practice their daily activities. Compared with existing cloud-based WBAN frameworks, we divide the cloud into a local one, which includes the monitored users and local medical staff, and a global one, which includes the outside world. The performance of the proposed framework is optimized by reducing congestion, interference, and data delivery delay while supporting users' mobility. Several novel techniques and algorithms are proposed to accomplish our objective. First, the concept of data classification and aggregation is utilized to avoid clogging the network with unnecessary data traffic. Second, a dynamic channel assignment policy is developed to distribute the WBANs associated with the users over the available frequency channels to manage interference. Third, a delay-aware routing metric is proposed to be used by the local cloud in its multi-hop communication to speed up the reporting process of the health-related data. Fourth, the delay-aware metric is further utilized by the association protocols used by the WBANs to connect with the local cloud. Finally, the system with all the proposed techniques and algorithms is evaluated using extensive ns-2 simulations. The simulation results show superior performance of the proposed architecture in optimizing the end-to-end delay, handling the increased interference levels, maximizing the network capacity, and tracking users' mobility.
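
    An illustrative sketch only of what a delay-aware multi-hop routing metric can look like: choose the path with the smallest accumulated expected delay. The link delays below are invented, and the paper's actual metric is not reproduced.

    ```python
    # Shortest-expected-delay path over a tiny local-cloud topology (Dijkstra).
    import heapq

    def min_delay_path(links, src, dst):
        """links: {node: [(neighbor, expected_delay_s), ...]}."""
        best = {src: 0.0}
        prev, heap = {}, [(0.0, src)]
        while heap:
            d, node = heapq.heappop(heap)
            if node == dst:
                break
            if d > best.get(node, float("inf")):
                continue                      # stale heap entry
            for nbr, w in links.get(node, []):
                nd = d + w
                if nd < best.get(nbr, float("inf")):
                    best[nbr], prev[nbr] = nd, node
                    heapq.heappush(heap, (nd, nbr))
        path, node = [dst], dst
        while node != src:
            node = prev[node]
            path.append(node)
        return list(reversed(path)), best[dst]

    links = {"wban1": [("relay", 0.02), ("gateway", 0.09)],
             "relay": [("gateway", 0.03)],
             "gateway": []}
    print(min_delay_path(links, "wban1", "gateway"))   # relayed path, 0.05 s total
    ```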

  13. FRIEDA: Flexible Robust Intelligent Elastic Data Management Framework

    DOE PAGES

    Ghoshal, Devarshi; Hendrix, Valerie; Fox, William; ...

    2017-02-01

    Scientific applications are increasingly using cloud resources for their data analysis workflows. However, managing data effectively and efficiently over these cloud resources is challenging due to the myriad storage choices with different performance and cost trade-offs, complex application choices, and the complexity associated with elasticity and failure rates in these environments. The different data access patterns for data-intensive scientific applications require a more flexible and robust data management solution than the ones currently in existence. FRIEDA is a Flexible Robust Intelligent Elastic Data Management framework that employs a range of data management strategies in cloud environments. FRIEDA can manage the storage and data lifecycle of applications in cloud environments. There are four different stages in the data management lifecycle of FRIEDA – (i) storage planning, (ii) provisioning and preparation, (iii) data placement, and (iv) execution. FRIEDA defines a data control plane and an execution plane. The data control plane defines the data partition and distribution strategy, whereas the execution plane manages the execution of the application using a master-worker paradigm. FRIEDA also provides different data management strategies, either to partition the data in real-time, or predetermine the data partitions prior to application execution.

  14. Distributed MRI reconstruction using Gadgetron-based cloud computing.

    PubMed

    Xue, Hui; Inati, Souheil; Sørensen, Thomas Sangild; Kellman, Peter; Hansen, Michael S

    2015-03-01

    To expand the open source Gadgetron reconstruction framework to support distributed computing and to demonstrate that a multinode version of the Gadgetron can be used to provide nonlinear reconstruction with clinically acceptable latency. The Gadgetron framework was extended with new software components that enable an arbitrary number of Gadgetron instances to collaborate on a reconstruction task. This cloud-enabled version of the Gadgetron was deployed on three different distributed computing platforms ranging from a heterogeneous collection of commodity computers to the commercial Amazon Elastic Compute Cloud. The Gadgetron cloud was used to provide nonlinear, compressed sensing reconstruction on a clinical scanner with low reconstruction latency (e.g., cardiac and neuroimaging applications). The proposed setup was able to handle acquisition and ℓ1-SPIRiT reconstruction of nine high temporal resolution, real-time cardiac short-axis cine acquisitions, covering the ventricles for functional evaluation, in under 1 min. A three-dimensional high-resolution brain acquisition with 1 mm³ isotropic pixel size was acquired and reconstructed with nonlinear reconstruction in less than 5 min. A distributed-computing-enabled Gadgetron provides a scalable way to improve reconstruction performance using commodity cluster computing. Nonlinear, compressed sensing reconstruction can be deployed clinically with low image reconstruction latency. © 2014 Wiley Periodicals, Inc.

  15. Sensor network based solar forecasting using a local vector autoregressive ridge framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, J.; Yoo, S.; Heiser, J.

    2016-04-04

    The significant improvements and falling costs of photovoltaic (PV) technology make solar energy a promising resource, yet the cloud-induced variability of surface solar irradiance inhibits its effective use in grid-tied PV generation. Short-term irradiance forecasting, especially on the minute scale, is critically important for grid system stability and auxiliary power source management. Compared to the trending sky imaging devices, irradiance sensors are inexpensive and easy to deploy, but related forecasting methods have not been well researched. The prominent challenge of applying classic time series models on a network of irradiance sensors is to address their varying spatio-temporal correlations due to local changes in cloud conditions. We propose a local vector autoregressive framework with ridge regularization to forecast irradiance without explicitly determining the wind field or cloud movement. By using local training data, our learned forecast model is adaptive to local cloud conditions, and by using regularization, we overcome the risk of overfitting from the limited training data. Our systematic experimental results showed an average of 19.7% RMSE and 20.2% MAE improvement over the benchmark Persistent Model for 1-5 minute forecasts on a comprehensive 25-day dataset.
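
    A rough sketch of a vector-autoregressive forecast with ridge regularization over a short local training window; the sensor count, lags, and synthetic data are assumptions, not the authors' exact formulation.

    ```python
    # Ridge-regularized VAR one-step-ahead forecast on a small sensor network,
    # compared against a persistence baseline.
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    n_sensors, n_minutes, n_lags = 5, 400, 3

    # Synthetic irradiance-like series: shared slow signal plus per-sensor noise.
    base = np.cumsum(rng.normal(0, 0.05, n_minutes))
    series = base[:, None] + 0.2 * rng.normal(size=(n_minutes, n_sensors))

    # Lagged design matrix: predict all sensors one step ahead from n_lags past steps.
    X = np.hstack([series[lag:n_minutes - n_lags + lag] for lag in range(n_lags)])
    y = series[n_lags:]

    split = 300                      # "local" training window, then evaluate on the rest
    model = Ridge(alpha=1.0).fit(X[:split], y[:split])
    pred = model.predict(X[split:])

    rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
    persist = np.sqrt(np.mean((series[split + n_lags - 1:-1] - y[split:]) ** 2))
    print(f"ridge-VAR RMSE {rmse:.3f} vs persistence {persist:.3f}")
    ```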

  16. FRIEDA: Flexible Robust Intelligent Elastic Data Management Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghoshal, Devarshi; Hendrix, Valerie; Fox, William

    Scientific applications are increasingly using cloud resources for their data analysis workflows. However, managing data effectively and efficiently over these cloud resources is challenging due to the myriad storage choices with different performance and cost trade-offs, complex application choices, and the complexity associated with elasticity and failure rates in these environments. The different data access patterns for data-intensive scientific applications require a more flexible and robust data management solution than the ones currently in existence. FRIEDA is a Flexible Robust Intelligent Elastic Data Management framework that employs a range of data management strategies in cloud environments. FRIEDA can manage the storage and data lifecycle of applications in cloud environments. There are four different stages in the data management lifecycle of FRIEDA – (i) storage planning, (ii) provisioning and preparation, (iii) data placement, and (iv) execution. FRIEDA defines a data control plane and an execution plane. The data control plane defines the data partition and distribution strategy, whereas the execution plane manages the execution of the application using a master-worker paradigm. FRIEDA also provides different data management strategies, either to partition the data in real-time, or predetermine the data partitions prior to application execution.

  17. Pairwise registration of TLS point clouds using covariance descriptors and a non-cooperative game

    NASA Astrophysics Data System (ADS)

    Zai, Dawei; Li, Jonathan; Guo, Yulan; Cheng, Ming; Huang, Pengdi; Cao, Xiaofei; Wang, Cheng

    2017-12-01

    It is challenging to automatically register TLS point clouds with noise, outliers and varying overlap. In this paper, we propose a new method for pairwise registration of TLS point clouds. We first generate covariance matrix descriptors with an adaptive neighborhood size from the point clouds to find candidate correspondences; we then construct a non-cooperative game to isolate mutually compatible correspondences, which are considered true positives. The method was tested on three models acquired by two different TLS systems. Experimental results demonstrate that our proposed adaptive covariance (ACOV) descriptor is invariant to rigid transformation and robust to noise and varying resolutions. The average registration errors achieved on the three models are 0.46 cm, 0.32 cm and 1.73 cm, respectively. The computational costs on these models are about 288 s, 184 s and 903 s, respectively. Moreover, our registration framework using ACOV descriptors and a game-theoretic method is superior to the state-of-the-art methods in terms of both registration error and computational time. The experiment on a large outdoor scene further demonstrates the feasibility and effectiveness of our proposed pairwise registration framework.
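
    A bare-bones sketch of a neighborhood covariance descriptor (the ACOV descriptor in the paper additionally adapts the neighborhood size and feeds a game-theoretic matcher); the radius and synthetic cloud here are illustrative.

    ```python
    # Covariance of the local neighborhood around a query point as a 3x3 descriptor.
    import numpy as np

    def covariance_descriptor(points: np.ndarray, query_idx: int, radius: float) -> np.ndarray:
        """3x3 covariance of the points within `radius` of the query point."""
        query = points[query_idx]
        dists = np.linalg.norm(points - query, axis=1)
        neighbors = points[dists < radius]
        centered = neighbors - neighbors.mean(axis=0)
        return centered.T @ centered / max(len(neighbors) - 1, 1)

    rng = np.random.default_rng(0)
    cloud = rng.uniform(0, 1, size=(1000, 3))             # synthetic scan
    desc = covariance_descriptor(cloud, query_idx=0, radius=0.1)
    print(desc.shape, np.linalg.eigvalsh(desc))            # eigenvalues describe local shape
    ```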

  18. Toward ubiquitous healthcare services with a novel efficient cloud platform.

    PubMed

    He, Chenguang; Fan, Xiaomao; Li, Ye

    2013-01-01

    Ubiquitous healthcare services are becoming more and more popular, especially under the urgent demand of the global aging issue. Cloud computing has pervasive, on-demand, service-oriented properties that fit the characteristics of healthcare services very well. However, handling multimodal, heterogeneous, and nonstationary physiological signals to provide persistent personalized services, while sustaining highly concurrent online analysis for the public, is a challenge for a general-purpose cloud. In this paper, we propose a private cloud platform architecture that includes six layers according to these specific requirements. The platform uses a message queue as a cloud engine, and each layer thereby achieves relative independence through this loosely coupled, publish/subscribe means of communication. Furthermore, a plug-in algorithm framework is also presented, and massive semistructured or unstructured medical data are accessed adaptively by this cloud architecture. As the testing results show, the proposed cloud platform, with robust, stable, and efficient features, can satisfy highly concurrent requests from ubiquitous healthcare services.
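
    A minimal in-process sketch of the publish/subscribe decoupling described (the platform itself uses a message-queue cloud engine between layers); the topic names and handlers here are invented.

    ```python
    # Tiny topic-based broker: layers subscribe to topics and publish events,
    # so publishers never reference subscribers directly.
    from collections import defaultdict
    from queue import Queue

    class Broker:
        def __init__(self):
            self.subscribers = defaultdict(list)   # topic -> list of queues

        def subscribe(self, topic: str) -> Queue:
            q: Queue = Queue()
            self.subscribers[topic].append(q)
            return q

        def publish(self, topic: str, message) -> None:
            for q in self.subscribers[topic]:
                q.put(message)

    broker = Broker()
    analysis_inbox = broker.subscribe("ecg.raw")        # analysis layer listens
    storage_inbox = broker.subscribe("ecg.raw")         # storage layer listens too

    broker.publish("ecg.raw", {"patient": "p001", "samples": [0.1, 0.4, 0.3]})
    print(analysis_inbox.get(), storage_inbox.get())
    ```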

  19. SCIMITAR: Scalable Stream-Processing for Sensor Information Brokering

    DTIC Science & Technology

    2013-11-01

    … (IaaS) cloud frameworks including Amazon Web Services and Eucalyptus. For load testing, we used The Grinder [9], a Java load testing framework that … internal Eucalyptus cluster which we could not scale as large as the Amazon environment due to a lack of computation resources. We recreated our …

  20. PRESAGE: PRivacy-preserving gEnetic testing via SoftwAre Guard Extension.

    PubMed

    Chen, Feng; Wang, Chenghong; Dai, Wenrui; Jiang, Xiaoqian; Mohammed, Noman; Al Aziz, Md Momin; Sadat, Md Nazmus; Sahinalp, Cenk; Lauter, Kristin; Wang, Shuang

    2017-07-26

    Advances in DNA sequencing technologies have prompted a wide range of genomic applications to improve healthcare and facilitate biomedical research. However, privacy and security concerns have emerged as a challenge for utilizing cloud computing to handle sensitive genomic data. We present one of the first implementations of a Software Guard Extensions (SGX)-based securely outsourced genetic testing framework, which leverages multiple cryptographic protocols and a minimal perfect hash scheme to enable efficient and secure data storage and computation outsourcing. We compared the performance of the proposed PRESAGE framework with the state-of-the-art homomorphic encryption scheme, as well as a plaintext implementation. The experimental results demonstrated a significant performance gain over the homomorphic encryption methods and a small computational overhead in comparison to the plaintext implementation. The proposed PRESAGE framework provides an alternative solution for secure and efficient genomic data outsourcing in an untrusted cloud by using a hybrid framework that combines secure hardware and multiple crypto protocols.

  1. Earth Science Data Fusion with Event Building Approach

    NASA Technical Reports Server (NTRS)

    Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.

    2015-01-01

    The objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve the efficiency of Earth Science data fusion, big data processing and analytics. The key components of NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident-data Predictor, fast in-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a workflow with minimized I/O), and on-line data processing control and analytics services. The NAIADS project leverages the CLARA framework, developed at Jefferson Lab, integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis will be used for NAIADS demonstration and performance tests in compute Cloud and Cluster environments.

  2. Comparative study of internet cloud and cloudlet over wireless mesh networks for real-time applications

    NASA Astrophysics Data System (ADS)

    Khan, Kashif A.; Wang, Qi; Luo, Chunbo; Wang, Xinheng; Grecos, Christos

    2014-05-01

    Mobile cloud computing is gaining worldwide momentum as providers such as Amazon and Google offer ubiquitous, on-demand cloud services to mobile users at low capital cost. However, Internet-centric clouds introduce wide area network (WAN) delays that are often intolerable for real-time applications such as video streaming. One promising approach to addressing this challenge is to deploy decentralized mini-cloud facilities, known as cloudlets, to enable localized cloud services. When supported by local wireless connectivity, a wireless cloudlet is expected to offer low-cost and high-performance cloud services to its users. In this work, we implement a realistic framework that comprises both a popular Internet cloud (Amazon Cloud) and a real-world cloudlet (based on Ubuntu Enterprise Cloud (UEC)) for mobile cloud users in a wireless mesh network. We focus on real-time video streaming over the HTTP standard and implement a typical application. We further perform a comprehensive comparative analysis and empirical evaluation of the application's performance when it is delivered over the Internet cloud and the cloudlet, respectively. The study quantifies the influence of the two cloud networking architectures on supporting real-time video streaming. We also enable movement of users in the wireless mesh network and investigate the effect of user mobility on mobile cloud computing over the cloudlet and the Amazon cloud, respectively. Our experimental results demonstrate the advantages of the cloudlet paradigm over its Internet cloud counterpart in supporting the quality of service of real-time applications.

  3. Integration of drug dosing data with physiological data streams using a cloud computing paradigm.

    PubMed

    Bressan, Nadja; James, Andrew; McGregor, Carolyn

    2013-01-01

    Many drugs are used during the provision of intensive care for the preterm newborn infant. Recommendations for drug dosing in newborns depend upon data from population-based pharmacokinetic research. There is a need to be able to modify drug dosing in response to the preterm infant's response to the standard dosing recommendations. Real-time integration of physiological data with drug dosing data would facilitate individualized drug dosing for these immature infants. This paper proposes the use of a novel computational framework that employs real-time, temporal data analysis for this task. Deployment of the framework within the cloud computing paradigm will enable widespread distribution of individualized drug dosing for newborn infants.

  4. COMP Superscalar, an interoperable programming framework

    NASA Astrophysics Data System (ADS)

    Badia, Rosa M.; Conejero, Javier; Diaz, Carlos; Ejarque, Jorge; Lezzi, Daniele; Lordan, Francesc; Ramon-Cortes, Cristian; Sirvent, Raul

    2015-12-01

    COMPSs is a programming framework that aims to facilitate the parallelization of existing applications written in Java, C/C++ and Python. For that purpose, it offers a simple programming model based on sequential development in which the user is mainly responsible for (i) identifying the functions to be executed as asynchronous parallel tasks and (ii) annotating them with annotations or standard Python decorators. A runtime system is in charge of exploiting the inherent concurrency of the code, automatically detecting and enforcing the data dependencies between tasks and spawning these tasks on the available resources, which can be nodes in a cluster, clouds or grids. In cloud environments, COMPSs provides scalability and elasticity features allowing the dynamic provisioning of resources.
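
    The flavor of this task-based model can be sketched in plain Python using the standard concurrent.futures module: functions are marked as asynchronous tasks and a runtime schedules them. This is only an analogy under that assumption, not the actual COMPSs API or runtime.

      from concurrent.futures import ThreadPoolExecutor

      _executor = ThreadPoolExecutor(max_workers=4)

      def task(fn):
          """Toy decorator marking a function as an asynchronous task (illustrative only)."""
          def submit(*args, **kwargs):
              return _executor.submit(fn, *args, **kwargs)
          return submit

      @task
      def square(x):
          return x * x

      @task
      def total(values):
          return sum(values)

      if __name__ == "__main__":
          # Independent tasks can run concurrently; the final sum waits for their results.
          futures = [square(i) for i in range(8)]
          result = total([f.result() for f in futures]).result()
          print(result)  # 140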

  5. Key Technology Research on Open Architecture for The Sharing of Heterogeneous Geographic Analysis Models

    NASA Astrophysics Data System (ADS)

    Yue, S. S.; Wen, Y. N.; Lv, G. N.; Hu, D.

    2013-10-01

    In recent years, the growing development of cloud computing technologies has laid a critical foundation for efficiently solving complicated geographic issues. However, it is still difficult to realize the cooperative operation of massive heterogeneous geographical models. Traditional cloud architecture tends to provide centralized solutions to end users, while all the required resources are often offered by large enterprises or special agencies. Thus, it is a closed framework from the perspective of resource utilization. Solving comprehensive geographic issues requires integrating multifarious heterogeneous geographical models and data. In this case, an open computing platform is needed, with which model owners can conveniently package and deploy their models into the cloud, while model users can search, access and utilize those models with cloud facilities. Based on this concept, open cloud service strategies for the sharing of heterogeneous geographic analysis models are studied in this article. The key technologies, namely a unified cloud interface strategy, a sharing platform based on cloud services, and a computing platform based on cloud services, are discussed in detail, and related experiments are conducted for further verification.

  6. Simulations and Evaluation of Mesoscale Convective Systems in a Multi-scale Modeling Framework (MMF)

    NASA Astrophysics Data System (ADS)

    Chern, J. D.; Tao, W. K.

    2017-12-01

    It is well known that mesoscale convective systems (MCSs) produce more than 50% of the rainfall in most tropical regions and play important roles in regional and global water cycles. Simulating MCSs in global and climate models is a very challenging problem. Typical MCSs have horizontal scales of a few hundred kilometers, so models with a domain of several hundred kilometers and a resolution fine enough to properly simulate individual clouds are required to realistically simulate MCSs. The multiscale modeling framework (MMF), which replaces traditional cloud parameterizations with cloud-resolving models (CRMs) within a host atmospheric general circulation model (GCM), has shown some capability of simulating organized MCS-like storm signals and propagation. However, its embedded CRMs typically have a small domain (less than 128 km) and coarse resolution (about 4 km), which cannot realistically represent MCSs and individual clouds. In this study, a series of simulations were performed using the Goddard MMF. The impacts of the domain size and grid resolution of the embedded CRMs on simulating MCSs are examined. Changes in cloud structure, occurrence, and properties such as cloud type, updrafts and downdrafts, latent heating profiles, and cold pool strength in the embedded CRMs are examined in detail. The simulated MCS characteristics are evaluated against satellite measurements using the Goddard Satellite Data Simulator Unit. The results indicate that embedded CRMs with a large domain and fine resolution tend to produce better simulations than those with the typical MMF configuration (128 km domain size and 4 km grid spacing).

  7. FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption

    PubMed Central

    2015-01-01

    Background The increasing availability of genome data motivates massive research studies in personalized treatment and precision medicine. Public cloud services provide a flexible way to mitigate the storage and computation burden of conducting genome-wide association studies (GWAS). However, data privacy has been a widespread concern when sharing sensitive information in a cloud environment. Methods We present a novel framework (FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption) to fully outsource GWAS (i.e., chi-square statistic computation) using homomorphic encryption. The proposed framework enables secure divisions over encrypted data. We introduce two division protocols (i.e., secure errorless division and secure approximation division) with a trade-off between complexity and accuracy in computing chi-square statistics. Results The proposed framework was evaluated for the task of chi-square statistic computation with two case-control datasets from the 2015 iDASH genome privacy protection challenge. Experimental results show that the performance of FORESEE can be significantly improved through algorithmic optimization and parallel computation. Remarkably, the secure approximation division provides a significant performance gain without missing any significant SNPs in the chi-square association test on the aforementioned datasets. Conclusions Unlike many existing HME-based studies, in which final results need to be computed by the data owner due to the lack of a secure division operation, the proposed FORESEE framework supports complete outsourcing to the cloud and outputs the final encrypted chi-square statistics. PMID:26733391

  8. FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption.

    PubMed

    Zhang, Yuchen; Dai, Wenrui; Jiang, Xiaoqian; Xiong, Hongkai; Wang, Shuang

    2015-01-01

    The increasing availability of genome data motivates massive research studies in personalized treatment and precision medicine. Public cloud services provide a flexible way to mitigate the storage and computation burden of conducting genome-wide association studies (GWAS). However, data privacy has been a widespread concern when sharing sensitive information in a cloud environment. We present a novel framework (FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption) to fully outsource GWAS (i.e., chi-square statistic computation) using homomorphic encryption. The proposed framework enables secure divisions over encrypted data. We introduce two division protocols (i.e., secure errorless division and secure approximation division) with a trade-off between complexity and accuracy in computing chi-square statistics. The proposed framework was evaluated for the task of chi-square statistic computation with two case-control datasets from the 2015 iDASH genome privacy protection challenge. Experimental results show that the performance of FORESEE can be significantly improved through algorithmic optimization and parallel computation. Remarkably, the secure approximation division provides a significant performance gain without missing any significant SNPs in the chi-square association test on the aforementioned datasets. Unlike many existing HME-based studies, in which final results need to be computed by the data owner due to the lack of a secure division operation, the proposed FORESEE framework supports complete outsourcing to the cloud and outputs the final encrypted chi-square statistics.
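
    For context, the chi-square association statistic that such a framework outsources can be computed in plaintext from a 2x2 allele-count contingency table, as in the short sketch below. The example counts are made up; the point of FORESEE is to obtain the same statistic over encrypted data.

      def chi_square_2x2(a, b, c, d):
          """Pearson chi-square statistic for a 2x2 contingency table.

          a, b : allele counts (e.g., minor/major) in cases
          c, d : allele counts in controls
          """
          n = a + b + c + d
          # Standard closed form: n * (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))
          denom = (a + b) * (c + d) * (a + c) * (b + d)
          return n * (a * d - b * c) ** 2 / denom

      if __name__ == "__main__":
          # Hypothetical counts for one SNP.
          print(chi_square_2x2(30, 70, 10, 90))  # 12.5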

  9. Simplified ISCCP cloud regimes for evaluating cloudiness in CMIP5 models

    NASA Astrophysics Data System (ADS)

    Jin, Daeho; Oreopoulos, Lazaros; Lee, Dongmin

    2017-01-01

    We take advantage of ISCCP simulator data available for many models that participated in CMIP5 in order to introduce a framework for comparing model cloud output with corresponding ISCCP observations based on the cloud regime (CR) concept. Simplified global CRs are employed, derived from the co-variations of three variables, namely cloud optical thickness, cloud top pressure and cloud fraction (τ, p_c, CF). Following evaluation criteria established in a companion paper of ours (Jin et al. 2016), we assess model cloud simulation performance based on how well the simplified CRs are simulated in terms of similarity of centroids, global values and map correlations of relative frequency of occurrence, and long-term total cloud amounts. Mirroring prior results, modeled clouds tend to be too optically thick and not as extensive as in observations. CRs with high-altitude clouds from storm activity are not as well simulated here compared to the previous study, but other regimes containing near-overcast low clouds show improvement. Models that performed well in the companion paper against CRs defined by joint τ-p_c histograms distinguish themselves again here, but improvements for previously underperforming models are also seen. Averaging across models does not yield a drastically better picture, except for cloud geographical locations. Cloud evaluation with simplified regimes thus seems more forgiving than that using histogram-based CRs, while still being strict enough to reveal model weaknesses.

  10. From large-eddy simulation to multi-UAVs sampling of shallow cumulus clouds

    NASA Astrophysics Data System (ADS)

    Lamraoui, Fayçal; Roberts, Greg; Burnet, Frédéric

    2016-04-01

    In-situ sampling of clouds that can provide simultaneous measurements at spatio-temporal resolutions sufficient to capture small-scale 3D physical processes continues to present challenges. This project (SKYSCANNER) aims at developing cloud sampling strategies that use a swarm of unmanned aerial vehicles (UAVs) guided by large-eddy simulation (LES). Multi-UAV field campaigns with a personalized sampling strategy for individual clouds and cloud fields will significantly improve the understanding of unresolved cloud physical processes. An extensive set of LES experiments for case studies from the ARM-SGP site has been performed using the MesoNH model at high resolutions, down to 10 m. These simulations led to the establishment of a macroscopic model that quantifies the interrelationship between the micro- and macrophysical properties of shallow convective clouds. Both the geometry and the evolution of individual clouds are critical to multi-UAV cloud sampling and path planning. The preliminary findings of the project reveal several linear relationships that relate cloud geometric parameters to cloud-related meteorological variables. In addition, the horizontal wind speed has a proportional impact on cloud number concentration as well as on triggering and prolonging the occurrence of cumulus clouds. In the framework of a joint collaboration involving a multidisciplinary team (including institutes specializing in aviation, robotics and atmospheric science), this model will be a reference point for multi-UAV sampling strategies and path planning.

  11. Cloud Macroscopic Organization: Order Emerging from Randomness

    NASA Technical Reports Server (NTRS)

    Yuan, Tianle

    2011-01-01

    Clouds play a central role in many aspects of the climate system, and their forms and shapes are remarkably diverse. Appropriate representation of clouds in climate models is a major challenge because cloud processes span at least eight orders of magnitude in spatial scale. Here we show that there exists order in the cloud size distribution of low-level clouds, and that it follows a power-law distribution with exponent gamma close to 2. Gamma is insensitive to yearly variations in environmental conditions, but has regional variations and land-ocean contrasts. More importantly, we demonstrate that this self-organizing behavior of clouds emerges naturally from a complex network model with simple, physical organizing principles: random clumping and merging. We also demonstrate a symmetry between clear and cloudy skies in terms of macroscopic organization, owing to similar fundamental underlying organizing principles. The order in the apparently complex cloud-clear field thus has its root in random local interactions. Studying cloud organization with complex network models is an attractive new approach with wide applications in climate science. We also propose the concept of a cloud statistical mechanics approach. This approach is fully complementary to deterministic models, and, working in tandem, the two approaches provide a powerful framework to meet the challenge of representing clouds in our climate models.
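
    A common way to estimate such a power-law exponent from size data is the continuous maximum-likelihood estimator gamma_hat = 1 + n / sum(ln(s_i / s_min)). The sketch below applies it to synthetic sizes drawn from a power law with exponent 2; the data are simulated and are not the study's cloud sizes.

      import numpy as np

      def powerlaw_exponent_mle(sizes, s_min):
          """Continuous MLE for the exponent of p(s) ~ s^(-gamma), s >= s_min."""
          s = np.asarray(sizes, dtype=float)
          s = s[s >= s_min]
          return 1.0 + len(s) / np.sum(np.log(s / s_min))

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          # Draw synthetic "cloud sizes" from a power law with gamma = 2 via inverse sampling.
          u = rng.random(50000)
          sizes = 1.0 * (1.0 - u) ** (-1.0 / (2.0 - 1.0))
          print(round(powerlaw_exponent_mle(sizes, s_min=1.0), 3))  # close to 2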

  12. cOSPREY: A Cloud-Based Distributed Algorithm for Large-Scale Computational Protein Design

    PubMed Central

    Pan, Yuchao; Dong, Yuxi; Zhou, Jingtian; Hallen, Mark; Donald, Bruce R.; Xu, Wei

    2016-01-01

    Finding the global minimum energy conformation (GMEC) in a huge combinatorial search space is the key challenge in computational protein design (CPD). Traditional algorithms lack a scalable and efficient distributed design scheme, preventing researchers from taking full advantage of current cloud infrastructures. We design cloud OSPREY (cOSPREY), an extension to the widely used protein design software OSPREY, to allow the original design framework to scale to commercial cloud infrastructures. We propose several novel designs integrating both algorithm and system optimizations, such as GMEC-specific pruning, state search partitioning, asynchronous algorithm state sharing, and fault tolerance. We evaluated cOSPREY on three different cloud platforms using different technologies and show that it can solve a number of large-scale protein design problems that had not been possible with previous approaches. PMID:27154509

  13. Hydrodynamics and Water Quality forecasting over a Cloud Computing environment: INDIGO-DataCloud

    NASA Astrophysics Data System (ADS)

    Aguilar Gómez, Fernando; de Lucas, Jesús Marco; García, Daniel; Monteoliva, Agustín

    2017-04-01

    Algal blooms due to eutrophication are a widespread problem for water reservoirs and lakes and directly impact water quality. A bloom can create a dead zone that lacks enough oxygen to support life, and it can also be harmful to humans, so it must be controlled in water bodies used for supply, bathing or other purposes. Hydrodynamic and water quality modelling can contribute to forecasting the status of the water system in order to alert authorities before an algal bloom event occurs. It can be used to predict scenarios and find solutions that reduce the harmful impact of blooms. High-resolution models need to process a large amount of data using a sufficiently robust computing infrastructure. INDIGO-DataCloud (https://www.indigo-datacloud.eu/) is a European Commission funded project that aims at developing a data and computing platform targeting scientific communities, deployable on multiple hardware platforms and provisioned over hybrid (private or public) e-infrastructures. The project addresses the development of solutions for different case studies using different Cloud-based alternatives. In the first INDIGO software release, a set of components is ready to manage the deployment of services to perform any number of Delft3D simulations (for calibration or scenario definition) over a Cloud computing environment, using Docker technology: TOSCA requirement descriptions, a Docker repository, an Orchestrator, AAI (Authorization, Authentication) and OneData (a distributed storage system). Moreover, the Future Gateway portal, based on Liferay, provides a user-friendly interface where the user can configure the simulations. Owing to the data-oriented approach of INDIGO, the developed solutions can help manage the full data life cycle of a project, thanks to different tools for managing datasets and metadata. Furthermore, the cloud environment provides a dynamic, scalable and easy-to-use framework for non-IT expert users. This framework is potentially capable of automating the processing of forecasts via periodic tasks. For instance, a user can forecast the hydrodynamics and water quality status of a reservoir every month, starting from a base model and supplying new data gathered from instrumentation or observations. This interactive presentation aims to show the use of INDIGO solutions in a particular forecasting use case and to inspire others to use a Cloud framework for their applications.
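
    As a rough illustration of launching several containerized simulation runs, the sketch below uses the Docker SDK for Python. The image name, command arguments and volume paths are placeholders, and the actual INDIGO deployment is driven by TOSCA descriptions and an orchestrator rather than by a local script like this.

      import docker

      def launch_runs(image, n_runs, host_data_dir):
          """Start n_runs detached containers, each computing one simulation scenario."""
          client = docker.from_env()
          containers = []
          for i in range(n_runs):
              containers.append(
                  client.containers.run(
                      image,
                      command=["run_scenario", str(i)],   # placeholder entrypoint arguments
                      volumes={host_data_dir: {"bind": "/data", "mode": "rw"}},
                      detach=True,
                      name=f"delft3d-scenario-{i}",
                  )
              )
          return containers

      if __name__ == "__main__":
          # Placeholder image name; a real setup would pull a Delft3D image from a registry.
          launch_runs("example/delft3d:latest", n_runs=3, host_data_dir="/tmp/model-data")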

  14. Mobile-Cloud Assisted Video Summarization Framework for Efficient Management of Remote Sensing Data Generated by Wireless Capsule Sensors

    PubMed Central

    Mehmood, Irfan; Sajjad, Muhammad; Baik, Sung Wook

    2014-01-01

    Wireless capsule endoscopy (WCE) has great advantages over traditional endoscopy because it is portable and easy to use, especially in remote monitoring health-services. However, during the WCE process, the large amount of captured video data demands a significant deal of computation to analyze and retrieve informative video frames. In order to facilitate efficient WCE data collection and browsing task, we present a resource- and bandwidth-aware WCE video summarization framework that extracts the representative keyframes of the WCE video contents by removing redundant and non-informative frames. For redundancy elimination, we use Jeffrey-divergence between color histograms and inter-frame Boolean series-based correlation of color channels. To remove non-informative frames, multi-fractal texture features are extracted to assist the classification using an ensemble-based classifier. Owing to the limited WCE resources, it is impossible for the WCE system to perform computationally intensive video summarization tasks. To resolve computational challenges, mobile-cloud architecture is incorporated, which provides resizable computing capacities by adaptively offloading video summarization tasks between the client and the cloud server. The qualitative and quantitative results are encouraging and show that the proposed framework saves information transmission cost and bandwidth, as well as the valuable time of data analysts in browsing remote sensing data. PMID:25225874
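
    A minimal sketch of the Jeffrey-divergence comparison between two color histograms used for redundancy elimination could look like the following. The histogram bins and frame values are made up; the framework above applies this kind of comparison per color channel on WCE frames.

      import numpy as np

      def jeffrey_divergence(h, k, eps=1e-12):
          """Jeffrey divergence between two (normalized) histograms h and k."""
          h = np.asarray(h, dtype=float) + eps
          k = np.asarray(k, dtype=float) + eps
          h, k = h / h.sum(), k / k.sum()
          m = 0.5 * (h + k)
          return float(np.sum(h * np.log(h / m) + k * np.log(k / m)))

      if __name__ == "__main__":
          # Two hypothetical 8-bin color histograms from consecutive frames.
          frame_a = [12, 30, 25, 10, 8, 5, 6, 4]
          frame_b = [11, 29, 27, 10, 7, 6, 6, 4]
          # A small divergence suggests the frames are redundant and one can be dropped.
          print(round(jeffrey_divergence(frame_a, frame_b), 4))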

  15. 3D Reconstruction of Space Objects from Multi-Views by a Visible Sensor

    PubMed Central

    Zhang, Haopeng; Wei, Quanmao; Jiang, Zhiguo

    2017-01-01

    In this paper, a novel 3D reconstruction framework is proposed to recover the 3D structural model of a space object from its multi-view images captured by a visible sensor. Given an image sequence, this framework first estimates the relative camera poses and recovers the depths of the surface points by the structure from motion (SFM) method, then the patch-based multi-view stereo (PMVS) algorithm is utilized to generate a dense 3D point cloud. To resolve the wrong matches arising from the symmetric structure and repeated textures of space objects, a new strategy is introduced in which images are added to SFM in imaging order. Meanwhile, a refining process exploiting the structural prior knowledge that most sub-components of artificial space objects are composed of basic geometric shapes is proposed and applied to the recovered point cloud. The proposed reconstruction framework is tested on both simulated and real image datasets. Experimental results illustrate that the recovered point cloud models of space objects are accurate and have complete coverage of the surface. Moreover, outliers and points with severe noise are effectively filtered out by the refinement, resulting in a distinct improvement of the structure and visualization of the recovered points. PMID:28737675

  16. Mobile-cloud assisted video summarization framework for efficient management of remote sensing data generated by wireless capsule sensors.

    PubMed

    Mehmood, Irfan; Sajjad, Muhammad; Baik, Sung Wook

    2014-09-15

    Wireless capsule endoscopy (WCE) has great advantages over traditional endoscopy because it is portable and easy to use, especially for remote-monitoring health services. However, during the WCE process, the large amount of captured video data demands a significant amount of computation to analyze and retrieve informative video frames. In order to facilitate efficient WCE data collection and browsing tasks, we present a resource- and bandwidth-aware WCE video summarization framework that extracts the representative keyframes of the WCE video contents by removing redundant and non-informative frames. For redundancy elimination, we use the Jeffrey divergence between color histograms and inter-frame Boolean series-based correlation of color channels. To remove non-informative frames, multi-fractal texture features are extracted to assist classification using an ensemble-based classifier. Owing to the limited WCE resources, it is impossible for the WCE system to perform computationally intensive video summarization tasks. To resolve these computational challenges, a mobile-cloud architecture is incorporated, which provides resizable computing capacities by adaptively offloading video summarization tasks between the client and the cloud server. The qualitative and quantitative results are encouraging and show that the proposed framework saves information transmission cost and bandwidth, as well as the valuable time of data analysts in browsing remote sensing data.

  17. Using MODIS Cloud Regimes to Sort Diagnostic Signals of Aerosol-Cloud-Precipitation Interactions

    PubMed Central

    Oreopoulos, Lazaros; Cho, Nayeong; Lee, Dongmin

    2018-01-01

    Coincident multi-year measurements of aerosol, cloud, precipitation and radiation at near-global scales are analyzed to diagnose their apparent relationships, suggestive of interactions previously proposed on the basis of theoretical, observational, and model constructs. Specifically, we examine whether differences in aerosol loading in separate observations go along with consistently different precipitation, cloud properties, and cloud radiative effects. Our analysis uses a cloud regime (CR) framework to dissect and sort the results. The CRs come from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor and are defined as distinct groups of cloud systems with similar co-variations of cloud top pressure and cloud optical thickness. Aerosol optical depth, used as a proxy for aerosol loading, comes from two sources, MODIS observations and the MERRA-2 re-analysis, and its variability is defined with respect to local seasonal climatologies. The choice of aerosol dataset impacts our results substantially. We also find that the responses of the marine and continental components of a CR are frequently quite disparate. Overall, CRs dominated by warm clouds tend to exhibit less ambiguous signals, but also have more uncertainty with regard to precipitation changes. Finally, we find weak, but occasionally systematic, co-variations of select meteorological indicators and aerosol, which serves as a sober reminder that ascribing changes in cloud and cloud-affected variables solely to aerosol variations is precarious. PMID:29651373

  18. Using MODIS Cloud Regimes to Sort Diagnostic Signals of Aerosol-Cloud-Precipitation Interactions.

    PubMed

    Oreopoulos, Lazaros; Cho, Nayeong; Lee, Dongmin

    2017-05-27

    Coincident multi-year measurements of aerosol, cloud, precipitation and radiation at near-global scales are analyzed to diagnose their apparent relationships, suggestive of interactions previously proposed on the basis of theoretical, observational, and model constructs. Specifically, we examine whether differences in aerosol loading in separate observations go along with consistently different precipitation, cloud properties, and cloud radiative effects. Our analysis uses a cloud regime (CR) framework to dissect and sort the results. The CRs come from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor and are defined as distinct groups of cloud systems with similar co-variations of cloud top pressure and cloud optical thickness. Aerosol optical depth, used as a proxy for aerosol loading, comes from two sources, MODIS observations and the MERRA-2 re-analysis, and its variability is defined with respect to local seasonal climatologies. The choice of aerosol dataset impacts our results substantially. We also find that the responses of the marine and continental components of a CR are frequently quite disparate. Overall, CRs dominated by warm clouds tend to exhibit less ambiguous signals, but also have more uncertainty with regard to precipitation changes. Finally, we find weak, but occasionally systematic, co-variations of select meteorological indicators and aerosol, which serves as a sober reminder that ascribing changes in cloud and cloud-affected variables solely to aerosol variations is precarious.

  19. Towards large-scale data analysis: challenges in the design of portable systems and use of Cloud computing.

    PubMed

    Diaz, Javier; Arrizabalaga, Saioa; Bustamante, Paul; Mesa, Iker; Añorga, Javier; Goya, Jon

    2013-01-01

    Portable systems and global communications open a broad spectrum of new health applications. In the framework of electrophysiological applications, several challenges are faced when developing portable systems embedded in Cloud computing services. To help new developers in this area, and based on our experience, this paper presents five areas of interest where strategies can be applied to improve the performance of portable systems: transducers and conditioning, processing, wireless communications, battery, and power management. Likewise, for Cloud services, scalability, portability, privacy and security guidelines are highlighted.

  20. Clouds at Barbados are representative of clouds across the trade wind regions in observations and climate models.

    PubMed

    Medeiros, Brian; Nuijens, Louise

    2016-05-31

    Trade wind regions cover most of the tropical oceans, and the prevailing cloud type is shallow cumulus. These small clouds are parameterized by climate models, and changes in their radiative effects strongly and directly contribute to the spread in estimates of climate sensitivity. This study investigates the structure and variability of these clouds in observations and climate models. The study builds upon recent detailed model evaluations using observations from the island of Barbados. Using a dynamical regimes framework, satellite and reanalysis products are used to compare the Barbados region and the broader tropics. It is shown that clouds in the Barbados region are similar to those across the trade wind regions, implying that observational findings from the Barbados Cloud Observatory are relevant to clouds across the tropics. The same methods are applied to climate models to evaluate the simulated clouds. The models generally capture the cloud radiative effect, but underestimate cloud cover and show an array of cloud vertical structures. Some models show strong biases in the environment of the Barbados region in summer, weakening the connection between the regional biases and those across the tropics. Even bearing that limitation in mind, it is shown that covariations of cloud and environmental properties in the models are inconsistent with observations. The models tend to misrepresent sensitivity to moisture variations and inversion characteristics. These model errors are likely connected to cloud feedback in climate projections, and highlight the importance of the representation of shallow cumulus convection.

  1. Clouds at Barbados are representative of clouds across the trade wind regions in observations and climate models

    PubMed Central

    Nuijens, Louise

    2016-01-01

    Trade wind regions cover most of the tropical oceans, and the prevailing cloud type is shallow cumulus. These small clouds are parameterized by climate models, and changes in their radiative effects strongly and directly contribute to the spread in estimates of climate sensitivity. This study investigates the structure and variability of these clouds in observations and climate models. The study builds upon recent detailed model evaluations using observations from the island of Barbados. Using a dynamical regimes framework, satellite and reanalysis products are used to compare the Barbados region and the broader tropics. It is shown that clouds in the Barbados region are similar to those across the trade wind regions, implying that observational findings from the Barbados Cloud Observatory are relevant to clouds across the tropics. The same methods are applied to climate models to evaluate the simulated clouds. The models generally capture the cloud radiative effect, but underestimate cloud cover and show an array of cloud vertical structures. Some models show strong biases in the environment of the Barbados region in summer, weakening the connection between the regional biases and those across the tropics. Even bearing that limitation in mind, it is shown that covariations of cloud and environmental properties in the models are inconsistent with observations. The models tend to misrepresent sensitivity to moisture variations and inversion characteristics. These model errors are likely connected to cloud feedback in climate projections, and highlight the importance of the representation of shallow cumulus convection. PMID:27185925

  2. NOAA's National Air Quality Prediction and Development of Aerosol and Atmospheric Composition Prediction Components for NGGPS

    NASA Astrophysics Data System (ADS)

    Stajner, I.; McQueen, J.; Lee, P.; Stein, A. F.; Wilczak, J. M.; Upadhayay, S.; daSilva, A.; Lu, C. H.; Grell, G. A.; Pierce, R. B.

    2017-12-01

    NOAA's operational air quality predictions of ozone, fine particulate matter (PM2.5) and wildfire smoke over the United States, and of airborne dust over the contiguous 48 states, are distributed at http://airquality.weather.gov. The National Air Quality Forecast Capability (NAQFC) providing these predictions was updated in June 2017. Ozone and PM2.5 predictions are now produced using a system linking the Community Multiscale Air Quality model (CMAQ) version 5.0.2 with meteorological inputs from the North American Mesoscale Forecast System (NAM) version 4. Predictions of PM2.5 include intermittent dust emissions and wildfire emissions from an updated version of the BlueSky system. For the latter, the CMAQ system is initialized by rerunning it over the previous 24 hours to include wildfire emissions at the times when they were observed from satellites. Post-processing to reduce the bias in PM2.5 predictions was updated using the Kalman filter analog (KFAN) technique. Dust-related aerosol species at the CMAQ domain lateral boundaries now come from the NEMS Global Aerosol Component (NGAC) v2 predictions. Further development of NAQFC includes testing of CMAQ predictions out to 72 hours, Canadian fire emissions data from Environment and Climate Change Canada (ECCC), and the KFAN technique to reduce bias in ozone predictions. NOAA is developing the Next Generation Global Prediction System (NGGPS) with an aerosol and gaseous atmospheric composition component to improve and integrate aerosol and ozone predictions and evaluate their impacts on physics, data assimilation and weather prediction. Efforts are underway to improve cloud microphysics, investigate aerosol effects, and include representations of atmospheric composition of varying complexity in NGGPS: from the operational ozone parameterization and GOCART aerosols with simplified ozone chemistry, to CMAQ chemistry with aerosol modules. We will present progress on community building, planning and development of NGGPS.

  3. Cybersecurity Protection: Design Science Research toward an Intercloud Transparent Bridge Architecture (ITCOBRA)

    ERIC Educational Resources Information Center

    Wilson, Joe M.

    2013-01-01

    This dissertation uses design science research and engineering to develop a cloud-based simulator for modeling next-generation cybersecurity protection frameworks in the United States. The claim is made that an agile and neutral framework extending throughout the cyber-threat plane is needed for critical infrastructure protection (CIP). This…

  4. INDIGO: Building a DataCloud Framework to support Open Science

    NASA Astrophysics Data System (ADS)

    Chen, Yin; de Lucas, Jesus Marco; Aguilar, Fenando; Fiore, Sandro; Rossi, Massimiliano; Ferrari, Tiziana

    2016-04-01

    New solutions are required to support Data Intensive Science in the emerging panorama of e-infrastructures, including Grid, Cloud and HPC services. The architecture proposed by the INDIGO-DataCloud (INtegrating Distributed data Infrastructures for Global ExplOitation) (https://www.indigo-datacloud.eu/) H2020 project, provides the path to integrate IaaS resources and PaaS platforms to provide SaaS solutions, while satisfying the requirements posed by different Research Communities, including several in Earth Science. This contribution introduces the INDIGO DataCloud architecture, describes the methodology followed to assure the integration of the requirements from different research communities, including examples like ENES, LifeWatch or EMSO, and how they will build their solutions using different INDIGO components.

  5. The Ophidia framework: toward cloud-based data analytics for climate change

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni

    2015-04-01

    The Ophidia project is a research effort on big data analytics addressing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in NetCDF format, compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases, from the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects, will be presented during the talk. In the former (EUBrazilCC), the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter (OFIDIA), the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among other things, parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster. Moreover, a cloud-based release, tested with OpenNebula, is also available and running in the private cloud infrastructure of the CMCC Supercomputing Centre.

  6. Predicting the diurnal blue-sky albedo of soils using their laboratory reflectance spectra and roughness indices

    NASA Astrophysics Data System (ADS)

    Cierniewski, Jerzy; Ceglarek, Jakub; Karnieli, Arnon; Królewicz, Sławomir; Kaźmierowski, Cezary; Zagajewski, Bogdan

    2017-10-01

    The objective of this study was to assess the relationship between the hyperspectral reflectance of soils and their albedo, measured under various roughness conditions. 108 soil surface measurements were conducted in Poland and Israel. Each surface was characterised by its diurnal albedo variation in the field as well as by its reflectance spectra obtained in the laboratory. The best model fit was achieved by post-processing manipulation of the spectra, namely a second-derivative transformation. Using a stepwise elimination process, four spectral wavelengths and the roughness index were selected for modelling. The resulting models allow the albedo of a soil to be predicted for its different roughness states and any solar zenith angle, provided that hyperspectral reflectance data are available.

  7. The ESA Cloud CCI project: Generation of Multi Sensor consistent Cloud Properties with an Optimal Estimation Based Retrieval Algorithm

    NASA Astrophysics Data System (ADS)

    Jerg, M.; Stengel, M.; Hollmann, R.; Poulsen, C.

    2012-04-01

    The ultimate objective of the ESA Climate Change Initiative (CCI) Cloud project is to provide long-term coherent cloud property data sets exploiting and improving on the synergetic capabilities of past, existing, and upcoming European and American satellite missions. The synergetic approach allows not only for improved accuracy and extended temporal and spatial sampling of retrieved cloud properties, beyond what single instruments alone can provide, but potentially also for improved (inter-)calibration and enhanced homogeneity and stability of the derived time series. Such advances are required by the scientific community to facilitate further progress in satellite-based climate monitoring, leading to a better understanding of climate. Some of the primary objectives of ESA Cloud CCI are (1) the development of inter-calibrated radiance data sets, so-called Fundamental Climate Data Records, for ESA and non-ESA instruments through an international collaboration, (2) the development of an optimal estimation based retrieval framework for cloud-related essential climate variables such as cloud cover, cloud top height and temperature, and liquid and ice water path, and (3) the development of two multi-annual global data sets for the mentioned cloud properties, including uncertainty estimates. These two data sets are characterized by different combinations of satellite systems: the AVHRR heritage product, comprising (A)ATSR, AVHRR and MODIS, and the novel (A)ATSR-MERIS product, which is based on a synergetic retrieval using both instruments. Both datasets cover the years 2007-2009 in the first project phase. ESA Cloud CCI will also carry out a comprehensive validation of the cloud property products and provide a common database, as in the framework of the Global Energy and Water Cycle Experiment (GEWEX). The presentation will give an overview of the ESA Cloud CCI project, its goals and approaches, and then continue with results from the Round Robin algorithm comparison exercise, involving three algorithms, that was carried out at the beginning of the project. The purpose of the exercise was to assess and compare existing cloud retrieval algorithms in order to choose one of them as the backbone of the retrieval system, and also to identify areas of potential improvement and general strengths and weaknesses of the algorithms. Furthermore, the presentation will elaborate on the optimal estimation algorithm subsequently chosen to derive the heritage product, which is presently being further developed and will be employed for the AVHRR heritage product. The algorithm's capability to coherently and simultaneously process all radiative input and yield retrieval parameters together with associated uncertainty estimates will be presented, together with first results for the heritage product. In the course of the project the algorithm is being developed into a freely and publicly available community retrieval system for interested scientists.

  8. Cloud Computing for Protein-Ligand Binding Site Comparison

    PubMed Central

    2013-01-01

    The proteome-wide analysis of protein-ligand binding sites and their interactions with ligands is important in structure-based drug design and in understanding ligand cross-reactivity and toxicity. The well-known and commonly used software SMAP has been designed for 3D ligand binding site comparison and similarity searching of a structural proteome. SMAP can also predict drug side effects and reassign existing drugs to new indications. However, the computing scale of SMAP is limited. We have developed a high-availability, high-performance system that expands the comparison scale of SMAP. This cloud computing service, called Cloud-PLBS, combines the SMAP and Hadoop frameworks and is deployed on a virtual cloud computing platform. To handle the vast amount of experimental data on protein-ligand binding site pairs, Cloud-PLBS exploits the MapReduce paradigm as a management and parallelizing tool. Cloud-PLBS provides a web portal and scalability through which biologists can address a wide range of computationally intensive questions in biology and drug discovery. PMID:23762824

  9. Cloud computing for protein-ligand binding site comparison.

    PubMed

    Hung, Che-Lun; Hua, Guan-Jie

    2013-01-01

    The proteome-wide analysis of protein-ligand binding sites and their interactions with ligands is important in structure-based drug design and in understanding ligand cross-reactivity and toxicity. The well-known and commonly used software SMAP has been designed for 3D ligand binding site comparison and similarity searching of a structural proteome. SMAP can also predict drug side effects and reassign existing drugs to new indications. However, the computing scale of SMAP is limited. We have developed a high-availability, high-performance system that expands the comparison scale of SMAP. This cloud computing service, called Cloud-PLBS, combines the SMAP and Hadoop frameworks and is deployed on a virtual cloud computing platform. To handle the vast amount of experimental data on protein-ligand binding site pairs, Cloud-PLBS exploits the MapReduce paradigm as a management and parallelizing tool. Cloud-PLBS provides a web portal and scalability through which biologists can address a wide range of computationally intensive questions in biology and drug discovery.
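
    To give a flavor of the MapReduce decomposition, the sketch below shows a Hadoop-streaming-style mapper and reducer in Python that distribute pairwise binding-site comparisons. The site identifiers and the similarity function are placeholders; Cloud-PLBS itself invokes SMAP for the actual comparison.

      import sys

      def similarity(site_a, site_b):
          """Placeholder for the real binding-site comparison (SMAP in Cloud-PLBS)."""
          return float(len(set(site_a) & set(site_b))) / max(len(set(site_a) | set(site_b)), 1)

      def mapper(lines):
          # Each input line holds one precomputed pair: "siteA<TAB>siteB".
          for line in lines:
              a, b = line.rstrip("\n").split("\t")
              print(f"{a}|{b}\t{similarity(a, b):.4f}")

      def reducer(lines):
          # With one score per pair, the reducer simply passes results through.
          for line in lines:
              print(line.rstrip("\n"))

      if __name__ == "__main__":
          role = sys.argv[1] if len(sys.argv) > 1 else "map"
          (mapper if role == "map" else reducer)(sys.stdin)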

  10. GCSS/WGNE Pacific Cross-section Intercomparison: Tropical and Subtropical Cloud Transitions

    NASA Astrophysics Data System (ADS)

    Teixeira, J.

    2008-12-01

    In this presentation I will discuss the role of the GEWEX Cloud System Study (GCSS) working groups in paving the way for substantial improvements in cloud parameterization in weather and climate models. The GCSS/WGNE Pacific Cross-section Intercomparison (GPCI) is an extension of GCSS and a different type of model evaluation, in which climate models are analyzed along a Pacific Ocean transect from California to the equator. This approach aims to complement the more traditional efforts in GCSS by providing a simple framework for the evaluation of models that encompasses several fundamental cloud regimes, such as stratocumulus, shallow cumulus and deep cumulus, as well as the transitions between them. Currently, twenty-four climate and weather prediction models are participating in GPCI. We will present results of the comparison between models and recent satellite data. In particular, we will explore in detail the potential of the Atmospheric Infrared Sounder (AIRS) and CloudSat data for evaluating the representation of clouds and convection in climate models.

  11. Fine-grained Database Field Search Using Attribute-Based Encryption for E-Healthcare Clouds.

    PubMed

    Guo, Cheng; Zhuang, Ruhan; Jie, Yingmo; Ren, Yizhi; Wu, Ting; Choo, Kim-Kwang Raymond

    2016-11-01

    An effectively designed e-healthcare system can significantly enhance the quality of access and experience of healthcare users, including facilitating medical and healthcare providers in ensuring a smooth delivery of services. Ensuring the security of patients' electronic health records (EHRs) in the e-healthcare system is an active research area. EHRs may be outsourced to a third party, such as a community healthcare cloud service provider, for storage as a cost-saving measure. Generally, encrypting the EHRs when they are stored in the system (i.e. data-at-rest) or prior to outsourcing the data is used to ensure data confidentiality. A searchable encryption (SE) scheme is a promising technique that can ensure the protection of private information without compromising performance. In this paper, we propose a novel framework for controlling access to EHRs stored in semi-trusted cloud servers (e.g. a private cloud or a community cloud). To achieve fine-grained access control for EHRs, we leverage the ciphertext-policy attribute-based encryption (CP-ABE) technique to encrypt tables published by hospitals, including patients' EHRs, and the table is stored in the database with the primary key being the patient's unique identity. Our framework enables different users with different privileges to search on different database fields. Differing from previous attempts to secure the outsourcing of data, we emphasize control over searches of the fields within the database. We demonstrate the utility of the scheme by evaluating it using datasets from the University of California, Irvine.
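
    The access-control intent can be pictured with a much-simplified, non-cryptographic sketch: a user's attribute set is checked against a per-field policy before a search on that field is allowed. The attributes, fields and policies below are hypothetical, and the actual framework enforces this via CP-ABE rather than a plaintext check like this.

      def satisfies(user_attributes, policy):
          """Policy is a list of attribute sets; any one satisfied set grants access (OR of ANDs)."""
          return any(required.issubset(user_attributes) for required in policy)

      # Hypothetical per-field policies for an EHR table.
      FIELD_POLICIES = {
          "diagnosis":   [{"doctor"}, {"nurse", "cardiology"}],
          "billing":     [{"accountant"}],
          "lab_results": [{"doctor"}, {"lab_technician"}],
      }

      def searchable_fields(user_attributes):
          """Return the database fields this user is allowed to search."""
          return [f for f, p in FIELD_POLICIES.items() if satisfies(user_attributes, p)]

      if __name__ == "__main__":
          print(searchable_fields({"nurse", "cardiology"}))  # ['diagnosis']
          print(searchable_fields({"doctor"}))               # ['diagnosis', 'lab_results']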

  12. Toward a Proof of Concept Cloud Framework for Physics Applications on Blue Gene Supercomputers

    NASA Astrophysics Data System (ADS)

    Dreher, Patrick; Scullin, William; Vouk, Mladen

    2015-09-01

    Traditional high performance supercomputers are capable of delivering large sustained state-of-the-art computational resources to physics applications over extended periods of time using batch processing mode operating environments. However, today there is an increasing demand for more complex workflows that involve large fluctuations in the levels of HPC physics computational requirements during the simulations. Some of the workflow components may also require a richer set of operating system features and schedulers than normally found in a batch oriented HPC environment. This paper reports on progress toward a proof of concept design that implements a cloud framework onto BG/P and BG/Q platforms at the Argonne Leadership Computing Facility. The BG/P implementation utilizes the Kittyhawk utility and the BG/Q platform uses an experimental heterogeneous FusedOS operating system environment. Both platforms use the Virtual Computing Laboratory as the cloud computing system embedded within the supercomputer. This proof of concept design allows a cloud to be configured so that it can capitalize on the specialized infrastructure capabilities of a supercomputer and the flexible cloud configurations without resorting to virtualization. Initial testing of the proof of concept system is done using the lattice QCD MILC code. These types of user reconfigurable environments have the potential to deliver experimental schedulers and operating systems within a working HPC environment for physics computations that may be different from the native OS and schedulers on production HPC supercomputers.

  13. Modern and prospective technologies for weather modification activities: Developing a framework for integrating autonomous unmanned aircraft systems

    NASA Astrophysics Data System (ADS)

    DeFelice, T. P.; Axisa, Duncan

    2017-09-01

    This paper builds upon the processes and framework already established for identifying, integrating and testing an unmanned aircraft system (UAS) with sensing technology for use in rainfall enhancement cloud seeding programs, whether to carry out operational activities or to monitor and evaluate seeding operations. We describe the development and assessment methodologies of an autonomous and adaptive UAS platform that utilizes in-situ real-time data to sense, target and implement seeding. The development of a UAS platform that utilizes remote and in-situ real-time data to sense, target and implement seeding, deployed with a companion UAS, ensures optimal, safe, secure, cost-effective seeding operations and the dataset needed to quantify the results of seeding. It also sets the path for an innovative, paradigm-shifting approach to enhancing precipitation independent of seeding mode. UAS technology is improving, and its application in weather modification must be explored to lay the foundation for future implementation. The broader significance lies in evolving improved technology and automating cloud seeding operations, which lowers the operational footprint of cloud seeding and optimizes its effectiveness and efficiency, while providing the temporal and spatial sensitivities needed to overcome the limited predictability or sparseness of the environmental parameters used to identify conditions suitable for seeding and to determine how seeding might be implemented. The dataset from the featured approach will contain data from concurrent Eulerian and Lagrangian perspectives over sub-cloud scales, facilitating the development of cloud seeding decision support tools.

  14. Preliminary results of radiometric measurements of clear air and cloud brightness (antenna) temperatures at 37GHz

    NASA Astrophysics Data System (ADS)

    Arakelyan, A. K.; Hambaryan, A. K.; Arakelyan, A. A.

    2012-05-01

    In this paper, the results of polarization measurements of clear-air and cloud brightness temperatures at 37 GHz are presented. The results were obtained during measurements carried out in Armenia with the measuring complex built under the framework of ISTC Projects A-872 and A-1524. The measurements were carried out at vertical and horizontal polarizations, under various sensing angles, by a Ka-band combined scatterometric-radiometric system (ArtAr-37) developed and built by ECOSERV Remote Observation Centre Co. Ltd. under the framework of the above projects. Structural and operational features of the utilized system and the whole measuring complex are also considered and discussed.

  15. Cloud-Based Perception and Control of Sensor Nets and Robot Swarms

    DTIC Science & Technology

    2016-04-01

    distributed stream processing framework provides the necessary API and infrastructure to develop and execute such applications in a cluster of computation ... streaming DDDAS applications based on challenges they present to the backend Cloud control system ... state-of-the-art deep learning-based object detectors can recognize among hundreds of object classes and this capability would be very useful for mobile

  16. DESPIC: Detecting Early Signatures of Persuasion in Information Cascades

    DTIC Science & Technology

    2015-08-27

    ... over NoSQL Databases. Proceedings of the 14th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (CCGrid 2014), Chicago, IL, USA. ... distributed NoSQL databases including HBase and Riak, we finalized the requirements of the optimal computational architecture to support our framework

  17. Semantic Segmentation of Indoor Point Clouds Using Convolutional Neural Network

    NASA Astrophysics Data System (ADS)

    Babacan, K.; Chen, L.; Sohn, G.

    2017-11-01

    As Building Information Modelling (BIM) thrives, geometry alone is no longer sufficient; an ever-increasing variety of semantic information is needed to express an indoor model adequately. On the other hand, for existing buildings, automatically generating semantically enriched BIM from point cloud data is in its infancy. Previous research to enhance semantic content relies on frameworks in which specific rules and/or features are hand-coded by specialists. These methods inherently lack generalization and easily break in different circumstances. For this reason, a generalized framework is urgently needed to automatically and accurately generate semantic information. We therefore propose to employ deep learning techniques for the semantic segmentation of point clouds into meaningful parts. More specifically, we build a volumetric data representation in order to efficiently generate the high number of training samples needed to train a convolutional neural network architecture. Feedforward propagation is used to perform classification at the voxel level, achieving semantic segmentation. The method is tested both on a mobile laser scanner point cloud and on larger-scale synthetically generated data. We also demonstrate a case study in which our method can be effectively used to leverage the extraction of planar surfaces in challenging, cluttered indoor environments.
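
    A minimal sketch of the volumetric representation step, turning a point cloud into a binary occupancy grid that a 3D CNN can consume, might look like the following. The grid size and the random points are assumptions for illustration only.

      import numpy as np

      def voxelize(points, grid_size=32):
          """Map an (N, 3) point cloud to a binary occupancy grid of shape (grid_size,)*3."""
          pts = np.asarray(points, dtype=float)
          # Normalize coordinates into [0, 1) per axis, then scale to voxel indices.
          mins, maxs = pts.min(axis=0), pts.max(axis=0)
          scaled = (pts - mins) / np.maximum(maxs - mins, 1e-9)
          idx = np.minimum((scaled * grid_size).astype(int), grid_size - 1)
          grid = np.zeros((grid_size,) * 3, dtype=np.uint8)
          grid[idx[:, 0], idx[:, 1], idx[:, 2]] = 1
          return grid

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          cloud = rng.uniform(size=(5000, 3))   # toy indoor point cloud
          print(voxelize(cloud).sum(), "occupied voxels out of", 32 ** 3)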

  18. The Diurnal Cycle of Clouds and Precipitation at the ARM SGP Site: An Atmospheric State-Based Analysis and Error Decomposition of a Multiscale Modeling Framework Simulation

    NASA Astrophysics Data System (ADS)

    Zhao, Wei; Marchand, Roger; Fu, Qiang

    2017-12-01

    Long-term reflectivity data collected by a millimeter cloud radar at the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) site are used to examine the diurnal cycle of clouds and precipitation and are compared with the diurnal cycle simulated by a Multiscale Modeling Framework (MMF) climate model. The study uses a set of atmospheric states that were created specifically for the SGP and for the purpose of investigating under what synoptic conditions models compare well with observations on a statistical basis (rather than using case studies or seasonal or longer time scale averaging). Differences in the annual mean diurnal cycle between observations and the MMF are decomposed into differences due to the relative frequency of states, the daily mean vertical profile of hydrometeor occurrence, and the (normalized) diurnal variation of hydrometeors in each state. Here the hydrometeors are classified as cloud or precipitation based solely on the reflectivity observed by a millimeter radar or generated by a radar simulator. The results show that the MMF does not capture the diurnal variation of low clouds well in any of the states but does a reasonable job capturing the diurnal variations of high clouds and precipitation in some states. In particular, the diurnal variations in states that occur during summer are reasonably captured by the MMF, while the diurnal variations in states that occur during the transition seasons (spring and fall) are not well captured. Overall, the errors in the annual composite are due primarily to errors in the daily mean of hydrometeor occurrence (rather than diurnal variations), but errors in the state frequency (that is, the distribution of weather states in the model) also play a significant role.

  19. DeepSAT's CloudCNN: A Deep Neural Network for Rapid Cloud Detection from Geostationary Satellites

    NASA Astrophysics Data System (ADS)

    Kalia, S.; Li, S.; Ganguly, S.; Nemani, R. R.

    2017-12-01

    Cloud and cloud shadow detection has important applications in weather and climate studies. It is even more crucial when geostationary satellites are introduced into the field of terrestrial remote sensing. With the challenges associated with data acquired at very high frequency (10-15 min per scan), the ability to derive an accurate cloud/shadow mask from geostationary satellite data is critical. The key to the success of most existing algorithms is spatially and temporally varying thresholds, which better capture local atmospheric and surface effects. However, the selection of a proper threshold is difficult and may lead to erroneous results. In this work, we propose a deep neural network based approach called CloudCNN to classify cloud/shadow from Himawari-8 AHI and GOES-16 ABI multispectral data. DeepSAT's CloudCNN consists of an encoder-decoder based architecture for binary-class pixel-wise segmentation. We train CloudCNN on a multi-GPU Nvidia Devbox cluster and deploy the prediction pipeline on the NASA Earth Exchange (NEX) Pleiades supercomputer. We achieved an overall accuracy of 93.29% on test samples. Since the predictions take only a few seconds to segment a full multispectral GOES-16 or Himawari-8 Full Disk image, the developed framework can be used for real-time cloud detection, cyclone detection, or extreme weather event predictions.

  20. The evolution of nocturnal boundary-layer clouds in southern West Africa - a case study from DACCIWA

    NASA Astrophysics Data System (ADS)

    Adler, Bianca; Kalthoff, Norbert; Babić, Karmen; Lohou, Fabienne; Dione, Cheikh; Lothon, Marie; Pedruzo-Bagazgoitia, Xabier

    2017-04-01

    During the monsoon season, the atmospheric boundary layer in southern West Africa is characterised by various kinds of low-level clouds which experience a distinct diurnal cycle. During the night, extensive low-level stratiform clouds frequently form with a cloud base often less than a few hundred metres above ground. After sunrise the cloud base slowly starts rising and eventually a transition to convective clouds occurs. While the existence of the clouds is documented in satellite images and synoptic observations, little is known about the mechanisms controlling their evolution. To provide observational evidence, a field campaign was conducted in southern West Africa in June and July 2016 within the framework of the Dynamics-aerosol-chemistry-cloud interactions in West Africa (DACCIWA) project. Comprehensive ground-based in situ and remote sensing measurements were performed at three different supersites in Ghana, Benin and Nigeria. In this contribution, we present the diurnal cycle of boundary-layer clouds for a typical day using data from a supersite at Savè in Benin. Due to the synergy of various instruments, we are able to obtain detailed information on the evolution of the clouds as well as on the boundary-layer structure with high temporal and vertical resolution. By combining ceilometer, cloud radar and microwave radiometer data, we determined the cloud base, depth and density. The clouds form in the same layer as a nocturnal low-level jet (NLLJ), which we probe with a sodar and a UHF profiler. There is evidence for a strong link between the height and strength of the NLLJ and the density of the nocturnal clouds.

  1. NHI-PharmaCloud in Taiwan--A preliminary evaluation using the RE-AIM framework and lessons learned.

    PubMed

    Huang, San-Kuei; Wang, Pen-Jen; Tseng, Wen-Fuh; Syu, Fei-Kai; Lee, Miaw-Chwen; Shih, Ru-Liang; Sheen, Mao-Ting; Chen, Michael S

    2015-10-01

    The aim of this article is to present the preliminary impact of a medication monitoring program, PharmaCloud, in Taiwan and to analyze the embedded factors that have contributed to its performance. This article also compares PharmaCloud with similar international programs in order to draw lessons learned. The five domains of the RE-AIM framework - reach, effectiveness, adoption, implementation, and maintenance - were examined using qualitative and quantitative data. A difference-in-differences model was applied to analyze the quantitative impact of PharmaCloud on drug utilization and drug expenses. The qualitative impact was evaluated by document analysis based on field reports from the participating medical institutions. Reach and adoption: although all of the major hospitals adopted PharmaCloud and some of the hospitals had high inquiry rates, more time and incentives are needed to raise the overall inquiry rate. Effectiveness: during the 3-month study period, the number of medications per prescription declined by 0.15 more in the intervention group than in the general population, and the drug expense per person declined by NT$567 (US$18.9) more in the intervention group than in the general population. The potential savings could be between 2% and 5% of the total pharmaceutical expenditure. Medication duplication was also found to have decreased more in the intervention group. Implementation: a variety of innovations in care delivery are being developed in which pharmacists play a more significant role. Maintenance: the embedded National Health Insurance would lend strong support for PharmaCloud to grow and thrive. PharmaCloud owes its effectiveness to the embedded National Health Insurance (NHI) program, which is universal and provides a comprehensive benefit package including more than 16,000 prescription drugs. An effective medication program is one that operates under the principle of universality and comprehensiveness, facilitates innovation, and has a substantial level of interoperability with intra-hospital health information systems. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
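
    The difference-in-differences comparison described above can be illustrated with a minimal sketch; the group means below are hypothetical numbers, not the study's data.

```python
# Hypothetical group means of drug expense per person (NT$),
# before and after PharmaCloud inquiries began.
intervention_before, intervention_after = 5200.0, 4500.0
control_before, control_after = 5100.0, 4967.0

# Difference-in-differences: the change in the intervention group
# minus the change in the comparison (general) population.
did = (intervention_after - intervention_before) - (control_after - control_before)
print(f"DiD estimate of the programme effect: {did:.0f} NT$ per person")
```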

  2. Lagrangian Particle Tracking Simulation for Warm-Rain Processes in Quasi-One-Dimensional Domain

    NASA Astrophysics Data System (ADS)

    Kunishima, Y.; Onishi, R.

    2017-12-01

    Conventional cloud simulations are based on the Euler method and compute each microphysics process in a stochastic way, assuming an infinite number of particles within each numerical grid cell. They therefore cannot provide Lagrangian statistics of individual particles in cloud microphysics (i.e., aerosol particles, cloud particles, and rain drops), nor can they address the statistical fluctuations due to finite numbers of particles. Here we simulate the entire warm-rain precipitation process while tracking individual particles. We use the Lagrangian Cloud Simulator (LCS), which is based on the Euler-Lagrangian framework: flow motion and scalar transport are computed with the Euler method, and particle motion with the Lagrangian one. The LCS tracks particle motions and collision events individually, accounting for the hydrodynamic interaction between approaching particles with a superposition method; that is, it can directly represent the collisional growth of cloud particles. Taking the hydrodynamic interaction into account is essential for trustworthy collision detection. In this study, we newly developed a stochastic model based on Twomey cloud condensation nuclei (CCN) activation for the Lagrangian tracking simulation and integrated it into the LCS. Coupled with the Euler computation of the water vapour and temperature fields, the initiation and condensational growth of water droplets were computed in the Lagrangian way. We applied the integrated LCS to a kinematic simulation of warm-rain processes in a vertically elongated domain of, at largest, 0.03×0.03×3000 m³ with horizontal periodicity. Aerosol particles with a realistic number density of 5×10⁷ m⁻³ were evenly distributed over the domain at the initial state. A prescribed updraft at the early stage initiated the development of a precipitating cloud. We have confirmed that the obtained bulk statistics agree fairly well with those from a conventional spectral-bin scheme for a vertical column domain. The centre of the discussion will be the Lagrangian statistics collected from the individual behaviour of the tracked particles.
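
    A rough sketch of a Twomey-type stochastic activation step is given below. The power-law parameters, cell volume and probabilistic treatment are illustrative assumptions and do not reproduce the LCS implementation.

```python
import numpy as np

def twomey_activated_number(supersat_pct, C=1.0e8, k=0.7):
    """Twomey power law: activated CCN number per m^3 at supersaturation S (%).

    C and k are illustrative maritime-like parameters, not the LCS settings.
    """
    return C * np.maximum(supersat_pct, 0.0) ** k

def stochastic_activation(n_tracked_aerosols, supersat_pct, cell_volume_m3, rng):
    """Activate individual Lagrangian aerosol particles in one grid cell.

    The expected number of newly activated droplets follows the Twomey law;
    each tracked aerosol is activated with the corresponding probability.
    """
    n_target = twomey_activated_number(supersat_pct) * cell_volume_m3
    p = min(n_target / max(n_tracked_aerosols, 1), 1.0)
    return rng.random(n_tracked_aerosols) < p  # boolean mask of activated particles

rng = np.random.default_rng(0)
mask = stochastic_activation(n_tracked_aerosols=500, supersat_pct=0.4,
                             cell_volume_m3=1.0e-6, rng=rng)  # 1 cm^3 cell
print(mask.sum(), "of 500 tracked aerosols activated")
```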

  3. A Systematic Literature Mapping of Risk Analysis of Big Data in Cloud Computing Environment

    NASA Astrophysics Data System (ADS)

    Bee Yusof Ali, Hazirah; Marziana Abdullah, Lili; Kartiwi, Mira; Nordin, Azlin; Salleh, Norsaremah; Sham Awang Abu Bakar, Normi

    2018-05-01

    This paper investigates previous literature that focuses on three elements: risk assessment, big data and cloud. We use a systematic literature mapping method to search for journals and proceedings. The systematic literature mapping process is utilized to obtain a properly screened and focused body of literature. With the help of inclusion and exclusion criteria, the search is further narrowed. Classification helps us group the literature into categories. At the end of the mapping, gaps can be seen. The gaps are where our focus should be in analysing the risk of big data in a cloud computing environment. Thus, a framework for assessing the risks to security, privacy and trust associated with big data in a cloud computing environment is highly needed.

  4. a Framework for Voxel-Based Global Scale Modeling of Urban Environments

    NASA Astrophysics Data System (ADS)

    Gehrung, Joachim; Hebel, Marcus; Arens, Michael; Stilla, Uwe

    2016-10-01

    The generation of 3D city models is a very active field of research. Modeling environments as point clouds may be fast, but has disadvantages. These are easily addressed by using volumetric representations, especially when considering selective data acquisition, change detection and fast-changing environments. Therefore, this paper proposes a framework for the volumetric modeling and visualization of large-scale urban environments. Besides an architecture and the right mix of algorithms for the task, two compression strategies for volumetric models as well as a data-quality-based approach for the import of range measurements are proposed. The capabilities of the framework are demonstrated on a mobile laser scanning dataset of the Technical University of Munich. Furthermore, the loss of the compression techniques is evaluated and their memory consumption is compared to that of raw point clouds. The presented results show that generation, storage and real-time rendering of even large urban models are feasible with off-the-shelf hardware.

  5. !CHAOS: A cloud of controls

    NASA Astrophysics Data System (ADS)

    Angius, S.; Bisegni, C.; Ciuffetti, P.; Di Pirro, G.; Foggetta, L. G.; Galletti, F.; Gargana, R.; Gioscio, E.; Maselli, D.; Mazzitelli, G.; Michelotti, A.; Orrù, R.; Pistoni, M.; Spagnoli, F.; Spigone, D.; Stecchi, A.; Tonto, T.; Tota, M. A.; Catani, L.; Di Giulio, C.; Salina, G.; Buzzi, P.; Checcucci, B.; Lubrano, P.; Piccini, M.; Fattibene, E.; Michelotto, M.; Cavallaro, S. R.; Diana, B. F.; Enrico, F.; Pulvirenti, S.

    2016-01-01

    This paper presents the !CHAOS open source project, which aims to develop a prototype of a national private Cloud Computing infrastructure devoted to accelerator control systems and large experiments in High Energy Physics (HEP). The !CHAOS project has been financed by MIUR (the Italian Ministry of Research and Education) and aims to develop a new concept of control system and data acquisition framework by providing, with a high level of abstraction, all the services needed for controlling and managing a large scientific, or non-scientific, infrastructure. A beta version of the !CHAOS infrastructure will be released at the end of December 2015 and will run on private Cloud infrastructures based on OpenStack.

  6. Secure public cloud platform for medical images sharing.

    PubMed

    Pan, Wei; Coatrieux, Gouenou; Bouslimi, Dalel; Prigent, Nicolas

    2015-01-01

    Cloud computing promises medical imaging services offering large storage and computing capabilities for limited costs. In this data outsourcing framework, one of the greatest issues to deal with is data security. To do so, we propose to secure a public cloud platform devoted to medical image sharing by defining and deploying a security policy so as to control various security mechanisms. This policy is based on a risk assessment we conducted to identify security objectives, with a special interest in digital content protection. These objectives are addressed by means of different security mechanisms such as access and usage control policies, partial encryption and watermarking.

  7. A two-step framework for reconstructing remotely sensed land surface temperatures contaminated by cloud

    NASA Astrophysics Data System (ADS)

    Zeng, Chao; Long, Di; Shen, Huanfeng; Wu, Penghai; Cui, Yaokui; Hong, Yang

    2018-07-01

    Land surface temperature (LST) is one of the most important parameters in land surface processes. Although satellite-derived LST can provide valuable information, its value is often limited by cloud contamination. In this paper, a two-step satellite-derived LST reconstruction framework is proposed. First, a multi-temporal reconstruction algorithm is introduced to recover invalid LST values using multiple LST images with reference to a corresponding remotely sensed vegetation index; all cloud-contaminated areas are thereby temporally filled with hypothetical clear-sky LST values. Second, a surface energy balance equation-based procedure is used to correct the filled values: with shortwave irradiation data, the clear-sky LST is corrected to the real LST under cloudy conditions. A series of experiments has been performed to demonstrate the effectiveness of the developed approach. Quantitative evaluation results indicate that the proposed method can recover LST over different surface types with mean errors of 3-6 K. The experiments also indicate that the time interval between the multi-temporal LST images has a greater impact on the results than the size of the contaminated area.

  8. Evaluation of high-level clouds in cloud resolving model simulations with ARM and KWAJEX observations

    DOE PAGES

    Liu, Zheng; Muhlbauer, Andreas; Ackerman, Thomas

    2015-11-05

    In this paper, we evaluate high-level clouds in a cloud resolving model for two convective cases, ARM9707 and KWAJEX. The simulated joint histograms of cloud occurrence and radar reflectivity compare well with cloud radar and satellite observations when using a two-moment microphysics scheme. However, simulations performed with a single-moment microphysical scheme exhibit low biases of approximately 20 dB. During convective events, the two-moment microphysics overestimates the amount of high-level cloud, while the one-moment microphysics precipitates too readily and underestimates the amount and height of high-level cloud. For ARM9707, persistent large positive biases in high-level cloud are found, which are not sensitive to changes in ice particle fall velocity and ice nuclei number concentration in the two-moment microphysics. These biases are caused by biases in the large-scale forcing and maintained by the periodic lateral boundary conditions. The combined effects include significant biases in high-level cloud amount and radiation, and a high sensitivity of cloud amount to the nudging time scale in both convective cases. The high sensitivity of high-level cloud amount to the thermodynamic nudging time scale suggests that thermodynamic nudging can be a powerful "tuning" parameter for the simulated cloud and radiation but should be applied with caution. The role of the periodic lateral boundary conditions in reinforcing the biases in cloud and radiation suggests that reducing the uncertainty in the large-scale forcing at high levels is important for similar convective cases and has far-reaching implications for simulating high-level clouds in super-parameterized global climate models such as the multiscale modeling framework.

  9. Efficacy of Cloud-Radiative Perturbations in Deep Open- and Closed-Cell Stratocumulus Clouds due to Aerosol Perturbations

    NASA Astrophysics Data System (ADS)

    Possner, A.; Wang, H.; Caldeira, K.; Wood, R.; Ackerman, T. P.

    2017-12-01

    Aerosol-cloud interactions (ACIs) in marine stratocumulus remain a significant source of uncertainty in constraining the cloud-radiative effect in a changing climate. Ship tracks are undoubted manifestations of ACIs embedded within stratocumulus cloud decks and have proven to be a useful framework to study the effect of aerosol perturbations on cloud morphology and on macrophysical, microphysical and cloud-radiative properties. However, so far most observational (Christensen et al. 2012, Chen et al. 2015) and numerical studies (Wang et al. 2011, Possner et al. 2015, Berner et al. 2015) have concentrated on ship tracks in shallow boundary layers of depths between 300 and 800 m, while most stratocumulus decks form in significantly deeper boundary layers (Muhlbauer et al. 2014). In this study we investigate the efficacy of aerosol perturbations in deep open- and closed-cell stratocumulus. Multi-day idealised cloud-resolving simulations are performed for the RF06 flight of the VOCALS-REx field campaign (Wood et al. 2011). During this flight, pockets of deep open and closed cells were observed in a 1410 m deep boundary layer. The efficacy of aerosol perturbations of varied concentration and spatial gradient in altering the cloud micro- and macrophysical state and the cloud-radiative effect is determined in both cloud regimes. Our simulations show that a continued point-source emission flux of 1.16×10¹¹ particles m⁻² s⁻¹ applied within a 300×300 m² grid box induces pronounced cloud cover changes in approximately a third of the simulated 80×80 km² domain, a weakening of the diurnal cycle in the open-cell regime and a resulting increase in domain-mean cloud albedo of 0.2. Furthermore, we contrast the efficacy of equal-strength near-surface or above-cloud aerosol perturbations in altering the cloud state.

  10. Evaluating cloudiness in an AGCM with Cloud Vertical Structure classes and their radiative effects

    NASA Astrophysics Data System (ADS)

    Lee, D.; Cho, N.; Oreopoulos, L.; Barahona, D.

    2017-12-01

    Clouds are recognized not only as the main modulator of Earth's Radiation Budget but also as the atmospheric constituent carrying the largest uncertainty in future climate projections. The presentation will showcase a new framework for evaluating clouds and their radiative effects in Atmospheric Global Climate Models (AGCMs) using Cloud Vertical Structure (CVS) classes. We take advantage of a new CVS reference dataset recently created from CloudSat's 2B-CLDCLASS-LIDAR product, which assigns observed cloud vertical configurations to nine simplified CVS classes based on cloud co-occurrence in three standard atmospheric layers. These CVS classes can also be emulated in GEOS-5 using the subcolumn cloud generator currently paired with the RRTMG radiation package as an implementation of the McICA scheme. Comparisons between the observed and modeled climatologies of the frequency of occurrence of the various CVS classes provide a new vantage point for assessing the realism of GEOS-5 clouds. Furthermore, a comparison between observed and modeled cloud radiative effects according to their CVS is also possible thanks to the availability of CloudSat's 2B-FLXHR-LIDAR product and our ability to composite radiative fluxes by CVS class in both the observed and modeled realms. This latter effort enables an investigation of whether the contribution of the various CVS classes to the Earth's radiation budget is represented realistically in GEOS-5. Making this new pathway of cloud evaluation available to the community is a major step towards the improved representation of clouds in climate models.
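
    As a loose illustration of how cloud co-occurrence in three standard layers can be mapped to vertical-structure classes, the sketch below builds a simple label from layer occupancy. The actual nine-class scheme derived from 2B-CLDCLASS-LIDAR may group configurations differently (for example, distinguishing contiguous from separated multi-layer clouds), so this mapping is only an assumption for illustration.

```python
def cvs_class(low, mid, high):
    """Map cloud occurrence in three standard layers to a simple CVS label.

    low, mid, high : booleans for cloud presence in each layer.
    Illustrative only; the nine-class scheme referenced in the abstract may
    group multi-layer configurations differently.
    """
    letters = "".join(l for l, present in zip("LMH", (low, mid, high)) if present)
    return letters if letters else "clear"

# Example subcolumn from a cloud generator: cloud in the low and high layers only
print(cvs_class(low=True, mid=False, high=True))  # -> "LH"
```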

  11. Synergy of stereo cloud top height and ORAC optimal estimation cloud retrieval: evaluation and application to AATSR

    NASA Astrophysics Data System (ADS)

    Fisher, Daniel; Poulsen, Caroline A.; Thomas, Gareth E.; Muller, Jan-Peter

    2016-03-01

    In this paper we evaluate the impact on the cloud parameter retrievals of the ORAC (Optimal Retrieval of Aerosol and Cloud) algorithm of including stereo-derived cloud top heights as a priori information. This is performed in a mathematically rigorous way using the ORAC optimal estimation retrieval framework, which includes the facility to use such independent a priori information. Key to the use of a priori information is a characterisation of its associated uncertainty. This paper demonstrates the improvements that are possible using this approach and also considers their impact on the retrieved microphysical cloud parameters. The Along-Track Scanning Radiometer (AATSR) instrument has two views and three thermal channels, so it is well placed to demonstrate the synergy of the two techniques. The stereo retrieval improves the accuracy of the retrieved cloud top height when compared to collocated Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO), particularly in the presence of boundary layer inversions and high clouds. The impact of the stereo a priori information on the microphysical cloud properties of cloud optical thickness (COT) and effective radius (RE) was evaluated and generally found to be very small for single-layer cloud conditions over open water (mean RE differences of 2.2 (±5.9) microns and mean COT differences of 0.5 (±1.8) for single-layer ice clouds over open water at elevations above 9 km, which are most strongly affected by the inclusion of the a priori information).
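
    The role of a characterised a priori uncertainty in an optimal estimation retrieval can be illustrated with a scalar sketch; the real ORAC retrieval is multivariate and iterates a forward model, and the numbers below are hypothetical.

```python
def combine_prior_and_measurement(x_prior, sigma_prior, x_meas, sigma_meas):
    """Scalar analogue of an optimal-estimation update for cloud top height.

    The stereo-derived a priori height and an independent radiometric estimate
    are weighted by their inverse variances. The real ORAC retrieval is
    multivariate and iterated through a forward model; this is only a sketch.
    """
    w_prior = 1.0 / sigma_prior**2
    w_meas = 1.0 / sigma_meas**2
    x_post = (w_prior * x_prior + w_meas * x_meas) / (w_prior + w_meas)
    sigma_post = (w_prior + w_meas) ** -0.5
    return x_post, sigma_post

# Hypothetical values: stereo prior 9.5 +/- 0.5 km, radiometric estimate 8.0 +/- 1.5 km
print(combine_prior_and_measurement(9.5, 0.5, 8.0, 1.5))
```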

  12. Windowed and Wavelet Analysis of Marine Stratocumulus Cloud Inhomogeneity

    NASA Technical Reports Server (NTRS)

    Gollmer, Steven M.; Harshvardhan; Cahalan, Robert F.; Snider, Jack B.

    1995-01-01

    To improve radiative transfer calculations for inhomogeneous clouds, a consistent means of modeling inhomogeneity is needed. One current method of modeling cloud inhomogeneity is through the use of fractal parameters. This method is based on the supposition that cloud inhomogeneity over a large range of scales is related. An analysis technique named wavelet analysis provides a means of studying the multiscale nature of cloud inhomogeneity. In this paper, the authors discuss the analysis and modeling of cloud inhomogeneity through the use of wavelet analysis. Wavelet analysis as well as other windowed analysis techniques are used to study liquid water path (LWP) measurements obtained during the marine stratocumulus phase of the First ISCCP (International Satellite Cloud Climatology Project) Regional Experiment. Statistics obtained using analysis windows, which are translated to span the LWP dataset, are used to study the local (small scale) properties of the cloud field as well as their time dependence. The LWP data are transformed onto an orthogonal wavelet basis that represents the data as a number of time series. Each of these time series lies within a frequency band and has a mean frequency that is half the frequency of the previous band. Wavelet analysis combined with translated analysis windows reveals that the local standard deviation of each frequency band is correlated with the local standard deviation of the other frequency bands. The ratio between the standard deviations of adjacent frequency bands is 0.9 and remains constant with respect to time. This ratio, defined as the variance coupling parameter, is applicable to all of the frequency bands studied and appears to be related to the slope of the data's power spectrum. Similar analyses are performed on two cloud inhomogeneity models, which use fractal-based concepts to introduce inhomogeneity into a uniform cloud field. The bounded cascade model does this by iteratively redistributing LWP at each scale using the value of the local mean. This model is reformulated into a wavelet multiresolution framework, thereby presenting a number of variants of the bounded cascade model. One variant introduced in this paper is the 'variance coupled model,' which redistributes LWP using the local standard deviation and the variance coupling parameter. While the bounded cascade model provides an elegant two-parameter model for generating cloud inhomogeneity, the multiresolution framework provides more flexibility at the expense of model complexity. Comparisons are made with the results from the LWP data analysis to demonstrate both the strengths and weaknesses of these models.
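
    A minimal sketch of a one-dimensional bounded-cascade LWP generator of the kind discussed above is given below; the fraction parameter, bounding factor and mean LWP are illustrative values, not those fitted to the FIRE data.

```python
import numpy as np

def bounded_cascade(n_levels=10, f=0.5, c=0.8, lwp_mean=100.0, rng=None):
    """Generate a 1-D bounded-cascade liquid water path (LWP) field.

    At cascade level n every segment is split in two and the LWP is
    redistributed between the halves by a factor (1 +/- f * c**n), so the
    transferred fraction shrinks with scale while the pair mean is preserved.
    Parameter values are illustrative assumptions.
    """
    rng = rng or np.random.default_rng(0)
    field = np.array([lwp_mean], dtype=float)
    for level in range(n_levels):
        frac = f * c**level
        signs = rng.choice([-1.0, 1.0], size=field.size)
        left = field * (1.0 + signs * frac)
        right = field * (1.0 - signs * frac)
        field = np.column_stack([left, right]).ravel()
    return field

lwp = bounded_cascade()
print(lwp.size, lwp.mean())  # 1024 points, mean preserved near 100 g m^-2
```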

  13. Accuracy assessment of building point clouds automatically generated from iPhone images

    NASA Astrophysics Data System (ADS)

    Sirmacek, B.; Lindenbergh, R.

    2014-06-01

    Low-cost sensor generated 3D models can be useful for quick 3D urban model updating, yet the quality of the models is questionable. In this article, we evaluate the reliability of an automatic point cloud generation method using multi-view iPhone images or an iPhone video file as an input. We register such an automatically generated point cloud onto a TLS point cloud of the same object to discuss the accuracy, advantages and limitations of the iPhone-generated point clouds. For the chosen example showcase, we have classified 1.23% of the iPhone point cloud points as outliers, and calculated the mean of the point-to-point distances to the TLS point cloud as 0.11 m. Since a TLS point cloud might also include measurement errors and noise, we computed local noise values for the point clouds from both sources. The means (μ) and standard deviations (σ) of the roughness histograms are (μ1 = 0.44 m, σ1 = 0.071 m) and (μ2 = 0.025 m, σ2 = 0.037 m) for the iPhone and TLS point clouds, respectively. Our experimental results indicate possible usage of the proposed automatic 3D model generation framework for 3D urban map updating, fusion and detail enhancement, and for quick and real-time change detection purposes. However, further insights should be obtained first on the circumstances that are needed to guarantee a successful point cloud generation from smartphone images.
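
    A minimal sketch of the cloud-to-cloud comparison described above (nearest-neighbour point-to-point distances with an outlier cut-off) is shown below; the synthetic data, noise level and 1 m threshold are assumptions for illustration only.

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_distances(source, reference):
    """Nearest-neighbour distance from every source point to a reference cloud.

    source, reference : (N, 3) and (M, 3) arrays (e.g. iPhone and TLS clouds,
    already registered in the same coordinate frame).
    """
    tree = cKDTree(reference)
    dist, _ = tree.query(source, k=1)
    return dist

# Hypothetical illustration with synthetic data
rng = np.random.default_rng(1)
tls = rng.random((50000, 3)) * 10.0
iphone = tls[:20000] + rng.normal(scale=0.1, size=(20000, 3))

d = cloud_to_cloud_distances(iphone, tls)
outlier_mask = d > 1.0  # e.g. 1 m cut-off for gross outliers
print(f"outliers: {100 * outlier_mask.mean():.2f} %, "
      f"mean distance (inliers): {d[~outlier_mask].mean():.3f} m")
```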

  14. Enabling a Scientific Cloud Marketplace: VGL (Invited)

    NASA Astrophysics Data System (ADS)

    Fraser, R.; Woodcock, R.; Wyborn, L. A.; Vote, J.; Rankine, T.; Cox, S. J.

    2013-12-01

    The Virtual Geophysics Laboratory (VGL) provides a flexible, web based environment where researchers can browse data and use a variety of scientific software packaged into tool kits that run in the Cloud. Both data and tool kits are published by multiple researchers and registered with the VGL infrastructure, forming a data and application marketplace. The VGL provides the basic work flow of Discovery and Access to the disparate data sources and a Library of tool kits and scripting to drive the scientific codes. Computation is then performed on research or commercial Clouds. Provenance information is collected throughout the work flow and can be published alongside the results, allowing for experiment comparison and sharing with other researchers. VGL's "mix and match" approach to data, computational resources and scientific codes enables a dynamic approach to scientific collaboration. VGL allows scientists to publish their specific contribution, be it data, code, compute or work flow, knowing the VGL framework will provide the other components needed for a complete application. Other scientists can choose the pieces that suit them best to assemble an experiment. The coarse-grain workflow of the VGL framework combined with the flexibility of the scripting library and computational toolkits allows for significant customisation and sharing amongst the community. The VGL utilises the cloud computational and storage resources from the Australian academic research cloud provided by the NeCTAR initiative and a large variety of data accessible from national and state agencies via the Spatial Information Services Stack (SISS - http://siss.auscope.org). VGL v1.2 screenshot - http://vgl.auscope.org

  15. Physical Validation of GPM Retrieval Algorithms Over Land: An Overview of the Mid-Latitude Continental Convective Clouds Experiment (MC3E)

    NASA Technical Reports Server (NTRS)

    Petersen, Walter A.; Jensen, Michael P.

    2011-01-01

    The joint NASA Global Precipitation Measurement (GPM) - DOE Atmospheric Radiation Measurement (ARM) Midlatitude Continental Convective Clouds Experiment (MC3E) was conducted from April 22-June 6, 2011, centered on the DOE-ARM Southern Great Plains Central Facility site in northern Oklahoma. GPM field campaign objectives focused on the collection of airborne and ground-based measurements of warm-season continental precipitation processes to support refinement of GPM retrieval algorithm physics over land, and to improve the fidelity of coupled cloud resolving and land-surface satellite simulator models. DOE ARM objectives were synergistically focused on relating observations of cloud microphysics and the surrounding environment to feedbacks on convective system dynamics, an effort driven by the need to better represent those interactions in numerical modeling frameworks. More specific topics addressed by MC3E include ice processes and ice characteristics as coupled to precipitation at the surface and radiometer signals measured in space, the correlation properties of rainfall and drop size distributions and impacts on dual-frequency radar retrieval algorithms, the transition of cloud water to rain water (e.g., autoconversion processes) and the vertical distribution of cloud water in precipitating clouds, and vertical draft structure statistics in cumulus convection. The MC3E observational strategy relied on NASA ER-2 high-altitude airborne multi-frequency radar (HIWRAP Ka-Ku band) and radiometer (AMPR, CoSMIR; 10-183 GHz) sampling (a GPM "proxy") over an atmospheric column being simultaneously profiled in situ by the University of North Dakota Citation microphysics aircraft, an array of ground-based multi-frequency scanning polarimetric radars (DOE Ka-W, X and C-band; NASA D3R Ka-Ku and NPOL S-bands) and wind-profilers (S/UHF bands), supported by a dense network of over 20 disdrometers and rain gauges, all nested in the coverage of a six-station mesoscale rawinsonde network. As an exploratory effort to examine land-surface emissivity impacts on retrieval algorithms, and to demonstrate airborne soil moisture retrieval capabilities, the University of Tennessee Space Institute Piper aircraft carrying the MAPIR L-band radiometer was also flown during the latter half of the experiment in coordination with the ER-2. The observational strategy provided a means to sample the atmospheric column in a redundant framework that enables inter-calibration and constraint of measured and retrieved precipitation characteristics such as particle size distributions or water contents, all within the umbrella of "proxy" satellite measurements (i.e., the ER-2). Complementing the precipitation sampling framework, frequent and coincident launches of atmospheric soundings (e.g., 4-8/day) then provided a much larger mesoscale view of the thermodynamic and wind environment, a data set useful for initializing cloud models. The datasets collected represent a variety of cloud and precipitation types including isolated cumulus clouds, severe thunderstorms, mesoscale convective systems, and widespread regions of light to moderate stratiform precipitation. We will present the MC3E experiment design, an overview of operations, and a summary of preliminary results.

  16. A New Unsteady Model for Dense Cloud Cavitation in Cryogenic Fluids

    NASA Technical Reports Server (NTRS)

    Hosangadi, A.; Ahuja, V.

    2005-01-01

    A new unsteady cavitation model is presented wherein the phase change process (bubble growth/collapse) is coupled to the acoustic field in a cryogenic fluid. It predicts the number density and radius of bubbles in vapor clouds by tracking both the aggregate surface area and volume fraction of the cloud. Hence, formulations for the dynamics of individual bubbles (e.g. the Rayleigh-Plesset equation) may be integrated within the macroscopic context of a dense vapor cloud, i.e. a cloud that occupies a significant fraction of the available volume and contains numerous bubbles. This formulation has been implemented within the CRUNCH CFD code, which has a compressible real-fluid formulation and a multi-element, unstructured grid framework, and has been validated extensively for liquid rocket turbopump inducers. Detailed unsteady simulations of a cavitating ogive in liquid nitrogen are presented where time-averaged mean cavity pressure and temperature depressions due to cavitation are compared with experimental data. The model also provides the spatial and temporal history of the bubble size distribution in the vapor clouds that are shed, an important physical parameter that is difficult to measure experimentally; this capability is a significant advancement in the modeling of dense cloud cavitation.

  17. Aerosol-cloud interactions in Arctic mixed-phase stratocumulus

    NASA Astrophysics Data System (ADS)

    Solomon, A.

    2017-12-01

    Reliable climate projections require realistic simulations of Arctic cloud feedbacks. Of particular importance is accurately simulating Arctic mixed-phase stratocumuli (AMPS), which are ubiquitous and play an important role in regional climate due to their impact on the surface energy budget and atmospheric boundary layer structure through cloud-driven turbulence, radiative forcing, and precipitation. AMPS are challenging to model due to uncertainties in the ice microphysical processes that determine phase partitioning between ice and radiatively important cloud liquid water. Since temperatures in AMPS are too warm for homogeneous ice nucleation, ice must form through heterogeneous nucleation. In this presentation we discuss a relatively unexplored source of ice production: the recycling of ice nuclei (IN) in regions of ice subsaturation. AMPS frequently have ice-subsaturated air near the cloud-driven mixed-layer base, where falling ice crystals can sublimate, leaving behind IN. This study provides an idealized framework to understand the feedbacks between dynamics and microphysics that maintain phase partitioning in AMPS. In addition, the results of this study provide insight into the mechanisms and feedbacks that may maintain cloud ice in AMPS even when entrainment of IN at the mixed-layer boundaries is weak.

  18. Migration of the ATLAS Metadata Interface (AMI) to Web 2.0 and cloud

    NASA Astrophysics Data System (ADS)

    Odier, J.; Albrand, S.; Fulachier, J.; Lambert, F.

    2015-12-01

    The ATLAS Metadata Interface (AMI), a mature application with more than 10 years of existence, is currently being adapted to some recently available technologies. The web interfaces, which previously manipulated XML documents using XSL transformations, are being migrated to Asynchronous JavaScript (AJAX). Web development is considerably simplified by the introduction of a framework based on JQuery and Twitter Bootstrap. Finally, the AMI services are being migrated to an OpenStack cloud infrastructure.

  19. Cloud Pedagogy: Utilizing Web-Based Technologies for the Promotion of Social Constructivist Learning in Science Teacher Preparation Courses

    NASA Astrophysics Data System (ADS)

    Barak, Miri

    2017-10-01

    The new guidelines for science education emphasize the need to introduce computers and digital technologies as a means of enabling visualization and data collection and analysis. This requires science teachers to bring advanced technologies into the classroom and use them wisely. Hence, the goal of this study was twofold: to examine the application of web-based technologies in science teacher preparation courses and to examine pre-service teachers' perceptions of "cloud pedagogy", an instructional framework that applies technologies for the promotion of social constructivist learning. The study included university teachers (N = 48) and pre-service science teachers (N = 73). Data were collected from an online survey, written reflections, and interviews. The findings indicated that university teachers use technologies mainly for information management and the distribution of learning materials and less for applying social constructivist pedagogy. University teachers expect their students (i.e., pre-service science teachers) to use digital tools in their future classrooms to a greater extent than they themselves do. The findings also indicated that the "cloud pedagogy" was perceived as an appropriate instructional framework for contemporary science education. The application of the cloud pedagogy fosters four attributes: the ability to adapt to frequent changes and uncertain situations, the ability to collaborate and communicate in decentralized environments, the ability to generate and manage data, and the ability to explore new venues.

  20. Quasi-Lagrangian models of nascent thermals

    NASA Technical Reports Server (NTRS)

    Rambaldi, S.; Randall, D. A.

    1981-01-01

    The motions in and around an isolated thermal were studied; rising motion was found in the core and sinking motion on the outside, while the circulation resembled that of a vortex ring. In an entity cloud model, a cloudy thermal is tracked, in a Lagrangian fashion, as a discrete entity; the field of motion in and around the thermal is not explicitly simulated. Field-of-motion cloud models, in which the equations of motion are numerically integrated on an Eulerian grid, were also developed. It is shown that a hybrid cloud model has great potential to combine the simplicity of the entity models with the generality and flexibility of the field-of-motion models. A key problem to be overcome in the development of a hybrid model is the formulation of a mathematical framework within which the cloud dynamics can be represented.

  1. CAUSES: Attribution of Surface Radiation Biases in NWP and Climate Models near the U.S. Southern Great Plains

    DOE PAGES

    Van Weverberg, K.; Morcrette, C. J.; Petch, J.; ...

    2018-02-28

    Many Numerical Weather Prediction (NWP) and climate models exhibit too warm lower tropospheres near the midlatitude continents. The warm bias has been shown to coincide with important surface radiation biases that likely play a critical role in the inception or the growth of the warm bias. This paper presents an attribution study on the net radiation biases in nine model simulations, performed in the framework of the CAUSES project (Clouds Above the United States and Errors at the Surface). Contributions from deficiencies in the surface properties, clouds, water vapor, and aerosols are quantified, using an array of radiation measurement stations near the Atmospheric Radiation Measurement Southern Great Plains site. Furthermore, an in-depth analysis is shown to attribute the radiation errors to specific cloud regimes. The net surface shortwave radiation is overestimated in all models throughout most of the simulation period. Cloud errors are shown to contribute most to this overestimation, although nonnegligible contributions from the surface albedo exist in most models. Missing deep cloud events and/or simulating deep clouds with too weak cloud radiative effects dominate the cloud-related radiation errors. Some models have compensating errors, producing deep cloud too frequently but largely underestimating its radiative effect, while other models miss deep cloud events altogether. Surprisingly, even the latter models tend to produce too much and too frequent afternoon surface precipitation. This suggests that rather than issues with the triggering of deep convection, cloud radiative deficiencies are related to too weak convective cloud detrainment and too large precipitation efficiencies.

  2. Exploring the factors influencing the cloud computing adoption: a systematic study on cloud migration.

    PubMed

    Rai, Rashmi; Sahoo, Gadadhar; Mehfuz, Shabana

    2015-01-01

    Today, most organizations rely on their age-old legacy applications to support their business-critical systems. However, there are several critical concerns, such as maintainability and scalability issues, associated with legacy systems. In this context, cloud services offer a more agile and cost-effective platform to support business applications and IT infrastructure. The adoption of cloud services has been increasing recently, and so has academic research in cloud migration. However, there is a genuine need for a secondary study to further strengthen this research. The primary objective of this paper is to scientifically and systematically identify, categorize and compare the existing research work in the area of legacy-to-cloud migration. The paper has also endeavored to consolidate the research on security issues, which are a prime factor hindering the adoption of the cloud, by classifying the studies on secure cloud migration. An SLR (Systematic Literature Review) of thirty selected papers, published from 2009 to 2014, was conducted to properly understand the nuances of the security framework. To categorize the selected studies, the authors have proposed a conceptual model for cloud migration, which has resulted in a resource base of existing solutions for cloud migration. This study concludes that cloud migration research is at a seminal stage but is also evolving and maturing, with increasing participation from academics and industry alike. The paper also identifies the need for a secure migration model, which can fortify an organization's trust in cloud migration and facilitate the necessary tool support to automate the migration process.

  3. CAUSES: Attribution of Surface Radiation Biases in NWP and Climate Models near the U.S. Southern Great Plains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Weverberg, K.; Morcrette, C. J.; Petch, J.

    Many Numerical Weather Prediction (NWP) and climate models exhibit too warm lower tropospheres near the midlatitude continents. The warm bias has been shown to coincide with important surface radiation biases that likely play a critical role in the inception or the growth of the warm bias. This paper presents an attribution study on the net radiation biases in nine model simulations, performed in the framework of the CAUSES project (Clouds Above the United States and Errors at the Surface). Contributions from deficiencies in the surface properties, clouds, water vapor, and aerosols are quantified, using an array of radiation measurement stations near the Atmospheric Radiation Measurement Southern Great Plains site. Furthermore, an in-depth analysis is shown to attribute the radiation errors to specific cloud regimes. The net surface shortwave radiation is overestimated in all models throughout most of the simulation period. Cloud errors are shown to contribute most to this overestimation, although nonnegligible contributions from the surface albedo exist in most models. Missing deep cloud events and/or simulating deep clouds with too weak cloud radiative effects dominate the cloud-related radiation errors. Some models have compensating errors, producing deep cloud too frequently but largely underestimating its radiative effect, while other models miss deep cloud events altogether. Surprisingly, even the latter models tend to produce too much and too frequent afternoon surface precipitation. This suggests that rather than issues with the triggering of deep convection, cloud radiative deficiencies are related to too weak convective cloud detrainment and too large precipitation efficiencies.

  4. CAUSES: Attribution of Surface Radiation Biases in NWP and Climate Models near the U.S. Southern Great Plains

    NASA Astrophysics Data System (ADS)

    Van Weverberg, K.; Morcrette, C. J.; Petch, J.; Klein, S. A.; Ma, H.-Y.; Zhang, C.; Xie, S.; Tang, Q.; Gustafson, W. I.; Qian, Y.; Berg, L. K.; Liu, Y.; Huang, M.; Ahlgrimm, M.; Forbes, R.; Bazile, E.; Roehrig, R.; Cole, J.; Merryfield, W.; Lee, W.-S.; Cheruy, F.; Mellul, L.; Wang, Y.-C.; Johnson, K.; Thieman, M. M.

    2018-04-01

    Many Numerical Weather Prediction (NWP) and climate models exhibit too warm lower tropospheres near the midlatitude continents. The warm bias has been shown to coincide with important surface radiation biases that likely play a critical role in the inception or the growth of the warm bias. This paper presents an attribution study on the net radiation biases in nine model simulations, performed in the framework of the CAUSES project (Clouds Above the United States and Errors at the Surface). Contributions from deficiencies in the surface properties, clouds, water vapor, and aerosols are quantified, using an array of radiation measurement stations near the Atmospheric Radiation Measurement Southern Great Plains site. Furthermore, an in-depth analysis is shown to attribute the radiation errors to specific cloud regimes. The net surface shortwave radiation is overestimated in all models throughout most of the simulation period. Cloud errors are shown to contribute most to this overestimation, although nonnegligible contributions from the surface albedo exist in most models. Missing deep cloud events and/or simulating deep clouds with too weak cloud radiative effects dominate the cloud-related radiation errors. Some models have compensating errors, producing deep cloud too frequently but largely underestimating its radiative effect, while other models miss deep cloud events altogether. Surprisingly, even the latter models tend to produce too much and too frequent afternoon surface precipitation. This suggests that rather than issues with the triggering of deep convection, cloud radiative deficiencies are related to too weak convective cloud detrainment and too large precipitation efficiencies.

  5. A New Framework for Cumulus Parametrization - A CPT in action

    NASA Astrophysics Data System (ADS)

    Jakob, C.; Peters, K.; Protat, A.; Kumar, V.

    2016-12-01

    The representation of convection in climate models remains a major Achilles' heel in our pursuit of better predictions of global and regional climate. The basic principle underpinning the parametrisation of tropical convection in global weather and climate models is that there exist discernible interactions between the resolved model scale and the parametrised cumulus scale. Furthermore, there must be at least some predictive power in the larger scales for the statistical behaviour on small scales for us to be able to formally close the parametrised equations. The presentation will discuss a new framework for cumulus parametrisation based on the idea of separating the prediction of cloud area from that of velocity. This idea is put into practice by combining an existing multi-scale stochastic cloud model with observations to arrive at a prediction of the area fraction of deep precipitating convection. Using mid-tropospheric humidity and vertical motion as predictors, the model is shown to reproduce the observed behaviour of both the mean and the variability of deep convective area fraction well. The framework allows for the inclusion of convective organisation and can, in principle, be made resolution-aware or resolution-independent. When combined with simple assumptions about cloud-base vertical motion, the model can be used as a closure assumption in any existing cumulus parametrisation. Results of applying this idea in the ECHAM model indicate significant improvements in the simulation of tropical variability, including but not limited to the MJO. This presentation will highlight how the close collaboration of the observational, theoretical and model development communities in the spirit of the climate process teams can lead to significant progress on long-standing issues in climate modelling while preserving the freedom of individual groups to pursue their specific implementation of an agreed framework.
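
    To make the closure idea concrete, the toy sketch below maps mid-tropospheric humidity and large-scale vertical motion to a stochastic deep-convective area fraction. The functional form, coefficients and noise model are purely illustrative assumptions and do not reproduce the multi-scale stochastic cloud model referred to in the abstract.

```python
import numpy as np

def deep_convective_area_fraction(rh_mid, omega_500, rng=None):
    """Toy closure: deep-convective area fraction from large-scale predictors.

    rh_mid    : mid-tropospheric relative humidity (0-1)
    omega_500 : large-scale pressure velocity at 500 hPa (Pa/s, negative = ascent)
    The logistic form, coefficients and stochastic scatter are arbitrary
    choices for illustration only.
    """
    rng = rng or np.random.default_rng(0)
    forcing = 4.0 * (rh_mid - 0.6) - 20.0 * omega_500   # arbitrary linear predictor
    mean_af = 0.1 / (1.0 + np.exp(-forcing))            # bounded between 0 and 0.1
    # add stochastic scatter to mimic the variability of the cloud field
    return float(np.clip(mean_af * (1.0 + 0.3 * rng.standard_normal()), 0.0, 1.0))

print(deep_convective_area_fraction(rh_mid=0.75, omega_500=-0.1))
```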

  6. H31G-1596: DeepSAT's CloudCNN: A Deep Neural Network for Rapid Cloud Detection from Geostationary Satellites

    NASA Technical Reports Server (NTRS)

    Kalia, Subodh; Ganguly, Sangram; Li, Shuang; Nemani, Ramakrishna R.

    2017-01-01

    Cloud and cloud shadow detection has important applications in weather and climate studies. It is even more crucial when geostationary satellites are introduced into the field of terrestrial remote sensing. With the challenges associated with data acquired at very high frequency (10-15 min per scan), the ability to derive an accurate cloud/shadow mask from geostationary satellite data is critical. The key to the success of most existing algorithms is spatially and temporally varying thresholds, which better capture local atmospheric and surface effects. However, the selection of a proper threshold is difficult and may lead to erroneous results. In this work, we propose a deep neural network based approach called CloudCNN to classify cloud/shadow from Himawari-8 AHI and GOES-16 ABI multispectral data. DeepSAT's CloudCNN consists of an encoder-decoder based architecture for binary-class pixel-wise segmentation. We train CloudCNN on a multi-GPU Nvidia Devbox cluster, and deploy the prediction pipeline on the NASA Earth Exchange (NEX) Pleiades supercomputer. We achieved an overall accuracy of 93.29% on test samples. Since the predictions take only a few seconds to segment a full multispectral GOES-16 or Himawari-8 Full Disk image, the developed framework can be used for real-time cloud detection, cyclone detection, or extreme weather event predictions.

  7. The AIST Managed Cloud Environment

    NASA Astrophysics Data System (ADS)

    Cook, S.

    2016-12-01

    ESTO is currently in the process of developing and implementing the AIST Managed Cloud Environment (AMCE) to offer cloud computing services to ESTO-funded PIs to conduct their project research. AIST will provide projects access to a cloud computing framework that incorporates NASA security, technical, and financial standards, on which projects can freely store, run, and process data. Currently, many projects led by research groups outside of NASA do not have awareness of the requirements or the resources to implement NASA standards in their research, which limits the likelihood of infusing the work into NASA applications. Offering this environment to PIs will allow them to conduct their project research using the many benefits of cloud computing. In addition to the well-known cost and time savings that it allows, it also provides scalability and flexibility. The AMCE will facilitate infusion and end user access by ensuring standardization and security. This approach will ultimately benefit ESTO, the science community, and the research, allowing the technology developments to have quicker and broader applications.

  8. The AMCE (AIST Managed Cloud Environment)

    NASA Astrophysics Data System (ADS)

    Cook, S.

    2017-12-01

    ESTO has developed and implemented the AIST Managed Cloud Environment (AMCE) to offer cloud computing services to SMD-funded PIs to conduct their project research. AIST will provide projects access to a cloud computing framework that incorporates NASA security, technical, and financial standards, on which projects can freely store, run, and process data. Currently, many projects led by research groups outside of NASA do not have awareness of the requirements or the resources to implement NASA standards in their research, which limits the likelihood of infusing the work into NASA applications. Offering this environment to PIs allows them to conduct their project research using the many benefits of cloud computing. In addition to the well-known cost and time savings that it allows, it also provides scalability and flexibility. The AMCE facilitates infusion and end user access by ensuring standardization and security. This approach will ultimately benefit ESTO, the science community, and the research, allowing the technology developments to have quicker and broader applications.

  9. Design and implementation of a cloud based lithography illumination pupil processing application

    NASA Astrophysics Data System (ADS)

    Zhang, Youbao; Ma, Xinghua; Zhu, Jing; Zhang, Fang; Huang, Huijie

    2017-02-01

    Pupil parameters are important for evaluating the quality of a lithography illumination system. In this paper, a cloud based, full-featured pupil processing application is implemented. A web browser is used for the UI (User Interface), the WebSocket protocol and the JSON format are used for communication between the client and the server, and the computing part is implemented on the server side, where the application integrates a variety of high-quality professional libraries, such as the image processing libraries libvips and ImageMagick and the automatic reporting system LaTeX, to support the program. The cloud based framework takes advantage of the server's superior computing power and rich software collection, and the program can run anywhere there is a modern browser thanks to its web UI design. Compared to the traditional software operation model (purchased, licensed, shipped, downloaded, installed, maintained, and upgraded), the new cloud based approach, which requires no installation and is easy to use and maintain, opens up a new way. Cloud based applications are probably the future of software development.
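
    A minimal sketch of the browser-to-server exchange pattern described above (WebSocket transport, JSON messages, computation on the server) is given below, using the third-party Python websockets package. The endpoint, message fields and the toy "total energy" calculation are hypothetical placeholders, not the application's actual pupil-parameter code.

```python
import asyncio
import json

import numpy as np
import websockets  # third-party: pip install websockets

async def handle_client(websocket):
    """Receive a JSON pupil-processing request, compute a toy metric, reply in JSON.

    The message fields ("pupil", "request") and the metric are hypothetical
    placeholders for the real pupil parameter calculations.
    Handler signature assumes websockets >= 10, where the path argument is optional.
    """
    async for message in websocket:
        req = json.loads(message)
        pupil = np.asarray(req["pupil"], dtype=float)  # 2-D intensity map
        reply = {"request": req.get("request", "unknown"),
                 "total_energy": float(pupil.sum())}
        await websocket.send(json.dumps(reply))

async def main():
    # Serve on localhost:8765; a browser UI would connect via a WebSocket.
    async with websockets.serve(handle_client, "localhost", 8765):
        await asyncio.Future()  # run until cancelled

if __name__ == "__main__":
    asyncio.run(main())
```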

  10. Overview and Evaluation of a Smoke Modeling System and other Tools used during Wildfire Incident Deployments

    NASA Astrophysics Data System (ADS)

    ONeill, S. M.; Larkin, N. K.; Martinez, M.; Rorig, M.; Solomon, R. C.; Dubowy, J.; Lahm, P. W.

    2017-12-01

    Specialists operationally deployed to wildfires to forecast expected smoke conditions for the public rely on many tools and sources of information. These Air Resource Advisors (ARAs) are deployed as part of the Wildland Fire Air Quality Response Program (WFAQRP) and rely on smoke models, monitoring data, meteorological information, and satellite information to produce daily Smoke Outlooks for a region impacted by smoke from wildfires. These Smoke Outlooks are distributed to air quality and health agencies, published online via smoke blogs and other social media, and distributed by the Incident Public Information Officer (PIO), ultimately reaching the public. Fundamental to these operations are smoke modeling systems such as the BlueSky Smoke Modeling Framework, which combines fire activity information, mapped fuel loadings, consumption and emissions models, and air quality/dispersion models such as HYSPLIT to produce predictions of PM2.5 concentrations downwind of wildland fires. Performance of this system with a variety of meteorological resolutions, fire initialization information, and vertical allocations of emissions is evaluated for the summer of 2015, when over 400,000 hectares burned in the northwestern U.S. state of Washington and 1-hr average fine particulate matter (PM2.5) concentrations exceeded 700 μg/m3. The performance of the system at the 12-km, 4-km, and 1.33-km resolutions is evaluated using 1-hr average PM2.5 measurements from permanent monitors and from temporary monitors deployed specifically for wildfires by ARAs on wildfire incident command teams. At the highest meteorological resolution (1.33 km) the terrain features are more detailed, showing better valley structures, and in general PM2.5 concentrations were greater in the valleys with the 1.33-km meteorological domain than with the 4-km domain.
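
    Model performance against monitor data of the kind described above is typically summarised with a few simple statistics; the sketch below computes mean bias, RMSE and correlation for collocated hourly PM2.5 values. The example numbers are hypothetical, and the study's exact evaluation protocol may differ.

```python
import numpy as np

def evaluation_stats(obs, mod):
    """Basic performance statistics for hourly PM2.5 (ug/m3) at monitor sites.

    obs, mod : 1-D arrays of collocated observed and modelled concentrations.
    Returns mean bias, root-mean-square error, and Pearson correlation.
    (Illustrative metrics only.)
    """
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    bias = np.mean(mod - obs)
    rmse = np.sqrt(np.mean((mod - obs) ** 2))
    r = np.corrcoef(obs, mod)[0, 1]
    return bias, rmse, r

# Hypothetical hourly values at one temporary monitor
obs = [35.0, 120.0, 410.0, 705.0, 330.0]
mod_4km = [20.0, 95.0, 300.0, 520.0, 280.0]
print("bias=%.1f rmse=%.1f r=%.2f" % evaluation_stats(obs, mod_4km))
```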

  11. Prescribed Grassland Burning Smoke Emission Measurements in the Northern Flint Hills Region

    NASA Astrophysics Data System (ADS)

    Wilkins, J. L.; Baker, K. R.; Landis, M.; Aurell, J.; Gullett, B.

    2017-12-01

    Historically, frequent wildfires were essential for the maintenance of fire-adapted native prairie ecosystems. Today, prescribed fires are used to control invasive woody species and potentially improve forage production in these same prairie ecosystems for the beef-cattle industry. The emission of primary particulate matter, secondary aerosol, ozone precursors, and air toxics from prescribed grassland burning operations has been implicated as a driver of downwind air quality problems across a multi-state area. A field study has been planned to quantify prescribed burn smoke emissions using both surface and aerial sampling platforms to better constrain emission rates for organic and inorganic pollutants. Multiple prescribed burns on tallgrass prairie fields in the northern Flint Hills ecoregion are planned for March 2017 at the Konza Prairie Biological Station in Kansas. An array of measurement systems will be deployed to quantify a suite of continuous and integrated air pollution parameters, combustion conditions, meteorological parameters, and plume dynamics, in order to calculate more accurate and condition-specific emission factors that will be used to better predict primary and secondary pollutants both locally and regionally. These emission measurements will allow for evaluation and improvement of the U.S. Forest Service's BlueSky modeling framework, which includes the Fire Emission Production Simulator (FEPS) and the Fuel Characteristic Classification System (FCCS). Elucidating grassland prescribed burning emission factors based on fuel type, loading, and environmental conditions is expected to provide an improved understanding of the impact of this land management practice on air quality in the greater Flint Hills region. It is also expected that measurements will be made to help constrain and develop better routines for fire plume rise, vertical allocation, and smoke optical properties.

  12. Quantifying Biomass from Point Clouds by Connecting Representations of Ecosystem Structure

    NASA Astrophysics Data System (ADS)

    Hendryx, S. M.; Barron-Gafford, G.

    2017-12-01

    Quantifying terrestrial ecosystem biomass is an essential part of monitoring carbon stocks and fluxes within the global carbon cycle and of optimizing natural resource management. Point cloud data, such as from lidar and structure from motion, can be effective for quantifying biomass over large areas, but significant challenges remain in developing effective models that allow for such predictions. Inference models that estimate biomass from point clouds are established in many environments, yet they are often scale-dependent, needing to be fitted and applied at the same spatial scale and grid size at which they were developed. Furthermore, training such models typically requires large in situ datasets that are often prohibitively costly or time-consuming to obtain. We present here a scale- and sensor-invariant framework for efficiently estimating biomass from point clouds. Central to this framework, we present a new algorithm, assignPointsToExistingClusters, developed for finding matches between in situ data and clusters in remotely sensed point clouds. The algorithm can be used for assessing canopy segmentation accuracy and for training and validating machine learning models for predicting biophysical variables. We demonstrate the algorithm's efficacy by using it to train a random forest model of aboveground biomass in a shrubland environment in Southern Arizona. We show that by learning a nonlinear function to estimate biomass from segmented canopy features we can reduce error, especially in the presence of inaccurate clusterings, when compared to a traditional, deterministic technique that estimates biomass from remotely measured canopies. Our random-forest-on-cluster-features model extends established methods of training random forest regressions to predict biomass of subplots, but requires significantly less training data and is scale-invariant. The model reduced mean absolute error, when evaluated on all test data in leave-one-out cross-validation, by 40.6% relative to deterministic mesquite allometry and by 35.9% relative to the inferred ecosystem-state allometric function. Our framework should allow for the inference of biomass more efficiently than common subplot methods and more accurately than individual tree segmentation methods in densely vegetated environments.
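    The abstract does not spell out assignPointsToExistingClusters, so the sketch below only illustrates the general matching-and-regression idea under simple assumptions: each field measurement is assigned to the nearest canopy cluster within a distance cap, and a random forest is trained from cluster features to biomass. Names, features, and data are illustrative, not the authors'.

      # Hedged sketch of the matching idea: nearest-cluster assignment with a
      # distance cap, then a random forest from cluster features to biomass.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      def assign_points_to_clusters(points_xy, cluster_centroids_xy, max_dist=2.0):
          """Return, for each point, the index of the nearest cluster, or -1 if too far."""
          assignments = []
          for p in points_xy:
              d = np.linalg.norm(cluster_centroids_xy - p, axis=1)
              j = int(np.argmin(d))
              assignments.append(j if d[j] <= max_dist else -1)
          return np.array(assignments)

      # Synthetic stand: 50 canopy clusters with centroids and per-cluster features
      rng = np.random.default_rng(0)
      centroids = rng.random((50, 2)) * 100.0
      features = rng.random((50, 4))             # e.g. area, height percentiles, density

      # 10 field-measured shrubs located near (and matched to) some of the clusters
      field_xy = centroids[:10] + rng.normal(0, 0.3, (10, 2))
      field_biomass = features[:10] @ np.array([5.0, 2.0, 8.0, 1.0])
      matched = assign_points_to_clusters(field_xy, centroids)

      ok = matched >= 0                          # drop unmatched field points
      model = RandomForestRegressor(n_estimators=200, random_state=0)
      model.fit(features[matched[ok]], field_biomass[ok])   # train on matched clusters
      predicted = model.predict(features)                    # biomass for every cluster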

  13. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Architecture

    NASA Astrophysics Data System (ADS)

    Xiao, J.; Yu, C.; Cui, C.; He, B.; Li, C.; Fan, D.; Hong, Z.; Yin, S.; Wang, C.; Cao, Z.; Fan, Y.; Li, S.; Mi, L.; Wan, W.; Wang, J.; Zhang, H.

    2015-09-01

    AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences). The ultimate goal of this project is to provide a comprehensive end-to-end astronomy research environment in which several independent systems seamlessly collaborate to support the full lifecycle of modern, data-intensive observational astronomy, from proposal submission to data archiving, data release, and in-situ data analysis and processing. In this paper, the architecture and key designs of the AstroCloud platform are introduced, including the data access middleware, the access control and security framework, the extendible proposal workflow, and the system integration mechanism.

  14. Comparative evaluation of polarimetric and bi-spectral cloud microphysics retrievals: Retrieval closure experiments and comparisons based on idealized and LES case studies

    NASA Astrophysics Data System (ADS)

    Miller, D. J.; Zhang, Z.; Ackerman, A. S.; Platnick, S. E.; Cornet, C.

    2016-12-01

    A remote sensing cloud retrieval simulator, created by coupling an LES cloud model with vector radiative transfer (RT) models, is an ideal framework for assessing cloud remote sensing techniques. This simulator serves as a tool for understanding bi-spectral and polarimetric retrievals by comparing them directly to LES cloud properties (retrieval closure comparison) and for comparing the retrieval techniques to one another. Our simulator utilizes the DHARMA LES [Ackerman et al., 2004] with cloud properties based on marine boundary layer (MBL) clouds observed during the DYCOMS-II and ATEX field campaigns. The cloud reflectances are produced by vectorized RT models based on polarized doubling-adding and Monte Carlo techniques (PDA, MCPOL). Retrievals are performed using techniques as similar as possible to those implemented on the corresponding well-known instruments; polarimetric retrievals are based on techniques implemented for polarimeters (POLDER, AirMSPI, and RSP), and bi-spectral retrievals are performed using the Nakajima-King LUT method utilized on a number of spectral instruments (MODIS and VIIRS). Retrieval comparisons focus on cloud droplet effective radius (re), effective variance (ve), and cloud optical thickness (τ). This work explores the sensitivities of these two retrieval techniques to various observational limitations, such as spatial resolution/cloud inhomogeneity, the impact of 3D radiative effects, and angular resolution requirements. With future remote sensing missions like NASA's Aerosols/Clouds/Ecosystems (ACE) planning to feature advanced polarimetric instruments, it is important to understand how these retrieval techniques compare to one another. The cloud retrieval simulator we have developed allows us to probe these important questions in a realistic test bed.

  15. Regime-based evaluation of cloudiness in CMIP5 models

    NASA Astrophysics Data System (ADS)

    Jin, Daeho; Oreopoulos, Lazaros; Lee, Dongmin

    2017-01-01

    The concept of cloud regimes (CRs) is used to develop a framework for evaluating the cloudiness of 12 models from phase 5 of the Coupled Model Intercomparison Project (CMIP5). Reference CRs come from existing global International Satellite Cloud Climatology Project (ISCCP) weather states. The evaluation is made possible by the implementation in several CMIP5 models of the ISCCP simulator, which generates in each grid cell daily joint histograms of cloud optical thickness and cloud top pressure. Model performance is assessed with several metrics, such as CR global cloud fraction (CF), CR relative frequency of occurrence (RFO), their product (the long-term average total cloud amount, TCA), cross-correlations of CR RFO maps, and a metric of resemblance between model and ISCCP CRs. In terms of CR global RFO, arguably the most fundamental metric, the models perform unsatisfactorily overall, except for CRs representing thick storm clouds. Because model CR CF is internally constrained by our method, RFO discrepancies also yield substantial TCA errors. Our results support previous findings that CMIP5 models underestimate cloudiness. The multi-model mean performs well in matching observed RFO maps for many CRs, but is still not the best for this or other metrics. When overall performance across all CRs is assessed, some models, despite shortcomings, apparently outperform Moderate Resolution Imaging Spectroradiometer cloud observations evaluated against ISCCP as if they were another model's output. Lastly, contrasting cloud simulation performance against each model's equilibrium climate sensitivity, in order to gain insight into whether good cloud simulation pairs with particular values of this parameter, yields no clear conclusions.

  16. Integrated approach towards understanding interactions of mineral dust aerosol with warm clouds

    NASA Astrophysics Data System (ADS)

    Kumar, Prashant

    2011-12-01

    Mineral dust is ubiquitous in the atmosphere and represents a dominant type of particulate matter by mass. Dust particles can serve as cloud condensation nuclei (CCN), giant CCN (GCCN), or ice nuclei (IN), thereby affecting cloud microphysics, albedo, and lifetime. Despite its well-recognized importance, assessments of dust impacts on clouds and climate remain highly uncertain. This thesis addresses the role of dust as CCN and GCCN with the goal of improving our understanding of dust-warm cloud interactions and their representation in climate models. Most studies to date focus on the soluble fraction of aerosol particles when describing cloud droplet nucleation, and overlook the interactions of the hydrophilic insoluble fraction with water vapor. A new approach to include such interactions (expressed by the process of water vapor adsorption) is explored, by combining the multilayer Frenkel-Halsey-Hill (FHH) physical adsorption isotherm with curvature (Kelvin) effects. The importance of adsorption activation theory (FHH-AT) is corroborated by measurements of the CCN activity of mineral aerosols generated from clays, calcite, quartz, and desert soil samples from Northern Africa, East Asia/China, and North America. A new aerosol generation setup for CCN measurements was developed based on a dry generation technique capable of reproducing natural dust aerosol emission. Based on the dependence of critical supersaturation on particle dry diameter, it is found that FHH-AT is a better framework for describing fresh (and unprocessed) dust CCN activity than classical Köhler theory (KT). Ion chromatography (IC) measurements performed on fresh regional dust samples indicate a negligible soluble fraction, and support that water vapor adsorption is the prime source of CCN activity in the dust. CCN measurements with the commonly used wet-generated mineral aerosol (from atomization of a dust aqueous suspension) are also carried out. Results indicate that the method is subject to biases, as it generates a bimodal size distribution with a broad range of hygroscopicity. It is found that smaller particles generated in the more hygroscopic peak follow CCN activation by KT, while the larger peak is less hydrophilic, with activation similar to dry-generated dust that follows FHH-AT. Droplet activation kinetics measurements demonstrate that dry-generated mineral aerosol displays retarded activation kinetics, with an equivalent water vapor uptake coefficient that is 30-80% lower relative to ammonium sulfate aerosol. Wet-generated mineral aerosols, however, display activation kinetics similar to ammonium sulfate. These results suggest that at least a monolayer of water vapor (the rate-limiting step for adsorption) persists during the timescale of aerosol generation in the experiment, and call into question the atmospheric relevance of studies on mineral aerosol generated by the wet atomization method. A new parameterization of cloud droplet formation from insoluble dust CCN for regional and global climate models is also developed. The parameterization framework considers cloud droplet formation from dust CCN activating via FHH-AT, and from soluble aerosol with activation described through KT. The parameterization is validated against a numerical parcel model, agreeing with predictions to within 10% (R² ≈ 0.98). The potential role of dust GCCN activating by FHH-AT within warm stratocumulus and convective clouds is also evaluated. It is found that under pristine aerosol conditions, dust GCCN can act as collector drops, with implications for dust-cloud-precipitation linkages. Biases introduced by describing dust GCCN activation with KT are also addressed. The results demonstrate that dust particles do not require deliquescent material to act as CCN in the atmosphere. Furthermore, the impact of dust particles as giant CCN on warm clouds and precipitation must be considered. Finally, the new parameterization of cloud droplet formation can be implemented in regional and global models, providing an improved treatment of the effect of mineral aerosol on clouds and precipitation. The new framework is uniquely placed to address dust aerosol indirect effects on climate.
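    For reference, a commonly cited form of the equilibrium supersaturation in adsorption activation theory (FHH-AT) combines a Kelvin curvature term with an FHH adsorption term; the expression below is a sketch of that standard formulation from the dust-CCN literature, not a quotation from this thesis.

      % Equilibrium supersaturation of a wetted insoluble particle (FHH-AT),
      % combining the Kelvin (curvature) term and the FHH adsorption term:
      \[
        \ln S \;=\; \frac{4\,\sigma_w M_w}{R\,T\,\rho_w\,D_p} \;-\; A\,\Theta^{-B},
        \qquad
        \Theta \;=\; \frac{D_p - D_{dry}}{2\,d_w},
      \]
      % where S is the saturation ratio, D_p the wet particle diameter, D_dry the
      % dry diameter, d_w the diameter of a water molecule, and A, B the empirical
      % FHH parameters. The critical supersaturation follows from maximizing
      % ln S over D_p; in Köhler theory a solute (Raoult) term appears in place
      % of the adsorption term.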

  17. Combining Fog Computing with Sensor Mote Machine Learning for Industrial IoT.

    PubMed

    Lavassani, Mehrzad; Forsström, Stefan; Jennehag, Ulf; Zhang, Tingting

    2018-05-12

    Digitalization is a global trend becoming ever more important to our connected and sustainable society. This trend also affects industry where the Industrial Internet of Things is an important part, and there is a need to conserve spectrum as well as energy when communicating data to a fog or cloud back-end system. In this paper we investigate the benefits of fog computing by proposing a novel distributed learning model on the sensor device and simulating the data stream in the fog, instead of transmitting all raw sensor values to the cloud back-end. To save energy and to communicate as few packets as possible, the updated parameters of the learned model at the sensor device are communicated in longer time intervals to a fog computing system. The proposed framework is implemented and tested in a real world testbed in order to make quantitative measurements and evaluate the system. Our results show that the proposed model can achieve a 98% decrease in the number of packets sent over the wireless link, and the fog node can still simulate the data stream with an acceptable accuracy of 97%. We also observe an end-to-end delay of 180 ms in our proposed three-layer framework. Hence, the framework shows that a combination of fog and cloud computing with a distributed data modeling at the sensor device for wireless sensor networks can be beneficial for Industrial Internet of Things applications.
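    The sketch below illustrates, with made-up names and a deliberately simple model, the pattern the abstract describes: the sensor node maintains a small local model and transmits only its parameters at long intervals, and the fog node regenerates an approximate data stream from them. It is not the authors' implementation.

      # Illustrative sketch of "send model parameters, not raw samples".
      import random

      class SensorNode:
          def __init__(self, report_every=100):
              self.buffer, self.report_every = [], report_every

          def ingest(self, value):
              self.buffer.append(value)
              if len(self.buffer) == self.report_every:
                  n = len(self.buffer)
                  mean = sum(self.buffer) / n
                  var = sum((v - mean) ** 2 for v in self.buffer) / n
                  self.buffer.clear()
                  return {"mean": mean, "var": var}   # one small packet per interval
              return None                             # nothing transmitted this sample

      def fog_simulate(params, n=100):
          # Fog node regenerates an approximate stream from the received parameters.
          return [random.gauss(params["mean"], params["var"] ** 0.5) for _ in range(n)]

      sensor, packet = SensorNode(), None
      for _ in range(100):
          packet = sensor.ingest(random.gauss(20.0, 0.5)) or packet
      reconstructed = fog_simulate(packet)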

  18. A New Framework to Compare Mass-Flux Schemes Within the AROME Numerical Weather Prediction Model

    NASA Astrophysics Data System (ADS)

    Riette, Sébastien; Lac, Christine

    2016-08-01

    In the Application of Research to Operations at Mesoscale (AROME) numerical weather forecast model used in operations at Météo-France, five mass-flux schemes are available to parametrize shallow convection at kilometre resolution. All but one are based on the eddy-diffusivity-mass-flux approach, and differ in entrainment/detrainment, the updraft vertical velocity equation and the closure assumption. The fifth is based on a more classical mass-flux approach. Screen-level scores obtained with these schemes show few discrepancies and are not sufficient to highlight behaviour differences. Here, we describe and use a new experimental framework, able to compare and discriminate among different schemes. For a year, daily forecast experiments were conducted over small domains centred on the five French metropolitan radio-sounding locations. Cloud base, planetary boundary-layer height and normalized vertical profiles of specific humidity, potential temperature, wind speed and cloud condensate were compared with observations, and with each other. The framework allowed the behaviour of the different schemes in and above the boundary layer to be characterized. In particular, the impact of the entrainment/detrainment formulation, closure assumption and cloud scheme were clearly visible. Differences mainly concerned the transport intensity thus allowing schemes to be separated into two groups, with stronger or weaker updrafts. In the AROME model (with all interactions and the possible existence of compensating errors), evaluation diagnostics gave the advantage to the first group.

  19. Combining Fog Computing with Sensor Mote Machine Learning for Industrial IoT

    PubMed Central

    Lavassani, Mehrzad; Jennehag, Ulf; Zhang, Tingting

    2018-01-01

    Digitalization is a global trend becoming ever more important to our connected and sustainable society. This trend also affects industry where the Industrial Internet of Things is an important part, and there is a need to conserve spectrum as well as energy when communicating data to a fog or cloud back-end system. In this paper we investigate the benefits of fog computing by proposing a novel distributed learning model on the sensor device and simulating the data stream in the fog, instead of transmitting all raw sensor values to the cloud back-end. To save energy and to communicate as few packets as possible, the updated parameters of the learned model at the sensor device are communicated in longer time intervals to a fog computing system. The proposed framework is implemented and tested in a real world testbed in order to make quantitative measurements and evaluate the system. Our results show that the proposed model can achieve a 98% decrease in the number of packets sent over the wireless link, and the fog node can still simulate the data stream with an acceptable accuracy of 97%. We also observe an end-to-end delay of 180 ms in our proposed three-layer framework. Hence, the framework shows that a combination of fog and cloud computing with a distributed data modeling at the sensor device for wireless sensor networks can be beneficial for Industrial Internet of Things applications. PMID:29757227

  20. A voting-based statistical cylinder detection framework applied to fallen tree mapping in terrestrial laser scanning point clouds

    NASA Astrophysics Data System (ADS)

    Polewski, Przemyslaw; Yao, Wei; Heurich, Marco; Krzystek, Peter; Stilla, Uwe

    2017-07-01

    This paper introduces a statistical framework for detecting cylindrical shapes in dense point clouds. We target the application of mapping fallen trees in datasets obtained through terrestrial laser scanning. This is a challenging task due to the presence of ground vegetation, standing trees, DTM artifacts, as well as the fragmentation of dead trees into non-collinear segments. Our method shares the concept of voting in parameter space with the generalized Hough transform, however two of its significant drawbacks are improved upon. First, the need to generate samples on the shape's surface is eliminated. Instead, pairs of nearby input points lying on the surface cast a vote for the cylinder's parameters based on the intrinsic geometric properties of cylindrical shapes. Second, no discretization of the parameter space is required: the voting is carried out in continuous space by means of constructing a kernel density estimator and obtaining its local maxima, using automatic, data-driven kernel bandwidth selection. Furthermore, we show how the detected cylindrical primitives can be efficiently merged to obtain object-level (entire tree) semantic information using graph-cut segmentation and a tailored dynamic algorithm for eliminating cylinder redundancy. Experiments were performed on 3 plots from the Bavarian Forest National Park, with ground truth obtained through visual inspection of the point clouds. It was found that relative to sample consensus (SAC) cylinder fitting, the proposed voting framework can improve the detection completeness by up to 10 percentage points while maintaining the correctness rate.
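    The hedged sketch below reduces the voting idea to a single parameter (a cylinder radius) to keep it short: votes are treated as samples in continuous parameter space, a kernel density estimator is built over them, and local maxima of the density are read off as detections. The full method votes in the complete cylinder parameter space with data-driven bandwidth selection.

      # Simplified, illustrative version of continuous-space voting with a KDE.
      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(1)
      votes = np.concatenate([rng.normal(0.15, 0.01, 300),   # votes from one stem
                              rng.normal(0.32, 0.02, 200),   # votes from another stem
                              rng.uniform(0.05, 0.5, 100)])  # clutter votes

      kde = gaussian_kde(votes)                  # kernel bandwidth chosen automatically
      grid = np.linspace(0.05, 0.5, 500)
      density = kde(grid)

      # Local maxima of the estimated density are the candidate cylinder radii.
      peaks = grid[1:-1][(density[1:-1] > density[:-2]) & (density[1:-1] > density[2:])]
      print("candidate radii (m):", np.round(peaks, 3))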

  1. Autonomic Management of Application Workflows on Hybrid Computing Infrastructure

    DOE PAGES

    Kim, Hyunjoo; el-Khamra, Yaakoub; Rodero, Ivan; ...

    2011-01-01

    In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its ability, we employ a workflow used to characterize an oil reservoir executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives, such as acceleration, conservation, and resilience, can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high-performance computing infrastructure.

  2. A Flexible Computational Framework Using R and Map-Reduce for Permutation Tests of Massive Genetic Analysis of Complex Traits.

    PubMed

    Mahjani, Behrang; Toor, Salman; Nettelblad, Carl; Holmgren, Sverker

    2017-01-01

    In quantitative trait locus (QTL) mapping, the significance of putative QTL is often determined using permutation testing. The computational needs to calculate the significance level are immense; 10^4 up to 10^8 or even more permutations can be needed. We have previously introduced the PruneDIRECT algorithm for multiple QTL scans with epistatic interactions. This algorithm has specific strengths for permutation testing. Here, we present a flexible, parallel computing framework for identifying multiple interacting QTL using the PruneDIRECT algorithm, which uses the map-reduce model as implemented in Hadoop. The framework is implemented in R, a widely used software tool among geneticists. This enables users to rearrange algorithmic steps to adapt genetic models, search algorithms, and parallelization steps to their needs in a flexible way. Our work underlines the maturity of accessing distributed parallel computing for computationally demanding bioinformatics applications through building workflows within existing scientific environments. We investigate the PruneDIRECT algorithm, comparing its performance to exhaustive search and the DIRECT algorithm using our framework on a public cloud resource. We find that PruneDIRECT is vastly superior for permutation testing, performing 2×10^5 permutations for a 2D QTL problem in 15 hours using 100 cloud processes. We show that our framework scales out almost linearly for a 3D QTL search.
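    The framework itself is written in R on Hadoop; the Python sketch below only illustrates the map-reduce shape of permutation testing, with the "map" step scoring independent permutations in parallel and the "reduce" step forming an empirical p-value. The data and the correlation statistic are toy stand-ins, not the PruneDIRECT scan.

      # Map-reduce shape of a permutation test (toy data, toy statistic).
      import numpy as np
      from multiprocessing import Pool

      rng = np.random.default_rng(0)
      genotype = rng.integers(0, 3, 500).astype(float)      # toy marker
      phenotype = 0.3 * genotype + rng.normal(0, 1, 500)    # toy trait

      def score(perm_seed):
          r = np.random.default_rng(perm_seed)
          shuffled = r.permutation(phenotype)               # break the association
          return abs(np.corrcoef(genotype, shuffled)[0, 1])

      observed = abs(np.corrcoef(genotype, phenotype)[0, 1])

      if __name__ == "__main__":
          with Pool() as pool:                              # "map": independent permutations
              null_scores = pool.map(score, range(10_000))
          # "reduce": empirical p-value from the null distribution
          p_value = (1 + sum(s >= observed for s in null_scores)) / (1 + len(null_scores))
          print(f"empirical p-value: {p_value:.4g}")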

  3. Cloud Based Metalearning System for Predictive Modeling of Biomedical Data

    PubMed Central

    Vukićević, Milan

    2014-01-01

    Rapid growth and storage of biomedical data have enabled many opportunities for predictive modeling and improvement of healthcare processes. On the other hand, the analysis of such large amounts of data is a difficult and computationally intensive task for most existing data mining algorithms. This problem is addressed by proposing a cloud-based system that integrates a metalearning framework for ranking and selection of the best predictive algorithms for the data at hand with open-source big data technologies for the analysis of biomedical data. PMID:24892101

  4. Sensitivity of Cirrus and Mixed-phase Clouds to the Ice Nuclei Spectra in McRAS-AC: Single Column Model Simulations

    NASA Technical Reports Server (NTRS)

    Betancourt, R. Morales; Lee, D.; Oreopoulos, L.; Sud, Y. C.; Barahona, D.; Nenes, A.

    2012-01-01

    The salient features of mixed-phase and ice clouds in a GCM cloud scheme are examined using the ice formation parameterizations of Liu and Penner (LP) and Barahona and Nenes (BN). The performance of the LP and BN ice nucleation parameterizations was assessed in the GEOS-5 AGCM using the McRAS-AC cloud microphysics framework in single-column mode. Four-dimensional assimilated data from the intensive observation period of the ARM TWP-ICE campaign were used to drive the fluxes and lateral forcing. Simulation experiments were established to test the impact of each parameterization on the resulting cloud fields. Three commonly used IN spectra were utilized in the BN parameterization to describe the availability of IN for heterogeneous ice nucleation. The results show large similarities in the cirrus cloud regime among all the schemes tested, in which ice crystal concentrations were within a factor of 10 regardless of the parameterization used. In mixed-phase clouds there are some persistent differences in cloud particle number concentration and size, as well as in cloud fraction, ice water mixing ratio, and ice water path. Contact freezing in the simulated mixed-phase clouds contributed to transferring liquid to ice efficiently, so that on average the clouds were fully glaciated at T ≈ 260 K, irrespective of the ice nucleation parameterization used. Comparisons of simulated ice water path to available satellite-derived observations were also performed, finding that all the schemes tested with the BN parameterization predicted average values of IWP within ±15% of the observations.

  5. Black carbon mixing state impacts on cloud microphysical properties: effects of aerosol plume and environmental conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ching, Ping Pui; Riemer, Nicole; West, Matthew

    2016-05-27

    Black carbon (BC) is usually mixed with other aerosol species within individual aerosol particles. This mixture, along with the particles' size and morphology, determines the particles' optical and cloud condensation nuclei properties, and hence black carbon's climate impacts. In this study the particle-resolved aerosol model PartMC-MOSAIC was used to quantify the importance of black carbon mixing state for predicting cloud microphysical quantities. Based on a set of about 100 cloud parcel simulations a process level analysis framework was developed to attribute the response in cloud microphysical properties to changes in the underlying aerosol population ("plume effect") and the cloud parcel cooling rate ("parcel effect"). It shows that the response of cloud droplet number concentration to changes in BC emissions depends on the BC mixing state. When the aerosol population contains mainly aged BC particles an increase in BC emission results in increasing cloud droplet number concentrations ("additive effect"). In contrast, when the aerosol population contains mainly fresh BC particles they act as sinks for condensable gaseous species, resulting in a decrease in cloud droplet number concentration as BC emissions are increased ("competition effect"). Additionally, we quantified the error in cloud microphysical quantities when neglecting the information on BC mixing state, which is often done in aerosol models. The errors ranged from -12% to +45% for the cloud droplet number fraction, from 0% to +1022% for the nucleation-scavenged black carbon (BC) mass fraction, from -12% to +4% for the effective radius, and from -30% to +60% for the relative dispersion.

  6. Investigating ice nucleation in cirrus clouds with an aerosol-enabled Multiscale Modeling Framework

    DOE PAGES

    Zhang, Chengzhu; Wang, Minghuai; Morrison, H.; ...

    2014-11-06

    In this study, an aerosol-dependent ice nucleation scheme [Liu and Penner, 2005] has been implemented in an aerosol-enabled multi-scale modeling framework (PNNL MMF) to study ice formation in upper troposphere cirrus clouds through both homogeneous and heterogeneous nucleation. The MMF model represents cloud scale processes by embedding a cloud-resolving model (CRM) within each vertical column of a GCM grid. By explicitly linking ice nucleation to aerosol number concentration, CRM-scale temperature, relative humidity and vertical velocity, the new MMF model simulates the persistent high ice supersaturation and low ice number concentration (10 to 100/L) at cirrus temperatures. The low ice number is attributed to the dominance of heterogeneous nucleation in ice formation. The new model simulates the observed shift of the ice supersaturation PDF towards higher values at low temperatures following the homogeneous nucleation threshold. The MMF models predict a higher frequency of midlatitude supersaturation in the Southern hemisphere and winter hemisphere, which is consistent with previous satellite and in-situ observations. It is shown that compared to a conventional GCM, the MMF is a more powerful model to emulate parameters that evolve over short time scales such as supersaturation. Sensitivity tests suggest that the simulated global distribution of ice clouds is sensitive to the ice nucleation schemes and the distribution of sulfate and dust aerosols. Simulations are also performed to test empirical parameters related to auto-conversion of ice crystals to snow. Results show that with a value of 250 μm for the critical diameter, Dcs, that distinguishes ice crystals from snow, the model can produce good agreement with the satellite retrieved products in terms of cloud ice water path and ice water content, while the total ice water is not sensitive to the specification of the Dcs value.

  7. CE-ACCE: The Cloud Enabled Advanced sCience Compute Environment

    NASA Astrophysics Data System (ADS)

    Cinquini, L.; Freeborn, D. J.; Hardman, S. H.; Wong, C.

    2017-12-01

    Traditionally, Earth Science data from NASA remote sensing instruments have been processed by building custom data processing pipelines (often based on a common workflow engine or framework), which are typically deployed and run on an internal cluster of computing resources. This approach has some intrinsic limitations: it requires each mission to develop and deploy a custom software package on top of the adopted framework; it makes use of dedicated hardware, network, and storage resources, which must be specifically purchased, maintained, and re-purposed at mission completion; and computing services cannot be scaled on demand beyond the capability of the available servers. More recently, the rise of cloud computing, coupled with other advances in containerization technology (most prominently, Docker) and micro-services architecture, has enabled a new paradigm whereby space mission data can be processed through standard system architectures, which can be seamlessly deployed and scaled on demand on either on-premises clusters or commercial cloud providers. In this talk, we will present one such architecture, named CE-ACCE ("Cloud Enabled Advanced sCience Compute Environment"), which we have been developing at the NASA Jet Propulsion Laboratory over the past year. CE-ACCE is based on the Apache OODT ("Object Oriented Data Technology") suite of services for full data lifecycle management, which is turned into a composable array of Docker images and complemented by a plug-in model for mission-specific customization. We have applied this infrastructure to both flying and upcoming NASA missions, such as ECOSTRESS and SMAP, and demonstrated deployment on the Amazon Cloud, either using simple EC2 instances or advanced AWS services such as AWS Lambda and ECS (EC2 Container Service).

  8. Assessing the CAM5 Physics Suite in the WRF-Chem Model: Implementation, Resolution Sensitivity, and a First Evaluation for a Regional Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Po-Lun; Rasch, Philip J.; Fast, Jerome D.

    A suite of physical parameterizations (deep and shallow convection, turbulent boundary layer, aerosols, cloud microphysics, and cloud fraction) from the global climate model Community Atmosphere Model version 5.1 (CAM5) has been implemented in the regional model Weather Research and Forecasting with chemistry (WRF-Chem). A downscaling modeling framework with consistent physics has also been established in which both global and regional simulations use the same emissions and surface fluxes. The WRF-Chem model with the CAM5 physics suite is run at multiple horizontal resolutions over a domain encompassing the northern Pacific Ocean, northeast Asia, and northwest North America for April 2008, when the ARCTAS, ARCPAC, and ISDAC field campaigns took place. These simulations are evaluated against field campaign measurements, satellite retrievals, and ground-based observations, and are compared with simulations that use a set of common WRF-Chem parameterizations. This manuscript describes the implementation of the CAM5 physics suite in WRF-Chem, provides an overview of the modeling framework and an initial evaluation of the simulated meteorology, clouds, and aerosols, and quantifies the resolution dependence of the cloud and aerosol parameterizations. We demonstrate that some of the CAM5 biases, such as high estimates of cloud susceptibility to aerosols and the underestimation of aerosol concentrations in the Arctic, can be reduced simply by increasing horizontal resolution. We also show that the CAM5 physics suite performs similarly to a set of parameterizations commonly used in WRF-Chem, but produces higher ice and liquid water condensate amounts and near-surface black carbon concentration. Further evaluations that use other mesoscale model parameterizations and perform other case studies are needed to infer whether one parameterization consistently produces results more consistent with observations.

  9. A Cloud-Based Simulation Architecture for Pandemic Influenza Simulation

    PubMed Central

    Eriksson, Henrik; Raciti, Massimiliano; Basile, Maurizio; Cunsolo, Alessandro; Fröberg, Anders; Leifler, Ola; Ekberg, Joakim; Timpka, Toomas

    2011-01-01

    High-fidelity simulations of pandemic outbreaks are resource consuming. Cluster-based solutions have been suggested for executing such complex computations. We present a cloud-based simulation architecture that utilizes computing resources both locally available and dynamically rented online. The approach uses the Condor framework for job distribution and management of the Amazon Elastic Compute Cloud (EC2) as well as local resources. The architecture has a web-based user interface that allows users to monitor and control simulation execution. In a benchmark test, the best cost-adjusted performance was recorded for the EC2 High-CPU Medium instance, while a field trial showed that the job configuration had significant influence on the execution time and that the network capacity of the master node could become a bottleneck. We conclude that it is possible to develop a scalable simulation environment that uses cloud-based solutions, while providing an easy-to-use graphical user interface. PMID:22195089

  10. The HEPiX Virtualisation Working Group: Towards a Grid of Clouds

    NASA Astrophysics Data System (ADS)

    Cass, Tony

    2012-12-01

    The use of virtual machine images, as for example with Cloud services such as Amazon's Elastic Compute Cloud, is attractive for users as they have a guaranteed execution environment, something that cannot today be provided across sites participating in computing grids such as the Worldwide LHC Computing Grid. However, Grid sites often operate within computer security frameworks which preclude the use of remotely generated images. The HEPiX Virtualisation Working Group was set up with the objective of enabling the use of remotely generated virtual machine images at Grid sites and, to this end, has introduced the idea of trusted virtual machine images, which are guaranteed to be secure and configurable by sites such that security policy commitments can be met. This paper describes the requirements and details of these trusted virtual machine images and presents a model for their use to facilitate the integration of Grid- and Cloud-based computing environments for High Energy Physics.

  11. Secure Genomic Computation through Site-Wise Encryption

    PubMed Central

    Zhao, Yongan; Wang, XiaoFeng; Tang, Haixu

    2015-01-01

    Commercial clouds provide on-demand IT services for big-data analysis, which have become an attractive option for users who have no access to comparable infrastructure. However, utilizing these services for human genome analysis is highly risky, as human genomic data contains identifiable information of human individuals and their disease susceptibility. Therefore, currently, no computation on personal human genomic data is conducted on public clouds. To address this issue, here we present a site-wise encryption approach to encrypt whole human genome sequences, which can be subject to secure searching of genomic signatures on public clouds. We implemented this method within the Hadoop framework, and tested it on the case of searching disease markers retrieved from the ClinVar database against patients’ genomic sequences. The secure search runs only one order of magnitude slower than the simple search without encryption, indicating our method is ready to be used for secure genomic computation on public clouds. PMID:26306278
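    The abstract does not specify the cryptographic construction, so the sketch below only illustrates the searchable-by-equality idea behind site-wise encryption: each (position, allele) site is passed through a keyed, deterministic transform, and a marker query is answered by transforming the marker the same way and testing membership. The authors' actual scheme may differ.

      # Illustration of equality-searchable, site-wise transformed genomic data.
      import hashlib
      import hmac

      KEY = b"per-study secret key"   # held by the data owner, never by the cloud

      def encrypt_site(position, allele):
          msg = f"{position}:{allele}".encode()
          return hmac.new(KEY, msg, hashlib.sha256).hexdigest()

      # Data owner uploads only the transformed sites to the cloud.
      genome_sites = {("chr1", 1234567, "A"), ("chr7", 55249071, "T")}
      encrypted_genome = {encrypt_site(f"{c}:{p}", a) for c, p, a in genome_sites}

      # Query: does the patient carry a known marker? Transform the marker the
      # same way and test membership; the cloud never sees plaintext sites.
      marker = ("chr7", 55249071, "T")
      hit = encrypt_site(f"{marker[0]}:{marker[1]}", marker[2]) in encrypted_genome
      print("marker present:", hit)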

  12. Initial Performance Assessment of CALIOP

    NASA Technical Reports Server (NTRS)

    Winker, David; Hunt, Bill; McGill, Matthew

    2007-01-01

    The Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP, pronounced the same as "calliope") is a spaceborne two-wavelength polarization lidar that has been acquiring global data since June 2006. CALIOP provides high resolution vertical profiles of clouds and aerosols, and has been designed with a very large linear dynamic range to encompass the full range of signal returns from aerosols and clouds. CALIOP is the primary instrument carried by the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) satellite, which was launched on April 28, 2006. CALIPSO was developed within the framework of a collaboration between NASA and the French space agency, CNES. Initial data analysis and validation intercomparisons indicate the quality of data from CALIOP meets or exceeds expectations. This paper presents a description of the CALIPSO mission, the CALIOP instrument, and an initial assessment of on-orbit measurement performance.

  13. Secure Genomic Computation through Site-Wise Encryption.

    PubMed

    Zhao, Yongan; Wang, XiaoFeng; Tang, Haixu

    2015-01-01

    Commercial clouds provide on-demand IT services for big-data analysis, which have become an attractive option for users who have no access to comparable infrastructure. However, utilizing these services for human genome analysis is highly risky, as human genomic data contains identifiable information of human individuals and their disease susceptibility. Therefore, currently, no computation on personal human genomic data is conducted on public clouds. To address this issue, here we present a site-wise encryption approach to encrypt whole human genome sequences, which can be subject to secure searching of genomic signatures on public clouds. We implemented this method within the Hadoop framework, and tested it on the case of searching disease markers retrieved from the ClinVar database against patients' genomic sequences. The secure search runs only one order of magnitude slower than the simple search without encryption, indicating our method is ready to be used for secure genomic computation on public clouds.

  14. A Framework for Applying Point Clouds Grabbed by Multi-Beam LIDAR in Perceiving the Driving Environment

    PubMed Central

    Liu, Jian; Liang, Huawei; Wang, Zhiling; Chen, Xiangcheng

    2015-01-01

    The quick and accurate understanding of the ambient environment, which is composed of road curbs, vehicles, pedestrians, etc., is critical for developing intelligent vehicles. The road elements included in this work are road curbs and dynamic road obstacles that directly affect the drivable area. A framework for the online modeling of the driving environment using a multi-beam LIDAR, i.e., a Velodyne HDL-64E LIDAR, which describes the 3D environment in the form of a point cloud, is reported in this article. First, ground segmentation is performed via multi-feature extraction of the raw data grabbed by the Velodyne LIDAR to satisfy the requirement of online environment modeling. Curbs and dynamic road obstacles are detected and tracked in different manners. Curves are fitted for curb points, and points are clustered into bundles whose form and kinematic parameters are calculated. The Kalman filter is used to track dynamic obstacles, whereas the snake model is employed for curbs. Results indicate that the proposed framework is robust under various environments and satisfies the requirements for online processing. PMID:26404290
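    As a minimal illustration of the tracking step named above, the sketch below runs a constant-velocity Kalman filter on one obstacle centroid across a few LIDAR frames; the state layout, noise levels, and measurements are placeholder choices rather than the paper's settings.

      # Constant-velocity Kalman filter for one obstacle centroid (x, y).
      import numpy as np

      dt = 0.1                                   # frame interval (s)
      F = np.array([[1, 0, dt, 0],               # state: [x, y, vx, vy]
                    [0, 1, 0, dt],
                    [0, 0, 1, 0],
                    [0, 0, 0, 1]], float)
      H = np.array([[1, 0, 0, 0],                # only position is measured
                    [0, 1, 0, 0]], float)
      Q = 0.01 * np.eye(4)                       # process noise
      R = 0.05 * np.eye(2)                       # measurement noise

      x, P = np.zeros(4), np.eye(4)              # initial state and covariance

      for z in [np.array([1.0, 0.5]), np.array([1.2, 0.6]), np.array([1.4, 0.7])]:
          # predict
          x, P = F @ x, F @ P @ F.T + Q
          # update with the measured centroid z
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)
          x = x + K @ (z - H @ x)
          P = (np.eye(4) - K @ H) @ P

      print("estimated position and velocity:", np.round(x, 3))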

  15. Development of a High Resolution Weather Forecast Model for Mesoamerica Using the NASA Nebula Cloud Computing Environment

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew L.; Case, Jonathan L.; Venner, Jason; Moreno-Madrinan, Max. J.; Delgado, Francisco

    2012-01-01

    Over the past two years, scientists in the Earth Science Office at NASA's Marshall Space Flight Center (MSFC) have explored opportunities to apply cloud computing concepts to support near real-time weather forecast modeling via the Weather Research and Forecasting (WRF) model. Collaborators at NASA's Short-term Prediction Research and Transition (SPoRT) Center and the SERVIR project at Marshall Space Flight Center have established a framework that provides high resolution, daily weather forecasts over Mesoamerica through use of the NASA Nebula Cloud Computing Platform at Ames Research Center. Supported by experts at Ames, staff at SPoRT and SERVIR have established daily forecasts complete with web graphics and a user interface that allows SERVIR partners access to high resolution depictions of weather in the next 48 hours, useful for monitoring and mitigating meteorological hazards such as thunderstorms, heavy precipitation, and tropical weather that can lead to other disasters such as flooding and landslides. This presentation will describe the framework for establishing and providing WRF forecasts, example applications of output provided via the SERVIR web portal, and early results of forecast model verification against available surface- and satellite-based observations.

  16. Structural and parametric uncertainty quantification in cloud microphysics parameterization schemes

    NASA Astrophysics Data System (ADS)

    van Lier-Walqui, M.; Morrison, H.; Kumjian, M. R.; Prat, O. P.; Martinkus, C.

    2017-12-01

    Atmospheric model parameterization schemes employ approximations to represent the effects of unresolved processes. These approximations are a source of error in forecasts, caused in part by considerable uncertainty about the optimal value of parameters within each scheme (parametric uncertainty). Furthermore, there is uncertainty regarding the best choice of the overarching structure of the parameterization scheme (structural uncertainty). Parameter estimation can constrain the first, but may struggle with the second because structural choices are typically discrete. We address this problem in the context of cloud microphysics parameterization schemes by creating a flexible framework wherein structural and parametric uncertainties can be simultaneously constrained. Our scheme makes no assumptions about drop size distribution shape or the functional form of parameterized process rate terms. Instead, these uncertainties are constrained by observations using a Markov Chain Monte Carlo sampler within a Bayesian inference framework. Our scheme, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), has the flexibility to predict various sets of prognostic drop size distribution moments as well as varying complexity of process rate formulations. We compare idealized probabilistic forecasts from versions of BOSS with varying levels of structural complexity. This work has applications in ensemble forecasts with model physics uncertainty, data assimilation, and cloud microphysics process studies.
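    The sketch below shows, in a deliberately tiny setting, the Markov Chain Monte Carlo machinery the abstract refers to: a random-walk Metropolis sampler constrains a single process-rate parameter against synthetic observations. BOSS itself samples over prognostic moments and process-rate formulations of varying complexity; none of the numbers or the rate law here come from the paper.

      # Toy random-walk Metropolis sampler for one process-rate parameter `a`.
      import numpy as np

      rng = np.random.default_rng(0)
      q = np.linspace(0.1, 1.0, 20)                      # cloud water (toy units)
      obs = 2.0 * q**1.5 + rng.normal(0, 0.05, q.size)   # synthetic "observations", true a = 2

      def log_posterior(a, sigma=0.05):
          if a <= 0:
              return -np.inf                             # flat prior restricted to a > 0
          resid = obs - a * q**1.5                       # made-up rate law: rate = a * q**1.5
          return -0.5 * np.sum((resid / sigma) ** 2)

      samples, a = [], 1.0
      logp = log_posterior(a)
      for _ in range(5000):
          a_new = a + rng.normal(0, 0.05)                # random-walk proposal
          logp_new = log_posterior(a_new)
          if np.log(rng.random()) < logp_new - logp:     # Metropolis acceptance step
              a, logp = a_new, logp_new
          samples.append(a)

      print("posterior mean of a:", np.mean(samples[1000:]))   # burn-in discarded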

  17. Machine learning patterns for neuroimaging-genetic studies in the cloud.

    PubMed

    Da Mota, Benoit; Tudoran, Radu; Costan, Alexandru; Varoquaux, Gaël; Brasche, Goetz; Conrod, Patricia; Lemaitre, Herve; Paus, Tomas; Rietschel, Marcella; Frouin, Vincent; Poline, Jean-Baptiste; Antoniu, Gabriel; Thirion, Bertrand

    2014-01-01

    Brain imaging is a natural intermediate phenotype to understand the link between genetic information and behavior or brain pathologies risk factors. Massive efforts have been made in the last few years to acquire high-dimensional neuroimaging and genetic data on large cohorts of subjects. The statistical analysis of such data is carried out with increasingly sophisticated techniques and represents a great computational challenge. Fortunately, increasing computational power in distributed architectures can be harnessed, if new neuroinformatics infrastructures are designed and training to use these new tools is provided. Combining a MapReduce framework (TomusBLOB) with machine learning algorithms (Scikit-learn library), we design a scalable analysis tool that can deal with non-parametric statistics on high-dimensional data. End-users describe the statistical procedure to perform and can then test the model on their own computers before running the very same code in the cloud at a larger scale. We illustrate the potential of our approach on real data with an experiment showing how the functional signal in subcortical brain regions can be significantly fit with genome-wide genotypes. This experiment demonstrates the scalability and the reliability of our framework in the cloud with a two-week deployment on hundreds of virtual machines.

  18. Development of a High Resolution Weather Forecast Model for Mesoamerica Using the NASA Nebula Cloud Computing Environment

    NASA Astrophysics Data System (ADS)

    Molthan, A.; Case, J.; Venner, J.; Moreno-Madriñán, M. J.; Delgado, F.

    2012-12-01

    Over the past two years, scientists in the Earth Science Office at NASA's Marshall Space Flight Center (MSFC) have explored opportunities to apply cloud computing concepts to support near real-time weather forecast modeling via the Weather Research and Forecasting (WRF) model. Collaborators at NASA's Short-term Prediction Research and Transition (SPoRT) Center and the SERVIR project at Marshall Space Flight Center have established a framework that provides high resolution, daily weather forecasts over Mesoamerica through use of the NASA Nebula Cloud Computing Platform at Ames Research Center. Supported by experts at Ames, staff at SPoRT and SERVIR have established daily forecasts complete with web graphics and a user interface that allows SERVIR partners access to high resolution depictions of weather in the next 48 hours, useful for monitoring and mitigating meteorological hazards such as thunderstorms, heavy precipitation, and tropical weather that can lead to other disasters such as flooding and landslides. This presentation will describe the framework for establishing and providing WRF forecasts, example applications of output provided via the SERVIR web portal, and early results of forecast model verification against available surface- and satellite-based observations.

  19. CGILS: Results from the First Phase of an International Project to Understand the Physical Mechanisms of Low Cloud Feedbacks in Single Column Models

    NASA Technical Reports Server (NTRS)

    Zhang, Minghua; Bretherton, Christopher S.; Blossey, Peter N.; Austin, Phillip H.; Bacmeister, Julio T.; Bony, Sandrine; Brient, Florent; Cheedela, Suvarchal K.; Cheng, Anning; DelGenio, Anthony; hide

    2013-01-01

    CGILS, the CFMIP-GASS Intercomparison of Large-Eddy Models (LES) and Single-Column Models (SCMs), investigates the mechanisms of cloud feedback in SCMs and LES models under an idealized climate change perturbation. This paper describes the CGILS results from 15 SCMs and 8 LES models. Three cloud regimes over the subtropical oceans are studied: shallow cumulus, cumulus under stratocumulus, and well-mixed coastal stratus/stratocumulus. In the stratocumulus and coastal stratus regimes, SCMs without activated shallow convection generally simulated negative cloud feedbacks, while models with active shallow convection generally simulated positive cloud feedbacks. In the shallow cumulus regime alone, this relationship is less clear, likely due to changes in cloud depth, lateral mixing, and precipitation, or a combination of them. The majority of LES models simulated negative cloud feedback in the well-mixed coastal stratus/stratocumulus regime, and positive feedback in the shallow cumulus and stratocumulus regime. A general framework is provided to interpret the SCM results: in a warmer climate, the moistening rate of the cloudy layer associated with the surface-based turbulence parameterization is enhanced; together with weaker large-scale subsidence, this causes negative cloud feedback. In contrast, in the warmer climate, the drying rate associated with the shallow convection scheme is enhanced, which causes positive cloud feedback. These mechanisms are summarized as the NESTS negative cloud feedback (Negative feedback from Surface Turbulence under weaker Subsidence) and the SCOPE positive cloud feedback (Shallow Convection PositivE feedback), with the net cloud feedback depending on how the two opposing effects counteract each other. The LES results are consistent with these interpretations.

  20. Using In Situ Observations and Satellite Retrievals to Constrain Large-Eddy Simulations and Single-Column Simulations: Implications for Boundary-Layer Cloud Parameterization in the NASA GISS GCM

    NASA Astrophysics Data System (ADS)

    Remillard, J.

    2015-12-01

    Two low-cloud periods from the CAP-MBL deployment of the ARM Mobile Facility at the Azores are selected through a cluster analysis of ISCCP cloud property matrices, so as to represent two low-cloud weather states that the GISS GCM severely underpredicts not only in that region but also globally. The two cases represent (1) shallow cumulus clouds occurring in a cold-air outbreak behind a cold front, and (2) stratocumulus clouds occurring when the region was dominated by a high-pressure system. Observations and MERRA reanalysis are used to derive specifications used for large-eddy simulations (LES) and single-column model (SCM) simulations. The LES captures the major differences in horizontal structure between the two low-cloud fields, but there are unconstrained uncertainties in cloud microphysics and challenges in reproducing W-band Doppler radar moments. The SCM run on the vertical grid used for CMIP-5 runs of the GCM does a poor job of representing the shallow cumulus case and is unable to maintain an overcast deck in the stratocumulus case, providing some clues regarding problems with low-cloud representation in the GCM. SCM sensitivity tests with a finer vertical grid in the boundary layer show substantial improvement in the representation of cloud amount for both cases. GCM simulations with CMIP-5 versus finer vertical gridding in the boundary layer are compared with observations. The adoption of a two-moment cloud microphysics scheme in the GCM is also tested in this framework. The methodology followed in this study, with the process-based examination of different time and space scales in both models and observations, represents a prototype for GCM cloud parameterization improvements.

  1. Testing cloud microphysics parameterizations in NCAR CAM5 with ISDAC and M-PACE observations

    NASA Astrophysics Data System (ADS)

    Liu, Xiaohong; Xie, Shaocheng; Boyle, James; Klein, Stephen A.; Shi, Xiangjun; Wang, Zhien; Lin, Wuyin; Ghan, Steven J.; Earle, Michael; Liu, Peter S. K.; Zelenyuk, Alla

    2011-01-01

    Arctic clouds simulated by the National Center for Atmospheric Research (NCAR) Community Atmospheric Model version 5 (CAM5) are evaluated with observations from the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Indirect and Semi-Direct Aerosol Campaign (ISDAC) and Mixed-Phase Arctic Cloud Experiment (M-PACE), which were conducted at its North Slope of Alaska site in April 2008 and October 2004, respectively. Model forecasts for the Arctic spring and fall seasons performed under the Cloud-Associated Parameterizations Testbed framework generally reproduce the spatial distributions of cloud fraction for single-layer boundary-layer mixed-phase stratocumulus and multilayer or deep frontal clouds. However, for low-level stratocumulus, the model significantly underestimates the observed cloud liquid water content in both seasons. As a result, CAM5 significantly underestimates the surface downward longwave radiative fluxes by 20-40 W m-2. Introducing a new ice nucleation parameterization slightly improves the model performance for low-level mixed-phase clouds by increasing cloud liquid water content through the reduction of the conversion rate from cloud liquid to ice by the Wegener-Bergeron-Findeisen process. The CAM5 single-column model testing shows that changing the instantaneous freezing temperature of rain to form snow from -5°C to -40°C causes a large increase in modeled cloud liquid water content through the slowing down of cloud liquid and rain-related processes (e.g., autoconversion of cloud liquid to rain). The underestimation of aerosol concentrations in CAM5 in the Arctic also plays an important role in the low bias of cloud liquid water in the single-layer mixed-phase clouds. In addition, numerical issues related to the coupling of model physics and time stepping in CAM5 are responsible for the model biases and will be explored in future studies.

  2. A method for quantifying cloud immersion in a tropical mountain forest using time-lapse photography

    USGS Publications Warehouse

    Bassiouni, Maoya; Scholl, Martha A.; Torres-Sanchez, Angel J.; Murphy, Sheila F.

    2017-01-01

    Quantifying the frequency, duration, and elevation range of fog or cloud immersion is essential for estimating cloud water deposition in water budgets and for understanding the ecohydrology of cloud forests. The goal of this study was to develop a low-cost, high-spatial-coverage method to detect the occurrence of cloud immersion within a mountain cloud forest using time-lapse photography. Trail cameras and temperature/relative humidity sensors were deployed at five sites covering the elevation range from the assumed lifting condensation level to the mountain peaks in the Luquillo Mountains of Puerto Rico. Cloud-sensitive image characteristics (contrast, the coefficient of variation and the entropy of pixel luminance, and image colorfulness) were used with a k-means clustering approach to accurately detect cloud-immersed conditions in a time series of images from March 2014 to May 2016. Images provided hydrologically meaningful cloud-immersion information, while temperature-relative humidity data were used to refine the image analysis using dew point information and provided temperature gradients along the elevation transect. Validation of the image processing method against human-judgment-based classification generally indicated greater than 90% accuracy. Cloud-immersion frequency averaged 80% at sites above 900 m during nighttime hours and 49% during daytime hours, and was consistent with diurnal patterns of cloud immersion measured in a previous study. Results for the 617 m site demonstrated that cloud immersion in the Luquillo Mountains rarely occurs at the previously reported cloud base elevation of about 600 m (11% during nighttime hours and 5% during daytime hours). The framework presented in this paper will be used to monitor, at low cost and high spatial resolution, the long-term variability of cloud-immersion patterns in the Luquillo Mountains, and can be applied to ecohydrology research at other cloud-forest sites or in coastal ecosystems with advective sea fog.
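    A hedged sketch of the image-classification step is given below: per-image contrast, coefficient of variation, luminance entropy, and a crude colorfulness proxy are computed and fed to a two-cluster k-means, with the cloud-immersed cluster later identified from the humidity/dew-point record. The feature formulas are stand-ins, not the paper's exact definitions, and the images are synthetic so the snippet runs as-is.

      # Toy version of cloud-sensitive image features + two-cluster k-means.
      import numpy as np
      from sklearn.cluster import KMeans

      def image_features(rgb):                       # rgb: H x W x 3, values in 0..1
          lum = rgb.mean(axis=2)
          contrast = lum.max() - lum.min()
          coef_var = lum.std() / (lum.mean() + 1e-9)
          hist, _ = np.histogram(lum, bins=32, range=(0, 1))
          p = hist[hist > 0] / hist.sum()
          entropy = -np.sum(p * np.log2(p))
          colorfulness = np.std(rgb[..., 0] - rgb[..., 1])   # crude proxy
          return [contrast, coef_var, entropy, colorfulness]

      rng = np.random.default_rng(0)
      foggy = [np.clip(rng.normal(0.8, 0.02, (64, 64, 3)), 0, 1) for _ in range(20)]
      clear = [np.clip(rng.normal(0.5, 0.20, (64, 64, 3)), 0, 1) for _ in range(20)]
      X = np.array([image_features(im) for im in foggy + clear])

      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
      # One cluster corresponds to cloud-immersed (low-contrast, low-variance)
      # images; it would be identified using the dew-point/humidity record.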

  3. ATLAS user analysis on private cloud resources at GoeGrid

    NASA Astrophysics Data System (ADS)

    Glaser, F.; Nadal Serrano, J.; Grabowski, J.; Quadt, A.

    2015-12-01

    User analysis job demands can exceed available computing resources, especially before major conferences. ATLAS physics results can potentially be slowed down due to the lack of resources. For these reasons, cloud research and development activities are now included in the skeleton of the ATLAS computing model, which has been extended by using resources from commercial and private cloud providers to satisfy the demands. However, most of these activities are focused on Monte-Carlo production jobs, extending the resources at Tier-2. To evaluate the suitability of the cloud-computing model for user analysis jobs, we developed a framework to launch an ATLAS user analysis cluster in a cloud infrastructure on demand and evaluated two solutions. The first solution is entirely integrated in the Grid infrastructure, using the same mechanism already in use at Tier-2: a designated PanDA queue is monitored, and additional worker nodes are launched in a cloud environment and assigned to a corresponding HTCondor queue according to demand. Thereby, the use of cloud resources is completely transparent to the user. However, using this approach, submitted user analysis jobs can still suffer from a certain delay introduced by waiting time in the queue, and the deployed infrastructure lacks customizability. Therefore, our second solution offers the possibility to easily deploy a totally private, customizable analysis cluster on private cloud resources belonging to the university.
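
    The first solution amounts to a queue-driven elastic provisioning loop: watch the depth of the designated queue and grow or shrink the pool of cloud worker nodes to match. The sketch below shows that control logic only; launch_worker, terminate_worker, and the sizing constants are hypothetical stand-ins, not the PanDA, HTCondor, or cloud-provider APIs used by the authors.

```python
# Schematic sketch of queue-driven elastic provisioning. All helpers and
# constants are hypothetical placeholders, not the actual PanDA/HTCondor
# or cloud APIs described in the paper.
TARGET_JOBS_PER_WORKER = 8   # illustrative sizing assumption
MAX_WORKERS = 50

def scale_cluster(pending_jobs, running_workers, launch_worker, terminate_worker):
    """Launch or release cloud worker nodes so capacity tracks queue demand."""
    wanted = min(MAX_WORKERS, -(-pending_jobs // TARGET_JOBS_PER_WORKER))  # ceiling division
    if wanted > running_workers:
        for _ in range(wanted - running_workers):
            launch_worker()          # e.g. boot a VM and register it with the batch queue
    elif wanted < running_workers:
        for _ in range(running_workers - wanted):
            terminate_worker()       # drain and release an idle node

# A monitoring daemon would call this periodically, e.g. once per minute:
#     scale_cluster(query_queue_depth(), count_workers(), launch_worker, terminate_worker)
```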

  4. A Self-consistent Cloud Model for Brown Dwarfs and Young Giant Exoplanets: Comparison with Photometric and Spectroscopic Observations

    NASA Astrophysics Data System (ADS)

    Charnay, B.; Bézard, B.; Baudino, J.-L.; Bonnefoy, M.; Boccaletti, A.; Galicher, R.

    2018-02-01

    We developed a simple, physical, and self-consistent cloud model for brown dwarfs and young giant exoplanets. We compared different parametrizations for the cloud particle size, by fixing either particle radii or the mixing efficiency (parameter f_sed), or by estimating particle radii from simple microphysics. The cloud scheme with simple microphysics appears to be the best parametrization by successfully reproducing the observed photometry and spectra of brown dwarfs and young giant exoplanets. In particular, it reproduces the L–T transition, due to the condensation of silicate and iron clouds below the visible/near-IR photosphere. It also reproduces the reddening observed for low-gravity objects, due to an increase of cloud optical depth for low gravity. In addition, we found that the cloud greenhouse effect shifts chemical equilibrium, increasing the abundances of species stable at high temperature. This effect should significantly contribute to the strong variation of methane abundance at the L–T transition and to the methane depletion observed on young exoplanets. Finally, we predict the existence of a continuum of brown dwarfs and exoplanets for absolute J magnitude = 15–18 and J-K color = 0–3, due to the evolution of the L–T transition with gravity. This self-consistent model therefore provides a general framework to understand the effects of clouds and appears well-suited for atmospheric retrievals.

  5. Cloud Simulations in Response to Turbulence Parameterizations in the GISS Model E GCM

    NASA Technical Reports Server (NTRS)

    Yao, Mao-Sung; Cheng, Ye

    2013-01-01

    The response of cloud simulations to turbulence parameterizations is studied systematically using the GISS general circulation model (GCM) E2 employed in the Intergovernmental Panel on Climate Change's (IPCC) Fifth Assessment Report (AR5). Without the turbulence parameterization, the relative humidity (RH) and the low cloud cover peak unrealistically close to the surface; with the dry convection or with only the local turbulence parameterization, these two quantities improve their vertical structures, but the vertical transport of water vapor is still weak in the planetary boundary layers (PBLs); with both local and nonlocal turbulence parameterizations, the RH and low cloud cover have better vertical structures at all latitudes due to more significant vertical transport of water vapor in the PBL. The study also compares the cloud and radiation climatologies obtained from an experiment using a newer version of turbulence parameterization being developed at GISS with those obtained from the AR5 version. This newer scheme differs from the AR5 version in computing nonlocal transports, turbulent length scale, and PBL height and shows significant improvements in cloud and radiation simulations, especially over the subtropical eastern oceans and the southern oceans. The diagnosed PBL heights appear to correlate well with the low cloud distribution over oceans. This suggests that a cloud-producing scheme needs to be constructed in a framework that also takes the turbulence into consideration.

  6. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road.

    PubMed

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible.
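
    The decision-support step described above maps diagnostic answers to a prioritized migration plan. The toy sketch below shows one way such a mapping could look; the business areas, questions, and weights are invented for illustration and are not the authors' actual diagnostic or scoring scheme.

```python
# Toy sketch of a questionnaire-driven recommendation step. The areas,
# questions, and weights are invented; they are not the paper's diagnostic.
def cloud_road(answers):
    """answers: dict mapping business area -> dict of yes/no diagnostic answers."""
    scores = {}
    for area, diag in answers.items():
        # Favor areas with standardized processes, low data sensitivity,
        # and no heavy on-premise system already in place (illustrative rule).
        scores[area] = (2 * diag.get("standard_process", False)
                        + 1 * (not diag.get("sensitive_data", True))
                        + 1 * (not diag.get("existing_onprem_system", True)))
    # The "Cloud Road" is then the ordered list of areas to move to SaaS first.
    return sorted(scores, key=scores.get, reverse=True)

print(cloud_road({
    "email_and_collaboration": {"standard_process": True, "sensitive_data": False,
                                "existing_onprem_system": False},
    "accounting": {"standard_process": True, "sensitive_data": True,
                   "existing_onprem_system": True},
}))
```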

  7. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road

    PubMed Central

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on “on-demand payment” for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible. PMID:26230400

  8. Dynamics of aging magnetic clouds. [interacted with solar wind

    NASA Technical Reports Server (NTRS)

    Osherovich, V. A.; Farrugia, C. J.; Burlaga, L. F.

    1993-01-01

    The dynamics of radially expanding magnetic clouds is rigorously analyzed within the framework of ideal MHD. The cloud is modelled as a cylindrically symmetric magnetic flux rope. In the force balance we include the gas pressure gradient and the Lorentz force. Interaction with the ambient solar wind due to expansion of the magnetic cloud is represented by a drag force proportional to the bulk velocity. We consider the self-similar expansion of a polytrope, and reduce the problem to an ordinary nonlinear differential equation for the evolution function. Analyzing the asymptotic behavior of the evolution function, we formulate theoretical expectations for the long-term behavior of cloud parameters. We focus on the temporal evolution of (1) the magnetic field strength; (2) the twist of the field lines; (3) the asymmetry of the total field profile; and (4) the bulk flow speed. We present data from two magnetic clouds observed at 1 AU and 2 AU, respectively, and find good agreement with theoretical expectations. For a peak magnetic field strength at 1 AU of 25 nT and a polytropic index of 0.5, we find that a magnetic cloud can be distinguished from the background interplanetary field up to a distance of about 5 AU. Taking larger magnetic fields and bigger polytropic indices this distance can double.

  9. Assessing the effects of anthropogenic aerosols on Pacific storm track using a multiscale global climate model

    PubMed Central

    Wang, Yuan; Wang, Minghuai; Zhang, Renyi; Ghan, Steven J.; Lin, Yun; Hu, Jiaxi; Pan, Bowen; Levy, Misti; Jiang, Jonathan H.; Molina, Mario J.

    2014-01-01

    Atmospheric aerosols affect weather and global general circulation by modifying cloud and precipitation processes, but the magnitude of cloud adjustment by aerosols remains poorly quantified and represents the largest uncertainty in estimated forcing of climate change. Here we assess the effects of anthropogenic aerosols on the Pacific storm track, using a multiscale global aerosol–climate model (GCM). Simulations of two aerosol scenarios corresponding to the present day and preindustrial conditions reveal long-range transport of anthropogenic aerosols across the north Pacific and large resulting changes in the aerosol optical depth, cloud droplet number concentration, and cloud and ice water paths. Shortwave and longwave cloud radiative forcing at the top of atmosphere are changed by −2.5 and +1.3 W m−2, respectively, by emission changes from preindustrial to present day, and an increased cloud top height indicates invigorated midlatitude cyclones. The overall increased precipitation and poleward heat transport reflect intensification of the Pacific storm track by anthropogenic aerosols. Hence, this work provides, for the first time to the authors’ knowledge, a global perspective of the effects of Asian pollution outflows from GCMs. Furthermore, our results suggest that the multiscale modeling framework is essential in producing the aerosol invigoration effect of deep convective clouds on a global scale. PMID:24733923

  10. Star formation in evolving molecular clouds

    NASA Astrophysics Data System (ADS)

    Völschow, M.; Banerjee, R.; Körtgen, B.

    2017-09-01

    Molecular clouds are the principal stellar nurseries of our universe; they thus remain a focus of both observational and theoretical studies. From observations, some of the key properties of molecular clouds are well known but many questions regarding their evolution and star formation activity remain open. While numerical simulations feature a large number and complexity of involved physical processes, this plethora of effects may hide the fundamentals that determine the evolution of molecular clouds and enable the formation of stars. Purely analytical models, on the other hand, tend to suffer from rough approximations or a lack of completeness, limiting their predictive power. In this paper, we present a model that incorporates central concepts of astrophysics as well as reliable results from recent simulations of molecular clouds and their evolutionary paths. Based on that, we construct a self-consistent semi-analytical framework that describes the formation, evolution, and star formation activity of molecular clouds, including a number of feedback effects to account for the complex processes inside those objects. The final equation system is solved numerically but at much lower computational expense than, for example, hydrodynamical descriptions of comparable systems. The model presented in this paper agrees well with a broad range of observational results, showing that molecular cloud evolution can be understood as an interplay between accretion, global collapse, star formation, and stellar feedback.
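
    The computational appeal of such a semi-analytical framework is that it reduces to a small set of coupled ordinary differential equations. The toy system below only illustrates that structure (constant accretion, star formation on an assumed collapse timescale, and feedback-driven gas removal, with invented coefficients); it is not the paper's actual equation system.

```python
# Toy ODE system standing in for a semi-analytical cloud-evolution model.
# All terms and coefficients are placeholder assumptions, not the paper's equations.
from scipy.integrate import solve_ivp

ACCRETION = 100.0      # Msun / Myr, assumed constant inflow
EFF = 0.02             # assumed star-formation efficiency per collapse timescale
T_COLLAPSE = 5.0       # Myr, assumed effective collapse timescale
FEEDBACK = 0.3         # assumed gas mass dispersed per unit mass of stars formed

def rhs(t, y):
    m_cloud, m_stars = y
    sfr = EFF * m_cloud / T_COLLAPSE      # stars formed from cloud gas
    dispersal = FEEDBACK * sfr            # stellar feedback removes gas
    return [ACCRETION - sfr - dispersal,  # d(cloud mass)/dt
            sfr]                          # d(stellar mass)/dt

sol = solve_ivp(rhs, t_span=(0.0, 30.0), y0=[1.0e4, 0.0], max_step=0.1)
print(f"final cloud mass {sol.y[0, -1]:.0f} Msun, stellar mass {sol.y[1, -1]:.0f} Msun")
```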

  11. Assessing the effects of anthropogenic aerosols on Pacific storm track using a multiscale global climate model.

    PubMed

    Wang, Yuan; Wang, Minghuai; Zhang, Renyi; Ghan, Steven J; Lin, Yun; Hu, Jiaxi; Pan, Bowen; Levy, Misti; Jiang, Jonathan H; Molina, Mario J

    2014-05-13

    Atmospheric aerosols affect weather and global general circulation by modifying cloud and precipitation processes, but the magnitude of cloud adjustment by aerosols remains poorly quantified and represents the largest uncertainty in estimated forcing of climate change. Here we assess the effects of anthropogenic aerosols on the Pacific storm track, using a multiscale global aerosol-climate model (GCM). Simulations of two aerosol scenarios corresponding to the present day and preindustrial conditions reveal long-range transport of anthropogenic aerosols across the north Pacific and large resulting changes in the aerosol optical depth, cloud droplet number concentration, and cloud and ice water paths. Shortwave and longwave cloud radiative forcing at the top of atmosphere are changed by -2.5 and +1.3 W m(-2), respectively, by emission changes from preindustrial to present day, and an increased cloud top height indicates invigorated midlatitude cyclones. The overall increased precipitation and poleward heat transport reflect intensification of the Pacific storm track by anthropogenic aerosols. Hence, this work provides, for the first time to the authors' knowledge, a global perspective of the effects of Asian pollution outflows from GCMs. Furthermore, our results suggest that the multiscale modeling framework is essential in producing the aerosol invigoration effect of deep convective clouds on a global scale.

  12. Bootstrapping Development of a Cloud-Based Spoken Dialog System in the Educational Domain from Scratch Using Crowdsourced Data. Research Report. ETS RR-16-16

    ERIC Educational Resources Information Center

    Ramanarayanan, Vikram; Suendermann-Oeft, David; Lange, Patrick; Ivanov, Alexei V.; Evanini, Keelan; Yu, Zhou; Tsuprun, Eugene; Qian, Yao

    2016-01-01

    We propose a crowdsourcing-based framework to iteratively and rapidly bootstrap a dialog system from scratch for a new domain. We leverage the open-source modular HALEF dialog system to deploy dialog applications. We illustrate the usefulness of this framework using four different prototype dialog items with applications in the educational domain…

  13. A theoretical framework for modeling dilution enhancement of non-reactive solutes in heterogeneous porous media.

    PubMed

    de Barros, F P J; Fiori, A; Boso, F; Bellin, A

    2015-01-01

    Spatial heterogeneity of the hydraulic properties of geological porous formations leads to erratically shaped solute clouds, thus increasing the edge area of the solute body and augmenting the dilution rate. In this study, we provide a theoretical framework to quantify dilution of a non-reactive solute within a steady state flow as affected by the spatial variability of the hydraulic conductivity. Embracing the Lagrangian concentration framework, we obtain explicit semi-analytical expressions for the dilution index as a function of the structural parameters of the random hydraulic conductivity field, under the assumptions of uniform-in-the-average flow, small injection source and weak-to-mild heterogeneity. Results show how the dilution enhancement of the solute cloud is strongly dependent on both the statistical anisotropy ratio and the heterogeneity level of the porous medium. The explicit semi-analytical solution also captures the temporal evolution of the dilution rate; for the early- and late-time limits, the proposed solution recovers previous results from the literature, while at intermediate times it reflects the increasing interplay between large-scale advection and local-scale dispersion. The performance of the theoretical framework is verified with high resolution numerical results and successfully tested against the Cape Cod field data.
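
    The dilution index used above is commonly defined (following Kitanidis, 1994) as E(t) = exp(-∫ p ln p dV) with p = c / ∫ c dV, so it can be evaluated directly from a gridded concentration field. The sketch below does this for a synthetic Gaussian plume; the grid and plume are invented, and the calculation illustrates the metric rather than the paper's semi-analytical solution.

```python
# Numerical sketch of a Kitanidis-type dilution index for a concentration
# field sampled on a regular grid; the Gaussian test plume is synthetic.
import numpy as np

def dilution_index(c, cell_volume):
    """E = exp(-sum p ln p dV) with p = c / (sum c dV); larger E means more dilute.
    cell_volume is the grid cell area (2-D) or volume (3-D)."""
    mass = np.sum(c) * cell_volume
    p = c / mass
    nz = p > 0
    entropy = -np.sum(p[nz] * np.log(p[nz]) * cell_volume)
    return np.exp(entropy)

# Synthetic example: an anisotropic Gaussian plume on a 1 m grid.
x, y = np.meshgrid(np.arange(-50.0, 50.0), np.arange(-50.0, 50.0))
c = np.exp(-(x**2 / (2 * 10.0**2) + y**2 / (2 * 4.0**2)))
print(f"dilution index ~ {dilution_index(c, cell_volume=1.0):.0f} m^2")
```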

  14. On the Effect of Dust Particles on Global Cloud Condensation Nuclei and Cloud Droplet Number

    NASA Technical Reports Server (NTRS)

    Karydis, V. A.; Kumar, P.; Barahona, D.; Sokolik, I. N.; Nenes, A.

    2011-01-01

    Aerosol-cloud interaction studies to date consider aerosol with a substantial fraction of soluble material as the sole source of cloud condensation nuclei (CCN). Emerging evidence suggests that mineral dust can act as good CCN through water adsorption onto the surface of particles. This study provides a first assessment of the contribution of insoluble dust to global CCN and cloud droplet number concentration (CDNC). Simulations are carried out with the NASA Global Modeling Initiative chemical transport model with an online aerosol simulation, considering emissions from fossil fuel, biomass burning, marine, and dust sources. CDNC is calculated online and explicitly considers the competition of soluble and insoluble CCN for water vapor. The predicted annual average contribution of insoluble mineral dust to CCN and CDNC in cloud-forming areas is up to 40 and 23.8%, respectively. Sensitivity tests suggest that uncertainties in dust size distribution and water adsorption parameters modulate the contribution of mineral dust to CDNC by 23 and 56%, respectively. Coating of dust by hygroscopic salts during the atmospheric aging causes a twofold enhancement of the dust contribution to CCN; the aged dust, however, can substantially deplete in-cloud supersaturation during the initial stages of cloud formation and can eventually reduce CDNC. Considering the hydrophilicity from adsorption and hygroscopicity from solute is required to comprehensively capture the dust-warm cloud interactions. The framework presented here addresses this need and can be easily integrated in atmospheric models.

  15. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment.

    PubMed

    Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che

    2014-01-16

    To improve the tedious task of reconstructing gene networks through testing experimentally the possible interactions between genes, it becomes a trend to adopt the automated reverse engineering procedure instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks by the evolutionary algorithm, it is necessary to address two important issues: premature convergence and high computational cost. To tackle the former problem and to enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel model evolutionary algorithms. To overcome the latter and to speed up the computation, it is advocated to adopt cloud computing as a promising solution; the most popular approach is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms to infer large gene networks. This work presents a practical framework to infer large gene networks, by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results have been analyzed. They show that our parallel approach can be successfully used to infer networks with desired behaviors and the computation time can be largely reduced. Parallel population-based algorithms can effectively determine network parameters and they perform better than the widely-used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel model population-based optimization method and the parallel computational framework, high quality solutions can be obtained within a relatively short time. This integrated approach is a promising way for inferring large networks.
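
    The core parallelization idea is that each generation's fitness evaluations are independent and can be farmed out as map tasks, with a reduce step selecting survivors. The sketch below shows that pattern using Python's multiprocessing as a stand-in for Hadoop; the "network model" and fitness function are toy placeholders, not the authors' GA-PSO objective.

```python
# Map/reduce-style parallel fitness evaluation for candidate gene-network
# parameter sets. multiprocessing stands in for Hadoop; the fitness is a toy
# sum-of-squares error, not the authors' GA-PSO objective.
import numpy as np
from multiprocessing import Pool

TARGET = np.array([0.1, 0.4, 0.9, 0.4, 0.1])   # toy "observed" expression profile

def fitness(candidate):
    """Map step: score one candidate parameter vector against the profile."""
    predicted = np.convolve(candidate, [0.25, 0.5, 0.25], mode="same")  # toy "network model"
    return float(np.sum((predicted - TARGET) ** 2)), candidate

def select_best(scored, k=10):
    """Reduce step: keep the k fittest candidates for the next generation."""
    return [c for _, c in sorted(scored, key=lambda sc: sc[0])[:k]]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    population = [rng.uniform(0, 1, size=TARGET.size) for _ in range(200)]
    with Pool() as pool:
        scored = pool.map(fitness, population)     # evaluations run in parallel
    survivors = select_best(scored)
    print("best error:", min(s for s, _ in scored))
```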

  16. Marine biogeochemical influence on primary sea spray aerosol composition in the Southern Ocean: predictions from a mechanistic model

    NASA Astrophysics Data System (ADS)

    McCoy, D.; Burrows, S. M.; Elliott, S.; Frossard, A. A.; Russell, L. M.; Liu, X.; Ogunro, O. O.; Easter, R. C.; Rasch, P. J.

    2014-12-01

    Remote marine clouds, such as those over the Southern Ocean, are particularly sensitive to variations in the concentration and chemical composition of aerosols that serve as cloud condensation nuclei (CCN). Observational evidence indicates that the organic content of fine marine aerosol is greatly increased during the biologically active season near strong phytoplankton blooms in certain locations, while being nearly constant in other locations. We have recently developed a novel modeling framework that mechanistically links the organic fraction of submicron sea spray to ocean biogeochemistry (Burrows et al., in discussion, ACPD, 2014; Elliott et al., ERL, 2014). Because of its combination of large phytoplankton blooms and high wind speeds, the Southern Ocean is an ideal location for testing our understanding of the processes driving the enrichment of organics in sea spray aerosol. Comparison of the simulated OM fraction with satellite observations shows that OM fraction is a statistically significant predictor of cloud droplet number concentration over the Southern Ocean. This presentation will focus on predictions from our modeling framework for the Southern Ocean, specifically, the predicted geographic gradients and seasonal cycles in the aerosol organic matter and its functional group composition. The timing and location of a Southern Ocean field campaign will determine its utility in observing the effects of highly localized and seasonal phytoplankton blooms on aerosol composition and clouds. Reference cited: Burrows, S. M., Ogunro, O., Frossard, A. A., Russell, L. M., Rasch, P. J., and Elliott, S.: A physically-based framework for modelling the organic fractionation of sea spray aerosol from bubble film Langmuir equilibria, Atmos. Chem. Phys. Discuss., 14, 5375-5443, doi:10.5194/acpd-14-5375-2014, 2014. Elliott, S., Burrows, S. M., Deal, C., Liu, X., Long, M., Ogunro, O., Russell, L. M., and Wingenter O.. "Prospects for simulating macromolecular surfactant chemistry at the ocean-atmosphere boundary." Environmental Research Letters 9, no. 6 (2014): 064012.

  17. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment

    PubMed Central

    2014-01-01

    Background To improve the tedious task of reconstructing gene networks through testing experimentally the possible interactions between genes, it becomes a trend to adopt the automated reverse engineering procedure instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks by the evolutionary algorithm, it is necessary to address two important issues: premature convergence and high computational cost. To tackle the former problem and to enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel model evolutionary algorithms. To overcome the latter and to speed up the computation, it is advocated to adopt the mechanism of cloud computing as a promising solution: most popular is the method of MapReduce programming model, a fault-tolerant framework to implement parallel algorithms for inferring large gene networks. Results This work presents a practical framework to infer large gene networks, by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use a well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results have been analyzed. They show that our parallel approach can be successfully used to infer networks with desired behaviors and the computation time can be largely reduced. Conclusions Parallel population-based algorithms can effectively determine network parameters and they perform better than the widely-used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel model population-based optimization method and the parallel computational framework, high quality solutions can be obtained within relatively short time. This integrated approach is a promising way for inferring large networks. PMID:24428926

  18. A New Heuristic Anonymization Technique for Privacy Preserved Datasets Publication on Cloud Computing

    NASA Astrophysics Data System (ADS)

    Aldeen Yousra, S.; Mazleena, Salleh

    2018-05-01

    Recent advances in Information and Communication Technologies (ICT) have created strong demand for cloud services that handle users' private data. Data from various organizations are a vital information source for analysis and research. Generally, this sensitive or private information involves medical, census, voter registration, social network, and customer services data. The primary concern of cloud service providers in data publishing is to hide the sensitive information of individuals. One of the cloud services that fulfills these confidentiality concerns is Privacy Preserving Data Mining (PPDM). The PPDM service in Cloud Computing (CC) enables data publishing with minimized distortion and absolute privacy. In this method, datasets are anonymized via generalization to satisfy the privacy requirements. However, the well-known privacy preserving data mining technique called K-anonymity suffers from several limitations. To surmount those shortcomings, we propose a new heuristic anonymization framework for preserving the privacy of sensitive datasets when publishing on the cloud. The advantages of the K-anonymity, L-diversity and (α, k)-anonymity methods for efficient information utilization and privacy protection are emphasized. Experimental results show that the developed technique outperforms the K-anonymity, L-diversity, and (α, k)-anonymity measures.
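
    As background for the baseline the paper improves on, the sketch below checks K-anonymity on a small table and applies one generalization step; the column names, data, and generalization rule are illustrative assumptions, not the paper's method.

```python
# Minimal sketch of a K-anonymity check and one generalization step
# (pandas assumed); the columns and data are invented for illustration.
import pandas as pd

def is_k_anonymous(df, quasi_identifiers, k):
    """Every combination of quasi-identifier values must occur at least k times."""
    return int(df.groupby(quasi_identifiers).size().min()) >= k

def generalize_age(df, width=10):
    """One generalization step: replace exact ages with width-year bands."""
    out = df.copy()
    lo = out["age"] // width * width
    out["age"] = lo.astype(str) + "-" + (lo + width - 1).astype(str)
    return out

records = pd.DataFrame({
    "age":     [34, 36, 52, 57, 33, 55],
    "zip":     ["81000", "81000", "81200", "81200", "81000", "81200"],
    "disease": ["flu", "flu", "asthma", "flu", "cold", "asthma"],
})
qi = ["age", "zip"]
print(is_k_anonymous(records, qi, k=2))                    # False: exact ages are unique
print(is_k_anonymous(generalize_age(records), qi, k=2))    # True after generalization
```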

  19. Dependence of stratocumulus-topped boundary-layer entrainment on cloud-water sedimentation: Impact on global aerosol indirect effect in GISS ModelE3 single column model and global simulations

    NASA Astrophysics Data System (ADS)

    Ackerman, A. S.; Kelley, M.; Cheng, Y.; Fridlind, A. M.; Del Genio, A. D.; Bauer, S.

    2017-12-01

    Reduction in cloud-water sedimentation induced by increasing droplet concentrations has been shown in large-eddy simulations (LES) and direct numerical simulation (DNS) to enhance boundary-layer entrainment, thereby reducing cloud liquid water path and offsetting the Twomey effect when the overlying air is sufficiently dry, which is typical. Among recent upgrades to ModelE3, the latest version of the NASA Goddard Institute for Space Studies (GISS) general circulation model (GCM), are a two-moment stratiform cloud microphysics treatment with prognostic precipitation and a moist turbulence scheme that includes an option in its entrainment closure of a simple parameterization for the effect of cloud-water sedimentation. Single column model (SCM) simulations are compared to LES results for a stratocumulus case study and show that invoking the sedimentation-entrainment parameterization option indeed reduces the dependence of cloud liquid water path on increasing aerosol concentrations. Impacts of variations of the SCM configuration and the sedimentation-entrainment parameterization will be explored. Its impact on global aerosol indirect forcing in the framework of idealized atmospheric GCM simulations will also be assessed.

  20. Ultra-Parameterized CAM: Progress Towards Low-Cloud Permitting Superparameterization

    NASA Astrophysics Data System (ADS)

    Parishani, H.; Pritchard, M. S.; Bretherton, C. S.; Khairoutdinov, M.; Wyant, M. C.; Singh, B.

    2016-12-01

    A leading source of uncertainty in climate feedback arises from the representation of low clouds, which are not resolved but depend on small-scale physical processes (e.g. entrainment, boundary layer turbulence) that are heavily parameterized. We show results from recent attempts to achieve an explicit representation of low clouds by pushing the computational limits of cloud superparameterization to resolve boundary-layer eddy scales relevant to marine stratocumulus (250m horizontal and 20m vertical length scales). This extreme configuration is called "ultraparameterization". Effects of varying horizontal vs. vertical resolution are analyzed in the context of altered constraints on the turbulent kinetic energy statistics of the marine boundary layer. We show that 250m embedded horizontal resolution leads to a more realistic boundary layer vertical structure, but also to an unrealistic cloud pulsation that cannibalizes time mean LWP. We explore the hypothesis that feedbacks involving horizontal advection (not typically encountered in offline LES that neglect this degree of freedom) may conspire to produce such effects and present strategies to compensate. The results are relevant to understanding the emergent behavior of quasi-resolved low cloud decks in a multi-scale modeling framework within a previously unencountered grey zone of better resolved boundary-layer turbulence.

  1. The NASA-Goddard Multi-Scale Modeling Framework - Land Information System: Global Land/atmosphere Interaction with Resolved Convection

    NASA Technical Reports Server (NTRS)

    Mohr, Karen Irene; Tao, Wei-Kuo; Chern, Jiun-Dar; Kumar, Sujay V.; Peters-Lidard, Christa D.

    2013-01-01

    The present generation of general circulation models (GCM) use parameterized cumulus schemes and run at hydrostatic grid resolutions. To improve the representation of cloud-scale moist processes and land-atmosphere interactions, a global, Multi-scale Modeling Framework (MMF) coupled to the Land Information System (LIS) has been developed at NASA-Goddard Space Flight Center. The MMF-LIS has three components, a finite-volume (fv) GCM (Goddard Earth Observing System Ver. 4, GEOS-4), a 2D cloud-resolving model (Goddard Cumulus Ensemble, GCE), and the LIS, representing the large-scale atmospheric circulation, cloud processes, and land surface processes, respectively. The non-hydrostatic GCE model replaces the single-column cumulus parameterization of fvGCM. The model grid is composed of an array of fvGCM gridcells, each with a series of embedded GCE models. A horizontal coupling strategy, GCE-fvGCM-Coupler-LIS, offered significant computational efficiency, with the scalability and I/O capabilities of LIS permitting land-atmosphere interactions at cloud scale. Global simulations of 2007-2008 and comparisons to observations and reanalysis products were conducted. Using two different versions of the same land surface model but the same initial conditions, divergence in regional, synoptic-scale surface pressure patterns emerged within two weeks. The sensitivity of large-scale circulations to land surface model physics revealed significant functional value to using a scalable, multi-model land surface modeling system in global weather and climate prediction.

  2. Climate simulations and services on HPC, Cloud and Grid infrastructures

    NASA Astrophysics Data System (ADS)

    Cofino, Antonio S.; Blanco, Carlos; Minondo Tshuma, Antonio

    2017-04-01

    Cloud, Grid and High Performance Computing have changed the accessibility and availability of computing resources for Earth Science research communities, especially for the climate community. These paradigms are modifying the way climate applications are executed. By using these technologies the number, variety and complexity of experiments and resources are increasing substantially. But, although computational capacity is increasing, traditional applications and tools used by the community are not good enough to manage this large volume and variety of experiments and computing resources. In this contribution, we evaluate the challenges of running climate simulations and services on Grid, Cloud and HPC infrastructures and how to tackle them. The Grid and Cloud infrastructures provided by EGI's VOs (esr, earth.vo.ibergrid and fedcloud.egi.eu) will be evaluated, as well as HPC resources from the PRACE infrastructure and institutional clusters. To solve those challenges, solutions using the DRM4G framework will be shown. DRM4G provides a good framework for managing a large volume and variety of computing resources for climate experiments. This work has been supported by the Spanish National R&D Plan under projects WRF4G (CGL2011-28864), INSIGNIA (CGL2016-79210-R) and MULTI-SDM (CGL2015-66583-R); the IS-ENES2 project from the FP7 of the European Commission (grant agreement no. 312979); the European Regional Development Fund (ERDF); and the Programa de Personal Investigador en Formación Predoctoral from the Universidad de Cantabria and the Government of Cantabria.

  3. A Geospatial Information Grid Framework for Geological Survey.

    PubMed

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate as a result of the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecast and an evaluation service is introduced in this paper.

  4. A Geospatial Information Grid Framework for Geological Survey

    PubMed Central

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate as a result of the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecast and an evaluation service is introduced in this paper. PMID:26710255

  5. Abstracted Workflow Framework with a Structure from Motion Application

    NASA Astrophysics Data System (ADS)

    Rossi, Adam J.

    In scientific and engineering disciplines, from academia to industry, there is an increasing need for the development of custom software to perform experiments, construct systems, and develop products. The natural mindset initially is to shortcut and bypass all overhead and process rigor in order to obtain an immediate result for the problem at hand, with the misconception that the software will simply be thrown away at the end. In a majority of the cases, it turns out the software persists for many years, and likely ends up in production systems for which it was not initially intended. In the current study, a framework that can be used in both industry and academic applications mitigates underlying problems associated with developing scientific and engineering software. This results in software that is much more maintainable, documented, and usable by others, specifically allowing new users to extend capabilities of components already implemented in the framework. There is a multi-disciplinary need in the fields of imaging science, computer science, and software engineering for a unified implementation model, which motivates the development of an abstracted software framework. Structure from motion (SfM) has been identified as one use case where the abstracted workflow framework can improve research efficiencies and eliminate implementation redundancies in scientific fields. The SfM process begins by obtaining 2D images of a scene from different perspectives. Features from the images are extracted and correspondences are established. This provides a sufficient amount of information to initialize the problem for fully automated processing. Transformations are established between views, and 3D points are established via triangulation algorithms. The parameters for the camera models for all views / images are solved through bundle adjustment, establishing a highly consistent point cloud. The initial sparse point cloud and camera matrices are used to generate a dense point cloud through patch based techniques or densification algorithms such as Semi-Global Matching (SGM). The point cloud can be visualized or exploited by both humans and automated techniques. In some cases the point cloud is "draped" with original imagery in order to enhance the 3D model for a human viewer. The SfM workflow can be implemented in the abstracted framework, making it easily leverageable and extensible by multiple users. Like many processes in scientific and engineering domains, the workflow described for SfM is complex and requires many disparate components to form a functional system, often utilizing algorithms implemented by many users in different languages / environments and without knowledge of how the component fits into the larger system. In practice, this generally leads to issues interfacing the components, building the software for desired platforms, understanding its concept of operations, and how it can be manipulated in order to fit the desired function for a particular application. In addition, other scientists and engineers instinctively wish to analyze the performance of the system, establish new algorithms, optimize existing processes, and establish new functionality based on current research. This requires a framework whereby new components can be easily plugged in without affecting the current implemented functionality. The need for a universal programming environment establishes the motivation for the development of the abstracted workflow framework. 
This software implementation, named Catena, provides base classes from which new components must derive in order to operate within the framework. The derivation mandates requirements be satisfied in order to provide a complete implementation. Additionally, the developer must provide documentation of the component in terms of its overall function and inputs. The interface input and output values corresponding to the component must be defined in terms of their respective data types, and the implementation uses mechanisms within the framework to retrieve and send the values. This process requires the developer to componentize their algorithm rather than implement it monolithically. Although the requirements of the developer are slightly greater, the benefits realized from using Catena far outweigh the overhead, and results in extensible software. This thesis provides a basis for the abstracted workflow framework concept and the Catena software implementation. The benefits are also illustrated using a detailed examination of the SfM process as an example application.
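
    The two-view core of the workflow described above (feature extraction and matching, relative pose from the essential matrix, and triangulation of an initial sparse point cloud) can be sketched with OpenCV as shown below. The image paths and camera intrinsics are placeholder assumptions, and a full pipeline such as the thesis's SfM application would add incremental registration, bundle adjustment, and densification on top; this illustrates the steps, not the Catena code itself.

```python
# Two-view structure-from-motion core, sketched with OpenCV: detect and match
# features, estimate relative pose from the essential matrix, triangulate a
# sparse point cloud. Image paths and intrinsics K are placeholder assumptions.
import cv2
import numpy as np

K = np.array([[1200.0, 0, 960], [0, 1200.0, 540], [0, 0, 1]])   # assumed pinhole intrinsics

img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Ratio-test matching to establish correspondences between the two views.
matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]
pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Relative pose from the essential matrix (RANSAC rejects outlier matches).
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

# Triangulate inlier correspondences into an initial sparse point cloud.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
inl = mask.ravel() > 0
pts4d = cv2.triangulatePoints(P1, P2, pts1[inl].T, pts2[inl].T)
cloud = (pts4d[:3] / pts4d[3]).T        # N x 3 sparse points, up to scale
print(cloud.shape, "triangulated points (bundle adjustment would refine these)")
```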

  6. Exact solutions of bulk viscous with string cloud attached to strange quark matter for higher dimensional FRW universe in Lyra geometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Çağlar, Halife, E-mail: hlfcglr@gmail.com; Aygün, Sezgin, E-mail: saygun@comu.edu.tr

    In this study, we have investigated a bulk viscous fluid with strange quark matter attached to a string cloud for a higher-dimensional Friedmann-Robertson-Walker (FRW) universe in Lyra geometry. By using a varying deceleration parameter and the conservation equations, we have solved the Einstein field equations (EFEs) and obtained generalized exact solutions for our model. We have also found that the string does not survive for a bulk viscous fluid with strange quark matter attached to the string cloud in the framework of a higher-dimensional FRW universe in Lyra geometry. This result agrees with Kiran and Reddy, Krori et al., Sahoo and Mishra, and Mohanty et al. in four and five dimensions.

  7. Genotyping in the cloud with Crossbow.

    PubMed

    Gurtowski, James; Schatz, Michael C; Langmead, Ben

    2012-09-01

    Crossbow is a scalable, portable, and automatic cloud computing tool for identifying SNPs from high-coverage, short-read resequencing data. It is built on Apache Hadoop, an implementation of the MapReduce software framework. Hadoop allows Crossbow to distribute read alignment and SNP calling subtasks over a cluster of commodity computers. Two robust tools, Bowtie and SOAPsnp, implement the fundamental alignment and variant calling operations respectively, and have demonstrated capabilities within Crossbow of analyzing approximately one billion short reads per hour on a commodity Hadoop cluster with 320 cores. Through protocol examples, this unit will demonstrate the use of Crossbow for identifying variations in three different operating modes: on a Hadoop cluster, on a single computer, and on the Amazon Elastic MapReduce cloud computing service.

  8. Regime-Based Evaluation of Cloudiness in CMIP5 Models

    NASA Technical Reports Server (NTRS)

    Jin, Daeho; Oreopoulos, Lazaros; Lee, Dong Min

    2016-01-01

    The concept of Cloud Regimes (CRs) is used to develop a framework for evaluating the cloudiness of 12 models from phase 5 of the Coupled Model Intercomparison Project (CMIP5). Reference CRs come from existing global International Satellite Cloud Climatology Project (ISCCP) weather states. The evaluation is made possible by the implementation in several CMIP5 models of the ISCCP simulator, which generates for each gridcell daily joint histograms of cloud optical thickness and cloud top pressure. Model performance is assessed with several metrics such as CR global cloud fraction (CF), CR relative frequency of occurrence (RFO), their product (long-term average total cloud amount [TCA]), cross-correlations of CR RFO maps, and a metric of resemblance between model and ISCCP CRs. In terms of CR global RFO, arguably the most fundamental metric, the models perform unsatisfactorily overall, except for CRs representing thick storm clouds. Because model CR CF is internally constrained by our method, RFO discrepancies also yield substantial TCA errors. Our findings support previous studies showing that CMIP5 models underestimate cloudiness. The multi-model mean performs well in matching observed RFO maps for many CRs, but is not the best for this or other metrics. When overall performance across all CRs is assessed, some models, despite their shortcomings, apparently outperform Moderate Resolution Imaging Spectroradiometer (MODIS) cloud observations evaluated against ISCCP as if they were another model output. Lastly, cloud simulation performance is contrasted with each model's equilibrium climate sensitivity (ECS) in order to gain insight on whether good cloud simulation pairs with particular values of this parameter.
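
    The bookkeeping behind these metrics is straightforward once each gridcell-day carries a regime label and a cloud fraction: RFO is the occurrence frequency of each regime, CF its mean cloud fraction when present, TCA the sum of their products, and RFO maps can be compared by pattern correlation. The sketch below illustrates this with synthetic arrays; the shapes and data are invented, and no area weighting is applied.

```python
# Sketch of cloud-regime bookkeeping: relative frequency of occurrence (RFO),
# per-regime cloud fraction (CF), total cloud amount TCA = sum_j CF_j * RFO_j,
# and a pattern correlation between two RFO maps. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_time, n_lat, n_lon, n_regimes = 365, 36, 72, 8
regime = rng.integers(0, n_regimes, size=(n_time, n_lat, n_lon))   # daily regime label
cf = rng.uniform(0, 1, size=(n_time, n_lat, n_lon))                # daily total cloud fraction

rfo = np.array([(regime == j).mean() for j in range(n_regimes)])           # global RFO per regime
cf_regime = np.array([cf[regime == j].mean() for j in range(n_regimes)])   # mean CF per regime
tca = np.sum(cf_regime * rfo)                                              # long-term total cloud amount

def rfo_map(labels, j):
    return (labels == j).mean(axis=0)            # time-mean occurrence map of regime j

def pattern_corr(a, b):
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

obs_regime = rng.integers(0, n_regimes, size=(n_time, n_lat, n_lon))       # stand-in "observations"
print(f"TCA = {tca:.2f}; regime-0 RFO map correlation = "
      f"{pattern_corr(rfo_map(regime, 0), rfo_map(obs_regime, 0)):.2f}")
```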

  9. Coupled fvGCM-GCE Modeling System, 3D Cloud-Resolving Model and Cloud Library

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2005-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. It requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multiscale modeling framework, MMF) to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud related datasets can provide initial conditions as well as validation for both the MMF and CRMs. A seed fund is available at NASA Goddard to build a MMF based on the 2D Goddard Cumulus Ensemble (GCE) model and the Goddard finite volume general circulation model (fvGCM). A prototype MMF is being developed and production runs will be conducted at the beginning of 2005. In this talk, I will present: (1) A brief review of the GCE model and its applications to precipitation processes, (2) the Goddard MMF and the major differences between the two existing MMFs (CSU MMF and Goddard MMF), (3) a cloud library generated by the Goddard MMF and the 3D GCE model, and (4) a brief discussion on using the GCE model to develop a global cloud simulator.

  10. Global Observations of Aerosols and Clouds from Combined Lidar and Passive Instruments to Improve Radiation Budget and Climate Studies

    NASA Technical Reports Server (NTRS)

    Winker, David M.

    1999-01-01

    Current uncertainties in the effects of clouds and aerosols on the Earth radiation budget limit our understanding of the climate system and the potential for global climate change. Pathfinder Instruments for Cloud and Aerosol Spaceborne Observations - Climatologie Etendue des Nuages et des Aerosols (PICASSO-CENA) is a recently approved satellite mission within NASA's Earth System Science Pathfinder (ESSP) program which will address these uncertainties with a unique suite of active and passive instruments. The Lidar In-space Technology Experiment (LITE) demonstrated the potential benefits of space lidar for studies of clouds and aerosols. PICASSO-CENA builds on this experience with a payload consisting of a two-wavelength polarization-sensitive lidar, an oxygen A-band spectrometer (ABS), an imaging infrared radiometer (IIR), and a wide field camera (WFC). Data from these instruments will be used to measure the vertical distributions of aerosols and clouds in the atmosphere, as well as optical and physical properties of aerosols and clouds which influence the Earth radiation budget. PICASSO-CENA will be flown in formation with the PM satellite of the NASA Earth Observing System (EOS) to provide a comprehensive suite of coincident measurements of atmospheric state, aerosol and cloud optical properties, and radiative fluxes. The mission will address critical uncertainties in the direct radiative forcing of aerosols and clouds as well as aerosol influences on cloud radiative properties and cloud-climate radiation feedbacks. PICASSO-CENA is planned for a three-year mission, with a launch in early 2003. PICASSO-CENA is being developed within the framework of a collaboration between NASA and CNES.

  11. A Conceptual Framework for Indoor Mapping by Using Grammars

    NASA Astrophysics Data System (ADS)

    Hu, X.; Fan, H.; Zipf, A.; Shang, J.; Gu, F.

    2017-09-01

    Maps are the foundation of indoor location-based services. Many automatic indoor mapping approaches have been proposed, but they rely highly on sensor data, such as point clouds and users' location traces. To address this issue, this paper presents a conceptual framework to represent the layout principle of research buildings by using grammars. This framework can benefit the indoor mapping process by improving the accuracy of generated maps and by dramatically reducing the volume of the sensor data required by traditional reconstruction approaches. In addition, we try to present more details of partial core modules of the framework. An example using the proposed framework is given to show the generation process of a semantic map. This framework is part of an ongoing research for the development of an approach for reconstructing semantic maps.

  12. Impact of different cloud deployments on real-time video applications for mobile video cloud users

    NASA Astrophysics Data System (ADS)

    Khan, Kashif A.; Wang, Qi; Luo, Chunbo; Wang, Xinheng; Grecos, Christos

    2015-02-01

    The latest trend to access mobile cloud services through wireless network connectivity has amplified globally among both entrepreneurs and home end users. Although existing public cloud service vendors such as Google, Microsoft Azure etc. are providing on-demand cloud services with affordable cost for mobile users, there are still a number of challenges to achieve high-quality mobile cloud based video applications, especially due to the bandwidth-constrained and error-prone mobile network connectivity, which is the communication bottleneck for end-to-end video delivery. In addition, existing accessible cloud networking architectures differ in terms of their implementation, services, resources, storage, pricing, support and so on, and these differences have a varied impact on the performance of cloud-based real-time video applications. Nevertheless, these challenges and impacts have not been thoroughly investigated in the literature. In our previous work, we have implemented a mobile cloud network model that integrates localized and decentralized cloudlets (mini-clouds) and wireless mesh networks. In this paper, we deploy a real-time framework consisting of various existing Internet cloud networking architectures (Google Cloud, Microsoft Azure and Eucalyptus Cloud) and a cloudlet based on Ubuntu Enterprise Cloud over wireless mesh networking technology for mobile cloud end users. It is noted that access to real-time video streaming over HTTP/HTTPS is gaining popularity among both research and industrial communities as a way to leverage the existing web services and HTTP infrastructure in the Internet. To study the performance under different deployments using different public and private cloud service providers, we employ real-time video streaming over the HTTP/HTTPS standard, and conduct experimental evaluation and in-depth comparative analysis of the impact of different deployments on the quality of service for mobile video cloud users. Empirical results are presented and discussed to quantify and explain the different impacts resulting from various cloud deployments, video application and wireless/mobile network settings, and user mobility. Additionally, this paper analyses the advantages, disadvantages, limitations and optimization techniques in various cloud networking deployments, in particular the cloudlet approach compared with the Internet cloud approach, with recommendations of optimized deployments highlighted. Finally, federated clouds and inter-cloud collaboration challenges and opportunities are discussed in the context of supporting real-time video applications for mobile users.
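
    A minimal client-side measurement of the kind used to compare such deployments records, per video segment, the time to first byte and the achieved throughput over HTTP/HTTPS. The sketch below assumes a placeholder segment URL; it illustrates the measurement idea, not the instrumentation used in the paper.

```python
# Client-side sketch for comparing deployments: fetch a video segment over
# HTTP/HTTPS and record time-to-first-byte and mean throughput. The segment
# URL is a placeholder, not one of the deployments evaluated in the paper.
import time
import requests

def measure_segment(url, chunk_size=64 * 1024):
    start = time.monotonic()
    ttfb = None
    received = 0
    with requests.get(url, stream=True, timeout=30) as resp:
        resp.raise_for_status()
        for chunk in resp.iter_content(chunk_size=chunk_size):
            if ttfb is None:
                ttfb = time.monotonic() - start      # time to first byte
            received += len(chunk)
    elapsed = time.monotonic() - start
    return {"ttfb_s": ttfb, "throughput_mbps": 8 * received / elapsed / 1e6}

if __name__ == "__main__":
    print(measure_segment("https://example.com/stream/segment_0001.ts"))
```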

  13. CloudDOE: a user-friendly tool for deploying Hadoop clouds and analyzing high-throughput sequencing data with MapReduce.

    PubMed

    Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D T; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung

    2014-01-01

    Explosive growth of next-generation sequencing data has resulted in ultra-large-scale data sets and ensuing computational problems. Cloud computing provides an on-demand and scalable environment for large-scale data analysis. Using a MapReduce framework, data and workload can be distributed via a network to computers in the cloud to substantially reduce computational latency. Hadoop/MapReduce has been successfully adopted in bioinformatics for genome assembly, mapping reads to genomes, and finding single nucleotide polymorphisms. Major cloud providers offer Hadoop cloud services to their users. However, it remains technically challenging to deploy a Hadoop cloud for those who prefer to run MapReduce programs in a cluster without built-in Hadoop/MapReduce. We present CloudDOE, a platform-independent software package implemented in Java. CloudDOE encapsulates technical details behind a user-friendly graphical interface, thus liberating scientists from having to perform complicated operational procedures. Users are guided through the user interface to deploy a Hadoop cloud within in-house computing environments and to run applications specifically targeted for bioinformatics, including CloudBurst, CloudBrush, and CloudRS. One may also use CloudDOE on top of a public cloud. CloudDOE consists of three wizards, i.e., Deploy, Operate, and Extend wizards. Deploy wizard is designed to aid the system administrator to deploy a Hadoop cloud. It installs Java runtime environment version 1.6 and Hadoop version 0.20.203, and initiates the service automatically. Operate wizard allows the user to run a MapReduce application on the dashboard list. To extend the dashboard list, the administrator may install a new MapReduce application using Extend wizard. CloudDOE is a user-friendly tool for deploying a Hadoop cloud. Its smart wizards substantially reduce the complexity and costs of deployment, execution, enhancement, and management. Interested users may collaborate to improve the source code of CloudDOE to further incorporate more MapReduce bioinformatics tools into CloudDOE and support next-generation big data open source tools, e.g., Hadoop BigTop and Spark. CloudDOE is distributed under Apache License 2.0 and is freely available at http://clouddoe.iis.sinica.edu.tw/.

  14. CloudDOE: A User-Friendly Tool for Deploying Hadoop Clouds and Analyzing High-Throughput Sequencing Data with MapReduce

    PubMed Central

    Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D. T.; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung

    2014-01-01

    Background Explosive growth of next-generation sequencing data has resulted in ultra-large-scale data sets and ensuing computational problems. Cloud computing provides an on-demand and scalable environment for large-scale data analysis. Using a MapReduce framework, data and workload can be distributed via a network to computers in the cloud to substantially reduce computational latency. Hadoop/MapReduce has been successfully adopted in bioinformatics for genome assembly, mapping reads to genomes, and finding single nucleotide polymorphisms. Major cloud providers offer Hadoop cloud services to their users. However, it remains technically challenging to deploy a Hadoop cloud for those who prefer to run MapReduce programs in a cluster without built-in Hadoop/MapReduce. Results We present CloudDOE, a platform-independent software package implemented in Java. CloudDOE encapsulates technical details behind a user-friendly graphical interface, thus liberating scientists from having to perform complicated operational procedures. Users are guided through the user interface to deploy a Hadoop cloud within in-house computing environments and to run applications specifically targeted for bioinformatics, including CloudBurst, CloudBrush, and CloudRS. One may also use CloudDOE on top of a public cloud. CloudDOE consists of three wizards, i.e., the Deploy, Operate, and Extend wizards. The Deploy wizard is designed to aid the system administrator in deploying a Hadoop cloud. It installs Java runtime environment version 1.6 and Hadoop version 0.20.203, and initiates the service automatically. The Operate wizard allows the user to run a MapReduce application on the dashboard list. To extend the dashboard list, the administrator may install a new MapReduce application using the Extend wizard. Conclusions CloudDOE is a user-friendly tool for deploying a Hadoop cloud. Its smart wizards substantially reduce the complexity and costs of deployment, execution, enhancement, and management. Interested users may collaborate on improving the source code of CloudDOE to incorporate more MapReduce bioinformatics tools and to support next-generation big data open source tools, e.g., Hadoop BigTop and Spark. Availability: CloudDOE is distributed under Apache License 2.0 and is freely available at http://clouddoe.iis.sinica.edu.tw/. PMID:24897343

  15. Coupled fvGCM-GCE Modeling System, TRMM Latent Heating and Cloud Library

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2004-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. It requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF) to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud related datasets can provide initial conditions as well as validation for both the MMF and CRMs. A seed fund is available at NASA Goddard to build a MMF based on the 2D GCE model and the Goddard finite volume general circulation model (fvGCM). A prototype MMF will be developed by the end of 2004 and production runs will be conducted at the beginning of 2005. The purpose of this proposal is to augment the current Goddard MMF and other cloud modeling activities. In this talk, I will present: (1) A summary of the second Cloud Modeling Workshop that took place at NASA Goddard, (2) A summary of the third TRMM Latent Heating Workshop that took place in Nara, Japan, (3) A brief discussion of the Goddard research plan for using the Weather Research and Forecasting (WRF) model, and (4) A brief discussion of the GCE model's role in developing a global cloud simulator.

  16. Coupled fvGCM-GCE Modeling System: TRMM Latent Heating and Cloud Library

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2005-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. It requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF) to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud related datasets can provide initial conditions as well as validation for both the MMF and CRMs. A seed fund is available at NASA Goddard to build a MMF based on the 2D GCE model and the Goddard finite volume general circulation model (fvGCM). A prototype MMF will be developed by the end of 2004 and production runs will be conducted at the beginning of 2005. The purpose of this proposal is to augment the current Goddard MMF and other cloud modeling activities. In this talk, I will present: (1) A summary of the second Cloud Modeling Workshop that took place at NASA Goddard, (2) A summary of the third TRMM Latent Heating Workshop that took place in Nara, Japan, and (3) A brief discussion of the GCE model's role in developing a global cloud simulator.

  17. Coupled fvGCM-GCE Modeling System, 3D Cloud-Resolving Model and Cloud Library

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2005-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. It requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF) to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud related datasets can provide initial conditions as well as validation for both the MMF and CRMs. A seed fund is available at NASA Goddard to build a MMF based on the 2D Goddard Cumulus Ensemble (GCE) model and the Goddard finite volume general circulation model (fvGCM). A prototype MMF is being developed and production runs will be conducted at the beginning of 2005. In this talk, I will present: (1) A brief review of the GCE model and its applications to precipitation processes, (2) The Goddard MMF and the major differences between the two existing MMFs (the CSU MMF and the Goddard MMF), (3) A cloud library generated by the Goddard MMF and the 3D GCE model, and (4) A brief discussion of the GCE model's role in developing a global cloud simulator.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahn, Gail-Joon

    The project seeks an innovative framework to enable users to access and selectively share resources in distributed environments, enhancing the scalability of information sharing. We have investigated secure sharing and assurance approaches for ad-hoc collaboration, focusing on Grids, Clouds, and ad-hoc network environments.

  19. PRECISE: PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare

    PubMed Central

    Chen, Feng; Wang, Shuang; Mohammed, Noman; Cheng, Samuel; Jiang, Xiaoqian

    2015-01-01

    Quality improvement (QI) requires systematic and continuous efforts to enhance healthcare services. A healthcare provider might wish to compare local statistics with those from other institutions in order to identify problems and develop interventions to improve the quality of care. However, the sharing of institutional information may be deterred by privacy concerns, as publicizing such statistics could lead to embarrassment and even financial damage. In this article, we propose a PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare (PRECISE), which aims at enabling cross-institution comparison of healthcare statistics while protecting privacy. The proposed framework relies on a set of state-of-the-art cryptographic protocols including homomorphic encryption and Yao's garbled circuit schemes. By securely pooling data from different institutions, PRECISE can rank the encrypted statistics to facilitate QI among participating institutions. We conducted experiments using the MIMIC II database and demonstrated the feasibility of the proposed PRECISE framework. PMID:26146645

  20. Cafe: A Generic Configurable Customizable Composite Cloud Application Framework

    NASA Astrophysics Data System (ADS)

    Mietzner, Ralph; Unger, Tobias; Leymann, Frank

    In this paper we present Cafe (Composite Application Framework), an approach to describe configurable composite service-oriented applications and to automatically provision them across different providers. Cafe enables independent software vendors to describe their composite service-oriented applications and the components that are used to assemble them. Components can be internal or external to the application and can be deployed in any of the delivery models present in the cloud. The components are annotated with requirements for the infrastructure on which they will later run. Providers, on the other hand, advertise their infrastructure services by describing them as infrastructure capabilities. The separation of software vendors and providers enables end users and providers to follow a best-of-breed strategy by combining arbitrary applications with arbitrary providers. We show how such applications can be automatically provisioned and present an architecture and a prototype that implements the concepts.

  1. PRECISE: PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare.

    PubMed

    Chen, Feng; Wang, Shuang; Mohammed, Noman; Cheng, Samuel; Jiang, Xiaoqian

    2014-10-01

    Quality improvement (QI) requires systematic and continuous efforts to enhance healthcare services. A healthcare provider might wish to compare local statistics with those from other institutions in order to identify problems and develop interventions to improve the quality of care. However, the sharing of institutional information may be deterred by privacy concerns, as publicizing such statistics could lead to embarrassment and even financial damage. In this article, we propose a PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare (PRECISE), which aims at enabling cross-institution comparison of healthcare statistics while protecting privacy. The proposed framework relies on a set of state-of-the-art cryptographic protocols including homomorphic encryption and Yao's garbled circuit schemes. By securely pooling data from different institutions, PRECISE can rank the encrypted statistics to facilitate QI among participating institutions. We conducted experiments using the MIMIC II database and demonstrated the feasibility of the proposed PRECISE framework.

  2. Supporting capacity sharing in the cloud manufacturing environment based on game theory and fuzzy logic

    NASA Astrophysics Data System (ADS)

    Argoneto, Pierluigi; Renna, Paolo

    2016-02-01

    This paper proposes a Framework for Capacity Sharing in Cloud Manufacturing (FCSCM) able to support capacity sharing among independent firms. The success of geographically distributed plants depends strongly on the use of suitable tools to integrate their resources and demand forecasts in order to meet a specific production objective. The proposed framework is based on two different tools: a cooperative game algorithm, based on the Gale-Shapley model, and a fuzzy engine. The capacity allocation policy takes into account the utility functions of the involved firms. It is shown how the proposed capacity allocation policy induces all firms to report their capacity requirements truthfully. A discrete event simulation environment has been developed to test the proposed FCSCM. The numerical results show a drastic reduction in unsatisfied capacity under the cooperation model implemented in this work; a generic sketch of the underlying matching idea is given below.
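
    The record above cites the Gale-Shapley model as the core of its cooperative allocation tool. The sketch below is a generic deferred-acceptance (Gale-Shapley) matching between capacity requesters and capacity providers, offered only as an illustration of that building block; the firm and plant names, and the one-to-one matching assumption, are hypothetical and not taken from the paper.

```python
def gale_shapley(requesters, providers):
    """Generic deferred-acceptance matching (illustrative; not the FCSCM algorithm itself).

    requesters: dict mapping requester -> list of providers in preference order
    providers:  dict mapping provider  -> list of requesters in preference order
    Returns a stable one-to-one matching as {provider: requester}.
    """
    rank = {p: {r: i for i, r in enumerate(prefs)} for p, prefs in providers.items()}
    free = list(requesters)                     # requesters not yet matched
    next_choice = {r: 0 for r in requesters}    # index of the next provider to propose to
    match = {}                                  # provider -> requester

    while free:
        r = free.pop(0)
        if next_choice[r] >= len(requesters[r]):
            continue                            # r has exhausted its preference list
        p = requesters[r][next_choice[r]]
        next_choice[r] += 1
        if p not in match:
            match[p] = r                        # provider accepts the first proposal
        elif rank[p][r] < rank[p][match[p]]:
            free.append(match[p])               # provider upgrades; old requester becomes free again
            match[p] = r
        else:
            free.append(r)                      # proposal rejected; r stays free
    return match

# Tiny usage example with hypothetical firm and plant names.
requesters = {"FirmA": ["Plant1", "Plant2"], "FirmB": ["Plant1", "Plant2"]}
providers = {"Plant1": ["FirmB", "FirmA"], "Plant2": ["FirmA", "FirmB"]}
print(gale_shapley(requesters, providers))   # {'Plant1': 'FirmB', 'Plant2': 'FirmA'}
```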

  3. Towards an e-Health Cloud Solution for Remote Regions at Bahia-Brazil.

    PubMed

    Sarinho, V T; Mota, A O; Silva, E P

    2017-12-19

    This paper presents CloudMedic, an e-Health Cloud solution that manages health care services in remote regions of Bahia, Brazil. For that purpose, six main modules (Clinic, Hospital, Supply, Administrative, Billing and Health Business Intelligence) were developed to control the health flow among health actors at health institutions. They provide a database model and procedures for health business rules, a standard gateway for data maintenance between web views and the database layer, and a multi-front-end framework based on web view and web command configurations. These resources were used by 2,042 health actors in 261 health posts covering health demands from 118 municipalities in Bahia state. They also managed approximately 2.4 million health service orders and approximately 13.5 million health exams for more than 1.3 million registered patients. As a result, a collection of health functionalities available in a cloud infrastructure was successfully developed, deployed and validated in more than 28% of Bahia's municipalities, yielding a viable e-Health Cloud solution that, despite municipality limitations in remote regions, decentralized and improved access to health care services in Bahia state.

  4. Integration of High-Performance Computing into Cloud Computing Services

    NASA Astrophysics Data System (ADS)

    Vouk, Mladen A.; Sills, Eric; Dreher, Patrick

    High-Performance Computing (HPC) projects span a spectrum of computer hardware implementations ranging from peta-flop supercomputers, high-end tera-flop facilities running a variety of operating systems and applications, to mid-range and smaller computational clusters used for HPC application development, pilot runs and prototype staging clusters. What they all have in common is that they operate as a stand-alone system rather than a scalable and shared user re-configurable resource. The advent of cloud computing has changed the traditional HPC implementation. In this article, we will discuss a very successful production-level architecture and policy framework for supporting HPC services within a more general cloud computing infrastructure. This integrated environment, called Virtual Computing Lab (VCL), has been operating at NC State since fall 2004. Nearly 8,500,000 HPC CPU-Hrs were delivered by this environment to NC State faculty and students during 2009. In addition, we present and discuss operational data that show that integration of HPC and non-HPC (or general VCL) services in a cloud can substantially reduce the cost of delivering cloud services (down to cents per CPU hour).

  5. Science in the cloud (SIC): A use case in MRI connectomics.

    PubMed

    Kiar, Gregory; Gorgolewski, Krzysztof J; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A; Wiener, Martin; Vogelstein, R Jacob; Burns, Randal; Vogelstein, Joshua T

    2017-05-01

    Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called 'science in the cloud' (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. © The Author 2017. Published by Oxford University Press.

  6. Framework Development Supporting the Safety Portal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prescott, Steven Ralph; Kvarfordt, Kellie Jean; Vang, Leng

    2015-07-01

    In a collaborative scientific research arena, it is important to have an environment where analysts have access to a shared repository of information, documents, and software tools, and are able to accurately maintain and track historical changes in models. The new Safety Portal cloud-based environment will be accessible remotely from any computing platform with Internet access and suitable browser capabilities. Information stored in this environment is restricted based on user-assigned credentials. This report discusses the current development of a cloud-based web portal for PRA tools.

  7. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    NASA Astrophysics Data System (ADS)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
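
    The record above describes an unsupervised, clustering-based outlier detector without naming a specific algorithm. As a hedged illustration of that general idea (not the authors' method), the sketch below uses scikit-learn's DBSCAN, whose label of -1 marks points that belong to no dense cluster; the synthetic features and parameter values are assumptions made for the example.

```python
# Illustrative clustering-based outlier flagging (the paper's actual algorithm is not specified).
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic "satellite" feature vectors: mostly normal observations plus a few extreme values.
normal = rng.normal(loc=0.0, scale=1.0, size=(500, 3))
extreme = rng.normal(loc=8.0, scale=0.5, size=(5, 3))
features = np.vstack([normal, extreme])

# Standardize the features, then cluster; DBSCAN labels points in no dense cluster as -1 (outliers).
scaled = StandardScaler().fit_transform(features)
labels = DBSCAN(eps=0.8, min_samples=10).fit_predict(scaled)

outlier_idx = np.flatnonzero(labels == -1)
print(f"{outlier_idx.size} outliers flagged out of {len(features)} observations")
```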

  8. Cloud retrievals from satellite data using optimal estimation: evaluation and application to ATSR

    NASA Astrophysics Data System (ADS)

    Poulsen, C. A.; Siddans, R.; Thomas, G. E.; Sayer, A. M.; Grainger, R. G.; Campmany, E.; Dean, S. M.; Arnold, C.; Watts, P. D.

    2012-08-01

    Clouds play an important role in balancing the Earth's radiation budget. Hence, it is vital that cloud climatologies are produced that quantify cloud macro- and microphysical parameters and the associated uncertainty. In this paper, we present an algorithm, ORAC (Oxford-RAL retrieval of Aerosol and Cloud), which is based on fitting a physically consistent cloud model to satellite observations simultaneously from the visible to the mid-infrared, thereby ensuring that the resulting cloud properties provide a good representation of both the short-wave and long-wave radiative effects of the observed cloud. The advantages of the optimal estimation method are that it enables rigorous error propagation and the inclusion of all measurements and any a priori information and associated errors in a rigorous mathematical framework. The algorithm provides a measure of the consistency between the retrieved representation of the cloud and the satellite radiances. The cloud parameters retrieved are the cloud top pressure, cloud optical depth, cloud effective radius, cloud fraction and cloud phase. The algorithm can be applied to most visible/infrared satellite instruments. In this paper, we demonstrate the applicability to the Along-Track Scanning Radiometers ATSR-2 and AATSR. Examples of applying the algorithm to ATSR-2 flight data are presented and the sensitivity of the retrievals assessed; in particular, the algorithm is evaluated for a number of simulated single-layer and multi-layer conditions. The algorithm was found to perform well for single-layer cloud except when the cloud was very thin, i.e., less than one optical depth. For multi-layer cloud, the algorithm was robust except when the upper ice cloud layer was less than five optical depths. In these cases the retrieved cloud top pressure and cloud effective radius become a weighted average of the two layers. The total optical depth of multi-layer cloud is retrieved well until the cloud becomes thick (greater than 50 optical depths), where the retrieval begins to saturate. The cost proved a good indicator of multi-layer scenarios. Both the retrieval cost and the error need to be considered together in order to evaluate the quality of the retrieval. This algorithm, in the configuration described here, has been applied to both ATSR-2 and AATSR visible and infrared measurements in the context of the GRAPE (Global Retrieval and cloud Product Evaluation) project to produce a consistent 14-year record for climate research.
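
    The record above relies on the optimal estimation formalism. For reference, the standard optimal estimation cost function that such retrievals minimize is sketched below; the symbols are the generic ones of this formalism (measurement vector, forward model, a priori state and covariances) and are not taken from the paper itself.

```latex
% Standard optimal-estimation cost function (generic form; symbols are not from the paper itself):
%   y    - measurement vector (satellite radiances)
%   F(x) - forward model evaluated at the state vector x (cloud top pressure, optical depth, ...)
%   x_a  - a priori state,  S_a - a priori error covariance,  S_y - measurement error covariance
\[
J(\mathbf{x}) \;=\;
\bigl(\mathbf{y}-F(\mathbf{x})\bigr)^{\mathsf T}\,\mathbf{S}_y^{-1}\,\bigl(\mathbf{y}-F(\mathbf{x})\bigr)
\;+\;
\bigl(\mathbf{x}-\mathbf{x}_a\bigr)^{\mathsf T}\,\mathbf{S}_a^{-1}\,\bigl(\mathbf{x}-\mathbf{x}_a\bigr)
\]
```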

  9. Outcome of the third cloud retrieval evaluation workshop

    NASA Astrophysics Data System (ADS)

    Roebeling, Rob; Baum, Bryan; Bennartz, Ralf; Hamann, Ulrich; Heidinger, Andy; Thoss, Anke; Walther, Andi

    2013-05-01

    Accurate measurements of global distributions of cloud parameters and their diurnal, seasonal, and interannual variations are needed to improve understanding of the role of clouds in the weather and climate system, and to monitor their time-space variations. Cloud properties retrieved from satellite observations, such as cloud vertical placement, cloud water path and cloud particle size, play an important role in such studies. In order to give climate and weather researchers more confidence in the quality of these retrievals, their validity needs to be determined and their error characteristics must be quantified. The purpose of the Cloud Retrieval Evaluation Workshop (CREW), held from 15-18 Nov. 2011 in Madison, Wisconsin, USA, is to enhance knowledge of state-of-the-art cloud property retrievals from passive imaging satellites, and pave the way towards optimizing these retrievals for climate monitoring as well as for the analysis of cloud parameterizations in climate and weather models. CREW also seeks to observe and understand methods used to prepare daily and monthly cloud parameter climatologies. An important workshop component is discussion of the results of the algorithm and sensor comparisons and validation studies. To this end, a common database with about 12 different cloud property retrievals from passive imagers (MSG, MODIS, AVHRR, POLDER and/or AIRS), complemented with cloud measurements that serve as a reference (CLOUDSAT, CALIPSO, AMSU, MISR), was prepared for a number of "golden days". The passive imager cloud property retrievals were inter-compared and validated against Cloudsat, Calipso and AMSU observations. In our presentation we summarize the outcome of the inter-comparison and validation work done in the framework of CREW, and elaborate on reasons for observed differences. More in-depth discussions were held on retrieval principles and validation, and on the utilization of cloud parameters for climate research. This was done in parallel breakout sessions on cloud vertical placement, cloud physical properties, and cloud climatologies. We present the recommendations of these sessions, propose a way forward to establish international partnerships on cloud research, and summarize actions defined to tailor CREW activities to missions of international programs, such as the Global Energy and Water Cycle Experiment (GEWEX) and Sustained, Co-Ordinated Processing of Environmental Satellite Data for Climate Monitoring (SCOPE-CM). Finally, attention is given to increasing the traceability and uniformity of different long-term and homogeneous records of cloud parameters.

  10. A Cloud Based Framework For Monitoring And Predicting Subsurface System Behaviour

    NASA Astrophysics Data System (ADS)

    Versteeg, R. J.; Rodzianko, A.; Johnson, D. V.; Soltanian, M. R.; Dwivedi, D.; Dafflon, B.; Tran, A. P.; Versteeg, O. J.

    2015-12-01

    Subsurface system behavior is driven and controlled by the interplay of physical, chemical, and biological processes which occur at multiple temporal and spatial scales. Capabilities to monitor, understand and predict this behavior in an effective and timely manner are needed both for scientific purposes and for effective subsurface system management. Such capabilities require three elements: models, data, and an enabling cyberinfrastructure which allows users to use these models and data in an effective manner. Under a DOE Office of Science funded STTR award, Subsurface Insights and LBNL have designed and implemented a cloud-based predictive assimilation framework (PAF) which automatically ingests, quality-controls and stores heterogeneous physical and chemical subsurface data and processes these data using different inversion and modeling codes to provide information on the current state and evolution of subsurface systems. PAF is implemented as a modular cloud-based software application with five components: (1) data acquisition, (2) data management, (3) data assimilation and processing, (4) visualization and result delivery, and (5) orchestration. Server-side, PAF uses ZF2 (a PHP web application framework) and Python, together with both open source (ODM2) and in-house-developed data models. Client-side, PAF uses CSS and JS to allow for interactive data visualization and analysis. Client-side modularity of the system (which allows for a responsive interface) is achieved by implementing each core capability of PAF (such as data visualization, user configuration and control, electrical geophysical monitoring, and email/SMS alerts on data streams) as a SPA (Single Page Application). One of the recent enhancements is the full integration of a number of flow and mass transport and parameter estimation codes (e.g., MODFLOW, MT3DMS, PHT3D, TOUGH, PFLOTRAN) in this framework. This integration allows for autonomous and user-controlled modeling of hydrological and geochemical processes. In our presentation we will discuss our software architecture and present the results of using these codes, as well as the overall performance of our framework using hydrological, geochemical and geophysical data from the LBNL SFA2 Rifle field site.

  11. BAMSI: a multi-cloud service for scalable distributed filtering of massive genome data.

    PubMed

    Ausmees, Kristiina; John, Aji; Toor, Salman Z; Hellander, Andreas; Nettelblad, Carl

    2018-06-26

    The advent of next-generation sequencing (NGS) has made whole-genome sequencing of cohorts of individuals a reality. Primary datasets of raw or aligned reads of this sort can get very large. For scientific questions where curated called variants are not sufficient, the sheer size of the datasets makes analysis prohibitively expensive. In order to make re-analysis of such data feasible without the need to have access to a large-scale computing facility, we have developed a highly scalable, storage-agnostic framework, an associated API and an easy-to-use web user interface to execute custom filters on large genomic datasets. We present BAMSI, a Software-as-a-Service (SaaS) solution for filtering of the 1000 Genomes phase 3 set of aligned reads, with the possibility of extension and customization to other sets of files. Unique to our solution is the capability of simultaneously utilizing many different mirrors of the data to increase the speed of the analysis. In particular, if the data are available in private or public clouds - an increasingly common scenario for both academic and commercial cloud providers - our framework allows for seamless deployment of filtering workers close to the data. We show results indicating that such a setup improves the horizontal scalability of the system, and present a possible use case of the framework by performing an analysis of structural variation in the 1000 Genomes data set. BAMSI constitutes a framework for efficient filtering of large genomic data sets that is flexible in the use of compute as well as storage resources. The data resulting from the filter are assumed to be greatly reduced in size, and can easily be downloaded or routed into, e.g., a Hadoop cluster for subsequent interactive analysis using Hive, Spark or similar tools. In this respect, our framework also suggests a general model for making very large datasets of high scientific value more accessible by offering the possibility for organizations to share the cost of hosting data on hot storage, without compromising the scalability of downstream analysis.
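
    The record above describes custom filters executed server-side over aligned reads. As a purely local, hedged illustration of the kind of filter involved (not BAMSI's implementation), the sketch below uses pysam to flag discordantly mapped read pairs, a simple proxy for structural-variation signals; the BAM file name and quality threshold are hypothetical.

```python
# Local illustration of the kind of read filter such a service might apply
# (BAMSI's own filter implementation is not shown in the abstract; the file name is hypothetical).
import pysam

def discordant_pairs(bam_path, min_mapq=20):
    """Yield paired reads that map discordantly, a simple proxy for structural-variation signals."""
    with pysam.AlignmentFile(bam_path, "rb") as bam:
        for read in bam:
            if read.is_unmapped or read.mate_is_unmapped:
                continue
            if read.mapping_quality < min_mapq:
                continue
            # Properly paired reads are concordant; everything else is a candidate SV signal.
            if read.is_paired and not read.is_proper_pair:
                yield read

if __name__ == "__main__":
    hits = sum(1 for _ in discordant_pairs("NA12878.chr20.bam"))
    print(f"discordantly mapped reads: {hits}")
```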

  12. Strategy for long-term 3D cloud-resolving simulations over the ARM SGP site and preliminary results

    NASA Astrophysics Data System (ADS)

    Lin, W.; Liu, Y.; Song, H.; Endo, S.

    2011-12-01

    Parametric representations of cloud and precipitation processes continue to be required in climate simulations with increasingly high spatial resolution or with emerging adaptive-mesh frameworks, and it is only becoming more critical that such parameterizations be scale-aware. Continuous cloud measurements at DOE's ARM sites have provided a strong observational basis for novel cloud parameterization research at various scales. Despite significant progress in our observational ability, there are important cloud-scale physical and dynamical quantities that are either not currently observable or insufficiently sampled. To complement the long-term ARM measurements, we have explored an optimal strategy for carrying out long-term 3-D cloud-resolving simulations over the ARM SGP site using the Weather Research and Forecasting (WRF) model with multi-domain nesting. The factors that are considered to have important influences on the simulated cloud fields include domain size, spatial resolution, model top, forcing data set, model physics and the growth of model errors. Hydrometeor advection, which may play a significant role in hydrological processes within the observational domain but is often lacking, and the limitations imposed by domain-wide uniform forcing in conventional cloud-system-resolving model simulations are at least partly accounted for in our approach. Conventional and probabilistic verification approaches are employed first for selected cases to optimize the model's capability of faithfully reproducing the observed mean and statistical distributions of cloud-scale quantities. This then forms the basis of our setup for long-term cloud-resolving simulations over the ARM SGP site. The model results will facilitate parameterization research, as well as understanding and dissecting parameterization deficiencies in climate models.

  13. Cloud and aerosol studies using combined CPL and MAS data

    NASA Astrophysics Data System (ADS)

    Vaughan, Mark A.; Rodier, Sharon; Hu, Yongxiang; McGill, Matthew J.; Holz, Robert E.

    2004-11-01

    Current uncertainties in the role of aerosols and clouds in the Earth's climate system limit our ability to model the climate system and predict climate change. These limitations are due primarily to the difficulty of adequately measuring aerosols and clouds on a global scale. The A-train satellites (Aqua, CALIPSO, CloudSat, PARASOL, and Aura) will provide an unprecedented opportunity to address these uncertainties. The various active and passive sensors of the A-train will use a variety of measurement techniques to provide comprehensive observations of the multi-dimensional properties of clouds and aerosols. However, fully achieving the potential of this ensemble requires a robust data analysis framework to optimally and efficiently map these individual measurements into a comprehensive set of cloud and aerosol physical properties. In this work we introduce the Multi-Instrument Data Analysis and Synthesis (MIDAS) project, whose goal is to develop a suite of physically sound and computationally efficient algorithms that will combine active and passive remote sensing data in order to produce improved assessments of aerosol and cloud radiative and microphysical properties. These algorithms include (a) an intelligent feature detection algorithm that combines inputs from both active and passive sensors, and (b) the identification of recognizable multi-instrument signatures related to aerosol and cloud type derived from clusters of image pixels and the associated vertical profile information. Classification of these signatures will lead to the automated identification of aerosol and cloud types. Testing of these new algorithms is done using currently existing and readily available active and passive measurements from the Cloud Physics Lidar and the MODIS Airborne Simulator, which simulate, respectively, the CALIPSO and MODIS A-train instruments.

  14. Moving image analysis to the cloud: A case study with a genome-scale tomographic study

    NASA Astrophysics Data System (ADS)

    Mader, Kevin; Stampanoni, Marco

    2016-01-01

    Over the last decade, the time required to measure a terabyte of microscopic imaging data has gone from years to minutes. This shift has moved many of the challenges away from experimental design and measurement to scalable storage, organization, and analysis. As many scientists and scientific institutions lack training and competencies in these areas, major bottlenecks have arisen and led to substantial delays and gaps between measurement, understanding, and dissemination. We present in this paper a framework for analyzing large 3D datasets using cloud-based computational and storage resources. We demonstrate its applicability by showing the setup and costs associated with the analysis of a genome-scale study of bone microstructure. We then evaluate the relative advantages and disadvantages associated with local versus cloud infrastructures.

  15. Open Reading Frame Phylogenetic Analysis on the Cloud

    PubMed Central

    2013-01-01

    Phylogenetic analysis has become essential in researching the evolutionary relationships between viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus. Evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships between members of Norovirus. PMID:23671843

  16. Cloud of strings in f(R) gravity

    NASA Astrophysics Data System (ADS)

    Morais Graça, J. P.; Lobo, Iarley P.; Salako, Ines G.

    2018-05-01

    We derive the solution for a spherically symmetric string cloud configuration in a d-dimensional spacetime in the framework of f(R) theories of gravity. We also analyze some thermodynamic properties of the combined black hole and cloud-of-strings solution. For its Hawking temperature, we found that the dependence of the mass on the horizon radius is significantly different in the two theories. For the interaction of a black hole with thermal radiation, we found that the shapes of the curves are similar, but shifted. Our analysis generalizes some known results in the literature. IPL is supported by Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq-Brazil) (150384/2017-3); JPMG and IPL thank Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) for financial support.

  17. The Matsu Wheel: A Cloud-Based Framework for Efficient Analysis and Reanalysis of Earth Satellite Imagery

    NASA Technical Reports Server (NTRS)

    Patterson, Maria T.; Anderson, Nicholas; Bennett, Collin; Bruggemann, Jacob; Grossman, Robert L.; Handy, Matthew; Ly, Vuong; Mandl, Daniel J.; Pederson, Shane; Pivarski, James

    2016-01-01

    Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery, with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework scans for new data and allows batches of analytics to be applied to the data as they flow in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, which can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources than alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The resulting products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution. The Matsu Wheel allows many shared data services to be performed together to efficiently use resources for processing hyperspectral satellite image data and other large environmental datasets that may be analyzed for many purposes.
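
    The record above describes the Wheel pattern: each new scene is accessed and preprocessed once, and every registered analytic is then applied to it. The sketch below is a minimal single-process illustration of that pattern only; the real Matsu Wheel runs on Hadoop/MapReduce over OpenStack, and all class, function and field names here are hypothetical.

```python
# Minimal single-process illustration of the "wheel" pattern described above:
# each new scene is read and preprocessed once, then every registered analytic is applied to it.
from typing import Callable, Dict, List

Analytic = Callable[[dict], dict]

class Wheel:
    def __init__(self) -> None:
        self.analytics: List[Analytic] = []

    def register(self, analytic: Analytic) -> None:
        self.analytics.append(analytic)

    def turn(self, new_scenes: List[dict]) -> Dict[str, List[dict]]:
        reports: Dict[str, List[dict]] = {}
        for scene in new_scenes:
            prepared = preprocess(scene)            # done once per scene, shared by all analytics
            reports[scene["id"]] = [a(prepared) for a in self.analytics]
        return reports

def preprocess(scene: dict) -> dict:
    # Stand-in for radiometric calibration, reprojection, band stacking, etc.
    return {**scene, "calibrated": True}

def anomaly_detector(scene: dict) -> dict:
    return {"analytic": "anomaly", "scene": scene["id"], "flagged": scene.get("max_temp", 0) > 320}

def land_cover_classifier(scene: dict) -> dict:
    return {"analytic": "land_cover", "scene": scene["id"], "water_fraction": scene.get("ndwi", 0.0)}

wheel = Wheel()
wheel.register(anomaly_detector)
wheel.register(land_cover_classifier)
print(wheel.turn([{"id": "EO1_scene_001", "max_temp": 331, "ndwi": 0.42}]))
```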

  18. Conjugated organic framework with three-dimensionally ordered stable structure and delocalized π clouds

    NASA Astrophysics Data System (ADS)

    Guo, Jia; Xu, Yanhong; Jin, Shangbin; Chen, Long; Kaji, Toshihiko; Honsho, Yoshihito; Addicoat, Matthew A.; Kim, Jangbae; Saeki, Akinori; Ihee, Hyotcherl; Seki, Shu; Irle, Stephan; Hiramoto, Masahiro; Gao, Jia; Jiang, Donglin

    2013-11-01

    Covalent organic frameworks are a class of crystalline organic porous materials that can utilize π-π-stacking interactions as a driving force for the crystallization of polygonal sheets to form layered frameworks and ordered pores. However, typical examples are chemically unstable and lack intrasheet π-conjugation, thereby significantly limiting their applications. Here we report a chemically stable, electronically conjugated organic framework with topologically designed wire frameworks and open nanochannels, in which the π-conjugation spans the two-dimensional sheets. Our framework permits inherent periodic ordering of conjugated chains in all three dimensions and exhibits a striking combination of properties: chemical stability, extended π-delocalization, the ability to host guest molecules, and hole mobility. We show that the π-conjugated organic framework is useful for high on-off-ratio photoswitches and photovoltaic cells. Therefore, this strategy may constitute a step towards realizing ordered semiconducting porous materials for innovations based on two-dimensionally extended π systems.

  19. NAPR: a Cloud-Based Framework for Neuroanatomical Age Prediction.

    PubMed

    Pardoe, Heath R; Kuzniecky, Ruben

    2018-01-01

    The availability of cloud computing services has enabled the widespread adoption of the "software as a service" (SaaS) approach for software distribution, which utilizes network-based access to applications running on centralized servers. In this paper we apply the SaaS approach to neuroimaging-based age prediction. Our system, named "NAPR" (Neuroanatomical Age Prediction using R), provides access to predictive modeling software running on a persistent cloud-based Amazon Web Services (AWS) compute instance. The NAPR framework allows external users to estimate the age of individual subjects using cortical thickness maps derived from their own locally processed T1-weighted whole-brain MRI scans. As a demonstration of the NAPR approach, we have developed two age prediction models that were trained using healthy control data from the ABIDE, CoRR, DLBS and NKI Rockland neuroimaging datasets (total N = 2367, age range 6-89 years). The provided age prediction models were trained using (i) relevance vector machine and (ii) Gaussian process machine learning methods applied to cortical thickness surfaces obtained using Freesurfer v5.3. We believe that this transparent approach to out-of-sample evaluation and comparison of neuroimaging age prediction models will facilitate the development of improved age prediction models and allow for robust evaluation of the clinical utility of these methods.
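
    The record above mentions Gaussian process models trained on cortical thickness data. The sketch below is a generic scikit-learn illustration of Gaussian-process age prediction on synthetic thickness features; NAPR itself is an R-based service with its own trained models, so the data, kernel choice and feature dimensionality here are assumptions.

```python
# Generic sketch of Gaussian-process age prediction from cortical thickness features.
# This only illustrates the modeling approach, using synthetic data; it is not NAPR's pipeline.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_subjects, n_features = 300, 50                      # 50 thickness features stand in for a full surface
age = rng.uniform(6, 89, n_subjects)
# Synthetic thinning with age plus noise, mimicking cortical thickness measurements (in mm).
thickness = 3.0 - 0.01 * age[:, None] + rng.normal(0, 0.1, (n_subjects, n_features))

X_train, X_test, y_train, y_test = train_test_split(thickness, age, random_state=0)
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(), normalize_y=True)
gpr.fit(X_train, y_train)

pred, std = gpr.predict(X_test, return_std=True)      # predictions come with uncertainty estimates
print(f"mean absolute error: {np.mean(np.abs(pred - y_test)):.1f} years")
```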

  20. SeReNA Project: studying aerosol interactions with cloud microphysics in the Amazon Basin

    NASA Astrophysics Data System (ADS)

    Correia, A. L.; Catandi, P. B.; Frigeri, F. F.; Ferreira, W. C.; Martins, J.; Artaxo, P.

    2012-12-01

    Cloud microphysics and its interaction with aerosols is a key atmospheric process for weather and climate. Interactions between clouds and aerosols can impact Earth's radiative balance and its hydrological and energy cycles, and are responsible for a large fraction of the uncertainty in climate models. On a planetary scale, the Amazon Basin is one of the most significant land sources of moisture and latent heat energy. Moreover, every year this region undergoes marked seasonal shifts in its atmospheric state, transitioning from clean to heavily polluted conditions due to the occurrence of seasonal biomass burning fires, which emit large amounts of smoke to the atmosphere. These conditions make the Amazon Basin a special place to study aerosol-cloud interactions. The SeReNA Project ("Remote sensing of clouds and their interaction with aerosols", from the acronym in Portuguese, @SerenaProject on Twitter) is an ongoing effort to experimentally investigate the impact of aerosols upon cloud microphysics in Amazonia. Vertical profiles of the effective radius of water droplets and ice particles in single convective clouds can be derived from measurements of the radiation emerging from cloud sides. Aerosol optical depth, cloud top properties, and meteorological parameters retrieved from satellites will be correlated with microphysical properties derived for single clouds. Maps of cloud brightness temperature will allow building temperature vs. effective radius profiles for hydrometeors in single clouds. Figure 1 shows an example extracted from Martins et al. (2011), illustrating a proof of concept for the kind of result expected within the framework of the SeReNA Project. The results will help foster quantitative knowledge about interactions between aerosols and clouds at a microphysical level. These interactions are a fundamental process in the context of global climate change; they are key to understanding basic processes within clouds and how aerosols can influence them. Reference: Martins et al. (2011) ACP, v.11, p.9485-9501. Available at: http://bit.ly/martinspaper Figure 1. Brightness temperature (left panel) and thermodynamic phase (right) of hydrometeors in the convective cloud shown in the middle panel. Extracted from Martins et al. (2011).

  1. Sloped terrain segmentation for autonomous drive using sparse 3D point cloud.

    PubMed

    Cho, Seoungjae; Kim, Jonghyun; Ikram, Warda; Cho, Kyungeun; Jeong, Young-Sik; Um, Kyhyun; Sim, Sungdae

    2014-01-01

    A ubiquitous environment for road travel that uses wireless networks requires the minimization of data exchange between vehicles. An algorithm that can segment the ground in real time is necessary for vehicles simultaneously executing autonomous driving to obtain location data from one another. This paper proposes a framework for segmenting the ground in real time using a sparse three-dimensional (3D) point cloud acquired from undulating terrain. A sparse 3D point cloud can be acquired by scanning the geography using light detection and ranging (LiDAR) sensors. For efficient ground segmentation, 3D point clouds are quantized in units of volume pixels (voxels) and overlapping data are eliminated. We reduce the non-overlapping voxels to two dimensions by implementing a lowermost heightmap. The ground area is determined on the basis of the number of voxels in each voxel group. We achieve real-time ground segmentation by proposing an approach that minimizes the comparisons between neighboring voxels. Furthermore, we experimentally verify that ground segmentation can be executed at about 19.31 ms per frame.
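
    The record above segments the ground by voxelizing the point cloud and reducing it to a lowermost heightmap. The sketch below illustrates just that voxelization and heightmap step with NumPy on synthetic sloped terrain; the voxel size and height tolerance are assumed, and the paper's optimized neighbor-comparison step is not reproduced.

```python
# Sketch of voxel quantization and a "lowermost heightmap" from a sparse LiDAR point cloud.
import numpy as np

def ground_mask(points: np.ndarray, voxel: float = 0.2, height_tol: float = 0.15) -> np.ndarray:
    """Label each point as ground if it lies within height_tol of the lowest point in its x-y cell.

    points: (N, 3) array of x, y, z coordinates in meters.
    """
    # Quantize x, y into 2-D grid cells (the "lowermost heightmap" resolution).
    cells = np.floor(points[:, :2] / voxel).astype(np.int64)
    # Map each point to a dense cell index.
    _, cell_idx = np.unique(cells, axis=0, return_inverse=True)
    # Lowest z per cell.
    floor = np.full(cell_idx.max() + 1, np.inf)
    np.minimum.at(floor, cell_idx, points[:, 2])
    # Compare each point against the floor height of its own cell.
    return (points[:, 2] - floor[cell_idx]) < height_tol

# Synthetic sloped terrain with a few obstacle points above it.
rng = np.random.default_rng(1)
xy = rng.uniform(-10, 10, (5000, 2))
z_ground = 0.1 * xy[:, 0] + rng.normal(0, 0.02, 5000)          # gentle slope plus sensor noise
pts = np.column_stack([xy, z_ground])
pts[:100, 2] += 1.5                                             # raise 100 points as "obstacles"
mask = ground_mask(pts)
print(f"ground points: {mask.sum()} of {len(pts)}")
```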

  2. The Impact of Low-Level Cloud Feedback on Persistent Changes in Atmospheric Circulation in the Pacific

    NASA Astrophysics Data System (ADS)

    Burgman, R.; Kirtman, B. P.; Clement, A. C.; Vazquez, H.

    2017-12-01

    Recent studies suggest that low clouds in the Pacific play an important role in observed decadal climate variability and future climate change. In this study, we implement a novel modeling experiment designed to isolate how interactions between local and remote feedbacks associated with low cloud, SSTs, and the large-scale circulation play a significant role in the observed persistence of tropical Pacific SST and associated North American drought. The modeling approach involves the incorporation of observed patterns of satellite-derived shortwave cloud radiative effect (SWCRE) into the coupled model framework and is ideally suited for examining the role of local and large-scale coupled feedbacks and ocean heat transport in Pacific decadal variability. We show that changes in SWCRE forcing in the eastern subtropical Pacific alone reproduce much of the observed changes in SST and atmospheric circulation over the past 16 years, including the observed changes in precipitation over much of the Western Hemisphere.

  3. Theoretical study of mixing in liquid clouds – Part 1: Classical concepts

    DOE PAGES

    Korolev, Alexei; Khain, Alex; Pinsky, Mark; ...

    2016-07-28

    The present study considers the final stages of in-cloud mixing in the framework of the classical concepts of homogeneous and extreme inhomogeneous mixing. Simple analytical relationships between basic microphysical parameters were obtained for homogeneous and extreme inhomogeneous mixing based on adiabatic considerations. It was demonstrated that during homogeneous mixing the functional relationships between the moments of the droplet size distribution hold only during the primary stage of mixing. Subsequent random mixing between already mixed parcels and undiluted cloud parcels breaks these relationships. However, during extreme inhomogeneous mixing the functional relationships between the microphysical parameters hold for both primary and subsequent mixing. The obtained relationships can be used to identify the type of mixing from in situ observations. The effectiveness of the developed method was demonstrated using in situ data collected in convective clouds. It was found that for the specific set of in situ measurements the interaction between cloudy and entrained environments was dominated by extreme inhomogeneous mixing.

  4. Microphysics in Multi-scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2012-01-01

    Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE model), (2) a regional-scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, longwave and shortwave radiative transfer, land processes, and explicit cloud-radiation and cloud-land surface interactive processes are applied across this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the microphysics development and its performance within the multi-scale modeling system will be presented.

  5. MaMR: High-performance MapReduce programming model for material cloud applications

    NASA Astrophysics Data System (ADS)

    Jing, Weipeng; Tong, Danyu; Wang, Yangang; Wang, Jingyuan; Liu, Yaqiu; Zhao, Peng

    2017-02-01

    With the increasing data size in materials science, existing programming models no longer satisfy the application requirements. MapReduce is a programming model that enables the easy development of scalable parallel applications to process big data on cloud computing systems. However, this model does not directly support the processing of multiple related data sets, and its processing performance does not reflect the advantages of cloud computing. To enhance the capability of workflow applications in material data processing, we define MaMR, a programming model for material cloud applications that supports multiple different Map and Reduce functions running concurrently, based on a hybrid shared-memory BSP model. An optimized data sharing strategy to supply the shared data to the different Map and Reduce stages was also designed. We added a new merge phase to MapReduce that can efficiently merge data from the map and reduce modules. Experiments showed that the model and framework provide effective performance improvements compared with previous work.

  6. MarDRe: efficient MapReduce-based removal of duplicate DNA reads in the cloud.

    PubMed

    Expósito, Roberto R; Veiga, Jorge; González-Domínguez, Jorge; Touriño, Juan

    2017-09-01

    This article presents MarDRe, a de novo cloud-ready duplicate and near-duplicate removal tool that can process single- and paired-end reads from FASTQ/FASTA datasets. MarDRe takes advantage of the widely adopted MapReduce programming model to fully exploit Big Data technologies on cloud-based infrastructures. Written in Java to maximize cross-platform compatibility, MarDRe is built upon the open-source Apache Hadoop project, the most popular distributed computing framework for scalable Big Data processing. On a 16-node cluster deployed on the Amazon EC2 cloud platform, MarDRe is up to 8.52 times faster than a representative state-of-the-art tool. Source code in Java and Hadoop as well as a user's guide are freely available under the GNU GPLv3 license at http://mardre.des.udc.es.

  7. New Techniques for Deep Learning with Geospatial Data using TensorFlow, Earth Engine, and Google Cloud Platform

    NASA Astrophysics Data System (ADS)

    Hancher, M.

    2017-12-01

    Recent years have seen promising results from many research teams applying deep learning techniques to geospatial data processing. In that same timeframe, TensorFlow has emerged as the most popular framework for deep learning in general, and Google has assembled petabytes of Earth observation data from a wide variety of sources and made them available in analysis-ready form in the cloud through Google Earth Engine. Nevertheless, developing and applying deep learning to geospatial data at scale has been somewhat cumbersome to date. We present a new set of tools and techniques that simplify this process. Our approach combines the strengths of several underlying tools: TensorFlow for its expressive deep learning framework; Earth Engine for data management, preprocessing, postprocessing, and visualization; and other tools in Google Cloud Platform to train TensorFlow models at scale, perform additional custom parallel data processing, and drive the entire process from a single familiar Python development environment. These tools can be used to easily apply standard deep neural networks, convolutional neural networks, and other custom model architectures to a variety of geospatial data structures. We discuss our experiences applying these and related tools to a range of machine learning problems, including classic problems such as cloud detection, building detection, and land cover classification, as well as more novel problems such as illegal fishing detection. Our improved tools will make it easier for geospatial data scientists to apply modern deep learning techniques to their own problems, and will also make it easier for machine learning researchers to advance the state of the art of those techniques.
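
    The record above combines Earth Engine for data handling with TensorFlow for model training, applied to tasks such as cloud detection. The sketch below shows only the TensorFlow side: a minimal tf.keras convolutional patch classifier trained on synthetic multi-band patches. It is not the authors' pipeline; the patch size, band count and labels are assumptions, and no Earth Engine export or Cloud training step is shown.

```python
# Minimal, illustrative tf.keras patch classifier for a task like cloud detection.
import numpy as np
import tensorflow as tf

PATCH, BANDS = 32, 4                     # e.g., 32x32 pixel patches with 4 spectral bands (assumed)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(PATCH, PATCH, BANDS)),
    tf.keras.layers.Conv2D(16, 3, activation="relu", padding="same"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # probability that the patch is cloudy
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Synthetic stand-in for labeled training patches (cloudy patches are brighter on average).
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, 512)
patches = rng.normal(0.2 + 0.5 * labels[:, None, None, None], 0.1, (512, PATCH, PATCH, BANDS))

model.fit(patches, labels, epochs=3, batch_size=32, verbose=0)
print("training accuracy:", model.evaluate(patches, labels, verbose=0)[1])
```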

  8. A cloud model based multi-attribute decision making approach for selection and evaluation of groundwater management schemes

    NASA Astrophysics Data System (ADS)

    Lu, Hongwei; Ren, Lixia; Chen, Yizhong; Tian, Peipei; Liu, Jia

    2017-12-01

    Due to the uncertainty (i.e., fuzziness, stochasticity and imprecision) that exists throughout the groundwater remediation process, the accuracy of ranking results obtained by traditional methods has been limited. This paper proposes a cloud model based multi-attribute decision making framework (CM-MADM) with Monte Carlo simulation for the selection and evaluation of contaminated-groundwater remediation strategies. The cloud model is used to handle imprecise numerical quantities and can describe the fuzziness and stochasticity of the information fully and precisely. In the proposed approach, the contaminated concentrations are aggregated via the backward cloud generator and the weights of attributes are calculated by employing the weight cloud module. A case study on remedial alternative selection for a contaminated site suffering from a 1,1,1-trichloroethylene leakage problem in Shanghai, China is conducted to illustrate the efficiency and applicability of the developed approach. In total, an attribute system consisting of ten attributes, including daily total pumping rate, total cost and cloud model based health risk, was used to evaluate each alternative through the developed method under uncertainty. Results indicated that A14 was the most preferred alternative for the 5-year remediation horizon, A5 for the 10-year, A4 for the 15-year and A6 for the 20-year.
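
    The record above builds on the cloud model, whose basic numerical characteristics are expectation (Ex), entropy (En) and hyper-entropy (He). The sketch below implements the standard forward normal cloud generator that such methods rest on, as an illustration only; the paper's backward cloud generator and weight cloud module are not reproduced, and the parameter values are hypothetical.

```python
# Standard forward normal cloud generator, a basic building block of cloud-model methods.
import numpy as np

def forward_cloud(ex: float, en: float, he: float, n: int, seed: int = 0):
    """Generate n cloud drops (x_i, mu_i) from expectation ex, entropy en, hyper-entropy he."""
    rng = np.random.default_rng(seed)
    en_prime = rng.normal(en, he, n)                    # per-drop entropy sample ("thickness" of the cloud)
    x = rng.normal(ex, np.abs(en_prime))                # drop position
    mu = np.exp(-(x - ex) ** 2 / (2 * en_prime ** 2))   # certainty degree of each drop
    return x, mu

# Example: a qualitative concept such as "acceptable total cost" around 1.0 (normalized units; assumed).
x, mu = forward_cloud(ex=1.0, en=0.1, he=0.02, n=2000)
print(f"mean drop position: {x.mean():.3f}, mean certainty degree: {mu.mean():.3f}")
```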

  9. Developing Present-day Proxy Cases Based on NARVAL Data for Investigating Low Level Cloud Responses to Future Climate Change.

    NASA Astrophysics Data System (ADS)

    Reilly, Stephanie

    2017-04-01

    The energy budget of the entire global climate is significantly influenced by the presence of boundary layer clouds. The main aim of the High Definition Clouds and Precipitation for Advancing Climate Prediction (HD(CP)2) project is to improve climate model predictions by means of process studies of clouds and precipitation. This study makes use of observed elevated moisture layers as a proxy of future changes in tropospheric humidity. The associated impact on radiative transfer triggers fast responses in boundary layer clouds, providing a framework for investigating this phenomenon. The investigation will be carried out using data gathered during the Next-generation Aircraft Remote-sensing for VALidation (NARVAL) South campaigns. Observational data will be combined with ECMWF reanalysis data to derive the large scale forcings for the Large Eddy Simulations (LES). Simulations will be generated for a range of elevated moisture layers, spanning a multi-dimensional phase space in depth, amplitude, elevation, and cloudiness. The NARVAL locations will function as anchor-points. The results of the large eddy simulations and the observations will be studied and compared in an attempt to determine how simulated boundary layer clouds react to changes in radiative transfer from the free troposphere. Preliminary LES results will be presented and discussed.

  10. A framework for expanding aqueous chemistry in the ...

    EPA Pesticide Factsheets

    This paper describes the development and implementation of an extendable aqueous-phase chemistry option (AQCHEM − KMT(I)) for the Community Multiscale Air Quality (CMAQ) modeling system, version 5.1. Here, the Kinetic PreProcessor (KPP), version 2.2.3, is used to generate a Rosenbrock solver (Rodas3) to integrate the stiff system of ordinary differential equations (ODEs) that describe the mass transfer, chemical kinetics, and scavenging processes of CMAQ clouds. CMAQ's standard cloud chemistry module (AQCHEM) is structurally limited to the treatment of a simple chemical mechanism. This work advances our ability to test and implement more sophisticated aqueous chemical mechanisms in CMAQ and further investigate the impacts of microphysical parameters on cloud chemistry. Box model cloud chemistry simulations were performed to choose efficient solver and tolerance settings, evaluate the implementation of the KPP solver, and assess the direct impacts of alternative solver and kinetic mass transfer on predicted concentrations for a range of scenarios. Month-long CMAQ simulations for winter and summer periods over the US reveal the changes in model predictions due to these cloud module updates within the full chemical transport model. While monthly average CMAQ predictions are not drastically altered between AQCHEM and AQCHEM − KMT, hourly concentration differences can be significant. With added in-cloud secondary organic aerosol (SOA) formation from bio
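    As a purely illustrative sketch of integrating a stiff aqueous-phase-style ODE system (not the CMAQ or KPP code), the example below uses SciPy's implicit Radau method as a stand-in for the KPP-generated Rosenbrock (Rodas3) solver; the two species, rate constants, and mass-transfer coefficient are hypothetical.

    ```python
    # Toy stiff system: gas-to-droplet mass transfer feeding a fast aqueous reaction,
    # integrated with an implicit solver. All values are hypothetical placeholders.
    from scipy.integrate import solve_ivp

    k_aq = 1.0e3      # fast aqueous reaction rate (1/s), hypothetical
    k_mt = 5.0e-2     # gas-to-droplet mass transfer rate (1/s), hypothetical
    c_gas = 1.0e-9    # constant gas-phase reservoir concentration, hypothetical

    def rhs(t, y):
        s_iv, s_vi = y                          # e.g. dissolved S(IV) and S(VI)
        dsiv = k_mt * (c_gas - s_iv) - k_aq * s_iv
        dsvi = k_aq * s_iv
        return [dsiv, dsvi]

    # Radau is an implicit stiff integrator; SciPy has no Rosenbrock (Rodas3) method.
    sol = solve_ivp(rhs, (0.0, 3600.0), [0.0, 0.0], method="Radau",
                    rtol=1e-6, atol=1e-12)
    print(sol.y[:, -1])
    ```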

  11. On the Design of Smart Homes: A Framework for Activity Recognition in Home Environment.

    PubMed

    Cicirelli, Franco; Fortino, Giancarlo; Giordano, Andrea; Guerrieri, Antonio; Spezzano, Giandomenico; Vinci, Andrea

    2016-09-01

    A smart home is a home environment enriched with sensing, actuation, communication and computation capabilities that permit adapting it to the inhabitants' preferences and requirements. Establishing a proper actuation strategy for the home environment can require complex computational tasks on the sensed data. This is the case of activity recognition, which consists in retrieving high-level knowledge about what occurs in the home environment and about the behaviour of the inhabitants. The inherent complexity of this application domain calls for tools able to properly support the design and implementation phases. This paper proposes a framework for the design and implementation of smart home applications focused on activity recognition in home environments. The framework mainly relies on the Cloud-assisted Agent-based Smart home Environment (CASE) architecture, which offers basic abstraction entities that make it easy to design and implement Smart Home applications. CASE is a three-layered architecture that exploits the distributed multi-agent paradigm and cloud technology to offer analytics services. Details about how to implement activity recognition on the CASE architecture are supplied, focusing on the low-level technological issues as well as the algorithms and methodologies useful for activity recognition. The effectiveness of the framework is shown through a case study consisting of daily activity recognition of a person in a home environment.
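    A generic illustration of such an activity-recognition pipeline (not the CASE framework's own algorithms) is sketched below: binary home-sensor events are aggregated into sliding-window feature vectors and a simple classifier labels the activity in each window; the sensor names, window length, and labels are hypothetical.

    ```python
    # Toy sliding-window activity recognition from home-sensor events.
    # Sensor names, window length, events, and labels are hypothetical.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    SENSORS = ["kitchen_motion", "stove_power", "bed_pressure", "door_contact"]
    WINDOW = 60  # seconds per window, hypothetical

    def window_features(events, t0):
        """events: list of (timestamp, sensor) firings; returns per-sensor counts."""
        counts = np.zeros(len(SENSORS))
        for ts, sensor in events:
            if t0 <= ts < t0 + WINDOW:
                counts[SENSORS.index(sensor)] += 1
        return counts

    # Two labeled windows ("cooking" vs "sleeping") built from a toy event stream.
    events = [(5, "kitchen_motion"), (12, "stove_power"), (20, "kitchen_motion"),
              (70, "bed_pressure"), (95, "bed_pressure")]
    X = np.array([window_features(events, 0), window_features(events, 60)])
    y = np.array(["cooking", "sleeping"])

    clf = RandomForestClassifier(n_estimators=10, random_state=0).fit(X, y)
    print(clf.predict([window_features(events, 0)]))
    ```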

  12. Temporal evolution of stable water isotopologues in cloud droplets in a hill cap cloud in central Europe (HCCT-2010)

    USGS Publications Warehouse

    Spiegel, J.K.; Aemisegger, F.; Scholl, M.; Wienhold, F.G.; Collett, J.L.; Lee, T.; van Pinxteren, D.; Mertes, S.; Tilgner, A.; Herrmann, H.; Werner, Roland A.; Buchmann, N.; Eugster, W.

    2012-01-01

    In this work, we present the first study resolving the temporal evolution of δ2H and δ18O values in cloud droplets during 13 different cloud events. The cloud events were probed on a 937 m high mountain chain in Germany in the framework of the Hill Cap Cloud Thuringia 2010 campaign (HCCT-2010) in September and October 2010. The δ values of cloud droplets ranged from −77‰ to −15‰ (δ2H) and from −12.1‰ to −3.9‰ (δ18O) over the whole campaign. The cloud water line of the measured δ values was δ2H=7.8×δ18O+13×10−3, which is of similar slope, but with higher deuterium excess than other Central European Meteoric Water Lines. Decreasing δ values in the course of the campaign agree with seasonal trends observed in rain in central Europe. The deuterium excess was higher in clouds developing after recent precipitation revealing episodes of regional moisture recycling. The variations in δ values during one cloud event could either result from changes in meteorological conditions during condensation or from variations in the δ values of the water vapor feeding the cloud. To test which of both aspects dominated during the investigated cloud events, we modeled the variation in δ values in cloud water using a closed box model. We could show that the variation in δ values of two cloud events was mainly due to changes in local temperature conditions. For the other eleven cloud events, the variation was most likely caused by changes in the isotopic composition of the advected and entrained vapor. Frontal passages during two of the latter cloud events led to the strongest temporal changes in both δ2H (≈ 6‰ per hour) and δ18O (≈ 0.6‰ per hour). Moreover, a detailed trajectory analysis for the two longest cloud events revealed that variations in the entrained vapor were most likely related to rain out or changes in relative humidity and temperature at the moisture source region or both. This study illustrates the sensitivity of stable isotope composition of cloud water to changes in large scale air mass properties and regional recycling of moisture.

  13. Scale Interactions in the Tropics from a Simple Multi-Cloud Model

    NASA Astrophysics Data System (ADS)

    Niu, X.; Biello, J. A.

    2017-12-01

    Our incomplete understanding of the interaction between moist convection and equatorial waves remains an impediment in the numerical simulation of large-scale organization, such as the Madden-Julian Oscillation (MJO). The aim of this project is to understand interactions across spatial scales in the tropics, using a simplified framework for scale interactions together with a simplified description of the basic features of moist convection. Using multiple asymptotic scales, Biello and Majda [1] derived a multi-scale model of moist tropical dynamics (IMMD [1]), which separates three regimes: the planetary scale climatology, the synoptic scale waves, and the planetary scale anomalies regime. The scales and strength of the observed MJO would place it in the regime of planetary scale anomalies, which themselves are forced by non-linear upscale fluxes from the synoptic-scale waves. In order to close this model and determine whether it provides a self-consistent theory of the MJO, a model for diabatic heating due to moist convection must be implemented along with the IMMD. The multi-cloud parameterization is a model proposed by Khouider and Majda [2] to describe the three basic cloud types (congestus, deep and stratiform) that are most responsible for tropical diabatic heating. We implement a simplified version of the multi-cloud model that is based on results derived from large eddy simulations of convection [3]. We present this simplified multi-cloud model and show results of numerical experiments beginning with a variety of convective forcing states. Preliminary results on upscale fluxes, from synoptic scales to planetary scale anomalies, will be presented. [1] Biello J A, Majda A J. Intraseasonal multi-scale moist dynamics of the tropical atmosphere. Communications in Mathematical Sciences, 2010, 8(2): 519-540. [2] Khouider B, Majda A J. A simple multicloud parameterization for convectively coupled tropical waves. Part I: Linear analysis. Journal of the Atmospheric Sciences, 2006, 63(4): 1308-1323. [3] Dorrestijn J, Crommelin D T, Biello J A, et al. A data-driven multi-cloud model for stochastic parametrization of deep convection. Philosophical Transactions of the Royal Society of London A: Mathematical, Physical and Engineering Sciences, 2013, 371(1991): 20120374.

  14. Usability evaluation of cloud-based mapping tools for the display of very large datasets

    NASA Astrophysics Data System (ADS)

    Stotz, Nicole Marie

    The elasticity and on-demand nature of cloud services have made it easier to create web maps. Users only need access to a web browser and the Internet to utilize cloud based web maps, eliminating the need for specialized software. To encourage a wide variety of users, a map must be well designed; usability is a very important concept in designing a web map. Fusion Tables, a new product from Google, is one example of newer cloud-based distributed GIS services. It allows for easy spatial data manipulation and visualization within the Google Maps framework. ESRI has also introduced a cloud based version of their software, called ArcGIS Online, built on Amazon's EC2 cloud. Utilizing a user-centered design framework, two prototype maps were created with data from the San Diego East County Economic Development Council. One map was built on Fusion Tables, and another on ESRI's ArcGIS Online. A usability analysis was conducted and used to compare both map prototypes in terms of design and functionality. Load tests were also run, and performance metrics were gathered on both map prototypes. The usability analysis was completed by 25 geography students and consisted of time-based tasks and questions on map design and functionality. Survey participants completed the time-based tasks for the Fusion Tables map prototype more quickly than those of the ArcGIS Online map prototype. While response was generally positive towards the design and functionality of both prototypes, overall the Fusion Tables map prototype was preferred. For the load tests, the data set was broken into 22 groups for a total of 44 tests. While the Fusion Tables map prototype performed more efficiently than the ArcGIS Online prototype, the differences were almost unnoticeable. A SWOT analysis was conducted for each prototype. The results from this research point to the Fusion Tables map prototype as the preferred option. A redesign of this prototype would incorporate design suggestions from the usability survey, while some functionality would need to be dropped. Fusion Tables is a free product and would therefore be the best option if cost is an issue, but the map may not be supported in the future.

  15. Real-Time and Retrospective Health-Analytics-as-a-Service: A Novel Framework.

    PubMed

    Khazaei, Hamzeh; McGregor, Carolyn; Eklund, J Mikael; El-Khatib, Khalil

    2015-11-18

    Analytics-as-a-service (AaaS) is one of the latest provisions emerging from the cloud services family. Utilizing this paradigm of computing in health informatics will benefit patients, care providers, and governments significantly. This work is a novel approach to realizing health analytics as services, in critical care units in particular. To design, implement, evaluate, and deploy an extendable big-data compatible framework for health-analytics-as-a-service that offers both real-time and retrospective analysis. We present a novel framework that can realize health data analytics-as-a-service. The framework is flexible and configurable for different scenarios by utilizing the latest technologies and best practices for data acquisition, transformation, storage, analytics, knowledge extraction, and visualization. We have instantiated the proposed method through the Artemis project, that is, a customization of the framework for live monitoring and retrospective research on premature babies and ill term infants in neonatal intensive care units (NICUs). We demonstrated the proposed framework in this paper for monitoring NICUs and refer to it as the Artemis-In-Cloud (Artemis-IC) project. A pilot of Artemis has been deployed in the SickKids hospital NICU. By feeding the output of this pilot setup into an analytical model, we predict important performance measures for the final deployment of Artemis-IC. This process can be carried out for other hospitals following the same steps with minimal effort. SickKids' NICU has 36 beds and can classify the patients generally into 5 different types, including surgical and premature babies. The arrival rate is estimated as 4.5 patients per day, and the average length of stay was calculated as 16 days. The mean number of medical monitoring algorithms per patient is 9, which renders 311 live algorithms for the whole NICU running on the framework. The memory and computation power required for Artemis-IC to handle the SickKids NICU will be 32 GB and 16 CPU cores, respectively. The required amount of storage was estimated as 8.6 TB per year. On average, there will be 34.9 patients in the SickKids NICU at any time. Currently, 46% of patients cannot get admitted to SickKids NICU due to lack of resources. By increasing the capacity to 90 beds, all patients can be accommodated. For such a provisioning, Artemis-IC will need 16 TB of storage per year, 55 GB of memory, and 28 CPU cores. Our contributions in this work relate to a cloud architecture for the analysis of physiological data for clinical decision support for tertiary care use. We demonstrate how to size the equipment needed in the cloud for that architecture based on a very realistic assessment of the patient characteristics and the associated clinical decision support algorithms that would be required to run for those patients. We show the principle of how this could be performed and furthermore that it can be replicated for any critical care setting within a tertiary institution.
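    As a back-of-the-envelope check of this kind of capacity sizing (not the authors' analytical model), one can treat the NICU as an Erlang-loss (M/M/c/c) system using the arrival rate and length of stay quoted in the abstract:

    ```python
    # Erlang-loss sizing check using the abstract's arrival rate and length of stay.
    # This is a generic illustration, not the paper's own performance model.
    def erlang_b(servers, offered_load):
        """Erlang-B blocking probability via the standard recursion."""
        b = 1.0
        for k in range(1, servers + 1):
            b = offered_load * b / (k + offered_load * b)
        return b

    arrival_rate = 4.5      # patients per day (from the abstract)
    length_of_stay = 16.0   # days (from the abstract)
    offered = arrival_rate * length_of_stay

    for beds in (36, 90):
        blocking = erlang_b(beds, offered)
        carried = offered * (1.0 - blocking)
        print(f"{beds} beds: ~{blocking:.0%} of arrivals blocked, "
              f"~{carried:.1f} patients occupying beds on average")
    ```

    With these inputs, the toy model blocks roughly half of the arrivals at 36 beds and only a small fraction at 90 beds, broadly in line with the 46% figure and the full-admission claim above; the authors' own sizing model is of course more detailed.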

  16. Real-Time and Retrospective Health-Analytics-as-a-Service: A Novel Framework

    PubMed Central

    McGregor, Carolyn; Eklund, J Mikael; El-Khatib, Khalil

    2015-01-01

    Background Analytics-as-a-service (AaaS) is one of the latest provisions emerging from the cloud services family. Utilizing this paradigm of computing in health informatics will benefit patients, care providers, and governments significantly. This work is a novel approach to realizing health analytics as services, in critical care units in particular. Objective To design, implement, evaluate, and deploy an extendable big-data compatible framework for health-analytics-as-a-service that offers both real-time and retrospective analysis. Methods We present a novel framework that can realize health data analytics-as-a-service. The framework is flexible and configurable for different scenarios by utilizing the latest technologies and best practices for data acquisition, transformation, storage, analytics, knowledge extraction, and visualization. We have instantiated the proposed method through the Artemis project, that is, a customization of the framework for live monitoring and retrospective research on premature babies and ill term infants in neonatal intensive care units (NICUs). Results We demonstrated the proposed framework in this paper for monitoring NICUs and refer to it as the Artemis-In-Cloud (Artemis-IC) project. A pilot of Artemis has been deployed in the SickKids hospital NICU. By feeding the output of this pilot setup into an analytical model, we predict important performance measures for the final deployment of Artemis-IC. This process can be carried out for other hospitals following the same steps with minimal effort. SickKids’ NICU has 36 beds and can classify the patients generally into 5 different types, including surgical and premature babies. The arrival rate is estimated as 4.5 patients per day, and the average length of stay was calculated as 16 days. The mean number of medical monitoring algorithms per patient is 9, which renders 311 live algorithms for the whole NICU running on the framework. The memory and computation power required for Artemis-IC to handle the SickKids NICU will be 32 GB and 16 CPU cores, respectively. The required amount of storage was estimated as 8.6 TB per year. On average, there will be 34.9 patients in the SickKids NICU at any time. Currently, 46% of patients cannot get admitted to SickKids NICU due to lack of resources. By increasing the capacity to 90 beds, all patients can be accommodated. For such a provisioning, Artemis-IC will need 16 TB of storage per year, 55 GB of memory, and 28 CPU cores. Conclusions Our contributions in this work relate to a cloud architecture for the analysis of physiological data for clinical decision support for tertiary care use. We demonstrate how to size the equipment needed in the cloud for that architecture based on a very realistic assessment of the patient characteristics and the associated clinical decision support algorithms that would be required to run for those patients. We show the principle of how this could be performed and furthermore that it can be replicated for any critical care setting within a tertiary institution. PMID:26582268

  17. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based and designed to work on a single computer, which represents a major limitation in many ways, from limited processing and storage capacity to restricted accessibility and availability. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for developing specialized cloud geospatial applications that require only a web browser to be used. This cloud application is a collaborative geospatial platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computing environment, creates a real-time multiuser collaboration platform, uses interoperable programming languages and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on two VMs that communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users. The application is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can be used as a foundation for developing other specialized cloud geospatial applications. Further research will focus on distributing the cloud application across additional VMs and testing the scalability and availability of the services.

  18. The "Physical feedbacks of Arctic PBL, Sea ice, Cloud and AerosoL (PASCAL)" campaign during the Arctic POLARSTERN expedition PS106 in spring 2017.

    NASA Astrophysics Data System (ADS)

    Macke, A.

    2017-12-01

    The Polar regions are important components in the global climate system. The widespread surface snow and ice cover strongly impacts the surface energy budget, which is tightly coupled to global atmospheric and oceanic circulations. The coupling of sea ice, clouds and aerosol in the transition zone between Open Ocean and sea ice is the focus of the PASCAL investigations to improve our understanding of the recent dramatic reduction in Arctic sea-ice. A large variety of active/passive remote sensing, in-situ-aerosol observation, and spectral irradiance measurements have been obtained during the German research icebreaker POLARSTERN expedition PS106, and provided detailed information on the atmospheric spatiotemporal structure, aerosol and cloud chemical and microphysical properties as well as the resulting surface radiation budget. Nearly identical measurements at the AWIPEV Base (German - French Research Base) in Ny-Ålesund close to the Open Ocean and collocated airborne activities of the POLAR 5 and POLAR 6 AWI aircraft in the framework of the ACLOUD project have been carried out in parallel. The airborne observations have been supplemented by observations of the boundary layer structure (mean and turbulent quantities) from a tethered balloon reaching up to 1500 m, which was operated at an ice floe station nearby POLARSTERN for two weeks. All observational activities together with intense modelling at various scales are part of the German Collaborative Research Cluster TR 172 "Arctic Amplification" that aims to provide an unprecedented picture of the complex Arctic weather and climate system. The presentation provides an overview of the measurements on-board POLARSTERN and on the ice floe station during PASCAL from May 24 to July 21 2017. We conclude how these and future similar measurements during the one-year ice drift of POLARSTERN in the framework of MOSAiC help to reduce uncertainties in Arctic aerosol-cloud interaction, cloud radiative forcing, and surface/atmosphere feedback mechanisms.

  19. A Hierarchical Auction-Based Mechanism for Real-Time Resource Allocation in Cloud Robotic Systems.

    PubMed

    Wang, Lujia; Liu, Ming; Meng, Max Q-H

    2017-02-01

    Cloud computing enables users to share computing resources on-demand. The cloud computing framework cannot be directly mapped to cloud robotic systems with ad hoc networks since cloud robotic systems have additional constraints such as limited bandwidth and dynamic structure. However, most multirobotic applications with cooperative control adopt this decentralized approach to avoid a single point of failure. Robots need to continuously update intensive data to execute tasks in a coordinated manner, which implies real-time requirements. Thus, a resource allocation strategy is required, especially in such resource-constrained environments. This paper proposes a hierarchical auction-based mechanism, namely link quality matrix (LQM) auction, which is suitable for ad hoc networks by introducing a link quality indicator. The proposed algorithm produces a fast and robust method that is accurate and scalable. It reduces both global communication and unnecessary repeated computation. The proposed method is designed for firm real-time resource retrieval for physical multirobot systems. A joint surveillance scenario empirically validates the proposed mechanism by assessing several practical metrics. The results show that the proposed LQM auction outperforms state-of-the-art algorithms for resource allocation.
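    A toy, single-round illustration of link-quality-weighted bidding (not the paper's LQM auction itself) is sketched below; the robot identities, bids, link qualities, and resource count are hypothetical.

    ```python
    # Toy single-round auction: each robot's bid for a shared cloud resource is
    # discounted by its wireless link quality, and resources go to the highest
    # effective bids. All values below are hypothetical.
    def allocate(bids, link_quality, n_resources):
        """bids/link_quality: dicts keyed by robot id; returns winning robots."""
        effective = {r: bids[r] * link_quality[r] for r in bids}
        ranked = sorted(effective, key=effective.get, reverse=True)
        return ranked[:n_resources]

    bids = {"robot_a": 0.9, "robot_b": 0.7, "robot_c": 0.8}
    link_quality = {"robot_a": 0.4, "robot_b": 0.95, "robot_c": 0.6}

    print(allocate(bids, link_quality, n_resources=2))  # -> ['robot_b', 'robot_c']
    ```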

  20. A hybrid approach to estimate the complex motions of clouds in sky images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peng, Zhenzhou; Yu, Dantong; Huang, Dong

    Tracking the motion of clouds is essential to forecasting the weather and to predicting short-term solar energy generation. Existing techniques mainly fall into two categories: variational optical flow and block matching. In this article, we summarize recent advances in estimating cloud motion using ground-based sky imagers and quantitatively evaluate state-of-the-art approaches. We then propose a hybrid tracking framework that incorporates the strengths of both block matching and optical flow models. To validate the accuracy of the proposed approach, we introduce a series of synthetic images to simulate cloud movement and deformation, and thereafter comprehensively compare our hybrid approach with several representative tracking algorithms over both simulated and real images collected from various sites/imagers. The results show that our hybrid approach outperforms state-of-the-art models, reducing motion estimation errors by at least 30% relative to the ground-truth motions in most of the simulated image sequences. Furthermore, our hybrid model demonstrates superior efficiency on several real cloud image datasets, lowering the Mean Absolute Error (MAE) between predicted and ground-truth images by at least 15%.
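    In the spirit of such a hybrid tracker (and not the authors' exact pipeline), the sketch below uses OpenCV block matching on a central template to obtain a coarse displacement and dense Farneback optical flow to refine the motion field; the synthetic frames and parameter values are placeholders.

    ```python
    # Hybrid cloud-motion sketch: block matching for a coarse shift, dense optical
    # flow for refinement. Frames are synthetic stand-ins, shifted by (dx=6, dy=3),
    # which both estimators should recover approximately.
    import cv2
    import numpy as np

    rng = np.random.default_rng(0)
    base = cv2.GaussianBlur(rng.random((256, 256)).astype(np.float32), (0, 0), 8)
    prev_img = (255 * base).astype(np.uint8)
    next_img = np.roll(prev_img, shift=(3, 6), axis=(0, 1))  # dy=3, dx=6

    # 1) Coarse motion: match a central block of the first frame in the second.
    h, w = prev_img.shape
    block = prev_img[h // 4: 3 * h // 4, w // 4: 3 * w // 4]
    score = cv2.matchTemplate(next_img, block, cv2.TM_CCOEFF_NORMED)
    _, _, _, best = cv2.minMaxLoc(score)
    coarse_dx, coarse_dy = best[0] - w // 4, best[1] - h // 4

    # 2) Fine motion: dense Farneback optical flow between the two frames.
    flow = cv2.calcOpticalFlowFarneback(prev_img, next_img, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    fine_dx, fine_dy = flow.reshape(-1, 2).mean(axis=0)

    print(f"block matching: dx={coarse_dx}, dy={coarse_dy}")
    print(f"optical flow:   dx={fine_dx:.1f}, dy={fine_dy:.1f}")
    ```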

  1. A hybrid approach to estimate the complex motions of clouds in sky images

    DOE PAGES

    Peng, Zhenzhou; Yu, Dantong; Huang, Dong; ...

    2016-09-14

    Tracking the motion of clouds is essential to forecasting the weather and to predicting short-term solar energy generation. Existing techniques mainly fall into two categories: variational optical flow and block matching. In this article, we summarize recent advances in estimating cloud motion using ground-based sky imagers and quantitatively evaluate state-of-the-art approaches. We then propose a hybrid tracking framework that incorporates the strengths of both block matching and optical flow models. To validate the accuracy of the proposed approach, we introduce a series of synthetic images to simulate cloud movement and deformation, and thereafter comprehensively compare our hybrid approach with several representative tracking algorithms over both simulated and real images collected from various sites/imagers. The results show that our hybrid approach outperforms state-of-the-art models, reducing motion estimation errors by at least 30% relative to the ground-truth motions in most of the simulated image sequences. Furthermore, our hybrid model demonstrates superior efficiency on several real cloud image datasets, lowering the Mean Absolute Error (MAE) between predicted and ground-truth images by at least 15%.

  2. A systematic comparison of two-equation Reynolds-averaged Navier-Stokes turbulence models applied to shock-cloud interactions

    NASA Astrophysics Data System (ADS)

    Goodson, Matthew D.; Heitsch, Fabian; Eklund, Karl; Williams, Virginia A.

    2017-07-01

    Turbulence models attempt to account for unresolved dynamics and diffusion in hydrodynamical simulations. We develop a common framework for two-equation Reynolds-averaged Navier-Stokes turbulence models, and we implement six models in the athena code. We verify each implementation with the standard subsonic mixing layer, although the level of agreement depends on the definition of the mixing layer width. We then test the validity of each model into the supersonic regime, showing that compressibility corrections can improve agreement with experiment. For models with buoyancy effects, we also verify our implementation via the growth of the Rayleigh-Taylor instability in a stratified medium. The models are then applied to the ubiquitous astrophysical shock-cloud interaction in three dimensions. We focus on the mixing of shock and cloud material, comparing results from turbulence models to high-resolution simulations (up to 200 cells per cloud radius) and ensemble-averaged simulations. We find that the turbulence models lead to increased spreading and mixing of the cloud, although no two models predict the same result. Increased mixing is also observed in inviscid simulations at resolutions greater than 100 cells per radius, which suggests that the turbulent mixing begins to be resolved.
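    For readers outside the turbulence-modeling community, a minimal sketch of the generic incompressible k-epsilon member of this two-equation family is given below as a reference point only; it is not any of the specific compressible or buoyancy-extended variants implemented and compared in the study. Here P_k denotes the production of turbulent kinetic energy.

    ```latex
    % Generic incompressible k-epsilon model (reference form only).
    \begin{align}
      \frac{\partial k}{\partial t} + U_j \frac{\partial k}{\partial x_j}
        &= P_k - \varepsilon
         + \frac{\partial}{\partial x_j}\left[\left(\nu + \frac{\nu_t}{\sigma_k}\right)
           \frac{\partial k}{\partial x_j}\right],\\
      \frac{\partial \varepsilon}{\partial t} + U_j \frac{\partial \varepsilon}{\partial x_j}
        &= C_{\varepsilon 1}\frac{\varepsilon}{k} P_k
         - C_{\varepsilon 2}\frac{\varepsilon^2}{k}
         + \frac{\partial}{\partial x_j}\left[\left(\nu + \frac{\nu_t}{\sigma_\varepsilon}\right)
           \frac{\partial \varepsilon}{\partial x_j}\right],
      \qquad \nu_t = C_\mu \frac{k^2}{\varepsilon},
    \end{align}
    ```

    with the standard constants C_mu = 0.09, C_eps1 = 1.44, C_eps2 = 1.92, sigma_k = 1.0 and sigma_eps = 1.3.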

  3. Analyzing the requirements for a robust security criteria and management of multi-level security in the clouds

    NASA Astrophysics Data System (ADS)

    Farroha, Bassam S.; Farroha, Deborah L.

    2011-06-01

    The new corporate approach to efficient processing and storage is migrating from in-house service-center services to the newly coined approach of Cloud Computing. This approach advocates thin clients, with services provided by the service provider over time-shared resources. The concept is not new; however, the implementation approach presents a strategic shift in the way organizations provision and manage their IT resources. The requirements on some of the data sets targeted to run on the cloud vary depending on the data type, originator, user, and confidentiality level. Additionally, the systems that fuse such data would have to deal with classifying the product and clearing the computing resources prior to allowing a new application to be executed. This indicates that we could end up with a multi-level security system that needs to follow specific rules and must send its output to protected networks and systems in order to avoid data spills or contaminated resources. The paper discusses these requirements and their potential impact on the cloud architecture. Additionally, the paper discusses the unexpected advantages of the cloud framework in providing a sophisticated environment for information sharing and data mining.

  4. Overview of the DACCIWA ground-based field campaign in southern West Africa

    NASA Astrophysics Data System (ADS)

    Lohou, Fabienne; Kalthoff, Norbert; Brooks, Barbara; Jegede, Gbenga; Adler, Bianca; Ajao, Adewale; Ayoola, Muritala; Babić, Karmen; Bessardon, Geoffrey; Delon, Claire; Dione, Cheikh; Handwerker, Jan; Jambert, Corinne; Kohler, Martin; Lothon, Marie; Pedruzo-Bagazgoitia, Xabier; Smith, Victoria; Sunmonu, Lukman; Wieser, Andreas; Derrien, Solène

    2017-04-01

    During June and July 2016, a ground-based field campaign took place in southern West Africa within the framework of the Dynamics-aerosol-chemistry-cloud interactions in West Africa (DACCIWA) project. In the investigated region, extended low-level stratus clouds form very frequently during night-time and persist long into the following day influencing the diurnal cycle of the atmospheric boundary layer and, hence, the regional climate. The motivation for the measurements was to identify the meteorological controls on the whole process chain from the formation of nocturnal stratus clouds, via the daytime transition to convective clouds and the formation of deep precipitating clouds. During the measurement period, extensive remote sensing and in-situ measurements were performed at three supersites in Kumasi (Ghana), Savè (Benin) and Ile-Ife (Nigeria). The gathered observations included the energy-balance components at the Earth's surface, the mean and turbulent conditions in the nocturnal and daytime ABL as well as the de- and entrainment processes between the ABL and the free troposphere. The meteorological measurements were supplemented by aerosol and air-chemistry observations. We will give an overview of the conducted measurements including instrument availability and strategy during intensive observation periods.

  5. Cloud Based Web 3d GIS Taiwan Platform

    NASA Astrophysics Data System (ADS)

    Tsai, W.-F.; Chang, J.-Y.; Yan, S. Y.; Chen, B.

    2011-09-01

    This article presents the status of the web 3D GIS platform, which has been developed at the National Applied Research Laboratories. The purpose is to develop a global earth observation 3D GIS platform for applications to disaster monitoring and assessment in Taiwan. For quick response to preliminary and detailed assessment after a natural disaster occurs, the web 3D GIS platform is useful for accessing, transferring, integrating, displaying and analyzing huge multi-scale datasets following the international OGC standard. The cloud service framework for data warehouse management and efficiency enhancement using VMware is illustrated in this article.

  6. Evaluation of a Cloud Resolving Model Using TRMM Observations for Multiscale Modeling Applications

    NASA Technical Reports Server (NTRS)

    Posselt, Derek J.; L'Ecuyer, Tristan; Tao, Wei-Kuo; Hou, Arthur Y.; Stephens, Graeme L.

    2007-01-01

    The climate change simulation community is moving toward the use of global cloud resolving models (CRMs); however, current computational resources are not sufficient to run global CRMs over the hundreds of years necessary to produce climate change estimates. As an intermediate step between conventional general circulation models (GCMs) and global CRMs, many climate analysis centers are embedding a CRM in each grid cell of a conventional GCM. These Multiscale Modeling Frameworks (MMFs) represent a theoretical advance over the use of conventional GCM cloud and convection parameterizations, but have been shown to exhibit an overproduction of precipitation in the tropics during the northern hemisphere summer. In this study, simulations of clouds, precipitation, and radiation over the South China Sea using the CRM component of the NASA Goddard MMF are evaluated using retrievals derived from the instruments aboard the Tropical Rainfall Measuring Mission (TRMM) satellite platform for a 46-day time period that spans 5 May - 20 June 1998. The NASA Goddard Cumulus Ensemble (GCE) model is forced with observed large-scale forcing derived from soundings taken during the intensive observing period of the South China Sea Monsoon Experiment. It is found that the GCE configuration used in the NASA Goddard MMF responds too vigorously to the imposed large-scale forcing, accumulating too much moisture and producing too much cloud cover during convective phases, and overdrying the atmosphere and suppressing clouds during monsoon break periods. Sensitivity experiments reveal that changes to ice cloud microphysical parameters have a relatively large effect on simulated clouds, precipitation, and radiation, while changes to grid spacing and domain length have little effect on simulation results. The results motivate a more detailed and quantitative exploration of the sources and magnitude of the uncertainty associated with specified cloud microphysical parameters in the CRM components of MMFs.

  7. Outcome of the Third Cloud Retrieval Evaluation Workshop

    NASA Astrophysics Data System (ADS)

    Roebeling, R.; Baum, B.; Bennartz, R.; Hamann, U.; Heidinger, A.; Thoss, A.; Walther, A.

    2012-04-01

    Accurate measurements of global distributions of cloud parameters and their diurnal, seasonal, and inter-annual variations are needed to improve the understanding of the role of clouds in the weather and climate system, and to monitor their time-space variations. Cloud properties retrieved from satellite observations, such as cloud vertical placement, cloud water path and cloud particle size, play an important role in such studies. In order to give climate and weather researchers more confidence in the quality of these retrievals, their validity needs to be determined and their error characteristics need to be quantified. The purpose of the Cloud Retrieval Evaluation Workshop (CREW), which was held from 15-18 November 2011 in Madison, Wisconsin, USA, is to enhance our knowledge of state-of-the-art cloud property retrievals from passive imaging satellites, and to pave the path towards optimising these retrievals for climate monitoring as well as for the analysis of cloud parameterizations in climate and weather models. CREW also seeks to observe and understand methods that are used to prepare daily and monthly cloud parameter climatologies. An important component of the workshop is the discussion of the results of the algorithm and sensor comparisons and validation studies. To this end, a common database with about 12 different cloud property retrievals from passive imagers (MSG, MODIS, AVHRR, POLDER and/or AIRS), complemented with cloud measurements that serve as a reference (CLOUDSAT, CALIPSO, AMSU, MISR), was prepared for a number of "golden days". The passive imager cloud property retrievals were inter-compared and validated against Cloudsat, Calipso and AMSU observations. In our presentation we will summarize the outcome of the inter-comparison and validation work done in the framework of CREW, and elaborate on the reasons for the observed differences. More in-depth discussions were held on retrieval principles and validation, and on the utilization of cloud parameters for climate research, in parallel breakout sessions on cloud vertical placement, cloud physical properties, and cloud climatologies. We will present the recommendations of these sessions, propose a way forward to establish international partnerships on cloud research, and summarize the actions defined to tailor the CREW activities to missions of international programs, such as the Global Energy and Water Cycle Experiment (GEWEX) and Sustained, Co-Ordinated Processing of Environmental Satellite Data for Climate Monitoring (SCOPE-CM). Finally, attention will be given to increasing the traceability and uniformity of different long-term and homogeneous records of cloud parameters.

  8. A Coupled GCM-Cloud Resolving Modeling System, and a Regional Scale Model to Study Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2006-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. It requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF) to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud related datasets can provide initial conditions as well as validation for both the MMF and CRMs. The Goddard MMF is based on the 2D Goddard Cumulus Ensemble (GCE) model and the Goddard finite volume general circulation model (fvGCM), and it has started production runs, with two years of results (1998 and 1999). In this talk, I will present: (1) a brief review of the GCE model and its applications to precipitation processes (microphysical and land processes), (2) the Goddard MMF and the major differences between the two existing MMFs (CSU MMF and Goddard MMF), along with preliminary results (comparisons with traditional GCMs), and (3) a discussion of the Goddard WRF version (its developments and applications).

  9. Atlas2 Cloud: a framework for personal genome analysis in the cloud

    PubMed Central

    2012-01-01

    Background Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data, to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation remains a serious bottleneck. To this end, the cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. Results We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via the Amazon Web Services using Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. Conclusions We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms. PMID:23134663

  10. Atlas2 Cloud: a framework for personal genome analysis in the cloud.

    PubMed

    Evani, Uday S; Challis, Danny; Yu, Jin; Jackson, Andrew R; Paithankar, Sameer; Bainbridge, Matthew N; Jakkamsetti, Adinarayana; Pham, Peter; Coarfa, Cristian; Milosavljevic, Aleksandar; Yu, Fuli

    2012-01-01

    Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data, to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation remains a serious bottleneck. To this end, the cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via the Amazon Web Services using Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms.

  11. A Coupled fvGCM-GCE Modeling System: A 3D Cloud Resolving Model and a Regional Scale Model

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2005-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. It requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF) to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud related datasets can provide initial conditions as well as validation for both the MMF and CRMs. The Goddard MMF is based on the 2D Goddard Cumulus Ensemble (GCE) model and the Goddard finite volume general circulation model (fvGCM), and it has started production runs, with two years of results (1998 and 1999). Also, at Goddard, we have implemented several Goddard microphysical schemes (2ICE, several 3ICE), Goddard radiation (including explicitly calculated cloud optical properties), and the Goddard Land Information System (LIS, which includes the CLM and NOAH land surface models) into a next generation regional scale model, WRF. In this talk, I will present: (1) a brief review of the GCE model and its applications to precipitation processes (microphysical and land processes), (2) the Goddard MMF and the major differences between the two existing MMFs (CSU MMF and Goddard MMF), along with preliminary results (comparisons with traditional GCMs), (3) a discussion of the Goddard WRF version (its developments and applications), and (4) the characteristics of the four-dimensional cloud data sets (or cloud library) stored at Goddard.

  12. Radiative transfer in shallow cumulus cloud fields: Observations and first analysis with the Diram instrument during the BBC-2 field campaign in May 2003

    NASA Astrophysics Data System (ADS)

    van Dop, Han; Wilson, Keith M.

    2006-11-01

    The cloud albedo is a crucial parameter in radiation budget studies and is one of the main forcings in climate. We have designed and built a device, Diram (directional radiance distribution measurement device), which not only measures the reflection and transmission of solar radiation through clouds, but also determines the radiance distribution. The construction contains 42 sensors, each consisting of a collimation system and a detector, which are mounted in two domes (21 in each). The collimators reduce the field of view of each sensor to ~7°. The domes were mounted on top of and below the Meteo France Merlin IV research aircraft. The 42 signals were continuously logged at a frequency of 10 Hz during a number of flights in the framework of the Baltex Bridge-2 campaign at Cabauw (The Netherlands) in May 2003. The Diram instrument provided radiances during in situ observations of cumulus and (broken) stratocumulus clouds and detected anisotropic effects in solar radiation scattered by clouds, which are due to different cloud geometries and are related to microphysical cloud properties. Microphysical cloud properties were obtained from the Gerber PVM100A optical sensor aboard the aircraft. Liquid water content and particle surface area were logged at a frequency of 200 Hz. Data have been collected from a total of 10 days in different weather conditions (clear sky, broken cumulus, stratocumulus and multilayered cloud). A clear sky test of the Diram indicated that the device was able to reproduce the Rayleigh scattering pattern. During flights in stratocumulus fields, strongly anisotropic patterns were observed. The Diram observations confirm that in thin clouds a strong preference for forward scattering is observed in the transmitted radiation field, while for thicker clouds the pattern becomes more isotropic, with a slightly brighter centre relative to the limb direction.

  13. A Goddard Multi-Scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, W.K.; Anderson, D.; Atlas, R.; Chern, J.; Houser, P.; Hou, A.; Lang, S.; Lau, W.; Peters-Lidard, C.; Kakar, R.

    2008-01-01

    Numerical cloud resolving models (CRMs), which are based on the non-hydrostatic equations of motion, have been extensively applied to cloud-scale and mesoscale processes during the past four decades. Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that CRMs agree with observations in simulating various types of clouds and cloud systems from different geographic locations. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that Numerical Weather Prediction (NWP) and regional scale models can be run at grid sizes similar to those of cloud resolving models through nesting techniques. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. It requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF) to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign data can provide initial conditions as well as validation through the use of Earth satellite simulators. At Goddard, we have developed a multi-scale modeling system with unified physics. The modeling system consists of a coupled GCM-CRM (or MMF), a state-of-the-art Weather Research and Forecasting (WRF) model and a cloud-resolving model (the Goddard Cumulus Ensemble model). In these models, the same microphysical schemes (2ICE, several 3ICE), radiation (including explicitly calculated cloud optical properties), and surface models are applied. In addition, a comprehensive unified Earth satellite simulator has been developed at GSFC, which is designed to fully utilize the multi-scale modeling system. A brief review of the multi-scale modeling system with unified physics/simulator and examples is presented in this article.

  14. Influence of Saharan dust on cloud glaciation in southern Morocco during the Saharan Mineral Dust Experiment

    NASA Astrophysics Data System (ADS)

    Ansmann, A.; Tesche, M.; Althausen, D.; Müller, D.; Seifert, P.; Freudenthaler, V.; Heese, B.; Wiegner, M.; Pisani, G.; Knippertz, P.; Dubovik, O.

    2008-02-01

    Multiwavelength lidar, Sun photometer, and radiosonde observations were conducted at Ouarzazate (30.9°N, 6.9°W, 1133 m above sea level, asl), Morocco, in the framework of the Saharan Mineral Dust Experiment (SAMUM) in May-June 2006. The field site is close to the Saharan desert. Information on the depolarization ratio, backscatter and extinction coefficients, and lidar ratio of the dust particles, estimates of the available concentration of atmospheric ice nuclei at cloud level, profiles of temperature, humidity, and the horizontal wind vector as well as backward trajectory analysis are used to study cases of cloud formation in the dust with focus on heterogeneous ice formation. Surprisingly, most of the altocumulus clouds that form at the top of the Saharan dust layer, which reaches into heights of 4-7 km asl and has layer top temperatures of -8°C to -18°C, do not show any ice formation. According to the lidar observations the presence of a high number of ice nuclei (1-20 cm-3) does not automatically result in the obvious generation of ice particles, but the observations indicate that cloud top temperatures must typically reach values as low as -20°C before significant ice production starts. Another main finding is that liquid clouds are obviously required before ice crystals form via heterogeneous freezing mechanisms, and, as a consequence, that deposition freezing is not an important ice nucleation process. An interesting case with cloud seeding in the free troposphere above the dust layer is presented in addition. Small water clouds formed at about -30°C and produced ice virga. These virga reached water cloud layers several kilometers below the initiating cloud cells and caused strong ice production in these clouds at temperatures as high as -12°C to -15°C.

  15. Testing ice microphysics parameterizations in the NCAR Community Atmospheric Model Version 3 using Tropical Warm Pool-International Cloud Experiment data

    DOE PAGES

    Wang, Weiguo; Liu, Xiaohong; Xie, Shaocheng; ...

    2009-07-23

    Here, cloud properties have been simulated with a new double-moment microphysics scheme under the framework of the single-column version of NCAR Community Atmospheric Model version 3 (CAM3). For comparison, the same simulation was made with the standard single-moment microphysics scheme of CAM3. Results from both simulations compared favorably with observations during the Tropical Warm Pool–International Cloud Experiment by the U.S. Department of Energy Atmospheric Radiation Measurement Program in terms of the temporal variation and vertical distribution of cloud fraction and cloud condensate. Major differences between the two simulations are in the magnitude and distribution of ice water content within the mixed-phase cloud during the monsoon period, though the total frozen water (snow plus ice) contents are similar. The ice mass content in the mixed-phase cloud from the new scheme is larger than that from the standard scheme, and ice water content extends 2 km further downward, which is in better agreement with observations. The dependence of the frozen water mass fraction on temperature from the new scheme is also in better agreement with available observations. Outgoing longwave radiation (OLR) at the top of the atmosphere (TOA) from the simulation with the new scheme is, in general, larger than that with the standard scheme, while the surface downward longwave radiation is similar. Sensitivity tests suggest that different treatments of the ice crystal effective radius contribute significantly to the difference in the calculations of TOA OLR, in addition to cloud water path. Numerical experiments show that cloud properties in the new scheme can respond reasonably to changes in the concentration of aerosols and emphasize the importance of correctly simulating aerosol effects in climate models for aerosol-cloud interactions. Further evaluation, especially for ice cloud properties based on in-situ data, is needed.

  16. The Incorporation and Initialization of Cloud Water/ice in AN Operational Forecast Model

    NASA Astrophysics Data System (ADS)

    Zhao, Qingyun

    Quantitative precipitation forecasts have been one of the weakest aspects of numerical weather prediction models. Theoretical studies show that the errors in precipitation calculation can arise from three sources: errors in the large-scale forecasts of primary variables, errors in the crude treatment of condensation/evaporation and precipitation processes, and errors in the model initial conditions. A new precipitation parameterization scheme has been developed to investigate the forecast value of improved precipitation physics via the introduction of cloud water and cloud ice into a numerical prediction model. The main feature of this scheme is the explicit calculation of cloud water and cloud ice in both the convective and stratiform precipitation parameterization. This scheme has been applied to the eta model at the National Meteorological Center. Four extensive tests have been performed. The statistical results showed a significant improvement in the model precipitation forecasts. Diagnostic studies suggest that the inclusion of cloud ice is important in transferring water vapor to precipitation and in the enhancement of latent heat release; the latter subsequently affects the vertical motion field significantly. Since three-dimensional cloud data is absent from the analysis/assimilation system for most numerical models, a method has been proposed to incorporate observed precipitation and nephanalysis data into the data assimilation system to obtain the initial cloud field for the eta model. In this scheme, the initial moisture and vertical motion fields are also improved at the same time as cloud initialization. The physical initialization is performed in a dynamical initialization framework that uses the Newtonian dynamical relaxation method to nudge the model's wind and mass fields toward analyses during a 12-hour data assimilation period. Results from a case study showed that a realistic cloud field was produced by this method at the end of the data assimilation period. Precipitation forecasts have been significantly improved as a result of the improved initial cloud, moisture and vertical motion fields.
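    A minimal sketch of Newtonian dynamical relaxation (nudging), the idea underlying the assimilation step described above, is given below; the toy tendency, relaxation timescale, and analysis value are hypothetical, and this is not the eta-model assimilation code.

    ```python
    # Newtonian relaxation ("nudging"): the model state is relaxed toward an
    # analysis value with timescale tau during the assimilation window.
    def nudge(x0, analysis, tau, tendency, dt, nsteps):
        """Integrate dx/dt = tendency(x) + (analysis - x)/tau with forward Euler."""
        x = x0
        for _ in range(nsteps):
            x = x + dt * (tendency(x) + (analysis - x) / tau)
        return x

    # Toy setup: a slowly decaying variable nudged toward an analysis value of 1.0
    # over a 12-hour window (720 one-minute steps) with a 1-hour relaxation timescale.
    result = nudge(x0=0.0, analysis=1.0, tau=3600.0,
                   tendency=lambda x: -1.0e-5 * x, dt=60.0, nsteps=720)
    print(round(result, 3))
    ```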

  17. Robotic disaster recovery efforts with ad-hoc deployable cloud computing

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy; Marsh, Ronald; Mohammad, Atif F.

    2013-06-01

    Autonomous operation of search and rescue (SaR) robots is an ill-posed problem, further complicated by the dynamic disaster recovery environment. In a typical SaR response scenario, responder robots will require different levels of processing capability during various parts of the response effort and will need to utilize multiple algorithms. Placing all of these capabilities onboard the robot precludes algorithm-specific performance optimization and results in mediocre performance. An architecture for an ad-hoc, deployable cloud environment suitable for use in a disaster response scenario is presented. Under this model, each service provider is optimized for its task and maintains a database of situation-relevant information. This service-oriented architecture (SOA 3.0) compliant framework also serves as an example of the efficient use of SOA 3.0 in an actual cloud application.

  18. Cloud-based large-scale air traffic flow optimization

    NASA Astrophysics Data System (ADS)

    Cao, Yi

    The ever-increasing traffic demand makes the efficient use of airspace an imperative mission, and this paper presents an effort in response to this call. Firstly, a new aggregate model, called the Link Transmission Model (LTM), is proposed, which models the nationwide traffic as a network of flight routes identified by origin-destination pairs. The traversal time of a flight route is assumed to be the mode of the distribution of historical flight records, and the mode is estimated using Kernel Density Estimation. As this simplification abstracts away physical trajectory details, the complexity of modeling is drastically decreased, resulting in efficient traffic forecasting. The predictive capability of LTM is validated against recorded traffic data. Secondly, a nationwide traffic flow optimization problem with airport and en route capacity constraints is formulated based on LTM. The optimization problem aims at alleviating traffic congestion with minimal global delay. This problem is intractable due to millions of variables. A dual decomposition method is applied to decompose the large-scale problem so that the subproblems become solvable. However, the whole problem is still computationally expensive to solve, since each subproblem is a smaller integer programming problem that pursues integer solutions. Solving an integer programming problem is known to be far more time-consuming than solving its linear relaxation. In addition, sequential execution on a standalone computer leads to a linear runtime increase as the problem size grows. To address the computational efficiency problem, a parallel computing framework is designed which accommodates concurrent executions via multithreaded programming. The multithreaded version is compared with its monolithic version to show decreased runtime. Finally, an open-source cloud computing framework, Hadoop MapReduce, is employed for better scalability and reliability. This framework is an "off-the-shelf" parallel computing model that can be used for both offline historical traffic data analysis and online traffic flow optimization. It provides an efficient and robust platform for easy deployment and implementation. A small cloud consisting of five workstations was configured and used to demonstrate the advantages of cloud computing in dealing with large-scale parallelizable traffic problems.
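
    As a hedged sketch of the traversal-time estimation step described above (taking the mode of the distribution of historical flight records via Kernel Density Estimation), the following Python fragment is illustrative only; the function name, grid resolution and sample data are assumptions, not taken from the paper.

        import numpy as np
        from scipy.stats import gaussian_kde

        def traversal_time_mode(times_minutes):
            """Estimate the most likely traversal time (the KDE mode) for one route."""
            times = np.asarray(times_minutes, dtype=float)
            kde = gaussian_kde(times)                       # Gaussian kernel density estimate
            grid = np.linspace(times.min(), times.max(), 512)
            density = kde(grid)
            return grid[np.argmax(density)]                 # grid point with highest density

        # Hypothetical historical records (minutes) for one origin-destination route.
        historical = [132, 128, 135, 140, 129, 131, 133, 180, 127, 134]
        print(f"Estimated traversal time: {traversal_time_mode(historical):.1f} min")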

  19. Trends and New Directions in Software Architecture

    DTIC Science & Technology

    2014-10-10

    frameworks  Open source  Cloud strategies  NoSQL  Machine Learning  MDD  Incremental approaches  Dashboards  Distributed development...complexity grows  NoSQL Models are not created equal 2014 Our Current Research  Lightweight Evaluation and Architecture Prototyping for Big Data

  20. The Isprs Benchmark on Indoor Modelling

    NASA Astrophysics Data System (ADS)

    Khoshelham, K.; Díaz Vilariño, L.; Peter, M.; Kang, Z.; Acharya, D.

    2017-09-01

    Automated generation of 3D indoor models from point cloud data has been a topic of intensive research in recent years. While results on various datasets have been reported in literature, a comparison of the performance of different methods has not been possible due to the lack of benchmark datasets and a common evaluation framework. The ISPRS benchmark on indoor modelling aims to address this issue by providing a public benchmark dataset and an evaluation framework for performance comparison of indoor modelling methods. In this paper, we present the benchmark dataset comprising several point clouds of indoor environments captured by different sensors. We also discuss the evaluation and comparison of indoor modelling methods based on manually created reference models and appropriate quality evaluation criteria. The benchmark dataset is available for download at: http://www2.isprs.org/commissions/comm4/wg5/benchmark-on-indoor-modelling.html.

  1. An Efficient Mutual Authentication Framework for Healthcare System in Cloud Computing.

    PubMed

    Kumar, Vinod; Jangirala, Srinivas; Ahmad, Musheer

    2018-06-28

    The increasing role of Telecare Medicine Information Systems (TMIS) makes it possible for patients to explore medical treatment and to accumulate and access their medical data through internet connectivity. Security and privacy preservation are necessary for the patient's medical data in TMIS because of its highly sensitive nature. Recently, Mohit et al. proposed a mutual authentication protocol for TMIS in the cloud computing environment. In this work, we reviewed their protocol and found that it is not secure against the stolen-verifier, many-logged-in-patients, and impersonation attacks, does not preserve patient anonymity, and fails to protect the session key. To enhance the security level, we propose a new mutual authentication protocol for the same environment. The presented framework is also more efficient in terms of computation cost. In addition, the security evaluation shows that the protocol is resilient against all the considered security attacks, and we also carried out a formal security evaluation based on the random oracle model. The performance of the proposed protocol is much better than that of the existing protocol.
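
    The Python sketch below illustrates generic nonce-based mutual authentication with a pre-shared key, purely to convey the idea of two parties proving knowledge of a shared secret and then deriving a session key; it is not the protocol proposed in this work, and the key-setup step and all names are assumptions.

        import hmac, hashlib, os

        def prove(key, nonce):
            """Return an HMAC tag binding a party's response to the challenger's nonce."""
            return hmac.new(key, nonce, hashlib.sha256).digest()

        # Pre-shared secret between patient device and TMIS cloud server (assumed setup phase).
        key = os.urandom(32)

        # Server challenges the patient; patient responds; server verifies, then vice versa.
        server_nonce = os.urandom(16)
        patient_tag = prove(key, server_nonce)
        assert hmac.compare_digest(patient_tag, prove(key, server_nonce))   # server checks patient

        patient_nonce = os.urandom(16)
        server_tag = prove(key, patient_nonce)
        assert hmac.compare_digest(server_tag, prove(key, patient_nonce))   # patient checks server

        # A fresh session key can then be derived from both nonces (illustrative only).
        session_key = hashlib.sha256(key + server_nonce + patient_nonce).digest()
        print("Mutual authentication succeeded; session key established.")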

  2. Pole-Like Road Furniture Detection in Sparse and Unevenly Distributed Mobile Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Li, F.; Lehtomäki, M.; Oude Elberink, S.; Vosselman, G.; Puttonen, E.; Kukko, A.; Hyyppä, J.

    2018-05-01

    Pole-like road furniture detection has received much attention in recent years due to its traffic functionality. In this paper, we develop a framework to detect pole-like road furniture from sparse mobile laser scanning data. The framework is carried out in four steps. The unorganised point cloud is first partitioned. Then, after ground points are removed, the above-ground points are clustered and roughly classified. A slicing check in combination with cylinder masking is proposed to extract pole-like road furniture candidates. Pole-like road furniture objects are obtained after occlusion analysis in the last stage. The average completeness and correctness of pole-like road furniture detection in sparse and unevenly distributed mobile laser scanning data was above 0.83. This is comparable to the state of the art in pole-like road furniture detection for mobile laser scanning data of good quality and is potentially of practical use in the processing of point clouds collected by autonomous driving platforms.
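
    A simplified, hypothetical sketch of the slicing check mentioned above: a candidate cluster is cut into horizontal slices and accepted as pole-like only if every occupied slice stays within a small horizontal radius over a sufficient vertical extent. The thresholds below are illustrative assumptions, not the parameters used by the authors.

        import numpy as np

        def is_pole_like(points, slice_height=0.5, max_radius=0.3, min_slices=4):
            """points: (N, 3) array of x, y, z coordinates for one above-ground cluster."""
            z = points[:, 2]
            edges = np.arange(z.min(), z.max() + slice_height, slice_height)
            occupied = 0
            for lo, hi in zip(edges[:-1], edges[1:]):
                sl = points[(z >= lo) & (z < hi)]
                if len(sl) == 0:
                    continue
                center = sl[:, :2].mean(axis=0)
                radius = np.linalg.norm(sl[:, :2] - center, axis=1).max()
                if radius > max_radius:          # slice too wide to belong to a pole
                    return False
                occupied += 1
            return occupied >= min_slices        # require enough vertical extent

        # Hypothetical cluster: a vertical pole of points with small horizontal jitter.
        rng = np.random.default_rng(0)
        pole = np.column_stack([rng.normal(0, 0.05, 200),
                                rng.normal(0, 0.05, 200),
                                rng.uniform(0, 4, 200)])
        print(is_pole_like(pole))  # True for this synthetic pole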

  3. SenSyF Experience on Integration of EO Services in a Generic, Cloud-Based EO Exploitation Platform

    NASA Astrophysics Data System (ADS)

    Almeida, Nuno; Catarino, Nuno; Gutierrez, Antonio; Grosso, Nuno; Andrade, Joao; Caumont, Herve; Goncalves, Pedro; Villa, Guillermo; Mangin, Antoine; Serra, Romain; Johnsen, Harald; Grydeland, Tom; Emsley, Stephen; Jauch, Eduardo; Moreno, Jose; Ruiz, Antonio

    2016-08-01

    SenSyF is a cloud-based data processing framework for EO-based services. It has been a pioneer in addressing Big Data issues from the Earth Observation point of view, and is a precursor of several of the technologies and methodologies that will be deployed in ESA's Thematic Exploitation Platforms and other related systems. The SenSyF system focuses on developing fully automated data management, together with access to a processing and exploitation framework, including Earth Observation specific tools. SenSyF is both a development and validation platform for data-intensive applications using Earth Observation data. With SenSyF, scientific, institutional or commercial institutions developing EO-based applications and services can take advantage of distributed computational and storage resources, tailored for applications dependent on big Earth Observation data, without resorting to deep infrastructure and technological investments. This paper describes the integration process and the experience gathered from different EO service providers during the project.

  4. Cloud property datasets retrieved from AVHRR, MODIS, AATSR and MERIS in the framework of the Cloud_cci project

    NASA Astrophysics Data System (ADS)

    Stengel, Martin; Stapelberg, Stefan; Sus, Oliver; Schlundt, Cornelia; Poulsen, Caroline; Thomas, Gareth; Christensen, Matthew; Carbajal Henken, Cintia; Preusker, Rene; Fischer, Jürgen; Devasthale, Abhay; Willén, Ulrika; Karlsson, Karl-Göran; McGarragh, Gregory R.; Proud, Simon; Povey, Adam C.; Grainger, Roy G.; Fokke Meirink, Jan; Feofilov, Artem; Bennartz, Ralf; Bojanowski, Jedrzej S.; Hollmann, Rainer

    2017-11-01

    New cloud property datasets based on measurements from the passive imaging satellite sensors AVHRR, MODIS, ATSR2, AATSR and MERIS are presented. Two retrieval systems were developed that include components for cloud detection and cloud typing followed by cloud property retrievals based on the optimal estimation (OE) technique. The OE-based retrievals are applied to simultaneously retrieve cloud-top pressure, cloud particle effective radius and cloud optical thickness using measurements at visible, near-infrared and thermal infrared wavelengths, which ensures spectral consistency. The retrieved cloud properties are further processed to derive cloud-top height, cloud-top temperature, cloud liquid water path, cloud ice water path and spectral cloud albedo. The Cloud_cci products are pixel-based retrievals, daily composites of those on a global equal-angle latitude-longitude grid, and monthly cloud properties such as averages, standard deviations and histograms, also on a global grid. All products include rigorous propagation of the retrieval and sampling uncertainties. Grouping the orbital properties of the sensor families, six datasets have been defined, which are named AVHRR-AM, AVHRR-PM, MODIS-Terra, MODIS-Aqua, ATSR2-AATSR and MERIS+AATSR, each comprising a specific subset of all available sensors. The individual characteristics of the datasets are presented together with a summary of the retrieval systems and measurement records on which the dataset generation was based. Example validation results are given, based on comparisons to well-established reference observations, which demonstrate the good quality of the data. In particular, the ensured spectral consistency and the rigorous uncertainty propagation through all processing levels can be considered new features of the Cloud_cci datasets compared to existing datasets. In addition, the consistency among the individual datasets allows for a potential combination of them and facilitates studies on the impact of temporal sampling and spatial resolution on cloud climatologies.

    For each dataset a digital object identifier has been issued:

    Cloud_cci AVHRR-AM: https://doi.org/10.5676/DWD/ESA_Cloud_cci/AVHRR-AM/V002

    Cloud_cci AVHRR-PM: https://doi.org/10.5676/DWD/ESA_Cloud_cci/AVHRR-PM/V002

    Cloud_cci MODIS-Terra: https://doi.org/10.5676/DWD/ESA_Cloud_cci/MODIS-Terra/V002

    Cloud_cci MODIS-Aqua: https://doi.org/10.5676/DWD/ESA_Cloud_cci/MODIS-Aqua/V002

    Cloud_cci ATSR2-AATSR: https://doi.org/10.5676/DWD/ESA_Cloud_cci/ATSR2-AATSR/V002

    Cloud_cci MERIS+AATSR: https://doi.org/10.5676/DWD/ESA_Cloud_cci/MERIS+AATSR/V002

  5. Automated retrieval of cloud and aerosol properties from the ARM Raman lidar, part 1: feature detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thorsen, Tyler J.; Fu, Qiang; Newsom, Rob K.

    A Feature detection and EXtinction retrieval (FEX) algorithm for the Atmospheric Radiation Measurement (ARM) program’s Raman lidar (RL) has been developed. Presented here is part 1 of the FEX algorithm: the detection of features including both clouds and aerosols. The approach of FEX is to use multiple quantities — scattering ratios derived using elastic and nitrogen channel signals from two fields of view, the scattering ratio derived using only the elastic channel, and the total volume depolarization ratio — to identify features using range-dependent detection thresholds. FEX is designed to be context-sensitive with thresholds determined for each profile by calculating the expected clear-sky signal and noise. The use of multiple quantities provides complementary depictions of cloud and aerosol locations and allows for consistency checks to improve the accuracy of the feature mask. The depolarization ratio is shown to be particularly effective at detecting optically-thin features containing non-spherical particles such as cirrus clouds. Improvements over the existing ARM RL cloud mask are shown. The performance of FEX is validated against a collocated micropulse lidar and observations from the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) satellite over the ARM Darwin, Australia site. While we focus on a specific lidar system, the FEX framework presented here is suitable for other Raman or high spectral resolution lidars.
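
    A hedged sketch of the range-dependent thresholding idea: a range bin is flagged as a feature when its scattering ratio exceeds a threshold built from the expected clear-sky signal and noise. The clear-sky model and the threshold formula here are simplified assumptions, not the actual FEX implementation.

        import numpy as np

        def feature_mask(measured_signal, clearsky_signal, noise_std, k=3.0):
            """Flag range bins whose scattering ratio exceeds a noise-aware threshold.

            measured_signal, clearsky_signal, noise_std: 1-D arrays over range bins.
            k: number of noise standard deviations required above clear sky (assumed).
            """
            scattering_ratio = measured_signal / clearsky_signal
            # Range-dependent threshold: clear-sky ratio of 1 plus k noise-equivalent ratios.
            threshold = 1.0 + k * noise_std / clearsky_signal
            return scattering_ratio > threshold

        # Hypothetical profile with an enhanced-backscatter layer between bins 40 and 60.
        rng = np.random.default_rng(1)
        clear = np.exp(-np.linspace(0, 2, 100))           # idealized clear-sky attenuation
        noise = 0.01 * np.ones(100)
        signal = clear + rng.normal(0, 0.01, 100)
        signal[40:60] += 0.2                               # backscatter from a cloud/aerosol layer
        print(np.where(feature_mask(signal, clear, noise))[0])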

  6. ScipionCloud: An integrative and interactive gateway for large scale cryo electron microscopy image processing on commercial and academic clouds.

    PubMed

    Cuenca-Alba, Jesús; Del Cano, Laura; Gómez Blanco, Josué; de la Rosa Trevín, José Miguel; Conesa Mingo, Pablo; Marabini, Roberto; S Sorzano, Carlos Oscar; Carazo, Jose María

    2017-10-01

    New instrumentation for cryo electron microscopy (cryoEM) has significantly increased the data collection rate as well as data quality, creating bottlenecks at the image processing level. The current image processing model of moving the acquired images from the data source (the electron microscope) to desktops or local clusters for processing is encountering many practical limitations. However, computing may also take place in distributed and decentralized environments. In this context, the cloud is a new way of accessing computing and storage resources on demand. Here, we evaluate how this new computational paradigm can be effectively used by extending our current integrative framework for image processing, creating ScipionCloud. This new development has resulted in a full installation of Scipion in both public and private clouds, accessible as public "images" with all the required cryoEM software preinstalled, requiring only a Web browser to access all graphical user interfaces. We have profiled the performance of different configurations on Amazon Web Services and the European Federated Cloud, always on architectures incorporating GPUs, and compared them with a local facility. We have also analyzed the economic convenience of different scenarios, so cryoEM scientists have a clearer picture of the setup that is best suited for their needs and budgets. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  7. An efficient global energy optimization approach for robust 3D plane segmentation of point clouds

    NASA Astrophysics Data System (ADS)

    Dong, Zhen; Yang, Bisheng; Hu, Pingbo; Scherer, Sebastian

    2018-03-01

    Automatic 3D plane segmentation is necessary for many applications including point cloud registration, building information model (BIM) reconstruction, simultaneous localization and mapping (SLAM), and point cloud compression. However, most of the existing 3D plane segmentation methods still suffer from low precision and recall, and inaccurate and incomplete boundaries, especially for low-quality point clouds collected by RGB-D sensors. To overcome these challenges, this paper formulates the plane segmentation problem as a global energy optimization because it is robust to high levels of noise and clutter. First, the proposed method divides the raw point cloud into multiscale supervoxels, and considers planar supervoxels and individual points corresponding to nonplanar supervoxels as basic units. Then, an efficient hybrid region growing algorithm is utilized to generate initial plane set by incrementally merging adjacent basic units with similar features. Next, the initial plane set is further enriched and refined in a mutually reinforcing manner under the framework of global energy optimization. Finally, the performances of the proposed method are evaluated with respect to six metrics (i.e., plane precision, plane recall, under-segmentation rate, over-segmentation rate, boundary precision, and boundary recall) on two benchmark datasets. Comprehensive experiments demonstrate that the proposed method obtained good performances both in high-quality TLS point clouds (i.e., http://SEMANTIC3D.NET)
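
    The region-growing step described above can be illustrated with a simplified, hypothetical sketch that fits a plane normal to each supervoxel by PCA and greedily merges adjacent supervoxels whose normals are nearly parallel. The angle threshold and the adjacency structure are assumptions for illustration, not the authors' implementation, and the resulting labels would still need the global energy refinement described in the paper.

        import numpy as np

        def unit_normal(points):
            """Plane normal of a point set via PCA (singular vector of smallest singular value)."""
            centered = points - points.mean(axis=0)
            _, _, vt = np.linalg.svd(centered, full_matrices=False)
            return vt[-1]

        def grow_planes(supervoxels, adjacency, angle_deg=10.0):
            """Greedy merge of adjacent supervoxels with similar normals.

            supervoxels: list of (N_i, 3) arrays; adjacency: dict index -> set of neighbor indices.
            Returns a label per supervoxel; equal labels mark the same initial plane segment.
            """
            normals = [unit_normal(sv) for sv in supervoxels]
            cos_thresh = np.cos(np.radians(angle_deg))
            labels = [-1] * len(supervoxels)
            current = 0
            for seed in range(len(supervoxels)):
                if labels[seed] != -1:
                    continue
                stack = [seed]
                labels[seed] = current
                while stack:
                    i = stack.pop()
                    for j in adjacency.get(i, set()):
                        if labels[j] == -1 and abs(np.dot(normals[i], normals[j])) > cos_thresh:
                            labels[j] = current
                            stack.append(j)
                current += 1
            return labels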

  8. Reconstruction of 3d Models from Point Clouds with Hybrid Representation

    NASA Astrophysics Data System (ADS)

    Hu, P.; Dong, Z.; Yuan, P.; Liang, F.; Yang, B.

    2018-05-01

    The three-dimensional (3D) reconstruction of urban buildings from point clouds has long been an active topic in applications related to human activities. However, because building structures differ significantly in complexity, the task of 3D reconstruction remains challenging, especially for freeform surfaces. In this paper, we present a new reconstruction algorithm that represents the 3D models of buildings as a combination of regular structures and irregular surfaces, where the regular structures are parameterized plane primitives and the irregular surfaces are expressed as meshes. The extraction of irregular surfaces starts with an over-segmentation of the unstructured point data; a region growing approach based on the adjacency graph of super-voxels is then applied to collapse these super-voxels, and the freeform surfaces are clustered from the voxels filtered by a thickness threshold. To obtain the regular planar primitives, the remaining voxels with higher flatness are further divided into multiscale super-voxels as basic units, and the final segmented planes are enriched and refined in a mutually reinforcing manner under the framework of a global energy optimization. We implemented the proposed algorithms and tested them mainly on two point clouds that differ in point density and urban character; experimental results on complex building structures illustrate the efficacy of the proposed framework.

  9. Cloud-Based Evaluation of Anatomical Structure Segmentation and Landmark Detection Algorithms: VISCERAL Anatomy Benchmarks.

    PubMed

    Jimenez-Del-Toro, Oscar; Muller, Henning; Krenn, Markus; Gruenberg, Katharina; Taha, Abdel Aziz; Winterstein, Marianne; Eggel, Ivan; Foncubierta-Rodriguez, Antonio; Goksel, Orcun; Jakab, Andras; Kontokotsios, Georgios; Langs, Georg; Menze, Bjoern H; Salas Fernandez, Tomas; Schaer, Roger; Walleyo, Anna; Weber, Marc-Andre; Dicente Cid, Yashin; Gass, Tobias; Heinrich, Mattias; Jia, Fucang; Kahl, Fredrik; Kechichian, Razmig; Mai, Dominic; Spanier, Assaf B; Vincent, Graham; Wang, Chunliang; Wyeth, Daniel; Hanbury, Allan

    2016-11-01

    Variations in the shape and appearance of anatomical structures in medical images are often relevant radiological signs of disease. Automatic tools can help automate parts of this manual process. A cloud-based evaluation framework is presented in this paper including results of benchmarking current state-of-the-art medical imaging algorithms for anatomical structure segmentation and landmark detection: the VISCERAL Anatomy benchmarks. The algorithms are implemented in virtual machines in the cloud where participants can only access the training data and can be run privately by the benchmark administrators to objectively compare their performance in an unseen common test set. Overall, 120 computed tomography and magnetic resonance patient volumes were manually annotated to create a standard Gold Corpus containing a total of 1295 structures and 1760 landmarks. Ten participants contributed with automatic algorithms for the organ segmentation task, and three for the landmark localization task. Different algorithms obtained the best scores in the four available imaging modalities and for subsets of anatomical structures. The annotation framework, resulting data set, evaluation setup, results and performance analysis from the three VISCERAL Anatomy benchmarks are presented in this article. Both the VISCERAL data set and Silver Corpus generated with the fusion of the participant algorithms on a larger set of non-manually-annotated medical images are available to the research community.
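
    Segmentation benchmarks of this kind typically score submissions with overlap measures against the manual annotations; the Dice coefficient below is shown as a representative example of such a metric (whether it is the exact score used here is not stated in this record), with hypothetical label volumes.

        import numpy as np

        def dice_coefficient(seg, ref):
            """Overlap between a binary segmentation and a reference annotation."""
            seg, ref = seg.astype(bool), ref.astype(bool)
            intersection = np.logical_and(seg, ref).sum()
            return 2.0 * intersection / (seg.sum() + ref.sum())

        # Hypothetical 3-D label volumes (1 = organ voxel, 0 = background).
        ref = np.zeros((4, 4, 4), dtype=int); ref[1:3, 1:3, 1:3] = 1
        seg = np.zeros((4, 4, 4), dtype=int); seg[1:3, 1:3, 0:3] = 1
        print(f"Dice: {dice_coefficient(seg, ref):.3f}")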

  10. Water isotope tracers of tropical hydroclimate in a warming world

    NASA Astrophysics Data System (ADS)

    Konecky, B. L.; Noone, D.; Nusbaumer, J. M.; Cobb, K. M.; Di Nezio, P. N.; Otto-Bliesner, B. L.

    2016-12-01

    The tropical water cycle is projected to undergo substantial changes under a warming climate, but direct meteorological observations to contextualize these changes are rare prior to the 20th century. Stable oxygen and hydrogen isotope ratios (δ18O, δD) of environmental waters preserved in geologic archives are increasingly being used to reconstruct terrestrial rainfall over many decades to millions of years. However, a rising number of new, modern-day observations and model simulations have challenged previous interpretations of these isotopic signatures. This presentation systematically evaluates the three main influences on the δ18O and δD of modern precipitation - rainfall amount, cloud type, and moisture transport - from terrestrial stations throughout the tropics, and uses this interpretive framework to understand past changes in terrestrial tropical rainfall. Results indicate that cloud type and moisture transport have a larger influence on modern δ18O and δD of tropical precipitation than previously believed, indicating that isotope records track changes in cloud characteristics and circulation that accompany warmer and cooler climate states. We use our framework to investigate isotopic records of the land-based tropical rain belt during the Last Glacial Maximum, the period of warming following the Little Ice Age, and the 21st century. Proxy and observational data are compared with water isotope-enabled simulations with the Community Earth System Model in order to discuss how global warming and cooling may influence tropical terrestrial hydroclimate.

  11. New Information Dispersal Techniques for Trustworthy Computing

    ERIC Educational Resources Information Center

    Parakh, Abhishek

    2011-01-01

    Information dispersal algorithms (IDA) are used for distributed data storage because they simultaneously provide security, reliability and space efficiency, constituting a trustworthy computing framework for many critical applications, such as cloud computing, in the information society. In the most general sense, this is achieved by dividing data…

  12. Conjugated organic framework with three-dimensionally ordered stable structure and delocalized π clouds

    PubMed Central

    Guo, Jia; Xu, Yanhong; Jin, Shangbin; Chen, Long; Kaji, Toshihiko; Honsho, Yoshihito; Addicoat, Matthew A.; Kim, Jangbae; Saeki, Akinori; Ihee, Hyotcherl; Seki, Shu; Irle, Stephan; Hiramoto, Masahiro; Gao, Jia; Jiang, Donglin

    2013-01-01

    Covalent organic frameworks are a class of crystalline organic porous materials that can utilize π–π-stacking interactions as a driving force for the crystallization of polygonal sheets to form layered frameworks and ordered pores. However, typical examples are chemically unstable and lack intrasheet π-conjugation, thereby significantly limiting their applications. Here we report a chemically stable, electronically conjugated organic framework with topologically designed wire frameworks and open nanochannels, in which the π conjugation spans the two-dimensional sheets. Our framework permits inborn periodic ordering of conjugated chains in all three dimensions and exhibits a striking combination of properties: chemical stability, extended π-delocalization, ability to host guest molecules and hole mobility. We show that the π-conjugated organic framework is useful for high on-off ratio photoswitches and photovoltaic cells. Therefore, this strategy may constitute a step towards realizing ordered semiconducting porous materials for innovations based on two-dimensionally extended π systems. PMID:24220603

  13. A framework for correcting brain retraction based on an eXtended Finite Element Method using a laser range scanner.

    PubMed

    Li, Ping; Wang, Weiwei; Song, Zhijian; An, Yong; Zhang, Chenxi

    2014-07-01

    Brain retraction causes great distortion that limits the accuracy of an image-guided neurosurgery system that uses preoperative images. Therefore, brain retraction correction is an important intraoperative clinical application. We used a linear elastic biomechanical model, which deforms based on the eXtended Finite Element Method (XFEM) within a framework for brain retraction correction. In particular, a laser range scanner was introduced to obtain a surface point cloud of the exposed surgical field including retractors inserted into the brain. A brain retraction surface tracking algorithm converted these point clouds into boundary conditions applied to XFEM modeling that drive brain deformation. To test the framework, we performed a brain phantom experiment involving the retraction of tissue. Pairs of the modified Hausdorff distance between Canny edges extracted from model-updated images, pre-retraction, and post-retraction CT images were compared to evaluate the morphological alignment of our framework. Furthermore, the measured displacements of beads embedded in the brain phantom and the predicted ones were compared to evaluate numerical performance. The modified Hausdorff distance of 19 pairs of images decreased from 1.10 to 0.76 mm. The forecast error of 23 stainless steel beads in the phantom was between 0 and 1.73 mm (mean 1.19 mm). The correction accuracy varied between 52.8 and 100 % (mean 81.4 %). The results demonstrate that the brain retraction compensation can be incorporated intraoperatively into the model-updating process in image-guided neurosurgery systems.
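
    A minimal sketch of the modified Hausdorff distance used above as the morphological alignment measure, following the common Dubuisson-Jain definition (the larger of the two directed mean nearest-neighbour distances); the edge coordinates below are hypothetical.

        import numpy as np
        from scipy.spatial.distance import cdist

        def modified_hausdorff(a, b):
            """Modified Hausdorff distance (Dubuisson & Jain, 1994) between point sets a and b."""
            d = cdist(a, b)                        # pairwise Euclidean distances
            d_ab = d.min(axis=1).mean()            # mean nearest-neighbour distance, a -> b
            d_ba = d.min(axis=0).mean()            # mean nearest-neighbour distance, b -> a
            return max(d_ab, d_ba)

        # Hypothetical Canny-edge coordinates (pixels) from two images.
        edges_model = np.array([[10.0, 12.0], [11.0, 13.0], [30.0, 40.0]])
        edges_ct    = np.array([[10.5, 12.2], [11.2, 13.1], [29.0, 41.0]])
        print(modified_hausdorff(edges_model, edges_ct))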

  14. Developing a Hadoop-based Middleware for Handling Multi-dimensional NetCDF

    NASA Astrophysics Data System (ADS)

    Li, Z.; Yang, C. P.; Schnase, J. L.; Duffy, D.; Lee, T. J.

    2014-12-01

    Climate observations and model simulations are collecting and generating vast amounts of climate data, and these data are ever-increasing and accumulating at a rapid pace. Effectively managing and analyzing these data is essential for climate change studies. Hadoop, a distributed storage and processing framework for large data sets, has attracted increasing attention in dealing with the Big Data challenge. The maturity of Infrastructure as a Service (IaaS) cloud computing further accelerates the adoption of Hadoop in solving Big Data problems. However, Hadoop is designed to process unstructured data such as texts, documents and web pages, and cannot effectively handle scientific data formats such as array-based NetCDF files and other binary formats. In this paper, we propose to build a Hadoop-based middleware for transparently handling big NetCDF data by 1) designing a distributed climate data storage mechanism based on a POSIX-enabled parallel file system to enable parallel big data processing with MapReduce, as well as to support data access by other systems; 2) modifying the Hadoop framework to transparently process NetCDF data in parallel without sequencing or converting the data into other file formats, or loading them to HDFS; and 3) seamlessly integrating Hadoop, cloud computing and climate data in a highly scalable and fault-tolerant framework.
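
    As a hedged illustration of the kind of array-aware decomposition such a middleware performs (this is not the authors' implementation), the sketch below splits a NetCDF variable along its time dimension, reduces each chunk independently in a map step, and combines the partial results in a reduce step; the file name and variable name are hypothetical.

        import numpy as np
        from netCDF4 import Dataset   # assumes the netCDF4 library is installed

        def map_chunk(values):
            """'Map' step: reduce one chunk of the array to a partial sum and count."""
            return float(np.sum(values)), values.size

        def reduce_partials(partials):
            """'Reduce' step: combine partial sums and counts into a global mean."""
            total = sum(s for s, _ in partials)
            count = sum(n for _, n in partials)
            return total / count

        # Hypothetical climate file with a variable 'tas' shaped (time, lat, lon).
        with Dataset("climate_sample.nc") as nc:
            var = nc.variables["tas"]
            n_time = var.shape[0]
            chunk = 10                                      # time steps per map task
            partials = [map_chunk(var[t:t + chunk, :, :])   # each slice could be one Hadoop task
                        for t in range(0, n_time, chunk)]

        print("Global mean:", reduce_partials(partials))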

  15. Social media and nursing practice: changing the balance between the social and technical aspects of work.

    PubMed

    Casella, Evan; Mills, Jane; Usher, Kim

    2014-01-01

    Modern communication methods are drastically changing the way people interact with each other. Professions such as nursing need to evolve to remain relevant as social infrastructure changes. In the 1960s, researchers developed a sociotechnical theory that stated workers were more motivated and productive if there was a good balance between the social and technical aspects of their work. Today's technology is blurring the boundaries between the social and the technical thereby transforming human contact and communication into a multi-method process. In Australia, people are adept at utilising social media technology to become more efficient, creative and connected; Australian nurses also need to embrace changing technology to capitalise on the professional opportunities offered by social media. This paper imagines a world where nurses integrate social media into assessing, diagnosing, planning, implementing and evaluating care. Discussion draws on a combination of real-world examples of best-practice and blue-sky thinking to demonstrate that evidence-based care must be combined with the adoption of future-forward technology.

  16. A Real-Time High Performance Computation Architecture for Multiple Moving Target Tracking Based on Wide-Area Motion Imagery via Cloud and Graphic Processing Units

    PubMed Central

    Liu, Kui; Wei, Sixiao; Chen, Zhijiang; Jia, Bin; Chen, Genshe; Ling, Haibin; Sheaff, Carolyn; Blasch, Erik

    2017-01-01

    This paper presents the first attempt at combining Cloud with Graphic Processing Units (GPUs) in a complementary manner within the framework of a real-time high performance computation architecture for the application of detecting and tracking multiple moving targets based on Wide Area Motion Imagery (WAMI). More specifically, the GPU and Cloud Moving Target Tracking (GC-MTT) system applied a front-end web based server to perform the interaction with Hadoop and highly parallelized computation functions based on the Compute Unified Device Architecture (CUDA©). The introduced multiple moving target detection and tracking method can be extended to other applications such as pedestrian tracking, group tracking, and Patterns of Life (PoL) analysis. The cloud and GPUs based computing provides an efficient real-time target recognition and tracking approach as compared to methods when the work flow is applied using only central processing units (CPUs). The simultaneous tracking and recognition results demonstrate that a GC-MTT based approach provides drastically improved tracking with low frame rates over realistic conditions. PMID:28208684

  17. Improving microphysics in a convective parameterization: possibilities and limitations

    NASA Astrophysics Data System (ADS)

    Labbouz, Laurent; Heikenfeld, Max; Stier, Philip; Morrison, Hugh; Milbrandt, Jason; Protat, Alain; Kipling, Zak

    2017-04-01

    The convective cloud field model (CCFM) is a convective parameterization implemented in the climate model ECHAM6.1-HAM2.2. It represents a population of clouds within each ECHAM-HAM model column, simulating up to 10 different convective cloud types with individual radius, vertical velocities and microphysical properties. Comparisons between CCFM and radar data at Darwin, Australia, show that in order to reproduce both the convective cloud top height distribution and the vertical velocity profile, the effect of aerodynamic drag on the rising parcel has to be considered, along with a reduced entrainment parameter. A new double-moment microphysics (the Predicted Particle Properties scheme, P3) has been implemented in the latest version of CCFM and is compared to the standard single-moment microphysics and the radar retrievals at Darwin. The microphysical process rates (autoconversion, accretion, deposition, freezing, …) and their response to changes in CDNC are investigated and compared to high resolution CRM WRF simulations over the Amazon region. The results shed light on the possibilities and limitations of microphysics improvements in the framework of CCFM and in convective parameterizations in general.

  18. A Real-Time High Performance Computation Architecture for Multiple Moving Target Tracking Based on Wide-Area Motion Imagery via Cloud and Graphic Processing Units.

    PubMed

    Liu, Kui; Wei, Sixiao; Chen, Zhijiang; Jia, Bin; Chen, Genshe; Ling, Haibin; Sheaff, Carolyn; Blasch, Erik

    2017-02-12

    This paper presents the first attempt at combining Cloud with Graphic Processing Units (GPUs) in a complementary manner within the framework of a real-time high performance computation architecture for the application of detecting and tracking multiple moving targets based on Wide Area Motion Imagery (WAMI). More specifically, the GPU and Cloud Moving Target Tracking (GC-MTT) system applied a front-end web based server to perform the interaction with Hadoop and highly parallelized computation functions based on the Compute Unified Device Architecture (CUDA©). The introduced multiple moving target detection and tracking method can be extended to other applications such as pedestrian tracking, group tracking, and Patterns of Life (PoL) analysis. The cloud and GPUs based computing provides an efficient real-time target recognition and tracking approach as compared to methods when the work flow is applied using only central processing units (CPUs). The simultaneous tracking and recognition results demonstrate that a GC-MTT based approach provides drastically improved tracking with low frame rates over realistic conditions.

  19. Sloped Terrain Segmentation for Autonomous Drive Using Sparse 3D Point Cloud

    PubMed Central

    Cho, Seoungjae; Kim, Jonghyun; Ikram, Warda; Cho, Kyungeun; Sim, Sungdae

    2014-01-01

    A ubiquitous environment for road travel that uses wireless networks requires the minimization of data exchange between vehicles. An algorithm that can segment the ground in real time is necessary to obtain location data between vehicles simultaneously executing autonomous drive. This paper proposes a framework for segmenting the ground in real time using a sparse three-dimensional (3D) point cloud acquired from undulating terrain. A sparse 3D point cloud can be acquired by scanning the geography using light detection and ranging (LiDAR) sensors. For efficient ground segmentation, 3D point clouds are quantized in units of volume pixels (voxels) and overlapping data is eliminated. We reduce nonoverlapping voxels to two dimensions by implementing a lowermost heightmap. The ground area is determined on the basis of the number of voxels in each voxel group. We execute ground segmentation in real time by proposing an approach to minimize the comparison between neighboring voxels. Furthermore, we experimentally verify that ground segmentation can be executed at about 19.31 ms per frame. PMID:25093204
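
    A simplified, hypothetical sketch of the voxel quantization and lowermost-heightmap idea described above: points are binned into a horizontal grid, each cell keeps its lowest height, and points close to that lowest height are labelled as ground. The cell size and height tolerance are illustrative assumptions, not the parameters used in the paper.

        import numpy as np

        def segment_ground(points, cell=0.5, height_tol=0.2):
            """Label points as ground (True) using a lowermost heightmap per grid cell.

            points: (N, 3) array of x, y, z from a sparse LiDAR scan.
            cell: horizontal cell size in metres; height_tol: max height above the cell minimum.
            """
            ij = np.floor(points[:, :2] / cell).astype(np.int64)     # 2-D cell index per point
            keys = ij[:, 0] * 1_000_003 + ij[:, 1]                   # combine indices into one key
            order = np.argsort(keys)
            sorted_keys, sorted_z = keys[order], points[order, 2]
            # Lowest z per cell (the lowermost heightmap).
            cell_ids, first = np.unique(sorted_keys, return_index=True)
            lowest = np.minimum.reduceat(sorted_z, first)
            lowest_per_point = lowest[np.searchsorted(cell_ids, keys)]
            return points[:, 2] <= lowest_per_point + height_tol

        # Hypothetical scene: a flat ground plane plus a box-shaped obstacle.
        rng = np.random.default_rng(2)
        ground = np.column_stack([rng.uniform(0, 20, 2000), rng.uniform(0, 20, 2000),
                                  rng.normal(0.0, 0.02, 2000)])
        box = np.column_stack([rng.uniform(5, 7, 300), rng.uniform(5, 7, 300),
                               rng.uniform(0.5, 2.0, 300)])
        pts = np.vstack([ground, box])
        print("Ground points labelled:", segment_ground(pts).sum(), "of", len(pts))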

  20. Science in the cloud (SIC): A use case in MRI connectomics

    PubMed Central

    Gorgolewski, Krzysztof J.; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A.; Wiener, Martin; Vogelstein, R. Jacob; Burns, Randal

    2017-01-01

    Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, the lack of standardized sharing mechanisms and practices often makes reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called ‘science in the cloud’ (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. PMID:28327935

  1. Positive Low Cloud and Dust Feedbacks Amplify Tropical North Atlantic Multidecadal Variability

    NASA Technical Reports Server (NTRS)

    Yuan, Tianle; Oreopoulos, Lazaros; Zelinka, Mark; Yu, Hongbin; Norris, Joel R.; Chin, Mian; Platnick, Steven; Meyer, Kerry

    2016-01-01

    The Atlantic Multidecadal Oscillation (AMO) is characterized by a horseshoe pattern of sea surface temperature (SST) anomalies and has a wide range of climatic impacts. While the tropical arm of AMO is responsible for many of these impacts, it is either too weak or completely absent in many climate model simulations. Here we show, using both observational and model evidence, that the radiative effect of positive low cloud and dust feedbacks is strong enough to generate the tropical arm of AMO, with the low cloud feedback more dominant. The feedbacks can be understood in a consistent dynamical framework: weakened tropical trade wind speed in response to a warm middle latitude SST anomaly reduces dust loading and low cloud fraction over the tropical Atlantic, which warms the tropical North Atlantic SST. Together they contribute to appearance of the tropical arm of AMO. Most current climate models miss both the critical wind speed response and two positive feedbacks though realistic simulations of them may be essential for many climatic studies related to the AMO.

  2. Positive low cloud and dust feedbacks amplify tropical North Atlantic Multidecadal Oscillation

    DOE PAGES

    Yuan, Tianle; Oreopoulos, Lazaros; Zelinka, Mark; ...

    2016-02-04

    The Atlantic Multidecadal Oscillation (AMO) is characterized by a horseshoe pattern of sea surface temperature (SST) anomalies and has a wide range of climatic impacts. While the tropical arm of AMO is responsible for many of these impacts, it is either too weak or completely absent in many climate model simulations. Here we show, using both observational and model evidence, that the radiative effect of positive low cloud and dust feedbacks is strong enough to generate the tropical arm of AMO, with the low cloud feedback more dominant. The feedbacks can be understood in a consistent dynamical framework: weakened tropical trade wind speed in response to a warm middle latitude SST anomaly reduces dust loading and low cloud fraction over the tropical Atlantic, which warms the tropical North Atlantic SST. Together they contribute to the appearance of the tropical arm of AMO. Most current climate models miss both the critical wind speed response and two positive feedbacks though realistic simulations of them may be essential for many climatic studies related to the AMO.

  3. Microphysical variability of Amazonian deep convective cores observed by CloudSat and simulated by a multi-scale modeling framework

    NASA Astrophysics Data System (ADS)

    Brant Dodson, J.; Taylor, Patrick C.; Branson, Mark

    2018-05-01

    Recently launched cloud observing satellites provide information about the vertical structure of deep convection and its microphysical characteristics. In this study, CloudSat reflectivity data is stratified by cloud type, and the contoured frequency by altitude diagrams reveal a double-arc structure in deep convective cores (DCCs) above 8 km. This suggests two distinct hydrometeor modes (snow versus hail/graupel) controlling variability in reflectivity profiles. The day-night contrast in the double arcs is about four times larger than the wet-dry season contrast. Using QuickBeam, the vertical reflectivity structure of DCCs is analyzed in two versions of the Superparameterized Community Atmospheric Model (SP-CAM) with single-moment (no graupel) and double-moment (with graupel) microphysics. Double-moment microphysics shows better agreement with observed reflectivity profiles; however, neither model variant captures the double-arc structure. Ultimately, the results show that simulating realistic DCC vertical structure and its variability requires accurate representation of ice microphysics, in particular the hail/graupel modes, though this alone is insufficient.

  4. Digital Investigations of AN Archaeological Smart Point Cloud: a Real Time Web-Based Platform to Manage the Visualisation of Semantical Queries

    NASA Astrophysics Data System (ADS)

    Poux, F.; Neuville, R.; Hallot, P.; Van Wersch, L.; Luczfalvy Jancsó, A.; Billen, R.

    2017-05-01

    While virtual copies of the real world are being created faster than ever through point clouds and their derivatives, making them workable for all professionals demands adapted tools to facilitate knowledge dissemination. Digital investigations are changing the way cultural heritage researchers, archaeologists, and curators work and collaborate to progressively aggregate expertise through one common platform. In this paper, we present a web application in a WebGL framework accessible on any HTML5-compatible browser. It allows real-time point cloud exploration of the mosaics in the Oratory of Germigny-des-Prés, and emphasises ease of use as well as performance. Our reasoning engine is constructed over a semantically rich point cloud data structure, where metadata has been injected a priori. We developed a tool that directly allows semantic extraction and visualisation of pertinent information for the end users. It leads to efficient communication between actors by proposing optimal 3D viewpoints as a basis on which interactions can grow.

  5. A sustainability model based on cloud infrastructures for core and downstream Copernicus services

    NASA Astrophysics Data System (ADS)

    Manunta, Michele; Calò, Fabiana; De Luca, Claudio; Elefante, Stefano; Farres, Jordi; Guzzetti, Fausto; Imperatore, Pasquale; Lanari, Riccardo; Lengert, Wolfgang; Zinno, Ivana; Casu, Francesco

    2014-05-01

    The incoming Sentinel missions have been designed to be the first remote sensing satellite system devoted to operational services. In particular, the Synthetic Aperture Radar (SAR) Sentinel-1 sensor, dedicated to acquiring globally over land in the interferometric mode, guarantees an unprecedented capability to investigate and monitor the Earth surface deformations related to natural and man-made hazards. Thanks to the global coverage strategy and 12-day revisit time, jointly with the free and open access data policy, such a system will allow an extensive application of Differential Interferometric SAR (DInSAR) techniques. In such a framework, the European Commission has been funding several projects through the GMES and Copernicus programs, aimed at preparing the user community for the operational and extensive use of Sentinel-1 products for risk mitigation and management purposes. Among them, FP7-DORIS, an advanced GMES downstream service coordinated by the Italian National Research Council (CNR), is based on the full exploitation of advanced DInSAR products in landslide and subsidence contexts. In particular, the DORIS project (www.doris-project.eu) has developed innovative scientific techniques and methodologies to support Civil Protection Authorities (CPA) during the pre-event, event, and post-event phases of the risk management cycle. Nonetheless, the huge data stream expected from the Sentinel-1 satellite may jeopardize the effective use of such data in emergency response and security scenarios. This potential bottleneck can be properly overcome through the development of modern infrastructures, able to efficiently provide computing resources as well as advanced services for big data management, processing and dissemination. In this framework, CNR and ESA have established a cooperation to foster the use of GRID and cloud computing platforms for remote sensing data processing, and to make available to a large audience advanced and innovative tools for DInSAR product generation and exploitation. In particular, CNR is porting the multi-temporal DInSAR technique referred to as Small Baseline Subset (SBAS) into the ESA G-POD (Grid Processing On Demand) and CIOP (Cloud Computing Operational Pilot) platforms (Elefante et al., 2013) within the SuperSites Exploitation Platform (SSEP) project, whose aim is to contribute to the development of an ecosystem for big geo-data processing and dissemination. This work focuses on presenting the main results achieved by the DORIS project concerning the use of advanced DInSAR products for supporting CPA during the risk management cycle. Furthermore, based on the DORIS experience, a sustainability model for Core and Downstream Copernicus services based on the effective exploitation of cloud platforms is proposed. In this framework, the remote sensing community, both service providers and users, can significantly benefit from the Helix Nebula - The Science Cloud initiative, created by European scientific institutions, agencies, SMEs and enterprises to pave the way for the development and exploitation of a cloud computing infrastructure for science. REFERENCES: Elefante, S., Imperatore, P., Zinno, I., Manunta, M., Mathot, E., Brito, F., Farres, J., Lengert, W., Lanari, R., Casu, F., 2013. "SBAS-DINSAR time series generation on cloud computing platforms." IEEE IGARSS Conference, Melbourne (AU), July 2013.

  6. DOE ASR Final Report on “Use of ARM Observations to Investigate the Role of Tropical Radiative Processes and Cloud Radiative Effects in Climate Simulations”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Qiang; Comstock, Jennifer

    The overall objective of this ASR-funded project is to investigate the role of cloud radiative effects, especially those associated with tropical thin cirrus clouds in the tropical tropopause layer, by analyzing the ARM observations combined with numerical models. In particular, we have processed and analyzed the observations from the Raman lidar at the ARM SGP and TWP sites. Over the course of the project (8/15/2013–8/14/2016, with a no-cost extension to 8/14/2017), we have been concentrating on (i) developing an automated feature detection scheme of clouds and aerosols for the ARM Raman lidar; (ii) developing an automated retrieval of cloud and aerosol extinctions for the ARM Raman lidar; (iii) investigating, based on the observations, cloud radiative effects on the simulated temperatures in the tropical tropopause layer using a radiative-convective model; and (iv) examining the effect of changes of atmospheric composition on the tropical lower-stratospheric temperatures. In addition, we have examined the biases in the CALIPSO-inferred aerosol direct radiative effects using ground-based Raman lidars at the ARM SGP and TWP sites, and estimated the impact of lidar detection sensitivity on assessing global aerosol direct radiative effects. We have also investigated the diurnal cycle of clouds and precipitation at the ARM site using the cloud radar observations along with simulations from the multiscale modeling framework. The main results of our research efforts are reported in the six refereed journal publications that acknowledge the DOE Grant DE-SC0010557.

  7. Microphysics in the Multi-Scale Modeling Systems with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chern, J.; Lang, S.; Matsui, T.; Shen, B.; Zeng, X.; Shi, R.

    2011-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud resolving models now provide statistical information useful for developing more realistic, physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE), (2) a regional-scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and shortwave radiative transfer, land processes, and the explicit cloud-radiation and cloud-surface interactive processes are applied throughout this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator that uses NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, the microphysics developments of the multi-scale modeling system will be presented. In particular, results from using the multi-scale modeling system to study heavy precipitation processes will be presented.

  8. The radiative versus entraining effects of overlying humidity on the Lagrangian evolution of subtropical stratocumulus

    NASA Astrophysics Data System (ADS)

    Eastman, R. M.; Wood, R.

    2017-12-01

    This study observes the 24-hour Lagrangian evolution of stratocumulus cloud amount and PBL depth in four eastern subtropical ocean basins: the NE Pacific, SE Pacific, SE Atlantic, and E Indian. Nearly 170,000 trajectories are computed using the 2-D wind field at 925mb and cloud properties are sampled along these trajectories twice daily as the A-Train satellite constellation passes overhead. Concurrent measurements of the overlying humidity and temperature profiles are interpolated from the ERA-Interim reanalysis grids. Cloud properties are sampled by MODIS and a measure of planetary boundary layer (PBL) depth is calculated using MODIS cloud top temperatures, CALIPSO lidar observations of cloud top heights, and ERA-Interim sea surface temperatures. High humidity overlying the PBL can reduce cloud top cooling by counteracting radiative cooling and by reducing evaporation within the entrainment zone. Both of these effects can slow the entrainment rate and change cloud evolution. To discern which effect is more important the humidity profile is broken into two distinct components: the specific humidity directly above the inversion, which is entraining into the boundary layer, and the column of specific humidity above that layer, which is radiatively interacting with the PBL, but not directly entraining. These two measures of humidity are compared in the Lagrangian framework. Results suggest that humidity above the PBL has a stronger effect on the Lagrangian PBL deepening rate compared to lower tropospheric stability. A comparison of PBL deepening rates driven by the entraining humidity versus the radiating humidity shows that the radiative effects of overlying humidity are dominant with respect to entrainment. However, the entraining effects of humidity are more important in prolonging cloud lifetime.

  9. A new approach to modeling aerosol effects on East Asian climate: Parametric uncertainties associated with emissions, cloud microphysics, and their interactions: AEROSOL EFFECTS ON EAST ASIAN CLIMATE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Huiping; Qian, Yun; Zhao, Chun

    2015-09-09

    In this study, we adopt a parametric sensitivity analysis framework that integrates the quasi-Monte Carlo parameter sampling approach and a surrogate model to examine aerosol effects on the East Asian Monsoon climate simulated in the Community Atmosphere Model (CAM5). A total of 256 CAM5 simulations are conducted to quantify the model responses to the uncertain parameters associated with cloud microphysics parameterizations and aerosol (e.g., sulfate, black carbon (BC), and dust) emission factors and their interactions. Results show that the interaction terms among parameters are important for quantifying the sensitivity of fields of interest, especially precipitation, to the parameters. The relative importance of cloud-microphysics parameters and emission factors (strength) depends on the evaluation metrics or the model fields we focused on, and the presence of uncertainty in cloud microphysics imposes an additional challenge in quantifying the impact of aerosols on cloud and climate. Due to their different optical and microphysical properties and spatial distributions, sulfate, BC, and dust aerosols have very different impacts on the East Asian Monsoon through aerosol-cloud-radiation interactions. The climatic effects of aerosol do not always have a monotonic response to the change of emission factors. The spatial patterns of both sign and magnitude of aerosol-induced changes in radiative fluxes, cloud, and precipitation could be different, depending on the aerosol types, when parameters are sampled in different ranges of values. We also identify the different cloud microphysical parameters that show the most significant impact on the climatic effect induced by sulfate, BC and dust, respectively, in East Asia.

  10. A framework for expanding aqueous chemistry in the Community Multiscale Air Quality (CMAQ) model version 5.1

    NASA Astrophysics Data System (ADS)

    Fahey, Kathleen M.; Carlton, Annmarie G.; Pye, Havala O. T.; Baek, Jaemeen; Hutzell, William T.; Stanier, Charles O.; Baker, Kirk R.; Wyat Appel, K.; Jaoui, Mohammed; Offenberg, John H.

    2017-04-01

    This paper describes the development and implementation of an extendable aqueous-phase chemistry option (AQCHEM - KMT(I)) for the Community Multiscale Air Quality (CMAQ) modeling system, version 5.1. Here, the Kinetic PreProcessor (KPP), version 2.2.3, is used to generate a Rosenbrock solver (Rodas3) to integrate the stiff system of ordinary differential equations (ODEs) that describe the mass transfer, chemical kinetics, and scavenging processes of CMAQ clouds. CMAQ's standard cloud chemistry module (AQCHEM) is structurally limited to the treatment of a simple chemical mechanism. This work advances our ability to test and implement more sophisticated aqueous chemical mechanisms in CMAQ and further investigate the impacts of microphysical parameters on cloud chemistry. Box model cloud chemistry simulations were performed to choose efficient solver and tolerance settings, evaluate the implementation of the KPP solver, and assess the direct impacts of alternative solver and kinetic mass transfer on predicted concentrations for a range of scenarios. Month-long CMAQ simulations for winter and summer periods over the US reveal the changes in model predictions due to these cloud module updates within the full chemical transport model. While monthly average CMAQ predictions are not drastically altered between AQCHEM and AQCHEM - KMT, hourly concentration differences can be significant. With added in-cloud secondary organic aerosol (SOA) formation from biogenic epoxides (AQCHEM - KMTI), normalized mean error and bias statistics are slightly improved for 2-methyltetrols and 2-methylglyceric acid at the Research Triangle Park measurement site in North Carolina during the Southern Oxidant and Aerosol Study (SOAS) period. The added in-cloud chemistry leads to a monthly average increase of 11-18 % in cloud SOA at the surface in the eastern United States for June 2013.
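
    The core numerical task described above is integrating a stiff system of chemical ODEs with an implicit solver. The Python sketch below uses SciPy's Radau method as a stand-in for the KPP-generated Rodas3 Rosenbrock solver (SciPy does not provide Rodas3), applied to a toy two-step aqueous-chemistry system with assumed rate constants; it illustrates the approach only and is not CMAQ code.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Toy stiff system: fast aqueous uptake of species A into droplets (A -> Aaq)
        # followed by slow in-cloud oxidation (Aaq -> P). Rate constants are assumed.
        K_UPTAKE = 1.0e3   # s^-1, fast mass transfer into cloud water
        K_OXID   = 1.0e-2  # s^-1, slow aqueous-phase reaction

        def tendencies(t, y):
            a_gas, a_aq, prod = y
            uptake = K_UPTAKE * a_gas
            oxid = K_OXID * a_aq
            return [-uptake, uptake - oxid, oxid]

        y0 = [1.0, 0.0, 0.0]                                  # initial mixing ratios
        sol = solve_ivp(tendencies, (0.0, 3600.0), y0,        # one hour of cloud contact
                        method="Radau", rtol=1e-6, atol=1e-12)
        print("Final [gas, aqueous, product]:", sol.y[:, -1])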

  11. Earthquake prediction in California using regression algorithms and cloud-based big data infrastructure

    NASA Astrophysics Data System (ADS)

    Asencio-Cortés, G.; Morales-Esteban, A.; Shang, X.; Martínez-Álvarez, F.

    2018-06-01

    Earthquake magnitude prediction is a challenging problem that has been widely studied during the last decades. Statistical, geophysical and machine learning approaches can be found in the literature, but none has yielded particularly satisfactory results. In recent years, powerful computational techniques for analyzing big data have emerged, making the analysis of massive datasets possible. These new methods make use of physical resources such as cloud-based architectures. California is known as one of the regions with the highest seismic activity in the world, and abundant data are available. In this work, the use of several regression algorithms combined with ensemble learning is explored in a big-data context (a 1 GB catalog is used) in order to predict earthquake magnitudes within the next seven days. The Apache Spark framework, the H2O library in the R language, and Amazon cloud infrastructure were used, and very promising results are reported.
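
    The combination of regression algorithms with ensemble learning mentioned above can be sketched in a few lines. The example below stacks two regressors with scikit-learn on synthetic data purely as an illustration of the idea; the actual study used the Apache Spark framework and the H2O library on a California earthquake catalog.

```python
# Illustrative sketch of ensemble regression for magnitude prediction.
# Features and targets are synthetic; the real study used Spark/H2O on a
# seismic catalog, not scikit-learn on random numbers.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 8))          # hypothetical seismicity indicators
y = 0.4 * X[:, 0] + 0.1 * X[:, 1] ** 2 + rng.normal(scale=0.2, size=5000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

ensemble = StackingRegressor(
    estimators=[("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
                ("ridge", Ridge(alpha=1.0))],
    final_estimator=Ridge(alpha=1.0),
)
ensemble.fit(X_tr, y_tr)
print("MAE:", mean_absolute_error(y_te, ensemble.predict(X_te)))
```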

  12. Optimally analyzing and implementing of bolt fittings in steel structure based on ANSYS

    NASA Astrophysics Data System (ADS)

    Han, Na; Song, Shuangyang; Cui, Yan; Wu, Yongchun

    2018-03-01

    ANSYS simulation software has become an outstanding member of the Computer-Aided Engineering (CAE) family owing to its excellent performance; it is committed to innovation in engineering simulation and helps users shorten the design process. First, a typical procedure for implementing CAE was designed, and a framework for structural numerical analysis based on ANSYS technology was proposed. Then, an optimized analysis of bolt fittings in a beam-column joint of a steel structure was implemented in ANSYS, displaying contour (cloud) charts of the XY shear stress, the YZ shear stress, and the Y component of stress. Finally, the ANSYS simulation results were compared with experimentally measured results. The ANSYS simulation and analysis proved reliable, efficient, and optimal. Through this process, a numerical model for simulating and analyzing structural performance was developed for use by engineering enterprises in practice.

  13. A Model for Particle Microphysics,Turbulent Mixing, and Radiative Transfer in the Stratocumulus-Topped Marine Boundary Layer and Comparisons with Measurements

    NASA Technical Reports Server (NTRS)

    Ackerman, Andrew S.; Toon, Owen B.; Hobbs, Peter V.

    1995-01-01

    A detailed 1D model of the stratocumulus-topped marine boundary layer is described. The model has three coupled components: a microphysics module that resolves the size distributions of aerosols and cloud droplets, a turbulence module that treats vertical mixing between layers, and a multiple wavelength radiative transfer module that calculates radiative heating rates and cloud optical properties. The results of a 12-h model simulation reproduce reasonably well the bulk thermodynamics, microphysical properties, and radiative fluxes measured in an approx. 500-m thick, summertime marine stratocumulus cloud layer by Nicholls. However, in this case, the model predictions of turbulent fluxes between the cloud and subcloud layers exceed the measurements. Results of model simulations are also compared to measurements of a marine stratus layer made under gale conditions and with measurements of a high, thin marine stratocumulus layer. The variations in cloud properties are generally reproduced by the model, although it underpredicts the entrainment of overlying air at cloud top under gale conditions. Sensitivities of the model results are explored. The vertical profile of cloud droplet concentration is sensitive to the lower size cutoff of the droplet size distribution due to the presence of unactivated haze particles in the lower region of the modeled cloud. Increases in total droplet concentrations do not always produce less drizzle and more cloud water in the model. The radius of the mean droplet volume does not correlate consistently with drizzle, but the effective droplet radius does. The greatest impacts on cloud properties predicted by the model are produced by halving the width of the size distribution of input condensation nuclei and by omitting the effect of cloud-top radiative cooling on the condensational growth of cloud droplets. The omission of infrared scattering produces noticeable changes in cloud properties. The collection efficiencies for droplets less than 30-micron radius, and the value of the accommodation coefficient for condensational droplet growth, have noticeable effects on cloud properties. The divergence of the horizontal wind also has a significant effect on a 12-h model simulation of cloud structure. Conclusions drawn from the model are tentative because of the limitations of the 1D model framework. A principal simplification is that the model assumes horizontal homogeneity, and, therefore, does not resolve updrafts and downdrafts. Likely consequences of this simplification include overprediction of the growth of droplets by condensation in the upper region of the cloud, underprediction of droplet condensational growth in the lower region of the cloud, and underprediction of peak supersaturations.

  14. A Model for Particle Microphysics, Turbulent Mixing, and Radiative Transfer in the Stratocumulus-Topped Marine Boundary Layer and Comparisons with Measurements

    NASA Technical Reports Server (NTRS)

    Ackerman, Andrew S.; Toon, Owen B.; Hobbs, Peter V.

    1995-01-01

    A detailed 1D model of the stratocumulus-topped marine boundary layer is described. The model has three coupled components: a microphysics module that resolves the size distributions of aerosols and cloud droplets, a turbulence module that treats vertical mixing between layers, and a multiple wavelength radiative transfer module that calculates radiative heating rates and cloud optical properties. The results of a 12-h model simulation reproduce reasonably well the bulk thermodynamics, microphysical properties, and radiative fluxes measured in an approx. 500-m thick, summertime marine stratocumulus cloud layer by Nicholls. However, in this case, the model predictions of turbulent fluxes between the cloud and subcloud layers exceed the measurements. Results of model simulations are also compared to measurements of a marine stratus layer made under gale conditions and with measurements of a high, thin marine stratocumulus layer. The variations in cloud properties are generally reproduced by the model, although it underpredicts the entrainment of overlying air at cloud top under gale conditions. Sensitivities of the model results are explored. The vertical profile of cloud droplet concentration is sensitive to the lower size cutoff of the droplet size distribution due to the presence of unactivated haze particles in the lower region of the modeled cloud. Increases in total droplet concentrations do not always produce less drizzle and more cloud water in the model. The radius of the mean droplet volume does not correlate consistently with drizzle, but the effective droplet radius does. The greatest impacts on cloud properties predicted by the model are produced by halving the width of the size distribution of input condensation nuclei and by omitting the effect of cloud-top radiative cooling on the condensational growth of cloud droplets. The omission of infrared scattering produces noticeable changes in cloud properties. The collection efficiencies for droplets less than 30-micrometers radius, and the value of the accommodation coefficient for condensational droplet growth, have noticeable effects on cloud properties. The divergence of the horizontal wind also has a significant effect on a 12-h model simulation of cloud structure. Conclusions drawn from the model are tentative because of the limitations of the 1D model framework. A principal simplification is that the model assumes horizontal homogeneity, and, therefore, does not resolve updrafts and downdrafts. Likely consequences of this simplification include overprediction of the growth of droplets by condensation in the upper region of the cloud, underprediction of droplet condensational growth in the lower region of the cloud, and underprediction of peak supersaturations.

  15. Explicit prediction of ice clouds in general circulation models

    NASA Astrophysics Data System (ADS)

    Kohler, Martin

    1999-11-01

    Although clouds play extremely important roles in the radiation budget and hydrological cycle of the Earth, there are large quantitative uncertainties in our understanding of their generation, maintenance and decay mechanisms, representing major obstacles in the development of reliable prognostic cloud water schemes for General Circulation Models (GCMs). Recognizing their relative neglect in the past, both observationally and theoretically, this work places special focus on ice clouds. A recent version of the UCLA - University of Utah Cloud Resolving Model (CRM) that includes interactive radiation is used to perform idealized experiments to study ice cloud maintenance and decay mechanisms under various conditions in terms of: (1) background static stability, (2) background relative humidity, (3) rate of cloud ice addition over a fixed initial time-period and (4) radiation: daytime, nighttime and no-radiation. Radiation is found to have major effects on the lifetime of layer clouds. Optically thick ice clouds decay significantly slower than expected from pure microphysical crystal fall-out (τ_cld = 0.9-1.4 h as opposed to no-motion τ_micro = 0.5-0.7 h). This is explained by the upward turbulent fluxes of water induced by IR destabilization, which partially balance the downward transport of water by snowfall. Solar radiation further slows the ice-water decay by destroying the inversion above cloud top, with the resulting upward transport of water. Optically thin ice clouds, on the other hand, may exhibit even longer lifetimes (>1 day) in the presence of radiative cooling. The resulting reduction in the saturation mixing ratio provides a constant source of cloud ice. These CRM results are used to develop a prognostic cloud water scheme for the UCLA-GCM. The framework is based on the bulk water phase model of Ose (1993). The model predicts cloud liquid water and cloud ice separately, and is extended to split the ice phase into suspended cloud ice (predicted) and falling snow (diagnosed) components. An empirical parameterization of the effect of upward turbulent water fluxes in cloud layers is obtained from the CRM simulations by (1) identifying the time-scale of conversion of cloud ice to snow as the key parameter, and (2) regressing it onto cloud differential IR heating and environmental static stability. The updated UCLA-GCM achieves close agreement with observations in global mean top-of-atmosphere fluxes (within 1-4 W/m2). Artificially suppressing the impact of cloud turbulent fluxes reduces the global mean ice water path by a factor of 3 and produces errors in both solar and IR fluxes at the top of the atmosphere of about 5-6 W/m2.
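
    The empirical parameterization step described above, regressing the ice-to-snow conversion time-scale onto cloud differential IR heating and static stability, amounts to a two-predictor linear fit. The sketch below shows such a fit on synthetic numbers with numpy; the predictors, units, and coefficients are placeholders rather than values diagnosed from the CRM.

```python
# Sketch of fitting a two-predictor linear regression for a conversion
# time-scale, tau = b0 + b1 * dQ_ir + b2 * stability. All numbers are
# synthetic placeholders; the actual regression was built from CRM output.
import numpy as np

rng = np.random.default_rng(1)
dq_ir = rng.uniform(0.5, 5.0, size=50)        # cloud differential IR heating (arbitrary units)
stability = rng.uniform(1.0, 6.0, size=50)    # environmental static stability (arbitrary units)
tau = 2.0 + 0.8 * dq_ir - 0.3 * stability + rng.normal(scale=0.1, size=50)

A = np.column_stack([np.ones_like(dq_ir), dq_ir, stability])
coeffs, *_ = np.linalg.lstsq(A, tau, rcond=None)
print("fitted b0, b1, b2:", coeffs)
```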

  16. Architecting New Library Frameworks

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2010-01-01

    People live in an era of social, enterprise-oriented, and increasingly cloud-based technology; a dramatic shift away from stand-alone isolated silos that previously dominated. Computing systems can flourish today only when built to easily exchange data and services. An application that stands alone may provide practical functionality but may not…

  17. Reconstruction of 3D Shapes of Opaque Cumulus Clouds from Airborne Multiangle Imaging: A Proof-of-Concept

    NASA Astrophysics Data System (ADS)

    Davis, A. B.; Bal, G.; Chen, J.

    2015-12-01

    Operational remote sensing of microphysical and optical cloud properties is invariably predicated on the assumption of plane-parallel slab geometry for the targeted cloud. The sole benefit of this often-questionable assumption about the cloud is that it leads to one-dimensional (1D) radiative transfer (RT)---a textbook, computationally tractable model. We present new results as evidence that, thanks to converging advances in 3D RT, inverse problem theory, algorithm implementation, and computer hardware, we are at the dawn of a new era in cloud remote sensing where we can finally go beyond the plane-parallel paradigm. Granted, the plane-parallel/1D RT assumption is reasonable for spatially extended stratiform cloud layers, as well as for smoothly distributed background aerosol layers. However, these 1D RT-friendly scenarios exclude cases that are critically important for climate physics. 1D RT---whence operational cloud remote sensing---fails catastrophically for cumuliform clouds that have fully 3D outer shapes and internal structures driven by shallow or deep convection. For these situations, the first order of business in a robust characterization by remote sensing is to abandon the slab geometry framework and determine the 3D geometry of the cloud, as a first step toward bona fide 3D cloud tomography. With this specific goal in mind, we deliver a proof-of-concept for an entirely new kind of remote sensing applicable to 3D clouds. It is based on highly simplified 3D RT and exploits multi-angular suites of cloud images at high spatial resolution. Airborne sensors like AirMSPI readily acquire such data. The key element of the reconstruction algorithm is a sophisticated solution of the nonlinear inverse problem via linearization of the forward model and an iteration scheme supported, where necessary, by adaptive regularization. Currently, the demo uses a 2D setting to show how either vertical profiles or horizontal slices of the cloud can be accurately reconstructed. Extension to 3D volumes is straightforward, but the next challenge is to accommodate images at lower spatial resolution, e.g., from MISR/Terra. G. Bal, J. Chen, and A.B. Davis (2015). Reconstruction of cloud geometry from multi-angle images, Inverse Problems in Imaging (submitted).
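
    The reconstruction strategy described above, linearizing the forward model and iterating with regularization, follows the general pattern of a regularized Gauss-Newton update. The sketch below shows that generic pattern for a toy nonlinear forward model; it is a schematic of the approach, not the authors' cloud-geometry code, and the forward model, Jacobian, and regularization weight are invented for illustration.

```python
# Generic sketch of a regularized Gauss-Newton iteration for a nonlinear
# inverse problem: linearize the forward model, solve a damped normal-equation
# system, update, repeat. The forward model F and the weight lam are toy choices.
import numpy as np

def forward(x):
    # toy nonlinear forward model mapping 3 unknowns to 5 "measurements"
    return np.array([x[0]**2 + x[1], x[1]*x[2], np.sin(x[0]) + x[2],
                     x[0] + x[1] + x[2], x[2]**2])

def jacobian(x, eps=1e-6):
    # finite-difference Jacobian of the forward model
    f0 = forward(x)
    J = np.zeros((f0.size, x.size))
    for j in range(x.size):
        dx = np.zeros_like(x)
        dx[j] = eps
        J[:, j] = (forward(x + dx) - f0) / eps
    return J

x_true = np.array([0.8, 1.5, -0.3])
y_obs = forward(x_true) + 1e-3 * np.random.default_rng(2).normal(size=5)

x = np.array([1.0, 1.0, 0.0])   # initial guess
lam = 1e-2                      # Tikhonov regularization weight (could be adapted)
for _ in range(20):
    r = y_obs - forward(x)
    J = jacobian(x)
    step = np.linalg.solve(J.T @ J + lam * np.eye(3), J.T @ r)
    x = x + step

print("final iterate:", x, "truth:", x_true)
```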

  18. CAUSES: Attribution of Surface Radiation Biases in NWP and Climate Models near the U.S. Southern Great Plains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Weverberg, K.; Morcrette, C. J.; Petch, J.

    Many numerical weather prediction (NWP) and climate models exhibit too warm lower tropospheres near the mid-latitude continents. This warm bias has been extensively studied before, but evidence about its origin remains inconclusive. Some studies point to deficiencies in the deep convective or low clouds. Other studies found an important contribution from errors in the land surface properties. The warm bias has been shown to coincide with important surface radiation biases that likely play a critical role in the inception or the growth of the warm bias. Documenting these radiation errors is hence an important step towards understanding and alleviating the warm bias. This paper presents an attribution study to quantify the net radiation biases in 9 model simulations, performed in the framework of the CAUSES project (Clouds Above the United States and Errors at the Surface). Contributions from deficiencies in the surface properties, clouds, integrated water vapor (IWV) and aerosols are quantified, using an array of radiation measurement stations near the ARM SGP site. Furthermore, an in-depth analysis is presented to attribute the radiation errors to specific cloud regimes. The net surface SW radiation is overestimated (LW underestimated) in all models throughout most of the simulation period. Cloud errors are shown to contribute most to this overestimation in all but one model, which has a dominant albedo issue. Using a cloud regime analysis, it was shown that missing deep cloud events and/or simulating deep clouds with too weak cloud-radiative effects account for most of these cloud-related radiation errors. Some models have compensating errors, simulating deep cloud too frequently but largely underestimating its radiative effect, while other models miss deep cloud events altogether. Surprisingly, however, even the latter models tend to produce too much and too frequent afternoon surface precipitation. This suggests that rather than issues with the triggering of deep convection, the deep cloud problem in many models could be related to too weak convective cloud detrainment and too large precipitation efficiencies. This does not rule out that previously documented issues with the evaporative fraction contribute to the warm bias as well, since the majority of the models underestimate the surface rain rates overall, as they miss the observed large nocturnal precipitation peak.

  19. Precision and Accuracy of a Digital Impression Scanner in Full-Arch Implant Rehabilitation.

    PubMed

    Pesce, Paolo; Pera, Francesco; Setti, Paolo; Menini, Maria

    To evaluate the accuracy and precision of a digital scanner used to scan four implants positioned according to an immediate loading implant protocol and to assess the accuracy of an aluminum framework fabricated from a digital impression. Five master casts reproducing different edentulous maxillae with four tilted implants were used. Four scan bodies were screwed onto the low-profile abutments, and a digital intraoral scanner was used to perform five digital impressions of each master cast. To assess trueness, a metal framework of the best digital impression was produced with computer-aided design/computer-assisted manufacture (CAD/CAM) technology and passive fit was assessed with the Sheffield test. Gaps between the frameworks and the implant analogs were measured with a stereomicroscope. To assess precision, three-dimensional (3D) point cloud processing software was used to measure the deviations between the five digital impressions of each cast by producing a color map. The deviation values were grouped in three classes, and differences were assessed between class 2 (representing lower discrepancies) and the assembled classes 1 and 3 (representing the higher negative and positive discrepancies, respectively). The frameworks showed a mean gap of < 30 μm (range: 2 to 47 μm). A statistically significant difference was found between the two groups by the 3D point cloud software, with higher frequencies of points in class 2 than in grouped classes 1 and 3 (P < .001). Within the limits of this in vitro study, it appears that a digital impression may represent a reliable method for fabricating full-arch implant frameworks with good passive fit when tilted implants are present.
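
    The precision assessment step, measuring deviations between repeated digital impressions with 3D point-cloud software, is essentially a nearest-neighbour distance computation between two clouds. The sketch below shows that computation with scipy on synthetic points and a made-up deviation band; it is not the commercial software or the thresholds used in the study.

```python
# Sketch of point-cloud deviation analysis: for every point in scan B, find the
# nearest point in reference scan A and report the distance. Points and the
# deviation band below are synthetic placeholders.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(5)
scan_a = rng.random((5000, 3)) * 50.0                         # reference impression (mm)
scan_b = scan_a + rng.normal(scale=0.02, size=scan_a.shape)   # repeat impression

tree = cKDTree(scan_a)
distances, _ = tree.query(scan_b)          # nearest-neighbour deviation per point

threshold_mm = 0.03                        # hypothetical "low discrepancy" band
within = np.mean(distances <= threshold_mm)
print(f"mean deviation: {distances.mean() * 1000:.1f} um, "
      f"fraction within {threshold_mm} mm: {within:.2%}")
```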

  20. Probing aerosol indirect effect on deep convection using idealized cloud-resolving simulations with parameterized large-scale dynamics.

    NASA Astrophysics Data System (ADS)

    Anber, U.; Wang, S.; Gentine, P.; Jensen, M. P.

    2017-12-01

    A framework is introduced to investigate the indirect impact of aerosol loading on tropical deep convection using three-dimensional idealized cloud-system-resolving simulations with a coupled large-scale circulation. The large-scale dynamics is parameterized using a spectral weak temperature gradient approximation that utilizes the dominant balance in the tropics between adiabatic cooling and diabatic heating. The aerosol loading effect is examined by varying the number concentration of cloud condensation nuclei (CCN) available to form cloud droplets in the bulk microphysics scheme over a wide range, from 30 to 5000, without including any radiative effects, as the radiative cooling is prescribed at a constant rate, in order to isolate the microphysical effect. Increasing the aerosol number concentration causes mean precipitation to decrease monotonically, despite the increase in cloud condensate. This reduction in precipitation efficiency is attributed to a reduction in the surface enthalpy fluxes, and not to the divergent circulation, as the gross moist stability remains unchanged. We derive a simple scaling argument based on the moist static energy budget that enables a direct estimate of changes in precipitation given known changes in surface enthalpy fluxes and the constant gross moist stability. The impact on cloud hydrometeors and microphysical properties is also examined and is consistent with the macrophysical picture.
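
    A minimal arithmetic sketch of the kind of estimate such a scaling argument permits is given below. The exact form of the scaling and all numbers are illustrative assumptions, not the authors' derivation: it simply maps a hypothetical change in surface enthalpy fluxes onto a precipitation change through a fixed normalized gross moist stability.

```python
# Schematic back-of-envelope estimate: if the normalized gross moist stability
# M stays fixed, a change in surface enthalpy fluxes maps onto a change in
# precipitation roughly as dP ~ dSEF / (Lv * M). Form and numbers are
# illustrative assumptions only, not the study's exact budget.
LV = 2.5e6          # latent heat of vaporization (J/kg)
M = 0.2             # hypothetical constant normalized gross moist stability
d_sef = -15.0       # hypothetical change in surface enthalpy flux (W/m^2)

d_precip_flux = d_sef / (LV * M)                 # kg/m^2/s
d_precip_mm_day = d_precip_flux * 86400.0        # mm/day (1 kg/m^2 = 1 mm of water)
print(f"approximate precipitation change: {d_precip_mm_day:.2f} mm/day")
```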

  1. Satellite-based trends of solar radiation and cloud parameters in Europe

    NASA Astrophysics Data System (ADS)

    Pfeifroth, Uwe; Bojanowski, Jedrzej S.; Clerbaux, Nicolas; Manara, Veronica; Sanchez-Lorenzo, Arturo; Trentmann, Jörg; Walawender, Jakub P.; Hollmann, Rainer

    2018-04-01

    Solar radiation is the main driver of the Earth's climate. Measuring solar radiation and analysing its interaction with clouds are essential for the understanding of the climate system. The EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF) generates satellite-based, high-quality climate data records, with a focus on the energy balance and water cycle. Here, several of these data records are analyzed in a common framework to assess the consistency in trends and spatio-temporal variability of surface solar radiation, top-of-atmosphere reflected solar radiation and cloud fraction. This multi-parameter analysis focuses on Europe and covers the time period from 1992 to 2015. A high correlation between these three variables has been found over Europe. The overall consistency of the climate data records reveals an increase in surface solar radiation and a decrease in top-of-atmosphere reflected radiation. In addition, those trends are confirmed by negative trends in cloud cover. This consistency documents the high quality and stability of the CM SAF climate data records, which are mostly derived independently from each other. The results of this study indicate that one of the main reasons for the positive trend in surface solar radiation since the 1990s is a decrease in cloud coverage, even if an aerosol contribution cannot be completely ruled out.

  2. Evaluation of Long-Term Cloud-Resolving Model Simulations Using Satellite Radiance Observations and Multi-Frequency Satellite Simulators

    NASA Technical Reports Server (NTRS)

    Matsui, Toshihisa; Zeng, Xiping; Tao, Wei-Kuo; Masunaga, Hirohiko; Olson, William S.; Lang, Stephen

    2008-01-01

    This paper proposes a methodology known as the Tropical Rainfall Measuring Mission (TRMM) Triple-Sensor Three-step Evaluation Framework (T3EF) for the systematic evaluation of precipitating cloud types and microphysics in a cloud-resolving model (CRM). T3EF utilizes multi-frequency satellite simulators and novel statistics of multi-frequency radiance and backscattering signals observed from the TRMM satellite. Specifically, T3EF compares CRM and satellite observations in the form of combined probability distributions of precipitation radar (PR) reflectivity, polarization-corrected microwave brightness temperature (Tb), and infrared Tb to evaluate the candidate CRM. T3EF is used to evaluate the Goddard Cumulus Ensemble (GCE) model for cases involving the South China Sea Monsoon Experiment (SCSMEX) and the Kwajalein Experiment (KWAJEX). This evaluation reveals that the GCE properly captures the satellite-measured frequencies of different precipitating cloud types in the SCSMEX case but underestimates the frequencies of deep convective and deep stratiform types in the KWAJEX case. Moreover, the GCE tends to simulate excessively large and abundant frozen condensates in deep convective clouds as inferred from the overestimated GCE-simulated radar reflectivities and microwave Tb depressions. Unveiling the detailed errors in the GCE's performance provides the best direction for model improvements.
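
    The "combined probability distributions" used by T3EF can be illustrated with a simple joint-histogram computation. The sketch below builds a normalized 2-D histogram of two synthetic signals standing in for PR reflectivity and microwave Tb; the data and bin edges are placeholders, not the TRMM products or the study's statistics.

```python
# Sketch of building a combined (joint) probability distribution of two
# signals, e.g. radar reflectivity vs. microwave brightness temperature.
# The synthetic data and bin edges are placeholders for illustration.
import numpy as np

rng = np.random.default_rng(3)
reflectivity_dbz = rng.normal(30.0, 8.0, size=20000)   # stand-in for PR reflectivity
tb_kelvin = 280.0 - 1.5 * (reflectivity_dbz - 30.0) + rng.normal(0.0, 5.0, size=20000)

refl_bins = np.arange(10, 61, 2)       # dBZ
tb_bins = np.arange(180, 301, 5)       # K

hist, xedges, yedges = np.histogram2d(reflectivity_dbz, tb_kelvin,
                                      bins=[refl_bins, tb_bins], density=True)
print(hist.shape)   # joint PDF on the (reflectivity, Tb) grid
```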

  3. Comparison of tropical cyclogenesis processes in climate model and cloud-resolving model simulations using moist static energy budget analysis

    NASA Astrophysics Data System (ADS)

    Wing, Allison; Camargo, Suzana; Sobel, Adam; Kim, Daehyun; Murakami, Hiroyuki; Reed, Kevin; Vecchi, Gabriel; Wehner, Michael; Zarzycki, Colin; Zhao, Ming

    2017-04-01

    In recent years, climate models have improved such that high-resolution simulations are able to reproduce the climatology of tropical cyclone activity with some fidelity and show some skill in seasonal forecasting. However, biases remain in many models, motivating a better understanding of the factors that control the representation of tropical cyclone activity in climate models. We explore the tropical cyclogenesis processes in five high-resolution climate models, including both coupled and uncoupled configurations. Our analysis framework focuses on how convection, moisture, clouds and related processes are coupled and employs budgets of column moist static energy and the spatial variance of column moist static energy. The latter was originally developed to study the mechanisms of tropical convective organization in idealized cloud-resolving models, and allows us to quantify the different feedback processes responsible for the amplification of moist static energy anomalies associated with the organization of convection and cyclogenesis. We track the formation and evolution of tropical cyclones in the climate model simulations and apply our analysis both along the individual tracks and composited over many tropical cyclones. We then compare the genesis processes, in particular the role of cloud-radiation interactions, to those of spontaneous tropical cyclogenesis in idealized cloud-resolving model simulations.

  4. Validation of GEOLAND-2 Spot/vgt Albedo Products by Using Ceos Olive Methodology

    NASA Astrophysics Data System (ADS)

    Camacho de Coca, F.; Sanchez, J.; Schaaf, C.; Baret, F.; Weiss, M.; Cescatti, A.; Lacaze, R. N.

    2012-12-01

    This study evaluates the scientific merit of the global surface albedo products developed in the framework of the Geoland-2 project based on SPOT/VEGETATION observations. The methodology follows the OLIVE (On-Line Validation Exercise) approach supported by the CEOS Land Product Validation subgroup (calvalportal.ceos.org/cvp/web/olive). First, the spatial and temporal consistency of SPOT/VGT albedo products was assessed by intercomparison with reference global products (MODIS/Terra+Aqua and POLDER-3/PARASOL) for the period 2006-2007. A bulk statistical analysis over a global network of 420 homogeneous sites (BELMANIP-2) was performed and analyzed per biome type. Additional sites were included to study albedo under snow conditions. Second, the accuracy and realism of temporal variations were evaluated using a number of ground measurements from FLUXNET sites suitable for use in direct comparison to the co-located satellite data. Our results show that SPOT/VGT albedo products present a reliable spatial and temporal distribution of retrievals. The SPOT/VGT albedo agrees closely with MODIS, with a mean bias and RMSE for the shortwave black-sky albedo over BELMANIP-2 sites lower than 0.006 and 0.03 (13% in relative terms), respectively, and even better for snow-free pixels. Similar results were found for the white-sky albedo quantities. Discrepancies are larger when comparing with POLDER-3 products: for the shortwave black-sky albedo a mean bias of -0.014 and an RMSE of 0.04 (20%) were found. These overall performance figures are, however, land-cover dependent, and larger uncertainties were found over some biomes (or regions) or specific periods (e.g., winter in the Northern Hemisphere). The comparison of SPOT/VGT blue-sky albedo estimates with ground measurements (mainly over needle-leaf forest sites) shows an RMSE of 0.04 and a bias of 0.003 when only snow-free pixels are considered. Moreover, this work shows that the OLIVE tool is also suitable for validation of global albedo products.

  5. Cloud information content analysis of multi-angular measurements in the oxygen A-band: application to 3MI and MSPI

    NASA Astrophysics Data System (ADS)

    Merlin, Guillaume; Riedi, Jérôme; Labonnote, Laurent C.; Cornet, Céline; Davis, Anthony B.; Dubuisson, Phillipe; Desmons, Marine; Ferlay, Nicolas; Parol, Frédéric

    2016-10-01

    Information content analyses on cloud top altitude (CTOP) and geometrical thickness (CGT) from multi-angular A-band measurements in the case of monolayer homogeneous clouds are conducted. In the framework of future multi-angular radiometer development, we compared the potential performances of the 3MI (Multi-viewing, Multi-channel and Multi-polarization Imaging) instrument developed by EUMETSAT, which is an extension of POLDER/PARASOL instrument and MSPI (Multiangle SpectroPolarimetric Imager) developed by NASA's Jet Propulsion Laboratory. Quantitative information content estimates were realized for thin, moderately opaque and opaque clouds for different surface albedo and viewing geometry configurations. Analyses show that retrieval of CTOP is possible with a high accuracy in most of the cases investigated. Retrieval of CGT is also possible for optically thick clouds above a black surface, at least when CGT > 1-2 km and for thin clouds for CGT > 2-3 km. However, for intermediate optical thicknesses (COT ≃ 4), we show that the retrieval of CGT is not simultaneously possible with CTOP. A comparison between 3MI and MSPI shows a higher information content for MSPI's measurements, traceable to a thinner filter inside the oxygen A-band, yielding higher signal-to-noise ratio for absorption estimation. Cases of cloud scenes above bright surfaces are more complex but it is shown that the retrieval of CTOP remains possible in almost all situations while the information content on CGT appears to be insufficient in many cases, particularly for COT < 4 and CGT < 2-3 km.

  6. Impact of natural and anthropogenic aerosols on stratocumulus and precipitation in the Southeast Pacific: a regional modelling study using WRF-Chem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Q.; Gustafson, W. I.; Fast, J. D.

    2012-09-28

    Cloud-system resolving simulations with the chemistry version of the Weather Research and Forecasting (WRF-Chem) model are used to quantify the relative impacts of regional anthropogenic and oceanic emissions on changes in aerosol properties, cloud macro- and microphysics, and cloud radiative forcing over the Southeast Pacific (SEP) during the VAMOS Ocean-Cloud-Atmosphere-Land Study Regional Experiment (VOCALS-REx) (15 October–16 November 2008). Two distinct regions are identified. The near-coast polluted region is characterized by low surface precipitation rates, the strong suppression of non-sea-salt particle activation due to sea-salt particles, a predominant albedo effect in aerosol indirect effects, and limited impact of aerosols associated with anthropogenic emissions on clouds. Opposite sensitivities to natural marine and anthropogenic aerosol perturbations are seen in cloud properties (e.g., cloud optical depth and cloud-top and cloud-base heights), precipitation, and the top-of-atmosphere and surface shortwave fluxes over this region. The relatively clean remote region is characterized by large contributions of aerosols from non-regional sources (lateral boundaries) and much stronger drizzle at the surface. Under a scenario of a five-fold increase in regional anthropogenic emissions, this relatively clean region shows large cloud responses, for example, a 13% increase in cloud-top height and a 9% increase in albedo in response to a moderate increase (25% of the reference case) in cloud condensation nuclei (CCN) concentration. The reduction of precipitation due to this increase in anthropogenic aerosols more than doubles the aerosol lifetime in the clean marine boundary layer. Therefore, the aerosol impacts on precipitation are amplified by the positive feedback of precipitation on aerosol, which ultimately alters the cloud micro- and macro-physical properties, leading to strong aerosol-cloud-precipitation interactions. The high sensitivity is also related to an increase in cloud-top entrainment rate (by 16% at night) due to the increased anthropogenic aerosols. The simulated aerosol-cloud-precipitation interactions due to the increased anthropogenic aerosols have a stronger diurnal cycle over the clean region compared to the near-coast region, with stronger interactions at night. During the day, solar heating results in more frequent decoupling of the cloud and sub-cloud layers, thinner clouds, reduced precipitation, and reduced sensitivity to the increase in anthropogenic emissions. This study shows the importance of natural aerosols in accurately quantifying anthropogenic forcing within a regional modeling framework. Finally, the results of this study also imply that the energy balance perturbations from increased anthropogenic emissions are larger in the more susceptible clean environment than in an already polluted environment and are larger than possible from the first indirect effect alone.

  7. Ultrafast and scalable cone-beam CT reconstruction using MapReduce in a cloud computing environment.

    PubMed

    Meng, Bowen; Pratx, Guillem; Xing, Lei

    2011-12-01

    Four-dimensional CT (4DCT) and cone beam CT (CBCT) are widely used in radiation therapy for accurate tumor target definition and localization. However, high-resolution and dynamic image reconstruction is computationally demanding because of the large amount of data processed. Efficient use of these imaging techniques in the clinic requires high-performance computing. The purpose of this work is to develop a novel ultrafast, scalable and reliable image reconstruction technique for 4D CBCT/CT using a parallel computing framework called MapReduce. We show the utility of MapReduce for solving large-scale medical physics problems in a cloud computing environment. In this work, we accelerated the Feldkamp-Davis-Kress (FDK) algorithm by porting it to Hadoop, an open-source MapReduce implementation. Gated phases from a 4DCT scan were reconstructed independently. Following the MapReduce formalism, Map functions were used to filter and backproject subsets of projections, and a Reduce function was used to aggregate those partial backprojections into the whole volume. MapReduce automatically parallelized the reconstruction process on a large cluster of computer nodes. As a validation, reconstruction of a digital phantom and an acquired CatPhan 600 phantom was performed on a commercial cloud computing environment using the proposed 4D CBCT/CT reconstruction algorithm. The speedup of reconstruction time is found to be roughly linear with the number of nodes employed. For instance, greater than 10 times speedup was achieved using 200 nodes for all cases, compared to the same code executed on a single machine. Without modifying the code, faster reconstruction is readily achievable by allocating more nodes in the cloud computing environment. The root mean square error between the images obtained using MapReduce and a single-threaded reference implementation was on the order of 10^-7. Our study also proved that cloud computing with MapReduce is fault tolerant: the reconstruction completed successfully with identical results even when half of the nodes were manually terminated in the middle of the process. An ultrafast, reliable and scalable 4D CBCT/CT reconstruction method was developed using the MapReduce framework. Unlike other parallel computing approaches, the parallelization and speedup required little modification of the original reconstruction code. MapReduce provides an efficient and fault tolerant means of solving large-scale computing problems in a cloud computing environment.
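
    The Map/Reduce decomposition described above, in which Map filters and backprojects a subset of projections and Reduce sums the partial volumes, can be sketched without Hadoop in plain Python. The example below uses a trivial "backprojection" stand-in to show the dataflow only; it is not the FDK implementation from the paper.

```python
# Dataflow sketch of MapReduce-style reconstruction: each Map call processes a
# subset of projections into a partial volume; Reduce sums the partial volumes.
# The "backprojection" here is a trivial stand-in, not a real FDK kernel.
import numpy as np
from functools import reduce

VOLUME_SHAPE = (32, 32, 32)

def map_backproject(projection_subset):
    """Filter and backproject one subset of projections into a partial volume."""
    partial = np.zeros(VOLUME_SHAPE)
    for proj in projection_subset:
        # placeholder for ramp filtering + cone-beam backprojection
        partial += proj.mean() * np.ones(VOLUME_SHAPE) / len(projection_subset)
    return partial

def reduce_sum(vol_a, vol_b):
    """Aggregate partial backprojections into the whole volume."""
    return vol_a + vol_b

rng = np.random.default_rng(4)
projections = [rng.random((64, 64)) for _ in range(360)]          # synthetic projections
subsets = [projections[i:i + 30] for i in range(0, 360, 30)]      # work for each mapper

partials = map(map_backproject, subsets)          # "Map" phase
volume = reduce(reduce_sum, partials)             # "Reduce" phase
print(volume.shape)
```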

  8. Ultrafast and scalable cone-beam CT reconstruction using MapReduce in a cloud computing environment

    PubMed Central

    Meng, Bowen; Pratx, Guillem; Xing, Lei

    2011-01-01

    Purpose: Four-dimensional CT (4DCT) and cone beam CT (CBCT) are widely used in radiation therapy for accurate tumor target definition and localization. However, high-resolution and dynamic image reconstruction is computationally demanding because of the large amount of data processed. Efficient use of these imaging techniques in the clinic requires high-performance computing. The purpose of this work is to develop a novel ultrafast, scalable and reliable image reconstruction technique for 4D CBCT/CT using a parallel computing framework called MapReduce. We show the utility of MapReduce for solving large-scale medical physics problems in a cloud computing environment. Methods: In this work, we accelerated the Feldkamp–Davis–Kress (FDK) algorithm by porting it to Hadoop, an open-source MapReduce implementation. Gated phases from a 4DCT scan were reconstructed independently. Following the MapReduce formalism, Map functions were used to filter and backproject subsets of projections, and a Reduce function was used to aggregate those partial backprojections into the whole volume. MapReduce automatically parallelized the reconstruction process on a large cluster of computer nodes. As a validation, reconstruction of a digital phantom and an acquired CatPhan 600 phantom was performed on a commercial cloud computing environment using the proposed 4D CBCT/CT reconstruction algorithm. Results: Speedup of reconstruction time is found to be roughly linear with the number of nodes employed. For instance, greater than 10 times speedup was achieved using 200 nodes for all cases, compared to the same code executed on a single machine. Without modifying the code, faster reconstruction is readily achievable by allocating more nodes in the cloud computing environment. Root mean square error between the images obtained using MapReduce and a single-threaded reference implementation was on the order of 10^-7. Our study also proved that cloud computing with MapReduce is fault tolerant: the reconstruction completed successfully with identical results even when half of the nodes were manually terminated in the middle of the process. Conclusions: An ultrafast, reliable and scalable 4D CBCT/CT reconstruction method was developed using the MapReduce framework. Unlike other parallel computing approaches, the parallelization and speedup required little modification of the original reconstruction code. MapReduce provides an efficient and fault tolerant means of solving large-scale computing problems in a cloud computing environment. PMID:22149842

  9. CloudAligner: A fast and full-featured MapReduce based tool for sequence mapping.

    PubMed

    Nguyen, Tung; Shi, Weisong; Ruden, Douglas

    2011-06-06

    Research in genetics has developed rapidly in recent years with the aid of next-generation sequencing (NGS). However, massively-parallel NGS produces enormous amounts of data, which leads to storage, compatibility, scalability, and performance issues. The Cloud Computing and MapReduce framework, which utilizes hundreds or thousands of shared computers to map sequencing reads quickly and efficiently to reference genome sequences, appears to be a very promising solution for these issues. Consequently, it has been adopted by many organizations recently, and the initial results are very promising. However, since these are only initial steps in this direction, the software developed so far does not provide adequate support for primary functions, such as bisulfite or paired-end mapping, that are available in on-site software such as RMAP or BS Seeker. In addition, existing MapReduce-based applications were not designed to process the long reads produced by the most recent second-generation and third-generation NGS instruments and, therefore, are inefficient. Last, most of these tools were developed on Linux with a command-line interface, making them difficult to use for the majority of biologists, who are not trained in programming. To encourage the adoption of Cloud technologies in genomics and prepare for advances in second- and third-generation DNA sequencing, we have built a Hadoop MapReduce-based application, CloudAligner, which achieves higher performance, covers most primary features, is more accurate, and has a user-friendly interface. It was also designed to be able to deal with long sequences. The performance gain of CloudAligner over Cloud-based counterparts (35 to 80%) mainly comes from the omission of the reduce phase. In comparison to local-based approaches, the performance gain of CloudAligner comes from the partitioning and parallel processing of the huge reference genome as well as the reads. The source code of CloudAligner is available at http://cloudaligner.sourceforge.net/ and its web version is at http://mine.cs.wayne.edu:8080/CloudAligner/. Our results show that CloudAligner is faster than CloudBurst, provides more accurate results than RMAP, and supports various input as well as output formats. In addition, with the web-based interface, it is easier to use than its counterparts.

  10. HammerCloud: A Stress Testing System for Distributed Analysis

    NASA Astrophysics Data System (ADS)

    van der Ster, Daniel C.; Elmsheuser, Johannes; Úbeda García, Mario; Paladin, Massimo

    2011-12-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain, at a steady state, a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).

  11. Knowledge-Based Object Detection in Laser Scanning Point Clouds

    NASA Astrophysics Data System (ADS)

    Boochs, F.; Karmacharya, A.; Marbs, A.

    2012-07-01

    Object identification and object processing in 3D point clouds have always posed challenges in terms of effectiveness and efficiency. In practice, this process is highly dependent on human interpretation of the scene represented by the point cloud data, as well as on the set of modeling tools available for use. Such modeling algorithms are data-driven and concentrate on specific features of the objects that are accessible to numerical models. We present an approach that brings human expert knowledge about the scene, the objects inside it, their representation in the data, and the behavior of the algorithms to the machine. This "understanding" enables the machine to assist human interpretation of the scene inside the point cloud. Furthermore, it allows the machine to understand the possibilities and limitations of algorithms and to take this into account within the processing chain. This not only assists the researchers in defining optimal processing steps, but also provides suggestions when certain changes or new details emerge from the point cloud. Our approach benefits from the advancement of knowledge technologies within the Semantic Web framework, which has provided a strong base for applications built on knowledge management. In this article we present and describe the knowledge technologies used in our approach, such as the Web Ontology Language (OWL), used to formulate the knowledge base, and the Semantic Web Rule Language (SWRL) with 3D-processing and topological built-ins, which together combine geometrical analysis of 3D point clouds with specialists' knowledge of the scene and algorithmic processing.
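
    As a small illustration of encoding scene knowledge in OWL, as described above, the snippet below builds a tiny ontology fragment with rdflib and asserts one detected object. The namespace, class names, and property are invented for illustration, and SWRL rule handling (which rdflib does not provide) is omitted.

```python
# Minimal sketch of expressing scene knowledge as OWL triples with rdflib.
# The namespace, classes, and individual are hypothetical; real SWRL rules and
# 3D-processing built-ins would require a reasoner and are not shown here.
from rdflib import Graph, Namespace, RDF, RDFS, OWL, Literal

EX = Namespace("http://example.org/scene#")
g = Graph()
g.bind("ex", EX)

# Ontology: a Wall is a kind of PlanarObject
g.add((EX.PlanarObject, RDF.type, OWL.Class))
g.add((EX.Wall, RDF.type, OWL.Class))
g.add((EX.Wall, RDFS.subClassOf, EX.PlanarObject))

# One individual detected in the point cloud, with a fit-quality annotation
g.add((EX.wall_01, RDF.type, EX.Wall))
g.add((EX.wall_01, EX.planarityScore, Literal(0.97)))

print(g.serialize(format="turtle"))
```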

  12. A Coupled GCM-Cloud Resolving Modeling System, and a Regional Scale Model to Study Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2007-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. A coupled global circulation model (GCM) and cloud-scale model (termed a superparameterization or multi-scale modeling framework, MMF) is required to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud-related datasets can provide initial conditions as well as validation for both the MMF and CRMs. The Goddard MMF is based on the 2D Goddard Cumulus Ensemble (GCE) model and the Goddard finite volume general circulation model (fvGCM), and it has started production runs with two years of results (1998 and 1999). Also, at Goddard, we have implemented several Goddard microphysical schemes (2ICE, several 3ICE), Goddard radiation (including explicitly calculated cloud optical properties), and the Goddard Land Information System (LIS, which includes the CLM and NOAH land surface models) into a next-generation regional scale model, WRF. In this talk, I will present: (1) A brief review of the GCE model and its applications to precipitation processes (microphysical and land processes), (2) The Goddard MMF and the major differences between the two existing MMFs (CSU MMF and Goddard MMF), and preliminary results (the comparison with traditional GCMs), and (3) A discussion of the Goddard WRF version (its developments and applications).

  13. A Coupled GCM-Cloud Resolving Modeling System, and A Regional Scale Model to Study Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2006-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. A coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF) is required to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud-related datasets can provide initial conditions as well as validation for both the MMF and CRMs. The Goddard MMF is based on the 2D Goddard Cumulus Ensemble (GCE) model and the Goddard finite volume general circulation model (fvGCM), and it has started production runs with two years of results (1998 and 1999). Also, at Goddard, we have implemented several Goddard microphysical schemes (2ICE, several 3ICE), Goddard radiation (including explicitly calculated cloud optical properties), and the Goddard Land Information System (LIS, which includes the CLM and NOAH land surface models) into a next generation regional scale model, WRF. In this talk, I will present: (1) A brief review of the GCE model and its applications to precipitation processes (microphysical and land processes), (2) The Goddard MMF and the major differences between the two existing MMFs (CSU MMF and Goddard MMF), and preliminary results (the comparison with traditional GCMs), and (3) A discussion of the Goddard WRF version (its developments and applications).

  14. The Integration of CloudStack and OCCI/OpenNebula with DIRAC

    NASA Astrophysics Data System (ADS)

    Méndez Muñoz, Víctor; Fernández Albor, Víctor; Graciani Diaz, Ricardo; Casajús Ramo, Adriàn; Fernández Pena, Tomás; Merino Arévalo, Gonzalo; José Saborido Silva, Juan

    2012-12-01

    The increasing availability of Cloud resources is emerging as a realistic alternative to the Grid as a paradigm for enabling scientific communities to access large distributed computing resources. The DIRAC framework for distributed computing provides an easy way to access resources from both systems efficiently. This paper explains the integration of DIRAC with two open-source Cloud Managers: OpenNebula (taking advantage of the OCCI standard) and CloudStack. These are computing tools to manage the complexity and heterogeneity of distributed data center infrastructures, allowing virtual clusters to be created on demand across public, private and hybrid clouds. This approach required developing an extension to the previous DIRAC Virtual Machine engine, originally developed for Amazon EC2, to allow the connection with these new cloud managers. In the OpenNebula case, the development has been based on the CernVM Virtual Software Appliance with appropriate contextualization, while in the case of CloudStack, the infrastructure has been kept more general, which permits other Virtual Machine sources and operating systems to be used. In both cases, the CernVM File System has been used to facilitate software distribution to the computing nodes. With the resulting infrastructure, the cloud resources are transparent to the users through a friendly interface, like the DIRAC Web Portal. The main purpose of this integration is to obtain a system that can manage cloud and grid resources at the same time. This particular feature pushes DIRAC to a new conceptual denomination as interware, integrating different middleware. Users from different communities do not need to care about the installation of the standard software available at the nodes, nor about the operating system of the host machine, which is transparent to the user. This paper presents an analysis of the overhead of the virtual layer, with tests comparing the proposed approach to the existing Grid solution. License Notice: Published under licence in Journal of Physics: Conference Series by IOP Publishing Ltd.

  15. An Interactive Web-Based Analysis Framework for Remote Sensing Cloud Computing

    NASA Astrophysics Data System (ADS)

    Wang, X. Z.; Zhang, H. M.; Zhao, J. H.; Lin, Q. H.; Zhou, Y. C.; Li, J. H.

    2015-07-01

    Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agricultural, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. Providing cloud users with an effective way to access and analyse these massive spatiotemporal data from web clients has become an urgent issue. In this paper, we propose a new scalable, interactive and web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide the end-user with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users' private data is constructed based on an open-source distributed file system. In it, massive remote sensing data are stored as public data, while the intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, an open-source lightweight container technology for the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and GRASS GIS is deployed. Users can write scripts in the IPython Notebook web page through the web browser to process data, and the scripts are submitted to the IPython kernel for execution. By comparing the performance of remote sensing data analysis tasks executed in Docker containers, KVM virtual machines, and physical machines, we conclude that the cloud computing environment built with Docker makes the best use of host system resources and can handle more concurrent spatiotemporal computing tasks. Docker provides resource isolation mechanisms for I/O, CPU, and memory, which offers a security guarantee when processing remote sensing data in the IPython Notebook. Users can write complex data processing code on the web directly, so they can design their own data processing algorithms.
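
    The kind of script a user might run through the notebook interface of such a platform can be sketched as follows. The snippet computes an NDVI-style index from two bands of a raster with GDAL and NumPy; the file name and the band assignment (red in band 3, near-infrared in band 4) are assumptions for illustration, not part of the platform described.

```python
# Sketch of a remote-sensing script of the sort users would submit through the
# notebook interface: read two bands with GDAL and compute a normalized index.
# The file name and band assignment (red=3, NIR=4) are illustrative assumptions.
import numpy as np
from osgeo import gdal

dataset = gdal.Open("scene.tif")                   # hypothetical input raster
red = dataset.GetRasterBand(3).ReadAsArray().astype(np.float64)
nir = dataset.GetRasterBand(4).ReadAsArray().astype(np.float64)

ndvi = np.where(nir + red > 0, (nir - red) / (nir + red), np.nan)
print("mean NDVI:", np.nanmean(ndvi))
```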

  16. Using Multi-Scale Modeling Systems and Satellite Data to Study the Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chern, J.; Lang, S.; Matsui, T.; Shen, B.; Zeng, X.; Shi, R.

    2010-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 sq km in three dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud-resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the results from using the multi-scale modeling system to study the interactions between clouds, precipitation, and aerosols will be presented. Also, how to use the multi-satellite simulator to improve the simulated precipitation processes will be discussed.

  17. Using Multi-Scale Modeling Systems to Study the Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2010-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud-resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the results from using the multi-scale modeling system to study the interactions between clouds, precipitation, and aerosols will be presented. Also, how to use the multi-satellite simulator to improve the simulated precipitation processes will be discussed.

  18. HD 209458b in new light: evidence of nitrogen chemistry, patchy clouds and sub-solar water

    NASA Astrophysics Data System (ADS)

    MacDonald, Ryan J.; Madhusudhan, Nikku

    2017-08-01

    Interpretations of exoplanetary transmission spectra have been undermined by apparent obscuration due to clouds/hazes. Debate rages on whether weak H2O features seen in exoplanet spectra are due to clouds or inherently depleted oxygen. Assertions of solar H2O abundances have relied on making a priori model assumptions, for example, chemical/radiative equilibrium. In this work, we attempt to address this problem with a new retrieval paradigm for transmission spectra. We introduce poseidon, a two-dimensional atmospheric retrieval algorithm including generalized inhomogeneous clouds. We demonstrate that this prescription allows one to break vital degeneracies between clouds and prominent molecular abundances. We apply poseidon to the best transmission spectrum presently available, for the hot Jupiter HD 209458b, uncovering new insights into its atmosphere at the day-night terminator. We extensively explore the parameter space with an unprecedented 10^8 models, spanning the continuum from fully cloudy to cloud-free atmospheres, in a fully Bayesian retrieval framework. We report the first detection of nitrogen chemistry (NH3 and/or HCN) in an exoplanet atmosphere at 3.7-7.7σ confidence, non-uniform cloud coverage at 4.5-5.4σ, high-altitude hazes at >3σ and sub-solar H2O at ≳3-5σ, depending on the assumed cloud distribution. We detect NH3 at 3.3σ, and 4.9σ for fully cloudy and cloud-free scenarios, respectively. For the model with the highest Bayesian evidence, we constrain H2O at 5-15 ppm (0.01-0.03) × solar and NH3 at 0.01-2.7 ppm, strongly suggesting disequilibrium chemistry and cautioning against equilibrium assumptions. Our results herald a new promise for retrieving cloudy atmospheres using high-precision Hubble Space Telescope and James Webb Space Telescope spectra.

  19. Quality assessment and improvement of the EUMETSAT Meteosat Surface Albedo Climate Data Record

    NASA Astrophysics Data System (ADS)

    Lattanzio, A.; Fell, F.; Bennartz, R.; Trigo, I. F.; Schulz, J.

    2015-10-01

    Surface albedo has been identified as an important parameter for understanding and quantifying the Earth's radiation budget. EUMETSAT generated the Meteosat Surface Albedo (MSA) Climate Data Record (CDR), currently comprising up to 24 years (1982-2006) of continuous surface albedo coverage for large areas of the Earth. This CDR has been created within the Sustained, Coordinated Processing of Environmental Satellite Data for Climate Monitoring (SCOPE-CM) framework. The long-term consistency of the MSA CDR is high and meets the Global Climate Observing System (GCOS) stability requirements for desert reference sites. Clouds left unremoved by the embedded cloud screening procedure are the most relevant weakness in the retrieval process. A twofold strategy is applied to efficiently improve the cloud detection and removal. The first step consists of the application of a robust and reliable cloud mask, taking advantage of the information contained in the measurements of the infrared and visible bands. Due to the limited information available from old radiometers, some clouds can still remain undetected. A second step relies on a post-processing analysis of the albedo seasonal variation, together with the usage of a background albedo map, in order to detect and screen out such outliers. The usage of a reliable cloud mask has a double effect. It enhances the number of high-quality retrievals for tropical forest areas sensed under low view angles and removes the frequently unrealistic retrievals on similar surfaces sensed under high view angles. As expected, the usage of a cloud mask has a negligible impact on desert areas where clear conditions dominate. The exploitation of the albedo seasonal variation for cloud removal has good potential but needs to be carefully addressed. Nevertheless, it is shown that the inclusion of a cloud masking and removal strategy is a key element in the generation of the next MSA CDR release.
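
    As a rough sketch of the second, post-processing step, the fragment below flags retrievals that rise far above a clear-sky background albedo, which in this context usually signals an undetected cloud; the function, threshold, and relative-increase test are illustrative assumptions and not the operational MSA algorithm, which also exploits the seasonal variation of the albedo.

    ```python
    import numpy as np

    def screen_cloud_outliers(albedo_series, background_albedo, max_rel_increase=0.5):
        """Hypothetical post-processing screen: flag retrievals that exceed the
        background (clear-sky) albedo by more than max_rel_increase, since residual
        clouds brighten the scene. Thresholds are illustrative only."""
        albedo_series = np.asarray(albedo_series, dtype=float)
        flagged = albedo_series > background_albedo * (1.0 + max_rel_increase)
        cleaned = np.where(flagged, np.nan, albedo_series)   # screened-out retrievals
        return cleaned, flagged
    ```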

  20. How to distinguish between cloudy mini-Neptunes and water/volatile-dominated super-Earths

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benneke, Björn; Seager, Sara, E-mail: bbenneke@mit.edu

    One of the most profound questions about the newly discovered class of low-density super-Earths is whether these exoplanets are predominately H2-dominated mini-Neptunes or volatile-rich worlds with gas envelopes dominated by H2O, CO2, CO, CH4, or N2. Transit observations of the super-Earth GJ 1214b rule out cloud-free H2-dominated scenarios, but are not able to determine whether the lack of deep spectral features is due to high-altitude clouds or the presence of a high mean molecular mass atmosphere. Here, we demonstrate that one can unambiguously distinguish between cloudy mini-Neptunes and volatile-dominated worlds based on wing steepness and relative depths of absorption features in moderate-resolution near-infrared transmission spectra (R ∼ 100). In a numerical retrieval study, we show for GJ 1214b that an unambiguous distinction between a cloudy H2-dominated atmosphere and cloud-free H2O atmosphere will be possible if the uncertainties in the spectral transit depth measurements can be reduced by a factor of ∼3 compared to the published Hubble Space Telescope Wide-Field Camera 3 and Very Large Telescope transit observations by Berta et al. and Bean et al. We argue that the required precision for the distinction may be achievable with currently available instrumentation by stacking 10-15 repeated transit observations. We provide a scaling law that scales our quantitative results to other transiting super-Earths and Neptunes such as HD 97658b, 55 Cnc e, GJ 3470b and GJ 436b. The analysis in this work is performed using an improved version of our Bayesian atmospheric retrieval framework. The new framework not only constrains the gas composition and cloud/haze parameters, but also determines our confidence in having detected molecules and cloud/haze species through Bayesian model comparison. Using the Bayesian tool, we demonstrate quantitatively that the subtle transit depth variation in the Berta et al. data is not sufficient to claim the detection of water absorption.

  1. Parallel Processing of Big Point Clouds Using Z-Order Partitioning

    NASA Astrophysics Data System (ADS)

    Alis, C.; Boehm, J.; Liu, K.

    2016-06-01

    As laser scanning technology improves and costs come down, the amount of point cloud data being generated can be prohibitively difficult and expensive to process on a single machine. This data explosion is not limited to point cloud data. Voluminous amounts of high-dimensionality and quickly accumulating data, collectively known as Big Data, such as those generated by social media, Internet of Things devices and commercial transactions, are becoming more prevalent as well. New computing paradigms and frameworks are being developed to efficiently handle the processing of Big Data, many of which utilize a compute cluster composed of several commodity grade machines to process chunks of data in parallel. A central concept in many of these frameworks is data locality. By its nature, Big Data is large enough that the entire dataset would not fit on the memory and hard drives of a single node; hence, replicating the entire dataset to each worker node is impractical. The data must then be partitioned across worker nodes in a manner that minimises data transfer across the network. This is a challenge for point cloud data because there exist different ways to partition data and they may require data transfer. We propose a partitioning based on Z-order, which is a form of locality-sensitive hashing. The Z-order or Morton code is computed by dividing each dimension to form a grid and then interleaving the binary representation of each dimension. For example, the Z-order code for the grid square with coordinates (x = 1 = 01₂, y = 3 = 11₂) is 1011₂ = 11. The number of points in each partition is controlled by the number of bits per dimension: the more bits, the fewer the points. The number of bits per dimension also controls the level of detail, with more bits yielding finer partitioning. We present this partitioning method by implementing it on Apache Spark and investigating how different parameters affect the accuracy and running time of the k nearest neighbour algorithm for a hemispherical and a triangular wave point cloud.
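
    The bit-interleaving step is small enough to spell out. The sketch below (not the authors' Spark code) reproduces the worked example from the abstract and shows how using fewer bits per dimension yields coarser partitions.

    ```python
    def morton_code(x, y, bits_per_dim):
        """Interleave the bits of integer grid coordinates x and y (Z-order code)."""
        code = 0
        for i in range(bits_per_dim):
            code |= ((x >> i) & 1) << (2 * i)        # x bits occupy even positions
            code |= ((y >> i) & 1) << (2 * i + 1)    # y bits occupy odd positions
        return code

    # Example from the abstract: x = 1 (binary 01), y = 3 (binary 11) -> 1011 = 11
    assert morton_code(1, 3, bits_per_dim=2) == 0b1011 == 11

    # Fewer bits per dimension -> coarser grid -> larger partitions
    partition_key = morton_code(1, 3, bits_per_dim=1)   # 0b11 = 3
    ```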

  2. Guidelines for Dealing with Faculty Conflicts of Commitment and Conflicts of Interest in Research.

    ERIC Educational Resources Information Center

    Academic Medicine, 1990

    1990-01-01

    Incidents of scientists allowing personal or outside interests to cloud their professional judgment in conducting research are alarming and unacceptable. The Association of American Medical Colleges' Ad Hoc Committee on Misconduct and Conflict of Interest in Research offers a conceptual framework and defines institutional and individual…

  3. REEF: Retainable Evaluator Execution Framework

    PubMed Central

    Weimer, Markus; Chen, Yingda; Chun, Byung-Gon; Condie, Tyson; Curino, Carlo; Douglas, Chris; Lee, Yunseong; Majestro, Tony; Malkhi, Dahlia; Matusevych, Sergiy; Myers, Brandon; Narayanamurthy, Shravan; Ramakrishnan, Raghu; Rao, Sriram; Sears, Russell; Sezgin, Beysim; Wang, Julia

    2015-01-01

    Resource Managers like Apache YARN have emerged as a critical layer in the cloud computing system stack, but the developer abstractions for leasing cluster resources and instantiating application logic are very low-level. This flexibility comes at a high cost in terms of developer effort, as each application must repeatedly tackle the same challenges (e.g., fault-tolerance, task scheduling and coordination) and re-implement common mechanisms (e.g., caching, bulk-data transfers). This paper presents REEF, a development framework that provides a control-plane for scheduling and coordinating task-level (data-plane) work on cluster resources obtained from a Resource Manager. REEF provides mechanisms that facilitate resource re-use for data caching, and state management abstractions that greatly ease the development of elastic data processing work-flows on cloud platforms that support a Resource Manager service. REEF is being used to develop several commercial offerings such as the Azure Stream Analytics service. Furthermore, we demonstrate REEF development of a distributed shell application, a machine learning algorithm, and a port of the CORFU [4] system. REEF is also currently an Apache Incubator project that has attracted contributors from several institutions. PMID:26819493

  4. Optimizing SIEM Throughput on the Cloud Using Parallelization.

    PubMed

    Alam, Masoom; Ihsan, Asif; Khan, Muazzam A; Javaid, Qaisar; Khan, Abid; Manzoor, Jawad; Akhundzada, Adnan; Khan, Muhammad Khurram; Farooq, Sajid

    2016-01-01

    Processing large amounts of data in real time for identifying security issues poses several performance challenges, especially when hardware infrastructure is limited. Managed Security Service Providers (MSSP), mostly hosting their applications on the Cloud, receive events at a very high rate that varies from a few hundred to a couple of thousand events per second (EPS). It is critical to process this data efficiently, so that attacks can be identified quickly and the necessary response can be initiated. This paper evaluates the performance of a security framework OSTROM built on the Esper complex event processing (CEP) engine under a parallel and a non-parallel computational framework. We explain three architectures under which Esper can be used to process events. We investigated the effect on throughput, memory and CPU usage in each configuration setting. The results indicate that the performance of the engine is limited by the number of events coming in rather than the queries being processed. The architecture where 1/4th of the total events are submitted to each instance and all the queries are processed by all the units shows the best results in terms of throughput, memory and CPU usage.
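
    As a generic illustration of the best-performing architecture (each instance receives a share of the stream while evaluating the full query set), the sketch below fans events out evenly to n workers; it is not the paper's OSTROM/Esper implementation, which is Java-based.

    ```python
    from itertools import cycle

    def partition_events(events, n_workers=4):
        """Distribute incoming events round-robin so each worker instance receives
        roughly 1/n of the stream; every worker then runs all queries on its share."""
        buckets = [[] for _ in range(n_workers)]
        for worker_id, event in zip(cycle(range(n_workers)), events):
            buckets[worker_id].append(event)
        return buckets
    ```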

  5. Cortical Surface Registration for Image-Guided Neurosurgery Using Laser-Range Scanning

    PubMed Central

    Sinha, Tuhin K.; Cash, David M.; Galloway, Robert L.; Weil, Robert J.

    2013-01-01

    In this paper, a method of acquiring intraoperative data using a laser range scanner (LRS) is presented within the context of model-updated image-guided surgery. Registering textured point clouds generated by the LRS to tomographic data is explored using established point-based and surface techniques as well as a novel method that incorporates geometry and intensity information via mutual information (SurfaceMI). Phantom registration studies were performed to examine accuracy and robustness for each framework. In addition, an in vivo registration is performed to demonstrate feasibility of the data acquisition system in the operating room. Results indicate that SurfaceMI performed better in many cases than point-based (PBR) and iterative closest point (ICP) methods for registration of textured point clouds. Mean target registration error (TRE) for simulated deep tissue targets in a phantom were 1.0 ± 0.2, 2.0 ± 0.3, and 1.2 ± 0.3 mm for PBR, ICP, and SurfaceMI, respectively. With regard to in vivo registration, the mean TRE of vessel contour points for each framework was 1.9 ± 1.0, 0.9 ± 0.6, and 1.3 ± 0.5 mm for PBR, ICP, and SurfaceMI, respectively. The methods discussed in this paper, in conjunction with the quantitative data, provide impetus for using LRS technology within the model-updated image-guided surgery framework. PMID:12906252
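
    For readers unfamiliar with the baseline point-based registration (PBR) step, a minimal closed-form rigid alignment of corresponding point sets (Arun/Horn style) looks like the sketch below; it is a generic illustration under the assumption of known correspondences, not the authors' implementation.

    ```python
    import numpy as np

    def rigid_register(source, target):
        """Closed-form rigid registration of corresponding 3D point sets.
        source, target: (N, 3) arrays. Returns R, t with R @ source[i] + t ~ target[i]."""
        source, target = np.asarray(source, float), np.asarray(target, float)
        src_c = source - source.mean(axis=0)
        tgt_c = target - target.mean(axis=0)
        H = src_c.T @ tgt_c                         # cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = target.mean(axis=0) - R @ source.mean(axis=0)
        return R, t
    ```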

  6. Towards real-time photon Monte Carlo dose calculation in the cloud

    NASA Astrophysics Data System (ADS)

    Ziegenhein, Peter; Kozin, Igor N.; Kamerling, Cornelis Ph; Oelfke, Uwe

    2017-06-01

    Near real-time application of Monte Carlo (MC) dose calculation in clinic and research is hindered by the long computational runtimes of established software. Currently, fast MC software solutions are available utilising accelerators such as graphical processing units (GPUs) or clusters based on central processing units (CPUs). Both platforms are expensive in terms of purchase costs and maintenance and, in the case of the GPU, provide only limited scalability. In this work we propose a cloud-based MC solution, which offers high scalability of accurate photon dose calculations. The MC simulations run on a private virtual supercomputer that is formed in the cloud. Computational resources can be provisioned dynamically at low cost without upfront investment in expensive hardware. A client-server software solution has been developed which controls the simulations and transports data to and from the cloud efficiently and securely. The client application integrates seamlessly into a treatment planning system. It runs the MC simulation workflow automatically and securely exchanges simulation data with the server side application that controls the virtual supercomputer. Advanced encryption standards were used to add an additional security layer, which encrypts and decrypts patient data on-the-fly at the processor register level. We show that our cloud-based MC framework enables near real-time dose computation. It delivers excellent linear scaling for high-resolution datasets with absolute runtimes of 1.1 seconds to 10.9 seconds for simulating a clinical prostate and liver case up to 1% statistical uncertainty. The computation runtimes include the transportation of data to and from the cloud as well as process scheduling and synchronisation overhead. Cloud-based MC simulations offer a fast, affordable and easily accessible alternative for near real-time accurate dose calculations to currently used GPU or cluster solutions.

  7. Dynamic partitioning as a way to exploit new computing paradigms: the cloud use case.

    NASA Astrophysics Data System (ADS)

    Ciaschini, Vincenzo; Dal Pra, Stefano; dell'Agnello, Luca

    2015-12-01

    The WLCG community and many groups in the HEP community have based their computing strategy on the Grid paradigm, which proved successful and still ensures its goals. However, Grid technology has not spread much over other communities; in the commercial world, the cloud paradigm is the emerging way to provide computing services. WLCG experiments aim to integrate their existing computing model with cloud deployments and take advantage of so-called opportunistic resources (including HPC facilities), which are usually not Grid compliant. One missing feature in the most common cloud frameworks is the concept of a job scheduler, which plays a key role in a traditional computing centre by enabling fair-share based access to the resources for the experiments in a scenario where demand greatly outstrips availability. At CNAF we are investigating the possibility to access the Tier-1 computing resources as an OpenStack based cloud service. The system, exploiting the dynamic partitioning mechanism already being used to enable multicore computing, allowed us to avoid a static splitting of the computing resources in the Tier-1 farm, while permitting a share-friendly approach. The hosts in a dynamically partitioned farm may be moved to or from the partition, according to suitable policies for request and release of computing resources. Hosts requested into a partition switch their role and become available to play a different one. In the cloud use case, hosts may switch from acting as a worker node in the batch system farm to a cloud compute node made available to tenants. In this paper we describe the dynamic partitioning concept, its implementation, and its integration with our current batch system, LSF.
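
    The partitioning policy can be pictured as a small control loop that moves whole hosts between the batch (LSF) side and the cloud (OpenStack) side according to pending demand. The toy sketch below is an assumption-laden illustration of that idea, not CNAF's production code; the names and the min_batch floor are invented for the example.

    ```python
    def rebalance(batch_pending, cloud_pending, batch_hosts, cloud_hosts, min_batch=10):
        """Move one host per cycle towards whichever partition has unmet demand,
        never shrinking the batch farm below min_batch worker nodes."""
        if cloud_pending > 0 and batch_pending == 0 and len(batch_hosts) > min_batch:
            cloud_hosts.append(batch_hosts.pop())   # drain a worker node, hand it to the cloud
        elif batch_pending > 0 and cloud_pending == 0 and cloud_hosts:
            batch_hosts.append(cloud_hosts.pop())   # reclaim a compute node for the batch farm
        return batch_hosts, cloud_hosts
    ```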

  8. Towards real-time photon Monte Carlo dose calculation in the cloud.

    PubMed

    Ziegenhein, Peter; Kozin, Igor N; Kamerling, Cornelis Ph; Oelfke, Uwe

    2017-06-07

    Near real-time application of Monte Carlo (MC) dose calculation in clinic and research is hindered by the long computational runtimes of established software. Currently, fast MC software solutions are available utilising accelerators such as graphical processing units (GPUs) or clusters based on central processing units (CPUs). Both platforms are expensive in terms of purchase costs and maintenance and, in case of the GPU, provide only limited scalability. In this work we propose a cloud-based MC solution, which offers high scalability of accurate photon dose calculations. The MC simulations run on a private virtual supercomputer that is formed in the cloud. Computational resources can be provisioned dynamically at low cost without upfront investment in expensive hardware. A client-server software solution has been developed which controls the simulations and transports data to and from the cloud efficiently and securely. The client application integrates seamlessly into a treatment planning system. It runs the MC simulation workflow automatically and securely exchanges simulation data with the server side application that controls the virtual supercomputer. Advanced encryption standards were used to add an additional security layer, which encrypts and decrypts patient data on-the-fly at the processor register level. We could show that our cloud-based MC framework enables near real-time dose computation. It delivers excellent linear scaling for high-resolution datasets with absolute runtimes of 1.1 seconds to 10.9 seconds for simulating a clinical prostate and liver case up to 1% statistical uncertainty. The computation runtimes include the transportation of data to and from the cloud as well as process scheduling and synchronisation overhead. Cloud-based MC simulations offer a fast, affordable and easily accessible alternative for near real-time accurate dose calculations to currently used GPU or cluster solutions.

  9. Influence of Convection and Aerosol Pollution on Ice Cloud Particle Effective Radius

    NASA Technical Reports Server (NTRS)

    Jiang, J. H.; Su, H.; Zhai, C.; Massie, S. T.; Schoeberl, M. R.; Colarco, P. R.; Platnick, S.; Gu, Y.; Liou, K.-N.

    2011-01-01

    Satellite observations show that ice cloud effective radius (r(sub e)) increases with ice water content (IWC) but decreases with aerosol optical thickness (AOT). Using least-squares fitting to the observed data, we obtain an analytical formula to describe the variations of r(sub e) with IWC and AOT for several regions with distinct characteristics of r(sub e) -IWC-AOT relationships. As IWC directly relates to convective strength and AOT represents aerosol loading, our empirical formula provides a means to quantify the relative roles of dynamics and aerosols in controlling r(sub e) in different geographical regions, and to establish a framework for parameterization of aerosol effects on r(sub e) in climate models.
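
    To make the fitting step concrete, the sketch below performs a least-squares fit of r_e against IWC and AOT using an assumed log-linear form; the functional form and coefficients derived in the paper differ, so this is illustrative only.

    ```python
    import numpy as np

    def fit_re(iwc, aot, re_obs):
        """Illustrative least-squares fit r_e ~ a + b*ln(IWC) + c*ln(AOT).
        Per the reported relationships, expect b > 0 (r_e grows with IWC)
        and c < 0 (r_e shrinks with AOT)."""
        iwc, aot, re_obs = (np.asarray(v, dtype=float) for v in (iwc, aot, re_obs))
        A = np.column_stack([np.ones_like(iwc), np.log(iwc), np.log(aot)])
        coeffs, *_ = np.linalg.lstsq(A, re_obs, rcond=None)
        return coeffs   # (a, b, c)
    ```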

  10. Environments for online maritime simulators with cloud computing capabilities

    NASA Astrophysics Data System (ADS)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

    This paper presents the cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve a good simulation quality in large networked environments using open source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. We have to deal with a multiprocessing situation, using advanced technologies and distributed applications that involve remote ship scenarios and the automation of ship operations.

  11. Numerical studies of a model fermion-boson system

    NASA Astrophysics Data System (ADS)

    Cheng, T.; Gospodarczyk, E. R.; Su, Q.; Grobe, R.

    2010-02-01

    We study the spectral and dynamical properties of a simplified model system of interacting fermions and bosons. The spatial discretization and an effective truncation of the Hilbert space permit us to compute the distribution of the bare fermions and bosons in the energy eigenstates of the coupled system. These states represent the physical particles and are used to examine the validity of the analytical predictions by perturbation theory and by the Greenberg-Schweber approximation that assumes all fermions are at rest. As an example of our numerical framework, we examine how a bare electron can trigger the creation of a cloud of virtual bosons around it. We relate this cloud to the properties of the associated energy eigenstates.

  12. A remote sensing method for estimating regional reservoir area and evaporative loss

    DOE PAGES

    Zhang, Hua; Gorelick, Steven M.; Zimba, Paul V.; ...

    2017-10-07

    Evaporation from the water surface of a reservoir can significantly affect its function of ensuring the availability and temporal stability of water supply. Current estimations of reservoir evaporative loss are dependent on water area derived from a reservoir storage-area curve. Such curves are unavailable if the reservoir is located in a data-sparse region or questionable if long-term sedimentation has changed the original elevation-area relationship. In this paper, we propose a remote sensing framework to estimate reservoir evaporative loss at the regional scale. This framework uses a multispectral water index to extract reservoir area from Landsat imagery and estimate monthly evaporation volume based on pan-derived evaporative rates. The optimal index threshold is determined based on local observations and extended to unobserved locations and periods. Built on the cloud computing capacity of the Google Earth Engine, this framework can efficiently analyze satellite images at large spatiotemporal scales, where such analysis is infeasible with a single computer. Our study involves 200 major reservoirs in Texas, captured in 17,811 Landsat images over a 32-year period. The results show that these reservoirs contribute to an annual evaporative loss of 8.0 billion cubic meters, equivalent to 20% of their total active storage or 53% of total annual water use in Texas. At five coastal basins, reservoir evaporative losses exceed the minimum freshwater inflows required to sustain ecosystem health and fishery productivity of the receiving estuaries. Reservoir evaporative loss can be significant enough to counterbalance the positive effects of impounding water and to offset the contribution of water conservation and reuse practices. Our results also reveal the spatially variable performance of the multispectral water index and indicate the limitation of using scene-level cloud cover to screen satellite images. Finally, this study demonstrates the advantage of combining satellite remote sensing and cloud computing to support regional water resources assessment.

  13. A remote sensing method for estimating regional reservoir area and evaporative loss

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; Gorelick, Steven M.; Zimba, Paul V.; Zhang, Xiaodong

    2017-12-01

    Evaporation from the water surface of a reservoir can significantly affect its function of ensuring the availability and temporal stability of water supply. Current estimations of reservoir evaporative loss are dependent on water area derived from a reservoir storage-area curve. Such curves are unavailable if the reservoir is located in a data-sparse region or questionable if long-term sedimentation has changed the original elevation-area relationship. We propose a remote sensing framework to estimate reservoir evaporative loss at the regional scale. This framework uses a multispectral water index to extract reservoir area from Landsat imagery and estimate monthly evaporation volume based on pan-derived evaporative rates. The optimal index threshold is determined based on local observations and extended to unobserved locations and periods. Built on the cloud computing capacity of the Google Earth Engine, this framework can efficiently analyze satellite images at large spatiotemporal scales, where such analysis is infeasible with a single computer. Our study involves 200 major reservoirs in Texas, captured in 17,811 Landsat images over a 32-year period. The results show that these reservoirs contribute to an annual evaporative loss of 8.0 billion cubic meters, equivalent to 20% of their total active storage or 53% of total annual water use in Texas. At five coastal basins, reservoir evaporative losses exceed the minimum freshwater inflows required to sustain ecosystem health and fishery productivity of the receiving estuaries. Reservoir evaporative loss can be significant enough to counterbalance the positive effects of impounding water and to offset the contribution of water conservation and reuse practices. Our results also reveal the spatially variable performance of the multispectral water index and indicate the limitation of using scene-level cloud cover to screen satellite images. This study demonstrates the advantage of combining satellite remote sensing and cloud computing to support regional water resources assessment.
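
    The per-scene calculation reduces to classifying water with a spectral index, summing the water area, and converting a pan-derived evaporation depth into a volume. The sketch below illustrates this with MNDWI on plain arrays; the index choice, threshold, and variable names are assumptions, and the actual study runs the equivalent logic on the Google Earth Engine.

    ```python
    import numpy as np

    def monthly_evaporative_loss(green, swir, pixel_area_m2, pan_evap_m, threshold=0.25):
        """Classify water pixels with a multispectral water index (here MNDWI),
        sum their area, and multiply by a pan-derived monthly evaporation depth (m)."""
        green, swir = np.asarray(green, float), np.asarray(swir, float)
        mndwi = (green - swir) / (green + swir + 1e-9)
        water_area_m2 = np.count_nonzero(mndwi > threshold) * pixel_area_m2
        return water_area_m2 * pan_evap_m            # evaporated volume in cubic metres
    ```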

  14. A remote sensing method for estimating regional reservoir area and evaporative loss

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Hua; Gorelick, Steven M.; Zimba, Paul V.

    Evaporation from the water surface of a reservoir can significantly affect its function of ensuring the availability and temporal stability of water supply. Current estimations of reservoir evaporative loss are dependent on water area derived from a reservoir storage-area curve. Such curves are unavailable if the reservoir is located in a data-sparse region or questionable if long-term sedimentation has changed the original elevation-area relationship. In this paper, we propose a remote sensing framework to estimate reservoir evaporative loss at the regional scale. This framework uses a multispectral water index to extract reservoir area from Landsat imagery and estimate monthly evaporation volume based on pan-derived evaporative rates. The optimal index threshold is determined based on local observations and extended to unobserved locations and periods. Built on the cloud computing capacity of the Google Earth Engine, this framework can efficiently analyze satellite images at large spatiotemporal scales, where such analysis is infeasible with a single computer. Our study involves 200 major reservoirs in Texas, captured in 17,811 Landsat images over a 32-year period. The results show that these reservoirs contribute to an annual evaporative loss of 8.0 billion cubic meters, equivalent to 20% of their total active storage or 53% of total annual water use in Texas. At five coastal basins, reservoir evaporative losses exceed the minimum freshwater inflows required to sustain ecosystem health and fishery productivity of the receiving estuaries. Reservoir evaporative loss can be significant enough to counterbalance the positive effects of impounding water and to offset the contribution of water conservation and reuse practices. Our results also reveal the spatially variable performance of the multispectral water index and indicate the limitation of using scene-level cloud cover to screen satellite images. Finally, this study demonstrates the advantage of combining satellite remote sensing and cloud computing to support regional water resources assessment.

  15. Land-Atmosphere Coupling in the Multi-Scale Modelling Framework

    NASA Astrophysics Data System (ADS)

    Kraus, P. M.; Denning, S.

    2015-12-01

    The Multi-Scale Modeling Framework (MMF), in which cloud-resolving models (CRMs) are embedded within general circulation model (GCM) gridcells to serve as the model's cloud parameterization, has offered a number of benefits to GCM simulations. The coupling of these cloud-resolving models directly to land surface model instances, rather than passing averaged atmospheric variables to a single instance of a land surface model, the logical next step in model development, has recently been accomplished. This new configuration offers conspicuous improvements to estimates of precipitation and canopy through-fall, but overall the model exhibits warm surface temperature biases and low productivity. This work presents modifications to a land-surface model that take advantage of the new multi-scale modeling framework and accommodate the change in spatial scale from a typical GCM range of ~200 km to the CRM grid-scale of 4 km. A parameterization is introduced to apportion modeled surface radiation into direct-beam and diffuse components. The diffuse component is then distributed among the land-surface model instances within each GCM cell domain. This substantially reduces the number of excessively low light values provided to the land-surface model when cloudy conditions are modeled in the CRM, which are associated with its 1-D radiation scheme. The small spatial scale of the CRM, ~4 km, as compared with the typical ~200 km GCM scale, provides much more realistic estimates of precipitation intensity; this permits the elimination of a model parameterization of canopy through-fall. However, runoff at such scales can no longer be considered as an immediate flow to the ocean. Allowing sub-surface water flow between land-surface instances within the GCM domain affords better realism and also reduces temperature and productivity biases. The MMF affords a number of opportunities to land-surface modelers, providing both the advantages of direct simulation at the 4 km scale and a much reduced conceptual gap between model resolution and parameterized processes.
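
    The radiation modification can be summarised as: split each CRM column's surface shortwave into direct and diffuse parts, then share the diffuse part across all land-surface instances in the GCM cell so that columns under simulated cloud are not starved of light. The sketch below is a schematic rendering of that idea with an assumed uniform redistribution rule, not the actual parameterization.

    ```python
    import numpy as np

    def distribute_diffuse(sw_down_crm, diffuse_fraction):
        """sw_down_crm: surface shortwave per CRM column in one GCM cell (W m-2).
        Keep the direct beam local, but average the diffuse component over the cell."""
        sw_down_crm = np.asarray(sw_down_crm, dtype=float)
        diffuse = sw_down_crm * diffuse_fraction
        direct = sw_down_crm - diffuse
        return direct + np.full_like(sw_down_crm, diffuse.mean())
    ```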

  16. Cloud-based distributed control of unmanned systems

    NASA Astrophysics Data System (ADS)

    Nguyen, Kim B.; Powell, Darren N.; Yetman, Charles; August, Michael; Alderson, Susan L.; Raney, Christopher J.

    2015-05-01

    Enabling warfighters to efficiently and safely execute dangerous missions, unmanned systems have been an increasingly valuable component in modern warfare. The evolving use of unmanned systems leads to vast amounts of data collected from sensors placed on the remote vehicles. As a result, many command and control (C2) systems have been developed to provide the necessary tools to perform one of the following functions: controlling the unmanned vehicle or analyzing and processing the sensory data from unmanned vehicles. These C2 systems are often disparate from one another, limiting the ability to optimally distribute data among different users. The Space and Naval Warfare Systems Center Pacific (SSC Pacific) seeks to address this technology gap through the UxV to the Cloud via Widgets project. The overarching intent of this three year effort is to provide three major capabilities: 1) unmanned vehicle control using an open service oriented architecture; 2) data distribution utilizing cloud technologies; 3) a collection of web-based tools enabling analysts to better view and process data. This paper focuses on how the UxV to the Cloud via Widgets system is designed and implemented by leveraging the following technologies: Data Distribution Service (DDS), Accumulo, Hadoop, and Ozone Widget Framework (OWF).

  17. Reconstruction of Consistent 3d CAD Models from Point Cloud Data Using a Priori CAD Models

    NASA Astrophysics Data System (ADS)

    Bey, A.; Chaine, R.; Marc, R.; Thibault, G.; Akkouche, S.

    2011-09-01

    We address the reconstruction of 3D CAD models from point cloud data acquired in industrial environments, using a pre-existing 3D model as an initial estimate of the scene to be processed. Indeed, this prior knowledge can be used to drive the reconstruction so as to generate an accurate 3D model matching the point cloud. We focus our work in particular on the cylindrical parts of the 3D models. We propose to state the problem in a probabilistic framework: we search for the 3D model which maximizes a probability taking several constraints into account, such as the relevancy with respect to the point cloud and the a priori 3D model, and the consistency of the reconstructed model. The resulting optimization problem can then be handled using a stochastic exploration of the solution space, based on the random insertion of elements in the configuration under construction, coupled with a greedy management of conflicts which efficiently improves the configuration at each step. We show that this approach provides reliable reconstructed 3D models by presenting results on industrial data sets.

  18. Segmentation of Large Unstructured Point Clouds Using Octree-Based Region Growing and Conditional Random Fields

    NASA Astrophysics Data System (ADS)

    Bassier, M.; Bonduel, M.; Van Genechten, B.; Vergauwen, M.

    2017-11-01

    Point cloud segmentation is a crucial step in scene understanding and interpretation. The goal is to decompose the initial data into sets of workable clusters with similar properties. Additionally, it is a key aspect in the automated procedure from point cloud data to BIM. Current approaches typically segment only a single type of primitive, such as planes or cylinders. Also, current algorithms suffer from oversegmenting the data and are often sensor or scene dependent. In this work, a method is presented to automatically segment large unstructured point clouds of buildings. More specifically, the segmentation is formulated as a graph optimisation problem. First, the data is oversegmented with a greedy octree-based region growing method. The growing is conditioned on the segmentation of planes as well as smooth surfaces. Next, the candidate clusters are represented by a Conditional Random Field, after which the most likely configuration of candidate clusters is computed given a set of local and contextual features. The experiments prove that the proposed method is a fast and reliable framework for unstructured point cloud segmentation. Processing speeds of up to 40,000 points per second are recorded for the region growing. Additionally, the recall and precision of the graph clustering are approximately 80%. Overall, oversegmentation is reduced by nearly 22% by clustering the data. These clusters will be classified and used as a basis for the reconstruction of BIM models.
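
    To illustrate the first, oversegmentation stage, the sketch below grows regions greedily from seed points whenever a neighbour's normal is close to the seed's normal; it uses a k-d tree neighbourhood and illustrative thresholds rather than the paper's octree, and it omits the CRF stage entirely.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def region_grow(points, normals, radius=0.1, angle_thresh_deg=10.0):
        """Greedy region growing on unit normals; label -1 means unassigned."""
        points, normals = np.asarray(points, float), np.asarray(normals, float)
        tree = cKDTree(points)
        cos_thresh = np.cos(np.deg2rad(angle_thresh_deg))
        labels = np.full(len(points), -1, dtype=int)
        region_id = 0
        for seed in range(len(points)):
            if labels[seed] != -1:
                continue
            stack, labels[seed] = [seed], region_id
            while stack:
                idx = stack.pop()
                for nb in tree.query_ball_point(points[idx], radius):
                    if labels[nb] == -1 and abs(np.dot(normals[nb], normals[seed])) > cos_thresh:
                        labels[nb] = region_id
                        stack.append(nb)
            region_id += 1
        return labels
    ```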

  19. Radar observations of individual rain drops in the free atmosphere

    PubMed Central

    Schmidt, Jerome M.; Flatau, Piotr J.; Harasti, Paul R.; Yates, Robert D.; Littleton, Ricky; Pritchard, Michael S.; Fischer, Jody M.; Fischer, Erin J.; Kohri, William J.; Vetter, Jerome R.; Richman, Scott; Baranowski, Dariusz B.; Anderson, Mark J.; Fletcher, Ed; Lando, David W.

    2012-01-01

    Atmospheric remote sensing has played a pivotal role in the increasingly sophisticated representation of clouds in the numerical models used to assess global and regional climate change. This has been accomplished because the underlying bulk cloud properties can be derived from a statistical analysis of the returned microwave signals scattered by a diverse ensemble comprised of numerous cloud hydrometeors. A new Doppler radar, previously used to track small debris particles shed from the NASA space shuttle during launch, is shown to also have the capacity to detect individual cloud hydrometeors in the free atmosphere. Similar to the traces left behind on film by subatomic particles, larger cloud particles were observed to leave a well-defined radar signature (or streak), which could be analyzed to infer the underlying particle properties. We examine the unique radar and environmental conditions leading to the formation of the radar streaks and develop a theoretical framework which reveals the regulating role of the background radar reflectivity on their observed characteristics. This main expectation from theory is examined through an analysis of the drop properties inferred from radar and in situ aircraft measurements obtained in two contrasting regions of an observed multicellular storm system. The observations are placed in context of the parent storm circulation through the use of the radar’s unique high-resolution waveforms, which allow the bulk and individual hydrometeor properties to be inferred at the same time. PMID:22652569

  20. Radar observations of individual rain drops in the free atmosphere.

    PubMed

    Schmidt, Jerome M; Flatau, Piotr J; Harasti, Paul R; Yates, Robert D; Littleton, Ricky; Pritchard, Michael S; Fischer, Jody M; Fischer, Erin J; Kohri, William J; Vetter, Jerome R; Richman, Scott; Baranowski, Dariusz B; Anderson, Mark J; Fletcher, Ed; Lando, David W

    2012-06-12

    Atmospheric remote sensing has played a pivotal role in the increasingly sophisticated representation of clouds in the numerical models used to assess global and regional climate change. This has been accomplished because the underlying bulk cloud properties can be derived from a statistical analysis of the returned microwave signals scattered by a diverse ensemble comprised of numerous cloud hydrometeors. A new Doppler radar, previously used to track small debris particles shed from the NASA space shuttle during launch, is shown to also have the capacity to detect individual cloud hydrometeors in the free atmosphere. Similar to the traces left behind on film by subatomic particles, larger cloud particles were observed to leave a well-defined radar signature (or streak), which could be analyzed to infer the underlying particle properties. We examine the unique radar and environmental conditions leading to the formation of the radar streaks and develop a theoretical framework which reveals the regulating role of the background radar reflectivity on their observed characteristics. This main expectation from theory is examined through an analysis of the drop properties inferred from radar and in situ aircraft measurements obtained in two contrasting regions of an observed multicellular storm system. The observations are placed in context of the parent storm circulation through the use of the radar's unique high-resolution waveforms, which allow the bulk and individual hydrometeor properties to be inferred at the same time.

  1. Integration of Cloud Technologies for Data Stewardship at the NOAA National Centers for Environmental Information (NCEI)

    NASA Astrophysics Data System (ADS)

    Casey, K. S.; Hausman, S. A.

    2016-02-01

    In the last year, the NOAA National Oceanographic Data Center (NODC) and its siblings, the National Climatic Data Center and National Geophysical Data Center, were merged into one organization, the NOAA National Centers for Environmental Information (NCEI). Combining its expertise under one management has helped NCEI accelerate its efforts to embrace and integrate private, public, and hybrid cloud environments into its range of data stewardship services. These services span a range of tiers, from basic, long-term preservation and access, through enhanced access and scientific quality control, to authoritative product development and international-level services. Throughout these tiers of stewardship, partnerships and pilot projects have been launched to identify technological and policy-oriented challenges, to establish solutions to these problems, and to highlight success stories for emulation during operational integration of the cloud into NCEI's data stewardship activities. Some of these pilot activities include data storage, access, and reprocessing in Amazon Web Services, the OneStop data discovery and access framework project, and a set of Cooperative Research and Development Agreements under the Big Data Project with Amazon, Google, IBM, Microsoft, and the Open Cloud Consortium. Progress in these efforts will be highlighted along with a future vision of how NCEI could leverage hybrid cloud deployments and federated systems across NOAA to enable effective data stewardship for its oceanographic, atmospheric, climatic, and geophysical Big Data.

  2. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    NASA Astrophysics Data System (ADS)

    Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.

    2011-12-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost-structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand", as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and we conclude that it is most cost effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.

  3. Automatic registration of Iphone images to LASER point clouds of the urban structures using shape features

    NASA Astrophysics Data System (ADS)

    Sirmacek, B.; Lindenbergh, R. C.; Menenti, M.

    2013-10-01

    Fusion of 3D airborne laser (LIDAR) data and terrestrial optical imagery can be applied in 3D urban modeling and model up-dating. The most challenging aspect of the fusion procedure is registering the terrestrial optical images on the LIDAR point clouds. In this article, we propose an approach for registering these two kinds of data from different sensor sources. We use iPhone camera images taken by the application user in front of the urban structure of interest, and high-resolution LIDAR point clouds acquired by an airborne laser sensor. After finding the photo capturing position and orientation from the iPhone photograph metafile, we automatically select the area of interest in the point cloud and transform it into a range image which has only grayscale intensity levels according to the distance from the image acquisition position. We benefit from local features for registering the iPhone image to the generated range image. In this article, we apply the registration process based on local feature extraction and graph matching. Finally, the registration result is used for facade texture mapping on the 3D building surface mesh which is generated from the LIDAR point cloud. Our experimental results indicate possible usage of the proposed algorithm framework for 3D urban map updating and enhancing purposes.
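
    The range-image step amounts to projecting the selected LIDAR points through an estimated camera pose and storing distance-to-camera as a grey value. The sketch below uses a pinhole model with invented parameter names; the actual implementation details (camera model, resolution, occlusion handling) are not specified in the abstract.

    ```python
    import numpy as np

    def point_cloud_to_range_image(points, cam_pos, R, f, width, height):
        """Project world points into a camera at cam_pos with rotation R (world->camera)
        and focal length f (pixels); keep the nearest range per pixel."""
        pts_cam = (np.asarray(points, float) - cam_pos) @ R.T
        pts_cam = pts_cam[pts_cam[:, 2] > 0]                     # keep points in front of the camera
        u = (f * pts_cam[:, 0] / pts_cam[:, 2] + width / 2).astype(int)
        v = (f * pts_cam[:, 1] / pts_cam[:, 2] + height / 2).astype(int)
        rng = np.linalg.norm(pts_cam, axis=1)
        img = np.full((height, width), np.inf)
        ok = (u >= 0) & (u < width) & (v >= 0) & (v < height)
        np.minimum.at(img, (v[ok], u[ok]), rng[ok])              # nearest point wins each pixel
        return img
    ```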

  4. GreenIT Service Level Agreements

    NASA Astrophysics Data System (ADS)

    von Laszewski, Gregor; Wang, Lizhe

    In this paper we introduce a framework towards the inclusion of Green IT metrics as part of service level agreements for future Grids and Clouds. As part of this effort we need to revisit Green IT metrics and proxies that we consider optimizing against, in order to develop GreenIT as a Service (GaaS) that can be reused as part of a Software as a Service (SaaS) and Infrastructure as a Service (IaaS) framework. We report on some of our ongoing efforts and demonstrate how we already achieve impact on the environment with our services.

  5. Can Condensing Organic Aerosols Lead to Less Cloud Particles?

    NASA Astrophysics Data System (ADS)

    Gao, C. Y.; Tsigaridis, K.; Bauer, S.

    2017-12-01

    We examined the impact of condensing organic aerosols on activated cloud number concentration in a new aerosol microphysics box model, MATRIX-VBS. The model includes the volatility-basis set (VBS) framework in an aerosol microphysical scheme MATRIX (Multiconfiguration Aerosol TRacker of mIXing state) that resolves aerosol mass and number concentrations and aerosol mixing state. Preliminary results show that by including the condensation of organic aerosols, the new model (MATRIX-VBS) has fewer activated particles compared to the original model (MATRIX), which treats organic aerosols as non-volatile. Parameters such as aerosol chemical composition, mass and number concentrations, and particle sizes which affect activated cloud number concentration are thoroughly evaluated via a suite of Monte-Carlo simulations. The Monte-Carlo simulations also provide information on which climate-relevant parameters play a critical role in the aerosol evolution in the atmosphere. This study also helps simplify the newly developed box model, which will soon be implemented in the global model GISS ModelE as a module.

  6. Are comets connected to the origin of life

    NASA Technical Reports Server (NTRS)

    Delsemme, A. H.

    1981-01-01

    Possible connections between comets and the origin of life on earth are discussed. The orbital evolution of comets and their origin are considered within a framework for the origin of the solar system, with particular attention given to the origin of the biosphere and the origin of the Oort cloud. Evidence suggesting that cometary nuclei are undifferentiated throughout is considered, and a model of the average composition of a mean new comet, similar to that of an interstellar frost, is obtained from observational data. The chemistry of the model composition giving rise to the species observed in cometary spectra is considered, as well as the relations of cometary to cosmic abundances of oxygen, carbon and sulfur. The characteristics of possible sites for prebiotic chemistry, including interstellar clouds, the protosolar nebula, comets in the Oort cloud, periodic comets and the primitive earth, are examined, and a possible role of comets in bringing the interstellar prebiotic chemistry to earth is suggested.

  7. Evolution of the atmospheric boundary layer in southern West Africa - an overview from the DACCIWA field campaign

    NASA Astrophysics Data System (ADS)

    Kalthoff, Norbert; Lohou, Fabienne; Brooks, Barbara; Jegede, Gbenga; Adler, Bianca; Ajao, Adewale; Ayoola, Muritala; Babić, Karmen; Bessardon, Geoffrey; Delon, Claire; Dione, Cheikh; Handwerker, Jan; Jambert, Corinne; Kohler, Martin; Lothon, Marie; Pedruzo-Bagazgoitia, Xabier; Smith, Victoria; Sunmonu, Lukman; Wieser, Andreas

    2017-04-01

    In southern West Africa, extended low-level stratus clouds form very frequently during night-time and persist long into the following day, influencing the diurnal cycle of the atmospheric boundary layer (ABL). During the course of the day, a transition from nocturnal low-level stratus to stratocumulus, cumulus, and sometimes congestus and possibly cumulonimbus clouds is observed. In June and July 2016, a ground-based field campaign took place in southern West Africa within the framework of the Dynamics-aerosol-chemistry-cloud interactions in West Africa (DACCIWA) project, with the aim to identify the meteorological controls on the stratus and the evolution of the ABL. During the measurement period, extensive remote sensing and in-situ measurements were performed at three supersites in Kumasi (Ghana), Savè (Benin) and Ile-Ife (Nigeria). We give an overview of the atmospheric conditions during the whole measurement period, focusing on the vertical and temporal distribution of the stratus and related atmospheric features.

  8. A Discrete Constraint for Entropy Conservation and Sound Waves in Cloud-Resolving Modeling

    NASA Technical Reports Server (NTRS)

    Zeng, Xi-Ping; Tao, Wei-Kuo; Simpson, Joanne

    2003-01-01

    Ideal cloud-resolving models accumulate little error. When their domain is so large that synoptic large-scale circulations are accommodated, they can be used for the simulation of the interaction between convective clouds and the large-scale circulations. This paper sets up a framework for such models, using moist entropy as a prognostic variable and employing conservative numerical schemes. The models possess no accumulative errors of thermodynamic variables when they comply with a discrete constraint on entropy conservation and sound waves. In other words, the discrete constraint is related to the correct representation of the large-scale convergence and advection of moist entropy. Since air density is involved in entropy conservation and sound waves, the challenge is how to compute sound waves efficiently under the constraint. To address the challenge, a compensation method is introduced on the basis of a reference isothermal atmosphere whose governing equations are solved analytically. Stability analysis and numerical experiments show that the method allows the models to integrate efficiently with a large time step.

  9. Parameterizing correlations between hydrometeor species in mixed-phase Arctic clouds

    NASA Astrophysics Data System (ADS)

    Larson, Vincent E.; Nielsen, Brandon J.; Fan, Jiwen; Ovchinnikov, Mikhail

    2011-01-01

    Mixed-phase Arctic clouds, like other clouds, contain small-scale variability in hydrometeor fields, such as cloud water or snow mixing ratio. This variability may be worth parameterizing in coarse-resolution numerical models. In particular, for modeling multispecies processes such as accretion and aggregation, it would be useful to parameterize subgrid correlations among hydrometeor species. However, one difficulty is that there exist many hydrometeor species and many microphysical processes, leading to complexity and computational expense. Existing lower and upper bounds on linear correlation coefficients are too loose to serve directly as a method to predict subgrid correlations. Therefore, this paper proposes an alternative method that begins with the spherical parameterization framework of Pinheiro and Bates (1996), which expresses the correlation matrix in terms of its Cholesky factorization. The values of the elements of the Cholesky matrix are populated here using a "cSigma" parameterization that we introduce based on the aforementioned bounds on correlations. The method has three advantages: (1) the computational expense is tolerable; (2) the correlations are, by construction, guaranteed to be consistent with each other; and (3) the methodology is fairly general and hence may be applicable to other problems. The method is tested noninteractively using simulations of three Arctic mixed-phase cloud cases from two field experiments: the Indirect and Semi-Direct Aerosol Campaign and the Mixed-Phase Arctic Cloud Experiment. Benchmark simulations are performed using a large-eddy simulation (LES) model that includes a bin microphysical scheme. The correlations estimated by the new method satisfactorily approximate the correlations produced by the LES.
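
    The appeal of the Cholesky-based construction is that any lower-triangular factor with unit-norm rows yields a correlation matrix that is symmetric, has a unit diagonal, and is positive semidefinite by construction. The 3x3 sketch below illustrates this property in the spirit of the Pinheiro and Bates parameterization; the angle ordering is an illustrative convention, and the cSigma rule for choosing the angles is not reproduced here.

    ```python
    import numpy as np

    def corr_from_angles(t21, t31, t32):
        """Build a valid 3x3 correlation matrix C = L @ L.T from angles (radians):
        each row of L is a unit vector, so C is guaranteed to be consistent."""
        L = np.array([
            [1.0,          0.0,                        0.0],
            [np.cos(t21),  np.sin(t21),                0.0],
            [np.cos(t31),  np.sin(t31) * np.cos(t32),  np.sin(t31) * np.sin(t32)],
        ])
        return L @ L.T
    ```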

  10. Semidirect Dynamical and Radiative Impact of North African Dust Transport on Lower Tropospheric Clouds over the Subtropical North Atlantic in CESM 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeFlorio, Mike; Ghan, Steven J.; Singh, Balwinder

    This study uses a century length pre-industrial climate simulation by the Community Earth System Model (CESM 1.0) to explore statistical relationships between dust, clouds and atmospheric circulation, and to suggest a dynamical, rather than microphysical, mechanism linking subtropical North Atlantic lower tropospheric cloud cover with North African dust transport. The length of the run allows us to account for interannual variability of dust emissions and transport downstream of North Africa in the model. CESM's mean climatology and probability distribution of aerosol optical depth in this region agrees well with available AERONET observations. In addition, CESM shows strong seasonal cycles of dust burden and lower tropospheric cloud fraction, with maximum values occurring during boreal summer, when a strong correlation between these two variables exists downstream of North Africa over the subtropical North Atlantic. Calculations of Estimated Inversion Strength (EIS) and composites of EIS on high and low downstream North Africa dust months during boreal summer reveal that dust is likely increasing inversion strength over this region due to both solar absorption and reflection. We find no evidence for a microphysical link between dust and lower tropospheric clouds in this region. These results yield new insight over an extensive period of time into the complex relationship between North African dust and lower tropospheric clouds over the open ocean, which has previously been hindered by spatiotemporal constraints of observations. Our findings lay a framework for future analyses using sub-monthly data over regions with different underlying dynamics.
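
    The compositing step described here is a simple conditional average. The sketch below contrasts mean EIS in high-dust and low-dust months; the quartile split and variable names are assumptions made for illustration, not the study's exact procedure.

    ```python
    import numpy as np

    def composite_on_dust(eis_monthly, dust_monthly, q=0.25):
        """Return mean EIS over the highest-dust and lowest-dust quantiles of months."""
        eis = np.asarray(eis_monthly, dtype=float)
        dust = np.asarray(dust_monthly, dtype=float)
        lo, hi = np.quantile(dust, [q, 1.0 - q])
        return eis[dust >= hi].mean(), eis[dust <= lo].mean()
    ```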

  11. Evolving the Land Information System into a Cloud Computing Service

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houser, Paul R.

    The Land Information System (LIS) was developed to use advanced flexible land surface modeling and data assimilation frameworks to integrate extremely large satellite- and ground-based observations with advanced land surface models to produce continuous high-resolution fields of land surface states and fluxes. The resulting fields are extremely useful for drought and flood assessment, agricultural planning, disaster management, weather and climate forecasting, water resources assessment, and the like. We envisioned transforming the LIS modeling system into a scientific cloud computing-aware web and data service that clients could easily set up and configure for use in addressing large water management issues. The focus of this Phase 1 project was to determine the scientific, technical, commercial merit and feasibility of the proposed LIS-cloud innovations that are currently barriers to broad LIS applicability. We (a) quantified the barriers to broad LIS utility and commercialization (high performance computing, big data, user interface, and licensing issues); (b) designed the proposed LIS-cloud web service, model-data interface, database services, and user interfaces; (c) constructed a prototype LIS user interface including abstractions for simulation control, visualization, and data interaction, (d) used the prototype to conduct a market analysis and survey to determine potential market size and competition, (e) identified LIS software licensing and copyright limitations and developed solutions, and (f) developed a business plan for development and marketing of the LIS-cloud innovation. While some significant feasibility issues were found in the LIS licensing, overall a high degree of LIS-cloud technical feasibility was found.

  12. Photogrammetric DSM denoising

    NASA Astrophysics Data System (ADS)

    Nex, F.; Gerke, M.

    2014-08-01

    Image matching techniques can nowadays provide very dense point clouds and they are often considered a valid alternative to LiDAR point clouds. However, photogrammetric point clouds are often characterized by a higher level of random noise compared to LiDAR data and by the presence of large outliers. These problems constitute a limitation in the practical use of photogrammetric data for many applications, but an effective way to enhance the generated point clouds has yet to be found. In this paper we concentrate on the restoration of Digital Surface Models (DSM), computed from dense image matching point clouds. A photogrammetric DSM, i.e. a 2.5D representation of the surface, is still one of the major products derived from point clouds. Four different algorithms devoted to DSM denoising are presented: a standard median filter approach, a bilateral filter, a variational approach (TGV: Total Generalized Variation), as well as a newly developed algorithm, which is embedded into a Markov Random Field (MRF) framework and optimized through graph-cuts. The ability of each algorithm to recover the original DSM has been quantitatively evaluated. To this end, a synthetic DSM has been generated and different typologies of noise have been added to mimic the typical errors of photogrammetric DSMs. The evaluation reveals that standard filters like median and edge-preserving smoothing through a bilateral filter approach cannot sufficiently remove typical errors occurring in a photogrammetric DSM. The TGV-based approach removes random noise much better, but large areas with outliers still remain. Our own method, which explicitly models the degradation properties of such DSMs, outperforms the others in all respects.
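
    To make the evaluation protocol concrete, the following Python sketch applies one of the baseline filters discussed above (a standard median filter, via scipy.ndimage) to a synthetic DSM corrupted by random noise and sparse outliers and reports the RMSE before and after; the surface, noise levels, and filter size are illustrative assumptions, not values from the paper.

      # Minimal sketch (not the authors' code): evaluate a median filter on a
      # synthetic DSM corrupted by Gaussian noise and sparse outliers.
      import numpy as np
      from scipy.ndimage import median_filter

      rng = np.random.default_rng(0)

      # Synthetic 2.5D surface: a smooth hill on a 256 x 256 grid (heights in metres).
      x, y = np.meshgrid(np.linspace(-1, 1, 256), np.linspace(-1, 1, 256))
      dsm_true = 10.0 * np.exp(-(x**2 + y**2) / 0.3)

      # Typical photogrammetric errors: random noise plus large, sparse outliers.
      dsm_noisy = dsm_true + rng.normal(0.0, 0.2, dsm_true.shape)
      outlier_mask = rng.random(dsm_true.shape) < 0.01          # 1% gross errors
      dsm_noisy[outlier_mask] += rng.normal(0.0, 5.0, outlier_mask.sum())

      # Standard median filtering (one of the four denoising baselines discussed).
      dsm_denoised = median_filter(dsm_noisy, size=5)

      rmse = lambda a, b: float(np.sqrt(np.mean((a - b) ** 2)))
      print(f"RMSE noisy:    {rmse(dsm_noisy, dsm_true):.3f} m")
      print(f"RMSE denoised: {rmse(dsm_denoised, dsm_true):.3f} m")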

  13. Development and Evaluation of a Cloud-Gap-Filled MODIS Daily Snow-Cover Product

    NASA Technical Reports Server (NTRS)

    Hall, Dorothy K.; Riggs, George A.; Foster, James L.; Kumar, Sujay V.

    2010-01-01

    The utility of the Moderate Resolution Imaging Spectroradiometer (MODIS) snow-cover products is limited by cloud cover, which causes gaps in the daily snow-cover map products. We describe a cloud-gap-filled (CGF) daily snow-cover map using a simple algorithm to track cloud persistence, to account for the uncertainty created by the age of the snow observation. Developed from the 0.05° resolution climate-modeling grid daily snow-cover product, MOD10C1, each grid cell of the CGF map provides a cloud-persistence count (CPC) that tells whether the current or a prior day was used to make the snow decision. The percentage of grid cells "observable" is shown to increase dramatically when prior days are considered. The effectiveness of the CGF product is evaluated by conducting a suite of data assimilation experiments using the community Noah land surface model in the NASA Land Information System (LIS) framework. The Noah model forecasts of snow conditions, such as snow-water equivalent (SWE), are updated based on the observations of snow cover, which are obtained either from the MOD10C1 standard product or the new CGF product. The assimilation integrations using the CGF maps provide a domain-averaged bias improvement of -11%, whereas such improvement using the standard MOD10C1 maps is -3%. These improvements suggest that the Noah model underestimates SWE and snow depth fields, and that the assimilation integrations contribute to correcting this systematic error. We conclude that the gap-filling strategy is an effective approach for increasing cloud-free observations of snow cover.
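
    A minimal Python sketch of the gap-filling idea described above: each cloudy grid cell inherits the most recent cloud-free observation while a cloud-persistence count (CPC) records the age of that observation. The cell coding and array layout are assumptions for illustration and do not reproduce the operational MOD10C1 algorithm.

      # Minimal sketch (assumed encoding, not the operational MOD10C1 algorithm):
      # fill cloud gaps in a daily snow-cover series by carrying forward the most
      # recent cloud-free observation and tracking a cloud-persistence count (CPC).
      import numpy as np

      CLOUD = 250  # hypothetical cell value meaning "cloud obscured"

      def gap_fill(daily_maps):
          """daily_maps: (days, ny, nx) integer array of daily snow-cover classes."""
          filled = np.array(daily_maps, copy=True)
          cpc = np.zeros_like(filled)                 # days since last clear observation
          for t in range(1, filled.shape[0]):
              cloudy = filled[t] == CLOUD
              # Reuse yesterday's (possibly already gap-filled) value where cloudy ...
              filled[t][cloudy] = filled[t - 1][cloudy]
              # ... and age the observation; clear cells reset the counter to zero.
              cpc[t][cloudy] = cpc[t - 1][cloudy] + 1
          return filled, cpc

      # Toy example: one pixel stays cloudy for two days after a snow observation.
      series = np.array([[[100]], [[CLOUD]], [[CLOUD]], [[0]]])
      maps, counts = gap_fill(series)
      print(maps.ravel(), counts.ravel())   # [100 100 100 0] [0 1 2 0]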

  14. Improvement of DHRA-DMDC Physical Access Software DBIDS Using Cloud Computing Technology: A Case Study

    DTIC Science & Technology

    2012-06-01

    Technology originally developed on the Java platform. The Hibernate framework supports rapid development of a data access layer without requiring a ... We recommend using Hibernate technology for object ... Noted benefits include protection from security threats and easy aggregate management operations via file tags.

  15. The Goldilocks Dilemma: Homework Policy Creating a Culture Where Simply Good Is Just Not Good Enough

    ERIC Educational Resources Information Center

    Watkins, Paul J.; Stevens, David W.

    2013-01-01

    Throughout the decades of educational reform cycles, the value of homework has proven either meaningful or meaningless depending on the reforming framework. Questions about homework as simply busy work or knowledge work, mere content distraction or content extension, ambivalence toward importance, or discipline of character all cloud any…

  16. Recent Updates to the GEOS-5 Linear Model

    NASA Technical Reports Server (NTRS)

    Holdaway, Dan; Kim, Jong G.; Errico, Ron; Gelaro, Ronald; Mahajan, Rahul

    2014-01-01

    The Global Modeling and Assimilation Office (GMAO) is close to having a working 4DVAR system and has developed a linearized version of GEOS-5. This talk outlines a series of improvements made to the linearized dynamics, physics, and trajectory. Of particular interest is the development of linearized cloud microphysics, which provides the framework for 'all-sky' data assimilation.

  17. Cloud-enabled large-scale land surface model simulations with the NASA Land Information System

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.

    2017-12-01

    Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA in LIS, meaningful simulations containing a large multi-model ensemble will be enabled and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to the large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists that are interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple run time environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allow an entire software package along with all dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that have taken weeks and months can now be performed in minutes. This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and describe the potential deployment of this information technology with other NASA applications.

  18. A Multi-scale Modeling System with Unified Physics to Study Precipitation Processes

    NASA Astrophysics Data System (ADS)

    Tao, W. K.

    2017-12-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions and 1,000 x 1,000 km2 in three dimensions. Cloud-resolving models now provide statistical information useful for developing more realistic, physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to those of cloud-resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE), (2) a regional-scale model (the NASA-unified Weather Research and Forecasting model, WRF), and (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF). The same microphysical processes, long- and short-wave radiative transfer, land processes, and explicit cloud-radiation and cloud-land surface interactions are applied throughout this multi-scale modeling system. The modeling system has been coupled with a multi-satellite simulator that uses NASA high-resolution satellite data to identify the strengths and weaknesses of the cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, results from using the multi-scale modeling system to study precipitation processes and their sensitivity to model resolution and microphysics schemes will be presented. How the multi-satellite simulator can be used to improve simulated precipitation processes will also be discussed.

  19. Using Multi-Scale Modeling Systems and Satellite Data to Study the Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chern, J.; Lang, S.; Matsui, T.; Shen, B.; Zeng, X.; Shi, R.

    2011-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions and 1,000 x 1,000 km2 in three dimensions. Cloud-resolving models now provide statistical information useful for developing more realistic, physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to those of cloud-resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE), (2) a regional-scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and explicit cloud-radiation and cloud-land surface interactions are applied throughout this multi-scale modeling system. The modeling system has been coupled with a multi-satellite simulator that uses NASA high-resolution satellite data to identify the strengths and weaknesses of the cloud and precipitation processes simulated by the model. In this talk, the recent developments and applications of the multi-scale modeling system will be presented. In particular, results from using the multi-scale modeling system to study precipitating systems and hurricanes/typhoons will be presented. High-resolution spatial and temporal visualization will be used to show the evolution of precipitation processes. How the multi-satellite simulator can be used to improve simulated precipitation processes will also be discussed.

  20. Impact of aerosol intrusions on sea-ice melting rates and the structure of Arctic boundary layer clouds

    NASA Astrophysics Data System (ADS)

    Cotton, W.; Carrio, G.; Jiang, H.

    2003-04-01

    The Los Alamos National Laboratory sea-ice model (LANL CICE) was implemented into the real-time and research versions of the Colorado State University-Regional Atmospheric Modeling System (RAMS@CSU). The original version of CICE was modified in its structure to allow module communication in an interactive multigrid framework. In addition, some improvements have been made in the routines involved in the coupling, among them the inclusion of iterative methods that consider variable roughness lengths for snow-covered ice thickness categories. This version of the model also includes more complex microphysics that considers the nucleation of cloud droplets, allowing the prediction of mixing ratios and number concentrations for all condensed water species. The real-time version of RAMS@CSU automatically processes the NASA Team SSMI F13 25km sea-ice coverage data; the data are objectively analyzed and mapped to the model grid configuration. We performed two types of cloud resolving simulations to assess the impact of the entrainment of aerosols from above the inversion on Arctic boundary layer clouds. The first series of numerical experiments corresponds to a case observed on May 4, 1998 during the FIRE-ACE/SHEBA field experiment. Results indicate a significant impact on the microstructure of the simulated clouds. When polluted initial profiles above the inversion are assumed, the liquid water fraction of the cloud monotonically decreases, the total condensate path increases, and downward IR tends to increase due to a significant increase in the ice water path. The second set of cloud resolving simulations focused on the evaluation of the potential effect of aerosol concentration above the inversion on melting rates during the spring-summer period. For these multi-month simulations, the IFN and CCN profiles were also initialized assuming the May 4 profiles as benchmarks. Results suggest that increasing the aerosol concentrations above the boundary layer increases sea-ice melting rates when mixed phase clouds are present.

  1. A Scalable Cloud Library Empowering Big Data Management, Diagnosis, and Visualization of Cloud-Resolving Models

    NASA Astrophysics Data System (ADS)

    Zhou, S.; Tao, W. K.; Li, X.; Matsui, T.; Sun, X. H.; Yang, X.

    2015-12-01

    A cloud-resolving model (CRM) is an atmospheric numerical model that can numerically resolve clouds and cloud systems at 0.25-5 km horizontal grid spacings. The main advantage of the CRM is that it allows explicit interactive processes between microphysics, radiation, turbulence, the surface, and aerosols without subgrid cloud fraction, overlap, or convective parameterizations. Because of their fine resolution and complex physical processes, it is challenging for the CRM community to i) visualize and inter-compare CRM simulations, ii) diagnose key processes for cloud-precipitation formation and intensity, and iii) evaluate simulations against NASA's field campaign data and L1/L2 satellite data products, due to the large data volume (~10 TB) and the complexity of the CRM's physical processes. We have been building the Super Cloud Library (SCL) upon a Hadoop framework, capable of CRM database management, distribution, visualization, subsetting, and evaluation in a scalable way. The current SCL capability includes (1) an SCL data model that enables various CRM simulation outputs in NetCDF, including those of the NASA-Unified Weather Research and Forecasting (NU-WRF) and Goddard Cumulus Ensemble (GCE) models, to be accessed and processed by Hadoop; (2) a parallel NetCDF-to-CSV converter that supports NU-WRF and GCE model outputs; (3) a technique that visualizes Hadoop-resident data with IDL; (4) a technique that subsets Hadoop-resident data, compliant with the SCL data model, with HIVE or Impala via HUE's Web interface; and (5) a prototype that enables a Hadoop MapReduce application to dynamically access and process data residing in a parallel file system (PVFS2 or CephFS), where high-performance computing (HPC) simulation outputs such as NU-WRF's and GCE's are located. We are testing Apache Spark to speed up SCL data processing and analysis. With the SCL capabilities, SCL users can conduct large-domain on-demand tasks without downloading voluminous CRM datasets and various observations from NASA field campaigns and satellite data to a local computer, and inter-compare CRM output and data with GCE and NU-WRF.
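
    As an illustration of the kind of flattening step the SCL's NetCDF-to-CSV converter performs, the following serial Python sketch (using the netCDF4 package) writes selected variables of a NetCDF file as CSV rows; the file name and variable names are hypothetical, and the real converter runs in parallel.

      # Minimal serial sketch of a NetCDF-to-CSV flattening step (the SCL converter
      # itself is parallel); the file name and variable names below are hypothetical.
      import csv
      import numpy as np
      from netCDF4 import Dataset  # assumes the netCDF4 package is installed

      def netcdf_to_csv(nc_path, csv_path, var_names):
          with Dataset(nc_path) as nc, open(csv_path, "w", newline="") as out:
              writer = csv.writer(out)
              writer.writerow(["index"] + list(var_names))
              # Flatten each variable; masked (missing) values become NaN.
              columns = [np.ma.filled(nc.variables[name][:], np.nan).ravel().astype(float)
                         for name in var_names]
              for i, values in enumerate(zip(*columns)):
                  writer.writerow([i] + list(values))

      # Example (hypothetical GCE output file and variables):
      # netcdf_to_csv("gce_run001.nc", "gce_run001.csv", ["QC", "QR", "W"])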

  2. A security framework for nationwide health information exchange based on telehealth strategy.

    PubMed

    Zaidan, B B; Haiqi, Ahmed; Zaidan, A A; Abdulnabi, Mohamed; Kiah, M L Mat; Muzamel, Hussaen

    2015-05-01

    This study focuses on the situation of health information exchange (HIE) in the context of a nationwide network. It aims to create a security framework that can be implemented to ensure the safe transmission of health information across the boundaries of care providers in Malaysia and other countries. First, a critique of the major elements of nationwide health information networks is presented from the perspective of security, along with such topics as the importance of HIE, issues, and main approaches. Second, a systematic evaluation is conducted on the security solutions that can be utilized in the proposed nationwide network. Finally, a secure framework for health information transmission is proposed within a central cloud-based model, which is compatible with the Malaysian telehealth strategy. The outcome of this analysis indicates that a complete security framework for a global structure of HIE is yet to be defined and implemented. Our proposed framework represents such an endeavor and suggests specific techniques to achieve this goal.

  3. Comparison of convective clouds observed by spaceborne W-band radar and simulated by cloud-resolving atmospheric models

    NASA Astrophysics Data System (ADS)

    Dodson, Jason B.

    Deep convective clouds (DCCs) play an important role in regulating global climate through vertical mass flux, vertical water transport, and radiation. For general circulation models (GCMs) to simulate the global climate realistically, they must simulate DCCs realistically. GCMs have traditionally used cumulus parameterizations (CPs). Much recent research has shown that multiple persistent unrealistic behaviors in GCMs are related to limitations of CPs. Two alternatives to CPs exist: the global cloud-resolving model (GCRM), and the multiscale modeling framework (MMF). Both can directly simulate the coarser features of DCCs because of their multi-kilometer horizontal resolutions, and can simulate large-scale meteorological processes more realistically than GCMs. However, the question of realistic behavior of simulated DCCs remains. How closely do simulated DCCs resemble observed DCCs? In this study I examine the behavior of DCCs in the Nonhydrostatic Icosahedral Atmospheric Model (NICAM) and Superparameterized Community Atmospheric Model (SP-CAM), the latter with both single-moment and double-moment microphysics. I place particular emphasis on the relationship between cloud vertical structure and convective environment. I also emphasize the transition between shallow clouds and mature DCCs. The spatial domains used are the tropical oceans and the contiguous United States (CONUS), the latter of which produces frequent vigorous convection during the summer. CloudSat is used to observe DCCs, and A-Train and reanalysis data are used to represent the large-scale environment in which the clouds form. The CloudSat cloud mask and radar reflectivity profiles for CONUS cumuliform clouds (defined as clouds with a base within the planetary boundary layer) during boreal summer are first averaged and compared. Both NICAM and SP-CAM greatly underestimate the vertical growth of cumuliform clouds. Then they are sorted by three large-scale environmental variables: total precipitable water (TPW), surface air temperature (SAT), and 500 hPa vertical velocity (W500), representing the dynamical and thermodynamical environment in which the clouds form. The sorted CloudSat profiles are then compared with NICAM and SP-CAM profiles simulated with the Quickbeam CloudSat simulator. Both models have considerable difficulty representing the relationship of SAT and clouds over CONUS. For TPW and W500, shallow clouds transition to DCCs at higher values than observed. This may be an indication of the models' inability to represent the formation of DCCs in marginal convective environments. NICAM develops tall DCCs in highly favorable environments, but SP-CAM appears to be incapable of developing tall DCCs in almost any environment. The use of double moment microphysics in SP-CAM improves the frequency of deep clouds and their relationship with TPW, but not SAT. Both models underpredict radar reflectivity in the upper cloud of mature DCCs. SP-CAM with single moment microphysics has a particularly unrealistic DCC reflectivity profile, but with double moment microphysics it improves substantially. SP-CAM with double-moment microphysics unexpectedly appears to weaken DCC updraft strength as TPW increases, but otherwise both NICAM and SP-CAM represent the environment-versus-DCC relationships fairly realistically.
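
    The sorting step described above can be illustrated with a short Python sketch that composites synthetic reflectivity profiles into terciles of one environmental variable (TPW here); the data are random placeholders, not CloudSat retrievals.

      # Minimal sketch of the sorting/compositing step: average cloud profiles in
      # terciles of a large-scale environmental variable (TPW); data are synthetic.
      import numpy as np

      rng = np.random.default_rng(1)
      n_profiles, n_levels = 1000, 40
      reflectivity = rng.normal(-20.0, 5.0, (n_profiles, n_levels))   # dBZ, synthetic
      tpw = rng.uniform(10.0, 60.0, n_profiles)                        # mm, synthetic

      # Tercile edges of the environmental variable.
      edges = np.quantile(tpw, [0.0, 1/3, 2/3, 1.0])
      composites = []
      for lo, hi in zip(edges[:-1], edges[1:]):
          in_bin = (tpw >= lo) & (tpw <= hi)
          composites.append(reflectivity[in_bin].mean(axis=0))

      for k, profile in enumerate(composites, start=1):
          print(f"TPW tercile {k}: mean column reflectivity {profile.mean():.1f} dBZ")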

  4. Sensitivity of the southern West African mean atmospheric state to variations in low-level cloud cover as simulated by ICON

    NASA Astrophysics Data System (ADS)

    Kniffka, Anke; Knippertz, Peter; Fink, Andreas

    2017-04-01

    This contribution presents first results of numerical sensitivity experiments that are carried out in the framework of the project DACCIWA (Dynamics-Aerosol-Chemistry-Cloud Interactions in West Africa). DACCIWA aims to investigate the impact of the drastic increase in anthropogenic emissions in West Africa on the local weather and climate, for example through cloud-aerosol interactions or impacts on radiation and stability. DACCIWA organised a major international field campaign in West Africa in June-July 2016 and involves a wide range of modelling activities. Several studies have shown - and first results of the DACCIWA campaign confirm - that extensive ultra-low stratus clouds form in the southern parts of West Africa (8°W-8°E, 5-10°N) at night in connection with strong nocturnal low-level jets. The clouds persist long after sunrise and therefore have a substantial impact on the surface radiation budget and consequently on the diurnal evolution of the daytime, convectively mixed boundary layer. The objective of this study is to investigate the sensitivity of the West African monsoon system and its diurnal cycle to the radiative effects of these low clouds. The study is based on a series of daily 5-day sensitivity simulations using ICON, the operational numerical weather prediction model of the German Weather Service, during the months July - September 2006. In these simulations, low clouds are made transparent by artificially lowering the optical thickness information passed on to the model's radiation scheme. Results reveal a noticeable influence of the low-level cloud cover on the atmospheric mean state of our region of interest and beyond. The diurnal development of the convective boundary layer is also influenced by the cloud modification. In the transparent-cloud experiments, the cloud deck tends to break up later in the day and is shifted to a higher altitude, thereby causing a short-lived intensification around 11 LT. The average rainfall patterns are modified as well, though no conclusion on the long-term impact on rainfall can be made due to the forced initial conditions in the presented experiment. In the future, the impact on the development of the West African monsoon system will be assessed.

  5. A comprehensive risk assessment framework for offsite transportation of inflammable hazardous waste.

    PubMed

    Das, Arup; Gupta, A K; Mazumder, T N

    2012-08-15

    A framework for risk assessment due to offsite transportation of hazardous wastes is designed based on the type of event that can be triggered by an accident involving a hazardous waste carrier. The objective of this study is to design a framework for computing the risk to population associated with offsite transportation of inflammable and volatile wastes. The framework is based on the traditional definition of risk and is designed for conditions where accident databases are not available. The probability-based variable in the risk assessment framework is substituted by a composite accident index proposed in this study. The framework computes the impacts due to a volatile cloud explosion based on the TNO Multi-energy model. The methodology also estimates the vulnerable population in terms of disability-adjusted life years (DALY), which take into consideration the demographic profile of the population and the degree of injury sustained in terms of mortality and morbidity. The methodology is illustrated using a case study of a pharmaceutical industry in the Kolkata metropolitan area.
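
    A deliberately generic Python sketch of the underlying bookkeeping, risk as a likelihood surrogate (the composite accident index) multiplied by a DALY-weighted consequence and summed over route segments; the numbers and segment names are invented, and the actual impact calculation in the paper uses the TNO Multi-energy model.

      # Generic illustration only (not the paper's model): per-segment transport risk
      # as a composite accident index times a DALY-weighted consequence.
      segments = [
          # (segment id, composite accident index, exposed population, DALY per person)
          ("S1", 0.012, 1500, 0.8),
          ("S2", 0.004, 8000, 0.6),
          ("S3", 0.020,  300, 1.1),
      ]

      def segment_risk(accident_index, population, daly_per_person):
          # Consequence is expressed in disability-adjusted life years (DALY).
          return accident_index * population * daly_per_person

      total = sum(segment_risk(ai, pop, daly) for _, ai, pop, daly in segments)
      for sid, ai, pop, daly in segments:
          print(f"{sid}: risk = {segment_risk(ai, pop, daly):.1f} DALY-equivalents")
      print(f"Route total: {total:.1f} DALY-equivalents")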

  6. Fragment assignment in the cloud with eXpress-D

    PubMed Central

    2013-01-01

    Background Probabilistic assignment of ambiguously mapped fragments produced by high-throughput sequencing experiments has been demonstrated to greatly improve accuracy in the analysis of RNA-Seq and ChIP-Seq, and is an essential step in many other sequence census experiments. A maximum likelihood method using the expectation-maximization (EM) algorithm for optimization is commonly used to solve this problem. However, batch EM-based approaches do not scale well with the size of sequencing datasets, which have been increasing dramatically over the past few years. Thus, current approaches to fragment assignment rely on heuristics or approximations for tractability. Results We present an implementation of a distributed EM solution to the fragment assignment problem using Spark, a data analytics framework that can scale by leveraging compute clusters within datacenters ("the cloud"). We demonstrate that our implementation easily scales to billions of sequenced fragments, while providing the exact maximum likelihood assignment of ambiguous fragments. The accuracy of the method is shown to be an improvement over the most widely used tools available and can be run in a constant amount of time when cluster resources are scaled linearly with the amount of input data. Conclusions The cloud offers one solution for the difficulties faced in the analysis of massive high-throughput sequencing data, which continue to grow rapidly. Researchers in bioinformatics must follow developments in distributed systems, such as new frameworks like Spark, for ways to port existing methods to the cloud and help them scale to the datasets of the future. Our software, eXpress-D, is freely available at: http://github.com/adarob/express-d. PMID:24314033
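
    A serial toy illustration of the expectation-maximization fragment-assignment idea in Python; eXpress-D distributes exactly these E- and M-steps with Spark, whereas the transcript names, mappings, and iteration count below are made up for demonstration.

      # Serial toy illustration of EM fragment assignment (eXpress-D distributes the
      # same E/M steps with Spark); transcript names and mappings are made up.
      from collections import defaultdict

      # Each fragment maps ambiguously to one or more transcripts.
      fragments = [("t1",), ("t1", "t2"), ("t2",), ("t1", "t2"), ("t2", "t3")]
      transcripts = ["t1", "t2", "t3"]

      # Start from uniform relative abundances.
      theta = {t: 1.0 / len(transcripts) for t in transcripts}

      for _ in range(50):
          counts = defaultdict(float)
          # E-step: split each fragment across its candidate transcripts in
          # proportion to the current abundance estimates.
          for cands in fragments:
              z = sum(theta[t] for t in cands)
              for t in cands:
                  counts[t] += theta[t] / z
          # M-step: re-estimate abundances from the expected counts.
          theta = {t: counts[t] / len(fragments) for t in transcripts}

      print({t: round(v, 3) for t, v in theta.items()})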

  7. A unified view of convective transports by stratocumulus clouds, shallow cumulus clouds, and deep convection

    NASA Technical Reports Server (NTRS)

    Randall, David A.

    1990-01-01

    A bulk planetary boundary layer (PBL) model was developed with a simple internal vertical structure and a simple second-order closure, designed for use as a PBL parameterization in a large-scale model. The model allows the mean fields to vary with height within the PBL, and so must address the vertical profiles of the turbulent fluxes, going beyond the usual mixed-layer assumption that the fluxes of conservative variables are linear with height. This is accomplished using the same convective mass flux approach that has also been used in cumulus parameterizations. The purpose is to show that such a mass flux model can include, in a single framework, the compensating subsidence concept, downgradient mixing, and well-mixed layers.

  8. Dynamics and formation of obscuring tori in AGNs

    NASA Astrophysics Data System (ADS)

    Bannikova, Elena Yu.; Sergeyev, Alexey V.

    2017-12-01

    We considered the evolution of a self-gravitating clumpy torus in the gravitational field of the central mass of an active galactic nucleus (AGN) in the framework of the N-body problem. The initial conditions take into account winds with different opening angles. Results of our N-body simulations show that the clouds moving on orbits with a spread in inclinations and eccentricities form a toroidal region. This mechanism can solve the problem of the geometrical thickness of the torus. The velocity of the clouds at the inner edge of the torus is lower than in a disk model, which can explain the observed rotation curves. We discuss the scenario of torus formation related to the beginning of the AGN stage.

  9. A Cloud Computing Based Patient Centric Medical Information System

    NASA Astrophysics Data System (ADS)

    Agarwal, Ankur; Henehan, Nathan; Somashekarappa, Vivek; Pandya, A. S.; Kalva, Hari; Furht, Borko

    This chapter discusses an emerging concept of a cloud computing based Patient Centric Medical Information System framework that will allow various authorized users to securely access patient records from various Care Delivery Organizations (CDOs) such as hospitals, urgent care centers, doctors, laboratories, and imaging centers, among others, from any location. Such a system must seamlessly integrate all patient records, including images such as CT scans and MRIs, which can easily be accessed from any location and reviewed by any authorized user. In such a scenario, the storage and transmission of medical records will have to be conducted in a totally secure and safe environment with a very high standard of data integrity, protecting patient privacy and complying with all Health Insurance Portability and Accountability Act (HIPAA) regulations.

  10. Discovery of a New Wolf-Rayet Star Using SAGE-LMC

    NASA Astrophysics Data System (ADS)

    Gvaramadze, V. V.; Chené, A.-N.; Kniazev, A. Y.; Schnurr, O.

    2012-12-01

    We report the first-ever discovery of an extragalactic Wolf-Rayet (WR) star with Spitzer. A new WR star in the Large Magellanic Cloud (LMC) was revealed via detection of its circumstellar shell using 24 μm images obtained in the framework of the Spitzer Survey of the Large Magellanic Cloud (SAGE-LMC). Subsequent spectroscopic observations with Gemini South resolved the central star into two components, one of which is a WN3b+abs star, while the second is a B0 V star. We consider the lopsided brightness distribution over the circumstellar shell as an indication that the WR star is a runaway and use this interpretation to identify a possible parent cluster of the star.

  11. Partitioning CloudSat Ice Water Content for Comparison with Upper-Tropospheric Ice in Global Atmospheric Models

    NASA Astrophysics Data System (ADS)

    Chen, W. A.; Woods, C. P.; Li, J. F.; Waliser, D. E.; Chern, J.; Tao, W.; Jiang, J. H.; Tompkins, A. M.

    2010-12-01

    CloudSat provides important estimates of vertically resolved ice water content (IWC) on a global scale based on radar reflectivity. These estimates of IWC have proven beneficial in evaluating the representations of ice clouds in global models. An issue particularly germane to this investigation when performing model-data comparisons of IWC is the question of which component(s) of the frozen water mass are represented by retrieval estimates and how they relate to what is represented in models. The present study developed and applied a new technique to partition CloudSat total IWC into small and large ice hydrometeors, based on the CloudSat-retrieved ice particle size distribution (PSD) parameters. The new method allows one to make relevant model-data comparisons and provides new insights into the model's representation of atmospheric IWC. The partitioned CloudSat IWC suggests that the small ice particles contribute 20-30% of the total IWC in the upper troposphere when a threshold size of 100 μm is used. Sensitivity measures with respect to the threshold size, the PSD parameters, and the retrieval algorithms are presented. The new dataset is compared to model estimates, pointing to areas for model improvement. Cloud ice analyses from the European Centre for Medium-Range Weather Forecasts model agree well with the small IWC from CloudSat. The finite-volume multi-scale modeling framework model underestimates total IWC at 147 and 215 hPa, while overestimating the fractional contribution from the small ice species. These results are discussed in terms of their applications to, and implications for, the evaluation of global atmospheric models, providing constraints on the representations of cloud feedback and precipitation in global models, which in turn can help reduce uncertainties associated with climate change projections. Figure 1. A sample lognormal ice number distribution (red curve), and the corresponding mass distribution (black curve). The dotted line represents the cutoff size for IWC partitioning (Dc = 100 µm as an example). The partial integrals of the mass distribution for particles smaller and larger than Dc correspond to IWC<100 (green area) and IWC>100 (blue area), respectively.
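
    The partitioning idea can be sketched in a few lines of Python: integrate the mass-weighted particle size distribution below and above the 100 µm threshold. The lognormal parameters and the assumption that mass scales as D^3 are illustrative; they are not the CloudSat-retrieved PSD parameters used in the study.

      # Minimal sketch: partition the ice water content of a lognormal particle size
      # distribution at a threshold size Dc = 100 µm; PSD parameters are illustrative.
      import numpy as np
      from scipy.integrate import quad

      Dg, sigma = 80.0, 0.6      # geometric mean diameter (µm) and log-width (assumed)
      Dc = 100.0                 # partition threshold (µm)

      def number_dist(D):
          """Lognormal number distribution (unnormalised amplitude)."""
          return np.exp(-0.5 * (np.log(D / Dg) / sigma) ** 2) / (D * sigma * np.sqrt(2 * np.pi))

      def mass_dist(D):
          """Mass distribution, assuming particle mass proportional to D**3."""
          return D ** 3 * number_dist(D)

      small, _ = quad(mass_dist, 1e-3, Dc)        # IWC carried by particles < Dc
      total, _ = quad(mass_dist, 1e-3, 5000.0)    # total IWC (upper bound ~5 mm)
      print(f"Fraction of IWC in particles smaller than {Dc:.0f} um: {small / total:.2f}")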

  12. EPIC/DSCOVR's Oxygen Absorption Channels: A Cloud Profiling Information Content Analysis

    NASA Astrophysics Data System (ADS)

    Davis, A. B.; Merlin, G.; Labonnote, L. C.; Cornet, C.; Dubuisson, P.; Ferlay, N.; Parol, F.; Riedi, J.; Yang, Y.

    2016-12-01

    EPIC/DSCOVR has several spectral channels dedicated to cloud characterization, most notably the O2 A- and B-bands. Differential optical absorption spectroscopy (DOAS) ratios of in-band and reference channels are less prone to calibration error than the four individual signals. Using these ratios, we have replicated for mono-directional (quasi-backscattering) EPIC observations the recent cloud information content analysis by Merlin et al. (AMT-D,8:12709-12758,2015) that was focused on A-band-only but multi-angle observations by POLDER in the past, by AirMSPI in the present, and by 3MI and MAIA in the future. The methodology is based on extensive forward 1D radiative transfer (RT) computations using the ARTDECO model that implements a k-distribution technique for the absorbing (in-band) channels. These synthetic signals are combined into a Bayesian Rodgers-type framework for estimating posterior uncertainty on retrieved quantities. Recall that this formalism calls explicitly for: (1) estimates of instrument error, and (2) prior uncertainty on the retrieved quantities, to which we add (3) reasonable estimates of uncertainty in the non- or otherwise-retrieved properties. Wide ranges of cloud top heights (CTHs) and cloud geometrical thicknesses (CGTs) are examined for a representative selection of cloud optical thicknesses (COTs), solar angles, and surface reflectances. We found that CTH should be reliably retrieved from EPIC data under most circumstances as long as COT can be inferred from non-absorbing channels, and the bias from in-cloud absorption is removed. However, CGT will be hard to determine unless CTH is constrained by independent means. EPIC has several UV channels that could be brought to bear. These findings conflict with those of Yang et al. (JQSRT,122:141-149,2013), so we also revisit that more preliminary study that did not account for a realistic level of residual instrument noise in the DOAS ratios. In conclusion, we believe that the present information content analysis will inform the EPIC/DSCOVR Level 2 algorithm development team about what cloud properties to target using the A/B-band channels, depending on the availability of other cloud information.
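
    A generic linear-Gaussian sketch of a Rodgers-type calculation in Python: the posterior covariance combines the measurement error covariance, the prior covariance, and the Jacobian of the measurements with respect to the retrieved state. All matrices below are invented placeholders for a two-parameter (cloud-top height, thickness) toy retrieval.

      # Generic linear-Gaussian sketch of a Rodgers-type information content analysis:
      # posterior covariance S_hat = (K^T Se^-1 K + Sa^-1)^-1. Numbers are illustrative.
      import numpy as np

      # Jacobian K: sensitivity of two DOAS ratios to [cloud-top height, thickness].
      K = np.array([[0.08, 0.01],
                    [0.03, 0.02]])

      Se = np.diag([0.005, 0.005]) ** 2     # measurement (instrument) error covariance
      Sa = np.diag([2.0, 3.0]) ** 2         # prior covariance (km^2) on the two states

      S_hat = np.linalg.inv(K.T @ np.linalg.inv(Se) @ K + np.linalg.inv(Sa))
      A = S_hat @ K.T @ np.linalg.inv(Se) @ K      # averaging kernel

      print("posterior 1-sigma:", np.sqrt(np.diag(S_hat)))
      print("degrees of freedom for signal:", np.trace(A))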

  13. On the reversibility of transitions between closed and open cellular convection

    DOE PAGES

    Feingold, G.; Koren, I.; Yamaguchi, T.; ...

    2015-07-08

    The two-way transition between closed and open cellular convection is addressed in an idealized cloud-resolving modeling framework. A series of cloud-resolving simulations shows that the transition between closed and open cellular states is asymmetrical and characterized by a rapid ("runaway") transition from the closed- to the open-cell state but slower recovery to the closed-cell state. Given that precipitation initiates the closed–open cell transition and that the recovery requires a suppression of the precipitation, we apply an ad hoc time-varying drop concentration to initiate and suppress precipitation. We show that the asymmetry in the two-way transition occurs even for very rapid drop concentration replenishment. The primary barrier to recovery is the loss in turbulence kinetic energy (TKE) associated with the loss in cloud water (and associated radiative cooling) and the vertical stratification of the boundary layer during the open-cell period. In transitioning from the open to the closed state, the system faces the task of replenishing cloud water fast enough to counter precipitation losses, such that it can generate radiative cooling and TKE. It is hampered by a stable layer below cloud base that has to be overcome before water vapor can be transported more efficiently into the cloud layer. Recovery to the closed-cell state is slower when radiative cooling is inefficient such as in the presence of free tropospheric clouds or after sunrise, when it is hampered by the absorption of shortwave radiation. Tests suggest that recovery to the closed-cell state is faster when the drizzle is smaller in amount and of shorter duration, i.e., when the precipitation causes less boundary layer stratification. Cloud-resolving model results on recovery rates are supported by simulations with a simple predator–prey dynamical system analogue. It is suggested that the observed closing of open cells by ship effluent likely occurs when aerosol intrusions are large, when contact comes prior to the heaviest drizzle in the early morning hours, and when the free troposphere is cloud free.
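
    For readers unfamiliar with the low-order analogue mentioned above, the following Python sketch integrates a generic Lotka-Volterra predator-prey system; it is not the authors' cloud-rain model, and the parameters are arbitrary.

      # Generic predator-prey (Lotka-Volterra) integration, the kind of low-order
      # analogue mentioned in the abstract; it is NOT the authors' cloud-rain model.
      import numpy as np
      from scipy.integrate import solve_ivp

      def lotka_volterra(t, y, a=1.0, b=0.4, c=0.4, d=0.1):
          prey, predator = y                      # think: cloud water, rain
          return [a * prey - b * prey * predator,
                  d * prey * predator - c * predator]

      sol = solve_ivp(lotka_volterra, (0.0, 60.0), [2.0, 1.0], max_step=0.05)
      print("final state (prey, predator):", sol.y[:, -1].round(2))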

  14. Relationships between lower tropospheric stability, low cloud cover, and water vapor isotopic composition in the subtropical Pacific

    NASA Astrophysics Data System (ADS)

    Galewsky, J.

    2017-12-01

    Understanding the processes that govern the relationships between lower tropospheric stability and low-cloud cover is crucial for improved constraints on low-cloud feedbacks and for improving the parameterizations of low-cloud cover used in climate models. The stable isotopic composition of atmospheric water vapor is a sensitive recorder of the balance of moistening and drying processes that set the humidity of the lower troposphere and may thus provide a useful framework for improving our understanding of low-cloud processes. In-situ measurements of water vapor isotopic composition collected at the NOAA Mauna Loa Observatory in Hawaii, along with twice-daily soundings from Hilo and remote sensing of cloud cover, show a clear inverse relationship between the estimated inversion strength (EIS) and the mixing ratios and water vapor δ-values, and a positive relationship between EIS, deuterium excess, and ΔδD, defined as the difference between an observation and a reference Rayleigh distillation curve. These relationships are consistent with reduced moistening and an enhanced upper-tropospheric contribution above the trade inversion under high EIS conditions and stronger moistening under weaker EIS conditions. The cloud fraction, cloud liquid water path, and cloud-top pressure were all found to be higher under low EIS conditions. Inverse modeling of the isotopic data for the highest and lowest terciles of EIS conditions provides quantitative constraints on the cold-point temperatures and mixing fractions that govern the humidity above the trade inversion. The modeling shows that the moistening fraction between moist boundary layer air and dry middle-tropospheric air is 24±1.5% under low EIS conditions and 6±1.5% under high EIS conditions. A cold-point (last-saturation) temperature of -30°C can match the observations for both low and high EIS conditions. The isotopic composition of the moistening source as derived from the inversion (-114±10‰) requires moderate fractionation from a pure marine source, indicating a link between inversion strength and moistening of the lower troposphere from the outflow of shallow convection. This approach can be applied in other settings and the results can be used to test parameterizations in climate models.
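
    The reference Rayleigh distillation curve from which ΔδD is measured follows the standard closed-form relation; a small Python sketch, with an illustrative (assumed) fractionation factor and initial composition:

      # Standard Rayleigh distillation relation for the reference curve from which
      # delta-delta-D is measured; alpha and the initial delta value are illustrative.
      def rayleigh_delta(delta0_permil, f_remaining, alpha):
          """delta (permil) of vapor after Rayleigh distillation to fraction f."""
          return (delta0_permil + 1000.0) * f_remaining ** (alpha - 1.0) - 1000.0

      # Example: vapor starting at -80 permil dD, liquid-vapor fractionation factor
      # alpha ~ 1.08 (assumed, temperature dependent), progressively dried to 30% remaining.
      for f in (1.0, 0.7, 0.5, 0.3):
          print(f"f = {f:.1f}:  dD = {rayleigh_delta(-80.0, f, 1.08):7.1f} permil")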

  15. Clouds Composition in Super-Earth Atmospheres: Chemical Equilibrium Calculations

    NASA Astrophysics Data System (ADS)

    Kempton, Eliza M.-R.; Mbarek, Rostom

    2015-12-01

    Attempts to determine the composition of super-Earth atmospheres have so far been plagued by the presence of clouds. Yet the theoretical framework to understand these clouds is still in its infancy. For the super-Earth archetype GJ 1214b, KCl, Na2S, and ZnS have been proposed as condensates that would form under the condition of chemical equilibrium, if the planet’s atmosphere has a bulk composition near solar. Condensation chemistry calculations have not been presented for a wider range of atmospheric bulk composition that is to be expected for super-Earth exoplanets. Here we provide a theoretical context for the formation of super-Earth clouds in atmospheres of varied composition by determining which condensates are likely to form, under the assumption of chemical equilibrium. We model super-Earth atmospheres assuming they are formed by degassing of volatiles from a solid planetary core of chondritic material. Given the atomic makeup of these atmospheres, we minimize the global Gibbs free energy of over 550 gases and condensates to obtain the molecular composition of the atmospheres over a temperature range of 350-3,000 K. Clouds should form along the temperature-pressure boundaries where the condensed species appear in our calculations. The super-Earth atmospheres that we study range from highly reducing to oxidizing and have carbon to oxygen (C:O) ratios that are both sub-solar and super-solar, thereby spanning a diverse range of atmospheric composition that is appropriate for low-mass exoplanets. Some condensates appear across all of our models. However, the majority of condensed species appear only over specific ranges of H:O and C:O ratios. We find that for GJ 1214b, KCl is the primary cloud-forming condensate at solar composition, in agreement with previous work. However, for oxidizing atmospheres, where H:O is less than unity, K2SO4 clouds form instead. For carbon-rich atmospheres with super-solar C:O ratios, graphite clouds additionally appear. At higher temperatures, clouds are formed from a variety of materials including metals, metal oxides, and aluminosilicates.

  16. FPGA Based Adaptive Rate and Manifold Pattern Projection for Structured Light 3D Camera System †

    PubMed Central

    Lee, Sukhan

    2018-01-01

    The quality of the captured point cloud and the scanning speed of a structured light 3D camera system depend upon its capability to handle object surfaces with large reflectance variation, traded off against the number of patterns that must be projected. In this paper, we propose and implement a flexible embedded framework that is capable of triggering the camera once or multiple times to capture single or multiple projections within a single camera exposure setting. This allows the 3D camera system to synchronize the camera and projector even for mismatched frame rates, so that the system can project different types of patterns for different scan-speed applications. The system can thus capture a high-quality 3D point cloud, even for surfaces with large reflectance variation, while achieving a high scan speed. The proposed framework is implemented on a Field Programmable Gate Array (FPGA), where the camera trigger is adaptively generated in such a way that the position and the number of triggers are automatically determined according to the camera exposure settings. In other words, the projection frequency is adaptive to different scanning applications without altering the architecture. In addition, the proposed framework is unique in that it does not require any external memory for storage because pattern pixels are generated in real time, which minimizes the complexity and size of the application-specific integrated circuit (ASIC) design and implementation. PMID:29642506

  17. Optimizing SIEM Throughput on the Cloud Using Parallelization

    PubMed Central

    Alam, Masoom; Ihsan, Asif; Javaid, Qaisar; Khan, Abid; Manzoor, Jawad; Akhundzada, Adnan; Khan, M Khurram; Farooq, Sajid

    2016-01-01

    Processing large amounts of data in real time for identifying security issues poses several performance challenges, especially when hardware infrastructure is limited. Managed Security Service Providers (MSSP), mostly hosting their applications on the Cloud, receive events at a very high rate that varies from a few hundred to a couple of thousand events per second (EPS). It is critical to process this data efficiently, so that attacks can be identified quickly and the necessary response can be initiated. This paper evaluates the performance of a security framework OSTROM built on the Esper complex event processing (CEP) engine under a parallel and a non-parallel computational framework. We explain three architectures under which Esper can be used to process events. We investigated the effect on throughput, memory, and CPU usage in each configuration setting. The results indicate that the performance of the engine is limited by the number of events coming in rather than the queries being processed. The architecture where 1/4th of the total events are submitted to each instance and all the queries are processed by all the units shows the best results in terms of throughput, memory, and CPU usage. PMID:27851762

  18. Advancing reference emission levels in subnational and national REDD+ initiatives: a CLASlite approach

    PubMed Central

    Asner, Gregory P; Joseph, Shijo

    2015-01-01

    Conservation and monitoring of tropical forests requires accurate information on their extent and change dynamics. Cloud cover, sensor errors and technical barriers associated with satellite remote sensing data continue to prevent many national and sub-national REDD+ initiatives from developing their reference deforestation and forest degradation emission levels. Here we present a framework for large-scale historical forest cover change analysis using free multispectral satellite imagery in an extremely cloudy tropical forest region. The CLASlite approach provided highly automated mapping of tropical forest cover, deforestation and degradation from Landsat satellite imagery. Critically, the fractional cover of forest photosynthetic vegetation, non-photosynthetic vegetation, and bare substrates calculated by CLASlite provided scene-invariant quantities for forest cover, allowing for systematic mosaicking of incomplete satellite data coverage. A synthesized satellite-based data set of forest cover was thereby created, reducing image incompleteness caused by clouds, shadows or sensor errors. This approach can readily be implemented by single operators with highly constrained budgets. We test this framework on tropical forests of the Colombian Pacific Coast (Chocó) – one of the cloudiest regions on Earth, with successful comparison to the Colombian government’s deforestation map and a global deforestation map. PMID:25678933

  19. Hierarchical Higher Order Crf for the Classification of Airborne LIDAR Point Clouds in Urban Areas

    NASA Astrophysics Data System (ADS)

    Niemeyer, J.; Rottensteiner, F.; Soergel, U.; Heipke, C.

    2016-06-01

    We propose a novel hierarchical approach for the classification of airborne 3D lidar points. Spatial and semantic context is incorporated via a two-layer Conditional Random Field (CRF). The first layer operates on a point level and utilises higher order cliques. Segments are generated from the labelling obtained in this way. They are the entities of the second layer, which incorporates larger-scale context. The classification result of the segments is introduced as an energy term for the next iteration of the point-based layer. This framework iterates and mutually propagates context to improve the classification results. Potentially wrong decisions can be revised at later stages. The output is a labelled point cloud as well as segments roughly corresponding to object instances. Moreover, we present two new contextual features for the segment classification: the distance and the orientation of a segment with respect to the closest road. It is shown that the classification benefits from these features. In our experiments the hierarchical framework improves the overall accuracy by 2.3% at the point-based level and by 3.0% at the segment-based level compared to a purely point-based classification.

  20. Falco: a quick and flexible single-cell RNA-seq processing framework on the cloud.

    PubMed

    Yang, Andrian; Troup, Michael; Lin, Peijie; Ho, Joshua W K

    2017-03-01

    Single-cell RNA-seq (scRNA-seq) is increasingly used in a range of biomedical studies. Nonetheless, current RNA-seq analysis tools are not specifically designed to efficiently process scRNA-seq data due to their limited scalability. Here we introduce Falco, a cloud-based framework to enable parallelization of existing RNA-seq processing pipelines using the big data technologies Apache Hadoop and Apache Spark for performing massively parallel analysis of large-scale transcriptomic data. Using two public scRNA-seq datasets and two popular RNA-seq alignment/feature quantification pipelines, we show that the same processing pipeline runs 2.6-145.4 times faster using Falco than running on a highly optimized standalone computer. Falco also allows users to utilize low-cost spot instances of Amazon Web Services, providing a ∼65% reduction in the cost of analysis. Falco is available via a GNU General Public License at https://github.com/VCCRI/Falco/. Contact: j.ho@victorchang.edu.au. Supplementary data are available at Bioinformatics online.

  1. Distributed analysis functional testing using GangaRobot in the ATLAS experiment

    NASA Astrophysics Data System (ADS)

    Legger, Federica; ATLAS Collaboration

    2011-12-01

    Automated distributed analysis tests are necessary to ensure smooth operations of the ATLAS grid resources. The HammerCloud framework allows for easy definition, submission and monitoring of grid test applications. Both functional and stress test applications can be defined in HammerCloud. Stress tests are large-scale tests meant to verify the behaviour of sites under heavy load. Functional tests are light user applications running at each site with high frequency, to ensure that the site functionalities are available at all times. Success or failure rates of these tests jobs are individually monitored. Test definitions and results are stored in a database and made available to users and site administrators through a web interface. In this work we present the recent developments of the GangaRobot framework. GangaRobot monitors the outcome of functional tests, creates a blacklist of sites failing the tests, and exports the results to the ATLAS Site Status Board (SSB) and to the Service Availability Monitor (SAM), providing on the one hand a fast way to identify systematic or temporary site failures, and on the other hand allowing for an effective distribution of the work load on the available resources.

  2. Quantifying the effect of aerosol on vertical velocity and effective terminal velocity in warm convective clouds

    NASA Astrophysics Data System (ADS)

    Dagan, Guy; Koren, Ilan; Altaratz, Orit

    2018-05-01

    Better representation of cloud-aerosol interactions is crucial for an improved understanding of natural and anthropogenic effects on climate. Recent studies have shown that the overall aerosol effect on warm convective clouds is non-monotonic. Here, we reduce the system's dimensions to its center of gravity (COG), enabling distillation and simplification of the overall trend and its temporal evolution. Within the COG framework, we show that the aerosol effects are nicely reflected by the interplay of the system's characteristic vertical velocities, namely the updraft (w) and the effective terminal velocity (η). The system's vertical velocities can be regarded as a sensitive measure for the evolution of the overall trends with time. Using a bin-microphysics cloud-scale model, we analyze and follow the trends of the aerosol effect on the magnitude and timing of w and η, and therefore the overall vertical COG velocity. Large eddy simulation (LES) model runs are used to upscale the analyzed trends to the cloud-field scale and study how the aerosol effects on the temporal evolution of the field's thermodynamic properties are reflected by the interplay between the two velocities. Our results suggest that aerosol effects on air vertical motion and droplet mobility imply an effect on the way in which water is distributed along the atmospheric column. Moreover, the interplay between w and η predicts the overall trend of the field's thermodynamic instability. These factors have an important effect on the local energy balance.
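
    A minimal Python sketch of the diagnostic the COG framework reduces to: the condensate-weighted vertical center of gravity of a single column (the profile below is synthetic); differencing COG between output times gives the vertical COG velocity that is compared against w and η.

      # Minimal sketch: liquid-water-weighted vertical center of gravity (COG) of a
      # single model column; the profile below is synthetic, for illustration only.
      import numpy as np

      z = np.linspace(0.0, 4000.0, 81)                      # height (m)
      q_l = np.exp(-0.5 * ((z - 1500.0) / 400.0) ** 2)      # cloud water proxy (arbitrary units)

      cog = np.sum(q_l * z) / np.sum(q_l)                   # mass-weighted mean height
      print(f"COG height: {cog:.0f} m")

      # Differencing COG between output times gives the vertical COG velocity that the
      # study relates to the updraft w and the effective terminal velocity eta.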

  3. Ifcwall Reconstruction from Unstructured Point Clouds

    NASA Astrophysics Data System (ADS)

    Bassier, M.; Klein, R.; Van Genechten, B.; Vergauwen, M.

    2018-05-01

    The automated reconstruction of Building Information Modeling (BIM) objects from point cloud data is still ongoing research. A key aspect is the creation of accurate wall geometry as it forms the basis for further reconstruction of objects in a BIM. After segmenting and classifying the initial point cloud, the labelled segments are processed and the wall topology is reconstructed. However, the procedure is challenging due to noise, occlusions, and the complexity of the input data. In this work, a method is presented to automatically reconstruct consistent wall geometry from point clouds. More specifically, the use of room information is proposed to aid the wall topology creation. First, a set of partial walls is constructed based on classified planar primitives. Next, the rooms are identified using the retrieved wall information along with the floors and ceilings. The wall topology is computed by the intersection of the partial walls conditioned on the room information. The final wall geometry is defined by creating IfcWallStandardCase objects conforming to the IFC4 standard. The result is a set of walls according to the as-built conditions of a building. The experiments show that the proposed method is a reliable framework for wall reconstruction from unstructured point cloud data. Also, the implementation of room information reduces the rate of false positives for the wall topology. Given the walls, ceilings and floors, 94% of the rooms are correctly identified. A key advantage of the proposed method is that it deals with complex rooms and is not bound to single storeys.

  4. Cloudbursting - Solving the 3-body problem

    NASA Astrophysics Data System (ADS)

    Chang, G.; Heistand, S.; Vakhnin, A.; Huang, T.; Zimdars, P.; Hua, H.; Hood, R.; Koenig, J.; Mehrotra, P.; Little, M. M.; Law, E.

    2014-12-01

    Many science projects in the future will be accomplished through collaboration among two or more NASA centers along with, potentially, external scientists. Science teams will be composed of more geographically dispersed individuals and groups. However, the current computing environment does not make this easy and seamless. By being able to share computing resources among members of a multi-center team working on a science/engineering project, limited pre-competition funds could be more efficiently applied and technical work could be conducted more effectively with less time spent moving data or waiting for computing resources to free up. Based on the work from a NASA CIO IT Labs task, this presentation will highlight our prototype work in assessing the feasibility and identifying the obstacles, both technical and managerial, of performing "Cloudbursting" among private clouds located at three different centers. We will demonstrate the use of private cloud computing infrastructure at the Jet Propulsion Laboratory, Langley Research Center, and Ames Research Center to provide elastic computation to each other to perform parallel Earth Science data imaging. We leverage elastic load balancing and auto-scaling features at each data center so that each location can independently define how many resources to allocate to a particular job that was "bursted" from another data center and demonstrate that compute capacity scales up and down with the job. We will also discuss future work in the area, which could include the use of cloud infrastructure from different cloud framework providers as well as other cloud service providers.

  5. 3D Point Cloud Model Colorization by Dense Registration of Digital Images

    NASA Astrophysics Data System (ADS)

    Crombez, N.; Caron, G.; Mouaddib, E.

    2015-02-01

    Architectural heritage is a historic and artistic property which has to be protected, preserved, restored and shown to the public. Modern tools like 3D laser scanners are more and more used in heritage documentation. Most of the time, the 3D laser scanner is complemented by a digital camera which is used to enrich the accurate geometric information with the scanned objects' colors. However, the photometric quality of the acquired point clouds is generally rather low because of several problems presented below. We propose an accurate method for registering digital images acquired from arbitrary viewpoints onto point clouds, which is a crucial step for a good colorization by color projection. We express this image-to-geometry registration as a pose estimation problem. The camera pose is computed using the entire image intensities under a photometric visual and virtual servoing (VVS) framework. The camera extrinsic and intrinsic parameters are automatically estimated. Because we estimate the intrinsic parameters, we do not need any information about the camera which took the digital image. Finally, when the point cloud model and the digital image are correctly registered, we project the 3D model into the digital image frame and assign new colors to the visible points. The performance of the approach is proven in simulation and real experiments on indoor and outdoor datasets of the cathedral of Amiens, which highlight the success of our method, leading to point clouds with better photometric quality and resolution.
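
    Once the pose and intrinsics are known, the colorization step itself amounts to a standard pinhole projection followed by pixel sampling. The sketch below is an assumed illustration of that final step only; the VVS registration and hidden-point removal are not reproduced.

```python
# Assumed sketch of the final colorization step: project 3D points into a
# registered image using pinhole intrinsics K (zero skew assumed) and pose
# (R, t), then sample pixel colors. Occlusion handling is deliberately omitted.
import numpy as np

def colorize(points, image, K, R, t):
    """points: (N,3) world coords; image: (H,W,3) uint8;
    K: 3x3 intrinsics; R, t: world-to-camera rotation and translation."""
    cam = points @ R.T + t                      # camera-frame coordinates
    in_front = cam[:, 2] > 1e-6
    z = np.where(in_front, cam[:, 2], 1.0)      # avoid division by zero
    uv = (cam[:, :2] * np.array([K[0, 0], K[1, 1]])) / z[:, None] \
         + np.array([K[0, 2], K[1, 2]])
    u, v = np.round(uv[:, 0]).astype(int), np.round(uv[:, 1]).astype(int)
    h, w = image.shape[:2]
    visible = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    colors = np.zeros((len(points), 3), dtype=np.uint8)
    colors[visible] = image[v[visible], u[visible]]
    return colors, visible
```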

  6. The chemical evolution of molecular clouds

    NASA Technical Reports Server (NTRS)

    Iglesias, E.

    1977-01-01

    The nonequilibrium chemistry of dense molecular clouds (10,000 to 1 million hydrogen molecules per cu cm) is studied in the framework of a model that includes the latest published chemical data and most of the recent theoretical advances. In this model the only important external source of ionization is assumed to be high-energy cosmic-ray bombardment; standard charge-transfer reactions are taken into account as well as reactions that transfer charge from molecular ions to trace-metal atoms. Schemes are proposed for the synthesis of such species as NCO, HNCO, and CN. The role played by adsorption and condensation of molecules on the surface of dust grains is investigated, and effects on the chemical evolution of a dense molecular cloud are considered which result from varying the total density or the elemental abundances and from assuming negligible or severe condensation of gaseous species on dust grains. It is shown that the chemical-equilibrium time scale is given approximately by the depletion times of oxygen and nitrogen when the condensation efficiency is negligible; that this time scale is probably in the range from 1 to 4 million years, depending on the elemental composition and initial conditions in the cloud; and that this time scale is insensitive to variations in the total density.

  7. Optimizing the resource usage in Cloud based environments: the Synergy approach

    NASA Astrophysics Data System (ADS)

    Zangrando, L.; Llorens, V.; Sgaravatto, M.; Verlato, M.

    2017-10-01

    Managing resource allocation in a cloud-based data centre serving multiple virtual organizations is a challenging issue. In fact, while batch systems are able to allocate resources to different user groups according to specific shares imposed by the data centre administrator, without a static partitioning of such resources, this is not so straightforward in the most common cloud frameworks, e.g. OpenStack. In the current OpenStack implementation, it is only possible to grant fixed quotas to the different user groups, and these quotas cannot be exceeded by one group even if there are unused resources allocated to other groups. Moreover, in the existing OpenStack implementation, when no resources are available, new requests are simply rejected: it is then up to the client to re-issue the request later. The recently started EU-funded INDIGO-DataCloud project is addressing this issue through “Synergy”, a new advanced scheduling service targeted at OpenStack. Synergy adopts a fair-share model for resource provisioning which guarantees that resources are distributed among users following the fair-share policies defined by the administrator, also taking into account the past usage of such resources. We present the architecture of Synergy, the status of its implementation, some preliminary results and the foreseen evolution of the service.
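
    A toy sketch of the fair-share idea is shown below. It is not Synergy's actual algorithm; the exponential decay form and the example shares are assumptions borrowed from common batch-system schedulers.

```python
# Illustrative fair-share sketch: rank groups by how far their historical usage
# falls below their administrator-defined share. The halving form is assumed.
def fairshare_priority(share: float, usage: float, total_usage: float) -> float:
    """share: target fraction of the resources for this group (0..1);
    usage / total_usage: this group's fraction of recent consumption."""
    consumed = usage / total_usage if total_usage > 0 else 0.0
    # Groups under their share score above 0.5, groups over it drop below 0.5.
    return 2.0 ** (-consumed / share) if share > 0 else 0.0

groups = {"cms": (0.6, 120.0), "atlas": (0.3, 30.0), "ops": (0.1, 50.0)}
total = sum(u for _, u in groups.values())
ranked = sorted(groups,
                key=lambda g: fairshare_priority(groups[g][0], groups[g][1], total),
                reverse=True)
print(ranked)  # the next request is served from the most under-served group
```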

  8. SparkSeq: fast, scalable and cloud-ready tool for the interactive genomic data analysis with nucleotide precision.

    PubMed

    Wiewiórka, Marek S; Messina, Antonio; Pacholewska, Alicja; Maffioletti, Sergio; Gawrysiak, Piotr; Okoniewski, Michał J

    2014-09-15

    Many time-consuming analyses of next-generation sequencing data can be addressed with modern cloud computing. The Apache Hadoop-based solutions have become popular in genomics because of their scalability in a cloud infrastructure. So far, most of these tools have been used for batch data processing rather than interactive data querying. The SparkSeq software has been created to take advantage of a new MapReduce framework, Apache Spark, for next-generation sequencing data. SparkSeq is a general-purpose, flexible and easily extendable library for genomic cloud computing. It can be used to build genomic analysis pipelines in Scala and run them in an interactive way. SparkSeq opens up the possibility of customized ad hoc secondary analyses and iterative machine learning algorithms. This article demonstrates its scalability and overall fast performance by running the analyses of sequencing datasets. Tests of SparkSeq also prove that the use of cache and HDFS block size can be tuned for the optimal performance on multiple worker nodes. Available under open source Apache 2.0 license: https://bitbucket.org/mwiewiorka/sparkseq/. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
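
    SparkSeq itself is a Scala library; as a language-shifted, hedged illustration of the interactive Spark-style analysis it enables, a PySpark sketch counting alignments per chromosome might look like the following. The input path and the simplistic tab-separated format are assumptions.

```python
# Hedged illustration only (PySpark, not SparkSeq's Scala API): an interactive,
# cacheable ad hoc query over alignment records stored on HDFS.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("coverage-sketch").getOrCreate()
sc = spark.sparkContext

# Assume one alignment per line: "<read_id>\t<chromosome>\t<position>\t..."
alignments = sc.textFile("hdfs:///data/sample.aligned.tsv")

reads_per_chrom = (alignments
                   .map(lambda line: line.split("\t"))
                   .filter(lambda fields: len(fields) >= 3)
                   .map(lambda fields: (fields[1], 1))
                   .reduceByKey(lambda a, b: a + b))

# cache() keeps the RDD in memory for further interactive queries, which is one
# of the tuning knobs the article highlights alongside HDFS block size.
reads_per_chrom.cache()
print(reads_per_chrom.take(5))
```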

  9. Direct Lagrangian tracking simulations of particles in vertically-developing atmospheric clouds

    NASA Astrophysics Data System (ADS)

    Onishi, Ryo; Kunishima, Yuichi

    2017-11-01

    We have been developing the Lagrangian Cloud Simulator (LCS), which follows the so-called Euler-Lagrangian framework, where flow motion and scalar transport (i.e., temperature and humidity) are computed with the Eulerian method and particle motion with the Lagrangian method. The LCS simulation considers the hydrodynamic interaction between approaching particles for robust collision detection. This leads to reliable simulations of the collision growth of cloud droplets. Recently the activation process, in which aerosol particles become tiny liquid droplets, has been implemented in the LCS. The present LCS can therefore consider the whole chain of warm-rain precipitation processes: activation, condensation, collision and drop precipitation. In this talk, after briefly introducing the LCS, we will show kinematic simulations using the LCS for a quasi-one-dimensional domain, i.e., a vertically elongated 3D domain. They are compared with one-dimensional kinematic simulations using a spectral-bin cloud microphysics scheme, which is based on the Eulerian method. The comparisons show fairly good agreement with small discrepancies, the source of which will be presented. The Lagrangian statistics, obtained for the first time for the vertical domain, will be the center of discussion. This research was supported by MEXT as "Exploratory Challenge on Post-K computer" (Frontiers of Basic Science: Challenging the Limits).
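
    A minimal sketch of the Euler-Lagrangian idea, droplets advected through an Eulerian velocity and supersaturation field with a toy condensational growth law, is given below. It is an assumption-laden illustration, not the LCS code; the growth coefficient and the uniform fields are placeholders, and collisions and activation are omitted.

```python
# Toy Euler-Lagrangian sketch: Lagrangian droplets advance through Eulerian
# fields; condensation uses dr/dt = a*S/r with an assumed coefficient a.
import numpy as np

def advance_droplets(pos, radius, velocity_field, supersat_field, dt, a=1e-10):
    """pos: (N,3) positions [m]; radius: (N,) radii [m];
    velocity_field(pos) -> (N,3) m/s; supersat_field(pos) -> (N,) fractional S."""
    pos = pos + velocity_field(pos) * dt                      # Lagrangian transport
    S = supersat_field(pos)
    radius = np.sqrt(np.maximum(radius**2 + 2.0 * a * S * dt, 1e-18))  # r dr = a S dt
    return pos, radius

# Placeholder fields: a uniform 1 m/s updraft and 0.5% supersaturation everywhere.
updraft = lambda p: np.tile([0.0, 0.0, 1.0], (len(p), 1))
supersat = lambda p: np.full(len(p), 0.005)
pos = np.random.rand(1000, 3) * 100.0      # droplets in a 100 m box
radius = np.full(1000, 1e-6)               # 1 micron initial radius
for _ in range(600):                       # 10 minutes at dt = 1 s
    pos, radius = advance_droplets(pos, radius, updraft, supersat, dt=1.0)
print(radius.mean())
```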

  10. 3D Scanning of Live Pigs System and its Application in Body Measurements

    NASA Astrophysics Data System (ADS)

    Guo, H.; Wang, K.; Su, W.; Zhu, D. H.; Liu, W. L.; Xing, Ch.; Chen, Z. R.

    2017-09-01

    The shape of a live pig is an important indicator of its health and value, whether for breeding or for carcass quality. This paper implements a prototype system for 3D scanning of the body surface of a single live pig based on two consumer depth cameras, utilizing 3D point cloud data. These cameras are calibrated in advance to have a common coordinate system. A live 3D point cloud stream of the moving pig is obtained by two Xtion Pro Live sensors from different viewpoints simultaneously. A novel detection method is proposed and applied to automatically detect the frames containing pigs with the correct posture from the point cloud stream, according to the geometric characteristics of the pig's shape. The proposed method is incorporated in a hybrid scheme that serves as the preprocessing step in a body measurements framework for pigs. Experimental results show the portability of our scanning system and the effectiveness of our detection method. Furthermore, the updated point cloud preprocessing software for livestock body measurements can be downloaded freely from https://github.com/LiveStockShapeAnalysis by the livestock industry and research community, and can be used for monitoring livestock growth status.
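
    The fusion step implied by the common coordinate system can be illustrated with a minimal sketch: each camera's cloud is mapped through its calibrated extrinsic transform and the results are concatenated. This is an assumed illustration, not the authors' software; the 4x4 matrices below stand in for hypothetical calibration results.

```python
# Hedged sketch: fuse the point clouds of two depth cameras that share a
# calibrated common coordinate system. The extrinsics are hypothetical.
import numpy as np

def to_common_frame(points, T):
    """Apply a 4x4 rigid transform T (camera -> common frame) to (N,3) points."""
    homog = np.hstack([points, np.ones((len(points), 1))])
    return (homog @ T.T)[:, :3]

def fuse(cloud_a, T_a, cloud_b, T_b):
    """Merge two per-camera clouds after mapping both into the common frame."""
    return np.vstack([to_common_frame(cloud_a, T_a), to_common_frame(cloud_b, T_b)])

# Hypothetical calibration: camera B rotated 90 degrees about the vertical axis
# and shifted 2 m sideways relative to the common frame.
T_a = np.eye(4)
T_b = np.array([[ 0.0, 0.0, 1.0, 2.0],
                [ 0.0, 1.0, 0.0, 0.0],
                [-1.0, 0.0, 0.0, 0.0],
                [ 0.0, 0.0, 0.0, 1.0]])
cloud_a = np.random.rand(5000, 3)
cloud_b = np.random.rand(5000, 3)
merged = fuse(cloud_a, T_a, cloud_b, T_b)
print(merged.shape)  # (10000, 3)
```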

  11. Impact of capturing rainfall scavenging intermittency using cloud superparameterization on simulated continental scale wildfire smoke transport

    NASA Astrophysics Data System (ADS)

    Pritchard, M. S.; Kooperman, G. J.; Zhao, Z.; Wang, M.; Russell, L. M.; Somerville, R. C.; Ghan, S. J.

    2011-12-01

    Evaluating the fidelity of new aerosol physics in climate models is confounded by uncertainties in source emissions, systematic error in cloud parameterizations, and inadequate sampling of long-range plume concentrations. To explore the degree to which cloud parameterizations distort aerosol processing and scavenging, the Pacific Northwest National Laboratory (PNNL) Aerosol-Enabled Multi-Scale Modeling Framework (AE-MMF), a superparameterized branch of the Community Atmosphere Model Version 5 (CAM5), is applied to represent the unusually active and well sampled North American wildfire season in 2004. In the AE-MMF approach, the evolution of double moment aerosols in the exterior global resolved scale is linked explicitly to convective statistics harvested from an interior cloud resolving scale. The model is configured in retroactive nudged mode to observationally constrain synoptic meteorology, and Arctic wildfire activity is prescribed at high space/time resolution using data from the Global Fire Emissions Database. Comparisons against standard CAM5 bracket the effect of superparameterization to isolate the role of capturing rainfall intermittency on the bulk characteristics of 2004 Arctic plume transport. Ground based lidar and in situ aircraft wildfire plume constraints from the International Consortium for Atmospheric Research on Transport and Transformation field campaign are used as a baseline for model evaluation.

  12. Impact of radiation frequency, precipitation radiative forcing, and radiation column aggregation on convection-permitting West African monsoon simulations

    NASA Astrophysics Data System (ADS)

    Matsui, Toshi; Zhang, Sara Q.; Lang, Stephen E.; Tao, Wei-Kuo; Ichoku, Charles; Peters-Lidard, Christa D.

    2018-03-01

    In this study, the impact of different configurations of the Goddard radiation scheme on convection-permitting simulations (CPSs) of the West African monsoon (WAM) is investigated using the NASA-Unified WRF (NU-WRF). These CPSs had 3 km grid spacing to explicitly simulate the evolution of mesoscale convective systems (MCSs) and their interaction with radiative processes across the WAM domain and were able to reproduce realistic precipitation and energy budget fields when compared with satellite data, although low clouds were overestimated. Sensitivity experiments reveal that (1) lowering the radiation update frequency (i.e., longer radiation update time) increases precipitation and cloudiness over the WAM region by enhancing the monsoon circulation, (2) deactivation of precipitation radiative forcing suppresses cloudiness over the WAM region, and (3) aggregating radiation columns reduces low clouds over ocean and tropical West Africa. The changes in radiation configuration immediately modulate the radiative heating and low clouds over ocean. On the 2nd day of the simulations, patterns of latitudinal air temperature profiles were already similar to the patterns of monthly composites for all radiation sensitivity experiments. Low cloud maintenance within the WAM system is tightly connected with radiation processes; thus, proper coupling between microphysics and radiation processes must be established for each modeling framework.

  13. Performance Management of High Performance Computing for Medical Image Processing in Amazon Web Services.

    PubMed

    Bao, Shunxing; Damon, Stephen M; Landman, Bennett A; Gokhale, Aniruddha

    2016-02-27

    Adopting high performance cloud computing for medical image processing is a popular trend given the pressing needs of large studies. Amazon Web Services (AWS) provide reliable, on-demand, and inexpensive cloud computing services. Our research objective is to implement an affordable, scalable and easy-to-use AWS framework for the Java Image Science Toolkit (JIST). JIST is a plugin for Medical-Image Processing, Analysis, and Visualization (MIPAV) that provides a graphical pipeline implementation allowing users to quickly test and develop pipelines. JIST is DRMAA-compliant allowing it to run on portable batch system grids. However, as new processing methods are implemented and developed, memory may often be a bottleneck for not only lab computers, but also possibly some local grids. Integrating JIST with the AWS cloud alleviates these possible restrictions and does not require users to have deep knowledge of programming in Java. Workflow definition/management and cloud configurations are two key challenges in this research. Using a simple unified control panel, users have the ability to set the numbers of nodes and select from a variety of pre-configured AWS EC2 nodes with different numbers of processors and memory storage. Intuitively, we configured Amazon S3 storage to be mounted by pay-for-use Amazon EC2 instances. Hence, S3 storage is recognized as a shared cloud resource. The Amazon EC2 instances provide pre-installs of all necessary packages to run JIST. This work presents an implementation that facilitates the integration of JIST with AWS. We describe the theoretical cost/benefit formulae to decide between local serial execution versus cloud computing and apply this analysis to an empirical diffusion tensor imaging pipeline.
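
    The flavor of the cost/benefit comparison can be sketched as below; the actual formulae are in the paper, and every price, runtime and overhead here is a hypothetical placeholder.

```python
# Hedged sketch of a local-versus-cloud break-even estimate. All rates and
# overheads are invented for illustration, not taken from the article.
def local_cost(n_subjects, hours_per_subject, workstation_hourly_cost=0.50):
    """Serial execution on one lab workstation: wall time scales with n_subjects."""
    wall_hours = n_subjects * hours_per_subject
    return wall_hours * workstation_hourly_cost, wall_hours

def cloud_cost(n_subjects, hours_per_subject, n_nodes, ec2_hourly_rate=0.45,
               s3_overhead_hours=0.25):
    """Parallel execution on n_nodes EC2 instances plus a data-staging overhead."""
    wall_hours = (n_subjects / n_nodes) * hours_per_subject + s3_overhead_hours
    return n_nodes * wall_hours * ec2_hourly_rate, wall_hours

for nodes in (1, 4, 16):
    cost, hours = cloud_cost(n_subjects=100, hours_per_subject=2.0, n_nodes=nodes)
    print(f"{nodes:>2} nodes: ${cost:7.2f}, {hours:6.1f} h wall time")
cost, hours = local_cost(100, 2.0)
print(f"local    : ${cost:7.2f}, {hours:6.1f} h wall time")
```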

  14. Parameterizing correlations between hydrometeor species in mixed-phase Arctic clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, Vincent E.; Nielsen, Brandon J.; Fan, Jiwen

    2011-08-16

    Mixed-phase Arctic clouds, like other clouds, contain small-scale variability in hydrometeor fields, such as cloud water or snow mixing ratio. This variability may be worth parameterizing in coarse-resolution numerical models. In particular, for modeling processes such as accretion and aggregation, it would be useful to parameterize subgrid correlations among hydrometeor species. However, one difficulty is that there exist many hydrometeor species and many microphysical processes, leading to complexity and computational expense. Existing lower and upper bounds (inequalities) on linear correlation coefficients provide useful guidance, but these bounds are too loose to serve directly as a method to predict subgrid correlations. Therefore, this paper proposes an alternative method that is based on a blend of theory and empiricism. The method begins with the spherical parameterization framework of Pinheiro and Bates (1996), which expresses the correlation matrix in terms of its Cholesky factorization. The values of the elements of the Cholesky matrix are parameterized here using a cosine row-wise formula that is inspired by the aforementioned bounds on correlations. The method has three advantages: 1) the computational expense is tolerable; 2) the correlations are, by construction, guaranteed to be consistent with each other; and 3) the methodology is fairly general and hence may be applicable to other problems. The method is tested non-interactively using simulations of three Arctic mixed-phase cloud cases from two different field experiments: the Indirect and Semi-Direct Aerosol Campaign (ISDAC) and the Mixed-Phase Arctic Cloud Experiment (M-PACE). Benchmark simulations are performed using a large-eddy simulation (LES) model that includes a bin microphysical scheme. The correlations estimated by the new method satisfactorily approximate the correlations produced by the LES.
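
    The spherical parameterization that the method starts from can be sketched directly: any set of angles yields a Cholesky factor with unit-norm rows, so the resulting matrix is guaranteed to be a valid correlation matrix. The specific cosine row-wise formula for choosing the angles is the paper's contribution and is not reproduced; the angles below are placeholders.

```python
# Sketch of the Pinheiro & Bates (1996) spherical parameterization: build a
# Cholesky factor L from angles, then C = L L^T is a valid correlation matrix.
import numpy as np

def cholesky_from_angles(theta):
    """theta: (n, n) array of angles; only the strictly lower triangle is used."""
    n = theta.shape[0]
    L = np.zeros((n, n))
    for i in range(n):
        prod_sines = 1.0
        for j in range(i):
            L[i, j] = np.cos(theta[i, j]) * prod_sines
            prod_sines *= np.sin(theta[i, j])
        L[i, i] = prod_sines          # each row has unit norm by construction
    return L

n = 4                                  # e.g. cloud water, rain, ice, snow
theta = np.full((n, n), np.pi / 3)     # placeholder angles
L = cholesky_from_angles(theta)
C = L @ L.T
print(np.allclose(np.diag(C), 1.0))               # True: unit diagonal
print(np.all(np.linalg.eigvalsh(C) >= -1e-12))    # True: positive semi-definite
```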

  15. The role of global cloud climatologies in validating numerical models

    NASA Technical Reports Server (NTRS)

    HARSHVARDHAN

    1991-01-01

    Reliable estimates of the components of the surface radiation budget are important in studies of ocean-atmosphere interaction, land-atmosphere interaction, ocean circulation and in the validation of radiation schemes used in climate models. The methods currently under consideration must necessarily make certain assumptions regarding both the presence of clouds and their vertical extent. Because of the uncertainties in assumed cloudiness, all these methods involve perhaps unacceptable uncertainties. Here, a theoretical framework that avoids the explicit computation of cloud fraction and the location of cloud base in estimating the surface longwave radiation is presented. Estimates of the global surface downward fluxes and the oceanic surface net upward fluxes were made for four months (April, July, October and January) in 1985 to 1986. These estimates are based on a relationship between cloud radiative forcing at the top of the atmosphere and the surface obtained from a general circulation model. The radiation code is the version used in the UCLA/GLA general circulation model (GCM). The longwave cloud radiative forcing at the top of the atmosphere as obtained from Earth Radiation Budget Experiment (ERBE) measurements is used to compute the forcing at the surface by means of the GCM-derived relationship. This, along with clear-sky fluxes from the computations, yields maps of the downward longwave fluxes and net upward longwave fluxes at the surface. The calculated results are discussed and analyzed. The results are consistent with current meteorological knowledge and explainable on the basis of previous theoretical and observational work; therefore, it can be concluded that this method is applicable as one of the ways to obtain the surface longwave radiation fields from currently available satellite data.

  16. Performance management of high performance computing for medical image processing in Amazon Web Services

    NASA Astrophysics Data System (ADS)

    Bao, Shunxing; Damon, Stephen M.; Landman, Bennett A.; Gokhale, Aniruddha

    2016-03-01

    Adopting high performance cloud computing for medical image processing is a popular trend given the pressing needs of large studies. Amazon Web Services (AWS) provide reliable, on-demand, and inexpensive cloud computing services. Our research objective is to implement an affordable, scalable and easy-to-use AWS framework for the Java Image Science Toolkit (JIST). JIST is a plugin for Medical-Image Processing, Analysis, and Visualization (MIPAV) that provides a graphical pipeline implementation allowing users to quickly test and develop pipelines. JIST is DRMAA-compliant allowing it to run on portable batch system grids. However, as new processing methods are implemented and developed, memory may often be a bottleneck for not only lab computers, but also possibly some local grids. Integrating JIST with the AWS cloud alleviates these possible restrictions and does not require users to have deep knowledge of programming in Java. Workflow definition/management and cloud configurations are two key challenges in this research. Using a simple unified control panel, users have the ability to set the numbers of nodes and select from a variety of pre-configured AWS EC2 nodes with different numbers of processors and memory storage. Intuitively, we configured Amazon S3 storage to be mounted by pay-for-use Amazon EC2 instances. Hence, S3 storage is recognized as a shared cloud resource. The Amazon EC2 instances provide pre-installs of all necessary packages to run JIST. This work presents an implementation that facilitates the integration of JIST with AWS. We describe the theoretical cost/benefit formulae to decide between local serial execution versus cloud computing and apply this analysis to an empirical diffusion tensor imaging pipeline.

  17. Performance Management of High Performance Computing for Medical Image Processing in Amazon Web Services

    PubMed Central

    Bao, Shunxing; Damon, Stephen M.; Landman, Bennett A.; Gokhale, Aniruddha

    2016-01-01

    Adopting high performance cloud computing for medical image processing is a popular trend given the pressing needs of large studies. Amazon Web Services (AWS) provide reliable, on-demand, and inexpensive cloud computing services. Our research objective is to implement an affordable, scalable and easy-to-use AWS framework for the Java Image Science Toolkit (JIST). JIST is a plugin for Medical-Image Processing, Analysis, and Visualization (MIPAV) that provides a graphical pipeline implementation allowing users to quickly test and develop pipelines. JIST is DRMAA-compliant allowing it to run on portable batch system grids. However, as new processing methods are implemented and developed, memory may often be a bottleneck for not only lab computers, but also possibly some local grids. Integrating JIST with the AWS cloud alleviates these possible restrictions and does not require users to have deep knowledge of programming in Java. Workflow definition/management and cloud configurations are two key challenges in this research. Using a simple unified control panel, users have the ability to set the numbers of nodes and select from a variety of pre-configured AWS EC2 nodes with different numbers of processors and memory storage. Intuitively, we configured Amazon S3 storage to be mounted by pay-for-use Amazon EC2 instances. Hence, S3 storage is recognized as a shared cloud resource. The Amazon EC2 instances provide pre-installs of all necessary packages to run JIST. This work presents an implementation that facilitates the integration of JIST with AWS. We describe the theoretical cost/benefit formulae to decide between local serial execution versus cloud computing and apply this analysis to an empirical diffusion tensor imaging pipeline. PMID:27127335

  18. Illustration of microphysical processes in Amazonian deep convective clouds in the gamma phase space: introduction and potential applications

    NASA Astrophysics Data System (ADS)

    Cecchini, Micael A.; Machado, Luiz A. T.; Wendisch, Manfred; Costa, Anja; Krämer, Martina; Andreae, Meinrat O.; Afchine, Armin; Albrecht, Rachel I.; Artaxo, Paulo; Borrmann, Stephan; Fütterer, Daniel; Klimach, Thomas; Mahnke, Christoph; Martin, Scot T.; Minikin, Andreas; Molleker, Sergej; Pardo, Lianet H.; Pöhlker, Christopher; Pöhlker, Mira L.; Pöschl, Ulrich; Rosenfeld, Daniel; Weinzierl, Bernadett

    2017-12-01

    The behavior of tropical clouds remains a major open scientific question, resulting in poor representation by models. One challenge is to realistically reproduce cloud droplet size distributions (DSDs) and their evolution over time and space. Many applications, not limited to models, use the gamma function to represent DSDs. However, even though the statistical characteristics of the gamma parameters have been widely studied, there is almost no study dedicated to understanding the phase space of this function and the associated physics. This phase space can be defined by the three parameters that define the DSD intercept, shape, and curvature. Gamma phase space may provide a common framework for parameterizations and intercomparisons. Here, we introduce the phase space approach and its characteristics, focusing on warm-phase microphysical cloud properties and the transition to the mixed-phase layer. We show that trajectories in this phase space can represent DSD evolution and can be related to growth processes. Condensational and collisional growth may be interpreted as pseudo-forces that induce displacements in opposite directions within the phase space. The actually observed movements in the phase space are a result of the combination of such pseudo-forces. Additionally, aerosol effects can be evaluated given their significant impact on DSDs. The DSDs associated with liquid droplets that favor cloud glaciation can be delimited in the phase space, which can help models to adequately predict the transition to the mixed phase. We also consider possible ways to constrain the DSD in two-moment bulk microphysics schemes, in which the relative dispersion parameter of the DSD can play a significant role. Overall, the gamma phase space approach can be an invaluable tool for studying cloud microphysical evolution and can be readily applied in many scenarios that rely on gamma DSDs.
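
    The gamma DSD form that underlies this phase space can be written out directly. The sketch below uses the common three-parameter representation N(D) = N0 D^mu exp(-lambda D); the parameter values are placeholders, and the phase-space trajectories discussed in the article are not reproduced.

```python
# Hedged sketch of a three-parameter gamma drop size distribution and its
# moments. Parameter values are illustrative placeholders only.
import numpy as np

def gamma_dsd(D, N0, mu, lam):
    """Droplet number density as a function of diameter D."""
    return N0 * D**mu * np.exp(-lam * D)

def moment(D, n_D, k):
    """k-th moment of the discretized DSD (k=0 ~ number, k=3 ~ mass)."""
    return np.trapz(n_D * D**k, D)

D = np.linspace(1e-6, 50e-6, 500)              # droplet diameters [m]
n_D = gamma_dsd(D, N0=1e14, mu=2.0, lam=4e5)   # placeholder parameters
print("number concentration ~", moment(D, n_D, 0))
print("mean diameter ~", moment(D, n_D, 1) / moment(D, n_D, 0))
```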

  19. The role of ensemble-based statistics in variational assimilation of cloud-affected observations from infrared imagers

    NASA Astrophysics Data System (ADS)

    Hacker, Joshua; Vandenberghe, Francois; Jung, Byoung-Jo; Snyder, Chris

    2017-04-01

    Effective assimilation of cloud-affected radiance observations from space-borne imagers, with the aim of improving cloud analysis and forecasting, has proven to be difficult. Large observation biases, nonlinear observation operators, and non-Gaussian innovation statistics present many challenges. Ensemble-variational data assimilation (EnVar) systems offer the benefits of flow-dependent background error statistics from an ensemble, and the ability of variational minimization to handle nonlinearity. The specific benefits of ensemble statistics, relative to the static background errors more commonly used in variational systems, have not been quantified for the problem of assimilating cloudy radiances. A simple experiment framework is constructed with a regional NWP model and an operational variational data assimilation system, to provide the basis for understanding the importance of ensemble statistics in cloudy radiance assimilation. Restricting the observations to those corresponding to clouds in the background forecast leads to innovations that are more Gaussian. The number of large innovations is reduced compared to the more general case of all observations, but not eliminated. The Huber norm is investigated to handle the fat tails of the distributions, and to allow more observations to be assimilated without the need for strict background checks that would eliminate them. Comparing assimilation using only ensemble background error statistics with assimilation using only static background error statistics elucidates the importance of the ensemble statistics. Although the cost functions in both experiments converge to similar values after sufficient outer-loop iterations, the resulting cloud water, ice, and snow content are greater in the ensemble-based analysis. The subsequent forecasts from the ensemble-based analysis also retain more condensed water species, indicating that the local environment is more supportive of clouds. In this presentation we provide details that explain the apparent benefit of using ensembles for cloudy radiance assimilation in an EnVar context.
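
    The Huber norm mentioned above is quadratic near zero and linear in the tails, so large innovations are down-weighted rather than rejected outright. A minimal sketch follows; the threshold value is an assumption, not the one used in the study.

```python
# Sketch of a Huber-type observation cost: quadratic for small normalized
# innovations, linear beyond an assumed threshold delta.
import numpy as np

def huber_cost(innovation, sigma_o, delta=1.5):
    """Per-observation cost for normalized innovation d = (y - H(x)) / sigma_o."""
    d = np.abs(innovation) / sigma_o
    return np.where(d <= delta, 0.5 * d**2, delta * d - 0.5 * delta**2)

innov = np.array([0.2, 1.0, 3.0, 8.0])   # e.g. brightness-temperature departures [K]
print(huber_cost(innov, sigma_o=1.0))    # tails grow linearly, not quadratically
```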

  20. A new method for assessing surface solar irradiance: Heliosat-4

    NASA Astrophysics Data System (ADS)

    Qu, Z.; Oumbe, A.; Blanc, P.; Lefèvre, M.; Wald, L.; Schroedter-Homscheidt, M.; Gesell, G.

    2012-04-01

    Downwelling shortwave irradiance at the surface (SSI) is more and more often assessed by means of satellite-derived estimates of the optical properties of the atmosphere. Performances are judged satisfactory for the time being, but there is an increasing need for the assessment of the direct and diffuse components of the SSI. MINES ParisTech and the German Aerospace Center (DLR) are currently developing the Heliosat-4 method to assess the SSI and its components in a more accurate way than current practices. This method is composed of two parts: a clear-sky module based on the radiative transfer model libRadtran, and a cloud-ground module using two-stream and delta-Eddington approximations for clouds and a database of ground albedo. Advanced products derived from geostationary satellites and recent Earth Observation missions are the inputs of the Heliosat-4 method. Such products are: cloud optical depth, cloud phase, cloud type and cloud coverage from APOLLO of DLR; aerosol optical depth, aerosol type, water vapor in clear sky and ozone from MACC products (FP7); and ground albedo from MODIS of NASA. In this communication, we briefly present Heliosat-4 and focus on its performances. The results of Heliosat-4 for the period 2004-2010 will be compared to the measurements made at five stations within the Baseline Surface Radiation Network. Extensive statistical analysis as well as case studies are performed in order to better understand Heliosat-4 and obtain an in-depth view of its performance, to understand its advantages compared to existing methods and to identify its shortcomings for future improvements. The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement no. 218793 (MACC project) and no. 283576 (MACC-II project).
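
    The two-part structure, a clear-sky irradiance modulated by a cloud term derived from satellite cloud properties, can be caricatured as follows. This is a toy sketch only: the transmittance formula and every constant are crude assumptions standing in for the libRadtran clear-sky module and the two-stream / delta-Eddington cloud-ground module.

```python
# Toy sketch of a clear-sky irradiance combined with an assumed cloud
# transmittance; not the Heliosat-4 code, all constants are placeholders.
import numpy as np

def clear_sky_ghi(cos_zenith, solar_constant=1361.0, atm_transmittance=0.75):
    """Very crude clear-sky global horizontal irradiance [W/m2]."""
    return solar_constant * atm_transmittance * np.maximum(cos_zenith, 0.0)

def cloud_transmittance(tau, g=0.85, cos_zenith=1.0):
    """Assumed two-stream-like transmittance of a non-absorbing cloud layer."""
    return 1.0 / (1.0 + 0.75 * (1.0 - g) * tau / np.maximum(cos_zenith, 0.05))

def all_sky_ghi(cos_zenith, tau, ground_albedo=0.2):
    ghi_clear = clear_sky_ghi(cos_zenith)
    t_cloud = cloud_transmittance(tau, cos_zenith=cos_zenith)
    # Crude enhancement term for multiple reflections between ground and cloud.
    return ghi_clear * t_cloud / (1.0 - 0.5 * ground_albedo * (1.0 - t_cloud))

print(all_sky_ghi(cos_zenith=0.8, tau=10.0))
```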
