NASA Astrophysics Data System (ADS)
Doronin, Alexander; Rushmeier, Holly E.; Meglinski, Igor; Bykov, Alexander V.
2016-03-01
We present a new Monte Carlo based approach for modelling the Bidirectional Scattering-Surface Reflectance Distribution Function (BSSRDF) for accurate rendering of human skin appearance. Variations in both skin tissue structure and the major chromophores are taken into account for different ethnic and age groups. The computational solution utilizes HTML5, accelerated by graphics processing units (GPUs), and is therefore convenient for practical use on most modern computing devices and operating systems. Simulated human skin reflectance spectra, the corresponding skin colours, and examples of 3D face rendering are presented and compared with the results of phantom studies.
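For orientation (a standard Monte Carlo photon transport step, not quoted from the paper): at each interaction the free path length is sampled from the layer's total attenuation coefficient and the photon packet weight is reduced by the single-scattering albedo,
$$ l = -\frac{\ln \xi}{\mu_t}, \qquad w \leftarrow w\,\frac{\mu_s}{\mu_a + \mu_s}, $$
where $\xi$ is a uniform random number in (0,1], and $\mu_a$, $\mu_s$, and $\mu_t = \mu_a + \mu_s$ are the absorption, scattering, and total attenuation coefficients of the current skin layer; the chromophore concentrations enter through $\mu_a$.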
Enhanced backgrounds in scene rendering with GTSIMS
NASA Astrophysics Data System (ADS)
Prussing, Keith F.; Pierson, Oliver; Cordell, Chris; Stewart, John; Nielson, Kevin
2018-05-01
A core component of modeling visible and infrared sensor responses is the ability to faithfully recreate background noise and clutter in a synthetic image. Most tracking and detection algorithms use a combination of signal-to-noise or clutter-to-noise ratios to determine if a signature is of interest. A primary source of clutter is the background that defines the environment in which a target is placed. Over the past few years, the Electro-Optical Systems Laboratory (EOSL) at the Georgia Tech Research Institute has made significant improvements to its in-house simulation framework GTSIMS. First, we have expanded our terrain models to include the effects of terrain orientation on emission and reflection. Second, we have included the ability to model dynamic reflections with full BRDF support. Third, we have added the ability to render physically accurate cirrus clouds. And finally, we have updated the overall rendering procedure to reduce the time necessary to generate a single frame by taking advantage of hardware acceleration. Here, we present the updates to GTSIMS to better predict clutter and noise due to non-uniform backgrounds. Specifically, we show how the addition of clouds, terrain, and improved non-uniform sky rendering improve our ability to represent clutter during scene generation.
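As a point of reference (a common definition, not necessarily the exact metric used in GTSIMS), the signal-to-clutter ratio against a non-uniform background can be written
$$ \mathrm{SCR} = \frac{\lvert \mu_t - \mu_b \rvert}{\sigma_b}, $$
where $\mu_t$ is the mean target signature and $\mu_b$, $\sigma_b$ are the mean and standard deviation of the background in the surrounding region; a synthetic background must reproduce $\sigma_b$ faithfully for detection studies based on such metrics to be meaningful.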
The application of cloud computing to scientific workflows: a study of cost and performance.
Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S
2013-01-28
The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.
High Performance Molecular Visualization: In-Situ and Parallel Rendering with EGL.
Stone, John E; Messmer, Peter; Sisneros, Robert; Schulten, Klaus
2016-05-01
Large scale molecular dynamics simulations produce terabytes of data that is impractical to transfer to remote facilities. It is therefore necessary to perform visualization tasks in-situ as the data are generated, or by running interactive remote visualization sessions and batch analyses co-located with direct access to high performance storage systems. A significant challenge for deploying visualization software within clouds, clusters, and supercomputers involves the operating system software required to initialize and manage graphics acceleration hardware. Recently, it has become possible for applications to use the Embedded-system Graphics Library (EGL) to eliminate the requirement for windowing system software on compute nodes, thereby eliminating a significant obstacle to broader use of high performance visualization applications. We outline the potential benefits of this approach in the context of visualization applications used in the cloud, on commodity clusters, and supercomputers. We discuss the implementation of EGL support in VMD, a widely used molecular visualization application, and we outline benefits of the approach for molecular visualization tasks on petascale computers, clouds, and remote visualization servers. We then provide a brief evaluation of the use of EGL in VMD, with tests using developmental graphics drivers on conventional workstations and on Amazon EC2 G2 GPU-accelerated cloud instance types. We expect that the techniques described here will be of broad benefit to many other visualization applications.
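For readers unfamiliar with EGL, the following minimal C++ sketch shows the kind of windowless (off-screen) OpenGL context creation that removes the dependence on an X server; it is a generic example rather than code from VMD, error handling is omitted, and some drivers require the EGL device-platform extensions instead of EGL_DEFAULT_DISPLAY.

```cpp
#include <EGL/egl.h>

int main() {
    // Obtain a display without any windowing system (X11/Wayland) running.
    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    eglInitialize(dpy, nullptr, nullptr);

    // Choose a config suitable for off-screen (pbuffer) OpenGL rendering.
    const EGLint cfgAttribs[] = {
        EGL_SURFACE_TYPE, EGL_PBUFFER_BIT,
        EGL_RENDERABLE_TYPE, EGL_OPENGL_BIT,
        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
        EGL_DEPTH_SIZE, 24,
        EGL_NONE
    };
    EGLConfig cfg;
    EGLint numCfg = 0;
    eglChooseConfig(dpy, cfgAttribs, &cfg, 1, &numCfg);

    // Create a small off-screen surface and a desktop OpenGL context.
    const EGLint pbAttribs[] = { EGL_WIDTH, 1024, EGL_HEIGHT, 1024, EGL_NONE };
    EGLSurface surf = eglCreatePbufferSurface(dpy, cfg, pbAttribs);
    eglBindAPI(EGL_OPENGL_API);
    EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, nullptr);
    eglMakeCurrent(dpy, surf, ctx, ctx);

    // ... issue OpenGL rendering calls here, read back pixels, etc. ...

    eglTerminate(dpy);
    return 0;
}
```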
NASA Astrophysics Data System (ADS)
Bada, Adedayo; Wang, Qi; Alcaraz-Calero, Jose M.; Grecos, Christos
2016-04-01
This paper proposes a new approach to improving the application of 3D video rendering and streaming by jointly exploring and optimizing both cloud-based virtualization and web-based delivery. The proposed web service architecture firstly establishes a software virtualization layer based on QEMU (Quick Emulator), open-source virtualization software that can virtualize system components except for 3D rendering, which is still in its infancy. The architecture then explores the cloud environment to boost the speed of the rendering at the QEMU software virtualization layer. The capabilities and inherent limitations of Virgil 3D, one of the most advanced virtual 3D Graphics Processing Units (GPUs) available, are analyzed through benchmarking experiments and integrated into the architecture to further speed up the rendering. Experimental results are reported and analyzed to demonstrate the benefits of the proposed approach.
Multi-Depth-Map Raytracing for Efficient Large-Scene Reconstruction.
Arikan, Murat; Preiner, Reinhold; Wimmer, Michael
2016-02-01
With the enormous advances in acquisition technology over recent years, fast processing and high-quality visualization of large point clouds have gained increasing attention. Commonly, a mesh surface is reconstructed from the point cloud and a high-resolution texture is generated over the mesh from the images taken at the site to represent surface materials. However, this global reconstruction and texturing approach becomes impractical with increasing data sizes. Recently, due to its potential for scalability and extensibility, a method for texturing a set of depth maps in a preprocessing step and stitching them at runtime has been proposed to represent large scenes. However, the rendering performance of this method is strongly dependent on the number of depth maps and their resolution. Moreover, for the proposed scene representation, every single depth map has to be textured by the images, which in practice heavily increases processing costs. In this paper, we present a novel method to break these dependencies by introducing an efficient raytracing of multiple depth maps. In a preprocessing phase, we first generate high-resolution textured depth maps by rendering the input points from image cameras and then perform a graph-cut based optimization to assign a small subset of these points to the images. At runtime, we use the resulting point-to-image assignments (1) to identify for each view ray which depth map contains the closest ray-surface intersection and (2) to efficiently compute this intersection point. The resulting algorithm accelerates both the texturing and the rendering of the depth maps by an order of magnitude.
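To make the runtime step concrete, here is a minimal C++ sketch (with a hypothetical helper, not the authors' code) of resolving a view ray against a candidate set of depth maps; in the paper, the precomputed point-to-image assignments restrict this candidate set before the per-map intersection is computed:

```cpp
#include <limits>
#include <vector>

struct Ray { float origin[3]; float dir[3]; };
struct Hit { float t; int depthMapId; };

// Hypothetical stand-in: intersect a ray with one textured depth map and
// return the ray parameter t of the hit, or infinity on a miss. A real
// implementation would march through the depth map's pixel footprint.
float intersectDepthMap(const Ray& /*ray*/, int /*depthMapId*/) {
    return std::numeric_limits<float>::infinity();  // stub
}

// Pick the closest ray-surface intersection among the candidate depth maps.
Hit closestHit(const Ray& ray, const std::vector<int>& candidateMaps) {
    Hit best{std::numeric_limits<float>::infinity(), -1};
    for (int id : candidateMaps) {
        float t = intersectDepthMap(ray, id);
        if (t < best.t) { best.t = t; best.depthMapId = id; }
    }
    return best;
}
```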
Dusty Cloud Acceleration by Radiation Pressure in Rapidly Star-forming Galaxies
NASA Astrophysics Data System (ADS)
Zhang, Dong; Davis, Shane W.; Jiang, Yan-Fei; Stone, James M.
2018-02-01
We perform two-dimensional and three-dimensional radiation hydrodynamic simulations to study cold clouds accelerated by radiation pressure on dust in the environment of rapidly star-forming galaxies dominated by infrared flux. We utilize the reduced speed of light approximation to solve the frequency-averaged, time-dependent radiative transfer equation. We find that radiation pressure is capable of accelerating the clouds to hundreds of kilometers per second while remaining dense and cold, consistent with observations. We compare these results to simulations where acceleration is provided by entrainment in a hot wind, where the momentum injection of the hot flow is comparable to the momentum in the radiation field. We find that the survival time of the cloud accelerated by the radiation field is significantly longer than that of a cloud entrained in a hot outflow. We show that the dynamics of the irradiated cloud depends on the initial optical depth, temperature of the cloud, and intensity of the flux. Additionally, gas pressure from the background may limit cloud acceleration if the density ratio between the cloud and background is ≲ 10^2. In general, a 10 pc-scale optically thin cloud forms a pancake structure elongated perpendicular to the direction of motion, while optically thick clouds form a filamentary structure elongated parallel to the direction of motion. The details of accelerated cloud morphology and geometry can also be affected by other factors, such as the cloud lengthscale, reduced speed of light approximation, spatial resolution, initial cloud structure, and dimensionality of the run, but these have relatively little effect on the cloud velocity or survival time.
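For orientation (the standard form of the approximation, not quoted from the paper), the reduced speed of light approximation replaces the true speed of light $c$ by a smaller value $\hat{c}$ only in the time-derivative term of the frequency-averaged transfer equation,
$$ \frac{1}{\hat{c}}\,\frac{\partial I}{\partial t} + \hat{\boldsymbol{n}}\cdot\nabla I = \eta - \chi I, $$
which relaxes the radiation time-step constraint while leaving the steady-state solution unchanged.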
Transform coding for hardware-accelerated volume rendering.
Fout, Nathaniel; Ma, Kwan-Liu
2007-01-01
Hardware-accelerated volume rendering using the GPU is now the standard approach for real-time volume rendering, although limited graphics memory can present a problem when rendering large volume data sets. Volumetric compression in which the decompression is coupled to rendering has been shown to be an effective solution to this problem; however, most existing techniques were developed in the context of software volume rendering, and all but the simplest approaches are prohibitive in a real-time hardware-accelerated volume rendering context. In this paper we present a novel block-based transform coding scheme designed specifically with real-time volume rendering in mind, such that the decompression is fast without sacrificing compression quality. This is made possible by consolidating the inverse transform with dequantization in such a way as to allow most of the reprojection to be precomputed. Furthermore, we take advantage of the freedom afforded by off-line compression in order to optimize the encoding as much as possible while hiding this complexity from the decoder. In this context we develop a new block classification scheme which allows us to preserve perceptually important features in the compression. The result of this work is an asymmetric transform coding scheme that allows very large volumes to be compressed and then decompressed in real-time while rendering on the GPU.
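A schematic of the consolidation described above (a standard linear-algebra identity, in our notation rather than the paper's): if a block is reconstructed from quantized coefficients $\mathbf{c}$ with per-coefficient dequantization steps $q_i$ and inverse transform basis vectors $\mathbf{t}_i$, then
$$ \mathbf{x} = \sum_i c_i\,(q_i\,\mathbf{t}_i), $$
so the scaled basis vectors $q_i\mathbf{t}_i$ can be precomputed off-line and the decoder performs only a weighted sum of a few reprojection vectors at rendering time.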
NASA Astrophysics Data System (ADS)
Bada, Adedayo; Alcaraz-Calero, Jose M.; Wang, Qi; Grecos, Christos
2014-05-01
This paper describes a comprehensive empirical performance evaluation of 3D video processing employing the physical/virtual architecture implemented in a cloud environment. Different virtualization technologies, virtual video cards and various 3D benchmark tools have been utilized in order to analyse the optimal performance in the context of 3D online gaming applications. This study highlights 3D video rendering performance under each type of hypervisor, and other factors including network I/O, disk I/O and memory usage. Comparisons of these factors under well-known virtual display technologies such as VNC, Spice and Virtual 3D adaptors reveal the strengths and weaknesses of the various hypervisors with respect to 3D video rendering and streaming.
40 CFR 164.91 - Accelerated decision.
Code of Federal Regulations, 2011 CFR
2011-07-01
... decision. (a) General. The Administrative Law Judge, in his discretion, may at any time render an accelerated decision in favor of Respondent as to all or any portion of the proceeding, including dismissal... matter of law; or (8) Such other and further reasons as are just. (b) Effect. A decision rendered under...
Migrating EO/IR sensors to cloud-based infrastructure as service architectures
NASA Astrophysics Data System (ADS)
Berglie, Stephen T.; Webster, Steven; May, Christopher M.
2014-06-01
The Night Vision Image Generator (NVIG), a product of US Army RDECOM CERDEC NVESD, is a visualization tool used widely throughout Army simulation environments to provide fully attributed, synthesized, full motion video using physics-based sensor and environmental effects. The NVIG relies heavily on contemporary hardware-based acceleration and GPU processing techniques, which push the envelope of both enterprise and commodity-level hypervisor support for providing virtual machines with direct access to hardware resources. The NVIG has successfully been integrated into fully virtual environments where system architectures leverage cloud-based technologies to various extents in order to streamline infrastructure and service management. This paper details the challenges presented to engineers seeking to migrate GPU-bound processes, such as the NVIG, to virtual machines and, ultimately, cloud-based IaaS architectures. In addition, it presents the path that led to success for the NVIG. A brief overview of cloud-based infrastructure management tool sets is provided, and several virtual desktop solutions are outlined. A distinction is made between general-purpose virtual desktop technologies and technologies that expose GPU-specific capabilities, including direct rendering and hardware-based video encoding. Candidate hypervisor/virtual machine configurations that nominally satisfy the virtualized hardware-level GPU requirements of the NVIG are presented, and each is subsequently reviewed in light of its implications on higher-level cloud management techniques. Implementation details are included from the hardware level, through the operating system, to the 3D graphics APIs required by the NVIG and similar GPU-bound tools.
Research on Visualization of Ground Laser Radar Data Based on Osg
NASA Astrophysics Data System (ADS)
Huang, H.; Hu, C.; Zhang, F.; Xue, H.
2018-04-01
Three-dimensional (3D) laser scanning is a new advanced technology integrating optical, mechanical, electronic, and computer technologies. It can scan the complete shape of objects in space with high precision. With this technology, the point cloud data of a ground object can be collected directly and structured for rendering. High-performance 3D rendering engines are used to optimize and display such models in order to meet the demands of real-time realistic rendering of complex scenes. OpenSceneGraph (OSG) is an open source 3D graphics engine. Compared with the current mainstream 3D rendering engines, OSG is practical, economical, and easy to extend. Therefore, OSG is widely used in the fields of virtual simulation, virtual reality, and scientific and engineering visualization. In this paper, a dynamic and interactive ground LiDAR data visualization platform is built on OSG and the cross-platform C++ application development framework Qt. For point cloud data in .txt format and triangulated mesh data in .obj format, display of 3D laser point clouds and triangulated meshes is implemented. Experiments show that the platform is easy to operate, provides good interaction, and has strong practical value.
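As a rough illustration of the OSG side of such a platform (a generic sketch, not the authors' implementation; the Qt integration and the .txt/.obj parsing are omitted, and the two hard-coded vertices stand in for parsed point cloud data), a point cloud can be displayed with only a few scene-graph objects:

```cpp
#include <osg/Geode>
#include <osg/Geometry>
#include <osg/Point>
#include <osgViewer/Viewer>

int main() {
    // Fill the vertex array from parsed point cloud data (hard-coded here).
    osg::ref_ptr<osg::Vec3Array> verts = new osg::Vec3Array;
    verts->push_back(osg::Vec3(0.0f, 0.0f, 0.0f));
    verts->push_back(osg::Vec3(1.0f, 0.5f, 0.2f));

    osg::ref_ptr<osg::Geometry> geom = new osg::Geometry;
    geom->setVertexArray(verts.get());
    geom->addPrimitiveSet(new osg::DrawArrays(osg::PrimitiveSet::POINTS, 0,
                                              static_cast<int>(verts->size())));
    geom->getOrCreateStateSet()->setAttribute(new osg::Point(2.0f)); // point size

    osg::ref_ptr<osg::Geode> geode = new osg::Geode;
    geode->addDrawable(geom.get());

    // Interactive viewer with the default trackball manipulator.
    osgViewer::Viewer viewer;
    viewer.setSceneData(geode.get());
    return viewer.run();
}
```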
Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patchett, John M; Ahrens, James P; Lo, Li - Ta
2010-10-15
Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on scalability of rendering algorithms and architecture envisioned for exascale datasets. To understand tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software based ray tracing, software based rasterization and hardware accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU and CPU based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find software based ray-tracing offers a viable approach for scalable rendering of the projected future massive data sizes.
Using FastX on the Peregrine System | High-Performance Computing | NREL
with full 3D hardware acceleration. The traditional method of displaying graphics applications to a remote X server (indirect rendering) supports 3D hardware acceleration, but this approach causes all of the OpenGL commands and 3D data to be sent over the network to be rendered on the client machine. With
Openwebglobe 2: Visualization of Complex 3D-GEODATA in the (mobile) Webbrowser
NASA Astrophysics Data System (ADS)
Christen, M.
2016-06-01
Providing worldwide high resolution data for virtual globes involves compute- and storage-intensive tasks for processing data. Furthermore, rendering complex 3D-Geodata, such as 3D city models with an extremely high polygon count and a vast amount of textures, at interactive frame rates is still a very challenging task, especially on mobile devices. This paper presents an approach for processing, caching and serving massive geospatial data in a cloud-based environment for large scale, out-of-core, highly scalable 3D scene rendering on a web based virtual globe. Cloud computing is used for processing large amounts of geospatial data and also for providing 2D and 3D map data to a large number of (mobile) web clients. In this paper the approach for processing, rendering and caching very large datasets in the currently developed virtual globe "OpenWebGlobe 2" is shown, which displays 3D-Geodata on nearly every device.
3D in the Fast Lane: Render as You Go with the Latest OpenGL Boards.
ERIC Educational Resources Information Center
Sauer, Jeff; Murphy, Sam
1997-01-01
NT OpenGL hardware allows modelers and animators to work at relatively inexpensive NT workstations in their own offices or homes rather than sharing space and workstation time in expensive studios. Rates seven OpenGL boards and two QuickDraw 3D accelerator boards for Mac users on overall value, wireframe and texture rendering, 2D acceleration, and…
NASA Astrophysics Data System (ADS)
Tanaka, S.; Hasegawa, K.; Okamoto, N.; Umegaki, R.; Wang, S.; Uemura, M.; Okamoto, A.; Koyamada, K.
2016-06-01
We propose a method for the precise 3D see-through imaging, or transparent visualization, of the large-scale and complex point clouds acquired via the laser scanning of 3D cultural heritage objects. Our method is based on a stochastic algorithm and directly uses the 3D points, which are acquired using a laser scanner, as the rendering primitives. This method achieves the correct depth feel without requiring depth sorting of the rendering primitives along the line of sight. Eliminating this need allows us to avoid long computation times when creating natural and precise 3D see-through views of laser-scanned cultural heritage objects. The opacity of each laser-scanned object is also flexibly controllable. For a laser-scanned point cloud consisting of more than 10^7 or 10^8 3D points, the pre-processing requires only a few minutes, and the rendering can be executed at interactive frame rates. Our method enables the creation of cumulative 3D see-through images of time-series laser-scanned data. It also offers the possibility of fused visualization for observing a laser-scanned object behind a transparent high-quality photographic image placed in the 3D scene. We demonstrate the effectiveness of our method by applying it to festival floats of high cultural value. These festival floats have complex outer and inner 3D structures and are suitable for see-through imaging.
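The underlying idea (sketched here generically, not in the authors' exact notation) is that transparency can be obtained without depth sorting by rendering many statistically independent subsets of the points opaquely and averaging the resulting images,
$$ I_{\rm final}(x,y) = \frac{1}{L}\sum_{i=1}^{L} I_i(x,y), $$
so a surface whose points cover a given pixel in a fraction $\alpha$ of the $L$ passes appears with effective opacity $\alpha$; the opacity of each scanned object can then be tuned by changing how many of its points are assigned to each subset.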
Application of cellular automata approach for cloud simulation and rendering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christopher Immanuel, W.; Paul Mary Deborrah, S.; Samuel Selvaraj, R.
Current techniques for creating clouds in games and other real-time applications produce static, homogeneous clouds. These clouds, while viable for real-time applications, do not exhibit the organic feel that clouds in nature exhibit. The clouds produced by our cellular automata approach, when viewed over a period of time, deform their initial shape and move in a more organic and dynamic way. This cloud-shaping technique should be extensible in the future to create even more cloud shapes in real time under additional forces. Clouds are an essential part of any computer model of a landscape or an animation of an outdoor scene. A realistic animation of clouds is also important for creating scenes for flight simulators, movies, games, and other applications. Our goal was to create a realistic animation of clouds.
40 CFR 164.91 - Accelerated decision.
Code of Federal Regulations, 2010 CFR
2010-07-01
... decision. (a) General. The Administrative Law Judge, in his discretion, may at any time render an... matter of law; or (8) Such other and further reasons as are just. (b) Effect. A decision rendered under...
Cloud Computing and Validated Learning for Accelerating Innovation in IoT
ERIC Educational Resources Information Center
Suciu, George; Todoran, Gyorgy; Vulpe, Alexandru; Suciu, Victor; Bulca, Cristina; Cheveresan, Romulus
2015-01-01
Innovation in Internet of Things (IoT) requires more than just creation of technology and use of cloud computing or big data platforms. It requires accelerated commercialization or aptly called go-to-market processes. To successfully accelerate, companies need a new type of product development, the so-called validated learning process.…
Properties of the electron cloud in a high-energy positron and electron storage ring
Harkay, K. C.; Rosenberg, R. A.
2003-03-20
Low-energy, background electrons are ubiquitous in high-energy particle accelerators. Under certain conditions, interactions between this electron cloud and the high-energy beam can give rise to numerous effects that can seriously degrade the accelerator performance. These effects range from vacuum degradation to collective beam instabilities and emittance blowup. Although electron-cloud effects were first observed two decades ago in a few proton storage rings, they have in recent years been widely observed and intensely studied in positron and proton rings. Electron-cloud diagnostics developed at the Advanced Photon Source enabled for the first time detailed, direct characterization of the electron-cloud properties in a positron and electron storage ring. From in situ measurements of the electron flux and energy distribution at the vacuum chamber wall, electron-cloud production mechanisms and details of the beam-cloud interaction can be inferred. A significant longitudinal variation of the electron cloud is also observed, due primarily to geometrical details of the vacuum chamber. Furthermore, such experimental data can be used to provide realistic limits on key input parameters in modeling efforts, leading ultimately to greater confidence in predicting electron-cloud effects in future accelerators.
Abdellah, Marwan; Eldeib, Ayman; Owis, Mohamed I
2015-01-01
This paper features an advanced implementation of the X-ray rendering algorithm that harnesses the giant computing power of current commodity graphics processors to accelerate the generation of high resolution digitally reconstructed radiographs (DRRs). The presented pipeline exploits the latest features of NVIDIA Graphics Processing Unit (GPU) architectures, mainly bindless texture objects and dynamic parallelism. The rendering throughput is substantially improved by exploiting the interoperability mechanisms between CUDA and OpenGL. The benchmarks of our optimized rendering pipeline reflect its capability of generating DRRs with resolutions of 2048^2 and 4096^2 at interactive and semi-interactive frame rates using an NVIDIA GeForce GTX 970 device.
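For context (the standard X-ray attenuation model rather than a detail taken from the paper), each DRR pixel is a discretized line integral of attenuation along the ray from the virtual source through the CT volume,
$$ I = I_0 \exp\!\Bigl(-\sum_{k} \mu_k\,\Delta s\Bigr), $$
where the $\mu_k$ are attenuation values sampled along the ray and $\Delta s$ is the sampling step; this per-ray summation is the work that GPU ray casting parallelizes.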
High-energy radiation from collisions of high-velocity clouds and the Galactic disc
NASA Astrophysics Data System (ADS)
del Valle, Maria V.; Müller, A. L.; Romero, G. E.
2018-04-01
High-velocity clouds (HVCs) are interstellar clouds of atomic hydrogen that do not follow normal Galactic rotation and have velocities of several hundred kilometres per second. A considerable number of these clouds are falling down towards the Galactic disc. HVCs form large and massive complexes, so if they collide with the disc a great amount of energy would be released into the interstellar medium. The cloud-disc interaction produces two shocks: one propagates through the cloud and the other through the disc. The properties of these shocks depend mainly on the cloud velocity and the disc-cloud density ratio. In this work, we study the conditions necessary for these shocks to accelerate particles by diffusive shock acceleration and we study the non-thermal radiation that is produced. We analyse particle acceleration in both the cloud and disc shocks. Solving a time-dependent two-dimensional transport equation for both relativistic electrons and protons, we obtain particle distributions and non-thermal spectral energy distributions. In a shocked cloud, significant synchrotron radio emission is produced along with soft gamma rays. In the case of acceleration in the shocked disc, the non-thermal radiation is stronger; the gamma rays, of leptonic origin, might be detectable with current instruments. A large number of protons are injected into the Galactic interstellar medium, and locally exceed the cosmic ray background. We conclude that under adequate conditions the contribution from HVC-disc collisions to the galactic population of relativistic particles and the associated extended non-thermal radiation might be important.
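For reference (textbook diffusive shock acceleration, not a result of the paper), the test-particle spectral index produced at a shock depends only on the compression ratio $r$,
$$ N(E) \propto E^{-q}, \qquad q = \frac{r+2}{r-1}, $$
so a strong adiabatic shock with $r = 4$ gives $q = 2$, with steeper spectra expected for weaker shocks.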
NASA Astrophysics Data System (ADS)
Pritchard, M. S.; Bretherton, C. S.; DeMott, C. A.
2014-12-01
New trade-offs are discussed in the cloud superparameterization approach to explicitly representing deep convection in global climate models. Intrinsic predictability tests show that the memory of cloud-resolving-scale organization is not critical for producing desired modes of organized convection such as the Madden-Julian Oscillation (MJO). This has implications for the feasibility of data assimilation and real-world initialization for superparameterized weather forecasting. Climate simulation sensitivity tests demonstrate that 400% acceleration of cloud superparameterization is possible by restricting the 32-128 km scale regime without deteriorating the realism of the simulated MJO, but the number of cloud-resolving model grid columns is found to constrain the efficiency of vertical mixing, with consequences for the simulated liquid cloud climatology. Tuning opportunities for next generation accelerated superparameterized climate models are discussed.
Beamlets from stochastic acceleration
NASA Astrophysics Data System (ADS)
Perri, Silvia; Carbone, Vincenzo
2008-09-01
We investigate the dynamics of a realization of the stochastic Fermi acceleration mechanism. The model consists of test particles moving between two oscillating magnetic clouds and differs from the usual Fermi-Ulam model in two ways. (i) Particles can penetrate inside clouds before being reflected. (ii) Particles can radiate a fraction of their energy during the process. Since the Fermi mechanism is at work, particles are stochastically accelerated, even in the presence of the radiated energy. Furthermore, due to a kind of resonance between particles and oscillating clouds, the probability density function of particles is strongly modified, thus generating beams of accelerated particles rather than a translation of the whole distribution function to higher energy. This simple mechanism could account for the presence of beamlets in some space plasma physics situations.
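The elementary energy change behind the mechanism (the standard Fermi picture, not specific to this realization) comes from elastic reflection off a cloud moving with speed $u$: a particle of speed $v$ is reflected with
$$ v' = v \pm 2u, $$
gaining speed in head-on encounters and losing it in overtaking ones; because head-on encounters are slightly more frequent, the ensemble gains energy at second order, $\langle \Delta E / E \rangle \sim (u/v)^2$ per reflection.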
Mean-state acceleration of cloud-resolving models and large eddy simulations
Jones, C. R.; Bretherton, C. S.; Pritchard, M. S.
2015-10-29
Large eddy simulations and cloud-resolving models (CRMs) are routinely used to simulate boundary layer and deep convective cloud processes, aid in the development of moist physical parameterization for global models, study cloud-climate feedbacks and cloud-aerosol interaction, and serve as the heart of superparameterized climate models. These models are computationally demanding, placing practical constraints on their use in these applications, especially for long, climate-relevant simulations. In many situations, the horizontal-mean atmospheric structure evolves slowly compared to the turnover time of the most energetic turbulent eddies. We develop a simple scheme to reduce this time scale separation and thereby accelerate the evolution of the mean state. Using this approach we are able to accelerate the model evolution by a factor of 2-16 or more in idealized stratocumulus, shallow and deep cumulus convection without substantial loss of accuracy in simulating mean cloud statistics and their sensitivity to climate change perturbations. As a culminating test, we apply this technique to accelerate the embedded CRMs in the Superparameterized Community Atmosphere Model by a factor of 2, thereby showing that the method is robust and stable to realistic perturbations across spatial and temporal scales typical in a GCM.
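The essence of such mean-state acceleration (paraphrased here in our own notation) is to multiply the tendency of the horizontally averaged state by a factor $a$ while the resolved eddies evolve normally: writing a prognostic field as $\phi = \overline{\phi} + \phi'$, the model advances
$$ \frac{\partial \overline{\phi}}{\partial t} = a\,\overline{F}, \qquad \frac{\partial \phi'}{\partial t} = F - \overline{F}, $$
where $F$ is the physical tendency, so that an acceleration factor of 2-16 shortens the simulated time needed for the slowly varying mean profiles to equilibrate.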
FERMI LAT Discovery of Extended Gamma-Ray Emission in the Direction of Supernova Remnant W51C
Abdo, A. A.; Ackermann, M.; Ajello, M.; ...
2009-10-27
In this paper, the discovery of bright gamma-ray emission coincident with supernova remnant (SNR) W51C is reported using the Large Area Telescope (LAT) onboard the Fermi Gamma-ray Space Telescope. W51C is a middle-aged remnant (~10^4 yr) with intense radio synchrotron emission in its shell and known to be interacting with a molecular cloud. The gamma-ray emission is spatially extended, broadly consistent with the radio and X-ray extent of SNR W51C. The energy spectrum in the 0.2-50 GeV band exhibits steepening toward high energies. The luminosity is greater than 1 × 10^36 erg s^-1 given the distance constraint of D > 5.5 kpc, which makes this object one of the most luminous gamma-ray sources in our Galaxy. The observed gamma-rays can be explained reasonably by a combination of efficient acceleration of nuclear cosmic rays at supernova shocks and shock-cloud interactions. The decay of neutral π mesons produced in hadronic collisions provides a plausible explanation for the gamma-ray emission. The product of the average gas density and the total energy content of the accelerated protons amounts to $\bar{n}_{\rm H} W_p \simeq 5 \times 10^{51}\ (D/6\ {\rm kpc})^2\ {\rm erg\ cm^{-3}}$. Electron density constraints from the radio and X-ray bands render it difficult to explain the LAT signal as due to inverse Compton scattering. Finally, the Fermi LAT source coincident with SNR W51C sheds new light on the origin of Galactic cosmic rays.
Fermi-LAT Discovery of Extended Gamma-Ray Emission in the Direction of Supernova Remnant W51C
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdo, A.A.; /Naval Research Lab, Wash., D.C. /Federal City Coll.; Ackermann, M.
The discovery of bright gamma-ray emission coincident with supernova remnant (SNR) W51C is reported using the Large Area Telescope (LAT) onboard the Fermi Gamma-ray Space Telescope. W51C is a middle-aged remnant (~10^4 yr) with intense radio synchrotron emission in its shell and known to be interacting with a molecular cloud. The gamma-ray emission is spatially extended, broadly consistent with the radio and X-ray extent of SNR W51C. The energy spectrum in the 0.2-50 GeV band exhibits steepening toward high energies. The luminosity is greater than 1 x 10^36 erg s^-1 given the distance constraint of D > 5.5 kpc, which makes this object one of the most luminous gamma-ray sources in our Galaxy. The observed gamma-rays can be explained reasonably by a combination of efficient acceleration of nuclear cosmic rays at supernova shocks and shock-cloud interactions. The decay of neutral π mesons produced in hadronic collisions provides a plausible explanation for the gamma-ray emission. The product of the average gas density and the total energy content of the accelerated protons amounts to $\bar{n}_{\rm H} W_p \simeq 5 \times 10^{51}\ (D/6\ {\rm kpc})^2\ {\rm erg\ cm^{-3}}$. Electron density constraints from the radio and X-ray bands render it difficult to explain the LAT signal as due to inverse Compton scattering. The Fermi LAT source coincident with SNR W51C sheds new light on the origin of Galactic cosmic rays.
Particle nonuniformity effects on particle cloud flames in low gravity
NASA Technical Reports Server (NTRS)
Berlad, A. L.; Tangirala, V.; Seshadri, K.; Facca, L. T.; Ogrin, J.; Ross, H.
1991-01-01
Experimental and analytical studies of particle cloud combustion at reduced gravity reveal the substantial roles that particle cloud nonuniformities may play in particle cloud combustion. Macroscopically uniform, quiescent particle cloud systems (at very low gravitational levels and above) sustain processes which can render them nonuniform on both macroscopic and microscopic scales. It is found that a given macroscopically uniform, quiescent particle cloud flame system can display a range of microscopically nonuniform features which lead to a range of combustion features. Microscopically nonuniform particle cloud distributions are difficult experimentally to detect and characterize. A uniformly distributed lycopodium cloud of particle-enriched microscopic nonuniformities in reduced gravity displays a range of burning velocities for any given overall stoichiometry. The range of observed and calculated burning velocities corresponds to the range of particle enriched concentrations within a characteristic microscopic nonuniformity. Sedimentation effects (even in reduced gravity) are also examined.
Rendering the "Not-So-Simple" Pendulum Experimentally Accessible.
ERIC Educational Resources Information Center
Jackson, David P.
1996-01-01
Presents three methods for obtaining experimental data related to acceleration of a simple pendulum. Two of the methods involve angular position measurements and the subsequent calculation of the acceleration while the third method involves a direct measurement of the acceleration. Compares these results with theoretical calculations and…
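For context (a standard result, not taken from the article), the bob of a simple pendulum has both tangential and centripetal acceleration components,
$$ a_t = -g\sin\theta, \qquad a_c = \frac{v^2}{L} = L\dot{\theta}^2, $$
where $\theta$ is the angular displacement, $L$ the pendulum length, and $v$ the bob speed; angular-position measurements recover $a_t$ by differentiation, whereas a sensor mounted on the bob responds to a combination of both components and gravity, which is what makes the direct measurement instructive.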
IceT users' guide and reference.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.
2011-01-01
The Image Composition Engine for Tiles (IceT) is a high-performance sort-last parallel rendering library. In addition to providing accelerated rendering for a standard display, IceT provides the unique ability to generate images for tiled displays. The overall resolution of the display may be several times larger than any viewport that may be rendered by a single machine. This document is an overview of the user interface to IceT.
The Launching of Cold Clouds by Galaxy Outflows. I. Hydrodynamic Interactions with Radiative Cooling
NASA Astrophysics Data System (ADS)
Scannapieco, Evan; Brüggen, Marcus
2015-06-01
To better understand the nature of the multiphase material found in outflowing galaxies, we study the evolution of cold clouds embedded in flows of hot and fast material. Using a suite of adaptive mesh refinement simulations that include radiative cooling, we investigate both cloud mass loss and cloud acceleration under the full range of conditions observed in galaxy outflows. The simulations are designed to track the cloud center of mass, enabling us to study the cloud evolution at long disruption times. For supersonic flows, a Mach cone forms around the cloud, which damps the Kelvin-Helmholtz instability but also establishes a streamwise pressure gradient that stretches the cloud apart. If time is expressed in units of the cloud crushing time, both the cloud lifetime and the cloud acceleration rate are independent of cloud radius, and we find simple scalings for these quantities as a function of the Mach number of the external medium. A resolution study suggests that our simulations accurately describe the evolution of cold clouds in the absence of thermal conduction and magnetic fields, physical processes whose roles will be studied in forthcoming papers.
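For readers unfamiliar with the time unit used above (a standard definition consistent with its use in the abstract), the cloud crushing time is
$$ t_{\rm cc} = \chi^{1/2}\,\frac{R_{\rm c}}{v_{\rm hot}}, $$
where $\chi$ is the cloud-to-wind density ratio, $R_{\rm c}$ the cloud radius, and $v_{\rm hot}$ the speed of the exterior flow; expressing lifetimes and acceleration rates in these units is what removes the explicit dependence on cloud radius.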
NASA Astrophysics Data System (ADS)
Wu, S.; Yan, Y.; Du, Z.; Zhang, F.; Liu, R.
2017-10-01
The ocean carbon cycle has a significant influence on global climate, and is commonly evaluated using time-series satellite-derived CO2 flux data. Location-aware and globe-based visualization is an important technique for analyzing and presenting the evolution of climate change. To achieve realistic simulation of the spatiotemporal dynamics of ocean carbon, a cloud-driven digital earth platform is developed to support the interactive analysis and display of multi-geospatial data, and an original visualization method based on our digital earth is proposed to demonstrate the spatiotemporal variations of carbon sinks and sources using time-series satellite data. Specifically, a volume rendering technique using half-angle slicing and a particle system is implemented to dynamically display the released or absorbed CO2 gas. To enable location-aware visualization within the virtual globe, we present a 3D particle-mapping algorithm to render particle-slicing textures onto geospace. In addition, a GPU-based interpolation framework using CUDA during real-time rendering is designed to obtain smooth effects in both spatial and temporal dimensions. To demonstrate the capabilities of the proposed method, a series of satellite data is applied to simulate the air-sea carbon cycle in the China Sea. The results show that the suggested strategies provide realistic simulation effects and acceptable interactive performance on the digital earth.
LOD-Sprite Technique for Accelerated Terrain Rendering
1999-01-01
Does the climate warming hiatus exist over the Tibetan Plateau?
Duan, Anmin; Xiao, Zhixiang
2015-09-02
The surface air temperature change over the Tibetan Plateau is determined based on historical observations from 1980 to 2013. In contrast to the cooling trend in the rest of China, and the global warming hiatus post-1990s, an accelerated warming trend has appeared over the Tibetan Plateau during 1998-2013 (0.25 °C decade⁻¹), compared with that during 1980-1997 (0.21 °C decade⁻¹). Further results indicate that, to some degree, such an accelerated warming trend might be attributable to cloud-radiation feedback. The increased nocturnal cloud over the northern Tibetan Plateau would warm the nighttime temperature via enhanced atmospheric back-radiation, while the decreased daytime cloud over the southern Tibetan Plateau would induce the daytime sunshine duration to increase, resulting in surface air temperature warming. Meanwhile, the in situ surface wind speed has recovered gradually since 1998, and thus the energy concentration cannot explain the accelerated warming trend over the Tibetan Plateau after the 1990s. It is suggested that cloud-radiation feedback may play an important role in modulating the recent accelerated warming trend over the Tibetan Plateau.
Supernova Remnant Kes 17: An Efficient Cosmic Ray Accelerator inside a Molecular Cloud
NASA Astrophysics Data System (ADS)
Gelfand, Joseph D.; Castro, Daniel; Slane, Patrick O.; Temim, Tea; Hughes, John P.; Rakowski, Cara
2013-11-01
The supernova remnant Kes 17 (SNR G304.6+0.1) is one of a few but growing number of remnants detected across the electromagnetic spectrum. In this paper, we analyze recent radio, X-ray, and γ-ray observations of this object, determining that efficient cosmic ray acceleration is required to explain its broadband non-thermal spectrum. These observations also suggest that Kes 17 is expanding inside a molecular cloud, though our determination of its age depends on whether thermal conduction or clump evaporation is primarily responsible for its center-filled thermal X-ray morphology. Evidence for efficient cosmic ray acceleration in Kes 17 supports recent theoretical work concluding that the strong magnetic field, turbulence, and clumpy nature of molecular clouds enhance cosmic ray production in supernova remnants. While additional observations are needed to confirm this interpretation, further study of Kes 17 is important for understanding how cosmic rays are accelerated in supernova remnants.
An Approach of Web-based Point Cloud Visualization without Plug-in
NASA Astrophysics Data System (ADS)
Ye, Mengxuan; Wei, Shuangfeng; Zhang, Dongmei
2016-11-01
With the advances in three-dimensional laser scanning technology, the demand for visualization of massive point clouds is increasingly urgent. Until the introduction of WebGL a few years ago, point cloud visualization was limited to desktop-based solutions; several web renderers are now available. This paper addresses the current issues in web-based point cloud visualization and proposes a method of web-based point cloud visualization without plug-ins. The method combines ASP.NET and WebGL technologies, using the spatial database PostgreSQL to store data and the open web technologies HTML5 and CSS3 to implement the user interface; an online visualization system for 3D point clouds is developed in JavaScript with web-based interaction. Finally, the method is applied to a real case. Experiments show that the new approach has great practical value and avoids the shortcomings of existing WebGIS solutions.
Rendering of dense, point cloud data in a high fidelity driving simulator.
DOT National Transportation Integrated Search
2014-09-01
Driving Simulators are advanced tools that can address many research questions in transportation. Recently they have been used to advance the practice of transportation engineering, specifically signs, signals, pavement markings, and most powerfully ...
Distributed shared memory for roaming large volumes.
Castanié, Laurent; Mion, Christophe; Cavin, Xavier; Lévy, Bruno
2006-01-01
We present a cluster-based volume rendering system for roaming very large volumes. This system allows a gigabyte-sized probe to be moved inside a total volume of several tens or hundreds of gigabytes in real-time. While the size of the probe is limited by the total amount of texture memory on the cluster, the size of the total data set has no theoretical limit. The cluster is used as a distributed graphics processing unit that both aggregates graphics power and graphics memory. A hardware-accelerated volume renderer runs in parallel on the cluster nodes and the final image compositing is implemented using a pipelined sort-last rendering algorithm. Meanwhile, volume bricking and volume paging allow efficient data caching. On each rendering node, a distributed hierarchical cache system implements a global software-based distributed shared memory on the cluster. In case of a cache miss, this system first checks page residency on the other cluster nodes instead of directly accessing local disks. Using two Gigabit Ethernet network interfaces per node, we accelerate data fetching by a factor of 4 compared to directly accessing local disks. The system also implements asynchronous disk access and texture loading, which makes it possible to overlap data loading, volume slicing and rendering for optimal volume roaming.
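A highly simplified sketch of the cache-miss policy described above (generic C++ rather than the authors' system; the peer-fetch and disk-load functions are hypothetical stubs standing in for network transport and file I/O) could look like this:

```cpp
#include <cstdint>
#include <optional>
#include <unordered_map>
#include <vector>

using BrickId = std::uint64_t;
using Brick = std::vector<std::uint8_t>;  // one cached volume brick (page)

// Hypothetical stand-ins for the slower levels of the cache hierarchy.
std::optional<Brick> fetchFromPeerNodes(BrickId) { return std::nullopt; }  // stub: ask other nodes
Brick loadFromLocalDisk(BrickId) { return Brick(4096, 0); }                // stub: local disk I/O

class DistributedBrickCache {
public:
    // Return a brick, filling the local cache through the hierarchy on a miss.
    const Brick& get(BrickId id) {
        auto it = local_.find(id);
        if (it != local_.end()) return it->second;               // level 1: local RAM
        if (auto remote = fetchFromPeerNodes(id))                // level 2: peer node memory
            return local_.emplace(id, std::move(*remote)).first->second;
        return local_.emplace(id, loadFromLocalDisk(id)).first->second;  // level 3: local disk
    }
private:
    std::unordered_map<BrickId, Brick> local_;
};
```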
Beam induced electron cloud resonances in dipole magnetic fields
Calvey, J. R.; Hartung, W.; Makita, J.; ...
2016-07-01
The buildup of low energy electrons in an accelerator, known as electron cloud, can be severely detrimental to machine performance. Under certain beam conditions, the beam can become resonant with the cloud dynamics, accelerating the buildup of electrons. This paper will examine two such effects: multipacting resonances, in which the cloud development time is resonant with the bunch spacing, and cyclotron resonances, in which the cyclotron period of electrons in a magnetic field is a multiple of bunch spacing. Both resonances have been studied directly in dipole fields using retarding field analyzers installed in the Cornell Electron Storage Ring. These measurements are supported by both analytical models and computer simulations.
A Simple Technique for Securing Data at Rest Stored in a Computing Cloud
NASA Astrophysics Data System (ADS)
Sedayao, Jeff; Su, Steven; Ma, Xiaohao; Jiang, Minghao; Miao, Kai
"Cloud Computing" offers many potential benefits, including cost savings, the ability to deploy applications and services quickly, and the ease of scaling those application and services once they are deployed. A key barrier for enterprise adoption is the confidentiality of data stored on Cloud Computing Infrastructure. Our simple technique implemented with Open Source software solves this problem by using public key encryption to render stored data at rest unreadable by unauthorized personnel, including system administrators of the cloud computing service on which the data is stored. We validate our approach on a network measurement system implemented on PlanetLab. We then use it on a service where confidentiality is critical - a scanning application that validates external firewall implementations.
High-dynamic-range imaging for cloud segmentation
NASA Astrophysics Data System (ADS)
Dev, Soumyabrata; Savoy, Florian M.; Lee, Yee Hui; Winkler, Stefan
2018-04-01
Sky-cloud images obtained from ground-based sky cameras are usually captured using a fisheye lens with a wide field of view. However, the sky exhibits a large dynamic range in terms of luminance, more than a conventional camera can capture. It is thus difficult to capture the details of an entire scene with a regular camera in a single shot. In most cases, the circumsolar region is overexposed, and the regions near the horizon are underexposed. This renders cloud segmentation for such images difficult. In this paper, we propose HDRCloudSeg - an effective method for cloud segmentation using high-dynamic-range (HDR) imaging based on multi-exposure fusion. We describe the HDR image generation process and release a new database to the community for benchmarking. Our proposed approach is the first using HDR radiance maps for cloud segmentation and achieves very good results.
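One standard way to recover an HDR radiance map from a bracketed exposure stack (the classic Debevec-Malik weighting; the paper's exact fusion pipeline may differ) is
$$ \ln E_i = \frac{\sum_j w(Z_{ij})\,\bigl(g(Z_{ij}) - \ln \Delta t_j\bigr)}{\sum_j w(Z_{ij})}, $$
where $Z_{ij}$ is the value of pixel $i$ in exposure $j$, $\Delta t_j$ the exposure time, $g$ the inverse camera response function, and $w$ a weight that suppresses under- and over-exposed values; the recovered radiance $E_i$ is then segmented instead of any single exposure.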
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-01
...-1659-01] Request for Comments on NIST Special Publication 500-293, US Government Cloud Computing... Publication 500-293, US Government Cloud Computing Technology Roadmap, Release 1.0 (Draft). This document is... (USG) agencies to accelerate their adoption of cloud computing. The roadmap has been developed through...
Beam Tests of Diamond-Like Carbon Coating for Mitigation of Electron Cloud
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eldred, Jeffrey; Backfish, Michael; Kato, Shigeki
Electron cloud beam instabilities are an important consideration in virtually all high-energy particle accelerators and could pose a formidable challenge to forthcoming high-intensity accelerator upgrades. Our results evaluate the efficacy of a diamond-like carbon (DLC) coating for the mitigation of electron cloud in the Fermilab Main Injector. The interior surface of the beampipe conditions in response to electron bombardment from the electron cloud, and we track the change in electron cloud flux over time in the DLC-coated beampipe and uncoated stainless steel beampipe. The electron flux is measured by retarding field analyzers placed in a field-free region of the Main Injector. We find the DLC coating reduces the electron cloud signal to roughly 2% of that measured in the uncoated stainless steel beampipe.
Electron Cloud Effects in Accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furman, M.A.
We present a brief summary of various aspects of the electron-cloud effect (ECE) in accelerators. For further details, the reader is encouraged to refer to the proceedings of many prior workshops, either dedicated to EC or with significant EC contents, including the entire "ECLOUD" series [1-22]. In addition, the proceedings of the various flavors of Particle Accelerator Conferences [23] contain a large number of EC-related publications. The ICFA Beam Dynamics Newsletter series [24] contains one dedicated issue, and several occasional articles, on EC. An extensive reference database is the LHC website on EC [25].
Beam tests of beampipe coatings for electron cloud mitigation in Fermilab Main Injector
Backfish, Michael; Eldred, Jeffrey; Tan, Cheng Yang; ...
2015-10-26
Electron cloud beam instabilities are an important consideration in virtually all high-energy particle accelerators and could pose a formidable challenge to forthcoming high-intensity accelerator upgrades. Dedicated tests have shown beampipe coatings dramatically reduce the density of electron cloud in particle accelerators. In this work, we evaluate the performance of titanium nitride, amorphous carbon, and diamond-like carbon as beampipe coatings for the mitigation of electron cloud in the Fermilab Main Injector. Altogether our tests represent 2700 ampere-hours of proton operation spanning five years. Three electron cloud detectors, retarding field analyzers, are installed in a straight section and allow a direct comparison between the electron flux in the coated and uncoated stainless steel beampipe. We characterize the electron flux as a function of intensity up to a maximum of 50 trillion protons per cycle. Each beampipe material conditions in response to electron bombardment from the electron cloud and we track the changes in these materials as a function of time and the number of absorbed electrons. Contamination from an unexpected vacuum leak revealed a potential vulnerability in the amorphous carbon beampipe coating. We measure the energy spectrum of electrons incident on the stainless steel, titanium nitride and amorphous carbon beampipes. We find the electron cloud signal is highly sensitive to stray magnetic fields and bunch length over the Main Injector ramp cycle. Finally, we conduct a complete survey of the stray magnetic fields at the test station and compare the electron cloud signal to that in a field-free region.
Distributed rendering for multiview parallax displays
NASA Astrophysics Data System (ADS)
Annen, T.; Matusik, W.; Pfister, H.; Seidel, H.-P.; Zwicker, M.
2006-02-01
3D display technology holds great promise for the future of television, virtual reality, entertainment, and visualization. Multiview parallax displays deliver stereoscopic views without glasses to arbitrary positions within the viewing zone. These systems must include a high-performance and scalable 3D rendering subsystem in order to generate multiple views at real-time frame rates. This paper describes a distributed rendering system for large-scale multiview parallax displays built with a network of PCs, commodity graphics accelerators, multiple projectors, and multiview screens. The main challenge is to render various perspective views of the scene and assign rendering tasks effectively. In this paper we investigate two different approaches: Optical multiplexing for lenticular screens and software multiplexing for parallax-barrier displays. We describe the construction of large-scale multi-projector 3D display systems using lenticular and parallax-barrier technology. We have developed different distributed rendering algorithms using the Chromium stream-processing framework and evaluate the trade-offs and performance bottlenecks. Our results show that Chromium is well suited for interactive rendering on multiview parallax displays.
View compensated compression of volume rendered images for remote visualization.
Lalgudi, Hariharan G; Marcellin, Michael W; Bilgin, Ali; Oh, Han; Nadar, Mariappan S
2009-07-01
Remote visualization of volumetric images has gained importance over the past few years in medical and industrial applications. Volume visualization is a computationally intensive process, often requiring hardware acceleration to achieve a real time viewing experience. One remote visualization model that can accomplish this would transmit rendered images from a server, based on viewpoint requests from a client. For constrained server-client bandwidth, an efficient compression scheme is vital for transmitting high quality rendered images. In this paper, we present a new view compensation scheme that utilizes the geometric relationship between viewpoints to exploit the correlation between successive rendered images. The proposed method obviates motion estimation between rendered images, enabling significant reduction to the complexity of a compressor. Additionally, the view compensation scheme, in conjunction with JPEG2000 performs better than AVC, the state of the art video compression standard.
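The geometric relationship exploited by view compensation can be sketched with the standard depth-based warp (a general formulation, not the paper's notation): a pixel $\tilde{\mathbf{x}}_1$ with depth $z$ rendered from viewpoint 1 reprojects into viewpoint 2 as
$$ \tilde{\mathbf{x}}_2 \simeq \mathbf{K}_2\bigl(\mathbf{R}\,z\,\mathbf{K}_1^{-1}\tilde{\mathbf{x}}_1 + \mathbf{t}\bigr), $$
where $\mathbf{K}_1$, $\mathbf{K}_2$ are the projection (intrinsic) matrices and $(\mathbf{R}, \mathbf{t})$ the relative rotation and translation between the two requested viewpoints; because this mapping is known exactly from the viewpoint requests, the predictor needs no motion search.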
NASA Technical Reports Server (NTRS)
Hussey, K. J.; Hall, J. R.; Mortensen, R. A.
1986-01-01
Image processing methods and software used to animate nonimaging remotely sensed data on cloud cover are described. Three FORTRAN programs were written in the VICAR2/TAE image processing domain to perform 3D perspective rendering, to interactively select parameters controlling the projection, and to interpolate parameter sets for animation images between key frames. Operation of the 3D programs and transferring the images to film is automated using executive control language and custom hardware to link the computer and camera.
Accelerating Time-Varying Hardware Volume Rendering Using TSP Trees and Color-Based Error Metrics
NASA Technical Reports Server (NTRS)
Ellsworth, David; Chiang, Ling-Jen; Shen, Han-Wei; Kwak, Dochan (Technical Monitor)
2000-01-01
This paper describes a new hardware volume rendering algorithm for time-varying data. The algorithm uses the Time-Space Partitioning (TSP) tree data structure to identify regions within the data that have spatial or temporal coherence. By using this coherence, the rendering algorithm can improve performance when the volume data is larger than the texture memory capacity by decreasing the amount of textures required. This coherence can also allow improved speed by appropriately rendering flat-shaded polygons instead of textured polygons, and by not rendering transparent regions. To reduce the polygonization overhead caused by the use of the hierarchical data structure, we introduce an optimization method using polygon templates. The paper also introduces new color-based error metrics, which more accurately identify coherent regions compared to the earlier scalar-based metrics. By showing experimental results from runs using different data sets and error metrics, we demonstrate that the new methods give substantial improvements in volume rendering performance.
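The abstract does not reproduce the error metrics themselves; the toy snippet below (function names, the tolerance, and the grayscale transfer function are invented for illustration) shows the general idea of a color-based criterion: map a node's voxels through the transfer function and treat the node as coherent, and hence cheap to render, only if the resulting colors are nearly uniform.

```python
import numpy as np

def color_error(scalars, transfer_function):
    """Mean distance of a node's voxel colors from their average color.

    scalars: 1D array of scalar voxel values in the node's subtree.
    transfer_function: callable mapping a scalar to an RGBA tuple in [0, 1].
    A small value means the node is coherent enough to be drawn as a
    single flat-shaded polygon instead of a textured one.
    """
    colors = np.array([transfer_function(s) for s in scalars], dtype=float)
    return float(np.linalg.norm(colors - colors.mean(axis=0), axis=1).mean())

# Toy transfer function and nodes: grayscale ramp, fully opaque.
tf = lambda s: (s, s, s, 1.0)
coherent_node = np.full(64, 0.50)         # nearly uniform region
mixed_node = np.linspace(0.0, 1.0, 64)    # high color variation

for name, node in [("coherent", coherent_node), ("mixed", mixed_node)]:
    err = color_error(node, tf)
    mode = "flat-shaded" if err < 0.05 else "textured"
    print(f"{name}: error={err:.3f} -> render {mode}")
```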
Summary of SLAC's SEY Measurement On Flat Accelerator Wall Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Pimpec, F. (PSI, Villigen / SLAC)
The electron cloud effect (ECE) causes beam instabilities in accelerator structures with intense positively charged bunched beams. Reduction of the secondary electron yield (SEY) of the beam pipe inner wall is effective in controlling cloud formation. We summarize SEY measurements of flat TiN, TiZrV and Al surfaces carried out in a laboratory environment. SEY was measured after thermal conditioning, as well as after low-energy (less than 300 eV) particle exposure.
A proposal for antiparallel acceleration of positrons using CEBAF
NASA Astrophysics Data System (ADS)
Tiefenback, M.; Wojtsekhowski, B.
2018-05-01
We present a scheme for positron beam acceleration in CEBAF antiparallel to the normal electron path, requiring no change in polarity of the magnet systems. This feature is essential to the principal benefit: enabling extremely simple configuration changes between conventional (clockwise) e- acceleration and counter clockwise e+ acceleration. Additionally, it appears possible to configure the accelerating cavity phases to support concurrent acceleration of the electron and positron beams. The last mode also may enable use of the higher peak current electron beam for system diagnostics. The inherent penalty of the concurrent mode in acceleration efficiency and increased energy spread may render this a commissioning-only diagnostic option, but the possibility appears worthy of consideration.
Diffuse Galactic gamma rays from shock-accelerated cosmic rays.
Dermer, Charles D
2012-08-31
A shock-accelerated particle flux proportional to p^(-s), where p is the particle momentum, follows from simple theoretical considerations of cosmic-ray acceleration at nonrelativistic shocks followed by rigidity-dependent escape into the Galactic halo. A flux of shock-accelerated cosmic-ray protons with s ≈ 2.8 provides an adequate fit to the Fermi Large Area Telescope γ-ray emission spectra of high-latitude and molecular cloud gas when uncertainties in nuclear production models are considered. A break in the spectrum of cosmic-ray protons claimed by Neronov, Semikoz, and Taylor [Phys. Rev. Lett. 108, 051105 (2012)] when fitting the γ-ray spectra of high-latitude molecular clouds is a consequence of using a cosmic-ray proton flux described by a power law in kinetic energy.
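As a reminder of the reasoning the abstract compresses into one sentence (the specific index values below are illustrative, not taken from the paper), the observed index s combines the injection index at the shock with rigidity-dependent escape:

```latex
% Illustrative: s_inj is the injection index, delta the escape index.
\[
  J(p) \propto p^{-s_{\mathrm{inj}}} \, \tau_{\mathrm{esc}}(p), \qquad
  \tau_{\mathrm{esc}} \propto R^{-\delta}
  \;\;\Longrightarrow\;\;
  J(p) \propto p^{-(s_{\mathrm{inj}} + \delta)},
\]
```

so that, for example, an injection index near 2.2 together with an escape index near 0.6 yields an observed s near 2.8.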
On the performance of metrics to predict quality in point cloud representations
NASA Astrophysics Data System (ADS)
Alexiou, Evangelos; Ebrahimi, Touradj
2017-09-01
Point clouds are a promising alternative for immersive representation of visual contents. Recently, an increased interest has been observed in the acquisition, processing and rendering of this modality. Although subjective and objective evaluations are critical in order to assess the visual quality of media content, they still remain open problems for point cloud representation. In this paper we focus our efforts on subjective quality assessment of point cloud geometry, subject to typical types of impairments such as noise corruption and compression-like distortions. In particular, we propose a subjective methodology that is closer to real-life scenarios of point cloud visualization. The performance of the state-of-the-art objective metrics is assessed by considering the subjective scores as the ground truth. Moreover, we investigate the impact of adopting different test methodologies by comparing them. Advantages and drawbacks of every approach are reported, based on statistical analysis. The results and conclusions of this work provide useful insights that could be considered in future experimentation.
AstroCloud: An Agile platform for data visualization and specific analyses in 2D and 3D
NASA Astrophysics Data System (ADS)
Molina, F. Z.; Salgado, R.; Bergel, A.; Infante, A.
2017-07-01
Nowadays, astronomers commonly run their own tools, or distributed computational packages, for data analysis and then visualize the results with generic applications. This chain of processes comes at a high cost: (a) analyses are applied manually and are therefore difficult to automate, and (b) data have to be serialized, thus increasing the cost of parsing and saving intermediary data. We are developing AstroCloud, an agile visualization multipurpose platform intended for specific analyses of astronomical images (https://astrocloudy.wordpress.com). This platform incorporates domain-specific languages which make it easily extensible. AstroCloud supports customized plug-ins, which translate into time reduction on data analysis. Moreover, it also supports 2D and 3D rendering, including interactive features in real time. AstroCloud is under development; we are currently implementing different choices for data reduction and physical analyses.
A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing.
Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui
2017-01-08
Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, which can be considered to be a comprehensive data intensive and computing intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, which greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward from the aspects of programming model, HDFS configuration and scheduling. The experimental results show that the cloud computing based algorithm achieves 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy improves performance by about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing intensive and data intensive issues in SAR raw data simulation, and is easily extended to large scale computing to achieve higher acceleration.
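The simulator's actual signal model is not given in the abstract; the sketch below only illustrates the MapReduce pattern it describes — each scene target is mapped to its echo contribution and the contributions are reduced by coherent summation — using a drastically simplified point-target pulse (all names, constants, and the pulse shape are assumptions, and the Hadoop/HDFS machinery is omitted).

```python
import numpy as np
from functools import reduce

C = 3.0e8                                      # speed of light, m/s
FC = 5.3e9                                     # assumed carrier frequency, Hz
FAST_TIME = np.linspace(6.6e-5, 6.8e-5, 2048)  # range sampling window, s

def map_target(target):
    """Map step: toy echo contribution of one point target.

    target: (slant_range_m, reflectivity). A real simulator integrates the
    chirp and antenna pattern; here we just place a phase-rotated pulse at
    the target's two-way delay.
    """
    rng, sigma = target
    delay = 2.0 * rng / C
    envelope = np.sinc((FAST_TIME - delay) * 50e6)   # crude compressed pulse
    phase = np.exp(-1j * 2.0 * np.pi * FC * delay)
    return sigma * envelope * phase

def reduce_echoes(a, b):
    """Reduce step: the raw data line is the coherent sum of all contributions."""
    return a + b

targets = [(10000.0, 1.0), (10050.0, 0.5), (10120.0, 0.8)]
raw_line = reduce(reduce_echoes, (map_target(t) for t in targets))
print(raw_line.shape, float(np.abs(raw_line).max()))
```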
Compression of 3D Point Clouds Using a Region-Adaptive Hierarchical Transform.
De Queiroz, Ricardo; Chou, Philip A
2016-06-01
In free-viewpoint video, there is a recent trend to represent scene objects as solids rather than using multiple depth maps. Point clouds have been used in computer graphics for a long time, and with the recent possibility of real-time capturing and rendering, point clouds have been favored over meshes in order to save computation. Each point in the cloud is associated with its 3D position and its color. We devise a method to compress the colors in point clouds which is based on a hierarchical transform and arithmetic coding. The transform is a hierarchical sub-band transform that resembles an adaptive variation of a Haar wavelet. The arithmetic encoding of the coefficients assumes Laplace distributions, one per sub-band. The Laplace parameter for each distribution is transmitted to the decoder using a custom method. The geometry of the point cloud is encoded using the well-established octree scanning. Results show that the proposed solution performs comparably to the current state of the art, on many occasions outperforming it, while being much more computationally efficient. We believe this work represents the state of the art in intra-frame compression of point clouds for real-time 3D video.
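The heart of such a region-adaptive hierarchical transform can be written as a weighted Haar butterfly applied to pairs of occupied sibling nodes; the snippet below shows only that single step as we understand it from the abstract (the octree traversal, quantization, and the Laplace-model arithmetic coder are omitted, and all variable names are ours).

```python
import numpy as np

def raht_butterfly(c1, c2, w1, w2):
    """One weighted Haar step on the colors of two occupied siblings.

    c1, c2: color vectors (e.g. RGB or YUV) of the two sibling nodes.
    w1, w2: how many original points each sibling represents.
    Returns (low, high, w): the DC-like coefficient carried up the tree
    with combined weight w, and the detail coefficient sent to the coder.
    """
    a, b = np.sqrt(w1), np.sqrt(w2)
    norm = np.sqrt(w1 + w2)
    c1, c2 = np.asarray(c1, float), np.asarray(c2, float)
    low = (a * c1 + b * c2) / norm
    high = (-b * c1 + a * c2) / norm
    return low, high, w1 + w2

# Two sibling voxels: one holding 3 points, one holding 1 point.
low, high, w = raht_butterfly([200, 120, 80], [40, 60, 90], w1=3, w2=1)
print(low, high, w)   # 'low' propagates upward, 'high' is entropy-coded
```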
Synthetic Absorption Lines for a Clumpy Medium: A Spectral Signature for Cloud Acceleration in AGN?
NASA Technical Reports Server (NTRS)
Waters, Tim; Proga, Daniel; Dannen, Randall; Kallman, Timothy R.
2017-01-01
There is increasing evidence that the highly ionized multiphase components of AGN disc winds may be due to thermal instability. The ions responsible for forming the observed X-ray absorption lines may only exist in relatively cool clumps that can be identified with the so-called warm absorbers. Here we calculate synthetic absorption lines for such warm absorbers from first principles by combining 2D hydrodynamic solutions of a two-phase medium with a dense grid of photoionization models to determine the detailed ionization structure of the gas. Our calculations reveal that cloud disruption, which leads to a highly complicated velocity field (i.e. a clumpy flow), will only mildly affect line shapes and strengths when the warm gas becomes highly mixed but not depleted. Prior to complete disruption, clouds that are optically thin to the driving UV resonance lines will cause absorption at an increasingly blueshifted line-of-sight velocity as they are accelerated. This behavior will imprint an identifiable signature on the line profile if warm absorbers are enshrouded in an even broader absorption line produced by a high column of intercloud gas. Interestingly, we show that it is possible to develop a spectral diagnostic for cloud acceleration by differencing the absorption components of a doublet line, a result that can be qualitatively understood using a simple partial covering model. Our calculations also permit us to comment on the spectral differences between cloud disruption and ionization changes driven by flux variability. Notably, cloud disruption offers another possibility for explaining absorption line variability.
NASA Astrophysics Data System (ADS)
Jun, Byung-Il; Jones, T. W.
1999-02-01
We present two-dimensional MHD simulations of the evolution of a young Type Ia supernova remnant (SNR) during its interaction with an interstellar cloud of comparable size at impact. We include for the first time in such simulations explicit relativistic electron transport. This was done using a simplified treatment of the diffusion-advection equation, thus allowing us to model injection and acceleration of cosmic-ray electrons at shocks and their subsequent transport. From this information we also model radio synchrotron emission, including spectral information. The simulations were carried out in spherical coordinates with azimuthal symmetry and compare three different situations, each incorporating an initially uniform interstellar magnetic field oriented in the polar direction on the grid. In particular, we modeled the SNR-cloud interactions for a spherical cloud on the polar axis, a toroidal cloud whose axis is aligned with the polar axis, and, for comparison, a uniform medium with no cloud. We find that the evolution of the overrun cloud qualitatively resembles that seen in simulations of simpler but analogous situations: that is, the cloud is crushed and begins to be disrupted by Rayleigh-Taylor and Kelvin-Helmholtz instabilities. However, we demonstrate here that, in addition, the internal structure of the SNR is severely distorted as such clouds are engulfed. This has important dynamical and observational implications. The principal new conclusions we draw from these experiments are the following. (1) Independent of the cloud interaction, the SNR reverse shock can be an efficient site for particle acceleration in a young SNR. (2) The internal flows of the SNR become highly turbulent once it encounters a large cloud. (3) An initially uniform magnetic field is preferentially amplified along the magnetic equator of the SNR, primarily because of biased amplification in that region by Rayleigh-Taylor instabilities. A similar bias produces much greater enhancement to the magnetic energy in the SNR during an encounter with a cloud when the interstellar magnetic field is partially transverse to the expansion of the SNR. The enhanced magnetic fields have a significant radial component, independent of the field orientation external to the SNR. This leads to a strong equatorial bias in synchrotron brightness that could easily mask any enhancements to electron-acceleration efficiency near the magnetic equator of the SNR. Thus, to establish the latter effect, it will be essential to demonstrate that the magnetic field in the brightest regions is actually tangential to the blast wave. (4) The filamentary radio structures correlate well with "turbulence-enhanced" magnetic structures, while the diffuse radio emission more closely follows the gas-density distribution within the SNR. (5) At these early times, the synchrotron spectral index due to electrons accelerated at the primary shocks should be close to 0.5 unless those shocks are modified by cosmic-ray proton pressures. While that result is predictable, we find that this simple result can be significantly complicated in practice by SNR interactions with clouds. Those events can produce regions with significantly steeper spectra. Especially if there are multiple cloud encounters, this interaction can lead to nonuniform spatial spectral distributions or, through turbulent mixing, produce a spectrum that is difficult to relate to the actual strength of the blast wave.
(6) Interaction with the cloud enhances the nonthermal electron population in the SNR in our simulations because of additional electron injection taking place in the shocks associated with the cloud. Together with point 3, this means that SNR-cloud encounters can significantly increase the radio emission from the SNR.
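For readers wondering where the value 0.5 in point (5) comes from, the standard test-particle diffusive-shock-acceleration algebra (textbook material, not taken from the paper) gives:

```latex
% Test-particle DSA at a strong nonrelativistic shock (compression ratio r = 4).
\[
  q = \frac{3r}{r-1}\bigg|_{r=4} = 4, \qquad
  N(E) \propto E^{-(q-2)} = E^{-2}, \qquad
  \alpha = \frac{(q-2)-1}{2} = 0.5,
\]
```

so the synchrotron flux scales as S_ν ∝ ν^(-0.5) unless the shock is modified, for example by cosmic-ray proton pressure.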
Colour computer-generated holography for point clouds utilizing the Phong illumination model.
Symeonidou, Athanasia; Blinder, David; Schelkens, Peter
2018-04-16
A technique integrating the bidirectional reflectance distribution function (BRDF) is proposed to generate realistic high-quality colour computer-generated holograms (CGHs). We build on prior work, namely a fast computer-generated holography method for point clouds that handles occlusions. We extend the method by integrating the Phong illumination model so that the properties of the objects' surfaces are taken into account to achieve natural light phenomena such as reflections and shadows. Our experiments show that rendering holograms with the proposed algorithm provides realistic looking objects without any noteworthy increase to the computational cost.
Neylon, J; Min, Y; Kupelian, P; Low, D A; Santhanam, A
2017-04-01
In this paper, a multi-GPU cloud-based server (MGCS) framework is presented for dose calculations, exploring the feasibility of remote computing power for parallelization and acceleration of computationally and time intensive radiotherapy tasks in moving toward online adaptive therapies. An analytical model was developed to estimate theoretical MGCS performance acceleration and intelligently determine workload distribution. Numerical studies were performed with a computing setup of 14 GPUs distributed over 4 servers interconnected by a 1 Gigabit per second (Gbps) network. Inter-process communication methods were optimized to facilitate resource distribution and minimize data transfers over the server interconnect. The analytically predicted computation times matched the experimental observations within 1-5%. MGCS performance approached a theoretical limit of acceleration proportional to the number of GPUs utilized when computational tasks far outweighed memory operations. The MGCS implementation reproduced ground-truth dose computations with negligible differences by distributing the work among several processes and implementing optimization strategies. The results showed that a cloud-based computation engine was a feasible solution for enabling clinics to make use of fast dose calculations for advanced treatment planning and adaptive radiotherapy. The cloud-based system was able to exceed the performance of a local machine even for optimized calculations, and provided significant acceleration for computationally intensive tasks. Such a framework can provide access to advanced technology and computational methods to many clinics, providing an avenue for standardization across institutions without the requirements of purchasing, maintaining, and continually updating hardware.
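The analytical performance model itself is not reproduced in the abstract; the toy calculation below (all timing numbers invented) captures the qualitative claim that acceleration approaches the GPU count only when compute dominates fixed memory and communication costs.

```python
def predicted_time(n_gpus, compute_s, overhead_s):
    """Toy multi-GPU time model: parallel compute plus fixed per-job overhead.

    compute_s:  single-GPU compute time for the whole dose calculation (s).
    overhead_s: non-parallelizable memory transfer / communication time (s).
    """
    return compute_s / n_gpus + overhead_s

single = predicted_time(1, compute_s=120.0, overhead_s=5.0)
for n in (2, 4, 8, 14):
    t = predicted_time(n, compute_s=120.0, overhead_s=5.0)
    print(f"{n:2d} GPUs: {t:6.1f} s, speedup {single / t:4.1f}x")
# As overhead_s -> 0 the speedup tends to n; a large overhead caps it.
```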
A proposal for antiparallel acceleration of positrons using CEBAF
Tiefenback, M.; Wojtsekhowski, B.
2018-05-01
Here, we present a scheme for positron beam acceleration in CEBAF antiparallel to the normal electron path, requiring no change in polarity of the magnet systems. This feature is essential to the principal benefit: enabling extremely simple configuration changes between conventional (clockwise) e- acceleration and counter clockwise e+ acceleration. Additionally, it appears possible to configure the accelerating cavity phases to support concurrent acceleration of the electron and positron beams. The last mode also may enable use of the higher peak current electron beam for system diagnostics. The inherent penalty of the concurrent mode in acceleration efficiency and increased energy spread may render this a commissioning-only diagnostic option, but the possibility appears worthy of consideration.
Ice Clouds in Martian Arctic (Accelerated Movie)
NASA Technical Reports Server (NTRS)
2008-01-01
Clouds scoot across the Martian sky in a movie clip consisting of 10 frames taken by the Surface Stereo Imager on NASA's Phoenix Mars Lander. This clip accelerates the motion. The camera took these 10 frames over a 10-minute period from 2:52 p.m. to 3:02 p.m. local solar time at the Phoenix site during Sol 94 (Aug. 29), the 94th Martian day since landing. Particles of water-ice make up these clouds, like ice-crystal cirrus clouds on Earth. Ice hazes have been common at the Phoenix site in recent days. The camera took these images as part of a campaign by the Phoenix team to see clouds and track winds. The view is toward slightly west of due south, so the clouds are moving westward or west-northwestward. The clouds are a dramatic visualization of the Martian water cycle. The water vapor comes off the north pole during the peak of summer. The northern-Mars summer has just passed its peak water-vapor abundance at the Phoenix site. The atmospheric water is available to form into clouds, fog and frost, such as the lander has been observing recently. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.
EarthScape, a Multi-Purpose Interactive 3D Globe Viewer for Hybrid Data Visualization and Analysis
NASA Astrophysics Data System (ADS)
Sarthou, A.; Mas, S.; Jacquin, M.; Moreno, N.; Salamon, A.
2015-08-01
The hybrid visualization and interaction tool EarthScape is presented here. The software is able to simultaneously display LiDAR point clouds, draped videos with a moving footprint, volume scientific data (using volume rendering, isosurface and slice plane), raster data such as still satellite images, vector data and 3D models such as buildings or vehicles. The application runs on touch screen devices such as tablets. The software is based on open source libraries, such as OpenSceneGraph, osgEarth and OpenCV, and shader programming is used to implement volume rendering of scientific data. The next goal of EarthScape is to perform data analysis using ENVI Services Engine, a cloud data analysis solution. EarthScape is also designed to be a client of Jagwire, which provides multisource geo-referenced video streams. Once all these components are included, EarthScape will be a multi-purpose platform that will simultaneously provide data analysis, hybrid visualization and complex interactions. The software is available on demand for free at france@exelisvis.com.
A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing
Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui
2017-01-01
Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, which can be considered to be a comprehensive data intensive and computing intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, which greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward from the aspects of programming model, HDFS configuration and scheduling. The experimental results show that the cloud computing based algorithm achieves 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy can improve about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing intensive and data intensive issues in SAR raw data simulation, and is easily extended to large scale computing to achieve higher acceleration. PMID:28075343
Three dimensional Visualization of Jupiter's Equatorial Region
NASA Technical Reports Server (NTRS)
1997-01-01
Frames from a three dimensional visualization of Jupiter's equatorial region. The images used cover an area of 34,000 kilometers by 11,000 kilometers (about 21,100 by 6,800 miles) near an equatorial 'hotspot' similar to the site where the probe from NASA's Galileo spacecraft entered Jupiter's atmosphere on December 7th, 1995. These features are holes in the bright, reflective, equatorial cloud layer where warmer thermal emission from Jupiter's deep atmosphere can pass through. The circulation patterns observed here along with the composition measurements from the Galileo Probe suggest that dry air may be converging and sinking over these regions, maintaining their cloud-free appearance. The bright clouds to the right of the hotspot as well as the other bright features may be examples of upwelling of moist air and condensation.
This frame is a view to the northeast, from between the cloud layers and above the streaks in the lower cloud leading towards the hotspot. The upper haze layer has some features that match the lower cloud, such as the bright streak in the foreground of the frame. These are probably thick clouds that span several tens of vertical kilometers. Galileo is the first spacecraft to image Jupiter in near-infrared light (which is invisible to the human eye) using three filters at 727, 756, and 889 nanometers (nm). Because light at these three wavelengths is absorbed at different altitudes by atmospheric methane, a comparison of the resulting images reveals information about the heights of clouds in Jupiter's atmosphere. This information can be visualized by rendering cloud surfaces with the appropriate height variations. The visualization reduces Jupiter's true cloud structure to two layers. The height of a high haze layer is assumed to be proportional to the reflectivity of Jupiter at 889 nm. The height of a lower tropospheric cloud is assumed to be proportional to the reflectivity at 727 nm divided by that at 756 nm. This model is overly simplistic, but is based on more sophisticated studies of Jupiter's cloud structure. The upper and lower clouds are separated in the rendering by an arbitrary amount, and the height variations are exaggerated by a factor of 25. The lower cloud is colored using the same false color scheme used in previously released image products, assigning red, green, and blue to the 756, 727, and 889 nanometer mosaics, respectively. Light bluish clouds are high and thin, reddish clouds are low, and white clouds are high and thick. The dark blue hotspot in the center is a hole in the lower cloud with an overlying thin haze. The images used cover latitudes 1 to 10 degrees and are centered at longitude 336 degrees west. The smallest resolved features are tens of kilometers in size. These images were taken on December 17, 1996, at a range of 1.5 million kilometers (about 930,000 miles) by the Solid State Imaging (CCD) system on NASA's Galileo spacecraft. The Jet Propulsion Laboratory, Pasadena, CA manages the Galileo mission for NASA's Office of Space Science, Washington, DC. JPL is an operating division of California Institute of Technology (Caltech). This image and other images and data received from Galileo are posted on the World Wide Web, on the Galileo mission home page at URL http://www.jpl.nasa.gov/galileo.
Three dimensional Visualization of Jupiter's Equatorial Region
NASA Technical Reports Server (NTRS)
1997-01-01
Frames from a three dimensional visualization of Jupiter's equatorial region. The images used cover an area of 34,000 kilometers by 11,000 kilometers (about 21,100 by 6,800 miles) near an equatorial 'hotspot' similar to the site where the probe from NASA's Galileo spacecraft entered Jupiter's atmosphere on December 7th, 1995. These features are holes in the bright, reflective, equatorial cloud layer where warmer thermal emission from Jupiter's deep atmosphere can pass through. The circulation patterns observed here along with the composition measurements from the Galileo Probe suggest that dry air may be converging and sinking over these regions, maintaining their cloud-free appearance. The bright clouds to the right of the hotspot as well as the other bright features may be examples of upwelling of moist air and condensation.
This frame is a view to the northeast, from between the cloud layers and above the streaks in the lower cloud leading towards the hotspot. The hotspot is clearly visible as a deep blue feature. The cloud streaks end near the hotspot, consistent with the idea that clouds traveling along these streak lines descend and evaporate as they approach the hotspot. The upper haze layer is slightly bowed upwards above the hotspot. Galileo is the first spacecraft to image Jupiter in near-infrared light (which is invisible to the human eye) using three filters at 727, 756, and 889 nanometers (nm). Because light at these three wavelengths is absorbed at different altitudes by atmospheric methane, a comparison of the resulting images reveals information about the heights of clouds in Jupiter's atmosphere. This information can be visualized by rendering cloud surfaces with the appropriate height variations. The visualization reduces Jupiter's true cloud structure to two layers. The height of a high haze layer is assumed to be proportional to the reflectivity of Jupiter at 889 nm. The height of a lower tropospheric cloud is assumed to be proportional to the reflectivity at 727 nm divided by that at 756 nm. This model is overly simplistic, but is based on more sophisticated studies of Jupiter's cloud structure. The upper and lower clouds are separated in the rendering by an arbitrary amount, and the height variations are exaggerated by a factor of 25. The lower cloud is colored using the same false color scheme used in previously released image products, assigning red, green, and blue to the 756, 727, and 889 nanometer mosaics, respectively. Light bluish clouds are high and thin, reddish clouds are low, and white clouds are high and thick. The dark blue hotspot in the center is a hole in the lower cloud with an overlying thin haze. The images used cover latitudes 1 to 10 degrees and are centered at longitude 336 degrees west. The smallest resolved features are tens of kilometers in size. These images were taken on December 17, 1996, at a range of 1.5 million kilometers (about 930,000 miles) by the Solid State Imaging (CCD) system on NASA's Galileo spacecraft. The Jet Propulsion Laboratory, Pasadena, CA manages the Galileo mission for NASA's Office of Space Science, Washington, DC. JPL is an operating division of California Institute of Technology (Caltech). This image and other images and data received from Galileo are posted on the World Wide Web, on the Galileo mission home page at URL http://www.jpl.nasa.gov/galileo.
Holtzapple, R. L.; Billing, M. G.; Campbell, R. C.; ...
2016-04-11
Electron cloud related emittance dilution and instabilities of bunch trains limit the performance of high intensity circular colliders. One of the key goals of the Cornell electron-positron storage ring Test Accelerator (CesrTA) research program is to improve our understanding of how the electron cloud alters the dynamics of bunches within the train. Single bunch beam diagnostics have been developed to measure the beam spectra and vertical beam size, two important dynamical effects of beams interacting with the electron cloud, for bunch trains on a turn-by-turn basis. Experiments have been performed at CesrTA to probe the interaction of the electron cloud with stored positron bunch trains. The purpose of these experiments was to characterize the dependence of beam-electron cloud interactions on the machine parameters such as bunch spacing, vertical chromaticity, and bunch current. The beam dynamics of the stored beam, in the presence of the electron cloud, was quantified using: 1) a gated beam position monitor (BPM) and spectrum analyzer to measure the bunch-by-bunch frequency spectrum of the bunch trains, 2) an x-ray beam size monitor to record the bunch-by-bunch, turn-by-turn vertical size of each bunch within the trains. In this study we report on the observations from these experiments and analyze the effects of the electron cloud on the stability of bunches in a train under many different operational conditions.
NASA Astrophysics Data System (ADS)
Holtzapple, R. L.; Billing, M. G.; Campbell, R. C.; Dugan, G. F.; Flanagan, J.; McArdle, K. E.; Miller, M. I.; Palmer, M. A.; Ramirez, G. A.; Sonnad, K. G.; Totten, M. M.; Tucker, S. L.; Williams, H. A.
2016-04-01
Electron cloud related emittance dilution and instabilities of bunch trains limit the performance of high intensity circular colliders. One of the key goals of the Cornell electron-positron storage ring Test Accelerator (CesrTA) research program is to improve our understanding of how the electron cloud alters the dynamics of bunches within the train. Single bunch beam diagnostics have been developed to measure the beam spectra and vertical beam size, two important dynamical effects of beams interacting with the electron cloud, for bunch trains on a turn-by-turn basis. Experiments have been performed at CesrTA to probe the interaction of the electron cloud with stored positron bunch trains. The purpose of these experiments was to characterize the dependence of beam-electron cloud interactions on the machine parameters such as bunch spacing, vertical chromaticity, and bunch current. The beam dynamics of the stored beam, in the presence of the electron cloud, was quantified using: 1) a gated beam position monitor (BPM) and spectrum analyzer to measure the bunch-by-bunch frequency spectrum of the bunch trains; 2) an x-ray beam size monitor to record the bunch-by-bunch, turn-by-turn vertical size of each bunch within the trains. In this paper we report on the observations from these experiments and analyze the effects of the electron cloud on the stability of bunches in a train under many different operational conditions.
Artist's Rendering of Multiple Whirlpools in a Sodium Gas Cloud
NASA Technical Reports Server (NTRS)
2003-01-01
This image depicts the formation of multiple whirlpools in a sodium gas cloud. Scientists who cooled the cloud and made it spin created the whirlpools in a Massachusetts Institute of Technology laboratory, as part of NASA-funded research. This process is similar to a phenomenon called starquakes that appear as glitches in the rotation of pulsars in space. MIT's Wolfgang Ketterle and his colleagues, who conducted the research under a grant from the Biological and Physical Research Program through NASA's Jet Propulsion Laboratory, Pasadena, Calif., cooled the sodium gas to less than one millionth of a degree above absolute zero (-273 Celsius or -460 Fahrenheit). At such extreme cold, the gas cloud converts to a peculiar form of matter called Bose-Einstein condensate, as predicted by Albert Einstein and Satyendra Bose of India in 1927. No physical container can hold such ultra-cold matter, so Ketterle's team used magnets to keep the cloud in place. They then used a laser beam to make the gas cloud spin, a process Ketterle compares to stroking a ping-pong ball with a feather until it starts spinning. The spinning sodium gas cloud, whose volume was one-millionth of a cubic centimeter, much smaller than a raindrop, developed a regular pattern of more than 100 whirlpools.
NASA Astrophysics Data System (ADS)
Feng, Bing
Electron cloud instabilities have been observed in many circular accelerators around the world and raised concerns for future accelerators and possible upgrades. In this thesis, the electron cloud instabilities are studied with the quasi-static particle-in-cell (PIC) code QuickPIC. Modeling in three dimensions the long-timescale propagation of a beam in electron clouds in circular accelerators requires faster and more efficient simulation codes. Thousands of processors are easily available for parallel computations. However, it is not straightforward to increase the effective speed of the simulation by running the same problem size on an increasing number of processors because there is a limit to domain size in the decomposition of the two-dimensional part of the code. A pipelining algorithm applied on the fully parallelized particle-in-cell code QuickPIC is implemented to overcome this limit. The pipelining algorithm uses multiple groups of processors and optimizes the job allocation on the processors in parallel computing. With this novel algorithm, it is possible to use on the order of 10^2 processors, and to expand the scale and the speed of the simulation with QuickPIC by a similar factor. In addition to the efficiency improvement with the pipelining algorithm, the fidelity of QuickPIC is enhanced by adding two physics models, the beam space charge effect and the dispersion effect. Simulation of two specific circular machines is performed with the enhanced QuickPIC. First, the proposed upgrade to the Fermilab Main Injector is studied with an eye toward guiding the design of the upgrade and code validation. Moderate emittance growth is observed for the upgrade of increasing the bunch population by 5 times. But the simulation also shows that increasing the beam energy from 8 GeV to 20 GeV or above can effectively limit the emittance growth. Then the enhanced QuickPIC is used to simulate the electron cloud effect on the electron beam in the Cornell Energy Recovery Linac (ERL), motivated by the extremely small emittance and high peak currents anticipated in the machine. A tune shift is discovered from the simulation; however, emittance growth of the electron beam in the electron cloud is not observed for ERL parameters.
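The thesis describes the pipelining algorithm only at a high level; the fragment below is merely a generic software-pipelining schedule of the kind implied — several processor groups working on successive portions of the problem at staggered steps, so that more processors stay busy than a single 2D domain decomposition allows. Group counts, slice counts, and the dependency rule are illustrative assumptions, not QuickPIC's actual implementation.

```python
def pipeline_schedule(n_slices, n_groups):
    """Which (group, slice) pairs are active at each pipeline step.

    Assumed dependency: group g may work on slice s only at step s + g,
    i.e. each group trails the previous one by one slice, which is the
    classic software-pipelining pattern.
    """
    n_steps = n_slices + n_groups - 1
    schedule = []
    for step in range(n_steps):
        active = [(g, step - g) for g in range(n_groups)
                  if 0 <= step - g < n_slices]
        schedule.append(active)
    return schedule

for step, active in enumerate(pipeline_schedule(n_slices=6, n_groups=3)):
    print(f"step {step}: " + ", ".join(f"group {g} -> slice {s}" for g, s in active))
```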
Genes2WordCloud: a quick way to identify biological themes from gene lists and free text.
Baroukh, Caroline; Jenkins, Sherry L; Dannenfelser, Ruth; Ma'ayan, Avi
2011-10-13
Word-clouds recently emerged on the web as a solution for quickly summarizing text by maximizing the display of most relevant terms about a specific topic in the minimum amount of space. As biologists are faced with the daunting amount of new research data commonly presented in textual formats, word-clouds can be used to summarize and represent biological and/or biomedical content for various applications. Genes2WordCloud is a web application that enables users to quickly identify biological themes from gene lists and research relevant text by constructing and displaying word-clouds. It provides users with several different options and ideas for the sources that can be used to generate a word-cloud. Different options for rendering and coloring the word-clouds give users the flexibility to quickly generate customized word-clouds of their choice. Genes2WordCloud is a word-cloud generator and a word-cloud viewer that is based on WordCram implemented using Java, Processing, AJAX, mySQL, and PHP. Text is fetched from several sources and then processed to extract the most relevant terms with their computed weights based on word frequencies. Genes2WordCloud is freely available for use online; it is open source software and is available for installation on any web-site along with supporting documentation at http://www.maayanlab.net/G2W. Genes2WordCloud provides a useful way to summarize and visualize large amounts of textual biological data or to find biological themes from several different sources. The open source availability of the software enables users to implement customized word-clouds on their own web-sites and desktop applications.
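Genes2WordCloud's own weighting code is not shown in the abstract; the fragment below only illustrates the frequency-based term weighting it describes (the stop-word list, size scaling, and cut-off are arbitrary choices, not the tool's).

```python
import re
from collections import Counter

STOP_WORDS = {"the", "and", "of", "in", "a", "to", "is", "for", "with"}

def word_weights(text, max_font=96, min_font=12, top_n=50):
    """Map the most frequent informative words to display font sizes."""
    words = [w for w in re.findall(r"[a-z]+", text.lower())
             if w not in STOP_WORDS and len(w) > 2]
    counts = Counter(words).most_common(top_n)
    if not counts:
        return {}
    peak = counts[0][1]
    return {w: min_font + (max_font - min_font) * c / peak for w, c in counts}

abstract = "Gene expression of kinase genes regulates kinase signaling pathways."
print(word_weights(abstract))
```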
Genes2WordCloud: a quick way to identify biological themes from gene lists and free text
2011-01-01
Background Word-clouds recently emerged on the web as a solution for quickly summarizing text by maximizing the display of most relevant terms about a specific topic in the minimum amount of space. As biologists are faced with the daunting amount of new research data commonly presented in textual formats, word-clouds can be used to summarize and represent biological and/or biomedical content for various applications. Results Genes2WordCloud is a web application that enables users to quickly identify biological themes from gene lists and research relevant text by constructing and displaying word-clouds. It provides users with several different options and ideas for the sources that can be used to generate a word-cloud. Different options for rendering and coloring the word-clouds give users the flexibility to quickly generate customized word-clouds of their choice. Methods Genes2WordCloud is a word-cloud generator and a word-cloud viewer that is based on WordCram implemented using Java, Processing, AJAX, mySQL, and PHP. Text is fetched from several sources and then processed to extract the most relevant terms with their computed weights based on word frequencies. Genes2WordCloud is freely available for use online; it is open source software and is available for installation on any web-site along with supporting documentation at http://www.maayanlab.net/G2W. Conclusions Genes2WordCloud provides a useful way to summarize and visualize large amounts of textual biological data or to find biological themes from several different sources. The open source availability of the software enables users to implement customized word-clouds on their own web-sites and desktop applications. PMID:21995939
Three dimensional Visualization of Jupiter's Equatorial Region
NASA Technical Reports Server (NTRS)
1997-01-01
Frames from a three dimensional visualization of Jupiter's equatorial region. The images used cover an area of 34,000 kilometers by 11,000 kilometers (about 21,100 by 6,800 miles) near an equatorial 'hotspot' similar to the site where the probe from NASA's Galileo spacecraft entered Jupiter's atmosphere on December 7th, 1995. These features are holes in the bright, reflective, equatorial cloud layer where warmer thermal emission from Jupiter's deep atmosphere can pass through. The circulation patterns observed here along with the composition measurements from the Galileo Probe suggest that dry air may be converging and sinking over these regions, maintaining their cloud-free appearance. The bright clouds to the right of the hotspot as well as the other bright features may be examples of upwelling of moist air and condensation.
This frame is a view from the southwest looking northeast, from an altitude just above the high haze layer. The streaks in the lower cloud leading towards the hotspot are visible. The upper haze layer is mostly flat, with notable small peaks that can be matched with features in the lower cloud. In reality, these areas may represent a continuous vertical cloud column. Galileo is the first spacecraft to image Jupiter in near-infrared light (which is invisible to the human eye) using three filters at 727, 756, and 889 nanometers (nm). Because light at these three wavelengths is absorbed at different altitudes by atmospheric methane, a comparison of the resulting images reveals information about the heights of clouds in Jupiter's atmosphere. This information can be visualized by rendering cloud surfaces with the appropriate height variations. The visualization reduces Jupiter's true cloud structure to two layers. The height of a high haze layer is assumed to be proportional to the reflectivity of Jupiter at 889 nm. The height of a lower tropospheric cloud is assumed to be proportional to the reflectivity at 727 nm divided by that at 756 nm. This model is overly simplistic, but is based on more sophisticated studies of Jupiter's cloud structure. The upper and lower clouds are separated in the rendering by an arbitrary amount, and the height variations are exaggerated by a factor of 25. The lower cloud is colored using the same false color scheme used in previously released image products, assigning red, green, and blue to the 756, 727, and 889 nanometer mosaics, respectively. Light bluish clouds are high and thin, reddish clouds are low, and white clouds are high and thick. The dark blue hotspot in the center is a hole in the lower cloud with an overlying thin haze. The images used cover latitudes 1 to 10 degrees and are centered at longitude 336 degrees west. The smallest resolved features are tens of kilometers in size. These images were taken on December 17, 1996, at a range of 1.5 million kilometers (about 930,000 miles) by the Solid State Imaging (CCD) system on NASA's Galileo spacecraft. The Jet Propulsion Laboratory, Pasadena, CA manages the Galileo mission for NASA's Office of Space Science, Washington, DC. JPL is an operating division of California Institute of Technology (Caltech). This image and other images and data received from Galileo are posted on the World Wide Web, on the Galileo mission home page at URL http://galileo.jpl.nasa.gov.
Three dimensional Visualization of Jupiter's Equatorial Region
NASA Technical Reports Server (NTRS)
1997-01-01
Frames from a three dimensional visualization of Jupiter's equatorial region. The images used cover an area of 34,000 kilometers by 11,000 kilometers (about 21,100 by 6,800 miles) near an equatorial 'hotspot' similar to the site where the probe from NASA's Galileo spacecraft entered Jupiter's atmosphere on December 7th, 1995. These features are holes in the bright, reflective, equatorial cloud layer where warmer thermal emission from Jupiter's deep atmosphere can pass through. The circulation patterns observed here along with the composition measurements from the Galileo Probe suggest that dry air may be converging and sinking over these regions, maintaining their cloud-free appearance. The bright clouds to the right of the hotspot as well as the other bright features may be examples of upwelling of moist air and condensation.
This frame is a view to the southeast, from between the cloud layers and over the north center of the region. The tall white clouds in the lower cloud deck are probably much like large terrestrial thunderclouds. They may be regions where atmospheric water powers vertical convection over large horizontal distances. Galileo is the first spacecraft to image Jupiter in near-infrared light (which is invisible to the human eye) using three filters at 727, 756, and 889 nanometers (nm). Because light at these three wavelengths is absorbed at different altitudes by atmospheric methane, a comparison of the resulting images reveals information about the heights of clouds in Jupiter's atmosphere. This information can be visualized by rendering cloud surfaces with the appropriate height variations. The visualization reduces Jupiter's true cloud structure to two layers. The height of a high haze layer is assumed to be proportional to the reflectivity of Jupiter at 889 nm. The height of a lower tropospheric cloud is assumed to be proportional to the reflectivity at 727 nm divided by that at 756 nm. This model is overly simplistic, but is based on more sophisticated studies of Jupiter's cloud structure. The upper and lower clouds are separated in the rendering by an arbitrary amount, and the height variations are exaggerated by a factor of 25. The lower cloud is colored using the same false color scheme used in previously released image products, assigning red, green, and blue to the 756, 727, and 889 nanometer mosaics, respectively. Light bluish clouds are high and thin, reddish clouds are low, and white clouds are high and thick. The dark blue hotspot in the center is a hole in the lower cloud with an overlying thin haze. The images used cover latitudes 1 to 10 degrees and are centered at longitude 336 degrees west. The smallest resolved features are tens of kilometers in size. These images were taken on December 17, 1996, at a range of 1.5 million kilometers (about 930,000 miles) by the Solid State Imaging (CCD) system on NASA's Galileo spacecraft. The Jet Propulsion Laboratory, Pasadena, CA manages the Galileo mission for NASA's Office of Space Science, Washington, DC. JPL is an operating division of California Institute of Technology (Caltech). This image and other images and data received from Galileo are posted on the World Wide Web, on the Galileo mission home page at URL http://www.jpl.nasa.gov/galileo.
Three dimensional Visualization of Jupiter's Equatorial Region
NASA Technical Reports Server (NTRS)
1997-01-01
Frames from a three dimensional visualization of Jupiter's equatorial region. The images used cover an area of 34,000 kilometers by 11,000 kilometers (about 21,100 by 6,800 miles) near an equatorial 'hotspot' similar to the site where the probe from NASA's Galileo spacecraft entered Jupiter's atmosphere on December 7th, 1995. These features are holes in the bright, reflective, equatorial cloud layer where warmer thermal emission from Jupiter's deep atmosphere can pass through. The circulation patterns observed here along with the composition measurements from the Galileo Probe suggest that dry air may be converging and sinking over these regions, maintaining their cloud-free appearance. The bright clouds to the right of the hotspot as well as the other bright features may be examples of upwelling of moist air and condensation.
This frame is a view to the west, from between the cloud layers and over the patchy white clouds to the east of the hotspot. This is probably an area where moist convection is occurring over large horizontal distances, similar to the atmosphere over the equatorial ocean on Earth. The clouds are high and thick, and are observed to change rapidly over short time scales. Galileo is the first spacecraft to image Jupiter in near-infrared light (which is invisible to the human eye) using three filters at 727, 756, and 889 nanometers (nm). Because light at these three wavelengths is absorbed at different altitudes by atmospheric methane, a comparison of the resulting images reveals information about the heights of clouds in Jupiter's atmosphere. This information can be visualized by rendering cloud surfaces with the appropriate height variations. The visualization reduces Jupiter's true cloud structure to two layers. The height of a high haze layer is assumed to be proportional to the reflectivity of Jupiter at 889 nm. The height of a lower tropospheric cloud is assumed to be proportional to the reflectivity at 727 nm divided by that at 756 nm. This model is overly simplistic, but is based on more sophisticated studies of Jupiter's cloud structure. The upper and lower clouds are separated in the rendering by an arbitrary amount, and the height variations are exaggerated by a factor of 25. The lower cloud is colored using the same false color scheme used in previously released image products, assigning red, green, and blue to the 756, 727, and 889 nanometer mosaics, respectively. Light bluish clouds are high and thin, reddish clouds are low, and white clouds are high and thick. The dark blue hotspot in the center is a hole in the lower cloud with an overlying thin haze. The images used cover latitudes 1 to 10 degrees and are centered at longitude 336 degrees west. The smallest resolved features are tens of kilometers in size. These images were taken on December 17, 1996, at a range of 1.5 million kilometers (about 930,000 miles) by the Solid State Imaging (CCD) system on NASA's Galileo spacecraft. The Jet Propulsion Laboratory, Pasadena, CA manages the Galileo mission for NASA's Office of Space Science, Washington, DC. JPL is an operating division of California Institute of Technology (Caltech). This image and other images and data received from Galileo are posted on the World Wide Web, on the Galileo mission home page at URL http://galileo.jpl.nasa.gov.
Particle tracking acceleration via signed distance fields in direct-accelerated geometry Monte Carlo
Shriwise, Patrick C.; Davis, Andrew; Jacobson, Lucas J.; ...
2017-08-26
Computer-aided design (CAD)-based Monte Carlo radiation transport is of value to the nuclear engineering community for its ability to conduct transport on high-fidelity models of nuclear systems, but it is more computationally expensive than native geometry representations. This work describes the adaptation of a rendering data structure, the signed distance field, as a geometric query tool for accelerating CAD-based transport in the direct-accelerated geometry Monte Carlo toolkit. Demonstrations of its effectiveness are shown for several problems. The beginnings of a predictive model for the data structure's utilization based on various problem parameters are also introduced.
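The abstract does not detail how the signed distance field is queried during particle tracking; a common pattern, sketched below for illustration rather than as the toolkit's implementation, is to advance a particle by the magnitude of the local signed distance, which can never step across a surface and therefore converges on the next boundary crossing.

```python
import numpy as np

def sphere_sdf(p, center=np.zeros(3), radius=1.0):
    """Signed distance to a sphere: negative inside, positive outside."""
    return float(np.linalg.norm(p - center) - radius)

def distance_to_boundary(origin, direction, sdf, max_dist=10.0,
                         eps=1e-6, max_steps=256):
    """March along a ray using |sdf| as a safe step length.

    Because |sdf(p)| is the distance to the nearest surface, a step of
    that size cannot cross the zero level set, so the walk stops at
    (or extremely close to) the first boundary along the ray.
    """
    direction = direction / np.linalg.norm(direction)
    t = 0.0
    for _ in range(max_steps):
        d = abs(sdf(origin + t * direction))
        if d < eps:
            return t              # boundary reached
        t += d
        if t > max_dist:
            break
    return None                   # no surface found within max_dist

hit = distance_to_boundary(np.array([-3.0, 0.0, 0.0]),
                           np.array([1.0, 0.0, 0.0]), sphere_sdf)
print(hit)   # ~2.0 for a unit sphere centered at the origin
```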
Interactive distributed hardware-accelerated LOD-sprite terrain rendering with stable frame rates
NASA Astrophysics Data System (ADS)
Swan, J. E., II; Arango, Jesus; Nakshatrala, Bala K.
2002-03-01
A stable frame rate is important for interactive rendering systems. Image-based modeling and rendering (IBMR) techniques, which model parts of the scene with image sprites, are a promising technique for interactive systems because they allow the sprite to be manipulated instead of the underlying scene geometry. However, with IBMR techniques a frequent problem is an unstable frame rate, because generating an image sprite (with 3D rendering) is time-consuming relative to manipulating the sprite (with 2D image resampling). This paper describes one solution to this problem, by distributing an IBMR technique into a collection of cooperating threads and executable programs across two computers. The particular IBMR technique distributed here is the LOD-Sprite algorithm. This technique uses a multiple level-of-detail (LOD) scene representation. It first renders a keyframe from a high-LOD representation, and then caches the frame as an image sprite. It renders subsequent spriteframes by texture-mapping the cached image sprite into a lower-LOD representation. We describe a distributed architecture and implementation of LOD-Sprite, in the context of terrain rendering, which takes advantage of graphics hardware. We present timing results which indicate we have achieved a stable frame rate. In addition to LOD-Sprite, our distribution method holds promise for other IBMR techniques.
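Restated in the simplest possible terms (the threshold, names, and view-direction test below are placeholders, not the authors' API), the LOD-Sprite control flow boils down to a per-frame choice between an expensive keyframe render and a cheap sprite warp:

```python
import numpy as np

ANGLE_THRESHOLD_DEG = 5.0   # assumed re-keyframe trigger, not from the paper

def angle_between(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    cosang = np.clip(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)), -1.0, 1.0)
    return np.degrees(np.arccos(cosang))

def choose_path(view_dir, keyframe_dir):
    """Decide between the expensive and the cheap LOD-Sprite path.

    'keyframe': re-render the scene at high LOD and cache it as a sprite.
    'sprite':   texture-map the cached sprite onto the low-LOD terrain,
                which is only a 2D resampling and therefore much faster.
    """
    if angle_between(view_dir, keyframe_dir) > ANGLE_THRESHOLD_DEG:
        return "keyframe"
    return "sprite"

keyframe_dir = [0.0, 0.0, -1.0]
for view in ([0.01, 0.0, -1.0], [0.3, 0.0, -1.0]):
    print(view, "->", choose_path(view, keyframe_dir))
```

Distributing the two paths across machines, as the paper does, keeps the cheap path responsive while the expensive path runs asynchronously.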
DOE Office of Scientific and Technical Information (OSTI.GOV)
Asahina, Yuta; Ohsuga, Ken; Nomura, Mariko, E-mail: asahina@cfca.jp
By performing three-dimensional magnetohydrodynamics simulations of subrelativistic jets and disk winds propagating into the magnetized inhomogeneous interstellar medium (ISM), we investigate the magnetic effects on the active galactic nucleus feedback. Our simulations reveal that the magnetic tension force promotes the acceleration of the dense gas clouds, since the magnetic field lines, which are initially straight, bend around the gas clouds. In the jet models, the velocity dispersion of the clouds increases with an increase in the initial magnetic fields. The increment of the kinetic energy of the clouds is proportional to the initial magnetic fields, implying that the magnetic tension force increases the energy conversion efficiency from the jet to the gas clouds. Through simulations of the mildly collimated disk wind and the funnel-shaped disk wind, we confirm that such an enhancement of the energy conversion efficiency via the magnetic fields appears even if the energy is injected via the disk winds. The enhancement of the acceleration of the dense part of the magnetized ISM via the magnetic tension force will occur wherever the magnetized inhomogeneous matter is blown away.
Equalizer: a scalable parallel rendering framework.
Eilemann, Stefan; Makhinya, Maxim; Pajarola, Renato
2009-01-01
Continuing improvements in CPU and GPU performance as well as increasing multi-core processor and cluster-based parallelism demand flexible and scalable parallel rendering solutions that can exploit multipipe hardware-accelerated graphics. In fact, to achieve interactive visualization, scalable rendering systems are essential to cope with the rapid growth of data sets. However, parallel rendering systems are non-trivial to develop and often only application-specific implementations have been proposed. The task of developing a scalable parallel rendering framework is even more difficult if it should be generic to support various types of data and visualization applications, and at the same time work efficiently on a cluster with distributed graphics cards. In this paper we introduce a novel system called Equalizer, a toolkit for scalable parallel rendering based on OpenGL which provides an application programming interface (API) to develop scalable graphics applications for a wide range of systems ranging from large distributed visualization clusters and multi-processor multipipe graphics systems to single-processor single-pipe desktop machines. We describe the system architecture and the basic API, discuss its advantages over previous approaches, and present example configurations and usage scenarios as well as scalability results.
To the Cloud! A Grassroots Proposal to Accelerate Brain Science Discovery
Vogelstein, Joshua T.; Mensh, Brett; Hausser, Michael; Spruston, Nelson; Evans, Alan; Kording, Konrad; Amunts, Katrin; Ebell, Christoph; Muller, Jeff; Telefont, Martin; Hill, Sean; Koushika, Sandhya P.; Cali, Corrado; Valdés-Sosa, Pedro Antonio; Littlewood, Peter; Koch, Christof; Saalfeld, Stephan; Kepecs, Adam; Peng, Hanchuan; Halchenko, Yaroslav O.; Kiar, Gregory; Poo, Mu-Ming; Poline, Jean-Baptiste; Milham, Michael P.; Schaffer, Alyssa Picchini; Gidron, Rafi; Okano, Hideyuki; Calhoun, Vince D; Chun, Miyoung; Kleissas, Dean M.; Vogelstein, R. Jacob; Perlman, Eric; Burns, Randal; Huganir, Richard; Miller, Michael I.
2018-01-01
The revolution in neuroscientific data acquisition is creating an analysis challenge. We propose leveraging cloud-computing technologies to enable large-scale neurodata storing, exploring, analyzing, and modeling. This utility will empower scientists globally to generate and test theories of brain function and dysfunction. PMID:27810005
Giant molecular clouds as regions of particle acceleration
NASA Technical Reports Server (NTRS)
Dogiel, V. A.; Gurevich, A. V.; Istomin, Y. N.; Zybin, K. A.
1985-01-01
One of the most interesting results of investigations carried out with the satellites SAS-II and COS-B is the discovery of unidentified discrete gamma-ray sources. Possibly a considerable part of them may well be giant molecular clouds. Gamma emission from clouds is caused by processes involving cosmic rays. Estimation of the cosmic ray density in clouds has shown that, for energies E ≈ 1 GeV, their density can exceed that in intercloud space by a factor of 10 to 1000. We have made an attempt to determine the mechanism which could lead to this increase in the cosmic ray density in clouds.
Volcanic explosion clouds - Density, temperature, and particle content estimates from cloud motion
NASA Technical Reports Server (NTRS)
Wilson, L.; Self, S.
1980-01-01
Photographic records of 10 vulcanian eruption clouds produced during the 1978 eruption of Fuego Volcano in Guatemala have been analyzed to determine cloud velocity and acceleration at successive stages of expansion. Cloud motion is controlled by air drag (dominant during early, high-speed motion) and buoyancy (dominant during late motion when the cloud is convecting slowly). Cloud densities in the range 0.6 to 1.2 times that of the surrounding atmosphere were obtained by fitting equations of motion for two common cloud shapes (spheres and vertical cylinders) to the observed motions. Analysis of the heat budget of a cloud permits an estimate of cloud temperature and particle weight fraction to be made from the density. Model results suggest that clouds generally reached temperatures within 10 K of that of the surrounding air within 10 seconds of formation and that dense particle weight fractions were less than 2% by this time. The maximum sizes of dense particles supported by motion in the convecting clouds range from 140 to 1700 microns.
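A simple forward model of the kind fitted in this analysis, a cloud of assumed density moving through air under buoyancy and aerodynamic drag, can be integrated directly. The sketch below uses a spherical cloud and placeholder values (radius, initial speed, drag coefficient), not the observed Fuego parameters; varying the cloud density and comparing the modelled motion with the photographic record is the essence of the fitting procedure.

```python
# Illustrative equation of motion for a spherical cloud of density rho_c moving
# vertically through air of density rho_a, with buoyancy and drag. All numbers
# are placeholders.
import numpy as np

def cloud_trajectory(rho_c, rho_a=1.0, radius=100.0, v0=60.0,
                     cd=1.0, g=9.81, dt=0.1, t_end=30.0):
    volume = 4.0 / 3.0 * np.pi * radius**3
    area = np.pi * radius**2
    mass = rho_c * volume
    t, v, z = 0.0, v0, 0.0
    out = []
    while t <= t_end:
        out.append((t, z, v))
        buoyancy = (rho_a - rho_c) * volume * g        # net of weight
        drag = -0.5 * rho_a * cd * area * v * abs(v)
        v += dt * (buoyancy + drag) / mass
        z += dt * v
        t += dt
    return np.array(out)

# A denser-than-air cloud decelerates quickly; a slightly buoyant one keeps rising.
for rho_c in (0.8, 1.2):
    traj = cloud_trajectory(rho_c)
    print(rho_c, "final height ~", round(traj[-1, 1], 1), "m")
```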
Ongoing cosmic ray acceleration in the supernova remnant W51C revealed with the MAGIC telescopes
NASA Astrophysics Data System (ADS)
Krause, J.; Reichardt, I.; Carmona, E.; Gozzini, S. R.; Jankowski, F.; MAGIC Collaboration
2012-12-01
The supernova remnant (SNR) W51C interacts with the molecular clouds of the star-forming region W51B, making the W51 complex one of the most promising targets for studying cosmic ray acceleration. Gamma-ray emission from this region was discovered by Fermi/LAT and H.E.S.S., although its location was compatible with the SNR shell, the molecular cloud (MC) and a pulsar wind nebula (PWN) candidate. The modeling of the spectral energy distribution presented by the Fermi/LAT collaboration suggests a hadronic emission mechanism. Furthermore, indications of an enhanced flux of low-energy cosmic rays in the interaction region between the SNR and the MC have been reported, based on ionization measurements in the mm regime. MAGIC conducted deep observations of W51, yielding a detection of extended emission with more than 11 standard deviations. We extend the spectrum from the highest Fermi/LAT energies to ~5 TeV and find that it follows a single power law with an index of 2.58 ± 0.07 (stat) ± 0.22 (syst). We restrict the main part of the emission region to the zone where the SNR interacts with the molecular clouds. We also find a tail extending towards the PWN candidate CXO J192318.5+140305, possibly contributing up to 20% of the total flux. The broadband spectral energy distribution can be explained with a hadronic model that implies proton acceleration at least up to 50 TeV. This result, together with the morphology of the source, suggests that we observe ongoing acceleration of ions in the interaction zone between the SNR and the cloud.
Hongyi Xu; Barbic, Jernej
2017-01-01
We present an algorithm for fast continuous collision detection between points and signed distance fields, and demonstrate how to robustly use it for 6-DoF haptic rendering of contact between objects with complex geometry. Continuous collision detection is often needed in computer animation, haptics, and virtual reality applications, but has so far only been investigated for polygon (triangular) geometry representations. We demonstrate how to robustly and continuously detect intersections between points and level sets of the signed distance field. We suggest using an octree subdivision of the distance field for fast traversal of distance field cells. We also give a method to resolve continuous collisions between point clouds organized into a tree hierarchy and a signed distance field, enabling rendering of contact between rigid objects with complex geometry. We investigate and compare two 6-DoF haptic rendering methods now applicable to point-versus-distance field contact for the first time: continuous integration of penalty forces, and a constraint-based method. An experimental comparison to discrete collision detection demonstrates that the continuous method is more robust and can correctly resolve collisions even under high velocities and during complex contact.
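The continuous point-versus-level-set test can be illustrated compactly: the signed distance is sampled along the point's motion over one timestep, the first sign change is bracketed, and the contact time is refined. The sketch below is an assumption-laden simplification, an analytic sphere SDF replaces the gridded field and no octree is used; the paper's octree traversal only accelerates these same queries.

```python
# Minimal sketch of continuous point-vs-SDF collision: bracket the first
# crossing of the zero level set along the motion, then refine by bisection.
import numpy as np

def sdf(p):                       # signed distance to a unit sphere at origin
    return np.linalg.norm(p) - 1.0

def first_contact(p0, p1, n_samples=32, tol=1e-6):
    """Return the earliest t in [0,1] at which the moving point crosses the surface."""
    ts = np.linspace(0.0, 1.0, n_samples)
    ds = np.array([sdf(p0 + t * (p1 - p0)) for t in ts])
    sign_change = np.where(np.sign(ds[:-1]) != np.sign(ds[1:]))[0]
    if len(sign_change) == 0:
        return None
    lo, hi = ts[sign_change[0]], ts[sign_change[0] + 1]
    while hi - lo > tol:          # bisection refines the contact time
        mid = 0.5 * (lo + hi)
        if np.sign(sdf(p0 + mid * (p1 - p0))) == np.sign(sdf(p0 + lo * (p1 - p0))):
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

p0, p1 = np.array([2.0, 0.0, 0.0]), np.array([-2.0, 0.0, 0.0])
print("contact at t =", first_contact(p0, p1))   # ~0.25 (the point reaches x = 1)
```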
Scavenging of black carbon in mixed phase clouds at the high alpine site Jungfraujoch
NASA Astrophysics Data System (ADS)
Cozic, J.; Verheggen, B.; Mertes, S.; Connolly, P.; Bower, K.; Petzold, A.; Baltensperger, U.; Weingartner, E.
2006-11-01
The scavenging of black carbon (BC) in liquid and mixed phase clouds was investigated during intensive experiments in winter 2004, summer 2004 and winter 2005 at the high alpine research station Jungfraujoch (3580 m a.s.l., Switzerland). Aerosol residuals were sampled behind two well-characterized inlets: a total inlet, which collected cloud particles (drops and ice particles) as well as interstitial aerosol particles, and an interstitial inlet, which collected only interstitial (unactivated) aerosol particles. BC concentrations were measured behind each of these inlets along with the submicrometer aerosol number size distribution, from which a volume concentration was derived. These measurements were complemented by in-situ measurements of cloud microphysical parameters. BC was found to be scavenged into the cloud phase to the same extent as the bulk aerosol, which suggests that BC was covered with soluble material through aging processes, rendering it more hygroscopic. The scavenged fraction of BC (FScav,BC), defined as the fraction of BC that is incorporated into cloud droplets and ice crystals, decreases with increasing cloud ice mass fraction (IMF) from FScav,BC=60% in liquid phase clouds to FScav,BC~10% in mixed-phase clouds with IMF>0.2. This is explained by the evaporation of liquid droplets in the presence of ice crystals (Wegener-Bergeron-Findeisen process), releasing BC-containing cloud condensation nuclei back into the interstitial phase. In liquid clouds, the scavenged BC fraction is found to decrease with decreasing cloud liquid water content. The scavenged BC fraction is also found to decrease with increasing BC mass concentration since there is an increased competition for the available water vapour.
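The scavenged fraction follows directly from the two inlet measurements: the total inlet samples all BC (activated plus interstitial), the interstitial inlet only the unactivated part. The concentrations in the worked example below are made-up illustrative values, not measurements from the campaign.

```python
# Worked example of the scavenged-fraction definition used above.
def scavenged_fraction(bc_total_inlet, bc_interstitial_inlet):
    return (bc_total_inlet - bc_interstitial_inlet) / bc_total_inlet

# Example: 0.10 ug/m3 behind the total inlet, 0.04 ug/m3 behind the interstitial inlet
print(scavenged_fraction(0.10, 0.04))   # -> 0.6, i.e. FScav,BC = 60%
```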
A secure EHR system based on hybrid clouds.
Chen, Yu-Yi; Lu, Jun-Chao; Jan, Jinn-Ke
2012-10-01
Information and communication technologies have been applied to medical services and healthcare for a number of years to resolve problems in medical management. Consequently, application services rendering remote medical services and electronic health records (EHRs) have become a hot topic and have stimulated increased interest in recent years. Sharing EHR information can provide professional medical programs with consultancy, evaluation, and tracing services, and can certainly improve accessibility for the public receiving medical services or medical information at remote sites. With the widespread use of EHRs, building a secure EHR sharing environment has attracted a lot of attention in both the healthcare industry and the academic community. The cloud computing paradigm is one of the popular health-IT infrastructures for facilitating EHR sharing and EHR integration. In this paper, we propose an EHR sharing and integration system in healthcare clouds and analyze the arising security and privacy issues in access and management of EHRs.
Supernova Remnant Kes 17: An Efficient Cosmic Ray Accelerator inside a Molecular Cloud
NASA Astrophysics Data System (ADS)
Gelfand, Joseph; Slane, Patrick; Hughes, John; Temim, Tea; Castro, Daniel; Rakowski, Cara
Supernova remnants are believed to be the dominant source of cosmic ray protons below the "knee" in the energy spectrum. However, relatively few supernova remnants have been identified as efficient producers of cosmic ray protons. In this talk, I will present evidence that the production of cosmic ray protons is required to explain the broadband non-thermal spectrum of supernova remnant Kes 17 (SNR G304.6+0.1). Evidence for efficient cosmic ray acceleration in Kes 17 supports recent theoretical work concluding that the strong magnetic field, turbulence, and clumpy nature of molecular clouds enhance cosmic ray production in supernova remnants. While additional observations are needed to confirm this interpretation, further study of Kes 17 and similar sources is important for understanding how cosmic rays are accelerated in supernova remnants.
Trirotron: triode rotating beam radio frequency amplifier
Lebacqz, Jean V.
1980-01-01
High efficiency amplification of radio frequencies to very high power levels including: establishing a cylindrical cloud of electrons; establishing an electrical field surrounding and coaxial with the electron cloud to bias the electrons to remain in the cloud; establishing a rotating electrical field that surrounds and is coaxial with the steady field, the circular path of the rotating field being one wavelength long, whereby the peak of one phase of the rotating field is used to accelerate electrons in a beam through the bias field in synchronism with the peak of the rotating field so that there is a beam of electrons continuously extracted from the cloud and rotating with the peak; establishing a steady electrical field that surrounds and is coaxial with the rotating field for high-energy radial acceleration of the rotating beam of electrons; and resonating the rotating beam of electrons within a space surrounding the second field, the space being selected to have a phase velocity equal to that of the rotating field to thereby produce a high-power output at the frequency of the rotating field.
Bates, Maxwell; Berliner, Aaron J; Lachoff, Joe; Jaschke, Paul R; Groban, Eli S
2017-01-20
Wet Lab Accelerator (WLA) is a cloud-based tool that allows a scientist to conduct biology via robotic control without the need for any programming knowledge. A drag and drop interface provides a convenient and user-friendly method of generating biological protocols. Graphically developed protocols are turned into programmatic instruction lists required to conduct experiments at the cloud laboratory Transcriptic. Prior to the development of WLA, biologists were required to write in a programming language called "Autoprotocol" in order to work with Transcriptic. WLA relies on a new abstraction layer we call "Omniprotocol" to convert the graphical experimental description into lower level Autoprotocol language, which then directs robots at Transcriptic. While WLA has only been tested at Transcriptic, the conversion of graphically laid out experimental steps into Autoprotocol is generic, allowing extension of WLA into other cloud laboratories in the future. WLA hopes to democratize biology by bringing automation to general biologists.
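The kind of translation WLA performs, expanding graphically assembled high-level steps into a flat, machine-readable instruction list, can be sketched generically. The dictionary fields and step names below are purely illustrative assumptions; they are not the real Autoprotocol or Omniprotocol schema, only a stand-in showing the abstraction-lowering step.

```python
# Sketch of lowering a graphical protocol into an instruction list.
# Field names are illustrative only.
import json

HIGH_LEVEL_PROTOCOL = [
    {"step": "transfer", "source": "tube_1", "dest": "plate_1/A1", "volume_ul": 50},
    {"step": "incubate", "target": "plate_1", "temp_c": 37, "hours": 2},
]

def expand(step):
    if step["step"] == "transfer":
        return [{"op": "pipette", "from": step["source"], "to": step["dest"],
                 "volume": f'{step["volume_ul"]}:microliter'}]
    if step["step"] == "incubate":
        return [{"op": "incubate", "object": step["target"],
                 "where": f'{step["temp_c"]}C', "duration": f'{step["hours"]}:hour'}]
    raise ValueError("unknown step " + step["step"])

instructions = [instr for step in HIGH_LEVEL_PROTOCOL for instr in expand(step)]
print(json.dumps({"instructions": instructions}, indent=2))
```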
Elastic Cloud Computing Infrastructures in the Open Cirrus Testbed Implemented via Eucalyptus
NASA Astrophysics Data System (ADS)
Baun, Christian; Kunze, Marcel
Cloud computing realizes the advantages and overcomes some restrictions of the grid computing paradigm. Elastic infrastructures can easily be created and managed by cloud users. In order to accelerate the research on data center management and cloud services, the OpenCirrus™ research testbed has been started by HP, Intel and Yahoo!. Although commercial cloud offerings are proprietary, Open Source solutions exist in the field of IaaS with Eucalyptus, PaaS with AppScale and at the applications layer with Hadoop MapReduce. This paper examines the I/O performance of cloud computing infrastructures implemented with Eucalyptus in contrast to Amazon S3.
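A sequential write/read throughput probe is the basic building block of such an I/O comparison. The paper benchmarks Eucalyptus-backed storage against Amazon S3; the local-file sketch below only illustrates the measurement itself, with made-up sizes.

```python
# Simple sequential write/read throughput probe (illustrative only).
import os, time, tempfile

def throughput_mb_s(size_mb=64, block_kb=1024):
    block = os.urandom(block_kb * 1024)
    path = os.path.join(tempfile.gettempdir(), "io_probe.bin")
    t0 = time.time()
    with open(path, "wb") as f:
        for _ in range(size_mb * 1024 // block_kb):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())                 # make sure data reaches the disk
    write_rate = size_mb / (time.time() - t0)
    t0 = time.time()
    with open(path, "rb") as f:
        while f.read(block_kb * 1024):
            pass
    read_rate = size_mb / (time.time() - t0)
    os.remove(path)
    return write_rate, read_rate

print("write / read throughput (MB/s):", throughput_mb_s())
```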
Environments for online maritime simulators with cloud computing capabilities
NASA Astrophysics Data System (ADS)
Raicu, Gabriel; Raicu, Alexandra
2016-12-01
This paper presents the cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. We have to deal with a multiprocessing situation, using advanced technologies and distributed applications for remote ship scenarios and the automation of ship operations.
Nighttime Clouds in Martian Arctic (Accelerated Movie)
NASA Technical Reports Server (NTRS)
2008-01-01
An angry looking sky is captured in a movie clip consisting of 10 frames taken by the Surface Stereo Imager on NASA's Phoenix Mars Lander. The clip accelerates the motion. The images were taken around 3 a.m. local solar time at the Phoenix site during Sol 95 (Aug. 30), the 95th Martian day since landing. The swirling clouds may be moving generally in a westward direction over the lander. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vay, J.-L.; Furman, M.A.; Azevedo, A.W.
2004-04-19
We have integrated the electron-cloud code POSINST [1] with WARP [2]--a 3-D parallel Particle-In-Cell accelerator code developed for Heavy Ion Inertial Fusion--so that the two can interoperate. Both codes are run in the same process, communicate through a Python interpreter (already used in WARP), and share certain key arrays (so far, particle positions and velocities). Currently, POSINST provides primary and secondary sources of electrons, beam bunch kicks, a particle mover, and diagnostics. WARP provides the field solvers and diagnostics. Secondary emission routines are provided by the Tech-X package CMEE.
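The coupling strategy, two codes running in one process and sharing key particle arrays through a Python driver, can be illustrated with a toy example. The sketch below is not the real POSINST/WARP interface: the "cloud source" and "field push" functions are placeholders standing in for the roles the two codes play, and the field is a toy linear focusing force.

```python
# Conceptual sketch of in-process code coupling via shared particle arrays.
import numpy as np

class Particles:                      # shared state: positions and velocities
    def __init__(self):
        self.x = np.empty((0, 3))
        self.v = np.empty((0, 3))

def cloud_source(p, n_new=100):       # "POSINST-like" role: emit new electrons
    p.x = np.vstack([p.x, np.random.uniform(-1, 1, (n_new, 3))])
    p.v = np.vstack([p.v, np.zeros((n_new, 3))])

def field_push(p, dt=1e-9):           # "WARP-like" role: field solve + particle mover
    e_field = -p.x                    # toy linear focusing field
    p.v += dt * e_field
    p.x += dt * p.v

particles = Particles()
for step in range(5):                 # the Python driver interleaves the two codes
    cloud_source(particles)
    field_push(particles)
print("particles after 5 steps:", len(particles.x))
```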
Impact of a Pioneer/Rindler-type acceleration on the Oort Cloud
NASA Astrophysics Data System (ADS)
Iorio, Lorenzo
2012-01-01
According to a recent modified model of gravity at large distances, a radial constant and uniform extra-acceleration ? of Rindler type acts upon a test particle p in the static field of a central mass M if certain conditions are satisfied. Among other things, it was proposed as a potentially viable explanation of a part of the Pioneer anomaly. We study the impact that an anomalous Rindler-type term as large as ? m s-2 may have on the orbital dynamics of a typical object of the Oort Cloud whose self-energy is much smaller than its putative Rindler energy. By taking a typical comet moving along a highly eccentric and inclined orbit throughout the expected entire extension of the Oort Cloud (? pc), it turns out that the addition of an outward Rindler-like acceleration, that is, for ?, does not allow bound orbits. Instead, if ?, the resulting numerically integrated trajectory is limited in space, but it radically differs from the standard Keplerian ellipse. In particular, the heliocentric distance of the comet gets markedly reduced and experiences high-frequency oscillations, its speed is increased, and the overall pattern of the trajectory is quite isotropic. As a consequence, the standard picture of the Oort Cloud is radically altered, since its modified orbits are much less sensitive to the disturbing actions of the Galactic tide and nearby passing stars, whose effects, in the standard scenario, are responsible for the phenomenology on which our confidence in the existence of the cloud itself is based. The present analysis may be supplemented in the future by further statistical Monte Carlo type investigations, randomly varying the initial conditions of the comets.
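The numerical experiment described above amounts to integrating a Keplerian orbit with an extra constant radial acceleration of either sign. The sketch below uses scaled units (GM = 1) and a placeholder magnitude for the extra term, not the value considered in the paper; it only illustrates how the sign of the Rindler-like term changes the character of the orbit.

```python
# Kepler orbit plus a constant radial (Rindler-like) acceleration, scaled units.
import numpy as np

def integrate_orbit(a_rindler, n_steps=20000, dt=1e-3):
    r = np.array([1.0, 0.0])
    v = np.array([0.0, 1.0])                   # circular orbit when a_rindler = 0
    radii = []
    for _ in range(n_steps):
        dist = np.linalg.norm(r)
        acc = -r / dist**3 + a_rindler * r / dist   # gravity + radial extra term
        v += dt * acc                           # simple symplectic-Euler update
        r += dt * v
        radii.append(np.linalg.norm(r))
    return np.array(radii)

for a in (0.0, +0.05, -0.05):                   # outward vs inward extra acceleration
    radii = integrate_orbit(a)
    print(f"a_extra={a:+.2f}: r_min={radii.min():.2f}, r_max={radii.max():.2f}")
```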
Scavenging of black carbon in mixed phase clouds at the high alpine site Jungfraujoch
NASA Astrophysics Data System (ADS)
Cozic, J.; Verheggen, B.; Mertes, S.; Connolly, P.; Bower, K.; Petzold, A.; Baltensperger, U.; Weingartner, E.
2007-04-01
The scavenging of black carbon (BC) in liquid and mixed phase clouds was investigated during intensive experiments in winter 2004, summer 2004 and winter 2005 at the high alpine research station Jungfraujoch (3580 m a.s.l., Switzerland). Aerosol residuals were sampled behind two well-characterized inlets: a total inlet, which collected cloud particles (droplets and ice particles) as well as interstitial (unactivated) aerosol particles, and an interstitial inlet, which collected only interstitial aerosol particles. BC concentrations were measured behind each of these inlets along with the submicrometer aerosol number size distribution, from which a volume concentration was derived. These measurements were complemented by in-situ measurements of cloud microphysical parameters. BC was found to be scavenged into the condensed phase to the same extent as the bulk aerosol, which suggests that BC was covered with soluble material through aging processes, rendering it more hygroscopic. The scavenged fraction of BC (FScav,BC), defined as the fraction of BC that is incorporated into cloud droplets and ice crystals, decreases with increasing cloud ice mass fraction (IMF) from FScav,BC=60% in liquid phase clouds to FScav,BC~5-10% in mixed-phase clouds with IMF>0.2. This can be explained by the evaporation of liquid droplets in the presence of ice crystals (Wegener-Bergeron-Findeisen process), releasing BC-containing cloud condensation nuclei back into the interstitial phase. In liquid clouds, the scavenged BC fraction is found to decrease with decreasing cloud liquid water content. The scavenged BC fraction is also found to decrease with increasing BC mass concentration since there is an increased competition for the available water vapour.
NASA Astrophysics Data System (ADS)
Dietlicher, Remo; Neubauer, David; Lohmann, Ulrike
2018-04-01
A new scheme for stratiform cloud microphysics has been implemented in the ECHAM6-HAM2 general circulation model. It features a widely used description of cloud water with two categories for cloud droplets and raindrops. The unique aspect of the new scheme is the break with the traditional approach of describing cloud ice analogously. Here we parameterize cloud ice by a single category that predicts bulk particle properties (P3). This method has already been applied in a regional model and most recently also in the Community Atmosphere Model 5 (CAM5). A single cloud ice category does not rely on heuristic conversion rates from one category to another. Therefore, it is conceptually simpler and closer to first principles. This work shows that a single category is a viable approach to describing cloud ice in climate models. Prognostic representation of sedimentation is achieved by a nested approach for sub-stepping the cloud microphysics scheme. This yields good results in terms of accuracy and performance as compared to simulations with high temporal resolution. Furthermore, the new scheme allows for competition between various cloud processes and is thus able to represent, without bias, the ice formation pathway from nucleation to growth by vapor deposition and collisions to sedimentation. Specific aspects of the P3 method are evaluated. We could not produce a purely stratiform cloud where rime growth dominates growth by vapor deposition, and conclude that the lack of appropriate conditions renders the prognostic parameters associated with the rime properties unnecessary. Limitations inherent in a single category are examined.
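The nested sub-stepping of sedimentation can be illustrated with a single-column toy model: the host-model timestep is divided into substeps chosen from a CFL-like limit on the fall speed, and ice mass is advected downward with an explicit upwind scheme. The fall-speed law, layer thickness and timestep below are placeholders, not the ECHAM6-HAM2 parameters.

```python
# Sub-stepped upwind sedimentation of ice mass in one column (illustrative).
import numpy as np

def sediment_column(qi, dz=200.0, dt_host=1800.0, v_fall=1.0):
    n_sub = max(1, int(np.ceil(v_fall * dt_host / dz)))   # CFL-based number of substeps
    dt = dt_host / n_sub
    q = qi.copy()                       # index 0 = model top, index -1 = surface
    for _ in range(n_sub):
        flux = v_fall * q               # downward flux out of each layer
        q += dt / dz * (np.roll(flux, 1) - flux)
        q[0] -= dt / dz * flux[-1]      # undo the wrap-around from np.roll at the top
    return q, n_sub                     # mass leaving the lowest layer precipitates out

column = np.zeros(20)
column[10] = 1e-4                       # ice mass mixing ratio placed in one layer
after, n_sub = sediment_column(column)
print("substeps:", n_sub, "precipitated mass:", column.sum() - after.sum())
```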
Three dimensional Visualization of Jupiter's Equatorial Region
NASA Technical Reports Server (NTRS)
1997-01-01
Frames from a three dimensional visualization of Jupiter's equatorial region. The images used cover an area of 34,000 kilometers by 11,000 kilometers (about 21,100 by 6,800 miles) near an equatorial 'hotspot' similar to the site where the probe from NASA's Galileo spacecraft entered Jupiter's atmosphere on December 7th, 1995. These features are holes in the bright, reflective, equatorial cloud layer where warmer thermal emission from Jupiter's deep atmosphere can pass through. The circulation patterns observed here along with the composition measurements from the Galileo Probe suggest that dry air may be converging and sinking over these regions, maintaining their cloud-free appearance. The bright clouds to the right of the hotspot as well as the other bright features may be examples of upwelling of moist air and condensation.
This frame is a view from above and to the south of the visualized area, showing the entire model. The entire region is overlain by a thin, transparent haze. In places the haze is high and thick, especially to the east (to the right) of the hotspot.
Galileo is the first spacecraft to image Jupiter in near-infrared light (which is invisible to the human eye) using three filters at 727, 756, and 889 nanometers (nm). Because light at these three wavelengths is absorbed at different altitudes by atmospheric methane, a comparison of the resulting images reveals information about the heights of clouds in Jupiter's atmosphere. This information can be visualized by rendering cloud surfaces with the appropriate height variations.
The visualization reduces Jupiter's true cloud structure to two layers. The height of a high haze layer is assumed to be proportional to the reflectivity of Jupiter at 889 nm. The height of a lower tropospheric cloud is assumed to be proportional to the reflectivity at 727 nm divided by that at 756 nm. This model is overly simplistic, but is based on more sophisticated studies of Jupiter's cloud structure. The upper and lower clouds are separated in the rendering by an arbitrary amount, and the height variations are exaggerated by a factor of 25.
The lower cloud is colored using the same false color scheme used in previously released image products, assigning red, green, and blue to the 756, 727, and 889 nanometer mosaics, respectively. Light bluish clouds are high and thin, reddish clouds are low, and white clouds are high and thick. The dark blue hotspot in the center is a hole in the lower cloud with an overlying thin haze.
The images used cover latitudes 1 to 10 degrees and are centered at longitude 336 degrees west. The smallest resolved features are tens of kilometers in size. These images were taken on December 17, 1996, at a range of 1.5 million kilometers (about 930,000 miles) by the Solid State Imaging (CCD) system on NASA's Galileo spacecraft.
The Jet Propulsion Laboratory, Pasadena, CA manages the Galileo mission for NASA's Office of Space Science, Washington, DC. JPL is an operating division of the California Institute of Technology (Caltech). This image and other images and data received from Galileo are posted on the World Wide Web, on the Galileo mission home page at URL http://www.jpl.nasa.gov/galileo.
Community Project for Accelerator Science and Simulation (ComPASS) Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cary, John R.; Cowan, Benjamin M.; Veitzer, S. A.
2016-03-04
Tech-X participated across the full range of ComPASS activities, with efforts in the Energy Frontier primarily through modeling of laser plasma accelerators and dielectric laser acceleration, in the Intensity Frontier primarily through electron cloud modeling, and in Uncertainty Quantification being applied to dielectric laser acceleration. In the following we present the progress and status of our activities for the entire period of the ComPASS project for the different areas of Energy Frontier, Intensity Frontier and Uncertainty Quantification.
Searching for SNPs with cloud computing
2009-01-01
As DNA sequencing outpaces improvements in computer speed, there is a critical need to accelerate tasks like alignment and SNP calling. Crossbow is a cloud-computing software tool that combines the aligner Bowtie and the SNP caller SOAPsnp. Executing in parallel using Hadoop, Crossbow analyzes data comprising 38-fold coverage of the human genome in three hours using a 320-CPU cluster rented from a cloud computing service for about $85. Crossbow is available from http://bowtie-bio.sourceforge.net/crossbow/. PMID:19930550
NASA Astrophysics Data System (ADS)
Taylor, R.; Wünsch, R.; Palouš, J.
2018-05-01
Most detected neutral atomic hydrogen (HI) at low redshift is associated with optically bright galaxies. However, a handful of HI clouds are known which appear to be optically dark and have no nearby potential progenitor galaxies, making tidal debris an unlikely explanation. In particular, 6 clouds identified by the Arecibo Galaxy Environment Survey are interesting due to the combination of their small size, isolation, and especially their broad line widths atypical of other such clouds. A recent suggestion is that these clouds exist in pressure equilibrium with the intracluster medium, with the line width arising from turbulent internal motions. Here we explore that possibility by using the FLASH code to perform a series of 3D hydro simulations. Our clouds are modelled using spherical Gaussian density profiles, embedded in a hot, low-density gas representing the intracluster medium. The simulations account for heating and cooling of the gas, and we vary the structure and strength of their internal motions. We create synthetic HI spectra, and find that none of our simulations reproduce the observed cloud parameters for longer than ~100 Myr: the clouds either collapse, disperse, or experience rapid heating which would cause ionisation and render them undetectable to HI surveys. While the turbulent motions required to explain the high line widths generate structures which appear to be inherently unstable, making this an unlikely explanation for the observed clouds, these simulations demonstrate the importance of including the intracluster medium in any model seeking to explain the existence of these objects.
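Building a synthetic HI spectrum from simulated gas amounts to summing thermally broadened Gaussians centred on each parcel's line-of-sight velocity, weighted by its HI mass. The sketch below uses randomly generated stand-in parcels rather than FLASH cell data, and the velocity dispersion and temperatures are placeholders; it only illustrates how a broad line width emerges from turbulent internal motions.

```python
# Synthetic HI spectrum from a set of gas parcels (illustrative stand-in data).
import numpy as np

K_B, M_H = 1.381e-23, 1.673e-27            # Boltzmann constant, hydrogen mass (SI)

def synthetic_spectrum(v_los, mass, temp, v_grid):
    sigma = np.sqrt(K_B * temp / M_H)      # thermal broadening per parcel (m/s)
    spec = np.zeros_like(v_grid)
    for v, m, s in zip(v_los, mass, sigma):
        spec += m * np.exp(-0.5 * ((v_grid - v) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    return spec

rng = np.random.default_rng(0)
n = 500
v_los = rng.normal(0.0, 25e3, n)           # turbulent velocities, ~25 km/s spread
mass = rng.uniform(0.5, 1.0, n)
temp = rng.uniform(5e3, 1e4, n)            # warm HI temperatures in K
v_grid = np.linspace(-100e3, 100e3, 400)
spec = synthetic_spectrum(v_los, mass, temp, v_grid)

# Effective line width from the second moment of the spectrum, in km/s
mean_v = np.average(v_grid, weights=spec)
width = 2.355 * np.sqrt(np.average((v_grid - mean_v) ** 2, weights=spec)) / 1e3
print(f"FWHM ~ {width:.0f} km/s")
```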
Point Cloud Management Through the Realization of the Intelligent Cloud Viewer Software
NASA Astrophysics Data System (ADS)
Costantino, D.; Angelini, M. G.; Settembrini, F.
2017-05-01
The paper presents a software package dedicated to the elaboration of point clouds, called Intelligent Cloud Viewer (ICV), made in-house by AESEI software (Spin-Off of Politecnico di Bari), which allows viewing point clouds of several tens of millions of points, even on systems without very high performance. Operations are carried out on the whole point cloud, while only part of it is displayed in order to speed up rendering. It is designed for 64-bit Windows, is fully written in C++, and integrates different specialized modules for computer graphics (Open Inventor by SGI, Silicon Graphics Inc), maths (BLAS, EIGEN), computational geometry (CGAL, Computational Geometry Algorithms Library), registration and advanced algorithms for point clouds (PCL, Point Cloud Library), advanced data structures (BOOST, Basic Object Oriented Supporting Tools), etc. ICV incorporates a number of features such as, for example, cropping, transformation and georeferencing, matching, registration, decimation, sections, distance calculation between clouds, etc. It has been tested on photographic and TLS (Terrestrial Laser Scanner) data, obtaining satisfactory results. The potential of the software has been tested by carrying out the photogrammetric survey of Castel del Monte, for which a previous laser scanner survey made from the ground by the same authors was already available. For the aerophotogrammetric survey, a flight height of approximately 1000 ft AGL (Above Ground Level) was adopted and, overall, over 800 photos were acquired in just over 15 minutes, with a coverage of not less than 80% and a planned speed of about 90 knots.
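One of the operations listed above, decimation, is commonly done by voxel-grid downsampling: points are bucketed into a regular grid and each occupied voxel is replaced by the centroid of its points. The numpy sketch below illustrates that idea conceptually; it is not ICV's code, and the voxel size is an arbitrary placeholder.

```python
# Voxel-grid decimation of a point cloud (conceptual sketch).
import numpy as np

def voxel_decimate(points, voxel_size):
    keys = np.floor(points / voxel_size).astype(np.int64)
    # map each point to its voxel, then average points within each voxel
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    n_voxels = inverse.max() + 1
    sums = np.zeros((n_voxels, 3))
    counts = np.zeros(n_voxels)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]

cloud = np.random.rand(100000, 3) * 10.0      # synthetic 10 m x 10 m x 10 m cloud
decimated = voxel_decimate(cloud, voxel_size=0.5)
print(len(cloud), "->", len(decimated), "points")
```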
Desert dust suppressing precipitation: A possible desertification feedback loop
Rosenfeld, Daniel; Rudich, Yinon; Lahav, Ronen
2001-01-01
The effect of desert dust on cloud properties and precipitation has so far been studied solely by using theoretical models, which predict that rainfall would be enhanced. Here we present observations showing the contrary; the effect of dust on cloud properties is to inhibit precipitation. Using satellite and aircraft observations we show that clouds forming within desert dust contain small droplets and produce little precipitation by drop coalescence. Measurement of the size distribution and the chemical analysis of individual Saharan dust particles collected in such a dust storm suggest a possible mechanism for the diminished rainfall. The detrimental impact of dust on rainfall is smaller than that caused by smoke from biomass burning or anthropogenic air pollution, but the large abundance of desert dust in the atmosphere renders it important. The reduction of precipitation from clouds affected by desert dust can cause drier soil, which in turn raises more dust, thus providing a possible feedback loop to further decrease precipitation. Furthermore, anthropogenic changes of land use exposing the topsoil can initiate such a desertification feedback process. PMID:11353821
Grids, Clouds, and Virtualization
NASA Astrophysics Data System (ADS)
Cafaro, Massimo; Aloisio, Giovanni
This chapter introduces and puts in context Grids, Clouds, and Virtualization. Grids promised to deliver computing power on demand. However, despite a decade of active research, no viable commercial grid computing provider has emerged. On the other hand, it is widely believed - especially in the Business World - that HPC will eventually become a commodity. Just as some commercial consumers of electricity have mission requirements that necessitate they generate their own power, some consumers of computational resources will continue to need to provision their own supercomputers. Clouds are a recent business-oriented development with the potential to render this eventually as rare as organizations that generate their own electricity today, even among institutions who currently consider themselves the unassailable elite of the HPC business. Finally, Virtualization is one of the key technologies enabling many different Clouds. We begin with a brief history in order to put them in context, and recall the basic principles and concepts underlying and clearly differentiating them. A thorough overview and survey of existing technologies provides the basis to delve into details as the reader progresses through the book.
Large-Scale Point-Cloud Visualization through Localized Textured Surface Reconstruction.
Arikan, Murat; Preiner, Reinhold; Scheiblauer, Claus; Jeschke, Stefan; Wimmer, Michael
2014-09-01
In this paper, we introduce a novel scene representation for the visualization of large-scale point clouds accompanied by a set of high-resolution photographs. Many real-world applications deal with very densely sampled point-cloud data, which are augmented with photographs that often reveal lighting variations and inaccuracies in registration. Consequently, the high-quality representation of the captured data, i.e., both point clouds and photographs together, is a challenging and time-consuming task. We propose a two-phase approach, in which the first (preprocessing) phase generates multiple overlapping surface patches and handles the problem of seamless texture generation locally for each patch. The second phase stitches these patches at render-time to produce a high-quality visualization of the data. As a result of the proposed localization of the global texturing problem, our algorithm is more than an order of magnitude faster than equivalent mesh-based texturing techniques. Furthermore, since our preprocessing phase requires only a minor fraction of the whole data set at once, we provide maximum flexibility when dealing with growing data sets.
SHOCK-CLOUD INTERACTION AND PARTICLE ACCELERATION IN THE SOUTHWESTERN LIMB OF SN 1006
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miceli, M.; Orlando, S.; Bocchino, F.
2014-02-20
The supernova remnant SN 1006 is a powerful source of high-energy particles and evolves in a relatively tenuous and uniform environment, despite interacting with an atomic cloud in its northwestern limb. The X-ray image of SN 1006 reveals an indentation in the southwestern part of the shock front, and the H I maps show an isolated (southwestern) cloud, having the same velocity as the northwestern cloud, whose morphology fits perfectly in the indentation. We performed spatially resolved spectral analysis of a set of small regions in the southwestern nonthermal limb and studied the deep X-ray spectra obtained within the XMM-Newton SN 1006 Large Program. We also analyzed archive H I data, obtained by combining single-dish and interferometric observations. We found that the best-fit value of N_H derived from the X-ray spectra significantly increases in regions corresponding to the southwestern cloud, while the cutoff energy of the synchrotron emission decreases. The N_H variation corresponds perfectly with the H I column density of the southwestern cloud, as measured from the radio data. The decrease in the cutoff energy at the indentation clearly reveals that the back side of the cloud is actually interacting with the remnant. The southwestern limb therefore presents a unique combination of efficient particle acceleration and high ambient density, thus being the most promising region for γ-ray hadronic emission in SN 1006. We estimate that such emission will be detectable with the Fermi telescope within a few years.
Electron Cloud Trapping in Recycler Combined Function Dipole Magnets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antipov, Sergey A.; Nagaitsev, S.
2016-10-04
Electron cloud can lead to a fast instability in intense proton and positron beams in circular accelerators. In the Fermilab Recycler the electron cloud is confined within its combined function magnets. We show that the field of combined function magnets traps the electron cloud, present the results of analytical estimates of trapping, and compare them to numerical simulations of electron cloud formation. The electron cloud is located at the beam center and up to 1% of the particles can be trapped by the magnetic field. Since the process of electron cloud build-up is exponential, once trapped, this amount of electrons significantly increases the density of the cloud on the next revolution. In a Recycler combined function dipole this multi-turn accumulation allows the electron cloud to reach final intensities orders of magnitude greater than in a pure dipole. The multi-turn build-up can be stopped by injection of a clearing bunch of 10^10 p at any position in the ring.
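The multi-turn accumulation can be captured by a toy recurrence: each turn the cloud grows by an exponential build-up factor during the bunch train, and a small trapped fraction survives to seed the next turn. The numbers below are illustrative placeholders, not measured Recycler parameters, and no saturation mechanism (e.g. space charge) is modelled.

```python
# Toy recurrence for multi-turn electron-cloud accumulation with trapping.
def buildup(n_turns, growth=1e3, trapped_fraction=0.01, seed=1.0):
    remnant = 0.0
    peaks = []
    for _ in range(n_turns):
        peak = (seed + remnant) * growth      # exponential build-up within one turn
        peaks.append(peak)
        remnant = trapped_fraction * peak     # fraction trapped by the magnetic field
    return peaks

for trapped in (0.0, 0.01):
    peaks = buildup(5, trapped_fraction=trapped)
    print(f"trapped fraction {trapped}: per-turn peak densities {[f'{p:.1e}' for p in peaks]}")
```

Even a 1% trapped fraction makes the per-turn peak density grow from turn to turn instead of resetting, which is the qualitative behaviour described above.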
Cloud4Psi: cloud computing for 3D protein structure similarity searching.
Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Kłapciński, Artur
2014-10-01
Popular methods for 3D protein structure similarity searching, especially those that generate high-quality alignments such as Combinatorial Extension (CE) and Flexible structure Alignment by Chaining Aligned fragment pairs allowing Twists (FATCAT) are still time consuming. As a consequence, performing similarity searching against large repositories of structural data requires increased computational resources that are not always available. Cloud computing provides huge amounts of computational power that can be provisioned on a pay-as-you-go basis. We have developed the cloud-based system that allows scaling of the similarity searching process vertically and horizontally. Cloud4Psi (Cloud for Protein Similarity) was tested in the Microsoft Azure cloud environment and provided good, almost linearly proportional acceleration when scaled out onto many computational units. Cloud4Psi is available as Software as a Service for testing purposes at: http://cloud4psi.cloudapp.net/. For source code and software availability, please visit the Cloud4Psi project home page at http://zti.polsl.pl/dmrozek/science/cloud4psi.htm. © The Author 2014. Published by Oxford University Press.
Cloud4Psi: cloud computing for 3D protein structure similarity searching
Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Kłapciński, Artur
2014-01-01
Summary: Popular methods for 3D protein structure similarity searching, especially those that generate high-quality alignments such as Combinatorial Extension (CE) and Flexible structure Alignment by Chaining Aligned fragment pairs allowing Twists (FATCAT) are still time consuming. As a consequence, performing similarity searching against large repositories of structural data requires increased computational resources that are not always available. Cloud computing provides huge amounts of computational power that can be provisioned on a pay-as-you-go basis. We have developed the cloud-based system that allows scaling of the similarity searching process vertically and horizontally. Cloud4Psi (Cloud for Protein Similarity) was tested in the Microsoft Azure cloud environment and provided good, almost linearly proportional acceleration when scaled out onto many computational units. Availability and implementation: Cloud4Psi is available as Software as a Service for testing purposes at: http://cloud4psi.cloudapp.net/. For source code and software availability, please visit the Cloud4Psi project home page at http://zti.polsl.pl/dmrozek/science/cloud4psi.htm. Contact: dariusz.mrozek@polsl.pl PMID:24930141
DOE Office of Scientific and Technical Information (OSTI.GOV)
Billing, M. G.; Conway, J. V.; Crittenden, J. A.
Cornell's electron/positron storage ring (CESR) was modified over a series of accelerator shutdowns beginning in May 2008, which substantially improves its capability for research and development for particle accelerators. CESR's energy span from 1.8 to 5.6 GeV with both electrons and positrons makes it ideal for the study of a wide spectrum of accelerator physics issues and instrumentation related to present light sources and future lepton damping rings. Additionally, a number of these are also relevant for the beam physics of proton accelerators. This paper is the third in a series of four describing the conversion of CESR to the test accelerator, CESRTA. The first two papers discuss the overall plan for the conversion of the storage ring to an instrument capable of studying advanced accelerator physics issues [1] and the details of the vacuum system upgrades [2]. This paper focuses on the necessary development of new instrumentation, situated in four dedicated experimental regions, capable of studying such phenomena as electron clouds (ECs) and methods to mitigate EC effects. The fourth paper in this series describes the vacuum system modifications of the superconducting wigglers to accommodate the diagnostic instrumentation for the study of EC behavior within wigglers. Lastly, while the initial studies of CESRTA focused on questions related to the International Linear Collider damping ring design, CESRTA is a very versatile storage ring, capable of studying a wide range of accelerator physics and instrumentation questions.
Billing, M. G.; Conway, J. V.; Crittenden, J. A.; ...
2016-04-28
Cornell's electron/positron storage ring (CESR) was modified over a series of accelerator shutdowns beginning in May 2008, which substantially improves its capability for research and development for particle accelerators. CESR's energy span from 1.8 to 5.6 GeV with both electrons and positrons makes it ideal for the study of a wide spectrum of accelerator physics issues and instrumentation related to present light sources and future lepton damping rings. Additionally, a number of these are also relevant for the beam physics of proton accelerators. This paper is the third in a series of four describing the conversion of CESR to the test accelerator, CESRTA. The first two papers discuss the overall plan for the conversion of the storage ring to an instrument capable of studying advanced accelerator physics issues [1] and the details of the vacuum system upgrades [2]. This paper focuses on the necessary development of new instrumentation, situated in four dedicated experimental regions, capable of studying such phenomena as electron clouds (ECs) and methods to mitigate EC effects. The fourth paper in this series describes the vacuum system modifications of the superconducting wigglers to accommodate the diagnostic instrumentation for the study of EC behavior within wigglers. Lastly, while the initial studies of CESRTA focused on questions related to the International Linear Collider damping ring design, CESRTA is a very versatile storage ring, capable of studying a wide range of accelerator physics and instrumentation questions.
Enhanced Graphics for Extended Scale Range
NASA Technical Reports Server (NTRS)
Hanson, Andrew J.; Chi-Wing Fu, Philip
2012-01-01
Enhanced Graphics for Extended Scale Range is a computer program for rendering fly-through views of scene models that include visible objects differing in size by large orders of magnitude. An example would be a scene showing a person in a park at night with the moon, stars, and galaxies in the background sky. Prior graphical computer programs exhibit arithmetic and other anomalies when rendering scenes containing objects that differ enormously in scale and distance from the viewer. The present program dynamically repartitions distance scales of objects in a scene during rendering to eliminate almost all such anomalies in a way compatible with implementation in other software and in hardware accelerators. By assigning depth ranges corresponding to rendering precision requirements, either automatically or under program control, this program spaces out object scales to match the precision requirements of the rendering arithmetic. This action includes an intelligent partition of the depth buffer ranges to avoid known anomalies from this source. The program is written in C++, using OpenGL, GLUT, and GLUI standard libraries, and nVidia GeForce Vertex Shader extensions. The program has been shown to work on several computers running UNIX and Windows operating systems.
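The depth-range repartitioning idea can be illustrated conceptually: objects are grouped into depth bins by the order of magnitude of their distance from the viewer, and each bin is rendered with a near/far range matched to depth-buffer precision, far bins first, clearing depth between bins. The grouping rule and bin width below are assumptions for illustration, not the program's actual partitioning logic.

```python
# Conceptual sketch of grouping scene objects into depth bins by scale.
import math
from collections import defaultdict

SCENE = {                      # object name -> distance from viewer in metres
    "person": 3.0, "tree": 40.0, "moon": 3.8e8,
    "star": 4.0e16, "galaxy": 2.4e22,
}

def partition_by_scale(objects, decades_per_bin=4):
    bins = defaultdict(list)
    for name, dist in objects.items():
        bins[int(math.log10(dist)) // decades_per_bin].append(name)
    return bins

bins = partition_by_scale(SCENE)
for key in sorted(bins, reverse=True):          # render the farthest bins first
    near, far = 10 ** (key * 4), 10 ** ((key + 1) * 4)
    print(f"bin [{near:.0e}, {far:.0e}) m -> draw {bins[key]}, then clear depth")
```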
PRISM: An open source framework for the interactive design of GPU volume rendering shaders.
Drouin, Simon; Collins, D Louis
2018-01-01
Direct volume rendering has become an essential tool to explore and analyse 3D medical images. Despite several advances in the field, it remains a challenge to produce an image that highlights the anatomy of interest, avoids occlusion of important structures, provides an intuitive perception of shape and depth while retaining sufficient contextual information. Although the computer graphics community has proposed several solutions to address specific visualization problems, the medical imaging community still lacks a general volume rendering implementation that can address a wide variety of visualization use cases while avoiding complexity. In this paper, we propose a new open source framework called the Programmable Ray Integration Shading Model, or PRISM, that implements a complete GPU ray-casting solution where critical parts of the ray integration algorithm can be replaced to produce new volume rendering effects. A graphical user interface allows clinical users to easily experiment with pre-existing rendering effect building blocks drawn from an open database. For programmers, the interface enables real-time editing of the code inside the blocks. We show that in its default mode, the PRISM framework produces images very similar to those produced by a widely-adopted direct volume rendering implementation in VTK at comparable frame rates. More importantly, we demonstrate the flexibility of the framework by showing how several volume rendering techniques can be implemented in PRISM with no more than a few lines of code. Finally, we demonstrate the simplicity of our system in a usability study with 5 medical imaging expert subjects who have none or little experience with volume rendering. The PRISM framework has the potential to greatly accelerate development of volume rendering for medical applications by promoting sharing and enabling faster development iterations and easier collaboration between engineers and clinical personnel.
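The design idea behind PRISM, a ray-marching loop whose per-sample integration step is a pluggable piece of code, can be illustrated on the CPU. The sketch below is not PRISM's API or its GLSL shader structure: it only shows how swapping one small function (emission-absorption compositing versus maximum intensity projection) changes the rendering effect while the traversal loop stays fixed.

```python
# Generic ray-marching loop with a pluggable per-sample integration step.
import numpy as np

def composite_step(acc, sample):            # classic emission-absorption compositing
    color, alpha = acc
    a = np.clip(sample, 0.0, 1.0) * 0.05    # toy transfer function
    return (color + (1 - alpha) * a * sample, alpha + (1 - alpha) * a)

def mip_step(acc, sample):                  # maximum intensity projection
    color, alpha = acc
    return (max(color, sample), alpha)

def render_ray(volume, origin, direction, step_fn, n_steps=256, dt=0.01):
    acc = (0.0, 0.0)
    p = np.array(origin, dtype=float)
    for _ in range(n_steps):
        idx = np.clip((p * np.array(volume.shape)).astype(int), 0,
                      np.array(volume.shape) - 1)
        acc = step_fn(acc, float(volume[tuple(idx)]))   # the replaceable part
        p = p + dt * np.array(direction)
    return acc[0]

volume = np.random.rand(32, 32, 32)         # stand-in for a medical volume
for fn in (composite_step, mip_step):
    print(fn.__name__, "->", round(render_ray(volume, (0.1, 0.5, 0.5), (1, 0, 0), fn), 3))
```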
PRISM: An open source framework for the interactive design of GPU volume rendering shaders
Collins, D. Louis
2018-01-01
Direct volume rendering has become an essential tool to explore and analyse 3D medical images. Despite several advances in the field, it remains a challenge to produce an image that highlights the anatomy of interest, avoids occlusion of important structures, provides an intuitive perception of shape and depth while retaining sufficient contextual information. Although the computer graphics community has proposed several solutions to address specific visualization problems, the medical imaging community still lacks a general volume rendering implementation that can address a wide variety of visualization use cases while avoiding complexity. In this paper, we propose a new open source framework called the Programmable Ray Integration Shading Model, or PRISM, that implements a complete GPU ray-casting solution where critical parts of the ray integration algorithm can be replaced to produce new volume rendering effects. A graphical user interface allows clinical users to easily experiment with pre-existing rendering effect building blocks drawn from an open database. For programmers, the interface enables real-time editing of the code inside the blocks. We show that in its default mode, the PRISM framework produces images very similar to those produced by a widely-adopted direct volume rendering implementation in VTK at comparable frame rates. More importantly, we demonstrate the flexibility of the framework by showing how several volume rendering techniques can be implemented in PRISM with no more than a few lines of code. Finally, we demonstrate the simplicity of our system in a usability study with 5 medical imaging expert subjects who have none or little experience with volume rendering. The PRISM framework has the potential to greatly accelerate development of volume rendering for medical applications by promoting sharing and enabling faster development iterations and easier collaboration between engineers and clinical personnel. PMID:29534069
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katsouleas, Thomas; Decyk, Viktor
Final Report for grant DE-FG02-06ER54888, "Simulation of Beam-Electron Cloud Interactions in Circular Accelerators Using Plasma Models." Viktor K. Decyk, University of California, Los Angeles, Los Angeles, CA 90095-1547. The primary goal of this collaborative proposal was to modify the code QuickPIC and apply it to study the long-time stability of beam propagation in low-density electron clouds present in circular accelerators. The UCLA contribution to this collaborative proposal was in supporting the development of the pipelining scheme for the QuickPIC code, which extended the parallel scaling of this code by two orders of magnitude. The USC work described here was the PhD research of Ms. Bing Feng, lead author in reference 2 below, who performed the research at USC under the guidance of the PI Tom Katsouleas and in collaboration with Dr. Decyk. The QuickPIC code [1] is a multi-scale Particle-in-Cell (PIC) code. The outer 3D code contains a beam which propagates through a long region of plasma and evolves slowly. The plasma response to this beam is modeled by slices of a 2D plasma code. This plasma response is then fed back to the beam code, and the process repeats. The pipelining is based on the observation that once the beam has passed a 2D slice, its response can be fed back to the beam immediately, without waiting for the beam to pass all the other slices. Thus independent blocks of 2D slices from different time steps can be running simultaneously. The major difficulty arose when particles at the edges needed to communicate with other blocks. Two versions of the pipelining scheme were developed: one for the full quasi-static code and the other for the basic quasi-static code used by this e-cloud proposal. Details of the pipelining scheme were published in [2]. The new version of QuickPIC was able to run with more than 1,000 processors, and was successfully applied in modeling e-clouds by our collaborators in this proposal [3-8]. Jean-Luc Vay at Lawrence Berkeley National Lab later implemented a similar basic quasi-static scheme, including pipelining, in the code WARP [9] and found good to very good quantitative agreement between the two codes in modeling e-clouds.
References
[1] C. Huang, V. K. Decyk, C. Ren, M. Zhou, W. Lu, W. B. Mori, J. H. Cooley, T. M. Antonsen, Jr., and T. Katsouleas, "QUICKPIC: A highly efficient particle-in-cell code for modeling wakefield acceleration in plasmas," J. Computational Phys. 217, 658 (2006).
[2] B. Feng, C. Huang, V. K. Decyk, W. B. Mori, P. Muggli, and T. Katsouleas, "Enhancing parallel quasi-static particle-in-cell simulations with a pipelining algorithm," J. Computational Phys. 228, 5430 (2009).
[3] C. Huang, V. K. Decyk, M. Zhou, W. Lu, W. B. Mori, J. H. Cooley, T. M. Antonsen, Jr., B. Feng, T. Katsouleas, J. Vieira, and L. O. Silva, "QUICKPIC: A highly efficient fully parallelized PIC code for plasma-based acceleration," Proc. of the SciDAC 2006 Conf., Denver, Colorado, June, 2006 [Journal of Physics: Conference Series, W. M. Tang, Editor, vol. 46, Institute of Physics, Bristol and Philadelphia, 2006], p. 190.
[4] B. Feng, C. Huang, V. Decyk, W. B. Mori, T. Katsouleas, and P. Muggli, "Enhancing Plasma Wakefield and E-cloud Simulation Performance Using a Pipelining Algorithm," Proc. 12th Workshop on Advanced Accelerator Concepts, Lake Geneva, WI, July, 2006, p. 201 [AIP Conf. Proceedings, vol. 877, Melville, NY, 2006].
[5] B. Feng, P. Muggli, T. Katsouleas, V. Decyk, C. Huang, and W. Mori, "Long Time Electron Cloud Instability Simulation Using QuickPIC with Pipelining Algorithm," Proc. of the 2007 Particle Accelerator Conference, Albuquerque, NM, June, 2007, p. 3615.
[6] B. Feng, C. Huang, V. Decyk, W. B. Mori, G. H. Hoffstaetter, P. Muggli, and T. Katsouleas, "Simulation of Electron Cloud Effects on Electron Beam at ERL with Pipelined QuickPIC," Proc. 13th Workshop on Advanced Accelerator Concepts, Santa Cruz, CA, July-August, 2008, p. 340 [AIP Conf. Proceedings, vol. 1086, Melville, NY, 2008].
[7] B. Feng, C. Huang, V. K. Decyk, W. B. Mori, P. Muggli, and T. Katsouleas, "Enhancing parallel quasi-static particle-in-cell simulations with a pipelining algorithm," J. Computational Phys. 228, 5430 (2009).
[8] C. Huang, W. An, V. K. Decyk, W. Lu, W. B. Mori, F. S. Tsung, M. Tzoufras, S. Morshed, T. Antonsen, B. Feng, T. Katsouleas, R. A. Fonseca, S. F. Martins, J. Vieira, L. O. Silva, E. Esarey, C. G. R. Geddes, W. P. Leemans, E. Cormier-Michel, J.-L. Vay, D. L. Bruhwiler, B. Cowan, J. R. Cary, and K. Paul, "Recent results and future challenges for large scale particle-in-cell simulations of plasma-based accelerator concepts," Proc. of the SciDAC 2009 Conf., San Diego, CA, June, 2009 [Journal of Physics: Conference Series, vol. 180, Institute of Physics, Bristol and Philadelphia, 2009], p. 012005.
[9] J.-L. Vay, C. M. Celata, M. A. Furman, G. Penn, M. Venturini, D. P. Grote, and K. G. Sonnad, "Update on Electron-Cloud Simulations Using the Package WARP-POSINST," Proc. of the 2009 Particle Accelerator Conference PAC09, Vancouver, Canada, June, 2009, paper FR5RFP078.
Image-Guided Rendering with an Evolutionary Algorithm Based on Cloud Model
2018-01-01
The process of creating nonphotorealistic rendering images and animations can be enjoyable if a useful method is involved. We use an evolutionary algorithm to generate painterly styles of images. Given an input image as the reference target, a cloud-model-based evolutionary algorithm is evolved that rerenders the target image with nonphotorealistic effects. The resulting animations have an interesting characteristic in which the target slowly emerges from a set of strokes. A number of experiments are performed, as well as visual comparisons, quantitative comparisons, and user studies. The average normalized scores for the image-similarity metrics (standard pixel-wise peak signal-to-noise ratio, mean structural similarity, feature similarity, and a gradient-similarity-based metric) are 0.486, 0.628, 0.579, and 0.640, respectively. The average normalized scores for the aesthetic measures (Benford's law, fractal dimension, global contrast factor, and Shannon's entropy) are 0.630, 0.397, 0.418, and 0.708, respectively. Compared with a similar method, the average score of the proposed method is higher by approximately 10% on all metrics except peak signal-to-noise ratio. The results suggest that the proposed method can generate appealing images and animations with different styles by choosing different strokes, and it would inspire graphic designers who may be interested in computer-based evolutionary art. PMID:29805440
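The "target slowly emerges from a set of strokes" behaviour can be reproduced with a greatly simplified evolutionary loop: random rectangular strokes are proposed and kept only when they reduce the error to the target image. The sketch below replaces the cloud-model operators of the paper with plain random mutation and a toy target, so it is an illustration of the general stroke-based evolutionary idea rather than the proposed method.

```python
# Simplified (1+1)-style stroke-based evolutionary rendering toward a target.
import numpy as np

rng = np.random.default_rng(1)
H = W = 64
target = np.zeros((H, W))
target[16:48, 16:48] = 1.0               # stand-in target image (a bright square)

canvas = np.zeros_like(target)
err = np.mean((canvas - target) ** 2)

for generation in range(5000):
    x, y = rng.integers(0, W - 8), rng.integers(0, H - 8)
    w, h = rng.integers(2, 9), rng.integers(2, 9)
    patch = target[y:y + h, x:x + w].mean()      # stroke colour sampled from the target
    trial = canvas.copy()
    trial[y:y + h, x:x + w] = patch
    trial_err = np.mean((trial - target) ** 2)
    if trial_err < err:                          # keep only improving strokes
        canvas, err = trial, trial_err

print("final MSE:", round(err, 4))
```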
Experimental demonstration of plasma-drag acceleration of a dust cloud to hypervelocities.
Ticoş, C M; Wang, Zhehui; Wurden, G A; Kline, J L; Montgomery, D S; Dorf, L A; Shukla, P K
2008-04-18
Simultaneous acceleration of hundreds of dust particles to hypervelocities by collimated plasma flows ejected from a coaxial gun is demonstrated. Graphite and diamond grains with radii between 5 and 30 μm, and flying at speeds up to 3.7 km/s, have been recorded with a high-speed camera. The observations agree well with a model for plasma-drag acceleration of microparticles much larger than the plasma screening length.
Longitudinal and transverse dynamics of ions from residual gas in an electron accelerator
NASA Astrophysics Data System (ADS)
Gamelin, A.; Bruni, C.; Radevych, D.
2018-05-01
The ion cloud produced from residual gas in an electron accelerator can degrade machine performance and produce instabilities. The ion dynamics in an accelerator is governed by the beam-ion interaction, magnetic fields and possible mitigation strategies. Because the beam has a nonuniform transverse size along its orbit, the ions move longitudinally and accumulate naturally at certain points in the accelerator. In order to design effective mitigation strategies it is necessary to understand the ion dynamics not only in the transverse plane but also in the longitudinal direction. After introducing the physics behind the beam-ion interaction, we show how to find accumulation points for a realistic electron storage ring lattice. Simulations of the ion cloud dynamics, including the effect of magnetic fields on the ions, clearing electrodes and clearing gaps, are shown. Longitudinal ion trapping due to the magnetic mirror effect in the dipole fringe fields is also detailed. Finally, the effectiveness of clearing electrodes using longitudinal clearing fields is discussed and compared to clearing electrodes producing transverse fields only.
Baines, K.H.; Delitsky, M.L.; Momary, T.W.; Brown, R.H.; Buratti, B.J.; Clark, R.N.; Nicholson, P.D.
2009-01-01
Thunderstorm activity on Saturn is associated with optically detectable clouds that are atypically dark throughout the near-infrared. As observed by Cassini/VIMS, these clouds are ~20% less reflective than typical neighboring clouds throughout the spectral range from 0.8 μm to at least 4.1 μm. We propose that active thunderstorms originating in the 10-20 bar water-condensation region vertically transport dark materials at depth to the ~1 bar level where they can be observed. These materials in part may be produced by chemical processes associated with lightning, likely within the water clouds near the ~10 bar freezing level of water, as detected by the electrostatic discharge of lightning flashes observed by Cassini/RPWS (e.g., Fischer et al. 2008, Space Sci. Rev., 137, 271-285). We review lightning-induced pyrolytic chemistry involving a variety of Saturnian constituents, including hydrogen, methane, ammonia, hydrogen sulfide, phosphine, and water. We find that the lack of absorption in the 1-2 μm spectral region by lightning-generated sulfuric and phosphorus condensates renders these constituents as minor players in determining the color of the dark storm clouds. Relatively small particulates of elemental carbon, formed by lightning-induced dissociation of methane and subsequently upwelled from depth - perhaps embedded within and on the surface of spectrally bright condensates such as ammonium hydrosulfide or ammonia - may be a dominant optical material within the dark thunderstorm-related clouds of Saturn. © 2009 Elsevier Ltd. All rights reserved.
Temporally rendered automatic cloud extraction (TRACE) system
NASA Astrophysics Data System (ADS)
Bodrero, Dennis M.; Yale, James G.; Davis, Roger E.; Rollins, John M.
1999-10-01
Smoke/obscurant testing requires that 2D cloud extent be extracted from visible and thermal imagery. These data are used alone or in combination with 2D data from other aspects to make 3D calculations of cloud properties, including dimensions, volume, centroid, travel, and uniformity. Determining cloud extent from imagery has historically been a time-consuming manual process. To reduce time and cost associated with smoke/obscurant data processing, automated methods to extract cloud extent from imagery were investigated. The TRACE system described in this paper was developed and implemented at U.S. Army Dugway Proving Ground, UT by the Science and Technology Corporation--Acuity Imaging Incorporated team with Small Business Innovation Research funding. TRACE uses dynamic background subtraction and 3D fast Fourier transform as primary methods to discriminate the smoke/obscurant cloud from the background. TRACE has been designed to run on a PC-based platform using Windows. The PC-Windows environment was chosen for portability, to give TRACE the maximum flexibility in terms of its interaction with peripheral hardware devices such as video capture boards, removable media drives, network cards, and digital video interfaces. Video for Windows provides all of the necessary tools for the development of the video capture utility in TRACE and allows for interchangeability of video capture boards without any software changes. TRACE is designed to take advantage of future upgrades in all aspects of its component hardware. A comparison of cloud extent determined by TRACE with the manual method is included in this paper.
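TRACE's primary discrimination methods are dynamic background subtraction and a 3D FFT, neither of which is specified in detail in the abstract. The snippet below is a minimal sketch of only the dynamic background subtraction step, under the assumption of a running mean/variance background model that is updated only where no cloud is detected; the thresholds, update rate, and function name are illustrative rather than the TRACE implementation.

    import numpy as np

    def extract_cloud_masks(frames, alpha=0.05, k=3.0):
        """Dynamic background subtraction: flag pixels that deviate from a
        running-average background by more than k standard deviations."""
        background = frames[0].astype(float)
        var = np.full_like(background, 100.0)   # assumed prior variance; tune per sensor
        masks = []
        for frame in frames[1:]:
            diff = frame.astype(float) - background
            mask = np.abs(diff) > k * np.sqrt(var)          # candidate cloud pixels
            clear = ~mask                                   # update model only where clear
            background[clear] += alpha * diff[clear]
            var[clear] += alpha * (diff[clear] ** 2 - var[clear])
            masks.append(mask)
        return masks

The 2D cloud extent for each frame is then the bounding region of the flagged pixels, which downstream processing can combine across viewing aspects for the 3D calculations.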
Remote volume rendering pipeline for mHealth applications
NASA Astrophysics Data System (ADS)
Gutenko, Ievgeniia; Petkov, Kaloian; Papadopoulos, Charilaos; Zhao, Xin; Park, Ji Hwan; Kaufman, Arie; Cha, Ronald
2014-03-01
We introduce a novel remote volume rendering pipeline for medical visualization targeted for mHealth (mobile health) applications. The necessity of such a pipeline stems from the large size of the medical imaging data produced by current CT and MRI scanners with respect to the complexity of the volumetric rendering algorithms. For example, the resolution of typical CT Angiography (CTA) data easily reaches 512^3 voxels and can exceed 6 gigabytes in size by spanning over the time domain while capturing a beating heart. This explosion in data size makes data transfers to mobile devices challenging, and even when the transfer problem is resolved the rendering performance of the device still remains a bottleneck. To deal with this issue, we propose a thin-client architecture, where the entirety of the data resides on a remote server where the image is rendered and then streamed to the client mobile device. We utilize the display and interaction capabilities of the mobile device, while performing interactive volume rendering on a server capable of handling large datasets. Specifically, upon user interaction the volume is rendered on the server and encoded into an H.264 video stream. H.264 is ubiquitously hardware accelerated, resulting in faster compression and lower power requirements. The choice of low-latency CPU- and GPU-based encoders is particularly important in enabling the interactive nature of our system. We demonstrate a prototype of our framework using various medical datasets on commodity tablet devices.
Matthew J. Gregory; Zhiqiang Yang; David M. Bell; Warren B. Cohen; Sean Healey; Janet L. Ohmann; Heather M. Roberts
2015-01-01
Mapping vegetation and landscape change at fine spatial scales is needed to inform natural resource and conservation planning, but such maps are expensive and time-consuming to produce. For Landsat-based methodologies, mapping efforts are hampered by the daunting task of manipulating multivariate data for millions to billions of pixels. The advent of cloud-based...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Narayan, Ramesh; Sironi, Lorenzo; Oezel, Feryal
2012-10-01
A dense ionized cloud of gas has been recently discovered to be moving directly toward the supermassive black hole, Sgr A*, at the Galactic center. In 2013 June, at the pericenter of its highly eccentric orbit, the cloud will be approximately 3100 Schwarzschild radii from the black hole and will move supersonically through the ambient hot gas with a velocity of v_p ≈ 5400 km s^-1. A bow shock is likely to form in front of the cloud and could accelerate electrons to relativistic energies. We estimate via particle-in-cell simulations the energy distribution of the accelerated electrons and show that the non-thermal synchrotron emission from these electrons might exceed the quiescent radio emission from Sgr A* by a factor of several. The enhanced radio emission should be detectable at GHz and higher frequencies around the time of pericentric passage and in the following months. The bow shock emission is expected to be displaced from the quiescent radio emission of Sgr A* by ~33 mas. Interferometric observations could resolve potential changes in the radio image of Sgr A* at wavelengths ≲ 6 cm.
Wilson, Sarah Jane; Rhemtulla, Jeanine M
2016-01-01
Community-based tropical forest restoration projects, often promoted as a win-win solution for local communities and the environment, have increased dramatically in number in the past decade. Many such projects are underway in Andean cloud forests, which, given their extremely high biodiversity and history of extensive clearing, are understudied. This study investigates the efficacy of community-based tree-planting projects to accelerate cloud forest recovery, as compared to unassisted natural regeneration. This study takes place in northwest Andean Ecuador, where the majority of the original, highly diverse cloud forests have been cleared, in five communities that initiated tree-planting projects to restore forests in 2003. In 2011, we identified tree species along transects in planted forests (n = 5), naturally regenerating forests (n = 5), and primary forests (n = 5). We also surveyed 120 households about their restoration methods, tree preferences, and forest uses. We found that tree diversity was higher in planted than in unplanted secondary forest, but both were less diverse than primary forests. Ordination analysis showed that all three forests had distinct species compositions, although planted forests shared more species with primary forests than did unplanted forests. Planted forests also contained more animal-dispersed species in both the planted canopy and in the unplanted, regenerating understory than unplanted forests, and contained the highest proportion of species with use value for local people. While restoring forest increased biodiversity and accelerated forest recovery, restored forests may also represent novel ecosystems that are distinct from the region's previous ecosystems and, given their usefulness to people, are likely to be more common in the future.
Rendering potential wearable robot designs with the LOPES gait trainer.
Koopman, B; van Asseldonk, E H F; van der Kooij, H; van Dijk, W; Ronsse, R
2011-01-01
In recent years, wearable robots (WRs) for rehabilitation, personal assistance, or human augmentation are gaining increasing interest. To make these devices more energy efficient, radical changes to the mechanical structure of the device are being considered. However, it remains very difficult to predict how people will respond to, and interact with, WRs that differ in terms of mechanical design. Users may adjust their gait pattern in response to the mechanical restrictions or properties of the device. The goal of this pilot study is to show the feasibility of rendering the mechanical properties of different potential WR designs using the robotic gait training device LOPES. This paper describes a new method that selectively cancels the dynamics of LOPES itself and adds the dynamics of the rendered WR using two parallel inverse models. Adaptive frequency oscillators were used to get estimates of the joint position, velocity, and acceleration. Using the inverse models, different WR designs can be evaluated, eliminating the need to build several prototypes. As a proof of principle, we simulated the effect of a very simple WR that consisted of a mass attached to the ankles. Preliminary results show that we are partially able to cancel the dynamics of LOPES. Additionally, the simulation of the mass showed an increase in muscle activity, but not at the same level as during the control condition, where subjects actually carried the mass. In conclusion, the results in this paper suggest that LOPES can be used to render different WRs. In addition, it is very likely that the results can be further optimized when more effort is put into retrieving proper estimates of the velocity and acceleration, which are required for the inverse models. © 2011 IEEE
NASA Astrophysics Data System (ADS)
Hueso, Ricardo; Garate-Lopez, I.; Peralta, J.; Bandos, T.; Sánchez-Lavega, A.
2013-10-01
After more than 6 years orbiting Venus, the Venus Express mission has provided the largest database of observations of the Venus atmosphere at different cloud layers with the combination of the VMC and VIRTIS instruments. We present measurements of cloud motions in the South hemisphere of Venus analyzing images from the VIRTIS-M visible channel at different wavelengths sensitive to the upper cloud haze at 65-70 km height (dayside ultraviolet images) and the middle cloud deck (dayside visible and near infrared images around 1 μm) about 5-8 km deeper in the atmosphere. We combine VIRTIS images in nearby wavelengths to increase the contrast of atmospheric details, and measurements were obtained with a semi-automatic cloud correlation algorithm. Both cloud layers are studied simultaneously to infer similarities and differences between these vertical levels in terms of cloud morphologies and winds. For both levels we present global mean zonal and meridional winds, the latitudinal distribution of winds with local time, and the wind shear between both altitudes. The upper branch of the Hadley cell circulation is well resolved in UV images, with an acceleration of the meridional circulation at mid-latitudes with increasing local time, peaking at 14-16h. This organized meridional circulation is almost absent in NIR images. Long-term variability of zonal winds is also found in UV images, with increasing winds over time during the VEX mission. This is in agreement with current analysis of VMC images (Khatuntsev et al. 2013). The possible long-term acceleration of zonal winds is also examined for NIR images. References: Khatuntsev et al. Icarus 226, 140-158 (2013)
cellVIEW: a Tool for Illustrative and Multi-Scale Rendering of Large Biomolecular Datasets
Le Muzic, Mathieu; Autin, Ludovic; Parulek, Julius; Viola, Ivan
2017-01-01
In this article we introduce cellVIEW, a new system to interactively visualize large biomolecular datasets at the atomic level. Our tool is unique and has been specifically designed to match the ambitions of our domain experts to model and interactively visualize structures comprised of several billion atoms. The cellVIEW system integrates acceleration techniques to allow for real-time graphics performance of 60 Hz display rate on datasets representing large viruses and bacterial organisms. Inspired by the work of scientific illustrators, we propose a level-of-detail scheme whose purpose is twofold: accelerating the rendering and reducing visual clutter. The main part of our datasets is made out of macromolecules, but it also comprises nucleic acid strands which are stored as sets of control points. For that specific case, we extend our rendering method to support the dynamic generation of DNA strands directly on the GPU. It is noteworthy that our tool has been directly implemented inside a game engine. We chose to rely on a third-party engine to reduce software development workload and to make bleeding-edge graphics techniques more accessible to the end-users. To our knowledge cellVIEW is the only suitable solution for interactive visualization of large biomolecular landscapes at the atomic level and is freely available to use and extend. PMID:29291131
Magnetohydrodynamic Simulations of a Plunging Black Hole into a Molecular Cloud
NASA Astrophysics Data System (ADS)
Nomura, Mariko; Oka, Tomoharu; Yamada, Masaya; Takekawa, Shunya; Ohsuga, Ken; Takahashi, Hiroyuki R.; Asahina, Yuta
2018-05-01
Using two-dimensional magnetohydrodynamic simulations, we investigated the gas dynamics around a black hole (BH) plunging into a molecular cloud. In these calculations, we assumed a parallel-magnetic-field layer in the cloud. The size of the accelerated region is far larger than the Bondi–Hoyle–Lyttleton radius, being approximately inversely proportional to the Alfvén Mach number for the plunging BH. Our results successfully reproduce the “Y” shape in position–velocity maps of the “Bullet” in the W44 molecular cloud. The size of the Bullet is also reproduced within an order of magnitude using a reasonable parameter set. This consistency supports the shooting model of the Bullet, according to which an isolated BH plunged into a molecular cloud to form a compact broad-velocity-width feature.
A Contextual Information Acquisition Approach Based on Semantics and Mashup Technology
NASA Astrophysics Data System (ADS)
He, Yangfan; Li, Lu; He, Keqing; Chen, Xiuhong
Pay per use is an essential feature of cloud computing. Users can make use of some parts of a large-scale service to satisfy their requirements, merely at the cost of a small payment. A good understanding of the users' requirements is a prerequisite for choosing the service in need precisely. Context implies users' potential requirements, which can be a complement to the requirements delivered explicitly. However, traditional context-aware computing research always demands some specific kinds of sensors to acquire contextual information, which sets too high a threshold for an application to become context-aware. This paper proposes an approach which combines contextual information obtained directly and indirectly from cloud services. Semantic relationships between different kinds of contexts lay the foundation for the searching of the cloud services, and mashup technology is adopted to compose the heterogeneous services. Abundant contextual information may lend strong support to a comprehensive understanding of users' context and a better abstraction of contextual requirements.
An Effective Algorithm Research of Scenario Voxelization Organization and Occlusion Culling
NASA Astrophysics Data System (ADS)
Lai, Guangling; Ding, Lu; Qin, Zhiyuan; Tong, Xiaochong
2016-11-01
Compared with traditional triangulation approaches, voxelized point cloud data can reduce sensitivity to the scenario and the complexity of calculation. Scene organization can be built from the point cloud data using fine voxels, but this increases memory consumption, so an efficient voxel representation method is necessary. At present, specific studies of voxel visualization algorithms are scarce. This paper improves the ray tracing algorithm by exploiting the characteristics of the voxel structure. Firstly, according to the extent of the point cloud data, the corresponding extent of the pixels on the screen is determined. Then, the light vector cast from each pixel is calculated. Lastly, the rules of the voxel structure are used to determine all the voxels penetrated by each ray. The voxels closest to the viewpoint are marked visible; the rest are treated as occluded. The experiment shows that the method can efficiently realize voxelized organization and voxel occlusion culling of the scene, and increases rendering efficiency.
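The per-pixel ray step described above (find every voxel a ray penetrates, mark the nearest occupied one visible, and cull everything behind it) can be sketched with a standard 3D DDA grid traversal. This is a generic illustration under assumed conventions for the grid layout, not the paper's exact voxel-configuration rules; the function name and parameters are hypothetical.

    import numpy as np

    def first_hit_voxel(origin, direction, occupancy, voxel_size=1.0):
        """March a ray through a regular voxel grid (3D DDA) and return the index
        of the first occupied voxel; every occupied voxel behind it is occluded."""
        direction = direction / np.linalg.norm(direction)
        voxel = np.floor(origin / voxel_size).astype(int)
        step = np.where(direction >= 0, 1, -1)
        # parametric distance to the next voxel boundary along each axis
        next_boundary = (voxel + (step > 0)) * voxel_size
        with np.errstate(divide="ignore", invalid="ignore"):
            t_max = np.where(direction != 0, (next_boundary - origin) / direction, np.inf)
            t_delta = np.where(direction != 0, voxel_size / np.abs(direction), np.inf)
        while np.all((voxel >= 0) & (voxel < occupancy.shape)):
            if occupancy[tuple(voxel)]:
                return tuple(voxel)          # visible voxel for this pixel's ray
            axis = int(np.argmin(t_max))     # advance across the closest boundary
            voxel[axis] += step[axis]
            t_max[axis] += t_delta[axis]
        return None                          # ray left the grid without a hit

Running this for the ray cast from each screen pixel yields the set of visible voxels; all other occupied voxels along those rays are culled before rendering.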
NASA Technical Reports Server (NTRS)
Suomi, V. E.
1975-01-01
The complete output of the Synchronous Meteorological Satellite was recorded on one-inch magnetic tape. A quality control subsystem tests cloud track vectors against four sets of criteria: (1) rejection if the best match occurs on the correlation boundary; (2) rejection if the major correlation peak is not distinct and significantly greater than the secondary peak; (3) rejection if the correlation is not persistent; and (4) rejection if the acceleration is too great. A cloud height program determines cloud optical thickness from visible data and computes infrared emissivity. From infrared data and the temperature profile, cloud height is determined. A functional description and electronic schematics of the equipment are given.
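The four rejection criteria translate almost directly into a screening routine. The sketch below is a hypothetical illustration of how such a quality-control check could be wired together; the field names, threshold values, and the persistence flag are assumptions, since the abstract gives only the criteria themselves.

    def accept_cloud_track_vector(match, previous_vector=None,
                                  peak_ratio_min=1.2, max_accel=5.0):
        """Apply the four quality-control criteria to one cloud-track match.
        `match` bundles the correlation result; all thresholds are illustrative."""
        # (1) reject if the best match lies on the correlation-window boundary
        if match["on_boundary"]:
            return False
        # (2) reject if the main correlation peak is not distinct enough
        if match["peak"] < peak_ratio_min * match["secondary_peak"]:
            return False
        # (3) reject if the correlation is not persistent between image pairs
        if not match["persistent"]:
            return False
        # (4) reject if the implied acceleration is too large
        if previous_vector is not None:
            accel = abs(match["speed"] - previous_vector["speed"]) / match["dt"]
            if accel > max_accel:
                return False
        return True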
Modeling the Virtual Machine Launching Overhead under Fermicloud
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garzoglio, Gabriele; Wu, Hao; Ren, Shangping
FermiCloud is a private cloud developed by the Fermi National Accelerator Laboratory for scientific workflows. The Cloud Bursting module of FermiCloud enables it, when more computational resources are needed, to automatically launch virtual machines to available resources such as public clouds. One of the main challenges in developing the cloud bursting module is to decide when and where to launch a VM so that all resources are most effectively and efficiently utilized and the system performance is optimized. However, based on FermiCloud's system operational data, the VM launching overhead is not a constant. It varies with physical resource (CPU, memory, I/O device) utilization at the time when a VM is launched. Hence, to make judicious decisions as to when and where a VM should be launched, a VM launch overhead reference model is needed. This paper develops a VM launch overhead reference model based on operational data we have obtained on FermiCloud and uses the reference model to guide the cloud bursting process.
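The abstract does not give the form of FermiCloud's reference model, so the following is only a minimal sketch of the idea: fit a simple linear model of launch overhead against host CPU, memory, and I/O utilization from historical operational data, then use the prediction to rank candidate launch targets. The linear form, field ordering, and function names are assumptions for illustration.

    import numpy as np

    def fit_launch_overhead_model(utilization, overhead):
        """Fit a linear reference model  overhead ~ b0 + b . [cpu, mem, io]
        from historical (utilization, launch-overhead) observations."""
        X = np.hstack([np.ones((len(utilization), 1)), np.asarray(utilization, dtype=float)])
        coeffs, *_ = np.linalg.lstsq(X, np.asarray(overhead, dtype=float), rcond=None)
        return coeffs

    def predict_overhead(coeffs, cpu, mem, io):
        """Predicted VM launch overhead for a host at the given utilization."""
        return float(coeffs @ np.array([1.0, cpu, mem, io]))

    def best_host(coeffs, candidate_hosts):
        """Pick the candidate (cpu, mem, io) tuple with the lowest predicted overhead."""
        return min(candidate_hosts, key=lambda h: predict_overhead(coeffs, *h))

A scheduler using such a model would launch (or defer) a VM on whichever resource minimizes the predicted overhead at that moment, which is the kind of decision the cloud bursting module has to make.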
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo; Moncrieff, Mitchell; Einaud, Franco (Technical Monitor)
2001-01-01
Numerical cloud models have been developed and applied extensively to study cloud-scale and mesoscale processes during the past four decades. The distinctive aspect of these cloud models is their ability to treat explicitly (or resolve) cloud-scale dynamics. This requires the cloud models to be formulated from the non-hydrostatic equations of motion that explicitly include the vertical acceleration terms, since the vertical and horizontal scales of convection are similar. Such models are also necessary in order to allow gravity waves, such as those triggered by clouds, to be resolved explicitly. In contrast, the hydrostatic approximation, usually applied in global or regional models, does not allow such gravity waves to be resolved explicitly. In addition, the availability of exponentially increasing computer capabilities has resulted in time integrations increasing from hours to days, domain grid boxes (points) increasing from less than 2000 to more than 2,500,000 grid points with 500 to 1000 m resolution, and 3-D models becoming increasingly prevalent. The cloud resolving model is now at a stage where it can provide reasonably accurate statistical information on the sub-grid, cloud-resolving processes poorly parameterized in climate models and numerical prediction models.
Effects of turbulence on warm clouds and precipitation with various aerosol concentrations
NASA Astrophysics Data System (ADS)
Lee, Hyunho; Baik, Jong-Jin; Han, Ji-Young
2015-02-01
This study investigates the effects of turbulence-induced collision enhancement (TICE) on warm clouds and precipitation by changing the cloud condensation nuclei (CCN) number concentration using a two-dimensional dynamic model with bin microphysics. TICE is determined according to the Taylor microscale Reynolds number and the turbulent dissipation rate. The thermodynamic sounding used in this study is characterized by a warm and humid atmosphere with a capping inversion layer, which is suitable for simulating warm clouds. For all CCN concentrations, TICE slightly reduces the liquid water path during the early stage of cloud development and accelerates the onset of surface precipitation. However, changes in the rainwater path and in the amount of surface precipitation that are caused by TICE depend on the CCN concentrations. For high CCN concentrations, the mean cloud drop number concentration (CDNC) decreases and the mean effective radius increases due to TICE. These changes cause an increase in the amount of surface precipitation. However, for low CCN concentrations, changes in the mean CDNC and in the mean effective radius induced by TICE are small and the amount of surface precipitation decreases slightly due to TICE. A decrease in condensation due to the accelerated coalescence between droplets explains the surface precipitation decrease. In addition, an increase in the CCN concentration can lead to an increase in the amount of surface precipitation, and the relationship between the CCN concentration and the amount of surface precipitation is affected by TICE. It is shown that these results depend on the atmospheric relative humidity.
Narrow-line region kinematics in Seyfert nuclei
NASA Astrophysics Data System (ADS)
Moore, David J.
1994-01-01
We present results of a study of narrow-line region (NLR) kinematics in Seyfert nuclei. This study has involved extensive modeling which includes collimated emission, radially dependent rotation and turbulence, explicit photoionization calculations, realistic treatments of both internal and external obscuration, and allows for gradients in the electron density and the radial velocity of clouds throughout the NLR. Line profiles of [O II] λ3727, [Ne III] λ3869, [O III] λ5007, [Fe VII] λ6087, [Fe X] λ6374, [O I] λ6300, Hα λ6563, and [S II] λ6731 are calculated for a wide range of physical conditions throughout the NLR. The model profiles are compared with line profiles derived from data taken with the Mount Palomar 5 m Hale Telescope as well as from profiles taken from the literature. The scenario in agreement with the largest number of observational considerations consists of clouds which are accelerating outward with v ∝ √r (i.e., constant force) and n_e ∝ 1/r^2. The clouds start out at the inner NLR radius with n_e ≈ 10^6 cm^-3 and with a very large column density (10^23-10^24 cm^-2). These clouds are uniformly accelerated from a few tens of km/s to ≲1000 km/s. When the clouds reach the outer NLR radius, they have n_e ≳ 10^2 cm^-3 and a column density of 10^21-10^22 cm^-2. The clouds maintain an ionization parameter of about 0.3 throughout the NLR.
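For orientation, the favoured constant-force scenario follows directly from the scalings quoted above (v ∝ √r, n_e ∝ r⁻²) and the quoted boundary values. The short sketch below simply evaluates those power laws; the function name and default normalizations are illustrative, taken from the numbers in the abstract.

    import numpy as np

    def nlr_cloud_profile(r, r_in, v_in=30.0, ne_in=1.0e6):
        """Radial velocity and electron density of an NLR cloud in the
        constant-force scenario: v ~ sqrt(r), n_e ~ r^-2."""
        r = np.asarray(r, dtype=float)
        v = v_in * np.sqrt(r / r_in)       # km/s, a few tens of km/s at the inner radius
        ne = ne_in * (r_in / r) ** 2       # cm^-3, ~1e6 at the inner radius
        return v, ne

    # Example: over two decades in radius the cloud reaches ~10x its initial speed
    # while its density drops by a factor of 1e4 (1e6 -> 1e2 cm^-3).
    v, ne = nlr_cloud_profile(r=np.array([1.0, 10.0, 100.0]), r_in=1.0)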
Graphics performance in rich Internet applications.
Hoetzlein, Rama C
2012-01-01
Rendering performance for rich Internet applications (RIAs) has recently focused on the debate between using Flash and HTML5 for streaming video and gaming on mobile devices. A key area not widely explored, however, is the scalability of raw bitmap graphics performance for RIAs. Does Flash render animated sprites faster than HTML5? How much faster is WebGL than Flash? Answers to these questions are essential for developing large-scale data visualizations, online games, and truly dynamic websites. A new test methodology analyzes graphics performance across RIA frameworks and browsers, revealing specific performance outliers in existing frameworks. The results point toward a future in which all online experiences might be GPU accelerated.
NASA Astrophysics Data System (ADS)
Wang, Yuan; Chen, Zhidong; Sang, Xinzhu; Li, Hui; Zhao, Linmin
2018-03-01
Holographic displays can provide the complete optical wave field of a three-dimensional (3D) scene, including the depth perception. However, it often takes a long computation time to produce traditional computer-generated holograms (CGHs) without more complex and photorealistic rendering. The backward ray-tracing technique is able to render photorealistic high-quality images and noticeably reduces the computation time thanks to its high degree of parallelism. Here, a high-efficiency photorealistic computer-generated hologram method is presented based on the ray-tracing technique. Rays are launched and traced in parallel under different illuminations and circumstances. Experimental results demonstrate the effectiveness of the proposed method. Compared with the traditional point cloud CGH, the computation time is decreased to 24 s to reconstruct a 3D object of 100 × 100 rays with continuous depth change.
High fidelity 3-dimensional models of beam-electron cloud interactions in circular accelerators
NASA Astrophysics Data System (ADS)
Feiz Zarrin Ghalam, Ali
Electron cloud is a low-density electron profile created inside the vacuum chamber of circular machines with positively charged beams. The electron cloud limits the peak current of the beam and degrades the beam's quality through luminosity degradation, emittance growth, and head-to-tail or bunch-to-bunch instability. The adverse effects of the electron cloud on long-term beam dynamics become more and more important as the beams go to higher and higher energies. This problem has become a major concern in the design of many future circular machines, like the Large Hadron Collider (LHC) under construction at the European Organization for Nuclear Research (CERN). Due to the importance of the problem, several simulation models have been developed to model long-term beam-electron cloud interaction. These models are based on the "single kick approximation", where the electron cloud is assumed to be concentrated at one thin slab around the ring. While this model is efficient in terms of computational cost, it does not reflect the real physical situation, as the forces from the electron cloud on the beam are nonlinear, contrary to this model's assumption. To address the existing codes' limitations, in this thesis a new model is developed to continuously model the beam-electron cloud interaction. The code is derived from a 3-D parallel Particle-In-Cell (PIC) model (QuickPIC) originally used for plasma wakefield acceleration research. To make the original model fit the circular-machine environment, betatron and synchrotron equations of motion have been added to the code, and the effects of chromaticity and lattice structure have been included. QuickPIC is then benchmarked against one of the codes developed based on the single kick approximation (HEAD-TAIL) for the transverse spot size of the beam in the CERN LHC. The growth predicted by QuickPIC is less than the one predicted by HEAD-TAIL. The code is then used to investigate the effect of electron cloud image charges on the long-term beam dynamics, particularly on the transverse tune shift of the beam at the CERN Super Proton Synchrotron (SPS) ring. The force from the electron cloud image charges on the beam cancels the force due to the cloud compression formed on the beam axis, and therefore the tune shift is mainly due to the uniform electron cloud density. (Abstract shortened by UMI.)
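For context, the "single kick approximation" that this thesis improves upon can be sketched in a few lines: each turn the particle undergoes a linear betatron rotation, and the entire electron-cloud force is lumped into one localized (here linearized) kick. This is a generic textbook-style illustration in normalized phase-space coordinates, not QuickPIC or HEAD-TAIL; the kick model and normalization are assumptions.

    import numpy as np

    def track_single_kick(x, xp, tune, kick_strength, n_turns):
        """Single-kick model: one linear betatron rotation per turn, followed by a
        localized, linearized electron-cloud kick applied at one point in the ring."""
        mu = 2.0 * np.pi * tune
        rot = np.array([[np.cos(mu), np.sin(mu)],
                        [-np.sin(mu), np.cos(mu)]])
        history = []
        for _ in range(n_turns):
            x, xp = rot @ np.array([x, xp])    # one turn of betatron motion
            xp += kick_strength * x            # lumped cloud kick (linearized)
            history.append(x)
        return np.array(history)

A continuous model such as the one developed in the thesis instead distributes the cloud interaction around the ring and recomputes the nonlinear cloud fields self-consistently, which is what the single-kick picture cannot capture.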
CloudSat Anomaly Recovery and Operational Lessons Learned
NASA Technical Reports Server (NTRS)
Witkowski, Mona; Vane, Deborah; Livermore, Thomas; Rokey, Mark; Barthuli, Marda; Gravseth, Ian J.; Pieper, Brian; Rodzinak, Aaron; Silva, Steve; Woznick, Paul;
2012-01-01
In April 2011, NASA's pioneering cloud profiling radar satellite, CloudSat, experienced a battery anomaly that placed it into emergency mode and rendered it incapable of operations. All initial attempts to recover the spacecraft failed, as the resultant power limitations could not support even the lowest power mode. Originally part of a six-satellite constellation known as the "A-Train", CloudSat was unable to stay within its assigned control box, posing a threat to other A-Train satellites. CloudSat needed to exit the constellation, but with the tenuous power profile, conducting maneuvers was very risky. The team was able to execute a complex sequence of operations which recovered control, conducted an orbit-lowering maneuver, and returned the satellite to safe mode, within one 65-minute sunlit period. During the course of the anomaly recovery, the team developed several bold, innovative operational strategies. Details of the investigation into the root cause and the multiple approaches to revive CloudSat are examined. Satellite communication and commanding during the anomaly are presented. A radical new system of "Daylight Only Operations" (DO-OP) was developed, which cycles the payload and subsystem components off in step with Earth eclipse entry and exit in order to maintain positive power and thermal profiles. The scientific methodology and operational results behind the graduated testing and ramp-up to DO-OP are analyzed. In November 2011, the CloudSat team successfully restored the vehicle to consistent operational collection of cloud radar data during sunlit portions of the orbit. Lessons learned throughout the six-month return-to-operations recovery effort are discussed and offered for application to other R&D satellites, in the context of on-orbit anomaly resolution efforts.
Cloud-based design of high average power traveling wave linacs
NASA Astrophysics Data System (ADS)
Kutsaev, S. V.; Eidelman, Y.; Bruhwiler, D. L.; Moeller, P.; Nagler, R.; Barbe Welzel, J.
2017-12-01
The design of industrial high average power traveling wave linacs must accurately consider some specific effects. For example, acceleration of a high-current beam reduces the power flow in the accelerating waveguide. Space charge may influence the stability of longitudinal or transverse beam dynamics. Accurate treatment of beam loading is central to the design of high-power TW accelerators, and it is especially difficult to model in the meter-scale region where the electrons are nonrelativistic. Currently, there are two types of available codes: tracking codes (e.g. PARMELA or ASTRA) that cannot solve self-consistent problems, and particle-in-cell codes (e.g. Magic 3D or CST Particle Studio) that can model the physics correctly but are very time-consuming and resource-demanding. Hellweg is a special tool for quick and accurate electron dynamics simulation in traveling wave accelerating structures. The underlying theory of this software is based on the differential equations of motion. The effects considered in this code include beam loading, space charge forces, and external magnetic fields. We present the current capabilities of the code, provide benchmarking results, and discuss future plans. We also describe the browser-based GUI for executing Hellweg in the cloud.
Does the climate warming hiatus exist over the Tibetan Plateau?
Duan, Anmin; Xiao, Zhixiang
2015-01-01
The surface air temperature change over the Tibetan Plateau is determined based on historical observations from 1980 to 2013. In contrast to the cooling trend in the rest of China, and the global warming hiatus post-1990s, an accelerated warming trend has appeared over the Tibetan Plateau during 1998–2013 (0.25 °C decade−1), compared with that during 1980–1997 (0.21 °C decade−1). Further results indicate that, to some degree, such an accelerated warming trend might be attributable to cloud–radiation feedback. The increased nocturnal cloud over the northern Tibetan Plateau would warm the nighttime temperature via enhanced atmospheric back-radiation, while the decreased daytime cloud over the southern Tibetan Plateau would induce the daytime sunshine duration to increase, resulting in surface air temperature warming. Meanwhile, the in situ surface wind speed has recovered gradually since 1998, and thus the energy concentration cannot explain the accelerated warming trend over the Tibetan Plateau after the 1990s. It is suggested that cloud–radiation feedback may play an important role in modulating the recent accelerated warming trend over the Tibetan Plateau. PMID:26329678
NASA Astrophysics Data System (ADS)
Kim, V. P.
2017-04-01
The long-term experience in controlling the electric field distribution in the discharge gaps of plasma accelerators and thrusters with closed electron drift and the key ideas determining the concepts of these devices and tendencies of their development are analyzed. It is shown that an electrostatic mechanism of ion acceleration in plasma by the uncompensated space charge of a cloud of magnetized electrons "tied" to the magnetic field takes place in the acceleration zones and that the electric field distribution can be controlled by varying the magnetic field in the discharge gap. The role played by the space charge makes the mechanism of ion acceleration in this type of thruster fundamentally different from the acceleration mechanism operating in purely electrostatic thrusters.
Photolysis frequency and cloud dynamics during DC3 and SEAC4RS
NASA Astrophysics Data System (ADS)
Hall, S. R.; Ullmann, K.; Madronich, S.; Hair, J. W.; Butler, C. F.; Fenn, M. A.
2013-12-01
Cloud shading plays a critical role in extending the lifetime of short-lived chemical species. During convection, photochemistry is reduced such that short-lived species may be transported from the boundary layer to the upper troposphere/lower stratosphere. In the anvil outflow, shading continues within and below the cloud. However, near the highly scattering cloud top, the chemistry is greatly accelerated. In this rapidly evolving environment, accurate photolysis frequencies are required to study photochemical evolution of the complex composition. During the Deep Convective Clouds and Chemistry (DC3, 2012) and the Studies of Emissions and Atmospheric Composition, Clouds and Climate Coupling by Regional Surveys (SEAC4RS, 2013) campaigns, photolysis frequencies were determined by measurement of spectrally resolved actinic flux by the Charge-coupled device Actinic Flux Spectroradiometer (CAFS) on the NASA DC-8 and the HIAPER Airborne Radiation Package (HARP) on the NCAR G-V aircraft. Vertical flight profiles allowed in situ characterization of the radiation environment. Input of geometrical cloud characteristics into the Tropospheric Ultraviolet and Visible (TUV) radiation model was used to constrain cloud optical depths for more spatially and temporally stable conditions.
NASA Astrophysics Data System (ADS)
Dieckmann, M. E.
2008-11-01
Recent particle-in-cell (PIC) simulation studies have addressed particle acceleration and magnetic field generation in relativistic astrophysical flows by plasma phase space structures. We discuss the astrophysical environments, such as the jets of compact objects, and we give an overview of the global PIC simulations of shocks. These reveal several types of phase space structures, which are relevant for the energy dissipation. These structures are typically coupled in shocks, but we choose to consider them here in an isolated form. Three structures are reviewed. (1) Simulations of interpenetrating or colliding plasma clouds can trigger filamentation instabilities, while simulations of thermally anisotropic plasmas observe the Weibel instability. Both transform a spatially uniform plasma into current filaments. These filament structures cause the growth of the magnetic fields. (2) The development of a modified two-stream instability is discussed. It saturates first by the formation of electron phase space holes. The relativistic electron clouds modulate the ion beam and a secondary, spatially localized electrostatic instability grows, which saturates by forming a relativistic ion phase space hole. It accelerates electrons to ultra-relativistic speeds. (3) A simulation is also revised, in which two clouds of an electron-ion plasma collide at the speed 0.9c. The unequal densities of the two clouds and a magnetic field that is oblique to the collision velocity vector result in waves with a mixed electrostatic and electromagnetic polarity. The waves give rise to growing corkscrew distributions in the electrons and ions that establish an equipartition between the electron, the ion and the magnetic energy. The filament, phase space hole, and corkscrew structures are discussed with respect to electron acceleration and magnetic field generation.
NASA Astrophysics Data System (ADS)
Goldsmith, K. J. A.; Pittard, J. M.
2018-05-01
The similarities, or otherwise, of a shock or wind interacting with a cloud of density contrast χ = 10 were explored in a previous paper. Here, we investigate such interactions with clouds of higher density contrast. We compare the adiabatic hydrodynamic interaction of a Mach 10 shock with a spherical cloud of χ = 10^3 with that of a cloud embedded in a wind with identical parameters to the post-shock flow. We find that initially there are only minor morphological differences between the shock-cloud and wind-cloud interactions, compared to when χ = 10. However, once the transmitted shock exits the cloud, the development of a turbulent wake and fragmentation of the cloud differs between the two simulations. On increasing the wind Mach number, we note the development of a thin, smooth tail of cloud material, which is then disrupted by the fragmentation of the cloud core and subsequent `mass-loading' of the flow. We find that the normalized cloud mixing time (t_mix) is shorter at higher χ. However, a strong Mach number dependence on t_mix and the normalized cloud drag time, t'_drag, is not observed. Mach-number-dependent values of t_mix and t'_drag from comparable shock-cloud interactions converge towards the Mach-number-independent time-scales of the wind-cloud simulations. We find that high-χ clouds can be accelerated up to 80-90 per cent of the wind velocity and travel large distances before being significantly mixed. However, complete mixing is not achieved in our simulations and at late times the flow remains perturbed.
ClipCard: Sharable, Searchable Visual Metadata Summaries on the Cloud to Render Big Data Actionable
NASA Astrophysics Data System (ADS)
Saripalli, P.; Davis, D.; Cunningham, R.
2013-12-01
Research firm IDC estimates that approximately 90 percent of Enterprise Big Data go un-analyzed, as 'dark data' - an enormous corpus of undiscovered, untagged information residing on data warehouses, servers and Storage Area Networks (SAN). In the geosciences, these data range from unpublished model runs to vast survey data assets to raw sensor data. Many of these are now being collected instantaneously, at a greater volume and in new data formats. Not all of these data can be analyzed, nor processed in real time, and their features may not be well described at the time of collection. These dark data are a serious data management problem for science organizations of all types, especially ones with mandated or required data reporting and compliance requirements. Additionally, data curators and scientists are encouraged to quantify the impact of their data holdings as a way to measure research success. Deriving actionable insights is the foremost goal of Big Data Analytics (BDA), which is especially true with geoscience, given its direct impact on most of the pressing global issues. Clearly, there is a pressing need for innovative approaches to making dark data discoverable, measurable, and actionable. We report on ClipCard, a cloud-based SaaS analytic platform for instant summarization, quick search, visualization and easy sharing of metadata summaries from the Dark Data at hierarchical levels of detail, thus rendering it 'white', i.e., actionable. We present a use case of the ClipCard platform, a cloud-based application which helps generate (abstracted) visual metadata summaries and meta-analytics for environmental data at hierarchical scales within and across big data containers. These summaries and analyses provide important new tools for managing big data and simplifying collaboration through easy-to-deploy sharing APIs. The ClipCard application solves a growing data management bottleneck by helping enterprises and large organizations to summarize, search, discover, and share the potential in their unused data and information assets. Using the cloud as the base platform enables wider reach, quick dissemination and easy sharing of the metadata summaries, without actually storing or sharing the original data assets per se.
CRRES: The combined release and radiation effects satellite program directory
NASA Technical Reports Server (NTRS)
Layman, Laura D.; Miller, George P.
1992-01-01
As a result of natural processes, plasma clouds are often injected into the magnetosphere. These chemical releases can be used to study many aspects of such injections. When a dense plasma is injected into the inner magnetosphere, it is expected to take up the motion of the ambient plasma. However, it has been observed in previous releases at moderate altitudes that the cloud preserved its momentum for some time following the release and that parts of the cloud peeled off from the main cloud, presumably due to the action of an instability. As one moves outward into the magnetosphere, the mirror force becomes less dominant and the initial conditions following a release are dominated by the formation of a diamagnetic cavity, since the initial plasma pressure from the injected Ba ions is greater than the magnetic field energy density. A previous high-altitude release (31,300 km) showed this to be the case initially, but at later times there was evidence for acceleration of the Ba plasma to velocities corresponding to 60,000 K. This effect is not explained. This series of experiments is therefore designed to inject plasma clouds into the magnetosphere under widely varying conditions of magnetic field strength and ambient plasma density. In this way the coupling of injected clouds to the ambient plasma and magnetic field, the formation of striations due to instabilities, and possible heating and acceleration of the injected Ba plasma can be studied over a wide range of magnetosphere parameters. Adding to the scientific yield will be the availability of measurements from the DOD/SPACERAD instruments, which can monitor plasma parameters, electric and magnetic fields, and waves before, during and after the releases.
Laboratory investigation of dust impacts on antennas in space
NASA Astrophysics Data System (ADS)
Drake, K.; Gruen, E.; Malaspina, D.; Sternovsky, Z.
2013-12-01
We are performing calibration measurements in our laboratory using a dust accelerator to understand the mechanisms by which dust-impact-generated plasma clouds couple into electric field antennas on spacecraft. The S/WAVES electric field instruments on board the twin STEREO spacecraft observed short duration (milliseconds), large amplitude (> 15 mV) voltage spikes associated with the impact of high velocity dust particles on the spacecraft [St. Cyr et al., 2009, Meyer-Vernet et al., 2009a, Zaslavsky et al., 2012]. These sharp spikes have been attributed to plasma clouds generated by the impact ionization of high velocity dust particles. The high count rate has led to the interpretation that S/WAVES is detecting nanometer sized dust particles (nano-dust) generated in the inner solar system and accelerated to close to solar wind velocities before impacting the spacecraft at 1 AU. The S/WAVES nano-dust interpretation is currently based on an incomplete understanding of the charge generated from relevant materials and the coupling mechanism between the plasma cloud and the electric field instrument. Calibration measurements are performed at the dust accelerator facility at the University of Colorado to investigate the effect of various impact parameters on the signals measured by the electric field instrument. The dust accelerator facility allows experimental control over target materials, size (micron to sub-micron), and velocity (1-60 km/s) of impacting dust particles, geometry of the impact, the 'spacecraft' potential, and the presence or absence of photoelectrons, allowing each coupling factor to be isolated and quantified. As the first step in this effort, we measure the impact charge generation for materials relevant for the STEREO spacecraft.
Measurement of the Lorentz-FitzGerald body contraction
NASA Astrophysics Data System (ADS)
Rafelski, Johann
2018-02-01
A complete foundational discussion of acceleration in the context of Special Relativity (SR) is presented. Acceleration allows the measurement of the Lorentz-FitzGerald body contraction it creates. It is argued that in the back scattering of a probing laser beam from a relativistic flying electron cloud mirror generated by an ultra-intense laser pulse, a first measurement of a Lorentz-FitzGerald body contraction is feasible.
Rohmer, Kai; Jendersie, Johannes; Grosch, Thorsten
2017-11-01
Augmented Reality offers many applications today, especially on mobile devices. Due to the lack of mobile hardware for illumination measurements, photorealistic rendering with consistent appearance of virtual objects is still an area of active research. In this paper, we present a full two-stage pipeline for environment acquisition and augmentation of live camera images using a mobile device with a depth sensor. We show how to directly work on a recorded 3D point cloud of the real environment containing high dynamic range color values. For unknown and automatically changing camera settings, a color compensation method is introduced. Based on this, we show photorealistic augmentations using variants of differential light simulation techniques. The presented methods are tailored for mobile devices and run at interactive frame rates. However, our methods are scalable to trade performance for quality and can produce quality renderings on desktop hardware.
NASA Technical Reports Server (NTRS)
Cargill, Peter J.; Chen, James; Spicer, D. S.; Zalesak, S. T.
1994-01-01
Two dimensional magnetohydrodynamic simulations of the distortion of a magnetic flux tube, accelerated through ambient solar wind plasma, are presented. Vortices form on the trailing edge of the flux tube, and couple strongly to its interior. If the flux tube azimuthal field is weak, it deforms into an elongated banana-like shape after a few Alfven transit times. A significant azimuthal field component inhibits this distortion. In the case of magnetic clouds in the solar wind, it is suggested that the shape observed at 1 AU was determined by distortion of the cloud in the inner heliosphere. Distortion of the cloud beyond 1 AU takes many days. It is estimated that effective drag coefficients slightly greater than unity are appropriate for modeling flux tube propagation. Synthetic magnetic field profiles as would be seen by a spacecraft traversing the cloud are presented.
Clouds enhance Greenland ice sheet meltwater runoff.
Van Tricht, K; Lhermitte, S; Lenaerts, J T M; Gorodetskaya, I V; L'Ecuyer, T S; Noël, B; van den Broeke, M R; Turner, D D; van Lipzig, N P M
2016-01-12
The Greenland ice sheet has become one of the main contributors to global sea level rise, predominantly through increased meltwater runoff. The main drivers of Greenland ice sheet runoff, however, remain poorly understood. Here we show that clouds enhance meltwater runoff by about one-third relative to clear skies, using a unique combination of active satellite observations, climate model data and snow model simulations. This impact results from a cloud radiative effect of 29.5 (±5.2) W m⁻². Contrary to conventional wisdom, however, the Greenland ice sheet responds to this energy through a new pathway by which clouds reduce meltwater refreezing as opposed to increasing surface melt directly, thereby accelerating bare-ice exposure and enhancing meltwater runoff. The high sensitivity of the Greenland ice sheet to both ice-only and liquid-bearing clouds highlights the need for accurate cloud representations in climate models, to better predict future contributions of the Greenland ice sheet to global sea level rise.
High Performance GPU-Based Fourier Volume Rendering.
Abdellah, Marwan; Eldeib, Ayman; Sharawi, Amr
2015-01-01
Fourier volume rendering (FVR) is a significant visualization technique that has been used widely in digital radiography. As a result of its O(N² log N) time complexity, it provides a faster alternative to spatial-domain volume rendering algorithms that are O(N³) computationally complex. Relying on the Fourier projection-slice theorem, this technique operates on the spectral representation of a 3D volume instead of processing its spatial representation to generate attenuation-only projections that look like X-ray radiographs. Due to the rapid evolution of its underlying architecture, the graphics processing unit (GPU) became an attractive, competent platform that can deliver giant computational raw power compared to the central processing unit (CPU) on a per-dollar basis. The introduction of the compute unified device architecture (CUDA) technology enables embarrassingly parallel algorithms to run efficiently on CUDA-capable GPU architectures. In this work, a high performance GPU-accelerated implementation of the FVR pipeline on CUDA-enabled GPUs is presented. This proposed implementation can achieve a speed-up of 117x compared to a single-threaded hybrid implementation that uses the CPU and GPU together by taking advantage of executing the rendering pipeline entirely on recent GPU architectures.
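The projection-slice theorem underlying FVR is easy to demonstrate in a few lines: the kz = 0 plane of the volume's 3D spectrum is exactly the 2D spectrum of its projection along z, so a single 2D inverse FFT replaces integrating rays through the volume. The sketch below shows only this axis-aligned case in NumPy; a real FVR renderer (including the CUDA pipeline described above) instead resamples an arbitrarily oriented central slice of the precomputed spectrum for each view direction.

    import numpy as np

    def fourier_projection(volume):
        """Attenuation-only projection along z via the projection-slice theorem."""
        spectrum = np.fft.fftn(volume)         # 3D FFT, computed once per volume
        central_slice = spectrum[:, :, 0]      # kz = 0 plane of the 3D spectrum
        return np.real(np.fft.ifft2(central_slice))   # one 2D inverse FFT per view

    # Sanity check against direct spatial-domain integration along z.
    vol = np.random.rand(32, 32, 32)
    assert np.allclose(fourier_projection(vol), vol.sum(axis=2))

Once the 3D spectrum is precomputed, each new projection costs only the slice extraction plus an N×N inverse FFT, which is where the O(N² log N) per-view complexity quoted in the abstract comes from.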
Fast Time-Varying Volume Rendering Using Time-Space Partition (TSP) Tree
NASA Technical Reports Server (NTRS)
Shen, Han-Wei; Chiang, Ling-Jen; Ma, Kwan-Liu
1999-01-01
We present a new, algorithm for rapid rendering of time-varying volumes. A new hierarchical data structure that is capable of capturing both the temporal and the spatial coherence is proposed. Conventional hierarchical data structures such as octrees are effective in characterizing the homogeneity of the field values existing in the spatial domain. However, when treating time merely as another dimension for a time-varying field, difficulties frequently arise due to the discrepancy between the field's spatial and temporal resolutions. In addition, treating spatial and temporal dimensions equally often prevents the possibility of detecting the coherence that is unique in the temporal domain. Using the proposed data structure, our algorithm can meet the following goals. First, both spatial and temporal coherence are identified and exploited for accelerating the rendering process. Second, our algorithm allows the user to supply the desired error tolerances at run time for the purpose of image-quality/rendering-speed trade-off. Third, the amount of data that are required to be loaded into main memory is reduced, and thus the I/O overhead is minimized. This low I/O overhead makes our algorithm suitable for out-of-core applications.
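As a simplified illustration of the idea behind the TSP structure (precomputed per-node spatial and temporal error measures that let the renderer reuse a coarse node, and any data cached for it, whenever the user-supplied tolerances allow), the sketch below keeps only the temporal half of the hierarchy and a single scalar summary per node. The actual TSP tree combines a spatial octree with a binary time tree at each octree node; the class, error measures, and function names here are purely illustrative assumptions.

    import numpy as np

    class TSPNode:
        """A spatial block of the volume over a time interval [t0, t1), with a
        precomputed mean value and simple spatial/temporal error measures."""
        def __init__(self, data, t0, t1):
            block = data[t0:t1]                                   # shape (time, x, y, z)
            self.t0, self.t1 = t0, t1
            self.mean = float(block.mean())
            self.spatial_error = float(block.std(axis=(1, 2, 3)).mean())
            self.temporal_error = float(block.mean(axis=(1, 2, 3)).std())
            self.children = []

    def build(data, t0, t1, min_len=2):
        """Bisect the time interval recursively to expose temporal coherence."""
        node = TSPNode(data, t0, t1)
        if t1 - t0 > min_len:
            mid = (t0 + t1) // 2
            node.children = [build(data, t0, mid, min_len),
                             build(data, mid, t1, min_len)]
        return node

    def render_value(node, t, eps_s, eps_t):
        """Reuse the coarse node whenever both run-time error tolerances are met,
        which is what avoids reloading and re-rendering data for time step t."""
        if (node.spatial_error <= eps_s and node.temporal_error <= eps_t) or not node.children:
            return node.mean
        child = node.children[0] if t < node.children[0].t1 else node.children[1]
        return render_value(child, t, eps_s, eps_t)

Raising eps_s and eps_t trades image quality for speed and I/O, which mirrors the run-time error-tolerance control described in the abstract.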
NASA Astrophysics Data System (ADS)
Le Marshall, J.; Jung, J.; Lord, S. J.; Derber, J. C.; Treadon, R.; Joiner, J.; Goldberg, M.; Wolf, W.; Liu, H. C.
2005-08-01
The National Aeronautics and Space Administration (NASA), National Oceanic and Atmospheric Administration (NOAA), and Department of Defense (DoD) Joint Center for Satellite Data Assimilation (JCSDA) was established in 2000/2001. The goal of the JCSDA is to accelerate the use of observations from earth-orbiting satellites in operational numerical environmental analysis and prediction systems for the purpose of improving weather and oceanic forecasts, seasonal climate forecasts, and the accuracy of climate data sets. As a result, a series of data assimilation experiments was undertaken at the JCSDA as part of the preparations for the operational assimilation of AIRS data by its partner organizations [1,2]. Here, for the first time, full-spatial-resolution radiance data, available in real time from the AIRS instrument, were used at the JCSDA in data assimilation studies over the globe utilizing the operational NCEP Global Forecast System (GFS). The radiance data from each channel of the instrument were carefully screened for cloud effects, and those radiances which were deemed to be clear of cloud effects were used by the GFS forecast system. The result of these assimilation trials has been a first demonstration of significant improvements in forecast skill over both the Northern and Southern Hemispheres compared to the operational system without AIRS data. The experimental system was designed in a way that rendered it feasible for operational application, and that constraint involved using the subset of AIRS channels chosen for operational distribution and an analysis methodology close to the current analysis practice, with particular consideration given to time limitations. As a result, operational application of these AIRS data was enabled by the recent NCEP operational upgrade. In addition, because of the improved impact resulting from use of this enhanced data set compared to that used operationally to date, provision of a real-time "warmest field of view" data set has been established for use by international NWP centers.
Fermi Large Area Telescope Observations of the Supernova Remnant G8.7-0.1
NASA Technical Reports Server (NTRS)
Ajello, M.; Allafort, A.; Baldini, L.; Ballet, J.; Barbiellini, G.; Bastieri, D.; Bechtol, K.; Bellazzini, R.; Berenji, B.; Blandford, R. D.;
2011-01-01
We present a detailed analysis of the GeV gamma-ray emission toward the supernova remnant (SNR) G8.7-0.1 with the Large Area Telescope (LAT) on board the Fermi Gamma-ray Space Telescope. An investigation of the relationship between G8.7-0.1 and the TeV unidentified source HESS J1804-216 provides us with an important clue on the diffusion process of cosmic rays if particle acceleration operates in the SNR. The GeV gamma-ray emission is extended, with most of the emission in positional coincidence with the SNR G8.7-0.1 and a lesser part located outside the western boundary of G8.7-0.1. The region of the gamma-ray emission overlaps spatially-connected molecular clouds, implying a physical connection for the gamma-ray structure. The total gamma-ray spectrum measured with LAT from 200 MeV-100 GeV can be described by a broken power-law function with a break of 2.4 +/- 0.6 (stat) +/- 1.2 (sys) GeV, and photon indices of 2.10 +/- 0.06 (stat) +/- 0.10 (sys) below the break and 2.70 +/- 0.12 (stat) +/- 0.14 (sys) above the break. Given the spatial association among the gamma rays, the radio emission of G8.7-0.1, and the molecular clouds, the decay of π0s produced by particles accelerated in the SNR and hitting the molecular clouds naturally explains the GeV gamma-ray spectrum. We also find that the GeV morphology is not well represented by the TeV emission from HESS J1804-216 and that the spectrum in the GeV band is not consistent with the extrapolation of the TeV gamma-ray spectrum. The spectral index of the TeV emission is consistent with the particle spectral index predicted by a theory that assumes energy-dependent diffusion of particles accelerated in an SNR. We discuss the possibility that the TeV spectrum originates from the interaction of particles accelerated in G8.7-0.1 with molecular clouds, and we constrain the diffusion coefficient of the particles.
Fermi Large Area Telescope Observations of the Supernova Remnant G8.7-0.1
NASA Technical Reports Server (NTRS)
Ferrara, E. C.; Hays, E.; Troja, E.; Moiseev, A. A.
2012-01-01
We present a detailed analysis of the GeV gamma-ray emission toward the supernova remnant (SNR) G8.7-0.1 with the Large Area Telescope (LAT) onboard the Fermi Gamma-ray Space Telescope. An investigation of the relationship between G8.7-0.1 and the TeV unidentified source HESS J1804-216 provides us with an important clue on the diffusion process of cosmic rays if particle acceleration operates in the SNR. The GeV gamma-ray emission is extended, with most of the emission in positional coincidence with the SNR G8.7-0.1 and a lesser part located outside the western boundary of G8.7-0.1. The region of the gamma-ray emission overlaps spatially connected molecular clouds, implying a physical connection for the gamma-ray structure. The total gamma-ray spectrum measured with the LAT from 200 MeV to 100 GeV can be described by a broken power-law function with a break of 2.4 +/- 0.6 (stat) +/- 1.2 (sys) GeV, and photon indices of 2.10 +/- 0.06 (stat) +/- 0.10 (sys) below the break and 2.70 +/- 0.12 (stat) +/- 0.14 (sys) above the break. Given the spatial association among the gamma rays, the radio emission of G8.7-0.1, and the molecular clouds, the decay of pions produced by particles accelerated in the SNR and hitting the molecular clouds naturally explains the GeV gamma-ray spectrum. We also find that the GeV morphology is not well represented by the TeV emission from HESS J1804-216 and that the spectrum in the GeV band is not consistent with the extrapolation of the TeV gamma-ray spectrum. The spectral index of the TeV emission is consistent with the particle spectral index predicted by a theory that assumes energy-dependent diffusion of particles accelerated in an SNR. We discuss the possibility that the TeV spectrum originates from the interaction of particles accelerated in G8.7-0.1 with molecular clouds, and we constrain the diffusion coefficient of the particles.
Fermi Large Area Telescope Observations of the Supernova Remnant G8.7-0.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ajello, M.; Allafort, A.; /Stanford U., HEPL /KIPAC, Menlo Park /SLAC
We present a detailed analysis of the GeV gamma-ray emission toward the supernova remnant (SNR) G8.7-0.1 with the Large Area Telescope (LAT) on board the Fermi Gamma-ray Space Telescope. An investigation of the relationship between G8.7-0.1 and the TeV unidentified source HESS J1804-216 provides us with an important clue on the diffusion process of cosmic rays if particle acceleration operates in the SNR. The GeV gamma-ray emission is extended, with most of the emission in positional coincidence with the SNR G8.7-0.1 and a lesser part located outside the western boundary of G8.7-0.1. The region of the gamma-ray emission overlaps spatially connected molecular clouds, implying a physical connection for the gamma-ray structure. The total gamma-ray spectrum measured with the LAT from 200 MeV to 100 GeV can be described by a broken power-law function with a break of 2.4 ± 0.6 (stat) ± 1.2 (sys) GeV, and photon indices of 2.10 ± 0.06 (stat) ± 0.10 (sys) below the break and 2.70 ± 0.12 (stat) ± 0.14 (sys) above the break. Given the spatial association among the gamma rays, the radio emission of G8.7-0.1, and the molecular clouds, the decay of π0s produced by particles accelerated in the SNR and hitting the molecular clouds naturally explains the GeV gamma-ray spectrum. We also find that the GeV morphology is not well represented by the TeV emission from HESS J1804-216 and that the spectrum in the GeV band is not consistent with the extrapolation of the TeV gamma-ray spectrum. The spectral index of the TeV emission is consistent with the particle spectral index predicted by a theory that assumes energy-dependent diffusion of particles accelerated in an SNR. We discuss the possibility that the TeV spectrum originates from the interaction of particles accelerated in G8.7-0.1 with molecular clouds, and we constrain the diffusion coefficient of the particles.
THE LAUNCHING OF COLD CLOUDS BY GALAXY OUTFLOWS. II. THE ROLE OF THERMAL CONDUCTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brüggen, Marcus; Scannapieco, Evan
2016-05-01
We explore the impact of electron thermal conduction on the evolution of radiatively cooled cold clouds embedded in flows of hot and fast material, as occurs in outflowing galaxies. Performing a parameter study of three-dimensional adaptive mesh refinement hydrodynamical simulations, we show that electron thermal conduction causes cold clouds to evaporate, but it can also extend their lifetimes by compressing them into dense filaments. We distinguish between low column-density clouds, which are disrupted on very short timescales, and high column-density clouds with much longer disruption times that are set by a balance between impinging thermal energy and evaporation. We provide fits to the cloud lifetimes and velocities that can be used in galaxy-scale simulations of outflows in which the evolution of individual clouds cannot be modeled with the required resolution. Moreover, we show that the clouds are only accelerated to a small fraction of the ambient velocity because compression by evaporation causes the clouds to present a small cross-section to the ambient flow. This means that either magnetic fields must suppress thermal conduction, or that the cold clouds observed in galaxy outflows are not formed of cold material carried out from the galaxy.
NASA Astrophysics Data System (ADS)
Angius, S.; Bisegni, C.; Ciuffetti, P.; Di Pirro, G.; Foggetta, L. G.; Galletti, F.; Gargana, R.; Gioscio, E.; Maselli, D.; Mazzitelli, G.; Michelotti, A.; Orrù, R.; Pistoni, M.; Spagnoli, F.; Spigone, D.; Stecchi, A.; Tonto, T.; Tota, M. A.; Catani, L.; Di Giulio, C.; Salina, G.; Buzzi, P.; Checcucci, B.; Lubrano, P.; Piccini, M.; Fattibene, E.; Michelotto, M.; Cavallaro, S. R.; Diana, B. F.; Enrico, F.; Pulvirenti, S.
2016-01-01
This paper presents the !CHAOS open source project, which aims to develop a prototype of a national private Cloud Computing infrastructure devoted to accelerator control systems and large experiments of High Energy Physics (HEP). The !CHAOS project has been financed by MIUR (Italian Ministry of Research and Education) and aims to develop a new concept of control system and data acquisition framework by providing, with a high level of abstraction, all the services needed for controlling and managing a large scientific, or non-scientific, infrastructure. A beta version of the !CHAOS infrastructure will be released at the end of December 2015 and will run on private Cloud infrastructures based on OpenStack.
Longitudinal Control for Mengshi Autonomous Vehicle via Cloud Model
NASA Astrophysics Data System (ADS)
Gao, H. B.; Zhang, X. Y.; Li, D. Y.; Liu, Y. C.
2018-03-01
Dynamic robustness and stability control is a requirement for the self-driving of autonomous vehicles. The longitudinal control of autonomous vehicles is a key technique that has drawn the attention of both industry and academia. In this paper, we present a longitudinal control algorithm based on the cloud model for the Mengshi autonomous vehicle, to ensure its dynamic stability and tracking performance. Experiments were carried out to test the implementation of the longitudinal control algorithm. Empirical results show that when the longitudinal control algorithm based on the Gauss cloud model is applied to calculate the acceleration, and the vehicle drives at different speeds, a stable longitudinal control effect is achieved.
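For readers unfamiliar with the cloud model, the sketch below implements the standard forward normal cloud generator (expectation Ex, entropy En, hyper-entropy He) on which such controllers are commonly built; the function name and the numerical values are illustrative assumptions, not the Mengshi controller itself.

```python
import numpy as np

def normal_cloud_drops(Ex, En, He, n=1000, seed=None):
    """Forward normal cloud generator.

    Returns n cloud drops (x_i, mu_i), where x_i is a sample value and mu_i
    its membership degree; Ex is the expectation, En the entropy and He the
    hyper-entropy of the qualitative concept being modelled.
    """
    rng = np.random.default_rng(seed)
    En_prime = rng.normal(En, He, size=n)                 # second-order randomness
    x = rng.normal(Ex, np.abs(En_prime))                  # cloud drop positions
    mu = np.exp(-(x - Ex) ** 2 / (2.0 * En_prime ** 2))   # membership degrees
    return x, mu

# Illustrative use: fuzzify a speed-tracking error (m/s) before rule evaluation.
drops, membership = normal_cloud_drops(Ex=0.0, En=1.5, He=0.1, n=5, seed=0)
print(drops, membership)
```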
Immersive Molecular Visualization with Omnidirectional Stereoscopic Ray Tracing and Remote Rendering
Stone, John E.; Sherman, William R.; Schulten, Klaus
2016-01-01
Immersive molecular visualization provides the viewer with intuitive perception of complex structures and spatial relationships that are of critical interest to structural biologists. The recent availability of commodity head mounted displays (HMDs) provides a compelling opportunity for widespread adoption of immersive visualization by molecular scientists, but HMDs pose additional challenges due to the need for low-latency, high-frame-rate rendering. State-of-the-art molecular dynamics simulations produce terabytes of data that can be impractical to transfer from remote supercomputers, necessitating routine use of remote visualization. Hardware-accelerated video encoding has profoundly increased frame rates and image resolution for remote visualization, however round-trip network latencies would cause simulator sickness when using HMDs. We present a novel two-phase rendering approach that overcomes network latencies with the combination of omnidirectional stereoscopic progressive ray tracing and high performance rasterization, and its implementation within VMD, a widely used molecular visualization and analysis tool. The new rendering approach enables immersive molecular visualization with rendering techniques such as shadows, ambient occlusion lighting, depth-of-field, and high quality transparency, that are particularly helpful for the study of large biomolecular complexes. We describe ray tracing algorithms that are used to optimize interactivity and quality, and we report key performance metrics of the system. The new techniques can also benefit many other application domains. PMID:27747138
Evolution of the Debris Cloud Generated by the Fengyun-1C Fragmentation Event
NASA Technical Reports Server (NTRS)
Pardini, Carmen; Anselmo, Luciano
2007-01-01
The cloud of cataloged debris produced in low Earth orbit by the fragmentation of the Fengyun-1C spacecraft was propagated for 15 years, taking into account all relevant perturbations. Unfortunately, the cloud proved to be very stable, not undergoing substantial debris decay during the time span considered. The only significant short-term evolution was the differential spreading of the orbital planes of the fragments, leading to the formation of a debris shell around the Earth approximately 7-8 months after the breakup, and the perigee precession of the elliptical orbits. Both effects will render the shell more "isotropic" in the coming years. The immediate consequence of the Chinese anti-satellite test, carried out in an orbital regime populated by many important operational satellites, was to increase significantly the probability of collision with man-made debris. For the two Italian spacecraft launched in the first half of 2007, the collision probability with cataloged objects increased by 12% for AGILE, in equatorial orbit, and by 38% for COSMO-SkyMed 1, in sun-synchronous orbit.
Satoh, Akihiro
2016-04-01
The purpose of this study is to develop a new system for obtaining and sharing the patient data required for a radiological examination without using an electronic medical chart or a radiological information system (RIS), and also to demonstrate that this system can be operated on cloud technology. I used Java Enterprise Edition (Java EE) as the programming language and MySQL as the server software, and I used two laptops as the hardware for the client and server computers. For cloud computing, I used a Google App Engine for Java (GAE) server. As a result, I could instantly obtain the patient data required for an examination using this system. The system also helps to improve the efficiency of examinations; for example, it has been useful for deciding radiographic conditions or for creating CT images such as multi-planar reconstructions (MPR) or volume renderings (VR). As for cloud computing, the GAE was used only experimentally owing to legal restrictions. From the above points it is clear that this system can play an important role in radiological examinations, but there are still a few issues to resolve for cloud computing.
Visualization of the Construction of Ancient Roman Buildings in Ostia Using Point Cloud Data
NASA Astrophysics Data System (ADS)
Hori, Y.; Ogawa, T.
2017-02-01
The implementation of laser scanning in the field of archaeology provides us with an entirely new dimension in research and surveying. It allows us to digitally recreate individual objects, or entire cities, using millions of three-dimensional points grouped together in what is referred to as "point clouds". In addition, the visualization of the point cloud data, which can be used in the final report by archaeologists and architects, should usually be produced as a JPG or TIFF file. Not only the visualization of point cloud data, but also the re-examination of older data and new surveys of the construction of Roman buildings, applying remote-sensing technology for precise and detailed measurements, afford new information that may lead to revised drawings of ancient buildings which had previously been adduced as evidence without any consideration of their degree of accuracy, and can ultimately enable new research on ancient buildings. We used laser scanners in the field because of their speed, comprehensive coverage, accuracy, and flexibility of data manipulation. We therefore "skipped" much of the post-processing and focused on the images created from the metadata, simply aligned using a tool that extends an automatic feature-matching algorithm and a popular renderer that can provide graphic results.
Speeding Up Geophysical Research Using Docker Containers Within Multi-Cloud Environment.
NASA Astrophysics Data System (ADS)
Synytsky, R.; Henadiy, S.; Lobzakov, V.; Kolesnikov, L.; Starovoit, Y. O.
2016-12-01
How useful are geophysical observations for minimizing losses from natural disasters today? Do they help to decrease the number of human victims during tsunamis and earthquakes? Unfortunately, this is still at an early stage. Making such observations more useful by improving early warning and prediction systems with the help of cloud computing is an important goal and achievement. Cloud computing technologies have proved their ability to speed up application development in many areas for 10 years already. The cloud unlocks new opportunities for geoscientists by providing access to modern data processing tools and algorithms, including real-time high-performance computing, big data processing, artificial intelligence, and others. Emerging lightweight cloud technologies, such as Docker containers, are gaining wide traction in IT because they enable faster and more efficient deployment of applications in a cloud environment. They make it possible to deploy and manage geophysical applications and systems in minutes across multiple clouds and data centers, which becomes of utmost importance for the next generation of applications. In this session we will demonstrate how Docker container technology within a multi-cloud environment can accelerate the development of applications specifically designed for geophysical research.
Green Bank Telescope Detection of HI Clouds in the Fermi Bubble Wind
NASA Astrophysics Data System (ADS)
Lockman, Felix; Di Teodoro, Enrico M.; McClure-Griffiths, Naomi M.
2018-01-01
We used the Robert C. Byrd Green Bank Telescope to map HI 21cm emission in two large regions around the Galactic Center in a search for HI clouds that might be entrained in the nuclear wind that created the Fermi bubbles. In a ~160 square degree region at |b|>4 deg. and |long|<10 deg we detect 106 HI clouds that have large non-circular velocities consistent with their acceleration by the nuclear wind. Rapidly moving clouds are found as far as 1.5 kpc from the center; there are no detectable asymmetries in the cloud populations above and below the Galactic Center. The cloud kinematics is modeled as a population with an outflow velocity of 330 km/s that fills a cone with an opening angle ~140 degrees. The total mass in the clouds is ~10^6 solar masses and we estimate cloud lifetimes to be between 2 and 8 Myr, implying a cold gas mass-loss rate of about 0.1 solar masses per year into the nuclear wind. The Green Bank Telescope is a facility of the National Science Foundation, operated under a cooperative agreement by Associated Universities, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Hao; Ren, Shangping; Garzoglio, Gabriele
Cloud bursting is one of the key research topics in the cloud computing communities. A well designed cloud bursting module enables private clouds to automatically launch virtual machines (VMs) to public clouds when more resources are needed. One of the main challenges in developing a cloud bursting module is to decide when and where to launch a VM so that all resources are most effectively and efficiently utilized and the system performance is optimized. However, based on system operational data obtained from FermiCloud, a private cloud developed by the Fermi National Accelerator Laboratory for scientific workflows, the VM launching overhead is not a constant. It varies with physical resource utilization, such as CPU and I/O device utilizations, at the time when a VM is launched. Hence, to make judicious decisions as to when and where a VM should be launched, a VM launching overhead reference model is needed. In this paper, we first develop a VM launching overhead reference model based on operational data we have obtained on FermiCloud. Second, we apply the developed reference model on FermiCloud and compare calculated VM launching overhead values based on the model with measured overhead values on FermiCloud. Our empirical results on FermiCloud indicate that the developed reference model is accurate. We believe that, with the guidance of the developed reference model, efficient resource allocation algorithms can be developed for the cloud bursting process to minimize the operational cost and resource waste.
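As a rough illustration of what a launching-overhead reference model can look like, the sketch below fits a linear model of overhead versus CPU and I/O utilization; the sample values and the linear form are assumptions made for illustration, not the FermiCloud model or its operational data.

```python
import numpy as np

# Hypothetical operational samples: (CPU utilization, I/O utilization,
# observed VM launch overhead in seconds).  A real model would be fit to logs.
samples = np.array([
    [0.10, 0.05, 22.0],
    [0.35, 0.20, 31.0],
    [0.60, 0.15, 40.0],
    [0.80, 0.50, 68.0],
    [0.90, 0.70, 95.0],
])

X = np.column_stack([np.ones(len(samples)), samples[:, 0], samples[:, 1]])
y = samples[:, 2]

# Least-squares fit: overhead ~ b0 + b1*cpu + b2*io
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

def predicted_overhead(cpu_util, io_util):
    """Reference-model estimate of launch overhead for a candidate host."""
    return coeffs[0] + coeffs[1] * cpu_util + coeffs[2] * io_util

print(predicted_overhead(0.5, 0.3))
```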
The heliospheric sector boundary as a distended magnetic cloud
NASA Technical Reports Server (NTRS)
Crooker, N. U.; Intriligator, D. S.
1995-01-01
A magnetic cloud was detected both near Earth and by Pioneer 11, located 43 deg east of Earth at 4.8 AU. The magnetic field within the cloud rotated smoothly from toward to away polarity, marking sector boundary passage. Interpreted as a flux rope, the cloud had a vertical axis, implying that its cylindrical cross-section in the ecliptic plane was distended along the sector boundary by at least 43 deg, forming an extensive occlusion in the heliospheric current sheet. At 1 AU the cloud had plasma signatures typical of a fast coronal mass ejection, with low temperature and a leading shock. In contrast, at 4.8 AU, only the cloud signature remained. Its radial dimension was the same at both locations, consistent with little expansion beyond 1 AU. Energetic particle data at 4.8 AU show high fluxes preceding the cloud but not extending forward to the corotating shock that marked entry into the interaction region containing the cloud. The streaming direction was antisunward, consistent with possible acceleration in a low-beta region of field line draping around the cloud's western (upstream) end. The fluxes dropped upon entry into the cloud and became essentially isotropic one third of the way through it. On the basis of sector boundary characteristics published in the past, we suggest that distended clouds may be common heliospheric current sheet occlusions.
NASA Astrophysics Data System (ADS)
Hovey, Luke; Hughes, John P.; McCully, Curtis; Pandya, Viraj; Eriksen, Kristoffer
2018-01-01
We present results from an optical study of two young Balmer-dominated remnants of SN Ia in the Large Magellanic Cloud, 0509-67.5 and 0519-69.0, in an attempt to search for signatures of efficient cosmic-ray (CR) acceleration. We combine proper motion measurements from HST with corresponding optical spectroscopic measurements of the Hα line at multiple rim positions from VLT/FORS2 and SALT/RSS, and compare our results to published Balmer shock models. Analysis of the optical spectra results in broad Hα widths in the range of 1800-4000 km s-1 for twelve separate Balmer-dominated filaments that show no evidence for forbidden line emission; the corresponding shock speeds from proper motion measurements span a range of 1600-8500 km s-1. Our measured values of shock speeds and broad Hα widths in 0509-67.5 and 0519-69.0 are fit well with a Balmer shock model that does not include the effects of efficient CR acceleration. We determine an upper limit of 7%/χ (95% confidence) on the CR acceleration efficiency for our ensemble of data points, where χ is the ionization fraction of the pre-shock gas. The upper limits on the individual remnants are 6%/χ (0509-67.5) and 11%/χ (0519-69.0). These upper limits are below the integrated CR acceleration efficiency in the Tycho supernova remnant, where the shocks predominantly show little Hα emission, indicating that Balmer-dominated shocks are not efficient CR accelerators.
NASA Astrophysics Data System (ADS)
Alby, E.; Elter, R.; Ripoche, C.; Quere, N.; de Strasbourg, INSA
2013-07-01
In a geopolitically very complex context such as the Gaza Strip, the enhancement of an archaeological site must still be addressed. This site is the monastery of St. Hilarion. To enable cultural appropriation of a place with several identified phases of occupation, extensive archaeological excavation must be undertaken. Excavating in this geographical area means carrying out emergency excavations, so the aim of such a project can be questioned for each mission. Real estate pressure also motivates the documentation, because the high population density does not allow systematic subsurface studies before construction projects. It was in fact during the construction of a road that the site was discovered. The site dimensions are 150 m by 80 m. It is located on a sand dune, 300 m from the sea. To implement the survey, four different levels of detail have been defined for terrestrial photogrammetry. The first level concerns objects such as capitals, column fragments, and tiles. Modeling of small objects requires the acquisition of very dense point clouds (density: 1 point / 1 mm on average). The object must then fill as much of the camera sensor as possible, while retaining in the field of view a reference pattern for the scaling of the generated point cloud. The pictures are taken at a short distance from the object, using the images at full resolution. The main obstacle to the modeling of objects is the presence of noise, partly due to the studied materials (sand, smooth rock), which do not favor the detection of good-quality points of interest. Pre-processing of the cloud must be carried out meticulously, since removing points from the surface of a small object creates a hole and a loss of information useful to the resulting mesh. Level 2 focuses on the stratigraphic units, such as mosaics. The monastery of St. Hilarion contains thirteen pavements, which were documented years ago by silver-halide photographs that were later scanned. The modeling of the pavements aims to obtain a three-dimensional model of each mosaic, in particular to analyze the subsidence to which it may be subjected. The dense point cloud can go further by capturing the geometric shapes of the pavement; the mesh computed from the high-density, colorized point cloud is sufficient for the final rendering. Levels 3 and 4 allow the survey and representation of loci and sectors. Their modeling can be done by meshes that are colored or textured with a generic pattern, but also by geometric primitives. This method requires segmentation into simple geometrical elements and creates a surface geometry by analysis of the sampled points. Statistical tools allow the extraction of planes that meet the operator's requirements, so that the quality of the final rendering can be monitored quantitatively. Each level has constraints on survey accuracy and types of representation, especially from the point clouds, which are detailed in the complete article.
Cloud Response to Arctic Sea Ice Loss and Implications for Feedbacks in the CESM1 Climate Model
NASA Astrophysics Data System (ADS)
Morrison, A.; Kay, J. E.; Chepfer, H.; Guzman, R.; Bonazzola, M.
2017-12-01
Clouds have the potential to accelerate or slow the rate of Arctic sea ice loss through their radiative influence on the surface. Cloud feedbacks can therefore play into Arctic warming as clouds respond to changes in sea ice cover. As the Arctic moves toward an ice-free state, understanding how cloud - sea ice relationships change in response to sea ice loss is critical for predicting the future climate trajectory. From satellite observations we know the effect of present-day sea ice cover on clouds, but how will clouds respond to sea ice loss as the Arctic transitions to a seasonally open water state? In this study we use a lidar simulator to first evaluate cloud - sea ice relationships in the Community Earth System Model (CESM1) against present-day observations (2006-2015). In the current climate, the cloud response to sea ice is well-represented in CESM1: we see no summer cloud response to changes in sea ice cover, but more fall clouds over open water than over sea ice. Since CESM1 is credible for the current Arctic climate, we next assess if our process-based understanding of Arctic cloud feedbacks related to sea ice loss is relevant for understanding future Arctic clouds. In the future Arctic, summer cloud structure continues to be insensitive to surface conditions. As the Arctic warms in the fall, however, the boundary layer deepens and cloud fraction increases over open ocean during each consecutive decade from 2020 - 2100. This study will also explore seasonal changes in cloud properties such as opacity and liquid water path. Results thus far suggest that a positive fall cloud - sea ice feedback exists in the present-day and future Arctic climate.
Towards a 3d Based Platform for Cultural Heritage Site Survey and Virtual Exploration
NASA Astrophysics Data System (ADS)
Seinturier, J.; Riedinger, C.; Mahiddine, A.; Peloso, D.; Boï, J.-M.; Merad, D.; Drap, P.
2013-07-01
This paper presents a 3D platform that enables both cultural heritage site survey and virtual exploration. It provides a single, easy-to-use framework for merging multi-scale 3D measurements based on photogrammetry, documentation produced by experts, and the knowledge of the domains involved, leaving the experts able to extract and choose the relevant information to produce the final survey. Taking into account the interpretation of the real world during the process of archaeological surveys is in fact the main goal of a survey. New advances in photogrammetry and the capability to produce dense 3D point clouds do not by themselves solve the problem of surveys. New opportunities for 3D representation are now available, and we must use them and find new ways to link geometry and knowledge. The new platform is able to efficiently manage and process large 3D data (point sets, meshes) thanks to the implementation of space-partition methods from the state of the art, such as octrees and kd-trees, and can thus interact with dense point clouds (thousands to millions of points) in real time. The semantisation of raw 3D data relies on geometric algorithms such as geodetic path computation, surface extraction from dense point clouds, and geometrical primitive optimization. The platform provides an interface that enables experts to describe geometric representations of objects of interest, such as ashlar blocks, stratigraphic units, or generic items (contours, lines, …), directly on the 3D representation of the site and without explicit links to the underlying algorithms. The platform provides two ways of describing a geometric representation. If oriented photographs are available, the expert can draw geometry on a photograph and the system computes its 3D representation by projection onto the underlying mesh or point cloud. If photographs are not available, or if the expert wants to use only the 3D representation, then object shapes can simply be drawn on it. When 3D representations of objects of a surveyed site are extracted from the mesh, the link with domain-related documentation is made by means of a set of forms designed by experts. Information from these forms is linked with the geometry, such that documentation can be attached to the viewed objects. Additional semantisation methods related to specific domains have been added to the platform. Beyond realistic rendering of the surveyed site, the platform embeds non-photorealistic rendering (NPR) algorithms. These algorithms make it possible to dynamically illustrate objects of interest that are related to knowledge, with specific styles. The whole platform is implemented with a Java framework and relies on a current and effective 3D engine that makes the latest rendering methods available. We illustrate this work on various photogrammetric surveys, in medieval archaeology with the Shawbak castle in Jordan and in underwater archaeology on different marine sites.
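A minimal illustration of the space-partitioning idea mentioned above, using a k-d tree to keep neighbourhood queries on a dense point cloud interactive; SciPy's cKDTree stands in here for the platform's own Java implementation, which the abstract does not describe in detail.

```python
import numpy as np
from scipy.spatial import cKDTree

# A synthetic "dense" point cloud standing in for a photogrammetric survey.
rng = np.random.default_rng(42)
points = rng.uniform(0.0, 10.0, size=(200_000, 3))

tree = cKDTree(points)            # build the spatial index once

# Interactive queries then stay fast: e.g. all points within 5 cm of a pick.
picked = np.array([5.0, 5.0, 5.0])
neighbour_idx = tree.query_ball_point(picked, r=0.05)
print(len(neighbour_idx), "points near the picked location")

# Or the k nearest points to a drawn contour vertex.
dists, idx = tree.query(picked, k=8)
print(dists)
```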
NASA Technical Reports Server (NTRS)
Palm, Stephen P.; Strey, Sara T.; Spinhirne, James; Markus, Thorsten
2010-01-01
Recent satellite lidar measurements of cloud properties spanning a period of 5 years are used to examine a possible connection between Arctic sea ice amount and polar cloud fraction and vertical distribution. We find an anticorrelation between sea ice extent and cloud fraction with maximum cloudiness occurring over areas with little or no sea ice. We also find that over ice-free regions, there is greater low cloud frequency and average optical depth. Most of the optical depth increase is due to the presence of geometrically thicker clouds over water. In addition, our analysis indicates that over the last 5 years, October and March average polar cloud fraction has increased by about 7% and 10%, respectively, as year-average sea ice extent has decreased by 5%-7%. The observed cloud changes are likely due to a number of effects including, but not limited to, the observed decrease in sea ice extent and thickness. Increasing cloud amount and changes in vertical distribution and optical properties have the potential to affect the radiative balance of the Arctic region by decreasing both the upwelling terrestrial longwave radiation and the downward shortwave solar radiation. Because longwave radiation dominates in the long polar winter, the overall effect of increasing low cloud cover is likely a warming of the Arctic and thus a positive climate feedback, possibly accelerating the melting of Arctic sea ice.
The Clouds and the Earth's Radiant Energy System Elevation Bearing Assembly Life Test
NASA Technical Reports Server (NTRS)
Brown, Phillip L.; Miller, James B.; Jones, William R., Jr.; Rasmussen, Kent; Wheeler, Donald R.; Rana, Mauro; Peri, Frank
1999-01-01
The Clouds and the Earth's Radiant Energy System (CERES) elevation scan bearings lubricated with Pennzane SHF X2000 and 2% lead naphthenate (PbNp) were life tested for a seven-year equivalent Low Earth Orbit (LEO) operation. The bearing life assembly was tested continuously at an accelerated and normal rate using the scanning patterns developed for the CERES Earth Observing System AM-1 mission. A post-life-test analysis was performed on the collected data, bearing wear, and lubricant behavior.
Advancing research and applications with lightning detection and mapping systems
NASA Astrophysics Data System (ADS)
MacGorman, Donald R.; Goodman, Steven J.
2011-11-01
Southern Thunder 2011 Workshop; Norman, Oklahoma, 11-14 July 2011 The Southern Thunder 2011 (ST11) Workshop was the fourth in a series intended to accelerate research and operational applications made possible by the expanding availability of ground-based and satellite systems that detect and map all types of lightning (in-cloud and cloud-to-ground). This community workshop, first held in 2004, brings together lightning data providers, algorithm developers, and operational users in government, academia, and industry.
NASA Technical Reports Server (NTRS)
Holmgren, G.; Bostroem, R.; Kelley, M. C.; Kintner, P. M.; Lundin, R.; Fahleson, U. V.; Bering, E. A.; Sheldon, W. R.
1979-01-01
The experiment design, including a description of the diagnostic and chemical release payload, and the general results are given for an auroral process simulation experiment. A drastic increase of the field-aligned charged particle flux was observed over the approximate energy range 10 eV to more than 300 keV, starting about 150 ms after the release and lasting about one second. There is evidence of a second particle burst, starting one second after the release and lasting for tens of seconds, and evidence for a periodic train of particle bursts occurring with a 7.7 second period from 40 to 130 seconds after the release. A transient electric field pulse of 200 mV/m appeared just before the particle flux increase started. Electrostatic wave emissions around 2 kHz, as well as a delayed perturbation of the E-region below the plasma cloud, were also observed. Some of the particle observations are interpreted in terms of field-aligned electrostatic acceleration a few hundred kilometers above the injected plasma cloud. It is suggested that the acceleration electric field was created by an instability driven by field-aligned currents originating in the plasma cloud.
ChalkBoard: Mapping Functions to Polygons
NASA Astrophysics Data System (ADS)
Matlage, Kevin; Gill, Andy
ChalkBoard is a domain specific language for describing images. The ChalkBoard language is uncompromisingly functional and encourages the use of modern functional idioms. ChalkBoard uses off-the-shelf graphics cards to speed up rendering of functional descriptions. In this paper, we describe the design of the core ChalkBoard language, and the architecture of our static image generation accelerator.
Accelerating the Original Profile Kernel.
Hamp, Tobias; Goldberg, Tatyana; Rost, Burkhard
2013-01-01
One of the most accurate multi-class protein classification systems continues to be the profile-based SVM kernel introduced by the Leslie group. Unfortunately, its CPU requirements render it too slow for practical applications of large-scale classification tasks. Here, we introduce several software improvements that enable significant acceleration. Using various non-redundant data sets, we demonstrate that our new implementation reaches a maximal speed-up as high as 14-fold for calculating the same kernel matrix. Some predictions are over 200 times faster, rendering the kernel possibly the top contender in terms of the speed/performance trade-off. Additionally, we explain how to parallelize various computations and provide an integrative program that reduces creating a production-quality classifier to a single program call. The new implementation is available as a Debian package under a free academic license and does not depend on commercial software. For non-Debian based distributions, the source package ships with a traditional Makefile-based installer. Download and installation instructions can be found at https://rostlab.org/owiki/index.php/Fast_Profile_Kernel. Bugs and other issues may be reported at https://rostlab.org/bugzilla3/enter_bug.cgi?product=fastprofkernel.
A modeling analysis program for the JPL Table Mountain Io sodium cloud data
NASA Technical Reports Server (NTRS)
Smyth, W. H.; Goldberg, B. A.
1986-01-01
Progress and achievements in the second year are discussed in three main areas: (1) data quality review of the 1981 Region B/C images; (2) data processing activities; and (3) modeling activities. The data quality review revealed that almost all 1981 Region B/C images are of sufficient quality to be valuable in the analyses of the JPL data set. In the second area, the major milestone reached was the successful development and application of complex image-processing software required to render the original image data suitable for modeling analysis studies. In the third area, the lifetime description of sodium atoms in the planet magnetosphere was improved in the model to include the offset dipole nature of the magnetic field as well as an east-west electric field. These improvements are important in properly representing the basic morphology as well as the east-west asymmetries of the sodium cloud.
Sheldon, Julie; Perales, Celia
2012-01-01
Summary: Evolution of RNA viruses occurs through disequilibria of collections of closely related mutant spectra or mutant clouds termed viral quasispecies. Here we review the origin of the quasispecies concept and some biological implications of quasispecies dynamics. Two main aspects are addressed: (i) mutant clouds as reservoirs of phenotypic variants for virus adaptability and (ii) the internal interactions that are established within mutant spectra that render a virus ensemble the unit of selection. The understanding of viruses as quasispecies has led to new antiviral designs, such as lethal mutagenesis, whose aim is to drive viruses toward low fitness values with limited chances of fitness recovery. The impact of quasispecies for three salient human pathogens, human immunodeficiency virus and the hepatitis B and C viruses, is reviewed, with emphasis on antiviral treatment strategies. Finally, extensions of quasispecies to nonviral systems are briefly mentioned to emphasize the broad applicability of quasispecies theory. PMID:22688811
Typograph: Multiscale Spatial Exploration of Text Documents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander; Burtner, Edwin R.; Cramer, Nicholas O.
2013-12-01
Visualizing large document collections using a spatial layout of terms can enable quick overviews of information. However, these metaphors (e.g., word clouds, tag clouds, etc.) often lack interactivity for exploring the information, and the location and rendering of the terms are often not based on mathematical models that maintain relative distances from other information based on similarity metrics. Further, transitioning between levels of detail (i.e., from terms to full documents) can be challenging. In this paper, we present Typograph, a multi-scale spatial exploration visualization for large document collections. Building on term-based visualization methods, Typograph enables multiple levels of detail (terms, phrases, snippets, and full documents) within a single spatialization. Further, the information is placed based on its relative similarity to other information, to create the "near = similar" geography metaphor. This paper discusses the design principles and functionality of Typograph and presents a use case analyzing Wikipedia to demonstrate usage.
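The "near = similar" placement described above is essentially a similarity-preserving embedding. A minimal sketch with scikit-learn follows; the tiny corpus and the choice of TF-IDF plus metric MDS are assumptions for illustration, not Typograph's actual layout engine.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_distances
from sklearn.manifold import MDS

documents = [
    "cloud rendering on graphics hardware",
    "gpu accelerated volume rendering",
    "arctic sea ice and cloud fraction",
    "sea ice extent and polar clouds",
]

# Term weights, then pairwise dissimilarities between documents.
tfidf = TfidfVectorizer().fit_transform(documents)
dissim = cosine_distances(tfidf)

# Metric MDS places each item so that screen distance tracks dissimilarity.
layout = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissim)
print(np.round(layout, 2))
```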
Techniques for the measurements of the line of sight velocity of high altitude Barium clouds
NASA Technical Reports Server (NTRS)
Mende, S. B.
1981-01-01
It is demonstrated that, to maximize the scientific output of future ion cloud release experiments, a new type of instrument is required which will measure the line-of-sight velocity of the ion cloud by the Doppler technique. A simple instrument was constructed using a 5 cm diameter solid Fabry-Perot etalon coupled to a low-light-level integrating television camera. It was demonstrated that the system has both the sensitivity and the spectral resolution needed for the detection of ion clouds and the measurement of their line-of-sight Doppler velocity. The tests consisted of (1) a field experiment using a rocket barium cloud release to check the sensitivity, and (2) laboratory experiments to show the spectral resolving capabilities of the system. The instrument was found to be operational if the source was brighter than about 1 kilorayleigh, and it had a wavelength resolution much better than 0.2 Å, which corresponds to about 12 km/s or an acceleration potential of 100 volts.
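The quoted resolution figures follow from the non-relativistic Doppler relation Δv = c Δλ/λ. A short check, assuming the Ba II resonance line near 4934 Å (the abstract does not state which emission line was used):

```python
C_KM_S = 2.998e5          # speed of light, km/s
LAMBDA_A = 4934.0         # assumed Ba II line wavelength, Angstrom
DELTA_LAMBDA_A = 0.2      # instrument wavelength resolution, Angstrom

delta_v = C_KM_S * DELTA_LAMBDA_A / LAMBDA_A
print(f"velocity resolution ~ {delta_v:.1f} km/s")          # about 12 km/s

# Equivalent energy of a singly charged barium ion moving at that speed.
M_BA_KG = 137.33 * 1.6605e-27
E_JOULE = 0.5 * M_BA_KG * (delta_v * 1e3) ** 2
print(f"~ {E_JOULE / 1.602e-19:.0f} eV, i.e. roughly a 100 V potential")
```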
Observation of Gigawatt-Class THz Pulses from a Compact Laser-Driven Particle Accelerator
NASA Astrophysics Data System (ADS)
Gopal, A.; Herzer, S.; Schmidt, A.; Singh, P.; Reinhard, A.; Ziegler, W.; Brömmel, D.; Karmakar, A.; Gibbon, P.; Dillner, U.; May, T.; Meyer, H.-G.; Paulus, G. G.
2013-08-01
We report the observation of subpicosecond terahertz (T-ray) pulses with energies ≥460 μJ from a laser-driven ion accelerator, thus rendering the peak power of the source higher even than that of state-of-the-art synchrotrons. Experiments were performed with intense laser pulses (up to 5×10^19 W/cm^2) to irradiate thin metal foil targets. Ion spectra measured simultaneously showed a square-law dependence of the T-ray yield on particle number. Two-dimensional particle-in-cell simulations show the presence of transient currents at the target rear surface which could be responsible for the strong T-ray emission.
SYNCHROTRON RADIO FREQUENCY PHASE CONTROL SYSTEM
Plotkin, M.; Raka, E.C.; Snyder, H.S.
1963-05-01
A system for canceling varying phase changes introduced by connecting cables and control equipment in an alternating gradient synchrotron is presented. In a specific synchrotron embodiment, twelve spaced accelerating stations for the proton bunches are utilized. To ensure that the protons receive their boost or kick at the exact instant required, it is necessary to compensate for phase changes occurring in the r-f circuitry over the wide range of frequencies dictated by the accelerated velocities of the proton bunches. A constant beat frequency is utilized to transfer the r-f control signals through the cables and control equipment to render the phase shift constant and readily compensable. (AEC)
FERMI LARGE AREA TELESCOPE OBSERVATIONS OF THE SUPERNOVA REMNANT G8.7-0.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ajello, M.; Allafort, A.; Bechtol, K.
We present a detailed analysis of the GeV gamma-ray emission toward the supernova remnant (SNR) G8.7-0.1 with the Large Area Telescope (LAT) on board the Fermi Gamma-ray Space Telescope. An investigation of the relationship between G8.7-0.1 and the TeV unidentified source HESS J1804-216 provides us with an important clue on the diffusion process of cosmic rays if particle acceleration operates in the SNR. The GeV gamma-ray emission is extended, with most of the emission in positional coincidence with the SNR G8.7-0.1 and a lesser part located outside the western boundary of G8.7-0.1. The region of the gamma-ray emission overlaps spatially connected molecular clouds, implying a physical connection for the gamma-ray structure. The total gamma-ray spectrum measured with the LAT from 200 MeV to 100 GeV can be described by a broken power-law function with a break of 2.4 ± 0.6 (stat) ± 1.2 (sys) GeV, and photon indices of 2.10 ± 0.06 (stat) ± 0.10 (sys) below the break and 2.70 ± 0.12 (stat) ± 0.14 (sys) above the break. Given the spatial association among the gamma rays, the radio emission of G8.7-0.1, and the molecular clouds, the decay of π0s produced by particles accelerated in the SNR and hitting the molecular clouds naturally explains the GeV gamma-ray spectrum. We also find that the GeV morphology is not well represented by the TeV emission from HESS J1804-216 and that the spectrum in the GeV band is not consistent with the extrapolation of the TeV gamma-ray spectrum. The spectral index of the TeV emission is consistent with the particle spectral index predicted by a theory that assumes energy-dependent diffusion of particles accelerated in an SNR. We discuss the possibility that the TeV spectrum originates from the interaction of particles accelerated in G8.7-0.1 with molecular clouds, and we constrain the diffusion coefficient of the particles.
Particle-in-cell/accelerator code for space-charge dominated beam simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-05-08
Warp is a multidimensional discrete-particle beam simulation program designed to be applicable where the beam space-charge is non-negligible or dominant. It is being developed in a collaboration among LLNL, LBNL, and the University of Maryland. It was originally designed and optimized for heavy ion fusion accelerator physics studies, but has received use in a broader range of applications, including, for example, laser wakefield accelerators, e-cloud studies in high energy accelerators, particle traps, and other areas. At present it incorporates 3-D, axisymmetric (r,z), planar (x-z), and transverse slice (x,y) descriptions, with both electrostatic and electromagnetic fields, and a beam envelope model. The code is built atop the Python interpreter language.
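To illustrate the particle-in-cell cycle that codes such as Warp implement (charge deposition, field solve, gather, push), here is a deliberately minimal 1-D electrostatic sketch with periodic boundaries; it is a toy model written for this summary, not Warp's API or numerics.

```python
import numpy as np

# Toy 1-D electrostatic PIC with periodic boundaries (normalized units).
ng, npart, L, dt, steps = 64, 10_000, 1.0, 0.05, 50
dx = L / ng
rng = np.random.default_rng(1)
x = rng.uniform(0.0, L, npart)              # electron positions
v = rng.normal(0.0, 0.01, npart)            # electron velocities
q_over_m, weight = -1.0, L / npart          # charge/mass and macroparticle weight

k = 2.0 * np.pi * np.fft.rfftfreq(ng, dx)   # angular wavenumbers for the field solve

for _ in range(steps):
    # 1) deposit charge on the grid (nearest-grid-point, plus neutralizing ions)
    cells = (x / dx).astype(int) % ng
    rho = 1.0 - np.bincount(cells, minlength=ng) * weight / dx
    # 2) solve Gauss's law dE/dx = rho in Fourier space
    rho_k = np.fft.rfft(rho)
    E_k = np.zeros_like(rho_k)
    E_k[1:] = -1j * rho_k[1:] / k[1:]
    E_grid = np.fft.irfft(E_k, n=ng)
    # 3) gather the field to the particles and push them (simple explicit update)
    v += q_over_m * E_grid[cells] * dt
    x = (x + v * dt) % L

print("field energy ~", 0.5 * np.sum(E_grid**2) * dx)
```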
Laboratory investigation of dust impacts on antennas in space
NASA Astrophysics Data System (ADS)
Sternovsky, Zoltan; Malaspina, D.; Gruen, E.; Drake, K.
2013-10-01
Recent observations of sharp voltage spikes by the WAVES electric field experiments onboard the twin STEREO spacecraft have been attributed to plasma clouds generated by the impact ionization of high-velocity dust particles. The reported dust fluxes are much higher than those measured by dedicated dust detectors at 1 AU, which leads to the interpretation that the STEREO observations are due to nanometer-sized dust particles originating from the inner solar system and accelerated to high velocities by the solar wind magnetic field. However, this interpretation is based on a simplified model of the coupling between the expanding plasma cloud from the dust impact and the WAVES electric field instrument. A series of laboratory measurements are performed to validate this model and to calibrate and investigate the effect of various impact parameters on the signals measured by the electric field instrument. The dust accelerator facility operating at the University of Colorado is used for the measurements, with micron- and submicron-sized particles accelerated to 50 km/s. The first set of measurements is performed to calibrate the impact charge generated from materials specific to the STEREO spacecraft and will help to interpret the electric field data.
Neronov, Andrii
2017-11-10
Cosmic rays could be produced via shock acceleration powered by supernovae. The supernova hypothesis implies that each supernova injects, on average, some 10^{50} erg in cosmic rays, while the shock acceleration model predicts a power law cosmic ray spectrum with the slope close to 2. Verification of these predictions requires measurement of the spectrum and power of cosmic ray injection from supernova population(s). Here, we obtain such measurements based on γ-ray observation of the Constellation III region of the Large Magellanic Cloud. We show that γ-ray emission from this young star formation region originates from cosmic rays injected by approximately two thousand supernovae, rather than by a massive star wind powered by a superbubble predating supernova activity. Cosmic ray injection power is found to be (1.1_{-0.2}^{+0.5})×10^{50} erg/supernova (for the estimated interstellar medium density 0.3 cm^{-3}). The spectrum is a power law with slope 2.09_{-0.07}^{+0.06}. This agrees with the model of particle acceleration at supernova shocks and provides a direct proof of the supernova origin of cosmic rays.
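For orientation, the quoted injection power and spectral slope are tied together by a simple integral: if dN/dE = A E^(-p) between assumed energy bounds, the normalization A follows from the total energy per supernova. A short sketch follows; the 1 GeV to 3 PeV bounds are assumptions for illustration, not values taken from the paper.

```python
import numpy as np

ERG_PER_GEV = 1.602e-3      # 1 GeV expressed in erg

def powerlaw_norm(W_erg, p, E_min_GeV, E_max_GeV):
    """Normalization A of dN/dE = A * E**(-p) (particles per GeV) such that
    the total kinetic energy between E_min and E_max equals W_erg."""
    W_GeV = W_erg / ERG_PER_GEV
    integral = (E_max_GeV**(2.0 - p) - E_min_GeV**(2.0 - p)) / (2.0 - p)
    return W_GeV / integral

# 1.1e50 erg per supernova with slope 2.09, integrated over assumed bounds.
A = powerlaw_norm(W_erg=1.1e50, p=2.09, E_min_GeV=1.0, E_max_GeV=3.0e6)
print(f"A ~ {A:.2e} particles/GeV at E = 1 GeV")
```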
NASA Technical Reports Server (NTRS)
Palm, Stephen P.; Strey, Sara T.; Spinhirne, James; Markus, Thorsten
2010-01-01
Recent satellite lidar measurements of cloud properties spanning a period of five years are used to examine a possible connection between Arctic sea ice amount and polar cloud fraction and vertical distribution. We find an anti-correlation between sea ice extent and cloud fraction with maximum cloudiness occurring over areas with little or no sea ice. We also find that over ice free regions, there is greater low cloud frequency and average optical depth. Most of the optical depth increase is due to the presence of geometrically thicker clouds over water. In addition, our analysis indicates that over the last 5 years, October and March average polar cloud fraction has increased by about 7 and 10 percent, respectively, as year average sea ice extent has decreased by 5 to 7 percent. The observed cloud changes are likely due to a number of effects including, but not limited to, the observed decrease in sea ice extent and thickness. Increasing cloud amount and changes in vertical distribution and optical properties have the potential to affect the radiative balance of the Arctic region by decreasing both the upwelling terrestrial longwave radiation and the downward shortwave solar radiation. Since longwave radiation dominates in the long polar winter, the overall effect of increasing low cloud cover is likely a warming of the Arctic and thus a positive climate feedback, possibly accelerating the melting of Arctic sea ice.
Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering
NASA Astrophysics Data System (ADS)
Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki
2018-03-01
We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
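The combination strategy described above is, in essence, multiple importance sampling. A minimal sketch of the standard balance-heuristic weighting for two sampling strategies is shown below; it is the generic textbook formulation, not necessarily the exact weighting used in the paper.

```python
import numpy as np

def balance_heuristic(n_a, pdf_a, n_b, pdf_b):
    """Balance-heuristic weight for a sample drawn from strategy A."""
    return (n_a * pdf_a) / (n_a * pdf_a + n_b * pdf_b)

def mis_estimate(f, sample_a, pdf_a, pdf_b_at_a, sample_b, pdf_b, pdf_a_at_b):
    """Combine estimators from two strategies (e.g. stratified ambient samples
    and importance samples around bright directions) for the integral of f."""
    n_a, n_b = len(sample_a), len(sample_b)
    w_a = balance_heuristic(n_a, pdf_a, n_b, pdf_b_at_a)
    w_b = balance_heuristic(n_b, pdf_b, n_a, pdf_a_at_b)
    est_a = np.mean(w_a * f(sample_a) / pdf_a)
    est_b = np.mean(w_b * f(sample_b) / pdf_b)
    return est_a + est_b

# Toy use: integrate f(x) = x**2 on [0, 1] with a uniform and a linear pdf.
rng = np.random.default_rng(0)
xa = rng.uniform(0, 1, 512);           pa = np.ones_like(xa)     # p(x) = 1
xb = np.sqrt(rng.uniform(0, 1, 512));  pb = 2.0 * xb             # p(x) = 2x
print(mis_estimate(lambda x: x**2, xa, pa, 2.0 * xa, xb, pb, np.ones_like(xb)))
```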
3D image display of fetal ultrasonic images by thin shell
NASA Astrophysics Data System (ADS)
Wang, Shyh-Roei; Sun, Yung-Nien; Chang, Fong-Ming; Jiang, Ching-Fen
1999-05-01
Owing to its convenience and non-invasiveness, ultrasound has become an essential tool for the diagnosis of fetal abnormalities during pregnancy in obstetrics. However, the noisy and blurry nature of ultrasound data makes rendering the data a challenge in comparison with MRI and CT images. Beyond the speckle noise, unwanted objects usually occlude the target to be observed. In this paper, we propose a new system that can effectively suppress the speckle noise, extract the target object, and clearly render the 3D fetal image in almost real time from 3D ultrasound image data. The system is based on a deformable model that detects contours of the object according to the local image features of ultrasound. Besides, in order to accelerate rendering speed, a thin shell is defined to separate the observed organ from unrelated structures depending on those detected contours. In this way, we can support quick 3D display of ultrasound, and the efficient visualization of 3D fetal ultrasound thus becomes possible.
Volumetric visualization algorithm development for an FPGA-based custom computing machine
NASA Astrophysics Data System (ADS)
Sallinen, Sami J.; Alakuijala, Jyrki; Helminen, Hannu; Laitinen, Joakim
1998-05-01
Rendering volumetric medical images is a burdensome computational task for contemporary computers due to the large size of the data sets. Custom designed reconfigurable hardware could considerably speed up volume visualization if an algorithm suitable for the platform is used. We present an algorithm and speedup techniques for visualizing volumetric medical CT and MR images with a custom-computing machine based on a Field Programmable Gate Array (FPGA). We also present simulated performance results of the proposed algorithm calculated with a software implementation running on a desktop PC. Our algorithm is capable of generating perspective projection renderings of single and multiple isosurfaces with transparency, simulated X-ray images, and Maximum Intensity Projections (MIP). Although more speedup techniques exist for parallel projection than for perspective projection, we have constrained ourselves to perspective viewing, because of its importance in the field of radiotherapy. The algorithm we have developed is based on ray casting, and the rendering is sped up by three different methods: shading speedup by gradient precalculation, a new generalized version of Ray-Acceleration by Distance Coding (RADC), and background ray elimination by speculative ray selection.
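For readers unfamiliar with ray casting, the sketch below shows the core loop of a Maximum Intensity Projection through a voxel volume; it uses an orthographic geometry and nearest-neighbour sampling for brevity, whereas the algorithm in the paper uses perspective rays together with the RADC and gradient-precalculation speedups.

```python
import numpy as np

def mip_render(volume, n_steps=128):
    """Orthographic Maximum Intensity Projection along the z axis.

    volume : (nx, ny, nz) array of scalar voxel values.
    Returns an (nx, ny) image where each pixel holds the maximum value
    encountered along its ray.
    """
    nx, ny, nz = volume.shape
    image = np.zeros((nx, ny), dtype=volume.dtype)
    # March every ray front-to-back in equal steps (nearest-neighbour samples).
    zs = np.linspace(0, nz - 1, n_steps).round().astype(int)
    for z in zs:
        np.maximum(image, volume[:, :, z], out=image)
    return image

# Toy volume: a bright sphere embedded in noise, standing in for CT data.
grid = np.indices((64, 64, 64)) - 32
sphere = (np.sqrt((grid**2).sum(axis=0)) < 12).astype(float)
volume = sphere + 0.1 * np.random.default_rng(0).random((64, 64, 64))
print(mip_render(volume).max())
```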
Atomic References for Measuring Small Accelerations
NASA Technical Reports Server (NTRS)
Maleki, Lute; Yu, Nan
2009-01-01
Accelerometer systems that would combine the best features of both conventional (e.g., mechanical) accelerometers and atom interferometer accelerometers (AIAs) have been proposed. These systems are intended mainly for use in scientific research aboard spacecraft but may also be useful on Earth in special military, geological, and civil-engineering applications. Conventional accelerometers can be sensitive, can have high dynamic range, and can have high frequency response, but they lack accuracy and long-term stability. AIAs have low frequency response, but they offer high sensitivity, and high accuracy for measuring small accelerations. In a system according to the proposal, a conventional accelerometer would be used to perform short-term measurements of higher-frequency components of acceleration, while an AIA would be used to provide consistent calibration of, and correction of errors in, the measurements of the conventional accelerometer in the lower-frequency range over the long term. A brief description of an AIA is prerequisite to a meaningful description of a system according to the proposal. An AIA includes a retroreflector next to one end of a cell that contains a cold cloud of atoms in an ultrahigh vacuum. The atoms in the cloud are in free fall. The retroreflector is mounted on the object, the acceleration of which is to be measured. Raman laser beams are directed through the cell from the end opposite the retroreflector, then pass back through the cell after striking the retroreflector. The Raman laser beams together with the cold atoms measure the relative acceleration, through the readout of the AIA, between the cold atoms and the retroreflector.
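One way to realize the proposed division of labour is a complementary estimator in which the conventional accelerometer supplies the high-rate signal and the slower atom-interferometer readings continuously re-estimate its bias. The sketch below is a simple first-order illustration under assumed rates and noise levels, not the proposed flight design.

```python
import numpy as np

rng = np.random.default_rng(3)
fs, duration = 100.0, 60.0                        # conventional sensor: 100 Hz
t = np.arange(0.0, duration, 1.0 / fs)
true_accel = 1e-6 * np.sin(2 * np.pi * 0.2 * t)   # m/s^2, slowly varying signal

# Conventional accelerometer: fast but with a drifting bias.
bias = 5e-7 + 2e-9 * t
conv = true_accel + bias + 2e-8 * rng.standard_normal(t.size)

# Atom interferometer: one accurate, essentially unbiased reading every 2 s.
aia_period = 2.0
bias_est, alpha = 0.0, 0.2                        # slow bias-tracking gain
fused = np.empty_like(conv)
next_aia = 0.0
for i, ti in enumerate(t):
    if ti >= next_aia:                            # new AIA measurement available
        aia = true_accel[i] + 1e-9 * rng.standard_normal()
        bias_est += alpha * ((conv[i] - aia) - bias_est)
        next_aia += aia_period
    fused[i] = conv[i] - bias_est                 # bias-corrected high-rate output

print("residual bias ~", abs(np.mean(fused - true_accel)))
```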
Knowledge engineering for PACES, the particle accelerator control expert system
NASA Astrophysics Data System (ADS)
Lind, P. C.; Poehlman, W. F. S.; Stark, J. W.; Cousins, T.
1992-04-01
The KN-3000 used at Defense Research Establishment Ottawa is a Van de Graaff particle accelerator employed primarily to produce monoenergetic neutrons for calibrating radiation detectors. To provide training and assistance for new operators, it was decided to develop an expert system for accelerator operation. Knowledge engineering aspects of the expert system are reviewed. Two important issues are involved: the need to encapsulate expert knowledge into the system in a form that facilitates automatic accelerator operation and to partition the system so that time-consuming inferencing is minimized in favor of faster, more algorithmic control. It is seen that accelerator control will require fast, narrowminded decision making for rapid fine tuning, but slower and broader reasoning for machine startup, shutdown, fault diagnosis, and correction. It is also important to render the knowledge base in a form conducive to operator training. A promising form of the expert system involves a hybrid system in which high level reasoning is performed on the host machine that interacts with the user, while an embedded controller employs neural networks for fast but limited adjustment of accelerator performance. This partitioning of duty facilitates a hierarchical chain of command yielding an effective mixture of speed and reasoning ability.
The eight micron band of silicon monoxide in the expanding cloud around VY Canis Majoris
NASA Technical Reports Server (NTRS)
Geballe, T. R.; Lacy, J. H.; Beck, S. C.
1978-01-01
Observations of vibration-rotation transitions of silicon monoxide in VY CMa show that the lines originate in accelerating, expanding, and cool (600 K) layers of a circumstellar cloud at a distance of roughly 0.15 arcsec from the central star. The central stellar velocity, as estimated from observed SiO P Cygni line profiles, is somewhat redshifted from the midpoint of the maser emission features. Most of the silicon is probably in the form of dust grains. The isotopic ratios of silicon are nearly terrestrial.
The 8 micron band of silicon monoxide in the expanding cloud around VY Canis Majoris
NASA Technical Reports Server (NTRS)
Geballe, T. R.; Lacy, J. H.; Beck, S. C.
1979-01-01
Observations of vibration-rotation transitions of silicon monoxide in VY CMa show that the lines originate in accelerating, expanding, and cool (about 600 K) layers of a circumstellar cloud at a distance of approximately 0.15 arcsec from the central star. The central stellar velocity, as estimated from observed SiO P Cygni line profiles, is somewhat redshifted from the midpoint of the maser emission features. Most of the silicon is probably in the form of dust grains. The isotopic ratios of silicon are nearly terrestrial.
NASA Astrophysics Data System (ADS)
Kintsakis, Athanassios M.; Psomopoulos, Fotis E.; Symeonidis, Andreas L.; Mitkas, Pericles A.
Hermes introduces a new "describe once, run anywhere" paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.
Impacts of solar-absorbing aerosol layers on the transition of stratocumulus to trade cumulus clouds
Zhou, Xiaoli; Ackerman, Andrew S.; Fridlind, Ann M.; ...
2017-10-26
Here, the effects of an initially overlying layer of solar-absorbing aerosol on the transition of stratocumulus to trade cumulus clouds are examined using large-eddy simulations. For lightly drizzling cloud the transition is generally hastened, resulting mainly from increased cloud droplet number concentration (Nc) induced by entrained aerosol. The increased Nc slows sedimentation of cloud droplets and shortens their relaxation time for diffusional growth, both of which accelerate entrainment of overlying air and thereby stratocumulus breakup. However, the decrease in albedo from cloud breakup is more than offset by redistributing cloud water over a greater number of droplets, such that the diurnal-average shortwave forcing at the top of the atmosphere is negative. The negative radiative forcing is enhanced by sizable longwave contributions, which result from the greater cloud breakup and a reduced boundary layer height associated with aerosol heating. A perturbation of moisture instead of aerosol aloft leads to a greater liquid water path and a more gradual transition. Adding absorbing aerosol to that atmosphere results in substantial reductions in liquid water path (LWP) and cloud cover that lead to positive shortwave and negative longwave forcings on average canceling each other. Only for heavily drizzling clouds is the breakup delayed, as inhibition of precipitation overcomes cloud water loss from enhanced entrainment. Considering these simulations as an imperfect proxy for biomass burning plumes influencing Namibian stratocumulus, we expect regional indirect plus semi-direct forcings to be substantially negative to negligible at the top of the atmosphere, with its magnitude sensitive to background and perturbation properties.
Impacts of solar-absorbing aerosol layers on the transition of stratocumulus to trade cumulus clouds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Xiaoli; Ackerman, Andrew S.; Fridlind, Ann M.
Here, the effects of an initially overlying layer of solar-absorbing aerosol on the transition of stratocumulus to trade cumulus clouds are examined using large-eddy simulations. For lightly drizzling cloud the transition is generally hastened, resulting mainly from increased cloud droplet number concentration (Nc) induced by entrained aerosol. The increased Nc slows sedimentation of cloud droplets and shortens their relaxation time for diffusional growth, both of which accelerate entrainment of overlying air and thereby stratocumulus breakup. However, the decrease in albedo from cloud breakup is more than offset by redistributing cloud water over a greater number of droplets, such that the diurnal-average shortwave forcing at the top of the atmosphere is negative. The negative radiative forcing is enhanced by sizable longwave contributions, which result from the greater cloud breakup and a reduced boundary layer height associated with aerosol heating. A perturbation of moisture instead of aerosol aloft leads to a greater liquid water path and a more gradual transition. Adding absorbing aerosol to that atmosphere results in substantial reductions in liquid water path (LWP) and cloud cover that lead to positive shortwave and negative longwave forcings on average canceling each other. Only for heavily drizzling clouds is the breakup delayed, as inhibition of precipitation overcomes cloud water loss from enhanced entrainment. Considering these simulations as an imperfect proxy for biomass burning plumes influencing Namibian stratocumulus, we expect regional indirect plus semi-direct forcings to be substantially negative to negligible at the top of the atmosphere, with its magnitude sensitive to background and perturbation properties.
Impacts of solar-absorbing aerosol layers on the transition of stratocumulus to trade cumulus clouds
NASA Astrophysics Data System (ADS)
Zhou, Xiaoli; Ackerman, Andrew S.; Fridlind, Ann M.; Wood, Robert; Kollias, Pavlos
2017-10-01
The effects of an initially overlying layer of solar-absorbing aerosol on the transition of stratocumulus to trade cumulus clouds are examined using large-eddy simulations. For lightly drizzling cloud the transition is generally hastened, resulting mainly from increased cloud droplet number concentration (Nc) induced by entrained aerosol. The increased Nc slows sedimentation of cloud droplets and shortens their relaxation time for diffusional growth, both of which accelerate entrainment of overlying air and thereby stratocumulus breakup. However, the decrease in albedo from cloud breakup is more than offset by redistributing cloud water over a greater number of droplets, such that the diurnal-average shortwave forcing at the top of the atmosphere is negative. The negative radiative forcing is enhanced by sizable longwave contributions, which result from the greater cloud breakup and a reduced boundary layer height associated with aerosol heating. A perturbation of moisture instead of aerosol aloft leads to a greater liquid water path and a more gradual transition. Adding absorbing aerosol to that atmosphere results in substantial reductions in liquid water path (LWP) and cloud cover that lead to positive shortwave and negative longwave forcings on average canceling each other. Only for heavily drizzling clouds is the breakup delayed, as inhibition of precipitation overcomes cloud water loss from enhanced entrainment. Considering these simulations as an imperfect proxy for biomass burning plumes influencing Namibian stratocumulus, we expect regional indirect plus semi-direct forcings to be substantially negative to negligible at the top of the atmosphere, with its magnitude sensitive to background and perturbation properties.
Impacts of Solar-Absorbing Aerosol Layers on the Transition of Stratocumulus to Trade Cumulus Clouds
NASA Technical Reports Server (NTRS)
Zhou, Xiaoli; Ackerman, Andrew S.; Fridlind, Ann M.; Wood, Robert; Kollias, Pavlos
2017-01-01
The effects of an initially overlying layer of solar-absorbing aerosol on the transition of stratocumulus to trade cumulus clouds are examined using large-eddy simulations. For lightly drizzling cloud the transition is generally hastened, resulting mainly from increased cloud droplet number concentration (Nc) induced by entrained aerosol. The increased Nc slows sedimentation of cloud droplets and shortens their relaxation time for diffusional growth, both of which accelerate entrainment of overlying air and thereby stratocumulus breakup. However, the decrease in albedo from cloud breakup is more than offset by redistributing cloud water over a greater number of droplets, such that the diurnal-average shortwave forcing at the top of the atmosphere is negative. The negative radiative forcing is enhanced by sizable longwave contributions, which result from the greater cloud breakup and a reduced boundary layer height associated with aerosol heating. A perturbation of moisture instead of aerosol aloft leads to a greater liquid water path and a more gradual transition. Adding absorbing aerosol to that atmosphere results in substantial reductions in liquid water path (LWP) and cloud cover that lead to positive short-wave and negative longwave forcings on average canceling each other. Only for heavily drizzling clouds is the breakup delayed, as inhibition of precipitation overcomes cloud water loss from enhanced entrainment. Considering these simulations as an imperfect proxy for biomass burning plumes influencing Namibian stratocumulus, we expect regional indirect plus semi-direct forcings to be substantially negative to negligible at the top of the atmosphere, with its magnitude sensitive to background and perturbation properties.
Enhanced PM2.5 pollution in China due to aerosol-cloud interactions.
Zhao, Bin; Liou, Kuo-Nan; Gu, Yu; Li, Qinbin; Jiang, Jonathan H; Su, Hui; He, Cenlin; Tseng, Hsien-Liang R; Wang, Shuxiao; Liu, Run; Qi, Ling; Lee, Wei-Liang; Hao, Jiming
2017-06-30
Aerosol-cloud interactions (aerosol indirect effects) play an important role in regional meteorological variations, which could further induce feedback on regional air quality. While the impact of aerosol-cloud interactions on meteorology and climate has been extensively studied, their feedback on air quality remains unclear. Using a fully coupled meteorology-chemistry model, we find that increased aerosol loading due to anthropogenic activities in China substantially increases column cloud droplet number concentration and liquid water path (LWP), which further leads to a reduction in the downward shortwave radiation at surface, surface air temperature and planetary boundary layer (PBL) height. The shallower PBL and accelerated cloud chemistry due to larger LWP in turn enhance the concentrations of particulate matter with diameter less than 2.5 μm (PM2.5) by up to 33.2 μg m^-3 (25.1%) and 11.0 μg m^-3 (12.5%) in January and July, respectively. Such a positive feedback amplifies the changes in PM2.5 concentrations, indicating an additional air quality benefit under effective pollution control policies but a penalty for a region with a deterioration in PM2.5 pollution. Additionally, we show that the cloud processing of aerosols, including wet scavenging and cloud chemistry, could also have substantial effects on PM2.5 concentrations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pivi, M.T.F.; Collet, G.; King, F.
Beam instability caused by the electron cloud has been observed in positron and proton storage rings and it is expected to be a limiting factor in the performance of the positron Damping Ring (DR) of future Linear Colliders (LC) such as ILC and CLIC. To test a series of promising possible electron cloud mitigation techniques, such as surface coatings and grooves, in the Positron Low Energy Ring (LER) of the PEP-II accelerator, we have installed several test vacuum chambers including (i) a special chamber to monitor the variation of the secondary electron yield of technical surface materials and coatings under the effect of ion, electron and photon conditioning in situ in the beam line; (ii) chambers with grooves in a straight magnetic-free section; and (iii) coated chambers in a dedicated newly installed 4-magnet chicane to study mitigations in a magnetic field region. In this paper, we describe the ongoing R&D effort to mitigate the electron cloud effect for the LC damping ring, focusing on the first experimental area and on results of the reduction of the secondary electron yield due to in situ conditioning.
Cloud-In-Cell modeling of shocked particle-laden flows at a "SPARSE" cost
NASA Astrophysics Data System (ADS)
Taverniers, Soren; Jacobs, Gustaaf; Sen, Oishik; Udaykumar, H. S.
2017-11-01
A common tool for enabling process-scale simulations of shocked particle-laden flows is Eulerian-Lagrangian Particle-Source-In-Cell (PSIC) modeling where each particle is traced in its Lagrangian frame and treated as a mathematical point. Its dynamics are governed by Stokes drag corrected for high Reynolds and Mach numbers. The computational burden is often reduced further through a "Cloud-In-Cell" (CIC) approach which amalgamates groups of physical particles into computational "macro-particles". CIC does not account for subgrid particle fluctuations, leading to erroneous predictions of cloud dynamics. A Subgrid Particle-Averaged Reynolds-Stress Equivalent (SPARSE) model is proposed that incorporates subgrid interphase velocity and temperature perturbations. A bivariate Gaussian source distribution, whose covariance captures the cloud's deformation to first order, accounts for the particles' momentum and energy influence on the carrier gas. SPARSE is validated by conducting tests on the interaction of a particle cloud with the accelerated flow behind a shock. The cloud's average dynamics and its deformation over time predicted with SPARSE converge to their counterparts computed with reference PSIC models as the number of Gaussians is increased from 1 to 16. This work was supported by AFOSR Grant No. FA9550-16-1-0008.
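The abstract describes collapsing groups of physical particles into macro-particles whose momentum and energy influence on the carrier gas is deposited through a bivariate Gaussian whose covariance captures the cloud's deformation to first order. The sketch below illustrates that idea under the assumption of planar (N x 2) particle data; it is a simplified stand-in for discussion, not the SPARSE implementation.

import numpy as np

def amalgamate_macro_particle(positions, velocities, temperatures):
    """Collapse a group of physical particles (positions and velocities are
    (N, 2) arrays) into one CIC-style macro-particle: mean state plus the
    position covariance that defines its bivariate Gaussian footprint."""
    mean_pos = positions.mean(axis=0)
    mean_vel = velocities.mean(axis=0)
    mean_temp = float(np.mean(temperatures))
    cov = np.cov(positions, rowvar=False)    # 2 x 2 covariance tracks cloud deformation
    return mean_pos, mean_vel, mean_temp, cov

def gaussian_source_weights(grid_x, grid_y, mean_pos, cov):
    """Bivariate Gaussian weights (summing to 1) for depositing the
    macro-particle's momentum/energy source terms onto grid nodes."""
    inv = np.linalg.inv(cov)
    dx = grid_x - mean_pos[0]
    dy = grid_y - mean_pos[1]
    m = inv[0, 0] * dx**2 + 2.0 * inv[0, 1] * dx * dy + inv[1, 1] * dy**2
    w = np.exp(-0.5 * m)
    return w / w.sum()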
GPU-accelerated computation of electron transfer.
Höfinger, Siegfried; Acocella, Angela; Pop, Sergiu C; Narumi, Tetsu; Yasuoka, Kenji; Beu, Titus; Zerbetto, Francesco
2012-11-05
Electron transfer is a fundamental process that can be studied with the help of computer simulation. The underlying quantum mechanical description renders the problem a computationally intensive application. In this study, we probe the graphics processing unit (GPU) for suitability to this type of problem. Time-critical components are identified via profiling of an existing implementation and several different variants are tested involving the GPU at increasing levels of abstraction. A publicly available library supporting basic linear algebra operations on the GPU turns out to accelerate the computation approximately 50-fold with minor dependence on actual problem size. The performance gain does not compromise numerical accuracy and is of significant value for practical purposes. Copyright © 2012 Wiley Periodicals, Inc.
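The abstract attributes the roughly 50-fold speed-up to a publicly available GPU linear-algebra library but does not name it or the offloaded kernels. Purely as an illustration of the pattern, the sketch below assumes CuPy and moves a time-critical dense matrix product to the GPU, falling back to NumPy on the CPU when no GPU stack is available.

import numpy as np

def offload_matmul(a, b):
    """Run a dense matrix product on the GPU if CuPy is importable,
    otherwise on the CPU. a and b are NumPy arrays."""
    try:
        import cupy as cp
        a_dev = cp.asarray(a)          # host -> device copy
        b_dev = cp.asarray(b)
        out = a_dev @ b_dev            # cuBLAS-backed GEMM on the device
        return cp.asnumpy(out)         # device -> host copy
    except ImportError:
        return a @ b                   # CPU fallback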
NASA Astrophysics Data System (ADS)
Banda-Barragán, W. E.; Federrath, C.; Crocker, R. M.; Bicknell, G. V.
2018-01-01
We present a set of numerical experiments designed to systematically investigate how turbulence and magnetic fields influence the morphology, energetics, and dynamics of filaments produced in wind-cloud interactions. We cover 3D, magnetohydrodynamic systems of supersonic winds impacting clouds with turbulent density, velocity, and magnetic fields. We find that lognormal density distributions aid shock propagation through clouds, increasing their velocity dispersion and producing filaments with expanded cross-sections and highly magnetized knots and subfilaments. In self-consistently turbulent scenarios, the ratio of filament to initial cloud magnetic energy densities is ∼1. The effect of Gaussian velocity fields is bound to the turbulence Mach number: Supersonic velocities trigger a rapid cloud expansion; subsonic velocities only have a minor impact. The role of turbulent magnetic fields depends on their tension and is similar to the effect of radiative losses: the stronger the magnetic field or the softer the gas equation of state, the greater the magnetic shielding at wind-filament interfaces and the suppression of Kelvin-Helmholtz instabilities. Overall, we show that including turbulence and magnetic fields is crucial to understanding cold gas entrainment in multiphase winds. While cloud porosity and supersonic turbulence enhance the acceleration of clouds, magnetic shielding protects them from ablation and causes Rayleigh-Taylor-driven subfilamentation. Wind-swept clouds in turbulent models reach distances ∼15-20 times their core radius and acquire bulk speeds ∼0.3-0.4 of the wind speed in one cloud-crushing time, which are three times larger than in non-turbulent models. In all simulations, the ratio of turbulent magnetic to kinetic energy densities asymptotes at ∼0.1-0.4, and convergence of all relevant dynamical properties requires at least 64 cells per cloud radius.
Toward low-cloud-permitting cloud superparameterization with explicit boundary layer turbulence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parishani, Hossein; Pritchard, Michael S.; Bretherton, Christopher S.
Systematic biases in the representation of boundary layer (BL) clouds are a leading source of uncertainty in climate projections. A variation on superparameterization (SP) called “ultraparameterization” (UP) is developed, in which the grid spacing of the cloud-resolving models (CRMs) is fine enough (250 × 20 m) to explicitly capture the BL turbulence, associated clouds, and entrainment in a global climate model capable of multiyear simulations. UP is implemented within the Community Atmosphere Model using 2° resolution (~14,000 embedded CRMs) with one-moment microphysics. By using a small domain and mean-state acceleration, UP is computationally feasible today and promising for exascale computers. Short-duration global UP hindcasts are compared with SP and satellite observations of top-of-atmosphere radiation and cloud vertical structure. The most encouraging improvement is a deeper BL and more realistic vertical structure of subtropical stratocumulus (Sc) clouds, due to stronger vertical eddy motions that promote entrainment. Results from 90 day integrations show climatological errors that are competitive with SP, with a significant improvement in the diurnal cycle of offshore Sc liquid water. Ongoing concerns with the current UP implementation include a dim bias for near-coastal Sc that also occurs less prominently in SP and a bright bias over tropical continental deep convection zones. Nevertheless, UP makes global eddy-permitting simulation a feasible and interesting alternative to conventionally parameterized GCMs or SP-GCMs with turbulence parameterizations for studying BL cloud-climate and cloud-aerosol feedback.
Toward low-cloud-permitting cloud superparameterization with explicit boundary layer turbulence
Parishani, Hossein; Pritchard, Michael S.; Bretherton, Christopher S.; ...
2017-06-19
Systematic biases in the representation of boundary layer (BL) clouds are a leading source of uncertainty in climate projections. A variation on superparameterization (SP) called “ultraparameterization” (UP) is developed, in which the grid spacing of the cloud-resolving models (CRMs) is fine enough (250 × 20 m) to explicitly capture the BL turbulence, associated clouds, and entrainment in a global climate model capable of multiyear simulations. UP is implemented within the Community Atmosphere Model using 2° resolution (~14,000 embedded CRMs) with one-moment microphysics. By using a small domain and mean-state acceleration, UP is computationally feasible today and promising for exascale computers. Short-duration global UP hindcasts are compared with SP and satellite observations of top-of-atmosphere radiation and cloud vertical structure. The most encouraging improvement is a deeper BL and more realistic vertical structure of subtropical stratocumulus (Sc) clouds, due to stronger vertical eddy motions that promote entrainment. Results from 90 day integrations show climatological errors that are competitive with SP, with a significant improvement in the diurnal cycle of offshore Sc liquid water. Ongoing concerns with the current UP implementation include a dim bias for near-coastal Sc that also occurs less prominently in SP and a bright bias over tropical continental deep convection zones. Nevertheless, UP makes global eddy-permitting simulation a feasible and interesting alternative to conventionally parameterized GCMs or SP-GCMs with turbulence parameterizations for studying BL cloud-climate and cloud-aerosol feedback.
Toward low-cloud-permitting cloud superparameterization with explicit boundary layer turbulence
NASA Astrophysics Data System (ADS)
Parishani, Hossein; Pritchard, Michael S.; Bretherton, Christopher S.; Wyant, Matthew C.; Khairoutdinov, Marat
2017-07-01
Systematic biases in the representation of boundary layer (BL) clouds are a leading source of uncertainty in climate projections. A variation on superparameterization (SP) called "ultraparameterization" (UP) is developed, in which the grid spacing of the cloud-resolving models (CRMs) is fine enough (250 × 20 m) to explicitly capture the BL turbulence, associated clouds, and entrainment in a global climate model capable of multiyear simulations. UP is implemented within the Community Atmosphere Model using 2° resolution (˜14,000 embedded CRMs) with one-moment microphysics. By using a small domain and mean-state acceleration, UP is computationally feasible today and promising for exascale computers. Short-duration global UP hindcasts are compared with SP and satellite observations of top-of-atmosphere radiation and cloud vertical structure. The most encouraging improvement is a deeper BL and more realistic vertical structure of subtropical stratocumulus (Sc) clouds, due to stronger vertical eddy motions that promote entrainment. Results from 90 day integrations show climatological errors that are competitive with SP, with a significant improvement in the diurnal cycle of offshore Sc liquid water. Ongoing concerns with the current UP implementation include a dim bias for near-coastal Sc that also occurs less prominently in SP and a bright bias over tropical continental deep convection zones. Nevertheless, UP makes global eddy-permitting simulation a feasible and interesting alternative to conventionally parameterized GCMs or SP-GCMs with turbulence parameterizations for studying BL cloud-climate and cloud-aerosol feedback.
Automatic Registration of TLS-TLS and TLS-MLS Point Clouds Using a Genetic Algorithm
Yan, Li; Xie, Hong; Chen, Changjun
2017-01-01
Registration of point clouds is a fundamental issue in Light Detection and Ranging (LiDAR) remote sensing because point clouds scanned from multiple scan stations or by different platforms need to be transformed to a uniform coordinate reference frame. This paper proposes an efficient registration method based on genetic algorithm (GA) for automatic alignment of two terrestrial LiDAR scanning (TLS) point clouds (TLS-TLS point clouds) and alignment between TLS and mobile LiDAR scanning (MLS) point clouds (TLS-MLS point clouds). The scanning station position acquired by the TLS built-in GPS and the quasi-horizontal orientation of the LiDAR sensor in data acquisition are used as constraints to narrow the search space in GA. A new fitness function to evaluate the solutions for GA, named as Normalized Sum of Matching Scores, is proposed for accurate registration. Our method is divided into five steps: selection of matching points, initialization of population, transformation of matching points, calculation of fitness values, and genetic operation. The method is verified using a TLS-TLS data set and a TLS-MLS data set. The experimental results indicate that the RMSE of registration of TLS-TLS point clouds is 3~5 mm, and that of TLS-MLS point clouds is 2~4 cm. The registration integrating the existing well-known ICP with GA is further proposed to accelerate the optimization and its optimizing time decreases by about 50%. PMID:28850100
Automatic Registration of TLS-TLS and TLS-MLS Point Clouds Using a Genetic Algorithm.
Yan, Li; Tan, Junxiang; Liu, Hua; Xie, Hong; Chen, Changjun
2017-08-29
Registration of point clouds is a fundamental issue in Light Detection and Ranging (LiDAR) remote sensing because point clouds scanned from multiple scan stations or by different platforms need to be transformed to a uniform coordinate reference frame. This paper proposes an efficient registration method based on genetic algorithm (GA) for automatic alignment of two terrestrial LiDAR scanning (TLS) point clouds (TLS-TLS point clouds) and alignment between TLS and mobile LiDAR scanning (MLS) point clouds (TLS-MLS point clouds). The scanning station position acquired by the TLS built-in GPS and the quasi-horizontal orientation of the LiDAR sensor in data acquisition are used as constraints to narrow the search space in GA. A new fitness function to evaluate the solutions for GA, named as Normalized Sum of Matching Scores, is proposed for accurate registration. Our method is divided into five steps: selection of matching points, initialization of population, transformation of matching points, calculation of fitness values, and genetic operation. The method is verified using a TLS-TLS data set and a TLS-MLS data set. The experimental results indicate that the RMSE of registration of TLS-TLS point clouds is 3~5 mm, and that of TLS-MLS point clouds is 2~4 cm. The registration integrating the existing well-known ICP with GA is further proposed to accelerate the optimization and its optimizing time decreases by about 50%.
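The method couples a genetic algorithm with a matching-score fitness evaluated over candidate rigid transforms. The sketch below is a heavily simplified 2D illustration of that idea: a nearest-neighbour inlier fraction stands in for the paper's Normalized Sum of Matching Scores, and the population size, mutation scale, and translation bounds are arbitrary assumptions.

import numpy as np
from scipy.spatial import cKDTree

def fitness(params, source, target_tree, tol=0.05):
    """Score a candidate rigid transform (tx, ty, theta): the fraction of
    transformed source points landing within tol of some target point."""
    tx, ty, theta = params
    c, s = np.cos(theta), np.sin(theta)
    moved = source @ np.array([[c, -s], [s, c]]).T + np.array([tx, ty])
    dists, _ = target_tree.query(moved)
    return float(np.mean(dists < tol))

def ga_register(source, target, pop=60, gens=100, bounds=(-5.0, 5.0)):
    """Minimal GA loop: random initialisation, truncation selection of the
    top quarter, Gaussian mutation of copied elites."""
    rng = np.random.default_rng(0)
    tree = cKDTree(target)
    population = np.column_stack([
        rng.uniform(bounds[0], bounds[1], pop),
        rng.uniform(bounds[0], bounds[1], pop),
        rng.uniform(-np.pi, np.pi, pop)])
    for _ in range(gens):
        scores = np.array([fitness(p, source, tree) for p in population])
        elite = population[np.argsort(scores)[-pop // 4:]]
        population = elite[rng.integers(len(elite), size=pop)] + rng.normal(0.0, 0.05, (pop, 3))
    scores = np.array([fitness(p, source, tree) for p in population])
    return population[np.argmax(scores)]          # best (tx, ty, theta)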
A cloud collision model for water maser excitation.
Tarter, J C; Welch, W J
1986-06-01
High-velocity collisions between small, dense, neutral clouds or between a dense cloud and a dense shell can provide the energy source required to excite H2O maser emission. The radiative precursor from the surface of the collisional shock front rapidly diffuses through the cloud, heating the dust grains but leaving the H2 molecules cool. Transient maser emission occurs as the conditions for the Goldreich and Kwan "hot-dust cold-gas" maser pump scheme are realized locally within the cloud. In time the local maser action quenches due to the heating of the H2 molecules by collisions against the grains. Although this model cannot explain the very long-lived steady maser features, it is quite successful in explaining a number of the observed properties of the high-velocity features in such sources as Orion, W51, and W49. In particular, it provides a natural explanation for the rapid time variations, the narrow line widths, juxtaposition of high- and low-velocity features, and the short lifetimes which are frequently observed for the so-called high-velocity maser "bullets" thought to be accelerated by strong stellar winds.
Science in the cloud (SIC): A use case in MRI connectomics.
Kiar, Gregory; Gorgolewski, Krzysztof J; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A; Wiener, Martin; Vogelstein, R Jacob; Burns, Randal; Vogelstein, Joshua T
2017-05-01
Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called 'science in the cloud' (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. © The Author 2017. Published by Oxford University Press.
Khan, Majharul Haque; Jamali, Sina S; Lyalin, Andrey; Molino, Paul J; Jiang, Lei; Liu, Hua Kun; Taketsugu, Tetsuya; Huang, Zhenguo
2017-01-01
Outstanding protection of Cu by high-quality boron nitride nanofilm (BNNF) 1-2 atomic layers thick in salt water is observed, while defective BNNF accelerates the reaction of Cu toward water. The chemical stability, insulating nature, and impermeability of ions through the BN hexagons render BNNF a great choice for atomic-scale protection. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Development and Evaluation of Sterographic Display for Lung Cancer Screening
2008-12-01
[Report excerpt] ...burden. Application of GPUs: with the evolution of commodity graphics processing units (GPUs) for accelerating games on personal computers, ... units, which are designed for rendering computer games, are readily available and can be programmed to perform the kinds of real-time calculations... [Cited reference fragment: Anderson CM, Saloner D, Tsuruda JS, Shapeero LG, Lee RE, "Artifacts in maximum-intensity-projection display of MR angiograms," 575-581, 1994.]
Processing Shotgun Proteomics Data on the Amazon Cloud with the Trans-Proteomic Pipeline*
Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W.; Moritz, Robert L.
2015-01-01
Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of cloud enabled Trans-Proteomic Pipeline by performing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at a very low cost. PMID:25418363
Processing shotgun proteomics data on the Amazon cloud with the trans-proteomic pipeline.
Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W; Moritz, Robert L
2015-02-01
Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of cloud enabled Trans-Proteomic Pipeline by performing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at a very low cost. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.
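Both records describe dispatching compute-intensive search tasks onto Amazon EC2. As a generic illustration of that pattern only (not the Trans-Proteomic Pipeline's own tooling), the sketch below uses boto3 to request a single instance and wait for it to start; the AMI id, instance type, and key name are placeholders.

import boto3

def launch_search_node(ami_id, instance_type="c5.4xlarge", key_name=None):
    """Request one EC2 instance for a compute-heavy task and block until it
    is running. All arguments are placeholders for this sketch."""
    ec2 = boto3.client("ec2")
    kwargs = dict(ImageId=ami_id, InstanceType=instance_type, MinCount=1, MaxCount=1)
    if key_name:
        kwargs["KeyName"] = key_name
    resp = ec2.run_instances(**kwargs)
    instance_id = resp["Instances"][0]["InstanceId"]
    ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])
    return instance_id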
NASA Astrophysics Data System (ADS)
Bower, Keith; Choularton, Tom; Latham, John; Sahraei, Jalil; Salter, Stephen
2006-11-01
A simplified version of the model of marine stratocumulus clouds developed by Bower, Jones and Choularton [Bower, K.N., Jones, A., and Choularton, T.W., 1999. A modeling study of aerosol processing by stratocumulus clouds and its impact on GCM parameterisations of cloud and aerosol. Atmospheric Research, Vol. 50, Nos. 3-4, The Great Dun Fell Experiment, 1995-special issue, 317-344.] was used to examine the sensitivity of the albedo-enhancement global warming mitigation scheme proposed by Latham [Latham, J., 1990. Control of global warming? Nature 347, 339-340; Latham, J., 2002. Amelioration of global warming by controlled enhancement of the albedo and longevity of low-level maritime clouds. Atmos. Sci. Letters (doi:10.1006/Asle.2002.0048).] to the cloud and environmental aerosol characteristics, as well as those of the seawater aerosol of salt-mass ms and number concentration ΔN, which, under the scheme, are advertently introduced into the clouds. Values of albedo-change ΔA and droplet number concentration Nd were calculated for a wide range of values of ms, ΔN, updraught speed W, cloud thickness ΔZ and cloud-base temperature TB: for three measured aerosol spectra, corresponding to ambient air of negligible, moderate and high levels of pollution. Our choices of parameter value ranges were determined by the extent of their applicability to the mitigation scheme, whose current formulation is still somewhat preliminary, thus rendering unwarranted in this study the utilisation of refinements incorporated into other stratocumulus models. In agreement with earlier studies: (1) ΔA was found to be very sensitive to ΔN and (within certain constraints) insensitive to changes in ms, W, ΔZ and TB; (2) ΔA was greatest for clouds formed in pure air and least for highly polluted air. In many situations considered to be within the ambit of the mitigation scheme, the calculated ΔA values exceeded those estimated by earlier workers as being necessary to produce a cooling sufficient to compensate, globally, for the warming resulting from a doubling of the atmospheric carbon dioxide concentration. Our calculations provide quantitative support for the physical viability of the mitigation scheme and offer new insights into its technological requirements.
Interactions between spacecraft motions and the atmospheric cloud physics laboratory experiments
NASA Technical Reports Server (NTRS)
Anderson, B. J.
1981-01-01
In evaluating the effects of spacecraft motions on atmospheric cloud physics laboratory (ACPL) experimentation, the motions of concern are those which will result in the movement of the fluid or cloud particles within the experiment chambers. Of the various vehicle motions and residual forces which can and will occur, three types appear most likely to damage the experimental results: non-steady rotations through a large angle, long-duration accelerations in a constant direction, and vibrations. During the ACPL ice crystal growth experiments, the crystals are suspended near the end of a long fiber (20 cm long by 200 micron diameter) of glass or similar material. Small vibrations of the supported end of the fiber could cause extensive motions of the ice crystal, if care is not taken to avoid this problem.
Digital Textbooks. Research Brief
ERIC Educational Resources Information Center
Johnston, Howard
2011-01-01
Despite their growing popularity, digital alternatives to conventional textbooks are stirring up controversy. With the introduction of tablet computers, and the growing trend toward "cloud computing" and "open source" software, the trend is accelerating because costs are coming down and free or inexpensive materials are becoming more available.…
Inhomogeneous distribution of water droplets in cloud turbulence
NASA Astrophysics Data System (ADS)
Fouxon, Itzhak; Park, Yongnam; Harduf, Roei; Lee, Changhoon
2015-09-01
We consider sedimentation of small particles in a turbulent flow where fluid accelerations are much smaller than the acceleration of gravity g. The particles are dragged by the flow through a linear friction force. We demonstrate that the pair-correlation function of the particles' concentration diverges with decreasing separation as a power law with a negative exponent. This manifests a fractal distribution of particles in space. We find that the exponent is proportional to the integral of the turbulence energy spectrum times the wave number, divided by g. The proportionality coefficient is a universal number independent of particle size. We derive the spectrum of Lyapunov exponents that describes the evolution of small patches of particles. It is demonstrated that particles separate dominantly in the horizontal plane. This provides a theory for the recently observed vertical columns formed by the particles. We confirm the predictions by direct numerical simulations of Navier-Stokes turbulence. The predictions include conditions that hold for water droplets in warm clouds, thus providing a tool for the prediction of rain formation.
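A hedged LaTeX rendering of the scaling stated in the abstract follows; the universal proportionality coefficient C is asserted there but its value is not quoted, E(k) denotes the turbulence energy spectrum, and n is the particle concentration.

\langle n(\mathbf{x})\, n(\mathbf{x}+\mathbf{r}) \rangle \propto r^{-\alpha},
\qquad
\alpha = C \, \frac{\int_0^{\infty} k \, E(k)\, \mathrm{d}k}{g}.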
NASA Astrophysics Data System (ADS)
Rizki, Permata Nur Miftahur; Lee, Heezin; Lee, Minsu; Oh, Sangyoon
2017-01-01
With the rapid advance of remote sensing technology, the amount of three-dimensional point-cloud data has increased extraordinarily, requiring faster processing in the construction of digital elevation models. There have been several attempts to accelerate the computation using parallel methods; however, little attention has been given to investigating different approaches for selecting the most suited parallel programming model for a given computing environment. We present our findings and insights identified by implementing three popular high-performance parallel approaches (message passing interface, MapReduce, and GPGPU) on time-demanding but accurate kriging interpolation. The performances of the approaches are compared by varying the size of the grid and input data. In our empirical experiment, we demonstrate the significant acceleration by all three approaches compared to a C-implemented sequential-processing method. In addition, we also discuss the pros and cons of each method in terms of usability, complexity, infrastructure, and platform limitations to give readers a better understanding of utilizing those parallel approaches for gridding purposes.
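The study compares MPI, MapReduce, and GPGPU implementations of kriging for gridding point-cloud data. The sketch below illustrates only the MPI-style decomposition with mpi4py, splitting the output grid across ranks and using inverse-distance weighting as a simple stand-in for the study's kriging kernel; it is not the authors' code.

import numpy as np
from mpi4py import MPI

def idw_block(points, values, grid_block, power=2.0, eps=1e-12):
    """Interpolate one block of grid nodes with inverse-distance weighting
    (a stand-in here for the kriging system solved in the study)."""
    out = np.empty(len(grid_block))
    for i, node in enumerate(grid_block):
        d2 = np.sum((points - node) ** 2, axis=1) + eps
        w = d2 ** (-power / 2.0)
        out[i] = np.sum(w * values) / np.sum(w)
    return out

def parallel_grid(points, values, grid_nodes):
    """Split grid_nodes ((M, 2) array) across MPI ranks and gather the result on rank 0."""
    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
    my_block = np.array_split(grid_nodes, size)[rank]
    my_values = idw_block(points, values, my_block)
    gathered = comm.gather(my_values, root=0)
    return np.concatenate(gathered) if rank == 0 else None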
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tatischeff, Vincent; Duprat, Jean; De Séréville, Nicolas, E-mail: Vincent.Tatischeff@csnsm.in2p3.fr
The presence of short-lived radionuclides (t1/2 < 10 Myr) in the early solar system provides important information about the astrophysical environment in which the solar system formed. The discovery of now extinct 10Be (t1/2 = 1.4 Myr) in calcium-aluminum-rich inclusions (CAIs) with Fractionation and Unidentified Nuclear isotope anomalies (FUN-CAIs) suggests that a baseline concentration of 10Be in the early solar system was inherited from the protosolar molecular cloud. In this paper, we investigate various astrophysical contexts for the nonthermal nucleosynthesis of 10Be by cosmic-ray-induced reactions. We first show that the 10Be recorded in FUN-CAIs cannot have been produced in situ by irradiation of the FUN-CAIs themselves. We then show that trapping of Galactic cosmic rays (GCRs) in the collapsing presolar cloud core induced a negligible 10Be contamination of the protosolar nebula, the inferred 10Be/9Be ratio being at least 40 times lower than that recorded in FUN-CAIs (10Be/9Be ∼ 3 × 10^-4). Irradiation of the presolar molecular cloud by background GCRs produced a steady-state 10Be/9Be ratio ≲ 1.3 × 10^-4 at the time of the solar system formation, which suggests that the presolar cloud was irradiated by an additional source of CRs. Considering a detailed model for CR acceleration in a supernova remnant (SNR), we find that the 10Be abundance recorded in FUN-CAIs can be explained within two alternative scenarios: (1) the irradiation of a giant molecular cloud by CRs produced by ≳ 50 supernovae exploding in a superbubble of hot gas generated by a large star cluster of at least 20,000 members, and (2) the irradiation of the presolar molecular cloud by freshly accelerated CRs escaped from an isolated SNR at the end of the Sedov-Taylor phase. In the second picture, the SNR resulted from the explosion of a massive star that ran away from its parent OB association, expanded during most of its adiabatic phase in an intercloud medium of density of about 1 H-atom cm^-3, and eventually interacted with the presolar molecular cloud only during the radiative stage. This model naturally provides an explanation for the injection of other short-lived radionuclides of stellar origin into the cold presolar molecular cloud (26Al, 41Ca, and 36Cl) and is in agreement with the solar system originating from the collapse of a molecular cloud shocked by a supernova blast wave.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubin, David L.
2015-01-23
Accelerators that collide high energy beams of matter and anti-matter are essential tools for the investigation of the fundamental constituents of matter, and the search for new forms of matter and energy. A “Linear Collider” is a machine that would bring high energy and very compact bunches of electrons and positrons (anti-electrons) into head-on collision. Such a machine would produce (among many other things) the newly discovered Higgs particle, enabling a detailed study of its properties. Among the most critical and challenging components of a linear collider are the damping rings that produce the very compact and intense beams of electrons and positrons that are to be accelerated into collision. Hot dilute particle beams are injected into the damping rings, where they are compressed and cooled. The size of the positron beam must be reduced more than a thousand fold in the damping ring, and this compression must be accomplished in a fraction of a second. The cold compact beams are then extracted from the damping ring and accelerated into collision at high energy. The proposed International Linear Collider (ILC) would require damping rings that routinely produce such cold, compact and intense beams. The goal of the Cornell study was a credible design for the damping rings for the ILC. Among the technical challenges of the damping rings are the development of instrumentation that can measure the properties of the very small beams in a very narrow window of time, and mitigation of the forces that can destabilize the beams and prevent adequate cooling, or, worse, lead to beam loss. One of the most pernicious destabilizing forces is due to the formation of clouds of electrons in the beam pipe. The electron cloud effect is a phenomenon in particle accelerators in which a high density of low-energy electrons builds up inside the vacuum chamber. At the outset of the study, it was anticipated that electron cloud effects would limit the intensity of the positron ring, and that an instability associated with residual gas in the beam pipe would limit the intensity of the electron ring. It was also not clear whether the required very small beam size could be achieved. The results of this study are important contributions to the design of both the electron and positron damping rings in which all of those challenges are addressed and overcome. Our findings are documented in the ILC Technical Design Report, a document that represents the work of an international collaboration of scientists. Our contributions include design of the beam magnetic optics for the 3 km circumference damping rings, the vacuum system and surface treatments for electron cloud mitigation, the design of the guide field magnets, design of the superconducting damping wigglers, and new detectors for precision measurement of beam properties. Our study informed the specification of the basic design parameters for the damping rings, including alignment tolerances, magnetic field errors, and instrumentation. We developed electron cloud modelling tools and simulations to aid in the interpretation of the measurements that we carried out in the Cornell Electron-positron Storage Ring (CESR). The simulations provide a means for systematic extrapolation of our measurements at CESR to the proposed ILC damping rings, and ultimately to specify how the beam pipes should be fabricated in order to minimize the effects of the electron cloud.
With the conclusion of this study, the design of the essential components of the damping rings is complete, including the development and characterization (with computer simulations) of the beam optics, specification of techniques for minimizing beam size, design of damping ring instrumentation, R&D into electron cloud suppression methods, tests of long term durability of electron cloud coatings, and design of damping ring vacuum system components.
NASA Astrophysics Data System (ADS)
Sudhakar, P.; Sheela, K. Anitha; Ramakrishna Rao, D.; Malladi, Satyanarayana
2016-05-01
In recent years weather modification activities are being pursued in many countries through cloud seeding techniques to facilitate the increased and timely precipitation from the clouds. In order to induce and accelerate the precipitation process clouds are artificially seeded with suitable materials like silver iodide, sodium chloride or other hygroscopic materials. The success of cloud seeding can be predicted with confidence if the precipitation process involving aerosol, the ice water balance, water vapor content and size of the seeding material in relation to aerosol in the cloud is monitored in real time and optimized. A project on the enhancement of rain fall through cloud seeding is being implemented jointly with Kerala State Electricity Board Ltd. Trivandrum, Kerala, India at the catchment areas of the reservoir of one of the Hydro electric projects. The dual polarization lidar is being used to monitor and measure the microphysical properties, the extinction coefficient, size distribution and related parameters of the clouds. The lidar makes use of the Mie, Rayleigh and Raman scattering techniques for the various measurement proposed. The measurements with the dual polarization lidar as above are being carried out in real time to obtain the various parameters during cloud seeding operations. In this paper we present the details of the multi-wavelength dual polarization lidar being used and the methodology to monitor the various cloud parameters involved in the precipitation process. The necessary retrieval algorithms for deriving the microphysical properties of clouds, aerosols characteristics and water vapor profiles are incorporated as a software package working under Lab-view for online and off line analysis. Details on the simulation studies and the theoretical model developed in this regard for the optimization of various parameters are discussed.
Typograph: Multiscale Spatial Exploration of Text Documents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander; Burtner, Edwin R.; Cramer, Nicholas O.
2013-10-06
Visualizing large document collections using a spatial layout of terms can enable quick overviews of information. These visual metaphors (e.g., word clouds, tag clouds, etc.) traditionally show a series of terms organized by space-filling algorithms. However, often lacking in these views is the ability to interactively explore the information to gain more detail, and the location and rendering of the terms are often not based on mathematical models that maintain relative distances from other information based on similarity metrics. In this paper, we present Typograph, a multi-scale spatial exploration visualization for large document collections. Based on term-based visualization methods, Typograph enables multiple levels of detail (terms, phrases, snippets, and full documents) within a single spatialization. Further, information is placed based on its relative similarity to other information to create the “near = similar” geographic metaphor. This paper discusses the design principles and functionality of Typograph and presents a use case analyzing Wikipedia to demonstrate usage.
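Typograph places information so that spatial proximity reflects similarity. One common way to realise such a "near = similar" layout, offered purely as an assumed illustration rather than Typograph's actual model, is to embed a term dissimilarity matrix with multidimensional scaling:

import numpy as np
from sklearn.manifold import MDS

def place_terms(cooccurrence, term_names):
    """Turn a symmetric term co-occurrence matrix into 2-D coordinates whose
    distances roughly preserve dissimilarity (1 - normalised co-occurrence)."""
    sim = cooccurrence / (cooccurrence.max() + 1e-12)
    dissim = 1.0 - sim
    np.fill_diagonal(dissim, 0.0)
    coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(dissim)
    return dict(zip(term_names, coords))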
a Framework for Voxel-Based Global Scale Modeling of Urban Environments
NASA Astrophysics Data System (ADS)
Gehrung, Joachim; Hebel, Marcus; Arens, Michael; Stilla, Uwe
2016-10-01
The generation of 3D city models is a very active field of research. Modeling environments as point clouds may be fast, but it has disadvantages. These are easily solvable by using volumetric representations, especially when considering selective data acquisition, change detection and fast-changing environments. Therefore, this paper proposes a framework for the volumetric modeling and visualization of large-scale urban environments. Besides an architecture and the right mix of algorithms for the task, two compression strategies for volumetric models as well as a data-quality-based approach for the import of range measurements are proposed. The capabilities of the framework are shown on a mobile laser scanning dataset of the Technical University of Munich. Furthermore, the loss of the compression techniques is evaluated and their memory consumption is compared to that of raw point clouds. The presented results show that generation, storage and real-time rendering of even large urban models are feasible, even with off-the-shelf hardware.
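The framework argues for volumetric (voxel) representations over raw point clouds. A minimal sketch of a sparse, hash-map voxel grid built from a point cloud follows; it is an assumed illustration only and omits the paper's compression strategies and data-quality-based import of range measurements.

import numpy as np
from collections import defaultdict

def voxelize(points, voxel_size=0.2):
    """Accumulate point counts per voxel in a sparse hash map. points is an
    (N, 3) array; real systems would also track free space along each ray
    and compress the resulting grid."""
    grid = defaultdict(int)
    keys = np.floor(points / voxel_size).astype(np.int64)
    for key in map(tuple, keys):
        grid[key] += 1
    return grid

# Example use: centres of voxels supported by at least 3 points.
# points = np.random.rand(100000, 3) * 50.0
# grid = voxelize(points)
# centres = [(np.array(k) + 0.5) * 0.2 for k, count in grid.items() if count >= 3]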
Electrostatic plasma lens for focusing negatively charged particle beams.
Goncharov, A A; Dobrovolskiy, A M; Dunets, S M; Litovko, I V; Gushenets, V I; Oks, E M
2012-02-01
We describe the current status of ongoing research and development of the electrostatic plasma lens for focusing and manipulating intense negatively charged particle beams, electrons, and negative ions. The physical principle of this kind of plasma lens is based on magnetic insulation of electrons, which enables the creation of a dynamic positive space-charge cloud within a short, restricted volume of the propagating beam. Here, we present new results of experimental investigations and computer simulations of wide-aperture, intense electron beam focusing by a plasma lens whose positive space-charge cloud is produced by a cylindrical anode-layer accelerator directing a positive ion stream toward the system axis.
NASA Astrophysics Data System (ADS)
Pierce, S. A.
2017-12-01
Decision making for groundwater systems is becoming increasingly important, as shifting water demands increasingly impact aquifers. As buffer systems, aquifers provide room for resilient responses and augment the actual timeframe for hydrological response. Yet the pace of impacts, climate shifts, and degradation of water resources is accelerating. To meet these new drivers, groundwater science is transitioning toward the emerging field of Integrated Water Resources Management, or IWRM. IWRM incorporates a broad array of dimensions, methods, and tools to address problems that tend to be complex. Computational tools and accessible cyberinfrastructure (CI) are needed to cross the chasm between science and society. Fortunately, cloud computing environments, such as the new Jetstream system, are evolving rapidly. While still targeting scientific user groups, systems such as Jetstream offer configurable cyberinfrastructure to enable interactive computing and data analysis resources on demand. The web-based interfaces allow researchers to rapidly customize virtual machines, modify computing architecture and increase the usability and access for broader audiences to advanced compute environments. The result enables dexterous configurations and opens up opportunities for IWRM modelers to expand the reach of analyses, number of case studies, and quality of engagement with stakeholders and decision makers. The acute need to identify improved IWRM solutions paired with advanced computational resources refocuses the attention of IWRM researchers on applications, workflows, and intelligent systems that are capable of accelerating progress. IWRM must address key drivers of community concern, implement transdisciplinary methodologies, and adapt and apply decision support tools in order to effectively support decisions about groundwater resource management. This presentation will provide an overview of advanced computing services in the cloud using integrated groundwater management case studies to highlight how Cloud CI streamlines the process for setting up an interactive decision support system. Moreover, advances in artificial intelligence offer new techniques for old problems, from integrating data to adaptive sensing, or from interactive dashboards to optimizing multi-attribute problems. The combination of scientific expertise, flexible cloud computing solutions, and intelligent systems opens new research horizons.
Jewel scarabs (Chrysina sp.) in Honduras: key species for cloud forest conservation monitoring?
Jocque, M; Vanhove, M P M; Creedy, T J; Burdekin, O; Nuñez-Miño, J M; Casteels, J
2013-01-01
Jewel scarabs, beetles in the genus Chrysina Kirby (Coleoptera: Rutelinae: Scarabaeidae), receive their name from the bright, often gold, green elytra that reflect light like a precious stone. Jewel scarabs are commonly observed at light traps in Mesoamerican cloud forests, and their association with mountain forests makes them potentially interesting candidates for cloud forest conservation monitoring. The absence of survey protocols and identification tools, and the little ecological information available, are barriers. In the present study, a collection of Chrysina species assembled during biodiversity surveys by Operation Wallacea in Cusuco National Park (CNP), Honduras, was studied. The aim of this overview is to provide an easy-to-use identification tool for use in the field, hopefully stimulating data collection on these beetles. Based on the data associated with the collection localities, the elevation distribution of the species in the park was analyzed. The limited data points available were complemented with potential distribution areas generated with distribution models based on climate and elevation data. This study is aimed at initializing the development of a survey protocol for Chrysina species that can be used in cloud forest conservation monitoring throughout Central America. A list of Chrysina species recorded from Honduras so far is provided. The six identified and one unidentified species recorded from CNP are easy to identify in the field based on color and straightforward morphological characteristics. Literature research revealed ten species currently recorded from Honduras. This low species richness in comparison with surrounding Central American countries indicates the poor knowledge of this genus in Honduras. Chrysina species richness in CNP increases with elevation, thereby making the genus one of a few groups of organisms where this correlation is observed, and rendering it a suitable invertebrate representative for cloud forest habitats in Central America.
NASA Astrophysics Data System (ADS)
Imai, M.; Kouyama, T.; Takahashi, Y.; Watanabe, S.; Yamazaki, A.; Yamada, M.; Nakamura, M.; Satoh, T.; Imamura, T.; Nakaoka, T.; Kawabata, M.; Yamanaka, M.; Kawabata, K. S.
2017-12-01
Venus has a global cloud layer, and the atmosphere rotates at speeds of over 100 m/s. Scattering of solar radiation and an unknown absorber in the clouds produce strong dark and bright contrasts in the 365 nm band of the unknown absorption. The Japanese Venus orbiter AKATSUKI and the onboard instrument UVI capture 100 km mesoscale cloud features over the entire visible dayside area. In contrast, planetary-scale features are observed when the orbiter is at a moderate distance from Venus and when the Sun-Venus-orbiter phase angle is smaller than 45 deg. Cloud-top wind velocity has been measured with the mesoscale cloud tracking technique; however, observations of the propagation velocity of the planetary-scale features and its variation have been limited by the restricted observable area. The purpose of this study is to measure the effect of wind acceleration by planetary-scale waves. The motions of mesoscale and planetary-scale features represent the wind and the phase velocity of the planetary-scale waves, respectively. We conducted simultaneous observations of the zonal motion of both mesoscale and planetary-scale features using UVI/AKATSUKI and the ground-based Pirka and Kanata telescopes in Japan. Our previous ground-based observations revealed a periodicity change of the planetary-scale waves with a time scale of a couple of months. For the initial analysis of UVI images, we used the time-consecutive images taken in orbit #32. During this orbit (Nov. 13 to 20, 2016), 7 images were obtained per day at 2 hr intervals, with spatial resolution ranging from 10 to 35 km. To investigate the typical mesoscale cloud motion, Gaussian filters with sigma = 3 deg were used to smooth geometrically mapped images with 0.25 deg resolution. The zonal shift for each 5 deg latitudinal band between pairs of time-consecutive images was then estimated by searching for the 2D cross-correlation maximum. The final wind velocity (or rotation period) for mesoscale features was determined with a small error of about +/- 0.1 day in period in the equatorial region (Figure 2). The same method will be applied to planetary-scale features captured by UVI, and ground-based observations will compensate for the discontinuity in UVI data. At the presentation, the variability in winds and wave propagation velocity with a time scale of a couple of months will be shown.
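The analysis smooths the mapped images with Gaussian filters (sigma = 3 deg on a 0.25 deg grid) and estimates the zonal shift of each 5 deg latitude band from the 2D cross-correlation maximum of time-consecutive image pairs. The sketch below is a simplified single-band illustration of that procedure, assuming already-mapped images; the search range and function names are assumptions.

import numpy as np
from scipy.ndimage import gaussian_filter

def zonal_shift(band1, band2, deg_per_px=0.25, sigma_deg=3.0, max_shift_deg=30.0):
    """Estimate the zonal displacement (degrees of longitude) of cloud features
    between two time-consecutive, geometrically mapped images of one latitude
    band by locating the cross-correlation maximum over candidate shifts."""
    sigma_px = sigma_deg / deg_per_px
    a = gaussian_filter(band1.astype(float), sigma_px)
    b = gaussian_filter(band2.astype(float), sigma_px)
    a -= a.mean()
    b -= b.mean()
    max_px = int(max_shift_deg / deg_per_px)
    best_corr, best_shift = -np.inf, 0
    for shift in range(-max_px, max_px + 1):
        corr = float(np.sum(a * np.roll(b, shift, axis=1)))   # axis=1 is longitude
        if corr > best_corr:
            best_corr, best_shift = corr, shift
    return best_shift * deg_per_px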
Gamma-ray emission from the shell of supernova remnant W44 revealed by the Fermi LAT.
Abdo, A A; Ackermann, M; Ajello, M; Baldini, L; Ballet, J; Barbiellini, G; Baring, M G; Bastieri, D; Baughman, B M; Bechtol, K; Bellazzini, R; Berenji, B; Blandford, R D; Bloom, E D; Bonamente, E; Borgland, A W; Bregeon, J; Brez, A; Brigida, M; Bruel, P; Burnett, T H; Buson, S; Caliandro, G A; Cameron, R A; Caraveo, P A; Casandjian, J M; Cecchi, C; Celik, O; Chekhtman, A; Cheung, C C; Chiang, J; Ciprini, S; Claus, R; Cognard, I; Cohen-Tanugi, J; Cominsky, L R; Conrad, J; Cutini, S; Dermer, C D; de Angelis, A; de Palma, F; Digel, S W; do Couto e Silva, E; Drell, P S; Dubois, R; Dumora, D; Espinoza, C; Farnier, C; Favuzzi, C; Fegan, S J; Focke, W B; Fortin, P; Frailis, M; Fukazawa, Y; Funk, S; Fusco, P; Gargano, F; Gasparrini, D; Gehrels, N; Germani, S; Giavitto, G; Giebels, B; Giglietto, N; Giordano, F; Glanzman, T; Godfrey, G; Grenier, I A; Grondin, M-H; Grove, J E; Guillemot, L; Guiriec, S; Hanabata, Y; Harding, A K; Hayashida, M; Hays, E; Hughes, R E; Jackson, M S; Jóhannesson, G; Johnson, A S; Johnson, T J; Johnson, W N; Kamae, T; Katagiri, H; Kataoka, J; Katsuta, J; Kawai, N; Kerr, M; Knödlseder, J; Kocian, M L; Kramer, M; Kuss, M; Lande, J; Latronico, L; Lemoine-Goumard, M; Longo, F; Loparco, F; Lott, B; Lovellette, M N; Lubrano, P; Lyne, A G; Madejski, G M; Makeev, A; Mazziotta, M N; McEnery, J E; Meurer, C; Michelson, P F; Mitthumsiri, W; Mizuno, T; Monte, C; Monzani, M E; Morselli, A; Moskalenko, I V; Murgia, S; Nakamori, T; Nolan, P L; Norris, J P; Noutsos, A; Nuss, E; Ohsugi, T; Omodei, N; Orlando, E; Ormes, J F; Paneque, D; Parent, D; Pelassa, V; Pepe, M; Pesce-Rollins, M; Piron, F; Porter, T A; Rainò, S; Rando, R; Razzano, M; Reimer, A; Reimer, O; Reposeur, T; Rochester, L S; Rodriguez, A Y; Romani, R W; Roth, M; Ryde, F; Sadrozinski, H F-W; Sanchez, D; Sander, A; Saz Parkinson, P M; Scargle, J D; Sgrò, C; Siskind, E J; Smith, D A; Smith, P D; Spandre, G; Spinelli, P; Stappers, B W; Stecker, F W; Strickman, M S; Suson, D J; Tajima, H; Takahashi, H; Takahashi, T; Tanaka, T; Thayer, J B; Thayer, J G; Theureau, G; Thompson, D J; Tibaldo, L; Tibolla, O; Torres, D F; Tosti, G; Tramacere, A; Uchiyama, Y; Usher, T L; Vasileiou, V; Venter, C; Vilchez, N; Vitale, V; Waite, A P; Wang, P; Winer, B L; Wood, K S; Yamazaki, R; Ylinen, T; Ziegler, M
2010-02-26
Recent observations of supernova remnants (SNRs) hint that they accelerate cosmic rays to energies close to ~10^15 electron volts. However, the nature of the particles that produce the emission remains ambiguous. We report observations of SNR W44 with the Fermi Large Area Telescope at energies between 2 x 10^8 electron volts and 3 x 10^11 electron volts. The detection of a source with a morphology corresponding to the SNR shell implies that the emission is produced by particles accelerated there. The gamma-ray spectrum is well modeled with emission from protons and nuclei. Its steepening above approximately 10^9 electron volts provides a probe with which to study how particle acceleration responds to environmental effects such as shock propagation in dense clouds and how accelerated particles are released into interstellar space.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saide, Pablo; Spak, S. N.; Carmichael, Gregory
2012-03-30
We evaluate a regional-scale simulation with the WRF-Chem model for the VAMOS (Variability of the American Monsoon Systems) Ocean-Cloud-Atmosphere-Land Study Regional Experiment (VOCALS-REx), which sampled the Southeast Pacific's persistent stratocumulus deck. Evaluation against VOCALS-REx ship-based and aircraft observations focuses on analyzing how aerosol loading affects marine boundary layer (MBL) dynamics and cloud microphysics. We compare local time series and campaign-averaged longitudinal gradients, and highlight differences in model simulations with (W) and without (NW) wet deposition processes. The higher aerosol loadings in the NW case produce considerable changes in MBL dynamics and cloud microphysics, in accordance with the established conceptual model of aerosol indirect effects. These include increase in cloud albedo, increase in MBL and cloud heights, drizzle suppression, increase in liquid water content, and increase in cloud lifetime. Moreover, better statistical representation of aerosol mass and number concentration improves model fidelity in reproducing observed spatial and temporal variability in cloud properties, including top and base height, droplet concentration, water content, rain rate, optical depth (COD) and liquid water path (LWP). Together, these help to quantify confidence in WRF-Chem's modeled aerosol-cloud interactions, while identifying structural and parametric uncertainties including: irreversibility in rain wet removal; overestimation of marine DMS and sea salt emissions; and accelerated aqueous sulfate conversion. Our findings suggest that WRF-Chem simulates marine cloud-aerosol interactions at a level sufficient for applications in forecasting weather and air quality and studying aerosol climate forcing, including the reliability required for policy analysis and geo-engineering applications.
The Namibia Early Flood Warning System, A CEOS Pilot Project
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Frye, Stuart; Cappelaere, Pat; Sohlberg, Robert; Handy, Matthew; Grossman, Robert
2012-01-01
Over the past few years, an international collaboration has developed a pilot project under the auspices of the Committee on Earth Observation Satellites (CEOS) Disasters team. The overall team consists of civilian satellite agencies. For this pilot effort, the development team consists of NASA, the Canadian Space Agency, Univ. of Maryland, Univ. of Colorado, Univ. of Oklahoma, the Ukraine Space Research Institute and the Joint Research Center (JRC) of the European Commission. This development team collaborates with regional, national and international agencies to deliver end-to-end disaster coverage. In particular, the team is collaborating on this effort with the Namibia Department of Hydrology, beginning in Namibia. However, the ultimate goal is to expand the functionality to provide early warning over the southern Africa region. The initial collaboration was initiated by the United Nations Office for Outer Space Affairs and the CEOS Working Group for Information Systems and Services (WGISS). The initial driver was to demonstrate international interoperability using various space agency sensors and models along with regional in-situ ground sensors. In 2010, the team created a preliminary semi-manual system to demonstrate moving and combining key data streams and delivering the data to the Namibia Department of Hydrology during their flood season, which typically runs from January through April. In this pilot, a variety of moderate-resolution and high-resolution satellite flood imagery was rapidly delivered and used in conjunction with flood predictive models in Namibia. This was collected in conjunction with ground measurements and was used to examine how to create a customized flood early warning system. During the first year, the team made use of SensorWeb technology to gather various sensor data, which was used to monitor flood waves traveling down basins originating in Angola but eventually flooding villages in Namibia. The team made use of standardized interfaces such as those articulated under the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) set of web services [1][2]. However, it was discovered that in order to make a system like this functional, there were many performance issues. Data sets were large, located in a variety of locations behind firewalls, and had to be accessed across open networks, so security was an issue. Furthermore, the network access acted as a bottleneck to transfer map products to where they are needed. Finally, during disasters, many users and computer processes act in parallel, and thus it was very easy to overload the single string of computers stitched together in the virtual system that was initially developed. To address some of these performance issues, the team partnered with the Open Cloud Consortium (OCC), who supplied a Computation Cloud located at the University of Illinois at Chicago and some manpower to administer this Cloud. The Flood SensorWeb [3] system was interfaced to the Cloud to provide a high-performance user interface and product development engine. Figure 1 shows the functional diagram of the Flood SensorWeb. Figure 2 shows some of the functionality of the Computation Cloud that was integrated.
A significant portion of the original system was ported to the Cloud, and during the past year technical issues were resolved, including web access to the Cloud, security over the open Internet, initial experiments on how to handle surge capacity by using the virtual machines in the cloud in parallel, tiling techniques to render large data sets as layers on a map, interfaces to allow users to customize the data processing/product chain, and other performance-enhancing techniques. The conclusion reached from the effort and this presentation is that defining the interoperability standards is a small fraction of the work. For example, once open web service standards were defined, many users could not make use of the standards due to security restrictions. Furthermore, once an interoperable system is functional, a surge of users can render it unusable, especially in the disaster domain.
NASA Astrophysics Data System (ADS)
Bonanno, D.; Fraund, M. W.; Pham, D.; China, S.; Wang, B.; Laskin, A.; Gilles, M. K.; Moffet, R.
2017-12-01
The Holistic Interactions of Shallow Clouds, Aerosols, and Land-Ecosystems (HI-SCALE) Campaign was carried out to gain a better understanding of the life cycle of shallow clouds. The HI-SCALE experiment was designed to contrast two seasons, wet and dry, and determine their effect on atmospheric cloud and aerosol processes. The spring component of HI-SCALE was selected to characterize the mixing state of particles collected onto substrates. Sampling was performed to obtain airborne soil organic particles (ASOP), which are believed to be ejected following rain events. The unique composition of ASOP has been shown to affect optical properties. The collection of particles took place at the Atmospheric Radiation Measurement Southern Great Plains (ARM SGP) field site. The Scanning Transmission X-Ray Microscope (STXM) was used to image the samples collected during the first HI-SCALE Campaign and determine the carbonaceous mixing state. Scanning Electron Microscopy with Energy-Dispersive X-ray (SEM/EDX) analysis is more sensitive to the inorganic makeup of particles, while STXM renders a more comprehensive analysis of the organics. Measurements such as nephelometry and Particle Soot Absorption Photometry (PSAP) from the ARM archive are correlated with the microscopy measurements. The primary focus is the relation between the composition and morphology of ASOP and their optical properties.
The Route to Raindrop Formation in a Shallow Cumulus Cloud Simulated by a Lagrangian Cloud Model
NASA Astrophysics Data System (ADS)
Noh, Yign; Hoffmann, Fabian; Raasch, Siegfried
2017-11-01
The mechanism of raindrop formation in a shallow cumulus cloud is investigated using a Lagrangian cloud model (LCM). The analysis is focused on how and under which conditions a cloud droplet grows to a raindrop by tracking the history of individual Lagrangian droplets. It is found that the rapid collisional growth, leading to raindrop formation, is triggered when single droplets with a radius of 20 μm appear in the region near the cloud top, characterized by a large liquid water content, strong turbulence, large mean droplet size, a broad drop size distribution (DSD), and high supersaturations. Raindrop formation easily occurs when turbulence-induced collision enhancement (TICE) is considered, with or without any extra broadening of the DSD by another mechanism (such as entrainment and mixing). In contrast, when TICE is not considered, raindrop formation is severely delayed if no other broadening mechanism is active. The reason leading to the difference is clarified by the additional analysis of idealized box simulations of the collisional growth process for different DSDs in varied turbulent environments. It is found that TICE does not accelerate the timing of the raindrop formation for individual droplets, but it enhances the collisional growth rate significantly afterward. KMA R & D Program (Korea), DFG (Germany).
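To give a feel for how sensitive collisional growth is to an enhanced collision kernel, the sketch below integrates the textbook continuous-collection growth equation for a single collector droplet. This is a deliberately crude stand-in for the stochastic Lagrangian droplet tracking and box simulations in the study: the terminal-velocity constants are standard textbook approximations, the collection efficiency and the TICE factor of 3 are purely illustrative, and a uniform enhancement cannot reproduce the timing behavior discussed in the abstract.

```python
RHO_W = 1000.0          # liquid water density [kg m-3]

def terminal_velocity(r):
    """Approximate terminal fall speed [m s-1] of a droplet of radius r [m]
    (Stokes regime below ~40 um, linear regime above; textbook constants)."""
    return 1.19e8 * r**2 if r < 40e-6 else 8.0e3 * r

def time_to_drizzle(r0=20e-6, lwc=1.0e-3, e_coll=0.7, enhancement=1.0,
                    r_target=100e-6, dt=1.0):
    """Integrate the continuous-collection growth equation
        dR/dt = enhancement * E * LWC * u(R) / (4 rho_w)
    until the collector droplet reaches r_target; return elapsed time [s]."""
    r, t = r0, 0.0
    while r < r_target:
        r += enhancement * e_coll * lwc * terminal_velocity(r) / (4.0 * RHO_W) * dt
        t += dt
    return t

t_quiet = time_to_drizzle(enhancement=1.0)   # no turbulence enhancement
t_turb  = time_to_drizzle(enhancement=3.0)   # illustrative TICE-like factor
print(f"no enhancement: {t_quiet/60:.0f} min, with enhancement: {t_turb/60:.0f} min")
```

Even this simple model shows the order-of-magnitude leverage that a modest kernel enhancement has once a droplet reaches the ~20 μm threshold named in the abstract.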
MAPA: an interactive accelerator design code with GUI
NASA Astrophysics Data System (ADS)
Bruhwiler, David L.; Cary, John R.; Shasharina, Svetlana G.
1999-06-01
The MAPA code is an interactive accelerator modeling and design tool with an X/Motif GUI. MAPA has been developed in C++ and makes full use of object-oriented features. We present an overview of its features and describe how users can independently extend the capabilities of the entire application, including the GUI. For example, a user can define a new model for a focusing or accelerating element. If the appropriate form is followed, and the new element is "registered" with a single line in the specified file, then the GUI will fully support this user-defined element type after it has been compiled and then linked to the existing application. In particular, the GUI will bring up windows for modifying any relevant parameters of the new element type. At present, one can use the GUI for phase space tracking, finding fixed points and generating line plots for the Twiss parameters, the dispersion and the accelerator geometry. The user can define new types of simulations which the GUI will automatically support by providing a menu option to execute the simulation and subsequently rendering line plots of the resulting data.
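The "register once and the GUI picks it up" behavior described above is a classic registry pattern. The snippet below is a hypothetical Python sketch of that idea only; MAPA itself is written in C++ with an X/Motif GUI, and its actual registration file and class interfaces differ.

```python
# Hypothetical sketch of a one-line element registration; not the MAPA C++ API.
ELEMENT_REGISTRY = {}

def register_element(cls):
    """Class decorator: a single line per new element type makes it known to
    the rest of the application (parameter editors, tracking, plotting)."""
    ELEMENT_REGISTRY[cls.__name__] = cls
    return cls

@register_element
class Quadrupole:
    parameters = {"length": 0.5, "gradient": 10.0}   # editable in a GUI form

    def transfer_map(self, state):
        ...   # element-specific beam dynamics would go here

def build_parameter_window(element_name):
    """A GUI layer only needs the registry to offer every known element type."""
    cls = ELEMENT_REGISTRY[element_name]
    return {name: default for name, default in cls.parameters.items()}

print(build_parameter_window("Quadrupole"))
```

The point of the pattern is the one stated in the abstract: the GUI never has to be edited when a user adds an element type, because it enumerates whatever the registry contains.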
Graphics Processing Unit (GPU) Acceleration of the Goddard Earth Observing System Atmospheric Model
NASA Technical Reports Server (NTRS)
Putnam, William
2011-01-01
The Goddard Earth Observing System 5 (GEOS-5) is the atmospheric model used by the Global Modeling and Assimilation Office (GMAO) for a variety of applications, from long-term climate prediction at relatively coarse resolution, to data assimilation and numerical weather prediction, to very high-resolution cloud-resolving simulations. GEOS-5 is being ported to a graphics processing unit (GPU) cluster at the NASA Center for Climate Simulation (NCCS). By utilizing GPU co-processor technology, we expect to increase the throughput of GEOS-5 by at least an order of magnitude, and accelerate the process of scientific exploration across all scales of global modeling, including: (1) the large-scale, high-end application of non-hydrostatic, global, cloud-resolving modeling at 10- to 1-kilometer (km) global resolutions; (2) intermediate-resolution seasonal climate and weather prediction at 50- to 25-km on small clusters of GPUs; and (3) long-range, coarse-resolution climate modeling, enabled on a small box of GPUs for the individual researcher. After being ported to the GPU cluster, the primary physics components and the dynamical core of GEOS-5 have demonstrated a potential speedup of 15-40 times over conventional processor cores. Performance improvements of this magnitude reduce the required scalability of 1-km, global, cloud-resolving models from an unfathomable 6 million cores to an attainable 200,000 GPU-enabled cores.
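As a quick sanity check on the figures quoted above, the drop from roughly 6 million conventional cores to roughly 200,000 GPU-enabled cores implies a per-core speedup of about 30x, which sits inside the reported 15-40x range (numbers taken directly from the abstract; the arithmetic below is only an illustration).

```python
# Implied per-core speedup from the core counts quoted in the abstract.
cpu_cores, gpu_cores = 6_000_000, 200_000
print(cpu_cores / gpu_cores)   # -> 30.0, within the reported 15-40x range
```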
Parallel processing optimization strategy based on MapReduce model in cloud storage environment
NASA Astrophysics Data System (ADS)
Cui, Jianming; Liu, Jiayi; Li, Qiuyan
2017-05-01
Currently, a large number of files in cloud storage are transferred by packaging them only after all packets have been received. Between the local client and the server, this packing and unpacking consumes a lot of time, and the transmission efficiency is low as well. A new parallel processing algorithm is proposed to optimize the transmission mode. Following the MapReduce model, MPI technology is used to execute the Mapper and Reducer mechanisms in parallel. In simulation experiments on a Hadoop cloud computing platform, the algorithm not only accelerates the file transfer rate but also shortens the waiting time of the Reducer mechanism. It breaks through the traditional sequential transmission constraint and reduces storage coupling to improve transmission efficiency.
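The core idea, processing file chunks in parallel and letting a reducer consume them as they complete rather than waiting for the whole file to be packaged, can be illustrated without an MPI runtime. The sketch below is not the paper's MPI/Hadoop implementation; it uses Python's multiprocessing as a portable stand-in, and the per-chunk checksum is just a placeholder for whatever work happens on the transfer path.

```python
import hashlib
import sys
from multiprocessing import Pool

CHUNK = 1 << 20   # 1 MiB chunks

def map_chunk(args):
    """Mapper stand-in: per-chunk work done independently and in parallel."""
    index, data = args
    return index, hashlib.md5(data).hexdigest(), len(data)

def transfer(path):
    # Split the file into chunks up front (a real system would stream them).
    chunks, i = [], 0
    with open(path, "rb") as fh:
        while True:
            data = fh.read(CHUNK)
            if not data:
                break
            chunks.append((i, data))
            i += 1
    results = []
    with Pool() as pool:
        # imap_unordered lets the "reducer" pick up each chunk as soon as it
        # finishes, instead of waiting for the whole file to be packaged.
        for index, digest, size in pool.imap_unordered(map_chunk, chunks):
            results.append((index, digest, size))
    results.sort()             # reducer restores the original chunk order
    return results

if __name__ == "__main__":
    for row in transfer(sys.argv[1]):
        print(row)
```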
Electron-Cloud Build-Up: Theory and Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furman, M. A.
We present a broad-brush survey of the phenomenology, history and importance of the electron-cloud effect (ECE). We briefly discuss the simulation techniques used to quantify the electron-cloud (EC) dynamics. Finally, we present in more detail an effective theory to describe the EC density build-up in terms of a few effective parameters. For further details, the reader is encouraged to refer to the proceedings of many prior workshops, either dedicated to EC or with significant EC contents, including the entire 'ECLOUD' series. In addition, the proceedings of the various flavors of Particle Accelerator Conferences contain a large number of EC-related publications. The ICFA Beam Dynamics Newsletter series contains one dedicated issue, and several occasional articles, on EC. An extensive reference database is the LHC website on EC.
Knowledge and Technology: Sharing With Society
NASA Astrophysics Data System (ADS)
Benvenuti, Cristoforo; Sutton, Christine; Wenninger, Horst
The following sections are included: * A Core Mission of CERN * Medical Accelerators: A Tool for Tumour Therapy * Medipix: The Image is the Message * Crystal Clear: From Higgs to PET * Solar Collectors: When Nothing is Better * The TARC Experiment at CERN: Modern Alchemy * A CLOUD Chamber with a Silvery Lining * References
DOE Office of Scientific and Technical Information (OSTI.GOV)
Backfish, Michael
This paper documents the use of four retarding field analyzers (RFAs) to measure electron cloud signals created in Fermilab’s Main Injector during 120 GeV operations. The first data set was taken from September 11, 2009 to July 4, 2010. This data set is used to compare two different types of beam pipe that were installed in the accelerator. Two RFAs were installed in a normal steel beam pipe like the rest of the Main Injector, while another two were installed in a one meter section of beam pipe that was coated on the inside with titanium nitride (TiN). A second data run started on August 23, 2010 and ended on January 10, 2011, when Main Injector beam intensities were reduced, thus eliminating the electron cloud. This second run uses the same RFA setup, but the TiN coated beam pipe was replaced by a one meter section coated with amorphous carbon (aC). This section of beam pipe was provided by CERN in an effort to better understand how an aC coating will perform over time in an accelerator. The research consists of three basic parts: (a) continuous monitoring of the conditioning of the three different types of beam pipe over both time and absorbed electrons; (b) measurement of the characteristics of the surrounding magnetic fields in the Main Injector in order to better relate actual data observed in the Main Injector with that of simulations; and (c) measurement of the energy spectrum of the electron cloud signals using retarding field analyzers in all three types of beam pipe.
NASA Astrophysics Data System (ADS)
Loftus, Adrian; Tsay, Si-Chee; Nguyen, Xuan Anh
2016-04-01
Low-level stratocumulus (Sc) clouds cover more of the Earth's surface than any other cloud type, rendering them critical for Earth's energy balance, primarily via reflection of solar radiation, as well as for their role in the global hydrological cycle. Stratocumuli are particularly sensitive to changes in aerosol loading on both microphysical and macrophysical scales, yet the complex feedbacks involved in aerosol-cloud-precipitation interactions remain poorly understood. Moreover, research on these clouds has largely been confined to marine environments, with far fewer studies over land where major sources of anthropogenic aerosols exist. The aerosol burden over Southeast Asia (SEA) in boreal spring, attributed to biomass burning (BB), exhibits highly consistent spatiotemporal distribution patterns, with major variability due to changes in aerosol loading mediated by processes ranging from large-scale climate factors to diurnal meteorological events. Downwind from source regions, the transported BB aerosols often overlap with low-level Sc cloud decks associated with the development of the region's pre-monsoon system, providing a unique, natural laboratory for further exploring their complex micro- and macro-scale relationships. Compared to other locations worldwide, studies of springtime biomass-burning aerosols and the predominantly Sc cloud systems over SEA and their ensuing interactions are underrepresented in the scientific literature. Measurements of aerosol and cloud properties, whether ground-based or from satellites, generally lack information on microphysical processes; thus cloud-resolving models are often employed to simulate the underlying physical processes in aerosol-cloud-precipitation interactions. The Goddard Cumulus Ensemble (GCE) cloud model has recently been enhanced with a triple-moment (3M) bulk microphysics scheme as well as the Regional Atmospheric Modeling System (RAMS) version 6 aerosol module. Because the aerosol burden not only affects cloud droplet size and number concentration, but also the spectral width of the cloud droplet size distribution, the 3M scheme is well suited to simulate aerosol-cloud-precipitation interactions within a three-dimensional regional cloud model. Moreover, the additional variability predicted on the hydrometeor distributions provides beneficial input for forward models to link the simulated microphysical processes with observations as well as to assess both ground-based and satellite retrieval methods. In this presentation, we provide an overview of the 7 South East Asian Studies / Biomass-burning Aerosols and Stratocumulus Environment: Lifecycles and Interactions Experiment (7-SEAS/BASELInE) operations during the spring of 2013. Preliminary analyses of pre-monsoon Sc system lifecycles observed during the first-ever deployment of a ground-based cloud radar to northern Vietnam will also be presented. Initial results from GCE model simulations of these Sc using double-moment and the new 3M bulk microphysics schemes under various aerosol loadings will be used to showcase the 3M scheme as well as provide insight into how the impact of aerosols on cloud and precipitation processes in stratocumulus over land may manifest itself in simulated remote-sensing signals. Applications and future work involving ongoing 7-SEAS campaigns aimed at improving our understanding of aerosol-cloud-precipitation interactions will also be discussed.
Particle-in-cell simulations of the critical ionization velocity effect in finite size clouds
NASA Technical Reports Server (NTRS)
Moghaddam-Taaheri, E.; Lu, G.; Goertz, C. K.; Nishikawa, K. - I.
1994-01-01
The critical ionization velocity (CIV) mechanism in a finite size cloud is studied with a series of electrostatic particle-in-cell simulations. It is observed that an initial seed ionization, produced by non-CIV mechanisms, generates a cross-field ion beam which excites a modified beam-plasma instability (MBPI) with frequency in the range of the lower hybrid frequency. The excited waves accelerate electrons along the magnetic field up to the ion drift energy, which exceeds the ionization energy of the neutral atoms. The heated electrons in turn enhance the ion beam by electron-neutral impact ionization, which establishes a positive feedback loop in maintaining the CIV process. It is also found that the efficiency of the CIV mechanism depends on the finite size of the gas cloud in the following ways: (1) Along the ambient magnetic field the finite size of the cloud, L_parallel, restricts the growth of the fastest growing mode, with wavelength lambda_m_parallel, of the MBPI. The parallel electron heating at wave saturation scales approximately as (L_parallel/lambda_m_parallel)^(1/2). (2) Momentum coupling between the cloud and the ambient plasma via the Alfven waves occurs as a result of the finite size of the cloud in the direction perpendicular to both the ambient magnetic field and the neutral drift. This reduces exponentially with time the relative drift between the ambient plasma and the neutrals. The timescale is inversely proportional to the Alfven velocity. (3) The transverse charge-separation field across the cloud was found to result in the modulation of the beam velocity, which reduces the parallel heating of electrons and increases the transverse acceleration of electrons. (4) Some energetic electrons are lost from the cloud along the magnetic field at a rate characterized by the acoustic velocity, instead of the electron thermal velocity. The loss of energetic electrons from the cloud seems to be larger in the direction of plasma drift relative to the neutrals, where the loss rate is characterized by the neutral drift velocity. It is also shown that a factor of 4 increase in the ambient plasma density increases the CIV ionization yield by almost 2 orders of magnitude at the end of a typical run. It is concluded that a larger ambient plasma density can result in a larger CIV yield because of (1) larger seed ion production by non-CIV mechanisms, (2) smaller Alfven velocity and hence weak momentum coupling, and (3) smaller ratio of the ion beam density to the ambient ion density, and therefore a weaker modulation of the beam velocity. The simulation results are used to interpret various chemical release experiments in space.
NASA Astrophysics Data System (ADS)
Basso, Tessa Chiara; Iovieno, Michele; Bertoldo, Silvano; Perotto, Giovanni; Athanassiou, Athanassia; Canavero, Flavio; Perona, Giovanni; Tordella, Daniela
2017-11-01
An introduction to innovative, bio-compatible, ultralight, disposable radiosondes that are aimed to be passively transported on isopycnic surfaces in cloud and clear air environments. Their goal is to track small-scale fluctuations of velocity, temperature, humidity, acceleration and pressure for several hours within and outside the cloud boundary. With a target weight of 15 g, the volume is chosen such that the probes float on isopycnic surfaces at constant altitudes from 1000 to 3000 m. They are filled with helium gas to obtain a buoyancy force equal to the weight of the system. Transmitters within the probes will send data to receivers on Earth to be analysed and compared with numerical simulations. To minimise their environmental impact, it is foreseen that the disposable radiosondes be made with biodegradable smart materials which keep the desired hydrophobicity and flexibility. These environmentally friendly, hydrophobic balloons will provide an insight into the unsteady life cycle of warm clouds over land, ocean and alpine environments. These explorative observations will contribute to the current understanding of microphysical processes in clouds with the purpose of improving weather prediction and climate modelling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Hao; Garzoglio, Gabriele; Ren, Shangping
FermiCloud is a private cloud developed at Fermi National Accelerator Laboratory to provide elastic and on-demand resources for different scientific research experiments. The design goal of FermiCloud is to automatically allocate resources for different scientific applications so that the QoS required by these applications is met and the operational cost of FermiCloud is minimized. Our earlier research shows that VM launching overhead has large variations. If such variations are not taken into consideration when making resource allocation decisions, they may lead to poor performance and resource waste. In this paper, we show how a VM launching overhead reference model may be used to minimize VM launching overhead. In particular, we first present a training algorithm that automatically tunes a given reference model to accurately reflect the FermiCloud environment. Based on the tuned reference model for virtual machine launching overhead, we develop an overhead-aware best-fit resource allocation algorithm that decides where and when to allocate resources so that the average virtual machine launching overhead is minimized. The experimental results indicate that the developed overhead-aware best-fit resource allocation algorithm can significantly improve VM launching time when a large number of VMs are launched simultaneously.
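A placement decision of the kind described above can be sketched very compactly. The snippet below is an illustrative Python sketch under assumptions of my own: the "reference model" is reduced to a toy linear function of concurrent launches, and the host names and numbers are invented; the paper's trained model and scheduler are more elaborate.

```python
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    free_cores: int
    launching_vms: int      # VMs currently booting on this host

def predicted_overhead(host, base=30.0, per_launch=12.0):
    """Toy reference model: launch time grows with concurrent launches."""
    return base + per_launch * host.launching_vms

def place(vm_cores, hosts):
    """Among hosts that fit the request, prefer the lowest predicted launch
    overhead; break ties with classic best fit (smallest leftover cores)."""
    candidates = [h for h in hosts if h.free_cores >= vm_cores]
    if not candidates:
        return None   # defer the launch ("when" as well as "where")
    return min(candidates,
               key=lambda h: (predicted_overhead(h), h.free_cores - vm_cores))

hosts = [Host("hostA", 8, 3), Host("hostB", 4, 0), Host("hostC", 16, 1)]
chosen = place(vm_cores=4, hosts=hosts)
print(chosen.name)   # hostB: fits, no concurrent launches, tightest fit
```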
Decreasing cloud cover drives the recent mass loss on the Greenland Ice Sheet.
Hofer, Stefan; Tedstone, Andrew J; Fettweis, Xavier; Bamber, Jonathan L
2017-06-01
The Greenland Ice Sheet (GrIS) has been losing mass at an accelerating rate since the mid-1990s. This has been due to both increased ice discharge into the ocean and melting at the surface, with the latter being the dominant contribution. This change in state has been attributed to rising temperatures and a decrease in surface albedo. We show, using satellite data and climate model output, that the abrupt reduction in surface mass balance since about 1995 can be attributed largely to a coincident trend of decreasing summer cloud cover enhancing the melt-albedo feedback. Satellite observations show that, from 1995 to 2009, summer cloud cover decreased by 0.9 ± 0.3% per year. Model output indicates that the GrIS summer melt increases by 27 ± 13 gigatons (Gt) per percent reduction in summer cloud cover, principally because of the impact of increased shortwave radiation over the low albedo ablation zone. The observed reduction in cloud cover is strongly correlated with a state shift in the North Atlantic Oscillation promoting anticyclonic conditions in summer and suggests that the enhanced surface mass loss from the GrIS is driven by synoptic-scale changes in Arctic-wide atmospheric circulation.
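Combining the two numbers quoted in the abstract gives a rough sense of scale; the back-of-envelope arithmetic below is only an illustration of how they fit together, not the paper's own attribution calculation.

```python
# Back-of-envelope combination of the trend and sensitivity quoted above.
trend = 0.9          # % summer cloud-cover decrease per year, 1995-2009
sensitivity = 27.0   # Gt extra summer melt per % cloud-cover reduction
years = 2009 - 1995
total_reduction = trend * years                 # ~12.6 % less summer cloud
extra_melt = sensitivity * total_reduction      # ~340 Gt more summer melt
print(f"{total_reduction:.1f} % less cloud -> ~{extra_melt:.0f} Gt more summer melt")
```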
NASA Astrophysics Data System (ADS)
Chun, Won-Suk; Napoli, Joshua; Cossairt, Oliver S.; Dorval, Rick K.; Hall, Deirdre M.; Purtell, Thomas J., II; Schooler, James F.; Banker, Yigal; Favalora, Gregg E.
2005-03-01
We present a software and hardware foundation to enable the rapid adoption of 3-D displays. Different 3-D displays - such as multiplanar, multiview, and electroholographic displays - naturally require different rendering methods. The adoption of these displays in the marketplace will be accelerated by a common software framework. The authors designed the SpatialGL API, a new rendering framework that unifies these display methods under one interface. SpatialGL enables complementary visualization assets to coexist through a uniform infrastructure. Also, SpatialGL supports legacy interfaces such as the OpenGL API. The authors' first implementation of SpatialGL uses multiview and multislice rendering algorithms to exploit the performance of modern graphics processing units (GPUs) to enable real-time visualization of 3-D graphics from medical imaging, oil & gas exploration, and homeland security. At the time of writing, SpatialGL runs on COTS workstations (both Windows and Linux) and on Actuality's high-performance embedded computational engine that couples an NVIDIA GeForce 6800 Ultra GPU, an AMD Athlon 64 processor, and a proprietary, high-speed, programmable volumetric frame buffer that interfaces to a 1024 x 768 x 3 digital projector. Progress is illustrated using an off-the-shelf multiview display, Actuality's multiplanar Perspecta Spatial 3D System, and an experimental multiview display. The experimental display is a quasi-holographic view-sequential system that generates aerial imagery measuring 30 mm x 25 mm x 25 mm, providing 198 horizontal views.
Plasma development in the accelerator of a 2-kJ focus discharge.
Fischer, H; Haering, K H
1979-07-01
Optical image structures from early breakdown (~200 nsec) to focus formation (~1300 nsec) in 3 Torr hydrogen were studied by means of two image-converter shutters with 50-nsec and 10-nsec exposures. Space-charge-limited cathode spots at the outer electrode (OE), spoke-shaped positive columns across the gap, and an extended electron cloud along the center electrode (CE) determine the current flow during early breakdown. Ionization increases exponentially within the center-gap plasma, which is separated from the CE by a pattern of anode-drop filaments. Filament structures grow into the z-axis accelerated current sheath, which in addition carries the early spoke pattern. The sheath appears homogeneous after leaving the accelerator exit.
Declarative language design for interactive visualization.
Heer, Jeffrey; Bostock, Michael
2010-01-01
We investigate the design of declarative, domain-specific languages for constructing interactive visualizations. By separating specification from execution, declarative languages can simplify development, enable unobtrusive optimization, and support retargeting across platforms. We describe the design of the Protovis specification language and its implementation within an object-oriented, statically-typed programming language (Java). We demonstrate how to support rich visualizations without requiring a toolkit-specific data model and extend Protovis to enable declarative specification of animated transitions. To support cross-platform deployment, we introduce rendering and event-handling infrastructures decoupled from the runtime platform, letting designers retarget visualization specifications (e.g., from desktop to mobile phone) with reduced effort. We also explore optimizations such as runtime compilation of visualization specifications, parallelized execution, and hardware-accelerated rendering. We present benchmark studies measuring the performance gains provided by these optimizations and compare performance to existing Java-based visualization tools, demonstrating scalability improvements exceeding an order of magnitude.
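The central claim above, that separating the declarative specification from its execution enables optimization and retargeting, can be shown with a toy example. The sketch below is not Protovis and invents its own tiny spec format; it only illustrates how one specification can drive two different back ends.

```python
# Toy illustration of "specification separate from execution": the same spec
# is handed to two renderers, mirroring the retargeting idea in the abstract.
spec = {
    "mark": "bar",
    "data": [4, 7, 2, 9],
    "encode": {"height": lambda d: d * 10, "label": lambda d: str(d)},
}

def render_svg(spec):
    """One execution back end: emit SVG rectangles."""
    bars = [
        f'<rect x="{i * 22}" y="0" width="20" height="{spec["encode"]["height"](d)}"/>'
        for i, d in enumerate(spec["data"])
    ]
    return "<svg>" + "".join(bars) + "</svg>"

def render_text(spec):
    """Another back end: a terminal-friendly bar chart from the same spec."""
    return "\n".join(
        spec["encode"]["label"](d).rjust(2) + " " + "#" * d for d in spec["data"]
    )

print(render_svg(spec))    # one target...
print(render_text(spec))   # ...and another, from the identical specification
```

Because the spec is pure data plus encodings, a runtime is free to compile, parallelize, or hardware-accelerate its execution without the author changing the specification, which is the optimization argument made in the abstract.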
Chromium: A Stress-Processing Framework for Interactive Rendering on Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphreys, G,; Houston, M.; Ng, Y.-R.
2002-01-11
We describe Chromium, a system for manipulating streams of graphics API commands on clusters of workstations. Chromium's stream filters can be arranged to create sort-first and sort-last parallel graphics architectures that, in many cases, support the same applications while using only commodity graphics accelerators. In addition, these stream filters can be extended programmatically, allowing the user to customize the stream transformations performed by nodes in a cluster. Because our stream processing mechanism is completely general, any cluster-parallel rendering algorithm can be either implemented on top of or embedded in Chromium. In this paper, we give examples of real-world applications that use Chromium to achieve good scalability on clusters of workstations, and describe other potential uses of this stream processing technology. By completely abstracting the underlying graphics architecture, network topology, and API command processing semantics, we allow a variety of applications to run in different environments.
Aerosol-cloud interactions in a multi-scale modeling framework
NASA Astrophysics Data System (ADS)
Lin, G.; Ghan, S. J.
2017-12-01
Atmospheric aerosols play an important role in changing the Earth's climate through scattering/absorbing solar and terrestrial radiation and interacting with clouds. However, quantification of the aerosol effects remains one of the most uncertain aspects of current and future climate projection. Much of the uncertainty results from the multi-scale nature of aerosol-cloud interactions, which is very challenging to represent in traditional global climate models (GCMs). In contrast, the multi-scale modeling framework (MMF) provides a viable solution, which explicitly resolves clouds and precipitation in the cloud-resolving model (CRM) embedded in each GCM grid column. In the MMF version of the Community Atmosphere Model version 5 (CAM5), aerosol processes are treated with a parameterization called Explicit Clouds Parameterized Pollutants (ECPP). It uses the cloud/precipitation statistics derived from the CRM to treat the cloud processing of aerosols on the GCM grid. However, this approach treats clouds on the CRM grid but aerosols on the GCM grid, which is inconsistent with the reality that cloud-aerosol interactions occur on the cloud scale. To overcome this limitation, we propose here a new aerosol treatment in the MMF: Explicit Clouds Explicit Aerosols (ECEP), in which we resolve both clouds and aerosols explicitly on the CRM grid. We first applied the MMF with ECPP to the Accelerated Climate Modeling for Energy (ACME) model to obtain an MMF version of ACME. Further, we also developed an alternative version of ACME-MMF with ECEP. Based on these two models, we have conducted two simulations: one with ECPP and the other with ECEP. Preliminary results showed that the ECEP simulations tend to predict higher aerosol concentrations than the ECPP simulations, because of more efficient vertical transport from the surface to the upper atmosphere but less efficient wet removal. We also found that the cloud droplet number concentrations differ between the two simulations due to the difference in cloud droplet lifetime. Next, we will explore how the ECEP treatment affects the anthropogenic aerosol forcing, particularly the aerosol indirect forcing, by comparing present-day and pre-industrial simulations.
EyeMIAS: a cloud-based ophthalmic image reading and auxiliary diagnosis system
NASA Astrophysics Data System (ADS)
Wu, Di; Zhao, Heming; Yu, Kai; Chen, Xinjian
2018-03-01
Relying solely on ophthalmic equipment cannot meet present health needs. It is urgent to find an efficient way to provide quick screening and early diagnosis of diabetic retinopathy and other ophthalmic diseases. The purpose of this study is to develop a cloud-based system for storing, viewing, and processing medical images, especially ophthalmic images, and to accelerate screening and diagnosis. To this end, a system comprising a web application, an upload client, storage support, and algorithm support was implemented. After five alpha tests, the system sustained thousands of high-traffic accesses and generated hundreds of diagnostic reports.
Fast Transverse Beam Instability Caused by Electron Cloud Trapped in Combined Function Magnets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antipov, Sergey
Electron cloud instabilities affect the performance of many circular high-intensity particle accelerators. They usually have a fast growth rate and might lead to an increase of the transverse emittance and to beam loss. A peculiar example of such an instability is observed in the Fermilab Recycler proton storage ring. Although this instability might pose a challenge for future intensity upgrades, its nature had not been completely understood. The phenomenon has been studied experimentally by comparing the dynamics of stable and unstable beams, numerically by simulating the build-up of the electron cloud and its interaction with the beam, and analytically by constructing a model of an electron-cloud-driven instability with the electrons trapped in combined function dipoles. Stabilization of the beam by a clearing bunch reveals that the instability is caused by the electron cloud trapped in beam optics magnets. Measurements of microwave propagation confirm the presence of the cloud in the combined function dipoles. Numerical simulations show that up to 10^-2 of the particles can be trapped by their magnetic field. Since the process of electron cloud build-up is exponential, once trapped this amount of electrons significantly increases the density of the cloud on the next revolution. In a combined function dipole this multi-turn accumulation allows the electron cloud to reach final intensities orders of magnitude greater than in a pure dipole. The estimated fast instability growth rate of about 30 revolutions and low mode frequency of 0.4 MHz are consistent with experimental observations and agree with the simulations. The created instability model allows investigating the beam stability for future intensity upgrades.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saide P. E.; Springston S.; Spak, S. N.
2012-03-29
We evaluate a regional-scale simulation with the WRF-Chem model for the VAMOS (Variability of the American Monsoon Systems) Ocean-Cloud-Atmosphere-Land Study Regional Experiment (VOCALS-REx), which sampled the Southeast Pacific's persistent stratocumulus deck. Evaluation of VOCALS-REx ship-based and three aircraft observations focuses on analyzing how aerosol loading affects marine boundary layer (MBL) dynamics and cloud microphysics. We compare local time series and campaign-averaged longitudinal gradients, and highlight differences in model simulations with (W) and without (NW) wet deposition processes. The higher aerosol loadings in the NW case produce considerable changes in MBL dynamics and cloud microphysics, in accordance with the established conceptual model of aerosol indirect effects. These include increase in cloud albedo, increase in MBL and cloud heights, drizzle suppression, increase in liquid water content, and increase in cloud lifetime. Moreover, better statistical representation of aerosol mass and number concentration improves model fidelity in reproducing observed spatial and temporal variability in cloud properties, including top and base height, droplet concentration, water content, rain rate, optical depth (COD) and liquid water path (LWP). Together, these help to quantify confidence in WRF-Chem's modeled aerosol-cloud interactions, especially in the activation parameterization, while identifying structural and parametric uncertainties including: irreversibility in rain wet removal; overestimation of marine DMS and sea salt emissions, and accelerated aqueous sulfate conversion. Our findings suggest that WRF-Chem simulates marine cloud-aerosol interactions at a level sufficient for applications in forecasting weather and air quality and studying aerosol climate forcing, and may do so with the reliability required for policy analysis.
O'Brien, Caroline C; Kolandaivelu, Kumaran; Brown, Jonathan; Lopes, Augusto C; Kunio, Mie; Kolachalama, Vijaya B; Edelman, Elazer R
2016-01-01
Stacking cross-sectional intravascular images permits three-dimensional rendering of endovascular implants, yet introduces between-frame uncertainties that limit characterization of device placement and the hemodynamic microenvironment. In a porcine coronary stent model, we demonstrate enhanced OCT reconstruction with preservation of between-frame features through fusion with angiography and a priori knowledge of stent design. Strut positions were extracted from sequential OCT frames. Reconstruction with standard interpolation generated discontinuous stent structures. By computationally constraining interpolation to known stent skeletons fitted to 3D 'clouds' of OCT-Angio-derived struts, implant anatomy was resolved, accurately rendering features from implant diameter and curvature (n = 1 vessels, r2 = 0.91, 0.90, respectively) to individual strut-wall configurations (average displacement error ~15 μm). This framework facilitated hemodynamic simulation (n = 1 vessel), showing the critical importance of accurate anatomic rendering in characterizing both quantitative and basic qualitative flow patterns. Discontinuities with standard approaches systematically introduced noise and bias, poorly capturing regional flow effects. In contrast, the enhanced method preserved multi-scale (local strut to regional stent) flow interactions, demonstrating the impact of regional contexts in defining the hemodynamic consequence of local deployment errors. Fusion of planar angiography and knowledge of device design permits enhanced OCT image analysis of in situ tissue-device interactions. Given emerging interests in simulation-derived hemodynamic assessment as surrogate measures of biological risk, such fused modalities offer a new window into patient-specific implant environments.
NASA Astrophysics Data System (ADS)
Anstey, Josephine; Pape, Dave
2013-03-01
In this paper we discuss Mrs. Squandertime, a real-time, persistent simulation of a virtual character, her living room, and the view from her window, designed to be a wall-size, projected art installation. Through her large picture window, the eponymous Mrs. Squandertime watches the sea: boats, clouds, gulls, the tide going in and out, people on the sea wall. The hundreds of images that compose the view are drawn from historical printed sources. The program that assembles and animates these images is driven by weather, time, and tide data constantly updated from a real physical location. The character herself is rendered photographically in a series of slowly dissolving stills which correspond to the character's current behavior.
NASA Technical Reports Server (NTRS)
Markson, R.; Anderson, B.; Govaert, J.; Fairall, C. W.
1989-01-01
A novel coronal current-determining instrument is being used at NASA-KSC which overcomes previous difficulties with wind sensitivity and a voltage-threshold 'deadband'. The mounting of the corona needle at an elevated location reduces coronal and electrode layer space-charge influences on electric fields, rendering the measurement of space charge density possible. In conjunction with a space-charge compensation model, these features allow a more realistic estimation of cloud base electric fields and the potential for lightning strike than has previously been possible with ground-based sensors.
Now and Next-Generation Sequencing Techniques: Future of Sequence Analysis Using Cloud Computing
Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav
2012-01-01
Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed “cloud computing”) has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows. PMID:23248640
Now and next-generation sequencing techniques: future of sequence analysis using cloud computing.
Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav
2012-01-01
Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed "cloud computing") has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows.
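The workflow described above (provision a virtual environment, run the analysis, retrieve the results, then delete the environment so billing stops) maps naturally onto a resource-managing pattern in code. The sketch below is schematic only: the provider class and its provision/submit/fetch/destroy methods are placeholders, not calls from any real cloud SDK, and the genomics command is just an illustrative example.

```python
from contextlib import contextmanager

class CloudProvider:
    """Hypothetical provider facade; a real system would call a vendor SDK."""
    def provision(self, cpus, storage_gb):
        print(f"creating VM: {cpus} cores, {storage_gb} GB"); return "vm-001"
    def submit(self, vm, command):
        print(f"{vm}: running {command}"); return "results.tar.gz"
    def fetch(self, vm, artifact):
        print(f"{vm}: downloading {artifact}")
    def destroy(self, vm):
        print(f"{vm}: deleted (billing stops)")

@contextmanager
def cloud_environment(provider, cpus, storage_gb):
    """Create the environment for the task and guarantee its removal after."""
    vm = provider.provision(cpus, storage_gb)
    try:
        yield vm
    finally:
        provider.destroy(vm)

provider = CloudProvider()
with cloud_environment(provider, cpus=16, storage_gb=500) as vm:
    artifact = provider.submit(vm, "blastn -query reads.fa -db nt")  # example task
    provider.fetch(vm, artifact)
```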
Condensed-phase biogenic-anthropogenic interactions with implications for cold cloud formation
Charnawskas, Joseph C.; Alpert, Peter A.; Lambe, Andrew; ...
2017-01-24
Anthropogenic and biogenic gas emissions contribute to the formation of secondary organic aerosol (SOA). When present, soot particles from fossil-fuel combustion can acquire a coating of SOA. We investigate SOA-soot biogenic-anthropogenic interactions and their impact on ice nucleation in relation to the particles’ organic phase state. SOA particles were generated from the OH oxidation of naphthalene, α-pinene, longifolene, or isoprene, with or without presence of sulfate or soot particles. Corresponding particle glass transition (Tg) and full deliquescence relative humidity (FDRH) were estimated by a numerical diffusion model. Longifolene SOA particles are solid-like and all biogenic SOA sulfate mixtures exhibit a core-shell configuration (i.e. a sulfate-rich core coated with SOA). Biogenic SOA with or without sulfate formed ice at conditions expected for homogeneous ice nucleation in agreement with respective Tg and FDRH. α-pinene SOA coated soot particles nucleated ice above the homogeneous freezing temperature with soot acting as ice nuclei (IN). At lower temperatures the α-pinene SOA coating can be semisolid inducing ice nucleation. Naphthalene SOA coated soot particles acted as IN above and below the homogeneous freezing limit, which can be explained by the presence of a highly viscous SOA phase. Our results suggest that biogenic SOA does not play a significant role in mixed-phase cloud formation and the presence of sulfate further renders this even less likely. Furthermore, anthropogenic SOA may have an enhancing effect on cloud glaciation under mixed-phase and cirrus cloud conditions compared to biogenic SOA that dominate during preindustrial times or in pristine areas.
Condensed-phase biogenic–anthropogenic interactions with implications for cold cloud formation
Charnawskas, Joseph C.; Alpert, Peter A.; Lambe, Andrew T.; ...
2017-01-24
Anthropogenic and biogenic gas emissions contribute to the formation of secondary organic aerosol (SOA). When present, soot particles from fossil fuel combustion can acquire a coating of SOA. We investigate SOA–soot biogenic–anthropogenic interactions and their impact on ice nucleation in relation to the particles’ organic phase state. SOA particles were generated from the OH oxidation of naphthalene, α-pinene, longifolene, or isoprene, with or without the presence of sulfate or soot particles. Corresponding particle glass transition (Tg) and full deliquescence relative humidity (FDRH) were estimated using a numerical diffusion model. Longifolene SOA particles are solid-like and all biogenic SOA sulfate mixtures exhibit a core–shell configuration (i.e. a sulfate-rich core coated with SOA). Biogenic SOA with or without sulfate formed ice at conditions expected for homogeneous ice nucleation, in agreement with respective Tg and FDRH. α-pinene SOA coated soot particles nucleated ice above the homogeneous freezing temperature with soot acting as ice nuclei (IN). At lower temperatures the α-pinene SOA coating can be semisolid, inducing ice nucleation. Naphthalene SOA coated soot particles acted as ice nuclei above and below the homogeneous freezing limit, which can be explained by the presence of a highly viscous SOA phase. Our results suggest that biogenic SOA does not play a significant role in mixed-phase cloud formation and the presence of sulfate renders this even less likely. However, anthropogenic SOA may have an enhancing effect on cloud glaciation under mixed-phase and cirrus cloud conditions compared to biogenic SOA that dominate during pre-industrial times or in pristine areas.
Condensed-phase biogenic-anthropogenic interactions with implications for cold cloud formation.
Charnawskas, Joseph C; Alpert, Peter A; Lambe, Andrew T; Berkemeier, Thomas; O'Brien, Rachel E; Massoli, Paola; Onasch, Timothy B; Shiraiwa, Manabu; Moffet, Ryan C; Gilles, Mary K; Davidovits, Paul; Worsnop, Douglas R; Knopf, Daniel A
2017-08-24
Anthropogenic and biogenic gas emissions contribute to the formation of secondary organic aerosol (SOA). When present, soot particles from fossil fuel combustion can acquire a coating of SOA. We investigate SOA-soot biogenic-anthropogenic interactions and their impact on ice nucleation in relation to the particles' organic phase state. SOA particles were generated from the OH oxidation of naphthalene, α-pinene, longifolene, or isoprene, with or without the presence of sulfate or soot particles. Corresponding particle glass transition (Tg) and full deliquescence relative humidity (FDRH) were estimated using a numerical diffusion model. Longifolene SOA particles are solid-like and all biogenic SOA sulfate mixtures exhibit a core-shell configuration (i.e. a sulfate-rich core coated with SOA). Biogenic SOA with or without sulfate formed ice at conditions expected for homogeneous ice nucleation, in agreement with respective Tg and FDRH. α-pinene SOA coated soot particles nucleated ice above the homogeneous freezing temperature with soot acting as ice nuclei (IN). At lower temperatures the α-pinene SOA coating can be semisolid, inducing ice nucleation. Naphthalene SOA coated soot particles acted as ice nuclei above and below the homogeneous freezing limit, which can be explained by the presence of a highly viscous SOA phase. Our results suggest that biogenic SOA does not play a significant role in mixed-phase cloud formation and the presence of sulfate renders this even less likely. However, anthropogenic SOA may have an enhancing effect on cloud glaciation under mixed-phase and cirrus cloud conditions compared to biogenic SOA that dominate during pre-industrial times or in pristine areas.
NASA Astrophysics Data System (ADS)
Zalogin, Stanislav M.; Zalogin, M. S.
1997-02-01
The construction of a control algorithm for following the information track of an optical record carrier in an OEST, based on the use of accelerations, is considered. Such control algorithms give the designed system adaptability and low sensitivity to changes in system parameters and to disturbing forces, which is an advantage for information carriers operating in harsh climatic conditions as well as under misalignment, workpiece wear, and changes of friction in the system. The dynamic characteristics of a closed OEST are investigated, and it is shown that the designed stable system with the given quality indices is highly precise. The validated recommendations for the design of control-algorithm parameters are confirmed by results of mathematical simulation of the controlled processes. The proposed methods for OEST synthesis based on the control-acceleration principle can be recommended for use in the industrial production of optical information record carriers.
GPU accelerated particle visualization with Splotch
NASA Astrophysics Data System (ADS)
Rivi, M.; Gheller, C.; Dykes, T.; Krokos, M.; Dolag, K.
2014-07-01
Splotch is a rendering algorithm for exploration and visual discovery in particle-based datasets coming from astronomical observations or numerical simulations. The strengths of the approach are production of high quality imagery and support for very large-scale datasets through an effective mix of the OpenMP and MPI parallel programming paradigms. This article reports our experiences in re-designing Splotch to exploit emerging HPC architectures, which are increasingly populated with GPUs. A performance model is introduced to guide our re-factoring of Splotch. A number of parallelization issues are discussed, in particular relating to race conditions and workload balancing, towards achieving optimal performance. Our implementation was accomplished by using the CUDA programming paradigm. Our strategy is founded on novel schemes achieving optimized data organization and classification of particles. We deploy a reference cosmological simulation to present performance results on acceleration gains and scalability. We finally outline our vision for future developments, including possibilities for further optimizations and exploitation of hybrid systems and emerging accelerators.
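At its core, this style of particle rendering amounts to an additive splat of each particle's smoothing footprint into the image. A minimal NumPy sketch of that idea follows; it is purely illustrative (the Gaussian footprint, the 3-sigma bounding box and all array names are assumptions of this sketch, not the actual Splotch/CUDA implementation).

import numpy as np

def splat_particles(x, y, intensity, radius, width, height):
    """Additively splat particles with Gaussian footprints into an image.

    x, y      : particle positions already projected to pixel coordinates
    intensity : per-particle brightness
    radius    : per-particle smoothing length in pixels
    """
    image = np.zeros((height, width))
    for px, py, w, r in zip(x, y, intensity, radius):
        # Bounding box of the particle footprint, clipped to the image
        x0, x1 = max(0, int(px - 3 * r)), min(width, int(px + 3 * r) + 1)
        y0, y1 = max(0, int(py - 3 * r)), min(height, int(py + 3 * r) + 1)
        if x0 >= x1 or y0 >= y1:
            continue
        xs, ys = np.meshgrid(np.arange(x0, x1), np.arange(y0, y1))
        footprint = np.exp(-((xs - px) ** 2 + (ys - py) ** 2) / (2.0 * r ** 2))
        image[y0:y1, x0:x1] += w * footprint  # additive blending, order-independent
    return image

On a GPU the per-particle loop becomes one thread or block per particle, and the additive update is exactly where the race conditions mentioned above must be handled, for example with atomic adds or per-tile accumulation.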
Synchrotron radiation and diffusive shock acceleration - A short review and GRB perspective
NASA Astrophysics Data System (ADS)
Karlica, Mile
2015-12-01
In this talk we present the "sponge" model and its possible implications for GRB afterglow light curves. The "sponge" model describes the source of GRB afterglow radiation as fragmented GRB ejecta in which bubbles move through a rarefied medium. In the first part of the talk a short introduction to synchrotron radiation and Fermi acceleration is presented. Under the assumption that the X-ray luminosity of the GRB afterglow phase comes from the kinetic energy losses of clouds in the ejecta medium, radiated as synchrotron emission, we solve a simple equation of motion to find which combination of cloud and medium expansion regimes best describes the afterglow light curve. As a first step we consider simple combinations of expansion regimes for both the bubbles and the surrounding medium. The case closest to the numerical fit of GRB 150403A, with time power-law index k = 1.38, is the combination of constant bubbles and a Sedov-like expanding medium with time power-law index k = 1.25. The question of a possible mixture of various regime combinations remains open within this model.
The atmosphere of Uranus - Results of radio occultation measurements with Voyager 2
NASA Technical Reports Server (NTRS)
Lindal, G. F.; Lyons, J. R.; Sweetnam, D. N.; Eshleman, V. R.; Hinson, D. P.
1987-01-01
The Uranian atmosphere is investigated on the basis of S-band and X-band occultation observations (including measurements of Doppler frequency perturbations) obtained during the Voyager 2 encounter with Uranus in January 1986. The data are presented in extensive tables and graphs and characterized in detail. The atmosphere is assumed to have an H2/He abundance ratio of about 85/15, but also to contain small amounts of CH4 at above-cloud relative humidity 30 percent, cloud-base relative humidity 78 percent, and below-cloud mixing ratio 2.3 percent by number density. Other parameters estimated include magnetic-field rotation period 17.24 h, 1-bar equatorial radius 25,559 ± 4 km, polar radius 24,973 ± 20 km, equatorial acceleration of gravity 8.69 ± 0.01 m/sec^2, and atmospheric temperature 76 ± 2 K (assuming 85 ± 3 percent H2).
Radiative transfer models for retrieval of cloud parameters from EPIC/DSCOVR measurements
NASA Astrophysics Data System (ADS)
Molina García, Víctor; Sasi, Sruthy; Efremenko, Dmitry S.; Doicu, Adrian; Loyola, Diego
2018-07-01
In this paper we analyze the accuracy and efficiency of several radiative transfer models for inferring cloud parameters from radiances measured by the Earth Polychromatic Imaging Camera (EPIC) on board the Deep Space Climate Observatory (DSCOVR). The radiative transfer models are the exact discrete ordinate and matrix operator methods with matrix exponential, and the approximate asymptotic and equivalent Lambertian cloud models. To deal with the computationally expensive radiative transfer calculations, several acceleration techniques, such as the telescoping technique, the method of false discrete ordinate, the correlated k-distribution method and the principal component analysis (PCA), are used. We found that, for the EPIC oxygen A-band absorption channel at 764 nm, the exact models using the correlated k-distribution in conjunction with PCA yield an accuracy better than 1.5% and a computation time of 18 s for radiance calculations at 5 viewing zenith angles.
A High Performance Cloud-Based Protein-Ligand Docking Prediction Algorithm
Chen, Jui-Le; Yang, Chu-Sing
2013-01-01
Predicting druggability for a particular disease by integrating biological and computer science technologies has seen notable success in recent years. Although computational methods can reduce the costs of pharmaceutical research, the computation time of structure-based protein-ligand docking prediction remains a bottleneck. Hence, in this paper, a novel docking prediction algorithm, named the fast cloud-based protein-ligand docking prediction algorithm (FCPLDPA), is presented to accelerate docking prediction. The proposed algorithm leverages two high-performance operators: (1) a novel migration (information exchange) operator designed specifically for cloud-based environments to reduce the computation time; and (2) an efficient operator aimed at filtering out the worst search directions. Our simulation results show that the proposed method outperforms the other docking algorithms compared in this paper in terms of both computation time and the quality of the end result. PMID:23762864
A resource-sharing model based on a repeated game in fog computing.
Sun, Yan; Zhang, Nan
2017-03-01
With the rapid development of cloud computing techniques, the number of users is undergoing exponential growth. It is difficult for traditional data centers to perform many tasks in real time because of the limited bandwidth of resources. The concept of fog computing is proposed to support traditional cloud computing and to provide cloud services. In fog computing, the resource pool is composed of sporadic distributed resources that are more flexible and movable than a traditional data center. In this paper, we propose a fog computing structure and present a crowd-funding algorithm to integrate spare resources in the network. Furthermore, to encourage more resource owners to share their resources with the resource pool and to supervise the resource supporters as they actively perform their tasks, we propose an incentive mechanism in our algorithm. Simulation results show that our proposed incentive mechanism can effectively reduce the SLA violation rate and accelerate the completion of tasks.
The Origin of Cosmic Rays: What can GLAST Say?
NASA Technical Reports Server (NTRS)
Ormes, Jonathan F.; Digel, Seith; Moskalenko, Igor V.; Moiseev, Alexander; Williamson, Roger
2000-01-01
Gamma rays in the band from 30 MeV to 300 GeV, used in combination with direct measurements and with data from the radio and X-ray bands, provide a powerful tool for studying the origin of Galactic cosmic rays. The Gamma-ray Large Area Space Telescope (GLAST), with its fine 10-20 arcmin angular resolution, will be able to map the sites of acceleration of cosmic rays and their interactions with interstellar matter. It will provide information that is necessary to study the acceleration of energetic particles in supernova shocks, their transport in the interstellar medium and their penetration into molecular clouds.
Clouds Sailing Overhead on Mars, Unenhanced
2017-08-09
Wispy clouds float across the Martian sky in this accelerated sequence of images from NASA's Curiosity Mars rover. The rover's Navigation Camera (Navcam) took these eight images over a span of four minutes early in the morning of the mission's 1,758th Martian day, or sol (July 17, 2017), aiming nearly straight overhead. This sequence uses raw images, which include a bright ring around the center of the frame that is an artifact of sunlight striking the camera lens even though the Sun is not in the shot. A processed version removing that artifact and emphasizing changes between images is also available. The clouds resemble Earth's cirrus clouds, which are ice crystals at high altitudes. These Martian clouds are likely composed of crystals of water ice that condense onto dust grains in the cold Martian atmosphere. Cirrus wisps appear as ice crystals fall and evaporate in patterns known as "fall streaks" or "mare's tails." Such patterns have been seen before at high latitudes on Mars, for instance by the Phoenix Mars Lander in 2008, and seasonally nearer the equator, for instance by the Opportunity rover. However, Curiosity has not previously observed such clouds so clearly visible from the rover's study area about five degrees south of the equator. The Hubble Space Telescope and spacecraft orbiting Mars have observed a band of clouds to appear near the Martian equator around the time of the Martian year when the planet is farthest from the Sun. With a more elliptical orbit than Earth's, Mars experiences more annual variation than Earth in its distance from the Sun. The most distant point in an orbit around the Sun is called the aphelion. The near-equatorial Martian cloud pattern observed at that time of year is called the "aphelion cloud belt." These new images from Curiosity were taken about two months before aphelion, but the morning clouds observed may be an early stage of the aphelion cloud belt. An animation is available at https://photojournal.jpl.nasa.gov/catalog/PIA21842
Fast grasping of unknown objects using cylinder searching on a single point cloud
NASA Astrophysics Data System (ADS)
Lei, Qujiang; Wisse, Martijn
2017-03-01
Grasping unknown objects, with neither appearance data nor object models given in advance, is very important for robots that work in an unfamiliar environment. The goal of this paper is to quickly synthesize an executable grasp for one unknown object by using cylinder searching on a single point cloud. Specifically, a 3D camera is first used to obtain a partial point cloud of the target object. An original post-processing method is then applied to the partial point cloud to minimize the uncertainty that may lead to grasp failure. To accelerate the grasp search, the surface normals of the target object are then used to constrain the synthesis of cylinder grasp candidates. Operability analysis is then used to select all executable grasp candidates, followed by force balance optimization to choose the most reliable grasp for the final grasp execution. To verify the effectiveness of our algorithm, simulations with a Universal Robots UR5 arm and an under-actuated Lacquey Fetch gripper are used to examine its performance, and successful results are obtained.
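As a rough illustration of how surface normals can prune cylinder grasp candidates and how a force-balance criterion can rank the survivors, consider the NumPy sketch below. The candidate representation, the tilt threshold and the balance score are illustrative assumptions of this sketch, not the authors' algorithm.

import numpy as np

def filter_and_rank_grasps(candidates, max_tilt_deg=30.0):
    """Keep cylinder grasp candidates whose axis is roughly perpendicular
    to the local surface normal, then rank them by a force-balance score.

    candidates : list of dicts with keys
        'axis'   : unit vector of the candidate cylinder axis
        'normal' : unit surface normal at the candidate location
        'center' : candidate grasp centre, shape (3,)
        'points' : (N, 3) object points enclosed by the candidate cylinder
    """
    cos_limit = np.cos(np.radians(90.0 - max_tilt_deg))
    ranked = []
    for c in candidates:
        # A graspable cylinder axis should be nearly perpendicular to the normal
        if abs(np.dot(c['axis'], c['normal'])) > cos_limit:
            continue
        # Force-balance proxy: distance between the grasp centre and the centroid
        # of the enclosed points (smaller means a more balanced grasp)
        offset = np.linalg.norm(c['points'].mean(axis=0) - c['center'])
        ranked.append((offset, c))
    ranked.sort(key=lambda t: t[0])
    return [c for _, c in ranked]  # best-balanced candidate first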
On the formation and confinement of dense clouds in QSOs and active galactic nuclei
NASA Technical Reports Server (NTRS)
Marscher, A. P.; Weaver, R. P.
1979-01-01
A model for the formation and confinement of dense (at least about 1 billion per cu cm) clouds in QSOs and active galactic nuclei is presented wherein thermal instabilities behind radiative shocks cause the collapse of regions where the preshock density is enhanced over that of the surrounding medium. Such shocks (of total energy around 10 to the 51st ergs) are likely to occur if the frequent optical outbursts observed in many of these objects are accompanied by mass ejections of comparable energy. It is found that clouds quite similar to those thought to exist in QSOs etc. can be created in this manner at radii of the order of 10 to the 17th cm. The clouds can be subsequently accelerated to observed bulk velocities by either radiation pressure or a collision with a much stronger (total energy around 10 to the 53 ergs) shock. Alternatively, their high observed velocities could be caused by gravitational infall or rotation. The mass production required at inner radii by the outflow models can be supplied through a mechanism previously discussed by Shields (1977).
Science in the cloud (SIC): A use case in MRI connectomics
Gorgolewski, Krzysztof J.; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A.; Wiener, Martin; Vogelstein, R. Jacob; Burns, Randal
2017-01-01
Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called ‘science in the cloud’ (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. PMID:28327935
Climbing the Slope of Enlightenment during NASA's Arctic Boreal Vulnerability Experiment
NASA Astrophysics Data System (ADS)
Griffith, P. C.; Hoy, E.; Duffy, D.; McInerney, M.
2015-12-01
The Arctic Boreal Vulnerability Experiment (ABoVE) is a new field campaign sponsored by NASA's Terrestrial Ecology Program and designed to improve understanding of the vulnerability and resilience of Arctic and boreal social-ecological systems to environmental change (http://above.nasa.gov). ABoVE is integrating field-based studies, modeling, and data from airborne and satellite remote sensing. The NASA Center for Climate Simulation (NCCS) has partnered with the NASA Carbon Cycle and Ecosystems Office (CCEO) to create a high performance science cloud for this field campaign. The ABoVE Science Cloud combines high performance computing with emerging technologies and data management with tools for analyzing and processing geographic information to create an environment specifically designed for large-scale modeling, analysis of remote sensing data, copious disk storage for "big data" with integrated data management, and integration of core variables from in-situ networks. The ABoVE Science Cloud is a collaboration that is accelerating the pace of new Arctic science for researchers participating in the field campaign. Specific examples of the utilization of the ABoVE Science Cloud by several funded projects will be presented.
Zannas, Anthony S; Arloth, Janine; Carrillo-Roa, Tania; Iurato, Stella; Röh, Simone; Ressler, Kerry J; Nemeroff, Charles B; Smith, Alicia K; Bradley, Bekh; Heim, Christine; Menke, Andreas; Lange, Jennifer F; Brückl, Tanja; Ising, Marcus; Wray, Naomi R; Erhardt, Angelika; Binder, Elisabeth B; Mehta, Divya
2018-05-23
Upon publication of the original article [1] it was highlighted by the authors that a transposition error affected Additional file 1, causing the misplacement of several columns and rendering the table difficult to read. This transposition does not influence any of the results or analyses presented in the paper and has since been formally noted in this correction article; the corrected file is available here as an Additional File. The publisher apologizes for this error.
Biobeam—Multiplexed wave-optical simulations of light-sheet microscopy
Weigert, Martin; Bundschuh, Sebastian T.
2018-01-01
Sample-induced image-degradation remains an intricate wave-optical problem in light-sheet microscopy. Here we present biobeam, an open-source software package that enables simulation of operational light-sheet microscopes by combining data from 10^5–10^6 multiplexed and GPU-accelerated point-spread-function calculations. The wave-optical nature of these simulations leads to the faithful reproduction of spatially varying aberrations, diffraction artifacts, geometric image distortions, adaptive optics, and emergent wave-optical phenomena, and renders image-formation in light-sheet microscopy computationally tractable. PMID:29652879
Direct Volume Rendering with Shading via Three-Dimensional Textures
NASA Technical Reports Server (NTRS)
VanGelder, Allen; Kim, Kwansik
1996-01-01
A new and easy-to-implement method for direct volume rendering that uses 3D texture maps for acceleration, and incorporates directional lighting, is described. The implementation, called Voltx, produces high-quality images at nearly interactive speeds on workstations with hardware support for three-dimensional texture maps. Previously reported methods did not incorporate a light model, and did not address issues of multiple texture maps for large volumes. Our research shows that these extensions impact performance by about a factor of ten. Voltx supports orthographic, perspective, and stereo views. This paper describes the theory and implementation of this technique, and compares it to the shear-warp factorization approach. A rectilinear data set is converted into a three-dimensional texture map containing color and opacity information. Quantized normal vectors and a lookup table provide efficiency. A new tessellation of the sphere is described, which serves as the basis for normal-vector quantization. A new gradient-based shading criterion is described, in which the gradient magnitude is interpreted in the context of the field-data value and the material classification parameters, and not in isolation. In the rendering phase, the texture map is applied to a stack of parallel planes, which effectively cut the texture into many slabs. The slabs are composited to form an image.
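The slab compositing step described above is standard back-to-front alpha blending. A minimal CPU sketch in NumPy (with pre-classified, pre-shaded colour and opacity slices standing in for the hardware 3D texture; array names are assumptions of this sketch) is:

import numpy as np

def composite_slabs(color_slices, alpha_slices):
    """Back-to-front alpha compositing of pre-classified volume slices.

    color_slices : (n_slices, H, W, 3) RGB per slice (already shaded)
    alpha_slices : (n_slices, H, W) opacity per slice
    Slices are assumed ordered front-to-back; we iterate back-to-front.
    """
    h, w = alpha_slices.shape[1:]
    image = np.zeros((h, w, 3))
    for rgb, a in zip(color_slices[::-1], alpha_slices[::-1]):
        a3 = a[..., None]
        # "over" operator: the new slab sits in front of what is already accumulated
        image = rgb * a3 + image * (1.0 - a3)
    return image

In the hardware-accelerated method this blending happens as each textured plane is drawn, which is what makes the approach nearly interactive.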
Fortmeier, Dirk; Mastmeyer, Andre; Schröder, Julian; Handels, Heinz
2016-01-01
This study presents a new visuo-haptic virtual reality (VR) training and planning system for percutaneous transhepatic cholangio-drainage (PTCD) based on partially segmented virtual patient models. We only use partially segmented image data instead of a full segmentation and circumvent the necessity of surface or volume mesh models. Haptic interaction with the virtual patient during virtual palpation, ultrasound probing and needle insertion is provided. Furthermore, the VR simulator includes X-ray and ultrasound simulation for image-guided training. The visualization techniques are GPU-accelerated by implementation in Cuda and include real-time volume deformations computed on the grid of the image data. Computation on the image grid enables straightforward integration of the deformed image data into the visualization components. To provide shorter rendering times, the performance of the volume deformation algorithm is improved by a multigrid approach. To evaluate the VR training system, a user evaluation has been performed and deformation algorithms are analyzed in terms of convergence speed with respect to a fully converged solution. The user evaluation shows positive results with increased user confidence after a training session. It is shown that using partially segmented patient data and direct volume rendering is suitable for the simulation of needle insertion procedures such as PTCD.
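The multigrid idea used to speed up the deformation solver can be illustrated on a much smaller problem. The sketch below is a two-grid V-cycle for a 1D Poisson equation with damped-Jacobi smoothing; it is purely illustrative and shares only the smooth-restrict-correct-smooth pattern with the simulator's 3D deformation solver.

import numpy as np

def jacobi(u, f, h, iters, omega=2.0 / 3.0):
    """Damped Jacobi smoothing for -u'' = f with zero Dirichlet boundaries."""
    for _ in range(iters):
        u_new = u.copy()
        u_new[1:-1] = (1 - omega) * u[1:-1] + omega * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
        u = u_new
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    return r

def two_grid_vcycle(u, f, h):
    """One V-cycle: pre-smooth, approximately solve the residual equation on a
    coarser grid (here by plain smoothing), interpolate the correction, post-smooth."""
    u = jacobi(u, f, h, iters=3)
    r = residual(u, f, h)
    r_coarse = r[::2]  # restriction by injection
    e_coarse = jacobi(np.zeros_like(r_coarse), r_coarse, 2 * h, iters=20)
    e = np.interp(np.arange(len(u)), np.arange(0, len(u), 2), e_coarse)  # prolongation
    return jacobi(u + e, f, h, iters=3)

# One V-cycle on a 129-point grid for -u'' = 1 with u(0) = u(1) = 0
u = two_grid_vcycle(np.zeros(129), np.ones(129), 1.0 / 128)

The coarse-grid correction removes the smooth error components that plain smoothing attacks only slowly, which is where the reported convergence speed-up comes from.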
NASA Astrophysics Data System (ADS)
Mekuria, Rufael; Cesar, Pablo; Doumanis, Ioannis; Frisiello, Antonella
2015-09-01
Compression of 3D object-based video is relevant for 3D immersive applications. Nevertheless, the perceptual aspects of the degradation introduced by codecs for meshes and point clouds are not well understood. In this paper we evaluate the subjective and objective degradations introduced by such codecs in a state-of-the-art 3D immersive virtual room. In the 3D immersive virtual room, users are captured with multiple cameras, and their surfaces are reconstructed as photorealistic colored/textured 3D meshes or point clouds. To test the perceptual effect of compression and transmission, we render degraded versions with different frame rates in different contexts (near/far) in the scene. A quantitative subjective study with 16 users shows that negligible distortion of decoded surfaces compared to the original reconstructions can be achieved in the 3D virtual room. In addition, a qualitative task-based analysis in a full prototype field trial shows increased presence, emotion, and user and state recognition for the reconstructed 3D human representation compared to animated computer avatars.
Spectral variation during one quasi-periodic oscillation cycle in the black hole candidate H1743-322
NASA Astrophysics Data System (ADS)
Sarathi Pal, Partha; Debnath, Dipak; Chakrabarti, Sandip Kumar
2016-07-01
From the energy dependence of the power density spectra, it is believed that the oscillation of the Compton cloud may be related to low-frequency quasi-periodic oscillations (LFQPOs). In the context of the two-component advective flow (TCAF) solution, the centrifugal pressure supported boundary layer of a transonic flow acts as the Compton cloud. This region undergoes resonance oscillation when the cooling time scale roughly agrees with the infall time scale as matter crosses this region. By carefully separating photons emitted at different phases of a complete oscillation, we establish beyond reasonable doubt that such an oscillation is the cause of LFQPOs. We show that the degree of Comptonization, and therefore the spectral properties of the flow, oscillate systematically with the phase of the LFQPOs. We analyze the properties of a 0.2 Hz LFQPO exhibited by the black hole candidate H 1743-322 using 3-80 keV data from the NuSTAR satellite. This object was chosen because of the availability of high-quality data for a relatively low-frequency oscillation, rendering phase-wise separation of the light curve data easy.
Scalable and cost-effective NGS genotyping in the cloud.
Souilmi, Yassine; Lancaster, Alex K; Jung, Jae-Yoon; Rizzo, Ettore; Hawkins, Jared B; Powles, Ryan; Amzazi, Saaïd; Ghazal, Hassan; Tonellato, Peter J; Wall, Dennis P
2015-10-15
While next-generation sequencing (NGS) costs have plummeted in recent years, cost and complexity of computation remain substantial barriers to the use of NGS in routine clinical care. The clinical potential of NGS will not be realized until robust and routine whole genome sequencing data can be accurately rendered to medically actionable reports within a time window of hours and at scales of economy in the 10's of dollars. We take a step towards addressing this challenge by using COSMOS, a cloud-enabled workflow management system, to develop GenomeKey, an NGS whole genome analysis workflow. COSMOS implements complex workflows making optimal use of high-performance compute clusters. Here we show that the Amazon Web Service (AWS) implementation of GenomeKey via COSMOS provides a fast, scalable, and cost-effective analysis of both public benchmarking and large-scale heterogeneous clinical NGS datasets. Our systematic benchmarking reveals important new insights and considerations for whole genome analysis optimization and workflow management to achieve clinical turn-around times, including strategic batching of individual genomes and efficient cluster resource configuration.
McIDAS-V: Advanced Visualization for 3D Remote Sensing Data
NASA Astrophysics Data System (ADS)
Rink, T.; Achtor, T. H.
2010-12-01
McIDAS-V is a Java-based, open-source, freely available software package for analysis and visualization of geophysical data. Its advanced capabilities provide very interactive 4-D displays, including 3D volumetric rendering and fast sub-manifold slicing, linked to an abstract mathematical data model with built-in metadata for units, coordinate system transforms and sampling topology. A Jython interface provides user-defined analysis and computation in terms of the internal data model. These powerful capabilities to integrate data, analysis and visualization are being applied to hyper-spectral sounding retrievals, e.g., AIRS and IASI, of moisture and cloud density to interrogate and analyze their 3D structure, as well as to validate against instruments such as CALIPSO, CloudSat and MODIS. The object-oriented framework design allows for specialized extensions for novel displays and new sources of data. Community-defined CF conventions for gridded data are understood by the software, and such data can be immediately imported into the application. This presentation will show examples of how McIDAS-V is used in 3-dimensional data analysis, display and evaluation.
SU-E-T-186: Cloud-Based Quality Assurance Application for Linear Accelerator Commissioning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rogers, J
2015-06-15
Purpose: To identify anomalies and safety issues during data collection and modeling for treatment planning systems. Methods: A cloud-based quality assurance system (AQUIRE - Automated QUalIty REassurance) has been developed to allow the uploading and analysis of beam data acquired during the treatment planning system commissioning process. In addition to comparing and aggregating measured data, tools have also been developed to extract dose from the treatment planning system for end-to-end testing. A gamma index analysis is performed on the data to give a dose difference and distance-to-agreement for validation that a beam model is generating plans consistent with the beam data collection. Results: Over 20 linear accelerators have been commissioned using this platform, and a variety of errors and potential safety issues have been caught through the validation process. For example, a gamma index of 2% dose, 2 mm DTA is quite sufficient to see curves not corrected for effective point of measurement. Also, data imported into the database is analyzed against an aggregate of similar linear accelerators to show data points that are outliers. The resulting curves in the database exhibit a very small standard deviation and imply that a preconfigured beam model based on aggregated linear accelerators will be sufficient in most cases. Conclusion: With the use of this new platform for beam data commissioning, errors in beam data collection and treatment planning system modeling are greatly reduced. With the reduction in errors during acquisition, the resulting beam models are quite similar, suggesting that a common beam model may be possible in the future. Development is ongoing to create routine quality assurance tools to compare back to the beam data acquired during commissioning. I am a medical physicist for Alzyen Medical Physics, and perform commissioning services.
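A 2%/2 mm gamma comparison of the kind mentioned above can be sketched in a few lines for a 1D dose profile. The function below is a simplified global-gamma illustration on a uniform grid, not the implementation used in the platform described here.

import numpy as np

def gamma_index_1d(dose_ref, dose_eval, spacing_mm, dose_tol=0.02, dist_tol_mm=2.0):
    """Global gamma index for two 1D dose profiles on the same uniform grid.

    Gamma at a reference point is the minimum over all evaluated points of
    sqrt((dose difference / dose tolerance)^2 + (distance / distance tolerance)^2);
    points with gamma <= 1 pass the 2%/2 mm criterion.
    """
    positions = np.arange(len(dose_ref)) * spacing_mm
    dose_norm = dose_tol * dose_ref.max()  # global (max-dose) normalisation
    gamma = np.empty(len(dose_ref))
    for i, (x_r, d_r) in enumerate(zip(positions, dose_ref)):
        dd = (dose_eval - d_r) / dose_norm
        dx = (positions - x_r) / dist_tol_mm
        gamma[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gamma

# Example: pass rate for a profile shifted by one sample (0.05 mm at 0.05 mm spacing)
ref = np.exp(-np.linspace(-3, 3, 121) ** 2)
ev = np.roll(ref, 1)
print((gamma_index_1d(ref, ev, spacing_mm=0.05) <= 1.0).mean())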
Accelerating statistical image reconstruction algorithms for fan-beam x-ray CT using cloud computing
NASA Astrophysics Data System (ADS)
Srivastava, Somesh; Rao, A. Ravishankar; Sheinin, Vadim
2011-03-01
Statistical image reconstruction algorithms potentially offer many advantages to x-ray computed tomography (CT), e.g. lower radiation dose. But, their adoption in practical CT scanners requires extra computation power, which is traditionally provided by incorporating additional computing hardware (e.g. CPU-clusters, GPUs, FPGAs etc.) into a scanner. An alternative solution is to access the required computation power over the internet from a cloud computing service, which is orders-of-magnitude more cost-effective. This is because users only pay a small pay-as-you-go fee for the computation resources used (i.e. CPU time, storage etc.), and completely avoid purchase, maintenance and upgrade costs. In this paper, we investigate the benefits and shortcomings of using cloud computing for statistical image reconstruction. We parallelized the most time-consuming parts of our application, the forward and back projectors, using MapReduce, the standard parallelization library on clouds. From preliminary investigations, we found that a large speedup is possible at a very low cost. But, communication overheads inside MapReduce can limit the maximum speedup, and a better MapReduce implementation might become necessary in the future. All the experiments for this paper, including development and testing, were completed on the Amazon Elastic Compute Cloud (EC2) for less than $20.
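Applied to back projection, the map/reduce decomposition described above is essentially "map each projection angle to a partial image, then reduce by summation". The sketch below mimics that pattern locally with Python's multiprocessing and a toy pixel-driven back projector; it is an illustration of the decomposition only, not the authors' MapReduce code, and all names and sizes are assumptions.

import numpy as np
from multiprocessing import Pool

N = 128  # reconstructed image is N x N pixels
xs, ys = np.meshgrid(np.arange(N) - N / 2, np.arange(N) - N / 2)

def backproject_angle(args):
    """Map step: smear one projection (angle, sinogram row) across the image."""
    theta, projection = args
    # Detector coordinate of every pixel for this view
    t = xs * np.cos(theta) + ys * np.sin(theta) + len(projection) / 2
    idx = np.clip(t.astype(int), 0, len(projection) - 1)
    return projection[idx]

def backproject(angles, sinogram, workers=4):
    """Reduce step: sum the per-angle partial images."""
    with Pool(workers) as pool:
        partial_images = pool.map(backproject_angle, zip(angles, sinogram))
    return sum(partial_images) * np.pi / len(angles)

if __name__ == "__main__":
    angles = np.linspace(0, np.pi, 180, endpoint=False)
    sinogram = np.random.rand(180, N)  # placeholder for measured data
    image = backproject(angles, sinogram)

On a cloud MapReduce runtime the per-angle work is distributed across nodes and the summation becomes the reduce phase, which is also where the communication overheads mentioned in the abstract arise.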
Measurement of the line-of-sight velocity of high-altitude barium clouds - A technique
NASA Technical Reports Server (NTRS)
Mende, S. B.; Harris, S. E.
1982-01-01
It is demonstrated that for maximizing the scientific output of future ionospheric and magnetospheric ion cloud release experiments a new type of instrument is required which will measure the line-of-sight velocity of the ion cloud by the Doppler technique. A simple instrument was constructed using a 5-cm diam solid Fabry-Perot etalon coupled to a low-light-level integrating TV camera. It was demonstrated that the system has both the sensitivity and spectral resolution for detection of ion clouds and measurement of their line-of-sight Doppler velocity. The tests consisted of (1) a field experiment using a rocket barium cloud release to check sensitivity, and (2) laboratory experiments to show the spectral resolving capabilities of the system. The instrument was found to be operational if the source was brighter than approximately 1 kR, and it had a wavelength resolution much better than 0.2 A, which corresponds to approximately 12 km/sec or in the case of barium ion an acceleration potential of 100 V. The instrument is rugged and, therefore, simple to use in field experiments or on flight instruments. The sensitivity limit of the instrument can be increased by increasing the size of the etalon.
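The quoted wavelength resolution translates into the velocity and acceleration-potential figures through the non-relativistic Doppler relation. A short worked example, assuming the Ba II resonance line near 4554 Å (the abstract does not state the line explicitly), is:

\[
\frac{\Delta v}{c} = \frac{\Delta\lambda}{\lambda}
\;\Rightarrow\;
\Delta v \approx 3\times10^{5}\,\mathrm{km\,s^{-1}} \times \frac{0.2\,\text{\AA}}{4554\,\text{\AA}} \approx 13\,\mathrm{km\,s^{-1}},
\]
\[
eU = \tfrac{1}{2} m_{\mathrm{Ba}} v^{2}
\;\Rightarrow\;
U \approx \frac{137 \times 1.66\times10^{-27}\,\mathrm{kg} \times (1.2\times10^{4}\,\mathrm{m\,s^{-1}})^{2}}{2 \times 1.6\times10^{-19}\,\mathrm{C}} \approx 100\,\mathrm{V},
\]

consistent with the approximately 12 km/sec and 100 V figures quoted above.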
Decreasing cloud cover drives the recent mass loss on the Greenland Ice Sheet
Hofer, Stefan; Tedstone, Andrew J.; Fettweis, Xavier; Bamber, Jonathan L.
2017-01-01
The Greenland Ice Sheet (GrIS) has been losing mass at an accelerating rate since the mid-1990s. This has been due to both increased ice discharge into the ocean and melting at the surface, with the latter being the dominant contribution. This change in state has been attributed to rising temperatures and a decrease in surface albedo. We show, using satellite data and climate model output, that the abrupt reduction in surface mass balance since about 1995 can be attributed largely to a coincident trend of decreasing summer cloud cover enhancing the melt-albedo feedback. Satellite observations show that, from 1995 to 2009, summer cloud cover decreased by 0.9 ± 0.3% per year. Model output indicates that the GrIS summer melt increases by 27 ± 13 gigatons (Gt) per percent reduction in summer cloud cover, principally because of the impact of increased shortwave radiation over the low albedo ablation zone. The observed reduction in cloud cover is strongly correlated with a state shift in the North Atlantic Oscillation promoting anticyclonic conditions in summer and suggests that the enhanced surface mass loss from the GrIS is driven by synoptic-scale changes in Arctic-wide atmospheric circulation. PMID:28782014
Investigation of cloud properties and atmospheric stability with MODIS
NASA Technical Reports Server (NTRS)
Menzel, Paul
1995-01-01
In the past six months several milestones were accomplished. The MODIS Airborne Simulator (MAS) was flown in a 50 channel configuration for the first time in January 1995 and the data were calibrated and validated; in the same field campaign the approach for validating MODIS radiances using the MAS and High resolution Interferometer Sounder (HIS) instruments was successfully tested on GOES-8. Cloud masks for two scenes (one winter and the other summer) of AVHRR local area coverage from the Gulf of Mexico to Canada were processed and forwarded to the SDST for MODIS Science Team investigation; a variety of surface and cloud scenes were evident. Beta software preparations continued with incorporation of the EOS SDP Toolkit. SCAR-C data was processed and presented at the biomass burning conference. Preparations for SCAR-B accelerated with generation of a home page for access to real time satellite data related to biomass burning; this will be available to the scientists in Brazil via internet on the World Wide Web. The CO2 cloud algorithm was compared to other algorithms that differ in their construction of clear radiance fields. The HIRS global cloud climatology was completed for six years. The MODIS science team meeting was attended by five of the UW scientists.
Cloud computing approaches to accelerate drug discovery value chain.
Garg, Vibhav; Arora, Suchir; Gupta, Chitra
2011-12-01
Continued advancements in technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data on the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze these data to identify informational tags. This need in turn poses challenges to computer scientists to offer matching hardware and software infrastructure, while managing the varying degree of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as cloud computing, is now transforming drug discovery research. Also, integration of cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good to have' tool for researchers, providing them significant flexibility and allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery-Cloud would be well suited to managing drug discovery and clinical development data generated using advanced HTS techniques, hence supporting the vision of personalized medicine.
One and two fluid numerical investigations of solar wind gas releases
NASA Astrophysics Data System (ADS)
Harold, James Benedict
1993-01-01
The dynamics of gas releases into high Mach number flowing plasmas are investigated. Emphasis is placed on systems of intermediate magnetization for which the scale size of the release lies between the ion and electron Larmor radii. The study is motivated by the December 1984 AMPTE (Active Magnetospheric Particle Tracer Explorer) solar wind barium release in which, contrary to the predictions of MHD theory, the barium cloud shifted transverse to the solar wind (in the u_wind × B_0 direction) before eventually turning downstream. Particular emphasis is given to identifying mechanisms responsible for this lateral motion. A modified MHD cold fluid approach that takes advantage of the supersonic nature of the problem forms the basis of this work. Two specific models are developed which incorporate large effective ion Larmor radius effects. The first is for a single ion species, the second for two ion species. Two physical effects are identified which are not present in the conventional MHD system: the Hall effect, based on a Hall magnetic drift wave, and a hybrid electrostatic ion cyclotron mode. Linear analysis shows that the effect of the Hall term is to propagate the upwind magnetic field compression azimuthally to the downwind side of the cloud, leading to a quasi-steady state field compression on the -u_wind × B_0 side of the cloud. The cyclotron mode can lead to a similar compression through deflection of the solar wind ions into the u_wind × B_0 direction. In each case the resulting compression leads to a transverse acceleration of the cloud. The relative importance of these two mechanisms is shown to depend on delta_c / r_c, the ratio of the collisionless skin depth to the cloud size. Nonlinear, two-dimensional simulations are performed for each model. These simulations produce the expected field compressions and the resultant lateral acceleration, in general qualitative agreement with the AMPTE experiment. The dependence of these mechanisms on the ratio delta_c / r_c is demonstrated. While no simulations are performed that precisely duplicate the parameters of the AMPTE release, the results suggest that the Hall effect, and possibly deflection of the solar wind by the cyclotron mode, constitute plausible mechanisms for the AMPTE shift.
Oh, Jeongsu; Choi, Chi-Hwan; Park, Min-Kyu; Kim, Byung Kwon; Hwang, Kyuin; Lee, Sang-Heon; Hong, Soon Gyu; Nasir, Arshan; Cho, Wan-Sup; Kim, Kyung Mo
2016-01-01
High-throughput sequencing can produce hundreds of thousands of 16S rRNA sequence reads corresponding to different organisms present in the environmental samples. Typically, analysis of microbial diversity in bioinformatics starts from pre-processing followed by clustering 16S rRNA reads into relatively fewer operational taxonomic units (OTUs). The OTUs are reliable indicators of microbial diversity and greatly accelerate the downstream analysis time. However, existing hierarchical clustering algorithms that are generally more accurate than greedy heuristic algorithms struggle with large sequence datasets. To keep pace with the rapid rise in sequencing data, we present CLUSTOM-CLOUD, which is the first distributed sequence clustering program based on In-Memory Data Grid (IMDG) technology-a distributed data structure to store all data in the main memory of multiple computing nodes. The IMDG technology helps CLUSTOM-CLOUD to enhance both its capability of handling larger datasets and its computational scalability better than its ancestor, CLUSTOM, while maintaining high accuracy. Clustering speed of CLUSTOM-CLOUD was evaluated on published 16S rRNA human microbiome sequence datasets using the small laboratory cluster (10 nodes) and under the Amazon EC2 cloud-computing environments. Under the laboratory environment, it required only ~3 hours to process dataset of size 200 K reads regardless of the complexity of the human microbiome data. In turn, one million reads were processed in approximately 20, 14, and 11 hours when utilizing 20, 30, and 40 nodes on the Amazon EC2 cloud-computing environment. The running time evaluation indicates that CLUSTOM-CLOUD can handle much larger sequence datasets than CLUSTOM and is also a scalable distributed processing system. The comparative accuracy test using 16S rRNA pyrosequences of a mock community shows that CLUSTOM-CLOUD achieves higher accuracy than DOTUR, mothur, ESPRIT-Tree, UCLUST and Swarm. CLUSTOM-CLOUD is written in JAVA and is freely available at http://clustomcloud.kopri.re.kr.
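The hierarchical OTU clustering that CLUSTOM-CLOUD distributes can be illustrated on a toy pairwise-distance matrix with SciPy. This single-machine sketch (average linkage, 3% dissimilarity cut-off, invented distances) conveys only the clustering concept, not the CLUSTOM algorithm or its distributed IMDG implementation.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Toy pairwise dissimilarities between 5 reads (fraction of mismatching bases)
dist = np.array([
    [0.00, 0.01, 0.02, 0.20, 0.21],
    [0.01, 0.00, 0.02, 0.20, 0.22],
    [0.02, 0.02, 0.00, 0.19, 0.20],
    [0.20, 0.20, 0.19, 0.00, 0.01],
    [0.21, 0.22, 0.20, 0.01, 0.00],
])

# Average-linkage hierarchical clustering on the condensed distance matrix
tree = linkage(squareform(dist), method="average")

# Cut the dendrogram at 3% dissimilarity (the usual ~97% identity OTU threshold)
otus = fcluster(tree, t=0.03, criterion="distance")
print(otus)  # e.g. [1 1 1 2 2]: reads 1-3 and reads 4-5 collapse into two OTUs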
NASA Technical Reports Server (NTRS)
Doug, Xiquan; Mace, Gerald G.; Minnis, Patrick; Young, David F.
2001-01-01
To study Arctic stratus cloud properties and their effect on the surface radiation balance during the spring transition season, analyses are performed using data taken during three cloudy and two clear days in May 1998 as part of the First ISCCP Regional Experiment (FIRE) Arctic Cloud Experiment (ACE). Radiative transfer models are used in conjunction with surface- and satellite-based measurements to retrieve the layer-averaged microphysical and shortwave radiative properties. The surface-retrieved cloud properties in Cases 1 and 2 agree well with the in situ and satellite retrievals. Discrepancies in Case 3 are due to spatial mismatches between the aircraft and the surface measurements in a highly variable cloud field. Also, the vertical structure in the cloud layer is not fully characterized by the aircraft measurements. Satellite data are critical for understanding some of the observed discrepancies. The satellite-derived particle sizes agree well with the coincident surface retrievals and with the aircraft data when they were collocated. Optical depths derived from visible-channel data over snow backgrounds were overestimated in all three cases, suggesting that methods currently used in satellite cloud climatologies derive optical depths that are too large. Use of a near-infrared channel with a solar infrared channel to simultaneously derive optical depth and particle size appears to alleviate this overestimation problem. Further study of the optical depth retrieval is needed. The surface-based radiometer data reveal that the Arctic stratus clouds produce a net warming of 20 W m^-2 in the surface layer during the transition season, suggesting that these clouds may accelerate the springtime melting of the ice pack. This surface warming contrasts with the net cooling at the top of the atmosphere (TOA) during the same period. Analysis of the complete FIRE ACE data sets will be valuable for understanding the role of clouds during the entire melting and refreezing process that occurs annually in the Arctic.
Radiation belt electron observations following the January 1997 magnetic cloud event
NASA Astrophysics Data System (ADS)
Selesnick, R. S.; Blake, J. B.
Relativistic electrons in the outer radiation belt associated with the January 1997 magnetic cloud event were observed by the HIST instrument on POLAR at kinetic energies from 0.7 to 7 MeV and L shells from 3 to 9. The electron enhancement occurred on a time scale of hours or less throughout the outer radiation belt, except for a more gradual rise in the higher energy electrons at the lower L values indicative of local acceleration and inward radial diffusion. At the higher L values, variations on a time scale of several days following the initial injection on January 10 are consistent with data from geosynchronous orbit and may be an adiabatic response.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Pimpec, F.; /PSI, Villigen; Kirby, R.E.
In many accelerator storage rings running positively charged beams, ionization of residual gas and secondary electron emission (SEE) in the beam pipe will give rise to an electron cloud which can cause beam blow-up or loss of the circulating beam. A preventative measure that suppresses electron cloud formation is to ensure that the vacuum wall has a low secondary emission yield (SEY). The SEY of thin films of TiN, sputter deposited Non-Evaporable Getters and a novel TiCN alloy were measured under a variety of conditions, including the effect of re-contamination from residual gas.
HNSciCloud - Overview and technical Challenges
NASA Astrophysics Data System (ADS)
Gasthuber, Martin; Meinhard, Helge; Jones, Robert
2017-10-01
HEP is only one of many sciences with sharply increasing compute requirements that cannot be met by profiting from Moore's law alone. Commercial clouds potentially allow for realising larger economies of scale. While some small-scale experience requiring dedicated effort has been collected, public cloud resources have not yet been integrated with the standard workflows of science organisations in their private data centres; in addition, European science has not ramped up to significant scale yet. The HELIX NEBULA Science Cloud project - HNSciCloud, partly funded by the European Commission, addresses these points. Ten organisations under CERN's leadership, covering particle physics, bioinformatics, photon science and other sciences, have joined to procure public cloud resources as well as dedicated development efforts towards this integration. The HNSciCloud project faces the challenge of accelerating the developments performed by the selected commercial providers. In order to guarantee cost-efficient usage of IaaS resources across a wide range of scientific communities, the technical requirements had to be carefully constructed. With respect to current IaaS offerings, data-intensive science is the biggest challenge; other points that need to be addressed concern identity federations, network connectivity and how to match business practices of large IaaS providers with those of public research organisations. In the first section, this paper will give an overview of the project and explain the findings so far. The last section will explain the key points of the technical requirements and present first results of the experience of the procurers with the services in comparison to their 'on-premise' infrastructure.
Study of the Fine-Scale Structure of Cumulus Clouds.
NASA Astrophysics Data System (ADS)
Rodi, Alfred R.
Small cumulus clouds are studied using data from an instrumented aircraft. Two aspects of the role of turbulence and mixing in these clouds are examined: (1) the effect of mixing on the droplet size distribution, and (2) the effect of turbulence on the spread of ice crystal plumes artificially generated with cloud seeding agents. The data were collected in the course of the Bureau of Reclamation's High Plains Cooperative Experiment (HIPLEX) in Montana in the summers of 1978-80 by the University of Wyoming King Air aircraft. The shape of the cloud droplet spectrum as measured by the Particle Measuring Systems (PMS) Forward Scattering Spectrometer Probe (FSSP) is found to be very sensitive to entrainment of dry environmental air into the cloud. The narrowest cloud droplet spectra, the highest droplet concentrations, and the largest sized droplets are found in the cloud parcels which are least affected by entrainment. The most dilute regions of cloud exhibit the broadest spectra which are frequently bimodal. A procedure for measuring cloud inhomogeneity from FSSP is developed. The data shows that the clouds are extremely inhomogeneous in structure. Current models of inhomogeneous mixing are shown to be inadequate in explaining droplet spectrum effects. However, the inhomogeneous models characterize the data far better than classical models of droplet spectrum evolution. High resolution measurements of ice crystals from the PMS two dimensional imaging probe are used to characterize the spread of the ice crystal plume in seeded clouds. Plume spread is found to be a very complicated process which is in some cases dominated by organized motions in the cloud. As a result, classical diffusion theory is often inadequate to predict plume growth. The turbulent diffusion that occurs is shown to be best modeled using the relative diffusion concept of Richardson. Procedures for adapting aircraft data to the relative diffusion model are developed, including techniques for converting the aircraft Eulerian data into estimates of Lagrangian correlations. Predictions of the model are compared with observations of plume growth. A detailed analysis of errors in the air motion sensing system on the aircraft is presented. A procedure is developed to estimate the errors due to aircraft gyroscope sensitivity to horizontal accelerations.
Multilayered nonuniform sampling for three-dimensional scene representation
NASA Astrophysics Data System (ADS)
Lin, Huei-Yung; Xiao, Yu-Hua; Chen, Bo-Ren
2015-09-01
The representation of a three-dimensional (3-D) scene is essential in multiview imaging technologies. We present a unified geometry and texture representation based on global resampling of the scene. A layered data map representation with a distance-dependent nonuniform sampling strategy is proposed. It is capable of increasing the details of the 3-D structure locally and is compact in size. The 3-D point cloud obtained from the multilayered data map is used for view rendering. For any given viewpoint, image synthesis with different levels of detail is carried out using the quadtree-based nonuniformly sampled 3-D data points. Experimental results are presented using the 3-D models of reconstructed real objects.
Tryptophan and tryptophan-like substances in cloud water: Occurrence and photochemical fate
NASA Astrophysics Data System (ADS)
Bianco, Angelica; Passananti, Monica; Deguillaume, Laurent; Mailhot, Gilles; Brigante, Marcello
2016-07-01
This work investigates the occurrence and photochemical behaviour of tryptophan (TRP) in the cloud aqueous phase. The concentrations of tryptophan, TRYptophan LIke Substances (TRYLIS) and HUmic LIke Substances (HULIS) in real cloud water, collected between October 2013 and November 2014 at the top of the puy de Dôme station, were determined using the Excitation-Emission-Matrix (EEM) technique. The amount of free and complexed tryptophan (TRP) up to 10^-7 M in cloud aqueous phase was quantified by HPLC-UV-fluorescence analysis, and its photoreactivity under sun-simulated conditions was investigated in synthetic water samples mimicking cloud aqueous phase compositions (oceanic and continental origins). TRP undergoes direct photolysis, and its degradation is enhanced in the presence of naturally occurring species able to photo-generate hydroxyl radicals (HO•). The polychromatic quantum yield of TRP (ϕ_TRP, 290-340 nm) is estimated to be 8.37 × 10^-4 between 290 and 340 nm, corresponding to the degradation rate (R_d^TRP) of 1.29 × 10^-11 M s^-1 under our irradiation conditions. The degradation is accelerated up to 3.65 × 10^-10 and 8.26 × 10^-10 M s^-1 in synthetic oceanic and continental cloud water samples doped with 100 μM hydrogen peroxide, respectively. Hydroxyl radical-mediated transformation leads to the generation of different functionalized and oxidized products, as well as small carboxylic acids, such as formate and acetate. Moreover, fluorescent signals of irradiated solutions indicate the formation of HULIS.
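For context, the polychromatic quantum yield and the direct-photolysis rate quoted above are linked by the usual relation between the degradation rate and the rate of photon absorption over the irradiation window, written here with the spectral photon flux I(λ), molar absorption coefficient ε(λ), optical path l and concentration [TRP] (a standard definition, not a result specific to this study):

\[
R_d^{\mathrm{TRP}} \;=\; \phi_{\mathrm{TRP}}^{290\text{-}340\,\mathrm{nm}} \int_{290\,\mathrm{nm}}^{340\,\mathrm{nm}} I(\lambda)\,\bigl(1 - 10^{-\varepsilon(\lambda)\, l\, [\mathrm{TRP}]}\bigr)\,\mathrm{d}\lambda ,
\]

so the quoted ϕ_TRP of about 8.4 × 10^-4, together with the measured photon absorption rate, corresponds to the direct-photolysis rate of about 1.3 × 10^-11 M s^-1 under the stated irradiation conditions.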
The conversion of CESR to operate as the Test Accelerator, CesrTA. Part 1: overview
NASA Astrophysics Data System (ADS)
Billing, M. G.
2015-07-01
Cornell's electron/positron storage ring (CESR) was modified over a series of accelerator shutdowns beginning in May 2008, which substantially improves its capability for research and development for particle accelerators. CESR's energy span from 1.8 to 5.6 GeV with both electrons and positrons makes it ideal for the study of a wide spectrum of accelerator physics issues and instrumentation related to present light sources and future lepton damping rings. Additionally a number of these are also relevant for the beam physics of proton accelerators. This paper outlines the motivation, design and conversion of CESR to a test accelerator, CESRTA, enhanced to study such subjects as low emittance tuning methods, electron cloud (EC) effects, intra-beam scattering, fast ion instabilities as well as general improvements to beam instrumentation. While the initial studies of CESRTA focussed on questions related to the International Linear Collider (ILC) damping ring design, CESRTA is a very flexible storage ring, capable of studying a wide range of accelerator physics and instrumentation questions. This paper contains the outline and the basis for a set of papers documenting the reconfiguration of the storage ring and the associated instrumentation required for the studies described above. Further details may be found in these papers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yiran; Liu, Siming; Yuan, Qiang, E-mail: liusm@pmo.ac.cn
Recent precise measurements of cosmic-ray (CR) spectra show that the energy distribution of protons is softer than those of heavier nuclei, and there are spectral hardenings for all nuclear compositions above ∼200 GV. Models proposed for these anomalies generally assume steady-state solutions of the particle acceleration process. We show that if the diffusion coefficient has a weak dependence on the particle rigidity near shock fronts of supernova remnants (SNRs), time-dependent solutions of the linear diffusive shock acceleration at two stages of SNR evolution can naturally account for these anomalies. The high-energy component of CRs is dominated by acceleration in the free expansion and adiabatic phases with enriched heavy elements and a high shock speed. The low-energy component may be attributed to acceleration by slow shocks propagating in dense molecular clouds with low metallicity in the radiative phase. Instead of a single power-law distribution, the spectra of time-dependent solutions soften gradually with the increase of energy, which may be responsible for the “knee” of CRs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, Ming; Albrecht, Bruce A.; Ghate, Virendra P.
This study first illustrates the utility of using the Doppler spectrum width from millimetre-wavelength radar to calculate the energy dissipation rate and then to use the energy dissipation rate to study turbulence structure in a continental stratocumulus cloud. It is shown that the turbulence kinetic energy dissipation rate calculated from the radar-measured Doppler spectrum width agrees well with that calculated from the Doppler velocity power spectrum. During the 16-h stratocumulus cloud event, the small-scale turbulence contributes 40% of the total velocity variance at cloud base, 50% at normalized cloud depth=0.8 and 70% at cloud top, which suggests that small-scale turbulence plays a critical role near the cloud top where the entrainment and cloud-top radiative cooling act. The 16-h mean vertical integral length scale decreases from about 160 m at cloud base to 60 m at cloud top, and this signifies that the larger scale turbulence dominates around cloud base whereas the small-scale turbulence dominates around cloud top. The energy dissipation rate, total variance and squared spectrum width exhibit diurnal variations, but unlike marine stratocumulus they are high during the day and lowest around sunset at all levels; energy dissipation rates increase at night with the intensification of the cloud-top cooling. In the normalized coordinate system, the averaged coherent structure of updrafts is characterized by low energy dissipation rates in the updraft core and higher energy dissipation rates surrounding the updraft core at the top and along the edges. In contrast, the energy dissipation rate is higher inside the downdraft core, indicating that the downdraft core is more turbulent. The turbulence around the updraft is weaker at night and stronger during the day; the opposite is true around the downdraft. This behaviour indicates that the turbulence in the downdraft has a diurnal cycle similar to that observed in marine stratocumulus, whereas the turbulence diurnal cycle in the updraft is reversed. For both updraft and downdraft, the maximum energy dissipation rate occurs at a cloud depth=0.8 where the maximum reflectivity and air acceleration or deceleration are observed. Resolved turbulence dominates near cloud base whereas unresolved turbulence dominates near cloud top. Similar to the unresolved turbulence, the resolved turbulence described by the radial velocity variance is higher in the downdraft than in the updraft. The impact of the surface heating on the resolved turbulence in the updraft decreases with height and diminishes around the cloud top. In both updrafts and downdrafts, the resolved turbulence increases with height and reaches a maximum at cloud depth=0.4 and then decreases to the cloud top; the resolved turbulence near cloud top, just as the unresolved turbulence, is mostly due to the cloud-top radiative cooling.
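The link between the measured Doppler spectrum width and the dissipation rate rests on inertial-subrange scaling. In schematic form (the prefactor, which depends on the Kolmogorov constant and on the radar sampling volume and dwell geometry, is deliberately left out here):

\[
\sigma_{\mathrm{turb}}^{2} \;\propto\; (\varepsilon L)^{2/3}
\quad\Longrightarrow\quad
\varepsilon \;\propto\; \frac{\sigma_{\mathrm{turb}}^{3}}{L},
\]

where σ_turb is the turbulent contribution to the spectrum width and L is the length scale swept through the radar sampling volume during the dwell time.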
Aziz, Zoriah; Abdul Rasool Hassan, Bassam
2017-02-01
Evidence from animal studies and trials suggests that honey may accelerate wound healing. The objective of this review was to assess the effects of honey compared with silver dressings on the healing of burn wounds. Relevant databases were searched for randomized controlled trials (RCTs) of honey compared with silver sulfadiazine (SSD). The quality of the selected trials was assessed using the Cochrane Risk of Bias Assessment Tool. The primary endpoints considered were wound healing time and the number of infected wounds rendered sterile. Nine RCTs met the inclusion criteria. Based on moderate-quality evidence, there was a statistically significant difference between the two groups favoring honey in healing time (MD -5.76 days, 95% CI -8.14 to -3.39) and the proportion of infected wounds rendered sterile (RR 2.59; 95% CI 1.58-2.88). The available evidence suggests that honey dressings promote better wound healing than silver sulfadiazine for burns. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.
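As a side note on the statistics reported above, a risk ratio and its 95% confidence interval can be computed from a 2×2 table as in this minimal sketch; the event counts below are hypothetical, not data from the review.

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of group A vs. group B with a 95% Wald confidence interval."""
    rr = (events_a / n_a) / (events_b / n_b)
    se_log_rr = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se_log_rr)
    hi = math.exp(math.log(rr) + z * se_log_rr)
    return rr, (lo, hi)

# Hypothetical counts: infected wounds rendered sterile out of wounds treated
print(risk_ratio(events_a=45, n_a=60, events_b=20, n_b=62))
```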
Radiative Transfer Model for Operational Retrieval of Cloud Parameters from DSCOVR-EPIC Measurements
NASA Astrophysics Data System (ADS)
Yang, Y.; Molina Garcia, V.; Doicu, A.; Loyola, D. G.
2016-12-01
The Earth Polychromatic Imaging Camera (EPIC) onboard the Deep Space Climate Observatory (DSCOVR) measures the radiance in the backscattering region. To make sure that all details in the backward glory are covered, a large number of streams is required by a standard radiative transfer model based on the discrete ordinates method. Even the use of the delta-M scaling and the TMS correction does not substantially reduce the number of streams. The aim of this work is to analyze the capability of a fast radiative transfer model to retrieve cloud parameters operationally from EPIC measurements. The radiative transfer model combines the discrete ordinates method with the matrix exponential for the computation of radiances and the matrix operator method for the calculation of the reflection and transmission matrices. Standard acceleration techniques, such as the use of normalized right and left eigenvectors, the telescoping technique, the Padé approximation and the successive-order-of-scattering approximation, are implemented. In addition, the model may compute the reflection matrix of the cloud by means of asymptotic theory, and may use the equivalent Lambertian cloud model. The various approximations are analyzed from the point of view of efficiency and accuracy.
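Since the abstract mentions a Padé approximation to the matrix exponential, here is a minimal, self-contained sketch of a diagonal Padé approximant compared against scipy.linalg.expm (which itself uses a scaling-and-squaring Padé algorithm); the toy matrix is an illustration, not part of the retrieval model.

```python
import numpy as np
from scipy.linalg import expm

def pade_expm(A, p=6):
    """Diagonal (p, p) Pade approximant to the matrix exponential.

    Minimal illustration only; production solvers combine this with
    scaling-and-squaring, as scipy.linalg.expm does internally.
    """
    n = A.shape[0]
    N = np.eye(n)          # numerator polynomial in A
    D = np.eye(n)          # denominator polynomial in A
    X = np.eye(n)          # running power of A
    c = 1.0
    for k in range(1, p + 1):
        c *= (p - k + 1) / (k * (2 * p - k + 1))
        X = A @ X
        N += c * X
        D += ((-1) ** k) * c * X
    return np.linalg.solve(D, N)

A = np.array([[-1.0, 0.3], [0.2, -0.8]])   # toy layer "transfer" matrix
print(np.allclose(pade_expm(A), expm(A), atol=1e-8))
```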
Arctic sea ice melt leads to atmospheric new particle formation.
Dall Osto, M; Beddows, D C S; Tunved, P; Krejci, R; Ström, J; Hansson, H-C; Yoon, Y J; Park, Ki-Tae; Becagli, S; Udisti, R; Onasch, T; O Dowd, C D; Simó, R; Harrison, Roy M
2017-06-12
Atmospheric new particle formation (NPF) and growth significantly influences climate by supplying new seeds for cloud condensation and brightness. Currently, there is a lack of understanding of whether and how marine biota emissions affect aerosol-cloud-climate interactions in the Arctic. Here, the aerosol population was categorised via cluster analysis of aerosol size distributions taken at Mt Zeppelin (Svalbard) during an 11-year record. The daily temporal occurrence of NPF events likely caused by nucleation in the polar marine boundary layer was quantified annually as 18%, with a peak of 51% during summer months. Air mass trajectory analysis and atmospheric nitrogen and sulphur tracers link these frequent nucleation events to biogenic precursors released by open water and melting sea ice regions. The occurrence of such events across a full decade was anti-correlated with sea ice extent. New particles originating from open water and open pack ice increased the cloud condensation nuclei concentration background by at least ca. 20%, supporting a marine biosphere-climate link through sea ice melt and low-altitude clouds that may have contributed to accelerated Arctic warming. Our results prompt a better representation of biogenic aerosol sources in Arctic climate models.
The Terminal Velocity of the Deep Impact Dust Ejecta
NASA Astrophysics Data System (ADS)
Rengel, M.; Küppers, M.; Keller, H. U.; Gutierrez, P.; Hviid, S. F.
2009-05-01
The collision of the projectile released from NASA's Deep Impact spacecraft with the nucleus of comet 9P/Tempel 1 generated a hot plume. Afterwards, ejecta were created, and material moved slowly in the form of a dust cloud, which dissipated over several days after the impact. Here we report a study of the distribution of terminal velocities of the particles ejected by the impact, performed by developing and applying an ill-conditioned inverse problem approach. We model the light curves as seen by the Narrow Angle Camera (NAC) of OSIRIS onboard the ESA spacecraft Rosetta, and we compare them with the OSIRIS observations. Terminal velocities are derived using a maximum likelihood estimator. The dust velocity distribution is well constrained, and peaks at around 220 m s^{-1}, which is in good agreement with published estimates of the expansion velocities of the dust cloud. The measured and modeled velocity of the dust cloud suggests that the impact ejecta were quickly accelerated by the gas in the cometary coma. This analysis provides a more thorough understanding of the properties (velocity and mass of dust) of the Deep Impact dust cloud.
Acceleration of tropical cyclogenesis by self-aggregation feedbacks
NASA Astrophysics Data System (ADS)
Muller, Caroline J.; Romps, David M.
2018-03-01
Idealized simulations of tropical moist convection have revealed that clouds can spontaneously clump together in a process called self-aggregation. This results in a state where a moist cloudy region with intense deep convection is surrounded by extremely dry subsiding air devoid of deep convection. Because of the idealized settings of the simulations where it was discovered, the relevance of self-aggregation to the real world is still debated. Here, we show that self-aggregation feedbacks play a leading-order role in the spontaneous genesis of tropical cyclones in cloud-resolving simulations. Those feedbacks accelerate the cyclogenesis process by a factor of 2, and the feedbacks contributing to cyclone formation show qualitative and quantitative agreement with the self-aggregation process. Once the cyclone is formed, wind-induced surface heat exchange (WISHE) effects dominate, although we find that self-aggregation feedbacks make a small but non-negligible contribution to the maintenance of the mature cyclone. Our results suggest that self-aggregation, and the framework developed for its study, can help shed more light on the physical processes leading to cyclogenesis and cyclone intensification. In particular, our results point to the importance of the longwave radiative cooling outside the cyclone.
Design of smart neonatal health monitoring system using SMCC
Mukherjee, Anwesha; Bhakta, Ishita
2016-01-01
Automated health monitoring and alert system development is a demanding research area today. Most of the currently available monitoring and controlling medical devices are wired, which limits freedom of movement in the working environment. A wireless sensor network (WSN) is a better alternative in such an environment. The neonatal intensive care unit is used to take care of sick and premature neonates. Hypothermia is an independent risk factor for neonatal mortality and morbidity; to prevent it, an automated monitoring system is required. In this Letter, an automated neonatal health monitoring system is designed using sensor mobile cloud computing (SMCC), which is based on WSN and MCC. In the authors' system, a temperature sensor, an acceleration sensor and a heart rate measurement sensor are used to monitor the body temperature, acceleration due to body movement and heart rate of neonates. The sensor data are stored in the cloud. Health personnel continuously monitor and access these data through a mobile device using an Android application for neonatal monitoring. When an abnormal situation arises, an alert is generated on the mobile device of the health personnel. By alerting health professionals using such an automated system, early care is provided to the affected babies and the probability of recovery is increased. PMID:28261491
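The alert logic described above reduces, in essence, to threshold checks on the streamed vital signs; the following minimal sketch illustrates that idea only, with hypothetical thresholds and field names rather than the authors' actual implementation.

```python
# Hypothetical vital-sign limits (illustrative values only, not clinical guidance)
LIMITS = {
    "temperature_c": (36.5, 37.5),   # hypothermia / fever bounds
    "heart_rate_bpm": (100, 180),    # assumed neonatal range
    "movement_g": (0.0, 2.0),        # acceleration magnitude from body movement
}

def check_reading(reading):
    """Return a list of alert messages for any vital sign outside its limits."""
    alerts = []
    for key, (low, high) in LIMITS.items():
        value = reading.get(key)
        if value is not None and not (low <= value <= high):
            alerts.append(f"ALERT: {key} = {value} outside [{low}, {high}]")
    return alerts

# Example reading as it might arrive from the cloud-stored sensor stream
print(check_reading({"temperature_c": 35.9, "heart_rate_bpm": 150, "movement_g": 0.4}))
```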
Hinkson, Izumi V.; Davidsen, Tanja M.; Klemm, Juli D.; Chandramouliswaran, Ishwar; Kerlavage, Anthony R.; Kibbe, Warren A.
2017-01-01
Advancements in next-generation sequencing and other -omics technologies are accelerating the detailed molecular characterization of individual patient tumors, and driving the evolution of precision medicine. Cancer is no longer considered a single disease, but rather, a diverse array of diseases wherein each patient has a unique collection of germline variants and somatic mutations. Molecular profiling of patient-derived samples has led to a data explosion that could help us understand the contributions of environment and germline to risk, therapeutic response, and outcome. To maximize the value of these data, an interdisciplinary approach is paramount. The National Cancer Institute (NCI) has initiated multiple projects to characterize tumor samples using multi-omic approaches. These projects harness the expertise of clinicians, biologists, computer scientists, and software engineers to investigate cancer biology and therapeutic response in multidisciplinary teams. Petabytes of cancer genomic, transcriptomic, epigenomic, proteomic, and imaging data have been generated by these projects. To address the data analysis challenges associated with these large datasets, the NCI has sponsored the development of the Genomic Data Commons (GDC) and three Cloud Resources. The GDC ensures data and metadata quality, ingests and harmonizes genomic data, and securely redistributes the data. During its pilot phase, the Cloud Resources tested multiple cloud-based approaches for enhancing data access, collaboration, computational scalability, resource democratization, and reproducibility. These NCI-led efforts are continuously being refined to better support open data practices and precision oncology, and to serve as building blocks of the NCI Cancer Research Data Commons. PMID:28983483
NASA Astrophysics Data System (ADS)
Zhu, Xiaoyuan; Zhang, Hui; Yang, Bo; Zhang, Guichen
2018-01-01
In order to improve oscillation damping control performance as well as gear shift quality of an electric vehicle equipped with an integrated motor-transmission system, a cloud-based shaft torque estimation scheme is proposed in this paper, using measurable motor and wheel speed signals transmitted over a wireless network. It can help reduce the computational burden of onboard controllers and also relieve the network bandwidth requirement of the individual vehicle. Considering possible delays during wireless signal transmission, a delay-dependent full-order observer design is proposed to estimate the shaft torque in the cloud server. With these random delays modeled using a homogeneous Markov chain, robust H∞ performance is adopted to minimize the effect of wireless network-induced delays, signal measurement noise and system modeling uncertainties on the shaft torque estimation error. Observer parameters are derived by solving linear matrix inequalities, and simulation results using acceleration and tip-in/tip-out tests demonstrate the effectiveness of the proposed shaft torque observer design.
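To make the observer idea concrete, here is a minimal sketch of a discrete-time state observer that reconstructs shaft torque from motor and wheel speeds for a simple two-inertia driveline; the model, parameter values and pole-placement gain are illustrative assumptions and not the paper's delay-dependent H∞ design.

```python
import numpy as np
from scipy.signal import place_poles

# Two-inertia driveline (illustrative parameters): motor inertia J_m, wheel-side
# inertia J_w, shaft stiffness k_s, viscous friction b_m / b_w, sample time dt.
J_m, J_w, k_s, b_m, b_w, dt = 0.2, 1.5, 500.0, 0.05, 5.0, 0.001

# State x = [motor speed, wheel speed, shaft torque]; input u = motor torque;
# measurements y = [motor speed, wheel speed] (the wirelessly transmitted signals).
A = np.array([[-b_m / J_m, 0.0, -1.0 / J_m],
              [0.0, -b_w / J_w, 1.0 / J_w],
              [k_s, -k_s, 0.0]])
B = np.array([[1.0 / J_m], [0.0], [0.0]])
C = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

Ad = np.eye(3) + dt * A                                      # forward-Euler discretization
Bd = dt * B.flatten()
L = place_poles(Ad.T, C.T, [0.5, 0.6, 0.7]).gain_matrix.T    # observer gain

def observer_step(x_hat, u, y):
    """One server-side update of the shaft-torque estimate."""
    return Ad @ x_hat + Bd * u + L @ (y - C @ x_hat)

# Simulate a constant-torque acceleration with noisy speed measurements
rng = np.random.default_rng(0)
x, x_hat = np.zeros(3), np.zeros(3)
for _ in range(2000):
    x = Ad @ x + Bd * 20.0
    y = C @ x + rng.normal(0.0, 0.01, 2)
    x_hat = observer_step(x_hat, 20.0, y)
print("true vs. estimated shaft torque:", round(x[2], 2), round(x_hat[2], 2))
```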
A 12CO J = 4→3 High-Velocity Cloud in the Large Magellanic Cloud
NASA Astrophysics Data System (ADS)
Kim, Sungeun; Walsh, Wilfred; Xiao, Kecheng; Lane, Adair P.
2005-10-01
We present Antarctic Submillimeter Telescope and Remote Observatory observations of 12CO J = 4→3 and [C I] emission in the 30 Doradus complex in the Large Magellanic Cloud. We detected strong 12CO J = 4→3 emission toward R140, a multiple system of Wolf-Rayet stars located on the rim of the expanding H II shell surrounding the R136 cluster. We also detected a high-velocity gas component as a separate feature in the 12CO J = 4→3 spectrum. This component probably originates from molecular material accelerated as a result of the combined motion induced by the stellar winds and explosions of supernovae, including several fast-expanding H II shells in the complex. The lower limit on the total kinetic energy of the atomic and molecular gas component is ~2×10^{51} ergs, suggesting that this comprises only 20% of the total kinetic energy contained in the H II complex structure.
Impacts of cloud water droplets on the OH production rate from peroxide photolysis.
Martins-Costa, M T C; Anglada, J M; Francisco, J S; Ruiz-López, Manuel F
2017-12-06
Understanding the difference between observed and modeled concentrations of HOx radicals in the troposphere is a current major issue in atmospheric chemistry. It is widely believed that existing atmospheric models miss a source of such radicals, and several potential new sources have been proposed. In recent years, interest has increased in the role played by cloud droplets and organic aerosols. Computer modeling of ozone photolysis, for instance, has shown that atmospheric aqueous interfaces accelerate the associated OH production rate by as much as 3-4 orders of magnitude. Since methylhydroperoxide is a main source and sink of HOx radicals, especially at low NOx concentrations, it is fundamental to assess the influence of clouds on its chemistry and photochemistry. In this study, computer simulations of the photolysis of methylhydroperoxide at the air-water interface have been carried out, showing that the OH production rate is severely enhanced, reaching a level comparable to ozone photolysis.
Geo-spatial Service and Application based on National E-government Network Platform and Cloud
NASA Astrophysics Data System (ADS)
Meng, X.; Deng, Y.; Li, H.; Yao, L.; Shi, J.
2014-04-01
With the acceleration of China's informatization process, the party and government have taken substantive strides in advancing the development and application of digital technology, which promotes the evolution of e-government and its informatization. Meanwhile, as a service mode based on innovative resources, cloud computing can connect huge resource pools to provide a variety of IT services, and has become a relatively mature technical pattern with further studies and massive practical applications. Based on cloud computing technology and the national e-government network platform, the "National Natural Resources and Geospatial Database (NRGD)" project integrated and transformed natural resources and geospatial information dispersed across various sectors and regions, established a logically unified and physically dispersed fundamental database, and developed a national integrated information database system supporting main e-government applications. Cross-sector e-government applications and services are realized to provide long-term, stable and standardized natural resources and geospatial fundamental information products and services for national e-government and public users.
Recovering a MOND-like acceleration law in mimetic gravity
NASA Astrophysics Data System (ADS)
Vagnozzi, Sunny
2017-09-01
We reconsider the recently proposed mimetic gravity, focusing in particular on whether the theory is able to reproduce the inferred flat rotation curves of galaxies. We extend the theory by adding a non-minimal coupling between matter and mimetic field. Such coupling leads to the appearance of an extra force which renders the motion of test particles non-geodesic. By studying the weak field limit of the resulting equations of motion, we demonstrate that in the Newtonian limit the acceleration law induced by the non-minimal coupling reduces to a modified Newtonian dynamics (MOND)-like one. In this way, it is possible to reproduce the successes of MOND, namely the explanation for the flat galactic rotation curves and the Tully-Fisher relation, within the framework of mimetic gravity, without the need for particle dark matter. The scale-dependence of the recovered acceleration scale opens up the possibility of addressing the missing mass problem not only on galactic but also on cluster scales: we defer a full study of this issue, together with a complete analysis of fits to spiral galaxy rotation curves, to an upcoming companion paper.
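For orientation, the MOND-like limiting behaviour referred to above can be summarised as follows (standard MOND phenomenology with characteristic acceleration a_0, stated for context rather than taken from the paper):

```latex
a \simeq
\begin{cases}
a_{\mathrm{N}} = \dfrac{GM}{r^{2}}, & a \gg a_{0},\\[1.5ex]
\sqrt{a_{\mathrm{N}}\, a_{0}},      & a \ll a_{0},
\end{cases}
\qquad \Rightarrow \qquad
v^{4} = G M a_{0},
```

so that in the deep-MOND regime the circular velocity becomes independent of radius (flat rotation curves) and scales with the fourth root of the baryonic mass (the Tully-Fisher relation).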
NASA Astrophysics Data System (ADS)
Núñez, M.; Robie, T.; Vlachos, D. G.
2017-10-01
Kinetic Monte Carlo (KMC) simulation provides insights into catalytic reactions unobtainable with either experiments or mean-field microkinetic models. Sensitivity analysis of KMC models assesses the robustness of the predictions to parametric perturbations and identifies rate determining steps in a chemical reaction network. Stiffness in the chemical reaction network, a ubiquitous feature, demands lengthy run times for KMC models and renders efficient sensitivity analysis based on the likelihood ratio method unusable. We address the challenge of efficiently conducting KMC simulations and performing accurate sensitivity analysis in systems with unknown time scales by employing two acceleration techniques: rate constant rescaling and parallel processing. We develop statistical criteria that ensure sufficient sampling of non-equilibrium steady state conditions. Our approach provides the twofold benefit of accelerating the simulation itself and enabling likelihood ratio sensitivity analysis, which provides further speedup relative to finite difference sensitivity analysis. As a result, the likelihood ratio method can be applied to real chemistry. We apply our methodology to the water-gas shift reaction on Pt(111).
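As a minimal illustration of the rate-constant-rescaling idea mentioned above, the sketch below runs a toy Gillespie-type KMC loop in which the rate constants of a fast, quasi-equilibrated adsorption/desorption pair are scaled down to reduce stiffness; the network, rate constants and scaling factor are hypothetical, and the statistical criteria and likelihood-ratio machinery of the study are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

def propensities(k, x, n_sites=100):
    a_star, b_star = x
    empty = n_sites - a_star - b_star
    return np.array([k[0] * empty,    # step 0: adsorption,  * -> A*
                     k[1] * a_star,   # step 1: desorption,  A* -> *
                     k[2] * a_star])  # step 2: reaction,    A* -> B*

def kmc(k, x0, t_end, scale_fast=1.0, fast=()):
    """Minimal Gillespie-type KMC with optional rescaling of fast steps.
    Without rescaling, the fast pair would force ~10^8 events per unit time."""
    k = np.array(k, dtype=float)
    k[list(fast)] /= scale_fast             # slow down the quasi-equilibrated pair
    stoich = np.array([[+1, 0], [-1, 0], [-1, +1]])
    x, t = np.array(x0, dtype=int), 0.0
    while t < t_end:
        a = propensities(k, x)
        a_tot = a.sum()
        if a_tot <= 0:
            break
        t += rng.exponential(1.0 / a_tot)   # waiting time to the next event
        j = rng.choice(3, p=a / a_tot)      # which elementary step fires
        x = x + stoich[j]
    return x

# Adsorption/desorption (steps 0, 1) are ~10^6 faster than the surface reaction (step 2)
print(kmc(k=[1e6, 1e6, 1.0], x0=[0, 0], t_end=2.0, scale_fast=1e4, fast=(0, 1)))
```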
Clouds Sailing Overhead on Mars, Enhanced
2017-08-09
Wispy clouds float across the Martian sky in this accelerated sequence of enhanced images from NASA's Curiosity Mars rover. The rover's Navigation Camera (Navcam) took these eight images over a span of four minutes early in the morning of the mission's 1,758th Martian day, or sol (July 17, 2017), aiming nearly straight overhead. They have been processed by first making a "flat field" adjustment for known differences in sensitivity among pixels and correcting for camera artifacts due to light reflecting within the camera, and then generating an "average" of all the frames and subtracting that average from each frame. This subtraction emphasizes any changes due to movement or lighting. The clouds are also visible, though fainter, in a raw image sequence from these same observations. On the same Martian morning, Curiosity also observed clouds near the southern horizon. The clouds resemble Earth's cirrus clouds, which are ice crystals at high altitudes. These Martian clouds are likely composed of crystals of water ice that condense onto dust grains in the cold Martian atmosphere. Cirrus wisps appear as ice crystals fall and evaporate in patterns known as "fall streaks" or "mare's tails." Such patterns have been seen before at high latitudes on Mars, for instance by the Phoenix Mars Lander in 2008, and seasonally nearer the equator, for instance by the Opportunity rover. However, Curiosity has not previously observed such clouds so clearly visible from the rover's study area about five degrees south of the equator. The Hubble Space Telescope and spacecraft orbiting Mars have observed a band of clouds to appear near the Martian equator around the time of the Martian year when the planet is farthest from the Sun. With a more elliptical orbit than Earth's, Mars experiences more annual variation than Earth in its distance from the Sun. The most distant point in an orbit around the Sun is called the aphelion. The near-equatorial Martian cloud pattern observed at that time of year is called the "aphelion cloud belt." These new images from Curiosity were taken about two months before aphelion, but the morning clouds observed may be an early stage of the aphelion cloud belt. An animation is available at https://photojournal.jpl.nasa.gov/catalog/PIA21841
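The processing described above (flat-field correction followed by mean-frame subtraction) can be sketched in a few lines of numpy; the array names and the division-based flat-field step are assumptions for illustration, not the actual JPL pipeline.

```python
import numpy as np

def emphasize_motion(frames, flat_field):
    """Mean-frame subtraction after a flat-field correction, as described above."""
    frames = np.asarray(frames, dtype=float)
    corrected = frames / flat_field            # remove pixel-to-pixel sensitivity
    mean_frame = corrected.mean(axis=0)        # "average" of all frames
    return corrected - mean_frame              # static scene cancels, motion remains

# Example with synthetic data: 8 frames of 32x32 pixels
rng = np.random.default_rng(0)
stack = rng.uniform(0.4, 0.6, (8, 32, 32))
flat = np.ones((32, 32))
residual = emphasize_motion(stack, flat)
print(residual.shape, residual.mean())
```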
Clouds Sailing Above Martian Horizon, Enhanced
2017-08-09
Clouds drift across the sky above a Martian horizon in this accelerated sequence of enhanced images from NASA's Curiosity Mars rover. The rover's Navigation Camera (Navcam) took these eight images over a span of four minutes early in the morning of the mission's 1,758th Martian day, or sol (July 17, 2017), aiming toward the south horizon. They have been processed by first making a "flat field" adjustment for known differences in sensitivity among pixels and correcting for camera artifacts due to light reflecting within the camera, and then generating an "average" of all the frames and subtracting that average from each frame. This subtraction emphasizes changes whether due to movement -- such as the clouds' motion -- or due to lighting -- such as changing shadows on the ground as the morning sunlight angle changed. On the same Martian morning, Curiosity also observed clouds nearly straight overhead. The clouds resemble Earth's cirrus clouds, which are ice crystals at high altitudes. These Martian clouds are likely composed of crystals of water ice that condense onto dust grains in the cold Martian atmosphere. Cirrus wisps appear as ice crystals fall and evaporate in patterns known as "fall streaks" or "mare's tails." Such patterns have been seen before at high latitudes on Mars, for instance by the Phoenix Mars Lander in 2008, and seasonally nearer the equator, for instance by the Opportunity rover. However, Curiosity has not previously observed such clouds so clearly visible from the rover's study area about five degrees south of the equator. The Hubble Space Telescope and spacecraft orbiting Mars have observed a band of clouds to appear near the Martian equator around the time of the Martian year when the planet is farthest from the Sun. With a more elliptical orbit than Earth's, Mars experiences more annual variation than Earth in its distance from the Sun. The most distant point in an orbit around the Sun is called the aphelion. The near-equatorial Martian cloud pattern observed at that time of year is called the "aphelion cloud belt." These new images from Curiosity were taken about two months before aphelion, but the morning clouds observed may be an early stage of the aphelion cloud belt. An animation is available at https://photojournal.jpl.nasa.gov/catalog/PIA21840
NASA Astrophysics Data System (ADS)
Casey, K. S.; Hausman, S. A.
2016-02-01
In the last year, the NOAA National Oceanographic Data Center (NODC) and its siblings, the National Climatic Data Center and the National Geophysical Data Center, were merged into one organization, the NOAA National Centers for Environmental Information (NCEI). Combining its expertise under one management has helped NCEI accelerate its efforts to embrace and integrate private, public, and hybrid cloud environments into its range of data stewardship services. These services span a range of tiers, from basic, long-term preservation and access, through enhanced access and scientific quality control, to authoritative product development and international-level services. Throughout these tiers of stewardship, partnerships and pilot projects have been launched to identify technological and policy-oriented challenges, to establish solutions to these problems, and to highlight success stories for emulation during operational integration of the cloud into NCEI's data stewardship activities. Some of these pilot activities include data storage, access, and reprocessing in Amazon Web Services, the OneStop data discovery and access framework project, and a set of Cooperative Research and Development Agreements under the Big Data Project with Amazon, Google, IBM, Microsoft, and the Open Cloud Consortium. Progress in these efforts will be highlighted along with a future vision of how NCEI could leverage hybrid cloud deployments and federated systems across NOAA to enable effective data stewardship for its oceanographic, atmospheric, climatic, and geophysical Big Data.
Towards real-time photon Monte Carlo dose calculation in the cloud
NASA Astrophysics Data System (ADS)
Ziegenhein, Peter; Kozin, Igor N.; Kamerling, Cornelis Ph; Oelfke, Uwe
2017-06-01
Near real-time application of Monte Carlo (MC) dose calculation in clinic and research is hindered by the long computational runtimes of established software. Currently, fast MC software solutions are available utilising accelerators such as graphical processing units (GPUs) or clusters based on central processing units (CPUs). Both platforms are expensive in terms of purchase costs and maintenance and, in case of the GPU, provide only limited scalability. In this work we propose a cloud-based MC solution, which offers high scalability of accurate photon dose calculations. The MC simulations run on a private virtual supercomputer that is formed in the cloud. Computational resources can be provisioned dynamically at low cost without upfront investment in expensive hardware. A client-server software solution has been developed which controls the simulations and transports data to and from the cloud efficiently and securely. The client application integrates seamlessly into a treatment planning system. It runs the MC simulation workflow automatically and securely exchanges simulation data with the server side application that controls the virtual supercomputer. Advanced encryption standards were used to add an additional security layer, which encrypts and decrypts patient data on-the-fly at the processor register level. We could show that our cloud-based MC framework enables near real-time dose computation. It delivers excellent linear scaling for high-resolution datasets with absolute runtimes of 1.1 seconds to 10.9 seconds for simulating a clinical prostate and liver case up to 1% statistical uncertainty. The computation runtimes include the transportation of data to and from the cloud as well as process scheduling and synchronisation overhead. Cloud-based MC simulations offer a fast, affordable and easily accessible alternative for near real-time accurate dose calculations to currently used GPU or cluster solutions.
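The abstract above describes encrypting patient data on the fly before it leaves the clinic; as a minimal illustration of that general pattern (not the authors' register-level AES implementation), the sketch below uses the Python cryptography package to encrypt a data buffer client-side and decrypt it server-side.

```python
from cryptography.fernet import Fernet   # AES-based authenticated encryption

# In practice the key would be provisioned securely to the cloud application;
# here it is generated locally purely for illustration.
key = Fernet.generate_key()
client_cipher = Fernet(key)
server_cipher = Fernet(key)

# "Patient data" stands in for the anonymised simulation input actually sent.
payload = b"CT geometry + beam configuration for MC dose calculation"
token = client_cipher.encrypt(payload)        # encrypted before upload
restored = server_cipher.decrypt(token)       # decrypted in the virtual cluster
assert restored == payload
print(len(token), "bytes transmitted")
```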
Hybrid glowworm swarm optimization for task scheduling in the cloud environment
NASA Astrophysics Data System (ADS)
Zhou, Jing; Dong, Shoubin
2018-06-01
In recent years many heuristic algorithms have been proposed to solve task scheduling problems in the cloud environment owing to their optimization capability. This article proposes a hybrid glowworm swarm optimization (HGSO) based on glowworm swarm optimization (GSO), which combines evolutionary computation, a quantum-behaviour strategy based on the neighbourhood principle, offspring production and random walk to achieve more efficient scheduling at reasonable scheduling cost. The proposed HGSO reduces redundant computation and the dependence on the initialization of GSO, accelerates convergence and escapes more easily from local optima. The conducted experiments and statistical analysis showed that in most cases the proposed HGSO algorithm outperformed previous heuristic algorithms when dealing with independent tasks.
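For orientation, the sketch below implements one iteration of the basic GSO mechanics (luciferin update, neighbourhood selection, movement toward a brighter glowworm) on a toy continuous surrogate of a scheduling cost; the hybrid quantum-behaviour and offspring operators of HGSO are not included, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def gso_step(x, luciferin, cost, rho=0.4, gamma=0.6, step=0.03, radius=0.5):
    """One iteration of basic glowworm swarm optimization, minimizing `cost`."""
    luciferin = (1 - rho) * luciferin + gamma * (-cost(x))   # brighter = better
    new_x = x.copy()
    for i in range(len(x)):
        d = np.linalg.norm(x - x[i], axis=1)
        nbrs = np.where((d < radius) & (luciferin > luciferin[i]))[0]
        if nbrs.size:                                        # move toward a brighter neighbour
            p = luciferin[nbrs] - luciferin[i]
            j = rng.choice(nbrs, p=p / p.sum())
            new_x[i] = x[i] + step * (x[j] - x[i]) / (np.linalg.norm(x[j] - x[i]) + 1e-12)
    return new_x, luciferin

# Toy scheduling surrogate: minimize the squared distance to an "ideal" cost vector
cost = lambda x: np.sum((x - 0.3) ** 2, axis=1)
pos = rng.uniform(0, 1, (30, 5))
lucif = np.full(30, 5.0)
for _ in range(200):
    pos, lucif = gso_step(pos, lucif, cost)
print("best cost:", cost(pos).min())
```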
Status of a Parkes Survey of the Large Magellanic Cloud for Millisecond Pulsars and Transients
NASA Astrophysics Data System (ADS)
Crawford, Fronefield; Lorimer, Duncan; Ridley, Josh; Bonidie, Victoria; Faisal Alam, Md
2018-01-01
To date, no millisecond radio pulsars have been discovered outside of our Galaxy. We are undertaking the first survey of the Large Magellanic Cloud that is sensitive to millisecond pulsars. For this search we are using the 1.4 GHz multibeam receiver on the Parkes 64-m telescope. We also hope to discover new source populations and probe the high-end of the pulsar luminosity function. We are searching our data over a wide range of dispersion measures for both single-pulse events and for accelerated pulsars. With about 40% of the survey completed, we have discovered three new long-period pulsars (all of which have been published) but have not yet confirmed any new millisecond pulsars.
Kinematics of the Optically Visible YSOs toward the Orion B Molecular Cloud
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kounkel, Marina; Hartmann, Lee; Mateo, Mario
2017-08-01
We present results from high-resolution optical spectra toward 66 young stars in the Orion B molecular cloud to study their kinematics and other properties. Observations of the Hα and Li I 6707 Å lines are used to check membership and accretion properties. While the stellar radial velocities of NGC 2068 and L1622 are in good agreement with that of the molecular gas, many of the stars in NGC 2024 show a considerable offset. This could be a signature of the expansion of the cluster, a high degree of ejection of stars from the cluster through dynamical interaction, or the acceleration of the gas due to stellar feedback.
Compression and accelerated rendering of volume data using DWT
NASA Astrophysics Data System (ADS)
Kamath, Preyas; Akleman, Ergun; Chan, Andrew K.
1998-09-01
2D images cannot convey information on object depth and location relative to surfaces. The medical community is increasingly using 3D visualization techniques to view data from CT scans, MRI, etc. 3D images provide more information on depth and location in the spatial domain to help surgeons make better diagnoses of the problem. 3D images can be constructed from 2D images using 3D scalar algorithms. With recent advances in communication techniques, it is possible for doctors to diagnose and plan treatment of a patient who lives at a remote location, by transmitting the relevant patient data via telephone lines. If this information is to be reconstructed in 3D, then 2D images must be transmitted. However, 2D dataset storage occupies a lot of memory, and visualization algorithms are slow. We describe in this paper a scheme which reduces the data transfer time by transmitting only the information that the doctor wants. Compression is achieved by reducing the amount of data transferred, using the 3D wavelet transform applied to 3D datasets. Since the wavelet transform is localized in the frequency and spatial domains, we transmit detail only in the region where the doctor needs it. Since only the ROI (region of interest) is reconstructed in detail, we need to render only the ROI in detail, and thus we can reduce the rendering time.
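Given the scheme above (3D wavelet decomposition, selective transmission, reconstruction), here is a minimal sketch using PyWavelets that compresses a synthetic volume by keeping only the largest-magnitude coefficients before reconstructing; the wavelet choice, decomposition level and retained fraction are illustrative assumptions, and the ROI-selective transmission step is not reproduced.

```python
import numpy as np
import pywt

def compress_volume(volume, wavelet="db2", level=2, keep=0.05):
    """Compress a 3-D dataset with the 3-D DWT by keeping only the largest
    `keep` fraction of coefficients, then reconstruct."""
    coeffs = pywt.wavedecn(volume, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1.0 - keep)     # keep the top 5 % by magnitude
    arr_c = np.where(np.abs(arr) >= thresh, arr, 0.0)
    rec = pywt.waverecn(pywt.array_to_coeffs(arr_c, slices, output_format="wavedecn"),
                        wavelet)
    return rec[:volume.shape[0], :volume.shape[1], :volume.shape[2]]

# Example on a synthetic 64^3 "scan"
vol = np.random.default_rng(0).normal(size=(64, 64, 64)).cumsum(axis=0)
rec = compress_volume(vol)
print("relative error:", np.linalg.norm(rec - vol) / np.linalg.norm(vol))
```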
Machine Learning for Flood Prediction in Google Earth Engine
NASA Astrophysics Data System (ADS)
Kuhn, C.; Tellman, B.; Max, S. A.; Schwarz, B.
2015-12-01
With the increasing availability of high-resolution satellite imagery, dynamic flood mapping in near real time is becoming a reachable goal for decision-makers. This talk describes a newly developed framework for predicting biophysical flood vulnerability using public data, cloud computing and machine learning. Our objective is to define an approach to flood inundation modeling using statistical learning methods deployed in a cloud-based computing platform. Traditionally, static flood extent maps grounded in physically based hydrologic models can require hours of human expertise to construct at significant financial cost. In addition, desktop modeling software and limited local server storage can impose restraints on the size and resolution of input datasets. Data-driven, cloud-based processing holds promise for predictive watershed modeling at a wide range of spatio-temporal scales. However, these benefits come with constraints. In particular, parallel computing limits a modeler's ability to simulate the flow of water across a landscape, rendering traditional routing algorithms unusable in this platform. Our project pushes these limits by testing the performance of two machine learning algorithms, Support Vector Machine (SVM) and Random Forests, at predicting flood extent. Constructed in Google Earth Engine, the model mines a suite of publicly available satellite imagery layers to use as algorithm inputs. Results are cross-validated using MODIS-based flood maps created using the Dartmouth Flood Observatory detection algorithm. Model uncertainty highlights the difficulty of deploying unbalanced training data sets based on rare extreme events.
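A minimal sketch of the per-pixel classification idea described above, using scikit-learn's Random Forest with cross-validation; the synthetic pixels, feature names and labels are stand-ins for the Earth Engine predictor layers and MODIS-derived flood maps, not the project's actual data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Rows are pixels, columns are hypothetical public predictor layers
# (e.g. elevation, slope, NDVI, rainfall); labels are flooded / not flooded.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 4))                       # [elevation, slope, ndvi, rain]
y = (0.8 * X[:, 3] - 0.6 * X[:, 0] + rng.normal(0, 0.5, 5000) > 0).astype(int)

model = RandomForestClassifier(n_estimators=200, class_weight="balanced", random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC:", scores.mean().round(3))
```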
NASA Astrophysics Data System (ADS)
Bonanno, D.; China, S.; Fraund, M. W.; Pham, D.; Kulkarni, G.; Laskin, A.; Gilles, M. K.; Moffet, R.
2016-12-01
The Holistic Interactions of Shallow Clouds, Aerosols, and Land-Ecosystems (HI-SCALE) Campaign was carried out to gain a better understanding of the lifecycle of shallow clouds. The HI-SCALE experiment was designed to contrast two seasons, wet and dry, and determine their effect on atmospheric cloud and aerosol processes. The spring component of HI-SCALE was selected to characterize the mixing state of particles collected onto substrates. Sampling was performed before and after rain events to obtain airborne soil organic particles (ASOP), which are ejected after rain events. The unique composition of the ASOP may affect optical properties and/or hygroscopic properties. The collection of particles took place at the Atmospheric Radiation Measurement Southern Great Plains (ARM SGP) field site. The Scanning Transmission X-ray Microscope (STXM) was used to image the samples collected during the first HI-SCALE Campaign to determine the carbonaceous mixing state. Scanning electron microscopy with energy-dispersive X-ray spectroscopy (SEM/EDX) is more sensitive to the inorganic makeup of particles, while STXM renders a more comprehensive analysis of the organics. Measurements such as nephelometry, Particle Soot Absorption Photometry (PSAP), and Aerosol Mass Spectrometry (AMS) from the ARM archive will be correlated with the microscopy measurements. The primary focus is the relation between the composition and morphology of ASOP and their hygroscopicity and optical properties. Further investigation of these organic particles will be performed to provide a mixing state parameterization and aid in the advancement of current climate models.
External front instabilities induced by a shocked particle ring.
Rodriguez, V; Saurel, R; Jourdan, G; Houas, L
2014-10-01
The dispersion of a cylindrical particle ring by a blast or shock wave induces the formation of coherent structures which take the form of particle jets. A blast wave, issuing from the discharge of a planar shock wave at the exit of a conventional shock tube, is generated in the center of a granular medium ring initially confined inside a Hele-Shaw cell. With the present experimental setup, under impulsive acceleration, a solid particle-jet formation is observed in a quasi-two-dimensional configuration. The aim of the present investigation is to observe in detail the formation of very thin perturbations created around the external surface of the dispersed particle layer. By means of fast flow visualization with an appropriate recording window, we focus solely on the first instants during which the external particle ring becomes unstable. We find that the critical area of the destabilization of the external ring surface is constant regardless of the acceleration of the initial layer. Moreover, we observe in detail the external front perturbation wavelength, rendered dimensionless by the initial ring perimeter, and follow its evolution with the initial particle layer acceleration. We report this quantity to be constant regardless of the evolution of the initial particle layer acceleration. Finally, we can reasonably assert that external front perturbations depend solely on the material of the particles.
Star Polymers Reduce Islet Amyloid Polypeptide Toxicity via Accelerated Amyloid Aggregation.
Pilkington, Emily H; Lai, May; Ge, Xinwei; Stanley, William J; Wang, Bo; Wang, Miaoyi; Kakinen, Aleksandr; Sani, Marc-Antonie; Whittaker, Michael R; Gurzov, Esteban N; Ding, Feng; Quinn, John F; Davis, Thomas P; Ke, Pu Chun
2017-12-11
Protein aggregation into amyloid fibrils is a ubiquitous phenomenon across the spectrum of neurodegenerative disorders and type 2 diabetes. A common strategy against amyloidogenesis is to minimize the populations of toxic oligomers and protofibrils by inhibiting protein aggregation with small molecules or nanoparticles. However, melanin synthesis in nature is realized by accelerated protein fibrillation to circumvent accumulation of toxic intermediates. Accordingly, we designed and demonstrated the use of star-shaped poly(2-hydroxyethyl acrylate) (PHEA) nanostructures for promoting aggregation while ameliorating the toxicity of human islet amyloid polypeptide (IAPP), the peptide involved in glycemic control and the pathology of type 2 diabetes. The binding of PHEA elevated the β-sheet content in IAPP aggregates while rendering a new morphology of "stelliform" amyloids originating from the polymers. Atomistic molecular dynamics simulations revealed that the PHEA arms served as rodlike scaffolds for IAPP binding and subsequently accelerated IAPP aggregation by increased local peptide concentration. The tertiary structure of the star nanoparticles was found to be essential for driving the specific interactions required to impel the accelerated IAPP aggregation. This study sheds new light on the structure-toxicity relationship of IAPP and points to the potential of exploiting star polymers as a new class of therapeutic agents against amyloidogenesis.
Cosmic-ray ionisation of dense molecular clouds
NASA Astrophysics Data System (ADS)
Vaupre, Solenn
2015-07-01
Cosmic rays (CRs) are of tremendous importance in the dynamical and chemical evolution of interstellar molecular clouds, where stars and planets form. CRs are likely accelerated in the shells of supernova remnants (SNRs), so nearby molecular clouds can be irradiated by intense fluxes of CRs. CR protons have two major effects on dense molecular clouds: 1) when they encounter the dense medium, high-energy protons (>280 MeV) create pions that decay into gamma-rays. This process makes SNR-molecular cloud associations intense GeV and/or TeV sources whose spectra mimic the CR spectrum. 2) at lower energies, CRs penetrate the cloud and ionise the gas, leading to the formation of molecular species characteristic of the presence of CRs, called tracers of the ionisation. Studying these tracers gives information on low-energy CRs that is inaccessible to any other observations. I studied the CR ionisation of molecular clouds next to three SNRs: W28, W51C and W44. These SNRs are known to be interacting with the nearby clouds, from the presence of shocked gas, OH masers and pion-decay induced gamma-ray emission. My work includes millimeter observations and chemical modeling of tracers of the ionisation in these dense molecular clouds. In these three regions, we determined an enhanced CR ionisation rate, supporting the hypothesis of an origin of the CRs in the nearby SNR. The evolution of the CR ionisation rate with distance to the SNR brings valuable constraints on the propagation properties of low-energy CRs. The method used relies on observations of the molecular ions HCO+ and DCO+, which show crucial limitations at high ionisation. Therefore, I investigated, both through modeling and observations, the chemical abundances of several other species to try to identify alternative tracers of the ionisation. In particular, in the W44 region, observations of N2H+ bring additional constraints on the physical conditions, volatile abundances in the cloud, and the ionisation state. This research brought valuable insight into the CR-induced chemistry in the interstellar medium. It also brought new perspectives of interdisciplinary research towards the understanding of CRs, from millimeter to gamma-ray observations.
Eulerian and Lagrangian approaches to multidimensional condensation and collection
NASA Astrophysics Data System (ADS)
Li, Xiang-Yu; Brandenburg, A.; Haugen, N. E. L.; Svensson, G.
2017-06-01
Turbulence is argued to play a crucial role in cloud droplet growth. The combined problem of turbulence and cloud droplet growth is numerically challenging. Here an Eulerian scheme based on the Smoluchowski equation is compared with two Lagrangian superparticle (or superdroplet) schemes in the presence of condensation and collection. The growth processes are studied either separately or in combination using either two-dimensional turbulence, a steady flow or just gravitational acceleration without gas flow. Good agreement between the different schemes for the time evolution of the size spectra is observed in the presence of gravity or turbulence. The Lagrangian superparticle schemes are found to be superior over the Eulerian one in terms of computational performance. However, it is shown that the use of interpolation schemes such as the cloud-in-cell algorithm is detrimental in connection with superparticle or superdroplet approaches. Furthermore, the use of symmetric over asymmetric collection schemes is shown to reduce the amount of scatter in the results. For the Eulerian scheme, gravitational collection is rather sensitive to the mass bin resolution, but not so in the case with turbulence.
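For reference, the collection process compared between the Eulerian and Lagrangian schemes is governed by the Smoluchowski (coagulation) equation, written here in its standard continuous form (notation is generic, not copied from the paper):

```latex
\frac{\partial n(m,t)}{\partial t}
  = \frac{1}{2}\int_{0}^{m} K(m-m',\,m')\,n(m-m',t)\,n(m',t)\,\mathrm{d}m'
  \;-\; n(m,t)\int_{0}^{\infty} K(m,m')\,n(m',t)\,\mathrm{d}m'
```

Here n(m, t) is the droplet number distribution over mass and K(m, m') is the collection kernel; the Eulerian scheme discretizes this equation on mass bins, while the Lagrangian superparticle schemes sample collisions stochastically.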
NASA Astrophysics Data System (ADS)
Hsu, Juno; Prather, Michael J.; Cameron-Smith, Philip; Veidenbaum, Alex; Nicolau, Alex
2017-07-01
Solar-J is a comprehensive radiative transfer model for the solar spectrum that addresses the needs of both solar heating and photochemistry in Earth system models. Solar-J is a spectral extension of Cloud-J, a standard in many chemical models that calculates photolysis rates in the 0.18-0.8 µm region. The Cloud-J core consists of an eight-stream scattering, plane-parallel radiative transfer solver with corrections for sphericity. Cloud-J uses cloud quadrature to accurately average over correlated cloud layers. It uses the scattering phase function of aerosols and clouds expanded to eighth order and thus avoids isotropic-equivalent approximations prevalent in most solar heating codes. The spectral extension from 0.8 to 12 µm enables calculation of both scattered and absorbed sunlight and thus aerosol direct radiative effects and heating rates throughout the Earth's atmosphere. The Solar-J extension adopts the correlated-k gas absorption bins, primarily water vapor, from the shortwave Rapid Radiative Transfer Model for general circulation model (GCM) applications (RRTMG-SW). Solar-J successfully matches RRTMG-SW's tropospheric heating profile in a clear-sky, aerosol-free, tropical atmosphere. We compare both codes in cloudy atmospheres with a liquid-water stratus cloud and an ice-crystal cirrus cloud. For the stratus cloud, both models use the same physical properties, and we find a systematic low bias of about 3 % in planetary albedo across all solar zenith angles caused by RRTMG-SW's two-stream scattering. Discrepancies with the cirrus cloud using any of RRTMG-SW's three different parameterizations are as large as about 20-40 % depending on the solar zenith angles and occur throughout the atmosphere. Effectively, Solar-J has combined the best components of RRTMG-SW and Cloud-J to build a high-fidelity module for the scattering and absorption of sunlight in the Earth's atmosphere, for which the three major components - wavelength integration, scattering, and averaging over cloud fields - all have comparably small errors. More accurate solutions with Solar-J come with increased computational costs, about 5 times that of RRTMG-SW for a single atmosphere. There are options for reduced costs or computational acceleration that would bring costs down while maintaining improved fidelity and balanced errors.
NASA Astrophysics Data System (ADS)
Ge, C.; Wang, J.; Reid, J. S.
2013-12-01
The online-coupled Weather Research and Forecasting model with Chemistry (WRF-Chem) is used to simulate the direct and semi-direct radiative impacts of smoke particles over the southeast Asian Maritime Continent (MC, 10°S-10°N, 90°E-150°E) during October 2006, when a significant El Niño event caused the highest biomass burning activity since 1997. With the use of an OC (organic carbon)/BC (black carbon) ratio of 10 in the smoke emission inventory, the baseline simulation shows that the amplifying effect of low-level clouds on smoke absorption led to a warming effect at the top of the atmosphere (TOA), with a domain/monthly average forcing value of ~20 Wm-2 over the islands of Borneo and Sumatra. The smoke-induced monthly average daytime heating (0.3 K), which is largely confined above the low-level clouds, results in local convergence over the smoke source region. This heating-induced convergence, coupled with daytime planetary boundary layer turbulent mixing, transports more smoke particles above the planetary boundary layer height (PBLH), hence rendering a positive feedback. This positive feedback contrasts with the decrease of cloud fraction resulting from the combined effects of smoke heating within the cloud layer and increased stability in the boundary layer; the latter can be considered a negative feedback in which the decrease of cloud fraction weakens the heating by smoke particles above the clouds. During nighttime, the elevated smoke layer (above clouds in daytime) is decoupled from the boundary layer, and the reduction of PBLH due to the residual surface cooling from the daytime leads to the accumulation of smoke particles near the surface. Because of smoke radiative extinction, on a monthly basis, the solar input at the surface is reduced by as much as 60 Wm-2, which leads to decreases in sensible heat, latent heat, 2-m air temperature, and PBLH by maxima of 20 Wm-2, 20 Wm-2, 1 K, and 120 m, respectively. The cloud changes over land mostly occur over the islands of Sumatra and Borneo during the daytime, where the low-level cloud fraction decreases by more than 10%. However, the change of local wind (including the sea breeze) induced by the smoke radiative feedback leads to more convergence over the Karimata Strait and the southern coastal area of Kalimantan during both daytime and nighttime; consequently, cloud fraction is increased there by up to 20%. The sensitivities to different OC/BC ratios show the importance of the smoke single scattering albedo for the smoke semi-direct effects. A case study on 31 October 2006 further demonstrated a much larger (more than twice the monthly average) feedback induced by smoke aerosols. The decreased sea breeze during big events can lead to a prominent increase (40%) of low-level cloud over coastal water. Lastly, the direct and semi-direct radiative impacts of smoke particles over the southeast Asian Maritime Continent are summarized as a conceptual model.
Laboratory investigation of dust impacts induced signals on antennas in space
NASA Astrophysics Data System (ADS)
Rocha, J. R.; Collette, A.; Malaspina, D.; Gruen, E.; Sternovsky, Z.
2014-12-01
Recent observations of sharp voltage spikes by the WAVES electric field experiments onboard the twin STEREO spacecraft have been attributed to plasma clouds generated by the impact ionization of high-velocity dust particles. The reported dust fluxes are much higher than those measured by dedicated dust detectors at 1 AU, which leads to the interpretation that the STEREO observations are due to nanometer-sized dust particles originating from the inner solar system and accelerated to high velocities by the solar wind magnetic field. However, this interpretation is based on a simplified model of the coupling between the expanding plasma cloud from the dust impact and the WAVES electric field instrument. A series of laboratory measurements is performed to validate this model and to calibrate/investigate the effect of various impact parameters on the signals measured by the electric field instrument. The dust accelerator facility operating at the University of Colorado is used for the measurements, with micron- and submicron-sized particles accelerated to 50 km/s. The first set of measurements was aimed at understanding the charge yield of impact-generated plasmas from common materials used on spacecraft, i.e. BeCu, germanium-coated black Kapton, MLI, and solar cells. The measurements show that at 10 km/s these materials yield similar charge signals. At higher speeds (~50 km/s) the variation with material increases. The impact charge is also found to depend on the angle of incidence; the data suggest a maximum at 45 degrees. The second set of measurements investigates the variation of the induced dust signal with the bias potential applied on the simulated spacecraft.
NASA Astrophysics Data System (ADS)
Ammazzalorso, F.; Bednarz, T.; Jelen, U.
2014-03-01
We demonstrate acceleration on graphics processing units (GPUs) of the automatic identification of robust particle therapy beam setups, minimizing negative dosimetric effects of Bragg peak displacement caused by treatment-time patient positioning errors. Our particle therapy research toolkit, RobuR, was extended with OpenCL support and used to implement GPU calculation of the Port Homogeneity Index, a metric scoring irradiation port robustness through analysis of tissue density patterns prior to dose optimization and computation. Results were benchmarked against an independent native CPU implementation, and the numerical results of the two implementations agreed. For 10 skull base cases, the GPU-accelerated implementation was employed to select beam setups for proton and carbon ion treatment plans, which proved to be dosimetrically robust when recomputed in the presence of various simulated positioning errors. In terms of performance, the average running time on the GPU decreased by at least one order of magnitude compared to the CPU, rendering the GPU-accelerated analysis a feasible step in an interactive clinical treatment planning session. In conclusion, selection of robust particle therapy beam setups can be effectively accelerated on a GPU and become an unintrusive part of the particle therapy treatment planning workflow. Additionally, the speed gain opens new usage scenarios, such as interactive analysis manipulation (e.g., constraining some setups) and re-execution. Finally, through OpenCL's portable parallelism, the new implementation is also suitable for CPU-only use, taking advantage of multiple cores, and can potentially exploit types of accelerators other than GPUs.
Design of orbital debris shields for oblique hypervelocity impact
NASA Technical Reports Server (NTRS)
Fahrenthold, Eric P.
1994-01-01
A new impact debris propagation code was written to link CTH simulations of space debris shield perforation to the Lagrangian finite element code DYNA3D for space structure wall impact simulations. This software (DC3D) simulates debris cloud evolution using a nonlinear elastic-plastic deformable particle dynamics model and renders computationally tractable the supercomputer simulation of oblique impacts on Whipple shield protected structures. Comparison of three-dimensional oblique impact simulations with experimental data shows good agreement over a range of velocities of interest in the design of orbital debris shielding. Source code developed during this research is provided on the enclosed floppy disk. An abstract based on the work described was submitted to the 1994 Hypervelocity Impact Symposium.
Utilization of DIRSIG in support of real-time infrared scene generation
NASA Astrophysics Data System (ADS)
Sanders, Jeffrey S.; Brown, Scott D.
2000-07-01
Real-time infrared scene generation for hardware-in-the-loop testing has been a traditionally difficult challenge. Infrared scenes are usually generated using commercial hardware that was not designed to properly handle the thermal and environmental physics involved. Real-time infrared scenes typically lack details that are included in scenes rendered in non-real time by ray-tracing programs such as the Digital Imaging and Remote Sensing Scene Generation (DIRSIG) program. However, executing DIRSIG in real time while retaining all the physics is beyond current computational capabilities for many applications. DIRSIG is a first-principles-based synthetic image generation model that produces multi- or hyperspectral images in the 0.3 to 20 micron region of the electromagnetic spectrum. The DIRSIG model is an integrated collection of independent, first-principles-based submodels, each of which works in conjunction with the others to produce radiance field images with high radiometric fidelity. DIRSIG uses the MODTRAN radiation propagation model for exo-atmospheric irradiance, emitted and scattered radiances (upwelled and downwelled), and path transmission predictions. This radiometry submodel utilizes bidirectional reflectance data, accounts for specular and diffuse background contributions, and features path-length-dependent extinction and emission for transmissive bodies (plumes, clouds, etc.) which may be present in any target, background, or solar path. This detailed environmental modeling greatly enhances the number of rendered features and, hence, the fidelity of a rendered scene. While DIRSIG itself cannot currently be executed in real time, its outputs can be used to provide scene inputs for real-time scene generators. These inputs can incorporate significant features such as target-to-background thermal interactions, static background object thermal shadowing, and partially transmissive countermeasures. All of these features represent significant improvements over the current state of the art in real-time IR scene generation.
Measuring Gravitation Using Polarization Spectroscopy
NASA Technical Reports Server (NTRS)
Matsko, Andrey; Yu, Nan; Maleki, Lute
2004-01-01
A proposed method of measuring gravitational acceleration would involve the application of polarization spectroscopy to an ultracold, vertically moving cloud of atoms (an atomic fountain). A related proposed method, involving measurements of absorption of light pulses like those used in conventional atomic interferometry, would yield an estimate of the number of atoms participating in the interferometric interaction. The basis of the first-mentioned proposed method is that the rotation of polarization of light is affected by the acceleration of atoms along the path of propagation of the light. The rotation of polarization is associated with a phase shift: when an atom moving in a laboratory reference frame interacts with an electromagnetic wave, the energy levels of the atom are Doppler-shifted relative to where they would be if the atom were stationary. The Doppler shift gives rise to changes in the detuning of the light from the corresponding atomic transitions. This detuning, in turn, causes the electromagnetic wave to undergo a phase shift that can be measured by conventional means. One would infer the gravitational acceleration and/or the gradient of the gravitational acceleration from the phase measurements.
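The quantity that drives the effect is the time-dependent Doppler detuning acquired by atoms accelerating under gravity. The following sketch only evaluates that detuning for illustrative numbers (a transition frequency of the order of an alkali optical line and a 100 ms free fall); it is not a model of the proposed instrument.

```python
# Minimal sketch (illustrative values, not the proposed instrument's design):
# an atom in free fall acquires velocity v(t) = g*t along the beam, so the
# probe light it sees is Doppler-detuned by delta_nu(t) = nu0 * v(t) / c.
import numpy as np

c = 2.998e8          # speed of light, m/s
g = 9.81             # gravitational acceleration, m/s^2
nu0 = 3.85e14        # optical transition frequency, Hz (order of an alkali line)

t = np.linspace(0.0, 0.1, 6)     # 100 ms of free fall
v = g * t                        # velocity along the light path
delta_nu = nu0 * v / c           # first-order Doppler detuning, Hz

for ti, dv in zip(t, delta_nu):
    print(f"t = {ti*1e3:5.1f} ms  detuning ~ {dv/1e3:8.1f} kHz")
```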
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takahashi, Takuya, E-mail: takahashi@kwasan.kyoto-u.ac.jp
Flare-associated coronal shock waves sometimes interact with solar prominences, leading to large-amplitude prominence oscillations (LAPOs). Such prominence activation gives us a unique opportunity to track the time evolution of shock-cloud interaction in cosmic plasmas. Although the dynamics of interstellar shock-cloud interaction has been studied extensively, coronal shock-solar prominence interaction is rarely studied in the context of shock-cloud interaction. Associated with the X5.4-class solar flare that occurred on 2012 March 7, a globally propagating coronal shock wave interacted with a polar prominence, leading to a LAPO. In this paper, we study the bulk acceleration and the excitation of the internal flow of the shocked prominence using three-dimensional magnetohydrodynamic (MHD) simulations. We performed eight MHD simulation runs, each with a different mass density structure of the prominence, and one hydrodynamic simulation run, and compared the results. In order to compare the observed motion of the activated prominence with the corresponding simulation, we also studied prominence activation by injection of a triangular-shaped coronal shock. We find that the prominence is first accelerated mainly by the magnetic tension force as well as by direct transmission of the shock, and later decelerated mainly by the magnetic tension force. The internal flow, on the other hand, is excited while the shock front sweeps through the prominence and damps almost exponentially. We construct a phenomenological model of bulk momentum transfer from the shock to the prominence, which agrees quantitatively with all the simulation results. Based on the phenomenological prominence activation model, we diagnose physical parameters of the coronal shock wave. The estimated energy of the coronal shock is several percent of the total energy released during the X5.4 flare.
Electrical Breakdown of Anodized Structures in a Low Earth Orbital Environment
NASA Technical Reports Server (NTRS)
Galofaro, J. T.; Doreswamy, C. V.; Vayner, B. V.; Snyder, D. B.; Ferguson, D. C.
1999-01-01
A comprehensive set of investigations involving arcing on a negatively biased anodized aluminum plate immersed in a low-density argon plasma at low pressure (P0 = 7.5 x 10^-5 Torr) has been performed. These arcing experiments were designed to simulate electrical breakdown of anodized coatings in a Low Earth Orbital (LEO) environment. When electrical breakdown of an anodized layer occurs, an arc strikes, and there is a sudden flux of electrons accelerated into the ambient plasma. This event is directly followed by ejection of a quasi-neutral plasma cloud consisting of material blown out of the anodized layer. Statistical analysis of plasma cloud expansion velocities has yielded a mean propagation velocity v = (19.4 +/- 3.5) km/s. As the plasma cloud expands into the ambient plasma, energy in the form of electrical noise is generated. The radiated electromagnetic noise is detected by means of an insulated antenna immersed in the ambient plasma. The purposes of the investigations are (1) to observe and record the electromagnetic radiation spectrum resulting from the arcing process, (2) to estimate the travel time of the quasi-neutral plasma cloud based on fluctuations recorded by several Langmuir probes mounted in the ambient plasma, and (3) to study induced arcing between two anodized aluminum structures in close proximity.
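The second purpose amounts to a time-of-flight estimate: the cloud speed follows from the lag between fluctuation onsets at probes a known distance from the arc site. The sketch below shows only that arithmetic; the probe spacings and arrival times are invented numbers chosen to land near the reported ~19 km/s mean velocity.

```python
# Illustrative time-of-flight estimate of the plasma-cloud propagation speed
# from the lag of a disturbance between Langmuir probes at known distances
# from the arc site. The spacings and onset times below are invented.
import numpy as np

probe_distance_m = np.array([0.10, 0.20, 0.30])          # distances from the arc
arrival_time_s   = np.array([5.2e-6, 10.4e-6, 15.5e-6])  # fluctuation onset times

# Least-squares slope of distance versus time gives the expansion velocity.
v = np.polyfit(arrival_time_s, probe_distance_m, 1)[0]
print(f"estimated cloud expansion velocity: {v/1e3:.1f} km/s")
```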
NASA Astrophysics Data System (ADS)
Kearns, E. J.
2017-12-01
NOAA's Big Data Project is conducting an experiment in the collaborative distribution of open government data to non-governmental cloud-based systems. Through Cooperative Research and Development Agreements signed in 2015 between NOAA and Amazon Web Services, Google Cloud Platform, IBM, Microsoft Azure, and the Open Commons Consortium, NOAA is distributing open government data to a wide community of potential users. There are a number of significant advantages related to the use of open data on commercial cloud platforms, but through this experiment NOAA is also discovering significant challenges for those stewarding and maintaining NOAA's data resources in support of users in the wider open data ecosystem. Among the challenges that will be discussed are: the need to provide effective interpretation of the data content to enable their use by data scientists from other expert communities; effective maintenance of Collaborators' open data stores through coordinated publication of new data and new versions of older data; the provenance and verification of open data as authentic NOAA-sourced data across multiple management boundaries and analytical tools; and keeping pace with the accelerating expectations of users with regard to improved quality control, data latency, availability, and discoverability. Suggested strategies to address these challenges will also be described.
NASA Technical Reports Server (NTRS)
Chaudhary, Aashish; Votava, Petr; Nemani, Ramakrishna R.; Michaelis, Andrew; Kotfila, Chris
2016-01-01
We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging HPC and the cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs, and other sources. We have developed a web-based system that seamlessly interfaces with both HPC and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization, and QA pipelines for both the production process and the data products, and to share results with the community. Our project is developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics or visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD) project, for which we are developing a new QA pipeline for the 25 PB system.
Analytics and Visualization Pipelines for Big Data on the NASA Earth Exchange (NEX) and OpenNEX
NASA Astrophysics Data System (ADS)
Chaudhary, A.; Votava, P.; Nemani, R. R.; Michaelis, A.; Kotfila, C.
2016-12-01
We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging HPC and the cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs, and other sources. We have developed a web-based system that seamlessly interfaces with both HPC and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization, and QA pipelines for both the production process and the data products, and to share results with the community. Our project is developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics or visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD) project, for which we are developing a new QA pipeline for the 25 PB system.
Reducing Time to Science: Unidata and JupyterHub Technology Using the Jetstream Cloud
NASA Astrophysics Data System (ADS)
Chastang, J.; Signell, R. P.; Fischer, J. L.
2017-12-01
Cloud computing can accelerate scientific workflows, discovery, and collaborations by reducing research and data friction. We describe the deployment of Unidata and JupyterHub technologies on the NSF-funded XSEDE Jetstream cloud. With the aid of virtual machines and Docker technology, we deploy a Unidata JupyterHub server co-located with a Local Data Manager (LDM), THREDDS data server (TDS), and RAMADDA geoscience content management system. We provide Jupyter Notebooks and the pre-built Python environments needed to run them. The notebooks can be used for instruction and as templates for scientific experimentation and discovery. We also supply a large quantity of NCEP forecast model results to allow data-proximate analysis and visualization. In addition, users can transfer data using Globus command line tools and perform their own data-proximate analysis and visualization with Notebook technology. These data can be shared with others via a dedicated TDS server for scientific distribution and collaboration. There are many benefits to this approach. Not only is the cloud computing environment fast, reliable, and scalable, but scientists can analyze, visualize, and share data using only their web browser. Neither specialized local desktop software nor a fast internet connection is required. This environment will enable scientists to spend less time managing their software and more time doing science.
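A minimal sketch of what data-proximate analysis can look like from such a notebook is shown below: opening a dataset served by a co-located THREDDS server over OPeNDAP so that only the requested subset travels to the client. The catalog URL and the variable and coordinate names are hypothetical placeholders, not the actual server layout described above.

```python
# A minimal sketch of data-proximate analysis from a Jupyter notebook,
# assuming a co-located THREDDS server exposes forecast output via OPeNDAP.
# The URL, variable name, and coordinate name are hypothetical placeholders.
import xarray as xr

opendap_url = "https://thredds.example.edu/thredds/dodsC/forecast/latest.nc"

ds = xr.open_dataset(opendap_url)            # lazy access; subsetting is remote
subset = ds["Temperature_isobaric"].isel(time=0).sel(isobaric=50000)
print(subset.mean().values)                  # only the needed bytes are transferred
```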
Solar Eruptions, CMEs and Space Weather
NASA Technical Reports Server (NTRS)
Gopalswamy, Nat
2011-01-01
Coronal mass ejections (CMEs) are large-scale magnetized plasma structures that are ejected from the Sun and propagate far into the interplanetary medium. CMEs represent energy output from the Sun in the form of magnetized plasma and electromagnetic radiation. The electromagnetic radiation suddenly increases the ionization content of the ionosphere, thus impacting communication and navigation systems. The plasma clouds can drive shocks that accelerate charged particles to very high energies in interplanetary space, posing a radiation hazard to astronauts and space systems. The plasma clouds also arrive at Earth in about two days and impact Earth's magnetosphere, producing geomagnetic storms. These magnetic storms result in a number of effects, including induced currents that can disrupt power grids, railroads, and underground pipelines. This lecture presents an overview of the origin, propagation, and geospace consequences of solar storms.
Duro, Francisco Rodrigo; Blas, Javier Garcia; Isaila, Florin; ...
2016-10-06
The increasing volume of scientific data and the limited scalability and performance of storage systems are currently presenting a significant limitation for the productivity of the scientific workflows running on both high-performance computing (HPC) and cloud platforms. Clearly needed is better integration of storage systems and workflow engines to address this problem. This paper presents and evaluates a novel solution that leverages codesign principles for integrating Hercules, an in-memory data store, with a workflow management system. We consider four main aspects: workflow representation, task scheduling, task placement, and task termination. As a result, the experimental evaluation on both cloud and HPC systems demonstrates significant performance and scalability improvements over existing state-of-the-art approaches.
Rigid-body rotation of an electron cloud in divergent magnetic fields
Fruchtman, A.; Gueroult, R.; Fisch, N. J.
2013-07-10
For a given voltage across a divergent poloidal magnetic field, two electric potential distributions, each supported by a rigid-rotor electron cloud rotating with a different frequency, are found analytically. The two rotation frequencies correspond to the slow and fast rotation frequencies known in uniform plasma. Due to the centrifugal force, the equipotential surfaces that correspond to the two electric potential distributions diverge more than the magnetic surfaces do; in particular, the equipotential surfaces in the fast mode diverge strongly. The departure of the equipotential surfaces from the magnetic field surfaces may have a significant focusing effect on the ions accelerated by the electric field. Furthermore, the focusing effect could be important for laboratory plasma accelerators as well as for the collimation of astrophysical jets.
Biomedical Informatics on the Cloud: A Treasure Hunt for Advancing Cardiovascular Medicine.
Ping, Peipei; Hermjakob, Henning; Polson, Jennifer S; Benos, Panagiotis V; Wang, Wei
2018-04-27
In the digital age of cardiovascular medicine, the rate of biomedical discovery can be greatly accelerated by the guidance and resources required to unearth potential collections of knowledge. A unified computational platform leverages metadata to not only provide direction but also empower researchers to mine a wealth of biomedical information and forge novel mechanistic insights. This review takes the opportunity to present an overview of the cloud-based computational environment, including the functional roles of metadata, the architecture schema of indexing and search, and the practical scenarios of machine learning-supported molecular signature extraction. By introducing several established resources and state-of-the-art workflows, we share with our readers a broadly defined informatics framework to phenotype cardiovascular health and disease. © 2018 American Heart Association, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiao, Heng; Gustafson, William I.; Wang, Hailong
Subgrid-scale interactions between turbulence and radiation are potentially important for accurately reproducing marine low clouds in climate models. To better understand the impact of these interactions, the Weather Research and Forecasting (WRF) model is configured for large eddy simulation (LES) to study the stratocumulus-to-trade cumulus (Sc-to-Cu) transition. Using the GEWEX Atmospheric System Studies (GASS) composite Lagrangian transition case and the Atlantic Trade Wind Experiment (ATEX) case, it is shown that the lack of subgrid-scale turbulence-radiation interaction, as is the case in current generation climate models, accelerates the Sc-to-Cu transition. Our analysis suggests that in cloud-topped boundary layers subgrid-scale turbulence-radiation interactions contribute to stronger production of temperature variance, which in turn leads to stronger buoyancy production of turbulent kinetic energy and helps to maintain the Sc cover.
Rigid-body rotation of an electron cloud in divergent magnetic fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fruchtman, A.; Gueroult, R.; Fisch, N. J.
2013-07-15
For a given voltage across a divergent poloidal magnetic field, two electric potential distributions, each supported by a rigid-rotor electron cloud rotating with a different frequency, are found analytically. The two rotation frequencies correspond to the slow and fast rotation frequencies known in uniform plasma. Due to the centrifugal force, the equipotential surfaces that correspond to the two electric potential distributions diverge more than the magnetic surfaces do; in particular, the equipotential surfaces in the fast mode diverge strongly. The departure of the equipotential surfaces from the magnetic field surfaces may have a significant focusing effect on the ions accelerated by the electric field. The focusing effect could be important for laboratory plasma accelerators as well as for the collimation of astrophysical jets.
Epilepsy analytic system with cloud computing.
Shen, Chia-Ping; Zhou, Weizhi; Lin, Feng-Seng; Sung, Hsiao-Ya; Lam, Yan-Yu; Chen, Wei; Lin, Jeng-Wei; Pan, Ming-Kai; Chiu, Ming-Jang; Lai, Feipei
2013-01-01
Biomedical data analytic systems have played an important role in clinical diagnosis for several decades. Analyzing these big data to provide decision support for physicians is now an emerging research area. This paper presents a parallelized, web-based tool with a cloud computing service architecture to analyze epilepsy. Several modern analytic functions, namely the wavelet transform, a genetic algorithm (GA), and a support vector machine (SVM), are cascaded in the system. To demonstrate the effectiveness of the system, it has been verified with two kinds of electroencephalography (EEG) data: short-term EEG and long-term EEG. The results reveal that our approach achieves a total classification accuracy higher than 90%. In addition, the entire training time is accelerated by about 4.66 times, and the prediction time also meets real-time requirements.
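A minimal sketch of the wavelet-feature to SVM stage of such a cascade is given below, using synthetic EEG-like signals. The GA feature-selection step described in the abstract is omitted, and the wavelet choice, signal model, and classifier parameters are illustrative assumptions, not the published pipeline.

```python
# Sketch of a wavelet-feature -> SVM cascade on synthetic EEG-like signals.
# All parameters are illustrative; the GA stage of the published system is omitted.
import numpy as np
import pywt
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def wavelet_features(sig, wavelet="db4", level=4):
    """Energy of each wavelet sub-band as the feature vector."""
    coeffs = pywt.wavedec(sig, wavelet, level=level)
    return np.array([np.sum(c**2) for c in coeffs])

# Synthetic two-class data: background noise vs. noise plus a low-frequency burst.
n, length, fs = 200, 512, 100.0
t = np.arange(length) / fs
X, y = [], []
for i in range(n):
    label = i % 2
    sig = rng.normal(0, 1.0, length)
    if label:
        sig += 3.0 * np.sin(2 * np.pi * 3.0 * t)   # crude "ictal" component
    X.append(wavelet_features(sig))
    y.append(label)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```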
Integrating solar energy and climate research into science education
NASA Astrophysics Data System (ADS)
Betts, Alan K.; Hamilton, James; Ligon, Sam; Mahar, Ann Marie
2016-01-01
This paper analyzes multi-year records of solar flux and climate data from two solar power sites in Vermont. We show the inter-annual differences of temperature, wind, panel solar flux, electrical power production, and cloud cover. Power production has a linear relation to a dimensionless measure of the transmission of sunlight through the cloud field. The difference between panel and air temperatures reaches 24°C with high solar flux and low wind speed. High panel temperatures that occur in summer with low wind speeds and clear skies can reduce power production by as much as 13%. The intercomparison of two sites 63 km apart shows that while temperature is highly correlated on daily (
Turbulence and cloud droplets in cumulus clouds
NASA Astrophysics Data System (ADS)
Saito, Izumi; Gotoh, Toshiyuki
2018-02-01
In this paper, we report on the successful and seamless simulation of turbulence and the evolution of cloud droplets to raindrops over 10 minutes from a microscopic viewpoint using direct numerical simulation. Included processes are condensation-evaporation, collision-coalescence of droplets with hydrodynamic interaction, Reynolds-number-dependent drag, and turbulent flow within a parcel ascending within a self-consistently determined updraft inside a cumulus cloud. We found that the altitude and the updraft velocity of the parcel, the mean supersaturation, and the liquid water content are insensitive to the turbulence intensity, and that when the turbulence intensity increases, the droplet number density swiftly decreases while the spectral width of the droplets rapidly increases. This study marks the first time the evolution of the mass density distribution function has been successfully calculated from microscopic computations. The turbulence accelerated the formation of a second peak in the mass density distribution function, leading to raindrop formation, and the radius of the largest drop exceeded 300 μm at the end of the simulation. We also found that cloud droplets modify the turbulence in a way that departs from the Kolmogorov-Obukhov-Corrsin theory. For example, the temperature and water vapor spectra at low wavenumbers become shallower than k^{-5/3} in the inertial-convective range, and decrease more slowly than exponentially in the diffusive range. This spectral modification is explained by nonlinear interactions between turbulent mixing and the evaporation-condensation process associated with the large number of droplets.
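One ingredient of such a simulation, condensational (diffusional) growth, can be illustrated in isolation with the textbook relation dr/dt = G S / r. The sketch below is a drastic simplification that ignores turbulence, collision-coalescence, and the supersaturation feedback the DNS resolves; the growth constant and supersaturation are order-of-magnitude placeholders.

```python
# Drastically simplified illustration of diffusional droplet growth,
# dr/dt = G*S/r, with fixed supersaturation S and growth constant G.
# Turbulence, collisions, and supersaturation feedback are ignored;
# all values are order-of-magnitude placeholders only.
import numpy as np

G = 1.0e-10          # growth parameter, m^2/s (order of magnitude)
S = 0.005            # 0.5% supersaturation, held fixed
dt, n_steps = 0.1, 6000   # 0.1 s steps over 10 minutes

rng = np.random.default_rng(2)
r = rng.uniform(1e-6, 10e-6, 10000)   # initial droplet radii, m

for _ in range(n_steps):
    r += G * S / r * dt               # forward-Euler update of each radius

print(f"mean radius after 10 min: {r.mean()*1e6:.1f} um")
```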
Liebeskind, David S
2016-01-01
Crowdsourcing, an unorthodox approach in medicine, creates an unusual paradigm to study precision cerebrovascular health, eliminating the relative isolation and non-standardized nature of current imaging data infrastructure, while shifting emphasis to the astounding capacity of big data in the cloud. This perspective envisions the use of imaging data of the brain and vessels to orient and seed A Million Brains Initiative™ that may leapfrog incremental advances in stroke and rapidly provide useful data to the sizable population around the globe prone to the devastating effects of stroke and vascular substrates of dementia. Despite such variability in the type of data available and other limitations, the data hierarchy logically starts with imaging and can be enriched with almost endless types and amounts of other clinical and biological data. Crowdsourcing allows an individual to contribute to aggregated data on a population, while preserving their right to specific information about their own brain health. The cloud now offers endless storage, computing prowess, and neuroimaging applications for postprocessing that is searchable and scalable. Collective expertise is a windfall of the crowd in the cloud and particularly valuable in an area such as cerebrovascular health. The rise of precision medicine, rapidly evolving technological capabilities of cloud computing and the global imperative to limit the public health impact of cerebrovascular disease converge in the imaging of A Million Brains Initiative™. Crowdsourcing secure data on brain health may provide ultimate generalizability, enable focused analyses, facilitate clinical practice, and accelerate research efforts.
Development of the cloud sharing system for residential earthquake responses using smartphones
NASA Astrophysics Data System (ADS)
Shohei, N.; Fujiwara, H.; Azuma, H.; Hao, K. X.
2015-12-01
The earthquake response of a residential building depends on its structure, site amplification, epicentral distance, and other factors. Until recently, it was impossible, in terms of cost, to obtain individual residential responses with conventional seismometers. However, current technology makes this possible with the Micro Electro Mechanical Systems (MEMS) sensors inside mobile terminals like smartphones. We developed a cloud sharing system for residential earthquake responses in local communities utilizing mobile terminals, such as the iPhone, iPad, and iPod touch, as a collaboration between NIED and Hakusan Corp. Triggered earthquake acceleration waveforms are recorded at a sampling frequency of 100 Hz and stored in device memory once a threshold value is exceeded or an order is received from the Earthquake Early Warning system. The recorded data are automatically transmitted to and archived on the cloud server once wireless communication becomes available. Users can easily retrieve the uploaded data with a web browser through the Internet. The cloud sharing system is designed for residents and is shared only within the local community. Residents can freely add sensors and register information about installation points in each region, and if an earthquake occurs, they can easily view the local distribution of seismic intensities and even analyze the waveforms. To verify this cloud-based seismic wave sharing system, we have performed on-site experiments in cooperation with several local communities. The system and experimental results will be introduced and demonstrated in the presentation.
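The on-device trigger logic amounts to sampling the MEMS accelerometer at 100 Hz, keeping a short pre-event buffer, and storing a record once the amplitude exceeds a threshold. The sketch below illustrates that pattern on a synthetic stream; the threshold, buffer lengths, and data source are illustrative assumptions rather than the actual app behavior.

```python
# Minimal sketch of threshold triggering on a 100 Hz acceleration stream with a
# pre-event buffer. Threshold, window lengths, and data are illustrative only.
from collections import deque
import numpy as np

FS = 100                      # sampling frequency, Hz
THRESHOLD = 0.05              # trigger level, g
PRE, POST = 5 * FS, 30 * FS   # pre-/post-trigger samples to keep

def record_events(samples):
    """Yield (pre + post) sample windows whenever the threshold is exceeded."""
    pre_buffer = deque(maxlen=PRE)
    it = iter(samples)
    for x in it:
        if abs(x) >= THRESHOLD:
            event = list(pre_buffer) + [x]
            event += [next(it, 0.0) for _ in range(POST - 1)]
            yield np.array(event)        # would be uploaded to the cloud server
            pre_buffer.clear()
        else:
            pre_buffer.append(x)

# Synthetic stream: background noise with one burst.
rng = np.random.default_rng(3)
stream = rng.normal(0, 0.005, 60 * FS)
stream[2000:2200] += 0.2 * np.sin(np.linspace(0, 20 * np.pi, 200))
for ev in record_events(stream):
    print(f"triggered record of {len(ev)} samples, peak {np.abs(ev).max():.2f} g")
```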
In-Storage Embedded Accelerator for Sparse Pattern Processing
2016-09-13
We present a novel system architecture for sparse pattern processing in which computation is embedded in the storage device. As a result, a very small processor can be used while still making full use of the storage device bandwidth.
Enhancing RHIC luminosity capabilities with in-situ beam pipe coating
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herschcovitch,A.; Blaskiewicz, M.; Fischer, W.
Electron clouds have been observed in many accelerators, including the Relativistic Heavy Ion Collider (RHIC) at the Brookhaven National Laboratory (BNL). They can limit the machine performance through pressure degradation, beam instabilities or incoherent emittance growth. The formation of electron clouds can be suppressed with beam pipe surfaces that have low secondary electron yield. At the same time, high wall resistivity in accelerators can result in levels of ohmic heating unacceptably high for superconducting magnets. This is a concern for the RHIC machine, as its vacuum chamber in the superconducting dipoles is made from relatively high resistivity 316LN stainless steel. The high resistivity can be addressed with a copper (Cu) coating; a reduction in the secondary electron yield can be achieved with a titanium nitride (TiN) or amorphous carbon (a-C) coating. Applying such coatings in an already constructed machine is rather challenging. We started developing a robotic plasma deposition technique for in-situ coating of long, small-diameter tubes. The technique entails fabricating a device comprised of staged magnetrons and/or cathodic arcs mounted on a mobile mole for deposition of about 5 μm (a few skin depths) of Cu followed by about 0.1 μm of TiN (or a-C).
In situ measurements of the Runaway Breakdown (RB) on Aragats mountain
NASA Astrophysics Data System (ADS)
Chilingarian, A.; Hovsepyan, G.; Mailyan, B.
2017-12-01
Acceleration and multiplication of cosmic ray electrons by strong electric fields in thunderclouds are well-established phenomena comprising the core of atmospheric high-energy physics. The majority of experimental data on particle acceleration in thunderclouds comes from space-borne experiments detecting Terrestrial Gamma-ray Flashes (TGFs) and from networks of particle detectors located on the earth's surface observing Thunderstorm Ground Enhancements (TGEs). Models explaining both TGFs and TGEs are based on the concept of Runaway Breakdown (RB) introduced by A. Gurevich. Proof of these models requires registration of the electromagnetic avalanches developing in the thundercloud and reaching the earth's surface. Unfortunately, due to the high altitude of the cloud and the fast attenuation of electrons in the atmosphere, the registration of such avalanches is very rare. On Aragats mountain in Armenia, where the cloud base is very low, we observe several TGE events with a sizable electron contribution. We present direct measurements of such avalanches lasting less than a microsecond; hundreds of such avalanches comprise a TGE lasting a few minutes. We also recovered the differential energy spectra of the electron and gamma-ray content of the avalanches. The abrupt termination of the particle flux by nearby lightning indicates that the RB process precedes (initiates) the lightning flash.
NASA Astrophysics Data System (ADS)
Defelice, Thomas Peter
The decline of forests has long been attributed to various natural (e.g., drought), man-made (e.g., logging), and, perhaps, combined (e.g., fires caused by loggers) causes. A new type of forest decline (attributed to the deposition of air pollutants and other natural causes) has become apparent at high-elevation sites in western Europe and North America, especially for above-cloud-base forests like those in the Mt. Mitchell State Park. Investigations of air pollutant deposition are plentiful, and laboratory studies have shown extreme deposition of these pollutants to be potentially harmful to forests. However, no field study has concentrated on these events. The primary objective of this study is to characterize (i.e., meteorologically, microphysically, chemically) extreme episodes of air pollutant deposition. This study defines extreme aqueous events as having a pH < 3.1; pH values of this order are known to reduce laboratory tree growth, depending on age and species. On average, one out of three aqueous events sampled in the park during the 1986-1988 growing seasons (mid-May through mid-September) was extreme. Their occurrence over time may lead to the death of infant and 'old' trees, and to the reduced vigor of trees in their prime, as a result of triggering the decline mechanisms of these trees. These events usually last ~4.0 h, form during extended periods of high atmospheric pressure, have a liquid water content of ~0.10 g m^-3, and have near-typical cloud droplet sizes (~8.0 μm). Extreme aqueous events deposit most of their acid at their end. The deposition from the infrequent occurrences of very high ozone (>= 100 ppb) and sulfur dioxide (>= 5 ppb) concentrations in conjunction with these cloud events may be even more detrimental to the canopy than that by extreme aqueous events alone. The physical characteristics of these combined events appear to include those of mature, precipitating clouds. Their occurrence may provide a clue as to how very low pH clouds might be deacidified. That is, base gases (e.g., ammonia) locally introduced into such clouds at the proper time may render them harmless upon impact with the forest canopy, and beneficial to regional water supply users.
NASA Astrophysics Data System (ADS)
Schweitzer, S.; Kirchengast, G.; Proschek, V.
2011-10-01
LEO-LEO infrared-laser occultation (LIO) is a new occultation technique between Low Earth Orbit (LEO) satellites, which applies signals in the short-wave infrared spectral range (SWIR) between 2 μm and 2.5 μm. It is part of the LEO-LEO microwave and infrared-laser occultation (LMIO) method, which enables the retrieval of thermodynamic profiles (pressure, temperature, humidity) and altitude levels from microwave signals, and of profiles of greenhouse gases and further variables such as line-of-sight wind speed from simultaneously measured LIO signals. Due to the novelty of the LMIO method, detailed knowledge of atmospheric influences on LIO signals and of their suitability for accurate trace species retrieval did not yet exist. Here we discuss these influences, assessing effects from refraction, trace species absorption, aerosol extinction, and Rayleigh scattering in detail, and addressing clouds, turbulence, wind, scattered solar radiation, and terrestrial thermal radiation as well. We show that the influence of refractive defocusing, foreign species absorption, aerosols, and turbulence is observable, but can be rendered small to negligible by use of the differential transmission principle with a close frequency spacing of LIO absorption and reference signals within 0.5%. The influences of Rayleigh scattering and terrestrial thermal radiation are found to be negligible. Cloud-scattered solar radiation can be observable under bright-day conditions, but this influence can be made negligible by a close time spacing (within 5 ms) of interleaved laser-pulse and background signals. Cloud extinction loss generally blocks SWIR signals, except for very thin or sub-visible cirrus clouds, which can be addressed by retrieving a cloud layering profile and exploiting it in the trace species retrieval. Wind can have a small influence on the trace species absorption, which can be made negligible by using a simultaneously retrieved or a moderately accurate background wind speed profile. We conclude that the set of SWIR channels proposed for implementing the LMIO method (Kirchengast and Schweitzer, 2011) provides adequate sensitivity to accurately retrieve eight trace species of key importance to climate and atmospheric chemistry (H2O, CO2, 13CO2, C18OO, CH4, N2O, O3, CO) in the upper troposphere/lower stratosphere region outside clouds under all atmospheric conditions. Two further species (HDO, H218O) can be retrieved in the upper troposphere.
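The reason the differential transmission principle suppresses broadband disturbances can be shown with a short numerical illustration: losses such as defocusing and aerosol extinction affect the closely spaced absorption and reference channels almost equally, so they cancel in the ratio, leaving the target-species absorption. The optical depths below are invented numbers chosen only to show the cancellation.

```python
# Numerical illustration of the differential-transmission principle
# (all optical depths are invented): broadband losses common to both channels
# cancel in the ratio, leaving the target-species optical depth.
import numpy as np

tau_species_on  = 0.30   # target-species optical depth at the absorption channel
tau_species_off = 0.00   # negligible at the reference channel
tau_broadband   = 1.20   # defocusing + aerosol + thin cirrus, common to both

T_on  = np.exp(-(tau_species_on  + tau_broadband))
T_off = np.exp(-(tau_species_off + tau_broadband * 1.002))  # ~0.2% channel mismatch

tau_retrieved = -np.log(T_on / T_off)
print(f"retrieved species optical depth: {tau_retrieved:.4f} (true value 0.3000)")
```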
Fast animation of lightning using an adaptive mesh.
Kim, Theodore; Lin, Ming C
2007-01-01
We present a fast method for simulating, animating, and rendering lightning using adaptive grids. The "dielectric breakdown model" is an elegant algorithm for electrical pattern formation that we extend to enable animation of lightning. The simulation can be slow, particularly in 3D, because it involves solving a large Poisson problem. Losasso et al. recently proposed an octree data structure for simulating water and smoke, and we show that this discretization can be applied to the problem of lightning simulation as well. However, implementing the incomplete Cholesky conjugate gradient (ICCG) solver for this problem can be daunting, so we provide an extensive discussion of implementation issues. ICCG solvers can usually be accelerated using "Eisenstat's trick," but the trick cannot be directly applied to the adaptive case. Fortunately, we show that an "almost incomplete Cholesky" factorization can be computed so that Eisenstat's trick can still be used. We then present a fast rendering method based on convolution that is competitive with Monte Carlo ray tracing but orders of magnitude faster, and we also show how to further improve the visual results using jittering.
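For readers unfamiliar with the solver at the heart of such a simulation, the sketch below solves a small 2-D Poisson problem with preconditioned conjugate gradients. It is a simplified stand-in for the ICCG solver discussed above: a Jacobi preconditioner replaces the incomplete Cholesky factorization, and Eisenstat's trick and the adaptive octree grid are omitted.

```python
# Minimal preconditioned CG solve of a 2-D Poisson problem, as a simplified
# stand-in for the ICCG solver discussed above (Jacobi preconditioner instead
# of incomplete Cholesky; no Eisenstat's trick, no adaptive octree).
import numpy as np
import scipy.sparse as sp

n = 64                                        # grid points per side
I = sp.identity(n)
T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
A = (sp.kron(I, T) + sp.kron(T, I)).tocsr()   # 5-point Laplacian
b = np.ones(n * n)                            # arbitrary right-hand side

M_inv = 1.0 / A.diagonal()                    # Jacobi preconditioner

x = np.zeros_like(b)
r = b - A @ x
z = M_inv * r
p = z.copy()
rz = r @ z
for it in range(1000):
    Ap = A @ p
    alpha = rz / (p @ Ap)
    x += alpha * p
    r -= alpha * Ap
    if np.linalg.norm(r) < 1e-8 * np.linalg.norm(b):
        break
    z = M_inv * r
    rz_new = r @ z
    p = z + (rz_new / rz) * p
    rz = rz_new
print(f"converged in {it + 1} iterations, residual {np.linalg.norm(r):.2e}")
```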
iview: an interactive WebGL visualizer for protein-ligand complex.
Li, Hongjian; Leung, Kwong-Sak; Nakane, Takanori; Wong, Man-Hon
2014-02-25
Visualization of protein-ligand complex plays an important role in elaborating protein-ligand interactions and aiding novel drug design. Most existing web visualizers either rely on slow software rendering, or lack virtual reality support. The vital feature of macromolecular surface construction is also unavailable. We have developed iview, an easy-to-use interactive WebGL visualizer of protein-ligand complex. It exploits hardware acceleration rather than software rendering. It features three special effects in virtual reality settings, namely anaglyph, parallax barrier and oculus rift, resulting in visually appealing identification of intermolecular interactions. It supports four surface representations including Van der Waals surface, solvent excluded surface, solvent accessible surface and molecular surface. Moreover, based on the feature-rich version of iview, we have also developed a neat and tailor-made version specifically for our istar web platform for protein-ligand docking purpose. This demonstrates the excellent portability of iview. Using innovative 3D techniques, we provide a user friendly visualizer that is not intended to compete with professional visualizers, but to enable easy accessibility and platform independence.
NASA Astrophysics Data System (ADS)
Sasaki, S.
In the solar nebula, a growing planet attracts ambient gas to form a solar-type atmosphere. The structure of this H2-He atmosphere is calculated assuming the Earth was formed in the nebula. The blanketing effect of the atmosphere renders the planetary surface molten when the planetary mass exceeds 0.2 ME (ME being the present Earth's mass). Reduction of the surface melt by atmospheric H2 should add a large amount of H2O to the atmosphere: under the quartz-iron-fayalite oxygen buffer, the partial pressure ratio P(H2O)/P(H2) becomes higher than 0.1. By enhancing the opacity and the mean molecular weight of the gas, the excess H2O raises the temperature and brings the atmosphere into convective equilibrium, while the dissociation of H2 suppresses the adiabatic temperature gradient. The surface temperature of the proto-Earth can be as high as 4700 K when its mass is 1 ME. Such a high temperature may accelerate the evaporation of surface materials. A deep, totally molten magma ocean should exist in the accreting Earth.
A faster technique for rendering meshes in multiple display systems
NASA Astrophysics Data System (ADS)
Hand, Randall E.; Moorhead, Robert J., II
2003-05-01
Level-of-detail algorithms have been widely implemented in architectural VR walkthroughs and video games, but have not had widespread use in VR terrain visualization systems. This thesis explains a set of optimizations that allow most current level-of-detail algorithms to run in the types of multiple display systems used in VR. It improves the visual quality of the system through the use of graphics hardware acceleration, and improves the frame rate and running time through modifications to the computations that drive the algorithms. Using ROAM as a testbed, results show improvements between 10% and 100% on varying machines.
Simulations of Early Structure Formation: Primordial Gas Clouds
NASA Astrophysics Data System (ADS)
Yoshida, Naoki; Abel, Tom; Hernquist, Lars; Sugiyama, Naoshi
2003-08-01
We use cosmological simulations to study the origin of primordial star-forming clouds in a ΛCDM universe, by following the formation of dark matter halos and the cooling of gas within them. To model the physics of chemically pristine gas, we employ a nonequilibrium treatment of the chemistry of nine species (e-, H, H+, He, He+, He++, H2, H2+, H-) and include cooling by molecular hydrogen. By considering cosmological volumes, we are able to study the statistical properties of primordial halos, and the high resolution of our simulations enables us to examine these objects in detail. In particular, we explore the hierarchical growth of bound structures forming at redshifts z~25-30 with total masses in the range ~10^5-10^6 Msolar. We find that when the amount of molecular hydrogen in these objects reaches a critical level, cooling by rotational line emission is efficient, and dense clumps of cold gas form. We identify these "gas clouds" as sites for primordial star formation. In our simulations, the threshold for gas cloud formation by molecular cooling corresponds to a critical halo mass of ~5×10^5 h^-1 Msolar, in agreement with earlier estimates, but with a weak dependence on redshift in the range z>16. The complex interplay between the gravitational formation of dark halos and the thermodynamic and chemical evolution of the gas clouds compromises analytic estimates of the critical H2 fraction. Dynamical heating from mass accretion and mergers opposes relatively inefficient cooling by molecular hydrogen, delaying the production of star-forming clouds in rapidly growing halos. We also investigate the effect of photodissociating ultraviolet radiation on the formation of primordial gas clouds. We consider two extreme cases, first by including a uniform radiation field in the optically thin limit and second by accounting for the maximum effect of gas self-shielding in virialized regions. For radiation with Lyman-Werner band flux J > 10^-23 erg s^-1 cm^-2 Hz^-1 sr^-1, hydrogen molecules are rapidly dissociated, rendering gas cooling inefficient. In both the cases we consider, the overall effect can be described by computing an equilibrium H2 abundance for the radiation flux and defining an effective shielding factor. Based on our numerical results, we develop a semianalytic model of the formation of the first stars and demonstrate how it can be coupled with large N-body simulations to predict the star formation rate in the early universe.
OpenNEX, a private-public partnership in support of the national climate assessment
NASA Astrophysics Data System (ADS)
Nemani, R. R.; Wang, W.; Michaelis, A.; Votava, P.; Ganguly, S.
2016-12-01
The NASA Earth Exchange (NEX) is a collaborative computing platform that has been developed with the objective of bringing scientists together with the software tools, massive global datasets, and supercomputing resources necessary to accelerate research in Earth systems science and global change. NEX is funded as an enabling tool for sustaining the national climate assessment. Over the past five years, researchers have used the NEX platform and produced a number of data sets highly relevant to the National Climate Assessment. These include high-resolution climate projections using different downscaling techniques and trends in historical climate from satellite data. To enable a broader community in exploiting the above datasets, the NEX team partnered with public cloud providers to create the OpenNEX platform. OpenNEX provides ready access to NEX data holdings on a number of public cloud platforms along with pertinent analysis tools and workflows in the form of Machine Images and Docker Containers, lectures and tutorials by experts. We will showcase some of the applications of OpenNEX data and tools by the community on Amazon Web Services, Google Cloud and the NEX Sandbox.
Robotic Online Path Planning on Point Cloud.
Liu, Ming
2016-05-01
This paper deals with the path-planning problem for mobile wheeled or tracked robots that drive in 2.5-D environments, where the traversable surface is usually considered as a 2-D manifold embedded in a 3-D ambient space. Specifically, we aim at solving the 2.5-D navigation problem using a raw point cloud as input. The proposed method is independent of traditional surface parametrization or reconstruction methods, such as a meshing process, which generally have high computational complexity. Instead, we utilize the output of a 3-D tensor voting framework on the raw point clouds. The computation of tensor voting is accelerated by an optimized implementation on a graphics processing unit. Based on the tensor voting results, a novel local Riemannian metric is defined using the saliency components, which helps the modeling of the latent traversable surface. Using the proposed metric, we show experimentally that geodesics in the 3-D tensor space lead to rational path-planning results. Compared to traditional methods, the results reveal the advantages of the proposed method in terms of smoothing the robot maneuver while considering the minimum travel distance.
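The overall idea of planning directly on a point cloud with a position-dependent metric can be sketched as follows: build a k-nearest-neighbour graph on the raw points, weight each edge by its Euclidean length scaled by a per-point traversability cost, and extract the path with Dijkstra's algorithm. This is not the paper's algorithm; the random points and the cost field stand in for the saliency-based Riemannian metric obtained from tensor voting.

```python
# Sketch of metric-weighted shortest-path planning on a raw point cloud.
# The cost field is a placeholder for the tensor-voting saliency metric.
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import dijkstra

rng = np.random.default_rng(4)
pts = rng.uniform(0, 10, (2000, 3))
pts[:, 2] *= 0.1                                   # roughly 2.5-D surface
cost = 1.0 + 5.0 * rng.random(len(pts))            # placeholder traversability cost

k = 8
tree = cKDTree(pts)
dist, idx = tree.query(pts, k=k + 1)               # self plus k neighbours
rows = np.repeat(np.arange(len(pts)), k)
cols = idx[:, 1:].ravel()
# Edge weight = Euclidean length scaled by the mean cost of its endpoints.
w = dist[:, 1:].ravel() * 0.5 * (cost[rows] + cost[cols])
graph = coo_matrix((w, (rows, cols)), shape=(len(pts), len(pts))).tocsr()

start, goal = 0, len(pts) - 1
d, pred = dijkstra(graph, directed=False, indices=start, return_predecessors=True)

path, node = [], goal
while node != -9999 and node != start:             # -9999 marks "no predecessor"
    path.append(node)
    node = pred[node]
path.append(start)
print(f"path cost {d[goal]:.2f} through {len(path)} waypoints")
```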
Blowing in the Milky Way Wind: Neutral Hydrogen Clouds Tracing the Galactic Nuclear Outflow
NASA Astrophysics Data System (ADS)
Di Teodoro, Enrico M.; McClure-Griffiths, N. M.; Lockman, Felix J.; Denbo, Sara R.; Endsley, Ryan; Ford, H. Alyson; Harrington, Kevin
2018-03-01
We present the results of a new sensitive survey of neutral hydrogen above and below the Galactic Center with the Green Bank Telescope. The observations extend up to Galactic latitude |b| < 10°, with an effective angular resolution of 9.5' and an average rms brightness temperature noise of 40 mK in a 1 km s^-1 channel. The survey reveals the existence of a population of anomalous high-velocity clouds extending up to heights of about 1.5 kpc from the Galactic plane and showing no signature of Galactic rotation. These clouds have local standard of rest velocities |V_LSR| ≲ 360 km s^-1, and assuming a Galactic Center origin, they have sizes of a few tens of parsecs and neutral hydrogen masses spanning 10-10^5 M_sun. Accounting for selection effects, the cloud population is symmetric in longitude, latitude, and V_LSR. We model the cloud kinematics in terms of an outflow expanding from the Galactic Center and find the population consistent with being material moving with radial velocity V_w ≃ 330 km s^-1 distributed throughout a bicone with opening angle α > 140°. This simple model implies an outflow luminosity L_w > 3×10^40 erg s^-1 over the past 10 Myr, consistent with star formation feedback in the inner region of the Milky Way, with a cold gas mass-loss rate ≲ 0.1 M_sun yr^-1. These clouds may represent the cold gas component accelerated in the nuclear wind driven by our Galaxy, although some of the derived properties challenge current theoretical models of the entrainment process.
Lee, Meonghun; Yoe, Hyun
2015-01-01
The environment promotes evolution. Evolutionary processes represent environmental adaptations over long time scales; evolution of crop genomes is not inducible within the relatively short time span of a human generation. Extreme environmental conditions can accelerate evolution, but such conditions are often stress inducing and disruptive. Artificial growth systems can be used to induce and select genomic variation by changing external environmental conditions, thus, accelerating evolution. By using cloud computing and big-data analysis, we analyzed environmental stress factors for Pleurotus ostreatus by assessing, evaluating, and predicting information of the growth environment. Through the indexing of environmental stress, the growth environment can be precisely controlled and developed into a technology for improving crop quality and production. PMID:25874206
Cloud GPU-based simulations for SQUAREMR.
Kantasis, George; Xanthis, Christos G; Haris, Kostas; Heiberg, Einar; Aletras, Anthony H
2017-01-01
Quantitative Magnetic Resonance Imaging (MRI) is a research tool used more and more in clinical practice, as it provides objective information with respect to the tissues being imaged. Pixel-wise T1 quantification (T1 mapping) of the myocardium is one such application with diagnostic significance. A number of mapping sequences have been developed for myocardial T1 mapping with a wide range in terms of measurement accuracy and precision. Furthermore, measurement results obtained with these pulse sequences are affected by errors introduced by the particular acquisition parameters used. SQUAREMR is a new method which has the potential of improving the accuracy of these mapping sequences through the use of massively parallel simulations on Graphical Processing Units (GPUs) by taking into account different acquisition parameter sets. This method has been shown to be effective in myocardial T1 mapping; however, execution times may exceed 30 min, which is prohibitively long for clinical applications. The purpose of this study was to accelerate the construction of SQUAREMR's multi-parametric database to more clinically acceptable levels. The aim was to develop a cloud-based cluster in order to distribute the computational load to several GPU-enabled nodes and accelerate SQUAREMR. This would accommodate high demands for computational resources without the need for major upfront equipment investment. Moreover, the parameter space explored by the simulations was optimized in order to reduce the computational load without compromising the T1 estimates compared to a non-optimized parameter space approach. A cloud-based cluster with 16 nodes resulted in a speedup of up to 13.5 times compared to a single-node execution. Finally, the optimized parameter set approach allowed for an execution time of 28 s using the 16-node cluster, without compromising the T1 estimates by more than 10 ms. The developed cloud-based cluster and optimization of the parameter set reduced the execution time of the simulations involved in constructing the SQUAREMR multi-parametric database, thus bringing SQUAREMR's applicability within time frames that would likely be acceptable in the clinic. Copyright © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Charnawskas, Joseph C.; Alpert, Peter A.; Lambe, Andrew T.
Anthropogenic and biogenic gas emissions contribute to the formation of secondary organic aerosol (SOA). When present, soot particles from fossil fuel combustion can acquire a coating of SOA. We investigate SOA-soot biogenic-anthropogenic interactions and their impact on ice nucleation in relation to the particles' organic phase state. SOA particles were generated from the OH oxidation of naphthalene, α-pinene, longifolene, or isoprene, with or without the presence of sulfate or soot particles. Corresponding particle glass transition (Tg) and full deliquescence relative humidity (FDRH) were estimated using a numerical diffusion model. Longifolene SOA particles are solid-like, and all biogenic SOA-sulfate mixtures exhibit a core-shell configuration (i.e., a sulfate-rich core coated with SOA). Biogenic SOA with or without sulfate formed ice at conditions expected for homogeneous ice nucleation, in agreement with the respective Tg and FDRH. α-pinene SOA coated soot particles nucleated ice above the homogeneous freezing temperature with soot acting as ice nuclei (IN). At lower temperatures the α-pinene SOA coating can be semisolid, inducing ice nucleation. Naphthalene SOA coated soot particles acted as ice nuclei above and below the homogeneous freezing limit, which can be explained by the presence of a highly viscous SOA phase. Our results suggest that biogenic SOA does not play a significant role in mixed-phase cloud formation and the presence of sulfate renders this even less likely. However, anthropogenic SOA may have an enhancing effect on cloud glaciation under mixed-phase and cirrus cloud conditions compared to biogenic SOA that dominate during pre-industrial times or in pristine areas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Charnawskas, Joseph C.; Alpert, Peter A.; Lambe, Andrew
Anthropogenic and biogenic gas emissions contribute to the formation of secondary organic aerosol (SOA). When present, soot particles from fossil-fuel combustion can acquire a coating of SOA. We investigate SOA-soot biogenic-anthropogenic interactions and their impact on ice nucleation in relation to the particles' organic phase state. SOA particles were generated from the OH oxidation of naphthalene, α-pinene, longifolene, or isoprene, with or without the presence of sulfate or soot particles. Corresponding particle glass transition (Tg) and full deliquescence relative humidity (FDRH) were estimated by a numerical diffusion model. Longifolene SOA particles are solid-like, and all biogenic SOA-sulfate mixtures exhibit a core-shell configuration (i.e., a sulfate-rich core coated with SOA). Biogenic SOA with or without sulfate formed ice at conditions expected for homogeneous ice nucleation, in agreement with the respective Tg and FDRH. α-pinene SOA coated soot particles nucleated ice above the homogeneous freezing temperature with soot acting as ice nuclei (IN). At lower temperatures the α-pinene SOA coating can be semisolid, inducing ice nucleation. Naphthalene SOA coated soot particles acted as IN above and below the homogeneous freezing limit, which can be explained by the presence of a highly viscous SOA phase. Our results suggest that biogenic SOA does not play a significant role in mixed-phase cloud formation and the presence of sulfate further renders this even less likely. Furthermore, anthropogenic SOA may have an enhancing effect on cloud glaciation under mixed-phase and cirrus cloud conditions compared to biogenic SOA that dominate during preindustrial times or in pristine areas.
Issues Regarding the Assimilation of Cloud and Precipitation Data
NASA Technical Reports Server (NTRS)
Errico, Ronald M.; Bauer, Peter; Mahfouf, Jean-Francois
2008-01-01
This is the authors' response to a set of criticisms regarding a previously published work. It briefly addresses the main criticisms. In particular, it explains why some papers identified as having fundamental flaws were referenced in the original work without detailed exposition of those flaws. It also explains why parts of the conclusion criticized as being contradictory are, in fact, not. It further highlights the need for more publishing of scientific criticisms. In the December 2007 special issue of the Journal of the Atmospheric Sciences devoted to the Workshop on Assimilation of Satellite Cloud and Precipitation Observations, the authors published an article summarizing the many critical issues that render observations of cloud and precipitation difficult to analyze. Essentially, these include the inaccuracies of both current instruments and the relationships between what is actually observed (infrared or microwave energy detected at the altitude of the satellite) and what is desired (e.g., estimates of cloud drop sizes or rain rates), and the chaotic nature of atmospheric behavior and the complex mathematics describing it. The paper also included recommendations for future research and brief descriptions of many previous works concerning the subject. One reader is now attempting to publish a criticism of that paper. Her three complaints are that there was insufficient explanation of the identification of some cited works as being fundamentally flawed, that as a review the paper should have referenced works additional to those it did, and that two recommendations were contradictory. Each of these complaints is addressed briefly in this response. First, we explain why a brief list of works cited in our paper were identified as "flawed" with only a brief explanation. The design and conduct of the experiments reported in those papers violate well-established fundamentals such that, once the errors are recognized, their interpretations are no longer supported. Unfortunately, over the years, no researchers have bothered to publish criticisms of those papers, such that there are now too many to address in any single paper not devoted to that purpose. Yet, those papers are so often cited that we could not simply ignore them. Furthermore, if we had cited them without warning our readers regarding their flaws, we would have done a great disservice. In our response, however, we do offer further explanation of why some details, neglected in these papers, are critical to proper scientific evaluation. Neither did we offer insufficient references. Although we intentionally did not claim to be a "review" paper, we did cite 100 papers, approximately 5 times the usual number cited in journal articles. Although we referenced only a few papers published after 2005, that was because our manuscript was submitted in January 2006 and reached its final, editorially reviewed form in June 2006. We therefore could not reference papers published after this date; the problem here is that our paper was "in press" for 18 months. Finally, we explain that a careful reading of our paper reveals that our recommendations are not contradictory. Essentially, although we recommend two very distinct research approaches, these are complementary, and either alone is insufficient to accelerate progress. In conclusion, we recommend that the scientific community expend greater effort in publishing careful scientific criticisms so that others do not face the same dilemma we did.
Likely this requires some reward system for doing so.
Measurement and reconstruction of the leaflet geometry for a pericardial artificial heart valve.
Jiang, Hongjun; Campbell, Gord; Xi, Fengfeng
2005-03-01
This paper describes the measurement and reconstruction of the leaflet geometry for a pericardial heart valve. The tasks involved include mapping the leaflet geometry by laser digitizing and reconstructing the 3D freeform leaflet surface from the laser-scanned profile. The challenge is to design a prosthetic valve that maximizes the benefits offered to the recipient as compared to a normally operating, naturally occurring valve. This research was prompted by the fact that artificial heart valve bioprostheses do not provide durability comparable to the natural heart valve, together with the anticipated benefits of defining the valve geometries, especially the leaflet geometries of the bioprosthetic and human valves, in order to create a replicate valve fabricated from synthetic materials. Our method applies the concept of reverse engineering to reconstruct the freeform surface geometry. A Brown & Sharpe coordinate measuring machine (CMM) equipped with a HyMARC laser-digitizing system was used to measure the leaflet profiles of a Baxter Carpentier-Edwards pericardial heart valve. The computer software PolyWorks was used to pre-process the raw data obtained from the scanning, which included merging images, eliminating duplicate points, and adding interpolated points. Three methods are presented in this paper to reconstruct the freeform leaflet surface: creating a mesh model from the cloud points, fitting a freeform surface directly to the cloud points, and generating a freeform surface by B-splines. The mesh model created using PolyWorks can be used for rapid prototyping and visualization. Fitting a freeform surface directly to the cloud points is straightforward, but the smoothness of the resulting surface is usually unpredictable. A surface constructed from a group of B-splines fitted to the cloud points was found to be much smoother. This method also offers the possibility of adjusting the surface curvature locally; however, the process is complex and requires additional manipulation. Finally, this paper presents a reverse-engineered design for the pericardial heart valve, which contains three identical leaflets with the reconstructed geometry.
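The abstract's third reconstruction method, a B-spline surface fitted to the scanned cloud points, can be illustrated with a minimal sketch. This is not the authors' PolyWorks/CMM pipeline; the synthetic points and the smoothing factor are assumptions chosen only to show how a smoothing B-spline surface is fitted to scattered data.

```python
# Hedged sketch: smoothing B-spline surface fit to a scattered, leaflet-like
# point cloud. Data and smoothing factor are illustrative placeholders.
import numpy as np
from scipy.interpolate import bisplrep, bisplev

rng = np.random.default_rng(0)

# Synthetic "scanned" points: a shallow curved patch with measurement noise.
x = rng.uniform(-1.0, 1.0, 2000)
y = rng.uniform(-1.0, 1.0, 2000)
z = 0.3 * (1.0 - x**2) * (1.0 - 0.5 * y**2) + rng.normal(0.0, 0.005, x.size)

# Fit a bicubic smoothing B-spline surface; s trades fidelity for smoothness.
tck = bisplrep(x, y, z, kx=3, ky=3, s=2.0)

# Evaluate the reconstructed surface on a regular grid for visualization or CAD export.
xg = np.linspace(-1.0, 1.0, 50)
yg = np.linspace(-1.0, 1.0, 50)
zg = bisplev(xg, yg, tck)
print("reconstructed grid:", zg.shape, "max |z|:", float(np.abs(zg).max()))
```

Increasing the smoothing factor suppresses scanner noise at the cost of local fidelity, which mirrors the trade-off between smoothness and manual curvature adjustment described in the abstract.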
Surfatron accelerator in the local interstellar cloud
NASA Astrophysics Data System (ADS)
Loznikov, V. M.; Erokhin, N. S.; Zol'nikova, N. N.; Mikhailovskaya, L. A.
2017-01-01
Taking into account the results of numerous experiments, the variability of the energy spectra of cosmic rays (protons and helium nuclei) in the energy range from 10 GeV to 10^7 GeV is explained on the basis of a hypothesis of the existence of two variable sources close to the Sun. The first (soft) surfatron source (with a size of 100 AU) is located at the periphery of the heliosphere. The second (hard) surfatron source (with a size of 1 pc) is situated in the Local Interstellar Cloud (LIC) at a distance of <1 pc. The constant background is described by a power-law spectrum with a slope of 2.75. The variable heliospheric surfatron source is described by a power-law spectrum with a variable amplitude, slope, and cutoff energy, the maximum cutoff energy being in the range E_CH/Z < 1000 GeV. The variable surfatron source in the LIC is described by a power-law spectrum with a variable amplitude, slope, and cutoff energy, the maximum cutoff energy being E_CL/Z ≤ 3 × 10^6 GeV. The proposed model is used to approximate data from several experiments performed at close times. The energy of each cosmic-ray component is calculated. The possibility of surfatron acceleration of Fe nuclei (Z = 26) in the LIC up to an energy E_CL of about 10^17 eV, and of electrons and positrons up to the "knee" in the energy spectrum, is predicted. By numerically solving a system of nonlinear equations describing the interaction between an electromagnetic wave and a charged particle with an energy of up to E/Z ~ 3 × 10^6 GeV, the possibility of trapping, confinement, and acceleration of charged cosmic-ray particles by a quasi-longitudinal plasma wave is demonstrated.
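The spectral decomposition described above can be sketched as a fixed power-law background plus two cutoff power-law components. The functional forms follow the abstract, but all amplitudes, slopes, and cutoff energies below are illustrative placeholders, not fitted values from the paper.

```python
# Hedged sketch of the two-source spectral model: background power law (slope 2.75)
# plus heliospheric and LIC surfatron components with exponential cutoffs.
import numpy as np

def power_law_cutoff(E, A, gamma, E_cut):
    """dN/dE = A * E^-gamma * exp(-E / E_cut), with E in GeV (illustrative units)."""
    return A * E**(-gamma) * np.exp(-E / E_cut)

E = np.logspace(1, 7, 200)                                   # 10 GeV .. 1e7 GeV
background = 1.0 * E**(-2.75)                                # constant background
helio = power_law_cutoff(E, A=0.5, gamma=2.6, E_cut=1.0e3)   # soft source, cutoff below ~1000 GeV
lic = power_law_cutoff(E, A=0.05, gamma=2.4, E_cut=3.0e6)    # hard source, cutoff up to ~3e6 GeV
total = background + helio + lic
print("total flux spans", float(total.max()), "to", float(total.min()))
```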
NASA Astrophysics Data System (ADS)
Liu, X.; Shi, Y.; Wu, M.; Zhang, K.
2017-12-01
Mixed-phase clouds frequently observed in the Arctic and mid-latitude storm tracks have substantial impacts on the surface energy budget, precipitation, and climate. In this study, we first implement two empirical parameterizations (Niemand et al. 2012 and DeMott et al. 2015) of heterogeneous ice nucleation for mixed-phase clouds in the NCAR Community Atmosphere Model Version 5 (CAM5) and the DOE Accelerated Climate Model for Energy Version 1 (ACME1). Model-simulated ice nucleating particle (INP) concentrations based on Niemand et al. and DeMott et al. are compared with those from the default ice nucleation parameterization based on classical nucleation theory (CNT) in CAM5 and ACME, and with in situ observations. Significantly higher INP concentrations (by up to a factor of 5) are simulated with Niemand et al. than with DeMott et al. and CNT, especially over the dust source regions, in both CAM5 and ACME. Interestingly, the ACME model simulates higher INP concentrations than CAM5, especially in the polar regions. This is also the case when we nudge the two models' winds and temperature towards the same reanalysis, indicating more efficient transport of aerosols (dust) to the polar regions in ACME. Next, we examine the responses of model-simulated cloud liquid and ice water contents to the different INP concentrations from the three ice nucleation parameterizations (Niemand et al., DeMott et al., and CNT) in CAM5 and ACME. Changes in liquid water path (LWP) reach as much as 20% in the Arctic in ACME among the three parameterizations, while the LWP changes in CAM5 are smaller and largely confined to the Northern Hemisphere mid-latitudes. Finally, the impacts on cloud radiative forcing and the dust indirect effects on mixed-phase clouds are quantified with the three ice nucleation parameterizations in CAM5 and ACME.
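The dust INP parameterizations compared above are empirical functions of temperature and a dust quantity (surface area or coarse-aerosol number). A minimal sketch of the surface-area (active-site density) style of parameterization is given below; the fit coefficients are placeholders for illustration only and are not the published Niemand et al. or DeMott et al. constants.

```python
# Hedged sketch of a surface-site-density INP parameterization of the kind
# implemented in CAM5/ACME. Coefficients below are assumed placeholders.
import numpy as np

A_COEF, B_COEF = -0.5, 9.0   # placeholder n_s(T) fit constants (not published values)

def n_inp_surface_site(T_kelvin, dust_surface_area_m2_per_m3):
    """INP number concentration [m^-3] from an active-site density n_s(T) [m^-2]."""
    dT = T_kelvin - 273.15                      # supercooling in deg C (negative)
    n_s = np.exp(A_COEF * dT + B_COEF)          # active sites per m^2 of dust surface
    return dust_surface_area_m2_per_m3 * n_s

T = np.array([258.0, 253.0, 248.0])             # mixed-phase temperatures [K]
print(n_inp_surface_site(T, dust_surface_area_m2_per_m3=1.0e-6))
```

The strong exponential dependence on supercooling is what drives the factor-of-several spread in simulated INP concentrations between parameterizations over dust source regions.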
Sedna and the cloud of comets surrounding the solar system in Milgromian dynamics
NASA Astrophysics Data System (ADS)
Paučo, R.; Klačka, J.
2016-05-01
We reconsider the hypothesis of a vast cometary reservoir surrounding the solar system - the Oort cloud of comets - within the framework of Milgromian dynamics (MD or MOND). For this purpose we built a numerical model of the cloud, assuming the modified-gravity theory QUMOND. In modified gravity versions of MD, the internal dynamics of a system is influenced by the external gravitational field in which the system is embedded, even when this external field is constant and uniform, a phenomenon dubbed the external field effect (EFE). Adopting the popular pair ν(x) = [1 - exp(-x^{1/2})]^{-1} for the MD interpolating function and a0 = 1.2 × 10^-10 m s^-2 for the MD acceleration scale, we found that the observationally inferred Milgromian cloud of comets is much more radially compact than its Newtonian counterpart. The comets of the Milgromian cloud stay away from the zone where the Galactic tide can torque their orbits significantly. However, this need not be an obstacle for the injection of comets into the inner solar system, as the EFE can induce a significant change in perihelion distance during one revolution of a comet around the Sun. Adopting constraints on different interpolating-function families and a revised value of a0 (provided recently by the Cassini spacecraft), the aforementioned qualitative results no longer hold, and the Milgromian cloud is then very similar to the Newtonian one in its overall size, the binding energies of comets, and hence the operation of the Jupiter-Saturn barrier. However, EFE torquing of perihelia still plays a significant role in the inner parts of the cloud. Consequently, Sedna-like orbits and the orbits of large-semi-major-axis Centaurs are easily comprehensible in MD. In MD, they both belong to the same population, just in different modes of their evolution.
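For an isolated, spherically symmetric configuration, the QUMOND acceleration reduces to the algebraic relation g = ν(g_N/a0) g_N with the interpolating function quoted above. The toy sketch below evaluates that relation around the Sun at Oort-cloud distances; it deliberately ignores the external field effect, which the paper shows is essential, so it is only a point of orientation, not the paper's model.

```python
# Hedged sketch: Milgromian radial acceleration around the Sun using
# nu(x) = 1/(1 - exp(-sqrt(x))) and a0 = 1.2e-10 m s^-2 (EFE neglected).
import numpy as np

G = 6.674e-11          # m^3 kg^-1 s^-2
M_SUN = 1.989e30       # kg
AU = 1.496e11          # m
A0 = 1.2e-10           # m s^-2

def nu(x):
    return 1.0 / (1.0 - np.exp(-np.sqrt(x)))

def g_milgrom(r_m):
    g_newton = G * M_SUN / r_m**2
    return nu(g_newton / A0) * g_newton   # spherical, isolated QUMOND relation

r = np.array([1.0e3, 1.0e4, 1.0e5]) * AU  # representative Oort-cloud distances
for ri, gi in zip(r, g_milgrom(r)):
    print(f"r = {ri/AU:8.0f} AU   g_N = {G*M_SUN/ri**2:.3e}   g_MOND = {gi:.3e} m s^-2")
```

The growing ratio g_MOND/g_N at large radii is what makes the Milgromian cloud more tightly bound, and hence more radially compact, than its Newtonian counterpart.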
Plasma Radiation and Acceleration Effectiveness of CME-driven Shocks
NASA Astrophysics Data System (ADS)
Gopalswamy, N.; Schmidt, J. M.
2008-05-01
CME-driven shocks are effective radio radiation generators and accelerators of Solar Energetic Particles (SEPs). We present simulated 3D time-dependent radio maps of second-order plasma radiation generated by CME-driven shocks. The CME with its shock is simulated with the 3D BATS-R-US CME model developed at the University of Michigan. The radiation is simulated using a kinetic plasma model that includes shock drift acceleration of electrons and stochastic growth theory of Langmuir waves. We find that in a realistic 3D environment of magnetic field and solar wind outflow of the Sun, the CME-driven shock shows a detailed spatial structure of the density, which is responsible for the fine structure of type II radio bursts. We also show realistic 3D reconstructions of the magnetic cloud field of the CME, which is accelerated outward by magnetic buoyancy forces in the diverging magnetic field of the Sun. The CME-driven shock is reconstructed by tomography using the maximum jump in the gradient of the entropy. In the vicinity of the shock we determine the Alfven speed of the plasma. This speed profile controls how steep the shock can grow and how stable the shock remains while propagating away from the Sun. Only a steep shock can provide effective particle acceleration.
Beam position monitoring system at CESR
NASA Astrophysics Data System (ADS)
Billing, M. G.; Bergan, W. F.; Forster, M. J.; Meller, R. E.; Rendina, M. C.; Rider, N. T.; Sagan, D. C.; Shanks, J.; Sikora, J. P.; Stedinger, M. G.; Strohman, C. R.; Palmer, M. A.; Holtzapple, R. L.
2017-09-01
The Cornell Electron-positron Storage Ring (CESR) has been converted from a High Energy Physics electron-positron collider to operate as a dedicated synchrotron light source for the Cornell High Energy Synchrotron Source (CHESS) and to conduct accelerator physics research as a test accelerator, capable of studying topics relevant to future damping rings, colliders and light sources. Some of the specific topics that were targeted for the initial phase of operation of the storage ring in this mode, labeled CESRTA (CESR as a Test Accelerator), included 1) tuning techniques to produce low emittance beams, 2) the study of electron cloud development in a storage ring and 3) intra-beam scattering effects. The complete conversion of CESR to CESRTA occurred over a several year period and is described elsewhere. As a part of this conversion the CESR beam position monitoring (CBPM) system was completely upgraded to provide the needed instrumental capabilities for these studies. This paper describes the new CBPM system hardware, its function and representative measurements performed by the upgraded system.
FERMI LAT discovery of extended gamma-ray emissions in the vicinity of the HB 3 supernova remnant
Katagiri, H.; Yoshida, K.; Ballet, J.; ...
2016-02-11
We report the discovery of extended gamma-ray emission measured by the Large Area Telescope (LAT) onboard the Fermi Gamma-ray Space Telescope in the region of the supernova remnant (SNR) HB 3 (G132.7+1.3) and the W3 HII complex adjacent to the southeast of the remnant. W3 is spatially associated with bright 12CO (J=1-0) emission. The gamma-ray emission is spatially correlated with this gas and the SNR. We discuss the possibility that gamma rays originate in interactions between particles accelerated in the SNR and interstellar gas or radiation fields. The decay of neutral pions produced in nucleon-nucleon interactions between accelerated hadrons and interstellar gas provides a reasonable explanation for the gamma-ray emission. The emission from W3 is consistent with irradiation of the CO clouds by the cosmic rays accelerated in HB 3.
FERMI LAT DISCOVERY OF EXTENDED GAMMA-RAY EMISSIONS IN THE VICINITY OF THE HB 3 SUPERNOVA REMNANT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katagiri, H.; Yoshida, K.; Ballet, J.
2016-02-20
We report the discovery of extended gamma-ray emission measured by the Large Area Telescope (LAT) onboard the Fermi Gamma-ray Space Telescope in the region of the supernova remnant (SNR) HB 3 (G132.7+1.3) and the W3 HII complex adjacent to the southeast of the remnant. W3 is spatially associated with bright 12CO (J = 1-0) emission. The gamma-ray emission is spatially correlated with this gas and the SNR. We discuss the possibility that gamma rays originate in interactions between particles accelerated in the SNR and interstellar gas or radiation fields. The decay of neutral pions produced in nucleon-nucleon interactions between accelerated hadrons and interstellar gas provides a reasonable explanation for the gamma-ray emission. The emission from W3 is consistent with irradiation of the CO clouds by the cosmic rays accelerated in HB 3.
NIR Imaging Spectroscopy of the Inner Few Arcseconds of NGC 4151 with OSIRIS at Keck
NASA Technical Reports Server (NTRS)
Iserlohe, Christof; Krabbe, Alfred; Larkin, James E.; Barczys, Matthew; McElwain, Michael W.; Quirrenbach, Andreas; Weiss, Jason; Wright, Shelley A.
2013-01-01
We present H- and K-band data from the inner arcsecond of the Seyfert 1.5 galaxy NGC 4151 obtained with the adaptive optics assisted near-infrared imaging field spectrograph OSIRIS at the Keck Observatory. The angular resolution corresponds to only a few parsecs at the source and thus competes easily with optical images taken previously with the Hubble Space Telescope. We present the morphology and dynamics of most detected species but focus on the morphology and dynamics of the narrow line region (as traced by emission of [Fe II] λ1.644 µm), the interplay between plasma ejected from the nucleus (as traced by 21 cm continuum radio data) and hot H2 gas, and characterize the detected nuclear He I λ2.058 µm absorption feature as a narrow absorption line (NAL) phenomenon. Emission from the narrow line region (NLR) as traced by [Fe II] reveals a biconical morphology, and we compare the measured dynamics in the [Fe II] emission line with models proposing acceleration of gas in the NLR and simple ejection of gas into the NLR. In the inner 2.5 arcseconds the acceleration model reveals a better fit to our data than the ejection model. We also see evidence that the jet very locally enhances emission in [Fe II] at certain positions in our field of view, such that we were able to distinguish the kinematics of these clouds from clouds generally accelerated in the NLR. Further, the radio jet is aligned with the bicone surface rather than the bicone axis, so we conclude that the jet is not the dominant mechanism responsible for driving the kinematics of clouds in the NLR. The hot H2 gas is thermal, with a temperature of about 1700 K. We observe a remarkable correlation between individual H2 clouds at systemic velocity and the 21 cm continuum radio jet. We propose that the radio jet is at least partially embedded in the galactic disk of NGC 4151, such that deviations from a linear radio structure are invoked by interactions of jet plasma with H2 clouds that move into the path of the jet because of the rotation of the galactic disk of NGC 4151. Additionally, we observe a correlation of the jet, as traced by the radio data, with gas traced in Brγ and H2 at velocities between systemic and +/- 200 km/s at several locations along the path of the jet. The He I λ2.058 µm line in NGC 4151 appears in emission with a blueshifted absorption component from an outflow. The emission (absorption) component has a velocity offset of 10 km/s (-280 km/s) with a Gaussian (Lorentzian) full width (half width) at half maximum of 160 km/s (440 km/s). The absorption component remains spatially unresolved and its kinematic measures differ from those of UV resonance absorption lines. From the amount of absorption we derive a lower limit of the He I 2S column density of 1 × 10^14 cm^-2 with a covering factor along the line of sight of C_los ≈ 0.1.
Coronal Mass Ejections Near the Sun and in the Interplanetary Medium
NASA Technical Reports Server (NTRS)
Gopalswamy, Nat
2012-01-01
Coronal mass ejections (CMEs) are the most energetic phenomena in the heliosphere. During solar eruptions, the released energy flows out from the Sun in the form of magnetized plasma and electromagnetic radiation. The electromagnetic radiation suddenly increases the ionization content of the ionosphere, thus impacting communication and navigation systems. The plasma clouds can drive shocks that accelerate charged particles to very high energies in interplanetary space, which pose a radiation hazard to astronauts and space systems. The plasma clouds also arrive at Earth in about two days and impact Earth's magnetosphere, producing geomagnetic storms. The magnetic storms result in a number of effects including induced currents that can disrupt power grids, railroads, and underground pipelines. This lecture presents an overview of the origin, propagation, and geospace consequences of CMEs and their interplanetary counterparts.
The upper atmosphere of Venus: A tentative explanation of its rotation
NASA Technical Reports Server (NTRS)
Boyer, C.
1986-01-01
The upper atmosphere of Venus seems to revolve every 4 days, while the planet rotates in 243 days. Mariner 10 UV data on the changing positions of dark spots in the upper Venusian clouds have supported speed estimates ranging from 120 to 240 m/s. High rates of acceleration and deceleration occur on the night side, the former between -110 and -90 deg and the latter continuing to -50 deg. Arch and Y formations have been seen repeatedly between -110 and -70 deg. The highest are seen at about -90 deg and the lowest at about -30 deg. The temperature of the cloud layer at 60 km altitude is about 20 C, the pressure is nearly one Earth atmosphere, and complex molecules containing O, C, H, N, and S, in various combinations, are present in abundance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andriyash, A. V.; Astashkin, M. V.; Baranov, V. K.
2016-06-15
The results of optoheterodyne Doppler measurements of the ballistic expansion of the products of surface destruction under shock-wave loading are presented. The possibility of determining the physical characteristics of a rapidly flying dust cloud, including the microparticle velocities, the microparticle sizes, and the areal density of the dust cloud, is shown. A compact stand for performing experiments on shock-wave loading of metallic samples is described. Shock-wave loading is performed by a 100-µm-thick tantalum flyer plate accelerated to a velocity of 2.8 km/s. As the samples, lead plates having various thicknesses and the same surface roughness are used. At a shock-wave pressure of 31.5 GPa, the destruction products are solid microparticles about 50 µm in size. At pressures of 42 and 88 GPa, a liquid-drop dust cloud with a particle size of 10-15 µm is formed. To interpret the spectral data from the optoheterodyne Doppler measurements of the expansion of the surface destruction products (spalled fragments, dust microparticles), a transport equation for the mutual coherence function of a multiply scattered field is used. The Doppler spectra of a backscattered signal are calculated with the model developed for the dust cloud that appears when a shock wave reaches the sample surface, at parameters typical of an experimental situation. Qualitative changes are found in the spectra, depending on the optical thickness of the dust cloud. The obtained theoretical results are in agreement with the experimental data.
NASA Astrophysics Data System (ADS)
Lima, V.; Hossain, U. H.; Walbert, T.; Seidl, T.; Ensinger, W.
2018-03-01
The study of polymers irradiated by highly energetic ions and the resulting radiation-induced degradation is of major importance for space and particle accelerator applications. The mechanism of ion-induced molecular fragmentation of polyethylene, polyethyleneimine, and polyamide was investigated by means of mass spectrometry and infrared spectroscopy. The results show that the introduction of nitrogen and oxygen into the polymer influences the stability, rendering aliphatic polymers with heteroatoms less stable. A comparison with thermal decomposition data from the literature reveals that ion-induced degradation differs in its bond fracture mechanism. While thermal degradation starts at the weakest bond, which is usually the carbon-heteroatom bond, energetic ion irradiation leads in the first step to scission of all types of bonds, creating smaller molecular fragments. This is due to the localized extreme energy input under non-equilibrium conditions when the ions transfer kinetic energy to electrons. These findings are relevant to the choice of polymers for long-term application in both space and accelerator facilities.
Moly99 Production Facility: Report on Beamline Components, Requirements, Costs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bishofberger, Kip A.
2015-12-23
In FY14 we completed the design of the beam line for the linear accelerator production design concept. This design included a set of three bending magnets, quadrupole focusing magnets, and octupoles to flatten the beam on target. The design was generic and applicable to multiple different accelerators if necessary. In FY15 we built on that work to create specifications for the individual beam optic elements, including power supply requirements. This report captures the specification of beam line components with initial cost estimates for the NorthStar production facility. The report is organized as follows: the motivation for the beamline design is introduced briefly, along with renderings of the design. A specific list of the components needed to construct the beamline is then provided, including part numbers and costs. After that, the report details the important sections of the beamline and individual components. A final summary and a list of follow-on activities complete the report.
Stepping Into Science Data: Data Visualization in Virtual Reality
NASA Astrophysics Data System (ADS)
Skolnik, S.
2017-12-01
Have you ever seen people get really excited about science data? Navteca, along with the Earth Science Technology Office (ESTO) within the Earth Science Division of NASA's Science Mission Directorate, has been exploring virtual reality (VR) technology for the next generation of Earth science technology information systems. One of their first joint experiments was visualizing climate data from the Goddard Earth Observing System Model (GEOS) in VR, and the resulting visualizations greatly excited the scientific community. This presentation will share the value of VR for science, such as the capability of permitting the observer to interact with data rendered in real time, make selections, and view volumetric data in an innovative way. Using interactive VR hardware (headset and controllers), the viewer steps into the data visualizations, physically moving through three-dimensional structures that are traditionally displayed as layers or slices, such as cloud and storm systems from NASA's Global Precipitation Measurement (GPM) mission. Results from displaying this precipitation and cloud data show that there is interesting potential for scientific visualization, 3D/4D visualizations, and interdisciplinary studies using VR. Additionally, VR visualizations can be leveraged as 360° content for scientific communication and outreach, and VR can be used as a tool to engage policy and decision makers, as well as the public.
A secure online image trading system for untrusted cloud environments.
Munadi, Khairul; Arnia, Fitri; Syaryadhi, Mohd; Fujiyoshi, Masaaki; Kiya, Hitoshi
2015-01-01
In conventional image trading systems, images are usually stored unprotected on a server, rendering them vulnerable to untrusted server providers and malicious intruders. This paper proposes a conceptual image trading framework that enables secure storage and retrieval over Internet services. The process involves three parties: an image publisher, a server provider, and an image buyer. The aim is to facilitate secure storage and retrieval of original images for commercial transactions, while preventing untrusted server providers and unauthorized users from gaining access to the true contents. The framework exploits the Discrete Cosine Transform (DCT) coefficients and the moment invariants of images. Original images are visually protected in the DCT domain and stored on a repository server. Small representations of the original images, called thumbnails, are generated and made publicly accessible for browsing. When a buyer is interested in a thumbnail, he/she sends a query to retrieve the visually protected image. The thumbnails and protected images are matched using the DC component of the DCT coefficients and the moment invariant feature. After the matching process, the server returns the corresponding protected image to the buyer. However, the image remains visually protected unless a key is granted. Our target application is the online market, where publishers sell their stock images over the Internet using public cloud servers.
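The matching step described above relies on the DC components of block DCTs. The toy sketch below shows the general idea of summarizing an image by its per-block DC coefficients and matching a query against stored images by nearest descriptor; it is an illustration only, not the paper's full DCT-scrambling and moment-invariant scheme, and the image sizes and thresholds are assumptions.

```python
# Hedged sketch: block-DCT DC descriptors for thumbnail-to-image matching.
import numpy as np
from scipy.fft import dctn

def dc_descriptor(img, block=8):
    """Vector of per-block DC coefficients of a grayscale image (2D array)."""
    h, w = (img.shape[0] // block) * block, (img.shape[1] // block) * block
    img = img[:h, :w].astype(float)
    dcs = []
    for i in range(0, h, block):
        for j in range(0, w, block):
            dcs.append(dctn(img[i:i + block, j:j + block], norm="ortho")[0, 0])
    return np.array(dcs)

def best_match(query_img, database_imgs):
    """Index of the database image whose DC descriptor is closest to the query's."""
    q = dc_descriptor(query_img)
    dists = [np.linalg.norm(q - dc_descriptor(d)) for d in database_imgs]
    return int(np.argmin(dists))

rng = np.random.default_rng(1)
db = [rng.integers(0, 256, (64, 64)) for _ in range(3)]
print("matched index:", best_match(db[2] + rng.normal(0, 2, (64, 64)), db))
```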
NASA Astrophysics Data System (ADS)
Juvela, Mika J.
The relationship between physical conditions of an interstellar cloud and the observed radiation is defined by the radiative transfer problem. Radiative transfer calculations are needed if, e.g., one wants to disentangle abundance variations from excitation effects or wants to model variations of dust properties inside an interstellar cloud. New observational facilities (e.g., ALMA and Herschel) will bring improved accuracy both in terms of intensity and spatial resolution. This will enable detailed studies of the densest sub-structures of interstellar clouds and star forming regions. Such observations must be interpreted with accurate radiative transfer methods and realistic source models. In many cases this will mean modelling in three dimensions. High optical depths and observed wide range of linear scales are, however, challenging for radiative transfer modelling. A large range of linear scales can be accessed only with hierarchical models. Figure 1 shows an example of the use of a hierarchical grid for radiative transfer calculations when the original model cloud (L=10 pc,
Acoustic Model of the Remnant Bubble Cloud from Underwater Explosion
2012-11-01
[Fragmentary report extract] ...of the fluid, g is the acceleration due to gravity, and C is the drag coefficient. Here we use the Grace drag model (Clift et al., 1978; ANSYS CFX-Solver Theory, Release 13.0, 2010). Acknowledgment: Dynaflow, Inc., Baltimore, MD, for providing the bubble maker data. Presented at Proceedings of Acoustics 2012, 21-23 November 2012, Fremantle, Australia.
Induction Linacs and Free Electron Laser Amplifiers
1986-03-20
[Fragmentary report extract] ...accelerated and the effects of the space-charge force are minimized. EMITTANCE-PRESERVING BEAMLINE: The beamline (Fig. 5) is designed to preserve the good beam... electrons and pushes them right out of the way, leaving a bare ion cloud. With relativistic beams in vacuum, their space-charge defocusing is offset by the... suspect, on why charged particle beams cannot be used in space. Now it is a fairly straightforward extrapolation, already mentioned in Lou Marguet's...
A model for the repeating FRB 121102 in the AGN scenario
NASA Astrophysics Data System (ADS)
Vieyro, F. L.; Romero, G. E.; Bosch-Ramon, V.; Marcote, B.; del Valle, M. V.
2017-06-01
Context. Fast radio bursts (FRBs) are transient sources of unknown origin. Recent radio and optical observations have provided strong evidence for an extragalactic origin of the phenomenon and the precise localization of the repeating FRB 121102. Observations using the Karl G. Jansky Very Large Array (VLA) and very-long-baseline interferometry (VLBI) have revealed the existence of a continuum non-thermal radio source consistent with the location of the bursts in a dwarf galaxy. All these new data rule out several models that were previously proposed and impose stringent constraints on new models. Aims: We aim to model FRB 121102 in light of the new observational results in the active galactic nucleus (AGN) scenario. Methods: We propose a model for repeating FRBs in which a non-steady relativistic e± beam, accelerated by an impulsive, magnetohydrodynamically driven mechanism, interacts with a cloud at the centre of a star-forming dwarf galaxy. The interaction generates regions of high electrostatic field, called cavitons, in the plasma cloud. Turbulence is also produced in the beam. These processes, plus particle isotropization, the interaction scale, and light retardation effects, provide the necessary ingredients for short-lived, bright coherent radiation bursts. Results: The mechanism studied in this work explains the general properties of FRB 121102 and may also be applied to other repetitive FRBs. Conclusions: Coherent emission from electrons and positrons accelerated in cavitons provides a plausible explanation of FRBs.
UAS Photogrammetry for Rapid Response Characterization of Subaerial Coastal Change
NASA Astrophysics Data System (ADS)
Do, C.; Anarde, K.; Figlus, J.; Prouse, W.; Bedient, P. B.
2016-12-01
Unmanned aerial systems (UASs) provide an exciting new platform for rapid-response measurement of subaerial coastal change. Here we validate the use of a coupled hobbyist UAS and optical photogrammetry framework for high-resolution mapping of portions of a low-lying barrier island along the Texas Gulf Coast. A DJI Phantom 3 Professional was used to capture 2D nadir images of the foreshore and back-beach environments containing both vegetated and non-vegetated features. The images were georeferenced using ground-truth markers surveyed via real-time kinematic (RTK) GPS and were then imported into Agisoft Photoscan, a photo-processing software package, to generate 3D point clouds and digital elevation models (DEMs). The georeferenced elevation models were then compared to RTK measurements to evaluate accuracy and precision. Thus far, DEMs derived from UAS photogrammetry show centimeter-scale resolution for renderings of non-vegetated landforms. High-resolution renderings of vegetated and back-barrier regions have proven more difficult because of interstitial wetlands (surface reflectance) and terrain that is uneven for GPS backpack surveys. In addition to producing high-quality models, UAS photogrammetry has proven to be more time-efficient than traditional mapping methods, making it advantageous for rapid-response deployments. This study is part of a larger effort to relate field measurements of storm hydrodynamics to subaerial evidence of geomorphic change to better understand barrier island response to extreme storms.
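The accuracy assessment described above amounts to sampling the photogrammetric DEM at the RTK check-point locations and summarizing the elevation differences. The sketch below shows that comparison under assumed, synthetic data; the grid spacing, check points, and noise level are placeholders, not the study's survey values.

```python
# Hedged sketch: DEM-versus-RTK check-point comparison (bias and RMSE).
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Synthetic DEM on a 0.05 m grid: gently sloping beach face.
x = np.arange(0, 50, 0.05)
y = np.arange(0, 30, 0.05)
X, Y = np.meshgrid(x, y, indexing="ij")
dem = 0.02 * X + 0.005 * Y

sample = RegularGridInterpolator((x, y), dem)   # bilinear sampling of the DEM

# Synthetic RTK check points (x, y) with noisy "surveyed" elevations.
rng = np.random.default_rng(7)
pts = np.column_stack([rng.uniform(1, 49, 50), rng.uniform(1, 29, 50)])
z_rtk = 0.02 * pts[:, 0] + 0.005 * pts[:, 1] + rng.normal(0, 0.03, 50)

diff = sample(pts) - z_rtk
print(f"bias = {diff.mean():.3f} m, RMSE = {np.sqrt((diff**2).mean()):.3f} m")
```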
NASA Astrophysics Data System (ADS)
2001-01-01
ESO Telescopes Provide Most Detailed View Ever Into a Dark Cloud Summary How do stars like our Sun come into being? Which fundamental processes are responsible for transforming a dark and diffuse interstellar cloud of gas and dust into a much denser, shining object? Astronomers have just taken an important step towards answering this fundamental question. Based on the most detailed study ever made of the internal structure of a small interstellar cloud, three scientists from ESO and the USA [1] have found that it is apparently on the verge of becoming unstable - and thus in the stage immediately preceding a dramatic collapse into a dense and hot, low-mass star. Interestingly, the current structure of this cloud, a "Bok globule" known as Barnard 68 (B68) [2], is governed by the same basic physics as is that of a star. The cloud is obviously in a temporary state of near-equilibrium, where the inward force of gravity caused by its mass more or less balances that of the outward pressure due to its temperature. But this situation may not last long. The astronomers believe that this particular cloud, together with some others in the same galactic neighbourhood, constitute the few resistent remains of a much larger cloud that has disappeared due to the influence of strong stellar winds and ultraviolet radiation from young and heavy stars as well as supernova explosions. The new and unique insight into the pre-collapse phase of the complicated process of stellar birth is based on observations made with ESO telescopes at the La Silla and Paranal observatories in Chile. PR Photo 02a/01 : The Bok Globule B68 , as seen in visual light. PR Photo 02b/01 : Looking through the Bok Globule B68 . PR Photo 02c/01 : A comparison of the visual and infrared views of the Bok Globule B68 . From Dark Clouds to Stars Astronomers have known for some time that stars like our Sun are formed from interstellar clouds of gas and dust. When they contract, the interior temperature rises. If the cloud is sufficiently heavy, it will become so hot at the centre that energy-producing nuclear processes ignite. After a while, the central regions of the cloud reach equilibrium and a new star is born. Planets are formed from condensations in the surrounding material as this collects in a circumstellar disk. A good understanding of the origin of stars and planetary systems, like our own solar system, is therefore intimately connected to a detailed knowledge about the conditions in the cold interiors of dark clouds in interstellar space. However, such clouds are highly opaque and their physical structure has remained a mystery for as long as we have known about their existence. The following phases of stellar evolution are much better known and some scientists therefore refer to these very earliest stages as the "missing link" in our current picture of star formation. Finely balanced equilibrium The present results are changing this situation. By means of a new and straightforward observational technique, it has now been possible to explore the detailed structure of a nearby cloud. It is found to be quite simple, with the mean density steadily increasing towards the centre. In fact, the way this happens (referred to as the cloud's "density profile") is exactly as expected in an isolated gas sphere at a certain temperature in which the inward force of gravity is finely balanced against the internal thermal pressure. With this clear physical description it is now possible to determine with unprecedented precision (approx. 
3%) the fundamental parameters of the cloud, such as its distance and gas-to-dust ratio. ESO astronomer João Alves from the team is pleased: "These measurements constitute a major breakthrough in the understanding of dark clouds. For the first time, the internal structure of a dark cloud has been specified with a detail approaching that which characterizes our knowledge of stellar interiors". Seeing light through the dark The observational technique that has led to the new result is straightforward but rather difficult to apply to dark clouds. It is based on measurements of the light from stars that are located behind the cloud. When this light passes through the cloud, it is absorbed and scattered by the dust inside. The effect depends on the colour (wavelength) and the background stars will appear redder than they really are. It is also proportional to the amount of obscuring material and is therefore largest for stars that are situated behind the cloud's centre. By measuring the degree of this "reddening" experienced by stars seen through different areas of the cloud, it is thus possible to chart the distribution of dust in the cloud. The finer the net of background stars is, the more detailed this map will be and the better the information about the internal structure of the cloud. And that is exactly the problem. Even small clouds are so opaque that very few background stars can be seen through them. Only large telescopes and extremely sensitive instruments are able to observe a sufficient number of stars in order to produce significant results. In particular, until now it has never been possible to map the densest, central areas of a dark cloud. The structure of Barnard 68 (PR Photos 02a-c/01) Caption: PR Photo 02a/01 shows a colour composite of visible and near-infrared images of the dark cloud Barnard 68. It was obtained with the 8.2-m VLT ANTU telescope and the multimode FORS1 instrument in March 1999. At these wavelengths, the small cloud is completely opaque because of the obscuring effect of dust particles in its interior. PR Photo 02b/01 is a false-colour composite based on a visible (here rendered as blue), a near-infrared (green) and an infrared (red) image. Since the light from stars behind the cloud is only visible at the longest (infrared) wavelengths, they appear red. In PR Photo 02c/01, the central area of these two photos may be directly compared. Technical information about these photos is available below. At a distance of only 410 light-years, Barnard 68 is one of the nearest dark clouds. Its size is about 12,500 AU (= 2 million million km; 1 Astronomical Unit [AU] = 150 million km), or just about the same as the so-called "Oort Cloud" of long-period comets that surrounds the solar system. The temperature of Barnard 68 is 16 Kelvin (-257 °C) and the pressure at its boundary is 0.0025 nPa, or about 10 times higher than in the interstellar medium (but still 40,000 million million times less than the atmospheric pressure at the Earth's surface!). The total mass of the cloud is about twice that of the Sun.
A new investigation of Barnard 68 was carried out by means of instruments at the 3.58-m New Technology Telescope (NTT) at La Silla and the Very Large Telescope (VLT) at Paranal. Long exposures revealed a total of about 3700 background stars (of which over 1000 can only be seen at infrared wavelengths), cf. PR Photos 02a-c/01. Careful measurements of the colours of these stars, and hence the degree of obscuration, allowed the most finely sampled (in more than 1000 individual areas) and most accurate mapping of the dust distribution inside a dark cloud ever performed. In order to further increase the accuracy, the mean dust density was measured in concentric circles around the centre - this resulted in a very accurate determination of the change in dust density with the distance from the centre. It was found that this dependence is almost exactly as that predicted for a sphere in which the opposite forces of gravity and internal pressure closely balance each other. Nevertheless, it is also evident that Barnard 68 is only marginally stable and is on the verge of collapse. The origin of Barnard 68 This first-ever, detailed characterization of a dark interstellar cloud that is currently in the stage immediately preceding collapse and subsequent star formation constitutes a very important step towards a better understanding of the earliest phases of the stellar life cycle. The astronomers suggest that Barnard 68 (and its neighbouring brethren, the dark clouds Barnard 69, 70 and 72) may be the precursors of an isolated and sparsely populated association of low-mass solar-like stars. However, where did these clouds come from? João Alves thinks he and his colleagues know the answer: "It is most likely that they are the remnant cores of particularly resistant parts of a larger cloud. By now, most of it has been 'eaten away' because of strong attrition caused by ultraviolet radiation and stellar winds from hot massive stars or 'storms' from exploding supernovae". He adds: "Our new observations show that objects with just the right mass like Barnard 68 can reach a temporary equilibrium and survive for some time before they begin to collapse." The team is now eager to continue this type of investigation on other dark clouds. More information The research described in this Press Release is reported in a research article ("Seeing Light Through the Dark: Measuring the Internal Structure of a Cold Dark Cloud") that appears in the international research journal Nature on Thursday, January 11, 2001. Notes [1]: The team consists of João F. Alves (ESO-Garching, Germany), Charles J. Lada (Harvard-Smithsonian Center for Astrophysics, Cambridge, Mass., USA) and Elizabeth A. Lada (University of Florida, Gainesville, Fl., USA). [2]: The Dutch astronomer Bart Bok (1906-1983) studied the dark clouds in the Milky Way and described the small, compact ones as "globules". The early stages of the present investigation of Barnard 68 were presented in ESO PR Photos 29a-c/99, with more background information about this cloud. Technical information about the photos PR Photo 02a/01 of the sky area of Barnard 68 is based on three frames through B- (440 nm = 0.44 µm - here rendered as blue), V- (0.55 µm - green) and I-band (0.90 µm - red) optical filters, as obtained with the FORS1 instrument at the VLT ANTU telescope on March 27, 1999. The field measures 6.8 x 6.8 arcmin^2 (2048 x 2048 pixels at 0.20 arcsec per pixel).
PR Photo 02b/01 is a false-colour composite based on B- (wavelength 0.44 µm - 1.5 min; here rendered as blue), I- (wavelength 0.85 µm - 1.5 min; green), and Ks-filter (2.16 µm - 30 min; red) exposures, respectively. The B and I images were obtained in March 1999 with the FORS1 instrument at the 8.2-m VLT ANTU. The Ks image was obtained in March 1999 with the SOFI instrument at the ESO 3.58-m New Technology Telescope (NTT) at La Silla. The sky field measures about 4.9 x 4.9 arcmin^2 (1024 x 1024 pixels at 0.29 arcsec per pixel). North is up and East is left. PR Photo 02c/01 allows a direct comparison between the two views.
Vector Observation-Aided/Attitude-Rate Estimation Using Global Positioning System Signals
NASA Technical Reports Server (NTRS)
Oshman, Yaakov; Markley, F. Landis
1997-01-01
A sequential filtering algorithm is presented for attitude and attitude-rate estimation from Global Positioning System (GPS) differential carrier phase measurements. A third-order, minimal-parameter method for solving the attitude matrix kinematic equation is used to parameterize the filter's state, which renders the resulting estimator computationally efficient. Borrowing from tracking theory concepts, the angular acceleration is modeled as an exponentially autocorrelated stochastic process, thus avoiding the use of the uncertain spacecraft dynamic model. The new formulation facilitates the use of aiding vector observations in a unified filtering algorithm, which can enhance the method's robustness and accuracy. Numerical examples are used to demonstrate the performance of the method.
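The angular-acceleration model mentioned above, an exponentially autocorrelated stochastic process, is the standard first-order Gauss-Markov model used in tracking filters. The sketch below discretizes and simulates such a process; the correlation time and noise level are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: first-order Gauss-Markov (exponentially autocorrelated) process,
# as used to model angular acceleration in place of a spacecraft dynamic model.
import numpy as np

def simulate_gauss_markov(tau, sigma, dt, n_steps, seed=0):
    """x_{k+1} = exp(-dt/tau) * x_k + w_k, tuned so the stationary std is sigma."""
    rng = np.random.default_rng(seed)
    phi = np.exp(-dt / tau)
    q_std = sigma * np.sqrt(1.0 - phi**2)      # keeps the process variance at sigma^2
    x = np.zeros(n_steps)
    for k in range(1, n_steps):
        x[k] = phi * x[k - 1] + rng.normal(0.0, q_std)
    return x

alpha = simulate_gauss_markov(tau=30.0, sigma=1e-4, dt=1.0, n_steps=600)
print("sample std of modeled angular acceleration:", float(alpha.std()))
```

In a filter, phi enters the state transition matrix and q_std**2 the process-noise covariance, which is how the model avoids relying on an uncertain dynamic model.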
Observation of cooperative Mie scattering from an ultracold atomic cloud
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bender, H.; Stehle, C.; Slama, S.
Scattering of light at a distribution of scatterers is an intrinsically cooperative process, which means that the scattering rate and the angular distribution of the scattered light are essentially governed by bulk properties of the distribution, such as its size, shape, and density, although local disorder and density fluctuations may have an important impact on the cooperativity. Via measurements of the radiation pressure force exerted by a far-detuned laser beam on a very small and dense cloud of ultracold atoms, we are able to identify the respective roles of superradiant acceleration of the scattering rate and of Mie scattering in the cooperative process. They lead, respectively, to a suppression or an enhancement of the radiation pressure force. We observe a maximum in the radiation pressure force as a function of the phase shift induced in the incident laser beam by the cloud's refractive index. The maximum marks the borderline of the validity of the Rayleigh-Debye-Gans approximation from a regime where Mie scattering is more complex. Our observations thus help to clarify the intricate relationship between Rayleigh scattering of light at a coarse-grained ensemble of individual scatterers and Mie scattering at the bulk density distribution.
GPU-based cloud service for Smith-Waterman algorithm using frequency distance filtration scheme.
Lee, Sheng-Ta; Lin, Chun-Yuan; Hung, Che Lun
2013-01-01
As the conventional means of analyzing the similarity between a query sequence and database sequences, the Smith-Waterman algorithm is feasible for a database search owing to its high sensitivity. However, this algorithm is still quite time consuming. CUDA programming can improve computational efficiency by using the power of massively parallel computing hardware such as graphics processing units (GPUs). This work presents a novel Smith-Waterman algorithm with a frequency-based filtration method on GPUs, rather than merely accelerating the comparisons while still expending computational resources on unnecessary ones. A user-friendly interface is also designed for potential cloud server applications with GPUs. Additionally, two data sets, H1N1 protein sequences (query sequence set) and a human protein database (database set), are selected, followed by a comparison of CUDA-SW and CUDA-SW with the filtration method, referred to herein as CUDA-SWf. Experimental results indicate that eliminating unnecessary sequence alignments can reduce the computational time by up to 41%. Importantly, by using CUDA-SWf as a cloud service, this application can be accessed from any computing environment of a device with an Internet connection, without time constraints.
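The filtration idea above, pruning database sequences with a cheap comparison before running the expensive Smith-Waterman alignment, can be illustrated with a simple composition-based filter. This is only a CPU-side toy; the distance measure and threshold are assumptions, and the paper's CUDA-SWf scheme on GPUs is more elaborate.

```python
# Hedged sketch: frequency-based pre-filtering of database sequences before alignment.
from collections import Counter

ALPHABET = "ACDEFGHIKLMNPQRSTVWY"

def freq_vector(seq):
    """Amino-acid composition of a sequence as a fixed-length frequency vector."""
    counts = Counter(seq)
    total = max(len(seq), 1)
    return [counts.get(a, 0) / total for a in ALPHABET]

def frequency_distance(seq_a, seq_b):
    """L1 distance between the two composition vectors (cheap to compute)."""
    fa, fb = freq_vector(seq_a), freq_vector(seq_b)
    return sum(abs(x - y) for x, y in zip(fa, fb))

def candidates(query, database, threshold=0.5):
    """Keep only sequences whose composition is close enough to the query's;
    only these would be passed to the full Smith-Waterman alignment."""
    return [s for s in database if frequency_distance(query, s) <= threshold]

db = ["MKTAYIAKQR", "GGGGGGGGGG", "MKTAYIAKQL"]
print(candidates("MKTAYIAKQR", db))   # the poly-G sequence is filtered out
```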
Role of mixed precipitating cloud systems on the typhoon rainfall
NASA Astrophysics Data System (ADS)
Pan, C. J.; Krishna Reddy, K.; Lai, H. C.; Yang, S. S.
2010-01-01
L-band wind profiler data are utilized to diagnose the vertical structure of typhoon precipitating cloud systems over Taiwan. For several typhoons, a pronounced bright band (BB) around 5 km is commonly observed. Since strong convection within the typhoon circulation may disturb and/or disrupt the melting layer, the BB is not expected to appear persistently. Hence, an understanding of the vertical structure of the BB region is important because it holds extensive hydrometeor information on the type of precipitation and its variability. The wind profiler observations suggest that mixtures of convective and stratiform (embedded-type) clouds are mostly associated with typhoons. In the case of one typhoon, the BB appeared around 5.5 km with embedded precipitation, a BB height about 1 km higher than for ordinary showery precipitation. This is evident from the long-term observations of the wind profiler and the Tropical Rainfall Measuring Mission. The Doppler velocity profiles show hydrometeors (ice/snow) at 6 km but liquid below 5 km for typhoons and below 4 km for showery precipitation. In the BB region, accelerations of the melting particles of 5.8 m s^-1 km^-1 and 3.2 m s^-1 km^-1 are observed for typhoon and showery precipitation, respectively.
An accelerated hologram calculation using the wavefront recording plane method and wavelet transform
NASA Astrophysics Data System (ADS)
Arai, Daisuke; Shimobaba, Tomoyoshi; Nishitsuji, Takashi; Kakue, Takashi; Masuda, Nobuyuki; Ito, Tomoyoshi
2017-06-01
Fast hologram calculation methods are critical in real-time holography applications such as three-dimensional (3D) displays. We recently proposed a wavelet transform-based hologram calculation called WASABI. Although WASABI can decrease the calculation time of a hologram from a point cloud, its calculation time increases with increasing propagation distance. We also proposed a wavefront recording plane (WRP) method. This is a two-step fast hologram calculation in which the first step calculates the superposition of light waves emitted from a point cloud onto a virtual plane, and the second step performs a diffraction calculation from the virtual plane to the hologram plane. A drawback of the WRP method arises in the first step when the point cloud has a large number of object points and/or a long distribution in the depth direction. In this paper, we propose a method combining WASABI and the WRP method in which the drawbacks of each can be complementarily resolved. Using a consumer CPU, the proposed method succeeded in performing a hologram calculation with 2048 × 2048 pixels from a 3D object with one million points in approximately 0.4 s.
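The first step of the WRP method, superposing spherical waves from the object points onto a small virtual plane near the object, is sketched below. Wavelength, pitch, plane size, and the toy point cloud are assumptions for illustration; the second step (diffraction from the WRP to the hologram plane) and the WASABI wavelet acceleration are omitted.

```python
# Hedged sketch: step one of a wavefront-recording-plane (WRP) calculation.
import numpy as np

WAVELENGTH = 532e-9                 # metres (assumed)
K = 2.0 * np.pi / WAVELENGTH
PITCH = 8e-6                        # WRP sampling pitch (assumed)
N = 256                             # WRP is N x N samples

# Toy point cloud: (x, y, z, amplitude), z measured from the WRP plane.
points = np.array([[0.0, 0.0, 2e-3, 1.0],
                   [2e-4, -1e-4, 2.5e-3, 0.8]])

coords = (np.arange(N) - N / 2) * PITCH
X, Y = np.meshgrid(coords, coords)

wrp = np.zeros((N, N), dtype=complex)
for px, py, pz, amp in points:
    r = np.sqrt((X - px)**2 + (Y - py)**2 + pz**2)
    wrp += amp / r * np.exp(1j * K * r)        # spherical wave from each object point

print("WRP field computed:", wrp.shape, "peak |u| =", float(np.abs(wrp).max()))
```

Because the WRP sits close to the object, each point only illuminates a small patch of the plane, which is what makes this step cheap when the depth distribution of the point cloud is short.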
An incremental anomaly detection model for virtual machines.
Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu
2017-01-01
The Self-Organizing Map (SOM) algorithm, an unsupervised learning method, has been applied to anomaly detection owing to its capabilities of self-organization and automatic anomaly prediction. However, because the algorithm is initialized randomly, it takes a long time to train a detection model. Moreover, cloud platforms with large-scale virtual machines are prone to performance anomalies owing to their highly dynamic and resource-sharing characteristics, which leaves the algorithm with low accuracy and poor scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to reduce the detection time by taking into account the large scale and highly dynamic features of virtual machines on cloud platforms. To demonstrate the effectiveness, experiments on the common benchmark KDD Cup dataset and a real dataset have been performed. Results suggest that IISOM has advantages in accuracy and convergence speed of anomaly detection for virtual machines on cloud platforms.
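The weighted Euclidean distance introduced above changes only how the best-matching unit of the SOM is selected. The sketch below shows that selection step with per-feature weights; the map size, weights, and sample metrics are placeholders, not the IISOM configuration from the paper.

```python
# Hedged sketch: best-matching-unit search with a weighted Euclidean distance,
# so that more informative VM metrics dominate the match.
import numpy as np

rng = np.random.default_rng(0)
som = rng.random((5, 5, 3))                 # 5x5 map, 3 monitored metrics per node
weights = np.array([0.6, 0.3, 0.1])         # assumed feature weights (sum to 1)

def best_matching_unit(som, x, w):
    """Grid index of the node minimizing the weighted squared Euclidean distance to x."""
    d2 = ((som - x)**2 * w).sum(axis=-1)
    return np.unravel_index(np.argmin(d2), d2.shape)

sample = np.array([0.9, 0.2, 0.4])          # e.g. normalized CPU, memory, I/O load
print("BMU at grid position:", best_matching_unit(som, sample, weights))
```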
NASA Technical Reports Server (NTRS)
Morrison, R. B. (Principal Investigator)
1974-01-01
The author has identified the following significant results. Indexing and analysis of the SL 2, SL 3, and SL 4 photos of the project area has shown that S-190A coverage with less than 30% clouds totals about 123,000 sq km. The 70-mm unenlarged color, color-infrared, B/W red, and B/W green bands from S-190A are of good to excellent quality; the B/W IR bands from SL 2 are excessively grainy and have very low resolution; those from SL 3 are better but nevertheless have low resolution. The 5-inch unenlarged color transparencies from S-190B are generally of excellent photographic quality. However, where cloud cover is extensive, commonly the S-190A and S-190B color and color-IR photos are correctly exposed for the clouds but considerably underexposed for the ground. The 4X enlargements of all bands of S-190A photos taken by SL 2 are much fuzzier than they should be; evidently the enlarger was not focused properly. The 4X enlargements from SL 3 are much superior.
NASA Astrophysics Data System (ADS)
McInerney, M.; Schnase, J. L.; Duffy, D.; Tamkin, G.; Nadeau, D.; Strong, S.; Thompson, J. H.; Sinno, S.; Lazar, D.
2014-12-01
The climate sciences represent a big data domain that is experiencing unprecedented growth. In our efforts to address the big data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics because it is the knowledge gained from our interactions with big data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by cloud computing. Within this framework, cloud computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics-as-a-service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the big data challenges in this domain. This poster will highlight specific examples of CAaaS using climate reanalysis data, high-performance cloud computing, MapReduce, and the Climate Data Services API.
GC31G-1182: Opennex, a Private-Public Partnership in Support of the National Climate Assessment
NASA Technical Reports Server (NTRS)
Nemani, Ramakrishna R.; Wang, Weile; Michaelis, Andrew; Votava, Petr; Ganguly, Sangram
2016-01-01
The NASA Earth Exchange (NEX) is a collaborative computing platform that has been developed with the objective of bringing scientists together with the software tools, massive global datasets, and supercomputing resources necessary to accelerate research in Earth systems science and global change. NEX is funded as an enabling tool for sustaining the national climate assessment. Over the past five years, researchers have used the NEX platform and produced a number of data sets highly relevant to the National Climate Assessment. These include high-resolution climate projections using different downscaling techniques and trends in historical climate from satellite data. To enable a broader community in exploiting the above datasets, the NEX team partnered with public cloud providers to create the OpenNEX platform. OpenNEX provides ready access to NEX data holdings on a number of public cloud platforms along with pertinent analysis tools and workflows in the form of Machine Images and Docker Containers, lectures and tutorials by experts. We will showcase some of the applications of OpenNEX data and tools by the community on Amazon Web Services, Google Cloud and the NEX Sandbox.
NASA Technical Reports Server (NTRS)
2002-01-01
This spectacular Moderate Resolution Imaging Spectroradiometer (MODIS) 'blue marble' image is based on the most detailed collection of true-color imagery of the entire Earth to date. Using a collection of satellite-based observations, scientists and visualizers stitched together months of observations of the land surface, oceans, sea ice, and clouds into a seamless, true-color mosaic of every square kilometer (0.386 square mile) of our planet. Most of the information contained in this image came from MODIS, illustrating MODIS' outstanding capacity to act as an integrated tool for observing a variety of terrestrial, oceanic, and atmospheric features of the Earth. The land and coastal ocean portions of this image are based on surface observations collected from June through September 2001 and combined, or composited, every eight days to compensate for clouds that might block the satellite's view on any single day. Global ocean color (or chlorophyll) data were used to simulate the ocean surface. MODIS doesn't measure 3-D features of the Earth, so the surface observations were draped over topographic data provided by the U.S. Geological Survey EROS Data Center. MODIS observations of polar sea ice were combined with observations of Antarctica made by the National Oceanic and Atmospheric Administration's AVHRR sensor, the Advanced Very High Resolution Radiometer. The cloud image is a composite of two days of MODIS imagery collected in visible light wavelengths and a third day of thermal infrared imagery over the poles. A large collection of imagery based on the blue marble in a variety of sizes and formats, including animations and the full (1 km) resolution imagery, is available at the Blue Marble page. Image by Reto Stockli, rendering by Robert Simmon. Based on data from the MODIS Science Team.
Scalable Machine Learning for Massive Astronomical Datasets
NASA Astrophysics Data System (ADS)
Ball, Nicholas M.; Gray, A.
2014-04-01
We present the ability to perform data mining and machine learning operations on a catalog of half a billion astronomical objects. This is the result of the combination of robust, highly accurate machine learning algorithms with linear scalability that renders the applications of these algorithms to massive astronomical data tractable. We demonstrate the core algorithms: kernel density estimation, K-means clustering, linear regression, nearest neighbors, random forest and gradient-boosted decision tree, singular value decomposition, support vector machine, and two-point correlation function. Each of these is relevant for astronomical applications such as finding novel astrophysical objects, characterizing artifacts in data, object classification (including for rare objects), object distances, finding the important features describing objects, density estimation of distributions, probabilistic quantities, and exploring the unknown structure of new data. The software, Skytree Server, runs on any UNIX-based machine, a virtual machine, or cloud-based and distributed systems including Hadoop. We have integrated it on the cloud computing system of the Canadian Astronomical Data Centre, the Canadian Advanced Network for Astronomical Research (CANFAR), creating the world's first cloud computing data mining system for astronomy. We demonstrate results showing the scaling of each of our major algorithms on large astronomical datasets, including the full 470,992,970 objects of the 2 Micron All-Sky Survey (2MASS) Point Source Catalog. We demonstrate the ability to find outliers in the full 2MASS dataset utilizing multiple methods, e.g., nearest neighbors. This is likely of particular interest to the radio astronomy community given, for example, that survey projects contain groups dedicated to this topic. 2MASS is used as a proof-of-concept dataset due to its convenience and availability. These results are of interest to any astronomical project with large and/or complex datasets that wishes to extract the full scientific value from its data.
Scalable Machine Learning for Massive Astronomical Datasets
NASA Astrophysics Data System (ADS)
Ball, Nicholas M.; Astronomy Data Centre, Canadian
2014-01-01
We present the ability to perform data mining and machine learning operations on a catalog of half a billion astronomical objects. This is the result of the combination of robust, highly accurate machine learning algorithms with linear scalability that renders the applications of these algorithms to massive astronomical data tractable. We demonstrate the core algorithms: kernel density estimation, K-means clustering, linear regression, nearest neighbors, random forest and gradient-boosted decision tree, singular value decomposition, support vector machine, and two-point correlation function. Each of these is relevant for astronomical applications such as finding novel astrophysical objects, characterizing artifacts in data, object classification (including for rare objects), object distances, finding the important features describing objects, density estimation of distributions, probabilistic quantities, and exploring the unknown structure of new data. The software, Skytree Server, runs on any UNIX-based machine, a virtual machine, or cloud-based and distributed systems including Hadoop. We have integrated it on the cloud computing system of the Canadian Astronomical Data Centre, the Canadian Advanced Network for Astronomical Research (CANFAR), creating the world's first cloud computing data mining system for astronomy. We demonstrate results showing the scaling of each of our major algorithms on large astronomical datasets, including the full 470,992,970 objects of the 2 Micron All-Sky Survey (2MASS) Point Source Catalog. We demonstrate the ability to find outliers in the full 2MASS dataset utilizing multiple methods, e.g., nearest neighbors, and the local outlier factor. 2MASS is used as a proof-of-concept dataset due to its convenience and availability. These results are of interest to any astronomical project with large and/or complex datasets that wishes to extract the full scientific value from its data.
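The outlier-screening step described in this record can be illustrated with a minimal sketch. The sketch below uses scikit-learn's LocalOutlierFactor on a synthetic three-column array standing in for catalog magnitudes; it is not the Skytree Server implementation referenced above, and the data are placeholders.

```python
# Minimal sketch (not the Skytree Server implementation described above):
# nearest-neighbour / local-outlier-factor screening of a photometric catalog.
# The synthetic array stands in for per-object magnitudes (e.g. J, H, Ks).
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)
catalog = rng.normal(size=(50_000, 3))            # placeholder catalog columns

lof = LocalOutlierFactor(n_neighbors=20)          # k nearest neighbours per object
labels = lof.fit_predict(catalog)                 # -1 marks outliers, 1 marks inliers
scores = -lof.negative_outlier_factor_            # larger score = more anomalous

outliers = np.flatnonzero(labels == -1)
print(f"{outliers.size} candidate outliers out of {catalog.shape[0]} objects")
```

On a real catalog the same call pattern applies; the practical challenge the record addresses is making such neighbour searches scale to hundreds of millions of rows.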
The Real-Time Monitoring Service Platform for Land Supervision Based on Cloud Integration
NASA Astrophysics Data System (ADS)
Sun, J.; Mao, M.; Xiang, H.; Wang, G.; Liang, Y.
2018-04-01
Remote sensing monitoring has become an important means for land and resources departments to strengthen supervision. Aiming at the problems of low monitoring frequency and poor data currency in current remote sensing monitoring, this paper presents a cloud-integrated real-time monitoring service platform for land supervision, which increases monitoring frequency by comprehensively acquiring domestic satellite image data and improves remote sensing image processing efficiency through intelligent dynamic processing of multi-source images. A pilot application in the Jinan Bureau of State Land Supervision has demonstrated that the real-time monitoring approach for land supervision is feasible. In addition, real-time monitoring and early-warning functions have been implemented for illegal land use, permanent basic farmland protection and breaches of urban development boundaries. The application has achieved remarkable results.
Astrophysics. The exceptionally powerful TeV γ-ray emitters in the Large Magellanic Cloud.
2015-01-23
The Large Magellanic Cloud, a satellite galaxy of the Milky Way, has been observed with the High Energy Stereoscopic System (H.E.S.S.) above an energy of 100 billion electron volts for a deep exposure of 210 hours. Three sources of different types were detected: the pulsar wind nebula of the most energetic pulsar known, N 157B; the radio-loud supernova remnant N 132D; and the largest nonthermal x-ray shell, the superbubble 30 Dor C. The unique object SN 1987A is, unexpectedly, not detected, which constrains the theoretical framework of particle acceleration in very young supernova remnants. These detections reveal the most energetic tip of a γ-ray source population in an external galaxy and provide via 30 Dor C the unambiguous detection of γ-ray emission from a superbubble.
Antarctic new particle formation from continental biogenic precursors
NASA Astrophysics Data System (ADS)
Kyrö, E.-M.; Kerminen, V.-M.; Virkkula, A.; Dal Maso, M.; Parshintsev, J.; Ruíz-Jimenez, J.; Forsström, L.; Manninen, H. E.; Riekkola, M.-L.; Heinonen, P.; Kulmala, M.
2012-12-01
Over Antarctica, aerosol particles originate almost entirely from marine areas, with only a minor contribution from long-range-transported dust or anthropogenic material. The Antarctic continent itself, unlike all other continental areas, has been thought to be practically free of aerosol sources. Here we present evidence of local aerosol production associated with melt-water ponds in continental Antarctica. We show that in air masses passing such ponds, new aerosol particles are efficiently formed, and these particles grow up to sizes where they may act as cloud condensation nuclei (CCN). The precursor vapours responsible for aerosol formation and growth most likely originate from communities of the highly abundant cyanobacterium Nostoc commune (Vaucher) in local ponds. This is the first time that freshwater vegetation has been identified as a source of aerosol precursors. The influence of this new source on clouds and climate may increase in a future Antarctica, and possibly elsewhere, undergoing accelerated summer melting of semi-permanent snow cover.
Antarctic new particle formation from continental biogenic precursors
NASA Astrophysics Data System (ADS)
Kyrö, E.-M.; Kerminen, V.-M.; Virkkula, A.; Dal Maso, M.; Parshintsev, J.; Ruíz-Jimenez, J.; Forsström, L.; Manninen, H. E.; Riekkola, M.-L.; Heinonen, P.; Kulmala, M.
2013-04-01
Over Antarctica, aerosol particles originate almost entirely from marine areas, with only a minor contribution from long-range-transported dust or anthropogenic material. The Antarctic continent itself, unlike all other continental areas, has been thought to be practically free of aerosol sources. Here we present evidence of local aerosol production associated with melt-water ponds in continental Antarctica. We show that in air masses passing such ponds, new aerosol particles are efficiently formed, and these particles grow up to sizes where they may act as cloud condensation nuclei (CCN). The precursor vapours responsible for aerosol formation and growth most likely originate from communities of the highly abundant cyanobacterium Nostoc commune (Vaucher) in local ponds. This is the first time freshwater vegetation has been identified as a source of aerosol precursors. The influence of this new source on clouds and climate may increase in a future Antarctica, and possibly elsewhere, undergoing accelerated summer melting of semi-permanent snow cover.
Discovery of very-high-energy gamma-rays from the Galactic Centre ridge.
Aharonian, F; Akhperjanian, A G; Bazer-Bachi, A R; Beilicke, M; Benbow, W; Berge, D; Bernlöhr, K; Boisson, C; Bolz, O; Borrel, V; Braun, I; Breitling, F; Brown, A M; Chadwick, P M; Chounet, L-M; Cornils, R; Costamante, L; Degrange, B; Dickinson, H J; Djannati-Ataï, A; Drury, L O'C; Dubus, G; Emmanoulopoulos, D; Espigat, P; Feinstein, F; Fontaine, G; Fuchs, Y; Funk, S; Gallant, Y A; Giebels, B; Gillessen, S; Glicenstein, J F; Goret, P; Hadjichristidis, C; Hauser, D; Hauser, M; Heinzelmann, G; Henri, G; Hermann, G; Hinton, J A; Hofmann, W; Holleran, M; Horns, D; Jacholkowska, A; de Jager, O C; Khélifi, B; Klages, S; Komin, Nu; Konopelko, A; Latham, I J; Le Gallou, R; Lemière, A; Lemoine-Goumard, M; Leroy, N; Lohse, T; Marcowith, A; Martin, J M; Martineau-Huynh, O; Masterson, C; McComb, T J L; de Naurois, M; Nolan, S J; Noutsos, A; Orford, K J; Osborne, J L; Ouchrif, M; Panter, M; Pelletier, G; Pita, S; Pühlhofer, G; Punch, M; Raubenheimer, B C; Raue, M; Raux, J; Rayner, S M; Reimer, A; Reimer, O; Ripken, J; Rob, L; Rolland, L; Rowell, G; Sahakian, V; Saugé, L; Schlenker, S; Schlickeiser, R; Schuster, C; Schwanke, U; Siewert, M; Sol, H; Spangler, D; Steenkamp, R; Stegmann, C; Tavernet, J-P; Terrier, R; Théoret, C G; Tluczykont, M; van Eldik, C; Vasileiadis, G; Venter, C; Vincent, P; Völk, H J; Wagner, S J
2006-02-09
The source of Galactic cosmic rays (with energies up to 10^15 eV) remains unclear, although it is widely believed that they originate in the shock waves of expanding supernova remnants. At present the best way to investigate their acceleration and propagation is by observing the gamma-rays produced when cosmic rays interact with interstellar gas. Here we report observations of an extended region of very-high-energy (>10^11 eV) gamma-ray emission correlated spatially with a complex of giant molecular clouds in the central 200 parsecs of the Milky Way. The hardness of the gamma-ray spectrum and the conditions in those molecular clouds indicate that the cosmic rays giving rise to the gamma-rays are likely to be protons and nuclei rather than electrons. The energy associated with the cosmic rays could have come from a single supernova explosion around 10^4 years ago.
Measuring and Evaluating TCP Splitting for Cloud Services
NASA Astrophysics Data System (ADS)
Pathak, Abhinav; Wang, Y. Angela; Huang, Cheng; Greenberg, Albert; Hu, Y. Charlie; Kern, Randy; Li, Jin; Ross, Keith W.
In this paper, we examine the benefits of split-TCP proxies, deployed in an operational world-wide network, for accelerating cloud services. We consider a network consisting of a large number of satellite datacenters, which host split-TCP proxies, and a smaller number of mega datacenters, which ultimately perform computation or provide storage. Using web search as an exemplary case study, our detailed measurements reveal that a vanilla TCP splitting solution deployed at the satellite DCs reduces the 95th percentile of latency by as much as 43% when compared to serving queries directly from the mega DCs. Through careful dissection of the measurement results, we characterize how individual components, including proxy stacks, network protocols, packet losses and network load, can impact the latency. Finally, we shed light on further optimizations that can fully realize the potential of the TCP splitting solution.
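The headline metric in this record is the reduction in tail latency. The sketch below shows, with synthetic latency samples, how such a 95th-percentile comparison between direct and proxied queries can be computed; the distributions and numbers are illustrative placeholders, not the measurements reported above.

```python
# Minimal sketch of the tail-latency metric: relative reduction in the 95th
# percentile of proxied (split-TCP) versus direct query response times.
# The two latency samples are synthetic placeholders for real measurements.
import numpy as np

rng = np.random.default_rng(7)
direct_ms = rng.lognormal(mean=6.0, sigma=0.5, size=10_000)   # direct to mega DC
proxied_ms = rng.lognormal(mean=5.6, sigma=0.4, size=10_000)  # via satellite-DC proxy

p95_direct = np.percentile(direct_ms, 95)
p95_proxied = np.percentile(proxied_ms, 95)
reduction = 100.0 * (p95_direct - p95_proxied) / p95_direct
print(f"95th percentile: direct {p95_direct:.0f} ms, proxied {p95_proxied:.0f} ms "
      f"({reduction:.0f}% reduction)")
```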
Inverse Bremsstrahlung in Shocked Astrophysical Plasmas
NASA Technical Reports Server (NTRS)
Baring, Matthew G.; Jones, Frank C.; Ellison, Donald C.
2000-01-01
There has recently been interest in the role of inverse bremsstrahlung, the emission of photons by fast suprathermal ions in collisions with ambient electrons possessing relatively low velocities, in tenuous plasmas in various astrophysical contexts. This follows a long hiatus in the application of suprathermal ion bremsstrahlung to astrophysical models since the early 1970s. The potential importance of inverse bremsstrahlung relative to normal bremsstrahlung, i.e., where ions are at rest, hinges upon the underlying velocity distributions of the interacting species. In this paper, we identify the conditions under which the inverse bremsstrahlung emissivity is significant relative to that for normal bremsstrahlung in shocked astrophysical plasmas. We determine that, since both observational and theoretical evidence favors electron temperatures comparable to, and certainly not strongly deficient relative to, proton temperatures in shocked plasmas, these environments generally render inverse bremsstrahlung at best a minor contributor to the overall emission. Hence inverse bremsstrahlung can be safely neglected in most models invoking shock acceleration in discrete sources such as supernova remnants. However, on scales ≳100 pc distant from these sources, Coulomb collisional losses can deplete the cosmic-ray electrons, rendering inverse bremsstrahlung, and perhaps bremsstrahlung from knock-on electrons, potentially detectable.
A high-level 3D visualization API for Java and ImageJ.
Schmid, Benjamin; Schindelin, Johannes; Cardona, Albert; Longair, Mark; Heisenberg, Martin
2010-05-21
Current imaging methods such as Magnetic Resonance Imaging (MRI), Confocal microscopy, Electron Microscopy (EM) or Selective Plane Illumination Microscopy (SPIM) yield three-dimensional (3D) data sets in need of appropriate computational methods for their analysis. The reconstruction, segmentation and registration are best approached from the 3D representation of the data set. Here we present a platform-independent framework based on Java and Java 3D for accelerated rendering of biological images. Our framework is seamlessly integrated into ImageJ, a free image processing package with a vast collection of community-developed biological image analysis tools. Our framework enriches the ImageJ software libraries with methods that greatly reduce the complexity of developing image analysis tools in an interactive 3D visualization environment. In particular, we provide high-level access to volume rendering, volume editing, surface extraction, and image annotation. The ability to rely on a library that removes the low-level details enables concentrating software development efforts on the algorithm implementation parts. Our framework enables biomedical image software development to be built with 3D visualization capabilities with very little effort. We offer the source code and convenient binary packages along with extensive documentation at http://3dviewer.neurofly.de.
TransCut: interactive rendering of translucent cutouts.
Li, Dongping; Sun, Xin; Ren, Zhong; Lin, Stephen; Tong, Yiying; Guo, Baining; Zhou, Kun
2013-03-01
We present TransCut, a technique for interactive rendering of translucent objects undergoing fracturing and cutting operations. As the object is fractured or cut open, the user can directly examine and intuitively understand the complex translucent interior, as well as edit material properties through painting on cross sections and recombining the broken pieces, all with immediate and realistic visual feedback. This new mode of interaction with translucent volumes is made possible by two technical contributions. The first is a novel solver for the diffusion equation (DE) over a tetrahedral mesh that produces high-quality results comparable to the state-of-the-art finite element method (FEM) of Arbree et al. but at substantially higher speeds. This accuracy and efficiency are obtained by computing the discrete divergences of the diffusion equation and constructing the DE matrix using analytic formulas derived for linear finite elements. The second contribution is a multiresolution algorithm that significantly accelerates our DE solver while adapting to the frequent changes in topological structure of dynamic objects. The entire multiresolution DE solver is highly parallel and easily implemented on the GPU. We believe TransCut provides a novel visual effect for heterogeneous translucent objects undergoing fracturing and cutting operations.
Assessment of the Dehydration-Greenhouse Feedback Over the Arctic During Winter
NASA Astrophysics Data System (ADS)
Girard, E.; Stefanof, A.; Peltier-Champigny, M.; Munoz-Alpizar, R.; Dueymes, G.; Jean-Pierre, B.
2007-12-01
The effect of pollution-derived sulphuric acid aerosols on aerosol-cloud-radiation interactions is investigated over the Arctic for February 1990. Observations suggest that acidic aerosols can decrease the heterogeneous nucleation rate of ice crystals and lower the homogeneous freezing temperature of haze droplets. Based on these observations, we hypothesize that the cloud thermodynamic phase is modified in polluted air masses (Arctic haze). Cloud ice number concentration is reduced, thus promoting further ice crystal growth by the Bergeron-Findeisen process. Hence, ice crystals reach larger sizes and low-level ice crystal precipitation from mixed-phase clouds increases. Enhanced dehydration of the lower troposphere contributes to decreasing the water vapour greenhouse effect and cooling the surface. A positive feedback is created between surface cooling and air dehydration, accelerating cold air production. This process is referred to as the dehydration-greenhouse feedback (DGF). Simulations performed with an Arctic regional climate model for February 1990 and for February and March of 1985 and 1995 are used to assess the potential effect of the DGF on the Arctic climate. Results show that the DGF has an important effect over the Central and Eurasian Arctic, the coldest part of the Arctic, with a surface cooling ranging between 0 and -3 K. Moreover, the lower-tropospheric cooling over the Eurasian and Central Arctic strengthens the atmospheric circulation at upper levels, thus increasing aerosol transport from the mid-latitudes and enhancing the DGF. Over warmer areas, the increased aerosol concentration (caused by the DGF) leads to longer cloud lifetimes, which contributes to warming these areas. It is also shown that the maximum ice nuclei reduction must be of the order of 100 to produce a significant effect.
Development of a Cloud Resolving Model for Heterogeneous Supercomputers
NASA Astrophysics Data System (ADS)
Sreepathi, S.; Norman, M. R.; Pal, A.; Hannah, W.; Ponder, C.
2017-12-01
A cloud resolving climate model is needed to reduce major systematic errors in climate simulations that arise from structural uncertainty in numerical treatments of convection, such as convective storm systems. This research describes the porting effort to enable the SAM (System for Atmosphere Modeling) cloud resolving model on heterogeneous supercomputers using GPUs (Graphics Processing Units). We have isolated a standalone configuration of SAM that is targeted for integration into the DOE ACME (Accelerated Climate Modeling for Energy) Earth System model. We have identified key computational kernels from the model and offloaded them to a GPU using the OpenACC programming model. Furthermore, we are investigating various optimization strategies intended to enhance GPU utilization, including loop fusion/fission, coalesced data access and loop refactoring to a higher abstraction level. We will present early performance results, lessons learned, and optimization strategies. The computational platform used in this study is the Summitdev system, an early testbed that is one generation removed from Summit, the next leadership-class supercomputer at Oak Ridge National Laboratory. The system contains 54 nodes, each with 2 IBM POWER8 CPUs and 4 NVIDIA Tesla P100 GPUs. This work is part of a larger project, the ACME-MMF component of the U.S. Department of Energy (DOE) Exascale Computing Project. The ACME-MMF approach addresses structural uncertainty in cloud processes by replacing traditional parameterizations with cloud resolving "superparameterization" within each grid cell of a global climate model. Superparameterization dramatically increases arithmetic intensity, making the MMF approach an ideal strategy to achieve good performance on emerging exascale computing architectures. The goal of the project is to integrate superparameterization into ACME and explore its full potential to scientifically and computationally advance climate simulation and prediction.
Directed Panspermia. 3. Strategies and Motivation for Seeding Star-Forming Clouds
NASA Astrophysics Data System (ADS)
Mautner, Michael N.
1997-11-01
Microbial swarms aimed at star-forming regions of interstellar clouds can seed stellar associations of 10-100 young planetary systems. Swarms of millimeter-size, milligram packets can be launched by 35 cm solar sails at 5E-4 c to penetrate interstellar clouds. Selective capture in high-density planetary accretion zones of densities > 1E-17 kg m^-3 is achieved by viscous drag. Strategies are evaluated to seed dense cloud cores, or individual protostellar condensations, accretion disks or young planets therein. Targeting the Ophiuchus cloud is described as a model system. The biological content, dispersed in 30 μm, 1E-10 kg capsules of 1E6 freeze-dried microorganisms each, may be captured by new planets or delivered to planets after incorporation first into carbonaceous asteroids and comets. These objects, as modeled by meteorite materials, contain biologically available organic and mineral nutrients that are shown to sustain microbial growth. The program may be driven by panbiotic ethics, predicated on: 1. the unique position of complex organic life amongst the structures of Nature; 2. self-propagation as the basic propensity of the living pattern; 3. the biophysical unity of humans with the organic, DNA/protein family of life; and 4. consequently, the primary human purpose to safeguard and propagate our organic life form. To promote this purpose, panspermia missions with diverse biological payloads will maximize survival at the targets and induce evolutionary pressures. In particular, eukaryotes and simple multicellular organisms in the payload will accelerate higher evolution. Based on the geometries and masses of star-forming regions, the 1E24 kg carbon resources of one solar system, applied during its 5E9 yr lifespan, can seed all newly forming planetary systems in the galaxy.
Coupled Sulfur and Chlorine Chemistry in Venus' Upper Cloud Layer
NASA Astrophysics Data System (ADS)
Mills, Franklin P.
2006-09-01
Venus' atmosphere likely contains a rich variety of sulfur and chlorine compounds because HCl, SO2, and OCS have all been observed. Photodissociation of CO2 and SO2 in the upper cloud layer produces oxygen which can react directly or indirectly with SO2 to form SO3 and eventually H2SO4. Photodissociation of HCl within and above the upper cloud layer produces chlorine which can react with CO and O2 to form ClCO and ClC(O)OO. These two species have been identified as potentially critical intermediaries in the production of CO2. Much less work has been done on the potential coupling between sulfur and chlorine chemistry that may occur within the upper cloud layer. Several aspects have been examined in recent modeling: (1) linkage of the CO2 and sulfur oxidation cycles (based on ideas from Yung and DeMore, 1982), (2) reaction of Cl with SO2 to form ClSO2 (based on ideas from DeMore et al., 1985), and (3) the chemistry of SmCln for m,n = 1,2 (based on preliminary work in Mills, 1998). Initial results suggest the chemistry of SmCln may provide a pathway for accelerated production of polysulfur, Sx, if the oxygen abundance in the upper cloud layer is as small as is implied by the observational limit on O2 (Trauger and Lunine, 1983). Initial results also suggest that ClSO2 can act as a buffer which helps increase the scale height of SO2 and decrease the rate of production of H2SO4. This presentation will describe the results from this modeling; discuss their potential implications for the CO2, sulfur oxidation, and polysulfur cycles; and outline key observations from Venus Express that can help resolve existing questions concerning the chemistry of Venus' upper cloud. Partial funding for this research was provided by the Australian Research Council.
A robust method to forecast volcanic ash clouds
Denlinger, Roger P.; Pavolonis, Mike; Sieglaff, Justin
2012-01-01
Ash clouds emanating from volcanic eruption columns often form trails of ash extending thousands of kilometers through the Earth's atmosphere, disrupting air traffic and posing a significant hazard to air travel. To mitigate such hazards, the community charged with reducing flight risk must accurately assess risk of ash ingestion for any flight path and provide robust forecasts of volcanic ash dispersal. In response to this need, a number of different transport models have been developed for this purpose and applied to recent eruptions, providing a means to assess uncertainty in forecasts. Here we provide a framework for optimal forecasts and their uncertainties given any model and any observational data. This involves random sampling of the probability distributions of input (source) parameters to a transport model and iteratively running the model with different inputs, each time assessing the predictions that the model makes about ash dispersal by direct comparison with satellite data. The results of these comparisons are embodied in a likelihood function whose maximum corresponds to the minimum misfit between model output and observations. Bayes theorem is then used to determine a normalized posterior probability distribution and from that a forecast of future uncertainty in ash dispersal. The nature of ash clouds in heterogeneous wind fields creates a strong maximum likelihood estimate in which most of the probability is localized to narrow ranges of model source parameters. This property is used here to accelerate probability assessment, producing a method to rapidly generate a prediction of future ash concentrations and their distribution based upon assimilation of satellite data as well as model and data uncertainties. Applying this method to the recent eruption of Eyjafjallajökull in Iceland, we show that the 3 and 6 h forecasts of ash cloud location probability encompassed the location of observed satellite-determined ash cloud loads, providing an efficient means to assess all of the hazards associated with these ash clouds.
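The sampling-plus-Bayes procedure summarised in this record can be sketched in a few lines. The sketch below is a heavily simplified illustration, not the authors' implementation: run_transport_model and satellite_obs are hypothetical placeholders, the prior distributions are invented, and the likelihood is an assumed Gaussian misfit between modelled and observed ash column loads.

```python
# Minimal sketch of the ensemble/Bayesian idea: sample source parameters, run the
# transport model per member, weight members by agreement with satellite data.
# All model components and numbers below are assumed placeholders.
import numpy as np

rng = np.random.default_rng(42)
n_members = 200
sigma_obs = 0.5                                    # assumed observation error (g m^-2)

# Sample uncertain source parameters from assumed prior distributions.
mer = rng.lognormal(mean=np.log(1e6), sigma=1.0, size=n_members)   # mass eruption rate
height_km = rng.uniform(5.0, 15.0, size=n_members)                 # plume height

def run_transport_model(mass_rate, height):
    """Hypothetical stand-in for an ash-transport run; returns column loads."""
    return np.full(10, 1e-6 * mass_rate * height)  # toy output, g m^-2 at 10 pixels

satellite_obs = np.full(10, 60.0)                  # toy satellite column loads

log_like = np.empty(n_members)
for i in range(n_members):
    pred = run_transport_model(mer[i], height_km[i])
    log_like[i] = -0.5 * np.sum(((pred - satellite_obs) / sigma_obs) ** 2)

# Bayes' theorem: posterior weights proportional to prior samples times likelihood.
weights = np.exp(log_like - log_like.max())
weights /= weights.sum()
best = int(np.argmax(weights))
print(f"highest-weight member: {best}, weight {weights[best]:.3f}")
```

The weighted ensemble of members then provides the probabilistic forecast of future ash concentrations described in the abstract.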
NASA Astrophysics Data System (ADS)
Hammitzsch, M.; Spazier, J.; Reißland, S.
2014-12-01
Usually, tsunami early warning and mitigation systems (TWS or TEWS) are based on several software components deployed in a client-server based infrastructure. The vast majority of systems importantly include desktop-based clients with a graphical user interface (GUI) for the operators in early warning centers. However, in times of cloud computing and ubiquitous computing, the use of concepts and paradigms introduced by continuously evolving approaches in information and communications technology (ICT) has to be considered even for early warning systems (EWS). Based on the experiences and the knowledge gained in three research projects - 'German Indonesian Tsunami Early Warning System' (GITEWS), 'Distant Early Warning System' (DEWS), and 'Collaborative, Complex, and Critical Decision-Support in Evolving Crises' (TRIDEC) - new technologies are exploited to implement a cloud-based and web-based prototype to open up new prospects for EWS. This prototype, named 'TRIDEC Cloud', merges several complementary external and in-house cloud-based services into one platform for automated background computation with graphics processing units (GPU), for web-mapping of hazard-specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The prototype in its current version addresses tsunami early warning and mitigation. The integration of GPU-accelerated tsunami simulation computations has been an integral part of this prototype to foster early warning with on-demand tsunami predictions based on actual source parameters. However, the platform is meant for researchers around the world to make use of the cloud-based GPU computation to analyze other types of geohazards and natural hazards and to react upon the computed situation picture with a web-based GUI in a web browser at remote sites. The current website is an early alpha version for demonstration purposes to give the concept a whirl and to shape science's future. Further functionality, improvements and possible profound changes will have to be implemented successively based on the users' evolving needs.
NASA Astrophysics Data System (ADS)
Molina Garcia, Victor; Sasi, Sruthy; Efremenko, Dmitry; Doicu, Adrian; Loyola, Diego
2017-04-01
In this work, the requirements for the retrieval of cloud properties in the back-scattering region are described, and their application to the measurements taken by the Earth Polychromatic Imaging Camera (EPIC) on board the Deep Space Climate Observatory (DSCOVR) is shown. Various radiative transfer models and their linearizations are implemented, and their advantages and issues are analyzed. As radiative transfer calculations in the back-scattering region are computationally time-consuming, several acceleration techniques are also studied. The radiative transfer models analyzed include the exact Discrete Ordinate method with Matrix Exponential (DOME), the Matrix Operator method with Matrix Exponential (MOME), and the approximate asymptotic and equivalent Lambertian cloud models. To reduce the computational cost of the line-by-line (LBL) calculations, the k-distribution method, the Principal Component Analysis (PCA) and a combination of the k-distribution method plus PCA are used. The linearized radiative transfer models for retrieval of cloud properties include the Linearized Discrete Ordinate method with Matrix Exponential (LDOME), the Linearized Matrix Operator method with Matrix Exponential (LMOME) and the Forward-Adjoint Discrete Ordinate method with Matrix Exponential (FADOME). These models were applied to the EPIC oxygen-A band absorption channel at 764 nm. It is shown that the approximate asymptotic and equivalent Lambertian cloud models give inaccurate results, so an offline processor for the retrieval of cloud properties in the back-scattering region requires the use of exact models such as DOME and MOME, which behave similarly. The combination of the k-distribution method plus PCA presents similar accuracy to the LBL calculations, but it is up to 360 times faster, and the relative errors for the computed radiances are less than 1.5% compared to the results when the exact phase function is used. Finally, the linearized models studied show similar behavior, with relative errors less than 1% for the radiance derivatives, but FADOME is 2 times faster than LDOME and 2.5 times faster than LMOME.
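The acceleration idea behind the k-distribution plus PCA combination mentioned in this record is to compress the thousands of line-by-line spectral points into a handful of principal components, so that the exact radiative transfer solver only has to be called a few times. The sketch below illustrates only that compression step on synthetic optical-depth profiles; it is not the DOME/MOME processor itself, and all sizes and numbers are assumptions.

```python
# Minimal sketch of the PCA compression behind the acceleration: monochromatic
# optical-property profiles are reduced to a few principal components, and the
# expensive solver would be run only for those component profiles.
# The optical-depth matrix here is synthetic, low-rank toy data.
import numpy as np

rng = np.random.default_rng(1)
n_spectral, n_layers = 5000, 40
base = np.linspace(1.0, 0.1, n_layers)             # mean optical-depth profile
modes = rng.normal(size=(3, n_layers))             # three underlying variability modes
coeffs = rng.normal(size=(n_spectral, 3))
optical_depth = np.abs(base + 0.05 * coeffs @ modes)

mean_profile = optical_depth.mean(axis=0)
anomalies = optical_depth - mean_profile
u, s, vt = np.linalg.svd(anomalies, full_matrices=False)   # empirical orthogonal functions
n_pc = 4
scores = u[:, :n_pc] * s[:n_pc]                    # per-wavelength expansion coefficients
eofs = vt[:n_pc]                                   # principal-component profiles

# The exact solver would be evaluated for mean_profile and mean_profile ± eofs,
# i.e. 2 * n_pc + 1 calls instead of n_spectral calls.
reconstructed = mean_profile + scores @ eofs
rel_err = np.abs(reconstructed - optical_depth).max() / optical_depth.max()
print(f"profiles compressed to {n_pc} PCs, max relative error {rel_err:.2%}")
```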
Computer generated hologram from point cloud using graphics processor.
Chen, Rick H-Y; Wilkinson, Timothy D
2009-12-20
Computer generated holography is an extremely demanding and complex task when it comes to providing realistic reconstructions with full parallax, occlusion, and shadowing. We present an algorithm designed for data-parallel computing on modern graphics processing units to alleviate the computational burden. We apply Gaussian interpolation to create a continuous surface representation from discrete input object points. The algorithm maintains a potential occluder list for each individual hologram plane sample to keep the number of visibility tests to a minimum. We experimented with two approximations that simplify and accelerate occlusion computation. It is observed that letting several neighboring hologram plane samples share visibility information on object points leads to significantly faster computation without causing noticeable artifacts in the reconstructed images. Computing a reduced sample set via nonuniform sampling is also found to be an effective acceleration technique.
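The baseline computation that GPU methods like the one above accelerate is the summation of spherical waves from every object point at every hologram sample. The sketch below shows that naive point-source summation only; it omits the Gaussian surface interpolation and the occlusion handling described in the record, and the wavelength, pixel pitch and object points are illustrative values.

```python
# Minimal sketch of the naive point-source hologram summation (no occlusion,
# no Gaussian surface interpolation): each object point contributes a spherical
# wave to every hologram-plane sample.
import numpy as np

wavelength = 532e-9                     # metres
k = 2.0 * np.pi / wavelength
pitch = 8e-6                            # hologram sample spacing (m)
n = 256                                 # hologram is n x n samples

xs = (np.arange(n) - n / 2) * pitch
hx, hy = np.meshgrid(xs, xs)            # hologram-plane sample coordinates

# A few object points: (x, y, z, amplitude), z measured from the hologram plane.
points = [(0.0, 0.0, 0.05, 1.0), (1e-3, -5e-4, 0.06, 0.8)]

field = np.zeros((n, n), dtype=complex)
for px, py, pz, amp in points:
    r = np.sqrt((hx - px) ** 2 + (hy - py) ** 2 + pz ** 2)
    field += amp * np.exp(1j * k * r) / r          # spherical wave from each point

hologram = np.angle(field)              # phase-only hologram pattern
print(hologram.shape)
```

Because the cost grows with the product of object points and hologram samples, the visibility sharing and nonuniform sampling strategies in the record are what make realistic scenes tractable.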
Erosion and Channel Incision Analysis with High-Resolution Lidar
NASA Astrophysics Data System (ADS)
Potapenko, J.; Bookhagen, B.
2013-12-01
High-resolution LiDAR (LIght Detection And Ranging) provides a new generation of sub-meter topographic data that is still to be fully exploited by the Earth science communities. We make use of multi-temporal airborne and terrestrial lidar scans in the south-central California and Santa Barbara area. Specifically, we have investigated the Mission Canyon and Channel Islands regions from 2009-2011 to study changes in erosion and channel incision on the landscape. In addition to gridding the lidar data into digital elevation models (DEMs), we also make use of raw lidar point clouds and triangulated irregular networks (TINs) for detailed analysis of heterogeneously spaced topographic data. Using recent advancements in lidar point cloud processing from information technology disciplines, we have employed novel lidar point cloud processing and feature detection algorithms to automate the detection of deeply incised channels and gullies, vegetation, and other derived metrics (e.g. estimates of eroded volume). Our analysis compares topographically-derived erosion volumes to field-derived cosmogenic radionuclide age and in-situ sediment-flux measurements. First results indicate that gully erosion accounts for up to 60% of the sediment volume removed from the Mission Canyon region. Furthermore, we observe that gully erosion and upstream arroyo propagation accelerated after fires, especially in regions where vegetation was heavily burned. The use of high-resolution lidar point cloud data for topographic analysis is still a novel method that needs more precedent and we hope to provide a cogent example of this approach with our research.
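One of the derived metrics mentioned in this record, eroded volume, can be estimated by differencing gridded DEMs from repeat surveys. The sketch below shows that differencing step on synthetic arrays; the grids, resolution and erosion depths are placeholders, not the Mission Canyon data.

```python
# Minimal sketch of eroded-volume estimation from two gridded DEMs of the same
# area acquired at different times. Arrays are synthetic stand-ins for real grids.
import numpy as np

cell_size = 1.0                                    # DEM resolution in metres
dem_t0 = np.full((500, 500), 100.0)                # earlier surface elevations (m)
dem_t1 = dem_t0 - np.random.default_rng(2).uniform(0.0, 0.2, dem_t0.shape)

dz = dem_t1 - dem_t0                               # negative where material was removed
eroded_depth = np.where(dz < 0.0, -dz, 0.0)        # per-cell erosion depth (m)
volume_m3 = eroded_depth.sum() * cell_size ** 2
print(f"estimated eroded volume: {volume_m3:.0f} m^3")
```

In practice the point clouds would first be classified and gridded (or compared directly as TINs), and uncertainty in registration between the two epochs dominates the error budget.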
CHOLLA: A New Massively Parallel Hydrodynamics Code for Astrophysical Simulation
NASA Astrophysics Data System (ADS)
Schneider, Evan E.; Robertson, Brant E.
2015-04-01
We present Computational Hydrodynamics On ParaLLel Architectures (Cholla), a new three-dimensional hydrodynamics code that harnesses the power of graphics processing units (GPUs) to accelerate astrophysical simulations. Cholla models the Euler equations on a static mesh using state-of-the-art techniques, including the unsplit Corner Transport Upwind algorithm, a variety of exact and approximate Riemann solvers, and multiple spatial reconstruction techniques including the piecewise parabolic method (PPM). Using GPUs, Cholla evolves the fluid properties of thousands of cells simultaneously and can update over 10 million cells per GPU-second while using an exact Riemann solver and PPM reconstruction. Owing to the massively parallel architecture of GPUs and the design of the Cholla code, astrophysical simulations with physically interesting grid resolutions (≳256^3) can easily be computed on a single device. We use the Message Passing Interface library to extend calculations onto multiple devices and demonstrate nearly ideal scaling beyond 64 GPUs. A suite of test problems highlights the physical accuracy of our modeling and provides a useful comparison to other codes. We then use Cholla to simulate the interaction of a shock wave with a gas cloud in the interstellar medium, showing that the evolution of the cloud is highly dependent on its density structure. We reconcile the computed mixing time of a turbulent cloud with a realistic density distribution destroyed by a strong shock with the existing analytic theory for spherical cloud destruction by describing the system in terms of its median gas density.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, Xiquan; Zib, Benjamin J.; Xi, Baike
A warming Arctic climate is undergoing significant environmental change, most evidenced by the reduction of Arctic sea-ice extent during the summer. In this study, we examine two extreme anomalies of September sea-ice extent in 2007 and 1996, and investigate the impacts of cloud fraction (CF), atmospheric precipitable water vapor (PWV), downwelling longwave flux (DLF), surface air temperature (SAT), pressure and winds on the sea-ice variation in 2007 and 1996 using both satellite-derived sea-ice products and MERRA reanalysis. The area of the Laptev, East Siberian and West Chukchi seas (70-90°N, 90-180°E) has experienced the largest year-to-year variation in sea-ice extent and is defined here as the Area Of Focus (AOF). The record low September sea-ice extent in 2007 was associated with positive anomalies of CF, PWV, DLF, and SAT over the AOF. A persistent anti-cyclone positioned over the Beaufort Sea coupled with low pressure over Eurasia induced easterly zonal and southerly meridional winds. In contrast, negative CF, PWV, DLF and SAT anomalies, as well as opposite wind patterns to those in 2007, characterized the 1996 high September sea-ice extent. Through this study, we hypothesize the following positive feedbacks of clouds, water vapor, radiation and atmospheric variables on the sea-ice retreat during the summer of 2007. The record low sea-ice extent during the summer of 2007 was initially triggered by the atmospheric circulation anomaly. The southerly winds across the Chukchi and East Siberian seas transport warm, moist air from the north Pacific, which not only enhances sea-ice melt across the AOF but also increases clouds. The positive cloud feedback results in higher SAT and more sea-ice melt. Therefore, more water vapor could be evaporated from open seas and higher SAT to form more clouds, which will enhance the positive cloud feedback. This enhanced positive cloud feedback will then further increase SAT and accelerate the sea-ice retreat during the summer of 2007.
Hanker, J; Giammara, B
1993-01-01
Recent studies in our laboratories have shown how microwave (MW) irradiation can accelerate a number of tissue-processing techniques, especially staining, to aid in the preparation of single specimens on glass microscope slides or coverslips for examination by light microscopy (and electron microscopy, if required) for diagnostic purposes. Techniques have been developed, which give permanently stained preparations, that can be studied initially by light microscopy, their areas of interest mapped, and computer-automated image analysis performed to obtain quantitative information. This is readily performed after MW-accelerated staining with silver methenamine by the Giammara-Hanker PATS or PATS-TS reaction. This variation of the PAS reaction gives excellent markers for specific infectious agents such as lipopolysaccharides for gram-negative bacteria or mannans for fungi. It is also an excellent stain for glycogen and basement membranes and an excellent marker for type III collagen or reticulin in the endoneurium or perineurium of peripheral nerve or in the capillary walls. Our improved MW-accelerated Feulgen reaction with silver methenamine for nuclear DNA is useful to show the nuclei of bacteria and fungi as well as of cells they are infecting. Improved coating and penetration of tissue surfaces by thiocarbohydrazide bridging of ruthenium red, applied under MW-acceleration, render biologic specimens sufficiently conductive for SEM so that sputter coating with gold is unnecessary. The specimens treated with these highly visible electron-opaque stains can be screened with the light microscope after mounting in polyethylene glycol (PEG) and the structures or areas selected for EM study are mapped with a Micro-Locator slide. After removal of the water soluble PEG the specimens are remounted in the usual EM media for scanning electron microscopy (SEM) or transmission electron microscopy (TEM) study of the mapped areas. By comparing duplicate smears from areas of infection, such as two coverslips of buffy coat smears of blood from a patient with septicemia, the microorganisms responsible can occasionally be classified for antimicrobial therapy long before culture results are available; gram-negative bacteria are positive with the Giammara-Hanker PATS-TS stain, and gram-positive bacteria are positive with the SIGMA HT40 Gram stain. The gram-positive as well as gram-negative bacteria are both initially stained by the crystal violet component of the Gram stain. The crystal violet stain is readily removed from the gram-negative (but not the gram-positive) bacteria when the specimens are rinsed with alcohol/acetone. If this rinse step is omitted, the crystal violet remains attached to both gram-negative and gram-positive bacteria. It can then be rendered insoluble, electron-opaque, and conductive by treatment with silver methenamine solution under MW-irradiation. This metallized crystal violet is a more effective silver stain than the PATS-TS stain for a number of gram-negative spirochetes such as Treponema pallidum, the microbe that causes syphilis.
NASA Technical Reports Server (NTRS)
Oshman, Yaakov; Markley, Landis
1998-01-01
A sequential filtering algorithm is presented for attitude and attitude-rate estimation from Global Positioning System (GPS) differential carrier phase measurements. A third-order, minimal-parameter method for solving the attitude matrix kinematic equation is used to parameterize the filter's state, which renders the resulting estimator computationally efficient. Borrowing from tracking theory concepts, the angular acceleration is modeled as an exponentially autocorrelated stochastic process, thus avoiding the use of the uncertain spacecraft dynamic model. The new formulation facilitates the use of aiding vector observations in a unified filtering algorithm, which can enhance the method's robustness and accuracy. Numerical examples are used to demonstrate the performance of the method.
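The tracking-theory idea mentioned in this record, modelling angular acceleration as an exponentially autocorrelated stochastic process, corresponds to a first-order Gauss-Markov model in the filter's time-propagation step. The sketch below shows that discretised propagation only; the correlation time, noise strength and step size are assumed illustrative values, not those used in the paper, and the full attitude filter is not reproduced.

```python
# Minimal sketch of a first-order Gauss-Markov (exponentially autocorrelated)
# angular-acceleration model, discretised as it would appear in a filter's
# propagation step. Numbers are assumed for illustration only.
import numpy as np

dt = 0.1            # filter step (s)
tau = 50.0          # assumed correlation time of angular acceleration (s)
sigma = 1e-4        # assumed steady-state std of angular acceleration (rad/s^2)

phi = np.exp(-dt / tau)                             # discrete transition factor
q = sigma ** 2 * (1.0 - phi ** 2)                   # discrete process-noise variance

rng = np.random.default_rng(3)
alpha = 0.0                                         # angular acceleration state
omega = 0.0                                         # angular rate state
for _ in range(1000):
    alpha = phi * alpha + rng.normal(0.0, np.sqrt(q))
    omega += alpha * dt                             # rate integrates acceleration
print(f"final rate {omega:.3e} rad/s, final acceleration {alpha:.3e} rad/s^2")
```

The benefit noted in the abstract is that such a statistical acceleration model avoids relying on an uncertain spacecraft dynamic model while still letting the filter track rate changes.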
Bacterial expression of human kynurenine 3-monooxygenase: Solubility, activity, purification
Wilson, K.; Mole, D.J.; Binnie, M.; Homer, N.Z.M.; Zheng, X.; Yard, B.A.; Iredale, J.P.; Auer, M.; Webster, S.P.
2014-01-01
Kynurenine 3-monooxygenase (KMO) is an enzyme central to the kynurenine pathway of tryptophan metabolism. KMO has been implicated as a therapeutic target in several disease states, including Huntington’s disease. Recombinant human KMO protein production is challenging due to the presence of transmembrane domains, which localise KMO to the outer mitochondrial membrane and render KMO insoluble in many in vitro expression systems. Efficient bacterial expression of human KMO would accelerate drug development of KMO inhibitors but until now this has not been achieved. Here we report the first successful bacterial (Escherichia coli) expression of active FLAG™-tagged human KMO enzyme expressed in the soluble fraction and progress towards its purification. PMID:24316190
Carr, Dustin W [Albuquerque, NM]
2008-04-08
An optical displacement sensor is disclosed which uses a vertical-cavity surface-emitting laser (VCSEL) coupled to an optical cavity formed by a moveable membrane and an output mirror of the VCSEL. This arrangement renders the lasing characteristics of the VCSEL sensitive to any movement of the membrane produced by sound, vibrations, pressure changes, acceleration, etc. Some embodiments of the optical displacement sensor can further include a light-reflective diffractive lens located on the membrane or adjacent to the VCSEL to control the amount of lasing light coupled back into the VCSEL. A photodetector detects a portion of the lasing light from the VCSEL to provide an electrical output signal for the optical displacement sensor which varies with the movement of the membrane.
NASA Astrophysics Data System (ADS)
Elfaki, H.; Yousef, S.; Mawad, Ramy; Algafari, Y. H. O.; Amer, M.; Abdel-Sattar, W.
2017-12-01
Severe solar events, manifested as highly energetic X-ray events accompanied by coronal mass ejections (CMEs) and proton flares, have caused flash floods in Makkah Al-Mukaramah, Al-Madinah Al-Munawarah and Jeddah. In the case of the 20 January 2005 CME that initiated a severe flash flood on 22 January, it is shown that the CME lowered the pressure in the polar region and extended the low-pressure regime to Saudi Arabia via the Mediterranean. This passage accelerated evaporation and caused Cumulonimbus clouds to form and discharge flash floods over Makkah Al-Mukaramah. On the other hand, solar forcing due to coronal holes initiates flash floods through a different mechanism. The 25 November 2009 and 13-15 January 2011 Jeddah flash floods are attributed to prompt events driven by fast solar streams emanating from two coronal holes that arrived at Earth on 24 November 2009 and 13 January 2011. We present evidence that those streams penetrated the Earth's magnetosphere and hit the troposphere at the western part of the Red Sea, dissipated their energy at the 925 mb geopotential height and left two hot spots. It follows that the air in the hot spots expanded and developed spots of low-pressure air that spread over the Red Sea to its eastern coast. Accelerated evaporation due to the reduced pressure caused quick formation of Cumulonimbus clouds that produced flash floods over Makkah Al-Mukaramah and Jeddah.
NASA Astrophysics Data System (ADS)
Jumelet, Julien; Bekki, Slimane; Keckhut, Philippe
2017-04-01
We present a high-resolution isentropic microphysical transport model dedicated to stratospheric aerosols and clouds. The model is based on the MIMOSA model (Modélisation Isentrope du transport Méso-échelle de l'Ozone Stratosphérique par Advection) and adds several modules: a fully explicit size-resolving microphysical scheme to transport the aerosol granulometry as passive tracers, and an optical module able to calculate the scattering and extinction properties of particles at given wavelengths. Originally designed for polar stratospheric clouds (composed of sulfuric acid, nitric acid and water vapor), the model is fully capable of rendering the structure and properties of volcanic plumes at the finer scales, assuming complete SO2 oxidation. This link between microphysics and optics also enables the model to take advantage of spaceborne lidar data (i.e. CALIOP) by calculating the 532 nm aerosol backscatter coefficient and taking it as the control variable to provide microphysical constraints during transport. This methodology has been applied to simulate volcanic plumes from relatively recent eruptions, from the 2010 Merapi to the 2015 Calbuco eruption. Optical calculations are also used for direct comparisons between the model and ground-based lidar stations for validation as well as characterization purposes. We will present the model and the simulation results, along with a focus on the sensitivity to initialisation parameters, considering the need for quasi-real-time modelling and forecasts in the case of future eruptions.
KAGLVis - On-line 3D Visualisation of Earth-observing-satellite Data
NASA Astrophysics Data System (ADS)
Szuba, Marek; Ameri, Parinaz; Grabowski, Udo; Maatouki, Ahmad; Meyer, Jörg
2015-04-01
One of the goals of the Large-Scale Data Management and Analysis project is to provide a high-performance framework facilitating management of data acquired by Earth-observing satellites such as Envisat. On the client-facing facet of this framework, we strive to provide a visualisation and basic analysis tool that can be used by scientists with minimal to no knowledge of the underlying infrastructure. Our tool, KAGLVis, is a JavaScript client-server Web application which leverages modern Web technologies to provide three-dimensional visualisation of satellite observables on a wide range of client systems. It takes advantage of the WebGL API to employ locally available GPU power for 3D rendering; this approach has been demonstrated to perform well even on relatively weak hardware such as integrated graphics chipsets found in modern laptop computers and, with some user-interface tuning, could even be usable on embedded devices such as smartphones or tablets. Data is fetched from the database back-end using a ReST API and cached locally, both in memory and using HTML5 Web Storage, to minimise network use. Computations, for instance the calculation of cloud altitude from cloud-index measurements, can be performed on either the client or the server side, depending on configuration. Keywords: satellite data, Envisat, visualisation, 3D graphics, Web application, WebGL, MEAN stack.
Surfatron accelerator in the local interstellar cloud
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loznikov, V. M., E-mail: vloznikov@yandex.ru; Erokhin, N. S.; Zol’nikova, N. N.
2017-01-15
Taking into account results of numerous experiments, the variability of the energy spectra of cosmic rays (protons and helium nuclei) in the energy range of 10 GeV to ~10^7 GeV is explained on the basis of a hypothesis of the existence of two variable sources close to the Sun. The first (soft) surfatron source (with a size of ~100 AU) is located at the periphery of the heliosphere. The second (hard) surfatron source (with a size of ~1 pc) is situated in the Local Interstellar Cloud (LIC) at a distance of <1 pc. The constant background is described by a power-law spectrum with a slope of ~2.75. The variable heliospheric surfatron source is described by a power-law spectrum with a variable amplitude, slope, and cutoff energy, the maximum cutoff energy being in the range of E_CH/Z < 1000 GeV. The variable surfatron source in the LIC is described by a power-law spectrum with a variable amplitude, slope, and cutoff energy, the maximum cutoff energy being E_CL/Z ≤ 3 × 10^6 GeV. The proposed model is used to approximate data from several experiments performed at close times. The energy of each cosmic-ray component is calculated. The possibility of surfatron acceleration of Fe nuclei (Z = 26) in the LIC up to an energy of E_CL ~ 10^17 eV, and of electrons and positrons up to the "knee" in the energy spectrum, is predicted. By numerically solving a system of nonlinear equations describing the interaction between an electromagnetic wave and a charged particle with an energy of up to E/Z ~ 3 × 10^6 GeV, the possibility of trapping, confinement, and acceleration of charged cosmic-ray particles by a quasi-longitudinal plasma wave is demonstrated.
Observations of the Large Magellanic Cloud with Fermi
Abdo, A. A.; Ackermann, M.; Ajello, M.; ...
2010-03-18
Context. The Large Magellanic Cloud (LMC) is to date the only normal external galaxy that has been detected in high-energy gamma rays. High-energy gamma rays trace particle acceleration processes, and gamma-ray observations allow the nature and sites of acceleration to be studied. Aims. We characterise the distribution and sources of cosmic rays in the LMC from analysis of gamma-ray observations. Methods. We analyse 11 months of continuous sky-survey observations obtained with the Large Area Telescope aboard the Fermi Gamma-Ray Space Telescope and compare them to tracers of the interstellar medium and models of the gamma-ray sources in the LMC. Results. The LMC is detected at 33σ significance. The integrated >100 MeV photon flux of the LMC amounts to (2.6 ± 0.2) × 10^-7 ph cm^-2 s^-1, which corresponds to an energy flux of (1.6 ± 0.1) × 10^-10 erg cm^-2 s^-1, with additional systematic uncertainties of 16%. The analysis reveals the massive star-forming region 30 Doradus as a bright source of gamma-ray emission in the LMC, in addition to fainter emission regions found in the northern part of the galaxy. The gamma-ray emission from the LMC shows very little correlation with gas density and is rather correlated with tracers of massive star-forming regions. The close confinement of gamma-ray emission to star-forming regions suggests a relatively short GeV cosmic-ray proton diffusion length. In conclusion, the close correlation between cosmic-ray density and massive star tracers supports the idea that cosmic rays are accelerated in massive star-forming regions as a result of the large amounts of kinetic energy input by the stellar winds and supernova explosions of massive stars into the interstellar medium.
Huang, Zhaowen; Cao, Yang; Nie, Jinfeng; Zhou, Hao; Li, Yusheng
2018-01-01
Gradient structured materials possess good combinations of strength and ductility, rendering the materials attractive in industrial applications. In this research, a surface nanocrystallization (SNC) technique, rotationally accelerated shot peening (RASP), was employed to produce a gradient nanostructured pure Ti with a deformation layer that had a thickness of 2000 μm, which is thicker than those processed by conventional SNC techniques. It is possible to fabricate a gradient structured Ti workpiece without delamination. Moreover, based on the microstructural features, the microstructure of the processed sample can be classified into three regions, from the center to the surface of the RASP-processed sample: (1) a twinning-dominated core region; (2) a “twin intersection”-dominated twin transition region; and (3) the nanostructured region, featuring nanograins. A microhardness gradient was detected from the RASP-processed Ti. The surface hardness was more than twice that of the annealed Ti sample. The RASP-processed Ti sample exhibited a good combination of yield strength and uniform elongation, which may be attributed to the high density of deformation twins and a strong back stress effect. PMID:29498631
Novae as Tevatrons: prospects for CTA and IceCube
NASA Astrophysics Data System (ADS)
Metzger, B. D.; Caprioli, D.; Vurm, I.; Beloborodov, A. M.; Bartos, I.; Vlasov, A.
2016-04-01
The discovery of novae as sources of ~0.1-1 GeV gamma-rays highlights the key role of shocks and relativistic particle acceleration in these transient systems. Although there is evidence for a spectral cut-off above energies of ~1-100 GeV at particular epochs in some novae, the maximum particle energy achieved in these accelerators has remained an open question. The high densities of the nova ejecta (~10 orders of magnitude larger than in supernova remnants) render the gas far upstream of the shock neutral and shielded from ionizing radiation. The amplification of the magnetic field needed for diffusive shock acceleration requires ionized gas, thus confining the acceleration process to a narrow photoionized layer immediately ahead of the shock. Based on the growth rate of the hybrid non-resonant cosmic-ray current-driven instability (considering also ion-neutral damping), we quantify the maximum particle energy, Emax, across the range of shock velocities and upstream densities of interest. We find values of Emax ~ 10 GeV-10 TeV, which are broadly consistent with the inferred spectral cut-offs, but which could also in principle lead to emission extending to ≳100 GeV, accessible to atmospheric Cherenkov telescopes such as the Cherenkov Telescope Array (CTA). Detecting TeV neutrinos with IceCube is more challenging, although the prospects are improved for a nearby event (≲ kpc) or if the shock power during the earliest, densest phases of the outburst is higher than implied by the GeV light curves, due to downscattering of the gamma-rays within the ejecta.
Assessing the Impact of Earth Radiation Pressure Acceleration on Low-Earth Orbit Satellites
NASA Astrophysics Data System (ADS)
Vielberg, Kristin; Forootan, Ehsan; Lück, Christina; Kusche, Jürgen; Börger, Klaus
2017-04-01
The orbits of satellites are influenced by several external forces. The main non-gravitational forces besides thermospheric drag, acting on the surface of satellites, are accelerations due to the Earth and Solar Radiation Pressure (ERP and SRP, respectively). The Sun radiates visible and infrared light reaching the satellite directly, which causes the SRP. Earth also emits and reflects the sunlight back into space, where it acts on satellites. This is known as ERP acceleration. The influence of ERP increases with decreasing distance to the Earth, and for low-Earth orbit (LEO) satellites ERP must be taken into account in orbit and gravity computations. Estimating accelerations requires knowledge about energy emitted from the Earth, which can be derived from satellite remote sensing data, and also by considering the shape and surface material of a satellite. In this sensitivity study, we assess ERP accelerations based on different input albedo and emission fields and their modelling for the satellite missions Challenging Mini-Satellite Payload (CHAMP) and Gravity Recovery and Climate Experiment (GRACE). As input fields, monthly 1°x1° products of Clouds and the Earth's Radiant Energy System (CERES), L3 are considered. Albedo and emission models are generated as latitude-dependent, as well as in terms of spherical harmonics. The impact of different albedo and emission models as well as the macro model and the altitude of satellites on ERP accelerations will be discussed.
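For orientation, the magnitude of a radiation pressure acceleration can be sketched with the simple cannonball model (a generic textbook relation, not the macro-model formulation evaluated in the study above):

$$\mathbf{a}_{\mathrm{rad}} = -\,C_R\,\frac{A}{m}\,\frac{\Phi}{c}\,\hat{\mathbf{s}},$$

where Φ is the incident irradiance (W m^-2) from the Sun or from a reflecting and emitting Earth surface element, A/m is the satellite's area-to-mass ratio, C_R a radiation pressure coefficient, c the speed of light, and ŝ the unit vector from the radiation source to the satellite. ERP evaluations sum such contributions over all visible Earth surface elements, weighted by their albedo and thermal emission.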
MAJOR MERGERS WITH SMALL GALAXIES: THE DISCOVERY OF A MAGELLANIC-TYPE GALAXY AT z = 0.12
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koch, Andreas; Frank, Matthias J.; Pasquali, Anna
We report on the serendipitous discovery of a star-forming galaxy at redshift z = 0.116 with morphological features that indicate an ongoing merger. This object exhibits two clearly separated components with significantly different colors, plus a possible tidal stream. Follow-up spectroscopy of the bluer component revealed a low star-forming activity of 0.09 M_⊙ yr^-1 and a high metallicity of 12 + log(O/H) = 8.6. Based on comparison with mass–star formation-rate and mass–metallicity relations, and on fitting of spectral energy distributions, we obtain a stellar mass of 3 × 10^9 M_⊙, which renders this object comparable to the Large Magellanic Cloud. Thus our finding provides a further piece of evidence of a major merger already acting on small, dwarf-galaxy-like scales.
NASA Astrophysics Data System (ADS)
Li, Chengyuan; Deng, Licai; de Grijs, Richard; Jiang, Dengkai; Xin, Yu
2018-03-01
The bifurcated patterns in the color–magnitude diagrams of blue straggler stars (BSSs) have attracted significant attention. This type of special (but rare) pattern of two distinct blue straggler sequences is commonly interpreted as evidence that cluster core-collapse-driven stellar collisions are an efficient formation mechanism. Here, we report the detection of a bifurcated blue straggler distribution in a young Large Magellanic Cloud cluster, NGC 2173. Because of the cluster’s low central stellar number density and its young age, dynamical analysis shows that stellar collisions alone cannot explain the observed BSSs. Therefore, binary evolution is instead the most viable explanation of the origin of these BSSs. However, the reason why binary evolution would render the color–magnitude distribution of BSSs bifurcated remains unclear.
Integration of prior knowledge into dense image matching for video surveillance
NASA Astrophysics Data System (ADS)
Menze, M.; Heipke, C.
2014-08-01
Three-dimensional information from dense image matching is a valuable input for a broad range of vision applications. While reliable approaches exist for dedicated stereo setups, they do not easily generalize to more challenging camera configurations. In the context of video surveillance, the typically large spatial extent of the region of interest and repetitive structures in the scene render the application of dense image matching a challenging task. In this paper we present an approach that derives strong prior knowledge from a planar approximation of the scene. This information is integrated into a graph-cut based image matching framework that treats the assignment of optimal disparity values as a labelling task. Introducing the planar prior heavily reduces ambiguities together with the search space and increases computational efficiency. The results provide a proof of concept of the proposed approach. It allows the reconstruction of dense point clouds in more general surveillance camera setups with wider stereo baselines.
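One common way to write such a disparity-labelling problem, shown here only as a hedged illustration (the exact terms and weights used by the authors are not given in the abstract), is as an energy over disparities D(p) that a graph cut minimizes:

$$E(D) = \sum_{p} C\big(p, D(p)\big) \;+\; \lambda \sum_{(p,q)\in\mathcal{N}} V\big(D(p), D(q)\big) \;+\; \mu \sum_{p} \big|D(p) - D_{\mathrm{plane}}(p)\big|,$$

where C is the photometric matching cost, V a smoothness penalty over neighbouring pixel pairs in N, and the last term a planar prior that favours disparities close to those predicted by the scene's planar approximation D_plane.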
Juno Arrival at Jupiter Artist Concept
2015-07-07
This artist's rendering shows NASA's Juno spacecraft making one of its close passes over Jupiter. Launched in 2011, the Juno spacecraft will arrive at Jupiter in 2016 to study the giant planet from an elliptical, polar orbit. Juno will repeatedly dive between the planet and its intense belts of charged particle radiation, traveling from pole to pole in about an hour, and coming within 5,000 kilometers (about 3,000 miles) of the cloud tops at closest approach. Juno's primary goal is to improve our understanding of Jupiter's formation and evolution. The spacecraft will spend a little over a year investigating the planet's origins, interior structure, deep atmosphere and magnetosphere. Juno's study of Jupiter will help us to understand the history of our own solar system and provide new insight into how planetary systems form and develop in our galaxy and beyond. http://photojournal.jpl.nasa.gov/catalog/PIA19639
The origin of recombining plasma and the detection of the Fe-K line in the supernova remnant W 28
NASA Astrophysics Data System (ADS)
Okon, Hiromichi; Uchida, Hiroyuki; Tanaka, Takaaki; Matsumura, Hideaki; Tsuru, Takeshi Go
2018-03-01
Overionized recombining plasmas (RPs) have been discovered from a dozen mixed-morphology (MM) supernova remnants (SNRs). However, their formation process is still under debate. As pointed out by many previous studies, spatial variations of plasma temperature and ionization state provide clues to understanding the physical origin of RPs. We report on spatially resolved X-ray spectroscopy of W 28, which is one of the largest MM SNRs found in our Galaxy. Two observations with Suzaku XIS cover the center of W 28 to the northeastern rim where the shock is interacting with molecular clouds. The X-ray spectra in the inner regions are reproduced well by a combination of two RP models with different temperatures and ionization states, whereas that in the northeastern rim is explained with a single RP model. Our discovery of the RP in the northeastern rim suggests an effect of thermal conduction between the cloud and hot plasma, which may be the production process of the RP. The X-ray spectrum of the northeastern rim also shows an excess emission of the Fe I K α line. The most probable process to explain the line would be inner shell ionization of Fe in the molecular cloud by cosmic ray particles accelerated in W 28.
The Physics and Chemistry of Marine Aerosols
NASA Astrophysics Data System (ADS)
Russell, Lynn M.
Understanding the physics and chemistry of the marine atmosphere requires both predicting the evolution of its gas and aerosol phases and making observations that reflect the processes in that evolution. This work presents a model of the most fundamental physical and chemical processes important in the marine atmosphere, and discusses the current uncertainties in our theoretical understanding of those processes. Backing up these predictions with observations requires improved instrumentation for field measurements of aerosol. One important advance in this instrumentation is described for accelerating the speed of size distribution measurements. Observations of aerosols in the marine boundary layer during the Atlantic Stratocumulus Transition Experiment (ASTEX) provide an illustration of the impact of cloud processing in marine stratus. More advanced measurements aboard aircraft were enabled by redesigning the system for separating particles by differential mobility and counting them by condensational growth. With this instrumentation, observations made during the Monterey Area Ship Tracks (MAST) Experiment have illustrated the role of aerosol emissions of ships in forming tracks in clouds. High-resolution gas chromatography and mass spectrometry were used with samples extracted by supercritical fluid extraction in order to identify the role of combustion organics in forming ship tracks. The results illustrate the need both for more sophisticated models incorporating organic species in cloud activation and for more extensive boundary layer observations.
Fast grasping of unknown objects using principal component analysis
NASA Astrophysics Data System (ADS)
Lei, Qujiang; Chen, Guangming; Wisse, Martijn
2017-09-01
Fast grasping of unknown objects has a crucial impact on the efficiency of robot manipulation, especially in unfamiliar environments. In order to accelerate the grasping of unknown objects, principal component analysis is utilized to direct the grasping process. In particular, a single-view partial point cloud is constructed and grasp candidates are allocated along the principal axis. Force balance optimization is employed to analyze possible graspable areas. The obtained graspable area with the minimal resultant force is the best zone for the final grasping execution. It is shown that an unknown object can be grasped more quickly provided that the principal axis is determined from the single-view partial point cloud. To cope with the grasp uncertainty, robot motion is used to obtain a new viewpoint. Virtual exploration and experimental tests are carried out to verify this fast grasping algorithm. Both simulation and experimental tests demonstrated excellent performance based on the results of grasping a series of unknown objects. To minimize the grasping uncertainty, the merits of the robot hardware with two 3D cameras can be utilized to complete the partial point cloud. As a result of utilizing the robot hardware, grasping reliability is greatly enhanced. Therefore, this research demonstrates practical significance for increasing grasping speed and thus increasing robot efficiency in unpredictable environments.
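A minimal sketch of the principal-axis step described above, assuming a NumPy point cloud; the function and variable names are illustrative and this is not the authors' implementation:

```python
import numpy as np

def principal_axis(points):
    """Estimate the principal axis of a single-view partial point cloud.

    points : (N, 3) array of XYZ coordinates.
    Returns the centroid and the unit vector of the dominant principal component.
    """
    centroid = points.mean(axis=0)
    centered = points - centroid
    # Eigen-decomposition of the 3x3 covariance matrix (classic PCA).
    cov = centered.T @ centered / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)
    axis = eigvecs[:, np.argmax(eigvals)]        # direction of largest variance
    return centroid, axis / np.linalg.norm(axis)

# Hypothetical usage: grasp candidates would then be sampled along this axis.
if __name__ == "__main__":
    cloud = np.random.rand(1000, 3)              # stand-in for a sensed partial cloud
    c, a = principal_axis(cloud)
    print("centroid:", c, "principal axis:", a)
```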
GenomeVIP: a cloud platform for genomic variant discovery and interpretation
Mashl, R. Jay; Scott, Adam D.; Huang, Kuan-lin; Wyczalkowski, Matthew A.; Yoon, Christopher J.; Niu, Beifang; DeNardo, Erin; Yellapantula, Venkata D.; Handsaker, Robert E.; Chen, Ken; Koboldt, Daniel C.; Ye, Kai; Fenyö, David; Raphael, Benjamin J.; Wendl, Michael C.; Ding, Li
2017-01-01
Identifying genomic variants is a fundamental first step toward the understanding of the role of inherited and acquired variation in disease. The accelerating growth in the corpus of sequencing data that underpins such analysis is making the data-download bottleneck more evident, placing substantial burdens on the research community to keep pace. As a result, the search for alternative approaches to the traditional “download and analyze” paradigm on local computing resources has led to a rapidly growing demand for cloud-computing solutions for genomics analysis. Here, we introduce the Genome Variant Investigation Platform (GenomeVIP), an open-source framework for performing genomics variant discovery and annotation using cloud- or local high-performance computing infrastructure. GenomeVIP orchestrates the analysis of whole-genome and exome sequence data using a set of robust and popular task-specific tools, including VarScan, GATK, Pindel, BreakDancer, Strelka, and Genome STRiP, through a web interface. GenomeVIP has been used for genomic analysis in large-data projects such as the TCGA PanCanAtlas and in other projects, such as the ICGC Pilots, CPTAC, ICGC-TCGA DREAM Challenges, and the 1000 Genomes SV Project. Here, we demonstrate GenomeVIP's ability to provide high-confidence annotated somatic, germline, and de novo variants of potential biological significance using publicly available data sets. PMID:28522612
NASA Astrophysics Data System (ADS)
Chen, Sisi; Yau, Man-Kong; Bartello, Peter; Xue, Lulin
2018-05-01
In most previous direct numerical simulation (DNS) studies on droplet growth in turbulence, condensational growth and collisional growth were treated separately. Studies in recent decades have postulated that small-scale turbulence may accelerate droplet collisions while droplets are still small and condensational growth is effective. This implies that both processes should be considered simultaneously to unveil the full history of droplet growth and rain formation. This paper introduces the first direct numerical simulation approach to explicitly study the continuous droplet growth by condensation and collisions inside an adiabatic ascending cloud parcel. Results from the condensation-only, collision-only, and condensation-collision experiments are compared to examine the contribution to the broadening of the droplet size distribution (DSD) by the individual processes and by the combined processes. Simulations of different turbulent intensities are conducted to investigate the impact of turbulence on each process and on the condensation-induced collisions. The results show that the condensational process promotes collisions in a turbulent environment and reduces collisions in still air, indicating a positive impact of condensation on turbulent collisions. This work suggests the necessity of including both processes simultaneously when studying droplet-turbulence interaction to quantify the turbulence effect on the evolution of the cloud droplet spectrum and rain formation.
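For reference, condensational growth in such simulations is commonly described by the standard diffusional growth law (a textbook relation quoted here for orientation, not taken from the paper above):

$$r\,\frac{dr}{dt} = G\,S, \qquad G \approx \left[\frac{\rho_w L_v}{K T}\left(\frac{L_v}{R_v T} - 1\right) + \frac{\rho_w R_v T}{e_s(T)\,D_v}\right]^{-1},$$

where r is the droplet radius, S the supersaturation, and G combines the thermodynamic and diffusional terms (ρ_w water density, L_v latent heat of vaporization, K thermal conductivity, D_v vapour diffusivity, e_s saturation vapour pressure). The 1/r dependence of dr/dt narrows the size distribution under condensation alone, which is why turbulence and collisions are invoked to broaden it.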
An Analytical Method To Compute Comet Cloud Formation Efficiency And Its Application
NASA Astrophysics Data System (ADS)
Brasser, Ramon; Duncan, M. J.
2007-07-01
A quick analytical method is presented for calculating comet cloud formation efficiency in the case of a single planet or multiple-planet system for planets that are not too eccentric (e_p < 0.2). A method to calculate the fraction of comets that stay under the control of each planet is also presented. The location of the planet(s) in mass-semi-major axis space to form a comet cloud is constrained based on the conditions developed by Tremaine (1993) together with estimates of the likelihood of passing comets between planets; and, in the case of a single, eccentric planet, the additional constraint that it is, by itself, able to accelerate material to lower values of Tisserand parameter within the age of the stellar system without sweeping up the majority of the material beforehand. For a single planet, it turns out the efficiency is mainly a function of planetary mass and semi-major axis of the planet and density of the stellar environment. The theory has been applied to some extrasolar systems and compared to numerical simulations for both these systems and the Solar system, as well as a diffusion scheme based on the energy kick distribution of Everhart (1968). Results agree well with analytical predictions.
NASA Astrophysics Data System (ADS)
Lemaitre, P.; Brunel, M.; Rondeau, A.; Porcheron, E.; Gréhan, G.
2015-12-01
According to changes in aircraft certifications rules, instrumentation has to be developed to alert the flight crews of potential icing conditions. The technique developed needs to measure in real time the amount of ice and liquid water encountered by the plane. Interferometric imaging offers an interesting solution: It is currently used to measure the size of regular droplets, and it can further measure the size of irregular particles from the analysis of their speckle-like out-of-focus images. However, conventional image processing needs to be speeded up to be compatible with the real-time detection of icing conditions. This article presents the development of an optimised algorithm to accelerate image processing. The algorithm proposed is based on the detection of each interferogram with the use of the gradient pair vector method. This method is shown to be 13 times faster than the conventional Hough transform. The algorithm is validated on synthetic images of mixed phase clouds, and finally tested and validated in laboratory conditions. This algorithm should have important applications in the size measurement of droplets and ice particles for aircraft safety, cloud microphysics investigation, and more generally in the real-time analysis of triphasic flows using interferometric particle imaging.
The origin of recombining plasma and the detection of the Fe-K line in the supernova remnant W 28
NASA Astrophysics Data System (ADS)
Okon, Hiromichi; Uchida, Hiroyuki; Tanaka, Takaaki; Matsumura, Hideaki; Tsuru, Takeshi Go
2018-06-01
Overionized recombining plasmas (RPs) have been discovered from a dozen mixed-morphology (MM) supernova remnants (SNRs). However, their formation process is still under debate. As pointed out by many previous studies, spatial variations of plasma temperature and ionization state provide clues to understanding the physical origin of RPs. We report on spatially resolved X-ray spectroscopy of W 28, which is one of the largest MM SNRs found in our Galaxy. Two observations with Suzaku XIS cover the center of W 28 to the northeastern rim where the shock is interacting with molecular clouds. The X-ray spectra in the inner regions are reproduced well by a combination of two RP models with different temperatures and ionization states, whereas that in the northeastern rim is explained with a single RP model. Our discovery of the RP in the northeastern rim suggests an effect of thermal conduction between the cloud and hot plasma, which may be the production process of the RP. The X-ray spectrum of the northeastern rim also shows an excess emission of the Fe I K α line. The most probable process to explain the line would be inner shell ionization of Fe in the molecular cloud by cosmic ray particles accelerated in W 28.
Spallation processes and nuclear interaction products of cosmic rays.
Silberberg, R; Tsao, C H
1990-08-01
Most cosmic-ray nuclei heavier than helium have suffered nuclear collisions in the interstellar gas, with transformation of nuclear composition. The isotopic and elemental composition at the sources has to be inferred from the observed composition near the Earth. The source composition permits tests of current ideas on sites of origin, nucleosynthesis in stars, evolution of stars, the mixing and composition of the interstellar medium and injection processes prior to acceleration. The effects of nuclear spallation, production of radioactive nuclides and the time dependence of their decay provide valuable information on the acceleration and propagation of cosmic rays, their nuclear transformations, and their confinement time in the Galaxy. The formation of spallation products that only decay by electron capture and are relatively long-lived permits an investigation of the nature and density fluctuations (like clouds) of the interstellar medium. Since nuclear collisions yield positrons, antiprotons, gamma rays and neutrinos, we shall discuss these topics briefly.
A Cloud-Computing Service for Environmental Geophysics and Seismic Data Processing
NASA Astrophysics Data System (ADS)
Heilmann, B. Z.; Maggi, P.; Piras, A.; Satta, G.; Deidda, G. P.; Bonomi, E.
2012-04-01
Cloud computing is becoming established worldwide as a new high performance computing paradigm that offers formidable possibilities to industry and science. The presented cloud-computing portal, part of the Grida3 project, provides an innovative approach to seismic data processing by combining open-source state-of-the-art processing software and cloud-computing technology, making possible the effective use of distributed computation and data management with administratively distant resources. We substituted the user-side demanding hardware and software requirements by remote access to high-performance grid-computing facilities. As a result, data processing can be done quasi in real time, being ubiquitously controlled via the Internet by a user-friendly web-browser interface. Besides the obvious advantages over locally installed seismic-processing packages, the presented cloud-computing solution creates completely new possibilities for scientific education, collaboration, and presentation of reproducible results. The web-browser interface of our portal is based on the commercially supported grid portal EnginFrame, an open framework based on Java, XML, and Web Services. We selected the hosted applications with the objective of allowing the construction of typical 2D time-domain seismic-imaging workflows as used for environmental studies and, originally, for hydrocarbon exploration. For data visualization and pre-processing, we chose the free software package Seismic Un*x. We ported tools for trace balancing, amplitude gaining, muting, frequency filtering, dip filtering, deconvolution and rendering, with a customized choice of options, as services onto the cloud-computing portal. For structural imaging and velocity-model building, we developed a grid version of the Common-Reflection-Surface stack, a data-driven imaging method that requires no user interaction at run time such as manual picking in prestack volumes or velocity spectra. Due to its high level of automation, CRS stacking can benefit largely from the hardware parallelism provided by the cloud deployment. The resulting output, post-stack section, coherence, and NMO-velocity panels are used to generate a smooth migration-velocity model. Residual static corrections are calculated as a by-product of the stack and can be applied iteratively. As a final step, a time migrated subsurface image is obtained by a parallelized Kirchhoff time migration scheme. Processing can be done step-by-step or using a graphical workflow editor that can launch a series of pipelined tasks. The status of the submitted jobs is monitored by a dedicated service. All results are stored in project directories, where they can be downloaded or viewed directly in the browser. Currently, the portal has access to three research clusters having a total number of 70 nodes with 4 cores each. They are shared with four other cloud-computing applications bundled within the GRIDA3 project. To demonstrate the functionality of our "seismic cloud lab", we will present results obtained for three different types of data, all taken from hydrogeophysical studies: (1) a seismic reflection data set, made of compressional waves from explosive sources, recorded in Muravera, Sardinia; (2) a shear-wave data set from Sardinia; (3) a multi-offset Ground-Penetrating-Radar data set from Larreule, France. The presented work was funded by the government of the Autonomous Region of Sardinia and by the Italian Ministry of Research and Education.
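For orientation only (a standard relation, not taken from the abstract above), the NMO-velocity panels mentioned here rest on the classical hyperbolic moveout of a reflection recorded at source-receiver offset x:

$$t^2(x) = t_0^2 + \frac{x^2}{v_{\mathrm{NMO}}^2},$$

where t_0 is the zero-offset two-way traveltime and v_NMO the normal-moveout velocity estimated from the data; the CRS stack generalizes this single-parameter fit to a multi-parameter traveltime surface, which is what removes the need for manual velocity picking.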
Accelerated optical polymer aging studies for LED luminaire applications
NASA Astrophysics Data System (ADS)
Estupiñán, Edgar; Wendling, Peter; Kostrun, Marijan; Garner, Richard
2013-09-01
There is a need in the lighting industry to design and implement accelerated aging methods that accurately simulate the aging process of LED luminaire components. In response to this need, we have built a flexible and reliable system to study the aging characteristics of optical polymer materials, and we have employed it to study a commercially available LED luminaire diffuser made of PMMA. The experimental system consists of a "Blue LED Emitter" and a working surface. Both the temperatures of the samples and the optical powers of the LEDs are appropriately characterized in the system. Several accelerated aging experiments are carried out at different temperatures and optical powers over a 90 hour period and the measured transmission values are used as inputs to a degradation model derived using plausibility arguments. This model seems capable of predicting the behavior of the material as a function of time, temperature and optical power. The model satisfactorily predicts the measured transmission values of diffusers aged in luminaires at two different times and thus can be used to make application recommendations for this material. Specifically, at 35000 hours (the manufacturer's stated life of the luminaire) and at the typical operational temperature of the diffuser, the model predicts a transmission loss of only a few percent over the original transmission of the material at 450 nm, which renders this material suitable for this application.
GPU accelerated generation of digitally reconstructed radiographs for 2-D/3-D image registration.
Dorgham, Osama M; Laycock, Stephen D; Fisher, Mark H
2012-09-01
Recent advances in programming languages for graphics processing units (GPUs) provide developers with a convenient way of implementing applications which can be executed on the CPU and GPU interchangeably. GPUs are becoming relatively cheap, powerful, and widely available hardware components, which can be used to perform intensive calculations. The last decade of hardware performance developments shows that GPU-based computation is progressing significantly faster than CPU-based computation, particularly if one considers the execution of highly parallelisable algorithms. Future predictions illustrate that this trend is likely to continue. In this paper, we introduce a way of accelerating 2-D/3-D image registration by developing a hybrid system which executes on the CPU and utilizes the GPU for parallelizing the generation of digitally reconstructed radiographs (DRRs). Based on the advancements of the GPU over the CPU, it is timely to exploit the benefits of many-core GPU technology by developing algorithms for DRR generation. Although some previous work has investigated the rendering of DRRs using the GPU, this paper investigates approximations which reduce the computational overhead while still maintaining a quality consistent with that needed for 2-D/3-D registration with sufficient accuracy to be clinically acceptable in certain applications of radiation oncology. Furthermore, by comparing implementations of 2-D/3-D registration on the CPU and GPU, we investigate current performance and propose an optimal framework for PC implementations addressing the rigid registration problem. Using this framework, we are able to render DRR images from a 256×256×133 CT volume in ~24 ms using an NVidia GeForce 8800 GTX and in ~2 ms using NVidia GeForce GTX 580. In addition to applications requiring fast automatic patient setup, these levels of performance suggest image-guided radiation therapy at video frame rates is technically feasible using relatively low cost PC architecture.
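As a point of reference for the DRR generation discussed above, the core operation is a line integral of attenuation through the CT volume followed by Beer-Lambert attenuation. The following is a deliberately simplified parallel-projection sketch in Python/NumPy; the function name, the HU-to-attenuation conversion constant, and the parallel-ray geometry are illustrative assumptions, whereas the paper's GPU implementation casts perspective rays:

```python
import numpy as np

def simple_drr(ct_volume_hu, axis=0, mu_water=0.02):
    """Toy DRR: parallel-ray line integrals through a CT volume.

    ct_volume_hu : 3-D array of Hounsfield units.
    axis         : volume axis along which rays are cast.
    mu_water     : assumed linear attenuation of water (per voxel length).
    Returns a 2-D image of transmitted intensity exp(-sum of attenuation).
    """
    # Approximate conversion from Hounsfield units to linear attenuation.
    mu = mu_water * (1.0 + ct_volume_hu / 1000.0)
    mu = np.clip(mu, 0.0, None)            # air and below attenuate nothing
    path_integral = mu.sum(axis=axis)      # line integral along each parallel ray
    return np.exp(-path_integral)          # Beer-Lambert law

# Hypothetical usage with a random stand-in volume; real use would load CT data.
volume = np.random.randint(-1000, 1500, size=(133, 256, 256)).astype(float)
drr_image = simple_drr(volume)
```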
Detection of high-energy gamma rays from winter thunderclouds.
Tsuchiya, H; Enoto, T; Yamada, S; Yuasa, T; Kawaharada, M; Kitaguchi, T; Kokubun, M; Kato, H; Okano, M; Nakamura, S; Makishima, K
2007-10-19
A report is made on a comprehensive observation of a burstlike gamma-ray emission from thunderclouds on the Sea of Japan, during strong thunderstorms on 6 January 2007. The detected emission, lasting for approximately 40 sec, preceded cloud-to-ground lightning discharges. The burst spectrum, extending to 10 MeV, can be interpreted as consisting of bremsstrahlung photons originating from relativistic electrons. This ground-based observation provides the first clear evidence that strong electric fields in thunderclouds can continuously accelerate electrons beyond 10 MeV prior to lightning discharges.
Optimization of Selected Remote Sensing Algorithms for Embedded NVIDIA Kepler GPU Architecture
NASA Technical Reports Server (NTRS)
Riha, Lubomir; Le Moigne, Jacqueline; El-Ghazawi, Tarek
2015-01-01
This paper evaluates the potential of the embedded Graphics Processing Unit in Nvidia's Tegra K1 for onboard processing. The performance is compared to a general-purpose multi-core CPU and a full-fledged GPU accelerator. This study uses two algorithms: Wavelet Spectral Dimension Reduction of Hyperspectral Imagery and the Automated Cloud-Cover Assessment (ACCA) Algorithm. Tegra K1 achieved 51% for the ACCA algorithm and 20% for the dimension reduction algorithm, as compared to the performance of the high-end 8-core server Intel Xeon CPU, which has 13.5 times higher power consumption.
Copernicus observations of C I: pressures and carbon abundances in diffuse interstellar clouds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jenkins, E.B.; Jura, M.; Loewenstein, M.
1983-07-01
Using the Copernicus satellite, we observed ultraviolet absorption lines of interstellar neutral carbon atoms toward 27 stars. In addition to deriving column densities of C I (both in its ground state and the two excited fine-structure levels), we used our equivalent widths to revise the f-values of some of the C I transitions measured by other investigators. We also observed H_2 from the J = 4 level so that we could compare the rotational excitation of H_2 with the fine-structure excitation of C I. From the amount of fine-structure excitation of C I in each case, we have derived information on the thermal gas pressures within the diffuse clouds. Most clouds have p/k between 10^3 cm^-3 K and 10^4 cm^-3 K, but we found that at least 6% of the C I-bearing material is at p/k > 10^4 cm^-3 K, and one-third of the gas has upper limits for pressure below 10^3 cm^-3 K, assuming temperatures are not appreciably below 20 K. An analysis of radial velocities for the absorption lines showed no distinctive trends for the kinematics of high- or low-pressure gas components. From the apparent lack of acceleration of high-pressure clouds, we conclude that it is unlikely that streaming intercloud material is causing significant ram pressurization. We have compared our results with the predictions for pressure fluctuations caused by supernova explosions in the theory of McKee and Ostriker.
Probing Galactic Center Cosmic-Rays in the X-ray Regime
NASA Astrophysics Data System (ADS)
Zhang, Shuo; Baganoff, Frederick K.; Bulbul, Esra; Miller, Eric D.; Bautz, Mark W.
2017-08-01
The central few hundred parsecs of the Galaxy harbors 5-10% of the molecular gas mass of the entire Milky Way. This central molecular zone exhibits 6.4 keV Fe Kα line and continuum X-ray emission with time variability. The time-variable X-ray emission from the gas clouds is best explained by light echoes of past X-ray outbursts from the central supermassive black hole Sgr A*. However, MeV-GeV cosmic-ray particles may also contribute to a constant X-ray emission component from the clouds, through collisional ionization and bremsstrahlung. Sgr B2 is the densest and most massive cloud in the central molecular zone. It is the only known gas cloud whose X-ray emission has kept fading over the past decade and will soon reach a constant X-ray level in 2017/2018, and thus serves as the best probe for MeV-GeV particles in the central 100 pc of the Galaxy. At the same time, Fe Kα emission has also been discovered from molecular structures beyond the central molecular zone, extending to ~1 kpc from the Galactic center. The X-ray reflection scenario meets challenges this far from the Galactic center, while MeV-GeV cosmic-ray electrons serve as a more natural explanation. Our studies of Sgr B2 and the large-scale molecular structures will for the first time constrain the MeV-GeV particles in the Galactic center, and point to their origin: whether they arise from particle acceleration or dark matter annihilation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matsumoto, Tomoaki; Machida, Masahiro N.; Inutsuka, Shu-ichiro, E-mail: matsu@hosei.ac.jp
2017-04-10
We investigate the formation of circumstellar disks and outflows subsequent to the collapse of molecular cloud cores with the magnetic field and turbulence. Numerical simulations are performed by using an adaptive mesh refinement to follow the evolution up to ∼1000 years after the formation of a protostar. In the simulations, circumstellar disks are formed around the protostars; those in magnetized models are considerably smaller than those in nonmagnetized models, but their size increases with time. The models with stronger magnetic fields tend to produce smaller disks. During evolution in the magnetized models, the mass ratio of a disk to a protostar is approximately constant at ∼1%–10%. The circumstellar disks are aligned according to their angular momentum, and the outflows accelerate along the magnetic field on the 10–100 au scale; this produces a disk that is misaligned with the outflow. The outflows are classified into two types: a magnetocentrifugal wind and a spiral flow. In the latter, because of the geometry, the axis of rotation is misaligned with the magnetic field. The magnetic field has an internal structure in the cloud cores, which also causes misalignment between the outflows and the magnetic field on the scale of the cloud core. The distribution of the angular momentum vectors in a core also has a non-monotonic internal structure. This should create a time-dependent accretion of angular momenta onto the circumstellar disk. Therefore, the circumstellar disks are expected to change their orientation as well as their sizes over their long-term evolution.
Natural 3D content on glasses-free light-field 3D cinema
NASA Astrophysics Data System (ADS)
Balogh, Tibor; Nagy, Zsolt; Kovács, Péter Tamás.; Adhikarla, Vamsi K.
2013-03-01
This paper presents a complete framework for capturing, processing and displaying the free viewpoint video on a large scale immersive light-field display. We present a combined hardware-software solution to visualize free viewpoint 3D video on a cinema-sized screen. The new glasses-free 3D projection technology can support larger audience than the existing autostereoscopic displays. We introduce and describe our new display system including optical and mechanical design considerations, the capturing system and render cluster for producing the 3D content, and the various software modules driving the system. The indigenous display is first of its kind, equipped with front-projection light-field HoloVizio technology, controlling up to 63 MP. It has all the advantages of previous light-field displays and in addition, allows a more flexible arrangement with a larger screen size, matching cinema or meeting room geometries, yet simpler to set-up. The software system makes it possible to show 3D applications in real-time, besides the natural content captured from dense camera arrangements as well as from sparse cameras covering a wider baseline. Our software system on the GPU accelerated render cluster, can also visualize pre-recorded Multi-view Video plus Depth (MVD4) videos on this light-field glasses-free cinema system, interpolating and extrapolating missing views.
A study of acoustic heating and forced convection in the solar corona
NASA Technical Reports Server (NTRS)
Foukal, P. V.
1980-01-01
The S055 EUV spectra were used to perform emission measure and line intensity ratio analyses of loop plasma conditions in a study on the thermodynamics of magnetic loops in the solar corona. The evidence that loops contain plasma hotter than the background corona, and thus require enhanced local dissipation of magnetic or mechanical energy, is discussed. The S055 EUV raster pictures were used to study physical conditions in cool ultraviolet-absorbing clouds in the solar corona, and optical data were used to derive constraints on the dimensions, time scales and optical depths in dark opaque clouds not seen in H alpha and CaK as filaments or prominences. Theoretical modelling of the propagation of magnetically guided acoustic shocks in the solar chromosphere finds it still unlikely that high frequency acoustic shocks could reach the solar corona. Dynamic modelling of spicules shows that such guided slow mode shocks can explain the acceleration of cool spicular material seen high in the corona.
NASA Astrophysics Data System (ADS)
Nobukawa, Kumiko K.; Nobukawa, Masayoshi; Koyama, Katsuji; Yamauchi, Shigeo; Uchiyama, Hideki; Okon, Hiromichi; Tanaka, Takaaki; Uchida, Hiroyuki; Tsuru, Takeshi G.
2018-02-01
Supernova remnants (SNRs) have been prime candidates for Galactic cosmic-ray accelerators. When low-energy cosmic-ray protons (LECRp) collide with interstellar gas, they ionize neutral iron atoms and emit the neutral iron line (Fe I Kα) at 6.40 keV. We search for the iron K-shell line in seven SNRs from the Suzaku archive data of the Galactic plane in the region 6° ≲ l ≲ 40°, |b| < 1°. All of these SNRs interact with molecular clouds. We discover Fe I Kα line emission from five SNRs (W28, Kes 67, Kes 69, Kes 78, and W44). The spectra and morphologies suggest that the Fe I Kα line is produced by interactions between LECRp and the adjacent cold gas. The proton energy density is estimated to be ≳ 10–100 eV cm^-3, which is more than 10 times higher than that in the ambient interstellar medium.
NASA Astrophysics Data System (ADS)
Lin, W.; Xie, S.; Jackson, R. C.; Endo, S.; Vogelmann, A. M.; Collis, S. M.; Golaz, J. C.
2017-12-01
Climate models are known to have difficulty in simulating tropical diurnal convection, which exhibits distinct characteristics over land and open ocean. While the causes are rooted in deficiencies in convective parameterization in general, the lack of representation of mesoscale dynamics in terms of the land-sea breeze, convective organization, and propagation of convection-induced gravity waves also plays a critical role. In this study, the problem is investigated at the process level with the U.S. Department of Energy Accelerated Climate Modeling for Energy (ACME) model in short-term hindcast mode using the Cloud Associated Parameterization Testbed (CAPT) framework. Convective-scale radar retrievals and observation-driven convection-permitting simulations for the Tropical Warm Pool-International Cloud Experiment (TWP-ICE) cases are used to guide the analysis of the underlying processes. The emphasis will be on linking deficiencies in the representation of detailed process elements to the model biases in diurnal convective properties and their contrast among inland, coastal and open ocean conditions.
Kato, Yusuke; Yagi, Hisashi; Kaji, Yuichi; Oshika, Tetsuro; Goto, Yuji
2013-08-30
Corneal dystrophies are genetic disorders resulting in progressive corneal clouding due to the deposition of amyloid fibrils derived from keratoepithelin, also called transforming growth factor β-induced protein (TGFBI). The formation of amyloid fibrils is often accelerated by surfactants such as sodium dodecyl sulfate (SDS). Most eye drops contain benzalkonium chloride (BAC), a cationic surfactant, as a preservative substance. In the present study, we aimed to reveal the role of BAC in the amyloid fibrillation of keratoepithelin-derived peptides in vitro. We used three types of 22-residue synthetic peptides covering Leu110-Glu131 of the keratoepithelin sequence: an R-type peptide with wild-type R124, a C-type peptide with C124 associated with lattice corneal dystrophy type I, and an H-type peptide with H124 associated with granular corneal dystrophy type II. The time courses of spontaneous amyloid fibrillation and seed-dependent fibril elongation were monitored in the presence of various concentrations of BAC or SDS using thioflavin T fluorescence. BAC and SDS accelerated the fibrillation of all synthetic peptides in the absence and presence of seeds. Optimal acceleration occurred near the critical micelle concentration (CMC), which suggests that the unstable and dynamic interactions of keratoepithelin peptides with amphipathic surfactants led to the formation of fibrils. These results suggest that eye drops containing BAC may exacerbate corneal dystrophies and that eye drops without BAC are preferred, especially for patients with corneal dystrophies.
NASA Astrophysics Data System (ADS)
Farrah, S.; Al Yazidi, O.
2016-12-01
The UAE Research Program for Rain Enhancement Science (UAEREP) is an international research initiative designed to advance the science and technology of rain enhancement. It stems from an understanding of the needs of countries suffering from scarcity of fresh water, and from a will to support innovation globally. The Program focuses on the following topics: Climate change, Climate modelling, Climatology, Atmospheric physics, Atmospheric dynamics, Weather modification, Cloud physics, Cloud dynamics, Cloud seeding, Weather radars, Dust modelling, Aerosol physics, Aerosol chemistry, Aerosol/cloud interactions, Water resources, Physics, Numerical modelling, Material science, Nanotechnology, Meteorology, Hydrology, Hydrogeology, Rocket technology, Laser technology, Water sustainability, Remote sensing, Environmental sciences... In 2015, three research teams from Japan, Germany and the UAE, led respectively by Prof. Masataka Murakami, Volker Wulfmeyer and Linda Zou, were awarded. Together, they are addressing the issue of water security through innovative ideas: algorithms and sensors, land cover modification, and nanotechnologies to accelerate condensation. These three projects are now under way, with extensive research and steady progress. This session will be an opportunity to present their latest results as well as to detail the evolution of research in rain enhancement. In 2016, the Program indeed saw a remarkable increase in participation, with 91 pre-proposals from 398 scientists, researchers and technologists affiliated with 180 institutes from 45 countries. The projects submitted are now focusing on modelling to predict weather, autonomous vehicles, rocket technology, lasers or new seeding materials… The science of rain enhancement offers considerable potential in terms of research, development and innovation. Though cloud seeding has been pursued since the late 1940s, it has been viewed as a relatively marginal field of interest for scientists. This benign neglect has recently been replaced by a new drive to overcome the technical obstacles impeding its potential. There is now a real prospect that this science will come of age and play its rightful part in boosting sustainable water supplies for people at risk in arid and semi-arid regions of the world.
Toward a web-based real-time radiation treatment planning system in a cloud computing environment.
Na, Yong Hum; Suh, Tae-Suk; Kapp, Daniel S; Xing, Lei
2013-09-21
To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (named m2.xlarge containing 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an 'on-demand' basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer's constraints. The output plan file from the EC2 is sent to the simple storage service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm^2) from the Varian TrueBeam™ STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. The resultant plans from the cloud computing are identical to PC-based IMRT and VMAT plans, confirming the reliability of the cloud computing platform. This cloud computing infrastructure has been established for a radiation treatment planning. It substantially improves the speed of inverse planning and makes future on-treatment adaptive re-planning possible.
Toward a web-based real-time radiation treatment planning system in a cloud computing environment
NASA Astrophysics Data System (ADS)
Hum Na, Yong; Suh, Tae-Suk; Kapp, Daniel S.; Xing, Lei
2013-09-01
To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (named m2.xlarge containing 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an ‘on-demand’ basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer’s constraints. The output plan file from the EC2 is sent to the simple storage service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm^2) from the Varian TrueBeam™ STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. The resultant plans from the cloud computing are identical to PC-based IMRT and VMAT plans, confirming the reliability of the cloud computing platform. This cloud computing infrastructure has been established for a radiation treatment planning. It substantially improves the speed of inverse planning and makes future on-treatment adaptive re-planning possible.
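The abstract above describes total-variation regularization (TVR) producing piecewise-constant fluence maps. The following is a deliberately small illustrative sketch of a TV-regularized fluence optimization by projected subgradient descent; the function name, the dose-matrix formulation, and all parameter values are assumptions for illustration, not the distributed EC2 optimizer described in the paper:

```python
import numpy as np

def tv_regularized_fluence(D, d_target, lam=0.1, step=1e-3, n_iter=500):
    """Toy TV-regularized fluence optimization.

    Minimizes ||D x - d_target||^2 + lam * sum_i |x_{i+1} - x_i| over nonnegative x
    by projected subgradient descent. D is a (voxels x beamlets) dose-influence matrix.
    """
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad_fit = 2.0 * D.T @ (D @ x - d_target)             # data-fidelity gradient
        grad_tv = lam * np.sign(np.diff(x, prepend=x[0]))     # subgradient of TV term
        grad_tv -= lam * np.sign(np.diff(x, append=x[-1]))
        x = np.maximum(x - step * (grad_fit + grad_tv), 0.0)  # keep fluence nonnegative
    return x

# Hypothetical toy usage: 3 beamlets, 4 dose points.
D_demo = np.array([[1.0, 0.2, 0.0],
                   [0.5, 1.0, 0.3],
                   [0.0, 0.6, 1.0],
                   [0.2, 0.1, 0.8]])
fluence = tv_regularized_fluence(D_demo, d_target=np.ones(4))
```

The TV penalty is what drives neighbouring beamlet weights toward equal values, yielding the piecewise-constant maps that are easier to segment into deliverable apertures.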
NASA Astrophysics Data System (ADS)
Hiranuma, Naruki; Möhler, Ottmar; Kulkarni, Gourihar; Schnaiter, Martin; Vogt, Steffen; Vochezer, Paul; Järvinen, Emma; Wagner, Robert; Bell, David M.; Wilson, Jacqueline; Zelenyuk, Alla; Cziczo, Daniel J.
2016-08-01
Separation of particles that play a role in cloud activation and ice nucleation from interstitial aerosols has become necessary to further understand aerosol-cloud interactions. The pumped counterflow virtual impactor (PCVI), which uses a vacuum pump to accelerate the particles and increase their momentum, provides an accessible option for dynamic and inertial separation of cloud elements. However, the use of a traditional PCVI to extract large cloud hydrometeors is difficult mainly due to its small cut-size diameters (< 5 µm). Here, for the first time we describe a development of an ice-selecting PCVI (IS-PCVI) to separate ice in controlled mixed-phase cloud system based on the particle inertia with the cut-off diameter ≥ 10 µm. We also present its laboratory application demonstrating the use of the impactor under a wide range of temperature and humidity conditions. The computational fluid dynamics simulations were initially carried out to guide the design of the IS-PCVI. After fabrication, a series of validation laboratory experiments were performed coupled with the Aerosol Interaction and Dynamics in the Atmosphere (AIDA) expansion cloud simulation chamber. In the AIDA chamber, test aerosol particles were exposed to the ice supersaturation conditions (i.e., RHice > 100 %), where a mixture of droplets and ice crystals was formed during the expansion experiment. In parallel, the flow conditions of the IS-PCVI were actively controlled, such that it separated ice crystals from a mixture of ice crystals and cloud droplets, which were of diameter ≥ 10 µm. These large ice crystals were passed through the heated evaporation section to remove the water content. Afterwards, the residuals were characterized with a suite of online and offline instruments downstream of the IS-PCVI. These results were used to assess the optimized operating parameters of the device in terms of (1) the critical cut-size diameter, (2) the transmission efficiency and (3) the counterflow-to-input flow ratio. Particle losses were characterized by comparing the residual number concentration to the rejected interstitial particle number concentration. Overall results suggest that the IS-PCVI enables inertial separation of particles with a volume-equivalent particle size in the range of ~ 10-30 µm in diameter with small inadvertent intrusion (~ 5 %) of unwanted particles.
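For context, a standard textbook relation for inertial impactors (not the CFD design equations used for the IS-PCVI above) ties the separation to the particle Stokes number:

$$Stk = \frac{\rho_p\, d_p^{2}\, U\, C_c}{9\,\mu\, D_j},$$

where ρ_p and d_p are the particle density and diameter, U the jet velocity, C_c the Cunningham slip correction, μ the gas viscosity, and D_j the nozzle diameter. Particles whose Stokes number exceeds a critical value of order unity have enough inertia to cross the counterflow and be transmitted, which is how a cut-size diameter of roughly 10 µm is realised.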
Selectively transporting small chiral particles with circularly polarized Airy beams.
Lu, Wanli; Chen, Huajin; Guo, Sandong; Liu, Shiyang; Lin, Zhifang
2018-05-01
Based on the full wave simulation, we demonstrate that a circularly polarized vector Airy beam can selectively transport small chiral particles along a curved trajectory via the chirality-tailored optical forces. The transverse optical forces can draw the chiral particles with different particle chirality towards or away from the intensity maxima of the beam, leading to the selective trapping in the transverse plane. The transversely trapped chiral particles are then accelerated along a curved trajectory of the Airy beam by the chirality-tailored longitudinal scattering force, rendering an alternative way to sort and/or transport chiral particles with specified helicity. Finally, the underlying physics of the chirality induced transverse trap and de-trap phenomena are examined by the analytical theory within the dipole approximation.
Bacterial expression of human kynurenine 3-monooxygenase: solubility, activity, purification.
Wilson, K; Mole, D J; Binnie, M; Homer, N Z M; Zheng, X; Yard, B A; Iredale, J P; Auer, M; Webster, S P
2014-03-01
Kynurenine 3-monooxygenase (KMO) is an enzyme central to the kynurenine pathway of tryptophan metabolism. KMO has been implicated as a therapeutic target in several disease states, including Huntington's disease. Recombinant human KMO protein production is challenging due to the presence of transmembrane domains, which localise KMO to the outer mitochondrial membrane and render KMO insoluble in many in vitro expression systems. Efficient bacterial expression of human KMO would accelerate drug development of KMO inhibitors but until now this has not been achieved. Here we report the first successful bacterial (Escherichia coli) expression of active FLAG™-tagged human KMO enzyme expressed in the soluble fraction and progress towards its purification.
Gravity and the cell: Intracellular structures and Stokes sedimentation
NASA Technical Reports Server (NTRS)
Todd, P.
1977-01-01
Plant and certain animal embryos appear to be responsive to the gravity vector during early stages of development. The concept of particle sedimentation as the basis for the sensing of gravity is investigated using the cells of wheat seedlings, amphibian embryos, and mammals. Exploration of the mammalian cell for sedimenting particles reveals that their existence is unlikely, especially in the presence of a network of microtubules and microfilaments considered to be responsible for intracellular organization. Destruction of these structures renders the cell susceptible to accelerations several times g. Large dense particles, such as chromosomes, nucleoli, and cytoplasmic organelles, are acted upon by forces much larger than that due to gravity, and their positions in the cell appear to be insensitive to gravity.
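Since the title invokes Stokes sedimentation, the relevant scaling is the standard Stokes terminal settling velocity of a small sphere (a textbook relation quoted for orientation, not a result of the report above):

$$v_s = \frac{2}{9}\,\frac{(\rho_p - \rho_f)\,g\,r^2}{\mu},$$

where ρ_p and ρ_f are the particle and fluid densities, r the particle radius, μ the viscosity of the surrounding cytoplasm, and g the acceleration. The r^2 dependence and the high effective viscosity of cytoplasm are what make intracellular sedimentation so slow for organelle-sized particles.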
NASA Astrophysics Data System (ADS)
José Gómez-Navarro, Juan; María López-Romero, José; Palacios-Peña, Laura; Montávez, Juan Pedro; Jiménez-Guerrero, Pedro
2017-04-01
A critical challenge for assessing regional climate change projections lies in improving the estimate of atmospheric aerosol impact on clouds and reducing the uncertainty associated with the use of parameterizations. In this sense, the horizontal grid spacing implemented in state-of-the-art regional climate simulations is typically 10-25 kilometers, meaning that very important processes such as convective precipitation are smaller than a grid box, and therefore need to be parameterized. This causes large uncertainties, as closure assumptions and a number of parameters have to be established by model tuning. Convection is a physical process that may be strongly conditioned by atmospheric aerosols, although the solution of aerosol-cloud interactions in warm convective clouds remains nowadays a very important scientific challenge, rendering parametrization of these complex processes an important bottleneck that is responsible for a great part of the uncertainty in current climate change projections. Therefore, the explicit simulation of convective processes might improve the quality and reliability of the simulations of the aerosol-cloud interactions in a wide range of atmospheric phenomena. Particularly over the Mediterranean, the role of aerosol particles is very important, as this is a crossroads that fuels the mixing of particles from different sources (sea salt, biomass burning, anthropogenic, Saharan dust, etc.). Still, the role of aerosols in extreme events in this area, such as medicanes, has been barely addressed. This work aims at assessing the role of aerosol-atmosphere interaction in medicanes with the help of the regional chemistry/climate on-line coupled model WRF-CHEM run at a convection-permitting resolution. The analysis is based on the exemplary case of the "Rolf" medicane (6-8 November 2011). Using this case study as reference, four sets of simulations are run with two spatial resolutions: one at a convection-permitting configuration of 4 km, and the other at a lower resolution of 12 km, in which case the convection has to be parameterized. Each configuration is used to produce two simulations, including and not including aerosol-radiation-cloud interactions. The comparison of the simulated output at different scales allows us to evaluate the impact of sub-grid scale mixing of precursors on aerosol production. By focusing on these processes at different resolutions, the differences between simulations running at resolutions of 4 km and 12 km can be explored. Preliminary results indicate that the inclusion of aerosol effects, especially sea salt aerosols, may indeed impact the severity of this simulated medicane, and leads to important spatial shifts and differences in intensity of surface precipitation.
Fermi LAT Observations of the Supernova Remnant W28 (G6.4-0.1)
Abdo, A. A.; Ackermann, M.; Ajello, M.; ...
2010-06-30
Here, we present detailed analysis of two gamma-ray sources, 1FGL J1801.3–2322c and 1FGL J1800.5–2359c, that have been found toward the supernova remnant (SNR) W28 with the Large Area Telescope (LAT) on board the Fermi Gamma-ray Space Telescope. 1FGL J1801.3–2322c is found to be an extended source within the boundary of SNR W28, and to extensively overlap with the TeV gamma-ray source HESS J1801–233, which is associated with a dense molecular cloud interacting with the SNR. The gamma-ray spectrum measured with the LAT from 0.2 to 100 GeV can be described by a broken power-law function with a break at ~1 GeV and photon indices of 2.09 ± 0.08 (stat) ± 0.28 (sys) below the break and 2.74 ± 0.06 (stat) ± 0.09 (sys) above the break. Given the clear association between HESS J1801–233 and the shocked molecular cloud and a smoothly connected spectrum in the GeV-TeV band, we consider the origin of the gamma-ray emission in both GeV and TeV ranges to be the interaction between particles accelerated in the SNR and the molecular cloud. The decay of neutral pions produced in interactions between accelerated hadrons and dense molecular gas provides a reasonable explanation for the broadband gamma-ray spectrum. 1FGL J1800.5–2359c, located outside the southern boundary of SNR W28, cannot be resolved. An upper limit on the size of the gamma-ray emission was estimated to be ~16' using events above ~2 GeV under the assumption of a circular shape with uniform surface brightness. It appears to coincide with the TeV source HESS J1800–240B, which is considered to be associated with a dense molecular cloud that contains the ultra compact H II region W28A2 (G5.89–0.39). In conclusion, we found no significant gamma-ray emission in the LAT energy band at the positions of the TeV sources HESS J1800–240A and HESS J1800–240C. The LAT data for HESS J1800–240A combined with the TeV data points indicate a spectral break between 10 GeV and 100 GeV.
Beer tapping: dynamics of bubbles after impact
NASA Astrophysics Data System (ADS)
Mantič-Lugo, V.; Cayron, A.; Brun, P.-T.; Gallaire, F.
2015-12-01
Beer tapping is a well-known prank in which a bottle of beer is impacted from the top by a solid object, usually another bottle, leading to a sudden foam overflow. A description of the shock-driven bubble dynamics leading to foaming is presented based on an experimental and numerical study evoking the following physical picture. First, the solid impact produces a sudden downwards acceleration of the bottle, creating a strong depression in the liquid bulk. The existing bubbles undergo a strong expansion and a sudden contraction ending in their collapse and fragmentation into a large number of small bubbles. Second, the bubble clouds present a large surface-area-to-volume ratio, enhancing the CO2 diffusion from the supersaturated liquid, hence growing rapidly and depleting the CO2. The clouds of bubbles migrate upwards in the form of plumes, pulling the surrounding liquid with them and eventually resulting in the foam overflow. The sudden pressure drop that triggers the bubble dynamics, with a collapse and oscillations, is modelled by the Rayleigh-Plesset equation. The bubble dynamics from impact to collapse occurs over a time (t_b ≃ 800 μs) much longer than the acoustic time scale of the liquid bulk (t_ac = 2H/c ≃ 80 μs), for the experimental container of height H = 6 cm and a speed of sound around c ≃ 1500 m/s. This scale separation, together with the comparison of numerical and experimental results, suggests that the pressure drop is controlled by two parameters: the acceleration of the container and the distance from the bubble to the free surface.
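As an illustration of the modelling approach mentioned above, the following sketch integrates the Rayleigh-Plesset equation for a single bubble subjected to an imposed step-like pressure depression. All parameter values (initial bubble radius, magnitude and duration of the depression, liquid properties) are illustrative assumptions, not values taken from the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative (assumed) parameters for a small CO2 bubble in a beer-like liquid.
rho    = 1000.0     # liquid density, kg/m^3
sigma  = 0.045      # surface tension, N/m
mu     = 1.5e-3     # dynamic viscosity, Pa*s
p_inf0 = 1.0e5      # ambient pressure before impact, Pa
dp     = 0.5e5      # assumed magnitude of the impact-driven depression, Pa
R0     = 50e-6      # initial bubble radius, m

def p_infinity(t):
    """Far-field pressure: a short depression mimicking the impact."""
    return p_inf0 - dp if t < 400e-6 else p_inf0

def rayleigh_plesset(t, y):
    R, Rdot = y
    # Isothermal gas content, in equilibrium with the liquid at t = 0.
    p_gas = (p_inf0 + 2 * sigma / R0) * (R0 / R) ** 3
    rhs = (p_gas - p_infinity(t) - 2 * sigma / R - 4 * mu * Rdot / R) / rho
    Rddot = (rhs - 1.5 * Rdot ** 2) / R
    return [Rdot, Rddot]

# Integrate over the ~800 microseconds from impact to collapse quoted above.
sol = solve_ivp(rayleigh_plesset, (0.0, 800e-6), [R0, 0.0],
                max_step=1e-7, rtol=1e-8)
```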
Heasly, Benjamin S; Cottaris, Nicolas P; Lichtman, Daniel P; Xiao, Bei; Brainard, David H
2014-02-07
RenderToolbox3 provides MATLAB utilities and prescribes a workflow that should be useful to researchers who want to employ graphics in the study of vision and perhaps in other endeavors as well. In particular, RenderToolbox3 facilitates rendering scene families in which various scene attributes and renderer behaviors are manipulated parametrically, enables spectral specification of object reflectance and illuminant spectra, enables the use of physically based material specifications, helps validate renderer output, and converts renderer output to physical units of radiance. This paper describes the design and functionality of the toolbox and discusses several examples that demonstrate its use. We have designed RenderToolbox3 to be portable across computer hardware and operating systems and to be free and open source (except for MATLAB itself). RenderToolbox3 is available at https://github.com/DavidBrainard/RenderToolbox3.
Literman, Robert; Burrett, Alexandria; Bista, Basanta; Valenzuela, Nicole
2018-01-01
The evolutionary lability of sex-determining mechanisms across the tree of life is well recognized, yet the extent of the molecular changes that accompany these repeated transitions remains obscure. Most turtles retain the ancestral temperature-dependent sex determination (TSD), from which multiple transitions to genotypic sex determination (GSD) occurred independently, and two contrasting hypotheses posit the existence or absence of reversals back to TSD. Here we examined the molecular evolution of the coding regions of a set of gene regulators involved in gonadal development in turtles and several other vertebrates. We found slower molecular evolution in turtles and crocodilians compared to other vertebrates, but an acceleration in Trionychia turtles and at some phylogenetic branches demarcating major taxonomic diversification events. Of all gene classes examined, hormone signaling genes, and Srd5a1 in particular, evolve faster in many lineages and especially in turtles. Our data show that sex-linked genes follow neither a ubiquitous nor a uniform pattern of molecular evolution. We then evaluated turtle nucleotide and protein evolution under two evolutionary hypotheses, with or without GSD-to-TSD reversals, and found that when GSD-to-TSD reversals are considered, all transitional branches, irrespective of direction, exhibit accelerated molecular evolution of nucleotide sequences, while GSD-to-TSD transitional branches also show acceleration in protein evolution. Significant changes in predicted secondary structure that may affect protein function were identified in three genes that exhibited hastened evolution in turtles compared to other vertebrates, or in transitional versus non-transitional branches within turtles, rendering them candidates for a key role in the evolution of sex-determining mechanisms in turtles.
Ludwig, Carmen F.; Ullrich, Florian; Leisle, Lilia; Stauber, Tobias; Jentsch, Thomas J.
2013-01-01
CLC anion transporters form dimers that function either as Cl− channels or as electrogenic Cl−/H+ exchangers. CLC channels display two different types of “gates,” “protopore” gates that open and close the two pores of a CLC dimer independently of each other and common gates that act on both pores simultaneously. ClC-7/Ostm1 is a lysosomal 2Cl−/1H+ exchanger that is slowly activated by depolarization. This gating process is drastically accelerated by many CLCN7 mutations underlying human osteopetrosis. Making use of some of these mutants, we now investigate whether slow voltage activation of plasma membrane-targeted ClC-7/Ostm1 involves protopore or common gates. Voltage activation of wild-type ClC-7 subunits was accelerated by co-expressing an excess of ClC-7 subunits carrying an accelerating mutation together with a point mutation rendering these subunits transport-deficient. Conversely, voltage activation of a fast ClC-7 mutant could be slowed by co-expressing an excess of a transport-deficient mutant. These effects did not depend on whether the accelerating mutation localized to the transmembrane part or to cytoplasmic cystathionine-β-synthase (CBS) domains of ClC-7. Combining accelerating mutations in the same subunit did not speed up gating further. No currents were observed when ClC-7 was truncated after the last intramembrane helix. Currents and slow gating were restored when the C terminus was co-expressed by itself or fused to the C terminus of the β-subunit Ostm1. We conclude that common gating underlies the slow voltage activation of ClC-7. It depends on the CBS domain-containing C terminus that does not require covalent binding to the membrane domain of ClC-7. PMID:23983121
Accelerated Brain Aging in Schizophrenia: A Longitudinal Pattern Recognition Study.
Schnack, Hugo G; van Haren, Neeltje E M; Nieuwenhuis, Mireille; Hulshoff Pol, Hilleke E; Cahn, Wiepke; Kahn, René S
2016-06-01
Despite the multitude of longitudinal neuroimaging studies that have been published, a basic question on the progressive brain loss in schizophrenia remains unaddressed: Does it reflect accelerated aging of the brain, or is it caused by a fundamentally different process? The authors used support vector regression, a supervised machine learning technique, to address this question. In a longitudinal sample of 341 schizophrenia patients and 386 healthy subjects with one or more structural MRI scans (1,197 in total), machine learning algorithms were used to build models to predict the age of the brain and the presence of schizophrenia ("schizophrenia score"), based on the gray matter density maps. Age at baseline ranged from 16 to 67 years, and follow-up scans were acquired between 1 and 13 years after the baseline scan. Differences between brain age and chronological age ("brain age gap") and between schizophrenia score and healthy reference score ("schizophrenia gap") were calculated. Accelerated brain aging was calculated from changes in brain age gap between two consecutive measurements. The age prediction model was validated in an independent sample. In schizophrenia patients, brain age was significantly greater than chronological age at baseline (+3.36 years) and progressively increased during follow-up (+1.24 years in addition to the baseline gap). The acceleration of brain aging was not constant: it decreased from 2.5 years/year just after illness onset to about the normal rate (1 year/year) approximately 5 years after illness onset. The schizophrenia gap also increased during follow-up, but more pronounced variability in brain abnormalities at follow-up rendered this increase nonsignificant. The progressive brain loss in schizophrenia appears to reflect two different processes: one relatively homogeneous, reflecting accelerated aging of the brain and related to various measures of outcome, and a more variable one, possibly reflecting individual variation and medication use. Differentiating between these two processes may not only elucidate the various factors influencing brain loss in schizophrenia, but also assist in individualizing treatment.
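The brain-age-gap idea described above can be sketched with a generic support vector regression pipeline. The snippet below uses synthetic placeholder data and scikit-learn's SVR purely to illustrate the workflow (train on healthy subjects, predict brain age, subtract chronological age); it does not reproduce the authors' actual features, preprocessing, or model.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder data: each row stands in for a flattened gray matter density map,
# y holds chronological ages (both assumed for illustration).
rng = np.random.default_rng(0)
X_controls = rng.normal(size=(386, 2000))
y_controls = rng.uniform(16, 67, size=386)

# Train the age-prediction model on healthy subjects only.
model = make_pipeline(StandardScaler(), SVR(kernel="linear", C=1.0))
model.fit(X_controls, y_controls)

# Brain age gap = predicted brain age - chronological age, evaluated in patients.
X_patients = rng.normal(size=(341, 2000))
y_patients = rng.uniform(16, 67, size=341)
brain_age_gap = model.predict(X_patients) - y_patients

# Accelerated aging would then be estimated from the change in this gap
# between two consecutive scans, divided by the inter-scan interval.
```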
Photogrammetric analysis of concrete specimens and structures for condition assessment
NASA Astrophysics Data System (ADS)
D'Amico, Nicolas; Yu, Tzuyang
2016-04-01
Deterioration of civil infrastructure in America demands routine inspection and maintenance to prevent catastrophic failures. Among non-destructive evaluation (NDE) techniques, photogrammetry is an accessible and practical approach for assessing civil infrastructure systems. The objective of this paper is to explore the capabilities of photogrammetry for locating, sizing, and analyzing the remaining capacity of a specimen or system using point cloud data. Geometric interpretations, composed from up to 70 photographs, are analyzed as mesh or point cloud models. In this case study, concrete, which exhibits a large amount of surface texture, was thoroughly examined. The evaluation techniques discussed were applied to concrete cylinder models as well as portions of civil infrastructure including buildings, retaining walls, and bridge abutments. The aim of this paper is to demonstrate the basic analytical functionality of photogrammetry, as well as its applicability to in-situ civil infrastructure systems. In concrete specimens, defect length and location can be evaluated in a fully defined model (one with the maximum number of correctly acquired photographs) with less than 2% error. Error was found to be inversely proportional to the number of acceptable photographs acquired, remaining well under 10% for any model with enough data to render. Furthermore, volumetric stress evaluations were applied using a cross-sectional evaluation technique to locate the critical area and determine the severity of damage. Finally, findings and the accuracy of the results are discussed.
Use of parallel computing in mass processing of laser data
NASA Astrophysics Data System (ADS)
Będkowski, J.; Bratuś, R.; Prochaska, M.; Rzonca, A.
2015-12-01
The first part of the paper describes the rules used to design the algorithms required for parallel computing and discusses the origins of the idea of using graphics processors for large-scale processing of laser scanning data. The next part presents the results of an efficiency assessment performed for an array of different processing options, all of which were substantially accelerated with parallel computing. The processing options comprised the generation of orthophotos using point clouds, coloring of point clouds, transformations, and the generation of a regular grid, as well as advanced processes such as the detection of planes and edges, point cloud classification, and the analysis of data for the purpose of quality control. Most algorithms had to be formulated from scratch in the context of the requirements of parallel computing. A few of the algorithms were based on existing technology developed by the Dephos Software Company and then adapted to parallel computing in the course of this research study. Processing time was determined for each process employed for a typical quantity of data processed, which helped confirm the high efficiency of the proposed solutions and the applicability of parallel computing to the processing of laser scanning data. The high efficiency of parallel computing yields new opportunities in the creation and organization of processing methods for laser scanning data.
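One of the processing options listed above, generation of a regular grid from a point cloud, can be illustrated with a simple CPU-side sketch. The binning scheme below is a generic assumption chosen for illustration; it does not reproduce the GPU implementations developed in the study.

```python
import numpy as np

def point_cloud_to_grid(points, cell_size):
    """Average point heights into a regular grid (simple gridding sketch).

    points    : (N, 3) array of x, y, z laser returns
    cell_size : grid spacing in the same units as x and y
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    ix = ((x - x.min()) / cell_size).astype(int)
    iy = ((y - y.min()) / cell_size).astype(int)
    nx, ny = ix.max() + 1, iy.max() + 1

    height_sum = np.zeros((ny, nx))
    counts = np.zeros((ny, nx))
    np.add.at(height_sum, (iy, ix), z)   # accumulate heights per cell
    np.add.at(counts, (iy, ix), 1)       # count returns per cell

    with np.errstate(invalid="ignore"):
        grid = height_sum / counts       # NaN where a cell received no points
    return grid
```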
An analytical method to compute comet cloud formation efficiency and its application
NASA Astrophysics Data System (ADS)
Brasser, Ramon; Duncan, Martin J.
2008-01-01
A quick analytical method is presented for calculating comet cloud formation efficiency in the case of a single planet or a multiple-planet system, for planets that are not too eccentric (e_p ≲ 0.3). A method to calculate the fraction of comets that stay under the control of each planet is also presented, as well as a way to determine the efficiency in different star cluster environments. The location of the planet(s) in mass-semi-major-axis space required to form a comet cloud is constrained based on the conditions developed by Tremaine (1993), together with estimates of the likelihood of passing comets between planets and, in the case of a single, eccentric planet, the additional constraint that it is, by itself, able to accelerate material to a relative encounter velocity U ~ 0.4 within the age of the stellar system without sweeping up the majority of the material beforehand. For a single planet, it turns out that the efficiency is mainly a function of the planetary mass, the semi-major axis of the planet, and the density of the stellar environment. The theory has been applied to some extrasolar systems and compared to numerical simulations for both these systems and the Solar System, as well as to a diffusion scheme based on the energy kick distribution of Everhart (Astron J 73:1039-1052, 1968). The analytic results are in good agreement with the simulations.
NASA Astrophysics Data System (ADS)
Kaufmann, H. T. C.; Cunha, M. D.; Benilov, M. S.; Hartmann, W.; Wenzel, N.
2017-10-01
A model of cathode spots in high-current vacuum arcs is developed that accounts for all the potentially relevant mechanisms: the bombardment of the cathode surface by ions coming from a pre-existing plasma cloud; vaporization of the cathode material in the spot, its ionization, and the interaction of the produced plasma with the cathode; Joule heat generation in the cathode body; melting of the cathode material and motion of the melt under the effect of the plasma pressure and the Lorentz force; and related phenomena. After the spot has been ignited by the action of the cloud (which takes a few nanoseconds), the metal in the spot is melted and accelerated toward the periphery of the spot, with the main driving force being the pressure due to incident ions. Electron emission cooling and convective heat transfer are the dominant mechanisms of cooling in the spot, limiting the maximum temperature of the cathode to approximately 4700-4800 K. A crater is formed on the cathode surface in this way. After the plasma cloud has been extinguished, a liquid-metal jet is formed and a droplet is ejected. No explosions have been observed. The modeling results conform to estimates of different mechanisms of cathode erosion derived from the experimental data on the net and ion erosion of copper cathodes.
Digital shaded-relief map of Venezuela
Garrity, Christopher P.; Hackley, Paul C.; Urbani, Franco
2004-01-01
The Digital Shaded-Relief Map of Venezuela is a composite of more than 20 tiles of 90 meter (3 arc second) pixel resolution elevation data, captured during the Shuttle Radar Topography Mission (SRTM) in February 2000. The SRTM, a joint project between the National Geospatial-Intelligence Agency (NGA) and the National Aeronautics and Space Administration (NASA), provides the most accurate and comprehensive international digital elevation dataset ever assembled. The 10-day flight mission aboard the U.S. Space Shuttle Endeavour obtained elevation data for about 80% of the world's landmass at 3-5 meter pixel resolution through the use of synthetic aperture radar (SAR) technology. SAR is desirable because it acquires data along continuous swaths, maintaining data consistency across large areas, independent of cloud cover. Swaths were captured at an altitude of 230 km, and are approximately 225 km wide with varying lengths. Rendering of the shaded-relief image required editing of the raw elevation data to remove numerous holes and anomalously high and low values inherent in the dataset. Customized ArcInfo Arc Macro Language (AML) scripts were written to interpolate areas of null values and generalize irregular elevation spikes and wells. Coastlines and major water bodies used as a clipping mask were extracted from 1:500,000-scale geologic maps of Venezuela (Bellizzia and others, 1976). The shaded-relief image was rendered with an illumination azimuth of 315° and an altitude of 65°. A vertical exaggeration of 2X was applied to the image to enhance land-surface features. Image post-processing techniques were accomplished using conventional desktop imaging software.
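The rendering parameters given above (azimuth 315°, sun altitude 65°, vertical exaggeration 2X) map directly onto a standard hillshade formulation. The sketch below is a generic implementation of that formulation for a 90 m elevation grid, not the AML scripts used to produce the map.

```python
import numpy as np

def hillshade(dem, cell_size=90.0, azimuth_deg=315.0, altitude_deg=65.0, z_factor=2.0):
    """Shaded relief from an elevation grid (standard hillshade formulation)."""
    # Convert the compass azimuth to the mathematical angle convention.
    az = np.radians(360.0 - azimuth_deg + 90.0)
    alt = np.radians(altitude_deg)

    # Elevation gradients with the vertical exaggeration applied.
    dz_dy, dz_dx = np.gradient(dem * z_factor, cell_size)
    slope = np.arctan(np.hypot(dz_dx, dz_dy))
    aspect = np.arctan2(dz_dy, -dz_dx)

    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)
```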
An image-processing software package: UU and Fig for optical metrology applications
NASA Astrophysics Data System (ADS)
Chen, Lujie
2013-06-01
Modern optical metrology applications are largely supported by computational methods, such as phase shifting [1], Fourier transform [2], digital image correlation [3], camera calibration [4], etc., in which image processing is a critical and indispensable component. While it is not too difficult to obtain a wide variety of image-processing programs from the internet, few cater to the relatively specialized area of optical metrology. This paper introduces an image-processing software package, UU (data processing) and Fig (data rendering), that incorporates many useful functions for processing optical metrological data. The cross-platform programs UU and Fig are developed based on wxWidgets. At the time of writing, the package has been tested on Windows, Linux, and Mac OS. The user interface is designed to offer precise control of the underlying processing procedures in a scientific manner. The data input/output mechanism is designed to accommodate diverse file formats and to facilitate interaction with other independent programs. In terms of robustness, although the software was initially developed for personal use, it is comparable in stability and accuracy to most commercial software of a similar nature. In addition to functions for optical metrology, the software package has a rich collection of useful tools in the following areas: real-time image streaming from USB and GigE cameras, computational geometry, computer vision, fitting of data, 3D image processing, vector image processing, precision device control (rotary stage, PZT stage, etc.), point cloud to surface reconstruction, volume rendering, batch processing, etc. The software package is currently used in a number of universities for teaching and research.
Can Clouds replace Grids? Will Clouds replace Grids?
NASA Astrophysics Data System (ADS)
Shiers, J. D.
2010-04-01
The world's largest scientific machine - comprising dual 27 km circular proton accelerators cooled to 1.9 K and located some 100 m underground - currently relies on major production Grid infrastructures for the offline computing needs of the 4 main experiments that will take data at this facility. After many years of sometimes difficult preparation the computing service has been declared "open" and ready to meet the challenges that will come shortly when the machine restarts in 2009. But the service is not without its problems: reliability - as seen by the experiments, as opposed to that measured by the official tools - still needs to be significantly improved. Prolonged downtimes or degradations of major services or even complete sites are still too common and the operational and coordination effort to keep the overall service running is probably not sustainable at this level. Recently "Cloud Computing" - in terms of pay-per-use fabric provisioning - has emerged as a potentially viable alternative but with rather different strengths and no doubt weaknesses too. Based on the concrete needs of the LHC experiments - where the total data volume that will be acquired over the full lifetime of the project, including the additional data copies that are required by the Computing Models of the experiments, approaches 1 Exabyte - we analyze the pros and cons of Grids versus Clouds. This analysis covers not only technical issues - such as those related to demanding database and data management needs - but also sociological aspects, which cannot be ignored, neither in terms of funding nor in the wider context of the essential but often overlooked role of science in society, education and economy.
NASA Astrophysics Data System (ADS)
Pecha, Petr; Pechova, Emilie
2014-06-01
This article focuses on the derivation of an effective algorithm for the fast estimation of cloudshine doses/dose rates induced by a large mixture of radionuclides discharged into the atmosphere. A special modification of the classical Gaussian plume approach is proposed for approximation of the near-field dispersion problem. Specifically, the accidental radioactivity release is subdivided into consecutive one-hour Gaussian segments, each driven by a short-term meteorological forecast for the respective hour. Determination of the photon fluence rate from ambient cloud irradiation is coupled to a special decomposition of the Gaussian plume shape into equivalent virtual elliptic disks. This facilitates the solution of the formerly used time-consuming 3-D integration and accelerates the computational process on a local scale. An optimal choice of the integration limit is adopted on the basis of the mean free path of γ-photons in air. An efficient approach is introduced for treatment of the wide energy spectrum of the emitted photons, in which the usual multi-nuclide approach is replaced by a new multi-group scheme. The algorithm is capable of generating the radiological responses at a large network of spatial nodes, which makes the proposed procedure a proper tool for online data assimilation analysis in near-field areas. A specific technique for numerical integration is verified by comparison with a partial analytical solution. Convergence of the finite cloud approximation to the tabulated semi-infinite cloud values for dose conversion factors was validated.
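A single one-hour Gaussian segment of the kind described above can be written down compactly. The sketch below gives the standard ground-reflected Gaussian plume concentration for one segment, with all inputs (release rate, wind speed, dispersion parameters, release height) left as caller-supplied placeholders; the elliptic-disk decomposition of the photon fluence integral is not reproduced here.

```python
import numpy as np

def gaussian_segment_concentration(y, z, Q, u, sigma_y, sigma_z, H):
    """Activity concentration (Bq/m^3) of one Gaussian plume segment.

    y, z    : crosswind and vertical receptor coordinates, m
    Q       : activity released in the segment, treated as a steady rate (Bq/s)
    u       : wind speed for the one-hour forecast interval, m/s
    sigma_y, sigma_z : dispersion parameters at the receptor's downwind distance, m
    H       : effective release height, m
    """
    lateral = np.exp(-0.5 * (y / sigma_y) ** 2)
    # Ground reflection: mirror source at -H.
    vertical = (np.exp(-0.5 * ((z - H) / sigma_z) ** 2)
                + np.exp(-0.5 * ((z + H) / sigma_z) ** 2))
    return Q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical
```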
Sahai, Aakash A; Tsung, Frank S; Tableman, Adam R; Mori, Warren B; Katsouleas, Thomas C
2013-10-01
The relativistically induced transparency acceleration (RITA) scheme of proton and ion acceleration using laser-plasma interactions is introduced, modeled, and compared to the existing schemes. Protons are accelerated with femtosecond relativistic pulses to produce quasimonoenergetic bunches with controllable peak energy. The RITA scheme works by a relativistic laser inducing transparency [Akhiezer and Polovin, Zh. Eksp. Teor. Fiz 30, 915 (1956); Kaw and Dawson, Phys. Fluids 13, 472 (1970); Max and Perkins, Phys. Rev. Lett. 27, 1342 (1971)] to densities higher than the cold-electron critical density, while the background heavy ions are stationary. The rising laser pulse creates a traveling acceleration structure at the relativistic critical density by ponderomotively [Lindl and Kaw, Phys. Fluids 14, 371 (1971); Silva et al., Phys. Rev. E 59, 2273 (1999)] driving a local electron density inflation, creating an electron snowplow and a co-propagating electrostatic potential. The snowplow advances with a velocity determined by the rate of the rise of the laser's intensity envelope and the heavy-ion-plasma density gradient scale length. The rising laser is incrementally rendered transparent to higher densities such that the relativistic-electron plasma frequency is resonant with the laser frequency. In the snowplow frame, trace density protons reflect off the electrostatic potential and get snowplowed, while the heavier background ions are relatively unperturbed. Quasimonoenergetic bunches of velocity equal to twice the snowplow velocity can be obtained and tuned by controlling the snowplow velocity using laser-plasma parameters. An analytical model for the proton energy as a function of laser intensity, rise time, and plasma density gradient is developed and compared to 1D and 2D PIC OSIRIS [Fonseca et al., Lect. Note Comput. Sci. 2331, 342 (2002)] simulations. We model the acceleration of protons to GeV energies with tens-of-femtoseconds laser pulses of a few petawatts. The scaling of proton energy with laser power compares favorably to other mechanisms for ultrashort pulses [Schreiber et al., Phys. Rev. Lett. 97, 045005 (2006); Esirkepov et al., Phys. Rev. Lett. 92, 175003 (2004); Silva et al., Phys. Rev. Lett. 92, 015002 (2004); Fiuza et al., Phys. Rev. Lett. 109, 215001 (2012)].
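Taking the statement above that the reflected protons move at roughly twice the snowplow velocity, the small sketch below converts an assumed snowplow velocity into a proton kinetic energy. The doubling itself is treated in the nonrelativistic reflection limit, so the result is only indicative and not the paper's analytical model.

```python
import numpy as np

C = 299792458.0      # speed of light, m/s
M_P_MEV = 938.272    # proton rest energy, MeV

def proton_energy_from_snowplow(v_snowplow):
    """Kinetic energy (MeV) of a proton reflected to twice the snowplow velocity.

    Valid only while 2*v_snowplow stays safely below c; the velocity doubling
    is the simple reflection limit quoted in the abstract.
    """
    v_p = 2.0 * v_snowplow
    if v_p >= C:
        raise ValueError("2*v_snowplow exceeds c; the simple doubling no longer applies")
    gamma = 1.0 / np.sqrt(1.0 - (v_p / C) ** 2)
    return (gamma - 1.0) * M_P_MEV

# Example: an assumed snowplow at 0.3c reflects protons to ~0.6c, i.e. ~235 MeV.
print(proton_energy_from_snowplow(0.3 * C))
```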
An Offload NIC for NASA, NLR, and Grid Computing
NASA Technical Reports Server (NTRS)
Awrach, James
2013-01-01
This work addresses distributed data management and dynamically configurable high-speed access to data distributed and shared over wide-area high-speed network environments. An offload engine NIC (network interface card) is proposed that scales in n×10-Gbps increments through 100-Gbps full duplex. The Globus de facto standard was used in projects requiring secure, robust, high-speed bulk data transport. Novel extension mechanisms were derived that will combine these technologies for use by GridFTP, bandwidth management resources, and host CPU (central processing unit) acceleration. The result will be wire-rate encrypted Globus grid data transactions through offload for splintering, encryption, and compression. As the need for greater network bandwidth increases, there is an inherent need for faster CPUs. The best way to accelerate CPUs is through a network acceleration engine. Grid computing data transfers for the Globus tool set did not have wire-rate encryption or compression. Existing technology cannot keep pace with the greater bandwidths of backplane and network connections. Present offload engines with ports to Ethernet are 32 to 40 Gbps full duplex at best. The best of the ultra-high-speed offload engines use expensive ASICs (application specific integrated circuits) or NPUs (network processing units). The present state of the art also includes bonding and the use of multiple NICs that are also in the planning stages for future portability to ASICs and software to accommodate data rates at 100 Gbps. The remaining industry solutions are for carrier-grade equipment manufacturers, with costly line cards having multiples of 10-Gbps ports, or 100-Gbps ports such as CFP modules that interface to costly ASICs and related circuitry. All of the existing solutions vary in configuration based on requirements of the host, motherboard, or carrier-grade equipment. The purpose of the innovation is to eliminate data bottlenecks within cluster, grid, and cloud computing systems, and to add several more capabilities while reducing space consumption and cost. Provisions were designed for interoperability with systems used in the NASA HEC (High-End Computing) program. The new acceleration engine consists of state-of-the-art FPGA (field-programmable gate array) core IP, C, and Verilog code; a novel communication protocol; and extensions to the Globus structure. The engine provides the functions of network acceleration, encryption, compression, packet ordering, and security added to Globus grid or cloud data transfer. This system is scalable in n×10-Gbps increments through 100-Gbps full duplex. It can be interfaced to industry-standard system-side or network-side devices or core IP in increments of 10 GigE, scaling to provide IEEE 40/100 GigE compliance.
Kadam, Shantanu; Vanka, Kumar
2013-02-15
Methods based on the stochastic formulation of chemical kinetics have the potential to accurately reproduce the dynamical behavior of various biochemical systems of interest. However, the computational expense makes them impractical for the study of real systems. Attempts to render these methods practical have led to the development of accelerated methods, where the reaction numbers are modeled by Poisson random numbers. However, for certain systems, such methods give rise to physically unrealistic negative numbers for species populations. The methods which make use of binomial variables, in place of Poisson random numbers, have since become popular, and have been partially successful in addressing this problem. In this manuscript, the development of two new computational methods, based on the representative reaction approach (RRA), has been discussed. The new methods endeavor to solve the problem of negative numbers, by making use of tools like the stochastic simulation algorithm and the binomial method, in conjunction with the RRA. It is found that these newly developed methods perform better than other binomial methods used for stochastic simulations, in resolving the problem of negative populations. Copyright © 2012 Wiley Periodicals, Inc.
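To illustrate why binomial sampling avoids negative populations, the sketch below applies binomial tau-leaping to a single decay reaction. This is a textbook construction shown for illustration only, not the representative reaction approach (RRA) methods developed in the manuscript.

```python
import numpy as np

def binomial_tau_leap(x0, k, tau, n_steps, rng=None):
    """Binomial tau-leaping for the decay reaction X -> 0 with rate constant k.

    Sampling the number of firings from Binomial(x, p) with p = 1 - exp(-k*tau)
    can never draw more events than molecules present, so the population stays
    non-negative (unlike a plain Poisson leap, which can overshoot).
    """
    rng = rng or np.random.default_rng()
    x = x0
    trajectory = [x]
    for _ in range(n_steps):
        p = 1.0 - np.exp(-k * tau)      # per-molecule firing probability in one leap
        n_fired = rng.binomial(x, p)    # bounded above by x by construction
        x -= n_fired
        trajectory.append(x)
    return np.array(trajectory)

# Example: 1000 molecules, rate 0.1 per unit time, leaps of 0.5 time units.
traj = binomial_tau_leap(1000, 0.1, 0.5, 40)
```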