Electron Cloud Trapping in Recycler Combined Function Dipole Magnets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antipov, Sergey A.; Nagaitsev, S.
2016-10-04
An electron cloud can lead to a fast instability in intense proton and positron beams in circular accelerators. In the Fermilab Recycler the electron cloud is confined within its combined function magnets. We show that the field of combined function magnets traps the electron cloud, present the results of analytical estimates of trapping, and compare them to numerical simulations of electron cloud formation. The electron cloud is located at the beam center, and up to 1% of the particles can be trapped by the magnetic field. Since the process of electron cloud build-up is exponential, once trapped, this amount of electrons significantly increases the density of the cloud on the next revolution. In a Recycler combined function dipole this multi-turn accumulation allows the electron cloud to reach final intensities orders of magnitude greater than in a pure dipole. The multi-turn build-up can be stopped by injection of a clearing bunch of 10¹⁰ p at any position in the ring.
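The multi-turn accumulation described in this abstract can be illustrated with a back-of-the-envelope iteration (a minimal sketch; the per-turn growth factor and saturation level are assumed numbers for illustration only, with the 1% trapping fraction taken from the abstract, not from the authors' simulation):

```python
# Minimal sketch of multi-turn electron-cloud accumulation with trapping.
# Assumed numbers: exponential build-up during one beam passage, capped at an
# assumed saturation level n_sat; a trapped fraction f_trap survives to seed
# the next revolution.

f_trap = 0.01          # fraction of electrons trapped by the magnetic field (from the abstract)
growth_per_turn = 1e3  # assumed single-pass amplification of the seed population
n_sat = 1e12           # assumed saturation level (arbitrary units)
n_seed = 1.0           # initial seed electrons from residual-gas ionization (arbitrary units)

n = n_seed
for turn in range(10):
    n = min(n * growth_per_turn, n_sat)   # exponential build-up, capped at saturation
    print(f"turn {turn}: cloud = {n:.2e}")
    n = f_trap * n                        # only trapped electrons seed the next turn

# With f_trap * growth_per_turn > 1 the seed grows turn after turn and the cloud
# reaches saturation after a few revolutions; with f_trap = 0 (pure dipole) the
# cloud restarts from n_seed every turn and stays orders of magnitude lower.
```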
Fast instability caused by electron cloud in combined function magnets
Antipov, S. A.; Adamson, P.; Burov, A.; ...
2017-04-10
One of the factors which may limit the intensity in the Fermilab Recycler is a fast transverse instability. It develops within a hundred turns and, in certain conditions, may lead to a beam loss. The high rate of the instability suggests that its cause is electron cloud. Here, we studied the phenomenon by observing the dynamics of stable and unstable beams, simulating numerically the build-up of the electron cloud, and developing an analytical model of an electron cloud driven instability with the electrons trapped in combined function dipoles. We also found that beam motion can be stabilized by a clearing bunch, which confirms the electron cloud nature of the instability. The clearing suggests electron cloud trapping in Recycler combined function magnets. Numerical simulations show that up to 1% of the particles can be trapped by the magnetic field. Since the process of electron cloud build-up is exponential, once trapped, this amount of electrons significantly increases the density of the cloud on the next revolution. Furthermore, in a Recycler combined function dipole this multi-turn accumulation allows the electron cloud to reach final intensities orders of magnitude greater than in a pure dipole. The estimated resulting instability growth rate of about 30 revolutions and the mode frequency of 0.4 MHz are consistent with experimental observations and agree with the simulation in the PEI code. The created instability model allows investigating the beam stability for future intensity upgrades.
Fast Transverse Beam Instability Caused by Electron Cloud Trapped in Combined Function Magnets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antipov, Sergey
Electron cloud instabilities affect the performance of many circular high-intensity particle accelerators. They usually have a fast growth rate and might lead to an increase of the transverse emittance and beam loss. A peculiar example of such an instability is observed in the Fermilab Recycler proton storage ring. Although this instability might pose a challenge for future intensity upgrades, its nature had not been completely understood. The phenomenon has been studied experimentally by comparing the dynamics of stable and unstable beams, numerically by simulating the build-up of the electron cloud and its interaction with the beam, and analytically by constructing a model of an electron cloud driven instability with the electrons trapped in combined function dipoles. Stabilization of the beam by a clearing bunch reveals that the instability is caused by the electron cloud trapped in beam optics magnets. Measurements of microwave propagation confirm the presence of the cloud in the combined function dipoles. Numerical simulations show that up to 10⁻² of the particles can be trapped by their magnetic field. Since the process of electron cloud build-up is exponential, once trapped, this amount of electrons significantly increases the density of the cloud on the next revolution. In a combined function dipole this multi-turn accumulation allows the electron cloud to reach final intensities orders of magnitude greater than in a pure dipole. The estimated fast instability growth rate of about 30 revolutions and low mode frequency of 0.4 MHz are consistent with experimental observations and agree with the simulations. The created instability model allows investigating the beam stability for future intensity upgrades.
Electron-Cloud Build-Up: Theory and Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furman, M. A.
We present a broad-brush survey of the phenomenology, history and importance of the electron-cloud effect (ECE). We briefly discuss the simulation techniques used to quantify the electron-cloud (EC) dynamics. Finally, we present in more detail an effective theory to describe the EC density build-up in terms of a few effective parameters. For further details, the reader is encouraged to refer to the proceedings of many prior workshops, either dedicated to EC or with significant EC contents, including the entire 'ECLOUD' series. In addition, the proceedings of the various flavors of Particle Accelerator Conferences contain a large number of EC-related publications. The ICFA Beam Dynamics Newsletter series contains one dedicated issue, and several occasional articles, on EC. An extensive reference database is the LHC website on EC.
Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.
Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei
2011-09-07
Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
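The master/worker splitting of histories described above can be sketched with mpi4py (a generic illustration of distributing independent Monte Carlo histories across nodes and aggregating the tallies, not the authors' EGS5/Python implementation; run_histories is a hypothetical placeholder for the transport kernel):

```python
# Sketch: distribute independent MC histories over worker ranks and aggregate results.
from mpi4py import MPI

def run_histories(n):
    """Hypothetical stand-in for an EGS5-style transport kernel: returns a partial tally."""
    return float(n)  # placeholder result

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

total_histories = 1_000_000
my_share = total_histories // size + (1 if rank < total_histories % size else 0)

partial = run_histories(my_share)                  # each node simulates its share independently
total = comm.reduce(partial, op=MPI.SUM, root=0)   # results aggregated on the master node

if rank == 0:
    print("aggregated tally:", total)
```

Because the histories are statistically independent, the wall-clock time scales roughly inversely with the number of ranks, which is the behavior the abstract reports for large simulations.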
Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure
NASA Astrophysics Data System (ADS)
Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei
2011-09-01
Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. This work was presented in part at the 2010 Annual Meeting of the American Association of Physicists in Medicine (AAPM), Philadelphia, PA.
Biotic games and cloud experimentation as novel media for biophysics education
NASA Astrophysics Data System (ADS)
Riedel-Kruse, Ingmar; Blikstein, Paulo
2014-03-01
First-hand, open-ended experimentation is key for effective formal and informal biophysics education. We developed, tested and assessed multiple new platforms that enable students and children to directly interact with and learn about microscopic biophysical processes: (1) biotic games that enable local and online play using galvano- and photo-tactic stimulation of micro-swimmers, illustrating concepts such as biased random walks, low-Reynolds-number hydrodynamics, and Brownian motion; (2) an undergraduate course where students learn optics, electronics, micro-fluidics, real-time image analysis, and instrument control by building biotic games; and (3) a graduate class on the biophysics of multi-cellular systems that contains a cloud experimentation lab enabling students to execute open-ended chemotaxis experiments on slime molds online, analyze their data, and build biophysical models. Our work aims to generate the equivalent excitement and educational impact for biophysics as robotics and video games have had for mechatronics and computer science, respectively. We also discuss how scaled-up cloud experimentation systems can support MOOCs with true lab components and life-science research in general.
Electron temperatures within magnetic clouds between 2 and 4 AU: Voyager 2 observations
NASA Astrophysics Data System (ADS)
Sittler, E. C.; Burlaga, L. F.
1998-08-01
We have performed an analysis of Voyager 2 plasma electron observations within magnetic clouds between 2 and 4 AU identified by Burlaga and Behannon [1982]. The analysis has been confined to three of the magnetic clouds identified by Burlaga and Behannon that had high-quality data. The general properties of the plasma electrons within a magnetic cloud are that (1) the moment electron temperature anticorrelates with the electron density within the cloud, (2) the ratio Te/Tp tends to be >1, and (3) on average, Te/Tp ~ 7.0. All three results are consistent with previous electron observations within magnetic clouds. Detailed analyses of the core and halo populations within the magnetic clouds show no evidence of either an anticorrelation between the core temperature TC and the electron density Ne or an anticorrelation between the halo temperature TH and the electron density. Within the magnetic clouds the halo component can contribute more than 50% of the electron pressure. The anticorrelation of Te relative to Ne can be traced to the density of the halo component relative to the density of the core component. The core electrons dominate the electron density. When the density goes up, the halo electrons contribute less to the electron pressure, so we get a lower Te. When the electron density goes down, the halo electrons contribute more to the electron pressure, and Te goes up. We find a relation between the electron pressure and density of the form Pe = αNe^γ with γ ~ 0.5.
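The reported pressure-density relation Pe = αNe^γ with γ ~ 0.5 is the kind of power law that can be recovered from data by a straight-line fit in log-log space; a minimal sketch with synthetic numbers (not the Voyager 2 data) is:

```python
# Sketch: fit P_e = alpha * N_e**gamma by linear regression in log-log space.
import numpy as np

rng = np.random.default_rng(0)
Ne = np.logspace(-2, 0, 50)                      # synthetic electron densities (arbitrary units)
Pe = 2.0 * Ne**0.5 * rng.lognormal(0, 0.05, 50)  # synthetic pressures with multiplicative scatter

gamma, log_alpha = np.polyfit(np.log(Ne), np.log(Pe), 1)  # slope = gamma, intercept = ln(alpha)
print(f"gamma ~ {gamma:.2f}, alpha ~ {np.exp(log_alpha):.2f}")
```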
NASA Astrophysics Data System (ADS)
Romano, Annalisa; Boine-Frankenheim, Oliver; Buffat, Xavier; Iadarola, Giovanni; Rumolo, Giovanni
2018-06-01
At the beginning of the 2016 run, an anomalous beam instability was systematically observed at the CERN Large Hadron Collider (LHC). Its main characteristic was that it spontaneously appeared after beams had been stored for several hours in collision at 6.5 TeV to provide data for the experiments, despite large chromaticity values and high strength of the Landau-damping octupole magnet. The instability exhibited several features characteristic of those induced by the electron cloud (EC). Indeed, when LHC operates with 25 ns bunch spacing, an EC builds up in a large fraction of the beam chambers, as revealed by several independent indicators. Numerical simulations have been carried out in order to investigate the role of the EC in the observed instabilities. It has been found that the beam intensity decay is unfavorable for the beam stability when LHC operates in a strong EC regime.
Electron-cloud build-up in hadron machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furman, M.A.
2004-08-09
The first observations of electron-proton coupling effect for coasting beams and for long-bunch beams were made at the earliest proton storage rings at the Budker Institute of Nuclear Physics (BINP) in the mid-60's [1]. The effect was mainly a form of the two-stream instability. This phenomenon reappeared at the CERN ISR in the early 70's, where it was accompanied by an intense vacuum pressure rise. When the ISR was operated in bunched-beam mode while testing aluminum vacuum chambers, a resonant effect was observed in which the electron traversal time across the chamber was comparable to the bunch spacing [2]. This effect ("beam-induced multipacting"), being resonant in nature, is a dramatic manifestation of an electron cloud sharing the vacuum chamber with a positively-charged beam. An electron-cloud-induced instability has been observed since the mid-80's at the PSR (LANL) [3]; in this case, there is a strong transverse instability accompanied by fast beam losses when the beam current exceeds a certain threshold. The effect was observed for the first time for a positron beam in the early 90's at the Photon Factory (PF) at KEK, where the most prominent manifestation was a coupled-bunch instability that was absent when the machine was operated with an electron beam under otherwise identical conditions [4]. Since then, with the advent of ever more intense positron and hadron beams, and the development and deployment of specialized electron detectors [5-9], the effect has been observed directly or indirectly, and sometimes studied systematically, at most lepton and hadron machines when operated with sufficiently intense beams. The effect is expected in various forms and to various degrees in accelerators under design or construction. The electron-cloud effect (ECE) has been the subject of various meetings [10-15]. Two excellent reviews, covering the phenomenology, measurements, simulations and historical development, have been recently given by Frank Zimmermann [16,17]. In this article we focus on the mechanisms of electron-cloud buildup and dissipation for hadronic beams, particularly those with very long, intense, bunches.
Protection of electronic health records (EHRs) in cloud.
Alabdulatif, Abdulatif; Khalil, Ibrahim; Mai, Vu
2013-01-01
EHR technology has come into widespread use and has attracted attention in healthcare institutions as well as in research. Cloud services are used to build efficient EHR systems and obtain the greatest benefits of EHR implementation. Many issues relating to building an ideal EHR system in the cloud, especially the tradeoff between flexibility and security, have recently surfaced. The privacy of patient records in cloud platforms is still a point of contention. In this research, we are going to improve the management of access control by restricting participants' access through the use of distinct encrypted parameters for each participant in the cloud-based database. Also, we implement and improve an existing secure index search algorithm to enhance the efficiency of information control and flow through a cloud-based EHR system. At the final stage, we contribute to the design of reliable, flexible and secure access control, enabling quick access to EHR information.
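One common way to realize a secure index for keyword search over encrypted records is to store keyed hashes of keywords instead of plaintext, with a distinct key per participant; the sketch below is a generic HMAC-based illustration under our own assumptions, not the algorithm used in the paper:

```python
# Sketch: per-participant keyed keyword index; the server matches HMAC "trapdoors"
# without ever seeing plaintext keywords. All names and keys are illustrative.
import hmac, hashlib

def trapdoor(key: bytes, keyword: str) -> str:
    return hmac.new(key, keyword.lower().encode(), hashlib.sha256).hexdigest()

def build_index(key: bytes, record_id: str, keywords) -> dict:
    return {trapdoor(key, kw): record_id for kw in keywords}

participant_key = b"distinct-secret-per-participant"   # assumption: one key per participant
index = build_index(participant_key, "ehr-001", ["diabetes", "insulin"])

query = trapdoor(participant_key, "insulin")            # trapdoor generated on the client
print(index.get(query, "no match"))                     # server-side lookup on hashed terms only
```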
Lightweight Electronic Camera for Research on Clouds
NASA Technical Reports Server (NTRS)
Lawson, Paul
2006-01-01
"Micro-CPI" (wherein "CPI" signifies "cloud-particle imager") is the name of a small, lightweight electronic camera that has been proposed for use in research on clouds. It would acquire and digitize high-resolution (3- m-pixel) images of ice particles and water drops at a rate up to 1,000 particles (and/or drops) per second.
Weaver, Charlotte A; Teenier, Pamela
2014-01-01
Health care organizations have long been limited to a small number of major vendors in their selection of an electronic health record (EHR) system in the national and international marketplace. These major EHR vendors have in common base systems that are decades old, are built in antiquated programming languages, use outdated server architecture, and are based on inflexible data models [1,2]. The option to upgrade their technology to keep pace with the power of new web-based architecture, programming tools and cloud servers is not easily undertaken due to large client bases, development costs and risk [3]. This paper presents the decade-long efforts of a large national provider of home health and hospice care to select an EHR product, failing that, to build their own, and, failing that initiative, to go back into the market in 2012. The decade-long delay had allowed new technologies and more nimble vendors to enter the market. Partnering with a new start-up company building web- and cloud-based architecture for the home health and hospice market made it possible to build, test and implement an operational and point-of-care system in 264 home health locations across 40 states and three time zones in the United States. This option of "starting over" with the new web and cloud technologies may be producing a next generation of EHR vendors, retelling in healthcare the story of the iPhone replacing the BlackBerry.
The thinking of Cloud computing in the digital construction of the oil companies
NASA Astrophysics Data System (ADS)
CaoLei, Qizhilin; Dengsheng, Lei
In order to speed up the digital construction of oil companies and to enhance productivity and decision-support capabilities, while avoiding the waste and duplicated development that came with the original approach to digital construction, this paper presents a cloud-based model for the digital construction of oil companies. National oil companies connect their cloud data and service-center equipment over a private network and integrate them into a single cloud system; each department then provisions its own virtual service center according to its needs, which provides strong services and computing power for the oil companies.
3D Building Reconstruction by Multiview Images and the Integrated Application with Augmented Reality
NASA Astrophysics Data System (ADS)
Hwang, Jin-Tsong; Chu, Ting-Chen
2016-10-01
This study presents an approach wherein photographs with a high degree of overlap are captured using a digital camera and used to generate three-dimensional (3D) point clouds via feature point extraction and matching. To reconstruct a building model, an unmanned aerial vehicle (UAV) is used to capture photographs from vertical shooting angles above the building. Multiview images are taken from the ground to eliminate the shielding effect on UAV images caused by trees. Point clouds from the UAV and multiview images are generated via Pix4Dmapper. By merging the two sets of point clouds via tie points, the complete building model is reconstructed. The 3D models are reconstructed using AutoCAD 2016 to generate vectors from the point clouds; SketchUp Make 2016 is used to rebuild a complete building model with textures. To apply 3D building models in urban planning and design, a modern approach is to rebuild the digital models; however, replacing the landscape design and building distribution in real time is difficult as the frequency of building replacement increases. One potential solution to these problems is augmented reality (AR). Using Unity3D and Vuforia to design and implement the smartphone application service, a markerless AR of the building model can be built. This study is aimed at providing technical and design skills related to urban planning, urban designing, and building information retrieval using AR.
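Merging two point clouds via tie points amounts to estimating a rigid (or similarity) transform between corresponding points. A least-squares solution via SVD (Kabsch/Umeyama style) is sketched below under the assumption that tie points are already paired; this is a generic illustration, not the Pix4Dmapper workflow, and the coordinates are placeholders:

```python
# Sketch: rigid alignment of two point sets from paired tie points (Kabsch/SVD).
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Return R, t minimizing ||R @ src_i + t - dst_i||^2 over paired 3D tie points."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

# Hypothetical paired tie points picked in both clouds (UAV cloud -> ground cloud frame).
uav_ties = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
gnd_ties = uav_ties @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]]).T + [5.0, 2.0, 0.3]

R, t = rigid_transform(uav_ties, gnd_ties)
merged_uav = uav_ties @ R.T + t          # UAV cloud expressed in the ground-cloud frame
print(np.allclose(merged_uav, gnd_ties))
```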
Scaling predictive modeling in drug development with cloud computing.
Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola
2015-01-26
Growing data sets with increased time for analysis is hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compare with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.
Large ionospheric disturbances produced by the HAARP HF facility
NASA Astrophysics Data System (ADS)
Bernhardt, Paul A.; Siefring, Carl L.; Briczinski, Stanley J.; McCarrick, Mike; Michell, Robert G.
2016-07-01
The enormous transmitter power, fully programmable antenna array, and agile frequency generation of the High Frequency Active Auroral Research Program (HAARP) facility in Alaska have allowed the production of unprecedented disturbances in the ionosphere. Using both pencil beams and conical (or twisted) beam transmissions, artificial ionization clouds have been generated near the second, third, fourth, and sixth harmonics of the electron gyrofrequency. The conical beam has been used to sustain these clouds for up to 5 h as opposed to less than 30 min durations produced using pencil beams. The largest density plasma clouds have been produced at the highest harmonic transmissions. Satellite radio transmissions at 253 MHz from the National Research Laboratory TACSat4 communications experiment have been severely disturbed by propagating through artificial plasma regions. The scintillation levels for UHF waves passing through artificial ionization clouds from HAARP are typically 16 dB. This is much larger than previously reported scintillations at other HF facilities which have been limited to 3 dB or less. The goals of future HAARP experiments should be to build on these discoveries to sustain plasma densities larger than that of the background ionosphere for use as ionospheric reflectors of radio signals.
cryoem-cloud-tools: A software platform to deploy and manage cryo-EM jobs in the cloud.
Cianfrocco, Michael A; Lahiri, Indrajit; DiMaio, Frank; Leschziner, Andres E
2018-06-01
Access to streamlined computational resources remains a significant bottleneck for new users of cryo-electron microscopy (cryo-EM). To address this, we have developed tools that will submit cryo-EM analysis routines and atomic model building jobs directly to Amazon Web Services (AWS) from a local computer or laptop. These new software tools ("cryoem-cloud-tools") have incorporated optimal data movement, security, and cost-saving strategies, giving novice users access to complex cryo-EM data processing pipelines. Integrating these tools into the RELION processing pipeline and graphical user interface, we determined a 2.2 Å structure of β-galactosidase in ∼55 hours on AWS. We implemented a similar strategy to submit Rosetta atomic model building and refinement to AWS. These software tools dramatically reduce the barrier for entry of new users to cloud computing for cryo-EM and are freely available at cryoem-tools.cloud.
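Submitting jobs to AWS from a local machine ultimately reduces to programmatic instance launches of this kind; the sketch below uses generic boto3 calls (not the cryoem-cloud-tools interface), and the AMI ID, instance type, and key name are placeholders:

```python
# Sketch: launch a single on-demand AWS EC2 worker for a processing job (generic boto3,
# not the cryoem-cloud-tools API). All identifiers below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI with the processing stack installed
    InstanceType="p2.xlarge",          # placeholder GPU instance type
    KeyName="my-keypair",              # placeholder SSH key pair
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```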
Beam tests of beampipe coatings for electron cloud mitigation in Fermilab Main Injector
Backfish, Michael; Eldred, Jeffrey; Tan, Cheng Yang; ...
2015-10-26
Electron cloud beam instabilities are an important consideration in virtually all high-energy particle accelerators and could pose a formidable challenge to forthcoming high-intensity accelerator upgrades. Dedicated tests have shown beampipe coatings dramatically reduce the density of electron cloud in particle accelerators. In this work, we evaluate the performance of titanium nitride, amorphous carbon, and diamond-like carbon as beampipe coatings for the mitigation of electron cloud in the Fermilab Main Injector. Altogether our tests represent 2700 ampere-hours of proton operation spanning five years. Three electron cloud detectors, retarding field analyzers, are installed in a straight section and allow a direct comparison between the electron flux in the coated and uncoated stainless steel beampipe. We characterize the electron flux as a function of intensity up to a maximum of 50 trillion protons per cycle. Each beampipe material conditions in response to electron bombardment from the electron cloud, and we track the changes in these materials as a function of time and the number of absorbed electrons. Contamination from an unexpected vacuum leak revealed a potential vulnerability in the amorphous carbon beampipe coating. We measure the energy spectrum of electrons incident on the stainless steel, titanium nitride and amorphous carbon beampipes. We find the electron cloud signal is highly sensitive to stray magnetic fields and bunch length over the Main Injector ramp cycle. In conclusion, we conduct a complete survey of the stray magnetic fields at the test station and compare the electron cloud signal to that in a field-free region.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubin, David L.
2015-01-23
Accelerators that collide high energy beams of matter and anti-matter are essential tools for the investigation of the fundamental constituents of matter, and the search for new forms of matter and energy. A "Linear Collider" is a machine that would bring high energy and very compact bunches of electrons and positrons (anti-electrons) into head-on collision. Such a machine would produce (among many other things) the newly discovered Higgs particle, enabling a detailed study of its properties. Among the most critical and challenging components of a linear collider are the damping rings that produce the very compact and intense beams of electrons and positrons that are to be accelerated into collision. Hot dilute particle beams are injected into the damping rings, where they are compressed and cooled. The size of the positron beam must be reduced more than a thousand fold in the damping ring, and this compression must be accomplished in a fraction of a second. The cold compact beams are then extracted from the damping ring and accelerated into collision at high energy. The proposed International Linear Collider (ILC) would require damping rings that routinely produce such cold, compact and intense beams. The goal of the Cornell study was a credible design for the damping rings for the ILC. Among the technical challenges of the damping rings are the development of instrumentation that can measure the properties of the very small beams in a very narrow window of time, and mitigation of the forces that can destabilize the beams and prevent adequate cooling, or, worse, lead to beam loss. One of the most pernicious destabilizing forces is due to the formation of clouds of electrons in the beam pipe. The electron cloud effect is a phenomenon in particle accelerators in which a high density of low energy electrons builds up inside the vacuum chamber. At the outset of the study, it was anticipated that electron cloud effects would limit the intensity of the positron ring, and that an instability associated with residual gas in the beam pipe would limit the intensity of the electron ring. It was also not clear whether the required very small beam size could be achieved. The results of this study are important contributions to the design of both the electron and positron damping rings in which all of those challenges are addressed and overcome. Our findings are documented in the ILC Technical Design Report, a document that represents the work of an international collaboration of scientists. Our contributions include design of the beam magnetic optics for the 3 km circumference damping rings, the vacuum system and surface treatments for electron cloud mitigation, the design of the guide field magnets, design of the superconducting damping wigglers, and new detectors for precision measurement of beam properties. Our study informed the specification of the basic design parameters for the damping rings, including alignment tolerances, magnetic field errors, and instrumentation. We developed electron cloud modelling tools and simulations to aid in the interpretation of the measurements that we carried out in the Cornell Electron-positron Storage Ring (CESR). The simulations provide a means for systematic extrapolation of our measurements at CESR to the proposed ILC damping rings, and ultimately to specify how the beam pipes should be fabricated in order to minimize the effects of the electron cloud. With the conclusion of this study, the design of the essential components of the damping rings is complete, including the development and characterization (with computer simulations) of the beam optics, specification of techniques for minimizing beam size, design of damping ring instrumentation, R&D into electron cloud suppression methods, tests of long term durability of electron cloud coatings, and design of damping ring vacuum system components.
LIDAR Point Cloud Data Extraction and Establishment of 3D Modeling of Buildings
NASA Astrophysics Data System (ADS)
Zhang, Yujuan; Li, Xiuhai; Wang, Qiang; Liu, Jiang; Liang, Xin; Li, Dan; Ni, Chundi; Liu, Yan
2018-01-01
This paper applies Shepard's method to the original LIDAR point cloud data to generate a regular-grid DSM, filters the ground and non-ground point clouds with a double least squares method, and obtains the regularized DSM. A region growing method is then used to segment the regularized DSM and remove the non-building point cloud, yielding the building point cloud information. The Canny operator is used to extract the building edges from the segmented result, and Hough transform line detection is applied to regularize the extracted building edges so that they are smooth and uniform. Finally, the E3De3 software is used to establish the 3D model of the buildings.
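A simplified version of this pipeline, inverse-distance (Shepard) gridding of the points followed by Canny edge detection and Hough line extraction, can be sketched with numpy/scipy/scikit-image; this is a minimal illustration under our own assumptions (the ground filtering and region-growing steps are omitted, and the point cloud is synthetic), not the authors' implementation:

```python
# Sketch: grid LiDAR points to a DSM by inverse-distance (Shepard) weighting, then
# extract straight building edges with Canny + probabilistic Hough transform.
import numpy as np
from scipy.spatial import cKDTree
from skimage.feature import canny
from skimage.transform import probabilistic_hough_line

def shepard_dsm(xyz, cell=1.0, k=8, p=2.0):
    """Inverse-distance-weighted interpolation of point heights onto a regular grid."""
    xmin, ymin = xyz[:, :2].min(axis=0)
    cols = np.arange(xmin, xyz[:, 0].max(), cell)
    rows = np.arange(ymin, xyz[:, 1].max(), cell)
    gx, gy = np.meshgrid(cols, rows)
    tree = cKDTree(xyz[:, :2])
    d, idx = tree.query(np.c_[gx.ravel(), gy.ravel()], k=k)
    w = 1.0 / np.maximum(d, 1e-6) ** p
    z = (w * xyz[idx, 2]).sum(axis=1) / w.sum(axis=1)
    return z.reshape(gx.shape)

# Hypothetical point cloud: a flat site with one 10 m high rectangular "building".
pts = np.random.rand(20000, 3) * [100, 100, 0.2]
inside = (pts[:, 0] > 30) & (pts[:, 0] < 60) & (pts[:, 1] > 40) & (pts[:, 1] < 70)
pts[inside, 2] += 10.0

dsm = shepard_dsm(pts)
edges = canny(dsm, sigma=1.0)                                   # building edges in the DSM
lines = probabilistic_hough_line(edges, line_length=10, line_gap=3)
print(f"extracted {len(lines)} straight edge segments")
```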
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Pimpec, F.; /PSI, Villigen; Kirby, R.E.
In many accelerator storage rings running positively charged beams, ionization of residual gas and secondary electron emission (SEE) in the beam pipe will give rise to an electron cloud which can cause beam blow-up or loss of the circulating beam. A preventative measure that suppresses electron cloud formation is to ensure that the vacuum wall has a low secondary emission yield (SEY). The SEY of thin films of TiN, sputter deposited Non-Evaporable Getters and a novel TiCN alloy were measured under a variety of conditions, including the effect of re-contamination from residual gas.
Automatic Generation of Building Models with Levels of Detail 1-3
NASA Astrophysics Data System (ADS)
Nguatem, W.; Drauschke, M.; Mayer, H.
2016-06-01
We present a workflow for the automatic generation of building models with levels of detail (LOD) 1 to 3 according to the CityGML standard (Gröger et al., 2012). We start with orienting unsorted image sets employing (Mayer et al., 2012), we compute depth maps using semi-global matching (SGM) (Hirschmüller, 2008), and fuse these depth maps to reconstruct dense 3D point clouds (Kuhn et al., 2014). Based on planes segmented from these point clouds, we have developed a stochastic method for roof model selection (Nguatem et al., 2013) and window model selection (Nguatem et al., 2014). We demonstrate our workflow up to the export into CityGML.
Unusual chemical compositions of noctilucent-cloud particle nuclei
NASA Technical Reports Server (NTRS)
Hemenway, C. L.
1973-01-01
Two sounding rocket payloads were launched from the ESRO range in Sweden during a noctilucent cloud display. Large numbers of submicron particles were collected, most of which appear to be made up of a high density material coated with a low density material. Typical electron micrographs are shown. Particle chemical compositions have been measured by use of dispersive X-ray analysis equipment attached to an electron microscope and have revealed that most of the high density particle nuclei have atomic weights greater than iron.
Plasma Pancakes and Deep Cavities Generated by High Power Radio Waves from the Arecibo Observatory
NASA Astrophysics Data System (ADS)
Bernhardt, P. A.; Briczinski, S. J., Jr.; Zawdie, K.; Huba, J.; Siefring, C. L.; Sulzer, M. P.; Nossa, E.; Aponte, N.; Perillat, P.; Jackson-Booth, N.
2017-12-01
Breakdown of the neutral atmosphere at ionospheric altitudes can be achieved with high power HF waves that reflect on the bottomside of the ionosphere. For overdense heating (i.e., wave frequency < maximum plasma frequency in the F-layer), the largest electric fields in the plasma are found just below the reflection altitude. There, electromagnetic waves are converted into electron plasma (Langmuir) waves and ion acoustic waves. These waves are measured by scattering of the 430 MHz radar at Arecibo to form an enhanced plasma line. The photo-electron excitation of Langmuir waves yields a weaker plasma-line profile that shows the complete electron profile with the radar. Once HF enhanced Langmuir waves are formed, they can accelerate the photo-electron population to sufficient energies for neutral breakdown and enhanced ionization inside the HF radio beam. Plasma pancakes are produced because the breakdown process continues to build up plasma on the bottom of the breakdown clouds while recombination occurs on the older breakdown plasma at the top of these clouds. Thus, the plasma pancake falls with altitude from the initial HF wave reflection altitude near 250 km to about 160 km, where ion-electron recombination prevents the plasma cloud from being sustained by the high power HF. Experiments in March 2017 produced plasma pancakes with about 100 MW effective radiated power at 5.1 MHz with the Arecibo HF Facility. Observations using the 430 MHz radar show falling plasma pancakes that disappear at low altitudes and reform at the F-layer critical reflection altitude. Sometimes the periodic and regular falling motion of the plasma pancakes is influenced by acoustic gravity waves (AGW) propagating through the modified HF region. A rising AGW can cause the plasma pancake to reside at nearly constant altitude for 10 to 20 minutes. Deep cavities are also produced by high power radio waves interacting with the F-layer. These structures are observed with the Arecibo 430 MHz radar as intense bite-outs in the plasma profile. Multiple cavities are seen simultaneously.
Accuracy Analysis of a Dam Model from Drone Surveys
Buffi, Giulia; Venturi, Sara
2017-01-01
This paper investigates the accuracy of models obtained by drone surveys. To this end, this work analyzes how the placement of ground control points (GCPs) used to georeference the dense point cloud of a dam affects the resulting three-dimensional (3D) model. Images of a double arch masonry dam upstream face are acquired from drone survey and used to build the 3D model of the dam for vulnerability analysis purposes. However, there still remained the issue of understanding the real impact of a correct GCPs location choice to properly georeference the images and thus, the model. To this end, a high number of GCPs configurations were investigated, building a series of dense point clouds. The accuracy of these resulting dense clouds was estimated comparing the coordinates of check points extracted from the model and their true coordinates measured via traditional topography. The paper aims at providing information about the optimal choice of GCPs placement not only for dams but also for all surveys of high-rise structures. The knowledge a priori of the effect of the GCPs number and location on the model accuracy can increase survey reliability and accuracy and speed up the survey set-up operations. PMID:28771185
Accuracy Analysis of a Dam Model from Drone Surveys.
Ridolfi, Elena; Buffi, Giulia; Venturi, Sara; Manciola, Piergiorgio
2017-08-03
This paper investigates the accuracy of models obtained by drone surveys. To this end, this work analyzes how the placement of ground control points (GCPs) used to georeference the dense point cloud of a dam affects the resulting three-dimensional (3D) model. Images of a double arch masonry dam upstream face are acquired from drone survey and used to build the 3D model of the dam for vulnerability analysis purposes. However, there still remained the issue of understanding the real impact of a correct GCPs location choice to properly georeference the images and thus, the model. To this end, a high number of GCPs configurations were investigated, building a series of dense point clouds. The accuracy of these resulting dense clouds was estimated comparing the coordinates of check points extracted from the model and their true coordinates measured via traditional topography. The paper aims at providing information about the optimal choice of GCPs placement not only for dams but also for all surveys of high-rise structures. The knowledge a priori of the effect of the GCPs number and location on the model accuracy can increase survey reliability and accuracy and speed up the survey set-up operations.
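Assessing the accuracy of a georeferenced model against check points reduces to computing residuals and an RMSE per axis between model-derived and surveyed coordinates; the sketch below uses placeholder coordinates, not the paper's data:

```python
# Sketch: accuracy of a georeferenced model from check points (model vs. surveyed coordinates).
import numpy as np

model_xyz = np.array([[100.02, 200.01, 50.00],
                      [110.00, 205.03, 51.02],
                      [120.01, 210.00, 49.99]])   # coordinates read off the dense cloud (placeholders)
survey_xyz = np.array([[100.00, 200.00, 50.01],
                       [110.01, 205.00, 51.00],
                       [120.00, 210.02, 50.00]])  # total-station "true" coordinates (placeholders)

residuals = model_xyz - survey_xyz
rmse_per_axis = np.sqrt((residuals ** 2).mean(axis=0))
rmse_3d = np.sqrt((np.linalg.norm(residuals, axis=1) ** 2).mean())
print("RMSE x/y/z [m]:", rmse_per_axis, " 3D RMSE [m]:", rmse_3d)
```

Repeating this computation for each GCP configuration is what allows the different georeferencing choices to be ranked, as done in the paper.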
Cloud-assisted mobile-access of health data with privacy and auditability.
Tong, Yue; Sun, Jinyuan; Chow, Sherman S M; Li, Pan
2014-03-01
Motivated by the privacy issues curbing the adoption of electronic healthcare systems and by the wild success of cloud service models, we propose to build privacy into mobile healthcare systems with the help of the private cloud. Our system offers salient features including efficient key management, privacy-preserving data storage and retrieval, especially for retrieval at emergencies, and auditability for misusing health data. Specifically, we propose to integrate key management from a pseudorandom number generator for unlinkability, a secure indexing method for privacy-preserving keyword search which hides both search and access patterns based on redundancy, and the concept of attribute-based encryption with threshold signing for providing role-based access control with auditability to prevent potential misbehavior, in both normal and emergency cases.
Supervised Outlier Detection in Large-Scale MVS Point Clouds for 3D City Modeling Applications
NASA Astrophysics Data System (ADS)
Stucker, C.; Richard, A.; Wegner, J. D.; Schindler, K.
2018-05-01
We propose to use a discriminative classifier for outlier detection in large-scale point clouds of cities generated via multi-view stereo (MVS) from densely acquired images. What makes outlier removal hard are varying distributions of inliers and outliers across a scene. Heuristic outlier removal using a specific feature that encodes point distribution often delivers unsatisfying results. Although most outliers can be identified correctly (high recall), many inliers are erroneously removed (low precision), too. This aggravates object 3D reconstruction due to missing data. We thus propose to discriminatively learn class-specific distributions directly from the data to achieve high precision. We apply a standard Random Forest classifier that infers a binary label (inlier or outlier) for each 3D point in the raw, unfiltered point cloud and test two approaches for training. In the first, non-semantic approach, features are extracted without considering the semantic interpretation of the 3D points. The trained model approximates the average distribution of inliers and outliers across all semantic classes. Second, semantic interpretation is incorporated into the learning process, i.e. we train separate inlier-outlier classifiers per semantic class (building facades, roof, ground, vegetation, fields, and water). Performance of learned filtering is evaluated on several large SfM point clouds of cities. We find that the results confirm our underlying assumption that discriminatively learning inlier-outlier distributions does improve precision over global heuristics by up to ≈ 12 percentage points. Moreover, semantically informed filtering that models class-specific distributions further improves precision by up to ≈ 10 percentage points, being able to remove very isolated building, roof, and water points while preserving inliers on building facades and vegetation.
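The class-specific inlier/outlier filtering can be sketched with a standard Random Forest, training one binary classifier per semantic class; the following is a minimal scikit-learn illustration under our own feature and label assumptions (synthetic data), not the authors' feature set:

```python
# Sketch: one inlier/outlier Random Forest per semantic class of a labeled MVS point cloud.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 5000
features = rng.normal(size=(n, 4))          # assumed per-point features, e.g. local density, roughness, color
semantic = rng.integers(0, 3, size=n)       # assumed semantic labels: 0=facade, 1=roof, 2=vegetation
is_inlier = (features[:, 0] + rng.normal(scale=0.5, size=n) > 0).astype(int)  # synthetic ground truth

models = {}
for cls in np.unique(semantic):
    mask = semantic == cls
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(features[mask], is_inlier[mask])       # learn the class-specific inlier distribution
    models[cls] = clf

# Filtering: keep only the points that their class-specific model labels as inliers.
keep = np.zeros(n, dtype=bool)
for cls, clf in models.items():
    mask = semantic == cls
    keep[mask] = clf.predict(features[mask]).astype(bool)
print(f"kept {keep.sum()} of {n} points")
```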
NASA Astrophysics Data System (ADS)
Chiabrando, F.; Lo Turco, M.; Santagati, C.
2017-02-01
The paper here presented shows the outcomes of a research/didactic activity carried out within a workshop titled "Digital Invasions. From point cloud to Heritage Building Information Modeling" held at Politecnico di Torino (29th September-5th October 2016). The term digital invasions refers to an Italian bottom-up project born in 2013 with the aim of promoting innovative digital ways for the enhancement of Cultural Heritage by the co-creation of cultural contents and its sharing through social media platforms. In this regard, we have worked with students of the Architectural Master of Science degree, training them with a multidisciplinary teaching team (Architectural Representation, History of Architecture, Restoration, Digital Communication and Geomatics). The aim was also to test if our students could be involved in a sort of niche crowdsourcing for the creation of a library of H-BOMS (Historical-Building Object Modeling) of architectural elements.
Reconstruction of Building Outlines in Dense Urban Areas Based on LIDAR Data and Address Points
NASA Astrophysics Data System (ADS)
Jarzabek-Rychard, M.
2012-07-01
The paper presents a comprehensive method for automated extraction and delineation of building outlines in densely built-up areas. A novel approach to outline reconstruction is the use of geocoded building address points. They give information about building location and thus greatly reduce task complexity. The reconstruction process is executed on 3D point clouds acquired by an airborne laser scanner. The method consists of three steps: building detection, delineation and contour refinement. The algorithm is tested against a data set that covers an old market town and its surroundings. The results are discussed and evaluated by comparison to reference cadastral data.
Xu, Long; Zhao, Zhiyuan; Xiao, Mingchao; Yang, Jie; Xiao, Jian; Yi, Zhengran; Wang, Shuai; Liu, Yunqi
2017-11-22
The exploration of novel electron-deficient building blocks is a key task for developing high-performance polymer semiconductors in organic thin-film transistors. In view of the lack of strong electron-deficient building blocks, we designed two novel π-extended isoindigo-based electron-deficient building blocks, IVI and F4IVI. Owing to the strong electron-deficient nature and the extended π-conjugated system of the two acceptor units, their copolymers, PIVI2T and PF4IVI2T, containing 2,2'-bithiophene donor units, are endowed with deep-lying highest occupied molecular orbital (HOMO)/lowest unoccupied molecular orbital (LUMO) energy levels and strong intermolecular interactions. In comparison to PIVI2T, the fluorinated PF4IVI2T exhibits stronger intra- and intermolecular interactions, lower HOMO/LUMO energy levels of down to -5.74/-4.17 eV, and more ordered molecular packing with a smaller π-π stacking distance of down to 3.53 Å, resulting in an excellent ambipolar transporting behavior and a promising application in logic circuits for PF4IVI2T in ambient conditions, with hole and electron mobilities of up to 1.03 and 1.82 cm² V⁻¹ s⁻¹, respectively. The results reveal that F4IVI is a promising and strong electron-deficient building unit for constructing high-performance semiconducting polymers, which provides insight into the structure-property relationships for the exploration and molecular engineering of excellent electron-deficient building blocks in the field of organic electronics.
A Fast and Flexible Method for Meta-Map Building for ICP-Based SLAM
NASA Astrophysics Data System (ADS)
Kurian, A.; Morin, K. W.
2016-06-01
Recent developments in LiDAR sensors make mobile mapping fast and cost effective. These sensors generate a large amount of data which in turn improves the coverage and details of the map. Due to the limited range of the sensor, one has to collect a series of scans to build the entire map of the environment. If we have good GNSS coverage, building a map is a well-addressed problem. But in an indoor environment, we have limited GNSS reception and an inertial solution, if available, can quickly diverge. In such situations, simultaneous localization and mapping (SLAM) is used to generate a navigation solution and map concurrently. SLAM using point clouds poses a number of computational challenges even with modern hardware due to the sheer amount of data. In this paper, we propose two strategies for minimizing the cost of computation and storage when a 3D point cloud is used for navigation and real-time map building. We have used the 3D point cloud generated by Leica Geosystems' Pegasus Backpack, which is equipped with Velodyne VLP-16 LiDAR scanners. To improve the speed of the conventional iterative closest point (ICP) algorithm, we propose a point cloud sub-sampling strategy which does not throw away any key features and yet significantly reduces the number of points that need to be processed and stored. In order to speed up the correspondence finding step, a dual kd-tree and circular buffer architecture is proposed. We have shown that the proposed method can run in real time and has excellent navigation accuracy characteristics.
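The correspondence-search step that dominates ICP runtime, together with a simple voxel-grid sub-sampling that thins the cloud before matching, can be sketched with scipy; this is a generic illustration on placeholder data, not the dual kd-tree/circular-buffer architecture proposed in the paper:

```python
# Sketch: voxel-grid sub-sampling plus kd-tree nearest-neighbour correspondences,
# the two ingredients that keep one ICP iteration cheap.
import numpy as np
from scipy.spatial import cKDTree

def voxel_subsample(points, voxel=0.1):
    """Keep one representative point per occupied voxel (centroid of its points)."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    sums = np.zeros((inverse.max() + 1, 3))
    counts = np.zeros(inverse.max() + 1)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]

scan = np.random.rand(100000, 3) * 10.0       # new scan (placeholder data)
map_cloud = np.random.rand(200000, 3) * 10.0  # existing map (placeholder data)

scan_ds = voxel_subsample(scan, voxel=0.2)          # fewer points to match and store
tree = cKDTree(voxel_subsample(map_cloud, voxel=0.2))
dist, idx = tree.query(scan_ds, k=1)                # nearest-neighbour correspondences
matches = dist < 0.5                                # reject distant pairs before the ICP update
print(f"{matches.sum()} correspondences from {len(scan_ds)} sub-sampled points")
```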
Electric potential distributions at the interface between plasmasheet clouds
NASA Technical Reports Server (NTRS)
Evans, D. S.; Roth, M.; Lemaire, J.
1987-01-01
At the interface between two plasma clouds with different densities, temperatures, and/or bulk velocities, there are large charge separation electric fields which can be modeled in the framework of a collisionless theory for tangential discontinuities. Two different classes of layers were identified: the first one corresponds to (stable) ion layers which are thicker than one ion Larmor radius; the second one corresponds to (unstable) electron layers which are only a few electron Larmor radii thick. It is suggested that these thin electron layers with large electric potential gradients (up to 400 mV/m) are the regions where large-amplitude electrostatic waves are spontaneously generated. These waves scatter the pitch angles of the ambient plasmasheet electrons into the atmospheric loss cone. The unstable electron layers can therefore be considered as the seat of strong pitch angle scattering for the primary auroral electrons.
Storing and using health data in a virtual private cloud.
Regola, Nathan; Chawla, Nitesh V
2013-03-13
Electronic health records are being adopted at a rapid rate due to increased funding from the US federal government. Health data provide the opportunity to identify possible improvements in health care delivery by applying data mining and statistical methods to the data and will also enable a wide variety of new applications that will be meaningful to patients and medical professionals. Researchers are often granted access to health care data to assist in the data mining process, but HIPAA regulations mandate comprehensive safeguards to protect the data. Often universities (and presumably other research organizations) have an enterprise information technology infrastructure and a research infrastructure. Unfortunately, both of these infrastructures are generally not appropriate for sensitive research data such as data covered by HIPAA, as such data require special accommodations on the part of the enterprise information technology (or increased security on the part of the research computing environment). Cloud computing, which is a concept that allows organizations to build complex infrastructures on leased resources, is rapidly evolving to the point that it is possible to build sophisticated network architectures with advanced security capabilities. We present a prototype infrastructure in Amazon's Virtual Private Cloud to allow researchers and practitioners to utilize the data in a HIPAA-compliant environment.
Hybrid Cloud Computing Environment for EarthCube and Geoscience Community
NASA Astrophysics Data System (ADS)
Yang, C. P.; Qin, H.
2016-12-01
The NSF EarthCube Integration and Test Environment (ECITE) has built a hybrid cloud computing environment that provides cloud resources from private cloud environments using the cloud system software OpenStack and Eucalyptus, and also manages a public cloud, Amazon Web Services, allowing resource synchronization and bursting between the private and public clouds. On the ECITE hybrid cloud platform, the EarthCube and geoscience community can deploy and manage applications by using base virtual machine images or customized virtual machines, analyze big datasets by using virtual clusters, and monitor virtual resource usage on the cloud in real time. Currently, a number of EarthCube projects have deployed or started migrating their projects to this platform, such as CHORDS, BCube, CINERGI, OntoSoft, and some other EarthCube building blocks. To accomplish the deployment or migration, the administrator of the ECITE hybrid cloud platform prepares the specific needs (e.g. images, port numbers, usable cloud capacity, etc.) of each project in advance based on communications between ECITE and the participant projects, and then the scientists or IT technicians in those projects launch one or multiple virtual machines, access the virtual machine(s) to set up the computing environment if need be, and migrate their code, documents or data without caring about the heterogeneity in structure and operations among different cloud platforms.
Stratocumulus Cloud Top Radiative Cooling and Cloud Base Updraft Speeds
NASA Astrophysics Data System (ADS)
Kazil, J.; Feingold, G.; Balsells, J.; Klinger, C.
2017-12-01
Cloud top radiative cooling is a primary driver of turbulence in the stratocumulus-topped marine boundary layer. A functional relationship between cloud top cooling and cloud base updraft speeds may therefore exist. A correlation of cloud top radiative cooling and cloud base updraft speeds has recently been identified empirically, providing a basis for satellite retrieval of cloud base updraft speeds. Such retrievals may enable analysis of aerosol-cloud interactions using satellite observations: updraft speeds at cloud base co-determine supersaturation and therefore the activation of cloud condensation nuclei, which in turn co-determine cloud properties and precipitation formation. We use large eddy simulation and an off-line radiative transfer model to explore the relationship between cloud-top radiative cooling and cloud base updraft speeds in a marine stratocumulus cloud over the course of the diurnal cycle. We find that during daytime, at low cloud water path (CWP < 50 g m⁻²), cloud base updraft speeds and cloud top cooling are well correlated, in agreement with the reported empirical relationship. During the night, in the absence of short-wave heating, CWP builds up (CWP > 50 g m⁻²) and long-wave emissions from cloud top saturate, while cloud base heating increases. In combination, cloud top cooling and cloud base updrafts become weakly anti-correlated. A functional relationship between cloud top cooling and cloud base updraft speed can hence be expected for stratocumulus clouds with a sufficiently low CWP and sub-saturated long-wave emissions, in particular during daytime. At higher CWPs, in particular at night, the relationship breaks down due to saturation of long-wave emissions from cloud top.
Properties of the electron cloud in a high-energy positron and electron storage ring
Harkay, K. C.; Rosenberg, R. A.
2003-03-20
Low-energy, background electrons are ubiquitous in high-energy particle accelerators. Under certain conditions, interactions between this electron cloud and the high-energy beam can give rise to numerous effects that can seriously degrade the accelerator performance. These effects range from vacuum degradation to collective beam instabilities and emittance blowup. Although electron-cloud effects were first observed two decades ago in a few proton storage rings, they have in recent years been widely observed and intensely studied in positron and proton rings. Electron-cloud diagnostics developed at the Advanced Photon Source enabled for the first time detailed, direct characterization of the electron-cloud properties in a positron and electron storage ring. From in situ measurements of the electron flux and energy distribution at the vacuum chamber wall, electron-cloud production mechanisms and details of the beam-cloud interaction can be inferred. A significant longitudinal variation of the electron cloud is also observed, due primarily to geometrical details of the vacuum chamber. Furthermore, such experimental data can be used to provide realistic limits on key input parameters in modeling efforts, leading ultimately to greater confidence in predicting electron-cloud effects in future accelerators.
DeGaspari, John
2011-10-01
CIOs are hard at work coming up with the most effective and affordable strategies for protecting electronic data as their hospitals move forward on electronic medical records. While the rise of cloud computing and declining network costs are offering new opportunities in dealing with potential disasters, many find there is no substitute for good planning and constant testing.
Semantic Segmentation of Building Elements Using Point Cloud Hashing
NASA Astrophysics Data System (ADS)
Chizhova, M.; Gurianov, A.; Hess, M.; Luhmann, T.; Brunn, A.; Stilla, U.
2018-05-01
For the interpretation of point clouds, the semantic definition of segments extracted from point clouds or images is a common problem. Usually, the semantics of geometrically pre-segmented point cloud elements are determined using probabilistic networks and scene databases. The proposed semantic segmentation method is based on the psychological human interpretation of geometric objects, especially on fundamental rules of primary comprehension. Starting from these rules, buildings can be classified quite well and simply by a human operator (e.g. an architect) into different building types and structural elements (dome, nave, transept etc.), including particular building parts which are visually detected. The key part of the procedure is a novel method based on hashing, in which point cloud projections are transformed into binary pixel representations. The segmentation approach, demonstrated on the example of classical Orthodox churches, is also suitable for other buildings and objects characterized by a particular typology in their construction (e.g. industrial objects in standardized environments with strict component design allowing clear semantic modelling).
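A minimal sketch of the core encoding step, projecting a point cloud to a binary pixel image and hashing it, is given below; the grid resolution, the top-down projection, and the use of an MD5 digest are illustrative assumptions, not the authors' exact scheme.

    import hashlib
    import numpy as np

    def point_cloud_hash(points, grid=64):
        """Project a point cloud onto the XY plane, binarize it, and hash the bitmap.

        points : (N, 3) array of x, y, z coordinates
        grid   : resolution of the binary pixel representation
        """
        xy = points[:, :2]
        lo, hi = xy.min(axis=0), xy.max(axis=0)
        # map points into integer pixel indices
        idx = np.floor((xy - lo) / (hi - lo + 1e-9) * (grid - 1)).astype(int)
        img = np.zeros((grid, grid), dtype=np.uint8)
        img[idx[:, 1], idx[:, 0]] = 1             # occupied pixels
        return hashlib.md5(img.tobytes()).hexdigest()

    pts = np.random.rand(1000, 3)
    print(point_cloud_hash(pts))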
Particle-in-cell simulations of the critical ionization velocity effect in finite size clouds
NASA Technical Reports Server (NTRS)
Moghaddam-Taaheri, E.; Lu, G.; Goertz, C. K.; Nishikawa, K. - I.
1994-01-01
The critical ionization velocity (CIV) mechanism in a finite size cloud is studied with a series of electrostatic particle-in-cell simulations. It is observed that an initial seed ionization, produced by non-CIV mechanisms, generates a cross-field ion beam which excites a modified beam-plasma instability (MBPI) with frequency in the range of the lower hybrid frequency. The excited waves accelerate electrons along the magnetic field up to the ion drift energy that exceeds the ionization energy of the neutral atoms. The heated electrons in turn enhance the ion beam by electron-neutral impact ionization, which establishes a positive feedback loop in maintaining the CIV process. It is also found that the efficiency of the CIV mechanism depends on the finite size of the gas cloud in the following ways: (1) Along the ambient magnetic field the finite size of the cloud, L_parallel, restricts the growth of the fastest growing mode, with wavelength lambda_m,parallel, of the MBPI. The parallel electron heating at wave saturation scales approximately as (L_parallel/lambda_m,parallel)^(1/2). (2) Momentum coupling between the cloud and the ambient plasma via the Alfven waves occurs as a result of the finite size of the cloud in the direction perpendicular to both the ambient magnetic field and the neutral drift. This reduces exponentially with time the relative drift between the ambient plasma and the neutrals. The timescale is inversely proportional to the Alfven velocity. (3) The transverse charge separation field across the cloud was found to result in a modulation of the beam velocity, which reduces the parallel heating of electrons and increases the transverse acceleration of electrons. (4) Some energetic electrons are lost from the cloud along the magnetic field at a rate characterized by the acoustic velocity, instead of the electron thermal velocity. The loss of energetic electrons from the cloud seems to be larger in the direction of plasma drift relative to the neutrals, where the loss rate is characterized by the neutral drift velocity. It is also shown that a factor of 4 increase in the ambient plasma density increases the CIV ionization yield by almost 2 orders of magnitude at the end of a typical run. It is concluded that a larger ambient plasma density can result in a larger CIV yield because of (1) larger seed ion production by non-CIV mechanisms, (2) smaller Alfven velocity and hence weak momentum coupling, and (3) smaller ratio of the ion beam density to the ambient ion density, and therefore a weaker modulation of the beam velocity. The simulation results are used to interpret various chemical release experiments in space.
2001-01-03
KENNEDY SPACE CENTER, Fla. -- Under wispy white morning clouds, Space Shuttle Atlantis approaches Launch Pad 39A, which shows the Rotating Service Structure open (left) and the Fixed Service Structure (right). At the RSS, the payload canister is being lifted up to the Payload Changeout Room. This is the Shuttle’s second attempt at rollout. Jan. 2 a failed computer processor on the crawler transporter aborted the rollout and the Shuttle was returned to the Vehicle Assembly Building using a secondary computer processor on the vehicle. Atlantis will fly on mission STS-98, the seventh construction flight to the International Space Station, carrying the U.S. Laboratory, named Destiny. The lab will have five system racks already installed inside the module. After delivery of electronics in the lab, electrically powered attitude control for Control Moment Gyroscopes will be activated. Atlantis is scheduled for launch no earlier than Jan. 19, 2001, with a crew of five
Supernova Remnant W49B and Its Environment
NASA Astrophysics Data System (ADS)
Zhu, H.; Tian, W. W.; Zuo, P.
2014-10-01
We study the gamma-ray supernova remnant (SNR) W49B and its environment using recent radio and infrared data. Spitzer Infrared Spectrograph low-resolution data of W49B show shocked excitation lines of H2 (0,0) S(0)-S(7) from the SNR-molecular cloud interaction. The H2 gas is composed of two components with temperatures of ~260 K and ~1060 K, respectively. Various spectral lines from atomic and ionic particles are detected toward W49B. We suggest that the ionic phase has an electron density of ~500 cm^-3 and a temperature of ~10^4 K based on the spectral line diagnostics. The mid- and far-infrared data from MSX, Spitzer, and Herschel reveal a 151 ± 20 K hot dust component with a mass of 7.5 ± 6.6 × 10^-4 M_⊙ and a 45 ± 4 K warm dust component with a mass of 6.4 ± 3.2 M_⊙. The hot dust is likely from materials swept up by the shock of W49B. The warm dust may possibly originate from the evaporation of clouds interacting with W49B. We build the H I absorption spectra of W49B and four nearby H II regions (W49A, G42.90+0.58, G42.43-0.26, and G43.19-0.53) and study the relation between W49B and the surrounding molecular clouds by employing the 2.12 μm infrared and CO data. We therefore obtain a kinematic distance of ~10 kpc for W49B and suggest that the remnant is likely associated with the CO cloud at about 40 km s^-1.
Meteoric Metal Chemistry in the Martian Atmosphere
NASA Astrophysics Data System (ADS)
Plane, J. M. C.; Carrillo-Sanchez, J. D.; Mangan, T. P.; Crismani, M. M. J.; Schneider, N. M.; Määttänen, A.
2018-03-01
Recent measurements by the Imaging Ultraviolet Spectrograph (IUVS) instrument on NASA's Mars Atmosphere and Volatile EvolutioN mission show that a persistent layer of Mg+ ions occurs around 90 km in the Martian atmosphere but that neutral Mg atoms are not detectable. These observations can be satisfactorily modeled with a global meteoric ablation rate of 0.06 t sol^-1, out of a cosmic dust input of 2.7 ± 1.6 t sol^-1. The absence of detectable Mg at 90 km requires that at least 50% of the ablating Mg atoms ionize through hyperthermal collisions with CO2 molecules. Dissociative recombination of MgO+.(CO2)n cluster ions with electrons to produce MgCO3 directly, rather than MgO, also avoids a buildup of Mg to detectable levels. The meteoric injection rate of Mg, Fe, and other metals—constrained by the IUVS measurements—enables the production rate of metal carbonate molecules (principally MgCO3 and FeCO3) to be determined. These molecules have very large electric dipole moments (11.6 and 9.2 Debye, respectively) and thus form clusters with up to six H2O molecules at temperatures below 150 K. These clusters should then coagulate efficiently, building up metal carbonate-rich ice particles which can act as nucleating particles for the formation of CO2-ice clouds. Observable mesospheric clouds are predicted to occur between 65 and 80 km at temperatures below 95 K and above 85 km at temperatures about 5 K colder.
Characterizing Subpixel Spatial Resolution of a Hybrid CMOS Detector
NASA Astrophysics Data System (ADS)
Bray, Evan; Burrows, Dave; Chattopadhyay, Tanmoy; Falcone, Abraham; Hull, Samuel; Kern, Matthew; McQuaide, Maria; Wages, Mitchell
2018-01-01
The detection of X-rays is a unique process relative to other wavelengths, and allows for some novel features that increase the scientific yield of a single observation. Unlike lower photon energies, X-rays liberate a large number of electrons from the silicon absorber array of the detector. This number is usually on the order of several hundred to a thousand for moderate-energy X-rays. These electrons tend to diffuse outward into what is referred to as the charge cloud. This cloud can then be picked up by several pixels, forming a specific pattern based on the exact incident location. By conducting the first ever “mesh experiment" on a hybrid CMOS detector (HCD), we have experimentally determined the charge cloud shape and used it to characterize responsivity of the detector with subpixel spatial resolution.
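The pixel-sharing pattern produced by such a charge cloud can be illustrated with a simple 2-D Gaussian model integrated over a pixel grid; this is our own illustrative model, not the experimentally determined cloud shape, and the cloud width and pixel pitch below are hypothetical values.

    import numpy as np
    from scipy.special import erf

    def pixel_fractions(x0, y0, sigma, pitch=18.0, n=3):
        """Fraction of a Gaussian charge cloud collected by each pixel of an n x n patch.

        x0, y0 : incident position relative to the centre of the patch (microns)
        sigma  : charge cloud width (microns), hypothetical value
        pitch  : pixel pitch (microns), hypothetical value
        """
        def cdf(a, b, mu):
            # integral of a unit 1-D Gaussian between pixel edges a and b
            return 0.5 * (erf((b - mu) / (np.sqrt(2) * sigma))
                          - erf((a - mu) / (np.sqrt(2) * sigma)))
        edges = (np.arange(n + 1) - n / 2) * pitch
        fx = [cdf(edges[i], edges[i + 1], x0) for i in range(n)]
        fy = [cdf(edges[j], edges[j + 1], y0) for j in range(n)]
        return np.outer(fy, fx)                   # rows = y pixels, cols = x pixels

    print(pixel_fractions(3.0, -2.0, sigma=6.0).round(3))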
On Study of Building Smart Campus under Conditions of Cloud Computing and Internet of Things
NASA Astrophysics Data System (ADS)
Huang, Chao
2017-12-01
Cloud computing and the internet of things are two new concepts of the information era; although they are defined differently, they are closely related. Building a smart campus by virtue of cloud computing, the internet of things, and other internet technologies is a new measure to realize leap-forward development of the campus. This paper, centering on the construction of a smart campus, analyzes and compares the differences between the network in a traditional campus and that in a smart campus, and finally makes proposals on how to build a smart campus from the perspectives of cloud computing and the internet of things.
A secure EHR system based on hybrid clouds.
Chen, Yu-Yi; Lu, Jun-Chao; Jan, Jinn-Ke
2012-10-01
Application services rendering remote medical services and electronic health records (EHR) have become a hot topic and have stimulated increased interest in recent years. Information and communication technologies have been applied to the medical services and healthcare area for a number of years to resolve problems in medical management. Sharing EHR information to provide professional medical programs with consultancy, evaluation, and tracing services can certainly improve accessibility for the public receiving medical services or medical information at remote sites. With the widespread use of EHRs, building a secure EHR sharing environment has attracted a lot of attention in both the healthcare industry and the academic community. The cloud computing paradigm is one of the popular health-IT infrastructures for facilitating EHR sharing and EHR integration. In this paper, we propose an EHR sharing and integration system in healthcare clouds and analyze the arising security and privacy issues in the access and management of EHRs.
Externally fed star formation: a numerical study
NASA Astrophysics Data System (ADS)
Mohammadpour, Motahareh; Stahler, Steven W.
2013-08-01
We investigate, through a series of numerical calculations, the evolution of dense cores that are accreting external gas up to and beyond the point of star formation. Our model clouds are spherical, unmagnetized configurations with fixed outer boundaries, across which gas enters subsonically. When we start with any near-equilibrium state, we find that the cloud's internal velocity also remains subsonic for an extended period, in agreement with observations. However, the velocity becomes supersonic shortly before the star forms. Consequently, the accretion rate building up the protostar is much greater than the benchmark value c_s^3/G, where c_s is the sound speed in the dense core. This accretion spike would generate a higher luminosity than those seen in even the most embedded young stars. Moreover, we find that the region of supersonic infall surrounding the protostar races out to engulf much of the cloud, again in violation of the observations, which show infall to be spatially confined. Similar problematic results have been obtained by all other hydrodynamic simulations to date, regardless of the specific infall geometry or boundary conditions adopted. Low-mass star formation is evidently a quasi-static process, in which cloud gas moves inward subsonically until the birth of the star itself. We speculate that magnetic tension in the cloud's deep interior helps restrain the infall prior to this event.
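For orientation, the benchmark rate c_s^3/G evaluates to roughly 2 × 10^-6 M_⊙ yr^-1 for a 10 K molecular core; the short sketch below does the arithmetic, with the temperature and mean molecular weight being standard illustrative choices rather than values taken from the paper.

    import numpy as np

    G = 6.674e-8          # cm^3 g^-1 s^-2
    k_B = 1.381e-16       # erg K^-1
    m_H = 1.673e-24       # g
    M_sun = 1.989e33      # g
    yr = 3.156e7          # s

    T = 10.0              # K, illustrative dense-core temperature
    mu = 2.33             # mean molecular weight of molecular gas
    c_s = np.sqrt(k_B * T / (mu * m_H))           # isothermal sound speed, cm/s
    mdot = c_s**3 / G                             # benchmark accretion rate, g/s

    print(f"c_s = {c_s/1e5:.2f} km/s, c_s^3/G = {mdot * yr / M_sun:.2e} M_sun/yr")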
a Voxel-Based Metadata Structure for Change Detection in Point Clouds of Large-Scale Urban Areas
NASA Astrophysics Data System (ADS)
Gehrung, J.; Hebel, M.; Arens, M.; Stilla, U.
2018-05-01
Mobile laser scanning has not only the potential to create detailed representations of urban environments, but also to determine changes down to a very detailed level. An environment representation for change detection in large-scale urban environments based on point clouds has drawbacks in terms of memory scalability. Volumes, however, are a promising building block for memory-efficient change detection methods. The challenge of working with 3D occupancy grids is that the usual raycasting-based methods applied for their generation lead to artifacts caused by the traversal of unfavorably discretized space. These artifacts have the potential to distort the state of voxels in close proximity to planar structures. In this work we propose a raycasting approach that utilizes knowledge about planar surfaces to completely prevent this kind of artifact. To demonstrate the capabilities of our approach, a method for the iterative volumetric approximation of point clouds that allows speeding up the raycasting by 36 percent is proposed.
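For context, the occupancy-grid bookkeeping that such raycasting feeds can be sketched as below; this is a plain sampled ray traversal for illustration only, not the plane-aware traversal the authors propose, and the voxel size and step length are arbitrary.

    import numpy as np

    def update_occupancy(grid, origin, hit, voxel_size=0.2, step=0.05):
        """Mark voxels along a sensor ray: free space along the ray, occupied at the hit.

        grid       : dict mapping integer voxel indices to 'free' / 'occupied'
        origin     : (3,) sensor position
        hit        : (3,) measured point
        voxel_size : edge length of a voxel (m), arbitrary choice
        """
        origin, hit = np.asarray(origin, float), np.asarray(hit, float)
        length = np.linalg.norm(hit - origin)
        direction = (hit - origin) / length
        for t in np.arange(0.0, length, step):          # sample points along the ray
            key = tuple(np.floor((origin + t * direction) / voxel_size).astype(int))
            grid.setdefault(key, 'free')                # do not overwrite occupied voxels
        grid[tuple(np.floor(hit / voxel_size).astype(int))] = 'occupied'

    grid = {}
    update_occupancy(grid, origin=[0, 0, 0], hit=[1.0, 0.4, 0.1])
    print(sum(v == 'free' for v in grid.values()), "free voxels,",
          sum(v == 'occupied' for v in grid.values()), "occupied")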
NASA Astrophysics Data System (ADS)
Chee, T.; Nguyen, L.; Smith, W. L., Jr.; Spangenberg, D.; Palikonda, R.; Bedka, K. M.; Minnis, P.; Thieman, M. M.; Nordeen, M.
2017-12-01
Providing public access to research products, including cloud macro- and microphysical properties and satellite imagery, is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a web-based visualization tool and API that allow end users to easily create customized cloud product imagery, satellite imagery, ground site data and satellite ground track information, all generated dynamically. The tool has two uses: one to visualize the dynamically created imagery, and the other to provide direct access to the dynamically generated imagery at a later time. Internally, we leverage our practical experience with large, scalable applications to develop a system that has the greatest potential for scalability as well as the ability to be deployed on the cloud to accommodate scaling demands. We build upon the NASA Langley Cloud and Radiation Group's experience in making real-time and historical satellite cloud product information, satellite imagery, ground site data and satellite track information accessible and easily searchable. This tool is the culmination of our prior experience with dynamic imagery generation and provides a way to build a "mash-up" of dynamically generated imagery and related kinds of information that are visualized together to add value to disparate but related information. In support of NASA strategic goals, our group aims to make as much scientific knowledge, observations and products as possible available to the citizen science, research and other interested communities, as well as to automated systems that acquire the same information for data mining or other analytic purposes. This tool and the underlying APIs provide a valuable research resource for a wide audience, both as a standalone research tool and as an easily accessed data source that can be mined or used with existing tools.
NASA Astrophysics Data System (ADS)
Antova, Gergana; Kunchev, Ivan; Mickrenska-Cherneva, Christina
2016-10-01
The representation of physical buildings in Building Information Models (BIM) has been a subject of research for four decades in the fields of Construction Informatics and GeoInformatics. The early digital representations of buildings mainly appeared as 3D drawings constructed with CAD software, and the 3D representation of the buildings was only geometric, while semantics and topology were outside the modelling focus. On the other hand, less detailed building representations, often focused on ‘outside’ representations, were also found in the form of 2D/2.5D GeoInformation models. Point clouds from 3D laser scanning data give a full and exact representation of the building geometry. The article presents different aspects and the benefits of using point clouds in BIM in the different stages of the lifecycle of a building.
A cloud-based framework for large-scale traditional Chinese medical record retrieval.
Liu, Lijun; Liu, Li; Fu, Xiaodong; Huang, Qingsong; Zhang, Xianwen; Zhang, Yin
2018-01-01
Electronic medical records are increasingly common in medical practice, and the secondary use of medical records has become increasingly important. It relies on the ability to retrieve complete information about desired patient populations. How to effectively and accurately retrieve relevant medical records from large-scale medical big data is becoming a big challenge. Therefore, we propose an efficient and robust cloud-based framework for large-scale Traditional Chinese Medical Records (TCMRs) retrieval. First, we propose a parallel index building method and build a distributed search cluster; the former is used to improve the performance of index building, and the latter is used to provide highly concurrent online TCMRs retrieval. Then, a real-time multi-indexing model is proposed to ensure the latest relevant TCMRs are indexed and retrieved in real time, and a semantics-based query expansion method and a multi-factor ranking model are proposed to improve retrieval quality. Third, we implement a template-based visualization method for displaying medical reports. The proposed parallel indexing method and distributed search cluster can improve the performance of index building and provide highly concurrent online TCMRs retrieval. The multi-indexing model can ensure the latest relevant TCMRs are indexed and retrieved in real time. The semantics expansion method and the multi-factor ranking model can enhance retrieval quality. The template-based visualization method can enhance availability and universality, with the medical reports displayed via a friendly web interface. In conclusion, compared with current medical record retrieval systems, our system provides some advantages that are useful in improving the secondary use of large-scale traditional Chinese medical records in a cloud environment. The proposed system is more easily integrated with existing clinical systems and can be used in various scenarios. Copyright © 2017. Published by Elsevier Inc.
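The parallel index-building idea can be illustrated generically: split the record set into shards, index the shards in parallel, and merge the partial indexes. The sketch below uses Python's multiprocessing with a toy inverted index and hypothetical record contents; it is only an analogue of the distributed cluster described, not its implementation.

    from collections import defaultdict
    from multiprocessing import Pool

    def index_shard(records):
        """Build an inverted index (term -> record ids) for one shard of records."""
        index = defaultdict(set)
        for rec_id, text in records:
            for term in text.lower().split():
                index[term].add(rec_id)
        return index

    def merge(indexes):
        """Merge the partial indexes produced by the workers."""
        merged = defaultdict(set)
        for idx in indexes:
            for term, ids in idx.items():
                merged[term] |= ids
        return merged

    if __name__ == "__main__":
        records = [(i, f"patient record {i} herbal prescription") for i in range(1000)]
        shards = [records[i::4] for i in range(4)]       # split work across 4 processes
        with Pool(4) as pool:
            partial = pool.map(index_shard, shards)
        index = merge(partial)
        print(len(index["herbal"]), "records match 'herbal'")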
Monitoring of Progressive Damage in Buildings Using Laser Scan Data
NASA Astrophysics Data System (ADS)
Puente, I.; Lindenbergh, R.; Van Natijne, A.; Esposito, R.; Schipper, R.
2018-05-01
Vulnerability of buildings to natural and man-induced hazards has become a main concern for our society. Ensuring their serviceability, safety and sustainability is of vital importance and the main reason for setting up monitoring systems to detect damages at an early stage. In this work, a method is presented for detecting changes from laser scan data, where no registration between different epochs is needed. To show the potential of the method, a case study of a laboratory test carried out at the Stevin laboratory of Delft University of Technology was selected. The case study was a quasi-static cyclic pushover test on a two-story high unreinforced masonry structure designed to simulate damage evolution caused by cyclic loading. During the various phases, we analysed the behaviour of the masonry walls by monitoring the deformation of each masonry unit. First a plane is fitted to the selected wall point cloud, consisting of one single terrestrial laser scan, using Principal Component Analysis (PCA). Second, the segmentation of individual elements is performed. Then deformations with respect to this plane model, for each epoch and specific element, are determined by computing their corresponding rotation and cloud-to-plane distances. The validation of the changes detected within this approach is done by comparison with traditional deformation analysis based on co-registered TLS point clouds between two or more epochs of building measurements. Initial results show that the sketched methodology is indeed able to detect changes at the mm level while avoiding 3D point cloud registration, which is a main issue in computer vision and remote sensing.
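The PCA plane fit and the cloud-to-plane distance computation at the heart of this monitoring approach can be sketched in a few lines; the synthetic wall data and array names below are our own illustration, not the laboratory dataset.

    import numpy as np

    def fit_plane_pca(points):
        """Fit a plane to a wall point cloud via PCA; return centroid and unit normal."""
        centroid = points.mean(axis=0)
        # the right singular vector of the smallest singular value is the plane normal
        _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
        return centroid, vt[-1]

    def cloud_to_plane_distances(points, centroid, normal):
        """Signed orthogonal distance of each point to the fitted plane."""
        return (points - centroid) @ normal

    # synthetic wall: a plane with mm-level noise
    rng = np.random.default_rng(1)
    pts = np.column_stack([rng.uniform(0, 2, 5000), rng.uniform(0, 3, 5000),
                           rng.normal(0, 0.001, 5000)])
    c, n = fit_plane_pca(pts)
    d = cloud_to_plane_distances(pts, c, n)
    print(f"max |deviation| = {np.abs(d).max()*1000:.2f} mm")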
High-performance scientific computing in the cloud
NASA Astrophysics Data System (ADS)
Jorissen, Kevin; Vila, Fernando; Rehr, John
2011-03-01
Cloud computing has the potential to open up high-performance computational science to a much broader class of researchers, owing to its ability to provide on-demand, virtualized computational resources. However, before such approaches can become commonplace, user-friendly tools must be developed that hide the unfamiliar cloud environment and streamline the management of cloud resources for many scientific applications. We have recently shown that high-performance cloud computing is feasible for parallelized x-ray spectroscopy calculations. We now present benchmark results for a wider selection of scientific applications focusing on electronic structure and spectroscopic simulation software in condensed matter physics. These applications are driven by an improved portable interface that can manage virtual clusters and run various applications in the cloud. We also describe a next generation of cluster tools, aimed at improved performance and a more robust cluster deployment. Supported by NSF grant OCI-1048052.
Storing and Using Health Data in a Virtual Private Cloud
Regola, Nathan
2013-01-01
Electronic health records are being adopted at a rapid rate due to increased funding from the US federal government. Health data provide the opportunity to identify possible improvements in health care delivery by applying data mining and statistical methods to the data and will also enable a wide variety of new applications that will be meaningful to patients and medical professionals. Researchers are often granted access to health care data to assist in the data mining process, but HIPAA regulations mandate comprehensive safeguards to protect the data. Often universities (and presumably other research organizations) have an enterprise information technology infrastructure and a research infrastructure. Unfortunately, both of these infrastructures are generally not appropriate for sensitive research data such as HIPAA-protected health data, as they require special accommodations on the part of the enterprise information technology (or increased security on the part of the research computing environment). Cloud computing, which is a concept that allows organizations to build complex infrastructures on leased resources, is rapidly evolving to the point that it is possible to build sophisticated network architectures with advanced security capabilities. We present a prototype infrastructure in Amazon’s Virtual Private Cloud to allow researchers and practitioners to utilize the data in a HIPAA-compliant environment. PMID:23485880
Mapping Urban Tree Canopy Cover Using Fused Airborne LIDAR and Satellite Imagery Data
NASA Astrophysics Data System (ADS)
Parmehr, Ebadat G.; Amati, Marco; Fraser, Clive S.
2016-06-01
Urban green spaces, particularly urban trees, play a key role in enhancing the liveability of cities. The availability of accurate and up-to-date maps of tree canopy cover is important for sustainable development of urban green spaces. LiDAR point clouds are widely used for the mapping of buildings and trees, and several LiDAR point cloud classification techniques have been proposed for automatic mapping. However, the effectiveness of point cloud classification techniques for automated tree extraction from LiDAR data can be impacted to the point of failure by the complexity of tree canopy shapes in urban areas. Multispectral imagery, which provides complementary information to LiDAR data, can improve point cloud classification quality. This paper proposes a reliable method for the extraction of tree canopy cover from fused LiDAR point cloud and multispectral satellite imagery data. The proposed method initially associates each LiDAR point with spectral information from the co-registered satellite imagery data. It calculates the normalised difference vegetation index (NDVI) value for each LiDAR point and corrects tree points which have been misclassified as buildings. Then, region growing of tree points, taking the NDVI value into account, is applied. Finally, the LiDAR points classified as tree points are utilised to generate a canopy cover map. The performance of the proposed tree canopy cover mapping method is experimentally evaluated on a data set of airborne LiDAR and WorldView 2 imagery covering a suburb in Melbourne, Australia.
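The NDVI correction step described above reads roughly as in the sketch below; the band arrays, class labels, and the 0.3 threshold are illustrative assumptions rather than the parameters used in the paper.

    import numpy as np

    def reclassify_with_ndvi(labels, nir, red, ndvi_thresh=0.3):
        """Correct 'building' points that are actually vegetation using per-point NDVI.

        labels : array of class labels ('building' or 'tree') for each LiDAR point
        nir    : near-infrared reflectance sampled from co-registered imagery
        red    : red-band reflectance sampled from co-registered imagery
        """
        ndvi = (nir - red) / (nir + red + 1e-9)
        labels = labels.copy()
        # points classified as building but spectrally vegetated become tree points
        labels[(labels == "building") & (ndvi > ndvi_thresh)] = "tree"
        return labels, ndvi

    labels = np.array(["building", "building", "tree", "building"])
    nir = np.array([0.45, 0.20, 0.50, 0.60])
    red = np.array([0.10, 0.18, 0.08, 0.12])
    print(reclassify_with_ndvi(labels, nir, red)[0])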
Image-Based Airborne LiDAR Point Cloud Encoding for 3d Building Model Retrieval
NASA Astrophysics Data System (ADS)
Chen, Yi-Chen; Lin, Chao-Hung
2016-06-01
With the development of Web 2.0 and cyber city modeling, an increasing number of 3D models have become available on web-based model-sharing platforms, with many applications such as navigation, urban planning, and virtual reality. Based on the concept of data reuse, a 3D model retrieval system is proposed to retrieve building models similar to a user-specified query. The basic idea behind this system is to reuse these existing 3D building models instead of reconstructing them from point clouds. To retrieve models efficiently, the models in databases are generally encoded compactly by using a shape descriptor. However, most of the geometric descriptors in related works are applied to polygonal models. In this study, the input query of the model retrieval system is a point cloud acquired by Light Detection and Ranging (LiDAR) systems, because of their efficient scene scanning and spatial information collection. Using point clouds, with their sparse, noisy, and incomplete sampling, as input queries is more difficult than using 3D models. Because the building roof is more informative than other parts of an airborne LiDAR point cloud, an image-based approach is proposed to encode both point clouds from input queries and 3D models in databases. The main goal of the data encoding is that the models in the database and the input point clouds can be consistently encoded. Firstly, top-view depth images of buildings are generated to represent the geometry surface of a building roof. Secondly, geometric features are extracted from the depth images based on the height, edges and planes of the building. Finally, descriptors are obtained from spatial histograms and used in the 3D model retrieval system. For data retrieval, the models are retrieved by matching the encoding coefficients of point clouds and building models. In the experiments, a database including about 900,000 3D models collected from the Internet is used for the evaluation of data retrieval. The results of the proposed method show a clear superiority over related methods.
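A minimal sketch of the top-view depth-image generation step is given below; the raster size, the max-height aggregation, and the synthetic input are illustrative choices, not the exact encoding used by the authors.

    import numpy as np

    def top_view_depth_image(points, size=64):
        """Rasterize an airborne LiDAR building patch into a top-view depth image.

        Each pixel stores the maximum normalized height of the points falling into it;
        empty pixels stay at 0.
        """
        xy, z = points[:, :2], points[:, 2]
        lo, hi = xy.min(axis=0), xy.max(axis=0)
        idx = np.floor((xy - lo) / (hi - lo + 1e-9) * (size - 1)).astype(int)
        heights = (z - z.min()) / (np.ptp(z) + 1e-9)
        img = np.zeros((size, size))
        for (i, j), h in zip(idx, heights):
            img[j, i] = max(img[j, i], h)          # keep the highest return per pixel
        return img

    pts = np.random.rand(5000, 3)
    print(top_view_depth_image(pts).shape)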
Observation of thermal quench induced by runaway electrons in magnetic perturbation
NASA Astrophysics Data System (ADS)
Cheon, MunSeong; Seo, Dongcheol; Kim, Junghee
2018-04-01
Experimental observations in Korea Superconducting Tokamak Advanced Research (KSTAR) plasmas show that a loss of pre-disruptive runaway electrons can induce a rapid radiative cooling of the plasma by generating impurity clouds from the first wall. The synchrotron radiation image shows that the loss of runaway electrons occurs from the edge region when the resonant magnetic perturbation is applied to the plasma. When the impact of the runaway electrons on the wall is strong enough, a sudden drop of the electron cyclotron emission (ECE) signal occurs, with characteristic plasma behaviors such as a positive spike and subsequent decay of the plasma current, a Dα spike, large magnetic fluctuations, etc. The visible images at this runaway loss show evidence of the generation of an impurity cloud and the subsequent radiative cooling. When the runaway beam is located at the plasma edge, thermal quenches are expected to occur without global destruction of the magnetic structure up to the core.
3D Building Façade Reconstruction Using Handheld Laser Scanning Data
NASA Astrophysics Data System (ADS)
Sadeghi, F.; Arefi, H.; Fallah, A.; Hahn, M.
2015-12-01
Three-dimensional building modelling has been an interesting topic of research for decades, and it seems that photogrammetric methods provide the only economic means to acquire truly 3D city data. With the enormous development of 3D building reconstruction for applications such as navigation systems, location-based services and urban planning, the need to consider semantic features (such as windows and doors) has become more essential than ever, and therefore a 3D model of buildings as blocks is no longer sufficient. To reconstruct the façade elements completely, we employed high-density point cloud data obtained from a handheld laser scanner. The advantage of the handheld laser scanner, with its capability of direct acquisition of very dense 3D point clouds, is that there is no need to derive three-dimensional data from multiple images using structure-from-motion techniques. This paper presents a grammar-based algorithm for façade reconstruction using handheld laser scanner data. The proposed method is a combination of bottom-up (data-driven) and top-down (model-driven) methods in which the basic façade elements are first extracted in a bottom-up way and then serve as prior knowledge for further processing to complete the models, especially in occluded and incomplete areas. The first step of the data-driven modelling is using a conditional RANSAC (RANdom SAmple Consensus) algorithm to detect the façade plane in the point cloud data and remove noisy objects like trees, pedestrians, traffic signs and poles. Then, the façade planes are divided into three depth layers to detect protrusion, indentation and wall points using a density histogram. Due to the inappropriate reflection of laser beams from glass, the windows appear as holes in the point cloud data and can therefore be distinguished and extracted easily from the point cloud compared to the other façade elements. The next step is rasterizing the indentation layer that holds the window and door information. After the rasterization process, morphological operators are applied in order to remove small irrelevant objects. Next, horizontal splitting lines are employed to determine floors and vertical splitting lines are employed to detect walls, windows, and doors. The window, door and wall elements, which are named terminals, are clustered during the classification process. Each terminal carries a width property. Among the terminals, windows and doors are named geometry tiles in the definition of the vocabulary of the grammar rules. Higher-order structures inferred by grouping the tiles result in the production rules. The rules, together with the three-dimensionally modelled façade elements, constitute a formal grammar named the façade grammar. This grammar holds all the information that is necessary to reconstruct façades in the style of the given building. Thus, it can be used to improve and complete façade reconstruction in areas with no or limited sensor data. Finally, a reconstructed 3D façade model is generated, whose geometric accuracy in size and position depends on the density of the raw point cloud.
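In its simplest form, the plane-detection step reduces to the classic RANSAC plane fit sketched below; the inlier threshold and iteration count are arbitrary, the synthetic data are ours, and the "conditional" constraints of the paper (e.g. façade verticality) are omitted.

    import numpy as np

    def ransac_plane(points, n_iter=200, thresh=0.02, seed=0):
        """Detect the dominant plane in a point cloud with plain RANSAC."""
        rng = np.random.default_rng(seed)
        best_inliers = np.zeros(len(points), dtype=bool)
        for _ in range(n_iter):
            sample = points[rng.choice(len(points), 3, replace=False)]
            normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
            norm = np.linalg.norm(normal)
            if norm < 1e-12:
                continue                            # degenerate (collinear) sample
            normal /= norm
            dist = np.abs((points - sample[0]) @ normal)
            inliers = dist < thresh
            if inliers.sum() > best_inliers.sum():
                best_inliers = inliers
        return best_inliers

    # synthetic facade: a vertical plane plus scattered clutter (trees, poles)
    rng = np.random.default_rng(1)
    plane = np.column_stack([rng.uniform(0, 10, 4000),
                             rng.normal(0, 0.01, 4000),
                             rng.uniform(0, 6, 4000)])
    clutter = rng.uniform([0, -2, 0], [10, 2, 6], (1000, 3))
    pts = np.vstack([plane, clutter])
    print(ransac_plane(pts).sum(), "facade inliers of", len(pts))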
Study of the transport parameters of cloud lightning plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Z. S.; Yuan, P.; Zhao, N.
2010-11-15
Three spectra of cloud lightning have been acquired in Tibet (China) using a slitless grating spectrograph. The electrical conductivity, the electron thermal conductivity, and the electron thermal diffusivity of the cloud lightning are calculated, for the first time, by applying the transport theory of air plasma. In addition, we investigate the behavior of these parameters (the temperature, the electron density, the electrical conductivity, the electron thermal conductivity, and the electron thermal diffusivity) along one of the cloud lightning channels. The result shows that these parameters decrease slightly along the developing direction of the cloud lightning channel. Moreover, they exhibit similar sudden changes at tortuous positions and at branches of the cloud lightning channel.
NASA Astrophysics Data System (ADS)
Connell, P. H.
2017-12-01
The University of Valencia has developed a software simulator, LEPTRACK, to simulate lepton and photon scattering in any kind of medium with variable density, permeated by electric/magnetic fields of any geometry, and able to handle an exponential runaway avalanche. Here we show results of simulating the interaction of electrons/positrons/photons in an incoming TeV cosmic ray shower with the kind of electric fields expected in a stormcloud after a CG discharge which removes much of the positive charge built up at the centre of the cloud. The point is to show not just a Relativistic Runaway Electron Avalanche (RREA) above the upper negative shielding layer at 12 km but also other gamma ray emission due to electron/positron interaction in the remaining positive charge around 9 km and the lower negative charge at 6 km altitude. We present here images, lightcurves, altitude profiles, spectra and videos showing the different ionization, excitation and photon density fields produced, their time evolution, and how they depend critically on where the cosmic ray shower beam intercepts the electric field geometry. We also show a new effect of incoming positrons, which make up a significant fraction of the shower, where they appear to "orbit" within the high-altitude negative shielding layer, which has been conjectured to produce significant microwave emission, as well as a short-range 511 keV annihilation line. The interesting question is whether this conjectured emission can be observed and correlated with TGF orbital observations to prove that a TGF originates in the macro-fields of stormclouds or the micro-fields of lightning leaders and streamers, where this "positron orbiting" is not likely to occur.
Meteoric Metal Chemistry in the Martian Atmosphere
Carrillo‐Sanchez, J. D.; Mangan, T. P.; Crismani, M. M. J.; Schneider, N. M.; Määttänen, A.
2018-01-01
Abstract Recent measurements by the Imaging Ultraviolet Spectrograph (IUVS) instrument on NASA's Mars Atmosphere and Volatile EvolutioN mission show that a persistent layer of Mg+ ions occurs around 90 km in the Martian atmosphere but that neutral Mg atoms are not detectable. These observations can be satisfactorily modeled with a global meteoric ablation rate of 0.06 t sol^-1, out of a cosmic dust input of 2.7 ± 1.6 t sol^-1. The absence of detectable Mg at 90 km requires that at least 50% of the ablating Mg atoms ionize through hyperthermal collisions with CO2 molecules. Dissociative recombination of MgO+.(CO2)n cluster ions with electrons to produce MgCO3 directly, rather than MgO, also avoids a buildup of Mg to detectable levels. The meteoric injection rate of Mg, Fe, and other metals—constrained by the IUVS measurements—enables the production rate of metal carbonate molecules (principally MgCO3 and FeCO3) to be determined. These molecules have very large electric dipole moments (11.6 and 9.2 Debye, respectively) and thus form clusters with up to six H2O molecules at temperatures below 150 K. These clusters should then coagulate efficiently, building up metal carbonate-rich ice particles which can act as nucleating particles for the formation of CO2-ice clouds. Observable mesospheric clouds are predicted to occur between 65 and 80 km at temperatures below 95 K and above 85 km at temperatures about 5 K colder. PMID:29780678
NASA Astrophysics Data System (ADS)
Casula, Giuseppe; Fais, Silvana; Giovanna Bianchi, Maria; Cuccuru, Francesco; Ligas, Paola
2015-04-01
The Terrestrial Laser Scanner (TLS) is a modern contactless non-destructive technique (NDT) useful to 3D-model complex-shaped objects with a few hours' field survey. A TLS survey produces very dense point clouds made up of point coordinates and radiometric information given by the reflectivity parameter, i.e. the ratio between the amount of energy emitted by the sensor and the energy reflected by the target object. Modern TLSs used in architecture are phase instruments, where the phase difference obtained by comparing the emitted laser pulse with the reflected one is proportional to the sensor-target distance, expressed as an integer multiple of the half laser wavelength. TLS data are processed by registering point clouds, i.e. by referring them to the same reference frame, and by aggregation after a fine registration procedure. The resulting aggregate point cloud can be compared with graphic primitives such as single or multiple planes, cylinders or spheres, and the resulting residuals give a morphological map that affords information about the state of conservation of the building materials used in historical or modern buildings, in particular when compared with other NDT techniques. In spite of its great productivity, the TLS technique is limited in that it is unable to penetrate the investigated materials. For this reason both the 3D residuals map and the reflectivity map need to be correlated with the results of other NDT techniques such as the ultrasonic method, and a complex study of the composition of the building materials is also necessary. This study presents a methodology useful to evaluate the quality of stone building materials and locate altered or damaged zones, based on the integrated application of three independent techniques: two non-destructive ones, the TLS and the ultrasonic technique in the 24-54 kHz range, and a third to analyze the petrographical characteristics of the stone materials, mainly the texture, with optical and scanning electron microscopy (SEM). A very interesting case study is presented on a carbonate stone door of great architectural and historical interest, well suited to a high-definition survey. This architectural element is inside the "Palazzo di Città" museum in the historical center of the Town of Cagliari, Sardinia (Italy). The integrated application of TLS and in situ and laboratory ultrasonic techniques, enhanced by knowledge of the petrographic characteristics of the rocks, improves the diagnostic process and affords reliable information on the state of conservation of the stones used to build it. The integrated use of the above non-destructive techniques also provides suitable data for a possible restoration and future preservation. Acknowledgments: This work was financially supported by the Sardinian Local Administration (RAS - LR 7, August 2007, n.7, Promotion of Scientific Research and Innovation in Sardinia - Italy, Responsible Scientist: S. Fais).
NASA Astrophysics Data System (ADS)
Michele, Mangiameli; Giuseppe, Mussumeci; Salvatore, Zito
2017-07-01
The Structure From Motion (SFM) is a technique applied to a series of photographs of an object that returns a 3D reconstruction made up of points in space (a point cloud). This research aims at comparing the results of the SFM approach with the results of 3D laser scanning in terms of density and accuracy of the model. The experience was conducted by surveying several architectural elements (walls and portals of historical buildings) both with a 3D laser scanner of the latest generation and with an amateur photographic camera. The point clouds acquired by the laser scanner and those acquired by the photo camera have been systematically compared. In particular, we present the experience carried out on the "Don Diego Pappalardo Palace" site in Pedara (Catania, Sicily).
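A common way to compare an SFM cloud against a laser-scan reference, closest-point distances, can be sketched with a k-d tree; this is our illustration of such a comparison, not the authors' exact procedure, and the synthetic clouds stand in for real survey data.

    import numpy as np
    from scipy.spatial import cKDTree

    def cloud_to_cloud_distances(test_cloud, reference_cloud):
        """Nearest-neighbour distance from each SFM point to the laser-scan reference."""
        tree = cKDTree(reference_cloud)
        dist, _ = tree.query(test_cloud, k=1)
        return dist

    rng = np.random.default_rng(0)
    tls = rng.uniform(0, 5, (20000, 3))                    # stand-in for the laser scan
    sfm = tls[:5000] + rng.normal(0, 0.005, (5000, 3))     # noisier, sparser SFM cloud
    d = cloud_to_cloud_distances(sfm, tls)
    print(f"mean = {d.mean()*1000:.1f} mm, 95th percentile = {np.percentile(d, 95)*1000:.1f} mm")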
Accessing Cloud Properties and Satellite Imagery: A tool for visualization and data mining
NASA Astrophysics Data System (ADS)
Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Palikonda, R.
2016-12-01
Providing public access to imagery of cloud macro- and microphysical properties and the underlying satellite imagery is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a tool and system that allow end users to easily browse cloud information and satellite imagery that are otherwise difficult to acquire and manipulate. The tool has two uses: one to visualize the data and the other to access the data directly. It uses a widely adopted access protocol, the Open Geospatial Consortium's Web Map and Processing Services, to encourage users to access the data we produce. Internally, we leverage our practical experience with large, scalable applications to develop a system that has the greatest potential for scalability as well as the ability to be deployed on the cloud. One goal of the tool is to provide a demonstration of the back-end capability to end users so that they can use the dynamically generated imagery and data as an input to their own workflows or to set up data mining constraints. We build upon the NASA Langley Cloud and Radiation Group's experience in making real-time and historical satellite cloud product information and satellite imagery accessible and easily searchable. Increasingly, information is used in a "mash-up" form where multiple sources of information are combined to add value to disparate but related information. In support of NASA strategic goals, our group aims to make as much cutting-edge scientific knowledge, observations and products as possible available to the citizen science, research and other interested communities for these kinds of "mash-ups", as well as to provide a means for automated systems to data mine our information. This tool and access method provide a valuable research resource for a wide audience, both as a standalone research tool and as an easily accessed data source that can be mined or used with existing tools.
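Because the service follows the OGC Web Map Service interface, a client request can be built with ordinary HTTP; the endpoint URL, layer name, and request parameters below are placeholders for illustration, not the group's actual service address.

    import requests

    # Placeholder endpoint and layer name -- substitute the real service address.
    WMS_URL = "https://example.nasa.gov/cloud-products/wms"

    params = {
        "service": "WMS", "version": "1.3.0", "request": "GetMap",
        "layers": "cloud_effective_temperature",       # hypothetical layer name
        "crs": "EPSG:4326", "bbox": "-60,-180,60,180",
        "width": 1024, "height": 512, "format": "image/png",
        "time": "2016-08-01T12:00:00Z",
    }

    response = requests.get(WMS_URL, params=params, timeout=60)
    with open("cloud_product.png", "wb") as f:
        f.write(response.content)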
Probabilistic Feasibility of the Reconstruction Process of Russian-Orthodox Churches
NASA Astrophysics Data System (ADS)
Chizhova, M.; Brunn, A.; Stilla, U.
2016-06-01
The cultural heritage of humanity is important for the identity of following generations and has to be preserved in a suitable manner. In the course of time a lot of information about former cultural constructions has been lost, because some objects were strongly damaged by natural erosion or human activity, or were even destroyed. It is important to capture the still available parts of former buildings, mostly ruins. These data could be the basis for a virtual reconstruction. Laser scanning offers in principle the possibility to capture the surfaces of buildings extensively in their actual state. In this paper we assume a priori given 3D laser scanner data, a 3D point cloud of the partly destroyed church. There are many well-known algorithms that describe different methods for the extraction and detection of geometric primitives, which are recognized separately in 3D point clouds. In our work we put them in a common probabilistic framework, which guides the complete reconstruction process of complex buildings, in our case Russian-Orthodox churches. Churches are modeled with their functional volumetric components, enriched with a priori known probabilities, which are deduced from a database of Russian-Orthodox churches. Each set of components represents a complete church. The power of the new method is shown for a simulated dataset of 100 Russian-Orthodox churches.
NASA Astrophysics Data System (ADS)
Murata, K. T.
2014-12-01
Data-intensive or data-centric science is the 4th paradigm, after observational and/or experimental science (1st paradigm), theoretical science (2nd paradigm) and numerical science (3rd paradigm). A science cloud is an infrastructure for this 4th methodology of science. The NICT Science Cloud is designed for the big data sciences of Earth, space and other fields, based on modern informatics and information technologies [1]. Data flow on the cloud through three techniques: (1) data crawling and transfer, (2) data preservation and stewardship, and (3) data processing and visualization. Original tools and applications for these techniques have been designed and implemented. We mash up these tools and applications on the NICT Science Cloud to build customized systems for each project. In this paper, we discuss science data processing through these three steps. For big data science, data file deployment on a distributed storage system should be well designed in order to save storage cost and transfer time. We developed a high-bandwidth virtual remote storage system (HbVRS) and data crawling tools, NICTY/DLA and the Wide-area Observation Network Monitoring (WONM) system, respectively. Data files are saved on the cloud storage system according to both the data preservation policy and the data processing plan. The storage system is built on distributed file system middleware (Gfarm: GRID datafarm). It is effective since disaster recovery (DR) and parallel data processing are carried out simultaneously without moving these big data from storage to storage. Data files are managed through our web application, WSDBank (World Science Data Bank). The big data on the cloud are processed via Pwrake, which is a workflow tool with high-bandwidth I/O. There are several visualization tools on the cloud: VirtualAurora for the magnetosphere and ionosphere, VDVGE for Google Earth, STICKER for urban environment data and STARStouch for multi-disciplinary data. There are 30 projects running on the NICT Science Cloud for Earth and space science. In 2003, 56 refereed papers were published. Finally, we introduce a couple of successful results of Earth and space science obtained using these three techniques on the NICT Science Cloud. [1] http://sc-web.nict.go.jp
Patient-Centered e-Health Record over the Cloud.
Koumaditis, Konstantinos; Themistocleous, Marinos; Vassilacopoulos, George; Prentza, Andrianna; Kyriazis, Dimosthenis; Malamateniou, Flora; Maglaveras, Nicos; Chouvarda, Ioanna; Mourouzis, Alexandros
2014-01-01
The purpose of this paper is to introduce the conceptual aspects of Patient-Centered e-Health (PCEH) alongside a multidisciplinary project that combines state-of-the-art technologies like cloud computing. The project, by combining several aspects of PCEH, such as (a) the electronic Personal Healthcare Record (e-PHR), (b) homecare telemedicine technologies, and (c) e-prescribing, e-referral and e-learning, with advanced technologies like cloud computing and Service Oriented Architecture (SOA), will lead to an innovative integrated e-health platform with many benefits for society, the economy, industry, and the research community. To achieve this, a consortium of experts, both from industry (two companies, one hospital and one healthcare organization) and academia (three universities), was set up to investigate, analyse, design, build and test the new platform. This paper provides insights into the PCEH concept and into the current stage of the project. In doing so, we aim to increase awareness of this important endeavor and to share the lessons learned so far in our work.
Building Facade Modeling Under Line Feature Constraint Based on Close-Range Images
NASA Astrophysics Data System (ADS)
Liang, Y.; Sheng, Y. H.
2018-04-01
To solve existing problems in modeling building façades merely with point features from close-range images, a new method for modeling building façades under a line feature constraint is proposed in this paper. Firstly, camera parameters and a sparse spatial point cloud were recovered using SFM, and dense 3D point clouds were generated with MVS. Secondly, line features were detected based on the gradient direction; the detected line features were fitted considering directions and lengths, then matched under multiple types of constraints and extracted from the multi-image sequence. Finally, the façade mesh of a building was triangulated from the point cloud and line features. The experiment shows that this method can effectively reconstruct the geometric façade of buildings by exploiting the advantages of combining point and line features from a close-range image sequence, especially in recovering the contour information of building façades.
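The line-feature extraction step can be approximated with an off-the-shelf detector; the sketch below uses OpenCV's probabilistic Hough transform rather than the gradient-direction grouping described in the paper, and the file name and thresholds are placeholders.

    import cv2
    import numpy as np

    def detect_facade_lines(image_path, min_len=40):
        """Detect straight line segments in a close-range facade image."""
        gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        if gray is None:
            raise FileNotFoundError(image_path)
        edges = cv2.Canny(gray, 50, 150)
        # probabilistic Hough transform; thresholds are placeholder values
        lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                                minLineLength=min_len, maxLineGap=5)
        return [] if lines is None else lines.reshape(-1, 4)

    segments = detect_facade_lines("facade_view_01.jpg")   # placeholder file name
    print(len(segments), "line segments detected")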
Data provenance assurance in the cloud using blockchain
NASA Astrophysics Data System (ADS)
Shetty, Sachin; Red, Val; Kamhoua, Charles; Kwiat, Kevin; Njilla, Laurent
2017-05-01
Ever-increasing adoption of cloud technology scales up activities like the creation, exchange, and alteration of cloud data objects, which creates challenges in tracking malicious activities and security violations. Addressing this issue requires the implementation of a data provenance framework so that each data object in the federated cloud environment can be tracked and recorded but cannot be modified. Blockchain technology offers a promising decentralized platform to build tamper-proof systems; its incorruptible distributed ledger complements the need for maintaining cloud data provenance. In this paper, we present a cloud-based data provenance framework using blockchain, which traces data record operations and generates provenance data. We anchor provenance data records into blockchain transactions, which provide validation of the provenance data and preserve user privacy at the same time. Once the provenance data are uploaded to the global blockchain network, it is extremely challenging to tamper with them. Besides, the provenance data use hashed user identifiers prior to uploading, so the blockchain nodes cannot link the operations to a particular user. The framework thus ensures that privacy is preserved. We implemented the architecture on ownCloud, uploaded records to the blockchain network, stored records in a provenance database, and developed a prototype in the form of a web service.
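The anchoring of provenance records can be illustrated with a plain hash chain and salted user-identifier hashing; this is a toy analogue of the described framework, not its ownCloud/blockchain implementation, and the salt value is a placeholder.

    import hashlib
    import json
    import time

    def hash_user(user_id, salt="per-deployment-secret"):     # placeholder salt
        """Pseudonymize the user so chain nodes cannot link operations to a person."""
        return hashlib.sha256((salt + user_id).encode()).hexdigest()

    def append_provenance(chain, user_id, operation, object_id):
        """Append a tamper-evident provenance record linked to the previous one."""
        prev_hash = chain[-1]["hash"] if chain else "0" * 64
        record = {"user": hash_user(user_id), "op": operation, "object": object_id,
                  "time": time.time(), "prev": prev_hash}
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        chain.append(record)
        return record

    chain = []
    append_provenance(chain, "alice", "create", "report.docx")
    append_provenance(chain, "bob", "modify", "report.docx")
    print(chain[1]["prev"] == chain[0]["hash"])     # True: records are chained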
Beam Tests of Diamond-Like Carbon Coating for Mitigation of Electron Cloud
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eldred, Jeffrey; Backfish, Michael; Kato, Shigeki
Electron cloud beam instabilities are an important consideration in virtually all high-energy particle accelerators and could pose a formidable challenge to forthcoming high-intensity accelerator upgrades. Our results evaluate the efficacy of a diamond-like carbon (DLC) coating for the mitigation of electron cloud in the Fermilab Main Injector. The interior surface of the beampipe conditions in response to electron bombardment from the electron cloud, and we track the change in electron cloud flux over time in the DLC-coated beampipe and the uncoated stainless steel beampipe. The electron flux is measured by retarding field analyzers placed in a field-free region of the Main Injector. We find the DLC coating reduces the electron cloud signal to roughly 2% of that measured in the uncoated stainless steel beampipe.
NASA Astrophysics Data System (ADS)
Wang, Jinxia; Dou, Aixia; Wang, Xiaoqing; Huang, Shusong; Yuan, Xiaoxiang
2016-11-01
Compared to remote sensing imagery, post-earthquake airborne Light Detection And Ranging (LiDAR) point cloud data contain high-precision three-dimensional information on earthquake damage, which can improve the accuracy of identifying destroyed buildings. However, after an earthquake, damaged buildings show so many different characteristics that the most commonly used pre-processing methods currently cannot distinguish between tree points and damaged-building points. In this study, we analyse the number of returns per emitted pulse for tree and damaged-building point clouds and explore methods to distinguish between them. We propose a new method that searches a fixed-size spatial neighbourhood around each point and calculates the ratio (R) of neighbourhood points whose number of returns per pulse is greater than 1, in order to separate trees from buildings. We select point clouds of typical undamaged buildings, collapsed buildings and trees as samples, by means of human-computer interaction, from the airborne LiDAR point cloud data acquired after the 2010 MW 7.0 Haiti earthquake. From these samples we determine the R-value threshold that distinguishes between trees and buildings and apply it to test areas. The experimental results show that the proposed method can distinguish between building points (undamaged and damaged) and tree points effectively, but it is limited in areas where the buildings are varied, the damage is complex and the trees are dense, so the method will necessarily be improved.
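The proposed R statistic can be sketched directly with a k-nearest-neighbour search; the neighbourhood size and the decision threshold below are placeholders, since the paper derives them from training samples.

    import numpy as np
    from scipy.spatial import cKDTree

    def returns_ratio(points, num_returns, k=30):
        """For each point, the fraction R of its k neighbours with more than one return."""
        tree = cKDTree(points)
        _, idx = tree.query(points, k=k)
        return (num_returns[idx] > 1).mean(axis=1)

    def separate_trees(points, num_returns, r_thresh=0.5, k=30):
        """Label points as tree (True) or building (False) by thresholding R."""
        return returns_ratio(points, num_returns, k) > r_thresh   # placeholder threshold

    rng = np.random.default_rng(0)
    pts = rng.uniform(0, 50, (2000, 3))
    nret = rng.integers(1, 4, 2000)        # synthetic number-of-returns values
    print(separate_trees(pts, nret).sum(), "points labelled as tree")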
Holtzapple, R. L.; Billing, M. G.; Campbell, R. C.; ...
2016-04-11
Electron cloud related emittance dilution and instabilities of bunch trains limit the performance of high intensity circular colliders. One of the key goals of the Cornell electron-positron storage ring Test Accelerator (CesrTA) research program is to improve our understanding of how the electron cloud alters the dynamics of bunches within the train. Single bunch beam diagnostics have been developed to measure the beam spectra and vertical beam size, two important dynamical effects of beams interacting with the electron cloud, for bunch trains on a turn-by-turn basis. Experiments have been performed at CesrTA to probe the interaction of the electron cloud with stored positron bunch trains. The purpose of these experiments was to characterize the dependence of beam-electron cloud interactions on machine parameters such as bunch spacing, vertical chromaticity, and bunch current. The beam dynamics of the stored beam, in the presence of the electron cloud, was quantified using: 1) a gated beam position monitor (BPM) and spectrum analyzer to measure the bunch-by-bunch frequency spectrum of the bunch trains, 2) an x-ray beam size monitor to record the bunch-by-bunch, turn-by-turn vertical size of each bunch within the trains. In this study we report on the observations from these experiments and analyze the effects of the electron cloud on the stability of bunches in a train under many different operational conditions.
NASA Astrophysics Data System (ADS)
Holtzapple, R. L.; Billing, M. G.; Campbell, R. C.; Dugan, G. F.; Flanagan, J.; McArdle, K. E.; Miller, M. I.; Palmer, M. A.; Ramirez, G. A.; Sonnad, K. G.; Totten, M. M.; Tucker, S. L.; Williams, H. A.
2016-04-01
Electron cloud related emittance dilution and instabilities of bunch trains limit the performance of high intensity circular colliders. One of the key goals of the Cornell electron-positron storage ring Test Accelerator (CesrTA) research program is to improve our understanding of how the electron cloud alters the dynamics of bunches within the train. Single bunch beam diagnostics have been developed to measure the beam spectra and vertical beam size, two important dynamical effects of beams interacting with the electron cloud, for bunch trains on a turn-by-turn basis. Experiments have been performed at CesrTA to probe the interaction of the electron cloud with stored positron bunch trains. The purpose of these experiments was to characterize the dependence of beam-electron cloud interactions on machine parameters such as bunch spacing, vertical chromaticity, and bunch current. The beam dynamics of the stored beam, in the presence of the electron cloud, was quantified using: 1) a gated beam position monitor (BPM) and spectrum analyzer to measure the bunch-by-bunch frequency spectrum of the bunch trains; 2) an x-ray beam size monitor to record the bunch-by-bunch, turn-by-turn vertical size of each bunch within the trains. In this paper we report on the observations from these experiments and analyze the effects of the electron cloud on the stability of bunches in a train under many different operational conditions.
Electron Cloud Measurements in Fermilab Main Injector and Recycler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eldred, Jeffrey Scott; Backfish, M.; Tan, C. Y.
This conference paper presents a series of electron cloud measurements in the Fermilab Main Injector and Recycler. A new instability was observed in the Recycler in July 2014 that generates a fast transverse excitation in the first high intensity batch to be injected. Microwave measurements of electron cloud in the Recycler show a corresponding dependence on the batch injection pattern. These electron cloud measurements are compared to those made with a retarding field analyzer (RFA) installed in a field-free region of the Recycler in November. RFAs are also used in the Main Injector to evaluate the performance of beampipe coatings for the mitigation of electron cloud. Contamination from an unexpected vacuum leak revealed a potential vulnerability in the amorphous carbon beampipe coating. The diamond-like carbon coating, in contrast, reduced the electron cloud signal to 1% of that measured in uncoated stainless steel beampipe.
Reading in the Clouds: Building a Library at a School in India.
ERIC Educational Resources Information Center
Khalsa, Gurupreet K.
2000-01-01
Describes how, over three years, teachers, students, parents, and administrators of a girls' school in India built a school library, going from a small locked-up collection with no check-out system to a warm and busy library room that was a popular hub of activity at the school, with a dynamic and expanding collection and a full-time librarian.…
Bipolar H II regions produced by cloud-cloud collisions
NASA Astrophysics Data System (ADS)
Whitworth, Anthony; Lomax, Oliver; Balfour, Scott; Mège, Pierre; Zavagno, Annie; Deharveng, Lise
2018-05-01
We suggest that bipolar H II regions may be the aftermath of collisions between clouds. Such a collision will produce a shock-compressed layer, and a star cluster can then condense out of the dense gas near the center of the layer. If the clouds are sufficiently massive, the star cluster is likely to contain at least one massive star, which emits ionizing radiation, and excites an H II region, which then expands, sweeping up the surrounding neutral gas. Once most of the matter in the clouds has accreted onto the layer, expansion of the H II region meets little resistance in directions perpendicular to the midplane of the layer, and so it expands rapidly to produce two lobes of ionized gas, one on each side of the layer. Conversely, in directions parallel to the midplane of the layer, expansion of the H II region stalls due to the ram pressure of the gas that continues to fall towards the star cluster from the outer parts of the layer; a ring of dense neutral gas builds up around the waist of the bipolar H II region, and may spawn a second generation of star formation. We present a dimensionless model for the flow of ionized gas in a bipolar H II region created according to the above scenario, and predict the characteristics of the resulting free-free continuum and recombination-line emission. This dimensionless model can be scaled to the physical parameters of any particular system. Our intention is that these predictions will be useful in testing the scenario outlined above, and thereby providing indirect support for the role of cloud-cloud collisions in triggering star formation.
Analysis of 3d Building Models Accuracy Based on the Airborne Laser Scanning Point Clouds
NASA Astrophysics Data System (ADS)
Ostrowski, W.; Pilarska, M.; Charyton, J.; Bakuła, K.
2018-05-01
Creating 3D building models at large scale is becoming more popular and finds many applications. Nowadays, the broad term "3D building models" can be applied to several types of products: the well-known CityGML solid models (available at several Levels of Detail), which are mainly generated from Airborne Laser Scanning (ALS) data, as well as 3D mesh models that can be created from both nadir and oblique aerial images. City authorities and national mapping agencies are interested in obtaining 3D building models. Apart from the completeness of the models, the accuracy aspect is also important. The final accuracy of a building model depends on various factors (accuracy of the source data, complexity of the roof shapes, etc.). In this paper a methodology for the inspection of datasets containing 3D models is presented. The proposed approach checks every building in the dataset against ALS point clouds, testing both accuracy and level of detail. Analysing statistical parameters of the normal heights between the reference point cloud and the tested planes, together with segmentation of the point cloud, provides a tool that can indicate which buildings and which roof planes do not fulfil the requirements of model accuracy and detail correctness. The proposed method was tested on two datasets: a solid model and a mesh model.
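A minimal sketch of the kind of per-plane check the abstract describes: signed point-to-plane distances of ALS points against a modelled roof plane, summarized by simple statistics. The tolerance and the way points are assigned to a plane are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np

def plane_residuals(points, plane_point, plane_normal):
    """Signed distances of ALS points (N x 3 array) to the plane through `plane_point`."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return (points - plane_point) @ n

def check_roof_plane(points, plane_point, plane_normal, tolerance=0.15):
    """Summarize residuals and flag the plane if it exceeds an assumed accuracy tolerance."""
    d = plane_residuals(points, plane_point, plane_normal)
    stats = {"mean": d.mean(), "std": d.std(), "rmse": float(np.sqrt((d ** 2).mean()))}
    stats["accepted"] = abs(stats["mean"]) < tolerance and stats["rmse"] < tolerance
    return stats
```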
High fidelity 3-dimensional models of beam-electron cloud interactions in circular accelerators
NASA Astrophysics Data System (ADS)
Feiz Zarrin Ghalam, Ali
Electron cloud is a low-density electron profile created inside the vacuum chamber of circular machines with positively charged beams. The electron cloud limits the peak current of the beam and degrades the beam quality through luminosity degradation, emittance growth, and head-tail or bunch-to-bunch instability. The adverse effects of electron cloud on long-term beam dynamics become increasingly important as beams go to higher and higher energies. This problem has become a major concern in the design of many future circular machines, such as the Large Hadron Collider (LHC) under construction at the European Center for Nuclear Research (CERN). Due to the importance of the problem, several simulation models have been developed to model the long-term beam-electron cloud interaction. These models are based on the "single kick approximation," where the electron cloud is assumed to be concentrated in one thin slab around the ring. While this model is efficient in terms of computational cost, it does not reflect the real physical situation, as the forces from the electron cloud on the beam are non-linear, contrary to this model's assumption. To address the limitations of the existing codes, a new model is developed in this thesis to model the beam-electron cloud interaction continuously. The code is derived from a 3-D parallel Particle-In-Cell (PIC) model (QuickPIC) originally used for plasma wakefield acceleration research. To make the original model fit the circular machine environment, betatron and synchrotron equations of motion have been added to the code, and the effects of chromaticity and the lattice structure have been included. QuickPIC is then benchmarked against one of the codes developed based on the single kick approximation (HEAD-TAIL) for the transverse spot size of the beam in the CERN-LHC. The growth predicted by QuickPIC is less than the one predicted by HEAD-TAIL. The code is then used to investigate the effect of electron cloud image charges on the long-term beam dynamics, particularly on the transverse tune shift of the beam at the CERN Super Proton Synchrotron (SPS) ring. The force from the electron cloud image charges on the beam cancels the force due to the cloud compression formed on the beam axis, and therefore the tune shift is mainly due to the uniform electron cloud density. (Abstract shortened by UMI.)
Consolidation of cloud computing in ATLAS
NASA Astrophysics Data System (ADS)
Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration
2017-10-01
Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.
PIONEER VENUS 2 MULTI-PROBE PARACHUTE TESTS IN THE VEHICLE ASSEMBLY BUILDING
NASA Technical Reports Server (NTRS)
1975-01-01
A parachute system, designed to carry an instrument-laden probe down through the dense atmosphere of torrid, cloud-shrouded Venus, was tested in KSC's Vehicle Assembly Building. The tests are in preparation for a Pioneer multi-probe mission to Venus scheduled for launch from KSC in 1978. Full-scale (12-foot diameter) parachutes with simulated pressure vessels weighing up to 45 pounds were dropped from heights of up to 450 feet to the floor of the VAB, where the impact was cushioned by a honeycomb cardboard impact arrestor. The VAB offers an ideal, wind-free testing facility at no additional construction cost and was used for similar tests of the parachute system for the twin Viking spacecraft scheduled for launch toward Mars in August.
Frequency distributions and correlations of solar X-ray flare parameters
NASA Technical Reports Server (NTRS)
Crosby, Norma B.; Aschwanden, Markus J.; Dennis, Brian R.
1993-01-01
Frequency distributions of flare parameters are determined from over 12,000 solar flares. The flare duration, the peak counting rate, the peak hard X-ray flux, the total energy in electrons, and the peak energy flux in electrons are among the parameters studied. Linear regression fits, as well as the slopes of the frequency distributions, are used to determine the correlations between these parameters. The relationship between the variations of the frequency distributions and the solar activity cycle is also investigated. Theoretical models for the frequency distribution of flare parameters are dependent on the probability of flaring and the temporal evolution of the flare energy build-up. The results of this study are consistent with stochastic flaring and exponential energy build-up. The average build-up time constant is found to be 0.5 times the mean time between flares.
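For readers unfamiliar with how such slopes are usually extracted, the sketch below fits a power-law index to a frequency distribution by linear regression in log-log space. The binning choice is an assumption; the paper's exact fitting procedure is not reproduced here.

```python
import numpy as np

def frequency_distribution_slope(values, n_bins=30):
    """Fit N(x) ~ x**a to a histogram of positive flare parameters; returns the index a."""
    values = np.asarray(values, float)
    bins = np.logspace(np.log10(values.min()), np.log10(values.max()), n_bins)
    counts, edges = np.histogram(values, bins=bins)
    centres = np.sqrt(edges[:-1] * edges[1:])          # geometric bin centres
    good = counts > 0
    a, _ = np.polyfit(np.log10(centres[good]), np.log10(counts[good]), 1)
    return a
```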
NASA Astrophysics Data System (ADS)
Pithan, Felix; Ackerman, Andrew; Angevine, Wayne M.; Hartung, Kerstin; Ickes, Luisa; Kelley, Maxwell; Medeiros, Brian; Sandu, Irina; Steeneveld, Gert-Jan; Sterk, H. A. M.; Svensson, Gunilla; Vaillancourt, Paul A.; Zadra, Ayrton
2016-09-01
Weather and climate models struggle to represent lower tropospheric temperature and moisture profiles and surface fluxes in Arctic winter, partly because they lack or misrepresent physical processes that are specific to high latitudes. Observations have revealed two preferred states of the Arctic winter boundary layer. In the cloudy state, cloud liquid water limits surface radiative cooling, and temperature inversions are weak and elevated. In the radiatively clear state, strong surface radiative cooling leads to the build-up of surface-based temperature inversions. Many large-scale models lack the cloudy state, and some substantially underestimate inversion strength in the clear state. Here, the transformation from a moist to a cold dry air mass is modeled using an idealized Lagrangian perspective. The trajectory includes both boundary layer states, and the single-column experiment is the first Lagrangian Arctic air formation experiment (Larcform 1) organized within GEWEX GASS (Global atmospheric system studies). The intercomparison reproduces the typical biases of large-scale models: some models lack the cloudy state of the boundary layer due to the representation of mixed-phase microphysics or to the interaction between micro- and macrophysics. In some models, high emissivities of ice clouds or the lack of an insulating snow layer prevent the build-up of surface-based inversions in the radiatively clear state. Models substantially disagree on the amount of cloud liquid water in the cloudy state and on turbulent heat fluxes under clear skies. Observations of air mass transformations including both boundary layer states would allow for a tighter constraint of model behavior.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pauly, Tyler; Garrod, Robin T., E-mail: tap74@cornell.edu
Computational models of interstellar gas-grain chemistry have historically adopted a single dust-grain size of 0.1 micron, assumed to be representative of the size distribution present in the interstellar medium. Here, we investigate the effects of a broad grain-size distribution on the chemistry of dust-grain surfaces and the subsequent build-up of molecular ices on the grains, using a three-phase gas-grain chemical model of a quiescent dark cloud. We include an explicit treatment of the grain temperatures, governed both by the visual extinction of the cloud and the size of each individual grain-size population. We find that the temperature difference plays a significant role in determining the total bulk ice composition across the grain-size distribution, while the effects of geometrical differences between size populations appear marginal. We also consider collapse from a diffuse to a dark cloud, allowing dust temperatures to fall. Under the initial diffuse conditions, small grains are too warm to promote grain-mantle build-up, with most ices forming on the mid-sized grains. As collapse proceeds, the more abundant, smallest grains cool and become the dominant ice carriers; the large population of small grains means that this ice is distributed across many grains, with perhaps no more than 40 monolayers of ice each (versus several hundred assuming a single grain size). This effect may be important for the subsequent processing and desorption of the ice during the hot-core phase of star formation, exposing a significant proportion of the ice to the gas phase, increasing the importance of ice-surface chemistry and surface–gas interactions.
National electronic medical records integration on cloud computing system.
Mirza, Hebah; El-Masri, Samir
2013-01-01
Few healthcare providers have an advanced level of Electronic Medical Record (EMR) adoption. Others have a low level, and most have no EMR at all. Cloud computing is a newly emerging technology that has been used in other industries with great success. Despite its attractive features, cloud computing has not yet been widely utilized in the healthcare industry. This study presents an innovative healthcare cloud computing system for integrating Electronic Health Records (EHR). The proposed cloud system applies cloud computing technology to the EHR system to present a comprehensive EHR integrated environment.
Building Reflection with Word Clouds for Online RN to BSN Students.
Volkert, Delene R
Reflection allows students to integrate learning with their personal context, developing deeper knowledge and promoting critical thinking. Word clouds help students develop themes/concepts beyond traditional methods, introducing visual aspects to an online learning environment. Students created word clouds and captions, then responded to those created by peers for a weekly discussion assignment. Students indicated overwhelming support for the use of word clouds to develop deeper understanding of the subject matter. This reflection assignment could be utilized in asynchronous, online undergraduate nursing courses for creative methods of building reflection and developing knowledge for the undergraduate RN to BSN student.
Fast Transverse Instability and Electron Cloud Measurements in Fermilab Recycler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eldred, Jeffery; Adamson, Philip; Capista, David
2015-03-01
A new transverse instability is observed that may limit the proton intensity in the Fermilab Recycler. The instability is fast, leading to a beam-abort loss within two hundred turns. The instability primarily affects the first high-intensity batch from the Fermilab Booster in each Recycler cycle. This paper analyzes the dynamical features of the destabilized beam. The instability excites a horizontal betatron oscillation which couples into the vertical motion and also causes transverse emittance growth. This paper describes the feasibility of electron cloud as the mechanism for this instability and presents the first measurements of the electron cloud in the Fermilab Recycler. Direct measurements of the electron cloud are made using a retarding field analyzer (RFA) newly installed in the Fermilab Recycler. Indirect measurements of the electron cloud are made by propagating a microwave carrier signal through the beampipe and analyzing the phase modulation of the signal. The maximum betatron amplitude growth and the maximum electron cloud signal occur during minimums of the bunch length oscillation.
NASA Astrophysics Data System (ADS)
Bonduel, M.; Bassier, M.; Vergauwen, M.; Pauwels, P.; Klein, R.
2017-11-01
The use of Building Information Modeling (BIM) for existing buildings based on point clouds is increasing. Standardized geometric quality assessment of the BIMs is needed to make them more reliable and thus reusable for future users. First, available literature on the subject is studied. Next, an initial proposal for a standardized geometric quality assessment is presented. Finally, this method is tested and evaluated with a case study. The number of specifications on BIM relating to existing buildings is limited. The Levels of Accuracy (LOA) specification of the USIBD provides definitions and suggestions regarding geometric model accuracy, but lacks a standardized assessment method. A deviation analysis is found to be dependent on (1) the used mathematical model, (2) the density of the point clouds and (3) the order of comparison. Results of the analysis can be graphical and numerical. An analysis on macro (building) and micro (BIM object) scale is necessary. On macro scale, the complete model is compared to the original point cloud and vice versa to get an overview of the general model quality. The graphical results show occluded zones and non-modeled objects respectively. Colored point clouds are derived from this analysis and integrated in the BIM. On micro scale, the relevant surface parts are extracted per BIM object and compared to the complete point cloud. Occluded zones are extracted based on a maximum deviation. What remains is classified according to the LOA specification. The numerical results are integrated in the BIM with the use of object parameters.
NASA Astrophysics Data System (ADS)
Razumnikov, S.; Kurmanbay, A.
2016-04-01
The present paper suggests a system approach to the evaluation of the effectiveness and risks resulting from the integration of cloud-based services in a machine-building enterprise. This approach makes it possible to assess the set of enterprise IT applications and choose which applications to migrate to the cloud with regard to specific business requirements, the technological strategy and the willingness to take risks.
NASA Technical Reports Server (NTRS)
Jensen, Eric J.; Tabazadeh, Azadeh; Drdla, Katja; Toon, Owen B.; Gore, Warren J. (Technical Monitor)
2000-01-01
Recent satellite and in situ measurements have indicated that limited denitrification can occur in the Arctic stratosphere. In situ measurements from the SOLVE campaign indicate polar stratospheric clouds (PSCs) composed of small numbers (about 3 x 10^-4 cm^-3) of 10-20 micron particles (probably NAT or NAD). These observations raise the issue of whether low number density NAT PSCs can substantially denitrify the air with reasonable cloud lifetimes. In this study, we use a one-dimensional cloud model to investigate the vertical redistribution of HNO3 by NAT/NAD PSCs. The cloud formation is driven by a temperature oscillation which drops the temperature below the NAT/NAD formation threshold (about 195 K) for a few days. We assume that a small fraction of the available aerosols act as NAT nuclei when the saturation ratio of HNO3 over NAT (NAD) exceeds 10 (1.5). The result is a cloud between about 16 and 20 km in the model, with NAT/NAD particle effective radii as large as about 10 microns (in agreement with the SOLVE data). We find that for typical cloud lifetimes of 2-3 days or less, the net depletion of HNO3 is no more than 1-2 ppbv, regardless of the NAT or NAD particle number density. Repeated passes of the air column through the cold pool build up the denitrification to 3-4 ppbv, and the cloud altitude steadily decreases due to the downward transport of nitric acid. Increasing the cloud lifetime results in considerably more effective denitrification, even with very low cloud particle number densities. As expected, the degree of denitrification by NAT clouds is much larger than that by NAD clouds. Significant denitrification by NAD clouds is only possible if the cloud lifetime is several days or more. The clouds also cause a local maximum in the HNO3 mixing ratio at cloud base, where the cloud particles sublimate.
India in the Knowledge Economy--An Electronic Paradigm
ERIC Educational Resources Information Center
Bhattacharya, Indrajit; Sharma, Kunal
2007-01-01
Purpose: The purpose of this paper is to make a strong case for investing in information and communication technologies (ICT) for building up of quality human resource capital for economic upliftment of India. An attempt has been made to explore the possibilities of online learning (OL)/e-learning towards building up of quality human resources in…
ERIC Educational Resources Information Center
Nika, G. Gerald; Parameswaran, R.
1997-01-01
Describes a visual approach for explaining the filling of electrons in the shells, subshells, and orbitals of the chemical elements. Enables students to apply the principles of atomic electron configuration while using manipulatives to model the building up of electron configurations as the atomic numbers of elements increase on the periodic…
NASA Astrophysics Data System (ADS)
Tuttas, S.; Braun, A.; Borrmann, A.; Stilla, U.
2014-08-01
For construction progress monitoring, the planned state of the construction at a certain time (as-planned) has to be compared to the actual state (as-built). The as-planned state is derived from a building information model (BIM), which contains the geometry of the building and the construction schedule. In this paper we introduce an approach for the generation of an as-built point cloud by photogrammetry. Since images on a construction site cannot be taken from every position that might seem necessary, we use a structure-from-motion process combined with control points to create a scaled point cloud in a consistent coordinate system. Subsequently this point cloud is used for an as-built versus as-planned comparison. For this, voxels of an octree are marked as occupied, free or unknown by ray casting based on the triangulated points and the camera positions. This allows non-existing building parts to be identified. For the verification of the existence of building parts, a second test based on the points in front of and behind the as-planned model planes is performed. The proposed procedure is tested on an inner-city construction site under real conditions.
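A simplified sketch of the voxel-labelling step described above: voxels traversed by the ray from a camera position to a triangulated point are marked free and the end voxel occupied, while voxels never hit remain unknown. The uniform dictionary grid and the voxel size are simplifying assumptions; the paper uses an octree.

```python
import numpy as np

UNKNOWN, FREE, OCCUPIED = 0, 1, 2

def label_voxels(camera_positions, points, voxel_size=0.2):
    """Ray-cast each (camera, point) pair and return a dict mapping voxel index -> label."""
    grid = {}
    for cam, pt in zip(camera_positions, points):
        direction = pt - cam
        steps = int(np.linalg.norm(direction) / voxel_size)
        for s in range(steps):                              # voxels crossed before the hit
            v = tuple(np.floor((cam + direction * s / steps) / voxel_size).astype(int))
            if grid.get(v) != OCCUPIED:
                grid[v] = FREE
        grid[tuple(np.floor(pt / voxel_size).astype(int))] = OCCUPIED
    return grid
```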
Beam induced electron cloud resonances in dipole magnetic fields
Calvey, J. R.; Hartung, W.; Makita, J.; ...
2016-07-01
The buildup of low energy electrons in an accelerator, known as electron cloud, can be severely detrimental to machine performance. Under certain beam conditions, the beam can become resonant with the cloud dynamics, accelerating the buildup of electrons. This paper will examine two such effects: multipacting resonances, in which the cloud development time is resonant with the bunch spacing, and cyclotron resonances, in which the cyclotron period of electrons in a magnetic field is a multiple of bunch spacing. Both resonances have been studied directly in dipole fields using retarding field analyzers installed in the Cornell Electron Storage Ring. These measurements are supported by both analytical models and computer simulations.
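As a rough numerical illustration of the cyclotron-resonance condition, the sketch below computes the electron cyclotron period in a dipole field, T_c = 2*pi*m_e/(e*B), and flags a near-integer ratio between T_c and the bunch spacing as a candidate resonance. The field value and 14 ns spacing in the example are illustrative assumptions, not parameters quoted in the paper.

```python
import numpy as np

M_E = 9.109e-31   # electron mass [kg]
Q_E = 1.602e-19   # elementary charge [C]

def cyclotron_period(b_tesla):
    """Electron cyclotron period in a uniform field of b_tesla [s]."""
    return 2.0 * np.pi * M_E / (Q_E * b_tesla)

def cyclotron_resonance_check(b_tesla, bunch_spacing_s, tol=0.05):
    """Return (ratio of the two periods, True if the ratio is close to an integer)."""
    t_c = cyclotron_period(b_tesla)
    ratio = max(bunch_spacing_s, t_c) / min(bunch_spacing_s, t_c)
    return ratio, abs(ratio - round(ratio)) < tol

# Illustrative values only: a ~0.08 T dipole field and 14 ns bunch spacing
print(cyclotron_resonance_check(b_tesla=0.079, bunch_spacing_s=14e-9))
```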
IBM Cloud Computing Powering a Smarter Planet
NASA Astrophysics Data System (ADS)
Zhu, Jinzy; Fang, Xing; Guo, Zhe; Niu, Meng Hua; Cao, Fan; Yue, Shuang; Liu, Qin Yu
With the increasing need for intelligent systems supporting the world's businesses, Cloud Computing has emerged as a dominant trend providing a dynamic infrastructure to make such intelligence possible. This article introduces how to build a smarter planet with cloud computing technology. First, it explains why we need the cloud and traces the evolution of cloud technology. Secondly, it analyzes the value of cloud computing and how to apply cloud technology. Finally, it predicts the future of the cloud in the smarter planet.
Li, Erzhong; Austin, Max E.; White, R. B.; ...
2017-08-21
Intense bursts of electron cyclotron emission (ECE) triggered by magnetohydrodynamic (MHD) instabilities such as edge localized modes (ELMs) have been observed on many tokamaks. On the DIII-D tokamak, an MHD mode is observed to trigger the ECE bursts in the low collisionality regime at the plasma edge. ORBIT-code simulations have shown that energetic electrons build up due to an interaction of barely trapped electrons with an MHD mode (f = 50 kHz for the current case). The energetic tail of the electron distribution function develops a bump within several microseconds for this collisionless case. This behavior depends on the competition between the perturbing MHD mode and slowing down and pitch angle scattering due to collisions. As a result, for typical DIII-D parameters, the calculated ECE radiation transport predicted by ORBIT is in excellent agreement with ECE measurements, clarifying the electron dynamics of the ECE bursts for the first time.
Helmet-Mounted Display Of Clouds Of Harmful Gases
NASA Technical Reports Server (NTRS)
Diner, Daniel B.; Barengoltz, Jack B.; Schober, Wayne R.
1995-01-01
Proposed helmet-mounted opto-electronic instrument provides real-time stereoscopic views of clouds of otherwise invisible toxic, explosive, and/or corrosive gas. Display semitransparent: images of clouds superimposed on scene ordinarily visible to wearer. Images give indications on sizes and concentrations of gas clouds and their locations in relation to other objects in scene. Instruments serve as safety devices for astronauts, emergency response crews, fire fighters, people cleaning up chemical spills, or anyone working near invisible hazardous gases. Similar instruments used as sensors in automated emergency response systems that activate safety equipment and emergency procedures. Both helmet-mounted and automated-sensor versions used at industrial sites, chemical plants, or anywhere dangerous and invisible or difficult-to-see gases present. In addition to helmet-mounted and automated-sensor versions, there could be hand-held version. In some industrial applications, desirable to mount instruments and use them similarly to parking-lot surveillance cameras.
Indoor Modelling from Slam-Based Laser Scanner: Door Detection to Envelope Reconstruction
NASA Astrophysics Data System (ADS)
Díaz-Vilariño, L.; Verbree, E.; Zlatanova, S.; Diakité, A.
2017-09-01
Updated and detailed indoor models are increasingly demanded for various applications such as emergency management or navigational assistance. The consolidation of new portable and mobile acquisition systems has led to a higher availability of 3D point cloud data from indoors. In this work, we explore the combined use of point clouds and trajectories from a SLAM-based laser scanner to automate the reconstruction of building indoors. The methodology starts with door detection, since doors represent transitions from one indoor space to another, which constitutes an initial approach to the global configuration of the point cloud into building rooms. For this purpose, the trajectory is used to create a vertical point cloud profile in which doors are detected as local minima of vertical distances. As the point cloud and trajectory are related by time stamp, this feature is used to subdivide the point cloud into subspaces according to the location of the doors. The correspondence between subspaces and building rooms is not unambiguous. One subspace always corresponds to one room, but one room is not necessarily depicted by just one subspace, for example in the case of a room containing several doors and in which the acquisition is performed in a discontinuous way. The labelling problem is formulated as a combinatorial approach solved as a minimum energy optimization. Once the point cloud is subdivided into building rooms, the envelope (formed by walls, ceilings and floors) is reconstructed for each space. The connectivity between spaces is included by adding the previously detected doors to the reconstructed model. The methodology is tested in a real case study.
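A hedged sketch of the door-detection idea: along the scanner trajectory, the vertical clearance to the points directly above the scanner dips locally when it passes under a door lintel, so doors show up as local minima of that profile. The search tolerance, the minima spacing, and the N x 3 array layout are assumptions for illustration.

```python
import numpy as np
from scipy.signal import argrelmin

def ceiling_profile(trajectory, points, xy_tol=0.3):
    """For each trajectory sample (x, y, z), the lowest point above the scanner nearby."""
    heights = np.full(len(trajectory), np.nan)
    for i, (x, y, z) in enumerate(trajectory):
        near = (np.abs(points[:, 0] - x) < xy_tol) & (np.abs(points[:, 1] - y) < xy_tol)
        above = points[near & (points[:, 2] > z), 2]
        if above.size:
            heights[i] = above.min()
    return heights

def detect_doors(trajectory, points, order=20):
    """Indices of trajectory samples where the vertical clearance is a local minimum."""
    profile = np.nan_to_num(ceiling_profile(trajectory, points), nan=np.inf)
    return argrelmin(profile, order=order)[0]
```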
Tran, Thi Huong Giang; Ressl, Camillo; Pfeifer, Norbert
2018-02-03
This paper suggests a new approach for change detection (CD) in 3D point clouds. It combines classification and CD in one step using machine learning. The point cloud data of both epochs are merged for computing features of four types: features describing the point distribution, a feature relating to relative terrain elevation, features specific to the multi-target capability of laser scanning, and features combining the point clouds of both epochs to identify the change. All these features are attached to the points, and training samples are then acquired to create the model for supervised classification, which is applied to the whole study area. The final results reach an overall accuracy of over 90% for both epochs across eight classes: lost tree, new tree, lost building, new building, changed ground, unchanged building, unchanged tree, and unchanged ground.
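A minimal sketch of the supervised one-step workflow summarized above: per-point features computed from both epochs are fed to a classifier trained on labelled samples and then applied to the whole area. The random-forest choice and the feature layout are assumptions for illustration; the paper defines its own four feature groups.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_change_classifier(features_train, labels_train):
    """features_train: (n_points, n_features); labels_train: one of the eight change classes."""
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(features_train, labels_train)
    return clf

def classify_study_area(clf, features_all):
    # e.g. 'new building', 'lost tree', 'changed ground', 'unchanged tree', ...
    return clf.predict(features_all)
```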
A simple map-based localization strategy using range measurements
NASA Astrophysics Data System (ADS)
Moore, Kevin L.; Kutiyanawala, Aliasgar; Chandrasekharan, Madhumita
2005-05-01
In this paper we present a map-based approach to localization. We consider indoor navigation in known environments based on the idea of a "vector cloud" by observing that any point in a building has an associated vector defining its distance to the key structural components (e.g., walls, ceilings, etc.) of the building in any direction. Given a building blueprint we can derive the "ideal" vector cloud at any point in space. Then, given measurements from sensors on the robot we can compare the measured vector cloud to the possible vector clouds cataloged from the blueprint, thus determining location. We present algorithms for implementing this approach to localization, using the Hamming norm, the 1-norm, and the 2-norm. The effectiveness of the approach is verified by experiments on a 2-D testbed using a mobile robot with a 360° laser range-finder and through simulation analysis of robustness.
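The matching step lends itself to a compact sketch: the measured range vector is compared against the catalogue of ideal vectors precomputed from the blueprint under the Hamming, 1- or 2-norm, and the best-scoring position wins. The range-bin size used to discretize the Hamming comparison is an assumption.

```python
import numpy as np

def localize(measured, catalogue, norm="2", bin_size=0.1):
    """catalogue: dict mapping candidate (x, y) positions to their ideal range vectors."""
    measured = np.asarray(measured, float)
    best_pos, best_cost = None, np.inf
    for pos, ideal in catalogue.items():
        diff = measured - np.asarray(ideal, float)
        if norm == "hamming":
            cost = np.count_nonzero(np.abs(diff) > bin_size)   # count of mismatched ranges
        elif norm == "1":
            cost = np.abs(diff).sum()
        else:
            cost = np.linalg.norm(diff)                        # 2-norm
        if cost < best_cost:
            best_pos, best_cost = pos, cost
    return best_pos, best_cost
```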
Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro
2015-07-28
In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
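A very rough sketch of the underlying voxelization idea only: points are binned into a regular grid whose occupied cells become the solid elements of the finite element mesh. The voxel size is an assumption, and the published procedure (point sections, inner/outer surfaces, semi-automatic checks) is considerably more elaborate than this.

```python
import numpy as np

def voxelize(points, voxel_size=0.25):
    """Return the set of integer (i, j, k) indices of voxels occupied by the point cloud."""
    idx = np.floor(np.asarray(points) / voxel_size).astype(int)
    return {tuple(v) for v in idx}

# Each occupied voxel can be exported as an eight-node brick element; shared corner nodes
# follow directly from the integer indices, which keeps the resulting mesh conforming.
```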
NASA Astrophysics Data System (ADS)
Semeniuk, T. A.; Bruintjes, R. T.; Salazar, V.; Breed, D. W.; Jensen, T. L.; Buseck, P. R.
2014-03-01
An airborne study of cloud microphysics provided an opportunity to collect aerosol particles in ambient and updraft conditions of natural convection systems for transmission electron microscopy (TEM). Particles were collected simultaneously on lacey carbon and calcium-coated carbon (Ca-C) TEM grids, providing information on particle morphology and chemistry and a unique record of the particle's physical state on impact. In total, 22 particle categories were identified, including single, coated, aggregate, and droplet types. The fine fraction comprised up to 90% mixed cation sulfate (MCS) droplets, while the coarse fraction comprised up to 80% mineral-containing aggregates. Insoluble (dry), partially soluble (wet), and fully soluble particles (droplets) were recorded on Ca-C grids. Dry particles were typically silicate grains; wet particles were mineral aggregates with chloride, nitrate, or sulfate components; and droplets were mainly aqueous NaCl and MCS. Higher numbers of droplets were present in updrafts (80% relative humidity (RH)) compared with ambient conditions (60% RH), and almost all particles activated at cloud base (100% RH). Greatest changes in size and shape were observed in NaCl-containing aggregates (>0.3 µm diameter) along updraft trajectories. Their abundance was associated with high numbers of cloud condensation nuclei (CCN) and cloud droplets, as well as large droplet sizes in updrafts. Thus, compositional dependence was observed in activation behavior recorded for coarse and fine fractions. Soluble salts from local pollution and natural sources clearly affected aerosol-cloud interactions, enhancing the spectrum of particles forming CCN and by forming giant CCN from aggregates, thus, making cloud seeding with hygroscopic flares ineffective in this region.
Does a Relationship Between Arctic Low Clouds and Sea Ice Matter?
NASA Technical Reports Server (NTRS)
Taylor, Patrick C.
2016-01-01
Arctic low clouds strongly affect the Arctic surface energy budget. Through this impact Arctic low clouds influence important aspects of the Arctic climate system, namely surface and atmospheric temperature, sea ice extent and thickness, and atmospheric circulation. Arctic clouds are in turn influenced by these elements of the Arctic climate system, and these interactions create the potential for Arctic cloud-climate feedbacks. To further our understanding of potential Arctic cloud-climate feedbacks, the goal of this paper is to quantify the influence of atmospheric state on the surface cloud radiative effect (CRE) and its covariation with sea ice concentration (SIC). We build on previous research using instantaneous, active remote sensing satellite footprint data from the NASA A-Train. First, the results indicate significant differences in the surface CRE when stratified by atmospheric state. Second, there is a weak covariation between CRE and SIC for most atmospheric conditions. Third, the results show statistically significant differences in the average surface CRE under different SIC values in fall, indicating a 3-5 W m^-2 larger LW CRE in 0% versus 100% SIC footprints. Because systematic changes on the order of 1 W m^-2 are sufficient to explain the observed long-term reductions in sea ice extent, our results indicate a potentially significant amplifying sea ice-cloud feedback, under certain meteorological conditions, that could delay the fall freeze-up and influence the variability in sea ice extent and volume. Lastly, a small change in the frequency of occurrence of atmosphere states may yield a larger Arctic cloud feedback than any cloud response to sea ice.
Pithan, Felix; Ackerman, Andrew; Angevine, Wayne M.; ...
2016-08-27
We struggle to represent lower tropospheric temperature and moisture profiles and surface fluxes in Arctic winter using weather and climate models, partly because they lack or misrepresent physical processes that are specific to high latitudes. Observations have revealed two preferred states of the Arctic winter boundary layer. In the cloudy state, cloud liquid water limits surface radiative cooling, and temperature inversions are weak and elevated. In the radiatively clear state, strong surface radiative cooling leads to the build-up of surface-based temperature inversions. Many large-scale models lack the cloudy state, and some substantially underestimate inversion strength in the clear state. The transformation from a moist to a cold dry air mass is modeled using an idealized Lagrangian perspective. The trajectory includes both boundary layer states, and the single-column experiment is the first Lagrangian Arctic air formation experiment (Larcform 1) organized within GEWEX GASS (Global atmospheric system studies). The intercomparison reproduces the typical biases of large-scale models: some models lack the cloudy state of the boundary layer due to the representation of mixed-phase microphysics or to the interaction between micro- and macrophysics. In some models, high emissivities of ice clouds or the lack of an insulating snow layer prevent the build-up of surface-based inversions in the radiatively clear state. Models substantially disagree on the amount of cloud liquid water in the cloudy state and on turbulent heat fluxes under clear skies. Finally, observations of air mass transformations including both boundary layer states would allow for a tighter constraint of model behavior.
Pithan, Felix; Ackerman, Andrew; Angevine, Wayne M.; Hartung, Kerstin; Ickes, Luisa; Kelley, Maxwell; Medeiros, Brian; Sandu, Irina; Steeneveld, Gert-Jan; Sterk, HAM; Svensson, Gunilla; Vaillancourt, Paul A.; Zadra, Ayrton
2017-01-01
Weather and climate models struggle to represent lower tropospheric temperature and moisture profiles and surface fluxes in Arctic winter, partly because they lack or misrepresent physical processes that are specific to high latitudes. Observations have revealed two preferred states of the Arctic winter boundary layer. In the cloudy state, cloud liquid water limits surface radiative cooling, and temperature inversions are weak and elevated. In the radiatively clear state, strong surface radiative cooling leads to the build-up of surface-based temperature inversions. Many large-scale models lack the cloudy state, and some substantially underestimate inversion strength in the clear state. Here, the transformation from a moist to a cold dry air mass is modelled using an idealized Lagrangian perspective. The trajectory includes both boundary layer states, and the single-column experiment is the first Lagrangian Arctic air formation experiment (Larcform 1) organized within GEWEX GASS (Global atmospheric system studies). The intercomparison reproduces the typical biases of large-scale models: Some models lack the cloudy state of the boundary layer due to the representation of mixed-phase microphysics or to the interaction between micro- and macrophysics. In some models, high emissivities of ice clouds or the lack of an insulating snow layer prevent the build-up of surface-based inversions in the radiatively clear state. Models substantially disagree on the amount of cloud liquid water in the cloudy state and on turbulent heat fluxes under clear skies. Observations of air mass transformations including both boundary layer states would allow for a tighter constraint of model behaviour. PMID:28966718
The Effects of Grain Size and Temperature Distributions on the Formation of Interstellar Ice Mantles
NASA Astrophysics Data System (ADS)
Pauly, Tyler; Garrod, Robin T.
2016-02-01
Computational models of interstellar gas-grain chemistry have historically adopted a single dust-grain size of 0.1 micron, assumed to be representative of the size distribution present in the interstellar medium. Here, we investigate the effects of a broad grain-size distribution on the chemistry of dust-grain surfaces and the subsequent build-up of molecular ices on the grains, using a three-phase gas-grain chemical model of a quiescent dark cloud. We include an explicit treatment of the grain temperatures, governed both by the visual extinction of the cloud and the size of each individual grain-size population. We find that the temperature difference plays a significant role in determining the total bulk ice composition across the grain-size distribution, while the effects of geometrical differences between size populations appear marginal. We also consider collapse from a diffuse to a dark cloud, allowing dust temperatures to fall. Under the initial diffuse conditions, small grains are too warm to promote grain-mantle build-up, with most ices forming on the mid-sized grains. As collapse proceeds, the more abundant, smallest grains cool and become the dominant ice carriers; the large population of small grains means that this ice is distributed across many grains, with perhaps no more than 40 monolayers of ice each (versus several hundred assuming a single grain size). This effect may be important for the subsequent processing and desorption of the ice during the hot-core phase of star formation, exposing a significant proportion of the ice to the gas phase, increasing the importance of ice-surface chemistry and surface-gas interactions.
Pithan, Felix; Ackerman, Andrew; Angevine, Wayne M; Hartung, Kerstin; Ickes, Luisa; Kelley, Maxwell; Medeiros, Brian; Sandu, Irina; Steeneveld, Gert-Jan; Sterk, Ham; Svensson, Gunilla; Vaillancourt, Paul A; Zadra, Ayrton
2016-09-01
Weather and climate models struggle to represent lower tropospheric temperature and moisture profiles and surface fluxes in Arctic winter, partly because they lack or misrepresent physical processes that are specific to high latitudes. Observations have revealed two preferred states of the Arctic winter boundary layer. In the cloudy state, cloud liquid water limits surface radiative cooling, and temperature inversions are weak and elevated. In the radiatively clear state, strong surface radiative cooling leads to the build-up of surface-based temperature inversions. Many large-scale models lack the cloudy state, and some substantially underestimate inversion strength in the clear state. Here, the transformation from a moist to a cold dry air mass is modelled using an idealized Lagrangian perspective. The trajectory includes both boundary layer states, and the single-column experiment is the first Lagrangian Arctic air formation experiment (Larcform 1) organized within GEWEX GASS (Global atmospheric system studies). The intercomparison reproduces the typical biases of large-scale models: Some models lack the cloudy state of the boundary layer due to the representation of mixed-phase microphysics or to the interaction between micro- and macrophysics. In some models, high emissivities of ice clouds or the lack of an insulating snow layer prevent the build-up of surface-based inversions in the radiatively clear state. Models substantially disagree on the amount of cloud liquid water in the cloudy state and on turbulent heat fluxes under clear skies. Observations of air mass transformations including both boundary layer states would allow for a tighter constraint of model behaviour.
A Novel Reflector/Reflectarray Antenna: An Enabling Technology for NASA's Dual-Frequency ACE Radar
NASA Technical Reports Server (NTRS)
Racette, Paul E.; Heymsfield, Gerald; Li, Lihua; Cooley, Michael E.; Park, Richard; Stenger, Peter
2011-01-01
This paper describes a novel dual-frequency shared aperture Ka/W-band antenna design that enables wide-swath imaging via electronic scanning at Ka-band and is specifically applicable to NASA's Aerosol, Cloud and Ecosystems (ACE) mission. The innovative antenna design minimizes size and weight via use of a shared aperture and builds upon NASA's investments in large-aperture reflectors and high technology-readiness-level (TRL) W-band radar architectures. The antenna is comprised of a primary cylindrical reflector/reflectarray surface illuminated by a fixed W-band feed and a Ka-band Active Electronically Scanned Array (AESA) line feed. The reflectarray surface provides beam focusing at W-band, but is transparent at Ka-band.
Numerical simulation of a radially injected barium cloud
NASA Technical Reports Server (NTRS)
Swift, D. W.; Wescott, E. M.
1981-01-01
Electrostatic two-dimensional numerical simulations of a radially symmetric barium injection experiment demonstrate that ions created by solar UV irradiation are electrostatically bound to the electrons which remain tied to the field lines on which they are created. Two possible instabilities are identified, but neither of them causes the barium plasma cloud to polarize in a way that would permit the plasma to keep up with the neutrals. In a second model, the velocity of the neutrals is allowed to be a function of the azimuthal angle. Here, a portion of the cloud does polarize in a way that allows a portion of the plasma to detach and move outward at the approximate speed of the neutrals. No rapid detachment is found when only the density of the neutrals is given an azimuthal asymmetry.
Electron-cloud updated simulation results for the PSR, and recent results for the SNS
NASA Astrophysics Data System (ADS)
Pivi, M.; Furman, M. A.
2002-05-01
Recent simulation results for the main features of the electron cloud in the storage ring of the Spallation Neutron Source (SNS) at Oak Ridge, and updated results for the Proton Storage Ring (PSR) at Los Alamos, are presented in this paper. A refined model for the secondary emission process, including the so-called true secondary, rediffused and backscattered electrons, has recently been included in the electron-cloud code.
Evolution of the Far-Infrared Cloud at Titan's South Pole
NASA Technical Reports Server (NTRS)
Jennings, Donald E.; Achterberg, R. K.; Cottini, V.; Anderson, C. M.; Flasar, F. M.; Nixon, C. A.; Bjoraker, G. L.; Kunde, V. G.; Carlson, R. C.; Guandique, E.;
2015-01-01
A condensate cloud on Titan identified by its 220 cm^-1 far-infrared signature continues to undergo seasonal changes at both the north and south poles. In the north the cloud, which extends from 55 North to the pole, has been gradually decreasing in emission intensity since the beginning of the Cassini mission, with a half-life of 3.8 years. The cloud in the south did not appear until 2012, but its intensity has increased rapidly, doubling every year. The shape of the cloud at the South Pole is very different from that in the north. Mapping in December 2013 showed that the condensate emission was confined to a ring with a maximum at 80 South. The ring was centered 4 degrees from Titan's pole. The pattern of emission from stratospheric trace gases like nitriles and complex hydrocarbons (mapped in January 2014) was also offset by 4 degrees, but had a central peak at the pole and a secondary maximum in a ring at about 70 South with a minimum at 80 South. The shape of the gas emission distribution can be explained by abundances that are high at the atmospheric pole and diminish toward the equator, combined with correspondingly increasing temperatures. We discuss possible causes for the condensate ring. The present rapid build-up of the condensate cloud at the South Pole is likely to transition to a gradual decline during 2015-16.
Point clouds segmentation as base for as-built BIM creation
NASA Astrophysics Data System (ADS)
Macher, H.; Landes, T.; Grussenmeyer, P.
2015-08-01
In this paper, a three-step segmentation approach is proposed in order to create 3D models from point clouds acquired by TLS inside buildings. The three scales of segmentation are floors, rooms and the planes composing the rooms. First, floor segmentation is performed based on an analysis of the point distribution along the Z axis. Then, for each floor, room segmentation is achieved by considering a slice of the point cloud at ceiling level. Finally, planes are segmented for each room, and the planes corresponding to ceilings and floors are identified. The results of each step are analysed and potential improvements are proposed. Based on the segmented point clouds, the creation of as-built BIM is considered in a future work section. Not only is the classification of planes into several categories proposed, but the potential use of point clouds acquired outside buildings is also considered.
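As a hedged illustration of the first step (floor segmentation from the point distribution along Z), the sketch below finds the dominant peaks of a Z histogram, which typically correspond to floor and ceiling slabs, and assigns points to storeys between them. Bin width and peak threshold are placeholder assumptions, not values from the paper.

```python
import numpy as np
from scipy.signal import find_peaks

def find_slab_levels(z_values, bin_width=0.05, min_fraction=0.02):
    """Heights at which the Z histogram peaks strongly (candidate floor/ceiling slabs)."""
    z_values = np.asarray(z_values, float)
    edges = np.arange(z_values.min(), z_values.max() + bin_width, bin_width)
    counts, edges = np.histogram(z_values, bins=edges)
    peaks, _ = find_peaks(counts, height=min_fraction * len(z_values))
    return 0.5 * (edges[peaks] + edges[peaks + 1])

def split_by_storey(points, slab_heights):
    """Assign each point (N x 3 array) to the storey between consecutive slab heights."""
    return np.digitize(points[:, 2], np.sort(slab_heights))
```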
NASA Technical Reports Server (NTRS)
1977-01-01
NASA aircraft-icing research has been applied to expand the utility of the big flying-crane helicopter built by the Sikorsky Aircraft Division of United Technologies in Stratford, Conn. Sikorsky wanted to adapt the Skycrane, used in both military and commercial service, to lift heavy external loads in areas where icing conditions occur; ice build-up around the engine air inlets caused the major problem. NASA-Lewis has a special wind tunnel for injecting supercooled water droplets into the wind, thereby simulating a natural icing cloud and observing how ice builds up on variously shaped surfaces. From Lewis, Sikorsky engineers obtained information which optimized the design of the inlet anti-ice system. The resulting design proved to be an effective anti-icing modification for the flying crane. Sikorsky is also using additional Lewis Icing Research Tunnel data in its development of a new VTOL (Vertical Take-Off and Landing) aircraft.
CIMEL Measurements of Zenith Radiances at the ARM Site
NASA Technical Reports Server (NTRS)
Marshak, Alexander; Wiscombe, Warren; Lau, William K. M. (Technical Monitor)
2002-01-01
Starting October 1, 2001, the Cimel at the ARM Central Facility in Oklahoma has been switched to a new "cloud mode." This mode allows taking measurements of zenith radiance when the Sun is blocked by clouds. In this case, every 13 minutes the Cimel points straight up and takes 10 measurements at a 9-second time interval. The new cloud mode has four filters at 440, 670, 870 and 1020 nm. For cloudy conditions, the spectral contrast in surface albedo dominates over Rayleigh and aerosol effects; this makes normalized zenith radiances at 440 and 670 nm, as well as at 870 and 1020 nm, almost indistinguishable. We compare Cimel measurements with other ARM cart site instruments: the Multi-Filter Rotating Shadowband Radiometer (MFRSR), the Narrow Field of View (NFOV) sensor, and the MicroWave Radiometer (MWR). Based on the Cimel and MFRSR 670 and 870 nm channels, we build a normalized difference cloud index (NDCI) for radiances and fluxes, respectively. The radiance NDCI from the Cimel and the flux NDCI from the MFRSR are compared between themselves as well as with the cloud Liquid Water Path (LWP) retrieved from the MWR. Based on our theoretical calculations and preliminary data analysis, there is a good correlation between NDCIs and LWP for cloudy sky above green vegetation. Based on this correlation, an algorithm to retrieve cloud optical depth from NDCI is proposed.
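The NDCI itself is a one-line quantity; the sketch below assumes the standard normalized-difference form applied to the 870 and 670 nm channels, with the same expression applied to Cimel radiances and MFRSR fluxes. This is a plain reading of the abstract, not the authors' code.

```python
import numpy as np

def ndci(i_870, i_670):
    """Normalized difference cloud index, assumed here as (I870 - I670) / (I870 + I670)."""
    i_870 = np.asarray(i_870, float)
    i_670 = np.asarray(i_670, float)
    return (i_870 - i_670) / (i_870 + i_670)

# Example with arbitrary zenith-radiance values for a cloudy sky over green vegetation
print(ndci([0.82, 0.90], [0.55, 0.61]))
```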
Florida, Bahamas, Cuba and Gulf Stream, USA
1992-08-08
This unique photo offers a view of the Florida peninsula, the western Bahamas, north central Cuba and the deep blue waters of the Gulf Stream, which hugs the east coast of Florida (27.0N, 82.0W). In addition to being an excellent photograph for showing the geographical relationships between the variety of landforms in this scene, the typical effect of the land-sea breeze is very much in evidence, with few clouds over water and cumulus build-up over the landmass.
ERIC Educational Resources Information Center
Islam, Muhammad Faysal
2013-01-01
Cloud computing offers the advantage of on-demand, reliable and cost efficient computing solutions without the capital investment and management resources to build and maintain in-house data centers and network infrastructures. Scalability of cloud solutions enable consumers to upgrade or downsize their services as needed. In a cloud environment,…
NASA Astrophysics Data System (ADS)
Garagnani, S.; Manferdini, A. M.
2013-02-01
Since their introduction, modeling tools aimed at architectural design have evolved into today's "digital multi-purpose drawing boards" based on enhanced parametric elements able to originate whole buildings within virtual environments. Semantic splitting and element topology are features that allow objects to be "intelligent" (i.e. self-aware of what kind of element they are and with whom they can interact), representing in this way the basics of Building Information Modeling (BIM), a coordinated, consistent and always up-to-date workflow improved in order to reach higher quality, reliability and cost reductions all over the design process. Even if BIM was originally intended for new architectures, its capability to store semantically inter-related information can be successfully applied to existing buildings as well, especially if they deserve particular care such as Cultural Heritage sites. BIM engines can easily manage simple parametric geometries, collapsing them to standard primitives connected through hierarchical relationships: however, when components are generated from existing morphologies, for example by acquiring point clouds with digital photogrammetry or laser scanning equipment, complex abstractions have to be introduced while remodeling elements by hand, since automatic feature extraction in available software is still not effective. In order to introduce a methodology destined to process point cloud data in a BIM environment with high accuracy, this paper describes some experiences on monumental site documentation, generated through a plug-in written for Autodesk Revit and codenamed GreenSpider after its capability to lay out points in space as if they were nodes of an ideal cobweb.
ERIC Educational Resources Information Center
Togawa, Satoshi; Kanenishi, Kazuhide
2014-01-01
In this research, we have built a disaster-recovery framework for an e-Learning environment, protecting against disasters such as earthquakes, tsunamis and heavy floods. In particular, our proposed framework is based on private cloud collaboration. We build a prototype system based on an IaaS architecture, and this prototype system is constructed from several private…
Castellazzi, Giovanni; D’Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro
2015-01-01
In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation. PMID:26225978
An In Situ Method for Sizing Insoluble Residues in Precipitation and Other Aqueous Samples
Axson, Jessica L.; Creamean, Jessie M.; Bondy, Amy L.; Capracotta, Sonja S.; Warner, Katy Y.; Ault, Andrew P.
2015-01-01
Particles are frequently incorporated into clouds or precipitation, influencing climate by acting as cloud condensation or ice nuclei, taking up coatings during cloud processing, and removing species through wet deposition. Many of these particles, particularly ice nuclei, can remain suspended within cloud droplets/crystals as insoluble residues. While previous studies have measured the soluble or bulk mass of species within clouds and precipitation, no studies to date have determined the number concentration and size distribution of insoluble residues in precipitation or cloud water using in situ methods. Herein, for the first time we demonstrate that Nanoparticle Tracking Analysis (NTA) is a powerful in situ method for determining the total number concentration, number size distribution, and surface area distribution of insoluble residues in precipitation, both of rain and melted snow. The method uses 500 μL or less of liquid sample and does not require sample modification. Number concentrations for the insoluble residues in aqueous precipitation samples ranged from 2.0–3.0(±0.3) × 10^8 particles cm^-3, while surface area ranged from 1.8(±0.7)–3.2(±1.0) × 10^7 μm^2 cm^-3. Number size distributions peaked between 133–150 nm, with both single and multi-modal character, while surface area distributions peaked between 173–270 nm. Comparison with electron microscopy of particles up to 10 μm shows that, by number, >97% of residues are <1 μm in diameter, the upper limit of the NTA. The range of concentration and distribution properties indicates that insoluble residue properties vary with ambient aerosol concentrations, cloud microphysics, and meteorological dynamics. NTA has great potential for studying the role that insoluble residues play in critical atmospheric processes. PMID:25705069
NASA Astrophysics Data System (ADS)
Bernhardt, Paul A.; Siefring, Carl L.; Briczinski, Stanley J.; Viggiano, Albert; Caton, Ronald G.; Pedersen, Todd R.; Holmes, Jeffrey M.; Ard, Shaun; Shuman, Nicholas; Groves, Keith M.
2017-05-01
Atomic samarium has been injected into the neutral atmosphere to produce electron clouds that modify the ionosphere. These electron clouds may be used as high-frequency radio wave reflectors or for control of the electrodynamics of the F region. A self-consistent model for the photochemical reactions of a samarium vapor cloud released into the upper atmosphere has been developed and compared with the Metal Oxide Space Cloud (MOSC) experimental observations. The release initially produces a dense plasma cloud that is rapidly reduced by dissociative recombination and diffusive expansion. The spectral emissions from the release cover the ultraviolet to the near-infrared band with contributions from solar fluorescence of the atomic, molecular, and ionized components of the artificial density cloud. Barium releases in sunlight are more efficient than samarium releases for the production of dense ionization clouds. Samarium may be of interest for nighttime releases, but the artificial electron cloud is limited by recombination with the samarium oxide ion.
Cloud Infrastructure & Applications - CloudIA
NASA Astrophysics Data System (ADS)
Sulistio, Anthony; Reich, Christoph; Doelitzscher, Frank
The idea behind Cloud Computing is to deliver Infrastructure-as-a-Service and Software-as-a-Service over the Internet on an easy pay-per-use business model. To harness the potential of Cloud Computing for e-Learning and research purposes, and for small- and medium-sized enterprises, the Hochschule Furtwangen University established a new project, called Cloud Infrastructure & Applications (CloudIA). The CloudIA project is a market-oriented cloud infrastructure that leverages different virtualization technologies by supporting Service-Level Agreements for various service offerings. This paper describes the CloudIA project in detail and reports our early experiences in building a private cloud using an existing infrastructure.
INDIGO: Building a DataCloud Framework to support Open Science
NASA Astrophysics Data System (ADS)
Chen, Yin; de Lucas, Jesus Marco; Aguilar, Fenando; Fiore, Sandro; Rossi, Massimiliano; Ferrari, Tiziana
2016-04-01
New solutions are required to support Data Intensive Science in the emerging panorama of e-infrastructures, including Grid, Cloud and HPC services. The architecture proposed by the INDIGO-DataCloud (INtegrating Distributed data Infrastructures for Global ExplOitation) (https://www.indigo-datacloud.eu/) H2020 project, provides the path to integrate IaaS resources and PaaS platforms to provide SaaS solutions, while satisfying the requirements posed by different Research Communities, including several in Earth Science. This contribution introduces the INDIGO DataCloud architecture, describes the methodology followed to assure the integration of the requirements from different research communities, including examples like ENES, LifeWatch or EMSO, and how they will build their solutions using different INDIGO components.
NASA Astrophysics Data System (ADS)
McGibbon, J.; Bretherton, C. S.
2017-06-01
During the Marine ARM GPCI Investigation of Clouds (MAGIC) in October 2011 to September 2012, a container ship making periodic cruises between Los Angeles, CA, and Honolulu, HI, was instrumented with surface meteorological, aerosol and radiation instruments, a cloud radar and ceilometer, and radiosondes. Here large-eddy simulation (LES) is performed in a ship-following frame of reference for 13 four-day transects from the MAGIC field campaign. The goal is to assess whether LES can skillfully simulate the broad range of observed cloud characteristics and boundary layer structure across the subtropical stratocumulus to cumulus transition region sampled during different seasons and meteorological conditions. Results from Leg 15A, which sampled a particularly well-defined stratocumulus to cumulus transition, demonstrate the approach. The LES reproduces the observed timing of decoupling and transition from stratocumulus to cumulus and matches the observed evolution of boundary layer structure, cloud fraction, liquid water path, and precipitation statistics remarkably well. Considering the simulations of all 13 cruises, the LES skillfully simulates the mean diurnal variation of key measured quantities, including liquid water path (LWP), cloud fraction, measures of decoupling, and cloud radar-derived precipitation. The daily mean quantities are well represented, and daily mean LWP and cloud fraction show the expected correlation with estimated inversion strength. There is a -0.6 K low bias in LES near-surface air temperature that results in a high bias of 5.6 W m^-2 in sensible heat flux (SHF). Overall, these results build confidence in the ability of LES to represent the northeast Pacific stratocumulus to trade cumulus transition region.
Dynamic electronic institutions in agent oriented cloud robotic systems.
Nagrath, Vineet; Morel, Olivier; Malik, Aamir; Saad, Naufal; Meriaudeau, Fabrice
2015-01-01
The dot-com bubble burst in the year 2000, followed by a swift movement towards resource virtualization and the cloud computing business model. Cloud computing emerged not as a new form of computing or network technology but as a remoulding of existing technologies to suit a new business model. Cloud robotics is understood as the adaptation of cloud computing ideas for robotic applications. Current efforts in cloud robotics focus on developing robots that utilize the computing and service infrastructure of the cloud, without debating the underlying business model. HTM5 is an OMG MDA-based meta-model for agent-oriented development of cloud robotic systems. The trade view of HTM5 promotes peer-to-peer trade amongst software agents. HTM5 agents represent various cloud entities and implement their business logic on cloud interactions. Trade in a peer-to-peer cloud robotic system is based on relationships and contracts amongst several agent subsets. Electronic Institutions are associations of heterogeneous intelligent agents which interact with each other following predefined norms. In Dynamic Electronic Institutions (DEIs), the process of formation, reformation and dissolution of institutions is automated, leading to run-time adaptations in groups of agents. DEIs in agent-oriented cloud robotic ecosystems bring order and group intellect. This article presents DEI implementations through the HTM5 methodology.
Community Seismic Network (CSN)
NASA Astrophysics Data System (ADS)
Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Cheng, M.; Guy, R.; Chandy, M.; Krause, A.; Bunn, J.; Olson, M.; Faulkner, M.; Liu, A.; Strand, L.
2012-12-01
We report on developments in sensor connectivity, architecture, and data fusion algorithms executed in Cloud computing systems in the Community Seismic Network (CSN), a network of low-cost sensors housed in homes and offices by volunteers in the Pasadena, CA area. The network has over 200 sensors continuously reporting anomalies in local acceleration through the Internet to a Cloud computing service (the Google App Engine) that continually fuses sensor data to rapidly detect shaking from earthquakes. The Cloud computing system consists of data centers geographically distributed across the continent and is likely to be resilient even during earthquakes and other local disasters. The region of Southern California is partitioned in a multi-grid style into sets of telescoping cells called geocells. Data streams from sensors within a geocell are fused to detect anomalous shaking across the geocell. Temporal and spatial patterns across geocells are used to detect anomalies across regions. The challenge is to detect earthquakes rapidly with an extremely low false positive rate. We report on two data fusion algorithms, one that tessellates the surface so as to fuse data from a large region around Pasadena and another that uses a standard tessellation of equal-sized cells. Since September 2011, the network has successfully detected earthquakes of magnitude 2.5 or higher within 40 km of Pasadena. In addition to the standard USB device, which connects to the host's computer, we have developed a stand-alone sensor that directly connects to the Internet via Ethernet or Wi-Fi. This bypasses security concerns that some companies have with the USB-connected devices, and allows for 24/7 monitoring at sites that would otherwise shut down their computers after working hours. In buildings we use the sensors to model the behavior of the structures during weak events in order to understand how they will perform during strong events. Visualization models of instrumented buildings ranging between five and 22 stories tall have been constructed using Google SketchUp. Ambient vibration records are used to identify the first set of horizontal vibrational modal frequencies of the buildings. These frequencies are used to compute the response on every floor of the building, given either observed data or scenario ground motion input at the buildings' base.
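As a hedged illustration of the geocell idea described above (not the CSN production code), the sketch below buckets sensor anomaly reports into grid cells and flags a cell when enough independent sensors report within a short window; the cell size, window length, and thresholds are assumptions.

```python
from collections import defaultdict

CELL_DEG = 0.05        # assumed geocell size in degrees
WINDOW_S = 5.0         # assumed coincidence window (seconds)
MIN_SENSORS = 4        # assumed number of sensors required to declare shaking

def geocell(lat, lon):
    return (int(lat // CELL_DEG), int(lon // CELL_DEG))

def fuse(picks):
    """picks: list of (sensor_id, lat, lon, t) anomaly reports."""
    cells = defaultdict(list)
    for sensor_id, lat, lon, t in picks:
        cells[geocell(lat, lon)].append((t, sensor_id))
    alerts = []
    for cell, reports in cells.items():
        reports.sort()
        for t0, _ in reports:
            inside = {s for t, s in reports if t0 <= t <= t0 + WINDOW_S}
            if len(inside) >= MIN_SENSORS:
                alerts.append((cell, t0))
                break
    return alerts

print(fuse([(1, 34.141, -118.121, 0.2), (2, 34.142, -118.122, 0.9),
            (3, 34.143, -118.123, 1.4), (4, 34.144, -118.124, 2.0)]))
```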
HF propagation results from the Metal Oxide Space Cloud (MOSC) experiment
NASA Astrophysics Data System (ADS)
Joshi, Dev; Groves, Keith M.; McNeil, William; Carrano, Charles; Caton, Ronald G.; Parris, Richard T.; Pederson, Todd R.; Cannon, Paul S.; Angling, Matthew; Jackson-Booth, Natasha
2017-06-01
With support from the NASA sounding rocket program, the Air Force Research Laboratory launched two sounding rockets in the Kwajalein Atoll, Marshall Islands, in May 2013, known as the Metal Oxide Space Cloud (MOSC) experiment. The rockets released samarium metal vapor at preselected altitudes in the lower F region, which ionized, forming a plasma cloud. Data from the Advanced Research Projects Agency Long-Range Tracking and Instrumentation Radar (ALTAIR) incoherent scatter radar and from high-frequency (HF) radio links have been analyzed to understand the impacts of the artificial ionization on radio wave propagation. The HF radio wave ray-tracing toolbox PHaRLAP, along with ionospheric models constrained by electron density profiles measured with the ALTAIR radar, has been used to successfully model the effects of the cloud on HF propagation. Up to three new propagation paths were created by the artificial plasma injections. Observations and modeling confirm that the small amounts of ionized material injected in the lower F region resulted in significant changes to the natural HF propagation environment.
Six Years of Monitoring of the Sgr B2 Molecular Cloud with INTEGRAL
NASA Astrophysics Data System (ADS)
Terrier, R.; Bélanger, G.; Ponti, G.; Trap, G.; Goldwurm, A.; Decourchelle, A.
2009-05-01
Several molecular clouds around the Galactic Centre (GC) emit a strong neutral iron fluorescence line at 6.4 keV, as well as hard X-ray emission up to 100 keV. The origin of this emission has long been a matter of controversy: irradiation by low-energy cosmic-ray electrons, or X-rays emitted by a nearby flaring source in the central region. Recent evidence for time variability in the iron line intensity detected in the Sgr B2 cloud favors the reflection scenario. We present here the data obtained after 6 years of INTEGRAL monitoring of the GC. In particular, we show a lightcurve of Sgr B2 that reveals a decrease in the hard X-ray flux over recent years and discuss its implications. We finally discuss perspectives with Simbol-X.
Multidimensional photoemission spectroscopy—the space-charge limit
NASA Astrophysics Data System (ADS)
Schönhense, B.; Medjanik, K.; Fedchenko, O.; Chernov, S.; Ellguth, M.; Vasilyev, D.; Oelsner, A.; Viefhaus, J.; Kutnyakhov, D.; Wurth, W.; Elmers, H. J.; Schönhense, G.
2018-03-01
Photoelectron spectroscopy, especially at pulsed sources, is ultimately limited by the Coulomb interaction in the electron cloud, changing energy and angular distribution of the photoelectrons. A detailed understanding of this phenomenon is crucial for future pump-probe photoemission studies at (x-ray) free electron lasers and high-harmonic photon sources. Measurements have been performed for Ir(111) at hν = 1000 eV with photon flux densities between ~10^2 and 10^4 photons per pulse and μm^2 (beamline P04/PETRA III, DESY Hamburg), revealing space-charge induced energy shifts of up to 10 eV. In order to correct the essential part of the energy shift and restore the electron distributions close to the Fermi energy, we developed a semi-analytical theory for the space-charge effect in cathode-lens instruments (momentum microscopes, photoemission electron microscopes). The theory predicts a Lorentzian profile of energy isosurfaces and allows us to quantify the charge cloud from measured energy profiles. The correction is essential for the determination of the Fermi surface, as we demonstrate by means of 'k-space movies' for the prototypical high-Z material tungsten. In an energy interval of about 1 eV below the Fermi edge, the bandstructure can be restored up to substantial shifts of ~7 eV. Scattered photoelectrons strongly enhance the inelastic background in the region several eV below E_F, proving that the majority of scattering events involves a slow electron. The correction yields a gain of two orders of magnitude in usable intensity compared with the uncorrected case (assuming a tolerable shift of 250 meV). The results are particularly important for future experiments at SASE-type free electron lasers, since the correction also works for strongly fluctuating (but known) pulse intensities.
Earth observations taken from orbiter Discovery during STS-91 mission
2016-08-24
STS091-708-077 (2-12 June 1998) -- The cloud shadows grew long as the STS-91 astronauts aboard the Space Shuttle Discovery approached the dark side of the Earth during "sunset" over Poland. The taller building cumulus clouds cast shadows over the lower clouds.
Morphology and ionization of the interstellar cloud surrounding the solar system.
Frisch, P C
1994-09-02
The first encounter between the sun and the surrounding interstellar cloud appears to have occurred 2000 to 8000 years ago. The sun and cloud space motions are nearly perpendicular, an indication that the sun is skimming the cloud surface. The electron density derived for the surrounding cloud from the carbon component of the anomalous cosmic ray population in the solar system and from the interstellar ratio of Mg(+) to Mg(0) toward Sirius supports an equilibrium model for cloud ionization (an electron density of 0.22 to 0.44 per cubic centimeter). The upwind magnetic field direction is nearly parallel to the cloud surface. The relative sun-cloud motion indicates that the solar system has a bow shock.
Clinical implementation of MOSFET detectors for dosimetry in electron beams.
Bloemen-van Gurp, Esther J; Minken, Andre W H; Mijnheer, Ben J; Dehing-Oberye, Cary J G; Lambin, Philippe
2006-09-01
To determine the factors converting the reading of a MOSFET detector placed on the patient's skin without additional build-up to the dose at the depth of dose maximum (D(max)) and investigate their feasibility for in vivo dose measurements in electron beams. Factors were determined to relate the reading of a MOSFET detector to D(max) for 4 - 15 MeV electron beams in reference conditions. The influence of variation in field size, SSD, angle and field shape on the MOSFET reading, obtained without additional build-up, was evaluated using 4, 8 and 15 MeV beams and compared to ionisation chamber data at the depth of dose maximum (z(max)). Patient entrance in vivo measurements included 40 patients, mostly treated for breast tumours. The MOSFET reading, converted to D(max), was compared to the dose prescribed at this depth. The factors to convert MOSFET reading to D(max) vary between 1.33 and 1.20 for the 4 and 15 MeV beams, respectively. The SSD correction factor is approximately 8% for a change in SSD from 95 to 100 cm, and 2% for each 5-cm increment above 100 cm SSD. A correction for fields having sides smaller than 6 cm and for irregular field shape is also recommended. For fields up to 20 x 20 cm(2) and for oblique incidence up to 45 degrees, a correction is not necessary. Patient measurements demonstrated deviations from the prescribed dose with a mean difference of -0.7% and a standard deviation of 2.9%. Performing dose measurements with MOSFET detectors placed on the patient's skin without additional build-up is a well suited technique for routine dose verification in electron beams, when applying the appropriate conversion and correction factors.
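A hedged sketch of how such conversion and correction factors might be combined in practice is shown below; the numerical values are those quoted in the abstract (1.33 at 4 MeV, 1.20 at 15 MeV, ~8% for 95 to 100 cm SSD, 2% per 5 cm beyond 100 cm), while the interpolation scheme and function names are illustrative assumptions rather than the published protocol.

```python
import numpy as np

# Energy-dependent factors converting a surface MOSFET reading (no build-up)
# to dose at the depth of dose maximum, interpolated between the values
# quoted for 4 and 15 MeV.
ENERGIES_MEV = np.array([4.0, 15.0])
CF_READING_TO_DMAX = np.array([1.33, 1.20])

def ssd_correction(ssd_cm):
    """Illustrative model of the quoted SSD dependence: ~8% from 95 to 100 cm,
    then ~2% per additional 5 cm."""
    corr = 1.0
    if ssd_cm > 95.0:
        corr *= 1.0 + 0.08 * min(ssd_cm - 95.0, 5.0) / 5.0
    if ssd_cm > 100.0:
        corr *= 1.0 + 0.02 * (ssd_cm - 100.0) / 5.0
    return corr

def mosfet_to_dmax(reading_cGy, energy_MeV, ssd_cm=100.0):
    cf = np.interp(energy_MeV, ENERGIES_MEV, CF_READING_TO_DMAX)
    return reading_cGy * cf * ssd_correction(ssd_cm)

print(round(mosfet_to_dmax(reading_cGy=150.0, energy_MeV=8.0, ssd_cm=105.0), 1))
```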
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wokoma, S; Yoon, J; Jung, J
2014-06-01
Purpose: To investigate the impact of custom-made build-up caps for a diode detector in robotic radiosurgery radiation fields with a variable collimator (IRIS) for collimator scatter factor (Sc) calculation. Methods: An acrylic cap was custom-made to fit our SFD (IBA Dosimetry, Germany) diode detector. The cap has a thickness of 5 cm, corresponding to a depth beyond electron contamination. IAEA phase space data were used for beam modeling and the DOSRZnrc code was used to model the detector. The detector was positioned at 80 cm source-to-detector distance. Calculations were performed with the SFD, with and without the build-up cap, for clinical IRIS settings ranging from 7.5 to 60 mm. Results: The collimator scatter factors were calculated with and without the 5 cm build-up cap. They agreed within 3% except for the 15 mm cone. The Sc factor for the 15 mm cone without build-up was 13.2% lower than that with build-up. Conclusion: Sc data are a critical component in advanced algorithms for treatment planning in order to calculate the dose accurately. After incorporating the build-up cap, we discovered differences of up to 13.2% in Sc factors for the SFD detector when compared against in-air measurements without build-up caps.
Challenges with Electrical, Electronics, and Electromechanical Parts for James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Jah, Muzar A.; Jeffers, Basil S.
2016-01-01
James Webb Space Telescope (JWST) is the space-based observatory that will extend the knowledge gained by the Hubble Space Telescope (HST). Hubble focuses on optical and ultraviolet wavelengths while JWST focuses on the infrared portion of the electromagnetic spectrum, to see the earliest stars and galaxies that formed in the Universe and to look deep into nearby dust clouds to study the formation of stars and planets. JWST, whose development began in 1996, is scheduled to launch in 2018. It includes a suite of four instruments, the spacecraft bus, the optical telescope element, the Integrated Science Instrument Module (ISIM, the platform that holds the instruments), and a sunshield. The mass of JWST is approximately 6200 kg, including the observatory, on-orbit consumables and the launch vehicle adaptor. Many challenges were overcome while providing the electrical and electronic components for the Goddard Space Flight Center hardware builds. Other difficulties encountered included developing components to work at cryogenic temperatures, failures of electronic components during development and flight builds, integration and test electronic parts problems, and managing technical issues with international partners. This paper presents the context of JWST from an EEE (electrical, electronic, and electromechanical) perspective with examples of challenges and lessons learned throughout the design, development, and fabrication of JWST in cooperation with our associated partners, including the Canadian Space Agency (CSA), the European Space Agency (ESA), Lockheed Martin and their respective associated partners. Technical challenges and lessons learned will be discussed.
The Role of Standards in Cloud-Computing Interoperability
2012-10-01
… services are not shared outside the organization. CloudStack, Eucalyptus, HP, Microsoft, OpenStack, Ubuntu, and VMWare provide tools for building private clouds. … data center requirements; developing usage models for cloud vendors; independent IT consortium. OpenStack (http://www.openstack.org): open-source software for running private clouds; currently consists of three core software projects: OpenStack Compute (Nova), OpenStack Object Storage (Swift), …
Visualization of the Construction of Ancient Roman Buildings in Ostia Using Point Cloud Data
NASA Astrophysics Data System (ADS)
Hori, Y.; Ogawa, T.
2017-02-01
The implementation of laser scanning in the field of archaeology provides us with an entirely new dimension in research and surveying. It allows us to digitally recreate individual objects, or entire cities, using millions of three-dimensional points grouped together in what is referred to as "point clouds". In addition, the visualization of the point cloud data, which can be used in the final report by archaeologists and architects, should usually be produced as a JPG or TIFF file. Not only the visualization of point cloud data, but also the re-examination of older data and new surveys of the construction of Roman buildings, applying remote-sensing technology for precise and detailed measurements, afford new information that may lead to revised drawings of ancient buildings which had been adduced as evidence without any consideration of their degree of accuracy, and can finally enable new research on ancient buildings. We used laser scanners in the field because of their speed, comprehensive coverage, accuracy and flexibility of data manipulation. Therefore, we "skipped" much of the post-processing and focused on the images created from the metadata, simply aligned using a tool which extends an automatic feature-matching algorithm and a popular renderer that can provide graphic results.
Large-scale urban point cloud labeling and reconstruction
NASA Astrophysics Data System (ADS)
Zhang, Liqiang; Li, Zhuqiang; Li, Anjian; Liu, Fangyu
2018-04-01
The large number of object categories and many overlapping or closely neighboring objects in large-scale urban scenes pose great challenges in point cloud classification. In this paper, a novel framework is proposed for classification and reconstruction of airborne laser scanning point cloud data. To label point clouds, we present a rectified linear units neural network named ReLu-NN, in which rectified linear units (ReLu) instead of the traditional sigmoid are taken as the activation function in order to speed up the convergence. Since the features of the point cloud are sparse, we reduce the number of neurons by dropout to avoid over-fitting of the training process. The set of feature descriptors for each 3D point is encoded through self-taught learning, and forms a discriminative feature representation which is taken as the input of the ReLu-NN. The segmented building points are consolidated through an edge-aware point set resampling algorithm, and then they are reconstructed into 3D lightweight models using the 2.5D contouring method (Zhou and Neumann, 2010). Compared with deep learning approaches, the ReLu-NN introduced can easily classify unorganized point clouds without rasterizing the data, and it does not need a large number of training samples. Most of the parameters in the network are learned, and thus the intensive parameter tuning cost is significantly reduced. Experimental results on various datasets demonstrate that the proposed framework achieves better performance than other related algorithms in terms of classification accuracy and reconstruction quality.
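A hedged, minimal sketch of the kind of ReLU network with dropout described above is given below; the layer sizes, feature dimension, and class count are assumptions for illustration and do not reproduce ReLu-NN itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def forward(x, weights, biases, drop_p=0.3, train=True):
    """Forward pass of a small ReLU MLP with inverted dropout on hidden layers."""
    h = x
    for i, (W, b) in enumerate(zip(weights, biases)):
        h = h @ W + b
        if i < len(weights) - 1:          # hidden layers only
            h = relu(h)
            if train:
                mask = (rng.random(h.shape) > drop_p) / (1.0 - drop_p)
                h = h * mask
    # softmax over point classes (e.g. building, ground, vegetation)
    e = np.exp(h - h.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# 64-dimensional per-point feature descriptors, 3 classes (illustrative)
W1, b1 = rng.normal(0, 0.1, (64, 32)), np.zeros(32)
W2, b2 = rng.normal(0, 0.1, (32, 3)), np.zeros(3)
features = rng.normal(size=(5, 64))
print(forward(features, [W1, W2], [b1, b2], train=False).round(3))
```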
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Murphy, K. J.; Baynes, K.; Lynnes, C.
2016-12-01
With the volume of Earth observation data expanding rapidly, cloud computing is quickly changing the way Earth observation data are processed, analyzed, and visualized. The cloud infrastructure provides the flexibility to scale up to large volumes of data and handle high velocity data streams efficiently. Having freely available Earth observation data collocated on a cloud infrastructure creates opportunities for innovation and value-added data re-use in ways unforeseen by the original data provider. These innovations spur new industries and applications and spawn new scientific pathways that were previously limited due to data volume and computational infrastructure issues. NASA, in collaboration with Amazon, Google, and Microsoft, has jointly developed a set of recommendations to enable efficient transfer of Earth observation data from existing data systems to a cloud computing infrastructure. The purpose of these recommendations is to provide guidelines against which all data providers can evaluate existing data systems, and which can be used to resolve any issues uncovered, to enable efficient search, access, and use of large volumes of data. Additionally, these guidelines ensure that all cloud providers utilize a common methodology for bulk-downloading data from data providers, thus preventing the data providers from building custom capabilities to meet the needs of individual cloud providers. The intent is to share these recommendations with other Federal agencies and organizations that serve Earth observation data. Additionally, the adoption of these recommendations will benefit data users interested in moving large volumes of data from data systems to any other location. These data users include the cloud providers, cloud users such as scientists, and other users working in a high performance computing environment who need to move large volumes of data.
NASA Astrophysics Data System (ADS)
Jeon, Hosang; Nam, Jiho; Lee, Jayoung; Park, Dahl; Baek, Cheol-Ha; Kim, Wontaek; Ki, Yongkan; Kim, Dongwon
2015-06-01
Accurate dose delivery is crucial to the success of modern radiotherapy. To evaluate the dose actually delivered to patients, in-vivo dosimetry (IVD) is generally performed during radiotherapy to measure the entrance doses. In IVD, a build-up device should be placed on top of an in-vivo dosimeter to satisfy the electron equilibrium condition. However, a build-up device made of tissue-equivalent material or metal may perturb dose delivery to a patient, and requires an additional laborious and time-consuming process. We developed a novel IVD method using a look-up table of conversion ratios instead of a build-up device. We validated this method through a Monte Carlo simulation and 31 clinical trials. The mean error of clinical IVD is 3.17% (standard deviation: 2.58%), which is comparable to that of conventional IVD methods. Moreover, the required time was greatly reduced, so that the efficiency of IVD could be improved for both patients and therapists.
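The hedged snippet below illustrates the general idea of replacing a physical build-up device with a look-up table of conversion ratios; the table entries and the choice of field size as the table parameter are purely illustrative assumptions, not the published table.

```python
import numpy as np

# Illustrative look-up table: conversion ratio from a build-up-free surface
# reading to entrance dose, tabulated against equivalent square field size (cm).
FIELD_SIZE_CM = np.array([5.0, 10.0, 15.0, 20.0])
CONVERSION_RATIO = np.array([1.42, 1.35, 1.31, 1.28])   # assumed values

def entrance_dose(reading_cGy, field_size_cm):
    """Interpolate the conversion ratio and apply it to the raw reading."""
    ratio = np.interp(field_size_cm, FIELD_SIZE_CM, CONVERSION_RATIO)
    return reading_cGy * ratio

print(round(entrance_dose(reading_cGy=120.0, field_size_cm=12.0), 1))
```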
NASA Technical Reports Server (NTRS)
Sadowy, Gregory; Tanelli, Simone; Chamberlain, Neil; Durden, Stephen; Fung, Andy; Sanchez-Barbetty, Mauricio; Thrivikraman, Tushar
2013-01-01
The National Research Council's Earth Science Decadal Survey (NRCDS) has identified the Aerosol/Climate/Ecosystems (ACE) Mission as a priority mission for NASA Earth science. The NRC recommended the inclusion of "a cross-track scanning cloud radar with channels at 94 GHz and possibly 34 GHz for measurement of cloud droplet size, glaciation height, and cloud height". Several radar concepts have been proposed that meet some of the requirements of the proposed ACE mission, but none have provided scanning capability at both 34 and 94 GHz due to the challenge of constructing scanning antennas at 94 GHz. In this paper, we describe a radar design that leverages new developments in microwave monolithic integrated circuits (MMICs) and micro-machining to enable an electronically scanned radar with both Ka-band (35 GHz) and W-band (94 GHz) channels. This system uses a dual-frequency linear active electronically steered array (AESA) combined with a parabolic cylindrical reflector. This configuration provides a large aperture (3 m x 5 m) with electronic steering but is much simpler than a two-dimensional AESA of similar size. Still, the W-band frequency requires element spacing of approximately 2.5 mm, presenting significant challenges for signal routing and incorporation of MMICs. By combining gallium nitride (GaN) MMIC technology with micro-machined radiators and interconnects and silicon-germanium (SiGe) beamforming MMICs, we are able to meet all the performance and packaging requirements of the linear array feed and enable simultaneous scanning of Ka-band and W-band radars over a swath of up to 100 km.
NASA Astrophysics Data System (ADS)
Griesbaum, Luisa; Marx, Sabrina; Höfle, Bernhard
2017-07-01
In recent years, the number of people affected by flooding caused by extreme weather events has increased considerably. In order to provide support in disaster recovery or to develop mitigation plans, accurate flood information is necessary. Particularly pluvial urban floods, characterized by high temporal and spatial variations, are not well documented. This study proposes a new, low-cost approach to determining local flood elevation and inundation depth of buildings based on user-generated flood images. It first applies close-range digital photogrammetry to generate a geo-referenced 3-D point cloud. Second, based on estimated camera orientation parameters, the flood level captured in a single flood image is mapped to the previously derived point cloud. The local flood elevation and the building inundation depth can then be derived automatically from the point cloud. The proposed method is carried out once for each of 66 different flood images showing the same building façade. An overall accuracy of 0.05 m with an uncertainty of ±0.13 m for the derived flood elevation within the area of interest as well as an accuracy of 0.13 m ± 0.10 m for the determined building inundation depth is achieved. Our results demonstrate that the proposed method can provide reliable flood information on a local scale using user-generated flood images as input. The approach can thus allow inundation depth maps to be derived even in complex urban environments with relatively high accuracies.
NASA Astrophysics Data System (ADS)
Bassier, M.; Bonduel, M.; Van Genechten, B.; Vergauwen, M.
2017-11-01
Point cloud segmentation is a crucial step in scene understanding and interpretation. The goal is to decompose the initial data into sets of workable clusters with similar properties. Additionally, it is a key aspect in the automated procedure from point cloud data to BIM. Current approaches typically only segment a single type of primitive such as planes or cylinders. Also, current algorithms suffer from oversegmenting the data and are often sensor or scene dependent. In this work, a method is presented to automatically segment large unstructured point clouds of buildings. More specifically, the segmentation is formulated as a graph optimisation problem. First, the data is oversegmented with a greedy octree-based region growing method. The growing is conditioned on the segmentation of planes as well as smooth surfaces. Next, the candidate clusters are represented by a Conditional Random Field after which the most likely configuration of candidate clusters is computed given a set of local and contextual features. The experiments prove that the used method is a fast and reliable framework for unstructured point cloud segmentation. Processing speeds up to 40,000 points per second are recorded for the region growing. Additionally, the recall and precision of the graph clustering is approximately 80%. Overall, nearly 22% of oversegmentation is reduced by clustering the data. These clusters will be classified and used as a basis for the reconstruction of BIM models.
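A hedged, simplified sketch of greedy region growing over a point cloud is given below to make the oversegmentation step concrete; the neighbourhood radius, normal-angle threshold, and use of a KD-tree are assumptions and do not reproduce the paper's octree-based growing conditions.

```python
import numpy as np
from scipy.spatial import cKDTree

def region_growing(points, normals, radius=0.2, angle_deg=10.0):
    """Greedily group points whose normals differ by less than angle_deg."""
    tree = cKDTree(points)
    cos_thr = np.cos(np.radians(angle_deg))
    labels = -np.ones(len(points), dtype=int)
    current = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        stack, labels[seed] = [seed], current
        while stack:
            i = stack.pop()
            for j in tree.query_ball_point(points[i], radius):
                if labels[j] == -1 and abs(np.dot(normals[i], normals[j])) > cos_thr:
                    labels[j] = current
                    stack.append(j)
        current += 1
    return labels

# Two parallel planar patches with identical normals (illustrative data)
pts = np.vstack([np.c_[np.random.rand(200, 2), np.zeros(200)],
                 np.c_[np.random.rand(200, 2), np.full(200, 2.0)]])
nrm = np.tile([0.0, 0.0, 1.0], (400, 1))
print(np.unique(region_growing(pts, nrm)))
```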
NASA Astrophysics Data System (ADS)
Sirmacek, B.; Lindenbergh, R. C.; Menenti, M.
2013-10-01
Fusion of 3D airborne laser (LIDAR) data and terrestrial optical imagery can be applied in 3D urban modeling and model updating. The most challenging aspect of the fusion procedure is registering the terrestrial optical images onto the LIDAR point clouds. In this article, we propose an approach for registering these two types of data from different sensor sources. We use iPhone camera images, taken in front of the urban structure of interest by the application user, and high-resolution LIDAR point clouds acquired by an airborne laser sensor. After finding the photo capture position and orientation from the iPhone photograph metafile, we automatically select the area of interest in the point cloud and transform it into a range image whose grayscale intensity levels encode the distance from the image acquisition position. We exploit local features to register the iPhone image to the generated range image; the registration process is based on local feature extraction and graph matching. Finally, the registration result is used for facade texture mapping onto the 3D building surface mesh, which is generated from the LIDAR point cloud. Our experimental results indicate the possible usage of the proposed algorithm framework for 3D urban map updating and enhancement purposes.
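To make the range-image step concrete, the hedged sketch below projects a point cloud into a pinhole camera at a known position and orientation and stores the nearest per-pixel distance as a grayscale value; the focal length, image size, and synthetic data are assumptions.

```python
import numpy as np

def range_image(points, cam_pos, R, f_px=800.0, size=(480, 640)):
    """Project 3-D points into a pinhole camera and keep the nearest range per pixel.

    points : (N, 3) world coordinates
    cam_pos: (3,) camera centre, R: (3, 3) world-to-camera rotation
    """
    h, w = size
    img = np.full((h, w), np.inf)
    pc = (points - cam_pos) @ R.T              # camera coordinates
    pc = pc[pc[:, 2] > 0.1]                    # keep points in front of the camera
    u = (f_px * pc[:, 0] / pc[:, 2] + w / 2).astype(int)
    v = (f_px * pc[:, 1] / pc[:, 2] + h / 2).astype(int)
    rng = np.linalg.norm(pc, axis=1)
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    for ui, vi, ri in zip(u[ok], v[ok], rng[ok]):
        img[vi, ui] = min(img[vi, ui], ri)     # z-buffer: nearest point wins
    img[np.isinf(img)] = 0.0
    return img

pts = np.random.rand(5000, 3) * [20, 10, 5] + [0, 0, 10]
print(range_image(pts, cam_pos=np.zeros(3), R=np.eye(3)).max().round(2))
```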
THE LOCATION, CLUSTERING, AND PROPAGATION OF MASSIVE STAR FORMATION IN GIANT MOLECULAR CLOUDS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ochsendorf, Bram B.; Meixner, Margaret; Chastenet, Jérémy
Massive stars are key players in the evolution of galaxies, yet their formation pathway remains unclear. In this work, we use data from several galaxy-wide surveys to build an unbiased data set of ∼600 massive young stellar objects, ∼200 giant molecular clouds (GMCs), and ∼100 young (<10 Myr) optical stellar clusters (SCs) in the Large Magellanic Cloud. We employ this data to quantitatively study the location and clustering of massive star formation and its relation to the internal structure of GMCs. We reveal that massive stars do not typically form at the highest column densities nor centers of their parent GMCs at the ∼6 pc resolution of our observations. Massive star formation clusters over multiple generations and on size scales much smaller than the size of the parent GMC. We find that massive star formation is significantly boosted in clouds near SCs. However, whether a cloud is associated with an SC does not depend on either the cloud's mass or global surface density. These results reveal a connection between different generations of massive stars on timescales up to 10 Myr. We compare our work with Galactic studies and discuss our findings in terms of GMC collapse, triggered star formation, and a potential dichotomy between low- and high-mass star formation.
Distributed clinical data sharing via dynamic access-control policy transformation.
Rezaeibagha, Fatemeh; Mu, Yi
2016-05-01
Data sharing in electronic health record (EHR) systems is important for improving the quality of healthcare delivery. Data sharing, however, has raised some security and privacy concerns because healthcare data could be potentially accessible by a variety of users, which could lead to privacy exposure of patients. Without addressing this issue, large-scale adoption and sharing of EHR data are impractical. The traditional solution to the problem is via encryption. Although encryption can be applied to access control, it is not applicable for complex EHR systems that require multiple domains (e.g. public and private clouds) with various access requirements. This study was carried out to address the security and privacy issues of EHR data sharing with our novel access-control mechanism, which captures the scenario of the hybrid clouds and need of access-control policy transformation, to provide secure and privacy-preserving data sharing among different healthcare enterprises. We introduce an access-control mechanism with some cryptographic building blocks and present a novel approach for secure EHR data sharing and access-control policy transformation in EHR systems for hybrid clouds. We propose a useful data sharing system for healthcare providers to handle various EHR users who have various access privileges in different cloud environments. A systematic study has been conducted on data sharing in EHR systems to provide a solution to the security and privacy issues. In conclusion, we introduce an access-control method for privacy protection of EHRs and EHR policy transformation that allows an EHR access-control policy to be transformed from a private cloud to a public cloud. This method has never been studied previously in the literature. Furthermore, we provide a protocol to demonstrate policy transformation as an application scenario.
Electron cloud simulations for the main ring of J-PARC
NASA Astrophysics Data System (ADS)
Yee-Rendon, Bruce; Muto, Ryotaro; Ohmi, Kazuhito; Satou, Kenichirou; Tomizawa, Masahito; Toyama, Takeshi
2017-07-01
The simulation of beam instabilities is a helpful tool to evaluate potential threats to the machine protection of high intensity beams. At the Main Ring (MR) of J-PARC, signals related to the electron cloud have been observed during the slow beam extraction mode. Hence, several studies were conducted to investigate the mechanism that produces it; the results confirmed a strong dependence of the electron cloud formation on the beam intensity and the bunch structure, but the precise explanation of its trigger conditions remains incomplete. To shed light on the problem, electron cloud simulations were performed using an updated version of the computational model developed in previous works at KEK. The code employed the measured signals to reproduce the events seen during the surveys.
2012-08-01
The first phase consisted of Shared Services, Threat Detection and Reporting, and the Remote Weapon Station (RWS) build-up and validation. The … Awareness build-up and validation. The first phase consisted of the development of the shared services, or core services, that are required by many … C4ISR/EW systems. The shared services include: time synchronization, position, direction of travel, and orientation. Time synchronization is …
Dorninger, Peter; Pfeifer, Norbert
2008-01-01
Three-dimensional city models are necessary for supporting numerous management applications. For the determination of city models for visualization purposes, several standardized workflows do exist. They are either based on photogrammetry or on LiDAR or on a combination of both data acquisition techniques. However, the automated determination of reliable and highly accurate city models is still a challenging task, requiring a workflow comprising several processing steps. The most relevant are building detection, building outline generation, building modeling, and finally, building quality analysis. Commercial software tools for building modeling generally require a high degree of human interaction, and most automated approaches described in the literature address the steps of such a workflow individually. In this article, we propose a comprehensive approach for the automated determination of 3D city models from airborne acquired point cloud data. It is based on the assumption that individual buildings can be modeled properly by a composition of a set of planar faces. Hence, it is based on a reliable 3D segmentation algorithm, detecting planar faces in a point cloud. This segmentation is of crucial importance for the outline detection and for the modeling approach. We describe the theoretical background, the segmentation algorithm, the outline detection, and the modeling approach, and we present and discuss several actual projects. PMID:27873931
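Since the workflow rests on detecting planar faces in the point cloud, a hedged sketch of a basic RANSAC plane detector is given below as an illustration of that building block; the thresholds, iteration count, and synthetic data are assumptions, and this is not the authors' segmentation algorithm.

```python
import numpy as np

def ransac_plane(points, n_iter=500, dist_thr=0.05, rng=np.random.default_rng(1)):
    """Return (normal, d, inlier_mask) of the best plane n.x + d = 0 found."""
    best_inliers = np.zeros(len(points), dtype=bool)
    best_plane = (np.array([0.0, 0.0, 1.0]), 0.0)
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue                      # degenerate (collinear) sample
        n = n / norm
        d = -np.dot(n, sample[0])
        inliers = np.abs(points @ n + d) < dist_thr
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (n, d)
    return best_plane[0], best_plane[1], best_inliers

# Noisy horizontal roof patch plus scattered outliers (illustrative)
roof = np.c_[np.random.rand(500, 2) * 10, 3.0 + np.random.normal(0, 0.01, 500)]
noise = np.random.rand(100, 3) * [10, 10, 6]
n, d, mask = ransac_plane(np.vstack([roof, noise]))
print(n.round(2), round(d, 2), int(mask.sum()), "inliers")
```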
NASA Astrophysics Data System (ADS)
Macher, H.; Landes, T.; Grussenmeyer, P.
2016-06-01
Laser scanners are widely used for the modelling of existing buildings and particularly in the creation process of as-built BIM (Building Information Modelling). However, the generation of as-built BIM from point clouds involves mainly manual steps and it is consequently time-consuming and error-prone. Along the path to automation, a three-step segmentation approach has been developed. This approach is composed of two phases: a segmentation into sub-spaces, namely floors and rooms, and a plane segmentation combined with the identification of building elements. In order to assess and validate the developed approach, different case studies are considered. Indeed, it is essential to apply algorithms to several datasets and not to develop them on a single dataset whose particularities could bias the development. Indoor point clouds of different types of buildings will be used as input for the developed algorithms, ranging from an individual house of almost one hundred square meters to larger buildings of several thousand square meters. The datasets provide various space configurations and present numerous different occluding objects such as desks, computer equipment, home furnishings and even wine barrels. For each dataset, the results will be illustrated. The analysis of the results will provide an insight into the transferability of the developed approach for the indoor modelling of several types of buildings.
First Prismatic Building Model Reconstruction from Tomosar Point Clouds
NASA Astrophysics Data System (ADS)
Sun, Y.; Shahzad, M.; Zhu, X.
2016-06-01
This paper demonstrates for the first time the potential of explicitly modelling the individual roof surfaces to reconstruct 3-D prismatic building models using spaceborne tomographic synthetic aperture radar (TomoSAR) point clouds. The proposed approach is modular and works as follows: it first extracts the buildings by generating a DSM and cutting off the ground terrain. The DSM is smoothed using the BM3D denoising method proposed in (Dabov et al., 2007), and a gradient map of the smoothed DSM is generated based on height jumps. Watershed segmentation is then adopted to oversegment the DSM into different regions. Subsequently, height- and polygon-complexity-constrained merging is employed to refine (i.e., to reduce) the retrieved number of roof segments. A coarse outline of each roof segment is then reconstructed and later refined using a quadtree-based regularization plus zig-zag line simplification scheme. Finally, a height is associated with each refined roof segment to obtain the 3-D prismatic model of the building. The proposed approach is illustrated and validated over a large building (convention center) in the city of Las Vegas using TomoSAR point clouds generated from a stack of 25 images using the Tomo-GENESIS software developed at DLR.
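To illustrate the gradient/watershed step in isolation, here is a hedged sketch on a synthetic DSM; the smoothing, thresholds, and segment merging of the actual pipeline are omitted, and all values are assumptions.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import sobel
from skimage.segmentation import watershed

# Synthetic DSM: two flat roof segments of different heights on flat ground
dsm = np.zeros((200, 200))
dsm[30:90, 40:160] = 10.0      # lower roof segment
dsm[90:150, 40:160] = 14.0     # higher roof segment

gradient = sobel(dsm)                       # height jumps between segments
markers, _ = ndi.label(gradient < 0.5)      # seed regions away from the jumps
labels = watershed(gradient, markers, mask=dsm > 1.0)   # segment roofs only

print(len(np.unique(labels)) - 1, "roof segments found")
```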
An approach of point cloud denoising based on improved bilateral filtering
NASA Astrophysics Data System (ADS)
Zheng, Zeling; Jia, Songmin; Zhang, Guoliang; Li, Xiuzhi; Zhang, Xiangyin
2018-04-01
An omnidirectional mobile platform is designed for building point clouds based on an improved filtering algorithm which is employed to handle the depth image. First, the mobile platform can move flexibly and the control interface is convenient to operate. Then, because the traditional bilateral filtering algorithm is time-consuming and inefficient, a novel method called local bilateral filtering (LBF) is proposed. LBF is applied to process the depth images obtained by the Kinect sensor. The results show that the noise removal is improved compared with bilateral filtering. In the offline condition, the color images and the processed depth images are used to build point clouds. Finally, experimental results demonstrate that our method improves the processing speed of the depth image and the quality of the point cloud that has been built.
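For reference, a hedged sketch of a plain bilateral filter applied to a depth image is shown below; this is the baseline filter, not the paper's LBF variant, and the window size, sigmas, and synthetic depth data are assumptions.

```python
import numpy as np

def bilateral_filter(depth, radius=3, sigma_s=2.0, sigma_r=0.05):
    """Edge-preserving smoothing of a depth image (metres)."""
    h, w = depth.shape
    pad = np.pad(depth, radius, mode='edge')
    out = np.empty_like(depth)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))   # domain kernel
    for i in range(h):
        for j in range(w):
            window = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rng_w = np.exp(-((window - depth[i, j])**2) / (2 * sigma_r**2))  # range kernel
            weights = spatial * rng_w
            out[i, j] = np.sum(weights * window) / np.sum(weights)
    return out

noisy = 1.0 + 0.01 * np.random.randn(64, 64)
noisy[:, 32:] += 0.5                      # a depth discontinuity to preserve
print(round(bilateral_filter(noisy).std(), 4), round(noisy.std(), 4))
```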
DAΦNE operation with electron-cloud-clearing electrodes.
Alesini, D; Drago, A; Gallo, A; Guiducci, S; Milardi, C; Stella, A; Zobov, M; De Santis, S; Demma, T; Raimondi, P
2013-03-22
The effects of an electron cloud (e-cloud) on beam dynamics are one of the major factors limiting the performance of high intensity positron, proton, and ion storage rings. In the electron-positron collider DAΦNE, a horizontal beam instability due to the electron-cloud effect has been identified as one of the main limitations on the maximum stored positron beam current and as a source of beam quality deterioration. During the last machine shutdown, in order to mitigate this instability, special electrodes were inserted in all dipole and wiggler magnets of the positron ring. This is the first installation of its kind worldwide: long metallic electrodes have been installed in all arcs of the collider positron ring and are currently used during machine operation in collision. This has allowed a number of unprecedented measurements (e-cloud instability growth rates, transverse beam size variation, tune shifts along the bunch train) where the e-cloud contribution is clearly evidenced by turning the electrodes on and off. In this Letter we briefly describe a novel design of the electrodes, while the main focus is on the experimental measurements. Here we report all results, which clearly indicate the effectiveness of the electrodes for e-cloud suppression.
Cloud cover estimation optical package: New facility, algorithms and techniques
NASA Astrophysics Data System (ADS)
Krinitskiy, Mikhail
2017-02-01
Short- and long-wave radiation are important components of the surface heat budget over sea and land. Estimating them requires accurate observations of cloud cover. While cloud cover is routinely observed visually, building accurate parameterizations also requires quantifying it with precise instrumental measurements. The major disadvantages of most existing cloud cameras are their complicated design and the inaccuracy of post-processing algorithms, which typically result in uncertainties of 20% to 30% in camera-based estimates of cloud cover, and an accuracy, in terms of true scoring against human-observed values, of typically less than 10%. We developed a new-generation package for cloud cover estimation, which provides much more accurate results and also allows additional characteristics to be measured. A new algorithm, namely SAIL GrIx, based on a routine approach, was also developed for this package. It uses a synthetic controlling index (the "grayness rate index"), which allows the background sunburn effect to be suppressed. This makes it possible to increase the reliability of the detection of optically thin clouds. The accuracy of this algorithm in terms of true scoring reached 30%. A further approach, namely SAIL GrIx ML, uses machine learning together with other signal processing techniques to increase the cloud cover estimation accuracy. The sun disk condition appears to be a strong feature in this kind of model. An artificial neural network model demonstrates the best quality, with accuracy in terms of true scoring increasing up to 95.5%. The new algorithm lets us modify the design of the optical sensing package and avoid the use of solar trackers, which makes the design of the cloud camera much more compact. The new cloud camera has already been tested in several missions across the Atlantic and Indian oceans on board IORAS research vessels.
ESA's Ice Cloud Imager on Metop Second Generation
NASA Astrophysics Data System (ADS)
Klein, Ulf; Loiselet, Marc; Mason, Graeme; Gonzalez, Raquel; Brandt, Michael
2016-04-01
Since 2006, the European contribution to operational meteorological observations from polar orbit has been provided by the Meteorological Operational (MetOp) satellites, which form the space segment of the EUMETSAT Polar System (EPS). The first MetOp satellite was launched in 2006, the second in 2012, and the third satellite is planned for launch in 2018. As part of the next generation EUMETSAT Polar System (EPS-SG), the MetOp Second Generation (MetOp-SG) satellites will provide continuity and enhancement of these observations in the 2021-2042 timeframe. The novel Ice Cloud Imager (ICI) is one of the instruments selected to be on board the MetOp-SG satellite "B". The main objective of the ICI is to enable cloud ice retrieval, with emphasis on cirrus clouds. ICI will provide information on cloud ice mean altitude, cloud ice water path and cloud ice effective radius. In addition, it will provide water vapour profile measurement capability. ICI is a 13-channel microwave/sub-millimetre wave radiometer, covering the frequency range from 183 GHz up to 664 GHz. The instrument is composed of a rotating part and a fixed part. The rotating part includes the main antenna, the feed assembly and the receiver electronics. The fixed part contains the hot calibration target, the reflector for viewing the cold sky and the electronics for the instrument control and interface with the platform. Between the fixed and the rotating part is the scan mechanism. The scan mechanism is not only responsible for rotating the instrument and providing its angular position, but must also pass through the power and data lines. The scan mechanism is controlled by the fully redundant Control and Drive Electronics. ICI is calibrated using an internal hot target and a cold sky mirror, which are viewed once per rotation. The internal hot target is a traditional pyramidal target. The hot target is covered by an annular shield during rotation, with only a small opening for the feed horns, to guarantee a stable environment. Also, in order to achieve very good radiometric accuracy and stability, the ICI instrument is designed with sun-shields to minimize sun intrusion at all possible sun angles. Details of the instrument design and the current development status will be presented.
Trirotron: triode rotating beam radio frequency amplifier
Lebacqz, Jean V.
1980-01-01
High efficiency amplification of radio frequencies to very high power levels including: establishing a cylindrical cloud of electrons; establishing an electrical field surrounding and coaxial with the electron cloud to bias the electrons to remain in the cloud; establishing a rotating electrical field that surrounds and is coaxial with the steady field, the circular path of the rotating field being one wavelength long, whereby the peak of one phase of the rotating field is used to accelerate electrons in a beam through the bias field in synchronism with the peak of the rotating field so that there is a beam of electrons continuously extracted from the cloud and rotating with the peak; establishing a steady electrical field that surrounds and is coaxial with the rotating field for high-energy radial acceleration of the rotating beam of electrons; and resonating the rotating beam of electrons within a space surrounding the second field, the space being selected to have a phase velocity equal to that of the rotating field to thereby produce a high-power output at the frequency of the rotating field.
An Imide-Based Pentacyclic Building Block for n-Type Organic Semiconductors
Wu, Fu-Peng; Un, Hio-Ieng; Li, Yongxi; ...
2017-10-09
For this study, a new electron-deficient unit with a fused pentacyclic ring system was developed by replacing the cyclopenta-1,3-diene of the electron-rich donor indacenodithiophene (IDT) with a cyclohepta-4,6-diene-1,3-diimide unit. The imide bridging endows BBI with a fixed planar configuration and with low energy levels for both the highest occupied molecular orbital (HOMO, -6.24 eV) and the lowest unoccupied molecular orbital (LUMO, -2.57 eV). Organic field-effect transistors (OFETs) based on BBI polymers exhibit electron mobility up to 0.34 cm^2 V^-1 s^-1, which indicates that BBI is a promising n-type building block for optoelectronics.
Interior, building 1205, view to southeast showing roof truss system, ...
Interior, building 1205, view to southeast showing roof truss system, sliding main doors, and roll up door at center to allow clearance for aircraft tail assembly, 90 mm lens plus electronic flash fill lighting. - Travis Air Force Base, Readiness Maintenance Hangar, W Street, Air Defense Command Readiness Area, Fairfield, Solano County, CA
A Novel College Network Resource Management Method using Cloud Computing
NASA Astrophysics Data System (ADS)
Lin, Chen
At present, college information construction mainly comprises the construction of college networks and management information systems, and many problems arise during the informatization process. Cloud computing is a development of distributed processing, parallel processing and grid computing: data are stored in the cloud, software and services are placed in the cloud and built on top of various standards and protocols, and they can be accessed through all kinds of equipment. This article introduces cloud computing and the functions of cloud computing, then analyzes the existing problems of college network resource management; the cloud computing technology and methods are then applied in the construction of a college information sharing platform.
Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic
Sanduja, S; Jewell, P; Aron, E; Pharai, N
2015-01-01
Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources available to expedite analysis and reporting. Cloud-based computing environments are available at a fraction of the time and effort when compared to traditional local datacenter-based solutions. This tutorial explains how to get started with building your own personal cloud computer cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic. PMID:26451333
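As a hedged illustration of the first step of such a setup, namely launching compute nodes programmatically, the snippet below uses boto3 to request EC2 instances; the AMI ID, key name, security group, and instance type are placeholders, and the NONMEM/PsN/Grid Engine installation covered by the tutorial is not shown.

```python
import boto3

# Placeholders -- substitute your own AMI, key pair and security group.
AMI_ID = "ami-0123456789abcdef0"
KEY_NAME = "my-cluster-key"
SECURITY_GROUP = "sg-0123456789abcdef0"

ec2 = boto3.client("ec2", region_name="us-east-1")

# Request a small pool of worker nodes for the Grid Engine cluster.
response = ec2.run_instances(
    ImageId=AMI_ID,
    InstanceType="c5.xlarge",
    MinCount=1,
    MaxCount=4,
    KeyName=KEY_NAME,
    SecurityGroupIds=[SECURITY_GROUP],
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "nonmem-worker"}],
    }],
)
print([i["InstanceId"] for i in response["Instances"]])
```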
A scalable approach for tree segmentation within small-footprint airborne LiDAR data
NASA Astrophysics Data System (ADS)
Hamraz, Hamid; Contreras, Marco A.; Zhang, Jun
2017-05-01
This paper presents a distributed approach that scales up to segment tree crowns within a LiDAR point cloud representing an arbitrarily large forested area. The approach uses a single-processor tree segmentation algorithm as a building block in order to process the data delivered in the shape of tiles in parallel. The distributed processing is performed in a master-slave manner, in which the master maintains the global map of the tiles and coordinates the slaves that segment tree crowns within and across the boundaries of the tiles. A minimal bias was introduced to the number of detected trees because of trees lying across the tile boundaries, which was quantified and adjusted for. Theoretical and experimental analyses of the runtime of the approach revealed a near linear speedup. The estimated number of trees categorized by crown class and the associated error margins as well as the height distribution of the detected trees aligned well with field estimations, verifying that the distributed approach works correctly. The approach enables providing information of individual tree locations and point cloud segments for a forest-level area in a timely manner, which can be used to create detailed remotely sensed forest inventories. Although the approach was presented for tree segmentation within LiDAR point clouds, the idea can also be generalized to scale up processing other big spatial datasets.
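A hedged, schematic sketch of the master-slave tiling idea is given below, using a local process pool in place of the paper's distributed implementation and a stub in place of the single-processor tree segmentation algorithm; tile size, thresholds, and the synthetic cloud are assumptions.

```python
import numpy as np
from multiprocessing import Pool

def segment_tile(tile):
    """Stub for the single-processor tree segmentation building block:
    here it just turns the count of canopy points into a toy tree estimate."""
    tile_id, points = tile
    n_trees = int((points[:, 2] > 5.0).sum() // 50)
    return tile_id, n_trees

def master(point_cloud, tile_size=100.0, workers=4):
    """Split the cloud into square tiles and farm them out to worker processes."""
    keys = np.floor(point_cloud[:, :2] / tile_size).astype(int)
    tiles = {}
    for k, p in zip(map(tuple, keys), point_cloud):
        tiles.setdefault(k, []).append(p)
    jobs = [(k, np.array(v)) for k, v in tiles.items()]
    with Pool(workers) as pool:
        results = pool.map(segment_tile, jobs)
    return dict(results)

if __name__ == "__main__":
    cloud = np.random.rand(20000, 3) * [400, 400, 30]   # synthetic forest block
    counts = master(cloud)
    print(len(counts), "tiles,", sum(counts.values()), "trees (toy estimate)")
```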
ERIC Educational Resources Information Center
Liao, Yuan
2011-01-01
The virtualization of computing resources, as represented by the sustained growth of cloud computing, continues to thrive. Information Technology departments are building their private clouds due to the perception of significant cost savings by managing all physical computing resources from a single point and assigning them to applications or…
Wiewiórka, Marek S; Messina, Antonio; Pacholewska, Alicja; Maffioletti, Sergio; Gawrysiak, Piotr; Okoniewski, Michał J
2014-09-15
Many time-consuming analyses of next-generation sequencing data can be addressed with modern cloud computing. Apache Hadoop-based solutions have become popular in genomics because of their scalability in a cloud infrastructure. So far, most of these tools have been used for batch data processing rather than interactive data querying. The SparkSeq software has been created to take advantage of a new MapReduce framework, Apache Spark, for next-generation sequencing data. SparkSeq is a general-purpose, flexible and easily extendable library for genomic cloud computing. It can be used to build genomic analysis pipelines in Scala and run them in an interactive way. SparkSeq opens up the possibility of customized ad hoc secondary analyses and iterative machine learning algorithms. This article demonstrates its scalability and overall fast performance by running analyses of sequencing datasets. Tests of SparkSeq also prove that the use of cache and the HDFS block size can be tuned for optimal performance on multiple worker nodes. Available under the open source Apache 2.0 license: https://bitbucket.org/mwiewiorka/sparkseq/.
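SparkSeq itself is a Scala library, so the fragment below is only a hedged, generic PySpark illustration of the interactive, MapReduce-style querying described above; the file name and record layout are hypothetical, and this is not SparkSeq's API.

```python
# Hedged sketch: interactive per-chromosome read counting with Apache Spark (PySpark).
from pyspark import SparkContext

sc = SparkContext(appName="ngs-adhoc-query")

# Hypothetical tab-separated alignment summary: chrom, pos, mapq
reads = sc.textFile("hdfs:///data/sample_alignments.tsv")

counts = (reads
          .map(lambda line: line.split("\t"))
          .filter(lambda f: int(f[2]) >= 30)        # keep reads with MAPQ >= 30
          .map(lambda f: (f[0], 1))                 # key by chromosome
          .reduceByKey(lambda a, b: a + b))         # count reads per chromosome

print(counts.collect())
sc.stop()
```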
High Temperature Planetary Nebulae in the Magellanic Clouds
NASA Astrophysics Data System (ADS)
Maran, Stephen P.
Following up on our recent discovery that a very hot planetary in the Small Magellanic Cloud has an extraordinary underabundance of carbon, we propose to observe two similar hot planetaries in the Clouds with IUE as part of an optical/UV investigation. The objectives are (1) to test the suggestion that high nebular electron temperatures can result from a strong deficiency of carbon that deprives the nebula of an important cooling channel; and (2) to determine accurate chemical abundances to constrain limits on the efficiency of "hot bottom burning" in massive progenitors of planetary nebulae. The targets are SMC 25 (Te = 34,000 K) and LMC 88 (Te = 25,500 K). These UV observations of targets not previously observed with IUE will be combined, for analysis, with visible wavelength spectra of both targets from the Anglo-Australian Telescope and the 2.3-m Siding Spring reflector. The objects will also be compared in the analysis stage with previous IUE observations (and consequent modeling) of type I planetaries in the Clouds. Model nebulae will be calculated, and physical parameters of the central stars will be inferred.
NASA Astrophysics Data System (ADS)
Jun, Byung-Il; Jones, T. W.
1999-02-01
We present two-dimensional MHD simulations of the evolution of a young Type Ia supernova remnant (SNR) during its interaction with an interstellar cloud of comparable size at impact. We include for the first time in such simulations explicit relativistic electron transport. This was done using a simplified treatment of the diffusion-advection equation, thus allowing us to model injection and acceleration of cosmic-ray electrons at shocks and their subsequent transport. From this information we also model radio synchrotron emission, including spectral information. The simulations were carried out in spherical coordinates with azimuthal symmetry and compare three different situations, each incorporating an initially uniform interstellar magnetic field oriented in the polar direction on the grid. In particular, we modeled the SNR-cloud interactions for a spherical cloud on the polar axis, a toroidal cloud whose axis is aligned with the polar axis, and, for comparison, a uniform medium with no cloud. We find that the evolution of the overrun cloud qualitatively resembles that seen in simulations of simpler but analogous situations: that is, the cloud is crushed and begins to be disrupted by Rayleigh-Taylor and Kelvin-Helmholtz instabilities. However, we demonstrate here that, in addition, the internal structure of the SNR is severely distorted as such clouds are engulfed. This has important dynamical and observational implications. The principal new conclusions we draw from these experiments are the following. (1) Independent of the cloud interaction, the SNR reverse shock can be an efficient site for particle acceleration in a young SNR. (2) The internal flows of the SNR become highly turbulent once it encounters a large cloud. (3) An initially uniform magnetic field is preferentially amplified along the magnetic equator of the SNR, primarily because of biased amplification in that region by Rayleigh-Taylor instabilities. A similar bias produces much greater enhancement to the magnetic energy in the SNR during an encounter with a cloud when the interstellar magnetic field is partially transverse to the expansion of the SNR. The enhanced magnetic fields have a significant radial component, independent of the field orientation external to the SNR. This leads to a strong equatorial bias in synchrotron brightness that could easily mask any enhancements to electron-acceleration efficiency near the magnetic equator of the SNR. Thus, to establish the latter effect, it will be essential to establish that the magnetic field in the brightest regions is actually tangential to the blast wave. (4) The filamentary radio structures correlate well with "turbulence-enhanced" magnetic structures, while the diffuse radio emission more closely follows the gas-density distribution within the SNR. (5) At these early times, the synchrotron spectral index due to electrons accelerated at the primary shocks should be close to 0.5 unless those shocks are modified by cosmic-ray proton pressures. While that result is predictable, we find that this simple result can be significantly complicated in practice by SNR interactions with clouds. Those events can produce regions with significantly steeper spectra. Especially if there are multiple cloud encounters, this interaction can lead to nonuniform spatial spectral distributions or, through turbulent mixing, produce a spectrum that is difficult to relate to the actual strength of the blast wave.
(6) Interaction with the cloud enhances the nonthermal electron population in the SNR in our simulations because of additional electron injection taking place in the shocks associated with the cloud. Together with point 3, this means that SNR-cloud encounters can significantly increase the radio emission from the SNR.
New Developments on the PSR Instability
NASA Astrophysics Data System (ADS)
Macek, Robert
2000-04-01
A strong, fast, transverse instability has long been observed at the Los Alamos Proton Storage Ring (PSR) where it is a limiting factor on peak intensity. Most of the characteristics and experimental data are consistent with a two-stream instability (e-p) arising from coupled oscillations of the proton beam and an electron cloud. In past operations, where the average intensity was limited by beam losses, the instability was controlled by sufficient rf voltage in the ring. The need for higher beam intensity has motivated new work to better understand and control the instability. Results will be presented from studies of the production and characteristics of the electron cloud at various locations in the ring for both stable and unstable beams and suppression of electron cloud generation by TiN coatings. Studies of additional or alternate controls include application of dual harmonic rf, damping of the instability by higher order multipoles, damping by X,Y coupling from skew quadrupoles and the use of inductive inserts to compensate longitudinal space charge forces. Use of a skew quadrupole, heated inductive inserts and higher rf voltage from a refurbished rf buncher has enabled the PSR to accumulate stable beam intensity up to 9.7 micro-Coulombs (6 E13 protons) per macropulse, a significant increase (60%) over the previous maximum of 6 micro-Coulombs (3.7 E13 protons). However, slow losses were rather high and must be reduced for routine operation at repetition rates of 20 Hz or higher.
APOLLO_NG - a probabilistic interpretation of the APOLLO legacy for AVHRR heritage channels
NASA Astrophysics Data System (ADS)
Klüser, L.; Killius, N.; Gesell, G.
2015-04-01
The cloud processing scheme APOLLO (AVHRR Processing scheme Over cLouds, Land and Ocean) has been in use for cloud detection and cloud property retrieval since the late 1980s. The physics of the APOLLO scheme still build the backbone of a range of cloud detection algorithms for AVHRR (Advanced Very High Resolution Radiometer) heritage instruments. The APOLLO_NG (APOLLO_NextGeneration) cloud processing scheme is a probabilistic interpretation of the original APOLLO method. While building upon the physical principles that have served well in the original APOLLO, a couple of additional variables have been introduced in APOLLO_NG. Cloud detection is not performed as a binary yes/no decision based on these physical principles but is expressed as a cloud probability for each satellite pixel. Consequently, the outcome of the algorithm can be tuned from clear-confident to cloud-confident depending on the purpose. The probabilistic approach allows retrieving not only the cloud properties (optical depth, effective radius, cloud top temperature and cloud water path) but also their uncertainties. APOLLO_NG is designed as a standalone cloud retrieval method robust enough for operational near-realtime use and for application to large amounts of historical satellite data. Thus the radiative transfer solution is approximated by the same two-stream approach which had also been used for the original APOLLO. This allows the algorithm to be applied to a wide range of sensors without the necessity of sensor-specific tuning. Moreover it allows for online calculation of the radiative transfer (i.e. within the retrieval algorithm), giving rise to a detailed probabilistic treatment of cloud variables. This study presents the algorithm for cloud detection and cloud property retrieval together with the physical principles from the APOLLO legacy it is based on. Furthermore, a couple of example results from NOAA-18 are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, S. Y.
We propose using beam scrubbing to mitigate the electron cloud effect in the eRHIC. The bunch number is adjusted below the heat load limit and then increased, as the secondary electron yield is reduced by the beam scrubbing, up to the design bunch number. Since the electron density threshold of beam instability is lower at injection, a preliminary injection scrubbing should go first, where large chromaticity can be used to keep the beam in the ring for scrubbing. After that, the beam can be ramped to full energy, allowing physics scrubbing. Simulations demonstrated that with beam scrubbing in a reasonable period of time, the eRHIC baseline design is feasible.
Outdoor Illegal Construction Identification Algorithm Based on 3D Point Cloud Segmentation
NASA Astrophysics Data System (ADS)
An, Lu; Guo, Baolong
2018-03-01
Recently, illegal constructions have been appearing frequently in our surroundings, seriously restricting the orderly development of urban modernization. 3D point cloud data technology can be used to identify illegal buildings and thus address this problem effectively. This paper proposes an outdoor illegal construction identification algorithm based on 3D point cloud segmentation. Initially, in order to save memory space and reduce processing time, a lossless point cloud compression method based on a minimum spanning tree is proposed. Then, a ground point removal method based on multi-scale filtering is introduced to increase accuracy. Finally, building clusters on the ground are obtained using a region growing method, and as a result the illegal constructions can be marked. The effectiveness of the proposed algorithm is verified using a public data set from the International Society for Photogrammetry and Remote Sensing (ISPRS).
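As a hedged sketch of the region-growing step mentioned above (not the authors' algorithm; the neighbourhood radius and size threshold are arbitrary), above-ground points can be grouped into candidate building clusters with a k-d tree neighbourhood search:

```python
# Minimal Euclidean region growing over an above-ground point cloud (illustrative only).
import numpy as np
from scipy.spatial import cKDTree

def grow_regions(points, radius=0.5, min_size=50):
    tree = cKDTree(points)
    labels = -np.ones(len(points), dtype=int)
    current = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        queue = [seed]
        labels[seed] = current
        while queue:
            idx = queue.pop()
            for nb in tree.query_ball_point(points[idx], r=radius):
                if labels[nb] == -1:
                    labels[nb] = current
                    queue.append(nb)
        current += 1
    sizes = np.bincount(labels)
    return labels, np.where(sizes >= min_size)[0]   # clusters large enough to be buildings

points = np.random.rand(1000, 3) * 20.0             # placeholder for ground-filtered points
labels, building_ids = grow_regions(points)
```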
Experimental Investigation of Electron Cloud Containment in a Nonuniform Magnetic Field
NASA Technical Reports Server (NTRS)
Eninger, J. E.
1974-01-01
Dense clouds of electrons were generated and studied in an axisymmetric, nonuniform magnetic field created by a short solenoid. The operation of the experiment was similar to that of a low-pressure (approximately 0.000001 Torr) magnetron discharge. Discharge current characteristics are presented as a function of pressure, magnetic field strength, voltage, and cathode end-plate location. The rotation of the electron cloud is determined from the frequency of diocotron waves. In the space charge saturated regime of operation, the cloud is found to rotate as a solid body with frequency close to V_a/Φ_a, where V_a is the anode voltage and Φ_a is the total magnetic flux. This result indicates that, in regions where electrons are present, the magnetic field lines are electrostatic equipotentials (E · B = 0). Equilibrium electron density distributions suggested by this condition are integrated with respect to total ionizing power and are found consistent with measured discharge currents.
The Public Health Community Platform, Electronic Case Reporting, and the Digital Bridge.
Cooney, Mary Ann; Iademarco, Michael F; Huang, Monica; MacKenzie, William R; Davidson, Arthur J
At the intersection of new technology advancements, ever-changing health policy, and fiscal constraints, public health agencies seek to leverage modern technical innovations and benefit from a more comprehensive and cooperative approach to transforming public health, health care, and other data into action. State health agencies recognized a way to advance population health was to integrate public health with clinical health data through electronic infectious disease case reporting. The Public Health Community Platform (PHCP) concept of bidirectional data flow and knowledge management became the foundation to build a cloud-based system connecting electronic health records to public health data for a select initial set of notifiable conditions. With challenges faced and lessons learned, significant progress was made and the PHCP grew into the Digital Bridge, a national governance model for systems change, bringing together software vendors, public health, and health care. As the model and technology advance together, opportunities to advance future connectivity solutions for both health care and public health will emerge.
NASA Astrophysics Data System (ADS)
Gleason, Alyx; Bedard, Jamie; Bellis, Matthew; CMS Collaboration
2016-03-01
In the summer of 2015, we hosted 10 high school teachers for a three-day "Physics at the Frontier" Workshop. The mornings were spent learning about particle physics, CMS and the LHC, and radiation safety, while the afternoons were spent building turn-key cloud chambers for use in their classrooms. The basic cloud chamber design uses Peltier thermoelectric coolers, rather than dry ice, and instructions can be found in multiple places online. For a robust build procedure and for easy use in the classroom, we redesigned parts of the construction process to make it easier to put together while holding costs below $200 per chamber. In addition to this new design, we also created a website with instructions for those who are interested in building their own using this design. This workshop was funded in part by a minigrant for Outreach and Education from the USCMS collaboration. Our experience with the workshop and the lessons learned from the cloud chamber design will be discussed. This work was funded in part by NSF Grant PHY-1307562 and a USCMS-administered minigrant for Outreach and Education.
Evolution of the Far-Infrared Cloud at Titan's South Pole
NASA Technical Reports Server (NTRS)
Jennings, Donald E.; Achterberg, R. K.; Cottini, V.; Anderson, C. M.; Flasar, F. M.; Nixon, C. A.; Bjoraker, G. L.; Kunde, V. G.; Carlson, R. C.; Guandique, E.;
2015-01-01
A condensate cloud on Titan identified by its 220 cm-1 far-infrared signature continues to undergo seasonal changes at both the north and south poles. In the north, the cloud, which extends from 55°N to the pole, has been gradually decreasing in emission intensity since the beginning of the Cassini mission with a half-life of 3.8 years. The cloud in the south did not appear until 2012 but its intensity has increased rapidly, doubling every year. The shape of the cloud at the south pole is very different from that in the north. Mapping in 2013 December showed that the condensate emission was confined to a ring with a maximum at 80°S. The ring was centered 4° from Titan's pole. The pattern of emission from stratospheric trace gases like nitriles and complex hydrocarbons (mapped in 2014 January) was also offset by 4°, but had a central peak at the pole and a secondary maximum in a ring at about 70°S with a minimum at 80°S. The shape of the gas emission distribution can be explained by abundances that are high at the atmospheric pole and diminish toward the equator, combined with correspondingly increasing temperatures. We discuss possible causes for the condensate ring. The present rapid build-up of the condensate cloud at the south pole is likely to transition to a gradual decline from 2015 to 2016. Key words: molecular processes - planets and satellites: atmospheres - planets and satellites: composition - planets and satellites: individual (Titan) - radiation mechanisms: thermal
NASA Technical Reports Server (NTRS)
Ivory, K.; Schwenn, R.
1995-01-01
The solar wind data obtained from the two Helios solar probes in the years 1974 to 1986 were systematically searched for the occurrence of bi-directional electron events. Most often these events are found in conjunction with shock associated magnetic clouds. The implications of these observations for the topology of interplanetary plasma clouds are discussed.
Protection against lightning on the geomagnetic observatory
NASA Astrophysics Data System (ADS)
Čop, R.; Milev, G.; Deželjin, D.; Kosmač, J.
2014-04-01
The Sinji Vrh Geomagnetic Observatory was built on the brow of the mountain Gora, above Ajdovščina; one may hardly find an area anywhere in Europe that is struck by lightning more often than this south-western part of Slovenia. When the humid air masses of a storm front hit the edge of Gora, they rise more than 1000 m in a very short time, which causes additional electrical charging of the storm clouds. The reliability of operations performed in every building of the observatory can be increased by understanding the formation of lightning in the thunderstorm cloud and by applying already proven methods of protection against lightning strikes and their secondary effects. To reach this goal the following groups of experts have to cooperate: experts in the field of lightning protection, the constructors and manufacturers of equipment, and the observatory managers.
Galaxy CloudMan: delivering cloud compute clusters.
Afgan, Enis; Baker, Dannon; Coraor, Nate; Chapman, Brad; Nekrutenko, Anton; Taylor, James
2010-12-21
Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is "cloud computing", which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate "as is" use by experimental biologists. We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon's EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge.
Galaxy CloudMan: delivering cloud compute clusters
2010-01-01
Background Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is “cloud computing”, which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate “as is” use by experimental biologists. Results We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon’s EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. Conclusions The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge. PMID:21210983
APOLLO_NG - a probabilistic interpretation of the APOLLO legacy for AVHRR heritage channels
NASA Astrophysics Data System (ADS)
Klüser, L.; Killius, N.; Gesell, G.
2015-10-01
The cloud processing scheme APOLLO (AVHRR Processing scheme Over cLouds, Land and Ocean) has been in use for cloud detection and cloud property retrieval since the late 1980s. The physics of the APOLLO scheme still build the backbone of a range of cloud detection algorithms for AVHRR (Advanced Very High Resolution Radiometer) heritage instruments. The APOLLO_NG (APOLLO_NextGeneration) cloud processing scheme is a probabilistic interpretation of the original APOLLO method. It builds upon the physical principles that have served well in the original APOLLO scheme. Nevertheless, a couple of additional variables have been introduced in APOLLO_NG. Cloud detection is no longer performed as a binary yes/no decision based on these physical principles. It is rather expressed as cloud probability for each satellite pixel. Consequently, the outcome of the algorithm can be tuned from being sure to reliably identify clear pixels to conditions of reliably identifying definitely cloudy pixels, depending on the purpose. The probabilistic approach allows retrieving not only the cloud properties (optical depth, effective radius, cloud top temperature and cloud water path) but also their uncertainties. APOLLO_NG is designed as a standalone cloud retrieval method robust enough for operational near-realtime use and for application to large amounts of historical satellite data. The radiative transfer solution is approximated by the same two-stream approach which also had been used for the original APOLLO. This allows the algorithm to be applied to a wide range of sensors without the necessity of sensor-specific tuning. Moreover it allows for online calculation of the radiative transfer (i.e., within the retrieval algorithm) giving rise to a detailed probabilistic treatment of cloud variables. This study presents the algorithm for cloud detection and cloud property retrieval together with the physical principles from the APOLLO legacy it is based on. Furthermore a couple of example results from NOAA-18 are presented.
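The exact probability model of APOLLO_NG is not reproduced in the abstract; the snippet below is only a hedged, generic illustration of how evidence from a few per-pixel spectral tests might be combined into a cloud probability. The logistic form and all weights are assumptions, not the published scheme.

```python
# Illustrative per-pixel cloud probability from two channel tests (not APOLLO_NG itself).
import numpy as np

def cloud_probability(refl_vis, bt_11um, bt_clear, w=(8.0, 0.3), bias=-2.0):
    """Logistic combination of a visible-reflectance test and an 11-micron
    brightness-temperature depression test. All weights are placeholders."""
    score = w[0] * refl_vis + w[1] * (bt_clear - bt_11um) + bias
    return 1.0 / (1.0 + np.exp(-score))

# Tiny synthetic scene: brighter and colder pixels come out more cloud-like.
refl = np.array([0.05, 0.35, 0.60])
bt = np.array([288.0, 270.0, 240.0])
print(cloud_probability(refl, bt, bt_clear=290.0))
```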
Average value of the shape and direction factor in the equation of refractive index
NASA Astrophysics Data System (ADS)
Zhang, Tao
2017-10-01
The theoretical calculation of refractive indices is of great significance for the development of new optical materials. The calculation method of the refractive index, which was deduced from the electron-cloud-conductor model, contains the shape and direction factor 〈g〉. 〈g〉 affects the electromagnetic-induction energy absorbed by the electron clouds, thereby influencing the refractive indices. It was not previously known how to calculate the 〈g〉 value of non-spherical electron clouds. In this paper, the 〈g〉 value is derived by conceptually dividing the electron cloud into numerous small volume elements and then regrouping them. The paper proves that 〈g〉 = 2/3 when the spatial orientations of the molecules are randomly distributed. Calculations of the refractive indices of several substances validate this equation. This result will help to promote the application of this calculation method for the refractive index.
Storm Clouds Roll In Over The Vehicle Assembly Building
2009-07-12
Storm clouds roll in over the NASA Vehicle Assembly building moments after STS-127 Space Shuttle Launch Director Pete Nickolenko and the launch team called the launch a "No Go" due to weather conditions at the NASA Kennedy Space Center in Cape Canaveral, Florida, Sunday, July 12, 2009. Endeavour will be launching with the crew of STS-127 on a 16-day mission that will feature five spacewalks and complete construction of the Japan Aerospace Exploration Agency's Kibo laboratory. Photo Credit: (NASA/Bill Ingalls)
Storm Clouds Roll In Over The Vehicle Assembly Building
2009-07-11
Storm clouds roll in over the NASA Vehicle Assembly building moments after STS-127 Space Shuttle Launch Director Pete Nickolenko and the launch team called the launch a "No Go" due to weather conditions at the NASA Kennedy Space Center in Cape Canaveral, Florida, Sunday, July 12, 2009. Endeavour will be launching with the crew of STS-127 on a 16-day mission that will feature five spacewalks and complete construction of the Japan Aerospace Exploration Agency's Kibo laboratory. Photo Credit: (NASA/Bill Ingalls)
[Porting Radiotherapy Software of Varian to Cloud Platform].
Zou, Lian; Zhang, Weisha; Liu, Xiangxiang; Xie, Zhao; Xie, Yaoqin
2017-09-30
To develop a low-cost private cloud platform for radiotherapy software. First, a private cloud platform based on OpenStack and virtual GPU hardware was built. Then all the Varian radiotherapy software modules were installed on virtual machines on the private cloud platform, and the corresponding functional configuration was completed. Finally, the software in the cloud could be accessed through a virtual desktop client. The functional test results of the cloud workstation show that a cloud workstation is equivalent to an isolated physical workstation, and any client on the LAN can use the cloud workstation smoothly. The cloud porting described in this study is economical and practical. The project not only improves the utilization rate of radiotherapy software, but also makes it possible for cloud computing technology to expand its applications to the field of radiation oncology.
Building damage assessment using airborne lidar
NASA Astrophysics Data System (ADS)
Axel, Colin; van Aardt, Jan
2017-10-01
The assessment of building damage following a natural disaster is a crucial step in determining the impact of the event itself and gauging reconstruction needs. Automatic methods for deriving damage maps from remotely sensed data are preferred, since they are regarded as being rapid and objective. We propose an algorithm for performing unsupervised building segmentation and damage assessment using airborne light detection and ranging (lidar) data. Local surface properties, including normal vectors and curvature, were used along with region growing to segment individual buildings in lidar point clouds. Damaged building candidates were identified based on rooftop inclination angle, and then damage was assessed using planarity and point height metrics. Validation of the building segmentation and damage assessment techniques was performed using airborne lidar data collected after the Haiti earthquake of 2010. Building segmentation and damage assessment accuracies of 93.8% and 78.9%, respectively, were obtained using lidar point clouds and expert damage assessments of 1953 buildings in heavily damaged regions. We believe this research presents an indication of the utility of airborne lidar remote sensing for increasing the efficiency and speed at which emergency response operations are performed.
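A hedged sketch of the kind of per-segment geometry involved above (not the authors' implementation): fitting a plane to a roof segment by singular value decomposition yields both a rooftop inclination angle and a planarity score from the residual variance.

```python
# Plane fit of a roof segment via SVD; inclination and planarity metrics (illustrative).
import numpy as np

def roof_plane_metrics(points):
    centered = points - points.mean(axis=0)
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[2]                                        # direction of least variance
    inclination = np.degrees(np.arccos(abs(normal[2])))   # angle of roof plane from horizontal
    planarity = 1.0 - s[2] / s.sum()                      # close to 1 for flat, intact roofs
    return inclination, planarity

# Synthetic, gently sloped roof patch with a little noise.
xy = np.random.rand(200, 2) * 10.0
z = 0.2 * xy[:, 0] + 0.05 * np.random.randn(200)
print(roof_plane_metrics(np.column_stack([xy, z])))
```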
2001-01-01
International Acer Incorporated, Hsin Chu, Taiwan; Aerospace Industrial Development Corporation, Taichung, Taiwan; American Institute in Taiwan, Taipei, Taiwan ... Singapore and Malaysia ... The largest market for semiconductor products is the high-technology consumer electronics industry, which consumes up ... Singapore, and Malaysia. A new semiconductor facility costs around $3 billion to build and takes about two years to become operational.
2. Historic American Buildings Survey Russell Jones, Photographer June 1963 ...
2. Historic American Buildings Survey Russell Jones, Photographer June 1963 SOUTHEAST VIEW - Abner Cloud House, Intersection of Canal Road & Reservoir Road Northwest, Washington, District of Columbia, DC
1. Historic American Buildings Survey Russell Jones, Photographer June 1963 ...
1. Historic American Buildings Survey Russell Jones, Photographer June 1963 SOUTHWEST VIEW - Abner Cloud House, Intersection of Canal Road & Reservoir Road Northwest, Washington, District of Columbia, DC
NASA Astrophysics Data System (ADS)
Lengert, W.; Mondon, E.; Bégin, M. E.; Ferrer, M.; Vallois, F.; DelaMar, J.
2015-12-01
Helix Nebula, a European cross-domain science initiative building on an active PPP, aims to implement the concept of an open science commons[1] while using a hybrid cloud model[2] as the proposed implementation solution. This approach allows leveraging and merging of complementary data-intensive Earth Science disciplines (e.g. instrumentation[3] and modeling) without introducing significant changes in the contributors' operational set-up. Considering the seamless integration with life science (e.g. EMBL), scientific exploitation of meteorological, climate, and Earth Observation data and models opens up enormous potential for new big data science. The work of Helix Nebula has shown that it is feasible to interoperate publicly funded infrastructures, such as EGI [5] and GEANT [6], with commercial cloud services. Such hybrid systems are in the interest of the existing users of publicly funded infrastructures and funding agencies because they will provide "freedom and choice" over the type of computing resources to be consumed and the manner in which they can be obtained. But to offer such freedom and choice across a spectrum of suppliers, various issues such as intellectual property, legal responsibility, service quality agreements and related issues need to be addressed. Finding solutions to these issues is one of the goals of the Helix Nebula initiative. [1] http://www.egi.eu/news-and-media/publications/OpenScienceCommons_v3.pdf [2] http://www.helix-nebula.eu/events/towards-the-european-open-science-cloud [3] e.g. https://sentinel.esa.int/web/sentinel/sentinel-data-access [5] http://www.egi.eu/ [6] http://www.geant.net/
Heterogeneous ice nucleation of α-pinene SOA particles before and after ice cloud processing
NASA Astrophysics Data System (ADS)
Wagner, Robert; Höhler, Kristina; Huang, Wei; Kiselev, Alexei; Möhler, Ottmar; Mohr, Claudia; Pajunoja, Aki; Saathoff, Harald; Schiebel, Thea; Shen, Xiaoli; Virtanen, Annele
2017-05-01
The ice nucleation ability of α-pinene secondary organic aerosol (SOA) particles was investigated at temperatures between 253 and 205 K in the Aerosol Interaction and Dynamics in the Atmosphere cloud simulation chamber. Pristine SOA particles were nucleated and grown from pure gas precursors and then subjected to repeated expansion cooling cycles to compare their intrinsic ice nucleation ability during the first nucleation event with that observed after ice cloud processing. The unprocessed α-pinene SOA particles were found to be inefficient ice-nucleating particles at cirrus temperatures, with nucleation onsets (for an activated fraction of 0.1%) as high as for the homogeneous freezing of aqueous solution droplets. Ice cloud processing at temperatures below 235 K only marginally improved the particles' ice nucleation ability and did not significantly alter their morphology. In contrast, the particles' morphology and ice nucleation ability were substantially modified upon ice cloud processing in a simulated convective cloud system, where the α-pinene SOA particles were first activated to supercooled cloud droplets and then froze homogeneously at about 235 K. As evidenced by electron microscopy, the α-pinene SOA particles adopted a highly porous morphology during such a freeze-drying cycle. When probing the freeze-dried particles in succeeding expansion cooling runs in the mixed-phase cloud regime up to 253 K, the increase in relative humidity led to a collapse of the porous structure. Heterogeneous ice formation was observed after the droplet activation of the collapsed, freeze-dried SOA particles, presumably caused by ice remnants in the highly viscous material or the larger surface area of the particles.
Building a Cloud Computing and Big Data Infrastructure for Cybersecurity Research and Education
2015-04-17
Hardware inventory (flattened table): Hadoop production and integration clusters and cloud production and integration clusters (including Dell R720xd, VRTX M620 and IBM HS22 7870H5U nodes), with per-cluster subtotals of nodes, cores, memory and storage; September 2014.
3. Historic American Buildings Survey Russell Jones, Photographer June 1963NORTHWEST ...
3. Historic American Buildings Survey Russell Jones, Photographer June 1963NORTHWEST VIEW - Abner Cloud House, Intersection of Canal Road & Reservoir Road Northwest, Washington, District of Columbia, DC
DOE Office of Scientific and Technical Information (OSTI.GOV)
FISCHER,W.
We summarize the ECL2 workshop on electron cloud clearing, which was held at CERN in early March 2007, and highlight a number of novel ideas for electron cloud suppression, such as continuous clearing electrodes based on enamel, slotted structures, and electret inserts.
Infrastructures for Distributed Computing: the case of BESIII
NASA Astrophysics Data System (ADS)
Pellegrino, J.
2018-05-01
BESIII is an electron-positron collision experiment hosted at BEPCII in Beijing and aimed at investigating tau-charm physics. BESIII has now been running for several years and has gathered more than 1 PB of raw data. In order to analyze these data and perform massive Monte Carlo simulations, a large amount of computing and storage resources is needed. The distributed computing system is based upon DIRAC and has been in production since 2012. It integrates computing and storage resources from different institutes and a variety of resource types such as cluster, grid, cloud or volunteer computing. About 15 sites from the BESIII Collaboration from all over the world have joined this distributed computing infrastructure, giving a significant contribution to the IHEP computing facility. Nowadays cloud computing is playing a key role in the HEP computing field, due to its scalability and elasticity. Cloud infrastructures take advantage of several tools, such as VMDirac, to manage virtual machines through cloud managers according to the job requirements. With the virtually unlimited resources from commercial clouds, the computing capacity could scale accordingly in order to deal with any burst demands. General computing models have been discussed in the talk and are addressed herewith, with particular focus on the BESIII infrastructure. Moreover, new computing tools and upcoming infrastructures will be addressed.
On the evolution of Saturn's 'Spokes' - Theory
NASA Technical Reports Server (NTRS)
Morfill, G. E.; Gruen, E.; Goertz, C. K.; Johnson, T. V.
1983-01-01
Starting with the assumption that negatively charged micron-sized dust grains may be elevated above Saturn's ring plane by plasma interactions, the subsequent evolution of the system is discussed. The discharge of the fine dust by solar UV radiation produces a cloud of electrons which moves adiabatically in Saturn's dipolar magnetic field. The electron cloud is absorbed by the ring after one bounce, alters the local ring potential significantly, and reduces the local Debye length. As a result, more micron-sized dust particles may be elevated above the ring plane and the spoke grows. This process continues until the electron cloud has dissipated.
ROOFN3D: Deep Learning Training Data for 3d Building Reconstruction
NASA Astrophysics Data System (ADS)
Wichmann, A.; Agoub, A.; Kada, M.
2018-05-01
Machine learning methods have gained in importance through the latest developments in artificial intelligence and computer hardware. In particular, approaches based on deep learning have shown that they are able to provide state-of-the-art results for various tasks. However, the direct application of deep learning methods to improve the results of 3D building reconstruction is often not possible due, for example, to the lack of suitable training data. To address this issue, we present RoofN3D, which provides a new 3D point cloud training dataset that can be used to train machine learning models for different tasks in the context of 3D building reconstruction. It can be used, among other things, to train semantic segmentation networks or to learn the structure of buildings and the geometric model construction. Further details about RoofN3D and the developed data preparation framework, which enables the automatic derivation of training data, are described in this paper. Furthermore, we provide an overview of other available 3D point cloud training data and of approaches from the current literature that present solutions for the application of deep learning to unstructured and non-gridded 3D point cloud data.
A bright-rimmed cloud sculpted by the H ii region Sh2-48
NASA Astrophysics Data System (ADS)
Ortega, M. E.; Paron, S.; Giacani, E.; Rubio, M.; Dubner, G.
2013-08-01
Aims: We characterize a bright-rimmed cloud embedded in the H ii region Sh2-48 while searching for evidence of triggered star formation. Methods: We carried out observations towards a region of 2' × 2' centered at RA = 18h22m11.39s, Dec = -14°35'24.81'' (J2000) using the Atacama Submillimeter Telescope Experiment (ASTE; Chile) in the 12CO J = 3-2, 13CO J = 3-2, HCO+ J = 4-3, and CS J = 7-6 lines with an angular resolution of about 22''. We also present radio continuum observations at 5 GHz carried out with the Jansky Very Large Array (JVLA; USA) interferometer with a synthesized beam of 7'' × 5''. The molecular transitions were used to study the distribution and kinematics of the molecular gas of the bright-rimmed cloud. The radio continuum data were used to characterize the ionized gas located on the illuminated border of this molecular condensation. Combining these observations with infrared public data allowed us to build up a comprehensive picture of the current state of star formation within this cloud. Results: The analysis of our molecular observations reveals a relatively dense clump with n(H2) ~ 3 × 103 cm-3, located in projection onto the interior of the H ii region Sh2-48. The emission distribution of the four observed molecular transitions has, at VLSR ~ 38 km s-1, a morphological anticorrelation with the bright-rimmed cloud as seen in the optical emission. From the new radio continuum observations, we identify a thin layer of ionized gas located on the border of the clump that is facing the ionizing star. The ionized gas has an electron density of about 73 cm-3, which is a factor of three higher than the typical critical density (nc ~ 25 cm-3) above which an ionized boundary layer can be formed and maintained. This supports the hypothesis that the clump is being photoionized by the nearby O9.5V star, BD-14 5014. From the evaluation of the pressure balance between the ionized and molecular gas, we conclude that the clump would be in a pre-pressure-balance state, with the shocks being driven into the surface layer. Among the five YSO candidates found in the region, two of them (class I) are placed slightly beyond the bright rim, suggesting that their formation could have been triggered by the radiation-driven implosion process.
Rupturing of Biological Spores As a Source of Secondary Particles in Amazonia.
China, Swarup; Wang, Bingbing; Weis, Johannes; Rizzo, Luciana; Brito, Joel; Cirino, Glauber G; Kovarik, Libor; Artaxo, Paulo; Gilles, Mary K; Laskin, Alexander
2016-11-15
Airborne biological particles, such as fungal spores and pollen, are ubiquitous in the Earth's atmosphere and may influence the atmospheric environment and climate, impacting air quality, cloud formation, and the Earth's radiation budget. The atmospheric transformations of airborne biological spores at elevated relative humidity remain poorly understood and their climatic role is uncertain. Using an environmental scanning electron microscope (ESEM), we observed rupturing of Amazonian fungal spores and subsequent release of submicrometer size fragments after exposure to high humidity. We find that fungal fragments contain elements of inorganic salts (e.g., Na and Cl). They are hygroscopic in nature with a growth factor up to 2.3 at 96% relative humidity, thus they may potentially influence cloud formation. Due to their hygroscopic growth, light scattering cross sections of the fragments are enhanced by up to a factor of 10. Furthermore, rupturing of fungal spores at high humidity may explain the bursting events of new particle formation in Amazonia.
Reconstructing Buildings with Discontinuities and Roof Overhangs from Oblique Aerial Imagery
NASA Astrophysics Data System (ADS)
Frommholz, D.; Linkiewicz, M.; Meissner, H.; Dahlke, D.
2017-05-01
This paper proposes a two-stage method for the reconstruction of city buildings with discontinuities and roof overhangs from oriented nadir and oblique aerial images. To model the structures the input data is transformed into a dense point cloud, segmented and filtered with a modified marching cubes algorithm to reduce the positional noise. Assuming a monolithic building the remaining vertices are initially projected onto a 2D grid and passed to RANSAC-based regression and topology analysis to geometrically determine finite wall, ground and roof planes. If this should fail due to the presence of discontinuities the regression will be repeated on a 3D level by traversing voxels within the regularly subdivided bounding box of the building point set. For each cube a planar piece of the current surface is approximated and expanded. The resulting segments get mutually intersected yielding both topological and geometrical nodes and edges. These entities will be eliminated if their distance-based affiliation to the defining point sets is violated leaving a consistent building hull including its structural breaks. To add the roof overhangs the computed polygonal meshes are projected onto the digital surface model derived from the point cloud. Their shapes are offset equally along the edge normals with subpixel accuracy by detecting the zero-crossings of the second-order directional derivative in the gradient direction of the height bitmap and translated back into world space to become a component of the building. As soon as the reconstructed objects are finished the aerial images are further used to generate a compact texture atlas for visualization purposes. An optimized atlas bitmap is generated that allows perspective-correct multi-source texture mapping without prior rectification involving a partially parallel placement algorithm. Moreover, the texture atlases undergo object-based image analysis (OBIA) to detect window areas which get reintegrated into the building models. To evaluate the performance of the proposed method a proof-of-concept test on sample structures obtained from real-world data of Heligoland/Germany has been conducted. It revealed good reconstruction accuracy in comparison to the cadastral map, a speed-up in texture atlas optimization and visually attractive render results.
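The RANSAC-based plane regression mentioned above can be sketched generically as follows; the iteration count and inlier threshold are arbitrary and this is not the authors' implementation.

```python
# Generic RANSAC plane fit over a segmented building patch (illustrative only).
import numpy as np

def ransac_plane(points, n_iter=200, threshold=0.05, seed=0):
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), size=3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:
            continue                                  # degenerate (collinear) sample
        normal /= norm
        dist = np.abs((points - sample[0]) @ normal)  # point-to-plane distances
        inliers = dist < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers                               # points on the dominant wall/roof plane

pts = np.random.rand(500, 3)       # placeholder for a projected building point set
pts[:400, 2] = 0.5                 # most points lie on the plane z = 0.5
print(ransac_plane(pts).sum())
```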
Building Change Detection from Bi-Temporal Dense-Matching Point Clouds and Aerial Images.
Pang, Shiyan; Hu, Xiangyun; Cai, Zhongliang; Gong, Jinqi; Zhang, Mi
2018-03-24
In this work, a novel building change detection method from bi-temporal dense-matching point clouds and aerial images is proposed to address two major problems, namely, the robust acquisition of the changed objects above ground and the automatic classification of changed objects into buildings or non-buildings. For the acquisition of changed objects above ground, the change detection problem is converted into a binary classification, in which the changed area above ground is regarded as the foreground and the other area as the background. For the gridded points of each period, the graph cuts algorithm is adopted to classify the points into foreground and background, followed by the region-growing algorithm to form candidate changed building objects. A novel structural feature that was extracted from aerial images is constructed to classify the candidate changed building objects into buildings and non-buildings. The changed building objects are further classified as "newly built", "taller", "demolished", and "lower" by combining the classification and the digital surface models of two periods. Finally, three typical areas from a large dataset are used to validate the proposed method. Numerous experiments demonstrate the effectiveness of the proposed algorithm.
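The paper solves the foreground/background labelling with graph cuts; as a deliberately simplified stand-in (not the published method), a height-difference threshold between the two gridded surfaces followed by connected-component labelling already isolates candidate changed objects:

```python
# Simplified stand-in for the changed-object extraction (threshold + connected components).
import numpy as np
from scipy import ndimage

dsm_t1 = np.zeros((100, 100))              # gridded surface heights, epoch 1 (placeholder)
dsm_t2 = np.zeros((100, 100))
dsm_t2[40:60, 40:60] = 9.0                 # a hypothetical newly built structure

changed = np.abs(dsm_t2 - dsm_t1) > 2.5    # metres; arbitrary above-ground threshold
labels, n_objects = ndimage.label(changed)  # candidate changed building objects
sizes = ndimage.sum(changed, labels, index=range(1, n_objects + 1))
print(n_objects, sizes)
```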
Automatic Generation of Indoor Navigable Space Using a Point Cloud and its Scanner Trajectory
NASA Astrophysics Data System (ADS)
Staats, B. R.; Diakité, A. A.; Voûte, R. L.; Zlatanova, S.
2017-09-01
Automatic generation of indoor navigable models is mostly based on 2D floor plans. However, in many cases the floor plans are out of date. Buildings are not always built according to their blueprints, interiors might change after a few years because of modified walls and doors, and furniture may be repositioned to the user's preferences. Therefore, new approaches for the quick recording of indoor environments should be investigated. This paper concentrates on laser scanning with a Mobile Laser Scanner (MLS) device. The MLS device stores a point cloud and its trajectory. If the MLS device is operated by a human, the trajectory contains information which can be used to distinguish different surfaces. In this paper a method is presented for the identification of walkable surfaces based on the analysis of the point cloud and the trajectory of the MLS scanner. This method consists of several steps. First, the point cloud is voxelized. Second, the trajectory is analysed and projected to acquire seed voxels. Third, these seed voxels are grown into floor regions by means of a region growing process. By identifying dynamic objects, doors and furniture, these floor regions can be modified so that each region represents a specific navigable space inside a building as a free navigable voxel space. By combining the point cloud and its corresponding trajectory, the walkable space can be identified for any type of building, even if the interior is scanned during business hours.
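A hedged sketch of the first two steps described above, voxelization and seeding from the scanner trajectory; the voxel size and the downward offset to floor height are arbitrary choices, not the paper's parameters.

```python
# Voxelize an indoor point cloud and mark seed voxels beneath the scanner trajectory (illustrative).
import numpy as np

def voxelize(points, origin, voxel_size=0.1):
    """Map 3D points to integer voxel indices on a common grid."""
    return np.floor((points - origin) / voxel_size).astype(int)

cloud = np.random.rand(5000, 3) * np.array([10.0, 10.0, 3.0])      # placeholder indoor scan
trajectory = np.column_stack([np.linspace(1.0, 9.0, 50),
                              np.full(50, 5.0),
                              np.full(50, 1.5)])                     # hypothetical MLS path at ~1.5 m height

origin = cloud.min(axis=0)
voxels = voxelize(cloud, origin)                                     # occupied voxels
seeds = voxelize(trajectory - np.array([0.0, 0.0, 1.4]), origin)     # dropped roughly to floor level
seed_set = {tuple(v) for v in seeds}                                 # starting voxels for region growing
print(len(np.unique(voxels, axis=0)), len(seed_set))
```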
Automatic 3d Building Model Generations with Airborne LiDAR Data
NASA Astrophysics Data System (ADS)
Yastikli, N.; Cetin, Z.
2017-11-01
LiDAR systems have become more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems and is of major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication, mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, an approach for automatic 3D building model generation is needed in a simple and quick way for many studies which include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is targeted. An approach is proposed for automatic 3D building model generation, including the automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification includes hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules have been performed to improve the classification results using different test areas identified in the study area. The proposed approach has been tested in the study area, which has partly open areas, forest areas and many types of buildings, in Zekeriyakoy, Istanbul, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The results obtained in the study area verified that automatic 3D building models can be generated successfully from raw LiDAR point cloud data.
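The hierarchical rules themselves are not given in the abstract; the snippet below only illustrates the general pattern of point-based rule classification (height above ground, then local roughness) with invented thresholds.

```python
# Illustrative hierarchical point classification; thresholds are invented, not the paper's.
import numpy as np

def classify_points(height_above_ground, local_roughness):
    labels = np.full(height_above_ground.shape, "ground", dtype=object)
    elevated = height_above_ground > 2.0        # rule 1: separate above-ground objects
    smooth = local_roughness < 0.1              # rule 2: planar, roof-like surfaces
    labels[elevated & smooth] = "building"
    labels[elevated & ~smooth] = "vegetation"
    return labels

hag = np.array([0.1, 5.2, 6.8, 3.0])
rough = np.array([0.02, 0.05, 0.40, 0.03])
print(classify_points(hag, rough))   # ['ground' 'building' 'vegetation' 'building']
```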
NASA Astrophysics Data System (ADS)
Huang, Yishuo; Chiang, Chih-Hung; Hsu, Keng-Tsang
2018-03-01
Defects present on the facades of a building have a profound impact on the building's service life. How to identify such defects is a crucial issue; destructive and non-destructive methods are usually employed to identify defects present in a building. Destructive methods always cause permanent damage to the examined objects; on the other hand, non-destructive testing (NDT) methods have been widely applied to detect defects present in the exterior layers of a building. However, NDT methods cannot provide efficient and reliable information for identifying the defects because of the huge examination areas. Infrared thermography is often applied to quantitative energy performance measurements for building envelopes. Defects in the exterior layer of buildings may be caused by several factors: ventilation losses, conduction losses, thermal bridging, defective services, moisture condensation, moisture ingress, and structural defects. Analyzing the collected thermal images can be quite difficult when the spatial variations of surface temperature are small. In this paper the authors employ image segmentation to cluster pixels with similar surface temperatures such that the processed thermal images are composed of a limited number of groups. The surface temperature distribution in each segmented group is homogeneous. In doing so, the boundaries of the segmented regions can be identified and extracted. A terrestrial laser scanner (TLS) is widely used to collect the point clouds of a building, and those point clouds are applied to reconstruct the 3D model of the building. A mapping model is constructed such that the segmented thermal images can be projected onto the 2D image of the specified 3D building model. In this paper, the administrative building on the Chaoyang University campus is used as an example. The experimental results not only provide the defect information but also offer the corresponding spatial locations in the 3D model.
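As a hedged illustration of the clustering step described above, k-means on pixel temperatures is one common choice; the number of clusters is arbitrary here and this is not necessarily the authors' segmentation method.

```python
# Group thermal-image pixels into a few homogeneous temperature classes (illustrative).
import numpy as np
from sklearn.cluster import KMeans

thermal = 20.0 + 3.0 * np.random.rand(120, 160)   # placeholder surface temperatures in deg C
thermal[40:70, 50:90] += 4.0                      # a hypothetical warm anomaly (possible defect)

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    thermal.reshape(-1, 1)).reshape(thermal.shape)

# Each label now marks a region of similar surface temperature whose boundary can be extracted.
print(np.unique(labels, return_counts=True))
```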
NASA Astrophysics Data System (ADS)
Zhou, S.; Tao, W. K.; Li, X.; Matsui, T.; Sun, X. H.; Yang, X.
2015-12-01
A cloud-resolving model (CRM) is an atmospheric numerical model that can numerically resolve clouds and cloud systems at 0.25-5 km horizontal grid spacings. The main advantage of the CRM is that it can allow explicit interactive processes between microphysics, radiation, turbulence, surface, and aerosols without subgrid cloud fraction, overlapping and convective parameterization. Because of their fine resolution and complex physical processes, it is challenging for the CRM community to i) visualize/inter-compare CRM simulations, ii) diagnose key processes for cloud-precipitation formation and intensity, and iii) evaluate against NASA's field campaign data and L1/L2 satellite data products due to large data volume (~10 TB) and the complexity of CRM physical processes. We have been building the Super Cloud Library (SCL) upon a Hadoop framework, capable of CRM database management, distribution, visualization, subsetting, and evaluation in a scalable way. The current SCL capabilities include: (1) an SCL data model that enables various CRM simulation outputs in NetCDF, including the NASA-Unified Weather Research and Forecasting (NU-WRF) and Goddard Cumulus Ensemble (GCE) model, to be accessed and processed by Hadoop; (2) a parallel NetCDF-to-CSV converter that supports NU-WRF and GCE model outputs; (3) a technique that visualizes Hadoop-resident data with IDL; (4) a technique that subsets Hadoop-resident data, compliant with the SCL data model, with HIVE or Impala via HUE's Web interface; (5) a prototype that enables a Hadoop MapReduce application to dynamically access and process data residing in a parallel file system, PVFS2 or CephFS, where high performance computing (HPC) simulation outputs such as NU-WRF's and GCE's are located. We are testing Apache Spark to speed up SCL data processing and analysis. With the SCL capabilities, SCL users can conduct large-domain on-demand tasks without downloading voluminous CRM datasets and various observations from NASA field campaigns and satellite data to a local computer, and can inter-compare CRM output and data with GCE and NU-WRF.
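The parallel NetCDF-to-CSV converter mentioned above is not shown in the abstract; the fragment below is only a minimal serial sketch of such a conversion step, with hypothetical variable names and file paths.

```python
# Minimal serial sketch of a NetCDF-to-CSV conversion step (illustrative only).
import csv
import netCDF4

def netcdf_to_csv(nc_path, csv_path, var_names=("lat", "lon", "precip")):
    # Assumes the listed variables exist in the file and share a common shape.
    with netCDF4.Dataset(nc_path) as ds, open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(var_names)
        columns = [ds.variables[name][:].ravel() for name in var_names]
        writer.writerows(zip(*columns))

# netcdf_to_csv("gce_output_000.nc", "gce_output_000.csv")  # hypothetical file names
```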
NASA Astrophysics Data System (ADS)
Diner, David
2010-05-01
The Multi-angle Imaging SpectroRadiometer (MISR) instrument has been collecting global Earth data from NASA's Terra satellite since February 2000. With its 9 along-track view angles, 4 spectral bands, intrinsic spatial resolution of 275 m, and stable radiometric and geometric calibration, no instrument that combines MISR's attributes has previously flown in space, nor is a similar capability currently available on any other satellite platform. Multiangle imaging offers several tools for remote sensing of aerosol and cloud properties, including bidirectional reflectance and scattering measurements, stereoscopic pattern matching, time lapse sequencing, and potentially, optical tomography. Current data products from MISR employ several of these techniques. Observations of the intensity of scattered light as a function of view angle and wavelength provide accurate measures of aerosol optical depths (AOD) over land, including bright desert and urban source regions. Partitioning of AOD according to retrieved particle classification and incorporation of height information improves the relationship between AOD and surface PM2.5 (fine particulate matter, a regulated air pollutant), constituting an important step toward a satellite-based particulate pollution monitoring system. Stereoscopic cloud-top heights provide a unique metric for detecting interannual variability of clouds and exceptionally high quality and sensitivity for detection and height retrieval for low-level clouds. Using the several-minute time interval between camera views, MISR has enabled a pole-to-pole, height-resolved atmospheric wind measurement system. Stereo imagery also makes possible global measurement of the injection heights and advection speeds of smoke plumes, volcanic plumes, and dust clouds, for which a large database is now available. To build upon what has been learned during the first decade of MISR observations, we are evaluating algorithm updates that not only refine retrieval accuracies but also include enhancements (e.g., finer spatial resolution) that would have been computationally prohibitive just ten years ago. In addition, we are developing technological building blocks for future sensors that enable broader spectral coverage, wider swath, and incorporation of high-accuracy polarimetric imaging. Prototype cameras incorporating photoelastic modulators have been constructed. To fully capitalize on the rich information content of current and next-generation multiangle imagers, several algorithmic paradigms currently employed need to be re-examined, e.g., the use of aerosol look-up tables, neglect of 3-D effects, and binary partitioning of the atmosphere into "cloudy" or "clear" designations. Examples of progress in algorithm and technology developments geared toward advanced application of multiangle imaging to remote sensing of aerosols and clouds will be presented.
MAVEN Mapping of Plasma Clouds Near Mars
NASA Astrophysics Data System (ADS)
Hurley, D.; Tran, T.; DiBraccio, G. A.; Espley, J. R.; Soobiah, Y. I. J.
2017-12-01
Brace et al. identified parcels of ionospheric plasma above the nominal ionosphere of Venus, dubbed plasma clouds. These were envisioned as instabilities on the ionopause that evolved to escaping parcels of ionospheric plasma. Mars Global Surveyor (MGS) Electron Reflectometer (ER) also detected signatures of ionospheric plasma above the nominal ionopause of Mars. Initial examination of the MGS ER data suggests that plasma clouds are more prevalent at Mars than at Venus, and similarly exhibit a connection to rotations in the upstream Interplanetary Magnetic Field (IMF) as Zhang et al. showed at Venus. We examine electron data from Mars to determine the locations of plasma clouds in the near-Mars environment using MGS and MAVEN data. The extensive coverage of the MAVEN orbit enables mapping an occurrence rate of the photoelectron spectra in Solar Wind Electron Analyzer (SWEA) data spanning all relevant altitudes and solar zenith angles. Martian plasma clouds are observed near the terminator like at Venus. They move to higher altitude as solar zenith angle increases, consistent with the escaping plasma hypothesis.
Semantic Segmentation of Indoor Point Clouds Using Convolutional Neural Network
NASA Astrophysics Data System (ADS)
Babacan, K.; Chen, L.; Sohn, G.
2017-11-01
As Building Information Modelling (BIM) thrives, geometry alone is no longer sufficient; an ever increasing variety of semantic information is needed to express an indoor model adequately. On the other hand, for existing buildings, automatically generating semantically enriched BIM from point cloud data is in its infancy. Previous research to enhance semantic content relies on frameworks in which specific rules and/or features are hand-coded by specialists. These methods inherently lack generalization and easily break in different circumstances. On this account, a generalized framework is urgently needed to automatically and accurately generate semantic information. We therefore propose to employ deep learning techniques for the semantic segmentation of point clouds into meaningful parts. More specifically, we build a volumetric data representation in order to efficiently generate the large number of training samples needed to train a convolutional neural network architecture. Feedforward propagation is used to perform classification at the voxel level, thereby achieving semantic segmentation. The method is tested both on a mobile laser scanner point cloud and on larger-scale synthetically generated data. We also demonstrate a case study in which our method can be effectively used to leverage the extraction of planar surfaces in challenging, cluttered indoor environments.
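To make the volumetric representation concrete, the following minimal numpy sketch voxelizes a point cloud into the kind of binary occupancy grid typically fed to a 3D convolutional network; the grid size and voxel resolution are arbitrary illustrative choices, not those of the paper.

    import numpy as np

    def voxelize(points: np.ndarray, grid: int = 32, voxel_size: float = 0.05) -> np.ndarray:
        """Convert an (N, 3) point cloud into a (grid, grid, grid) binary occupancy volume."""
        origin = points.min(axis=0)                      # anchor the grid at the cloud's corner
        idx = np.floor((points - origin) / voxel_size).astype(int)
        idx = np.clip(idx, 0, grid - 1)                  # points outside the grid land on its border
        volume = np.zeros((grid, grid, grid), dtype=np.float32)
        volume[idx[:, 0], idx[:, 1], idx[:, 2]] = 1.0
        return volume

    # Example: 10,000 random points in a 1.5 m cube.
    cloud = np.random.rand(10000, 3) * 1.5
    occupancy = voxelize(cloud)
    print(occupancy.shape, occupancy.sum())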
Toward low-cloud-permitting cloud superparameterization with explicit boundary layer turbulence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parishani, Hossein; Pritchard, Michael S.; Bretherton, Christopher S.
Systematic biases in the representation of boundary layer (BL) clouds are a leading source of uncertainty in climate projections. A variation on superparameterization (SP) called “ultraparameterization” (UP) is developed, in which the grid spacing of the cloud-resolving models (CRMs) is fine enough (250 × 20 m) to explicitly capture the BL turbulence, associated clouds, and entrainment in a global climate model capable of multiyear simulations. UP is implemented within the Community Atmosphere Model using 2° resolution (~14,000 embedded CRMs) with one-moment microphysics. By using a small domain and mean-state acceleration, UP is computationally feasible today and promising for exascale computers. Short-duration global UP hindcasts are compared with SP and satellite observations of top-of-atmosphere radiation and cloud vertical structure. The most encouraging improvement is a deeper BL and more realistic vertical structure of subtropical stratocumulus (Sc) clouds, due to stronger vertical eddy motions that promote entrainment. Results from 90 day integrations show climatological errors that are competitive with SP, with a significant improvement in the diurnal cycle of offshore Sc liquid water. Ongoing concerns with the current UP implementation include a dim bias for near-coastal Sc that also occurs less prominently in SP and a bright bias over tropical continental deep convection zones. Nevertheless, UP makes global eddy-permitting simulation a feasible and interesting alternative to conventionally parameterized GCMs or SP-GCMs with turbulence parameterizations for studying BL cloud-climate and cloud-aerosol feedback.
Toward low-cloud-permitting cloud superparameterization with explicit boundary layer turbulence
Parishani, Hossein; Pritchard, Michael S.; Bretherton, Christopher S.; ...
2017-06-19
Systematic biases in the representation of boundary layer (BL) clouds are a leading source of uncertainty in climate projections. A variation on superparameterization (SP) called “ultraparameterization” (UP) is developed, in which the grid spacing of the cloud-resolving models (CRMs) is fine enough (250 × 20 m) to explicitly capture the BL turbulence, associated clouds, and entrainment in a global climate model capable of multiyear simulations. UP is implemented within the Community Atmosphere Model using 2° resolution (~14,000 embedded CRMs) with one-moment microphysics. By using a small domain and mean-state acceleration, UP is computationally feasible today and promising for exascale computers. Short-duration global UP hindcasts are compared with SP and satellite observations of top-of-atmosphere radiation and cloud vertical structure. The most encouraging improvement is a deeper BL and more realistic vertical structure of subtropical stratocumulus (Sc) clouds, due to stronger vertical eddy motions that promote entrainment. Results from 90 day integrations show climatological errors that are competitive with SP, with a significant improvement in the diurnal cycle of offshore Sc liquid water. Ongoing concerns with the current UP implementation include a dim bias for near-coastal Sc that also occurs less prominently in SP and a bright bias over tropical continental deep convection zones. Nevertheless, UP makes global eddy-permitting simulation a feasible and interesting alternative to conventionally parameterized GCMs or SP-GCMs with turbulence parameterizations for studying BL cloud-climate and cloud-aerosol feedback.
Toward low-cloud-permitting cloud superparameterization with explicit boundary layer turbulence
NASA Astrophysics Data System (ADS)
Parishani, Hossein; Pritchard, Michael S.; Bretherton, Christopher S.; Wyant, Matthew C.; Khairoutdinov, Marat
2017-07-01
Systematic biases in the representation of boundary layer (BL) clouds are a leading source of uncertainty in climate projections. A variation on superparameterization (SP) called "ultraparameterization" (UP) is developed, in which the grid spacing of the cloud-resolving models (CRMs) is fine enough (250 × 20 m) to explicitly capture the BL turbulence, associated clouds, and entrainment in a global climate model capable of multiyear simulations. UP is implemented within the Community Atmosphere Model using 2° resolution (˜14,000 embedded CRMs) with one-moment microphysics. By using a small domain and mean-state acceleration, UP is computationally feasible today and promising for exascale computers. Short-duration global UP hindcasts are compared with SP and satellite observations of top-of-atmosphere radiation and cloud vertical structure. The most encouraging improvement is a deeper BL and more realistic vertical structure of subtropical stratocumulus (Sc) clouds, due to stronger vertical eddy motions that promote entrainment. Results from 90 day integrations show climatological errors that are competitive with SP, with a significant improvement in the diurnal cycle of offshore Sc liquid water. Ongoing concerns with the current UP implementation include a dim bias for near-coastal Sc that also occurs less prominently in SP and a bright bias over tropical continental deep convection zones. Nevertheless, UP makes global eddy-permitting simulation a feasible and interesting alternative to conventionally parameterized GCMs or SP-GCMs with turbulence parameterizations for studying BL cloud-climate and cloud-aerosol feedback.
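A back-of-the-envelope check conveys the computational scale of such a configuration; the arithmetic below only restates the counts implied by the abstracts (a 2-degree host grid, roughly 14,000 embedded CRMs) together with an assumed, illustrative CRM domain size.

    # Rough count of host GCM columns on a ~2 degree grid: 180 x 90 = 16,200,
    # of which ~14,000 lie on the actual model grid (consistent with "~14,000 embedded CRMs").
    gcm_columns = 180 * 90

    # Hypothetical UP-like CRM domain per GCM column: say 32 columns x 125 vertical levels
    # at 250 m x 20 m spacing (illustrative numbers, not the paper's configuration).
    crm_columns_per_gcm = 32
    crm_levels = 125
    crm_cells_per_gcm = crm_columns_per_gcm * crm_levels

    print("host GCM columns (2 deg grid):", gcm_columns)
    print("CRM cells per GCM column (assumed):", crm_cells_per_gcm)
    print("total embedded CRM cells (order of magnitude):", gcm_columns * crm_cells_per_gcm)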
Unidata's Vision for Transforming Geoscience by Moving Data Services and Software to the Cloud
NASA Astrophysics Data System (ADS)
Ramamurthy, Mohan; Fisher, Ward; Yoksas, Tom
2015-04-01
Universities are facing many challenges: shrinking budgets, rapidly evolving information technologies, exploding data volumes, multidisciplinary science requirements, and high expectations from students who have grown up with smartphones and tablets. These changes are upending traditional approaches to accessing and using data and software. Unidata recognizes that its products and services must evolve to support new approaches to research and education. After years of hype and ambiguity, cloud computing is maturing in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Cloud services aimed at providing any resource, at any time, from any place, using any device are increasingly being embraced by all types of organizations. Given this trend and the enormous potential of cloud-based services, Unidata is moving to augment its products, services, data delivery mechanisms and applications to align with the cloud-computing paradigm. Specifically, Unidata is working toward establishing a community-based development environment that supports the creation and use of software services to build end-to-end data workflows. The design encourages the creation of services that can be broken into small, independent chunks that provide simple capabilities. Chunks could be used individually to perform a task, or chained into simple or elaborate workflows. The services will also be portable in the form of downloadable Unidata-in-a-box virtual images, allowing their use in researchers' own cloud-based computing environments. In this talk, we present a vision for Unidata's future in cloud-enabled data services and discuss our ongoing efforts to deploy a suite of Unidata data services and tools in the Amazon EC2 and Microsoft Azure cloud environments, including the transfer of real-time meteorological data into its cloud instances, product generation using those data, and the deployment of TDS, McIDAS ADDE and AWIPS II data servers and the Integrated Data Viewer (IDV) visualization tool.
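As a small illustration of the kind of remotely hosted data service described above, the sketch below lists datasets from a THREDDS Data Server (TDS) catalog over plain HTTP; the catalog URL is a placeholder and this is not Unidata's own client code.

    import xml.etree.ElementTree as ET
    import urllib.request

    # Hypothetical TDS catalog endpoint on a cloud-hosted server.
    CATALOG_URL = "http://example-tds.cloud.example.org/thredds/catalog/catalog.xml"

    with urllib.request.urlopen(CATALOG_URL) as resp:
        tree = ET.parse(resp)

    # THREDDS catalogs are XML; dataset elements carry a human-readable "name" attribute.
    # Matching on the local tag name avoids hard-coding the catalog namespace.
    for elem in tree.getroot().iter():
        if elem.tag.endswith("dataset") and "name" in elem.attrib:
            print(elem.attrib["name"])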
Impact of the resistive wall impedance on beam dynamics in the Future Circular e+e- Collider
NASA Astrophysics Data System (ADS)
Migliorati, M.; Belli, E.; Zobov, M.
2018-04-01
The Future Circular Collider study, which aims at designing post-LHC particle accelerator options, is entering its final stage, which foresees a conceptual design report containing the basic requirements for a hadron and a lepton collider, as well as options for an electron-proton machine. Due to the high beam intensities of these accelerators, collective effects have to be analyzed carefully. Among them, the finite conductivity of the beam vacuum chamber represents a major source of impedance for the electron-positron collider. Using numerical and analytical tools, a parametric study of longitudinal and transverse instabilities caused by the resistive wall is performed in this paper for the Future Circular Collider lepton machine, also taking into account the effects of the coating used to mitigate electron cloud build-up. It is shown that, under certain assumptions, the coupling impedance of a two-layer system does not depend on the conductivity of the coating; this property is an important criterion for the choice of the coating material. The results and findings of this study affect the machine design in several respects. In particular, the threshold of single-bunch instabilities is quite low with respect to the nominal beam current, the power losses due to the resistive wall are not negligible, and a new feedback system is needed to counteract the fast transverse coupled-bunch instability. The importance of a round vacuum chamber to avoid the quadrupolar tune shift is also discussed. Finally, the crucial importance of the choice of beam-pipe coating material and thickness for the above results is underlined.
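For orientation, the standard thick-wall resistive-wall impedance of a round pipe of radius b and wall conductivity sigma can be written, per unit length, in the textbook form

\[
\frac{Z_\parallel(\omega)}{L} \simeq \frac{1 + i\,\mathrm{sgn}(\omega)}{2\pi b\,\sigma\,\delta(\omega)},
\qquad
\delta(\omega) = \sqrt{\frac{2}{\mu_0\,\sigma\,|\omega|}},
\qquad
Z_\perp(\omega) \simeq \frac{2c}{b^{2}\omega}\,Z_\parallel(\omega),
\]

where delta is the skin depth and the sign of the reactive part depends on the time convention. A thin coating modifies these single-layer expressions; the paper's result concerns the conditions under which the two-layer impedance becomes insensitive to the coating conductivity, which is not captured by the simple formulas above.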
NASA Astrophysics Data System (ADS)
Ge, Xuming
2017-08-01
The coarse registration of point clouds from urban building scenes has become a key topic in applications of terrestrial laser scanning technology. Sampling-based algorithms in the random sample consensus (RANSAC) model have emerged as mainstream solutions to address coarse registration problems. In this paper, we propose a novel combined solution to automatically align two markerless point clouds from building scenes. Firstly, the method segments non-ground points from ground points. Secondly, the proposed method detects feature points from each cross section and then obtains semantic keypoints by connecting feature points with specific rules. Finally, the detected semantic keypoints from two point clouds act as inputs to a modified 4PCS algorithm. Examples are presented and the results compared with those of K-4PCS to demonstrate the main contributions of the proposed method, which are the extension of the original 4PCS to handle heavy datasets and the use of semantic keypoints to improve K-4PCS in relation to registration accuracy and computational efficiency.
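The last step of any such coarse registration is estimating a rigid transform from matched keypoints; below is a minimal SVD-based (Kabsch-style) sketch of that step, not the modified 4PCS algorithm of the paper, and the two point sets are assumed to be already in correspondence.

    import numpy as np

    def rigid_transform(src: np.ndarray, dst: np.ndarray):
        """Least-squares rotation R and translation t such that dst ~ R @ src + t.

        src, dst : (N, 3) arrays of corresponding keypoints.
        """
        c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
        H = (src - c_src).T @ (dst - c_dst)              # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                         # avoid an accidental reflection
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = c_dst - R @ c_src
        return R, t

    # Example: recover a known rotation/translation from synthetic correspondences.
    rng = np.random.default_rng(0)
    src = rng.random((50, 3))
    theta = np.radians(30)
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                       [np.sin(theta),  np.cos(theta), 0],
                       [0, 0, 1]])
    dst = src @ R_true.T + np.array([1.0, -2.0, 0.5])
    R, t = rigid_transform(src, dst)
    print(np.allclose(R, R_true, atol=1e-8), np.round(t, 3))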
Factors governing the total rainfall yield from continental convective clouds
NASA Technical Reports Server (NTRS)
Rosenfeld, Daniel; Gagin, Abraham
1989-01-01
Several important factors that govern the total rainfall from continental convective clouds were investigated by tracking thousands of convective cells in Israel and South Africa. The rainfall volume yield (Rvol) of the individual cells that build convective rain systems has been shown to depend mainly on the cloud-top height. There is, however, considerable variability in this relationship. The following factors that influence the Rvol were parameterized and quantitatively analyzed: (1) cloud base temperature, (2) atmospheric instability, and (3) the extent of isolation of the cell. It is also shown that a strong low-level forcing increases the duration of Rvol of clouds reaching the same vertical extent.
NASA Astrophysics Data System (ADS)
Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia
2018-05-01
Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient method for 3D building modeling. By combining the advantages of 3D PCD and two-dimensional optical images, this study describes the creation of a highly accurate building facade feature extraction method from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping method between the image features and 3D PCD, and optimization of the initial 3D PCD facade features considering structural information. Results show that the new method can extract the 3D PCD facade features of buildings more accurately and continuously. The new method is validated using a case study. In addition, the effectiveness of the new method is demonstrated by comparing it with the range image-extraction method and the optical image-extraction method in the absence of structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.
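The mapping between image features and 3D PCD ultimately rests on projecting points through a calibrated camera; the sketch below shows a basic pinhole projection of that kind, with an invented intrinsic matrix and an identity pose purely for illustration.

    import numpy as np

    def project_points(points: np.ndarray, K: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
        """Project (N, 3) world points to (N, 2) pixel coordinates with a pinhole model."""
        cam = points @ R.T + t                 # world -> camera frame
        uvw = cam @ K.T                        # camera frame -> homogeneous pixel coordinates
        return uvw[:, :2] / uvw[:, 2:3]        # perspective divide

    # Illustrative intrinsics (focal length 1000 px, principal point at 640, 480) and identity pose.
    K = np.array([[1000.0, 0.0, 640.0],
                  [0.0, 1000.0, 480.0],
                  [0.0, 0.0, 1.0]])
    R, t = np.eye(3), np.array([0.0, 0.0, 0.0])
    pts = np.array([[0.5, 0.2, 5.0], [-1.0, 0.3, 8.0]])
    print(project_points(pts, K, R, t))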
Cloud-Based Virtual Laboratory for Network Security Education
ERIC Educational Resources Information Center
Xu, Le; Huang, Dijiang; Tsai, Wei-Tek
2014-01-01
Hands-on experiments are essential for computer network security education. Existing laboratory solutions usually require significant effort to build, configure, and maintain and often do not support reconfigurability, flexibility, and scalability. This paper presents a cloud-based virtual laboratory education platform called V-Lab that provides a…
Reconciliation of the cloud computing model with US federal electronic health record regulations
2011-01-01
Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing. PMID:21727204
Reconciliation of the cloud computing model with US federal electronic health record regulations.
Schweitzer, Eugene J
2012-01-01
Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing.
NASA Astrophysics Data System (ADS)
Sun, Junqiang; Madhavan, S.; Wang, M.
2016-09-01
The MODerate resolution Imaging Spectroradiometer (MODIS), a remarkable heritage sensor in the fleet of the Earth Observing System of the National Aeronautics and Space Administration (NASA), is in orbit on two spacecraft, the Terra (T) and Aqua (A) platforms, which observe the Earth in the morning and afternoon orbits. T-MODIS has continued to operate for over 15 years, easily surpassing its 6-year design lifetime on orbit. Among the several science products derived from MODIS, one of the primary derivatives is the MODIS Cloud Mask (MOD035). The cloud mask algorithm incorporates several MODIS channels at both reflective and thermal infrared wavelengths to identify cloudy pixels against clear sky. Two of the thermal infrared channels used in detecting clouds are the 6.7 μm and 8.5 μm bands. Based on a difference threshold with the 11 μm channel, the 6.7 μm channel helps identify thick high clouds, while the 8.5 μm channel is useful for identifying thin clouds. Starting in 2010, it was observed in the cloud mask products that several pixels were misclassified due to changes in the thermal band radiometry. The long-term radiometric changes in these thermal channels have been attributed to electronic crosstalk contamination. In this paper, the improvement in cloud detection using the 6.7 μm and 8.5 μm channels is demonstrated using the electronic crosstalk correction. The electronic crosstalk analysis and characterization were developed using the regular moon observations of MODIS and have been reported in several works. The results presented in this paper should significantly help improve the MOD035 product, maintaining the long-term dataset from T-MODIS that is important for global change monitoring.
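A simplified numpy sketch of the brightness-temperature-difference style of test described above follows; the threshold values are placeholders for illustration and are not the operational MOD035 thresholds.

    import numpy as np

    def btd_cloud_flags(bt_6p7: np.ndarray, bt_8p5: np.ndarray, bt_11: np.ndarray,
                        thick_high_thresh: float = -10.0, thin_thresh: float = -0.5):
        """Flag cloudy pixels from brightness-temperature differences against the 11 um band.

        Thresholds are illustrative only; operational tests also depend on surface type,
        viewing geometry and ancillary data.
        """
        thick_high = (bt_6p7 - bt_11) > thick_high_thresh   # small |BT(6.7)-BT(11)| suggests thick high cloud
        thin = (bt_8p5 - bt_11) > thin_thresh                # positive 8.5-11 um difference suggests thin cloud
        return thick_high, thin

    bt11 = np.array([290.0, 230.0, 285.0])
    bt67 = np.array([245.0, 228.0, 244.0])
    bt85 = np.array([288.0, 229.0, 286.5])
    print(btd_cloud_flags(bt67, bt85, bt11))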
NASA Astrophysics Data System (ADS)
Wang, Xi Vincent; Wang, Lihui
2017-08-01
Cloud computing is the new enabling technology that offers centralised computing, flexible data storage and scalable services. In the manufacturing context, it is possible to utilise the Cloud technology to integrate and provide industrial resources and capabilities in terms of Cloud services. In this paper, a function block-based integration mechanism is developed to connect various types of production resources. A Cloud-based architecture is also deployed to offer a service pool which maintains these resources as production services. The proposed system provides a flexible and integrated information environment for the Cloud-based production system. As a specific type of manufacturing, Waste Electrical and Electronic Equipment (WEEE) remanufacturing experiences difficulties in system integration, information exchange and resource management. In this research, WEEE is selected as the example of Internet of Things to demonstrate how the obstacles and bottlenecks are overcome with the help of Cloud-based informatics approach. In the case studies, the WEEE recycle/recovery capabilities are also integrated and deployed as flexible Cloud services. Supporting mechanisms and technologies are presented and evaluated towards the end of the paper.
Studies in the use of cloud type statistics in mission simulation
NASA Technical Reports Server (NTRS)
Fowler, M. G.; Willand, J. H.; Chang, D. T.; Cogan, J. L.
1974-01-01
A study to further improve NASA's global cloud statistics for mission simulation is reported. Regional homogeneity in cloud types was examined; most of the original region boundaries defined for cloud cover amount in previous studies were supported by the statistics on cloud types and the number of cloud layers. Conditionality in cloud statistics was also examined with special emphasis on temporal and spatial dependencies, and cloud type interdependence. Temporal conditionality was found up to 12 hours, and spatial conditionality up to 200 miles; the diurnal cycle in convective cloudiness was clearly evident. As expected, the joint occurrence of different cloud types reflected the dynamic processes which form the clouds. Other phases of the study improved the cloud type statistics for several regions and proposed a mission simulation scheme combining the 4-dimensional atmospheric model, sponsored by MSFC, with the global cloud model.
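Since temporal conditionality up to 12 hours is central to such statistics, one simple way to use them in mission simulation is a first-order Markov chain over cloud-cover categories; the transition matrix below is invented for illustration, not taken from the study.

    import numpy as np

    # Cloud-cover categories and an illustrative (made-up) 3-hourly transition matrix:
    # rows are "from", columns are "to"; each row sums to 1.
    categories = ["clear", "scattered", "broken", "overcast"]
    P = np.array([[0.70, 0.20, 0.07, 0.03],
                  [0.25, 0.50, 0.18, 0.07],
                  [0.05, 0.25, 0.50, 0.20],
                  [0.02, 0.08, 0.30, 0.60]])

    def simulate(start: int, steps: int, rng=np.random.default_rng(1)):
        """Sample a sequence of cloud-cover states, each conditioned on the previous state."""
        state, path = start, [start]
        for _ in range(steps):
            state = rng.choice(4, p=P[state])
            path.append(state)
        return [categories[s] for s in path]

    print(simulate(start=0, steps=8))   # one synthetic 24-hour sequence at 3-hour steps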
Dynamics of charge clouds ejected from laser-induced warm dense gold nanofilms
Zhou, Jun; Li, Junjie; Correa, Alfredo A.; ...
2014-10-24
We report the first systematic study of the ejected charge dynamics surrounding laser-produced 30-nm warm dense gold films using single-shot femtosecond electron shadow imaging and deflectometry. The results reveal a two-step dynamical process of the ejected electrons under the high pump fluence conditions: an initial emission and accumulation of a large amount of electrons near the pumped surface region followed by the formation of hemispherical clouds of electrons on both sides of the film, which are escaping into the vacuum at a nearly isotropic and constant velocity with an unusually high kinetic energy of more than 300 eV. We also developed a model of the escaping charge distribution that not only reproduces the main features of the observed charge expansion dynamics but also allows us to extract the number of ejected electrons remaining in the cloud.
Dynamics of charge clouds ejected from laser-induced warm dense gold nanofilms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Jun; Li, Junjie; Correa, Alfredo A.
We report the first systematic study of the ejected charge dynamics surrounding laser-produced 30-nm warm dense gold films using single-shot femtosecond electron shadow imaging and deflectometry. The results reveal a two-step dynamical process of the ejected electrons under the high pump fluence conditions: an initial emission and accumulation of a large amount of electrons near the pumped surface region followed by the formation of hemispherical clouds of electrons on both sides of the film, which are escaping into the vacuum at a nearly isotropic and constant velocity with an unusually high kinetic energy of more than 300 eV. We also developed a model of the escaping charge distribution that not only reproduces the main features of the observed charge expansion dynamics but also allows us to extract the number of ejected electrons remaining in the cloud.
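For scale, the 300 eV kinetic energy quoted above corresponds to a non-relativistic electron speed of roughly 1 x 10^7 m/s (about 3% of c); the short computation below just restates that conversion.

    import math

    e = 1.602176634e-19      # J per eV
    m_e = 9.1093837015e-31   # electron mass, kg

    E = 300 * e              # 300 eV in joules
    v = math.sqrt(2 * E / m_e)
    print(f"v = {v:.2e} m/s, v/c = {v / 2.99792458e8:.3f}")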
Integration of Point Clouds Dataset from Different Sensors
NASA Astrophysics Data System (ADS)
Abdullah, C. K. A. F. Che Ku; Baharuddin, N. Z. S.; Ariff, M. F. M.; Majid, Z.; Lau, C. L.; Yusoff, A. R.; Idris, K. M.; Aspuri, A.
2017-02-01
Laser scanner technology has become an option in present-day data collection. It comprises Airborne Laser Scanning (ALS) and Terrestrial Laser Scanning (TLS). An ALS such as the Phoenix AL3-32 can provide accurate information from the rooftop viewpoint, while a TLS such as the Leica C10 can provide complete data for the building facade. If both are integrated, however, they can produce more accurate data. The focus of this study is to integrate both types of data acquisition, ALS and TLS, and to determine the accuracy of the data obtained. The final results are used to generate three-dimensional (3D) building models. The scope of this study is the data acquisition of the UTM Eco-Home by laser scanning, with the ALS scanning the roof and the TLS scanning the building facade. Both devices are used to ensure that no part of the building is left unscanned. In the data integration process, the two datasets are registered using selected points on man-made features that are clearly visible, in Cyclone 7.3 software. The accuracy of the integrated data is determined through an accuracy assessment carried out using man-made registration methods. The registration accuracy achieved is below 0.04 m. The integrated data are then used to generate a 3D model of the UTM Eco-Home building in SketchUp software. In conclusion, integrating ALS and TLS data acquisition produces accurate integrated data that can be used to generate a 3D model of the UTM Eco-Home. For visualization purposes, the generated 3D building model is prepared at Level of Detail 3 (LOD3), as recommended by the City Geography Markup Language (CityGML).
A Data Driven Pre-cooling Framework for Energy Cost Optimization in Commercial Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vishwanath, Arun; Chandan, Vikas; Mendoza, Cameron
Commercial buildings consume a significant amount of energy. Facility managers are increasingly grappling with the problem of reducing their buildings’ peak power, overall energy consumption and energy bills. In this paper, we first develop an optimization framework – based on a gray box model for zone thermal dynamics – to determine a pre-cooling strategy that simultaneously shifts the peak power to low energy tariff regimes, and reduces both the peak power and overall energy consumption by exploiting the flexibility in a building’s thermal comfort range. We then evaluate the efficacy of the pre-cooling optimization framework by applying it to building management system data, spanning several days, obtained from a large commercial building located in a tropical region of the world. The results from simulations show that optimal pre-cooling reduces peak power by over 50%, energy consumption by up to 30% and energy bills by up to 37%. Next, to enable ease of use of our framework, we also propose a shortest path based heuristic algorithm for solving the optimization problem and show that it has comparable performance with the optimal solution. Finally, we describe an application of the proposed optimization framework for developing countries to reduce the dependency on expensive fossil fuels, which are often used as a source for energy backup. We conclude by highlighting our real world deployment of the optimal pre-cooling framework via a software service on the cloud platform of a major provider. Our pre-cooling methodology, based on the gray box optimization framework, incurs no capital expense and relies on data readily available from a building management system, thus enabling facility managers to take informed decisions for improving the energy and cost footprints of their buildings.
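A gray-box zone model of the kind referred to above is often a first-order RC model; the sketch below steps such a model through a naive pre-cooling schedule, with every parameter value (R, C, setpoints, tariff window) invented for illustration rather than taken from the paper.

    import numpy as np

    # First-order RC zone model: C dT/dt = (T_out - T) / R - Q_cool
    R = 2.0e-3          # thermal resistance, K/W (illustrative)
    C = 5.0e6           # thermal capacitance, J/K (illustrative)
    dt = 900.0          # 15-minute steps
    hours = np.arange(0, 24, dt / 3600.0)

    T_out = 28.0 + 6.0 * np.sin((hours - 9.0) / 24.0 * 2.0 * np.pi)   # synthetic outdoor temperature
    setpoint = np.where((hours >= 5) & (hours < 8), 22.0, 24.5)        # pre-cool before the peak tariff window

    T = 24.5
    cooling_power = []
    for T_o, sp in zip(T_out, setpoint):
        # Idealized HVAC: supply exactly the cooling needed to reach the setpoint, capped at 40 kW.
        Q_need = (T_o - T) / R - C * (sp - T) / dt
        Q = np.clip(Q_need, 0.0, 4.0e4)
        T = T + dt / C * ((T_o - T) / R - Q)
        cooling_power.append(Q)

    cooling_power = np.array(cooling_power)
    peak_window = (hours >= 8) & (hours < 18)
    print("peak-window mean cooling (kW):", round(cooling_power[peak_window].mean() / 1e3, 1))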
A Diffusion Cloud Chamber Study of Very Slow Mesons. II. Beta Decay of the Muon
DOE R&D Accomplishments Database
Lederman, L. M.; Sargent, C. P.; Rinehart, M.; Rogers, K.
1955-03-01
The spectrum of electrons arising from the decay of the negative mu meson has been determined. The muons are arrested in the gas of a high pressure hydrogen filled diffusion cloud chamber. The momenta of the decay electrons are determined from their curvature in a magnetic field of 7750 gauss. The spectrum of 415 electrons has been analyzed according to the theory of Michel.
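The curvature-to-momentum conversion underlying such a measurement is p [MeV/c] = 299.8 B [T] r [m]; the snippet below evaluates it for the quoted 7750 gauss field and an illustrative radius near the Michel endpoint.

    def momentum_mev_per_c(B_tesla: float, radius_m: float) -> float:
        """p = qBr expressed in MeV/c: p [MeV/c] = 299.8 * B [T] * r [m]."""
        return 299.79 * B_tesla * radius_m

    B = 7750e-4   # 7750 gauss in tesla
    # Illustrative radius: a decay electron near the Michel endpoint (~52.8 MeV/c)
    # curves with a radius of roughly 0.23 m in this field.
    print(round(momentum_mev_per_c(B, 0.23), 1))   # ~53.4 MeV/c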
3D modeling of building indoor spaces and closed doors from imagery and point clouds.
Díaz-Vilariño, Lucía; Khoshelham, Kourosh; Martínez-Sánchez, Joaquín; Arias, Pedro
2015-02-03
3D models of indoor environments are increasingly gaining importance due to the wide range of applications to which they can be subjected: from redesign and visualization to monitoring and simulation. These models usually exist only for newly constructed buildings; therefore, the development of automatic approaches for reconstructing 3D indoors from imagery and/or point clouds can make the process easier, faster and cheaper. Among the constructive elements defining a building interior, doors are very common elements and their detection can be very useful either for knowing the environment structure, to perform an efficient navigation or to plan appropriate evacuation routes. The fact that doors are topologically connected to walls by being coplanar, together with the unavoidable presence of clutter and occlusions indoors, increases the inherent complexity of the automation of the recognition process. In this work, we present a pipeline of techniques used for the reconstruction and interpretation of building interiors based on point clouds and images. The methodology analyses the visibility problem of indoor environments and goes in depth with door candidate detection. The presented approach is tested in real data sets showing its potential with a high door detection rate and applicability for robust and efficient envelope reconstruction.
NASA Technical Reports Server (NTRS)
Spann, J. F., Jr.; Germany, G. A.; Brittnacher, M. J.; Parks, G. K.; Elsen, R.
1997-01-01
The January 10-11, 1997 magnetic cloud event provided a rare opportunity to study auroral energy deposition under varying but intense IMF conditions. The Wind spacecraft, located about 100 RE upstream, monitored the IMF and plasma parameters during the passing of the cloud. The Polar Ultraviolet Imager (UVI) observed the auroral precipitation during the first encounter of the cloud with Earth's magnetosphere and during several subsequent substorm events. The UVI has the unique capability of measuring the energy flux and characteristic energy of the precipitating electrons through the use of narrow band filters that distinguish short and long wavelength molecular nitrogen emissions. The spatial and temporal characteristics of the precipitating electron energy will be discussed beginning with the inception of the event at the Earth early on January 10th and continuing through the subsidence of auroral activity on January 11th.
Grammar-Supported 3d Indoor Reconstruction from Point Clouds for As-Built Bim
NASA Astrophysics Data System (ADS)
Becker, S.; Peter, M.; Fritsch, D.
2015-03-01
The paper presents a grammar-based approach for the robust automatic reconstruction of 3D interiors from raw point clouds. The core of the approach is a 3D indoor grammar which is an extension of our previously published grammar concept for the modeling of 2D floor plans. The grammar allows for the modeling of buildings whose horizontal, continuous floors are traversed by hallways providing access to the rooms, as is the case for most office buildings or public buildings like schools, hospitals or hotels. The grammar is designed in such a way that it can be embedded in an iterative automatic learning process providing a seamless transition from LOD3 to LOD4 building models. Starting from an initial low-level grammar, automatically derived from the window representations of an available LOD3 building model, hypotheses about indoor geometries can be generated. The hypothesized indoor geometries are checked against observation data - here 3D point clouds - collected in the interior of the building. The verified and accepted geometries form the basis for an automatic update of the initial grammar. By this, the knowledge content of the initial grammar is enriched, leading to a grammar with increased quality. This higher-level grammar can then be applied to predict realistic geometries for building parts where only sparse observation data are available. Thus, our approach allows for the robust generation of complete 3D indoor models whose quality can be improved continuously as soon as new observation data are fed into the grammar-based reconstruction process. The feasibility of our approach is demonstrated based on a real-world example.
76 FR 323 - Information Systems Technical Advisory Committee; Notice of Partially Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-04
...), Building 33, Cloud Room, 53560 Hull Street, San Diego, California 92152. The Committee advises the Office... and Relation to Category 3 5. Godson Microprocessor Project 6. Autonomous Vehicle Project 7. Cloud Computing, Technology and Security Issues Thursday, January 27 Closed Session 8. Discussion of matters...
2011-03-31
CAPE CANAVERAL, Fla. – At NASA's Kennedy Space Center in Florida, an ominous thunderstorm cloud hovers over the Vehicle Assembly Building in the Launch Complex 39 area. Severe storms associated with a frontal system are moving through Central Florida, producing strong winds, heavy rain, frequent lightning and even funnel clouds. Photo credit: NASA/Kim Shiflett
2011-03-31
CAPE CANAVERAL, Fla. – At NASA's Kennedy Space Center in Florida, dark clouds hover over the Vehicle Assembly Building in the Launch Complex 39 area. Severe storms associated with a frontal system are moving through Central Florida, producing strong winds, heavy rain, frequent lightning and even funnel clouds. Photo credit: NASA/Jack Pfaller
2011-03-31
CAPE CANAVERAL, Fla. – At NASA's Kennedy Space Center in Florida, dark clouds hover over the Vehicle Assembly Building in the Launch Complex 39 area. Severe storms associated with a frontal system are moving through Central Florida, producing strong winds, heavy rain, frequent lightning and even funnel clouds. Photo credit: NASA/Jack Pfaller
Probing optical excitations in chevron-like armchair graphene nanoribbons.
Denk, Richard; Lodi-Rizzini, Alberto; Wang, Shudong; Hohage, Michael; Zeppenfeld, Peter; Cai, Jinming; Fasel, Roman; Ruffieux, Pascal; Berger, Reinhard Franz Josef; Chen, Zongping; Narita, Akimitsu; Feng, Xinliang; Müllen, Klaus; Biagi, Roberto; De Renzi, Valentina; Prezzi, Deborah; Ruini, Alice; Ferretti, Andrea
2017-11-30
The bottom-up fabrication of graphene nanoribbons (GNRs) has opened new opportunities to specifically tune their electronic and optical properties by precisely controlling their atomic structure. Here, we address excitation in GNRs with periodic structural wiggles, the so-called chevron GNRs. Based on reflectance difference and high-resolution electron energy loss spectroscopies together with ab initio simulations, we demonstrate that their excited-state properties are of excitonic nature. The spectral fingerprints corresponding to different reaction stages in their bottom-up fabrication are also unequivocally identified, allowing us to follow the exciton build-up from the starting monomer precursor to the final GNR structure.
NASA Astrophysics Data System (ADS)
Feng, Bing
Electron cloud instabilities have been observed in many circular accelerators around the world and have raised concerns for future accelerators and possible upgrades. In this thesis, the electron cloud instabilities are studied with the quasi-static particle-in-cell (PIC) code QuickPIC. Modeling in three dimensions the long-timescale propagation of a beam through electron clouds in circular accelerators requires faster and more efficient simulation codes. Thousands of processors are easily available for parallel computations. However, it is not straightforward to increase the effective speed of the simulation by running the same problem size on an increasing number of processors because there is a limit to the domain size in the decomposition of the two-dimensional part of the code. A pipelining algorithm applied to the fully parallelized particle-in-cell code QuickPIC is implemented to overcome this limit. The pipelining algorithm uses multiple groups of processors and optimizes the job allocation across the processors in parallel computing. With this novel algorithm, it is possible to use on the order of 10² processors, and to expand the scale and the speed of the simulation with QuickPIC by a similar factor. In addition to the efficiency improvement with the pipelining algorithm, the fidelity of QuickPIC is enhanced by adding two physics models, the beam space charge effect and the dispersion effect. Simulation of two specific circular machines is performed with the enhanced QuickPIC. First, the proposed upgrade to the Fermilab Main Injector is studied with an eye toward guiding the design of the upgrade and validating the code. Moderate emittance growth is observed when the bunch population is increased by 5 times. But the simulation also shows that increasing the beam energy from 8 GeV to 20 GeV or above can effectively limit the emittance growth. Then the enhanced QuickPIC is used to simulate the electron cloud effect on the electron beam in the Cornell Energy Recovery Linac (ERL), owing to the extremely small emittance and high peak currents anticipated in the machine. A tune shift is discovered from the simulation; however, emittance growth of the electron beam in the electron cloud is not observed for ERL parameters.
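The benefit of the pipelining algorithm can be conveyed with a simple scaling estimate: if the two-dimensional transverse decomposition saturates at some processor count, running G pipelined groups ideally multiplies the usable processor count by nearly G. The toy calculation below encodes only that idea, with invented numbers rather than the thesis's benchmarks.

    def pipelined_speedup(groups: int, fill_overhead_steps: int, total_steps: int) -> float:
        """Idealized speedup of a pipelined PIC run relative to the single-group baseline.

        groups             : number of pipelined processor groups
        fill_overhead_steps: steps lost while the pipeline fills and drains
        total_steps        : beam-slice steps in the run
        """
        efficiency = total_steps / (total_steps + fill_overhead_steps * (groups - 1))
        return groups * efficiency

    # Example: the 2D solve limits one group to 64 processors; 8 groups use 512 processors in total.
    print(round(pipelined_speedup(groups=8, fill_overhead_steps=1, total_steps=1000), 2))  # close to 8x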
Galactic Building Blocks Seen Swarming Around Andromeda
NASA Astrophysics Data System (ADS)
2004-02-01
Green Bank, WV - A team of astronomers using the National Science Foundation's Robert C. Byrd Green Bank Telescope (GBT) has made the first conclusive detection of what appear to be the leftover building blocks of galaxy formation -- neutral hydrogen clouds -- swarming around the Andromeda Galaxy, the nearest large spiral galaxy to the Milky Way. This discovery may help scientists understand the structure and evolution of the Milky Way and all spiral galaxies. It also may help explain why certain young stars in mature galaxies are surprisingly bereft of the heavy elements that their contemporaries contain. [Image caption: Andromeda Galaxy - This image depicts several long-sought galactic "building blocks" in orbit of the Andromeda Galaxy (M31). The newfound hydrogen clouds are depicted in a shade of orange (GBT), while gas that comprises the massive hydrogen disk of Andromeda is shown at high resolution in blue (Westerbork Synthesis Radio Telescope). Credit: NRAO/AUI/NSF, WSRT.] "Giant galaxies, like Andromeda and our own Milky Way, are thought to form through repeated mergers with smaller galaxies and through the accretion of vast numbers of even lower mass 'clouds' -- dark objects that lack stars and even are too small to call galaxies," said David A. Thilker of the Johns Hopkins University in Baltimore, Maryland. "Theoretical studies predict that this process of galactic growth continues today, but astronomers have been unable to detect the expected low mass 'building blocks' falling into nearby galaxies, until now." Thilker's research is published in the Astrophysical Journal Letters. Other contributors include: Robert Braun of the Netherlands Foundation for Research in Astronomy; Rene A.M. Walterbos of New Mexico State University; Edvige Corbelli of the Osservatorio Astrofisico di Arcetri in Italy; Felix J. Lockman and Ronald Maddalena of the National Radio Astronomy Observatory (NRAO) in Green Bank, West Virginia; and Edward Murphy of the University of Virginia. The Milky Way and Andromeda were formed many billions of years ago in a cosmic neighborhood brimming with galactic raw materials -- among which hydrogen, helium, and cold dark matter were primary constituents. By now, most of this raw material has probably been gobbled up by the two galaxies, but astronomers suspect that some primitive clouds are still floating free. Previous studies have revealed a number of clouds of neutral atomic hydrogen that are near the Milky Way but not part of its disk. These were initially referred to as high-velocity clouds (HVCs) when they were first discovered because they appeared to move at velocities difficult to reconcile with Galactic rotation. Scientists were uncertain if HVCs comprised building blocks of the Milky Way that had so far escaped capture, or if they traced gas accelerated to unexpected velocities by energetic processes (multiple supernovae) within the Milky Way. The discovery of similar clouds bound to the Andromeda Galaxy strengthens the case that at least some of these HVCs are indeed galactic building blocks. Astronomers are able to use radio telescopes to detect the characteristic 21-centimeter radiation emitted naturally by neutral atomic hydrogen. The great difficulty in analyzing these low-mass galactic building blocks has been that their natural radio emission is extremely faint. Even those nearest to us, clouds orbiting our Galaxy, are hard to study because of serious distance uncertainties.
"We know the Milky Way HVCs are relatively nearby, but precisely how close is maddeningly tough to determine," said Thilker. Past attempts to find missing satellites around external galaxies at well-known distances have been unsuccessful because of the need for a very sensitive instrument capable of producing high-fidelity images, even in the vicinity of a bright source such as the Andromeda Galaxy. One might consider this task similar to visually distinguishing a candle placed adjacent to a spotlight. The novel design of the recently commissioned GBT met these challenges brilliantly, and gave astronomers their first look at the cluttered neighborhood around Andromeda. The Andromeda Galaxy was targeted because it is the nearest massive spiral galaxy. "In some sense, the rich get richer, even in space," said Thilker. "All else being equal, one would expect to find more primordial clouds in the vicinity of a large spiral galaxy than near a small dwarf galaxy, for instance. This makes Andromeda a good place to look, especially considering its relative proximity -- a mere 2.5 million light-years from Earth." What the GBT was able to pin down was a population of 20 discrete neutral hydrogen clouds, together with an extended filamentary component, which, the astronomers believe, are both associated with Andromeda. These objects, seemingly under the gravitational influence of Andromeda's halo, are thought to be the gaseous clouds of the "missing" (perhaps dark-matter dominated) satellites and their merger remnants. They were found within 163,000 light-years of Andromeda. Favored cosmological models have predicted the existence of these satellites, and their discovery could account for some of the missing "cold dark matter" in the Universe. Also, confirmation that these low-mass objects are ubiquitous around larger galaxies could help solve the mystery of why certain young stars, known as G-dwarf stars, are chemically similar to ones that evolved billions of years ago. As galaxies age, they develop greater concentrations of heavy elements formed by the nuclear reactions in the cores of stars and in the cataclysmic explosions of supernovae. These explosions spew heavy elements out into the galaxy, which then become planets and get taken up in the next generation of stars. Spectral and photometric analysis of young stars in the Milky Way and other galaxies, however, show that there are a certain number of young stars that are surprisingly bereft of heavy elements, making them resemble stars that should have formed in the early stages of galactic evolution. "One way to account for this strange anomaly is to have a fresh source of raw galactic material from which to form new stars," said Murphy. "Since high-velocity clouds may be the leftover building blocks of galaxy formation, they contain nearly pristine concentrations of hydrogen, mostly free from the heavy metals that seed older galaxies." Their merger into large galaxies, therefore, could explain how fresh material is available for the formation of G-dwarf stars. The Andromeda Galaxy, also known as M31, is one of only a few galaxies that are visible from Earth with the unaided eye, and is seen as a faint smudge in the constellation Andromeda. When viewed through a modest telescope, Andromeda also reveals that it has two prominent satellite dwarf galaxies, known as M32 and M110. These dwarfs, along with the clouds studied by Thilker and collaborators, are doomed to eventually merge with Andromeda. 
The Milky Way, M33, and the Andromeda Galaxy plus about 40 dwarf companions, comprise what is known as the "Local Group." Today, Andromeda is perhaps the most studied galaxy other than the Milky Way. In fact, many of the things we know about the nature of galaxies like the Milky Way were learned by studying Andromeda, since the overall features of our own galaxy are disguised by our internal vantage point. "In this case, Andromeda is a good analogue for the Milky Way," said Murphy. "It clarifies the picture. Living inside the Milky Way is like trying to determine what your house looks like from the inside, without stepping outdoors. However, if you look at neighbors' houses, you can get a feeling for what your own home might look like." The GBT is the world's largest fully steerable radio telescope. The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc.
Capabilities and Advantages of Cloud Computing in the Implementation of Electronic Health Record.
Ahmadi, Maryam; Aslani, Nasim
2018-01-01
With regard to the high cost of the Electronic Health Record (EHR), in recent years the use of new technologies, in particular cloud computing, has increased. The purpose of this study was to review systematically the studies conducted in the field of cloud computing. The present study was a systematic review conducted in 2017. The search was performed in the Scopus, Web of Science, IEEE, PubMed and Google Scholar databases using combinations of keywords. From the 431 articles initially retrieved, 27 articles were selected for review after applying the inclusion and exclusion criteria. Data gathering was done with a self-made checklist and analyzed by the content analysis method. The findings of this study showed that cloud computing is a very widespread technology. It includes domains such as cost, security and privacy, scalability, mutual performance and interoperability, implementation platform and independence of cloud computing, search and exploration capability, error reduction and quality improvement, structure, flexibility and sharing ability. These make it effective for the electronic health record. According to the findings of the present study, the higher capabilities of cloud computing are useful in implementing EHR in a variety of contexts. It also provides wide opportunities for managers, analysts and providers of health information systems. Considering the advantages and domains of cloud computing in the establishment of EHR, it is recommended to use this technology.
Capabilities and Advantages of Cloud Computing in the Implementation of Electronic Health Record
Ahmadi, Maryam; Aslani, Nasim
2018-01-01
Background: With regard to the high cost of the Electronic Health Record (EHR), in recent years the use of new technologies, in particular cloud computing, has increased. The purpose of this study was to review systematically the studies conducted in the field of cloud computing. Methods: The present study was a systematic review conducted in 2017. The search was performed in the Scopus, Web of Science, IEEE, PubMed and Google Scholar databases using combinations of keywords. From the 431 articles initially retrieved, 27 articles were selected for review after applying the inclusion and exclusion criteria. Data gathering was done with a self-made checklist and analyzed by the content analysis method. Results: The findings of this study showed that cloud computing is a very widespread technology. It includes domains such as cost, security and privacy, scalability, mutual performance and interoperability, implementation platform and independence of cloud computing, search and exploration capability, error reduction and quality improvement, structure, flexibility and sharing ability. These make it effective for the electronic health record. Conclusion: According to the findings of the present study, the higher capabilities of cloud computing are useful in implementing EHR in a variety of contexts. It also provides wide opportunities for managers, analysts and providers of health information systems. Considering the advantages and domains of cloud computing in the establishment of EHR, it is recommended to use this technology. PMID:29719309
A Multi-Frequency Wide-Swath Spaceborne Cloud and Precipitation Imaging Radar
NASA Technical Reports Server (NTRS)
Li, Lihua; Racette, Paul; Heymsfield, Gary; McLinden, Matthew; Venkatesh, Vijay; Coon, Michael; Perrine, Martin; Park, Richard; Cooley, Michael; Stenger, Pete;
2016-01-01
Microwave and millimeter-wave radars have proven their effectiveness in cloud and precipitation observations. The NASA Earth Science Decadal Survey (DS) Aerosol, Cloud and Ecosystems (ACE) mission calls for a dual-frequency cloud radar (W-band, 94 GHz, and Ka-band, 35 GHz) for global measurements of cloud microphysical properties. Recently, there have been discussions of utilizing a tri-frequency (Ku/Ka/W-band) radar for a combined ACE and Global Precipitation Measurement (GPM) follow-on mission that has evolved into the Cloud and Precipitation Process Mission (CaPPM) concept. In this presentation we will give an overview of the technology development efforts at the NASA Goddard Space Flight Center (GSFC) and at Northrop Grumman Electronic Systems (NGES) through projects funded by the NASA Earth Science Technology Office (ESTO) Instrument Incubator Program (IIP). The primary objective of this research is to advance the key enabling technologies for a tri-frequency (Ku/Ka/W-band) shared-aperture spaceborne imaging radar to provide unprecedented, simultaneous multi-frequency measurements that will enhance understanding of the effects of clouds and precipitation and their interaction on Earth climate change. Research effort has been focused on concept design and trade studies of the tri-frequency radar; investigating architectures that provide tri-band shared-aperture capability; advancing the development of the Ka-band active electronically scanned array (AESA) transmit/receive (TR) module; and development of the advanced radar backend electronics.
Dust particle radial confinement in a dc glow discharge.
Sukhinin, G I; Fedoseev, A V; Antipov, S N; Petrov, O F; Fortov, V E
2013-01-01
A self-consistent nonlocal model of the positive column of a dc glow discharge with dust particles is presented. Radial distributions of plasma parameters and the dust component in an axially homogeneous glow discharge are considered. The model is based on the solution of a nonlocal Boltzmann equation for the electron energy distribution function, drift-diffusion equations for ions, and the Poisson equation for a self-consistent electric field. The radial distribution of dust particle density in a dust cloud was fixed as a given steplike function or was chosen according to an equilibrium Boltzmann distribution. The balance of electron and ion production in argon ionization by an electron impact and their losses on the dust particle surface and on the discharge tube walls is taken into account. The interrelation of discharge plasma and the dust cloud is studied in a self-consistent way, and the radial distributions of the discharge plasma and dust particle parameters are obtained. It is shown that the influence of the dust cloud on the discharge plasma has a nonlocal behavior, e.g., density and charge distributions in the dust cloud substantially depend on the plasma parameters outside the dust cloud. As a result of a self-consistent evolution of plasma parameters to equilibrium steady-state conditions, ionization and recombination rates become equal to each other, electron and ion radial fluxes become equal to zero, and the radial component of electric field is expelled from the dust cloud.
THE LAUNCHING OF COLD CLOUDS BY GALAXY OUTFLOWS. II. THE ROLE OF THERMAL CONDUCTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brüggen, Marcus; Scannapieco, Evan
2016-05-01
We explore the impact of electron thermal conduction on the evolution of radiatively cooled cold clouds embedded in flows of hot and fast material as it occurs in outflowing galaxies. Performing a parameter study of three-dimensional adaptive mesh refinement hydrodynamical simulations, we show that electron thermal conduction causes cold clouds to evaporate, but it can also extend their lifetimes by compressing them into dense filaments. We distinguish between low column-density clouds, which are disrupted on very short times, and high-column density clouds with much longer disruption times that are set by a balance between impinging thermal energy and evaporation. We provide fits to the cloud lifetimes and velocities that can be used in galaxy-scale simulations of outflows in which the evolution of individual clouds cannot be modeled with the required resolution. Moreover, we show that the clouds are only accelerated to a small fraction of the ambient velocity because compression by evaporation causes the clouds to present a small cross-section to the ambient flow. This means that either magnetic fields must suppress thermal conduction, or that the cold clouds observed in galaxy outflows are not formed of cold material carried out from the galaxy.
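For context, the timescale against which such cloud lifetimes are usually quoted is the classic cloud-crushing time, t_cc ~ sqrt(chi) R_c / v_hot, where chi is the cloud-to-wind density contrast; the snippet below evaluates this standard scaling for illustrative numbers, not the fitted lifetimes of the paper.

    # Cloud-crushing time t_cc = sqrt(chi) * R_cloud / v_wind (standard scaling, not the paper's fit).
    KPC = 3.086e19       # m per kpc
    KM = 1.0e3           # m per km
    MYR = 3.156e13       # s per Myr

    chi = 1000.0         # cloud / hot-wind density contrast (illustrative)
    R_cloud = 0.1 * KPC  # 100 pc cloud radius (illustrative)
    v_wind = 1000.0 * KM # 1000 km/s hot wind (illustrative)

    t_cc = chi**0.5 * R_cloud / v_wind
    print(round(t_cc / MYR, 1), "Myr")   # a few Myr for these numbers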
Molgenis-impute: imputation pipeline in a box.
Kanterakis, Alexandros; Deelen, Patrick; van Dijk, Freerk; Byelas, Heorhiy; Dijkstra, Martijn; Swertz, Morris A
2015-08-19
Genotype imputation is an important procedure in current genomic analysis such as genome-wide association studies, meta-analyses and fine mapping. Although high quality tools are available that perform the steps of this process, considerable effort and expertise is required to set up and run a best practice imputation pipeline, particularly for larger genotype datasets, where imputation has to scale out in parallel on computer clusters. Here we present MOLGENIS-impute, an 'imputation in a box' solution that seamlessly and transparently automates the set up and running of all the steps of the imputation process. These steps include genome build liftover (liftovering), genotype phasing with SHAPEIT2, quality control, sample and chromosomal chunking/merging, and imputation with IMPUTE2. MOLGENIS-impute builds on MOLGENIS-compute, a simple pipeline management platform for submission and monitoring of bioinformatics tasks in High Performance Computing (HPC) environments like local/cloud servers, clusters and grids. All the required tools, data and scripts are downloaded and installed in a single step. Researchers with diverse backgrounds and expertise have tested MOLGENIS-impute on different locations and imputed over 30,000 samples so far using the 1,000 Genomes Project and new Genome of the Netherlands data as the imputation reference. The tests have been performed on PBS/SGE clusters, cloud VMs and in a grid HPC environment. MOLGENIS-impute gives priority to the ease of setting up, configuring and running an imputation. It has minimal dependencies and wraps the pipeline in a simple command line interface, without sacrificing flexibility to adapt or limiting the options of underlying imputation tools. It does not require knowledge of a workflow system or programming, and is targeted at researchers who just want to apply best practices in imputation via simple commands. It is built on the MOLGENIS compute workflow framework to enable customization with additional computational steps or it can be included in other bioinformatics pipelines. It is available as open source from: https://github.com/molgenis/molgenis-imputation.
PIONEER VENUS 2 MULTI-PROBE PARACHUTE TESTS IN THE VAB SHOWS OPEN PARACHUTE
NASA Technical Reports Server (NTRS)
1975-01-01
A parachute system, designed to carry an instrument-laden probe down through the dense atmosphere of torrid, cloud-shrouded Venus, was tested in KSC's Vehicle Assembly Building. The tests are in preparation for a Pioneer multi-probe mission to Venus scheduled for launch from KSC in 1978. Full-scale (12-foot diameter) parachutes with simulated pressure vessels weighing up to 45 pounds were dropped from heights of up to 450 feet to the floor of the VAB, where the impact was cushioned by a honeycomb cardboard impact arrestor. The VAB offers an ideal, wind-free testing facility at no additional construction cost and was used for similar tests of the parachute system for the twin Viking spacecraft scheduled for launch toward Mars in August.
PIONEER VENUS 2 MULTI-PROBE PARACHUTE TESTS IN VAB WITH PARACHUTE HOISTED HIGH
NASA Technical Reports Server (NTRS)
1975-01-01
A parachute system, designed to carry an instrument-laden probe down through the dense atmosphere of torrid, cloud-shrouded Venus, was tested in KSC's Vehicle Assembly Building. The tests are in preparation for a Pioneer multi-probe mission to Venus scheduled for launch from KSC in 1978. Full-scale (12-foot diameter) parachutes with simulated pressure vessels weighing up to 45 pounds were dropped from heights of up to 450 feet to the floor of the VAB, where the impact was cushioned by a honeycomb cardboard impact arrestor. The VAB offers an ideal, wind-free testing facility at no additional construction cost and was used for similar tests of the parachute system for the twin Viking spacecraft scheduled for launch toward Mars in August.
PIONEER VENUS 2 MULTI-PROBE PARACHUTE TESTS IN VAB PRIOR TO ATTACHING PRESSURE VESSEL
NASA Technical Reports Server (NTRS)
1975-01-01
A parachute system, designed to carry an instrument-laden probe down through the dense atmosphere of torrid, cloud-shrouded Venus, was tested in KSC's Vehicle Assembly Building. The tests are in preparation for a Pioneer multi-probe mission to Venus scheduled for launch from KSC in 1978. Full-scale (12-foot diameter) parachutes with simulated pressure vessels weighing up to 45 pounds were dropped from heights of up to 450 feet to the floor of the VAB, where the impact was cushioned by a honeycomb cardboard impact arrestor. The VAB offers an ideal, wind-free testing facility at no additional construction cost and was used for similar tests of the parachute system for the twin Viking spacecraft scheduled for launch toward Mars in August.
Radiological considerations in the operation of the low-energy undulator test line (LEUTL).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moe, H.J.
1998-11-11
The Low-Energy Undulator Test Line (LEUTL) is a facility that uses the existing APS linac to accelerate electrons up to an energy of 700 MeV. These electrons are transported through the PAR into a portion of the booster synchrotron and on into the LEUTL main enclosure (MIL 97). Figure 1 shows the layout of the LEUTL building, which consists of an earth-bermed concrete enclosure and an end-station building. The concrete enclosure houses the electron beamline, test undulator, and beam dump. This facility is about 51 m long and 3.66 m wide. Technical components and diagnostics for characterizing the undulator light are found in the end station. This building has about 111 m² of floor space. This note deals with the radiological considerations of operations using electrons up to 700 MeV and at power levels up to the safety envelope of 1 kW. Previous radiological considerations for electron and positron operations in the linac, PAR, and synchrotron have been addressed elsewhere (MOE 93a, 93b, and 93c). Much of the methodology discussed in the previous writeups, as well as in MOE 94, has been used in the computations in this note. The radiological aspects that are addressed include the following: prompt secondary radiation (bremsstrahlung, giant resonance neutrons, medium- and high-energy neutrons) produced by electrons interacting in a beam stop or in component structures; skyshine radiation, which produces a radiation field in nearby areas and at the nearest off-site location; radioactive gases produced by neutron irradiation of air in the vicinity of a particle loss site; noxious gases (ozone and others) produced in air by the escaping bremsstrahlung radiation that results from absorbing particles in the components; activation of the LEUTL components that results in a residual radiation field in the vicinity of these materials following shutdown; potential activation of water used for cooling the magnets and other purposes in the tunnel; and evaluation of the radiation fields due to escaping gas bremsstrahlung. Estimated dose rates have been computed or scaled (in the case of 400 MeV electrons) outside of the bermed tunnel, in Building 412, and in the Klystron Gallery for several modes of operation, including potential safety envelope beam power, normal beam power, and MCI (maximum credible incident) conditions. Radiological aspects of shielding changes to the synchrotron and their effect upon operations are addressed in MOE 97. No change in the safety envelope for synchrotron operation was warranted.
Cloud/web mapping and geoprocessing services - Intelligently linking geoinformation
NASA Astrophysics Data System (ADS)
Veenendaal, Bert; Brovelli, Maria Antonia; Wu, Lixin
2016-04-01
We live in a world that is alive with information and geographies. "Everything happens somewhere" (Tosta, 2001). This reality is being exposed in the digital earth technologies providing a multi-dimensional, multi-temporal and multi-resolution model of the planet, based on the needs of diverse actors: from scientists to decision makers, communities and citizens (Brovelli et al., 2015). We are building up a geospatial information infrastructure updated in real time thanks to mobile, positioning and sensor observations. Users can navigate, not only through space but also through time, to access historical data and future predictions based on social and/or environmental models. But how do we find the information about certain geographic locations or localities when it is scattered in the cloud and across the web of data behind a diversity of databases, web services and hyperlinked pages? We need to be able to link geoinformation together in order to integrate it, make sense of it, and use it appropriately for managing the world and making decisions.
Discovery of massive star formation quenching by non-thermal effects in the centre of NGC 1097
NASA Astrophysics Data System (ADS)
Tabatabaei, F. S.; Minguez, P.; Prieto, M. A.; Fernández-Ontiveros, J. A.
2018-01-01
Observations show that massive star formation quenches first at the centres of galaxies. To understand quenching mechanisms, we investigate the thermal and non-thermal energy balance in the central kpc of NGC 1097—a prototypical galaxy undergoing quenching—and present a systematic study of the nuclear star formation efficiency and its dependencies. This region is dominated by the non-thermal pressure from the magnetic field, cosmic rays and turbulence. A comparison of the mass-to-magnetic flux ratio of the molecular clouds shows that most of them are magnetically critical or supported against the gravitational collapse needed to form the cores of massive stars. Moreover, the star formation efficiency of the clouds drops with the magnetic field strength. Such an anti-correlation holds with neither the turbulent nor the thermal pressure. Hence, a progressive build up of the magnetic field results in high-mass stars forming inefficiently, and this may be the cause of the low-mass stellar population in the bulges of galaxies.
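The magnetic criticality comparison described above rests on the standard critical mass-to-flux ratio, (M/Phi)_crit = 1/(2*pi*sqrt(G)). A minimal sketch of the comparison follows; the cloud mass, radius and field strength are made-up illustrative values, not measurements from the paper.

```python
# Illustrative magnetic-criticality check, assuming the standard critical mass-to-flux
# ratio (M/Phi)_crit = 1/(2*pi*sqrt(G)). Cloud parameters are made-up numbers.
import numpy as np

G = 6.674e-8          # cm^3 g^-1 s^-2
M_sun = 1.989e33      # g
pc = 3.086e18         # cm

def mass_to_flux_ratio(mass_msun, radius_pc, B_gauss):
    """M / Phi for a cloud of the given mass threaded by a uniform field B."""
    flux = np.pi * (radius_pc * pc) ** 2 * B_gauss   # flux through the cloud cross-section
    return mass_msun * M_sun / flux

ratio = mass_to_flux_ratio(mass_msun=1e5, radius_pc=50.0, B_gauss=60e-6)
critical = 1.0 / (2.0 * np.pi * np.sqrt(G))
state = "supercritical (free to collapse)" if ratio > critical else "subcritical (magnetically supported)"
print(f"M/Phi = {ratio:.2e}, critical = {critical:.2e} -> {state}")
```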
Physical conditions in CaFe interstellar clouds
NASA Astrophysics Data System (ADS)
Gnaciński, P.; Krogulec, M.
2008-01-01
Interstellar clouds that exhibit strong Ca I and Fe I lines are called CaFe clouds. Ionisation equilibrium equations were used to model the column densities of Ca II, Ca I, K I, Na I, Fe I and Ti II in CaFe clouds. We find that the chemical composition of CaFe clouds is solar and that there is no depletion into dust grains. CaFe clouds have high electron densities, n_e ≈ 1 cm^-3, that lead to high column densities of neutral Ca and Fe.
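The electron densities quoted above follow from a photoionization-recombination balance of the form n(X I) Gamma = n(X II) alpha(T) n_e. A minimal sketch is given below; the rate coefficients and column densities are placeholder values chosen only to illustrate the scaling, not the ones adopted in the paper.

```python
# Minimal sketch of the ionization-equilibrium inversion: n_e = (Gamma/alpha) * N(X I)/N(X II).
# Gamma (photoionization rate) and alpha (recombination coefficient) are placeholders.
def electron_density(N_neutral, N_ion, gamma, alpha):
    """Electron density in cm^-3 from the neutral-to-ionized column density ratio."""
    return (gamma / alpha) * (N_neutral / N_ion)

n_e = electron_density(N_neutral=1e11, N_ion=1e13, gamma=3e-10, alpha=3e-12)
print(f"inferred n_e ~ {n_e:.1f} cm^-3")   # ~1 cm^-3 for this illustrative ratio
```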
Evidence in Magnetic Clouds for Systematic Open Flux Transport on the Sun
NASA Technical Reports Server (NTRS)
Crooker, N. U.; Kahler, S. W.; Gosling, J. T.; Lepping, R. P.
2008-01-01
Most magnetic clouds encountered by spacecraft at 1 AU display a mix of unidirectional suprathermal electrons signaling open field lines and counterstreaming electrons signaling loops connected to the Sun at both ends. Assuming the open fields were originally loops that underwent interchange reconnection with open fields at the Sun, we determine the sense of connectedness of the open fields found in 72 of 97 magnetic clouds identified by the Wind spacecraft in order to obtain information on the location and sense of the reconnection and resulting flux transport at the Sun. The true polarity of the open fields in each magnetic cloud was determined from the direction of the suprathermal electron flow relative to the magnetic field direction. Results indicate that the polarity of all open fields within a given magnetic cloud is the same 89% of the time, implying that interchange reconnection at the Sun most often occurs in only one leg of a flux rope loop, thus transporting open flux in a single direction, from a coronal hole near that leg to the foot point of the opposite leg. This pattern is consistent with the view that interchange reconnection in coronal mass ejections systematically transports an amount of open flux sufficient to reverse the polarity of the heliospheric field through the course of the solar cycle. Using the same electron data, we also find that the fields encountered in magnetic clouds are only a third as likely to be locally inverted as not. While one might expect inversions to be equally as common as not in flux rope coils, consideration of the geometry of spacecraft trajectories relative to the modeled magnetic cloud axes leads us to conclude that the result is reasonable.
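The polarity determination described above reduces to a sign test: suprathermal (strahl) electrons stream away from the Sun along open field lines, so flow parallel to the magnetic field implies away polarity and anti-parallel flow implies toward polarity. A hedged sketch with hypothetical vectors:

```python
# Sketch of the open-field polarity test: compare the strahl flow direction with B.
import numpy as np

def open_field_polarity(B, strahl_flow):
    """Return 'away' or 'toward' (the Sun) for an open field line."""
    return "away" if np.dot(B, strahl_flow) > 0 else "toward"

B = np.array([-4.0, 2.0, 1.0])        # hypothetical field vector (nT, GSE)
strahl = np.array([-0.9, 0.4, 0.1])   # hypothetical strahl flow direction
print("open-field polarity:", open_field_polarity(B, strahl))
```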
Discharge processes, electric field, and electron energy in ISUAL-recorded gigantic jets
NASA Astrophysics Data System (ADS)
Kuo, Cheng-Ling; Chou, J. K.; Tsai, L. Y.; Chen, A. B.; Su, H. T.; Hsu, R. R.; Cummer, S. A.; Frey, H. U.; Mende, S. B.; Takahashi, Y.; Lee, L. C.
2009-04-01
This article reports the first high time resolution measurements of gigantic jets from the Imager of Sprites and Upper Atmospheric Lightning (ISUAL) experiment. The velocity of the upward propagating fully developed jet stage of the gigantic jets was ~10^7 m s^-1, which is similar to that observed for downward sprite streamers. Analysis of spectral ratios for the fully developed jet emissions gives a reduced E field of 400-655 Td and average electron energy of 8.5-12.3 eV. These values are higher than those in the sprites but are similar to those predicted by streamer models, which implies the existence of streamer tips in fully developed jets. The gigantic jets studied here all contained two distinct photometric peaks. The first peak is from the fully developed jet, which steadily propagates from the cloud top (~20 km) to the lower ionosphere at ~90 km. We suggest that the second photometric peak, which occurs ~1 ms after the first peak, is from a current wave or potential wave-enhanced emissions that originate at an altitude of ~50 km and extend toward the cloud top. We propose that the fully developed jet serves as an extension of the local ionosphere and produces a lowered ionosphere boundary. As the attachment processes remove the charges, the boundary of the local ionosphere moves up. The current in the channel persists and its contact point with the ionosphere moves upward, which produces the upward surging trailing jets. Imager and photometer data indicate that the lightning activity associated with the gigantic jets likely is in-cloud, and thus the initiation of the gigantic jets is not directly associated with cloud-to-ground discharges.
Building environment assessment and energy consumption estimation using smart phones
NASA Astrophysics Data System (ADS)
Li, Xiangli; Zhang, Li; Jia, Yingqi; Wang, Zihan; Jin, Xin; Zhao, Xuefeng
2017-04-01
In this paper, an app for building indoor environment evaluation and energy consumption estimation based on the Android platform is proposed and established. While using the app, the smart phone's built-in sensors are called for real-time monitoring of building environmental information such as temperature, humidity and noise. A built-in algorithm is developed to calculate the heat and power consumption, and questionnaires, grading and other methods are used to provide feedback to the space heating system. In addition, with the application of big data and cloud technologies, the data collected by users will be uploaded to the cloud. After statistical analysis of the uploaded data, regional differences can be obtained, thus providing a more accurate basis for macro-control and research on energy, thermal comfort, and the greenhouse effect.
All-in-One Shape-Adaptive Self-Charging Power Package for Wearable Electronics.
Guo, Hengyu; Yeh, Min-Hsin; Lai, Ying-Chih; Zi, Yunlong; Wu, Changsheng; Wen, Zhen; Hu, Chenguo; Wang, Zhong Lin
2016-11-22
Recently, a self-charging power unit consisting of an energy harvesting device and an energy storage device set the foundation for building a self-powered wearable system. However, the flexibility of the power unit working under extremely complex deformations (e.g., stretching, twisting, and bending) becomes a key issue. Here, we present a prototype of an all-in-one shape-adaptive self-charging power unit that can be used for scavenging random body motion energy under complex mechanical deformations and then directly storing it in a supercapacitor unit to build up a self-powered system for wearable electronics. A kirigami paper based supercapacitor (KP-SC) was designed to work as the flexible energy storage device (stretchability up to 215%). An ultrastretchable and shape-adaptive silicone rubber triboelectric nanogenerator (SR-TENG) was utilized as the flexible energy harvesting device. By combining them with a rectifier, a stretchable, twistable, and bendable, self-charging power package was achieved for sustainably driving wearable electronics. This work provides a potential platform for the flexible self-powered systems.
NASA Astrophysics Data System (ADS)
Campi, M.; di Luggo, A.; Scandurra, S.
2017-02-01
The object of this paper is one of the most ancient palaces of Naples, Palazzo Penne, a fourteenth-century residential building located on a small rise which originally lay on the outer fringe of the built-up area, in a privileged position for enjoying the beauty of the landscape and the gulf. This building, which today is in the heart of the historical center, was the subject of an extensive analysis and documentary research, as well as of a metric laser scanner survey carried out by the group of researchers working at the Interdepartmental Centre of Research Urban Eco of the University of Naples Federico II. Starting from scan-to-BIM systems, a parametric model of the current state of the building is created by bringing the point cloud elements back to objects to which historical and construction data can be associated. Moreover, starting from the acquired data, the 3D model shows the reconstructive hypothesis of the original structure and the virtual reconstruction of the building, based on traces found on-site and on comparison with coeval works, allowing the design of individual features to be properly hypothesized.
Cloudweaver: Adaptive and Data-Driven Workload Manager for Generic Clouds
NASA Astrophysics Data System (ADS)
Li, Rui; Chen, Lei; Li, Wen-Syan
Cloud computing denotes the latest trend in application development for parallel computing on massive data volumes. It relies on clouds of servers to handle tasks that used to be managed by an individual server. With cloud computing, software vendors can provide business intelligence and data analytic services for internet scale data sets. Many open source projects, such as Hadoop, offer various software components that are essential for building a cloud infrastructure. Current Hadoop (and many others) requires users to configure cloud infrastructures via programs and APIs and such configuration is fixed during the runtime. In this chapter, we propose a workload manager (WLM), called CloudWeaver, which provides automated configuration of a cloud infrastructure for runtime execution. The workload management is data-driven and can adapt to dynamic nature of operator throughput during different execution phases. CloudWeaver works for a single job and a workload consisting of multiple jobs running concurrently, which aims at maximum throughput using a minimum set of processors.
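The data-driven idea described above can be illustrated with a toy allocation rule that shifts processors toward the operators whose observed throughput lags their target most; this is only a conceptual sketch, not CloudWeaver's actual algorithm, and the operator names and numbers are invented.

```python
# Toy data-driven processor assignment: share the pool in proportion to each operator's
# throughput deficit. Illustrative only; not the CloudWeaver algorithm.
def assign_processors(observed, target, total_processors):
    deficit = {op: max(target[op] - observed.get(op, 0.0), 0.0) for op in target}
    total = sum(deficit.values()) or 1.0
    return {op: max(1, round(total_processors * d / total)) for op, d in deficit.items()}

observed = {"scan": 900.0, "join": 300.0, "aggregate": 250.0}   # tuples/sec (invented)
target   = {"scan": 1000.0, "join": 800.0, "aggregate": 400.0}
print(assign_processors(observed, target, total_processors=16))
```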
NASA Astrophysics Data System (ADS)
Dowling, J. A.; Burdett, N.; Greer, P. B.; Sun, J.; Parker, J.; Pichler, P.; Stanwell, P.; Chandra, S.; Rivest-Hénault, D.; Ghose, S.; Salvado, O.; Fripp, J.
2014-03-01
Our group have been developing methods for MRI-alone prostate cancer radiation therapy treatment planning. To assist with clinical validation of the workflow we are investigating a cloud platform solution for research purposes. Benefits of cloud computing can include increased scalability, performance and extensibility while reducing total cost of ownership. In this paper we demonstrate the generation of DICOM-RT directories containing an automatic average atlas based electron density image and fast pelvic organ contouring from whole pelvis MR scans.
An Examination of the Nature of Global MODIS Cloud Regimes
NASA Technical Reports Server (NTRS)
Oreopoulos, Lazaros; Cho, Nayeong; Lee, Dongmin; Kato, Seiji; Huffman, George J.
2014-01-01
We introduce global cloud regimes (previously also referred to as "weather states") derived from cloud retrievals that use measurements by the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument aboard the Aqua and Terra satellites. The regimes are obtained by applying clustering analysis on joint histograms of retrieved cloud top pressure and cloud optical thickness. By employing a compositing approach on data sets from satellites and other sources, we examine regime structural and thermodynamical characteristics. We establish that the MODIS cloud regimes tend to form in distinct dynamical and thermodynamical environments and have diverse profiles of cloud fraction and water content. When compositing radiative fluxes from the Clouds and the Earth's Radiant Energy System instrument and surface precipitation from the Global Precipitation Climatology Project, we find that regimes with a radiative warming effect on the atmosphere also produce the largest implied latent heat. Taken as a whole, the results of the study corroborate the usefulness of the cloud regime concept, reaffirm the fundamental nature of the regimes as appropriate building blocks for cloud system classification, clarify their association with standard cloud types, and underscore their distinct radiative and hydrological signatures.
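The regime derivation described above (clustering joint histograms of cloud-top pressure and optical thickness) can be sketched in a few lines; the histogram dimensions, scene count and number of regimes below are illustrative assumptions rather than the values used in the study.

```python
# Minimal sketch: k-means clustering of CTP x COT joint histograms into cloud regimes.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_scenes, n_ctp_bins, n_cot_bins = 500, 7, 6          # illustrative sizes
histograms = rng.random((n_scenes, n_ctp_bins * n_cot_bins))
histograms /= histograms.sum(axis=1, keepdims=True)   # each scene is a normalized histogram

labels = KMeans(n_clusters=12, n_init=10, random_state=0).fit_predict(histograms)
print("scenes per regime:", np.bincount(labels, minlength=12))
```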
NASA Astrophysics Data System (ADS)
Correia Rodrigues, H.; Tavian, L.
2017-12-01
The Future Circular Collider (FCC) under study at CERN will produce 50-TeV high-energy proton beams. The high-energy particle beams are bent by 16-T superconducting dipole magnets operating at 1.9 K and distributed over a circumference of 80 km. The circulating beams induce 5 MW of dynamic heat loads by several processes such as synchrotron radiation, resistive dissipation of beam image currents and electron clouds. These beam-induced heat loads will be intercepted by beam screens operating between 40 and 60 K and will induce transients during beam injection, energy ramp-up and beam dumping on the distributed beam-screen cooling loops, the sector cryogenic plants and the dedicated circulators. Based on the current baseline parameters, numerical simulations of the fluid flow in the cryogenic distribution system during a beam operation cycle were performed. The effects of the thermal inertia of the headers on the helium flow temperature at the cryogenic plant inlet, as well as the temperature gradient experienced by the beam screen, have been assessed. Additionally, this work enabled a thorough exergetic analysis of different cryogenic plant configurations and laid the building blocks for establishing design specifications of cold and warm circulators.
2001-06-15
KENNEDY SPACE CENTER, Fla. -- Dark clouds and strong winds seem almost to touch the ground near the tow-way leading from the Shuttle Landing Facility (SLF). In the background (right) can be seen the new hangar at the SLF and the mate/demate device. The cloud formation is proceeding across the SLF towards the Vehicle Assembly Building.
Improving Indoor Air Quality in St. Cloud Schools.
ERIC Educational Resources Information Center
Forer, Mike; Haus, El
2000-01-01
Describes how the St. Cloud Area School District (Minnesota), using Tools for Schools provided by the U.S. Environmental Protection Agency, managed the improvement of their school building indoor air quality (IAQ). The district goals of the IAQ Management Committee and the policy elements used to maintain high classroom air quality are…
NASA Astrophysics Data System (ADS)
Hess, M. R.; Petrovic, V.; Kuester, F.
2017-08-01
Digital documentation of cultural heritage structures is increasingly common through the application of different imaging techniques. Many works have focused on the application of laser scanning and photogrammetry techniques for the acquisition of three-dimensional (3D) geometry detailing cultural heritage sites and structures. With an abundance of these 3D data assets, there must be a digital environment where these data can be visualized and analyzed. Presented here is a feedback-driven visualization framework that seamlessly enables interactive exploration and manipulation of massive point cloud data. The focus of this work is on the classification of different building materials with the goal of building more accurate as-built information models of historical structures. User-defined functions have been tested within the interactive point cloud visualization framework to evaluate automated and semi-automated classification of 3D point data. These functions include decisions based on observed color, laser intensity, normal vector or local surface geometry. Multiple case studies are presented here to demonstrate the flexibility and utility of the presented point cloud visualization framework to achieve classification objectives.
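A user-defined classification function of the kind described above might look like the following sketch, labelling each point from its color, laser intensity and normal vector; the thresholds and class names are illustrative assumptions, not rules shipped with the framework.

```python
# Sketch of a user-defined point classification rule based on color, intensity and normals.
import numpy as np

def classify_points(rgb, intensity, normals):
    """Integer label per point: 0 = unknown, 1 = stone-like, 2 = mortar-like, 3 = vegetation."""
    labels = np.zeros(len(rgb), dtype=int)
    greenish = rgb[:, 1] > rgb[:, 0] + 20        # strong green channel
    flat = np.abs(normals[:, 2]) > 0.9           # near-horizontal surface
    bright = intensity > 0.6                     # high laser return
    labels[bright & ~greenish] = 1
    labels[~bright & ~greenish & ~flat] = 2
    labels[greenish] = 3
    return labels

rgb = np.array([[120.0, 118.0, 110.0], [90.0, 140.0, 80.0], [60.0, 62.0, 58.0]])
intensity = np.array([0.8, 0.4, 0.3])
normals = np.array([[0.1, 0.0, 0.99], [0.0, 0.7, 0.7], [0.9, 0.1, 0.3]])
print(classify_points(rgb, intensity, normals))   # e.g. [1 3 2]
```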
Spontaneous evolution of Rydberg atoms into an ultracold plasma
Robinson; Tolra; Noel; Gallagher; Pillet
2000-11-20
We have observed the spontaneous evolution of a dense sample of Rydberg atoms into an ultracold plasma, in spite of the fact that each of the atoms may initially be bound by up to 100 cm^-1. When the atoms are initially bound by 70 cm^-1, this evolution occurs when most of the atoms are translationally cold, <1 mK, but a small fraction, approximately 1%, is at room temperature. Ionizing collisions between hot and cold Rydberg atoms and blackbody photoionization produce an essentially stationary cloud of cold ions, which traps electrons produced later. The trapped electrons rapidly collisionally ionize the remaining cold Rydberg atoms to form a cold plasma.
Titan's atomic nitrogen torus - Inferred properties and consequences for the Saturnian aurora
NASA Astrophysics Data System (ADS)
Barbosa, D. D.
1987-10-01
This paper follows up the lead suggested by Barbosa and Eviatar (1986) that Titanogenic nitrogen ions are a key component of the magnetospheric particle populations and can account for the energetics of the Saturnian aurora without undue assumptions. Nitrogen atoms resulting from electron impact dissociations of N2 (Strobel and Shemansky 1982) escape from Titan and form a large doughnut-shaped ring around the satellite's orbit that is cospatial with the McDonough-Brice (1973) hydrogen cloud. Processes attendant to the ionization and pickup of nitrogen ions include the production of a warm kiloelectronvolt electron population and the excitation of the UV aurora by particle precipitation from the outer magnetosphere.
Chemical Composition of Nebulosities in the Magellanic Clouds
Aller, L. H.; Czyzak, S. J.; Keyes, C. D.; Boeshaar, G.
1974-01-01
From photoelectric spectrophotometric data secured at Cerro Tololo Interamerican Observatory we have attempted to derive electron densities and temperatures, ionic concentrations, and chemical abundances of He, C, N, O, Ne, S, and Ar in nebulosities in the Magellanic Clouds. Although 10 distinct nebulosities were observed in the Small Cloud and 20 such objects in the Large Cloud, the most detailed observations were secured only for the brighter objects. Results for 30 Doradus are in harmony with those published previously and with recent work by Peimbert and Torres-Peimbert. Nitrogen and heavier elements appear to be less abundant in the Small Cloud than in the Large Cloud, in accordance with the conclusions of Dufour. A comparison with the Orion nebula suggests He, N, Ne, O, and S may all be less abundant in the Magellanic Clouds, although adequate evaluations will require construction of detailed models. For example, if we postulate that the [NII], [OII], and [SII] radiations originate primarily in regions with electron temperatures near 8000 K, while the [OIII], [NeIII], [ArIII], and H radiations are produced primarily in regions with T_e = 10,000 K, the derived chemical abundances in the clouds are enhanced. PMID:16592199
NASA Astrophysics Data System (ADS)
Vetrivel, Anand; Gerke, Markus; Kerle, Norman; Nex, Francesco; Vosselman, George
2018-06-01
Oblique aerial images offer views of both building roofs and façades, and thus have been recognized as a potential source to detect severe building damages caused by destructive disaster events such as earthquakes. Therefore, they represent an important source of information for first responders or other stakeholders involved in the post-disaster response process. Several automated methods based on supervised learning have already been demonstrated for damage detection using oblique airborne images. However, they often do not generalize well when data from new unseen sites need to be processed, hampering their practical use. Reasons for this limitation include image and scene characteristics, though the most prominent one relates to the image features being used for training the classifier. Recently features based on deep learning approaches, such as convolutional neural networks (CNNs), have been shown to be more effective than conventional hand-crafted features, and have become the state-of-the-art in many domains, including remote sensing. Moreover, often oblique images are captured with high block overlap, facilitating the generation of dense 3D point clouds - an ideal source to derive geometric characteristics. We hypothesized that the use of CNN features, either independently or in combination with 3D point cloud features, would yield improved performance in damage detection. To this end we used CNN and 3D features, both independently and in combination, using images from manned and unmanned aerial platforms over several geographic locations that vary significantly in terms of image and scene characteristics. A multiple-kernel-learning framework, an effective way for integrating features from different modalities, was used for combining the two sets of features for classification. The results are encouraging: while CNN features produced an average classification accuracy of about 91%, the integration of 3D point cloud features led to an additional improvement of about 3% (i.e. an average classification accuracy of 94%). The significance of 3D point cloud features becomes more evident in the model transferability scenario (i.e., training and testing samples from different sites that vary slightly in the aforementioned characteristics), where the integration of CNN and 3D point cloud features significantly improved the model transferability accuracy up to a maximum of 7% compared with the accuracy achieved by CNN features alone. Overall, an average accuracy of 85% was achieved for the model transferability scenario across all experiments. Our main conclusion is that such an approach qualifies for practical use.
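The feature-fusion step described above can be approximated with a fixed-weight combination of two kernels fed to an SVM; the paper uses multiple-kernel learning to learn those weights, so the sketch below, with synthetic features and labels, is a simplification rather than the authors' method.

```python
# Simplified stand-in for multiple-kernel learning: a fixed-weight sum of an RBF kernel on
# CNN features and one on 3D point-cloud features, classified with a precomputed-kernel SVM.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X_cnn_tr, X_cnn_te = rng.normal(size=(200, 128)), rng.normal(size=(50, 128))   # synthetic CNN features
X_3d_tr,  X_3d_te  = rng.normal(size=(200, 20)),  rng.normal(size=(50, 20))    # synthetic 3D features
y_tr = rng.integers(0, 2, size=200)                                            # 0 = intact, 1 = damaged

w_cnn, w_3d = 0.7, 0.3                       # fixed weights; an MKL solver would learn these
K_tr = w_cnn * rbf_kernel(X_cnn_tr) + w_3d * rbf_kernel(X_3d_tr)
K_te = w_cnn * rbf_kernel(X_cnn_te, X_cnn_tr) + w_3d * rbf_kernel(X_3d_te, X_3d_tr)

clf = SVC(kernel="precomputed").fit(K_tr, y_tr)
print("predicted damage labels:", clf.predict(K_te)[:10])
```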
Photocathode Optimization for a Dynamic Transmission Electron Microscope: Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ellis, P; Flom, Z; Heinselman, K
2011-08-04
The Dynamic Transmission Electron Microscope (DTEM) team at Harvey Mudd College has been sponsored by LLNL to design and build a test setup for optimizing the performance of the DTEM's electron source. Unlike a traditional TEM, the DTEM achieves much faster exposure times by using photoemission from a photocathode to produce electrons for imaging. The DTEM team's work is motivated by the need to improve the coherence and current density of the electron cloud produced by the electron gun in order to increase the image resolution and contrast achievable by DTEM. The photoemission test setup is nearly complete and the team will soon complete baseline tests of electron gun performance. The photoemission laser and high voltage power supply have been repaired; the optics path for relaying the laser to the photocathode has been finalized, assembled, and aligned; the internal setup of the vacuum chamber has been finalized and mostly implemented; and system control, synchronization, and data acquisition have been implemented in LabVIEW. Immediate future work includes determining a consistent alignment procedure to place the laser waist on the photocathode, and taking baseline performance measurements of the tantalum photocathode. Future research will examine the performance of the electron gun as a function of the photoemission laser profile, the photocathode material, and the geometry and voltages of the accelerating and focusing components in the electron gun. This report presents the team's progress and outlines the work that remains.
A review on the state-of-the-art privacy-preserving approaches in the e-health clouds.
Abbas, Assad; Khan, Samee U
2014-07-01
Cloud computing is emerging as a new computing paradigm in the healthcare sector besides other business domains. Large numbers of health organizations have started shifting the electronic health information to the cloud environment. Introducing the cloud services in the health sector not only facilitates the exchange of electronic medical records among the hospitals and clinics, but also enables the cloud to act as a medical record storage center. Moreover, shifting to the cloud environment relieves the healthcare organizations of the tedious tasks of infrastructure management and also minimizes development and maintenance costs. Nonetheless, storing the patient health data in the third-party servers also entails serious threats to data privacy. Because of probable disclosure of medical records stored and exchanged in the cloud, the patients' privacy concerns should essentially be considered when designing the security and privacy mechanisms. Various approaches have been used to preserve the privacy of the health information in the cloud environment. This survey aims to encompass the state-of-the-art privacy-preserving approaches employed in the e-Health clouds. Moreover, the privacy-preserving approaches are classified into cryptographic and noncryptographic approaches and taxonomy of the approaches is also presented. Furthermore, the strengths and weaknesses of the presented approaches are reported and some open issues are highlighted.
Rupturing of Biological Spores As a Source of Secondary Particles in Amazonia
DOE Office of Scientific and Technical Information (OSTI.GOV)
China, Swarup; Wang, Bingbing; Weis, Johannes
Airborne biological particles, such as fungal spores and pollen, are ubiquitous in the Earth’s atmosphere and play an important role in the atmospheric environment and climate, impacting air quality, cloud formation, and the Earth’s radiation budget. The atmospheric transformations of airborne biological spores at elevated relative humidity remain poorly understood and their climatic role is uncertain. Using an environmental scanning electron microscope (ESEM), we observed rupturing of Amazonian fungal spores and subsequent release of nanometer to submicron size fragments after exposure to high humidity. We find that fungal fragments contain elements of inorganic salts (e.g., Na and Cl). They are hygroscopic in nature with a growth factor up to 2.3 at 96% relative humidity, thus they may potentially influence cloud formation. Due to their hygroscopic growth, light scattering cross sections of the fragments are enhanced by up to a factor of 10. Furthermore, rupturing of fungal spores at high humidity may explain the bursting events of nanoparticles and may provide insight into new particle formation in Amazonia.
Yang, Shu; Qiu, Yuyan; Shi, Bo
2016-09-01
This paper explores methods of building an Internet of Things for regional ECG monitoring, focusing on the implementation of an ECG monitoring center based on a cloud computing platform. It analyzes the implementation principles of automatic identification of the types of arrhythmia. It also studies the system architecture and key techniques of the cloud computing platform, including server load balancing technology, reliable storage of massive small files, and the implementation of a quick search function.
Application of Dusty Plasmas for Space
NASA Astrophysics Data System (ADS)
Bhavasar, Hemang; Ahuja, Smariti
In space, dust particles alone are affected by gravity and by radiation pressure when near stars and planets. When the dust particles are immersed in plasma, the dust is usually charged either by photoionization due to incident UV radiation, by secondary electron emission due to collisions with energetic ions and electrons, or by absorption of charged particles due to collisions with thermal ions and electrons. A 1 micron radius dust particle in a plasma with an electron temperature of a few eV will carry a charge corresponding to a few thousand elementary charges, with a resulting charge-to-mass ratio Q/m ≪ 1. Such particles will also be affected by electric and magnetic fields. Since the electrons are magnetized in these regions, electron E × B or diamagnetic cross-field drifts may drive instabilities. Dust grains (micron to sub-micron sized solid particles) in plasma and/or radiative environments can be electrically charged by processes such as plasma current collection or photoemission. Charged dust also affects known electrojet instabilities as well as low-frequency dust acoustic and dust drift instabilities. Just as the plasma affects the dust particles, the dust particles can affect the plasma environment. Dusty plasmas (also known as complex plasmas) are ordinary plasmas, consisting of electrons, ions, and neutrals, with embedded solid particles. The particles can be made of either dielectric or conducting materials, and can have any shape. The typical size range is anywhere from about 100 nm up to about 100 μm. Most often, these small objects or dust particles are electrically charged. Dusty plasmas are ubiquitous in the universe, in proto-planetary and solar nebulae, molecular clouds, supernova explosions, the interplanetary medium, circumsolar rings, and asteroids. Closer to Earth, there are the noctilucent clouds, clouds of tiny (charged) ice particles that form in the summer polar mesosphere at an altitude of about 85 km. In processing plasmas, dust particles are actually grown in the discharge from the reactive gases used to form the plasmas. Perhaps the most intriguing aspect of dusty plasmas is that the particles can be directly imaged and their dynamic behavior recorded as digital images. This is accomplished by laser light scattering from the particles. Since the particle mass is relatively high, their dynamical timescales are much longer than those of the ions or electrons. Dusty plasmas have a broad range of applications including interplanetary space dust, comets, planetary rings, dusty surfaces in space, and aerosols in the atmosphere.
A hierarchical methodology for urban facade parsing from TLS point clouds
NASA Astrophysics Data System (ADS)
Li, Zhuqiang; Zhang, Liqiang; Mathiopoulos, P. Takis; Liu, Fangyu; Zhang, Liang; Li, Shuaipeng; Liu, Hao
2017-01-01
The effective and automated parsing of building facades from terrestrial laser scanning (TLS) point clouds of urban environments is an important research topic in the GIS and remote sensing fields. It is also challenging because of the complexity and great variety of the available 3D building facade layouts as well as the noise and missing data in the input TLS point clouds. In this paper, we introduce a novel methodology for the accurate and computationally efficient parsing of urban building facades from TLS point clouds. The main novelty of the proposed methodology is that it is a systematic and hierarchical approach that considers, in an adaptive way, the semantic and underlying structures of the urban facades for segmentation and subsequent accurate modeling. Firstly, the available input point cloud is decomposed into depth planes based on a data-driven method; such layer decomposition enables similarity detection in each depth plane layer. Secondly, the labeling of the facade elements is performed using the SVM classifier in combination with our proposed BieS-ScSPM algorithm. The labeling outcome is then augmented with weak architectural knowledge. Thirdly, least-squares fitted normalized gray accumulative curves are applied to detect regular structures, and a binarization dilation extraction algorithm is used to partition facade elements. A dynamic line-by-line division is further applied to extract the boundaries of the elements. The 3D geometrical facade models are then reconstructed by optimizing facade elements across depth plane layers. We have evaluated the performance of the proposed method using several TLS facade datasets. Qualitative and quantitative performance comparisons with several other state-of-the-art methods dealing with the same facade parsing problem have demonstrated its superiority in performance and its effectiveness in improving segmentation accuracy.
Electron cloud generation and trapping in a quadrupole magnet at the Los Alamos proton storage ring
NASA Astrophysics Data System (ADS)
Macek, Robert J.; Browman, Andrew A.; Ledford, John E.; Borden, Michael J.; O'Hara, James F.; McCrady, Rodney C.; Rybarcyk, Lawrence J.; Spickermann, Thomas; Zaugg, Thomas J.; Pivi, Mauro T. F.
2008-01-01
Recent beam physics studies on the two-stream e-p instability at the LANL proton storage ring (PSR) have focused on the role of the electron cloud generated in quadrupole magnets where primary electrons, which seed beam-induced multipacting, are expected to be largest due to grazing angle losses from the beam halo. A new diagnostic to measure electron cloud formation and trapping in a quadrupole magnet has been developed, installed, and successfully tested at PSR. Beam studies using this diagnostic show that the “prompt” electron flux striking the wall in a quadrupole is comparable to the prompt signal in the adjacent drift space. In addition, the “swept” electron signal, obtained using the sweeping feature of the diagnostic after the beam was extracted from the ring, was larger than expected and decayed slowly with an exponential time constant of 50 to 100μs. Other measurements include the cumulative energy spectra of prompt electrons and the variation of both prompt and swept electron signals with beam intensity. Experimental results were also obtained which suggest that a good fraction of the electrons observed in the adjacent drift space for the typical beam conditions in the 2006 run cycle were seeded by electrons ejected from the quadrupole.
NASA Astrophysics Data System (ADS)
Gevaert, C. M.; Persello, C.; Sliuzas, R.; Vosselman, G.
2016-06-01
Unmanned Aerial Vehicles (UAVs) are capable of providing very high resolution and up-to-date information to support informal settlement upgrading projects. In order to provide accurate basemaps, urban scene understanding through the identification and classification of buildings and terrain is imperative. However, common characteristics of informal settlements such as small, irregular buildings with heterogeneous roof material and large presence of clutter challenge state-of-the-art algorithms. Especially the dense buildings and steeply sloped terrain cause difficulties in identifying elevated objects. This work investigates how 2D radiometric and textural features, 2.5D topographic features, and 3D geometric features obtained from UAV imagery can be integrated to obtain a high classification accuracy in challenging classification problems for the analysis of informal settlements. It compares the utility of pixel-based and segment-based features obtained from an orthomosaic and DSM with point-based and segment-based features extracted from the point cloud to classify an unplanned settlement in Kigali, Rwanda. Findings show that the integration of 2D and 3D features leads to higher classification accuracies.
Building Change Detection from LIDAR Point Cloud Data Based on Connected Component Analysis
NASA Astrophysics Data System (ADS)
Awrangjeb, M.; Fraser, C. S.; Lu, G.
2015-08-01
Building data are one of the important data types in a topographic database. Building change detection after a period of time is necessary for many applications, such as identification of informal settlements. Based on the detected changes, the database has to be updated to ensure its usefulness. This paper proposes an improved building detection technique, which is a prerequisite for many building change detection techniques. The improved technique examines the gap between neighbouring buildings in the building mask in order to avoid under segmentation errors. Then, a new building change detection technique from LIDAR point cloud data is proposed. Buildings which are totally new or demolished are directly added to the change detection output. However, for demolished or extended building parts, a connected component analysis algorithm is applied and for each connected component its area, width and height are estimated in order to ascertain if it can be considered as a demolished or new building part. Finally, a graphical user interface (GUI) has been developed to update detected changes to the existing building map. Experimental results show that the improved building detection technique can offer not only higher performance in terms of completeness and correctness, but also a lower number of undersegmentation errors as compared to its original counterpart. The proposed change detection technique produces no omission errors and thus it can be exploited for enhanced automated building information updating within a topographic database. Using the developed GUI, the user can quickly examine each suggested change and indicate his/her decision with a minimum number of mouse clicks.
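The connected-component step described above can be sketched on a 2D change mask: label connected regions and keep those whose area and planar extent exceed minimum building-part dimensions. The thresholds and the toy mask below are illustrative assumptions; the paper additionally uses height estimates from the LIDAR data.

```python
# Simplified 2D sketch of the connected-component filtering of a building change mask.
import numpy as np
from scipy import ndimage

def changed_building_parts(change_mask, cell=0.5, min_area=10.0, min_width=2.0, min_length=2.0):
    """Return bounding slices of components large enough to count as building parts."""
    labels, _ = ndimage.label(change_mask)
    parts = []
    for i, sl in enumerate(ndimage.find_objects(labels), start=1):
        length = (sl[0].stop - sl[0].start) * cell            # extent in metres
        width = (sl[1].stop - sl[1].start) * cell
        area = np.count_nonzero(labels[sl] == i) * cell ** 2
        if area >= min_area and width >= min_width and length >= min_length:
            parts.append(sl)
    return parts

mask = np.zeros((40, 40), dtype=bool)
mask[5:20, 5:15] = True       # plausible new/demolished building part
mask[30:32, 30:31] = True     # too small: treated as noise
print(len(changed_building_parts(mask)), "candidate building part(s)")
```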
NASA Astrophysics Data System (ADS)
Kohler, Susanna
2017-10-01
Molecular clouds, which you're likely familiar with from stunning popular astronomy imagery, lead complicated, tumultuous lives. A recent study has now found that these features must be rapidly built and destroyed.

Star-Forming Collapse. [Figure: a Hubble view of a molecular cloud, roughly two light-years long, that has broken off of the Carina Nebula. NASA/ESA, N. Smith (University of California, Berkeley)/The Hubble Heritage Team (STScI/AURA)] Molecular gas can be found throughout our galaxy in the form of eminently photogenic clouds (as featured throughout this post). Dense, cold molecular gas makes up more than 20% of the Milky Way's total gas mass, and gravitational instabilities within these clouds lead them to collapse under their own weight, resulting in the formation of our galaxy's stars. How does this collapse occur? The simplest explanation is that the clouds simply collapse in free fall, with no source of support to counter their contraction. But if all the molecular gas we observe collapsed on free-fall timescales, star formation in our galaxy would churn at a rate at least an order of magnitude higher than the observed 1-2 solar masses per year in the Milky Way.

Destruction by Feedback. Astronomers have theorized that there may be some mechanism that supports these clouds against gravity, slowing their collapse. But both theoretical studies and observations of the clouds have ruled out most of these potential mechanisms, and mounting evidence supports the original interpretation that molecular clouds are simply gravitationally collapsing. [Figure: a sub-mm image from ESO's APEX telescope of part of the Taurus molecular cloud, roughly ten light-years long, superimposed on a visible-light image of the region. ESO/APEX (MPIfR/ESO/OSO)/A. Hacar et al./Digitized Sky Survey 2. Acknowledgment: Davide De Martin] If this is indeed the case, then one explanation for our low observed star formation rate could be that molecular clouds are rapidly destroyed by feedback from the very stars they create. But to match observations, this would suggest that molecular clouds are short-lived objects that are built (and therefore replenished) just as quickly as they are destroyed. Is this possible?

Speedy Building? In a recent study, a team of scientists led by Mordecai-Mark Mac Low (American Museum of Natural History and Heidelberg University, Germany) explore whether there is a way to create molecular clouds rapidly enough to match the necessary rate of destruction. Mac Low and collaborators find that some common mechanisms used to explain the formation of molecular clouds, like gas being swept up by supernovae, can't quite operate quickly enough to combat the rate of cloud destruction. On the other hand, the Toomre gravitational instability, which is a large-scale gravitational instability that occurs in gas disks, can very rapidly assemble gas into clumps dense enough to form molecules. [Figure: a composite of visible and near-infrared images from the VLT ANTU telescope of the Barnard 68 molecular cloud, roughly half a light-year in diameter. ESO]

A Rapid Cycle. Based on their findings, the authors argue that dense, star-forming molecular clouds persist only for a short time before collapsing into stars and then being blown apart by stellar feedback, but these very clouds are built equally quickly via gravitational instabilities. Conveniently, this model has a very testable prediction: the Toomre instability is expected to become even stronger at higher redshift, which suggests that the fraction of gas in the form of molecules should increase at high redshifts. This appears to agree with observations, supporting the authors' picture of a rapid cycle of cloud assembly and destruction.

Citation: Mordecai-Mark Mac Low et al 2017 ApJL 847 L10. doi:10.3847/2041-8213/aa8a61
Cloud screening Coastal Zone Color Scanner images using channel 5
NASA Technical Reports Server (NTRS)
Eckstein, B. A.; Simpson, J. J.
1991-01-01
Clouds are removed from Coastal Zone Color Scanner (CZCS) data using channel 5. Instrumentation problems require pre-processing of channel 5 before an intelligent cloud-screening algorithm can be used. For example, at intervals of about 16 lines, the sensor records anomalously low radiances. Moreover, the calibration equation yields negative radiances when the sensor records zero counts, and pixels corrupted by electronic overshoot must also be excluded. The remaining pixels may then be used in conjunction with the procedure of Simpson and Humphrey to determine the CZCS cloud mask. These results plus in situ observations of phytoplankton pigment concentration show that pre-processing and proper cloud-screening of CZCS data are necessary for accurate satellite-derived pigment concentrations. This is especially true in the coastal margins, where pigment content is high and image distortion associated with electronic overshoot is also present. The pre-processing algorithm is critical to obtaining accurate global estimates of pigment from spacecraft data.
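The pre-processing described above amounts to masking three kinds of bad channel-5 pixels before any cloud test: the periodic anomalously low scan lines, negative calibrated radiances, and electronic-overshoot pixels. A minimal sketch follows; the thresholds and the overshoot test are placeholder choices, not the Simpson and Humphrey procedure.

```python
# Sketch of channel-5 pre-processing: mask bad lines, negative radiances and overshoot pixels.
import numpy as np

def preprocess_channel5(radiance, overshoot_jump=5.0):
    """Boolean mask of usable channel-5 pixels (True = keep)."""
    valid = radiance > 0.0                                       # drop negative calibrated radiances
    line_mean = np.nanmean(np.where(valid, radiance, np.nan), axis=1)
    valid[line_mean < 0.5 * np.nanmedian(line_mean), :] = False  # anomalously low scan lines
    jump = np.abs(np.diff(radiance, axis=1, prepend=radiance[:, :1]))
    valid &= jump < overshoot_jump                               # crude electronic-overshoot test
    return valid

scene = np.random.default_rng(2).uniform(1.0, 10.0, size=(64, 64))
scene[::16, :] *= 0.1                                            # emulate the ~16-line anomaly
print("usable pixels:", int(preprocess_channel5(scene).sum()), "of", scene.size)
```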
A cloud-based approach for interoperable electronic health records (EHRs).
Bahga, Arshdeep; Madisetti, Vijay K
2013-09-01
We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). Lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system - cloud health information systems technology architecture (CHISTAR) - that achieves semantic interoperability through the use of a generic design methodology which uses a reference model that defines a general purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach that comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.
Calculation of gyrosynchrotron radiation brightness temperature for outer bright loop of ICME
NASA Astrophysics Data System (ADS)
Sun, Weiying; Wu, Ji; Wang, C. B.; Wang, S.
The Solar Polar Orbit Radio Telescope (SPORT) is proposed to detect the high-density plasma clouds of the outer bright loops of ICMEs from a solar orbit with large inclination. Of particular interest is following the propagation of the plasma clouds with a remote sensor in the radio wavelength band. Gyrosynchrotron emission is a main radio radiation mechanism of the plasma clouds and can provide information on the interplanetary magnetic field. In this paper, we statistically analyze the electron density, electron temperature and magnetic field of the background solar wind at times of quiet Sun and of ICME propagation. We also estimate the fluctuation range of the electron density, electron temperature and magnetic field of the outer bright loop of ICMEs. Moreover, we calculate and analyze the emission brightness temperature and degree of polarization on the basis of the study of gyrosynchrotron emission, absorption and polarization characteristics for optical depths less than or equal to 1.
Hybrid Automatic Building Interpretation System
NASA Astrophysics Data System (ADS)
Pakzad, K.; Klink, A.; Müterthies, A.; Gröger, G.; Stroh, V.; Plümer, L.
2011-09-01
HABIS (Hybrid Automatic Building Interpretation System) is a system for the automatic reconstruction of building roofs used in virtual 3D building models. Unlike most of the commercially available systems, HABIS is able to work automatically to a high degree. The hybrid method uses different sources, intending to exploit the advantages of each. 3D point clouds usually provide good height and surface data, whereas high-spatial-resolution aerial images provide important information on edges and detail information for roof objects like dormers or chimneys. The cadastral data provide important basic information about the building ground plans. The approach used in HABIS works with a multi-stage process, which starts with a coarse roof classification based on 3D point clouds. After that it continues with an image-based verification of these predicted roofs. In a further step a final classification and adjustment of the roofs is done. In addition, some roof objects like dormers and chimneys are also extracted based on aerial images and added to the models. In this paper the methods used are described and some results are presented.
3D Reconstruction of Irregular Buildings and Buddha Statues
NASA Astrophysics Data System (ADS)
Zhang, K.; Li, M.-j.
2014-04-01
Three-dimensional laser scanning can acquire an object's surface data quickly and accurately. However, the post-processing of point clouds is not perfect and could be improved. Based on a study of 3D laser scanning technology, this paper describes the details of solutions for modelling irregular ancient buildings and Buddha statues in Jinshan Temple, covering data acquisition, modelling and texture mapping. In order to model the irregular ancient buildings effectively, the structure of each building is extracted manually from the point cloud and the textures are mapped with the software 3ds Max. The methods combine 3D laser scanning technology with traditional modelling methods and greatly improve the efficiency and accuracy of restoring the ancient buildings. On the other hand, the modelling of the statues is treated as modelling objects in reverse engineering. The digital models of the statues obtained are not just vivid, but also accurate in the field of surveying and mapping. On this basis, a 3D scene of Jinshan Temple is reconstructed, which proves the validity of the solutions.
Improved Arctic Cloud and Aerosol Research and Model Parameterizations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kenneth Sassen
2007-03-01
In this report are summarized our contributions to the Atmospheric Radiation Measurement (ARM) program supported by the Department of Energy. Our involvement commenced in 1990 during the planning stages of the design of the ARM Cloud and Radiation Testbed (CART) sites. We have worked continuously (up to 2006) on our ARM research objectives, building on our earlier findings to advance our knowledge in several areas. Below we summarize our research over this period, with an emphasis on the most recent work. We have participated in several aircraft-supported deployments at the SGP and NSA sites. In addition to deploying the Polarization Diversity Lidar (PDL) system (Sassen 1994; Noel and Sassen 2005) designed and constructed under ARM funding, we have operated other sophisticated instruments, a W-band polarimetric Doppler radar and a mid-infrared radiometer, for intercalibration and student training purposes. We have worked closely with University of North Dakota scientists, twice co-directing the Citation operations through ground-to-air communications, and serving as the CART ground-based mission coordinator with NASA aircraft during the 1996 SUCCESS/IOP campaign. We have also taken a leading role in initiating case study research involving a number of ARM co-investigators. Analyses of several case studies from these IOPs have been reported in journal articles, as we show in Table 1. The PDL has also participated in other major field projects, including FIRE II and CRYSTAL-FACE. In general, the published results of our IOP research can be divided into two categories: comprehensive cloud case study analyses to shed light on fundamental cloud processes using the unique CART IOP measurement capabilities, and the analysis of in situ data for the testing of remote sensing cloud retrieval algorithms. One of the goals of the case study approach is to provide sufficiently detailed descriptions of cloud systems from the data-rich CART environment to make them suitable for application by cloud modeling groups, such as the GEWEX Cloud System Study (GCSS) Cirrus Working Groups. In this paper we summarize our IOP-related accomplishments.
TLS for generating multi-LOD of 3D building model
NASA Astrophysics Data System (ADS)
Akmalia, R.; Setan, H.; Majid, Z.; Suwardhi, D.; Chong, A.
2014-02-01
Terrestrial Laser Scanners (TLS) have been used widely to capture three-dimensional (3D) objects for various applications. Developments in 3D modelling have also led people to visualize the environment in 3D. Visualization of objects in a city environment in 3D can be useful for many applications. However, different applications require different kinds of 3D models. Since buildings are important objects, CityGML has defined a standard for 3D building models at four different levels of detail (LOD). In this research, the advantages of TLS for capturing buildings and the modelling process of the point cloud are explored. TLS is used to capture all the building details to generate multiple LODs. In previous works, this task usually involves the integration of several sensors. In this research, however, the point cloud from TLS is processed to generate the LOD3 model. LOD2 and LOD1 are then generalized from the resulting LOD3 model. The result of this research is a guiding process to generate the multi-LOD 3D building model starting from LOD3 using TLS. Lastly, the visualization of the multi-LOD model is also shown.
Automated Classification of Heritage Buildings for As-Built Bim Using Machine Learning Techniques
NASA Astrophysics Data System (ADS)
Bassier, M.; Vergauwen, M.; Van Genechten, B.
2017-08-01
Semantically rich three-dimensional models such as Building Information Models (BIMs) are increasingly used in digital heritage. They provide the required information to varying stakeholders during the different stages of a historic building's life cycle, which is crucial in the conservation process. The creation of as-built BIM models is based on point cloud data. However, manually interpreting this data is labour intensive and often leads to misinterpretations. By automatically classifying the point cloud, the information can be processed more efficiently. A key aspect in this automated scan-to-BIM process is the classification of building objects. In this research we look to automatically recognise elements in existing buildings to create compact semantic information models. Our algorithm efficiently extracts the main structural components such as floors, ceilings, roofs, walls and beams despite the presence of significant clutter and occlusions. More specifically, Support Vector Machines (SVM) are proposed for the classification. The algorithm is evaluated using real data from a variety of existing buildings. The results prove that the used classifier recognizes the objects with both high precision and recall. As a result, entire data sets are reliably labelled at once. The approach enables experts to better document and process heritage assets.
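To make the classification step concrete, a minimal sketch of the kind of SVM classifier the abstract describes is given below. It uses scikit-learn; the per-patch features (height, planarity, normal verticality), the class labels, and the randomly generated training data are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch (not the authors' code): classifying point-cloud patches into
# structural classes with a Support Vector Machine, assuming per-patch features
# such as mean height, planarity, and normal verticality have already been
# extracted. Feature names and class labels are illustrative only.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 500
X = rng.random((n, 3))                 # hypothetical [height, planarity, verticality]
y = rng.integers(0, 4, size=n)         # hypothetical labels: 0=floor, 1=ceiling, 2=wall, 3=beam

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```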
Values of the phase space factors for double beta decay
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoica, Sabin, E-mail: stoica@theory.nipne.ro; Mirea, Mihai; Horia Hulubei National Institute of Physics and Nuclear Engineering, 30 Reactorului street, P.O. Box MG6, Magurele
2015-10-28
We report an updated list of the experimentally most interesting phase space factors for double beta decay (DBD). The electron/positron wave functions are obtained by solving the Dirac equations with a Coulomb potential derived from a realistic proton density distribution in the nucleus and with inclusion of the finite nuclear size (FNS) and electron screening (ES) effects. We have built new numerical routines which give us good control of the accuracy of the calculations. We found several notable differences as compared with previous results reported in the literature, and possible sources of these discrepancies are discussed.
NASA Glenn Icing Research Tunnel: Upgrade and Cloud Calibration
NASA Technical Reports Server (NTRS)
VanZante, Judith Foss; Ide, Robert F.; Steen, Laura E.
2012-01-01
In 2011, NASA Glenn's Icing Research Tunnel underwent a major modification to its refrigeration plant and heat exchanger. This paper presents the results of the subsequent full cloud calibration. Details of the calibration procedure and results are presented herein. The steps include developing a nozzle transfer map, establishing a uniform cloud, conducting a drop sizing calibration and finally a liquid water content calibration. The goal of the calibration is to develop a uniform cloud, and to build a transfer map from the inputs of air speed, spray bar atomizing air pressure and water pressure to the outputs of median volumetric droplet diameter and liquid water content.
NASA Glenn Icing Research Tunnel: 2012 Cloud Calibration Procedure and Results
NASA Technical Reports Server (NTRS)
VanZante, Judith Foss; Ide, Robert F.; Steen, Laura E.
2012-01-01
In 2011, NASA Glenn's Icing Research Tunnel underwent a major modification to its refrigeration plant and heat exchanger. This paper presents the results of the subsequent full cloud calibration. Details of the calibration procedure and results are presented herein. The steps include developing a nozzle transfer map, establishing a uniform cloud, conducting a drop sizing calibration and finally a liquid water content calibration. The goal of the calibration is to develop a uniform cloud, and to build a transfer map from the inputs of air speed, spray bar atomizing air pressure and water pressure to the outputs of median volumetric droplet diameter and liquid water content.
NASA Astrophysics Data System (ADS)
Macher, H.; Grussenmeyer, P.; Landes, T.; Halin, G.; Chevrier, C.; Huyghe, O.
2017-08-01
The French collection of Plan-Reliefs, scale models of fortified towns, constitutes a precious testimony of the history of France. The aim of the URBANIA project is the valorisation and diffusion of this heritage through the creation of virtual models. The town scale model of Strasbourg at 1/600, currently exhibited in the Historical Museum of Strasbourg, was selected as a case study. In this paper, the photogrammetric recording of this scale model is first presented. The acquisition protocol as well as the data post-processing are detailed. Then, the modelling of the city, and more specifically of building blocks, is investigated. Based on point clouds of the scale model, the extraction of roof elements is considered. It deals first with the segmentation of the point cloud into building blocks. Then, for each block, points belonging to roofs are identified and the extraction of chimney point clouds as well as roof ridges and roof planes is performed. Finally, the 3D parametric modelling of the building blocks is studied by considering roof polygons and polylines describing chimneys as input. In a future works section, the semantic enrichment and potential usage scenarios of the scale model are envisaged.
A Weibull distribution accrual failure detector for cloud computing.
Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are a fundamental component used to build high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
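To illustrate the accrual approach the abstract builds on, the sketch below computes a Weibull-based suspicion level phi = -log10(P(heartbeat still pending)). The fixed shape parameter, the moment-based scale estimate, and the example threshold are assumptions for illustration and do not reproduce the paper's detector.

```python
# Minimal sketch (assumptions only, not the paper's implementation): an accrual
# failure detector that fits a Weibull distribution to heartbeat inter-arrival
# times and reports a suspicion level phi = -log10(P(heartbeat still pending)).
import math

class WeibullAccrualDetector:
    def __init__(self, shape=1.5, window=100):
        self.shape = shape          # Weibull shape parameter k (assumed fixed here)
        self.window = window        # number of recent intervals to keep
        self.intervals = []
        self.last_heartbeat = None

    def heartbeat(self, now):
        if self.last_heartbeat is not None:
            self.intervals.append(now - self.last_heartbeat)
            self.intervals = self.intervals[-self.window:]
        self.last_heartbeat = now

    def phi(self, now):
        if not self.intervals or self.last_heartbeat is None:
            return 0.0
        # Method-of-moments style estimate of the scale parameter from the mean
        # interval (a simplification; a real detector would fit both shape and
        # scale, e.g. by maximum likelihood).
        mean = sum(self.intervals) / len(self.intervals)
        scale = mean / math.gamma(1.0 + 1.0 / self.shape)
        dt = now - self.last_heartbeat
        survival = math.exp(-((dt / scale) ** self.shape))   # P(interval > dt)
        return -math.log10(max(survival, 1e-12))

# Usage: feed heartbeat arrival times, then query phi; a node is suspected once
# phi exceeds a chosen threshold (e.g. 8).
d = WeibullAccrualDetector()
for t in [0.0, 1.0, 2.1, 2.9, 4.0]:
    d.heartbeat(t)
print(d.phi(4.5), d.phi(12.0))
```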
Developing cloud-based Business Process Management (BPM): a survey
NASA Astrophysics Data System (ADS)
Mercia; Gunawan, W.; Fajar, A. N.; Alianto, H.; Inayatulloh
2018-03-01
In today's highly competitive business environment, modern enterprises face difficulties in cutting unnecessary costs, eliminating waste, and delivering substantial benefits to the organization. Companies are increasingly turning to a more flexible IT environment to help them realize this goal. For this reason, the article applies cloud-based Business Process Management (BPM), which enables a focus on modeling, monitoring and process management. Cloud-based BPM consists of business processes, business information and IT resources, which help build real-time intelligence systems based on business management and cloud technology. Cloud computing is a paradigm that involves procuring dynamically measurable resources over the internet as an IT resource service. A cloud-based BPM service addresses common problems faced by traditional BPM, especially in promoting flexible, event-driven business processes that exploit opportunities in the marketplace.
Challenges in Securing the Interface Between the Cloud and Pervasive Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lagesse, Brent J
2011-01-01
Cloud computing presents an opportunity for pervasive systems to leverage computational and storage resources to accomplish tasks that would not normally be possible on such resource-constrained devices. Cloud computing can enable hardware designers to build lighter systems that last longer and are more mobile. Despite the advantages cloud computing offers to the designers of pervasive systems, there are some limitations of leveraging cloud computing that must be addressed. We take the position that cloud-based pervasive systems must be secured holistically and discuss ways this might be accomplished. In this paper, we discuss a pervasive system utilizing cloud computing resources and issues that must be addressed in such a system. In this system, the user's mobile device cannot always have network access to leverage resources from the cloud, so it must make intelligent decisions about what data should be stored locally and what processes should be run locally. As a result of these decisions, the user becomes vulnerable to attacks while interfacing with the pervasive system.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-15
... Rehabilitation Research--Disability and Rehabilitation Research Project--Inclusive Cloud and Web Computing CFDA... inclusive Cloud and Web computing. The Assistant Secretary may use this priority for competitions in fiscal... Priority for Inclusive Cloud and Web Computing'' in the subject line of your electronic message. FOR...
Using Word Clouds to Develop Proactive Learners
ERIC Educational Resources Information Center
Miley, Frances; Read, Andrew
2011-01-01
This article examines student responses to a technique for summarizing electronically available information based on word frequency. Students used this technique to create word clouds, using those word clouds to enhance personal and small group study. This is a qualitative study. Small focus groups were used to obtain student feedback. Feedback…
The Effects of Thunderstorm Static and Quasi-Static Electric Fields on the Lower Ionosphere
NASA Astrophysics Data System (ADS)
Salem, Mohammad Ahmad
Thunderstorms and their lightning discharges are of great interest to many areas of geophysics and atmospheric electricity. A thunderstorm is an electric generator; it can produce both electrostatic and quasi-electrostatic fields in the overhead atmospheric D region. The D region is the lower part of the ionosphere that extends from about 40-90 km altitude, where electron and ion densities are sufficient to affect the propagation of radio waves. In contrast to the electrostatic field, the quasi-electrostatic fields can be much stronger in magnitude, but shorter in duration, and can trigger halos. A halo is one type of transient luminous event (TLE) and typically appears within 1-2 ms after an intense cloud-to-ground lightning discharge. It looks like a relatively homogeneous glow in the shape of a pancake that is centered around 75-80 km altitude with a horizontal extent of tens of kilometers and a vertical thickness of several kilometers. The goals of this dissertation research are to investigate the electrical effects of thunderstorm electrostatic and quasi-electrostatic fields on the nighttime lower ionosphere, and their relation to the formation of atmospheric halos. This work entails numerical and theoretical modeling analyses, and comparison of current theory and simulation results with actual observations. For the first part of this study we have demonstrated that, under steady-state conditions, electrostatic fields with values below 0.4 Ek (not strong enough to produce TLEs) can be established in the lower ionosphere due to underlying thunderstorms. We utilized the simplified nighttime ion chemistry model described in the work of Liu [2012] to investigate how these fields affect the lower ionosphere ion density profile. Three-body electron attachment, through which electrons can be converted to negative ions, is the only process whose rate constant depends on the field values within the above-mentioned limit. As a result of the variation of the rate constant with the electric field, the nighttime steady-state electron density profile can be reduced by ˜40% or enhanced by a factor of ˜6. We have improved our model in order to self-consistently calculate the steady-state conductivity of the lower ionosphere above a thunderstorm. The new model takes into account the heating effects of thunderstorm electrostatic fields on the free electrons. The modeling results indicate that under steady-state conditions, although the electron density is generally increased, the nighttime lower ionospheric conductivity can be reduced by up to 1-2 orders of magnitude because electron mobility is significantly reduced due to the electron heating effect. Because of this reduction, it is found that for a typical ionospheric density profile, the resulting changes in the reflection heights of ELF and VLF waves are 5 and 2 km, respectively. In the second part of this dissertation, a one-dimensional plasma discharge fluid model is developed to study the response of the nighttime lower ionosphere to the quasi-electrostatic field produced by cloud-to-ground lightning flashes. When the quasi-electrostatic field reaches and exceeds about Ek, a halo can be triggered in the lower ionosphere. The modeling results indicate that the ionospheric perturbation is determined by the ambient ionospheric density profile, the charge moment change, and the charge transfer time. Tenuous ambient profiles result in larger changes in the ionospheric electron density.
Cloud-to-ground lightning discharges, with larger charge moment changes and shorter charge transfer times, result in a larger change in the ionospheric electron density. In particular, the enhancement in the lower ionospheric electron density due to impulsive negative cloud-to-ground lightning flashes has been investigated. It is found that the enhancement can reach up to about 3 orders of magnitude above ˜70 km altitude in a few seconds. Below ˜75 km altitude, this enhancement recovers in a few seconds due to the fast electron attachment process. The recovery time of the electron enhancement above ˜75 km altitude is controlled by a slower recombination process; it depends on the ambient density profile and can last for tens of minutes to hours. Finally, the modeling results of the lower ionosphere recovery time are analyzed to investigate the role of halos in producing early VLF events with long recovery time. It is found that these events can be explained when sufficient ionization is produced around ˜80 km altitude. Such ionization can be produced by the impact of impulsive negative cloud-to-ground lightning flashes with a relatively large charge moment change on a tenuous ionospheric density profile.
NASA Astrophysics Data System (ADS)
Parishani, H.; Pritchard, M. S.; Bretherton, C. S.; Wyant, M. C.; Khairoutdinov, M.; Singh, B.
2017-12-01
Biases and parameterization formulation uncertainties in the representation of boundary layer clouds remain a leading source of possible systematic error in climate projections. Here we show the first results of cloud feedback to +4K SST warming in a new experimental climate model, the ``Ultra-Parameterized (UP)'' Community Atmosphere Model, UPCAM. We have developed UPCAM as an unusually high-resolution implementation of cloud superparameterization (SP) in which a global set of cloud resolving arrays is embedded in a host global climate model. In UP, the cloud-resolving scale includes sufficient internal resolution to explicitly generate the turbulent eddies that form marine stratocumulus and trade cumulus clouds. This is computationally costly but complements other available approaches for studying low clouds and their climate interaction, by avoiding parameterization of the relevant scales. In a recent publication we have shown that UP, while not without its own complexity trade-offs, can produce encouraging improvements in low cloud climatology in multi-month simulations of the present climate and is a promising target for exascale computing (Parishani et al. 2017). Here we show results of its low cloud feedback to warming in multi-year simulations for the first time. References: Parishani, H., M. S. Pritchard, C. S. Bretherton, M. C. Wyant, and M. Khairoutdinov (2017), Toward low-cloud-permitting cloud superparameterization with explicit boundary layer turbulence, J. Adv. Model. Earth Syst., 9, doi:10.1002/2017MS000968.
Real-time single-molecule imaging of quantum interference.
Juffmann, Thomas; Milic, Adriana; Müllneritsch, Michael; Asenbaum, Peter; Tsukernik, Alexander; Tüxen, Jens; Mayor, Marcel; Cheshnovsky, Ori; Arndt, Markus
2012-03-25
The observation of interference patterns in double-slit experiments with massive particles is generally regarded as the ultimate demonstration of the quantum nature of these objects. Such matter-wave interference has been observed for electrons, neutrons, atoms and molecules and, in contrast to classical physics, quantum interference can be observed when single particles arrive at the detector one by one. The build-up of such patterns in experiments with electrons has been described as the "most beautiful experiment in physics". Here, we show how a combination of nanofabrication and nano-imaging allows us to record the full two-dimensional build-up of quantum interference patterns in real time for phthalocyanine molecules and for derivatives of phthalocyanine molecules, which have masses of 514 AMU and 1,298 AMU respectively. A laser-controlled micro-evaporation source was used to produce a beam of molecules with the required intensity and coherence, and the gratings were machined in 10-nm-thick silicon nitride membranes to reduce the effect of van der Waals forces. Wide-field fluorescence microscopy detected the position of each molecule with an accuracy of 10 nm and revealed the build-up of a deterministic ensemble interference pattern from single molecules that arrived stochastically at the detector. In addition to providing this particularly clear demonstration of wave-particle duality, our approach could also be used to study larger molecules and explore the boundary between quantum and classical physics.
Real-time single-molecule imaging of quantum interference
NASA Astrophysics Data System (ADS)
Juffmann, Thomas; Milic, Adriana; Müllneritsch, Michael; Asenbaum, Peter; Tsukernik, Alexander; Tüxen, Jens; Mayor, Marcel; Cheshnovsky, Ori; Arndt, Markus
2012-05-01
The observation of interference patterns in double-slit experiments with massive particles is generally regarded as the ultimate demonstration of the quantum nature of these objects. Such matter-wave interference has been observed for electrons, neutrons, atoms and molecules and, in contrast to classical physics, quantum interference can be observed when single particles arrive at the detector one by one. The build-up of such patterns in experiments with electrons has been described as the ``most beautiful experiment in physics''. Here, we show how a combination of nanofabrication and nano-imaging allows us to record the full two-dimensional build-up of quantum interference patterns in real time for phthalocyanine molecules and for derivatives of phthalocyanine molecules, which have masses of 514 AMU and 1,298 AMU respectively. A laser-controlled micro-evaporation source was used to produce a beam of molecules with the required intensity and coherence, and the gratings were machined in 10-nm-thick silicon nitride membranes to reduce the effect of van der Waals forces. Wide-field fluorescence microscopy detected the position of each molecule with an accuracy of 10 nm and revealed the build-up of a deterministic ensemble interference pattern from single molecules that arrived stochastically at the detector. In addition to providing this particularly clear demonstration of wave-particle duality, our approach could also be used to study larger molecules and explore the boundary between quantum and classical physics.
Plasma waves associated with the AMPTE artificial comet
NASA Technical Reports Server (NTRS)
Gurnett, D. A.; Anderson, R. R.; Haeusler, B.; Haerendel, G.; Bauer, O. H.
1985-01-01
Numerous plasma wave effects were detected by the AMPTE/IRM spacecraft during the artificial comet experiment on December 27, 1984. As the barium ion cloud produced by the explosion expanded over the spacecraft, emissions at the electron plasma frequency and ion plasma frequency provided a determination of the local electron density. The electron density in the diamagnetic cavity produced by the ion cloud reached a peak of more than 5 × 10^5 cm^-3, then decayed smoothly as the cloud expanded, varying approximately as t^-2. As the cloud began to move due to interactions with the solar wind, a region of compressed plasma was encountered on the upstream side of the diamagnetic cavity. The peak electron density in the compression region was about 1.5 × 10^4 cm^-3. Later, a very intense (140 mV/m) broadband burst of electrostatic noise was encountered on the sunward side of the compression region. This noise has characteristics very similar to noise observed in the earth's bow shock, and is believed to be a shocklike interaction produced by an ion beam-plasma instability between the nearly stationary barium ions and the streaming solar wind protons.
NASA Astrophysics Data System (ADS)
Tomljenovic, Ivan; Tiede, Dirk; Blaschke, Thomas
2016-10-01
In the past two decades, Object-Based Image Analysis (OBIA) has established itself as an efficient approach for the classification and extraction of information from remote sensing imagery and, increasingly, from non-image-based sources such as Airborne Laser Scanner (ALS) point clouds. ALS data is represented in the form of a point cloud with recorded multiple returns and intensities. In our work, we combined OBIA with ALS point cloud data in order to identify and extract buildings as 2D polygons representing roof outlines in a top-down mapping approach. We rasterized the ALS data into a height raster in order to generate a Digital Surface Model (DSM) and a derived Digital Elevation Model (DEM). Further objects were generated in conjunction with point statistics from the linked point cloud. With the use of class modelling methods, we generated the final target class of objects representing buildings. The approach was developed for a test area in Biberach an der Riß (Germany). In order to demonstrate the possibility of adaptation-free transferability to another data set, the algorithm has been applied "as is" to the ISPRS Benchmarking data set of Toronto (Canada). The obtained results show high accuracies for the initial study area (thematic accuracies of around 98%, geometric accuracy of above 80%). The very high performance within the ISPRS Benchmark without any modification of the algorithm and without any adaptation of parameters is particularly noteworthy.
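The rasterization step mentioned above can be sketched as follows: grid the point cloud into per-cell maximum and minimum heights as crude DSM and DEM stand-ins, and use their difference to highlight elevated objects. The cell size, the minimum-as-DEM shortcut, and the toy data are assumptions for illustration, not the authors' workflow.

```python
# Minimal sketch (illustrative, not the authors' workflow): rasterizing an ALS
# point cloud into a height raster (DSM) from the per-cell maximum return, with
# a crude DEM proxy from the per-cell minimum; DSM - DEM highlights elevated
# objects such as buildings. A real DEM would require proper ground filtering.
import numpy as np

def rasterize(points, cell=1.0):
    """points: (N, 3) array of x, y, z coordinates."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    ix = ((x - x.min()) / cell).astype(int)
    iy = ((y - y.min()) / cell).astype(int)
    shape = (iy.max() + 1, ix.max() + 1)
    dsm = np.full(shape, np.nan)
    dem = np.full(shape, np.nan)
    for i, j, h in zip(iy, ix, z):
        dsm[i, j] = h if np.isnan(dsm[i, j]) else max(dsm[i, j], h)
        dem[i, j] = h if np.isnan(dem[i, j]) else min(dem[i, j], h)
    return dsm, dem, dsm - dem      # normalized heights

pts = np.random.rand(1000, 3) * [100, 100, 20]   # toy point cloud
dsm, dem, ndsm = rasterize(pts, cell=2.0)
print(dsm.shape, np.nanmax(ndsm))
```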
Cuenca-Alba, Jesús; Del Cano, Laura; Gómez Blanco, Josué; de la Rosa Trevín, José Miguel; Conesa Mingo, Pablo; Marabini, Roberto; S Sorzano, Carlos Oscar; Carazo, Jose María
2017-10-01
New instrumentation for cryo electron microscopy (cryoEM) has significantly increased the data collection rate as well as data quality, creating bottlenecks at the image processing level. The current image processing model of moving the acquired images from the data source (the electron microscope) to desktops or local clusters for processing is encountering many practical limitations. However, computing may also take place in distributed and decentralized environments. In this way, the cloud is a new form of accessing computing and storage resources on demand. Here, we evaluate how this new computational paradigm can be effectively used by extending our current integrative framework for image processing, creating ScipionCloud. This new development has resulted in a full installation of Scipion in both public and private clouds, accessible as public "images", with all the required cryoEM software preinstalled, requiring just a Web browser to access all Graphical User Interfaces. We have profiled the performance of different configurations on Amazon Web Services and the European Federated Cloud, always on architectures incorporating GPUs, and compared them with a local facility. We have also analyzed the economic convenience of different scenarios, so cryoEM scientists have a clearer picture of the setup that is best suited for their needs and budgets. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
The Community Cloud Atlas - Building an Informed Cloud Watching Community
NASA Astrophysics Data System (ADS)
Guy, N.; Rowe, A.
2014-12-01
The sky is dynamic, from long lasting cloud systems to ethereal, fleeting formations. After years of observing the sky and growing our personal collections of cloud photos, we decided to take to social media to share pictures, as well as build and educate a community of cloud enthusiasts. We began a Facebook page, the Community Cloud Atlas, described as "...the place to show off your pictures of the sky, identify clouds, and to discuss how specific cloud types form and what they can tell you about current and future weather." Our main goal has been to encourage others to share their pictures, while we describe the scenes from a meteorological perspective and reach out to the general public to facilitate a deeper understanding of the sky. Nearly 16 months later, we have over 1400 "likes," spanning 45 countries with ages ranging from 13 to over 65. We have a consistent stream of submissions; so many that we decided to start a corresponding blog to better organize the photos, provide more detailed explanations, and reach a bigger audience. Feedback from users has been positive in support of not only sharing cloud pictures, but also to "learn the science as well as admiring" the clouds. As one community member stated, "This is not 'just' a place to share some lovely pictures." We have attempted to blend our social media presence with providing an educational resource, and we are encouraged by the response we have received. Our Atlas has been informally implemented into classrooms, ranging from a 6th grade science class to Meteorology courses at universities. NOVA's recent Cloud Lab also made use of our Atlas as a supply of categorized pictures. Our ongoing goal is to not only continue to increase understanding and appreciation of the sky among the public, but to provide an increasingly useful tool for educators. We continue to explore different social media options to interact with the public and provide easier content submission, as well as software options for managing a growing database.
Characterization of Individual Aerosol Particles Associated with Clouds (CRYSTAL-FACE)
NASA Technical Reports Server (NTRS)
Buseck, Peter R.
2004-01-01
The aim of our research was to obtain data on the chemical and physical properties of individual aerosol particles from near the bottoms and tops of the deep convective systems that lead to the generation of tropical cirrus clouds and to provide insights into the particles that serve as CCN or IN. We used analytical transmission electron microscopy (ATEM), including energy-dispersive X-ray spectrometry (EDS) and electron energy-loss spectroscopy (EELS), and field-emission electron microscopy (FESEM) to compare the compositions, concentrations, size distributions, shapes, surface coatings, and degrees of aggregation of individual particles from cloud bases and the anvils near the tropopause. Aggregates of sea salt and mineral dust, ammonium sulfate, and soot particles are abundant in in-cloud samples. Cirrus samples contain many H2SO4 droplets, but acidic sulfate particles are rare at the cloud bases. H2SO4 probably formed at higher altitudes through oxidation of SO2 in cloud droplets. The relatively high extent of ammoniation in the upper-troposphere in-cloud samples appears to have resulted from vertical transport by strong convection. The morphology of the H2SO4 droplets indicates that they had been at least partly ammoniated at the time of collection. They are internally mixed with organic materials, metal sulfates, and solid particles of various compositions. Ammoniation and internal mixing result in freezing at higher temperatures than in pure H2SO4 aerosols. K- and S-bearing organic particles and Si-Al-rich particles are common throughout. Sea salt and mineral dust were incorporated into the convective systems from the cloud bases and worked as ice nuclei while being vertically transported. The nonsulfate particles originated from the lower troposphere and were transported to the upper troposphere and lower stratosphere.
NASA Astrophysics Data System (ADS)
Stanier, C. O.; Janechek, N. J.; Bryngelson, N.; Marek, R. F.; Lersch, T.; Bunker, K.; Casuccio, G.; Brune, W. H.; Hornbuckle, K. C.
2017-12-01
Cyclic volatile methyl siloxanes are anthropogenic chemicals present in personal care products such as antiperspirants and lotions. These are volatile chemicals that are readily released into the atmosphere by product use. Due to their emission and the relatively slow kinetics of their major transformation pathway, reaction with hydroxyl radicals (OH), these compounds are present at high concentrations in indoor environments and are widespread in outdoor environments. Cyclic siloxane reaction with OH can lead to secondary organic aerosols and, due to the widespread prevalence of the parent compounds, may be an important source of ambient aerosols. Atmospheric aerosols have important influences on the climate by affecting the radiative balance and by serving as cloud condensation nuclei (CCN), which influence clouds. While the parent compounds have been well studied, the oxidation products have received much less attention, with almost no ambient measurements or experimental physical property data. We report physical properties of aerosols generated by reacting the cyclic siloxane D5 with OH using a Potential Aerosol Mass (PAM) photochemical chamber. The particles were characterized by SMPS; by imaging and elemental analysis using both Transmission Electron Microscopy and Scanning Transmission Electron Microscopy equipped with Energy Dispersive X-ray Spectroscopy systems (TEM-EDS and STEM-EDS); by volatility measurements using a Volatility Tandem Differential Mobility Analyzer (V-TDMA); and by hygroscopicity measurements to determine CCN potential using a Droplet Measurement Technologies Cloud Condensation Nuclei Counter (DMT-CCN). Aerosol yield sensitivity to D5 and OH concentrations, residence time, and seed aerosols was analyzed. TEM-EDS and STEM-EDS analysis shows spherical particle morphology with elemental composition consistent with aerosols derived from cyclic siloxane sources. Measured aerosol yields were 20-50%, with typical aerosol concentrations of 300,000 particles cm-3, up to 200 μg m-3, and diameters of 30-90 nm. Particles experienced little diameter change after heating up to 200°C, suggesting low volatility, while particle activation was shifted to higher supersaturations compared to ammonium sulfate, suggesting moderate hygroscopicity in line with other secondary organics.
Utilization of Additive Manufacturing for Aerospace Heat Exchangers
2016-02-29
is made up of flat plates that are layered on top of each other, creating air passages in between the plates where the hot liquid and cold liquid flow...electron beam-based) for two-dimensional scanning of the heat source on the powder layer, stages that decrease the build plate and increase the powder...build plate and result in uneven coating of subsequent powder layers or complete failure of the system to recoat. The perturbations in recoater
MTR BUILDING AND BALCONY FLOORS. CAMERA FACING EASTERLY. PHOTOGRAPHER DID ...
MTR BUILDING AND BALCONY FLOORS. CAMERA FACING EASTERLY. PHOTOGRAPHER DID NOT EXPLAIN DARK CLOUD. MTR WING WILL ATTACH TO GROUND FLOOR. INL NEGATIVE NO. 1567. Unknown Photographer, 2/28/1951 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID
Response of lightning energy and total electron content with sprites over Antarctic Peninsula
NASA Astrophysics Data System (ADS)
Suparta, W.; Yusop, N.
2017-05-01
This paper investigates the response of the lightning energy and the total electron content (TEC) derived from GPS over the Antarctic Peninsula during the St Patrick's Day geomagnetic storm. During this event, sprites, one type of mesospheric transient luminous event (TLE) associated with positive cloud-to-ground (+CG) lightning discharges, can be generated. In this work, GPS and lightning data for the period from 14 to 20 March 2015 are analyzed. Geomagnetic activity and electric field data are also processed to relate the geomagnetic storm and lightning. Results show that during the St Patrick's Day geomagnetic storm, the lightning energy reached up to ∼257 kJ. The ionospheric TEC reached 60 TECU, 38 TECU and 78 TECU between 18:00 and 21:00 UT for the OHI3, PALV and ROTH stations, respectively. The peak of lightning energy was observed 14 hours after the TEC peak. Sprites were possibly generated through the electrical coupling process between the cloud top and the middle and upper atmosphere, with the DC electric field found to be ∼10 mV m-1, leading to sprite generation after the return strokes on 18 March 2015.
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Palikonda, R.; Smith, W. L., Jr.; Bedka, K. M.; Spangenberg, D.; Vakhnin, A.; Lutz, N. E.; Walter, J.; Kusterer, J.
2017-12-01
Cloud Computing offers new opportunities for large-scale scientific data producers to utilize Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) IT resources to process and deliver data products in an operational environment where timely delivery, reliability, and availability are critical. The NASA Langley Research Center Atmospheric Science Data Center (ASDC) is building and testing a private and public facing cloud for users in the Science Directorate to utilize as an everyday production environment. The NASA SatCORPS (Satellite ClOud and Radiation Property Retrieval System) team processes and derives near real-time (NRT) global cloud products from operational geostationary (GEO) satellite imager datasets. To deliver these products, we will utilize the public facing cloud and OpenShift to deploy a load-balanced webserver for data storage, access, and dissemination. The OpenStack private cloud will host data ingest and computational capabilities for SatCORPS processing. This paper will discuss the SatCORPS migration towards, and usage of, the ASDC Cloud Services in an operational environment. Detailed lessons learned from use of prior cloud providers, specifically the Amazon Web Services (AWS) GovCloud and the Government Cloud administered by the Langley Managed Cloud Environment (LMCE) will also be discussed.
NASA Astrophysics Data System (ADS)
Aneri, Parikh; Sumathy, S.
2017-11-01
Cloud computing provides services over the internet, delivering application resources and data to users on demand. Cloud computing is based on a consumer-provider model: the cloud provider offers resources that consumers can access in order to build their applications according to their demands. A cloud data center is a bulk of resources on a shared-pool architecture for cloud users to access. Virtualization is the heart of the cloud computing model; it provides virtual machines with application-specific configurations, and those applications are free to choose their own configuration. On the one hand there is a huge number of resources, and on the other hand the system has to serve a huge number of requests effectively. Therefore, the resource allocation policy and scheduling policy play a very important role in allocating and managing resources in this cloud computing model. This paper proposes a load balancing policy using the Hungarian algorithm. The Hungarian algorithm provides a dynamic load balancing policy with a monitor component. The monitor component helps to increase cloud resource utilization by managing the Hungarian algorithm, monitoring its state and altering its state based on artificial intelligence. CloudSim, used in this proposal, is an extensible toolkit that simulates the cloud computing environment.
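For illustration, an optimal one-to-one assignment of requests to virtual machines, of the kind the Hungarian algorithm provides, can be computed with SciPy's linear_sum_assignment; the cost matrix below is hypothetical, and the paper's monitor component and CloudSim integration are not modelled.

```python
# Minimal sketch (assumptions only): assigning incoming requests to virtual
# machines with the Hungarian algorithm, using a hypothetical cost matrix of
# estimated completion times.
import numpy as np
from scipy.optimize import linear_sum_assignment

# cost[i, j] = estimated time for VM j to finish request i (illustrative values)
cost = np.array([
    [4.0, 2.0, 8.0],
    [4.0, 3.0, 7.0],
    [3.0, 1.0, 6.0],
])

rows, cols = linear_sum_assignment(cost)     # optimal one-to-one assignment
for req, vm in zip(rows, cols):
    print(f"request {req} -> VM {vm} (cost {cost[req, vm]})")
print("total cost:", cost[rows, cols].sum())
```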
Simulations of space charge neutralization in a magnetized electron cooler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerity, James; McIntyre, Peter M.; Bruhwiler, David Leslie
Magnetized electron cooling at relativistic energies and Ampere scale current is essential to achieve the proposed ion luminosities in a future electron-ion collider (EIC). Neutralization of the space charge in such a cooler can significantly increase the magnetized dynamic friction and, hence, the cooling rate. The Warp framework is being used to simulate magnetized electron beam dynamics during and after the build-up of neutralizing ions, via ionization of residual gas in the cooler. The design follows previous experiments at Fermilab as a verification case. We also discuss the relevance to EIC designs.
Konishi, Yuki; Hayashi, Hiroaki; Takegami, Kazuki; Fukuda, Ikuma; Ueno, Junji
2014-01-01
A cloud chamber is a detector that can visualize the tracks of charged particles. Hayashi, et al. suggested a visualization experiment in which X-rays generated by diagnostic X-ray equipment were directed into a cloud chamber; however, there was a problem in that the wall of the cloud chamber scattered the incoming X-rays. In this study, we developed a new cloud chamber with entrance windows. Because these windows are made of thin film, we were able to direct the X-rays through them without contamination by scattered X-rays from the cloud chamber wall. We have newly proposed an experiment in which beta-particles emitted from radioisotopes are directed into a cloud chamber. We place shielding material in the cloud chamber and visualize the various shielding effects seen with the material positioned in different ways. During the experiment, electrons scattered in the air were measured quantitatively using GM counters. We explained the physical phenomena in the cloud chamber using Monte Carlo simulation code EGS5. Because electrons follow a tortuous path in air, the shielding material must be placed appropriately to be able to effectively block their emissions. Visualization of the tracks of charged particles in this experiment proved effective for instructing not only trainee radiological technologists but also different types of healthcare professionals.
Autumn at Titan's South Pole: The 220 cm-1 Cloud
NASA Astrophysics Data System (ADS)
Jennings, D. E.; Cottini, V.; Achterberg, R. K.; Anderson, C. M.; Flasar, F. M.; de Kok, R. J.; Teanby, N. A.; Coustenis, A.; Vinatier, S.
2015-10-01
Beginning in 2012 an atmospheric cloud known by its far-infrared emission has formed rapidly at Titan's South Pole [1, 2]. The build-up of this condensate is a result of deepening temperatures and a gathering of gases as winter approaches. Emission from the cloud in the south has been doubling each year since 2012, in contrast to the north where it has halved every 3.8 years since 2004. The morphology of the cloud in the south is quite different from that in the north. In the north, the cloud has extended over the whole polar region beyond 55 N, whereas in the south the cloud has been confined to within about 10 degrees of the pole. The cloud in the north has had the form of a uniform hood, whereas the southern cloud has been much more complex. A map from December 2014, recorded by the Composite Infrared Spectrometer (CIRS) on Cassini, showed the 220 cm-1 emission coming from a distinct ring with a maximum at about 80 S. In contrast, emissions from the gases HC3N, C4H2 and C6H6 peaked near the pole and had a ring at 70 S. The 220 cm-1 ring at 80 S coincided with the minimum in the gas emission pattern. The 80 S condensate ring encompassed the vortex cloud seen by the Cassini Imaging Science Subsystem (ISS) and Visible and Infrared Mapping Spectrometer (VIMS) [3, 4]. Both the 220 cm-1 ring and the gas "bull's-eye" pattern were centered on a point that was shifted from the geographic South Pole by 4 degrees in the direction of the Sun. This corresponds to the overall tilt of Titan's atmosphere discovered from temperature maps early in the Cassini mission by Achterberg et al. [5]. The tilt may be reinforced by the presumably twice-yearly (north and south) spin-up of the atmosphere at the autumnal pole. The bull's-eye pattern of the gas emissions can be explained by the retrieved abundance distributions, which are maximum near the pole and decrease sharply toward lower latitudes, together with temperatures that are minimum at the pole and increase toward lower latitudes. The increasing temperatures overcome the decreasing gas abundances to produce emission in the narrow range around 70 S. This cannot, however, explain the maximum of emission at 80 S from the condensate ring. The coincidence at 80 S of the 220 cm-1 peak with the gas emission minimum may indicate where the condensation is taking place. The central, polar minimum in the cloud emission may be due to faster rain-out and smaller extinction cross-sections. Spectral maps from 2013-15 [6] show that the gas emission pattern has been evolving quickly, with noticeable changes from one flyby to the next (about one month). The bull's-eye structure appears to have been most prominent in early 2014 and by late 2014 the pattern was becoming more uniform. As Titan progresses through late southern autumn we expect the morphology of the condensate cloud to take on a hood-like distribution similar to that in the north.
A computer vision approach for solar radiation nowcasting using MSG images
NASA Astrophysics Data System (ADS)
Álvarez, L.; Castaño Moraga, C. A.; Martín, J.
2010-09-01
Cloud structures and haze are the two main atmospheric phenomena that reduce the performance of solar power plants, since they absorb solar energy before it reaches the terrestrial surface. Thus, accurate forecasting of solar radiation is a challenging research area that involves both a precise localization of cloud structures and haze, as well as the attenuation introduced by these artifacts. Our work presents a novel approach for nowcasting services based on image processing techniques applied to MSG satellite images provided by the EUMETSAT Rapid Scan Service (RSS). These data are an interesting source of information for our purposes since every 5 minutes we obtain up-to-date information on the atmospheric state in near real time. However, an additional step is required in order to forecast solar radiation. To that end, we generate synthetic MSG image forecasts from past images, applying computer vision techniques adapted to fluid flows in order to evolve the atmospheric state. First, we classify cloud structures into two different layers, corresponding to top and bottom clouds, the latter including haze. This two-level classification responds to the dominant climate conditions found in our region of interest, the Canary Islands archipelago, regulated by the Gulf Stream and the Trade Winds. The vertical structure of the Trade Winds consists of two layers: the bottom one, which is fresh and humid, and the top one, which is warm and dry. Between these two layers a thermal inversion appears that does not allow bottom clouds to rise and naturally divides clouds into these two layers. Top clouds can be directly obtained from satellite images by means of a segmentation algorithm on histogram heights. However, bottom clouds are usually overlapped by the former, so an inpainting algorithm is used to recover overlapped areas of bottom clouds. For each layer, cloud motion is estimated through a correlation-based optic flow algorithm that provides a vector field describing the displacement in each layer between two consecutive images in a sequence. Since the RSS from EUMETSAT provides images every 5 minutes (Δt), the cloud motion vector field between images at time t0 and (t0 - Δt) is quite similar to that between (t0 - Δt) and (t0 - 2Δt). Under this assumption, we infer the motion vector field for the next image in order to build a synthetic version of the image at time (t0 + Δt). The computation of this future motion vector field takes into account terrain orography in order to produce more realistic forecasts. In this sense, we are currently working on the integration of information from NWP outputs in order to introduce other atmospheric phenomena. Applying this algorithm several times, we are able to produce short-term forecasts up to 6 hours ahead with encouraging performance. To validate our results, we use both comparison of the synthetically generated images with the corresponding observed images at a given time, and direct solar radiation measurements from the set of meteorological stations located at several points of the Canary Islands archipelago.
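A minimal sketch of the extrapolation idea (estimate a dense motion field between two consecutive frames, then advect the latest frame forward under a persistence assumption) is shown below. It substitutes OpenCV's Farneback optical flow for the correlation-based method described in the abstract, ignores orography, and uses synthetic toy frames.

```python
# Minimal sketch (an assumption-laden stand-in for the correlation-based optic
# flow described in the abstract): estimate dense motion between two satellite
# frames, then advect the latest frame one time step forward by semi-Lagrangian
# backward warping to obtain a synthetic forecast image.
import numpy as np
import cv2

def nowcast(prev_frame, curr_frame):
    """prev_frame, curr_frame: 8-bit grayscale images at t - dt and t."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_frame, curr_frame, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    h, w = curr_frame.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Assume the motion field persists for one more step: sample the current
    # frame at positions displaced backwards along the flow.
    map_x = (grid_x - flow[..., 0]).astype(np.float32)
    map_y = (grid_y - flow[..., 1]).astype(np.float32)
    return cv2.remap(curr_frame, map_x, map_y, interpolation=cv2.INTER_LINEAR)

prev = np.random.randint(0, 255, (128, 128), dtype=np.uint8)
curr = np.roll(prev, 2, axis=1)       # toy "cloud" drifting across the frame
forecast = nowcast(prev, curr)        # synthetic frame at t + dt
print(forecast.shape)
```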
NASA Astrophysics Data System (ADS)
Sanchez, K.; Roberts, G.; Calmer, R.; Nicoll, K.; Hashimshoni, E.; Rosenfeld, D.; Ovadnevaite, J.; Preissler, J.; Ceburnis, D.; O'Dowd, C. D. D.; Russell, L. M.
2017-12-01
Top-down and bottom-up aerosol-cloud shortwave radiative flux closures were conducted at the Mace Head atmospheric research station in Galway, Ireland in August 2015. Instrument platforms include ground-based, unmanned aerial vehicles (UAV), and satellite measurements of aerosols, clouds and meteorological variables. The ground-based and airborne measurements of aerosol size distributions and cloud condensation nuclei (CCN) concentration were used to initiate a 1D microphysical aerosol-cloud parcel model (ACPM). UAVs were equipped for a specific science mission, with an optical particle counter for aerosol distribution profiles, a cloud sensor to measure cloud extinction, or a 5-hole probe for 3D wind vectors. These are the first UAV measurements at Mace Head. ACPM simulations are compared to in-situ cloud extinction measurements from UAVs to quantify closure in terms of cloud shortwave radiative flux. Two out of seven cases exhibit sub-adiabatic vertical temperature profiles within the cloud, which suggests that entrainment processes affect cloud microphysical properties and lead to an overestimate of simulated cloud shortwave radiative flux. Including an entrainment parameterization and explicitly calculating the entrainment fraction in the ACPM simulations both improved cloud-top radiative closure. Entrainment reduced the difference between simulated and observation-derived cloud-top shortwave radiative flux (δRF) by between 25 W m-2 and 60 W m-2. After accounting for entrainment, satellite-derived cloud droplet number concentrations (CDNC) were within 30% of simulated CDNC. In cases with a well-mixed boundary layer, δRF is no greater than 20 W m-2 after accounting for cloud-top entrainment, and up to 50 W m-2 when entrainment is not taken into account. In cases with a decoupled boundary layer, cloud microphysical properties are inconsistent with ground-based aerosol measurements, as expected, and δRF is as high as 88 W m-2, even high (> 30 W m-2) after accounting for cloud-top entrainment. This work demonstrates the need to take in-situ measurements of aerosol properties for cases where the boundary layer is decoupled as well as consider cloud-top entrainment to accurately model stratocumulus cloud radiative flux.
NASA Astrophysics Data System (ADS)
Sanchez, K.; Roberts, G.; Calmer, R.; Nicoll, K.; Hashimshoni, E.; Rosenfeld, D.; Ovadnevaite, J.; Preissler, J.; Ceburnis, D.; O'Dowd, C. D. D.; Russell, L. M.
2016-12-01
Top-down and bottom-up aerosol-cloud shortwave radiative flux closures were conducted at the Mace Head atmospheric research station in Galway, Ireland in August 2015. Instrument platforms include ground-based, unmanned aerial vehicles (UAV), and satellite measurements of aerosols, clouds and meteorological variables. The ground-based and airborne measurements of aerosol size distributions and cloud condensation nuclei (CCN) concentration were used to initiate a 1D microphysical aerosol-cloud parcel model (ACPM). UAVs were equipped for a specific science mission, with an optical particle counter for aerosol distribution profiles, a cloud sensor to measure cloud extinction, or a 5-hole probe for 3D wind vectors. These are the first UAV measurements at Mace Head. ACPM simulations are compared to in-situ cloud extinction measurements from UAVs to quantify closure in terms of cloud shortwave radiative flux. Two out of seven cases exhibit sub-adiabatic vertical temperature profiles within the cloud, which suggests that entrainment processes affect cloud microphysical properties and lead to an overestimate of simulated cloud shortwave radiative flux. Including an entrainment parameterization and explicitly calculating the entrainment fraction in the ACPM simulations both improved cloud-top radiative closure. Entrainment reduced the difference between simulated and observation-derived cloud-top shortwave radiative flux (δRF) by between 25 W m-2 and 60 W m-2. After accounting for entrainment, satellite-derived cloud droplet number concentrations (CDNC) were within 30% of simulated CDNC. In cases with a well-mixed boundary layer, δRF is no greater than 20 W m-2 after accounting for cloud-top entrainment, and up to 50 W m-2 when entrainment is not taken into account. In cases with a decoupled boundary layer, cloud microphysical properties are inconsistent with ground-based aerosol measurements, as expected, and δRF is as high as 88 W m-2, even high (> 30 W m-2) after accounting for cloud-top entrainment. This work demonstrates the need to take in-situ measurements of aerosol properties for cases where the boundary layer is decoupled as well as consider cloud-top entrainment to accurately model stratocumulus cloud radiative flux.
Formation of Benzene in the Interstellar Medium
NASA Technical Reports Server (NTRS)
Jones, Brant M.; Zhang, Fangtong; Kaiser, Ralf I.; Jamal, Adeel; Mebel, Alexander M.; Cordiner, Martin A.; Charnley, Steven B.; Crim, F. Fleming (Editor)
2010-01-01
Polycyclic aromatic hydrocarbons and related species have been suggested to play a key role in the astrochemical evolution of the interstellar medium, but the formation mechanism of even their simplest building block-the aromatic benzene molecule-has remained elusive for decades. Here we demonstrate in crossed molecular beam experiments combined with electronic structure and statistical calculations that benzene (C6H6) can be synthesized via the barrierless, exoergic reaction of the ethynyl radical and 1,3- butadiene, C2H + H2CCHCHCH2 --> C6H6, + H, under single collision conditions. This reaction portrays the simplest representative of a reaction class in which aromatic molecules with a benzene core can be formed from acyclic precursors via barrierless reactions of ethynyl radicals with substituted 1,3-butadlene molecules. Unique gas-grain astrochemical models imply that this low-temperature route controls the synthesis of the very first aromatic ring from acyclic precursors in cold molecular clouds, such as in the Taurus Molecular Cloud. Rapid, subsequent barrierless reactions of benzene with ethynyl radicals can lead to naphthalene-like structures thus effectively propagating the ethynyl-radical mediated formation of aromatic molecules in the interstellar medium.
Formation of benzene in the interstellar medium
Jones, Brant M.; Zhang, Fangtong; Kaiser, Ralf I.; Jamal, Adeel; Mebel, Alexander M.; Cordiner, Martin A.; Charnley, Steven B.
2011-01-01
Polycyclic aromatic hydrocarbons and related species have been suggested to play a key role in the astrochemical evolution of the interstellar medium, but the formation mechanism of even their simplest building block—the aromatic benzene molecule—has remained elusive for decades. Here we demonstrate in crossed molecular beam experiments combined with electronic structure and statistical calculations that benzene (C6H6) can be synthesized via the barrierless, exoergic reaction of the ethynyl radical and 1,3-butadiene, C2H + H2CCHCHCH2 → C6H6 + H, under single collision conditions. This reaction portrays the simplest representative of a reaction class in which aromatic molecules with a benzene core can be formed from acyclic precursors via barrierless reactions of ethynyl radicals with substituted 1,3-butadiene molecules. Unique gas-grain astrochemical models imply that this low-temperature route controls the synthesis of the very first aromatic ring from acyclic precursors in cold molecular clouds, such as in the Taurus Molecular Cloud. Rapid, subsequent barrierless reactions of benzene with ethynyl radicals can lead to naphthalene-like structures thus effectively propagating the ethynyl-radical mediated formation of aromatic molecules in the interstellar medium. PMID:21187430
Automatic 3D Extraction of Buildings, Vegetation and Roads from LIDAR Data
NASA Astrophysics Data System (ADS)
Bellakaout, A.; Cherkaoui, M.; Ettarid, M.; Touzani, A.
2016-06-01
Aerial topographic surveys using Light Detection and Ranging (LiDAR) technology collect dense and accurate information from the surface or terrain; LiDAR is becoming one of the important tools in the geosciences for studying objects and the earth's surface. Classification of LiDAR data for extracting ground, vegetation, and buildings is a very important step needed in numerous applications such as 3D city modelling, extraction of different derived data for geographical information systems (GIS), mapping, navigation, etc. Regardless of what the scan data will be used for, an automatic process is greatly needed to handle the large amount of data collected, because the manual process is time consuming and very expensive. This paper presents an approach for automatic classification of aerial LiDAR data into five groups of items: buildings, trees, roads, linear objects and soil, using single-return LiDAR and processing the point cloud without generating a DEM. Topological relationships and height variation analysis are adopted to segment the entire point cloud preliminarily into upper and lower contours, uniform surfaces, non-uniform surfaces, linear objects, and others. This primary classification is used on the one hand to identify the upper and lower parts of each building in an urban scene, needed to model building façades, and on the other hand to extract the point cloud of uniform surfaces, which contains roofs, roads and ground, used in the second phase of classification. A second algorithm is developed to segment the uniform surfaces into building roofs, roads and ground; this second phase of classification is likewise based on topological relationships and height variation analysis. The proposed approach has been tested using two areas: the first is a housing complex and the second is a primary school. The proposed approach led to successful classification results for the building, vegetation and road classes.
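A crude illustration of height-variation analysis on a gridded height raster is sketched below: smooth cells are treated as ground, roads or roofs, and rough elevated cells as vegetation. The neighbourhood size, thresholds and class codes are assumptions for illustration and do not reproduce the paper's two-phase, contour-based algorithm.

```python
# Minimal sketch (illustrative thresholds, not the paper's algorithm): a crude
# height-variation test on a gridded LiDAR height raster that separates smooth
# surfaces (ground, roads, roofs) from rough ones (vegetation), then splits the
# smooth cells into elevated (candidate roofs) and low (ground/road) classes.
import numpy as np
from scipy.ndimage import generic_filter

def classify(height, ground_level, smooth_tol=0.3, elev_thresh=2.5):
    """height: 2D height raster; ground_level: scalar or 2D ground estimate."""
    roughness = generic_filter(height, np.std, size=3)   # local height variation
    smooth = roughness < smooth_tol
    elevated = (height - ground_level) > elev_thresh
    labels = np.zeros(height.shape, dtype=int)           # 0 = unclassified
    labels[smooth & ~elevated] = 1                       # ground / road
    labels[smooth & elevated] = 2                        # candidate building roof
    labels[~smooth & elevated] = 3                       # candidate vegetation
    return labels

dsm = np.random.rand(50, 50) * 0.2
dsm[10:20, 10:20] += 6.0                                 # toy flat roof block
print(np.unique(classify(dsm, ground_level=0.0), return_counts=True))
```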
Confidentiality Protection of Digital Health Records in Cloud Computing.
Chen, Shyh-Wei; Chiang, Dai Lun; Liu, Chia-Hui; Chen, Tzer-Shyong; Lai, Feipei; Wang, Huihui; Wei, Wei
2016-05-01
Electronic medical records containing confidential information are uploaded to the cloud. The cloud allows medical crews to access and manage the data and the integration of medical records easily. This data system provides relevant information to medical personnel and facilitates and improves electronic medical record management and data transmission. A structure for a cloud-based and patient-centered personal health record (PHR) is proposed in this study. This technique helps patients to manage their health information, such as appointment dates with doctors, health reports, and a complete understanding of their own health conditions. It encourages patients to take a positive attitude toward maintaining their health. Patients decide on their own who has access to their records over a specific span of time specified by the patients. Storing data in the cloud environment can reduce costs and enhance the sharing of information, but the potential threat to information security should be taken into consideration. This study proposes a cloud-based secure transmission mechanism suitable for multiple users (such as nurse aides, patients, and family members).
Pulse sequences for uniform perfluorocarbon droplet vaporization and ultrasound imaging.
Puett, C; Sheeran, P S; Rojas, J D; Dayton, P A
2014-09-01
Phase-change contrast agents (PCCAs) consist of liquid perfluorocarbon droplets that can be vaporized into gas-filled microbubbles by pulsed ultrasound waves at diagnostic pressures and frequencies. These activatable contrast agents provide benefits of longer circulating times and smaller sizes relative to conventional microbubble contrast agents. However, optimizing ultrasound-induced activation of these agents requires coordinated pulse sequences not found on current clinical systems, in order to both initiate droplet vaporization and image the resulting microbubble population. Specifically, the activation process must provide a spatially uniform distribution of microbubbles and needs to occur quickly enough to image the vaporized agents before they migrate out of the imaging field of view. The development and evaluation of protocols for PCCA-enhanced ultrasound imaging using a commercial array transducer are described. The developed pulse sequences consist of three states: (1) initial imaging at sub-activation pressures, (2) activating droplets within a selected region of interest, and (3) imaging the resulting microbubbles. Bubble clouds produced by the vaporization of decafluorobutane and octafluoropropane droplets were characterized as a function of focused pulse parameters and acoustic field location. Pulse sequences were designed to manipulate the geometries of discrete microbubble clouds using electronic steering, and cloud spacing was tailored to build a uniform vaporization field. The complete pulse sequence was demonstrated in the water bath and then in vivo in a rodent kidney. The resulting contrast provided a significant increase (>15 dB) in signal intensity. Copyright © 2014 Elsevier B.V. All rights reserved.
Discovery of very-high-energy gamma-rays from the Galactic Centre ridge.
Aharonian, F; Akhperjanian, A G; Bazer-Bachi, A R; Beilicke, M; Benbow, W; Berge, D; Bernlöhr, K; Boisson, C; Bolz, O; Borrel, V; Braun, I; Breitling, F; Brown, A M; Chadwick, P M; Chounet, L-M; Cornils, R; Costamante, L; Degrange, B; Dickinson, H J; Djannati-Ataï, A; Drury, L O'C; Dubus, G; Emmanoulopoulos, D; Espigat, P; Feinstein, F; Fontaine, G; Fuchs, Y; Funk, S; Gallant, Y A; Giebels, B; Gillessen, S; Glicenstein, J F; Goret, P; Hadjichristidis, C; Hauser, D; Hauser, M; Heinzelmann, G; Henri, G; Hermann, G; Hinton, J A; Hofmann, W; Holleran, M; Horns, D; Jacholkowska, A; de Jager, O C; Khélifi, B; Klages, S; Komin, Nu; Konopelko, A; Latham, I J; Le Gallou, R; Lemière, A; Lemoine-Goumard, M; Leroy, N; Lohse, T; Marcowith, A; Martin, J M; Martineau-Huynh, O; Masterson, C; McComb, T J L; de Naurois, M; Nolan, S J; Noutsos, A; Orford, K J; Osborne, J L; Ouchrif, M; Panter, M; Pelletier, G; Pita, S; Pühlhofer, G; Punch, M; Raubenheimer, B C; Raue, M; Raux, J; Rayner, S M; Reimer, A; Reimer, O; Ripken, J; Rob, L; Rolland, L; Rowell, G; Sahakian, V; Saugé, L; Schlenker, S; Schlickeiser, R; Schuster, C; Schwanke, U; Siewert, M; Sol, H; Spangler, D; Steenkamp, R; Stegmann, C; Tavernet, J-P; Terrier, R; Théoret, C G; Tluczykont, M; van Eldik, C; Vasileiadis, G; Venter, C; Vincent, P; Völk, H J; Wagner, S J
2006-02-09
The source of Galactic cosmic rays (with energies up to 10^15 eV) remains unclear, although it is widely believed that they originate in the shock waves of expanding supernova remnants. At present the best way to investigate their acceleration and propagation is by observing the gamma-rays produced when cosmic rays interact with interstellar gas. Here we report observations of an extended region of very-high-energy (>10^11 eV) gamma-ray emission correlated spatially with a complex of giant molecular clouds in the central 200 parsecs of the Milky Way. The hardness of the gamma-ray spectrum and the conditions in those molecular clouds indicate that the cosmic rays giving rise to the gamma-rays are likely to be protons and nuclei rather than electrons. The energy associated with the cosmic rays could have come from a single supernova explosion around 10^4 years ago.
LAD Dissertation Prize Talk: Molecular Collisional Excitation in Astrophysical Environments
NASA Astrophysics Data System (ADS)
Walker, Kyle M.
2017-06-01
While molecular excitation calculations are vital in determining particle velocity distributions, internal state distributions, abundances, and ionization balance in gaseous environments, both theoretical calculations and experimental data for these processes are lacking. Reliable molecular collisional data with the most abundant species - H2, H, He, and electrons - are needed to probe material in astrophysical environments such as nebulae, molecular clouds, comets, and planetary atmospheres. However, excitation calculations with the main collider, H2, are computationally expensive and therefore various approximations are used to obtain unknown rate coefficients. The widely-accepted collider-mass scaling approach is flawed, and alternate scaling techniques based on physical and mathematical principles are presented here. The most up-to-date excitation data are used to model the chemical evolution of primordial species in the Recombination Era and produce accurate non-thermal spectra of the molecules H2+, HD, and H2 in a primordial cloud as it collapses into a first generation star.
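For context, the widely used collider-mass scaling that the abstract identifies as flawed has the general form k_X ≈ k_He · sqrt(μ_He/μ_X), i.e. an unknown rate coefficient is estimated from a known He rate via the square root of the reduced-mass ratio. A minimal sketch (illustrative values only, not data from the dissertation):

```python
import math

def scale_rate_coefficient(k_He, m_target, m_new_collider, m_He=4.0026):
    """Standard (and, per the abstract, flawed) collider-mass scaling:
    k_X ~= k_He * sqrt(mu(He) / mu(X)).  Masses in amu, k in cm^3 s^-1."""
    mu_He = m_target * m_He / (m_target + m_He)
    mu_X = m_target * m_new_collider / (m_target + m_new_collider)
    return k_He * math.sqrt(mu_He / mu_X)

# e.g. estimate a CO-H2 rate from a CO-He rate (purely illustrative numbers)
k_CO_H2 = scale_rate_coefficient(k_He=3.0e-11, m_target=28.0, m_new_collider=2.016)
```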
NASA Astrophysics Data System (ADS)
Dieckmann, M. E.
2008-11-01
Recent particle-in-cell (PIC) simulation studies have addressed particle acceleration and magnetic field generation in relativistic astrophysical flows by plasma phase space structures. We discuss astrophysical environments such as the jets of compact objects, and we give an overview of the global PIC simulations of shocks. These reveal several types of phase space structures, which are relevant for the energy dissipation. These structures are typically coupled in shocks, but we choose to consider them here in an isolated form. Three structures are reviewed. (1) Simulations of interpenetrating or colliding plasma clouds can trigger filamentation instabilities, while simulations of thermally anisotropic plasmas observe the Weibel instability. Both transform a spatially uniform plasma into current filaments. These filament structures cause the growth of the magnetic fields. (2) The development of a modified two-stream instability is discussed. It saturates first by the formation of electron phase space holes. The relativistic electron clouds modulate the ion beam and a secondary, spatially localized electrostatic instability grows, which saturates by forming a relativistic ion phase space hole. It accelerates electrons to ultra-relativistic speeds. (3) A simulation is also reviewed, in which two clouds of an electron-ion plasma collide at a speed of 0.9c. The unequal densities of the two clouds and a magnetic field that is oblique to the collision velocity vector result in waves with a mixed electrostatic and electromagnetic polarity. The waves give rise to growing corkscrew distributions in the electrons and ions that establish an equipartition between the electron, the ion and the magnetic energy. The filament, phase space hole and corkscrew structures are discussed with respect to electron acceleration and magnetic field generation.
[Research and Implementation of Vital Signs Monitoring System Based on Cloud Platform].
Yu, Man; Tan, Anzu; Huang, Jianqi
2018-05-30
Through analysis of the problems in the current mode of work, a vital signs monitoring information system based on a cloud platform is designed and developed. The system's aim is to assist nurses in carrying out vital signs nursing work effectively and accurately. The system collects, uploads and analyzes patients' vital signs data via PDAs connected to medical inspection equipment. Clinical application proved that the system can effectively improve the quality and efficiency of medical care and may reduce medical expenses. It is also an important practical result in building a medical cloud platform.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pivi, M.T.F.; Collet, G.; King, F.
Beam instability caused by the electron cloud has been observed in positron and proton storage rings and it is expected to be a limiting factor in the performance of the positron Damping Ring (DR) of future Linear Colliders (LC) such as ILC and CLIC. To test a series of promising possible electron cloud mitigation techniques, such as surface coatings and grooves, in the Positron Low Energy Ring (LER) of the PEP-II accelerator, we have installed several test vacuum chambers including (i) a special chamber to monitor the variation of the secondary electron yield of technical surface materials and coatings under the effect of ion, electron and photon conditioning in situ in the beam line; (ii) chambers with grooves in a straight magnetic-free section; and (iii) coated chambers in a dedicated newly installed 4-magnet chicane to study mitigations in a magnetic field region. In this paper, we describe the ongoing R&D effort to mitigate the electron cloud effect for the LC damping ring, focusing on the first experimental area and on results of the reduction of the secondary electron yield due to in situ conditioning.
A Weibull distribution accrual failure detector for cloud computing
Wu, Zhibo; Wu, Jin; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are a fundamental component used to build highly available distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing. PMID:28278229
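A minimal sketch of the accrual idea with a Weibull inter-arrival model (the shape parameter is fixed here; the paper's parameter estimation and tuning details are not reproduced):

```python
import math
import time

class WeibullAccrualDetector:
    """Accrual failure detection sketch: heartbeat inter-arrival times are
    assumed Weibull-distributed, and the suspicion level is
    phi = -log10(P(heartbeat still pending)) = -log10(1 - CDF(t))."""

    def __init__(self, shape=1.5, window=100):
        self.shape = shape          # assumed Weibull shape parameter
        self.window = window        # number of recent intervals to keep
        self.intervals = []
        self.last = None

    def heartbeat(self, now=None):
        now = now if now is not None else time.time()
        if self.last is not None:
            self.intervals.append(now - self.last)
            self.intervals = self.intervals[-self.window:]
        self.last = now

    def phi(self, now=None):
        now = now if now is not None else time.time()
        if not self.intervals or self.last is None:
            return 0.0
        mean = sum(self.intervals) / len(self.intervals)
        scale = mean / math.gamma(1.0 + 1.0 / self.shape)  # method-of-moments scale
        t = max(now - self.last, 0.0)
        survival = math.exp(-((t / scale) ** self.shape))  # 1 - Weibull CDF
        return -math.log10(max(survival, 1e-300))
```

A monitored process is suspected once phi(now) exceeds an application-chosen threshold; larger thresholds trade detection speed for fewer false suspicions.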
Context dependent off loading for cloudlet in mobile ad-hoc network
NASA Astrophysics Data System (ADS)
Bhatt, N.; Nadesh, R. K.; ArivuSelvan, K.
2017-11-01
Cloud computing in mobile ad-hoc networks has become an emerging area of research as the demands on, and capabilities of, mobile devices have increased in the last few years. Carrying out operations in a remote cloud increases the delay and affects the quality of service. To avoid this problem, the cloudlet was introduced. A cloudlet provides the same support to devices as the cloud, but at low latency and high bandwidth. However, selecting a cloudlet for offloading computation with low energy use is a significant challenge when multiple cloudlets are available nearby. Here we propose an energy- and bandwidth-aware (accounting for the traffic overhead of communication with the cloud) cloudlet selection strategy based on the context dependency of the device location. It works on the basis of the mobile device location and the bandwidth availability of the cloudlet. The cloudlet offloading and selection process using the given solution is simulated in a cloud simulator.
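A hypothetical scoring function in the spirit of the energy- and bandwidth-aware selection described above (the cost weights, distance penalty and field names are illustrative, not the paper's model):

```python
import math

def select_cloudlet(device_xy, task_bits, cloudlets, w_energy=0.5, w_delay=0.5,
                    tx_energy_per_bit=2e-7):
    """Pick the nearby cloudlet with the lowest weighted cost of
    (i) estimated transmission energy and (ii) transfer delay at the
    currently available bandwidth.  `cloudlets` is a list of dicts:
    {'id': ..., 'xy': (x, y), 'bandwidth_bps': ...}."""
    def score(c):
        dist = math.dist(device_xy, c['xy'])
        energy = task_bits * tx_energy_per_bit * (1.0 + dist / 100.0)  # toy distance penalty
        delay = task_bits / c['bandwidth_bps']
        return w_energy * energy + w_delay * delay
    return min(cloudlets, key=score)
```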
Grids, virtualization, and clouds at Fermilab
Timm, S.; Chadwick, K.; Garzoglio, G.; ...
2014-06-11
Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud & GPCF) and core computing (Virtual Services). Lastly, this work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.
Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method
Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu
2016-01-01
A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system having the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing, that resulted in damages in the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud to cloud change analysis demonstrating the potential of the new method for structural analysis. PMID:28029121
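A minimal sketch of the baseline comparison: because distances between feature points are invariant to the choice of coordinate system, baselines from the two epochs can be compared without registration (labels and tolerance are illustrative):

```python
import numpy as np
from itertools import combinations

def baseline_changes(points_t0, points_t1, labels, tol=0.005):
    """Compare 'baselines' (distances between corresponding feature points
    within each epoch) without registering the two scans.  points_t0 and
    points_t1 are dicts mapping a feature label to its (x, y, z) in that
    epoch's own coordinate system; a change is reported when a baseline
    length differs by more than `tol` (metres)."""
    changes = []
    for a, b in combinations(labels, 2):
        d0 = np.linalg.norm(np.subtract(points_t0[a], points_t0[b]))
        d1 = np.linalg.norm(np.subtract(points_t1[a], points_t1[b]))
        if abs(d1 - d0) > tol:
            changes.append((a, b, d1 - d0))
    return changes
```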
Grids, virtualization, and clouds at Fermilab
NASA Astrophysics Data System (ADS)
Timm, S.; Chadwick, K.; Garzoglio, G.; Noh, S.
2014-06-01
Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud & GPCF) and core computing (Virtual Services). This work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.
Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.
Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu
2016-12-24
A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system having the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing, that resulted in damages in the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud to cloud change analysis demonstrating the potential of the new method for structural analysis.
Localization of Pathology on Complex Architecture Building Surfaces
NASA Astrophysics Data System (ADS)
Sidiropoulos, A. A.; Lakakis, K. N.; Mouza, V. K.
2017-02-01
The technology of 3D laser scanning is considered one of the most common methods for heritage documentation. The point clouds produced provide highly detailed information, both geometric and thematic. Various studies examine techniques for the best exploitation of this information. In this study, an algorithm for localizing pathology, such as cracks and fissures, on complex building surfaces is tested. The algorithm makes use of the points' positions in the point cloud and tries to separate them into two groups (patterns): pathology and non-pathology. The geometric information used for recognizing the pattern of the points is extracted via Principal Component Analysis (PCA) in user-specified neighborhoods across the whole point cloud. The PCA yields the normal vector at each point of the cloud. Two tests that operate separately examine local and global geometric criteria among the points and conclude which of them should be categorized as pathology. The proposed algorithm was tested on parts of the masonry of the Gazi Evrenos Baths, located in the city of Giannitsa in Northern Greece.
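A minimal sketch of the neighborhood PCA step that yields per-point normals and a local surface-variation cue (neighborhood size is illustrative; the pathology tests themselves are not those of the paper):

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=20):
    """PCA in a k-nearest-neighbour neighbourhood of each point: the
    eigenvector of the local covariance with the smallest eigenvalue is
    taken as the surface normal, and the smallest eigenvalue (normalised)
    is a rough 'surface variation' cue that deviates on cracks/fissures."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points, dtype=float)
    variation = np.empty(len(points))
    for i, nbrs in enumerate(idx):
        cov = np.cov(points[nbrs].T)
        w, v = np.linalg.eigh(cov)          # eigenvalues in ascending order
        normals[i] = v[:, 0]
        variation[i] = w[0] / max(w.sum(), 1e-12)
    return normals, variation
```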
NASA Astrophysics Data System (ADS)
Brovelli, A.; Robinson, C. E.; Barry, D. A.; Gerhard, J.
2009-12-01
Enhanced reductive dechlorination is a viable technology for in situ remediation of chlorinated solvent DNAPL source areas. Although in recent years increased understanding of this technology has led to more rapid dechlorination rates, complete dechlorination can be hindered by unfavorable conditions. Hydrochloric acid produced from dechlorination and organic acids generated from electron donor fermentation can lead to significant groundwater acidification. Adverse pH conditions can inhibit the activity of dehalogenating microorganisms and thus slow or stall the remediation process. The extent of acidification likely to occur at a contaminated site depends on a number of factors including (1) the extent of dechlorination, (2) the pH-sensitivity of dechlorinating bacteria, and (3) the geochemical composition of the soil and water, in particular the soil’s natural buffering capacity. The substantial mass of solvents available for dechlorination when treating DNAPL source zones means that these applications are particularly susceptible to acidification. In this study a reactive transport biogeochemical model was developed to investigate the chemical and physical parameters that control the build-up of acidity and subsequent remediation efficiency. The model accounts for the site water chemistry, mineral precipitation and dissolution kinetics, electron donor fermentation, gas phase formation, competing electron-accepting processes (e.g., sulfate and iron reduction) and the sensitivity of microbial processes to pH. Confidence in the model was achieved by simulating a well-documented field study, for which the 2-D field scale model was able to reproduce long-term variations of pH, and the concurrent build up of reaction products. Sensitivity analyses indicated the groundwater flow velocity is able to reduce acidity build-up when the rate of advection is comparable or larger than the rate of dechlorination. The extent of pH change is highly dependent on the presence of calcite in soil, the availability of competing electron acceptors (in particular dissolved sulfates) and the efficiency with which microbes utilize electron donor. This work is part of SABRE (Source Area BioREmediation), a collaborative international research project that aimed to evaluate and improve enhanced bioremediation of chlorinated solvent source zones.
A computational- And storage-cloud for integration of biodiversity collections
Matsunaga, A.; Thompson, A.; Figueiredo, R. J.; Germain-Aubrey, C.C; Collins, M.; Beeman, R.S; Macfadden, B.J.; Riccardi, G.; Soltis, P.S; Page, L. M.; Fortes, J.A.B
2013-01-01
A core mission of the Integrated Digitized Biocollections (iDigBio) project is the building and deployment of a cloud computing environment customized to support the digitization workflow and integration of data from all U.S. nonfederal biocollections. iDigBio chose to use cloud computing technologies to deliver a cyberinfrastructure that is flexible, agile, resilient, and scalable to meet the needs of the biodiversity community. In this context, this paper describes the integration of open source cloud middleware, applications, and third party services using standard formats, protocols, and services. In addition, this paper demonstrates the value of the digitized information from collections in a broader scenario involving multiple disciplines.
Construction and application of Red5 cluster based on OpenStack
NASA Astrophysics Data System (ADS)
Wang, Jiaqing; Song, Jianxin
2017-08-01
With the application and development of cloud computing technology in various fields, the resource utilization rate of data centers has improved markedly, and systems built on cloud computing platforms have also improved in scalability and stability. In the traditional deployment, Red5 cluster resource utilization is low and system stability is poor. This paper uses the efficient resource allocation capabilities of cloud computing to build a Red5 server cluster based on OpenStack. Multimedia applications can be published to the Red5 cloud server cluster. The system achieves flexible provisioning of computing resources and also greatly improves the stability and service efficiency of the cluster.
Cloudbus Toolkit for Market-Oriented Cloud Computing
NASA Astrophysics Data System (ADS)
Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian
This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kollias, Pavlos
2017-04-23
With the vast upgrades to the ARM program radar measurement capabilities in 2010 and beyond, our ability to probe the 3D structure of clouds and associated precipitation has increased dramatically. This project builds on the PI's and co-I's expertise in the analysis of radar observations. The first research thrust aims to document the 3D morphological (as depicted by the radar reflectivity structure) and 3D dynamical (cloud-scale eddies) structure of boundary layer clouds. Unraveling the 3D dynamical structure of stratocumulus and shallow cumulus clouds requires decomposition of the environmental wind contribution and particle sedimentation velocity from the observed radial Doppler velocity. The second thrust proposes to unravel the mechanism of cumulus entrainment (location, scales) and its impact on microphysics utilizing radar measurements from the vertically pointing and new scanning radars at the ARM sites. The third research thrust requires the development of a cloud-tracking algorithm that monitors the properties of clouds.
3-D Object Recognition from Point Cloud Data
NASA Astrophysics Data System (ADS)
Smith, W.; Walker, A. S.; Zhang, B.
2011-09-01
The market for real-time 3-D mapping includes not only traditional geospatial applications but also navigation of unmanned autonomous vehicles (UAVs). Massively parallel processes such as graphics processing unit (GPU) computing make real-time 3-D object recognition and mapping achievable. Geospatial technologies such as digital photogrammetry and GIS offer advanced capabilities to produce 2-D and 3-D static maps using UAV data. The goal is to develop real-time UAV navigation through increased automation. It is challenging for a computer to identify a 3-D object such as a car, a tree or a house, yet automatic 3-D object recognition is essential to increasing the productivity of geospatial data such as 3-D city site models. In the past three decades, researchers have used radiometric properties to identify objects in digital imagery with limited success, because these properties vary considerably from image to image. Consequently, our team has developed software that recognizes certain types of 3-D objects within 3-D point clouds. Although our software is developed for modeling, simulation and visualization, it has the potential to be valuable in robotics and UAV applications. The locations and shapes of 3-D objects such as buildings and trees are easily recognizable by a human from a brief glance at a representation of a point cloud such as terrain-shaded relief. The algorithms to extract these objects have been developed and require only the point cloud and minimal human inputs such as a set of limits on building size and a request to turn on a squaring option. The algorithms use both digital surface model (DSM) and digital elevation model (DEM), so software has also been developed to derive the latter from the former. The process continues through the following steps: identify and group 3-D object points into regions; separate buildings and houses from trees; trace region boundaries; regularize and simplify boundary polygons; construct complex roofs. Several case studies have been conducted using a variety of point densities, terrain types and building densities. The results have been encouraging. More work is required for better processing of, for example, forested areas, buildings with sides that are not at right angles or are not straight, and single trees that impinge on buildings. Further work may also be required to ensure that the buildings extracted are of fully cartographic quality. A first version will be included in production software later in 2011. In addition to the standard geospatial applications and the UAV navigation, the results have a further advantage: since LiDAR data tends to be accurately georeferenced, the building models extracted can be used to refine image metadata whenever the same buildings appear in imagery for which the GPS/IMU values are poorer than those for the LiDAR.
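A minimal sketch of the DSM/DEM step described above: subtracting the bare-earth DEM from the DSM and thresholding isolates candidate object cells, which are then grouped into regions for later separation into buildings and trees (threshold and function names are illustrative, not the production software's):

```python
import numpy as np
from scipy import ndimage

def object_regions(dsm, dem, min_height=2.5):
    """Normalised-DSM thresholding: raster cells rising more than
    `min_height` metres above the bare-earth DEM are kept as candidate
    3-D objects (buildings, trees) and grouped into connected regions."""
    ndsm = dsm - dem
    mask = ndsm > min_height
    labels, n_regions = ndimage.label(mask)  # 4-connected grouping by default
    return labels, n_regions
```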
Sensor data fusion for textured reconstruction and virtual representation of alpine scenes
NASA Astrophysics Data System (ADS)
Häufel, Gisela; Bulatov, Dimitri; Solbrig, Peter
2017-10-01
The concept of remote sensing is to provide information about a wide-range area without making physical contact with this area. If, additionally to satellite imagery, images and videos taken by drones provide a more up-to-date data at a higher resolution, or accurate vector data is downloadable from the Internet, one speaks of sensor data fusion. The concept of sensor data fusion is relevant for many applications, such as virtual tourism, automatic navigation, hazard assessment, etc. In this work, we describe sensor data fusion aiming to create a semantic 3D model of an extremely interesting yet challenging dataset: An alpine region in Southern Germany. A particular challenge of this work is that rock faces including overhangs are present in the input airborne laser point cloud. The proposed procedure for identification and reconstruction of overhangs from point clouds comprises four steps: Point cloud preparation, filtering out vegetation, mesh generation and texturing. Further object types are extracted in several interesting subsections of the dataset: Building models with textures from UAV (Unmanned Aerial Vehicle) videos, hills reconstructed as generic surfaces and textured by the orthophoto, individual trees detected by the watershed algorithm, as well as the vector data for roads retrieved from openly available shapefiles and GPS-device tracks. We pursue geo-specific reconstruction by assigning texture and width to roads of several pre-determined types and modeling isolated trees and rocks using commercial software. For visualization and simulation of the area, we have chosen the simulation system Virtual Battlespace 3 (VBS3). It becomes clear that the proposed concept of sensor data fusion allows a coarse reconstruction of a large scene and, at the same time, an accurate and up-to-date representation of its relevant subsections, in which simulation can take place.
NASA Technical Reports Server (NTRS)
Spann, J.; Germany, G.; Swift, W.; Parks, G.; Brittnacher, M.; Elsen, R.
1997-01-01
The observed precipitating electron energy between 0130 UT and 0400 UT on January 10, 1997, indicates that a more energetic precipitating electron population appears in the 1800-2200 MLT sector of the auroral oval at about 0300 UT. This increase in energy occurs after the initial shock of the magnetic cloud reaches the Earth (0114 UT) and after faint but dynamic polar cap precipitation has been cleared out. The more energetic population is observed to remain rather constant in MLT through the onset of auroral activity (0330 UT) and to the end of the Polar spacecraft apogee pass. Data from the Ultraviolet Imager LBH-long and LBH-short images are used to quantify the average energy of the precipitating auroral electrons. The Wind spacecraft, located about 100 RE upstream, monitored the IMF and plasma parameters during the passage of the cloud. The effects of oblique-angle viewing are included in the analysis. Suggestions as to the source of this hot electron population will be presented.
Search for water and life's building blocks in the Universe: An Introduction
NASA Astrophysics Data System (ADS)
Kwok, Sun
Water and organics are commonly believed to be the essential ingredients for life on Earth. The development of infrared and submillimeter observational techniques has resulted in the detection of water in circumstellar envelopes, interstellar clouds, comets, asteroids, planetary satellites and the Sun. Complex organics have also been found in stellar ejecta, diffuse and molecular clouds, meteorites, interplanetary dust particles, comets and planetary satellites. In this Focus Meeting, we will discuss the origin, distribution, and detection of water and other building blocks of life, both inside and outside of the Solar System. The possible role of extraterrestrial organics and water in the origin of life on Earth will also be discussed.
Possible origin and roles of nano-porosity in ZrO2 scales for hydrogen pick-up in Zr alloys
NASA Astrophysics Data System (ADS)
Lindgren, Mikaela; Geers, Christine; Panas, Itai
2017-08-01
A mechanistic understanding of Wagnerian build-up and subsequent non-Wagnerian break-down of barrier oxide upon oxidation of zirconium alloys by water is reiterated. Hydrogen assisted build-up of nano-porosity is addressed. Growth of sub-nanometer wide stalactitic pores owing to increasing aggregation of neutral oxygen vacancies offering a means to permeate hydrogen into the alloy is explored by density functional theory. The Wagnerian channel utilizes charge separation allowing charged oxygen vacancies and electrons to move separately from nominal anode to nominal cathode. This process becomes increasingly controlled by the charging of the barrier oxide resulting in sub-parabolic rate law for oxide growth. The break-down of the barrier oxide is understood to be preceded by avalanching hydrogen pick-up in the alloy. Pore mediated diffusion allows water to effectively short circuit the barrier oxide.
Analysis of the Security and Privacy Requirements of Cloud-Based Electronic Health Records Systems
Fernández, Gonzalo; López-Coronado, Miguel
2013-01-01
Background The Cloud Computing paradigm offers eHealth systems the opportunity to enhance the features and functionality that they offer. However, moving patients’ medical information to the Cloud implies several risks in terms of the security and privacy of sensitive health records. In this paper, the risks of hosting Electronic Health Records (EHRs) on the servers of third-party Cloud service providers are reviewed. To protect the confidentiality of patient information and facilitate the process, some suggestions for health care providers are made. Moreover, security issues that Cloud service providers should address in their platforms are considered. Objective To show that, before moving patient health records to the Cloud, security and privacy concerns must be considered by both health care providers and Cloud service providers. Security requirements of a generic Cloud service provider are analyzed. Methods To study the latest in Cloud-based computing solutions, bibliographic material was obtained mainly from Medline sources. Furthermore, direct contact was made with several Cloud service providers. Results Some of the security issues that should be considered by both Cloud service providers and their health care customers are role-based access, network security mechanisms, data encryption, digital signatures, and access monitoring. Furthermore, to guarantee the safety of the information and comply with privacy policies, the Cloud service provider must be compliant with various certifications and third-party requirements, such as SAS70 Type II, PCI DSS Level 1, ISO 27001, and the US Federal Information Security Management Act (FISMA). Conclusions Storing sensitive information such as EHRs in the Cloud means that precautions must be taken to ensure the safety and confidentiality of the data. A relationship built on trust with the Cloud service provider is essential to ensure a transparent process. Cloud service providers must make certain that all security mechanisms are in place to avoid unauthorized access and data breaches. Patients must be kept informed about how their data are being managed. PMID:23965254
Analysis of the security and privacy requirements of cloud-based electronic health records systems.
Rodrigues, Joel J P C; de la Torre, Isabel; Fernández, Gonzalo; López-Coronado, Miguel
2013-08-21
The Cloud Computing paradigm offers eHealth systems the opportunity to enhance the features and functionality that they offer. However, moving patients' medical information to the Cloud implies several risks in terms of the security and privacy of sensitive health records. In this paper, the risks of hosting Electronic Health Records (EHRs) on the servers of third-party Cloud service providers are reviewed. To protect the confidentiality of patient information and facilitate the process, some suggestions for health care providers are made. Moreover, security issues that Cloud service providers should address in their platforms are considered. To show that, before moving patient health records to the Cloud, security and privacy concerns must be considered by both health care providers and Cloud service providers. Security requirements of a generic Cloud service provider are analyzed. To study the latest in Cloud-based computing solutions, bibliographic material was obtained mainly from Medline sources. Furthermore, direct contact was made with several Cloud service providers. Some of the security issues that should be considered by both Cloud service providers and their health care customers are role-based access, network security mechanisms, data encryption, digital signatures, and access monitoring. Furthermore, to guarantee the safety of the information and comply with privacy policies, the Cloud service provider must be compliant with various certifications and third-party requirements, such as SAS70 Type II, PCI DSS Level 1, ISO 27001, and the US Federal Information Security Management Act (FISMA). Storing sensitive information such as EHRs in the Cloud means that precautions must be taken to ensure the safety and confidentiality of the data. A relationship built on trust with the Cloud service provider is essential to ensure a transparent process. Cloud service providers must make certain that all security mechanisms are in place to avoid unauthorized access and data breaches. Patients must be kept informed about how their data are being managed.
Hybrid cloud: bridging of private and public cloud computing
NASA Astrophysics Data System (ADS)
Aryotejo, Guruh; Kristiyanto, Daniel Y.; Mufadhol
2018-05-01
Cloud computing has quickly emerged as a promising paradigm in recent years, especially for the business sector. Through cloud service providers, cloud computing is widely used by Information Technology (IT) based startup companies to grow their business. However, the level of awareness of data security issues in most businesses is low, since some Cloud Service Providers (CSPs) could decrypt their data. The Hybrid Cloud Deployment Model (HCDM) is characterized as open source and is one of the more secure cloud computing models, so HCDM may address these data security issues. The objective of this study is to design, deploy and evaluate an HCDM as Infrastructure as a Service (IaaS). In the implementation, the Metal as a Service (MAAS) engine was used as the base to build the actual server and nodes, followed by installing the vsftpd application, which serves as the FTP server. For comparison with the HCDM, a public cloud was accessed through its public interface. As a result, the design and deployment of the HCDM were completed successfully; in addition to providing good security, the HCDM was able to transfer data significantly faster than the public cloud. To the best of our knowledge, the Hybrid Cloud Deployment Model is one of the more secure cloud computing models due to its open-source character. This study will serve as a basis for future work on the Hybrid Cloud Deployment Model, which may be relevant for addressing major security issues of IT-based startup companies, especially in Indonesia.
NASA Technical Reports Server (NTRS)
Hoadley, A. W.; Porter, A. J.
1991-01-01
The theory and experimental verification of a method of detecting fluid-mass loss, expansion-chamber pressure loss, or excessive vapor build-up in NASA's Airborne Information Management System (AIMS) are presented. The primary purpose of this leak-detection method is to detect the fluid-mass loss before the volume of vapor on the liquid side causes a temperature-critical part to be out of the liquid. The method detects the initial leak after the first 2.5 pct of the liquid mass has been lost, and it can be used for detecting subsequent situations including the leaking of air into the liquid chamber and the subsequent vapor build-up.
Odd cloud in the Ross Sea, Antarctica
NASA Technical Reports Server (NTRS)
2002-01-01
On January 28, 2002, MODIS captured this image of an interesting cloud formation in the boundary waters between Antarctica's Ross Sea and the Southern Ocean. A dragon? A snake? A fish? No, but it is an interesting example of the atmospheric physics of convection. The 'eye' of this dragon-looking cloud is likely a small spot of convection, the process by which hot moist air rises up into the atmosphere, often producing big, fluffy clouds as moisture in the air condenses as it rises into the colder parts of the atmosphere. A false color analysis that shows different kinds of clouds in different colors reveals that the eye is composed of ice crystals while the 'body' is a liquid water cloud. This suggests that the eye is higher up in the atmosphere than the body. The most likely explanation for the eye feature is that the warm, rising air mass had enough buoyancy to punch through the liquid water cloud. As a convective parcel of air rises into the atmosphere, it pushes the colder air that is higher up out of its way. That cold air spills down over the sides of the convective air mass, and in this case has cleared away part of the liquid cloud layer below in the process. This spilling over of cold air from higher up in the atmosphere is the reason why thunderstorms are often accompanied by a cool breeze. Credit: Jacques Descloitres, MODIS Land Rapid Response Team, NASA/GSFC
NASA Astrophysics Data System (ADS)
Marshall, R. A.; Inan, U. S.; Glukhov, V. S.
2010-04-01
A 3-D finite difference time domain model is used to simulate the lightning electromagnetic pulse (EMP) and its interaction with the lower ionosphere. Results agree with the frequently observed, doughnut-shaped optical signature of elves but show that the structure exhibits asymmetry due to the presence of Earth's ambient magnetic field. Furthermore, in-cloud (horizontal) lightning channels produce observable optical emissions without the doughnut shape and, in fact, produce a much stronger optical output for the same channel current. Electron density perturbations associated with elves are also calculated, with contributions from attachment and ionization. Results presented as a function of parameters such as magnetic field direction, dipole current orientation, altitude and amplitude, and ambient ionospheric density profile demonstrate the highly nonlinear nature of the EMP-ionosphere interaction. Ionospheric effects of a sequence of in-cloud discharges are calculated, simulating a burst of in-cloud lightning activity and resulting in large density changes in the overlying ionosphere.
Electronic Health Records in the Cloud: Improving Primary Health Care Delivery in South Africa.
Cilliers, Liezel; Wright, Graham
2017-01-01
In South Africa, the recording of health data is done manually in a paper-based file, while attempts to digitize healthcare records have had limited success. In many countries, Electronic Health Records (EHRs) has developed in silos, with little or no integration between different operational systems. Literature has provided evidence that the cloud can be used to 'leapfrog' some of these implementation issues, but the adoption of this technology in the public health care sector has been very limited. This paper aims to identify the major reasons why the cloud has not been used to implement EHRs for the South African public health care system, and to provide recommendations of how to overcome these challenges. From the literature, it is clear that there are technology, environmental and organisational challenges affecting the implementation of EHRs in the cloud. Four recommendations are provided that can be used by the National Department of Health to implement EHRs making use of the cloud.
Simulating the growth of a charge cloud for a microchannel plate detector
NASA Astrophysics Data System (ADS)
Siwal, Davinder; Wiggins, Blake; Desouza, Romualdo
2015-10-01
Position sensitive microchannel plate (MCP) detectors have a variety of applications in the fields of astronomy, medical imaging, neutron imaging, and ion beam tracking. Recently, a novel approach has been implemented to detect the position of an incident particle. The charge cloud produced by the MCP induces a signal on a wire harp placed between the MCP and an anode. On qualitative grounds it is clear that in this detector the induced signal shape depends on the size of the electron cloud. A detailed study has therefore been performed to investigate the size of the charge cloud within the MCP and its growth as it propagates from the MCP to the anode. A simple model has been developed to calculate the impact of charge repulsion on the growth of the electron cloud. Both the details of the model and its predictions will be presented. Supported by the US DOE NNSA under Award No. DE-NA0002012.
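As a rough illustration only (not the model developed in the abstract), a uniformly charged sphere gives a back-of-the-envelope feel for space-charge-driven growth of an electron cloud between the MCP and the anode; all numbers below are placeholders:

```python
import numpy as np

E_CHARGE = 1.602e-19   # C
E_MASS   = 9.109e-31   # kg
K_COUL   = 8.988e9     # N m^2 / C^2

def cloud_radius_vs_time(n_electrons=1e6, r0=1e-4, t_max=2e-9, dt=1e-12):
    """Toy space-charge expansion: treat the electron cloud as a uniformly
    charged sphere and integrate the motion of an edge electron pushed
    outward by the field of the total enclosed charge (Euler stepping)."""
    r, v = r0, 0.0
    radii = []
    for _ in np.arange(0.0, t_max, dt):
        a = K_COUL * n_electrons * E_CHARGE**2 / (E_MASS * r**2)  # edge-electron acceleration
        v += a * dt
        r += v * dt
        radii.append(r)
    return np.array(radii)
```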
Far-infrared Extinction Mapping of Infrared Dark Clouds
NASA Astrophysics Data System (ADS)
Lim, Wanggi; Tan, Jonathan C.
2014-01-01
Progress in understanding star formation requires detailed observational constraints on the initial conditions, i.e., dense clumps and cores in giant molecular clouds that are on the verge of gravitational instability. Such structures have been studied by their extinction of near-infrared and, more recently, mid-infrared (MIR) background light. It has been somewhat more of a surprise to find that there are regions that appear as dark shadows at far-infrared (FIR) wavelengths as long as ~100 μm! Here we develop analysis methods of FIR images from Spitzer-MIPS and Herschel-PACS that allow quantitative measurements of cloud mass surface density, Σ. The method builds on that developed for MIR extinction mapping by Butler & Tan, in particular involving a search for independently saturated, i.e., very opaque, regions that allow measurement of the foreground intensity. We focus on three massive starless core/clumps in the Infrared Dark Cloud (IRDC) G028.37+00.07, deriving mass surface density maps from 3.5 to 70 μm. A by-product of this analysis is the measurement of the spectral energy distribution of the diffuse foreground emission. The lower opacity at 70 μm allows us to probe to higher Σ values, up to ~1 g cm-2 in the densest parts of the core/clumps. Comparison of the Σ maps at different wavelengths constrains the shape of the MIR-FIR dust opacity law in IRDCs. We find that it is most consistent with the thick ice mantle models of Ossenkopf & Henning. There is tentative evidence for grain ice mantle growth as one goes from lower to higher Σ regions.
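The extinction-mapping relation generally takes the form I_obs = I_bg · exp(-κ_ν Σ) + I_fore, so the mass surface density follows from the observed, foreground and background intensities. A hedged sketch of that inversion (symbols and the clipping floor are illustrative, not the paper's exact calibration):

```python
import numpy as np

def surface_density(i_obs, i_fore, i_bg, kappa_cm2_per_g):
    """Invert i_obs = i_bg * exp(-kappa * Sigma) + i_fore for Sigma:
    Sigma = -ln((i_obs - i_fore) / i_bg) / kappa   [g cm^-2].
    Works on scalars or image arrays of intensities."""
    tau = -np.log(np.clip((i_obs - i_fore) / i_bg, 1e-6, None))
    return tau / kappa_cm2_per_g
```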
NASA Astrophysics Data System (ADS)
Jiang, Guodong; Fan, Ming; Li, Lihua
2016-03-01
Mammography is the gold standard for breast cancer screening, reducing mortality by about 30%. The application of a computer-aided detection (CAD) system to assist a single radiologist is important to further improve mammographic sensitivity for breast cancer detection. In this study, the design and realization of a prototype remote diagnosis system for mammography based on a cloud platform are proposed. To build this system, technologies including medical image information construction, cloud infrastructure and a human-machine diagnosis model were utilized. Specifically, on the one hand, a web platform for remote diagnosis was established with J2EE web technology, and the back end was realized with the Hadoop open-source framework. On the other hand, the storage system was built on Hadoop Distributed File System (HDFS) technology, which enables users to easily develop and run applications on massive data and exploits the advantages of cloud computing: high efficiency, scalability and low cost. In addition, the CAD system was realized through the MapReduce framework. The diagnosis module in this system implements algorithms that fuse machine and human intelligence; specifically, results of diagnoses from doctors' experience and from traditional CAD are combined using a man-machine intelligent fusion model based on Alpha-Integration and a multi-agent algorithm. Finally, applications of this system at different levels of the platform are also discussed. This diagnosis system will be of great importance for balancing health resources, lowering medical expenses and improving diagnostic accuracy in primary medical institutions.
Low cost, high performance processing of single particle cryo-electron microscopy data in the cloud.
Cianfrocco, Michael A; Leschziner, Andres E
2015-05-08
The advent of a new generation of electron microscopes and direct electron detectors has realized the potential of single particle cryo-electron microscopy (cryo-EM) as a technique to generate high-resolution structures. Calculating these structures requires high performance computing clusters, a resource that may be limiting to many likely cryo-EM users. To address this limitation and facilitate the spread of cryo-EM, we developed a publicly available 'off-the-shelf' computing environment on Amazon's elastic cloud computing infrastructure. This environment provides users with single particle cryo-EM software packages and the ability to create computing clusters with 16-480+ CPUs. We tested our computing environment using a publicly available 80S yeast ribosome dataset and estimate that laboratories could determine high-resolution cryo-EM structures for $50 to $1500 per structure within a timeframe comparable to local clusters. Our analysis shows that Amazon's cloud computing environment may offer a viable computing environment for cryo-EM.
A 94 GHz RF Electronics Subsystem for the CloudSat Cloud Profiling Radar
NASA Technical Reports Server (NTRS)
LaBelle, Remi C.; Girard, Ralph; Arbery, Graham
2003-01-01
The CloudSat spacecraft, scheduled for launch in 2004, will carry the 94 GHz Cloud Profiling Radar (CPR) instrument. The design, assembly and test of the flight Radio Frequency Electronics Subsystem (RFES) for this instrument has been completed and is presented here. The RFES consists of an Upconverter (which includes an Exciter and two Drive Amplifiers (DA's)), a Receiver, and a Transmitter Calibrator assembly. Some key performance parameters of the RFES are as follows: dual 100 mW pulse-modulated drive outputs at 94 GHz, overall Receiver noise figure < 5.0 dB, a highly stable W-band noise source to provide knowledge accuracy of Receiver gain of < 0.4 dB over the 2 year mission life, and a W-band peak power detector to monitor the transmitter output power to within 0.5 dB over life. Some recent monolithic microwave integrated circuit (MMIC) designs were utilized which implement the DA's in 0.1 micron GaAs high electron-mobility transistor (HEMT) technology and the Receiver low-noise amplifier (LNA) in 0.1 micron InP HEMT technology.
Cloud computing in pharmaceutical R&D: business risks and mitigations.
Geiger, Karl
2010-05-01
Cloud computing provides information processing power and business services, delivering these services over the Internet from centrally hosted locations. Major technology corporations aim to supply these services to every sector of the economy. Deploying business processes 'in the cloud' requires special attention to the regulatory and business risks assumed when running on both hardware and software that are outside the direct control of a company. The identification of risks at the correct service level allows a good mitigation strategy to be selected. The pharmaceutical industry can take advantage of existing risk management strategies that have already been tested in the finance and electronic commerce sectors. In this review, the business risks associated with the use of cloud computing are discussed, and mitigations achieved through knowledge from securing services for electronic commerce and from good IT practice are highlighted.
Current State of the Art Historic Building Information Modelling
NASA Astrophysics Data System (ADS)
Dore, C.; Murphy, M.
2017-08-01
In an extensive review of existing literature a number of observations were made in relation to the current approaches for recording and modelling existing buildings and environments: Data collection and pre-processing techniques are becoming increasingly automated to allow for near real-time data capture and fast processing of this data for later modelling applications. Current BIM software is almost completely focused on new buildings and has very limited tools and pre-defined libraries for modelling existing and historic buildings. The development of reusable parametric library objects for existing and historic buildings supports modelling with high levels of detail while decreasing the modelling time. Mapping these parametric objects to survey data, however, is still a time-consuming task that requires further research. Promising developments have been made towards automatic object recognition and feature extraction from point clouds for as-built BIM. However, results are currently limited to simple and planar features. Further work is required for automatic accurate and reliable reconstruction of complex geometries from point cloud data. Procedural modelling can provide an automated solution for generating 3D geometries but lacks the detail and accuracy required for most as-built applications in AEC and heritage fields.
Exploring Cloud Computing for Large-scale Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Guang; Han, Binh; Yin, Jian
This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on-demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often just require a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high performance hardware with low latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a system biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.
NASA Astrophysics Data System (ADS)
Gouwens, C.; Dragosavic, M.
The large reserves and increasing use of natural gas as a source of energy have resulted in its storage and transport becoming an urgent problem. Since a liquid of the same mass occupies only a fraction of the volume of a gas, it is economical to store natural gas as a liquid. Liquefied natural gas is stored in insulated tanks and also carried by ship at a temperature of -160 °C to -170 °C. If a serious accident allows the LNG to escape, a gas cloud forms. The consequences of a possible explosion of such a gas cloud are studied. The development of a leak, escape and evaporation, the size and propagation of the gas cloud, the explosive pressures to be expected and the effects on the environment are investigated. Damage to buildings is examined, making use of the preliminary conclusions of the other sub-projects and especially the explosive pressures.
NASA Astrophysics Data System (ADS)
Thau, D.
2017-12-01
For the past seven years, Google has made petabytes of Earth observation data, and the tools to analyze it, freely available to researchers around the world via cloud computing. These data and tools were initially available via Google Earth Engine and are increasingly available on the Google Cloud Platform. We have introduced a number of APIs for both the analysis and presentation of geospatial data that have been successfully used to create impactful datasets and web applications, including studies of global surface water availability, global tree cover change, and crop yield estimation. Each of these projects used the cloud to analyze thousands to millions of Landsat scenes. The APIs support a range of publishing options, from outputting imagery and data for inclusion in papers, to providing tools for full scale web applications that provide analysis tools of their own. Over the course of developing these tools, we have learned a number of lessons about how to build a publicly available cloud platform for geospatial analysis, and about how the characteristics of an API can affect the kinds of impacts a platform can enable. This study will present an overview of how Google Earth Engine works and how Google's geospatial capabilities are extending to Google Cloud Platform. We will provide a number of case studies describing how these platforms, and the data they host, have been leveraged to build impactful decision support tools used by governments, researchers, and other institutions, and we will describe how the available APIs have shaped (or constrained) those tools. [Image Credit: Tyler A. Erickson]
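A minimal sketch against the Earth Engine Python client of the kind of server-side analysis the platform enables (the collection ID, band names and region are illustrative and may differ between catalog versions; authentication may be required before initialization):

```python
# Sketch of a server-side Landsat composite and summary statistic with the
# Earth Engine Python API; only the small result is pulled back to the client.
import ee

ee.Initialize()  # ee.Authenticate() may be needed first in a new environment

region = ee.Geometry.Point(-122.09, 37.42).buffer(5000)
landsat = (ee.ImageCollection('LANDSAT/LC08/C01/T1_SR')   # illustrative collection ID
           .filterBounds(region)
           .filterDate('2016-01-01', '2016-12-31'))

ndvi = landsat.median().normalizedDifference(['B5', 'B4'])  # NIR, red (Landsat 8 naming)
mean_ndvi = ndvi.reduceRegion(reducer=ee.Reducer.mean(),
                              geometry=region, scale=30).getInfo()
print(mean_ndvi)
```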
The Segmentation of Point Clouds with K-Means and ANN (artifical Neural Network)
NASA Astrophysics Data System (ADS)
Kuçak, R. A.; Özdemir, E.; Erol, S.
2017-05-01
Segmentation of point clouds has recently been used in many Geomatics Engineering applications, such as building extraction in urban areas, Digital Terrain Model (DTM) generation and the extraction of roads or urban furniture. Segmentation is the process of dividing point clouds into layers according to their characteristics. The present paper discusses the segmentation of point clouds with K-means and with the self-organizing map (SOM), a type of ANN (Artificial Neural Network) algorithm. Point clouds generated with the photogrammetric method and with a Terrestrial Lidar System (TLS) were segmented according to surface normal, intensity and curvature, and the results were evaluated. LIDAR (Light Detection and Ranging) and photogrammetry are commonly used to obtain point clouds in many remote sensing and geodesy applications; by either method, point clouds can be obtained from terrestrial or airborne systems. In this study, the LIDAR measurements were made with a Leica C10 laser scanner, and for the photogrammetric method the point cloud was obtained from photographs taken from the ground with a 13 MP non-metric camera.
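A minimal sketch of the K-means step over per-point attributes (the feature choice and number of segments are illustrative; a SOM could be trained on the same standardised feature matrix):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def segment_point_cloud(features, n_segments=4, seed=0):
    """K-means segmentation over per-point attributes (e.g. columns for
    normal_x, normal_y, normal_z, intensity, curvature), standardised so
    that no single attribute dominates the Euclidean distance.
    Returns one segment label per point."""
    X = StandardScaler().fit_transform(np.asarray(features, dtype=float))
    return KMeans(n_clusters=n_segments, n_init=10, random_state=seed).fit_predict(X)
```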
Using Deep Learning Model for Meteorological Satellite Cloud Image Prediction
NASA Astrophysics Data System (ADS)
Su, X.
2017-12-01
A satellite cloud image contains much weather information, such as precipitation information. Short-term cloud-movement forecasting is important for precipitation forecasting and is the primary means of typhoon monitoring. Traditional methods mostly use cloud-feature matching and linear extrapolation to predict cloud movement, so nonstationary processes such as inversion and deformation during cloud movement are essentially not considered. Predicting cloud movement promptly and correctly therefore remains a hard task. Because deep learning models perform well in learning spatiotemporal features, we regard cloud image prediction as a spatiotemporal sequence forecasting problem and introduce a deep learning model to solve it. In this research, we use a variant of the Gated Recurrent Unit (GRU) with convolutional structures to handle spatiotemporal features and build an end-to-end model for this forecasting problem. In this model, both the input and output are spatiotemporal sequences. Compared to the Convolutional LSTM (ConvLSTM) model, this model has fewer parameters. We apply the model to GOES satellite data, and it performs well.
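To make the "GRU with convolutional structures" idea concrete, here is a minimal, hypothetical ConvGRU cell in PyTorch; it is not the authors' network, and the channel counts, kernel size, and usage below are illustrative assumptions.

```python
# Minimal ConvGRU cell: the gates of a GRU are computed with 2-D convolutions
# so the hidden state keeps the spatial layout of the cloud image.
import torch
import torch.nn as nn

class ConvGRUCell(nn.Module):
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        pad = k // 2
        self.conv_zr = nn.Conv2d(in_ch + hid_ch, 2 * hid_ch, k, padding=pad)
        self.conv_h = nn.Conv2d(in_ch + hid_ch, hid_ch, k, padding=pad)

    def forward(self, x, h):
        zr = torch.sigmoid(self.conv_zr(torch.cat([x, h], dim=1)))
        z, r = zr.chunk(2, dim=1)                      # update and reset gates
        h_tilde = torch.tanh(self.conv_h(torch.cat([x, r * h], dim=1)))
        return (1 - z) * h + z * h_tilde               # new hidden state

cell = ConvGRUCell(in_ch=1, hid_ch=16)
x = torch.zeros(4, 1, 64, 64)    # batch of single-channel cloud images
h = torch.zeros(4, 16, 64, 64)   # initial hidden state
h = cell(x, h)                   # one time step of the recurrence
```

Stacking such cells over the input image sequence and decoding the final hidden states into future frames gives the kind of end-to-end spatiotemporal forecaster the abstract describes.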
NASA Astrophysics Data System (ADS)
Fernandez Galarreta, J.; Kerle, N.; Gerke, M.
2015-06-01
Structural damage assessment is critical after disasters but remains a challenge. Many studies have explored the potential of remote sensing data, but limitations of vertical data persist. Oblique imagery has been identified as more useful, though the multi-angle imagery also adds a new dimension of complexity. This paper addresses damage assessment based on multi-perspective, overlapping, very high resolution oblique images obtained with unmanned aerial vehicles (UAVs). 3-D point-cloud assessment for the entire building is combined with detailed object-based image analysis (OBIA) of façades and roofs. This research focuses not on automatic damage assessment, but on creating a methodology that supports the often ambiguous classification of intermediate damage levels, aiming at producing comprehensive per-building damage scores. We identify completely damaged structures in the 3-D point cloud, and for all other cases provide the OBIA-based damage indicators to be used as auxiliary information by damage analysts. The results demonstrate the usability of the 3-D point-cloud data to identify major damage features. Also the UAV-derived and OBIA-processed oblique images are shown to be a suitable basis for the identification of detailed damage features on façades and roofs. Finally, we also demonstrate the possibility of aggregating the multi-perspective damage information at building level.
Interaction of a neutral cloud moving through a magnetized plasma
NASA Technical Reports Server (NTRS)
Goertz, C. K.; Lu, G.
1990-01-01
Current collection by outgassing probes in motion relative to a magnetized plasma may be significantly affected by plasma processes that cause electron heating and cross-field transport. Simulations of a neutral gas cloud moving across a static magnetic field are discussed. The authors treat a low-beta plasma and use a 2-1/2 D electrostatic code linked with the authors' Plasma and Neutral Interaction Code (PANIC). This study emphasizes the understanding of the interface between the neutral gas cloud and the surrounding plasma, where electrons are heated and can diffuse across field lines. When ionization or charge-exchange collisions occur, a sheath-like structure is formed at the surface of the neutral gas. In that region the cross-field component of the electric field causes the electrons to E×B drift with a velocity of the order of the neutral gas velocity times the square root of the ion-to-electron mass ratio. In addition, a diamagnetic drift of the electrons occurs due to the number density and temperature inhomogeneity in the front. These drift currents excite lower-hybrid waves with wave k-vectors almost perpendicular to the neutral flow and magnetic field, again resulting in electron heating. The thermal electron current is significantly enhanced due to this heating.
NASA Astrophysics Data System (ADS)
Chow, L.; Fai, S.
2017-08-01
The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets - the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.
NASA Technical Reports Server (NTRS)
Genkova, I.; Long, C. N.; Heck, P. W.; Minnis, P.
2003-01-01
One of the primary Atmospheric Radiation Measurement (ARM) Program objectives is to obtain measurements applicable to the development of models for better understanding of radiative processes in the atmosphere. We address this goal by building a three-dimensional (3D) characterization of the cloud structure and properties over the ARM Southern Great Plains (SGP). We take the approach of juxtaposing the cloud properties as retrieved from independent satellite and ground-based retrievals, and looking at the statistics of the cloud field properties. Once these retrievals are well understood, they will be used to populate the 3D characterization database. As a first step we determine the relationship between surface fractional sky cover and satellite viewing-angle-dependent cloud fraction (CF). We elaborate on the agreement by intercomparing optical depth (OD) datasets from satellite and ground using available retrieval algorithms in relation to the CF, cloud height, multi-layer cloud presence, and solar zenith angle (SZA). For the SGP Central Facility, where output from the active remote sensing cloud layer (ARSCL) value-added product (VAP) is available, we study the uncertainty of satellite-estimated cloud heights and evaluate the impact of this uncertainty on radiative studies.
NASA Astrophysics Data System (ADS)
LIU, J.; Bi, Y.; Duan, S.; Lu, D.
2017-12-01
It is well known that cloud characteristics, such as top and base heights, the layering structure of microphysical parameters, spatial coverage, and temporal duration, are very important factors influencing both the radiation budget and its vertical partitioning, as well as the hydrological cycle through precipitation. Cloud structure, its statistical distribution, and typical values also vary geographically and seasonally. Ka-band radar is a powerful tool for obtaining these parameters around the world, an example being the ARM cloud radar in Oklahoma, US. Since 2006, CloudSat, part of NASA's A-Train satellite constellation, has continuously observed cloud structure with global coverage, but it monitors clouds over a given local site only twice a day, at the same local times. Using the IAP Ka-band Doppler radar, which has been operating continuously since early 2013 on the roof of the IAP building in Beijing, we obtained the statistical characteristics of clouds, including cloud layering, cloud top and base heights, and the thickness of each cloud layer and its distribution; monthly, seasonal, and diurnal variations were analyzed, and a statistical analysis of cloud reflectivity profiles was also made. The analysis covers both non-precipitating and precipitating clouds. Some preliminary comparisons of the results with CloudSat/CALIPSO products for the same period and area are also made.
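For readers unfamiliar with how layer statistics are derived from a profiling cloud radar, the following is a minimal sketch (not the IAP processing chain) that thresholds a single reflectivity profile to obtain cloud base, top, and thickness per layer; the detection threshold and range-gate spacing are assumed values.

```python
# Hypothetical sketch: derive cloud layers (base, top, thickness) from one
# vertical profile of radar reflectivity, using a simple detection threshold.
import numpy as np

def cloud_layers(reflectivity_dbz, gate_height_m, threshold_dbz=-35.0):
    """Return a list of (base_m, top_m, thickness_m) for contiguous cloudy gates."""
    cloudy = reflectivity_dbz > threshold_dbz
    layers, start = [], None
    for i, flag in enumerate(cloudy):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            base, top = start * gate_height_m, i * gate_height_m
            layers.append((base, top, top - base))
            start = None
    if start is not None:
        base, top = start * gate_height_m, len(cloudy) * gate_height_m
        layers.append((base, top, top - base))
    return layers

profile = np.full(400, -60.0)   # 400 range gates of clear air
profile[80:120] = -20.0         # a low cloud layer
profile[300:340] = -30.0        # a high cloud layer
print(cloud_layers(profile, gate_height_m=30.0))
```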
Tourism guide cloud service quality: What actually delights customers?
Lin, Shu-Ping; Yang, Chen-Lung; Pi, Han-Chung; Ho, Thao-Minh
2016-01-01
The emergence of advanced IT and cloud services has beneficially supported the information-intensive tourism industry, while simultaneously causing intense competition in attracting customers through efficient service platforms. In response, numerous nations have implemented cloud platforms to provide value-added sightseeing information and personalized, intelligent service experiences. Despite these efforts, customers' actual perspectives have not yet been sufficiently understood. To bridge this gap, this study investigates which aspects of tourism cloud services actually delight customers and drive satisfaction and loyalty. 336 valid survey questionnaires were analyzed using structural equation modeling. The results confirm positive impacts of function quality, enjoyment, multiple visual aids, and information quality on customer satisfaction, as well as of enjoyment and satisfaction on use loyalty. The findings provide helpful references on customer use behavior for enhancing cloud service quality and achieving better organizational competitiveness.
DOE Office of Scientific and Technical Information (OSTI.GOV)
China, Swarup; Kulkarni, Gourihar; Scarnato, Barbara V.
Freshly emitted soot particles are fractal-like aggregates, but atmospheric processing often transforms their morphology. Morphology of soot particles plays an important role in determining their optical properties, life cycle and hence their effect on Earth's radiative balance. However, little is known about the morphology of soot particles that participated in cold cloud processes. Here we report results from laboratory experiments that simulate cold cloud processing of diesel soot particles by allowing them to form supercooled droplets and ice crystals at -20 and -40°C, respectively. Electron microscopy revealed that soot residuals from ice crystals were more compact (roundness ~0.55) than those from supercooled droplets (roundness ~0.45), while nascent soot particles were the least compact (roundness ~0.41). Optical simulations using the discrete dipole approximation showed that the more compact structure enhances soot single scattering albedo by a factor up to 1.4, thereby reducing the top-of-the-atmosphere direct radiative forcing by ~63%. Lastly, these results underscore that climate models should consider the morphological evolution of soot particles due to cold cloud processing to improve the estimate of direct radiative forcing of soot.
China, Swarup; Kulkarni, Gourihar; Scarnato, Barbara V.; ...
2015-11-01
Freshly emitted soot particles are fractal-like aggregates, but atmospheric processing often transforms their morphology. Morphology of soot particles plays an important role in determining their optical properties, life cycle and hence their effect on Earth's radiative balance. However, little is known about the morphology of soot particles that participated in cold cloud processes. Here we report results from laboratory experiments that simulate cold cloud processing of diesel soot particles by allowing them to form supercooled droplets and ice crystals at -20 and -40°C, respectively. Electron microscopy revealed that soot residuals from ice crystals were more compact (roundness ~0.55) than those from supercooled droplets (roundness ~0.45), while nascent soot particles were the least compact (roundness ~0.41). Optical simulations using the discrete dipole approximation showed that the more compact structure enhances soot single scattering albedo by a factor up to 1.4, thereby reducing the top-of-the-atmosphere direct radiative forcing by ~63%. Lastly, these results underscore that climate models should consider the morphological evolution of soot particles due to cold cloud processing to improve the estimate of direct radiative forcing of soot.
A Neural Network Approach to Infer Optical Depth of Thick Ice Clouds at Night
NASA Technical Reports Server (NTRS)
Minnis, P.; Hong, G.; Sun-Mack, S.; Chen, Yan; Smith, W. L., Jr.
2016-01-01
One of the roadblocks to continuously monitoring cloud properties is the tendency of clouds to become optically black at cloud optical depths (COD) of 6 or less. This constraint dramatically reduces the quantitative information content at night. A recent study found that because of their diffuse nature, ice clouds remain optically gray, to some extent, up to COD of 100 at certain wavelengths. Taking advantage of this weak dependency and the availability of COD retrievals from CloudSat, an artificial neural network algorithm was developed to estimate COD values up to 70 from common satellite imager infrared channels. The method was trained using matched 2007 CloudSat and Aqua MODIS data and is tested using similar data from 2008. The results show a significant improvement over the use of default values at night with high correlation. This paper summarizes the results and suggests paths for future improvement.
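As a hedged illustration of the regression idea (training a network to map infrared brightness temperatures to COD), the following toy sketch uses entirely synthetic data and a small scikit-learn MLP; the channel sensitivities and network size are invented and do not reflect the operational algorithm or the CloudSat/MODIS training set.

```python
# Hypothetical sketch: regress cloud optical depth (COD) from a few infrared
# brightness temperatures with a small neural network. All data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 20_000
cod = rng.uniform(1.0, 70.0, n)                      # "true" optical depth
# Fake IR channels whose brightness temperatures depend weakly on COD.
bt = np.column_stack([
    250.0 - 0.3 * cod + rng.normal(0, 1.5, n),       # e.g. an 8.5 um channel
    245.0 - 0.4 * cod + rng.normal(0, 1.5, n),       # e.g. an 11 um channel
    240.0 - 0.5 * cod + rng.normal(0, 1.5, n),       # e.g. a 12 um channel
])

x_train, x_test, y_train, y_test = train_test_split(bt, cod, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
model.fit(x_train, y_train)
print("R^2 on held-out data:", round(model.score(x_test, y_test), 3))
```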
An electron microscope for the aberration-corrected era.
Krivanek, O L; Corbin, G J; Dellby, N; Elston, B F; Keyse, R J; Murfitt, M F; Own, C S; Szilagyi, Z S; Woodruff, J W
2008-02-01
Improved resolution made possible by aberration correction has greatly increased the demands on the performance of all parts of high-end electron microscopes. In order to meet these demands, we have designed and built an entirely new scanning transmission electron microscope (STEM). The microscope includes a flexible illumination system that allows the properties of its probe to be changed on-the-fly, a third-generation aberration corrector which corrects all geometric aberrations up to fifth order, an ultra-responsive yet stable five-axis sample stage, and a flexible configuration of optimized detectors. The microscope features many innovations, such as a modular column assembled from building blocks that can be stacked in almost any order, in situ storage and cleaning facilities for up to five samples, computer-controlled loading of samples into the column, and self-diagnosing electronics. The microscope construction is described, and examples of its capabilities are shown.
Slicing Method for curved façade and window extraction from point clouds
NASA Astrophysics Data System (ADS)
Iman Zolanvari, S. M.; Laefer, Debra F.
2016-09-01
Laser scanning technology is a fast and reliable method to survey structures. However, the automatic conversion of such data into solid models for computation remains a major challenge, especially where non-rectilinear features are present. Since openings and the overall dimensions of buildings are the most critical elements in computational models for structural analysis, this article introduces the Slicing Method as a new, computationally efficient method for extracting overall façade and window boundary points and reconstructing a façade into a geometry compatible with computational modelling. After finding a principal plane, the technique slices a façade into limited portions, with each slice representing a unique, imaginary section passing through a building. This is done along a façade's principal axes to segregate window and door openings from structural portions of the load-bearing masonry walls. The method detects each opening area's boundaries, as well as the overall boundary of the façade, in part by using a one-dimensional projection to accelerate processing. Slices were optimised as 14.3 slices per vertical metre of building and 25 slices per horizontal metre of building, irrespective of building configuration or complexity. The proposed procedure was validated by its application to three highly decorative, historic brick buildings. Accuracy in excess of 93% was achieved with no manual intervention on highly complex buildings and nearly 100% on simple ones. Furthermore, computational times were less than 3 sec for data sets up to 2.6 million points, while similar existing approaches required more than 16 hr for such datasets.
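A highly simplified, hypothetical version of one slice of this procedure is sketched below: points in a thin horizontal slice are projected onto the façade's principal axis (the one-dimensional projection mentioned above), and gaps in point coverage are reported as candidate openings; the bin width and minimum gap size are assumptions, not the paper's parameters.

```python
# Hypothetical sketch of the slicing idea: project a thin horizontal slice of
# a facade point cloud onto the facade axis and treat gaps in coverage as
# candidate openings (windows/doors).
import numpy as np

def openings_in_slice(x_along_facade, bin_width=0.05, min_gap=0.4):
    """Return (start, end) intervals along the facade where no points fall."""
    lo, hi = x_along_facade.min(), x_along_facade.max()
    n_bins = int(np.ceil((hi - lo) / bin_width))
    counts, edges = np.histogram(x_along_facade, bins=n_bins, range=(lo, hi))
    gaps, start = [], None
    for i, c in enumerate(counts):
        if c == 0 and start is None:
            start = edges[i]
        elif c > 0 and start is not None:
            if edges[i] - start >= min_gap:
                gaps.append((start, edges[i]))
            start = None
    return gaps

# Synthetic slice: wall points everywhere except a 1 m wide window at 3-4 m.
x = np.concatenate([np.random.uniform(0, 3, 3000), np.random.uniform(4, 10, 7000)])
print(openings_in_slice(x))
```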
A support architecture for reliable distributed computing systems
NASA Technical Reports Server (NTRS)
Dasgupta, Partha; Leblanc, Richard J., Jr.
1988-01-01
The Clouds project is well underway toward its goal of building a unified distributed operating system supporting the object model. The operating system design uses the object concept to structure software at all levels of the system. The basic operating system has been developed, and work is in progress to build a usable system.
Generic-distributed framework for cloud services marketplace based on unified ontology.
Hasan, Samer; Valli Kumari, V
2017-11-01
Cloud computing is a pattern for delivering ubiquitous, on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find an adequate service. Unfortunately, general-purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud service discovery and selection process and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms: a dominant and recessive attributes algorithm borrowed from gene science, and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.
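The paper's ontology-based matching is not reproduced here, but the general flavour of similarity-based service selection can be sketched with plain TF-IDF cosine similarity; the service descriptions and query below are invented.

```python
# Hypothetical sketch: rank cloud service descriptions against a consumer
# request using TF-IDF cosine similarity. A generic illustration of
# similarity-based service selection, not the unified-ontology algorithm.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

services = {
    "svcA": "object storage with pay as you use pricing and high availability",
    "svcB": "managed relational database service with automatic backups",
    "svcC": "virtual machines with GPU acceleration for machine learning",
}
query = "cheap storage for backups billed per use"

vec = TfidfVectorizer()
matrix = vec.fit_transform(list(services.values()) + [query])
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
ranked = sorted(zip(services, scores), key=lambda s: s[1], reverse=True)
print(ranked)
```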
If It's in the Cloud, Get It on Paper: Cloud Computing Contract Issues
ERIC Educational Resources Information Center
Trappler, Thomas J.
2010-01-01
Much recent discussion has focused on the pros and cons of cloud computing. Some institutions are attracted to cloud computing benefits such as rapid deployment, flexible scalability, and low initial start-up cost, while others are concerned about cloud computing risks such as those related to data location, level of service, and security…
Sulfur Upwelling off the African Coast
NASA Technical Reports Server (NTRS)
2002-01-01
Though these aquamarine clouds in the waters off the coast of northern Namibia may look like algae blooms, they are in fact clouds of sulfur produced by anaerobic bacteria on the ocean's floor. This image of the sulfur-filled water was taken on April 24, 2002, by the Sea-viewing Wide Field-of-View Sensor (SeaWiFS), flying aboard the Orbview-2 satellite. The anaerobic bacteria (bacteria that can live without oxygen) feed upon algae carcasses that exist in abundance on the ocean's floor off of Namibia. As the bacteria ingest the algae husks, they produce hydrogen sulfide, which slowly builds up in the sea-floor sediments. Eventually, the hydrogen sulfide reaches the point where the sediment can no longer contain it, and it bubbles forth. When this poisonous chemical reaches the surface, it combines with the oxygen in the upper layers of the ocean to create clouds of pure sulfur. The sulfur causes the Namibian coast to smell like rotten eggs, and the hydrogen sulfide will often kill fish and drive lobsters away. For more information, read: A Bloom By Any Other Name A high-resolution (250 meters per pixel) image earlier on the 24th taken from the Moderate-Resolution Imaging Spectroradiometer (MODIS) shows additional detail in the plumes. Image courtesy the SeaWiFS Project, NASA/Goddard Space Flight Center, and ORBIMAGE. MODIS image courtesy Jacques Descloitres, MODIS Land Rapid Response Team at NASA GSFC
Lightning-channel conditioning
NASA Astrophysics Data System (ADS)
Sonnenfeld, R.; da Silva, C. L.; Eack, K.; Edens, H. E.; Harley, J.; McHarg, M.; Contreras Vidal, L.
2017-12-01
The concept of "conditioning" has several distinct applications in understanding lightning. It is commonly associated to the greater speed of dart-leaders vs. stepped leaders and the retrace of a cloud-to-ground channel by later return strokes. We will showadditional examples of conditioning: (A) High-speed videos of triggered flashes show "dark" periods of up to 50 ms between rebrightenings of an existing channel. (B) Interferometer (INTF) images of intra-cloud (IC) flashes demonstrate that electric-field "K-changes" correspond to rapid propagation of RF impulses along a previously formed channel separated by up to 20 ms with little RF emission on that channel. (C) Further, INTF images (like the one below) frequently show that the initial IC channel is more branched and "fuzzier'' than its later incarnations. Also, we contrast high-speed video, INTF observations, and spectroscopic measurements with possible physical mechanisms that can explain how channel conditioning guides and facilitates dart leader propagation. These mechanisms include: (1) a plasmochemical effect where electrons are stored in negative ions and released during the dart leader propagation via field-induced detachment; (2) small-amplitude residual currents that can maintain electrical conductivity; and (3) slow heat conduction cooling of plasma owing to channel expansion dynamics.
Analysis of Aurora's Performance Simulation Engine for Three Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freeman, Janine; Simon, Joseph
2015-07-07
Aurora Solar Inc. is building a cloud-based optimization platform to automate the design, engineering, and permit generation process of solar photovoltaic (PV) installations. They requested that the National Renewable Energy Laboratory (NREL) validate the performance of the PV system performance simulation engine of Aurora Solar's solar design platform, Aurora. In previous work, NREL performed a validation of multiple other PV modeling tools [1], so this study builds upon that work by examining all of the same fixed-tilt systems with available module datasheets that NREL selected and used in the aforementioned study. Aurora Solar set up these three operating PV systems in their modeling platform using NREL-provided system specifications and concurrent weather data. NREL then verified the setup of these systems, ran the simulations, and compared the Aurora-predicted performance data to measured performance data for those three systems, as well as to performance data predicted by other PV modeling tools.
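Validation studies of this kind typically compare modeled and measured production with summary error metrics. The sketch below is a generic illustration with invented monthly energy values, not Aurora or NREL data.

```python
# Hypothetical sketch: compare modeled vs. measured PV energy production with
# two common validation metrics, mean bias error (MBE) and RMSE, both
# expressed as a percent of mean measured energy. Monthly values are invented.
import numpy as np

measured = np.array([410, 455, 530, 580, 620, 640, 655, 630, 560, 500, 430, 400])  # kWh
modeled  = np.array([420, 450, 545, 570, 635, 655, 640, 625, 575, 505, 425, 415])  # kWh

mbe_pct = 100.0 * np.mean(modeled - measured) / np.mean(measured)
rmse_pct = 100.0 * np.sqrt(np.mean((modeled - measured) ** 2)) / np.mean(measured)
print(f"MBE = {mbe_pct:+.1f}%, RMSE = {rmse_pct:.1f}%")
```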
Advances in structural and functional analysis of membrane proteins by electron crystallography
Wisedchaisri, Goragot; Reichow, Steve L.; Gonen, Tamir
2011-01-01
Summary Electron crystallography is a powerful technique for the study of membrane protein structure and function in the lipid environment. When well-ordered two-dimensional crystals are obtained the structure of both protein and lipid can be determined and lipid-protein interactions analyzed. Protons and ionic charges can be visualized by electron crystallography and the protein of interest can be captured for structural analysis in a variety of physiologically distinct states. This review highlights the strengths of electron crystallography and the momentum that is building up in automation and the development of high throughput tools and methods for structural and functional analysis of membrane proteins by electron crystallography. PMID:22000511
Advances in structural and functional analysis of membrane proteins by electron crystallography.
Wisedchaisri, Goragot; Reichow, Steve L; Gonen, Tamir
2011-10-12
Electron crystallography is a powerful technique for the study of membrane protein structure and function in the lipid environment. When well-ordered two-dimensional crystals are obtained the structure of both protein and lipid can be determined and lipid-protein interactions analyzed. Protons and ionic charges can be visualized by electron crystallography and the protein of interest can be captured for structural analysis in a variety of physiologically distinct states. This review highlights the strengths of electron crystallography and the momentum that is building up in automation and the development of high throughput tools and methods for structural and functional analysis of membrane proteins by electron crystallography. Copyright © 2011 Elsevier Ltd. All rights reserved.
New experimental measurements of electron clouds in ion beams with large tune depression
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molvik, A W; Covo, M K; Cohen, R H
We study electron clouds in high perveance beams (K = 8E-4) with a large tune depression of 0.2 (defined as the ratio of a single particle oscillation response to the applied focusing fields, with and without space charge). These 1 MeV, 180 mA, K+ beams have a beam potential of +2 kV when electron clouds are minimized. Simulation results are discussed in a companion paper [J-L. Vay, this Conference]. We have developed the first diagnostics that quantitatively measure the accumulation of electrons in a beam [1]. This, together with measurements of electron sources, will enable the electron particle balance to be measured, and electron-trapping efficiencies determined. We, along with colleagues from GSI and CERN, have also measured the scaling of gas desorption with beam energy and dE/dx [2]. Experiments where the heavy-ion beam is transported with solenoid magnetic fields, rather than with quadrupole magnetic or electrostatic fields, are being initiated. We will discuss initial results from experiments using electrode sets (in the middle and at the ends of magnets) to either expel or to trap electrons within the magnets. We observe electron oscillations in the last quadrupole magnet when we flood the beam with electrons from an end wall. These oscillations, of order 10 MHz, are observed to grow from the center of the magnet while drifting upstream against the beam, in good agreement with simulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Ruixue; Chen, Kezheng, E-mail: dxb@sdu.edu.cn; Liao, Zhongmiao
Highlights: ► Hydroxyapatite hierarchical microstructures have been synthesized by a facile method. ► The morphology and size of the building units of the 3D structures can be controlled. ► The hydroxyapatite with 3D structure is morphologically and structurally stable up to 800 °C. - Abstract: Hydroxyapatite (HAp) hierarchical microstructures with novel 3D morphology were prepared through a template- and surfactant-free hydrothermal homogeneous precipitation method. Field emission scanning electron microscopy (FESEM), high-resolution transmission electron microscopy (HRTEM), and X-ray diffraction (XRD) were used to characterize the morphology and composition of the synthesized products. Interestingly, the obtained HAp with 3D structure is composed of one-dimensional (1D) nanorods or two-dimensional (2D) nanoribbons, and the length and morphology of these building blocks can be controlled by controlling the pH of the reaction. The building blocks are single crystalline and have different preferential orientation growth under different pH conditions. At low pH values, the octacalcium phosphate (OCP) phase formed first and then transformed into the HAp phase due to the increased pH value caused by the decomposition of urea. The investigation of the thermal stability reveals that the prepared HAp hierarchical microstructures are morphologically and structurally stable up to 800 °C.
You're a What?: Tower Technician
ERIC Educational Resources Information Center
Vilorio, Dennis
2012-01-01
In this article, the author talks about the role and functions of a tower technician. A tower technician climbs up the face of telecommunications towers to remove, install, test, maintain, and repair a variety of equipment--from antennas to light bulbs. Tower technicians also build shelters and radiofrequency shields for electronic equipment, lay…
NASA Astrophysics Data System (ADS)
Lagasio, Martina; Parodi, Antonio; Procopio, Renato; Rachidi, Farhad; Fiori, Elisabetta
2017-04-01
Lightning activity is a characteristic phenomenon of severe weather, as confirmed by many studies on different weather regimes that reveal a strong interplay between lightning phenomena and extreme rainfall processes in thunderstorms. The improvement of so-called total (i.e. cloud-to-ground and intra-cloud) lightning observation systems in the last decades has made it possible to investigate the relationship between the lightning flash rate and the kinematic and microphysical properties of severe hydro-meteorological events characterized by strong convection. V-shape back-building Mesoscale Convective Systems (MCSs) occurring over short periods of time have hit the Liguria region, located in north-western Italy, several times in the period between October 2010 and November 2014, generating flash-flood events responsible for hundreds of fatalities and millions of euros of damage. All these events showed an area of intense precipitation sweeping an arc of a few degrees around the warm conveyor belt originating about 50-60 km from the Liguria coastline. A second main ingredient was the presence of a convergence line, which supported the development and maintenance of the aforementioned back-building process. Other common features were the persistence of such a geometric configuration for many hours and the associated strong lightning activity. A methodological approach for the evaluation of these types of extreme rainfall and lightning convective events is presented for a back-building MCS event that occurred in Genoa in 2014. A microphysics-driven ensemble of WRF simulations at cloud-permitting grid spacing (1 km) with different microphysics parameterizations is used and compared to the available observational radar and lightning data. To pursue this aim, the performance of the Lightning Potential Index (LPI), as a measure of the potential for charge generation and separation that leads to lightning occurrence in clouds, is computed and analyzed to gain further physical insight into these V-shape convective processes and to understand its predictive ability.
NASA Technical Reports Server (NTRS)
Bartkus, Tadas; Tsao, Jen-Ching; Struk, Peter
2017-01-01
This paper builds on previous work that compares numerical simulations of mixed-phase icing clouds with experimental data. The model couples the thermal interaction between the ice particles and water droplets of the icing cloud and the flowing air of an icing wind tunnel for simulation of NASA Glenn Research Center's (GRC) Propulsion Systems Laboratory (PSL). Measurements were taken during the Fundamentals of Ice Crystal Icing Physics tests at the PSL tunnel in March 2016. The tests simulated ice-crystal and mixed-phase icing conditions that relate to ice accretions within turbofan engines.
Method and apparatus for measuring purity of noble gases
Austin, Robert
2008-04-01
A device for detecting impurities in a noble gas includes a detection chamber and a source of pulsed ultraviolet light. The pulse of the ultraviolet light is transferred into the detection chamber and onto a photocathode, thereby emitting a cloud of free electrons into the noble gas within the detection chamber. The cloud of electrons is attracted to the opposite end of the detection chamber by a high positive voltage potential at that end and focused onto a sensing anode. If there are impurities in the noble gas, some or all of the electrons within the cloud will bond with the impurity molecules and not reach the sensing anode. Therefore, measuring a lower signal at the sensing anode indicates a higher level of impurities while sensing a higher signal indicates fewer impurities. Impurities in the range of one part per billion can be measured by this device.
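A toy model of this measurement principle, under the assumption of first-order electron attachment to impurities during drift, is sketched below; the rate constant and drift time are invented, not device parameters.

```python
# Hypothetical sketch of the purity-monitor idea: electrons drifting toward
# the anode attach to impurity molecules at a rate proportional to the
# impurity concentration, so the collected charge decays exponentially with
# drift time. The attachment constant and drift time are invented.
import math

def collected_fraction(impurity_ppb, drift_time_us, k_attach=0.002):
    """Fraction of the initial electron cloud reaching the sensing anode.

    k_attach is an assumed attachment rate per ppb per microsecond.
    """
    return math.exp(-k_attach * impurity_ppb * drift_time_us)

for ppb in (0.0, 1.0, 10.0, 100.0):
    print(f"{ppb:6.1f} ppb -> signal fraction {collected_fraction(ppb, 50.0):.3f}")
```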
Burner Rig in the Material and Stresses Building
1969-11-21
A burner rig heats up a material sample in the Materials and Stresses Building at the National Aeronautics and Space Administration (NASA) Lewis Research Center. Materials technology is an important element in the successful development of advanced airbreathing and rocket propulsion systems. Different types of engines operate in different environments, so an array of dependable materials is needed. NASA Lewis began investigating the characteristics of different materials shortly after World War II. In 1949 the materials group was expanded into its own division. The Lewis researchers sought to study and test materials in environments that simulate those in which they would operate. The Materials and Stresses Building, built in 1949, contained a number of laboratories to analyze the materials. They are subjected to high temperatures, high stresses, corrosion, irradiation, and hot gases. The Physics of Solids Laboratory included a cyclotron, cloud chamber, helium cryostat, and metallurgy cave. The Metallographic Laboratory possessed six X-ray diffraction machines, two metalloscopes, and other equipment. The Furnace Room had two large induction machines, a 4500 °F graphite furnace, and heat-treating equipment. The Powder Laboratory included 60-ton and 3000-ton presses. The Stresses Laboratory included stress rupture machines, fatigue machines, and tensile strength machines.
CesrTA Retarding Field Analyzer Modeling Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calvey, J.R.; Celata, C.M.; Crittenden, J.A.
2010-05-23
Retarding field analyzers (RFAs) provide an effective measure of the local electron cloud density and energy distribution. Proper interpretation of RFA data can yield information about the behavior of the cloud, as well as the surface properties of the instrumented vacuum chamber. However, due to the complex interaction of the cloud with the RFA itself, understanding these measurements can be nontrivial. This paper examines different methods for interpreting RFA data via cloud simulation programs. Techniques include postprocessing the output of a simulation code to predict the RFA response; and incorporating an RFA model into the cloud modeling program itself.
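One simple form of the post-processing approach mentioned above is to take the energies of simulated electrons arriving at the RFA and sweep a retarding-voltage cutoff to build a simulated response curve. The sketch below does exactly that with a synthetic energy distribution; it omits real-device effects (grid fields, secondary emission) that the full models treat.

```python
# Hypothetical sketch: simulate an RFA response curve by counting which
# simulated electrons are energetic enough to pass the retarding grid at each
# voltage step. The energy distribution is synthetic, not CesrTA output.
import numpy as np

rng = np.random.default_rng(2)
electron_energies_ev = rng.exponential(scale=80.0, size=100_000)  # synthetic

def rfa_response(energies_ev, retarding_voltages, grid_transparency=0.9):
    """Fraction of incident electrons collected at each retarding voltage."""
    return np.array([
        grid_transparency * np.mean(energies_ev > v) for v in retarding_voltages
    ])

voltages = np.arange(0, 501, 50)   # retarding voltage scan, in volts
for v, frac in zip(voltages, rfa_response(electron_energies_ev, voltages)):
    print(f"{v:4d} V  ->  collected fraction {frac:.3f}")
```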
Multichannel scanning radiometer for remote sensing cloud physical parameters
NASA Technical Reports Server (NTRS)
Curran, R. J.; Kyle, H. L.; Blaine, L. R.; Smith, J.; Clem, T. D.
1981-01-01
A multichannel scanning radiometer developed for remote observation of cloud physical properties is described. Consisting of six channels in the near infrared and one channel in the thermal infrared, the instrument can observe cloud physical parameters such as optical thickness, thermodynamic phase, cloud top altitude, and cloud top temperature. Measurement accuracy is quantified through flight tests on the NASA CV-990 and the NASA WB-57F, and is found to be limited by the harsh environment of the aircraft at flight altitude. The electronics, data system, and calibration of the instrument are also discussed.
Magnetic Field Generation During the Collision of Narrow Plasma Clouds
NASA Astrophysics Data System (ADS)
Sakai, Jun-ichi; Kazimura, Yoshihiro; Haruki, Takayuki
1999-06-01
We investigate the dynamics of the collision of narrow plasma clouds, whose transverse dimension is on the order of the electron skin depth. A 2D3V (two dimensions in space and three dimensions in velocity space) particle-in-cell (PIC) collisionless relativistic code is used to show the generation of a quasi-static magnetic field during the collision of narrow plasma clouds in both electron-ion and electron-positron (pair) plasmas. The localized strong magnetic fluxes result in the generation of charge separation with complicated structures, which may be sources of electromagnetic as well as Langmuir waves. We also present one application of this process, which occurs during coalescence of magnetic islands in a current sheet of pair plasmas.
Common Faults and Their Prioritization in Small Commercial Buildings: February 2017 - December 2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frank, Stephen M; Kim, Janghyun; Cai, Jie
To support an ongoing project at NREL titled 'An Open, Cloud-Based Platform for Whole-Building Fault Detection and Diagnostics' (work breakdown structure number 3.2.6.18, funded by the Department of Energy Building Technologies Office), this report documents faults that are commonly found in small commercial buildings (with a floor area of 10,000 ft² or less) based on a literature review and discussions with building commissioning experts. It also provides a list of prioritized faults based on an estimation of the prevalence, energy impact, and financial impact of each fault.
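A prioritization of this kind can be expressed as a simple weighted scoring exercise. The sketch below is purely illustrative: the fault list, ratings, and weights are invented placeholders rather than values from the report.

```python
# Hypothetical sketch: rank common small-building faults by a composite score
# built from estimated prevalence, energy impact, and financial impact.
faults = {
    # fault: (prevalence 0-1, energy impact 0-1, financial impact 0-1)
    "stuck outdoor-air damper":      (0.6, 0.7, 0.6),
    "simultaneous heating/cooling":  (0.4, 0.8, 0.7),
    "schedule not set back":         (0.7, 0.6, 0.5),
    "refrigerant undercharge":       (0.3, 0.5, 0.6),
}

def score(prevalence, energy, financial, weights=(0.4, 0.3, 0.3)):
    wp, we, wf = weights
    return wp * prevalence + we * energy + wf * financial

ranked = sorted(faults, key=lambda f: score(*faults[f]), reverse=True)
for f in ranked:
    print(f"{score(*faults[f]):.2f}  {f}")
```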
Medeiros, Brian; Nuijens, Louise
2016-05-31
Trade wind regions cover most of the tropical oceans, and the prevailing cloud type is shallow cumulus. These small clouds are parameterized by climate models, and changes in their radiative effects strongly and directly contribute to the spread in estimates of climate sensitivity. This study investigates the structure and variability of these clouds in observations and climate models. The study builds upon recent detailed model evaluations using observations from the island of Barbados. Using a dynamical regimes framework, satellite and reanalysis products are used to compare the Barbados region and the broader tropics. It is shown that clouds in the Barbados region are similar to those across the trade wind regions, implying that observational findings from the Barbados Cloud Observatory are relevant to clouds across the tropics. The same methods are applied to climate models to evaluate the simulated clouds. The models generally capture the cloud radiative effect, but underestimate cloud cover and show an array of cloud vertical structures. Some models show strong biases in the environment of the Barbados region in summer, weakening the connection between the regional biases and those across the tropics. Even bearing that limitation in mind, it is shown that covariations of cloud and environmental properties in the models are inconsistent with observations. The models tend to misrepresent sensitivity to moisture variations and inversion characteristics. These model errors are likely connected to cloud feedback in climate projections, and highlight the importance of the representation of shallow cumulus convection.
Nuijens, Louise
2016-01-01
Trade wind regions cover most of the tropical oceans, and the prevailing cloud type is shallow cumulus. These small clouds are parameterized by climate models, and changes in their radiative effects strongly and directly contribute to the spread in estimates of climate sensitivity. This study investigates the structure and variability of these clouds in observations and climate models. The study builds upon recent detailed model evaluations using observations from the island of Barbados. Using a dynamical regimes framework, satellite and reanalysis products are used to compare the Barbados region and the broader tropics. It is shown that clouds in the Barbados region are similar to those across the trade wind regions, implying that observational findings from the Barbados Cloud Observatory are relevant to clouds across the tropics. The same methods are applied to climate models to evaluate the simulated clouds. The models generally capture the cloud radiative effect, but underestimate cloud cover and show an array of cloud vertical structures. Some models show strong biases in the environment of the Barbados region in summer, weakening the connection between the regional biases and those across the tropics. Even bearing that limitation in mind, it is shown that covariations of cloud and environmental properties in the models are inconsistent with observations. The models tend to misrepresent sensitivity to moisture variations and inversion characteristics. These model errors are likely connected to cloud feedback in climate projections, and highlight the importance of the representation of shallow cumulus convection. PMID:27185925
APEX reveals glowing stellar nurseries
NASA Astrophysics Data System (ADS)
2008-11-01
Illustrating the power of submillimetre-wavelength astronomy, an APEX image reveals how an expanding bubble of ionised gas about ten light-years across is causing the surrounding material to collapse into dense clumps that are the birthplaces of new stars. Submillimetre light is the key to revealing some of the coldest material in the Universe, such as these cold, dense clouds. [ESO PR Photo 40/08: Glowing Stellar Nurseries] The region, called RCW120, is about 4200 light years from Earth, towards the constellation of Scorpius. A hot, massive star in its centre is emitting huge amounts of ultraviolet radiation, which ionises the surrounding gas, stripping the electrons from hydrogen atoms and producing the characteristic red glow of so-called H-alpha emission. As this ionised region expands into space, the associated shock wave sweeps up a layer of the surrounding cold interstellar gas and cosmic dust. This layer becomes unstable and collapses under its own gravity into dense clumps, forming cold, dense clouds of hydrogen where new stars are born. However, as the clouds are still very cold, with temperatures of around -250 °C, their faint heat glow can only be seen at submillimetre wavelengths. Submillimetre light is therefore vital in studying the earliest stages of the birth and life of stars. The submillimetre-wavelength data were taken with the LABOCA camera on the 12-m Atacama Pathfinder Experiment (APEX) telescope, located on the 5000 m high plateau of Chajnantor in the Chilean Atacama desert. Thanks to LABOCA's high sensitivity, astronomers were able to detect clumps of cold gas four times fainter than previously possible. Since the brightness of the clumps is a measure of their mass, this also means that astronomers can now study the formation of less massive stars than they could before. The plateau of Chajnantor is also where ESO, together with international partners, is building a next generation submillimetre telescope, ALMA, the Atacama Large Millimeter/submillimeter Array. ALMA will use over sixty 12-m antennas, linked together over distances of more than 16 km, to form a single, giant telescope. APEX is a collaboration between the Max-Planck-Institute for Radio Astronomy (MPIfR), the Onsala Space Observatory (OSO) and ESO. The telescope is based on a prototype antenna constructed for the ALMA project. Operation of APEX at Chajnantor is entrusted to ESO.
Radiation belt electron observations following the January 1997 magnetic cloud event
NASA Astrophysics Data System (ADS)
Selesnick, R. S.; Blake, J. B.
Relativistic electrons in the outer radiation belt associated with the January 1997 magnetic cloud event were observed by the HIST instrument on POLAR at kinetic energies from 0.7 to 7 MeV and L shells from 3 to 9. The electron enhancement occurred on a time scale of hours or less throughout the outer radiation belt, except for a more gradual rise in the higher energy electrons at the lower L values indicative of local acceleration and inward radial diffusion. At the higher L values, variations on a time scale of several days following the initial injection on January 10 are consistent with data from geosynchronous orbit and may be an adiabatic response.
Electrostatic plasma lens for focusing negatively charged particle beams.
Goncharov, A A; Dobrovolskiy, A M; Dunets, S M; Litovko, I V; Gushenets, V I; Oks, E M
2012-02-01
We describe the current status of ongoing research and development of the electrostatic plasma lens for focusing and manipulating intense beams of negatively charged particles, electrons and negative ions. The physical principle of this kind of plasma lens is based on magnetic insulation of electrons, which provides the creation of a dynamic positive space-charge cloud in a short, restricted volume of the propagating beam. Here, new results are presented from experimental investigations and computer simulations of the focusing of a wide-aperture, intense electron beam by a plasma lens whose positive space-charge cloud is produced by a cylindrical anode-layer accelerator creating a positive ion stream directed toward the system axis.
SU-F-BRCD-03: Dose Calculation of Electron Therapy Using Improved Lateral Buildup Ratio Method.
Gebreamlak, W; Tedeschi, D; Alkhatib, H
2012-06-01
To calculate the percentage depth dose of any irregularly shaped electron beam using a modified lateral build-up-ratio method. Percentage depth dose (PDD) curves were measured using 6, 9, 12, and 15 MeV electron beam energies for applicator cone sizes of 6×6, 10×10, 14×14, and 14×14 cm². Circular cutouts for each cone were prepared from 2.0 cm diameter up to the maximum possible size for each cone. In addition, three irregular cutouts were prepared. The scanning was done using a water tank and two diodes - one for the signal and the other a stationary reference outside the tank. The water surface was determined by scanning the signal diode slowly from water to air and noting the sharp change of the percentage depth dose curve at the water/air interface. The lateral build-up-ratio (LBR) for each circular cutout was calculated from the measured PDD curve using the open field of the 14×14 cm² cone as the reference field. Using the LBR values and the radius of the circular cutouts, the corresponding lateral spread parameter (sigma) of the electron shower was calculated. Unlike the commonly accepted assumption that sigma is independent of cutout size, it is shown that the sigma value increases linearly with circular cutout size. Using this characteristic of sigma, the PDD curves of irregularly shaped cutouts were calculated. Finally, the calculated PDD curves were compared with measured PDD curves. In this research, it is shown that sigma increases with cutout size. For radii of circular cutouts up to the equilibrium range of the electron beam, the increase of sigma with cutout size is linear. The percentage difference of the calculated PDD from the measured PDD for irregularly shaped cutouts was under 1.0%. Similar results were obtained for four electron beam energies (6, 9, 12, and 15 MeV). © 2012 American Association of Physicists in Medicine.
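The abstract does not give the exact formulation, so the following sketch only illustrates the general lateral build-up-ratio idea: a Gaussian lateral-spread model, a sigma that grows linearly with cutout radius as reported, and sector integration over an irregular cutout. All functional forms and numbers are assumptions, not the authors' parameters.

```python
# Hypothetical sketch of an LBR-style calculation: assume
# LBR(r, d) = 1 - exp(-r^2 / sigma(d, r)^2), let sigma grow linearly with
# cutout radius, and estimate the irregular-cutout PDD by averaging the LBR
# over angular sectors of the cutout. Open-field PDD, sigma coefficients, and
# the cutout shape are all invented.
import numpy as np

def sigma(depth_cm, radius_cm, a=0.15, b=0.08):
    """Assumed lateral-spread parameter, linear in cutout radius."""
    return a * depth_cm + b * radius_cm + 0.3

def lbr(radius_cm, depth_cm):
    s = sigma(depth_cm, radius_cm)
    return 1.0 - np.exp(-(radius_cm / s) ** 2)

def pdd_irregular(depth_cm, radii_by_sector, pdd_open):
    """Average the LBR over sectors of an irregular cutout (sector integration)."""
    return pdd_open * np.mean([lbr(r, depth_cm) for r in radii_by_sector])

# Irregular cutout described by its radius in 36 equal angular sectors (cm).
radii = 2.0 + 1.0 * np.abs(np.sin(np.linspace(0, 2 * np.pi, 36, endpoint=False)))
print(round(pdd_irregular(depth_cm=3.0, radii_by_sector=radii, pdd_open=85.0), 1))
```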
Revealing isomerism in sodium-water clusters: Photoionization spectra of Na(H2O)n (n = 2-90)
NASA Astrophysics Data System (ADS)
Dierking, Christoph W.; Zurheide, Florian; Zeuch, Thomas; Med, Jakub; Parez, Stanislav; Slavíček, Petr
2017-06-01
Soft ionization of sodium-tagged polar clusters is increasingly used as a powerful technique for sizing and characterization of small aerosols, with possible applications, e.g., in atmospheric chemistry or combustion science. Understanding the structure and photoionization of the sodium-doped clusters is critical for such applications. In this work, we report on measurements of photoionization spectra for sodium-doped water clusters containing 2-90 water molecules. While most of the previous studies focused on the ionization threshold of the Na(H2O)n clusters, we provide for the first time full photoionization spectra, including the high-energy region, which are used as a reference for comparison with theory. As reported in previous work, we have seen an initial drop of the appearance ionization energy with cluster size to values of about 3.2 eV for n < 5. In the size range from n = 5 to n = 15, broad ion yield curves emerge; for larger clusters, a constant range between signal appearance (~2.8 eV) and signal saturation (~4.1 eV) has been observed. The measurements are interpreted with ab initio calculations and ab initio molecular dynamics simulations for selected cluster sizes (n ≤ 15). The simulations revealed theory shortfalls when aiming at quantitative agreement but allowed us to identify structural motifs consistent with the observed ionization energy distributions. We found a decrease in the ionization energy with increasing coordination of the Na atom and increasing delocalization of the Na 3s electron cloud. The appearance ionization energy is determined by isomers with fully solvated sodium and a highly delocalized electron cloud, while both fully and incompletely solvated isomers with localized electron clouds can contribute to the high-energy part of the photoionization spectrum. Simulations at elevated temperatures show an increased abundance of isomers with low ionization energies, an entropic effect enabling size-selective infrared action spectroscopy based on near-threshold photoionization of Na(H2O)n clusters. In addition, simulations of the sodium pick-up process were carried out to study the gradual formation of the hydrated electron, which is the basis of the sodium-tagging sizing.
Revealing isomerism in sodium-water clusters: Photoionization spectra of Na(H2O)n (n = 2-90).
Dierking, Christoph W; Zurheide, Florian; Zeuch, Thomas; Med, Jakub; Parez, Stanislav; Slavíček, Petr
2017-06-28
Soft ionization of sodium-tagged polar clusters is increasingly used as a powerful technique for sizing and characterization of small aerosols, with possible applications, e.g., in atmospheric chemistry or combustion science. Understanding the structure and photoionization of the sodium-doped clusters is critical for such applications. In this work, we report on measurements of photoionization spectra for sodium-doped water clusters containing 2-90 water molecules. While most of the previous studies focused on the ionization threshold of the Na(H2O)n clusters, we provide for the first time full photoionization spectra, including the high-energy region, which are used as a reference for comparison with theory. As reported in previous work, we have seen an initial drop of the appearance ionization energy with cluster size to values of about 3.2 eV for n < 5. In the size range from n = 5 to n = 15, broad ion yield curves emerge; for larger clusters, a constant range between signal appearance (∼2.8 eV) and signal saturation (∼4.1 eV) has been observed. The measurements are interpreted with ab initio calculations and ab initio molecular dynamics simulations for selected cluster sizes (n ≤ 15). The simulations revealed theory shortfalls when aiming at quantitative agreement but allowed us to identify structural motifs consistent with the observed ionization energy distributions. We found a decrease in the ionization energy with increasing coordination of the Na atom and increasing delocalization of the Na 3s electron cloud. The appearance ionization energy is determined by isomers with fully solvated sodium and a highly delocalized electron cloud, while both fully and incompletely solvated isomers with localized electron clouds can contribute to the high-energy part of the photoionization spectrum. Simulations at elevated temperatures show an increased abundance of isomers with low ionization energies, an entropic effect enabling size-selective infrared action spectroscopy based on near-threshold photoionization of Na(H2O)n clusters. In addition, simulations of the sodium pick-up process were carried out to study the gradual formation of the hydrated electron, which is the basis of the sodium-tagging sizing.
NASA Astrophysics Data System (ADS)
Foster, S. Q.; Johnson, R. M.; Randall, D.; Denning, S.; Russell, R.; Gardiner, L.; Hatheway, B.; Genyuk, J.; Bergman, J.
2008-12-01
The need for improving the representation of cloud processes in climate models has been one of the most important limitations of the reliability of climate-change simulations. Now in its third year, the National Science Foundation-funded Center for Multi-scale Modeling of Atmospheric Processes (CMMAP) at Colorado State University is addressing this problem through a revolutionary new approach to representing cloud processes on their native scales, including the cloud-scale interaction processes that are active in cloud systems. CMMAP has set ambitious education and human-resource goals to share basic information about the atmosphere, clouds, weather, climate, and modeling with diverse K-12 and public audiences through its affiliation with the Windows to the Universe (W2U) program at University Corporation for Atmospheric Research (UCAR). W2U web pages are written at three levels in English and Spanish. This information targets learners at all levels, educators, and families who seek to understand and share resources and information about the nature of weather and the climate system, and career role models from related research fields. This resource can also be helpful to educators who are building bridges in the classroom between the sciences, the arts, and literacy. Visitors to the W2U's CMMAP web portal can access a beautiful new clouds image gallery; information about each cloud type and the atmospheric processes that produce them; a Clouds in Art interactive; collections of weather-themed poetry, art, and myths; links to games and puzzles for children; and extensive classroom- ready resources and activities for K-12 teachers. Biographies of CMMAP scientists and graduate students are featured. Basic science concepts important to understanding the atmosphere, such as condensation, atmosphere pressure, lapse rate, and more have been developed, as well as 'microworlds' that enable students to interact with experimental tools while building fundamental knowledge. These resources can be accessed online at no cost by the entire atmospheric science K-12 and informal science education community.
Ifcwall Reconstruction from Unstructured Point Clouds
NASA Astrophysics Data System (ADS)
Bassier, M.; Klein, R.; Van Genechten, B.; Vergauwen, M.
2018-05-01
The automated reconstruction of Building Information Modeling (BIM) objects from point cloud data is still ongoing research. A key aspect is the creation of accurate wall geometry, as it forms the basis for further reconstruction of objects in a BIM. After segmenting and classifying the initial point cloud, the labelled segments are processed and the wall topology is reconstructed. However, the procedure is challenging due to noise, occlusions and the complexity of the input data. In this work, a method is presented to automatically reconstruct consistent wall geometry from point clouds. More specifically, the use of room information is proposed to aid the wall topology creation. First, a set of partial walls is constructed based on classified planar primitives. Next, the rooms are identified using the retrieved wall information along with the floors and ceilings. The wall topology is computed by the intersection of the partial walls conditioned on the room information. The final wall geometry is defined by creating IfcWallStandardCase objects conforming to the IFC4 standard. The result is a set of walls according to the as-built conditions of a building. The experiments prove that the used method is a reliable framework for wall reconstruction from unstructured point cloud data. Also, the implementation of room information reduces the rate of false positives for the wall topology. Given the walls, ceilings and floors, 94% of the rooms are correctly identified. A key advantage of the proposed method is that it deals with complex rooms and is not bound to single storeys.
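One small ingredient of wall topology creation, intersecting two partial wall axes to close a corner, can be sketched as follows; the coordinates are invented, and the room-conditioning step of the actual method is not reproduced.

```python
# Hypothetical sketch: given two partial walls represented as 2-D axis lines
# in plan view (a point and a direction each), compute their intersection to
# close the corner between them.
import numpy as np

def wall_corner(p1, d1, p2, d2):
    """Intersect two wall axis lines p + t*d; returns None if (near) parallel."""
    a = np.array([[d1[0], -d2[0]], [d1[1], -d2[1]]], dtype=float)
    b = np.array([p2[0] - p1[0], p2[1] - p1[1]], dtype=float)
    if abs(np.linalg.det(a)) < 1e-9:
        return None
    t, _ = np.linalg.solve(a, b)
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two partial walls: one running east-west, one north-south.
print(wall_corner((0.0, 0.0), (1.0, 0.0), (5.0, -2.0), (0.0, 1.0)))  # -> (5.0, 0.0)
```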
Scargiali, F; Grisafi, F; Busciglio, A; Brucato, A
2011-12-15
The formation of toxic heavy clouds as a result of sudden accidental releases from mobile containers, such as road tankers or railway tank cars, may occur inside urban areas, so the problem of evaluating their consequences arises. Due to the semi-confined nature of the dispersion site, simplified models may often be inappropriate. As an alternative, computational fluid dynamics (CFD) has the potential to provide realistic simulations even for geometrically complex scenarios, since the heavy gas dispersion process is described by basic conservation equations with a reduced number of approximations. In the present work a commercial general-purpose CFD code (CFX 4.4 by Ansys®) is employed for the simulation of dense cloud dispersion in urban areas. The simulation strategy proposed involves a stationary pre-release flow field simulation followed by dynamic after-release flow and concentration field simulations. In order to attempt a generalization of results, the computational domain is modeled as a simple network of straight roads with regularly distributed blocks mimicking the buildings. Results show that the presence of buildings lowers concentration maxima and enlarges the lateral spread of the cloud. Dispersion dynamics is also found to be strongly affected by the quantity of heavy gas released. Copyright © 2011 Elsevier B.V. All rights reserved.
The diverse use of clouds by CMS
Andronis, Anastasios; Bauer, Daniela; Chaze, Olivier; ...
2015-12-23
The resources CMS is using are increasingly being offered as clouds. In Run 2 of the LHC the majority of CMS CERN resources, both in Meyrin and at the Wigner Computing Centre, will be presented as cloud resources on which CMS will have to build its own infrastructure. This infrastructure will need to run all of the CMS workflows including Tier 0, production, and user analysis. In addition, the CMS High Level Trigger will provide a compute resource comparable in scale to the total offered by the CMS Tier 1 sites, when it is not running as part of the trigger system. During these periods a cloud infrastructure will be overlaid on this resource, making it accessible for general CMS use. Finally, CMS is starting to utilise cloud resources being offered by individual institutes and is gaining experience to facilitate the use of opportunistically available cloud resources. We also present a snapshot of this infrastructure and its operation at the time of the CHEP2015 conference.
Clouds Dominate the Galactic Halo
NASA Astrophysics Data System (ADS)
2003-01-01
Using the exquisite sensitivity of the National Science Foundation's Robert C. Byrd Green Bank Telescope (GBT), astronomer Jay Lockman of the National Radio Astronomy Observatory (NRAO) in Green Bank, W. Va., has produced the best cross-section ever of the Milky Way Galaxy's diffuse halo of hydrogen gas. This image confirms the presence of discrete hydrogen clouds in the halo, and could help astronomers understand the origin and evolution of the rarefied atmosphere that surrounds our Galaxy. Lockman presented his findings at the American Astronomical Society meeting in Seattle, WA. [Image: artist's rendering of the Milky Way (background) with an inset GBT image of a cross-section of neutral atomic hydrogen. Credit: Kirk Woellert/National Science Foundation; Patricia Smiley, NRAO.] "The first observations with the Green Bank Telescope suggested that the hydrogen in the lower halo, the transition zone between the Milky Way and intergalactic space, is very clumpy," said Lockman. "The latest data confirm these results and show that instead of trailing away smoothly from the Galactic plane, a significant fraction of the hydrogen gas in the halo is concentrated in discrete clouds. There are even some filaments." Beyond the star-filled disk of the Milky Way, there exists an extensive yet diffuse halo of hydrogen gas. For years, astronomers have speculated about the origin and structure of this gas. "Even the existence of neutral hydrogen in the halo has been somewhat of a puzzle," Lockman remarked. "Unlike the Earth's atmosphere, which is hot enough to hold itself up against the force of gravity, the hydrogen in the halo is too cool to support itself against the gravitational pull of the Milky Way." Lockman points out that some additional factor has to be involved to get neutral hydrogen to such large distances from the Galactic plane. "This force could be cosmic rays, a supersonic wind, the blast waves from supernovae, or something we have not thought of yet," he said. Earlier this year, data taken with the newly commissioned GBT demonstrated that rather than a diffuse mist or other ill-defined feature - as many astronomers had speculated - the halo was in fact made up of well-defined clouds. "The discovery of these clouds, each containing 50-to-100 solar masses of hydrogen and averaging about 100 light-years in diameter, challenged many of the prevailing theories about the structure and dynamics of the halo," said Lockman. The clouds were discovered about 25,000 light-years from Earth toward the center of our Galaxy. The latest findings show the clouds extend at least 5,000 light-years above and below the Galactic plane. Though the initial studies by Lockman revealed the presence of these clouds, the data were insufficient to conclusively show that they were present throughout the entire halo. These latest results provide valuable evidence that the earlier results were truly representative of the entire halo. "The richness and variety of this phenomenon continues to astound me," remarked Lockman. Lockman's new studies also confirm that these clouds travel along with the rest of the Galaxy, rotating about its center. These studies clearly rule out the possibility that so-called "high-velocity clouds" were responsible for what was detected initially. High-velocity clouds are vagabond clumps of intergalactic gas, possibly left over from the formation of the Milky Way and other nearby galaxies.
"One thing that is for certain is that these are not high-velocity clouds, this is an entirely separate phenomenon," said Lockman. According to the researcher, the ubiquitous nature and dynamics of these newly discovered clouds support the theory that they are condensing out of the hot gas that is lifted into the halo through supernova explosions. When a massive star dies, it produces a burst of cosmic rays and an enormous expanding bubble of gas at a temperature of several million degrees Celsius. Over time, this hot gas will rise into the Milky Way's halo. The results presented by Lockman suggest that, as some astronomers have predicted, the hot gas in the halo slowly cools and condenses into hydrogen clouds along with wispy filaments that connect them. When these clouds become as massive as many of those discovered by Lockman, they should then begin to fall back onto the Galactic plane. This phenomenon is commonly referred to as a "galactic fountain." "If the clouds were part of the galactic fountain process," Lockman said, "then it is likely that they are now falling back onto the Galaxy." Radio telescopes are able to detect the naturally occurring radio emission from neutral atomic hydrogen. As hydrogen atoms move about in space, they can absorb small amounts of energy, sending the atom's single electron to a higher energy state. When the electron eventually moves back to its lower energy -- or resting state, it gives up a small amount of electromagnetic radiation at a wavelength of 21 centimeters. The GBT, dedicated in August of 2000, is the world's largest fully steerable radio telescope. Its 100 by 110 meter dish is composed of 2004 individually hinged panels. It also has a unique offset feed arm, which greatly enhances the performance of the telescope, making it ideal for observations of faint astronomical objects. The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc.
Simulating Terrestrial Gamma Ray Flashes due to cosmic ray shower electrons and positrons
NASA Astrophysics Data System (ADS)
Connell, Paul
2017-04-01
The University of Valencia has developed a software simulator, LEPTRACK, to simulate the relativistic runaway electron avalanches (RREA) that are presumed to be the cause of Terrestrial Gamma Ray Flashes and their powerful accompanying Ionization/Excitation Flashes. Here we show results of LEPTRACK simulations of RREA from the interaction of MeV-energy electrons/positrons and photons in cosmic ray showers traversing plausible electric field geometries expected in storm clouds. The input beams of MeV shower products were created using the CORSIKA software package from the Karlsruhe Institute of Technology. We present images, videos and plots showing the different Ionization, Excitation and gamma-ray photon density fields produced, along with their time and spatial profile evolution, which depend critically on where the line of shower particles intercepts the electric field geometry. We also show a new effect of incoming positrons in the shower, which make up a significant fraction of shower products, in particular their apparent "orbiting" within a high-altitude negative induced shielding charge layer, which has been conjectured to produce a signature microwave emission, as well as a short-range 511 keV annihilation line. The interesting question posed is whether this conjectured positron emission can be observed and correlated with TGF orbital observations to show whether a TGF originates in the macro E-fields of storm clouds or the micro E-fields of lightning leaders, where this positron "orbiting" is not likely to occur.
NASA Astrophysics Data System (ADS)
Laurent, Philippe; Titarchuk, Lev
2018-06-01
We consider a Compton cloud (CC) surrounding a black hole (BH) in an accreting BH system, where electrons propagate with thermal and bulk velocities. In that cloud, soft (disk) photons may be upscattered off these energetic electrons and attain energies of several MeV. They could then create pairs due to photon–photon interactions. In this paper, we study the formation of the 511 keV annihilation line due to this photon–photon interaction, which results in the creation of electron–positron pairs, followed by the annihilation of the created positrons with the CC electrons. The appropriate conditions for annihilation-line generation take place very close to a BH horizon, within (10³–10⁴)m cm of it, where m is the BH mass in solar units. As a result, the created annihilation line should be seen by the Earth observer as a blackbody bump, or the so-called reflection bump, at energies around (511/20)(20/z) keV (i.e., roughly 26 keV), where z ∼ 20 is a typical gravitational redshift experienced by the created annihilation-line photons when they emerge. This transient feature should occur in any accreting BH system, either galactic or extragalactic. Observational evidence for this feature in several galactic BH systems is detailed in an accompanying paper. An extended hard tail of the spectrum up to 1 MeV may also be formed due to X-ray photons upscattering off created pairs.
NASA Technical Reports Server (NTRS)
Farrugia, C. J.; Richardson, I. G.; Burlaga, L. F.; Lepping, R. P.; Osherovich, V. A.
1993-01-01
Simultaneous ISEE 3 and IMP 8 spacecraft observations of magnetic fields and flow anisotropies of solar energetic protons and electrons during the passage of an interplanetary magnetic cloud show various particle signature differences at the two spacecraft. These differences are interpretable in terms of the magnetic line topology of the cloud, the connectivity of the cloud field lines to the solar surface, and the interconnection between the magnetic fields of the magnetic clouds and of the earth. These observations are consistent with a magnetic cloud model in which these mesoscale configurations are curved magnetic flux ropes attached at both ends to the sun's surface, extending out to 1 AU.
Fast Monte Carlo-assisted simulation of cloudy Earth backgrounds
NASA Astrophysics Data System (ADS)
Adler-Golden, Steven; Richtsmeier, Steven C.; Berk, Alexander; Duff, James W.
2012-11-01
A calculation method has been developed for rapidly synthesizing radiometrically accurate ultraviolet through long-wavelength infrared spectral imagery of the Earth for arbitrary locations and cloud fields. The method combines cloud-free surface reflectance imagery with cloud radiance images calculated from a first-principles 3-D radiation transport model. The MCScene Monte Carlo code [1-4] is used to build a cloud image library; a data fusion method is incorporated to speed convergence. The surface and cloud images are combined with an upper atmospheric description with the aid of solar and thermal radiation transport equations that account for atmospheric inhomogeneity. The method enables a wide variety of sensor and sun locations, cloud fields, and surfaces to be combined on-the-fly, and provides hyperspectral wavelength resolution with minimal computational effort. The simulations agree very well with much more time-consuming direct Monte Carlo calculations of the same scene.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Chidong
Motivated by the success of the AMIE/DYNAMO field campaign, which collected unprecedented observations of cloud and precipitation from the tropical Indian Ocean in October 2011 – March 2012, this project explored how such observations can be applied to assist the development of global cloud-permitting models through evaluating and correcting model biases in cloud statistics. The main accomplishments of this project were made in four categories: generating observational products for model evaluation, using AMIE/DYNAMO observations to validate global model simulations, using AMIE/DYNAMO observations in numerical studies of cloud-permitting models, and providing leadership in the field. Results from this project provide valuable information for building a seamless bridge between the DOE ASR program's component on process-level understanding of cloud processes in the tropics and the RGCM focus on global variability and regional extremes. In particular, experience gained from this project would be directly applicable to evaluation and improvements of ACME, especially as it transitions to a non-hydrostatic variable resolution model.
NASA Technical Reports Server (NTRS)
Irvine, W. M.; Hjalmarson, A.; Rydbeck, O. E. H.
1981-01-01
The physical conditions and chemical compositions of the gas in interstellar clouds are reviewed in light of the importance of interstellar clouds for star formation and the origin of life. The Orion A region is discussed as an example of a giant molecular cloud where massive stars are being formed, and it is pointed out that conditions in the core of the cloud, with a kinetic temperature of about 75 K and a density of 100,000-1,000,000 molecules/cu cm, may support gas phase ion-molecule chemistry. The Taurus Molecular Clouds are then considered as examples of cold, dark, relatively dense interstellar clouds which may be the birthplaces of solar-type stars and which have been found to contain the heaviest interstellar molecules yet discovered. The molecular species identified in each of these regions are tabulated, including such building blocks of biological monomers as H2O, NH3, H2CO, CO, H2S, CH3CN and H2, and more complex species such as HCOOCH3 and CH3CH2CN.
NAFFS: network attached flash file system for cloud storage on portable consumer electronics
NASA Astrophysics Data System (ADS)
Han, Lin; Huang, Hao; Xie, Changsheng
Cloud storage technology has become a research hotspot in recent years, but existing cloud storage services are mainly designed for data storage needs with stable, high-speed Internet connections. Mobile Internet connections are often unstable and relatively slow. These native features of the mobile Internet limit the use of cloud storage on portable consumer electronics. The Network Attached Flash File System (NAFFS) presents the idea of using the portable device's built-in NAND flash memory as the front-end cache of a virtualized cloud storage device. Modern portable devices with an Internet connection have more than 1 GB of built-in NAND flash, which is quite enough for daily data storage. The data transfer rate of a NAND flash device is much higher than that of mobile Internet connections [1], and its non-volatile nature makes it very suitable as a cache device for Internet cloud storage on portable devices, which often have an unstable power supply and intermittent Internet connectivity. In the present work, NAFFS is evaluated with several benchmarks, and its performance is compared with traditional network attached file systems, such as NFS. Our evaluation results indicate that NAFFS achieves an average access speed of 3.38 MB/s, which is about 3 times faster than directly accessing cloud storage over a mobile Internet connection, and offers a more stable interface than directly using a cloud storage API. Unstable Internet connections and sudden power-off conditions are tolerated, and no cached data are lost in such situations.
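To make the front-end-cache idea concrete, here is a minimal Python sketch in which local flash stands in for the NAND cache and a second directory stands in for the cloud backend: writes land in flash immediately and are flushed to the backend only when the (possibly intermittent) link is up. The class and method names are hypothetical and do not reflect the NAFFS interface.

    # Minimal sketch of a write-back cache in front of cloud storage; names
    # and behaviour are illustrative assumptions, not the NAFFS design.
    import os, shutil, tempfile

    class FlashCachedStore:
        def __init__(self, cache_dir, backend_dir):
            self.cache_dir = cache_dir        # stands in for built-in NAND flash
            self.backend_dir = backend_dir    # stands in for remote cloud storage
            self.dirty = set()

        def write(self, name, data: bytes):
            with open(os.path.join(self.cache_dir, name), "wb") as f:
                f.write(data)                 # fast local write, survives power-off
            self.dirty.add(name)

        def read(self, name) -> bytes:
            path = os.path.join(self.cache_dir, name)
            if not os.path.exists(path):      # cache miss: fetch from the backend
                shutil.copy(os.path.join(self.backend_dir, name), path)
            with open(path, "rb") as f:
                return f.read()

        def sync(self, online: bool):
            """Flush dirty files when the intermittent link is up."""
            if not online:
                return
            for name in list(self.dirty):
                shutil.copy(os.path.join(self.cache_dir, name),
                            os.path.join(self.backend_dir, name))
                self.dirty.discard(name)

    cache, backend = tempfile.mkdtemp(), tempfile.mkdtemp()
    store = FlashCachedStore(cache, backend)
    store.write("note.txt", b"hello")
    store.sync(online=False)   # offline: data stays safely in flash
    store.sync(online=True)    # link restored: data reaches the backend
    print(store.read("note.txt"))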
ELECTRON CLOUD OBSERVATIONS AND CURES IN RHIC
DOE Office of Scientific and Technical Information (OSTI.GOV)
FISCHER,W.; BLASKIEWICZ, M.; HUANG, H.
Since 2001 RHIC has experienced electron cloud effects, which have limited the beam intensity. These include dynamic pressure rises (including pressure instabilities), tune shifts, a reduction of the stability threshold for bunches crossing the transition energy, and possibly incoherent emittance growth. We summarize the main observations in operation and dedicated experiments, as well as countermeasures including baking, NEG-coated warm beam pipes, solenoids, bunch patterns, anti-grazing rings, pre-pumped cold beam pipes, scrubbing, and operation with long bunches.
Summary of SLAC's SEY Measurement On Flat Accelerator Wall Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Pimpec, F. (PSI, Villigen / SLAC)
The electron cloud effect (ECE) causes beam instabilities in accelerator structures with intense positively charged bunched beams. Reduction of the secondary electron yield (SEY) of the beam pipe inner wall is effective in controlling cloud formation. We summarize SEY measurements obtained from flat TiN, TiZrV and Al surfaces, carried out in a laboratory environment. SEY was measured after thermal conditioning, as well as after low-energy (less than 300 eV) particle exposure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Narayan, Ramesh; Sironi, Lorenzo; Oezel, Feryal
2012-10-01
A dense ionized cloud of gas has been recently discovered to be moving directly toward the supermassive black hole, Sgr A*, at the Galactic center. In 2013 June, at the pericenter of its highly eccentric orbit, the cloud will be approximately 3100 Schwarzschild radii from the black hole and will move supersonically through the ambient hot gas with a velocity of v_p ≈ 5400 km s⁻¹. A bow shock is likely to form in front of the cloud and could accelerate electrons to relativistic energies. We estimate via particle-in-cell simulations the energy distribution of the accelerated electrons and show that the non-thermal synchrotron emission from these electrons might exceed the quiescent radio emission from Sgr A* by a factor of several. The enhanced radio emission should be detectable at GHz and higher frequencies around the time of pericentric passage and in the following months. The bow shock emission is expected to be displaced from the quiescent radio emission of Sgr A* by ≈33 mas. Interferometric observations could resolve potential changes in the radio image of Sgr A* at wavelengths ≲ 6 cm.
Yang, Haitao; Wang, Fujia; Zheng, Jilin; Lin, Hao; Liu, Bin; Tang, Yi-Da; Zhang, Chong-Jing
2018-06-04
Energy transfer between fluorescent dyes and quenchers is widely used in the design of light-up probes. Although dual quenchers are more effective in offering lower background signals and higher turn-on ratios than one quencher, such probes are less explored in practice as they require both quenchers to be within the proximity of the fluorescent core. In this contribution, we utilized intramolecular motion and photoinduced electron transfer (PET) as quenching mechanisms to build super-quenched light-up probes based on fluorogens with aggregation-induced emission. The optimized light-up probe possesses negligible background and is able to detect not only free formaldehyde (FA) but also polymeric FA, with an unprecedented turn-on ratio of >4900. We envision that this novel dual quenching strategy will help to develop various light-up probes for analyte sensing. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Low cost, high performance processing of single particle cryo-electron microscopy data in the cloud
Cianfrocco, Michael A; Leschziner, Andres E
2015-01-01
The advent of a new generation of electron microscopes and direct electron detectors has realized the potential of single particle cryo-electron microscopy (cryo-EM) as a technique to generate high-resolution structures. Calculating these structures requires high performance computing clusters, a resource that may be limiting to many likely cryo-EM users. To address this limitation and facilitate the spread of cryo-EM, we developed a publicly available ‘off-the-shelf’ computing environment on Amazon's elastic cloud computing infrastructure. This environment provides users with single particle cryo-EM software packages and the ability to create computing clusters with 16–480+ CPUs. We tested our computing environment using a publicly available 80S yeast ribosome dataset and estimate that laboratories could determine high-resolution cryo-EM structures for $50 to $1500 per structure within a timeframe comparable to local clusters. Our analysis shows that Amazon's cloud computing environment may offer a viable computing environment for cryo-EM. DOI: http://dx.doi.org/10.7554/eLife.06664.001 PMID:25955969
NASA Astrophysics Data System (ADS)
Borowiec, N.
2013-12-01
Gathering information about the roof shapes of buildings is still a current issue. One of the many sources from which we can obtain information about buildings is airborne laser scanning. However, automatically extracting information about building roofs from a point cloud is still a complex task. The task can be performed with the help of additional information from other sources, or based only on Lidar data. This article describes how to detect a building roof from a point cloud alone. Defining the shape of the roof is carried out in three steps. The first step is to find the location of the building, the second is the precise definition of the edges, and the third is the identification of the roof planes. The first step is based on grid analyses, and the next two steps are based on the Hough transformation. The Hough transformation is a method for detecting collinear points, so it is a perfect match for determining the lines describing a roof. To properly determine the shape of the roof, the edges alone are not enough; it is also necessary to identify the roof planes. Thus, in this study the Hough transform also served as a tool for the detection of roof planes, the only difference being that the tool used in that case is three-dimensional.
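For readers unfamiliar with the collinear-point detection this abstract relies on, the following Python sketch shows a standard 2D Hough transform finding the best-supported line through a set of points (e.g., a roof edge). It is an illustrative sketch of the general technique, not the author's code, and the bin counts and sample points are arbitrary.

    # Minimal 2D Hough transform: vote in (theta, rho) space and return the
    # best-supported line. Illustrative only; not the paper's implementation.
    import numpy as np

    def hough_lines(points, n_theta=180, n_rho=200):
        thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
        rho_max = np.hypot(*np.abs(points).max(axis=0))
        rho_bins = np.linspace(-rho_max, rho_max, n_rho)
        acc = np.zeros((n_theta, n_rho), dtype=int)
        for x, y in points:
            rhos = x * np.cos(thetas) + y * np.sin(thetas)
            idx = np.digitize(rhos, rho_bins) - 1
            acc[np.arange(n_theta), np.clip(idx, 0, n_rho - 1)] += 1
        i, j = np.unravel_index(acc.argmax(), acc.shape)
        return thetas[i], rho_bins[j], acc[i, j]

    # Points roughly on the line y = 2 (a horizontal roof edge) plus one outlier
    pts = np.array([[x, 2.0] for x in range(10)] + [[3.0, 7.0]])
    theta, rho, votes = hough_lines(pts)
    print(f"theta={np.degrees(theta):.1f} deg, rho={rho:.2f}, votes={votes}")

The 3D variant mentioned in the abstract votes in a (theta, phi, rho) plane-parameter space instead, but the accumulator-and-peak idea is the same.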
Time lens assisted photonic sampling extraction
NASA Astrophysics Data System (ADS)
Petrillo, Keith Gordon
Telecommunication bandwidth demands have dramatically increased in recent years due to Internet-based services like cloud computing and storage, large file sharing, and video streaming. Additionally, sensing systems such as wideband radar, magnetic resonance imaging systems, and complex modulation formats designed to handle large data transfers in telecommunications require high-speed, high-resolution analog-to-digital converters (ADCs) to interpret the data. Accurately processing and acquiring information at next-generation data rates from these systems has become challenging for electronic systems. The largest contributors to the electronic bottleneck are bandwidth and timing jitter, which limit speed and reduce accuracy. Optical systems have been shown to offer at least three orders of magnitude more bandwidth, and state-of-the-art mode-locked lasers have reduced timing jitter to thousands of attoseconds. Such features have encouraged processing signals without the use of electronics, or using photonics to assist electronics. All-optical signal processing has allowed the processing of telecommunication line rates up to 1.28 Tb/s and high-resolution analog-to-digital conversion in the tens of gigahertz. The major drawback of these optical systems is the high cost of the components. The application of all-optical processing techniques such as a time lens and chirped processing can greatly reduce the bandwidth and cost requirements of optical serial-to-parallel converters and push photonically assisted ADCs into the hundreds of gigahertz. In this dissertation, the building blocks of a high-speed photonically assisted ADC are demonstrated, each providing benefits to its own respective application. A serial-to-parallel converter using a continuously operating time lens as an optical Fourier processor is demonstrated to fully convert a 160-Gb/s optical time division multiplexed signal to sixteen 10-Gb/s channels with error-free operation. Using chirped processing, an optical sample-and-hold concept is demonstrated and analyzed as a resolution improvement to existing photonically assisted ADCs. Simulations indicate that applying a continuously operating time lens to a photonically assisted sampling system can improve photonically sampled systems by an order of magnitude while acquiring properties similar to an optical sample-and-hold system.
The New York Trade Publishers and CD-ROM: Where Are They Heading?
ERIC Educational Resources Information Center
Newmark, Eileen
1994-01-01
Discusses the strategies top trade publishers, including Putnam New Media, Paramount Communications, Random House, Penguin USA, and Reader's Digest, are taking toward CD-ROM publishing. Strategies include purchasing electronic media companies; setting up an independent business; and building a new media business by integrating new efforts into the…
Portable Power And Digital-Communication Units
NASA Technical Reports Server (NTRS)
Levin, Richard R.; Henry, Paul K.; Rosenberg, Leigh S.
1992-01-01
Conceptual network of electronic-equipment modules provides electrical power and digital radio communications at multiple sites not served by cables. System includes central communication unit and portable units powered by solar photovoltaic arrays. Useful to serve equipment that must be set up quickly at remote sites or buildings that cannot be modified to provide cable connections.
NASA Astrophysics Data System (ADS)
Liousse, C.; Knippertz, P.; Flamant, C.; Adon, J.; Akpo, A.; Annesi-Maesano, I.; Assamoi, E.; Baeza, A.; Julien, B.; Bedou, M.; Brooks, B. J.; Chiu, J. Y. C.; Chiron, C.; Coe, H.; Danuor, S.; Djossou, J.; Evans, M. J.; Fayomi, B.; Fink, A. H.; Galy-Lacaux, C.; Gardrat, E.; Jegede, O.; Kalthoff, N.; Kedote, M.; Keita, S.; Kouame, K.; Konare, A.; Leon, J. F.; Mari, C. H.; Lohou, F.; Roblou, L.; Schlager, H.; Schwarzenboeck, A.; Toure, E. N.; Veronique, Y.
2016-12-01
The EU-funded project DACCIWA (Dynamics-Aerosol-Chemistry-Cloud Interactions in West Africa) is investigating the relationship between weather, climate, air pollution and health in southern West Africa. The air over the coastal region of West Africa is a unique mixture of natural and anthropogenic gases, liquids and particles, emitted in an environment in which multi-layer cloud decks frequently form. These exert a large influence on the local weather and climate, which has never been studied in detail over West Africa: this information is currently not included in the majority of weather and climate models. For the first time, the entire chain of impacts of natural and manmade emissions on the West African atmosphere was investigated in a coordinated field campaign. As part of this campaign, three research aircraft (Falcon 20, Twin Otter and ATR) based in Lomé (Togo) flew 50 targeted missions over West Africa from 27 June to 16 July 2016. The campaign also set up three highly instrumented inland measurement sites, with weather balloons launched several times a day across the region. The main objective was to build robust statistics of cloud properties in southern West Africa in different chemical landscapes (background state, ship/flaring emissions, polluted megacities, agricultural and forest areas, dust from the Sahel/Sahara). In addition, DACCIWA scientists working on measurements of urban emissions, air pollution, and health have set up four urban sites in Abidjan (Cote d'Ivoire) and Cotonou (Benin) focusing on the main regional combustion sources (domestic fires, traffic and waste burning). Long-term measurements of gases and particles and a census of hospital admissions for respiratory diseases were started in January 2015 and will continue until March 2017 to determine the links between human health and air pollution. Intensive measurement periods took place in July 2015, January 2016, and July 2016 (a final one is planned for January 2017) in order to characterize the toxicological effects of size-speciated aerosol chemical composition. First highlights of the flight sampling strategy, the acquired datasets, and accompanying modelling work will be presented.
Hopkins, Israel Green; Dunn, Kelly; Bourgeois, Fabienne; Rogers, Jayne; Chiang, Vincent W
2016-06-01
The purpose of this case study was to investigate opportunities to electronically enhance the transitions of care for both patients and providers and to describe the process of development and implementation of such tools. We describe the current challenges and fragmentation of care for pediatric patients and families being discharged from inpatient stays, and review barriers to change in practice. Care transitions vary in the complexity of the clinical and social scenarios, and no one-size-fits-all approach works for every patient, provider or hospital system. A substantial challenge for providers designing and implementing digital tools for patients is the complexity of building such tools to apply to such broad populations. Our case study provides a framework using a multidisciplinary approach, brainstorming and rapid digital prototyping to build an in-house electronic discharge follow-up platform. In describing this process, we review design and implementation measures that may further support digital tool development in other areas. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Sanchez, Kevin J.; Roberts, Gregory C.; Calmer, Radiance; Nicoll, Keri; Hashimshoni, Eyal; Rosenfeld, Daniel; Ovadnevaite, Jurgita; Preissler, Jana; Ceburnis, Darius; O'Dowd, Colin; Russell, Lynn M.
2017-08-01
Top-down and bottom-up aerosol-cloud shortwave radiative flux closures were conducted at the Mace Head Atmospheric Research Station in Galway, Ireland, in August 2015. This study is part of the BACCHUS (Impact of Biogenic versus Anthropogenic emissions on Clouds and Climate: towards a Holistic UnderStanding) European collaborative project, with the goal of understanding key processes affecting aerosol-cloud shortwave radiative flux closures to improve future climate predictions and develop sustainable policies for Europe. Instrument platforms include ground-based, unmanned aerial vehicle (UAV)1, and satellite measurements of aerosols, clouds and meteorological variables. The ground-based and airborne measurements of aerosol size distributions and cloud condensation nuclei (CCN) concentration were used to initiate a 1-D microphysical aerosol-cloud parcel model (ACPM). UAVs were equipped for a specific science mission, with an optical particle counter for aerosol distribution profiles, a cloud sensor to measure cloud extinction or a five-hole probe for 3-D wind vectors. UAV cloud measurements are rare and have only become possible in recent years through the miniaturization of instrumentation. These are the first UAV measurements at Mace Head. ACPM simulations are compared to in situ cloud extinction measurements from UAVs to quantify closure in terms of cloud shortwave radiative flux. Two out of seven cases exhibit sub-adiabatic vertical temperature profiles within the cloud, which suggests that entrainment processes affect cloud microphysical properties and lead to an overestimate of simulated cloud shortwave radiative flux. Including an entrainment parameterization and explicitly calculating the entrainment fraction in the ACPM simulations both improved cloud-top radiative closure. Entrainment reduced the difference between simulated and observation-derived cloud-top shortwave radiative flux (δRF) by between 25 and 60 W m-2. After accounting for entrainment, satellite-derived cloud droplet number concentrations (CDNCs) were within 30 % of simulated CDNC. In cases with a well-mixed boundary layer, δRF is no greater than 20 W m-2 after accounting for cloud-top entrainment and up to 50 W m-2 when entrainment is not taken into account. In cases with a decoupled boundary layer, cloud microphysical properties are inconsistent with ground-based aerosol measurements, as expected, and δRF is as high as 88 W m-2 and remains high (> 30 W m-2) even after accounting for cloud-top entrainment. This work demonstrates the need to take in situ measurements of aerosol properties for cases where the boundary layer is decoupled, as well as to consider cloud-top entrainment, to accurately model stratocumulus cloud radiative flux. 1The regulatory term for UAV is remotely piloted aircraft (RPA).
NASA Astrophysics Data System (ADS)
Ranjan, Sukrit; Wordsworth, Robin; Sasselov, Dimitar D.
2017-08-01
Recent findings suggest that Mars may have been a clement environment for the emergence of life and may even have compared favorably to Earth in this regard. These findings have revived interest in the hypothesis that prebiotically important molecules or even nascent life may have formed on Mars and been transferred to Earth. UV light plays a key role in prebiotic chemistry. Characterizing the early martian surface UV environment is key to understanding how Mars compares to Earth as a venue for prebiotic chemistry. Here, we present two-stream, multilayer calculations of the UV surface radiance on Mars at 3.9 Ga to constrain the surface UV environment as a function of atmospheric state. We explore a wide range of atmospheric pressures, temperatures, and compositions that correspond to the diversity of martian atmospheric states consistent with available constraints. We include the effects of clouds and dust. We calculate dose rates to quantify the effect of different atmospheric states on UV-sensitive prebiotic chemistry. We find that, for normative clear-sky CO2-H2O atmospheres, the UV environment on young Mars is comparable to young Earth. This similarity is robust to moderate cloud cover; thick clouds (τcloud ≥ 100) are required to significantly affect the martian UV environment, because cloud absorption is degenerate with atmospheric CO2. On the other hand, absorption from SO2, H2S, and dust is nondegenerate with CO2, meaning that, if these constituents build up to significant levels, surface UV fluence can be suppressed. These absorbers have spectrally variable absorption, meaning that their presence affects prebiotic pathways in different ways. In particular, high SO2 environments may admit UV fluence that favors pathways conducive to abiogenesis over pathways unfavorable to it. However, better measurements of the spectral quantum yields of these pathways are required to evaluate this hypothesis definitively.
NASA Astrophysics Data System (ADS)
Różańska, A.; Nikołajuk, M.; Czerny, B.; Dobrzycki, A.; Hryniewicz, K.; Bechtold, J.; Ebeling, H.
2014-04-01
We present the photoionisation modelling of the intrinsic absorber in the bright quasar HS 1603 + 3820. We constructed the broad-band spectral energy distribution using the optical/UV/X-ray observations from different instruments as inputs for the photoionisation calculations. The spectra from the Keck telescope show extremely high CIV to HI ratios for the first absorber in system A, named A1. This value, together with the high column density of the CIV ion, places strong constraints on the photoionisation model. We used two photoionisation codes to derive the hydrogen number density at the cloud's illuminated surface. By estimating the bolometric luminosity of HS 1603 + 3820 using the typical formula for quasars, we calculated the distance to A1. We found a single photoionisation solution by assuming either a constant-density cloud (modelled using CLOUDY) or a stratified cloud (modelled using TITAN), together with solar abundances. This model explained both the ionic column density of CIV and the high CIV to HI ratio. The location of A1 is 0.1 pc, and it is situated even closer to the nucleus than the possible location of the Broad Line Region in this object. The upper limit of the distance is sensitive to the adopted covering factor and the carbon abundance. Photoionisation modelling always prefers dense clouds with a number density n0 = 10¹⁰–10¹² cm⁻³, which explains the intrinsic absorption in HS 1603 + 3820. This number density is of the same order as that in the disk atmosphere at the implied distance of A1. Therefore, our results show that the disk wind that escapes from the outermost accretion disk atmosphere can build up a dense absorber in quasars.
NASA Astrophysics Data System (ADS)
Li-Chee-Ming, J.; Armenakis, C.
2014-11-01
This paper presents the ongoing development of a small unmanned aerial mapping system (sUAMS) that in the future will track its trajectory and perform 3D mapping in near-real time. As both mapping and tracking algorithms require powerful computational capabilities and large data storage facilities, we propose to use the RoboEarth Cloud Engine (RCE) to offload heavy computation and store data to secure computing environments in the cloud. While the RCE's capabilities have been demonstrated with terrestrial robots in indoor environments, this paper explores the feasibility of using the RCE in mapping and tracking applications in outdoor environments by small UAMS. The experiments presented in this work assess the data processing strategies and evaluate the attainable tracking and mapping accuracies using the data obtained by the sUAMS. Testing was performed with an Aeryon Scout quadcopter. It flew over York University, up to approximately 40 metres above the ground. The quadcopter was equipped with a single-frequency GPS receiver providing positioning to about 3 meter accuracies, an AHRS (Attitude and Heading Reference System) estimating the attitude to about 3 degrees, and an FPV (First Person Viewing) camera. Video images captured from the onboard camera were processed using VisualSFM and SURE, which are being reformed as an Application-as-a-Service via the RCE. The 3D virtual building model of York University was used as a known environment to georeference the point cloud generated from the sUAMS' sensor data. The estimated position and orientation parameters of the video camera show increases in accuracy when compared to the sUAMS' autopilot solution, derived from the onboard GPS and AHRS. The paper presents the proposed approach and the results, along with their accuracies.
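The georeferencing step described here, aligning the sUAMS point cloud to a known 3D building model, can be illustrated with a standard rigid (Kabsch) fit over a handful of correspondences. The sketch below is a generic illustration under assumed correspondences, not the workflow actually implemented with VisualSFM, SURE, or the RCE; the coordinates are invented.

    # Minimal sketch of georeferencing a locally-referenced point cloud against
    # matching points of a known 3D building model via a rigid (Kabsch) fit.
    import numpy as np

    def rigid_fit(src, dst):
        """Least-squares rotation R and translation t such that dst ~ R @ src + t."""
        mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
        H = (src - mu_s).T @ (dst - mu_d)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:              # avoid a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_d - R @ mu_s
        return R, t

    # Hypothetical correspondences: model corners vs. the same corners seen in
    # the sUAMS point cloud (rotated and shifted local frame)
    model = np.array([[0, 0, 0], [10, 0, 0], [10, 5, 0], [0, 5, 12]], float)
    angle = np.radians(30)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                       [np.sin(angle),  np.cos(angle), 0],
                       [0, 0, 1]])
    local = (model @ R_true.T) + np.array([100.0, 200.0, 5.0])

    R, t = rigid_fit(local, model)            # map the local cloud into the model frame
    print(np.allclose(local @ R.T + t, model, atol=1e-6))   # -> True

A real pipeline would obtain the correspondences from matched features or plane fits and would typically also estimate a scale factor, but the core alignment step is the same least-squares rotation-plus-translation.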
Reconstruction of 3d Models from Point Clouds with Hybrid Representation
NASA Astrophysics Data System (ADS)
Hu, P.; Dong, Z.; Yuan, P.; Liang, F.; Yang, B.
2018-05-01
The three-dimensional (3D) reconstruction of urban buildings from point clouds has long been an active topic in applications related to human activities. However, because the structures differ significantly in complexity, the task of 3D reconstruction remains challenging, especially for freeform surfaces. In this paper, we present a new reconstruction algorithm which represents the 3D models of buildings as a combination of regular structures and irregular surfaces, where the regular structures are parameterized plane primitives and the irregular surfaces are expressed as meshes. The extraction of irregular surfaces starts with an over-segmentation of the unstructured point data; a region-growing approach based on the adjacency graph of super-voxels is then applied to collapse these super-voxels, and the freeform surfaces can be clustered from the voxels filtered by a thickness threshold. To extract the regular planar primitives, the remaining voxels with larger flatness are further divided into multiscale super-voxels as basic units, and the final segmented planes are enriched and refined in a mutually reinforcing manner under the framework of a global energy optimization. We have implemented the proposed algorithms and tested them mainly on two point clouds that differ in point density and urban character; experimental results on complex building structures illustrate the efficacy of the proposed framework.
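A small illustration of the flatness test implied by this pipeline: a super-voxel whose smallest PCA eigenvalue is tiny relative to the total variance is treated as planar, otherwise it is left to the freeform (mesh) branch. The threshold, data, and names below are illustrative assumptions, not the paper's parameters.

    # Per-voxel flatness via PCA eigenvalues; illustrative sketch only.
    import numpy as np

    def flatness(points):
        """Ratio of the smallest covariance eigenvalue to their sum (0 = perfect plane)."""
        centered = points - points.mean(axis=0)
        eigvals = np.linalg.eigvalsh(np.cov(centered.T))
        return eigvals[0] / eigvals.sum()

    rng = np.random.default_rng(0)
    wall = np.column_stack([rng.uniform(0, 5, 500), rng.uniform(0, 3, 500),
                            rng.normal(0, 0.005, 500)])        # thin planar slab
    bush = rng.normal(0, 1.0, size=(500, 3))                   # volumetric blob

    for name, pts in [("wall", wall), ("bush", bush)]:
        label = "planar" if flatness(pts) < 0.01 else "freeform"
        print(name, round(flatness(pts), 4), "->", label)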
NASA Technical Reports Server (NTRS)
Pearl, J. C.; Smith, M. D.; Conrath, B. J.; Bandfield, J. L.; Christensen, P. R.
1999-01-01
Successful operation of the Mars Global Surveyor spacecraft beginning in September 1997, has permitted extensive infrared observations of condensation clouds during the martian southern summer and fall seasons (184 deg
Impact of Low Level Clouds on radiative and turbulent surface flux in southern West Africa
NASA Astrophysics Data System (ADS)
Lohou, Fabienne; Kalthoff, Norbert; Dione, Cheikh; Lothon, Marie; Adler, Bianca; Babic, Karmen; Pedruzo-Bagazgoitia, Xabier; Vila-Guerau De Arellano, Jordi
2017-04-01
During the monsoon season in West Africa, low-level clouds form almost every night and break up between 0900 local time and the middle of the afternoon, depending on the day. The break-up of these clouds leads to the formation of boundary-layer cumulus clouds, which can sometimes evolve into deep convection. The low-level clouds have a strong impact on the radiation and energy budget at the surface and consequently on the humidity in the boundary layer and the afternoon convection. During the DACCIWA ground campaign, which took place in June and July 2016, three supersites in Benin, Ghana, and Nigeria were instrumented to document the conditions within the lower troposphere, including the cloud layers. Radiative and turbulent fluxes were measured at several surface stations, jointly with low-level cloud occurrence, over 50 days. These datasets enable the analysis of modifications in the diurnal cycle of the radiative and turbulent surface flux induced by the formation and presence of the low-level clouds. The final objective of this study is to estimate the error made in some NWP simulations when the diurnal cycle of low-level clouds is poorly represented or not represented at all.
Electron Stimulated Desorption Yields at the Mercury's Surface Based On Hybrid Simulation Results
NASA Astrophysics Data System (ADS)
Travnicek, P. M.; Schriver, D.; Orlando, T. M.; Hellinger, P.
2016-12-01
In previous research on the solar wind sputtering process, most of the focus has been on ion sputtering by precipitating solar wind protons; however, precipitating electrons can also result in the desorption of neutrals and ions from Mercury's surface and represent a potentially significant source of exospheric and heavy ion components. Electron stimulated desorption (ESD) is not bound by optical selection rules, and electron impact energies can vary over a much wider range, including core-level excitations that easily lead to multi-electron shake-up events that can cascade into localized multiply charged states that Coulomb explode with extreme kinetic energy release (up to 8 eV = 186,000 K). While considered for the lunar exosphere, ESD has not been adequately studied or quantified as a producer of neutrals and ions. ESD is a well-known process which involves the excitation (often ionization) of a surface target followed by charge ejection, bond breaking and ion expulsion due to the resultant Coulomb repulsion. Though the role of ESD processes has not been discussed much with respect to Mercury, the impinging energetic electrons that are transported through the magnetosphere and precipitate can induce significant material removal. Given the energetics and the wide band-gap nature of the minerals, the departing material may also be primarily ionic. The possible role of 5 eV–1 keV electron stimulated desorption and dissociation in "weathering" the regolith can be significant. ESD yields will be calculated based on the ion and electron precipitation profiles from the hybrid and electron simulations that have already been carried out. Neutral and ion cloud profiles around Mercury will be calculated and combined with those profiles expected from PSD and MIV.
CLOUD PEAK CONTIGUOUS, ROCK CREEK, PINEY CREEK, AND LITTLE GOOSE ROADLESS AREAS, WYOMING.
Segerstrom, Kenneth; Brown, Don S.
1984-01-01
On the basis of mineral surveys, study areas surrounding the Cloud Peak Primitive Area in northern Wyoming offer little promise for the occurrence of mineral or energy resources. The geologic setting precludes the existence of deposits of organic fuels. Nonmetallic commodities, such as feldspar, limestone, building stone, clay, sand, and gravel are present, but these materials are readily available nearby in large quantities in more accessible areas.
A cloud-based data network approach for translational cancer research.
Xing, Wei; Tsoumakos, Dimitrios; Ghanem, Moustafa
2015-01-01
We develop a new model and associated technology for constructing and managing self-organizing data to support translational cancer research studies. We employ a semantic content network approach to address the challenges of managing cancer research data. Such data is heterogeneous, large, decentralized, growing and continually being updated. Moreover, the data originates from different information sources that may be partially overlapping, creating redundancies as well as contradictions and inconsistencies. Building on the advantages of elasticity of cloud computing, we deploy the cancer data networks on top of the CELAR Cloud platform to enable more effective processing and analysis of Big cancer data.
Parametrically Optimized Carbon Nanotube-Coated Cold Cathode Spindt Arrays
Yuan, Xuesong; Cole, Matthew T.; Zhang, Yu; Wu, Jianqiang; Milne, William I.; Yan, Yang
2017-01-01
Here, we investigate, through parametrically optimized macroscale simulations, the field electron emission from arrays of carbon nanotube (CNT)-coated Spindts towards the development of an emerging class of novel vacuum electron devices. The present study builds on empirical data gleaned from our recent experimental findings on the room temperature electron emission from large area CNT electron sources. We determine the field emission current of the present microstructures directly using particle in cell (PIC) software and present a new CNT cold cathode array variant which has been geometrically optimized to provide maximal emission current density, with current densities of up to 11.5 A/cm2 at low operational electric fields of 5.0 V/μm. PMID:28336845
Abdo, A A; Ackermann, M; Ajello, M; Atwood, W B; Axelsson, M; Baldini, L; Ballet, J; Barbiellini, G; Bastieri, D; Battelino, M; Baughman, B M; Bechtol, K; Bellazzini, R; Berenji, B; Blandford, R D; Bloom, E D; Bogaert, G; Bonamente, E; Borgland, A W; Bregeon, J; Brez, A; Brigida, M; Bruel, P; Burnett, T H; Caliandro, G A; Cameron, R A; Caraveo, P A; Carlson, P; Casandjian, J M; Cecchi, C; Charles, E; Chekhtman, A; Cheung, C C; Chiang, J; Ciprini, S; Claus, R; Cohen-Tanugi, J; Cominsky, L R; Conrad, J; Cutini, S; Dermer, C D; de Angelis, A; de Palma, F; Digel, S W; Di Bernardo, G; do Couto E Silva, E; Drell, P S; Dubois, R; Dumora, D; Edmonds, Y; Farnier, C; Favuzzi, C; Focke, W B; Frailis, M; Fukazawa, Y; Funk, S; Fusco, P; Gaggero, D; Gargano, F; Gasparrini, D; Gehrels, N; Germani, S; Giebels, B; Giglietto, N; Giordano, F; Glanzman, T; Godfrey, G; Grasso, D; Grenier, I A; Grondin, M-H; Grove, J E; Guillemot, L; Guiriec, S; Hanabata, Y; Harding, A K; Hartman, R C; Hayashida, M; Hays, E; Hughes, R E; Jóhannesson, G; Johnson, A S; Johnson, R P; Johnson, W N; Kamae, T; Katagiri, H; Kataoka, J; Kawai, N; Kerr, M; Knödlseder, J; Kocevski, D; Kuehn, F; Kuss, M; Lande, J; Latronico, L; Lemoine-Goumard, M; Longo, F; Loparco, F; Lott, B; Lovellette, M N; Lubrano, P; Madejski, G M; Makeev, A; Massai, M M; Mazziotta, M N; McConville, W; McEnery, J E; Meurer, C; Michelson, P F; Mitthumsiri, W; Mizuno, T; Moiseev, A A; Monte, C; Monzani, M E; Moretti, E; Morselli, A; Moskalenko, I V; Murgia, S; Nolan, P L; Norris, J P; Nuss, E; Ohsugi, T; Omodei, N; Orlando, E; Ormes, J F; Ozaki, M; Paneque, D; Panetta, J H; Parent, D; Pelassa, V; Pepe, M; Pesce-Rollins, M; Piron, F; Pohl, M; Porter, T A; Profumo, S; Rainò, S; Rando, R; Razzano, M; Reimer, A; Reimer, O; Reposeur, T; Ritz, S; Rochester, L S; Rodriguez, A Y; Romani, R W; Roth, M; Ryde, F; Sadrozinski, H F-W; Sanchez, D; Sander, A; Saz Parkinson, P M; Scargle, J D; Schalk, T L; Sellerholm, A; Sgrò, C; Smith, D A; Smith, P D; Spandre, G; Spinelli, P; Starck, J-L; Stephens, T E; Strickman, M S; Strong, A W; Suson, D J; Tajima, H; Takahashi, H; Takahashi, T; Tanaka, T; Thayer, J B; Thayer, J G; Thompson, D J; Tibaldo, L; Tibolla, O; Torres, D F; Tosti, G; Tramacere, A; Uchiyama, Y; Usher, T L; Van Etten, A; Vasileiou, V; Vilchez, N; Vitale, V; Waite, A P; Wallace, E; Wang, P; Winer, B L; Wood, K S; Ylinen, T; Ziegler, M
2009-05-08
Designed as a high-sensitivity gamma-ray observatory, the Fermi Large Area Telescope is also an electron detector with a large acceptance exceeding 2 m² sr at 300 GeV. Building on the gamma-ray analysis, we have developed an efficient electron detection strategy which provides sufficient background rejection for measurement of the steeply falling electron spectrum up to 1 TeV. Our high precision data show that the electron spectrum falls with energy as E^-3.0 and does not exhibit prominent spectral features. Interpretations in terms of a conventional diffusive model as well as a potential local extra component are briefly discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vay, J.-L.; Furman, M.A.; Azevedo, A.W.
2004-04-19
We have integrated the electron-cloud code POSINST [1] with WARP [2]--a 3-D parallel Particle-In-Cell accelerator code developed for Heavy Ion Inertial Fusion--so that the two can interoperate. Both codes are run in the same process, communicate through a Python interpreter (already used in WARP), and share certain key arrays (so far, particle positions and velocities). Currently, POSINST provides primary and secondary sources of electrons, beam bunch kicks, a particle mover, and diagnostics. WARP provides the field solvers and diagnostics. Secondary emission routines are provided by the Tech-X package CMEE.
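The coupling pattern described here, two codes living in one process and sharing key particle arrays rather than copying them, can be illustrated with a small toy example. The sketch below is purely illustrative of that pattern: the class names, kick, and mover are invented and do not correspond to the POSINST, WARP, or CMEE interfaces.

    # Toy sketch of in-process coupling via shared arrays: a kick applied by
    # one module is immediately visible to the other's particle mover.
    # Illustrative only; not the POSINST/WARP API.
    import numpy as np

    class ElectronSource:                      # stands in for POSINST's role
        def __init__(self, n):
            self.x = np.zeros((n, 3))          # positions (shared)
            self.v = np.zeros((n, 3))          # velocities (shared)

        def beam_kick(self, dv):
            self.v += dv                       # e.g. kick from a passing bunch

    class FieldSolver:                         # stands in for WARP's role
        def __init__(self, x, v):
            self.x, self.v = x, v              # same arrays, no copy

        def push(self, dt):
            self.x += self.v * dt              # simple drift step

    cloud = ElectronSource(4)
    solver = FieldSolver(cloud.x, cloud.v)     # share the key arrays

    cloud.beam_kick(np.array([1e5, 0.0, 0.0])) # source-side update...
    solver.push(dt=1e-9)                       # ...seen immediately by the mover
    print(cloud.x[0])                          # -> approximately [1e-04, 0, 0]

In the actual integration, the Python interpreter plays the role of the shared process, with POSINST supplying electron sources, kicks, a mover, and diagnostics, and WARP supplying the field solvers.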
STS-69 Flight Day 9 Video File
NASA Technical Reports Server (NTRS)
1995-01-01
The song, 'He's A Tramp', from the Walt Disney cartoon movie, 'Lady and the Tramp', awakened the astronauts, Cmdr. Dave Walker, Pilot Ken Cockrell, and Mission Specialists Jim Voss, Jim Newman, and Mike Gernhardt, on the ninth day of the STS-69 mission. The Wake Shield Facility (WSF) was again unberthed from the shuttle cargo bay and, using the shuttle's robot arm, held over the side of the shuttle for five hours where it collected data on the electrical field build-up around the spacecraft as part of the Charging Hazards and Wake Studies Experiment (CHAWS). Voss and Gernhardt rehearsed their Extravehicular Activity (EVA) spacewalk, which was planned for the next day. Earth views included cloud cover, a hurricane, and its eye.
STS-69 flight day 9 highlights
NASA Astrophysics Data System (ADS)
1995-09-01
The song, 'He's A Tramp', from the Walt Disney cartoon movie, 'Lady and the Tramp', awakened the astronauts, Cmdr. Dave Walker, Pilot Ken Cockrell, and Mission Specialists Jim Voss, Jim Newman, and Mike Gernhardt, on the ninth day of the STS-69 mission. The Wake Shield Facility (WSF) was again unberthed from the shuttle cargo bay and, using the shuttle's robot arm, held over the side of the shuttle for five hours where it collected data on the electrical field build-up around the spacecraft as part of the Charging Hazards and Wake Studies Experiment (CHAWS). Voss and Gernhardt rehearsed their Extravehicular Activity (EVA) spacewalk, which was planned for the next day. Earth views included cloud cover, a hurricane, and its eye.
ICESat: Ice, Cloud and Land Elevation Satellite
NASA Technical Reports Server (NTRS)
Zwally, Jay; Shuman, Christopher
2002-01-01
Ice exists in the natural environment in many forms. The Earth's dynamic ice features show that at high elevations and/or high latitudes, snow that falls to the ground can gradually build up to form thick, consolidated ice masses called glaciers. Glaciers flow downhill under the force of gravity and can extend into areas that are too warm to support year-round snow cover. The snow line, called the equilibrium line on a glacier or ice sheet, separates the ice areas that melt on the surface and become snow free in summer (net ablation zone) from the ice areas that remain snow covered during the entire year (net accumulation zone). Snow near the surface of a glacier that is gradually being compressed into solid ice is called firn.
A cloud-based system for measuring radiation treatment plan similarity
NASA Astrophysics Data System (ADS)
Andrea, Jennifer
PURPOSE: Radiation therapy is used to treat cancer using carefully designed plans that maximize the radiation dose delivered to the target and minimize damage to healthy tissue, with the dose administered over multiple occasions. Creating treatment plans is a laborious process and presents an obstacle to more frequent replanning, which remains an unsolved problem. However, in between new plans being created, the patient's anatomy can change due to multiple factors including reduction in tumor size and loss of weight, which results in poorer patient outcomes. Cloud computing is a newer technology that is slowly being used for medical applications with promising results. The objective of this work was to design and build a system that could analyze a database of previously created treatment plans, which are stored with their associated anatomical information in studies, to find the one with the most similar anatomy to a new patient. The analyses would be performed in parallel on the cloud to decrease the computation time of finding this plan. METHODS: The system used SlicerRT, a radiation therapy toolkit for the open-source platform 3D Slicer, for its tools to perform the similarity analysis algorithm. Amazon Web Services was used for the cloud instances on which the analyses were performed, as well as for storage of the radiation therapy studies and messaging between the instances and a master local computer. A module was built in SlicerRT to provide the user with an interface to direct the system on the cloud, as well as to perform other related tasks. RESULTS: The cloud-based system out-performed previous methods of conducting the similarity analyses in terms of time, as it analyzed 100 studies in approximately 13 minutes, and produced the same similarity values as those methods. It also scaled up to larger numbers of studies to analyze in the database with a small increase in computation time of just over 2 minutes. CONCLUSION: This system successfully analyzes a large database of radiation therapy studies and finds the one that is most similar to a new patient, which represents a potential step forward in achieving feasible adaptive radiation therapy replanning.
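The parallel fan-out at the heart of this system, comparing one new patient's anatomy against every stored study at the same time and keeping the best match, can be sketched generically as below. Local worker processes stand in for the cloud instances, and the Dice-style overlap metric is an illustrative stand-in for the SlicerRT similarity algorithm; none of the names reflect the actual system.

    # Illustrative sketch of parallel similarity search over a study database.
    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def dice(job):
        """Dice overlap between two binary masks; a stand-in similarity metric."""
        study_id, a, b = job
        a, b = a.astype(bool), b.astype(bool)
        return study_id, 2 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    def most_similar(new_mask, database):
        jobs = [(sid, mask, new_mask) for sid, mask in database.items()]
        with ProcessPoolExecutor() as pool:          # workers ~ cloud instances
            scores = list(pool.map(dice, jobs))
        return max(scores, key=lambda s: s[1])

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        new_patient = rng.random((32, 32, 32)) > 0.5
        db = {f"study_{i:03d}": rng.random((32, 32, 32)) > 0.5 for i in range(20)}
        print(most_similar(new_patient, db))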
NASA Astrophysics Data System (ADS)
Park, Shin-Young; Lee, Hyo-Jung; Kang, Jeong-Eon; Lee, Taehyoung; Kim, Cheol-Hee
2018-01-01
The online model, the Weather Research and Forecasting Model with Chemistry (WRF-Chem), is employed to interpret the effects of aerosol-cloud-precipitation interaction on mesoscale meteorological fields over Northeast Asia during the Megacity Air Pollution Study-Seoul (MAPS-Seoul) 2015 campaign. The MAPS-Seoul campaign is a pre-campaign of the Korea-United States Air Quality (KORUS-AQ) campaign conducted over the Korean Peninsula. We validated the WRF-Chem simulations during the campaign period, and analyzed aerosol-warm cloud interactions by diagnosing the aerosol direct, indirect, and total effects. The results demonstrated that aerosol directly decreased downward shortwave radiation by up to 44% (-282 W m-2) for this period and subsequently increased downward longwave radiation by up to 15% (∼52 W m-2) in the presence of low-level clouds along the thematic area. Aerosol indirectly increased cloud fraction by up to ∼24%, with increases of both the liquid water path and the droplet number mixing ratio. Precipitation properties were altered both directly and indirectly. Direct effects simply changed cloud-precipitation quantities via a simple updraft process associated with perturbed radiation and temperature, while indirect effects mainly suppressed precipitation, but sometimes increased precipitation in a higher-relative-humidity atmosphere or under near vapor-saturated conditions. The total aerosol effects caused a time lag of the precipitation rate, with a delayed onset time of up to 9 h. This implies the importance of aerosol effects in improving mesoscale precipitation rate prediction in the online approach in the presence of non-linear warm cloud processes.
Cloud immersion building shielding factors for US residential structures.
Dickson, E D; Hamby, D M
2014-12-01
This paper presents validated building shielding factors designed for contemporary US housing-stock under an idealized, yet realistic, exposure scenario within a semi-infinite cloud of radioactive material. The building shielding factors are intended for use in emergency planning and level three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-unit models using the general-purpose Monte Carlo N-Particle computational code, MCNP5, and are benchmarked against a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing-stock. Each model was designed to scale based on common residential construction practices and includes, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without basements, as well as for single-wide manufactured housing-units.
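As a small worked example of the weighted-average step mentioned above, floor-specific shielding factors can be combined into one representative building factor using occupancy weights. The numbers below are invented for illustration and are not values from the paper.

    # Illustrative arithmetic only: occupancy-weighted building shielding factor.
    floors = {                  # (shielding factor, occupancy weight) -- made-up values
        "basement":     (0.05, 0.15),
        "first floor":  (0.40, 0.55),
        "second floor": (0.55, 0.30),
    }
    weighted = sum(sf * w for sf, w in floors.values()) / \
               sum(w for _, w in floors.values())
    print(f"representative building shielding factor ~ {weighted:.2f}")
    # 0.05*0.15 + 0.40*0.55 + 0.55*0.30 = 0.3925, i.e. ~0.39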
Automatic Reconstruction of 3D Building Models from Terrestrial Laser Scanner Data
NASA Astrophysics Data System (ADS)
El Meouche, R.; Rezoug, M.; Hijazi, I.; Maes, D.
2013-11-01
With modern 3D laser scanners we can acquire a large amount of 3D data in only a few minutes. This technology results in a growing number of applications ranging from the digitization of historical artifacts to facial authentication. The modeling process demands a lot of time and work (Tim Volodine, 2007). In comparison with the other two stages, the acquisition and the registration, the degree of automation of the modeling stage is almost zero. In this paper, we propose a new surface reconstruction technique for buildings to process the data obtained by a 3D laser scanner. These data are called a point cloud, which is a collection of points sampled from the surface of a 3D object. Such a point cloud can consist of millions of points. In order to work more efficiently, we worked with simplified models which contain fewer points and thus less detail than a point cloud obtained in situ. The goal of this study was to facilitate the modeling process of a building starting from 3D laser scanner data. In order to do this, we wrote two scripts for Rhinoceros 5.0 based on intelligent algorithms. The first script finds the exterior outline of a building. With a minimum of human interaction, a thin box is drawn around the surface of a wall. This box is able to rotate 360° around an axis in a corner of the wall in search of the points of other walls. In this way we can eliminate noise points. These are unwanted or irrelevant points. If there is an angled roof, the box can also turn around the edge of the wall and the roof. With the different positions of the box we can calculate the exterior outline. The second script draws the interior outlines in a building surface. By interior outline we mean the outline of openings such as windows or doors. This script is based on the distances between the points and vector characteristics. Two consecutive points separated by a relatively large distance mark the outline of an opening. Once those points are found, the interior outline can be drawn. For simple point clouds, the designed scripts are able to eliminate almost all noise points and reconstruct a CAD model.
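An illustrative sketch (not the authors' Rhinoceros script) of the gap rule described above: along a row of scanned wall points, two consecutive points separated by a distance much larger than the typical point spacing are taken as the edges of an opening. The threshold factor and array layout are assumptions.

```python
# Gap-based detection of opening edges in an ordered row of wall points.
import numpy as np

def find_opening_edges(points, gap_factor=4.0):
    """points: (N, 3) array of wall points ordered along a horizontal scan line.
    Returns index pairs (i, i+1) whose spacing suggests an opening between them."""
    diffs = np.linalg.norm(np.diff(points, axis=0), axis=1)   # consecutive spacings
    typical = np.median(diffs)                                # robust estimate of point spacing
    return [(i, i + 1) for i, d in enumerate(diffs) if d > gap_factor * typical]

# Example: a 1 m row of points with a 0.3 m gap (e.g. a window edge) near x = 0.5
row = np.array([[x, 0.0, 1.5] for x in np.r_[np.arange(0.0, 0.5, 0.02), np.arange(0.8, 1.0, 0.02)]])
print(find_opening_edges(row))   # reports the single large gap
```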
A Standardized Based Approach to Managing Atmosphere Studies For Wind Energy Research
NASA Astrophysics Data System (ADS)
Stephan, E.; Sivaraman, C.
2015-12-01
Atmosphere to Electrons (A2e) is a multi-year U.S. Department of Energy (DOE) research initiative targeting significant reductions in the cost of wind energy through an improved understanding of the complex physics governing wind flow into and through wind farms. Better insight into the flow physics has the potential to reduce wind farm energy losses by up to 20%, to reduce annual operational costs by hundreds of millions of dollars, and to improve project financing terms to more closely resemble traditional capital projects. The Data Archive and Portal (DAP) is a key capability of the A2e initiative. The DAP is a cloud-based distributed system known as the 'Wind Cloud' that functions as a repository for all A2e data. These data include numerous historic and ongoing field studies involving in situ and remote sensing instruments, simulations, and scientific analysis. Significantly, it is the integration and sharing of these diverse data sets through the DAP that is key to meeting the goals of A2e. This cloud will be accessible via an open and easy-to-navigate user interface that facilitates community data access, interaction, and collaboration. DAP management is working with the community, industry, and international standards bodies to develop standards for wind data and to capture important characteristics of all data in the Wind Cloud. Security will be provided to facilitate storage of proprietary data alongside publicly accessible data in the Wind Cloud, and the capability to generate anonymized data will be provided to facilitate the use of private data by non-privileged users (when appropriate). Finally, limited computing capabilities will be provided to facilitate co-located data analysis, validation, and generation of derived products in support of A2e science.
NASA Astrophysics Data System (ADS)
Usachev, A. D.; Zobnin, A. V.; Shonenkov, A. V.; Lipaev, A. M.; Molotkov, V. I.; Petrov, O. F.; Fortov, V. E.; Pustyl'nik, M. Y.; Fink, M. A.; Thoma, M. A.; Thomas, H. M.; Padalka, G. I.
2018-01-01
The influence of an elongated dust cloud on the intensities of different neon spectral lines in the visible and near-IR spectral ranges in a uniform positive column has been experimentally investigated using the Russian-European space apparatus “Plasma Kristall-4” (SA PK-4) on board the International Space Station (ISS). The investigation was performed in a low-pressure (0.5 mbar) direct current (dc, 1 mA) gas discharge in neon. Microgravity allowed us to perform experiments with a large dust cloud in the steady-state regime. To avoid dust cloud drift in the dc electric field, a switching dc polarity discharge mode was applied. During the experiment a dust cloud 9 mm in diameter and about 100 mm long was observed in the steady-state regime in a discharge tube 30 mm in diameter. Under these conditions, the intensities of neon spectral lines corresponding to 3p → 3s electronic transitions increased by a factor of 1.4, while the intensities of neon spectral lines corresponding to 3d → 3p electronic transitions increased by a factor of 1.6. The observed phenomenon is explained on the basis of the Schottky approach by a self-consistent rising dc electric field in the dusty plasma cloud resulting in an increase in the electron temperature.
Applications of Panoramic Images: from 720° Panorama to Interior 3d Models of Augmented Reality
NASA Astrophysics Data System (ADS)
Lee, I.-C.; Tsai, F.
2015-05-01
A series of panoramic images is usually used to generate a 720° panorama image. Although panoramic images are typically used for establishing tour guiding systems, in this research we demonstrate the potential of using panoramic images acquired from multiple sites to create not only 720° panoramas, but also three-dimensional (3D) point clouds and 3D indoor models. Since 3D modeling is one of the goals of this research, the locations of the panoramic sites needed to be carefully planned in order to maintain a robust result for close-range photogrammetry. After the images are acquired, they are processed into 720° panoramas, and these panoramas can be used directly in panorama guiding systems or other applications. In addition to these straightforward applications, interior orientation parameters can also be estimated while generating the 720° panorama. These parameters are focal length, principal point, and lens radial distortion. The panoramic images can then be processed with close-range photogrammetry procedures to extract the exterior orientation parameters and generate 3D point clouds. In this research, VisualSFM, a structure-from-motion software package, is used to estimate the exterior orientation, and the CMVS toolkit is used to generate 3D point clouds. Next, the 3D point clouds are used as references to create building interior models. In this research, Trimble SketchUp was used to build the model, and the 3D point cloud was used to determine the locations of building objects using a plane-finding procedure. In the texturing process, the panorama images are used as the data source for creating model textures. This 3D indoor model was used as an Augmented Reality model replacing a guide map or a floor plan commonly used in an on-line touring guide system. The 3D indoor model generating procedure has been utilized in two research projects: a cultural heritage site at Kinmen, and the Taipei Main Station pedestrian zone guidance and navigation system. The results presented in this paper demonstrate the potential of using panoramic images to generate 3D point clouds and 3D models. However, it is currently a manual and labor-intensive process. Research is being carried out to increase the degree of automation of these procedures.
Gould's Belt, Interstellar Clouds, and the Eocene-Oligocene Helium-3 Spike
NASA Technical Reports Server (NTRS)
Rubincam, David Parry
2015-01-01
Drag from hydrogen in the interstellar cloud which formed Gould's Belt may have sent small meteoroids with embedded helium to the Earth, perhaps explaining part or all of the (sup 3) He spike seen in the sedimentary record at the Eocene-Oligocene transition. Assuming the Solar System passed through part of the cloud, meteoroids in the asteroid belt up to centimeter size may have been dragged to the resonances, where their orbital eccentricities were pumped up into Earth-crossing orbits.
Upper D region chemical kinetic modeling of LORE relaxation times
NASA Astrophysics Data System (ADS)
Gordillo-Vázquez, F. J.; Luque, A.; Haldoupis, C.
2016-04-01
The recovery times of upper D region electron density elevations, caused by lightning-induced electromagnetic pulses (EMP), are modeled. The work was motivated by the need to understand a recently identified narrowband VLF perturbation named LOREs, an acronym for LOng Recovery Early VLF events. LOREs associate with long-living electron density perturbations in the upper D region ionosphere; they are generated by strong EMP radiated from large peak current intensities of ±CG (cloud-to-ground) lightning discharges, known also to be capable of producing elves. Relaxation model scenarios are considered first for a weak enhancement in electron density and then for a much stronger one caused by an intense lightning EMP acting as an impulsive ionization source. The full nonequilibrium kinetic modeling of the perturbed mesosphere in the 76 to 92 km range during LORE-occurring conditions predicts that the electron density relaxation time is controlled by electron attachment at lower altitudes, whereas above 79 km attachment is balanced totally by associative electron detachment so that electron loss at these higher altitudes is controlled mainly by electron recombination with hydrated positive clusters H+(H2O)n and secondarily by dissociative recombination with NO+ ions, a process which gradually dominates at altitudes >88 km. The calculated recovery times agree fairly well with LORE observations. In addition, a simplified (quasi-analytic) model built for the key charged species and chemical reactions is applied, which arrives at results similar to those of the full kinetic model. Finally, the modeled recovery estimates for lower altitudes, that is <79 km, are in good agreement with the observed short recovery times of typical early VLF events, which are known to be associated with sprites.
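A schematic electron continuity balance (illustrative only; the paper's full kinetic scheme involves many more species and reactions, and the rate symbols below are introduced here) makes the altitude dependence described above explicit:

$$\frac{dN_e}{dt} \simeq -\beta_a N_e \; + \; \gamma_d N^- \; - \; \alpha_{\rm eff}\, N_e N^+ ,$$

where $\beta_a$ is an effective electron attachment rate, $\gamma_d$ the associative-detachment rate acting on the negative-ion population $N^-$, and $\alpha_{\rm eff}$ an effective recombination coefficient with positive ions $N^+$ (hydrated clusters and NO+). Below about 79 km the attachment term sets the recovery time, $\tau \approx 1/\beta_a$; above that height attachment and detachment nearly cancel, leaving $\tau \approx 1/(\alpha_{\rm eff} N^+)$, which is the recombination-controlled regime described in the abstract.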
NASA Astrophysics Data System (ADS)
Lawson, P.; Stamnes, K.; Stamnes, J.; Zmarzly, P.; O'Connor, D.; Koskulics, J.; Hamre, B.
2008-12-01
A tethered balloon system specifically designed to collect microphysical data in mixed-phase clouds was deployed in Arctic stratus clouds during May 2008 near Ny-Ålesund, Svalbard, at 79 degrees North latitude. This is the first time a tethered balloon system with a cloud particle imager (CPI) that records high-resolution digital images of cloud drops and ice particles has been operated in cloud. The custom tether supplies electrical power to the instrument package, which in addition to the CPI houses a 4-pi short-wavelength radiometer and a met package that measures temperature, humidity, pressure, GPS position, wind speed and direction. The instrument package was profiled vertically through cloud up to altitudes of 1.6 km. Since power was supplied to the instrument package from the ground, it was possible to keep the balloon package aloft for extended periods of time, up to 9 hours at Ny-Ålesund, limited only by crew fatigue. CPI images of cloud drops and the sizes, shapes and degree of riming of ice particles are shown throughout vertical profiles of Arctic stratus clouds. The images show large regions of mixed-phase cloud from -8 to -2 C. The predominant ice crystal habits in these regions are needles and aggregates of needles. The amount of ice in the mixed-phase clouds varied considerably and did not appear to be a function of temperature. On some occasions, ice was observed near cloud base at -2 C with supercooled cloud above to -8 C that was devoid of ice. Measurements of shortwave radiation are also presented. Correlations between particle distributions and radiative measurements will be analyzed to determine the effect of these Arctic stratus clouds on radiative forcing.
Galactic cold cores. IV. Cold submillimetre sources: catalogue and statistical analysis
NASA Astrophysics Data System (ADS)
Montillaud, J.; Juvela, M.; Rivera-Ingraham, A.; Malinen, J.; Pelkonen, V.-M.; Ristorcelli, I.; Montier, L.; Marshall, D. J.; Marton, G.; Pagani, L.; Toth, L. V.; Zahorecz, S.; Ysard, N.; McGehee, P.; Paladini, R.; Falgarone, E.; Bernard, J.-P.; Motte, F.; Zavagno, A.; Doi, Y.
2015-12-01
Context. For the project Galactic cold cores, Herschel photometric observations were carried out as a follow-up of cold regions of interstellar clouds previously identified with the Planck satellite. The aim of the project is to derive the physical properties of the population of cold sources and to study its connection to ongoing and future star formation. Aims: We build a catalogue of cold sources within the clouds in 116 fields observed with the Herschel PACS and SPIRE instruments. We wish to determine the general physical characteristics of the cold sources and to examine the correlations with their host cloud properties. Methods: From Herschel data, we computed colour temperature and column density maps of the fields. We estimated the distance to the target clouds and provide both uncertainties and reliability flags for the distances. The getsources multiwavelength source extraction algorithm was employed to build a catalogue of several thousand cold sources. Mid-infrared data were used, along with colour and position criteria, to separate starless and protostellar sources. We also propose another classification method based on submillimetre temperature profiles. We analysed the statistical distributions of the physical properties of the source samples. Results: We provide a catalogue of ~4000 cold sources within or near star forming clouds, most of which are located either in nearby molecular complexes (≲1 kpc) or in star forming regions of the nearby galactic arms (~2 kpc). About 70% of the sources have a size compatible with an individual core, and 35% of those sources are likely to be gravitationally bound. Significant statistical differences in physical properties are found between starless and protostellar sources, in column density versus dust temperature, mass versus size, and mass versus dust temperature diagrams. The core mass functions are very similar to those previously reported for other regions. On statistical grounds we find that gravitationally bound sources have higher background column densities (median Nbg(H2) ~ 5 × 1021 cm-2) than unbound sources (median Nbg(H2) ~ 3 × 1021 cm-2). These values of Nbg(H2) are higher for higher dust temperatures of the external layers of the parent cloud. However, only in a few cases do we find clear Nbg(H2) thresholds for the presence of cores. The dust temperatures of cloud external layers show clear variations with galactic location, as may the source temperatures. Conclusions: Our data support a more complex view of star formation than in the simple idea of a column density threshold. They show a clear influence of the surrounding UV-visible radiation on how cores distribute in their host clouds with possible variations on the Galactic scale. Planck (http://www.esa.int/Planck) is a project of the European Space Agency - ESA - with instruments provided by two scientific consortia funded by ESA member states (in particular the lead countries: France and Italy) with contributions from NASA (USA), and telescope reflectors provided in a collaboration between ESA and a scientific consortium led and funded by Denmark.Herschel is an ESA space observatory with science instruments provided by European-led Principal Investigator consortia and with important participation from NASA.Full Table B.1 is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/584/A92
Molecules from Clouds to Planets: Sweet Results from Alma
NASA Astrophysics Data System (ADS)
van Dishoeck, Ewine
2017-06-01
One of the most exciting developments in astronomy is the discovery of thousands of planets around stars other than our Sun. But how do these exo-planets form, and which chemical ingredients are available to build them? Thanks to powerful new telescopes, especially the Atacama Large Millimeter/submillimeter Array (ALMA), astronomers are starting to address these age-old questions scientifically. Stars and planets are born in the cold and tenuous clouds between the stars in the Milky Way. In spite of the extremely low temperatures and densities, a surprisingly rich and interesting chemistry occurs in these interstellar clouds, as evidenced by the detection of more than 180 different molecules. Highly accurate spectroscopic data are key to their identification, and examples of the continued need and close interaction between laboratory work and astronomical observations will be given. ALMA now allows us to zoom in on solar system construction for the first time. Spectral scans of the birth sites of young stars contain tens of thousands of rotational lines. Water and a surprisingly rich variety of organic materials are found, including simple sugars and high abundances of deuterated species. How are these molecules formed? Can these pre-biotic molecules end up on new planets and form the basis for life elsewhere in the universe? Stay tuned for the latest analyses and also a comparison with recent results from the Rosetta mission to comet 67 P/C-G in our own Solar System.
MJO prediction skill of the subseasonal-to-seasonal (S2S) prediction models
NASA Astrophysics Data System (ADS)
Son, S. W.; Lim, Y.; Kim, D.
2017-12-01
The Madden-Julian Oscillation (MJO), the dominant mode of tropical intraseasonal variability, provides the primary source of tropical and extratropical predictability on subseasonal to seasonal timescales. To better understand its predictability, this study conducts a quantitative evaluation of MJO prediction skill in the state-of-the-art operational models participating in the subseasonal-to-seasonal (S2S) prediction project. Based on a bivariate correlation coefficient threshold of 0.5, the S2S models exhibit MJO prediction skill ranging from 12 to 36 days. These prediction skills are affected by both the MJO amplitude and phase errors, the latter becoming more important with forecast lead time. Consistent with previous studies, MJO events with stronger initial amplitude are typically better predicted. However, essentially no sensitivity to the initial MJO phase is observed. Overall MJO prediction skill and its inter-model spread are further related to the model mean biases in moisture fields and longwave cloud-radiation feedbacks. In most models, a dry bias quickly builds up in the deep tropics, especially across the Maritime Continent, weakening the horizontal moisture gradient. This likely dampens the organization and propagation of the MJO. Most S2S models also underestimate the longwave cloud-radiation feedbacks in the tropics, which may affect the maintenance of the MJO convective envelope. In general, the models with a smaller bias in horizontal moisture gradient and longwave cloud-radiation feedbacks show a higher MJO prediction skill, suggesting that improving those processes would enhance MJO prediction skill.
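The skill threshold quoted above is commonly evaluated with the bivariate correlation of the two real-time multivariate MJO (RMM) indices between forecasts and observations; a minimal sketch of that metric, with assumed array names and shapes, is:

```python
# Bivariate (RMM1, RMM2) correlation skill for one forecast lead time.
# obs and fcst: arrays of shape (n_times, 2) holding observed / forecast (RMM1, RMM2).
import numpy as np

def bivariate_correlation(obs, fcst):
    num = np.sum(obs[:, 0] * fcst[:, 0] + obs[:, 1] * fcst[:, 1])
    den = np.sqrt(np.sum(obs**2)) * np.sqrt(np.sum(fcst**2))
    return num / den

# MJO prediction skill is then often quoted as the longest lead time at which
# this correlation remains above 0.5.
```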
Accuracy assessment of building point clouds automatically generated from iphone images
NASA Astrophysics Data System (ADS)
Sirmacek, B.; Lindenbergh, R.
2014-06-01
Low-cost, sensor-generated 3D models can be useful for quick 3D urban model updating, yet the quality of the models is questionable. In this article, we evaluate the reliability of an automatic point cloud generation method using multi-view iPhone images or an iPhone video file as input. We register such an automatically generated point cloud onto a TLS point cloud of the same object to discuss the accuracy, advantages and limitations of the iPhone-generated point clouds. For the chosen example showcase, we classified 1.23% of the iPhone point cloud points as outliers, and calculated the mean of the point-to-point distances to the TLS point cloud as 0.11 m. Since a TLS point cloud might also include measurement errors and noise, we computed local noise values for the point clouds from both sources. The mean (μ) and standard deviation (σ) of the roughness histograms are calculated as (μ1 = 0.44 m, σ1 = 0.071 m) and (μ2 = 0.025 m, σ2 = 0.037 m) for the iPhone and TLS point clouds respectively. Our experimental results indicate possible usage of the proposed automatic 3D model generation framework for 3D urban map updating, fusion and detail enhancement, and quick, real-time change detection. However, further insight is needed into the circumstances required to guarantee successful point cloud generation from smartphone images.
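A rough sketch of the point-to-point comparison reported above: for every point of the smartphone-derived cloud, find its nearest neighbour in the TLS reference cloud, flag points beyond a cutoff as outliers, and average the remaining distances. The cutoff value and array names are assumptions, and the registration step that precedes the comparison is not shown.

```python
# Nearest-neighbour point-to-point comparison of two already registered point clouds.
import numpy as np
from scipy.spatial import cKDTree

def compare_to_reference(iphone_pts, tls_pts, cutoff=1.0):
    """Both inputs are (N, 3) arrays in the same (registered) coordinate frame."""
    tree = cKDTree(tls_pts)
    dists, _ = tree.query(iphone_pts, k=1)      # nearest TLS point for each iPhone point
    outliers = dists > cutoff                   # assumed outlier criterion
    mean_dist = dists[~outliers].mean()
    return outliers.mean() * 100.0, mean_dist   # outlier percentage, mean point-to-point distance
```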
Other satellite atmospheres: Their nature and planetary interactions
NASA Technical Reports Server (NTRS)
Smyth, W. H.
1982-01-01
The Io sodium cloud model was successfully generalized to include the time- and space-dependent lifetime sink produced by electron impact ionization as the plasma torus oscillates about the satellite plane, while simultaneously including the additional time dependence introduced by the action of solar radiation pressure on the cloud. Very preliminary model results are discussed and continuing progress in analysis of the peculiar directional features of the sodium cloud is also reported. Significant progress was made in developing a model for the Io potassium cloud, and differences anticipated between the potassium and sodium clouds are described. An effort to understand the hydrogen atmosphere associated with Saturn's rings was initiated and preliminary results of this early study are summarized.
Security and privacy preserving approaches in the eHealth clouds with disaster recovery plan.
Sahi, Aqeel; Lai, David; Li, Yan
2016-11-01
Cloud computing was introduced as an alternative storage and computing model in the health sector as well as other sectors to handle large amounts of data. Many healthcare companies have moved their electronic data to the cloud in order to reduce in-house storage, IT development and maintenance costs. However, storing the healthcare records on a third-party server may cause serious storage, security and privacy issues. Therefore, many approaches have been proposed to preserve security as well as privacy in cloud computing projects. Cryptographic-based approaches were presented as one of the best ways to ensure the security and privacy of healthcare data in the cloud. Nevertheless, the cryptographic-based approaches which are used to transfer health records safely remain vulnerable regarding security, privacy, or the lack of any disaster recovery strategy. In this paper, we review the related work on security and privacy preservation as well as disaster recovery in the eHealth cloud domain. Then we propose two approaches, the Security-Preserving approach and the Privacy-Preserving approach, and a disaster recovery plan. The Security-Preserving approach is a robust means of ensuring the security and integrity of Electronic Health Records, and the Privacy-Preserving approach is an efficient authentication approach which protects the privacy of Personal Health Records. Finally, we discuss how the integrated approaches and the disaster recovery plan can ensure the reliability and security of cloud projects.
NASA Astrophysics Data System (ADS)
Uma, K. N.; Krishna Moorthy, K.; Sijikumar, S.; Renju, R.; Tinu, K. A.; Raju, Suresh C.
2012-07-01
Mesoscale Convective Systems (MCS) are important in view of their large cumulus build-up, vertical extent, short horizontal extent and associated thundershowers. The Microwave Radiometer Profiler (MRP) over the equatorial coastal station Thiruvanathapuram (Trivandrum, 8.55oN, 76.9oE) has been utilized to understand the genesis of MCS, which occur frequently during the pre-monsoon season. Examination of relative humidity, temperature and cloud liquid water measurements, at zenith and at two scanning elevation angles (15o) viewing over the land and the sea respectively, revealed that the MCS generally originate over the land during early afternoon hours, propagate seawards over the observational site and finally dissipate over the sea, with accompanying rainfall and latent heat release. The simulations obtained using the Advanced Research Weather Research and Forecasting (WRF-ARW) model effectively reproduce the thermodynamical and microphysical properties of the MCS. The duration and quantity of rainfall obtained from the simulations also compare well with the observations. Analysis also suggests that wind shear in the upper troposphere is responsible for the growth and the shape of the convective cloud.
Newly Discovered Clouds Found Floating High Above Milky Way
NASA Astrophysics Data System (ADS)
2002-10-01
GREEN BANK, WV -- New studies with the National Science Foundation's Robert C. Byrd Green Bank Telescope (GBT) have revealed a previously unknown population of discrete hydrogen clouds in the gaseous halo that surrounds the Milky Way Galaxy. These clouds were discovered in the transition zone between the Milky Way and intergalactic space, and provide tantalizing evidence that supernova-powered "galactic fountains" continually blast superheated hydrogen gas into our Galactic suburbs. [Image caption: Artist's rendering of the Milky Way (background) with an inset showing the GBT image of newly discovered clouds of hydrogen gas above the plane of the Galaxy. Credit: Kirk Woellert/National Science Foundation.] Extending far above the star-filled disk of the Milky Way is an atmosphere, or halo, of hydrogen gas. "By studying this halo, we can learn a great deal about the processes that are going on inside our Galaxy as well as beyond its borders," said Jay Lockman, an astronomer with the National Radio Astronomy Observatory (NRAO) in Green Bank, West Virginia. "It has remained a mystery, however, how this halo formed and what has prevented gravitational forces from collapsing the gas into a thin layer long ago." Some astronomers have speculated that this gas is distributed as a diffuse mist held up by either magnetic fields or cosmic rays streaming out of the plane of the Milky Way. Others believed that it is made of innumerable long-lived hydrogen clouds bobbing up and down like balls tossed by a juggler. Early observations with other telescopes discovered that there was some neutral hydrogen gas floating far above the Galaxy's plane, but these instruments were not sensitive enough to reveal any structure or resolve questions about its origin. Lockman's studies for the first time show a clear picture of the structure of the gas. Rather than a mist, the halo is in fact full of discrete clouds, each containing 50-to-100 solar masses of hydrogen and averaging about 100 light-years in diameter. "These objects were just below the ability of the older telescopes to detect," said Lockman, "but I looked with the GBT, and they popped right out." Lockman's results will be published in the Astrophysical Journal Letters. The clouds were discovered about 15,000 light-years from Earth toward the center of our Galaxy, and about 5,000 light-years above the Galaxy's plane. One of the most compelling facts revealed by the GBT is that the clouds are coupled dynamically to the disk of the Galaxy; that is, they follow along with the rotation of the rest of the Milky Way. Material from other sources crashing into the Milky Way would have different velocities and also appear quite different. "These are home grown objects, and not interlopers from outside our own Galaxy," said Lockman. Although the origin of these newly discovered clouds is not yet known, one mechanism to explain how this gas could be lifted into the halo is through supernova explosions. When a massive star reaches the end of its life it erupts in a cataclysm that produces a burst of cosmic rays and an enormous expanding bubble of gas at a temperature of several million degrees Celsius. Over time, this hot gas can flow outward into the Milky Way's halo. The question remains, however, what happens to this gas once it's ejected into the halo. One possibility is that it leaves the Galaxy as a wind, never to return.
Some astronomers predict, however, that as the gas slowly cools it would condense into hydrogen clouds, eventually falling like raindrops back into the Milky Way, and forming what is referred to as a galactic fountain. "If the clouds were formed by material ejected from the Galactic plane into the halo," Lockman said, "then it's possible that they are now falling back onto the Galaxy. This would then require a continuing flow of new material from supernova explosions into the halo to replenish the hydrogen gas that has rained back into the disk." The researcher comments that further observations, now in progress, should clarify the properties of these halo clouds, determine their distribution throughout the Galaxy, show how they are related to other types of clouds, and reveal their internal structure. Radio telescopes are able to detect the naturally occurring radio emission from neutral atomic hydrogen. As hydrogen atoms move about in space, they can absorb small amounts of energy, sending the atom's single electron to a higher energy state. When the electron eventually moves back to its lower energy -- or resting state, it gives up a small amount of electromagnetic radiation at radio frequencies. The individual energy of a single atom is very weak, but the accumulated signal from vast clouds of hydrogen is strong enough to be detected by sensitive radio telescopes on Earth. The GBT, dedicated in August of 2000, is the world's largest fully steerable radio telescope. Its 100 by 110 meter dish is composed of 2004 individually hinged panels. It also has a unique offset feed arm, which greatly enhances the performance of the telescope, making it ideal for observations of faint astronomical objects. The GBT is completing its commissioning and early science program and will be moving into full time operation. The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc.
Speeding Up Geophysical Research Using Docker Containers Within Multi-Cloud Environment.
NASA Astrophysics Data System (ADS)
Synytsky, R.; Henadiy, S.; Lobzakov, V.; Kolesnikov, L.; Starovoit, Y. O.
2016-12-01
How useful are geophysical observations for minimizing losses from natural disasters today? Do they help to decrease the number of human victims during tsunamis and earthquakes? Unfortunately, their use is still at an early stage. Making such observations more useful by improving early warning and prediction systems with the help of cloud computing is therefore a major goal. Cloud computing technologies have been proving their ability to speed up application development in many areas for 10 years already. The cloud unlocks new opportunities for geoscientists by providing access to modern data processing tools and algorithms, including real-time high-performance computing, big data processing, artificial intelligence and others. Emerging lightweight cloud technologies, such as Docker containers, are gaining wide traction in IT because they enable faster and more efficient deployment of applications in a cloud environment. They allow geophysical applications and systems to be deployed and managed in minutes across multiple clouds and data centers, which becomes of utmost importance for next-generation applications. In this session we demonstrate how Docker container technology within a multi-cloud environment can accelerate the development of applications specifically designed for geophysical research.
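A generic illustration, not the authors' deployment, of launching a containerized geophysical processing job from Python with the Docker SDK; the image name, command, and volume path are placeholders.

```python
# Run a (placeholder) containerized processing job and collect its output.
import docker

client = docker.from_env()
container = client.containers.run(
    "example/seismic-processor:latest",          # placeholder image
    command="process --input /data/waveforms",   # placeholder command
    volumes={"/srv/data": {"bind": "/data", "mode": "ro"}},
    detach=True,
)
container.wait()                    # block until the job finishes
print(container.logs().decode())    # retrieve the job's log output
```

The same container image can be started unchanged on any cloud or data center that runs a Docker engine, which is the portability argument made above.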
Single-molecule spectroscopy for plastic electronics: materials analysis from the bottom-up.
Lupton, John M
2010-04-18
pi-conjugated polymers find a range of applications in electronic devices. These materials are generally highly disordered in terms of chain length and chain conformation, besides being influenced by a variety of chemical and physical defects. Although this characteristic can be of benefit in certain device applications, disorder severely complicates materials analysis. Accurate analytical techniques are, however, crucial to optimising synthetic procedures and assessing overall material purity. Fortunately, single-molecule spectroscopic techniques have emerged as an unlikely but uniquely powerful approach to unraveling intrinsic material properties from the bottom up. Building on the success of such techniques in the life sciences, single-molecule spectroscopy is finding increasing applicability in materials science, effectively enabling the dissection of the bulk down to the level of the individual molecular constituent. This article reviews recent progress in single molecule spectroscopy of conjugated polymers as used in organic electronics.
NASA Astrophysics Data System (ADS)
Seeley, J.; Romps, D. M.
2015-12-01
Recent work by Singh and O'Gorman has produced a theory for convective available potential energy (CAPE) in radiative-convective equilibrium. In this model, the atmosphere deviates from a moist adiabat, and therefore has positive CAPE, because entrainment causes evaporative cooling in cloud updrafts, thereby steepening their lapse rate. This has led to the proposal that CAPE increases with global warming because the strength of evaporative cooling scales according to the Clausius-Clapeyron (CC) relation. However, CAPE could also change due to changes in cloud buoyancy and changes in the entrainment rate, both of which could vary with global warming. To test the relative importance of changes in CAPE due to CC scaling of evaporative cooling, changes in cloud buoyancy, and changes in the entrainment rate, we subject a cloud-resolving model to a suite of natural (and unnatural) forcings. We find that CAPE changes are primarily driven by changes in the strength of evaporative cooling; the effects of changes in the entrainment rate and cloud buoyancy are comparatively small. This builds support for CC scaling of CAPE.
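For reference, the standard parcel-theory definition of the quantity being diagnosed (a textbook relation, not a formula from the abstract) is

$$\mathrm{CAPE} = \int_{z_{\rm LFC}}^{z_{\rm LNB}} g\,\frac{T_{v,\rm parcel}(z) - T_{v,\rm env}(z)}{T_{v,\rm env}(z)}\,dz ,$$

the buoyancy of an undilute parcel integrated from its level of free convection to its level of neutral buoyancy. In the zero-buoyancy entraining-plume picture referred to above, evaporative cooling of entraining updrafts steepens the environmental lapse rate relative to a moist adiabat, so the integrand for an undilute parcel becomes positive and CAPE grows with the strength of that cooling.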
Investigation of diocotron modes in toroidally trapped electron plasmas using non-destructive method
NASA Astrophysics Data System (ADS)
Lachhvani, Lavkesh; Pahari, Sambaran; Sengupta, Sudip; Yeole, Yogesh G.; Bajpai, Manu; Chattopadhyay, P. K.
2017-10-01
Experiments with trapped electron plasmas in a SMall Aspect Ratio Toroidal device (SMARTEX-C) have demonstrated a flute-like mode represented by oscillations on capacitive (wall) probes. Although analogous to the diocotron mode observed in linear electron traps, the mode evolution in toroids can have interesting consequences due to the presence of an inhomogeneous magnetic field. In SMARTEX-C, the probe signals are observed to undergo a transition from small, near-sinusoidal oscillations to large-amplitude, non-linear "double-peaked" oscillations. To interpret the wall probe signal and bring forth the dynamics, an expression for the current induced on the probe by an oscillating charge is derived, utilizing Green's Reciprocation Theorem. The equilibrium position, poloidal velocity, and charge content of the charge cloud, required to compute the induced current, are estimated from the experiments. The signal through the capacitive probes is thereby computed numerically for possible charge cloud trajectories. In order to correlate with experiments, starting with an intuitive guess of the trajectory, the model is evolved and tweaked to arrive at a signal consistent with experimentally observed probe signals. A possible vortex-like dynamics is predicted, hitherto unexplored in toroidal geometries, for a limited set of experimental observations from SMARTEX-C. Though heuristic, a useful interpretation of capacitive probe data in terms of charge cloud dynamics is obtained.
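The abstract does not reproduce the derived expression; as an illustrative reference, the standard reciprocation-theorem (Shockley-Ramo) result for a point charge $q$ moving past a grounded wall probe is

$$Q_{\rm ind}(t) = -\,q\,\phi_w\bigl(\mathbf{r}(t)\bigr), \qquad I_{\rm ind}(t) = \frac{dQ_{\rm ind}}{dt} = -\,q\,\mathbf{v}(t)\cdot\nabla\phi_w\bigl(\mathbf{r}(t)\bigr),$$

where $\phi_w$ is the weighting potential obtained by holding the probe at unit potential with all other electrodes grounded. The paper's expression for an extended charge cloud in toroidal geometry will differ in detail, but it follows the same construction: a faster or closer passage of the cloud past the probe produces a sharper induced-current pulse, which is how the "double-peaked" waveforms are tied back to candidate cloud trajectories.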
Earth observations taken from OV-105 during the STS-99 mission
2000-02-17
S99-E-5555 (17 February 2000) --- As photographed from the Space Shuttle Endeavour, this oblique electronic still image of Earth's horizon reveals a great deal of cloud cover. In the case of the electronic still camera (ESC), as well as film-bearing instruments, clouds naturally obscure views of recognizable land masses. Much of Earth is heavily cloud-covered during the current mission, and meteorologists and oceanographers are interested in studying that aspect. However, the Shuttle Radar Topography Mission's other sensing equipment, the X-SAR and C-band antennae, is able to penetrate cloud cover and record important topographic data for mapmakers and scientists of other disciplines. In addition to the sensing equipment mentioned above, this mission is supporting the EarthKAM project, which utilizes the services of another electronic still camera mounted in Endeavour's windows. Unlike this oblique view, EarthKAM records strictly vertical or nadir imagery of points all over the world. Students across the United States and in France, Germany and Japan are taking photos throughout the STS-99 mission. And they are using these new photos, plus all the images already available in the EarthKAM system, to enhance their classroom learning in Earth and space science, social studies, geography, mathematics and more.
NASA Astrophysics Data System (ADS)
Seo, Byonghoon; Li, Hui; Bellan, Paul
2017-10-01
We are studying magnetized target fusion using an experimental method in which an imploding liner compressing a plasma is simulated by a high-speed MHD-driven plasma jet colliding with a gas target cloud. This has the advantage of being non-destructive, so orders of magnitude more shots are possible. Since the actual density and temperature are much more modest than fusion-relevant values, the goal is to determine the scaling of the increase in density and temperature when an actual experimental plasma is adiabatically compressed. Two newly developed diagnostics are operating and providing data. The first new diagnostic is a fiber-coupled interferometer, which measures line-integrated electron density not only as a function of time, but also as a function of position along the jet. The second new diagnostic is laser Thomson scattering, which measures electron density and temperature at the location where the jet collides with the cloud. These diagnostics show that when the jet collides with a target cloud, the jet slows down substantially and both the electron density and temperature increase. The experimental measurements are being compared with 3D MHD and hybrid kinetic numerical simulations that model the actual experimental geometry.
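For orientation (a textbook relation, not a result quoted in the abstract), an ideal adiabatic compression with index $\gamma = 5/3$ relates the temperature and density increases as

$$\frac{T_f}{T_i} = \left(\frac{n_f}{n_i}\right)^{\gamma-1} = \left(\frac{n_f}{n_i}\right)^{2/3},$$

so, for example, a tenfold density increase at the jet-cloud collision would correspond to roughly a 4.6-fold temperature rise if the compression were ideal. Measuring how closely the observed density and temperature jumps follow this scaling is exactly what the interferometer and Thomson-scattering diagnostics are positioned to do.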
Hazard calculations of diffuse reflected laser radiation for the SELENE program
NASA Technical Reports Server (NTRS)
Miner, Gilda A.; Babb, Phillip D.
1993-01-01
The hazards from diffuse laser light reflections off water clouds, ice clouds, and fog and from possible specular reflections off ice clouds were assessed with the American National Standards (ANSI Z136.1-1986) for the free-electron-laser parameters under consideration for the Segmented Efficient Laser Emission for Non-Nuclear Electricity (SELENE) Program. Diffuse laser reflection hazards exist for water cloud surfaces less than 722 m in altitude and ice cloud surfaces less than 850 m in altitude. Specular reflections from ice crystals in cirrus clouds are not probable; however, any specular reflection is a hazard to ground observers. The hazard to the laser operators and any ground observers during heavy fog conditions is of such significant magnitude that the laser should not be operated in fog.
NASA Technical Reports Server (NTRS)
Pearl, J. C.; Smith, M. D.; Conrath, B. J.; Bandfield, J. L.; Christensen, P. R.
1999-01-01
Successful operation of the Mars Global Surveyor spacecraft, beginning in September 1997, has permitted extensive infrared observations of condensation clouds during the martian southern summer and fall seasons (184 deg less than L(sub s) less than 28 deg). Initially, thin (normal optical depth less than 0.06 at 825/cm) ice clouds and hazes were widespread, showing a latitudinal gradient. With the onset of a regional dust storm at L(sub s) = 224 deg, ice clouds essentially vanished in the southern hemisphere, to reappear gradually after the decay of the storm. The thickest clouds (optical depth approx. 0.6) were associated with major volcanic features. At L(sub s) = 318 deg, the cloud at Ascraeus Mons was observed to disappear between 21:30 and 09:30, consistent with historically recorded diurnal behavior for clouds of this type. Limb observations showed extended optically thin (depth less than 0.04) stratiform clouds at altitudes up to 55 km. A water ice haze was present in the north polar night at altitudes up to 40 km; this probably provided heterogeneous nucleation sites for the formation of CO2 clouds at altitudes below the 1 mbar pressure level, where atmospheric temperatures dropped to the condensation point of CO2.
Noctilucent cloud polarimetry: Twilight measurements in a wide range of scattering angles
NASA Astrophysics Data System (ADS)
Ugolnikov, Oleg S.; Maslov, Igor A.; Kozelov, Boris V.; Dlugach, Janna M.
2016-06-01
Wide-field polarization measurements of the twilight sky background during several nights with bright and extended noctilucent clouds in central and northern Russia in 2014 and 2015 are used to build the phase dependence of the degree of polarization of sunlight scattered by cloud particles in a wide range of scattering angles (from 40° to 130°). This range covers the linear polarization maximum near 90° and the large-angle slope of the curve. The polarization in this angle range is most sensitive to the particle size. A method for separating scattering by cloud particles from the twilight background is presented. Results are compared with T-matrix simulations for different sizes and shapes of ice particles; the best-fit model radius of particles (0.06 μm) and maximum radius (about 0.1 μm) are estimated.
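As a point of reference (the textbook small-particle limit, not a formula taken from the paper), Rayleigh scattering by particles much smaller than the wavelength gives a degree of linear polarization

$$P_R(\theta) = \frac{\sin^2\theta}{1+\cos^2\theta},$$

which reaches 100% at a scattering angle of 90°. Departures of the measured 40°-130° polarization curve from this limit, lowering and reshaping the 90° maximum, are what make this angular range sensitive to particle size in the T-matrix fits.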
NASA Technical Reports Server (NTRS)
Lord, Albert M; Kaufman, Warner B
1956-01-01
An investigation was conducted on the size and density of the boric oxide exhaust cloud from a J47-25 turbojet engine operating on trimethylborate fuel at sea-level static condition. Movies and still photographs were taken from the ground and from a helicopter. Objects could not be perceived through the main body of the cloud at distances up to 800 feet from the engine. Data are included on the amount of fallout from the cloud and the concentration of boric oxide in the cloud. A radiation detection device was set up to determine whether the glowing oxide particles would be more susceptible than hydrocarbon exhaust gases to this type of tracking device. The device showed an increase in radiation by a factor of 3 for trimethylborate over that for JP-4.
A cloud system for mobile medical services of traditional Chinese medicine.
Hu, Nian-Ze; Lee, Chia-Ying; Hou, Mark C; Chen, Ying-Ling
2013-12-01
Many medical centers in Taiwan have started to provide Traditional Chinese Medicine (TCM) services for hospitalized patients. Due to the complexity of the TCM modality and the increasing need to provide TCM services for patients in different wards at widely separated locations within the hospital, it is becoming difficult to manage the situation in the traditional way. A computerized system with mobile capability can therefore provide a practical solution to this challenge. The study develops a cloud system equipped with mobile devices to integrate electronic medical records, facilitate communication between medical workers, and improve the quality of TCM services for hospitalized patients in a medical center. The system developed in the study includes mobile devices running the Android operating system and a PC serving as the cloud server. All the devices use the same TCM management system developed in the study. A database website is set up for information sharing. The cloud system allows users to access and update patients' medical information, which is of great help to medical workers in verifying patients' identities and giving proper treatments. The information can then be wirelessly transmitted between medical personnel through the cloud system. Several quantitative and qualitative evaluation indexes are developed to measure the effectiveness of the cloud system on the quality of the TCM service. The cloud system is tested and verified based on a sample of hospitalized patients receiving acupuncture treatment at the Lukang Branch of Changhua Christian Hospital (CCH) in Taiwan. The result shows a great improvement in the operating efficiency of the TCM service, in that a significant saving in labor time is attributable to the cloud system. In addition, the cloud system makes it easy to confirm a patient's identity by taking a picture of the patient upon receiving any medical treatment. The result also shows that the cloud system achieves significant improvement in the acupuncture treatment: all the acupuncture needles can now be removed at the time they are expected to be removed. Furthermore, through the cloud system, medical workers can access and update patients' medical information on-site, which provides a means of effective communication between medical workers. These functions allow us to make the most of the portability of the acupuncture service. The result shows that the contribution made by the cloud system to the TCM service is multi-dimensional: it is cost-effective, environmentally friendly, performance-enhancing, etc. Developing and implementing such a cloud system for the TCM service in Taiwan symbolizes a pioneering effort. We believe that the work we have done here can serve as a stepping-stone toward advancing the TCM service quality in the future.
Sea spray aerosol structure and composition using cryogenic transmission electron microscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patterson, Joseph P.; Collins, Douglas B.; Michaud, Jennifer M.
2016-01-15
The surface properties of atmospheric aerosol particles largely control their impact on climate by affecting their ability to uptake water, react heterogeneously, and nucleate ice in clouds. However, in the vacuum of a conventional electron microscope, the native surface structure often undergoes chemical rearrangement resulting in surfaces that are quite different from their atmospheric configurations. Herein, we report the development of a cryo-TEM approach where sea spray aerosol particles are flash frozen in their native state and then probed by electron microscopy. This unique approach allows for the detection of not only mixed salts, but also soft materials including whole hydrated bacteria, diatoms, virus particles, marine vesicles, as well as gel networks within hydrated salt droplets. As a result, we anticipate this method will open up a new avenue of analysis for aerosol particles, not only for ocean-derived aerosols, but for those produced from other sources where there is interest in the transfer of organic or biological species from the biosphere to the atmosphere.
Michalet, X.; Siegmund, O.H.W.; Vallerga, J.V.; Jelinsky, P.; Millaud, J.E.; Weiss, S.
2017-01-01
We have recently developed a wide-field photon-counting detector with high temporal and spatial resolution and capable of high throughput (the H33D detector). Its design is based on a 25 mm diameter multi-alkali photocathode producing one photoelectron per detected photon, which is then multiplied up to 10^7 times by a 3-microchannel-plate stack. The resulting electron cloud is proximity focused on a cross delay line anode, which allows the incident photon position to be determined with high accuracy. The imaging and fluorescence lifetime measurement performance of the H33D detector installed on a standard epifluorescence microscope will be presented. We compare it to that of standard single-molecule detectors such as single-photon avalanche photodiodes (SPADs) or electron-multiplying cameras using model samples (fluorescent beads, quantum dots and live cells). Finally, we discuss the design and applications of future generations of H33D detectors for single-molecule imaging and high-throughput study of biomolecular interactions.
NASA Technical Reports Server (NTRS)
Chenette, D. L.; Stone, E. C.
1983-01-01
An analysis of the electron absorption signature observed by the Cosmic Ray System (CRS) on Voyager 2 near the orbit of Mimas is presented. We find that these observations cannot be explained as the absorption signature of Mimas. Combining Pioneer 11 and Voyager 2 measurements of the electron flux at Mimas's orbit (L=3.1), we find an electron spectrum where most of the flux above approx 100 keV is concentrated near 1 to 3 MeV. The expected Mimas absorption signature is calculated from this spectrum neglecting radial diffusion. A lower limit on the diffusion coefficient for MeV electrons is obtained. With a diffusion coefficient this large, both the Voyager 2 and the Pioneer 11 small-scale electron absorption signature observations in Mimas's orbit are enigmatic. Thus we refer to the mechanism for producing these signatures as the Mimas ghost. A cloud of material in orbit with Mimas may account for the observed electron signature if the cloud is at least 1% opaque to electrons across a region extending over a few hundred kilometers.
Mobile healthcare information management utilizing Cloud Computing and Android OS.
Doukas, Charalampos; Pliakas, Thomas; Maglogiannis, Ilias
2010-01-01
Cloud Computing provides functionality for managing information in a distributed, ubiquitous and pervasive manner, supporting several platforms, systems and applications. This work presents the implementation of a mobile system that enables electronic healthcare data storage, update and retrieval using Cloud Computing. The mobile application is developed using Google's Android operating system and provides management of patient health records and medical images (supporting the DICOM format and JPEG2000 coding). The developed system has been evaluated using Amazon's S3 cloud service. This article summarizes the implementation details and presents initial results of the system in practice.
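The published system is an Android application; the following is only a minimal Python illustration of the kind of S3 storage calls such a health-record store relies on, with placeholder bucket and object names, and with the assumption that records are de-identified or encrypted before upload.

```python
# Store and retrieve (already de-identified/encrypted) records or DICOM files in S3.
import boto3

s3 = boto3.client("s3")
BUCKET = "example-health-records"   # placeholder bucket name

def upload_record(local_path, patient_id):
    """Upload one record file under a per-patient prefix."""
    key = f"{patient_id}/{local_path.rsplit('/', 1)[-1]}"
    s3.upload_file(local_path, BUCKET, key)

def download_record(key, local_path):
    """Retrieve a previously stored record."""
    s3.download_file(BUCKET, key, local_path)
```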
NASA Technical Reports Server (NTRS)
Landt, J. A.
1974-01-01
The geometries of dense solar wind clouds are estimated by comparing single-location measurements of the solar wind plasma with the average of the electron density obtained by radio signal delay measurements along a radio path between earth and interplanetary spacecraft. Several of these geometries agree with the current theoretical spatial models of flare-induced shock waves. A new class of spatially limited structures that contain regions with densities greater than any observed in the broad clouds is identified. The extent of a cloud was found to be approximately inversely proportional to its density.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Backfish, Michael
This paper documents the use of four retarding field analyzers (RFAs) to measure electron cloud signals created in Fermilab’s Main Injector during 120 GeV operations. The first data set was taken from September 11, 2009 to July 4, 2010. This data set is used to compare two different types of beam pipe that were installed in the accelerator. Two RFAs were installed in a normal steel beam pipe like the rest of the Main Injector while another two were installed in a one meter section of beam pipe that was coated on the inside with titanium nitride (TiN). A second data run started on August 23, 2010 and ended on January 10, 2011 when Main Injector beam intensities were reduced thus eliminating the electron cloud. This second run uses the same RFA setup but the TiN coated beam pipe was replaced by a one meter section coated with amorphous carbon (aC). This section of beam pipe was provided by CERN in an effort to better understand how an aC coating will perform over time in an accelerator. The research consists of three basic parts: (a) continuously monitoring the conditioning of the three different types of beam pipe over both time and absorbed electrons, (b) measurement of the characteristics of the surrounding magnetic fields in the Main Injector in order to better relate actual data observed in the Main Injector with that of simulations, and (c) measurement of the energy spectrum of the electron cloud signals using retarding field analyzers in all three types of beam pipe.
Limitations of silicon diodes for clinical electron dosimetry.
Song, Haijun; Ahmad, Munir; Deng, Jun; Chen, Zhe; Yue, Ning J; Nath, Ravinder
2006-01-01
This work investigates the relevance of several factors affecting the response of silicon diode dosemeters in depth-dose scans of electron beams. These factors are electron energy, instantaneous dose rate, dose per pulse, photon/electron dose ratio and electron scattering angle (directional response). Data from the literature and our own experiments indicate that the impact of these factors may be up to +/-15%. Thus, the different factors would have to cancel out perfectly at all depths in order to produce true depth-dose curves. There are reports of good agreement between depth-doses measured with diodes and ionisation chambers. However, our measurements with a Scantronix electron field detector (EFD) diode and with a plane-parallel ionisation chamber show discrepancies both in the build-up and in the low-dose regions, with a ratio up to 1.4. Moreover, the absolute sensitivity of two diodes of the same EFD model was found to differ by a factor of 3, and this ratio was not constant but changed with depth between 5 and 15% in the low-dose regions of some clinical electron beams. Owing to these inhomogeneities among diodes even of the same model, corrections for each factor would have to be diode-specific and beam-specific. All these corrections would have to be determined using parallel plane chambers, as recommended by AAPM TG-25, which would be unrealistic in clinical practice. Our conclusion is that in general diodes are not reliable in the measurement of depth-dose curves of clinical electron beams.
Bayesian cloud detection for MERIS, AATSR, and their combination
NASA Astrophysics Data System (ADS)
Hollstein, A.; Fischer, J.; Carbajal Henken, C.; Preusker, R.
2014-11-01
A broad range of different Bayesian cloud detection schemes is applied to measurements from the Medium Resolution Imaging Spectrometer (MERIS), the Advanced Along-Track Scanning Radiometer (AATSR), and their combination. The cloud masks were designed to be numerically efficient and suited for the processing of large amounts of data. Results from the classical and naive approach to Bayesian cloud masking are discussed for MERIS and AATSR as well as for their combination. A sensitivity study on the resolution of multidimensional histograms, which were post-processed by Gaussian smoothing, shows how theoretically insufficient amounts of truth data can be used to set up accurate classical Bayesian cloud masks. Sets of exploited features from single and derived channels are numerically optimized and results for naive and classical Bayesian cloud masks are presented. The application of the Bayesian approach is discussed in terms of reproducing existing algorithms, enhancing existing algorithms, increasing the robustness of existing algorithms, and setting up new classification schemes based on manually classified scenes.
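As a concrete but heavily simplified illustration of the naive Bayesian variant discussed above, the sketch below builds per-feature likelihood histograms from labelled training pixels and combines them under the independence assumption. The feature layout, bin count, prior, and smoothing floor are assumptions; the published masks additionally use multidimensional histograms with Gaussian smoothing for the classical approach.

```python
# Naive Bayesian cloud probability from per-feature likelihood histograms.
import numpy as np

def fit_histograms(features, labels, bins=50):
    """features: (n_pixels, n_features); labels: boolean array, True = cloudy."""
    models = []
    for j in range(features.shape[1]):
        edges = np.histogram_bin_edges(features[:, j], bins=bins)
        p_cloud, _ = np.histogram(features[labels, j], bins=edges, density=True)
        p_clear, _ = np.histogram(features[~labels, j], bins=edges, density=True)
        models.append((edges, p_cloud + 1e-9, p_clear + 1e-9))   # small floor avoids zero likelihoods
    return models

def cloud_probability(x, models, prior_cloud=0.5):
    """Posterior P(cloud | x) for one pixel feature vector x under the naive assumption."""
    like_cloud, like_clear = prior_cloud, 1.0 - prior_cloud
    for xj, (edges, p_cloud, p_clear) in zip(x, models):
        k = int(np.clip(np.searchsorted(edges, xj) - 1, 0, len(p_cloud) - 1))
        like_cloud *= p_cloud[k]
        like_clear *= p_clear[k]
    return like_cloud / (like_cloud + like_clear)
```

The classical (non-naive) variant replaces the product of one-dimensional histograms with a single multidimensional histogram over the selected features, which is where the resolution and smoothing sensitivity study above comes in.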
Bayesian cloud detection for MERIS, AATSR, and their combination
NASA Astrophysics Data System (ADS)
Hollstein, A.; Fischer, J.; Carbajal Henken, C.; Preusker, R.
2015-04-01
A broad range of different Bayesian cloud detection schemes is applied to measurements from the Medium Resolution Imaging Spectrometer (MERIS), the Advanced Along-Track Scanning Radiometer (AATSR), and their combination. The cloud detection schemes were designed to be numerically efficient and suited for the processing of large amounts of data. Results from the classical and naive approach to Bayesian cloud masking are discussed for MERIS and AATSR as well as for their combination. A sensitivity study on the resolution of multidimensional histograms, which were post-processed by Gaussian smoothing, shows how theoretically insufficient amounts of truth data can be used to set up accurate classical Bayesian cloud masks. Sets of exploited features from single and derived channels are numerically optimized and results for naive and classical Bayesian cloud masks are presented. The application of the Bayesian approach is discussed in terms of reproducing existing algorithms, enhancing existing algorithms, increasing the robustness of existing algorithms, and setting up new classification schemes based on manually classified scenes.
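The naive Bayesian variant discussed in the two abstracts above can be illustrated with a minimal sketch: per-feature class-conditional histograms, post-processed with Gaussian smoothing, are combined under an independence assumption into a per-pixel cloud probability. The feature layout, bin count, smoothing width, and the 0.5 decision threshold below are illustrative assumptions, not values from the papers.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def fit_naive_bayes(features, labels, n_bins=64, smooth_sigma=1.5):
    """Fit per-feature class-conditional histograms (illustrative sketch).

    features : (n_samples, n_features) array, e.g. reflectances / brightness temperatures
    labels   : (n_samples,) boolean array, True = cloudy (manually classified truth)
    """
    model = {"edges": [], "p_cloud": [], "p_clear": [], "prior_cloud": labels.mean()}
    for f in features.T:
        edges = np.linspace(f.min(), f.max(), n_bins + 1)
        hc, _ = np.histogram(f[labels], bins=edges)
        hn, _ = np.histogram(f[~labels], bins=edges)
        # Gaussian smoothing stands in for the histogram post-processing step
        hc = gaussian_filter1d(hc.astype(float), smooth_sigma) + 1e-6
        hn = gaussian_filter1d(hn.astype(float), smooth_sigma) + 1e-6
        model["edges"].append(edges)
        model["p_cloud"].append(hc / hc.sum())
        model["p_clear"].append(hn / hn.sum())
    return model

def cloud_probability(model, features):
    """Naive-Bayes posterior P(cloud | features) for each pixel."""
    log_c = np.log(model["prior_cloud"])
    log_n = np.log(1.0 - model["prior_cloud"])
    for j, f in enumerate(features.T):
        idx = np.clip(np.digitize(f, model["edges"][j]) - 1, 0,
                      len(model["p_cloud"][j]) - 1)
        log_c = log_c + np.log(model["p_cloud"][j][idx])
        log_n = log_n + np.log(model["p_clear"][j][idx])
    return 1.0 / (1.0 + np.exp(log_n - log_c))

# Usage sketch (threshold assumed): mask = cloud_probability(model, pixel_features) > 0.5
```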
Spontaneous Ad Hoc Mobile Cloud Computing Network
Lacuesta, Raquel; Sendra, Sandra; Peñalver, Lourdes
2014-01-01
Cloud computing helps users and companies to share computing resources instead of having local servers or personal devices to handle the applications. Smart devices are becoming one of the main information processing devices. Their computing features are reaching levels that let them create a mobile cloud computing network. But sometimes they are not able to create it and collaborate actively in the cloud because it is difficult for them to easily build a spontaneous network and configure its parameters. For this reason, in this paper we present the design and deployment of a spontaneous ad hoc mobile cloud computing network. To achieve this, we have developed a trusted algorithm that is able to manage the activity of the nodes when they join and leave the network. The paper shows the network procedures and classes that have been designed. Our simulation results using Castalia show that our proposal presents good efficiency and network performance even with a high number of nodes. PMID:25202715
Spontaneous ad hoc mobile cloud computing network.
Lacuesta, Raquel; Lloret, Jaime; Sendra, Sandra; Peñalver, Lourdes
2014-01-01
Cloud computing helps users and companies to share computing resources instead of having local servers or personal devices to handle the applications. Smart devices are becoming one of the main information processing devices. Their computing features are reaching levels that let them create a mobile cloud computing network. But sometimes they are not able to create it and collaborate actively in the cloud because it is difficult for them to easily build a spontaneous network and configure its parameters. For this reason, in this paper we present the design and deployment of a spontaneous ad hoc mobile cloud computing network. To achieve this, we have developed a trusted algorithm that is able to manage the activity of the nodes when they join and leave the network. The paper shows the network procedures and classes that have been designed. Our simulation results using Castalia show that our proposal presents good efficiency and network performance even with a high number of nodes.
Cloud Computing for the Grid: GridControl: A Software Platform to Support the Smart Grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
GENI Project: Cornell University is creating a new software platform for grid operators called GridControl that will utilize cloud computing to more efficiently control the grid. In a cloud computing system, there are minimal hardware and software demands on users. The user can tap into a network of computers that is housed elsewhere (the cloud) and the network runs computer applications for the user. The user only needs interface software to access all of the cloud’s data resources, which can be as simple as a web browser. Cloud computing can reduce costs, facilitate innovation through sharing, empower users, and improve the overall reliability of a dispersed system. Cornell’s GridControl will focus on four elements: delivering the state of the grid to users quickly and reliably; building networked, scalable grid-control software; tailoring services to emerging smart grid uses; and simulating smart grid behavior under various conditions.
1963-01-01
the connector pin, which was then soldered at various levels of wire build-up. It is proposed, with a slight modification of the connector terminal...sacrificial anode for galvanic protection of other metals. Aluminum. Aluminum and its alloys show promise for applications in long-life oceanographic...section of a dumet weld lead. Calculations and actual heat measurements on the effects of welding and soldering within 0.060 in. of the component
Yiu, Rex; Fung, Vicky; Szeto, Karen; Hung, Veronica; Siu, Ricky; Lam, Johnny; Lai, Daniel; Maw, Christina; Cheung, Adah; Shea, Raman; Choy, Anna
2013-01-01
In Hong Kong, elderly patients discharged from hospital are at high risk of unplanned readmission. The Integrated Care Model (ICM) program is introduced to provide continuous and coordinated care for high-risk elders from hospital to community to prevent unplanned readmission. A multidisciplinary working group was set up to address the requirements for developing the electronic forms for the ICM program. Six (6) forms were developed. These forms can support ICM service delivery for the high-risk elders, clinical documentation, statistical analysis and information sharing.
NASA Astrophysics Data System (ADS)
Extance, Andy
2010-05-01
Thousands of times per second a point of light turns on and off, moving side to side, top to bottom. It is a rhythm that ticks around the world, illuminating living rooms and office desks in the process. However, the cathode-ray TVs and monitors that metronomically fire electron guns at viewers - who are shielded only by thin sheets of glass - are rapidly being replaced by flat-screen technologies. Yet as the creation of images using scanning electron beams fades into history, a new form of technology is emerging that builds up pictures by scanning with light.
The GCM-Oriented CALIPSO Cloud Product (CALIPSO-GOCCP)
NASA Astrophysics Data System (ADS)
Chepfer, H.; Bony, S.; Winker, D.; Cesana, G.; Dufresne, J. L.; Minnis, P.; Stubenrauch, C. J.; Zeng, S.
2010-01-01
This article presents the GCM-Oriented Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) Cloud Product (GOCCP) designed to evaluate the cloudiness simulated by general circulation models (GCMs). For this purpose, Cloud-Aerosol Lidar with Orthogonal Polarization L1 data are processed following the same steps as in a lidar simulator used to diagnose the model cloud cover that CALIPSO would observe from space if the satellite were flying above an atmosphere similar to that predicted by the GCM. Instantaneous profiles of the lidar scattering ratio (SR) are first computed at the highest horizontal resolution of the data but at the vertical resolution typical of current GCMs, and then cloud diagnostics are inferred from these profiles: vertical distribution of cloud fraction; horizontal distribution of low, middle, high, and total cloud fractions; instantaneous SR profiles; and SR histograms as a function of height. Results are presented for different seasons (January-March 2007-2008 and June-August 2006-2008), and their sensitivity to parameters of the lidar simulator is investigated. It is shown that the choice of the vertical resolution and of the SR threshold value used for cloud detection can modify the cloud fraction by up to 0.20, particularly in the shallow cumulus regions. The tropical marine low-level cloud fraction is larger during nighttime (by up to 0.15) than during daytime. The histograms of SR characterize the cloud types encountered in different regions. The GOCCP high-level cloud amount is similar to that from the TIROS Operational Vertical Sounder (TOVS) and the Atmospheric Infrared Sounder (AIRS). The low-level and middle-level cloud fractions are larger than those derived from passive remote sensing (International Satellite Cloud Climatology Project; Moderate-Resolution Imaging Spectroradiometer; Cloud and Earth Radiant Energy System; Polarization and Directionality of Earth Reflectances; TOVS Path B; AIRS-Laboratoire de Météorologie Dynamique) because the latter only provide information on the uppermost cloud layer.
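As a rough illustration of the diagnostic chain described above (not the operational GOCCP code), the sketch below averages attenuated backscatter onto a coarse GCM-like vertical grid, forms scattering-ratio profiles, and thresholds them into a cloud fraction. The SR threshold of 5 and the grid handling are assumptions made for the illustration.

```python
import numpy as np

def scattering_ratio_profiles(atb, atb_mol, z_full, z_gcm):
    """Average lidar ATB onto coarse vertical layers and form SR = ATB / ATB_mol.

    atb, atb_mol : (n_profiles, n_levels) total and molecular attenuated backscatter
    z_full       : (n_levels,) native level altitudes; z_gcm : coarse layer edges (assumed)
    """
    sr = np.empty((atb.shape[0], len(z_gcm) - 1))
    for k in range(len(z_gcm) - 1):
        in_layer = (z_full >= z_gcm[k]) & (z_full < z_gcm[k + 1])
        sr[:, k] = atb[:, in_layer].mean(axis=1) / atb_mol[:, in_layer].mean(axis=1)
    return sr

def cloud_fraction(sr, sr_cloud=5.0):
    """Fraction of profiles flagged cloudy at each coarse level (threshold assumed)."""
    return (sr > sr_cloud).mean(axis=0)
```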
Automatic Building Abstraction from Aerial Photogrammetry
NASA Astrophysics Data System (ADS)
Ley, A.; Hänsch, R.; Hellwich, O.
2017-09-01
Multi-view stereo has been shown to be a viable tool for the creation of realistic 3D city models. Nevertheless, it still poses significant challenges, since it results in dense but noisy and incomplete point clouds when applied to aerial images. 3D city modelling usually requires a different representation of the 3D scene than these point clouds. This paper applies a fully automatic pipeline to generate a simplified mesh from a given dense point cloud. The mesh provides a certain level of abstraction, as it consists only of relatively large planar and textured surfaces. Thus, it is possible to remove noise, outliers, and clutter while maintaining a high level of accuracy.
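The planar-abstraction idea can be sketched with a simple greedy RANSAC plane extraction over a noisy point cloud. This is a generic illustration of fitting large planar surfaces, not the authors' pipeline, and all thresholds (distance tolerance, inlier count, iteration budget) are assumptions.

```python
import numpy as np

def fit_plane(pts):
    """Least-squares plane through points: returns unit normal n and offset d (n . x = d)."""
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]
    return n, n @ centroid

def ransac_planes(points, n_planes=10, iters=500, dist_tol=0.1, min_inliers=500, seed=None):
    """Greedily extract up to n_planes dominant planes from a noisy (N, 3) point cloud."""
    rng = np.random.default_rng(seed)
    remaining = points.copy()
    planes = []
    for _ in range(n_planes):
        if len(remaining) < 3:
            break
        best_inliers = None
        for _ in range(iters):
            sample = remaining[rng.choice(len(remaining), 3, replace=False)]
            n, d = fit_plane(sample)
            inliers = np.abs(remaining @ n - d) < dist_tol
            if best_inliers is None or inliers.sum() > best_inliers.sum():
                best_inliers = inliers
        if best_inliers.sum() < min_inliers:
            break
        planes.append(fit_plane(remaining[best_inliers]))  # refit on all inliers
        remaining = remaining[~best_inliers]               # peel off and continue
    return planes
```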
2006-09-07
KENNEDY SPACE CENTER, FLA. - Storm clouds fill the sky west of Launch Pad 39B, at right, beyond the Vehicle Assembly Building. Space Shuttle Atlantis still sits on the pad after a scrub was called Aug. 27 due to a concern with fuel cell 1. Towering above the shuttle is the 80-foot lightning mast. During the STS-115 mission, Atlantis' astronauts will deliver and install the 17.5-ton, bus-sized P3/P4 integrated truss segment on the station. The girder-like truss includes a set of giant solar arrays, batteries and associated electronics and will provide one-fourth of the total power-generation capability for the completed station. This mission is the 116th space shuttle flight, the 27th flight for orbiter Atlantis, and the 19th U.S. flight to the International Space Station. STS-115 is scheduled to last 11 days with a planned landing at KSC. Photo credit: NASA/Ken Thornsley
NASA Technical Reports Server (NTRS)
LaMothe, J.; Ferland, Gary J.
2002-01-01
Recombination cooling, in which a free electron emits light while being captured to an ion, is an important cooling process in photoionized clouds that are optically thick or have low metallicity. State specific rather than total recombination cooling rates are needed since the hydrogen atom tends to become optically thick in high-density regimes such as Active Galactic Nuclei. This paper builds upon previous work to derive the cooling rate over the full temperature range where the process can be a significant contributor in a photoionized plasma. We exploit the fact that the recombination and cooling rates are given by intrinsically similar formulae to express the cooling rate in terms of the closely related radiative recombination rate. We give an especially simple but accurate approximation that works for any high hydrogenic level and can be conveniently employed in large-scale numerical simulations.
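The relation exploited in the abstract above can be written schematically as follows; the notation here is generic and chosen for illustration rather than taken from the paper, with alpha_n(T) the state-specific radiative recombination coefficient.

```latex
% Schematic: state-specific recombination rate and the associated cooling rate
% (generic notation; the paper derives the accurate level- and temperature-dependent form)
R_n = n_e\, n_p\, \alpha_n(T), \qquad
\Lambda_n = n_e\, n_p\, \alpha_n(T)\, \langle E_{\mathrm{kin}} \rangle_n ,
\qquad \langle E_{\mathrm{kin}} \rangle_n \sim k_B T
```

The point is that the cooling rate is the recombination rate weighted by the mean kinetic energy of the captured electrons, of order k_B T, which is why the two rates can be expressed through closely related formulae.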
Advanced Opto-Electronics (LIDAR and Microsensor Development)
NASA Technical Reports Server (NTRS)
Vanderbilt, Vern C. (Technical Monitor); Spangler, Lee H.
2005-01-01
Our overall intent in this aspect of the project was to establish a collaborative effort between several departments at Montana State University for developing advanced optoelectronic technology to advance the state of the art in optical remote sensing of the environment. Our particular focus was on the development of small systems that can eventually be used in a wide variety of applications that might include ground-, air-, and space deployments, possibly in sensor networks. Specific objectives were to: 1) Build a field-deployable direct-detection lidar system for use in measurements of clouds, aerosols, fish, and vegetation; 2) Develop a breadboard prototype water vapor differential absorption lidar (DIAL) system based on highly stable, tunable diode laser technology developed previously at MSU. We accomplished both primary objectives of this project, developing a field-deployable direct-detection lidar and a breadboard prototype of a water vapor DIAL system. This paper summarizes each of these accomplishments.
Nature of Pre-Earthquake Phenomena and their Effects on Living Organisms
Freund, Friedemann; Stolc, Viktor
2013-01-01
Simple Summary: Earthquakes are invariably preceded by a period when stresses increase deep in the Earth. Animals appear to be able to sense impending seismic events. During the build-up of stress, electronic charge carriers, called positive holes, are activated deep below. Positive holes have unusual properties: they can travel fast and far into and through the surrounding rocks. As they flow, they generate ultralow frequency electromagnetic waves. When they arrive at the Earth surface, they can ionize the air. When they flow into water, they oxidize it to hydrogen peroxide. All these physical and chemical processes can have noticeable effects on animals. Abstract: Earthquakes occur when tectonic stresses build up deep in the Earth before catastrophic rupture. During the build-up of stress, processes that occur in the crustal rocks lead to the activation of highly mobile electronic charge carriers. These charge carriers are able to flow out of the stressed rock volume into surrounding rocks. Such outflow constitutes an electric current, which generates electromagnetic (EM) signals. If the outflow occurs in bursts, it will lead to short EM pulses. If the outflow is continuous, the currents may fluctuate, generating EM emissions over a wide frequency range. Only ultralow and extremely low frequency (ULF/ELF) waves travel through rock and can reach the Earth surface. The outflowing charge carriers are (i) positively charged and (ii) highly oxidizing. When they arrive at the Earth surface from below, they build up microscopic electric fields, strong enough to field-ionize air molecules. As a result, the air above the epicentral region of an impending major earthquake often becomes laden with positive airborne ions. Medical research has long shown that positive airborne ions cause changes in stress hormone levels in animals and humans. In addition to the ULF/ELF emissions, positive airborne ions can cause unusual reactions among animals. When the charge carriers flow into water, they oxidize water to hydrogen peroxide. This, plus oxidation of organic compounds, can cause behavioral changes among aquatic animals. PMID:26487415
Ionisation and discharge in cloud-forming atmospheres of brown dwarfs and extrasolar planets
NASA Astrophysics Data System (ADS)
Helling, Ch; Rimmer, P. B.; Rodriguez-Barrera, I. M.; Wood, Kenneth; Robertson, G. B.; Stark, C. R.
2016-07-01
Brown dwarfs and giant gas extrasolar planets have cold atmospheres with rich chemical compositions from which mineral cloud particles form. Their properties, like particle sizes and material composition, vary with height, and the mineral cloud particles are charged due to triboelectric processes in such dynamic atmospheres. The dynamics of the atmospheric gas is driven by the irradiating host star and/or by the rotation of the object, which changes during its lifetime. Thermal gas ionisation in these ultra-cool but dense atmospheres allows electrostatic interactions and magnetic coupling of a substantial atmosphere volume. Combined with a strong magnetic field, much greater than B_Earth, a chromosphere and aurorae might form, as suggested by radio and X-ray observations of brown dwarfs. Non-equilibrium processes like cosmic ray ionisation and discharge processes in clouds will increase the local pool of free electrons in the gas. Cosmic rays and lightning discharges also alter the composition of the local atmospheric gas such that tracer molecules might be identified. Cosmic rays affect the atmosphere through air showers in a certain volume, which was modelled with a 3D Monte Carlo radiative transfer code in order to visualise their spatial extent. Given a certain degree of thermal ionisation of the atmospheric gas, we suggest that electron attachment to charge mineral cloud particles is too inefficient to cause an electrostatic disruption of the cloud particles. Cloud particles will therefore not be destroyed by Coulomb explosion for the local temperatures in the collisionally dominated brown dwarf and giant gas planet atmospheres. However, the cloud particles are destroyed electrostatically in regions with strong gas ionisation. The potential size of such cloud holes would, however, be too small, and they might occur too far inside the cloud, to mimic the effect of, e.g. magnetic field induced star spots.
Electron Cloud Effects in Accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furman, M.A.
We present a brief summary of various aspects of the electron-cloud effect (ECE) in accelerators. For further details, the reader is encouraged to refer to the proceedings of many prior workshops, either dedicated to EC or with significant EC contents, including the entire "ECLOUD" series [1-22]. In addition, the proceedings of the various flavors of Particle Accelerator Conferences [23] contain a large number of EC-related publications. The ICFA Beam Dynamics Newsletter series [24] contains one dedicated issue, and several occasional articles, on EC. An extensive reference database is the LHC website on EC [25].
Leland, W.T.
1960-01-01
The ion source described essentially eliminates the problem of deposits of nonconducting materials formed on parts of the ion source by certain corrosive gases. This problem is met by removing both the filament and the trap from the ion chamber, spacing them apart and outside the chamber end walls, placing a focusing cylinder about the filament tip to form a thin collimated electron stream, aligning the cylinder, the slits in the walls, and the trap so that the electron stream does not bombard any part in the source, and heating the trap, which is bombarded by electrons, to a temperature hotter than that in the ion chamber, so that the tendency to build up a deposit caused by electron bombardment is offset by the extra heating supplied only to the trap.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdo, Aous A. (Naval Research Lab, Washington, D.C.); Ackermann, M.
Designed as a high-sensitivity gamma-ray observatory, the Fermi Large Area Telescope is also an electron detector with a large acceptance exceeding 2 m² sr at 300 GeV. Building on the gamma-ray analysis, we have developed an efficient electron detection strategy which provides sufficient background rejection for measurement of the steeply falling electron spectrum up to 1 TeV. Our high-precision data show that the electron spectrum falls with energy as E^-3.0 and does not exhibit prominent spectral features. Interpretations in terms of a conventional diffusive model as well as a potential local extra component are briefly discussed.
TRANSPORT EQUATION OF A PLASMA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balescu, R.
1960-10-01
It is shown that the many-body problem in plasmas can be handled explicitly. An equation describing the collective effects of the problem is derived. For simplicity, a one-component gas is considered in a continuous neutralizing background. The tool for handling the problem is provided by the general theory of irreversible processes in gases. The equation derived describes the interaction of electrons which are "dressed" by a polarization cloud. The polarization cloud differs from the Debye cloud. (B.O.G.)
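For context, the kinetic equation described here is what is now commonly called the Balescu-Lenard equation. A schematic form for a spatially homogeneous one-component plasma is given below; the prefactor and normalization are indicative only and vary between texts. The key feature is the dynamically screened interaction through the dielectric function, which embodies the "dressing" of the electrons by their polarization clouds and distinguishes it from the statically screened Debye picture.

```latex
% Balescu-Lenard collision term (schematic; normalization conventions vary)
\frac{\partial f(\mathbf{v},t)}{\partial t}
  = \frac{\partial}{\partial v_i}
    \int \mathrm{d}^3k\,\mathrm{d}^3v'\;
    \frac{k_i k_j}{k^4}\,
    \frac{8\pi^4 e^4 n}{m^2}\,
    \frac{\delta\!\bigl(\mathbf{k}\cdot(\mathbf{v}-\mathbf{v}')\bigr)}
         {\bigl|\varepsilon(\mathbf{k},\,\mathbf{k}\cdot\mathbf{v})\bigr|^{2}}
    \left(\frac{\partial}{\partial v_j}-\frac{\partial}{\partial v_j'}\right)
    f(\mathbf{v},t)\, f(\mathbf{v}',t)
```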
The Theoretical Astrophysical Observatory: Cloud-based Mock Galaxy Catalogs
NASA Astrophysics Data System (ADS)
Bernyk, Maksym; Croton, Darren J.; Tonini, Chiara; Hodkinson, Luke; Hassan, Amr H.; Garel, Thibault; Duffy, Alan R.; Mutch, Simon J.; Poole, Gregory B.; Hegarty, Sarah
2016-03-01
We introduce the Theoretical Astrophysical Observatory (TAO), an online virtual laboratory that houses mock observations of galaxy survey data. Such mocks have become an integral part of the modern analysis pipeline. However, building them requires expert knowledge of galaxy modeling and simulation techniques, significant investment in software development, and access to high performance computing. These requirements make it difficult for a small research team or individual to quickly build a mock catalog suited to their needs. To address this, TAO offers access to multiple cosmological simulations and semi-analytic galaxy formation models from an intuitive and clean web interface. Results can be funnelled through science modules and sent to a dedicated supercomputer for further processing and manipulation. These modules include the ability to (1) construct custom observer light cones from the simulation data cubes; (2) generate the stellar emission from star formation histories, apply dust extinction, and compute absolute and/or apparent magnitudes; and (3) produce mock images of the sky. All of TAO’s features can be accessed without any programming requirements. The modular nature of TAO opens it up for further expansion in the future.
Yu, Hua-Gen
2008-05-21
A spherical electron cloud hopping (SECH) model is proposed to study the product branching ratios of dissociative recombination (DR) of polyatomic systems. In this model, the fast electron-capture process is treated as an instantaneous hopping of a cloud of uniform spherical fractional point charges onto a target M^+q ion (or molecule). The sum of the point charges (-1) simulates the incident electron. The sphere radius is determined by a critical distance (R_c^eM) between the incoming electron (e^-) and the target, at which the potential energy of the (e^- + M^+q) system is equal to that of the electron-captured molecule M^(+q-1) in a symmetry-allowed electronic state with the same structure as M^+q. During the hopping procedure, the excess energies of the electron association reaction are dispersed into the kinetic energies of the M^(+q-1) atoms to conserve total energy. The kinetic energies are adjusted by linearly adding atomic momenta in the direction of the driving forces induced by the scattering electron. The nuclear dynamics of the resultant M^(+q-1) molecule are studied by using a direct ab initio dynamics method on the adiabatic potential energy surface of M^(+q-1), or together with extra adiabatic surface(s) of M^(+q-1). For the latter case, the "fewest switches" surface hopping algorithm of Tully was adapted to deal with the nonadiabaticity in trajectory propagations. The SECH model has been applied to study the DR of both CH^+ and H3O^+(H2O)2. The theoretical results are consistent with experiment. It was found that water molecules play an important role in determining the product branching ratios of the molecular cluster ion.
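To make the charge-cloud construction concrete, the toy sketch below places N uniform fractional point charges on a sphere of a given radius (using a Fibonacci lattice) so that they sum to -1, and evaluates their Coulomb interaction with the ion's atomic point charges. The number of charges, the radius value, and the simple energy evaluation are illustrative assumptions, not details of the SECH implementation.

```python
import numpy as np

def spherical_electron_cloud(n_charges, radius, center=np.zeros(3)):
    """Uniform fractional point charges on a sphere summing to -1 (Fibonacci lattice)."""
    i = np.arange(n_charges)
    phi = np.pi * (3.0 - np.sqrt(5.0)) * i            # golden-angle azimuthal spacing
    z = 1.0 - 2.0 * (i + 0.5) / n_charges             # uniform in cos(theta)
    r = np.sqrt(1.0 - z * z)
    pts = center + radius * np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)
    charges = np.full(n_charges, -1.0 / n_charges)    # fractional charges, sum = -1
    return pts, charges

def coulomb_energy(cloud_pts, cloud_q, atom_pts, atom_q):
    """Electrostatic interaction energy between the charge cloud and the ion (atomic units)."""
    d = np.linalg.norm(cloud_pts[:, None, :] - atom_pts[None, :, :], axis=-1)
    return np.sum(cloud_q[:, None] * atom_q[None, :] / d)
```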
The Bonn Electron Stretcher Accelerator ELSA: Past and future
NASA Astrophysics Data System (ADS)
Hillert, W.
2006-05-01
In 1953, it was decided to build a 500 MeV electron synchrotron in Bonn. It came into operation in 1958, being the first alternating-gradient synchrotron in Europe. After five years of performing photoproduction experiments at this accelerator, a larger 2.5 GeV electron synchrotron was built and set into operation in 1967. Both synchrotrons were running for particle physics experiments until, from 1982 to 1987, a third accelerator, the electron stretcher ring ELSA, was constructed and set up in a separate ring tunnel below the physics institute. ELSA came into operation in 1987, using the pulsed 2.5 GeV synchrotron as pre-accelerator. ELSA serves either as a storage ring producing synchrotron radiation, or as a post-accelerator and pulse stretcher. Applying a slow extraction close to a third-integer resonance, external electron beams with energies up to 3.5 GeV and high duty factors are delivered to hadron physics experiments. Various photo- and electroproduction experiments, utilising the experimental set-ups PHOENICS, ELAN, SAPHIR, GDH and Crystal Barrel, have been carried out. During the late 1990s, a pulsed GaAs source of polarised electrons was constructed and set up at the accelerator. ELSA was upgraded in order to accelerate polarised electrons, compensating for depolarising resonances by applying the methods of fast tune jumping and harmonic closed orbit correction. With the experimental investigation of the GDH sum rule, the first experiment requiring a polarised beam and a polarised target was successfully performed at the accelerator. In the near future, the stretcher ring will be further upgraded to increase the polarisation and current of the external electron beams. In addition, the aspects of an increase of the maximum energy to 5 GeV using superconducting resonators will be investigated.
Electron beam pumped semiconductor laser
NASA Technical Reports Server (NTRS)
Hug, William F. (Inventor); Reid, Ray D. (Inventor)
2009-01-01
Electron-beam-pumped semiconductor ultra-violet optical sources (ESUVOSs) are disclosed that use ballistic electron pumped wide bandgap semiconductor materials. The sources may produce incoherent radiation and take the form of electron-beam-pumped light emitting triodes (ELETs). The sources may produce coherent radiation and take the form of electron-beam-pumped laser triodes (ELTs). The ELTs may take the form of electron-beam-pumped vertical cavity surface emitting lasers (EVCSEL) or edge emitting electron-beam-pumped lasers (EEELs). The semiconductor medium may take the form of an aluminum gallium nitride alloy that has a mole fraction of aluminum selected to give a desired emission wavelength, diamond, or diamond-like carbon (DLC). The sources may be produced from discrete components that are assembled after their individual formation or they may be produced using batch MEMS-type or semiconductor-type processing techniques to build them up in a whole or partial monolithic manner, or combination thereof.
Gelhorn, Heather L; Skalicky, Anne M; Balantac, Zaneta; Eremenco, Sonya; Cimms, Tricia; Halling, Katarina; Hollen, Patricia J; Gralla, Richard J; Mahoney, Martin C; Sexton, Chris
2018-07-01
Obtaining qualitative data directly from the patient perspective enhances the content validity of patient-reported outcome (PRO) instruments. The objective of this qualitative study was to evaluate the content validity of the Lung Cancer Symptom Scale for Mesothelioma (LCSS-Meso) and its usability on an electronic device. A cross-sectional methodological study, using a qualitative approach, was conducted among patients recruited from four clinical sites. The primary target population included patients with pleural mesothelioma; data were also collected from patients with peritoneal mesothelioma on an exploratory basis. Semi-structured interviews were conducted consisting of concept elicitation, cognitive interviewing, and evaluation of electronic patient-reported outcome (ePRO) usability. Participants (n = 21) were interviewed in person (n = 9) or by telephone (n = 12); 71% were male with a mean age of 69 years (SD = 14). The most common signs and symptoms experienced by participants with pleural mesothelioma (n = 18) were shortness of breath, fluid build-up, pain, fatigue, coughing, and appetite loss. The most commonly described symptoms for those with peritoneal mesothelioma (n = 4) were bloating, changes in appetite, fatigue, fluid build-up, shortness of breath, and pain. Participants with pleural mesothelioma commonly described symptoms assessed by the LCSS-Meso in language consistent with the questionnaire and a majority understood and easily completed each of the items. The ePRO version was easy to use, and there was no evidence that the electronic formatting changed the way participants responded to the questions. Results support the content validity of the LCSS-Meso and the usability of the electronic format for use in assessing symptoms among patients with pleural mesothelioma.
ERIC Educational Resources Information Center
Waters, John K.
2011-01-01
The vulnerability and inefficiency of backing up data on-site are prompting school districts to switch to more secure, less troublesome cloud-based options. District auditors are pushing for a better way to back up their data than the on-site, tape-based system that had been used for years. About three years ago, Hendrick School District in…
NASA Astrophysics Data System (ADS)
Sirch, Tobias; Bugliaro, Luca; Zinner, Tobias; Möhrlein, Matthias; Vazquez-Navarro, Margarita
2017-02-01
A novel approach for the nowcasting of clouds and direct normal irradiance (DNI) based on the Spinning Enhanced Visible and Infrared Imager (SEVIRI) aboard the geostationary Meteosat Second Generation (MSG) satellite is presented for a forecast horizon of up to 120 min. The basis of the algorithm is an optical flow method used to derive cloud motion vectors for all cloudy pixels. To facilitate forecasts over a relevant time period, a classification of clouds into objects and a weighted triangular interpolation of clear-sky regions are used. Low-level and high-level clouds are forecasted separately because they show different velocities and motion directions. Additionally, a distinction between advective and convective clouds, together with an intensity correction for quickly thinning convective clouds, is integrated. The DNI is calculated from the forecasted optical thickness of the low-level and high-level clouds. In order to quantitatively assess the performance of the algorithm, a forecast validation against MSG/SEVIRI observations is performed for a period of 2 months. Error rates and Hanssen-Kuiper skill scores are derived for the forecasted cloud masks. For a 5 min forecast, more than 95% of all pixels are predicted correctly as cloudy or clear in most cloud situations. This number decreases to 80-95% for a 2 h forecast, depending on cloud type and vertical cloud level. Hanssen-Kuiper skill scores for the cloud mask go down to 0.6-0.7 for a 2 h forecast. Compared to persistence, an improvement of the forecast horizon by a factor of 2 is reached for all forecasts up to 2 h. A comparison of forecasted optical thickness distributions and DNI against observations yields correlation coefficients larger than 0.9 for 15 min forecasts and around 0.65 for 2 h forecasts.
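The core of such a scheme, deriving cloud motion vectors with optical flow and advecting the current cloud field forward, can be sketched as below. This is a generic illustration using OpenCV's Farnebäck optical flow on two consecutive satellite-derived cloud optical thickness fields, not the SEVIRI algorithm itself; the frozen-motion extrapolation and the time-step handling are simplifying assumptions.

```python
import numpy as np
import cv2

def motion_vectors(field_prev, field_curr):
    """Dense cloud motion vectors (pixels per time step) via Farnebäck optical flow."""
    a = cv2.normalize(field_prev, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    b = cv2.normalize(field_curr, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    return cv2.calcOpticalFlowFarneback(a, b, None, 0.5, 3, 15, 3, 5, 1.2, 0)

def advect(field, flow, n_steps):
    """Extrapolate the latest field forward by n_steps, assuming frozen linear motion."""
    h, w = field.shape
    x, y = np.meshgrid(np.arange(w, dtype=np.float32), np.arange(h, dtype=np.float32))
    # backward mapping: where would each forecast pixel have come from?
    map_x = x - n_steps * flow[..., 0]
    map_y = y - n_steps * flow[..., 1]
    return cv2.remap(field.astype(np.float32), map_x, map_y,
                     interpolation=cv2.INTER_LINEAR, borderMode=cv2.BORDER_REPLICATE)

# Usage sketch (hypothetical variable names): 15-min slots, 2 h forecast -> n_steps = 8
# forecast = advect(cot_t0, motion_vectors(cot_t_minus15, cot_t0), n_steps=8)
```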