Electron Cloud Trapping in Recycler Combined Function Dipole Magnets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antipov, Sergey A.; Nagaitsev, S.
2016-10-04
Electron cloud can lead to a fast instability in intense proton and positron beams in circular accelerators. In the Fermilab Recycler the electron cloud is confined within its combined function magnets. We show that the field of combined function magnets traps the electron cloud, present the results of analytical estimates of trapping, and compare them to numerical simulations of electron cloud formation. The electron cloud is located at the beam center, and up to 1% of the particles can be trapped by the magnetic field. Since the process of electron cloud build-up is exponential, once trapped, this amount of electrons significantly increases the density of the cloud on the next revolution. In a Recycler combined function dipole this multi-turn accumulation allows the electron cloud to reach final intensities orders of magnitude greater than in a pure dipole. The multi-turn build-up can be stopped by injection of a clearing bunch of 10^10 p at any position in the ring.
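The multi-turn accumulation mechanism described in this abstract can be illustrated with a toy model: during each revolution the cloud grows exponentially until it saturates, and only the trapped fraction survives to seed the next turn. All numerical parameters below are illustrative assumptions, not values from the paper.

```python
# Toy model of multi-turn electron-cloud accumulation in a combined
# function dipole. All parameters are illustrative assumptions.

def cloud_density_after_turns(n_turns, gain_per_turn=1e4, trapped_fraction=0.01,
                              saturation=1.0, seed=1e-6):
    """Relative cloud density at the end of each turn.

    gain_per_turn    -- exponential build-up factor during one revolution
    trapped_fraction -- fraction of electrons surviving to the next turn
    saturation       -- space-charge limit on the relative density
    """
    density = seed
    history = []
    for _ in range(n_turns):
        density = min(density * gain_per_turn, saturation)  # build-up, capped
        history.append(density)
        density *= trapped_fraction  # only trapped electrons survive the gap
    return history

densities = cloud_density_after_turns(5)
```

With even a 1% trapped fraction, the seed for each turn is large enough that the cloud reaches its saturated density within a few revolutions, whereas with zero trapping the cloud would have to rebuild from scratch every turn.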
Fast instability caused by electron cloud in combined function magnets
Antipov, S. A.; Adamson, P.; Burov, A.; ...
2017-04-10
One of the factors which may limit the intensity in the Fermilab Recycler is a fast transverse instability. It develops within a hundred turns and, in certain conditions, may lead to a beam loss. The high rate of the instability suggests that its cause is electron cloud. Here, we studied the phenomenon by observing the dynamics of stable and unstable beams, numerically simulating the build-up of the electron cloud, and developing an analytical model of an electron cloud driven instability with the electrons trapped in combined function dipoles. We also found that beam motion can be stabilized by a clearing bunch, which confirms the electron cloud nature of the instability. The clearing suggests electron cloud trapping in Recycler combined function magnets. Numerical simulations show that up to 1% of the particles can be trapped by the magnetic field. Since the process of electron cloud build-up is exponential, once trapped, this amount of electrons significantly increases the density of the cloud on the next revolution. Furthermore, in a Recycler combined function dipole this multi-turn accumulation allows the electron cloud to reach final intensities orders of magnitude greater than in a pure dipole. The estimated resulting instability growth rate of about 30 revolutions and the mode frequency of 0.4 MHz are consistent with experimental observations and agree with the simulation in the PEI code. The created instability model allows investigating beam stability for future intensity upgrades.
Fast Transverse Beam Instability Caused by Electron Cloud Trapped in Combined Function Magnets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antipov, Sergey
Electron cloud instabilities affect the performance of many circular high-intensity particle accelerators. They usually have a fast growth rate and might lead to an increase of the transverse emittance and beam loss. A peculiar example of such an instability is observed in the Fermilab Recycler proton storage ring. Although this instability might pose a challenge for future intensity upgrades, its nature had not been completely understood. The phenomenon has been studied experimentally by comparing the dynamics of stable and unstable beams, numerically by simulating the build-up of the electron cloud and its interaction with the beam, and analytically by constructing a model of an electron cloud driven instability with the electrons trapped in combined function dipoles. Stabilization of the beam by a clearing bunch reveals that the instability is caused by the electron cloud trapped in beam optics magnets. Measurements of microwave propagation confirm the presence of the cloud in the combined function dipoles. Numerical simulations show that up to 10^-2 of the particles can be trapped by their magnetic field. Since the process of electron cloud build-up is exponential, once trapped, this amount of electrons significantly increases the density of the cloud on the next revolution. In a combined function dipole this multi-turn accumulation allows the electron cloud to reach final intensities orders of magnitude greater than in a pure dipole. The estimated fast instability growth rate of about 30 revolutions and low mode frequency of 0.4 MHz are consistent with experimental observations and agree with the simulations. The created instability model allows investigating beam stability for future intensity upgrades.
Electron-Cloud Build-Up: Theory and Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furman, M. A.
We present a broad-brush survey of the phenomenology, history and importance of the electron-cloud effect (ECE). We briefly discuss the simulation techniques used to quantify the electron-cloud (EC) dynamics. Finally, we present in more detail an effective theory to describe the EC density build-up in terms of a few effective parameters. For further details, the reader is encouraged to refer to the proceedings of many prior workshops, either dedicated to EC or with significant EC contents, including the entire 'ECLOUD' series. In addition, the proceedings of the various flavors of Particle Accelerator Conferences contain a large number of EC-related publications. The ICFA Beam Dynamics Newsletter series contains one dedicated issue, and several occasional articles, on EC. An extensive reference database is the LHC website on EC.
Protection of electronic health records (EHRs) in cloud.
Alabdulatif, Abdulatif; Khalil, Ibrahim; Mai, Vu
2013-01-01
EHR technology has come into widespread use and has attracted attention in healthcare institutions as well as in research. Cloud services are used to build efficient EHR systems and obtain the greatest benefits of EHR implementation. Many issues relating to building an ideal EHR system in the cloud, especially the tradeoff between flexibility and security, have recently surfaced. The privacy of patient records in cloud platforms is still a point of contention. In this research, we are going to improve the management of access control by restricting participants' access through the use of distinct encrypted parameters for each participant in the cloud-based database. Also, we implement and improve an existing secure index search algorithm to enhance the efficiency of information control and flow through a cloud-based EHR system. At the final stage, we contribute to the design of reliable, flexible and secure access control, enabling quick access to EHR information.
cryoem-cloud-tools: A software platform to deploy and manage cryo-EM jobs in the cloud.
Cianfrocco, Michael A; Lahiri, Indrajit; DiMaio, Frank; Leschziner, Andres E
2018-06-01
Access to streamlined computational resources remains a significant bottleneck for new users of cryo-electron microscopy (cryo-EM). To address this, we have developed tools that will submit cryo-EM analysis routines and atomic model building jobs directly to Amazon Web Services (AWS) from a local computer or laptop. These new software tools ("cryoem-cloud-tools") have incorporated optimal data movement, security, and cost-saving strategies, giving novice users access to complex cryo-EM data processing pipelines. Integrating these tools into the RELION processing pipeline and graphical user interface we determined a 2.2 Å structure of β-galactosidase in ∼55 hours on AWS. We implemented a similar strategy to submit Rosetta atomic model building and refinement to AWS. These software tools dramatically reduce the barrier for entry of new users to cloud computing for cryo-EM and are freely available at cryoem-tools.cloud. Copyright © 2018. Published by Elsevier Inc.
LIDAR Point Cloud Data Extraction and Establishment of 3D Modeling of Buildings
NASA Astrophysics Data System (ADS)
Zhang, Yujuan; Li, Xiuhai; Wang, Qiang; Liu, Jiang; Liang, Xin; Li, Dan; Ni, Chundi; Liu, Yan
2018-01-01
This paper applies Shepard's method to the original LIDAR point cloud data to generate a regular-grid DSM, and separates the ground and non-ground point clouds with a double least-squares filter to obtain the regularized DSM. Region growing is then used to segment the regularized DSM and remove the non-building points, yielding the building point cloud information. The Canny operator extracts the edges of the buildings from the segmented data, and Hough-transform line detection regularizes and smooths the extracted building edges. Finally, the E3De3 software is used to establish the 3D model of the buildings.
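Shepard's method, named in the abstract as the step that turns scattered LIDAR returns into a regular-grid DSM, is inverse-distance-weighted interpolation. A minimal sketch follows; the sample points and grid spacing are illustrative assumptions, not data from the paper.

```python
# Minimal sketch of Shepard's inverse-distance-weighted interpolation,
# used to resample scattered LIDAR points onto a regular DSM grid.
import math

def shepard_interpolate(points, x, y, power=2.0, eps=1e-12):
    """IDW estimate of elevation at (x, y) from (px, py, pz) samples."""
    num = den = 0.0
    for px, py, pz in points:
        d2 = (px - x) ** 2 + (py - y) ** 2
        if d2 < eps:            # query coincides with a sample point
            return pz
        w = 1.0 / d2 ** (power / 2.0)  # weight falls off as 1/d^power
        num += w * pz
        den += w
    return num / den

# Scattered LIDAR returns: (x, y, elevation) -- illustrative values
points = [(0.0, 0.0, 10.0), (1.0, 0.0, 12.0),
          (0.0, 1.0, 11.0), (1.0, 1.0, 13.0)]

# Elevations at the centres of a regular 2x2 grid
grid = [[shepard_interpolate(points, gx, gy) for gx in (0.25, 0.75)]
        for gy in (0.25, 0.75)]
```

Because the IDW estimate is a convex combination of the sample elevations, every grid value lies between the minimum and maximum input elevation, which makes the method robust for DSM generation.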
Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.
Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei
2011-09-07
Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
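The timings reported in this abstract (2.58 h on one local machine versus 3.3 min on 100 cloud nodes) imply the quoted 47× speed-up; the parallel efficiency follows directly. A quick check of that arithmetic:

```python
# Speed-up and parallel efficiency from the timings reported in the
# abstract: 2.58 h serial vs 3.3 min on 100 cloud nodes.
serial_minutes = 2.58 * 60   # serial run time on the local computer
parallel_minutes = 3.3       # run time on the 100-node cloud cluster
nodes = 100

speedup = serial_minutes / parallel_minutes  # ~47x, matching the abstract
efficiency = speedup / nodes                 # fraction of ideal linear scaling
```

The efficiency of roughly 0.47 indicates that, for this problem size, node allocation and data-movement overheads consume about half of the ideal 100× scaling; the abstract notes this overhead becomes negligible for larger simulations.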
Electron-cloud build-up in hadron machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furman, M.A.
2004-08-09
The first observations of electron-proton coupling effect for coasting beams and for long-bunch beams were made at the earliest proton storage rings at the Budker Institute of Nuclear Physics (BINP) in the mid-60's [1]. The effect was mainly a form of the two-stream instability. This phenomenon reappeared at the CERN ISR in the early 70's, where it was accompanied by an intense vacuum pressure rise. When the ISR was operated in bunched-beam mode while testing aluminum vacuum chambers, a resonant effect was observed in which the electron traversal time across the chamber was comparable to the bunch spacing [2]. This effect (''beam-induced multipacting''), being resonant in nature, is a dramatic manifestation of an electron cloud sharing the vacuum chamber with a positively-charged beam. An electron-cloud-induced instability has been observed since the mid-80's at the PSR (LANL) [3]; in this case, there is a strong transverse instability accompanied by fast beam losses when the beam current exceeds a certain threshold. The effect was observed for the first time for a positron beam in the early 90's at the Photon Factory (PF) at KEK, where the most prominent manifestation was a coupled-bunch instability that was absent when the machine was operated with an electron beam under otherwise identical conditions [4]. Since then, with the advent of ever more intense positron and hadron beams, and the development and deployment of specialized electron detectors [5-9], the effect has been observed directly or indirectly, and sometimes studied systematically, at most lepton and hadron machines when operated with sufficiently intense beams. The effect is expected in various forms and to various degrees in accelerators under design or construction. The electron-cloud effect (ECE) has been the subject of various meetings [10-15].
Two excellent reviews, covering the phenomenology, measurements, simulations and historical development, have been recently given by Frank Zimmermann [16,17]. In this article we focus on the mechanisms of electron-cloud buildup and dissipation for hadronic beams, particularly those with very long, intense, bunches.
Cloud-assisted mobile-access of health data with privacy and auditability.
Tong, Yue; Sun, Jinyuan; Chow, Sherman S M; Li, Pan
2014-03-01
Motivated by the privacy issues curbing the adoption of electronic healthcare systems and the wild success of cloud service models, we propose to build privacy into mobile healthcare systems with the help of the private cloud. Our system offers salient features including efficient key management, privacy-preserving data storage and retrieval (especially retrieval at emergencies), and auditability for misuse of health data. Specifically, we propose to integrate key management from a pseudorandom number generator for unlinkability, a secure indexing method for privacy-preserving keyword search which hides both search and access patterns based on redundancy, and the concept of attribute-based encryption with threshold signing for providing role-based access control with auditability to prevent potential misbehavior, in both normal and emergency cases.
Biotic games and cloud experimentation as novel media for biophysics education
NASA Astrophysics Data System (ADS)
Riedel-Kruse, Ingmar; Blikstein, Paulo
2014-03-01
First-hand, open-ended experimentation is key for effective formal and informal biophysics education. We developed, tested and assessed multiple new platforms that enable students and children to directly interact with and learn about microscopic biophysical processes: (1) biotic games that enable local and online play using galvano- and photo-tactic stimulation of micro-swimmers, illustrating concepts such as biased random walks, low Reynolds number hydrodynamics, and Brownian motion; (2) an undergraduate course where students learn optics, electronics, micro-fluidics, real-time image analysis, and instrument control by building biotic games; and (3) a graduate class on the biophysics of multi-cellular systems that contains a cloud experimentation lab enabling students to execute open-ended chemotaxis experiments on slime molds online, analyze their data, and build biophysical models. Our work aims to generate the equivalent excitement and educational impact for biophysics as robotics and video games have had for mechatronics and computer science, respectively. We also discuss how scaled-up cloud experimentation systems can support MOOCs with true lab components and life-science research in general.
NASA Astrophysics Data System (ADS)
Romano, Annalisa; Boine-Frankenheim, Oliver; Buffat, Xavier; Iadarola, Giovanni; Rumolo, Giovanni
2018-06-01
At the beginning of the 2016 run, an anomalous beam instability was systematically observed at the CERN Large Hadron Collider (LHC). Its main characteristic was that it spontaneously appeared after beams had been stored for several hours in collision at 6.5 TeV to provide data for the experiments, despite large chromaticity values and high strength of the Landau-damping octupole magnet. The instability exhibited several features characteristic of those induced by the electron cloud (EC). Indeed, when LHC operates with 25 ns bunch spacing, an EC builds up in a large fraction of the beam chambers, as revealed by several independent indicators. Numerical simulations have been carried out in order to investigate the role of the EC in the observed instabilities. It has been found that the beam intensity decay is unfavorable for the beam stability when LHC operates in a strong EC regime.
Storing and using health data in a virtual private cloud.
Regola, Nathan; Chawla, Nitesh V
2013-03-13
Electronic health records are being adopted at a rapid rate due to increased funding from the US federal government. Health data provide the opportunity to identify possible improvements in health care delivery by applying data mining and statistical methods to the data and will also enable a wide variety of new applications that will be meaningful to patients and medical professionals. Researchers are often granted access to health care data to assist in the data mining process, but HIPAA regulations mandate comprehensive safeguards to protect the data. Often universities (and presumably other research organizations) have an enterprise information technology infrastructure and a research infrastructure. Unfortunately, both of these infrastructures are generally not appropriate for sensitive research data such as data covered by HIPAA, as they require special accommodations on the part of the enterprise information technology (or increased security on the part of the research computing environment). Cloud computing, which is a concept that allows organizations to build complex infrastructures on leased resources, is rapidly evolving to the point that it is possible to build sophisticated network architectures with advanced security capabilities. We present a prototype infrastructure in Amazon's Virtual Private Cloud to allow researchers and practitioners to utilize the data in a HIPAA-compliant environment.
Properties of the electron cloud in a high-energy positron and electron storage ring
Harkay, K. C.; Rosenberg, R. A.
2003-03-20
Low-energy, background electrons are ubiquitous in high-energy particle accelerators. Under certain conditions, interactions between this electron cloud and the high-energy beam can give rise to numerous effects that can seriously degrade the accelerator performance. These effects range from vacuum degradation to collective beam instabilities and emittance blowup. Although electron-cloud effects were first observed two decades ago in a few proton storage rings, they have in recent years been widely observed and intensely studied in positron and proton rings. Electron-cloud diagnostics developed at the Advanced Photon Source enabled for the first time detailed, direct characterization of the electron-cloud properties in a positron and electron storage ring. From in situ measurements of the electron flux and energy distribution at the vacuum chamber wall, electron-cloud production mechanisms and details of the beam-cloud interaction can be inferred. A significant longitudinal variation of the electron cloud is also observed, due primarily to geometrical details of the vacuum chamber. Furthermore, such experimental data can be used to provide realistic limits on key input parameters in modeling efforts, leading ultimately to greater confidence in predicting electron-cloud effects in future accelerators.
Semantic Segmentation of Building Elements Using Point Cloud Hashing
NASA Astrophysics Data System (ADS)
Chizhova, M.; Gurianov, A.; Hess, M.; Luhmann, T.; Brunn, A.; Stilla, U.
2018-05-01
For the interpretation of point clouds, the semantic definition of segments extracted from point clouds or images is a common problem. Usually, the semantics of geometrically pre-segmented point cloud elements are determined using probabilistic networks and scene databases. The proposed semantic segmentation method is based on the psychological human interpretation of geometric objects, especially on fundamental rules of primary comprehension. Starting from these rules, a human operator (e.g. an architect) can quite easily classify buildings into different building types and structural elements (dome, nave, transept, etc.), including particular building parts which are visually detected. The key part of the procedure is a novel method based on hashing, where point cloud projections are transformed into binary pixel representations. The segmentation approach, demonstrated on the example of classical Orthodox churches, is suitable for other buildings and objects characterized by a particular typology in their construction (e.g. industrial objects in standardized environments with strict component design allowing clear semantic modelling).
On Study of Building Smart Campus under Conditions of Cloud Computing and Internet of Things
NASA Astrophysics Data System (ADS)
Huang, Chao
2017-12-01
Cloud computing and the Internet of Things are two new concepts of the information era; although they are defined differently, they are closely related. Building a smart campus by virtue of cloud computing, the Internet of Things, and other Internet technologies is a new measure to realize leap-forward development of a campus. This paper, centering on the construction of a smart campus, analyzes and compares the differences between the network in a traditional campus and that in a smart campus, and finally makes proposals on how to build a smart campus from the perspectives of cloud computing and the Internet of Things.
A secure EHR system based on hybrid clouds.
Chen, Yu-Yi; Lu, Jun-Chao; Jan, Jinn-Ke
2012-10-01
Application services rendering remote medical services and electronic health records (EHR) have become a hot topic, stimulating increased interest in this subject in recent years. Information and communication technologies have been applied to the medical services and healthcare area for a number of years to resolve problems in medical management. Sharing EHR information can provide professional medical programs with consultancy, evaluation, and tracing services, and can certainly improve accessibility for the public receiving medical services or medical information at remote sites. With the widespread use of EHR, building a secure EHR sharing environment has attracted a lot of attention in both the healthcare industry and the academic community. The cloud computing paradigm is one of the popular health IT infrastructures for facilitating EHR sharing and EHR integration. In this paper, we propose an EHR sharing and integration system in healthcare clouds and analyze the arising security and privacy issues in access and management of EHRs.
NASA Astrophysics Data System (ADS)
Antova, Gergana; Kunchev, Ivan; Mickrenska-Cherneva, Christina
2016-10-01
The representation of physical buildings in Building Information Models (BIM) has been a subject of research for four decades in the fields of Construction Informatics and GeoInformatics. The early digital representations of buildings mainly appeared as 3D drawings constructed by CAD software; the 3D representation of the buildings was only geometric, while semantics and topology were out of the modelling focus. On the other hand, less detailed building representations, often with a focus on 'outside' representations, were also found in the form of 2D/2.5D GeoInformation models. Point clouds from 3D laser scanning data give a full and exact representation of the building geometry. The article presents different aspects and the benefits of using point clouds in BIM in the different stages of the lifecycle of a building.
A cloud-based framework for large-scale traditional Chinese medical record retrieval.
Liu, Lijun; Liu, Li; Fu, Xiaodong; Huang, Qingsong; Zhang, Xianwen; Zhang, Yin
2018-01-01
Electronic medical records are increasingly common in medical practice. The secondary use of medical records has become increasingly important; it relies on the ability to retrieve complete information about desired patient populations. How to effectively and accurately retrieve relevant medical records from large-scale medical big data is becoming a big challenge. Therefore, we propose an efficient and robust cloud-based framework for large-scale Traditional Chinese Medical Records (TCMRs) retrieval. We propose a parallel index building method and build a distributed search cluster; the former is used to improve the performance of index building, and the latter is used to provide highly concurrent online TCMRs retrieval. Then, a real-time multi-indexing model is proposed to ensure the latest relevant TCMRs are indexed and retrieved in real time, and a semantics-based query expansion method and a multi-factor ranking model are proposed to improve retrieval quality. Third, we implement a template-based visualization method for displaying medical reports. The proposed parallel indexing method and distributed search cluster can improve the performance of index building and provide highly concurrent online TCMRs retrieval. The multi-indexing model can ensure the latest relevant TCMRs are indexed and retrieved in real time. The semantics expansion method and the multi-factor ranking model can enhance retrieval quality. The template-based visualization method can enhance availability and universality, where the medical reports are displayed via a friendly web interface. In conclusion, compared with current medical record retrieval systems, our system provides some advantages that are useful in improving the secondary use of large-scale traditional Chinese medical records in a cloud environment. The proposed system is more easily integrated with existing clinical systems and can be used in various scenarios. Copyright © 2017. Published by Elsevier Inc.
Weaver, Charlotte A; Teenier, Pamela
2014-01-01
Health care organizations have long been limited to a small number of major vendors in their selection of an electronic health record (EHR) system in the national and international marketplace. These major EHR vendors have in common base systems that are decades old, are built in antiquated programming languages, use outdated server architecture, and are based on inflexible data models [1,2]. The option to upgrade their technology to keep pace with the power of new web-based architecture, programming tools and cloud servers is not easily undertaken due to large client bases, development costs and risk [3]. This paper presents the decade-long efforts of a large national provider of home health and hospice care to select an EHR product, failing that, to build their own, and failing that initiative, to go back into the market in 2012. The decade-long delay had allowed new technologies and more nimble vendors to enter the market. Partnering with a new start-up company doing web- and cloud-based architecture for the home health and hospice market made it possible to build, test and implement an operational and point-of-care system in 264 home health locations across 40 states and three time zones in the United States. This option of "starting over" with the new web and cloud technologies may be ushering in a next generation of new EHR vendors that retells the Blackberry-replacement-by-iPhone story in healthcare.
Electron temperatures within magnetic clouds between 2 and 4 AU: Voyager 2 observations
NASA Astrophysics Data System (ADS)
Sittler, E. C.; Burlaga, L. F.
1998-08-01
We have performed an analysis of Voyager 2 plasma electron observations within magnetic clouds between 2 and 4 AU identified by Burlaga and Behannon [1982]. The analysis has been confined to three of the magnetic clouds identified by Burlaga and Behannon that had high-quality data. The general properties of the plasma electrons within a magnetic cloud are that (1) the moment electron temperature anticorrelates with the electron density within the cloud, (2) the ratio T_e/T_p tends to be >1, and (3) on average, T_e/T_p ~ 7.0. All three results are consistent with previous electron observations within magnetic clouds. Detailed analyses of the core and halo populations within the magnetic clouds show no evidence of either an anticorrelation between the core temperature T_C and the electron density N_e or an anticorrelation between the halo temperature T_H and the electron density. Within the magnetic clouds the halo component can contribute more than 50% of the electron pressure. The anticorrelation of T_e relative to N_e can be traced to the density of the halo component relative to the density of the core component. The core electrons dominate the electron density. When the density goes up, the halo electrons contribute less to the electron pressure, so we get a lower T_e. When the electron density goes down, the halo electrons contribute more to the electron pressure, and T_e goes up. We find a relation between the electron pressure and density of the form P_e = α N_e^γ, with γ ≈ 0.5.
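The reported power-law relation can be recovered from data by a straight-line fit in log-log space, since log P_e = log α + γ log N_e. A minimal sketch with synthetic, noiseless data; the α and γ values are hypothetical, not the Voyager 2 fit results:

```python
import numpy as np

# Illustrative only: recover the exponent gamma in P_e = alpha * N_e**gamma
# from (density, pressure) samples by linear least squares in log-log space.
rng = np.random.default_rng(0)
n_e = rng.uniform(0.1, 10.0, size=200)      # electron density (arbitrary units)
alpha_true, gamma_true = 2.0, 0.5           # hypothetical values
p_e = alpha_true * n_e**gamma_true          # noiseless electron pressure

# log P = log alpha + gamma * log N  ->  fit a straight line
gamma_fit, log_alpha_fit = np.polyfit(np.log(n_e), np.log(p_e), 1)
alpha_fit = np.exp(log_alpha_fit)
print(round(gamma_fit, 3), round(alpha_fit, 3))
```

With real, noisy measurements the same fit gives a least-squares estimate of γ rather than an exact recovery.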
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubin, David L.
2015-01-23
Accelerators that collide high energy beams of matter and anti-matter are essential tools for the investigation of the fundamental constituents of matter, and the search for new forms of matter and energy. A “Linear Collider” is a machine that would bring high energy and very compact bunches of electrons and positrons (anti-electrons) into head-on collision. Such a machine would produce (among many other things) the newly discovered Higgs particle, enabling a detailed study of its properties. Among the most critical and challenging components of a linear collider are the damping rings that produce the very compact and intense beams of electrons and positrons that are to be accelerated into collision. Hot dilute particle beams are injected into the damping rings, where they are compressed and cooled. The size of the positron beam must be reduced more than a thousand fold in the damping ring, and this compression must be accomplished in a fraction of a second. The cold compact beams are then extracted from the damping ring and accelerated into collision at high energy. The proposed International Linear Collider (ILC) would require damping rings that routinely produce such cold, compact and intense beams. The goal of the Cornell study was a credible design for the damping rings for the ILC. Among the technical challenges of the damping rings are the development of instrumentation that can measure the properties of the very small beams in a very narrow window of time, and the mitigation of the forces that can destabilize the beams and prevent adequate cooling or, worse, lead to beam loss. One of the most pernicious destabilizing forces is due to the formation of clouds of electrons in the beam pipe. The electron cloud effect is a phenomenon in particle accelerators in which a high density of low-energy electrons builds up inside the vacuum chamber.
At the outset of the study, it was anticipated that electron cloud effects would limit the intensity of the positron ring, and that an instability associated with residual gas in the beam pipe would limit the intensity of the electron ring. It was also not clear whether the required very small beam size could be achieved. The results of this study are important contributions to the design of both the electron and positron damping rings in which all of those challenges are addressed and overcome. Our findings are documented in the ILC Technical Design Report, a document that represents the work of an international collaboration of scientists. Our contributions include design of the beam magnetic optics for the 3 km circumference damping rings, the vacuum system and surface treatments for electron cloud mitigation, the design of the guide field magnets, design of the superconducting damping wigglers, and new detectors for precision measurement of beam properties. Our study informed the specification of the basic design parameters for the damping rings, including alignment tolerances, magnetic field errors, and instrumentation. We developed electron cloud modelling tools and simulations to aid in the interpretation of the measurements that we carried out in the Cornell Electron-positron Storage Ring (CESR). The simulations provide a means for systematic extrapolation of our measurements at CESR to the proposed ILC damping rings, and ultimately to specify how the beam pipes should be fabricated in order to minimize the effects of the electron cloud. 
With the conclusion of this study, the design of the essential components of the damping rings is complete, including the development and characterization (with computer simulations) of the beam optics, specification of techniques for minimizing beam size, design of damping ring instrumentation, R&D into electron cloud suppression methods, tests of long term durability of electron cloud coatings, and design of damping ring vacuum system components.
Image-Based Airborne LiDAR Point Cloud Encoding for 3D Building Model Retrieval
NASA Astrophysics Data System (ADS)
Chen, Yi-Chen; Lin, Chao-Hung
2016-06-01
With the development of Web 2.0 and cyber city modeling, an increasing number of 3D models have become available on web-based model-sharing platforms, with many applications such as navigation, urban planning, and virtual reality. Based on the concept of data reuse, a 3D model retrieval system is proposed to retrieve building models similar to a user-specified query. The basic idea behind this system is to reuse existing 3D building models instead of reconstructing them from point clouds. To retrieve models efficiently, the models in databases are generally encoded compactly using a shape descriptor. However, most geometric descriptors in related works are applied to polygonal models. In this study, the input query of the model retrieval system is a point cloud acquired by Light Detection and Ranging (LiDAR) systems, because of their efficient scene scanning and spatial information collection. Using point clouds with sparse, noisy, and incomplete sampling as input queries is more difficult than using 3D models. Because the building roof is more informative than other parts of an airborne LiDAR point cloud, an image-based approach is proposed to encode both the point clouds from input queries and the 3D models in databases. The main goal of the data encoding is that the models in the database and the input point clouds can be consistently encoded. Firstly, top-view depth images of buildings are generated to represent the geometric surface of a building roof. Secondly, geometric features are extracted from the depth images based on the height, edges, and planes of the building. Finally, descriptors are extracted via spatial histograms and used in the 3D model retrieval system. For data retrieval, the models are retrieved by matching the encoding coefficients of point clouds and building models. In the experiments, a database including about 900,000 3D models collected from the Internet is used for evaluation of data retrieval.
The results of the proposed method show a clear superiority over related methods.
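The encoding idea above, top-view depth images summarized by histograms, can be sketched as follows. The grid size, bin count, and the flat-versus-gabled roof example are illustrative assumptions; the paper's actual descriptors also use edge and plane features, which are not reproduced here.

```python
import numpy as np

def depth_image(points, grid=32):
    """Rasterize a 3-D point cloud into a top-view depth image.
    Each cell stores the maximum height falling into it; empty cells stay 0.
    Names and parameters here are illustrative, not from the paper."""
    xy = points[:, :2]
    lo, hi = xy.min(axis=0), xy.max(axis=0)
    idx = np.clip(((xy - lo) / (hi - lo + 1e-9) * grid).astype(int), 0, grid - 1)
    img = np.zeros((grid, grid))
    for (i, j), z in zip(idx, points[:, 2]):
        img[i, j] = max(img[i, j], z)
    return img

def height_histogram(img, bins=16):
    """A simple descriptor: normalized histogram of the cell heights."""
    h, _ = np.histogram(img, bins=bins, range=(0.0, img.max() + 1e-9))
    return h / h.sum()

# Hypothetical flat roof vs. gabled roof: their descriptors should differ.
rng = np.random.default_rng(1)
flat = np.column_stack([rng.uniform(0, 10, (500, 2)), np.full(500, 5.0)])
xy = rng.uniform(0, 10, (500, 2))
gable = np.column_stack([xy, 5.0 - np.abs(xy[:, 0] - 5.0) * 0.5])
d_flat = height_histogram(depth_image(flat))
d_gable = height_histogram(depth_image(gable))
print(np.linalg.norm(d_flat - d_gable) > 0.1)
```

Retrieval would then rank database models by a distance between such descriptors (e.g. the L2 norm used above).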
Study of the transport parameters of cloud lightning plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Z. S.; Yuan, P.; Zhao, N.
2010-11-15
Three spectra of cloud lightning have been acquired in Tibet (China) using a slitless grating spectrograph. The electrical conductivity, the electron thermal conductivity, and the electron thermal diffusivity of the cloud lightning are calculated, for the first time, by applying the transport theory of air plasma. In addition, we investigate the variation of these parameters (the temperature, the electron density, the electrical conductivity, the electron thermal conductivity, and the electron thermal diffusivity) along one of the cloud lightning channels. The results show that these parameters decrease slightly along the developing direction of the cloud lightning channel. Moreover, they exhibit similar sudden changes at tortuous positions and at branches of the cloud lightning channel.
3D Building Reconstruction by Multiview Images and the Integrated Application with Augmented Reality
NASA Astrophysics Data System (ADS)
Hwang, Jin-Tsong; Chu, Ting-Chen
2016-10-01
This study presents an approach wherein photographs with a high degree of overlap are captured using a digital camera and used to generate three-dimensional (3D) point clouds via feature point extraction and matching. To reconstruct a building model, an unmanned aerial vehicle (UAV) is used to capture photographs from vertical shooting angles above the building. Multiview images are taken from the ground to eliminate the shielding effect on UAV images caused by trees. Point clouds from the UAV and multiview images are generated via Pix4Dmapper. By merging the two sets of point clouds via tie points, the complete building model is reconstructed. The 3D models are reconstructed using AutoCAD 2016 to generate vectors from the point clouds; SketchUp Make 2016 is used to rebuild a complete building model with textures. To apply 3D building models in urban planning and design, a modern approach is to rebuild the digital models; however, replacing the landscape design and building distribution in real time is difficult as the frequency of building replacement increases. One potential solution to these problems is augmented reality (AR). Using Unity3D and Vuforia to design and implement a smartphone application service, a markerless AR presentation of the building model can be built. This study is aimed at providing technical and design skills related to urban planning, urban design, and building information retrieval using AR.
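The tie-point merging step can be illustrated with a least-squares rigid alignment (Kabsch/Procrustes): given corresponding tie points in the UAV and ground clouds, solve for the rotation and translation that best superimposes them. This is a hypothetical sketch of the idea only; the actual Pix4Dmapper workflow also estimates scale and refines the solution photogrammetrically.

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) mapping tie points `src` onto `dst`
    (Kabsch/Procrustes): center both sets, take the SVD of the covariance,
    and correct the sign so R is a proper rotation (no reflection)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic check: recover a known rotation about z and a translation.
rng = np.random.default_rng(2)
pts = rng.uniform(-1, 1, (6, 3))                 # hypothetical tie points
ang = 0.3
R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([2.0, -1.0, 0.5])
R, t = rigid_align(pts, pts @ R_true.T + t_true)
print(np.allclose(R, R_true), np.allclose(t, t_true))
```

Once (R, t) is known, the entire ground point cloud is transformed into the UAV cloud's frame and the two sets are concatenated.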
Beam tests of beampipe coatings for electron cloud mitigation in Fermilab Main Injector
Backfish, Michael; Eldred, Jeffrey; Tan, Cheng Yang; ...
2015-10-26
Electron cloud beam instabilities are an important consideration in virtually all high-energy particle accelerators and could pose a formidable challenge to forthcoming high-intensity accelerator upgrades. Dedicated tests have shown beampipe coatings dramatically reduce the density of electron cloud in particle accelerators. In this work, we evaluate the performance of titanium nitride, amorphous carbon, and diamond-like carbon as beampipe coatings for the mitigation of electron cloud in the Fermilab Main Injector. Altogether our tests represent 2700 ampere-hours of proton operation spanning five years. Three electron cloud detectors, retarding field analyzers, are installed in a straight section and allow a direct comparison between the electron flux in the coated and uncoated stainless steel beampipe. We characterize the electron flux as a function of intensity up to a maximum of 50 trillion protons per cycle. Each beampipe material conditions in response to electron bombardment from the electron cloud, and we track the changes in these materials as a function of time and the number of absorbed electrons. Contamination from an unexpected vacuum leak revealed a potential vulnerability in the amorphous carbon beampipe coating. We measure the energy spectrum of electrons incident on the stainless steel, titanium nitride and amorphous carbon beampipes. We find the electron cloud signal is highly sensitive to stray magnetic fields and bunch length over the Main Injector ramp cycle. In conclusion, we conduct a complete survey of the stray magnetic fields at the test station and compare the electron cloud signal to that in a field-free region.
Patient-Centered e-Health Record over the Cloud.
Koumaditis, Konstantinos; Themistocleous, Marinos; Vassilacopoulos, George; Prentza, Andrianna; Kyriazis, Dimosthenis; Malamateniou, Flora; Maglaveras, Nicos; Chouvarda, Ioanna; Mourouzis, Alexandros
2014-01-01
The purpose of this paper is to introduce the Patient-Centered e-Health (PCEH) conceptual aspects alongside a multidisciplinary project that combines state-of-the-art technologies like cloud computing. By combining several aspects of PCEH, such as (a) an electronic Personal Healthcare Record (e-PHR), (b) homecare telemedicine technologies, and (c) e-prescribing, e-referral, and e-learning, with advanced technologies like cloud computing and Service Oriented Architecture (SOA), the project will lead to an innovative integrated e-health platform with many benefits for society, the economy, the industry, and the research community. To achieve this, a consortium of experts, both from industry (two companies, one hospital and one healthcare organization) and academia (three universities), was set up to investigate, analyse, design, build and test the new platform. This paper provides insights into the PCEH concept and the current stage of the project. In doing so, we aim to increase awareness of this important endeavor and to share the lessons learned so far throughout our work.
Building Facade Modeling Under Line Feature Constraint Based on Close-Range Images
NASA Astrophysics Data System (ADS)
Liang, Y.; Sheng, Y. H.
2018-04-01
To solve existing problems in modeling building facades merely with point features from close-range images, a new method for modeling building facades under line feature constraints is proposed in this paper. Firstly, camera parameters and sparse spatial point clouds were recovered using SfM, and 3D dense point clouds were generated with MVS. Secondly, line features were detected based on gradient direction, fitted considering their directions and lengths, then matched under multiple types of constraints and extracted from the multi-image sequence. Finally, the facade mesh of a building was triangulated from the point cloud and the line features. The experiment shows that this method can effectively reconstruct the geometric facade of buildings by combining the point and line features of the close-range image sequence, and is especially effective in restoring the contour information of building facades.
Beam Tests of Diamond-Like Carbon Coating for Mitigation of Electron Cloud
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eldred, Jeffrey; Backfish, Michael; Kato, Shigeki
Electron cloud beam instabilities are an important consideration in virtually all high-energy particle accelerators and could pose a formidable challenge to forthcoming high-intensity accelerator upgrades. Our results evaluate the efficacy of a diamond-like carbon (DLC) coating for the mitigation of electron cloud in the Fermilab Main Injector. The interior surface of the beampipe conditions in response to electron bombardment from the electron cloud, and we track the change in electron cloud flux over time in the DLC-coated beampipe and the uncoated stainless steel beampipe. The electron flux is measured by retarding field analyzers placed in a field-free region of the Main Injector. We find the DLC coating reduces the electron cloud signal to roughly 2% of that measured in the uncoated stainless steel beampipe.
NASA Astrophysics Data System (ADS)
Wang, Jinxia; Dou, Aixia; Wang, Xiaoqing; Huang, Shusong; Yuan, Xiaoxiang
2016-11-01
Compared to remote sensing imagery, post-earthquake airborne Light Detection And Ranging (LiDAR) point cloud data contain high-precision three-dimensional information on earthquake damage, which can improve the accuracy of identifying destroyed buildings. After an earthquake, however, damaged buildings exhibit so many different characteristics that the most commonly used pre-processing methods cannot distinguish between tree points and damaged-building points. In this study, we analyse the number of returns per pulse for tree and damaged-building point clouds and explore methods to distinguish between them. We propose a new method that searches a neighbourhood of a certain size and computes the ratio (R) of neighbourhood points whose number of returns per pulse is greater than 1, in order to separate trees from buildings. We selected point clouds of typical undamaged buildings, collapsed buildings, and trees as samples from airborne LiDAR data acquired after the 2010 MW 7.0 Haiti earthquake, by means of human-computer interaction. We determined the R-value threshold that distinguishes between trees and buildings and applied it to test areas. The experimental results show that the proposed method can effectively distinguish building (undamaged and damaged) points from tree points, but it is limited in areas with varied buildings, complex damage, and dense trees, so further improvement is necessary.
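The proposed ratio R can be sketched as follows: for each point, count what fraction of its nearest neighbours come from pulses with more than one return (vegetation is semi-transparent to the pulse, roofs are not). The synthetic "roof" and "tree" clusters, the neighbourhood size k, and the 0.5 threshold are illustrative assumptions, not values from the study.

```python
import numpy as np

def multi_return_ratio(xyz, num_returns, k=10):
    """For each point, the fraction R of its k nearest neighbours whose
    LiDAR pulse produced more than one return. High R suggests vegetation;
    low R suggests roof surfaces. Brute-force neighbour search for clarity;
    a KD-tree would be used in practice."""
    multi = (num_returns > 1).astype(float)
    d2 = ((xyz[:, None, :] - xyz[None, :, :]) ** 2).sum(-1)
    nn = np.argsort(d2, axis=1)[:, 1:k + 1]      # skip the point itself
    return multi[nn].mean(axis=1)

# Hypothetical scene: a flat roof patch (single returns) next to a tree
# (mostly multiple returns). A threshold on R separates the two clusters.
rng = np.random.default_rng(3)
roof = np.column_stack([rng.uniform(0, 5, (60, 2)), np.full(60, 8.0)])
tree = np.column_stack([rng.uniform(10, 15, (60, 2)), rng.uniform(2, 9, 60)])
xyz = np.vstack([roof, tree])
nret = np.concatenate([np.ones(60, int), rng.integers(2, 4, 60)])
R = multi_return_ratio(xyz, nret)
print(R[:60].max() < 0.5 < R[60:].min())   # clusters are well separated here
```

In real data the two distributions of R overlap, which is why the study calibrates the threshold on hand-labelled samples.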
Holtzapple, R. L.; Billing, M. G.; Campbell, R. C.; ...
2016-04-11
Electron cloud related emittance dilution and instabilities of bunch trains limit the performance of high intensity circular colliders. One of the key goals of the Cornell electron-positron storage ring Test Accelerator (CesrTA) research program is to improve our understanding of how the electron cloud alters the dynamics of bunches within the train. Single bunch beam diagnostics have been developed to measure the beam spectra and vertical beam size, two important dynamical signatures of beams interacting with the electron cloud, for bunch trains on a turn-by-turn basis. Experiments have been performed at CesrTA to probe the interaction of the electron cloud with stored positron bunch trains. The purpose of these experiments was to characterize the dependence of beam-electron cloud interactions on machine parameters such as bunch spacing, vertical chromaticity, and bunch current. The beam dynamics of the stored beam, in the presence of the electron cloud, was quantified using: 1) a gated beam position monitor (BPM) and spectrum analyzer to measure the bunch-by-bunch frequency spectrum of the bunch trains, and 2) an x-ray beam size monitor to record the bunch-by-bunch, turn-by-turn vertical size of each bunch within the trains. In this study we report on the observations from these experiments and analyze the effects of the electron cloud on the stability of bunches in a train under many different operational conditions.
NASA Astrophysics Data System (ADS)
Holtzapple, R. L.; Billing, M. G.; Campbell, R. C.; Dugan, G. F.; Flanagan, J.; McArdle, K. E.; Miller, M. I.; Palmer, M. A.; Ramirez, G. A.; Sonnad, K. G.; Totten, M. M.; Tucker, S. L.; Williams, H. A.
2016-04-01
Electron cloud related emittance dilution and instabilities of bunch trains limit the performance of high intensity circular colliders. One of the key goals of the Cornell electron-positron storage ring Test Accelerator (CesrTA) research program is to improve our understanding of how the electron cloud alters the dynamics of bunches within the train. Single bunch beam diagnostics have been developed to measure the beam spectra and vertical beam size, two important dynamical signatures of beams interacting with the electron cloud, for bunch trains on a turn-by-turn basis. Experiments have been performed at CesrTA to probe the interaction of the electron cloud with stored positron bunch trains. The purpose of these experiments was to characterize the dependence of beam-electron cloud interactions on machine parameters such as bunch spacing, vertical chromaticity, and bunch current. The beam dynamics of the stored beam, in the presence of the electron cloud, was quantified using: 1) a gated beam position monitor (BPM) and spectrum analyzer to measure the bunch-by-bunch frequency spectrum of the bunch trains; 2) an x-ray beam size monitor to record the bunch-by-bunch, turn-by-turn vertical size of each bunch within the trains. In this paper we report on the observations from these experiments and analyze the effects of the electron cloud on the stability of bunches in a train under many different operational conditions.
Electron Cloud Measurements in Fermilab Main Injector and Recycler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eldred, Jeffrey Scott; Backfish, M.; Tan, C. Y.
This conference paper presents a series of electron cloud measurements in the Fermilab Main Injector and Recycler. A new instability was observed in the Recycler in July 2014 that generates a fast transverse excitation in the first high intensity batch to be injected. Microwave measurements of electron cloud in the Recycler show a corresponding dependence on the batch injection pattern. These electron cloud measurements are compared to those made with a retarding field analyzer (RFA) installed in a field-free region of the Recycler in November. RFAs are also used in the Main Injector to evaluate the performance of beampipe coatings for the mitigation of electron cloud. Contamination from an unexpected vacuum leak revealed a potential vulnerability in the amorphous carbon beampipe coating. The diamond-like carbon coating, in contrast, reduced the electron cloud signal to 1% of that measured in uncoated stainless steel beampipe.
Analysis of 3D Building Models Accuracy Based on the Airborne Laser Scanning Point Clouds
NASA Astrophysics Data System (ADS)
Ostrowski, W.; Pilarska, M.; Charyton, J.; Bakuła, K.
2018-05-01
Creating 3D building models on a large scale is becoming more popular and finds many applications. Nowadays, the broad term "3D building models" can be applied to several types of products: the well-known CityGML solid models (available at a few Levels of Detail), which are mainly generated from Airborne Laser Scanning (ALS) data, as well as 3D mesh models that can be created from both nadir and oblique aerial images. City authorities and national mapping agencies are interested in obtaining 3D building models. Apart from the completeness of the models, the accuracy aspect is also important. The final accuracy of a building model depends on various factors (accuracy of the source data, complexity of the roof shapes, etc.). In this paper a methodology for the inspection of datasets containing 3D models is presented. The proposed approach checks every building in the dataset against ALS point clouds, testing both accuracy and level of detail. Analysis of the statistical parameters of normal heights between the reference point cloud and the tested planes, combined with segmentation of the point cloud, provides a tool that can indicate which buildings and which roof planes do not fulfill the requirements of model accuracy and detail correctness. The proposed method was tested on two datasets: a solid model and a mesh model.
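The per-roof-plane check described above can be sketched as computing point-to-plane deviations of the ALS points and binning the resulting RMSE into accuracy bands. The band edges below are illustrative placeholders, not the actual accuracy thresholds used in the paper or in the USIBD LOA specification.

```python
import numpy as np

def plane_deviations(points, normal, d):
    """Signed distances of points to the plane n.x + d = 0, with |n| = 1."""
    return points @ normal + d

def loa_band(rmse, edges=(0.01, 0.05, 0.15)):
    """Index of the accuracy band the RMSE falls into (0 = tightest).
    The edge values (in metres) are hypothetical placeholders."""
    return int(np.searchsorted(edges, rmse))

# Hypothetical horizontal roof plane z = 10 m, scanned with ~3 cm noise.
rng = np.random.default_rng(4)
n = np.array([0.0, 0.0, 1.0])
pts = np.column_stack([rng.uniform(0, 8, (200, 2)),
                       10.0 + rng.normal(0, 0.03, 200)])
dev = plane_deviations(pts, n, -10.0)
rmse = float(np.sqrt((dev ** 2).mean()))
print(loa_band(rmse))
```

A full implementation would first segment the cloud per roof plane and exclude occluded zones, as the paper describes, before computing the statistics.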
High fidelity 3-dimensional models of beam-electron cloud interactions in circular accelerators
NASA Astrophysics Data System (ADS)
Feiz Zarrin Ghalam, Ali
Electron cloud is a low-density electron population created inside the vacuum chamber of circular machines operating with positively charged beams. The electron cloud limits the peak current of the beam and degrades beam quality through luminosity degradation, emittance growth, and head-tail or bunch-to-bunch instability. The adverse effects of electron cloud on long-term beam dynamics become more and more important as beams go to higher and higher energies. This problem has become a major concern in the design of many future circular machines, such as the Large Hadron Collider (LHC) under construction at the European Center for Nuclear Research (CERN). Due to the importance of the problem, several simulation models have been developed to model the long-term beam-electron cloud interaction. These models are based on the "single-kick approximation", in which the electron cloud is assumed to be concentrated in one thin slab around the ring. While this model is computationally efficient, it does not reflect the real physical situation, because the forces from the electron cloud on the beam are non-linear, contrary to the model's assumption. To address this limitation of existing codes, a new model is developed in this thesis to model the beam-electron cloud interaction continuously. The code is derived from a 3-D parallel Particle-In-Cell (PIC) model (QuickPIC) originally used for plasma wakefield acceleration research. To adapt the original model to the circular-machine environment, the betatron and synchrotron equations of motion have been added to the code; the effects of chromaticity and lattice structure have also been included. QuickPIC is then benchmarked against one of the codes based on the single-kick approximation (HEAD-TAIL) for the transverse spot size of the beam in the CERN LHC. The growth predicted by QuickPIC is less than that predicted by HEAD-TAIL.
The code is then used to investigate the effect of electron cloud image charges on the long-term beam dynamics, particularly on the transverse tune shift of the beam at CERN Super Proton Synchrotron (SPS) ring. The force from the electron cloud image charges on the beam cancels the force due to cloud compression formed on the beam axis and therefore the tune shift is mainly due to the uniform electron cloud density. (Abstract shortened by UMI.)
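The "single-kick approximation" discussed above can be sketched in a few lines: a linear one-turn betatron rotation followed by one thin cloud kick. The kick is linearized here purely for illustration (the thesis's point is precisely that the real force is non-linear), and the tune and kick strength are hypothetical values, not SPS or LHC parameters.

```python
import numpy as np

def one_turn(x, xp, tune, kick_strength):
    """Single-kick model: rotate (x, x') through betatron phase 2*pi*tune,
    then apply one thin, linearized electron-cloud kick."""
    mu = 2 * np.pi * tune
    x, xp = x * np.cos(mu) + xp * np.sin(mu), -x * np.sin(mu) + xp * np.cos(mu)
    return x, xp + kick_strength * x

# A linear kick shifts the betatron tune, measurable from the one-turn matrix
# M (kick times rotation) through cos(2*pi*Q) = Tr(M) / 2.
Q0, k = 0.31, 0.02    # hypothetical bare tune and integrated kick strength
mu0 = 2 * np.pi * Q0
M = np.array([[1.0, 0.0], [k, 1.0]]) @ np.array([[np.cos(mu0), np.sin(mu0)],
                                                 [-np.sin(mu0), np.cos(mu0)]])
Q = np.arccos(np.trace(M) / 2) / (2 * np.pi)
print(abs(Q - Q0) > 1e-4)   # the cloud kick produces a measurable tune shift
```

A continuous model such as QuickPIC replaces the single thin kick with many small, self-consistently computed kicks distributed around the ring.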
Consolidation of cloud computing in ATLAS
NASA Astrophysics Data System (ADS)
Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration
2017-10-01
Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.
Scaling predictive modeling in drug development with cloud computing.
Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola
2015-01-26
Growing data sets and the increasing time needed for their analysis are hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computing clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investment makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.
Large ionospheric disturbances produced by the HAARP HF facility
NASA Astrophysics Data System (ADS)
Bernhardt, Paul A.; Siefring, Carl L.; Briczinski, Stanley J.; McCarrick, Mike; Michell, Robert G.
2016-07-01
The enormous transmitter power, fully programmable antenna array, and agile frequency generation of the High Frequency Active Auroral Research Program (HAARP) facility in Alaska have allowed the production of unprecedented disturbances in the ionosphere. Using both pencil beams and conical (or twisted) beam transmissions, artificial ionization clouds have been generated near the second, third, fourth, and sixth harmonics of the electron gyrofrequency. The conical beam has been used to sustain these clouds for up to 5 h as opposed to less than 30 min durations produced using pencil beams. The largest density plasma clouds have been produced at the highest harmonic transmissions. Satellite radio transmissions at 253 MHz from the National Research Laboratory TACSat4 communications experiment have been severely disturbed by propagating through artificial plasma regions. The scintillation levels for UHF waves passing through artificial ionization clouds from HAARP are typically 16 dB. This is much larger than previously reported scintillations at other HF facilities which have been limited to 3 dB or less. The goals of future HAARP experiments should be to build on these discoveries to sustain plasma densities larger than that of the background ionosphere for use as ionospheric reflectors of radio signals.
National electronic medical records integration on cloud computing system.
Mirza, Hebah; El-Masri, Samir
2013-01-01
Few healthcare providers have achieved an advanced level of Electronic Medical Record (EMR) adoption; others are at a low level, and most have no EMR at all. Cloud computing is an emerging technology that has been used in other industries with great success. Despite its attractive features, cloud computing has not yet been widely utilized in the healthcare industry. This study presents an innovative healthcare cloud computing system for integrating Electronic Health Records (EHR). The proposed cloud system applies cloud computing technology to EHR systems to present a comprehensive, integrated EHR environment.
Building Reflection with Word Clouds for Online RN to BSN Students.
Volkert, Delene R
Reflection allows students to integrate learning with their personal context, developing deeper knowledge and promoting critical thinking. Word clouds help students develop themes/concepts beyond traditional methods, introducing visual aspects to an online learning environment. Students created word clouds and captions, then responded to those created by peers for a weekly discussion assignment. Students indicated overwhelming support for the use of word clouds to develop deeper understanding of the subject matter. This reflection assignment could be utilized in asynchronous, online undergraduate nursing courses for creative methods of building reflection and developing knowledge for the undergraduate RN to BSN student.
Fast Transverse Instability and Electron Cloud Measurements in Fermilab Recycler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eldred, Jeffery; Adamson, Philip; Capista, David
2015-03-01
A new transverse instability is observed that may limit the proton intensity in the Fermilab Recycler. The instability is fast, leading to a beam-abort loss within two hundred turns. The instability primarily affects the first high-intensity batch from the Fermilab Booster in each Recycler cycle. This paper analyzes the dynamical features of the destabilized beam. The instability excites a horizontal betatron oscillation which couples into the vertical motion and also causes transverse emittance growth. This paper describes the feasibility of electron cloud as the mechanism for this instability and presents the first measurements of the electron cloud in the Fermilab Recycler. Direct measurements of the electron cloud are made using a retarding field analyzer (RFA) newly installed in the Fermilab Recycler. Indirect measurements of the electron cloud are made by propagating a microwave carrier signal through the beampipe and analyzing the phase modulation of the signal. The maximum betatron amplitude growth and the maximum electron cloud signal occur during minimums of the bunch length oscillation.
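The microwave technique mentioned above infers cloud density from the phase shift a carrier accumulates while propagating through the electron plasma. A rough sketch under the standard high-frequency approximation (carrier frequency far above the plasma frequency) follows; the carrier frequency, path length, and density are illustrative numbers, not measured Recycler values.

```python
import numpy as np

# For a carrier at angular frequency omega traversing length L of plasma with
# plasma frequency omega_p << omega, the accumulated phase shift is roughly
#   dphi = omega_p**2 * L / (2 * c * omega),  omega_p**2 = n_e e**2 / (eps0 m_e)
# so a measured phase modulation can be inverted for the density n_e.
e, m_e, eps0, c = 1.602e-19, 9.109e-31, 8.854e-12, 2.998e8

def phase_from_density(n_e, f, L):
    omega = 2 * np.pi * f
    omega_p2 = n_e * e**2 / (eps0 * m_e)
    return omega_p2 * L / (2 * c * omega)

def density_from_phase(dphi, f, L):
    omega = 2 * np.pi * f
    return 2 * c * omega * dphi * eps0 * m_e / (e**2 * L)

n_e = 1e12            # electrons per m^3 (hypothetical cloud density)
f, L = 2.0e9, 30.0    # 2 GHz carrier over a 30 m section (hypothetical)
dphi = phase_from_density(n_e, f, L)
print(np.isclose(density_from_phase(dphi, f, L), n_e))
```

In practice the phase modulation appears as sidebands on the carrier at the beam revolution harmonics, and the sideband amplitude is what is actually read off a spectrum analyzer.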
NASA Astrophysics Data System (ADS)
Bonduel, M.; Bassier, M.; Vergauwen, M.; Pauwels, P.; Klein, R.
2017-11-01
The use of Building Information Modeling (BIM) for existing buildings based on point clouds is increasing. Standardized geometric quality assessment of the BIMs is needed to make them more reliable and thus reusable for future users. First, available literature on the subject is studied. Next, an initial proposal for a standardized geometric quality assessment is presented. Finally, this method is tested and evaluated with a case study. The number of specifications on BIM relating to existing buildings is limited. The Levels of Accuracy (LOA) specification of the USIBD provides definitions and suggestions regarding geometric model accuracy, but lacks a standardized assessment method. A deviation analysis is found to be dependent on (1) the used mathematical model, (2) the density of the point clouds and (3) the order of comparison. Results of the analysis can be graphical and numerical. An analysis on macro (building) and micro (BIM object) scale is necessary. On macro scale, the complete model is compared to the original point cloud and vice versa to get an overview of the general model quality. The graphical results show occluded zones and non-modeled objects respectively. Colored point clouds are derived from this analysis and integrated in the BIM. On micro scale, the relevant surface parts are extracted per BIM object and compared to the complete point cloud. Occluded zones are extracted based on a maximum deviation. What remains is classified according to the LOA specification. The numerical results are integrated in the BIM with the use of object parameters.
NASA Astrophysics Data System (ADS)
Razumnikov, S.; Kurmanbay, A.
2016-04-01
The present paper suggests a system approach to evaluating the effectiveness of and risks resulting from the integration of cloud-based services in a machine-building enterprise. This approach makes it possible to assess the set of enterprise IT applications and choose the applications to be migrated to the cloud with regard to specific business requirements, the technological strategy and the willingness to accept risk.
NASA Astrophysics Data System (ADS)
Tuttas, S.; Braun, A.; Borrmann, A.; Stilla, U.
2014-08-01
For construction progress monitoring, the planned state of the construction at a certain time (as-planned) has to be compared to the actual state (as-built). The as-planned state is derived from a building information model (BIM), which contains the geometry of the building and the construction schedule. In this paper we introduce an approach for the generation of an as-built point cloud by photogrammetry. Since images on a construction site cannot be taken from every position that would seem necessary, we use a combination of a structure-from-motion process and control points to create a scaled point cloud in a consistent coordinate system. Subsequently, this point cloud is used for an as-built versus as-planned comparison. For that purpose, voxels of an octree are marked as occupied, free, or unknown by raycasting based on the triangulated points and the camera positions. This allows non-existing building parts to be identified. To verify the existence of building parts, a second test based on the points in front of and behind the as-planned model planes is performed. The proposed procedure is tested on an inner-city construction site under real conditions.
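As a rough illustration of the voxel labelling step (occupied, free, or unknown by raycasting), the following sketch uses a plain regular grid in place of the octree; the grid size, camera position, and the single sample point are assumptions for illustration only:

```python
# Hedged sketch of voxel labelling by raycasting: voxels crossed by a
# camera-to-point ray are marked free, the voxel containing the
# triangulated point is marked occupied, everything else stays unknown.
import numpy as np

def label_voxels(points, camera, shape=(20, 20, 20), voxel=1.0):
    # 0 = unknown, 1 = free, 2 = occupied
    grid = np.zeros(shape, dtype=np.int8)
    for p in points:
        direction = p - camera
        dist = np.linalg.norm(direction)
        # sample the ray densely enough to visit every crossed voxel
        for t in np.arange(0.0, dist, voxel / 2.0):
            idx = tuple(((camera + t * direction / dist) // voxel).astype(int))
            if all(0 <= i < s for i, s in zip(idx, shape)):
                grid[idx] = 1          # ray passes through: free space
        end = tuple((p // voxel).astype(int))
        if all(0 <= i < s for i, s in zip(end, shape)):
            grid[end] = 2              # triangulated point: occupied
    return grid

camera = np.array([0.0, 10.0, 10.0])
points = np.array([[15.0, 10.0, 10.0]])   # one wall point seen by the camera
grid = label_voxels(points, camera)
# an as-planned element at voxel (5, 10, 10) lies on the ray, hence 'free',
# which would flag it as not (yet) existing
print(grid[5, 10, 10], grid[15, 10, 10])  # 1 2
```

A planned building part whose voxels end up labelled free can then be reported as missing, which is the comparison the abstract describes.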
Beam induced electron cloud resonances in dipole magnetic fields
Calvey, J. R.; Hartung, W.; Makita, J.; ...
2016-07-01
The buildup of low energy electrons in an accelerator, known as electron cloud, can be severely detrimental to machine performance. Under certain beam conditions, the beam can become resonant with the cloud dynamics, accelerating the buildup of electrons. This paper will examine two such effects: multipacting resonances, in which the cloud development time is resonant with the bunch spacing, and cyclotron resonances, in which the cyclotron period of electrons in a magnetic field is a multiple of the bunch spacing. Both resonances have been studied directly in dipole fields using retarding field analyzers installed in the Cornell Electron Storage Ring. These measurements are supported by both analytical models and computer simulations.
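The cyclotron-resonance condition summarized above, a cyclotron period equal to an integer multiple of the bunch spacing, lends itself to a quick numerical check. In the sketch below the 14 ns bunch spacing is an assumed illustrative value, not a figure quoted in the abstract:

```python
# Back-of-envelope check of the cyclotron resonance: the electron
# cyclotron period T_c = 2*pi*m_e / (e*B) equals n times the bunch
# spacing t_b, so the resonant field is B = 2*pi*m_e / (e * n * t_b).
import math

M_E = 9.109e-31   # electron mass, kg
Q_E = 1.602e-19   # elementary charge, C

def resonant_field(t_bunch, n=1):
    """Dipole field (tesla) at which T_c = n * t_bunch."""
    return 2.0 * math.pi * M_E / (Q_E * n * t_bunch)

t_b = 14e-9  # assumed bunch spacing, seconds
for n in (1, 2, 3):
    b = resonant_field(t_b, n)
    print(f"n={n}: B = {b * 1e4:.1f} G")
```

For nanosecond-scale bunch spacings this lands at fields of a few tens of gauss, i.e. weak dipole fields, which is where such resonances become observable.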
IBM Cloud Computing Powering a Smarter Planet
NASA Astrophysics Data System (ADS)
Zhu, Jinzy; Fang, Xing; Guo, Zhe; Niu, Meng Hua; Cao, Fan; Yue, Shuang; Liu, Qin Yu
With the increasing need for intelligent systems supporting the world's businesses, Cloud Computing has emerged as a dominant trend to provide a dynamic infrastructure to make such intelligence possible. This article introduces how to build a smarter planet with cloud computing technology. First, it explains why we need the cloud and traces the evolution of cloud technology. Second, it analyzes the value of cloud computing and how to apply cloud technology. Finally, it predicts the future of the cloud in the smarter planet.
Indoor Modelling from Slam-Based Laser Scanner: Door Detection to Envelope Reconstruction
NASA Astrophysics Data System (ADS)
Díaz-Vilariño, L.; Verbree, E.; Zlatanova, S.; Diakité, A.
2017-09-01
Updated and detailed indoor models are increasingly demanded for applications such as emergency management or navigational assistance. The consolidation of new portable and mobile acquisition systems has led to a higher availability of 3D point cloud data from indoors. In this work, we explore the combined use of point clouds and trajectories from a SLAM-based laser scanner to automate the reconstruction of building indoors. The methodology starts with door detection, since doors represent transitions from one indoor space to another, which provides an initial indication of the global partitioning of the point cloud into building rooms. For this purpose, the trajectory is used to create a vertical point cloud profile in which doors are detected as local minima of vertical distance. As the point cloud and trajectory are related by time stamp, this feature is used to subdivide the point cloud into subspaces according to the location of the doors. The correspondence between subspaces and building rooms is not unambiguous: one subspace always corresponds to one room, but one room is not necessarily depicted by just one subspace, for example in the case of a room containing several doors in which the acquisition is performed in a discontinuous way. The labelling problem is formulated as a combinatorial approach solved as a minimum energy optimization. Once the point cloud is subdivided into building rooms, the envelope (formed by walls, ceilings and floors) is reconstructed for each space. The connectivity between spaces is included by adding the previously detected doors to the reconstructed model. The methodology is tested in a real case study.
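The door-detection idea, local minima in a vertical point cloud profile along the trajectory, can be sketched minimally as follows; the clearance profile and the 2.1 m threshold are illustrative assumptions, not values from the paper:

```python
# Minimal sketch of door detection along a scanner trajectory: compute
# the vertical clearance at each trajectory sample and flag local minima
# below a door-height threshold as door crossings.

def detect_doors(clearance, threshold=2.1):
    """Return indices where clearance is a local minimum below threshold."""
    doors = []
    for i in range(1, len(clearance) - 1):
        if (clearance[i] < threshold
                and clearance[i] <= clearance[i - 1]
                and clearance[i] <= clearance[i + 1]):
            doors.append(i)
    return doors

# synthetic profile: ~2.8 m rooms with dips where doors are passed
profile = [2.8, 2.8, 2.7, 2.0, 2.7, 2.8, 2.8, 1.9, 2.8]
print(detect_doors(profile))  # [3, 7]
```

The detected indices, tied back to time stamps, are what would drive the subdivision of the point cloud into subspaces.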
Tran, Thi Huong Giang; Ressl, Camillo; Pfeifer, Norbert
2018-02-03
This paper suggests a new approach for change detection (CD) in 3D point clouds. It combines classification and CD in one step using machine learning. The point cloud data of both epochs are merged for computing features of four types: features describing the point distribution, a feature relating to relative terrain elevation, features specific for the multi-target capability of laser scanning, and features combining the point clouds of both epochs to identify the change. All these features are merged in the points and then training samples are acquired to create the model for supervised classification, which is then applied to the whole study area. The final results reach an overall accuracy of over 90% for both epochs of eight classes: lost tree, new tree, lost building, new building, changed ground, unchanged building, unchanged tree, and unchanged ground.
A simple map-based localization strategy using range measurements
NASA Astrophysics Data System (ADS)
Moore, Kevin L.; Kutiyanawala, Aliasgar; Chandrasekharan, Madhumita
2005-05-01
In this paper we present a map-based approach to localization. We consider indoor navigation in known environments based on the idea of a "vector cloud" by observing that any point in a building has an associated vector defining its distance to the key structural components (e.g., walls, ceilings, etc.) of the building in any direction. Given a building blueprint we can derive the "ideal" vector cloud at any point in space. Then, given measurements from sensors on the robot we can compare the measured vector cloud to the possible vector clouds cataloged from the blueprint, thus determining location. We present algorithms for implementing this approach to localization, using the Hamming norm, the 1-norm, and the 2-norm. The effectiveness of the approach is verified by experiments on a 2-D testbed using a mobile robot with a 360° laser range-finder and through simulation analysis of robustness.
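A minimal sketch of the matching step follows, assuming a small hand-made catalogue of "ideal" range vectors and a measured vector; the catalogue values and the four-direction layout are illustrative assumptions:

```python
# Hedged sketch of vector-cloud localization: each catalogued location
# stores an ideal range vector (distances to structure in fixed
# directions); the measured vector is matched by minimizing the Hamming
# norm, 1-norm, or 2-norm of the difference, as in the abstract.
import math

def localize(measured, catalogue, norm="l2"):
    def cost(ideal):
        diff = [m - i for m, i in zip(measured, ideal)]
        if norm == "hamming":          # count of mismatched ranges
            return sum(d != 0 for d in diff)
        if norm == "l1":
            return sum(abs(d) for d in diff)
        return math.sqrt(sum(d * d for d in diff))   # l2
    return min(catalogue, key=lambda loc: cost(catalogue[loc]))

catalogue = {
    "hallway": [5.0, 1.0, 5.0, 1.0],   # ranges in 4 directions, metres
    "room_a":  [3.0, 3.0, 3.0, 3.0],
    "corner":  [1.0, 1.0, 6.0, 6.0],
}
measured = [2.9, 3.2, 3.1, 2.8]        # noisy laser ranges
print(localize(measured, catalogue))   # room_a
```

A real 360° range-finder would supply hundreds of directions per vector rather than four, but the argmin-over-catalogue structure is the same.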
Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro
2015-07-28
In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
A Novel Reflector/Reflectarray Antenna: An Enabling Technology for NASA's Dual-Frequency ACE Radar
NASA Technical Reports Server (NTRS)
Racette, Paul E.; Heymsfield, Gerald; Li, Lihua; Cooley, Michael E.; Park, Richard; Stenger, Peter
2011-01-01
This paper describes a novel dual-frequency shared aperture Ka/W-band antenna design that enables wide-swath imaging via electronic scanning at Ka-band and is specifically applicable to NASA's Aerosol, Cloud and Ecosystems (ACE) mission. The innovative antenna design minimizes size and weight via use of a shared aperture and builds upon NASA's investments in large-aperture reflectors and high technology-readiness-level (TRL) W-band radar architectures. The antenna is comprised of a primary cylindrical reflector/reflectarray surface illuminated by a fixed W-band feed and a Ka-band Active Electronically Scanned Array (AESA) line feed. The reflectarray surface provides beam focusing at W-band, but is transparent at Ka-band.
The thinking of Cloud computing in the digital construction of the oil companies
NASA Astrophysics Data System (ADS)
CaoLei, Qizhilin; Dengsheng, Lei
In order to speed up the digital construction of oil companies and enhance their productivity and decision-support capabilities, while avoiding the waste caused by the original process of digital construction and the duplication of development and investment, this paper presents a cloud-based model for the digital construction of oil companies: national oil companies integrate their cloud data and service-center equipment into a whole cloud system over a private network, and each department then prepares its own virtual service center according to its needs, providing strong services and computing power for the oil companies.
Electron-cloud updated simulation results for the PSR, and recent results for the SNS
NASA Astrophysics Data System (ADS)
Pivi, M.; Furman, M. A.
2002-05-01
Recent simulation results for the main features of the electron cloud in the storage ring of the Spallation Neutron Source (SNS) at Oak Ridge, and updated results for the Proton Storage Ring (PSR) at Los Alamos are presented in this paper. A refined model for the secondary emission process including the so called true secondary, rediffused and backscattered electrons has recently been included in the electron-cloud code.
Point clouds segmentation as base for as-built BIM creation
NASA Astrophysics Data System (ADS)
Macher, H.; Landes, T.; Grussenmeyer, P.
2015-08-01
In this paper, a three-step segmentation approach is proposed in order to create 3D models from point clouds acquired by TLS inside buildings. The three scales of segmentation are floors, rooms and the planes composing the rooms. First, floor segmentation is performed based on an analysis of the point distribution along the Z axis. Then, for each floor, room segmentation is achieved considering a slice of the point cloud at ceiling level. Finally, planes are segmented for each room, and planes corresponding to ceilings and floors are identified. Results of each step are analysed and potential improvements are proposed. Based on the segmented point clouds, the creation of as-built BIM is considered in a future work section. Not only is the classification of planes into several categories proposed, but the potential use of point clouds acquired outside buildings is also considered.
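The first step, floor segmentation from the point distribution along the Z axis, might be sketched as a height histogram with peak picking; the bin size, count threshold, and synthetic cloud below are assumptions:

```python
# Sketch of floor/ceiling detection from the Z distribution of a point
# cloud: heights are binned and bins with many returns are taken as
# horizontal structures (slabs); sparse wall returns fall below threshold.
from collections import Counter

def horizontal_peaks(z_values, bin_size=0.2, min_count=4):
    bins = Counter(round(z / bin_size) for z in z_values)
    return [round(b * bin_size, 2) for b, c in sorted(bins.items())
            if c >= min_count]

# synthetic cloud: dense returns at z = 0.0 (floor) and z = 2.8 (ceiling),
# sparse wall returns in between
z = [0.0, 0.01, -0.02, 0.03, 0.0, 1.1, 1.9, 2.8, 2.79, 2.81, 2.8, 2.82]
print(horizontal_peaks(z))  # [0.0, 2.8]
```

Consecutive floor/ceiling pairs found this way delimit the storeys that the subsequent room segmentation operates on.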
ERIC Educational Resources Information Center
Islam, Muhammad Faysal
2013-01-01
Cloud computing offers the advantage of on-demand, reliable and cost efficient computing solutions without the capital investment and management resources to build and maintain in-house data centers and network infrastructures. Scalability of cloud solutions enable consumers to upgrade or downsize their services as needed. In a cloud environment,…
ERIC Educational Resources Information Center
Togawa, Satoshi; Kanenishi, Kazuhide
2014-01-01
In this research, we have built a disaster-recovery framework for e-Learning environments against disasters such as earthquakes, tsunamis, and heavy floods. In particular, our proposed framework is based on private cloud collaboration. We build a prototype system based on IaaS architecture, and this prototype system is constructed by several private…
NASA Astrophysics Data System (ADS)
Bernhardt, Paul A.; Siefring, Carl L.; Briczinski, Stanley J.; Viggiano, Albert; Caton, Ronald G.; Pedersen, Todd R.; Holmes, Jeffrey M.; Ard, Shaun; Shuman, Nicholas; Groves, Keith M.
2017-05-01
Atomic samarium has been injected into the neutral atmosphere for production of electron clouds that modify the ionosphere. These electron clouds may be used as high-frequency radio wave reflectors or for control of the electrodynamics of the F region. A self-consistent model for the photochemical reactions of a samarium vapor cloud released into the upper atmosphere has been developed and compared with the Metal Oxide Space Cloud (MOSC) experimental observations. The release initially produces a dense plasma cloud that is rapidly reduced by dissociative recombination and diffusive expansion. The spectral emissions from the release cover the ultraviolet to the near infrared band with contributions from solar fluorescence of the atomic, molecular, and ionized components of the artificial density cloud. Barium releases in sunlight are more efficient than samarium releases for the production of dense ionization clouds. Samarium may be of interest for nighttime releases, but the artificial electron cloud is limited by recombination with the samarium oxide ion.
Cloud Infrastructure & Applications - CloudIA
NASA Astrophysics Data System (ADS)
Sulistio, Anthony; Reich, Christoph; Doelitzscher, Frank
The idea behind Cloud Computing is to deliver Infrastructure-as-a-Service and Software-as-a-Service over the Internet on an easy pay-per-use business model. To harness the potential of Cloud Computing for e-Learning and research purposes, and for small- and medium-sized enterprises, the Hochschule Furtwangen University establishes a new project, called Cloud Infrastructure & Applications (CloudIA). The CloudIA project is a market-oriented cloud infrastructure that leverages different virtualization technologies by supporting Service-Level Agreements for various service offerings. This paper describes the CloudIA project in detail and mentions our early experiences in building a private cloud using an existing infrastructure.
INDIGO: Building a DataCloud Framework to support Open Science
NASA Astrophysics Data System (ADS)
Chen, Yin; de Lucas, Jesus Marco; Aguilar, Fenando; Fiore, Sandro; Rossi, Massimiliano; Ferrari, Tiziana
2016-04-01
New solutions are required to support Data Intensive Science in the emerging panorama of e-infrastructures, including Grid, Cloud and HPC services. The architecture proposed by the INDIGO-DataCloud (INtegrating Distributed data Infrastructures for Global ExplOitation) (https://www.indigo-datacloud.eu/) H2020 project, provides the path to integrate IaaS resources and PaaS platforms to provide SaaS solutions, while satisfying the requirements posed by different Research Communities, including several in Earth Science. This contribution introduces the INDIGO DataCloud architecture, describes the methodology followed to assure the integration of the requirements from different research communities, including examples like ENES, LifeWatch or EMSO, and how they will build their solutions using different INDIGO components.
Dynamic electronic institutions in agent oriented cloud robotic systems.
Nagrath, Vineet; Morel, Olivier; Malik, Aamir; Saad, Naufal; Meriaudeau, Fabrice
2015-01-01
The dot-com bubble burst in the year 2000, followed by a swift movement towards resource virtualization and the cloud computing business model. Cloud computing emerged not as a new form of computing or network technology but as a mere remoulding of existing technologies to suit a new business model. Cloud robotics is understood as the adaptation of cloud computing ideas for robotic applications. Current efforts in cloud robotics stress developing robots that utilize the computing and service infrastructure of the cloud, without debating the underlying business model. HTM5 is an OMG MDA-based meta-model for agent-oriented development of cloud robotic systems. The trade view of HTM5 promotes peer-to-peer trade amongst software agents. HTM5 agents represent various cloud entities and implement their business logic on cloud interactions. Trade in a peer-to-peer cloud robotic system is based on relationships and contracts amongst several agent subsets. Electronic Institutions are associations of heterogeneous intelligent agents which interact with each other following predefined norms. In Dynamic Electronic Institutions, the process of formation, reformation and dissolution of institutions is automated, leading to run-time adaptations in groups of agents. DEIs in agent-oriented cloud robotic ecosystems bring order and group intellect. This article presents DEI implementations through the HTM5 methodology.
Earth observations taken from orbiter Discovery during STS-91 mission
2016-08-24
STS091-708-077 (2-12 June 1998) -- The cloud shadows grew long as the STS-91 astronauts aboard the Space Shuttle Discovery approached the dark side of the Earth during "sunset" over Poland. The taller building cumulus clouds cast shadows over the lower clouds.
Morphology and ionization of the interstellar cloud surrounding the solar system.
Frisch, P C
1994-09-02
The first encounter between the sun and the surrounding interstellar cloud appears to have occurred 2000 to 8000 years ago. The sun and cloud space motions are nearly perpendicular, an indication that the sun is skimming the cloud surface. The electron density derived for the surrounding cloud from the carbon component of the anomalous cosmic ray population in the solar system and from the interstellar ratio of Mg(+) to Mg degrees toward Sirius support an equilibrium model for cloud ionization (an electron density of 0.22 to 0.44 per cubic centimeter). The upwind magnetic field direction is nearly parallel to the cloud surface. The relative sun-cloud motion indicates that the solar system has a bow shock.
Challenges with Electrical, Electronics, and Electromechanical Parts for James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Jah, Muzar A.; Jeffers, Basil S.
2016-01-01
James Webb Space Telescope (JWST) is the space-based observatory that will extend the knowledge gained by the Hubble Space Telescope (HST). Hubble focuses on optical and ultraviolet wavelengths while JWST focuses on the infrared portion of the electromagnetic spectrum, to see the earliest stars and galaxies that formed in the Universe and to look deep into nearby dust clouds to study the formation of stars and planets. JWST, whose development began in 1996, is scheduled to launch in 2018. It includes a suite of four instruments, the spacecraft bus, the optical telescope element, the Integrated Science Instrument Module (ISIM, the platform that holds the instruments), and a sunshield. The mass of JWST is approximately 6200 kg, including the observatory, on-orbit consumables and launch vehicle adaptor. Many challenges were overcome while providing the electrical and electronic components for the Goddard Space Flight Center hardware builds. Other difficulties encountered included developing components to work at cryogenic temperatures, failures of electronic components during development and flight builds, integration-and-test electronic parts problems, and managing technical issues with international partners. This paper will present the context of JWST from an EEE (electrical, electronic, and electromechanical) perspective with examples of challenges and lessons learned throughout the design, development, and fabrication of JWST in cooperation with our associated partners including the Canadian Space Agency (CSA), the European Space Agency (ESA), Lockheed Martin and their respective associated partners. Technical challenges and lessons learned will be discussed.
The Role of Standards in Cloud-Computing Interoperability
2012-10-01
In a private cloud, services are not shared outside the organization; CloudStack, Eucalyptus, HP, Microsoft, OpenStack, Ubuntu, and VMWare provide tools for building private clouds. OpenStack (http://www.openstack.org) is an independent IT consortium that develops open-source software for running private clouds and currently consists of three core software projects: OpenStack Compute (Nova), OpenStack Object Storage (Swift
Visualization of the Construction of Ancient Roman Buildings in Ostia Using Point Cloud Data
NASA Astrophysics Data System (ADS)
Hori, Y.; Ogawa, T.
2017-02-01
The implementation of laser scanning in the field of archaeology provides us with an entirely new dimension in research and surveying. It allows us to digitally recreate individual objects, or entire cities, using millions of three-dimensional points grouped together in what is referred to as "point clouds". In addition, the visualization of the point cloud data, which can be used in the final report by archaeologists and architects, is usually produced as a JPG or TIFF file. Beyond visualization, the re-examination of older data and new surveys of the construction of Roman buildings, applying remote-sensing technology for precise and detailed measurements, afford new information that may lead to revising drawings of ancient buildings which had been adduced as evidence without any consideration of their degree of accuracy, and can ultimately open new research on ancient buildings. We used laser scanners in the field because of their speed, comprehensive coverage, accuracy and flexibility of data manipulation. Therefore, we "skipped" much of the post-processing and focused on the images created from the metadata, simply aligned using a tool which extends an automatic feature-matching algorithm, and a popular renderer that can provide graphic results.
Plasma Pancakes and Deep Cavities Generated by High Power Radio Waves from the Arecibo Observatory
NASA Astrophysics Data System (ADS)
Bernhardt, P. A.; Briczinski, S. J., Jr.; Zawdie, K.; Huba, J.; Siefring, C. L.; Sulzer, M. P.; Nossa, E.; Aponte, N.; Perillat, P.; Jackson-Booth, N.
2017-12-01
Breakdown of the neutral atmosphere at ionospheric altitudes can be achieved with high power HF waves that reflect on the bottomside of the ionosphere. For overdense heating (i.e., wave frequency < maximum plasma frequency in the F-layer), the largest electric fields in the plasma are found just below the reflection altitude. There, electromagnetic waves are converted into electron plasma (Langmuir) waves and ion acoustic waves. These waves are measured by scattering of the 430 MHz radar at Arecibo to form an enhanced plasma line. The photo-electron excitation of Langmuir waves yields a weaker plasma-line profile that shows the complete electron profile with the radar. Once HF-enhanced Langmuir waves are formed, they can accelerate the photo-electron population to sufficient energies for neutral breakdown and enhanced ionization inside the HF radio beam. Plasma pancakes are produced because the breakdown process continues to build up plasma on the bottom of the breakdown clouds while recombination occurs on the older breakdown plasma at the top of these clouds. Thus, the plasma pancake falls in altitude from the initial HF wave reflection altitude near 250 km to about 160 km, where ion-electron recombination prevents the plasma cloud from being sustained by the high power HF. Experiments in March 2017 produced plasma pancakes with about 100 megawatts effective radiated power at 5.1 MHz with the Arecibo HF Facility. Observations using the 430 MHz radar show falling plasma pancakes that disappear at low altitudes and reform at the F-layer critical reflection altitude. Sometimes the periodic and regular falling motion of the plasma pancakes is influenced by Acoustic Gravity Waves (AGW) propagating through the modified HF region. A rising AGW can cause the plasma pancake to reside at a nearly constant altitude for 10 to 20 minutes. Dense cavities are also produced by high power radio waves interacting with the F-layer. These structures are observed with the Arecibo 430 MHz radar as intense bite-outs in the plasma profile. Multiple cavities are seen simultaneously.
NASA Astrophysics Data System (ADS)
Griesbaum, Luisa; Marx, Sabrina; Höfle, Bernhard
2017-07-01
In recent years, the number of people affected by flooding caused by extreme weather events has increased considerably. In order to provide support in disaster recovery or to develop mitigation plans, accurate flood information is necessary. Particularly pluvial urban floods, characterized by high temporal and spatial variations, are not well documented. This study proposes a new, low-cost approach to determining local flood elevation and inundation depth of buildings based on user-generated flood images. It first applies close-range digital photogrammetry to generate a geo-referenced 3-D point cloud. Second, based on estimated camera orientation parameters, the flood level captured in a single flood image is mapped to the previously derived point cloud. The local flood elevation and the building inundation depth can then be derived automatically from the point cloud. The proposed method is carried out once for each of 66 different flood images showing the same building façade. An overall accuracy of 0.05 m with an uncertainty of ±0.13 m for the derived flood elevation within the area of interest as well as an accuracy of 0.13 m ± 0.10 m for the determined building inundation depth is achieved. Our results demonstrate that the proposed method can provide reliable flood information on a local scale using user-generated flood images as input. The approach can thus allow inundation depth maps to be derived even in complex urban environments with relatively high accuracies.
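The final derivation step, obtaining the inundation depth from the point cloud, reduces to subtracting the local ground elevation at the facade from the mapped flood elevation; all elevations in this sketch are assumed illustrative values:

```python
# Minimal sketch of inundation-depth derivation: once the flood level
# from an image is mapped into the geo-referenced point cloud, depth is
# the flood elevation minus the local ground elevation at the facade,
# here estimated as the lowest of the nearby cloud points.

def inundation_depth(flood_elevation, facade_ground_points):
    """Depth (m) of water against the facade."""
    ground = min(facade_ground_points)
    return flood_elevation - ground

flood_elev = 114.55                       # m above datum, from one image
ground_pts = [114.12, 114.10, 114.15]     # z of cloud points at facade base
print(f"{inundation_depth(flood_elev, ground_pts):.2f} m")  # 0.45 m
```

Averaging the per-image depths over all 66 images, as the study does, would then damp the per-image estimation error.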
Distributed clinical data sharing via dynamic access-control policy transformation.
Rezaeibagha, Fatemeh; Mu, Yi
2016-05-01
Data sharing in electronic health record (EHR) systems is important for improving the quality of healthcare delivery. Data sharing, however, has raised some security and privacy concerns because healthcare data could be potentially accessible by a variety of users, which could lead to privacy exposure of patients. Without addressing this issue, large-scale adoption and sharing of EHR data are impractical. The traditional solution to the problem is via encryption. Although encryption can be applied to access control, it is not applicable for complex EHR systems that require multiple domains (e.g. public and private clouds) with various access requirements. This study was carried out to address the security and privacy issues of EHR data sharing with our novel access-control mechanism, which captures the scenario of the hybrid clouds and need of access-control policy transformation, to provide secure and privacy-preserving data sharing among different healthcare enterprises. We introduce an access-control mechanism with some cryptographic building blocks and present a novel approach for secure EHR data sharing and access-control policy transformation in EHR systems for hybrid clouds. We propose a useful data sharing system for healthcare providers to handle various EHR users who have various access privileges in different cloud environments. A systematic study has been conducted on data sharing in EHR systems to provide a solution to the security and privacy issues. In conclusion, we introduce an access-control method for privacy protection of EHRs and EHR policy transformation that allows an EHR access-control policy to be transformed from a private cloud to a public cloud. This method has never been studied previously in the literature. Furthermore, we provide a protocol to demonstrate policy transformation as an application scenario. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Electron cloud simulations for the main ring of J-PARC
NASA Astrophysics Data System (ADS)
Yee-Rendon, Bruce; Muto, Ryotaro; Ohmi, Kazuhito; Satou, Kenichirou; Tomizawa, Masahito; Toyama, Takeshi
2017-07-01
The simulation of beam instabilities is a helpful tool for evaluating potential threats to machine protection from high intensity beams. At the Main Ring (MR) of J-PARC, signals related to the electron cloud have been observed during the slow beam extraction mode. Hence, several studies were conducted to investigate the mechanism that produces it. The results confirmed a strong dependence of the electron cloud formation on the beam intensity and the bunch structure; however, a precise explanation of its trigger conditions remains incomplete. To shed light on the problem, electron cloud simulations were performed using an updated version of the computational model developed in previous works at KEK. The code employed the signals of the measurements to reproduce the events seen during the surveys.
Dorninger, Peter; Pfeifer, Norbert
2008-01-01
Three-dimensional city models are necessary for supporting numerous management applications. For the determination of city models for visualization purposes, several standardized workflows exist. They are based on photogrammetry, on LiDAR, or on a combination of both data acquisition techniques. However, the automated determination of reliable and highly accurate city models is still a challenging task, requiring a workflow comprising several processing steps. The most relevant are building detection, building outline generation, building modeling, and, finally, building quality analysis. Commercial software tools for building modeling generally require a high degree of human interaction, and most automated approaches described in the literature address the steps of such a workflow individually. In this article, we propose a comprehensive approach for the automated determination of 3D city models from airborne acquired point cloud data. It is based on the assumption that individual buildings can be modeled properly by a composition of a set of planar faces. Hence, it relies on a reliable 3D segmentation algorithm that detects planar faces in a point cloud. This segmentation is of crucial importance for the outline detection and for the modeling approach. We describe the theoretical background, the segmentation algorithm, the outline detection, and the modeling approach, and we present and discuss several actual projects. PMID:27873931
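The planar-face detection that this workflow rests on is commonly built from a robust plane-fitting primitive. The following sketch is illustrative, not the authors' algorithm: a single RANSAC plane fit of the kind a planar segmentation step would apply repeatedly, with arbitrary assumed thresholds.

```python
# Illustrative RANSAC fit of one planar face to a 3D point cloud.
import numpy as np

def ransac_plane(points, n_iter=200, tol=0.05, rng=None):
    """Return (normal, d, inlier_mask) for the best plane n.x + d = 0."""
    rng = rng or np.random.default_rng(0)
    best = (None, None, np.zeros(len(points), dtype=bool))
    for _ in range(n_iter):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p2 - p1, p3 - p1)
        norm = np.linalg.norm(n)
        if norm < 1e-12:            # degenerate (collinear) sample
            continue
        n /= norm
        d = -n @ p1
        inliers = np.abs(points @ n + d) < tol
        if inliers.sum() > best[2].sum():
            best = (n, d, inliers)
    return best

# Synthetic roof facet: 100 points on z = 0 plus 10 elevated outliers.
rng = np.random.default_rng(1)
pts = np.r_[np.c_[rng.random((100, 2)), np.zeros(100)],
            rng.random((10, 3)) + [0.0, 0.0, 1.0]]
normal, d, mask = ransac_plane(pts)
```

A full segmentation would subtract the inliers and repeat, then hand the faces to outline generation.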
NASA Astrophysics Data System (ADS)
Macher, H.; Landes, T.; Grussenmeyer, P.
2016-06-01
Laser scanners are widely used for the modelling of existing buildings and particularly in the creation process of as-built BIM (Building Information Modelling). However, the generation of as-built BIM from point clouds involves mainly manual steps and is consequently time-consuming and error-prone. Along the path to automation, a three-step segmentation approach has been developed. This approach is composed of two phases: a segmentation into sub-spaces, namely floors and rooms, and a plane segmentation combined with the identification of building elements. In order to assess and validate the developed approach, different case studies are considered. Indeed, it is essential to apply algorithms to several datasets and not to develop algorithms with a unique dataset, whose particularities could bias the development. Indoor point clouds of different types of buildings are used as input for the developed algorithms, going from an individual house of almost one hundred square metres to larger buildings of several thousand square metres. The datasets provide various space configurations and present numerous occluding objects such as desks, computer equipment, home furnishings and even wine barrels. For each dataset, the results are illustrated. The analysis of the results provides insight into the transferability of the developed approach for the indoor modelling of several types of buildings.
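The first phase, splitting an indoor cloud into floors, is often done by exploiting the fact that floor and ceiling slabs concentrate many points at nearly constant height. A hedged sketch of that idea (bin size and peak factor are assumptions, not the paper's values):

```python
# Detect storey slabs as strong peaks in the height histogram of a cloud.
import numpy as np

def storey_boundaries(z, bin_size=0.1, peak_factor=5.0):
    """Return z-centers of bins whose point count far exceeds the median."""
    edges = np.arange(z.min(), z.max() + bin_size, bin_size)
    counts, edges = np.histogram(z, bins=edges)
    threshold = peak_factor * np.median(counts)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers[counts > threshold]

# Synthetic two-storey cloud: dense slabs at z = 0 and z = 3, sparse walls.
rng = np.random.default_rng(0)
z = np.r_[np.zeros(5000), np.full(5000, 3.0), rng.uniform(0, 3, 500)]
peaks = storey_boundaries(z)
```

Points between consecutive peaks would then be assigned to one storey before room and plane segmentation.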
First Prismatic Building Model Reconstruction from Tomosar Point Clouds
NASA Astrophysics Data System (ADS)
Sun, Y.; Shahzad, M.; Zhu, X.
2016-06-01
This paper demonstrates for the first time the potential of explicitly modelling individual roof surfaces to reconstruct 3-D prismatic building models using spaceborne tomographic synthetic aperture radar (TomoSAR) point clouds. The proposed approach is modular and works as follows: it first extracts the buildings via DSM generation and cutting off the ground terrain. The DSM is smoothed using the BM3D denoising method proposed in (Dabov et al., 2007), and a gradient map of the smoothed DSM is generated based on height jumps. Watershed segmentation is then adopted to oversegment the DSM into different regions. Subsequently, height- and polygon-complexity-constrained merging is employed to refine (i.e., to reduce) the retrieved number of roof segments. The coarse outline of each roof segment is then reconstructed and later refined using a quadtree-based regularization plus zig-zag line simplification scheme. Finally, a height is associated with each refined roof segment to obtain the 3-D prismatic model of the building. The proposed approach is illustrated and validated over a large building (convention center) in the city of Las Vegas using TomoSAR point clouds generated from a stack of 25 images with the Tomo-GENESIS software developed at DLR.
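The "zig-zag line simplification" step is only named in the abstract, not specified. As a hedged stand-in, the classic Douglas-Peucker algorithm illustrates the general operation such a step performs: collapsing a noisy outline to its dominant vertices while staying within a tolerance.

```python
# Douglas-Peucker polyline simplification (a generic stand-in, not the
# authors' exact scheme): drop vertices closer than tol to the local chord.
import math

def simplify(points, tol):
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy) or 1e-12
    # Perpendicular distance of every interior vertex to the chord.
    dists = [abs(dy * (x - x1) - dx * (y - y1)) / length
             for x, y in points[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[i - 1] > tol:
        # Split at the farthest vertex and simplify both halves.
        return simplify(points[:i + 1], tol)[:-1] + simplify(points[i:], tol)
    return [points[0], points[-1]]

# A jittered building edge followed by a sharp corner.
outline = [(0, 0), (1, 0.02), (2, -0.01), (3, 0.03), (4, 0), (4, 2)]
```

Run on the sample, the jitter along the first edge is removed while the corner at (4, 0) survives.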
Lightweight Electronic Camera for Research on Clouds
NASA Technical Reports Server (NTRS)
Lawson, Paul
2006-01-01
"Micro-CPI" (wherein "CPI" signifies "cloud-particle imager") is the name of a small, lightweight electronic camera that has been proposed for use in research on clouds. It would acquire and digitize high-resolution (3- m-pixel) images of ice particles and water drops at a rate up to 1,000 particles (and/or drops) per second.
An approach of point cloud denoising based on improved bilateral filtering
NASA Astrophysics Data System (ADS)
Zheng, Zeling; Jia, Songmin; Zhang, Guoliang; Li, Xiuzhi; Zhang, Xiangyin
2018-04-01
An omnidirectional mobile platform is designed for building point clouds based on an improved filtering algorithm that processes the depth image. First, the mobile platform can move flexibly and its control interface is convenient. Then, because the traditional bilateral filtering algorithm is time-consuming and inefficient, a novel method called local bilateral filtering (LBF) is proposed. LBF is applied to the depth images obtained by the Kinect sensor. The results show that noise removal is improved compared with standard bilateral filtering. Offline, the color images and the processed depth images are used to build point clouds. Finally, experimental results demonstrate that our method reduces the processing time of the depth image and improves the quality of the resulting point cloud.
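For context, the baseline the paper improves on can be sketched directly. This is a plain bilateral filter (not the paper's LBF variant), weighting neighbours by both spatial distance and depth similarity so that depth edges survive smoothing; the sigmas and window size are assumptions.

```python
# Naive bilateral filter on a depth image (baseline, not the paper's LBF).
import numpy as np

def bilateral(depth, radius=2, sigma_s=1.5, sigma_r=0.1):
    out = np.empty_like(depth, dtype=float)
    h, w = depth.shape
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))   # fixed kernel
    pad = np.pad(depth.astype(float), radius, mode="edge")
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range weight: penalize neighbours with dissimilar depth.
            wgt = spatial * np.exp(-(win - depth[i, j])**2 / (2 * sigma_r**2))
            out[i, j] = (wgt * win).sum() / wgt.sum()
    return out

# Noisy step edge: smoothing should reduce noise but preserve the step.
rng = np.random.default_rng(0)
img = np.r_[np.zeros((8, 8)), np.ones((8, 8))] + rng.normal(0, 0.02, (16, 8))
smooth = bilateral(img)
```

The per-pixel range weight is exactly what makes the classic filter slow, which is the cost the LBF variant targets.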
DAΦNE operation with electron-cloud-clearing electrodes.
Alesini, D; Drago, A; Gallo, A; Guiducci, S; Milardi, C; Stella, A; Zobov, M; De Santis, S; Demma, T; Raimondi, P
2013-03-22
The effects of an electron cloud (e-cloud) on beam dynamics are one of the major factors limiting the performance of high-intensity positron, proton, and ion storage rings. In the electron-positron collider DAΦNE, a horizontal beam instability due to the e-cloud effect has been identified as one of the main limitations on the maximum stored positron beam current and as a source of beam-quality deterioration. During the last machine shutdown, special electrodes were inserted in all dipole and wiggler magnets of the positron ring in order to mitigate this instability. This is the first installation of its kind worldwide: long metallic electrodes have been installed in all arcs of the collider's positron ring and are currently used during machine operation in collision. This has allowed a number of unprecedented measurements (e-cloud instability growth rates, transverse beam-size variation, tune shifts along the bunch train) in which the e-cloud contribution is clearly evidenced by turning the electrodes on and off. In this Letter we briefly describe the novel design of the electrodes, while the main focus is on the experimental measurements. We report results that clearly indicate the effectiveness of the electrodes for e-cloud suppression.
Trirotron: triode rotating beam radio frequency amplifier
Lebacqz, Jean V.
1980-01-01
High efficiency amplification of radio frequencies to very high power levels including: establishing a cylindrical cloud of electrons; establishing an electrical field surrounding and coaxial with the electron cloud to bias the electrons to remain in the cloud; establishing a rotating electrical field that surrounds and is coaxial with the steady field, the circular path of the rotating field being one wavelength long, whereby the peak of one phase of the rotating field is used to accelerate electrons in a beam through the bias field in synchronism with the peak of the rotating field so that there is a beam of electrons continuously extracted from the cloud and rotating with the peak; establishing a steady electrical field that surrounds and is coaxial with the rotating field for high-energy radial acceleration of the rotating beam of electrons; and resonating the rotating beam of electrons within a space surrounding the second field, the space being selected to have a phase velocity equal to that of the rotating field to thereby produce a high-power output at the frequency of the rotating field.
A Novel College Network Resource Management Method using Cloud Computing
NASA Astrophysics Data System (ADS)
Lin, Chen
At present, college information construction mainly comprises campus networks and management information systems, and many problems arise in the process. Cloud computing is a development of distributed processing, parallel processing, and grid computing: data are stored in the cloud, software and services are placed in the cloud and built on top of various standards and protocols, and they can be accessed through all kinds of devices. This article introduces cloud computing and its functions, then analyzes the existing problems of college network resource management; cloud computing technology and methods are then applied to the construction of a college information sharing platform.
Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic
Sanduja, S; Jewell, P; Aron, E; Pharai, N
2015-01-01
Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources available to expedite analysis and reporting. Cloud-based computing environments are available at a fraction of the time and effort when compared to traditional local datacenter-based solutions. This tutorial explains how to get started with building your own personal cloud computer cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic. PMID:26451333
ERIC Educational Resources Information Center
Liao, Yuan
2011-01-01
The virtualization of computing resources, as represented by the sustained growth of cloud computing, continues to thrive. Information Technology departments are building their private clouds due to the perception of significant cost savings by managing all physical computing resources from a single point and assigning them to applications or…
NASA Astrophysics Data System (ADS)
Jun, Byung-Il; Jones, T. W.
1999-02-01
We present two-dimensional MHD simulations of the evolution of a young Type Ia supernova remnant (SNR) during its interaction with an interstellar cloud of comparable size at impact. We include for the first time in such simulations explicit relativistic electron transport. This was done using a simplified treatment of the diffusion-advection equation, thus allowing us to model injection and acceleration of cosmic-ray electrons at shocks and their subsequent transport. From this information we also model radio synchrotron emission, including spectral information. The simulations were carried out in spherical coordinates with azimuthal symmetry and compare three different situations, each incorporating an initially uniform interstellar magnetic field oriented in the polar direction on the grid. In particular, we modeled the SNR-cloud interactions for a spherical cloud on the polar axis, a toroidal cloud whose axis is aligned with the polar axis, and, for comparison, a uniform medium with no cloud. We find that the evolution of the overrun cloud qualitatively resembles that seen in simulations of simpler but analogous situations: that is, the cloud is crushed and begins to be disrupted by Rayleigh-Taylor and Kelvin-Helmholtz instabilities. However, we demonstrate here that, in addition, the internal structure of the SNR is severely distorted as such clouds are engulfed. This has important dynamical and observational implications. The principal new conclusions we draw from these experiments are the following. (1) Independent of the cloud interaction, the SNR reverse shock can be an efficient site for particle acceleration in a young SNR. (2) The internal flows of the SNR become highly turbulent once it encounters a large cloud. (3) An initially uniform magnetic field is preferentially amplified along the magnetic equator of the SNR, primarily because of biased amplification in that region by Rayleigh-Taylor instabilities. 
A similar bias produces much greater enhancement to the magnetic energy in the SNR during an encounter with a cloud when the interstellar magnetic field is partially transverse to the expansion of the SNR. The enhanced magnetic fields have a significant radial component, independent of the field orientation external to the SNR. This leads to a strong equatorial bias in synchrotron brightness that could easily mask any enhancements to electron-acceleration efficiency near the magnetic equator of the SNR. Thus, to confirm the latter effect, it will be essential to establish that the magnetic field in the brightest regions is actually tangential to the blast wave. (4) The filamentary radio structures correlate well with ``turbulence-enhanced'' magnetic structures, while the diffuse radio emission more closely follows the gas-density distribution within the SNR. (5) At these early times, the synchrotron spectral index due to electrons accelerated at the primary shocks should be close to 0.5 unless those shocks are modified by cosmic-ray proton pressures. While that result is predictable, we find that this simple result can be significantly complicated in practice by SNR interactions with clouds. Those events can produce regions with significantly steeper spectra. Especially if there are multiple cloud encounters, this interaction can lead to nonuniform spatial spectral distributions or, through turbulent mixing, produce a spectrum that is difficult to relate to the actual strength of the blast wave. (6) Interaction with the cloud enhances the nonthermal electron population in the SNR in our simulations because of additional electron injection taking place in the shocks associated with the cloud. Together with point 3, this means that SNR-cloud encounters can significantly increase the radio emission from the SNR.
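The "simplified treatment of the diffusion-advection equation" is not written out in the abstract. For orientation, the standard cosmic-ray transport (Parker) equation that such treatments approximate, for the isotropic electron distribution f(x, p, t), is:

```latex
% Parker transport equation for the isotropic part of the cosmic-ray
% electron distribution f(x,p,t); u is the gas velocity and \kappa the
% spatial diffusion coefficient (standard form, not quoted from the paper):
\frac{\partial f}{\partial t} + \mathbf{u}\cdot\nabla f
  \;=\; \nabla\cdot\bigl(\kappa\,\nabla f\bigr)
  \;+\; \frac{1}{3}\bigl(\nabla\cdot\mathbf{u}\bigr)\,
        p\,\frac{\partial f}{\partial p}
```

The last term drives the momentum gain at shocks (where ∇·u < 0), which is the acceleration the simulations model at both the blast wave and the reverse shock.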
APOLLO_NG - a probabilistic interpretation of the APOLLO legacy for AVHRR heritage channels
NASA Astrophysics Data System (ADS)
Klüser, L.; Killius, N.; Gesell, G.
2015-04-01
The cloud processing scheme APOLLO (AVHRR Processing scheme Over cLouds, Land and Ocean) has been in use for cloud detection and cloud property retrieval since the late 1980s. The physics of the APOLLO scheme still builds the backbone of a range of cloud detection algorithms for AVHRR (Advanced Very High Resolution Radiometer) heritage instruments. The APOLLO_NG (APOLLO_NextGeneration) cloud processing scheme is a probabilistic interpretation of the original APOLLO method. While building upon the physical principles that have served well in the original APOLLO, a couple of additional variables have been introduced in APOLLO_NG. Cloud detection is not performed as a binary yes/no decision based on these physical principles but is expressed as a cloud probability for each satellite pixel. Consequently, the outcome of the algorithm can be tuned from clear-confident to cloud-confident depending on the purpose. The probabilistic approach allows retrieving not only the cloud properties (optical depth, effective radius, cloud top temperature and cloud water path) but also their uncertainties. APOLLO_NG is designed as a standalone cloud retrieval method robust enough for operational near-realtime use and for application to large amounts of historical satellite data. Thus the radiative transfer solution is approximated by the same two-stream approach which had also been used for the original APOLLO. This allows the algorithm to be robust enough to be applied to a wide range of sensors without the necessity of sensor-specific tuning. Moreover, it allows for online calculation of the radiative transfer (i.e., within the retrieval algorithm), giving rise to a detailed probabilistic treatment of cloud variables. This study presents the algorithm for cloud detection and cloud property retrieval together with the physical principles from the APOLLO legacy on which it is based. Furthermore, a couple of example results from NOAA-18 are presented.
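APOLLO_NG's actual probabilistic formulation is not given in the abstract. As an illustration only, one common way to turn several per-pixel threshold tests into a single cloud probability is naive-Bayes combination of assumed likelihood ratios; the test names and numbers below are invented:

```python
# Hypothetical naive-Bayes cloud-probability combiner (illustrative, not
# the APOLLO_NG formulation). Each entry: (P(pass|cloudy), P(pass|clear)).
import math

TESTS = {
    "cold_ir":      (0.90, 0.10),
    "high_vis_ref": (0.80, 0.20),
    "low_ir_diff":  (0.70, 0.30),
}

def cloud_probability(results, prior=0.5):
    """results: dict test -> bool; returns P(cloudy | observed tests)."""
    log_odds = math.log(prior / (1 - prior))
    for name, passed in results.items():
        p_cloudy, p_clear = TESTS[name]
        if not passed:
            p_cloudy, p_clear = 1 - p_cloudy, 1 - p_clear
        log_odds += math.log(p_cloudy / p_clear)
    return 1 / (1 + math.exp(-log_odds))
```

Thresholding the resulting probability high or low reproduces the "clear-confident" versus "cloud-confident" tuning described above.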
Outdoor Illegal Construction Identification Algorithm Based on 3D Point Cloud Segmentation
NASA Astrophysics Data System (ADS)
An, Lu; Guo, Baolong
2018-03-01
Recently, illegal constructions have been appearing frequently in our surroundings, seriously restricting the orderly development of urban modernization. 3D point cloud data can be used to identify illegal buildings and thereby address this problem effectively. This paper proposes an outdoor illegal construction identification algorithm based on 3D point cloud segmentation. Initially, in order to save memory space and reduce processing time, a lossless point cloud compression method based on a minimum spanning tree is proposed. Then, a ground point removal method based on multi-scale filtering is introduced to increase accuracy. Finally, building clusters on the ground are obtained using a region growing method, and as a result the illegal constructions can be marked. The effectiveness of the proposed algorithm is verified using a public data set from the International Society for Photogrammetry and Remote Sensing (ISPRS).
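The ground-removal step can be sketched with a simpler, single-scale stand-in for the paper's multi-scale filter: grid the cloud in x-y, take each cell's lowest point as the local ground height, and drop points within a tolerance of it. Cell size and tolerance are assumptions.

```python
# Single-scale grid ground filter (a simplified stand-in, not the paper's
# multi-scale method): keep only points well above their cell's minimum z.
import numpy as np

def remove_ground(points, cell=1.0, tol=0.3):
    ij = np.floor(points[:, :2] / cell).astype(int)        # cell index per point
    keys = {tuple(k) for k in ij}
    ground = {k: points[(ij == k).all(1), 2].min() for k in keys}
    zmin = np.array([ground[tuple(k)] for k in ij])
    return points[points[:, 2] > zmin + tol]

# Synthetic scene: flat noisy ground plus one elevated building block.
rng = np.random.default_rng(0)
ground_pts = np.c_[rng.uniform(0, 10, (500, 2)), rng.normal(0, 0.05, 500)]
building = np.c_[rng.uniform(4, 6, (200, 2)), rng.uniform(3, 6, 200)]
kept = remove_ground(np.r_[ground_pts, building])
```

What survives would then be clustered by region growing to delineate candidate buildings.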
Experimental Investigation of Electron Cloud Containment in a Nonuniform Magnetic Field
NASA Technical Reports Server (NTRS)
Eninger, J. E.
1974-01-01
Dense clouds of electrons were generated and studied in an axisymmetric, nonuniform magnetic field created by a short solenoid. The operation of the experiment was similar to that of a low-pressure (approximately 10⁻⁶ Torr) magnetron discharge. Discharge current characteristics are presented as a function of pressure, magnetic field strength, voltage, and cathode end-plate location. The rotation of the electron cloud is determined from the frequency of diocotron waves. In the space-charge-saturated regime of operation, the cloud is found to rotate as a solid body with frequency close to V_a/Φ_a, where V_a is the anode voltage and Φ_a is the total magnetic flux. This result indicates that, in regions where electrons are present, the magnetic field lines are electrostatic equipotentials (E·B = 0). Equilibrium electron density distributions suggested by this condition are integrated with respect to total ionizing power and are found consistent with measured discharge currents.
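The quoted solid-body frequency follows from the E×B drift under the stated equipotential condition. The following is a hedged reconstruction (the linear V(Φ) profile and sign conventions are assumptions, not taken from the paper):

```latex
% E x B drift of the electron cloud, with field lines equipotential so that
% V = V(\Phi), where \Phi is the flux enclosed at radius r, d\Phi = 2\pi r B_z\,dr:
\omega(r) \;=\; \frac{v_\theta}{r} \;=\; \frac{E_r}{r B_z}
          \;=\; \frac{1}{r B_z}\,\frac{dV}{dr}
          \;=\; 2\pi\,\frac{dV}{d\Phi}
% If V rises linearly with \Phi from 0 at the cathode to V_a at the anode,
% which encloses the total flux \Phi_a, then \omega = 2\pi V_a/\Phi_a, i.e.
% f = \omega/2\pi = V_a/\Phi_a, independent of r: solid-body rotation.
```

The r-independence of ω is exactly the solid-body rotation reported from the diocotron-wave measurements.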
The Public Health Community Platform, Electronic Case Reporting, and the Digital Bridge.
Cooney, Mary Ann; Iademarco, Michael F; Huang, Monica; MacKenzie, William R; Davidson, Arthur J
At the intersection of new technology advancements, ever-changing health policy, and fiscal constraints, public health agencies seek to leverage modern technical innovations and benefit from a more comprehensive and cooperative approach to transforming public health, health care, and other data into action. State health agencies recognized a way to advance population health was to integrate public health with clinical health data through electronic infectious disease case reporting. The Public Health Community Platform (PHCP) concept of bidirectional data flow and knowledge management became the foundation to build a cloud-based system connecting electronic health records to public health data for a select initial set of notifiable conditions. With challenges faced and lessons learned, significant progress was made and the PHCP grew into the Digital Bridge, a national governance model for systems change, bringing together software vendors, public health, and health care. As the model and technology advance together, opportunities to advance future connectivity solutions for both health care and public health will emerge.
NASA Astrophysics Data System (ADS)
Gleason, Alyx; Bedard, Jamie; Bellis, Matthew; CMS Collaboration
2016-03-01
In the summer of 2015, we hosted 10 high school teachers for a three-day ``Physics at the Frontier'' Workshop. The mornings were spent learning about particle physics, CMS and the LHC, and radiation safety, while the afternoons were spent building turn-key cloud chambers for use in their classrooms. The basic cloud chamber design uses Peltier thermoelectric coolers, rather than dry ice, and instructions can be found in multiple places online. For a robust build procedure and easy use in the classroom, we redesigned parts of the construction process to make the chamber easier to put together while holding costs below $200 per chamber. In addition to this new design, we also created a website with instructions for those who are interested in building their own chamber using this design. This workshop was funded in part by a minigrant for Outreach and Education from the USCMS collaboration. Our experience with the workshop and the lessons learned from the cloud chamber design will be discussed. This work was funded in part by NSF Grant PHY-1307562 and a USCMS-administered minigrant for Outreach and Education.
NASA Technical Reports Server (NTRS)
Ivory, K.; Schwenn, R.
1995-01-01
The solar wind data obtained from the two Helios solar probes in the years 1974 to 1986 were systematically searched for the occurrence of bi-directional electron events. Most often these events are found in conjunction with shock associated magnetic clouds. The implications of these observations for the topology of interplanetary plasma clouds are discussed.
Galaxy CloudMan: delivering cloud compute clusters.
Afgan, Enis; Baker, Dannon; Coraor, Nate; Chapman, Brad; Nekrutenko, Anton; Taylor, James
2010-12-21
Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is "cloud computing", which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate "as is" use by experimental biologists. We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon's EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge.
APOLLO_NG - a probabilistic interpretation of the APOLLO legacy for AVHRR heritage channels
NASA Astrophysics Data System (ADS)
Klüser, L.; Killius, N.; Gesell, G.
2015-10-01
The cloud processing scheme APOLLO (AVHRR Processing scheme Over cLouds, Land and Ocean) has been in use for cloud detection and cloud property retrieval since the late 1980s. The physics of the APOLLO scheme still builds the backbone of a range of cloud detection algorithms for AVHRR (Advanced Very High Resolution Radiometer) heritage instruments. The APOLLO_NG (APOLLO_NextGeneration) cloud processing scheme is a probabilistic interpretation of the original APOLLO method. It builds upon the physical principles that have served well in the original APOLLO scheme. Nevertheless, a couple of additional variables have been introduced in APOLLO_NG. Cloud detection is no longer performed as a binary yes/no decision based on these physical principles. It is rather expressed as a cloud probability for each satellite pixel. Consequently, the outcome of the algorithm can be tuned from reliably identifying clear pixels to reliably identifying definitely cloudy pixels, depending on the purpose. The probabilistic approach allows retrieving not only the cloud properties (optical depth, effective radius, cloud top temperature and cloud water path) but also their uncertainties. APOLLO_NG is designed as a standalone cloud retrieval method robust enough for operational near-realtime use and for application to large amounts of historical satellite data. The radiative transfer solution is approximated by the same two-stream approach which had also been used for the original APOLLO. This allows the algorithm to be applied to a wide range of sensors without the necessity of sensor-specific tuning. Moreover, it allows for online calculation of the radiative transfer (i.e., within the retrieval algorithm), giving rise to a detailed probabilistic treatment of cloud variables. This study presents the algorithm for cloud detection and cloud property retrieval together with the physical principles from the APOLLO legacy on which it is based. Furthermore, a couple of example results from NOAA-18 are presented.
Average value of the shape and direction factor in the equation of refractive index
NASA Astrophysics Data System (ADS)
Zhang, Tao
2017-10-01
The theoretical calculation of refractive indices is of great significance for the development of new optical materials. The calculation method of the refractive index, which was deduced from the electron-cloud-conductor model, contains the shape and direction factor 〈g〉. 〈g〉 affects the electromagnetic-induction energy absorbed by the electron clouds, thereby influencing the refractive indices. It was not previously known how to calculate the 〈g〉 value of non-spherical electron clouds. In this paper, the 〈g〉 value is derived by conceptually dividing the electron cloud into numerous small volume elements and then regrouping them. This paper proves that 〈g〉 = 2/3 when the spatial orientations of molecules are distributed randomly. Calculations of the refractive indices of several substances validate this equation. This result will help promote the application of this calculation method of the refractive index.
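The paper's derivation is not reproduced here, but the value 2/3 is what isotropic orientational averaging gives for one natural choice of g: the squared transverse projection sin²θ of a randomly oriented molecular axis relative to the field direction. That choice is an assumption for illustration, not the paper's definition of g; a quick Monte Carlo makes the average plausible:

```python
# Monte Carlo check that <sin^2 theta> = 2/3 for isotropic orientations
# (illustrative assumption for g; for uniform directions on the sphere,
# cos(theta) is uniform on [-1, 1]).
import random

def mean_g(n=200_000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        c = rng.uniform(-1.0, 1.0)   # cos(theta), uniform for isotropy
        total += 1.0 - c * c         # sin^2(theta)
    return total / n

g_avg = mean_g()
```

Analytically, ⟨sin²θ⟩ = 1 − ⟨cos²θ⟩ = 1 − 1/3 = 2/3, matching the paper's stated result for random orientations.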
Storm Clouds Roll In Over The Vehicle Assembly Building
2009-07-12
Storm clouds roll in over the NASA Vehicle Assembly building moments after STS-127 Space Shuttle Launch Director Pete Nickolenko and the launch team called the launch a "No Go" due to weather conditions at the NASA Kennedy Space Center in Cape Canaveral, Florida, Sunday, July 12, 2009. Endeavour will be launching with the crew of STS-127 on a 16-day mission that will feature five spacewalks and complete construction of the Japan Aerospace Exploration Agency's Kibo laboratory. Photo Credit: (NASA/Bill Ingalls)
Storm Clouds Roll In Over The Vehicle Assembly Building
2009-07-11
Storm clouds roll in over the NASA Vehicle Assembly building moments after STS-127 Space Shuttle Launch Director Pete Nickolenko and the launch team called the launch a "No Go" due to weather conditions at the NASA Kennedy Space Center in Cape Canaveral, Florida, Sunday, July 12, 2009. Endeavour will be launching with the crew of STS-127 on a 16-day mission that will feature five spacewalks and complete construction of the Japan Aerospace Exploration Agency's Kibo laboratory. Photo Credit: (NASA/Bill Ingalls)
[Porting Radiotherapy Software of Varian to Cloud Platform].
Zou, Lian; Zhang, Weisha; Liu, Xiangxiang; Xie, Zhao; Xie, Yaoqin
2017-09-30
To develop a low-cost private cloud platform for radiotherapy software. First, a private cloud platform based on OpenStack and virtual GPU hardware was built. Then, on the private cloud platform, all the Varian radiotherapy software modules were installed in virtual machines and the corresponding functions were configured. Finally, the software on the cloud could be accessed through a virtual desktop client. The function test results show that a cloud workstation is equivalent to an isolated physical workstation, and any client on the LAN can use the cloud workstation smoothly. The cloud platform port in this study is economical and practical. The project not only improves the utilization rate of radiotherapy software, but also makes it possible for cloud computing technology to expand its applications to the field of radiation oncology.
Building damage assessment using airborne lidar
NASA Astrophysics Data System (ADS)
Axel, Colin; van Aardt, Jan
2017-10-01
The assessment of building damage following a natural disaster is a crucial step in determining the impact of the event itself and gauging reconstruction needs. Automatic methods for deriving damage maps from remotely sensed data are preferred, since they are regarded as rapid and objective. We propose an algorithm for performing unsupervised building segmentation and damage assessment using airborne light detection and ranging (lidar) data. Local surface properties, including normal vectors and curvature, were used along with region growing to segment individual buildings in lidar point clouds. Damaged building candidates were identified based on rooftop inclination angle, and then damage was assessed using planarity and point-height metrics. Validation of the building segmentation and damage assessment techniques was performed using airborne lidar data collected after the Haiti earthquake of 2010. Building segmentation and damage assessment accuracies of 93.8% and 78.9%, respectively, were obtained using lidar point clouds and expert damage assessments of 1953 buildings in heavily damaged regions. We believe this research indicates the utility of airborne lidar remote sensing for increasing the efficiency and speed of emergency response operations.
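The "local surface properties" the segmentation relies on are typically derived from the eigen-decomposition of a neighbourhood's covariance. A hedged sketch (the neighbourhood size k and this particular planarity score are common choices, not necessarily the authors'): the eigenvector of least variance gives the normal, and a small third eigenvalue marks a locally planar, i.e. likely intact, roof patch.

```python
# Normal vector and planarity score from a k-nearest-neighbour covariance.
import numpy as np

def normal_and_planarity(points, idx, k=12):
    d = np.linalg.norm(points - points[idx], axis=1)
    nbrs = points[np.argsort(d)[:k]]          # k nearest neighbours (incl. self)
    evals, evecs = np.linalg.eigh(np.cov(nbrs.T))   # ascending eigenvalues
    normal = evecs[:, 0]                      # direction of least variance
    planarity = (evals[1] - evals[0]) / evals[2]    # ~1 planar, ~0 volumetric
    return normal, planarity

# Nearly planar synthetic roof patch: z is tiny noise around 0.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 1, (200, 2))
flat_roof = np.c_[xy, 0.001 * rng.normal(size=200)]
n, p = normal_and_planarity(flat_roof, 0)
```

Collapsed or rubble-covered roofs would score low on this metric, which is how planarity feeds the damage assessment.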
2. Historic American Buildings Survey Russell Jones, Photographer June 1963 ...
2. Historic American Buildings Survey Russell Jones, Photographer June 1963 SOUTHEAST VIEW - Abner Cloud House, Intersection of Canal Road & Reservoir Road Northwest, Washington, District of Columbia, DC
1. Historic American Buildings Survey Russell Jones, Photographer June 1963 ...
1. Historic American Buildings Survey Russell Jones, Photographer June 1963 SOUTHWEST VIEW - Abner Cloud House, Intersection of Canal Road & Reservoir Road Northwest, Washington, District of Columbia, DC
Building a Cloud Computing and Big Data Infrastructure for Cybersecurity Research and Education
2015-04-17
[Table: compute-hardware inventory for the Hadoop and Cloud production/integration clusters (Dell R720xd and VRTX M620 nodes, IBM HS22 blades), with per-cluster core, memory, and storage totals; column layout lost in extraction. September 2014.]
3. Historic American Buildings Survey Russell Jones, Photographer June 1963NORTHWEST ...
3. Historic American Buildings Survey Russell Jones, Photographer June 1963NORTHWEST VIEW - Abner Cloud House, Intersection of Canal Road & Reservoir Road Northwest, Washington, District of Columbia, DC
DOE Office of Scientific and Technical Information (OSTI.GOV)
FISCHER,W.
We summarize the ECL2 workshop on electron cloud clearing, held at CERN in early March 2007, and highlight a number of novel ideas for electron cloud suppression, such as continuous clearing electrodes based on enamel, slotted structures, and electret inserts.
On the evolution of Saturn's 'Spokes' - Theory
NASA Technical Reports Server (NTRS)
Morfill, G. E.; Gruen, E.; Goertz, C. K.; Johnson, T. V.
1983-01-01
Starting with the assumption that negatively charged micron-sized dust grains may be elevated above Saturn's ring plane by plasma interactions, the subsequent evolution of the system is discussed. The discharge of the fine dust by solar UV radiation produces a cloud of electrons which moves adiabatically in Saturn's dipolar magnetic field. The electron cloud is absorbed by the ring after one bounce, alters the local ring potential significantly, and reduces the local Debye length. As a result, more micron-sized dust particles may be elevated above the ring plane and the spoke grows. This process continues until the electron cloud has dissipated.
Particle-in-cell simulations of the critical ionization velocity effect in finite size clouds
NASA Technical Reports Server (NTRS)
Moghaddam-Taaheri, E.; Lu, G.; Goertz, C. K.; Nishikawa, K.-I.
1994-01-01
The critical ionization velocity (CIV) mechanism in a finite size cloud is studied with a series of electrostatic particle-in-cell simulations. It is observed that an initial seed ionization, produced by non-CIV mechanisms, generates a cross-field ion beam which excites a modified beam-plasma instability (MBPI) with frequency in the range of the lower hybrid frequency. The excited waves accelerate electrons along the magnetic field up to the ion drift energy, which exceeds the ionization energy of the neutral atoms. The heated electrons in turn enhance the ion beam by electron-neutral impact ionization, which establishes a positive feedback loop maintaining the CIV process. It is also found that the efficiency of the CIV mechanism depends on the finite size of the gas cloud in the following ways: (1) Along the ambient magnetic field, the finite size of the cloud, L_par, restricts the growth of the fastest growing mode of the MBPI, with wavelength lambda_m,par. The parallel electron heating at wave saturation scales approximately as (L_par/lambda_m,par)^(1/2). (2) Momentum coupling between the cloud and the ambient plasma via Alfven waves occurs as a result of the finite size of the cloud in the direction perpendicular to both the ambient magnetic field and the neutral drift. This reduces the relative drift between the ambient plasma and the neutrals exponentially with time, on a timescale inversely proportional to the Alfven velocity. (3) The transverse charge separation field across the cloud was found to modulate the beam velocity, which reduces the parallel heating of electrons and increases their transverse acceleration. (4) Some energetic electrons are lost from the cloud along the magnetic field at a rate characterized by the acoustic velocity, instead of the electron thermal velocity.
The loss of energetic electrons from the cloud seems to be larger in the direction of plasma drift relative to the neutrals, where the loss rate is characterized by the neutral drift velocity. It is also shown that a factor of 4 increase in the ambient plasma density, increases the CIV ionization yield by almost 2 orders of magnitude at the end of a typical run. It is concluded that a larger ambient plasma density can result in a larger CIV yield because of (1) larger seed ion production by non-CIV mechanisms, (2) smaller Alfven velocity and hence weak momentum coupling, and (3) smaller ratio of the ion beam density to the ambient ion density, and therefore a weaker modulation of the beam velocity. The simulation results are used to interpret various chemical release experiments in space.
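The size scaling quoted in point (1) can be written as a one-line helper; the proportionality constant is omitted since the abstract gives only the scaling, and the function name and arguments are our own labels:

```python
import math

def parallel_heating_scale(l_par, lam_m_par):
    """Scaling of parallel electron heating at wave saturation with
    cloud size, ~ (L_par / lambda_m,par)**(1/2) as quoted above
    (proportionality constant omitted)."""
    return math.sqrt(l_par / lam_m_par)

# a cloud four times longer than the fastest-growing wavelength heats
# electrons about twice as much as one matching the wavelength
ratio = parallel_heating_scale(4.0, 1.0) / parallel_heating_scale(1.0, 1.0)
```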
ROOFN3D: Deep Learning Training Data for 3d Building Reconstruction
NASA Astrophysics Data System (ADS)
Wichmann, A.; Agoub, A.; Kada, M.
2018-05-01
Machine learning methods have gained in importance through the latest developments in artificial intelligence and computer hardware. Approaches based on deep learning in particular have shown that they can provide state-of-the-art results for various tasks. However, the direct application of deep learning methods to improve the results of 3D building reconstruction is often not possible due, for example, to the lack of suitable training data. To address this issue, we present RoofN3D, which provides a new 3D point cloud training dataset that can be used to train machine learning models for different tasks in the context of 3D building reconstruction. It can be used, among other things, to train semantic segmentation networks or to learn the structure of buildings and geometric model construction. Further details about RoofN3D and the developed data preparation framework, which enables the automatic derivation of training data, are described in this paper. Furthermore, we provide an overview of other available 3D point cloud training data and of approaches from the current literature that present solutions for applying deep learning to unstructured, non-gridded 3D point cloud data.
Building Change Detection from Bi-Temporal Dense-Matching Point Clouds and Aerial Images.
Pang, Shiyan; Hu, Xiangyun; Cai, Zhongliang; Gong, Jinqi; Zhang, Mi
2018-03-24
In this work, a novel building change detection method from bi-temporal dense-matching point clouds and aerial images is proposed to address two major problems, namely, the robust acquisition of the changed objects above ground and the automatic classification of changed objects into buildings or non-buildings. For the acquisition of changed objects above ground, the change detection problem is converted into a binary classification, in which the changed area above ground is regarded as the foreground and the other area as the background. For the gridded points of each period, the graph cuts algorithm is adopted to classify the points into foreground and background, followed by the region-growing algorithm to form candidate changed building objects. A novel structural feature that was extracted from aerial images is constructed to classify the candidate changed building objects into buildings and non-buildings. The changed building objects are further classified as "newly built", "taller", "demolished", and "lower" by combining the classification and the digital surface models of two periods. Finally, three typical areas from a large dataset are used to validate the proposed method. Numerous experiments demonstrate the effectiveness of the proposed algorithm.
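The final four-way labelling of changed objects can be illustrated with a toy per-object rule operating on mean heights above ground in the two epochs. The thresholds and the rule itself are illustrative assumptions, not the paper's graph-cuts pipeline:

```python
def classify_change(h_old, h_new, min_h=2.5, tol=0.5):
    """Toy rule for labelling a changed building object from its mean
    height above ground in two epochs. `min_h` is an assumed minimum
    building height and `tol` an assumed height-change tolerance."""
    if h_old < min_h and h_new >= min_h:
        return "newly built"      # nothing before, building now
    if h_old >= min_h and h_new < min_h:
        return "demolished"       # building before, nothing now
    if h_new - h_old > tol:
        return "taller"
    if h_old - h_new > tol:
        return "lower"
    return "unchanged"
```

In the paper this decision is made by combining the object classification with the digital surface models of the two periods; the sketch only captures the height-comparison logic.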
Automatic Generation of Indoor Navigable Space Using a Point Cloud and its Scanner Trajectory
NASA Astrophysics Data System (ADS)
Staats, B. R.; Diakité, A. A.; Voûte, R. L.; Zlatanova, S.
2017-09-01
Automatic generation of indoor navigable models is mostly based on 2D floor plans. However, in many cases the floor plans are out of date. Buildings are not always built according to their blueprints, interiors may change after a few years because of modified walls and doors, and furniture may be repositioned to the users' preferences. Therefore, new approaches for the quick recording of indoor environments should be investigated. This paper concentrates on laser scanning with a Mobile Laser Scanner (MLS) device. The MLS device stores a point cloud and its trajectory. If the MLS device is operated by a human, the trajectory contains information which can be used to distinguish different surfaces. In this paper a method is presented for the identification of walkable surfaces based on the analysis of the point cloud and the trajectory of the MLS scanner. This method consists of several steps. First, the point cloud is voxelized. Second, the trajectory is analysed and projected to acquire seed voxels. Third, these seed voxels are grown into floor regions by a region-growing process. By identifying dynamic objects, doors and furniture, these floor regions can be modified so that each region represents a specific navigable space inside a building as a free navigable voxel space. By combining the point cloud and its corresponding trajectory, the walkable space can be identified for any type of building, even if the interior is scanned during business hours.
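A minimal sketch of the first two steps — voxelizing the cloud and projecting the operator trajectory down to seed voxels — might look as follows. The voxel size, the assumed carrying height of the scanner, and the straight-down projection are illustrative assumptions, not the paper's parameters:

```python
def voxelize(points, size=0.1):
    """Map 3-D points to integer voxel indices (step 1)."""
    occupied = set()
    for x, y, z in points:
        occupied.add((int(x // size), int(y // size), int(z // size)))
    return occupied

def seed_voxels(trajectory, occupied, size=0.1, sensor_height=1.5):
    """Project each trajectory pose straight down by an assumed carrying
    height to the voxel the operator walked on (step 2); keep it only if
    the scan actually contains points there."""
    seeds = set()
    for x, y, z in trajectory:
        v = (int(x // size), int(y // size), int((z - sensor_height) // size))
        if v in occupied:
            seeds.add(v)
    return seeds

# a floor point directly below a trajectory pose yields one seed voxel
points = [(0.05, 0.05, 0.05), (1.0, 1.0, 0.05)]
occupied = voxelize(points)
seeds = seed_voxels([(0.05, 0.05, 1.55)], occupied)
```

The region-growing step would then expand from these seeds through horizontally adjacent occupied voxels to form the floor regions.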
Automatic 3d Building Model Generations with Airborne LiDAR Data
NASA Astrophysics Data System (ADS)
Yastikli, N.; Cetin, Z.
2017-11-01
LiDAR systems are becoming more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth's surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems, and is of major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication, mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, a simple and quick approach for automatic 3D building model generation is needed for the many studies which include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is aimed at. An approach is proposed for automatic 3D building model generation that includes automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification uses hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules were performed to improve classification results, using different test areas identified in the study area. The proposed approach has been tested in the study area, which has partly open areas, forest areas and many types of buildings, in Zekeriyakoy, Istanbul, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The obtained results verified that 3D building models can be generated automatically and successfully from raw LiDAR point cloud data.
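A hierarchical rule set of the kind described might be sketched as below. The specific features, thresholds, and rule order are illustrative assumptions; the paper tunes its own rules per test area in TerraScan:

```python
def classify_point(height_above_ground, planarity, num_returns):
    """Simplified hierarchical point classification in the spirit of
    the approach above. Rules are evaluated in order of priority;
    all thresholds are illustrative."""
    if height_above_ground < 0.3:
        return "ground"
    if num_returns > 1:
        # vegetation tends to produce multiple returns per pulse
        return "vegetation"
    if planarity > 0.8 and height_above_ground > 2.0:
        # high, planar, single-return points are roof candidates
        return "building"
    return "unclassified"
```

Roof-classified points would then be grouped per building and fitted with planes to produce the 3D models.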
NASA Astrophysics Data System (ADS)
Huang, Yishuo; Chiang, Chih-Hung; Hsu, Keng-Tsang
2018-03-01
Defects on the facades of a building have a profound impact on its life cycle. How to identify these defects is a crucial issue; destructive and non-destructive methods are usually employed to identify them. Destructive methods always cause permanent damage to the examined objects; on the other hand, non-destructive testing (NDT) methods have been widely applied to detect defects on the exterior layers of a building. However, NDT methods alone cannot provide efficient and reliable information for identifying defects because of the huge areas to be examined. Infrared thermography is often applied to quantitative energy performance measurements of building envelopes. Defects on the exterior layer of buildings may be caused by several factors: ventilation losses, conduction losses, thermal bridging, defective services, moisture condensation, moisture ingress, and structural defects. Analyzing the collected thermal images can be quite difficult when the spatial variations of surface temperature are small. In this paper the authors employ image segmentation to cluster pixels with similar surface temperatures, so that the processed thermal images are composed of a limited number of groups, each with a homogeneous surface temperature distribution. In doing so, the boundaries of the segmented regions can be identified and extracted. A terrestrial laser scanner (TLS) is widely used to collect point clouds of a building, and those point clouds are used to reconstruct a 3D model of the building. A mapping model is constructed such that the segmented thermal images can be projected onto the 2D image of the specified 3D building. In this paper, the administrative building on the Chaoyang University campus is used as an example. The experimental results not only provide the defect information but also give its corresponding spatial locations in the 3D model.
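The segmentation of thermal images into groups of similar surface temperature can be illustrated with a 1-D k-means over per-pixel temperatures. The cluster count, the initialization, and the data below are illustrative assumptions, not the authors' exact method:

```python
def cluster_temperatures(temps, k=2, iters=20):
    """1-D k-means over per-pixel surface temperatures: a minimal
    stand-in for segmenting a thermal image into k groups of
    similar temperature."""
    lo, hi = min(temps), max(temps)
    # spread initial centers evenly over the observed range
    centers = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for t in temps:
            j = min(range(k), key=lambda i: abs(t - centers[i]))
            groups[j].append(t)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

# facade pixels: sound surface ~20 C, a moisture-affected patch ~14 C
temps = [19.8, 20.1, 20.0, 14.2, 13.9, 20.2, 14.0]
centers = sorted(cluster_temperatures(temps))
```

The boundary between the two pixel groups then delineates the candidate defect region on the facade.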
MAVEN Mapping of Plasma Clouds Near Mars
NASA Astrophysics Data System (ADS)
Hurley, D.; Tran, T.; DiBraccio, G. A.; Espley, J. R.; Soobiah, Y. I. J.
2017-12-01
Brace et al. identified parcels of ionospheric plasma above the nominal ionosphere of Venus, dubbed plasma clouds. These were envisioned as instabilities on the ionopause that evolved to escaping parcels of ionospheric plasma. Mars Global Surveyor (MGS) Electron Reflectometer (ER) also detected signatures of ionospheric plasma above the nominal ionopause of Mars. Initial examination of the MGS ER data suggests that plasma clouds are more prevalent at Mars than at Venus, and similarly exhibit a connection to rotations in the upstream Interplanetary Magnetic Field (IMF) as Zhang et al. showed at Venus. We examine electron data from Mars to determine the locations of plasma clouds in the near-Mars environment using MGS and MAVEN data. The extensive coverage of the MAVEN orbit enables mapping an occurrence rate of the photoelectron spectra in Solar Wind Electron Analyzer (SWEA) data spanning all relevant altitudes and solar zenith angles. Martian plasma clouds are observed near the terminator like at Venus. They move to higher altitude as solar zenith angle increases, consistent with the escaping plasma hypothesis.
Semantic Segmentation of Indoor Point Clouds Using Convolutional Neural Network
NASA Astrophysics Data System (ADS)
Babacan, K.; Chen, L.; Sohn, G.
2017-11-01
As Building Information Modelling (BIM) thrives, geometry is no longer sufficient; an ever-increasing variety of semantic information is needed to express an indoor model adequately. On the other hand, for existing buildings, automatically generating semantically enriched BIM from point cloud data is in its infancy. Previous research to enhance the semantic content relies on frameworks in which specific rules and/or features are hand-coded by specialists. These methods inherently lack generalization and easily break in different circumstances. On this account, a generalized framework is urgently needed to automatically and accurately generate semantic information. We therefore propose to employ deep learning techniques for the semantic segmentation of point clouds into meaningful parts. More specifically, we build a volumetric data representation in order to efficiently generate the high number of training samples needed to train a convolutional neural network architecture. Feedforward propagation is used to perform the classification at the voxel level, achieving semantic segmentation. The method is tested both on a mobile laser scanner point cloud and on larger-scale synthetically generated data. We also demonstrate a case study in which our method can be effectively used to leverage the extraction of planar surfaces in challenging, cluttered indoor environments.
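The volumetric data representation fed to such a network can be sketched as a binary occupancy grid. The grid dimensions and voxel size below are assumed hyper-parameters, since the abstract does not give the actual network settings:

```python
def occupancy_grid(points, origin, size, dims):
    """Binary voxel occupancy grid of the kind used as input to a
    volumetric CNN: 1 where at least one point falls in the voxel,
    0 elsewhere. `origin`, `size`, and `dims` are assumptions."""
    nx, ny, nz = dims
    grid = [[[0] * nz for _ in range(ny)] for _ in range(nx)]
    ox, oy, oz = origin
    for x, y, z in points:
        i = int((x - ox) / size)
        j = int((y - oy) / size)
        k = int((z - oz) / size)
        if 0 <= i < nx and 0 <= j < ny and 0 <= k < nz:
            grid[i][j][k] = 1
    return grid

# two points landing in two distinct voxels of a 4x4x4 grid
grid = occupancy_grid([(0.05, 0.05, 0.05), (0.35, 0.05, 0.05)],
                      (0.0, 0.0, 0.0), 0.1, (4, 4, 4))
```

Sliding such a grid over the scan and classifying its center voxel is one common way to turn per-voxel CNN predictions into a semantic segmentation.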
NASA Astrophysics Data System (ADS)
Ge, Xuming
2017-08-01
The coarse registration of point clouds from urban building scenes has become a key topic in applications of terrestrial laser scanning technology. Sampling-based algorithms in the random sample consensus (RANSAC) model have emerged as mainstream solutions to address coarse registration problems. In this paper, we propose a novel combined solution to automatically align two markerless point clouds from building scenes. Firstly, the method segments non-ground points from ground points. Secondly, the proposed method detects feature points from each cross section and then obtains semantic keypoints by connecting feature points with specific rules. Finally, the detected semantic keypoints from two point clouds act as inputs to a modified 4PCS algorithm. Examples are presented and the results compared with those of K-4PCS to demonstrate the main contributions of the proposed method, which are the extension of the original 4PCS to handle heavy datasets and the use of semantic keypoints to improve K-4PCS in relation to registration accuracy and computational efficiency.
Factors governing the total rainfall yield from continental convective clouds
NASA Technical Reports Server (NTRS)
Rosenfeld, Daniel; Gagin, Abraham
1989-01-01
Several important factors that govern the total rainfall from continental convective clouds were investigated by tracking thousands of convective cells in Israel and South Africa. The rainfall volume yield (Rvol) of the individual cells that build convective rain systems has been shown to depend mainly on the cloud-top height. There is, however, considerable variability in this relationship. The following factors that influence the Rvol were parameterized and quantitatively analyzed: (1) cloud base temperature, (2) atmospheric instability, and (3) the extent of isolation of the cell. It is also shown that strong low-level forcing increases the duration and Rvol of clouds reaching the same vertical extent.
NASA Astrophysics Data System (ADS)
Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia
2018-05-01
Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient method for 3D building modeling. By combining the advantages of 3D PCD and two-dimensional optical images, this study describes the creation of a highly accurate building facade feature extraction method from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping method between the image features and 3D PCD, and optimization of the initial 3D PCD facade features considering structural information. Results show that the new method can extract the 3D PCD facade features of buildings more accurately and continuously. The new method is validated using a case study. In addition, the effectiveness of the new method is demonstrated by comparing it with the range image-extraction method and the optical image-extraction method in the absence of structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.
Cloud-Based Virtual Laboratory for Network Security Education
ERIC Educational Resources Information Center
Xu, Le; Huang, Dijiang; Tsai, Wei-Tek
2014-01-01
Hands-on experiments are essential for computer network security education. Existing laboratory solutions usually require significant effort to build, configure, and maintain and often do not support reconfigurability, flexibility, and scalability. This paper presents a cloud-based virtual laboratory education platform called V-Lab that provides a…
Reconciliation of the cloud computing model with US federal electronic health record regulations
2011-01-01
Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing. PMID:21727204
Reconciliation of the cloud computing model with US federal electronic health record regulations.
Schweitzer, Eugene J
2012-01-01
Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing.
NASA Astrophysics Data System (ADS)
Sun, Junqiang; Madhavan, S.; Wang, M.
2016-09-01
The MODerate resolution Imaging Spectroradiometer (MODIS), a remarkable heritage sensor in NASA's Earth Observing System fleet, is in orbit on two spacecraft: the Terra (T) and Aqua (A) platforms, which observe the Earth in the morning and afternoon orbits, respectively. T-MODIS has continued to operate for over 15 years, easily surpassing its 6-year design lifetime on orbit. Of the several science products derived from MODIS, one of the primary derivatives is the MODIS Cloud Mask (MOD035). The cloud mask algorithm incorporates several MODIS channels at both reflective and thermal infrared wavelengths to distinguish cloudy pixels from clear sky. Two of the thermal infrared channels used in detecting clouds are the 6.7 μm and 8.5 μm channels. Based on a difference threshold with the 11 μm channel, the 6.7 μm channel helps identify thick high clouds, while the 8.5 μm channel is useful for identifying thin clouds. Starting in 2010, it was observed in the cloud mask products that several pixels were misclassified due to changes in the thermal band radiometry. The long-term radiometric changes in these thermal channels have been attributed to electronic crosstalk contamination. In this paper, the improvement in cloud detection using the 6.7 μm and 8.5 μm channels with the electronic crosstalk correction is demonstrated. The electronic crosstalk phenomenon was analyzed and characterized using the regular lunar observations of MODIS, as reported in several works. The results presented in this paper should significantly help improve the MOD035 product, maintaining the long-term T-MODIS dataset, which is important for global change monitoring.
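The use of the 6.7 μm water-vapour channel against the 11 μm window channel can be illustrated with a toy brightness-temperature-difference test: thick high clouds push BT(6.7 μm) − BT(11 μm) toward zero, while clear sky leaves it strongly negative. The threshold and function below are illustrative, not the operational cloud-mask values:

```python
def is_high_cloud(bt_6um7, bt_11um, threshold=-25.0):
    """Toy brightness-temperature-difference (BTD) test. Over clear sky
    the 6.7 um channel sees the cold upper troposphere while 11 um sees
    the warm surface, so the difference is strongly negative; a thick
    high cloud makes both channels cold and the difference small.
    `threshold` (in kelvin) is an assumed cutoff."""
    return (bt_6um7 - bt_11um) > threshold

# thick high cloud: both channels cold, small difference
flag_cloud = is_high_cloud(220.0, 230.0)
# clear sky: warm surface at 11 um, cold water vapour at 6.7 um
flag_clear = is_high_cloud(240.0, 290.0)
```

A crosstalk-induced bias in either channel shifts this difference directly, which is why the radiometric drift described above misclassifies pixels.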
a Fast and Flexible Method for Meta-Map Building for Icp Based Slam
NASA Astrophysics Data System (ADS)
Kurian, A.; Morin, K. W.
2016-06-01
Recent developments in LiDAR sensors make mobile mapping fast and cost effective. These sensors generate a large amount of data, which in turn improves the coverage and detail of the map. Due to the limited range of the sensor, one has to collect a series of scans to build the entire map of the environment. With good GNSS coverage, building a map is a well-addressed problem. But in an indoor environment, GNSS reception is limited and an inertial solution, if available, can quickly diverge. In such situations, simultaneous localization and mapping (SLAM) is used to generate a navigation solution and map concurrently. SLAM using point clouds poses a number of computational challenges even on modern hardware due to the sheer amount of data. In this paper, we propose two strategies for minimizing the cost of computation and storage when a 3D point cloud is used for navigation and real-time map building. We have used the 3D point cloud generated by Leica Geosystems' Pegasus Backpack, which is equipped with Velodyne VLP-16 LiDAR scanners. To improve the speed of the conventional iterative closest point (ICP) algorithm, we propose a point cloud sub-sampling strategy which does not throw away any key features and yet significantly reduces the number of points that need to be processed and stored. To speed up the correspondence-finding step, a dual kd-tree and circular buffer architecture is proposed. We show that the proposed method can run in real time and has excellent navigation accuracy characteristics.
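For comparison, the conventional baseline the paper improves on — voxel-grid sub-sampling that keeps one centroid per voxel — can be sketched in a few lines. The paper's feature-preserving strategy is more selective than this, so treat the snippet only as the standard starting point (voxel size is an assumption):

```python
def voxel_subsample(points, size=0.05):
    """Keep one representative point (the centroid) per occupied voxel,
    a common sub-sampling step before ICP. Unlike the paper's strategy,
    this plain version may discard sharp features."""
    buckets = {}
    for p in points:
        key = tuple(int(c // size) for c in p)   # voxel index per axis
        buckets.setdefault(key, []).append(p)
    return [tuple(sum(c) / len(ps) for c in zip(*ps))
            for ps in buckets.values()]

# two nearby points collapse to one centroid; the distant point survives
pts = [(0.0, 0.0, 0.0), (0.01, 0.02, 0.0), (1.0, 1.0, 1.0)]
sampled = voxel_subsample(pts)
```

Each retained point would then be matched to the map via the kd-tree in ICP's correspondence step, which is what the proposed dual kd-tree and circular buffer accelerate.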
NASA Astrophysics Data System (ADS)
Wang, Xi Vincent; Wang, Lihui
2017-08-01
Cloud computing is the new enabling technology that offers centralised computing, flexible data storage and scalable services. In the manufacturing context, it is possible to utilise the Cloud technology to integrate and provide industrial resources and capabilities in terms of Cloud services. In this paper, a function block-based integration mechanism is developed to connect various types of production resources. A Cloud-based architecture is also deployed to offer a service pool which maintains these resources as production services. The proposed system provides a flexible and integrated information environment for the Cloud-based production system. As a specific type of manufacturing, Waste Electrical and Electronic Equipment (WEEE) remanufacturing experiences difficulties in system integration, information exchange and resource management. In this research, WEEE is selected as the example of Internet of Things to demonstrate how the obstacles and bottlenecks are overcome with the help of Cloud-based informatics approach. In the case studies, the WEEE recycle/recovery capabilities are also integrated and deployed as flexible Cloud services. Supporting mechanisms and technologies are presented and evaluated towards the end of the paper.
Dynamics of charge clouds ejected from laser-induced warm dense gold nanofilms
Zhou, Jun; Li, Junjie; Correa, Alfredo A.; ...
2014-10-24
We report the first systematic study of the ejected charge dynamics surrounding laser-produced 30-nm warm dense gold films using single-shot femtosecond electron shadow imaging and deflectometry. The results reveal a two-step dynamical process of the ejected electrons under the high pump fluence conditions: an initial emission and accumulation of a large amount of electrons near the pumped surface region followed by the formation of hemispherical clouds of electrons on both sides of the film, which are escaping into the vacuum at a nearly isotropic and constant velocity with an unusually high kinetic energy of more than 300 eV. We also developed a model of the escaping charge distribution that not only reproduces the main features of the observed charge expansion dynamics but also allows us to extract the number of ejected electrons remaining in the cloud.
Dynamics of charge clouds ejected from laser-induced warm dense gold nanofilms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Jun; Li, Junjie; Correa, Alfredo A.
We report the first systematic study of the ejected charge dynamics surrounding laser-produced 30-nm warm dense gold films using single-shot femtosecond electron shadow imaging and deflectometry. The results reveal a two-step dynamical process of the ejected electrons under the high pump fluence conditions: an initial emission and accumulation of a large amount of electrons near the pumped surface region followed by the formation of hemispherical clouds of electrons on both sides of the film, which are escaping into the vacuum at a nearly isotropic and constant velocity with an unusually high kinetic energy of more than 300 eV. We also developed a model of the escaping charge distribution that not only reproduces the main features of the observed charge expansion dynamics but also allows us to extract the number of ejected electrons remaining in the cloud.
A Diffusion Cloud Chamber Study of Very Slow Mesons. II. Beta Decay of the Muon
DOE R&D Accomplishments Database
Lederman, L. M.; Sargent, C. P.; Rinehart, M.; Rogers, K.
1955-03-01
The spectrum of electrons arising from the decay of the negative mu meson has been determined. The muons are arrested in the gas of a high-pressure, hydrogen-filled diffusion cloud chamber. The momenta of the decay electrons are determined from their curvature in a magnetic field of 7750 gauss. The spectrum of 415 electrons has been analyzed according to the theory of Michel.
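The momentum determination from track curvature follows the standard relation p[GeV/c] ≈ 0.2998·B[T]·r[m]. A quick sanity check (our own numbers, not the paper's tabulated values) shows that an electron near the ~52.8 MeV/c Michel endpoint bends with a radius of roughly 23 cm in a 7750 gauss field:

```python
def momentum_mev_per_c(b_gauss, radius_m):
    """Charged-particle momentum from track curvature radius in a
    magnetic field, using p[GeV/c] = 0.2998 * B[T] * r[m];
    result returned in MeV/c."""
    b_tesla = b_gauss * 1e-4          # 1 gauss = 1e-4 tesla
    return 299.79 * b_tesla * radius_m

# an assumed ~22.7 cm radius for a near-endpoint decay electron
p = momentum_mev_per_c(7750.0, 0.227)   # ~52.7 MeV/c
```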
3D modeling of building indoor spaces and closed doors from imagery and point clouds.
Díaz-Vilariño, Lucía; Khoshelham, Kourosh; Martínez-Sánchez, Joaquín; Arias, Pedro
2015-02-03
3D models of indoor environments are increasingly gaining importance due to the wide range of applications in which they can be used: from redesign and visualization to monitoring and simulation. These models usually exist only for newly constructed buildings; therefore, the development of automatic approaches for reconstructing 3D indoor models from imagery and/or point clouds can make the process easier, faster and cheaper. Among the constructive elements defining a building interior, doors are very common, and their detection can be very useful for understanding the environment structure, performing efficient navigation, or planning appropriate evacuation routes. The fact that doors are topologically connected to walls by being coplanar, together with the unavoidable presence of clutter and occlusions indoors, increases the inherent complexity of automating the recognition process. In this work, we present a pipeline of techniques for the reconstruction and interpretation of building interiors based on point clouds and images. The methodology analyses the visibility problem of indoor environments and treats door candidate detection in depth. The presented approach is tested on real data sets, showing its potential with a high door detection rate and applicability for robust and efficient envelope reconstruction.
NASA Technical Reports Server (NTRS)
Spann, J. F., Jr.; Germany, G. A.; Brittnacher, M. J.; Parks, G. K.; Elsen, R.
1997-01-01
The January 10-11, 1997 magnetic cloud event provided a rare opportunity to study auroral energy deposition under varying but intense IMF conditions. The Wind spacecraft located about 100 RE upstream monitored the IMF and plasma parameters during the passing of the cloud. The Polar Ultraviolet Imager (UVI) observed the auroral precipitation during the first encounter of the cloud with Earth's magnetosphere and during several subsequent substorm events. The UVI has the unique capability of measuring the energy flux and characteristic energy of the precipitating electrons through the use of narrow band filters that distinguish short and long wavelength molecular nitrogen emissions. The spatial and temporal characteristics of the precipitating electron energy will be discussed beginning with the inception of the event at the Earth early January 10th and continuing through the subsidence of auroral activity on January 11th.
Grammar-Supported 3d Indoor Reconstruction from Point Clouds for As-Built Bim
NASA Astrophysics Data System (ADS)
Becker, S.; Peter, M.; Fritsch, D.
2015-03-01
The paper presents a grammar-based approach for the robust automatic reconstruction of 3D interiors from raw point clouds. The core of the approach is a 3D indoor grammar which is an extension of our previously published grammar concept for the modeling of 2D floor plans. The grammar allows for the modeling of buildings whose horizontal, continuous floors are traversed by hallways providing access to the rooms as it is the case for most office buildings or public buildings like schools, hospitals or hotels. The grammar is designed in such way that it can be embedded in an iterative automatic learning process providing a seamless transition from LOD3 to LOD4 building models. Starting from an initial low-level grammar, automatically derived from the window representations of an available LOD3 building model, hypotheses about indoor geometries can be generated. The hypothesized indoor geometries are checked against observation data - here 3D point clouds - collected in the interior of the building. The verified and accepted geometries form the basis for an automatic update of the initial grammar. By this, the knowledge content of the initial grammar is enriched, leading to a grammar with increased quality. This higher-level grammar can then be applied to predict realistic geometries to building parts where only sparse observation data are available. Thus, our approach allows for the robust generation of complete 3D indoor models whose quality can be improved continuously as soon as new observation data are fed into the grammar-based reconstruction process. The feasibility of our approach is demonstrated based on a real-world example.
76 FR 323 - Information Systems Technical Advisory Committee; Notice of Partially Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-04
...), Building 33, Cloud Room, 53560 Hull Street, San Diego, California 92152. The Committee advises the Office... and Relation to Category 3 5. Godson Microprocessor Project 6. Autonomous Vehicle Project 7. Cloud Computing, Technology and Security Issues Thursday, January 27 Closed Session 8. Discussion of matters...
2011-03-31
CAPE CANAVERAL, Fla. – At NASA's Kennedy Space Center in Florida, an ominous thunderstorm cloud hovers over the Vehicle Assembly Building in the Launch Complex 39 area. Severe storms associated with a frontal system are moving through Central Florida, producing strong winds, heavy rain, frequent lightning and even funnel clouds. Photo credit: NASA/Kim Shiflett
2011-03-31
CAPE CANAVERAL, Fla. – At NASA's Kennedy Space Center in Florida, dark clouds hover over the Vehicle Assembly Building in the Launch Complex 39 area. Severe storms associated with a frontal system are moving through Central Florida, producing strong winds, heavy rain, frequent lightning and even funnel clouds. Photo credit: NASA/Jack Pfaller
NASA Astrophysics Data System (ADS)
Feng, Bing
Electron cloud instabilities have been observed in many circular accelerators around the world and have raised concerns for future accelerators and possible upgrades. In this thesis, electron cloud instabilities are studied with the quasi-static particle-in-cell (PIC) code QuickPIC. Modeling in three dimensions the long-timescale propagation of a beam through electron clouds in circular accelerators requires faster and more efficient simulation codes. Thousands of processors are readily available for parallel computation. However, it is not straightforward to increase the effective speed of the simulation by running the same problem size on an ever larger number of processors, because there is a limit to the domain size in the decomposition of the two-dimensional part of the code. A pipelining algorithm applied to the fully parallelized particle-in-cell code QuickPIC is implemented to overcome this limit. The pipelining algorithm uses multiple groups of processors and optimizes the job allocation on the processors in parallel computing. With this novel algorithm, it is possible to use on the order of 10² processors, and to expand the scale and the speed of the simulation with QuickPIC by a similar factor. In addition to the efficiency improvement from the pipelining algorithm, the fidelity of QuickPIC is enhanced by adding two physics models: the beam space-charge effect and the dispersion effect. Simulation of two specific circular machines is performed with the enhanced QuickPIC. First, the proposed upgrade to the Fermilab Main Injector is studied with an eye toward guiding the design of the upgrade and validating the code. Moderate emittance growth is observed for the fivefold increase in bunch population foreseen by the upgrade, but the simulation also shows that increasing the beam energy from 8 GeV to 20 GeV or above can effectively limit the emittance growth.
Then the enhanced QuickPIC is used to simulate the electron cloud effect on the electron beam in the Cornell Energy Recovery Linac (ERL), which is of concern because of the extremely small emittance and high peak currents anticipated in the machine. A tune shift is discovered in the simulation; however, emittance growth of the electron beam in the electron cloud is not observed for ERL parameters.
Capabilities and Advantages of Cloud Computing in the Implementation of Electronic Health Record.
Ahmadi, Maryam; Aslani, Nasim
2018-01-01
With regard to the high cost of the Electronic Health Record (EHR), the use of new technologies, in particular cloud computing, has increased in recent years. The purpose of this study was to systematically review the studies conducted in the field of cloud computing. The present study was a systematic review conducted in 2017. Searches were performed in the Scopus, Web of Science, IEEE, PubMed and Google Scholar databases using keyword combinations. Of the 431 articles initially retrieved, 27 were selected for review after applying the inclusion and exclusion criteria. Data were gathered with a purpose-built checklist and analyzed by the content analysis method. The findings of this study showed that cloud computing is a very widespread technology. It covers domains such as cost, security and privacy, scalability, mutual performance and interoperability, implementation platform and independence of cloud computing, search and exploration capability, error reduction and quality improvement, structure, flexibility and sharing ability, and it can be effective for the electronic health record. According to the findings of the present study, the capabilities of cloud computing are useful in implementing EHRs in a variety of contexts. Cloud computing also provides wide opportunities for managers, analysts and providers of health information systems. Considering the advantages and domains of cloud computing in the establishment of EHRs, the use of this technology is recommended.
A Multi-Frequency Wide-Swath Spaceborne Cloud and Precipitation Imaging Radar
NASA Technical Reports Server (NTRS)
Li, Lihua; Racette, Paul; Heymsfield, Gary; McLinden, Matthew; Venkatesh, Vijay; Coon, Michael; Perrine, Martin; Park, Richard; Cooley, Michael; Stenger, Pete;
2016-01-01
Microwave and millimeter-wave radars have proven their effectiveness in cloud and precipitation observations. The NASA Earth Science Decadal Survey (DS) Aerosol, Cloud and Ecosystems (ACE) mission calls for a dual-frequency cloud radar (W-band 94 GHz and Ka-band 35 GHz) for global measurements of cloud microphysical properties. Recently, there have been discussions of utilizing a tri-frequency (Ku/Ka/W-band) radar for a combined ACE and Global Precipitation Measurement (GPM) follow-on mission that has evolved into the Cloud and Precipitation Process Mission (CaPPM) concept. In this presentation we will give an overview of the technology development efforts at the NASA Goddard Space Flight Center (GSFC) and at Northrop Grumman Electronic Systems (NGES) through projects funded by the NASA Earth Science Technology Office (ESTO) Instrument Incubator Program (IIP). The primary objective of this research is to advance the key enabling technologies for a tri-frequency (Ku/Ka/W-band) shared-aperture spaceborne imaging radar to provide unprecedented, simultaneous multi-frequency measurements that will enhance understanding of the effects of clouds and precipitation, and their interactions, on Earth's climate. Research effort has been focused on concept design and trade studies of the tri-frequency radar; investigating architectures that provide tri-band shared-aperture capability; advancing the development of the Ka-band active electronically scanned array (AESA) transmit/receive (T/R) module; and development of the advanced radar backend electronics.
Dust particle radial confinement in a dc glow discharge.
Sukhinin, G I; Fedoseev, A V; Antipov, S N; Petrov, O F; Fortov, V E
2013-01-01
A self-consistent nonlocal model of the positive column of a dc glow discharge with dust particles is presented. Radial distributions of plasma parameters and the dust component in an axially homogeneous glow discharge are considered. The model is based on the solution of a nonlocal Boltzmann equation for the electron energy distribution function, drift-diffusion equations for ions, and the Poisson equation for a self-consistent electric field. The radial distribution of dust particle density in a dust cloud was fixed as a given steplike function or was chosen according to an equilibrium Boltzmann distribution. The balance of electron and ion production in argon ionization by an electron impact and their losses on the dust particle surface and on the discharge tube walls is taken into account. The interrelation of discharge plasma and the dust cloud is studied in a self-consistent way, and the radial distributions of the discharge plasma and dust particle parameters are obtained. It is shown that the influence of the dust cloud on the discharge plasma has a nonlocal behavior, e.g., density and charge distributions in the dust cloud substantially depend on the plasma parameters outside the dust cloud. As a result of a self-consistent evolution of plasma parameters to equilibrium steady-state conditions, ionization and recombination rates become equal to each other, electron and ion radial fluxes become equal to zero, and the radial component of electric field is expelled from the dust cloud.
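One ingredient of such a self-consistent treatment is the solve for the radial electric field and potential from a given charge density; a minimal sketch using Gauss's law on a uniform radial grid (the grid resolution, charge profile, and boundary choice phi(r_max) = 0 are illustrative assumptions, not the paper's model):

```python
import numpy as np

def radial_poisson(rho, r_max, eps0=8.854e-12):
    """Field E(r) and potential phi(r) of an axially uniform charge density
    rho(r) [C/m^3] in a cylinder, with phi = 0 at the wall r = r_max."""
    n = len(rho)
    r = np.linspace(0.0, r_max, n)
    dr = r[1] - r[0]
    # Gauss's law: E(r) = (1 / (eps0 * r)) * integral_0^r rho(r') r' dr'
    q = np.cumsum(rho * r) * dr
    E = np.zeros(n)
    E[1:] = q[1:] / (eps0 * r[1:])
    # Potential from phi' = -E, shifted so that phi(r_max) = 0
    phi = -np.cumsum(E) * dr
    phi -= phi[-1]
    return r, E, phi
```

For a uniform charge density the center-to-wall potential drop approaches the analytic value rho·r_max²/(4·eps0) as the grid is refined.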
THE LAUNCHING OF COLD CLOUDS BY GALAXY OUTFLOWS. II. THE ROLE OF THERMAL CONDUCTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brüggen, Marcus; Scannapieco, Evan
2016-05-01
We explore the impact of electron thermal conduction on the evolution of radiatively cooled cold clouds embedded in flows of hot and fast material, as occurs in outflowing galaxies. Performing a parameter study of three-dimensional adaptive mesh refinement hydrodynamical simulations, we show that electron thermal conduction causes cold clouds to evaporate, but it can also extend their lifetimes by compressing them into dense filaments. We distinguish between low-column-density clouds, which are disrupted on very short times, and high-column-density clouds with much longer disruption times that are set by a balance between impinging thermal energy and evaporation. We provide fits to the cloud lifetimes and velocities that can be used in galaxy-scale simulations of outflows in which the evolution of individual clouds cannot be modeled with the required resolution. Moreover, we show that the clouds are only accelerated to a small fraction of the ambient velocity because compression by evaporation causes the clouds to present a small cross-section to the ambient flow. This means that either magnetic fields must suppress thermal conduction, or that the cold clouds observed in galaxy outflows are not formed of cold material carried out from the galaxy.
Physical conditions in CaFe interstellar clouds
NASA Astrophysics Data System (ADS)
Gnaciński, P.; Krogulec, M.
2008-01-01
Interstellar clouds that exhibit strong Ca I and Fe I lines are called CaFe clouds. Ionisation equilibrium equations were used to model the column densities of Ca II, Ca I, K I, Na I, Fe I and Ti II in CaFe clouds. We find that the chemical composition of CaFe clouds is solar and that there is no depletion into dust grains. CaFe clouds have high electron densities, n_e ≈ 1 cm⁻³, that lead to high column densities of neutral Ca and Fe.
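The inference behind such estimates is photoionisation equilibrium, n(X I)·Γ = n(X II)·n_e·α(T), so the neutral-to-ion column density ratio scales linearly with electron density; a minimal sketch, with placeholder rate values rather than the paper's:

```python
# Photoionisation equilibrium for a trace species X:
#   n(X I) * Gamma = n(X II) * n_e * alpha(T)
# so the electron density follows from the observed column-density ratio.
def electron_density(ratio_neutral_to_ion, alpha_rec, gamma_ion):
    """n_e [cm^-3]; alpha_rec [cm^3 s^-1] (recombination rate coefficient)
    and gamma_ion [s^-1] (photoionisation rate) are assumed inputs."""
    return ratio_neutral_to_ion * gamma_ion / alpha_rec

# Illustrative placeholder rates (not measured values):
n_e = electron_density(ratio_neutral_to_ion=0.01,
                       alpha_rec=1e-11, gamma_ion=1e-9)
```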
Evidence in Magnetic Clouds for Systematic Open Flux Transport on the Sun
NASA Technical Reports Server (NTRS)
Crooker, N. U.; Kahler, S. W.; Gosling, J. T.; Lepping, R. P.
2008-01-01
Most magnetic clouds encountered by spacecraft at 1 AU display a mix of unidirectional suprathermal electrons signaling open field lines and counterstreaming electrons signaling loops connected to the Sun at both ends. Assuming the open fields were originally loops that underwent interchange reconnection with open fields at the Sun, we determine the sense of connectedness of the open fields found in 72 of 97 magnetic clouds identified by the Wind spacecraft in order to obtain information on the location and sense of the reconnection and resulting flux transport at the Sun. The true polarity of the open fields in each magnetic cloud was determined from the direction of the suprathermal electron flow relative to the magnetic field direction. Results indicate that the polarity of all open fields within a given magnetic cloud is the same 89% of the time, implying that interchange reconnection at the Sun most often occurs in only one leg of a flux rope loop, thus transporting open flux in a single direction, from a coronal hole near that leg to the footpoint of the opposite leg. This pattern is consistent with the view that interchange reconnection in coronal mass ejections systematically transports an amount of open flux sufficient to reverse the polarity of the heliospheric field through the course of the solar cycle. Using the same electron data, we also find that the fields encountered in magnetic clouds are only a third as likely to be locally inverted as not. While one might expect inversions to be equally as common as not in flux rope coils, consideration of the geometry of spacecraft trajectories relative to the modeled magnetic cloud axes leads us to conclude that the result is reasonable.
Building environment assessment and energy consumption estimation using smart phones
NASA Astrophysics Data System (ADS)
Li, Xiangli; Zhang, Li; Jia, Yingqi; Wang, Zihan; Jin, Xin; Zhao, Xuefeng
2017-04-01
In this paper, an app for building indoor environment evaluation and energy consumption estimation based on the Android platform is proposed and implemented. While the app is in use, the smartphone's built-in sensors are called on for real-time monitoring of building environmental information such as temperature, humidity and noise; a built-in algorithm calculates heat and power consumption; and questionnaires, grading and other methods are used to provide feedback to the space-heating system. In addition, with the application of big data and cloud technology, the data collected by users are uploaded to the cloud. From statistics on the uploaded data, regional differences can be obtained, thus providing a more accurate basis for macro-control and research on energy, thermal comfort and the greenhouse effect.
Cloudweaver: Adaptive and Data-Driven Workload Manager for Generic Clouds
NASA Astrophysics Data System (ADS)
Li, Rui; Chen, Lei; Li, Wen-Syan
Cloud computing denotes the latest trend in application development for parallel computing on massive data volumes. It relies on clouds of servers to handle tasks that used to be managed by an individual server. With cloud computing, software vendors can provide business intelligence and data analytic services for internet-scale data sets. Many open source projects, such as Hadoop, offer various software components that are essential for building a cloud infrastructure. Current Hadoop (and many other frameworks) requires users to configure the cloud infrastructure via programs and APIs, and such configuration is fixed at runtime. In this chapter, we propose a workload manager (WLM), called CloudWeaver, which provides automated configuration of a cloud infrastructure for runtime execution. The workload management is data-driven and can adapt to the dynamic nature of operator throughput during different execution phases. CloudWeaver works for a single job and for a workload consisting of multiple jobs running concurrently, and aims at maximum throughput using a minimum set of processors.
NASA Astrophysics Data System (ADS)
Casula, Giuseppe; Fais, Silvana; Giovanna Bianchi, Maria; Cuccuru, Francesco; Ligas, Paola
2015-04-01
The Terrestrial Laser Scanner (TLS) is a modern contactless non-destructive technique (NDT) useful to 3D-model complex-shaped objects with a few hours' field survey. A TLS survey produces very dense point clouds made up of coordinates of point and radiometric information given by the reflectivity parameter i.e. the ratio between the amount of energy emitted by the sensor and the energy reflected by the target object. Modern TLSs used in architecture are phase instruments where the phase difference obtained by comparing the emitted laser pulse with the reflected one is proportional to the sensor-target distance expressed as an integer multiple of the half laser wavelength. TLS data are processed by registering point clouds i.e. by referring them to the same reference frame and by aggregation after a fine registration procedure. The resulting aggregate point cloud can be compared with graphic primitives as single or multiple planes, cylinders or spheres, and the resulting residuals give a morphological map that affords information about the state of conservation of the building materials used in historical or modern buildings, in particular when compared with other NDT techniques. In spite of its great productivity, the TLS technique is limited in that it is unable to penetrate the investigated materials. For this reason both the 3D residuals map and the reflectivity map need to be correlated with the results of other NDT techniques such as the ultrasonic method, and a complex study of the composition of building materials is also necessary. 
The application of a methodology useful for evaluating the quality of stone building materials and locating altered or damaged zones is presented in this study, based on the integrated application of three independent techniques: two non-destructive ones, the TLS and ultrasonic techniques in the 24-54 kHz range, and a third to analyze the petrographical characteristics of the stone materials, mainly the texture, with optical and scanning electron microscopy (SEM). A very interesting case study is presented on a carbonate stone door of great architectural and historical interest, well suited to a high-definition survey. This architectural element is inside the "Palazzo di Città" museum in the historical center of the Town of Cagliari, Sardinia (Italy). The integrated application of TLS and in situ and laboratory ultrasonic techniques, enhanced by knowledge of the petrographic characteristics of the rocks, improves the diagnostic process and affords reliable information on the state of conservation of the stones used to build it. The integrated use of the above non-destructive techniques also provides suitable data for a possible restoration and future preservation. Acknowledgments: This work was financially supported by the Sardinian Local Administration (RAS - LR 7, August 2007, n. 7, Promotion of Scientific Research and Innovation in Sardinia - Italy; Responsible Scientist: S. Fais).
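The phase-to-range principle described for phase-shift TLS instruments — the sensor-target distance expressed as an integer number of half wavelengths plus the measured phase fraction — can be sketched as follows (the wavelength and cycle count are illustrative, not instrument values):

```python
import math

# Phase-shift ranging: the sensor-target distance is
#   d = (n + delta_phi / (2*pi)) * (wavelength / 2)
# where n is the integer ambiguity resolved by the instrument.
def tls_distance(delta_phi_rad, wavelength_m, n_cycles):
    return (n_cycles + delta_phi_rad / (2.0 * math.pi)) * wavelength_m / 2.0
```

For example, a measured phase shift of pi radians with a 0.1 m modulation wavelength and 3 resolved cycles corresponds to 3.5 half wavelengths, i.e. 0.175 m.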
NASA Astrophysics Data System (ADS)
Dowling, J. A.; Burdett, N.; Greer, P. B.; Sun, J.; Parker, J.; Pichler, P.; Stanwell, P.; Chandra, S.; Rivest-Hénault, D.; Ghose, S.; Salvado, O.; Fripp, J.
2014-03-01
Our group have been developing methods for MRI-alone prostate cancer radiation therapy treatment planning. To assist with clinical validation of the workflow we are investigating a cloud platform solution for research purposes. Benefits of cloud computing can include increased scalability, performance and extensibility while reducing total cost of ownership. In this paper we demonstrate the generation of DICOM-RT directories containing an automatic average atlas based electron density image and fast pelvic organ contouring from whole pelvis MR scans.
An Examination of the Nature of Global MODIS Cloud Regimes
NASA Technical Reports Server (NTRS)
Oreopoulos, Lazaros; Cho, Nayeong; Lee, Dongmin; Kato, Seiji; Huffman, George J.
2014-01-01
We introduce global cloud regimes (previously also referred to as "weather states") derived from cloud retrievals that use measurements by the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument aboard the Aqua and Terra satellites. The regimes are obtained by applying clustering analysis on joint histograms of retrieved cloud top pressure and cloud optical thickness. By employing a compositing approach on data sets from satellites and other sources, we examine regime structural and thermodynamical characteristics. We establish that the MODIS cloud regimes tend to form in distinct dynamical and thermodynamical environments and have diverse profiles of cloud fraction and water content. When compositing radiative fluxes from the Clouds and the Earth's Radiant Energy System instrument and surface precipitation from the Global Precipitation Climatology Project, we find that regimes with a radiative warming effect on the atmosphere also produce the largest implied latent heat. Taken as a whole, the results of the study corroborate the usefulness of the cloud regime concept, reaffirm the fundamental nature of the regimes as appropriate building blocks for cloud system classification, clarify their association with standard cloud types, and underscore their distinct radiative and hydrological signatures.
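The regime derivation described — clustering analysis on joint histograms of cloud-top pressure and optical thickness — is commonly performed with k-means; a minimal sketch on synthetic data (the histogram binning, number of regimes, and data are illustrative assumptions, not the paper's configuration):

```python
import numpy as np

def cloud_regimes(histograms, k=8, iters=50, seed=0):
    """k-means over flattened joint histograms (one per grid cell).
    histograms: (n_cells, n_bins) array of normalized histograms."""
    rng = np.random.default_rng(seed)
    centroids = histograms[rng.choice(len(histograms), k, replace=False)]
    for _ in range(iters):
        # assign each cell to the nearest regime centroid (Euclidean distance)
        d = np.linalg.norm(histograms[:, None, :] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean histogram of its members
        for j in range(k):
            if (labels == j).any():
                centroids[j] = histograms[labels == j].mean(axis=0)
    return labels, centroids
```

Each resulting centroid is itself a joint histogram, which is what makes the regimes interpretable as characteristic cloud-top-pressure/optical-thickness patterns.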
2001-06-15
KENNEDY SPACE CENTER, Fla. -- Dark clouds and strong winds seem almost to touch the ground near the tow-way leading from the Shuttle Landing Facility (SLF). In the background (right) can be seen the new hangar at the SLF and the mate/demate device. The cloud formation is proceeding across the SLF towards the Vehicle Assembly Building.
Improving Indoor Air Quality in St. Cloud Schools.
ERIC Educational Resources Information Center
Forer, Mike; Haus, El
2000-01-01
Describes how the St. Cloud Area School District (Minnesota), using Tools for Schools provided by the U.S. Environmental Protection Agency, managed the improvement of their school building indoor air quality (IAQ). The district goals of the IAQ Management Committee and the policy elements used to maintain high classroom air quality are…
NASA Astrophysics Data System (ADS)
Hess, M. R.; Petrovic, V.; Kuester, F.
2017-08-01
Digital documentation of cultural heritage structures is increasingly more common through the application of different imaging techniques. Many works have focused on the application of laser scanning and photogrammetry techniques for the acquisition of three-dimensional (3D) geometry detailing cultural heritage sites and structures. With an abundance of these 3D data assets, there must be a digital environment where these data can be visualized and analyzed. Presented here is a feedback-driven visualization framework that seamlessly enables interactive exploration and manipulation of massive point cloud data. The focus of this work is on the classification of different building materials with the goal of building more accurate as-built information models of historical structures. User-defined functions have been tested within the interactive point cloud visualization framework to evaluate automated and semi-automated classification of 3D point data. These functions include decisions based on observed color, laser intensity, normal vector or local surface geometry. Multiple case studies are presented here to demonstrate the flexibility and utility of the presented point cloud visualization framework to achieve classification objectives.
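A user-defined classification rule of the kind described — a decision based on observed color and laser intensity — might look like the following sketch (the thresholds and material names are hypothetical placeholders, not the paper's functions):

```python
def classify_point(rgb, intensity):
    """Label one point from its (r, g, b) color and normalized laser
    intensity in [0, 1]. Thresholds are illustrative placeholders."""
    r, g, b = rgb
    if intensity > 0.8 and max(r, g, b) - min(r, g, b) < 20:
        return "plaster"      # bright, near-gray return
    if r > g > b and intensity < 0.5:
        return "brick"        # reddish, weakly reflective return
    return "unclassified"
```

In practice such rules would be tuned interactively inside the visualization framework and combined with normal-vector or local-geometry tests.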
Chemical Composition of Nebulosities in the Magellanic Clouds
Aller, L. H.; Czyzak, S. J.; Keyes, C. D.; Boeshaar, G.
1974-01-01
From photoelectric spectrophotometric data secured at Cerro Tololo Interamerican Observatory we have attempted to derive electron densities and temperatures, ionic concentrations, and chemical abundances of He, C, N, O, Ne, S, and Ar in nebulosities in the Magellanic Clouds. Although 10 distinct nebulosities were observed in the Small Cloud and 20 such objects in the Large Cloud, the most detailed observations were secured only for the brighter objects. Results for 30 Doradus are in harmony with those published previously and with recent work by Peimbert and Torres-Peimbert. Nitrogen and heavier elements appear to be less abundant in the Small Cloud than in the Large Cloud, in accordance with the conclusions of Dufour. A comparison with the Orion nebula suggests He, N, Ne, O, and S may all be less abundant in the Magellanic Clouds, although adequate evaluations will require construction of detailed models. For example, if we postulate that the [NII], [OII], and [SII] radiations originate primarily in regions with electron temperatures near 8000 K, while the [OIII], [NeIII], [ArIII], and H radiations are produced primarily in regions with T_e = 10,000 K, the derived chemical abundances in the clouds are enhanced. PMID:16592199
Photocathode Optimization for a Dynamic Transmission Electron Microscope: Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ellis, P; Flom, Z; Heinselman, K
2011-08-04
The Dynamic Transmission Electron Microscope (DTEM) team at Harvey Mudd College has been sponsored by LLNL to design and build a test setup for optimizing the performance of the DTEM's electron source. Unlike a traditional TEM, the DTEM achieves much faster exposure times by using photoemission from a photocathode to produce electrons for imaging. The DTEM team's work is motivated by the need to improve the coherence and current density of the electron cloud produced by the electron gun in order to increase the image resolution and contrast achievable by DTEM. The photoemission test setup is nearly complete and the team will soon complete baseline tests of electron gun performance. The photoemission laser and high voltage power supply have been repaired; the optics path for relaying the laser to the photocathode has been finalized, assembled, and aligned; the internal setup of the vacuum chamber has been finalized and mostly implemented; and system control, synchronization, and data acquisition have been implemented in LabVIEW. Immediate future work includes determining a consistent alignment procedure to place the laser waist on the photocathode, and taking baseline performance measurements of the tantalum photocathode. Future research will examine the performance of the electron gun as a function of the photoemission laser profile, the photocathode material, and the geometry and voltages of the accelerating and focusing components in the electron gun. This report presents the team's progress and outlines the work that remains.
A review on the state-of-the-art privacy-preserving approaches in the e-health clouds.
Abbas, Assad; Khan, Samee U
2014-07-01
Cloud computing is emerging as a new computing paradigm in the healthcare sector besides other business domains. Large numbers of health organizations have started shifting the electronic health information to the cloud environment. Introducing the cloud services in the health sector not only facilitates the exchange of electronic medical records among the hospitals and clinics, but also enables the cloud to act as a medical record storage center. Moreover, shifting to the cloud environment relieves the healthcare organizations of the tedious tasks of infrastructure management and also minimizes development and maintenance costs. Nonetheless, storing the patient health data in the third-party servers also entails serious threats to data privacy. Because of probable disclosure of medical records stored and exchanged in the cloud, the patients' privacy concerns should essentially be considered when designing the security and privacy mechanisms. Various approaches have been used to preserve the privacy of the health information in the cloud environment. This survey aims to encompass the state-of-the-art privacy-preserving approaches employed in the e-Health clouds. Moreover, the privacy-preserving approaches are classified into cryptographic and noncryptographic approaches and taxonomy of the approaches is also presented. Furthermore, the strengths and weaknesses of the presented approaches are reported and some open issues are highlighted.
Yang, Shu; Qiu, Yuyan; Shi, Bo
2016-09-01
This paper explores methods of building an Internet of Things for regional ECG monitoring, focusing on the implementation of an ECG monitoring center based on a cloud computing platform. It analyzes the implementation principles of automatic identification of arrhythmia types. It also studies the system architecture and key techniques of the cloud computing platform, including server load balancing, reliable storage of massive small files, and the implementation of a quick search function.
A hierarchical methodology for urban facade parsing from TLS point clouds
NASA Astrophysics Data System (ADS)
Li, Zhuqiang; Zhang, Liqiang; Mathiopoulos, P. Takis; Liu, Fangyu; Zhang, Liang; Li, Shuaipeng; Liu, Hao
2017-01-01
The effective and automated parsing of building facades from terrestrial laser scanning (TLS) point clouds of urban environments is an important research topic in the GIS and remote sensing fields. It is also challenging because of the complexity and great variety of 3D building facade layouts, as well as the noise and missing data in the input TLS point clouds. In this paper, we introduce a novel methodology for the accurate and computationally efficient parsing of urban building facades from TLS point clouds. The main novelty of the proposed methodology is that it is a systematic and hierarchical approach that considers, in an adaptive way, the semantic and underlying structures of urban facades for segmentation and subsequent accurate modeling. Firstly, the input point cloud is decomposed into depth planes based on a data-driven method; this layer decomposition enables similarity detection in each depth plane layer. Secondly, the labeling of facade elements is performed using an SVM classifier in combination with our proposed BieS-ScSPM algorithm. The labeling outcome is then augmented with weak architectural knowledge. Thirdly, least-squares-fitted normalized gray accumulative curves are applied to detect regular structures, and a binarization dilation extraction algorithm is used to partition facade elements. A dynamic line-by-line division is further applied to extract the boundaries of the elements. The 3D geometrical facade models are then reconstructed by optimizing facade elements across depth plane layers. We have evaluated the performance of the proposed method using several TLS facade datasets. Qualitative and quantitative performance comparisons with several other state-of-the-art methods dealing with the same facade parsing problem have demonstrated its superiority in performance and its effectiveness in improving segmentation accuracy.
Electron cloud generation and trapping in a quadrupole magnet at the Los Alamos proton storage ring
NASA Astrophysics Data System (ADS)
Macek, Robert J.; Browman, Andrew A.; Ledford, John E.; Borden, Michael J.; O'Hara, James F.; McCrady, Rodney C.; Rybarcyk, Lawrence J.; Spickermann, Thomas; Zaugg, Thomas J.; Pivi, Mauro T. F.
2008-01-01
Recent beam physics studies on the two-stream e-p instability at the LANL proton storage ring (PSR) have focused on the role of the electron cloud generated in quadrupole magnets where primary electrons, which seed beam-induced multipacting, are expected to be largest due to grazing angle losses from the beam halo. A new diagnostic to measure electron cloud formation and trapping in a quadrupole magnet has been developed, installed, and successfully tested at PSR. Beam studies using this diagnostic show that the “prompt” electron flux striking the wall in a quadrupole is comparable to the prompt signal in the adjacent drift space. In addition, the “swept” electron signal, obtained using the sweeping feature of the diagnostic after the beam was extracted from the ring, was larger than expected and decayed slowly with an exponential time constant of 50 to 100μs. Other measurements include the cumulative energy spectra of prompt electrons and the variation of both prompt and swept electron signals with beam intensity. Experimental results were also obtained which suggest that a good fraction of the electrons observed in the adjacent drift space for the typical beam conditions in the 2006 run cycle were seeded by electrons ejected from the quadrupole.
Building Change Detection from LIDAR Point Cloud Data Based on Connected Component Analysis
NASA Astrophysics Data System (ADS)
Awrangjeb, M.; Fraser, C. S.; Lu, G.
2015-08-01
Building data are one of the important data types in a topographic database. Building change detection after a period of time is necessary for many applications, such as identification of informal settlements. Based on the detected changes, the database has to be updated to ensure its usefulness. This paper proposes an improved building detection technique, which is a prerequisite for many building change detection techniques. The improved technique examines the gap between neighbouring buildings in the building mask in order to avoid undersegmentation errors. Then, a new building change detection technique based on LIDAR point cloud data is proposed. Buildings which are totally new or demolished are directly added to the change detection output. For demolished or extended building parts, however, a connected component analysis algorithm is applied, and for each connected component its area, width, and height are estimated in order to ascertain whether it can be considered a demolished or new building part. Finally, a graphical user interface (GUI) has been developed to apply detected changes to the existing building map. Experimental results show that the improved building detection technique offers not only higher performance in terms of completeness and correctness, but also a lower number of undersegmentation errors compared to its original counterpart. The proposed change detection technique produces no omission errors and can thus be exploited for enhanced automated building information updating within a topographic database. Using the developed GUI, the user can quickly examine each suggested change and indicate his or her decision with a minimum number of mouse clicks.
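The connected component step described above can be sketched in a few lines: label 4-connected regions in a binary change mask, then keep only components whose area and width clear a threshold. This is an illustrative sketch; the grid labelling scheme and the `cell_size`, `min_area`, and `min_width` parameters are assumptions, not the paper's actual values.

```python
from collections import deque

def connected_components(mask):
    """Label 4-connected components in a binary grid (1 = changed cell).

    Returns a list of components, each a list of (row, col) cells.
    """
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    components = []
    next_label = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not labels[r][c]:
                next_label += 1
                cells = []
                queue = deque([(r, c)])
                labels[r][c] = next_label
                while queue:  # breadth-first flood fill
                    y, x = queue.popleft()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = next_label
                            queue.append((ny, nx))
                components.append(cells)
    return components

def classify_change(cells, cell_size=1.0, min_area=4.0, min_width=2.0):
    """Accept a component as a plausible demolished/new building part
    only if its area and narrow dimension exceed (hypothetical) thresholds."""
    ys = [y for y, _ in cells]
    xs = [x for _, x in cells]
    area = len(cells) * cell_size ** 2
    width = min(max(ys) - min(ys) + 1, max(xs) - min(xs) + 1) * cell_size
    return area >= min_area and width >= min_width
```

Small isolated components (single noisy cells) are rejected, while compact blocks large enough to be building parts pass the test.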
Cloud screening Coastal Zone Color Scanner images using channel 5
NASA Technical Reports Server (NTRS)
Eckstein, B. A.; Simpson, J. J.
1991-01-01
Clouds are removed from Coastal Zone Color Scanner (CZCS) data using channel 5. Instrumentation problems require pre-processing of channel 5 before an intelligent cloud-screening algorithm can be used. For example, at intervals of about 16 lines, the sensor records anomalously low radiances. Moreover, the calibration equation yields negative radiances when the sensor records zero counts, and pixels corrupted by electronic overshoot must also be excluded. The remaining pixels may then be used in conjunction with the procedure of Simpson and Humphrey to determine the CZCS cloud mask. These results plus in situ observations of phytoplankton pigment concentration show that pre-processing and proper cloud-screening of CZCS data are necessary for accurate satellite-derived pigment concentrations. This is especially true in the coastal margins, where pigment content is high and image distortion associated with electronic overshoot is also present. The pre-processing algorithm is critical to obtaining accurate global estimates of pigment from spacecraft data.
A cloud-based approach for interoperable electronic health records (EHRs).
Bahga, Arshdeep; Madisetti, Vijay K
2013-09-01
We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). The lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system, the cloud health information systems technology architecture (CHISTAR), that achieves semantic interoperability through a generic design methodology which uses a reference model defining a general-purpose set of data structures and an archetype model defining the clinical data attributes. CHISTAR application components are designed using the cloud component model approach, which comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.
Calculation of gyrosynchrotron radiation brightness temperature for outer bright loop of ICME
NASA Astrophysics Data System (ADS)
Sun, Weiying; Wu, Ji; Wang, C. B.; Wang, S.
The Solar Polar Orbit Radio Telescope (SPORT) is proposed to detect the high-density plasma clouds of the outer bright loops of ICMEs from a solar orbit with large inclination. Of particular interest is following the propagation of the plasma clouds with a remote sensor in the radio wavelength band. Gyrosynchrotron emission is a main radio radiation mechanism of the plasma clouds and can provide information about the interplanetary magnetic field. In this paper, we statistically analyze the electron density, electron temperature, and magnetic field of the background solar wind during quiet-Sun conditions and ICME propagation. We also estimate the fluctuation range of the electron density, electron temperature, and magnetic field of the outer bright loop of ICMEs. Moreover, we calculate and analyze the emission brightness temperature and degree of polarization on the basis of gyrosynchrotron emission, absorption, and polarization characteristics for optical depths less than or equal to 1.
Hybrid Automatic Building Interpretation System
NASA Astrophysics Data System (ADS)
Pakzad, K.; Klink, A.; Müterthies, A.; Gröger, G.; Stroh, V.; Plümer, L.
2011-09-01
HABIS (Hybrid Automatic Building Interpretation System) is a system for the automatic reconstruction of building roofs used in virtual 3D building models. Unlike most commercially available systems, HABIS is able to work automatically to a high degree. The hybrid method draws on different sources, exploiting the advantages of each: 3D point clouds usually provide good height and surface data, spatially high-resolution aerial images provide important edge information and details for roof objects like dormers or chimneys, and cadastral data provide essential information about building ground plans. HABIS works in a multi-stage process, which starts with a coarse roof classification based on 3D point clouds. It then continues with an image-based verification of these predicted roofs, followed by a final classification and adjustment of the roofs. In addition, roof objects such as dormers and chimneys are extracted from aerial images and added to the models. In this paper the methods used are described and some results are presented.
3D Reconstruction of Irregular Buildings and Buddha Statues
NASA Astrophysics Data System (ADS)
Zhang, K.; Li, M.-j.
2014-04-01
Three-dimensional laser scanning can acquire an object's surface data quickly and accurately; the post-processing of point clouds, however, is not yet mature and can be improved. Building on a study of 3D laser scanning technology, this paper describes solutions for modelling the irregular ancient buildings and Buddha statues in Jinshan Temple, covering data acquisition, modelling, and texture mapping. To model the irregular ancient buildings effectively, the structure of each building is extracted manually from the point cloud and the textures are mapped in 3ds Max. These methods combine 3D laser scanning technology with traditional modelling methods and greatly improve the efficiency and accuracy of restoring the ancient buildings. The statues, on the other hand, are modelled as objects in reverse engineering. The resulting digital models of the statues are not just vivid, but also accurate by surveying and mapping standards. On this basis, a 3D scene of Jinshan Temple is reconstructed, which demonstrates the validity of the solutions.
TLS for generating multi-LOD of 3D building model
NASA Astrophysics Data System (ADS)
Akmalia, R.; Setan, H.; Majid, Z.; Suwardhi, D.; Chong, A.
2014-02-01
Terrestrial Laser Scanners (TLS) are widely used to capture three-dimensional (3D) objects for various applications, and developments in 3D modelling have also led people to visualize the environment in 3D. Visualization of objects in a city environment in 3D can be useful for many applications; however, different applications require different kinds of 3D models. Since buildings are important objects, CityGML defines a standard for 3D building models at four levels of detail (LOD). In this research, the advantages of TLS for capturing buildings and the modelling process for the point cloud are explored. TLS is used to capture all building details in order to generate multiple LODs. In previous works this task usually involves the integration of several sensors; in this research, the point cloud from TLS alone is processed to generate the LOD3 model, and LOD2 and LOD1 are then generalized from the resulting LOD3 model. The result of this research is a guided process for generating multi-LOD 3D building models starting from LOD3 using TLS. Finally, the visualization of the multi-LOD model is also shown.
Automated Classification of Heritage Buildings for As-Built Bim Using Machine Learning Techniques
NASA Astrophysics Data System (ADS)
Bassier, M.; Vergauwen, M.; Van Genechten, B.
2017-08-01
Semantically rich three-dimensional models such as Building Information Models (BIMs) are increasingly used in digital heritage. They provide the required information to varying stakeholders during the different stages of the historic building's life cycle, which is crucial in the conservation process. The creation of as-built BIM models is based on point cloud data. However, manually interpreting this data is labour intensive and often leads to misinterpretations. By automatically classifying the point cloud, the information can be processed more efficiently. A key aspect in this automated scan-to-BIM process is the classification of building objects. In this research we look to automatically recognise elements in existing buildings to create compact semantic information models. Our algorithm efficiently extracts the main structural components such as floors, ceilings, roofs, walls and beams despite the presence of significant clutter and occlusions. More specifically, Support Vector Machines (SVMs) are proposed for the classification. The algorithm is evaluated using real data from a variety of existing buildings. The results prove that the classifier recognizes the objects with both high precision and recall. As a result, entire data sets are reliably labelled at once. The approach enables experts to better document and process heritage assets.
NASA Glenn Icing Research Tunnel: Upgrade and Cloud Calibration
NASA Technical Reports Server (NTRS)
VanZante, Judith Foss; Ide, Robert F.; Steen, Laura E.
2012-01-01
In 2011, NASA Glenn's Icing Research Tunnel underwent a major modification to its refrigeration plant and heat exchanger. This paper presents the results of the subsequent full cloud calibration. Details of the calibration procedure and results are presented herein. The steps include developing a nozzle transfer map, establishing a uniform cloud, conducting a drop sizing calibration and finally a liquid water content calibration. The goal of the calibration is to develop a uniform cloud, and to build a transfer map from the inputs of air speed, spray bar atomizing air pressure and water pressure to the outputs of median volumetric droplet diameter and liquid water content.
NASA Glenn Icing Research Tunnel: 2012 Cloud Calibration Procedure and Results
NASA Technical Reports Server (NTRS)
VanZante, Judith Foss; Ide, Robert F.; Steen, Laura E.
2012-01-01
In 2011, NASA Glenn's Icing Research Tunnel underwent a major modification to its refrigeration plant and heat exchanger. This paper presents the results of the subsequent full cloud calibration. Details of the calibration procedure and results are presented herein. The steps include developing a nozzle transfer map, establishing a uniform cloud, conducting a drop sizing calibration and finally a liquid water content calibration. The goal of the calibration is to develop a uniform cloud, and to build a transfer map from the inputs of air speed, spray bar atomizing air pressure and water pressure to the outputs of median volumetric droplet diameter and liquid water content.
NASA Astrophysics Data System (ADS)
Macher, H.; Grussenmeyer, P.; Landes, T.; Halin, G.; Chevrier, C.; Huyghe, O.
2017-08-01
The French collection of Plan-Reliefs, scale models of fortified towns, constitutes a precious testimony of the history of France. The aim of the URBANIA project is the valorisation and diffusion of this heritage through the creation of virtual models. The town scale model of Strasbourg at 1/600, currently exhibited in the Historical Museum of Strasbourg, was selected as a case study. In this paper, the photogrammetric recording of this scale model is first presented; the acquisition protocol as well as the data post-processing are detailed. Then, the modelling of the city, and more specifically of the building blocks, is investigated. Based on point clouds of the scale model, the extraction of roof elements is considered. This involves first segmenting the point cloud into building blocks; then, for each block, points belonging to roofs are identified and the extraction of chimney point clouds as well as roof ridges and roof planes is performed. Finally, the 3D parametric modelling of the building blocks is studied, taking roof polygons and polylines describing chimneys as input. As future work, the semantic enrichment and the potential usage scenarios of the scale model are envisaged.
A Weibull distribution accrual failure detector for cloud computing.
Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are a fundamental component used to build highly available distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several existing implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, is proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared using public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
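As a rough illustration of the accrual idea behind such a detector, the sketch below fits a Weibull scale parameter to recent heartbeat inter-arrival times (with an assumed fixed shape parameter) and reports a suspicion level that grows with the time since the last heartbeat. The class name, the fixed-shape simplification, and all parameter values are assumptions for illustration, not the paper's actual implementation.

```python
import math

class WeibullAccrualDetector:
    """Accrual failure detector sketch: suspicion phi = -log10 of the
    probability that the next heartbeat is still on its way, under a
    Weibull model of inter-arrival times."""

    def __init__(self, shape=1.5, window=100):
        self.shape = shape          # assumed fixed Weibull shape parameter k
        self.window = window        # number of recent intervals to keep
        self.intervals = []
        self.last_arrival = None

    def heartbeat(self, now):
        """Record a heartbeat arrival at time `now` (seconds)."""
        if self.last_arrival is not None:
            self.intervals.append(now - self.last_arrival)
            self.intervals = self.intervals[-self.window:]
        self.last_arrival = now

    def _scale(self):
        # Method-of-moments estimate of the scale lambda for fixed shape k:
        # E[T] = lambda * Gamma(1 + 1/k)  =>  lambda = mean / Gamma(1 + 1/k)
        mean = sum(self.intervals) / len(self.intervals)
        return mean / math.gamma(1.0 + 1.0 / self.shape)

    def phi(self, now):
        """Suspicion level at time `now`; higher means more likely failed."""
        if not self.intervals:
            return 0.0
        t = max(now - self.last_arrival, 0.0)
        lam = self._scale()
        survival = math.exp(-((t / lam) ** self.shape))  # Weibull survival fn
        return -math.log10(max(survival, 1e-300))
```

An application would mark a process as suspected once `phi` crosses a chosen threshold; the heavier-than-exponential tail control offered by the shape parameter is what lets the Weibull model track bursty cloud network delays.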
Developing cloud-based Business Process Management (BPM): a survey
NASA Astrophysics Data System (ADS)
Mercia; Gunawan, W.; Fajar, A. N.; Alianto, H.; Inayatulloh
2018-03-01
In today’s highly competitive business environment, modern enterprises struggle to cut unnecessary costs, eliminate waste, and deliver greater benefits to the organization, and they are increasingly turning to a more flexible IT environment to help them realize this goal. For this reason, the article examines cloud-based Business Process Management (BPM), which enables a focus on modeling, monitoring, and process management. Cloud-based BPM consists of business processes, business information, and IT resources, which help build real-time intelligence systems based on business management and cloud technology. Cloud computing is a paradigm that involves procuring dynamically scalable resources over the internet as an IT service. A cloud-based BPM service addresses common problems faced by traditional BPM, especially by promoting flexible, event-driven business processes that exploit opportunities in the marketplace.
Challenges in Securing the Interface Between the Cloud and Pervasive Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lagesse, Brent J
2011-01-01
Cloud computing presents an opportunity for pervasive systems to leverage computational and storage resources to accomplish tasks that would not normally be possible on such resource-constrained devices. Cloud computing can enable hardware designers to build lighter systems that last longer and are more mobile. Despite the advantages cloud computing offers to the designers of pervasive systems, some limitations of leveraging cloud computing must be addressed. We take the position that cloud-based pervasive systems must be secured holistically and discuss ways this might be accomplished. In this paper, we discuss a pervasive system utilizing cloud computing resources and the issues that must be addressed in such a system. In this system, the user's mobile device cannot always have network access to leverage resources from the cloud, so it must make intelligent decisions about what data should be stored locally and what processes should be run locally. As a result of these decisions, the user becomes vulnerable to attacks while interfacing with the pervasive system.
Stratocumulus Cloud Top Radiative Cooling and Cloud Base Updraft Speeds
NASA Astrophysics Data System (ADS)
Kazil, J.; Feingold, G.; Balsells, J.; Klinger, C.
2017-12-01
Cloud top radiative cooling is a primary driver of turbulence in the stratocumulus-topped marine boundary layer. A functional relationship between cloud top cooling and cloud base updraft speeds may therefore exist. A correlation between cloud top radiative cooling and cloud base updraft speeds has recently been identified empirically, providing a basis for satellite retrieval of cloud base updraft speeds. Such retrievals may enable analysis of aerosol-cloud interactions using satellite observations: updraft speeds at cloud base co-determine supersaturation and therefore the activation of cloud condensation nuclei, which in turn co-determine cloud properties and precipitation formation. We use large eddy simulation and an off-line radiative transfer model to explore the relationship between cloud top radiative cooling and cloud base updraft speeds in a marine stratocumulus cloud over the course of the diurnal cycle. We find that during daytime, at low cloud water path (CWP < 50 g m^-2), cloud base updraft speeds and cloud top cooling are well correlated, in agreement with the reported empirical relationship. During the night, in the absence of short-wave heating, CWP builds up (CWP > 50 g m^-2) and long-wave emissions from cloud top saturate, while cloud base heating increases. In combination, cloud top cooling and cloud base updrafts become weakly anti-correlated. A functional relationship between cloud top cooling and cloud base updraft speed can hence be expected for stratocumulus clouds with a sufficiently low CWP and sub-saturated long-wave emissions, in particular during daytime. At higher CWPs, in particular at night, the relationship breaks down due to saturation of long-wave emissions from cloud top.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-15
... Rehabilitation Research--Disability and Rehabilitation Research Project--Inclusive Cloud and Web Computing CFDA... inclusive Cloud and Web computing. The Assistant Secretary may use this priority for competitions in fiscal... Priority for Inclusive Cloud and Web Computing'' in the subject line of your electronic message. FOR...
Using Word Clouds to Develop Proactive Learners
ERIC Educational Resources Information Center
Miley, Frances; Read, Andrew
2011-01-01
This article examines student responses to a technique for summarizing electronically available information based on word frequency. Students used this technique to create word clouds, using those word clouds to enhance personal and small group study. This is a qualitative study. Small focus groups were used to obtain student feedback. Feedback…
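The frequency-based summarization behind such word clouds can be sketched in a few lines: count content words, then scale each word's display size linearly with its frequency. The stopword list and the font-size range below are illustrative assumptions, not part of the study.

```python
import re
from collections import Counter

# Minimal illustrative stopword list; a real tool would use a fuller one.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "that", "for"}

def word_cloud_weights(text, max_words=50, min_size=10, max_size=48):
    """Map the most frequent content words to font sizes, scaled linearly
    between min_size and max_size by word frequency."""
    words = [w for w in re.findall(r"[a-z']+", text.lower())
             if w not in STOPWORDS]
    counts = Counter(words).most_common(max_words)
    if not counts:
        return {}
    top = counts[0][1]            # highest frequency among kept words
    low = counts[-1][1]           # lowest frequency among kept words
    span = max(top - low, 1)      # avoid division by zero when all equal
    return {
        word: min_size + (max_size - min_size) * (freq - low) // span
        for word, freq in counts
    }
```

Students summarizing a reading would feed its text in and render each word at its returned size, so the dominant themes stand out visually.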
Supernova Remnant W49B and Its Environment
NASA Astrophysics Data System (ADS)
Zhu, H.; Tian, W. W.; Zuo, P.
2014-10-01
We study the gamma-ray supernova remnant (SNR) W49B and its environment using recent radio and infrared data. Spitzer Infrared Spectrograph low-resolution data of W49B show shocked excitation lines of H2 (0,0) S(0)-S(7) from the SNR-molecular cloud interaction. The H2 gas is composed of two components with temperatures of ~260 K and ~1060 K, respectively. Various spectral lines from atomic and ionic species are detected toward W49B. Spectral line diagnostics suggest that the ionic phase has an electron density of ~500 cm^-3 and a temperature of ~10^4 K. The mid- and far-infrared data from MSX, Spitzer, and Herschel reveal a 151 ± 20 K hot dust component with a mass of 7.5 ± 6.6 × 10^-4 M⊙ and a 45 ± 4 K warm dust component with a mass of 6.4 ± 3.2 M⊙. The hot dust is likely from material swept up by the shock of W49B. The warm dust may originate from the evaporation of clouds interacting with W49B. We build the H I absorption spectra of W49B and four nearby H II regions (W49A, G42.90+0.58, G42.43-0.26, and G43.19-0.53) and study the relation between W49B and the surrounding molecular clouds by employing the 2.12 μm infrared and CO data. We thereby obtain a kinematic distance of ~10 kpc for W49B and suggest that the remnant is likely associated with the CO cloud at about 40 km s^-1.
Plasma waves associated with the AMPTE artificial comet
NASA Technical Reports Server (NTRS)
Gurnett, D. A.; Anderson, R. R.; Haeusler, B.; Haerendel, G.; Bauer, O. H.
1985-01-01
Numerous plasma wave effects were detected by the AMPTE/IRM spacecraft during the artificial comet experiment on December 27, 1984. As the barium ion cloud produced by the explosion expanded over the spacecraft, emissions at the electron plasma frequency and ion plasma frequency provided a determination of the local electron density. The electron density in the diamagnetic cavity produced by the ion cloud reached a peak of more than 5 × 10^5 per cu cm, then decayed smoothly as the cloud expanded, varying approximately as t^-2. As the cloud began to move due to interactions with the solar wind, a region of compressed plasma was encountered on the upstream side of the diamagnetic cavity. The peak electron density in the compression region was about 1.5 × 10^4 per cu cm. Later, a very intense (140 mV/m) broadband burst of electrostatic noise was encountered on the sunward side of the compression region. This noise has characteristics very similar to noise observed in the earth's bow shock, and is believed to be a shocklike interaction produced by an ion beam-plasma instability between the nearly stationary barium ions and the streaming solar wind protons.
NASA Astrophysics Data System (ADS)
Tomljenovic, Ivan; Tiede, Dirk; Blaschke, Thomas
2016-10-01
In the past two decades Object-Based Image Analysis (OBIA) established itself as an efficient approach for the classification and extraction of information from remote sensing imagery and, increasingly, from non-image sources such as Airborne Laser Scanner (ALS) point clouds. ALS data are represented as a point cloud with recorded multiple returns and intensities. In our work, we combined OBIA with ALS point cloud data in order to identify and extract buildings as 2D polygons representing roof outlines in a top-down mapping approach. We rasterized the ALS data into a height raster to generate a Digital Surface Model (DSM) and a derived Digital Elevation Model (DEM). Further objects were generated in conjunction with point statistics from the linked point cloud. Using class modelling methods, we generated the final target class of objects representing buildings. The approach was developed for a test area in Biberach an der Riß (Germany). To demonstrate its adaptation-free transferability to another data set, the algorithm was then applied "as is" to the ISPRS benchmark data set of Toronto (Canada). The obtained results show high accuracies for the initial study area (thematic accuracies of around 98%, geometric accuracy above 80%). The very high performance on the ISPRS benchmark without any modification of the algorithm or adaptation of parameters is particularly noteworthy.
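The rasterization step (point cloud to DSM/DEM, then height-based building candidates) can be sketched as follows. Gridding points into a dictionary and approximating the ground level by the per-cell minimum return are deliberate simplifications for illustration, not the authors' exact procedure (which derives a proper DEM and uses class modelling on objects).

```python
def rasterize_heights(points, cell_size):
    """Grid ALS points (x, y, z) into height rasters: the DSM keeps the
    highest return per cell; a crude ground approximation keeps the
    lowest return per cell (standing in for a filtered DEM)."""
    dsm, dem = {}, {}
    for x, y, z in points:
        key = (int(x // cell_size), int(y // cell_size))
        dsm[key] = max(dsm.get(key, z), z)
        dem[key] = min(dem.get(key, z), z)
    return dsm, dem

def candidate_building_cells(dsm, dem, min_height=2.5):
    """Keep cells where the normalized surface (DSM - DEM) exceeds a
    height threshold; these are candidate building (or tall vegetation)
    cells for the subsequent object-based classification."""
    return {key for key in dsm if dsm[key] - dem[key] >= min_height}
```

In an OBIA workflow these candidate cells would then be grouped into objects and filtered with per-object point statistics (planarity, return counts) to separate roofs from vegetation.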
Cuenca-Alba, Jesús; Del Cano, Laura; Gómez Blanco, Josué; de la Rosa Trevín, José Miguel; Conesa Mingo, Pablo; Marabini, Roberto; S Sorzano, Carlos Oscar; Carazo, Jose María
2017-10-01
New instrumentation for cryo electron microscopy (cryoEM) has significantly increased data collection rates as well as data quality, creating bottlenecks at the image processing level. The current image processing model of moving the acquired images from the data source (electron microscope) to desktops or local clusters for processing is encountering many practical limitations. However, computing may also take place in distributed and decentralized environments; the cloud is a new form of accessing computing and storage resources on demand. Here, we evaluate how this new computational paradigm can be effectively used by extending our current integrative framework for image processing, creating ScipionCloud. This development has resulted in a full installation of Scipion in both public and private clouds, accessible as public "images" with all the required cryoEM software preinstalled, requiring only a Web browser to access all graphical user interfaces. We have profiled the performance of different configurations on Amazon Web Services and the European Federated Cloud, always on architectures incorporating GPUs, and compared them with a local facility. We have also analyzed the cost-effectiveness of different scenarios, so cryoEM scientists have a clearer picture of the setup best suited to their needs and budgets. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
The Community Cloud Atlas - Building an Informed Cloud Watching Community
NASA Astrophysics Data System (ADS)
Guy, N.; Rowe, A.
2014-12-01
The sky is dynamic, from long lasting cloud systems to ethereal, fleeting formations. After years of observing the sky and growing our personal collections of cloud photos, we decided to take to social media to share pictures, as well as build and educate a community of cloud enthusiasts. We began a Facebook page, the Community Cloud Atlas, described as "...the place to show off your pictures of the sky, identify clouds, and to discuss how specific cloud types form and what they can tell you about current and future weather." Our main goal has been to encourage others to share their pictures, while we describe the scenes from a meteorological perspective and reach out to the general public to facilitate a deeper understanding of the sky. Nearly 16 months later, we have over 1400 "likes," spanning 45 countries with ages ranging from 13 to over 65. We have a consistent stream of submissions; so many that we decided to start a corresponding blog to better organize the photos, provide more detailed explanations, and reach a bigger audience. Feedback from users has been positive in support of not only sharing cloud pictures, but also to "learn the science as well as admiring" the clouds. As one community member stated, "This is not 'just' a place to share some lovely pictures." We have attempted to blend our social media presence with providing an educational resource, and we are encouraged by the response we have received. Our Atlas has been informally implemented into classrooms, ranging from a 6th grade science class to Meteorology courses at universities. NOVA's recent Cloud Lab also made use of our Atlas as a supply of categorized pictures. Our ongoing goal is to not only continue to increase understanding and appreciation of the sky among the public, but to provide an increasingly useful tool for educators. 
We continue to explore different social media options to interact with the public and provide easier content submission, as well as software options for managing a growing database.
Characterization of Individual Aerosol Particles Associated with Clouds (CRYSTAL-FACE)
NASA Technical Reports Server (NTRS)
Buseck, Peter R.
2004-01-01
The aim of our research was to obtain data on the chemical and physical properties of individual aerosol particles from near the bottoms and tops of the deep convective systems that lead to the generation of tropical cirrus clouds and to provide insights into the particles that serve as CCN or IN. We used analytical transmission electron microscopy (ATEM), including energy-dispersive X-ray spectrometry (EDS) and electron energy-loss spectroscopy (EELS), and field-emission scanning electron microscopy (FESEM) to compare the compositions, concentrations, size distributions, shapes, surface coatings, and degrees of aggregation of individual particles from cloud bases and the anvils near the tropopause. Aggregates of sea salt and mineral dust, ammonium sulfate, and soot particles are abundant in in-cloud samples. Cirrus samples contain many H2SO4 droplets, but acidic sulfate particles are rare at the cloud bases. H2SO4 probably formed at higher altitudes through oxidation of SO2 in cloud droplets. The relatively high extent of ammoniation in the upper troposphere in-cloud samples appears to have resulted from vertical transport by strong convection. The morphology of H2SO4 droplets indicates that they had been at least partly ammoniated at the time of collection. They are internally mixed with organic materials, metal sulfates, and solid particles of various compositions. Ammoniation and internal mixing result in freezing at higher temperatures than in pure H2SO4 aerosols. K- and S-bearing organic particles and Si-Al-rich particles are common throughout. Sea salt and mineral dust were incorporated into the convective systems from the cloud bases and acted as ice nuclei while being vertically transported. The nonsulfate particles originated from the lower troposphere and were transported to the upper troposphere and lower stratosphere.
MTR BUILDING AND BALCONY FLOORS. CAMERA FACING EASTERLY. PHOTOGRAPHER DID ...
MTR BUILDING AND BALCONY FLOORS. CAMERA FACING EASTERLY. PHOTOGRAPHER DID NOT EXPLAIN DARK CLOUD. MTR WING WILL ATTACH TO GROUND FLOOR. INL NEGATIVE NO. 1567. Unknown Photographer, 2/28/1951 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Palikonda, R.; Smith, W. L., Jr.; Bedka, K. M.; Spangenberg, D.; Vakhnin, A.; Lutz, N. E.; Walter, J.; Kusterer, J.
2017-12-01
Cloud Computing offers new opportunities for large-scale scientific data producers to utilize Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) IT resources to process and deliver data products in an operational environment where timely delivery, reliability, and availability are critical. The NASA Langley Research Center Atmospheric Science Data Center (ASDC) is building and testing a private and public facing cloud for users in the Science Directorate to utilize as an everyday production environment. The NASA SatCORPS (Satellite ClOud and Radiation Property Retrieval System) team processes and derives near real-time (NRT) global cloud products from operational geostationary (GEO) satellite imager datasets. To deliver these products, we will utilize the public facing cloud and OpenShift to deploy a load-balanced webserver for data storage, access, and dissemination. The OpenStack private cloud will host data ingest and computational capabilities for SatCORPS processing. This paper will discuss the SatCORPS migration towards, and usage of, the ASDC Cloud Services in an operational environment. Detailed lessons learned from use of prior cloud providers, specifically the Amazon Web Services (AWS) GovCloud and the Government Cloud administered by the Langley Managed Cloud Environment (LMCE) will also be discussed.
NASA Astrophysics Data System (ADS)
Aneri, Parikh; Sumathy, S.
2017-11-01
Cloud computing provides services over the Internet, supplying application resources and data to users on demand. Cloud computing is based on a consumer-provider model: the provider offers resources that consumers access in order to build their applications according to their needs. A cloud data center is a pool of shared resources for cloud users to access. Virtualization is the heart of the cloud computing model; it provides virtual machines with application-specific configurations, and applications are free to choose their own configuration. On the one hand there is a huge number of resources, and on the other hand a huge number of requests must be served effectively. Therefore, the resource allocation policy and the scheduling policy play a very important role in allocating and managing resources in this cloud computing model. This paper proposes a load balancing policy using the Hungarian algorithm, which provides dynamic load balancing together with a monitor component. The monitor component helps increase cloud resource utilization by observing the state of the Hungarian-algorithm-based allocator and altering its state using artificial intelligence. CloudSim, the extensible toolkit used in this proposal, simulates the cloud computing environment.
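The assignment step at the core of such a policy can be sketched as follows. This is a minimal illustration, not the paper's implementation: the cost matrix and its values are hypothetical, and for clarity a brute-force search over permutations stands in for the O(n^3) Hungarian algorithm, which computes the same optimum.

```python
from itertools import permutations

def min_cost_assignment(cost):
    """Assign each request (row) to a distinct VM (column), minimizing total cost.

    Brute-force search over all permutations; a real scheduler would use the
    Hungarian algorithm, which finds the same optimal assignment in O(n^3).
    """
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best_cost:
            best_perm, best_cost = perm, total
    return best_perm, best_cost

# Hypothetical cost matrix: cost[i][j] = estimated completion time of
# request i on virtual machine j.
cost = [
    [4, 2, 8],
    [4, 3, 7],
    [3, 1, 6],
]
assignment, total = min_cost_assignment(cost)
print(assignment, total)  # (0, 2, 1) 12 -> request 0 on VM 0, 1 on VM 2, 2 on VM 1
```

A monitor component, as described in the abstract, would re-run this assignment as load estimates change.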
Konishi, Yuki; Hayashi, Hiroaki; Takegami, Kazuki; Fukuda, Ikuma; Ueno, Junji
2014-01-01
A cloud chamber is a detector that can visualize the tracks of charged particles. Hayashi et al. suggested a visualization experiment in which X-rays generated by diagnostic X-ray equipment were directed into a cloud chamber; however, there was a problem in that the wall of the cloud chamber scattered the incoming X-rays. In this study, we developed a new cloud chamber with entrance windows. Because these windows are made of thin film, we were able to direct the X-rays through them without contamination by scattered X-rays from the cloud chamber wall. We have newly proposed an experiment in which beta particles emitted from radioisotopes are directed into a cloud chamber. We place shielding material in the cloud chamber and visualize the various shielding effects seen with the material positioned in different ways. During the experiment, electrons scattered in the air were measured quantitatively using GM counters. We explained the physical phenomena in the cloud chamber using the Monte Carlo simulation code EGS5. Because electrons follow a tortuous path in air, the shielding material must be placed appropriately to be able to effectively block their emissions. Visualization of the tracks of charged particles in this experiment proved effective for instructing not only trainee radiological technologists but also other types of healthcare professionals.
Formation of Benzene in the Interstellar Medium
NASA Technical Reports Server (NTRS)
Jones, Brant M.; Zhang, Fangtong; Kaiser, Ralf I.; Jamal, Adeel; Mebel, Alexander M.; Cordiner, Martin A.; Charnley, Steven B.; Crim, F. Fleming (Editor)
2010-01-01
Polycyclic aromatic hydrocarbons and related species have been suggested to play a key role in the astrochemical evolution of the interstellar medium, but the formation mechanism of even their simplest building block, the aromatic benzene molecule, has remained elusive for decades. Here we demonstrate in crossed molecular beam experiments combined with electronic structure and statistical calculations that benzene (C6H6) can be synthesized via the barrierless, exoergic reaction of the ethynyl radical and 1,3-butadiene, C2H + H2CCHCHCH2 → C6H6 + H, under single collision conditions. This reaction portrays the simplest representative of a reaction class in which aromatic molecules with a benzene core can be formed from acyclic precursors via barrierless reactions of ethynyl radicals with substituted 1,3-butadiene molecules. Unique gas-grain astrochemical models imply that this low-temperature route controls the synthesis of the very first aromatic ring from acyclic precursors in cold molecular clouds, such as in the Taurus Molecular Cloud. Rapid, subsequent barrierless reactions of benzene with ethynyl radicals can lead to naphthalene-like structures, thus effectively propagating the ethynyl-radical-mediated formation of aromatic molecules in the interstellar medium.
Formation of benzene in the interstellar medium
Jones, Brant M.; Zhang, Fangtong; Kaiser, Ralf I.; Jamal, Adeel; Mebel, Alexander M.; Cordiner, Martin A.; Charnley, Steven B.
2011-01-01
Polycyclic aromatic hydrocarbons and related species have been suggested to play a key role in the astrochemical evolution of the interstellar medium, but the formation mechanism of even their simplest building block—the aromatic benzene molecule—has remained elusive for decades. Here we demonstrate in crossed molecular beam experiments combined with electronic structure and statistical calculations that benzene (C6H6) can be synthesized via the barrierless, exoergic reaction of the ethynyl radical and 1,3-butadiene, C2H + H2CCHCHCH2 → C6H6 + H, under single collision conditions. This reaction portrays the simplest representative of a reaction class in which aromatic molecules with a benzene core can be formed from acyclic precursors via barrierless reactions of ethynyl radicals with substituted 1,3-butadiene molecules. Unique gas-grain astrochemical models imply that this low-temperature route controls the synthesis of the very first aromatic ring from acyclic precursors in cold molecular clouds, such as in the Taurus Molecular Cloud. Rapid, subsequent barrierless reactions of benzene with ethynyl radicals can lead to naphthalene-like structures thus effectively propagating the ethynyl-radical mediated formation of aromatic molecules in the interstellar medium. PMID:21187430
Automatic 3D Extraction of Buildings, Vegetation and Roads from LIDAR Data
NASA Astrophysics Data System (ADS)
Bellakaout, A.; Cherkaoui, M.; Ettarid, M.; Touzani, A.
2016-06-01
Aerial topographic surveys using Light Detection and Ranging (LiDAR) technology collect dense and accurate information about the surface or terrain; LiDAR is becoming one of the important tools in the geosciences for studying objects and the earth's surface. Classification of LiDAR data to extract ground, vegetation, and buildings is a very important step in numerous applications such as 3D city modelling, extraction of derived data for geographical information systems (GIS), mapping, and navigation. Regardless of what the scan data will be used for, an automatic process is greatly needed to handle the large amount of data collected, because manual processing is time consuming and very expensive. This paper presents an approach for the automatic classification of aerial LiDAR data into five groups of items: buildings, trees, roads, linear objects, and soil, using single-return LiDAR and processing the point cloud without generating a DEM. Topological relationships and height variation analysis are adopted to preliminarily segment the entire point cloud into upper and lower contours, uniform and non-uniform surfaces, linear objects, and others. This primary classification is used on the one hand to identify the upper and lower parts of each building in an urban scene, needed to model building façades, and on the other hand to extract the point cloud of uniform surfaces, which contains the roofs, roads, and ground used in the second phase of classification. A second algorithm segments the uniform surfaces into building roofs, roads, and ground; this second phase of classification is likewise based on topological relationships and height variation analysis. The proposed approach has been tested on two areas: the first is a housing complex and the second is a primary school. It led to successful classification of the building, vegetation, and road classes.
Confidentiality Protection of Digital Health Records in Cloud Computing.
Chen, Shyh-Wei; Chiang, Dai Lun; Liu, Chia-Hui; Chen, Tzer-Shyong; Lai, Feipei; Wang, Huihui; Wei, Wei
2016-05-01
Electronic medical records containing confidential information are uploaded to the cloud. The cloud allows medical crews to access and manage the data and the integration of medical records easily. This data system provides relevant information to medical personnel and facilitates and improves electronic medical record management and data transmission. A structure for a cloud-based, patient-centered personal health record (PHR) is proposed in this study. This technique helps patients manage their health information, such as appointment dates with doctors, health reports, and a complete understanding of their own health conditions, and encourages a positive attitude toward maintaining their health. Patients decide on their own who has access to their records, over a specific span of time specified by the patients. Storing data in the cloud environment can reduce costs and enhance information sharing, but the potential threat to information security should be taken into consideration. This study proposes a cloud-based secure transmission mechanism suitable for multiple users (such as nurse aides, patients, and family members).
Pulse sequences for uniform perfluorocarbon droplet vaporization and ultrasound imaging.
Puett, C; Sheeran, P S; Rojas, J D; Dayton, P A
2014-09-01
Phase-change contrast agents (PCCAs) consist of liquid perfluorocarbon droplets that can be vaporized into gas-filled microbubbles by pulsed ultrasound waves at diagnostic pressures and frequencies. These activatable contrast agents provide benefits of longer circulating times and smaller sizes relative to conventional microbubble contrast agents. However, optimizing ultrasound-induced activation of these agents requires coordinated pulse sequences not found on current clinical systems, in order to both initiate droplet vaporization and image the resulting microbubble population. Specifically, the activation process must provide a spatially uniform distribution of microbubbles and needs to occur quickly enough to image the vaporized agents before they migrate out of the imaging field of view. The development and evaluation of protocols for PCCA-enhanced ultrasound imaging using a commercial array transducer are described. The developed pulse sequences consist of three states: (1) initial imaging at sub-activation pressures, (2) activating droplets within a selected region of interest, and (3) imaging the resulting microbubbles. Bubble clouds produced by the vaporization of decafluorobutane and octafluoropropane droplets were characterized as a function of focused pulse parameters and acoustic field location. Pulse sequences were designed to manipulate the geometries of discrete microbubble clouds using electronic steering, and cloud spacing was tailored to build a uniform vaporization field. The complete pulse sequence was demonstrated in the water bath and then in vivo in a rodent kidney. The resulting contrast provided a significant increase (>15 dB) in signal intensity. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Dieckmann, M. E.
2008-11-01
Recent particle-in-cell (PIC) simulation studies have addressed particle acceleration and magnetic field generation in relativistic astrophysical flows by plasma phase space structures. We discuss the astrophysical environments such as the jets of compact objects, and we give an overview of the global PIC simulations of shocks. These reveal several types of phase space structures, which are relevant for the energy dissipation. These structures are typically coupled in shocks, but we choose to consider them here in an isolated form. Three structures are reviewed. (1) Simulations of interpenetrating or colliding plasma clouds can trigger filamentation instabilities, while simulations of thermally anisotropic plasmas observe the Weibel instability. Both transform a spatially uniform plasma into current filaments. These filament structures cause the growth of the magnetic fields. (2) The development of a modified two-stream instability is discussed. It saturates first by the formation of electron phase space holes. The relativistic electron clouds modulate the ion beam and a secondary, spatially localized electrostatic instability grows, which saturates by forming a relativistic ion phase space hole. It accelerates electrons to ultra-relativistic speeds. (3) A simulation is also reviewed in which two clouds of an electron-ion plasma collide at the speed 0.9c. The unequal densities of the two clouds and a magnetic field oblique to the collision velocity vector result in waves with a mixed electrostatic and electromagnetic polarity. The waves give rise to growing corkscrew distributions in the electrons and ions that establish an equipartition between the electron, the ion and the magnetic energy. The filament, phase space hole and corkscrew structures are discussed with respect to electron acceleration and magnetic field generation.
[Research and Implementation of Vital Signs Monitoring System Based on Cloud Platform].
Yu, Man; Tan, Anzu; Huang, Jianqi
2018-05-30
By analyzing the problems in the current working mode, a vital signs monitoring information system based on a cloud platform is designed and developed. The system's aim is to help nurses carry out vital signs nursing work effectively and accurately. The system collects, uploads and analyzes patients' vital signs data via PDAs connected to medical inspection equipment. Clinical application proved that the system can effectively improve the quality and efficiency of medical care and may reduce medical expenses. It is also an important practical result in building a medical cloud platform.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pivi, M.T.F.; Collet, G.; King, F.
Beam instability caused by the electron cloud has been observed in positron and proton storage rings and it is expected to be a limiting factor in the performance of the positron Damping Ring (DR) of future Linear Colliders (LC) such as ILC and CLIC. To test a series of promising possible electron cloud mitigation techniques, such as surface coatings and grooves, in the Positron Low Energy Ring (LER) of the PEP-II accelerator, we have installed several test vacuum chambers including (i) a special chamber to monitor the variation of the secondary electron yield of technical surface materials and coatings under the effect of ion, electron and photon conditioning in situ in the beam line; (ii) chambers with grooves in a straight magnetic-free section; and (iii) coated chambers in a dedicated newly installed 4-magnet chicane to study mitigations in a magnetic field region. In this paper, we describe the ongoing R&D effort to mitigate the electron cloud effect for the LC damping ring, focusing on the first experimental area and on results of the reduction of the secondary electron yield due to in situ conditioning.
A Weibull distribution accrual failure detector for cloud computing
Wu, Zhibo; Wu, Jin; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are a fundamental component used to build high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, is proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions of cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared using public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing. PMID:28278229
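The core idea of an accrual failure detector can be sketched as follows: instead of a binary alive/dead verdict, it outputs a continuous suspicion level derived from the distribution of heartbeat inter-arrival times. This minimal sketch assumes fixed Weibull shape and scale parameters; the paper's detector would estimate them from the observed heartbeat history, and the parameter values below are purely illustrative.

```python
import math

def weibull_cdf(t, shape, scale):
    """Weibull CDF, F(t) = 1 - exp(-(t/scale)^shape)."""
    if t <= 0:
        return 0.0
    return 1.0 - math.exp(-((t / scale) ** shape))

def suspicion(t_since_last, shape, scale):
    """Accrual suspicion level phi = -log10(P(heartbeat still pending)).

    The longer the silence relative to the fitted inter-arrival
    distribution, the higher the suspicion that the node has failed.
    """
    p_alive = 1.0 - weibull_cdf(t_since_last, shape, scale)
    if p_alive <= 0.0:
        return float("inf")
    return -math.log10(p_alive)

# Illustrative parameters: heartbeats roughly every 1 s with some jitter.
shape, scale = 2.0, 1.0
print(round(suspicion(0.5, shape, scale), 3))  # low suspicion shortly after a heartbeat
print(round(suspicion(3.0, shape, scale), 3))  # high suspicion after a long silence
```

An application then declares failure when phi crosses an application-chosen threshold, which is how one detector serves multiple applications with different reliability needs.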
Context dependent off loading for cloudlet in mobile ad-hoc network
NASA Astrophysics Data System (ADS)
Bhatt, N.; Nadesh, R. K.; ArivuSelvan, K.
2017-11-01
Cloud computing in mobile ad-hoc networks has become an emerging area of research as the demands and capabilities of mobile devices have increased in recent years. Carrying out operations in a remote cloud increases delay and affects service quality. To avoid this problem, the cloudlet has been introduced. A cloudlet provides the same services to devices as the cloud, but at low latency and high bandwidth. However, selecting a cloudlet for computation offloading with low energy is a notable challenge when multiple cloudlets are available nearby. Here we propose an energy- and bandwidth-aware (accounting for the traffic overhead of communication with the cloud) cloudlet selection strategy based on the context dependency of the device location. It works on the basis of the mobile device's location and the bandwidth availability of each cloudlet. The cloudlet offloading and selection process using the given solution is simulated in the CloudSim simulator.
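A selection strategy of this kind can be sketched as scoring each nearby cloudlet by a weighted trade-off between offload energy and available bandwidth. This is a hypothetical illustration: the field names, weights, and numbers are invented for the sketch and are not taken from the paper.

```python
def select_cloudlet(cloudlets, w_energy=0.5, w_bandwidth=0.5):
    """Pick the nearby cloudlet with the best energy/bandwidth trade-off.

    Score = w_energy * normalized offload energy
          - w_bandwidth * normalized available bandwidth; lower is better.
    """
    max_e = max(c["energy_mj"] for c in cloudlets)
    max_b = max(c["bandwidth_mbps"] for c in cloudlets)

    def score(c):
        return (w_energy * c["energy_mj"] / max_e
                - w_bandwidth * c["bandwidth_mbps"] / max_b)

    return min(cloudlets, key=score)

# Hypothetical cloudlets visible from the device's current location.
cloudlets = [
    {"name": "cl-1", "energy_mj": 120.0, "bandwidth_mbps": 40.0},
    {"name": "cl-2", "energy_mj": 80.0, "bandwidth_mbps": 25.0},
    {"name": "cl-3", "energy_mj": 95.0, "bandwidth_mbps": 60.0},
]
print(select_cloudlet(cloudlets)["name"])  # cl-3: moderate energy, best bandwidth
```

Context dependency enters by recomputing the candidate set and scores as the device moves and cloudlet bandwidth availability changes.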
NASA Astrophysics Data System (ADS)
Chee, T.; Nguyen, L.; Smith, W. L., Jr.; Spangenberg, D.; Palikonda, R.; Bedka, K. M.; Minnis, P.; Thieman, M. M.; Nordeen, M.
2017-12-01
Providing public access to research products, including cloud macro- and microphysical properties and satellite imagery, is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a web-based visualization tool and API that allow end users to easily create customized cloud-product and satellite imagery, ground site data, and satellite ground-track information, all generated dynamically. The tool has two uses: one to visualize the dynamically created imagery, and the other to provide access to the dynamically generated imagery directly at a later time. Internally, we leverage our practical experience with large, scalable application practices to develop a system that has the greatest potential for scalability as well as the ability to be deployed on the cloud to accommodate scalability issues. We build upon the NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite cloud product information, satellite imagery, ground site data and satellite track information accessible and easily searchable. This tool is the culmination of our prior experience with dynamic imagery generation and provides a way to build a "mash-up" of dynamically generated imagery and related kinds of information that are visualized together to add value to disparate but related information. In support of NASA strategic goals, our group aims to make as much scientific knowledge, observations and products as possible available to the citizen science, research and interested communities, as well as to automated systems that acquire the same information for data mining or other analytic purposes. This tool and the underlying APIs provide a valuable research tool to a wide audience, both as a standalone research tool and as an easily accessed data source that can be mined or used with existing tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Pimpec, F.; /PSI, Villigen; Kirby, R.E.
In many accelerator storage rings running positively charged beams, ionization of residual gas and secondary electron emission (SEE) in the beam pipe will give rise to an electron cloud which can cause beam blow-up or loss of the circulating beam. A preventative measure that suppresses electron cloud formation is to ensure that the vacuum wall has a low secondary emission yield (SEY). The SEY of thin films of TiN, sputter deposited Non-Evaporable Getters and a novel TiCN alloy were measured under a variety of conditions, including the effect of re-contamination from residual gas.
Grids, virtualization, and clouds at Fermilab
Timm, S.; Chadwick, K.; Garzoglio, G.; ...
2014-06-11
Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud and GPCF) and core computing (Virtual Services). Lastly, this work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.
Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method
Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu
2016-01-01
A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, which has the disadvantage that the registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing that resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis. PMID:28029121
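The baseline comparison can be sketched in a few lines: baselines are pairwise distances between labeled feature points within one scan, and because distances are invariant under rigid motion, comparing corresponding baselines across epochs needs no registration. The point labels and coordinates below are hypothetical; real virtual points would be, e.g., automatically extracted brick centres.

```python
import math
from itertools import combinations

def baselines(points):
    """Pairwise distances (baselines) between labeled feature points of one scan."""
    out = {}
    for (a, pa), (b, pb) in combinations(sorted(points.items()), 2):
        out[(a, b)] = math.dist(pa, pb)
    return out

def baseline_changes(scan1, scan2):
    """Length difference of corresponding baselines between two epochs.

    Each scan may use its own arbitrary coordinate system: baselines are
    invariant under translation and rotation, so no registration is needed.
    """
    b1, b2 = baselines(scan1), baselines(scan2)
    return {k: b2[k] - b1[k] for k in b1.keys() & b2.keys()}

# Hypothetical virtual points (e.g. brick centres); epoch 2 is expressed in a
# shifted coordinate system and point C has moved 3 cm relative to A.
epoch1 = {"A": (0.0, 0.0, 0.0), "B": (2.0, 0.0, 0.0), "C": (0.0, 1.0, 0.0)}
epoch2 = {"A": (5.0, 5.0, 0.0), "B": (7.0, 5.0, 0.0), "C": (5.0, 6.03, 0.0)}

for pair, d in sorted(baseline_changes(epoch1, epoch2).items()):
    print(pair, round(d, 3))
```

Only baselines involving the displaced point change; the coordinate-system offset between the two scans drops out entirely.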
Grids, virtualization, and clouds at Fermilab
NASA Astrophysics Data System (ADS)
Timm, S.; Chadwick, K.; Garzoglio, G.; Noh, S.
2014-06-01
Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud and GPCF) and core computing (Virtual Services). This work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.
Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.
Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu
2016-12-24
A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, which has the disadvantage that the registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing that resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.
Localization of Pathology on Complex Architecture Building Surfaces
NASA Astrophysics Data System (ADS)
Sidiropoulos, A. A.; Lakakis, K. N.; Mouza, V. K.
2017-02-01
The technology of 3D laser scanning is considered one of the most common methods for heritage documentation. The point clouds produced provide information of high detail, both geometric and thematic. Various studies examine techniques for the best exploitation of this information. In this study, an algorithm for localizing pathology, such as cracks and fissures, on complex building surfaces is tested. The algorithm makes use of the points' positions in the point cloud and tries to separate them into two groups (patterns): pathology and non-pathology. The geometric information used for recognizing the pattern of the points is extracted via Principal Component Analysis (PCA) in user-specified neighborhoods across the whole point cloud. The implementation of PCA leads to the definition of the normal vector at each point of the cloud. Two tests that operate separately examine both local and global geometric criteria among the points and conclude which of them should be categorized as pathology. The proposed algorithm was tested on parts of the masonry of the Gazi Evrenos Baths, located in the city of Giannitsa in northern Greece.
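The PCA step that yields the normal vector at each point can be sketched as follows: the normal of a local neighborhood is the eigenvector of its covariance matrix associated with the smallest eigenvalue, i.e. the direction of least variance. The synthetic neighborhood below is a hypothetical stand-in for a k-nearest-neighbor query on a real scan.

```python
import numpy as np

def pca_normal(neighborhood):
    """Estimate the surface normal at a point from its local neighborhood.

    The normal is the eigenvector of the neighborhood's covariance matrix
    with the smallest eigenvalue (the direction of least variance).
    """
    pts = np.asarray(neighborhood, dtype=float)
    centered = pts - pts.mean(axis=0)
    cov = centered.T @ centered / len(pts)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    normal = eigvecs[:, 0]                   # eigenvector of smallest eigenvalue
    return normal / np.linalg.norm(normal)

# Synthetic neighborhood: points scattered on the z = 0 plane with slight
# noise; the estimated normal should be close to the z axis.
rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(50, 2))
z = rng.normal(0.0, 0.01, size=(50, 1))
n = pca_normal(np.hstack([xy, z]))
print(np.round(np.abs(n), 2))  # approximately the z axis
```

Abrupt changes of these normals between adjacent neighborhoods are the kind of local geometric criterion a crack-detection test can exploit.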
NASA Astrophysics Data System (ADS)
Chiabrando, F.; Lo Turco, M.; Santagati, C.
2017-02-01
The paper here presented shows the outcomes of a research/didactic activity carried out within a workshop titled "Digital Invasions. From point cloud to Heritage Building Information Modeling" held at Politecnico di Torino (29th September-5th October 2016). The term digital invasions refers to an Italian bottom-up project born in 2013 with the aim of promoting innovative digital ways of enhancing Cultural Heritage through the co-creation of cultural contents and their sharing on social media platforms. In this regard, we worked with students of the Architectural Master of Science degree, training them with a multidisciplinary teaching team (Architectural Representation, History of Architecture, Restoration, Digital Communication and Geomatics). The aim was also to test whether our students could be involved in a sort of niche crowdsourcing for the creation of a library of H-BOMS (Historical-Building Object Modeling) of architectural elements.
A computational- and storage-cloud for integration of biodiversity collections
Matsunaga, A.; Thompson, A.; Figueiredo, R. J.; Germain-Aubrey, C. C.; Collins, M.; Beeman, R. S.; Macfadden, B. J.; Riccardi, G.; Soltis, P. S.; Page, L. M.; Fortes, J. A. B.
2013-01-01
A core mission of the Integrated Digitized Biocollections (iDigBio) project is the building and deployment of a cloud computing environment customized to support the digitization workflow and integration of data from all U.S. nonfederal biocollections. iDigBio chose to use cloud computing technologies to deliver a cyberinfrastructure that is flexible, agile, resilient, and scalable to meet the needs of the biodiversity community. In this context, this paper describes the integration of open source cloud middleware, applications, and third party services using standard formats, protocols, and services. In addition, this paper demonstrates the value of the digitized information from collections in a broader scenario involving multiple disciplines.
Construction and application of Red5 cluster based on OpenStack
NASA Astrophysics Data System (ADS)
Wang, Jiaqing; Song, Jianxin
2017-08-01
With the application and development of cloud computing technology in various fields, data-center resource utilization has improved markedly, and systems built on cloud computing platforms have gained scalability and stability. Red5 clusters built in the traditional way suffer from low resource utilization and poor system stability. This paper exploits cloud computing's efficient computation and resource-allocation capabilities to build a Red5 server cluster based on OpenStack, to which multimedia applications can be published. The system not only achieves flexible provisioning of computing resources but also greatly improves cluster stability and service efficiency.
Hybrid Cloud Computing Environment for EarthCube and Geoscience Community
NASA Astrophysics Data System (ADS)
Yang, C. P.; Qin, H.
2016-12-01
The NSF EarthCube Integration and Test Environment (ECITE) has built a hybrid cloud computing environment that provides cloud resources from private cloud environments using the cloud system software OpenStack and Eucalyptus, and also manages a public cloud, Amazon Web Services, allowing resource synchronization and bursting between private and public clouds. On the ECITE hybrid cloud platform, the EarthCube and geoscience communities can deploy and manage applications using base or customized virtual machine images, analyze big datasets using virtual clusters, and monitor virtual resource usage on the cloud in real time. Currently, a number of EarthCube projects have deployed or started migrating to this platform, such as CHORDS, BCube, CINERGI, OntoSoft, and other EarthCube building blocks. To accomplish a deployment or migration, the administrator of the ECITE hybrid cloud platform prepares the specific needs of each project (e.g., images, port numbers, usable cloud capacity) in advance, based on communications between ECITE and the participant projects; the scientists or IT technicians in those projects then launch one or more virtual machines, access them to set up the computing environment if need be, and migrate their code, documents or data without worrying about the heterogeneity in structure and operations among different cloud platforms.
Cloudbus Toolkit for Market-Oriented Cloud Computing
NASA Astrophysics Data System (ADS)
Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian
This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kollias, Pavlos
2017-04-23
With the vast upgrades to the ARM program radar measurement capabilities in 2010 and beyond, our ability to probe the 3D structure of clouds and associated precipitation has increased dramatically. This project builds on the PI's and co-I's expertise in the analysis of radar observations. The first research thrust aims to document the 3D morphological (as depicted by the radar reflectivity structure) and 3D dynamical (cloud-scale eddies) structure of boundary layer clouds. Unraveling the 3D dynamical structure of stratocumulus and shallow cumulus clouds requires decomposition of the environmental wind contribution and particle sedimentation velocity from the observed radial Doppler velocity. The second thrust proposes to unravel the mechanism of cumulus entrainment (location, scales) and its impact on microphysics utilizing radar measurements from the vertically pointing and new scanning radars at the ARM sites. The third research thrust requires the development of a cloud-tracking algorithm that monitors the properties of clouds.
3-D Object Recognition from Point Cloud Data
NASA Astrophysics Data System (ADS)
Smith, W.; Walker, A. S.; Zhang, B.
2011-09-01
The market for real-time 3-D mapping includes not only traditional geospatial applications but also navigation of unmanned autonomous vehicles (UAVs). Massively parallel processes such as graphics processing unit (GPU) computing make real-time 3-D object recognition and mapping achievable. Geospatial technologies such as digital photogrammetry and GIS offer advanced capabilities to produce 2-D and 3-D static maps using UAV data. The goal is to develop real-time UAV navigation through increased automation. It is challenging for a computer to identify a 3-D object such as a car, a tree or a house, yet automatic 3-D object recognition is essential to increasing the productivity of geospatial data such as 3-D city site models. In the past three decades, researchers have used radiometric properties to identify objects in digital imagery with limited success, because these properties vary considerably from image to image. Consequently, our team has developed software that recognizes certain types of 3-D objects within 3-D point clouds. Although our software is developed for modeling, simulation and visualization, it has the potential to be valuable in robotics and UAV applications. The locations and shapes of 3-D objects such as buildings and trees are easily recognizable by a human from a brief glance at a representation of a point cloud such as terrain-shaded relief. The algorithms to extract these objects have been developed and require only the point cloud and minimal human inputs such as a set of limits on building size and a request to turn on a squaring option. The algorithms use both digital surface model (DSM) and digital elevation model (DEM), so software has also been developed to derive the latter from the former. The process continues through the following steps: identify and group 3-D object points into regions; separate buildings and houses from trees; trace region boundaries; regularize and simplify boundary polygons; construct complex roofs. 
Several case studies have been conducted using a variety of point densities, terrain types and building densities. The results have been encouraging. More work is required for better processing of, for example, forested areas, buildings with sides that are not at right angles or are not straight, and single trees that impinge on buildings. Further work may also be required to ensure that the buildings extracted are of fully cartographic quality. A first version will be included in production software later in 2011. In addition to the standard geospatial applications and the UAV navigation, the results have a further advantage: since LiDAR data tends to be accurately georeferenced, the building models extracted can be used to refine image metadata whenever the same buildings appear in imagery for which the GPS/IMU values are poorer than those for the LiDAR.
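The first processing steps described above — using the DSM and DEM together to isolate above-ground objects and group them into regions — can be sketched in a few lines. This is a hedged illustration of the general technique only, not the production software; the height threshold and the synthetic rasters are assumptions.

```python
import numpy as np
from scipy import ndimage

def object_regions(dsm, dem, min_height=2.5):
    """Identify and group above-ground points into candidate object regions."""
    ndsm = dsm - dem                         # normalized DSM: height above ground
    mask = ndsm > min_height                 # keep cells well above the terrain
    labels, n_regions = ndimage.label(mask)  # connected-component grouping
    return labels, n_regions
```

Subsequent steps of the pipeline (separating buildings from trees, tracing and regularizing boundaries, roof construction) would then operate on each labeled region.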
NASA Technical Reports Server (NTRS)
Spann, J.; Germany, G.; Swift, W.; Parks, G.; Brittnacher, M.; Elsen, R.
1997-01-01
The observed precipitating electron energy between 0130 UT and 0400 UT on January 10, 1997, indicates that a more energetic precipitating electron population appears in the 1800-2200 MLT sector of the auroral oval at 0300 UT. This increase in energy occurs after the initial shock of the magnetic cloud reaches the Earth (0114 UT) and after faint but dynamic polar cap precipitation has been cleared out. The more energetic population is observed to remain rather constant in MLT through the onset of auroral activity (0330 UT) and to the end of the Polar spacecraft apogee pass. Data from the Ultraviolet Imager LBH-long and LBH-short images are used to quantify the average energy of the precipitating auroral electrons. The Wind spacecraft, located about 100 RE upstream, monitored the IMF and plasma parameters during the passage of the cloud. The effects of oblique-angle viewing are included in the analysis. Suggestions as to the source of this hot electron population will be presented.
Automatic Generation of Building Models with Levels of Detail 1-3
NASA Astrophysics Data System (ADS)
Nguatem, W.; Drauschke, M.; Mayer, H.
2016-06-01
We present a workflow for the automatic generation of building models with levels of detail (LOD) 1 to 3 according to the CityGML standard (Gröger et al., 2012). We start by orienting unsorted image sets employing the approach of Mayer et al. (2012), compute depth maps using semi-global matching (SGM) (Hirschmüller, 2008), and fuse these depth maps to reconstruct dense 3D point clouds (Kuhn et al., 2014). Based on planes segmented from these point clouds, we have developed a stochastic method for roof model selection (Nguatem et al., 2013) and window model selection (Nguatem et al., 2014). We demonstrate our workflow up to the export into CityGML.
Search for water and life's building blocks in the Universe: An Introduction
NASA Astrophysics Data System (ADS)
Kwok, Sun
Water and organics are commonly believed to be the essential ingredients for life on Earth. The development of infrared and submillimeter observational techniques has resulted in the detection of water in circumstellar envelopes, interstellar clouds, comets, asteroids, planetary satellites and the Sun. Complex organics have also been found in stellar ejecta, diffuse and molecular clouds, meteorites, interplanetary dust particles, comets and planetary satellites. In this Focus Meeting, we will discuss the origin, distribution, and detection of water and other building blocks of life both inside and outside the Solar System. The possible role of extraterrestrial organics and water in the origin of life on Earth will also be discussed.
Analysis of the Security and Privacy Requirements of Cloud-Based Electronic Health Records Systems
Rodrigues, Joel J. P. C.; de la Torre, Isabel; Fernández, Gonzalo; López-Coronado, Miguel
2013-01-01
Background The Cloud Computing paradigm offers eHealth systems the opportunity to enhance the features and functionality that they offer. However, moving patients’ medical information to the Cloud implies several risks in terms of the security and privacy of sensitive health records. In this paper, the risks of hosting Electronic Health Records (EHRs) on the servers of third-party Cloud service providers are reviewed. To protect the confidentiality of patient information and facilitate the process, some suggestions for health care providers are made. Moreover, security issues that Cloud service providers should address in their platforms are considered. Objective To show that, before moving patient health records to the Cloud, security and privacy concerns must be considered by both health care providers and Cloud service providers. Security requirements of a generic Cloud service provider are analyzed. Methods To study the latest in Cloud-based computing solutions, bibliographic material was obtained mainly from Medline sources. Furthermore, direct contact was made with several Cloud service providers. Results Some of the security issues that should be considered by both Cloud service providers and their health care customers are role-based access, network security mechanisms, data encryption, digital signatures, and access monitoring. Furthermore, to guarantee the safety of the information and comply with privacy policies, the Cloud service provider must be compliant with various certifications and third-party requirements, such as SAS70 Type II, PCI DSS Level 1, ISO 27001, and the US Federal Information Security Management Act (FISMA). Conclusions Storing sensitive information such as EHRs in the Cloud means that precautions must be taken to ensure the safety and confidentiality of the data. A relationship built on trust with the Cloud service provider is essential to ensure a transparent process. 
Cloud service providers must make certain that all security mechanisms are in place to avoid unauthorized access and data breaches. Patients must be kept informed about how their data are being managed. PMID:23965254
Hybrid cloud: bridging of private and public cloud computing
NASA Astrophysics Data System (ADS)
Aryotejo, Guruh; Kristiyanto, Daniel Y.; Mufadhol
2018-05-01
Cloud computing has quickly emerged as a promising paradigm in recent years, especially for the business sector. Through cloud service providers, cloud computing is widely used by Information Technology (IT) based startup companies to grow their business. However, most businesses' awareness of data security issues is low, since some Cloud Service Providers (CSPs) could decrypt their data. The Hybrid Cloud Deployment Model (HCDM) is open source in character, which makes it one of the more secure cloud computing models; thus HCDM may solve these data security issues. The objective of this study is to design, deploy and evaluate an HCDM as Infrastructure as a Service (IaaS). In the implementation process, the Metal as a Service (MAAS) engine was used as a base to build an actual server and node, followed by installation of the vsftpd application, which serves as the FTP server. For comparison with HCDM, a public cloud was adopted through a public cloud interface. As a result, the design and deployment of the HCDM were conducted successfully; besides offering good security, HCDM was able to transfer data significantly faster than the public cloud. To the best of our knowledge, the Hybrid Cloud Deployment Model is one of the more secure cloud computing models due to its open-source character. Furthermore, this study will serve as a base for future studies of the Hybrid Cloud Deployment Model, which may be relevant for solving the security issues of IT-based startup companies, especially in Indonesia.
NASA Astrophysics Data System (ADS)
Marshall, R. A.; Inan, U. S.; Glukhov, V. S.
2010-04-01
A 3-D finite difference time domain model is used to simulate the lightning electromagnetic pulse (EMP) and its interaction with the lower ionosphere. Results agree with the frequently observed, doughnut-shaped optical signature of elves but show that the structure exhibits asymmetry due to the presence of Earth's ambient magnetic field. Furthermore, in-cloud (horizontal) lightning channels produce observable optical emissions without the doughnut shape and, in fact, produce a much stronger optical output for the same channel current. Electron density perturbations associated with elves are also calculated, with contributions from attachment and ionization. Results presented as a function of parameters such as magnetic field direction, dipole current orientation, altitude and amplitude, and ambient ionospheric density profile demonstrate the highly nonlinear nature of the EMP-ionosphere interaction. Ionospheric effects of a sequence of in-cloud discharges are calculated, simulating a burst of in-cloud lightning activity and resulting in large density changes in the overlying ionosphere.
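The full model is a 3-D FDTD simulation of the EMP-ionosphere interaction; the core update scheme can be conveyed with a minimal 1-D free-space Yee sketch. The normalized units, Courant number of 0.5, simple hard boundaries, and the Gaussian source are all illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def fdtd_1d(nsteps, nx=200, src=20, courant=0.5):
    """Minimal 1-D Yee scheme: leapfrog updates of E and H on staggered grids."""
    ez = np.zeros(nx)   # electric field at integer grid points
    hy = np.zeros(nx)   # magnetic field at half-integer grid points
    for n in range(nsteps):
        hy[:-1] += courant * (ez[1:] - ez[:-1])      # update H from curl of E
        ez[1:]  += courant * (hy[1:] - hy[:-1])      # update E from curl of H
        ez[src] += np.exp(-((n - 30) / 10.0) ** 2)   # soft Gaussian source
    return ez
```

The 3-D model adds the remaining field components, the ambient geomagnetic field, and an ionospheric conductivity/density profile on top of this leapfrog core.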
Electronic Health Records in the Cloud: Improving Primary Health Care Delivery in South Africa.
Cilliers, Liezel; Wright, Graham
2017-01-01
In South Africa, the recording of health data is done manually in paper-based files, while attempts to digitize healthcare records have had limited success. In many countries, Electronic Health Records (EHRs) have developed in silos, with little or no integration between different operational systems. The literature provides evidence that the cloud can be used to 'leapfrog' some of these implementation issues, but the adoption of this technology in the public health care sector has been very limited. This paper aims to identify the major reasons why the cloud has not been used to implement EHRs for the South African public health care system, and to provide recommendations on how to overcome these challenges. From the literature, it is clear that technological, environmental and organisational challenges affect the implementation of EHRs in the cloud. Four recommendations are provided that can be used by the National Department of Health to implement EHRs making use of the cloud.
Simulating the growth of a charge cloud for a microchannel plate detector
NASA Astrophysics Data System (ADS)
Siwal, Davinder; Wiggins, Blake; Desouza, Romualdo
2015-10-01
Position sensitive microchannel plate (MCP) detectors have a variety of applications in the fields of astronomy, medical imaging, neutron imaging, and ion beam tracking. Recently, a novel approach has been implemented to detect the position of an incident particle. The charge cloud produced by the MCP induces a signal on a wire harp placed between the MCP and an anode. On qualitative grounds it is clear that in this detector the induced signal shape depends on the size of the electron cloud. A detailed study has therefore been performed to investigate the size of the charge cloud within the MCP and its growth as it propagates from the MCP to the anode. A simple model has been developed to calculate the impact of charge repulsion on the growth of the electron cloud. Both the details of the model and its predictions will be presented. Supported by the US DOE NNSA under Award No. DE-NA0002012.
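A simple model of the kind described — treating the cloud as a uniformly charged sphere of electrons whose edge is pushed outward by its own space charge — can be sketched as follows. The electron count, initial radius, and time step are illustrative assumptions, not the detector's actual parameters.

```python
import math

E_CHARGE = 1.602e-19   # electron charge, C
E_MASS = 9.109e-31     # electron mass, kg
EPS0 = 8.854e-12       # vacuum permittivity, F/m

def expand_cloud(n_electrons=1e6, r0=50e-6, dt=1e-12, steps=1000):
    """Euler-integrate the edge of a uniformly charged sphere accelerated
    by the Coulomb field of the enclosed charge; returns radius history (m)."""
    r, v = r0, 0.0
    radii = [r]
    for _ in range(steps):
        # field of the enclosed charge at the sphere edge -> edge acceleration
        a = n_electrons * E_CHARGE**2 / (4 * math.pi * EPS0 * E_MASS * r**2)
        v += a * dt
        r += v * dt
        radii.append(r)
    return radii
```

The growth rate from such a model determines how wide the induced-signal footprint on the wire harp becomes by the time the cloud reaches the anode.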
Low cost, high performance processing of single particle cryo-electron microscopy data in the cloud.
Cianfrocco, Michael A; Leschziner, Andres E
2015-05-08
The advent of a new generation of electron microscopes and direct electron detectors has realized the potential of single particle cryo-electron microscopy (cryo-EM) as a technique to generate high-resolution structures. Calculating these structures requires high performance computing clusters, a resource that may be limiting to many likely cryo-EM users. To address this limitation and facilitate the spread of cryo-EM, we developed a publicly available 'off-the-shelf' computing environment on Amazon's elastic cloud computing infrastructure. This environment provides users with single particle cryo-EM software packages and the ability to create computing clusters with 16-480+ CPUs. We tested our computing environment using a publicly available 80S yeast ribosome dataset and estimate that laboratories could determine high-resolution cryo-EM structures for $50 to $1500 per structure within a timeframe comparable to local clusters. Our analysis shows that Amazon's cloud computing environment may offer a viable computing environment for cryo-EM.
A 94 GHz RF Electronics Subsystem for the CloudSat Cloud Profiling Radar
NASA Technical Reports Server (NTRS)
LaBelle, Remi C.; Girard, Ralph; Arbery, Graham
2003-01-01
The CloudSat spacecraft, scheduled for launch in 2004, will carry the 94 GHz Cloud Profiling Radar (CPR) instrument. The design, assembly and test of the flight Radio Frequency Electronics Subsystem (RFES) for this instrument has been completed and is presented here. The RFES consists of an Upconverter (which includes an Exciter and two Drive Amplifiers (DA's)), a Receiver, and a Transmitter Calibrator assembly. Some key performance parameters of the RFES are as follows: dual 100 mW pulse-modulated drive outputs at 94 GHz, overall Receiver noise figure < 5.0 dB, a highly stable W-band noise source to provide knowledge accuracy of Receiver gain of < 0.4 dB over the 2 year mission life, and a W-band peak power detector to monitor the transmitter output power to within 0.5 dB over life. Some recent monolithic microwave integrated circuit (MMIC) designs were utilized which implement the DA's in 0.1 micron GaAs high electron-mobility transistor (HEMT) technology and the Receiver low-noise amplifier (LNA) in 0.1 micron InP HEMT technology.
Cloud computing in pharmaceutical R&D: business risks and mitigations.
Geiger, Karl
2010-05-01
Cloud computing provides information processing power and business services, delivering these services over the Internet from centrally hosted locations. Major technology corporations aim to supply these services to every sector of the economy. Deploying business processes 'in the cloud' requires special attention to the regulatory and business risks assumed when running on both hardware and software that are outside the direct control of a company. The identification of risks at the correct service level allows a good mitigation strategy to be selected. The pharmaceutical industry can take advantage of existing risk management strategies that have already been tested in the finance and electronic commerce sectors. In this review, the business risks associated with the use of cloud computing are discussed, and mitigations achieved through knowledge from securing services for electronic commerce and from good IT practice are highlighted.
Current State of the Art Historic Building Information Modelling
NASA Astrophysics Data System (ADS)
Dore, C.; Murphy, M.
2017-08-01
In an extensive review of existing literature a number of observations were made in relation to the current approaches for recording and modelling existing buildings and environments: Data collection and pre-processing techniques are becoming increasingly automated to allow for near real-time data capture and fast processing of this data for later modelling applications. Current BIM software is almost completely focused on new buildings and has very limited tools and pre-defined libraries for modelling existing and historic buildings. The development of reusable parametric library objects for existing and historic buildings supports modelling with high levels of detail while decreasing the modelling time. Mapping these parametric objects to survey data, however, is still a time-consuming task that requires further research. Promising developments have been made towards automatic object recognition and feature extraction from point clouds for as-built BIM. However, results are currently limited to simple and planar features. Further work is required for automatic accurate and reliable reconstruction of complex geometries from point cloud data. Procedural modelling can provide an automated solution for generating 3D geometries but lacks the detail and accuracy required for most as-built applications in AEC and heritage fields.
Exploring Cloud Computing for Large-scale Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Guang; Han, Binh; Yin, Jian
This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications, which often just require a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high-performance hardware with low-latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high-performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a systems biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.
NASA Astrophysics Data System (ADS)
Gouwens, C.; Dragosavic, M.
The large reserves and increasing use of natural gas as a source of energy have made its storage and transport an urgent problem. Since a liquid of the same mass occupies only a fraction of the volume of a gas, it is economical to store natural gas as a liquid. Liquefied natural gas (LNG) is stored in insulated tanks and also carried by ship at a temperature of -160 °C to -170 °C. If a serious accident allows the LNG to escape, a gas cloud forms. The results of a possible explosion of such a gas cloud are studied. The development of a leak, escape and evaporation, the size and propagation of the gas cloud, the explosive pressures to be expected and their effects on the environment are investigated. Damage to buildings is examined making use of the preliminary conclusions of the other sub-projects, especially the explosive pressures.
NASA Astrophysics Data System (ADS)
Thau, D.
2017-12-01
For the past seven years, Google has made petabytes of Earth observation data, and the tools to analyze it, freely available to researchers around the world via cloud computing. These data and tools were initially available via Google Earth Engine and are increasingly available on the Google Cloud Platform. We have introduced a number of APIs for both the analysis and presentation of geospatial data that have been successfully used to create impactful datasets and web applications, including studies of global surface water availability, global tree cover change, and crop yield estimation. Each of these projects used the cloud to analyze thousands to millions of Landsat scenes. The APIs support a range of publishing options, from outputting imagery and data for inclusion in papers, to providing tools for full-scale web applications that provide analysis tools of their own. Over the course of developing these tools, we have learned a number of lessons about how to build a publicly available cloud platform for geospatial analysis, and about how the characteristics of an API can affect the kinds of impacts a platform can enable. This study will present an overview of how Google Earth Engine works and how Google's geospatial capabilities are extending to Google Cloud Platform. We will provide a number of case studies describing how these platforms, and the data they host, have been leveraged to build impactful decision support tools used by governments, researchers, and other institutions, and we will describe how the available APIs have shaped (or constrained) those tools. [Image Credit: Tyler A. Erickson]
The Segmentation of Point Clouds with K-Means and ANN (Artificial Neural Network)
NASA Astrophysics Data System (ADS)
Kuçak, R. A.; Özdemir, E.; Erol, S.
2017-05-01
Segmentation of point clouds is now used in many Geomatics Engineering applications, such as building extraction in urban areas, Digital Terrain Model (DTM) generation, and the extraction of roads or urban furniture. Segmentation is the process of dividing a point cloud into layers according to their special characteristics. The present paper discusses K-means and the self-organizing map (SOM), a type of ANN (Artificial Neural Network) algorithm, for the segmentation of point clouds. Point clouds generated with the photogrammetric method and with a Terrestrial Lidar System (TLS) were segmented according to surface normal, intensity and curvature, and the results were evaluated. LIDAR (Light Detection and Ranging) and photogrammetry are commonly used to obtain point clouds in many remote sensing and geodesy applications; with either method, point clouds can be obtained from terrestrial or airborne systems. In this study, the LIDAR measurements were made with a Leica C10 laser scanner, while the photogrammetric point cloud was obtained from photographs taken from the ground with a 13 MP non-metric camera.
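A minimal version of the K-means side of such a comparison, clustering points by per-point features (e.g., a normal component and intensity) rather than raw coordinates, might look like the following. This is Lloyd's algorithm in plain NumPy; the feature choice, cluster count, and synthetic data are assumptions, not the paper's setup.

```python
import numpy as np

def kmeans(features, k=2, iters=50, seed=0):
    """Plain Lloyd's algorithm: assign to nearest centroid, recompute, repeat."""
    rng = np.random.default_rng(seed)
    centroids = features[rng.choice(len(features), k, replace=False)]
    for _ in range(iters):
        # distance of every sample to every centroid, shape (n, k)
        d = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):             # skip empty clusters
                centroids[j] = features[labels == j].mean(axis=0)
    return labels, centroids
```

Feeding geometric features (normals, curvature) instead of xyz coordinates is what lets the clusters correspond to surface types rather than spatial blobs.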
Using Deep Learning Model for Meteorological Satellite Cloud Image Prediction
NASA Astrophysics Data System (ADS)
Su, X.
2017-12-01
A satellite cloud image contains much weather information, such as precipitation information. Short-term cloud movement forecasting is important for precipitation forecasting and is the primary means of typhoon monitoring. Traditional methods mostly use cloud feature matching and linear extrapolation to predict cloud movement, so nonstationary processes during cloud motion, such as inversion and deformation, are basically not considered; predicting cloud movement promptly and correctly therefore remains a hard task. As deep learning models perform well in learning spatiotemporal features, we can meet this challenge by regarding cloud image prediction as a spatiotemporal sequence forecasting problem and introducing a deep learning model to solve it. In this research, we use a variant of the Gated Recurrent Unit (GRU) that has convolutional structures to handle spatiotemporal features, and we build an end-to-end model to solve this forecasting problem. In this model, both the input and the output are spatiotemporal sequences. Compared to the Convolutional LSTM (ConvLSTM) model, this model has fewer parameters. We apply this model to GOES satellite data, and the model performs well.
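The core building block, a GRU cell whose transforms are convolutions instead of matrix products, can be sketched in NumPy for a single channel. The 3x3 kernels, the particular gate convention, and the random weights are illustrative assumptions, not the paper's architecture.

```python
import numpy as np
from scipy.signal import convolve2d

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv_gru_step(x, h, kernels):
    """One ConvGRU update on a 2-D field: gates via 'same' 2-D convolutions."""
    kxz, khz, kxr, khr, kxh, khh = kernels
    c = lambda a, k: convolve2d(a, k, mode='same')
    z = sigmoid(c(x, kxz) + c(h, khz))           # update gate
    r = sigmoid(c(x, kxr) + c(h, khr))           # reset gate
    h_cand = np.tanh(c(x, kxh) + c(r * h, khh))  # candidate state
    return z * h + (1.0 - z) * h_cand            # blend old and candidate state
```

Because the gates are convolutions, parameters are shared across the image, which is why such a cell has far fewer parameters than a fully connected GRU on flattened frames.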
Monitoring of Progressive Damage in Buildings Using Laser Scan Data
NASA Astrophysics Data System (ADS)
Puente, I.; Lindenbergh, R.; Van Natijne, A.; Esposito, R.; Schipper, R.
2018-05-01
Vulnerability of buildings to natural and man-induced hazards has become a main concern for our society. Ensuring their serviceability, safety and sustainability is of vital importance and the main reason for setting up monitoring systems to detect damages at an early stage. In this work, a method is presented for detecting changes from laser scan data, where no registration between different epochs is needed. To show the potential of the method, a case study of a laboratory test carried out at the Stevin laboratory of Delft University of Technology was selected. The case study was a quasi-static cyclic pushover test on a two-story high unreinforced masonry structure designed to simulate damage evolution caused by cyclic loading. During the various phases, we analysed the behaviour of the masonry walls by monitoring the deformation of each masonry unit. First a plane is fitted to the selected wall point cloud, consisting of one single terrestrial laser scan, using Principal Component Analysis (PCA). Second, the segmentation of individual elements is performed. Then deformations with respect to this plane model, for each epoch and specific element, are determined by computing their corresponding rotation and cloud-to-plane distances. The validation of the changes detected within this approach is done by comparison with traditional deformation analysis based on co-registered TLS point clouds between two or more epochs of building measurements. Initial results show that the sketched methodology is indeed able to detect changes at the mm level while avoiding 3D point cloud registration, which is a main issue in computer vision and remote sensing.
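The PCA plane fit and cloud-to-plane distance computation described above can be sketched as follows. The wall point cloud, noise level, and function names are synthetic illustrations, not the authors' code:

```python
import numpy as np

def fit_plane_pca(points):
    """Fit a plane to an N x 3 point cloud via PCA.
    Returns (centroid, unit normal); the normal is the direction of least
    variance, i.e. the eigenvector of the smallest covariance eigenvalue."""
    centroid = points.mean(axis=0)
    cov = np.cov((points - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    normal = eigvecs[:, 0]
    return centroid, normal

def cloud_to_plane_distances(points, centroid, normal):
    """Signed perpendicular distances of each point to the fitted plane."""
    return (points - centroid) @ normal

# demo: a noisy wall lying (almost) in the x-z plane
rng = np.random.default_rng(0)
wall = np.column_stack([rng.uniform(0, 5, 500),
                        0.002 * rng.normal(size=500),
                        rng.uniform(0, 3, 500)])
centroid, normal = fit_plane_pca(wall)
d = cloud_to_plane_distances(wall, centroid, normal)
```

Per-element deformations would then be detected as systematic shifts in these signed distances between loading phases, without any cross-epoch registration.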
NASA Astrophysics Data System (ADS)
Fernandez Galarreta, J.; Kerle, N.; Gerke, M.
2015-06-01
Structural damage assessment is critical after disasters but remains a challenge. Many studies have explored the potential of remote sensing data, but limitations of vertical data persist. Oblique imagery has been identified as more useful, though the multi-angle imagery also adds a new dimension of complexity. This paper addresses damage assessment based on multi-perspective, overlapping, very high resolution oblique images obtained with unmanned aerial vehicles (UAVs). 3-D point-cloud assessment for the entire building is combined with detailed object-based image analysis (OBIA) of façades and roofs. This research focuses not on automatic damage assessment, but on creating a methodology that supports the often ambiguous classification of intermediate damage levels, aiming at producing comprehensive per-building damage scores. We identify completely damaged structures in the 3-D point cloud, and for all other cases provide the OBIA-based damage indicators to be used as auxiliary information by damage analysts. The results demonstrate the usability of the 3-D point-cloud data to identify major damage features. Also the UAV-derived and OBIA-processed oblique images are shown to be a suitable basis for the identification of detailed damage features on façades and roofs. Finally, we also demonstrate the possibility of aggregating the multi-perspective damage information at building level.
Interaction of a neutral cloud moving through a magnetized plasma
NASA Technical Reports Server (NTRS)
Goertz, C. K.; Lu, G.
1990-01-01
Current collection by outgassing probes in motion relative to a magnetized plasma may be significantly affected by plasma processes that cause electron heating and cross-field transport. Simulations of a neutral gas cloud moving across a static magnetic field are discussed. The authors treat a low-beta plasma and use a 2-1/2 D electrostatic code linked with the authors' Plasma and Neutral Interaction Code (PANIC). This study emphasizes understanding of the interface between the neutral gas cloud and the surrounding plasma, where electrons are heated and can diffuse across field lines. When ionization or charge-exchange collisions occur, a sheath-like structure forms at the surface of the neutral gas. In that region, the cross-field component of the electric field causes the electrons to E × B drift with a velocity of the order of the neutral gas velocity times the square root of the ion-to-electron mass ratio. In addition, a diamagnetic drift of the electrons occurs due to the number density and temperature inhomogeneity in the front. These drift currents excite lower-hybrid waves with wave k-vectors almost perpendicular to the neutral flow and the magnetic field, again resulting in electron heating. The thermal electron current is significantly enhanced by this heating.
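The drift-speed scaling quoted in the abstract can be evaluated for illustrative numbers; the 8 km/s neutral cloud speed and the atomic-oxygen ion mass below are assumptions for the sake of the example, not values from the paper:

```python
import math

m_p = 1.67262192e-27  # proton mass, kg
m_e = 9.1093837e-31   # electron mass, kg

def exb_drift_estimate(v_neutral, ion_mass):
    """Order-of-magnitude electron E x B drift speed in the sheath front,
    using the scaling quoted in the abstract:
    v_drift ~ v_neutral * sqrt(m_i / m_e)."""
    return v_neutral * math.sqrt(ion_mass / m_e)

# illustrative numbers: an 8 km/s cloud of atomic oxygen (m_i = 16 m_p)
v = exb_drift_estimate(8e3, 16 * m_p)  # ~1.4e6 m/s
```

For these assumed values the mass-ratio factor is about 170, so the electron drift speed is of order 10^3 km/s even though the neutral cloud itself moves at only a few km/s.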
NASA Astrophysics Data System (ADS)
Chow, L.; Fai, S.
2017-08-01
The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets - the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.
NASA Technical Reports Server (NTRS)
Genkova, I.; Long, C. N.; Heck, P. W.; Minnis, P.
2003-01-01
One of the primary Atmospheric Radiation Measurement (ARM) Program objectives is to obtain measurements applicable to the development of models for better understanding of radiative processes in the atmosphere. We address this goal by building a three-dimensional (3D) characterization of the cloud structure and properties over the ARM Southern Great Plains (SGP). We take the approach of juxtaposing the cloud properties as retrieved from independent satellite and ground-based retrievals, and looking at the statistics of the cloud field properties. Once these retrievals are well understood, they will be used to populate the 3D characterization database. As a first step, we determine the relationship between surface fractional sky cover and satellite viewing-angle-dependent cloud fraction (CF). We examine the agreement between optical depth (OD) datasets from satellite and ground, using available retrieval algorithms, in relation to the CF, cloud height, multi-layer cloud presence, and solar zenith angle (SZA). For the SGP Central Facility, where output from the active remote sensing cloud layer (ARSCL) value-added product (VAP) is available, we study the uncertainty of satellite-estimated cloud heights and evaluate the impact of this uncertainty on radiative studies.
NASA Astrophysics Data System (ADS)
LIU, J.; Bi, Y.; Duan, S.; Lu, D.
2017-12-01
It is well known that cloud characteristics, such as top and base heights, the layering structure of microphysical parameters, spatial coverage, and temporal duration, are very important factors influencing both the radiation budget and its vertical partitioning, as well as the hydrological cycle through precipitation. Cloud structure, its statistical distribution, and typical values also exhibit characteristic geographical and seasonal variation. Ka-band radar is a powerful tool for obtaining the above parameters around the world, for example the ARM cloud radar in Oklahoma, US. Since 2006, CloudSat, part of NASA's A-Train satellite constellation, has continuously observed cloud structure with global coverage, but it monitors clouds over a given site only twice a day, at the same local times. Using the IAP Ka-band Doppler radar, which has been operating continuously since early 2013 on the roof of the IAP building in Beijing, we obtained the statistical characteristics of clouds, including cloud layering, cloud top and base heights, and the thickness of each cloud layer and its distribution; monthly, seasonal, and diurnal variations were analyzed, and a statistical analysis of cloud reflectivity profiles was also made. The analysis covers both non-precipitating and precipitating clouds. Some preliminary comparisons of the results with CloudSat/CALIPSO products for the same period and area are also made.
Tourism guide cloud service quality: What actually delights customers?
Lin, Shu-Ping; Yang, Chen-Lung; Pi, Han-Chung; Ho, Thao-Minh
2016-01-01
The emergence of advanced IT and cloud services has beneficially supported the information-intensive tourism industry, while simultaneously causing extreme competition in attracting customers through efficient service platforms. In response, numerous nations have implemented cloud platforms to provide value-added sightseeing information and personalized, intelligent service experiences. Despite these efforts, customers' actual perspectives have not yet been sufficiently understood. To bridge this gap, this study investigates which aspects of tourism cloud services actually drive customer satisfaction and loyalty. 336 valid survey questionnaire responses were analyzed using structural equation modeling. The results show positive impacts of function quality, enjoyment, multiple visual aids, and information quality on customer satisfaction, as well as of enjoyment and satisfaction on use loyalty. The findings provide helpful references on customer use behaviors for enhancing cloud service quality in order to achieve better organizational competitiveness.
A support architecture for reliable distributed computing systems
NASA Technical Reports Server (NTRS)
Dasgupta, Partha; Leblanc, Richard J., Jr.
1988-01-01
The Clouds project is well underway toward its goal of building a unified distributed operating system supporting the object model. The operating system design uses the object concept for structuring software at all levels of the system. The basic operating system has been developed, and work is in progress to build a usable system.
Generic-distributed framework for cloud services marketplace based on unified ontology.
Hasan, Samer; Valli Kumari, V
2017-11-01
Cloud computing is a pattern for delivering ubiquitous, on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. Cloud consumers, on the other hand, use available search engines (Google and Yahoo) to explore cloud service descriptions and find an adequate service. Unfortunately, general-purpose search engines are not designed to return a small and complete set of results, which makes the process a big challenge. This paper presents a generic distributed framework for a cloud services marketplace to automate the cloud service discovery and selection process and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms: a dominant-and-recessive-attributes algorithm borrowed from genetics, and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters, while the dominant-and-recessive-attributes approach reduced the execution time by 57% but showed a lower value for recall.
New experimental measurements of electron clouds in ion beams with large tune depression
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molvik, A W; Covo, M K; Cohen, R H
We study electron clouds in high-perveance beams (K = 8E-4) with a large tune depression of 0.2 (defined as the ratio of a single-particle oscillation response to the applied focusing fields, with and without space charge). These 1 MeV, 180 mA, K+ beams have a beam potential of +2 kV when electron clouds are minimized. Simulation results are discussed in a companion paper [J-L. Vay, this Conference]. We have developed the first diagnostics that quantitatively measure the accumulation of electrons in a beam [1]. This, together with measurements of electron sources, will enable the electron particle balance to be measured and electron-trapping efficiencies determined. We, along with colleagues from GSI and CERN, have also measured the scaling of gas desorption with beam energy and dE/dx [2]. Experiments where the heavy-ion beam is transported with solenoid magnetic fields, rather than with quadrupole magnetic or electrostatic fields, are being initiated. We will discuss initial results from experiments using electrode sets (in the middle and at the ends of magnets) to either expel or trap electrons within the magnets. We observe electron oscillations in the last quadrupole magnet when we flood the beam with electrons from an end wall. These oscillations, of order 10 MHz, are observed to grow from the center of the magnet while drifting upstream against the beam, in good agreement with simulations.
NASA Astrophysics Data System (ADS)
Lagasio, Martina; Parodi, Antonio; Procopio, Renato; Rachidi, Farhad; Fiori, Elisabetta
2017-04-01
Lightning activity is a characteristic phenomenon of severe weather, as confirmed by many studies on different weather regimes that reveal a strong interplay between lightning phenomena and extreme rainfall processes in thunderstorms. The improvement of so-called total (i.e., cloud-to-ground and intra-cloud) lightning observation systems over the last decades has allowed investigation of the relationship between the lightning flash rate and the kinematic and microphysical properties of severe hydro-meteorological events characterized by strong convection. V-shape back-building Mesoscale Convective Systems (MCSs) occurring over short periods of time have hit the Liguria region in north-western Italy several times between October 2010 and November 2014, generating flash-flood events responsible for hundreds of fatalities and millions of euros of damage. All these events showed an area of intense precipitation sweeping an arc of a few degrees around the warm conveyor belt originating about 50-60 km from the Liguria coastline. A second main ingredient was the presence of a convergence line, which supported the development and maintenance of the aforementioned back-building process. Other common features were the persistence of this geometric configuration for many hours and the associated strong lightning activity. A methodological approach for the evaluation of these types of extreme rainfall and lightning convective events is presented for a back-building MCS event that occurred in Genoa in 2014. A microphysics-driven ensemble of WRF simulations at cloud-permitting grid spacing (1 km) with different microphysics parameterizations is used and compared to the available observational radar and lightning data.
To pursue this aim, the performance of the Lightning Potential Index (LPI), a measure of the potential for charge generation and separation that leads to lightning occurrence in clouds, is computed and analyzed to gain further physical insight into these V-shape convective processes and to understand its predictive ability.
NASA Technical Reports Server (NTRS)
Bartkus, Tadas; Tsao, Jen-Ching; Struk, Peter
2017-01-01
This paper builds on previous work comparing numerical simulations of mixed-phase icing clouds with experimental data. The model couples the thermal interaction between the ice particles and water droplets of the icing cloud with the flowing air of an icing wind tunnel, for simulation of the NASA Glenn Research Center's (GRC) Propulsion Systems Laboratory (PSL). Measurements were taken during the Fundamentals of Ice Crystal Icing Physics tests at the PSL tunnel in March 2016. The tests simulated ice-crystal and mixed-phase icing conditions that relate to ice accretions within turbofan engines.
Method and apparatus for measuring purity of noble gases
Austin, Robert
2008-04-01
A device for detecting impurities in a noble gas includes a detection chamber and a source of pulsed ultraviolet light. The pulse of the ultraviolet light is transferred into the detection chamber and onto a photocathode, thereby emitting a cloud of free electrons into the noble gas within the detection chamber. The cloud of electrons is attracted to the opposite end of the detection chamber by a high positive voltage potential at that end and focused onto a sensing anode. If there are impurities in the noble gas, some or all of the electrons within the cloud will bond with the impurity molecules and not reach the sensing anode. Therefore, measuring a lower signal at the sensing anode indicates a higher level of impurities while sensing a higher signal indicates fewer impurities. Impurities in the range of one part per billion can be measured by this device.
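The inverse relation between anode signal and impurity level can be caricatured with a simple exponential attachment model; the functional form, the coefficient, and the names below are assumptions made for illustration, not taken from the patent:

```python
import math

def anode_signal(n_photoelectrons, impurity_ppb, attach_coeff=0.05):
    """Toy model: electrons drifting to the anode are lost to attachment on
    impurity molecules, so the surviving fraction decays exponentially with
    the impurity concentration (attach_coeff is an assumed value)."""
    return n_photoelectrons * math.exp(-attach_coeff * impurity_ppb)

# a lower anode signal indicates a higher impurity level
clean = anode_signal(1_000_000, 0.0)
dirty = anode_signal(1_000_000, 10.0)
```

Under this assumed model, calibrating the signal against gas samples of known purity would let the measured anode current be inverted to an impurity estimate at the part-per-billion level quoted in the abstract.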
CesrTA Retarding Field Analyzer Modeling Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calvey, J.R.; Celata, C.M.; Crittenden, J.A.
2010-05-23
Retarding field analyzers (RFAs) provide an effective measure of the local electron cloud density and energy distribution. Proper interpretation of RFA data can yield information about the behavior of the cloud, as well as the surface properties of the instrumented vacuum chamber. However, due to the complex interaction of the cloud with the RFA itself, understanding these measurements can be nontrivial. This paper examines different methods for interpreting RFA data via cloud simulation programs. Techniques include postprocessing the output of a simulation code to predict the RFA response; and incorporating an RFA model into the cloud modeling program itself.
Multichannel scanning radiometer for remote sensing cloud physical parameters
NASA Technical Reports Server (NTRS)
Curran, R. J.; Kyle, H. L.; Blaine, L. R.; Smith, J.; Clem, T. D.
1981-01-01
A multichannel scanning radiometer developed for remote observation of cloud physical properties is described. Consisting of six channels in the near infrared and one channel in the thermal infrared, the instrument can observe cloud physical parameters such as optical thickness, thermodynamic phase, cloud top altitude, and cloud top temperature. Measurement accuracy is quantified through flight tests on the NASA CV-990 and the NASA WB-57F, and is found to be limited by the harsh environment of the aircraft at flight altitude. The electronics, data system, and calibration of the instrument are also discussed.
Magnetic Field Generation During the Collision of Narrow Plasma Clouds
NASA Astrophysics Data System (ADS)
Sakai, Jun-ichi; Kazimura, Yoshihiro; Haruki, Takayuki
1999-06-01
We investigate the dynamics of the collision of narrow plasma clouds, whose transverse dimension is on the order of the electron skin depth. A 2D3V (two dimensions in space and three dimensions in velocity space) particle-in-cell (PIC) collisionless relativistic code is used to show the generation of a quasi-static magnetic field during the collision of narrow plasma clouds in both electron-ion and electron-positron (pair) plasmas. The localized strong magnetic fluxes result in the generation of charge separation with complicated structures, which may be sources of electromagnetic as well as Langmuir waves. We also present one application of this process, which occurs during coalescence of magnetic islands in a current sheet of pair plasmas.
Common Faults and Their Prioritization in Small Commercial Buildings: February 2017 - December 2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frank, Stephen M; Kim, Janghyun; Cai, Jie
To support an ongoing project at NREL titled 'An Open, Cloud-Based Platform for Whole-Building Fault Detection and Diagnostics' (work breakdown structure number 3.2.6.18, funded by the Department of Energy Building Technologies Office), this report documents faults that are commonly found in small commercial buildings (with a floor area of 10,000 ft² or less) based on a literature review and discussions with building commissioning experts. It also provides a list of prioritized faults based on an estimation of the prevalence, energy impact, and financial impact of each fault.
Medeiros, Brian; Nuijens, Louise
2016-05-31
Trade wind regions cover most of the tropical oceans, and the prevailing cloud type is shallow cumulus. These small clouds are parameterized by climate models, and changes in their radiative effects strongly and directly contribute to the spread in estimates of climate sensitivity. This study investigates the structure and variability of these clouds in observations and climate models. The study builds upon recent detailed model evaluations using observations from the island of Barbados. Using a dynamical regimes framework, satellite and reanalysis products are used to compare the Barbados region and the broader tropics. It is shown that clouds in the Barbados region are similar to those across the trade wind regions, implying that observational findings from the Barbados Cloud Observatory are relevant to clouds across the tropics. The same methods are applied to climate models to evaluate the simulated clouds. The models generally capture the cloud radiative effect, but underestimate cloud cover and show an array of cloud vertical structures. Some models show strong biases in the environment of the Barbados region in summer, weakening the connection between the regional biases and those across the tropics. Even bearing that limitation in mind, it is shown that covariations of cloud and environmental properties in the models are inconsistent with observations. The models tend to misrepresent sensitivity to moisture variations and inversion characteristics. These model errors are likely connected to cloud feedback in climate projections, and highlight the importance of the representation of shallow cumulus convection.
Radiation belt electron observations following the January 1997 magnetic cloud event
NASA Astrophysics Data System (ADS)
Selesnick, R. S.; Blake, J. B.
Relativistic electrons in the outer radiation belt associated with the January 1997 magnetic cloud event were observed by the HIST instrument on POLAR at kinetic energies from 0.7 to 7 MeV and L shells from 3 to 9. The electron enhancement occurred on a time scale of hours or less throughout the outer radiation belt, except for a more gradual rise in the higher energy electrons at the lower L values indicative of local acceleration and inward radial diffusion. At the higher L values, variations on a time scale of several days following the initial injection on January 10 are consistent with data from geosynchronous orbit and may be an adiabatic response.
Electrostatic plasma lens for focusing negatively charged particle beams.
Goncharov, A A; Dobrovolskiy, A M; Dunets, S M; Litovko, I V; Gushenets, V I; Oks, E M
2012-02-01
We describe the current status of ongoing research and development of the electrostatic plasma lens for focusing and manipulating intense beams of negatively charged particles: electrons and negative ions. The physical principle of this kind of plasma lens is based on magnetic insulation of electrons, which enables the creation of a dynamical positive space-charge cloud within a short, confined volume of the propagating beam. Here, new results are presented from experimental investigations and computer simulations of wide-aperture, intense electron beam focusing by a plasma lens whose positive space-charge cloud is produced by a cylindrical anode-layer accelerator that creates a positive ion stream directed towards the system axis.
NASA Astrophysics Data System (ADS)
Foster, S. Q.; Johnson, R. M.; Randall, D.; Denning, S.; Russell, R.; Gardiner, L.; Hatheway, B.; Genyuk, J.; Bergman, J.
2008-12-01
The need to improve the representation of cloud processes in climate models has been one of the most important limitations on the reliability of climate-change simulations. Now in its third year, the National Science Foundation-funded Center for Multi-scale Modeling of Atmospheric Processes (CMMAP) at Colorado State University is addressing this problem through a revolutionary new approach to representing cloud processes on their native scales, including the cloud-scale interaction processes that are active in cloud systems. CMMAP has set ambitious education and human-resource goals to share basic information about the atmosphere, clouds, weather, climate, and modeling with diverse K-12 and public audiences through its affiliation with the Windows to the Universe (W2U) program at the University Corporation for Atmospheric Research (UCAR). W2U web pages are written at three levels in English and Spanish. This information targets learners at all levels, educators, and families who seek to understand and share resources and information about the nature of weather and the climate system, and career role models from related research fields. This resource can also be helpful to educators who are building bridges in the classroom between the sciences, the arts, and literacy. Visitors to the W2U's CMMAP web portal can access a beautiful new clouds image gallery; information about each cloud type and the atmospheric processes that produce them; a Clouds in Art interactive; collections of weather-themed poetry, art, and myths; links to games and puzzles for children; and extensive classroom-ready resources and activities for K-12 teachers. Biographies of CMMAP scientists and graduate students are featured. Basic science concepts important to understanding the atmosphere, such as condensation, atmospheric pressure, and lapse rate, have been developed, as well as 'microworlds' that enable students to interact with experimental tools while building fundamental knowledge.
These resources can be accessed online at no cost by the entire atmospheric science K-12 and informal science education community.
Ifcwall Reconstruction from Unstructured Point Clouds
NASA Astrophysics Data System (ADS)
Bassier, M.; Klein, R.; Van Genechten, B.; Vergauwen, M.
2018-05-01
The automated reconstruction of Building Information Modeling (BIM) objects from point cloud data is still ongoing research. A key aspect is the creation of accurate wall geometry, as it forms the basis for further reconstruction of objects in a BIM. After segmenting and classifying the initial point cloud, the labelled segments are processed and the wall topology is reconstructed. However, the procedure is challenging due to noise, occlusions, and the complexity of the input data. In this work, a method is presented to automatically reconstruct consistent wall geometry from point clouds. More specifically, the use of room information is proposed to aid the wall topology creation. First, a set of partial walls is constructed based on classified planar primitives. Next, the rooms are identified using the retrieved wall information along with the floors and ceilings. The wall topology is computed by the intersection of the partial walls, conditioned on the room information. The final wall geometry is defined by creating IfcWallStandardCase objects conforming to the IFC4 standard. The result is a set of walls according to the as-built conditions of a building. The experiments show that the method is a reliable framework for wall reconstruction from unstructured point cloud data. Also, the implementation of room information reduces the rate of false positives for the wall topology. Given the walls, ceilings, and floors, 94% of the rooms are correctly identified. A key advantage of the proposed method is that it deals with complex rooms and is not bound to single storeys.
Mapping Urban Tree Canopy Cover Using Fused Airborne LIDAR and Satellite Imagery Data
NASA Astrophysics Data System (ADS)
Parmehr, Ebadat G.; Amati, Marco; Fraser, Clive S.
2016-06-01
Urban green spaces, particularly urban trees, play a key role in enhancing the liveability of cities. The availability of accurate and up-to-date maps of tree canopy cover is important for sustainable development of urban green spaces. LiDAR point clouds are widely used for the mapping of buildings and trees, and several LiDAR point cloud classification techniques have been proposed for automatic mapping. However, the effectiveness of point cloud classification techniques for automated tree extraction from LiDAR data can be impacted to the point of failure by the complexity of tree canopy shapes in urban areas. Multispectral imagery, which provides complementary information to LiDAR data, can improve point cloud classification quality. This paper proposes a reliable method for the extraction of tree canopy cover from fused LiDAR point cloud and multispectral satellite imagery data. The proposed method initially associates each LiDAR point with spectral information from the co-registered satellite imagery data. It calculates the normalised difference vegetation index (NDVI) value for each LiDAR point and corrects tree points which have been misclassified as buildings. Then, region growing of tree points, taking the NDVI value into account, is applied. Finally, the LiDAR points classified as tree points are utilised to generate a canopy cover map. The performance of the proposed tree canopy cover mapping method is experimentally evaluated on a data set of airborne LiDAR and WorldView 2 imagery covering a suburb in Melbourne, Australia.
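The per-point NDVI correction step described above can be sketched as below; the label names, the 0.4 threshold, and the band values are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def ndvi(nir, red):
    """Normalised difference vegetation index for each LiDAR point, using
    the spectral bands associated with it from co-registered imagery."""
    return (nir - red) / (nir + red + 1e-9)  # epsilon guards against 0/0

def correct_building_labels(labels, nir, red, threshold=0.4):
    """Relabel points classified as 'building' whose NDVI indicates
    vegetation (hypothetical label scheme and threshold)."""
    v = ndvi(nir, red)
    out = labels.copy()
    out[(labels == "building") & (v > threshold)] = "tree"
    return out

# demo: one canopy point misclassified as a building return (index 1)
labels = np.array(["building", "building", "tree", "building"])
nir = np.array([0.30, 0.70, 0.80, 0.25])
red = np.array([0.25, 0.10, 0.08, 0.20])
fixed = correct_building_labels(labels, nir, red)
```

Vegetation reflects strongly in the near-infrared band, so a high NDVI at a point labelled as building is good evidence the point actually belongs to tree canopy; the corrected labels then seed the NDVI-aware region growing.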
Scargiali, F; Grisafi, F; Busciglio, A; Brucato, A
2011-12-15
The formation of toxic heavy clouds as a result of sudden accidental releases from mobile containers, such as road tankers or railway tank cars, may occur inside urban areas, so the problem of evaluating their consequences arises. Due to the semi-confined nature of the dispersion site, simplified models may often be inappropriate. As an alternative, computational fluid dynamics (CFD) has the potential to provide realistic simulations even for geometrically complex scenarios, since the heavy gas dispersion process is described by basic conservation equations with a reduced number of approximations. In the present work a commercial general-purpose CFD code (CFX 4.4 by Ansys®) is employed for the simulation of dense cloud dispersion in urban areas. The proposed simulation strategy involves a stationary pre-release flow field simulation followed by dynamic after-release flow and concentration field simulations. In order to generalize the results, the computational domain is modeled as a simple network of straight roads with regularly distributed blocks mimicking the buildings. Results show that the presence of buildings lowers concentration maxima and enlarges the side spread of the cloud. Dispersion dynamics is also found to be strongly affected by the quantity of heavy gas released. Copyright © 2011 Elsevier B.V. All rights reserved.
The diverse use of clouds by CMS
Andronis, Anastasios; Bauer, Daniela; Chaze, Olivier; ...
2015-12-23
The resources CMS is using are increasingly being offered as clouds. In Run 2 of the LHC the majority of CMS CERN resources, both in Meyrin and at the Wigner Computing Centre, will be presented as cloud resources on which CMS will have to build its own infrastructure. This infrastructure will need to run all of the CMS workflows, including Tier 0, production, and user analysis. In addition, the CMS High Level Trigger will provide a compute resource comparable in scale to the total offered by the CMS Tier 1 sites when it is not running as part of the trigger system. During these periods a cloud infrastructure will be overlaid on this resource, making it accessible for general CMS use. Finally, CMS is starting to utilise cloud resources offered by individual institutes and is gaining experience to facilitate the use of opportunistically available cloud resources. We conclude by presenting a snapshot of this infrastructure and its operation at the time of the CHEP2015 conference.
NASA Technical Reports Server (NTRS)
Farrugia, C. J.; Richardson, I. G.; Burlaga, L. F.; Lepping, R. P.; Osherovich, V. A.
1993-01-01
Simultaneous ISEE 3 and IMP 8 spacecraft observations of magnetic fields and flow anisotropies of solar energetic protons and electrons during the passage of an interplanetary magnetic cloud show various particle signature differences at the two spacecraft. These differences are interpretable in terms of the magnetic line topology of the cloud, the connectivity of the cloud field lines to the solar surface, and the interconnection between the magnetic fields of the magnetic clouds and of the earth. These observations are consistent with a magnetic cloud model in which these mesoscale configurations are curved magnetic flux ropes attached at both ends to the sun's surface, extending out to 1 AU.
Fast Monte Carlo-assisted simulation of cloudy Earth backgrounds
NASA Astrophysics Data System (ADS)
Adler-Golden, Steven; Richtsmeier, Steven C.; Berk, Alexander; Duff, James W.
2012-11-01
A calculation method has been developed for rapidly synthesizing radiometrically accurate ultraviolet through long-wavelength-infrared spectral imagery of the Earth for arbitrary locations and cloud fields. The method combines cloud-free surface reflectance imagery with cloud radiance images calculated from a first-principles 3-D radiation transport model. The MCScene Monte Carlo code [1-4] is used to build a cloud image library; a data fusion method is incorporated to speed convergence. The surface and cloud images are combined with an upper atmospheric description with the aid of solar and thermal radiation transport equations that account for atmospheric inhomogeneity. The method enables a wide variety of sensor and sun locations, cloud fields, and surfaces to be combined on-the-fly, and provides hyperspectral wavelength resolution with minimal computational effort. The simulations agree very well with much more time-consuming direct Monte Carlo calculations of the same scene.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Chidong
Motivated by the success of the AMIE/DYNAMO field campaign, which collected unprecedented observations of cloud and precipitation from the tropical Indian Ocean in October 2011 – March 2012, this project explored how such observations can be applied to assist the development of global cloud-permitting models through evaluating and correcting model biases in cloud statistics. The main accomplishments of this project fall into four categories: generating observational products for model evaluation, using AMIE/DYNAMO observations to validate global model simulations, using AMIE/DYNAMO observations in numerical studies of cloud-permitting models, and providing leadership in the field. Results from this project provide valuable information for building a seamless bridge between the DOE ASR program's component on process-level understanding of cloud processes in the tropics and the RGCM focus on global variability and regional extremes. In particular, experience gained from this project is directly applicable to the evaluation and improvement of ACME, especially as it transitions to a non-hydrostatic variable-resolution model.
NASA Technical Reports Server (NTRS)
Irvine, W. M.; Hjalmarson, A.; Rydbeck, O. E. H.
1981-01-01
The physical conditions and chemical compositions of the gas in interstellar clouds are reviewed in light of the importance of interstellar clouds for star formation and the origin of life. The Orion A region is discussed as an example of a giant molecular cloud where massive stars are being formed, and it is pointed out that conditions in the core of the cloud, with a kinetic temperature of about 75 K and a density of 100,000-1,000,000 molecules/cu cm, may support gas phase ion-molecule chemistry. The Taurus Molecular Clouds are then considered as examples of cold, dark, relatively dense interstellar clouds which may be the birthplaces of solar-type stars and which have been found to contain the heaviest interstellar molecules yet discovered. The molecular species identified in each of these regions are tabulated, including such building blocks of biological monomers as H2O, NH3, H2CO, CO, H2S, CH3CN and H2, and more complex species such as HCOOCH3 and CH3CH2CN.
NAFFS: network attached flash file system for cloud storage on portable consumer electronics
NASA Astrophysics Data System (ADS)
Han, Lin; Huang, Hao; Xie, Changsheng
Cloud storage technology has become a research hotspot in recent years, but existing cloud storage services are mainly designed for data storage needs over stable, high-speed Internet connections. Mobile Internet connections are often unstable and relatively slow. These native features of the mobile Internet limit the use of cloud storage on portable consumer electronics. The Network Attached Flash File System (NAFFS) presents the idea of using the portable device's built-in NAND flash memory as the front-end cache of a virtualized cloud storage device. Modern portable devices with Internet connections have more than 1 GB of built-in NAND flash, which is quite enough for daily data storage. The data transfer rate of a NAND flash device is much higher than that of mobile Internet connections [1], and its non-volatile nature makes it very suitable as the cache device for Internet cloud storage on portable devices, which often have unstable power supplies and intermittent Internet connections. In the present work, NAFFS is evaluated with several benchmarks, and its performance is compared with that of traditional network attached file systems, such as NFS. Our evaluation results indicate that NAFFS achieves an average access speed of 3.38 MB/s, which is about 3 times faster than directly accessing cloud storage over a mobile Internet connection, and offers a more stable interface than directly using a cloud storage API. Unstable Internet connections and sudden power-off conditions are tolerated, and no data in the cache is lost in such situations.
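The front-end cache idea behind NAFFS can be sketched in a few lines. This is a toy model, not the NAFFS implementation: the class names and the `get`/`put` cloud API are hypothetical, and the dictionary stands in for the NAND flash store.

```python
class FlashCache:
    """Toy front-end cache: reads hit local flash first; writes are
    buffered locally and flushed when the (unstable) link is up.
    `cloud` is any object with get(key)/put(key, data) -- a
    hypothetical stand-in for a cloud storage API."""

    def __init__(self, cloud):
        self.cloud = cloud
        self.cache = {}     # stands in for the NAND flash store
        self.dirty = set()  # keys written locally but not yet uploaded

    def read(self, key):
        if key not in self.cache:            # miss: fetch over the network
            self.cache[key] = self.cloud.get(key)
        return self.cache[key]

    def write(self, key, data):
        self.cache[key] = data               # fast local write
        self.dirty.add(key)                  # upload deferred

    def flush(self):
        for key in sorted(self.dirty):       # called when connectivity returns
            self.cloud.put(key, self.cache[key])
        self.dirty.clear()
```

Because writes complete against local flash and uploads are deferred to `flush`, intermittent connectivity stalls only the background synchronization, not the foreground file operations, which is the behavior the abstract attributes to NAFFS.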
ELECTRON CLOUD OBSERVATIONS AND CURES IN RHIC
DOE Office of Scientific and Technical Information (OSTI.GOV)
FISCHER,W.; BLASKIEWICZ, M.; HUANG, H.
Since 2001 RHIC has experienced electron cloud effects, which have limited the beam intensity. These include dynamic pressure rises (including pressure instabilities), tune shifts, a reduction of the stability threshold for bunches crossing the transition energy, and possibly incoherent emittance growth. We summarize the main observations in operation and dedicated experiments, as well as countermeasures including baking, NEG-coated warm beam pipes, solenoids, bunch patterns, anti-grazing rings, pre-pumped cold beam pipes, scrubbing, and operation with long bunches.
Summary of SLAC's SEY Measurement On Flat Accelerator Wall Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Pimpec, F.; /PSI, Villigen /SLAC
The electron cloud effect (ECE) causes beam instabilities in accelerator structures with intense positively charged bunched beams. Reduction of the secondary electron yield (SEY) of the beam pipe inner wall is effective in controlling cloud formation. We summarize SEY results obtained from flat TiN, TiZrV and Al surfaces carried out in a laboratory environment. SEY was measured after thermal conditioning, as well as after low energy, less than 300 eV, particle exposure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Narayan, Ramesh; Sironi, Lorenzo; Oezel, Feryal
2012-10-01
A dense ionized cloud of gas has been recently discovered to be moving directly toward the supermassive black hole, Sgr A*, at the Galactic center. In 2013 June, at the pericenter of its highly eccentric orbit, the cloud will be approximately 3100 Schwarzschild radii from the black hole and will move supersonically through the ambient hot gas with a velocity of v_p ≈ 5400 km s^-1. A bow shock is likely to form in front of the cloud and could accelerate electrons to relativistic energies. We estimate via particle-in-cell simulations the energy distribution of the accelerated electrons and show that the non-thermal synchrotron emission from these electrons might exceed the quiescent radio emission from Sgr A* by a factor of several. The enhanced radio emission should be detectable at GHz and higher frequencies around the time of pericentric passage and in the following months. The bow shock emission is expected to be displaced from the quiescent radio emission of Sgr A* by ≈33 mas. Interferometric observations could resolve potential changes in the radio image of Sgr A* at wavelengths ≲6 cm.
Low cost, high performance processing of single particle cryo-electron microscopy data in the cloud
Cianfrocco, Michael A; Leschziner, Andres E
2015-01-01
The advent of a new generation of electron microscopes and direct electron detectors has realized the potential of single particle cryo-electron microscopy (cryo-EM) as a technique to generate high-resolution structures. Calculating these structures requires high performance computing clusters, a resource that may be limiting to many likely cryo-EM users. To address this limitation and facilitate the spread of cryo-EM, we developed a publicly available ‘off-the-shelf’ computing environment on Amazon's elastic cloud computing infrastructure. This environment provides users with single particle cryo-EM software packages and the ability to create computing clusters with 16–480+ CPUs. We tested our computing environment using a publicly available 80S yeast ribosome dataset and estimate that laboratories could determine high-resolution cryo-EM structures for $50 to $1500 per structure within a timeframe comparable to local clusters. Our analysis shows that Amazon's cloud computing environment may offer a viable computing environment for cryo-EM. DOI: http://dx.doi.org/10.7554/eLife.06664.001 PMID:25955969
NASA Astrophysics Data System (ADS)
Borowiec, N.
2013-12-01
Gathering information about the roof shapes of buildings is still a current issue. One of the many sources from which we can obtain information about buildings is airborne laser scanning. However, automatically detecting building roofs from a point cloud is still a complex task. The task can be performed with the help of additional information from other sources, or based on Lidar data alone. This article describes how to detect building roofs from a point cloud only. Defining the shape of the roof is carried out in three steps: the first is to find the location of the building, the second is the precise definition of its edges, and the third is the identification of the roof planes. The first step is based on grid analysis; the next two are based on the Hough transformation. The Hough transformation is a method of detecting collinear points, so it is a perfect match for determining the lines describing a roof. Edges alone, however, are not enough to properly determine the shape of the roof; the roof planes must also be identified. Thus, in this study the Hough transform also served as a tool for the detection of roof planes, the only difference being that in this case the transform is three-dimensional.
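The collinear-point detection underlying this approach can be illustrated with a minimal 2-D Hough transform. This is a generic sketch of the technique, not the paper's code; function name, grid resolutions, and the example line are all illustrative.

```python
import numpy as np

def hough_lines(points, n_theta=180, n_rho=100):
    """Vote each 2-D point into a (theta, rho) accumulator and return
    the strongest line as (theta, rho).  Collinear points all vote for
    the same cell, which is why the transform suits roof-edge detection."""
    pts = np.asarray(points, dtype=float)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rho_max = np.abs(pts).max() * np.sqrt(2) + 1e-9
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for x, y in pts:
        # Normal form of a line: rho = x*cos(theta) + y*sin(theta)
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        bins = ((rhos + rho_max) / (2 * rho_max) * (n_rho - 1)).round().astype(int)
        acc[np.arange(n_theta), bins] += 1
    t, r = np.unravel_index(acc.argmax(), acc.shape)
    rho = r / (n_rho - 1) * 2 * rho_max - rho_max
    return thetas[t], rho
```

The 3-D variant used for roof planes follows the same voting idea with a plane parameterization (two angles plus a distance) in place of (theta, rho).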
Supervised Outlier Detection in Large-Scale Mvs Point Clouds for 3d City Modeling Applications
NASA Astrophysics Data System (ADS)
Stucker, C.; Richard, A.; Wegner, J. D.; Schindler, K.
2018-05-01
We propose to use a discriminative classifier for outlier detection in large-scale point clouds of cities generated via multi-view stereo (MVS) from densely acquired images. What makes outlier removal hard are varying distributions of inliers and outliers across a scene. Heuristic outlier removal using a specific feature that encodes point distribution often delivers unsatisfying results: although most outliers can be identified correctly (high recall), many inliers are erroneously removed (low precision), too. This aggravates 3D object reconstruction due to missing data. We thus propose to discriminatively learn class-specific distributions directly from the data to achieve high precision. We apply a standard Random Forest classifier that infers a binary label (inlier or outlier) for each 3D point in the raw, unfiltered point cloud and test two approaches for training. In the first, non-semantic approach, features are extracted without considering the semantic interpretation of the 3D points. The trained model approximates the average distribution of inliers and outliers across all semantic classes. Second, semantic interpretation is incorporated into the learning process, i.e. we train separate inlier-outlier classifiers per semantic class (building facades, roof, ground, vegetation, fields, and water). The performance of learned filtering is evaluated on several large SfM point clouds of cities. The results confirm our underlying assumption that discriminatively learning inlier-outlier distributions improves precision over global heuristics by up to ≈12 percentage points. Moreover, semantically informed filtering that models class-specific distributions further improves precision by up to ≈10 percentage points, being able to remove very isolated building, roof, and water points while preserving inliers on building facades and vegetation.
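The benefit of per-class inlier-outlier models can be shown with a deliberately tiny stand-in for the Random Forest: a one-feature threshold classifier trained separately per semantic class. Everything here is illustrative (a single "local density" feature, made-up class names and values); the point is only that the learned threshold differs per class, so a sparse-but-valid water point survives its own class's model while a global model would remove it.

```python
def fit_stump(feats, labels):
    """Learn the threshold on a single feature (e.g. local point
    density) that best separates inliers (True) from outliers (False)."""
    best_t, best_acc = None, -1.0
    for t in sorted(set(feats)):
        acc = sum((f >= t) == y for f, y in zip(feats, labels)) / len(feats)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def fit_per_class(points):
    """points: iterable of (semantic_class, feature, is_inlier).
    Returns one threshold per semantic class, mirroring the idea of
    training separate inlier-outlier classifiers per class."""
    classes = {}
    for c, f, y in points:
        feats, labels = classes.setdefault(c, ([], []))
        feats.append(f)
        labels.append(y)
    return {c: fit_stump(f, y) for c, (f, y) in classes.items()}
```

A real system would replace the stump with a Random Forest over many geometric features, but the training loop per semantic class has the same shape.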
Reconstruction of 3d Models from Point Clouds with Hybrid Representation
NASA Astrophysics Data System (ADS)
Hu, P.; Dong, Z.; Yuan, P.; Liang, F.; Yang, B.
2018-05-01
The three-dimensional (3D) reconstruction of urban buildings from point clouds has long been an active topic in applications related to human activities. However, because building structures differ significantly in complexity, the task of 3D reconstruction remains challenging, especially for freeform surfaces. In this paper, we present a new reconstruction algorithm which represents the 3D model of a building as a combination of regular structures and irregular surfaces, where the regular structures are parameterized plane primitives and the irregular surfaces are expressed as meshes. The extraction of irregular surfaces starts with an over-segmentation of the unstructured point data; a region growing approach based on the adjacency graph of super-voxels is then applied to collapse these super-voxels, and the freeform surfaces can be clustered from the voxels filtered by a thickness threshold. To obtain the regular planar primitives, the remaining voxels with a larger flatness are further divided into multiscale super-voxels as basic units, and the final segmented planes are enriched and refined in a mutually reinforcing manner under the framework of a global energy optimization. We have implemented the proposed algorithms and tested them on two point clouds that differ in point density and urban character; experimental results on complex building structures illustrate the efficacy of the proposed framework.
CLOUD PEAK CONTIGUOUS, ROCK CREEK, PINEY CREEK, AND LITTLE GOOSE ROADLESS AREAS, WYOMING.
Segerstrom, Kenneth; Brown, Don S.
1984-01-01
On the basis of mineral surveys, study areas surrounding the Cloud Peak Primitive Area in northern Wyoming offer little promise for the occurrence of mineral or energy resources. The geologic setting precludes the existence of deposits of organic fuels. Nonmetallic commodities, such as feldspar, limestone, building stone, clay, sand, and gravel are present, but these materials are readily available nearby in large quantities in more accessible areas.
A cloud-based data network approach for translational cancer research.
Xing, Wei; Tsoumakos, Dimitrios; Ghanem, Moustafa
2015-01-01
We develop a new model and associated technology for constructing and managing self-organizing data to support translational cancer research studies. We employ a semantic content network approach to address the challenges of managing cancer research data. Such data is heterogeneous, large, decentralized, growing and continually being updated. Moreover, the data originates from different information sources that may be partially overlapping, creating redundancies as well as contradictions and inconsistencies. Building on the advantages of elasticity of cloud computing, we deploy the cancer data networks on top of the CELAR Cloud platform to enable more effective processing and analysis of Big cancer data.
Electric potential distributions at the interface between plasmasheet clouds
NASA Technical Reports Server (NTRS)
Evans, D. S.; Roth, M.; Lemaire, J.
1987-01-01
At the interface between two plasma clouds with different densities, temperatures, and/or bulk velocities, there are large charge separation electric fields which can be modeled in the framework of a collisionless theory for tangential discontinuities. Two different classes of layers were identified: the first corresponds to (stable) ion layers which are thicker than one ion Larmor radius; the second corresponds to (unstable) electron layers which are only a few electron Larmor radii thick. It is suggested that these thin electron layers with large electric potential gradients (up to 400 mV/m) are the regions where large-amplitude electrostatic waves are spontaneously generated. These waves scatter the pitch angles of the ambient plasmasheet electrons into the atmospheric loss cone. The unstable electron layers can therefore be considered the seat of strong pitch angle scattering for the primary auroral electrons.
Unusual chemical compositions of noctilucent-cloud particle nuclei
NASA Technical Reports Server (NTRS)
Hemenway, C. L.
1973-01-01
Two sounding rocket payloads were launched from the ESRO range in Sweden during a noctilucent cloud display. Large numbers of submicron particles were collected, most of which appear to be made up of a high density material coated with a low density material. Typical electron micrographs are shown. Particle chemical compositions have been measured by use of dispersive X-ray analysis equipment attached to an electron microscope and have revealed that most of the high density particle nuclei have atomic weights greater than iron.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vay, J.-L.; Furman, M.A.; Azevedo, A.W.
2004-04-19
We have integrated the electron-cloud code POSINST [1] with WARP [2]--a 3-D parallel Particle-In-Cell accelerator code developed for Heavy Ion Inertial Fusion--so that the two can interoperate. Both codes are run in the same process, communicate through a Python interpreter (already used in WARP), and share certain key arrays (so far, particle positions and velocities). Currently, POSINST provides primary and secondary sources of electrons, beam bunch kicks, a particle mover, and diagnostics. WARP provides the field solvers and diagnostics. Secondary emission routines are provided by the Tech-X package CMEE.
Reconstruction of Building Outlines in Dense Urban Areas Based on LIDAR Data and Address Points
NASA Astrophysics Data System (ADS)
Jarzabek-Rychard, M.
2012-07-01
The paper presents a comprehensive method for the automated extraction and delineation of building outlines in densely built-up areas. A novel aspect of the outline reconstruction is the use of geocoded building address points: they give information about building location and thus greatly reduce the task complexity. The reconstruction process is executed on 3D point clouds acquired by an airborne laser scanner. The method consists of three steps: building detection, delineation, and contour refinement. The algorithm is tested against a data set covering an old market town and its surroundings. The results are discussed and evaluated by comparison with reference cadastral data.
Cloud immersion building shielding factors for US residential structures.
Dickson, E D; Hamby, D M
2014-12-01
This paper presents validated building shielding factors designed for contemporary US housing-stock under an idealized, yet realistic, exposure scenario within a semi-infinite cloud of radioactive material. The building shielding factors are intended for use in emergency planning and level three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-unit models using the general-purpose Monte Carlo N-Particle computational code, MCNP5, and are benchmarked against a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing-stock. Each model was designed to scale based on common residential construction practices and includes, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without basement, as well as for single-wide manufactured housing-units.
Automatic Reconstruction of 3D Building Models from Terrestrial Laser Scanner Data
NASA Astrophysics Data System (ADS)
El Meouche, R.; Rezoug, M.; Hijazi, I.; Maes, D.
2013-11-01
With modern 3D laser scanners we can acquire a large amount of 3D data in only a few minutes. This technology results in a growing number of applications ranging from the digitalization of historical artifacts to facial authentication. The modeling process demands a lot of time and work (Tim Volodine, 2007); in comparison with the other two stages, acquisition and registration, its degree of automation is almost zero. In this paper, we propose a new surface reconstruction technique for buildings to process the data obtained by a 3D laser scanner. These data are called a point cloud: a collection of points sampled from the surface of a 3D object, which can consist of millions of points. In order to work more efficiently, we used simplified models which contain fewer points, and thus less detail, than a point cloud obtained in situ. The goal of this study was to facilitate the modeling process of a building starting from 3D laser scanner data. To this end, we wrote two scripts for Rhinoceros 5.0 based on intelligent algorithms. The first script finds the exterior outline of a building. With a minimum of human interaction, a thin box is drawn around the surface of a wall. This box is able to rotate 360° around an axis in a corner of the wall in search of the points of other walls. In this way we can eliminate noise points, i.e. unwanted or irrelevant points. If there is an angled roof, the box can also turn around the edge between the wall and the roof. From the different positions of the box we can calculate the exterior outline. The second script draws the interior outline in a surface of a building, by which we mean the outline of openings such as windows or doors. This script is based on the distances between the points and on vector characteristics. Two consecutive points with a relatively large distance between them will form the outline of an opening.
Once those points are found, the interior outline can be drawn. For simple point clouds, the designed scripts ensure the elimination of almost all noise points and the reconstruction of a CAD model.
NASA Astrophysics Data System (ADS)
Usachev, A. D.; Zobnin, A. V.; Shonenkov, A. V.; Lipaev, A. M.; Molotkov, V. I.; Petrov, O. F.; Fortov, V. E.; Pustyl'nik, M. Y.; Fink, M. A.; Thoma, M. A.; Thomas, H. M.; Padalka, G. I.
2018-01-01
The influence of an elongated dust cloud on the intensities of different neon spectral lines in the visible and near-IR spectral ranges in a uniform positive column has been experimentally investigated using the Russian-European space apparatus "Plasma Kristall-4" (SA PK-4) on board the International Space Station (ISS). The investigation was performed in a low-pressure (0.5 mbar) direct current (dc, 1 mA) gas discharge in neon. Microgravity allowed us to perform experiments with a large dust cloud in the steady-state regime. To avoid dust cloud drift in the dc electric field, a switching dc polarity discharge mode was applied. During the experiment a dust cloud of 9 mm in diameter was observed in the steady-state regime in a discharge tube of 30 mm in diameter and about 100 mm in length. With the dust cloud present, the intensities of neon spectral lines corresponding to 3p → 3s electronic transitions increased by a factor of 1.4, while the intensities of neon spectral lines corresponding to 3d → 3p electronic transitions increased by a factor of 1.6. The observed phenomenon is explained on the basis of the Schottky approach by a self-consistent rising dc electric field in the dusty plasma cloud resulting in an increase of the electron temperature.
Applications of Panoramic Images: from 720° Panorama to Interior 3d Models of Augmented Reality
NASA Astrophysics Data System (ADS)
Lee, I.-C.; Tsai, F.
2015-05-01
A series of panoramic images is usually used to generate a 720° panorama image. Although panoramic images are typically used for establishing tour guiding systems, in this research we demonstrate the potential of using panoramic images acquired from multiple sites to create not only a 720° panorama, but also three-dimensional (3D) point clouds and 3D indoor models. Since 3D modeling is one of the goals of this research, the locations of the panoramic sites needed to be carefully planned in order to maintain a robust result for close-range photogrammetry. After the images are acquired, they are processed into 720° panoramas, which can be used directly in panorama guiding systems or other applications. In addition to these straightforward applications, interior orientation parameters can also be estimated while generating the 720° panorama: focal length, principal point, and lens radial distortion. The panoramic images can then be processed with close-range photogrammetry procedures to extract the exterior orientation parameters and generate 3D point clouds. In this research, VisualSFM, a structure-from-motion software package, is used to estimate the exterior orientation, and the CMVS toolkit is used to generate 3D point clouds. Next, the 3D point clouds are used as references to create building interior models. Trimble SketchUp was used to build the model, with the 3D point cloud guiding the placement of building objects via a plane-finding procedure. In the texturing process, the panorama images are used as the data source for creating model textures. This 3D indoor model was used as an Augmented Reality model replacing the guide map or floor plan commonly used in an online touring guide system. The 3D indoor model generation procedure has been utilized in two research projects: a cultural heritage site at Kinmen, and the Taipei Main Station pedestrian zone guidance and navigation system.
The results presented in this paper demonstrate the potential of using panoramic images to generate 3D point clouds and 3D models. However, this is currently a manual and labor-intensive process, and research is being carried out to increase the degree of automation of these procedures.
Upper D region chemical kinetic modeling of LORE relaxation times
NASA Astrophysics Data System (ADS)
Gordillo-Vázquez, F. J.; Luque, A.; Haldoupis, C.
2016-04-01
The recovery times of upper D region electron density elevations, caused by lightning-induced electromagnetic pulses (EMP), are modeled. The work was motivated by the need to understand a recently identified narrowband VLF perturbation named LOREs, an acronym for LOng Recovery Early VLF events. LOREs associate with long-living electron density perturbations in the upper D region ionosphere; they are generated by strong EMP radiated from the large peak current intensities of ±CG (cloud-to-ground) lightning discharges, known also to be capable of producing elves. Relaxation model scenarios are considered first for a weak enhancement in electron density and then for a much stronger one caused by an intense lightning EMP acting as an impulsive ionization source. The full nonequilibrium kinetic modeling of the perturbed mesosphere in the 76 to 92 km range during LORE-occurring conditions predicts that the electron density relaxation time is controlled by electron attachment at lower altitudes, whereas above 79 km attachment is balanced totally by associative electron detachment, so that electron loss at these higher altitudes is controlled mainly by electron recombination with hydrated positive clusters H+(H2O)n and secondarily by dissociative recombination with NO+ ions, a process which gradually dominates at altitudes >88 km. The calculated recovery times agree fairly well with LORE observations. In addition, a simplified (quasi-analytic) model built for the key charged species and chemical reactions is applied, which arrives at results similar to those of the full kinetic model. Finally, the modeled recovery estimates for lower altitudes, that is <79 km, are in good agreement with the observed short recovery times of typical early VLF events, which are known to be associated with sprites.
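The competition between loss channels described above can be caricatured with a two-term relaxation equation, dn/dt = -β n - α n², where the linear term stands for net attachment and the quadratic term for recombination with positive ions (here crudely assumed equal in density to the electrons). This is a toy sketch with illustrative coefficients, not D-region values, and it is not the authors' kinetic model.

```python
def relax_time(n0, beta_att, alpha_rec, dt=1e-4, t_max=10.0):
    """Integrate dn/dt = -beta_att*n - alpha_rec*n**2 with forward
    Euler and return the time for the perturbation n0 to decay to
    n0/e, a simple proxy for the VLF recovery time."""
    n, t = n0, 0.0
    target = n0 / 2.718281828459045  # decay to n0/e
    while n > target and t < t_max:
        n += dt * (-beta_att * n - alpha_rec * n * n)
        t += dt
    return t
```

With α = 0 the recovery time reduces to 1/β, and switching on the quadratic recombination term shortens it, mirroring the altitude-dependent shift between attachment-controlled and recombination-controlled loss in the abstract.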
Accuracy assessment of building point clouds automatically generated from iphone images
NASA Astrophysics Data System (ADS)
Sirmacek, B.; Lindenbergh, R.
2014-06-01
Low-cost sensor-generated 3D models can be useful for quick 3D urban model updating, yet the quality of the models is questionable. In this article, we evaluate the reliability of an automatic point cloud generation method using multi-view iPhone images or an iPhone video file as input. We register such an automatically generated point cloud onto a TLS point cloud of the same object to discuss the accuracy, advantages and limitations of the iPhone-generated point clouds. For the chosen showcase example, we classified 1.23% of the iPhone point cloud points as outliers, and calculated the mean of the point-to-point distances to the TLS point cloud as 0.11 m. Since a TLS point cloud might also include measurement errors and noise, we computed local noise values for the point clouds from both sources. The mean (μ) and standard deviation (σ) of the roughness histograms are (μ1 = 0.44 m, σ1 = 0.071 m) and (μ2 = 0.025 m, σ2 = 0.037 m) for the iPhone and TLS point clouds, respectively. Our experimental results indicate the possible use of the proposed automatic 3D model generation framework for 3D urban map updating, fusion and detail enhancement, and quick, real-time change detection. However, further insight is needed into the circumstances required to guarantee successful point cloud generation from smartphone images.
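The point-to-point comparison described above (mean nearest-neighbour distance to the reference TLS cloud, plus an outlier fraction) can be sketched with a KD-tree query. This is a generic sketch, not the authors' implementation; the 0.5 m outlier threshold is an illustrative assumption:

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_stats(query_pts, ref_pts, outlier_thresh=0.5):
    """Nearest-neighbour distance from each query point to a reference cloud.

    Returns the mean point-to-point distance over inliers and the fraction
    of points whose distance exceeds `outlier_thresh` (metres).
    """
    tree = cKDTree(ref_pts)
    d, _ = tree.query(query_pts, k=1)        # distance to closest reference point
    outlier_frac = np.mean(d > outlier_thresh)
    mean_dist = d[d <= outlier_thresh].mean()
    return mean_dist, outlier_frac

# Synthetic example: a noisy copy of a reference cloud
rng = np.random.default_rng(0)
ref = rng.uniform(0, 10, size=(2000, 3))
qry = ref + rng.normal(0, 0.05, size=ref.shape)
mean_d, frac = cloud_to_cloud_stats(qry, ref)
```

With registration already done (e.g. by ICP), these two numbers correspond directly to the 0.11 m mean distance and 1.23% outlier figures reported in the abstract.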
Other satellite atmospheres: Their nature and planetary interactions
NASA Technical Reports Server (NTRS)
Smyth, W. H.
1982-01-01
The Io sodium cloud model was successfully generalized to include the time- and spatially-dependent lifetime sink produced by electron impact ionization as the plasma torus oscillates about the satellite plane, while simultaneously including the additional time dependence introduced by the action of solar radiation pressure on the cloud. Very preliminary model results are discussed, and continuing progress in the analysis of the peculiar directional features of the sodium cloud is also reported. Significant progress was made in developing a model for the Io potassium cloud, and the differences anticipated between the potassium and sodium clouds are described. An effort to understand the hydrogen atmosphere associated with Saturn's rings was initiated, and preliminary results of this study are summarized.
Security and privacy preserving approaches in the eHealth clouds with disaster recovery plan.
Sahi, Aqeel; Lai, David; Li, Yan
2016-11-01
Cloud computing was introduced as an alternative storage and computing model in the health sector as well as other sectors to handle large amounts of data. Many healthcare companies have moved their electronic data to the cloud in order to reduce in-house storage, IT development and maintenance costs. However, storing the healthcare records in a third-party server may cause serious storage, security and privacy issues. Therefore, many approaches have been proposed to preserve security as well as privacy in cloud computing projects. Cryptographic-based approaches were presented as one of the best ways to ensure the security and privacy of healthcare data in the cloud. Nevertheless, the cryptographic-based approaches which are used to transfer health records safely remain vulnerable regarding security, privacy, or the lack of any disaster recovery strategy. In this paper, we review the related work on security and privacy preserving as well as disaster recovery in the eHealth cloud domain. Then we propose two approaches, the Security-Preserving approach and the Privacy-Preserving approach, and a disaster recovery plan. The Security-Preserving approach is a robust means of ensuring the security and integrity of Electronic Health Records, and the Privacy-Preserving approach is an efficient authentication approach which protects the privacy of Personal Health Records. Finally, we discuss how the integrated approaches and the disaster recovery plan can ensure the reliability and security of cloud projects.
NASA Astrophysics Data System (ADS)
Seeley, J.; Romps, D. M.
2015-12-01
Recent work by Singh and O'Gorman has produced a theory for convective available potential energy (CAPE) in radiative-convective equilibrium. In this model, the atmosphere deviates from a moist adiabat—and, therefore, has positive CAPE—because entrainment causes evaporative cooling in cloud updrafts, thereby steepening their lapse rate. This has led to the proposal that CAPE increases with global warming because the strength of evaporative cooling scales according to the Clausius-Clapeyron (CC) relation. However, CAPE could also change due to changes in cloud buoyancy and changes in the entrainment rate, both of which could vary with global warming. To test the relative importance of changes in CAPE due to CC scaling of evaporative cooling, changes in cloud buoyancy, and changes in the entrainment rate, we subject a cloud-resolving model to a suite of natural (and unnatural) forcings. We find that CAPE changes are primarily driven by changes in the strength of evaporative cooling; the effect of changes in the entrainment rate and cloud buoyancy are comparatively small. This builds support for CC scaling of CAPE.
Investigation of diocotron modes in toroidally trapped electron plasmas using non-destructive method
NASA Astrophysics Data System (ADS)
Lachhvani, Lavkesh; Pahari, Sambaran; Sengupta, Sudip; Yeole, Yogesh G.; Bajpai, Manu; Chattopadhyay, P. K.
2017-10-01
Experiments with trapped electron plasmas in a SMall Aspect Ratio Toroidal device (SMARTEX-C) have demonstrated a flute-like mode represented by oscillations on capacitive (wall) probes. Although analogous to the diocotron mode observed in linear electron traps, the mode evolution in toroids can have interesting consequences due to the presence of an inhomogeneous magnetic field. In SMARTEX-C, the probe signals are observed to undergo a transition from small, near-sinusoidal oscillations to large-amplitude, non-linear "double-peaked" oscillations. To interpret the wall probe signal and reveal the underlying dynamics, an expression for the current induced on the probe by an oscillating charge is derived, utilizing Green's Reciprocation Theorem. The equilibrium position, poloidal velocity, and charge content of the cloud, required to compute the induced current, are estimated from the experiments. The signal through the capacitive probes is thereby computed numerically for possible charge cloud trajectories. In order to correlate with experiments, starting with an intuitive guess of the trajectory, the model is evolved and tweaked to arrive at a signal consistent with experimentally observed probe signals. A possible vortex-like dynamics, hitherto unexplored in toroidal geometries, is predicted for a limited set of experimental observations from SMARTEX-C. Though heuristic, a useful interpretation of capacitive probe data in terms of charge cloud dynamics is obtained.
Earth observations taken from OV-105 during the STS-99 mission
2000-02-17
S99-E-5555 (17 February 2000) --- As photographed from the Space Shuttle Endeavour, this oblique electronic still image of Earth's horizon reveals a great deal of cloud cover. In the case of the electronic still camera (ESC), as well as film-bearing instruments, clouds naturally obscure views of recognizable land masses. Much of Earth is heavily cloud-covered during the current mission, and meteorologists and oceanographers are interested in studying that aspect. However, the Shuttle Radar Topography Mission's other sensing equipment, X-SAR and C-band antennae, are able to penetrate cloud cover and record important topographic data for mapmakers and scientists of other disciplines. In addition to the sensing equipment mentioned above, this mission is supporting the EarthKAM project, which utilizes the services of another electronic still camera mounted in Endeavour's windows. Unlike this oblique view, EarthKAM records strictly vertical or nadir imagery of points all over the world. Students across the United States and in France, Germany and Japan are taking photos throughout the STS-99 mission. They are using these new photos, plus all the images already available in the EarthKAM system, to enhance their classroom learning in Earth and space science, social studies, geography, mathematics and more.
NASA Astrophysics Data System (ADS)
Seo, Byonghoon; Li, Hui; Bellan, Paul
2017-10-01
We are studying magnetized target fusion using an experimental method where an imploding liner compressing a plasma is simulated by a high-speed MHD-driven plasma jet colliding with a gas target cloud. This has the advantage of being non-destructive, so orders of magnitude more shots are possible. Since the actual density and temperature are much more modest than fusion-relevant values, the goal is to determine the scaling of the increase in density and temperature when an actual experimental plasma is adiabatically compressed. Two newly developed diagnostics are operating and providing data. The first is a fiber-coupled interferometer which measures line-integrated electron density not only as a function of time, but also as a function of position along the jet. The second is laser Thomson scattering, which measures electron density and temperature at the location where the jet collides with the cloud. These diagnostics show that when the jet collides with a target cloud, the jet slows down substantially and both the electron density and temperature increase. The experimental measurements are being compared with 3D MHD and hybrid kinetic numerical simulations that model the actual experimental geometry.
Hazard calculations of diffuse reflected laser radiation for the SELENE program
NASA Technical Reports Server (NTRS)
Miner, Gilda A.; Babb, Phillip D.
1993-01-01
The hazards from diffuse laser light reflections off water clouds, ice clouds, and fog and from possible specular reflections off ice clouds were assessed with the American National Standards (ANSI Z136.1-1986) for the free-electron-laser parameters under consideration for the Segmented Efficient Laser Emission for Non-Nuclear Electricity (SELENE) Program. Diffuse laser reflection hazards exist for water cloud surfaces less than 722 m in altitude and ice cloud surfaces less than 850 m in altitude. Specular reflections from ice crystals in cirrus clouds are not probable; however, any specular reflection is a hazard to ground observers. The hazard to the laser operators and any ground observers during heavy fog conditions is of such significant magnitude that the laser should not be operated in fog.
Noctilucent cloud polarimetry: Twilight measurements in a wide range of scattering angles
NASA Astrophysics Data System (ADS)
Ugolnikov, Oleg S.; Maslov, Igor A.; Kozelov, Boris V.; Dlugach, Janna M.
2016-06-01
Wide-field polarization measurements of the twilight sky background during several nights with bright and extended noctilucent clouds in central and northern Russia in 2014 and 2015 are used to build the phase dependence of the degree of polarization of sunlight scattered by cloud particles in a wide range of scattering angles (from 40° to 130°). This range covers the linear polarization maximum near 90° and large-angle slope of the curve. The polarization in this angle range is most sensitive to the particle size. The method of separation of scattering on cloud particles from the twilight background is presented. Results are compared with T-matrix simulations for different sizes and shapes of ice particles; the best-fit model radius of particles (0.06 μm) and maximum radius (about 0.1 μm) are estimated.
3D Building Façade Reconstruction Using Handheld Laser Scanning Data
NASA Astrophysics Data System (ADS)
Sadeghi, F.; Arefi, H.; Fallah, A.; Hahn, M.
2015-12-01
Three-dimensional building modelling has been an interesting topic of research for decades, and it seems that photogrammetric methods provide the only economic means to acquire truly 3D city data. Given the enormous developments in 3D building reconstruction, with applications such as navigation systems, location-based services and urban planning, the need to consider semantic features (such as windows and doors) has become more essential than ever; a 3D model of buildings as blocks is no longer sufficient. To reconstruct the façade elements completely, we employed high-density point cloud data obtained from a handheld laser scanner. The advantage of the handheld laser scanner, with its capability of directly acquiring very dense 3D point clouds, is that there is no need to derive three-dimensional data from multiple images using structure-from-motion techniques. This paper presents a grammar-based algorithm for façade reconstruction using handheld laser scanner data. The proposed method is a combination of bottom-up (data-driven) and top-down (model-driven) methods in which the façade's basic elements are first extracted in a bottom-up way and then serve as prior knowledge for further processing to complete the models, especially in occluded and incomplete areas. The first step of data-driven modelling is using a conditional RANSAC (RANdom SAmple Consensus) algorithm to detect the façade plane in the point cloud data and remove noisy objects like trees, pedestrians, traffic signs and poles. Then, the façade planes are divided into three depth layers to detect protrusion, indentation and wall points using a density histogram. Due to the inappropriate reflection of laser beams from glass, the windows appear as holes in the point cloud data and can therefore be distinguished and extracted easily from the point cloud compared to the other façade elements.
The next step is rasterizing the indentation layer that holds the window and door information. After the rasterization process, morphological operators are applied in order to remove small irrelevant objects. Next, horizontal splitting lines are employed to determine floors, and vertical splitting lines are employed to detect walls, windows, and doors. The wall, window and door elements, which are named terminals, are clustered during a classification process. Each terminal carries a particular property, its width. Among the terminals, windows and doors are named the geometry tiles in the definition of the vocabulary of the grammar rules. Higher-order structures inferred by grouping the tiles result in the production rules. The rules, together with the three-dimensionally modelled façade elements, constitute a formal grammar named the façade grammar. This grammar holds all the information necessary to reconstruct façades in the style of the given building. Thus, it can be used to improve and complete façade reconstruction in areas with no or limited sensor data. Finally, a 3D reconstructed façade model is generated, whose geometric size and position accuracy depend on the density of the raw point cloud.
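The plane-detection stage of the pipeline above rests on a RANSAC core. A minimal sketch of plain RANSAC plane fitting follows (without the authors' extra conditions on the candidate planes); the function name, iteration count, and distance threshold are illustrative assumptions:

```python
import numpy as np

def ransac_plane(points, n_iters=200, dist_thresh=0.05, rng=None):
    """Fit a dominant plane with RANSAC.

    Repeatedly samples 3 points, builds the plane through them, and keeps
    the plane supported by the most inliers (points within dist_thresh).
    Returns (unit normal n, offset d with n.x + d = 0, inlier mask).
    """
    if rng is None:
        rng = np.random.default_rng(0)
    best_inliers, best_plane = None, None
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-12:                     # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n @ p0
        inliers = np.abs(points @ n + d) < dist_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (n, d)
    return best_plane[0], best_plane[1], best_inliers

# Synthetic façade: a z = 0 plane plus scattered clutter ("trees, poles")
rng = np.random.default_rng(1)
plane_pts = np.c_[rng.uniform(0, 10, (500, 2)), rng.normal(0, 0.01, 500)]
clutter = rng.uniform(0, 10, (100, 3))
pts = np.vstack([plane_pts, clutter])
n, d, mask = ransac_plane(pts)
```

Points flagged by `mask` would feed the subsequent depth-layer split, while the complement (clutter) is discarded, mirroring the noisy-object removal step described above.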
NASA Technical Reports Server (NTRS)
Chenette, D. L.; Stone, E. C.
1983-01-01
An analysis of the electron absorption signature observed by the Cosmic Ray System (CRS) on Voyager 2 near the orbit of Mimas is presented. We find that these observations cannot be explained as the absorption signature of Mimas. Combining Pioneer 11 and Voyager 2 measurements of the electron flux at Mimas's orbit (L=3.1), we find an electron spectrum where most of the flux above approx. 100 keV is concentrated near 1 to 3 MeV. The expected Mimas absorption signature is calculated from this spectrum neglecting radial diffusion. A lower limit on the diffusion coefficient for MeV electrons is obtained. With a diffusion coefficient this large, both the Voyager 2 and the Pioneer 11 small-scale electron absorption signature observations in Mimas's orbit are enigmatic. Thus we refer to the mechanism producing these signatures as the Mimas ghost. A cloud of material in orbit with Mimas may account for the observed electron signature if the cloud is at least 1% opaque to electrons across a region extending over a few hundred kilometers.
Mobile healthcare information management utilizing Cloud Computing and Android OS.
Doukas, Charalampos; Pliakas, Thomas; Maglogiannis, Ilias
2010-01-01
Cloud Computing provides functionality for managing information data in a distributed, ubiquitous and pervasive manner supporting several platforms, systems and applications. This work presents the implementation of a mobile system that enables electronic healthcare data storage, update and retrieval using Cloud Computing. The mobile application is developed using Google's Android operating system and provides management of patient health records and medical images (supporting DICOM format and JPEG2000 coding). The developed system has been evaluated using the Amazon's S3 cloud service. This article summarizes the implementation details and presents initial results of the system in practice.
NASA Technical Reports Server (NTRS)
Landt, J. A.
1974-01-01
The geometries of dense solar wind clouds are estimated by comparing single-location measurements of the solar wind plasma with the average of the electron density obtained by radio signal delay measurements along a radio path between earth and interplanetary spacecraft. Several of these geometries agree with the current theoretical spatial models of flare-induced shock waves. A new class of spatially limited structures that contain regions with densities greater than any observed in the broad clouds is identified. The extent of a cloud was found to be approximately inversely proportional to its density.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Backfish, Michael
This paper documents the use of four retarding field analyzers (RFAs) to measure electron cloud signals created in Fermilab’s Main Injector during 120 GeV operations. The first data set was taken from September 11, 2009 to July 4, 2010. This data set is used to compare two different types of beam pipe that were installed in the accelerator. Two RFAs were installed in a normal steel beam pipe like the rest of the Main Injector, while another two were installed in a one-meter section of beam pipe that was coated on the inside with titanium nitride (TiN). A second data run started on August 23, 2010 and ended on January 10, 2011, when Main Injector beam intensities were reduced, thus eliminating the electron cloud. This second run used the same RFA setup, but the TiN-coated beam pipe was replaced by a one-meter section coated with amorphous carbon (aC). This section of beam pipe was provided by CERN in an effort to better understand how an aC coating performs over time in an accelerator. The research consists of three basic parts: (a) continuously monitoring the conditioning of the three different types of beam pipe over both time and absorbed electrons, (b) measurement of the characteristics of the surrounding magnetic fields in the Main Injector in order to better relate the data observed in the Main Injector with simulations, and (c) measurement of the energy spectrum of the electron cloud signals using retarding field analyzers in all three types of beam pipe.
NASA Technical Reports Server (NTRS)
Jensen, Eric J.; Tabazadeh, Azadeh; Drdla, Katja; Toon, Owen B.; Gore, Warren J. (Technical Monitor)
2000-01-01
Recent satellite and in situ measurements have indicated that limited denitrification can occur in the Arctic stratosphere. In situ measurements from the SOLVE campaign indicate polar stratospheric clouds (PSCs) composed of small numbers (about 3 x 10^-4 cm^-3) of 10-20 micron particles (probably NAT or NAD). These observations raise the issue of whether low-number-density NAT PSCs can substantially denitrify the air with reasonable cloud lifetimes. In this study, we use a one-dimensional cloud model to investigate the vertical redistribution of HNO3 by NAT/NAD PSCs. The cloud formation is driven by a temperature oscillation which drops the temperature below the NAT/NAD formation threshold (about 195 K) for a few days. We assume that a small fraction of the available aerosols act as NAT nuclei when the saturation ratio of HNO3 over NAT (NAD) exceeds 10 (1.5). The result is a cloud between about 16 and 20 km in the model, with NAT/NAD particle effective radii as large as about 10 microns (in agreement with the SOLVE data). We find that for typical cloud lifetimes of 2-3 days or less, the net depletion of HNO3 is no more than 1-2 ppbv, regardless of the NAT or NAD particle number density. Repeated passes of the air column through the cold pool build up the denitrification to 3-4 ppbv, and the cloud altitude steadily decreases due to the downward transport of nitric acid. Increasing the cloud lifetime results in considerably more effective denitrification, even with very low cloud particle number densities. As expected, the degree of denitrification by NAT clouds is much larger than that by NAD clouds. Significant denitrification by NAD clouds is only possible if the cloud lifetime is several days or more. The clouds also cause a local maximum in the HNO3 mixing ratio at cloud base, where the cloud particles sublimate.
Spontaneous Ad Hoc Mobile Cloud Computing Network
Lacuesta, Raquel; Sendra, Sandra; Peñalver, Lourdes
2014-01-01
Cloud computing helps users and companies share computing resources instead of having local servers or personal devices to handle applications. Smart devices are becoming one of the main information-processing devices. Their computing features are reaching levels that let them create a mobile cloud computing network. But sometimes they cannot create it and collaborate actively in the cloud, because it is difficult for them to easily build a spontaneous network and configure its parameters. For this reason, in this paper we present the design and deployment of a spontaneous ad hoc mobile cloud computing network. In order to perform this, we have developed a trusted algorithm that is able to manage the activity of the nodes when they join and leave the network. The paper shows the network procedures and classes that have been designed. Our simulation results using Castalia show that our proposal presents good efficiency and network performance even with a high number of nodes. PMID:25202715
Cloud Computing for the Grid: GridControl: A Software Platform to Support the Smart Grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
GENI Project: Cornell University is creating a new software platform for grid operators called GridControl that will utilize cloud computing to more efficiently control the grid. In a cloud computing system, there are minimal hardware and software demands on users. The user can tap into a network of computers that is housed elsewhere (the cloud) and the network runs computer applications for the user. The user only needs interface software to access all of the cloud’s data resources, which can be as simple as a web browser. Cloud computing can reduce costs, facilitate innovation through sharing, empower users, and improve the overall reliability of a dispersed system. Cornell’s GridControl will focus on 4 elements: delivering the state of the grid to users quickly and reliably; building networked, scalable grid-control software; tailoring services to emerging smart grid uses; and simulating smart grid behavior under various conditions.
Automatic Building Abstraction from Aerial Photogrammetry
NASA Astrophysics Data System (ADS)
Ley, A.; Hänsch, R.; Hellwich, O.
2017-09-01
Multi-view stereo has been shown to be a viable tool for the creation of realistic 3D city models. Nevertheless, it still poses significant challenges, since it results in dense but noisy and incomplete point clouds when applied to aerial images. 3D city modelling usually requires a different representation of the 3D scene than these point clouds. This paper applies a fully automatic pipeline to generate a simplified mesh from a given dense point cloud. The mesh provides a certain level of abstraction, as it consists only of relatively large planar and textured surfaces. Thus, it is possible to remove noise, outliers, and clutter while maintaining a high level of accuracy.
2001-01-03
KENNEDY SPACE CENTER, Fla. -- Under wispy white morning clouds, Space Shuttle Atlantis approaches Launch Pad 39A, which shows the Rotating Service Structure open (left) and the Fixed Service Structure (right). At the RSS, the payload canister is being lifted up to the Payload Changeout Room. This is the Shuttle’s second attempt at rollout. On Jan. 2, a failed computer processor on the crawler transporter aborted the rollout, and the Shuttle was returned to the Vehicle Assembly Building using a secondary computer processor on the vehicle. Atlantis will fly on mission STS-98, the seventh construction flight to the International Space Station, carrying the U.S. Laboratory, named Destiny. The lab will have five system racks already installed inside the module. After delivery of electronics in the lab, electrically powered attitude control for Control Moment Gyroscopes will be activated. Atlantis is scheduled for launch no earlier than Jan. 19, 2001, with a crew of five.
2006-09-07
KENNEDY SPACE CENTER, FLA. - Storm clouds fill the sky from Launch Pad 39B, at right, west beyond the Vehicle Assembly Building. Space Shuttle Atlantis still sits on the pad after a scrub was called Aug. 27 due to a concern with fuel cell 1. Towering above the shuttle is the 80-foot lightning mast. During the STS-115 mission, Atlantis' astronauts will deliver and install the 17.5-ton, bus-sized P3/P4 integrated truss segment on the station. The girder-like truss includes a set of giant solar arrays, batteries and associated electronics and will provide one-fourth of the total power-generation capability for the completed station. This mission is the 116th space shuttle flight, the 27th flight for orbiter Atlantis, and the 19th U.S. flight to the International Space Station. STS-115 is scheduled to last 11 days with a planned landing at KSC. Photo credit: NASA/Ken Thornsley
NASA Technical Reports Server (NTRS)
LaMothe, J.; Ferland, Gary J.
2002-01-01
Recombination cooling, in which a free electron emits light while being captured to an ion, is an important cooling process in photoionized clouds that are optically thick or have low metallicity. State specific rather than total recombination cooling rates are needed since the hydrogen atom tends to become optically thick in high-density regimes such as Active Galactic Nuclei. This paper builds upon previous work to derive the cooling rate over the full temperature range where the process can be a significant contributor in a photoionized plasma. We exploit the fact that the recombination and cooling rates are given by intrinsically similar formulae to express the cooling rate in terms of the closely related radiative recombination rate. We give an especially simple but accurate approximation that works for any high hydrogenic level and can be conveniently employed in large-scale numerical simulations.
Advanced Opto-Electronics (LIDAR and Microsensor Development)
NASA Technical Reports Server (NTRS)
Vanderbilt, Vern C. (Technical Monitor); Spangler, Lee H.
2005-01-01
Our overall intent in this aspect of the project was to establish a collaborative effort between several departments at Montana State University for developing advanced optoelectronic technology to advance the state of the art in optical remote sensing of the environment. Our particular focus was on the development of small systems that can eventually be used in a wide variety of applications, including ground-, air-, and space-based deployments, possibly in sensor networks. Specific objectives were to: 1) build a field-deployable direct-detection lidar system for use in measurements of clouds, aerosols, fish, and vegetation; 2) develop a breadboard prototype water vapor differential absorption lidar (DIAL) system based on highly stable, tunable diode laser technology developed previously at MSU. We accomplished both primary objectives of this project, developing a field-deployable direct-detection lidar and a breadboard prototype of a water vapor DIAL system. This paper summarizes each of these accomplishments.
NASA Astrophysics Data System (ADS)
McGibbon, J.; Bretherton, C. S.
2017-06-01
During the Marine ARM GPCI Investigation of Clouds (MAGIC) in October 2011 to September 2012, a container ship making periodic cruises between Los Angeles, CA, and Honolulu, HI, was instrumented with surface meteorological, aerosol and radiation instruments, a cloud radar and ceilometer, and radiosondes. Here large-eddy simulation (LES) is performed in a ship-following frame of reference for 13 four day transects from the MAGIC field campaign. The goal is to assess if LES can skillfully simulate the broad range of observed cloud characteristics and boundary layer structure across the subtropical stratocumulus to cumulus transition region sampled during different seasons and meteorological conditions. Results from Leg 15A, which sampled a particularly well-defined stratocumulus to cumulus transition, demonstrate the approach. The LES reproduces the observed timing of decoupling and transition from stratocumulus to cumulus and matches the observed evolution of boundary layer structure, cloud fraction, liquid water path, and precipitation statistics remarkably well. Considering the simulations of all 13 cruises, the LES skillfully simulates the mean diurnal variation of key measured quantities, including liquid water path (LWP), cloud fraction, measures of decoupling, and cloud radar-derived precipitation. The daily mean quantities are well represented, and daily mean LWP and cloud fraction show the expected correlation with estimated inversion strength. There is a -0.6 K low bias in LES near-surface air temperature that results in a high bias of 5.6 W m-2 in sensible heat flux (SHF). Overall, these results build confidence in the ability of LES to represent the northeast Pacific stratocumulus to trade cumulus transition region.
Ionisation and discharge in cloud-forming atmospheres of brown dwarfs and extrasolar planets
NASA Astrophysics Data System (ADS)
Helling, Ch; Rimmer, P. B.; Rodriguez-Barrera, I. M.; Wood, Kenneth; Robertson, G. B.; Stark, C. R.
2016-07-01
Brown dwarfs and giant gas extrasolar planets have cold atmospheres with rich chemical compositions from which mineral cloud particles form. Their properties, like particle sizes and material composition, vary with height, and the mineral cloud particles are charged due to triboelectric processes in such dynamic atmospheres. The dynamics of the atmospheric gas is driven by the irradiating host star and/or by the rotation of the objects, which changes during their lifetime. Thermal gas ionisation in these ultra-cool but dense atmospheres allows electrostatic interactions and magnetic coupling of a substantial atmosphere volume. Combined with a strong magnetic field (≫ B_Earth), a chromosphere and aurorae might form, as suggested by radio and x-ray observations of brown dwarfs. Non-equilibrium processes like cosmic ray ionisation and discharge processes in clouds will increase the local pool of free electrons in the gas. Cosmic rays and lightning discharges also alter the composition of the local atmospheric gas such that tracer molecules might be identified. Cosmic rays affect the atmosphere through air showers in a certain volume, which was modelled with a 3D Monte Carlo radiative transfer code to be able to visualise their spatial extent. Given a certain degree of thermal ionisation of the atmospheric gas, we suggest that electron attachment to charge mineral cloud particles is too inefficient to cause an electrostatic disruption of the cloud particles. Cloud particles will therefore not be destroyed by Coulomb explosion for the local temperature in the collisionally dominated brown dwarf and giant gas planet atmospheres. However, the cloud particles are destroyed electrostatically in regions with strong gas ionisation. The potential size of such cloud holes would, however, be too small, and they might occur too far inside the cloud to mimic the effect of, e.g., magnetic-field-induced star spots.
Electron Cloud Effects in Accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furman, M.A.
Abstract We present a brief summary of various aspects of the electron-cloud effect (ECE) in accelerators. For further details, the reader is encouraged to refer to the proceedings of many prior workshops, either dedicated to EC or with significant EC contents, including the entire "ECLOUD" series [1-22]. In addition, the proceedings of the various flavors of Particle Accelerator Conferences [23] contain a large number of EC-related publications. The ICFA Beam Dynamics Newsletter series [24] contains one dedicated issue, and several occasional articles, on EC. An extensive reference database is the LHC website on EC [25].
TRANSPORT EQUATION OF A PLASMA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balescu, R.
1960-10-01
It is shown that the many-body problem in plasmas can be handled explicitly. An equation describing the collective effects of the problem is derived. For simplicity, a one-component gas is considered in a continuous neutralizing background. The tool for handling the problem is provided by the general theory of irreversible processes in gases. The equation derived describes the interaction of electrons which are "dressed" by a polarization cloud. The polarization cloud differs from the Debye cloud. (B.O.G.)
Yu, Hua-Gen
2008-05-21
A spherical electron cloud hopping (SECH) model is proposed to study the product branching ratios of dissociative recombination (DR) of polyatomic systems. In this model, the fast electron-capture process is treated as an instantaneous hopping of a cloud of uniform spherical fractional point charges onto a target M^(+q) ion (or molecule). The sum of the point charges (-1) simulates the incident electron. The sphere radius is determined by a critical distance (R_c^(eM)) between the incoming electron (e^-) and the target, at which the potential energy of the e^(-)-M^(+q) system is equal to that of the electron-captured molecule M^(+(q-1)) in a symmetry-allowed electronic state with the same structure as M^(+q). During the hopping procedure, the excess energies of the electron association reaction are dispersed into the kinetic energies of the M^(+(q-1)) atoms to conserve total energy. The kinetic energies are adjusted by linearly adding atomic momenta in the direction of the driving forces induced by the scattering electron. The nuclear dynamics of the resultant M^(+(q-1)) molecule are studied by using a direct ab initio dynamics method on the adiabatic potential energy surface of M^(+(q-1)), or together with extra adiabatic surface(s) of M^(+(q-1)). For the latter case, the "fewest switches" surface hopping algorithm of Tully was adapted to deal with the nonadiabaticity in trajectory propagations. The SECH model has been applied to study the DR of both CH+ and H3O+(H2O)2. The theoretical results are consistent with experiment. It was found that water molecules play an important role in determining the product branching ratios of the molecular cluster ion.
Accuracy Analysis of a Dam Model from Drone Surveys.
Ridolfi, Elena; Buffi, Giulia; Venturi, Sara; Manciola, Piergiorgio
2017-08-03
This paper investigates the accuracy of models obtained by drone surveys. To this end, this work analyzes how the placement of ground control points (GCPs) used to georeference the dense point cloud of a dam affects the resulting three-dimensional (3D) model. Images of a double arch masonry dam upstream face are acquired from drone survey and used to build the 3D model of the dam for vulnerability analysis purposes. However, there still remained the issue of understanding the real impact of a correct GCPs location choice to properly georeference the images and thus, the model. To this end, a high number of GCPs configurations were investigated, building a series of dense point clouds. The accuracy of these resulting dense clouds was estimated comparing the coordinates of check points extracted from the model and their true coordinates measured via traditional topography. The paper aims at providing information about the optimal choice of GCPs placement not only for dams but also for all surveys of high-rise structures. The knowledge a priori of the effect of the GCPs number and location on the model accuracy can increase survey reliability and accuracy and speed up the survey set-up operations.
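The accuracy assessment described above boils down to comparing model-derived check-point coordinates against their surveyed positions. A minimal sketch of that RMSE computation, using hypothetical coordinates rather than the paper's data:

```python
import numpy as np

def check_point_rmse(model_xyz, survey_xyz):
    """Per-axis and total 3D RMSE between model-derived and surveyed check points."""
    d = np.asarray(model_xyz, float) - np.asarray(survey_xyz, float)
    rmse_axes = np.sqrt((d ** 2).mean(axis=0))      # RMSE in x, y, z separately
    rmse_3d = np.sqrt((d ** 2).sum(axis=1).mean())  # total 3D RMSE
    return rmse_axes, rmse_3d

# Hypothetical check points (metres): model coordinates vs. total-station survey
model  = [[10.02, 5.01, 2.03], [20.00, 4.98, 2.10], [30.05, 5.03, 1.97]]
survey = [[10.00, 5.00, 2.00], [20.03, 5.00, 2.05], [30.00, 5.00, 2.00]]
axes, total = check_point_rmse(model, survey)
```

Repeating this over many GCP configurations, as the paper does, shows how the choice of control points propagates into model accuracy.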
Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud
Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew
2015-01-01
Background Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. Results We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. Conclusions This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. 
We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation. PMID:26501966
Characterizing Subpixel Spatial Resolution of a Hybrid CMOS Detector
NASA Astrophysics Data System (ADS)
Bray, Evan; Burrows, Dave; Chattopadhyay, Tanmoy; Falcone, Abraham; Hull, Samuel; Kern, Matthew; McQuaide, Maria; Wages, Mitchell
2018-01-01
The detection of X-rays is a unique process relative to other wavelengths, and allows for some novel features that increase the scientific yield of a single observation. Unlike lower photon energies, X-rays liberate a large number of electrons from the silicon absorber array of the detector. This number is usually on the order of several hundred to a thousand for moderate-energy X-rays. These electrons tend to diffuse outward into what is referred to as the charge cloud. This cloud can then be picked up by several pixels, forming a specific pattern based on the exact incident location. By conducting the first ever "mesh experiment" on a hybrid CMOS detector (HCD), we have experimentally determined the charge cloud shape and used it to characterize the responsivity of the detector with subpixel spatial resolution.
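The pixel pattern formed by a diffusing charge cloud can be sketched by modelling the cloud as a 2D Gaussian and integrating it over the pixel grid, with the error function giving the fraction collected per pixel. The pixel pitch and cloud width below are hypothetical illustration values, not the measured HCD parameters:

```python
import math

def pixel_charge_fractions(x0, y0, sigma, pitch, n=3):
    """Fraction of a Gaussian charge cloud centred at (x0, y0) collected by
    each pixel of an n x n neighbourhood (all lengths in the same units)."""
    def cdf(a):  # 1D Gaussian cumulative distribution, mean 0, width sigma
        return 0.5 * (1.0 + math.erf(a / (sigma * math.sqrt(2.0))))
    def frac_1d(c, lo, hi):  # charge fraction falling between lo and hi
        return cdf(hi - c) - cdf(lo - c)
    half = n // 2
    grid = []
    for j in range(-half, half + 1):          # rows of pixels
        row = []
        for i in range(-half, half + 1):      # columns of pixels
            fx = frac_1d(x0, (i - 0.5) * pitch, (i + 0.5) * pitch)
            fy = frac_1d(y0, (j - 0.5) * pitch, (j + 0.5) * pitch)
            row.append(fx * fy)               # separable 2D Gaussian
        grid.append(row)
    return grid

# Hypothetical: 18 um pixel pitch, 6 um cloud, photon hitting a pixel corner
g = pixel_charge_fractions(9.0, 9.0, 6.0, 18.0)
```

A corner hit splits the charge nearly equally among four pixels, while a centre hit concentrates it in one; inverting this pattern is what yields subpixel positions.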
Theory of plasma contactors in ground-based experiments and low Earth orbit
NASA Technical Reports Server (NTRS)
Gerver, M. J.; Hastings, Daniel E.; Oberhardt, M. R.
1990-01-01
Previous theoretical work on plasma contactors as current collectors has fallen into two categories: collisionless double layer theory (describing space-charge-limited contactor clouds) and collisional quasineutral theory. Ground-based experiments at low current are well explained by double layer theory, but this theory does not scale well to power generation by electrodynamic tethers in space, since very high anode potentials are needed to draw a substantial ambient electron current across the magnetic field in the absence of collisions (or effective collisions due to turbulence). Isotropic quasineutral models of contactor clouds, extending over a region where the effective collision frequency ν_e exceeds the electron cyclotron frequency ω_ce, have low anode potentials, but would collect very little ambient electron current, much less than the emitted ion current. A new model is presented, for an anisotropic contactor cloud oriented along the magnetic field, with ν_e less than ω_ce. The electron motion along the magnetic field is nearly collisionless, forming double layers in that direction, while across the magnetic field the electrons diffuse collisionally and the potential profile is determined by quasineutrality. Using a simplified expression for ν_e due to ion acoustic turbulence, an analytic solution has been found for this model, which should be applicable to current collection in space. The anode potential is low and the collected ambient electron current can be several times the emitted ion current.
Federated and Cloud Enabled Resources for Data Management and Utilization
NASA Astrophysics Data System (ADS)
Rankin, R.; Gordon, M.; Potter, R. G.; Satchwill, B.
2011-12-01
The emergence of cloud computing over the past three years has led to a paradigm shift in how data can be managed, processed and made accessible. Building on the federated data management system offered through the Canadian Space Science Data Portal (www.cssdp.ca), we demonstrate how heterogeneous and geographically distributed data sets and modeling tools have been integrated to form a virtual data center and computational modeling platform that has services for data processing and visualization embedded within it. We also discuss positive and negative experiences in utilizing Eucalyptus and OpenStack cloud applications, and job scheduling facilitated by Condor and Star Cluster. We summarize our findings by demonstrating use of these technologies in the Cloud Enabled Space Weather Data Assimilation and Modeling Platform CESWP (www.ceswp.ca), which is funded through Canarie's (canarie.ca) Network Enabled Platforms program in Canada.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garzoglio, Gabriele
The Fermilab Grid and Cloud Computing Department and the KISTI Global Science experimental Data hub Center are working on a multi-year Collaborative Research and Development Agreement, building on the knowledge developed in the first year on how to provision and manage a federation of virtual machines through cloud management systems. In this second year, we expanded the work on provisioning and federation, increasing both the scale and the diversity of solutions, and we started to build on-demand services on the established fabric, introducing the Platform-as-a-Service paradigm to assist with the execution of scientific workflows. We have enabled scientific workflows of stakeholders to run on multiple cloud resources at the scale of 1,000 concurrent machines. The demonstrations have been in the areas of (a) Virtual Infrastructure Automation and Provisioning, (b) Interoperability and Federation of Cloud Resources, and (c) On-demand Services for Scientific Workflows.
Probabilistic Feasibility of the Reconstruction Process of Russian-Orthodox Churches
NASA Astrophysics Data System (ADS)
Chizhova, M.; Brunn, A.; Stilla, U.
2016-06-01
The cultural heritage of humanity is important for the identity of following generations and has to be preserved in a suitable manner. In the course of time, a lot of information about former cultural constructions has been lost, because some objects were strongly damaged by natural erosion or human activity, or were even destroyed. It is important to capture the still-available parts of former buildings, mostly ruins. These data could be the basis for a virtual reconstruction. Laser scanning in principle offers the possibility of extensively capturing building surfaces in their actual state. In this paper we assume a priori given 3D laser scanner data, a 3D point cloud of the partly destroyed church. There are many well-known algorithms that describe different methods for the extraction and detection of geometric primitives, which are recognized separately in 3D point clouds. In our work we put them in a common probabilistic framework, which guides the complete reconstruction process of complex buildings, in our case Russian-Orthodox churches. Churches are modeled with their functional volumetric components, enriched with a priori known probabilities, which are deduced from a database of Russian-Orthodox churches. Each set of components represents a complete church. The power of the new method is shown for a simulated dataset of 100 Russian-Orthodox churches.
Building Damage Extraction Triggered by Earthquake Using the Uav Imagery
NASA Astrophysics Data System (ADS)
Li, S.; Tang, H.
2018-04-01
When extracting building damage information from post-earthquake satellite images alone, we can only determine whether a building has collapsed. Even though satellite images have sub-meter resolution, the identification of slightly damaged buildings is still a challenge. As complementary data to satellite images, UAV images have unique advantages, such as stronger flexibility and higher resolution. In this paper, according to the spectral features of UAV images and the morphological features of the reconstructed point clouds, building damage is classified into four levels: basically intact buildings, slightly damaged buildings, partially collapsed buildings, and totally collapsed buildings, and rules for the damage grades are given. In particular, slightly damaged buildings are identified using detected roof holes. In order to verify the approach, we conduct experimental simulations for the Wenchuan and Ya'an earthquakes. By analyzing the post-earthquake UAV images of the two earthquakes, the building damage was classified into the four levels, and quantitative statistics of the damaged buildings are given in the experiments.
The EOS CERES Global Cloud Mask
NASA Technical Reports Server (NTRS)
Berendes, T. A.; Welch, R. M.; Trepte, Q.; Schaaf, C.; Baum, B. A.
1996-01-01
To detect long-term climate trends, it is essential to produce long-term and consistent data sets from a variety of different satellite platforms. With current global cloud climatology data sets, such as the International Satellite Cloud Climatology Project (ISCCP) or CLAVR (Clouds from Advanced Very High Resolution Radiometer), one of the first processing steps is to determine whether an imager pixel is obstructed between the satellite and the surface, i.e., to determine a cloud 'mask.' A cloud mask is essential to studies monitoring changes over ocean, land, or snow-covered surfaces. As part of the Earth Observing System (EOS) program, a series of platforms will be flown beginning in 1997 with the Tropical Rainfall Measurement Mission (TRMM) and subsequently the EOS-AM and EOS-PM platforms in following years. The cloud imager on TRMM is the Visible/Infrared Sensor (VIRS), while the Moderate Resolution Imaging Spectroradiometer (MODIS) is the imager on the EOS platforms. To be useful for long-term studies, a cloud masking algorithm should produce consistent results between existing (AVHRR) data and future VIRS and MODIS data. The present work outlines both existing and proposed approaches to detecting clouds using multispectral narrowband radiance data. Clouds generally are characterized by higher albedos and lower temperatures than the underlying surface. However, there are numerous conditions under which this characterization is inappropriate, most notably over snow and ice. Of the cloud types, cirrus, stratocumulus, and cumulus are the most difficult to detect. Other problems arise when analyzing data from sun-glint areas over oceans or lakes, over deserts, or over regions containing numerous fires and smoke. The cloud mask effort builds upon the operational experience of several groups, which will now be discussed.
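The brighter-and-colder characterization above can be sketched as a naive two-test threshold mask. Thresholds and pixel values are hypothetical; the third pixel (bright but warm, e.g. desert or sun glint) and fourth pixel (cold but dark) show why operational masks need many more tests:

```python
import numpy as np

def simple_cloud_mask(albedo, bt_k, albedo_thr=0.3, bt_thr=285.0):
    """Naive two-test cloud mask: flag pixels that are both brighter and
    colder than an assumed clear-sky background (thresholds hypothetical)."""
    return (np.asarray(albedo) > albedo_thr) & (np.asarray(bt_k) < bt_thr)

albedo = np.array([0.10, 0.45, 0.50, 0.08])      # visible reflectance
bt     = np.array([295.0, 270.0, 290.0, 260.0])  # 11-um brightness temperature (K)
mask = simple_cloud_mask(albedo, bt)             # only pixel 1 passes both tests
```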
High-mobility strained organic semiconductors (Conference Presentation)
NASA Astrophysics Data System (ADS)
Takeya, Jun; Matsui, H.; Kubo, T.; Hausermann, Roger
2016-11-01
Small-molecule organic semiconductor crystals form interesting electronic systems of periodically arranged "charge clouds" whose mutual electronic coupling determines whether or not electronic states can be coherent over fluctuating molecules. This presentation focuses on two methods to reduce molecular fluctuation, which strongly restricts the mobility of highly mobile charge in single-crystal organic transistors. The first is to apply external hydrostatic pressure. Using Hall-effect measurements on pentacene FETs, which tell us the extent of the electronic coherence, we found a crossover from hopping-like transport of nearly localized charge to band transport of delocalized charge with full coherence. Temperature-dependent measurements showed that reduced molecular fluctuation is mainly responsible for the crossover. The second is to apply uniaxial strain to single-crystal organic FETs. We applied strain by bending thin films of newly synthesized decyldinaphthobenzodithiophene (C10-DNBDT) on a plastic substrate so that 3% strain is uniaxially applied. As a result, the room-temperature mobility increased by a factor of 1.7. In-depth analysis using X-ray diffraction (XRD) measurements and density functional theory (DFT) calculations reveals the origin to be the suppression of the thermal fluctuation of the individual molecules, which is confirmed by temperature-dependent measurements. Our findings show that compressing the crystal structure directly restricts the vibration of the molecules, thus suppressing dynamic disorder, a unique mechanism in organic semiconductors. Since strain can easily be induced during the fabrication process, these findings can directly be exploited to build high-performance organic devices.
Data provenance assurance in the cloud using blockchain
NASA Astrophysics Data System (ADS)
Shetty, Sachin; Red, Val; Kamhoua, Charles; Kwiat, Kevin; Njilla, Laurent
2017-05-01
Ever-increasing adoption of cloud technology scales up activities like the creation, exchange, and alteration of cloud data objects, which creates challenges in tracking malicious activities and security violations. Addressing this issue requires the implementation of a data provenance framework so that each data object in the federated cloud environment can be tracked and recorded but cannot be modified. The blockchain technology provides a promising decentralized platform to build tamper-proof systems. Its incorruptible distributed ledger complements the need for maintaining cloud data provenance. In this paper, we present a cloud-based data provenance framework using blockchain, which traces data record operations and generates provenance data. We anchor provenance data records into blockchain transactions, which provide validation of the provenance data and preserve user privacy at the same time. Once the provenance data is uploaded to the global blockchain network, it is extremely challenging to tamper with it. Besides, the provenance data uses hashed user identifiers prior to uploading, so the blockchain nodes cannot link the operations to a particular user. The framework thus ensures that privacy is preserved. We implemented the architecture on ownCloud, uploaded records to the blockchain network, stored records in a provenance database, and developed a prototype in the form of a web service.
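A minimal sketch of the two ideas above: user identifiers are hashed before upload, and each provenance record is chained to its predecessor by hash, so later tampering changes every subsequent digest. The field names and salting scheme are illustrative assumptions, not the paper's ownCloud implementation:

```python
import hashlib
import json

def hashed_user(user_id, salt="site-salt"):
    """Pseudonymise the user before the record leaves the cloud provider."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()

def make_record(prev_hash, user_id, operation, object_id, ts=0):
    """A provenance entry chained to its predecessor, as a blockchain
    transaction would anchor it."""
    body = {
        "prev": prev_hash,
        "user": hashed_user(user_id),   # blockchain nodes never see the raw id
        "op": operation,                # e.g. create / read / update / delete
        "object": object_id,
        "ts": ts,
    }
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body, digest

genesis = "0" * 64
r1, h1 = make_record(genesis, "alice", "create", "doc-42", ts=1)
r2, h2 = make_record(h1, "bob", "update", "doc-42", ts=2)  # chained to r1
```

Verifying the chain means recomputing each digest; any altered field in r1 breaks the match with the `prev` stored in r2.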
NASA Technical Reports Server (NTRS)
Bartkus, Tadas P.; Struk, Peter M.; Tsao, Jen-Ching
2017-01-01
This paper builds on previous work that compares numerical simulations of mixed-phase icing clouds with experimental data. The model couples the thermal interaction between the ice particles and water droplets of the icing cloud with the flowing air of an icing wind tunnel for simulation of NASA Glenn Research Center's (GRC) Propulsion Systems Laboratory (PSL). Measurements were taken during the Fundamentals of Ice Crystal Icing Physics Tests at the PSL tunnel in March 2016. The tests simulated ice-crystal and mixed-phase icing conditions that relate to ice accretions within turbofan engines. Experimentally measured air temperature, humidity, total water content, liquid and ice water content, as well as cloud particle size, are compared with model predictions. The model showed good trend agreement with experimentally measured values, but often over-predicted aero-thermodynamic changes. This discrepancy is likely attributable to radial variations that this one-dimensional model does not address. One of the key findings of this work is that greater aero-thermodynamic changes occur when humidity conditions are low. In addition, a range of mixed-phase clouds can be achieved by varying only the tunnel humidity conditions, but the range of humidities that generate a mixed-phase cloud becomes smaller when clouds are composed of smaller particles. In general, the model predicted melt fraction well, in particular for clouds composed of larger particle sizes.
A Lab Based Method for Exoplanet Cloud and Aerosol Characterization
NASA Astrophysics Data System (ADS)
Johnson, A. V.; Schneiderman, T. M.; Bauer, A. J. R.; Cziczo, D. J.
2017-12-01
The atmospheres of some smaller, cooler exoplanets, like GJ 1214b, lack strong spectral features. This may suggest the presence of a high, optically thick cloud layer and poses great challenges for atmospheric characterization, but there is hope. The study of extraterrestrial atmospheres with terrestrial-based techniques has proven useful for understanding the cloud-laden atmospheres of our solar system. Here we build on this by leveraging laboratory-based, terrestrial cloud particle instrumentation to better understand the microphysical and radiative properties of proposed exoplanet cloud and aerosol particles. The work to be presented focuses on the scattering properties of single particles, which may be representative of those suspended in exoplanet atmospheres, levitated in an electrodynamic balance (EDB). I will discuss how we leverage terrestrial-based cloud microphysics for exoplanet applications, the instruments for single and ensemble particle studies used in this work, our investigation of ammonium nitrate (NH4NO3) scattering across temperature-dependent crystalline phase changes, and the steps we are taking toward the collection of scattering phase functions and polarization of scattered light for exoplanet cloud analogs. Through this and future studies we hope to better understand how upper-level cloud and/or aerosol particles in exoplanet atmospheres interact with incoming radiation from their host stars and what atmospheric information may still be obtainable through remote observations when no spectral features are observed.
NASA Technical Reports Server (NTRS)
Kidder, Stanley Q.; Hafner, Jan
1997-01-01
The goal of Project ATLANTA is to derive a better scientific understanding of how land cover changes associated with urbanization affect local and regional climate and air quality. Clouds play a significant role in this relationship. Using GOES images, we found that in a 63-day period (5 July-5 September 1996) there were zero days that were clear for the entire daylight period. Days that are cloud-free in the morning become partly cloudy with small cumulus clouds in the afternoon in response to solar heating. This result casts doubt on the applicability of California-style air quality models which run in perpetual clear skies. Days that are clear in the morning have higher ozone than those that are cloudy in the morning. Using the RAMS model, we found that urbanization increases the skin surface temperature by about 1.0-1.5 C on average under cloudy conditions, with an extreme of +3.5 C. Clouds cool the surface through their shading effect by 1.5-2.0 C on average, with an extreme of 5.0 C. RAMS simulates well the building stage of the cumulus cloud field, but does poorly in the decaying phase. Next year's work includes a detailed cloud climatology and the development of improved RAMS cloud simulations.
Georeferencing UAS Derivatives Through Point Cloud Registration with Archived Lidar Datasets
NASA Astrophysics Data System (ADS)
Magtalas, M. S. L. Y.; Aves, J. C. L.; Blanco, A. C.
2016-10-01
Georeferencing gathered images is a common step before performing spatial analysis and other processes on datasets acquired using unmanned aerial systems (UAS). Methods of applying spatial information to aerial images or their derivatives are through onboard GPS (Global Positioning System) geotagging, or through tying of models through GCPs (ground control points) acquired in the field. Currently, UAS derivatives are limited to meter levels of accuracy when their generation is unaided by points of known position on the ground. The use of ground control points established using survey-grade GPS or GNSS receivers can greatly reduce model errors to centimeter levels. However, this comes with additional costs, not only in instrument acquisition and survey operations, but also in actual time spent in the field. This study uses a workflow for cloud-based post-processing of UAS data in combination with already existing LiDAR data. The georeferencing of the UAV point cloud is executed using the iterative closest point (ICP) algorithm. It is applied through the open-source CloudCompare software (Girardeau-Montaut, 2006) on a "skeleton point cloud". This skeleton point cloud consists of manually extracted features consistent in both the LiDAR and UAV data. For this cloud, roads and buildings with minimal deviations, given their differing dates of acquisition, are considered consistent. Transformation parameters are computed for the skeleton cloud, which can then be applied to the whole UAS dataset. In addition, a separate cloud consisting of non-vegetation features automatically derived using the CANUPO classification algorithm (Brodu and Lague, 2012) was used to generate a separate set of parameters. A ground survey was done to validate the transformed cloud. An RMSE value of around 16 centimeters was found when comparing validation data to the models georeferenced using the CANUPO cloud and the manual skeleton cloud.
Cloud-to-cloud distance computations between the CANUPO and manual skeleton clouds yielded values of around 0.67 meters for both, at a standard deviation of 1.73.
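At the heart of the ICP-based georeferencing above is a rigid (rotation + translation) transform estimated from matched skeleton features and then applied to the whole UAS cloud. A sketch of that least-squares step (the Kabsch/SVD solution used inside each ICP iteration) on synthetic correspondences, not the CloudCompare implementation:

```python
import numpy as np

def rigid_transform(src, dst):
    """Best-fit rotation R and translation t mapping src -> dst (Kabsch/SVD),
    the least-squares step at the heart of each ICP iteration."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Synthetic "skeleton" correspondences: LiDAR features vs. the same features
# in a mis-registered UAS cloud (a known rotation and offset).
rng = np.random.default_rng(0)
lidar = rng.uniform(0, 100, (20, 3))
theta = np.deg2rad(10)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
uas = (lidar - [50, 50, 0]) @ R_true.T       # the un-georeferenced UAS skeleton
R, t = rigid_transform(uas, lidar)
georeferenced = uas @ R.T + t                # apply to the (whole) UAS cloud
```

Full ICP alternates this step with nearest-neighbour matching; with manually matched skeleton features, one solve already recovers the transform.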
Applicability Analysis of Cloth Simulation Filtering Algorithm for Mobile LIDAR Point Cloud
NASA Astrophysics Data System (ADS)
Cai, S.; Zhang, W.; Qi, J.; Wan, P.; Shao, J.; Shen, A.
2018-04-01
Classifying the original point clouds into ground and non-ground points is a key step in LiDAR (light detection and ranging) data post-processing. The cloth simulation filtering (CSF) algorithm, which is based on a physical process, has been validated to be an accurate, automatic, and easy-to-use algorithm for airborne LiDAR point clouds. As a new technique of three-dimensional data collection, mobile laser scanning (MLS) has been gradually applied in various fields, such as reconstruction of digital terrain models (DTM), 3D building modeling, and forest inventory and management. Compared with airborne LiDAR point clouds, mobile LiDAR point clouds have some different features (such as point density, distribution, and complexity). Some filtering algorithms for airborne LiDAR data have been directly applied to mobile LiDAR point clouds, but they did not give satisfactory results. In this paper, we explore the ability of the CSF algorithm on mobile LiDAR point clouds. Three samples with different terrain shapes are selected to test the performance of this algorithm, which respectively yields total errors of 0.44 %, 0.77 % and 1.20 %. Additionally, a large-area dataset is also tested to further validate the effectiveness of this algorithm, and results show that it can quickly and accurately separate point clouds into ground and non-ground points. In summary, this algorithm is efficient and reliable for mobile LiDAR point clouds.
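The CSF idea can be sketched in a toy 1D form: invert the heights, let a "cloth" of connected nodes settle under gravity onto the inverted surface (its internal stiffness keeps it from sinking into inverted vegetation), and label points close to the settled cloth as ground. All parameters and the test profile below are hypothetical; the published algorithm simulates a full 3D cloth grid:

```python
import numpy as np

def csf_toy(x, z, cell=1.0, stiffness=0.4, iters=200, threshold=0.3):
    """Toy 1D cloth-simulation filter: returns a boolean ground mask."""
    x, zi = np.asarray(x, float), -np.asarray(z, float)  # invert the cloud
    xg = np.arange(x.min(), x.max() + cell, cell)        # cloth node x positions
    terrain = np.full(xg.size, -np.inf)                  # per-node collision height
    idx = np.clip(np.round((x - xg[0]) / cell).astype(int), 0, xg.size - 1)
    np.maximum.at(terrain, idx, zi)                      # highest inverted point per cell
    terrain[~np.isfinite(terrain)] = zi.min()            # empty cells: no constraint
    cloth = np.full(xg.size, zi.max() + 1.0)             # cloth starts above everything
    for _ in range(iters):
        cloth = np.maximum(cloth - 0.1, terrain)         # gravity, blocked by collisions
        avg = cloth.copy()
        avg[1:-1] = 0.5 * (cloth[:-2] + cloth[2:])       # neighbour average
        cloth = np.maximum(cloth + stiffness * (avg - cloth), terrain)  # stiffness
    # points near the settled cloth are ground
    return np.abs(zi - np.interp(x, xg, cloth)) < threshold

# Flat ground at z = 0 with a "tree" (points at z = 5 and 6) near x = 5
x = np.array([0., 1, 2, 3, 4, 5, 5, 6, 7, 8, 9, 10])
z = np.array([0., 0, 0, 0, 0, 5, 6, 0, 0, 0, 0, 0])
ground = csf_toy(x, z)
```

The stiffness term is what makes the cloth bridge over the inverted tree rather than drape into it, which is why the tree points end up far from the cloth and are classified as non-ground.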
First Steps to Automated Interior Reconstruction from Semantically Enriched Point Clouds and Imagery
NASA Astrophysics Data System (ADS)
Obrock, L. S.; Gülch, E.
2018-05-01
The automated generation of a BIM model from sensor data is a huge challenge for the modeling of existing buildings. Currently the measurements and analyses are time consuming, allow little automation, and require expensive equipment; an automated acquisition of the semantic information of objects in a building is lacking. We present first results of our approach, based on imagery and derived products, aiming at a more automated modeling of interiors for a BIM building model. We examine the building parts and objects visible in the collected images using deep learning methods based on convolutional neural networks. For localization and classification of building parts we apply the FCN8s model for pixel-wise semantic segmentation. So far, we reach a pixel accuracy of 77.2 % and a mean intersection over union of 44.2 %. We then use the network for further reasoning on the images of the interior room. We combine the segmented images with the original images and use photogrammetric methods to produce a three-dimensional point cloud, coding the extracted object types as colours of the 3D points. We are thus able to uniquely classify the points in three-dimensional space. As a preliminary step, we investigate a simple extraction method for the colour and material of building parts. It is shown that the combined images are very well suited to extracting further semantic information for the BIM model. With the presented methods we see a sound basis for further automation of the acquisition and modeling of semantic and geometric information of interior rooms for a BIM model.
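The two scores quoted above, pixel accuracy and mean intersection over union, are both computed from a per-class confusion matrix between predicted and ground-truth label maps. A small sketch of that computation (the tiny label maps here are illustrative, not the paper's data):

```python
import numpy as np

def pixel_acc_and_miou(pred, truth, n_classes):
    # Confusion matrix: rows are ground-truth labels, columns predictions.
    cm = np.zeros((n_classes, n_classes), dtype=np.int64)
    np.add.at(cm, (truth.ravel(), pred.ravel()), 1)
    acc = np.trace(cm) / cm.sum()                   # pixel accuracy
    inter = np.diag(cm)                             # per-class intersection
    union = cm.sum(axis=0) + cm.sum(axis=1) - inter
    iou = inter / np.maximum(union, 1)
    return acc, iou[union > 0].mean()               # mean IoU over classes present

# Tiny 2x2 example with two classes: one of four pixels is mislabelled.
truth = np.array([[0, 0], [1, 1]])
pred = np.array([[0, 1], [1, 1]])
acc, miou = pixel_acc_and_miou(pred, truth, n_classes=2)
```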
Secure Skyline Queries on Cloud Platform.
Liu, Jinfei; Yang, Juncheng; Xiong, Li; Pei, Jian
2017-04-01
Outsourcing data and computation to a cloud server provides a cost-effective way to support large scale data storage and query processing. However, due to security and privacy concerns, sensitive data (e.g., medical records) need to be protected from the cloud server and other unauthorized users. One approach is to outsource encrypted data to the cloud server and have the cloud server perform query processing on the encrypted data only. It remains a challenging task to support various queries over encrypted data in a secure and efficient way such that the cloud server does not gain any knowledge about the data, query, and query result. In this paper, we study the problem of secure skyline queries over encrypted data. The skyline query is particularly important for multi-criteria decision making but also presents significant challenges due to its complex computations. We propose a fully secure skyline query protocol on data encrypted using semantically-secure encryption. As a key subroutine, we present a new secure dominance protocol, which can also be used as a building block for other queries. Finally, we provide both serial and parallelized implementations and empirically study the protocols in terms of efficiency and scalability under different parameter settings, verifying the feasibility of our proposed solutions.
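For readers unfamiliar with the query itself: a skyline keeps every point not dominated by any other. The sketch below shows only this plaintext semantics of dominance and skyline; the paper's contribution is running the dominance test as a secure two-party protocol over semantically-secure ciphertexts, which is not reproduced here.

```python
def dominates(p, q):
    # p dominates q when p is at least as good in every attribute and
    # strictly better in at least one (here, smaller values are better).
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    # Keep every point that no other point dominates (O(n^2) scan).
    return [p for p in points if not any(dominates(q, p) for q in points)]

# E.g. records scored on (cost, risk): lower is better on both axes.
records = [(1, 9), (2, 8), (4, 4), (9, 1), (5, 6), (3, 9)]
result = skyline(records)
```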
A lineage CLOUD for neoblasts.
Tran, Thao Anh; Gentile, Luca
2018-05-10
In planarians, pluripotency can be studied in vivo in the adult animal, making these animals a unique model system where pluripotency-based regeneration (PBR)-and its therapeutic potential-can be investigated. This review focuses on recent findings to build a cloud model of fate restriction likelihood for planarian stem and progenitor cells. Recently, a computational approach based on functional and molecular profiling at the single cell level was proposed for human hematopoietic stem cells. Based on data generated both in vivo and ex vivo, we hypothesized that planarian stem cells could acquire lineage biases in multiple directions, following a "badlands" landscape. Instead of a discrete tree-like hierarchy, where the potency of stem/progenitor cells reduces stepwise, we propose a Continuum of LOw-primed UnDifferentiated Planarian Stem/Progenitor Cells (CLOUD-PSPCs). Every subclass of neoblast/progenitor cells is a cloud of likelihood, as the single cell transcriptomics data indicate. The CLOUD-HSPCs concept was substantiated by in vitro data from cell culture; therefore, to confirm the CLOUD-PSPCs model, the planarian community needs to develop new tools, like live cell tracking. Future studies will allow a deeper understanding of PBR in planarians, and the possible implications for regenerative therapies in humans.
Developing cloud applications using the e-Science Central platform.
Hiden, Hugo; Woodman, Simon; Watson, Paul; Cala, Jacek
2013-01-28
This paper describes the e-Science Central (e-SC) cloud data processing system and its application to a number of e-Science projects. e-SC provides both software as a service (SaaS) and platform as a service for scientific data management, analysis and collaboration. It is a portable system and can be deployed on both private (e.g. Eucalyptus) and public clouds (Amazon AWS and Microsoft Windows Azure). The SaaS application allows scientists to upload data, edit and run workflows and share results in the cloud, using only a Web browser. It is underpinned by a scalable cloud platform consisting of a set of components designed to support the needs of scientists. The platform is exposed to developers so that they can easily upload their own analysis services into the system and make these available to other users. A representational state transfer-based application programming interface (API) is also provided so that external applications can leverage the platform's functionality, making it easier to build scalable, secure cloud-based applications. This paper describes the design of e-SC, its API and its use in three different case studies: spectral data visualization, medical data capture and analysis, and chemical property prediction.
75 FR 1339 - Information Systems Technical Advisory Committee; Notice of Partially Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-11
...), Building 33, Cloud Room, 53560 Hull Street, San Diego, California 92152. The Committee advises the Office... Session 1. Welcome and Introduction. 2. Working Groups Reports. 3. Industry Presentations. 4. New Business...
High-performance scientific computing in the cloud
NASA Astrophysics Data System (ADS)
Jorissen, Kevin; Vila, Fernando; Rehr, John
2011-03-01
Cloud computing has the potential to open up high-performance computational science to a much broader class of researchers, owing to its ability to provide on-demand, virtualized computational resources. However, before such approaches can become commonplace, user-friendly tools must be developed that hide the unfamiliar cloud environment and streamline the management of cloud resources for many scientific applications. We have recently shown that high-performance cloud computing is feasible for parallelized x-ray spectroscopy calculations. We now present benchmark results for a wider selection of scientific applications focusing on electronic structure and spectroscopic simulation software in condensed matter physics. These applications are driven by an improved portable interface that can manage virtual clusters and run various applications in the cloud. We also describe a next generation of cluster tools, aimed at improved performance and a more robust cluster deployment. Supported by NSF grant OCI-1048052.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karthik, Rajasekar
2014-01-01
In this paper, an architecture for building a Scalable And Mobile Environment For High-Performance Computing with spatial capabilities, called SAME4HPC, is described using cutting-edge technologies and standards such as Node.js, HTML5, ECMAScript 6, and PostgreSQL 9.4. Mobile devices are increasingly becoming powerful enough to run high-performance apps. At the same time, there exist a significant number of low-end and older devices that rely heavily on the server or the cloud infrastructure to do the heavy lifting. Our architecture aims to support both of these types of devices to provide high performance and a rich user experience. A cloud infrastructure consisting of OpenStack with Ubuntu, GeoServer, and high-performance JavaScript frameworks are some of the key open-source and industry-standard components that have been adopted in this architecture.
The mated Pegasus XL rocket - AIM spacecraft leaves Building 165
2007-04-16
The mated Pegasus XL rocket - AIM spacecraft is moved onto a transporter in Building 1655 at Vandenberg Air Force Base in California. The launch vehicle will be transferred to a waiting Orbital Sciences Stargazer L-1011 aircraft for launch. AIM, which stands for Aeronomy of Ice in the Mesosphere, is being prepared for integrated testing and a flight simulation. The AIM spacecraft will fly three instruments designed to study polar mesospheric clouds located at the edge of space, 50 miles above the Earth's surface in the coldest part of the planet's atmosphere. The mission's primary goal is to explain why these clouds form and what has caused them to become brighter and more numerous and appear at lower latitudes in recent years. AIM's results will provide the basis for the study of long-term variability in the mesospheric climate and its relationship to global climate change. Launch is scheduled for April 25.
The mated Pegasus XL rocket - AIM spacecraft leaves Building 165
2007-04-16
The mated Pegasus XL rocket - AIM spacecraft leaves Building 1655 at Vandenberg Air Force Base in California. The rocket will be transferred to a waiting Orbital Sciences Stargazer L-1011 aircraft for launch. AIM, which stands for Aeronomy of Ice in the Mesosphere, is being prepared for integrated testing and a flight simulation. The AIM spacecraft will fly three instruments designed to study polar mesospheric clouds located at the edge of space, 50 miles above the Earth's surface in the coldest part of the planet's atmosphere. The mission's primary goal is to explain why these clouds form and what has caused them to become brighter and more numerous and appear at lower latitudes in recent years. AIM's results will provide the basis for the study of long-term variability in the mesospheric climate and its relationship to global climate change. Launch is scheduled for April 25.
Coupled fvGCM-GCE Modeling System, 3D Cloud-Resolving Model and Cloud Library
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo
2005-01-01
Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. It requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF) to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud related datasets can provide initial conditions as well as validation for both the MMF and CRMs. A seed fund is available at NASA Goddard to build a MMF based on the 2D Goddard Cumulus Ensemble (GCE) model and the Goddard finite volume general circulation model (fvGCM). A prototype MMF is being developed, and production runs will be conducted at the beginning of 2005. In this talk, I will present: (1) A brief review of the GCE model and its applications to precipitation processes, (2) The Goddard MMF and the major differences between the two existing MMFs (CSU MMF and Goddard MMF), (3) A cloud library generated by the Goddard MMF and the 3D GCE model, and (4) A brief discussion of developing a global cloud simulator with the GCE model.
NASA Technical Reports Server (NTRS)
Winker, David M.
1999-01-01
Current uncertainties in the effects of clouds and aerosols on the Earth radiation budget limit our understanding of the climate system and the potential for global climate change. Pathfinder Instruments for Cloud and Aerosol Spaceborne Observations - Climatologie Etendue des Nuages et des Aerosols (PICASSO-CENA) is a recently approved satellite mission within NASA's Earth System Science Pathfinder (ESSP) program which will address these uncertainties with a unique suite of active and passive instruments. The Lidar In-space Technology Experiment (LITE) demonstrated the potential benefits of space lidar for studies of clouds and aerosols. PICASSO-CENA builds on this experience with a payload consisting of a two-wavelength polarization-sensitive lidar, an oxygen A-band spectrometer (ABS), an imaging infrared radiometer (IIR), and a wide field camera (WFC). Data from these instruments will be used to measure the vertical distributions of aerosols and clouds in the atmosphere, as well as optical and physical properties of aerosols and clouds which influence the Earth radiation budget. PICASSO-CENA will be flown in formation with the PM satellite of the NASA Earth Observing System (EOS) to provide a comprehensive suite of coincident measurements of atmospheric state, aerosol and cloud optical properties, and radiative fluxes. The mission will address critical uncertainties in the direct radiative forcing of aerosols and clouds as well as aerosol influences on cloud radiative properties and cloud-climate radiation feedbacks. PICASSO-CENA is planned for a three-year mission, with a launch in early 2003. PICASSO-CENA is being developed within the framework of a collaboration between NASA and CNES.
Numerical simulation of the effects of radially injected barium plasma in the ionosphere
NASA Technical Reports Server (NTRS)
Swift, D. W.
1985-01-01
The morphology of the ion cloud in the radial shaped-charge barium injection was studied. The shape of the ion cloud that remains after the explosive products and neutral barium clear away was examined. The ion cloud has the configuration of a rimless wagon wheel: its major features are the 2.5 km radius black hole in the center of the cloud, the surrounding ring of barium ions, and the spokes of barium ionization radiating away from the center. The cloud shows no evolution after it emerges from the neutral debris, and it is concluded that it is formed within 5 seconds of the event. A numerical model is used to calculate the motion of ions and electrons subject to the electrostatic and Lorentz forces.
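Computing the motion of charged particles under electrostatic and Lorentz forces is the core of such a numerical model. The sketch below uses the standard Boris integrator with generic normalized values; it illustrates the kind of particle push involved, not the paper's specific model or field configuration.

```python
import numpy as np

def boris_push(x, v, q_m, E, B, dt, steps):
    # Standard Boris scheme: half electric kick, magnetic rotation,
    # half electric kick, then a position drift. q_m is charge/mass.
    for _ in range(steps):
        v_minus = v + 0.5 * q_m * E * dt
        tvec = 0.5 * q_m * B * dt
        svec = 2.0 * tvec / (1.0 + tvec @ tvec)
        v_prime = v_minus + np.cross(v_minus, tvec)
        v = v_minus + np.cross(v_prime, svec) + 0.5 * q_m * E * dt
        x = x + v * dt
    return x, v

# An ion gyrating in a uniform magnetic field (normalized, illustrative values).
x0 = np.zeros(3)
v0 = np.array([1.0, 0.0, 0.0])
E = np.zeros(3)
B = np.array([0.0, 0.0, 1.0])
x1, v1 = boris_push(x0, v0, q_m=1.0, E=E, B=B, dt=0.01, steps=100)
```

A useful property of the Boris rotation is that with E = 0 it conserves the particle's speed exactly, which is why it is the workhorse for this kind of simulation.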
Formation of a knudsen layer in electronically induced desorption
NASA Astrophysics Data System (ADS)
Sibold, D.; Urbassek, H. M.
1992-10-01
For intense desorption fluxes, particles desorbed by electronic transitions (DIET) from a surface into a vacuum may thermalize in the gas cloud forming above the surface. In the immediate vicinity of the surface, however, a non-equilibrium layer (the Knudsen layer) exists which separates the recently desorbed, non-thermal particles from the thermalized gas cloud. We investigate by Monte Carlo computer simulation the time it takes to form a Knudsen layer, and its properties. It is found that a Knudsen layer, and thus also a thermalized gas cloud, is formed after around 200 mean free flight times of the desorbing particles, corresponding to the desorption of 20 monolayers. At the end of the Knudsen layer, the gas density is higher, and the flow velocity and temperature smaller, than literature values indicate for thermal desorption. These data are of fundamental interest for the modeling of gas-kinetic and gas-dynamic effects in DIET.
2001-06-06
X-rays diffracted from a well-ordered protein crystal create sharp patterns of scattered light on film. A computer can use these patterns to generate a model of a protein molecule. To analyze the selected crystal, an X-ray crystallographer shines X-rays through the crystal. Unlike a single dental X-ray, which produces a shadow image of a tooth, these X-rays have to be taken many times from different angles to produce a pattern from the scattered light, a map of the intensity of the X-rays after they diffract through the crystal. The X-rays bounce off the electron clouds that form the outer structure of each atom. A flawed crystal will yield a blurry pattern; a well-ordered protein crystal yields a series of sharp diffraction patterns. From these patterns, researchers build an electron density map. With powerful computers and a lot of calculations, scientists can use the electron density patterns to determine the structure of the protein and make a computer-generated model of the structure. The models let researchers improve their understanding of how the protein functions. They also allow scientists to look for receptor sites and active areas that control a protein's function and role in the progress of diseases. From there, pharmaceutical researchers can design molecules that fit the active site, much like a key and lock, so that the protein is locked without affecting the rest of the body. This is called structure-based drug design.
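The map-building step described above rests on a Fourier relationship: the diffraction pattern records structure-factor amplitudes, and once phases are recovered, inverting the Fourier transform yields the electron density map. A toy 1-D illustration with synthetic Gaussian "atoms" (real crystallography works in 3-D and must first solve the phase problem, which this sketch sidesteps):

```python
import numpy as np

# Toy 1-D unit cell with two "atoms" as Gaussian electron-density peaks.
n = 256
x = np.linspace(0.0, 1.0, n, endpoint=False)
density = (np.exp(-((x - 0.3) / 0.02) ** 2)
           + 0.5 * np.exp(-((x - 0.7) / 0.02) ** 2))

F = np.fft.fft(density)           # complex structure factors
amplitudes = np.abs(F)            # what the diffraction pattern records
recovered = np.fft.ifft(F).real   # density map, given amplitudes AND phases
```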
Egbring, Marco; Kullak-Ublick, Gerd A; Russmann, Stefan
2010-01-01
To develop a software solution that supports management and clinical review of patient data from electronic medical records databases or claims databases for pharmacoepidemiological drug safety studies. We used open source software to build a data management system and an internet application with a Flex client on a Java application server with a MySQL database backend. The application is hosted on Amazon Elastic Compute Cloud. This solution named Phynx supports data management, Web-based display of electronic patient information, and interactive review of patient-level information in the individual clinical context. This system was applied to a dataset from the UK General Practice Research Database (GPRD). Our solution can be set up and customized with limited programming resources, and there is almost no extra cost for software. Access times are short, the displayed information is structured in chronological order and visually attractive, and selected information such as drug exposure can be blinded. External experts can review patient profiles and save evaluations and comments via a common Web browser. Phynx provides a flexible and economical solution for patient-level review of electronic medical information from databases considering the individual clinical context. It can therefore make an important contribution to an efficient validation of outcome assessment in drug safety database studies.
Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.
Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P
2010-12-22
Comparative genomics resources, such as ortholog detection tools and repositories, are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource-Roundup-using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
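The idea of predicting runtimes and ordering jobs before submission can be sketched as follows. Both the runtime model (product of gene counts with a made-up rate constant) and the longest-first scheduling are illustrative assumptions here, not the paper's fitted model; they show why ordering by predicted runtime beats random submission when instances are billed per hour.

```python
def estimated_runtime(genes_a, genes_b, rate=1e-4):
    # Hypothetical cost model (the paper fits its own): an all-vs-all
    # ortholog comparison scales with the product of the gene counts.
    return genes_a * genes_b * rate

def lpt_schedule(jobs, n_workers):
    # Longest-processing-time-first: submitting the biggest comparisons
    # first lets workers finish near-simultaneously, wasting fewer
    # billed instance-hours than random submission order.
    loads = [0.0] * n_workers
    assignment = [[] for _ in range(n_workers)]
    for name, cost in sorted(jobs, key=lambda j: -j[1]):
        w = loads.index(min(loads))    # send job to least-loaded worker
        loads[w] += cost
        assignment[w].append(name)
    return assignment, max(loads)

# Hypothetical genome pairs with runtimes from the model above (hours).
jobs = [("hsap-mmus", 8.0), ("hsap-dmel", 7.0), ("mmus-dmel", 6.0),
        ("hsap-cele", 5.0), ("mmus-cele", 4.0)]
assignment, makespan = lpt_schedule(jobs, n_workers=2)
```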
Coupled fvGCM-GCE Modeling System, TRMM Latent Heating and Cloud Library
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo
2004-01-01
Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. It requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF) to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud related datasets can provide initial conditions as well as validation for both the MMF and CRMs. A seed fund is available at NASA Goddard to build a MMF based on the 2D GCE model and the Goddard finite volume general circulation model (fvGCM). A prototype MMF will be developed by the end of 2004 and production runs will be conducted at the beginning of 2005. The purpose of this proposal is to augment the current Goddard MMF and other cloud modeling activities. In this talk, I will present: (1) A summary of the second Cloud Modeling Workshop, which took place at NASA Goddard, (2) A summary of the third TRMM Latent Heating Workshop, which took place in Nara, Japan, (3) A brief discussion of the Goddard research plan for using the Weather Research and Forecasting (WRF) model, and (4) A brief discussion of developing a global cloud simulator with the GCE model.
Coupled fvGCM-GCE Modeling System: TRMM Latent Heating and Cloud Library
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo
2005-01-01
Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. It requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF) to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud related datasets can provide initial conditions as well as validation for both the MMF and CRMs. A seed fund is available at NASA Goddard to build a MMF based on the 2D GCE model and the Goddard finite volume general circulation model (fvGCM). A prototype MMF will be developed by the end of 2004 and production runs will be conducted at the beginning of 2005. The purpose of this proposal is to augment the current Goddard MMF and other cloud modeling activities. In this talk, I will present: (1) A summary of the second Cloud Modeling Workshop, which took place at NASA Goddard, (2) A summary of the third TRMM Latent Heating Workshop, which took place in Nara, Japan, and (3) A brief discussion of developing a global cloud simulator with the GCE model.
Providing Access and Visualization to Global Cloud Properties from GEO Satellites
NASA Astrophysics Data System (ADS)
Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Palikonda, R.; Ayers, J. K.
2015-12-01
Providing public access to cloud macro- and microphysical properties is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a tool and method that allow end users to easily browse and access cloud information that is otherwise difficult to acquire and manipulate. The core of the tool is an application programming interface that is made available to the public. One goal of the tool is to provide a demonstration to end users so that they can use the dynamically generated imagery as an input into their own workflows, for both image generation and cloud product requisition. This project builds upon the NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite cloud product imagery accessible and easily searchable. As we see the increasing use of virtual supply chains that provide additional value at each link, there is value in making satellite-derived cloud product information available through a simple access method, and in allowing users to browse and view that imagery as they need rather than in a manner most convenient for the data provider. Using the Open Geospatial Consortium's Web Processing Service as our access method, we describe a system that uses a hybrid local and cloud-based parallel processing system that can return both satellite imagery and cloud product imagery, as well as the binary data used to generate them, in multiple formats. The images and cloud products are sourced from multiple satellites and also from "merged" datasets created by temporally and spatially matching satellite sensors. Finally, the tool and API allow users to access information that spans the time ranges for which our group has information available. In the case of satellite imagery, the temporal range can span the entire lifetime of the sensor.
Scheduling multimedia services in cloud computing environment
NASA Astrophysics Data System (ADS)
Liu, Yunchang; Li, Chunlin; Luo, Youlong; Shao, Yanling; Zhang, Jing
2018-02-01
Currently, security is a critical factor for multimedia services running in the cloud computing environment. As an effective mechanism, trust can improve security levels and mitigate attacks within cloud computing environments. Unfortunately, existing scheduling strategies for multimedia services in the cloud computing environment do not integrate a trust mechanism when making scheduling decisions. In this paper, we propose a scheduling scheme for multimedia services across multiple clouds. First, a novel scheduling architecture is presented. Then, we build a trust model including both subjective trust and objective trust to evaluate the trust degree of multimedia service providers. By employing Bayesian theory, the subjective trust degree between multimedia service providers and users is obtained. According to the attributes of QoS, the objective trust degree of multimedia service providers is calculated. Finally, a scheduling algorithm integrating the trust of entities is proposed that considers the deadline, cost, and trust requirements of multimedia services. The scheduling algorithm heuristically hunts for reasonable resource allocations that satisfy the trust requirements and meet the deadlines of the multimedia services. Detailed simulation experiments demonstrate the effectiveness and feasibility of the proposed trust scheduling scheme.
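A common concrete form of the Bayesian subjective-trust idea is a beta-distribution posterior over past good and bad interactions. The sketch below uses that standard form plus a simple weighted blend with an objective QoS score; both the form and the weight are assumptions for illustration, not necessarily the paper's exact model.

```python
def subjective_trust(good, bad):
    # Beta-distribution posterior mean with a uniform prior: the expected
    # probability that the next interaction with the provider is good.
    return (good + 1.0) / (good + bad + 2.0)

def combined_trust(subj, obj, w=0.5):
    # Weighted blend of subjective and QoS-derived objective trust;
    # the weight w is an assumed tuning parameter.
    return w * subj + (1.0 - w) * obj

# A provider with 8 good and 2 bad past interactions and a QoS score of 0.9.
s = subjective_trust(8, 2)
overall = combined_trust(s, 0.9)
```

A scheduler can then filter out providers whose `overall` falls below a service's trust requirement before applying its deadline and cost heuristics.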
Line segment extraction for large scale unorganized point clouds
NASA Astrophysics Data System (ADS)
Lin, Yangbin; Wang, Cheng; Cheng, Jun; Chen, Bili; Jia, Fukai; Chen, Zhonggui; Li, Jonathan
2015-04-01
Line segment detection in images is already a well-investigated topic, although it has received considerably less attention in 3D point clouds. Benefiting from current LiDAR devices, large-scale point clouds are becoming increasingly common. Most human-made objects have flat surfaces. Line segments that occur where pairs of planes intersect give important information regarding the geometric content of point clouds, which is especially useful for automatic building reconstruction and segmentation. This paper proposes a novel method that is capable of accurately extracting plane intersection line segments from large-scale raw scan points. The 3D line-support region, namely, a point set near a straight linear structure, is extracted simultaneously. The 3D line-support region is fitted by our Line-Segment-Half-Planes (LSHP) structure, which provides a geometric constraint for a line segment, making the line segment more reliable and accurate. We demonstrate our method on the point clouds of large-scale, complex, real-world scenes acquired by LiDAR devices. We also demonstrate the application of 3D line-support regions and their LSHP structures on urban scene abstraction.
Electronic torsional sound in linear atomic chains: Chemical energy transport at 1000 km/s
NASA Astrophysics Data System (ADS)
Kurnosov, Arkady A.; Rubtsov, Igor V.; Maksymov, Andrii O.; Burin, Alexander L.
2016-07-01
We investigate entirely electronic torsional vibrational modes in linear cumulene chains. The carbon nuclei of a cumulene are positioned along the primary axis so that they can participate only in the transverse and longitudinal motions. However, the interatomic electronic clouds behave as a torsion spring with remarkable torsional stiffness. The collective dynamics of these clouds can be described in terms of electronic vibrational quanta, which we name torsitons. It is shown that the group velocity of the wavepacket of torsitons is much higher than the typical speed of sound, because of the small mass of participating electrons compared to the atomic mass. For the same reason, the maximum energy of the torsitons in cumulenes is as high as a few electronvolts, while the minimum possible energy is evaluated as a few hundred wavenumbers and this minimum is associated with asymmetry of zero point atomic vibrations. Theory predictions are consistent with the time-dependent density functional theory calculations. Molecular systems for experimental evaluation of the predictions are proposed.
Observation of thermal quench induced by runaway electrons in magnetic perturbation
NASA Astrophysics Data System (ADS)
Cheon, MunSeong; Seo, Dongcheol; Kim, Junghee
2018-04-01
Experimental observations in Korea Superconducting Tokamak Advanced Research (KSTAR) plasmas show that a loss of pre-disruptive runaway electrons can induce a rapid radiative cooling of the plasma by generating impurity clouds from the first wall. The synchrotron radiation image shows that the loss of runaway electrons occurs from the edge region when the resonant magnetic perturbation is applied to the plasma. When the impact of the runaway electrons on the wall is strong enough, a sudden drop of the electron cyclotron emission (ECE) signal occurs, together with characteristic plasma behaviors such as a positive spike and subsequent decay of the plasma current, a Dα spike, large magnetic fluctuations, etc. The visible images at this runaway loss show evidence of the generation of an impurity cloud and the subsequent radiative cooling. When the runaway beam is located at the plasma edge, thermal quenches are expected to occur without global destruction of the magnetic structure up to the core.
Cloud-Aerosol Transport System (CATS)
Atmospheric Science Data Center
2017-04-18
... build-to-cost project development with streamlined management structure. Conducted successful underflights of opportunity ... (CPL) on the ER-2 on Feb 10, 17, 20 and 21. For more information, please see the CATS homepage or the attached presentation ...
Analysis of the Metal Oxide Space Clouds (MOSC) HF Propagation Environment
NASA Astrophysics Data System (ADS)
Jackson-Booth, N.; Selzer, L.
2015-12-01
Artificial Ionospheric Modification (AIM) attempts to modify the ionosphere in order to alter the high frequency (HF) propagation environment. It can be achieved through injections of aerosols, chemicals or radio (RF) signals into the ionosphere. The Metal Oxide Space Clouds (MOSC) experiment was undertaken in April/May 2013 to investigate chemical AIM. Two sounding rockets were launched from the Kwajalein Atoll (part of the Marshall Islands) and each released a cloud of vaporized samarium (Sm). The samarium created a localized plasma cloud, with increased electron density, which formed an additional ionospheric layer. The ionospheric effects were measured by a wide range of ground based instrumentation which included a network of high frequency (HF) sounders. Chirp transmissions were made from three atolls and received at five sites within the Marshall Islands. One of the receive sites consisted of an 18 antenna phased array, which was used for direction finding. The ionograms have shown that as well as generating a new layer the clouds created anomalous RF propagation paths, which interact with both the cloud and the F-layer, resulting in 'ghost traces'. To fully understand the propagation environment a 3D numerical ray trace has been undertaken, using a variety of background ionospheric and cloud models, to find the paths through the electron density grid for a given fan of elevation and azimuth firing angles. Synthetic ionograms were then produced using the ratio of ray path length to speed of light as an estimation of the delay between transmission and observation for a given frequency of radio wave. This paper reports on the latest analysis of the MOSC propagation environment, comparing theory with observations, to further understanding of AIM.
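The synthetic-ionogram delay estimate mentioned above (ray-path length divided by the speed of light) is a one-line computation; this Python sketch, with illustrative function names, also converts a round-trip delay to the equivalent ionogram virtual height.

```python
# Minimal sketch of the delay estimate used to build synthetic ionograms:
# group delay approximated as ray-path length over the speed of light.

C = 299_792_458.0  # speed of light in vacuum, m/s

def synthetic_delay(path_length_m: float) -> float:
    """Delay (s) between transmission and observation for one ray path."""
    return path_length_m / C

def virtual_height(delay_s: float) -> float:
    """Ionogram virtual height (m): one-way equivalent of a round-trip delay."""
    return C * delay_s / 2.0
```

Repeating this over a fan of elevation/azimuth firing angles and a grid of frequencies yields the synthetic ionogram traces compared with the chirp observations.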
Ionisation in ultra-cool, cloud forming extrasolar planetary atmospheres
NASA Astrophysics Data System (ADS)
Helling, Christiane; the LEAP Team
2015-04-01
Transit spectroscopy provides evidence that extrasolar planets are covered in clouds, a finding that was forecast by cloud model simulations 15 years ago. Atmospheres are strongly affected by clouds through their large opacity and their chemical activity. Cloud formation models make it possible to predict cloud particle sizes, their chemical composition and the composition of the remaining atmospheric gas (Woitke & Helling 2004, A&A 414; Helling & Woitke 2006, A&A 455), for example, as input for radiative transfer codes like Drift-Phoenix (Witte et al. 2009, A&A 506). These cloud particles are charged and can discharge, for example in the form of lightning (Helling et al. 2013, ApJ 767; Bailey et al. 2014, ApJ 784). Earth observations demonstrate that lightning affects not only the local chemistry but also the electron budget of the atmosphere. This talk will present our work on cloud formation modelling and ionisation processes in cloud-forming atmospheres. A hierarchy of ionisation processes leads to a vertically inhomogeneously ionised atmosphere, which has implications for planetary mass loss and the global circulation patterns of planetary atmospheres. The processes involved, such as cosmic-ray ionisation, also activate the local chemistry such that large hydrocarbon molecules form (Rimmer et al. 2014, IJAsB 13).
O2 A Band Studies for Cloud Detection and Algorithm Improvement
NASA Technical Reports Server (NTRS)
Chance, K. V.
1996-01-01
Detection of cloud parameters from space-based spectrometers can employ the vibrational bands of O2 in the b¹Σg⁺ → X³Σg⁻ spin-forbidden electronic transition manifold, particularly the Δv = 0 A band. The GOME instrument uses the A band in the Initial Cloud Fitting Algorithm (ICFA). The work reported here consists of making substantial improvements in the line-by-line spectral database for the A band, testing whether an additional correction to the line shape function is necessary in order to correctly model the atmospheric transmission in this band, and calculating prototype cloud and ground template spectra for comparison with satellite measurements.
NASA Technical Reports Server (NTRS)
Suomi, V. E.
1975-01-01
The complete output of the Synchronous Meteorological Satellite was recorded on one-inch magnetic tape. A quality control subsystem tests cloud track vectors against four sets of criteria: (1) rejection if the best match occurs on a correlation boundary; (2) rejection if the major correlation peak is not distinct and significantly greater than the secondary peak; (3) rejection if the correlation is not persistent; and (4) rejection if the acceleration is too great. A cloud height program determines cloud optical thickness from visible data and computes infrared emissivity. From the infrared data and a temperature profile, cloud height is determined. A functional description and electronic schematics of the equipment are given.
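The four rejection criteria can be expressed as a single filter; this Python sketch uses assumed threshold parameters, since the original system's numeric limits are not given in the abstract.

```python
# Hedged sketch of the four-criteria quality control for cloud track
# vectors. Field names and thresholds are illustrative assumptions.

def accept_vector(on_boundary: bool, peak_ratio: float,
                  persistent: bool, acceleration: float,
                  min_peak_ratio: float = 1.5,
                  max_acceleration: float = 5.0) -> bool:
    """Return True only if a cloud track vector passes all four tests:
    (1) best match not on the correlation boundary,
    (2) major correlation peak distinctly greater than the secondary peak,
    (3) correlation persistent across successive images,
    (4) implied acceleration not too great."""
    if on_boundary:
        return False
    if peak_ratio < min_peak_ratio:
        return False
    if not persistent:
        return False
    if acceleration > max_acceleration:
        return False
    return True
```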
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maté, Belén; Molpeceres, Germán; Jiménez-Redondo, Miguel
2016-11-01
The effects of cosmic rays on the carriers of the interstellar 3.4 μm absorption band have been investigated in the laboratory. This band is attributed to stretching vibrations of CH₃ and CH₂ in carbonaceous dust. It is widely observed in the diffuse interstellar medium, but disappears in dense clouds. Destruction of CH₃ and CH₂ by cosmic rays could become relevant in dense clouds, which are shielded from the external ultraviolet field. For the simulations, samples of hydrogenated amorphous carbon (a-C:H) were irradiated with 5 keV electrons. The decay of the band intensity versus electron fluence reflects a-C:H dehydrogenation, which is well described by a model assuming that H₂ molecules, formed by the recombination of H atoms liberated through CH bond breaking, diffuse out of the sample. The CH bond destruction rates derived from the present experiments are in good agreement with those from previous ion irradiation experiments of HAC. The experimental simplicity of electron bombardment has allowed the use of higher energy doses than in the ion experiments. The effects of cosmic rays on the aliphatic components of cosmic dust are found to be small. The estimated cosmic-ray destruction times for the 3.4 μm band carriers lie in the 10⁸ yr range and cannot account for the disappearance of this band in dense clouds, which have characteristic lifetimes of 3 × 10⁷ yr. The results invite a more detailed investigation of the mechanisms of CH bond formation and breaking in the intermediate region between diffuse and dense clouds.
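As a simplified illustration of the band-decay measurement, a first-order destruction law I(F) = I₀ exp(−σF) relates band intensity to electron fluence; the paper's H₂-diffusion model is more detailed, so the single destruction cross-section σ below is a simplifying assumption with an illustrative value.

```python
# First-order decay of the 3.4 μm band intensity with electron fluence.
# A single effective cross-section is an assumption; the actual model
# involves H2 formation and diffusion out of the a-C:H sample.
import math

def band_intensity(fluence: float, sigma: float, i0: float = 1.0) -> float:
    """I(F) = I0 * exp(-sigma * F), with sigma in cm^2 and the fluence F
    in electrons/cm^2."""
    return i0 * math.exp(-sigma * fluence)
```

Fitting such a curve to the measured intensities versus fluence is one way the destruction rate per incident electron can be extracted and scaled to cosmic-ray timescales.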
Community Seismic Network (CSN)
NASA Astrophysics Data System (ADS)
Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Cheng, M.; Guy, R.; Chandy, M.; Krause, A.; Bunn, J.; Olson, M.; Faulkner, M.; Liu, A.; Strand, L.
2012-12-01
We report on developments in sensor connectivity, architecture, and data fusion algorithms executed in cloud computing systems in the Community Seismic Network (CSN), a network of low-cost sensors housed in homes and offices by volunteers in the Pasadena, CA area. The network has over 200 sensors continuously reporting anomalies in local acceleration through the Internet to a cloud computing service (the Google App Engine) that continually fuses sensor data to rapidly detect shaking from earthquakes. The cloud computing system consists of data centers geographically distributed across the continent and is likely to be resilient even during earthquakes and other local disasters. The region of Southern California is partitioned in a multi-grid style into sets of telescoping cells called geocells. Data streams from sensors within a geocell are fused to detect anomalous shaking across the geocell. Temporal and spatial patterns across geocells are used to detect anomalies across regions. The challenge is to detect earthquakes rapidly with an extremely low false-positive rate. We report on two data fusion algorithms: one that tessellates the surface so as to fuse data from a large region around Pasadena, and another that uses a standard tessellation of equal-sized cells. Since September 2011, the network has successfully detected earthquakes of magnitude 2.5 or higher within 40 km of Pasadena. In addition to the standard USB device, which connects to the host's computer, we have developed a stand-alone sensor that connects directly to the Internet via Ethernet or Wi-Fi. This bypasses security concerns that some companies have with the USB-connected devices, and allows for 24/7 monitoring at sites that would otherwise shut down their computers after working hours. In buildings we use the sensors to model the behavior of the structures during weak events in order to understand how they will perform during strong events.
Visualization models of instrumented buildings ranging between five and 22 stories tall have been constructed using Google SketchUp. Ambient vibration records are used to identify the first set of horizontal vibrational modal frequencies of the buildings. These frequencies are used to compute the response on every floor of the building, given either observed data or scenario ground motion input at the buildings' base.
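The geocell-level fusion described above can be sketched as a simple coincidence test: a cell declares shaking only when enough distinct sensors report acceleration anomalies within a short time window. The window length, sensor count, and data layout below are illustrative assumptions, not the CSN's actual algorithm.

```python
# Sketch of geocell fusion: trigger a cell when >= min_sensors distinct
# sensors report anomalies within window_s seconds of each other.
# Thresholds and the pick format are assumptions for illustration.
from collections import defaultdict

def fuse_picks(picks, window_s: float = 5.0, min_sensors: int = 4):
    """picks: iterable of (geocell_id, sensor_id, time_s).
    Returns the set of geocells that satisfy the coincidence test."""
    by_cell = defaultdict(list)
    for cell, sensor, t in picks:
        by_cell[cell].append((t, sensor))
    triggered = set()
    for cell, events in by_cell.items():
        events.sort()
        for i, (t0, _) in enumerate(events):
            # distinct sensors reporting within window_s of this pick
            sensors = {s for t, s in events[i:] if t - t0 <= window_s}
            if len(sensors) >= min_sensors:
                triggered.add(cell)
                break
    return triggered
```

Requiring multiple independent sensors per cell is one simple way to keep the false-positive rate low despite noisy consumer-grade accelerometers.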
A secure medical data exchange protocol based on cloud environment.
Chen, Chin-Ling; Yang, Tsai-Tung; Shih, Tzay-Farn
2014-09-01
In recent years, health care technologies such as electronic medical records, which can be easily stored, have matured. However, convenient access to medical resources remains a concern. Although many papers have discussed medical systems, these systems still face many security challenges, the most important of which is patients' privacy. Therefore, we propose a secure medical data exchange protocol based on a cloud environment. Our scheme exploits the characteristics of mobile devices, allowing people to use medical resources in the cloud environment to seek medical advice conveniently.
Integrated Building Management System (IBMS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anita Lewis
This project provides a combination of software and services that more easily and cost-effectively help to achieve optimized building performance and energy efficiency. Featuring an open-platform, cloud-hosted application suite and an intuitive user experience, this solution simplifies a traditionally very complex process by collecting data from disparate building systems and creating a single, integrated view of building and system performance. The Fault Detection and Diagnostics (FDD) algorithms developed within the IBMS have been designed and tested as an integrated component of the control algorithms running the equipment being monitored. The algorithms identify the normal control behaviors of the equipment without interfering with the equipment control sequences. The algorithms also work without interfering with any cooperative control sequences operating between different pieces of equipment or building systems. In this manner the FDD algorithms create an integrated building management system.
Self-sustained oscillations in nanoelectromechanical systems induced by Kondo resonance
NASA Astrophysics Data System (ADS)
Song, Taegeun; Kiselev, Mikhail N.; Kikoin, Konstantin; Shekhter, Robert I.; Gorelik, Leonid Y.
2014-03-01
We investigate the instability and dynamical properties of nanoelectromechanical systems represented by a single-electron device containing movable quantum dots attached to a vibrating cantilever via asymmetric tunnel contacts. The Kondo resonance in electron tunneling between the source and shuttle facilitates self-sustained oscillations originating from the strong coupling of mechanical and electronic/spin degrees of freedom. We analyze a stability diagram for the two-channel Kondo shuttling regime due to limitations given by the electromotive force acting on a moving shuttle, and find that the saturation oscillation amplitude is associated with the retardation effect of the Kondo cloud. The results shed light on possible ways to experimentally realize the Kondo-cloud dynamical probe by using high mechanical dissipation tunability as well as supersensitive detection of mechanical displacement.
From Faddeev-Kulish to LSZ. Towards a non-perturbative description of colliding electrons
NASA Astrophysics Data System (ADS)
Dybalski, Wojciech
2017-12-01
In a low-energy approximation of the massless Yukawa theory (Nelson model) we derive a Faddeev-Kulish type formula for the scattering matrix of N electrons and reformulate it in LSZ terms. To this end, we perform a decomposition of the infrared-finite Dollard modifier into clouds of real and virtual photons, whose infrared divergences mutually cancel. We point out that in the original work of Faddeev and Kulish the clouds of real photons are omitted, and consequently their wave operators are ill-defined on the Fock space of free electrons. To support our observations, we compare our final LSZ expression for N = 1 with a rigorous non-perturbative construction due to Pizzo. While our discussion contains some heuristic steps, they can be formulated as clear-cut mathematical conjectures.
NASA Astrophysics Data System (ADS)
Gacal, G. F. B.; Tan, F.; Antioquia, C. T.; Lagrosas, N.
2014-12-01
Cloud detection during nighttime poses a real problem to researchers because of a lack of optimum sensors that can specifically detect clouds during this time of the day. Hence, lidars and satellites are currently among the instruments being utilized to determine cloud presence in the atmosphere. These clouds play a significant role in the night weather system because they serve as barriers to thermal radiation from the Earth, reflecting this radiation back to the Earth. This effectively lowers the rate of decreasing temperature in the atmosphere at night. The objective of this study is to detect cloud occurrences at nighttime for the purpose of studying patterns of cloud occurrence and the effects of clouds on local weather. In this study, a commercial camera (Canon PowerShot A2300) is operated continuously to capture nighttime clouds. The camera is situated inside a weather-proof box with a glass cover and is placed on the rooftop of the Manila Observatory building to gather pictures of the sky every 5 min to observe cloud dynamics and evolution in the atmosphere. To detect pixels with clouds, the pictures are converted from their native JPEG to grayscale format. The pixels are then screened for clouds by looking at the values of pixels with and without clouds. In grayscale format, pixels with clouds have greater pixel values than pixels without clouds. Based on the observations, 0.34 of the maximum pixel value is enough to discern pixels with clouds from pixels without clouds. Figs. 1a and 1b are sample unprocessed pictures of a cloudless night (May 22-23, 2014) and a cloudy sky (May 23-24, 2014), respectively. Figs. 1c and 1d show the percentage occurrence of nighttime clouds on May 22-23 and May 23-24, 2014, respectively. The cloud occurrence in a pixel is defined as the ratio of the number of times the pixel has clouds to the total number of observations. Fig. 1c shows less than 50% cloud occurrence, while Fig. 1d shows greater cloud occurrence than Fig. 1c. These graphs show the capability of the camera to detect and measure cloud occurrence at nighttime. Continuous collection of nighttime pictures is currently implemented. In regions where there is a dearth of scientific data, the measured nighttime cloud occurrence will serve as a baseline for future cloud studies in this part of the world.
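The thresholding step described above (a pixel is cloudy if it exceeds 0.34 of the maximum pixel value) and the per-pixel occurrence statistic can be sketched in a few lines of NumPy; the function names are illustrative.

```python
# Sketch of the grayscale thresholding and occurrence statistic from the
# abstract. The 0.34 fraction is the value reported by the study.
import numpy as np

def cloud_mask(gray: np.ndarray, frac: float = 0.34) -> np.ndarray:
    """Boolean mask of cloudy pixels in a grayscale night-sky image."""
    return gray > frac * gray.max()

def cloud_occurrence(masks) -> np.ndarray:
    """Per-pixel fraction of frames in which cloud was detected."""
    stack = np.stack(list(masks))
    return stack.mean(axis=0)
```

Applying `cloud_mask` to each 5-minute frame and averaging the masks over a night reproduces the per-pixel occurrence maps shown in Figs. 1c and 1d.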
Point Clouds to Indoor/outdoor Accessibility Diagnosis
NASA Astrophysics Data System (ADS)
Balado, J.; Díaz-Vilariño, L.; Arias, P.; Garrido, I.
2017-09-01
This work presents an approach to automatically detect structural floor elements such as steps or ramps in the immediate environment of buildings, elements that may affect accessibility to buildings. The methodology is based on Mobile Laser Scanner (MLS) point cloud and trajectory information. First, the street is segmented into stretches along the trajectory of the MLS in order to work in regular spaces. Next, the lower region of each stretch (the ground zone) is selected as the region of interest (ROI), and the normal, curvature and tilt are calculated for each point. With this information, points in the ROI are classified as horizontal, inclined or vertical. Points are refined and grouped into structural elements using raster processing and connected components, in different phases for each type of previously classified point. Finally, the trajectory data is used to distinguish between roads and sidewalks. Adjacency information is used to classify structural elements as steps, ramps, curbs and curb-ramps. The methodology is tested in a real case study consisting of 100 m of an urban street. Ground elements are correctly classified in an acceptable computation time. Steps and ramps are also exported to GIS software to enrich building models from OpenStreetMap with information about accessible/inaccessible entrances and their locations.
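The per-point classification step (horizontal, inclined or vertical, from the tilt of the point normal) can be sketched as follows; the angle thresholds are illustrative assumptions, since the paper's exact values are not given in the abstract.

```python
# Sketch of classifying a ground point from its estimated normal vector.
# Tilt is the angle between the normal and the vertical axis; the 15° and
# 75° thresholds are assumptions for illustration.
import math

def classify_point(normal, horiz_max_deg: float = 15.0,
                   vert_min_deg: float = 75.0) -> str:
    """normal: (nx, ny, nz) unit vector estimated from the point's
    neighborhood. Returns 'horizontal', 'inclined' or 'vertical'."""
    nx, ny, nz = normal
    tilt = math.degrees(math.acos(min(1.0, abs(nz))))
    if tilt <= horiz_max_deg:
        return "horizontal"
    if tilt >= vert_min_deg:
        return "vertical"
    return "inclined"
```

Grouping same-label points with connected components then yields candidate ramps (inclined), curb faces (vertical) and walkable ground (horizontal).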
Barta, András; Horváth, Gábor; Horváth, Ákos; Egri, Ádám; Blahó, Miklós; Barta, Pál; Bumke, Karl; Macke, Andreas
2015-02-10
Cloud cover estimation is an important part of routine meteorological observations. Cloudiness measurements are used in climate model evaluation, nowcasting solar radiation, parameterizing the fluctuations of sea surface insolation, and building energy transfer models of the atmosphere. Currently, the most widespread ground-based method to measure cloudiness is based on analyzing the unpolarized intensity and color distribution of the sky obtained by digital cameras. As a new approach, we propose that cloud detection can be aided by the additional use of skylight polarization measured by 180° field-of-view imaging polarimetry. In the fall of 2010, we tested such a novel polarimetric cloud detector aboard the research vessel Polarstern during expedition ANT-XXVII/1. One of our goals was to test the durability of the measurement hardware under the extreme conditions of a trans-Atlantic cruise. Here, we describe the instrument and compare the results of several different cloud detection algorithms, some conventional and some newly developed. We also discuss the weaknesses of our design and its possible improvements. The comparison with cloud detection algorithms developed for traditional nonpolarimetric full-sky imagers allowed us to evaluate the added value of polarimetric quantities. We found that (1) neural-network-based algorithms perform the best among the investigated schemes and (2) global information (the mean and variance of intensity), nonoptical information (e.g., sun-view geometry), and polarimetric information (e.g., the degree of polarization) improve the accuracy of cloud detection, albeit slightly.
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Minnis, P.; Spangenberg, D.; Ayers, J. K.; Palikonda, R.; Vakhnin, A.; Dubois, R.; Murphy, P. R.
2014-12-01
The processing, storage and dissemination of satellite cloud and radiation products produced at NASA Langley Research Center are key activities for the Climate Science Branch. A constellation of systems operates in sync to accomplish these goals. Because of the complexity involved with operating such intricate systems, there are both high failure rates and high costs for hardware and system maintenance. Cloud computing has the potential to ameliorate cost and complexity issues. Over time, the cloud computing model has evolved and hybrid systems comprising off-site as well as on-site resources are now common. Towards our mission of providing the highest quality research products to the widest audience, we have explored the use of the Amazon Web Services (AWS) Cloud and Storage and present a case study of our results and efforts. This project builds upon NASA Langley Cloud and Radiation Group's experience with operating large and complex computing infrastructures in a reliable and cost effective manner to explore novel ways to leverage cloud computing resources in the atmospheric science environment. Our case study presents the project requirements and then examines the fit of AWS with the LaRC computing model. We also discuss the evaluation metrics, feasibility, and outcomes and close the case study with the lessons we learned that would apply to others interested in exploring the implementation of the AWS system in their own atmospheric science computing environments.
FINAL REPORT (DE-FG02-97ER62338): Single-column modeling, GCM parameterizations, and ARM data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richard C. J. Somerville
2009-02-27
Our overall goal is the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data at all three ARM sites, and the implementation and testing of these parameterizations in global models. To test recently developed prognostic parameterizations based on detailed cloud microphysics, we have compared SCM (single-column model) output with ARM observations at the SGP, NSA and TWP sites. We focus on the predicted cloud amounts and on a suite of radiative quantities strongly dependent on clouds, such as downwelling surface shortwave radiation. Our results demonstrate the superiority of parameterizations based on comprehensive treatments of cloud microphysics and cloud-radiative interactions. At the SGP and NSA sites, the SCM results simulate the ARM measurements well and are demonstrably more realistic than typical parameterizations found in conventional operational forecasting models. At the TWP site, the model performance depends strongly on details of the scheme, and the results of our diagnostic tests suggest ways to develop improved parameterizations better suited to simulating cloud-radiation interactions in the tropics generally. These advances have made it possible to take the next step and build on this progress, by incorporating our parameterization schemes in state-of-the-art three-dimensional atmospheric models, and diagnosing and evaluating the results using independent data. Because the improved cloud-radiation results have been obtained largely via implementing detailed and physically comprehensive cloud microphysics, we anticipate that improved predictions of hydrologic cycle components, and hence of precipitation, may also be achievable.
Does a Relationship Between Arctic Low Clouds and Sea Ice Matter?
NASA Technical Reports Server (NTRS)
Taylor, Patrick C.
2016-01-01
Arctic low clouds strongly affect the Arctic surface energy budget. Through this impact Arctic low clouds influence important aspects of the Arctic climate system, namely surface and atmospheric temperature, sea ice extent and thickness, and atmospheric circulation. Arctic clouds are in turn influenced by these elements of the Arctic climate system, and these interactions create the potential for Arctic cloud-climate feedbacks. To further our understanding of potential Arctic cloud-climate feedbacks, the goal of this paper is to quantify the influence of atmospheric state on the surface cloud radiative effect (CRE) and its covariation with sea ice concentration (SIC). We build on previous research using instantaneous, active remote sensing satellite footprint data from the NASA A-Train. First, the results indicate significant differences in the surface CRE when stratified by atmospheric state. Second, there is a weak covariation between CRE and SIC for most atmospheric conditions. Third, the results show statistically significant differences in the average surface CRE under different SIC values in fall, indicating a 3-5 W m⁻² larger LW CRE in 0% versus 100% SIC footprints. Because systematic changes on the order of 1 W m⁻² are sufficient to explain the observed long-term reductions in sea ice extent, our results indicate a potentially significant amplifying sea ice-cloud feedback, under certain meteorological conditions, that could delay the fall freeze-up and influence the variability in sea ice extent and volume. Lastly, a small change in the frequency of occurrence of atmospheric states may yield a larger Arctic cloud feedback than any cloud response to sea ice.
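The fall LW CRE contrast reported above (0% versus 100% SIC footprints) amounts to a difference of conditional means over footprint samples; a minimal NumPy sketch, with illustrative sample values rather than real A-Train data:

```python
# Sketch of compositing footprint-level surface LW CRE by sea ice
# concentration. The sample arrays below are invented for illustration.
import numpy as np

def lw_cre_difference(cre: np.ndarray, sic: np.ndarray) -> float:
    """Mean LW CRE over 0% SIC (open-water) footprints minus the mean
    over 100% SIC (fully ice-covered) footprints, in W m^-2."""
    open_water = cre[sic == 0.0]
    full_ice = cre[sic == 100.0]
    return float(open_water.mean() - full_ice.mean())
```

Stratifying the same composite by atmospheric state (e.g. lower-tropospheric stability bins) before differencing is what isolates the meteorological dependence discussed in the abstract.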
Boamah, Mavis D; Sullivan, Kristal K; Shulenberger, Katie E; Soe, ChanMyae M; Jacob, Lisa M; Yhee, Farrah C; Atkinson, Karen E; Boyer, Michael C; Haines, David R; Arumainayagam, Christopher R
2014-01-01
In the interstellar medium, UV photolysis of condensed methanol (CH3OH), contained in ice mantles surrounding dust grains, is thought to be the mechanism that drives the formation of "complex" molecules, such as methyl formate (HCOOCH3), dimethyl ether (CH3OCH3), acetic acid (CH3COOH), and glycolaldehyde (HOCH2CHO). The source of this reaction-initiating UV light is assumed to be local because externally sourced UV radiation cannot penetrate the ice-containing dark, dense molecular clouds. Specifically, exceedingly penetrative high-energy cosmic rays generate secondary electrons within the clouds through molecular ionizations. Hydrogen molecules, present within these dense molecular clouds, are excited in collisions with these secondary electrons. It is the UV light, emitted by these electronically excited hydrogen molecules, that is generally thought to photoprocess interstellar icy grain mantles to generate "complex" molecules. In addition to producing UV light, the large numbers of low-energy (< 20 eV) secondary electrons produced by cosmic rays can also directly initiate radiolysis reactions in the condensed phase. The goal of our studies is to understand the low-energy, electron-induced processes that occur when high-energy cosmic rays interact with interstellar ices, in which methanol, a precursor of several prebiotic species, is the most abundant organic species. Using post-irradiation temperature-programmed desorption, we have investigated the radiolysis initiated by low-energy (7 eV and 20 eV) electrons in condensed methanol at ∼85 K under ultrahigh vacuum (5 × 10⁻¹⁰ Torr) conditions. We have identified eleven electron-induced methanol radiolysis products, which include many that have been previously identified as being formed by methanol UV photolysis in the interstellar medium.
These experimental results suggest that low-energy, electron-induced condensed phase reactions may contribute to the interstellar synthesis of "complex" molecules previously thought to form exclusively via UV photons.
2002-06-18
KENNEDY SPACE CENTER, FLA. -- Black storm clouds roll in over the Vehicle Assembly Building, bringing thunder and heavy rain. This type of weather convinced flight control managers to wave off the two scheduled landing attempts at KSC for Endeavour, returning from mission STS-111
An Evaluation Methodology for the Usability and Security of Cloud-based File Sharing Technologies
2012-09-01
FISMA, ISO 27001, FIPS 140-2, and ISO 27001) indicate a cloud-based service's compliance with industry standard security controls, management and...Information Assurance IEEE Institute of Electrical and Electronics Engineers IT Information Technology ITS Insider Threat Study ISO International...effectively, efficiently and with satisfaction" (International Organization for Standardization [ISO], 1998). Alternately, information security
Rango, A.; Foster, J.; Josberger, E.G.; Erbe, E.F.; Pooley, C.; Wergin, W.P.
2003-01-01
Snow crystals, which form by vapor deposition, occasionally come in contact with supercooled cloud droplets during their formation and descent. When this occurs, the droplets adhere and freeze to the snow crystals in a process known as accretion. During the early stages of accretion, discrete snow crystals exhibiting frozen cloud droplets are referred to as rime. If this process continues, the snow crystal may become completely engulfed in frozen cloud droplets. The resulting particle is known as graupel. Light microscopic investigations have studied rime and graupel for nearly 100 years. However, the limiting resolution and depth of field associated with the light microscope have prevented detailed descriptions of the microscopic cloud droplets and the three-dimensional topography of the rime and graupel particles. This study uses low-temperature scanning electron microscopy to characterize the frozen precipitates that are commonly known as rime and graupel. Rime, consisting of frozen cloud droplets, is observed on all types of snow crystals including needles, columns, plates, and dendrites. The droplets, which vary in size from 10 to 100 μm, frequently accumulate along one face of a single snow crystal, but are found more randomly distributed on aggregations consisting of two or more snow crystals (snowflakes). The early stages of riming are characterized by the presence of frozen cloud droplets that appear as a layer of flattened hemispheres on the surface of the snow crystal. As this process continues, the cloud droplets appear more sinuous and elongate as they contact and freeze to the rimed crystals. The advanced stages of this process result in graupel, a particle 1 to 3 mm across, composed of hundreds of frozen cloud droplets interspersed with considerable air spaces; the original snow crystal is no longer discernible. 
This study increases our knowledge about the process and characteristics of riming and suggests that the initial appearance of the flattened hemispheres may result from impact of the leading face of the snow crystal with cloud droplets. The elongated and sinuous configurations of frozen cloud droplets that are encountered on the more advanced stages suggest that aerodynamic forces propel cloud droplets to the trailing face of the descending crystal where they make contact and freeze.
Kinetics of laser irradiated nanoparticles cloud
NASA Astrophysics Data System (ADS)
Mishra, S. K.; Upadhyay Kahaly, M.; Misra, Shikha
2018-02-01
A comprehensive kinetic model describing the complex kinetics of a laser-irradiated nanoparticle ensemble has been developed. The absorbed laser radiation here serves a dual purpose, viz., photo-enhanced thermionic emission via the rise in particle temperature and direct photoemission of electrons. On the basis of mean charge theory, along with the equations for particle (electron) and energy flux balance over the nanoparticles, the transient processes of charge/temperature evolution over the particle surface and of mass diminution on account of sublimation (phase change) have been elucidated. Using this formulation, the phenomena of nanoparticle charging, temperature rise to the sublimation point, mass ablation, and cloud disintegration have been investigated; afterwards, typical timescales of disintegration, sublimation, and complete evaporation have been examined parametrically for a graphite nanoparticle cloud as an illustrative case. Based on a numerical analysis, an adequate parameter space describing nanoparticle operation below the sublimation temperature, in terms of laser intensity, wavelength, and nanoparticle material work function, has been identified. The cloud disintegration is found to be sensitive to nanoparticle charging through photoemission; as a consequence, radiation operating below the photoemission threshold causes disintegration in the phase-change state, while above the threshold disintegration occurs with the onset of surface heating.
NASA Astrophysics Data System (ADS)
Isarie, Claudiu I.; Oprean, Constantin; Marginean, Ion; Nemes, Toderita; Isarie, Ilie V.; Bokor, Corina; Itu, Sorin
2011-03-01
When a photon beam strikes a metal, peripheral electrons of the bombarded material make energy jumps, while at the same time new photons are absorbed by electrons that have not yet returned to their ground levels. At a high concentration of radiant energy, a peripheral electron can sequentially absorb several photons and make successive energy jumps, equivalent to absorbing photons of higher energy, with wavelengths shorter than those of the incident photons. After several successive photon absorptions by the same electron, in an interval during which it is not excited by new photons, the electron returns to the ground level and releases the accumulated energy as photons of higher energy than the individual photons of the incident beam. On returning to the ground level, the electrons disturb the electron cloud of the atom or ion to which they belong. After a huge number of such events, the repeatedly disturbed electron cloud produces an oscillation that raises the temperature of the nucleus. The authors have studied the conditions that generate the rise in temperature and the multiple radiations at the spot where the photons bombard the metal.
Secure Skyline Queries on Cloud Platform
Liu, Jinfei; Yang, Juncheng; Xiong, Li; Pei, Jian
2017-01-01
Outsourcing data and computation to a cloud server provides a cost-effective way to support large-scale data storage and query processing. However, due to security and privacy concerns, sensitive data (e.g., medical records) need to be protected from the cloud server and other unauthorized users. One approach is to outsource encrypted data to the cloud server and have the cloud server perform query processing on the encrypted data only. It remains a challenging task to support various queries over encrypted data in a secure and efficient way such that the cloud server does not gain any knowledge about the data, query, or query result. In this paper, we study the problem of secure skyline queries over encrypted data. The skyline query is particularly important for multi-criteria decision making but also presents significant challenges due to its complex computations. We propose a fully secure skyline query protocol on data encrypted using semantically secure encryption. As a key subroutine, we present a new secure dominance protocol, which can also be used as a building block for other queries. Finally, we provide both serial and parallelized implementations and empirically study the protocols in terms of efficiency and scalability under different parameter settings, verifying the feasibility of our proposed solutions. PMID:28883710
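The protocol above operates on encrypted data, but the dominance relation it protects is the ordinary plain-text skyline test. As an illustration of that underlying logic only (not the paper's secure protocol), a minimal skyline computation might look like:

```python
def dominates(p, q):
    """p dominates q if p is no worse in every dimension and strictly
    better in at least one. Here 'better' means smaller (e.g. minimizing
    both cost and distance)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    """Return the points not dominated by any other point (naive O(n^2) scan)."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

records = [(2, 8), (4, 4), (9, 1), (5, 6), (7, 7)]
print(skyline(records))  # [(2, 8), (4, 4), (9, 1)]
```

Here (5, 6) and (7, 7) are both dominated by (4, 4) and so drop out of the skyline; a secure protocol must reach the same answer without the server learning any of the comparisons.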
NASA Astrophysics Data System (ADS)
Pawłowicz, Joanna A.
2017-10-01
The TLS (Terrestrial Laser Scanning) method may replace traditional building survey methods, e.g. those requiring the use of measuring tapes or range finders. This technology allows for collecting digital data in the form of a point cloud, which can be used to create a 3D model of a building. In addition, it allows for collecting data with remarkable precision, which translates into the possibility of reproducing all architectural features of a building. This data is applied in reverse engineering to create a 3D model of an object existing in physical space. This study presents the results of research carried out using a point cloud to recreate the architectural features of a historical building with the application of reverse engineering. The research was conducted on a two-storey residential building with a basement and an attic. A veranda with a complicated wooden structure projects from the building's façade. The measurements were taken at medium and the highest resolution using a Leica ScanStation C10 laser scanner. The data obtained was processed using specialist software, which allowed for the application of reverse engineering, especially for reproducing the sculpted details of the veranda. Following digitization, all redundant data was removed from the point cloud and the cloud was subjected to modelling. For testing purposes, a selected part of the veranda was modelled by means of two methods: surface matching and Triangulated Irregular Network (TIN). Both modelling methods were applied to the data collected at medium and the highest resolution. Creating a model based on data obtained at medium resolution, whether by surface matching or by the TIN method, does not allow for a precise recreation of architectural details. The study presents certain sculpted elements recreated from the highest-resolution data with a superimposed TIN, juxtaposed against a digital image. The resulting model is very precise.
Creating good models requires highly accurate field data. It is important to properly choose the distance between the measuring station and the measured object so that the laser beam strikes the surface as close to perpendicular as possible, both horizontally and vertically. The model created from medium-resolution data offers very poor detail quality: only the bigger, basic elements of each detail are clearly visible, while the smaller ones are blurred. This is why, in order to obtain data sufficient to reproduce architectural details, laser scanning should be performed at the highest resolution. In addition, modelling by means of the surface matching method should be avoided; the TIN method is a better choice. Besides providing a realistic-looking visualization, the TIN method has one more important advantage: it is 4 times faster than the surface matching method.
SeReNA Project: studying aerosol interactions with cloud microphysics in the Amazon Basin
NASA Astrophysics Data System (ADS)
Correia, A. L.; Catandi, P. B.; Frigeri, F. F.; Ferreira, W. C.; Martins, J.; Artaxo, P.
2012-12-01
Cloud microphysics and its interaction with aerosols is a key atmospheric process for weather and climate. Interactions between clouds and aerosols can impact Earth's radiative balance and its hydrological and energetic cycles, and are responsible for a large fraction of the uncertainty in climate models. On a planetary scale, the Amazon Basin is one of the most significant land sources of moisture and latent heat energy. Moreover, every year this region undergoes marked seasonal shifts in its atmospheric state, transitioning from clean to heavily polluted conditions due to seasonal biomass burning fires that emit large amounts of smoke into the atmosphere. These conditions make the Amazon Basin a special place to study aerosol-cloud interactions. The SeReNA Project ("Remote sensing of clouds and their interaction with aerosols", from the acronym in Portuguese, @SerenaProject on Twitter) is an ongoing effort to experimentally investigate the impact of aerosols upon cloud microphysics in Amazonia. Vertical profiles of the effective radius of water droplets and ice particles in single convective clouds can be derived from measurements of the radiation emerging from cloud sides. Aerosol optical depth, cloud top properties, and meteorological parameters retrieved from satellites will be correlated with microphysical properties derived for single clouds. Maps of cloud brightness temperature will allow building temperature vs. effective radius profiles for hydrometeors in single clouds. Figure 1 shows an example extracted from Martins et al. (2011), illustrating a proof of concept for the kind of result expected within the framework of the SeReNA Project. The results to be obtained will help foster quantitative knowledge about interactions between aerosols and clouds at the microphysical level.
These interactions are a fundamental process in the context of global climatic change; they are key to understanding basic processes within clouds and how aerosols can influence them. Reference: Martins et al. (2011) ACP, v.11, p.9485-9501. Available at: http://bit.ly/martinspaper Figure 1. Brightness temperature (left panel) and thermodynamic phase (right panel) of hydrometeors in the convective cloud shown in the middle panel. Extracted from Martins et al. (2011).
NASA Astrophysics Data System (ADS)
Li, Jiming; Lv, Qiaoyi; Jian, Bida; Zhang, Min; Zhao, Chuanfeng; Fu, Qiang; Kawamoto, Kazuaki; Zhang, Hua
2018-05-01
Studies have shown that changes in cloud cover are responsible for the rapid climate warming over the Tibetan Plateau (TP) in the past 3 decades. To simulate the total cloud cover, atmospheric models have to reasonably represent the characteristics of vertical overlap between cloud layers. Until now, however, this subject has received little attention due to the limited availability of observations, especially over the TP. Based on the above information, the main aim of this study is to examine the properties of cloud overlaps over the TP region and to build an empirical relationship between cloud overlap properties and large-scale atmospheric dynamics using 4 years (2007-2010) of data from the CloudSat cloud product and collocated ERA-Interim reanalysis data. To do this, the cloud overlap parameter α, which is an inverse exponential function of the cloud layer separation D and decorrelation length scale L, is calculated using CloudSat and is discussed. The parameters α and L are both widely used to characterize the transition from the maximum to random overlap assumption with increasing layer separations. For those non-adjacent layers without clear sky between them (that is, contiguous cloud layers), it is found that the overlap parameter α is sensitive to the unique thermodynamic and dynamic environment over the TP, i.e., the unstable atmospheric stratification and corresponding weak wind shear, which leads to maximum overlap (that is, greater α values). This finding agrees well with the previous studies. Finally, we parameterize the decorrelation length scale L as a function of the wind shear and atmospheric stability based on a multiple linear regression. Compared with previous parameterizations, this new scheme can improve the simulation of total cloud cover over the TP when the separations between cloud layers are greater than 1 km. 
This study thus suggests that the effects of both wind shear and atmospheric stability on cloud overlap should be taken into account in the parameterization of decorrelation length scale L in order to further improve the calculation of the radiative budget and the prediction of climate change over the TP in the atmospheric models.
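The overlap parameter described above is usually written as an inverse exponential of the layer separation, α = exp(-D/L), and blends the maximum- and random-overlap covers of two layers. A minimal sketch of that blend, assuming the standard exponential decorrelation form (the numbers are illustrative, not values from this study):

```python
import math

def overlap_param(D, L):
    """alpha = exp(-D / L): 1 means maximum overlap, 0 means random overlap."""
    return math.exp(-D / L)

def combined_cover(c1, c2, D, L):
    """Combined cover of two cloud layers with fractional covers c1 and c2,
    separated by D km, given decorrelation length L km."""
    a = overlap_param(D, L)
    c_max = max(c1, c2)               # maximum-overlap limit
    c_rand = c1 + c2 - c1 * c2       # random-overlap limit
    return a * c_max + (1 - a) * c_rand

# Two half-covered layers: maximum overlap at zero separation,
# tending toward the random-overlap value as separation grows.
print(round(combined_cover(0.5, 0.5, 0.0, 2.0), 3))   # 0.5
print(round(combined_cover(0.5, 0.5, 10.0, 2.0), 3))  # 0.748
```

Parameterizing L from wind shear and stability, as the study proposes, then amounts to replacing the fixed L above with a regression on those two predictors.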
Cloud BioLinux: pre-configured and on-demand bioinformatics computing for the genomics community.
Krampis, Konstantinos; Booth, Tim; Chapman, Brad; Tiwari, Bela; Bicak, Mesude; Field, Dawn; Nelson, Karen E
2012-03-19
A steep drop in the cost of next-generation sequencing during recent years has made the technology affordable to the majority of researchers, but downstream bioinformatic analysis still poses a resource bottleneck for smaller laboratories and institutes that do not have access to substantial computational resources. Sequencing instruments are typically bundled with only the minimal processing and storage capacity required for data capture during sequencing runs. Given the scale of sequence datasets, scientific value cannot be obtained from acquiring a sequencer unless it is accompanied by an equal investment in informatics infrastructure. Cloud BioLinux is a publicly accessible Virtual Machine (VM) that enables scientists to quickly provision on-demand infrastructures for high-performance bioinformatics computing using cloud platforms. Users have instant access to a range of pre-configured command line and graphical software applications, including a full-featured desktop interface, documentation and over 135 bioinformatics packages for applications including sequence alignment, clustering, assembly, display, editing, and phylogeny. Each tool's functionality is fully described in the documentation directly accessible from the graphical interface of the VM. Besides the Amazon EC2 cloud, we have started instances of Cloud BioLinux on a private Eucalyptus cloud installed at the J. Craig Venter Institute, and demonstrated access to the bioinformatic tools interface through a remote connection to EC2 instances from a local desktop computer. Documentation for using Cloud BioLinux on EC2 is available from our project website, while a Eucalyptus cloud image and VirtualBox Appliance is also publicly available for download and use by researchers with access to private clouds. Cloud BioLinux provides a platform for developing bioinformatics infrastructures on the cloud. 
An automated and configurable process builds Virtual Machines, allowing the development of highly customized versions from a shared code base. This shared community toolkit enables application specific analysis platforms on the cloud by minimizing the effort required to prepare and maintain them.
NASA Technical Reports Server (NTRS)
Scales, W. A.; Bernhardt, P. A.; Ganguli, G.
1994-01-01
Two-dimensional electrostatic particle-in-cell simulations are used to study the early time evolution of electron depletions and negative ion clouds produced during electron attachment chemical releases in the ionosphere. The simulation model considers the evolution in the plane perpendicular to the magnetic field and a three-species plasma that contains electrons, positive ions, and heavy negative ions that result as a by-product of the electron attachment reaction. The early time evolution (less than the negative ion cyclotron period) of the system shows that a negative charge surplus initially develops outside the depletion boundary as the heavy negative ions move across the boundary. The electrons are initially restricted from moving into the depletion by the magnetic field. An inhomogeneous electric field develops across the boundary layer due to this charge separation. A highly sheared electron flow velocity develops in the depletion boundary due to E × B and ∇n × B drifts that result from electron density gradients and this inhomogeneous electric field. Structure eventually develops in the depletion boundary layer due to low-frequency electrostatic waves that have growth times shorter than the negative ion cyclotron period. It is proposed that these waves are most likely produced by the electron-ion hybrid instability that results from sufficiently large shears in the electron flow velocity.
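The shear mechanism above rests on the E × B drift, whose velocity is v = (E × B)/|B|². A minimal numerical sketch of that formula (the field values are illustrative, not taken from the simulations):

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def exb_drift(E, B):
    """E x B drift velocity v = (E x B) / |B|^2 in SI units.
    The drift is species-independent, so electrons and ions move together;
    shear arises when E varies across the boundary layer."""
    b2 = sum(c * c for c in B)
    return tuple(c / b2 for c in cross(E, B))

# A 10 mV/m electric field across a 50 uT field along z
# gives a perpendicular drift of about 200 m/s.
E = (0.01, 0.0, 0.0)   # V/m
B = (0.0, 0.0, 5e-5)   # T
print(exb_drift(E, B))  # approximately (0.0, -200.0, 0.0)
```

Because E is inhomogeneous across the depletion boundary, evaluating this drift at neighboring positions yields different speeds, which is exactly the sheared flow that feeds the electron-ion hybrid instability.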
Cloud-Based Numerical Weather Prediction for Near Real-Time Forecasting and Disaster Response
NASA Technical Reports Server (NTRS)
Molthan, Andrew; Case, Jonathan; Venners, Jason; Schroeder, Richard; Checchi, Milton; Zavodsky, Bradley; Limaye, Ashutosh; O'Brien, Raymond
2015-01-01
The use of cloud computing resources continues to grow within the public and private sector components of the weather enterprise as users become more familiar with cloud-computing concepts, and competition among service providers continues to reduce costs and other barriers to entry. Cloud resources can also provide capabilities similar to high-performance computing environments, supporting multi-node systems required for near real-time, regional weather predictions. Referred to as "Infrastructure as a Service", or IaaS, the use of cloud-based computing hardware in an on-demand payment system allows for rapid deployment of a modeling system in environments lacking access to a large, supercomputing infrastructure. Use of IaaS capabilities to support regional weather prediction may be of particular interest to developing countries that have not yet established large supercomputing resources, but would otherwise benefit from a regional weather forecasting capability. Recently, collaborators from NASA Marshall Space Flight Center and Ames Research Center have developed a scripted, on-demand capability for launching the NOAA/NWS Science and Training Resource Center (STRC) Environmental Modeling System (EMS), which includes pre-compiled binaries of the latest version of the Weather Research and Forecasting (WRF) model. The WRF-EMS provides scripting for downloading appropriate initial and boundary conditions from global models, along with higher-resolution vegetation, land surface, and sea surface temperature data sets provided by the NASA Short-term Prediction Research and Transition (SPoRT) Center. This presentation will provide an overview of the modeling system capabilities and benchmarks performed on the Amazon Elastic Compute Cloud (EC2) environment. 
In addition, the presentation will discuss future opportunities to deploy the system in support of weather prediction in developing countries supported by NASA's SERVIR Project, which provides capacity building activities in environmental monitoring and prediction across a growing number of regional hubs throughout the world. Capacity-building applications that extend numerical weather prediction to developing countries are intended to provide near real-time applications to benefit public health, safety, and economic interests, but may have a greater impact during disaster events by providing a source for local predictions of weather-related hazards, or impacts that local weather events may have during the recovery phase.
ERIC Educational Resources Information Center
Ballard, Thomas H.
1985-01-01
Examines arguments now being presented in support of public library networking and resource sharing and suggests ways to obtain harder evidence that might serve to legitimize these beliefs: estimate benefits, build in evaluation, support estimates of benefits, seek cheaper alternatives for accomplishing same goal, consider alternatives to…
Comparative study of building footprint estimation methods from LiDAR point clouds
NASA Astrophysics Data System (ADS)
Rozas, E.; Rivera, F. F.; Cabaleiro, J. C.; Pena, T. F.; Vilariño, D. L.
2017-10-01
Building area calculation from LiDAR points is still a difficult task with no clear solution. The varied characteristics of buildings, such as shape and size, make the process too complex to automate fully. However, several algorithms and techniques have been used to obtain an approximated hull. 3D building reconstruction and urban planning are examples of important applications that benefit from accurate building footprint estimations. In this paper, we carry out a study of accuracy in the estimation of building footprints from LiDAR points. The analysis focuses on the processing steps following object recognition and classification, assuming that labeling of building points has been performed previously. We then perform an in-depth analysis of the influence of point density on the accuracy of the building area estimation. In addition, a set of buildings of different sizes and shapes was manually classified so that they can be used as a benchmark.
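As a baseline for the hull-and-area step discussed above, the simplest footprint estimate from labeled building points is a convex hull plus the shoelace formula (crude, since real footprints are often concave, but a common reference method). A self-contained sketch with illustrative projected coordinates:

```python
def convex_hull(points):
    """Andrew's monotone chain: hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def half(seq):
        h = []
        for p in seq:
            # pop while the last turn is clockwise or collinear
            while len(h) >= 2 and ((h[-1][0] - h[-2][0]) * (p[1] - h[-2][1]) -
                                   (h[-1][1] - h[-2][1]) * (p[0] - h[-2][0])) <= 0:
                h.pop()
            h.append(p)
        return h[:-1]
    return half(pts) + half(reversed(pts))

def shoelace_area(poly):
    """Polygon area via the shoelace formula."""
    n = len(poly)
    s = sum(poly[i][0] * poly[(i + 1) % n][1] -
            poly[(i + 1) % n][0] * poly[i][1] for i in range(n))
    return abs(s) / 2.0

# Projected (x, y) of points labeled 'building': a 10 m x 6 m block
# outline plus a few interior returns.
pts = [(0, 0), (10, 0), (10, 6), (0, 6), (5, 3), (2, 4), (7, 1)]
print(shoelace_area(convex_hull(pts)))  # 60.0
```

Point density enters directly here: with sparse clouds the hull vertices miss the true building corners, which is the accuracy effect the paper quantifies.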
Yusef-Zadeh, F; Wardle, M; Lis, D; Viti, S; Brogan, C; Chambers, E; Pound, M; Rickert, M
2013-10-03
We present 74 MHz radio continuum observations of the Galactic center region. These measurements show nonthermal radio emission arising from molecular clouds that is unaffected by free–free absorption along the line of sight. We focus on one cloud, G0.13-0.13, representative of the population of molecular clouds that are spatially correlated with steep spectrum (α(327MHz)(74MHz) = 1.3 ± 0.3) nonthermal emission from the Galactic center region. This cloud lies adjacent to the nonthermal radio filaments of the Arc near l 0.2° and is a strong source of 74 MHz continuum, SiO (2-1), and Fe I Kα 6.4 keV line emission. This three-way correlation provides the most compelling evidence yet that relativistic electrons, here traced by 74 MHz emission, are physically associated with the G0.13-0.13 molecular cloud and that low-energy cosmic ray electrons are responsible for the Fe I Kα line emission. The high cosmic ray ionization rate 10(–1)3 s(–1) H(–1) is responsible for heating the molecular gas to high temperatures and allows the disturbed gas to maintain a high-velocity dispersion. Large velocity gradient (LVG) modeling of multitransition SiO observations of this cloud implies H2 densities 10(4–5) cm(–3) and high temperatures. The lower limit to the temperature of G0.13-0.13 is 100 K, whereas the upper limit is as high as 1000 K. Lastly, we used a time-dependent chemical model in which cosmic rays drive the chemistry of the gas to investigate for molecular line diagnostics of cosmic ray heating. When the cloud reaches chemical equilibrium, the abundance ratios of HCN/HNC and N2H+/HCO+ are consistent with measured values. In addition, significant abundance of SiO is predicted in the cosmic ray dominated region of the Galactic center. We discuss different possibilities to account for the origin of widespread SiO emission detected from Galactic center molecular clouds.
Hurricane Harvey Building Damage Assessment Using UAV Data
NASA Astrophysics Data System (ADS)
Yeom, J.; Jung, J.; Chang, A.; Choi, I.
2017-12-01
Hurricane Harvey, an extremely destructive major hurricane, struck southern Texas, U.S.A., on August 25, causing catastrophic flooding and storm damage. We visited Rockport, which suffered severe building destruction, and conducted UAV (Unmanned Aerial Vehicle) surveying for building damage assessment. UAVs provide very high resolution images compared with traditional remote sensing data. In addition, prompt and cost-effective damage assessment can be performed regardless of several limitations of other remote sensing platforms, such as the revisit interval of satellites, complicated flight planning in aerial surveying, and cloud cover. In this study, UAV flights and GPS surveying were conducted two weeks after the hurricane damage to generate an orthomosaic image and a DEM (Digital Elevation Model). A 3D region-growing scheme is proposed to quantitatively estimate building damage, considering the elevation change and spectral difference of building debris. The results showed that the proposed method can be used for high-definition building damage assessment in a time- and cost-effective way.
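The region-growing idea can be illustrated in a simplified 2D form: grow connected regions over the DEM difference wherever the elevation change exceeds a threshold. The paper's scheme is 3D and also uses spectral difference; this sketch, with invented elevation values, uses the elevation criterion only:

```python
from collections import deque

def grow_regions(dz, threshold=0.5):
    """Label 4-connected grid cells whose elevation change |dz| exceeds
    threshold (BFS region growing). Returns (region count, label grid)."""
    rows, cols = len(dz), len(dz[0])
    labels = [[0] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if abs(dz[r][c]) > threshold and labels[r][c] == 0:
                count += 1                     # seed a new damage region
                labels[r][c] = count
                q = deque([(r, c)])
                while q:
                    i, j = q.popleft()
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and labels[ni][nj] == 0
                                and abs(dz[ni][nj]) > threshold):
                            labels[ni][nj] = count
                            q.append((ni, nj))
    return count, labels

# Post-event minus pre-event DEM (metres): two separate debris patches.
dz = [[0.0, 1.2, 1.4, 0.0],
      [0.0, 0.9, 0.0, 0.0],
      [0.0, 0.0, 0.0, -1.1]]
n, lab = grow_regions(dz)
print(n)  # 2
```

Each labeled region's cell count times the DEM cell area then gives a per-building estimate of the debris footprint.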
ESA's Ice Cloud Imager on Metop Second Generation
NASA Astrophysics Data System (ADS)
Klein, Ulf; Loiselet, Marc; Mason, Graeme; Gonzalez, Raquel; Brandt, Michael
2016-04-01
Since 2006, the European contribution to operational meteorological observations from polar orbit has been provided by the Meteorological Operational (MetOp) satellites, which form the space segment of the EUMETSAT Polar System (EPS). The first MetOp satellite was launched in 2006, the second in 2012, and the third is planned for launch in 2018. As part of the next-generation EUMETSAT Polar System (EPS-SG), the MetOp Second Generation (MetOp-SG) satellites will provide continuity and enhancement of these observations in the 2021-2042 timeframe. The novel Ice Cloud Imager (ICI) is one of the instruments selected to fly on the MetOp-SG satellite "B". The main objective of the ICI is to enable cloud ice retrieval, with emphasis on cirrus clouds. ICI will provide information on cloud ice mean altitude, cloud ice water path, and cloud ice effective radius. In addition, it will provide water vapour profile measurement capability. ICI is a 13-channel microwave/sub-millimetre wave radiometer covering the frequency range from 183 GHz up to 664 GHz. The instrument is composed of a rotating part and a fixed part. The rotating part includes the main antenna, the feed assembly, and the receiver electronics. The fixed part contains the hot calibration target, the reflector for viewing the cold sky, and the electronics for instrument control and interfacing with the platform. Between the fixed and the rotating parts is the scan mechanism. The scan mechanism is not only responsible for rotating the instrument and providing its angular position; it must also pass the power and data lines through. The scan mechanism is controlled by the fully redundant Control and Drive Electronics. ICI is calibrated using an internal hot target and a cold sky mirror, which are viewed once per rotation. The internal hot target is a traditional pyramidal target.
The hot target is covered by an annular shield during rotation with only a small opening for the feed horns to guarantee a stable environment. Also, in order to achieve very good radiometric accuracy and stability, the ICI instrument is designed with sun-shields in order to minimize sun-intrusion at all possible sun angles. Details of the instrument design and the current development status will be presented.
TU-F-CAMPUS-T-05: A Cloud-Based Monte Carlo Dose Calculation for Electron Cutout Factors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, T; Bush, K
Purpose: For electron cutouts of smaller sizes, it is necessary to verify electron cutout factors due to perturbations in electron scattering. Often, this requires a physical measurement using a small ion chamber, diode, or film. The purpose of this study is to develop a fast Monte Carlo based dose calculation framework that requires only a smart phone photograph of the cutout and specification of the SSD and energy to determine the electron cutout factor, with the ultimate goal of making this cloud-based calculation widely available to the medical physics community. Methods: The algorithm uses a pattern recognition technique to identify the corners of the cutout in the photograph, as shown in Figure 1. It then corrects for variations in perspective, scaling, and translation of the photograph introduced by the user's positioning of the camera. Blob detection is used to identify the portions of the cutout which comprise the aperture and the portions which are cutout material. This information is then used to define the physical densities of the voxels used in the Monte Carlo dose calculation algorithm, as shown in Figure 2, and to select a particle source from a pre-computed library of phase spaces scored above the cutout. The electron cutout factor is obtained by taking the ratio of the maximum dose delivered with the cutout in place to the dose delivered under calibration/reference conditions. Results: The algorithm has been shown to successfully identify all necessary features of the electron cutout to perform the calculation. Subsequent testing will be performed to compare the Monte Carlo results with a physical measurement. Conclusion: A simple, cloud-based method of calculating electron cutout factors could eliminate the need for physical measurements and substantially reduce the time required to properly assure accurate dose delivery.
Agile Infrastructure Monitoring
NASA Astrophysics Data System (ADS)
Andrade, P.; Ascenso, J.; Fedorko, I.; Fiorini, B.; Paladin, M.; Pigueiras, L.; Santos, M.
2014-06-01
At the present time, data centres are facing a massive rise in virtualisation and cloud computing. The Agile Infrastructure (AI) project is working to deliver new solutions to ease the management of CERN data centres. Part of the solution consists in a new "shared monitoring architecture" which collects and manages monitoring data from all data centre resources. In this article, we present the building blocks of this new monitoring architecture, the different open source technologies selected for each architecture layer, and how we are building a community around this common effort.
Integrated Survey Procedures for the Virtual Reading and Fruition of Historical Buildings
NASA Astrophysics Data System (ADS)
Scandurra, S.; Pulcrano, M.; Cirillo, V.; Campi, M.; di Luggo, A.; Zerlenga, O.
2018-05-01
This paper presents the developments of research related to the integration of digital survey methodologies with reference to image-based and range-based technologies. Starting from the processing of point clouds, the data were processed both for the geometric interpretation of the space and for the production of three-dimensional models that describe the constitutive and morphological relationships. The subject of the study was the church of San Carlo all'Arena in Naples (Italy), for which an HBIM model was produced that is semantically consistent with the real building. Starting from the data acquired, a visualization system was created for the virtual exploration of the building.
NASA Astrophysics Data System (ADS)
Warchoł, A.
2013-12-01
The following article presents an analysis of the accuracy of three point clouds (airborne, terrestrial and mobile) obtained for the same area. The study was conducted separately for the planimetric coordinates (X, Y), examining the locations of building vertices, and separately for the coordinate (Z), comparing models built on each of the clouds. As the baseline measurement for both analyses (X, Y and Z), a total station survey was used.
Design of Smart-Meter data acquisition device based on Cloud Platform
NASA Astrophysics Data System (ADS)
Chen, Xiangqun; Huang, Rui; Shen, Liman; chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng
2018-05-01
In recent years, the government has attached great importance to the ‘Four-Meter Unified’ Project. Under the call of national policy, State Grid is actively participating in the ‘Four-Meter Unified’ Project by making use of the electricity information acquisition system. In this paper, a new type of Smart-Meter data acquisition device based on a Cloud Platform is designed according to the newest series of standards, Energy Measure and Management System for Electric, Water, Gas and Heat Meter, and the paper introduces the general scheme, the main hardware design and the main software design of the Smart-Meter data acquisition device.
Colour computer-generated holography for point clouds utilizing the Phong illumination model.
Symeonidou, Athanasia; Blinder, David; Schelkens, Peter
2018-04-16
A technique integrating the bidirectional reflectance distribution function (BRDF) is proposed to generate realistic high-quality colour computer-generated holograms (CGHs). We build on prior work, namely a fast computer-generated holography method for point clouds that handles occlusions. We extend the method by integrating the Phong illumination model so that the properties of the objects' surfaces are taken into account to achieve natural light phenomena such as reflections and shadows. Our experiments show that rendering holograms with the proposed algorithm provides realistic looking objects without any noteworthy increase to the computational cost.
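The Phong illumination model the authors integrate is the standard empirical ambient-diffuse-specular sum. A minimal sketch of a per-point intensity evaluation is below; the coefficient values and function name are assumptions, since the abstract gives no parameters.

```python
import numpy as np

def phong_intensity(normal, light_dir, view_dir,
                    ka=0.1, kd=0.6, ks=0.3, shininess=16.0):
    # Classic Phong model: ambient + diffuse + specular terms.
    # All direction vectors point away from the surface point; they are
    # normalized here so callers may pass unnormalized vectors.
    n, l, v = (np.asarray(u, dtype=float) for u in (normal, light_dir, view_dir))
    n, l, v = n / np.linalg.norm(n), l / np.linalg.norm(l), v / np.linalg.norm(v)
    diffuse = max(float(np.dot(n, l)), 0.0)
    r = 2.0 * diffuse * n - l  # reflection of the light direction about n
    specular = max(float(np.dot(r, v)), 0.0) ** shininess if diffuse > 0.0 else 0.0
    return ka + kd * diffuse + ks * specular
```

In a point-cloud CGH pipeline such a term would modulate each point's amplitude before hologram synthesis, so reflections and shading appear in the reconstruction.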
An experimental investigation of hollow cathode-based plasma contactors. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Williams, John D.
1991-01-01
Experimental results are presented which describe the operation of the plasma environment associated with a hollow cathode-based plasma contactor collecting electrons from or emitting them to an ambient, low density Maxwellian plasma. A one-dimensional, phenomenological model of the near-field electron collection process, which was formulated from experimental observations, is presented. It considers three regions, namely, a plasma cloud adjacent to the contactor, an ambient plasma from which electrons are collected, and a double layer region that develops between the contactor plasma cloud and the ambient plasma regions. Results of the electron emission experiments are also presented. An important observation is made using a retarding potential analyzer (RPA) which shows that high energy ions generally stream from a contactor along with the electrons being emitted. A mechanism for this phenomenon is presented and it involves a high rate of ionization induced between electrons and atoms flowing together from the hollow cathode orifice. This can result in the development of a region of high positive potential. Langmuir and RPA probe data suggest that both electrons and ions expand spherically from this hill region. In addition to experimental observations, a one-dimensional model which describes the electron emission process and predicts the phenomena just mentioned is presented and shown to agree qualitatively with these observations.
Analysis of cloud-based solutions on EHRs systems in different scenarios.
Fernández-Cardeñosa, Gonzalo; de la Torre-Díez, Isabel; López-Coronado, Miguel; Rodrigues, Joel J P C
2012-12-01
Nowadays, with the growth of wireless connections, people can access all the resources hosted in the Cloud almost everywhere. In this context, organizations can take advantage of this fact in terms of e-Health by deploying Cloud-based solutions for e-Health services. In this paper, two Cloud-based solutions for different scenarios of Electronic Health Records (EHRs) management systems are proposed. We researched articles published between 2005 and 2011 in Medline about the implementation of e-Health services based on the Cloud. In order to determine the best scenario for the deployment of Cloud Computing, two solutions, one for a large hospital and one for a network of Primary Care Health centers, have been studied. An economic estimation of the implementation cost for both scenarios has been made with the Amazon calculator tool. As a result of this analysis, two solutions are suggested depending on the scenario: for a large hospital, a typical Cloud solution is assumed in which only the needed services are hired; for several Primary Care centers, the implementation of a network interconnecting these centers with a single Cloud environment is suggested. Finally, a hybrid solution is considered, in which EHRs with images are hosted in the hospital or Primary Care centers and the rest are migrated to the Cloud.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Earl, James A.
From 1948 until 1963, cloud chambers were carried to the top of the atmosphere by balloons. From these flights, which were begun by Edward P. Ney at the University of Minnesota, came the following results: discovery of heavy cosmic ray nuclei, development of scintillation and Cherenkov detectors, discovery of cosmic ray electrons, and studies of solar proton events. The history of that era is illustrated here by cloud chamber photographs of primary cosmic rays.
Cloud Effects in Hyperspectral Imagery from First-Principles Scene Simulations
2009-01-01
...scattering and absorption, scattering events, surface scattering with material-dependent bidirectional reflectances, multiple surface adjacency...aerosols or clouds, they may be absorbed, or they may reflect off the ground or an object. A given photon may undergo multiple scattering events
Smart learning services based on smart cloud computing.
Kim, Svetlana; Song, Su-Mi; Yoon, Yong-Ik
2011-01-01
Context-aware technologies can make e-learning services smarter and more efficient since context-aware services are based on the user's behavior. To add those technologies into existing e-learning services, a service architecture model is needed to transform the existing e-learning environment, which is situation-aware, into the environment that understands context as well. The context-awareness in e-learning may include the awareness of user profile and terminal context. In this paper, we propose a new notion of service that provides context-awareness to smart learning content in a cloud computing environment. We suggest the elastic four smarts (E4S)--smart pull, smart prospect, smart content, and smart push--concept to the cloud services so smart learning services are possible. The E4S focuses on meeting the users' needs by collecting and analyzing users' behavior, prospecting future services, building corresponding contents, and delivering the contents through cloud computing environment. Users' behavior can be collected through mobile devices such as smart phones that have built-in sensors. As results, the proposed smart e-learning model in cloud computing environment provides personalized and customized learning services to its users.
Metric Scale Calculation for Visual Mapping Algorithms
NASA Astrophysics Data System (ADS)
Hanel, A.; Mitschke, A.; Boerner, R.; Van Opdenbosch, D.; Hoegner, L.; Brodie, D.; Stilla, U.
2018-05-01
Visual SLAM algorithms localize a camera by mapping its environment as a point cloud built from visual cues. To obtain the camera locations in a metric coordinate system, the metric scale of the point cloud has to be known. This contribution describes a method to calculate the metric scale for a point cloud of an indoor environment, such as a parking garage, by fusing multiple individual scale values. The individual scale values are calculated from structures and objects with a priori known metric extents, which can be identified in the unscaled point cloud. Extents of building structures, like the driving lane or the room height, are derived from density peaks in the point distribution. The extents of objects, like traffic signs of known metric size, are derived by projecting their image detections onto the point cloud. The method is tested with synthetic image sequences of a drive through a virtual 3D model of a parking garage with a front-looking mono camera. It has been shown that each individual scale value either improves the robustness of the fused scale value or reduces its error. The error of the fused scale is comparable to other recent works.
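The fusion of individual scale estimates (lane width, room height, sign size) can be sketched as a weighted mean. This is an illustrative sketch; the abstract does not specify the fusion rule, so the weighting scheme here is an assumption.

```python
def fuse_scales(scales, weights=None):
    # Fuse individual metric-scale estimates into one value.  Each scale
    # is the ratio of a known metric extent to its extent in the unscaled
    # point cloud.  Weights (e.g. inverse variances reflecting each cue's
    # reliability) are optional; unweighted fusion is a plain mean.
    if weights is None:
        weights = [1.0] * len(scales)
    return sum(s * w for s, w in zip(scales, weights)) / sum(weights)
```

Multiplying all point-cloud coordinates (and camera translations) by the fused scale then places the trajectory in a metric frame.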
NASA Astrophysics Data System (ADS)
Fiedler, V.; Arnold, F.; Schlager, H.; Pirjola, L.
2009-01-01
We report on sulfur dioxide (SO2) induced formation of aerosols and cloud condensation nuclei in an SO2 rich aged (9 days) pollution plume of Chinese origin, which we have detected at 5-7 km altitude during a research aircraft mission over the East Atlantic off the West coast of Ireland. Building on our measurements of SO2 and other trace gases along with plume trajectory simulations, we have performed model simulations of SO2 induced formation of gaseous sulfuric acid (GSA, H2SO4) followed by GSA induced formation and growth of aerosol particles. We find that efficient photochemical SO2 conversion to GSA took place in the plume followed by efficient formation and growth of H2SO4-H2O aerosol particles. Most particles reached sufficiently large sizes to act as cloud condensation nuclei whenever water vapor supersaturation exceeded 0.1-0.2%. As a consequence, smaller but more numerous cloud droplets are formed, which tend to increase the cloud albedo and to decrease the rainout efficiency. The detected plume represents an interesting example of the environmental impact of long range transport of fossil fuel combustion generated SO2.
Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)
NASA Astrophysics Data System (ADS)
Nebert, D. D.; Huang, Q.; Yang, C.
2013-12-01
Geoscience in the 21st century faces the challenges of Big Data, spikes in computing requirements (e.g., when a natural disaster happens), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With the flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution to provide core capabilities to address these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate scientific research and discoveries. This presentation uses GeoCloud to illustrate the process and strategies used in building a common platform for geoscience communities to enable the sharing and integration of geospatial data, information and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. In order to achieve this objective, multiple projects are nominated each year by federal agencies from existing public-facing geospatial data services. From the initial candidate projects, a set of common operating system and software requirements was identified as the baseline for platform as a service (PaaS) packages. Based on these common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information.
This paper presents the background, architectural design, and activities of GeoCloud in support of the Geospatial Platform Initiative. System security strategies and approval processes for migrating federal geospatial data, information, and applications into cloud, and cost estimation for cloud operations are covered. Finally, some lessons learned from the GeoCloud project are discussed as reference for geoscientists to consider in the adoption of cloud computing.
Electronic torsional sound in linear atomic chains: Chemical energy transport at 1000 km/s
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnosov, Arkady A.; Rubtsov, Igor V.; Maksymov, Andrii O.
2016-07-21
We investigate entirely electronic torsional vibrational modes in linear cumulene chains. The carbon nuclei of a cumulene are positioned along the primary axis so that they can participate only in the transverse and longitudinal motions. However, the interatomic electronic clouds behave as a torsion spring with remarkable torsional stiffness. The collective dynamics of these clouds can be described in terms of electronic vibrational quanta, which we name torsitons. It is shown that the group velocity of the wavepacket of torsitons is much higher than the typical speed of sound, because of the small mass of participating electrons compared to the atomic mass. For the same reason, the maximum energy of the torsitons in cumulenes is as high as a few electronvolts, while the minimum possible energy is evaluated as a few hundred wavenumbers and this minimum is associated with asymmetry of zero point atomic vibrations. Theory predictions are consistent with the time-dependent density functional theory calculations. Molecular systems for experimental evaluation of the predictions are proposed.
Meteoric Metal Chemistry in the Martian Atmosphere
NASA Astrophysics Data System (ADS)
Plane, J. M. C.; Carrillo-Sanchez, J. D.; Mangan, T. P.; Crismani, M. M. J.; Schneider, N. M.; Määttänen, A.
2018-03-01
Recent measurements by the Imaging Ultraviolet Spectrograph (IUVS) instrument on NASA's Mars Atmosphere and Volatile EvolutioN mission show that a persistent layer of Mg+ ions occurs around 90 km in the Martian atmosphere but that neutral Mg atoms are not detectable. These observations can be satisfactorily modeled with a global meteoric ablation rate of 0.06 t sol-1, out of a cosmic dust input of 2.7 ± 1.6 t sol-1. The absence of detectable Mg at 90 km requires that at least 50% of the ablating Mg atoms ionize through hyperthermal collisions with CO2 molecules. Dissociative recombination of MgO+.(CO2)n cluster ions with electrons to produce MgCO3 directly, rather than MgO, also avoids a buildup of Mg to detectable levels. The meteoric injection rate of Mg, Fe, and other metals—constrained by the IUVS measurements—enables the production rate of metal carbonate molecules (principally MgCO3 and FeCO3) to be determined. These molecules have very large electric dipole moments (11.6 and 9.2 Debye, respectively) and thus form clusters with up to six H2O molecules at temperatures below 150 K. These clusters should then coagulate efficiently, building up metal carbonate-rich ice particles which can act as nucleating particles for the formation of CO2-ice clouds. Observable mesospheric clouds are predicted to occur between 65 and 80 km at temperatures below 95 K and above 85 km at temperatures about 5 K colder.
2011-03-31
CAPE CANAVERAL, Fla. – An osprey wades in flooded grass near the Vehicle Assembly Building at NASA's Kennedy Space Center in Florida. Severe storms associated with a frontal system are moving through Central Florida, producing strong winds, heavy rain, frequent lightning and even funnel clouds. Photo credit: NASA/Ben Smegelsky
Building an Interdepartmental Major in Speech Communication.
ERIC Educational Resources Information Center
Litterst, Judith K.
This paper describes a popular and innovative major program of study in speech communication at St. Cloud University in Minnesota: the Speech Communication Interdepartmental Major. The paper provides background on the program, discusses overall program requirements, presents sample student options, identifies ingredients for program success,…
NASA Astrophysics Data System (ADS)
Michele, Mangiameli; Giuseppe, Mussumeci; Salvatore, Zito
2017-07-01
Structure From Motion (SFM) is a technique that, applied to a series of photographs of an object, returns a 3D reconstruction made up of points in space (a point cloud). This research aims at comparing the results of the SFM approach with those of 3D laser scanning in terms of density and accuracy of the model. The experiment was conducted by surveying several architectural elements (walls and portals of historical buildings) both with a latest-generation 3D laser scanner and with an amateur photographic camera. The point clouds acquired by the laser scanner and those acquired by the photo camera have been systematically compared. In particular, we present the experience carried out on the "Don Diego Pappalardo Palace" site in Pedara (Catania, Sicily).
Abstracting application deployment on Cloud infrastructures
NASA Astrophysics Data System (ADS)
Aiftimiei, D. C.; Fattibene, E.; Gargana, R.; Panella, M.; Salomoni, D.
2017-10-01
Deploying a complex application on a Cloud-based infrastructure can be a challenging task. In this contribution we present an approach for Cloud-based deployment of applications and its present or future implementation in the framework of several projects, such as “!CHAOS: a cloud of controls” [1], a project funded by MIUR (Italian Ministry of Research and Education) to create a Cloud-based deployment of a control system and data acquisition framework, “INDIGO-DataCloud” [2], an EC H2020 project targeting among other things high-level deployment of applications on hybrid Clouds, and “Open City Platform” [3], an Italian project aiming to provide open Cloud solutions for Italian Public Administrations. We chose to use an orchestration service to hide the complexity of deploying the application components, and to build an abstraction layer on top of the orchestration one. Through the Heat [4] orchestration service, we prototyped a dynamic, on-demand, scalable platform of software components, based on OpenStack infrastructures. On top of the orchestration service we developed a prototype of a web interface exploiting the Heat APIs. The user can start an instance of the application without any knowledge of the underlying Cloud infrastructure and services. Moreover, the platform instance can be customized by choosing parameters related to the application, such as the size of a file system or the number of instances of a NoSQL DB cluster. As soon as the desired platform is running, the web interface offers the possibility to scale some infrastructure components. In this contribution we describe the solution design and implementation, based on the application requirements, the details of the development of both the Heat templates and the web interface, together with possible exploitation strategies of this work in Cloud data centers.
NASA Astrophysics Data System (ADS)
Oniga, E.; Chirilă, C.; Stătescu, F.
2017-02-01
Nowadays, Unmanned Aerial Systems (UASs) are widely used for image acquisition when creating 3D building models, providing a high number of images at very high resolution, or video sequences, in a very short time. Since low-cost UASs are preferred, the accuracy of a building 3D model created using these platforms must be evaluated. As a test case, the dean's office building of the Faculty of "Hydrotechnical Engineering, Geodesy and Environmental Engineering" of Iasi, Romania, was chosen; it is a complex-shaped building with a roof formed of two hyperbolic paraboloids. Seven points were placed on the ground around the building, three of them being used as GCPs, while the remaining four served as check points (CPs) for accuracy assessment. Additionally, the coordinates of 10 natural CPs representing the building's characteristic points were measured with a Leica TCR 405 total station. The building 3D model was created as a point cloud automatically generated from digital images acquired with the low-cost UASs, using image matching algorithms and different software packages: 3DF Zephyr, Visual SfM, PhotoModeler Scanner and Drone2Map for ArcGIS. Except for the PhotoModeler Scanner software, the interior and exterior orientation parameters were determined simultaneously by solving a self-calibrating bundle adjustment. Based on the UAS point clouds automatically generated with the above-mentioned software, and on GNSS data respectively, the parameters of the east-side hyperbolic paraboloid were calculated using the least-squares method and statistical blunder detection. Then, in order to assess the accuracy of the building 3D model, several comparisons were made for the facades and the roof with reference data considered to have minimal errors: a TLS mesh for the facades and a GNSS mesh for the roof.
Finally, the front facade of the building was created in 3D based on its characteristic points using the PhotoModeler Scanner software, resulting in a CAD (Computer Aided Design) model. The results showed the high potential of using low-cost UASs for building 3D model creation; if the 3D model is created based on the building's characteristic points, the accuracy is significantly improved.
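The least-squares surface fit mentioned above can be sketched with a linear model: a hyperbolic paraboloid (in suitable axes) is quadratic in x and y, so the parameters fall out of one `lstsq` call. The five-parameter form and the function name are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def fit_hyperbolic_paraboloid(points):
    # Least-squares fit of z = a*x^2 + b*y^2 + c*x + d*y + e to roof
    # points (N x 3 array).  Coefficients a and b with opposite signs
    # indicate a saddle (hyperbolic paraboloid) shape.
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([x**2, y**2, x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs
```

Residuals of the cloud points against the fitted surface would then drive the statistical blunder detection step.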
Lightweight Data Systems in the Cloud: Costs, Benefits and Best Practices
NASA Astrophysics Data System (ADS)
Fatland, R.; Arendt, A. A.; Howe, B.; Hess, N. J.; Futrelle, J.
2015-12-01
We present here a simple analysis of both the cost and the benefit of using the cloud in environmental science circa 2016. We present this set of ideas to enable the potential 'cloud adopter' research scientist to explore and understand the tradeoffs in moving some aspect of their compute work to the cloud. We present examples, design patterns and best practices as an evolving body of knowledge that help optimize benefit to the research team. Thematically this generally means not starting from a blank page but rather learning how to find 90% of the solution to a problem pre-built. We will touch on four topics of interest. (1) Existing cloud data resources (NASA, WHOI BCO DMO, etc) and how they can be discovered, used and improved. (2) How to explore, compare and evaluate cost and compute power from many cloud options, particularly in relation to data scale (size/complexity). (3) What are simple / fast 'Lightweight Data System' procedures that take from 20 minutes to one day to implement and that have a clear immediate payoff in environmental data-driven research. Examples include publishing a SQL Share URL at (EarthCube's) CINERGI as a registered data resource and creating executable papers on a cloud-hosted Jupyter instance, particularly iPython notebooks. (4) Translating the computational terminology landscape ('cloud', 'HPC cluster', 'hadoop', 'spark', 'machine learning') into examples from the community of practice to help the geoscientist build or expand their mental map. In the course of this discussion -- which is about resource discovery, adoption and mastery -- we provide direction to online resources in support of these themes.
Cost-Effective Cloud Computing: A Case Study Using the Comparative Genomics Tool, Roundup
Kudtarkar, Parul; DeLuca, Todd F.; Fusaro, Vincent A.; Tonellato, Peter J.; Wall, Dennis P.
2010-01-01
Background Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource—Roundup—using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Methods Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon’s Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. Results We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon’s computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. 
Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure. PMID:21258651
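The scheduling idea above, predicting each comparison's runtime in advance and submitting jobs in a deliberate order rather than randomly, can be sketched in a few lines. The callable predictor here stands in for the authors' genome size/complexity model and is an assumption.

```python
def order_jobs(jobs, predict_runtime):
    # Submit the longest-running genome comparisons first so the cluster
    # drains evenly near the end of the batch, minimizing idle (wasted)
    # instance-hours.  `predict_runtime` maps a job to its estimated
    # runtime; any callable with that shape works.
    return sorted(jobs, key=predict_runtime, reverse=True)
```

With per-hour cloud billing, the savings come from the tail: a random order can leave most instances idle while one long job finishes, whereas longest-first packs long jobs early.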
Xu, Long; Zhao, Zhiyuan; Xiao, Mingchao; Yang, Jie; Xiao, Jian; Yi, Zhengran; Wang, Shuai; Liu, Yunqi
2017-11-22
The exploration of novel electron-deficient building blocks is a key task for developing high-performance polymer semiconductors in organic thin-film transistors. In view of the lack of strong electron-deficient building blocks, we designed two novel π-extended isoindigo-based electron-deficient building blocks, IVI and F4IVI. Owing to the strong electron-deficient nature and the extended π-conjugated system of the two acceptor units, their copolymers, PIVI2T and PF4IVI2T, containing 2,2'-bithiophene donor units, are endowed with deep-lying highest occupied molecular orbital (HOMO)/lowest unoccupied molecular orbital (LUMO) energy levels and strong intermolecular interactions. In comparison to PIVI2T, the fluorinated PF4IVI2T exhibits stronger intra- and intermolecular interactions, lower HOMO/LUMO energy levels down to -5.74/-4.17 eV, and more ordered molecular packing with a smaller π-π stacking distance down to 3.53 Å, resulting in excellent ambipolar transporting behavior and a promising application in logic circuits for PF4IVI2T in ambient, with hole and electron mobilities of up to 1.03 and 1.82 cm2 V-1 s-1, respectively. The results reveal that F4IVI is a promising and strong electron-deficient building unit to construct high-performance semiconducting polymers, which provides an insight into the structure-property relationships for the exploration and molecular engineering of excellent electron-deficient building blocks in the field of organic electronics.
NASA Astrophysics Data System (ADS)
Saito, M.; Saito, Y.; Mukai, T.; Asamura, K.
2009-06-01
Future magnetospheric exploration missions (e.g., SCOPE: cross Scale COupling in the Plasma universE) aim to obtain the electron 3D distribution function with a very fast time resolution, below 10 ms, to investigate the electron dynamics regarded as pivotal in understanding space plasma phenomena such as magnetic reconnection. This can be achieved by developing a new plasma detector system that is fast in signal processing and has small size, light weight and low power consumption. The new detector system consists of stacked micro channel plates and a position-sensitive multi-anode detector with on-anode analogue ASICs (Application Specific Integrated Circuits). Multi-anode systems usually suffer from false signals caused mainly by two effects. One is electrostatic crosstalk between the discrete anodes, since our new detector consists of many adjacent anodes with small gaps to increase the detection area. Our experimental results show an electrostatic crosstalk effect of approximately 10% from the adjacent anodes. This 10% electrostatic crosstalk can be effectively avoided by a suitable discrimination level in the signal processing circuit. A non-negligible charge cloud size on the anode also causes false counts. An ASIC optimized for in-situ plasma measurement in the Earth's magnetosphere is under development. The initial electron cloud at the MCP output has angular divergence; furthermore, space charge effects may broaden the size of the charge cloud. We have obtained the charge cloud size both experimentally and theoretically. Our test model detector shows the expected performance, which is explained by the studies above.
The Globus Galaxies Platform. Delivering Science Gateways as a Service
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madduri, Ravi; Chard, Kyle; Chard, Ryan
Public cloud computers that host sophisticated scientific software and data can transform scientific practice by enabling broad access to capabilities previously available only to the few. The primary obstacle to more widespread use of public clouds to host scientific software ("cloud-based science gateways") has thus far been the considerable gap between the specialized needs of science applications and the capabilities provided by cloud infrastructures. We describe here a domain-independent, cloud-based science gateway platform, the Globus Galaxies platform, which overcomes this gap by providing a set of hosted services that directly address the needs of science gateway developers. The design and implementation of this platform leverages our several years of experience with Globus Genomics, a cloud-based science gateway that has served more than 200 genomics researchers across 30 institutions. Building on that foundation, we have implemented a platform that leverages the popular Galaxy system for application hosting and workflow execution; Globus services for data transfer, user and group management, and authentication; and a cost-aware elastic provisioning model specialized for public cloud resources. We describe here the capabilities and architecture of this platform, present six scientific domains in which we have successfully applied it, report on user experiences, and analyze the economics of our deployments. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
NASA Technical Reports Server (NTRS)
Pritchett, P. L.; Schriver, D.; Ashour-Abdalla, M.
1991-01-01
A one-dimensional electromagnetic particle simulation model is constructed to study the excitation of whistler waves in the presence of a cold plasma cloud for conditions representative of those after the release of lithium in the inner plasma sheet during the Combined Release and Radiation Effect Satellite mission. The results indicate that a standing-wave pattern with discrete wave frequencies is formed within the cloud. The magnetic wave amplitude inside the cloud, which is limited by quasi-linear diffusion, is of the order of several nanoteslas. Assuming a magnetospheric loss cone of 5 deg, the observed pitch angle diffusion produced by the whistler waves is sufficient to put the electrons on strong diffusion.
NASA Technical Reports Server (NTRS)
2001-01-01
X-rays diffracted from a well-ordered protein crystal create sharp patterns of scattered light on film. A computer can use these patterns to generate a model of a protein molecule. To analyze the selected crystal, an X-ray crystallographer shines X-rays through the crystal. Unlike a single dental X-ray, which produces a shadow image of a tooth, these X-rays have to be taken many times from different angles to produce a pattern from the scattered light, a map of the intensity of the X-rays after they diffract through the crystal. The X-rays bounce off the electron clouds that form the outer structure of each atom. A flawed crystal will yield a blurry pattern; a well-ordered protein crystal yields a series of sharp diffraction patterns. From these patterns, researchers build an electron density map. With powerful computers and a lot of calculations, scientists can use the electron density patterns to determine the structure of the protein and make a computer-generated model of the structure. The models let researchers improve their understanding of how the protein functions. They also allow scientists to look for receptor sites and active areas that control a protein's function and role in the progress of diseases. From there, pharmaceutical researchers can design molecules that fit the active site, much like a key and lock, so that the protein is locked without affecting the rest of the body. This is called structure-based drug design.
Combining 3d Volume and Mesh Models for Representing Complicated Heritage Buildings
NASA Astrophysics Data System (ADS)
Tsai, F.; Chang, H.; Lin, Y.-W.
2017-08-01
This study developed a simple but effective strategy to combine 3D volume and mesh models for representing complicated heritage buildings and structures. The idea is to seamlessly integrate 3D parametric or polyhedral models with mesh-based digital surfaces to generate a hybrid 3D model that takes advantage of both modeling methods. The proposed hybrid model generation framework is separated into three phases. First, after acquiring or generating 3D point clouds of the target, the 3D points are partitioned into different groups. Second, a parametric or polyhedral model of each group is generated based on plane- and surface-fitting algorithms to represent the basic structure of that region. A "bare-bones" model of the target can subsequently be constructed by connecting all 3D volume element models. In the third phase, the constructed bare-bones model is used as a mask to remove points enclosed by it from the original point clouds. The remaining points are then connected to form 3D surface mesh patches. The boundary points of each surface patch are identified and projected onto the surfaces of the bare-bones model. Finally, new meshes are created to connect the projected points and original mesh boundaries, integrating the mesh surfaces with the 3D volume model. The proposed method was applied to an open-source point cloud data set and to point clouds of a local historical structure. Preliminary results indicated that hybrid models reconstructed using the proposed method can retain both fundamental 3D volume characteristics and accurate geometric appearance with fine details. The reconstructed hybrid models can also represent targets at different levels of detail according to user and system requirements in different applications.
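The plane-fitting step used to build each volume element in the second phase can be illustrated with a minimal RANSAC plane fit. This is a generic sketch, not the authors' implementation; the synthetic point cloud, distance threshold, and iteration count are all illustrative assumptions:

```python
import numpy as np

def fit_plane_ransac(points, n_iters=200, dist_thresh=0.05, seed=0):
    """Fit a dominant plane to an (N, 3) point cloud with a basic RANSAC loop.

    Returns (normal, d, inlier_mask) for the plane n . x + d = 0.
    """
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_plane = None
    for _ in range(n_iters):
        # Sample 3 distinct points and form a candidate plane from them.
        idx = rng.choice(len(points), size=3, replace=False)
        p0, p1, p2 = points[idx]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:          # degenerate (collinear) sample, skip it
            continue
        normal /= norm
        d = -normal.dot(p0)
        # Points within dist_thresh of the candidate plane count as inliers.
        dist = np.abs(points @ normal + d)
        inliers = dist < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (normal, d)
    return best_plane[0], best_plane[1], best_inliers

# Synthetic example: a noisy z = 0 plane plus scattered outliers.
rng = np.random.default_rng(1)
plane_pts = np.column_stack([rng.uniform(-1, 1, (500, 2)),
                             rng.normal(0, 0.01, 500)])
outliers = rng.uniform(-1, 1, (50, 3))
cloud = np.vstack([plane_pts, outliers])
normal, d, mask = fit_plane_ransac(cloud)
print(abs(normal[2]))     # close to 1: recovered normal is ~(0, 0, ±1)
print(mask[:500].mean())  # most of the true plane points flagged as inliers
```

In a full pipeline this fit would be run repeatedly, removing each plane's inliers before searching for the next dominant plane.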
NASA Astrophysics Data System (ADS)
Garagnani, S.; Manferdini, A. M.
2013-02-01
Since their introduction, modeling tools aimed at architectural design have evolved into today's "digital multi-purpose drawing boards" based on enhanced parametric elements able to originate whole buildings within virtual environments. Semantic splitting and element topology are features that allow objects to be "intelligent" (i.e., self-aware of what kind of element they are and with whom they can interact), representing the basics of Building Information Modeling (BIM): a coordinated, consistent, and always up-to-date workflow designed to reach higher quality, reliability, and cost reductions across the design process. Even though BIM was originally intended for new architecture, its ability to store semantically interrelated information can be successfully applied to existing buildings as well, especially those that deserve particular care, such as Cultural Heritage sites. BIM engines can easily manage simple parametric geometries, collapsing them to standard primitives connected through hierarchical relationships; however, when components are generated from existing morphologies, for example by acquiring point clouds with digital photogrammetry or laser scanning equipment, complex abstractions have to be introduced while remodeling elements by hand, since automatic feature extraction in available software is still not effective. In order to introduce a methodology for processing point cloud data in a BIM environment with high accuracy, this paper describes some experiences in documenting monumental sites, generated through a plug-in written for Autodesk Revit and codenamed GreenSpider after its capability to lay out points in space as if they were nodes of an ideal cobweb.
2002-06-18
KENNEDY SPACE CENTER, FLA. -- Black storm clouds hang over the Vehicle Assembly Building and Launch Control Center, bringing thunder and heavy rain to the area. This type of weather convinced flight control managers to wave off the two scheduled landing attempts at KSC for Endeavour, returning from mission STS-111
NASA Technical Reports Server (NTRS)
Keyser, G.
1978-01-01
The design philosophy and performance characteristics of the continuous-flow diffusion chamber developed for ground-based simulation of some of the experiments planned for the atmospheric cloud physics laboratory during the first Spacelab flight are discussed. Topics covered include the principle of operation, thermal control, temperature measurement, TEM-powered heat exchangers, wettable metal surfaces, the sample injection system, and control electronics.
Multipactor saturation in parallel-plate waveguides
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sorolla, E.; Mattes, M.
2012-07-15
The saturation stage of a multipactor discharge is of interest because it can guide a criterion for assessing multipactor onset. The electron cloud under multipactor regime within a parallel-plate waveguide is modeled by a thin continuous distribution of charge, and the equations of motion are calculated taking space-charge effects into account. Saturation is identified with the interaction of the electron cloud with its image charge. The stability of the electron population growth is analyzed, and two saturation mechanisms are identified that explain steady-state multipactor for voltages just above the threshold onset. The impact energy of collisions against the metal plates decreases during the electron population growth due to the attraction of the electron sheet on its image through the initial plate. When this growth remains stable until the impact energy reaches the first cross-over point, the electron surface density tends to a constant value. When stability is broken before the first cross-over point is reached, the surface charge density oscillates chaotically, bounded within a certain range. In this case, an expression for the maximum electron surface charge density is found whose predictions agree with the simulations when the voltage is not too high.
Silicon photonics cloud (SiCloud)
NASA Astrophysics Data System (ADS)
DeVore, Peter T. S.; Jiang, Yunshan; Lynch, Michael; Miyatake, Taira; Carmona, Christopher; Chan, Andrew C.; Muniam, Kuhan; Jalali, Bahram
2015-02-01
We present SiCloud (Silicon Photonics Cloud), the first free, instructional web-based research and education tool for silicon photonics. SiCloud's vision is to provide a host of instructional and research web-based tools. Such interactive learning tools enhance traditional teaching methods by extending access to a very large audience, resulting in very high impact. Interactive tools engage the brain differently from merely reading, and so enhance and reinforce the learning experience. Understanding silicon photonics is challenging because the topic involves a wide range of disciplines, including materials science, semiconductor physics, electronics, and waveguide optics. This web-based calculator is an interactive analysis tool for the optical properties of silicon and related materials (SiO2, Si3N4, Al2O3, etc.). It is designed to be a one-stop resource for students, researchers, and design engineers. The first and most basic aspect of silicon photonics is the Material Parameters level, which provides the foundation for the Device, Sub-System, and System levels. SiCloud includes the common dielectrics and semiconductors for waveguide core, cladding, and photodetection, as well as metals for electrical contacts. SiCloud is a work in progress and its capability is being expanded. SiCloud is being developed at UCLA with funding from the National Science Foundation's Center for Integrated Access Networks (CIAN) Engineering Research Center.
Evolution of the Far-Infrared Cloud at Titan's South Pole
NASA Technical Reports Server (NTRS)
Jennings, Donald E.; Achterberg, R. K.; Cottini, V.; Anderson, C. M.; Flasar, F. M.; Nixon, C. A.; Bjoraker, G. L.; Kunde, V. G.; Carlson, R. C.; Guandique, E.;
2015-01-01
A condensate cloud on Titan identified by its 220 cm⁻¹ far-infrared signature continues to undergo seasonal changes at both the north and south poles. In the north the cloud, which extends from 55° N to the pole, has been gradually decreasing in emission intensity since the beginning of the Cassini mission, with a half-life of 3.8 years. The cloud in the south did not appear until 2012, but its intensity has increased rapidly, doubling every year. The shape of the cloud at the south pole is very different from that in the north. Mapping in December 2013 showed that the condensate emission was confined to a ring with a maximum at 80° S. The ring was centered 4 degrees from Titan's pole. The pattern of emission from stratospheric trace gases like nitriles and complex hydrocarbons (mapped in January 2014) was also offset by 4 degrees, but had a central peak at the pole and a secondary maximum in a ring at about 70° S, with a minimum at 80° S. The shape of the gas emission distribution can be explained by abundances that are high at the atmospheric pole and diminish toward the equator, combined with correspondingly increasing temperatures. We discuss possible causes for the condensate ring. The present rapid build-up of the condensate cloud at the south pole is likely to transition to a gradual decline during 2015-16.
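The quoted half-life and doubling time translate directly into exponential rates. A small worked example, assuming only the simple exponential behavior the abstract states:

```python
import math

# Exponential change: I(t) = I0 * exp(k * t).
# A half-life of T_half years gives k = -ln(2) / T_half;
# a doubling time of T_dbl years gives k = +ln(2) / T_dbl.
k_north = -math.log(2) / 3.8   # northern cloud decay rate (per year)
k_south = math.log(2) / 1.0    # southern cloud growth rate (per year)

print(round(k_north, 3))                  # ≈ -0.182 per year
print(round(math.exp(k_north * 3.8), 2))  # after one half-life: 0.5
print(round(math.exp(k_south * 3.0), 1))  # after 3 years of doubling: 8.0
```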
An efficient global energy optimization approach for robust 3D plane segmentation of point clouds
NASA Astrophysics Data System (ADS)
Dong, Zhen; Yang, Bisheng; Hu, Pingbo; Scherer, Sebastian
2018-03-01
Automatic 3D plane segmentation is necessary for many applications including point cloud registration, building information model (BIM) reconstruction, simultaneous localization and mapping (SLAM), and point cloud compression. However, most of the existing 3D plane segmentation methods still suffer from low precision and recall, and inaccurate and incomplete boundaries, especially for low-quality point clouds collected by RGB-D sensors. To overcome these challenges, this paper formulates the plane segmentation problem as a global energy optimization because it is robust to high levels of noise and clutter. First, the proposed method divides the raw point cloud into multiscale supervoxels, and considers planar supervoxels and individual points corresponding to nonplanar supervoxels as basic units. Then, an efficient hybrid region growing algorithm is utilized to generate initial plane set by incrementally merging adjacent basic units with similar features. Next, the initial plane set is further enriched and refined in a mutually reinforcing manner under the framework of global energy optimization. Finally, the performances of the proposed method are evaluated with respect to six metrics (i.e., plane precision, plane recall, under-segmentation rate, over-segmentation rate, boundary precision, and boundary recall) on two benchmark datasets. Comprehensive experiments demonstrate that the proposed method obtained good performances both in high-quality TLS point clouds (i.e., http://SEMANTIC3D.NET)
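The hybrid region-growing step that produces the initial plane set can be sketched as a breadth-first merge of adjacent basic units with similar normals. This is a simplified stand-in for the paper's algorithm; the angle threshold, adjacency structure, and toy data are assumptions:

```python
import numpy as np
from collections import deque

def region_grow(normals, adjacency, angle_thresh_deg=10.0):
    """Group basic units into planar regions by BFS region growing.

    normals:    (N, 3) unit normal per basic unit (supervoxel or point)
    adjacency:  list of neighbor-index lists, one per unit
    A neighbor joins the region when its normal deviates from the seed
    unit's normal by less than angle_thresh_deg.
    """
    cos_thresh = np.cos(np.radians(angle_thresh_deg))
    labels = -np.ones(len(normals), dtype=int)   # -1 means unassigned
    region = 0
    for seed in range(len(normals)):
        if labels[seed] != -1:
            continue
        labels[seed] = region
        queue = deque([seed])
        while queue:
            u = queue.popleft()
            for v in adjacency[u]:
                # Accept neighbor if normals are nearly (anti)parallel.
                if labels[v] == -1 and abs(normals[v] @ normals[seed]) >= cos_thresh:
                    labels[v] = region
                    queue.append(v)
        region += 1
    return labels

# Toy example: a chain of 6 units, first three facing +z, last three +x.
normals = np.array([[0, 0, 1]] * 3 + [[1, 0, 0]] * 3, dtype=float)
adjacency = [[1], [0, 2], [1, 3], [2, 4], [3, 5], [4]]
print(region_grow(normals, adjacency))  # → [0 0 0 1 1 1]
```

In the paper this initial plane set is then refined by global energy optimization; the sketch covers only the seeding stage.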
Galactic Building Blocks Seen Swarming Around Andromeda
NASA Astrophysics Data System (ADS)
2004-02-01
Green Bank, WV - A team of astronomers using the National Science Foundation's Robert C. Byrd Green Bank Telescope (GBT) has made the first conclusive detection of what appear to be the leftover building blocks of galaxy formation -- neutral hydrogen clouds -- swarming around the Andromeda Galaxy, the nearest large spiral galaxy to the Milky Way. This discovery may help scientists understand the structure and evolution of the Milky Way and all spiral galaxies. It also may help explain why certain young stars in mature galaxies are surprisingly bereft of the heavy elements that their contemporaries contain. [Image caption: Andromeda Galaxy. This image depicts several long-sought galactic "building blocks" in orbit of the Andromeda Galaxy (M31). The newfound hydrogen clouds are depicted in a shade of orange (GBT), while gas that comprises the massive hydrogen disk of Andromeda is shown at high resolution in blue (Westerbork Synthesis Radio Telescope). CREDIT: NRAO/AUI/NSF, WSRT] "Giant galaxies, like Andromeda and our own Milky Way, are thought to form through repeated mergers with smaller galaxies and through the accretion of vast numbers of even lower mass 'clouds' -- dark objects that lack stars and even are too small to call galaxies," said David A. Thilker of the Johns Hopkins University in Baltimore, Maryland. "Theoretical studies predict that this process of galactic growth continues today, but astronomers have been unable to detect the expected low mass 'building blocks' falling into nearby galaxies, until now." Thilker's research is published in the Astrophysical Journal Letters. Other contributors include: Robert Braun of the Netherlands Foundation for Research in Astronomy; Rene A.M. Walterbos of New Mexico State University; Edvige Corbelli of the Osservatorio Astrofisico di Arcetri in Italy; Felix J. 
Lockman and Ronald Maddalena of the National Radio Astronomy Observatory (NRAO) in Green Bank, West Virginia; and Edward Murphy of the University of Virginia. The Milky Way and Andromeda were formed many billions of years ago in a cosmic neighborhood brimming with galactic raw materials -- among which hydrogen, helium, and cold dark matter were primary constituents. By now, most of this raw material has probably been gobbled up by the two galaxies, but astronomers suspect that some primitive clouds are still floating free. Previous studies have revealed a number of clouds of neutral atomic hydrogen that are near the Milky Way but not part of its disk. These were initially referred to as high-velocity clouds (HVCs) when they were first discovered because they appeared to move at velocities difficult to reconcile with Galactic rotation. Scientists were uncertain if HVCs comprised building blocks of the Milky Way that had so far escaped capture, or if they traced gas accelerated to unexpected velocities by energetic processes (multiple supernovae) within the Milky Way. The discovery of similar clouds bound to the Andromeda Galaxy strengthens the case that at least some of these HVCs are indeed galactic building blocks. Astronomers are able to use radio telescopes to detect the characteristic 21-centimeter radiation emitted naturally by neutral atomic hydrogen. The great difficulty in analyzing these low-mass galactic building blocks has been that their natural radio emission is extremely faint. Even those nearest to us, clouds orbiting our Galaxy, are hard to study because of serious distance uncertainties. "We know the Milky Way HVCs are relatively nearby, but precisely how close is maddeningly tough to determine," said Thilker. 
Past attempts to find missing satellites around external galaxies at well-known distances have been unsuccessful because of the need for a very sensitive instrument capable of producing high-fidelity images, even in the vicinity of a bright source such as the Andromeda Galaxy. One might consider this task similar to visually distinguishing a candle placed adjacent to a spotlight. The novel design of the recently commissioned GBT met these challenges brilliantly, and gave astronomers their first look at the cluttered neighborhood around Andromeda. The Andromeda Galaxy was targeted because it is the nearest massive spiral galaxy. "In some sense, the rich get richer, even in space," said Thilker. "All else being equal, one would expect to find more primordial clouds in the vicinity of a large spiral galaxy than near a small dwarf galaxy, for instance. This makes Andromeda a good place to look, especially considering its relative proximity -- a mere 2.5 million light-years from Earth." What the GBT was able to pin down was a population of 20 discrete neutral hydrogen clouds, together with an extended filamentary component, which, the astronomers believe, are both associated with Andromeda. These objects, seemingly under the gravitational influence of Andromeda's halo, are thought to be the gaseous clouds of the "missing" (perhaps dark-matter dominated) satellites and their merger remnants. They were found within 163,000 light-years of Andromeda. Favored cosmological models have predicted the existence of these satellites, and their discovery could account for some of the missing "cold dark matter" in the Universe. Also, confirmation that these low-mass objects are ubiquitous around larger galaxies could help solve the mystery of why certain young stars, known as G-dwarf stars, are chemically similar to ones that evolved billions of years ago. 
As galaxies age, they develop greater concentrations of heavy elements formed by the nuclear reactions in the cores of stars and in the cataclysmic explosions of supernovae. These explosions spew heavy elements out into the galaxy, which then become planets and get taken up in the next generation of stars. Spectral and photometric analysis of young stars in the Milky Way and other galaxies, however, show that there are a certain number of young stars that are surprisingly bereft of heavy elements, making them resemble stars that should have formed in the early stages of galactic evolution. "One way to account for this strange anomaly is to have a fresh source of raw galactic material from which to form new stars," said Murphy. "Since high-velocity clouds may be the leftover building blocks of galaxy formation, they contain nearly pristine concentrations of hydrogen, mostly free from the heavy metals that seed older galaxies." Their merger into large galaxies, therefore, could explain how fresh material is available for the formation of G-dwarf stars. The Andromeda Galaxy, also known as M31, is one of only a few galaxies that are visible from Earth with the unaided eye, and is seen as a faint smudge in the constellation Andromeda. When viewed through a modest telescope, Andromeda also reveals that it has two prominent satellite dwarf galaxies, known as M32 and M110. These dwarfs, along with the clouds studied by Thilker and collaborators, are doomed to eventually merge with Andromeda. The Milky Way, M33, and the Andromeda Galaxy plus about 40 dwarf companions, comprise what is known as the "Local Group." Today, Andromeda is perhaps the most studied galaxy other than the Milky Way. In fact, many of the things we know about the nature of galaxies like the Milky Way were learned by studying Andromeda, since the overall features of our own galaxy are disguised by our internal vantage point. "In this case, Andromeda is a good analogue for the Milky Way," said Murphy. 
"It clarifies the picture. Living inside the Milky Way is like trying to determine what your house looks like from the inside, without stepping outdoors. However, if you look at neighbors' houses, you can get a feeling for what your own home might look like." The GBT is the world's largest fully steerable radio telescope. The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc.
SU-E-P-05: Electronic Brachytherapy: A Physics Perspective On Field Implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pai, S; Ayyalasomayajula, S; Lee, S
2015-06-15
Purpose: To summarize our experience implementing a successful electronic brachytherapy program at several dermatology clinics with the help of cloud-based software, to define the key program parameters, and to capture the physics QA aspects. Well-developed software helps the physicist in peer review and in qualifying the physical parameters. Methods: Using the XOFT™ Axxent™ electronic brachytherapy system in conjunction with cloud-based software, a process was set up to capture and record treatments. It was implemented initially at about 10 sites in California. For dosimetric purposes, the software facilitated storage of the physics parameters of the surface applicators used in treatment and other source calibration parameters. In addition, the patient prescription, pathology, and other setup considerations were input by the radiation oncologist and the therapist. This facilitated physics planning of the treatment parameters and an independent check of the dwell time. From 2013-2014, nearly 1,500 such calculations were completed by a group of physicists. A total of 800 patients with multiple lesions were treated successfully during this period. The treatment log files were uploaded and documented in the software, which facilitated physics peer review of treatments per the standards in place by AAPM and ACR. Results: The program model was implemented successfully at multiple sites. The cloud-based software allowed for proper peer review and compliance of the program at 10 clinical sites. Dosimetry was performed for 800 patients and executed in a timely fashion to suit clinical needs. Physics data accumulated in the software from the clinics allows for robust analysis and future development. Conclusion: The electronic brachytherapy implementation experience, from a quality assurance perspective, was greatly enhanced by using cloud-based software. The comprehensive database will pave the way for future developments yielding superior physics outcomes.
NASA Astrophysics Data System (ADS)
Regi, Mauro; Redaelli, Gianluca; Francia, Patrizia; De Lauretis, Marcello
2017-06-01
In the present study we investigated the possible relationship between ULF geomagnetic activity and variations in several atmospheric parameters. In particular, we compared the ULF activity in the Pc1-2 frequency band (100 mHz-5 Hz), computed from geomagnetic field measurements at Terra Nova Bay in Antarctica, with the tropospheric temperature T, specific humidity Q, and cloud cover (high, medium, and low cloud cover) obtained from a reanalysis data set. The statistical analysis was conducted over the years 2003-2010, using correlation and Superposed Epoch Analysis approaches. The results show that the atmospheric parameters change significantly within 2 days following an increase of geomagnetic activity. These changes are particularly evident when the interplanetary magnetic field Bz component is oriented southward (Bz<0) and the By component duskward (By>0). We suggest that both the precipitation of electrons induced by Pc1-2 activity and the intensification of the polar cap potential difference, by modulating microphysical processes in clouds, can affect atmospheric conditions.
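The Superposed Epoch Analysis used here can be sketched in a few lines: stack windows of a time series around each key event and average across events, so event-locked responses reinforce while unrelated variability cancels. This is a generic sketch with synthetic data, not the study's actual data or code:

```python
import numpy as np

def superposed_epoch(series, event_indices, window):
    """Average a time series in a ±window span around each event index."""
    segments = []
    for t0 in event_indices:
        # Keep only events whose full window fits inside the series.
        if t0 - window >= 0 and t0 + window < len(series):
            segments.append(series[t0 - window : t0 + window + 1])
    return np.mean(segments, axis=0)   # epoch runs from -window .. +window

# Toy series: noise plus a bump 2 samples after each "event".
rng = np.random.default_rng(0)
series = rng.normal(0, 0.3, 1000)
events = np.arange(50, 950, 40)
series[events + 2] += 1.0              # lagged response to each event
mean_epoch = superposed_epoch(series, events, window=5)
print(np.argmax(mean_epoch) - 5)       # lag of the averaged peak → 2
```

Averaging across the 23 toy events suppresses the noise enough that the 2-sample lag stands out clearly, which is exactly the property the method exploits.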
30. SITE BUILDING 002 SCANNER BUILDING FLOOR 3A ...
30. SITE BUILDING 002 - SCANNER BUILDING - FLOOR 3A ("A" FACE) INTERIOR BETWEEN GRIDS 17-A1 AND 18-A1, SHOWING REAR OF RADAR EMITTER ELECTRONIC INTERFACE TERMINAL NO. 3147-20, "RECEIVER TRANSMITTER RADAR" MODULE. VIEW IS ALSO SHOWING BUILDING FIRE STOP MATERIAL AT BOTTOM OF FLOOR. NOTE: WALL SLOPES BOTTOM TO TOP INWARD; STRUCTURAL ELEMENT IN FOREGROUND. VIEW ALSO SHOWS PIPING GRID OF CHILLED WATER LINES FOR ELECTRONIC SYSTEMS COOLING. - Cape Cod Air Station, Technical Facility-Scanner Building & Power Plant, Massachusetts Military Reservation, Sandwich, Barnstable County, MA
NASA Astrophysics Data System (ADS)
Li, Weijun; Li, Peiren; Sun, Guode; Zhou, Shengzhen; Yuan, Qi; Wang, Wenxing
2011-05-01
Most studies of aerosol-cloud interactions have been conducted in remote locations; few have investigated the characterization of cloud condensation nuclei (CCN) over highly polluted urban and industrial areas. The present work, based on samples collected at Mt. Tai, a site in northern China affected by nearby urban and industrial air pollutant emissions, illuminates CCN properties in a polluted atmosphere. High-resolution transmission electron microscopy (TEM) was used to obtain the size, composition, and mixing state of individual cloud residues and interstitial aerosols. Most of the cloud residues displayed distinct rims, which were found to consist of soluble organic matter (OM). Nearly all (91.7%) cloud residues were attributed to sulfate-related salts (the remainder was mostly coarse crustal dust particles with nitrate coatings). Half the salt particles were internally mixed with two or more refractory particles (e.g., soot, fly ash, crustal dust, CaSO4, and OM). A comparison between cloud residues and interstitial particles shows that the former contained more salts and were of larger particle size than the latter. In addition, a somewhat high number scavenging ratio of 0.54 was observed during cloud formation. Therefore, mixtures of salts with OM account for most of the cloud-nucleating ability of the entire aerosol population in the polluted air of northern China. We advocate that both size and composition - the two influential, controlling factors for aerosol activation - should be built into all regional climate models of China.
Door recognition in cluttered building interiors using imagery and lidar data
NASA Astrophysics Data System (ADS)
Díaz-Vilariño, L.; Martínez-Sánchez, J.; Lagüela, S.; Armesto, J.; Khoshelham, K.
2014-06-01
Indoor building reconstruction is an active research topic due to the wide range of applications it supports, from architecture and furniture design, to movie and video game editing, and even crime scene investigation. Among the constructive elements defining the inside of a building, doors are important entities in applications like routing and navigation, and their automated recognition is advantageous, e.g., in large multi-storey buildings with many office rooms. The inherent complexity of automating the recognition process is increased by the presence of clutter and occlusions, which are difficult to avoid in indoor scenes. In this work, we present a pipeline of techniques for the reconstruction and interpretation of building interiors using information acquired in the form of point clouds and images. The methodology goes in depth with door detection and labelling as either opened, closed or furniture (false positive)
Investigating the Application of Moving Target Defenses to Network Security
2013-08-01
developing an MTD testbed using OpenStack [14] to show that our MTD design can actually work. Building an MTD system in a cloud infrastructure will be... Information Intelligence Research. New York, USA: ACM, 2013. [14] OpenStack, "OpenStack: The Folsom release," http://www.openstack.org/software
2002-06-18
KENNEDY SPACE CENTER, FLA. - Dark, rain-filled clouds blanket the sky over the Vehicle Assembly Building and Launch Control Center, bringing thunder and heavy rain to the area. This type of weather convinced flight control managers to wave off the two scheduled landing attempts at KSC for Endeavour, returning from mission STS-111
2009-09-10
CAPE CANAVERAL, Fla. – Between rain showers over NASA's Kennedy Space Center in Florida, a rainbow breaks through the clouds behind the Vehicle Assembly Building. Rain and thunderstorms caused the waveoff of two landing opportunities Sept. 10 for space shuttle Discovery's return to Earth from the STS-128 mission. Photo credit: NASA/Jim Grossmann
Cloud cover estimation optical package: New facility, algorithms and techniques
NASA Astrophysics Data System (ADS)
Krinitskiy, Mikhail
2017-02-01
Short- and long-wave radiation are important components of the surface heat budget over sea and land. Estimating them requires accurate observations of the cloud cover. While cloud cover is widely observed visually, building accurate parameterizations also requires quantifying it with precise instrumental measurements. Major disadvantages of most existing cloud cameras are their complicated design and inaccurate post-processing algorithms, which typically result in uncertainties of 20% to 30% in camera-based estimates of cloud cover; the accuracy of such algorithms in terms of true scoring against human-observed values is typically less than 10%. We developed a new-generation package for cloud cover estimation, which provides much more accurate results and also allows additional characteristics to be measured. A new algorithm, SAIL GrIx, was developed for this package. It uses a synthetic controlling index (the "grayness rate index"), which suppresses the background sunburn effect and makes it possible to increase the reliability of detecting optically thin clouds. The accuracy of this algorithm in terms of true scoring reached 30%. A second approach, SAIL GrIx ML, which we used to further increase the cloud cover estimation accuracy, applies machine learning together with other signal-processing techniques; the sun disk condition appears to be a strong feature in this kind of model. An artificial neural network model demonstrates the best quality, with accuracy in terms of true scoring of up to 95.5%. The new algorithm let us modify the design of the optical sensing package and avoid the use of solar trackers, making the cloud camera much more compact. The new cloud camera has already been tested in several missions across the Atlantic and Indian oceans on board IORAS research vessels.
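The idea of classifying sky pixels by a color index and reporting the cloudy fraction can be illustrated with a minimal sketch. The abstract does not give the formula for the "grayness rate index", so this sketch substitutes the classic red/blue ratio (clear sky scatters blue strongly, so R/B is low; gray clouds have R ≈ B); the threshold and toy image are assumptions:

```python
import numpy as np

def cloud_fraction(rgb, ratio_thresh=0.8):
    """Estimate cloud cover fraction from a sky image.

    rgb: (H, W, 3) float array with channel values in [0, 1].
    A pixel is flagged as cloud when its red/blue ratio exceeds
    ratio_thresh (a stand-in for the paper's grayness rate index).
    """
    r, b = rgb[..., 0], rgb[..., 2]
    ratio = r / np.clip(b, 1e-6, None)   # avoid division by zero
    cloudy = ratio > ratio_thresh        # gray/white pixels flagged as cloud
    return cloudy.mean()

# Toy image: left half deep-blue "clear sky", right half gray "cloud".
img = np.zeros((10, 10, 3))
img[:, :5] = [0.2, 0.4, 0.9]   # clear sky: R/B ≈ 0.22
img[:, 5:] = [0.7, 0.7, 0.7]   # cloud: R/B = 1.0
print(cloud_fraction(img))      # → 0.5
```

The machine-learning variant in the paper replaces the fixed threshold with a learned classifier over per-pixel and whole-image features such as the sun disk condition.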
Unidata's Vision for Transforming Geoscience by Moving Data Services and Software to the Cloud
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.; Fisher, W.; Yoksas, T.
2014-12-01
Universities are facing many challenges: shrinking budgets, rapidly evolving information technologies, exploding data volumes, multidisciplinary science requirements, and high student expectations. These changes are upending traditional approaches to accessing and using data and software. It is clear that Unidata's products and services must evolve to support new approaches to research and education. After years of hype and ambiguity, cloud computing is maturing in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Cloud services aimed at providing any resource, at any time, from any place, using any device are increasingly being embraced by all types of organizations. Given this trend and the enormous potential of cloud-based services, Unidata is moving to augment its products, services, data delivery mechanisms, and applications to align with the cloud-computing paradigm. Specifically, Unidata is working toward establishing a community-based development environment that supports the creation and use of software services to build end-to-end data workflows. The design encourages the creation of services that can be broken into small, independent chunks that provide simple capabilities. Chunks could be used individually to perform a task, or chained into simple or elaborate workflows. The services will also be portable, allowing their use in researchers' own cloud-based computing environments.
In this talk, we present a vision for Unidata's future in cloud-enabled data services and discuss our initial efforts to deploy a subset of Unidata data services and tools in the Amazon EC2 and Microsoft Azure cloud environments, including the transfer of real-time meteorological data into its cloud instances, product generation using those data, and the deployment of TDS, McIDAS ADDE, and AWIPS II data servers and the Integrated Data Viewer (IDV) visualization tool.
On Study of Application of Big Data and Cloud Computing Technology in Smart Campus
NASA Astrophysics Data System (ADS)
Tang, Zijiao
2017-12-01
We live in an era of networks and information, producing and facing large amounts of data every day. Databases in the traditional sense cannot adequately store, process, and analyze such mass data, and big data technology was born to meet this need. The development and operation of big data in turn rest on cloud computing, which provides the space and resources needed to process and analyze data. The proposal of smart campus construction aims to advance the informatization of colleges and universities. It is therefore worth combining big data and cloud computing technologies in smart campus construction, so that the campus database system and campus management system are integrated rather than isolated, and so that mass data can be integrated, stored, processed, and analyzed in service of the smart campus.
Orbital Sciences Pegasus XL AIM Processing
2007-03-16
Inside the clean-room "tent" of Building 1555 at North Vandenberg Air Force Base, two of the solar array panels on the AIM spacecraft are deployed for testing. Inside are the instruments that will study polar mesospheric clouds located at the edge of space. The AIM spacecraft will fly three instruments designed to study those clouds located at the edge of space, 50 miles above the Earth's surface in the coldest part of the planet's atmosphere. The mission's primary goal is to explain why these clouds form and what has caused them to become brighter and more numerous and appear at lower latitudes in recent years. AIM's results will provide the basis for the study of long-term variability in the mesospheric climate and its relationship to global climate change. AIM is scheduled to be mated to the Pegasus XL during the second week of April, after which final inspections will be conducted. Launch is scheduled for April 25.
Detecting the building blocks of aromatics
NASA Astrophysics Data System (ADS)
Joblin, Christine; Cernicharo, José
2018-01-01
Interstellar clouds are sites of active organic chemistry (1). Many small, gas-phase molecules are found in the dark parts of the clouds that are protected from ultraviolet (UV) photons, but these molecules photodissociate in the external layers of the cloud that are exposed to stellar radiation (see the photo). These irradiated regions are populated by large polycyclic aromatic hydrocarbons (PAHs) with characteristic infrared (IR) emission features. These large aromatics are expected to form from benzene (C6H6), which is, however, difficult to detect because it does not have a permanent dipole moment and can only be detected via its IR absorption transitions against a strong background source (2). On page 202 of this issue, McGuire et al. (3) report the detection of benzonitrile (c-C6H5CN) with radio telescopes. Benzonitrile likely forms in the reaction of CN with benzene; from its observation, it is therefore possible to estimate the abundance of benzene itself.
A global view of atmospheric ice particle complexity
NASA Astrophysics Data System (ADS)
Schmitt, Carl G.; Heymsfield, Andrew J.; Connolly, Paul; Järvinen, Emma; Schnaiter, Martin
2016-11-01
Atmospheric ice particles exist in a variety of shapes and sizes. Single hexagonal crystals like common hexagonal plates and columns are possible, but more frequently, atmospheric ice particles are much more complex. Ice particle shapes have a substantial impact on many atmospheric processes, from fall speed, which affects cloud lifetime, to radiative properties, which affect the energy balance, to name a few. This publication builds on earlier work in which a technique was demonstrated to separate single crystals from aggregates of crystals using particle imagery data from aircraft field campaigns. Here, data from 10 field programs have been analyzed and ice particle complexity parameterized by cloud temperature for arctic, midlatitude (summer and frontal), and tropical cloud systems. Results show that the transition from simple to complex particles can occur at sizes as small as 80 µm or as large as 400 µm depending on conditions. All regimes show trends of decreasing transition size with decreasing temperature.
H-Ransac a Hybrid Point Cloud Segmentation Combining 2d and 3d Data
NASA Astrophysics Data System (ADS)
Adam, A.; Chatzilari, E.; Nikolopoulos, S.; Kompatsiaris, I.
2018-05-01
In this paper, we present a novel 3D segmentation approach operating on point clouds generated from overlapping images. The aim of the proposed hybrid approach is to effectively segment co-planar objects by leveraging the structural information originating from the 3D point cloud and the visual information from the 2D images, without resorting to learning-based procedures. More specifically, the proposed hybrid approach, H-RANSAC, is an extension of the well-known RANSAC plane-fitting algorithm, incorporating an additional consistency criterion based on the results of 2D segmentation. Our expectation that the integration of 2D data into 3D segmentation will achieve more accurate results is validated experimentally in the domain of 3D city models. Results show that H-RANSAC can successfully delineate building components like main facades and windows, and provide more accurate segmentation results compared to the typical RANSAC plane-fitting algorithm.
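The idea of adding a 2D consistency check to RANSAC plane fitting can be sketched as follows. This is a simplified stand-in, not the paper's exact criterion: here a candidate plane's inliers are merely required to share a majority 2D segmentation label, and the interface (`labels`, `tol`) is hypothetical.

```python
import numpy as np

def ransac_plane(points, labels=None, n_iter=200, tol=0.05, seed=0):
    """Fit one plane to 3D points with RANSAC.

    If per-point 2D segmentation labels are supplied, a candidate
    plane's inliers are restricted to their majority 2D label -- a
    simplified version of an H-RANSAC-style 2D consistency criterion.
    """
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                      # degenerate (collinear) sample
            continue
        normal /= norm
        dist = np.abs((points - sample[0]) @ normal)
        inliers = dist < tol
        if labels is not None and inliers.any():
            majority = np.bincount(labels[inliers]).argmax()
            inliers &= labels == majority    # keep only the dominant 2D segment
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers

# Toy cloud: 100 points on a facade plane z = 0, plus 20 off-plane outliers
rng = np.random.default_rng(1)
plane = np.c_[rng.random((100, 2)), np.zeros(100)]
noise = rng.random((20, 3)) + np.array([0, 0, 1.0])
pts = np.vstack([plane, noise])
lbl = np.r_[np.zeros(100, int), np.ones(20, int)]
inl = ransac_plane(pts, lbl)
```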
A bright-rimmed cloud sculpted by the H ii region Sh2-48
NASA Astrophysics Data System (ADS)
Ortega, M. E.; Paron, S.; Giacani, E.; Rubio, M.; Dubner, G.
2013-08-01
Aims: We characterize a bright-rimmed cloud embedded in the H ii region Sh2-48 while searching for evidence of triggered star formation. Methods: We carried out observations towards a region of 2' × 2' centered at RA = 18h22m11.39s, Dec = -14°35'24.81'' (J2000) using the Atacama Submillimeter Telescope Experiment (ASTE; Chile) in the 12CO J = 3-2, 13CO J = 3-2, HCO+ J = 4-3, and CS J = 7-6 lines with an angular resolution of about 22''. We also present radio continuum observations at 5 GHz carried out with the Jansky Very Large Array (JVLA; USA) interferometer with a synthesized beam of 7'' × 5''. The molecular transitions were used to study the distribution and kinematics of the molecular gas of the bright-rimmed cloud. The radio continuum data were used to characterize the ionized gas located on the illuminated border of this molecular condensation. Combining these observations with public infrared data allowed us to build up a comprehensive picture of the current state of star formation within this cloud. Results: The analysis of our molecular observations reveals a relatively dense clump with n(H2) ~ 3 × 10^3 cm^-3, located in projection onto the interior of the H ii region Sh2-48. The emission distribution of the four observed molecular transitions has, at VLSR ~ 38 km s^-1, a morphological anticorrelation with the bright-rimmed cloud as seen in the optical emission. From the new radio continuum observations, we identify a thin layer of ionized gas located on the border of the clump that is facing the ionizing star. The ionized gas has an electron density of about 73 cm^-3, which is a factor of three higher than the typical critical density (n_c ~ 25 cm^-3) above which an ionized boundary layer can be formed and maintained. This supports the hypothesis that the clump is being photoionized by the nearby O9.5V star, BD-14 5014.
From the evaluation of the pressure balance between the ionized and molecular gas, we conclude that the clump is in a pre-pressure-balance state, with shocks still being driven into its surface layer. Among the five YSO candidates found in the region, two (class I) lie slightly beyond the bright rim, suggesting that their formation could have been triggered by the radiation-driven implosion process.
NASA Astrophysics Data System (ADS)
Ogunjobi, O.; Sivakumar, V.; Mtumela, Z.
2017-06-01
Energetic electrons are trapped in the Earth's radiation belts, which occupy a toroidal region between 3 and 7 R_E above the Earth's surface. Rapid loss of electrons from the radiation belts is known as a dropout. The source and loss mechanisms regulating the radiation belt population are not yet entirely understood, particularly during geomagnetic storms; the dominant loss mechanism may require an event-based study to be better observed. Utilizing multiple data sources from the years 1997-2007, this study identifies radiation belt electron dropouts triggered by the arrival at Earth of solar wind stream interfaces (SIs) or magnetic clouds (MCs). Using the superposed epoch analysis (SEA) technique, a synthesis of multiple observations is performed to reveal the loss mechanism that may be a major contributor to radiation belt losses under SI- and MC-driven storms. Results show an abrupt, slowly decaying electron precipitation peak (about 3000 counts/s) on SI arrival within 5.05 < L < 6.05, which persists for 0.5 day before a gradual recovery. This pattern is interpreted as an indication of electrons depleted from the bounce loss cone via a precipitation mechanism known as relativistic electron microbursts. MC-driven storms, on the other hand, show a pancake precipitation peak extending to lower L (the plasmapause), indicating a combination of electron cyclotron harmonic (ECH) and whistler-mode waves as the contributing mechanisms.
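Superposed epoch analysis, as used above, aligns segments of a time series on a list of event times and averages them so that features common to the events reinforce while unrelated variability averages out. A minimal sketch on synthetic data (the toy flux record and dropout shape are illustrative assumptions):

```python
import numpy as np

def superposed_epoch(series, epochs, window):
    """Average windows of `series` centered on each epoch time."""
    segments = [series[t - window : t + window + 1] for t in epochs
                if t - window >= 0 and t + window < len(series)]
    return np.mean(segments, axis=0)

# Toy electron-flux record: a 5-step dropout follows each event time
t = np.arange(200)
flux = np.full(200, 100.0)
events = [50, 120, 170]
for e in events:
    flux[e : e + 5] -= 40            # dip of 40 counts at each event

mean_profile = superposed_epoch(flux, events, window=10)
# mean_profile[10] is epoch zero; the dip appears there in every event
```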
Thunderstorms over the Pacific Ocean as seen from STS-64
NASA Technical Reports Server (NTRS)
1994-01-01
Multiple thunderstorm cells towering into Earth's atmosphere were photographed on 70mm film by the astronauts of STS-64, orbiting aboard the Space Shuttle Discovery 130 nautical miles above. These thunderstorms are located about 16 degrees southeast of Hawaii in the Pacific Ocean. Every stage of a developing thunderstorm is documented in this photo, from the building cauliflower tops to the mature anvil phase. The anvils, the tops of the clouds being blown off, are at about 50,000 feet. The light line in the blue atmosphere is either clouds in the distance or an atmospheric layer that is defined by different particle sizes.
Interior Reconstruction Using the 3d Hough Transform
NASA Astrophysics Data System (ADS)
Dumitru, R.-C.; Borrmann, D.; Nüchter, A.
2013-02-01
Laser scanners are often used to create accurate 3D models of buildings for civil engineering purposes, but the process of manually vectorizing a 3D point cloud is time consuming and error-prone (Adan and Huber, 2011). Therefore, the need arises to characterize and quantify complex environments in an automatic fashion, posing challenges for data analysis. This paper presents a system for 3D modeling that detects planes in 3D point clouds and reconstructs the scene at a high architectural level, automatically removing clutter and foreground data. The implemented software detects openings, such as windows and doors, and completes the 3D model by inpainting.
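Plane detection via the 3D Hough transform parameterizes a plane by its normal direction (theta, phi) and distance rho, and lets every point vote for the planes passing through it; peaks in the accumulator are the dominant planes. The coarse sketch below is a minimal illustration with an assumed grid resolution, not the (more efficient, randomized) variant the paper builds on:

```python
import numpy as np

def hough_planes(points, n_theta=18, n_phi=9, rho_step=0.1):
    """Coarse 3D Hough transform: each point votes for every
    discretized normal direction and the implied distance rho = p . n.
    Returns the (theta, phi, rho) cell with the most votes."""
    thetas = np.linspace(0, np.pi, n_theta, endpoint=False)
    phis = np.linspace(0, np.pi / 2, n_phi)
    acc = {}
    for p in points:
        for i, th in enumerate(thetas):
            for j, ph in enumerate(phis):
                n = (np.cos(th) * np.sin(ph),
                     np.sin(th) * np.sin(ph),
                     np.cos(ph))
                rho_bin = round(float(np.dot(p, n)) / rho_step)
                key = (i, j, rho_bin)
                acc[key] = acc.get(key, 0) + 1
    (i, j, rho_bin), votes = max(acc.items(), key=lambda kv: kv[1])
    return thetas[i], phis[j], rho_bin * rho_step, votes

# 50 points sampled on a horizontal floor plane z = 0
rng = np.random.default_rng(0)
pts = np.c_[rng.random((50, 2)), np.zeros(50)]
theta, phi, rho, votes = hough_planes(pts)
# phi = 0 means a vertical normal (0, 0, 1), i.e. a horizontal plane
```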
Extracting Topological Relations Between Indoor Spaces from Point Clouds
NASA Astrophysics Data System (ADS)
Tran, H.; Khoshelham, K.; Kealy, A.; Díaz-Vilariño, L.
2017-09-01
3D models of indoor environments are essential for many application domains such as navigation guidance, emergency management and a range of indoor location-based services. The principal components defined in different BIM standards contain not only building elements, such as floors, walls and doors, but also navigable spaces and their topological relations, which are essential for path planning and navigation. We present an approach to automatically reconstruct topological relations between navigable spaces from point clouds. Three types of topological relations, namely containment, adjacency and connectivity of the spaces are modelled. The results of initial experiments demonstrate the potential of the method in supporting indoor navigation.
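Of the three relations modelled above, containment and adjacency can be sketched with simplified 2D footprints of the spaces. The `Box` representation and the wall-thickness tolerance below are hypothetical simplifications; connectivity, which additionally requires detected openings such as doors, is omitted here.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned 2D footprint of an indoor space."""
    xmin: float
    ymin: float
    xmax: float
    ymax: float

def contains(a: Box, b: Box) -> bool:
    """True if space a fully contains space b (e.g. a storey contains a room)."""
    return (a.xmin <= b.xmin and a.ymin <= b.ymin and
            a.xmax >= b.xmax and a.ymax >= b.ymax)

def adjacent(a: Box, b: Box, tol: float = 0.2) -> bool:
    """True if the footprints share a boundary within tol
    (roughly: separated by no more than a wall's thickness)."""
    x_gap = max(a.xmin, b.xmin) - min(a.xmax, b.xmax)
    y_gap = max(a.ymin, b.ymin) - min(a.ymax, b.ymax)
    return (abs(x_gap) <= tol and y_gap < 0) or \
           (abs(y_gap) <= tol and x_gap < 0)

floor  = Box(0, 0, 10, 10)
room_a = Box(0, 0, 5, 10)
room_b = Box(5.1, 0, 10, 10)   # separated from room_a by a 0.1 m wall
```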
NASA Astrophysics Data System (ADS)
Nugent, Paul Winston
Cloud cover is an important but poorly understood component of current climate models, and although climate change is most easily observed in the Arctic, cloud data in the Arctic is unreliable or simply unavailable. Ground-based infrared cloud imaging has the potential to fill this gap. This technique uses a thermal infrared camera to observe cloud amount, cloud optical depth, and cloud spatial distribution at a particular location. The Montana State University Optical Remote Sensor Laboratory has developed the ground-based Infrared Cloud Imager (ICI) instrument to measure spatial and temporal cloud data. To build an ICI for Arctic sites, the system had to be engineered to overcome the challenges of this environment. A particular challenge was keeping the system calibration and data processing accurate through severe temperature changes. Another significant challenge was that weak emission from the cold, dry Arctic atmosphere pushed the camera used in the instrument to its operational limits. To gain an understanding of the operation of ICI systems in the Arctic and to gather critical data on Arctic clouds, a prototype Arctic ICI was deployed in Barrow, AK from July 2012 through July 2014. To understand the long-term operation of an ICI in the Arctic, a study was conducted of the ICI system accuracy in relation to co-located active and passive sensors. Understanding the operation of this system in the Arctic environment required careful characterization of the full optical system, including the lens, filter, and detector. Alternative data processing techniques using decision trees and support vector machines were studied to improve data accuracy and reduce dependence on auxiliary instrument data, and the resulting accuracy is reported here. The work described in this project was part of the effort to develop a fourth-generation ICI ready to be deployed in the Arctic.
This system will serve a critical role in developing our understanding of cloud cover in the Arctic, an important but poorly understood region of the world.
NASA Astrophysics Data System (ADS)
Klapa, Przemyslaw; Mitka, Bartosz; Zygmunt, Mariusz
2017-12-01
The terrestrial laser scanning technology has a wide spectrum of applications, from land surveying, civil engineering and architecture to archaeology. The technology is capable of obtaining, in a short time, accurate coordinates of points which represent the surface of objects. Scanning of buildings is therefore a process which ensures obtaining information on all structural elements of a building. The result is a point cloud consisting of millions of elements which are a perfect source of information on the object and its surroundings. The photogrammetric techniques allow documenting an object in high resolution in the form of orthophoto plans, serve as a basis for 2D documentation, or yield point clouds for objects and 3D modelling. Integration of photogrammetric data and TLS brings a new quality to surveying historic monuments. Historic monuments play an important cultural and historical role. Centuries-old buildings require constant renovation and preservation of their structural and visual invariability while maintaining the safety of people who use them. The full surveying process allows evaluating the actual condition of monuments and planning repairs and renovations. The huge sizes and specific types of historic monuments cause problems in obtaining reliable and full information on them. The TLS technology allows obtaining such information in a short time and is non-invasive. A point cloud is not only a basis for developing architectural and construction documentation or evaluating the actual condition of a building. It is also a real visualization of monuments and their entire environment. The saved image of an object's surface can be presented at any time and place. A cyclical TLS survey of historic monuments allows detecting structural changes and evaluating damage and changes that cause deformation of a monument's components.
The paper presents application of integrated photogrammetric data and TLS illustrated on an example of historic monuments from southern Poland. The cartographic materials are a basis for determining the actual condition of monuments and performing repair works. The materials also supplement the archive of monuments by means of recording the actual image of a monument in a virtual space.
Cloud Processing of Secondary Organic Aerosol from Isoprene and Methacrolein Photooxidation.
Giorio, Chiara; Monod, Anne; Brégonzio-Rozier, Lola; DeWitt, Helen Langley; Cazaunau, Mathieu; Temime-Roussel, Brice; Gratien, Aline; Michoud, Vincent; Pangui, Edouard; Ravier, Sylvain; Zielinski, Arthur T; Tapparo, Andrea; Vermeylen, Reinhilde; Claeys, Magda; Voisin, Didier; Kalberer, Markus; Doussin, Jean-François
2017-10-12
Aerosol-cloud interaction contributes the largest uncertainties in the estimation and interpretation of the Earth's changing energy budget. The present study explores experimentally the impacts of water condensation-evaporation events, mimicking processes occurring in atmospheric clouds, on the molecular composition of secondary organic aerosol (SOA) from the photooxidation of methacrolein. A range of on- and off-line mass spectrometry techniques were used to obtain a detailed chemical characterization of SOA formed in control experiments in dry conditions, in triphasic experiments simulating gas-particle-cloud droplet interactions (starting from dry conditions and from 60% relative humidity (RH)), and in bulk aqueous-phase experiments. We observed that cloud events trigger fast SOA formation accompanied by evaporative losses. These evaporative losses decreased SOA concentration in the simulation chamber by 25-32% upon RH increase, while aqueous SOA was found to be metastable and slowly evaporated after cloud dissipation. In the simulation chamber, SOA composition, measured with a high-resolution time-of-flight aerosol mass spectrometer, did not change during cloud events compared with high RH conditions (RH > 80%). In all experiments, off-line mass spectrometry techniques emphasize the critical role of 2-methylglyceric acid as a major product of isoprene chemistry, as an important contributor to the total SOA mass (15-20%) and as a key building block of oligomers found in the particulate phase. Interestingly, the comparison between the series of oligomers obtained from experiments performed under different conditions shows markedly different reactivity. In particular, long reaction times at high RH seem to create the conditions for aqueous-phase processing to occur in a more efficient manner than during two relatively short cloud events.
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Murphy, K. J.; Baynes, K.; Lynnes, C.
2016-12-01
With the volume of Earth observation data expanding rapidly, cloud computing is quickly changing the way Earth observation data is processed, analyzed, and visualized. The cloud infrastructure provides the flexibility to scale up to large volumes of data and handle high velocity data streams efficiently. Having freely available Earth observation data collocated on a cloud infrastructure creates opportunities for innovation and value-added data re-use in ways unforeseen by the original data provider. These innovations spur new industries and applications and spawn new scientific pathways that were previously limited due to data volume and computational infrastructure issues. NASA, in collaboration with Amazon, Google, and Microsoft, has jointly developed a set of recommendations to enable efficient transfer of Earth observation data from existing data systems to a cloud computing infrastructure. The purpose of these recommendations is to provide guidelines against which all data providers can evaluate existing data systems, and which can be used to resolve any issues uncovered so as to enable efficient search, access, and use of large volumes of data. Additionally, these guidelines ensure that all cloud providers utilize a common methodology for bulk-downloading data from data providers, preventing the data providers from having to build custom capabilities to meet the needs of individual cloud providers. The intent is to share these recommendations with other Federal agencies and organizations that serve Earth observation data. Additionally, the adoption of these recommendations will benefit data users interested in moving large volumes of data from data systems to any other location. These data users include the cloud providers, cloud users such as scientists, and other users working in a high performance computing environment who need to move large volumes of data.
Environmental risk perception from visual cues: the psychophysics of tornado risk perception
NASA Astrophysics Data System (ADS)
Dewitt, Barry; Fischhoff, Baruch; Davis, Alexander; Broomell, Stephen B.
2015-12-01
Lay judgments of environmental risks are central to both immediate decisions (e.g., taking shelter from a storm) and long-term ones (e.g., building in locations subject to storm surges). Using methods from quantitative psychology, we provide a general approach to studying lay perceptions of environmental risks. As a first application of these methods, we investigate a setting where lay decisions have not taken full advantage of advances in natural science understanding: tornado forecasts in the US and Canada. Because official forecasts are imperfect, members of the public must often evaluate the risks on their own, by checking environmental cues (such as cloud formations) before deciding whether to take protective action. We study lay perceptions of cloud formations, demonstrating an approach that could be applied to other environmental judgments. We use signal detection theory to analyse how well people can distinguish tornadic from non-tornadic clouds, and multidimensional scaling to determine how people make these judgments. We find that participants (N = 400 recruited from Amazon Mechanical Turk) have heuristics that generally serve them well, helping participants to separate tornadic from non-tornadic clouds, but which also lead them to misjudge the tornado risk of certain cloud types. The signal detection task revealed confusion regarding shelf clouds, mammatus clouds, and clouds with upper- and mid-level tornadic features, which the multidimensional scaling task suggested was the result of participants focusing on the darkness of the weather scene and the ease of discerning its features. We recommend procedures for training (e.g., for storm spotters) and communications (e.g., tornado warnings) that will reduce systematic misclassifications of tornadicity arising from observers’ reliance on otherwise useful heuristics.
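The signal detection analysis above rests on the sensitivity index d', which measures how well observers separate tornadic from non-tornadic clouds independently of response bias. A minimal sketch (the log-linear 0.5 correction and the toy counts are illustrative assumptions, not the study's data):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).

    Adding 0.5 to each cell (a log-linear correction) keeps the
    z-scores finite when a rate would be exactly 0 or 1.
    """
    hr = (hits + 0.5) / (hits + misses + 1)
    far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    return z(hr) - z(far)

# Hypothetical observer viewing 50 tornadic and 50 non-tornadic photos
dp = d_prime(hits=40, misses=10, false_alarms=15, correct_rejections=35)
# dp around 1.3: well above chance, far from perfect discrimination
```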
GIFT-Cloud: A data sharing and collaboration platform for medical imaging research.
Doel, Tom; Shakir, Dzhoshkun I; Pratt, Rosalind; Aertsen, Michael; Moggridge, James; Bellon, Erwin; David, Anna L; Deprest, Jan; Vercauteren, Tom; Ourselin, Sébastien
2017-02-01
Clinical imaging data are essential for developing research software for computer-aided diagnosis, treatment planning and image-guided surgery, yet existing systems are poorly suited for data sharing between healthcare and academia: research systems rarely provide an integrated approach for data exchange with clinicians; hospital systems are focused towards clinical patient care with limited access for external researchers; and safe haven environments are not well suited to algorithm development. We have established GIFT-Cloud, a data and medical image sharing platform, to meet the needs of GIFT-Surg, an international research collaboration that is developing novel imaging methods for fetal surgery. GIFT-Cloud also has general applicability to other areas of imaging research. GIFT-Cloud builds upon well-established cross-platform technologies. The Server provides secure anonymised data storage, direct web-based data access and a REST API for integrating external software. The Uploader provides automated on-site anonymisation, encryption and data upload. Gateways provide a seamless process for uploading medical data from clinical systems to the research server. GIFT-Cloud has been implemented in a multi-centre study for fetal medicine research. We present a case study of placental segmentation for pre-operative surgical planning, showing how GIFT-Cloud underpins the research and integrates with the clinical workflow. GIFT-Cloud simplifies the transfer of imaging data from clinical to research institutions, facilitating the development and validation of medical research software and the sharing of results back to the clinical partners. GIFT-Cloud supports collaboration between multiple healthcare and research institutions while satisfying the demands of patient confidentiality, data security and data ownership. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Pueschel, R. F.; Howard, S. D.; Foster, T. C.; Hallett, J.; Arnott, W. P.; Condon, Estelle P. (Technical Monitor)
1996-01-01
Whether cirrus clouds heat or cool the Earth-atmosphere system depends on the relative importance of the cloud shortwave albedo effect and the cloud thermal greenhouse effect. Both are determined by the distribution of ice condensate with cloud particle size. The microphysics instrument package flown aboard the NASA DC-8 in TOGA/COARE included an ice crystal replicator, a 2D Greyscale Cloud Particle Probe and a Forward Scattering Spectrometer Aerosol Probe. In combination, the electro-optical instruments permitted particle size measurements between 0.5 micrometer and 2.6 millimeter diameter. Ice crystal replicas were used to validate signals from the electro-optical instruments. Both optical and scanning electron microscopy were utilized to analyze aerosol and ice particle replicas between 0.1 micrometer and several hundred micrometers diameter. To first approximation, the combined aerosol-cloud particle spectrum in several clouds followed a power law N ∝ D^-2.5. Thus, large cloud particles carried most of the condensate mass, while small cloud and aerosol particles determined the surface area. The mechanism of formation of small particles is growth of (hygroscopic, possibly ocean-derived) aerosol particles along the Kohler curves. The concentration of small particles is higher and less variable in space and time, and their tropospheric residence time is longer, than those of large cloud particles because of lower sedimentation velocities. Small particles shift effective cloud particle radii to sizes much smaller than the mean diameter of the cloud particles. This causes an increase in shortwave reflectivity and IR emissivity, and a decrease in transmissivity. Occasionally, the cloud reflectivity increased with altitude (decreasing temperature) more strongly than did cloud emissivity, yielding enhanced radiative cooling at higher altitudes.
Thus, cirrus produced by deep convection in the tropics may be critical in controlling processes whereby energy from warm tropical oceans is injected to different levels in the atmosphere to subsequently influence not only tropical but mid-latitude climate.
Visualizing unstructured patient data for assessing diagnostic and therapeutic history.
Deng, Yihan; Denecke, Kerstin
2014-01-01
Having access to relevant patient data is crucial for clinical decision making. The data is often documented in unstructured texts and collected in the electronic health record. In this paper, we evaluate an approach to visualize information extracted from clinical documents by means of tag clouds. Tag clouds are generated using a bag-of-words approach and by exploiting part-of-speech tags. For a real-world data set comprising radiological reports, pathological reports and surgical operation reports, tag clouds are generated and a questionnaire-based study is conducted as evaluation. Feedback from the physicians shows that the tag cloud visualization is an effective and rapid approach to representing relevant parts of unstructured patient data. To handle the different medical narratives, we summarize several possible improvements based on the user feedback and evaluation results.
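A bag-of-words tag cloud reduces to counting content words and mapping counts to font sizes. The sketch below is a minimal illustration with an assumed stopword list and a fabricated toy report (not from the study's data set); the part-of-speech filtering the paper uses would replace the simple stopword test.

```python
from collections import Counter
import re

STOPWORDS = {"the", "of", "and", "a", "in", "with", "no", "is", "was"}

def tag_weights(text, max_tags=10, min_size=10, max_size=40):
    """Bag-of-words tag cloud: count non-stopword tokens and map
    counts linearly onto a font-size range."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS)
    top = counts.most_common(max_tags)
    if not top:
        return {}
    lo, hi = top[-1][1], top[0][1]
    span = max(hi - lo, 1)
    return {word: min_size + (c - lo) * (max_size - min_size) // span
            for word, c in top}

# Toy radiology report (fabricated for illustration)
report = ("CT thorax: nodule in left upper lobe. Nodule unchanged. "
          "No pleural effusion. Left lobe otherwise clear.")
weights = tag_weights(report)
# repeated clinical terms ("nodule", "left", "lobe") get the largest font
```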
Analog Building Blocks for Communications Modems.
1977-01-01
AD-A039 826. Electronic Communications Inc., St. Petersburg, FL. Analog Building Blocks for Communications Modems, B. Black, January 1977. Contract F33615-74-C-1120. Unclassified, report AFAL-TR-76-29. Keywords: avionics building-block modules, frequency synthesizer, Costas demodulator, amplifier, modem, frequency multiplier.
Trust and Relationship Building in Electronic Commerce.
ERIC Educational Resources Information Center
Papadopoulou, Panagiota; Andreou, Andreas; Kanellis, Panagiotis; Martakos, Drakoulis
2001-01-01
Discussion of the need for trust in electronic commerce to build customer relationships focuses on a model drawn from established theoretical work on trust and relationship marketing that highlights differences between traditional and electronic commerce. Considers how trust can be built into virtual environments. (Contains 50 references.)…
CIMEL Measurements of Zenith Radiances at the ARM Site
NASA Technical Reports Server (NTRS)
Marshak, Alexander; Wiscombe, Warren; Lau, William K. M. (Technical Monitor)
2002-01-01
Starting from October 1, 2001, the Cimel at the ARM Central Facility in Oklahoma has been switched to a new "cloud mode." This mode allows taking measurements of zenith radiance when the Sun is blocked by clouds: every 13 minutes, the Cimel points straight up and takes 10 measurements at 9-second intervals. The new Cimel mode has four filters, at 440, 670, 870 and 1020 nm. For cloudy conditions, the spectral contrast in surface albedo dominates over Rayleigh and aerosol effects; this makes normalized zenith radiances at 440 and 670 nm, as well as at 870 and 1020 nm, almost indistinguishable. We compare Cimel measurements with other ARM CART site instruments: the Multi-Filter Rotating Shadowband Radiometer (MFRSR), the Narrow Field of View (NFOV) sensor, and the Microwave Radiometer (MWR). Based on the Cimel and MFRSR 670 and 870 nm channels, we build a normalized difference cloud index (NDCI) for radiances and fluxes, respectively. The radiance NDCI from the Cimel and the flux NDCI from the MFRSR are compared with each other as well as with the cloud Liquid Water Path (LWP) retrieved from the MWR. Based on our theoretical calculations and preliminary data analysis, there is a good correlation between NDCIs and LWP for cloudy sky above green vegetation. Based on this correlation, an algorithm to retrieve cloud optical depth from NDCI is proposed.
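An NDCI built from two channels is a normalized difference, in the same spirit as the NDVI. The sketch below assumes the NDVI-like sign convention (870 nm minus 670 nm over their sum); the instrument team's exact definition may differ, and the radiance values are purely illustrative.

```python
def ndci(r670, r870):
    """Normalized difference cloud index from zenith radiances at
    670 and 870 nm, taken here in the NDVI-like form
    (R870 - R670) / (R870 + R670). Sign convention is an assumption."""
    return (r870 - r670) / (r870 + r670)

# Illustrative zenith radiances (arbitrary units) over green vegetation:
clear = ndci(r670=20.0, r870=60.0)        # strong spectral contrast
thick_cloud = ndci(r670=55.0, r870=60.0)  # contrast washed out by cloud
```

The index collapses toward zero as cloud optical depth grows and the surface's spectral contrast is washed out, which is what makes a retrieval of optical depth from NDCI plausible.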
MagCloud: magazine self-publishing for the long tail
NASA Astrophysics Data System (ADS)
Koh, Kok-Wei; Chatow, Ehud
2010-02-01
In June of 2008, Hewlett-Packard Labs launched MagCloud, a print-on-demand web service for magazine self-publishing. MagCloud enables anyone to publish their own magazine by simply uploading a PDF file to the site. There are no setup fees, minimum print runs, storage requirements or waste due to unsold magazines. Magazines are only printed when an order is placed, and are shipped directly to the end customer. In the course of building this web service, a number of technological challenges were encountered. In this paper, we will discuss these challenges and the methods used to overcome them. Perhaps the most important decision in enabling the successful launch of MagCloud was the choice to offer a single product. This simplified the PDF validation phase and streamlined the print fulfillment process such that orders can be printed, folded and trimmed in batches, rather than one-by-one. In a sense, MagCloud adopted the Ford Model T approach to manufacturing, where having just a single model with few or no options allows for efficiencies in the production line, enabling a lower product price and opening the market to a much larger customer base. This platform has resulted in a number of new niche publications - the long tail of publishing.
Externally fed star formation: a numerical study
NASA Astrophysics Data System (ADS)
Mohammadpour, Motahareh; Stahler, Steven W.
2013-08-01
We investigate, through a series of numerical calculations, the evolution of dense cores that are accreting external gas up to and beyond the point of star formation. Our model clouds are spherical, unmagnetized configurations with fixed outer boundaries, across which gas enters subsonically. When we start with any near-equilibrium state, we find that the cloud's internal velocity also remains subsonic for an extended period, in agreement with observations. However, the velocity becomes supersonic shortly before the star forms. Consequently, the accretion rate building up the protostar is much greater than the benchmark value c_s^3/G, where c_s is the sound speed in the dense core. This accretion spike would generate a higher luminosity than those seen in even the most embedded young stars. Moreover, we find that the region of supersonic infall surrounding the protostar races out to engulf much of the cloud, again in violation of the observations, which show infall to be spatially confined. Similar problematic results have been obtained by all other hydrodynamic simulations to date, regardless of the specific infall geometry or boundary conditions adopted. Low-mass star formation is evidently a quasi-static process, in which cloud gas moves inward subsonically until the birth of the star itself. We speculate that magnetic tension in the cloud's deep interior helps restrain the infall prior to this event.
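The benchmark accretion rate c_s^3/G quoted above can be evaluated numerically. A minimal sketch, assuming a 10 K molecular core with mean molecular weight 2.33 (a conventional choice, not stated in the abstract):

```python
import math

k_B   = 1.380649e-23   # Boltzmann constant, J/K
m_H   = 1.6726e-27     # hydrogen mass, kg
G     = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30       # solar mass, kg
YEAR  = 3.156e7        # seconds per year

T  = 10.0                                # K, assumed core temperature
cs = math.sqrt(k_B * T / (2.33 * m_H))   # isothermal sound speed, ~190 m/s
mdot = cs**3 / G                         # benchmark accretion rate, kg/s
print(f"c_s = {cs:.0f} m/s, c_s^3/G = {mdot * YEAR / M_sun:.1e} Msun/yr")
```

For these assumed conditions the benchmark works out to a few times 10^-6 solar masses per year, the familiar scale for low-mass star formation; the abstract's point is that the simulated accretion spike greatly exceeds this value.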
An open science cloud for scientific research
NASA Astrophysics Data System (ADS)
Jones, Bob
2016-04-01
The Helix Nebula initiative was presented at EGU 2013 (http://meetingorganizer.copernicus.org/EGU2013/EGU2013-1510-2.pdf) and has continued to expand with more research organisations, providers and services. The hybrid cloud model deployed by Helix Nebula has grown to become a viable approach for provisioning ICT services for research communities from both public and commercial service providers (http://dx.doi.org/10.5281/zenodo.16001). The relevance of this approach for all those communities facing societal challenges is explained in a recent EIROforum publication (http://dx.doi.org/10.5281/zenodo.34264). This presentation will describe how this model brings together a range of stakeholders to implement a common platform for data-intensive services that builds upon existing publicly funded e-infrastructures and commercial cloud services to promote open science. It explores the essential characteristics of a European Open Science Cloud if it is to address the big data needs of the latest generation of Research Infrastructures. The high-level architecture and key services, as well as the role of standards, are described. A governance and financial model is also described, together with the roles of the stakeholders, including commercial service providers and downstream business sectors, that will ensure a European Open Science Cloud can innovate, grow and be sustained beyond the current project cycles.
NASA Astrophysics Data System (ADS)
Diner, David
2010-05-01
The Multi-angle Imaging SpectroRadiometer (MISR) instrument has been collecting global Earth data from NASA's Terra satellite since February 2000. With its 9 along-track view angles, 4 spectral bands, intrinsic spatial resolution of 275 m, and stable radiometric and geometric calibration, no instrument combining MISR's attributes has previously flown in space, nor is a similar capability currently available on any other satellite platform. Multiangle imaging offers several tools for remote sensing of aerosol and cloud properties, including bidirectional reflectance and scattering measurements, stereoscopic pattern matching, time lapse sequencing, and potentially, optical tomography. Current data products from MISR employ several of these techniques. Observations of the intensity of scattered light as a function of view angle and wavelength provide accurate measures of aerosol optical depths (AOD) over land, including bright desert and urban source regions. Partitioning of AOD according to retrieved particle classification and incorporation of height information improves the relationship between AOD and surface PM2.5 (fine particulate matter, a regulated air pollutant), constituting an important step toward a satellite-based particulate pollution monitoring system. Stereoscopic cloud-top heights provide a unique metric for detecting interannual variability of clouds and exceptionally high quality and sensitivity for detection and height retrieval for low-level clouds. Using the several-minute time interval between camera views, MISR has enabled a pole-to-pole, height-resolved atmospheric wind measurement system. Stereo imagery also makes possible global measurement of the injection heights and advection speeds of smoke plumes, volcanic plumes, and dust clouds, for which a large database is now available.
To build upon what has been learned during the first decade of MISR observations, we are evaluating algorithm updates that not only refine retrieval accuracies but also include enhancements (e.g., finer spatial resolution) that would have been computationally prohibitive just ten years ago. In addition, we are developing technological building blocks for future sensors that enable broader spectral coverage, wider swath, and incorporation of high-accuracy polarimetric imaging. Prototype cameras incorporating photoelastic modulators have been constructed. To fully capitalize on the rich information content of the current and next-generation of multiangle imagers, several algorithmic paradigms currently employed need to be re-examined, e.g., the use of aerosol look-up tables, neglect of 3-D effects, and binary partitioning of the atmosphere into "cloudy" or "clear" designations. Examples of progress in algorithm and technology developments geared toward advanced application of multiangle imaging to remote sensing of aerosols and clouds will be presented.
NASA Technical Reports Server (NTRS)
Sadowy, Gregory; Tanelli, Simone; Chamberlain, Neil; Durden, Stephen; Fung, Andy; Sanchez-Barbetty, Mauricio; Thrivikraman, Tushar
2013-01-01
The National Research Council's "Earth Science Decadal Survey" (NRCDS) has identified the Aerosol/Climate/Ecosystems (ACE) Mission as a priority mission for NASA Earth science. The NRC recommended the inclusion of "a cross-track scanning cloud radar with channels at 94 GHz and possibly 34 GHz for measurement of cloud droplet size, glaciation height, and cloud height". Several radar concepts have been proposed that meet some of the requirements of the proposed ACE mission, but none have provided scanning capability at both 34 and 94 GHz because of the challenge of constructing scanning antennas at 94 GHz. In this paper, we will describe a radar design that leverages new developments in microwave monolithic integrated circuits (MMICs) and micro-machining to enable an electronically scanned radar with both Ka-band (35 GHz) and W-band (94 GHz) channels. This system uses a dual-frequency linear active electronically steered array (AESA) combined with a parabolic cylindrical reflector. This configuration provides a large aperture (3 m x 5 m) with electronic steering but is much simpler than a two-dimensional AESA of similar size. Still, the W-band frequency requires element spacing of approximately 2.5 mm, presenting significant challenges for signal routing and incorporation of MMICs. By combining gallium nitride (GaN) MMIC technology with micro-machined radiators and interconnects and silicon-germanium (SiGe) beamforming MMICs, we are able to meet all the performance and packaging requirements of the linear array feed and enable simultaneous scanning of the Ka-band and W-band radars over a swath of up to 100 km.
Terai, C. R.; Klein, S. A.; Zelinka, M. D.
2016-08-26
The increase in cloud optical depth with warming at middle and high latitudes is a robust cloud feedback response found across all climate models. This study builds on results that suggest the optical depth response to temperature is timescale invariant for low-level clouds. The timescale invariance allows one to use satellite observations to constrain the models' optical depth feedbacks. Three passive-sensor satellite retrievals are compared against simulations from eight models from the Atmosphere Model Intercomparison Project (AMIP) of the 5th Coupled Model Intercomparison Project (CMIP5). This study confirms that the low-cloud optical depth response is timescale invariant in the AMIP simulations, generally at latitudes higher than 40°. Compared to satellite estimates, most models overestimate the increase in optical depth with warming at the monthly and interannual timescales. Many models also do not capture the increase in optical depth with estimated inversion strength that is found in all three satellite observations and in previous studies. The discrepancy between models and satellites exists in both hemispheres and in most months of the year. A simple replacement of the models' optical depth sensitivities with the satellites' sensitivities reduces the negative shortwave cloud feedback by at least 50% in the 40°–70°S latitude band and by at least 65% in the 40°–70°N latitude band. Furthermore, based on this analysis of satellite observations, we conclude that the low-cloud optical depth feedback at middle and high latitudes is likely too negative in climate models.
Extended field observations of cirrus clouds using a ground-based cloud observing system
NASA Technical Reports Server (NTRS)
Ackerman, Thomas P.
1994-01-01
The evolution of synoptic-scale dynamics associated with a middle and upper tropospheric cloud event that occurred on 26 November 1991 is examined. The case under consideration occurred during the FIRE CIRRUS-II Intensive Field Observing Period held in Coffeyville, KS during Nov. and Dec., 1991. Using data from the wind profiler demonstration network and a temporally and spatially augmented radiosonde array, emphasis is given to explaining the evolution of the kinematically-derived ageostrophic vertical circulations and correlating the circulation with the forcing of an extensively sampled cloud field. This is facilitated by decomposing the horizontal divergence into its component parts through a natural coordinate representation of the flow. Ageostrophic vertical circulations are inferred and compared to the circulation forcing arising from geostrophic confluence and shearing deformation derived from the Sawyer-Eliassen Equation. It is found that a thermodynamically indirect vertical circulation existed in association with a jet streak exit region. The circulation was displaced to the cyclonic side of the jet axis due to the orientation of the jet exit between a deepening diffluent trough and building ridge. The cloud line formed in the ascending branch of the vertical circulation with the most concentrated cloud development occurring in conjunction with the maximum large-scale vertical motion. The relationship between the large scale dynamics and the parameterization of middle and upper tropospheric clouds in large-scale models is discussed and an example of ice water contents derived from a parameterization forced by the diagnosed vertical motions and observed water vapor contents is presented.
a Point Cloud Classification Approach Based on Vertical Structures of Ground Objects
NASA Astrophysics Data System (ADS)
Zhao, Y.; Hu, Q.; Hu, W.
2018-04-01
This paper proposes a novel method for point cloud classification using vertical structural characteristics of ground objects. Since urbanization is developing rapidly, urban ground objects also change frequently. Conventional photogrammetric methods cannot satisfy the requirement of updating ground-object information efficiently, so LiDAR (Light Detection and Ranging) technology is employed to accomplish this task. LiDAR data, namely point cloud data, provide detailed three-dimensional coordinates of ground objects, but this kind of data is discrete and unorganized. To accomplish ground-object classification with point clouds, we first construct horizontal grids and vertical layers to organize the point cloud data, then calculate vertical characteristics, including density and measures of dispersion, and form a characteristic curve for each grid. With the help of PCA processing and the K-means algorithm, we analyze the similarities and differences of the characteristic curves. Curves with similar features are classified into the same class, and the point clouds corresponding to these curves are classified accordingly. The whole process is simple but effective, and this approach does not need the assistance of other data sources. In this study, point cloud data are classified into three classes: vegetation, buildings, and roads. When the horizontal grid spacing and vertical layer spacing are 3 m and 1 m respectively, the vertical characteristic is set as density, and the number of dimensions after PCA processing is 11, the overall precision of the classification result is about 86.31%. The result can help us quickly understand the distribution of various ground objects.
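The grid/layer/PCA/K-means pipeline described above can be sketched in a few NumPy functions. This is a minimal illustration under assumed parameters (3 m grid, 1 m layers, K = 3, density as the vertical characteristic), not the authors' implementation:

```python
import numpy as np

def vertical_curves(xyz, grid=3.0, layer=1.0, n_layers=30):
    """Vertical point-density curve (normalized counts per layer) per horizontal cell."""
    ij = np.floor(xyz[:, :2] / grid).astype(int)                  # horizontal cell index
    z = np.clip((xyz[:, 2] - xyz[:, 2].min()) / layer,
                0, n_layers - 1).astype(int)                      # vertical layer index
    cells, inv = np.unique(ij, axis=0, return_inverse=True)
    curves = np.zeros((len(cells), n_layers))
    np.add.at(curves, (inv, z), 1.0)                              # count points per (cell, layer)
    return cells, curves / curves.sum(axis=1, keepdims=True)

def pca(X, n_components=11):
    """Project rows onto the leading principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def kmeans(X, k=3, iters=50, seed=0):
    """Plain Lloyd's k-means; returns one cluster label per row."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels
```

Cells whose vertical density curves cluster together would then be mapped to vegetation, buildings, or roads; the mapping from cluster to class label requires ground truth and is outside this sketch.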
NASA Astrophysics Data System (ADS)
Connell, P. H.
2017-12-01
The University of Valencia has developed a software simulator, LEPTRACK, to simulate lepton and photon scattering in any kind of medium with variable density, permeated by electric/magnetic fields of any geometry, and able to handle an exponential runaway avalanche. Here we show results of simulating the interaction of electrons/positrons/photons in an incoming TeV cosmic ray shower with the kind of electric fields expected in a storm cloud after a cloud-to-ground (CG) discharge, which removes much of the positive charge build-up at the centre of the cloud. The point is to show not just a Relativistic Runaway Electron Avalanche (RREA) above the upper negative shielding layer at 12 km but also other gamma-ray emission due to electron/positron interaction in the remaining positive charge around 9 km and the lower negative charge at 6 km altitude. We present here images, light curves, altitude profiles, spectra and videos showing the different ionization, excitation and photon density fields produced, their time evolution, and how they depend critically on where the cosmic ray shower beam intercepts the electric field geometry. We also show a new effect of incoming positrons, which make up a significant fraction of the shower: they appear to "orbit" within the high-altitude negative shielding layer, which has been conjectured to produce significant microwave emission, as well as a short-range 511 keV annihilation line. The interesting question is whether this conjectured emission can be observed and correlated with TGF orbital observations to prove that a TGF originates in the macro-fields of storm clouds or the micro-fields of lightning leaders and streamers, where this "positron orbiting" is not likely to occur.
Who Needs Lewis Structures to Get VSEPR Geometries?
ERIC Educational Resources Information Center
Lindmark, Alan F.
2010-01-01
Teaching the VSEPR (valence shell electron-pair repulsion) model can be a tedious process. Traditionally, Lewis structures are drawn and the number of "electron clouds" (groups) around the central atom are counted and related to the standard VSEPR table of possible geometries. A simpler method to deduce the VSEPR structure without first drawing…
NASA Technical Reports Server (NTRS)
Wanjek, Christopher
2003-01-01
The CMB polarization was produced as light scattered off a primordial cloud of protons and electrons nearly 14 billion years ago, about 400,000 years after the Big Bang. This marks the moment of recombination, when the universe finally cooled enough to allow electrons to join protons. The CMB is the light that broke through the fog.
DeGaspari, John
2011-10-01
CIOs are hard at work coming up with the most effective and affordable strategies for protecting electronic data as their hospitals move forward on electronic medical records. While the rise of cloud computing and declining network costs are offering new opportunities in dealing with potential disasters, many find there is no substitute for good planning and constant testing.
Plasma waves associated with the first AMPTE magnetotail barium release
NASA Technical Reports Server (NTRS)
Gurnett, D. A.; Anderson, R. R.; Bernhardt, P. A.; Luehr, H.; Haerendel, G.
1986-01-01
Plasma waves observed during the March 21, 1985, AMPTE magnetotail barium release are described. Electron plasma oscillations provided local measurements of the plasma density during both the expansion and decay phases. Immediately after the explosion, the electron density reached a peak of about 400,000 cm^-3, and then started decreasing approximately as t^-2.4 as the cloud expanded. About 6 minutes after the explosion, the electron density suddenly began to increase, reached a secondary peak of about 240 cm^-3, and then slowly decayed down to the preevent level over a period of about 15 minutes. The density increase is believed to be caused by the collapse of the ion cloud into the diamagnetic cavity created by the initial expansion. The plasma wave intensities observed during the entire event were quite low. In the diamagnetic cavity, electrostatic emissions were observed near the barium ion plasma frequency, and in another band at lower frequencies. A broadband burst of electrostatic noise was also observed at the boundary of the diamagnetic cavity. Except for electron plasma oscillations, no significant wave activity was observed outside of the diamagnetic cavity.
Electron Identification and Energy Measurement with Emulsion Cloud Chamber
NASA Astrophysics Data System (ADS)
Kitagawa, Nobuko; Komatsu, Masahiro
Charged particles undergo Multiple Coulomb Scattering (MCS) when passing through a material, and their momentum can be estimated directly from the distribution of the scattering angle. The scattering angle of electrons (or positrons) changes strongly because of bremsstrahlung energy loss, a feature that distinguishes them from other charged particles. Electron energy is generally measured by counting electromagnetic shower (e.m. shower) tracks in an Emulsion Cloud Chamber (ECC), so enough absorber material is needed to develop the shower. In the range from sub-GeV to a few GeV, electrons do not develop noticeable showers. In order to estimate the energy of electrons in this range with a limited amount of material, we established a new method based on the scattering angle that accounts for the energy loss in bremsstrahlung. From Monte Carlo (MC) data, generated by exposing an ECC to electron beams of 0.5 GeV, 1 GeV and 2 GeV, we derived the correlation between energy and scattering angle in each emulsion layer. We fixed the function and its parameters so that the 1 GeV MC sample returns 1 GeV as the central value, then applied them to the 0.5 GeV and 2 GeV samples and confirmed an energy resolution of about 50% within two radiation lengths.
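The angular method behind such estimates rests on the standard Highland relation between the RMS multiple-scattering angle and momentum. The sketch below shows that relation and its inversion; note the authors' method goes further and corrects for the bremsstrahlung energy loss of electrons, which the plain formula ignores.

```python
import math

def highland_theta0(p_mev, x_over_x0, beta=1.0):
    """RMS plane scattering angle (rad) for momentum p (MeV/c) over x/X0
    radiation lengths, per the Highland parameterization."""
    return 13.6 / (beta * p_mev) * math.sqrt(x_over_x0) * (
        1.0 + 0.038 * math.log(x_over_x0))

def momentum_from_theta0(theta0, x_over_x0, beta=1.0):
    """Invert the Highland formula: estimate momentum (MeV/c) from the
    measured RMS scattering angle over x/X0 radiation lengths."""
    return 13.6 / (beta * theta0) * math.sqrt(x_over_x0) * (
        1.0 + 0.038 * math.log(x_over_x0))
```

For example, a 1 GeV/c particle traversing 0.1 radiation lengths scatters by a few milliradians RMS, and larger measured angles map to lower momentum; for electrons the extra angle growth from radiative energy loss is exactly the feature the ECC method exploits.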
A relativistic neutron fireball from a supernova explosion as a possible source of chiral influence.
Gusev, G A; Saito, T; Tsarev, V A; Uryson, A V
2007-06-01
We elaborate on a previously proposed idea that polarized electrons produced from neutrons, released in a supernova (SN) explosion, can cause chiral dissymmetry of molecules in interstellar gas-dust clouds. A specific physical mechanism of a relativistic neutron fireball with Lorentz factor of the order of 100 is assumed for propelling a great number of free neutrons outside the dense SN shell. A relativistic chiral electron-proton plasma, produced from neutron decays, is slowed down owing to collective effects in the interstellar plasma. As collective effects do not involve the particle spin, the electrons can carry their helicities to the cloud. The estimates show high chiral efficiency of such electrons. In addition to this mechanism, production of circularly polarized ultraviolet photons through polarized-electron bremsstrahlung at an early stage of the fireball evolution is considered. It is shown that these photons can escape from the fireball plasma. However, for an average density of neutrals in the interstellar medium of the order of 0.2 cm^-3 and at distances of the order of 10 pc from the SN, these photons will be absorbed, with a transmission factor of about 10^-7, due to the photoelectric effect. In this case, their chiral efficiency will be about five orders of magnitude less than that for polarized electrons.
NASA Technical Reports Server (NTRS)
Chenette, D. L.; Stone, E. C.
1983-01-01
An analysis of the electron-absorption signature observed by the cosmic-ray system on Voyager 2 near the orbit of Mimas is presented. It is found that these observations cannot be explained as the absorption signature of Mimas. By combining Pioneer 11 and Voyager 2 measurements of the electron flux at Mimas's orbit (L = 3.1), an electron spectrum is found in which most of the flux above about 100 keV is concentrated near 1 to 3 MeV. This spectral form is qualitatively consistent with the bandpass filter model of Van Allen et al. (1980). The expected Mimas absorption signature is calculated from this spectrum neglecting radial diffusion. Since no Mimas absorption signature was observed in the inbound Voyager 2 data, a lower limit on the diffusion coefficient for MeV electrons at L = 3.1 of D greater than 10^-8 Saturn radii squared per second is obtained. With a diffusion coefficient this large, both the Voyager 2 and the Pioneer 11 small-scale electron-absorption-signature observations in Mimas's orbit are enigmatic. Thus the mechanism for producing these signatures is referred to as the Mimas ghost. A cloud of material in orbit with Mimas may account for the observed electron signature if the cloud is at least 1-percent opaque to electrons across a region extending over a few hundred kilometers.
NASA Astrophysics Data System (ADS)
Baumann, Thomas M.; Lapierre, Alain; Kittimanapun, Kritsada; Schwarz, Stefan; Leitner, Daniela; Bollen, Georg
2014-07-01
The Electron Beam Ion Trap (EBIT) of the National Superconducting Cyclotron Laboratory at Michigan State University is used as a charge booster and injector for the currently commissioned rare isotope re-accelerator facility ReA. This EBIT charge breeder is equipped with a unique superconducting magnet configuration, a combination of a solenoid and a pair of Helmholtz coils, allowing for a direct observation of the ion cloud while maintaining the advantages of a long ion trapping region. The current density of its electron beam is a key factor for efficient capture and fast charge breeding of continuously injected, short-lived isotope beams. It depends on the radius of the magnetically compressed electron beam. This radius is measured by imaging the highly charged ion cloud trapped within the electron beam with a pinhole camera, which is sensitive to X-rays emitted by the ions with photon energies between 2 keV and 10 keV. The 80% radius of a cylindrical 800 mA electron beam with an energy of 15 keV is determined to be r_80% = (212 ± 19) μm in a 4 T magnetic field. From this, a current density of j = (454 ± 83) A/cm^2 is derived. These results are in good agreement with electron beam trajectory simulations performed with TriComp and serve as a test for future electron gun design developments.
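The quoted current density is consistent with the measured 80% radius. A quick check, assuming (as the 80%-radius convention suggests) that 80% of the beam current flows within that radius, so j = 0.8 I / (π r^2):

```python
import math

I = 0.800      # A, electron beam current (800 mA)
r = 212e-4     # cm, measured 80% radius (212 um)

# Current density assuming 80% of the current lies within r_80%.
j = 0.8 * I / (math.pi * r**2)
print(f"j = {j:.0f} A/cm^2")   # ~453 A/cm^2, within the quoted (454 +/- 83)
```

The central value of 454 A/cm^2 quoted above is reproduced to well within the stated uncertainty.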
High-energy radiation from collisions of high-velocity clouds and the Galactic disc
NASA Astrophysics Data System (ADS)
del Valle, Maria V.; Müller, A. L.; Romero, G. E.
2018-04-01
High-velocity clouds (HVCs) are interstellar clouds of atomic hydrogen that do not follow normal Galactic rotation and have velocities of several hundred kilometres per second. A considerable number of these clouds are falling down towards the Galactic disc. HVCs form large and massive complexes, so if they collide with the disc a great amount of energy would be released into the interstellar medium. The cloud-disc interaction produces two shocks: one propagates through the cloud and the other through the disc. The properties of these shocks depend mainly on the cloud velocity and the disc-cloud density ratio. In this work, we study the conditions necessary for these shocks to accelerate particles by diffusive shock acceleration and we study the non-thermal radiation that is produced. We analyse particle acceleration in both the cloud and disc shocks. Solving a time-dependent two-dimensional transport equation for both relativistic electrons and protons, we obtain particle distributions and non-thermal spectral energy distributions. In a shocked cloud, significant synchrotron radio emission is produced along with soft gamma rays. In the case of acceleration in the shocked disc, the non-thermal radiation is stronger; the gamma rays, of leptonic origin, might be detectable with current instruments. A large number of protons are injected into the Galactic interstellar medium, and locally exceed the cosmic ray background. We conclude that under adequate conditions the contribution from HVC-disc collisions to the galactic population of relativistic particles and the associated extended non-thermal radiation might be important.
NASA Technical Reports Server (NTRS)
Hastings, D. E.; Gatsonis, N. A.; Rivas, D. A.
1988-01-01
Plasma contactors have been proposed as a means of making good electrical contact between biased surfaces, such as those found at the ends of an electrodynamic tether, and the space environment. A plasma contactor is a plasma source that emits a plasma cloud which facilitates the electrical connection. The physics of this plasma cloud is investigated for contactors used as electron collectors, and it is shown that contactor clouds in space will consist of a spherical core possibly containing a shock wave. Outside of the core the cloud will expand anisotropically across the magnetic field, leading to a turbulent, cigar-shaped structure along the field. This outer region is itself divided into two regions by the ion response to the electric field. A two-dimensional theory of the motion of the cloud across the magnetic field is developed. The current-voltage characteristic of an argon plasma contactor cloud is estimated for several ion currents in the range of 1-100 amperes. It is shown that small ion current contactors are more efficient than large ion current contactors. This suggests that if a plasma contactor is used on an electrodynamic tether, then a multiple-tether array will be more efficient than a single tether.
Interstellar molecules and dense clouds.
NASA Technical Reports Server (NTRS)
Rank, D. M.; Townes, C. H.; Welch, W. J.
1971-01-01
Current knowledge of the interstellar medium is discussed on the basis of recent published studies. The subjects considered include optical identification of interstellar molecules, radio molecular lines, interstellar clouds, isotopic abundances, formation and disappearance of interstellar molecules, and interstellar probing techniques. Diagrams are plotted for the distribution of galactic sources exhibiting molecular lines, for hydrogen molecule, hydrogen atom and electron abundances due to ionization, for the densities, velocities and temperature of NH3 in the direction of Sagittarius B2, for the lower rotational energy levels of H2CO, and for temporal spectral variations in masing H2O clouds of the radio source W49. Future applications of the maser and of molecular microscopy in this field are visualized.
Complementarity of Historic Building Information Modelling and Geographic Information Systems
NASA Astrophysics Data System (ADS)
Yang, X.; Koehl, M.; Grussenmeyer, P.; Macher, H.
2016-06-01
In this paper, we discuss the potential of integrating semantically rich models from Building Information Modelling (BIM) and Geographical Information Systems (GIS) to build detailed 3D historic models. BIM contributes to the creation of a digital representation having all physical and functional building characteristics in several dimensions, e.g. XYZ (3D), time and non-architectural information, that are necessary for the construction and management of buildings. GIS has potential in handling and managing spatial data, especially in exploring spatial relationships, and is widely used in urban modelling. However, when considering heritage modelling, the specificity of irregular historical components makes it problematic to create the enriched model according to its complex architectural elements obtained from point clouds. Therefore, some open issues limiting historic building 3D modelling will be discussed in this paper: how to deal with the complex elements composing historic buildings in the BIM and GIS environments, how to build the enriched historic model, and why construct different levels of detail? By solving these problems, conceptualization, documentation and analysis of enriched Historic Building Information Modelling are developed and compared to traditional 3D models aimed primarily at visualization.
Cloud Based Earth Observation Data Exploitation Platforms
NASA Astrophysics Data System (ADS)
Romeo, A.; Pinto, S.; Loekken, S.; Marin, A.
2017-12-01
In the last few years, data produced daily by several private and public Earth Observation (EO) satellites reached the order of tens of Terabytes, representing for scientists and commercial application developers both a big opportunity for their exploitation and a challenge for their management. New IT technologies, such as Big Data and cloud computing, enable the creation of web-accessible data exploitation platforms, which offer scientists and application developers the means to access and use EO data in a quick and cost effective way. RHEA Group is particularly active in this sector, supporting the European Space Agency (ESA) in the Exploitation Platforms (EP) initiative, developing technology to build multi-cloud platforms for the processing and analysis of Earth Observation data, and collaborating with larger European initiatives such as the European Plate Observing System (EPOS) and the European Open Science Cloud (EOSC). An EP is a virtual workspace, providing a user community with access to (i) large volumes of data, (ii) an algorithm development and integration environment, (iii) processing software and services (e.g. toolboxes, visualization routines), (iv) computing resources, (v) collaboration tools (e.g. forums, wiki, etc.). When an EP is dedicated to a specific Theme, it becomes a Thematic Exploitation Platform (TEP). Currently, ESA has seven TEPs in a pre-operational phase dedicated to geo-hazards monitoring and prevention, coastal zones, forestry areas, hydrology, polar regions, urban areas and food security. On the technology development side, solutions like the multi-cloud EO data processing platform provide the technology to integrate ICT resources and EO data from different vendors in a single platform. In particular, it offers (i) Multi-cloud data discovery, (ii) Multi-cloud data management and access and (iii) Multi-cloud application deployment.
This platform has been demonstrated with the EGI Federated Cloud, Innovation Platform Testbed Poland and the Amazon Web Services cloud. This work will present an overview of the TEPs and the multi-cloud EO data processing platform, and discuss their main achievements and their impacts in the context of distributed Research Infrastructures such as EPOS and EOSC.
Cloud Based Electronic Health Record Applications are Essential to Expeditionary Patient Care
2017-05-01
Kondoh, Hiroshi; Teramoto, Kei; Kawai, Tatsurou; Mochida, Maki; Nishimura, Motohiro
2013-01-01
The newly developed Oshidori-Net2 provides medical professionals with remote access to the electronic patient record systems (EPR) and PACSs of four hospitals, from different vendors, using cloud computing technology and a patient identifier cross-reference manager. Operation started in April 2012, and the system was applied to patients transferred between the hospitals. The objective is to show the merits and demerits of the new system.
Lost in the Cloud - New Challenges for Teaching GIS
NASA Astrophysics Data System (ADS)
Bellman, C. J.; Pupedis, G.
2016-06-01
As cloud based services move towards becoming the dominant paradigm in many areas of information technology, GIS has also moved into `the Cloud', creating new opportunities for professionals and students alike, while at the same time presenting a range of new challenges for GIS educators. Learning for many students in the geospatial science disciplines has been based on desktop GIS software, building their skills from basic data handling and manipulation to advanced spatial analysis and database storage. Cloud-based systems challenge this paradigm in many ways: some of these skills are replaced by clever and capable software tools, while the ubiquitous nature of the computing environment offers access and processing from anywhere, on any device. This paper describes our experiences over the past two years in developing and delivering a new course incorporating cloud based technologies for GIS and illustrates the many benefits and pitfalls of a cloud based approach to teaching. Throughout the course, students were encouraged to provide regular feedback through online journals. This allowed students to critique the teaching approach and the learning materials, and to describe their own level of comfort and engagement with the material in an honest and non-confrontational manner. Many of the students did not have a strong information technology background, and the journals provided great insight into their views and the challenges they faced in mastering this technology.
Centralized Duplicate Removal Video Storage System with Privacy Preservation in IoT.
Yan, Hongyang; Li, Xuan; Wang, Yu; Jia, Chunfu
2018-06-04
In recent years, the Internet of Things (IoT) has found wide application and attracted much attention. Since most of the end-terminals in IoT have limited capabilities for storage and computing, it has become a trend to outsource the data from local to cloud computing. To further reduce the communication bandwidth and storage space, data deduplication has been widely adopted to eliminate the redundant data. However, since data collected in IoT are sensitive and closely related to users' personal information, the privacy protection of users' information becomes a challenge. As the channels, like the wireless channels between the terminals and the cloud servers in IoT, are public and the cloud servers are not fully trusted, data have to be encrypted before being uploaded to the cloud. However, encryption makes the performance of deduplication by the cloud server difficult because the ciphertext will be different even if the underlying plaintext is identical. In this paper, we build a centralized privacy-preserving duplicate removal storage system, which supports both file-level and block-level deduplication. In order to avoid the leakage of statistical information of data, Intel Software Guard Extensions (SGX) technology is utilized to protect the deduplication process on the cloud server. The results of the experimental analysis demonstrate that the new scheme can significantly improve the deduplication efficiency and enhance the security. It is envisioned that the duplicated removal system with privacy preservation will be of great use in the centralized storage environment of IoT.
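The tension described above — deduplication needs identical inputs to yield identical stored objects, while conventional encryption randomizes ciphertext — is conventionally resolved with convergent encryption, where the key is derived from the content itself. The paper's SGX-protected scheme is more elaborate; the sketch below shows only the standard convergent-encryption idea, and the `DedupStore` class and hash-based toy stream cipher are our own illustrative constructions, not the paper's protocol:

```python
import hashlib

def convergent_key(data: bytes) -> bytes:
    # Key derived from the content itself, so identical plaintexts
    # always encrypt to identical ciphertexts (enabling deduplication).
    return hashlib.sha256(data).digest()

def keystream_encrypt(data: bytes, key: bytes) -> bytes:
    # Toy deterministic stream cipher for illustration only;
    # a real system would use AES in a deterministic mode.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

class DedupStore:
    """Block-level deduplicating store: the server indexes ciphertext
    by its hash and keeps only one copy of each unique block."""
    def __init__(self, block_size=4):
        self.block_size = block_size
        self.blocks = {}  # ciphertext-hash -> ciphertext

    def put(self, data: bytes):
        refs = []
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            ct = keystream_encrypt(block, convergent_key(block))
            tag = hashlib.sha256(ct).hexdigest()
            self.blocks.setdefault(tag, ct)  # duplicate blocks stored once
            refs.append(tag)
        return refs
```

A real deployment would additionally protect the tag-comparison step inside the SGX enclave, as the paper proposes, to avoid leaking to the server which blocks are duplicates.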
Virtual Facility at Fermilab: Infrastructure and Services Expand to Public Clouds
Timm, Steve; Garzoglio, Gabriele; Cooper, Glenn; ...
2016-02-18
In preparation for its new Virtual Facility Project, Fermilab has launched a program of work to determine the requirements for running a computation facility on-site, in public clouds, or a combination of both. This program builds on the work we have done to successfully run experimental workflows of 1000-VM scale both on an on-site private cloud and on Amazon AWS. To do this at scale we deployed dynamically launched and discovered caching services on the cloud. We are now testing the deployment of more complicated services on Amazon AWS using native load balancing and auto scaling features they provide. The Virtual Facility Project will design and develop a facility including infrastructure and services that can live on the site of Fermilab, off-site, or a combination of both. We expect to need this capacity to meet the peak computing requirements in the future. The Virtual Facility is intended to provision resources on the public cloud on behalf of the facility as a whole instead of having each experiment or Virtual Organization do it on their own. We will describe the policy aspects of a distributed Virtual Facility, the requirements, and plans to make a detailed comparison of the relative cost of the public and private clouds. Furthermore, this talk will present the details of the technical mechanisms we have developed to date, and the plans currently taking shape for a Virtual Facility at Fermilab.
Large-scale urban point cloud labeling and reconstruction
NASA Astrophysics Data System (ADS)
Zhang, Liqiang; Li, Zhuqiang; Li, Anjian; Liu, Fangyu
2018-04-01
The large number of object categories and many overlapping or closely neighboring objects in large-scale urban scenes pose great challenges in point cloud classification. In this paper, a novel framework is proposed for classification and reconstruction of airborne laser scanning point cloud data. To label point clouds, we present a rectified linear units neural network named ReLu-NN where the rectified linear units (ReLu) instead of the traditional sigmoid are taken as the activation function in order to speed up the convergence. Since the features of the point cloud are sparse, we reduce the number of neurons by the dropout to avoid over-fitting of the training process. The set of feature descriptors for each 3D point is encoded through self-taught learning, and forms a discriminative feature representation which is taken as the input of the ReLu-NN. The segmented building points are consolidated through an edge-aware point set resampling algorithm, and then they are reconstructed into 3D lightweight models using the 2.5D contouring method (Zhou and Neumann, 2010). Compared with deep learning approaches, the ReLu-NN introduced can easily classify unorganized point clouds without rasterizing the data, and it does not need a large number of training samples. Most of the parameters in the network are learned, and thus the intensive parameter tuning cost is significantly reduced. Experimental results on various datasets demonstrate that the proposed framework achieves better performance than other related algorithms in terms of classification accuracy and reconstruction quality.
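The two ingredients the paper highlights — ReLU activations instead of sigmoids, and dropout to curb over-fitting on sparse features — can be illustrated with a minimal dense-network forward pass. This is a generic sketch: the layer shapes and the inverted-dropout scaling are common textbook choices, not details taken from the paper's ReLu-NN.

```python
import random

def relu(x):
    # ReLU avoids the saturation of sigmoids, which speeds up convergence
    return [max(0.0, v) for v in x]

def dense(x, weights, bias):
    # one fully connected layer: y_i = sum_j w_ij * x_j + b_i
    return [sum(w * v for w, v in zip(row, x)) + b
            for row, b in zip(weights, bias)]

def dropout(x, p, rng, training=True):
    # randomly zero neurons during training to reduce over-fitting;
    # surviving activations are rescaled (inverted dropout)
    if not training:
        return x
    return [0.0 if rng.random() < p else v / (1.0 - p) for v in x]

def forward(x, layers, p=0.5, training=False, seed=0):
    rng = random.Random(seed)
    for i, (w, b) in enumerate(layers):
        x = dense(x, w, b)
        if i < len(layers) - 1:  # activation + dropout on hidden layers only
            x = dropout(relu(x), p, rng, training)
    return x
```

At inference time (`training=False`) the network is deterministic; dropout only perturbs the hidden activations during training.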
Automatic 3D Building Detection and Modeling from Airborne LiDAR Point Clouds
ERIC Educational Resources Information Center
Sun, Shaohui
2013-01-01
Urban reconstruction, with an emphasis on man-made structure modeling, is an active research area with broad impact on several potential applications. Urban reconstruction combines photogrammetry, remote sensing, computer vision, and computer graphics. Even though a huge volume of work has been done, many problems still remain…
ERIC Educational Resources Information Center
Association of College Unions International (NJ1), 2012
2012-01-01
This publication presents a collection of technology resources from the Association of College Unions International (ACUI) community. Contents include: (1) Podcasting (Jeff Lail); (2) Video Podcasting (Ed Cabellon); (3) Building a Multimedia Production Center (Nathan Byrer); (4) Cloud Computing in the Student Union and Student Activities (TJ…
Reconstructing evolutionary trees in parallel for massive sequences.
Zou, Quan; Wan, Shixiang; Zeng, Xiangxiang; Ma, Zhanshan Sam
2017-12-14
Building evolutionary trees for massive unaligned DNA sequences is crucial but challenging, and reconstructing trees for ultra-large sequence sets is hard. Massive multiple sequence alignment is likewise challenging and time- and space-consuming. Hadoop and Spark, developed recently, bring new opportunities to these classical computational biology problems. In this paper, we solve multiple sequence alignment and evolutionary reconstruction in parallel. HPTree, the tool developed in this paper, can process big DNA sequence files quickly. It works well on files larger than 1 GB and achieves better performance than other evolutionary reconstruction tools. Users can run HPTree to reconstruct evolutionary trees on computer clusters or cloud platforms (e.g. Amazon cloud). HPTree can help with population evolution research and metagenomics analysis. In this paper, we employ the Hadoop and Spark platforms and design an evolutionary tree reconstruction software tool for unaligned massive DNA sequences. Clustering and multiple sequence alignment are done in parallel, and a neighbour-joining model is employed for evolutionary tree building. The software and source code are available at http://lab.malab.cn/soft/HPtree/ .
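HPTree's internals are not given in the abstract, but the general pattern — compute pairwise sequence distances in parallel, then feed the distance matrix to a tree-building step such as neighbour-joining — can be sketched with an alignment-free k-mer distance. The Jaccard measure and the thread pool below are our illustrative choices, not HPTree's actual algorithm:

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import combinations

def kmer_set(seq, k=3):
    # alignment-free profile: the set of all length-k substrings
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def pair_distance(pair):
    (i, a), (j, b) = pair
    sa, sb = kmer_set(a), kmer_set(b)
    # Jaccard distance between k-mer sets: cheap, alignment-free
    return i, j, 1.0 - len(sa & sb) / len(sa | sb)

def distance_matrix(seqs, workers=4):
    pairs = list(combinations(enumerate(seqs), 2))
    n = len(seqs)
    dm = [[0.0] * n for _ in range(n)]
    # each pair is independent, so the work parallelizes trivially —
    # the same property Hadoop/Spark exploit at cluster scale
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for i, j, d in pool.map(pair_distance, pairs):
            dm[i][j] = dm[j][i] = d
    return dm
```

The resulting symmetric matrix is what a neighbour-joining implementation would consume as input.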
NASA Astrophysics Data System (ADS)
Santagati, Cettina; Lo Turco, Massimiliano
2017-01-01
In recent years, we have witnessed a huge diffusion of building information modeling (BIM) approaches in the field of architectural design, although very little research has explored the value, criticalities, and advantages of applying these methodologies in the cultural heritage domain. Furthermore, the latest developments in digital photogrammetry allow the easy generation of reliable, low-cost, three-dimensional textured models that could be used in BIM platforms to create semantic-aware objects composing a specific library of historical architectural elements. In this case, the transfer between the point cloud and its corresponding parametric model is not trivial, and the level of geometric abstraction may not suit the scope of the BIM. The aim of this paper is to explore and retrace the milestone works on this crucial topic in order to identify the unsolved issues, and to propose and test a simple, practitioner-centered workflow based on the latest available solutions for point cloud management in commercial BIM platforms.
Investigating the influence of volcanic sulfate aerosol on cloud properties Along A-Train tracks
NASA Astrophysics Data System (ADS)
Mace, G. G.
2017-12-01
Marine boundary layer (MBL) clouds are central actors in the climate system given their extensive coverage on the Earth's surface, their 1-way influence on the radiative balance (cooling), and their intimate coupling between air motions, anthropogenic and natural aerosol sources, and processes within the upper ocean mixed layer. Knowledge of how MBL shallow cumulus clouds respond to changes in aerosol is central to understanding how MBL clouds modulate the climate system. A frequent approach to investigating how sulfate aerosol influences MBL clouds has been to examine sulfate plumes extending downstream of active island volcanoes. This approach is challenging due to modification of the air motions in the plumes downstream of islands and due to the tendency of most researchers to examine only level-2 retrievals ignoring the actual data collected by sensors such as MODIS. Past studies have concluded that sulfate aerosols have large effects consistent with the 1st aerosol indirect effect (AIE). We reason that if such effects are as large as suggested in level-2 retrievals then evidence should also be present in the raw MODIS reflectance data as well as other data sources. In this paper we will build on our recently published work where we tested that hypothesis from data collected near Mount Kilauea during a 3-year period. Separating data into aerosol optical depth (A) quartiles, we found little support for a large 1st AIE response. We did find an unambiguous increase in sub 1km-scale cloud fraction with A. This increase in sub 1 km cloud fraction was entirely consistent with increased reflectance with increasing A that is used, via the level 2 retrievals, to argue for a large AIE response of MBL clouds. While the 1-km pixels became unambiguously brighter, that brightening was due to increased sub 1 km cloud fraction and not necessarily due to changes in pixel-level cloud microphysics. 
We also found that MBL cloud top heights and surface wind speeds increase as aerosol increases, while the radar reflectivity from CloudSat does not change, implying that increased aerosols may have invigorated the MBL clouds with little effect on precipitation. We have since expanded upon this initial analysis by examining data near other volcanic islands. These expanded results support our initial findings.
145. ELECTRONICS SHOP (BUILDING 60), FIRST FLOOR PLAN AND ELEVATION, ...
145. ELECTRONICS SHOP (BUILDING 60), FIRST FLOOR PLAN AND ELEVATION, CHARLES A. MAGUIRE, MARCH 25, 1952. PWD 10332. - Quonset Point Naval Air Station, Roger Williams Way, North Kingstown, Washington County, RI
An Interactive Web-Based Analysis Framework for Remote Sensing Cloud Computing
NASA Astrophysics Data System (ADS)
Wang, X. Z.; Zhang, H. M.; Zhao, J. H.; Lin, Q. H.; Zhou, Y. C.; Li, J. H.
2015-07-01
Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agricultural, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. Providing cloud users an effective way to access and analyse these massive spatiotemporal data in web clients has become an urgent issue. In this paper, we propose a new scalable, interactive and web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide the end-user with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users' private data is constructed based on an open-source distributed file system. In it, massive remote sensing data are stored as public data, while the intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, an open-source lightweight container technology for the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and GRASS GIS is deployed. Users can write scripts in the IPython Notebook web page through the web browser to process data, and the scripts will be submitted to the IPython kernel to be executed. By comparing the performance of remote sensing data analysis tasks executed in Docker containers, KVM virtual machines, and physical machines, we conclude that the cloud computing environment built with Docker makes the greatest use of the host system resources and can handle more concurrent spatial-temporal computing tasks. Docker technology provides resource isolation mechanisms for IO, CPU, and memory, which offers a security guarantee when processing remote sensing data in the IPython Notebook.
Users can write complex data processing code on the web directly, so they can design their own data processing algorithm.
Development of Software for a Lidar-Altimeter Processor
NASA Technical Reports Server (NTRS)
Rosenberg, Jacob S.; Trujillo, Carlos
2005-01-01
A report describes the development of software for a digital processor that operates in conjunction with a finite-impulse-response (FIR) chip in a spaceborne lidar altimeter. Processing is started by a laser-fire interrupt signal that is repeated at intervals of 25 ms. For the purpose of discriminating between returns from the ground and returns from such things as trees, buildings, and clouds, the software is required to scan digitized lidar-return data in reverse of the acquisition sequence in order to distinguish the last return pulse from within a commanded ground-return range window. The digitized waveform information within this range window is filtered through 6 matched filters, in the hardware electronics, in order to maximize the probability of finding echoes from sloped or rough terrain and minimize the probability of selecting cloud returns. From the data falling past the end of the range window, there is obtained a noise baseline that is used to calculate a threshold value for each filter. The data from each filter is analyzed by a complex weighting scheme and the filter with the greatest weight is selected. A region around the peak of the ground return pulse associated with the selected filter is placed in telemetry, as well as information on its location, height, and other characteristics. The software requires many uplinked parameters as input. Included in the report is a discussion of major software-development problems posed by the design of the FIR chip and the need for the software to complete its process within 20 ms to fit within the overall 25-ms cycle.
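The detection logic described in the report can be approximated in a few lines: estimate a noise threshold from samples beyond the range window, filter the waveform at several widths, and scan each filtered trace in reverse of the acquisition sequence to find the last pulse. This is a hedged reconstruction — the flight software's six matched filters and its weighting scheme are more elaborate, and the boxcar filters and excess-over-threshold weight below are our simplifications:

```python
def boxcar_filter(signal, width):
    # simple matched filter for a rectangular pulse: moving average
    return [sum(signal[i:i + width]) / width
            for i in range(len(signal) - width + 1)]

def noise_threshold(noise_samples, k=3.0):
    # threshold = mean + k * std of the noise baseline taken from
    # data falling past the end of the range window
    n = len(noise_samples)
    mean = sum(noise_samples) / n
    std = (sum((x - mean) ** 2 for x in noise_samples) / n) ** 0.5
    return mean + k * std

def last_return(signal, noise_samples, widths=(1, 2, 4, 8, 16, 32)):
    thr = noise_threshold(noise_samples)
    best = None
    for w in widths:
        filt = boxcar_filter(signal, w)
        # scan in reverse to favour the last (ground) return over
        # earlier echoes from trees, buildings, and clouds
        for i in range(len(filt) - 1, -1, -1):
            if filt[i] > thr:
                weight = filt[i] - thr  # toy stand-in for the weighting scheme
                if best is None or weight > best[0]:
                    best = (weight, w, i)
                break
    return best  # (weight, filter_width, sample_index), or None if no pulse
```

The multiple filter widths play the role of the hardware's matched-filter bank, trading sensitivity to broadened returns from sloped or rough terrain against rejection of noise.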
Meteoric Metal Chemistry in the Martian Atmosphere
Carrillo‐Sanchez, J. D.; Mangan, T. P.; Crismani, M. M. J.; Schneider, N. M.; Määttänen, A.
2018-01-01
Abstract Recent measurements by the Imaging Ultraviolet Spectrograph (IUVS) instrument on NASA's Mars Atmosphere and Volatile EvolutioN mission show that a persistent layer of Mg+ ions occurs around 90 km in the Martian atmosphere but that neutral Mg atoms are not detectable. These observations can be satisfactorily modeled with a global meteoric ablation rate of 0.06 t sol−1, out of a cosmic dust input of 2.7 ± 1.6 t sol−1. The absence of detectable Mg at 90 km requires that at least 50% of the ablating Mg atoms ionize through hyperthermal collisions with CO2 molecules. Dissociative recombination of MgO+.(CO2)n cluster ions with electrons to produce MgCO3 directly, rather than MgO, also avoids a buildup of Mg to detectable levels. The meteoric injection rate of Mg, Fe, and other metals—constrained by the IUVS measurements—enables the production rate of metal carbonate molecules (principally MgCO3 and FeCO3) to be determined. These molecules have very large electric dipole moments (11.6 and 9.2 Debye, respectively) and thus form clusters with up to six H2O molecules at temperatures below 150 K. These clusters should then coagulate efficiently, building up metal carbonate‐rich ice particles which can act as nucleating particles for the formation of CO2‐ice clouds. Observable mesospheric clouds are predicted to occur between 65 and 80 km at temperatures below 95 K and above 85 km at temperatures about 5 K colder. PMID:29780678
Collisional heating as the origin of filament emission in galaxy clusters
NASA Astrophysics Data System (ADS)
Ferland, G. J.; Fabian, A. C.; Hatch, N. A.; Johnstone, R. M.; Porter, R. L.; van Hoof, P. A. M.; Williams, R. J. R.
2009-02-01
It has long been known that photoionization, whether by starlight or other sources, has difficulty in accounting for the observed spectra of the optical filaments that often surround central galaxies in large clusters. This paper builds on the first of this series in which we examined whether heating by energetic particles or dissipative magnetohydrodynamic (MHD) waves can account for the observations. The first paper focused on the molecular regions which produce strong H2 and CO lines. Here we extend the calculations to include atomic and low-ionization regions. Two major improvements to the previous calculations have been made. The model of the hydrogen atom, along with all elements of the H-like iso-electronic sequence, is now fully nl-resolved. This allows us to predict the hydrogen emission-line spectrum including excitation by suprathermal secondary electrons and thermal electrons or nuclei. We show how the predicted HI spectrum differs from the pure-recombination case. The second update is to the rates for H0-H2 inelastic collisions. We now use the values computed by Wrathmall et al. The rates are often much larger and allow the ro-vibrational H2 level populations to achieve a thermal distribution at substantially lower densities than previously thought. We calculate the chemistry, ionization, temperature, gas pressure and emission-line spectrum for a wide range of gas densities and collisional heating rates. We assume that the filaments are magnetically confined. The gas is free to move along field lines so that the gas pressure is equal to that of the surrounding hot gas. A mix of clouds, some being dense and cold and others hot and tenuous, can exist. The observed spectrum will be the integrated emission from clouds with different densities and temperatures but the same pressure P/k = nT. We assume that the gas filling factor is given by a power law in density.
The power-law index, the only free parameter in this theory, is set by matching the observed intensities of infrared H2 lines relative to optical HI lines. We conclude that the filaments are heated by ionizing particles, either conducted in from surrounding regions or produced in situ by processes related to MHD waves. Contains material © British Crown copyright 2008/MoD. E-mail: gjferland@gmail.com
NASA Astrophysics Data System (ADS)
Yang, Wei; Hall, Trevor J.
2013-12-01
The Internet is entering an era of cloud computing to provide more cost effective, eco-friendly and reliable services to consumer and business users. As a consequence, the nature of the Internet traffic has been fundamentally transformed from a pure packet-based pattern to today's predominantly flow-based pattern. Cloud computing has also brought about an unprecedented growth in the Internet traffic. In this paper, a hybrid optical switch architecture is presented to deal with the flow-based Internet traffic, aiming to offer flexible and intelligent bandwidth on demand to improve fiber capacity utilization. The hybrid optical switch is capable of integrating IP into optical networks for cloud-based traffic with predictable performance, for which the delay performance of the electronic module in the hybrid optical switch architecture is evaluated through simulation.
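The abstract evaluates the electronic module's delay through simulation without giving details. As a generic stand-in, a single-server queue with Poisson arrivals (M/M/1) shows how such a delay evaluation is typically set up; the rates, packet counts, and exponential service times here are our assumptions, not the paper's model:

```python
import random

def mm1_delay(arrival_rate, service_rate, n_packets=10000, seed=1):
    # Mean sojourn time (queueing + service) of packets through a single
    # electronic buffer, modelled as an M/M/1 queue.
    rng = random.Random(seed)
    t = 0.0            # current packet's arrival time
    server_free = 0.0  # time at which the server next becomes idle
    total = 0.0
    for _ in range(n_packets):
        t += rng.expovariate(arrival_rate)  # Poisson arrivals
        start = max(t, server_free)         # wait while the server is busy
        server_free = start + rng.expovariate(service_rate)
        total += server_free - t            # sojourn time of this packet
    return total / n_packets
```

For this model, theory predicts a mean sojourn time of 1/(mu - lambda), so the simulated delay blows up as the offered load approaches the service capacity — the regime in which a hybrid switch would divert flows to the optical path.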
Helmet-Mounted Display Of Clouds Of Harmful Gases
NASA Technical Reports Server (NTRS)
Diner, Daniel B.; Barengoltz, Jack B.; Schober, Wayne R.
1995-01-01
Proposed helmet-mounted opto-electronic instrument provides real-time stereoscopic views of clouds of otherwise invisible toxic, explosive, and/or corrosive gas. Display semitransparent: images of clouds superimposed on scene ordinarily visible to wearer. Images give indications on sizes and concentrations of gas clouds and their locations in relation to other objects in scene. Instruments serve as safety devices for astronauts, emergency response crews, fire fighters, people cleaning up chemical spills, or anyone working near invisible hazardous gases. Similar instruments used as sensors in automated emergency response systems that activate safety equipment and emergency procedures. Both helmet-mounted and automated-sensor versions used at industrial sites, chemical plants, or anywhere dangerous and invisible or difficult-to-see gases present. In addition to helmet-mounted and automated-sensor versions, there could be hand-held version. In some industrial applications, desirable to mount instruments and use them similarly to parking-lot surveillance cameras.
NASA Technical Reports Server (NTRS)
Collier, Michael R.; Szabo, A.; Farrell, W.; Slavin, J. A.; Lepping, R. P.; Fitzenreiter, R.; Thompson, B.; Hamilton, D. C.; Gloeckler, G.; Ho, G. C.
2000-01-01
Evidence is presented that the WIND spacecraft observed particle and field signatures on October 18-19, 1995 due to reconnection near the footpoints of a magnetic cloud (i.e., between 1 and 5 solar radii). These signatures include: (1) an internal shock traveling approximately along the axis of the magnetic cloud, (2) a simple compression of the magnetic field consistent with the footpoint magnetic fields being thrust outwards at speeds much greater than the solar wind speed, (3) an electron heat flux dropout occurring within minutes of the shock indicating a topological change resulting from disconnection from the solar surface, (4) a very cold 5 keV proton beam and (5) an associated monochromatic wave. We expect that, given observations of enough magnetic clouds, Wind and other spacecraft will see signatures similar to the ones reported here indicating reconnection. However, these observations require the spacecraft to be fortuitously positioned to observe the passing shock and other signatures and will therefore be associated with only a small fraction of magnetic clouds. Consistent with this, a few magnetic clouds observed by Wind have been found to possess internal shock waves.
NASA Astrophysics Data System (ADS)
Riebeek Kohl, H.; Chambers, L. H.; Murphy, T.
2016-12-01
For more than 20 years, the Global Learning and Observations to Benefit the Environment (GLOBE) Program has sought to increase environmental literacy in students by involving them in the process of data collection and scientific research. In 2016, the program expanded to accept observations from citizen scientists of all ages through a relatively simple app. Called GLOBE Observer, the new program aims to help participants feel connected to a global community focused on advancing the scientific understanding of Earth system science while building climate literacy among participants and increasing valuable environmental data points to expand both student and scientific research. In October 2016, GLOBE Observer partnered with the Association of Science & Technology Centers (ASTC) in an international science experiment in which museums and patrons around the world collected cloud observations through GLOBE Observer to create a global cloud map in support of NASA satellite science. The experiment was an element of the International Science Center and Science Museum Day, an event planned in partnership with UNESCO and ASTC. Museums and science centers provided the climate context for the observations, while GLOBE Observer offered a uniform experience and a digital platform to build a connected global community. This talk will introduce GLOBE Observer and will present the results of the experiment, including evaluation feedback on gains in climate literacy through the event.
NASA Astrophysics Data System (ADS)
Krinitskiy, Mikhail; Sinitsyn, Alexey
2017-04-01
Shortwave radiation is an important component of the surface heat budget over sea and land. Estimating it accurately requires observations of cloud conditions, including total cloud cover and the spatial and temporal cloud structure. While cloud cover is routinely observed visually, building accurate SW radiation parameterizations requires that cloud structure also be quantified with precise instrumental measurements. Several state-of-the-art land-based cloud cameras already satisfy researchers' needs, but their major disadvantage is the inaccuracy of all-sky image processing algorithms, which typically yield uncertainties of 2-4 octa in cloud cover estimates, with a resulting true-scoring cloud cover accuracy of about 7%. Moreover, none of these algorithms determine cloud types. We developed an approach for estimating cloud cover and structure that provides much more accurate estimates and also allows additional characteristics to be measured. The method is based on a synthetic controlling index, the "grayness rate index", which we introduced in 2014. This index has since demonstrated high efficiency when used together with a "background sunburn effect suppression" technique to detect thin clouds. This made it possible to significantly increase the accuracy of total cloud cover estimation across various sky image states with this extension of the routine algorithm type. Errors in the cloud cover estimates decreased significantly, to a mean squared error of about 1.5 octa, and the resulting true-scoring accuracy is more than 38%. The main source of uncertainty in this approach is errors in determining the state of the solar disk. While a deep-neural-network approach lets us estimate the solar disk state with 94% accuracy, the final total cloud estimate is still not satisfactory.
To solve this problem completely we applied the set of machine learning algorithms to the problem of total cloud cover estimation directly. The accuracy of this approach varies depending on algorithm choice. Deep neural networks demonstrated the best accuracy of more than 96%. We will demonstrate some approaches and the most influential statistical features of all-sky images that lets the algorithm reach that high accuracy. With the use of our new optical package a set of over 480`000 samples has been collected in several sea missions in 2014-2016 along with concurrent standard human observed and instrumentally recorded meteorological parameters. We will demonstrate the results of the field measurements and will discuss some still remaining problems and the potential of the further developments of machine learning approach.
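The image-based cloud cover estimation described above can be illustrated with a minimal sketch. The actual "grayness rate index" is defined in the authors' 2014 work and is not reproduced here; the function below uses a simple red/blue pixel ratio as a hypothetical stand-in for a grayness measure, and the threshold value is likewise an assumption for illustration only.

```python
import numpy as np

def cloud_cover_octas(rgb, gray_threshold=0.8):
    """Estimate total cloud cover (in octas) from an all-sky RGB image.

    Hypothetical stand-in for a grayness-based index: a pixel is
    counted as cloud when its red/blue ratio is high, i.e. the pixel
    is gray/white rather than blue sky.
    """
    rgb = rgb.astype(float)
    r, b = rgb[..., 0], rgb[..., 2] + 1e-9   # avoid division by zero
    cloudy = (r / b) > gray_threshold        # gray pixels -> cloud
    fraction = cloudy.mean()                 # covered fraction of the dome
    return int(round(8 * fraction))          # convert to octas (0-8)

# synthetic example: half the dome blue sky, half gray cloud
sky = np.zeros((100, 100, 3)); sky[..., 2] = 200; sky[..., 0] = 60
cloud = np.full((100, 100, 3), 180.0)
image = np.concatenate([sky, cloud], axis=0)
print(cloud_cover_octas(image))  # -> 4
```

In practice the authors replace such hand-tuned thresholds with machine-learned classifiers (deep neural networks in the best case), which is what lifts the accuracy above 96%.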
Electron Excitation of High Dipole Moment Molecules
NASA Astrophysics Data System (ADS)
Goldsmith, Paul; Kauffmann, Jens
2018-01-01
Emission from high-dipole-moment molecules such as HCN allows determination of the density in molecular clouds, and is often considered to trace the "dense" gas available for star formation. We assess the importance of electron excitation in various environments. The ratio of the rate coefficients for electrons and H2 molecules, ~10^5 for HCN, yields the requirements for electron excitation to be of practical importance: n(H2) < 10^{5.5} cm^{-3} and X(e-) > 10^{-5}, where the numerical values reflect the critical density n_c(H2) and the critical electron fraction X^*(e-). This indicates that in regions where a large fraction of carbon is ionized, X(e-) will be large enough to make electron excitation significant. The situation is broadly similar for other "high density tracers", including HCO+, CN, and CS, although there are significant differences in the critical electron fractional abundance X^*(e-), defined as the value required for equal effect from collisions with H2 and e-. Electron excitation is, for example, unimportant for CO and C+. Electron excitation may be responsible for the surprisingly large spatial extent of the emission from dense gas tracers in some molecular clouds (Pety et al. 2017; Kauffmann, Goldsmith et al. 2017, A&A, submitted). The enhanced estimates of HCN abundances and of the HCN/CO and HCN/HCO+ ratios observed in the nuclear regions of luminous galaxies may in part be a result of electron excitation of high-dipole-moment tracers. The importance of electron excitation will depend on detailed models of the chemistry, which may well be non-steady-state and non-static.
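The two-part criterion quoted in the abstract can be expressed as a small check. This is a sketch using only the numbers stated above (rate-coefficient ratio ~10^5 for HCN, critical density ~10^{5.5} cm^-3); the function name and interface are illustrative, not from the paper.

```python
def electron_excitation_important(n_h2, x_e, k_ratio=1e5, n_crit=10**5.5):
    """Rough test of whether electron excitation matters for a
    high-dipole-moment molecule such as HCN.

    n_h2    : H2 number density [cm^-3]
    x_e     : electron fractional abundance, n(e-)/n(H2)
    k_ratio : ratio of e-/H2 excitation rate coefficients (~1e5 for HCN)
    n_crit  : H2 critical density [cm^-3] (~10^5.5 per the abstract)

    Electrons rival H2 when n(e-)*k_e > n(H2)*k_H2, i.e. x_e*k_ratio > 1;
    the criterion is only relevant below the critical density, where the
    transition is not already thermalized by H2 collisions.
    """
    return (x_e * k_ratio > 1.0) and (n_h2 < n_crit)

# region with most carbon ionized (x_e ~ 1e-4): electrons matter
print(electron_excitation_important(n_h2=1e3, x_e=1e-4))   # True
# dense, well-shielded gas (x_e ~ 1e-8): H2 collisions dominate
print(electron_excitation_important(n_h2=1e3, x_e=1e-8))   # False
```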
NASA Astrophysics Data System (ADS)
Ueda, S.; Hirose, Y.; Miura, K.; Okochi, H.
2014-02-01
Sizes and compositions of atmospheric aerosol particles can be altered by in-cloud processing, through absorption/adsorption of gaseous and particulate materials and through drying of particles that were formerly activated as cloud condensation nuclei. To elucidate the differences between aerosol particles before and after in-cloud processing, aerosols were observed along a slope of Mt. Fuji, Japan (3776 m a.s.l.) during the summers of 2011 and 2012 using a portable laser particle counter (LPC) and an aerosol sampler. Aerosol samples for analyses of elemental compositions were obtained using a cascade impactor at top-of-cloud, in-cloud, and below-cloud altitudes. To investigate composition changes via in-cloud processing, individual particles (0.5-2 μm diameter) from five cases (days) collected at different altitudes under similar backward air mass trajectory conditions were analyzed using a transmission electron microscope (TEM) equipped with an energy dispersive X-ray analyzer. In four of the five cases, most particles at all altitudes mainly comprised sea salts: mainly Na with some S and/or Cl. Of those, in two cases, sea-salt-containing particles with Cl were found in below-cloud samples, although sea-salt-containing particles in top-of-cloud samples did not contain Cl. This result suggests that Cl in the sea salt was displaced by other cloud components. In the other two cases, sea-salt-containing particles in samples at all altitudes lacked Cl. However, molar ratios of S to Na (S/Na) in the sea-salt-containing particles of top-of-cloud samples were higher than those of below-cloud samples, suggesting that sulfuric acid or sulfate was added to sea-salt-containing particles after complete displacement of Cl, by absorption of SO2 or coagulation with sulfate. The additional volume of sulfuric acid acquired in clouds for these two cases was estimated using the observed S/Na values of the sea-salt-containing particles.
The estimation revealed that size changes by in-cloud processing from below-cloud to top-of-cloud altitudes were less than 6% for particles of 0.5-2 μm diameter. These results will be useful for evaluating the aging and transformation of aerosol particles through in-cloud processing.
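The size-change estimate from the S/Na molar ratio can be sketched as follows. This is a simplified illustration, not the authors' exact procedure: the sea-salt density, Na mass fraction, and H2SO4 density below are generic assumed values, and the added S is taken up entirely as liquid sulfuric acid volume on the particle.

```python
import math

def size_growth_from_sulfate(d_um, s_na_below, s_na_top,
                             rho_ss=2.2, f_na=0.306,
                             m_h2so4=98.08, rho_h2so4=1.83):
    """Fractional diameter growth of a sea-salt particle due to in-cloud
    sulfate addition, inferred from the rise in its S/Na molar ratio.

    d_um       : dry particle diameter [um]
    s_na_*     : S/Na molar ratios below cloud and at cloud top
    rho_ss     : assumed sea-salt density [g/cm^3]
    f_na       : assumed Na mass fraction of sea salt
    """
    v0 = math.pi / 6 * d_um**3                    # dry volume [um^3]
    n_na = v0 * 1e-12 * rho_ss * f_na / 22.99     # mol Na (1 um^3 = 1e-12 cm^3)
    dn_s = (s_na_top - s_na_below) * n_na         # mol of added S (as H2SO4)
    v_add = dn_s * m_h2so4 / rho_h2so4 * 1e12     # added volume [um^3]
    d_new = (6 / math.pi * (v0 + v_add)) ** (1 / 3)
    return d_new / d_um - 1

# a 1 um particle whose S/Na rose from 0.05 below cloud to 0.15 at cloud top
print(f"{size_growth_from_sulfate(1.0, 0.05, 0.15):.1%}")
```

With these assumed parameters the example yields a diameter growth of roughly 5%, of the same order as the under-6% change reported in the abstract.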
NASA Astrophysics Data System (ADS)
Torbert, R.
1992-12-01
The present volume on active experiments in space discusses dynamic trapping of electrons in the Porcupine ionospheric ion beam experiment, plasma wave observations during electron gun experiments on ISEE-1, spatial coherence and electromagnetic wave generation during electron beam experiments in space, and recent experimental measurements of space platform charging at LEO altitudes. Attention is given to high voltage spheres in an unmagnetized plasma, energetic ion emission for active spacecraft control, the collective gyration of a heavy ion cloud in a magnetized plasma, and remote sensing of artificial luminous clouds by lidars. Topics addressed include modulation of the background flux of energetic particles by artificial injection, wave measurements in active experiments on plasma beam injection, field formation around negatively biased solar arrays in the LEO-plasma, and the registration of ELF waves in rocket-satellite experiments with plasma injection.