Deployment dynamics and control of large-scale flexible solar array system with deployable mast
NASA Astrophysics Data System (ADS)
Li, Hai-Quan; Liu, Xiao-Feng; Guo, Shao-Jing; Cai, Guo-Ping
2016-10-01
In this paper, the deployment dynamics and control of a large-scale flexible solar array system with a deployable mast are investigated. The adopted solar array system is introduced first, including the system configuration, deployable mast, and solar arrays with several mechanisms. Then the dynamic equation of the solar array system is established by the Jourdain velocity variation principle, and a method for dynamics with topology changes is introduced. In addition, a PD controller with disturbance estimation is designed to eliminate the drift of the spacecraft mainbody. Finally, the validity of the dynamic model is verified through a comparison with the ADAMS software, and the deployment process and dynamic behavior of the system are studied in detail. Simulation results indicate that the proposed model effectively describes the deployment dynamics of large-scale flexible solar arrays and that the proposed controller is practical for eliminating the drift of the spacecraft mainbody.
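To make the control idea concrete, here is a minimal sketch of a PD law augmented with a first-order disturbance observer, run on a one-degree-of-freedom double integrator standing in for mainbody drift; the inertia, gains, and disturbance profile are illustrative assumptions, not the paper's model.

```python
import numpy as np

# Sketch: PD control plus a disturbance observer on a 1-DOF double
# integrator standing in for spacecraft mainbody drift. All numbers
# are illustrative assumptions.

J = 100.0            # inertia-like parameter
kp, kd = 4.0, 20.0   # PD gains
L_obs = 5.0          # observer bandwidth
dt, T = 0.01, 60.0

theta, omega = 0.05, 0.0   # initial drift angle and rate
d_hat = 0.0                # estimated disturbance torque

for k in range(int(T / dt)):
    t = k * dt
    d_true = 2.0 * np.sin(0.5 * t)          # unknown deployment reaction torque
    u = -kp * theta - kd * omega - d_hat    # PD law plus disturbance cancellation
    alpha = (u + d_true) / J                # true angular acceleration
    # observer: drive d_hat toward the torque the command cannot explain
    # (in practice J*alpha would come from a measured acceleration)
    d_hat += L_obs * (J * alpha - u - d_hat) * dt
    omega += alpha * dt
    theta += omega * dt

print(f"final drift: {theta:.2e} rad, disturbance estimate: {d_hat:.2f} N*m")
```

With the estimate cancelled in the control law, the residual drift settles near zero despite the persistent sinusoidal disturbance.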
White Paper on Dish Stirling Technology: Path Toward Commercial Deployment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andraka, Charles E.; Stechel, Ellen; Becker, Peter
2016-07-01
Dish Stirling energy systems have been developed for distributed and large-scale utility deployment. This report summarizes the state of the technology as of a joint project between Stirling Energy Systems, Sandia National Laboratories, and the Department of Energy in 2011. It then lays out a feasible path to large-scale deployment, including development needs and the anticipated cost reductions that would make the product viable at scale.
2015-02-11
A similar risk-based approach may be appropriate for deploying military personnel. e) If DoD were to consider implementing a large-scale pre...quality of existing spirometry programs prior to considering a larger-scale pre-deployment effort. Identifying an accelerated decrease in spirometry...baseline spirometry on a wider scale. e) Conduct pre-deployment baseline spirometry if there is a significant risk of exposure to a pulmonary hazard based
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurd, Alan J.
2016-04-29
While the stated reason for asking this question is “to understand better our ability to warn policy makers in the unlikely event of an unanticipated SRM geoengineering deployment or large-scale field experiment”, my colleagues and I felt that motives would be important context because the scale of any meaningful SRM deployment would be so large that covert deployment seems impossible. However, several motives emerged that suggest a less-than-global effort might be important.
Large Scale eHealth Deployment in Europe: Insights from Concurrent Use of Standards.
Eichelberg, Marco; Chronaki, Catherine
2016-01-01
Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem that includes both legacy systems and new systems reflecting technological trends and progress. No single standard covers all the needs of an eHealth project, and there is a multitude of overlapping and sometimes competing standards that can be employed to define document formats, terminology, and communication protocols, mirroring alternative technical approaches and schools of thought. eHealth projects need to answer the important question of how alternative or inconsistently implemented standards and specifications can be used to ensure practical interoperability and long-term sustainability in large-scale eHealth deployment. In the eStandards project, 19 European case studies reporting from R&D and large-scale eHealth deployment and policy projects were analyzed. Although this study is not exhaustive, by reflecting on the concepts, standards, and tools for concurrent use and on the successes, failures, and lessons learned, this paper offers practical insights on how eHealth deployment projects can make the most of the available eHealth standards and tools, and how standards and profile developing organizations can serve users while embracing sustainability and technical innovation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jimenez, T.; Tegen, S.; Beiter, P.
To begin understanding the potential economic impacts of large-scale WEC technology, the Bureau of Ocean Energy Management (BOEM) commissioned the National Renewable Energy Laboratory (NREL) to conduct an economic impact analysis of large-scale WEC deployment for Oregon coastal counties. This report follows a previously published report by BOEM and NREL on the jobs and economic impacts of WEC technology for the entire state (Jimenez and Tegen 2015). As in Jimenez and Tegen (2015), this analysis examined two deployment scenarios in the 2026-2045 timeframe: the first scenario assumed 13,000 megawatts (MW) of WEC technology deployed during the analysis period, and the second assumed 18,000 MW of WEC technology deployed by 2045. Both scenarios require major technology and cost improvements in the WEC devices. The study addresses very large-scale deployment so that readers can examine and discuss the potential of a successful and very large WEC industry. The 13,000-MW scenario is used as the basis for the county analysis, as it is the smaller of the two. Sensitivity studies examined the effects of a robust in-state WEC supply chain. The region of analysis comprises the seven coastal counties in Oregon—Clatsop, Coos, Curry, Douglas, Lane, Lincoln, and Tillamook—so estimates of jobs and other economic impacts are specific to this coastal county area.
A Commercialization Roadmap for Carbon-Negative Energy Systems
NASA Astrophysics Data System (ADS)
Sanchez, D.
2016-12-01
The Intergovernmental Panel on Climate Change (IPCC) envisages the need for large-scale deployment of net-negative CO2 emissions technologies by mid-century to meet stringent climate mitigation goals and yield a net drawdown of atmospheric carbon. Yet there are few commercial deployments of bioenergy with carbon capture and sequestration (BECCS) outside of niche markets, creating uncertainty about commercialization pathways and sustainability impacts at scale. This uncertainty is exacerbated by the absence of a strong policy framework, such as high carbon prices and research coordination. Here, we propose a strategy for the potential commercial deployment of BECCS. This roadmap proceeds in three steps: 1) capture and utilization of biogenic CO2 from existing bioenergy facilities, notably ethanol fermentation; 2) thermochemical co-conversion of biomass and fossil fuels, particularly coal; and 3) dedicated, large-scale BECCS. Although biochemical conversion is a proven first market for BECCS, this trajectory alone is unlikely to drive commercialization of BECCS at the gigatonne scale. In contrast to biochemical conversion, thermochemical conversion of coal and biomass enables large-scale production of fuels and electricity with a wide range of carbon intensities, process efficiencies and process scales. Aside from systems integration, the primary technical barriers involve large-scale biomass logistics, gasification and gas cleaning. Key uncertainties around large-scale BECCS deployment are not limited to commercialization pathways; rather, they include physical constraints on biomass cultivation or CO2 storage, as well as social barriers, including public acceptance of new technologies and conceptions of renewable and fossil energy, which co-conversion systems confound. Despite sustainability risks, this commercialization strategy presents a pathway where energy suppliers, manufacturers and governments could transition from laggards to leaders in climate change mitigation efforts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Kai; Kim, Donghoe; Whitaker, James B
Rapid development of perovskite solar cells (PSCs) during the past several years has made this photovoltaic (PV) technology a serious contender for potential large-scale deployment on the terawatt scale in the PV market. To successfully transition PSC technology from the laboratory to industry scale, substantial efforts need to focus on scalable fabrication of high-performance perovskite modules with minimum negative environmental impact. Here, we provide an overview of the current research and our perspective regarding PSC technology toward future large-scale manufacturing and deployment. Several key challenges discussed are (1) a scalable process for large-area perovskite module fabrication; (2) less hazardous chemical routes for PSC fabrication; and (3) suitable perovskite module designs for different applications.
Supporting Knowledge Transfer in IS Deployment Projects
NASA Astrophysics Data System (ADS)
Schönström, Mikael
Deploying new information systems is an expensive and complex task that seldom results in successful usage where the system adds strategic value to the firm (e.g. Sharma et al. 2003). It has been argued that innovation diffusion is a knowledge integration problem (Newell et al. 2000). Knowledge about business processes, deployment processes, information systems, and technology is needed in a large-scale deployment of a corporate IS. These deployments can therefore to a large extent be argued to be a knowledge management (KM) problem. An effective deployment requires that knowledge about the system is effectively transferred to the target organization (Ko et al. 2005).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallarno, George; Rogers, James H; Maxwell, Don E
The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large scale. The world's second-fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage, since GPUs have only recently been deployed at large scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.
Agbakoba, Ruth; McGee-Lennon, Marilyn; Bouamrane, Matt-Mouley; Watson, Nicholas; Mair, Frances S
2016-12-01
Little is known about the factors which facilitate or impede the large-scale deployment of health and well-being consumer technologies. The Living-It-Up project is a large-scale digital intervention led by NHS 24, aiming to transform health and well-being services delivery throughout Scotland. We conducted a qualitative study of the factors affecting the implementation and deployment of the Living-It-Up services. We collected a range of data during the initial phase of deployment, including semi-structured interviews (N = 6); participant observation sessions (N = 5) and meetings with key stakeholders (N = 3). We used the Normalisation Process Theory as an explanatory framework to interpret the social processes at play during the initial phases of deployment. Initial findings illustrate that it is clear - and perhaps not surprising - that the size and diversity of the Living-It-Up consortium made implementation processes more complex within a 'multi-stakeholder' environment. To overcome these barriers, there is a need to clearly define roles, tasks and responsibilities among the consortium partners. Furthermore, varying levels of expectations and requirements, as well as diverse cultures and ways of working, must be effectively managed. Factors which facilitated implementation included extensive stakeholder engagement, such as co-design activities, which can contribute to an increased 'buy-in' from users in the long term. An important lesson from the Living-It-Up initiative is that attempting to co-design innovative digital services while at the same time recruiting large numbers of users is likely to generate conflicting implementation priorities which hinder - or at least substantially slow down - the effective rollout of services at scale. The deployment of Living-It-Up services is ongoing, but our results to date suggest that - in order to be successful - the roll-out of digital health and well-being technologies at scale requires a delicate and pragmatic trade-off between co-design activities, the development of innovative services and the efforts allocated to widespread marketing and recruitment initiatives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan; Broderick, Robert; Mather, Barry
2016-05-01
This report analyzes distribution-integration challenges, solutions, and research needs in the context of distributed generation from PV (DGPV) deployment to date and the much higher levels of deployment expected with achievement of the U.S. Department of Energy's SunShot targets. Recent analyses have improved estimates of the DGPV hosting capacities of distribution systems. This report uses these results to statistically estimate a minimum DGPV hosting capacity for the contiguous United States of approximately 170 GW with traditional inverters and without distribution system modifications. This hosting capacity roughly doubles if advanced inverters are used to manage local voltage, and additional minor, low-cost changes could further increase these levels substantially. Key to achieving these deployment levels at minimum cost is siting DGPV based on local hosting capacities, suggesting opportunities for regulatory, incentive, and interconnection innovation. Already, pre-computed hosting capacity is beginning to expedite DGPV interconnection requests and installations in select regions; however, realizing SunShot-scale deployment will require further improvements to DGPV interconnection processes, standards and codes, and compensation mechanisms so they embrace the contributions of DGPV to system-wide operations. SunShot-scale DGPV deployment will also require unprecedented coordination of the distribution and transmission systems. This includes harnessing DGPV's ability to relieve congestion and reduce system losses by generating closer to loads; minimizing system operating costs and reserve deployments through improved DGPV visibility; developing communication and control architectures that incorporate DGPV into system operations; providing frequency response, transient stability, and synthesized inertia with DGPV in the event of large-scale system disturbances; and potentially managing reactive power requirements due to large-scale deployment of advanced inverter functions. Finally, additional local and system-level value could be provided by integrating DGPV with energy storage and 'virtual storage,' which exploits improved management of electric vehicle charging, building energy systems, and other large loads. Together, continued innovation across this rich distribution landscape can enable the very-high deployment levels envisioned by SunShot.
Biology-Inspired Distributed Consensus in Massively-Deployed Sensor Networks
NASA Technical Reports Server (NTRS)
Jones, Kennie H.; Lodding, Kenneth N.; Olariu, Stephan; Wilson, Larry; Xin, Chunsheng
2005-01-01
Promises of ubiquitous control of the physical environment by large-scale wireless sensor networks open avenues for new applications that are expected to redefine the way we live and work. Most of recent research has concentrated on developing techniques for performing relatively simple tasks in small-scale sensor networks assuming some form of centralized control. The main contribution of this work is to propose a new way of looking at large-scale sensor networks, motivated by lessons learned from the way biological ecosystems are organized. Indeed, we believe that techniques used in small-scale sensor networks are not likely to scale to large networks; that such large-scale networks must be viewed as an ecosystem in which the sensors/effectors are organisms whose autonomous actions, based on local information, combine in a communal way to produce global results. As an example of a useful function, we demonstrate that fully distributed consensus can be attained in a scalable fashion in massively deployed sensor networks where individual motes operate based on local information, making local decisions that are aggregated across the network to achieve globally-meaningful effects.
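A standard way to realize fully distributed consensus from local information is neighbor averaging with Metropolis weights; the sketch below runs it on a random geometric graph as a stand-in for a dense mote deployment. This is a textbook scheme with illustrative parameters, not necessarily the authors' protocol.

```python
import random

# Distributed average consensus: motes on a random geometric graph mix
# repeatedly with radio neighbours using Metropolis weights, so purely
# local updates converge to the global mean on a connected graph.

random.seed(1)
N, radius = 200, 0.12
pos = [(random.random(), random.random()) for _ in range(N)]
nbrs = [[j for j in range(N) if j != i and
         (pos[i][0] - pos[j][0]) ** 2 + (pos[i][1] - pos[j][1]) ** 2 < radius ** 2]
        for i in range(N)]
deg = [len(nb) for nb in nbrs]

x = [random.gauss(20.0, 5.0) for _ in range(N)]   # local sensor readings
true_mean = sum(x) / N

for _ in range(400):                              # synchronous mixing rounds
    new = []
    for i in range(N):
        xi = x[i]
        for j in nbrs[i]:
            xi += (x[j] - x[i]) / (1 + max(deg[i], deg[j]))  # Metropolis weight
        new.append(xi)
    x = new

print(f"true mean {true_mean:.4f}, consensus spread {max(x) - min(x):.2e}")
```

Because the Metropolis weight matrix is doubly stochastic, the common value the motes agree on is the exact network-wide average, with no coordinator involved.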
ACTIVIS: Visual Exploration of Industry-Scale Deep Neural Network Models.
Kahng, Minsuk; Andrews, Pierre Y; Kalro, Aditya; Polo Chau, Duen Horng
2017-08-30
While deep learning models have achieved state-of-the-art accuracies for many prediction tasks, understanding these models remains a challenge. Despite the recent interest in developing visual tools to help users interpret deep learning models, the complexity and wide variety of models deployed in industry, and the large-scale datasets they use, pose unique design challenges that are inadequately addressed by existing work. Through participatory design sessions with over 15 researchers and engineers at Facebook, we have developed, deployed, and iteratively improved ACTIVIS, an interactive visualization system for interpreting large-scale deep learning models and results. By tightly integrating multiple coordinated views, such as a computation graph overview of the model architecture and a neuron activation view for pattern discovery and comparison, users can explore complex deep neural network models at both the instance and subset levels. ACTIVIS has been deployed on Facebook's machine learning platform. We present case studies with Facebook researchers and engineers, and usage scenarios of how ACTIVIS may work with different models.
Towards Portable Large-Scale Image Processing with High-Performance Computing.
Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A
2018-05-03
High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly a half-million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), the HPC facility at Vanderbilt University. The initial deployment was native (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with varying hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein, we describe recent innovations using containerization techniques with XNAT/DAX which are used to isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from the system level to the application level, (2) flexible and dynamic software development and expansion, and (3) scalable spider deployment compatible with HPC clusters and local workstations.
"Tactic": Traffic Aware Cloud for Tiered Infrastructure Consolidation
ERIC Educational Resources Information Center
Sangpetch, Akkarit
2013-01-01
Large-scale enterprise applications are deployed as distributed applications. These applications consist of many inter-connected components with heterogeneous roles and complex dependencies. Each component typically consumes 5-15% of the server capacity. Deploying each component as a separate virtual machine (VM) allows us to consolidate the…
Topics include pre-feasibility analysis; wind data analysis; the small wind turbine certification process; and analysis of the potential economic impact of large-scale MHK deployment off the coast (Regional Test Center effort).
Data-Driven Simulation-Enhanced Optimization of People-Based Print Production Service
NASA Astrophysics Data System (ADS)
Rai, Sudhendu
This paper describes a systematic six-step data-driven simulation-based methodology for optimizing people-based service systems that operate on a large distributed scale and exhibit high variety and variability. The methodology is exemplified through its application within the printing services industry, where it has been successfully deployed by Xerox Corporation across small, mid-sized and large print shops, generating over $250 million in profits across the customer value chain. Each step of the methodology is described in detail: co-development and testing of innovative concepts in partnership with customers; development of software and hardware tools to implement those concepts; establishment of work processes and practices for customer engagement and service implementation; creation of training and infrastructure for large-scale deployment; integration of the innovative offering within the framework of existing corporate offerings; and monitoring of financial and operational metrics to estimate return on investment and continually renew the offering.
Potential climatic impacts and reliability of large-scale offshore wind farms
NASA Astrophysics Data System (ADS)
Wang, Chien; Prinn, Ronald G.
2011-04-01
The vast availability of wind power has fueled substantial interest in this renewable energy source as a potential near-zero greenhouse gas emission technology for meeting future world energy needs while addressing the climate change issue. However, in order to provide even a fraction of the estimated future energy needs, a large-scale deployment of wind turbines (several million) is required. The consequent environmental impacts, and the inherent reliability of such a large-scale usage of intermittent wind power, would have to be carefully assessed, in addition to the need to lower the high current unit wind power costs. Our previous study (Wang and Prinn 2010 Atmos. Chem. Phys. 10 2053) using a three-dimensional climate model suggested that a large deployment of wind turbines over land to meet about 10% of predicted world energy needs in 2100 could lead to a significant temperature increase in the lower atmosphere over the installed regions. A global-scale perturbation to the general circulation patterns as well as to the cloud and precipitation distribution was also predicted. In the later study reported here, we conducted a set of six additional model simulations using an improved climate model to further address the potential environmental and intermittency issues of large-scale deployment of offshore wind turbines for differing installation areas and spatial densities. In contrast to the previous land installation results, the offshore wind turbine installations are found to cause a surface cooling over the installed offshore regions. This cooling is due principally to the enhanced latent heat flux from the sea surface to the lower atmosphere, driven by an increase in turbulent mixing caused by the wind turbines, which was not entirely offset by the concurrent reduction of mean wind kinetic energy. We found that the perturbation to the global climate from large-scale deployment of offshore wind turbines is relatively small compared to the case of land-based installations. However, the intermittency caused by the significant seasonal wind variations over several major offshore sites is substantial, and demands further options to ensure the reliability of large-scale offshore wind power. The method that we used to simulate the offshore wind turbine effect on the lower atmosphere involved simply increasing the ocean surface drag coefficient. While this method is consistent with several detailed fine-scale simulations of wind turbines, it still needs further study to ensure its validity. New field observations of actual wind turbine arrays are definitely required to provide ultimate validation of the model predictions presented here.
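The drag-coefficient method reduces to the bulk aerodynamic formula tau = rho * C_d * U^2; a toy calculation with illustrative numbers shows how an enhanced drag coefficient raises the surface stress and the kinetic-energy flux extracted from the mean flow near the surface.

```python
# Bulk-formula illustration of representing a turbine array by an
# enhanced surface drag coefficient. All values are illustrative.

rho = 1.2          # air density, kg/m^3
U10 = 8.0          # 10-m wind speed, m/s

def surface_stress(c_d, u):
    """Bulk aerodynamic formula: tau = rho * C_d * U^2, in N/m^2."""
    return rho * c_d * u * u

c_d_ocean = 1.3e-3           # typical open-ocean drag coefficient
c_d_farm = 3.0 * c_d_ocean   # assumed enhanced value inside the array

for label, cd in [("undisturbed ocean", c_d_ocean), ("wind-farm region", c_d_farm)]:
    tau = surface_stress(cd, U10)
    ke_flux = tau * U10      # kinetic energy removed from the mean flow, W/m^2
    print(f"{label:18s} C_d={cd:.2e} tau={tau:.3f} N/m^2 KE flux={ke_flux:.2f} W/m^2")
```

Tripling the drag coefficient triples both the stress and the extracted kinetic-energy flux, which is the mechanism behind the enhanced turbulent mixing and latent heat flux discussed above.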
Deployment, Design, and Commercialization of Carbon-Negative Energy Systems
NASA Astrophysics Data System (ADS)
Sanchez, Daniel Lucio
Climate change mitigation requires gigaton-scale carbon dioxide removal technologies, yet few examples exist beyond niche markets. This dissertation informs large-scale implementation of bioenergy with carbon capture and sequestration (BECCS), a carbon-negative energy technology. It builds on existing literature with a novel focus on deployment, design, commercialization, and communication of BECCS. BECCS, combined with aggressive renewable deployment and fossil emission reductions, can enable a carbon-negative power system in Western North America by 2050, with up to 145% emissions reduction from 1990 levels. BECCS complements other sources of renewable energy, and can be deployed in a manner consistent with regional policies and design considerations. The amount of biomass resource available limits the level of fossil CO2 emissions that can still satisfy carbon emissions caps. Offsets produced by BECCS are more valuable to the power system than the electricity it provides. Implied costs of carbon for BECCS are relatively low (~$75/ton CO2 at scale) for a capital-intensive technology. Optimal scales for BECCS are an order of magnitude larger than proposed scales found in existing literature. Deviations from the optimal size have little effect on overall system costs - suggesting that other factors, including regulatory, political, or logistical considerations, may ultimately have a greater influence on plant size than the techno-economic factors considered. The flexibility of thermochemical conversion enables a viable transition pathway for firms, utilities and governments to achieve net-negative CO2 emissions in production of electricity and fuels given increasingly stringent climate policy. Primary research, development (R&D), and deployment needs are in large-scale biomass logistics, gasification, gas cleaning, and geological CO2 storage. R&D programs, subsidies, and policy that recognize co-conversion processes can support this pathway to commercialization. Here, firms can embrace a gradual transition pathway to deep decarbonization, limiting economic dislocation and increasing transfer of knowledge between the fossil and renewable sectors. Global cumulative capital investment needs for BECCS through 2050 are over $1.9 trillion (2015$, 4% real interest rate) for scenarios likely to limit global warming to 2 °C. This scenario envisions deployment of as much as 24 GW/yr of BECCS by 2040 in the electricity sector. To achieve these rates of deployment within 15-20 years, governments and firms must commit to research, development, and deployment on an unprecedented scale. Three primary issues complicate emissions accounting for BECCS: cross-sector CO2 accounting, regrowth, and timing. Switchgrass integration decreases lifecycle greenhouse gas impacts of co-conversion systems with CCS, across a wide range of land-use change scenarios. Risks at commercial scale include adverse effects on food security, land conservation, social equity, and biodiversity, as well as competition for water resources. This dissertation argues for an iterative risk management approach to BECCS sustainability, with standards being updated as more knowledge is gained through deployment. Sustainability impacts and public opposition to BECCS may be reduced with transparent measurement and communication. Commercial-scale deployment is dependent on the coordination of a wide range of actors, many with different incentives and worldviews.
Despite this problem, this dissertation challenges governments, industry incumbents, and emerging players to research, support, and deploy BECCS.
Deployment simulation of a deployable reflector for earth science application
NASA Astrophysics Data System (ADS)
Wang, Xiaokai; Fang, Houfei; Cai, Bei; Ma, Xiaofei
2015-10-01
A novel mission concept, NEXRAD-In-Space (NIS), has been developed for monitoring hurricanes, cyclones and other severe storms from a geostationary orbit. It requires a space-deployable 35-meter diameter Ka-band (35 GHz) reflector. NIS can measure hurricane precipitation intensity, dynamics and life cycle. This information is necessary for predicting the track, intensity, rain rate and hurricane-induced floods. To meet the requirements of the radar system, a Membrane Shell Reflector Segment (MSRS) reflector technology has been developed and several technologies have been evaluated. However, the deployment analysis of this large, high-precision reflector had not been investigated. As a pre-study, a scaled tetrahedral truss reflector with a spring-driven deployment system was made and tested, and deployment dynamics analysis of this scaled reflector was performed using ADAMS to understand its deployment dynamic behavior. Eliminating the redundant constraints in a reflector system with a large number of moving parts is a challenging issue. A primitive joint and flexible struts were introduced to the analytical model, which effectively eliminate over-constraints of the model. Using a high-speed camera and a force transducer, a deployment experiment of a single-bay tetrahedral module was conducted. With the test results, an optimization was performed using the parameter optimization module of ADAMS to obtain the parameters of the analytical model. These parameters were incorporated into the analytical model of the whole reflector. The analysis results show that the deployment of the reflector with a fixed boundary proceeds in three stages: a rapid deployment stage, a slow deployment stage and an impact stage. Insight into the force peak distributions of the reflector can help optimize the design of the structure.
Fuel savings and emissions reductions from light duty fuel cell vehicles
NASA Astrophysics Data System (ADS)
Mark, J.; Ohi, J. M.; Hudson, D. V., Jr.
1994-04-01
Fuel cell vehicles (FCVs) operate efficiently, emit few pollutants, and run on nonpetroleum fuels. Because of these characteristics, the large-scale deployment of FCVs has the potential to lessen U.S. dependence on foreign oil and improve air quality. This study characterizes the benefits of large-scale FCV deployment in the light-duty vehicle market. Specifically, the study assesses the potential fuel savings and emissions reductions resulting from large-scale use of these FCVs and identifies the key parameters that affect the scope of the benefits from FCV use. The analysis scenario assumes that FCVs will compete with gasoline-powered light trucks and cars in the new vehicle market for replacement of retired vehicles and will compete for growth in the total market. Analysts concluded that the potential benefits from FCVs, measured in terms of consumer outlays for motor fuel and the value of reduced air emissions, are substantial.
DOT National Transportation Integrated Search
2016-08-31
A major challenge for achieving large-scale adoption of EVs is an accessible infrastructure for the communities. The societal benefits of large-scale adoption of EVs cannot be realized without adequate deployment of publicly accessible charging stati...
ADOPT: Automotive Deployment Options Projection Tool
ADOPT projects new model options by combining high-selling powertrains and high-selling vehicle platforms. (Page residue: screenshot of the ADOPT user interface with two simulation scenario options, low tech and high tech; related link: Biomass Market Dynamics Supporting the Large-Scale Deployment of High-Octane Fuel Production.)
ERIC Educational Resources Information Center
Rupp, André A.
2018-01-01
This article discusses critical methodological design decisions for collecting, interpreting, and synthesizing empirical evidence during the design, deployment, and operational quality-control phases for automated scoring systems. The discussion is inspired by work on operational large-scale systems for automated essay scoring but many of the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jimenez, Tony; Keyser, David; Tegen, Suzanne
This analysis examines the employment and potential economic impacts of large-scale deployment of offshore wind technology off the coast of Oregon. This analysis examines impacts within the seven Oregon coastal counties: Clatsop, Tillamook, Lincoln, Lane, Douglas, Coos, and Curry. The impacts highlighted here can be used in county, state, and regional planning discussions and can be scaled to get a general sense of the economic development opportunities associated with other deployment scenarios.
Deployable Soft Composite Structures.
Wang, Wei; Rodrigue, Hugo; Ahn, Sung-Hoon
2016-02-19
Deployable structures composed of smart-material-based actuators can reconcile the inherently conflicting requirements of low mass, good shape adaptability, and high load-bearing capability. This work describes the fabrication of deployable structures using smart soft composite actuators that combine a soft matrix having variable stiffness properties with hinge-like movement through a rigid skeleton. The hinge actuator has the advantage of being simple to fabricate, inexpensive, lightweight and simple to actuate. This basic actuator can then be used to form modules capable of different types of deformations, which can then be assembled into deployable structures. The design of deployable structures is based on three principles: design of basic hinge actuators, assembly of modules, and assembly of modules into large-scale deployable structures. Various deployable structures such as a segmented triangular mast, a planar structure comprised of single-loop hexagonal modules and a ring structure comprised of single-loop quadrilateral modules were designed and fabricated to verify this approach. Finally, a prototype for a deployable mirror was developed by attaching a foldable reflective membrane to the designed ring structure, and its functionality was tested by using it to reflect sunlight onto a small-scale solar panel.
Quality Function Deployment for Large Systems
NASA Technical Reports Server (NTRS)
Dean, Edwin B.
1992-01-01
Quality Function Deployment (QFD) is typically applied to small subsystems. This paper describes efforts to extend QFD to large-scale systems. It links QFD to the system engineering process, the concurrent engineering process, the robust design process, and the costing process. The effect is to generate a tightly linked project management process of high dimensionality which flushes out issues early to provide a high quality, low cost, and, hence, competitive product. A pre-QFD matrix linking customers to customer desires is described.
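The matrix machinery behind QFD reduces to a weighted relationship matrix; a small sketch with hypothetical desires, engineering characteristics, and the conventional 9/3/1 strong/medium/weak relationship scores shows how characteristic priorities fall out of a single matrix product.

```python
import numpy as np

# Toy QFD relationship matrix: rows are customer desires, columns are
# engineering characteristics. All names, weights, and scores here are
# hypothetical illustrations, not data from the paper.

desire_weights = np.array([5, 3, 4])   # importance of each desire
relationships = np.array([
    [9, 3, 0, 1],   # "low cost"
    [1, 9, 3, 0],   # "low mass"
    [0, 3, 9, 9],   # "high reliability"
])

# priority of each engineering characteristic: weighted column sums
char_priority = desire_weights @ relationships
for name, score in zip(["unit cost", "structural mass", "MTBF", "redundancy"],
                       char_priority):
    print(f"{name:16s} priority {score}")
```

Ranking the resulting scores tells the engineering team which characteristics most strongly serve the weighted customer desires, which is the core of the "deployment" step.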
New Markets for Solar Photovoltaic Power Systems
NASA Astrophysics Data System (ADS)
Thomas, Chacko; Jennings, Philip; Singh, Dilawar
2007-10-01
Over the past five years solar photovoltaic (PV) power supply systems have matured and are now being deployed on a much larger scale. The traditional small-scale remote area power supply systems are still important and village electrification is also a large and growing market but large scale, grid-connected systems and building integrated systems are now being deployed in many countries. This growth has been aided by imaginative government policies in several countries and the overall result is a growth rate of over 40% per annum in the sales of PV systems. Optimistic forecasts are being made about the future of PV power as a major source of sustainable energy. Plans are now being formulated by the IEA for very large-scale PV installations of more than 100 MW peak output. The Australian Government has announced a subsidy for a large solar photovoltaic power station of 154 MW in Victoria, based on the concentrator technology developed in Australia. In Western Australia a proposal has been submitted to the State Government for a 2 MW photovoltaic power system to provide fringe of grid support at Perenjori. This paper outlines the technologies, designs, management and policies that underpin these exciting developments in solar PV power.
Dispersion and Cluster Scales in the Ocean
NASA Astrophysics Data System (ADS)
Kirwan, A. D., Jr.; Chang, H.; Huntley, H.; Carlson, D. F.; Mensa, J. A.; Poje, A. C.; Fox-Kemper, B.
2017-12-01
Ocean flows span spatial scales from centimeters to thousands of kilometers. Because of their large Reynolds number these flows are considered turbulent. However, because of rotation and stratification constraints they do not conform to classical turbulence scaling theory. Mesoscale and large-scale motions are well described by geostrophic or "2D turbulence" theory; however, extending this theory to submesoscales has proved problematic. One obvious reason is the difficulty of obtaining reliable data over many orders of magnitude of spatial scales in an ocean environment. The goal of this presentation is to provide a preliminary synopsis of two recent experiments that overcame these obstacles. The first experiment, the Grand LAgrangian Deployment (GLAD), was conducted during July 2012 in the eastern half of the Gulf of Mexico. Here approximately 300 GPS-tracked drifters were deployed with the primary goal of determining whether the relative dispersion of an initially densely clustered array was driven by processes acting at local pair-separation scales or by straining imposed by mesoscale motions. The second experiment was a component of the LAgrangian Submesoscale Experiment (LASER) conducted during the winter of 2016. Here thousands of bamboo plates were tracked optically from an aerostat. Together these two deployments provided an unprecedented data set on dispersion and clustering processes from 1 to 10^6 meter scales. Calculations of statistics such as two-point separations, structure functions, and scale-dependent relative diffusivities showed an inverse energy cascade, as expected, for scales above 10 km and a forward energy cascade at scales below 10 km, with a possible energy input at Langmuir circulation scales. We also find evidence from structure function calculations for surface flow convergence at scales less than 10 km that accounts for material clustering at the ocean surface.
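The pair statistics named here are straightforward to compute from drifter tracks; the sketch below evaluates relative dispersion and a second-order velocity structure function on synthetic random-walk trajectories. The track model, time step, and bins are illustrative stand-ins, not GLAD/LASER data.

```python
import numpy as np

# Two-particle statistics on synthetic drifter tracks: relative
# dispersion D2(t) = <|x_i - x_j|^2> over pairs, and the second-order
# velocity structure function S2(r) = <|du|^2> binned by separation r.

rng = np.random.default_rng(0)
n_drifters, n_steps, dt = 50, 500, 3600.0           # 50 drifters, hourly fixes
pos = rng.normal(0.0, 100.0, (n_drifters, 2))       # initial cluster, metres
vel = rng.normal(0.0, 0.05, (n_drifters, 2))        # initial velocities, m/s
traj = np.empty((n_steps, n_drifters, 2))
for t in range(n_steps):
    vel += rng.normal(0.0, 0.005, (n_drifters, 2))  # random accelerations
    pos = pos + vel * dt
    traj[t] = pos

# relative dispersion over all unique pairs
i, j = np.triu_indices(n_drifters, k=1)
sep = traj[:, i, :] - traj[:, j, :]                 # (time, pair, 2)
D2 = np.mean(np.sum(sep**2, axis=-1), axis=1)
print("relative dispersion grew by factor", D2[-1] / D2[0])

# second-order structure function at the final time
u = (traj[-1] - traj[-2]) / dt                      # finite-difference velocity
du2 = np.sum((u[i] - u[j])**2, axis=-1)
r = np.sqrt(np.sum(sep[-1]**2, axis=-1))
bins = np.logspace(2, 6, 9)                         # 100 m .. 1000 km edges
idx = np.digitize(r, bins)
for b in range(1, len(bins)):                       # bin b: [bins[b-1], bins[b])
    if np.any(idx == b):
        print(f"r < {bins[b]:9.0f} m  S2 = {du2[idx == b].mean():.3e} m^2/s^2")
```

On real drifter data the scaling of S2(r) with r is what distinguishes the inverse and forward cascade regimes discussed above.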
The military social health index: a partial multicultural validation.
Van Breda, Adrian D
2008-05-01
Routine military deployments place great stress on military families. Before South African soldiers can be deployed, they undergo a comprehensive health assessment, which includes a social work assessment. The assessment focuses on the resilience of the family system to estimate how well the family will cope when exposed to the stress of deployments. This article reports on the development and validation of a new measuring tool, the Military Social Health Index, or MSHI. The MSHI is made up of four scales, each comprising 14 items, viz. social support, problem solving, stressor appraisal, and generalized resistance resources. An initial, large-scale, multicultural validation of the MSHI revealed strong levels of reliability (Cronbach's α and standard error of measurement) and validity (factorial, construct, convergent, and discriminant).
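Cronbach's α, the reliability statistic reported for the MSHI, follows the standard formula α = k/(k−1) · (1 − Σσ²_item/σ²_total); a short sketch computes it on simulated responses to a hypothetical 14-item scale (the data-generating model is an assumption for illustration).

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (respondents, k-items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Simulated responses to a 14-item scale, like each MSHI subscale:
# a shared latent factor plus item-level noise (illustrative only).
rng = np.random.default_rng(42)
latent = rng.normal(0, 1, (300, 1))              # 300 respondents
scores = latent + rng.normal(0, 0.8, (300, 14))  # 14 items
print(f"alpha = {cronbach_alpha(scores):.3f}")
```

Higher inter-item correlation drives the total variance up relative to the summed item variances, which is why α rises toward 1 for internally consistent scales.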
Accommodating Thickness in Origami-Based Deployable Arrays
NASA Technical Reports Server (NTRS)
Zirbel, Shannon A.; Magleby, Spencer P.; Howell, Larry L.; Lang, Robert J.; Thomson, Mark W.; Sigel, Deborah A.; Walkemeyer, Phillip E.; Trease, Brian P.
2013-01-01
The purpose of this work is to create deployment systems with a large ratio of stowed-to-deployed diameter. Deployment from a compact form to a final flat state can be achieved through origami-inspired folding of panels. There are many models capable of this motion when folded in a material with negligible thickness; however, when the application requires the folding of thick, rigid panels, attention must be paid to the effect of material thickness not only on the final folded state, but also during the folding motion (i.e., the panels must not be required to flex to attain the final folded form). The objective is to develop new methods for deployment from a compact folded form to a large circular array (or other final form). This paper describes a mathematical model for modifying the pattern to accommodate material thickness in the context of the design, modeling, and testing of a deployable system inspired by an origami six-sided flasher model. The model is demonstrated in hardware as a 1/20th scale prototype of a deployable solar array for space applications. The resulting prototype has a ratio of stowed-to-deployed diameter of 9.2 (or 1.25 m deployed outer diameter to 0.136 m stowed outer diameter).
PKI security in large-scale healthcare networks.
Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos
2012-06-01
During the past few years many PKIs (Public Key Infrastructures) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure addresses the trust issues that arise in a large-scale healthcare network encompassing multi-domain PKI infrastructures.
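Multi-domain trust in such networks is often organized around a bridge CA; the following conceptual sketch finds a certification path across hypothetical healthcare-unit CAs with a breadth-first search. It is a data-structure illustration only, with invented CA names; a real deployment would validate X.509 chains, key usage, and revocation status at every hop.

```python
from collections import deque

# Conceptual multi-domain trust-path discovery in a bridge-CA topology:
# healthcare-unit CAs cross-certify with a regional bridge, and a
# verifier searches for a certification path from its trust anchor to
# the target's issuing CA. All CA names below are hypothetical.

cross_certs = {
    "hospital-A-CA": ["regional-bridge-CA"],
    "hospital-B-CA": ["regional-bridge-CA"],
    "lab-network-CA": ["regional-bridge-CA"],
    "regional-bridge-CA": ["hospital-A-CA", "hospital-B-CA", "lab-network-CA"],
}

def trust_path(anchor, target_ca):
    """Breadth-first search over cross-certifications; returns a path or None."""
    queue, seen = deque([[anchor]]), {anchor}
    while queue:
        path = queue.popleft()
        if path[-1] == target_ca:
            return path
        for nxt in cross_certs.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(trust_path("hospital-A-CA", "lab-network-CA"))
```

The bridge topology keeps the number of cross-certifications linear in the number of domains, instead of the quadratic growth a full pairwise mesh would require.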
NASA Astrophysics Data System (ADS)
Ali, Hatamirad; Hasan, Mehrjerdi
The automotive industry's production process is one of the most complex and large-scale of all manufacturing processes. Today, information technology (IT) and ERP systems underpin a large portion of these processes. Without an integrated system such as ERP, production and supply chain processes become tangled. ERP systems, the latest generation of MRP systems, simplify the production and sales processes of these industries and are a major factor in their development. Today many large-scale companies are developing and deploying ERP systems. ERP systems facilitate many organizational processes and increase efficiency. Security is a critical part of an organization's ERP strategy; because ERP systems are integrated and extensive, their security matters even more than that of local and legacy systems. Disregarding this point can determine the success or failure of such systems. IRANKHODRO is the biggest automotive factory in the Middle East, with annual production of over 600,000 cars. This paper presents ERP security deployment experience at the IRANKHODRO Company, which recently took a major step forward by launching an ERP system.
IRIS Arrays: Observing Wavefields at Multiple Scales and Frequencies
NASA Astrophysics Data System (ADS)
Sumy, D. F.; Woodward, R.; Frassetto, A.
2014-12-01
The Incorporated Research Institutions for Seismology (IRIS) provides instruments for creating and operating seismic arrays at a wide range of scales. As an example, for over thirty years the IRIS PASSCAL program has provided instruments to individual Principal Investigators to deploy arrays of all shapes and sizes on every continent. These arrays have ranged from just a few sensors to hundreds or even thousands of sensors, covering areas with dimensions of meters to thousands of kilometers. IRIS also operates arrays directly, such as the USArray Transportable Array (TA) as part of the EarthScope program. Since 2004, the TA has rolled across North America, at any given time spanning a swath of approximately 800 km by 2,500 km, and thus far sampling 2% of the Earth's surface. This achievement includes all of the lower-48 U.S., southernmost Canada, and now parts of Alaska. IRIS has also facilitated specialized arrays in polar environments and on the seafloor. In all cases, the data from these arrays are freely available to the scientific community. As the community of scientists who use IRIS facilities and data look to the future they have identified a clear need for new array capabilities. In particular, as part of its Wavefields Initiative, IRIS is exploring new technologies that can enable large, dense array deployments to record unaliased wavefields at a wide range of frequencies. Large-scale arrays might utilize multiple sensor technologies to best achieve observing objectives and optimize equipment and logistical costs. Improvements in packaging and power systems can provide equipment with reduced size, weight, and power that will reduce logistical constraints for large experiments, and can make a critical difference for deployments in harsh environments or other situations where rapid deployment is required. We will review the range of existing IRIS array capabilities with an overview of previous and current deployments and examples of data and results. We will review existing IRIS projects that explore new array capabilities and highlight future directions for IRIS instrumentation facilities.
Base Station Placement Algorithm for Large-Scale LTE Heterogeneous Networks.
Lee, Seungseob; Lee, SuKyoung; Kim, Kyungsoo; Kim, Yoon Hyuk
2015-01-01
Data traffic demands in cellular networks today are increasing at an exponential rate, giving rise to the development of heterogeneous networks (HetNets), in which small cells complement traditional macro cells by extending coverage to indoor areas. However, the deployment of small cells as parts of HetNets creates a key challenge for operators' careful network planning. In particular, massive and unplanned deployment of base stations can cause high interference, severely degrading network performance. Although different mathematical modeling and optimization methods have been used to approach various problems related to this issue, most traditional network planning models are ill-equipped to deal with HetNet-specific characteristics due to their focus on classical cellular network designs. Furthermore, increased wireless data demands have driven mobile operators to roll out large-scale networks of small long term evolution (LTE) cells. Therefore, in this paper, we aim to derive an optimum network planning algorithm for large-scale LTE HetNets. Recently, attempts have been made to apply evolutionary algorithms (EAs) to the field of radio network planning, since they are characterized as global optimization methods. Yet EA performance often deteriorates rapidly with the growth of search-space dimensionality. To overcome this limitation when designing optimum network deployments for large-scale LTE HetNets, we decompose the problem and tackle its subcomponents individually. Noting that some HetNet cells have strong correlations due to inter-cell interference, we propose a correlation grouping approach in which cells are grouped together according to their mutual interference. Both the simulation and analytical results indicate that, in terms of system throughput, the proposed solution outperforms both a random-grouping-based EA and an EA that detects interacting variables by monitoring changes in the objective function.
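A toy version of the correlation-grouping idea: cells whose mutual interference exceeds a threshold are merged into groups with union-find, and a small evolutionary search then optimizes each group independently. The interference matrix, fitness proxy, and EA settings below are illustrative assumptions, not the paper's formulation.

```python
import random

# Correlation grouping + per-group EA, in miniature: strongly
# interfering cells are optimized together; independent groups are
# optimized separately, shrinking each EA's search space.

random.seed(7)
N = 12
interference = [[0.0] * N for _ in range(N)]
for a in range(N):
    for b in range(a + 1, N):
        w = random.random() if abs(a - b) <= 2 else 0.05 * random.random()
        interference[a][b] = interference[b][a] = w

# union-find: merge cells whose pairwise interference exceeds a threshold
parent = list(range(N))
def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x
for a in range(N):
    for b in range(a + 1, N):
        if interference[a][b] > 0.5:
            parent[find(a)] = find(b)
groups = {}
for c in range(N):
    groups.setdefault(find(c), []).append(c)

def fitness(powers, cells):
    """Toy throughput proxy: own power minus interference penalties."""
    return sum(powers) - sum(
        interference[cells[i]][cells[j]] * powers[i] * powers[j]
        for i in range(len(cells)) for j in range(i + 1, len(cells)))

for cells in groups.values():
    pop = [[random.random() for _ in cells] for _ in range(30)]
    for _ in range(100):                     # simple truncation-selection EA
        pop.sort(key=lambda p: fitness(p, cells), reverse=True)
        parents = pop[:10]
        pop = parents + [[min(1.0, max(0.0, g + random.gauss(0, 0.1)))
                          for g in random.choice(parents)] for _ in range(20)]
    print(f"group {cells}: best fitness {fitness(pop[0], cells):.2f}")
```

Because fitness only couples cells within a group, each subproblem's dimensionality stays small even as the total network grows, which is the point of the decomposition.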
Ten Years of Analyzing the Duck Chart: How an NREL Discovery in 2008 Is
NREL examined how to plan for future large-scale integration of solar photovoltaic (PV) generation on the grid. As PV was deployed more widely, system operators became increasingly concerned about how solar generation interacts with system operations, amid emerging energy and environmental policy initiatives pushing for higher levels of solar PV deployment.
Transmission Infrastructure (NREL Energy Analysis)
Covers transmission infrastructure planning and expansion to enable large-scale deployment of renewable energy in the future, including aggregating geothermal with other complementary generating technologies in renewable energy zones. Relevant parties include FERC, NERC, the regional entities, transmission providers, generating companies, and utilities.
Chemical Warfare and Medical Response During World War I
Fitzgerald, Gerard J.
2008-01-01
The first large-scale use of a traditional weapon of mass destruction (chemical, biological, or nuclear) involved the successful deployment of chemical weapons during World War I (1914–1918). Historians now refer to the Great War as the chemist’s war because of the scientific and engineering mobilization efforts by the major belligerents. The development, production, and deployment of war gases such as chlorine, phosgene, and mustard created a new and complex public health threat that endangered not only soldiers and civilians on the battlefield but also chemical workers on the home front involved in the large-scale manufacturing processes. The story of chemical weapons research and development during that war provides useful insights for current public health practitioners faced with a possible chemical weapons attack against civilian or military populations. PMID:18356568
GLAD: a system for developing and deploying large-scale bioinformatics grid.
Teo, Yong-Meng; Wang, Xianbing; Ng, Yew-Kwong
2005-03-01
Grid computing is used to solve large-scale bioinformatics problems involving gigabyte-scale databases by distributing the computation across multiple platforms. Until now, in developing bioinformatics grid applications, it has been extremely tedious to design and implement the component algorithms and parallelization techniques for different classes of problems, and to access remotely located sequence database files of varying formats across the grid. In this study, we propose a grid programming toolkit, GLAD (Grid Life sciences Applications Developer), which facilitates the development and deployment of bioinformatics applications on a grid. GLAD has been developed using ALiCE (Adaptive scaLable Internet-based Computing Engine), a Java-based grid middleware which exploits task-based parallelism. Two bioinformatics benchmark applications, distributed sequence comparison and distributed progressive multiple sequence alignment, have been developed using GLAD.
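GLAD's task-based parallelism can be illustrated, loosely, with a pool of workers consuming independent comparison tasks; the sketch below uses Python multiprocessing and a deliberately naive similarity score as a stand-in for the actual Java/ALiCE API and BLAST-style algorithms.

```python
from multiprocessing import Pool
from itertools import combinations

# Task-parallel pairwise sequence comparison in the spirit of GLAD's
# task-based model, sketched with Python multiprocessing rather than
# the real Java/ALiCE middleware. The Hamming-style score below is a
# placeholder, not a bioinformatics-grade algorithm.

def similarity(task):
    (name_a, a), (name_b, b) = task
    matches = sum(x == y for x, y in zip(a, b))
    return name_a, name_b, matches / max(len(a), len(b))

if __name__ == "__main__":
    db = {
        "seq1": "ACGTACGTACGT",
        "seq2": "ACGTTCGAACGA",
        "seq3": "TTGTACGTACCT",
    }
    tasks = list(combinations(db.items(), 2))  # one independent task per pair
    with Pool() as pool:                       # workers stand in for grid nodes
        for a, b, score in pool.map(similarity, tasks):
            print(f"{a} vs {b}: {score:.2f}")
```

Because each pair comparison is independent, the task queue scales out naturally, which is the same property GLAD exploits when distributing work across a grid.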
Active Self-Testing Noise Measurement Sensors for Large-Scale Environmental Sensor Networks
Domínguez, Federico; Cuong, Nguyen The; Reinoso, Felipe; Touhafi, Abdellah; Steenhaut, Kris
2013-01-01
Large-scale noise pollution sensor networks consist of hundreds of spatially distributed microphones that measure environmental noise. These networks provide historical and real-time environmental data to citizens and decision makers and are therefore a key technology to steer environmental policy. However, the high cost of certified environmental microphone sensors renders large-scale environmental networks prohibitively expensive. Several environmental network projects have started using off-the-shelf low-cost microphone sensors to reduce their costs, but these sensors have higher failure rates and produce lower quality data. To offset this disadvantage, we developed a low-cost noise sensor that actively checks its condition and, indirectly, the integrity of the data it produces. The main design concept is to embed a 13 mm speaker in the noise sensor casing and, by regularly scheduling a frequency sweep, estimate the evolution of the microphone's frequency response over time. This paper presents our noise sensor's hardware and software design together with the results of a test deployment in a large-scale environmental network in Belgium. Our mid-range sensor (around €50) effectively detected all experienced malfunctions, in laboratory tests and outdoor deployments, with a few false positives. Future improvements could further lower the cost of our sensor below €10. PMID:24351634
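The self-test concept (play a known sweep, record it, and compare the band-by-band magnitude response against a stored baseline) can be sketched with synthetic signals; the thresholds and degradation model below are illustrative assumptions, not the sensor's firmware.

```python
import numpy as np

# Sweep-based self-test sketch: estimate the magnitude response band by
# band and flag drift against a stored baseline. Signals are synthetic.

fs = 16000
t = np.arange(0, 1.0, 1 / fs)
# linear chirp from 100 Hz to 6 kHz: phase = 2*pi*(f0*t + 0.5*k*t^2)
sweep = np.sin(2 * np.pi * (100 * t + 0.5 * (6000 - 100) * t**2))

def band_response(recorded, reference, n_bands=16):
    R = np.abs(np.fft.rfft(recorded))
    S = np.abs(np.fft.rfft(reference))
    # with fs = 16 kHz and a 1 s buffer, rfft bin k sits at k Hz;
    # only analyse the swept band, 100 Hz to 6 kHz
    bands = np.array_split(np.arange(100, 6000), n_bands)
    return np.array([R[b].sum() / (S[b].sum() + 1e-12) for b in bands])

baseline = band_response(sweep * 0.9, sweep)   # factory calibration
# simulate a degraded microphone: high-frequency roll-off plus noise
# (time maps to frequency along a linear chirp, so a time ramp works)
rolloff = np.linspace(1.0, 0.3, len(sweep))
noise = 0.01 * np.random.default_rng(3).normal(size=len(sweep))
current = band_response(sweep * 0.9 * rolloff + noise, sweep)

drift_db = 20 * np.log10(current / baseline)
print("per-band drift (dB):", np.round(drift_db, 1))
print("FAIL" if np.any(np.abs(drift_db) > 3.0) else "PASS")
```

Comparing against a per-band baseline rather than an absolute level keeps the test insensitive to the speaker's own (stable) coloration, which is what makes a cheap embedded speaker workable.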
Transforming a Liability Into An Asset-Creating a Market for CO2-based Products
NASA Astrophysics Data System (ADS)
David, B. J.
2016-12-01
This session will discuss converting CO2 from a liability into an asset. It will specifically discuss how at least 25 products can be created using CO2 as a feedstock and deployed in the market at large scale. Focus will be on products that can achieve both market scale and climate significance in their use of CO2 as a feedstock. The session will describe the market drivers supporting and inhibiting commercial deployment of CO2-based products. It will list key barriers and risks in the various CO2-based product segments; these barriers and risks can occur across technology, policy, institutional, economic, and other dimensions. The means to mitigate each barrier, and the likelihood of such means being deployed, will be discussed.
Charting the Emergence of Corporate Procurement of Utility-Scale PV
Jeffrey J. Cook. Though most large-scale solar photovoltaic (PV) deployment has been driven by utilities, corporate interest in renewables is growing as more companies recognize that solar PV can provide clean energy. (Page residue: map of the United States highlighting states with utility-scale solar PV purchasing options; Figure 2, states with...)
Forecasting Demand for KC-135 Sorties: Deploy to Dwell Impacts
2013-06-01
fighter movements from individual units are rampant (6 OSS/OSOS, 2013). However, TACC directed missions in this category are scarce, if not nonexistent (6 OSS/OSOS, 2013). Recent TACC tasked missions that appear to support CONUS fighter movements were training related: pre-deployment preparation and large scale exercises directed by the Joint Staff (6 OSS/OSOS, 2013). Anecdotal evidence that AMC supports CONUS fighter movements was flawed
Resource allocation for epidemic control in metapopulations.
Ndeffo Mbah, Martial L; Gilligan, Christopher A
2011-01-01
Deployment of limited resources is an issue of major importance for decision-making in crisis events. This is especially true for large-scale outbreaks of infectious diseases. Little is known when it comes to identifying the most efficient way of deploying scarce resources for control when disease outbreaks occur in different but interconnected regions. The policy maker is frequently faced with the challenge of optimizing efficiency (e.g. minimizing the burden of infection) while accounting for social equity (e.g. equal opportunity for infected individuals to access treatment). For a large range of diseases described by a simple SIRS model, we consider strategies that should be used to minimize the discounted number of infected individuals during the course of an epidemic. We show that when faced with the dilemma of choosing between socially equitable and purely efficient strategies, the choice of the control strategy should be informed by key measurable epidemiological factors such as the basic reproductive number and the efficiency of the treatment measure. Our model provides new insights for policy makers in the optimal deployment of limited resources for control in the event of epidemic outbreaks at the landscape scale.
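The trade-off the authors study can be illustrated with a toy two-patch SIRS model in which a fixed treatment budget is split between regions. Everything below (parameter values, the coupling term, and the burden metric of undiscounted person-days of infection) is an illustrative assumption, not the paper's actual model.

```python
import numpy as np
from scipy.integrate import solve_ivp

BETA, GAMMA, OMEGA = 0.4, 0.1, 0.05   # transmission, recovery, waning immunity
COUPLING = 0.01                        # movement between the two patches
BUDGET = 0.08                          # total extra recovery capacity per day

def sirs(t, y, allocation):
    """Two-patch SIRS; allocation = share of the budget given to patch 0."""
    s0, i0, r0, s1, i1, r1 = y
    treat = np.array([allocation, 1 - allocation]) * BUDGET
    out = []
    for (s, i, r), (s_o, i_o, r_o), tr in [((s0, i0, r0), (s1, i1, r1), treat[0]),
                                           ((s1, i1, r1), (s0, i0, r0), treat[1])]:
        ds = -BETA * s * i + OMEGA * r + COUPLING * (s_o - s)
        di = BETA * s * i - (GAMMA + tr) * i + COUPLING * (i_o - i)
        dr = (GAMMA + tr) * i - OMEGA * r + COUPLING * (r_o - r)
        out += [ds, di, dr]
    return out

def burden(allocation, y0=(0.95, 0.05, 0.0, 0.99, 0.01, 0.0), horizon=365):
    """Person-days of infection over the horizon (discounting omitted)."""
    sol = solve_ivp(sirs, (0, horizon), y0, args=(allocation,), dense_output=True)
    t = np.linspace(0, horizon, 1000)
    i0, i1 = sol.sol(t)[1], sol.sol(t)[4]
    return np.trapz(i0 + i1, t)

# Compare an equitable split with allocations skewed toward the hotter patch.
for a in (0.5, 0.8, 1.0):
    print(f"allocation to patch 0 = {a:.1f}: burden = {burden(a):.2f}")
```

Sweeping the allocation parameter shows how the best split between an equitable 50/50 division and full prioritization of one patch shifts with the transmission and treatment parameters, echoing the paper's point that the choice should be informed by measurable epidemiological factors.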
House Divided: The Splitting of Active Duty Civil Affairs Forces
2009-12-01
(“DUAL HEADQUARTERS”) History records few instances where a majority of the population welcomes an occupying army. — Ivan Musicant, Banana Wars ... Review Report ... it may not be accepted by the society at large. A large-scale deployment of GPF forces could essentially peel away any further
Cloud-enabled large-scale land surface model simulations with the NASA Land Information System
NASA Astrophysics Data System (ADS)
Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.
2017-12-01
Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA in LIS, meaningful simulations containing a large multi-model ensemble will be enabled and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to the large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists who are interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple runtime environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allow an entire software package along with all dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that have taken weeks and months can now be performed in minutes. This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and describe the potential deployment of this information technology with other NASA applications.
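As a concrete illustration of the container-orchestration pattern described above, the sketch below uses the official Kubernetes Python client to launch a set of containerized model workers. The image name, namespace, entry point, resource requests, and replica count are all hypothetical placeholders; the abstract does not describe the actual LIS/SUMMA packaging at this level of detail.

```python
from kubernetes import client, config

# Connect using the local kubeconfig (use load_incluster_config() inside a pod).
config.load_kube_config()

container = client.V1Container(
    name="lis-summa-worker",
    image="example.registry/lis-summa:latest",   # hypothetical image
    command=["mpirun", "-np", "4", "./LIS"],     # illustrative entry point
    resources=client.V1ResourceRequirements(
        requests={"cpu": "4", "memory": "16Gi"}),
)

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="lis-summa"),
    spec=client.V1DeploymentSpec(
        replicas=32,  # e.g. one pod per ensemble member
        selector=client.V1LabelSelector(match_labels={"app": "lis-summa"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "lis-summa"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)

# Submit the deployment; Kubernetes then schedules and supervises the pods.
client.AppsV1Api().create_namespaced_deployment(namespace="default",
                                                body=deployment)
```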
GATECloud.net: a platform for large-scale, open-source text processing on the cloud.
Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina
2013-01-28
Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research--GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.
NASA Technical Reports Server (NTRS)
Cleveland, Paul E.; Parrish, Keith A.
2005-01-01
A thorough and unique thermal verification and model validation plan has been developed for NASA's James Webb Space Telescope. The JWST observatory consists of a large deployed-aperture optical telescope passively cooled to below 50 Kelvin, along with a suite of several instruments passively and actively cooled to below 37 Kelvin and 7 Kelvin, respectively. Passive cooling to these extremely low temperatures is made feasible by the use of a large deployed high-efficiency sunshield and an orbit location at the L2 Lagrange point. Another enabling feature is the scale or size of the observatory, which allows for large radiator sizes that are compatible with the expected power dissipation of the instruments and large-format Mercury Cadmium Telluride (HgCdTe) detector arrays. This passive cooling concept is simple, reliable, and mission enabling when compared to the alternatives of mechanical coolers and stored cryogens. However, these same large-scale observatory features, which make passive cooling viable, also prevent the typical flight-configuration, fully-deployed thermal balance test that is the keystone of most space missions' thermal verification plans. JWST is simply too large in its deployed configuration to be properly thermal balance tested in the facilities that currently exist. This reality, combined with a mission thermal concept with little to no flight heritage, has necessitated a unique and alternative approach to thermal system verification and model validation. This paper describes the thermal verification and model validation plan that has been developed for JWST. The plan relies on judicious use of cryogenic and thermal design margin, a completely independent thermal modeling cross-check utilizing different analysis teams and software packages, and finally, a comprehensive set of thermal tests that occur at different levels of JWST assembly. After a brief description of the JWST mission and thermal architecture, a detailed description of the three aspects of the thermal verification and model validation plan is presented.
To cope with the rising demand for fresh water, desalination of brackish groundwater and seawater is increasingly being viewed as a pragmatic option for augmenting fresh water supplies. The large scale deployment of desalination is likely to demonstrably increase electricity use,...
Ocean Thermal Energy Conversion (OTEC) Programmatic Environmental Analysis--Appendices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Authors, Various
1980-01-01
The programmatic environmental analysis is an initial assessment of Ocean Thermal Energy Conversion (OTEC) technology considering development, demonstration and commercialization. It is concluded that the OTEC development program should continue because the development, demonstration, and commercialization on a single-plant deployment basis should not present significant environmental impacts. However, several areas within the OTEC program require further investigation in order to assess the potential for environmental impacts from OTEC operation, particularly in large-scale deployments and in defining alternatives to closed-cycle biofouling control: (1) Larger-scale deployments of OTEC clusters or parks require further investigations in order to assess optimal platform siting distances necessary to minimize adverse environmental impacts. (2) The deployment and operation of the preoperational platform (OTEC-1) and future demonstration platforms must be carefully monitored to refine environmental assessment predictions, and to provide design modifications which may mitigate or reduce environmental impacts for larger-scale operations. These platforms will provide a valuable opportunity to fully evaluate the intake and discharge configurations, biofouling control methods, and both short-term and long-term environmental effects associated with platform operations. (3) Successful development of OTEC technology to use the maximal resource capabilities and to minimize environmental effects will require a concerted environmental management program, encompassing many different disciplines and environmental specialties. This volume contains these appendices: Appendix A -- Deployment Scenario; Appendix B -- OTEC Regional Characterization; and Appendix C -- Impact and Related Calculations.
Deployment Methods for an Origami-Inspired Rigid-Foldable Array
NASA Technical Reports Server (NTRS)
Zirbel, Shannon A.; Trease, Brian P.; Magleby, Spencer P.; Howell, Larry L.
2014-01-01
The purpose of this work is to evaluate several deployment methods for an origami-inspired solar array at two size scales: a 25-meter array and a CubeSat array. The array enables rigid panel deployment and introduces new concepts for actuating CubeSat deployables. The design for the array was inspired by the origami flasher model (Lang, 1997; Shafer, 2001). Figure 1 shows the array prototyped from Garolite and Kapton film at the CubeSat scale. Prior work demonstrated that rigid panels like solar cells could successfully be folded into the final stowed configuration without requiring the panels to flex (Zirbel, Lang, Thomson, et al., 2013). The design of the array is novel and enables efficient use of space. The array can be wrapped around the central bus of the spacecraft in the case of the large array, or can accommodate storage of a small instrument payload in the case of the CubeSat array. The radial symmetry of this array around the spacecraft is ideally suited for spacecraft that need to spin. This work focuses on several actuation methods for a one-time deployment of the array. The array is launched in its stowed configuration and will be deployed when it is in space. Concepts for both passive and active actuation were considered.
Node Deployment with k-Connectivity in Sensor Networks for Crop Information Full Coverage Monitoring
Liu, Naisen; Cao, Weixing; Zhu, Yan; Zhang, Jingchao; Pang, Fangrong; Ni, Jun
2016-01-01
Wireless sensor networks (WSNs) are suitable for the continuous monitoring of crop information in large-scale farmland. The information obtained is valuable for regulating crop growth and achieving high yields in precision agriculture (PA). In order to realize full-coverage, k-connectivity WSN deployment for monitoring crop growth information in large-scale farmland and to ensure the accuracy of the monitored data, a new WSN deployment method using a genetic algorithm (GA) is here proposed. The fitness function of the GA was constructed based on the following WSN deployment criteria: (1) nodes must be located in the corresponding plots; (2) the WSN must have k-connectivity; (3) the WSN must have no communication silos; (4) the minimum distance between node and plot boundary must be greater than a specific value to prevent each node from being affected by the farmland edge effect. The deployment experiments were performed on natural farmland and on irregular farmland divided based on spatial differences of soil nutrients. Results showed that both WSNs gave full coverage, there were no communication silos, and the minimum connectivity of nodes was equal to k. The deployment was tested for different values of k and transmission distance (d) of the node. The results showed that, when d was set to 200 m, as k increased from 2 to 4 the minimum connectivity of nodes increased and was equal to k. When k was set to 2, the average connectivity of all nodes increased linearly with the increase of d from 140 m to 250 m, while the minimum connectivity did not change.
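The k-connectivity criterion in such a fitness function can be checked directly on a communication graph. A minimal sketch using networkx follows, assuming nodes communicate exactly when they are within transmission distance d; the plot-membership and edge-distance criteria from the paper are omitted.

```python
import itertools
import networkx as nx

def build_wsn_graph(positions, d):
    """positions: list of (x, y) in metres; add an edge when within range d."""
    g = nx.Graph()
    g.add_nodes_from(range(len(positions)))
    for i, j in itertools.combinations(range(len(positions)), 2):
        (x1, y1), (x2, y2) = positions[i], positions[j]
        if (x1 - x2) ** 2 + (y1 - y2) ** 2 <= d ** 2:
            g.add_edge(i, j)
    return g

def satisfies_k_connectivity(positions, d, k):
    """True if the layout has no communication silos (is connected)
    and survives the failure of any k-1 nodes."""
    g = build_wsn_graph(positions, d)
    return nx.is_connected(g) and nx.node_connectivity(g) >= k

# Example: three nodes in a triangle, d = 200 m, k = 2.
layout = [(0, 0), (150, 0), (75, 120)]
print(satisfies_k_connectivity(layout, d=200, k=2))  # True
```

A GA fitness function would call a check like this on each candidate layout and penalize candidates that fail it.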
Designing for Scale: Reflections on Rolling Out Reading Improvement in Kenya and Liberia.
Gove, Amber; Korda Poole, Medina; Piper, Benjamin
2017-03-01
Since 2008, the Ministries of Education in Liberia and Kenya have undertaken transitions from small-scale pilot programs to improve reading outcomes among primary learners to the large-scale implementation of reading interventions. The effects of the pilots on learning outcomes were significant, but questions remained regarding whether such large gains could be sustained at scale. In this article, the authors dissect the Liberian and Kenyan experiences with implementing large-scale reading programs, documenting the critical components and conditions of the program designs that affected the likelihood of successfully transitioning from pilot to scale. They also review the design, deployment, and effectiveness of each pilot program and the scale, design, duration, enabling conditions, and initial effectiveness results of the scaled programs in each country. The implications of these results for the design of both pilot and large-scale reading programs are discussed in light of the experiences of both the Liberian and Kenyan programs. © 2017 Wiley Periodicals, Inc.
A Phenomenology of Learning Large: The Tutorial Sphere of xMOOC Video Lectures
ERIC Educational Resources Information Center
Adams, Catherine; Yin, Yin; Vargas Madriz, Luis Francisco; Mullen, C. Scott
2014-01-01
The current discourse surrounding Massive Open Online Courses (MOOCs) is powerful. Despite their rapid and widespread deployment, research has yet to confirm or refute some of the bold claims rationalizing the popularity and efficacy of these large-scale virtual learning environments. Also, MOOCs' reputed disruptive, game-changing potential…
Zhao, Yongli; He, Ruiying; Chen, Haoran; Zhang, Jie; Ji, Yuefeng; Zheng, Haomian; Lin, Yi; Wang, Xinbo
2014-04-21
Software defined networking (SDN) has become the focus in the current information and communication technology area because of its flexibility and programmability. It has been introduced into various network scenarios, such as datacenter networks, carrier networks, and wireless networks. Optical transport networks are also regarded as an important application scenario for SDN, which is adopted as the enabling technology of the data communications network (DCN) instead of generalized multi-protocol label switching (GMPLS). However, the practical performance of SDN-based DCN for large-scale optical networks, which is very important for technology selection in future optical network deployment, has not been evaluated up to now. In this paper we have built a large-scale flexi-grid optical network testbed with 1000 virtual optical transport nodes to evaluate the performance of SDN-based DCN, including network scalability, DCN bandwidth limitation, and restoration time. A series of network performance parameters, including blocking probability, bandwidth utilization, average lightpath provisioning time, and failure restoration time, have been demonstrated under various network environments, such as different traffic loads and different DCN bandwidths. The demonstration in this work can be taken as a proof of concept for future network deployment.
Structural design of the Large Deployable Reflector (LDR)
NASA Technical Reports Server (NTRS)
Satter, Celeste M.; Lou, Michael C.
1991-01-01
An integrated Large Deployable Reflector (LDR) analysis model was developed to enable studies of system responses to the mechanical and thermal disturbances anticipated during on-orbit operations. Functional requirements of the major subsystems of the LDR are investigated, design trades are conducted, and design options are proposed. System mass and inertia properties are computed in order to estimate environmental disturbances, and in the sizing of control system hardware. Scaled system characteristics are derived for use in evaluating launch capabilities and achievable orbits. It is concluded that a completely passive 20-m primary appears feasible for the LDR from the standpoint of both mechanical vibration and thermal distortions.
Measuring Large-Scale Social Networks with High Resolution
Stopczynski, Arkadiusz; Sekara, Vedran; Sapiezynski, Piotr; Cuttone, Andrea; Madsen, Mette My; Larsen, Jakob Eg; Lehmann, Sune
2014-01-01
This paper describes the deployment of a large-scale study designed to measure human interactions across a variety of communication channels, with high temporal resolution and spanning multiple years—the Copenhagen Networks Study. Specifically, we collect data on face-to-face interactions, telecommunication, social networks, location, and background information (personality, demographics, health, politics) for a densely connected population of 1,000 individuals, using state-of-the-art smartphones as social sensors. Here we provide an overview of the related work and describe the motivation and research agenda driving the study. Additionally, the paper details the data types measured and the technical infrastructure in terms of both backend and phone software, as well as an outline of the deployment procedures. We document the participant privacy procedures and their underlying principles. The paper concludes with early results from data analysis, illustrating the importance of a multi-channel, high-resolution approach to data collection.
Commentary: Environmental nanophotonics and energy
NASA Astrophysics Data System (ADS)
Smith, Geoff B.
2011-01-01
The reasons nanophotonics is proving central to meeting the need for large gains in energy efficiency and renewable energy supply are analyzed. It enables optimal management and use of environmental energy flows at low cost and on a sufficient scale by providing spectral, directional and temporal control in tune with radiant flows from the sun and the local atmosphere. Benefits and problems involved in large-scale manufacture and deployment are discussed, including how safety issues in some nanosystems can be managed or avoided, a process long established in nature.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-09-25
The Megatux platform enables the emulation of large scale (multi-million node) distributed systems. In particular, it allows for the emulation of large-scale networks interconnecting a very large number of emulated computer systems. It does this by leveraging virtualization and associated technologies to allow hundreds of virtual computers to be hosted on a single moderately sized server or workstation. Virtualization technology provided by modern processors allows for multiple guest OSs to run at the same time, sharing the hardware resources. The Megatux platform can be deployed on a single PC, a small cluster of a few boxes or a large cluster of computers. With a modest cluster, the Megatux platform can emulate complex organizational networks. By using virtualization, we emulate the hardware, but run actual software enabling large scale without sacrificing fidelity.
NASA Astrophysics Data System (ADS)
Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin
2012-08-01
Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise an HPS platform. This research is driven by the issues concerning large-scale simulation resources deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of a virtualisation-based simulation platform (VSIM) is first proposed. Then the article investigates and discusses key approaches in VSIM, including simulation resources modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction, and (3) achieve fault-tolerant simulation.
Shape accuracy optimization for cable-rib tension deployable antenna structure with tensioned cables
NASA Astrophysics Data System (ADS)
Liu, Ruiwei; Guo, Hongwei; Liu, Rongqiang; Wang, Hongxiang; Tang, Dewei; Song, Xiaoke
2017-11-01
Shape accuracy is of substantial importance in deployable structures as the demand for large-scale deployable structures in various fields, especially aerospace engineering, increases. The main purpose of this paper is to present a shape accuracy optimization method to find the optimal pretensions for the desired shape of a cable-rib tension deployable antenna structure with tensioned cables. First, an analysis model of the deployable structure is established using the finite element method. In this model, geometrical nonlinearity is considered for the cable element and beam element. Flexible deformations of the deployable structure under the action of the cable network and tensioned cables are subsequently analyzed separately. Moreover, the influence of the pretension of tensioned cables on natural frequencies is studied. Based on the results, a genetic algorithm is used to find a set of reasonable pretensions and thus minimize structural deformation under a constraint on the first natural frequency. Finally, numerical simulations are presented to analyze the deployable structure under two kinds of constraints. Results show that the shape accuracy and natural frequencies of the deployable structure can be effectively improved by pretension optimization.
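A schematic version of this kind of GA-based pretension search might look as follows, with the finite-element evaluations replaced by stubs; deformation() and first_frequency() are hypothetical placeholders for the paper's FEM model, and all bounds and thresholds are invented for illustration.

```python
import random

N_CABLES, POP, GENERATIONS = 12, 40, 100
P_MIN, P_MAX = 5.0, 50.0      # pretension bounds in newtons (illustrative)
FREQ_MIN = 1.5                # first-natural-frequency constraint in Hz (illustrative)

def deformation(p):            # stub standing in for the FEM surface-error evaluation
    return sum((x - 20.0) ** 2 for x in p)

def first_frequency(p):        # stub standing in for the FEM modal analysis
    return 0.05 * sum(p) / len(p)

def fitness(p):
    """Minimize deformation; heavily penalize violating the frequency constraint."""
    penalty = 1e6 if first_frequency(p) < FREQ_MIN else 0.0
    return deformation(p) + penalty

def evolve():
    pop = [[random.uniform(P_MIN, P_MAX) for _ in range(N_CABLES)]
           for _ in range(POP)]
    for _ in range(GENERATIONS):
        pop.sort(key=fitness)
        parents = pop[:POP // 2]               # truncation selection
        children = []
        while len(children) < POP - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_CABLES)          # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(N_CABLES)               # Gaussian mutation, clipped
            child[i] = min(P_MAX, max(P_MIN, child[i] + random.gauss(0, 2)))
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print("best pretensions:", [round(x, 1) for x in best])
```

The stubs are chosen so the two objectives conflict, which makes the penalty term do real work, mirroring the deformation-versus-frequency trade-off the paper optimizes.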
NASA Technical Reports Server (NTRS)
Cleveland, Paul; Parrish, Keith; Thomson, Shaun; Marsh, James; Comber, Brian
2016-01-01
The James Webb Space Telescope (JWST), successor to the Hubble Space Telescope, will be the largest astronomical telescope ever sent into space. To observe the very first light of the early universe, JWST requires a large deployed 6.5-meter primary mirror cryogenically cooled to less than 50 Kelvin. Three scientific instruments are further cooled via a large radiator system to less than 40 Kelvin. A fourth scientific instrument is cooled to less than 7 Kelvin using a combined pulse-tube/Joule-Thomson mechanical cooler. Passive cryogenic cooling enables the large scale of the telescope, which must be highly folded for launch on an Ariane 5 launch vehicle and deployed once on orbit during its journey to the second Earth-Sun Lagrange point. Passive cooling of the observatory is enabled by the deployment of a large, tennis-court-sized, five-layer Sunshield combined with the use of a network of high-efficiency radiators. A high-purity aluminum heat strap system connects the three instruments' detector systems to the radiator systems to dissipate less than a single watt of parasitic and instrument-dissipated heat. JWST's large-scale features, while enabling passive cooling, also prevent the typical flight-configuration, fully-deployed thermal balance test that is the keystone of most space missions' thermal verification plans. This paper describes the JWST Core 2 Test, which is a cryogenic thermal balance test of a full-size, high-fidelity engineering model of the Observatory's 'Core' area thermal control hardware. The 'Core' area is the key mechanical and cryogenic interface area between all Observatory elements. The 'Core' area thermal control hardware allows for a temperature transition from 300 K to approximately 50 K by attenuating heat from the room-temperature IEC (instrument electronics) and the Spacecraft Bus. Since the flight hardware is not available for test, the Core 2 test uses high-fidelity and flight-like reproductions.
Seattle wide-area information for travelers (SWIFT) : architecture study
DOT National Transportation Integrated Search
1998-10-19
The SWIFT (Seattle Wide-area Information For Travelers) Field Operational Test was intended to evaluate the performance of a large-scale urban Advanced Traveler Information System (ATIS) deployment in the Seattle area. The unique features of the SWIF...
OCEAN THERMAL ENERGY CONVERSION (OTEC) PROGRAMMATIC ENVIRONMENTAL ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sands, M. D.
1980-01-01
This programmatic environmental analysis is an initial assessment of OTEC technology considering development, demonstration and commercialization; it is concluded that the OTEC development program should continue because the development, demonstration, and commercialization on a single-plant deployment basis should not present significant environmental impacts. However, several areas within the OTEC program require further investigation in order to assess the potential for environmental impacts from OTEC operation, particularly in large-scale deployments and in defining alternatives to closed-cycle biofouling control: (1) Larger-scale deployments of OTEC clusters or parks require further investigations in order to assess optimal platform siting distances necessary to minimize adverse environmental impacts. (2) The deployment and operation of the preoperational platform (OTEC-1) and future demonstration platforms must be carefully monitored to refine environmental assessment predictions, and to provide design modifications which may mitigate or reduce environmental impacts for larger-scale operations. These platforms will provide a valuable opportunity to fully evaluate the intake and discharge configurations, biofouling control methods, and both short-term and long-term environmental effects associated with platform operations. (3) Successful development of OTEC technology to use the maximal resource capabilities and to minimize environmental effects will require a concerted environmental management program, encompassing many different disciplines and environmental specialties.
WebCIS: large scale deployment of a Web-based clinical information system.
Hripcsak, G; Cimino, J J; Sengupta, S
1999-01-01
WebCIS is a Web-based clinical information system. It sits atop the existing Columbia University clinical information system architecture, which includes a clinical repository, the Medical Entities Dictionary, an HL7 interface engine, and an Arden Syntax based clinical event monitor. WebCIS security features include authentication with secure tokens, authorization maintained in an LDAP server, SSL encryption, permanent audit logs, and application timeouts. WebCIS is currently used by 810 physicians at the Columbia-Presbyterian center of New York Presbyterian Healthcare to review and enter data into the electronic medical record. Current deployment challenges include maintaining adequate database performance despite complex queries, replacing large numbers of computers that cannot run modern Web browsers, and training users who have never logged onto the Web. Although the raised expectations and higher goals have increased deployment costs, the end result is a far more functional, far more available system.
NASA Astrophysics Data System (ADS)
Zhang, Pengsong; Jiang, Shanping; Yang, Linhua; Zhang, Bolun
2018-01-01
In order to meet the requirement of high-precision thermal distortion measurement for a Φ4.2 m deployable mesh antenna of a satellite in a vacuum and cryogenic environment, a large-scale antenna distortion measurement system for vacuum and cryogenic environments is developed in this paper, based on digital close-range photogrammetry and spacecraft space environment test technology. The Antenna Distortion Measurement System (ADMS) is the first domestically and independently developed thermal distortion measurement system for large antennas, and it solves the problem of non-contact, high-precision distortion measurement of large spacecraft structures in vacuum and cryogenic environments. The measurement accuracy of the ADMS is better than 50 μm over 5 m, reaching an internationally advanced level. The experimental results show that the measurement system has great advantages for large structural measurements of spacecraft and broad application prospects in space and other related fields.
Oak Ridge Leadership Computing Facility Position Paper
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oral, H Sarp; Hill, Jason J; Thach, Kevin G
This paper discusses the business, administration, reliability, and usability aspects of storage systems at the Oak Ridge Leadership Computing Facility (OLCF). The OLCF has developed key competencies in architecting and administering large-scale Lustre deployments as well as HPSS archival systems. Additionally, as these systems are architected, deployed, and expanded over time, reliability and availability factors are a primary driver. This paper focuses on the implementation of the Spider parallel Lustre file system as well as the implementation of the HPSS archive at the OLCF.
ATLAS I/O performance optimization in as-deployed environments
NASA Astrophysics Data System (ADS)
Maier, T.; Benjamin, D.; Bhimji, W.; Elmsheuser, J.; van Gemmeren, P.; Malon, D.; Krumnack, N.
2015-12-01
This paper provides an overview of an integrated program of work underway within the ATLAS experiment to optimise I/O performance for large-scale physics data analysis in a range of deployment environments. It proceeds to examine in greater detail one component of that work, the tuning of job-level I/O parameters in response to changes to the ATLAS event data model, and considers the implications of such tuning for a number of measures of I/O performance.
Fiber-Optic Sensing System: Overview, Development and Deployment in Flight at NASA
NASA Technical Reports Server (NTRS)
Chan, Hon Man; Parker, Allen R.; Piazza, Anthony; Richards, W. Lance
2015-01-01
An overview of the research and technological development of the fiber-optic sensing system (FOSS) at the National Aeronautics and Space Administration Armstrong Flight Research Center (NASA AFRC) is presented. The theory behind fiber Bragg grating (FBG) sensors, as well as the interrogation technique based on optical frequency domain reflectometry (OFDR), is discussed. Assessment and validation of FOSS as an accurate measurement tool for structural health monitoring is realized in the laboratory environment as well as in large-scale flight deployment.
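For FBG sensors, the measured quantity is a shift of the Bragg wavelength, which for pure axial strain follows the standard relation delta_lambda / lambda_0 = (1 - p_e) * strain. The helper below converts a wavelength shift to microstrain using a typical silica-fiber photo-elastic coefficient; the constant is a textbook value, not an AFRC-specific calibration.

```python
P_E = 0.22                      # effective photo-elastic coefficient (typical silica value)

def microstrain(lambda_0_nm: float, delta_lambda_nm: float) -> float:
    """Strain in microstrain for a grating at lambda_0 shifted by delta_lambda."""
    return (delta_lambda_nm / lambda_0_nm) / (1.0 - P_E) * 1e6

# A 1550 nm grating shifting by +0.012 nm reads roughly 9.9 microstrain.
print(round(microstrain(1550.0, 0.012), 1))
```

Temperature also shifts the Bragg wavelength, so real interrogation systems separate thermal and mechanical contributions before applying a relation like this.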
DOE Office of Scientific and Technical Information (OSTI.GOV)
Middleton, Richard S.; Levine, Jonathan S.; Bielicki, Jeffrey M.
CO2 capture, utilization, and storage (CCUS) technology has yet to be widely deployed at a commercial scale despite multiple high-profile demonstration projects. We suggest that developing a large-scale, visible, and financially viable CCUS network could potentially overcome many barriers to deployment and jumpstart commercial-scale CCUS. To date, substantial effort has focused on technology development to reduce the costs of CO2 capture from coal-fired power plants. Here, we propose that near-term investment could focus on implementing CO2 capture on facilities that produce high-value chemicals/products. These facilities can absorb the expected impact of the marginal increase in the cost of production on the price of their product, due to the addition of CO2 capture, more than coal-fired power plants. A financially viable demonstration of a large-scale CCUS network requires offsetting the costs of CO2 capture by using the CO2 as an input to the production of market-viable products. As a result, we demonstrate this alternative development path with the example of an integrated CCUS system where CO2 is captured from ethylene producers and used for enhanced oil recovery in the U.S. Gulf Coast region.
Space Situational Awareness of Large Numbers of Payloads From a Single Deployment
NASA Astrophysics Data System (ADS)
Segerman, A.; Byers, J.; Emmert, J.; Nicholas, A.
2014-09-01
The nearly simultaneous deployment of a large number of payloads from a single vehicle presents a new challenge for space object catalog maintenance and space situational awareness (SSA). Following two cubesat deployments last November, it took five weeks to catalog the resulting 64 orbits. The upcoming Kicksat mission will present an even greater SSA challenge, with its deployment of 128 chip-sized picosats. Although all of these deployments are in short-lived orbits, future deployments will inevitably occur at higher altitudes, with a longer term threat of collision with active spacecraft. With such deployments, individual scientific payload operators require rapid precise knowledge of their satellites' locations. Following the first November launch, the cataloguing did not initially associate a payload with each orbit, leaving this to the satellite operators. For short duration missions, the time required to identify an experiment's specific orbit may easily be a large fraction of the spacecraft's lifetime. For a Kicksat-type deployment, present tracking cannot collect enough observations to catalog each small object. The current approach is to treat the chip cloud as a single catalog object. However, the cloud dissipates into multiple subclouds and, ultimately, tiny groups of untrackable chips. One response to this challenge may be to mandate installation of a transponder on each spacecraft. Directional transponder transmission detections could be used as angle observations for orbit cataloguing. Of course, such an approach would only be employable with cooperative spacecraft. In other cases, a probabilistic association approach may be useful, with the goal being to establish the probability of an element being at a given point in space. This would permit more reliable assessment of the probability of collision of active spacecraft with any cloud element. This paper surveys the cataloguing challenges presented by large scale deployments of small spacecraft, examining current methods. Potential new approaches are discussed, including simulations to evaluate their utility. Acknowledgement: This work was supported by the Office of the Assistant Secretary of Defense for R&E, via the Data-to-Decisions program.
Seattle wide-area information for travelers (SWIFT) : consumer acceptance study
DOT National Transportation Integrated Search
1998-10-19
The Seattle Wide-area Information for Travelers (SWIFT) Operational Test was intended to evaluate the performance of a large-scale, urban Advanced Traveler Information System (ATIS) deployment in the Seattle area. With the majority of the SWIFT syste...
NASA Astrophysics Data System (ADS)
den Hollander, Richard J. M.; Bouma, Henri; van Rest, Jeroen H. C.; ten Hove, Johan-Martijn; ter Haar, Frank B.; Burghouts, Gertjan J.
2017-10-01
Video analytics is essential for managing the large quantities of raw data produced by video surveillance systems (VSS) for the prevention, repression and investigation of crime and terrorism. Analytics is highly sensitive to changes in the scene and in the optical chain, so a VSS with analytics needs careful configuration and prompt maintenance to avoid false alarms. However, there is a trend from static VSS consisting of fixed CCTV cameras towards more dynamic VSS deployments over public/private multi-organization networks, consisting of a wider variety of visual sensors, including pan-tilt-zoom (PTZ) cameras, body-worn cameras and cameras on moving platforms. This trend will lead to more dynamic scenes and more frequent changes in the optical chain, creating structural problems for analytics. If these problems are not adequately addressed, analytics will not be able to continue to meet end users' evolving needs. In this paper, we present a three-part solution for managing the performance of complex analytics deployments. The first part is a register containing metadata describing relevant properties of the optical chain, such as intrinsic and extrinsic calibration, and parameters of the scene, such as lighting conditions or measures of scene complexity (e.g. the number of people). A second part frequently assesses these parameters in the deployed VSS, stores changes in the register, and signals relevant changes in the setup to the VSS administrator. A third part uses the information in the register to dynamically configure analytics tasks based on VSS operator input. To support the feasibility of this solution, we give an overview of related state-of-the-art technologies for autocalibration (self-calibration), scene recognition and lighting estimation in relation to person detection. The presented solution allows for rapid and robust deployment of Video Content Analysis (VCA) tasks in large-scale ad-hoc networks.
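The register in the first two parts of this solution can be sketched as a small data structure that stores per-camera optical-chain and scene parameters and signals relevant changes. The field names and the change criterion below are illustrative assumptions, not the paper's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CameraState:
    focal_length_px: float        # intrinsic calibration
    pan_tilt_zoom: tuple          # extrinsic / PTZ state
    mean_luminance: float         # lighting condition
    person_count: int             # scene-complexity measure

@dataclass
class Register:
    history: dict = field(default_factory=dict)  # camera_id -> [(time, state)]

    def assess(self, camera_id: str, new: CameraState,
               luminance_tol: float = 20.0) -> bool:
        """Store the new measurement; return True when a relevant change
        should be signalled to the VSS administrator."""
        entries = self.history.setdefault(camera_id, [])
        changed = bool(entries) and (
            abs(entries[-1][1].mean_luminance - new.mean_luminance) > luminance_tol
            or entries[-1][1].pan_tilt_zoom != new.pan_tilt_zoom)
        entries.append((datetime.now(timezone.utc), new))
        return changed

reg = Register()
reg.assess("cam-01", CameraState(1200.0, (0, 0, 1), 80.0, 3))
print(reg.assess("cam-01", CameraState(1200.0, (10, 0, 1), 55.0, 5)))  # True
```

The third part of the solution would read the stored history back out to reconfigure analytics tasks, for example disabling a person detector while a PTZ camera is mid-sweep.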
NASA Astrophysics Data System (ADS)
Cross, J. N.; Meinig, C.; Mordy, C. W.; Lawrence-Slavas, N.; Cokelet, E. D.; Jenkins, R.; Tabisola, H. M.; Stabeno, P. J.
2016-12-01
New autonomous sensors have dramatically increased the resolution and accuracy of oceanographic data collection, enabling rapid sampling over extremely fine scales. Innovative new autonomous platforms like floats, gliders, drones, and crawling moorings leverage the full potential of these new sensors by extending spatiotemporal reach across varied environments. During 2015 and 2016, the Innovative Technology for Arctic Exploration Program at the Pacific Marine Environmental Laboratory tested several new types of fully autonomous platforms with increased speed, durability, and power and payload capacity, designed to deliver cutting-edge ecosystem assessment sensors to remote or inaccessible environments. The Expendable Ice-Tracking (EXIT) float developed by the NOAA Pacific Marine Environmental Laboratory (PMEL) is moored near the bottom during the ice-free season and released on an autonomous timer beneath the ice during the following winter. The float collects a rapid profile during ascent, and continues to collect critical, poorly accessible under-ice data until melt, when the data are transmitted via satellite. The autonomous Oculus sub-surface glider developed by the University of Washington and PMEL has a large power and payload capacity and an enhanced buoyancy engine. This 'coastal truck' is designed for the rapid water column ascent required by optical imaging systems. The Saildrone is a solar- and wind-powered unmanned surface vessel (USV) developed by Saildrone, Inc. in partnership with PMEL. This large-payload (200 lbs), fast (1-7 kts), durable (46 kts winds) platform was equipped with 15 sensors designed for ecosystem assessment during 2016, including passive and active acoustic systems specially redesigned for autonomous vehicle deployments. The sensors deployed on these platforms achieved rigorous accuracy and precision standards. These innovative platforms provide new sampling capabilities and cost efficiencies in high-resolution sensor deployment, including reconnaissance for annual fisheries and marine mammal surveys; better linkages between sustained observing platforms; and adaptive deployments that can easily target anomalies as they arise.
Negative emissions: Part 1—research landscape and synthesis
NASA Astrophysics Data System (ADS)
Minx, Jan C.; Lamb, William F.; Callaghan, Max W.; Fuss, Sabine; Hilaire, Jérôme; Creutzig, Felix; Amann, Thorben; Beringer, Tim; de Oliveira Garcia, Wagner; Hartmann, Jens; Khanna, Tarun; Lenzi, Dominic; Luderer, Gunnar; Nemet, Gregory F.; Rogelj, Joeri; Smith, Pete; Vicente, Jose Luis Vicente; Wilcox, Jennifer; del Mar Zamora Dominguez, Maria
2018-06-01
With the Paris Agreement’s ambition of limiting climate change to well below 2 °C, negative emission technologies (NETs) have moved into the limelight of discussions in climate science and policy. Despite several assessments, the current knowledge on NETs is still diffuse and incomplete, but also growing fast. Here, we synthesize a comprehensive body of NETs literature, using scientometric tools and performing an in-depth assessment of the quantitative and qualitative evidence therein. We clarify the role of NETs in climate change mitigation scenarios, their ethical implications, as well as the challenges involved in bringing the various NETs to the market and scaling them up in time. There are six major findings arising from our assessment: first, keeping warming below 1.5 °C requires the large-scale deployment of NETs, but this dependency can still be kept to a minimum for the 2 °C warming limit. Second, accounting for economic and biophysical limits, we identify relevant potentials for all NETs except ocean fertilization. Third, any single NET is unlikely to sustainably achieve the large NETs deployment observed in many 1.5 °C and 2 °C mitigation scenarios. Yet, portfolios of multiple NETs, each deployed at modest scales, could be invaluable for reaching the climate goals. Fourth, a substantial gap exists between the upscaling and rapid diffusion of NETs implied in scenarios and progress in actual innovation and deployment. If NETs are required at the scales currently discussed, the resulting urgency of implementation is currently neither reflected in science nor policy. Fifth, NETs face severe barriers to implementation and are only weakly incentivized so far. Finally, we identify distinct ethical discourses relevant for NETs, but highlight the need to root them firmly in the available evidence in order to render such discussions relevant in practice.
Lessons Learned in Deploying the World's Largest Scale Lustre File System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dillow, David A; Fuller, Douglas; Wang, Feiyi
2010-01-01
The Spider system at the Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) is the world's largest scale Lustre parallel file system. Envisioned as a shared parallel file system capable of delivering both the bandwidth and capacity requirements of the OLCF's diverse computational environment, the project had a number of ambitious goals. To support the workloads of the OLCF's diverse computational platforms, the aggregate performance and storage capacity of Spider exceed those of our previously deployed systems by factors of 6x (240 GB/sec) and 17x (10 Petabytes), respectively. Furthermore, Spider supports over 26,000 clients concurrently accessing the file system, which exceeds our previously deployed systems by nearly 4x. In addition to these scalability challenges, moving to a center-wide shared file system required dramatically improved resiliency and fault-tolerance mechanisms. This paper details our efforts in designing, deploying, and operating Spider. Through a phased approach of research and development, prototyping, deployment, and transition to operations, this work has resulted in a number of insights into large-scale parallel file system architectures, from both the design and the operational perspectives. We present in this paper our solutions to issues such as network congestion, performance baselining and evaluation, file system journaling overheads, and high availability in a system with tens of thousands of components. We also discuss areas of continued challenges, such as stressed metadata performance and the need for file system quality of service, alongside our efforts to address them. Finally, operational aspects of managing a system of this scale are discussed along with real-world data and observations.
NASA Astrophysics Data System (ADS)
Kasdin, N. J.; Shaklan, S.; Lisman, D.; Thomson, M.; Cady, E.; Lo, A.; Macintosh, B.
2014-01-01
An external occulter is a satellite employing a large screen, or starshade, that flies in formation with a spaceborne telescope to provide the starlight suppression needed for detecting and characterizing exoplanets. Among the advantages of using an occulter are the broad band allowed for characterization and the removal of light before it enters the observatory, greatly relaxing the requirements on the telescope and instrument. In this poster we report on the results of our two Technology Development for Exoplanet Missions (TDEM) studies. In the first we examined the manufacturability and metrology of starshade petals, successfully constructing a full-size petal from flight-like materials and showing through precise edge shape measurements that an occulter made with petals consistent with the measured accuracy would achieve close to 10^-10 contrast. Our second TDEM tested the deployment precision of a roughly half-scale starshade. We demonstrated the deployment of an existing deployable truss outfitted with four sub-scale petals and a custom-designed central hub. We showed that the system can be deployed multiple times with a repeatable positioning accuracy of the petals better than the requirement of 1.0 mm. The combined results of these two TDEM projects have significantly advanced the readiness level of occulter technology and moved the community closer to a realizable mission.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2018-01-23
Deploying an ADMS or looking to optimize its value? NREL offers a low-cost, low-risk evaluation platform for assessing ADMS performance. The National Renewable Energy Laboratory (NREL) has developed a vendor-neutral advanced distribution management system (ADMS) evaluation platform and is expanding its capabilities. The platform uses actual grid-scale hardware, large-scale distribution system models, and advanced visualization to simulate real-world conditions for the most accurate ADMS evaluation and experimentation.
Large-scale deep learning for robotically gathered imagery for science
NASA Astrophysics Data System (ADS)
Skinner, K.; Johnson-Roberson, M.; Li, J.; Iscar, E.
2016-12-01
With the explosion of computing power, the intelligence and capability of mobile robotics have dramatically increased over the last two decades. Today, we can deploy autonomous robots to make observations in a variety of environments ripe for scientific exploration. These platforms are capable of gathering a volume of data previously unimaginable. Additionally, optical cameras, driven by mobile phones and consumer photography, have rapidly improved in size, power consumption, and quality, making their deployment cheaper and easier. Finally, in parallel we have seen the rise of large-scale machine learning approaches, particularly deep neural networks (DNNs), increasing the quality of the semantic understanding that can be automatically extracted from optical imagery. In concert, these advances enable new science using a combination of machine learning and robotics. This work will discuss the application of new low-cost high-performance computing approaches and the associated software frameworks to enable scientists to rapidly extract useful science data from millions of robotically gathered images. The automated analysis of imagery on this scale opens up new avenues of inquiry unavailable using more traditional manual or semi-automated approaches. We will use a large archive of millions of benthic images gathered with an autonomous underwater vehicle to demonstrate how these tools enable new scientific questions to be posed.
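A minimal sketch of the bulk-labelling workflow such pipelines enable, using a pretrained torchvision CNN; a real benthic survey would use a model fine-tuned on domain imagery, and the file pattern and batch size here are placeholders.

```python
import glob
import torch
from torchvision import models, transforms
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Generic ImageNet weights stand in for a domain-specific model.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).eval().to(device)
prep = transforms.Compose([
    transforms.Resize(256), transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

@torch.no_grad()
def label_images(pattern="survey/*.jpg", batch_size=64):
    """Yield (path, predicted_class) for every image matching the pattern."""
    paths = sorted(glob.glob(pattern))
    for i in range(0, len(paths), batch_size):
        batch = torch.stack([prep(Image.open(p).convert("RGB"))
                             for p in paths[i:i + batch_size]]).to(device)
        preds = model(batch).argmax(dim=1).cpu()
        yield from zip(paths[i:i + batch_size], preds.tolist())

for path, cls in label_images():
    print(path, cls)
```

Batched GPU inference like this is what makes labelling millions of images tractable compared with manual or semi-automated annotation.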
Deployment Effects of Marine Renewable Energy Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brian Polagye; Mirko Previsic
2010-06-17
Given proper care in siting, design, deployment, operation and maintenance, marine and hydrokinetic technologies could become one of the more environmentally benign sources of electricity generation. In order to accelerate the adoption of these emerging hydrokinetic and marine energy technologies, navigational and environmental concerns must be identified and addressed. All developing hydrokinetic projects involve a wide variety of stakeholders. One of the key issues that site developers face as they engage with this range of stakeholders is that many of the possible conflicts (e.g., shipping and fishing) and environmental issues are not well-understood, due to a lack of technical certainty. In September 2008, re vision consulting, LLC was selected by the Department of Energy (DoE) to apply a scenario-based approach to the emerging wave and tidal technology sectors in order to evaluate the impact of these technologies on the marine environment and potentially conflicting uses. The project’s scope of work includes the establishment of baseline scenarios for wave and tidal power conversion at potential future deployment sites. The scenarios will capture variations in technical approaches and deployment scales to properly identify and characterize environmental impacts and navigational effects. The goal of the project is to provide all stakeholders with an improved understanding of the potential effects of these emerging technologies and focus all stakeholders onto the critical issues that need to be addressed. This groundwork will also help in streamlining siting and associated permitting processes, which are considered key hurdles for the industry’s development in the U.S. today. Re vision is coordinating its efforts with two other project teams funded by DoE which are focused on regulatory and navigational issues. The results of this study are structured into three reports: 1. Wave power scenario description 2. Tidal power scenario description 3. Framework for Identifying Key Environmental Concerns This is the second report in the sequence and describes the results of conceptual feasibility studies of tidal power plants deployed in Tacoma Narrows, Washington. The Narrows contain many of the same competing stakeholder interactions identified at other tidal power sites and serve as a representative case study. Tidal power remains at an early stage of development. As such, a wide range of different technologies are being pursued by different manufacturers. In order to properly characterize impacts, it is useful to characterize the range of technologies that could be deployed at the site of interest. An industry survey informs the process of selecting representative tidal power devices. The selection criteria are that such devices are at an advanced stage of development, to reduce technical uncertainties, and that enough data are available from the manufacturers to inform the conceptual design process of this study. Further, an attempt is made to cover the range of different technologies under development to capture variations in potential environmental effects. A number of other developers are also at an advanced stage of development, including Verdant Power, which has demonstrated an array of turbines in the East River of New York, Clean Current, which has demonstrated a device off Race Rocks, BC, and OpenHydro, which has demonstrated a device at the European Marine Energy Test Center and is on the verge of deploying a larger device in the Bay of Fundy.
MCT demonstrated their device both at Devon (UK) and Strangford Narrows (Northern Ireland). Furthermore OpenHydro, CleanCurrent, and MCT are the three devices being installed at the Minas Passage (Canada). Environmental effects will largely scale with the size of tidal power development. In many cases, the effects of a single device may not be measurable, while larger scale device arrays may have cumulative impacts that differ significantly from smaller scale deployments. In order to characterize these effects, scenarios are established at three deployment scales which nominally represent (1) a small pilot deployment, (2) an early, small commercial deployment, and (3) a large commercial scale plant. For the three technologies and scales at the selected site, this results in a total of nine deployment scenarios outlined in the report.
Economically Sustainable Scaling of Photovoltaics to Meet Climate Targets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Needleman, David Berney; Poindexter, Jeremy R.; Kurchin, Rachel C.
To meet climate goals, photovoltaics (PV) deployment will have to grow rapidly over the next fifteen years. We identify two barriers to this growth: scale-up of manufacturing capacity and the cost of PV module production. We explore several technoeconomic approaches to overcoming these barriers and identify deep reductions in the capital intensity (capex) of PV module manufacturing and large increases in module efficiency as the most promising routes to rapid deployment. Given the lag inherent in rolling out new technology, we explore an approach where growth is fueled by debt or subsidies in the short term and technological advances in the medium term. Finally, we analyze the current capex structure of crystalline silicon PV module manufacturing to identify potential savings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lilliestam, Johan; Barradi, Touria; Caldes, Natalia
Concentrating solar power (CSP) is one of the few renewable electricity technologies that can offer dispatchable electricity at large scale. Thus, it may play an important role in the future, especially to balance fluctuating sources in increasingly renewables-based power systems. Today, its costs are higher than those of PV and wind power and, as most countries do not support CSP, deployment is slow. Unless the expansion gains pace and costs decrease, the industry may stagnate or collapse, and an important technology for climate change mitigation has been lost. Keeping CSP as a maturing technology for dispatchable renewable power thus requires measures to improve its short-term economic attractiveness and to continue reducing costs in the longer term. We suggest a set of three policy instruments - feed-in tariffs or auctions reflecting the value of dispatchable CSP, and not merely its cost; risk coverage support for innovative designs; and demonstration projects - to be deployed in regions where CSP has a potentially large role to play. This could provide the CSP industry with a balance of attractive profits and competitive pressure, the incentive to expand CSP while also reducing its costs, making it ready for broad-scale deployment when it is needed.
The Mediation of Acculturation: Orchestrating School Leadership Development in England
ERIC Educational Resources Information Center
Wallace, Mike; Tomlinson, Michael; O'Reilly, Dermot
2011-01-01
Among western governments, large-scale leadership development initiatives represent an increasingly deployed means of promoting the acculturation of school leaders to support educational reforms and ongoing improvement. England's sophisticated initiative centres on the National College for Leadership in Schools and Children's Services, a…
Transforming Power Systems; 21st Century Power Partnership
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2015-05-20
The 21st Century Power Partnership - a multilateral effort of the Clean Energy Ministerial - serves as a platform for public-private collaboration to advance integrated solutions for the large-scale deployment of renewable energy in combination with deep energy efficiency and smart grid solutions.
NASA Astrophysics Data System (ADS)
Strefler, Jessica; Bauer, Nico; Kriegler, Elmar; Popp, Alexander; Giannousakis, Anastasis; Edenhofer, Ottmar
2018-04-01
There are major concerns about the sustainability of large-scale deployment of carbon dioxide removal (CDR) technologies. It is therefore an urgent question to what extent CDR will be needed to implement the long-term ambition of the Paris Agreement. Here we show that ambitious near-term mitigation significantly decreases CDR requirements to keep the Paris climate targets within reach. Following the nationally determined contributions (NDCs) until 2030 makes 2 °C unachievable without CDR. Reducing 2030 emissions by 20% below NDC levels alleviates the trade-off between high transitional challenges and high CDR deployment. Nevertheless, transitional challenges increase significantly if CDR is constrained to less than 5 Gt CO2 a‑1 in any year. At least 8 Gt CO2 a‑1 of CDR is necessary in the long term to achieve 1.5 °C, and more than 15 Gt CO2 a‑1 to keep transitional challenges in bounds.
Research in Observations of Oceanic Air/Sea Interaction
NASA Technical Reports Server (NTRS)
Long, David G.; Arnold, David V.
1995-01-01
The primary purpose of this research has been: (1) to develop an innovative research radar scatterometer system capable of directly measuring both the radar backscatter and the small-scale and large-scale ocean wave field simultaneously and (2) to deploy this instrument to collect data to support studies of air/sea interaction. The instrument has been successfully completed and deployed. The system deployment lasted for six months during 1995. Results to date suggest that the data is remarkably useful in air/sea interaction studies. While the data analysis is continuing, two journal and fifteen conference papers have been published. Six papers are currently in review, with two additional journal papers scheduled for publication. Three Master's theses on this research have been completed. A Ph.D. student is currently finalizing his dissertation, which should be completed by the end of the calendar year. We have received additional 'mainstream' funding from the NASA oceans branch to continue data analysis and instrument operations. We are actively pursuing results from the data and expect additional publications to follow. This final report briefly describes the instrument system we developed and results to date from the deployment. Additional detail is contained in the attached papers selected from the bibliography.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cory, K.; Coughlin, J.; Coggeshall, C.
State and local governments have grown increasingly aware of the economic, environmental, and societal benefits of taking a lead role in U.S. implementation of renewable energy, particularly distributed photovoltaic (PV) installations. Recently, solar energy's cost premium has declined as a result of technology improvements and an increase in the cost of traditional energy generation. At the same time, a nationwide public policy focus on carbon-free, renewable energy has created a wide range of financial incentives to lower the costs of deploying PV even further. These changes have led to exponential increases in the availability of capital for solar projects, and tremendous creativity in the development of third-party ownership structures. As significant users of electricity, state and local governments can be an excellent example for solar PV system deployment on a national scale. Many public entities are not only considering deployment on public building rooftops, but also large-scale applications on available public lands. The changing marketplace requires that state and local governments be financially sophisticated to capture as much of the economic potential of a PV system as possible. This report examines ways that state and local governments can optimize the financial structure of deploying solar PV for public uses.
Is Intelligent Speed Adaptation ready for deployment?
Carsten, Oliver
2012-09-01
There have been 30 years of research on Intelligent Speed Adaptation (ISA), the in-vehicle system that is designed to promote compliance with speed limits. Extensive trials of ISA in real-world driving have shown that ISA can significantly reduce speeding, that users generally hold positive attitudes toward it, and that at least some sections of the public are willing to purchase ISA systems. Yet large-scale deployment of a system that could deliver huge accident reductions is still by no means guaranteed.
A Networked Sensor System for the Analysis of Plot-Scale Hydrology.
Villalba, German; Plaza, Fernando; Zhong, Xiaoyang; Davis, Tyler W; Navarro, Miguel; Li, Yimei; Slater, Thomas A; Liang, Yao; Liang, Xu
2017-03-20
This study presents the latest updates to the Audubon Society of Western Pennsylvania (ASWP) testbed, a $50,000 USD, 104-node outdoor multi-hop wireless sensor network (WSN). The network collects environmental data from over 240 sensors, including the EC-5, MPS-1 and MPS-2 soil moisture and soil water potential sensors and self-made sap flow sensors, across a heterogeneous deployment comprised of MICAz, IRIS and TelosB wireless motes. A low-cost sensor board and software driver was developed for communicating with the analog and digital sensors. Innovative techniques (e.g., balanced energy efficient routing and heterogeneous over-the-air mote reprogramming) maintained high success rates (>96%) and enabled effective software updating, throughout the large-scale heterogeneous WSN. The edaphic properties monitored by the network showed strong agreement with data logger measurements and were fitted to pedotransfer functions for estimating local soil hydraulic properties. Furthermore, sap flow measurements, scaled to tree stand transpiration, were found to be at or below potential evapotranspiration estimates. While outdoor WSNs still present numerous challenges, the ASWP testbed proves to be an effective and (relatively) low-cost environmental monitoring solution and represents a step towards developing a platform for monitoring and quantifying statistically relevant environmental parameters from large-scale network deployments.
Deployment Effects of Marine Renewable Energy Technologies: Wave Energy Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mirko Previsic
2010-06-17
Given proper care in siting, design, deployment, operation and maintenance, wave energy conversion could become one of the more environmentally benign sources of electricity generation. In order to accelerate the adoption of these emerging hydrokinetic and marine energy technologies, navigational and environmental concerns must be identified and addressed. All developing hydrokinetic projects involve a wide variety of stakeholders. One of the key issues that site developers face as they engage with this range of stakeholders is that, due to a lack of technical certainty, many of the possible conflicts (e.g., shipping and fishing) and environmental issues are not well understood. In September 2008, re vision consulting, LLC was selected by the Department of Energy (DoE) to apply a scenario-based assessment to the emerging hydrokinetic technology sector in order to evaluate the potential impact of these technologies on the marine environment and navigation constraints. The project's scope of work includes the establishment of baseline scenarios for wave and tidal power conversion at potential future deployment sites. The scenarios capture variations in technical approaches and deployment scales to properly identify and characterize environmental and navigational effects. The goal of the project is to provide all stakeholders with an improved understanding of the potential range of technical attributes and potential effects of these emerging technologies and to focus all stakeholders on the critical issues that need to be addressed. By identifying and addressing navigational and environmental concerns in the early stages of the industry's development, serious mistakes that could potentially derail industry-wide development can be avoided. This groundwork will also help in streamlining siting and associated permitting processes, which are considered key hurdles for the industry's development in the U.S. today. Re vision is coordinating its efforts with two other project teams funded by DoE, which are focused on regulatory issues (Pacific Energy Ventures) and navigational issues (PCCI). The results of this study are structured into three reports: (1) Wave power scenario description (2) Tidal power scenario description (3) Framework for Identifying Key Environmental Concerns. This is the first report in the sequence and describes the results of conceptual feasibility studies of wave power plants deployed in Humboldt County, California and Oahu, Hawaii. These two sites contain many of the same competing stakeholder interactions identified at other wave power sites in the U.S. and serve as representative case studies. Wave power remains at an early stage of development. As such, a wide range of different technologies are being pursued by different manufacturers. In order to properly characterize potential effects, it is useful to characterize the range of technologies that could be deployed at the site of interest. An industry survey informed the process of selecting representative wave power devices. The selection criteria require that devices be at an advanced stage of development, to reduce technical uncertainties, and that enough data be available from the manufacturers to inform the conceptual design process of this study. Further, an attempt is made to cover the range of different technologies under development to capture variations in potential environmental effects. Table 1 summarizes the selected wave power technologies.
A number of other developers are also at an advanced stage of development, but are not directly mentioned here. Many environmental effects will largely scale with the size of the wave power plant. In many cases, the effects of a single device may not be measurable, while larger-scale device arrays may have cumulative impacts that differ significantly from smaller-scale deployments. In order to characterize these effects, scenarios are established at three deployment scales which nominally represent (1) a small pilot deployment, (2) a small commercial deployment, and (3) a large commercial-scale plant. It is important to understand that the purpose of this study was to establish baseline scenarios based on basic device data that was provided to us by the manufacturers for illustrative purposes only.
Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling (Final Report)
DOE Office of Scientific and Technical Information (OSTI.GOV)
William J. Schroeder
2011-11-13
This report contains the comprehensive summary of the work performed on the SBIR Phase II, Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling, at Kitware Inc. in collaboration with the Stanford Linear Accelerator Center (SLAC). The goal of the work was to develop collaborative visualization tools for large-scale data. The solutions we proposed address the typical problems faced by geographically- and organizationally-separated research and engineering teams, who produce large data (either through simulation or experimental measurement) and wish to work together to analyze and understand their data. Because the data is large, we expect that it cannot be easily transported to each team member's work site, and that the visualization server must reside near the data. Further, we also expect that each work site has heterogeneous resources: some with large computing clients, tiled (or large) displays and high bandwidth; other sites as simple as a team member on a laptop computer. Our solution is based on the open-source, widely used ParaView large-data visualization application. We extended this tool to support multiple collaborative clients who may locally visualize data, and then periodically rejoin and synchronize with the group to discuss their findings. Options for managing session control, adding annotation, and defining the visualization pipeline, among others, were incorporated. We also developed and deployed a Web visualization framework based on ParaView that enables the Web browser to act as a participating client in a collaborative session. The ParaView Web Visualization framework leverages various Web technologies including WebGL, JavaScript, Java and Flash to enable interactive 3D visualization over the web using ParaView as the visualization server. We steered the development of this technology by teaming with the SLAC National Accelerator Laboratory. SLAC has a computationally intensive problem important to the nation's scientific progress, as described shortly. Further, SLAC researchers routinely generate massive amounts of data, and frequently collaborate with other researchers located around the world. Thus SLAC is an ideal teammate through which to develop, test and deploy this technology. The nature of the datasets generated by simulations performed at SLAC presented unique visualization challenges, especially when dealing with higher-order elements, that were addressed during this Phase II. During this Phase II, we have developed a strong platform for collaborative visualization based on ParaView. We have developed and deployed a ParaView Web Visualization framework that can be used for effective collaboration over the Web. Collaborating and visualizing over the Web presents the community with unique opportunities for sharing and accessing visualization and HPC resources that hitherto were either inaccessible or difficult to use. The technology we developed here will alleviate both these issues as it becomes widely deployed and adopted.
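ParaView's Python scripting interface (paraview.simple) makes the client/server split described above concrete: a lightweight client connects to a pvserver running near the data and drives server-side rendering. The sketch below illustrates only that base model, not the collaborative or Web extensions the report developed; the host name visserver and the port are hypothetical.

    # Minimal paraview.simple sketch: drive a remote visualization server.
    # Assumes a pvserver is already listening on visserver:11111 (hypothetical).
    from paraview.simple import Connect, Sphere, Show, Render, SaveScreenshot

    Connect("visserver", 11111)     # attach this client to the remote server
    source = Sphere(ThetaResolution=32, PhiResolution=32)  # stand-in for large data
    Show(source)                    # pipeline objects live on the server
    Render()                        # rendering happens server-side, near the data
    SaveScreenshot("session_view.png")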
Measurement-Driven Characterization of the Mobile Environment
ERIC Educational Resources Information Center
Soroush, Hamed
2013-01-01
The concurrent deployment of high-quality wireless networks and large-scale cloud services offers the promise of secure ubiquitous access to seemingly limitless amount of content. However, as users' expectations have grown more demanding, the performance and connectivity failures endemic to the existing networking infrastructure have become more…
Fast Open-World Person Re-Identification.
Zhu, Xiatian; Wu, Botong; Huang, Dongcheng; Zheng, Wei-Shi
2018-05-01
Existing person re-identification (re-id) methods typically assume that: 1) any probe person is guaranteed to appear in the gallery target population during deployment (i.e., closed-world) and 2) the probe set contains only a limited number of people (i.e., small search scale). Both assumptions are artificial and breached in real-world applications, since the probe population in target people search can be extremely vast in practice due to the ambiguity of the probe search space boundary. Therefore, it is unrealistic to assume that every probe person is a target person, and a large-scale search over person images is inherently demanded. In this paper, we introduce a new person re-id search setting, called large-scale open-world (LSOW) re-id, characterized by a huge probe image set and an open person population in search, and thus closer to practical deployments. Under LSOW, the under-studied problem of person re-id efficiency is essential in addition to the commonly studied problem of re-id accuracy. We therefore develop a novel fast person re-id method, called Cross-view Identity Correlation and vErification (X-ICE) hashing, for joint learning of cross-view identity representation binarisation and discrimination in a unified manner. Extensive comparative experiments on three large-scale benchmarks have been conducted to validate the superiority and advantages of the proposed X-ICE method over a wide range of state-of-the-art hashing models, person re-id methods, and their combinations.
NASA Astrophysics Data System (ADS)
Han, Xue; Sandels, Claes; Zhu, Kun; Nordström, Lars
2013-08-01
Many studies have claimed that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent power system operation will be changed by various parameters of DERs. This article proposes a modelling framework for an overview analysis of the correlation between DER deployment and distribution grid operation. Furthermore, to validate the framework, the authors describe reference models for different categories of DERs with their unique characteristics, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis is made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation results show that, in general, DER deployment brings possibilities to reduce power losses and voltage drops by compensating power from local generation and optimizing local load profiles.
Taking it to the streets: delivering on deployment.
Carr, Dafna; Welch, Vickie; Fabik, Trish; Hirji, Nadir; O'Connor, Casey
2009-01-01
From inception to deployment, the Wait Time Information System (WTIS) project faced significant challenges associated with time, scope and complexity. It involved not only the creation and deployment of two large-scale province-wide systems (the WTIS and Ontario's Client Registry/Enterprise Master Patient Index) within aggressive time frames, but also the active engagement of 82 Ontario hospitals, scores of healthcare leaders and several thousand clinicians who would eventually be using the new technology and its data. The provincial WTIS project team (see Figure 1) also had to be able to adapt and evolve their planning in an environment that was changing day by day. This article looks at the factors that allowed the team to take the WTIS out to the field and shares the approach, processes and tools used to deploy this complex and ambitious information management and information technology (IM/IT) initiative.
NASA Astrophysics Data System (ADS)
Parker, Tim; Devanney, Peter; Bainbridge, Geoff; Townsend, Bruce
2017-04-01
The march to make every type of seismometer, from weak to strong motion, reliable and economically deployable in any terrestrial environment continues with the availability of three new sensors and seismic systems, including ones with over 200 dB of dynamic range. Until recently there were probably 100 pier-type broadband sensors for every observatory-type pier, not the types of deployments geoscientists need to advance science and monitoring capability. Deeper boreholes are now recognized as the quieter environments for the best observatory-class instruments, and these same instruments can now be deployed in direct-burial environments, which is unprecedented. The experience of facilities with large deployments of broadband seismometers in continental-scale rolling arrays proves the utility of packaging new sensors in corrosion-resistant casings and designing in the robustness needed to work reliably in temporary deployments. Integrating digitizers and other sensors decreases deployment complexity, decreases acquisition and deployment costs, and increases reliability and utility. We'll discuss the informed evolution of broadband pier instruments into modern integrated field tools that enable economic densification of monitoring arrays and support new ways to approach geoscience research in a field environment.
2011-02-01
…capabilities for airbags, sensors, and seatbelts have tailored the code for applications in the automotive industry. Currently the code contains…larger intervals. In certain contact scenarios where contacting parts are moving relative to each other in a rapid fashion, such as airbag deployment…
Personalised Information Services Using a Hybrid Recommendation Method Based on Usage Frequency
ERIC Educational Resources Information Center
Kim, Yong; Chung, Min Gyo
2008-01-01
Purpose: This paper seeks to describe a personal recommendation service (PRS) involving an innovative hybrid recommendation method suitable for deployment in a large-scale multimedia user environment. Design/methodology/approach: The proposed hybrid method partitions content and user into segments and executes association rule mining,…
Workforce Development Analysis (NREL)
Examines the education, training, and experience that will enable continued large-scale deployment of wind and solar technologies; occupations mentioned include workers with customer service, construction, and electrical backgrounds, engineers, and project managers, along with standardized education and training at all levels, from primary school onward.
How to spend a dwindling greenhouse gas budget
NASA Astrophysics Data System (ADS)
Obersteiner, Michael; Bednar, Johannes; Wagner, Fabian; Gasser, Thomas; Ciais, Philippe; Forsell, Nicklas; Frank, Stefan; Havlik, Petr; Valin, Hugo; Janssens, Ivan A.; Peñuelas, Josep; Schmidt-Traub, Guido
2018-01-01
The Paris Agreement is based on emission scenarios that move from a sluggish phase-out of fossil fuels to large-scale late-century negative emissions. Alternative pathways of early deployment of negative emission technologies need to be considered to ensure that climate targets are reached safely and sustainably.
Towards large-scale deployment of bifacial photovoltaics
NASA Astrophysics Data System (ADS)
Kopecek, R.; Libal, J.
2018-06-01
Low photovoltaic module costs imply that increasing the energy yield per module area is now a priority. We argue that modules harvesting sunlight from both sides will strongly penetrate the market but that more field data, better simulation tools and international measurement standards are needed to overcome perceived investment risks.
Microworld Simulations: A New Dimension in Training Army Logistics Management Skills
2004-01-01
Providing effective training to Army personnel is always challenging, but the Army faces some new challenges in training its logistics staff managers in…soldiers are stationed and where materiel and services are readily available. The design and management of the Army's Combat Service Support (CSS) large-scale logistics systems are increasingly important. The skills that are required to manage these systems are difficult to train. Large deployments…
Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D T; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung
2014-01-01
Explosive growth of next-generation sequencing data has resulted in ultra-large-scale data sets and ensuing computational problems. Cloud computing provides an on-demand and scalable environment for large-scale data analysis. Using a MapReduce framework, data and workload can be distributed via a network to computers in the cloud to substantially reduce computational latency. Hadoop/MapReduce has been successfully adopted in bioinformatics for genome assembly, mapping reads to genomes, and finding single nucleotide polymorphisms. Major cloud providers offer Hadoop cloud services to their users. However, it remains technically challenging to deploy a Hadoop cloud for those who prefer to run MapReduce programs in a cluster without built-in Hadoop/MapReduce. We present CloudDOE, a platform-independent software package implemented in Java. CloudDOE encapsulates technical details behind a user-friendly graphical interface, thus liberating scientists from having to perform complicated operational procedures. Users are guided through the user interface to deploy a Hadoop cloud within in-house computing environments and to run applications specifically targeted for bioinformatics, including CloudBurst, CloudBrush, and CloudRS. One may also use CloudDOE on top of a public cloud. CloudDOE consists of three wizards, i.e., Deploy, Operate, and Extend wizards. Deploy wizard is designed to aid the system administrator to deploy a Hadoop cloud. It installs Java runtime environment version 1.6 and Hadoop version 0.20.203, and initiates the service automatically. Operate wizard allows the user to run a MapReduce application on the dashboard list. To extend the dashboard list, the administrator may install a new MapReduce application using Extend wizard. CloudDOE is a user-friendly tool for deploying a Hadoop cloud. Its smart wizards substantially reduce the complexity and costs of deployment, execution, enhancement, and management. Interested users may collaborate to improve the source code of CloudDOE to further incorporate more MapReduce bioinformatics tools into CloudDOE and support next-generation big data open source tools, e.g., Hadoop BigTop and Spark. CloudDOE is distributed under Apache License 2.0 and is freely available at http://clouddoe.iis.sinica.edu.tw/.
NASA Astrophysics Data System (ADS)
Kato, E.; Moriyama, R.; Kurosawa, A.
2016-12-01
Bioenergy with Carbon Capture and Storage (BECCS) is a key component of mitigation strategies in future socio-economic scenarios that aim to keep mean global temperature rise well below 2°C above pre-industrial, which would require net negative carbon emissions at the end of the 21st century. Also, the Paris Agreement from COP21 calls for "a balance between anthropogenic emissions by sources and removals by sinks of greenhouse gases in the second half of this century", which could require large-scale deployment of negative emissions technologies later in this century. Because of the additional requirement for land, developing sustainable low-carbon scenarios requires careful consideration of the land-use implications of large-scale BECCS. In this study, we present possible development strategies for low-carbon scenarios that consider the interaction of economically efficient deployment of bioenergy and/or BECCS technologies, the biophysical limit of bioenergy productivity, and food production. In the evaluations, detailed bioenergy representations, including bioenergy feedstocks and conversion technologies with and without CCS, are implemented in the integrated assessment model GRAPE. Also, to overcome a general discrepancy in yield development between 'top-down' integrated assessment models and 'bottom-up' estimates, we applied yield changes of food and bioenergy crops consistent with process-based biophysical models: PRYSBI-2 (Process-Based Regional-Scale Yield Simulator with Bayesian Inference) for food crops, and SWAT (Soil and Water Assessment Tool) for bioenergy crops under changing climate conditions. Using the framework, economically viable strategies for implementing sustainable BECCS are evaluated.
Subsurface Monitoring of CO2 Sequestration - A Review and Look Forward
NASA Astrophysics Data System (ADS)
Daley, T. M.
2012-12-01
The injection of CO2 into subsurface formations is at least 50 years old, with large-scale utilization of CO2 for enhanced oil recovery (CO2-EOR) beginning in the 1970s. Early monitoring efforts had limited measurements in available boreholes. With growing interest in CO2 sequestration beginning in the 1990s, along with growth in geophysical reservoir monitoring, small to mid-size sequestration monitoring projects began to appear. The overall goals of a subsurface monitoring plan are to provide measurement of CO2-induced changes in subsurface properties at a range of spatial and temporal scales. The range of spatial scales allows tracking of the location and saturation of the plume with varying detail, while finer temporal sampling (up to continuous) allows better understanding of dynamic processes (e.g., multi-phase flow) and constraining of reservoir models. Early monitoring of small-scale pilots associated with CO2-EOR (e.g., the McElroy field and the Lost Hills field) developed many of the methodologies, including tomographic imaging and multi-physics measurements. Large (reservoir) scale sequestration monitoring began with the Sleipner and Weyburn projects. Typically, large-scale monitoring, such as 4D surface seismic, has limited temporal sampling due to costs. Smaller-scale pilots can allow more frequent measurements, as either individual time-lapse 'snapshots' or continuous monitoring. Pilot monitoring examples include the Frio, Nagaoka and Otway pilots using repeated well logging, crosswell imaging, vertical seismic profiles and CASSM (continuous active-source seismic monitoring). For saline reservoir sequestration projects, there is typically integration of characterization and monitoring, since the sites are not pre-characterized resource developments (oil or gas), which reinforces the need for multi-scale measurements. As we move beyond pilot sites, we need to quantify CO2 plume and reservoir properties (e.g., pressure) over large scales, while still obtaining high resolution. Typically the high-resolution (spatial and temporal) tools are deployed in permanent or semi-permanent borehole installations, where special well design may be necessary, such as non-conductive casing for electrical surveys. Effective utilization of monitoring wells requires an approach of modular borehole monitoring (MBM), where multiple measurements can be made. An example is recent work at the Citronelle pilot injection site, where an MBM package with seismic, fluid sampling and distributed fiber sensing was deployed. For future large-scale sequestration monitoring, an adaptive borehole-monitoring program is proposed.
Global economic consequences of deploying bioenergy with carbon capture and storage (BECCS)
NASA Astrophysics Data System (ADS)
Muratori, Matteo; Calvin, Katherine; Wise, Marshall; Kyle, Page; Edmonds, Jae
2016-09-01
Bioenergy with carbon capture and storage (BECCS) is considered a potential source of net negative carbon emissions and, if deployed at sufficient scale, could help reduce carbon dioxide emissions and concentrations. However, the viability and economic consequences of large-scale BECCS deployment are not fully understood. We use the Global Change Assessment Model (GCAM) integrated assessment model to explore the potential global and regional economic impacts of BECCS. As a negative-emissions technology, BECCS would entail a net subsidy in a policy environment in which carbon emissions are taxed. We show that by mid-century, in a world committed to limiting climate change to 2 °C, carbon tax revenues have peaked and are rapidly approaching the point where climate mitigation is a net burden on general tax revenues. Assuming that the required policy instruments are available to support BECCS deployment, we consider its effects on global trade patterns of fossil fuels, biomass, and agricultural products. We find that in a world committed to limiting climate change to 2 °C, the absence of CCS harms fossil-fuel exporting regions, while the presence of CCS, and BECCS in particular, allows greater continued use and export of fossil fuels. We also explore the relationship between carbon prices, food-crop prices and use of BECCS. We show that the carbon price and biomass and food crop prices are directly related. We also show that BECCS reduces the upward pressure on food crop prices by lowering carbon prices and lowering the total biomass demand in climate change mitigation scenarios. All of this notwithstanding, many challenges, both technical and institutional, remain to be addressed before BECCS can be deployed at scale.
Thorstenson, Sten; Molin, Jesper; Lundström, Claes
2014-01-01
Recent technological advances have improved the whole slide imaging (WSI) scanner quality and reduced the cost of storage, thereby enabling the deployment of digital pathology for routine diagnostics. In this paper we present the experiences from two Swedish sites having deployed routine large-scale WSI for primary review. At Kalmar County Hospital, the digitization process started in 2006 to reduce the time spent at the microscope in order to improve the ergonomics. Since 2008, more than 500,000 glass slides have been scanned in the routine operations of Kalmar and the neighboring Linköping University Hospital. All glass slides are digitally scanned yet they are also physically delivered to the consulting pathologist who can choose to review the slides on screen, in the microscope, or both. The digital operations include regular remote case reporting by a few hospital pathologists, as well as around 150 cases per week where primary review is outsourced to a private clinic. To investigate how the pathologists choose to use the digital slides, a web-based questionnaire was designed and sent out to the pathologists in Kalmar and Linköping. The responses showed that almost all pathologists think that ergonomics have improved and that image quality was sufficient for most histopathologic diagnostic work. 38 ± 28% of the cases were diagnosed digitally, but the survey also revealed that the pathologists commonly switch back and forth between digital and conventional microscopy within the same case. The fact that two full-scale digital systems have been implemented and that a large portion of the primary reporting is voluntarily performed digitally shows that large-scale digitization is possible today. PMID:24843825
Enabling Diverse Software Stacks on Supercomputers using High Performance Virtual Clusters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younge, Andrew J.; Pedretti, Kevin; Grant, Ryan
While large-scale simulations have been the hallmark of the High Performance Computing (HPC) community for decades, Large Scale Data Analytics (LSDA) workloads are gaining attention within the scientific community not only as a processing component of large HPC simulations, but also as standalone scientific tools for knowledge discovery. With the path towards Exascale, new HPC runtime systems are also emerging in a way that differs from classical distributed computing models. However, system software for such capabilities on the latest extreme-scale DOE supercomputers needs to be enhanced to more appropriately support these types of emerging software ecosystems. In this paper, we propose the use of Virtual Clusters on advanced supercomputing resources to enable systems to support not only HPC workloads, but also emerging big data stacks. Specifically, we have deployed the KVM hypervisor within Cray's Compute Node Linux on an XC-series supercomputer testbed. We also use libvirt and QEMU to manage and provision VMs directly on compute nodes, leveraging Ethernet-over-Aries network emulation. To our knowledge, this is the first known use of KVM on a true MPP supercomputer. We investigate the overhead of our solution using HPC benchmarks, evaluating both single-node performance and weak scaling of a 32-node virtual cluster. Overall, we find single-node performance of our solution using KVM on a Cray is very efficient, with near-native performance. However, overhead increases by up to 20% as virtual cluster size increases, due to limitations of the Ethernet-over-Aries bridged network. Furthermore, we deploy Apache Spark with large data analysis workloads in a Virtual Cluster, effectively demonstrating how diverse software ecosystems can be supported by High Performance Virtual Clusters.
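As a rough illustration of the provisioning path described (libvirt managing QEMU/KVM guests directly on compute nodes), the sketch below boots and tears down a transient KVM guest through libvirt's Python bindings. The domain XML is a deliberately bare-bones, hypothetical definition; a real virtual-cluster node would add a disk image, network interfaces (e.g., the Ethernet-over-Aries bridge), and CPU pinning.

    # Sketch: create a transient KVM guest via libvirt's Python bindings.
    import libvirt

    DOMAIN_XML = """
    <domain type='kvm'>
      <name>vcluster-node0</name>
      <memory unit='GiB'>4</memory>
      <vcpu>4</vcpu>
      <os><type arch='x86_64'>hvm</type></os>
    </domain>
    """

    conn = libvirt.open("qemu:///system")  # node-local hypervisor connection
    dom = conn.createXML(DOMAIN_XML, 0)    # define and start the guest in one step
    print(dom.name(), "active:", dom.isActive())
    dom.destroy()                          # stop and discard the transient guest
    conn.close()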
The AlpArray Seismic Network: A Large-Scale European Experiment to Image the Alpine Orogen
NASA Astrophysics Data System (ADS)
Hetényi, György; Molinari, Irene; Clinton, John; Bokelmann, Götz; Bondár, István; Crawford, Wayne C.; Dessa, Jean-Xavier; Doubre, Cécile; Friederich, Wolfgang; Fuchs, Florian; Giardini, Domenico; Gráczer, Zoltán; Handy, Mark R.; Herak, Marijan; Jia, Yan; Kissling, Edi; Kopp, Heidrun; Korn, Michael; Margheriti, Lucia; Meier, Thomas; Mucciarelli, Marco; Paul, Anne; Pesaresi, Damiano; Piromallo, Claudia; Plenefisch, Thomas; Plomerová, Jaroslava; Ritter, Joachim; Rümpker, Georg; Šipka, Vesna; Spallarossa, Daniele; Thomas, Christine; Tilmann, Frederik; Wassermann, Joachim; Weber, Michael; Wéber, Zoltán; Wesztergom, Viktor; Živčić, Mladen
2018-04-01
The AlpArray programme is a multinational, European consortium to advance our understanding of orogenesis and its relationship to mantle dynamics, plate reorganizations, surface processes and seismic hazard in the Alps-Apennines-Carpathians-Dinarides orogenic system. The AlpArray Seismic Network has been deployed with contributions from 36 institutions from 11 countries to map physical properties of the lithosphere and asthenosphere in 3D and thus to obtain new, high-resolution geophysical images of structures from the surface down to the base of the mantle transition zone. With over 600 broadband stations operated for 2 years, this seismic experiment is one of the largest simultaneously operated seismological networks in the academic domain, employing hexagonal coverage with station spacing at less than 52 km. This dense and regularly spaced experiment is made possible by the coordinated coeval deployment of temporary stations from numerous national pools, including ocean-bottom seismometers, which were funded by different national agencies. They combine with permanent networks, which also required the cooperation of many different operators. Together these stations ultimately fill coverage gaps. Following a short overview of previous large-scale seismological experiments in the Alpine region, we here present the goals, construction, deployment, characteristics and data management of the AlpArray Seismic Network, which will provide data that is expected to be unprecedented in quality to image the complex Alpine mountains at depth.
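For researchers wanting to work with data from networks like this, FDSN web services are the usual access route, and ObsPy wraps them conveniently. The sketch below is an assumption-laden example: the temporary-network code Z3, the ORFEUS data centre, and station A060A are illustrative choices, and AlpArray waveform access was initially restricted to consortium members.

    # Sketch: fetch one hour of broadband waveforms via FDSN web services.
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("ORFEUS")                     # assumed data centre
    t0 = UTCDateTime("2017-01-01T00:00:00")
    st = client.get_waveforms(network="Z3", station="A060A", location="*",
                              channel="BH?", starttime=t0, endtime=t0 + 3600)
    print(st)                                     # one Stream, three components
    st.plot()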
Cormode, Graham; Dasgupta, Anirban; Goyal, Amit; Lee, Chi Hoon
2018-01-01
Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of the data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate that our LSH variants achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space.
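To make the idea concrete, the sketch below implements one classic LSH family, random-hyperplane hashing for cosine similarity, with banding so that items sharing any one band of their signature become candidate neighbours. It illustrates the general technique evaluated here, not the paper's specific Hadoop variants or optimizations.

    # Sketch: random-hyperplane LSH with banding for candidate generation.
    import numpy as np
    from collections import defaultdict

    rng = np.random.default_rng(0)
    dim, n_planes, n_bands = 64, 32, 4            # 4 bands of 8 bits each
    planes = rng.standard_normal((n_planes, dim))
    band = n_planes // n_bands

    def signature(x):
        return (planes @ x > 0).astype(np.uint8)  # one sign bit per hyperplane

    data = rng.standard_normal((10_000, dim))
    buckets = defaultdict(list)
    for i, x in enumerate(data):
        sig = signature(x)
        for b in range(n_bands):                  # index each band separately
            buckets[(b, sig[b * band:(b + 1) * band].tobytes())].append(i)

    # Candidates for a query are items sharing at least one band with it.
    qsig = signature(data[0])
    cands = set()
    for b in range(n_bands):
        cands.update(buckets[(b, qsig[b * band:(b + 1) * band].tobytes())])
    print(len(cands), "candidates out of", len(data))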
Coverage-guaranteed sensor node deployment strategies for wireless sensor networks.
Fan, Gaojuan; Wang, Ruchuan; Huang, Haiping; Sun, Lijuan; Sha, Chao
2010-01-01
Deployment quality and cost are two conflicting aspects in wireless sensor networks. Random deployment, where the monitored field is covered by randomly and uniformly deployed sensor nodes, is an appropriate approach for large-scale network applications. However, successful applications depend considerably on the deployment quality, that is, on using the minimum number of sensors to achieve a desired coverage. Currently, the number of sensors required to meet the desired coverage is based on asymptotic analysis, which cannot guarantee the deployment quality due to coverage overestimation in real applications. In this paper, we first investigate this coverage overestimation and address the challenge of designing coverage-guaranteed deployment strategies. To overcome this problem, we propose two deployment strategies, namely, the Expected-area Coverage Deployment (ECD) and BOundary Assistant Deployment (BOAD). The deployment quality of the two strategies is analyzed mathematically. Under the analysis, a lower bound on the number of deployed sensor nodes is given to satisfy the desired deployment quality. We justify the correctness of our analysis through rigorous proof, and validate the effectiveness of the two strategies through extensive simulation experiments. The simulation results show that both strategies alleviate the coverage overestimation significantly. In addition, we also evaluate the two proposed strategies in the context of a target detection application. The comparison results demonstrate that if the target appears at the boundary of the monitored region in a given random deployment, the average intrusion distance of BOAD is considerably shorter than that of ECD with the same desired deployment quality. In contrast, ECD has better performance in terms of average intrusion distance when the intruder invades from the inside of the monitored region.
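The asymptotic estimate that this paper argues overestimates real coverage is easy to state: under uniform random deployment of n sensors with sensing radius r over an area A, and ignoring boundary effects, a point is covered with probability 1 - (1 - pi*r^2/A)^n. The sketch below solves this for the node count needed to hit a target coverage; it shows only the baseline calculation, not the paper's ECD or BOAD corrections, and the example numbers are hypothetical.

    # Sketch: the standard asymptotic node-count estimate (boundary effects
    # ignored), which is exactly what the paper says overestimates coverage.
    import math

    def nodes_for_coverage(p_target, r, area):
        q = math.pi * r * r / area    # fraction of the field one sensor covers
        return math.ceil(math.log(1.0 - p_target) / math.log(1.0 - q))

    # Hypothetical example: 99% coverage, 30 m radius, 500 m x 500 m field.
    print(nodes_for_coverage(0.99, 30.0, 500.0 * 500.0))   # -> 405 nodes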
Feasibility of Very Large Sparse Aperture Deployable Antennas
2014-03-27
Feasibility of Very Large Sparse Aperture Deployable Antennas. Thesis by Jason C. Heller, Captain, presented to the faculty of the Air Force Institute of Technology; report no. AFIT-ENY-14-M-24; distribution unlimited.
Transforming Power Systems Through Global Collaboration
DOE Office of Scientific and Technical Information (OSTI.GOV)
2017-06-01
Ambitious and integrated policy and regulatory frameworks are crucial to achieve power system transformation. The 21st Century Power Partnership -- a multilateral initiative of the Clean Energy Ministerial -- serves as a platform for public-private collaboration to advance integrated solutions for the large-scale deployment of renewable energy in combination with energy efficiency and grid modernization.
@NWTC Newsletter: Spring 2015 (NREL)
…8 pp.; NREL Report No. BR-5000-63254. Jimenez, T.; Tegen, S. (2015). Economic Impact from Large-Scale Deployment of Offshore Marine and Hydrokinetic Technology in Oregon. 42 pp.; NREL Report No. TP-… Validation Code. 31 pp.; NREL Report No. TP-5000-62595. Bulaevskaya, V.; Wharton, S.; Clifton, A.; Qualley, G.…
ERIC Educational Resources Information Center
Kharabe, Amol T.
2012-01-01
Over the last two decades, firms have operated in "increasingly" accelerated "high-velocity" dynamic markets, which require them to become "agile." During the same time frame, firms have increasingly deployed complex enterprise systems--large-scale packaged software "innovations" that integrate and automate…
Wireless Technology Infrastructures for Authentication of Patients: PKI that Rings
Sax, Ulrich; Kohane, Isaac; Mandl, Kenneth D.
2005-01-01
As the public interest in consumer-driven electronic health care applications rises, so do concerns about the privacy and security of these applications. Achieving a balance between providing the necessary security while promoting user acceptance is a major obstacle in large-scale deployment of applications such as personal health records (PHRs). Robust and reliable forms of authentication are needed for PHRs, as the record will often contain sensitive and protected health information, including the patient's own annotations. Since the health care industry per se is unlikely to succeed at single-handedly developing and deploying a large scale, national authentication infrastructure, it makes sense to leverage existing hardware, software, and networks. This report proposes a new model for authentication of users to health care information applications, leveraging wireless mobile devices. Cell phones are widely distributed, have high user acceptance, and offer advanced security protocols. The authors propose harnessing this technology for the strong authentication of individuals by creating a registration authority and an authentication service, and examine the problems and promise of such a system. PMID:15684133
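One concrete way a phone can serve as an authentication token, in the spirit of the model proposed here, is an HMAC-based one-time password (HOTP, RFC 4226): the phone and the authentication service share a secret and a counter, and a short code proves possession without transmitting the secret. The sketch below is a generic illustration of that mechanism, not the authors' specific protocol.

    # Sketch: HOTP one-time password generation (RFC 4226).
    import hashlib
    import hmac
    import struct

    def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
        msg = struct.pack(">Q", counter)                   # 8-byte big-endian counter
        digest = hmac.new(secret, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                         # dynamic truncation
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # RFC 4226 test vector: this prints "755224".
    print(hotp(b"12345678901234567890", 0))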
The role of large—scale BECCS in the pursuit of the 1.5°C target: an Earth system model perspective
NASA Astrophysics Data System (ADS)
Muri, Helene
2018-04-01
The increasing awareness of the many damaging aspects of climate change has prompted research into ways of reducing and reversing the anthropogenic increase in carbon concentrations in the atmosphere. Most emission scenarios stabilizing climate at low levels, such as the 1.5 °C target outlined by the Paris Agreement, require large-scale deployment of Bio-Energy with Carbon Capture and Storage (BECCS). Here, the potential of large-scale BECCS deployment to contribute towards the 1.5 °C global warming target is evaluated using an Earth system model, along with the associated climate responses and carbon cycle feedbacks. The geographical location of the bioenergy feedstock is shown to be key to the success of such measures in the context of temperature targets. Although net negative emissions were reached sooner, by ∼6 years, and scaled up, land-use change emissions and reductions in forest carbon sinks outweighed these effects in one scenario. Re-cultivating the mid-latitudes, on the other hand, was found to be beneficial, though it contributed towards the 1.5 °C target only by ‑0.1 °C and ‑54 Gt C in avoided emissions. Obstacles remain related to competition for land from nature preservation and food security, as well as the technological availability of CCS.
Technologies for low radio frequency observations of the Cosmic Dawn
NASA Astrophysics Data System (ADS)
Jones, D. L.
2014-03-01
The Jet Propulsion Laboratory (JPL) is developing concepts and technologies for low frequency radio astronomy space missions aimed at observing highly redshifted neutral Hydrogen from the Dark Ages. This is the period of cosmic history between the recombination epoch when the microwave background radiation was produced and the re-ionization of the intergalactic medium by the first generation of stars (Cosmic Dawn). This period, at redshifts z > ~20, is a critical epoch for the formation and evolution of large-scale structure in the universe. The 21-cm spectral line of Hydrogen provides the most promising method for directly studying the Dark Ages, but the corresponding frequencies at such large redshifts are only tens of MHz and thus require space-based observations to avoid terrestrial RFI and ionospheric absorption and refraction. This paper reports on the status of several low frequency technology development activities at JPL, including deployable bi-conical dipoles for a planned lunar-orbiting mission, and both rover-deployed and inflation-deployed long dipole antennas for use on the lunar surface. In addition, recent results from laboratory testing of low frequency receiver designs are presented. Finally, several concepts for space-based imaging interferometers utilizing deployable low frequency antennas are described. Some of these concepts involve large numbers of antennas and consequently a large digital cross-correlator will be needed. JPL has studied correlator architectures that greatly reduce the DC power required for this step, which can dominate the power consumption of real-time signal processing. Strengths and weaknesses of each mission concept are discussed in the context of the additional technology development required.
Minassian, Arpi; Geyer, Mark A.; Baker, Dewleen G.; Nievergelt, Caroline M.; O'Connor, Daniel T.; Risbrough, Victoria B.
2014-01-01
Objective Heart rate variability (HRV), thought to reflect autonomic nervous system function, is lowered in conditions such as posttraumatic stress disorder (PTSD). The potential confounding effects of traumatic brain injury (TBI) and depression in the relationship between HRV and PTSD have not been elucidated in a large cohort of military service members. Here we describe HRV associations with stress disorder symptoms in a large study of Marines, while accounting for well-known covariates of HRV and PTSD including TBI and depression. Methods Four battalions of male active-duty Marines (N=2430) were assessed 1-2 months prior to a combat deployment. HRV was measured during 5 minutes of rest. Depression and PTSD were assessed using the Beck Depression Inventory and Clinician Administered PTSD scale respectively. Results After accounting for covariates including TBI, a regression indicated that lower levels of high frequency (HF) HRV were associated with a diagnosis of PTSD (beta = -.20, p=.035). Depression and PTSD severity were correlated (r= .49, p <.001), however participants with PTSD but relatively low depression scores exhibited reduced HF compared to controls (p=.012). Marines with deployment experience (n=1254) had lower HRV than those with no experience (p = .033). Conclusions This cross-sectional analysis of a large cohort supports associations between PTSD and reduced HRV when accounting for TBI and depression symptoms. Future post-deployment assessments will be used to determine whether pre-deployment HRV can predict vulnerability and resilience to the serious psychological and physiological consequences of combat exposure. PMID:24804881
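High-frequency HRV of the kind analyzed here is conventionally computed as the spectral power of the inter-beat (RR) interval series in the 0.15-0.40 Hz band. The sketch below shows that standard calculation on synthetic data; it is not necessarily the study's exact processing pipeline.

    # Sketch: HF-HRV as 0.15-0.40 Hz power of an evenly resampled RR series.
    import numpy as np
    from scipy.interpolate import interp1d
    from scipy.signal import welch

    def hf_power(rr_ms, fs=4.0, band=(0.15, 0.40)):
        t = np.cumsum(rr_ms) / 1000.0              # beat times in seconds
        grid = np.arange(t[0], t[-1], 1.0 / fs)    # uniform 4 Hz time grid
        rr_even = interp1d(t, rr_ms, kind="cubic")(grid)
        f, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)
        m = (f >= band[0]) & (f <= band[1])
        return np.trapz(psd[m], f[m])              # HF power in ms^2

    rng = np.random.default_rng(1)
    rr = 850 + 40 * rng.standard_normal(300)       # ~5 minutes of synthetic RR
    print(f"HF power: {hf_power(rr):.1f} ms^2")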
Geospatial analysis of near-term potential for carbon-negative bioenergy in the United States
Baik, Ejeong; Turner, Peter A.; Mach, Katharine J.; Field, Christopher B.; Benson, Sally M.
2018-01-01
Bioenergy with carbon capture and storage (BECCS) is a negative-emissions technology that may play a crucial role in climate change mitigation. BECCS relies on the capture and sequestration of carbon dioxide (CO2) following bioenergy production to remove and reliably sequester atmospheric CO2. Previous BECCS deployment assessments have largely overlooked the potential lack of spatial colocation of suitable storage basins and biomass availability, in the absence of long-distance biomass and CO2 transport. These conditions could constrain the near-term technical deployment potential of BECCS due to social and economic barriers that exist for biomass and CO2 transport. This study leverages biomass production data and site-specific injection and storage capacity estimates at high spatial resolution to assess the near-term deployment opportunities for BECCS in the United States. If the total biomass resource available in the United States was mobilized for BECCS, an estimated 370 Mt CO2⋅y−1 of negative emissions could be supplied in 2020. However, the absence of long-distance biomass and CO2 transport, as well as limitations imposed by unsuitable regional storage and injection capacities, collectively decrease the technical potential of negative emissions to 100 Mt CO2⋅y−1. Meeting this technical potential may require large-scale deployment of BECCS technology in more than 1,000 counties, as well as widespread deployment of dedicated energy crops. Specifically, the Illinois basin, Gulf region, and western North Dakota have the greatest potential for near-term BECCS deployment. High-resolution spatial assessment as conducted in this study can inform near-term opportunities that minimize social and economic barriers to BECCS deployment. PMID:29531081
Modeling the Economic Impacts of Large Deployments on Local Communities
2008-12-01
Thesis by Aaron L..., presented to the Faculty, Department of Systems Engineering and..., Air Force Institute of Technology (AFIT/GCA/ENV/08-D01); approved for public release, distribution unlimited.
Large-scale machine learning and evaluation platform for real-time traffic surveillance
NASA Astrophysics Data System (ADS)
Eichel, Justin A.; Mishra, Akshaya; Miller, Nicholas; Jankovic, Nicholas; Thomas, Mohan A.; Abbott, Tyler; Swanson, Douglas; Keller, Joel
2016-09-01
In traffic engineering, vehicle detectors are trained on limited datasets, resulting in poor accuracy when deployed in real-world surveillance applications. Annotating large-scale high-quality datasets is challenging. Typically, these datasets have limited diversity; they do not reflect the real-world operating environment. There is a need for a large-scale, cloud-based positive and negative mining process and a large-scale learning and evaluation system for the application of automatic traffic measurements and classification. The proposed positive and negative mining process addresses the quality of crowdsourced ground-truth data through machine learning review and human feedback mechanisms. The proposed learning and evaluation system uses a distributed cloud computing framework to handle the data-scaling issues associated with large numbers of samples and a high-dimensional feature space. The system is trained using AdaBoost on 1,000,000 Haar-like features extracted from 70,000 annotated video frames. When tested on ~7,500,000 video frames, the trained real-time vehicle detector achieves an accuracy of at least 95% half of the time and of about 78% for 19/20 of the time. At the end of 2016, the dataset is expected to have over 1 billion annotated video frames.
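A minimal sketch of the detector training named above (AdaBoost over Haar-like features) is shown below, using scikit-image and scikit-learn; the patch size, feature type, and random stand-in data are illustrative assumptions, not the paper's pipeline.

```python
# Sketch only: AdaBoost over Haar-like features on toy image patches.
import numpy as np
from skimage.feature import haar_like_feature
from skimage.transform import integral_image
from sklearn.ensemble import AdaBoostClassifier

def extract_features(patches):
    """One Haar-like feature vector per grayscale patch."""
    feats = []
    for p in patches:
        ii = integral_image(p)
        feats.append(haar_like_feature(ii, 0, 0, ii.shape[1], ii.shape[0],
                                       feature_type='type-2-x'))
    return np.asarray(feats)

# Random stand-ins for annotated vehicle (pos) and background (neg) patches.
rng = np.random.default_rng(0)
pos = rng.random((50, 12, 12))
neg = rng.random((50, 12, 12))
X = extract_features(np.concatenate([pos, neg]))
y = np.r_[np.ones(50), np.zeros(50)]

# AdaBoost with decision stumps also acts as a feature selector, which is
# what keeps per-frame evaluation cheap in detectors of this style.
clf = AdaBoostClassifier(n_estimators=100).fit(X, y)
print("training accuracy:", clf.score(X, y))
```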
Emerging Cyber Infrastructure for NASA's Large-Scale Climate Data Analytics
NASA Astrophysics Data System (ADS)
Duffy, D.; Spear, C.; Bowen, M. K.; Thompson, J. H.; Hu, F.; Yang, C. P.; Pierce, D.
2016-12-01
The resolution of NASA climate and weather simulations has grown dramatically over the past few years, with the highest-fidelity models reaching down to 1.5-km global resolution. With each doubling of the resolution, the resulting data sets grow by a factor of eight in size. As the climate and weather models push the envelope even further, a new infrastructure to store data and provide large-scale data analytics is necessary. The NASA Center for Climate Simulation (NCCS) has deployed the Data Analytics Storage Service (DASS), which combines scalable storage with the ability to perform in-situ analytics. Within this system, large, commonly used data sets are stored in a POSIX file system (write once/read many); examples of stored data include Landsat, MERRA2, observing system simulation experiments, and high-resolution downscaled reanalysis. The total size of this repository is on the order of 15 petabytes of storage. In addition to the POSIX file system, the NCCS has deployed file system connectors to enable emerging analytics built on top of the Hadoop File System (HDFS) to run on the same storage servers within the DASS. Coupled with a custom spatiotemporal indexing approach, users can now run emerging analytical operations built on MapReduce and Spark on the same data files stored within the POSIX file system without having to make additional copies. This presentation will discuss the architecture of this system and present benchmark performance measurements ranging from traditional TeraSort and WordCount to large-scale climate analytical operations on NetCDF data.
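The analytics pattern the abstract describes (MapReduce/Spark operations running directly over NetCDF files in a POSIX store) can be sketched as follows; the file inventory and variable name "T2M" are hypothetical, and this is not the NCCS code.

```python
# Sketch only: time-mean of a NetCDF variable computed MapReduce-style.
import numpy as np
from netCDF4 import Dataset
from pyspark.sql import SparkSession

def partial_sum(path):
    with Dataset(path) as nc:                   # one file per task
        v = np.asarray(nc.variables["T2M"][:])  # assumed (time, lat, lon) slab
    return v.sum(axis=0), v.shape[0]

spark = SparkSession.builder.appName("netcdf-mean").getOrCreate()
files = ["merra2_%03d.nc4" % i for i in range(100)]     # hypothetical inventory
total, count = (spark.sparkContext.parallelize(files)
                .map(partial_sum)
                .reduce(lambda a, b: (a[0] + b[0], a[1] + b[1])))
time_mean = total / count                       # (lat, lon) climatological mean
```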
Large-Scale Science Observatories: Building on What We Have Learned from USArray
NASA Astrophysics Data System (ADS)
Woodward, R.; Busby, R.; Detrick, R. S.; Frassetto, A.
2015-12-01
With the NSF-sponsored EarthScope USArray observatory, the Earth science community has built the operational capability and experience to tackle scientific challenges at the largest scales, such as a Subduction Zone Observatory. In the first ten years of USArray, geophysical instruments were deployed across roughly 2% of the Earth's surface. The USArray operated a rolling deployment of seismic stations that occupied ~1,700 sites across the USA, made co-located atmospheric observations, occupied hundreds of sites with magnetotelluric sensors, expanded a backbone reference network of seismic stations, and provided instruments to PI-led teams that deployed thousands of additional seismic stations. USArray included a comprehensive outreach component that directly engaged hundreds of students at over 50 colleges and universities to locate station sites and provided Earth science exposure to roughly 1,000 landowners who hosted stations. The project also included a comprehensive data management capability that received, archived and distributed data, metadata, and data products; data were acquired and distributed in real time. The USArray project was completed on time and under budget and developed a number of best practices that can inform other large-scale science initiatives that the Earth science community is contemplating. Key strategies employed by USArray included: using a survey, rather than hypothesis-driven, mode of observation to generate comprehensive, high quality data on a large-scale for exploration and discovery; making data freely and openly available to any investigator from the very onset of the project; and using proven, commercial, off-the-shelf systems to ensure a fast start and avoid delays due to over-reliance on unproven technology or concepts. Scope was set ambitiously, but managed carefully to avoid overextending. Configuration was controlled to ensure efficient operations while providing consistent, uniform observations. Finally, community governance structures were put in place to ensure a focus on science needs and goals, to provide an informed review of the project's results, and to carefully balance consistency of observations with technical evolution. We will summarize lessons learned from USArray and how these can be applied to future efforts such as SZO.
Technologies for Low Frequency Radio Observations of the Cosmic Dawn
NASA Technical Reports Server (NTRS)
Jones, Dayton L.
2014-01-01
The Jet Propulsion Laboratory (JPL) is developing concepts and technologies for low frequency radio astronomy space missions aimed at observing highly redshifted neutral Hydrogen from the Dark Ages. This is the period of cosmic history between the recombination epoch when the microwave background radiation was produced and the re-ionization of the intergalactic medium by the first generation of stars (Cosmic Dawn). This period, at redshifts greater than about 20, is a critical epoch for the formation and evolution of large-scale structure in the universe. The 21-cm spectral line of Hydrogen provides the most promising method for directly studying the Dark Ages, but the corresponding frequencies at such large redshifts are only tens of MHz and thus require space-based observations to avoid terrestrial RFI and ionospheric absorption and refraction. This paper reports on the status of several low frequency technology development activities at JPL, including deployable bi-conical dipoles for a planned lunar-orbiting mission, and both rover-deployed and inflation-deployed long dipole antennas for use on the lunar surface.
Wheeling and Banking Strategies for Optimal Renewable Energy Deployment. International Experiences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heeter, Jenny; Vora, Ravi; Mathur, Shivani
This paper defines the principles of wheeling (i.e., transmission) tariffs and renewable energy (RE) banking provisions and their role in RE deployment in countries with plans for large-scale RE. It reviews experiences to date in the United States, Mexico, and India and discusses key policy and regulatory considerations for devising more effective wheeling and/or banking provisions for countries with ambitious RE deployment targets. The paper addresses the challenges of competing needs of stakeholders, especially those of RE generators, distribution utilities, and transmission network owners and operators. The importance of wheeling and banking and their effectiveness for the financial viability of RE deployment is also explored. This paper aims to benefit policymakers and regulators as well as key renewable energy stakeholders. Key lessons for regulators include: creating long-term wheeling and banking policy certainty, considering incentivizing RE through discounted transmission access, and assessing the cost implications of such discounts, as well as expanding access to renewable energy customers.
Research on OpenStack of open source cloud computing in colleges and universities’ computer room
NASA Astrophysics Data System (ADS)
Wang, Lei; Zhang, Dandan
2017-06-01
In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large number of users through its advantages of openness and low cost, and has now reached large-scale promotion and application. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this analysis, the specific application and deployment of OpenStack in a university computer room is then described. The experimental results show that OpenStack can efficiently and conveniently deploy a cloud for a university computer room, with stable performance and good functional value.
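As a hedged illustration of the kind of deployment the paper discusses, the snippet below boots a batch of lab VMs with the OpenStack SDK (openstacksdk); the cloud name, image, flavor, and network names are assumptions for the example, not the paper's configuration.

```python
# Sketch only: provisioning lab VMs via the OpenStack SDK.
import openstack

conn = openstack.connect(cloud="lab-cloud")          # reads clouds.yaml
image = conn.compute.find_image("ubuntu-22.04")      # hypothetical names
flavor = conn.compute.find_flavor("m1.small")
net = conn.network.find_network("lab-net")

for i in range(30):                                  # one VM per workstation
    conn.compute.create_server(
        name="lab-vm-%02d" % i, image_id=image.id,
        flavor_id=flavor.id, networks=[{"uuid": net.id}])
```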
2018-01-01
Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space. PMID:29346410
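For orientation, a minimal random-hyperplane LSH index (one common variant; the paper's four Hadoop-deployed variants are not reproduced here) looks like this:

```python
# Sketch only: random-hyperplane (cosine) LSH with a single hash table.
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(42)
dim, n_bits = 64, 16
planes = rng.normal(size=(n_bits, dim))        # random hyperplanes

def signature(x):
    """16-bit signature: the side of each hyperplane x falls on."""
    return tuple((planes @ x > 0).astype(int))

db = rng.normal(size=(1000, dim))              # toy database
buckets = defaultdict(list)
for i, x in enumerate(db):
    buckets[signature(x)].append(i)

# A near-duplicate of item 0 very likely hashes into the same bucket,
# so the candidate search touches one bucket instead of all 1000 items.
query = db[0] + 0.01 * rng.normal(size=dim)
print(buckets[signature(query)])
```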
Gaussian processes for personalized e-health monitoring with wearable sensors.
Clifton, Lei; Clifton, David A; Pimentel, Marco A F; Watkinson, Peter J; Tarassenko, Lionel
2013-01-01
Advances in wearable sensing and communications infrastructure have allowed the widespread development of prototype medical devices for patient monitoring. However, such devices have not penetrated into clinical practice, primarily due to a lack of research into "intelligent" analysis methods that are sufficiently robust to support large-scale deployment. Existing systems are typically plagued by large false-alarm rates and an inability to cope with sensor artifact in a principled manner. This paper has two aims: 1) proposal of a novel, patient-personalized system for analysis and inference in the presence of data uncertainty, typically caused by sensor artifact and data incompleteness; 2) demonstration of the method using a large-scale clinical study in which 200 patients have been monitored using the proposed system. The latter provides much-needed evidence that personalized e-health monitoring is feasible within an actual clinical environment, at scale, and that the method is capable of improving patient outcomes via personalized healthcare.
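The core idea, inference with explicit predictive uncertainty, can be sketched with an off-the-shelf Gaussian process regressor; the kernel, noise level, and synthetic heart-rate trace below are illustrative assumptions, not the paper's patient-personalized model.

```python
# Sketch only: flag samples the GP posterior cannot explain.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

t = np.linspace(0, 10, 200)[:, None]                    # minutes
rng = np.random.default_rng(1)
hr = 70 + 3 * np.sin(t.ravel()) + rng.normal(0, 0.5, 200)
hr[120] = 110                                           # injected artifact

# Fixed hyperparameters (optimizer=None) keep the sketch deterministic.
gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(0.25),
                              optimizer=None).fit(t, hr)
mu, sd = gp.predict(t, return_std=True)
alarms = np.abs(hr - mu) > 3 * sd                       # artifact candidates
print("flagged times:", t[alarms].ravel())
```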
915-MHz Wind Profiler for Cloud Forecasting at Brookhaven National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jensen, M.; Bartholomew, M. J.; Giangrande, S.
When considering the amount of shortwave radiation incident on a photovoltaic solar array and, therefore, the amount and stability of the energy output from the system, clouds represent the greatest source of short-term (i.e., scale of minutes to hours) variability through scattering and reflection of incoming solar radiation. Providing estimates of this short-term variability is important for determining and regulating the output from large solar arrays as they connect with the larger power infrastructure. In support of the installation of a 37-MW solar array on the grounds of Brookhaven National Laboratory (BNL), a study of the impacts of clouds on the output of the solar array has been undertaken. The study emphasis is on predicting the change in surface solar radiation resulting from the observed/forecast cloud field on a 5-minute time scale. At these time scales, advection of cloud elements over the solar array is of particular importance. As part of the BNL Aerosol Life Cycle Intensive Operational Period (IOP), a 915-MHz Radar Wind Profiler (RWP) was deployed to determine the profile of low-level horizontal winds and the depth of the planetary boundary layer. The RWP's initial cloud-forecasting mission was subsequently expanded to provide horizontal wind measurements for estimating and constraining cloud advection speeds. A secondary focus is on the observation of the dynamics and microphysics of precipitation during cold-season/winter storms on Long Island. In total, the profiler was deployed at BNL for one year, from May 2011 through May 2012.
ERIC Educational Resources Information Center
Vogt, Dawne S.; Proctor, Susan P.; King, Daniel W.; King, Lynda A.; Vasterling, Jennifer J.
2008-01-01
The Deployment Risk and Resilience Inventory (DRRI) is a suite of scales that can be used to assess deployment-related factors implicated in the health and well-being of military veterans. Although initial evidence for the reliability and validity of DRRI scales based on Gulf War veteran samples is encouraging, evidence with respect to a more…
Constraints on biomass energy deployment in mitigation pathways: the case of water scarcity
NASA Astrophysics Data System (ADS)
Séférian, Roland; Rocher, Matthias; Guivarch, Céline; Colin, Jeanne
2018-05-01
To limit global warming to well below 2 °C, most of the IPCC-WGIII future stringent mitigation pathways feature a massive global-scale deployment of negative emissions technologies (NETs) before the end of the century. The global-scale deployment of NETs like Biomass Energy with Carbon Capture and Storage (BECCS) can be hampered by climate constraints that are not taken into account by the integrated assessment models (IAMs) used to produce those pathways. Among the various climate constraints, water scarcity appears as a potential bottleneck for future land-based mitigation strategies and remains largely unexplored. Here, we assess climate constraints relative to water scarcity in response to the global deployment of BECCS. To this end, we confront results from an Earth system model (ESM) and an IAM under an array of 25 stringent mitigation pathways. These pathways are compatible with the Paris Agreement long-term temperature goal and with cumulative carbon emissions ranging from 230 Pg C to 300 Pg C from January 1st onwards. We show that all stylized mitigation pathways studied in this work limit warming below 2 °C or even 1.5 °C by 2100, but all exhibit a temperature overshoot exceeding 2 °C after 2050. According to the IAM, a subset of 17 emission pathways is feasible when evaluated in terms of socio-economic and technological constraints. The ESM, however, shows that water scarcity would limit the deployment of BECCS in all the mitigation pathways assessed in this work. Our findings suggest that the evolution of water resources under climate change can exert a significant constraint on BECCS deployment before 2050. In 2100, the BECCS water needs could represent more than 30% of the total precipitation in several regions like Europe or Asia.
An Experimental Study on the Deployment Behavior of Membrane Structure under Spin Motion
NASA Astrophysics Data System (ADS)
Murakami, T.
A solar sail, which needs no onboard fuel, is in this sense an ideal propulsion system. As a large film deployed in space, it is pressed by solar radiation. The force of solar radiation is tiny, however, so a very large area is required to produce useful propulsive force, and the larger the sail becomes, the heavier it is; developing suitable membrane materials is therefore indispensable for an efficient solar sail. Recent advances have made solar sail spacecraft missions realistic, but to install a solar sail in a real mission, many problems remain to be solved, among them the technology of deployment. Spin deployment maintains attitude stability by rotating the film constantly; although this makes attitude changes difficult, interplanetary missions in general do not require frequent attitude changes, so a spinning solar sail is realistic for such missions. Key design questions are the deployment velocity, the estimation of the necessary deployment force, and the influence of external forces acting on the film. The shape after deployment must also be considered, since the film serves as the propulsion system: the larger the deviation from an ideal circular shape, the lower the propulsive efficiency. Studying these questions requires not only numerical simulation but also micro-gravity experiments. In the simulation, the membrane must be modeled carefully, because the dynamics of film deployment are transitional and involve large deformations; in this report a simple model consisting of many rigid boards is dealt with, in which the film is approximated as an aggregate of tiny rigid boards and its shape is calculated by solving the additional force on each board. Micro-gravity experiments are needed to demonstrate the validity of this modeling, and because the available space is limited and the experiments use a small-scale model, the similarity parameters must be selected carefully.
NASA Astrophysics Data System (ADS)
Frey, Davide; Guerraoui, Rachid; Kermarrec, Anne-Marie; Koldehofe, Boris; Mogensen, Martin; Monod, Maxime; Quéma, Vivien
Gossip-based information dissemination protocols are considered easy to deploy, scalable and resilient to network dynamics. Load-balancing is inherent in these protocols as the dissemination work is evenly spread among all nodes. Yet, large-scale distributed systems are usually heterogeneous with respect to network capabilities such as bandwidth. In practice, a blind load-balancing strategy might significantly hamper the performance of the gossip dissemination.
ERIC Educational Resources Information Center
Munday, Max; Bristow, Gill; Cowell, Richard
2011-01-01
Although the large-scale deployment of renewable technologies can bring significant, localised economic and environmental changes, there has been remarkably little empirical investigation of the rural development implications. This paper seeks to redress this through an analysis of the economic development opportunities surrounding wind energy…
Pacific Array of, by and for Global Deep Earth Research
NASA Astrophysics Data System (ADS)
Kawakatsu, H.
2016-12-01
Recent advances in ocean bottom geophysical observations, together with advances in analysis methodology, have now enabled us to resolve the regional 1-D structure of the entire lithosphere-asthenosphere system (LAS), from the surface to a depth of ~200 km, including (azimuthal) seismic anisotropy, with deployments of ~10-15 BBOBSs and OBEMs each for a year or so (Takeo et al., 2013, 2016; Baba et al., 2013; Lin et al., 2016). Thus the in-situ characterization of the physical properties of the entire oceanic LAS, without the a priori assumption for the shallow-most structure often made in global studies, has become possible. We are now entering a new stage in which a large-scale array experiment in the ocean (e.g., Pacific Array: http://gachon.eri.u-tokyo.ac.jp/hitosi/PArray/) has become approachable: having 10-15 BBOBSs as an array unit for a 1-2-year deployment, and repeating such deployments in a leap-frog way or concurrently (an array of arrays) for a decade or so, would enable us to cover a large portion of the Pacific basin. Such array observations, both by giving regional constraints on the 1-D structure (including seismic anisotropy) and by sharing waveform data for global-scale waveform tomography (e.g., Fichtner et al. 2010; French et al. 2013; Zhu & Tromp 2013), would drastically increase our knowledge of how plate tectonics works beneath ocean basins, as well as of the large-scale picture of the interior of the Earth. For such an array of arrays to be realized, international collaboration seems essential. If three or four countries collaborate, it may be achieved within a 10-year time frame, which makes this concept attractive. It is also essential that the global seismology, geodynamics, and deep earth (GSGD) communities work closely with the ocean science community for Pacific Array to be realized, as they would get the most benefit from it. While unit array deployments may have their own scientific goals, it is important that they be planned to fit within a larger international Pacific Array structure. The GSGD community should take the lead in providing such an umbrella, as well as in stimulating collaborations between the different disciplines.
NASA Astrophysics Data System (ADS)
Andersen, G.; Dearborn, M.; McHarg, G.
2010-09-01
We are investigating new technologies for creating ultra-large apertures (>20m) for space-based imagery. Our approach has been to create diffractive primaries in flat membranes deployed from compact payloads. These structures are attractive in that they are much simpler to fabricate, launch and deploy compared to conventional three-dimensional optics. In this case the flat focusing element is a photon sieve which consists of a large number of holes in an otherwise opaque substrate. A photon sieve is essentially a large number of holes located according to an underlying Fresnel Zone Plate (FZP) geometry. The advantages over the FZP are that there are no support struts which lead to diffraction spikes in the far-field and non-uniform tension which can cause wrinkling of the substrate. Furthermore, with modifications in hole size and distribution we can achieve improved resolution and contrast over conventional optics. The trade-offs in using diffractive optics are the large amounts of dispersion and decreased efficiency. We present both theoretical and experimental results from small-scale prototypes. Several key solutions to issues of limited bandwidth and efficiency have been addressed. Along with these we have studied the materials aspects in order to optimize performance and achieve a scalable solution to an on-orbit demonstrator. Our current efforts are being directed towards an on-orbit 1m solar observatory demonstration deployed from a CubeSat bus.
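For reference, the underlying FZP geometry mentioned above places zone boundaries at radii r_n = sqrt(n * lambda * f) (a standard approximation, valid when the focal length f is much larger than r); a photon sieve scatters pinholes over the open zones. A quick numerical check with assumed design values:

```python
# Sketch only: Fresnel zone radii underlying a photon sieve layout.
import numpy as np

lam = 550e-9                    # design wavelength (m), an assumption
f = 1.0                         # focal length (m), an assumption
n = np.arange(1, 11)
r = np.sqrt(n * lam * f)        # zone-edge radii, valid for f >> r
print("zone radii (mm):", r * 1e3)
```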
Reid, Jeffrey G; Carroll, Andrew; Veeraraghavan, Narayanan; Dahdouli, Mahmoud; Sundquist, Andreas; English, Adam; Bainbridge, Matthew; White, Simon; Salerno, William; Buhay, Christian; Yu, Fuli; Muzny, Donna; Daly, Richard; Duyk, Geoff; Gibbs, Richard A; Boerwinkle, Eric
2014-01-29
Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples.
Scales of variability of bio-optical properties as observed from near-surface drifters
NASA Technical Reports Server (NTRS)
Abbott, Mark R.; Brink, Kenneth H.; Booth, C. R.; Blasco, Dolors; Swenson, Mark S.; Davis, Curtiss O.; Codispoti, L. A.
1995-01-01
A drifter equipped with bio-optical sensors and an automated water sampler was deployed in the California Current as part of the coastal transition zone program to study the biological, chemical, and physical dynamics of the meandering filaments. During deployments in 1987 and 1988, measurements were made of fluorescence, downwelling irradiance, upwelling radiance, and beam attenuation using several bio-optical sensors. Samples were collected by an automated sampler for later analysis of nutrients and phytoplankton species composition. Large-scale spatial and temporal changes in the bio-optical and biological properties of the region were driven by changes in phytoplankton species composition which, in turn, were associated with the meandering circulation. Variance spectra of the bio-optical parameters revealed fluctuations on both diel and semidiurnal scales, perhaps associated with solar variations and internal tides, respectively. Offshore, inertial-scale fluctuations were apparent in the variance spectra of temperature, fluorescence, and beam attenuation. Although calibration samples can help remove some of these variations, these results suggest that bio-optical data from unattended platforms such as moorings and drifters must be analyzed carefully. Characterization of the scales of phytoplankton variability must account for the scales of variability in the algorithms used to convert bio-optical measurements into biological quantities.
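The diel and semidiurnal peaks described above are exactly the kind of structure a variance (power) spectrum exposes; below is a sketch with synthetic hourly data (the sampling rate, amplitudes, and noise level are assumptions) using SciPy's Welch estimator.

```python
# Sketch only: variance spectrum of a synthetic hourly fluorescence series.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(3)
t = np.arange(60 * 24)                            # 60 days, hourly samples
x = (np.sin(2 * np.pi * t / 24.0)                 # diel cycle
     + 0.5 * np.sin(2 * np.pi * t / 12.42)        # semidiurnal tide
     + rng.normal(0, 0.3, t.size))

f, pxx = welch(x, fs=1.0, nperseg=512)            # frequency in cycles/hour
peaks = np.argsort(pxx)[-3:]
print("dominant periods (h):", 1.0 / f[peaks])    # near 24 and 12.42
```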
NASA Technical Reports Server (NTRS)
Shaffer, Joe R.; Headley, David E.
1993-01-01
Compact storable components expand to create large shelter. Fully deployed structure provides large, unobstructed bay. Deployed trusses support wall and roof blankets. Provides temporary cover for vehicles, people, and materials. Terrestrial version used as garage, hangar, or large tent.
NASA Astrophysics Data System (ADS)
Downey, Austin; Laflamme, Simon; Ubertini, Filippo
2017-12-01
Condition evaluation of wind turbine blades is difficult due to their large size, complex geometry and lack of economic and scalable sensing technologies capable of detecting, localizing, and quantifying faults over a blade’s global area. A solution is to deploy inexpensive large area electronics over strategic areas of the monitored component, analogous to sensing skin. The authors have previously proposed a large area electronic consisting of a soft elastomeric capacitor (SEC). The SEC is highly scalable due to its low cost and ease of fabrication, and can, therefore, be used for monitoring large-scale components. A single SEC is a strain sensor that measures the additive strain over a surface. Recently, its application in a hybrid dense sensor network (HDSN) configuration has been studied, where a network of SECs is augmented with a few off-the-shelf strain gauges to measure boundary conditions and decompose the additive strain to obtain unidirectional surface strain maps. These maps can be analyzed to detect, localize, and quantify faults. In this work, we study the performance of the proposed sensing skin at conducting condition evaluation of a wind turbine blade model in an operational environment. Damage in the form of changing boundary conditions and cuts in the monitored substrate are induced into the blade. An HDSN is deployed onto the interior surface of the substrate, and the blade excited in a wind tunnel. Results demonstrate the capability of the HDSN and associated algorithms to detect, localize, and quantify damage. These results show promise for the future deployment of fully integrated sensing skins deployed inside wind turbine blades for condition evaluation.
MIGHTEE: The MeerKAT International GHz Tiered Extragalactic Exploration
NASA Astrophysics Data System (ADS)
Taylor, A. Russ; Jarvis, Matt
2017-05-01
The MeerKAT telescope is the precursor of the Square Kilometre Array mid-frequency dish array to be deployed later this decade on the African continent. MIGHTEE is one of the MeerKAT large survey projects designed to pathfind SKA key science in cosmology and galaxy evolution. Through a tiered radio continuum deep imaging project, including several fields totaling 20 square degrees imaged to microJy sensitivities and an ultra-deep image of a single 1-square-degree field of view, MIGHTEE will explore dark matter and large-scale structure; the evolution of galaxies, including AGN activity and star formation as a function of cosmic time and environment; the emergence and evolution of magnetic fields in galaxies; and the magnetic counterpart to the large-scale structure of the universe.
Space Technology 5 Multi-Point Observations of Temporal Variability of Field-Aligned Currents
NASA Technical Reports Server (NTRS)
Le, Guan; Wang, Yongli; Slavin, James A.; Strangeway, Robert J.
2008-01-01
Space Technology 5 (ST5) is a three micro-satellite constellation deployed into a 300 x 4500 km, dawn-dusk, sun-synchronous polar orbit from March 22 to June 21, 2006, for technology validations. In this paper, we present a study of the temporal variability of field-aligned currents using multi-point magnetic field measurements from ST5. The data demonstrate that meso-scale current structures are commonly embedded within large-scale field-aligned current sheets. The meso-scale current structures are very dynamic, with highly variable current density and/or polarity on time scales of approximately 10 min. They exhibit large temporal variations during both quiet and disturbed times on such time scales. On the other hand, the data also show that the time scales for the currents to be relatively stable are approximately 1 min for meso-scale currents and approximately 10 min for large-scale current sheets. These temporal features are associated with dynamic variations of their particle carriers (mainly electrons) as they respond to variations of the parallel electric field in the auroral acceleration region. The characteristic time scales for the temporal variability of meso-scale field-aligned currents are found to be consistent with those of the auroral parallel electric field.
LSST (Hoop/Column) Maypole Antenna Development Program, phase 1, part 1
NASA Technical Reports Server (NTRS)
Sullivan, M. R.
1982-01-01
The first phase of a two-phase program was performed to develop the technology necessary to evaluate, design, manufacture, package, transport, and deploy the hoop/column deployable antenna reflector by means of a ground-based program. The hoop/column concept consists of a cable-stiffened, large-diameter hoop and a central column structure that supports and contours a radio-frequency-reflective mesh surface. Mission scenarios for communications, radiometry, and radio astronomy were studied, providing the data to establish technology drivers that resulted in the specification of a point design. The point design is a multiple-beam, quad-aperture offset antenna system which provides four separate offset areas of illumination on a 100-meter-diameter symmetrical parent reflector. The periphery of the reflector is a hoop having 48 segments that articulate into a small stowed volume around a center extendable column. The hoop and column are structurally connected by graphite and quartz cables. The prominence of cables in the design resulted in the development of advanced cable technology. Design verification models were built of the hoop, column, and surface stowage subassemblies. Model designs were generated for a half-scale sector of the surface and a 1/6-scale model of the complete deployable reflector.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Speer, Bethany; Keyser, David; Tegen, Suzanne
Construction of the first offshore wind farm in the United States began in 2015, using fixed platform structures that are appropriate for shallow seafloors, like those located off of the East Coast and mid-Atlantic. However, floating platforms, which have yet to be deployed commercially, will likely need to anchor to the deeper seafloor if deployed off of the West Coast. To analyze the employment and economic potential for floating offshore wind along the West Coast, the Bureau of Ocean Energy Management (BOEM) has commissioned the National Renewable Energy Laboratory (NREL) to analyze two hypothetical, large-scale deployment scenarios for California: 16 GW of offshore wind by 2050 (Scenario A) and 10 GW of offshore wind by 2050 (Scenario B). The results of this analysis can be used to better understand the general scales of economic opportunities that could result from offshore wind development. Results show total state gross domestic product (GDP) impacts of $16.2 billion in Scenario B or $39.7 billion in Scenario A for construction, and $3.5 billion in Scenario B or $7.9 billion in Scenario A for the operations phases.
Ground Deployment Demonstration and Material Testing for Solar Sail
NASA Astrophysics Data System (ADS)
Huang, Xiaoqi; Cheng, Zhengai; Liu, Yufei; Wang, Li
2016-07-01
A solar sail is a kind of spacecraft that can achieve extremely high velocity from light pressure instead of chemical fuel. Its great acceleration relies on a high area-to-mass ratio, so solar sails are designed in huge sizes and use ultra-thin, lightweight materials. For a 100-meter-class solar sail, two key points must be considered in the design process: the fold-deployment method and material property changes in the space environment. To test and verify the fold-deployment technology, an 8 m x 8 m proof-of-principle prototype was developed. Sail membranes folded in the manner of IKAROS and NanoSail-D, and in a newly proposed L-shape folding pattern, were tested on this prototype; their deployment properties were investigated in detail and compared. Also, the suitability of ultra-thin polyimide films as a candidate solar sail material for the space environment was analyzed. The preliminary test results showed that the membranes deployed well with all the folding methods; moreover, the membrane folded in the L-shape pattern deployed more rapidly and in a more organized way than the other two patterns tested. The mechanical properties of the polyimide showed no significant change after electron irradiation. As preliminary research on the key technologies of solar sail spacecraft, the results of this study provide an important basis for selecting large-scale solar sail membranes and designing fold-deployment methods.
Imaging Exoplanets with the Exo-S Starshade Mission: Key Enabling Technologies
NASA Astrophysics Data System (ADS)
Kasdin, N. Jeremy; Lisman, Doug; Shaklan, Stuart; Thomson, Mark; Webb, David; Cady, Eric; Exo-S Science and Technology Definition Team; Exoplanet Program Probe Study Design Team
2015-01-01
There is increasing interest in the use of a starshade, a spacecraft employing a large screen flying in formation with a space telescope, for providing the starlight suppression needed to detect and characterize exoplanets. In particular, Exo-S is a NASA study directed at designing a probe-scale exoplanet mission employing a starshade. In this poster we present the enabling technologies needed to make a starshade mission a reality: flight-like petals, a deployable truss to support the petals, optical edges, optical diffraction studies, and formation sensing and control. We show the status of each technology gap and summarize our progress over the past 5 years with plans for the next 3 years in demonstrating feasibility in all these areas. In particular, since no optical end-to-end test is possible, it is necessary to both show that a starshade can be built and deployed to the required accuracy and, via laboratory experiments at smaller scale, that the optical modeling upon which the accuracy requirements are based is validated. We show our progress verifying key enabling technologies, including demonstrating that a starshade petal made from flight-like materials can be manufactured to the needed accuracy and that a central truss with attached petals can be deployed with the needed precision. We also summarize our sub-scale lab experiments that demonstrate we can achieve the contrast predicted by our optical models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babu, Sudarsanam Suresh; Love, Lonnie J.; Peter, William H.
Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high-value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components, in addition to bringing new innovation for new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of Big Area Additive Manufacturing (BAAM) for polymer matrix composites was presented as the background motivation for the workshop. Following this, the extension of the underlying technology to low-cost metals was proposed with the following goals: (i) high deposition rates (approaching 100 lbs/h); (ii) low cost (<$10/lb) for steel, iron, aluminum, and nickel, as well as higher-cost titanium; (iii) large components (major axis greater than 6 ft); and (iv) compliance with property requirements. The above concept was discussed in depth by representatives from different industrial sectors including welding, metal fabrication machinery, energy, construction, aerospace, and heavy manufacturing. In addition, DOE's newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session in which everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped into categories: (i) CAD-to-part software, (ii) selection of energy source, (iii) systems development, (iv) material feedstock, (v) process planning, (vi) residual stress and distortion, (vii) post-processing, (viii) qualification of parts, (ix) supply chain, and (x) business case. Furthermore, an open innovation network methodology was proposed to accelerate the development and deployment of new large-scale metal additive manufacturing technology, with the goal of creating a new generation of high-deposition-rate equipment, affordable feedstocks, and large metallic components to enhance America's economic competitiveness.
NASA Astrophysics Data System (ADS)
Xiong, Zhi; Zhu, J. G.; Xue, B.; Ye, Sh. H.; Xiong, Y.
2013-10-01
As a novel network coordinate measurement system based on multi-directional positioning, the workspace Measurement and Positioning System (wMPS) has the outstanding advantages of good parallelism, wide measurement range, and high measurement accuracy, which make it a research hotspot and an important development direction in the field of large-scale measurement. Since station deployment has a significant impact on measurement range and accuracy, and also restricts the cost of use, the optimization of station deployment is researched in this paper. First, a positioning error model was established. Then, focusing on the small network consisting of three stations, the typical deployments and their error distribution characteristics were studied. Finally, a simulated fuselage was measured using the typical deployments on the industrial floor and the results were compared with a laser tracker. The comparison shows that, under existing prototype conditions, the I_3 typical deployment, in which the three stations are distributed along a straight line, has an average error of 0.30 mm and a maximum error of 0.50 mm over a range of 12 m. Meanwhile, the C_3 typical deployment, in which the three stations are uniformly distributed along a semicircle, has an average error of 0.17 mm and a maximum error of 0.28 mm. Clearly, the C_3 deployment controls precision better than the I_3 type. This work provides effective theoretical support for global measurement network optimization in future work.
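The effect of station geometry on accuracy can be illustrated with a toy 2-D Monte Carlo (not the paper's error model): solve a position from noisy bearings for a collinear layout versus a semicircular one. The coordinates, noise level, and trial count are assumptions for illustration.

```python
# Sketch only: bearing-based 2-D positioning error for two station layouts.
import numpy as np

def rms_error(stations, target, sigma=1e-4, trials=2000, seed=0):
    rng = np.random.default_rng(seed)
    errs = []
    for _ in range(trials):
        th = np.arctan2(target[1] - stations[:, 1],
                        target[0] - stations[:, 0])
        th = th + rng.normal(0, sigma, len(stations))   # noisy bearings
        n = np.c_[-np.sin(th), np.cos(th)]              # line normals
        b = (n * stations).sum(axis=1)                  # n_i . s_i
        p, *_ = np.linalg.lstsq(n, b, rcond=None)       # least-squares fix
        errs.append(np.linalg.norm(p - target))
    return np.sqrt(np.mean(np.square(errs)))

target = np.array([6.0, 4.0])
line = np.array([[0.0, 0], [6, 0], [12, 0]])   # collinear, "I_3"-like
arc = np.array([[0.0, 0], [6, 6], [12, 0]])    # semicircular, "C_3"-like
print(rms_error(line, target), rms_error(arc, target))
```

In this toy geometry the semicircular layout gives the lower RMS error, echoing the abstract's comparison.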
US National Large-scale City Orthoimage Standard Initiative
Zhou, G.; Song, C.; Benjamin, S.; Schickler, W.
2003-01-01
The early procedures and algorithms for national digital orthophoto generation in the National Digital Orthophoto Program (NDOP) were based on earlier USGS mapping operations, such as field control, aerotriangulation (derived in the early 1920's), the quarter-quadrangle-centered format (3.75 minutes of longitude and latitude in geographic extent), 1:40,000 aerial photographs, and 2.5D digital elevation models. However, large-scale city orthophotos produced with these early procedures have disclosed many shortcomings, e.g., ghost images, occlusions, and shadows. Thus, providing the technical base (algorithms, procedures) and experience needed for large-scale city digital orthophoto creation is essential for the near-future national large-scale digital orthophoto deployment and the revision of the Standards for National Large-scale City Digital Orthophoto in the NDOP. This paper reports our initial research results as follows: (1) high-precision 3D city DSM generation through LIDAR data processing, (2) spatial object/feature extraction through surface material information and high-accuracy 3D DSM data, (3) 3D city model development, (4) algorithm development for generation of DTM-based and DBM-based orthophotos, (5) true orthophoto generation by merging DBM-based and DTM-based orthophotos, and (6) automatic mosaicking by optimizing and combining imagery from many perspectives.
A Mars Riometer: Antenna Considerations
NASA Technical Reports Server (NTRS)
Fry, Craig D.
2001-01-01
This is the final report on NASA Grant NAG5-9706. This project explored riometer (relative ionospheric opacity meter) antenna designs that would be practical for a Mars surface or balloon mission. The riometer is an important radio science instrument for terrestrial aeronomy investigations. The riometer measures absorption of cosmic radio waves by the overhead ionosphere. Studies have shown the instrument should work well on Mars, which has an appreciable daytime ionosphere. There has been concern that the required radio receiver antenna (with possibly a 10 meter scale size) would be too large or too difficult to deploy on Mars. This study addresses those concerns and presents several antenna designs and deployment options. It is found that a Mars balloon would provide an excellent platform for the riometer antenna. The antenna can be incorporated into the envelope design, allowing self-deployment of the antenna as the balloon inflates.
Hernández-Ramos, José L.; Bernabe, Jorge Bernal; Moreno, M. Victoria; Skarmeta, Antonio F.
2015-01-01
As we get into the Internet of Things era, security and privacy concerns remain as the main obstacles in the development of innovative and valuable services to be exploited by society. Given the Machine-to-Machine (M2M) nature of these emerging scenarios, the application of current privacy-friendly technologies needs to be reconsidered and adapted to be deployed in such global ecosystem. This work proposes different privacy-preserving mechanisms through the application of anonymous credential systems and certificateless public key cryptography. The resulting alternatives are intended to enable an anonymous and accountable access control approach to be deployed on large-scale scenarios, such as Smart Cities. Furthermore, the proposed mechanisms have been deployed on constrained devices, in order to assess their suitability for a secure and privacy-preserving M2M-enabled Internet of Things. PMID:26140349
Application of the actor model to large scale NDE data analysis
NASA Astrophysics Data System (ADS)
Coughlin, Chris
2018-03-01
The Actor model of concurrent computation discretizes a problem into a series of independent units or actors that interact only through the exchange of messages. Without direct coupling between individual components, an Actor-based system is inherently concurrent and fault-tolerant. These traits lend themselves to so-called "Big Data" applications in which the volume of data to analyze requires a distributed multi-system design. For a practical demonstration of the Actor computational model, a system was developed to assist with the automated analysis of Nondestructive Evaluation (NDE) datasets using the open source Myriad Data Reduction Framework. A machine learning model trained to detect damage in two-dimensional slices of C-Scan data was deployed in a streaming data processing pipeline. To demonstrate the flexibility of the Actor model, the pipeline was deployed on a local system and re-deployed as a distributed system without recompiling, reconfiguring, or restarting the running application.
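A minimal actor-style pipeline in the spirit described above, with thread-backed mailboxes standing in for Myriad's distributed actors (the class names, message format, and the trivial stand-in "model" are illustrative assumptions):

```python
# Sketch only: actors as threads that interact solely through mailboxes.
import threading, queue

class Actor(threading.Thread):
    def __init__(self):
        super().__init__()
        self.mailbox = queue.Queue()
    def send(self, msg):
        self.mailbox.put(msg)
    def run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:          # poison pill stops the actor
                break
            self.receive(msg)

class SliceAnalyzer(Actor):
    """Scores one 2-D C-scan slice per message, forwards the verdict."""
    def __init__(self, sink):
        super().__init__()
        self.sink = sink
    def receive(self, slice_id):
        verdict = "damage" if slice_id % 7 == 0 else "ok"   # stand-in model
        self.sink.send((slice_id, verdict))

class Reporter(Actor):
    def receive(self, result):
        print("slice %d -> %s" % result)

reporter = Reporter()
analyzer = SliceAnalyzer(reporter)
reporter.start(); analyzer.start()
for i in range(10):
    analyzer.send(i)
analyzer.send(None); analyzer.join()    # drain and stop, in order
reporter.send(None); reporter.join()
```

Because the actors share no state, the same topology can be re-deployed across processes or machines by swapping the mailbox transport, which is the portability property the abstract highlights.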
Forecasting Significant Societal Events Using The Embers Streaming Predictive Analytics System.
Doyle, Andy; Katz, Graham; Summers, Kristen; Ackermann, Chris; Zavorin, Ilya; Lim, Zunsik; Muthiah, Sathappan; Butler, Patrick; Self, Nathan; Zhao, Liang; Lu, Chang-Tien; Khandpur, Rupinder Paul; Fayed, Youssef; Ramakrishnan, Naren
2014-12-01
Developed under the Intelligence Advanced Research Project Activity Open Source Indicators program, Early Model Based Event Recognition using Surrogates (EMBERS) is a large-scale big data analytics system for forecasting significant societal events, such as civil unrest events on the basis of continuous, automated analysis of large volumes of publicly available data. It has been operational since November 2012 and delivers approximately 50 predictions each day for countries of Latin America. EMBERS is built on a streaming, scalable, loosely coupled, shared-nothing architecture using ZeroMQ as its messaging backbone and JSON as its wire data format. It is deployed on Amazon Web Services using an entirely automated deployment process. We describe the architecture of the system, some of the design tradeoffs encountered during development, and specifics of the machine learning models underlying EMBERS. We also present a detailed prospective evaluation of EMBERS in forecasting significant societal events in the past 2 years.
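The plumbing style EMBERS's architecture is built on, ZeroMQ messaging with JSON on the wire, can be sketched as below; the topic name and alert fields are invented for illustration, not EMBERS's actual schema.

```python
# Sketch only: one PUB stage emitting JSON alerts, one SUB stage consuming.
import json, threading, time, zmq

ctx = zmq.Context()

def publisher():
    pub = ctx.socket(zmq.PUB)
    pub.bind("tcp://127.0.0.1:5556")
    time.sleep(0.2)                          # let the subscriber connect
    alert = {"country": "BR", "type": "civil-unrest", "prob": 0.62}
    pub.send_string("alerts " + json.dumps(alert))

sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:5556")
sub.setsockopt_string(zmq.SUBSCRIBE, "alerts")
threading.Thread(target=publisher).start()

topic, payload = sub.recv_string().split(" ", 1)
print(topic, json.loads(payload))
```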
Data-Gathering Scheme Using AUVs in Large-Scale Underwater Sensor Networks: A Multihop Approach
Khan, Jawaad Ullah; Cho, Ho-Shin
2016-01-01
In this paper, we propose a data-gathering scheme for hierarchical underwater sensor networks, where multiple Autonomous Underwater Vehicles (AUVs) are deployed over large-scale coverage areas. The deployed AUVs constitute an intermittently connected multihop network through inter-AUV synchronization (in this paper, synchronization means an interconnection between nodes for communication) for forwarding data to the designated sink. In such a scenario, the performance of the multihop communication depends upon the synchronization among the vehicles. The mobility parameters of the vehicles vary continuously because of the constantly changing underwater currents. The variations in the AUV mobility parameters reduce the inter-AUV synchronization frequency contributing to delays in the multihop communication. The proposed scheme improves the AUV synchronization frequency by permitting neighboring AUVs to share their status information via a pre-selected node called an agent-node at the static layer of the network. We evaluate the proposed scheme in terms of the AUV synchronization frequency, vertical delay (node→AUV), horizontal delay (AUV→AUV), end-to-end delay, and the packet loss ratio. Simulation results show that the proposed scheme significantly reduces the aforementioned delays without the synchronization time-out process employed in conventional works. PMID:27706042
Support of Helicopter 'Free Flight' Operations in the 1996 Olympics
NASA Technical Reports Server (NTRS)
Branstetter, James R.; Cooper, Eric G.
1996-01-01
The microcosm of activity surrounding the 1996 Olympic Games provided researchers an opportunity to demonstrate state-of-the-art technology in the first large-scale deployment of a prototype digital communication/navigation/surveillance system in a confined environment. At the same time it provided an ideal opportunity for transportation officials to showcase the merits of an integrated transportation system in meeting the operational needs of transporting time-sensitive goods and providing public safety services under real-world conditions. Five aeronautical CNS functions using a digital datalink system were chosen for operational flight testing onboard 91 aircraft, most of them helicopters, participating in the Atlanta Short-Haul Transportation System. These included: GPS-based Automatic Dependent Surveillance, Cockpit Display of Traffic Information, Controller-Pilot Communications, Graphical Weather Information (uplink), and Automated Electronic Pilot Reporting (downlink). Atlanta provided the first opportunity to demonstrate, in an actual operating environment, key datalink functions that would enhance flight safety and situational awareness for the pilot and supplement conventional air traffic control. The knowledge gained from such a large-scale deployment will help system designers in the development of a national infrastructure in which aircraft would have the ability to navigate autonomously.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Taiping; Khangaonkar, Tarang; Long, Wen
2014-02-07
In recent years, with the rapid growth of global energy demand, the interest in extracting uranium from seawater for nuclear energy has been renewed. While extracting seawater uranium is not yet commercially viable, it serves as a “backstop” to the conventional uranium resources and provides an essentially unlimited supply of uranium resource. With recent advances in seawater uranium extraction technology, extracting uranium from seawater could be economically feasible when the extraction devices are deployed at a large scale (e.g., several hundred km²). There is concern however that the large scale deployment of adsorbent farms could result in potential impacts to the hydrodynamic flow field in an oceanic setting. In this study, a kelp-type structure module was incorporated into a coastal ocean model to simulate the blockage effect of uranium extraction devices on the flow field. The module was quantitatively validated against laboratory flume experiments for both velocity and turbulence profiles. The model-data comparison showed an overall good agreement and validated the approach of applying the model to assess the potential hydrodynamic impact of uranium extraction devices or other underwater structures in coastal oceans.
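The blockage effect described above is commonly modeled as a quadratic drag sink added to the horizontal momentum equations. The following is a minimal sketch of that idea only; the drag coefficient and frontal-area density are illustrative placeholders, not values from the study.

    import numpy as np

    def canopy_drag_tendency(u, v, Cd=1.0, a=0.05):
        # Quadratic canopy drag, F = -0.5 * Cd * a * |u| * u, of the kind a
        # kelp-type structure module contributes to each momentum component.
        # Cd (drag coefficient) and a (frontal area per unit volume, 1/m)
        # are illustrative placeholders, not the study's calibrated values.
        speed = np.sqrt(u**2 + v**2)
        return -0.5 * Cd * a * speed * u, -0.5 * Cd * a * speed * v

    # Example: drag tendency on a 0.5 m/s eastward current at one grid cell.
    du_dt, dv_dt = canopy_drag_tendency(0.5, 0.0)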
NASA Astrophysics Data System (ADS)
Jenerette, D.; Wang, J.; Chandler, M.; Ripplinger, J.; Koutzoukis, S.; Ge, C.; Castro Garcia, L.; Kucera, D.; Liu, X.
2017-12-01
Large uncertainties remain in identifying the distribution of urban air quality and temperature risks across neighborhood to regional scales. Nevertheless, many cities are actively expanding vegetation with an expectation to moderate both climate and air quality risks. We address these uncertainties through an integrated analysis of satellite data, atmospheric modeling, and in-situ environmental sensor networks maintained by citizen scientists. During the summer of 2017 we deployed neighborhood-scale networks of air temperature and ozone sensors through three campaigns across urbanized southern California. During each five-week campaign we deployed six sensor nodes that included an EPA federal equivalent method ozone sensor and a suite of meteorological sensors. Each node was further embedded in a network of 100 air temperature sensors that combined a randomized design developed by the research team and a design co-created by citizen scientists. Between 20 and 60 citizen scientists were recruited for each campaign, with local partners supporting outreach and training to ensure consistent deployment and data gathering. We observed substantial variation in both temperature and ozone concentrations at scales of less than 4 km, the whole city, and the broader southern California region. At the whole-city scale, the average spatial variation of our ozone sensor network just for the city of Long Beach was 26% of the mean, while the corresponding variation in air temperature was only 7% of the mean. These findings contrast with atmospheric model estimates of variation at the regional scale of 11% and 1%. Our results show the magnitude of fine-scale variation underestimated by current models and may also suggest scaling functions that can connect neighborhood and regional variation in both ozone and temperature risks in southern California. By engaging citizen science with high quality sensors, satellite data, and real-time forecasting, our results help identify magnitudes of climate and air quality risk variation across scales and can guide individual decisions and urban policies surrounding vegetation to moderate these risks.
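The "spatial variation as a percentage of the mean" quoted above is, in effect, a coefficient of variation computed across the sensor network. A minimal sketch with invented readings (not the campaign data):

    import numpy as np

    def spatial_cv_percent(readings):
        # Coefficient of variation across sensors: spatial std / spatial mean,
        # expressed as a percentage of the network-wide mean.
        readings = np.asarray(readings, dtype=float)
        return 100.0 * readings.std() / readings.mean()

    # Illustrative ozone readings (ppb) from six nodes in one campaign window:
    print(spatial_cv_percent([38.0, 52.0, 44.0, 61.0, 40.0, 55.0]))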
Space Technology 5 (ST-5) Observations of Field-Aligned Currents: Temporal Variability
NASA Technical Reports Server (NTRS)
Le, Guan
2010-01-01
Space Technology 5 (ST-5) is a three micro-satellite constellation deployed into a 300 x 4500 km, dawn-dusk, sun-synchronous polar orbit from March 22 to June 21, 2006, for technology validations. In this paper, we present a study of the temporal variability of field-aligned currents using multi-point magnetic field measurements from ST-5. The data demonstrate that mesoscale current structures are commonly embedded within large-scale field-aligned current sheets. The meso-scale current structures are very dynamic with highly variable current density and/or polarity in time scales of about 10 min. They exhibit large temporal variations during both quiet and disturbed times in such time scales. On the other hand, the data also show that the time scales for the currents to be relatively stable are about 1 min for meso-scale currents and about 10 min for large-scale current sheets. These temporal features are obviously associated with dynamic variations of their particle carriers (mainly electrons) as they respond to the variations of the parallel electric field in the auroral acceleration region. The characteristic time scales for the temporal variability of meso-scale field-aligned currents are found to be consistent with those of the auroral parallel electric field.
Technology development for cryogenic deployable telescope structures and mechanisms
NASA Astrophysics Data System (ADS)
Atkinson, Charles B.; Gilman, Larry; Reynolds, Paul
2003-12-01
With an aperture 6-7 meters in diameter, the James Webb Space Telescope (JWST) will require structures that remain stable to levels on the order of 10 nanometers under dynamic and thermal loading while operating at cryogenic temperatures. Moreover, the JWST will be the first telescope in space that is deployed, resulting in an aperture that is not only segmented, but has hinge-lines and the associated joining systems or latches in it. In order to understand the behavior and reduce the risk associated with very large deployed structures and the stability of the associated structure and latches, we developed and tested the largest cryogenic structure ever built and then characterized its stability. This paper presents a description of the design of the Development Optical Telescope Assembly (DOTA), the testing performed, and the results of that testing. We discuss the material selection and characterization processes, give a description of the test configurations, describe the metrology equipment and the validation process for it, provide the test results, and summarize the conclusions drawn from the results. The testing and associated results include characterization of the thermal stability of the large-scale structure, characterization of the micro-dynamic stability of the latching system, and measurements of the deployment capability of the mechanisms. We also describe how the DOTA design relates to the JWST design and how the test results relate to the JWST requirements.
Ultralightweight Space Deployable Primary Reflector Demonstrator
NASA Technical Reports Server (NTRS)
Montgomery, Edward E., IV; Zeiders, Glenn W.; Smith, W. Scott (Technical Monitor)
2002-01-01
A concept has been developed and analyzed, and several generations of prototypes built, for a gossamer-class deployable truss carrying a mirror or reflector composed of many smaller, precisely figured solid elements. For at least the next several decades, such an architecture should minimize the mass of a large primary mirror assembly while still providing the high image quality essential for planet-finding and cosmological astronomical missions. Primary mirror segments are mounted in turn on ultralightweight thermally-formed plastic panels that hold clusters of mirror segments in rigid arrays, whose tip/tilt and piston would be corrected over the scale of the plastic panels by the control segments. Prototype panels developed under this program are 45 cm wide and fabricated from commercially available Kapton sheets. A three-strut octahedral tensegrity is the basis for the overall support structure. Each fundamental module is composed of two such octahedrons, rotated oppositely about a common triangular face. Adjacent modules are joined at the nodes of the upper and lower triangles to form a deployable structure that could be made arbitrarily large. A seven-module dowel-and-wire prototype has been constructed. Deployment techniques based on the use of collapsing toggled struts with diagonal tensional elements allow an assembly of tensegrities to be fully collapsed and redeployed. The prototype designs will be described and results of a test program for measuring strength and deformation will be presented.
ENGINEERING DEVELOPMENT UNIT SOLAR SAIL
2016-01-13
Tiffany Lockett oversees the half-scale (36 square meters) Engineering Development Unit (EDU) solar sail deployment demonstration in preparation for full-scale EDU (86 square meters) deployment in April 2016.
Tani, Kassimu; Exavery, Amon; Baynes, Colin D; Pemba, Senga; Hingora, Ahmed; Manzi, Fatuma; Phillips, James F; Kanté, Almamy Malick
2016-07-08
Tanzania, like other African countries, faces significant health workforce shortages. With advisory support and partnership from Columbia University, the Ifakara Health Institute and the Tanzanian Training Centre for International Health (TTCIH) developed and implemented the Connect Project as a randomized cluster experimental trial of the childhood survival impact of recruiting, training, and deploying a new cadre of paid community health workers (CHWs), named "Wawazesha wa afya ya Jamii" (WAJA). This paper presents an estimation of the cost of training and deploying WAJA in three rural districts of Tanzania. Costing data were collected by tracking project activity expenditure records and conducting in-depth interviews of TTCIH staff who have led the training and deployment of WAJA, as well as their counterparts at Public Clinical Training Centres who have responsibility for scaling up the WAJA training program. The trial is registered with the International Standard Randomized Controlled Trial Register (number ISRCTN96819844). The Connect training cost was US$ 2,489.3 per WAJA, of which 40.1 % was for meals, 20.2 % for accommodation, 10.2 % for tuition fees, and the remaining 29.5 % for other costs including instruction and training facilities and field allowance. A comparable training program's estimated unit cost for scaling up this training via regional/district clinical training centres would be US$ 833.5 per WAJA. Of this unit cost, 50.3 % would involve the cost of meals, 27.4 % training fees, 13.7 % field allowances, and 9 % accommodation and medical insurance. The annual running cost of a WAJA in a village would be US$ 1.16 per capita. Costs estimated by this study are likely to be sustainable on a large scale, particularly if existing regional/district institutions are utilized for this program.
Energy Management and Optimization Methods for Grid Energy Storage Systems
Byrne, Raymond H.; Nguyen, Tu A.; Copp, David A.; ...
2017-08-24
Today, the stability of the electric power grid is maintained through real time balancing of generation and demand. Grid scale energy storage systems are increasingly being deployed to provide grid operators the flexibility needed to maintain this balance. Energy storage also imparts resiliency and robustness to the grid infrastructure. Over the last few years, there has been a significant increase in the deployment of large scale energy storage systems. This growth has been driven by improvements in the cost and performance of energy storage technologies and the need to accommodate distributed generation, as well as incentives and government mandates. Energy management systems (EMSs) and optimization methods are required to effectively and safely utilize energy storage as a flexible grid asset that can provide multiple grid services. The EMS needs to be able to accommodate a variety of use cases and regulatory environments. In this paper, we provide a brief history of grid-scale energy storage, an overview of EMS architectures, and a summary of the leading applications for storage. These serve as a foundation for a discussion of EMS optimization methods and design.
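The abstract surveys EMS optimization methods without fixing one; a common textbook formulation is price-based energy arbitrage as a small convex program. The sketch below uses cvxpy, and the prices, power/energy ratings, and efficiency are invented for illustration, not drawn from the paper.

    import cvxpy as cp
    import numpy as np

    # Illustrative hourly prices ($/MWh) and storage ratings.
    price = np.array([20, 18, 22, 45, 80, 95, 60, 30], dtype=float)
    T = len(price)
    p_max, e_max, eta = 1.0, 4.0, 0.9   # MW, MWh, one-way efficiency

    c = cp.Variable(T, nonneg=True)     # grid-side charge power (MW)
    d = cp.Variable(T, nonneg=True)     # grid-side discharge power (MW)
    soc = cp.cumsum(eta * c - d / eta)  # state of charge (MWh), starting empty

    constraints = [c <= p_max, d <= p_max, soc >= 0, soc <= e_max]
    revenue = price @ (d - c)           # sell when discharging, buy when charging
    cp.Problem(cp.Maximize(revenue), constraints).solve()
    print(f"arbitrage revenue: ${float(revenue.value):.2f}")

In a real EMS this single-service objective would be stacked with other grid services and wrapped in a receding-horizon loop, which is the design space the paper's survey addresses.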
On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data
NASA Astrophysics Data System (ADS)
Hua, H.
2016-12-01
Next generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are orders of magnitude larger than those of present day missions. Existing missions, such as OCO-2, may also require rapid turn-around for processing different science scenarios where on-premise and even traditional HPC computing environments may not meet the high processing needs. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments. At large cloud scales, we need to deal with scaling and cost issues. We present our experiences on deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We will explore optimization approaches to getting the best performance out of hybrid-cloud computing as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but with an unpredictable computing environment based on market forces.
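For the spot-market processing mentioned above, capacity is typically requested programmatically. A hedged sketch using boto3's EC2 spot request call follows; the region, AMI ID, instance type, key name, and price cap are all placeholders, not the project's actual configuration.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-west-2")  # placeholder region
    response = ec2.request_spot_instances(
        SpotPrice="0.25",        # max bid in USD/hour -- illustrative
        InstanceCount=10,
        Type="one-time",
        LaunchSpecification={
            "ImageId": "ami-00000000",      # placeholder worker image
            "InstanceType": "c5.4xlarge",   # placeholder instance type
            "KeyName": "worker-key",        # hypothetical key pair
        },
    )
    request_ids = [r["SpotInstanceRequestId"]
                   for r in response["SpotInstanceRequests"]]

Because such requests can be outbid and terminated at any time, the data system must treat every worker as disposable, checkpointing work and re-queuing interrupted jobs; that is the scaling-and-cost trade the abstract refers to.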
NASA Astrophysics Data System (ADS)
Hill, Gary J.; Tuttle, Sarah E.; Vattiat, Brian L.; Lee, Hanshin; Drory, Niv; Kelz, Andreas; Ramsey, Jason; Peterson, Trent W.; DePoy, D. L.; Marshall, J. L.; Gebhardt, Karl; Chonis, Taylor; Dalton, Gavin; Farrow, Daniel; Good, John M.; Haynes, Dionne M.; Indahl, Briana L.; Jahn, Thomas; Kriel, Hermanus; Montesano, Francesco; Nicklas, Harald; Noyola, Eva; Prochaska, Travis; Allen, Richard D.; Bender, Ralf; Blanc, Guillermo; Fabricius, Maximilian H.; Finkelstein, Steve; Landriau, Martin; MacQueen, Phillip J.; Roth, M. M.; Savage, R. D.; Snigula, Jan M.; Anwad, Heiko
2016-08-01
The Visible Integral-field Replicable Unit Spectrograph (VIRUS) consists of 156 identical spectrographs (arrayed as 78 pairs) fed by 35,000 fibers, each 1.5 arcsec in diameter, at the focus of the upgraded 10 m Hobby-Eberly Telescope (HET). VIRUS has a fixed bandpass of 350-550 nm and resolving power R ≈ 700. VIRUS is the first example of industrial-scale replication applied to optical astronomy and is capable of surveying large areas of sky, spectrally. The VIRUS concept offers significant savings of engineering effort, cost, and schedule when compared to traditional instruments. The main motivator for VIRUS is to map the evolution of dark energy for the Hobby-Eberly Telescope Dark Energy Experiment (HETDEX), using 0.8 million Lyman-alpha-emitting galaxies as tracers. The VIRUS array is undergoing staged deployment during 2016 and 2017. It will provide a powerful new facility instrument for the HET, well suited to the survey niche of the telescope, and will open up large spectroscopic surveys of the emission-line universe for the first time. We review the production, lessons learned in reaching volume production, characterization, and first deployment of this massive instrument.
Configuration Analysis of the ERS Points in Large-Volume Metrology System
Jin, Zhangjun; Yu, Cijun; Li, Jiangxiong; Ke, Yinglin
2015-01-01
In aircraft assembly, multiple laser trackers are used simultaneously to measure large-scale aircraft components. To combine the independent measurements, the transformation matrices between the laser trackers' coordinate systems and the assembly coordinate system are calculated by measuring the enhanced reference system (ERS) points. This article aims to understand how the configuration of the ERS points affects the transformation matrix errors, and then to optimize the deployment of the ERS points to reduce those errors. To optimize the deployment of the ERS points, an explicit model is derived to estimate the transformation matrix errors. The estimation model is verified by an experiment implemented on the factory floor. Based on the proposed model, a group of sensitivity coefficients is derived to evaluate the quality of a configuration of the ERS points, and then several typical configurations of the ERS points are analyzed in detail with the sensitivity coefficients. Finally, general guidance is established to instruct the deployment of the ERS points with respect to the layout, the volume size, and the number of the ERS points, as well as the position and orientation of the assembly coordinate system. PMID:26402685
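The transformation between a tracker frame and the assembly frame is usually obtained as a least-squares rigid fit to the commonly measured ERS points. The sketch below shows the standard SVD-based (Kabsch) solution as one way to compute it; it is not necessarily the estimator used in the article.

    import numpy as np

    def rigid_fit(P, Q):
        # Least-squares rotation R and translation t mapping tracker-frame
        # points P (N x 3) onto assembly-frame points Q (N x 3), so that
        # Q ~= P @ R.T + t. Classic Kabsch/SVD solution.
        p0, q0 = P.mean(axis=0), Q.mean(axis=0)
        H = (P - p0).T @ (Q - q0)                 # 3x3 cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # proper rotation, det = +1
        t = q0 - R @ p0
        return R, t

With such a fit in hand, the sensitivity of R and t to perturbations of individual ERS points is what the article's sensitivity coefficients quantify.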
NASA Astrophysics Data System (ADS)
Ward, K. M.; Lin, F. C.
2017-12-01
Recent advances in seismic data-acquisition technology paired with an increasing interest from the academic passive-source seismological community have opened up new scientific targets and imaging possibilities, often referred to as Large-N experiments (large number of instruments). The success of these and other deployments has motivated individual researchers, as well as the larger seismological community, to invest in the next generation of nodal geophones. Although the new instruments have battery life and bandwidth limitations compared to broadband instruments, the relatively low deployment and procurement cost of these new nodal geophones provides an additional novel tool for researchers. Here, we explore the viability of using autonomous three-component nodal geophones to calculate teleseismic Ps receiver functions by comparison with co-located broadband stations, and highlight some potential advantages with a dense nodal array deployed around the Upper Geyser Basin in Yellowstone National Park. Two key findings from this example are that (1) very dense nodal arrays can be used to image small-scale features in the shallow crust that typical broadband station spacing would alias, and (2) nodal arrays with a larger footprint could be used to image deeper features with greater or equal detail as typical broadband deployments but at a reduced deployment cost. The success of the previous example has motivated a larger 2-D line across the Cascadia subduction zone. In the summer of 2017, we deployed 174 nodal geophones with an average site spacing of 750 m. Synthetic tests with dense station spacing (~1 km) reveal subtler features of the system, consistent with our preliminary receiver function results from our Cascadia deployment. With the increasing availability of nodal geophones to individual researchers and the successful demonstration that nodal geophones are a viable instrument for receiver function studies, numerous scientific targets can be investigated at reduced cost or in expanded detail.
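Receiver functions of the kind discussed here are conventionally computed by deconvolving the vertical component from the radial one. A minimal frequency-domain water-level sketch follows; the water-level fraction and Gaussian width are illustrative parameters, and this is the standard recipe rather than the authors' exact processing.

    import numpy as np

    def receiver_function(radial, vertical, dt, water=0.01, gauss=2.5):
        # Water-level deconvolution: RF = F^-1[ R Z* G / max(ZZ*, w max(ZZ*)) ],
        # with a Gaussian low-pass G. Parameters here are illustrative.
        n = len(radial)
        R = np.fft.rfft(radial, n)
        Z = np.fft.rfft(vertical, n)
        f = np.fft.rfftfreq(n, dt)
        power = (Z * np.conj(Z)).real
        denom = np.maximum(power, water * power.max())        # water level
        G = np.exp(-(2 * np.pi * f) ** 2 / (4.0 * gauss**2))  # Gaussian filter
        return np.fft.irfft(R * np.conj(Z) * G / denom, n)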
NASA Astrophysics Data System (ADS)
Kasdin, N. J.; Lisman, D.; Shaklan, S.; Thomson, M.; Webb, D.; Cady, E.; Marks, G. W.; Lo, A.
2013-09-01
An external occulter is a satellite employing a large screen, or starshade, that flies in formation with a spaceborne telescope to provide the starlight suppression needed for detecting and characterizing exoplanets. Among the advantages of using an occulter are the broad bandwidth allowed for characterization and the removal of starlight before it enters the observatory, greatly relaxing the requirements on the telescope and instrument. In support of NASA's Exoplanet Exploration Program and the Technology Development for Exoplanet Missions (TDEM), we recently completed a 2-year study of the manufacturability and metrology of starshade petals. In this paper we review the results of that successful first TDEM, which demonstrated that an occulter petal could be built and measured to an accuracy consistent with close to 10^-10 contrast. We then present the results of our second TDEM to demonstrate the next critical technology milestone: precision deployment of the central truss and petals to the necessary accuracy. We show the deployment of an existing deployable truss outfitted with four sub-scale petals and a custom-designed central hub.
Charting the Emergence of Corporate Procurement of Utility-Scale PV
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heeter, Jenny S.; Cook, Jeffrey J.; Bird, Lori A.
Through July 2017, corporate customers contracted for more than 2,300 MW of utility-scale solar. This paper examines the benefits, challenges, and outlooks for large-scale off-site solar purchasing through four pathways: PPAs, retail choice, utility partnerships (green tariffs and bilateral contracts with utilities), and becoming a licensed wholesale seller of electricity. Each pathway differs based on where in the United States it is available, the value provided to a corporate off-taker, and the ease of implementation. The paper concludes with a comparison of the pathways' future prospects, noting that deploying more corporate off-site solar will require new procurement pathways.
Multimodal biometrics for identity documents (MBioID).
Dessimoz, Damien; Richiardi, Jonas; Champod, Christophe; Drygajlo, Andrzej
2007-04-11
The MBioID initiative has been set up to address the following germane question: which biometric technologies could be deployed in identity documents in the foreseeable future, and how? This research effort proposes to look at current and future practices and systems for establishing and using biometric identity documents (IDs) and to evaluate their effectiveness in large-scale deployments. The first objective of the MBioID project is to present a review document establishing the current state-of-the-art related to the use of multimodal biometrics in an ID application. This research report gives the main definitions, properties and framework of use related to biometrics, an overview of the main standards developed in the biometric industry and standardisation organisations to ensure interoperability, as well as some of the legal framework and the issues associated with biometrics, such as privacy and personal data protection. The state-of-the-art in terms of technological development is also summarised for a range of single biometric modalities (2D and 3D face, fingerprint, iris, on-line signature and speech), chosen according to ICAO recommendations and availabilities, and for various multimodal approaches. This paper gives a summary of the main elements of that report. The second objective of the MBioID project is to propose relevant acquisition and evaluation protocols for a large-scale deployment of biometric IDs. Combined with the protocols, a multimodal database will be acquired in a realistic way, in order to be as close as possible to a real biometric ID deployment. In this paper, the issues and solutions related to the acquisition setup are briefly presented.
A Laser Interferometric Miniature Seismometer
2010-09-01
Carr, Dustin W.; Baldwin, Patrick C.; Knapp-Kleinsorge, Shawn A.; Milburn, Howard; Robinson, David
Symphony Acoustics, Inc. Sponsored by the National Nuclear Security Administration, Award No. DE-FG02-08ER85108.001.
The threat of ... performance, compact device can enable rapid deployment of large-scale arrays, which can in turn be used to provide higher-quality data during times of ...
Status of liquid metal fast breeder reactor fuel development in Japan
NASA Astrophysics Data System (ADS)
Katsuragawa, M.; Kashihara, H.; Akebi, M.
1993-09-01
The mixed-oxide fuel technology for a liquid metal fast breeder reactor (LMFBR) in Japan is progressing toward commercial deployment of LMFBR. Based on accumulated experience in Joyo and Monju fuel development, efforts for large scale LMFBR fuel development are devoted to improved irradiation performance, reliability and economy. This paper summarizes accomplishments, current activities and future plans for LMFBR fuel development in Japan.
Zheng, Wei; Yan, Xiaoyong; Zhao, Wei; Qian, Chengshan
2017-12-20
A novel large-scale multi-hop localization algorithm based on regularized extreme learning is proposed in this paper. The large-scale multi-hop localization problem is formulated as a learning problem. Unlike similar localization algorithms, the proposed algorithm overcomes the shortcoming of traditional algorithms that are only applicable to isotropic networks, and therefore adapts well to complex deployment environments. The proposed algorithm is composed of three stages: data acquisition, modeling, and location estimation. In the data acquisition stage, the training information between nodes of the given network is collected. In the modeling stage, the model relating the hop counts to the physical distances between nodes is constructed using regularized extreme learning. In the location estimation stage, each node finds its specific location in a distributed manner. Theoretical analysis and several experiments show that the proposed algorithm can adapt to different topological environments at low computational cost. Furthermore, high accuracy can be achieved by this method without setting complex parameters.
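The modeling stage lends itself to a compact sketch: an extreme learning machine fixes a random hidden layer and solves a ridge-regularized least-squares problem for the output weights. The following minimal version maps anchor hop-count vectors to physical distances; the layer size and regularization weight are illustrative, not the paper's settings.

    import numpy as np

    def train_relm(hops, dists, n_hidden=50, lam=1e-2, seed=0):
        # Regularized extreme learning machine, sketched: random input
        # weights stay fixed, and only the output weights beta are fit by a
        # ridge (Tikhonov) solve. hops: (N, n_anchors) training hop counts;
        # dists: the matching physical distances.
        rng = np.random.default_rng(seed)
        W = rng.standard_normal((hops.shape[1], n_hidden))
        b = rng.standard_normal(n_hidden)
        H = np.tanh(hops @ W + b)                          # hidden activations
        beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ dists)
        return lambda x: np.tanh(x @ W + b) @ beta         # distance predictor

In the distributed location-estimation stage, each unknown node would feed its own hop-count vector to the trained predictor and then locate itself, e.g. by multilateration against the anchors.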
Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses
Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T
2014-01-01
Due to the upcoming deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements; therefore biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600
The Potential of Geothermal as a Major Supplier of U.S. Primary Energy using EGS technology
NASA Astrophysics Data System (ADS)
Tester, J. W.
2012-12-01
Recent national focus on the value of increasing our supply of indigenous, renewable energy underscores the need to re-evaluate all alternatives, particularly those that are large and well-distributed nationally. To transition from our current hydrocarbon-based energy system, we will need to expand and diversify the portfolio of options we currently have. One such option that has been undervalued, and often ignored completely in national assessments, is geothermal energy from both conventional hydrothermal resources and enhanced or engineered geothermal systems (EGS). Although geothermal energy is currently used worldwide for both electric and non-electric applications from conventional hydrothermal resources and in ground-source heat pumps, most of the emphasis in the US has been on generating electricity. For example, a 2006 MIT-led study focused on the potential for EGS to provide 100,000 MWe of base-load electric generating capacity in the US by 2050. Since that time, a Cornell-led study has evaluated the potential for geothermal to meet the more than 25 EJ per year demand in the US for low-temperature thermal energy for heating and other direct process applications. Field testing of EGS in the US, Europe, and Australia is reviewed to outline what remains to be done for large-scale deployment. Research, Development and Demonstration (RD&D) needs in five areas important to geothermal deployment on a national scale are reviewed: (1) Resource: estimating the magnitude and distribution of the US resource; (2) Reservoir technology: establishing requirements for extracting and utilizing energy from EGS reservoirs, including drilling, reservoir design and stimulation; (3) Utilization: exploring end-use options for district heating, electricity generation and co-generation; (4) Environmental impacts and tradeoffs: dealing with water and land use and seismic risk, and quantifying the reduction in carbon emissions with increased deployment; (5) Economics: projecting costs for EGS-supplied electricity as a function of invested R&D and deployment in evolving US energy markets.
Arbogast, Kristy B; Durbin, Dennis R; Kallan, Michael J; Elliott, Michael R; Winston, Flaura K
2005-04-01
To estimate the risk of serious nonfatal injuries in frontal crashes among belted children seated in the right front seat of vehicles in which second-generation passenger air bags deployed, compared with that of belted children seated in the right front seat of vehicles in which first-generation passenger air bags deployed. We enrolled a probability sample of 1781 seat belt-restrained occupants aged 3 through 15 years seated in the right front seat, exposed to deployed passenger air bags in frontal crashes involving insured vehicles in 3 large US regions, between December 1, 1998, and November 30, 2002. A telephone interview was conducted with the driver of the vehicle using a previously validated instrument. The study sample was weighted according to each subject's probability of selection, with analyses conducted on the weighted sample. The main outcome measure was the risk of serious injury (Abbreviated Injury Scale score ≥ 2 injuries, and facial lacerations). The risk of serious injury for restrained children in the right front seat exposed to deployed second-generation passenger air bags was 9.9%, compared with 14.9% for similar children exposed to deployed first-generation passenger air bags (adjusted odds ratio, 0.59; 95% confidence interval, 0.36-0.97). This study provides evidence based on field data that the risk of injury to children exposed to deploying second-generation passenger air bags is reduced compared with earlier designs.
Bioinspired principles for large-scale networked sensor systems: an overview.
Jacobsen, Rune Hylsberg; Zhang, Qi; Toftegaard, Thomas Skjødeberg
2011-01-01
Biology has often been used as a source of inspiration in computer science and engineering. Bioinspired principles have found their way into network node design and research due to the appealing analogies between biological systems and large networks of small sensors. This paper provides an overview of bioinspired principles and methods, such as swarm intelligence, natural time synchronization, artificial immune systems, and intercellular information exchange, applicable to sensor network design. Bioinspired principles and methods are discussed in the context of routing, clustering, time synchronization, optimal node deployment, localization, and security and privacy.
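As a concrete instance of one listed principle, natural time synchronization is often illustrated with firefly-style pulse-coupled oscillators. The toy simulation below is a Mirollo-Strogatz-flavored sketch with invented parameters, not a protocol from the paper: each node's phase advances steadily, and every firing nudges the neighbors' phases upward until the network flashes in unison.

    import numpy as np

    def firefly_sync(n=20, steps=4000, dt=0.01, eps=0.05, seed=1):
        # Pulse-coupled oscillators: phases advance at unit rate; a node
        # fires on reaching 1.0, resetting itself and nudging all others up
        # by eps (clipped at 1.0). All parameters are illustrative.
        rng = np.random.default_rng(seed)
        phase = rng.random(n)
        for _ in range(steps):
            phase += dt
            fired = phase >= 1.0
            if fired.any():
                phase[~fired] = np.minimum(1.0, phase[~fired] + eps)
                phase[fired] = 0.0
        return phase.std()   # spread shrinks toward 0 as nodes synchronize

    print(firefly_sync())    # small value indicates a synchronized network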
High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing
NASA Astrophysics Data System (ADS)
Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.
2015-12-01
Next generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, whose data volumes and data throughput rates are orders of magnitude larger than those of present day missions. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require rapid turn-around for processing different science scenarios where on-premise and even traditional HPC computing environments may not meet the high processing needs. We present our experiences on deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of their Level-2 full physics data products. We will explore optimization approaches to getting the best performance out of hybrid-cloud computing as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment based on market forces. We will present how we enabled high-tolerance computing in order to achieve large-scale computing as well as operational cost savings.
Near Earth Asteroid Scout Solar Sail Engineering Development Unit Test Suite
NASA Technical Reports Server (NTRS)
Lockett, Tiffany Russell; Few, Alexander; Wilson, Richard
2017-01-01
The Near Earth Asteroid (NEA) Scout project is a 6U reconnaissance mission to investigate a near Earth asteroid utilizing an 86 m² solar sail as the primary propulsion system. This will be the largest solar sail NASA has launched to date. NEA Scout is currently manifested on the maiden voyage of the Space Launch System in 2018. In development of the solar sail subsystem, design challenges were identified and investigated for packaging within a 6U form factor and deployment in cis-lunar space. Analysis captured the thermal, stress, and dynamic behavior of the stowed system and matured an integrated sail membrane model for deployed flight dynamics. Full-scale system testing on the ground is the optimal way to demonstrate system robustness, repeatability, and overall performance on a compressed flight schedule. To physically test the system, the team developed a flight-sized engineering development unit with design features as close to flight as possible. The test suite included ascent vent, random vibration, functional deployments, thermal vacuum, and full sail deployments. All of these tests contributed towards development of the final flight unit. This paper will address several of the design challenges and lessons learned from the NEA Scout solar sail subsystem engineering development unit, with testing spanning the component level all the way to the integrated subsystem level. From the optical properties of the sail material to the folding and spooling of the single sail, the team has developed a robust deployment system for the solar sail. The team completed several deployments of the sail system in preparation for flight at half scale (4 m) and full scale (6.8 m): boom only, half-scale sail deployment, and full-scale sail deployment. This paper will also address expected and received test results from ascent vent, random vibration, and deployment tests.
A Lightweight, Precision-Deployable, Optical Bench for High Energy Astrophysics Missions
NASA Astrophysics Data System (ADS)
Danner, Rolf; Dailey, D.; Lillie, C.
2011-09-01
The small angle of total reflection for X-rays, forcing grazing incidence optics with large collecting areas to long focal lengths, has been a fundamental barrier to the advancement of high-energy astrophysics. Design teams around the world have long recognized that a significant increase in effective area beyond Chandra and XMM-Newton requires either a deployable optical bench or separate X-ray optics and instrument module on formation flying spacecraft. Here, we show that we have in hand the components for a lightweight, precision-deployable optical bench that, through its inherent design features, is the affordable path to the next generation of imaging high-energy astrophysics missions. We present our plans for a full-scale engineering model of a deployable optical bench for Explorer-class missions. We intend to use this test article to raise the technology readiness level (TRL) of the tensegrity truss for a lightweight, precision-deployable optical bench for high-energy astrophysics missions from TRL 3 to TRL 5 through a set of four well-defined technology milestones. The milestones cover the architecture's ability to deploy and control the focal point, characterize the deployed dynamics, determine long-term stability, and verify the stowed load capability. Our plan is based on detailed design and analysis work and the construction of a first prototype by our team. Building on our prior analysis and the high TRL of the architecture components we are ready to move on to the next step. The key elements to do this affordably are two existing, fully characterized, flight-quality, deployable booms. After integrating them into the test article, we will demonstrate that our architecture meets the deployment accuracy, adjustability, and stability requirements. The same test article can be used to further raise the TRL in the future.
Pye, Rachel E; Simpson, Leanne K
2017-09-01
Military deployment can have an adverse effect on a soldier's family, though little research has looked at these effects in a British sample. We investigated the perceptions of marital and family functioning held by wives of serving U.K. soldiers, across three stages of the deployment cycle: currently deployed, postdeployment, and predeployment, plus a nonmilitary comparison group. Uniquely, young (aged 3.5-11 years) children's perceptions of their family were also investigated, using the parent-child alliance (PCA) coding scheme applied to drawings of the family. Two hundred and twenty British military families of regular service personnel from the British Army's Royal Armoured Corps were sent survey packs distributed with a monthly welfare office newsletter. Wives were asked to complete a series of self-report items, and the youngest child in the family between the ages of 3.5 and 11 years was asked to draw a picture of their family. Complete data were available for 78 military families, and an additional 34 nonmilitary families were recruited via opportunity sampling. Results indicated that wives of currently deployed and recently returned personnel were less satisfied with their family and its communication, and children's pictures indicated higher levels of dysfunctional parent-child alliance, whereas predeployment families responded similarly to nonmilitary families. Marital satisfaction was similar across all groups except predeployment families, who were significantly more satisfied. Nonmilitary and predeployment families showed balanced family functioning, and currently and recently deployed families demonstrated poor family functioning. In comparison to nonmilitary families, predeployment families showed a large "spike" in the rigidity subscale of the Family Adaptability and Cohesion Evaluation Scale IV. Wives' perceptions of family functioning, but not marital satisfaction, differed between the deployment groups. The results from the coded children's drawings correlated with the self-report measures from the wife/mother, indicating that children's drawings could be a useful approach when working with younger children in this area. It is tentatively suggested that the differences across deployment stage in family functioning could be mediated not only by communication difficulties between deployed personnel and their families, but also by their effect on the children in the family. Larger-scale longitudinal research is needed to investigate this further.
2014-01-01
Background: Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. Results: To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. Conclusions: By taking advantage of cloud computing, and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples. PMID:24475911
Cyberhubs: Virtual Research Environments for Astronomy
NASA Astrophysics Data System (ADS)
Herwig, Falk; Andrassy, Robert; Annau, Nic; Clarkson, Ondrea; Côté, Benoit; D’Sa, Aaron; Jones, Sam; Moa, Belaid; O’Connell, Jericho; Porter, David; Ritter, Christian; Woodward, Paul
2018-05-01
Collaborations in astronomy and astrophysics are faced with numerous cyber-infrastructure challenges, such as large data sets, the need to combine heterogeneous data sets, and the challenge to effectively collaborate on those large, heterogeneous data sets with significant processing requirements and complex science software tools. The cyberhubs system is an easy-to-deploy package for small- to medium-sized collaborations based on the Jupyter and Docker technology, which allows web-browser-enabled, remote, interactive analytic access to shared data. It offers an initial step to address these challenges. The features and deployment steps of the system are described, as well as the requirements collection through an account of the different approaches to data structuring, handling, and available analytic tools for the NuGrid and PPMstar collaborations. NuGrid is an international collaboration that creates stellar evolution and explosion physics and nucleosynthesis simulation data. The PPMstar collaboration performs large-scale 3D stellar hydrodynamics simulations of interior convection in the late phases of stellar evolution. Examples of science that is currently performed on cyberhubs, in the areas of 3D stellar hydrodynamic simulations, stellar evolution and nucleosynthesis, and Galactic chemical evolution, are presented.
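The Jupyter-plus-Docker pattern underlying the system can be sketched with the Docker SDK for Python. Everything named below, the image, the port mapping, and the shared-data path, is a placeholder, not the actual cyberhubs configuration.

    import docker

    client = docker.from_env()
    container = client.containers.run(
        "jupyter/scipy-notebook",      # placeholder analysis image
        detach=True,
        ports={"8888/tcp": 8888},      # expose the notebook server
        volumes={"/data/shared":       # hypothetical shared data set
                 {"bind": "/home/jovyan/work", "mode": "ro"}},
    )
    print(container.logs(tail=10).decode("utf-8", errors="replace"))

Packaging the analytic environment as an image is what lets every collaborator get the same tools and read-only access to the same shared data through a web browser.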
The Hawaiian PLUME Project Successfully Completes its First Deployment
NASA Astrophysics Data System (ADS)
Laske, G.; Collins, J. A.; Wolfe, C. J.; Weeraratne, D.; Solomon, S. C.; Detrick, R. S.; Orcutt, J. A.; Bercovici, D. A.; Hauri, E. H.
2006-12-01
The Hawaiian PLUME (Plume-Lithosphere Undersea Melt Experiment) project is a multi-disciplinary program to study the deep mantle roots of the Hawaiian hotspot. The nearly linear alignment of the Hawaiian Islands has heretofore prevented high-resolution, three-dimensional imaging of mantle structure in the region from land seismic observations, a situation that has permitted debates to persist over whether or not the Hawaiian hotspot is underlain by a classical plume from the deep mantle and how mantle upwelling interacts with the overlying lithosphere beneath the Hawaiian Swell. The centerpiece of the PLUME project is a large broadband seismic network that includes ocean-bottom seismometers (OBSs) as well as portable land stations. Occupying a total of more than 80 sites and having a two-dimensional aperture of more than 1000 km, this network includes one of the first large-scale, long-term deployments of broadband OBSs. The seismic experiment has been conducted in two stages to record teleseismic body and surface waves over a total duration of two years. A first deployment of 35 OBSs extended from January 2005 through January 2006 and was centered on the island of Hawaii, the locus of the hotspot. A second OBS deployment, with a larger aperture and larger station spacing, was carried out in May 2006 to collect data for another year. The first deployment was a technical success, with 32 of 35 OBSs recovered and many large events at suitable distances and azimuths well recorded. We recorded 225 events with scalar seismic moments greater than 5×10^17 N m. Our database includes the great 28 March 2005, M_S = 8.2 aftershock of the 26 December 2004 Sumatra-Andaman earthquake and two large earthquakes on the Juan de Fuca plate on 15 and 17 June 2005. Our surface wave analysis will be based on 102 large, shallow (h_0 < 200 km) earthquakes with scalar seismic moments M_0 ≥ 20×10^17 N m. This number of events is about 20% more than what was gathered during the year-long SWELL pilot deployment in the same region in 1997-98 using solely differential pressure gauges. The database also includes excellent long-period body wave waveforms suitable for tomographic imaging, as well as horizontal-component data suitable for a shear-wave splitting analysis and for identifying converted phases from the upper-mantle transition zone with receiver function techniques. In addition to the seismic experiment, nine of eleven dredges on the first deployment cruise yielded coral and basalt samples that will help to constrain subsidence rates of the Hawaiian Islands and the origin of rift volcanism. On the two deployment cruises we also obtained high-resolution multi-beam bathymetry along previously unmapped transects covering areas of the eastern parts of the Maui and Molokai Fracture Zones as well as portions of the Bach Ridge at the southern end of the Musician Seamounts.
NASA Astrophysics Data System (ADS)
Kasdin, N. J.; Shaklan, S.; Lisman, D.; Thomson, M.; Webb, D.; Cady, E.; Marks, G. W.; Lo, A.
2013-01-01
In support of NASA's Exoplanet Exploration Program and the Technology Development for Exoplanet Missions (TDEM), we recently completed a 2-year study of the manufacturability and metrology of starshade petals. An external occulter is a satellite employing a large screen, or starshade, that flies in formation with a spaceborne telescope to provide the starlight suppression needed for detecting and characterizing exoplanets. Among the advantages of using an occulter are the broad bandwidth allowed for characterization and the removal of starlight before it enters the observatory, greatly relaxing the requirements on the telescope and instrument. This poster presents the results of our successful first TDEM, which demonstrated that an occulter petal could be built and measured to an accuracy consistent with close to 10^-10 contrast. We also present the progress in our second TDEM to demonstrate the next critical technology milestone: precision deployment of the central truss and petals to the necessary accuracy. We have completed manufacture of four sub-scale petals and a central hub to fit with an existing deployable truss. We show the plans for repeated stow and deploy tests of the assembly and the metrology to confirm that each deployment repeatably meets the absolute positioning requirements of the petals (better than 1.0 mm).
A stowing and deployment strategy for large membrane space systems on the example of Gossamer-1
NASA Astrophysics Data System (ADS)
Seefeldt, Patric
2017-09-01
Deployment systems for innovative space applications such as solar sails require a technique for controlled and autonomous deployment in space. The deployment process has a strong impact on the mechanism and structural design and sizing. On the example of the design implemented in the Gossamer-1 project of the German Aerospace Center (DLR), such a stowing and deployment process is analyzed. It is based on a combination of zig-zag folding and coiling of triangular sail segments spanned between crossed booms. The deployment geometry and the forces introduced by the mechanism are explored in order to reveal how the loads are transferred through the membranes to structural components such as the booms. The folding geometry and force progressions are described by function compositions of an inverse trigonometric function with the considered trigonometric function itself. If these functions are evaluated over several periods of the trigonometric function, a non-smooth oscillating curve occurs. Depending on the trigonometric function, these are often vividly described as zig-zag or sawtooth functions. The developed functions are applied to the Gossamer-1 design. The deployment geometry reveals a tendency for the loads to be transferred along the catheti of the sail segments and therefore mainly along the boom axes. The load introduced by the spool deployment mechanism is described. By combining the deployment geometry with that load, a prediction of the deployment load progression is achieved. The mathematical description of the stowing and deployment geometry, as well as of the forces inflicted by the mechanism, provides an understanding of how exactly the membrane deploys and through which edges the deployment forces are transferred. The mathematical analysis also gives an impression of sensitive parameters that could be influenced by manufacturing tolerances or unsymmetrical deployment of the sail segments. While the mathematical model was applied to the design of the Gossamer-1 hardware, it allows analysis of other geometries. This is of particular interest as Gossamer-1 investigated deployment technology on a relatively small scale of 5 m × 5 m, while currently considered solar sail missions require sails that are about one order of magnitude larger.
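The "composition of an inverse trigonometric function with the trigonometric function itself" can be made concrete with a standard example; the expression below illustrates the type of zig-zag function meant, not the paper's exact deployment-geometry formulas.

    % Composing arcsin with sin gives a triangle (zig-zag) wave when
    % evaluated over several periods -- an example of the function family
    % described, with k indexing the period.
    f(x) = \arcsin(\sin x) =
    \begin{cases}
      x - 2k\pi, & x \in [2k\pi - \pi/2,\; 2k\pi + \pi/2],\\
      (2k+1)\pi - x, & x \in [2k\pi + \pi/2,\; 2k\pi + 3\pi/2],
    \end{cases}
    \qquad k \in \mathbb{Z}.

Evaluated over several periods, f is piecewise linear with slope ±1, which is exactly the non-smooth oscillating behavior the abstract describes.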
Advances in Field Deployable Instrumented Particles for the Study of Alluvial Transport Mechanisms
NASA Astrophysics Data System (ADS)
Dillon, B.; Strom, K.
2017-12-01
Advances in microelectromechanical systems (MEMS) in the past decade have led to the development of various instrumented or "smart" particles for use in the study of alluvial transport. The goal of many of these devices is to collect data on the interaction between hydrodynamic turbulence and individual sediment particles. Studying this interaction provides a basis to better understand entrainment and deposition processes, which leads to better predictive morphologic and transport models. In collecting data on these processes, researchers seek to capture the time history of the forces incident on the particle and the particle's reaction. Many methods have been employed to capture these data: miniaturized pressure traps, accelerometers, gyroscopes, MEMS pressure transducers, and cantilevered load cells. However, no system to date has been able to capture the pressure forces incident on the particle and its reaction while remaining mobile and of a size and density comparable to most gravels. Advances in the development, deployment, and use of waterproofed laboratory instrumentation have led our research group to develop such a particle. This particle has been used in both laboratory settings and large-scale fluvial environments (coupled with a field-deployable PIV system) to capture data on turbulent erosion processes. This system advances the practice in several ways: (1) it is, at present, the smallest (⌀ 19 mm) instrumented erodible particle reported in the literature; (2) it contains novel developments in pressure sensing technology which allow the inclusion of six pressure ports, a 3-axis accelerometer, and a 1-axis gyroscope, all of which can be recorded simultaneously; (3) it expands the researcher's ability to gather data on phenomena that previously have mandated the use of a laboratory-scale model. The use of this system has generated observations of the so-called very large scale motions (VLSMs) in a reach of the Virginia section of the New River. Their effects on erosional processes are presented.
Overview of Opportunities for Co-Location of Solar Energy Technologies and Vegetation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macknick, Jordan; Beatty, Brenda; Hill, Graham
2013-12-01
Large-scale solar facilities have the potential to contribute significantly to national electricity production. Many solar installations are large-scale or utility-scale, with a capacity over 1 MW and connected directly to the electric grid. Large-scale solar facilities offer an opportunity to achieve economies of scale in solar deployment, yet there have been concerns about the amount of land required for solar projects and the impact of solar projects on local habitat. During the site preparation phase for utility-scale solar facilities, developers often grade land and remove all vegetation to minimize installation and operational costs, prevent plants from shading panels, and minimize potential fire or wildlife risks. However, the common site preparation practice of removing vegetation can be avoided in certain circumstances, and there have been successful examples where solar facilities have been co-located with agricultural operations or have native vegetation growing beneath the panels. In this study we outline some of the impacts that large-scale solar facilities can have on the local environment, provide examples of installations where impacts have been minimized through co-location with vegetation, characterize the types of co-location, and give an overview of the potential benefits from co-location of solar energy projects and vegetation. The varieties of co-location can be replicated or modified for site-specific use at other solar energy installations around the world. We conclude with opportunities to improve upon our understanding of ways to reduce the environmental impacts of large-scale solar installations.
Thinking big: Towards ideal strains and processes for large-scale aerobic biofuels production
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMillan, James D.; Beckham, Gregg T.
2016-12-22
Global concerns about anthropogenic climate change, energy security and independence, and the environmental consequences of continued fossil fuel exploitation are driving significant public- and private-sector interest and financing to hasten the development and deployment of processes to produce renewable fuels, as well as bio-based chemicals and materials, towards scales commensurate with current fossil fuel-based production. Over the past two decades, anaerobic microbial production of ethanol from first-generation hexose sugars derived primarily from sugarcane and starch has reached significant market share worldwide, with fermentation bioreactor sizes often exceeding the million-litre scale. More recently, industrial-scale lignocellulosic ethanol plants are emerging that produce ethanol from pentose and hexose sugars using genetically engineered microbes and bioreactor scales similar to those of first-generation biorefineries.
ENGINEERING DEVELOPMENT UNIT SOLAR SAIL
2016-01-13
Tiffany Lockett oversees the half-scale (36 square meters) Engineering Development Unit (EDU) solar sail deployment demonstration in preparation for full-scale EDU (86 square meters) deployment in April 2016. Details of rips and holes in solar sail fabric.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Papay, L.T.; Trocki, L.K.; McKinsey, R.R.
The Department of Energy's clean coal technology (CCT) program succeeded in developing more efficient, cleaner, coal-fired electricity options. The Department and its private partners succeeded in the demonstration of CCT, a major feat that required more than a decade of commitment between them. As with many large-scale capital developments and changes, the market can shift dramatically over the course of the development process. The CCT program was undertaken in an era of unstable oil and gas prices, concern over acid rain, and guaranteed markets for power suppliers. Regulations, fuel prices, the emergence of competing technologies, and institutional factors are all affecting the outlook for CCT deployment. The authors identify the major barriers to CCT deployment and then introduce some possible means to surmount the barriers.
STEP flight experiments Large Deployable Reflector (LDR) telescope
NASA Technical Reports Server (NTRS)
Runge, F. C.
1984-01-01
Flight testing plans for a large deployable infrared reflector telescope to be tested on a space platform are discussed. Subsystem parts, subassemblies, and whole assemblies are discussed. Assurance of operational deployability, rigidization, alignment, and serviceability will be sought.
The Framework for Life Cycle Cost Management,
1982-01-01
in the early phases of system development before full-scale development and initial production, O&S cost projections are too uncertain to... Ensure that each increment of cost and schedule investment in R&M contributes significantly to the above objectives. DoDD 5000.40, Reliability and... Major systems are characterized by large investments of time and resources in the uncertain periods of development, production, and deployment and in
Towards a Cross-Domain MapReduce Framework
2013-11-01
These Big Data applications typically run as a set of MapReduce jobs to take advantage of Hadoop's ease of service deployment and large-scale... parallelism. Yet, Hadoop has not been adapted for multilevel secure (MLS) environments where data of different security classifications co-exist. To solve... multilevel security. I. INTRODUCTION The US Department of Defense (DoD) and US Intelligence Community (IC) recognize they have a Big Data problem
Building rooftop classification using random forests for large-scale PV deployment
NASA Astrophysics Data System (ADS)
Assouline, Dan; Mohajeri, Nahid; Scartezzini, Jean-Louis
2017-10-01
Large-scale solar photovoltaic (PV) deployment on existing building rooftops has proven to be one of the most efficient and viable sources of renewable energy in urban areas. As it usually requires a potential analysis over the area of interest, a crucial step is to estimate the geometric characteristics of the building rooftops. In this paper, we introduce a multi-layer machine learning methodology to classify 6 roof types, 9 aspect (azimuth) classes and 5 slope (tilt) classes for all building rooftops in Switzerland, using GIS processing. We train Random Forests (RF), an ensemble learning algorithm, to build the classifiers. We use LiDAR data at 2 m × 2 m resolution (covering buildings and vegetation) to extract several rooftop features, and generalised building footprint polygon data to localize buildings. The roof classifier is trained and tested with 1252 labeled roofs from three different urban areas, namely Baden, Luzern, and Winterthur. The results for roof type classification show an average accuracy of 67%. The aspect and slope classifiers are trained and tested with 11449 labeled roofs in the Zurich periphery area. The results for aspect and slope classification show different accuracies depending on the class: while some classes are well identified, other under-represented classes remain challenging to detect.
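As a minimal sketch of the classification step described above (with random placeholder features standing in for the LiDAR-derived ones; the paper's actual feature set and tuning are not reproduced here), a Random Forest for the six roof types could look like this:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical feature matrix: one row per rooftop, with LiDAR-derived
# statistics (e.g. height range, plane-fit residuals, dominant gradient
# direction). The features used in the paper are richer than this.
X = np.random.rand(1252, 12)          # placeholder features
y = np.random.randint(0, 6, 1252)     # six roof-type classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
clf.fit(X_tr, y_tr)
print("roof-type accuracy: %.2f" % clf.score(X_te, y_te))
```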
Radi, Marjan; Dezfouli, Behnam; Abu Bakar, Kamalrulnizam; Abd Razak, Shukor
2014-01-01
Network connectivity and link quality information are fundamental requirements for wireless sensor network protocols to perform their desired functionality. Most existing discovery protocols have focused only on the neighbor discovery problem, while only a few of them provide integrated neighbor search and link estimation. As these protocols require careful parameter adjustment before network deployment, they cannot provide scalable and accurate network initialization in large-scale dense wireless sensor networks with random topology. Furthermore, the performance of these protocols has not yet been fully evaluated. In this paper, we perform a comprehensive simulation study on the efficiency of employing adaptive protocols compared to the existing nonadaptive protocols for initializing sensor networks with random topology. In this regard, we propose adaptive network initialization protocols which integrate the initial neighbor discovery with the link quality estimation process to initialize large-scale dense wireless sensor networks without requiring any parameter adjustment before network deployment. To the best of our knowledge, this work is the first attempt to provide a detailed simulation study on the performance of integrated neighbor discovery and link quality estimation protocols for initializing sensor networks. This study can help system designers to determine the most appropriate approach for different applications. PMID:24678277
Hybridization of active and passive elements for planar photonic components and interconnects
NASA Astrophysics Data System (ADS)
Pearson, M.; Bidnyk, S.; Balakrishnan, A.
2007-02-01
The deployment of Passive Optical Networks (PON) for Fiber-to-the-Home (FTTH) applications currently represents the fastest growing sector of the telecommunication industry. Traditionally, FTTH transceivers have been manufactured using commodity bulk optics subcomponents, such as thin film filters (TFFs), micro-optic collimating lenses, TO-packaged lasers, and photodetectors. Assembling these subcomponents into a single housing requires active alignment and labor-intensive techniques. Today, the majority of cost-reducing strategies using bulk subcomponents have been implemented, making future reductions in the price of manufacturing FTTH transceivers unlikely. Future success of large-scale deployments of FTTH depends on further cost reductions of transceivers. Recognizing the necessity of a radically new packaging approach for the assembly of photonic components and interconnects, we designed a novel way of hybridizing active and passive elements into a planar lightwave circuit (PLC) platform. In our approach, all the filtering components were monolithically integrated into the chip using advancements in planar reflective gratings. Subsequently, active components were passively hybridized with the chip using fully-automated high-capacity flip-chip bonders. In this approach, the assembly of the transceiver package required no active alignment and was readily suitable for large-scale production. This paper describes the monolithic integration of filters and hybridization of active components in both silica-on-silicon and silicon-on-insulator PLCs.
NASA Astrophysics Data System (ADS)
Kato, E.; Yamagata, Y.
2014-12-01
Bioenergy with Carbon Capture and Storage (BECCS) is a key component of mitigation strategies in future socio-economic scenarios that aim to keep the mean global temperature rise below 2°C above pre-industrial, which would require net negative carbon emissions at the end of the 21st century. Because of the additional need for land, developing sustainable low-carbon scenarios requires careful consideration of the land-use implications of deploying large-scale BECCS. We evaluated the feasibility of large-scale BECCS in RCP2.6, a scenario with net negative emissions aiming to keep the 2°C temperature target, with a top-down analysis of required yields and a bottom-up evaluation of BECCS potential using a process-based global crop model. Land-use change carbon emissions related to the land expansion were examined using a global terrestrial biogeochemical cycle model. Our analysis reveals that first-generation bioenergy crops would not meet the required BECCS of the RCP2.6 scenario even with high fertilizer and irrigation application. Using second-generation bioenergy crops can marginally fulfill the required BECCS only if a technology for full post-process combustion CO2 capture is deployed with high fertilizer application in the crop production. If such an assumed technological improvement does not occur in the future, more than doubling the area for bioenergy production for BECCS around 2050 assumed in RCP2.6 would be required; however, such scenarios implicitly induce large-scale land-use changes that would cancel half of the assumed CO2 sequestration by BECCS. Otherwise, a conflict of land use with food production is inevitable.
Miller, Lee M; Kleidon, Axel
2016-11-29
Wind turbines generate electricity by removing kinetic energy from the atmosphere. Large numbers of wind turbines are likely to reduce wind speeds, which lowers estimates of electricity generation from what would be presumed from unaffected conditions. Here, we test how well wind power limits that account for this effect can be estimated without explicitly simulating atmospheric dynamics. We first use simulations with an atmospheric general circulation model (GCM) that explicitly simulates the effects of wind turbines to derive wind power limits (GCM estimate), and compare them to a simple approach derived from the climatological conditions without turbines [vertical kinetic energy (VKE) estimate]. On land, we find strong agreement between the VKE and GCM estimates with respect to electricity generation rates (0.32 and 0.37 We m-2) and wind speed reductions of 42 and 44%. Over ocean, the GCM estimate is about twice the VKE estimate (0.59 and 0.29 We m-2), yet with comparable wind speed reductions (50 and 42%). We then show that this bias can be corrected by modifying the downward momentum flux to the surface. Thus, large-scale limits to wind power use can be derived from climatological conditions without explicitly simulating atmospheric dynamics. Consistent with the GCM simulations, the approach estimates that only comparatively few land areas are suitable to generate more than 1 We m-2 of electricity and that larger deployment scales are likely to reduce the expected electricity generation rate of each turbine. We conclude that these atmospheric effects are relevant for planning the future expansion of wind power.
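As a heavily hedged numerical sketch in the spirit of the climatology-based (VKE) approach, the downward flux of kinetic energy through the boundary layer bounds what turbines can extract; the exact formulation and coefficients in the paper differ, and every number below is an illustrative placeholder:

```python
# Simplified reading of a climatology-based wind power limit: kinetic
# energy flux to the surface ~ momentum flux times wind speed. Not the
# authors' equation; rho, u_star and v are placeholder climatology values.
rho = 1.2          # air density, kg/m^3
u_star = 0.4       # climatological friction velocity, m/s (placeholder)
v = 7.0            # climatological near-surface wind speed, m/s (placeholder)

momentum_flux = rho * u_star**2      # kg m^-1 s^-2
vke_flux = momentum_flux * v         # W/m^2, kinetic energy flux to surface
print("VKE-scale generation limit: %.2f W/m^2" % vke_flux)
```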
2013-01-01
Background: The mobile medical unit/polyclinic (MMU/PC) was an essential part of the medical services supporting ill or injured members of the Olympic or Paralympic family during the 2010 Olympic and Paralympic Winter Games. The objective of this study was to survey the satisfaction of the clinical staff who completed the training programs prior to deployment to the MMU. Methods: Medical personnel who participated in at least one of the four training programs, including (1) weekend sessions; (2) web-based modules; (3) just-in-time training; and (4) daily simulation exercises, were invited to participate in a web-based survey and comment on their level of satisfaction with the training program. Results: A total of 64 (out of 94 invited) physicians, nurses and respiratory therapists completed the survey. All participants reported that the MMU/PC training positively impacted their knowledge, skills and team functions while deployed at the MMU/PC during the 2010 Olympic Games. However, components of the training program were valued differently depending on clinical job title, years of experience, and prior experience in large-scale events. Respondents with little or no experience working in large-scale events (45%) rated daily simulations as the most valuable component of the training program for strengthening competencies and knowledge in clinical skills for working in large-scale events. Conclusion: The multi-phase MMU/PC training was found to be beneficial for preparing the medical team for the 2010 Winter Games. In particular, this survey demonstrates the effectiveness of simulation training programs on teamwork competencies in ad hoc groups. PMID:24225074
Wadud, Zahid; Hussain, Sajjad; Javaid, Nadeem; Bouk, Safdar Hussain; Alrajeh, Nabil; Alabed, Mohamad Souheil; Guizani, Nadra
2017-09-30
Industrial Underwater Acoustic Sensor Networks (IUASNs) come with intrinsic challenges such as long propagation delay, small bandwidth, large energy consumption, three-dimensional deployment, and high deployment and battery replacement cost. Any routing strategy proposed for IUASNs must take these constraints into account. The vector-based forwarding schemes in the literature forward data packets to the sink using holding time and the location information of the sender, forwarder, and sink nodes. Holding time suppresses data broadcasts; however, it fails to keep energy and delay fairness in the network. To achieve this, we propose an Energy Scaled and Expanded Vector-Based Forwarding (ESEVBF) scheme. ESEVBF uses the node's residual energy to scale, and the vector pipeline distance ratio to expand, the holding time. The resulting scaled and expanded holding times of the forwarding nodes differ enough to avoid multiple forwardings, which reduces energy consumption and improves energy balancing in the network. If a node has the minimum holding time among its neighbors, it shrinks the holding time and quickly forwards the data packets upstream. The performance of ESEVBF is analyzed in network scenarios with and without node mobility to ensure its effectiveness. Simulation results show that ESEVBF has lower energy consumption, fewer forwarded data copies, and lower end-to-end delay.
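As an illustrative sketch of an energy-scaled, distance-expanded holding time in the spirit of ESEVBF (the abstract does not give the authors' equation; the weighting below is an invented form, not theirs):

```python
# Hedged sketch: nodes with more residual energy and better positions in
# the forwarding pipeline get shorter holding times, so they forward first
# and suppress duplicate copies from their neighbours.
def holding_time(residual_energy, initial_energy,
                 distance_to_sink, pipeline_radius,
                 t_max=2.0):
    """Return a back-off delay (s) before forwarding a packet.

    All parameters and the linear combination are illustrative
    assumptions, not the published ESEVBF formulation.
    """
    energy_scale = 1.0 - residual_energy / initial_energy     # 0 = full battery
    distance_expand = distance_to_sink / pipeline_radius      # larger = worse spot
    return t_max * (energy_scale + distance_expand) / 2.0

# A node with a nearly full battery close to the sink forwards quickly:
print(holding_time(9.5, 10.0, 50.0, 500.0))   # -> 0.15 s
```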
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dirk Gombert; Jay Roach
The U.S. Department of Energy (DOE) Global Nuclear Energy Partnership (GNEP) was announced in 2006. As currently envisioned, GNEP will be the basis for growth of nuclear energy worldwide, using a closed proliferation-resistant fuel cycle. The Integrated Waste Management Strategy (IWMS) is designed to ensure that all wastes generated by fuel fabrication and recycling will have a routine disposition path, making the most of feedback to fuel and recycling operations to eliminate or minimize byproducts and wastes. If waste must be generated, processes will be designed with waste treatment in mind to reduce the use of reagents that complicate stabilization and to minimize volume. The IWMS will address three distinct levels of technology investigation and systems analyses and will provide a cogent path from (1) research and development (R&D) and engineering-scale demonstration (Level I); to (2) full-scale domestic deployment (Level II); and finally to (3) establishing an integrated global nuclear energy infrastructure (Level III). The near-term focus of GNEP is on achieving a basis for large-scale commercial deployment (Level II), including the R&D and engineering-scale activities in Level I that are necessary to support such an accomplishment. Throughout these levels runs the need for innovative thinking to simplify regulations, separations, and waste forms in order to minimize the burden that safe disposition of wastes places on the fuel cycle.
Design Aspects of the Rayleigh Convection Code
NASA Astrophysics Data System (ADS)
Featherstone, N. A.
2017-12-01
Understanding the long-term generation of planetary or stellar magnetic fields requires complementary knowledge of the large-scale fluid dynamics pervading large fractions of the object's interior. Such large-scale motions are sensitive to the system's geometry which, in planets and stars, is spherical to a good approximation. As a result, computational models designed to study such systems often solve the MHD equations in spherical geometry, frequently employing a spectral approach involving spherical harmonics. We present computational and user-interface design aspects of one such modeling tool, the Rayleigh convection code, which is suitable for deployment on desktop and petascale HPC architectures alike. In this poster, we present an overview of the code's parallel design and its built-in diagnostics-output package. Rayleigh has been developed with NSF support through the Computational Infrastructure for Geodynamics and is expected to be released as open-source software in winter 2017/2018.
Live immunization against East Coast fever--current status.
Di Giulio, Giuseppe; Lynen, Godelieve; Morzaria, Subhash; Oura, Chris; Bishop, Richard
2009-02-01
The infection-and-treatment method (ITM) for immunization of cattle against East Coast fever has historically been used only on a limited scale because of logistical and policy constraints. Recent large-scale deployment among pastoralists in Tanzania has stimulated demand. Concurrently, a suite of molecular tools, developed from the Theileria parva genome, has enabled improved quality control of the immunizing stabilate and post-immunization monitoring of the efficacy and biological impact of ITM in the field. This article outlines the current status of ITM immunization in the field, with associated developments in the molecular epidemiology of T. parva.
Kushniruk, A; Kaipio, J; Nieminen, M; Hyppönen, H; Lääveri, T; Nohr, C; Kanstrup, A M; Berg Christiansen, M; Kuo, M-H; Borycki, E
2014-08-15
The objective of this paper is to explore approaches to understanding the usability of health information systems at regional and national levels. Several different methods are discussed in case studies from Denmark, Finland and Canada. They range from small-scale qualitative studies involving usability testing of systems to larger-scale national-level questionnaire studies aimed at assessing the use and usability of health information systems by entire groups of health professionals. It was found that regional and national usability studies can complement smaller-scale usability studies, and that they are needed in order to understand larger trends regarding system usability. Despite adoption of EHRs, many health professionals rate the usability of the systems as low. A range of usability issues have been noted when data are collected on a large scale through widely distributed questionnaires and websites designed to monitor user perceptions of usability. As health information systems are deployed on a widespread basis, studies that examine systems used regionally or nationally are required. In addition, collection of large-scale data on the usability of specific IT products is needed in order to complement smaller-scale studies of specific systems.
Negative emissions—Part 3: Innovation and upscaling
NASA Astrophysics Data System (ADS)
Nemet, Gregory F.; Callaghan, Max W.; Creutzig, Felix; Fuss, Sabine; Hartmann, Jens; Hilaire, Jérôme; Lamb, William F.; Minx, Jan C.; Rogers, Sophia; Smith, Pete
2018-06-01
We assess the literature on innovation and upscaling for negative emissions technologies (NETs) using a systematic and reproducible literature coding procedure. To structure our review, we employ the framework of sequential stages in the innovation process, with which we code each NETs article in innovation space. We find that while there is a growing body of innovation literature on NETs, 59% of the articles are focused on the earliest stages of the innovation process, ‘research and development’ (R&D). The subsequent stages of innovation are also represented in the literature, but at much lower levels of activity than R&D. Distinguishing between innovation stages that are related to the supply of the technology (R&D, demonstrations, scale up) and demand for the technology (demand pull, niche markets, public acceptance), we find an overwhelming emphasis (83%) on the supply side. BECCS articles have an above average share of demand-side articles while direct air carbon capture and storage has a very low share. Innovation in NETs has much to learn from successfully diffused technologies; appealing to heterogeneous users, managing policy risk, as well as understanding and addressing public concerns are all crucial yet not well represented in the extant literature. Results from integrated assessment models show that while NETs play a key role in the second half of the 21st century for 1.5 °C and 2 °C scenarios, the major period of new NETs deployment is between 2030 and 2050. Given that the broader innovation literature consistently finds long time periods involved in scaling up and deploying novel technologies, there is an urgency to developing NETs that is largely unappreciated. This challenge is exacerbated by the thousands to millions of actors that potentially need to adopt these technologies for them to achieve planetary scale. This urgency is reflected neither in the Paris Agreement nor in most of the literature we review here. If NETs are to be deployed at the levels required to meet 1.5 °C and 2 °C targets, then important post-R&D issues will need to be addressed in the literature, including incentives for early deployment, niche markets, scale-up, demand, and—particularly if deployment is to be hastened—public acceptance.
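As a toy illustration of what coding articles into innovation-stage categories can look like (the authors use a systematic, reproducible coding procedure; the keyword lists and function below are invented for illustration and are not their scheme):

```python
# Illustrative keyword-based coder assigning an article to stages of the
# innovation process. Stage names follow the abstract; keywords are made up.
STAGE_KEYWORDS = {
    "R&D": ["laboratory", "experiment", "mechanism"],
    "demonstration": ["pilot", "demonstration plant"],
    "scale-up": ["scale-up", "commercial plant"],
    "demand pull": ["incentive", "subsidy", "carbon price"],
    "niche market": ["niche", "early market"],
    "public acceptance": ["acceptance", "perception"],
}

def code_article(abstract):
    text = abstract.lower()
    stages = [stage for stage, words in STAGE_KEYWORDS.items()
              if any(w in text for w in words)]
    return stages or ["unclassified"]

print(code_article("A pilot demonstration plant for direct air capture..."))
```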
Want, Andrew; Crawford, Rebecca; Kakkonen, Jenni; Kiddie, Greg; Miller, Susan; Harris, Robert E; Porter, Joanne S
2017-08-01
As part of ongoing commitments to produce electricity from renewable energy sources in Scotland, Orkney waters have been targeted for potential large-scale deployment of wave and tidal energy converting devices. Orkney has a well-developed infrastructure supporting the marine energy industry, recently enhanced by the construction of additional piers. A major concern to marine industries is biofouling on submerged structures, including energy converters and measurement instrumentation. In this study, the marine energy infrastructure and instrumentation were surveyed to characterise the biofouling. Fouling communities varied between deployment habitats; key species were identified, allowing recommendations for scheduling device maintenance and preventing the spread of invasive organisms. A method to measure the impact of biofouling on hydrodynamic response is described and applied to data from a wave-monitoring buoy deployed at a test site in Orkney. The results are discussed in relation to the accuracy of the measurement resources for power generation. Further applications are suggested for future testing in other scenarios, including tidal energy.
NASA Astrophysics Data System (ADS)
Jørgen Koch, Hans
To meet the commitments made in Kyoto, energy-related CO2 emissions would have to fall to almost 30% below the level projected for a "Business-As-Usual" scenario. Meeting this goal will require a large-scale shift toward climate-friendly technologies such as fuel cells, which have a large long-term potential for both stationary generation and transportation. The deployment of a technology is the last major stage in the process of technological shift. Climate-friendly technologies are not being deployed at a sufficient rate or in sufficient amounts to allow IEA countries to meet their targets. Hence, if technology is to play an important role in reducing emissions within the Kyoto time frame (2008-2012) and beyond, immediate and sustained action to accelerate technology deployment will be required. Obstacles in the way of the deployment of technologies that are ready or near-ready for normal use have come to be referred to as market barriers. The simplest yet most significant form of market barrier to a new technology is the out-of-pocket cost to the user relative to the cost of technologies currently in use. Some market barriers also involve market failure, where the market fails to take account of all the costs and benefits involved, such as omitting external environmental costs, and therefore retards the deployment of more environmentally sustainable technologies. Other barriers include poor information dissemination, excessive and costly regulations, slow capital turnover rates, and inadequate financing. Efforts by governments to alleviate market barriers play an important role in complementing private-sector activities, and there are many policies and measures each government could take. In addition, international technology collaboration can help promote the best use of available R&D resources and can contribute to more effective deployment of the results of research and development by sharing costs, pooling information and avoiding duplication of efforts.
NASA Astrophysics Data System (ADS)
Diaz Cusi, J.; Grevemeyer, I.; Thomas, C.; Harnafi, M.
2012-12-01
The data provided by the dense IberArray broad-band seismic network deployed in the framework of the large-scale TopoIberia project, as well as by permanent broad-band stations operating in Morocco, Portugal and Spain, have made it possible to obtain a large-scale view of the anisotropic properties of the mantle beneath the western termination of the Mediterranean region and its transition to the Atlantic Ocean. In this contribution we combine the previously presented results with the analysis of data provided by IberArray stations in the central part of Iberia, broad-band OBS deployments in the Alboran Sea and the Gulf of Cadiz, and new seismic networks deployed in the High Atlas and the Moroccan Meseta. The High Atlas has been investigated using data from a broad-band network installed by the University of Munster with a primary focus on the study of the properties of the deep mantle. Additionally, up to 10 IberArray stations have been shifted southward to complete the survey along the Atlas and to investigate the Moroccan Meseta. In agreement with the results presented by the Picasso team along a profile crossing the Atlas northward, the anisotropy observed in this area is small (0.6-0.9 s), with a fast polarization direction (FPD) oriented roughly E-W. It is important to note that a very significant number of high-quality events show no evidence for anisotropy. This may be the result of the combined effect of two or more anisotropic layers or of the presence of a large vertical component of flow in the upper mantle. Moving northwards, the first TopoIberia-IberArray deployment in the Betics-Alboran zone has evidenced a spectacular rotation of the FPD along the Gibraltar arc, following the curvature of the Rif-Betic chain, from roughly N65E beneath the Betics to close to N65W beneath the Rif chain. To complete this image, we have now processed data from two OBS deployments in the Alboran Sea and Gulf of Cadiz installed by Geomar as part of the TopoMed project. The short recording period and the intrinsic problems of noise and instrument stability on the seafloor have not allowed a large database of anisotropic measurements to be built. However, the few events providing good-quality SKS measurements show interesting results which may provide significant clues to the geodynamic evolution of this area. Beneath Iberia, the second IberArray deployment encompasses mainly the Variscan units of the Central Iberian Massif. The results show a small amount of anisotropy and suggest complex anisotropy features, confirming what was observed in the first deployment. A significant change in both FPD and delay times seems to exist across the two main units of the Variscan domain, the Ossa-Morena and Central Iberian zones. Permanent stations in southern Portugal show a significant number of null measurements, similar to what has previously been discussed for the High Atlas stations. Beneath eastern Iberia, the FPDs have a roughly E-W orientation. No significant changes are observed between the anisotropic parameters beneath the Balearic Islands and those in the eastern Betics.
Women at war: implications for mental health.
Dutra, Lissa; Grubbs, Kathleen; Greene, Carolyn; Trego, Lori L; McCartin, Tamarin L; Kloezeman, Karen; Morland, Leslie
2011-01-01
Few studies have investigated the impact of deployment stressors on the mental health outcomes of women deployed to Iraq in support of Operation Iraqi Freedom. This pilot study examined exposure to combat experiences and military sexual harassment in a sample of 54 active duty women and assessed the impact of these stressors on post-deployment posttraumatic stress disorder (PTSD) symptoms and depressive symptoms. Within 3 months of returning from deployment to Iraq, participants completed (a) the Combat Experiences Scale and the Sexual Harassment Scale of the Deployment Risk and Resilience Inventory, (b) the Primary Care PTSD Screen, and (c) an abbreviated version of the Center for Epidemiological Studies-Depression scale. Approximately three quarters of the sample endorsed exposure to combat experiences, and more than half of the sample reported experiencing deployment-related sexual harassment, with nearly half of the sample endorsing both stressors. Approximately one third of the sample endorsed clinical or subclinical levels of PTSD symptoms, with 11% screening positive for PTSD and 9% to 14% of the sample endorsing depressive symptoms. Regression analyses revealed that combat experiences and sexual harassment jointly accounted for significant variance in post-deployment PTSD symptoms, whereas military sexual harassment was identified as the only unique significant predictor of these symptoms. Findings from the present study lend support to research demonstrating that military sexual trauma may be more highly associated with post-deployment PTSD symptoms than combat exposure among female service members and veterans.
Marr, Jeffrey D.G.; Gray, John R.; Davis, Broderick E.; Ellis, Chris; Johnson, Sara; Laronne, Jonathan B.
2010-01-01
A 3-month-long, large-scale flume experiment involving research and testing of selected conventional and surrogate bedload-monitoring technologies was conducted in the Main Channel at the St. Anthony Falls Laboratory under the auspices of the National Center for Earth-surface Dynamics. These experiments, dubbed StreamLab06, involved 25 researchers and volunteers from academia, government, and the private sector. The research channel was equipped with a sediment-recirculation system and a sediment-flux monitoring system that allowed continuous measurement of sediment flux in the flume and provided a data set by which samplers were evaluated. Selected bedload-measurement technologies were tested under a range of flow and sediment-transport conditions. The experiment was conducted in two phases. The bed material in phase I was well-sorted siliceous sand (0.6-1.8 mm median diameter); in phase II it was a gravel mixture (1-32 mm median diameter). Four conventional bedload samplers – a standard Helley-Smith, Elwha, BLH-84, and Toutle River II (TR-2) sampler – were manually deployed in both experiment phases. Bedload traps were deployed in phase II. Two surrogate bedload samplers – stationary-mounted down-looking 600 kHz and 1200 kHz acoustic Doppler current profilers – were also deployed in phase II. This paper presents an overview of the experiment, including the specific data-collection technologies used and the ambient hydraulic, sediment-transport and environmental conditions measured as part of the experiment. All data collected as part of the StreamLab06 experiments are, or will be, available to the research community.
Full-field inspection of a wind turbine blade using three-dimensional digital image correlation
NASA Astrophysics Data System (ADS)
LeBlanc, Bruce; Niezrecki, Christopher; Avitabile, Peter; Chen, Julie; Sherwood, James; Hughes, Scott
2011-04-01
Increasing demand for and deployment of wind power has led to a significant increase in the number of wind-turbine blades manufactured globally. As the physical size and number of turbines deployed grow, the probability of manufacturing defects being present in composite turbine blade fleets also increases. As both capital blade costs and operational and maintenance costs increase for larger turbine systems, the need for large-scale inspection and monitoring of the structural health of turbine blades during manufacturing and operation becomes critical. One method for locating and quantifying manufacturing defects, while also allowing for the in-situ measurement of the structural health of blades, is the observation of the full-field state of deformation and strain of the blade. Static tests were performed on a nine-meter CX-100 composite turbine blade to extract full-field displacement and strain measurements using three-dimensional digital image correlation (3D DIC). Measurements were taken at several angles near the blade root, including along the high-pressure surface, low-pressure surface, and along the trailing edge of the blade. The overall results indicate that the measurement approach can clearly identify failure locations and discontinuities in the blade curvature under load. Post-processing of the data using a stitching technique enables the shape and curvature of an entire large-scale wind turbine blade to be observed for the first time. The experiment demonstrates the feasibility of the approach and reveals that the technique can readily be scaled up to accommodate utility-scale blades. As long as a trackable pattern is applied to the surface of the blade, measurements can be made in-situ when a blade is on a manufacturing floor, installed in a test fixture, or installed on a rotating turbine. The results demonstrate the great potential of the optical measurement technique and its capability for use in the wind industry for large-area inspection.
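At the core of DIC is locating a speckle subset from a reference image inside a deformed image. As a minimal sketch of that step only (real 3D DIC adds stereo calibration, subpixel interpolation, and shape functions, none of which are shown; this is not the instrument's algorithm), an integer-pixel normalized cross-correlation search looks like this:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom else 0.0

def track_subset(ref, deformed, y, x, half=15, search=10):
    """Return the integer-pixel displacement (dy, dx) of the subset
    centered at (y, x). Assumes the subset and the full search window
    lie inside both images."""
    tpl = ref[y - half:y + half + 1, x - half:x + half + 1]
    best, best_dv = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = deformed[y + dy - half:y + dy + half + 1,
                           x + dx - half:x + dx + half + 1]
            score = ncc(tpl, win)
            if score > best:
                best, best_dv = score, (dy, dx)
    return best_dv
```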
SunShot 2030 for Photovoltaics (PV): Envisioning a Low-cost PV Future
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Wesley J.; Frew, Bethany A.; Gagnon, Pieter J.
In this report we summarize the implications, impacts, and deployment potential of reaching the SunShot 2030 targets for the electricity system in the contiguous United States. We model 25 scenarios of the U.S. power sector using the Regional Energy Deployment System (ReEDS) and Distributed Generation (dGen) capacity expansion models. The scenarios cover a wide range of sensitivities to capture future uncertainties relating to fuel prices, retirements, renewable energy capital costs, and load growth. We give special attention to the potential for storage costs to also decline rapidly, given storage's large synergies with low-cost solar. The ReEDS and dGen models project utility-scale and distributed-scale power sector evolution, respectively, for the United States. Both models have been designed with special emphasis on capturing the unique traits of renewable energy, including variability and grid integration requirements. Across the suite of scenarios modeled, we find that reaching the SunShot 2030 target has the potential to lead to significant capacity additions of PV in the United States. By 2050, PV penetration levels are projected to reach 28-46 percent of total generation. If storage also sees significant reductions in cost, then 2050 solar penetration levels could reach 41-64 percent. PV deployment is projected to occur in all of the lower 48 states, though the specific deployment level is scenario dependent. The growth in PV is projected to be dominated by utility-scale systems, but the actual mix between utility and distributed systems could ultimately vary depending on how policies, system costs, and rate structures evolve.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dagher, Habib; Viselli, Anthony; Goupee, Andrew
Volume II of the Final Report for the DeepCwind Consortium National Research Program, funded by US Department of Energy Award Number DE-EE0003278.001, summarizes the design, construction, deployment, testing, numerical model validation, retrieval, and post-deployment inspection of the VolturnUS 1:8-scale floating wind turbine prototype deployed off Castine, Maine on June 2nd, 2013. The 1:8-scale VolturnUS design served as a de-risking exercise for a commercial multi-MW VolturnUS design. The American Bureau of Shipping Guide for Building and Classing Floating Offshore Wind Turbine Installations was used to design the prototype. The same analysis methods, design methods, construction techniques, deployment methods, mooring, and anchoring planned for full scale were used. A commercial 20 kW grid-connected turbine was used and was the first offshore wind turbine in the US.
Deployment of Large-Size Shell Constructions by Internal Pressure
NASA Astrophysics Data System (ADS)
Pestrenin, V. M.; Pestrenina, I. V.; Rusakov, S. V.; Kondyurin, A. V.
2015-11-01
A numerical study of the deployment pressure (the minimum internal pressure bringing a construction from the packed state to the operational one) of large laminated CFRP shell structures is performed using the ANSYS engineering package. The shell resists both membrane and bending deformations. Structures composed of shell elements whose median surface has an involute are considered. In the packed (natural) states of the constituent elements, the median surfaces coincide with their involutes. Criteria for terminating the stepwise solution of the geometrically nonlinear problem of determining the deployment pressure are formulated, and the deployment of cylindrical, conical (full and truncated cones), and large-size composite shells is studied. The results obtained are presented as graphs showing the deployment pressure in relation to the geometric and material parameters of the structure. These studies show that large pneumatic composite shells can be used as space and building structures, because their deployment pressure differs only slightly from the excess pressure in pneumatic articles made from films and soft materials.
BIO ARGO floats: tools for operational monitoring of the Black Sea
NASA Astrophysics Data System (ADS)
Palazov, Atanas; Slabakova, Violeta; Peneva, Elisaveta; Stanev, Emil
2014-05-01
The assessment of ecological status in the context of the Water Framework Directive (WFD) and the Marine Strategy Framework Directive (MSFD) requires comprehensive knowledge and understanding of the physical and biogeochemical processes that determine the functioning of marine ecosystems. One of the main challenges, however, is the need for data at frequencies relevant to the spatial and temporal scales of the ecological processes. The majority of in situ observations commonly used for ecological monitoring of the Black Sea are based on near-shore monitoring programs or irregular oceanographic cruises that provide either non-synoptic, coarse-resolution realizations of large-scale processes or detailed, but time- and site-specific, snapshots of local features. These gaps can be filled by two independent sources: satellite observation and profiling floats. Satellite ocean color sensors allow water quality parameters to be determined at synoptic scale through their absorption properties. However, satellite ocean color methods have a number of limitations: measurements can only be made during daylight hours; they require cloud-free conditions and are sensitive to atmospheric aerosols; they provide information only for the upper layer of the ocean (approximately the depth of 10% incident light); and algorithms developed for global applications are a source of large uncertainties in marginal seas and coastal areas. These constraints of optical remote sensing can be avoided by using miniature biogeochemical sensors and autonomous platforms that offer remarkable perspectives for observing the "biological" ocean, notably at critical spatiotemporal scales which have been out of reach until recently (Claustre et al., 2010). In the frame of the EC FP7 project "E-AIMS: Euro-Argo Improvements for the GMES Marine Service", two Bio Argo floats were deployed in the Black Sea. Besides the traditional CTD, the floats were equipped with biogeochemical sensors (oxygen, irradiance, chl-a and backscattering). The selection of the deployment locations was limited to Bulgarian Black Sea waters, so the chosen deployment strategy was to deploy the floats at positions maximally distant from each other along the Black Sea geostrophic current, at a depth of ~1800 m. Coincident biogeochemical and in-water radiometric measurements were collected at the time of each float deployment to ensure intercalibration of the instruments mounted on the floats, as well as to find empirical relationships between optical data and biogeochemical variables. The data obtained from the Bio floats will be used to: investigate the seasonal evolution of oxygen in the upper layers, including the subsurface oxygen maximum; study the seasonal and interannual dynamics of phytoplankton blooms in the deeper Black Sea; and cross-validate satellite-derived chl-a and backscattering. References: Claustre et al. (2010). Bio-optical profiling floats as new observational tools for biogeochemical and ecosystem studies: potential synergies with ocean color remote sensing. Proceedings of the "OceanObs'09: Sustained Ocean Observations and Information for Society" Conference, Venice/Italy.
Bioinspired Principles for Large-Scale Networked Sensor Systems: An Overview
Jacobsen, Rune Hylsberg; Zhang, Qi; Toftegaard, Thomas Skjødeberg
2011-01-01
Biology has often been used as a source of inspiration in computer science and engineering. Bioinspired principles have found their way into network node design and research due to the appealing analogies between biological systems and large networks of small sensors. This paper provides an overview of bioinspired principles and methods, such as swarm intelligence, natural time synchronization, artificial immune systems and intercellular information exchange, applicable to sensor network design. Bioinspired principles and methods are discussed in the context of routing, clustering, time synchronization, optimal node deployment, localization, and security and privacy. PMID:22163841
NASA Astrophysics Data System (ADS)
Day, Danny
2006-04-01
Although 'negative emissions' of carbon dioxide need not, in principle, involve the use of biological processes to draw carbon out of the atmosphere, such 'agricultural sequestration' is the only known way to remove carbon from the atmosphere on time scales comparable to the time scale for anthropogenic increases in carbon emissions. In order to maintain the 'negative emissions', the biomass must be used in such a way that the resulting carbon dioxide is separated and permanently sequestered. Two options for sequestration are in the topsoil and via geologic carbon sequestration. The former has multiple benefits, but the latter also is needed. Thus, although geologic carbon sequestration is viewed skeptically by some environmentalists as simply a way to keep using fossil fuels, it may be a key part of reversing accelerating climate forcing if rapid climate change is beginning to occur. I will first review the general approach of agricultural sequestration combined with use of the resulting biofuels in a way that permits carbon separation, and then geologic sequestration as a negative emissions technology. Then I discuss the process that is the focus of my company: the EPRIDA cycle. If deployed at a sufficiently large scale, it could reverse the increase in CO2 concentrations. I also estimate the benefits, carbon and other, of large-scale deployment of negative emissions technologies. For example, using the EPRIDA cycle by planting and soil-sequestering carbon in an area about 3X the size of Texas would remove the amount of carbon that is being accumulated worldwide each year. In addition to the atmospheric carbon removal, the EPRIDA approach also counters the depletion of carbon in the soil, increasing topsoil and its fertility; reduces excess nitrogen in the water by eliminating the need for ammonium nitrate fertilizer; and reduces fossil fuel reliance by providing biofuel and avoiding natural gas based fertilizer production.
2015-07-01
prior to, during, and following deployment: Dyadic Adjustment Scale (measures marital functioning), Conflict-Tactics Scale, Family Adaptability and... Applied Psychosocial Measurement, 1, 385-401. Rocissano, L., Slade, A., & Lynch, V. (1987). Dyadic synchrony and toddler compliance. Developmental... new criterion Q-sort scale. Developmental Psychology, 33, 906-916. Spanier, G.B. (1976). Measuring dyadic adjustment: new scales for assessing the
NASA Technical Reports Server (NTRS)
Akle, W.
1983-01-01
This study report defines a set of tests and measurements required to characterize the performance of a Large Space System (LSS) and to scale these data to other LSS satellites. Requirements from the Mobile Communication Satellite (MSAT) configurations derived in the parent study were used. MSAT utilizes a large, mesh deployable antenna and encompasses a significant range of LSS technology issues in the areas of structural dynamics, control, and performance predictability. In this study, performance requirements were developed for the antenna. Special emphasis was placed on antenna surface accuracy and pointing stability. Instrumentation and measurement systems applicable to LSS were selected from existing or ongoing technology developments. Laser ranging and angulation systems, presently at breadboard status, form the backbone of the measurements. Following this, a set of ground, STS, and GEO-operational tests was investigated. A third-scale (15 meter) antenna system was selected for ground characterization followed by STS flight technology development. This selection ensures analytical scaling from ground to orbit, as well as size scaling. Other benefits are cost and the ability to perform reasonable ground tests. Detailed costing of the various tests and measurement systems was derived and is included in the report.
NASA Astrophysics Data System (ADS)
Brett, Gareth; Barnett, Matthew
2014-12-01
Liquid Air Energy Storage (LAES) provides large-scale, long-duration energy storage at the point of demand in the 5 MW/20 MWh to 100 MW/1,000 MWh range. LAES combines mature components from the industrial gas and electricity industries, assembled in a novel process, and is one of the few storage technologies that can be delivered at large scale with no geographical constraints. The system uses no exotic materials or scarce resources, and all major components have a proven lifetime of 25+ years. The system can also integrate low-grade waste heat to increase power output. Founded in 2005, Highview Power Storage is a UK-based developer of LAES. The company has taken the concept from academic analysis through laboratory testing, and in 2011 commissioned the world's first fully integrated system at pilot-plant scale (300 kW/2.5 MWh), hosted at SSE's (Scottish & Southern Energy) 80 MW biomass plant in Greater London and partly funded by a Department of Energy and Climate Change (DECC) grant. Highview is now working with commercial customers to deploy multi-MW commercial reference plants in the UK and abroad.
Network models of biology, whether curated or derived from large-scale data analysis, are critical tools in the understanding of cancer mechanisms and in the design and personalization of therapies. The NDEx Project (Network Data Exchange) will create, deploy, and maintain an open-source, web-based software platform and public website to enable scientists, organizations, and software applications to share, store, manipulate, and publish biological networks.
Household Energy Consumption Segmentation Using Hourly Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kwac, J; Flora, J; Rajagopal, R
2014-01-01
The increasing US deployment of residential advanced metering infrastructure (AMI) has made hourly energy consumption data widely available. Using California smart meter data, we investigate a household electricity segmentation methodology that encodes each household's consumption against a pre-processed load shape dictionary. Structured approaches using features derived from the encoded data drive five sample program- and policy-relevant energy lifestyle segmentation strategies. We also ensure that the methodologies developed scale to large data sets.
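A minimal sketch of how such dictionary encoding can work: each household-day is normalized to a unit-sum shape and assigned to its nearest dictionary entry. This illustrates the general technique only; the function and array layouts are assumptions, not the authors' exact encoding.

```python
import numpy as np

def encode_load_shapes(daily_kwh, dictionary):
    """Encode daily load profiles against a load-shape dictionary.

    daily_kwh:  (n_days, 24) hourly consumption per household-day.
    dictionary: (n_shapes, 24) unit-sum representative shapes.
    Returns the index of the nearest dictionary shape for each day.
    """
    totals = daily_kwh.sum(axis=1, keepdims=True)
    shapes = daily_kwh / np.where(totals > 0, totals, 1.0)  # unit-sum shapes
    # squared Euclidean distance from every day to every dictionary shape
    d2 = ((shapes[:, None, :] - dictionary[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)
```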
The U.S. Army in Asia, 2030-2040
2014-01-01
[Truncated record.] ...in general. However, to be fully effective in a war with China, AirSea Battle would likely require early (if not pre-...) action. ...China may seek to establish a regional sphere of influence, potentially espousing ideologies that are in conflict with core values of the international system. ...China may wish to deploy large-scale ground forces punitively, over a set of limited objectives...
Hydrodynamics Offshore of the North Beach of Indian River Inlet, DE
NASA Astrophysics Data System (ADS)
DiCosmo, N. R.; Puleo, J. A.
2014-12-01
The Indian River Inlet (IRI) on the east coast of Delaware, USA, connects the Atlantic Ocean to the Indian River and Rehoboth Bays. Long-term and large-scale net alongshore sediment transport along this portion of coastline is from south to north. The north beach of IRI suffers from severe erosion due to interruption of the alongshore transport and current variability near the inlet. The magnitude of such erosion has increased over the past decade and questions have arisen as to the cause. The goal of this study is to quantify currents and wave patterns and estimate sediment transport rates at the north beach and near the inlet in an effort to determine the causes of persistent erosion. Data were obtained from October 2013 to March 2014 in the form of 3 separate 28-day deployments, each covering 4 sites. Data at each site were collected using a bottom-mounted Nortek Aquadopp Acoustic Doppler Current Profiler (ADCP) and 2 Campbell Scientific Optical Backscatter Sensors (OBS). Currents and OBS data were sampled every 120 s. Waves were sampled for approximately 17 minutes at the beginning of every hour. Data analysis from the deployments indicates the presence of several interesting trends in currents that can be linked to the persistent erosion. Current data are filtered to quantify typical current speed and direction for a tidal cycle (peak flood to peak flood) at each deployment site. The typical currents off the north beach and up to 800 m north of the north jetty are mostly directed southward over the entire tidal cycle. This consistent southward flow implies: 1) there is no flow reversal based on tide, contrary to what might be expected at an inlet-adjacent beach, 2) the typical current direction is opposite to that expected from the known long-term large-scale net alongshore transport, and 3) the consistency of this atypical current may be responsible for transporting sediment southward and away from the north beach. Currents and waves will be further analyzed for storm and non-storm conditions in order to more completely quantify the hydrodynamics of the area. Sediment data will also be analyzed in conjunction with the hydrodynamic data in order to better understand the sediment transport process.
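The tidal-cycle averaging described above can be sketched briefly: the function below vector-averages velocities between two peak-flood samples and reports a compass direction. Names and conventions are illustrative assumptions, not the study's processing code.

```python
import numpy as np

def typical_current(u, v, flood_peaks):
    """Mean current speed (m/s) and compass direction (deg) over one
    tidal cycle bounded by successive peak floods.

    u, v: east/north velocity series, one sample per 120 s burst.
    flood_peaks: (i, j) indices of two successive peak-flood samples.
    """
    i, j = flood_peaks
    mu, mv = u[i:j].mean(), v[i:j].mean()       # vector-average the cycle
    speed = np.hypot(mu, mv)
    direction = np.degrees(np.arctan2(mu, mv)) % 360  # bearing current flows toward
    return speed, direction
```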
Cable-catenary large antenna concept
NASA Technical Reports Server (NTRS)
Akle, W.
1985-01-01
Deployable to very large diameters (over 1000 ft), while still remaining compatible with a complete satellite system launch by STS, the cable-catenary antenna comprises: 8 radial deployable boom masts; a deployable hub and feed support center mast; balanced front and back, radial and circumferential catenary cabling for highly accurate (mm) surface control; no interfering cabling in the antenna field; and an RF reflecting mesh supported on the front catenaries. Illustrations show the antenna-satellite system deployed and stowed configurations; the antenna deployment sequence; the design analysis logic; the sizing analysis output, and typical parametric design data.
NASA Astrophysics Data System (ADS)
Deng, Zhengping; Li, Shuanggao; Huang, Xiang
2018-06-01
In the assembly process of large-size aerospace products, the leveling and horizontal alignment of large components are essential prior to the installation of an inertial navigation system (INS) and the final quality inspection. In general, the inherent coordinate systems of large-scale coordinate measuring devices are not coincident with the geodetic horizontal system, and a dual-axis compensation system is commonly required for the measurement of differences in height. These compensation systems are at present expensive, dedicated designs for particular devices. Considering that a large-size assembly site usually needs more than one measuring device, a compensation approach that is versatile across devices would be a more convenient and economic choice for manufacturers. In this paper, a flexible and cost-effective compensation method is proposed. Firstly, an auxiliary measuring device called a versatile compensation fixture (VCF) is designed, mainly comprising reference points for coordinate transformation and a dual-axis inclinometer; in addition, network tighten points (NTPs) are introduced and temporarily deployed in the large measuring space to further reduce transformation error. Secondly, the measuring principle of height difference is studied, based on coordinate transformation theory and trigonometry while considering the effects of earth curvature, and the coordinate transformation parameters are derived by least squares adjustment. Thirdly, the analytical solution of leveling uncertainty is analyzed, based on which the key parameters of the VCF and the proper deployment of NTPs are determined according to the leveling accuracy requirement. Furthermore, the proposed method is practically applied to the assembly of a large helicopter by developing an automatic leveling and alignment system. By measuring four NTPs, the leveling uncertainty (2σ) is reduced by 29.4% to about 0.12 mm, compared with that without NTPs.
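One standard way to derive coordinate transformation parameters from measured reference points is a least-squares rigid-body fit (the SVD-based Kabsch/Procrustes solution), sketched below. The paper's actual adjustment may differ in detail (e.g., weighting and earth-curvature terms), so this is a generic illustration only.

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rotation R and translation t mapping points P -> Q.

    P, Q: (N, 3) arrays of corresponding reference-point coordinates,
    e.g., measured in the device frame and the geodetic-horizontal frame.
    """
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)      # centroids
    H = (P - cP).T @ (Q - cQ)                    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                           # proper rotation (det = +1)
    t = cQ - R @ cP
    return R, t
```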
Driver air bag effectiveness by severity of the crash.
Segui-Gomez, M
2000-01-01
OBJECTIVES: This analysis provided effectiveness estimates of the driver-side air bag while controlling for severity of the crash and other potential confounders. METHODS: Data were from the National Automotive Sampling System (1993-1996). Injury severity was described on the basis of the Abbreviated Injury Scale, Injury Severity Score, Functional Capacity Index, and survival. Ordinal, linear, and logistic multivariate regression methods were used. RESULTS: Air bag deployment in frontal or near-frontal crashes decreases the probability of severe and fatal injuries (e.g., Abbreviated Injury Scale score of 4-6), including those causing a long-lasting high degree of functional limitation. However, air bag deployment in low-severity crashes increases the probability that a driver (particularly a woman) will sustain injuries of Abbreviated Injury Scale level 1 to 3. Air bag deployment exerts a net injurious effect in low-severity crashes and a net protective effect in high-severity crashes. The level of crash severity at which air bags are protective is higher for female than for male drivers. CONCLUSIONS: Air bag improvements should minimize the injuries induced by deployment. One possibility is to raise the deployment threshold so that air bags deploy only in more severe crashes. PMID:11029991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giangrande, Scott E.; Feng, Zhe; Jensen, Michael P.
Routine cloud, precipitation and thermodynamic observations collected by the Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF) and Aerial Facility (AAF) during the 2-year US Department of Energy (DOE) ARM Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) campaign are summarized. These observations quantify the diurnal to large-scale thermodynamic regime controls on the clouds and precipitation over the undersampled, climatically important Amazon basin region. The extended ground deployment of cloud-profiling instrumentation enabled a unique look at multiple cloud regimes at high temporal and vertical resolution. This longer-term ground deployment, coupled with two short-term aircraft intensive observing periods, allowed new opportunities to better characterize cloud and thermodynamic observational constraints as well as cloud radiative impacts for modeling efforts within typical Amazon wet and dry seasons.
Wilcox, S.; Andreas, A.
2010-03-16
The U.S. Department of Energy's National Renewable Energy Laboratory collaborates with the solar industry to establish high quality solar and meteorological measurements. This Solar Resource and Meteorological Assessment Project (SOLRMAP) provides high quality measurements to support deployment of power projects in the United States. The no-funds-exchanged collaboration brings NREL solar resource assessment expertise together with industry needs for measurements. The end result is high quality data sets to support the financing, design, and monitoring of large scale solar power projects for industry in addition to research-quality data for NREL model development. NREL provides consultation for instrumentation and station deployment, along with instrument calibrations, data acquisition, quality assessment, data distribution, and summary reports. Industry participants provide equipment, infrastructure, and station maintenance.
The following SOLRMAP measurement-station records share the same project description as the entry above:
Stoffel, T.; Andreas, A. (2010-04-26)
Wilcox, S.; Andreas, A. (2010-07-13)
Wilcox, S.; Andreas, A. (2012-11-03)
Solar Resource & Meteorological Assessment Project (SOLRMAP): Sun Spot Two; Swink, Colorado (Data): Wilcox, S.; Andreas, A. (2010-11-10)
Wilcox, S.; Andreas, A. (2010-07-14)
Wilcox, S.; Andreas, A. (2009-07-22)
Wilcox, S.; Andreas, A. (2010-11-03)
ZERO: probabilistic routing for deploy and forget Wireless Sensor Networks.
Vilajosana, Xavier; Llosa, Jordi; Pacho, Jose Carlos; Vilajosana, Ignasi; Juan, Angel A; Vicario, Jose Lopez; Morell, Antoni
2010-01-01
As Wireless Sensor Networks are adopted by industry and agriculture for large-scale and unattended deployments, the need for reliable and energy-conserving protocols becomes critical. Energy-conservation efforts at the physical and link layers are largely ignored by routing protocols, which concentrate instead on maintaining reliability and throughput. Gradient-based routing protocols route data through the most reliable links, aiming to ensure 99% packet delivery. However, they suffer from the so-called "hot spot" problem: the most reliable routes deplete their energy quickly, partitioning the network and reducing the area monitored. To cope with this "hot spot" problem we propose ZERO, a combined approach at the network and link layers that increases network lifespan while preserving reliability by means of probabilistic load balancing techniques.
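A minimal sketch of probabilistic load balancing of the kind described: instead of always forwarding over the single most reliable link, a node samples a next hop with probability proportional to link reliability weighted by residual energy. This illustrates the idea only; ZERO's actual metric and protocol machinery are more involved, and the weighting below is an assumption.

```python
import random

def choose_next_hop(neighbors):
    """Probabilistic next-hop selection to spread load away from the
    single most reliable route ("hot spot" avoidance).

    neighbors: list of (node_id, link_reliability, residual_energy)
    with reliability and energy as fractions in (0, 1].
    """
    ids = [n[0] for n in neighbors]
    weights = [rel * energy for _, rel, energy in neighbors]
    return random.choices(ids, weights=weights, k=1)[0]

# Example: the most reliable neighbor wins most often, but not always,
# so its battery is not drained first.
print(choose_next_hop([("A", 0.99, 0.2), ("B", 0.90, 0.9), ("C", 0.85, 0.8)]))
```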
NASA Technical Reports Server (NTRS)
Death, M. D.
1984-01-01
The evolution of an Antenna Deployment Mechanism (ADM) from a Hinge Actuator Mechanism (HAM) is described as it pertains to the deployment of large satellite antennas. Design analysis and mechanical tests are examined in detail.
Wave resource variability: Impacts on wave power supply over regional to international scales
NASA Astrophysics Data System (ADS)
Smith, Helen; Fairley, Iain; Robertson, Bryson; Abusara, Mohammad; Masters, Ian
2017-04-01
The intermittent, irregular and variable nature of the wave energy resource has implications for the supply of wave-generated electricity into the grid. Intermittency of renewable power may lead to frequency and voltage fluctuations in the transmission and distribution networks. A matching supply of electricity must be planned to meet the predicted demand, leading to a need for gas-fired and back-up generating plants to supplement intermittent supplies, and potentially limiting the integration of intermittent power into the grid. Issues relating to resource intermittency and their mitigation through the development of spatially separated sites have been widely researched in the wind industry, but have received little attention to date in the less mature wave industry. This study analyses the wave resource over three different spatial scales to investigate the potential impacts of the temporal and spatial resource variability on the grid supply. The primary focus is the Southwest UK, a region already home to multiple existing and proposed wave energy test sites. Concurrent wave buoy data from six locations, supported by SWAN wave model hindcast data, are analysed to assess the correlation of the resource across the region and the variation in wave power with direction. Power matrices for theoretical nearshore and offshore devices are used to calculate the maximum step change in generated power across the region as the number of deployment sites is increased. The step change analysis is also applied across national and international spatial scales using output from the European Centre for Medium-range Weather Forecasting (ECMWF) ERA-Interim hindcast model. It is found that the deployment of multiple wave energy sites, whether on a regional, national or international scale, results in both a reduction in step changes in power and reduced times of zero generation, leading to an overall smoothing of the wave-generated electrical power. This has implications for the planning and siting of future wave energy arrays when the industry reaches the point of large-scale deployment.
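The step-change analysis can be sketched as follows: aggregate the power series of the first k sites and record the worst hour-to-hour swing as k grows. Normalizing by aggregate peak output is a choice made here for illustration; the study's power matrices and exact metric are not reproduced.

```python
import numpy as np

def max_step_changes(site_power):
    """Largest step change in aggregate power as sites are added one
    at a time, normalized by the aggregate peak output.

    site_power: (n_sites, n_times) array of generated power per site,
    assumed to be hourly series.
    """
    out = []
    for k in range(1, site_power.shape[0] + 1):
        total = site_power[:k].sum(axis=0)   # aggregate the first k sites
        step = np.abs(np.diff(total)).max()  # worst single-step swing
        out.append(step / total.max())
    return out
```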
Floating Offshore Wind in Oregon: Potential for Jobs and Economic Impacts from Two Future Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jimenez, Tony; Keyser, David; Tegen, Suzanne
Construction of the first offshore wind power plant in the United States began in 2015, off the coast of Rhode Island, using fixed platform structures that are appropriate for shallow seafloors, like those located off the East Coast and mid-Atlantic. However, floating platforms, which have yet to be deployed commercially, will likely need to anchor to the deeper seafloor if deployed off the West Coast. To analyze the employment and economic potential for floating offshore wind along the West Coast, the Bureau of Ocean Energy Management (BOEM) commissioned the National Renewable Energy Laboratory (NREL) to analyze two hypothetical, large-scale deployment scenarios for Oregon: 5,500 megawatts (MW) of offshore wind deployment in Oregon by 2050 (Scenario A), and 2,900 MW of offshore wind by 2050 (Scenario B). These levels of deployment could power approximately 1,600,000 homes (Scenario A) or 870,000 homes (Scenario B). Offshore wind would contribute to economic development in Oregon in the near future, and more substantially in the long term, especially if equipment and labor are sourced from within the state. According to the analysis, over the 2020-2050 period, Oregon floating offshore wind facilities could support 65,000-97,000 job-years and add $6.8 billion-$9.9 billion to the state GDP (Scenario A).
A Fault Tolerance Mechanism for On-Road Sensor Networks
Feng, Lei; Guo, Shaoyong; Sun, Jialu; Yu, Peng; Li, Wenjing
2016-01-01
On-Road Sensor Networks (ORSNs) play an important role in capturing traffic flow data for predicting short-term traffic patterns, driving assistance and self-driving vehicles. However, this kind of network is prone to large-scale communication failure if a few sensors physically fail. In this paper, to ensure that the network works normally, an effective fault-tolerance mechanism for ORSNs is proposed, consisting mainly of backup on-road sensor deployment, redundant cluster head deployment and an adaptive failure detection and recovery method. Firstly, based on the N − x principle and the sensors’ failure rate, this paper formulates the backup sensor deployment problem as a two-objective optimization, which captures the trade-off between cost and fault recovery. To further improve network resilience, this paper introduces a redundant cluster head deployment model subject to a coverage constraint. A common solution method combining integer-continuing and sequential quadratic programming is then explored to determine the optimal locations for these two deployment problems. Moreover, an Adaptive Detection and Resume (ADR) protocol is designed to recover system communication through route and cluster adjustment if a backup on-road sensor is mismatched. The final experiments show that our proposed mechanism can achieve an average 90% recovery rate and reduce the average number of failed sensors by up to 35.7%. PMID:27918483
Fitzgerald, Scott D; Rumbeiha, Wilson K; Emmett Braselton, W; Downend, Amanda B; Otto, Cynthia M
2008-07-01
A long-term surveillance study was conducted on 95 search-and-rescue (S&R) dogs deployed to the September 11, 2001, terrorist attack sites; an additional 55 nondeployed S&R dogs served as controls. After 5 years of surveillance, 32% of the deployed dogs and 24% of the nondeployed dogs have died. The mean age at the time of death in these 2 groups of dogs is not significantly different. Causes of death in both groups include inflammatory, degenerative, and proliferative conditions. No primary pulmonary tumors have been identified to date, nor has any significant level of toxicant been found in the tissues from these dogs using assays for general organic compounds and metals or, specifically, for polychlorinated biphenyls. However, significant numbers of both deployed and nondeployed dogs have evidence of inhaled matter, as demonstrated by the presence of anthracotic pigments or refractile particulate matter in pulmonary tissue. Although S&R activities in response to the 9/11 terrorist attacks exposed dogs to a wide variety of potentially toxic compounds, to date these dogs do not appear to suffer higher mortality or increased pulmonary disease compared with nondeployed dogs. To the authors' knowledge, the current survey represents the first long-term, large-scale survey of the pathology and toxicology of S&R dogs deployed to a major disaster site.
Visualizing the Big (and Large) Data from an HPC Resource
NASA Astrophysics Data System (ADS)
Sisneros, R.
2015-10-01
Supercomputers are built to endure painfully large simulations and contend with resulting outputs. These are characteristics that scientists are all too willing to test the limits of in their quest for science at scale. The data generated during a scientist's workflow through an HPC center (large data) is the primary target for analysis and visualization. However, the hardware itself is also capable of generating volumes of diagnostic data (big data); this presents compelling opportunities to deploy analogous analytic techniques. In this paper we will provide a survey of some of the many ways in which visualization and analysis may be crammed into the scientific workflow as well as utilized on machine-specific data.
NASA Astrophysics Data System (ADS)
Oschlies, Andreas; Klepper, Gernot
2017-01-01
The historical developments are reviewed that have led from a bottom-up responsibility initiative of concerned scientists to the emergence of a nationwide interdisciplinary Priority Program on the assessment of Climate Engineering (CE) funded by the German Research Foundation (DFG). Given the perceived lack of comprehensive and comparative appraisals of different CE methods, the Priority Program was designed to encompass both solar radiation management (SRM) and carbon dioxide removal (CDR) ideas and to cover the atmospheric, terrestrial, and oceanic realm. First, key findings obtained by the ongoing Priority Program are summarized and reveal that, compared to earlier assessments such as the 2009 Royal Society report, more detailed investigations tend to indicate less efficiency, lower effectiveness, and often lower safety. Emerging research trends are discussed in the context of the recent Paris agreement to limit global warming to less than two degrees and the associated increasing reliance on negative emission technologies. Our results show that when deployed at scales large enough to have a significant impact on atmospheric CO2, even CDR methods such as afforestation—often perceived as "benign"—can have substantial side effects and may raise severe ethical, legal, and governance issues. We suppose that before being deployed at climatically relevant scales, any negative emission or CE method will require careful analysis of efficiency, effectiveness, and undesired side effects.
Findings from the Supersonic Qualification Program of the Mars Science Laboratory Parachute System
NASA Technical Reports Server (NTRS)
Sengupta, Anita; Steltzner, Adam; Witkowski, Allen; Candler, Graham; Pantano, Carlos
2009-01-01
In 2012, the Mars Science Laboratory (MSL) mission will deploy NASA's largest extra-terrestrial parachute, a technology integral to the safe landing of its advanced robotic explorer on the surface. The supersonic parachute system is a mortar-deployed 21.5 m disk-gap-band (DGB) parachute, identical in geometric scaling to the Viking-era DGB parachutes of the 1970s. The MSL parachute deployment conditions are Mach 2.3 at a dynamic pressure of 750 Pa. The Viking Balloon Launched Decelerator Test (BLDT) successfully demonstrated a maximum of 700 Pa at Mach 2.2 for a 16.1 m DGB parachute in its AV4 flight. All previous Mars deployments have derived their supersonic qualification from the Viking BLDT test series, obviating the need for full-scale high-altitude supersonic testing. The qualification programs for the Mars Pathfinder, Mars Exploration Rover, and Phoenix Scout missions were all limited to subsonic structural qualification, with supersonic performance and survivability bounded by the BLDT qualification. The MSL parachute, at the edge of the supersonic heritage deployment space and 33% larger than the Viking parachute, accepts a certain degree of risk without addressing the supersonic environment in which it will deploy. In addition, MSL will spend up to 10 seconds above Mach 1.5, an aerodynamic regime associated with a known parachute instability characterized by significant canopy projected-area fluctuation and dynamic drag variation. This aerodynamic instability, referred to as "area oscillations" by the parachute community, has drag performance, inflation stability, and structural implications, introducing risk to mission success if not quantified for the MSL parachute system. To minimize this risk, and as an alternative to a prohibitively expensive high-altitude test program, a multi-phase qualification program using computational simulation validated by subscale testing was developed and implemented for MSL. The first phase consisted of 2%-of-full-scale supersonic wind tunnel testing of a rigid DGB parachute with entry vehicle to validate two high-fidelity computational fluid dynamics (CFD) tools. The computer codes utilized Large Eddy Simulation and Detached Eddy Simulation numerical approaches to accurately capture the turbulent wake of the entry vehicle and its coupling to the parachute bow shock. The second phase was the development of fluid-structure interaction (FSI) computational tools to predict parachute response to the supersonic flow field. The FSI development included the integration of the CFD from the first phase with a finite element structural model of the parachute membrane and cable elements. In this phase, a 4%-of-full-scale supersonic flexible parachute test program was conducted to provide validation data for the FSI code and an empirical dataset of the MSL parachute in a flight-like environment. The final phase is FSI simulation of the full-scale MSL parachute in a Mars-type deployment. Findings from this program will be presented in terms of code development and validation, empirical findings from the supersonic testing, and drag performance during supersonic operation.
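As a rough consistency check on the stated deployment point, one can back out the implied atmospheric density from q = ½ρv². The Mars sound speed used below is an assumed round number, so the result is order-of-magnitude only.

```python
# Consistency check on the deployment point (Mach 2.3, q = 750 Pa).
# SOUND_SPEED is an assumption (~240 m/s in the cold CO2 atmosphere).
MACH, Q_PA, SOUND_SPEED = 2.3, 750.0, 240.0

v = MACH * SOUND_SPEED            # flight speed, m/s
rho = 2.0 * Q_PA / v**2           # inverted from q = 0.5 * rho * v^2
print(f"v = {v:.0f} m/s, rho = {rho:.4f} kg/m^3")
# -> ~550 m/s and ~0.005 kg/m^3, consistent with the thin upper
#    Martian atmosphere where the parachute deploys
```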
Hybrid methods for cybersecurity analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Warren Leon,; Dunlavy, Daniel M.
2014-01-01
Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and at understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and years to hours and days for the application of new modeling and analysis capabilities to emerging threats. The development and deployment framework has been generalized into the Hybrid Framework and incorporated into several LDRD, WFO, and DOE/CSL projects and proposals. And most importantly, the Hybrid project has provided Sandia security analysts with new, scalable, extensible analytic capabilities that have resulted in alerts not detectable using their previous workflow tool sets.
NASA/DOD Control/Structures Interaction Technology, 1986
NASA Technical Reports Server (NTRS)
Wright, Robert L. (Compiler)
1986-01-01
Control/structures interactions, deployment dynamics and system performance of large flexible spacecraft are discussed. Spacecraft active controls, deployable truss structures, deployable antennas, solar power systems for space stations, pointing control systems for space station gimballed payloads, computer-aided design for large space structures, and passive damping for flexible structures are among the topics covered.
Battery technologies for large-scale stationary energy storage.
Soloveichik, Grigorii L
2011-01-01
In recent years, with the deployment of renewable energy sources, advances in electrified transportation, and development in smart grids, the markets for large-scale stationary energy storage have grown rapidly. Electrochemical energy storage methods are strong candidate solutions due to their high energy density, flexibility, and scalability. This review provides an overview of mature and emerging technologies for secondary and redox flow batteries. New developments in the chemistry of secondary and flow batteries as well as regenerative fuel cells are also considered. Advantages and disadvantages of current and prospective electrochemical energy storage options are discussed. The most promising technologies in the short term are high-temperature sodium batteries with β″-alumina electrolyte, lithium-ion batteries, and flow batteries. Regenerative fuel cells and lithium metal batteries with high energy density require further research to become practical.
Pathways for Off-site Corporate PV Procurement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heeter, Jenny S
Through July 2017, corporate customers contracted for more than 2,300 MW of utility-scale solar. This paper examines the benefits, challenges, and outlooks for large-scale off-site solar purchasing through four pathways: power purchase agreements, retail choice, utility partnerships (green tariffs and bilateral contracts with utilities), and becoming a licensed wholesale seller of electricity. Each pathway differs based on where in the United States it is available, the value provided to a corporate off-taker, and the ease of implementation. The paper concludes with a forward-looking comparison of the pathways, noting that to deploy more corporate off-site solar, new procurement pathways are needed.
Some ecological guidelines for large-scale biomass plantations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoffman, W.; Cook, J.H.; Beyea, J.
1993-12-31
The National Audubon Society sees biomass as an appropriate and necessary source of energy to help replace fossil fuels in the near future, but is concerned that large-scale biomass plantations could displace significant natural vegetation and wildlife habitat, and reduce national and global biodiversity. We support the development of an industry large enough to provide significant portions of our energy budget, but we see a critical need to ensure that plantations are designed and sited in ways that minimize ecological disruption, or even provide environmental benefits. We have been studying the habitat value of intensively managed short-rotation tree plantations. Our results show that these plantations support large populations of some birds, but not all of the species using the surrounding landscape, and indicate that their value as habitat can be increased greatly by including small areas of mature trees within them. We believe short-rotation plantations can benefit regional biodiversity if they can be deployed as buffers for natural forests, or as corridors connecting forest tracts. To realize these benefits, and to avoid habitat degradation, regional biomass plantation complexes (e.g., the plantations supplying all the fuel for a powerplant) need to be planned, sited, and developed as large-scale units in the context of the regional landscape mosaic.
2016-08-09
This image shows the deployment of a half-scale starshade with four petals at NASA's Jet Propulsion Laboratory in Pasadena, California, in 2014. At full scale, the starshade (not shown) will measure 34 meters, or approximately 111 feet, across. The flower-like petals of the starshade are designed to diffract bright starlight away from telescopes seeking the dim light of exoplanets. The starshade was re-designed from earlier models to allow these petals to furl, or wrap around the spacecraft, for launch into space. Once in space, the starshade will need to expand from its tightly packed launch shape to become large and umbrella-like, ideal for blocking starlight. Each petal is covered in a high-performance plastic film that resembles gold foil. On a starshade ready for launch, the thermal gold foil will cover only the side of the petals facing away from the telescope, with black on the other, so as not to reflect other light sources such as the Earth into the telescope's lens. Starlight-blocking technologies such as the starshade are being developed to help image exoplanets, with a focus on Earth-sized, habitable worlds. http://photojournal.jpl.nasa.gov/catalog/PIA20907
NASA Astrophysics Data System (ADS)
Hardesty, R. Michael; Brewer, W. Alan; Sandberg, Scott P.; Weickmann, Ann M.; Shepson, Paul B.; Cambaliza, Maria; Heimburger, Alexie; Davis, Kenneth J.; Lauvaux, Thomas; Miles, Natasha L.; Sarmiento, Daniel P.; Deng, A. J.; Gaudet, Brian; Karion, Anna; Sweeney, Colm; Whetstone, James
2016-06-01
A compact commercial Doppler lidar has been deployed in Indianapolis for two years to measure wind profiles and mixing layer properties as part of a project to improve greenhouse gas measurements from large area sources. The lidar uses vertical velocity variance and aerosol structure to measure mixing layer depth. Comparisons with aircraft and the NOAA HRDL lidar generally indicate good performance, although sensitivity might be an issue under low aerosol conditions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Komomua, C.; Kroposki, B.; Mooney, D.
2009-01-01
On October 9, 2008, NREL hosted a workshop to provide an opportunity for external stakeholders to offer insights and recommendations on the design and functionality of DOE's planned Energy Systems Infrastructure Facility (ESIF). The goal was to ensure that the planning for the ESIF effectively addresses the most critical barriers to large-scale energy efficiency (EE) and renewable energy (RE) deployment. This technical report documents the ESIF workshop proceedings.
Emerging Genomic Tools for Legume Breeding: Current Status and Future Prospects
Pandey, Manish K.; Roorkiwal, Manish; Singh, Vikas K.; Ramalingam, Abirami; Kudapa, Himabindu; Thudi, Mahendar; Chitikineni, Anu; Rathore, Abhishek; Varshney, Rajeev K.
2016-01-01
Legumes play a vital role in ensuring global nutritional food security and improving soil quality through nitrogen fixation. Accelerated genetic gains are required to meet the demand of an ever-increasing global population. In recent years, rapid developments have been witnessed in legume genomics due to advancements in next-generation sequencing (NGS) and high-throughput genotyping technologies. Reference genome sequences for many legume crops have been reported in the last 5 years. The availability of draft genome sequences and the re-sequencing of elite genotypes for several important legume crops have made it possible to identify structural variations at large scale. The availability of large-scale genomic resources and of low-cost, high-throughput genotyping technologies is enhancing the efficiency and resolution of genetic mapping and marker-trait association studies. Most importantly, deployment of molecular breeding approaches has resulted in the development of improved lines in some legume crops such as chickpea and groundnut. In order to support genomics-driven crop improvement at a fast pace, the deployment of breeder-friendly genomics and decision-support tools appears critical in breeding programs in developing countries. This review provides an overview of emerging genomics and informatics tools and approaches that will be the key driving force for accelerating genomics-assisted breeding and, ultimately, for ensuring nutritional and food security in developing countries. PMID:27199998
The Impact of CCS Readiness on the Evolution of China's Electric Power Sector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dahowski, Robert T.; Davidson, Casie L.; Yu, Sha
In this study, GCAM-China is exercised to examine the impact of CCS availability on the projected evolution of China’s electric power sector under the Paris Increased Ambition policy scenario developed by Fawcett et al. based on the Intended Nationally Determined Contributions (INDCs) submitted under the COP-21 Paris Agreement. This policy scenario provides a backdrop for understanding China’s electric generation mix over the coming century under several CCS availability scenarios: CCS is fully available for commercial-scale deployment by 2025; by 2050; by 2075; and CCS is unavailable for use in meeting the modelled mitigation targets through 2100. Without CCS available, the Chinese electric power sector turns to significant use of nuclear, wind, and solar to meet growing demands and emissions targets, at a cost. Should large-scale CCS deployment be delayed in China by 25 years, the modeled per-ton cost of climate change mitigation is projected to be roughly $420/tC (2010 US dollars) by 2050, relative to $360/tC in the case in which CCS is available to deploy by 2025, a 16% increase. Once CCS is available for commercial use, mitigation costs for the two cases converge, equilibrating by 2085. However, should CCS be entirely unavailable to deploy in China, the mitigation cost spread, compared to the 2025 case, doubles by 2075 ($580/tC and $1130/tC respectively), and triples by 2100 ($1050/tC vs. $3200/tC). Still, while delays in CCS availability may have short-term impacts on China’s overall per-ton cost of meeting the emissions reduction target evaluated here, as well as total mitigation costs, the carbon price is likely to approach the price path associated with the full CCS availability case within a decade of CCS deployment. Having CCS available before the end of the century, even under the delays examined here, could reduce the total amount of nuclear and renewable energy that must deploy, reducing the overall cost of meeting the emissions mitigation targets.
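The quoted cost figures can be checked directly; the ratios below reproduce the "16% increase", "doubles", and "triples" statements.

```python
# Quick check of the quoted mitigation-cost ratios (2010 US$/tC).
cases = {2050: (420, 360),     # CCS delayed to 2050 vs available by 2025
         2075: (1130, 580),    # no CCS vs CCS by 2025
         2100: (3200, 1050)}
for year, (hi, lo) in cases.items():
    print(year, f"{hi / lo:.2f}x")
# -> 1.17x (the ~16% increase), ~1.95x ("doubles"), ~3.05x ("triples")
```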
Kennedy, Jacob J.; Abbatiello, Susan E.; Kim, Kyunggon; Yan, Ping; Whiteaker, Jeffrey R.; Lin, Chenwei; Kim, Jun Seok; Zhang, Yuzheng; Wang, Xianlong; Ivey, Richard G.; Zhao, Lei; Min, Hophil; Lee, Youngju; Yu, Myeong-Hee; Yang, Eun Gyeong; Lee, Cheolju; Wang, Pei; Rodriguez, Henry; Kim, Youngsoo; Carr, Steven A.; Paulovich, Amanda G.
2014-01-01
The successful application of MRM in biological specimens raises the exciting possibility that assays can be configured to measure all human proteins, resulting in an assay resource that would promote advances in biomedical research. We report the results of a pilot study designed to test the feasibility of a large-scale, international effort in MRM assay generation. We have configured, validated across three laboratories, and made publicly available as a resource to the community 645 novel MRM assays representing 319 proteins expressed in human breast cancer. Assays were multiplexed in groups of >150 peptides and deployed to quantify endogenous analyte in a panel of breast cancer-related cell lines. Median assay precision was 5.4%, with high inter-laboratory correlation (R2 >0.96). Peptide measurements in breast cancer cell lines were able to discriminate amongst molecular subtypes and identify genome-driven changes in the cancer proteome. These results establish the feasibility of a scaled, international effort. PMID:24317253
Unprecedented rates of land-use transformation in modeled climate change mitigation pathways
NASA Astrophysics Data System (ADS)
Turner, P. A.; Field, C. B.; Lobell, D. B.; Sanchez, D.; Mach, K. J.
2017-12-01
Integrated assessment models (IAMs) generate climate change mitigation scenarios consistent with global temperature targets. To limit warming to 2°, stylized cost-effective mitigation pathways rely on extensive deployments of carbon dioxide (CO2) removal (CDR) technologies, including multi-gigatonne yearly carbon removal from the atmosphere through bioenergy with carbon capture and storage (BECCS) and afforestation/reforestation. These assumed CDR deployments keep ambitious temperature limits in reach, but associated rates of land-use transformation have not been evaluated. For IAM scenarios from the IPCC Fifth Assessment Report, we compare rates of modeled land-use conversion to recent observed commodity crop expansions. In scenarios with a likely chance of limiting warming to 2° in 2100, the rate of energy cropland expansion supporting BECCS exceeds past commodity crop rates by several fold. In some cases, mitigation scenarios include abrupt reversal of deforestation, paired with massive afforestation/reforestation. Specifically, energy cropland in <2° scenarios expands, on average, by 8.2 Mha yr-1 and 11.7% p.a. across scenarios. This rate exceeds, by more than 3-fold, the observed expansion of soybean, the most rapidly expanding commodity crop. If energy cropland instead increases at rates equal to recent soybean and oil palm expansions, the scale of CO2 removal possible with BECCS is 2.6 to 10-times lower, respectively, than the deployments <2° IAM scenarios rely upon in 2100. IAM mitigation pathways may favor multi-gigatonne biomass-based CDR given undervalued sociopolitical and techno-economic deployment barriers. Heroic modeled rates for land-use transformation imply that large-scale biomass-based CDR is not an easy solution to the climate challenge.
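Two quantities implied by the abstract's figures are easy to derive: the doubling time of energy cropland at 11.7% p.a. growth, and the soybean expansion rate implied by the ">3-fold" comparison. Both are derived numbers, not new data.

```python
import math

# Derived from the abstract's figures; the soybean rate is inferred
# from the stated ">3-fold" comparison, not measured here.
energy_crop_mha_yr = 8.2
growth_rate = 0.117                       # 11.7% per annum
doubling_years = math.log(2) / math.log(1 + growth_rate)
implied_soy_mha_yr = energy_crop_mha_yr / 3
print(f"doubling time ~{doubling_years:.1f} yr; "
      f"implied soybean expansion <~{implied_soy_mha_yr:.1f} Mha/yr")
# -> energy cropland doubling roughly every 6 years
```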
The “Wireless Sensor Networks for City-Wide Ambient Intelligence (WISE-WAI)” Project
Casari, Paolo; Castellani, Angelo P.; Cenedese, Angelo; Lora, Claudio; Rossi, Michele; Schenato, Luca; Zorzi, Michele
2009-01-01
This paper gives a detailed technical overview of some of the activities carried out in the context of the “Wireless Sensor networks for city-Wide Ambient Intelligence (WISE-WAI)” project, funded by the Cassa di Risparmio di Padova e Rovigo Foundation, Italy. The main aim of the project is to demonstrate the feasibility of large-scale wireless sensor network deployments, whereby tiny objects integrating one or more environmental sensors (humidity, temperature, light intensity), a microcontroller and a wireless transceiver are deployed over a large area, which in this case involves the buildings of the Department of Information Engineering at the University of Padova. We will describe how the network is organized to provide full-scale automated functions, and which services and applications it is configured to provide. These applications include long-term environmental monitoring, alarm event detection and propagation, single-sensor interrogation, localization and tracking of objects, assisted navigation, as well as fast data dissemination services to be used, e.g., to rapidly re-program all sensors over-the-air. The organization of such a large testbed requires notable efforts in terms of communication protocols and strategies, whose design must pursue scalability, energy efficiency (while sensors are connected through USB cables for logging and debugging purposes, most of them will be battery-operated), as well as the capability to support applications with diverse requirements. These efforts, the description of a subset of the results obtained so far, and of the final objectives to be met are the scope of the present paper. PMID:22408513
Beyond Widgets -- Systems Incentive Programs for Utilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Regnier, Cindy; Mathew, Paul; Robinson, Alastair
Utility incentive programs remain one of the most significant means of deploying commercialized, but underutilized building technologies to scale. However, these programs have been largely limited to component-based products (e.g., lamps, RTUs). While some utilities do provide ‘custom’ incentive programs with whole building and system level technical assistance, these programs require deeper levels of analysis, resulting in higher program costs. This results in custom programs being restricted to utilities with greater resources, and are typically applied mainly to large or energy-intensive facilities, leaving much of the market without cost effective access and incentives for these solutions. In addition, with increasingly stringent energy codes, cost effective component-based solutions that achieve significant savings are dwindling. Building systems (e.g., integrated façade, HVAC and/or lighting solutions) can deliver higher savings that translate into large sector-wide savings if deployed at the scale of these programs. However, systems application poses a number of challenges – baseline energy use must be defined and measured; the metrics for energy and performance must be defined and tested against; in addition, system savings must be validated under well understood conditions. This paper presents a sample of findings of a project to develop validated utility incentive program packages for three specific integrated building systems, in collaboration with Xcel Energy (CO, MN), ComEd, and a consortium of California Public Owned Utilities (CA POUs) (Northern California Power Agency (NCPA) and the Southern California Public Power Authority (SCPPA)). Furthermore, these program packages consist of system specifications, system performance, M&V protocols, streamlined assessment methods, market assessment and implementation guidance.
Cragun, Joshua N; April, Michael D; Thaxton, Robert E
2016-08-01
Compassion fatigue is a problem for many health care providers manifesting as physical, mental, and spiritual exhaustion. Our objective was to evaluate the association between prior combat deployment and compassion fatigue among military emergency medicine providers. We conducted a nonexperimental cross-sectional survey of health care providers assigned to the San Antonio Military Medical Center, Department of Emergency Medicine. We used the Professional Quality of Life Scale V survey instrument that evaluates provider burnout, secondary traumatic stress, and compassion satisfaction. Outcomes included burnout, secondary traumatic stress, and compassion satisfaction raw scores. Scores were compared between providers based on previous combat deployments using two-tailed independent sample t tests and multiple regression models. Surveys were completed by 105 respondents: 42 nurses (20 previously deployed), 30 technicians (11 previously deployed), and 33 physicians (16 previously deployed). No statistically significant differences in burnout, secondary traumatic stress, or compassion satisfaction scores were detected between previously deployed providers versus providers not previously deployed. There was no association between previous combat deployment and emergency department provider burnout, secondary traumatic stress, or compassion satisfaction scores.
Experiences with a Decade of Wireless Sensor Networks in Mountain Cryosphere Research
NASA Astrophysics Data System (ADS)
Beutel, Jan
2017-04-01
Research in geoscience depends on high-quality measurements over long periods of time in order to understand processes and to create and validate models. The promise of wireless sensor networks to monitor autonomously at unprecedented spatial and temporal scale motivated the use of this novel technology for studying mountain permafrost in the mid 2000s. Starting from a first experimental deployment to investigate the thermal properties of steep bedrock permafrost in 2006 on the Jungfraujoch, Switzerland, at 3500 m asl using prototype wireless sensors, the PermaSense project has evolved into a multi-site and multi-discipline initiative. We develop, deploy and operate wireless sensing systems customized for long-term autonomous operation in high-mountain environments. Around this central element, we develop concepts, methods and tools to investigate and to quantify the connection between climate, cryosphere (permafrost, glaciers, snow) and geomorphodynamics. In this presentation, we describe the concepts and system architecture used both for the wireless sensor network as well as for data management and processing. Furthermore, we will discuss the experience gained in over a decade of planning, installing and operating large deployments on field sites spread across a large part of the Swiss and French Alps, with applications ranging from academic experimental research campaigns to long-term monitoring and natural hazard warning in collaboration with government authorities and local industry partners. Reference: http://www.permasense.ch. Online open data access: http://data.permasense.ch
Bleier, Jonathan; McFarlane, Alexander; McGuire, Annabel; Treloar, Susan; Waller, Michael; Dobson, Annette
2011-02-01
The operational tempo of the Australian Defence Force has increased over the last two decades. We examine the relationship between the health of personnel and the frequency and duration of their deployment. Self-reported health measures (number of symptoms, Kessler Psychological Distress Scale, and Post Traumatic Stress Disorder Checklist) were compared across people who had never deployed, those who had deployed only once, and those who had deployed at least twice with at least one deployment to East Timor and one deployment to Afghanistan or Iraq. Comparisons were also made between people who had deployed for at least one month and those who had deployed for longer periods. Frequency of deployment, but not duration of deployment, was associated with poorer health.
Large-scale bioenergy production: how to resolve sustainability trade-offs?
NASA Astrophysics Data System (ADS)
Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag
2018-02-01
Large-scale 2nd generation bioenergy deployment is a key element of 1.5 °C and 2 °C transformation pathways. However, large-scale bioenergy production might have negative sustainability implications and thus may conflict with the Sustainable Development Goal (SDG) agenda. Here, we carry out a multi-criteria sustainability assessment of large-scale bioenergy crop production throughout the 21st century (300 EJ in 2100) using a global land-use model. Our analysis indicates that large-scale bioenergy production without complementary measures results in negative effects on the following sustainability indicators: deforestation, CO2 emissions from land-use change, nitrogen losses, unsustainable water withdrawals and food prices. One of our main findings is that single-sector environmental protection measures next to large-scale bioenergy production are prone to involve trade-offs among these sustainability indicators—at least in the absence of more efficient land or water resource use. For instance, if bioenergy production is accompanied by forest protection, deforestation and associated emissions (SDGs 13 and 15) decline substantially whereas food prices (SDG 2) increase. However, our study also shows that this trade-off strongly depends on the development of future food demand. In contrast to environmental protection measures, we find that agricultural intensification lowers some side-effects of bioenergy production substantially (SDGs 13 and 15) without generating new trade-offs—at least among the sustainability indicators considered here. Moreover, our results indicate that a combination of forest and water protection schemes, improved fertilization efficiency, and agricultural intensification would reduce the side-effects of bioenergy production most comprehensively. However, although our study includes more sustainability indicators than previous studies on bioenergy side-effects, our study represents only a small subset of all indicators relevant for the SDG agenda. Based on this, we argue that the development of policies for regulating externalities of large-scale bioenergy production should rely on broad sustainability assessments to discover potential trade-offs with the SDG agenda before implementation.
NASA Astrophysics Data System (ADS)
Wollheim, W. M.; Mulukutla, G. K.; Cook, C.; Carey, R. O.
2017-11-01
Nonpoint pollution sources are strongly influenced by hydrology and are therefore sensitive to climate variability. Some pollutants entering aquatic ecosystems, e.g., nitrate, can be mitigated by in-stream processes during transport through river networks. Whole river network nitrate retention is difficult to quantify with observations. High frequency, in situ nitrate sensors, deployed in nested locations within a single watershed, can improve estimates of both nonpoint inputs and aquatic retention at river network scales. We deployed a nested sensor network and associated sampling in the urbanizing Oyster River watershed in coastal New Hampshire, USA, to quantify storm event-scale loading and retention at network scales. An end-member analysis used the relative behavior of reactive nitrate and conservative chloride to infer the river network fate of nitrate. In the headwater catchments, nitrate and chloride concentrations were both increasingly diluted with increasing storm size. At the mouth of the watershed, chloride was also diluted, but nitrate tended to increase. The end-member analysis suggests that this pattern results from high retention during small storms (51-78%) that declines to zero during large storms. Although high frequency nitrate sensors did not alter estimates of fluxes over seasonal time periods compared to less frequent grab sampling, they provide the ability to estimate nitrate flux versus storm size at event scales, which is critical for such analyses. Nested sensor networks can improve understanding of the controls on both loading and network-scale retention, and therefore also improve management of nonpoint source pollution.
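To make the end-member logic concrete, here is a minimal sketch of how a conservative/reactive tracer pair yields a retention estimate. The end-member concentrations, the function name and the sample values are illustrative assumptions, not numbers from the study:

    import numpy as np

    # Hypothetical end-member concentrations (mg/L); not values from the study.
    cl_base, cl_event = 45.0, 5.0      # chloride in baseflow vs. event water
    no3_base, no3_event = 1.2, 0.4     # nitrate in baseflow vs. event water

    def retention_fraction(cl_obs, no3_obs):
        """Infer network nitrate retention from a conservative/reactive tracer pair.
        Conservative chloride fixes the mixing fraction of event water; the shortfall
        of observed nitrate relative to the conservative-mixing prediction is then
        attributed to in-stream retention."""
        f_event = (cl_base - cl_obs) / (cl_base - cl_event)   # event-water fraction
        no3_expected = f_event * no3_event + (1 - f_event) * no3_base
        return 1.0 - no3_obs / no3_expected

    # Small storm: strong nitrate shortfall relative to conservative mixing.
    print(retention_fraction(cl_obs=35.0, no3_obs=0.45))   # 0.55, i.e. high retention
    # Large storm: observed nitrate matches conservative mixing.
    print(retention_fraction(cl_obs=10.0, no3_obs=0.53))   # ~0, i.e. no net retention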
Assessing Resource Assessment for MRE (Invited)
NASA Astrophysics Data System (ADS)
Hanson, H. P.; Bozec, A.; Duerr, A. S.; Rauchenstein, L. T.
2010-12-01
The Southeast National Marine Renewable Energy Center at Florida Atlantic University is concerned with marine renewable energy (MRE) recovery from the Florida Current using marine hydrokinetic technology and, in the future, from the thermocline in the Florida Straits via ocean thermal energy conversion. Although neither concept is new, technology improvements and the evolution of policy now warrant optimism for the future of these potentially rich resources. In moving toward commercial-scale deployments of energy-generating systems, an important first step is accurate and unembellished assessment of the resource itself. In short, we must ask: how much energy might be available? The answer to this deceptively simple question depends, of course, on the technology itself - system efficiencies, for example - but it also depends on a variety of other limiting factors such as deployment strategies, environmental considerations, and the overall economics of MRE in the context of competing energy resources. While it is universally agreed that MRE development must occur within a framework of environmental stewardship, it is nonetheless inevitable that there will be trade-offs between absolute environmental protection and realizing the benefits of MRE implementation. As with solar-energy and wind-power technologies, MRE technologies interact with the environment in which they are deployed. Ecological, societal, and even physical resource concerns all require investigation and, in some cases, mitigation. Moreover, the converse - how will the environment affect the equipment? - presents technical challenges that have long confounded the seagoing community. Biofouling, for example, will affect system efficiency and create significant maintenance and operations issues. Because this will also affect the economics of MRE, nonlinear interactions among the limiting factors complicate the overall issue of resource assessment significantly. While MRE technology development is largely an engineering task, resource assessment falls more to the oceanography community. Current and temperature structure measurements, for example, are critical for these efforts. Once again, however, the picture is complicated by the nature of the endeavor: deploying complex equipment on scales of tens of meters into a medium that is traditionally measured on scales of tens of kilometers implies a scale mismatch that must be overcome. The challenge, then, is to develop assessments of the resource on larger scales - so that the potential of the resource may be understood - while characterizing it on very small scales to be able to understand how equipment will be affected. Meeting this challenge will require both funding and time, but it will also result in new oceanographic insight and understanding.
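As a first-order illustration of the resource question posed above, the recoverable hydrokinetic power scales with the cube of the current speed; the figures below are generic, not an assessment of the Florida Current:

    \[ P = \tfrac{1}{2}\,\rho\,A\,v^{3}\,C_p \]

For seawater (ρ ≈ 1025 kg/m³), a device with swept area A = 100 m² in a v = 1.5 m/s current intercepts about 173 kW of kinetic power before the power coefficient C_p and system efficiencies are applied, which is why small changes in siting relative to the current core matter enormously to the assessment.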
Medical Informatics Education & Research in Greece.
Chouvarda, I; Maglaveras, N
2015-08-13
This paper aims to present an overview of the medical informatics landscape in Greece, to describe the Greek eHealth background, and to highlight the main education and research axes in medical informatics, along with activities, achievements and pitfalls. With respect to research and education, formal and informal sources were investigated, and information was collected and presented in a qualitative manner, including quantitative indicators where possible. Greece has adopted and applied medical informatics education in various ways, including undergraduate courses in health sciences schools as well as multidisciplinary postgraduate courses. There is a continuous research effort and large participation in EU-wide initiatives across the whole spectrum of medical informatics research, with notable scientific contributions, although technology maturation is not without barriers. Wide-scale deployment of eHealth is anticipated in the healthcare system in the near future. While ePrescription deployment has been an important step, ICT for integrated care and telehealth have considerable room for further deployment. Greece is a valuable contributor in the European medical informatics arena and has the potential to offer more, provided the barriers of research and innovation fragmentation are addressed and alleviated.
HEPCloud, a New Paradigm for HEP Facilities: CMS Amazon Web Services Investigation
Holzman, Burt; Bauerdick, Lothar A. T.; Bockelman, Brian; ...
2017-09-29
Historically, high energy physics computing has been performed on large purpose-built computing systems. These began as single-site compute facilities, but have evolved into the distributed computing grids used today. Recently, there has been an exponential increase in the capacity and capability of commercial clouds. Cloud resources are highly virtualized and intended to be able to be flexibly deployed for a variety of computing tasks. There is a growing interest among the cloud providers to demonstrate the capability to perform large-scale scientific computing. In this paper, we discuss results from the CMS experiment using the Fermilab HEPCloud facility, which utilized both local Fermilab resources and virtual machines in the Amazon Web Services Elastic Compute Cloud. We discuss the planning, technical challenges, and lessons learned involved in performing physics workflows on a large-scale set of virtualized resources. Additionally, we discuss the economics and operational efficiencies when executing workflows both in the cloud and on dedicated resources.
A Multiscale Surface Water Temperature Data Acquisition Platform: Tests on Lake Geneva, Switzerland
NASA Astrophysics Data System (ADS)
Barry, D. A.; Irani Rahaghi, A.; Lemmin, U.; Riffler, M.; Wunderle, S.
2015-12-01
An improved understanding of surface transport processes is necessary to predict sediment, pollutant and phytoplankton patterns in large lakes. Lake surface water temperature (LSWT), which varies in space and time, reflects meteorological and climatological forcing more than any other physical lake parameter. There are different data sources for LSWT mapping, including remote sensing and in situ measurements. Satellite data can be suitable for detecting large-scale thermal patterns, but not meso- or small-scale processes. Lake surface thermography, investigated in this study, has finer resolution compared to satellite images. Thermography at the meso-scale provides the ability to ground-truth satellite imagery over scales of one to several satellite image pixels. On the other hand, thermography data can be used as a control in schemes to upscale local measurements that account for surface energy fluxes and the vertical energy budget. Independently, since such data can be collected at high frequency, they can also be useful in capturing changes in the surface signatures of meso-scale eddies and thus in quantifying mixing processes. In the present study, we report results from a Balloon Launched Imaging and Monitoring Platform (BLIMP), which was developed in order to measure the LSWT at the meso-scale. The BLIMP consists of a small balloon that is tethered to a boat and equipped with thermal and RGB cameras, as well as other instrumentation for location and communication. Several deployments were carried out on Lake Geneva. In a typical deployment, the BLIMP is towed by a boat and collects high frequency data from different heights (i.e., spatial resolutions) and locations. Simultaneous ground-truthing of the BLIMP data is achieved using an autonomous craft that collects a variety of data, including in situ surface/near-surface temperatures, radiation and meteorological data in the area covered by the BLIMP images. With suitable scaling, our results show good consistency between in situ, BLIMP and concurrent satellite data. In addition, the BLIMP thermography reveals (hydrodynamically-driven) structures in the LSWT - an obvious example being mixing of river discharges.
Wilcox, S.; Andreas, A.
2010-09-27
The U.S. Department of Energy's National Renewable Energy Laboratory collaborates with the solar industry to establish high quality solar and meteorological measurements. This Solar Resource and Meteorological Assessment Project (SOLRMAP) provides high quality measurements to support deployment of power projects in the United States. The no-funds-exchanged collaboration brings NREL solar resource assessment expertise together with industry needs for measurements. The end result is high quality data sets to support the financing, design, and monitoring of large scale solar power projects for industry in addition to research-quality data for NREL model development. NREL provides consultation for instrumentation and station deployment, along with instrument calibrations, data acquisition, quality assessment, data distribution, and summary reports. Industry participants provide equipment, infrastructure, and station maintenance.
Use of an innovative design mobile hospital in the medical response to Hurricane Katrina.
Blackwell, Thomas; Bosse, Michael
2007-05-01
On August 29, 2005, Hurricane Katrina caused widespread devastation to the Gulf Coast region of the United States. Although New Orleans had extensive damage from flooding, many communities in Mississippi had equal damage from storm surge and wind. Because the medical resources in many of these areas were incapacitated, resources from North Carolina were deployed to assist in the medical mission. This response included the initial use of Carolinas MED-1, a mobile hospital that incorporates an emergency department, surgical suite, critical care beds, and general treatment and admitting area. This asset, along with additional state resources, provided comprehensive diagnostic and definitive patient care until the local medical infrastructure was rebuilt and functional. The use of a mobile hospital may be advantageous for future deployments to large-scale disasters, especially when integrated with specialty teams.
Alvarez-Campana, Manuel; López, Gregorio; Vázquez, Enrique; Villagrá, Víctor A; Berrocal, Julio
2017-12-08
Internet of Things platforms for Smart Cities are technologically complex and deploying them at large scale involves high costs and risks. Therefore, pilot schemes that allow validating proof of concepts, experimenting with different technologies and services, and fine-tuning them before migrating them to actual scenarios, are especially important in this context. The IoT platform deployed across the engineering schools of the Universidad Politécnica de Madrid in the Moncloa Campus of International Excellence represents a good example of a test bench for experimentation with Smart City services. This paper presents the main features of this platform, putting special emphasis on the technological challenges faced and on the solutions adopted, as well as on the functionality, services and potential that the platform offers.
NASA Astrophysics Data System (ADS)
Wilcox, Steve; Myers, Daryl
2009-08-01
The U.S. Department of Energy's National Renewable Energy Laboratory has embarked on a collaborative effort with the solar industry to establish high quality solar and meteorological measurements. This Solar Resource and Meteorological Assessment Project (SOLRMAP) provides high quality measurements to support deployment of concentrating solar thermal power projects in the United States. The no-funds-exchanged collaboration brings NREL solar resource assessment expertise together with industry needs for measurements. The end result will be high quality data sets to support the financing, design, and monitoring of large scale solar power projects for industry in addition to research-quality data for NREL model development. NREL provides consultation for instrumentation and station deployment, along with instrument calibrations, data acquisition, quality assessment, data distribution, and summary reports. Industry participants provide equipment, infrastructure, and station maintenance.
Large deployable antenna program. Phase 1: Technology assessment and mission architecture
NASA Technical Reports Server (NTRS)
Rogers, Craig A.; Stutzman, Warren L.
1991-01-01
The program was initiated to investigate the availability of critical large deployable antenna technologies which would enable microwave remote sensing missions from geostationary orbits as required for Mission to Planet Earth. Program goals for the large antenna were: 40-meter diameter, offset-fed paraboloid, and surface precision of 0.1 mm rms. Phase 1 goals were: to review the state of the art for large, precise, wide-scanning radiometers up to 60 GHz; to assess critical technologies necessary for selected concepts; to develop mission architecture for these concepts; and to evaluate generic technologies to support the large deployable reflectors necessary for these missions. Selected results of the study show that deployable reflectors using furlable segments are limited by the surface precision goal to 12 meters in diameter, that current launch vehicles can place only a 20-meter-class antenna in geostationary orbit, and that conceptual designs using stiff reflectors are possible with areal densities of 2.4 kg/sq m.
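The surface-precision goal can be tied to antenna performance through the standard Ruze relation; the worked numbers below simply combine the 0.1 mm rms and 60 GHz figures quoted above and are not results from the study:

    \[ \frac{G}{G_0} = \exp\!\left[-\left(\frac{4\pi\varepsilon}{\lambda}\right)^{2}\right] \]

At 60 GHz (λ = 5 mm), a surface error of ε = 0.1 mm rms gives (4πε/λ)² ≈ 0.063, i.e. a gain loss of roughly 6% (about 0.27 dB), which shows how sharply the precision requirement tightens for millimeter-wave radiometry on a 40-meter aperture.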
AAFE large deployable antenna development program: Executive summary
NASA Technical Reports Server (NTRS)
1977-01-01
The large deployable antenna development program sponsored by the Advanced Applications Flight Experiments of the Langley Research Center is summarized. Projected user requirements for large diameter deployable reflector antennas were reviewed. Trade-off studies for the selection of a design concept for 10-meter diameter reflectors were made. A hoop/column concept was selected as the baseline concept. Parametric data are presented for 15-meter, 30-meter, and 100-meter diameters. A 1.82-meter diameter engineering model which demonstrated the feasibility of the concept is described.
The AlpArray Seismic Network: current status and next steps
NASA Astrophysics Data System (ADS)
Hetényi, György; Molinari, Irene; Clinton, John; Kissling, Edi
2016-04-01
The AlpArray initiative (http://www.alparray.ethz.ch) is a large-scale European collaboration to study the entire Alpine orogen at high resolution and in 3D with a large variety of geoscientific methods. The core element of the initiative is an extensive and dense broadband seismological network, the AlpArray Seismic Network (AASN), which complements the permanent seismological stations to ensure homogeneous coverage of the greater Alpine area. The approximately 260 temporary stations of the AlpArray Seismic Network are operated as a joint effort by a number of institutions from Austria, Bosnia-Herzegovina, Croatia, Czech Republic, France, Germany, Hungary, Italy, Slovakia and Switzerland. The first stations were installed in Spring 2015 and the full AASN is planned to be operational by early Summer 2016. In this poster we present the current status of the deployment, the effort undertaken by the contributing groups, station performance, typical noise levels, best practices in installation as well as in data management, often-encountered challenges, and planned next steps including the deployment of ocean bottom seismometers in the Ligurian Sea.
Genetics of Resistant Hypertension: the Missing Heritability and Opportunities.
Teixeira, Samantha K; Pereira, Alexandre C; Krieger, Jose E
2018-05-19
Blood pressure regulation in humans has long been known to be a genetically determined trait. The identification of causal genetic modulators for this trait has, however, been largely unfruitful. Despite the recent advances of genome-wide genetic studies, loci associated with hypertension or blood pressure still explain a very low percentage of the overall variation of blood pressure in the general population. This has precluded the translation of discoveries in the genetics of human hypertension to clinical use. Here, we propose the combined use of resistant hypertension as a trait for mapping genetic determinants in humans and the integration of new large-scale technologies to approach, in model systems, the multidimensional nature of the problem. New large-scale efforts in the genetic and genomic arenas are paving the way for an increased and granular understanding of the genetic determinants of hypertension. New technologies for whole-genome sequencing and large-scale forward genetic screens can help prioritize genes and gene pathways for downstream characterization and large-scale population studies, and guided pharmacological design can be used to drive discoveries to translational application through better risk stratification and new therapeutic approaches. Although significant challenges remain in the mapping and identification of genetic determinants of hypertension, new large-scale technological approaches have been proposed to surpass some of the shortcomings that have limited progress in the area for the last three decades. The incorporation of these technologies into hypertension research may significantly help in the understanding of inter-individual blood pressure variation and the deployment of new phenotyping and treatment approaches for the condition.
Deployment and retraction of a cable-driven solar array: Testing and simulation
NASA Technical Reports Server (NTRS)
Kumar, P.; Pellegrino, S.
1995-01-01
The paper investigates three critical areas in cable-driven rigid-panel solar arrays: first, the variation of deployment and retraction cable tensions due to friction at the hinges; second, the change in deployment dynamics associated with different deployment histories; and third, the relationship between the level of pre-tension in the closed contact loops and the synchronization of deployment. A small-scale model array has been made and tested, and its behavior has been compared to numerical simulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2017-05-09
The 21st Century Power Partnership (21CPP) aims to accelerate the global transformation of power systems. The Power Partnership is a multilateral effort of the Clean Energy Ministerial (CEM) and serves as a platform for public-private collaboration to advance integrated policy, regulatory, financial, and technical solutions for the large-scale deployment of renewable energy in combination with deep energy efficiency and smart grid solutions. This fact sheet details the 21CPP's work in India.
Yu, Dongliang; Yin, Min; Lu, Linfeng; Zhang, Hanzhong; Chen, Xiaoyuan; Zhu, Xufei; Che, Jianfei; Li, Dongdong
2015-11-01
High-performance thin-film hydrogenated amorphous silicon solar cells are achieved by combining macroscale 3D tubular substrates and nanoscale 3D cone-like antireflective films. The tubular geometry delivers a series of advantages for large-scale deployment of photovoltaics, such as omnidirectional performance, easier encapsulation, decreased wind resistance, and easy integration with a second device inside the glass tube.
Eavesdropping on the Arctic: Automated bioacoustics reveal dynamics in songbird breeding phenology.
Oliver, Ruth Y; Ellis, Daniel P W; Chmura, Helen E; Krause, Jesse S; Pérez, Jonathan H; Sweet, Shannan K; Gough, Laura; Wingfield, John C; Boelman, Natalie T
2018-06-01
Bioacoustic networks could vastly expand the coverage of wildlife monitoring to complement satellite observations of climate and vegetation. This approach would enable global-scale understanding of how climate change influences phenomena such as migratory timing of avian species. The enormous data sets that autonomous recorders typically generate demand automated analyses that remain largely undeveloped. We devised automated signal processing and machine learning approaches to estimate dates on which songbird communities arrived at arctic breeding grounds. Acoustically estimated dates agreed well with those determined via traditional surveys and were strongly related to the landscape's snow-free dates. We found that environmental conditions heavily influenced daily variation in songbird vocal activity, especially before egg laying. Our novel approaches demonstrate that variation in avian migratory arrival can be detected autonomously. Large-scale deployment of this innovation in wildlife monitoring would enable the coverage necessary to assess and forecast changes in bird migration in the face of climate change.
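The abstract does not specify the estimator, so the following is only a plausible sketch: reduce the audio stream to a daily vocal-activity index with a song classifier, then estimate community arrival as the change point that best splits the season into low- and high-activity segments. All values below are synthetic:

    import numpy as np

    # Hypothetical daily vocal-activity index (e.g., fraction of short clips in
    # which a classifier detects song), one value per day; not data from the study.
    rng = np.random.default_rng(0)
    activity = np.concatenate([rng.normal(0.05, 0.02, 40),   # pre-arrival
                               rng.normal(0.45, 0.08, 60)])  # post-arrival

    def arrival_day(x):
        """Estimate arrival as the least-squares change point: the split that
        minimizes the total within-segment sum of squared deviations."""
        best_day, best_cost = None, np.inf
        for k in range(5, len(x) - 5):
            cost = x[:k].var() * k + x[k:].var() * (len(x) - k)
            if cost < best_cost:
                best_day, best_cost = k, cost
        return best_day

    print(arrival_day(activity))   # ~40, the day the community arrived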
Accelerating Large Scale Image Analyses on Parallel, CPU-GPU Equipped Systems
Teodoro, George; Kurc, Tahsin M.; Pan, Tony; Cooper, Lee A.D.; Kong, Jun; Widener, Patrick; Saltz, Joel H.
2014-01-01
The past decade has witnessed a major paradigm shift in high performance computing with the introduction of accelerators as general purpose processors. These computing devices make available very high parallel computing power at low cost and power consumption, transforming current high performance platforms into heterogeneous CPU-GPU equipped systems. Although the theoretical performance achieved by these hybrid systems is impressive, taking practical advantage of this computing power remains a very challenging problem. Most applications are still deployed to either GPU or CPU, leaving the other resource under- or un-utilized. In this paper, we propose, implement, and evaluate a performance aware scheduling technique along with optimizations to make efficient collaborative use of CPUs and GPUs on a parallel system. In the context of feature computations in large scale image analysis applications, our evaluations show that intelligently co-scheduling CPUs and GPUs can significantly improve performance over GPU-only or multi-core CPU-only approaches.
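As a toy illustration of performance-aware co-scheduling (not the paper's actual scheduler; the task mix, speedups and device counts are made up), the sketch below dispatches each feature-computation task to whichever device would finish it earliest, so highly GPU-accelerable kernels land on GPUs while the CPU cores stay busy instead of idle:

    import heapq

    # (task name, CPU time in s, GPU speedup over CPU) -- hypothetical numbers.
    tasks = [("color_norm", 2.0, 1.5), ("morphology", 8.0, 11.0),
             ("texture", 6.0, 4.0), ("io_decode", 3.0, 0.8)] * 50

    def co_schedule(tasks, n_cpu=8, n_gpu=2):
        """Greedy list scheduling: consider tasks in decreasing GPU speedup and
        place each on the device (CPU core or GPU) that would finish it first."""
        cpus, gpus = [0.0] * n_cpu, [0.0] * n_gpu   # earliest-free times (heaps)
        for _, cpu_t, speedup in sorted(tasks, key=lambda t: -t[2]):
            gpu_t = cpu_t / speedup
            if gpus and gpus[0] + gpu_t <= cpus[0] + cpu_t:
                heapq.heappush(gpus, heapq.heappop(gpus) + gpu_t)
            else:
                heapq.heappush(cpus, heapq.heappop(cpus) + cpu_t)
        return max(cpus + gpus)   # makespan of the hybrid schedule

    print(co_schedule(tasks))                 # CPU+GPU collaborative schedule
    print(co_schedule(tasks, n_gpu=0))        # CPU-only baseline for comparison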
Luck, Kyle A; Shastry, Tejas A; Loser, Stephen; Ogien, Gabriel; Marks, Tobin J; Hersam, Mark C
2013-12-28
Organic photovoltaics have the potential to serve as lightweight, low-cost, mechanically flexible solar cells. However, losses in efficiency as laboratory cells are scaled up to the module level have to date impeded large scale deployment. Here, we report that a 3-aminopropyltriethoxysilane (APTES) cathode interfacial treatment significantly enhances performance reproducibility in inverted high-efficiency PTB7:PC71BM organic photovoltaic cells, as demonstrated by the fabrication of 100 APTES-treated devices versus 100 untreated controls. The APTES-treated devices achieve a power conversion efficiency of 8.08 ± 0.12% with histogram skewness of -0.291, whereas the untreated controls achieve 7.80 ± 0.26% with histogram skewness of -1.86. By substantially suppressing the interfacial origins of underperforming cells, the APTES treatment offers a pathway for fabricating large-area modules with high spatial performance uniformity.
Policy Driven Development: Flexible Policy Insertion for Large Scale Systems.
Demchak, Barry; Krüger, Ingolf
2012-07-01
The success of a software system depends critically on how well it reflects and adapts to stakeholder requirements. Traditional development methods often frustrate stakeholders by creating long latencies between requirement articulation and system deployment, especially in large-scale systems. One source of latency is the maintenance of policy decisions encoded directly into system workflows at development time, including those involving access control and feature set selection. We created the Policy Driven Development (PDD) methodology to address these development latencies by enabling the flexible injection of decision points into existing workflows at runtime, thus permitting policy composition that integrates requirements furnished by multiple, oblivious stakeholder groups. Using PDD, we designed and implemented a production cyberinfrastructure that demonstrates policy and workflow injection that quickly implements stakeholder requirements, including features not contemplated in the original system design. PDD provides a path to quickly and cost-effectively evolve such applications over a long lifetime.
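As a minimal sketch of the flexible-injection idea (the class and names here are illustrative, not the PDD framework's actual API): existing workflows consult named decision points, and policies contributed at runtime by independent stakeholder groups compose at each point:

    from typing import Callable, Dict, List

    class PolicyRegistry:
        """Decision points can be added to a running system; workflows query them
        by name, so new stakeholder policies compose without redeploying code."""
        def __init__(self):
            self._policies: Dict[str, List[Callable[..., bool]]] = {}

        def inject(self, point: str, policy: Callable[..., bool]) -> None:
            self._policies.setdefault(point, []).append(policy)

        def allowed(self, point: str, **ctx) -> bool:
            # All policies registered at a decision point must agree (AND-composition).
            return all(p(**ctx) for p in self._policies.get(point, []))

    registry = PolicyRegistry()

    def download_dataset(user, dataset):   # an existing workflow
        if not registry.allowed("dataset.download", user=user, dataset=dataset):
            raise PermissionError(user)
        return f"{user} gets {dataset}"

    # Policies injected at runtime by two oblivious stakeholder groups:
    registry.inject("dataset.download", lambda user, dataset: user != "guest")
    registry.inject("dataset.download",
                    lambda user, dataset: not dataset.startswith("restricted/"))

    print(download_dataset("alice", "public/waves.nc"))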
A simple dynamic subgrid-scale model for LES of particle-laden turbulence
NASA Astrophysics Data System (ADS)
Park, George Ilhwan; Bassenne, Maxime; Urzay, Javier; Moin, Parviz
2017-04-01
In this study, a dynamic model for large-eddy simulations is proposed in order to describe the motion of small inertial particles in turbulent flows. The model is simple, involves no significant computational overhead, contains no adjustable parameters, and is flexible enough to be deployed in any type of flow solvers and grids, including unstructured setups. The approach is based on the use of elliptic differential filters to model the subgrid-scale velocity. The only model parameter, which is related to the nominal filter width, is determined dynamically by imposing consistency constraints on the estimated subgrid energetics. The performance of the model is tested in large-eddy simulations of homogeneous-isotropic turbulence laden with particles, where improved agreement with direct numerical simulation results is observed in the dispersed-phase statistics, including particle acceleration, local carrier-phase velocity, and preferential-concentration metrics.
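For context, a common form of the elliptic differential filter on which such models are based (generic notation; the paper's exact formulation may differ) is:

    \[ \tilde{u}_i - \ell^2\,\nabla^2 \tilde{u}_i = \bar{u}_i, \qquad u_i' \approx \bar{u}_i - \tilde{u}_i, \]

where \bar{u} is the resolved LES velocity, ℓ is the nominal filter width (the single parameter determined dynamically from the estimated subgrid energetics), and u' is the modeled subgrid-scale velocity contribution seen by the inertial particles.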
Toward a theoretical framework for trustworthy cyber sensing
NASA Astrophysics Data System (ADS)
Xu, Shouhuai
2010-04-01
Cyberspace is an indispensable part of the economy and society, but it has been "polluted" with many compromised computers that can be abused to launch further attacks against others. Since it is likely that there will always be compromised computers, it is important to maintain awareness of the (dynamic) cyber security situation, which is challenging because cyberspace is an extremely large-scale complex system. Our project aims to investigate a theoretical framework for trustworthy cyber sensing. Treating cyberspace as a large-scale complex system, the core question we aim to address is: what would be a competent theoretical (mathematical and algorithmic) framework for designing, analyzing, deploying, managing, and adapting cyber sensor systems so as to provide trustworthy information or input to the higher layer of cyber situation-awareness management, even in the presence of sophisticated malicious attacks against the cyber sensor systems?
General circulation of the South Atlantic between 5 deg N and 35 deg S
NASA Technical Reports Server (NTRS)
Ollitrault, Michel; Mercier, H.; Blanc, F.; Letraon, L. Y.
1991-01-01
The TOPEX/POSEIDON altimeter will provide the temporal mean sea level. Second, we propose to compute the difference between these two surfaces (mean sea level minus general circulation dynamic topography). The result will be an estimate of the marine geoid, which is time-invariant for the 5-year period under consideration. If this geoid is precise enough, it will permit a description of the seasonal variability of the large-scale surface circulation. If there happens to be enough float data, it may be possible to infer the first vertical modes of this variability. Thus the main goal of our investigation is to determine the 3-D general circulation of the South Atlantic and the large-scale seasonal fluctuations. This last objective, however, may be restricted to the western part of the South Atlantic because float deployments have been scheduled only in the Brazil Basin.
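The geoid estimate described above is a pointwise difference of two surfaces; in generic notation (not the authors' symbols):

    \[ N(\mathbf{x}) \approx \overline{h}_{\mathrm{alt}}(\mathbf{x}) - \eta_{\mathrm{dyn}}(\mathbf{x}), \]

where \overline{h}_{alt} is the altimetric temporal mean sea level and \eta_{dyn} is the mean dynamic topography of the general circulation inferred from the float data. Because the residual N is time-invariant over the analysis period, subsequent sea-level anomalies measured relative to it isolate the time-varying part of the circulation.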
Analyzing large scale genomic data on the cloud with Sparkhit
Huang, Liren; Krüger, Jan
2018-01-01
Motivation: The increasing amount of next-generation sequencing data poses a fundamental challenge for large-scale genomic analytics. Existing tools use different distributed computational platforms to scale out bioinformatics workloads. However, the scalability of these tools is not efficient. Moreover, they have heavy run-time overheads when pre-processing large amounts of data. To address these limitations, we have developed Sparkhit: a distributed bioinformatics framework built on top of the Apache Spark platform. Results: Sparkhit integrates a variety of analytical methods. It is implemented in the Spark extended MapReduce model. It runs 92–157 times faster than MetaSpark on metagenomic fragment recruitment and 18–32 times faster than Crossbow on data pre-processing. We analyzed 100 terabytes of data across four genomic projects in the cloud in 21 h, including the run times of cluster deployment and data downloading. Furthermore, our application on the entire Human Microbiome Project shotgun sequencing data completed in 2 h, presenting an approach to easily associate large amounts of public datasets with reference data. Availability and implementation: Sparkhit is freely available at https://rhinempi.github.io/sparkhit/. Contact: asczyrba@cebitec.uni-bielefeld.de. Supplementary information: Supplementary data are available at Bioinformatics online.
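Sparkhit's own API is not shown in the abstract; purely to illustrate the Spark map-reduce pattern it builds on, here is a generic PySpark sketch that counts reads sharing a k-mer with a reference set. The path, k-mer set and parameters are hypothetical:

    from pyspark import SparkContext

    sc = SparkContext(appName="fragment-recruitment-sketch")

    # Hypothetical reference 11-mers, broadcast to all workers.
    ref_kmers = sc.broadcast({"ACGTACGTACG", "TTGACCTTGAC"})
    K = 11

    def kmers(read, k=K):
        return {read[i:i + k] for i in range(len(read) - k + 1)}

    # Hypothetical input: one read per line, spread across cloud storage.
    reads = sc.textFile("s3://my-bucket/reads/*.txt")
    recruited = reads.filter(lambda r: kmers(r) & ref_kmers.value)
    print(recruited.count())   # reads recruited to the reference
    sc.stop()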
Early opportunities of CO₂ geological storage deployment in coal chemical industry in China
Wei, Ning; Li, Xiaochun; Liu, Shengnan; ...
2014-12-31
Carbon dioxide capture and geological storage (CCS) is regarded as a promising option for climate change mitigation; however, the high capture cost is the major barrier to large-scale deployment of CCS technologies. High-purity CO₂ emission sources can reduce or even avoid the capture requirements and costs. Among these high-purity CO₂ sources, certain coal chemical industry processes are very important, especially in China. In this paper, the basic characteristics of coal chemical industries in China are investigated and analyzed. As of 2013 there were more than 100 coal chemical plants in operation. These emission sources together emit 430 million tons of CO₂ per year, of which about 30% is high-purity or pure CO₂ (CO₂ concentration >80% and >98.5%, respectively). Four typical source-sink pairs are chosen for techno-economic evaluation, including site screening and selection, source-sink matching, concept design, and economic evaluation. The techno-economic evaluation shows that the levelized cost of a CO₂ capture and aquifer storage project in the coal chemical industry ranges from 14 USD/t to 17 USD/t CO₂. When a 15 USD/t CO₂ tax and 20 USD/t for CO₂ sold to EOR are considered, the levelized cost of a CCS project is negative, which suggests a net benefit from some of these CCS projects. This might provide China early opportunities to deploy and scale up CCS projects in the near future.
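The sign change in the levelized cost follows from simple arithmetic under one plausible reading of the figures above (taking the mid-range capture-and-storage cost; this pairing is illustrative, not the paper's calculation):

    \[ c_{\mathrm{net}} \approx c_{\mathrm{CCS}} - c_{\mathrm{tax\ avoided}} - c_{\mathrm{EOR}} \approx 16 - 15 - 20 = -19\ \mathrm{USD/t\ CO_2}, \]

so each tonne stored would yield a net benefit on the order of 19 USD when the avoided carbon tax and the EOR revenue are both credited to the project.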
NASA Technical Reports Server (NTRS)
Mengshoel, Ole Jakob; Poll, Scott; Kurtoglu, Tolga
2009-01-01
In this paper, we investigate the use of Bayesian networks to construct large-scale diagnostic systems. In particular, we consider the development of large-scale Bayesian networks by composition. This compositional approach reflects how (often redundant) subsystems are architected to form systems such as electrical power systems. We develop high-level specifications, Bayesian networks, clique trees, and arithmetic circuits representing 24 different electrical power systems. The largest among these 24 Bayesian networks contains over 1,000 random variables. Another BN represents the real-world electrical power system ADAPT, which is representative of electrical power systems deployed in aerospace vehicles. In addition to demonstrating the scalability of the compositional approach, we briefly report on experimental results from the diagnostic competition DXC, where the ProADAPT team, using techniques discussed here, obtained the highest scores in both Tier 1 (among 9 international competitors) and Tier 2 (among 6 international competitors) of the industrial track. While we consider diagnosis of power systems specifically, we believe this work is relevant to other system health management problems, in particular in dependable systems such as aircraft and spacecraft. (See CASI ID 20100021910 for supplemental data disk.)
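As a toy sketch of building diagnostic Bayesian networks by composition (pure Python for self-containment; the structure, probabilities and variable names are illustrative, not the ADAPT model), two redundant battery subsystems are specified once, composed by a union of variables and CPTs, and queried by brute-force enumeration:

    from itertools import product

    def subsystem(name):
        # Each subsystem contributes a fault node and a sensor node.
        return {
            f"{name}_fault":   {(): 0.01},                 # P(fault = 1)
            f"{name}_volt_ok": {(0,): 0.99, (1,): 0.05},   # P(volt_ok = 1 | fault)
        }

    def compose(*nets):
        bn = {}
        for net in nets:
            bn.update(net)   # composition = union of (variable, CPT) pairs
        return bn

    bn = compose(subsystem("battA"), subsystem("battB"))

    def joint(a):
        p = 1.0
        for name in ("battA", "battB"):
            f, v = a[f"{name}_fault"], a[f"{name}_volt_ok"]
            pf = bn[f"{name}_fault"][()]
            pv = bn[f"{name}_volt_ok"][(f,)]
            p *= (pf if f else 1 - pf) * (pv if v else 1 - pv)
        return p

    # P(battA_fault | battA_volt_ok = 0) by enumeration over all assignments.
    names = ["battA_fault", "battA_volt_ok", "battB_fault", "battB_volt_ok"]
    num = den = 0.0
    for vals in product((0, 1), repeat=len(names)):
        a = dict(zip(names, vals))
        if a["battA_volt_ok"] == 0:
            den += joint(a)
            num += joint(a) if a["battA_fault"] else 0.0
    print(num / den)   # ~0.49: the fault posterior jumps given one bad reading

Compiling the composed network into clique trees or arithmetic circuits, as the paper describes, is what replaces this exponential enumeration with inference that scales to the 1,000-variable systems mentioned above.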
Trego, Lori L; Jordan, Patricia J
2010-01-01
To determine military women's attitudes toward menstruation and menstrual suppression with oral contraceptives in the deployed environment, a cross-sectional descriptive design was used, with administration of the Menstrual Attitude Questionnaire (MAQ) and the 55-item Military Women's Attitudes Towards Menstrual Suppression Scale (MWATMS) to a convenience sample (n = 278) of women in the U.S. Army with deployment experience. The mean scores of the MAQ's five subscales ranged from 3.4 (+/-1.11) to 5.1 (+/-1.06), indicating neutral to moderate attitudes toward menstruation. Measurement development on the MWATMS produced a nine-item scale with three components: stress effects, benefits to self, and environmental barriers. Menstrual attitudes were generally neutral in this sample; however, military women favor menstrual suppression during deployment owing to the effects of stress during deployment, the benefits that suppression would provide, and the barriers to menstrual hygiene in the deployed environment. Women who perceived menstruation as bothersome and debilitating had positive attitudes toward menstrual suppression. These findings can contribute to appropriate predeployment women's health care and improve readiness for deployment in female soldiers. Providers should educate women on the risks and benefits of menstrual suppression methods and provide guidance on the impact that the deployed environment can have on their menstrual experiences.
A controlled field pilot for testing near surface CO2 detection techniques and transport models
Spangler, L.H.; Dobeck, L.M.; Repasky, K.; Nehrir, A.; Humphries, S.; Keith, C.; Shaw, J.; Rouse, J.; Cunningham, A.; Benson, S.; Oldenburg, C.M.; Lewicki, J.L.; Wells, A.; Diehl, R.; Strazisar, B.; Fessenden, J.; Rahn, Thomas; Amonette, J.; Barr, J.; Pickles, W.; Jacobson, J.; Silver, E.; Male, E.; Rauch, H.; Gullickson, K.; Trautz, R.; Kharaka, Y.; Birkholzer, J.; Wielopolski, L.
2009-01-01
A field facility has been developed to allow controlled studies of near surface CO2 transport and detection technologies. The key component of the facility is a shallow, slotted horizontal well divided into six zones. The scale and fluxes were designed to address large scale CO2 storage projects and desired retention rates for those projects. A wide variety of detection techniques were deployed by collaborators from 6 national labs, 2 universities, EPRI, and the USGS. Additionally, modeling of CO2 transport and concentrations in the saturated soil and in the vadose zone was conducted. An overview of these results will be presented.
Anders, Katherine L; Indriani, Citra; Ahmad, Riris Andono; Tantowijoyo, Warsito; Arguni, Eggi; Andari, Bekti; Jewell, Nicholas P; Rances, Edwige; O'Neill, Scott L; Simmons, Cameron P; Utarini, Adi
2018-05-31
Dengue and other arboviruses transmitted by Aedes aegypti mosquitoes, including Zika and chikungunya, present an increasing public health challenge in tropical regions. Current vector control strategies have failed to curb disease transmission, but continue to be employed despite the absence of robust evidence for their effectiveness or optimal implementation. The World Mosquito Program has developed a novel approach to arbovirus control using Ae. aegypti stably transfected with Wolbachia bacterium, with a significantly reduced ability to transmit dengue, Zika and chikungunya in laboratory experiments. Modelling predicts this will translate to local elimination of dengue in most epidemiological settings. This study protocol describes the first trial to measure the efficacy of Wolbachia in reducing dengue virus transmission in the field. The study is a parallel, two-arm, non-blinded cluster randomised controlled trial conducted in a single site in Yogyakarta, Indonesia. The aim is to determine whether large-scale deployment of Wolbachia-infected Ae. aegypti mosquitoes leads to a measurable reduction in dengue incidence in treated versus untreated areas. The primary endpoint is symptomatic, virologically confirmed dengue virus infection of any severity. The 26 km² study area was subdivided into 24 contiguous clusters, allocated randomly 1:1 to receive Wolbachia deployments or no intervention. We use a novel epidemiological study design, the cluster-randomised test-negative design trial, in which dengue cases and arbovirus-negative controls are sampled concurrently from among febrile patients presenting to a network of primary care clinics, with case or control status classified retrospectively based on the results of laboratory diagnostic testing. Efficacy is estimated from the odds ratio of the Wolbachia exposure distribution (probability of living in a Wolbachia-treated area) among virologically confirmed dengue cases compared to test-negative controls. A secondary per-protocol analysis allows for individual Wolbachia exposure levels to be assessed to account for movements outside the cluster and the heterogeneity in local Wolbachia prevalence among treated clusters. The findings from this study will provide the first experimental evidence for the efficacy of Wolbachia in reducing dengue incidence. Together with observational evidence that is accumulating from pragmatic deployments of Wolbachia in other field sites, this will provide valuable data to estimate the effectiveness of this novel approach to arbovirus control, inform future cost-effectiveness estimates, and guide plans for large-scale deployments in other endemic settings. ClinicalTrials.gov identifier: NCT03055585. Registered on 14 February 2017.
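In a test-negative design, protective efficacy is conventionally estimated as one minus the odds ratio of intervention exposure among cases versus test-negative controls, consistent with the protocol's description above; a minimal sketch with made-up counts (not trial results):

    # Febrile patients from clinics in treated vs. untreated clusters; counts
    # are purely illustrative.
    cases_treated, cases_untreated = 40, 110          # confirmed dengue
    controls_treated, controls_untreated = 300, 290   # febrile, dengue-negative

    odds_ratio = (cases_treated / controls_treated) / (cases_untreated / controls_untreated)
    efficacy = 1.0 - odds_ratio
    print(f"OR = {odds_ratio:.2f}, protective efficacy = {efficacy:.0%}")   # ~65%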
Full-Scale Crash Test of a MD-500 Helicopter with Deployable Energy Absorbers
NASA Technical Reports Server (NTRS)
Kellas, Sotiris; Jackson, Karen E.; Littell, Justin D.
2010-01-01
A new externally deployable energy absorbing system was demonstrated during a full-scale crash test of an MD-500 helicopter. The deployable system is a honeycomb structure and utilizes composite materials in its construction. A set of two Deployable Energy Absorbers (DEAs) were fitted on the MD-500 helicopter for the full-scale crash demonstration. Four anthropomorphic dummy occupants were also used to assess human survivability. A demonstration test was performed at NASA Langley's Landing and Impact Research Facility (LandIR). The test involved impacting the helicopter on a concrete surface with combined forward and vertical velocity components of 40-ft/s and 26-ft/s, respectively. The objectives of the test were to evaluate the performance of the DEA concept under realistic crash conditions and to generate test data for validation of dynamic finite element simulations. Descriptions of this test as well as other component and full-scale tests leading to the helicopter test are discussed. Acceleration data from the anthropomorphic dummies showed that dynamic loads were successfully attenuated to within non-injurious levels. Moreover, the airframe itself survived the relatively severe impact and was retested to provide baseline data for comparison for cases with and without DEAs.
Monitoring soil water dynamics at 0.1-1000 m scales using active DTS: the MOISST experience
NASA Astrophysics Data System (ADS)
Sayde, C.; Moreno, D.; Legrand, C.; Dong, J.; Steele-Dunne, S. C.; Ochsner, T. E.; Selker, J. S.
2014-12-01
The Actively Heated Fiber Optics (AHFO) method can measure soil water content at high temporal (<1 hr) and spatial (every 0.25 m) resolution along buried fiber optic (FO) cables multiple kilometers in length. As observed by Sayde et al. (2014), this unprecedented density of measurements captures soil water dynamics over four orders of magnitude in spatial scale (0.1-1000 m), bridging the gap between point-scale measurements and large-scale remote sensing. 4900 m of FO sensing cable was installed at the MOISST experimental site in Stillwater, OK. The FO cables were deployed at three depths: 5, 10, and 15 cm. The FO sensing system provides measurements of soil moisture at >39,000 locations simultaneously for each heat pulse. Six soil monitoring stations along the fiber optic path were installed to provide additional validation and calibration of the AHFO data. Gravimetric soil moisture and soil thermal sampling were performed periodically to provide additional distributed validation and calibration of the DTS data. In this work we present the preliminary results of this experiment. We also address the lessons learned from this large-scale deployment of the AHFO method. In particular, we present the in-situ soil moisture calibration method developed to tackle the calibration challenges associated with the high spatial heterogeneity of the soil's physical and thermal properties. This material is based upon work supported by NASA under award NNX12AP58G, with equipment and assistance also provided by CTEMPs.org with support from the National Science Foundation under Grant Number 1129003. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of NASA or the National Science Foundation. Reference: Sayde, C., J. Benitez Buelga, L. Rodriguez-Sinobas, L. El Khoury, M. English, N. van de Giesen, and J.S. Selker (2014), Mapping Variability of Soil Water Content and Flux across 1-1,000 m Scales Using the Actively Heated Fiber Optic Method, Water Resources Research.
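The AHFO retrieval rests on classical line-source heating theory, which the study's in-situ calibration builds on; in generic form (not the site-specific calibration itself):

    \[ \Delta T(t) \approx \frac{q}{4\pi\lambda}\,\ln t + C, \]

where q is the heat injected per unit length of cable (W/m) and λ is the soil thermal conductivity (W/(m·K)). Fitting the slope of ΔT against ln t during each heat pulse yields λ at every 0.25 m sampling point, and an empirical λ(θ) relation, calibrated in situ as described above, converts it to volumetric water content θ.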
NASA Astrophysics Data System (ADS)
Kerkez, B.; Zhang, Z.; Oroza, C.; Glaser, S. D.; Bales, R. C.
2012-12-01
We describe our improved, robust, and scalable architecture by which to rapidly instrument large-scale watersheds, while providing the resulting data in real time. Our system consists of more than twenty wireless sensor networks and thousands of sensors, which will be deployed in the American River basin (5000 sq. km) of California. The core component of our system is known as a mote, a tiny, ultra-low-power, embedded wireless computer that can be used for any number of sensing applications. Our new generation of motes is equipped with IPv6 functionality, effectively giving each sensor in the field its own unique IP address, thus permitting users to remotely interact with the devices without going through intermediary services. Thirty to fifty motes will be deployed across 1-2 square kilometer regions to form a mesh-based wireless sensor network. Redundancy of local wireless links will ensure that data will always be able to traverse the network, even if harsh wintertime conditions adversely affect some network nodes. These networks will be used to develop spatial estimates of a number of hydrologic parameters, focusing especially on snowpack. Each wireless sensor network has one main network controller, which is responsible for interacting with an embedded Linux computer to relay information across higher-powered, long-range wireless links (cell modems, satellite, WiFi) to neighboring networks and remote, offsite servers. The network controller is also responsible for providing an Internet connection to each mote. Data collected by the sensors can either be read directly by remote hosts, or stored on centralized servers for future access. With 20 such networks deployed in the American River, our system will comprise an unprecedented cyber-physical architecture for measuring hydrologic parameters in large-scale basins. The spatiotemporal density and real-time nature of the data are also expected to significantly improve operational hydrology and water resource management in the basin.
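To illustrate what per-mote IPv6 addressability buys in practice, a remote host can query a single sensor directly with nothing more than a UDP socket. The address (a documentation prefix), port, request string and payload format below are all hypothetical; the deployment's real protocol is not described in the abstract:

    import socket

    MOTE = ("2001:db8::42", 6000)   # hypothetical mote address and port

    def read_sensor(request=b"GET snow_depth_cm", timeout=5.0):
        with socket.socket(socket.AF_INET6, socket.SOCK_DGRAM) as sock:
            sock.settimeout(timeout)
            sock.sendto(request, MOTE)
            payload, _addr = sock.recvfrom(1024)
            return float(payload.decode())   # e.g. b"42.5" -> 42.5

    print(read_sensor())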
Sheehan, Emma V; Stevens, Timothy F; Attrill, Martin J
2010-12-29
Following governments' policies to tackle global climate change, the development of offshore renewable energy sites is likely to increase substantially over coming years. All such developments interact with the seabed to some degree and so a key need exists for suitable methodology to monitor the impacts of large-scale Marine Renewable Energy Installations (MREIs). Many of these will be situated on mixed or rocky substrata, where conventional methods to characterise the habitat are unsuitable. Traditional destructive sampling is also inappropriate in conservation terms, particularly as safety zones around (MREIs) could function as Marine Protected Areas, with positive benefits for biodiversity. Here we describe a technique developed to effectively monitor the impact of MREIs and report the results of its field testing, enabling large areas to be surveyed accurately and cost-effectively. The methodology is based on a high-definition video camera, plus LED lights and laser scale markers, mounted on a "flying array" that maintains itself above the seabed grounded by a length of chain, thus causing minimal damage. Samples are taken by slow-speed tows of the gear behind a boat (200 m transects). The HD video and randomly selected frame grabs are analysed to quantify species distribution. The equipment was tested over two years in Lyme Bay, UK (25 m depth), then subsequently successfully deployed in demanding conditions at the deep (>50 m) high-energy Wave Hub site off Cornwall, UK, and a potential tidal stream energy site in Guernsey, Channel Islands (1.5 ms⁻¹ current), the first time remote samples from such a habitat have been achieved. The next stage in the monitoring development process is described, involving the use of Remote Operated Vehicles to survey the seabed post-deployment of MREI devices. The complete methodology provides the first quantitative, relatively non-destructive method for monitoring mixed-substrate benthic communities beneath MPAs and MREIs pre- and post-device deployment.
NASA Technical Reports Server (NTRS)
Fernandez, Juan M.
2017-01-01
State-of-the-art deployable structures are mainly designed for medium- to large-size satellites. The lack of reliable deployable structural systems for low-cost, small-volume, rideshare-class spacecraft severely constrains the potential for using small satellite platforms for affordable deep space science and exploration precursor missions that could be realized with solar sails. There is thus a need for reliable, lightweight, high-packaging-efficiency deployable booms that can serve as the supporting structure for a wide range of small satellite systems, including solar sails for propulsion. The National Aeronautics and Space Administration (NASA) is currently investing in the development of a new class of advanced deployable shell-based composite booms to support future deep space small satellite missions using solar sails. The concepts are being designed to: meet the unique requirements of small satellites, maximize ground testability, permit the use of low-cost manufacturing processes that will benefit scalability, be scalable for use as elements of hierarchical structures (e.g. trusses), allow long duration storage, have high deployment reliability, and have controlled deployment behavior and predictable deployed dynamics. This paper presents the various rollable boom concepts being developed, using so-called High Strain Composite (HSC) materials, for 5-20 m class deployable structures that include solar sails. The deployable composite booms presented are being developed to expand the portfolio of available rollable booms for small satellites and to maximize their length for a given packaged volume. Given that solar sails are a prime example of volume and mass optimization, the booms were designed to comply with nominal solar sail system requirements for 6U CubeSats, which are a good compromise between those of smaller form factors (1U, 2U and 3U CubeSats) and larger ones (12U and 27U future CubeSats, and ESPA-class microsatellites). Solar sail missions for such composite boom systems are already under consideration and development at NASA, as are mission studies that will benefit from planned scaled-up versions of the composite boom technologies to be introduced. The paper presents ongoing research and development of thin-shell rollable composite booms designed under the particularly stringent and challenging system requirements of relatively large solar sails housed on small satellites. These requirements are derived and listed. Several new boom concepts are proposed and other existing ones are improved upon using thin-ply composite materials to yield unprecedentedly compact deployable structures. For every boom introduced, the scalable fabrication process developed to keep the overall boom system cost down is shown. Finally, the initial results of purposely designed boom structural characterization test methods with gravity off-loading are presented to compare structural performance under expected and general load cases.
NASA Astrophysics Data System (ADS)
Yang, Junnan; Li, Xiaoyuan; Peng, Wei; Wagner, Fabian; Mauzerall, Denise L.
2018-06-01
Solar photovoltaic (PV) electricity generation can greatly reduce both air pollutant and greenhouse gas emissions compared to fossil fuel electricity generation. The Chinese government plans to greatly scale up solar PV installation between now and 2030. However, different PV development pathways will influence the range of air quality and climate benefits. Benefits depend on how much electricity generated from PV is integrated into power grids and the type of power plant displaced. Using a coal-intensive power sector projection as the base case, we estimate the climate, air quality, and related human health benefits of various 2030 PV deployment scenarios. We use the 2030 government goal of 400 GW installed capacity but vary the location of PV installation and the extent of inter-provincial PV electricity transmission. We find that deploying distributed PV in the east with inter-provincial transmission maximizes potential CO2 reductions and air quality-related health benefits (4.2% and 1.2% decrease in national total CO2 emissions and air pollution-related premature deaths compared to the base case, respectively). Deployment in the east with inter-provincial transmission results in the largest benefits because it maximizes displacement of the dirtiest coal-fired power plants and minimizes PV curtailment, which is more likely to occur without inter-provincial transmission. We further find that the maximum co-benefits achieved with deploying PV in the east and enabling inter-provincial transmission are robust under various maximum PV penetration levels in both provincial and regional grids. We find large potential benefits of policies that encourage distributed PV deployment and facilitate inter-provincial PV electricity transmission in China.
Autonomous Sensors for Large Scale Data Collection
NASA Astrophysics Data System (ADS)
Noto, J.; Kerr, R.; Riccobono, J.; Kapali, S.; Migliozzi, M. A.; Goenka, C.
2017-12-01
Presented here is a novel implementation of a "Doppler imager" which remotely measures winds and temperatures of the neutral background atmosphere at ionospheric altitudes of 87-300 km and possibly above. It incorporates recent optical manufacturing developments, modern network awareness, and machine learning techniques for intelligent self-monitoring and data classification. This system achieves cost savings in manufacturing, deployment and lifetime operating costs. Deployed in both ground- and space-based modalities, this cost-disruptive technology will allow computer models of ionospheric variability and other space weather models to operate with higher precision. Other sensors can be folded into the data collection and analysis architecture easily, creating autonomous virtual observatories. A prototype version of this sensor has recently been deployed in Trivandrum, India for the Indian Government. This Doppler imager is capable of operation even within the restricted CubeSat environment. The CubeSat bus offers a very challenging environment, even for small instruments: tight size, weight, and power (SWaP) constraints and a demanding thermal environment call for a new generation of instruments, and the Doppler imager presented is well suited to this environment. Concurrent with this CubeSat development is the development and construction of ground-based arrays of inexpensive sensors using the proposed technology. This instrument could be flown inexpensively on one or more CubeSats to provide valuable data to space weather forecasters and ionospheric scientists. Arrays of magnetometers have been deployed for the last 20 years [Alabi, 2005]. Other examples of ground-based arrays include an array of white-light all-sky imagers (THEMIS) deployed across Canada [Donovan et al., 2006], ocean sensors on buoys [McPhaden et al., 2010], and arrays of seismic sensors [Schweitzer et al., 2002]. A comparable array of Doppler imagers can be constructed and deployed on the ground to complement the CubeSat data.
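For reference, the measurement principle behind any such Doppler imager is standard airglow spectroscopy; the relations and the worked number below are generic physics, not instrument specifications:

    \[ v_{\mathrm{los}} = c\,\frac{\lambda_{\mathrm{obs}} - \lambda_0}{\lambda_0}, \qquad T \propto \left(\frac{\Delta\lambda_D}{\lambda_0}\right)^{2}, \]

where the line-of-sight wind v_los follows from the shift of an airglow emission line and the neutral temperature T from its Doppler broadening ΔλD. A 50 m/s wind shifts the 630.0 nm atomic-oxygen line by only about 1×10⁻⁴ nm, which is why these instruments require very high resolving power.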
2D wireless sensor network deployment based on Centroidal Voronoi Tessellation
NASA Astrophysics Data System (ADS)
Iliodromitis, Athanasios; Pantazis, George; Vescoukis, Vasileios
2017-06-01
In recent years, Wireless Sensor Networks (WSNs) have rapidly evolved and now comprise a powerful tool for monitoring and observing the natural environment, among other fields. WSNs are critical in early warning systems, which are of high importance today, and they are increasingly adopted in various applications, e.g. for fire or deformation detection. The optimum deployment of sensors is a multi-dimensional problem with two main components: the network approach and the positioning approach. Although much work has dealt with the issue, most of it emphasizes the network approach (communication, energy consumption) rather than the topography (positioning) of the sensors needed to achieve an ideal geometry. In some cases, it is hard or even impossible to achieve perfect geometry in node deployment. The ideal scenario of nodes arranged in a square or hexagonal grid would raise the cost of the network dramatically, especially in unfriendly or hostile environments. In such environments the positions of the sensors have to be chosen from a list of possible points, which in most cases are randomly distributed. This constraint has to be taken into consideration during WSN planning. Full geographical coverage is, in some applications, of the same if not greater importance than network coverage. Cost is a crucial factor in network planning, and given that resources are often limited, what matters is to cover the whole area with the minimum number of sensors. This paper suggests a deployment method for nodes in large-scale, high-density WSNs, based on Centroidal Voronoi Tessellation (CVT). It approximates the solution through the geometry of the random points and proposes a deployment plan, for the given characteristics of the study area, that achieves a deployment as near as possible to the ideal one.
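CVTs of the kind referenced above are commonly computed with Lloyd's algorithm. The minimal sketch below assumes free placement in a unit square (the paper instead restricts candidates to a random point set) and alternates nearest-sensor assignment with centroid updates:

```python
# Lloyd's algorithm for a CVT-like sensor layout; names and sizes illustrative.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
sensors = rng.random((30, 2))          # initial sensor positions
samples = rng.random((20000, 2))       # dense samples approximating the area

for _ in range(50):                    # Lloyd iterations
    owner = cKDTree(sensors).query(samples)[1]   # nearest sensor per sample
    for k in range(len(sensors)):                # move each sensor to the
        cell = samples[owner == k]               # centroid of its Voronoi cell
        if len(cell):
            sensors[k] = cell.mean(axis=0)
```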
Agana, Bernard A; Reeve, Darrell; Orbell, John D
2013-01-15
This study presents the application of an integrated water management strategy at two large Australian manufacturing companies that contrast in terms of their respective products. The integrated strategy, consisting of a water audit, pinch analysis, and membrane process application, was deployed in series to systematically identify water conservation opportunities. Initially, a water audit was deployed to completely characterize all water streams found at each production site. This led to the development of a water balance diagram which, together with water test results, served as a basis for subsequent enquiry. After the water audit, commercially available water pinch software was utilized to identify possible water reuse opportunities, some of which were subsequently implemented on site. Finally, utilizing a laboratory-scale test rig, membrane processes such as UF, NF and RO were evaluated for their suitability to treat the various wastewater streams. The membranes tested generally showed good contaminant rejection rates, slow flux decline rates, and low energy usage, and were well suited for treatment of specific wastewater streams. The synergy between the various components of this strategy has the potential to substantially reduce city water consumption and wastewater discharge across a diverse range of large manufacturing companies. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.
Deployable antenna phase A study
NASA Technical Reports Server (NTRS)
Schultz, J.; Bernstein, J.; Fischer, G.; Jacobson, G.; Kadar, I.; Marshall, R.; Pflugel, G.; Valentine, J.
1979-01-01
Applications for large deployable antennas were re-examined, flight demonstration objectives were defined, the flight article (antenna) was preliminarily designed, and the flight program and ground development program, including the support equipment, were defined for a proposed space transportation system flight experiment to demonstrate a large (50 to 200 meter) deployable antenna system. Tasks described include: (1) performance requirements analysis; (2) system design and definition; (3) orbital operations analysis; and (4) programmatic analysis.
ICE911 Research: Preserving and Rebuilding Reflective Ice
NASA Astrophysics Data System (ADS)
Field, L. A.; Chetty, S.; Manzara, A.; Venkatesh, S.
2014-12-01
We have developed a localized surface albedo modification technique that shows promise as a method to increase reflective multi-year ice using floating materials chosen to have low subsidiary environmental impact. It is now well known that multi-year reflective ice has diminished rapidly in the Arctic over the past three decades, and this plays a part in the continuing rapid decrease of summer-time ice. As summer-time bright ice disappears, the Arctic is losing its ability to reflect summer insolation, and this has widespread climatic effects as well as a direct effect on sea level rise, as oceans heat and once-land-based ice melts into the sea. We have tested the albedo modification technique on a small scale over six Winter/Spring seasons at sites including California's Sierra Nevada Mountains, a Canadian lake, and a small man-made lake in Minnesota, using various materials and an evolving array of instrumentation. The materials can float and can be made to minimize effects on marine habitat and species. The instrumentation is designed to be deployed in harsh and remote locations. Localized snow and ice preservation, and reductions in water heating, have been quantified in small-scale testing. We have continued to refine our material and deployment approaches, and we have had laboratory confirmation by NASA. In the field, the materials were successfully deployed to shield underlying snow and ice from melting; applications of granular materials remained stable in the face of local wind and storms. We are evaluating the effects of snow and ice preservation for protection of infrastructure and habitat stabilization, and we are concurrently developing our techniques to aid in water conservation. Localized albedo modification options such as those studied in this work may act to preserve ice, glaciers, permafrost, and seasonal snow areas, and perhaps aid natural ice formation processes. If this method is deployed on a large enough scale, it could conceivably reduce the ice-albedo feedback, slowing one of the key amplifiers of climate change.
Jacobson, Isabel G; Smith, Tyler C; Smith, Besa; Keel, Pamela K; Amoroso, Paul J; Wells, Timothy S; Bathalon, Gaston P; Boyko, Edward J; Ryan, Margaret A K
2009-02-15
The effect of military deployments to combat environments on disordered eating and weight changes is unknown. Using longitudinal data from Millennium Cohort Study participants who completed baseline (2001-2003) and follow-up (2004-2006) questionnaires (n=48,378), the authors investigated new-onset disordered eating and weight changes in a large military cohort. Multivariable logistic regression was used to compare these outcomes among those who deployed and reported combat exposures, those who deployed but did not report combat exposures, and those who did not deploy in support of the wars in Iraq and Afghanistan. Deployment was not significantly associated with new-onset disordered eating in women or men, after adjustment for baseline demographic, military, and behavioral characteristics. However, in subgroup comparison analyses of deployers, deployed women reporting combat exposures were 1.78 times more likely to report new-onset disordered eating (95% confidence interval: 1.02, 3.11) and 2.35 times more likely to lose 10% or more of their body weight compared with women who deployed but did not report combat exposures (95% confidence interval: 1.17, 4.70). Despite no significant overall association between deployment and disordered eating and weight changes, deployed women reporting combat exposures represent a subgroup at higher risk for developing eating problems and weight loss.
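For readers unfamiliar with the method, a multivariable logistic regression of the kind used above can be sketched as follows; the data file, column names, and covariates are hypothetical stand-ins, not the Millennium Cohort's actual variables.

```python
# Hedged sketch: odds ratios with 95% CIs from a multivariable logit model.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("cohort.csv")  # hypothetical analysis file
X = sm.add_constant(df[["combat_deployed", "age", "service_branch_code"]])
fit = sm.Logit(df["new_onset_disordered_eating"], X).fit()

odds_ratios = np.exp(fit.params)     # OR per covariate
conf_int = np.exp(fit.conf_int())    # 95% CIs on the OR scale
print(odds_ratios, conf_int, sep="\n")
```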
The role of Natural Flood Management in managing floods in large scale basins during extreme events
NASA Astrophysics Data System (ADS)
Quinn, Paul; Owen, Gareth; ODonnell, Greg; Nicholson, Alex; Hetherington, David
2016-04-01
There is a strong evidence base showing the negative impacts of land use intensification and soil degradation in NW European river basins on hydrological response and on flood impact downstream. However, the ability to target zones of high runoff production, and the extent to which flood risk can be managed using nature-based flood management solutions, are less well known. A move to planting more trees and farming landscapes less intensively is part of natural flood management (NFM), and these methods suggest that flood risk can be managed in alternative and more holistic ways. So which local NFM methods should be used, where in a large-scale basin should they be deployed, and how does the flow propagate to any point downstream? More generally, how much intervention is needed, and will it compromise food production systems? If we are observing record levels of rainfall and flow, for example during Storm Desmond in December 2015 in the North West of England, what other flood management options are needed to complement our traditional defences in large basins in the future? In this paper we show examples of NFM interventions in the UK that have had an impact at local-scale sites. We demonstrate the impact of interventions at the local, sub-catchment (meso) scale and finally at the large scale. The tools used include observations, process-based models, and more generalised flood impact models. Issues of synchronisation and the design level of protection are debated. By reworking observed rainfall and discharge (runoff) for observed extreme events in the River Eden and River Tyne during Storm Desmond, we show how much flood protection is needed in large-scale basins. The research thus poses a number of key questions as to how floods may have to be managed in large-scale basins in the future. We support a method of catchment systems engineering that holds water back across the whole landscape as a major opportunity to manage water in large-scale basins in the future. The broader benefits of engineering landscapes to hold water, for pollution control, sediment loss and drought minimisation, are also shown.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giangrande, Scott E.; Feng, Zhe; Jensen, Michael P.
Routine cloud, precipitation and thermodynamic observations collected by the Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF) and Aerial Facility (AAF) during the 2-year US Department of Energy (DOE) ARM Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) campaign are summarized. These observations quantify the diurnal to large-scale thermodynamic regime controls on the clouds and precipitation over the undersampled, climatically important Amazon basin region. The extended ground deployment of cloud-profiling instrumentation enabled a unique look at multiple cloud regimes at high temporal and vertical resolution. This longer-term ground deployment, coupled with two short-term aircraft intensive observing periods, allowed new opportunities to better characterize cloud and thermodynamic observational constraints as well as cloud radiative impacts for modeling efforts within typical Amazon wet and dry seasons.
Molecular Simulations of Graphene-Based Electric Double-Layer Capacitors
NASA Astrophysics Data System (ADS)
Kalluri, Raja K.; Konatham, Deepthi; Striolo, Alberto
2011-03-01
Toward deploying renewable energy sources, it is crucial to develop efficient and cost-effective technologies to store electricity. Traditional batteries are plagued by a number of practical problems that at present limit their widespread applicability. One possible solution is represented by electric double-layer capacitors (EDLCs). To deploy EDLCs at large scale it is necessary to better understand how electrolytes pack and diffuse within narrow charged pores. We present here simulation results for concentrated aqueous solutions of NaCl, CsCl, and NaI confined within charged graphene-based porous materials. We discuss how the structure of confined water, the salt concentration, the ion size, and the surface charge density determine the accumulation of electrolytes within the porous network. Our results, compared to data available for bulk systems, are critical for relating macroscopic observations to molecular-level properties of the confined working fluids. Research supported by the Department of Energy.
Wind Turbines in the Built Environment: Summary of a Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tinnesand, Heidi; Baring-Gould, Ian; Fields, Jason
2016-09-28
Built-environment wind turbine (BEWT) projects are wind energy projects that are constructed on, in, or near buildings. These projects present an opportunity for distributed, low-carbon generation combined with highly visible statements on sustainability, but the BEWT niche of the wind industry is still developing and is relatively less mature than the utility-scale wind or conventional ground-based distributed wind sectors. The findings presented here cannot be extended to wind energy deployments in general because of the large difference in application and technology maturity. This presentation summarizes the results of a report investigating the current state of the BEWT industry by reviewing available literature on BEWT projects as well as interviewing project owners on their experiences deploying and operating the technology. The authors generated a series of case studies that outlines the pertinent project details, project outcomes, and lessons learned.
Secure Cryptographic Key Management System (CKMS) Considerations for Smart Grid Devices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abercrombie, Robert K; Sheldon, Frederick T; Aldridge, Hal
2011-01-01
In this paper, we examine some unique challenges associated with key management in the Smart Grid and concomitant research initiatives: 1) effectively model security requirements and their implementations, and 2) manage keys and key distribution for very large scale deployments such as Smart Meters over a long period of performance. This will set the stage to: 3) develop innovative, low cost methods to protect keying material, and 4) provide high assurance authentication services. We will present our perspective on key management and will discuss some key issues within the life cycle of a cryptographic key designed to achieve the following: 1) control systems designed, installed, operated, and maintained to survive an intentional cyber assault with no loss of critical function, and 2) widespread implementation of methods for secure communication between remote access devices and control centers that are scalable and cost-effective to deploy.
Chip-based quantum key distribution
NASA Astrophysics Data System (ADS)
Sibson, P.; Erven, C.; Godfrey, M.; Miki, S.; Yamashita, T.; Fujiwara, M.; Sasaki, M.; Terai, H.; Tanner, M. G.; Natarajan, C. M.; Hadfield, R. H.; O'Brien, J. L.; Thompson, M. G.
2017-02-01
Improvement in secure transmission of information is an urgent need for governments, corporations and individuals. Quantum key distribution (QKD) promises security based on the laws of physics and has rapidly grown from proof-of-concept to robust demonstrations and deployment of commercial systems. Despite these advances, QKD has not been widely adopted, and large-scale deployment will likely require chip-based devices for improved performance, miniaturization and enhanced functionality. Here we report low error rate, GHz clocked QKD operation of an indium phosphide transmitter chip and a silicon oxynitride receiver chip--monolithically integrated devices using components and manufacturing processes from the telecommunications industry. We use the reconfigurability of these devices to demonstrate three prominent QKD protocols--BB84, Coherent One Way and Differential Phase Shift--with performance comparable to state-of-the-art. These devices, when combined with integrated single photon detectors, pave the way for successfully integrating QKD into future telecommunications networks.
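Of the three protocols demonstrated, BB84 is the simplest to sketch. The toy simulation below shows only basis sifting and QBER estimation over an idealized noisy channel; it is not a model of the chip hardware.

```python
# Idealized BB84 sifting and QBER estimate (no eavesdropper modeled).
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
alice_bits  = rng.integers(0, 2, n)
alice_basis = rng.integers(0, 2, n)          # 0 = rectilinear, 1 = diagonal
bob_basis   = rng.integers(0, 2, n)

# Bob recovers Alice's bit when bases match; otherwise his outcome is random.
match = alice_basis == bob_basis
bob_bits = np.where(match, alice_bits, rng.integers(0, 2, n))

# Channel noise: flip a small fraction of Bob's bits.
flip = rng.random(n) < 0.01
bob_bits ^= flip.astype(bob_bits.dtype)

sifted_a, sifted_b = alice_bits[match], bob_bits[match]
qber = np.mean(sifted_a != sifted_b)
print(f"sifted key length: {match.sum()}, QBER: {qber:.3f}")
```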
Advances in on-line drinking water quality monitoring and early warning systems.
Storey, Michael V; van der Gaag, Bram; Burns, Brendan P
2011-01-01
Significant advances have been made in recent years in technologies to monitor drinking water quality for source water protection, treatment operations, and distribution system management, in the event of accidental (or deliberate) contamination. Reports prepared through the Global Water Research Coalition (GWRC) and United States Environment Protection Agency (USEPA) agree that while many emerging technologies show promise, they are still some years from being deployed on a large scale. Further underpinning their viability is a need to interpret data in real time and implement a management strategy in response. This review presents the findings of an international study into the state of the art in this field. These results are based on visits to leading water utilities, research organisations and technology providers throughout Europe, the United States and Singapore involved in the development and deployment of on-line monitoring technology for the detection of contaminants in water. Copyright © 2010 Elsevier Ltd. All rights reserved.
Accidental Turbulent Discharge Rate Estimation from Videos
NASA Astrophysics Data System (ADS)
Ibarra, Eric; Shaffer, Franklin; Savaş, Ömer
2015-11-01
A technique to estimate the volumetric discharge rate in accidental oil releases using high-speed video streams is described. The essence of the method is similar to PIV processing; however, the cross correlation is carried out on the visible features of the efflux, which are usually turbulent, opaque, and immiscible. The key step in the process is to perform a pixelwise time filtering on the video stream, in which the filter parameters are commensurate with the scales of the large eddies. The velocity field extracted from the shell of visible features is then used to construct an approximate velocity profile within the discharge. The technique has been tested on laboratory experiments using both water and oil jets at Re ~ 10^5. The technique is accurate to within 20%, which is sufficient for initial responders to deploy adequate resources for containment. The software package requires minimal user input and is intended for deployment on an ROV in the field. Supported by DOI via NETL.
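A minimal sketch of the two core steps described above, pixelwise temporal filtering followed by patch cross-correlation between frames, might look as follows; array shapes, the filter window, and patch size are illustrative assumptions rather than the authors' implementation.

```python
# Time-filter a video stack, then cross-correlate one patch between frames.
import numpy as np
from scipy.ndimage import uniform_filter1d
from scipy.signal import correlate2d

def feature_velocity(video, dt, win=9, patch=32):
    """video: (nframes, ny, nx) grayscale stack; returns (vx, vy) in px/s."""
    # Time-filter each pixel over a window sized to the large-eddy scale.
    smooth = uniform_filter1d(video.astype(float), size=win, axis=0)
    a = smooth[0, :patch, :patch] - smooth[0, :patch, :patch].mean()
    b = smooth[win, :patch, :patch] - smooth[win, :patch, :patch].mean()
    corr = correlate2d(b, a, mode="same")
    dy, dx = np.unravel_index(corr.argmax(), corr.shape)
    dy -= patch // 2                     # offsets relative to zero shift
    dx -= patch // 2
    return dx / (win * dt), dy / (win * dt)
```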
DOE Office of Scientific and Technical Information (OSTI.GOV)
Travis, Bryan; Sauer, Jeremy; Dubey, Manvendra
2017-02-24
FIGS is neural network software that ingests real-time synchronized field data on environmental flow fields, turbulence, and gas concentration variations at high frequency and uses an error minimization algorithm to locate the gas source and quantify its strength. The software can be interfaced with atmospheric, oceanic and subsurface instruments on a variety of platforms, stationary or mobile (e.g. cars, UAVs, submersible vehicles or boreholes), and used to find gas sources by smart use of data and phenomenology. FIGS can be trained by a phenomenological model of the flow fields in the environment of interest and/or be calibrated by controlled release. After initial deployment the FIGS learning will grow with time as it accumulates data on source quantification. FIGS can be installed on any computer, from small BeagleBone boards for field deployment and end use to PCs, Macs, or mainframes for training and analysis. FIGS has been trained (using LANL's high resolution atmospheric simulations) and calibrated, tested and evaluated in the field, and shown to perform well in finding and quantifying methane leaks at 10-100 m scales at well pads by ingesting atmospheric measurements. The code is applicable to gas and particle source location at large scales.
Rodosta, T.; Litynski, J.; Plasynski, S.; Spangler, L.; Finley, R.; Steadman, E.; Ball, D.; Gerald, H.; McPherson, B.; Burton, E.; Vikara, D.
2011-01-01
The U.S. Department of Energy (DOE) is the lead federal agency for the development and deployment of carbon sequestration technologies. The Regional Carbon Sequestration Partnerships (RCSPs) are the mechanism DOE utilizes to prove the technology and to develop human capital, stakeholder networks, information for regulatory policy, best practices documents, and training to work toward the commercialization of carbon capture and storage (CCS). The RCSPs are tasked with determining the most suitable technologies, regulations, and infrastructure for carbon capture, transport, and storage in their respective geographic areas of responsibility. The seven partnerships include more than 400 state agencies, universities, national laboratories, private companies, and environmental organizations, spanning 43 states and four Canadian provinces. The Regional Partnerships Initiative is being implemented in three phases: Characterization, Validation, and Development. The initial Characterization Phase began in 2003, was completed in 2005, and focused on characterization of CO2 storage potential within each region. It was followed by the Validation Phase, which began in 2005 and is nearing completion in 2011. The focus of the Validation Phase has been on small-scale field tests throughout the seven partnerships in various formation types such as saline, oil-bearing, and coal seams. The Validation Phase has characterized suitable CO2 storage reservoirs and identified the need for comprehensive legal and regulatory frameworks to enable commercial-scale CCS deployment. Finally, the Development Phase will consist of a series of large-scale, one-million-ton injection tests throughout the United States and Canada. The objective of these large-scale tests is to identify the regulatory path and challenges in permitting CCS projects, to demonstrate that the technology can inject CO2 safely, and to verify its permanence in geologic formations in preparation for the commercialization of geologic sequestration. © 2010 Elsevier Ltd. All rights reserved. © 2011 Published by Elsevier Ltd.
Micro-sensors for in-situ meteorological measurements
NASA Technical Reports Server (NTRS)
Crisp, David; Kaiser, William J.; Vanzandt, Thomas R.; Tillman, James E.
1993-01-01
Improved in-situ meteorological measurements are needed for monitoring the weather and climate of the terrestrial and Martian atmospheres. We have initiated a program to assess the feasibility and utility of micro-sensors for precise in-situ meteorological measurements in these environments. Sensors are being developed for measuring pressure, temperature, wind velocity, humidity, and aerosol amounts. Silicon micro-machining and large scale integration technologies are being used to make sensors that are small, rugged, lightweight, and require very little power. Our long-term goal is to develop very accurate miniaturized sensors that can be incorporated into complete instrument packages or 'micro weather stations,' and deployed on a variety of platforms. If conventional commercially available silicon production techniques can be used to fabricate these sensor packages, it will eventually be possible to mass-produce them at low cost. For studies of the Earth's troposphere and stratosphere, they could be deployed on aircraft, dropsondes, radiosondes, or autonomous surface stations at remote sites. Improved sensor accuracy and reduced sensor cost are the primary challenges for these applications. For studies of the Martian atmosphere, these sensor packages could be incorporated into the small entry probes and surface landers that are being planned for the Mars Environmental SURvey (MESUR) Mission. That decade-long program will deploy a global network of small stations on the Martian surface for monitoring meteorological and geological processes. Low mass, low power, durability, large dynamic range and calibration stability are the principal challenges for this application. Our progress on each of these sensor types is presented.
Experiences with the ALICE Mesos infrastructure
NASA Astrophysics Data System (ADS)
Berzano, D.; Eulisse, G.; Grigoraş, C.; Napoli, K.
2017-10-01
Apache Mesos is a resource management system for large data centres, initially developed by UC Berkeley and now maintained under the Apache Foundation umbrella. It is widely used in industry by companies like Apple, Twitter, and Airbnb, and it is known to scale to tens of thousands of nodes. Together with other tools of its ecosystem, such as Mesosphere Marathon or Metronome, it provides an end-to-end solution for datacenter operations and a unified way to exploit large distributed systems. We present the experience of the ALICE Experiment Offline & Computing in deploying and using in production the Apache Mesos ecosystem for a variety of tasks on a small 500-core cluster, using hybrid OpenStack and bare metal resources. We initially introduce the architecture of our setup and its operation; we then describe the tasks it performs, including release building and QA, release validation, and simple Monte Carlo production. We show how we developed Mesos-enabled components (called "Mesos Frameworks") to carry out ALICE-specific needs. In particular, we illustrate our effort to integrate Work Queue, a lightweight batch processing engine developed by the University of Notre Dame, which ALICE uses to orchestrate release validation. Finally, we give an outlook on how to use Mesos as a resource manager for DDS, a software deployment system developed by GSI which will be the foundation of system deployment for ALICE's next generation Online-Offline (O2) system.
Spatio-temporal Eigenvector Filtering: Application on Bioenergy Crop Impacts
NASA Astrophysics Data System (ADS)
Wang, M.; Kamarianakis, Y.; Georgescu, M.
2017-12-01
A suite of 10-year ensemble-based simulations was conducted to investigate the hydroclimatic impacts of large-scale deployment of perennial bioenergy crops across the continental United States. Given the large size of the simulated dataset (about 60 TB), traditional hierarchical spatio-temporal statistical modelling cannot be implemented for the evaluation of physics parameterizations and biofuel impacts. In this work, we propose a filtering algorithm that takes into account the spatio-temporal autocorrelation structure of the data while avoiding spatial confounding. This method is used to quantify the robustness of simulated hydroclimatic impacts associated with bioenergy crops to alternative physics parameterizations and observational datasets. Results are evaluated against those obtained from three alternative Bayesian spatio-temporal specifications.
Piezoelectric Polymers Actuators for Precise Shape Control of Large Scale Space Antennas
NASA Technical Reports Server (NTRS)
Chen, Qin; Natale, Don; Neese, Bret; Ren, Kailiang; Lin, Minren; Zhang, Q. M.; Pattom, Matthew; Wang, K. W.; Fang, Houfei; Im, Eastwood
2007-01-01
Extremely large, lightweight, in-space deployable active and passive microwave antennas are demanded by future space missions. This paper investigates the development of PVDF-based piezopolymer actuators for controlling the surface accuracy of a membrane reflector. Uniaxially stretched PVDF films were poled using an electrodeless method, which yielded the high-quality poled piezofilms required for this application. To further improve the piezoelectric performance, several PVDF-based copolymers were examined. One of them exhibits a nearly threefold improvement in in-plane piezoresponse compared with PVDF and P(VDF-TrFE) piezopolymers. Preliminary experimental results indicate that these flexible actuators are very promising for precisely controlling the shape of space reflectors.
U.S. Department of Energy's Regional Carbon Sequestration Partnership Program: Overview
Litynski, J.; Plasynski, S.; Spangler, L.; Finley, R.; Steadman, E.; Ball, D.; Nemeth, K.J.; McPherson, B.; Myer, L.
2009-01-01
The U.S. Department of Energy (DOE) has formed a nationwide network of seven regional partnerships to help determine the best approaches for capturing and permanently storing gases that can contribute to global climate change. The Regional Carbon Sequestration Partnerships (RCSPs) are tasked with determining the most suitable technologies, regulations, and infrastructure for carbon capture, transport, and storage in their areas of the country and parts of Canada. The seven partnerships include more than 350 state agencies, universities, national laboratories, private companies, and environmental organizations, spanning 42 states, two Indian nations, and four Canadian provinces. The Regional Partnerships initiative is being implemented in three phases. (1) Characterization Phase (2003-2005): the objective was to collect data on CO2 sources and sinks and develop the human capital to support and enable future carbon sequestration field tests and deployments. The completion of this phase was marked by release of the Carbon Sequestration Atlas of the United States and Canada (Version 1), which included a common methodology for capacity assessment and reported over 3,000 Gt of storage capacity in saline formations, depleted oil and gas fields, and coal seams. (2) Validation Phase (2005-2009): the objective is to plan and implement small-scale (<1 million tons CO2) field testing of storage technologies in areas determined to be favorable for carbon storage. The partnerships are currently conducting over 20 small-scale geologic field tests and 11 terrestrial field tests. (3) Development Phase (2008-2018): the primary objective is the development of large-scale (>1 million tons of CO2) Carbon Capture and Storage (CCS) projects, which will demonstrate that large volumes of CO2 can be injected safely, permanently, and economically into geologic formations representative of large storage capacity. Even though the RCSP Program is being implemented in three phases, it should be viewed as an integrated whole, with many of the goals and objectives transitioning from one phase to the next. Accomplishments and results from the Characterization Phase have helped to refine goals and activities in the Validation and Development Phases. The RCSP Program encourages and requires open information sharing among its members by sponsoring both general workshops and meetings to facilitate information exchange. Although each RCSP has its own objectives and field tests, mutual cooperation has been an important part of the Program thus far. The primary goal of the RCSP initiative is to promote the development of a regional framework and the infrastructure necessary to validate and deploy carbon sequestration technologies within each Partnership's region. © 2009 Elsevier Ltd. All rights reserved.
Režek Jambrak, Anet; Šimunek, Marina; Grbeš, Franjo; Mandura, Ana; Djekic, Ilija
2018-04-01
The objective of this paper was to demonstrate the application of quality function deployment (QFD) in analysing the effects of high-power ultrasound on quality properties of apple juices and nectars. To develop the QFD model, alongside instrumental analysis of treated samples, a field survey was performed to identify consumer preferences regarding the quality characteristics of juices and nectars. Based on the field research, the most important characteristics were 'taste' and 'aroma', each with 28.5% relative absolute weight importance, followed by 'odour' (16.9%). The QFD model showed that the top three 'quality scores' for apple juice were the treatments with amplitude 90 µm, 9 min treatment time, and sample temperature 40 °C; 60 µm, 9 min, 60 °C; and 90 µm, 6 min, 40 °C. For nectars, the top three were the treatments 120 µm, 9 min, 20 °C; 60 µm, 9 min, 60 °C; and A2.16 60 µm, 9 min, 20 °C. This type of quality model enables a composite measure over a large set of different quality parameters. Its simplicity should be understood as its practical advantage, and as such, this tool can be part of design quality when using novel preservation technologies. © 2017 Society of Chemical Industry.
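The 'quality score' ranking described above amounts to a weighted sum of characteristic ratings. The toy computation below uses the survey weights quoted in the abstract but entirely hypothetical panel ratings.

```python
# QFD-style weighted quality score; treatment ratings are invented examples.
weights = {"taste": 0.285, "aroma": 0.285, "odour": 0.169}   # from the survey
ratings = {                                                   # hypothetical 1-5 scores
    "90um_9min_40C": {"taste": 4.6, "aroma": 4.4, "odour": 4.1},
    "60um_9min_60C": {"taste": 4.5, "aroma": 4.3, "odour": 4.2},
}
for name, r in ratings.items():
    score = sum(weights[k] * r[k] for k in weights)
    print(name, round(score, 3))
```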
Policies to Support Wind Power Deployment: Key Considerations and Good Practices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cox, Sadie; Tegen, Suzanne; Baring-Gould, Ian
2015-05-19
Policies have played an important role in scaling up wind deployment and increasing its economic viability while also supporting country-specific economic, social, and environmental development goals. Although wind power has become cost-competitive in several contexts, challenges to wind power deployment remain. Within the context of country-specific goals and challenges, policymakers are seeking
Development of a verification program for deployable truss advanced technology
NASA Technical Reports Server (NTRS)
Dyer, Jack E.
1988-01-01
Use of large deployable space structures to satisfy the growth demands of space systems is contingent upon reducing the associated risks that pervade many related technical disciplines. The overall objectives of this program were to develop a detailed plan to verify deployable truss advanced technology applicable to future large space structures and to develop a preliminary design of a deployable truss reflector/beam structure for use as a technology demonstration test article. The planning is based on a Shuttle flight experiment program using deployable 5- and 15-meter-aperture tetrahedral truss reflectors and a 20-m-long deployable truss beam structure. The plan addresses validation of analytical methods, the degree to which ground testing adequately simulates flight, and in-space testing requirements for large precision antenna designs. Based on an assessment of future NASA and DOD space system requirements, the program was developed to verify four critical technology areas: deployment, shape accuracy and control, pointing and alignment, and articulation and maneuvers. The flight experiment technology verification objectives can be met using two Shuttle flights, with the total experiment integrated on a single Shuttle Test Experiment Platform (STEP) and a Mission Peculiar Experiment Support Structure (MPESS). First flight of the experiment can be achieved 60 months after go-ahead, with a total program duration of 90 months.
Test Frame for Gravity Offload Systems
NASA Technical Reports Server (NTRS)
Murray, Alexander R.
2005-01-01
Advances in space telescope and aperture technology have created a need to launch larger structures into space. Traditional truss structures will be too heavy and bulky to be effectively used in the next generation of space-based structures. Large deployable structures are a possible solution: by packaging deployable trusses, the cargo volume of these large structures decreases greatly. The ultimate goal is to measure a boom's deployment in three dimensions in simulated microgravity. This project outlines the construction of the test frame that supports a gravity offload system. The test frame is stable enough to hold the gravity offload system and does not interfere with deployment of, or vibrations in, the deployable test boom. The natural frequencies and stability of the frame were engineered in FEMAP; the frame was designed so that its natural frequencies would not match the first two modes of the deployable beam. The frame was then modeled in SolidWorks and constructed. The completed test frame is a stable base for studies of deployable structures.
Recent developments in deployment analysis simulation using a multi-body computer code
NASA Technical Reports Server (NTRS)
Housner, Jerrold M.
1989-01-01
Deployment is a candidate mode for construction of structural space system components. By its very nature, deployment is a dynamic event, often involving large-angle unfolding of flexible beam members. Validation of proposed designs and conceptual deployment mechanisms is enhanced through analysis. Analysis may be used to determine member loads, thus helping to establish deployment rates and deployment control requirements for a given concept. Furthermore, member flexibility, joint free-play, manufacturing tolerances, and imperfections can affect the reliability of deployment. Analyses which include these effects can aid in reducing risks associated with a particular concept. Ground tests, which can play a role similar to that of analyses, are difficult and expensive to perform. Suspension systems for vibration ground tests of large space structures in a 1 g environment alone present many challenges; suspension of a structure which spatially expands is even more challenging. Analysis validation through experimental confirmation on relatively small, simple models would permit analytical extrapolation to larger, more complex space structures.
Industrial biomanufacturing: The future of chemical production.
Clomburg, James M; Crumbley, Anna M; Gonzalez, Ramon
2017-01-06
The current model for industrial chemical manufacturing employs large-scale megafacilities that benefit from economies of unit scale. However, this strategy faces environmental, geographical, political, and economic challenges associated with energy and manufacturing demands. We review how exploiting biological processes for manufacturing (i.e., industrial biomanufacturing) addresses these concerns while also supporting and benefiting from economies of unit number. Key to this approach is the inherent small scale and capital efficiency of bioprocesses and the ability of engineered biocatalysts to produce designer products at high carbon and energy efficiency with adjustable output, at high selectivity, and under mild process conditions. The biological conversion of single-carbon compounds represents a test bed to establish this paradigm, enabling rapid, mobile, and widespread deployment, access to remote and distributed resources, and adaptation to new and changing markets. Copyright © 2017, American Association for the Advancement of Science.
Harrington, Rebecca M.; Kwiatek, Grzegorz; Moran, Seth C.
2015-01-01
We analyze a group of 6073 low-frequency earthquakes recorded during a week-long temporary deployment of broadband seismometers at distances of less than 3 km from the crater at Mount St. Helens in September of 2006. We estimate the seismic moment (M0) and spectral corner frequency (f0) using a spectral ratio approach for events with a high signal-to-noise ratio (SNR) that have a cross-correlation coefficient of 0.8 or greater with at least five other events. A cluster analysis of cross-correlation values indicates that the group of 421 events meeting the SNR and cross-correlation criteria forms eight event families that exhibit largely self-similar scaling. We estimate the M0 and f0 values of the 421 events and calculate their static stress drop and scaled energy (ER/M0) values. The estimated values suggest self-similar scaling within families, as well as between five of eight families (i.e., M0 proportional to f0^-3 and ER/M0 constant). We speculate that differences in scaled energy values for the two families with variable scaling may result from a lack of resolution in the velocity model. The observation of self-similar scaling is the first of its kind for such a large group of low-frequency volcanic tectonic events occurring during a single active dome extrusion eruption.
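The spectral ratio approach can be sketched as fitting the ratio of two omega-square (Brune-type) source spectra to recover the moment ratio and both corner frequencies; the synthetic data and starting values below are illustrative, not the study's processing code.

```python
# Fit a Brune-model spectral ratio to recover corner frequencies.
import numpy as np
from scipy.optimize import curve_fit

def brune_ratio(f, lnM, f1, f2):
    # Ratio of two omega-square spectra: event1 / event2.
    return np.exp(lnM) * (1 + (f / f2) ** 2) / (1 + (f / f1) ** 2)

f = np.linspace(1, 50, 200)                        # frequency band, Hz
true = brune_ratio(f, np.log(30.0), 8.0, 20.0)     # synthetic "observed" ratio
obs = true * np.exp(0.05 * np.random.default_rng(2).standard_normal(f.size))

popt, _ = curve_fit(brune_ratio, f, obs, p0=(np.log(10.0), 5.0, 15.0))
print("moment ratio %.1f, f0(event1) %.1f Hz, f0(event2) %.1f Hz"
      % (np.exp(popt[0]), popt[1], popt[2]))
```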
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frew, Bethany; Mai, Trieu; Krishnan, Venkat
2016-12-01
In this study, we use the National Renewable Energy Laboratory's (NREL's) Regional Energy Deployment System (ReEDS) capacity expansion model to estimate utility-scale photovoltaic (UPV) deployment trends from present day through 2030. The analysis seeks to inform the U.S. Bureau of Land Management's (BLM's) planning activities related to UPV development on federal lands in Nevada as part of the Resource Management Plan (RMP) revision for the Las Vegas and Pahrump field offices. These planning activities include assessing the demand for new or expanded Solar Energy Zones (SEZs), per the process outlined in BLM's Western Solar Plan.
Development of the Aquarius Antenna Deployment Mechanisms and Spring/Damper Actuator
NASA Technical Reports Server (NTRS)
Johnson, Joel A.
2008-01-01
The Aquarius Instrument's large radar reflector dish needed to be stowed for launch, and then deployed on-orbit. The Deployment Subsystem consisted of a cantilevered boom structure and two single-axis hinge mechanisms to accurately deploy and position the reflector dish relative to the radar feed horns. The cantilevered design demanded high stiffness and accuracy from the deployment mechanism at the root of the boom. A preload-generating end-of-travel latch was also required. To largely eliminate the need for control systems, each deployment mechanism was actuated by a passive spring motor with viscous-fluid damping. Tough requirements and adaptation of a heritage actuator to the new application resulted in numerous challenges. Fabrication, assembly, and testing encountered additional problems, though ultimately the system was demonstrated very successfully. This paper revisits the development to highlight which design concepts worked and the many important lessons learned.
Application of a New Infrasound Sensor Technology in a Long Range Infrasound Propagation Experiment
NASA Astrophysics Data System (ADS)
Talmadge, C. L.; Waxler, R.; Hetzer, C. H.; Kleniert, D. E., Jr.; Dillion, K.; Assink, J.; Aydin, A.
2009-12-01
A low-cost, ruggedized infrasound sensor has been developed at the NCPA laboratory of the University of Mississippi for outdoor infrasound measurements. This sensor has performance characteristics similar to those of other "standard" infrasound sensors, such as the Chaparral 50. A total of 50 sensors were constructed for this experiment, of which 42 were deployed in the Nevada and Utah desert for a period of four months. A long-range infrasound propagation experiment using these sensors was performed during the summer and fall of 2009. Source sizes were 4, 20, and 80 equivalent tons of TNT. The blasts were typically carried out on the Monday of each week in the afternoon and were part of a scheduled demolition of first, second, and third stages of Trident missiles. In addition to a source-capture location 23 km south of the site of the blasts, eight 5-element arrays were located to the west of the blast location, at approximate ranges of 180 through 250 km in 10-km steps. Each array consisted of elements at -150 m, -50 m, 0 m, 50 m, and 150 m relative to the center of the array along an east-west direction, and all microphones were equipped with four 50-ft porous hoses connected to the microphone manifold for wind noise suppression. The signals from the microphones were digitized using GPS-synchronized, 24-bit DAQ systems. A westerly direction for the deployment of the microphones was motivated by the presence of a strong stratospheric duct that persists through the summer months in the northern hemisphere at these latitudes. In this paper, we discuss feasibility issues related to the design of the NCPA microphone that make deployments on these large scales possible. Signal-to-noise issues related to temperature and wind fluctuations are also discussed. Future plans include a larger-scale deployment of several hundred microphones during 2010. We discuss how the lessons learned from this series of measurements impact that future deployment.
Qvist, Staffan A; Brook, Barry W
2015-01-01
There is an ongoing debate about the deployment rates and composition of alternative energy plans that could feasibly displace fossil fuels globally by mid-century, as required to avoid the more extreme impacts of climate change. Here we demonstrate the potential for a large-scale expansion of global nuclear power to replace fossil-fuel electricity production, based on empirical data from the Swedish and French light water reactor programs of the 1960s to 1990s. Analysis of these historical deployments shows that if the world built nuclear power at no more than the per capita rate of these exemplar nations during their national expansion, then coal- and gas-fired electricity could be replaced worldwide in less than a decade. Under more conservative projections that take into account probable constraints and uncertainties, such as differing relative economic output across regions, current and past unit construction times and costs, future electricity demand growth forecasts, and the retiring of existing aging nuclear plants, our modelling estimates that the global share of fossil-fuel-derived electricity could be replaced within 25-34 years. This would allow the world to meet the most stringent greenhouse-gas mitigation targets.
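The core extrapolation is simple arithmetic: scale an exemplar nation's peak per-capita build rate to world population and divide fossil generation by the resulting annual additions. The numbers below are round illustrative assumptions, not the authors' inputs, but they reproduce the order of magnitude of the "less than a decade" result.

```python
# Back-of-envelope replacement timeline under assumed round numbers.
per_capita_rate_kw_yr = 0.04   # assumed kWe added per person per year at peak
world_pop = 7.3e9              # approximate 2015 population
capacity_factor = 0.85         # assumed nuclear fleet capacity factor
fossil_twh = 17_000 * 0.67     # assumed fossil share of ~17,000 TWh/yr demand

# kW * hours * capacity factor -> kWh/yr; divide by 1e9 for TWh/yr.
added_twh_per_yr = per_capita_rate_kw_yr * world_pop * capacity_factor * 8760 / 1e9
print(f"replacement time: {fossil_twh / added_twh_per_yr:.1f} years")
```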
Medical Informatics Education & Research in Greece
Chouvarda, I.
2015-01-01
Objectives: This paper aims to present an overview of the medical informatics landscape in Greece, to describe the Greek eHealth background, and to highlight the main education and research axes in medical informatics, along with activities, achievements and pitfalls. Methods: With respect to research and education, formal and informal sources were investigated and information was collected and presented in a qualitative manner, including quantitative indicators where possible. Results: Greece has adopted and applied medical informatics education in various ways, including undergraduate courses in health sciences schools as well as multidisciplinary postgraduate courses. There is a continuous research effort, and large participation in EU-wide initiatives, across the whole spectrum of medical informatics research, with notable scientific contributions, although technology maturation is not without barriers. Wide-scale deployment of eHealth is anticipated in the healthcare system in the near future. While ePrescription deployment has been an important step, ICT for integrated care and telehealth have a lot of room for further deployment. Conclusions: Greece is a valuable contributor in the European medical informatics arena, and has the potential to offer more as long as the barriers of research and innovation fragmentation are addressed and alleviated. PMID:26123910
Origami tubes assembled into stiff, yet reconfigurable structures and metamaterials.
Filipov, Evgueni T; Tachi, Tomohiro; Paulino, Glaucio H
2015-10-06
Thin sheets have long been known to experience an increase in stiffness when they are bent, buckled, or assembled into smaller interlocking structures. We introduce a unique orientation for coupling rigidly foldable origami tubes in a "zipper" fashion that substantially increases the system stiffness and permits only one flexible deformation mode through which the structure can deploy. The flexible deployment of the tubular structures is permitted by localized bending of the origami along prescribed fold lines. All other deformation modes, such as global bending and twisting of the structural system, are substantially stiffer because the tubular assemblages are overconstrained and the thin sheets become engaged in tension and compression. The zipper-coupled tubes yield an unusually large eigenvalue bandgap that represents the unique difference in stiffness between deformation modes. Furthermore, we couple compatible origami tubes into a variety of cellular assemblages that can enhance mechanical characteristics and geometric versatility, leading to a potential design paradigm for structures and metamaterials that can be deployed, stiffened, and tuned. The enhanced mechanical properties, versatility, and adaptivity of these thin sheet systems can provide practical solutions of varying geometric scales in science and engineering.
Sathre, Roger; Masanet, Eric
2012-09-04
To understand the long-term energy and climate implications of different implementation strategies for carbon capture and storage (CCS) in the US coal-fired electricity fleet, we integrate three analytical elements: scenario projection of energy supply systems, temporally explicit life cycle modeling, and time-dependent calculation of radiative forcing. Assuming continued large-scale use of coal for electricity generation, we find that aggressive implementation of CCS could reduce cumulative greenhouse gas emissions (CO2, CH4, and N2O) from the US coal-fired power fleet through 2100 by 37-58%. Cumulative radiative forcing through 2100 would be reduced by only 24-46%, due to the front-loaded time profile of the emissions and the long atmospheric residence time of CO2. The efficiency of energy conversion and carbon capture technologies strongly affects the amount of primary energy used but has little effect on greenhouse gas emissions or radiative forcing. Delaying implementation of CCS deployment significantly increases long-term radiative forcing. This study highlights the time-dynamic nature of potential climate benefits and energy costs of different CCS deployment pathways and identifies opportunities and constraints of successful CCS implementation.
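Time-dependent radiative forcing of this kind is typically obtained by convolving an emissions pathway with a CO2 impulse response function (IRF). The sketch below uses coefficients approximating a published IRF (Joos et al., 2013) and a crude forcing conversion; treat every constant as an assumption rather than the paper's model.

```python
# Convolve an annual emissions pathway with a multi-exponential CO2 IRF.
import numpy as np

a   = np.array([0.2173, 0.2240, 0.2824, 0.2763])   # IRF weights (approximate)
tau = np.array([np.inf, 394.4, 36.54, 4.304])      # e-folding times in years

years = np.arange(2020, 2101)
t = years - years[0]
irf = (a[:, None] * np.exp(-t[None, :] / tau[:, None])).sum(axis=0)

emissions = np.full(years.size, 1.0)                   # GtCO2/yr, flat pathway
airborne = np.convolve(emissions, irf)[: years.size]   # GtCO2 still airborne
# Rough marginal values: ~7.8 GtCO2 per ppm, ~0.0137 W/m2 per ppm.
forcing = airborne / 7.8 * 1.37e-2
print(f"forcing in 2100: {forcing[-1]:.3f} W/m2")
```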
Fault-Tolerant Algorithms for Connectivity Restoration in Wireless Sensor Networks.
Zeng, Yali; Xu, Li; Chen, Zhide
2015-12-22
As wireless sensor networks (WSNs) are often deployed in hostile environments, nodes in such networks are prone to large-scale failures that stop the network from working normally. In this case, an effective restoration scheme is needed to restore the faulty network in a timely manner. Most existing restoration schemes focus on the number of deployed nodes or on fault tolerance alone, but fail to take into account the fact that network coverage and topology quality are also important. To address this issue, we present two algorithms, named Full 2-Connectivity Restoration Algorithm (F2CRA) and Partial 3-Connectivity Restoration Algorithm (P3CRA), which restore a faulty WSN in different respects. F2CRA constructs a fan-shaped topology structure to reduce the number of deployed nodes, while P3CRA constructs a dual-ring topology structure to improve the fault tolerance of the network. F2CRA is suitable when the restoration cost is given priority, and P3CRA is suitable when network quality is considered first. Compared with other algorithms, these two algorithms ensure that the restored network has stronger fault tolerance, a larger coverage area, and better load balance.
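The connectivity targets these algorithms restore can be checked directly with standard graph tools. A minimal sketch, using a random geometric graph as a stand-in WSN topology:

```python
# Verify k-connectivity targets (k >= 2 for F2CRA, k >= 3 for P3CRA).
import networkx as nx

wsn = nx.random_geometric_graph(60, radius=0.25, seed=3)  # stand-in topology
k = nx.node_connectivity(wsn)   # min nodes whose removal disconnects the graph
print(f"network is {k}-connected")
print("meets F2CRA target (k >= 2):", k >= 2)
print("meets P3CRA target (k >= 3):", k >= 3)
```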
Self-Deployable Membrane Structures
NASA Technical Reports Server (NTRS)
Sokolowski, Witold M.; Willis, Paul B.; Tan, Seng C.
2010-01-01
Currently existing approaches for deployment of large, ultra-lightweight gossamer structures in space typically rely upon electromechanical mechanisms and mechanically expandable or inflatable booms for deployment and to maintain them in a fully deployed, operational configuration. These support structures, with the associated deployment mechanisms, launch restraints, inflation systems, and controls, can comprise more than 90 percent of the total mass budget. In addition, they significantly increase the stowage volume, cost, and complexity. A CHEM (cold hibernated elastic memory) membrane structure without any deployment mechanism or support booms is deployed using shape memory and elastic recovery. The use of CHEM micro-foams reinforced with carbon nanotubes is considered for thin-membrane structure applications. In this advanced structural concept, the CHEM membrane structure is warmed up to allow packaging and stowing prior to launch, and then cooled to induce hibernation of the internal restoring forces. In space, the membrane remembers its original shape and size when warmed up. After the internal restoring forces deploy the structure, it is then cooled to achieve rigidization. For this type of structure, solar radiation could be utilized as the heat energy for deployment and the space ambient temperature for rigidization. The overall simplicity of the CHEM self-deployable membrane is one of its greatest assets. In present approaches to space-deployable structures, stowage and deployment are difficult and challenging, and introduce significant risk, heavy mass, and high cost. The simple procedures provided by the CHEM membrane greatly simplify the overall end-to-end process for designing, fabricating, deploying, and rigidizing large structures. The CHEM membrane avoids the complexities associated with other methods for deploying and rigidizing structures by eliminating the deployable booms, deployment mechanisms, and inflation and control systems that can consume the majority of the mass budget.
Large-N Seismic Deployment at the Source Physics Experiment (SPE) Site
NASA Astrophysics Data System (ADS)
Chen, T.; Snelson, C. M.; Mellors, R. J.; Pitarka, A.
2015-12-01
The Source Physics Experiment (SPE) is a multi-institutional and multi-disciplinary project that consists of a series of chemical explosion experiments at the Nevada National Security Site. The goal of SPE is to understand the complicated effects of earth structures on source energy partitioning and seismic wave propagation, to develop and validate physics-based monitoring, and ultimately to better discriminate low-yield nuclear explosions from background seismicity. Deployment of a large number of seismic sensors is planned for SPE to image the full 3-D wavefield with about 500 three-component sensors and 500 vertical-component sensors. This large-N seismic deployment will operate near the site of the SPE-5 shot for about one month, recording the SPE-5 shot, ambient noise, and additional controlled sources. This presentation focuses on the design of the large-N seismic deployment. We show how we optimized the sensor layout based on the geological structure and experiment goals with a limited number of sensors. In addition, we will also show some preliminary record sections from the deployment. This work was conducted under Contract No. DE-AC52-06NA25946 with the U.S. Department of Energy.
Social Acceptance of Wind Energy: Managing and Evaluating Its Market Impacts (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baring-Gould, I.
2012-06-01
As with any industrial-scale technology, wind power has impacts. As wind technology deployment becomes more widespread, organized opposition will form, driven by fear of change and by competing energy technologies. As the easy-to-develop sites are built out, the costs of developing sites with deployment barriers will increase, thereby increasing the total cost of power. This presentation provides an overview of wind development stakeholders and related stakeholder engagement questions, Energy Department activities that provide wind project deployment information, and the quantification of deployment barriers and costs in the continental United States.
NDE application of ultrasonic tomography to a full-scale concrete structure.
Choi, Hajin; Popovics, John S
2015-06-01
Newly developed ultrasonic imaging technology for large concrete elements, based on tomographic reconstruction, is presented. The resulting 3-D internal images (velocity tomograms) are used to detect internal defects (polystyrene foam and pre-cracked concrete prisms) that represent structural damage within a large steel-reinforced concrete element. A hybrid air-coupled/contact transducer system is deployed: electrostatic air-coupled transducers generate ultrasonic energy, and contact accelerometers attached on the opposing side of the concrete element detect the ultrasonic pulses. The hybrid testing setup enables collection of a large amount of high-quality, through-thickness ultrasonic data without surface preparation of the concrete. The algebraic reconstruction technique is used to reconstruct P-wave velocity tomograms from the obtained time-signal data. A comparison with a one-sided ultrasonic imaging method is presented for the same specimen. Through-thickness tomography shows some benefit over one-sided imaging for highly reinforced concrete elements. The results demonstrate that the proposed through-thickness ultrasonic technique shows great potential for evaluation of full-scale concrete structures in the field.
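The algebraic reconstruction technique referenced above is, at its core, a Kaczmarz-style row-action solver for the ray equations. The following is a minimal sketch with a synthetic two-cell ray matrix, not the authors' implementation or data.

```python
import numpy as np

def art(A, t, n_iters=50, relax=0.25):
    """Algebraic reconstruction technique (Kaczmarz): iteratively project the
    slowness estimate s onto each ray equation A[i] @ s = t[i].
    A[i, j] = path length of ray i in cell j; t[i] = measured travel time."""
    s = np.zeros(A.shape[1])
    row_norms = (A ** 2).sum(axis=1)
    for _ in range(n_iters):
        for i in range(A.shape[0]):
            if row_norms[i] == 0:
                continue
            residual = t[i] - A[i] @ s
            s += relax * residual / row_norms[i] * A[i]
    return s

# Synthetic two-cell example: two rays crossing cells of known slowness.
A = np.array([[1.0, 1.0],    # ray 1 crosses both cells (1 m in each)
              [2.0, 0.0]])   # ray 2 crosses only cell 1 (2 m)
true_s = np.array([0.25, 0.50])   # slowness = 1/velocity, s/m
t = A @ true_s                    # noise-free synthetic travel times
print(art(A, t))                  # converges toward [0.25, 0.50]
```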
CORALINA: a universal method for the generation of gRNA libraries for CRISPR-based screening.
Köferle, Anna; Worf, Karolina; Breunig, Christopher; Baumann, Valentin; Herrero, Javier; Wiesbeck, Maximilian; Hutter, Lukas H; Götz, Magdalena; Fuchs, Christiane; Beck, Stephan; Stricker, Stefan H
2016-11-14
The bacterial CRISPR system is fast becoming the most popular genetic and epigenetic engineering tool due to its universal applicability and adaptability. The desire to deploy CRISPR-based methods in a large variety of species and contexts has created an urgent need for the development of easy, time- and cost-effective methods enabling large-scale screening approaches. Here we describe CORALINA (comprehensive gRNA library generation through controlled nuclease activity), a method for the generation of comprehensive gRNA libraries for CRISPR-based screens. CORALINA gRNA libraries can be derived from any source of DNA without the need of complex oligonucleotide synthesis. We show the utility of CORALINA for human and mouse genomic DNA, its reproducibility in covering the most relevant genomic features including regulatory, coding and non-coding sequences and confirm the functionality of CORALINA generated gRNAs. The simplicity and cost-effectiveness make CORALINA suitable for any experimental system. The unprecedented sequence complexities obtainable with CORALINA libraries are a necessary pre-requisite for less biased large scale genomic and epigenomic screens.
Eisenbach, Markus
2017-01-01
A major impediment to deploying next-generation high-performance computational systems is the required electrical power, often measured in units of megawatts. The solution to this problem is driving the introduction of novel machine architectures, such as those employing many-core processors and specialized accelerators. In this article, we describe the use of a hybrid accelerated architecture to achieve both reduced time to solution and the associated reduction in the electrical cost for a state-of-the-art materials science computation.
Transportation Big Data: Unbiased Analysis and Tools to Inform Sustainable Transportation Decisions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Today, transportation operation and energy systems data are generated at an unprecedented scale. The U.S. Department of Energy's National Renewable Energy Laboratory (NREL) is the go-to source for expertise in providing data and analysis to inform industry and government transportation decision making. The lab's teams of data experts and engineers are mining and analyzing large sets of complex data -- or 'big data' -- to develop solutions that support the research, development, and deployment of market-ready technologies that reduce fuel consumption and greenhouse gas emissions.
Development of deployable structures for large space platforms. Volume 2: Design development
NASA Technical Reports Server (NTRS)
Greenberg, H. S.
1983-01-01
Design evolution, test article design, test article mass properties, and structural analysis of deployable platform systems are discussed, along with orbit transfer vehicle (OTV) hangar development, OTV hangar concept selection, and manned module development. Deployable platform system requirements, the materials database, technology development needs, concept selection, and deployable volume enclosures are also addressed.
ERIC Educational Resources Information Center
Donaldson, Krista M.; Chen, Helen L.; Toye, George; Clark, Mia; Sheppard, Sheri D.
2008-01-01
The Academic Pathways of People Learning Engineering Survey (APPLES) was deployed for a second time in spring 2008 to undergraduate engineering students at 21 US universities. The goal of the second deployment of APPLES was to corroborate and extend findings from the Academic Pathways Study (APS; 2003-2007) and the first deployment of APPLES…
Security Issues in Cross-Organizational Peer-to-Peer Applications and Some Solutions
NASA Astrophysics Data System (ADS)
Gupta, Ankur; Awasthi, Lalit K.
Peer-to-peer (P2P) networks have been widely used for sharing millions of terabytes of content, for large-scale distributed computing, and for a variety of other novel applications, due to their scalability and fault tolerance. However, the scope of P2P networks has so far been largely limited to individual computers connected to the Internet. P2P networks are also notorious for blatant copyright violations and for facilitating several kinds of security attacks. Businesses and large organizations have thus stayed away from deploying P2P applications, citing security loopholes in P2P systems as the biggest reason for non-adoption. In theory, P2P applications can help fulfill many organizational requirements, such as collaboration and joint projects with other organizations, access to specialized computing infrastructure, and access to the specialized information/content and expert human knowledge available at other organizations. These potentially beneficial interactions make it necessary for the research community to alleviate the security shortcomings in P2P systems and ensure their acceptance and wide deployment. This paper therefore examines the security issues prevalent in enabling cross-organizational P2P interactions and provides some technical insights into how some of these issues can be resolved.
A Flux-Pinning Mechanism for Segment Assembly and Alignment
NASA Technical Reports Server (NTRS)
Gersh-Range, Jessica A.; Arnold, William R.; Peck, Mason A.; Stahl, H. Philip
2011-01-01
Currently, the most compelling astrophysics questions include how planets and the first stars formed and whether there are protostellar disks that contain large organic molecules. Although answering these questions requires space telescopes with apertures of at least 10 meters, such large primaries are challenging to construct by scaling up previous designs; the limited capacity of a launch vehicle bounds the maximum diameter of a monolithic primary, and beyond a certain size, deployable telescopes cannot fit in current launch vehicle fairings. One potential solution is connecting the primary mirror segments edgewise using flux-pinning mechanisms, which are analogous to non-contacting damped springs. In the baseline design, a flux-pinning mechanism consists of a magnet and a superconductor separated by a predetermined gap, with the damping adjusted by placing aluminum near the interface. Since flux pinning is possible only when the superconductor is cooled below a critical temperature, flux-pinning mechanisms are uniquely suited for cryogenic space telescopes. By placing these mechanisms along the edges of the mirror segments, a primary can be built up over time. Since flux pinning requires no mechanical deployments, the assembly process could be robotic or use some other non-contacting scheme. Advantages of this approach include scalability and passive stability.
Optimizing Cluster Heads for Energy Efficiency in Large-Scale Heterogeneous Wireless Sensor Networks
Gu, Yi; Wu, Qishi; Rao, Nageswara S. V.
2010-01-01
Many complex sensor network applications require deploying a large number of inexpensive and small sensors in a vast geographical region to achieve quality through quantity. Hierarchical clustering is generally considered an efficient and scalable way to facilitate the management and operation of such large-scale networks and to minimize the total energy consumption for prolonged lifetime. Judicious selection of cluster heads for data integration and communication is critical to the success of applications based on hierarchical sensor networks organized as layered clusters. We investigate the problem of selecting sensor nodes in a predeployed sensor network to be the cluster heads to minimize the total energy needed for data gathering. We rigorously derive an analytical formula to optimize the number of cluster heads in sensor networks under uniform node distribution, and propose a Distance-based Crowdedness Clustering algorithm to determine the cluster heads in sensor networks under general node distribution. The results from an extensive set of experiments on a large number of simulated sensor networks illustrate the performance superiority of the proposed solution over clustering schemes based on the k-means algorithm.
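The paper's exact Distance-based Crowdedness Clustering algorithm is not reproduced here; the sketch below only illustrates the general idea of favoring nodes in crowded neighborhoods as cluster heads while enforcing a minimum head separation. All parameters and node positions are hypothetical.

```python
import math

def select_cluster_heads(nodes, n_heads, radius, min_sep):
    """Illustrative greedy cluster-head selection (not the paper's exact
    algorithm): prefer nodes with many neighbors, but keep heads at least
    min_sep apart so clusters tile the field."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    # Crowdedness = number of neighbors within the clustering radius.
    crowd = {i: sum(1 for j, q in enumerate(nodes)
                    if i != j and dist(p, q) <= radius)
             for i, p in enumerate(nodes)}
    heads = []
    for i in sorted(crowd, key=crowd.get, reverse=True):
        if all(dist(nodes[i], nodes[h]) >= min_sep for h in heads):
            heads.append(i)
        if len(heads) == n_heads:
            break
    return heads

nodes = [(0, 0), (1, 1), (1, 0), (10, 10), (11, 10), (20, 0)]
print(select_cluster_heads(nodes, n_heads=2, radius=3, min_sep=5))  # -> [0, 3]
```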
NASA Astrophysics Data System (ADS)
Waggoner, L. A.; Capalbo, S. M.; Talbott, J.
2007-05-01
Within the Big Sky region, including Montana, Idaho, South Dakota, Wyoming, and the Pacific Northwest, industry is developing new coal-fired power plants using the abundant coal and other fossil-based resources. Of crucial importance to future development programs are robust carbon mitigation plans that include a technical and economic assessment of regional carbon sequestration opportunities. The objective of the Big Sky Carbon Sequestration Partnership (BSCSP) is to promote the development of the regional framework and infrastructure required to validate and deploy carbon sequestration technologies. Initial work compiled sources and potential sinks for carbon dioxide (CO2) in the Big Sky region and developed the online Carbon Atlas. Current efforts couple geologic and terrestrial field validation tests with market assessments, economic analysis, and regulatory and public outreach. The primary geological efforts are the demonstration of carbon storage in mafic/basalt formations, a geology not yet well characterized but with significant long-term storage potential in the region and other parts of the world, and in the Madison Formation, a large carbonate aquifer in Wyoming and Montana. Terrestrial sequestration relies on management practices and technologies to remove atmospheric CO2 and store it in trees, plants, and soil. This indirect sequestration method can be implemented today and is on the front line of voluntary, market-based approaches to reduce CO2 emissions. Details of pilot projects are presented, including new technologies, challenges and successes of the projects, and potential for commercial-scale deployment.
Early opportunities of CO2 geological storage deployment in coal chemical industry in China
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, Ning; Li, Xiaochun; Liu, Shengnan
2014-11-12
Carbon dioxide capture and geological storage (CCS) is regarded as a promising option for climate change mitigation; however, the high capture cost is the major barrier to large-scale deployment of CCS technologies. High-purity CO2 emission sources can reduce or even avoid the capture requirements and costs. Among these high-purity CO2 sources, certain coal chemical industry processes are very important, especially in China. In this paper, the basic characteristics of coal chemical industries in China are investigated and analyzed. As of 2013 there were more than 100 coal chemical plants in operation or in late planning stages. These emission sources together emit 430 million tons of CO2 per year, of which about 30% is emitted as high-purity or pure CO2 (CO2 concentrations >80% and >99%, respectively). Four typical source-sink pairs are studied by a techno-economic evaluation, including site screening and selection, source-sink matching, concept design, and economic evaluation. The techno-economic evaluation shows that the levelized cost of a CO2 capture and aquifer storage project in the coal chemical industry ranges from 14 USD/t to 17 USD/t CO2. When a 15 USD/t CO2 tax and 15 USD/t for CO2 sold to EOR are considered, the levelized cost of these CCS projects becomes negative, which suggests a net economic benefit from some of them. This might provide China early opportunities to deploy and scale up CCS projects in the near future.
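A quick arithmetic check of the economics stated above, using only the abstract's own figures:

```python
# Net levelized cost from the abstract's figures: a gross capture-and-storage
# cost of 14-17 USD/t CO2, offset by a 15 USD/t CO2 tax avoided and
# 15 USD/t received for CO2 sold to EOR.
for gross_cost in (14.0, 17.0):
    net = gross_cost - 15.0 - 15.0
    print(f"gross {gross_cost:.0f} USD/t -> net {net:+.0f} USD/t")
# gross 14 USD/t -> net -16 USD/t; gross 17 USD/t -> net -13 USD/t,
# i.e. negative in both cases, hence the stated net economic benefit.
```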
NASA Astrophysics Data System (ADS)
Hitzman, M.
2012-12-01
Economic geology is a highly interdisciplinary field utilizing a diverse set of petrologic, geochemical, geophysical, and tectonic data for improved scientific understanding of element migration and concentration in the crust (ore formation). A number of elements that were once laboratory curiosities now figure prominently in new energy technologies (e.g., wind turbines, solar energy collectors). If widely deployed, such technologies have the capacity to transform the way we produce, transmit, store, and conserve energy. To meet domestic and worldwide renewable energy needs, these systems must be scaled from laboratory, to demonstration, to widespread deployment. Such technologies are materials intensive: if widely deployed, the elements they require will be needed in significant quantities, and a shortage of these "energy critical elements" could significantly inhibit the adoption of otherwise game-changing energy technologies. It is imperative to better understand the geology, metallurgy, and mining engineering of critical mineral deposits if we are to sustainably develop these new technologies. There is currently no consensus among federal and state agencies, the national and international mining industry, the public, and the U.S. academic community regarding the importance of economic geology in securing sufficient energy critical elements to undertake large-scale renewable energy development. Available federal funding for critical elements focuses on downstream areas such as metallurgy, substitution, and recycling rather than on primary deposits. Undertaking the research required to discover and mine critical element deposits in an environmentally sound manner will require significant partnering with industry, given the current lack of federal research support.
NASA Astrophysics Data System (ADS)
Wosnik, M.; Bachant, P.; Nedyalkov, I.; Rowell, M.; Dufresne, N.; Lyon, V.
2013-12-01
We report on research related to MHK turbines at the Center for Ocean Renewable Energy (CORE) at the University of New Hampshire (UNH). The research projects span various scales, levels of complexity, and environments: from fundamental studies of hydrofoil sections in a high-speed water tunnel, to moderate-Reynolds-number turbine tests with inflow and wake studies in a large cross-section tow tank, to deployments of highly instrumented process models at tidal energy test sites in New England. A concerted effort over the past few years has brought significant new research infrastructure for marine hydrokinetic energy conversion online at UNH-CORE. It includes: a high-speed cavitation tunnel with independent control of velocity and pressure; a highly accurate tow mechanism, turbine test bed, and wake traversing system for the 3.7 m x 2.4 m cross-section UNH tow tank; and a 10.7 m x 3.0 m tidal energy test platform which can accommodate turbines up to 1.5 m in diameter, for deployments at the UNH-CORE Tidal Energy Test Site in Great Bay Estuary, NH, a sheltered 'nursery site' suitable for intermediate-scale tidal energy conversion device testing with peak currents typically above 2 m/s during each tidal cycle. Further, a large boundary layer wind tunnel, the new UNH Flow Physics Facility (6.0 m wide x 2.7 m high x 72 m long), is being used for detailed turbine wake studies, producing data and insight also applicable to MHK turbines in low-Froude-number deployments. Bi-directional hydrofoils, which perform equally well in either flow direction and could avoid the use of complex and maintenance-intensive yaw or blade pitch mechanisms, are being investigated theoretically, numerically, and experimentally. For selected candidate shapes, lift, drag, wake, and cavitation inception/desinence are measured. When combined with a cavitation inception model for MHK turbines, this information can be used to prescribe turbine design/operational parameters. Experiments were performed with a 1 m diameter and 1 m tall three-bladed cross-flow axis turbine (UNH RVAT) in a tow tank. For cross-flow axis turbines, hydrofoil performance remains Reynolds number dependent at intermediate scales due to the large range of angles of attack encountered during turbine rotation. The experiments, with turbine diameter Reynolds numbers Re_D = 0.5x10^5 to 2.0x10^6, were aimed at providing detailed data for model comparison at significantly higher Reynolds numbers than previously available. Measurements include rotor power, thrust, tip speed ratio, and detailed maps of mean flow and turbulence components in the near-wake. Mechanical exergy efficiency was calculated from power and drag measurements using an actuator disk approach. The spatial and temporal resolutions of different flow measurement techniques (ADCP, ADV, PIV) were systematically characterized. Finally, Reynolds-averaged Navier-Stokes (RANS) simulations were performed to assess their ability to predict the experimental results. A scaled version of a mixer-ejector hydrokinetic turbine, with a specially designed shroud that promotes wake mixing to enable increased mass flow through the turbine rotor, was evaluated experimentally at the UNH Tidal Energy Test Site in Great Bay Estuary, NH, and in Muskeget Channel, MA. State-of-the-art instrumentation was used to measure the tidal energy resource and turbine wake flow velocities, turbine power extraction, test platform loadings, and platform motion induced by sea state.
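For context, tow-tank results like these are usually reported through standard non-dimensional metrics. The sketch below computes the power coefficient, rotor drag coefficient, tip speed ratio, and diameter Reynolds number for a cross-flow rotor; the input values are illustrative stand-ins, not UNH RVAT measurements.

```python
RHO = 998.0   # kg/m^3, fresh water in a tow tank
NU = 1.0e-6   # m^2/s, kinematic viscosity of water

def turbine_metrics(torque, omega, drag, tow_speed, diameter, height):
    """Standard cross-flow (vertical-axis) turbine performance metrics from
    tow-tank data; frontal area of such a rotor is diameter * height."""
    area = diameter * height
    q = 0.5 * RHO * area * tow_speed ** 2      # reference dynamic force scale
    cp = torque * omega / (q * tow_speed)      # power coefficient
    cd = drag / q                              # rotor drag (thrust) coefficient
    tsr = omega * (diameter / 2) / tow_speed   # tip speed ratio
    re_d = tow_speed * diameter / NU           # diameter Reynolds number
    return cp, cd, tsr, re_d

# Illustrative values for a 1 m x 1 m rotor towed at 1 m/s (not measured data).
print(turbine_metrics(torque=20.0, omega=3.0, drag=400.0,
                      tow_speed=1.0, diameter=1.0, height=1.0))
```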
The 15th Aerospace Mechanisms Symposium
NASA Technical Reports Server (NTRS)
1981-01-01
Technological areas covered include: aerospace propulsion; aerodynamic devices; crew safety; space vehicle control; spacecraft deployment, positioning, and pointing; deployable antennas/reflectors; and large space structures. Devices for payload deployment, payload retention, and crew extravehicular activities on the space shuttle orbiter are also described.
Developing eThread pipeline using SAGA-pilot abstraction for large-scale structural bioinformatics.
Ragothaman, Anjani; Boddu, Sairam Chowdary; Kim, Nayong; Feinstein, Wei; Brylinski, Michal; Jha, Shantenu; Kim, Joohyun
2014-01-01
While most computational annotation approaches are sequence-based, threading methods are becoming increasingly attractive because the predicted structural information can uncover the underlying function. However, threading tools are generally compute-intensive, and the number of protein sequences from even small genomes such as prokaryotes is large, typically many thousands, prohibiting their application as a genome-wide structural systems biology tool. To leverage its utility, we have developed a pipeline for eThread, a meta-threading protein structure modeling tool, that can use computational resources efficiently and effectively. We employ a pilot-based approach that supports seamless data- and task-level parallelism and manages large variation in workload and computational requirements. Our scalable pipeline is deployed on Amazon EC2 and can efficiently select resources based upon task requirements. We present a runtime analysis to characterize the computational complexity of eThread and the EC2 infrastructure. Based on the results, we suggest a pathway to an optimized solution with respect to metrics such as time-to-solution or cost-to-solution. Our eThread pipeline can scale to support a large number of sequences and is expected to be a viable solution for genome-scale structural bioinformatics and structure-based annotation, particularly amenable for small genomes such as prokaryotes. The developed pipeline is easily extensible to other types of distributed cyberinfrastructure.
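The SAGA-Pilot API itself is not shown here; the sketch below only illustrates the pilot-style pattern the pipeline relies on (acquire a pool of workers once, then stream many independent tasks through it) using Python's standard library. The task function is a hypothetical stand-in for one threading job.

```python
from concurrent.futures import ProcessPoolExecutor, as_completed

def thread_sequence(seq_id):
    """Hypothetical stand-in for one compute-intensive meta-threading task."""
    # ... run the threading tool on one protein sequence ...
    return seq_id, "model"

def run_pipeline(sequence_ids, n_workers=8):
    """Pilot-style pattern: acquire workers once, then stream many
    independent tasks through them, absorbing per-task runtime variation
    without per-task scheduling overhead."""
    results = {}
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        futures = {pool.submit(thread_sequence, s): s for s in sequence_ids}
        for fut in as_completed(futures):
            seq, model = fut.result()
            results[seq] = model
    return results

if __name__ == "__main__":
    print(len(run_pipeline(range(100))))
```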
NASA Astrophysics Data System (ADS)
Klumpar, D. M.; Gunderson, A.
2014-12-01
A 10-satellite constellation placed in Low Earth Orbit (LEO) will carry high-geometric-factor omnidirectional integrating energetic particle detectors responsive to electrons greater than ~500 keV to characterize the near-Earth distribution of Van Allen Belt electrons precipitating or mirroring at altitudes between ~350 and ~500 km. The full constellation will be constructed by two deployments of identical 1.5U CubeSats into LEO. The first launch will deploy eight satellites into a polar sun-synchronous orbit from the island of Kauai in the Hawaiian Islands to form the NASA/Ames Research Center "Edison Demonstration of Smallsat Networks" (EDSN) swarm of satellites. The on-board Energetic Particle Integrating Space Environment Monitor (EPISEM) instrument, built by the Space Science and Engineering Laboratory at Montana State University, consists of a cylindrical omnidirectional Geiger counter with a geometric factor of 12 cm²·sr, sensitive to electrons above about 500 keV. The eight EDSN satellites are expected to deploy in late November 2014 into a 410 x 485 km orbit at ~92 degrees inclination, forming two slowly separating groups of four measurement platforms each to set up the initial 8-satellite swarm. Separately, two additional copies of the EDSN satellites will deploy from the International Space Station as elements of the NODES mission into a 52-degree-inclination orbit at about 375 km altitude. Together the 10 satellites will characterize the distribution of low-altitude penetrating electrons over spatial scales from tens to thousands of km. The paper will describe the mission concept, the implementation of the spacecraft, and the unusual operations concept that allows stored science data to be collected from all eight satellites of the EDSN swarm through an intersatellite communications link and transferred to the ground by a single member of the swarm. The EDSN satellites operate completely autonomously without ground uplink. The paper will also include early scientific results if available by mid-December 2014.
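As a back-of-the-envelope illustration of how an integrating detector with a known geometric factor converts flux to a counting rate (the flux value is an assumption, not an EPISEM measurement):

```python
# Counting-rate estimate for an omnidirectional detector: rate = G * J, where
# G is the geometric factor (12 cm^2 sr for EPISEM per the abstract) and J is
# the omnidirectional electron flux above threshold.
G = 12.0     # cm^2 sr
J = 1.0e3    # electrons / (cm^2 sr s) above ~500 keV (assumed value)
print(f"expected count rate: {G * J:.0f} counts/s")
```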
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deason, Jeff; Murphy, Sean
A new study by Berkeley Lab found that residential Property Assessed Clean Energy (R-PACE) programs increased deployment of residential solar photovoltaic (PV) systems in California, raising it by about 7-12% in cities that adopt these programs. R-PACE is a financing mechanism that uses a voluntary property tax assessment, paid off over time, to facilitate energy improvements and, in some jurisdictions, water and resilience measures. While previous studies demonstrated that early, regional R-PACE programs increased solar PV deployment, this new analysis is the first to demonstrate these impacts from the large, statewide R-PACE programs dominating the California market today, which use private capital to fund the upfront costs of the improvements. Berkeley Lab estimated the impacts using econometric techniques on two samples: (1) large cities only, allowing annual demographic and economic data as control variables; and (2) all California cities, without these annual data. Analysis of both samples controls for several factors other than R-PACE that would be expected to drive solar PV deployment. We infer that, on average, cities with R-PACE programs saw greater solar PV deployment during our study period (2010-2015). In the large cities sample, solar PV deployment in jurisdictions with R-PACE programs was higher by 1.1 watts per owner-occupied household per month, or 12%. Across all cities, it was higher by 0.6 watts per owner-occupied household per month, or 7%. The large cities results are statistically significant at conventional levels; the all-cities results are not. The estimates imply that the majority of solar PV deployment financed by R-PACE programs would likely not have occurred in their absence. Results suggest that R-PACE programs have increased PV deployment in California even in relatively recent years, as R-PACE programs have grown in market share and as alternate approaches for financing solar PV have developed. The U.S. Department of Energy's Building Technologies Office supported this research.
DOT National Transportation Integrated Search
1998-01-01
As states begin to consider full-scale deployment of intelligent transportation system (ITS) technologies to support commercial vehicle operations (CVO), Governors and state legislatures will need answers to the following questions: (1) What savings ...
[Mental disorders in German soldiers after deployment - impact of personal values and resilience].
Zimmermann, Peter; Firnkes, Susanne; Kowalski, Jens; Backus, Johannes; Alliger-Horn, Christina; Willmund, Gerd; Hellenthal, Andrea; Bauer, Amanda; Petermann, Franz; Maercker, Andreas
2015-11-01
Soldiers are at increased risk of developing mental health disorders after military deployment. The impact of personal values on psychological symptomatology, based on an empirical working model, has not yet been studied in a military environment. 117 German Armed Forces soldiers completed the Portrait Values Questionnaire (PVQ), the Patient Health Questionnaire (PHQ), and the Resilience Scale (RS-11) after their deployment to Afghanistan. In the regression analyses, the values hedonism, benevolence, tradition, self-direction, and universalism had differential, significant impacts on the depression, anxiety, and somatoform symptom scales of the PHQ. The RS-11 sum scores were negatively correlated with symptomatology. Personal values and resilience appear to be associated with psychological symptomatology in soldiers after military deployment. The results can contribute to the further development of both preventive and therapeutic approaches. © Georg Thieme Verlag KG Stuttgart · New York.
The Seismic component of the IBERARRAY: Placing constraints on the Lithosphere and Mantle.
NASA Astrophysics Data System (ADS)
Carbonell, R.; Diaz, J.; Villaseñor, A.; Gallart, J.; Morales, J.; Pazos, A.; Cordoba, D.; Pulgar, J.; Garcia-Lobon, J.; Harnafi, M.
2008-12-01
TOPOIBERIA is a multidisciplinary, large-scale research project which aims to study the links between deep and superficial processes within the Iberian Peninsula. One of its main experimental components is the deployment of the IBERARRAY seismic network. This is a dense array (60x60 km) of new-generation dataloggers equipped with broad-band seismometers, which will cover Iberia and northern Morocco in three successive deployments, each lasting about 18 months. The first leg, deployed since late 2007, covers the southern part of Iberia (35 stations) and northern Morocco (20 stations). Two data centers have been established, one at the CSIC-Institute of Earth Sciences (CSIC-Barcelona) and a second at the Geologic and Mining Institute (IGME-Madrid); the data follow a standard flow from recovery to archival. The field teams collect the recorded hard disks in the field and send data and metadata to a processing center, where raw data are collected and stored and quality control checking is performed. This includes a systematic inspection of the experimental parameters (battery charge, thermal insulation, time adjustments, geophone leveling, etc.), visual verification of the seismic waveforms, and analysis, using power spectral density (PSD), of the noise level of each station. All this information is disseminated among the research teams involved in the project using a dedicated website, and the continuous seismic data are made accessible through FTP and CWQ servers. Some of the nodes of the theoretical network are covered by permanent stations of the national broad-band network (IGN) or other networks operating in the region (IAG-UGR, ROA). Data from those stations will also be integrated into the Iberarray database. This Iberarray network will provide a large database of both waveforms and catalogued events, with an unprecedented resolution. Earthquake data at local, regional, and teleseismic scales will be analyzed using different methodologies. The first result will be an increase in the accuracy of the location of regional seismicity and the determination of focal mechanisms. Special emphasis will be placed on seismic tomographic techniques using travel times and waveforms of P and S arrivals at different scales, as well as surface waves, using dispersion measurements, and studies dealing with background/environmental noise. In addition, receiver function analysis for seismic imaging of deep lithospheric features and splitting analysis of shear-wave arrivals will also be developed.
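As an illustration of the PSD-based noise-level check described above, the sketch below estimates a station's noise spectrum with Welch's method on synthetic data; real quality control would operate on instrument-corrected continuous records.

```python
import numpy as np
from scipy.signal import welch

def station_noise_psd(trace, fs, segment_s=600):
    """Estimate a station's noise power spectral density with Welch's method,
    as used for QC of continuous broadband records (sketch, synthetic input)."""
    nperseg = int(segment_s * fs)
    freqs, pxx = welch(trace, fs=fs, nperseg=nperseg)
    # Report in dB (counts^2/Hz here; physical units would require the
    # instrument correction, which this sketch omits).
    return freqs, 10.0 * np.log10(np.maximum(pxx, 1e-30))

fs = 100.0                                 # Hz, typical broadband sampling rate
rng = np.random.default_rng(0)
trace = rng.normal(size=int(3600 * fs))    # one hour of synthetic noise
freqs, psd_db = station_noise_psd(trace, fs)
print(freqs[1], psd_db[1])
```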
Ship-Based Nuclear Energy Systems for Accelerating Developing World Socioeconomic Advance
NASA Astrophysics Data System (ADS)
Petroski, Robert; Wood, Lowell
2014-07-01
Technological, economic, and policy aspects of supplying energy to newly industrializing and developing countries using ship-deployed nuclear energy systems are described. The approach analyzed comprises nuclear installations of up to gigawatt scale deployed within currently mass-produced large ship hulls which are capable of flexibly supplying energy for electricity, water desalination and district heating-&-cooling with low latencies and minimized shoreside capital expenditures. Nuclear energy is uniquely suited for mobile deployment due to its combination of extraordinary energy density and high power density, which enable enormous supplies of energy to be deployed at extremely low marginal costs. Nuclear installations on ships also confer technological advantages by essentially eliminating risk from earthquakes, tsunamis, and floods; taking advantage of assured access to an effectively unlimited amount of cooling water, and involving minimal onshore preparations and commitments. Instances of floating nuclear power stations that have been proposed in the past, some of which are currently being pursued, have generally been based on conventional LWR technology, moreover without flexibility or completeness of power output options. We consider nuclear technology options for their applicability to the unique opportunities and challenges of a marine environment, with special attention given to low-pressure, high thermal margin systems with continuous and assured afterheat dissipation into the ambient seawater. Such systems appear promising for offering an exceptionally high degree of safety while using a maximally simple set of components. We furthermore consider systems tailored to Developing World contexts, which satisfy societal requirements beyond electrification, e.g., flexible sourcing of potable water and HVAC services, servicing time-varying user requirements, and compatibility with the full spectrum of local renewable energy supplies, specifically including those having intermittency characteristics. Consideration is directed to the relative economics of ship-based and land-based nuclear power stations, and the costs of undersea transmission lines and suitable moorings are discussed, as well as station-maintenance expenses. Potential cost savings from reduced seismic engineering, serialized production, and reduction/elimination of site-specific engineering are determined to be likely to enable large floating nuclear energy systems to be deployed at both significantly lower cost and with lower financial risk than comparable land-based systems. Such plants thus appear to be a compelling option for agilely supplying flexible energy-flows to developing regions, especially as they allow major components of the overhead costs and time-delays of large-scale energy systems to be avoided. Finally, the critical set of issues related to appropriately regulating and insuring floating nuclear power plants designed for export is examined. Approaches to ensuring adequate safety and environmental stewardship while properly allocating risks between system owners/operators and host countries of floating nuclear energy systems are discussed, along with possible pathways toward implementation. Robustness of exemplary nuclear energy systems from all forms of misuse, including materials diversion, is noted, thus ensuring suitability for complications-free, non-discriminatory global deployments. 
Availability of abundant, low-cost nuclear energy that can flexibly satisfy the full spectrum of energy demands of developing-country economies will inevitably result in significantly earlier and more environmentally sound energy intensification of the societies enjoying such advantages. This will help spur autocatalytic gains in human well-being and economic development rates similar to those seen in the developed world during the last two-thirds of a century, while avoiding some of the undesirable side effects often associated with those gains. Quantitative estimates of these considerations are offered.
Deployable Debris Shields For Space Station
NASA Technical Reports Server (NTRS)
Christiansen, Eric L.; Cour-Palais, Burton G.; Crews, Jeanne
1993-01-01
Multilayer shields made of lightweight sheet materials deployed from proposed Space Station Freedom for additional protection against orbiting debris. Deployment mechanism attached at each location on exterior where extra protection needed. Equipment withdraws layer of material from storage in manner similar to unfurling sail or extending window shade. Number of layers deployed depends on required degree of protection, and could be as large as five.
Development of deployable structures for large space platform systems, volume 1
NASA Technical Reports Server (NTRS)
1982-01-01
Generic deployable spacecraft configurations and deployable platform systems concepts were identified. Sizing, building block concepts, orbiter packaging, thermal analysis, cost analysis, and mass properties analysis as related to platform systems integration are considered. Technology needs are examined and the major criteria used in concept selection are delineated. Requirements for deployable habitat modules, tunnels, and OTV hangars are considered.
NASA Technical Reports Server (NTRS)
Valinia, Azita; Moe, Rud; Seery, Bernard D.; Mankins, John C.
2013-01-01
We present a concept for an ISS-based optical system assembly demonstration designed to advance technologies related to future large in-space optical facilities deployment, including space solar power collectors and large-aperture astronomy telescopes. The large solar power collector problem is not unlike the large astronomical telescope problem, but at least conceptually it should be easier in principle, given the tolerances involved. We strive in this application to leverage heavily the work done on the NASA Optical Testbed Integration on ISS Experiment (OpTIIX) effort to erect a 1.5 m imaging telescope on the International Space Station (ISS). Specifically, we examine a robotic assembly sequence for constructing a large (meter diameter) slightly aspheric or spherical primary reflector, comprised of hexagonal mirror segments affixed to a lightweight rigidizing backplane structure. This approach, together with a structured robot assembler, will be shown to be scalable to the area and areal densities required for large-scale solar concentrator arrays.
Advanced Nuclear Fuel Cycle Transitions: Optimization, Modeling Choices, and Disruptions
NASA Astrophysics Data System (ADS)
Carlsen, Robert W.
Many nuclear fuel cycle simulators have evolved over time to help understand the nuclear industry/ecosystem at a macroscopic level. Cyclus is one of the first fuel cycle simulators to accommodate larger-scale analysis with its liberal open-source licensing and first-class Linux support. Cyclus also has features that uniquely enable investigating the effects of modeling choices on fuel cycle simulators and scenarios. This work is divided into three experiments focusing on optimization, effects of modeling choices, and fuel cycle uncertainty. Effective optimization techniques are developed for automatically determining desirable facility deployment schedules with Cyclus. A novel method for mapping optimization variables to deployment schedules is developed. This allows relationships between reactor types and scenario constraints to be represented implicitly in the variable definitions, enabling the usage of optimizers lacking constraint support. It also prevents wasting computational resources evaluating infeasible deployment schedules. Deployed power capacity over time and deployment of non-reactor facilities are also included as optimization variables. There are many fuel cycle simulators built with different combinations of modeling choices. Comparing results between them is often difficult. Cyclus' flexibility allows comparing effects of many such modeling choices. Reactor refueling cycle synchronization and inter-facility competition, among other effects, are compared in four cases, each using combinations of fleet-based or individually modeled reactors with 1-month or 3-month time steps. There are noticeable differences in results for the different cases. The largest differences occur during periods of constrained reactor fuel availability. This and similar work can help improve the quality of fuel cycle analysis generally. There is significant uncertainty associated with deploying new nuclear technologies, such as time-frames for technology availability and the cost of building advanced reactors. Historically, fuel cycle analysis has focused on answering questions of fuel cycle feasibility and optimality. However, there has not been much work done to address uncertainty in fuel cycle analysis, helping answer questions of fuel cycle robustness. This work develops and demonstrates a methodology for evaluating deployment strategies while accounting for uncertainty. Techniques are developed for measuring the hedging properties of deployment strategies under uncertainty. Additionally, methods for using optimization to automatically find good hedging strategies are demonstrated.
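The dissertation's actual encoding is not reproduced here; the sketch below only illustrates the idea of mapping an optimizer's unconstrained variables to a deployment schedule that is feasible by construction, so the optimizer itself needs no constraint support. Reactor sizes and demand values are hypothetical.

```python
def decode_schedule(x, demand, reactor_mw=(1000, 300)):
    """Sketch of mapping an optimizer's variable vector to a feasible
    deployment schedule (not Cyclus' actual encoding). Each x[t] in [0, 1]
    is the fraction of time step t's demand targeted by large reactors;
    builds are integer counts that always cover demand, so every decoded
    schedule is feasible and infeasible points are never evaluated."""
    schedule = []
    for t, frac in enumerate(x):
        remaining = demand[t]
        n_large = int(frac * remaining // reactor_mw[0])
        remaining -= n_large * reactor_mw[0]
        n_small = -(-remaining // reactor_mw[1])   # ceil: cover what's left
        schedule.append((t, n_large, n_small))
    return schedule

# Demand (MW of new capacity) over three time steps; x comes from an optimizer.
print(decode_schedule(x=[0.9, 0.2, 0.5], demand=[3000, 1200, 2000]))
```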
Airframe noise prediction evaluation
NASA Technical Reports Server (NTRS)
Yamamoto, Kingo J.; Donelson, Michael J.; Huang, Shumei C.; Joshi, Mahendra C.
1995-01-01
The objective of this study is to evaluate the accuracy and adequacy of current airframe noise prediction methods using available airframe noise measurements from tests of a narrow-body transport (DC-9) and a wide-body transport (DC-10), in addition to scale-model test data. General features of the airframe noise from these aircraft and models are outlined. The results of the assessment of two airframe noise prediction methods, Fink's and Munson's, against flight test data from these aircraft and scale-model wind tunnel test data are presented. These methods were extensively evaluated against measured data from several configurations, including clean, slat-deployed, landing-gear-deployed, flap-deployed, and landing configurations of both the DC-9 and DC-10. They were also assessed against a limited number of configurations of scale models. The evaluation was conducted in terms of overall sound pressure level (OASPL), tone-corrected perceived noise level (PNLT), and one-third-octave band sound pressure level (SPL).
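For reference, the overall sound pressure level used in such evaluations is an energy sum of the one-third-octave band levels. A minimal sketch, with illustrative band levels rather than measured data:

```python
import math

def oaspl(band_spls_db):
    """Overall sound pressure level from one-third-octave band SPLs:
    energy-sum the bands, OASPL = 10*log10(sum(10^(SPL_i/10)))."""
    return 10.0 * math.log10(sum(10.0 ** (spl / 10.0) for spl in band_spls_db))

# Illustrative band levels (dB) for a landing configuration, not measured data.
print(f"OASPL = {oaspl([88.0, 90.0, 92.0, 89.0, 85.0]):.1f} dB")  # ~96.4 dB
```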
Novel Sensor for the In Situ Measurement of Uranium Fluxes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hatfield, Kirk
2015-02-10
The goal of this project was to develop a sensor that incorporates the field-tested concepts of the passive flux meter to provide direct in situ measures of flux for uranium and groundwater in porous media. Measurable contaminant fluxes [J] are essentially the product of concentration [C] and groundwater flux, or specific discharge, [q]. The sensor measures [J] and [q] by changes in contaminant and tracer amounts, respectively, on a sorbent. By using measurement rather than inference from static parameters, the sensor can directly advance conceptual and computational models for field-scale simulations. The sensor was deployed in conjunction with DOE in obtaining field-scale quantification of subsurface processes affecting uranium transport (e.g., advection) and transformation (e.g., uranium attenuation) at the Rifle IFRC Site in Rifle, Colorado. Project results have expanded our current understanding of how field-scale spatial variations in fluxes of uranium, groundwater, and salient electron donors/acceptors are coupled to spatial variations in measured microbial biomass/community composition, effective field-scale uranium mass balances, attenuation, and stability. The coupling between uranium, various nutrients, and microflora can be used to estimate field-scale rates of uranium attenuation and field-scale transitions in microbial communities. This research focuses on uranium (VI), but the sensor principles and design are applicable to the field-scale fate and transport of other radionuclides. Laboratory studies focused on sorbent selection and calibration, along with sensor development and validation under controlled conditions. Field studies were conducted at the Rifle IFRC Site in Rifle, Colorado, and were closely coordinated with existing SBR (formerly ERSP) projects to complement data collection. Small field tests were conducted during the first two years to evaluate field-scale deployment procedures and validate sensor performance under controlled field conditions. In the third and fourth years, a suite of larger field studies was conducted. For these studies, the uranium flux sensor was used with uranium speciation measurements and molecular-biological tools to characterize microbial community and active biomass at synonymous wells distributed in a large grid. These field efforts quantified spatial changes in uranium flux and field-scale rates of uranium attenuation (ambient and stimulated) and uranium stability, and quantitatively assessed how fluxes and effective reaction rates were coupled to spatial variations in microbial community and active biomass. Analyses of data from these field experiments were used to generate estimates of Monod kinetic parameters that are 'effective' in nature and optimal for modeling uranium fate and transport at the field scale. This project provided the opportunity to develop the first sensor that provides direct measures of both uranium (VI) and groundwater flux. A multidisciplinary team was assembled, including two geochemists, a microbiologist, and two quantitative contaminant hydrologists. Now that the project is complete, the sensor can be deployed at DOE sites to evaluate field-scale uranium attenuation, source behavior, the efficacy of remediation, and off-site risk. Because the sensor requires no power, it can be deployed at remote sites for periods of days to months.
The fundamental science derived from this project can be used to advance the development of predictive models for various transport and attenuation processes in aquifers. Proper development of these models is critical for long-term stewardship of contaminated sites in the context of predicting uranium source behavior, remediation performance, and off-site risk.
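A minimal sketch of the flux relations the sensor exploits: the contaminant flux is the product of specific discharge and concentration, J = qC, with q inferred from the fraction of resident tracer remaining on the sorbent. The depletion formula and all numerical values below are simplified illustrations, not the project's calibration.

```python
import math

def darcy_flux_from_tracer(m_remaining_frac, retardation, theta, radius, t):
    """Simplified passive-flux-meter relation (a sketch; published PFM theory
    uses geometry-specific formulas): specific discharge q estimated from the
    fraction of resident tracer remaining after deployment time t (s)."""
    return ((1.0 - m_remaining_frac) * math.pi * radius * retardation * theta
            / (2.0 * t))

# Illustrative values only: 4 cm diameter meter, 30-day deployment.
q = darcy_flux_from_tracer(m_remaining_frac=0.6, retardation=10.0,
                           theta=0.3, radius=0.02, t=30 * 86400.0)   # m/s
C = 5.0e-3      # uranium concentration, kg/m^3 (assumed)
J = q * C       # contaminant mass flux: J = q * C, kg/(m^2 s)
print(q, J)
```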
Application-level regression testing framework using Jenkins
Budiardja, Reuben; Bouvet, Timothy; Arnold, Galen
2017-09-26
Monitoring and testing for regression of large-scale systems such as NCSA's Blue Waters supercomputer are challenging tasks. In this paper, we describe the solution we developed to perform those tasks. The goal was to find an automated solution for running user-level regression tests to evaluate system usability and performance. Jenkins, an automation server software, was chosen for its versatility, large user base, and multitude of plugins, including ones for collecting data and plotting test results over time. We also describe our Jenkins deployment, which launches and monitors jobs on the remote HPC system, performs authentication with a one-time password, and integrates with our LDAP server for authorization. We show some use cases and describe our best practices for successfully using Jenkins as a user-level, system-wide regression testing and monitoring framework for large supercomputer systems.
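The paper's Jenkins configuration is not reproduced here; the sketch below only illustrates the kind of user-level check such a deployment might launch: run a test command on a remote HPC login node over SSH and fail the build on a bad exit status. The host and script names are hypothetical placeholders, not Blue Waters specifics.

```python
import subprocess
import sys

def run_remote_regression(host, command, timeout_s=600):
    """Sketch of a user-level regression check a Jenkins job could run:
    execute a test command on a remote HPC login node over SSH and report
    failure via a nonzero exit code, which Jenkins treats as a failed build."""
    result = subprocess.run(
        ["ssh", host, command],
        capture_output=True, text=True, timeout=timeout_s,
    )
    print(result.stdout)
    if result.returncode != 0 or "FAILED" in result.stdout:
        sys.exit(1)   # nonzero exit marks the Jenkins build as failed

if __name__ == "__main__":
    run_remote_regression("login.example-hpc.org",
                          "bash ~/regression/run_io_benchmark.sh")
```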
Eavesdropping on the Arctic: Automated bioacoustics reveal dynamics in songbird breeding phenology
Ellis, Daniel P. W.; Pérez, Jonathan H.; Wingfield, John C.; Boelman, Natalie T.
2018-01-01
Bioacoustic networks could vastly expand the coverage of wildlife monitoring to complement satellite observations of climate and vegetation. This approach would enable global-scale understanding of how climate change influences phenomena such as migratory timing of avian species. The enormous data sets that autonomous recorders typically generate demand automated analyses that remain largely undeveloped. We devised automated signal processing and machine learning approaches to estimate dates on which songbird communities arrived at arctic breeding grounds. Acoustically estimated dates agreed well with those determined via traditional surveys and were strongly related to the landscape’s snow-free dates. We found that environmental conditions heavily influenced daily variation in songbird vocal activity, especially before egg laying. Our novel approaches demonstrate that variation in avian migratory arrival can be detected autonomously. Large-scale deployment of this innovation in wildlife monitoring would enable the coverage necessary to assess and forecast changes in bird migration in the face of climate change. PMID:29938220
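The paper's actual classifiers are not reproduced here; the sketch below illustrates one simple form the automated analysis could take: a daily band-limited vocal-activity index followed by a change-point style arrival estimate. The frequency band, thresholds, and data are all assumptions.

```python
import numpy as np
from scipy.signal import spectrogram

def vocal_activity_index(audio, fs, band=(2000.0, 8000.0)):
    """Fraction of time windows whose energy in a songbird band exceeds a
    noise floor; band edges and threshold are assumptions, not the paper's."""
    f, t, sxx = spectrogram(audio, fs=fs, nperseg=1024)
    band_energy = sxx[(f >= band[0]) & (f <= band[1])].sum(axis=0)
    threshold = 3.0 * np.median(band_energy)
    return float((band_energy > threshold).mean())

def arrival_day(daily_indices, rise=0.2, run=3):
    """First day starting a run-day stretch of activity above `rise`."""
    for d in range(len(daily_indices) - run + 1):
        if all(v > rise for v in daily_indices[d:d + run]):
            return d
    return None

fs = 22050
audio = np.random.default_rng(1).normal(size=10 * fs)   # 10 s synthetic audio
print(vocal_activity_index(audio, fs))
daily = [0.05] * 10 + [0.4] * 20                        # synthetic season
print(arrival_day(daily))                               # -> day 10
```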
NASA Technical Reports Server (NTRS)
1984-01-01
The Large Deployable Reflector (LDR), a proposed 20 m diameter telescope designed for infrared and submillimeter astronomical measurements from space, is discussed in terms of scientific purposes, capabilities, current status, and history of development. The LDR systems goals and functional/telescope requirements are enumerated.
NASA Astrophysics Data System (ADS)
Smyrnakis, Christos; Phocas-Cosmetatos, Alex; Kynigalakis, Kostantinos
2016-05-01
Large-scale concentrated solar power (CSP) plants need large plots of land with very high solar resource and thus are often deployed in desert areas, which are usually owned by the state or a municipal authority. This study discusses the implications and practices of land lease policies with regard to CSP development. The strategy followed for a land lease is, by definition, case-specific, and this text is by no means exhaustive. The study also discusses the pricing of land in various cases, presents the governing types of land lease, and examines their effect on the economic performance of hypothetical CSP projects under various cases.
Spoked wheels to deploy large surfaces in space: weight estimates for solar arrays
NASA Technical Reports Server (NTRS)
Crawford, R. F.; Hedgepeth, J. M.; Preiswerk, P. R.
1975-01-01
Extensible booms were used to deploy and support solar cell arrays of varying areas. Solar cell array systems were built with one or two booms to deploy and tension a blanket with attached cells and bussing. A segmented and hinged rim supported by spokes joined to a common hub is described. This structure can be compactly packaged and deployed.
Meremonte, M.; Frankel, A.; Cranswick, E.; Carver, D.; Worley, D.
1996-01-01
We deployed portable digital seismographs in the San Fernando Valley (SFV), the Los Angeles basin (LAB), and surrounding hills to record aftershocks of the 17 January 1994 Northridge, California, earthquake. The purpose of the deployment was to investigate factors relevant to seismic zonation in urban areas, such as site amplification, sedimentary basin effects, and the variability of ground motion over short baselines. We placed seismographs at 47 sites (not all concurrently) and recorded about 290 earthquakes with magnitudes up to 5.1 at five stations or more. We deployed widely spaced stations for profiles across the San Fernando Valley, as well as five dense arrays (apertures of 200 to 500 m) in areas of high damage, such as the collapsed Interstate 10 overpass, Sherman Oaks, and the collapsed parking garage at CalState Northridge. Aftershock data analysis indicates a correlation of site amplification with mainshock damage. We found several cases where the site amplification depended on the azimuth of the aftershock, possibly indicating focusing from basin structures. For the parking garage array, we found large ground-motion variability (a factor of 2) over 200-m distances for sites on the same mapped soil unit. Array analysis of the aftershock seismograms demonstrates that sizable arrivals after the direct S waves consist of surface waves traveling from the same azimuth as that of the epicenter. These surface waves increase the duration of motion and can have frequencies as high as about 4 Hz. For the events studied here, we do not observe large arrivals reflected from the southern edge of the San Fernando Valley.
Experience in using commercial clouds in CMS
NASA Astrophysics Data System (ADS)
Bauerdick, L.; Bockelman, B.; Dykstra, D.; Fuess, S.; Garzoglio, G.; Girone, M.; Gutsche, O.; Holzman, B.; Hufnagel, D.; Kim, H.; Kennedy, R.; Mason, D.; Spentzouris, P.; Timm, S.; Tiradani, A.; Vaandering, E.; CMS Collaboration
2017-10-01
Historically, high-energy physics computing has been performed on large purpose-built computing systems. In the beginning there were single-site computing facilities, which evolved into the Worldwide LHC Computing Grid (WLCG) used today. The vast majority of WLCG resources are used for LHC computing, and the resources are scheduled to be continuously used throughout the year. In the last several years there has been an explosion in the capacity and capability of commercial and academic computing clouds. Cloud resources are highly virtualized and intended to be flexibly deployed for a variety of computing tasks. There is growing interest among cloud providers in demonstrating the capability to perform large-scale scientific computing. In this presentation we will discuss results from the CMS experiment using the Fermilab HEPCloud Facility, which utilized both local Fermilab resources and Amazon Web Services (AWS). The goal was to work with AWS through a matching grant to demonstrate a sustained scale approximately equal to half of the worldwide processing resources available to CMS. We will discuss the planning and technical challenges involved in organizing the most I/O-intensive CMS workflows on a large-scale set of virtualized resources provisioned by the Fermilab HEPCloud. We will describe the data handling and data management challenges. We will also discuss the economic issues and the cost and operational efficiency comparison to our dedicated resources. At the end we will consider the changes in the working model of HEP computing in a domain with the availability of large-scale resources scheduled at peak times.
High Fidelity Simulations of Large-Scale Wireless Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Onunkwo, Uzoma; Benz, Zachary
The worldwide proliferation of wireless connected devices continues to accelerate. There are tens of billions of wireless links across the planet, with an additional explosion of new wireless usage anticipated as the Internet of Things develops. Wireless technologies not only provide convenience for mobile applications but are also extremely cost-effective to deploy. Thus, this trend toward wireless connectivity will only continue, and Sandia must develop the necessary simulation technology to proactively analyze the associated emerging vulnerabilities. Wireless networks are marked by mobility and proximity-based connectivity. The de facto standard for exploratory studies of wireless networks is discrete event simulation (DES). However, the simulation of large-scale wireless networks is extremely difficult due to prohibitively large turnaround times. A path forward is to expedite simulations with parallel discrete event simulation (PDES) techniques. The mobility and distance-based connectivity associated with wireless simulations, however, typically doom PDES to fail to scale (e.g., the OPNET and ns-3 simulators). We propose a PDES-based tool aimed at reducing the communication overhead between processors. The proposed solution will use lightweight processes to dynamically distribute computation workload while mitigating the communication overhead associated with synchronization. This work is vital to the analytics and validation capabilities of simulation and emulation at Sandia. We have years of experience in Sandia's simulation and emulation projects (e.g., MINIMEGA and FIREWHEEL). Sandia's current highly regarded capabilities in large-scale emulation have focused on wired networks, where two assumptions prevent scalable wireless studies: (a) the connections between objects are mostly static, and (b) the nodes have fixed locations.
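For orientation, the sketch below is a minimal sequential DES core. Its single global event queue is precisely the serialization that PDES techniques attack by partitioning simulation state across processes and synchronizing only when necessary; nothing here is Sandia's tool.

```python
import heapq

def simulate(events, handlers, t_end):
    """Minimal sequential DES core: a single global event queue imposes a
    total order on all events, the bottleneck PDES removes by distributing
    state and events across logical processes."""
    queue = list(events)            # entries are (time, event_name, payload)
    heapq.heapify(queue)
    while queue:
        t, name, payload = heapq.heappop(queue)
        if t > t_end:
            break
        for new_event in handlers[name](t, payload):
            heapq.heappush(queue, new_event)

def on_packet(t, node):
    """Toy wireless hop: forward the packet after a propagation delay."""
    print(f"t={t:.3f}s node={node} received")
    if node < 3:
        yield (t + 0.001, "packet", node + 1)

simulate([(0.0, "packet", 0)], {"packet": on_packet}, t_end=1.0)
```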
Summer circulation in the Mexican tropical Pacific
NASA Astrophysics Data System (ADS)
Trasviña, A.; Barton, E. D.
2008-05-01
The main components of large-scale circulation of the eastern tropical Pacific were identified in the mid 20th century, but the details of the circulation at length scales of 10² km or less, the mesoscale field, are less well known, particularly during summer. The winter circulation is characterized by large mesoscale eddies generated by intense cross-shore wind pulses. These eddies propagate offshore to provide an important source of mesoscale variability for the eastern tropical Pacific. The summer circulation has not commanded similar attention, the main reason being that the frequent generation of hurricanes in the area renders in situ observations difficult. Before the experiment presented here, the large-scale summer circulation of the Gulf of Tehuantepec was thought to be dominated by a poleward flow along the coast. A drifter-deployment experiment carried out in June 2000, supported by satellite altimetry and wind data, was designed to characterize this hypothesized Costa Rica Coastal Current. We present a detailed comparison between altimetry-estimated geostrophic and in situ currents estimated from drifters. Contrary to expectation, no evidence of a coherent poleward coastal flow across the gulf was found. During the 10-week period of observations, we documented a recurrent pattern of circulation within 500 km of shore, forced by a combination of local winds and the regional-scale flow. Instead of the Costa Rica Coastal Current, we found a summer eddy field capable of influencing large areas of the eastern tropical Pacific. Even in summer, the cross-isthmus wind jet is capable of inducing eddy formation.
OBSIP: Advancing Capabilities and Expanding the Ocean Bottom Seismology Community
NASA Astrophysics Data System (ADS)
Aderhold, K.; Evers, B.
2016-12-01
The Ocean Bottom Seismograph Instrument Pool (OBSIP) is a National Science Foundation sponsored instrument facility that provides ocean bottom seismometers (OBS) and technical support for research in the areas of marine geology, seismology, and geodynamics. OBSIP comprises an OBSIP Management Office (OMO) and three Institutional Instrument Contributors (IICs), each of which contributes instruments and technical support to the pool. OBSIP operates both short-period and broadband OBS instruments with a variety of capabilities to operate in shallow or deep water over both short- and long-term durations. Engineering developments at the IICs include capability for freshwater deployments, increased recording duration (15+ months), more efficient recovery systems, and sensor upgrades for a less heterogeneous fleet. OBSIP will provide instruments for three experiments in 2016, with deployments along a 1500 km transect in the South Atlantic, a large active-source experiment on the Chilean megathrust, and the very first seismometers ever deployed in Yellowstone Lake. The OBSIP OMO strives to lower the barrier to working with OBS data by performing quality checks on data, investigating and responding to community questions, and providing data products like horizontal orientation calculations. This has resulted in a significant increase in new users of OBS data, especially for the open data sets from community seismic experiments. In 2015 the five-year Cascadia Initiative community seismic experiment concluded with over 250 OBS deployments and recoveries in an extensive grid offshore of Washington, Oregon, and California. The logistics of the Cascadia Initiative were challenging, but lessons were learned and efficiencies have been identified for implementation in future experiments. Large-scale community seismic experiments that cross the shoreline, like the Cascadia Initiative and the Eastern North American Margin experiment, have led to the proposal of even more ambitious endeavors such as the Subduction Zone Observatory. OBSIP is also working to develop international collaboration and networking between OBS operators and researchers through special interest group meetings and the biannual OBS Symposium, to be held again in Fall 2017.
NASA Astrophysics Data System (ADS)
Wollheim, W. M.; Mulukutla, G.; Cook, C.; Carey, R. O.
2014-12-01
Biogeochemical conditions throughout aquatic landscapes are spatially varied and temporally dynamic due to interactions of upstream land use, climate, hydrologic responses, and internal aquatic processes. One of the key goals in aquatic ecosystem ecology is to parse the upstream influences of terrestrial and aquatic processes on local conditions, which becomes progressively more difficult as watershed size increases and as processes are altered by diverse human activities. Simultaneous deployments of high frequency, in situ aquatic sensors for multiple constituents (e.g. NO3-N, CDOM, turbidity, conductivity, D.O., water temperature, along with flow) offer a new approach for understanding patterns along the aquatic continuum. For this talk, we explore strategies for deployments within single watersheds to improve understanding of terrestrial and aquatic processes. We address applications regarding mobilization of non-point nutrient sources across temporal scales, interactions with land use and watershed size, and the importance of aquatic processes. We also explore ways in which simultaneous sensor deployments can be designed to improve parameterization and testing of river network biogeochemical models. We will provide several specific examples using conductivity, nitrate and carbon from ongoing sensor deployments in New England, USA. We expect that improved deployments of sensors and sensor networks will benefit the management of critical freshwater resources.
Implementation of Cyberinfrastructure and Data Management Workflow for a Large-Scale Sensor Network
NASA Astrophysics Data System (ADS)
Jones, A. S.; Horsburgh, J. S.
2014-12-01
Monitoring with in situ environmental sensors and other forms of field-based observation presents many challenges for data management, particularly for large-scale networks consisting of multiple sites, sensors, and personnel. The availability and utility of these data in addressing scientific questions relies on effective cyberinfrastructure that facilitates transformation of raw sensor data into functional data products. It also depends on the ability of researchers to share and access the data in useable formats. In addition to addressing the challenges presented by the quantity of data, monitoring networks need practices to ensure high data quality, including procedures and tools for post processing. Data quality is further enhanced if practitioners are able to track equipment, deployments, calibrations, and other events related to site maintenance and associate these details with observational data. In this presentation we will describe the overall workflow that we have developed for research groups and sites conducting long term monitoring using in situ sensors. Features of the workflow include: software tools to automate the transfer of data from field sites to databases, a Python-based program for data quality control post-processing, a web-based application for online discovery and visualization of data, and a data model and web interface for managing physical infrastructure. By automating the data management workflow, the time from collection to analysis is reduced and sharing and publication is facilitated. The incorporation of metadata standards and descriptions and the use of open-source tools enhances the sustainability and reusability of the data. We will describe the workflow and tools that we have developed in the context of the iUTAH (innovative Urban Transitions and Aridregion Hydrosustainability) monitoring network. The iUTAH network consists of aquatic and climate sensors deployed in three watersheds to monitor Gradients Along Mountain to Urban Transitions (GAMUT). The variety of environmental sensors and the multi-watershed, multi-institutional nature of the network necessitate a well-planned and efficient workflow for acquiring, managing, and sharing sensor data, which should be useful for similar large-scale and long-term networks.
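As a concrete illustration of the kind of automated quality-control pass such a post-processing step might include, the sketch below flags out-of-range values and abrupt spikes in a sensor series. It is a minimal Python toy under assumed thresholds and flag codes, not the iUTAH group's actual quality-control program.

```python
import numpy as np

def qc_flags(values, lo, hi, max_step):
    """Flag out-of-range values and abrupt spikes in a sensor series.
    Thresholds and flag codes are illustrative assumptions."""
    v = np.asarray(values, dtype=float)
    flags = np.zeros(v.size, dtype=int)            # 0 = pass
    flags[(v < lo) | (v > hi)] = 1                 # 1 = outside sensor range
    step = np.abs(np.diff(v, prepend=v[0]))        # jump from previous sample
    flags[(step > max_step) & (flags == 0)] = 2    # 2 = spike
    return flags

temps = [12.1, 12.3, 12.2, 45.0, 12.4, -9999.0, 12.5]
print(qc_flags(temps, lo=-5, hi=40, max_step=5))
# -> [0 0 0 1 2 1 2]; spike flags land on neighbors of the bad points
```

In a full workflow a pass like this would run automatically on ingest, with flagged values withheld from the public visualization until reviewed.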
Life satisfaction and quality in Korean War veterans five decades after the war.
Ikin, J F; Sim, M R; McKenzie, D P; Horsley, K W A; Wilson, E J; Harrex, W K; Moore, M R; Jelfs, P L; Henderson, S
2009-05-01
Military service is considered to be a hidden variable underlying current knowledge about well-being in the elderly. This study aimed to examine life satisfaction and quality of life in Australia's surviving male Korean War veterans and a community comparison group, and to investigate any association with war deployment-related factors. Participants completed a postal questionnaire which included the Life Satisfaction Scale, the brief World Health Organization Quality of Life (WHOQOL-Bref) questionnaire and the Combat Exposure Scale. Korean War veterans reported significantly lower Percentage Life Satisfaction (PLS) and quality of life scores on four WHOQOL-Bref domains, compared with similarly aged Australian men (each p value <0.001). These outcomes were most strongly associated with severity of combat exposure and low rank. Mean PLS was approximately 15% lower in veterans who reported heavy combat compared with those reporting no combat, and approximately 12% lower in enlisted ranked veterans compared with officers. Fifty years after the Korean War, life satisfaction and quality of life in Australian veterans are poor relative to other Australian men, and are associated with deployment-related factors including combat severity and low rank. In order to respond effectively to current and projected population health needs, nations with large veteran populations may need to consider the impact of military service on well-being in later life.
Coiling of elastic rods on rigid substrates
Jawed, Mohammad K.; Da, Fang; Joo, Jungseock; Grinspun, Eitan; Reis, Pedro M.
2014-01-01
We investigate the deployment of a thin elastic rod onto a rigid substrate and study the resulting coiling patterns. In our approach, we combine precision model experiments, scaling analyses, and computer simulations toward developing predictive understanding of the coiling process. Both cases of deposition onto static and moving substrates are considered. We construct phase diagrams for the possible coiling patterns and characterize them as a function of the geometric and material properties of the rod, as well as the height and relative speeds of deployment. The modes selected and their characteristic length scales are found to arise from a complex interplay between gravitational, bending, and twisting energies of the rod, coupled to the geometric nonlinearities intrinsic to the large deformations. We give particular emphasis to the first sinusoidal mode of instability, which we find to be consistent with a Hopf bifurcation, and analyze the meandering wavelength and amplitude. Throughout, we systematically vary natural curvature of the rod as a control parameter, which has a qualitative and quantitative effect on the pattern formation, above a critical value that we determine. The universality conferred by the prominent role of geometry in the deformation modes of the rod suggests using the gained understanding as design guidelines, in the original applications that motivated the study. PMID:25267649
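The characteristic length scale invoked above can be made explicit from the balance the abstract describes: equating the rod's bending resistance to its weight per unit length gives the standard gravito-bending scale. The notation below is generic; the paper's exact definition and prefactor may differ.

```latex
% Bending stiffness EI resisting curvature balances weight per unit
% length \lambda g, so L^3 \sim EI/(\lambda g):
L_{gb} = \left(\frac{EI}{\lambda\, g}\right)^{1/3}
```

Here EI is the bending stiffness, λ the rod's mass per unit length, and g gravitational acceleration; meandering wavelengths and coiling radii are then naturally reported in units of this length.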
Analytics and Visualization Pipelines for Big Data on the NASA Earth Exchange (NEX) and OpenNEX
NASA Astrophysics Data System (ADS)
Chaudhary, A.; Votava, P.; Nemani, R. R.; Michaelis, A.; Kotfila, C.
2016-12-01
We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging of HPC and cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both high-performance computing (HPC) and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines of both the production process and the data products, and enable sharing results with the community. Our project is developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics and visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD) project, for which we are developing a new QA pipeline for the 25 PB system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ade, P. A. R.; Aikin, R. W.; Bock, J. J.
2015-06-20
bicep2 and the Keck Array are polarization-sensitive microwave telescopes that observe the cosmic microwave background (CMB) from the South Pole at degree angular scales in search of a signature of inflation imprinted as B-mode polarization in the CMB. bicep2 was deployed in late 2009, observed for three years until the end of 2012 at 150 GHz with 512 antenna-coupled transition edge sensor bolometers, and has reported a detection of B-mode polarization on degree angular scales. The Keck Array was first deployed in late 2010 and will observe through 2016 with five receivers at several frequencies (95, 150, and 220 GHz). bicep2 and the Keck Array share a common optical design and employ the field-proven bicep1 strategy of using small-aperture, cold, on-axis refractive optics, providing excellent control of systematics while maintaining a large field of view. This design allows for full characterization of far-field optical performance using microwave sources on the ground. Here we describe the optical design of both instruments and report a full characterization of the optical performance and beams of bicep2 and the Keck Array at 150 GHz.
Rule-Based vs. Behavior-Based Self-Deployment for Mobile Wireless Sensor Networks
Urdiales, Cristina; Aguilera, Francisco; González-Parada, Eva; Cano-García, Jose; Sandoval, Francisco
2016-01-01
In mobile wireless sensor networks (MWSN), nodes are allowed to move autonomously for deployment. This process is meant (i) to achieve good coverage and (ii) to distribute the communication load as homogeneously as possible. Rather than optimizing deployment, reactive algorithms are based on a set of rules or behaviors that let nodes determine when to move. This paper presents an experimental evaluation of both reactive deployment approaches: rule-based and behavior-based. Specifically, we compare a backbone dispersion algorithm with a social potential fields algorithm. Most tests are done under simulation for a large number of nodes in environments with and without obstacles. Results are validated using a small robot network in the real world. Our results show that behavior-based deployment tends to provide better coverage and communication balance, especially for a large number of nodes in areas with obstacles. PMID:27399709
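To illustrate the behavior-based style, here is a minimal social-potential-fields update in Python: each node feels an inverse-square repulsion and a weak attraction from every other node and moves down the resulting force, dispersing without any global optimization. The force law and constants are illustrative assumptions, not the parameters evaluated in the paper.

```python
import numpy as np

def spf_step(pos, c_rep=1.0, c_att=0.1, dt=0.1):
    """One social-potential-fields update: inverse-square repulsion
    plus weak linear attraction between all node pairs (toy constants)."""
    force = np.zeros_like(pos)
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i == j:
                continue
            d = pos[i] - pos[j]
            r = np.linalg.norm(d) + 1e-9
            # repulsion dominates up close, attraction far away
            force[i] += (c_rep / r**2 - c_att * r) * d / r
    return pos + dt * force

rng = np.random.default_rng(1)
nodes = rng.uniform(0, 1, size=(20, 2))   # 20 nodes start in a unit square
for _ in range(200):
    nodes = spf_step(nodes)
print("spread:", nodes.std(axis=0))       # nodes disperse to an equilibrium
```

The equilibrium spacing falls where repulsion and attraction balance, which is the emergent, rule-free coverage behavior the paper contrasts with explicit rule-based dispersion.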
Improve California trap programs for detection of fruit flies
USDA-ARS?s Scientific Manuscript database
There are >160,000 federal and state fruit fly detection traps deployed in southern and western U.S. states and Puerto Rico. In California alone, >100,000 traps are deployed and maintained just for exotic fruit fly detection. Fruit fly detection and eradication requires deployment of large numbers...
Appendange deployment mechanism for the Hubble Space Telescope program
NASA Technical Reports Server (NTRS)
Greenfield, H. T.
1985-01-01
The key requirements, a design overview, development testing (qualification levels), and two problems resolved during the mechanism development testing phase, together with their solutions, are presented. The mechanism described herein has demonstrated its capability to deploy and restow two large Hubble Space Telescope deployable appendages in a varying but controlled manner.
NASA Astrophysics Data System (ADS)
Schmalstieg, Dieter; Langlotz, Tobias; Billinghurst, Mark
Augmented Reality (AR) was first demonstrated in the 1960s, but only recently have technologies emerged that can be used to easily deploy AR applications to many users. Camera-equipped cell phones with significant processing power and graphics abilities provide an inexpensive and versatile platform for AR applications, while the social networking technology of Web 2.0 provides a large-scale infrastructure for collaboratively producing and distributing geo-referenced AR content. This combination of widely used mobile hardware and Web 2.0 software allows the development of a new type of AR platform that can be used on a global scale. In this paper we describe the Augmented Reality 2.0 concept and present existing work on mobile AR and web technologies that could be used to create AR 2.0 applications.
NASA Astrophysics Data System (ADS)
Wolf, Matthias; Woodard, Anna; Li, Wenzhao; Hurtado Anampa, Kenyi; Tovar, Benjamin; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas
2017-10-01
The University of Notre Dame (ND) CMS group operates a modest-sized Tier-3 site suitable for local, final-stage analysis of CMS data. However, through the ND Center for Research Computing (CRC), Notre Dame researchers have opportunistic access to roughly 25k CPU cores of computing and a 100 Gb/s WAN network link. To understand the limits of what might be possible in this scenario, we undertook to use these resources for a wide range of CMS computing tasks, from user analysis through large-scale Monte Carlo production (including both detector simulation and data reconstruction). We will discuss the challenges inherent in effectively utilizing CRC resources for these tasks and the solutions deployed to overcome them.
Reactor concepts for bioelectrochemical syntheses and energy conversion.
Krieg, Thomas; Sydow, Anne; Schröder, Uwe; Schrader, Jens; Holtmann, Dirk
2014-12-01
In bioelectrochemical systems (BESs) at least one electrode reaction is catalyzed by microorganisms or isolated enzymes. One of the existing challenges for BESs is shifting the technology towards industrial use and engineering reactor systems at adequate scales. Due to the fact that most BESs are usually deployed in the production of large-volume but low-value products (e.g., energy, fuels, and bulk chemicals), investment and operating costs must be minimized. Recent advances in reactor concepts for different BESs, in particular biofuel cells and electrosynthesis, are summarized in this review including electrode development and first applications on a technical scale. A better understanding of the impact of reactor components on the performance of the reaction system is an important step towards commercialization of BESs. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Inamori, Takaya; Sugawara, Yoshiki; Satou, Yasutaka
2015-12-01
Increasingly, spacecraft are installed with large-area structures that are extended and deployed post-launch. These extensible structures have been applied in several missions for power generation, thermal radiation, and solar propulsion. Here, we propose a deployment and retraction method using the electromagnetic force generated when the geomagnetic field interacts with electric current flowing on extensible panels. The panels are installed on a satellite in low Earth orbit. Specifically, electrical wires placed on the extensible panels generate magnetic moments, which interfere with the geomagnetic field. The resulting repulsive and retraction forces enable panel deployment and retraction. In the proposed method, a satellite realizes structural deployment using simple electrical wires. Furthermore, the satellite can achieve not only deployment but also retraction, for avoiding damage from space debris and for agile attitude maneuvers. Moreover, because the proposed method realizes quasi-static deployment and retraction of panels by electromagnetic forces, low impulsive force is exerted on fragile panels. The electrical wires can also be used to detect the panel deployment and retraction and to generate a large magnetic moment for attitude control. The proposed method was assessed in numerical simulations based on multibody dynamics. Simulation results show that a small cubic satellite with a wire current of 25 AT deployed 4 panels (20 cm × 20 cm) in 500 s and retracted them in 100 s.
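A back-of-envelope check of the actuation scale is straightforward from the figures quoted above (25 ampere-turns, 20 cm × 20 cm panels); the geomagnetic field strength is an assumed round value for low Earth orbit, not a number from the paper.

```python
# Magnitude check for wire-loop actuation in the geomagnetic field.
# 25 AT and the 20 cm panel come from the abstract; B is assumed.
N_I = 25.0            # ampere-turns in one panel loop
A   = 0.20 * 0.20     # loop area, m^2
B   = 3.0e-5          # geomagnetic field in LEO, tesla (assumed round value)

m = N_I * A           # magnetic moment, A m^2
tau_max = m * B       # peak torque on the panel, N m
print(f"m = {m:.2f} A m^2, peak torque = {tau_max:.1e} N m")
# ~1 A m^2 and ~3e-5 N m: a tiny actuation torque, consistent with the
# quasi-static, low-impulse deployment the abstract emphasizes.
```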
Ice911 Research: Preserving and Rebuilding Multi-Year Ice
NASA Astrophysics Data System (ADS)
Field, L. A.; Chetty, S.; Manzara, A.
2013-12-01
A localized surface albedo modification technique is being developed that shows promise as a method to increase multi-year ice using reflective floating materials, chosen so as to have low subsidiary environmental impact. Multi-year ice has diminished rapidly in the Arctic over the past 3 decades (Riihela et al, Nature Climate Change, August 4, 2013) and this plays a part in the continuing rapid decrease of summer-time ice. As summer-time ice disappears, the Arctic is losing its ability to act as the earth's refrigeration system, and this has widespread climatic effects, as well as a direct effect on sea level rise, as oceans heat, and once-land-based ice melts into the sea. We have tested the albedo modification technique on a small scale over five Winter/Spring seasons at sites including California's Sierra Nevada Mountains, a Canadian lake, and a small man-made lake in Minnesota, using various materials and an evolving array of instrumentation. The materials can float and can be made to minimize effects on marine habitat and species. The instrumentation is designed to be deployed in harsh and remote locations. Localized snow and ice preservation, and reductions in water heating, have been quantified in small-scale testing. Climate modeling is underway to analyze the effects of this method of surface albedo modification in key areas on the rate of oceanic and atmospheric temperature rise. We are also evaluating the effects of snow and ice preservation for protection of infrastructure and habitat stabilization. This paper will also discuss a possible reduction of sea level rise with an eye to quantification of cost/benefit. The most recent season's experimentation on a man-made private lake in Minnesota saw further evolution in the material and deployment approach. The materials were successfully deployed to shield underlying snow and ice from melting; applications of granular materials remained stable in the face of local wind and storms. Localized albedo modification options such as the one being studied in this work may act to preserve ice, glaciers, permafrost and seasonal snow areas, and perhaps aid natural ice formation processes. If this method could be deployed on a large enough scale, it could conceivably bring about a reduction in the Ice-Albedo Feedback Effect, possibly slowing one of the key effects and factors in climate change. (Figure: test site at a man-made lake in Minnesota, 2013.)
Ellerbrok, Heinz; Jacobsen, Sonja; Patel, Pranav; Rieger, Toni; Eickmann, Markus; Becker, Stephan; Günther, Stephan; Naidoo, Dhamari; Schrick, Livia; Keeren, Kathrin; Targosz, Angelina; Teichmann, Anette; Formenty, Pierre; Niedrig, Matthias
2017-05-01
During the recent Ebola outbreak in West Africa, several international mobile laboratories were deployed to the mainly affected countries Guinea, Sierra Leone and Liberia to provide ebolavirus diagnostic capacity. Additionally, imported cases and small outbreaks in other countries required global preparedness for Ebola diagnostics. Detection of viral RNA by reverse transcription polymerase chain reaction has proven effective for diagnosis of ebolavirus disease, and several assays are available. However, the reliability of these assays is largely unknown and requires serious evaluation. Therefore, a proficiency test panel of 11 samples was generated and distributed on a global scale. Panels were analyzed by 83 expert laboratories and 106 data sets were returned. Of these, 78 results were rated optimal and 3 acceptable, while 25 indicated a need for improvement. While performance of the laboratories deployed to West Africa was superior to the overall performance, there was no significant difference between the different assays applied.
Deployment of the OSIRIS EM-PIC code on the Intel Knights Landing architecture
NASA Astrophysics Data System (ADS)
Fonseca, Ricardo
2017-10-01
Electromagnetic particle-in-cell (EM-PIC) codes such as OSIRIS have found widespread use in modelling the highly nonlinear and kinetic processes that occur in several relevant plasma physics scenarios, ranging from astrophysical settings to high-intensity laser plasma interaction. Being computationally intensive, these codes require large-scale HPC systems and a continuous effort in adapting the algorithm to new hardware and computing paradigms. In this work, we report on our efforts in deploying the OSIRIS code on the new Intel Knights Landing (KNL) architecture. Unlike the previous generation (Knights Corner), these boards are standalone systems and introduce several new features, including the new AVX-512 instructions and on-package MCDRAM. We will focus on the parallelization and vectorization strategies followed, as well as memory management, and present a detailed evaluation of code performance in comparison with the CPU code. This work was partially supported by Fundação para a Ciência e a Tecnologia (FCT), Portugal, through Grant No. PTDC/FIS-PLA/2940/2014.
Broadhurst, Melanie; Orme, C David L
2014-08-01
The addition of man-made structures to the marine environment is known to increase the physical complexity of the seafloor, which can influence benthic species community patterns and habitat structure. However, knowledge of how deployed tidal energy device structures influence benthic communities is currently lacking. Here we examined species biodiversity, composition and habitat type surrounding a tidal energy device within the European Marine Energy Centre test site, Orkney. Commercial fishing and towed video camera techniques were used over three temporal periods, from 2009 to 2010. Our results showed increased species biodiversity and compositional differences within the device site, compared to a control site. Both sites largely comprised crustacean species, omnivorous or predatory feeding regimes and marine tide-swept EUNIS habitat types, which varied over time. We conclude that the device could act as a localised artificial reef structure, but that further in-depth investigations are required. Copyright © 2014 Elsevier Ltd. All rights reserved.
WRAP-RIB antenna technology development
NASA Technical Reports Server (NTRS)
Freeland, R. E.; Garcia, N. F.; Iwamoto, H.
1985-01-01
The wrap-rib deployable antenna concept development is based on a combination of hardware development and testing along with extensive supporting analysis. The proof-of-concept hardware models are large enough to address the same basic problems of design, fabrication, assembly, and test as the full-scale systems, which were set at 100 meters at the beginning of the program. The hardware evaluation program consists of functional performance tests, design verification tests and analytical model verification tests. Functional testing consists of kinematic deployment, mesh management and verification of mechanical packaging efficiencies. Design verification consists of rib contour precision measurement, rib cross-section variation evaluation, rib materials characterization and manufacturing imperfections assessment. Analytical model verification and refinement include mesh stiffness measurement, rib static and dynamic testing, mass measurement, and rib cross-section characterization. This concept was considered for a number of potential applications, including mobile communications, VLBI, and aircraft surveillance. In fact, baseline system configurations were developed by JPL, using the appropriate wrap-rib antenna, for all three classes of applications.
LESS: Link Estimation with Sparse Sampling in Intertidal WSNs
Ji, Xiaoyu; Chen, Yi-chao; Li, Xiaopeng; Xu, Wenyuan
2018-01-01
Deploying wireless sensor networks (WSN) in the intertidal area is an effective approach for environmental monitoring. To sustain reliable data delivery in such a dynamic environment, a link quality estimation mechanism is crucial. However, our observations in two real WSN systems deployed in intertidal areas reveal that link updates in routing protocols often waste energy and bandwidth due to frequent link quality measurements and updates. In this paper, we carefully investigate the network dynamics using real-world sensor network data and find it feasible to achieve accurate estimation of link quality using sparse sampling. We design and implement a compressive-sensing-based link quality estimation protocol, LESS, which incorporates both spatial and temporal characteristics of the system to aid the link update in routing protocols. We evaluate LESS in both real WSN systems and a large-scale simulation, and the results show that LESS can reduce energy and bandwidth consumption by up to 50% while still achieving more than 90% link quality estimation accuracy. PMID:29494557
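The underlying compressive-sensing idea, recovering a series that is sparse in some basis from far fewer samples than its length, can be sketched in a few lines. The toy below samples a smooth "link quality" series at 25% and recovers it with orthogonal matching pursuit over a DCT basis; it illustrates the principle only and is not the LESS protocol, whose sampling design and spatial-temporal model are specific to the paper.

```python
import numpy as np
from scipy.fftpack import idct

rng = np.random.default_rng(0)
n = 128                                   # length of the full series
t = np.arange(n)
# toy link-quality series: smooth, hence approximately sparse in the DCT
x = 0.7 + 0.2*np.cos(2*np.pi*t/64) + 0.05*np.cos(2*np.pi*t/16)

m = 32                                    # only 25% of samples observed
idx = np.sort(rng.choice(n, m, replace=False))
y = x[idx]

Psi = idct(np.eye(n), norm='ortho', axis=0)   # DCT synthesis basis
A = Psi[idx, :]                               # rows at the sampled instants

# orthogonal matching pursuit: greedily pick k atoms, refit by least squares
k, support, r = 8, [], y.copy()
for _ in range(k):
    support.append(int(np.argmax(np.abs(A.T @ r))))
    sub = A[:, support]
    coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
    r = y - sub @ coef
xhat = Psi[:, support] @ coef
print("relative error:", np.linalg.norm(xhat - x) / np.linalg.norm(x))
```

The same logic is what lets a protocol measure a link only occasionally yet still track its quality, trading reconstruction error against measurement energy.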
Underwater Sensor Network Redeployment Algorithm Based on Wolf Search
Jiang, Peng; Feng, Yang; Wu, Feng
2016-01-01
This study addresses the optimization of node redeployment coverage in underwater wireless sensor networks. Given that nodes can easily fail in harsh environments and that underwater wireless sensor networks operate at large scale, a redeployment algorithm based on wolf search was developed. The study applies the wolf search algorithm, combined with crowding-degree control, to the deployment of underwater wireless sensor networks. The proposed algorithm uses nodes to ensure coverage of the events, avoids premature convergence, and achieves good coverage. In addition, considering that obstacles exist in the underwater environment, nodes are kept from failing by imitating the predator-avoidance mechanism, which reduces the energy consumption of the network. Comparative analysis shows that the algorithm is simple and effective for wireless sensor network deployment. Compared with the optimized artificial fish swarm algorithm, the proposed algorithm exhibits advantages in network coverage, energy conservation, and obstacle avoidance. PMID:27775659
Forward and correctional OFDM-based visible light positioning
NASA Astrophysics Data System (ADS)
Li, Wei; Huang, Zhitong; Zhao, Runmei; He, Peixuan; Ji, Yuefeng
2017-09-01
Visible light positioning (VLP) has attracted much attention in both academic and industrial areas due to the extensive deployment of light-emitting diodes (LEDs) as next-generation green lighting. Generally, the coverage of a single LED lamp is limited, so LED arrays are often used to achieve uniform illumination within large-scale indoor environments. However, in such dense LED deployment scenarios, the superposition of the light signals becomes an important challenge for accurate VLP. To solve this problem, we propose a forward and correctional orthogonal frequency division multiplexing (OFDM)-based VLP (FCO-VLP) scheme with low complexity in the generation and processing of signals. In the first, forward procedure of FCO-VLP, an initial position is obtained by the trilateration method based on OFDM subcarriers. The positioning accuracy is further improved in the second, correctional procedure based on a database of reference points. As demonstrated in our experiments, our approach yields an improved average positioning error of 4.65 cm and improves positioning accuracy by 24.2% compared with the trilateration method.
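The forward procedure rests on ordinary trilateration, which can be sketched as a least-squares solve after linearizing the range circles against one anchor. The LED coordinates and ranges below are made up for illustration; a real FCO-VLP system would derive the ranges from per-subcarrier received signal strength, which this toy skips.

```python
import numpy as np

def trilaterate(anchors, d):
    """Least-squares 2-D position from >=3 anchors and ranges.
    Subtracting the first circle equation from the others removes the
    quadratic terms, leaving a linear system in (x, y)."""
    x1, y1 = anchors[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], d[1:]):
        A.append([2*(xi - x1), 2*(yi - y1)])
        b.append(d[0]**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos

leds = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]   # hypothetical lamp positions, m
true = np.array([1.2, 2.5])
d = [np.linalg.norm(true - np.array(l)) for l in leds]
print(trilaterate(leds, d))                    # -> approximately [1.2, 2.5]
```

Range errors from signal superposition distort exactly this solve, which is what the correctional, fingerprint-database stage is there to clean up.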
Learning through a portfolio of carbon capture and storage demonstration projects
NASA Astrophysics Data System (ADS)
Reiner, David M.
2016-01-01
Carbon dioxide capture and storage (CCS) technology is considered by many to be an essential route to meet climate mitigation targets in the power and industrial sectors. Deploying CCS technologies globally will first require a portfolio of large-scale demonstration projects. These first projects should assist learning by diversity, learning by replication, de-risking the technologies and developing viable business models. From 2005 to 2009, optimism about the pace of CCS rollout led to mutually independent efforts in the European Union, North America and Australia to assemble portfolios of projects. Since 2009, only a few of these many project proposals remain viable, but the initial rationales for demonstration have not been revisited in the face of changing circumstances. Here I argue that learning is now both more difficult and more important given the slow pace of deployment. Developing a more coordinated global portfolio will facilitate learning across projects and may determine whether CCS ever emerges from the demonstration phase.
Qvist, Staffan A.; Brook, Barry W.
2015-01-01
There is an ongoing debate about the deployment rates and composition of alternative energy plans that could feasibly displace fossil fuels globally by mid-century, as required to avoid the more extreme impacts of climate change. Here we demonstrate the potential for a large-scale expansion of global nuclear power to replace fossil-fuel electricity production, based on empirical data from the Swedish and French light water reactor programs of the 1960s to 1990s. Analysis of these historical deployments shows that if the world built nuclear power at no more than the per capita rate of these exemplar nations during their national expansion, then coal- and gas-fired electricity could be replaced worldwide in less than a decade. Under more conservative projections that take into account probable constraints and uncertainties such as differing relative economic output across regions, current and past unit construction time and costs, future electricity demand growth forecasts and the retiring of existing aging nuclear plants, our modelling estimates that the global share of fossil-fuel-derived electricity could be replaced within 25–34 years. This would allow the world to meet the most stringent greenhouse-gas mitigation targets. PMID:25970621
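The core of the argument is simple per-capita arithmetic, sketched below with deliberately round numbers; all inputs are illustrative assumptions for the sketch, not the paper's figures.

```python
# Round-number version of the per-capita scaling argument.
build_gw, pop_m, years = 10.0, 8.5, 20          # Sweden-like national build-out
rate = build_gw / pop_m / years                 # GW per million people per year

world_pop_m = 7500.0
fossil_twh  = 15000.0                           # assumed annual fossil electricity
fossil_gw   = fossil_twh * 1000 / 8760 / 0.9    # firm capacity at 90% capacity factor

world_rate_gw = rate * world_pop_m              # GW added per year at that pace
print(f"replacement time ~ {fossil_gw / world_rate_gw:.0f} years")
```

With these rough inputs the answer comes out at a handful of years, which is why the headline per-capita result lands under a decade before the paper's more conservative constraints stretch it to 25-34 years.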
Nguyen, Ha T.; Pearce, Joshua M.; Harrap, Rob; Barber, Gerald
2012-01-01
A methodology is provided for the application of Light Detection and Ranging (LiDAR) to automated solar photovoltaic (PV) deployment analysis on the regional scale. Challenges in urban information extraction and management for solar PV deployment assessment are determined and quantitative solutions are offered. This paper provides the following contributions: (i) a methodology that is consistent with recommendations from existing literature advocating the integration of cross-disciplinary competences in remote sensing (RS), GIS, computer vision and urban environmental studies; (ii) a robust methodology that can work with low-resolution, incomplete data and reconstruct vegetation and buildings separately but concurrently; (iii) recommendations for future generations of software. A case study is presented as an example of the methodology. Experience from the case study, such as the trade-off between time consumption and data quality, is discussed to highlight the need for connectivity between demographic information, electrical engineering schemes and GIS, and a typical fraction of solar-useful roofs extracted per method. Finally, conclusions are developed to provide a final methodology to extract the most useful information from the lowest-resolution and least comprehensive data to provide solar electric assessments over large areas, which can be adapted anywhere in the world. PMID:22666044
Cognitive systems at the point of care: The CREDO program.
Fox, John
2017-04-01
CREDO is a framework for understanding human expertise and for designing and deploying systems that support cognitive tasks like situation and risk assessment, decision-making, therapy planning and workflow management. The framework has evolved through an extensive program of research on human decision-making and clinical practice. It draws on concepts from cognitive science, and has contributed new results to cognitive theory and understanding of human expertise and knowledge-based AI. These results are exploited in a suite of technologies for designing, implementing and deploying clinical services, early versions of which were reported by Das et al. (1997) [9] and Fox and Das (2000) [26]. A practical outcome of the CREDO program is a technology stack, a key element of which is an agent specification language (PROforma: Sutton and Fox (2003) [55]) which has proved to be a versatile tool for designing point of care applications in many clinical specialties and settings. Since software became available for implementing and deploying PROforma applications many kinds of services have been successfully built and trialed, some of which are in large-scale routine use. This retrospective describes the foundations of the CREDO model, summarizes the main theoretical, technical and clinical contributions, and discusses benefits of the cognitive approach. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.
Moran Jay, Brighid; Howard, David; Hughes, Nick; Whitaker, Jeanette; Anandarajah, Gabrial
2014-01-01
Low carbon energy technologies are not deployed in a social vacuum; there are a variety of complex ways in which people understand and engage with these technologies and the changing energy system overall. However, the role of the public's socio-environmental sensitivities to low carbon energy technologies and their responses to energy deployments does not receive much serious attention in planning decarbonisation pathways to 2050. Resistance to certain resources and technologies based on particular socio-environmental sensitivities would alter the portfolio of options available which could shape how the energy system achieves decarbonisation (the decarbonisation pathway) as well as affecting the cost and achievability of decarbonisation. Thus, this paper presents a series of three modelled scenarios which illustrate the way that a variety of socio-environmental sensitivities could impact the development of the energy system and the decarbonisation pathway. The scenarios represent risk aversion (DREAD) which avoids deployment of potentially unsafe large-scale technology, local protectionism (NIMBY) that constrains systems to their existing spatial footprint, and environmental awareness (ECO) where protection of natural resources is paramount. Very different solutions for all three sets of constraints are identified; some seem slightly implausible (DREAD) and all show increased cost (especially in ECO).
A New Tool for Quality: The Internal Audit.
Haycock, Camille; Schandl, Annette
As health care systems aspire to improve the quality and value for the consumers they serve, quality outcomes must be at the forefront of this value equation. As organizations implement evidence-based practices, electronic records to standardize processes, and quality improvement initiatives, many tactics are deployed to accelerate improvement and care outcomes. This article describes how one organization utilized a formal clinical audit process to identify gaps and/or barriers that may be contributing to underperforming measures and outcomes. This partnership between quality and audit can be a powerful tool and produce insights that can be scaled across a large health care system.
Role of Concentrating Solar Power in Integrating Solar and Wind Energy: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denholm, P.; Mehos, M.
2015-06-03
As wind and solar photovoltaics (PV) increase in penetration, it becomes increasingly important to examine enabling technologies that can help integrate these resources at large scale. Concentrating solar power (CSP), when deployed with thermal energy storage (TES), can provide multiple services that help integrate variable generation (VG) resources such as wind and PV. CSP with TES can provide firm, highly flexible capacity, reducing the minimum generation constraints that limit penetration and result in curtailment. By acting as an enabling technology, CSP can complement PV and wind, substantially increasing their penetration in locations with adequate solar resource.
Experimental Demonstration of Technologies for Autonomous On-Orbit Robotic Assembly
NASA Technical Reports Server (NTRS)
LeMaster, Edward A.; Schaechter, David B.; Carrington, Connie K.
2006-01-01
The Modular Reconfigurable High Energy (MRHE) program aimed to develop technologies for the automated assembly and deployment of large-scale space structures and aggregate spacecraft. Part of the project involved creation of a terrestrial robotic testbed for validation and demonstration of these technologies and for the support of future development activities. This testbed was completed in 2005, and was thereafter used to demonstrate automated rendezvous, docking, and self-assembly tasks between a group of three modular robotic spacecraft emulators. This paper discusses the rationale for the MRHE project, describes the testbed capabilities, and presents the MRHE assembly demonstration sequence.
DRAGON - 8U Nanosatellite Orbital Deployer
NASA Technical Reports Server (NTRS)
Dobrowolski, Marcin; Grygorczuk, Jerzy; Kedziora, Bartosz; Tokarz, Marta; Borys, Maciej
2014-01-01
The Space Research Centre of the Polish Academy of Sciences (SRC PAS) together with Astronika company have developed an Orbital Deployer called DRAGON for ejection of the Polish scientific nanosatellite BRITE-PL Heweliusz (Fig. 1). The device has three unique mechanisms including an adopted and scaled lock and release mechanism from the ESA Rosetta mission MUPUS instrument. This paper discusses major design restrictions of the deployer, unique design features, and lessons learned from development through testing.
Policy Considerations for Commercializing Natural Gas and Biomass CCUS
NASA Astrophysics Data System (ADS)
Abrahams, L.; Clavin, C.
2017-12-01
Captured CO2 from power generation has been discussed as an opportunity to improve the environmental sustainability of fossil fuel-based electricity generation and as a technological solution likely necessary for meeting long-term climate change mitigation goals. In our presentation, we review the findings of a study of natural gas CCUS technology research and development and discuss their application to the potential of biomass CCUS technology. Based on interviews conducted with key stakeholders in CCUS technology development and operations, this presentation will discuss these technical and economic challenges and potential policy opportunities to support commercial-scale CCUS deployment. In current domestic electricity and oil markets, CCUS faces economic challenges for commercial deployment. In particular, the economic viability of CCUS has been impacted by sustained low oil prices that have limited the potential for enhanced oil recovery (EOR) to serve as a near-term utilization opportunity for the captured CO2. In addition, large-scale commercial adoption of CCUS is constrained by regulatory inconsistencies and uncertainties across the United States, high initial capital costs, the need to gain familiarity with new technology applications in existing markets, the need to develop a successful performance track record to secure financing agreements, and competition against well-established incumbent technologies. CCUS also faces additional technical hurdles for measurement, verification, and reporting within states that have existing policy and regulatory frameworks for climate change mitigation. In addition to fossil-fuel-based CCUS, we will discuss emerging opportunities to utilize CCUS fueled by gasified biomass, resulting in carbon-negative power generation with expanded economic opportunities associated with the enhanced carbon sequestration. Successful development of CCUS technology requires a portfolio of research leading to technical advances, advances in financial instruments to leverage the benefits of multiple commodity markets (e.g., natural gas, oil, biomass), and policy instruments that address the regulatory hurdles posed by CCUS technology deployment.
Bock, Christian; Demiris, George; Choi, Yong; Le, Thai; Thompson, Hilaire J; Samuel, Arjmand; Huang, Danny
2016-03-11
The use of smart home sensor systems is growing, primarily due to the appeal of unobtrusively monitoring older adult health and wellness. However, integrating large-scale sensor systems within residential settings can be challenging when deployment takes place across multiple environments, requiring customization of applications, connection across various devices, and effective visualization of complex longitudinal data. The objective of the study was to demonstrate the implementation of a smart home system using an open, extensible platform in a real-world setting and to develop an application to visualize data in real time. We deployed the open source Lab of Things platform in a house of 11 residents as a demonstration of feasibility over the course of 3 months. The system consisted of Aeon Labs Z-Wave Door/Window sensors and an Aeon Labs Multi-Sensor that collected data on motion, temperature, luminosity, and humidity. We applied a Rapid Iterative Testing and Evaluation approach to designing a visualization interface, engaging gerontological experts. We then conducted a survey with 19 older adult and caregiver stakeholders to inform further design revisions. Our initial visualization mockups consisted of a bar chart representing activity level over time. Family members felt comfortable using the application. Older adults, however, indicated it would be difficult to learn to use the application and had trouble identifying its utility. A key consideration for older adults was ensuring that the data collected could be utilized by their family members, physicians, or caregivers. The approach described in this work is generalizable to future smart home deployments and can be a valuable guide for researchers looking to scale a study across multiple homes and connected devices and to create personalized interfaces for end users.
NASA Astrophysics Data System (ADS)
Dinsmore, Kerry; Drewer, Julia; Leeson, Sarah; Skiba, Ute; Levy, Pete; George, Charles
2014-05-01
Arctic and sub-Arctic wetlands are a major source of atmospheric CH4 and therefore have the potential to be important in controlling global radiative forcing. Furthermore, the strong links between wetland CH4 emissions and vegetation community, hydrology and temperature suggest potentially large feedbacks between climate change and future emissions. Quantifying current emissions over large spatial scales and predicting future climatic feedbacks requires a fundamental understanding of the ground-based drivers of plot-scale emissions. The MAMM project (Methane in the Arctic: Measurements and Modelling) aims to understand and quantify current CH4 emissions and future climatic impacts by combining both ground and aircraft measurements across the European Arctic with regional computer modelling. Here we present results from the ground-based MAMM measurement campaigns, analysing chamber-measured CH4 emissions from two sites in the European Arctic/sub-Arctic region (Sodankylä, Finland; Stordalen Mire, Sweden) during the growing seasons of 2012 and 2013. A total of 85 wetland static chambers were deployed across the two field sites: 39 at Sodankylä (67°22′01″ N, 26°3′06″ E) in 2012 and 46 at Stordalen Mire (68°21′20″ N, 19°02′56″ E) in 2013. Chamber design, protocol and deployment were the same across both sites. Chambers were located at sites chosen strategically to cover the local range of water table depths and vegetation communities. A total of 18 and 15 repeated measurements were made at each chamber in Sodankylä and Stordalen Mire, respectively, over the snow-free season. Preliminary results show a large range of CH4 fluxes across both sites, from an uptake of up to 0.07 and 0.06 mg CH4-C m-2 hr-1 to emissions of 17.3 and 44.2 mg CH4-C m-2 hr-1 in Sodankylä and Stordalen Mire, respectively. Empirical models based on vegetation community, water table depth, temperature and soil nutrient availability (Plant Root Simulator probes, PRS™) have been constructed with the aim of understanding the drivers of chamber-scale fluxes. By combining measurements made at two different sites, >300 km apart, using the same experimental setup, we are uniquely able to investigate whether CH4 emissions are driven by common parameters. Furthermore, we are able to determine whether plot-scale empirical models and parameterisations can be used effectively to upscale emissions to the landscape and whole-Arctic scale.
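Static-chamber fluxes like those quoted are derived from the slope of headspace concentration over time via the ideal gas law. A minimal version of that standard calculation is sketched below; the chamber dimensions, temperature, and 2 ppm/hr slope are assumed example values, not MAMM measurements.

```python
# Static-chamber CH4-C flux from a headspace concentration slope.
R = 8.314          # J mol-1 K-1, gas constant
P = 101325.0       # Pa, ambient pressure
T = 283.15         # K (10 C, assumed)
V = 0.03           # m^3, assumed chamber volume
A = 0.12           # m^2, assumed chamber footprint
M_C = 12.011       # g/mol, mass of the carbon atom in CH4
dC_dt = 2.0e-6     # mole fraction per hour (a 2 ppm/hr example slope)

mol_air = P * V / (R * T)              # moles of air enclosed in the chamber
flux = dC_dt * mol_air * M_C / A       # g CH4-C m-2 hr-1
print(f"{flux*1000:.2f} mg CH4-C m-2 hr-1")   # -> 0.26 for these inputs
```

Repeating this fit for every chamber visit is what produces the per-chamber flux series that the empirical models are then trained on.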
The Price of Precision: Large-Scale Mapping of Forest Structure and Biomass Using Airborne Lidar
NASA Astrophysics Data System (ADS)
Dubayah, R.
2015-12-01
Lidar remote sensing provides one of the best means for acquiring detailed information on forest structure. However, its application over large areas has been limited largely because of its expense. Nonetheless, extant data exist over many states in the U.S., funded largely by state and federal consortia and mainly for infrastructure, emergency response, flood plain and coastal mapping. These lidar data are almost always acquired in leaf-off seasons, and until recently, usually with low point count densities. Even with these limitations, they provide unprecedented wall-to-wall mappings that enable development of appropriate methodologies for large-scale deployment of lidar. In this talk we summarize our research and lessons learned in deriving forest structure over regional areas as part of NASA's Carbon Monitoring System (CMS). We focus on two areas: the entire state of Maryland and Sonoma County, California. The Maryland effort used low density, leaf-off data acquired by each county in varying epochs, while the on-going Sonoma work employs state-of-the-art, high density, wall-to-wall, leaf-on lidar data. In each area we combine these lidar coverages with high-resolution multispectral imagery from the National Agricultural Imagery Program (NAIP) and in situ plot data to produce maps of canopy height, tree cover and biomass, and compare our results against FIA plot data and national biomass maps. Our work demonstrates that large-scale mapping of forest structure at high spatial resolution is achievable but products may be complex to produce and validate over large areas. Furthermore, fundamental issues involving statistical approaches, plot types and sizes, geolocation, modeling scales, allometry, and even the definitions of "forest" and "non-forest" must be approached carefully. Ultimately, determining the "price of precision", that is, does the value of wall-to-wall forest structure data justify their expense, should consider not only carbon market applications, but the other ways the underlying lidar data may be used.
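The first processing step in this kind of mapping is typically a canopy height model, the difference between a first-return surface and a bare-earth surface, followed by an allometric model to biomass. The sketch below uses tiny made-up rasters and placeholder allometry coefficients; it illustrates the pipeline shape only, not the CMS production method.

```python
import numpy as np

# Toy canopy-height-model step: first-return surface (DSM) minus
# bare-earth surface (DTM), both in meters; values are invented.
dsm = np.array([[210.0, 215.0], [212.0, 230.0]])
dtm = np.array([[208.0, 209.0], [210.0, 211.0]])
chm = np.clip(dsm - dtm, 0, None)       # canopy height, m (no negative heights)

# Hypothetical power-law allometry; real coefficients come from
# fitting lidar metrics to in situ plot biomass.
a, b = 0.5, 1.7
biomass = a * chm**b                    # illustrative Mg/ha per cell
print(chm)
print(biomass.round(1))
```

The hard part the abstract describes sits on either side of this step: geolocation and leaf-off corrections upstream, and plot design, allometric uncertainty, and validation downstream.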
ADEPT - A Mechanically Deployable Entry System Technology in Development at NASA
NASA Technical Reports Server (NTRS)
Venkatapathy, Ethiraj; Wercinski, Paul; Cassell, Alan; Smith, Brandon; Yount, Bryan
2016-01-01
The proposed presentation will give an overview of a mechanically deployable entry system concept in development, with a comprehensive summary of the ground tests and design development completed to date, and current plans for a small-scale flight test in the near future.
3D printing via ambient reactive extrusion
Rios, Orlando; Carter, William G.; Post, Brian K.; ...
2018-03-14
Here, Additive Manufacturing (AM) has the potential to offer many benefits over traditional manufacturing methods in the fabrication of complex parts with advantages such as low weight, complex geometry, and embedded functionality. In practice, today's AM technologies are limited by their slow speed and highly directional properties. To address both issues, we have developed a reactive mixture deposition approach that can enable 3D printing of polymer materials at over 100X the volumetric deposition rate, enabled by a greater than 10X reduction in print head mass compared to existing large-scale thermoplastic deposition methods, with material chemistries that can be tuned for specific properties. Additionally, the reaction kinetics and transient rheological properties are specifically designed for the target deposition rates, enabling the synchronized development of increasing shear modulus and extensive cross-linking across the printed layers. This ambient cure eliminates the internal stresses and bulk distortions that typically hamper AM of large parts, and yields a printed part with inter-layer covalent bonds that significantly improve the strength of the part along the build direction. The fast cure kinetics combined with the fine-tuned viscoelastic properties of the mixture enable rapid vertical builds that are not possible using other approaches. Through rheological characterization of mixtures that were capable of printing in this process, as well as materials that have sufficient structural integrity for layer-on-layer printing, a "printability" rheological phase diagram has been developed and is presented here. We envision this approach implemented as a deployable manufacturing system, where manufacturing is done on-site using the efficiently-shipped polymer, locally-sourced fillers, and a small, deployable print system. Unlike existing additive manufacturing approaches, which require larger and slower print systems and complex thermal management strategies as scale increases, liquid reactive polymers decouple performance and print speed from the scale of the part, enabling a new class of cost-effective, fuel-efficient additive manufacturing.
Laboratory Study on the Effect of Tidal Stream Turbines on Hydrodynamics and Sediment Dynamics
NASA Astrophysics Data System (ADS)
Amoudry, L.; Ramirez-Mendoza, R.; Peter, T.; McLelland, S.; Simmons, S.; Parsons, D. R.; Vybulkova, L.
2016-02-01
Tidal stream turbines (TST) are one potential technology for harnessing tidal energy, and the measurement and characterisation of their wakes is important for both environmental and development reasons. Indeed, wake recovery length is an important parameter for the appropriate design of arrays, and wakes may alter dynamics both in the water column and at the seabed. We will report on laboratory-scale experiments over a mobile sediment bed, which aim to quantify the detailed wake structure and its impact on sediment transport dynamics. A 0.2 m diameter model turbine was installed in a large-scale flume (16 m long, 1.6 m wide, 0.6 m deep) at the University of Hull's Total Environment Simulator, and a steady current was driven over an artificial sediment bed using recirculating pumps. A high-resolution pulse-coherent acoustic Doppler profiler (Nortek Aquadopp HR) was used to measure vertical profiles of the three-dimensional mean current at different locations downstream of the model turbine. A three-dimensional Acoustic Ripple Profiler was used to map the bed and its evolution during the experiments. Acoustic backscatter systems were also deployed in two-dimensional arrays both along and across the flume. These measurements revealed that the presence of the model turbine resulted in the expected reduction of the mean current and in changes in the vertical shear profiles. The bed mapping highlighted horseshoe-shaped scour near the model turbine and sediment deposition in the far-wake region. The model turbine strongly influenced the suspension patterns and generated significant asymmetry in the process, which was also evident in the other measurements (flow and sediment bed). These results highlight the effects induced by TSTs on near-bed hydrodynamics, suspension dynamics, and geomorphology, which may all have to be considered prior to large-scale deployments of arrays of TSTs in shelf seas.
NASA Technical Reports Server (NTRS)
Leidich, C. A. (Editor); Pittman, R. B. (Editor)
1984-01-01
The results of five technology panels which convened to discuss the Large Deployable Reflector (LDR) are presented. The proposed LDR is a large, ambient-temperature, far infrared/submillimeter telescope designed for space. Panel topics included optics, materials and structures, sensing and control, science instruments, and systems and missions. The telescope requirements, the estimated technology levels, and the areas in which the generic technology work has to be augmented are enumerated.
Modeling and analysis of a large deployable antenna structure
NASA Astrophysics Data System (ADS)
Chu, Zhengrong; Deng, Zongquan; Qi, Xiaozhi; Li, Bing
2014-02-01
In this paper, one kind of large deployable antenna (LDA) structure is proposed, formed by combining a number of basic deployable units. In order to avoid vibration caused by fast deployment of the mechanism, a braking system is used to control the spring-actuated system. Comparisons between the LDA structure and a similar structure used by the large deployable reflector (LDR) indicate that the former has potential for use in antennas with apertures up to 30 m due to its lighter weight. The LDA structure is designed to form a spherical surface, found by the least-squares fitting method, so that it can be symmetrical. In this case, the positions of the terminal points in the structure are determined by two principles. A method to calculate the cable network stretched on the LDA structure is developed, which combines the original force density method with a parabolic surface constraint. A genetic algorithm is applied to ensure that each cable reaches a desired tension, which effectively avoids non-convergence issues. We find that the patterns for the front and rear cable nets must be the same when finding the shape of the rear cable net; otherwise an anticlastic surface would be generated.
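The cable-network step builds on the classical force density method; the sketch below shows only that textbook core on a toy net (the parabolic constraint and genetic-algorithm tension tuning described above are not reproduced):

```python
# Minimal sketch of the classical force density method on a toy cable net:
# one free node connected by four cables to four fixed anchor points.
import numpy as np

fixed = np.array([[0, 0, 0], [2, 0, 0], [2, 2, 0], [0, 2, 0]], float)  # anchors
q = np.array([1.0, 1.0, 2.0, 2.0])      # force densities (tension / length)

# Branch-node incidence: each cable runs from the free node (col 0) to anchor i.
C = np.hstack([np.ones((4, 1)), -np.eye(4)])
Cf, Cs = C[:, :1], C[:, 1:]             # free vs fixed partitions
Q = np.diag(q)

# Equilibrium: (Cf^T Q Cf) x_f = -Cf^T Q Cs x_s, solved per coordinate.
A = Cf.T @ Q @ Cf
xf = np.linalg.solve(A, -Cf.T @ Q @ Cs @ fixed)
print("equilibrium position of free node:", xf.ravel())

# Resulting cable tensions = force density * member length.
lengths = np.linalg.norm(fixed - xf, axis=1)
print("cable tensions:", q * lengths)
```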
Relative dispersion of clustered drifters in a small micro-tidal estuary
NASA Astrophysics Data System (ADS)
Suara, Kabir; Chanson, Hubert; Borgas, Michael; Brown, Richard J.
2017-07-01
Small tide-dominated estuaries are affected by large-scale flow structures which combine with the underlying bed-generated smaller-scale turbulence to significantly increase the magnitude of horizontal diffusivity. Field estimates of horizontal diffusivity and its associated scales are however rare due to limitations in instrumentation. Data from multiple deployments of low- and high-resolution clusters of GPS drifters are used to examine the dynamics of surface flow in a small micro-tidal estuary through relative dispersion analyses. During the field study, cluster diffusivity, which combines both large- and small-scale processes, ranged between 0.01 and 3.01 m2/s for spreading clusters and between -0.06 and -4.2 m2/s for contracting clusters. Pair-particle dispersion, Dp2, was scale dependent and grew as Dp2 ∼ t^1.83 in the streamwise and Dp2 ∼ t^0.8 in the cross-stream direction. At small separation scales (d < 0.5 m), pair-particle relative diffusivity followed Richardson's 4/3 power law and weakened as the separation scale increased. Pair-particle diffusivity scaled as Kp ∼ d^1.01 and Kp ∼ d^0.85 in the streamwise and cross-stream directions, respectively, for separation scales ranging from 0.1 to 10 m. Two methods were used to identify the mechanism responsible for dispersion within the channel. The results clearly revealed the importance of strain fields (stretching and shearing) in the spreading of particles within a small micro-tidal channel. The work provides input for modelling the dispersion of passive particles in shallow micro-tidal estuaries, where these processes had not previously been studied experimentally.
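As a worked illustration of the pair-particle statistics used above, here is a minimal sketch; the track-array layout and synthetic random-walk data are assumptions, not the study's drifter data:

```python
# Minimal sketch of pair-particle (relative) dispersion from drifter tracks.
# tracks: array (n_drifters, n_times, 2) of streamwise/cross-stream positions
# in metres at a fixed sampling interval dt; this layout is an assumption.
import numpy as np
from itertools import combinations

def relative_dispersion(tracks, dt):
    pairs = list(combinations(range(tracks.shape[0]), 2))
    sep = np.stack([tracks[i] - tracks[j] for i, j in pairs])  # (n_pairs, nt, 2)
    d2 = np.mean(np.sum(sep**2, axis=-1), axis=0)              # D_p^2(t), m^2
    kp = 0.5 * np.gradient(d2, dt)                             # K_p = 0.5 dD^2/dt
    return d2, kp

# Example with synthetic random-walk tracks (illustrative only).
rng = np.random.default_rng(1)
tracks = np.cumsum(rng.normal(0, 0.05, (6, 600, 2)), axis=1)
d2, kp = relative_dispersion(tracks, dt=1.0)
print(d2[-1], kp[-1])
```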
1997-04-01
[Figure residue, diagram of the mid-course phase: warhead and penetration aids (penaids) separate from the booster after burnout; warheads and penaids (if so equipped) are deployed immediately following boost-phase burnout, and large deceleration occurs from atmospheric drag upon re-entry.]
NEON: High Frequency Monitoring Network for Watershed-Scale Processes and Aquatic Ecology
NASA Astrophysics Data System (ADS)
Vance, J. M.; Fitzgerald, M.; Parker, S. M.; Roehm, C. L.; Goodman, K. J.; Bohall, C.; Utz, R.
2014-12-01
Networked high-frequency hydrologic and water quality measurements, needed to investigate physical and biogeochemical processes at the watershed scale and to create robust models, are limited and lack standardization. Determining the drivers and mechanisms of ecological changes in aquatic systems in response to natural and anthropogenic pressures is challenging due to the large amounts of terrestrial, aquatic, atmospheric, biological, chemical, and physical data required at varied spatiotemporal scales. The National Ecological Observatory Network (NEON) is a continental-scale infrastructure project designed to provide data to address the impacts of climate change, land use, and invasive species on ecosystem structure and function. Using a combination of standardized continuous in situ measurements and observational sampling, the NEON Aquatic array will produce over 200 data products across its spatially distributed field sites for 30 years to facilitate spatiotemporal analysis of the drivers of ecosystem change. Three NEON sites in Alabama were chosen to address linkages between watershed-scale processes and ecosystem changes along an eco-hydrological gradient within the Tombigbee River Basin. The NEON Aquatic design, once deployed, will include continuous measurements of surface water physical, chemical, and biological parameters; groundwater level, temperature and conductivity; and local meteorology. Observational sampling will include bathymetry, water chemistry and isotopes, and a suite of organismal sampling from microbes to macroinvertebrates to vertebrates. NEON deployed a buoy to measure the temperature profile of the Black Warrior River from July to November 2013 to determine the spatiotemporal variability across the water column from daily to seasonal scales. In July 2014 a series of water quality profiles was performed to assess the contribution of physical and biogeochemical drivers over a diurnal cycle. Additional river transects were performed across our site reach to capture the spatial variability of surface water parameters. Our preliminary data show differing response times to precipitation events and diurnal processes, informing our infrastructure designs and sampling protocols aimed at providing data to address the eco-hydrological gradient.
Impact of Federal Tax Policy on Utility-Scale Solar Deployment Given Financing Interactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mai, Trieu; Cole, Wesley; Krishnan, Venkat
In this study, the authors conducted a literature review of approaches and assumptions used by other modeling teams and consultants with respect to solar project financing; developed and incorporated an ability to model the likely financing shift away from more expensive sources of capital and toward cheaper sources as the investment tax credit declines in the ReEDS model; and used the 'before and after' versions of the ReEDS model to isolate and analyze the deployment impact of the financing shift under a range of conditions. Using ReEDS scenarios with this improved capability, we find that this 'financing' shift would softenmore » the blow of the ITC reversion; however, the overall impacts of such a shift in capital structure are estimated to be small and near-term utility-scale PV deployment is found to be much more sensitive to other factors that might drive down utility-scale PV prices.« less
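As a rough illustration of why a financing shift matters, the sketch below computes a weighted average cost of capital under two hypothetical capital structures; the numbers are placeholders, not ReEDS assumptions:

```python
# Illustrative-only sketch of the financing-shift idea: as the investment tax
# credit (ITC) steps down, projects can carry more (cheaper) debt, lowering
# the weighted average cost of capital (WACC). All numbers are hypothetical.
def wacc(debt_frac, r_debt, r_equity, tax_rate):
    return debt_frac * r_debt * (1 - tax_rate) + (1 - debt_frac) * r_equity

before = wacc(debt_frac=0.40, r_debt=0.045, r_equity=0.10, tax_rate=0.26)
after = wacc(debt_frac=0.60, r_debt=0.045, r_equity=0.10, tax_rate=0.26)
print(f"WACC with full ITC (less debt):       {before:.3%}")
print(f"WACC after ITC reversion (more debt): {after:.3%}")
```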
Secure Data Aggregation with Fully Homomorphic Encryption in Large-Scale Wireless Sensor Networks.
Li, Xing; Chen, Dexin; Li, Chunyan; Wang, Liangmin
2015-07-03
With the rapid development of wireless communication technology, sensor technology, and information acquisition and processing technology, sensor networks will ultimately have a deep influence on all aspects of people's lives. The battery resources of sensor nodes should be managed efficiently in order to prolong network lifetime in large-scale wireless sensor networks (LWSNs). Data aggregation represents an important method to remove redundancy as well as unnecessary data transmission and hence cut down the energy used in communication. As sensor nodes are deployed in hostile environments, the security of sensitive information, such as its confidentiality and integrity, should be considered. This paper proposes Fully homomorphic Encryption based Secure data Aggregation (FESA) in LWSNs, which can protect end-to-end data confidentiality and support arbitrary aggregation operations over encrypted data. In addition, by utilizing message authentication codes (MACs), the scheme can also verify data integrity during data aggregation and forwarding processes so that false data can be detected as early as possible. Although FHE increases the computation overhead due to its large public key size, simulation results show that the scheme is implementable in LWSNs and performs well. Compared with other protocols, the transmitted data and network overhead are reduced in our scheme.
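FESA itself relies on fully homomorphic encryption; as a lighter-weight stand-in that still shows aggregation over ciphertexts, the following sketch uses the additively homomorphic Paillier scheme with toy-sized keys (deliberately insecure parameters, for illustration only):

```python
# Additively homomorphic aggregation with Paillier (toy keys; never use
# primes this small in practice). Ciphertext product decrypts to the sum.
import math, random

p, q = 1789, 1847                       # toy primes; real keys use >1024 bits
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)
L = lambda x: (x - 1) // n
mu = pow(L(pow(g, lam, n2)), -1, n)

def enc(m):
    r = random.randrange(1, n)          # assume gcd(r, n) == 1 for this demo
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    return (L(pow(c, lam, n2)) * mu) % n

readings = [17, 42, 8]                  # sensor values aggregated in-network
agg = 1
for m in readings:
    agg = (agg * enc(m)) % n2           # multiply ciphertexts to add plaintexts
assert dec(agg) == sum(readings)
print("aggregated reading:", dec(agg))
```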
Using real options to evaluate the flexibility in the deployment of SMR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Locatelli, G.; Mancini, M.; Ruiz, F.
2012-07-01
According to recent estimates, the financial gap between Large Reactors (LRs) and Small Medium Reactors (SMRs) is not as large as the economy of scale would suggest, so SMRs are poised to be important players in the worldwide nuclear renaissance. POLIMI's INCAS model has been developed to compare the investment in SMRs with respect to LRs. It provides the values of IRR (Internal Rate of Return), NPV (Net Present Value), LUEC (Levelized Unitary Electricity Cost), up-front investment, etc. The aim of this research is to integrate the current INCAS model, based on discounted cash flows, with real option theory to measure the investor's flexibility to expand, defer or abandon a nuclear project under future uncertainties. The work compares the investment in a large nuclear power plant with a series of smaller, modular nuclear power plants on the same site. It thus weighs the benefits of the large power plant, arising from the economy of scale, against the benefits of the modular project (flexibility), concluding that managerial flexibility can be measured and used by an investor to face investment risks. (authors)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Emmanuel Ohene Opare, Jr.; Charles V. Park
The Next Generation Nuclear Plant (NGNP) Project, managed by the Idaho National Laboratory (INL), is authorized by the Energy Policy Act of 2005 to research, develop, design, construct, and operate a prototype fourth-generation nuclear reactor to meet the needs of the 21st Century. A section in this document proposes that the NGNP will provide heat for process heat applications. As with all large projects developing and deploying new technologies, the NGNP is expected to meet high performance and availability targets relative to current state-of-the-art systems and technology. One requirement for the NGNP is to provide heat for large-scale hydrogen generation, and this process heat application is required to be at least 90% available relative to other technologies currently on the market. To reach this goal, a RAM Roadmap was developed highlighting the actions to be taken to ensure that various milestones in system development and maturation concurrently meet the required availability targets. Integral to the RAM Roadmap was the use of a RAM analytical/simulation tool, which was used to estimate the availability of the system when deployed, based on the current design configuration and the maturation level of the system.
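The availability roll-up behind such a RAM analysis can be sketched with the standard steady-state formula A = MTBF/(MTBF + MTTR); the subsystem figures below are hypothetical, not NGNP data:

```python
# Minimal sketch of a RAM (reliability, availability, maintainability)
# roll-up: steady-state availability per subsystem from MTBF/MTTR,
# combined in series. All figures are illustrative.
def availability(mtbf_h: float, mttr_h: float) -> float:
    return mtbf_h / (mtbf_h + mttr_h)

subsystems = {                      # hypothetical MTBF/MTTR in hours
    "reactor":        (8000.0, 120.0),
    "heat transport": (6000.0, 48.0),
    "hydrogen plant": (3000.0, 72.0),
}
a_series = 1.0
for name, (mtbf, mttr) in subsystems.items():
    a = availability(mtbf, mttr)
    a_series *= a                   # series system: all subsystems must be up
    print(f"{name:>14}: A = {a:.4f}")
print(f"system (series): A = {a_series:.4f}  (target >= 0.90)")
```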
AES based secure low energy adaptive clustering hierarchy for WSNs
NASA Astrophysics Data System (ADS)
Kishore, K. R.; Sarma, N. V. S. N.
2013-01-01
Wireless sensor networks (WSNs) provide a low-cost solution in diversified application areas. Wireless sensor nodes are inexpensive tiny devices with limited storage, computational capability and power, and they are being deployed on a large scale in both military and civilian applications. Security of the data is a key concern where large numbers of nodes are deployed. Here, an energy-efficient secure routing protocol, secure-LEACH (Low Energy Adaptive Clustering Hierarchy) for WSNs based on the Advanced Encryption Standard (AES), is proposed. The cryptosystem is session-based: a new session key is assigned for each new session. The network (WSN) is divided into a number of groups or clusters, and a cluster head (CH) is selected among the member nodes of each cluster. The measured data from the nodes are aggregated by the respective CHs, and each CH then relays this data to another CH towards the gateway node in the WSN, which in turn sends it to the base station (BS). To maintain confidentiality of data in transit, the data must be encrypted before sending at every hop, from a node to its CH and from one CH to another CH or to the gateway node.
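A minimal sketch of the per-hop encryption step is given below, using AES-GCM from the PyCA cryptography package; key distribution, cluster-head election, and the paper's exact session-key protocol are out of scope, and the node identifiers are hypothetical:

```python
# Minimal sketch of per-hop authenticated encryption with a per-session AES
# key, in the spirit of secure-LEACH (key distribution not shown).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

session_key = AESGCM.generate_key(bit_length=128)   # new key per session
aesgcm = AESGCM(session_key)

def send_hop(plaintext: bytes, node_id: bytes) -> bytes:
    nonce = os.urandom(12)                          # unique per message
    ct = aesgcm.encrypt(nonce, plaintext, node_id)  # node_id as associated data
    return nonce + ct

def recv_hop(message: bytes, node_id: bytes) -> bytes:
    nonce, ct = message[:12], message[12:]
    return aesgcm.decrypt(nonce, ct, node_id)       # raises on tampering

# Node -> cluster head -> gateway: decrypt, aggregate, re-encrypt at each hop.
msg = send_hop(b"temp=21.4", b"node-07")
print(recv_hop(msg, b"node-07"))
```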
NASA Astrophysics Data System (ADS)
Steinberg, P. D.; Bednar, J. A.; Rudiger, P.; Stevens, J. L. R.; Ball, C. E.; Christensen, S. D.; Pothina, D.
2017-12-01
The rich variety of software libraries available in the Python scientific ecosystem provides a flexible and powerful alternative to traditional integrated GIS (geographic information system) programs. Each such library focuses on doing a certain set of general-purpose tasks well, and Python makes it relatively simple to glue the libraries together to solve a wide range of complex, open-ended problems in Earth science. However, choosing an appropriate set of libraries can be challenging, and it is difficult to predict how much "glue code" will be needed for any particular combination of libraries and tasks. Here we present a set of libraries that have been designed to work well together to build interactive analyses and visualizations of large geographic datasets, in standard web browsers. The resulting workflows run on ordinary laptops even for billions of data points, and easily scale up to larger compute clusters when available. The declarative top-level interface used in these libraries means that even complex, fully interactive applications can be built and deployed as web services using only a few dozen lines of code, making it simple to create and share custom interactive applications even for datasets too large for most traditional GIS systems. The libraries we will cover include GeoViews (HoloViews extended for geographic applications) for declaring visualizable/plottable objects, Bokeh for building visual web applications from GeoViews objects, Datashader for rendering arbitrarily large datasets faithfully as fixed-size images, Param for specifying user-modifiable parameters that model your domain, Xarray for computing with n-dimensional array data, Dask for flexibly dispatching computational tasks across processors, and Numba for compiling array-based Python code down to fast machine code. We will show how to use the resulting workflow with static datasets and with simulators such as GSSHA or AdH, allowing you to deploy flexible, high-performance web-based dashboards for your GIS data or simulations without needing major investments in code development or maintenance.
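As a concrete taste of the workflow, the sketch below renders a large synthetic point dataset with Datashader; the column names, ranges, and data are assumptions for illustration:

```python
# Minimal sketch of the datashading step described above: render an
# arbitrarily large point dataset to a fixed-size image.
import numpy as np
import pandas as pd
import datashader as ds
import datashader.transfer_functions as tf

# Ten million synthetic lon/lat points stand in for a large GIS dataset.
n = 10_000_000
df = pd.DataFrame({
    "lon": np.random.normal(-95.0, 10.0, n),
    "lat": np.random.normal(38.0, 5.0, n),
})

canvas = ds.Canvas(plot_width=800, plot_height=400,
                   x_range=(-125, -65), y_range=(24, 50))
agg = canvas.points(df, "lon", "lat")        # server-side aggregation to a grid
img = tf.shade(agg, how="log")               # fixed-size image, any data size
img.to_pil().save("points.png")
```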
Overview of the 2009 and 2011 Sayarim Infrasound Calibration Experiments
NASA Astrophysics Data System (ADS)
Fee, D.; Waxler, R.; Drob, D.; Gitterman, Y.; Given, J.
2012-04-01
The establishment of the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has stimulated infrasound research and development. However, as the network comes closer to completion there is a lack of large, well-constrained sources with which to test the network and its capabilities. Also, significant uncertainties exist in long-range acoustic propagation due to a dynamic, difficult-to-characterize atmosphere, particularly the thermosphere. In 2009 and 2011 three large-scale infrasound calibration experiments were performed in Europe, the Middle East, Africa, and Asia. The goal of the calibration experiments was to test the IMS infrasound network and validate atmospheric and propagation models with large, well-constrained infrasound sources. This presentation provides an overview of the calibration experiments, including deployment, atmospheric conditions during the experiments, explosion characterization, infrasonic signal detection and identification, and a discussion of the results and implications. Each calibration experiment consisted of a single surface detonation of explosives, with nominal weights of 82, 10.24, and 102.08 tons on 26 August 2009, 24 January 2011, and 26 January 2011, respectively. These explosions were designed and conducted by the Geophysical Institute of Israel at the Sayarim Military Range, Israel, and produced significant infrasound detected by numerous permanent and temporary infrasound arrays in the region. The 2009 experiment was performed in the summer to take advantage of the westerly stratospheric winds. Infrasonic arrivals were detected by both IMS and temporary arrays deployed to the north and west of the source, including clear stratospheric arrivals and thermospheric arrivals with low celerities. The 2011 experiment was performed during the winter, when strong easterly stratospheric winds dominated in addition to a strong tropospheric jet (the jet stream). These wind jets allowed detection out to 6500 km, in addition to multiple tropospheric, stratospheric, and thermospheric arrivals at arrays deployed to the east. These experiments represented a considerable, successful collaboration between the CTBTO and numerous other groups and will provide a rich ground-truth dataset for detailed infrasound studies in the future.
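Arrival types in such studies are commonly classified by celerity (range divided by travel time); the boundaries in this sketch are approximate values of the kind used in the infrasound literature, not those used in these experiments:

```python
# Minimal sketch of celerity-based arrival classification (category
# boundaries are approximate and vary between authors).
def classify_arrival(range_km: float, travel_time_s: float) -> str:
    celerity = range_km * 1000.0 / travel_time_s   # m/s
    if celerity >= 320:
        return "tropospheric"
    if 280 <= celerity < 320:
        return "stratospheric"
    if celerity < 250:
        return "thermospheric"
    return "unclassified"

# A 1000 km arrival after ~3400 s travels at ~294 m/s: stratospheric duct.
print(classify_arrival(1000.0, 3400.0))
```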
Preliminary design method for deployable spacecraft beams
NASA Technical Reports Server (NTRS)
Mikulas, Martin M., Jr.; Cassapakis, Costas
1995-01-01
There is currently considerable interest in low-cost, lightweight, compactly packageable deployable elements for various future missions involving small spacecraft. These elements must also have a simple and reliable deployment scheme and possess zero or very small free-play. Although most small spacecraft do not experience large disturbances, very low stiffness appendages or free-play can couple with even small disturbances and lead to unacceptably large attitude errors which may involve the introduction of a flexible-body control system. A class of structures referred to as 'rigidized structures' offers significant promise in providing deployable elements that will meet these needs for small spacecraft. The purpose of this paper is to introduce several rigidizable concepts and to develop a design methodology which permits a rational comparison of these elements to be made with alternate concepts.
NASA Technical Reports Server (NTRS)
Jernell, L. S.; Croom, D. R.
1979-01-01
Wind tunnel tests were conducted on a 0.03-scale model of a large wide-body commercial aircraft to determine the effects on the static aerodynamic characteristics resulting from the attachment of a belly pod for the long-range deployment of outsize military equipment. The effectiveness of horizontal tip fins in augmenting directional stability was investigated. At a test Reynolds number of 1.08 × 10^6, the addition of the pod results in an increase in total drag of approximately 20 percent. Trim drag due to the pod is very small. Although the pod produces a significant decrease in directional stability, the addition of the tip fins restores some of the stability, particularly at the lower angles of attack.
The Cosmology Large Angular Scale Surveyor (CLASS)
NASA Astrophysics Data System (ADS)
Cleary, Joseph
2018-01-01
The Cosmology Large Angular Scale Surveyor (CLASS) is an array of four telescopes designed to measure the polarization of the Cosmic Microwave Background. CLASS aims to detect the B-mode polarization from primordial gravitational waves predicted by cosmic inflation theory, as well as the imprint left by reionization upon the CMB E-mode polarization. This will be achieved through a combination of observing strategy and state-of-the-art instrumentation. CLASS is observing 70% of the sky to characterize the CMB at large angular scales, measuring the CMB power spectrum from the reionization peak to the recombination peak. The four telescopes operate at frequencies of 38, 93, 145, and 217 GHz, in order to estimate Galactic synchrotron and dust foregrounds while avoiding atmospheric absorption. CLASS employs rapid polarization modulation to overcome atmospheric and instrumental noise. Polarization-sensitive cryogenic detectors with low noise levels provide CLASS the sensitivity required to constrain the tensor-to-scalar ratio down to levels of r ~ 0.01 while also measuring the optical depth to reionization to sample-variance limits. These improved constraints on the optical depth to reionization are required to pin down the neutrino mass from complementary cosmological data. CLASS has completed a year of observations at 38 GHz and is in the process of deploying the rest of the telescope array. This poster provides an overview and update on CLASS science, hardware, and survey operations.
Cheadle, Lucy; Deanes, Lauren; Sadighi, Kira; Gordon Casey, Joanna; Collier-Oxandale, Ashley; Hannigan, Michael
2017-09-10
Recent advances in air pollution sensors have led to a new wave of low-cost measurement systems that can be deployed in dense networks to capture small-scale spatio-temporal variations in ozone, a pollutant known to cause negative human health impacts. This study deployed a network of seven low-cost metal oxide ozone sensor systems (UPods) in both an open space and an urban location in Boulder, Colorado during June and July of 2015, to quantify ozone variations on spatial scales ranging from 12 m between UPods to 6.7 km between the open space and urban measurement sites, with a measurement uncertainty of ~5 ppb. The results showed spatial variability of ozone at both deployment sites, with the largest differences between UPod measurements occurring during the afternoons. The peak median hourly difference between UPods was 6 ppb at 1:00 p.m. at the open space site, and 11 ppb at 4:00 p.m. at the urban site. Overall, the urban ozone measurements were higher than the open space measurements. This study evaluates the effectiveness of using low-cost sensors to capture microscale spatial and temporal variation of ozone; additionally, it highlights the importance of field calibrations and measurement uncertainty quantification when deploying low-cost sensors.
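The hourly inter-UPod comparison can be sketched as follows, assuming a hypothetical DataFrame with one ozone column per UPod; the synthetic data are illustrative only:

```python
# Minimal sketch of the pairwise-difference analysis (column layout assumed):
# ozone is a DataFrame indexed by timestamp with one column per UPod, in ppb.
import numpy as np
import pandas as pd
from itertools import combinations

rng = np.random.default_rng(2)
idx = pd.date_range("2015-06-01", "2015-07-31", freq="min")
ozone = pd.DataFrame(
    {f"upod{i}": 40 + 10 * np.sin(np.arange(len(idx)) / 720) + rng.normal(0, 3, len(idx))
     for i in range(7)}, index=idx)

# Median absolute inter-UPod difference for each hour of day.
diffs = pd.concat(
    {f"{a}-{b}": (ozone[a] - ozone[b]).abs() for a, b in combinations(ozone, 2)},
    axis=1)
hourly = diffs.groupby(diffs.index.hour).median().median(axis=1)
print(hourly.idxmax(), hourly.max())   # hour with the largest median spread
```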
NASA Astrophysics Data System (ADS)
Field, J.; Paustian, K.
2016-12-01
The interior mountain West is particularly vulnerable to climate change, with potential impacts including drought and wildfire intensification and wide-scale species disruptions due to shifts in habitable elevation ranges or other effects. One such example is the current outbreak of native mountain pine and spruce beetles across the Rockies, with warmer winters, drier summers, and a legacy of logging and fire suppression all interacting to result in infestation and unprecedented tree mortality over more than 42 million acres. Current global climate change mitigation commitments imply that shifts to renewable energy must be supplemented with widespread deployment of carbon-negative technologies such as BECCS and biochar. Carefully designed forest bioenergy and biochar industries can play an important role in meeting these targets, valorizing woody biomass and allowing more acres to be actively managed under existing land management goals while simultaneously displacing fossil energy use and directly sequestering carbon. In this work we assess the negative emissions potential from the deployment of biochar co-producing thermochemical bioenergy technologies in the Rockies using beetle-kill wood as a feedstock, a way of leveraging a climate-change-driven problem for climate mitigation. We start with a review and classification of bioenergy lifecycle assessment emission source categories, clarifying the differences in mechanism and confidence around emissions sources, offsets, sequestration, and leakage effects. Next we develop methods for modeling ecosystem carbon response to biomass removals at the stand scale, considering potential species shifts and regrowth rates under different harvest systems deployed in different areas. We then apply a lifecycle assessment framework to evaluate the performance of a set of real-world bioenergy technologies at enterprise scale, including biomass logistics and conversion product yields. We end with an exploration of regional-scale mitigation capacity considering wide-scale deployment and potential wildfire feedback effects of harvest, highlighting the relative importance of supply chain, conversion technology, ecological, and epistemological uncertainties in realizing wide-scale negative emissions in this region.
Deployable System for Crash-Load Attenuation
NASA Technical Reports Server (NTRS)
Kellas, Sotiris; Jackson, Karen E.
2007-01-01
An externally deployable honeycomb structure is investigated with respect to crash energy management for light aircraft. The new concept utilizes an expandable honeycomb-like structure to absorb impact energy by crushing. Distinguished by flexible hinges between cell wall junctions that enable effortless deployment, the new energy absorber offers most of the desirable features of an external airbag system without the limitations of poor shear stability, system complexity, and timing sensitivity. Like conventional honeycomb, once expanded, the energy absorber is transformed into a crush efficient and stable cellular structure. Other advantages, afforded by the flexible hinge feature, include a variety of deployment options such as linear, radial, and/or hybrid deployment methods. Radial deployment is utilized when omnidirectional cushioning is required. Linear deployment offers better efficiency, which is preferred when the impact orientation is known in advance. Several energy absorbers utilizing different deployment modes could also be combined to optimize overall performance and/or improve system reliability as outlined in the paper. Results from a series of component and full scale demonstration tests are presented as well as typical deployment techniques and mechanisms. LS-DYNA analytical simulations of selected tests are also presented.
Compiling and using input-output frameworks through collaborative virtual laboratories.
Lenzen, Manfred; Geschke, Arne; Wiedmann, Thomas; Lane, Joe; Anderson, Neal; Baynes, Timothy; Boland, John; Daniels, Peter; Dey, Christopher; Fry, Jacob; Hadjikakou, Michalis; Kenway, Steven; Malik, Arunima; Moran, Daniel; Murray, Joy; Nettleton, Stuart; Poruschi, Lavinia; Reynolds, Christian; Rowley, Hazel; Ugon, Julien; Webb, Dean; West, James
2014-07-01
Compiling, deploying and utilising large-scale databases that integrate environmental and economic data have traditionally been labour- and cost-intensive processes, hindered by the large amount of disparate and misaligned data that must be collected and harmonised. The Australian Industrial Ecology Virtual Laboratory (IELab) is a novel, collaborative approach to compiling large-scale environmentally extended multi-region input-output (MRIO) models. The utility of the IELab product is greatly enhanced by avoiding the need to lock in an MRIO structure at the time the MRIO system is developed. The IELab advances the idea of the "mother-daughter" construction principle, whereby a regionally and sectorally very detailed "mother" table is set up, from which "daughter" tables are derived to suit specific research questions. By introducing a third tier - the "root classification" - IELab users are able to define their own mother-MRIO configuration, at no additional cost in terms of data handling. Customised mother-MRIOs can then be built, which maximise disaggregation in aspects that are useful to a family of research questions. The second innovation in the IELab system is to provide a highly automated collaborative research platform in a cloud-computing environment, greatly expediting workflows and making these computational benefits accessible to all users. Combining these two aspects realises many benefits. The collaborative nature of the IELab development project allows significant savings in resources. Timely deployment is possible by coupling automation procedures with the comprehensive input from multiple teams. User-defined MRIO tables, coupled with high performance computing, mean that MRIO analysis will be useful and accessible for a great many more research applications than would otherwise be possible. By ensuring that a common set of analytical tools such as for hybrid life-cycle assessment is adopted, the IELab will facilitate the harmonisation of fragmented, dispersed and misaligned raw data for the benefit of all interested parties. Copyright © 2014 Elsevier B.V. All rights reserved.
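The "mother-daughter" derivation can be illustrated with a toy aggregation: a daughter table is obtained from the mother table via a 0/1 concentration matrix. This sketch is illustrative and far smaller than any IELab classification:

```python
# Minimal sketch of the "mother-daughter" idea: a daughter MRIO is an
# aggregation of the detailed mother table via a 0/1 concentration matrix G
# (rows: daughter sectors, columns: mother/root sectors).
import numpy as np

T_mother = np.arange(1, 17, dtype=float).reshape(4, 4)  # 4-sector mother table

# Map 4 mother sectors onto 2 daughter sectors: {0,1} -> 0 and {2,3} -> 1.
G = np.array([[1, 1, 0, 0],
              [0, 0, 1, 1]], dtype=float)

T_daughter = G @ T_mother @ G.T     # transactions sum within merged sectors
print(T_daughter)
# [[14. 22.]
#  [46. 54.]]
```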
NASA Astrophysics Data System (ADS)
Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.
2017-12-01
The increased model resolution in the development of comprehensive Earth System Models is rapidly leading to very large climate simulation outputs that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2 PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets, in the context of a large-scale international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC) and one more hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments. At the time of writing, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ingersoll, Daniel T
2007-01-01
INTRODUCTION: The Global Nuclear Energy Partnership (GNEP) seeks to create an international regime to support large-scale growth in the worldwide use of nuclear energy. Fully meeting the GNEP vision may require the deployment of thousands of reactors in scores of countries, many of which do not currently use nuclear energy. Some of these needs will be met by large-scale Generation III and III+ reactors (>1000 MWe) and Generation IV reactors when they are available. However, because many developing countries have small and immature electricity grids, the currently available Generation III(+) reactors may be unsuitable since they are too large, too expensive, and too complex. Therefore, GNEP envisions new types of reactors for international deployment that are "right sized" for the developing countries and that are based on technologies, designs, and policies focused on reducing proliferation risk. The first step in developing such systems is the generation of technical requirements that ensure the systems meet both the GNEP policy goals and the power needs of the recipient countries. REQUIREMENTS: Reactor systems deployed internationally within the GNEP context must meet a number of requirements similar to the safety, reliability, economics, and proliferation goals established for the DOE Generation IV program. Because of the emphasis on deployment to non-nuclear developing countries, the requirements will be weighted differently than for Generation IV, especially regarding safety and non-proliferation goals. Also, the reactors should be sized for market conditions in developing countries, where energy demand per capita, institutional maturity and industrial infrastructure vary considerably, and must utilize fuel that is compatible with the fuel recycle technologies being developed by GNEP. Arrangements are already underway to establish Working Groups jointly with Japan and Russia to develop requirements for reactor systems. Additional bilateral and multilateral arrangements are expected as GNEP progresses. These Working Groups will be instrumental in establishing an international consensus on reactor system requirements. GNEP CERTIFICATION: After establishing an accepted set of requirements for new reactors deployed internationally, a mechanism is needed that allows capable countries to continue to market their reactor technologies and services while assuring that they are compatible with GNEP goals and technologies. This will help preserve the current system of open commercial competition while steering the international community toward common policy goals. The proposed vehicle to achieve this is the concept of GNEP Certification. Using objective criteria derived from the technical requirements in several key areas such as safety, security, non-proliferation, and safeguards, reactor designs could be evaluated and then certified if they meet the criteria. This certification would ensure that reactor designs meet internationally approved standards and that the designs are compatible with GNEP assured fuel services.
SUMMARY: New "right sized" power reactor systems will need to be developed and deployed internationally to fully achieve the GNEP vision of an expanded use of nuclear energy world-wide. The technical requirements for these systems are being developed through national and international Working Groups. The process is expected to culminate in a new GNEP Certification process that enables commercial competition while ensuring that the policy goals of GNEP are adequately met.
USDA-ARS?s Scientific Manuscript database
There are >160,000 federal and state fruit fly detection traps deployed in the southern and western U.S. and Puerto Rico. In California alone, >100,000 traps are deployed and maintained just for exotic fruit fly detection. Fruit fly detection and eradication requires deployment of large numbers of tra...
Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.
Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P
2010-12-22
Comparative genomics resources, such as ortholog detection tools and repositories, are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource, Roundup, using cloud computing, describe the operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal cost. Utilizing the comparative genomics tool Roundup as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared, which determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost-savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable to other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
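The order-jobs-by-predicted-runtime idea can be sketched as a longest-processing-time assignment; the cost model below is a hypothetical stand-in, not the published Roundup model:

```python
# Minimal sketch of the scheduling idea: predict each comparison's runtime
# from genome sizes, then assign longest jobs first (LPT) across workers so
# no instance idles at the end. The runtime model is a hypothetical stand-in.
import heapq

def predict_runtime(n_genes_a: int, n_genes_b: int) -> float:
    return 1e-6 * n_genes_a * n_genes_b + 5.0   # hypothetical cost model

def schedule(jobs, n_workers):
    """jobs: list of (name, n_genes_a, n_genes_b); returns worker -> names."""
    est = sorted(jobs, key=lambda j: predict_runtime(j[1], j[2]), reverse=True)
    heap = [(0.0, w) for w in range(n_workers)]  # (accumulated time, worker)
    assignment = {w: [] for w in range(n_workers)}
    for job in est:
        t, w = heapq.heappop(heap)               # least-loaded worker
        assignment[w].append(job[0])
        heapq.heappush(heap, (t + predict_runtime(job[1], job[2]), w))
    return assignment

jobs = [("hs-mm", 20000, 25000), ("ec-bs", 4300, 4100), ("sc-sp", 6000, 5000)]
print(schedule(jobs, n_workers=2))
```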
Economic evaluation on CO₂-EOR of onshore oil fields in China
Wei, Ning; Li, Xiaochun; Dahowski, Robert T.; ...
2015-06-01
Carbon dioxide enhanced oil recovery (CO₂-EOR) and sequestration in depleted oil reservoirs is a plausible option for utilizing anthropogenic CO₂ to increase oil production while storing CO₂ underground. Evaluation of the storage resources and cost of potential CO₂-EOR projects is an essential step before the commencement of large-scale deployment of such activities. In this paper, a hybrid techno-economic evaluation method, including a performance model and cost model for onshore CO₂-EOR projects, has been developed based on previous studies. A total of 296 onshore oil fields, accounting for about 70% of all mature onshore oil fields in China, were evaluated by the techno-economic method. The key findings of this study are summarized as follows: (1) deterministic analysis shows there are approximately 1.1 billion tons (7.7 billion barrels) of incremental crude oil and 2.2 billion tons of CO₂ storage resource for onshore CO₂-EOR at net positive revenue within the Chinese oil fields reviewed, under the given operating strategy and economic assumptions. (2) The sensitivity study highlights that the cumulative oil production and cumulative CO₂ storage resource are very sensitive to crude oil price, CO₂ cost, project lifetime, discount rate and tax policy. A high oil price, short project lifetime, low discount rate, low CO₂ cost, and low taxes can greatly increase the net income of the oil enterprise, incremental oil recovery and CO₂ storage resource. (3) From this techno-economic evaluation, the major barriers to large-scale deployment of CO₂-EOR include complex geological conditions, low API gravity of crude oil, high taxes, and a lack of incentives for CO₂-EOR projects.
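The discounted-cash-flow core of such a screen can be sketched in a few lines; all volumes, prices, and rates below are placeholders rather than the paper's inputs, but they reproduce the qualitative oil-price sensitivity noted above:

```python
# Minimal sketch of the discounted-cash-flow core of a CO2-EOR screen; all
# prices, volumes and rates are illustrative placeholders.
def npv_co2_eor(oil_bbl, co2_t, price_bbl, co2_cost_t, opex, rate, years):
    """Constant annual profiles over the project lifetime, discounted."""
    annual = oil_bbl * price_bbl - co2_t * co2_cost_t - opex
    return sum(annual / (1 + rate) ** t for t in range(1, years + 1))

base = npv_co2_eor(oil_bbl=500_000, co2_t=900_000, price_bbl=60.0,
                   co2_cost_t=25.0, opex=3_000_000, rate=0.10, years=15)
low_oil = npv_co2_eor(oil_bbl=500_000, co2_t=900_000, price_bbl=40.0,
                      co2_cost_t=25.0, opex=3_000_000, rate=0.10, years=15)
print(f"NPV at $60/bbl: {base/1e6:.1f} M$, at $40/bbl: {low_oil/1e6:.1f} M$")
```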
Goldmann, Emily; Calabrese, Joseph R; Prescott, Marta R; Tamburrino, Marijo; Liberzon, Israel; Slembarski, Renee; Shirley, Edwin; Fine, Thomas; Goto, Toyomi; Wilson, Kimberly; Ganocy, Stephen; Chan, Philip; Serrano, Mary Beth; Sizemore, James; Galea, Sandro
2012-02-01
To evaluate potentially modifiable deployment characteristics (predeployment preparedness, unit support during deployment, and postdeployment support) that may be associated with deployment-related posttraumatic stress disorder (PTSD). We recruited a sample of 2616 Ohio Army National Guard (OHARNG) soldiers and conducted structured interviews to assess traumatic event exposure and PTSD related to the soldiers' most recent deployment, consistent with DSM-IV criteria. We assessed preparedness, unit support, and postdeployment support using multimeasure scales adapted from the Deployment Risk and Resilience Survey. The prevalence of deployment-related PTSD was 9.6%. In adjusted logistic models, high levels of all three deployment characteristics (compared with low) were independently associated with lower odds of PTSD. When we evaluated the influence of combinations of deployment characteristics on the development of PTSD, we found that postdeployment support was an essential factor in the prevention of PTSD. Results show that factors throughout the life course of deployment, in particular postdeployment support, may influence the development of PTSD. These results suggest that the development of suitable postdeployment support opportunities may be centrally important in mitigating the psychological consequences of war. Copyright © 2012 Elsevier Inc. All rights reserved.
An analysis of the deployment of a pumpkin balloon on mars
NASA Astrophysics Data System (ADS)
Rand, J.; Phillips, M.
The design of large superpressure balloons has received significant attention in recent years due to the successful demonstration of various enabling technologies and materials. Of particular note is the "pumpkin" shaped balloon concept, which allows the stress in the envelope to be limited by the surface geometry. Unlike a sphere, which produces stress resultants determined by the volume of the system, the pumpkin utilizes a system of meridional tendons to react the loading in one direction, and forms a number of lobes, which limit the stress in the circumferential direction. The application of this technology to very large systems is currently being demonstrated by NASA's Ultra Long Duration Balloon (ULDB) Program. However, this type of balloon has certain features that may be exploited to produce a system far more robust than a comparable sphere during deployment, inflation, and operation for long periods of time. When this concept is applied to a system designed to carry two kilograms in the atmosphere of Mars, the resulting balloon is small enough to alter the construction techniques and produce an envelope which is free of the tucks and folds that may cause uncontrolled stress concentrations. A technique has been demonstrated whereby high-strength tendons may be pretensioned prior to installation along the centerline of each gore. Since this position is the shortest distance between the apex and nadir of the balloon, the tendons will automatically resist the forces caused by deployment and inflation and thereby protect the thin-film gas barrier from damage. A suitable balloon has been designed for this type of mission using five-micron Mylar Type C film for the gas barrier and PBO braided cables for the meridional load-carrying members. The deployment of this balloon is assumed to occur while falling on a decelerator suitably designed for the Mars atmosphere. The inflation is accomplished by a ten-kilogram system suspended at the nadir of the balloon. As the system falls toward the surface of the planet, helium gas is transferred to the balloon, forming a partially inflated system very similar to an ascending zero-pressure balloon. This analysis incorporates the flow of the planetary gas around the inflating balloon, altering the pressure distribution and shape. As a result, stresses are seen to increase beyond the design margin of safety, requiring the balloon to be redesigned. In addition, several scale models of this balloon were dynamically deployed in the laboratory to demonstrate that the deployment forces are indeed carried by the tendons.
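The motivation for the lobed geometry can be seen from standard thin-membrane stress formulas: circumferential film stress scales with the small local lobe radius rather than the overall balloon radius. A back-of-envelope sketch with illustrative numbers:

```python
# Back-of-envelope membrane stresses (standard thin-shell formulas) showing
# why lobed "pumpkin" geometry limits film stress. Numbers are illustrative.
dp = 100.0        # superpressure, Pa
R = 5.0           # overall balloon radius, m
r_lobe = 0.25     # local lobe radius of curvature, m
t = 5e-6          # film thickness, m (five-micron film)

sigma_sphere = dp * R / (2 * t)       # sphere: sigma = p*R/(2t)
sigma_lobe = dp * r_lobe / t          # near-cylindrical lobe film: p*r/t
print(f"sphere film stress: {sigma_sphere/1e6:.0f} MPa")
print(f"lobed film stress:  {sigma_lobe/1e6:.0f} MPa")
```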
A new type of tri-axial accelerometers with high dynamic range MEMS for earthquake early warning
NASA Astrophysics Data System (ADS)
Peng, Chaoyong; Chen, Yang; Chen, Quansheng; Yang, Jiansi; Wang, Hongti; Zhu, Xiaoyi; Xu, Zhiqiang; Zheng, Yu
2017-03-01
Earthquake early warning systems (EEWS) have shown their efficiency for earthquake damage mitigation. With the progress of low-cost Micro-Electro-Mechanical Systems (MEMS), many types of MEMS-based accelerometers have been developed and widely used in deploying large-scale, dense seismic networks for EEWS. However, the noise performance of these commercially available MEMS is still insufficient for weak seismic signals, leading to large scatter in the estimation of early-warning parameters. In this study, we developed a new type of tri-axial accelerometer for EEWS based on high-dynamic-range, low-noise MEMS. It is a MEMS-integrated data logger with built-in seismological processing. The device is built on a custom-tailored Linux 2.6.27 operating system, and seismic events are detected automatically with an STA/LTA algorithm. When a seismic event is detected, peak ground parameters of all data components are calculated at an interval of 1 s, and τc-Pd values are evaluated using the initial 3 s of the P wave. These values are then organized as a trigger packet actively sent to the processing center for combined event detection. The output data of all three components are calibrated to a sensitivity of 500 counts/cm/s2. Several tests and a real field deployment were performed to characterize the performance of this device. The results show that the dynamic range can reach 98 dB for the vertical component and 99 dB for the horizontal components, and the majority of bias temperature coefficients are lower than 200 μg/°C. In addition, the results of event detection and real field deployment have demonstrated its capabilities for EEWS and rapid intensity reporting.
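A minimal sketch of an STA/LTA trigger of the kind described is shown below; the window lengths, threshold, and synthetic record are illustrative, not the device's firmware values:

```python
# Minimal sketch of an STA/LTA event trigger on an acceleration record.
import numpy as np

def sta_lta_trigger(accel, fs, sta_s=0.5, lta_s=10.0, threshold=4.0):
    """Return the first sample index where the STA/LTA ratio exceeds threshold."""
    x = accel.astype(float) ** 2                   # signal energy
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    kernel = lambda n: np.ones(n) / n              # moving-average windows
    sta = np.convolve(x, kernel(sta_n), mode="same")
    lta = np.convolve(x, kernel(lta_n), mode="same")
    ratio = sta / np.maximum(lta, 1e-12)
    hits = np.flatnonzero(ratio > threshold)
    return hits[:1]                                # first trigger, if any

fs = 100.0
t = np.arange(0, 30, 1 / fs)
rec = np.random.default_rng(3).normal(0, 1, t.size)
rec[1500:] += 20 * np.sin(2 * np.pi * 5 * t[1500:])   # synthetic onset at 15 s
print("trigger sample:", sta_lta_trigger(rec, fs))
```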
The exposure of children to deploying side air bags: an initial field assessment.
Arbogast, Kristy B; Kallan, Michael J
2007-01-01
Tremendous effort has been invested in the laboratory to ensure side air bag (SAB) deployments minimize injury metrics in pediatric anthropometric test devices (ATDs). Little is known, however, about the experience of children exposed to this technology in real world crashes. Therefore, the objective of this study was to determine the prevalence of SAB exposure in children and provide estimates of injury risk among those exposed. This study utilized data from the Partners for Child Passenger Safety study, a large-scale child-focused crash surveillance system, to identify a probability sample of 348 child occupants, age 0-15 years, weighted to represent 6,600 children, in vehicles of model year 1998 and newer, equipped with SABs, in side impact crashes from three large U.S. regions between 1/1/05 and 12/31/06. In the study sample, 27 children per 1000 children in crashes were exposed to a deployed side air bag. Over 75% of these children were seated in the rear seat and 83% were exposed to a head curtain SAB. 65% of those exposed were less than 9 years of age. Of those exposed, 10.6% sustained an AIS2+ injury; all injuries were of the AIS 2 level and limited to the head or upper extremity. This paper provides the first population-based estimates of the exposure of children to SABs. Initial experience suggests that the risk of injury is fairly low with only one in ten sustaining injury - none of which were serious or life threatening. These findings offer assurance that efforts by regulators and the automotive industry to minimize negative consequences from SABs to vulnerable occupants appear to be effective and cause no change in the current recommendation of safe seating for children next to SABs.
David, A S; Farrin, L; Hull, L; Unwin, C; Wessely, S; Wykes, T
2002-11-01
Complaints of poor memory and concentration are common in veterans of the 1991 Persian Gulf War, as are other symptoms. Despite a large research effort, such symptoms remain largely unexplained. A comprehensive battery of neuropsychological tests and rating scales was administered to 341 UK servicemen who were returnees from the Gulf War and peacekeeping duties in Bosnia, plus non-deployed military controls. All were drawn from a large randomized survey. Most were selected on the basis of impaired physical functioning, defined operationally. Group comparisons revealed an association between physical functioning and symptoms of depression, post-traumatic stress reactions, increased anger, and subjective cognitive failures. Poorer performance on some general cognitive measures, sequencing, and attention was also seen in association with being 'ill', but virtually all differences disappeared after adjusting for depressed mood or multiple comparisons. Deployment was also associated with symptoms of post-traumatic stress and subjective cognitive failures, independently of health status, as well as minor general cognitive and constructional impairment. The latter remained significantly poorer in the Gulf group even after adjusting for depressed mood. Disturbances of mood are more prominent than quantifiable cognitive deficits in Gulf War veterans and probably lead to subjective underestimation of ability. Task performance deficits can themselves be explained by depressed mood, although the direction of causality cannot be inferred confidently. Reduced constructional ability cannot be explained in this way and could be an effect of Gulf-specific exposures.
Deployment Technology of a Heliogyro Solar Sail for Long Duration Propulsion
NASA Technical Reports Server (NTRS)
Peerawan, Wiwattananon; Bryant, Robert G.; Edmonson, William W.; Moore, William B.; Bell, Jared M.
2015-01-01
Interplanetary, multi-mission, station-keeping capabilities will require that a spacecraft employ a highly efficient propulsion-navigation system. The majority of space propulsion systems are fuel-based and require the vehicle to carry and consume fuel as part of the mission. Once the fuel is consumed, the mission is set, thereby limiting the potential capability. Alternatively, a method that derives its acceleration and direction from solar photon pressure using a solar sail would eliminate the requirement of onboard fuel to meet mission objectives. MacNeal theorized that the heliogyro-configured solar sail architecture would be lighter, less complex, cheaper, and less risky for deploying a large sail area than a masted sail. As sail size increases, a masted sail requires longer booms, resulting in increased mass and chaotic, uncontrollable deployment. With a heliogyro, the sail membrane is stowed as a roll of thin film that, when deployed, forms a blade that can extend up to kilometers in length. Thus, a benefit of the heliogyro-configured solar sail propulsion technology is mission scalability compared with masted versions, which are size constrained. Studies have shown that interplanetary travel is achievable with the heliogyro solar sail concept. The heliogyro also enables multi-destination missions such as sample returns and supply transportation from Earth to Mars, as well as station-keeping missions to provide enhanced warning of solar storms. This paper describes deployment technology being developed at NASA Langley Research Center to deploy and control the center-of-mass/center-of-pressure using a twin-bladed heliogyro solar sail 6-unit (6U) CubeSat. The 6U comprises two 2U blade deployers and 2U for payload. The 2U blade deployers can be mounted to 6U or larger scaled systems to serve as a non-chemical in-space propulsion system. A single solar sail blade length is estimated to be 2.4 km, with a total area from two blades of 720 m² and a total allowable weight of the 6U CubeSat of approximately 8 kg. This yields a theoretical characteristic acceleration of approximately 0.75 mm/s² at 1 AU (astronomical unit), compared with IKAROS (0.005 mm/s²) and NanoSail-D (0.02 mm/s²).
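The quoted characteristic acceleration can be sanity-checked from the numbers in the abstract: solar radiation pressure at 1 AU is about 4.56 μN/m² on an absorbing surface, and an ideal reflector doubles the momentum transfer. A back-of-the-envelope sketch, in which the optical efficiency factor is an assumption rather than a figure from the paper:

```python
# Back-of-the-envelope check of the heliogyro's characteristic acceleration.
P_SUN = 4.56e-6    # N/m^2, solar radiation pressure at 1 AU (absorbing surface)
AREA = 720.0       # m^2, total two-blade area quoted in the abstract
MASS = 8.0         # kg, allowable 6U CubeSat mass quoted in the abstract
EFF = 0.9          # assumed reflectivity/geometry efficiency (not in the abstract)

a_char = 2 * EFF * P_SUN * AREA / MASS   # factor 2 for an ideal flat reflector
print(f"characteristic acceleration ~ {a_char * 1e3:.2f} mm/s^2")
# ~0.74 mm/s^2, consistent with the ~0.75 mm/s^2 quoted above.
```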
Parenting Stress After Deployment in Navy Active Duty Fathers.
Yablonsky, Abigail M; Yan, Guofen; Bullock, Linda
2016-08-01
Military fathers are being deployed, and leaving their families, for greater lengths of time and more frequently than ever before. The purpose of this study was to examine the impact of recent deployment on parenting stress in U.S. Navy fathers with young children. Of the 111 participants who completed the one-time study questionnaire at a large military outpatient clinic on the Eastern seaboard, 67.6% had returned from a ship-based deployment. Regression analyses were performed, using the Parenting Stress Index as the outcome variable, deployment elements (such as time away from home in the past 5 years) as predictors, and adjusting for other factors such as post-traumatic stress disorder (PTSD) and depression. Higher perceived threat and greater warfare exposure were both associated with increased parenting stress (p < 0.05) in the unadjusted model. These associations were greatly attenuated and no longer significant after adjustment for depression. In addition, rates of positive screens for PTSD and depression (17.1%) in this sample were higher than in other recent studies. In summary, these data indicate that various deployment factors are associated with increased parenting stress in Navy fathers back from deployment within the past year; these relationships are largely explained by depressive symptoms. Clinical implications are discussed. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.
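The attenuation-after-adjustment analysis follows a standard pattern: regress the outcome on the deployment predictors alone, then refit with the mental health covariates and compare coefficients. A minimal sketch with synthetic data (variable names and the data-generating assumptions are hypothetical; the study's actual covariate set is larger):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data shaped like the study design (n = 111 fathers).
rng = np.random.default_rng(0)
n = 111
depression = rng.normal(size=n)
threat = 0.5 * depression + rng.normal(size=n)    # correlated with depression
warfare = 0.4 * depression + rng.normal(size=n)
psi = 100 + 10 * depression + rng.normal(size=n)  # stress driven by depression

df = pd.DataFrame(dict(psi=psi, threat=threat, warfare=warfare,
                       depression=depression))

unadjusted = smf.ols("psi ~ threat + warfare", data=df).fit()
adjusted = smf.ols("psi ~ threat + warfare + depression", data=df).fit()

# The threat/warfare coefficients shrink and lose significance once
# depression enters the model, mirroring the attenuation reported above.
print(unadjusted.params[["threat", "warfare"]])
print(adjusted.params[["threat", "warfare"]])
```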
NASA Technical Reports Server (NTRS)
Humphreys, William M., Jr.; Lockard, David P.; Khorrami, Mehdi R.; Culliton, William G.; McSwain, Robert G.; Ravetta, Patricio A.; Johns, Zachary
2016-01-01
A new aeroacoustic measurement capability has been developed, consisting of a large channel-count, field-deployable microphone phased array suitable for airframe noise flyover measurements for a range of aircraft types and scales. The array incorporates up to 185 hardened, weather-resistant sensors suitable for outdoor use. A custom 4-mA current loop receiver circuit with temperature compensation was developed to power the sensors over extended cable lengths with minimal degradation of the signal-to-noise ratio and frequency response. Extensive laboratory calibrations and environmental testing of the sensors were conducted to verify the design's performance specifications. A compact data system combining sensor power, signal conditioning, and digitization was assembled for use with the array. Complementing the data system is a robust analysis system capable of near real-time presentation of beamformed and deconvolved contour plots and integrated spectra obtained from array data acquired during flyover passes. Additional instrumentation systems needed to process the array data were also assembled, including a commercial weather station and a video monitoring/recording system. A detailed mock-up of the instrumentation suite (phased array, weather station, and data processor) was exercised in the NASA Langley Acoustic Development Laboratory to vet the system's performance. The first deployment of the system occurred at Finnegan Airfield at Fort A.P. Hill, where the array was used to measure vehicle noise from a number of sUAS (small Unmanned Aerial System) aircraft. A unique in-situ calibration method for the array microphones, using a hovering aerial sound source, was attempted for the first time during the deployment.
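Conventional frequency-domain beamforming of the kind behind such contour plots can be sketched compactly. The function below is a generic delay-and-sum implementation, not NASA Langley's production code; it steers the array's cross-spectral matrix over a grid of candidate source positions:

```python
import numpy as np

def beamform_map(csm, mic_xyz, grid_xyz, freq, c=343.0):
    """Conventional frequency-domain delay-and-sum beamformer.

    csm      : (M, M) cross-spectral matrix of M microphones at `freq`
    mic_xyz  : (M, 3) microphone positions, metres
    grid_xyz : (G, 3) candidate source positions, metres
    freq     : analysis frequency, Hz
    c        : speed of sound, m/s
    Returns a length-G array of beamformer output power over the grid.
    """
    k = 2 * np.pi * freq / c
    # Distance from every grid point to every microphone: shape (G, M).
    r = np.linalg.norm(grid_xyz[:, None, :] - mic_xyz[None, :, :], axis=2)
    # Steering vectors undo propagation delay and spherical spreading.
    v = np.exp(-1j * k * r) / r
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    # Quadratic form v^H C v evaluated for every grid point.
    return np.real(np.einsum("gm,mn,gn->g", v.conj(), csm, v))
```

In practice the CSM diagonal is often removed to suppress microphone self-noise, and deconvolution methods are applied to sharpen the maps, as the "deconvolved contour plots" mentioned above imply.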
Sieblist, Christian; Jenzsch, Marco; Pohlscheidt, Michael
2016-08-01
The production of monoclonal antibodies by mammalian cell culture in bioreactors of up to 25,000 L is state-of-the-art technology in the biotech industry. During the lifecycle of a product, several scale-up activities and technology transfers are typically executed to enable the supply chain strategy of a global pharmaceutical company. Given the sensitivity of mammalian cells to physicochemical culture conditions, process and equipment knowledge are critical to avoid impacts on timelines, product quantity, and quality. In particular, the fluid dynamics of large-scale bioreactors versus small-scale models must be described, and similarity demonstrated, in light of the Quality by Design approach promoted by the FDA. This approach comprises an associated design space, which is established during process characterization and validation in bench-scale bioreactors. Therefore, the establishment of predictive models and simulation tools for the major operating conditions of stirred vessels (mixing, mass transfer, and shear force), based on fundamental engineering principles, has experienced a renaissance in recent years. This work illustrates the systematic characterization of a large variety of bioreactor designs deployed in a global manufacturing network, ranging from small bench-scale equipment to large-scale production equipment (25,000 L). Several traditional methods to determine power input, mixing, mass transfer, and shear force have been used to create a database and identify differences between various impeller types and configurations in the operating ranges typically applied in cell culture processes at manufacturing scale. In addition, extrapolations of different empirical models, e.g. Cooke et al. (Paper presented at the proceedings of the 2nd international conference of bioreactor fluid dynamics, Cranfield, UK, 1988), have been assessed for their validity in these operational ranges. Results for selected designs are shown and serve as examples of structured characterization to enable fast and agile process transfers, scale-up, and troubleshooting.
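The quantities being characterized follow classical stirred-tank relations: ungassed power input P = Np·ρ·N³·D⁵, tip speed as a shear proxy, and empirical mass-transfer correlations of the form kLa = α(P/V)^a·(vs)^b. A minimal sketch, where the power number and correlation constants are generic textbook values, not the fitted parameters from this work:

```python
import math

def stirred_tank_estimates(d_imp, n_rps, volume, n_p=5.0, rho=1000.0,
                           alpha=0.02, a=0.6, b=0.5, v_s=0.002):
    """Classical stirred-tank scale-up estimates (all constants assumed).

    d_imp  : impeller diameter, m
    n_rps  : impeller speed, 1/s
    volume : working volume, m^3
    """
    power = n_p * rho * n_rps**3 * d_imp**5      # ungassed power draw, W
    p_per_v = power / volume                     # specific power, W/m^3
    tip_speed = math.pi * d_imp * n_rps          # shear proxy, m/s
    kla = alpha * p_per_v**a * v_s**b            # mass transfer, 1/s (order of magnitude)
    return power, p_per_v, tip_speed, kla

# Bench scale (2 L) vs production scale (25,000 L) at roughly equal P/V:
print(stirred_tank_estimates(d_imp=0.06, n_rps=5.0, volume=0.002))
print(stirred_tank_estimates(d_imp=1.5, n_rps=0.6, volume=25.0))
```

Holding P/V (or kLa) constant across scales while letting tip speed drift is exactly the kind of trade-off such a characterization database is meant to expose.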
NASA Astrophysics Data System (ADS)
Barty, C. P. J.; Key, M.; Britten, J.; Beach, R.; Beer, G.; Brown, C.; Bryan, S.; Caird, J.; Carlson, T.; Crane, J.; Dawson, J.; Erlandson, A. C.; Fittinghoff, D.; Hermann, M.; Hoaglan, C.; Iyer, A.; Jones, L., II; Jovanovic, I.; Komashko, A.; Landen, O.; Liao, Z.; Molander, W.; Mitchell, S.; Moses, E.; Nielsen, N.; Nguyen, H.-H.; Nissen, J.; Payne, S.; Pennington, D.; Risinger, L.; Rushford, M.; Skulina, K.; Spaeth, M.; Stuart, B.; Tietbohl, G.; Wattellier, B.
2004-12-01
The technical challenges and motivations for high-energy, short-pulse generation with NIF and possibly other large-scale Nd:glass lasers are reviewed. High-energy, short-pulse generation (multi-kilojoule, picosecond pulses) will be possible via the adaptation of chirped pulse amplification laser techniques on NIF. Development of metre-scale, high-efficiency, high-damage-threshold final optics is a key technical challenge. In addition, deployment of high-energy petawatt (HEPW) pulses on NIF is constrained by the existing laser infrastructure and requires new, compact compressor designs and short-pulse, fibre-based seed-laser systems. The key motivations for HEPW pulses on NIF are briefly outlined and include high-energy x-ray radiography, proton beam radiography, proton isochoric heating, and tests of the fast-ignitor concept for inertial confinement fusion.
Imager for Mars Pathfinder (IMP)
NASA Technical Reports Server (NTRS)
Smith, Peter H.
1994-01-01
The IMP camera is a near-surface sensing experiment with many capabilities beyond those normally associated with an imager. It is fully pointable in both elevation and azimuth, with a protected, stowed position looking straight down. Stereo separation is provided by two optical paths, each with a 12-position filter wheel. The primary function of the camera, strongly tied to mission success, is to take a color panorama of the surrounding terrain. IMP requires approximately 120 images to cover the complete downward hemisphere from the deployed position. IMP provides the geologist, and everyone else, a view of the local morphology with millimeter-to-meter-scale resolution over a broad area. In addition to capturing the general morphology of the scene, IMP has a large complement of specially chosen filters to aid in the identification of mineral types and their degree of weathering.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan; Hale, Elaine; Hodge, Bri-Mathias
2016-08-11
This paper discusses the development of, approaches for, experiences with, and some results from a large-scale, high-performance-computer-based (HPC-based) co-simulation of electric power transmission and distribution systems using the Integrated Grid Modeling System (IGMS). IGMS was developed at the National Renewable Energy Laboratory (NREL) as a novel Independent System Operator (ISO)-to-appliance scale electric power system modeling platform that combines off-the-shelf tools to simultaneously model 100s to 1000s of distribution systems in co-simulation with detailed ISO markets, transmission power flows, and AGC-level reserve deployment. Lessons learned from the co-simulation architecture development are shared, along with a case study that explores the reactive power impacts of PV inverter voltage support on the bulk power system.
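At its core, a transmission-distribution co-simulation alternates solutions of the two domains while exchanging boundary conditions at the substation buses. A highly simplified sketch of that exchange loop follows; the IGMS internals are not described in this abstract, so the solver placeholders and fixed-point iteration here are purely illustrative:

```python
# Skeleton of a transmission/distribution co-simulation exchange loop.
# Both "solvers" are toy placeholders standing in for real T&D tools.

def solve_transmission(loads):
    """Placeholder: substation bus voltages (p.u.) given aggregate loads (MW)."""
    return {bus: 1.0 - 0.05 * p / 100.0 for bus, p in loads.items()}

def solve_distribution(bus, voltage):
    """Placeholder: feeder net load (MW) at a given substation voltage."""
    base = {"sub_a": 80.0, "sub_b": 55.0}[bus]
    return base * voltage**2            # crude voltage-dependent load model

feeders = ["sub_a", "sub_b"]
loads = {bus: 50.0 for bus in feeders}  # initial guess

for step in range(24):                  # e.g. hourly market intervals
    for _ in range(10):                 # fixed-point iteration until loads settle
        voltages = solve_transmission(loads)
        new_loads = {b: solve_distribution(b, voltages[b]) for b in feeders}
        if all(abs(new_loads[b] - loads[b]) < 1e-6 for b in feeders):
            break
        loads = new_loads
    print(step, {b: round(v, 4) for b, v in voltages.items()})
```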
2014-09-30
…second project, collaboration is sought with institutions in Seychelles and Singapore for atmospheric deployments. In all cases, the project expects to…suite of atmospheric instruments on the coasts of three IO island nations, Sri Lanka, Seychelles, and Singapore, to capture small-scale events pertinent…necessary for the deployments are being developed in Sri Lanka. The nature of the deployments in Seychelles and Singapore does not require additional…
Intelligent Network Flow Optimization (INFLO) prototype : Seattle small-scale demonstration report.
DOT National Transportation Integrated Search
2015-05-01
This report describes the performance and results of the INFLO Prototype Small-Scale Demonstration. The purpose of the Small-Scale Demonstration was to deploy the INFLO Prototype System to demonstrate its functionality and performance in an operation...
Novel large deployable antenna backing structure concepts for foldable reflectors
NASA Astrophysics Data System (ADS)
Fraux, V.; Lawton, M.; Reveles, J. R.; You, Z.
2013-12-01
This paper describes a number of large deployable antenna (LDA) reflector structure concepts developed at EnerSys-ABSL. Furthermore, EnerSys-ABSL has confirmed the desire to build a breadboard demonstrator of a backing deployable structure for a foldable reflector in the diameter range of 4-9 m. As part of this project EnerSys-ABSL has explored five novel deployable structure concepts. This paper presents the top level definition of these concepts together with the requirements considered in the design and selection of the preferred candidate. These new concepts are described and then compared through a trade-off analysis to identify the most suitable concept that EnerSys-ABSL would like to consider for the breadboard demonstrator. Finally, the kinematics of the chosen concept is described in more detail and future steps in the development process are highlighted.
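Concept down-selection of this kind is typically run as a weighted trade study: score each concept against each requirement, weight by importance, and rank. A generic sketch in which the criteria, weights, and scores are invented for illustration (the paper's actual trade-off table is not reproduced in the abstract):

```python
# Generic weighted trade study for deployable-structure concept selection.
criteria = {"stowed_volume": 0.30, "deployed_stiffness": 0.25,
            "mass": 0.25, "mechanism_complexity": 0.20}  # weights sum to 1

# Scores 1-5 per concept per criterion (illustrative values only).
concepts = {
    "concept_A": {"stowed_volume": 4, "deployed_stiffness": 3,
                  "mass": 4, "mechanism_complexity": 2},
    "concept_B": {"stowed_volume": 3, "deployed_stiffness": 5,
                  "mass": 3, "mechanism_complexity": 4},
    "concept_C": {"stowed_volume": 5, "deployed_stiffness": 2,
                  "mass": 5, "mechanism_complexity": 3},
}

def weighted_score(scores):
    return sum(criteria[c] * s for c, s in scores.items())

for name, scores in sorted(concepts.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(scores):.2f}")
```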
Moody, John A.; Ebel, Brian A.
2012-01-01
We developed a difference infiltrometer to measure time series of non-steady infiltration rates during rainstorms at the point scale. The infiltrometer uses two tipping-bucket rain gages: one measures rainfall onto, and the other measures runoff from, a small circular plot about 0.5 m in diameter. The small size allows the infiltration rate to be computed as the difference of the cumulative rainfall and cumulative runoff without having to route water through a large plot. Difference infiltrometers were deployed in an area burned by the 2010 Fourmile Canyon Fire near Boulder, Colorado, USA, and data were collected during the summer of 2011. The difference infiltrometer demonstrated the capability to capture the different magnitudes of infiltration rate and the temporal variability associated with convective (high-intensity, short-duration) and cyclonic (low-intensity, long-duration) rainstorms. Data from the difference infiltrometer were used to estimate the saturated hydraulic conductivity of soil affected by the heat from a wildfire. The difference infiltrometer is portable, can be deployed in rugged, steep terrain, and does not require the transport of water, as many rainfall simulators do, because it uses natural rainfall. It can be used to assess infiltration models, determine runoff coefficients, identify rainfall depth or intensity thresholds that initiate runoff, estimate parameters for infiltration models, and compare remediation treatments on disturbed landscapes. The difference infiltrometer can be linked with other types of soil monitoring equipment in long-term studies for detecting temporal and spatial variability at multiple time scales, and in nested designs it can be linked to hillslope- and basin-scale runoff responses.
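The computation the instrument performs is essentially bookkeeping on tip counts. A minimal sketch follows; the interval length and depth values are illustrative, since the abstract does not give the gage calibration:

```python
import numpy as np

def infiltration_rate(rain_mm, runoff_mm, dt_s):
    """Non-steady infiltration rate from a difference infiltrometer.

    rain_mm   : per-interval rainfall depth onto the plot, mm
    runoff_mm : per-interval runoff depth from the plot, mm
    dt_s      : interval length, s
    Returns infiltration rate in mm/h per interval. Because the ~0.5 m
    plot is small, routing delays are negligible and the rate is just
    the difference of cumulative rainfall and cumulative runoff.
    """
    cum_f = np.cumsum(rain_mm) - np.cumsum(runoff_mm)   # cumulative infiltration, mm
    return np.diff(cum_f, prepend=0.0) / dt_s * 3600.0

# Illustrative 5-minute intervals during a convective burst:
rain = np.array([0.0, 2.5, 6.0, 3.0, 0.5])     # mm per interval (assumed)
runoff = np.array([0.0, 0.2, 3.5, 2.0, 0.3])   # mm per interval (assumed)
print(infiltration_rate(rain, runoff, dt_s=300))
```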
Solar Geoengineering and the Modulation of North Atlantic Tropical Cyclone Frequency
NASA Astrophysics Data System (ADS)
Jones, A. C.; Haywood, J. M.; Hawcroft, M.; Jones, A.; Dunstone, N. J.; Hodges, K.
2017-12-01
Solar geoengineering (SG) refers to a wide range of proposed methods for counteracting global warming by artificially reducing solar insolation at Earth's surface. The most widely known SG proposal is stratospheric aerosol injection (SAI), which has impacts analogous to those of large-scale volcanic eruptions. Observations following major volcanic eruptions indicate that aerosol enhancements confined to a single hemisphere effectively modulate North Atlantic tropical cyclone (TC) activity in the following years. Here we investigate the effects of both single-hemisphere and global SAI scenarios on North Atlantic TC activity using the HadGEM2-ES general circulation model (GCM). We show that a 5 Tg y-1 injection of sulphur dioxide (SO2) into the northern hemisphere (NH) stratosphere would produce a global-mean cooling of 1 K and simultaneously reduce TC activity (to 8 TCs y-1), while the same injection in the southern hemisphere (SH) would enhance TC activity (to 14 TCs y-1), relative to a recent historical period (1950-2000, 10 TCs y-1). Our results reemphasize the risks of regional geoengineering and should motivate policymakers to regulate large-scale unilateral geoengineering deployments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katz, Daniel S; Jha, Shantenu; Weissman, Jon
2017-01-31
This is the final technical report for the AIMES project. Many important advances in science and engineering are due to large-scale distributed computing. Notwithstanding this reliance, we are still learning how to design and deploy large-scale production Distributed Computing Infrastructures (DCI). This is evidenced by missing design principles for DCI and an absence of generally acceptable and usable distributed computing abstractions. The AIMES project was conceived against this backdrop, following on the heels of a comprehensive survey of scientific distributed applications. AIMES laid the foundations to address the tripartite challenge of dynamic resource management, integrating information, and portable and interoperable distributed applications. Four abstractions were defined and implemented: skeleton, resource bundle, pilot, and execution strategy. The four abstractions were implemented as software modules and then aggregated into the AIMES middleware. This middleware successfully integrates information across the application layer (skeletons) and resource layer (bundles), derives a suitable execution strategy for the given skeleton, and enacts its execution by means of pilots on one or more resources, depending on the application requirements and on resource availabilities and capabilities.
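The four abstractions map naturally onto simple data types. A speculative sketch of how they might compose follows; the names, fields, and greedy strategy below are illustrative only and do not reproduce the actual AIMES middleware interfaces:

```python
from dataclasses import dataclass, field

@dataclass
class Skeleton:
    """Application-layer description: tasks and their aggregate needs."""
    tasks: list[str]
    core_hours: float

@dataclass
class ResourceBundle:
    """Resource-layer description: what a site can currently offer."""
    site: str
    free_cores: int
    queue_wait_s: float

@dataclass
class Pilot:
    """Placeholder job that acquires resources before tasks are bound."""
    bundle: ResourceBundle
    cores: int

@dataclass
class ExecutionStrategy:
    """Derived plan: which pilots will run which parts of the skeleton."""
    skeleton: Skeleton
    pilots: list[Pilot] = field(default_factory=list)

def derive_strategy(skel: Skeleton, bundles: list[ResourceBundle]) -> ExecutionStrategy:
    # Greedy illustration: place one pilot at the site with the shortest queue.
    best = min(bundles, key=lambda b: b.queue_wait_s)
    return ExecutionStrategy(skel, [Pilot(best, min(best.free_cores, len(skel.tasks)))])
```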
A Survey on Virtualization of Wireless Sensor Networks
Islam, Md. Motaharul; Hassan, Mohammad Mehedi; Lee, Ga-Won; Huh, Eui-Nam
2012-01-01
Wireless Sensor Networks (WSNs) are gaining tremendous importance thanks to their broad range of commercial applications, such as smart home automation, healthcare, and industrial automation. In these applications, multi-vendor and heterogeneous sensor nodes are deployed. Due to strict administrative control over specific WSN domains, communication barriers, conflicting goals, and the economic interests of different WSN sensor node vendors, it is difficult to introduce a large-scale federated WSN. By allowing heterogeneous sensor nodes in WSNs to coexist on a shared physical sensor substrate, virtualization in sensor networks may provide flexibility and cost-effective solutions, promote diversity, ensure security, and increase manageability. This paper surveys the novel approach of using large-scale federated WSN resources in a sensor virtualization environment. Our focus in this paper is to introduce a few design goals, the challenges and opportunities of research in the field of sensor network virtualization, and the current status of research in this field. This paper also presents a wide array of state-of-the-art projects related to sensor network virtualization. PMID:22438759
Observations of thunderstorm-related 630 nm airglow depletions
NASA Astrophysics Data System (ADS)
Kendall, E. A.; Bhatt, A.
2015-12-01
The Midlatitude All-sky imaging Network for Geophysical Observations (MANGO) is an NSF-funded network of 630 nm all-sky imagers in the continental United States. MANGO will be used to observe the generation, propagation, and dissipation of medium- and large-scale wave activity in the subauroral, mid- and low-latitude thermosphere. The network is actively being deployed and will ultimately consist of nine all-sky imagers providing continuous coverage over the western United States, including California, Oregon, Washington, Utah, Arizona, and Texas, extending south into Mexico. The network sees high levels of both medium- and large-scale wave activity. Apart from the widely reported northeast-to-southwest propagating wave fronts resulting from the so-called Perkins mechanism, the network observes wave fronts propagating to the west, north, and northeast. At least three of these anomalous events have been associated with thunderstorm activity. Imager data have been correlated with both GPS data and data from the AIRS (Atmospheric Infrared Sounder) instrument on board NASA's Earth Observing System Aqua satellite. We will present a comprehensive analysis of these events and discuss the potential thunderstorm source mechanism.
Engineering design for a large scale renewable energy network installation in an urban environment
NASA Astrophysics Data System (ADS)
Mansouri Kouhestani, F.; Byrne, J. M.; Hazendonk, P.; Spencer, L.; Brown, M. B.
2016-12-01
Humanity's current rate of resource consumption cannot be maintained, and the use of renewable energy is a significant step toward a sustainable energy future. Alberta is the largest greenhouse gas-producing province in Canada (per capita), and climate change is expected to impact Alberta with warmer temperatures, intense floods, and earlier snowmelt. However, as one of the sunniest and windiest places in Canada, Alberta is poised to become one of Canada's leading provinces in utilizing renewable energy. This research has four main objectives: first, to determine the feasibility of implementing solar and wind energy systems at the University of Lethbridge campus; second, to quantify rooftop and parking-lot solar photovoltaic (PV) potential for the city of Lethbridge; third, to determine the available rooftop area for PV deployment in a large-scale region (the Province of Alberta); and fourth, to investigate different strategies for correlating solar PV array production with electricity demand in the province of Alberta. The proposed work addresses the need for Alberta to reduce the fossil fuel pollution that drives climate change and degrades our air, water, and land resources.
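Quantifying rooftop PV potential of this kind usually reduces to area × insolation × efficiency × performance ratio. A first-order sketch follows; every parameter value below is illustrative, not a result from this study:

```python
def annual_pv_yield_kwh(roof_area_m2, usable_fraction=0.6,
                        insolation_kwh_m2_yr=1400.0,
                        module_eff=0.18, performance_ratio=0.8):
    """First-order annual PV yield for a rooftop area.

    usable_fraction      : share of roof suited to modules (assumed)
    insolation_kwh_m2_yr : plane-of-array insolation (assumed; southern
                           Alberta is among Canada's sunniest regions)
    module_eff           : module conversion efficiency (assumed)
    performance_ratio    : lumped system losses (assumed)
    """
    return (roof_area_m2 * usable_fraction * insolation_kwh_m2_yr
            * module_eff * performance_ratio)

# A 10,000 m^2 campus building roof:
print(f"{annual_pv_yield_kwh(10_000):,.0f} kWh/yr")  # ~1.2 million kWh/yr
```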
NASA Astrophysics Data System (ADS)
Agnew, Donald L.; Vinkey, Victor F.; Runge, Fritz C.
1989-04-01
A study was conducted to determine how the Large Deployable Reflector (LDR) might benefit from the use of the space station for assembly, checkout, deployment, servicing, refurbishment, and technology development. Requirements that must be met by the space station to supply benefits for a selected scenario are summarized. Quantitative and qualitative data are supplied. Space station requirements for LDR which may be utilized by other missions are identified. A technology development mission for LDR is outlined and requirements summarized. A preliminary experiment plan is included. Space Station Data Base SAA 0020 and TDM 2411 are updated.
High-frequency field-deployable isotope analyzer for hydrological applications
Elena S.F. Berman; Manish Gupta; Chris Gabrielli; Tina Garland; Jeffrey J. McDonnell
2009-01-01
A high-frequency, field-deployable liquid water isotope analyzer was developed. The instrument was deployed for four consecutive weeks at the H. J. Andrews Experimental Forest Long-term Ecological Research site in western Oregon, where it was used for real-time measurement of the isotope ratios of precipitation and stream water during three large storm events. We were able...