Sample records for wide scale deployment

  1. Beetle-kill to carbon-negative bioenergy in the Rockies: stand, enterprise, and regional-scale perspectives

    NASA Astrophysics Data System (ADS)

    Field, J.; Paustian, K.

    2016-12-01

    The interior mountain West is particularly vulnerable to climate change, with potential impacts including drought and wildfire intensification, and wide-scale species disruptions due to shifts in habitable elevation ranges or other effects. One such example is the current outbreak of native mountain pine and spruce beetles across the Rockies, with warmer winters, drier summers, and a legacy of logging and fire suppression all interacting to result in infestation and unprecedented tree mortality over more than 42 million acres. Current global climate change mitigation commitments imply that shifts to renewable energy must be supplemented with widespread deployment of carbon-negative technologies such as BECCS and biochar. Carefully designed forest bioenergy and biochar industries can play an important role in meeting these targets, valorizing woody biomass and allowing more acres to be actively managed under existing land management goals while simultaneously displacing fossil energy use and directly sequestering carbon. In this work we assess the negative emissions potential from the deployment of biochar co-producing thermochemical bioenergy technologies in the Rockies using beetle-kill wood as a feedstock, a way of leveraging a climate-change-driven problem for climate mitigation. We start with a review and classification of bioenergy lifecycle assessment emission source categories, clarifying the differences in mechanism and confidence around emissions sources, offsets, sequestration, and leakage effects. Next we develop methods for modeling ecosystem carbon response to biomass removals at the stand scale, considering potential species shifts and regrowth rates under different harvest systems deployed in different areas. We then apply a lifecycle assessment framework to evaluate the performance of a set of real-world bioenergy technologies at enterprise scale, including biomass logistics and conversion product yields.
We end with an exploration of regional-scale mitigation capacity considering wide-scale deployment and potential wildfire feedback effects of harvest, highlighting the relative importance of supply chain, conversion technology, ecological, and epistemological uncertainties in realizing wide-scale negative emissions in this region.
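The emission-source classification described above amounts to signed bookkeeping over categories: supply-chain and conversion emissions, fossil-energy offsets, biochar sequestration, and leakage. The sketch below is a minimal illustration of that accounting; every number is a hypothetical placeholder, not a value from the study.

```python
# Minimal sketch of net-GHG bookkeeping for a bioenergy/biochar lifecycle
# assessment. Positive = emitted to atmosphere; all values hypothetical.

# Per-tonne-of-feedstock flows, kg CO2e
supply_chain_emissions = 120.0   # harvest, chipping, transport (hypothetical)
conversion_emissions = 40.0      # process energy at the plant (hypothetical)
fossil_energy_offset = -310.0    # displaced grid electricity/heat (hypothetical)
biochar_sequestration = -450.0   # stable carbon returned to soil (hypothetical)
leakage = 25.0                   # indirect market effects (hypothetical)

net = (supply_chain_emissions + conversion_emissions
       + fossil_energy_offset + biochar_sequestration + leakage)

print(f"Net lifecycle balance: {net:+.0f} kg CO2e per tonne feedstock")
# A negative total indicates a carbon-negative pathway.
```

With these placeholder values the pathway nets out negative, which is the condition the study's "negative emissions potential" language refers to.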

  2. Seattle wide-area information for travelers (SWIFT) : architecture study

    DOT National Transportation Integrated Search

    1998-10-19

    The SWIFT (Seattle Wide-area Information For Travelers) Field Operational Test was intended to evaluate the performance of a large-scale urban Advanced Traveler Information System (ATIS) deployment in the Seattle area. The unique features of the SWIF...

  3. Seattle wide-area information for travelers (SWIFT) : consumer acceptance study

    DOT National Transportation Integrated Search

    1998-10-19

    The Seattle Wide-area Information for Travelers (SWIFT) Operational Test was intended to evaluate the performance of a large-scale, urban Advanced Traveler Information System (ATIS) deployment in the Seattle area. With the majority of the SWIFT syste...

  4. Deployment, Design, and Commercialization of Carbon-Negative Energy Systems

    NASA Astrophysics Data System (ADS)

    Sanchez, Daniel Lucio

    Climate change mitigation requires gigaton-scale carbon dioxide removal technologies, yet few examples exist beyond niche markets. This dissertation informs large-scale implementation of bioenergy with carbon capture and sequestration (BECCS), a carbon-negative energy technology. It builds on existing literature with a novel focus on deployment, design, commercialization, and communication of BECCS. BECCS, combined with aggressive renewable deployment and fossil emission reductions, can enable a carbon-negative power system in Western North America by 2050, with up to 145% emissions reduction from 1990 levels. BECCS complements other sources of renewable energy, and can be deployed in a manner consistent with regional policies and design considerations. The amount of biomass resource available limits the level of fossil CO2 emissions that can still satisfy carbon emissions caps. Offsets produced by BECCS are more valuable to the power system than the electricity it provides. Implied costs of carbon for BECCS are relatively low (~$75/ton CO2 at scale) for a capital-intensive technology. Optimal scales for BECCS are an order of magnitude larger than proposed scales found in existing literature. Deviations from optimal size have little effect on overall system costs, suggesting that other factors, including regulatory, political, or logistical considerations, may ultimately have a greater influence on plant size than the techno-economic factors considered. The flexibility of thermochemical conversion enables a viable transition pathway for firms, utilities, and governments to achieve net-negative CO2 emissions in production of electricity and fuels given increasingly stringent climate policy. Primary research, development (R&D), and deployment needs are in large-scale biomass logistics, gasification, gas cleaning, and geological CO2 storage.
R&D programs, subsidies, and policy that recognize co-conversion processes can support this pathway to commercialization. Here, firms can embrace a gradual transition pathway to deep decarbonization, limiting economic dislocation and increasing transfer of knowledge between the fossil and renewable sectors. Global cumulative capital investment needs for BECCS through 2050 are over $1.9 trillion (2015$, 4% real interest rate) for scenarios likely to limit global warming to 2 °C. This scenario envisions deployment of as much as 24 GW/yr of BECCS by 2040 in the electricity sector. To achieve these rates of deployment within 15-20 years, governments and firms must commit to research, development, and deployment on an unprecedented scale. Three primary issues complicate emissions accounting for BECCS: cross-sector CO2 accounting, regrowth, and timing. Switchgrass integration decreases lifecycle greenhouse gas impacts of co-conversion systems with CCS across a wide range of land-use change scenarios. Risks at commercial scale include adverse effects on food security, land conservation, social equity, and biodiversity, as well as competition for water resources. This dissertation argues for an iterative risk management approach to BECCS sustainability, with standards being updated as more knowledge is gained through deployment. Sustainability impacts and public opposition to BECCS may be reduced with transparent measurement and communication. Commercial-scale deployment is dependent on the coordination of a wide range of actors, many with different incentives and worldviews. Despite this problem, this dissertation challenges governments, industry incumbents, and emerging players to research, support, and deploy BECCS.
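The abstract's claim that BECCS offsets are worth more to the power system than its electricity can be illustrated with back-of-envelope arithmetic. The plant parameters below (size, capacity factor, net negativity, wholesale price) are entirely hypothetical; only the roughly 75 $/ton implied carbon price comes from the text.

```python
# Back-of-envelope sketch: offset revenue vs. electricity revenue for a
# hypothetical BECCS plant. Parameters are illustrative, not from the work.

capacity_mw = 500.0        # hypothetical plant size
capacity_factor = 0.80
mwh_per_year = capacity_mw * capacity_factor * 8760
net_negativity = 1.0       # tons CO2 removed per MWh (hypothetical)

tons_removed = mwh_per_year * net_negativity
offset_revenue = tons_removed * 75.0        # $/ton CO2 (from the text)
electricity_revenue = mwh_per_year * 40.0   # hypothetical $/MWh wholesale price

print(f"CO2 removed:         {tons_removed / 1e6:.2f} Mt/yr")
print(f"Offset revenue:      ${offset_revenue / 1e6:.0f} M/yr")
print(f"Electricity revenue: ${electricity_revenue / 1e6:.0f} M/yr")
```

Under these assumptions the offsets dominate, consistent with the abstract's qualitative conclusion.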

  5. Ten Years of Analyzing the Duck Chart: How an NREL Discovery in 2008 Is

    Science.gov Websites

    Examined how to plan for future large-scale integration of solar photovoltaic (PV) generation on the grid. As a result, PV was deployed more widely, and system operators became increasingly concerned about how solar generation would affect operations, with emerging energy and environmental policy initiatives pushing for higher levels of solar PV deployment.

  6. Airframe noise prediction evaluation

    NASA Technical Reports Server (NTRS)

    Yamamoto, Kingo J.; Donelson, Michael J.; Huang, Shumei C.; Joshi, Mahendra C.

    1995-01-01

    The objective of this study is to evaluate the accuracy and adequacy of current airframe noise prediction methods using available airframe noise measurements from tests of a narrow body transport (DC-9) and a wide body transport (DC-10) in addition to scale model test data. General features of the airframe noise from these aircraft and models are outlined. The results of the assessment of two airframe prediction methods, Fink's and Munson's methods, against flight test data of these aircraft and scale model wind tunnel test data are presented. These methods were extensively evaluated against measured data from several configurations including clean, slat deployed, landing gear-deployed, flap deployed, and landing configurations of both DC-9 and DC-10. They were also assessed against a limited number of configurations of scale models. The evaluation was conducted in terms of overall sound pressure level (OASPL), tone corrected perceived noise level (PNLT), and one-third-octave band sound pressure level (SPL).
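Overall sound pressure level, one of the evaluation metrics named above, is obtained by summing band levels on an energy basis rather than adding decibels directly. The sketch below uses hypothetical one-third-octave band levels, not DC-9/DC-10 measurement data.

```python
import math

# Combining one-third-octave band SPLs into an overall sound pressure
# level (OASPL): levels add as 10^(L/10) energies, then convert back.
# Band levels below are hypothetical, not measured aircraft data.

band_spl_db = [78.0, 81.0, 84.0, 82.0, 79.0, 75.0]  # hypothetical band levels

oaspl = 10.0 * math.log10(sum(10 ** (l / 10.0) for l in band_spl_db))
print(f"OASPL = {oaspl:.1f} dB")
```

Note that the result is dominated by the loudest bands: the 84 dB band alone contributes more than a third of the total energy here.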

  7. ACTIVIS: Visual Exploration of Industry-Scale Deep Neural Network Models.

    PubMed

    Kahng, Minsuk; Andrews, Pierre Y; Kalro, Aditya; Polo Chau, Duen Horng

    2017-08-30

    While deep learning models have achieved state-of-the-art accuracies for many prediction tasks, understanding these models remains a challenge. Despite the recent interest in developing visual tools to help users interpret deep learning models, the complexity and wide variety of models deployed in industry, and the large-scale datasets that they used, pose unique design challenges that are inadequately addressed by existing work. Through participatory design sessions with over 15 researchers and engineers at Facebook, we have developed, deployed, and iteratively improved ACTIVIS, an interactive visualization system for interpreting large-scale deep learning models and results. By tightly integrating multiple coordinated views, such as a computation graph overview of the model architecture, and a neuron activation view for pattern discovery and comparison, users can explore complex deep neural network models at both the instance- and subset-level. ACTIVIS has been deployed on Facebook's machine learning platform. We present case studies with Facebook researchers and engineers, and usage scenarios of how ACTIVIS may work with different models.

  8. Travtek Evaluation Modeling Study

    DOT National Transportation Integrated Search

    1996-03-01

    THE FOLLOWING REPORT DESCRIBES A MODELING STUDY THAT WAS PERFORMED TO EXTRAPOLATE, FROM THE TRAVTEK OPERATIONAL TEST DATA, A SET OF SYSTEM WIDE BENEFITS AND PERFORMANCE VALUES FOR A WIDER-SCALE DEPLOYMENT OF A TRAVTEK-LIKE SYSTEM. IN THE FIRST PART O...

  9. Deployment Effects of Marine Renewable Energy Technologies: Wave Energy Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mirko Previsic

    2010-06-17

    Given proper care in siting, design, deployment, operation and maintenance, wave energy conversion could become one of the more environmentally benign sources of electricity generation. In order to accelerate the adoption of these emerging hydrokinetic and marine energy technologies, navigational and environmental concerns must be identified and addressed. All developing hydrokinetic projects involve a wide variety of stakeholders. One of the key issues that site developers face as they engage with this range of stakeholders is that, due to a lack of technical certainty, many of the possible conflicts (e.g., shipping and fishing) and environmental issues are not well understood. In September 2008, re vision consulting, LLC was selected by the Department of Energy (DoE) to apply a scenario-based assessment to the emerging hydrokinetic technology sector in order to evaluate the potential impact of these technologies on the marine environment and navigation constraints. The project’s scope of work includes the establishment of baseline scenarios for wave and tidal power conversion at potential future deployment sites. The scenarios capture variations in technical approaches and deployment scales to properly identify and characterize environmental effects and navigational effects. The goal of the project is to provide all stakeholders with an improved understanding of the potential range of technical attributes and potential effects of these emerging technologies and focus all stakeholders on the critical issues that need to be addressed. By identifying and addressing navigational and environmental concerns in the early stages of the industry’s development, serious mistakes that could potentially derail industry-wide development can be avoided. This groundwork will also help in streamlining siting and associated permitting processes, which are considered key hurdles for the industry’s development in the U.S. today.
Re vision is coordinating its efforts with two other project teams funded by DoE which are focused on regulatory issues (Pacific Energy Ventures) and navigational issues (PCCI). The results of this study are structured into three reports: (1) wave power scenario description, (2) tidal power scenario description, and (3) a framework for identifying key environmental concerns. This is the first report in the sequence and describes the results of conceptual feasibility studies of wave power plants deployed in Humboldt County, California and Oahu, Hawaii. These two sites contain many of the same competing stakeholder interactions identified at other wave power sites in the U.S. and serve as representative case studies. Wave power remains at an early stage of development. As such, a wide range of different technologies are being pursued by different manufacturers. In order to properly characterize potential effects, it is useful to characterize the range of technologies that could be deployed at the site of interest. An industry survey informed the process of selecting representative wave power devices. The selection criteria require that devices be at an advanced stage of development, to reduce technical uncertainties, and that enough data be available from the manufacturers to inform the conceptual design process of this study. Further, an attempt is made to cover the range of different technologies under development to capture variations in potential environmental effects. Table 1 summarizes the selected wave power technologies. A number of other developers are also at an advanced stage of development, but are not directly mentioned here. Many environmental effects will largely scale with the size of the wave power plant. In many cases, the effects of a single device may not be measurable, while larger scale device arrays may have cumulative impacts that differ significantly from smaller scale deployments.
In order to characterize these effects, scenarios are established at three deployment scales which nominally represent (1) a small pilot deployment, (2) a small commercial deployment, and (3) a large commercial-scale plant. It is important to understand that the purpose of this study was to establish baseline scenarios based on basic device data that was provided to us by the manufacturers for illustrative purposes only.
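Scenario sizing for wave power plants starts from the wave resource itself. A minimal sketch of the standard deep-water wave energy flux estimate follows, using hypothetical sea-state values rather than site data for Humboldt County or Oahu.

```python
import math

# Deep-water wave energy flux per metre of wave crest:
#   P = rho * g^2 * Hs^2 * Te / (64 * pi)
# Sea-state values below are hypothetical, not measured site data.

rho = 1025.0   # seawater density, kg/m^3
g = 9.81       # gravitational acceleration, m/s^2
hs = 2.5       # significant wave height, m (hypothetical)
te = 8.0       # energy period, s (hypothetical)

flux_w_per_m = rho * g**2 * hs**2 * te / (64 * math.pi)
print(f"Wave energy flux: {flux_w_per_m / 1000:.1f} kW/m of crest")
```

Multiplying this flux by a plant's capture width and conversion efficiency gives a first-order estimate of rated output, which is how deployment-scale scenarios like these are typically bounded.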

  10. PKI security in large-scale healthcare networks.

    PubMed

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, a number of Public Key Infrastructure (PKI) schemes have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face many challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure addresses the trust issues that arise in a large-scale healthcare network comprising multi-domain PKI infrastructures.
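Multi-domain trust of the kind the abstract mentions is often modeled as path discovery over a cross-certification graph: a certificate is trusted if a chain of cross-certifications links the verifier's CA to the issuer's CA. The sketch below is a hypothetical illustration; the CA names and topology are invented, not taken from the paper.

```python
from collections import deque

# Hypothetical cross-certification graph for a multi-domain healthcare PKI.
# Each CA lists the CAs it has cross-certified; a bridge CA links domains.
cross_certs = {
    "HospitalA-CA": ["Regional-Bridge-CA"],
    "ClinicB-CA": ["Regional-Bridge-CA"],
    "Regional-Bridge-CA": ["HospitalA-CA", "ClinicB-CA", "LabC-CA"],
    "LabC-CA": ["Regional-Bridge-CA"],
}

def trust_path_exists(anchor: str, issuer: str) -> bool:
    """Breadth-first search for a certification path from the verifier's
    trust anchor to the issuing CA."""
    seen, queue = {anchor}, deque([anchor])
    while queue:
        ca = queue.popleft()
        if ca == issuer:
            return True
        for nxt in cross_certs.get(ca, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(trust_path_exists("HospitalA-CA", "LabC-CA"))   # True, via the bridge CA
print(trust_path_exists("HospitalA-CA", "Rogue-CA"))  # False
```

A real validator would additionally check signatures, validity periods, and revocation status along the path; this sketch covers only the topology question.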

  11. Financial Incentives to Enable Clean Energy Deployment: Policy Overview and Good Practices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Sadie

    Financial incentives have been widely implemented by governments around the world to support scaled-up deployment of renewable energy and energy efficiency technologies and practices. As of 2015, at least 48 countries have adopted financial incentives to support renewable energy and energy efficiency deployment. Broader clean energy strategies and plans provide a crucial foundation for financial incentives that often complement regulatory policies such as renewable energy targets, standards, and other mandates. This policy brief provides a primer on key financial incentive design elements, lessons from different country experiences, and curated support resources for more detailed and country-specific financial incentive design information.

  12. IRIS Arrays: Observing Wavefields at Multiple Scales and Frequencies

    NASA Astrophysics Data System (ADS)

    Sumy, D. F.; Woodward, R.; Frassetto, A.

    2014-12-01

    The Incorporated Research Institutions for Seismology (IRIS) provides instruments for creating and operating seismic arrays at a wide range of scales. As an example, for over thirty years the IRIS PASSCAL program has provided instruments to individual Principal Investigators to deploy arrays of all shapes and sizes on every continent. These arrays have ranged from just a few sensors to hundreds or even thousands of sensors, covering areas with dimensions of meters to thousands of kilometers. IRIS also operates arrays directly, such as the USArray Transportable Array (TA) as part of the EarthScope program. Since 2004, the TA has rolled across North America, at any given time spanning a swath of approximately 800 km by 2,500 km, and thus far sampling 2% of the Earth's surface. This achievement includes all of the lower-48 U.S., southernmost Canada, and now parts of Alaska. IRIS has also facilitated specialized arrays in polar environments and on the seafloor. In all cases, the data from these arrays are freely available to the scientific community. As the community of scientists who use IRIS facilities and data look to the future they have identified a clear need for new array capabilities. In particular, as part of its Wavefields Initiative, IRIS is exploring new technologies that can enable large, dense array deployments to record unaliased wavefields at a wide range of frequencies. Large-scale arrays might utilize multiple sensor technologies to best achieve observing objectives and optimize equipment and logistical costs. Improvements in packaging and power systems can provide equipment with reduced size, weight, and power that will reduce logistical constraints for large experiments, and can make a critical difference for deployments in harsh environments or other situations where rapid deployment is required. We will review the range of existing IRIS array capabilities with an overview of previous and current deployments and examples of data and results. 
We will review existing IRIS projects that explore new array capabilities and highlight future directions for IRIS instrumentation facilities.
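The "unaliased wavefields" goal above translates into a spatial Nyquist constraint: station spacing must be no more than half the shortest wavelength of interest. The sketch below uses illustrative values, not IRIS specifications, to show why dense deployments quickly require thousands of sensors.

```python
# Spatial Nyquist criterion for an unaliased wavefield:
#   spacing <= lambda_min / 2, with lambda_min = v_min / f_max.
# All values below are illustrative, not IRIS deployment parameters.

v_min = 1000.0   # slowest phase velocity of interest, m/s (hypothetical)
f_max = 5.0      # highest frequency of interest, Hz (hypothetical)

lambda_min = v_min / f_max        # shortest wavelength: 200 m
max_spacing = lambda_min / 2.0    # required station spacing: 100 m

area_km = 10.0                    # hypothetical 10 km x 10 km footprint
stations_per_side = int(area_km * 1000 / max_spacing) + 1
print(f"Max spacing: {max_spacing:.0f} m")
print(f"Stations for a {area_km:.0f} km square grid: {stations_per_side ** 2}")
```

Even this modest footprint needs on the order of ten thousand stations, which is exactly the cost and logistics pressure the Wavefields Initiative's sensor and packaging work aims to relieve.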

  13. A Commercialization Roadmap for Carbon-Negative Energy Systems

    NASA Astrophysics Data System (ADS)

    Sanchez, D.

    2016-12-01

    The Intergovernmental Panel on Climate Change (IPCC) envisages the need for large-scale deployment of net-negative CO2 emissions technologies by mid-century to meet stringent climate mitigation goals and yield a net drawdown of atmospheric carbon. Yet there are few commercial deployments of BECCS outside of niche markets, creating uncertainty about commercialization pathways and sustainability impacts at scale. This uncertainty is exacerbated by the absence of a strong policy framework, such as high carbon prices and research coordination. Here, we propose a strategy for the potential commercial deployment of BECCS. This roadmap proceeds via three steps: 1) via capture and utilization of biogenic CO2 from existing bioenergy facilities, notably ethanol fermentation, 2) via thermochemical co-conversion of biomass and fossil fuels, particularly coal, and 3) via dedicated, large-scale BECCS. Although biochemical conversion is a proven first market for BECCS, this trajectory alone is unlikely to drive commercialization of BECCS at the gigatonne scale. In contrast to biochemical conversion, thermochemical conversion of coal and biomass enables large-scale production of fuels and electricity with a wide range of carbon intensities, process efficiencies and process scales. Aside from systems integration, the primary technical barriers are in large-scale biomass logistics, gasification, and gas cleaning. Key uncertainties around large-scale BECCS deployment are not limited to commercialization pathways; rather, they include physical constraints on biomass cultivation or CO2 storage, as well as social barriers, including public acceptance of new technologies and conceptions of renewable and fossil energy, which co-conversion systems confound. Despite sustainability risks, this commercialization strategy presents a pathway where energy suppliers, manufacturers and governments could transition from laggards to leaders in climate change mitigation efforts.
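The "wide range of carbon intensities" available to co-conversion can be illustrated by varying the biomass fraction in a simple accounting model. All coefficients below are hypothetical placeholders, not values from the roadmap.

```python
# Sketch: net carbon intensity of a coal-biomass co-fed plant with CCS,
# as a function of biomass fraction. All coefficients are hypothetical.

def net_carbon_intensity(biomass_frac, capture_rate=0.90):
    """kg CO2e per MWh for a co-fed plant with CCS (illustrative)."""
    coal_ci = 900.0      # stack CO2 from coal combustion, kg/MWh (hypothetical)
    biomass_ci = 850.0   # stack CO2 from biomass combustion, kg/MWh (hypothetical)
    supply_chain = 50.0  # fuel logistics emissions, kg CO2e/MWh (hypothetical)
    # Uncaptured fossil CO2 counts as an emission; captured-and-stored
    # biogenic CO2 counts as a removal; biogenic CO2 vented at the stack
    # is treated as carbon-neutral in this simple accounting.
    fossil_emitted = (1 - biomass_frac) * coal_ci * (1 - capture_rate)
    biogenic_stored = biomass_frac * biomass_ci * capture_rate
    return fossil_emitted - biogenic_stored + supply_chain

for frac in (0.0, 0.25, 0.5, 1.0):
    print(f"biomass fraction {frac:.2f}: "
          f"{net_carbon_intensity(frac):+.0f} kg CO2e/MWh")
```

Sweeping the biomass fraction from 0 to 1 moves the plant from modestly positive to deeply negative intensity, which is the design flexibility the abstract attributes to thermochemical co-conversion.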

  14. Automated Deployment of Advanced Controls and Analytics in Buildings

    NASA Astrophysics Data System (ADS)

    Pritoni, Marco

    Buildings use 40% of primary energy in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies. Larger-scale implementations are prevented by the substantial engineering time and customization required, due to significant differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate application of these models in all phases of the deployment of advanced controls and analytics in buildings: in the first phase, "Site Preparation and Interface with Legacy Systems", I used models to discover or map relationships among building components, automatically gathering metadata (information about data points) necessary to run the applications. During the second phase, "Application Deployment and Commissioning", models automatically learn system parameters used for advanced controls and analytics. In the third phase, "Continuous Monitoring and Verification", I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automation of deployment of advanced controls and analytics and accelerate their wide adoption in buildings.
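A common form of "physics-inspired data-driven model" in this setting is a first-order resistor-capacitor (RC) thermal model of a zone, whose two parameters can be learned from data. The sketch below uses hypothetical parameter values, not values fitted in the dissertation.

```python
# First-order RC thermal model of a building zone:
#   C * dT/dt = (T_out - T) / R + Q_hvac
# R, C, and all inputs below are hypothetical, not fitted values.

R = 2.0e-3     # thermal resistance, K/W (hypothetical)
C = 1.0e7      # thermal capacitance, J/K (hypothetical)
dt = 60.0      # simulation time step, s

def step(t_zone, t_out, q_hvac):
    """Advance zone temperature one time step (explicit Euler)."""
    return t_zone + dt * ((t_out - t_zone) / R + q_hvac) / C

t = 20.0
for _ in range(60):            # one hour of free-float: no HVAC, 5 C outside
    t = step(t, t_out=5.0, q_hvac=0.0)
print(f"Zone temperature after 1 h free-float: {t:.2f} C")
```

In a deployment like the one described, R and C would be identified automatically from logged temperature and HVAC data during commissioning, then reused for control and for measurement and verification.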

  15. Scalable and fail-safe deployment of the ATLAS Distributed Data Management system Rucio

    NASA Astrophysics Data System (ADS)

    Lassnig, M.; Vigne, R.; Beermann, T.; Barisits, M.; Garonne, V.; Serfon, C.

    2015-12-01

    This contribution details the deployment of Rucio, the ATLAS Distributed Data Management system. The main complication is that Rucio interacts with a wide variety of external services, and connects globally distributed data centres under different technological and administrative control, at an unprecedented data volume. It is therefore not possible to create a duplicate instance of Rucio for testing or integration. Every software upgrade or configuration change is thus potentially disruptive and requires fail-safe software and automatic error recovery. Rucio uses a three-layer scaling and mitigation strategy based on quasi-realtime monitoring. This strategy mainly employs independent stateless services, automatic failover, and service migration. The technologies used for deployment and mitigation include OpenStack, Puppet, Graphite, HAProxy and Apache. In this contribution, the interplay between these components, their deployment, software mitigation, and the monitoring strategy are discussed.
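The "independent stateless services with automatic failover" strategy can be sketched as a client that retries across redundant endpoints: because the services hold no per-client state, any healthy instance can serve the request. The endpoint names and failure model below are hypothetical, not Rucio's actual topology.

```python
# Sketch of stateless-service failover. Endpoint names are hypothetical;
# in practice a load balancer such as HAProxy performs this role.

ENDPOINTS = ["server-1", "server-2", "server-3"]

def call(endpoint, payload, fail=frozenset()):
    """Stand-in for an HTTP request; raises for 'failed' endpoints."""
    if endpoint in fail:
        raise ConnectionError(endpoint)
    return f"{endpoint} handled {payload}"

def request_with_failover(payload, fail=frozenset()):
    """Try each endpoint in turn; since the services are stateless,
    any healthy instance can complete the request."""
    errors = []
    for ep in ENDPOINTS:
        try:
            return call(ep, payload, fail)
        except ConnectionError as exc:
            errors.append(exc)
    raise RuntimeError(f"all endpoints down: {errors}")

print(request_with_failover("list_replicas", fail={"server-1"}))
```

The same property is what makes rolling software upgrades non-disruptive: instances can be drained and replaced one at a time while the remainder carry the load.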

  16. On the Path to SunShot. Emerging Issues and Challenges in Integrating Solar with the Distribution System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan; Broderick, Robert; Mather, Barry

    2016-05-01

    This report analyzes distribution-integration challenges, solutions, and research needs in the context of distributed generation from PV (DGPV) deployment to date and the much higher levels of deployment expected with achievement of the U.S. Department of Energy's SunShot targets. Recent analyses have improved estimates of the DGPV hosting capacities of distribution systems. This report uses these results to statistically estimate the minimum DGPV hosting capacity for the contiguous United States at approximately 170 GW using traditional inverters, without distribution system modifications. This hosting capacity roughly doubles if advanced inverters are used to manage local voltage, and additional minor, low-cost changes could further increase these levels substantially. Key to achieving these deployment levels at minimum cost is siting DGPV based on local hosting capacities, suggesting opportunities for regulatory, incentive, and interconnection innovation. Already, pre-computed hosting capacity is beginning to expedite DGPV interconnection requests and installations in select regions; however, realizing SunShot-scale deployment will require further improvements to DGPV interconnection processes, standards and codes, and compensation mechanisms so they embrace the contributions of DGPV to system-wide operations. SunShot-scale DGPV deployment will also require unprecedented coordination of the distribution and transmission systems.
This includes harnessing DGPV's ability to relieve congestion and reduce system losses by generating closer to loads; minimizing system operating costs and reserve deployments through improved DGPV visibility; developing communication and control architectures that incorporate DGPV into system operations; providing frequency response, transient stability, and synthesized inertia with DGPV in the event of large-scale system disturbances; and potentially managing reactive power requirements due to large-scale deployment of advanced inverter functions. Finally, additional local and system-level value could be provided by integrating DGPV with energy storage and 'virtual storage,' which exploits improved management of electric vehicle charging, building energy systems, and other large loads. Together, continued innovation across this rich distribution landscape can enable the very-high deployment levels envisioned by SunShot.
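The hosting-capacity gain from advanced inverters follows from the feeder voltage-rise limit: an inverter that absorbs reactive power offsets part of the voltage rise caused by real-power injection. A minimal per-unit sketch follows, with hypothetical feeder impedances.

```python
# Approximate per-unit voltage rise at a PV point of interconnection:
#   dV (pu) ~ (P * R - Q * X) / V^2
# where positive Q means the inverter absorbs reactive power.
# Feeder impedances and PV size below are hypothetical.

def voltage_rise_pu(p_mw, q_mvar, r_ohm=0.5, x_ohm=0.5, v_kv=12.47):
    """Approximate per-unit voltage rise for a radial feeder segment."""
    v_v = v_kv * 1e3
    return (p_mw * 1e6 * r_ohm - q_mvar * 1e6 * x_ohm) / v_v**2

p = 3.0  # MW of PV injection (hypothetical)
plain = voltage_rise_pu(p, q_mvar=0.0)   # unity-power-factor inverter
smart = voltage_rise_pu(p, q_mvar=1.0)   # advanced inverter absorbing 1 Mvar
print(f"Voltage rise, unity power factor: {plain:.4f} pu")
print(f"Voltage rise, absorbing reactive: {smart:.4f} pu")
```

Because the voltage limit binds first on most feeders, shrinking the rise per MW injected directly raises the MW of PV the feeder can host, which is the mechanism behind the rough doubling cited above.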

  17. Lessons Learned in Deploying the World's Largest Scale Lustre File System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillow, David A; Fuller, Douglas; Wang, Feiyi

    2010-01-01

    The Spider system at the Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) is the world's largest scale Lustre parallel file system. Envisioned as a shared parallel file system capable of delivering both the bandwidth and capacity requirements of the OLCF's diverse computational environment, the project had a number of ambitious goals. To support the workloads of the OLCF's diverse computational platforms, the aggregate performance and storage capacity of Spider exceed those of our previously deployed systems by factors of 6x (240 GB/s) and 17x (10 petabytes), respectively. Furthermore, Spider supports over 26,000 clients concurrently accessing the file system, which exceeds our previously deployed systems by nearly 4x. In addition to these scalability challenges, moving to a center-wide shared file system required dramatically improved resiliency and fault-tolerance mechanisms. This paper details our efforts in designing, deploying, and operating Spider. Through a phased approach of research and development, prototyping, deployment, and transition to operations, this work has resulted in a number of insights into large-scale parallel file system architectures, from both the design and the operational perspectives. We present in this paper our solutions to issues such as network congestion, performance baselining and evaluation, file system journaling overheads, and high availability in a system with tens of thousands of components. We also discuss areas of continued challenges, such as stressed metadata performance and the need for file system quality of service, alongside our efforts to address them. Finally, operational aspects of managing a system of this scale are discussed along with real-world data and observations.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cory, K.; Coughlin, J.; Coggeshall, C.

    State and local governments have grown increasingly aware of the economic, environmental, and societal benefits of taking a lead role in U.S. implementation of renewable energy, particularly distributed photovoltaic (PV) installations. Recently, solar energy's cost premium has declined as a result of technology improvements and an increase in the cost of traditional energy generation. At the same time, a nationwide public policy focus on carbon-free, renewable energy has created a wide range of financial incentives to lower the costs of deploying PV even further. These changes have led to exponential increases in the availability of capital for solar projects, and tremendous creativity in the development of third-party ownership structures. As significant users of electricity, state and local governments can be an excellent example for solar PV system deployment on a national scale. Many public entities are not only considering deployment on public building rooftops, but also large-scale applications on available public lands. The changing marketplace requires that state and local governments be financially sophisticated to capture as much of the economic potential of a PV system as possible. This report examines ways that state and local governments can optimize the financial structure of deploying solar PV for public uses.

  19. Deployment Effects of Marine Renewable Energy Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brian Polagye; Mirko Previsic

    2010-06-17

    Given proper care in siting, design, deployment, operation and maintenance, marine and hydrokinetic technologies could become one of the more environmentally benign sources of electricity generation. In order to accelerate the adoption of these emerging hydrokinetic and marine energy technologies, navigational and environmental concerns must be identified and addressed. All developing hydrokinetic projects involve a wide variety of stakeholders. One of the key issues that site developers face as they engage with this range of stakeholders is that many of the possible conflicts (e.g., shipping and fishing) and environmental issues are not well-understood, due to a lack of technical certainty.more » In September 2008, re vision consulting, LLC was selected by the Department of Energy (DoE) to apply a scenario-based approach to the emerging wave and tidal technology sectors in order to evaluate the impact of these technologies on the marine environment and potentially conflicting uses. The project’s scope of work includes the establishment of baseline scenarios for wave and tidal power conversion at potential future deployment sites. The scenarios will capture variations in technical approaches and deployment scales to properly identify and characterize environmental impacts and navigational effects. The goal of the project is to provide all stakeholders with an improved understanding of the potential effects of these emerging technologies and focus all stakeholders onto the critical issues that need to be addressed. This groundwork will also help in streamlining siting and associated permitting processes, which are considered key hurdles for the industry’s development in the U.S. today. Re vision is coordinating its efforts with two other project teams funded by DoE which are focused on regulatory and navigational issues. The results of this study are structured into three reports: 1. Wave power scenario description 2. Tidal power scenario description 3. 
Framework for Identifying Key Environmental Concerns This is the second report in the sequence and describes the results of conceptual feasibility studies of tidal power plants deployed in Tacoma Narrows, Washington. The Narrows contains many of the same competing stakeholder interactions identified at other tidal power sites and serves as a representative case study. Tidal power remains at an early stage of development. As such, a wide range of technologies is being pursued by different manufacturers. To properly characterize impacts, it is useful to characterize the range of technologies that could be deployed at the site of interest. An industry survey informs the process of selecting representative tidal power devices. The selection criteria are that a device be at an advanced stage of development, to reduce technical uncertainty, and that enough data be available from the manufacturer to inform the conceptual design process of this study. Further, an attempt is made to cover the range of technologies under development so as to capture variations in potential environmental effects. A number of developers are at an advanced stage of development, including Verdant Power, which has demonstrated an array of turbines in the East River of New York; Clean Current, which has demonstrated a device off Race Rocks, BC; and OpenHydro, which has demonstrated a device at the European Marine Energy Test Centre and is on the verge of deploying a larger device in the Bay of Fundy. MCT has demonstrated its device both at Devon (UK) and Strangford Narrows (Northern Ireland). Furthermore, devices from OpenHydro, Clean Current, and MCT are the three being installed at the Minas Passage (Canada). Environmental effects will largely scale with the size of tidal power development. 
In many cases, the effects of a single device may not be measurable, while larger-scale device arrays may have cumulative impacts that differ significantly from those of smaller-scale deployments. To characterize these effects, scenarios are established at three deployment scales, nominally representing (1) a small pilot deployment, (2) an early, small commercial deployment, and (3) a large commercial-scale plant. For the three technologies and three scales at the selected site, this results in a total of nine deployment scenarios outlined in the report.
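The nine scenarios above are simply the cross product of the selected technologies and the three deployment scales. A minimal sketch, using placeholder device names (the report's actual device selections are not reproduced here):

```python
# Enumerate deployment scenarios as technology x scale combinations.
# Device names are illustrative placeholders, not the report's labels.
from itertools import product

technologies = ["device A", "device B", "device C"]
scales = ["pilot", "small commercial", "large commercial"]

scenarios = list(product(technologies, scales))
print(len(scenarios))  # 9
```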

  20. Taking it to the streets: delivering on deployment.

    PubMed

    Carr, Dafna; Welch, Vickie; Fabik, Trish; Hirji, Nadir; O'Connor, Casey

    2009-01-01

    From inception to deployment, the Wait Time Information System (WTIS) project faced significant challenges associated with time, scope and complexity. It involved not only the creation and deployment of two large-scale province-wide systems (the WTIS and Ontario's Client Registry/Enterprise Master Patient Index) within aggressive time frames, but also the active engagement of 82 Ontario hospitals, scores of healthcare leaders and several thousand clinicians who would eventually be using the new technology and its data. The provincial WTIS project team (see Figure 1) also had to be able to adapt and evolve their planning in an environment that was changing day-by-day. This article looks at the factors that allowed the team to take the WTIS out to the field and shares the approach, processes and tools used to deploy this complex and ambitious information management and information technology (IM/IT) initiative.

  1. Scaling the PuNDIT project for wide area deployments

    NASA Astrophysics Data System (ADS)

    McKee, Shawn; Batista, Jorge; Carcassi, Gabriele; Dovrolis, Constantine; Lee, Danny

    2017-10-01

    In today's world of distributed scientific collaborations, there are many challenges to providing reliable inter-domain network infrastructure. Network operators use a combination of active monitoring and trouble tickets to detect problems, but these are often ineffective at identifying issues that impact wide-area network users. Additionally, these approaches do not scale to wide-area inter-domain networks because data are unavailable from all the domains along typical network paths. The Pythia Network Diagnostic InfrasTructure (PuNDIT) project aims to create a scalable infrastructure for automating the detection and localization of problems across these networks. The project goal is to gather and analyze metrics from existing perfSONAR monitoring infrastructures to identify the signatures of possible problems, locate affected network links, and report them to the user in an intuitive fashion. Simply put, PuNDIT seeks to convert complex network metrics into easily understood diagnoses in an automated manner. We present our progress to date in developing, testing, and deploying the PuNDIT system, describe the current implementation architecture, and demonstrate some of the user interfaces it will support. We close by discussing the remaining challenges, the next steps, and where we see the project going in the future.
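To give a flavor of what "converting metrics into diagnoses" means, here is a deliberately simplified sketch: flagging measurement windows whose one-way delay deviates from the series baseline. The windowing and z-score threshold are invented for illustration and are not PuNDIT's actual detection algorithm.

```python
# Hypothetical anomaly flagging over perfSONAR-style delay measurements.
# The baseline/z-score rule is illustrative only, not PuNDIT's method.
from statistics import mean, stdev

def flag_delay_problems(delays_ms, window=5, z_thresh=2.0):
    """Return start indices of windows whose mean delay is anomalously high."""
    baseline = mean(delays_ms)
    spread = stdev(delays_ms)
    flagged = []
    for i in range(0, len(delays_ms) - window + 1, window):
        w = delays_ms[i:i + window]
        if spread > 0 and (mean(w) - baseline) / spread > z_thresh:
            flagged.append(i)
    return flagged

# A latency series with a congestion-like spike in the middle.
series = [10.0] * 20 + [80.0] * 5 + [10.0] * 20
print(flag_delay_problems(series))  # [20]
```

A real system would localize the flagged window to a network link by correlating detections across overlapping paths; this sketch stops at detection.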

  2. The Spider Center Wide File System; From Concept to Reality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shipman, Galen M; Dillow, David A; Oral, H Sarp

    2009-01-01

    The Leadership Computing Facility (LCF) at Oak Ridge National Laboratory (ORNL) has a diverse portfolio of computational resources ranging from a petascale XT4/XT5 simulation system (Jaguar) to numerous other systems supporting development, visualization, and data analytics. To support the vastly different I/O needs of these systems, Spider, a Lustre-based center-wide file system, was designed and deployed to provide over 240 GB/s of aggregate throughput with over 10 petabytes of formatted capacity. A multi-stage InfiniBand network, dubbed the Scalable I/O Network (SION), with over 889 GB/s of bisectional bandwidth was deployed as part of Spider to provide connectivity to our simulation, development, visualization, and other platforms. At the time of writing, Spider is, to our knowledge, the largest and fastest POSIX-compliant parallel file system in production. This paper details the overall architecture of the Spider system, the challenges in deploying and initially testing a file system of this scale, and novel solutions to these challenges, which offer key insights into future file system design.

  3. Medical Informatics Education & Research in Greece.

    PubMed

    Chouvarda, I; Maglaveras, N

    2015-08-13

    This paper aims to present an overview of the medical informatics landscape in Greece, to describe the Greek eHealth background, and to highlight the main education and research axes in medical informatics, along with activities, achievements, and pitfalls. With respect to research and education, formal and informal sources were investigated and information was collected and presented in a qualitative manner, also including quantitative indicators where possible. Greece has adopted and applied medical informatics education in various ways, including undergraduate courses in health sciences schools as well as multidisciplinary postgraduate courses. There is a continuous research effort, and large participation in EU-wide initiatives, across the full spectrum of medical informatics research, with notable scientific contributions, although technology maturation is not without barriers. Wide-scale deployment of eHealth is anticipated in the healthcare system in the near future. While ePrescription deployment has been an important step, ICT for integrated care and telehealth have considerable room for further deployment. Greece is a valuable contributor to the European medical informatics arena and has the potential to offer more, provided the barriers of research and innovation fragmentation are addressed and alleviated.

  4. CentNet—A deployable 100-station network for surface exchange research

    NASA Astrophysics Data System (ADS)

    Oncley, S.; Horst, T. W.; Semmer, S.; Militzer, J.; Maclean, G.; Knudson, K.

    2014-12-01

    Climate, air quality, atmospheric composition, surface hydrology, and ecological processes are directly affected by the Earth's surface. This surface is complex at multiple spatial scales, which complicates the understanding of these processes. NCAR/EOL currently provides a facility to the research community for making direct eddy-covariance flux observations to quantify surface-atmosphere interactions. However, just as model resolution has continued to increase, there is a need to increase the spatial density of flux measurements to capture the wide variety of scales that contribute to exchange processes close to the surface. NCAR/EOL has now developed the CentNet facility, which is envisioned to comprise on the order of 100 surface flux stations deployable for periods of months to years. Each station would measure standard meteorological variables, all components of the surface energy balance (including turbulence fluxes and radiation), atmospheric composition, and other quantities that characterize the surface. Thus, CentNet can support observational research in the biogeosciences, hydrology, urban meteorology, basic meteorology, and turbulence. CentNet has been designed to be adaptable to a wide variety of research problems while keeping operations manageable. Tower infrastructure has been designed to be lightweight and easily deployed, with a minimal set-up footprint. CentNet uses sensor networks to increase spatial sampling at each station. The data system saves every sample on site to retain flexibility in data analysis. We welcome guidance on development and funding priorities as we build CentNet.

  5. Practice brief. Securing wireless technology for healthcare.

    PubMed

    Retterer, John; Casto, Brian W

    2004-05-01

    Wireless networking can be a very complex science, requiring an understanding of physics and the electromagnetic spectrum. While the radio theory behind the technology can be challenging, a basic understanding of wireless networking can be sufficient for small-scale deployment. Numerous security mechanisms are available to wireless technologies, making it practical, scalable, and affordable for healthcare organizations. The decision on the selected security model should take into account the needs for additional server hardware and administrative costs. Where wide area network connections exist between cooperative organizations, deployment of a distributed security model can be considered to reduce administrative overhead. The wireless approach chosen should be dynamic and concentrate on the organization's specific environmental needs. Aspects of organizational mission, operations, service level, and budget allotment as well as an organization's risk tolerance are all part of the balance in the decision to deploy wireless technology.

  6. Capturing the Impact of Storage and Other Flexible Technologies on Electric System Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Elaine; Stoll, Brady; Mai, Trieu

    Power systems of the future are likely to require additional flexibility. This has been well studied from an operational perspective but has been more difficult to incorporate into capacity expansion models (CEMs) that study investment decisions on the decadal scale. There are two primary reasons for this. First, the necessary input data, including cost and resource projections, for flexibility options like demand response and storage are significantly uncertain. Second, it is computationally difficult to represent both investment and operational decisions in detail, the latter being necessary to properly value system flexibility, in CEMs for realistically sized systems. In this work, we extend a particular CEM, NREL's Resource Planning Model (RPM), to address the latter issue by better representing variable-generation impacts on operations, and then add two flexible technologies to RPM's suite of investment decisions: interruptible load and utility-scale storage. This work does not develop full suites of input data for these technologies but is rather methodological and exploratory in nature. We thus exercise these new investment decisions in the context of exploring the price points and value streams needed for significant deployment in the Western Interconnection by 2030. Our study of interruptible load finds significant variation by location, year, and overall system conditions. Some locations find no system need for interruptible load even at low costs, while others build the most expensive resources offered. System needs can include planning-reserve capacity to ensure resource adequacy, but there are also particular cases in which spinning reserve requirements drive deployment. Utility-scale storage is found to require deep cost reductions to achieve wide deployment and to be more valuable in locations with greater renewable deployment. 
Differences between more solar- and wind-reliant regions are also found: storage technologies with lower energy capacities are deployed to support solar deployment, while higher-energy-capacity technologies support wind. Finally, we identify potential future research and areas of improvement to build on this initial analysis.

  7. Towards Portable Large-Scale Image Processing with High-Performance Computing.

    PubMed

    Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A

    2018-05-03

    High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), the HPC facility at Vanderbilt University. The initial deployment was native (i.e., direct installation on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with varying hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein, we describe recent innovations using containerization techniques with XNAT/DAX, which are used to isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. 
The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from system level to the application level, (2) flexible and dynamic software development and expansion, and (3) scalable spider deployment compatible with HPC clusters and local workstations.

  8. Medical Informatics Education & Research in Greece

    PubMed Central

    Chouvarda, I.

    2015-01-01

    Summary Objectives This paper aims to present an overview of the medical informatics landscape in Greece, to describe the Greek eHealth background, and to highlight the main education and research axes in medical informatics, along with activities, achievements, and pitfalls. Methods With respect to research and education, formal and informal sources were investigated and information was collected and presented in a qualitative manner, also including quantitative indicators where possible. Results Greece has adopted and applied medical informatics education in various ways, including undergraduate courses in health sciences schools as well as multidisciplinary postgraduate courses. There is a continuous research effort, and large participation in EU-wide initiatives, across the full spectrum of medical informatics research, with notable scientific contributions, although technology maturation is not without barriers. Wide-scale deployment of eHealth is anticipated in the healthcare system in the near future. While ePrescription deployment has been an important step, ICT for integrated care and telehealth have considerable room for further deployment. Conclusions Greece is a valuable contributor to the European medical informatics arena and has the potential to offer more, provided the barriers of research and innovation fragmentation are addressed and alleviated. PMID:26123910

  9. Necessity for Industry-Academic Economic Geology Collaborations for Energy Critical Minerals Research and Development

    NASA Astrophysics Data System (ADS)

    Hitzman, M.

    2012-12-01

    Economic geology is a highly interdisciplinary field utilizing a diverse set of petrologic, geochemical, geophysical, and tectonic data for improved scientific understanding of element migration and concentration in the crust (ore formation). A number of elements that were once laboratory curiosities now figure prominently in new energy technologies (e.g. wind turbines, solar energy collectors). If widely deployed, such technologies have the capacity to transform the way we produce, transmit, store, and conserve energy. To meet domestic and worldwide renewable energy needs these systems must be scaled from laboratory, to demonstration, to widespread deployment. Such technologies are materials intensive. If widely deployed, the elements required by these technologies will be needed in significant quantities and shortage of these "energy critical elements" could significantly inhibit the adoption of otherwise game changing energy technologies. It is imperative to better understand the geology, metallurgy, and mining engineering of critical mineral deposits if we are to sustainably develop these new technologies. There is currently no consensus among federal and state agencies, the national and international mining industry, the public, and the U.S. academic community regarding the importance of economic geology to secure sufficient energy critical elements to undertake large-scale renewable energy development. Available federal funding for critical elements focuses on downstream areas such as metallurgy, substitutions, and recycling rather than primary deposits. Undertaking the required research to discover and mine critical element deposits in an environmentally friendly manner will require significant partnering with industry due to the current lack of federal research support.

  10. Deployment Pulmonary Health

    DTIC Science & Technology

    2015-02-11

    A similar risk-based approach may be appropriate for deploying military personnel. e) If DoD were to consider implementing a large-scale pre... quality of existing spirometry programs prior to considering a larger-scale pre-deployment effort. Identifying an accelerated decrease in spirometry... baseline spirometry on a wider scale. e) Conduct pre-deployment baseline spirometry if there is a significant risk of exposure to a pulmonary hazard based

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Middleton, Richard S.; Levine, Jonathan S.; Bielicki, Jeffrey M.

    CO2 capture, utilization, and storage (CCUS) technology has yet to be widely deployed at a commercial scale despite multiple high-profile demonstration projects. We suggest that developing a large-scale, visible, and financially viable CCUS network could potentially overcome many barriers to deployment and jumpstart commercial-scale CCUS. To date, substantial effort has focused on technology development to reduce the costs of CO2 capture from coal-fired power plants. Here, we propose that near-term investment could focus on implementing CO2 capture on facilities that produce high-value chemicals/products. These facilities can absorb the expected impact of the marginal increase in the cost of production on the price of their product, due to the addition of CO2 capture, better than coal-fired power plants can. A financially viable demonstration of a large-scale CCUS network requires offsetting the costs of CO2 capture by using the CO2 as an input to the production of market-viable products. As a result, we demonstrate this alternative development path with the example of an integrated CCUS system where CO2 is captured from ethylene producers and used for enhanced oil recovery in the U.S. Gulf Coast region.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenberg, Jim; Penuelas, J.; Guenther, Alex B.

    To survey landscape-scale fluxes of biogenic gases, a 100-meter Teflon tube was attached to a tethered balloon as a sampling inlet for a fast-response Proton Transfer Reaction Mass Spectrometer (PTR-MS). Along with meteorological instruments deployed on the tethered balloon and at 3 m, and outputs from a regional weather model, these observations were used to estimate landscape-scale biogenic volatile organic compound (BVOC) fluxes with two micrometeorological techniques: mixed-layer variance and surface-layer gradients. This highly mobile sampling system was deployed at four field sites near Barcelona to estimate landscape-scale BVOC emission factors in a relatively short period (3 weeks). The two micrometeorological techniques agreed within the uncertainty of the flux measurements at all four sites, even though the locations had considerable heterogeneity in species distribution and complex terrain. The observed fluxes were significantly different from emissions predicted with an emission model using site-specific emission factors and land-cover characteristics. Considering the wide range in reported BVOC emission factors for individual vegetation species (more than an order of magnitude), this flux estimation technique is useful for constraining BVOC emission factors used as model inputs.

  13. Study on the three-station typical network deployments of workspace Measurement and Positioning System

    NASA Astrophysics Data System (ADS)

    Xiong, Zhi; Zhu, J. G.; Xue, B.; Ye, Sh. H.; Xiong, Y.

    2013-10-01

    As a novel network coordinate measurement system based on multi-directional positioning, the workspace Measurement and Positioning System (wMPS) has outstanding advantages of good parallelism, wide measurement range, and high measurement accuracy, which have made it a research hotspot and an important development direction in the field of large-scale measurement. Since station deployment has a significant impact on measurement range and accuracy, and also drives deployment cost, the optimization of station deployment was researched in this paper. Firstly, a positioning error model was established. Then, focusing on a small network consisting of three stations, the typical deployments and their error distribution characteristics were studied. Finally, a simulated fuselage was measured using the typical deployments at an industrial site and the results were compared with a Laser Tracker. The comparison shows that, under existing prototype conditions, the I_3 typical deployment, in which the three stations are distributed in a straight line, has an average error of 0.30 mm and a maximum error of 0.50 mm over a range of 12 m. Meanwhile, the C_3 typical deployment, in which the three stations are uniformly distributed around the half-circumference of a circle, has an average error of 0.17 mm and a maximum error of 0.28 mm. C_3 thus controls precision more effectively than I_3. This work provides effective theoretical support for future global measurement network optimization.
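The average and maximum errors quoted above are distances between wMPS coordinates and the Laser Tracker reference points. A minimal sketch of that comparison metric, with made-up point values (the study's actual measurement data are not reproduced here):

```python
# Summarize point-wise deviation from a reference instrument by the
# average and maximum Euclidean error. Coordinates below are invented.
import math

def error_stats(measured, reference):
    """Return (average, maximum) Euclidean error over paired 3D points."""
    errors = [math.dist(m, r) for m, r in zip(measured, reference)]
    return sum(errors) / len(errors), max(errors)

reference = [(0.0, 0.0, 0.0), (4.0, 0.0, 0.0), (4.0, 3.0, 0.0)]
measured  = [(0.1, 0.0, 0.0), (4.0, 0.2, 0.0), (4.0, 3.0, 0.3)]
avg, worst = error_stats(measured, reference)
print(round(avg, 3), round(worst, 3))  # 0.2 0.3
```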

  14. Assessing the durability and efficiency of landscape-based strategies to deploy plant resistance to pathogens

    PubMed Central

    Rey, Jean-François; Barrett, Luke G.; Thrall, Peter H.

    2018-01-01

    Genetically-controlled plant resistance can reduce the damage caused by pathogens. However, pathogens have the ability to evolve and overcome such resistance. This often occurs quickly after resistance is deployed, resulting in significant crop losses and a continuing need to develop new resistant cultivars. To tackle this issue, several strategies have been proposed to constrain the evolution of pathogen populations and thus increase genetic resistance durability. These strategies mainly rely on varying different combinations of resistance sources across time (crop rotations) and space. The spatial scale of deployment can vary from multiple resistance sources occurring in a single cultivar (pyramiding), in different cultivars within the same field (cultivar mixtures) or in different fields (mosaics). However, experimental comparison of the efficiency (i.e. ability to reduce disease impact) and durability (i.e. ability to limit pathogen evolution and delay resistance breakdown) of landscape-scale deployment strategies presents major logistical challenges. Therefore, we developed a spatially explicit stochastic model able to assess the epidemiological and evolutionary outcomes of the four major deployment options described above, including both qualitative resistance (i.e. major genes) and quantitative resistance traits against several components of pathogen aggressiveness: infection rate, latent period duration, propagule production rate, and infectious period duration. This model, implemented in the R package landsepi, provides a new and useful tool to assess the performance of a wide range of deployment options, and helps investigate the effect of landscape, epidemiological and evolutionary parameters. This article describes the model and its parameterisation for rust diseases of cereal crops, caused by fungi of the genus Puccinia. 
To illustrate the model, we use it to assess the epidemiological and evolutionary potential of the combination of a major gene and different traits of quantitative resistance. The comparison of the four major deployment strategies described above will be the objective of future studies. PMID:29649208

  15. Jumpstarting commercial-scale CO2 capture and storage with ethylene production and enhanced oil recovery in the US Gulf

    DOE PAGES

    Middleton, Richard S.; Levine, Jonathan S.; Bielicki, Jeffrey M.; ...

    2015-04-27

    CO2 capture, utilization, and storage (CCUS) technology has yet to be widely deployed at a commercial scale despite multiple high-profile demonstration projects. We suggest that developing a large-scale, visible, and financially viable CCUS network could potentially overcome many barriers to deployment and jumpstart commercial-scale CCUS. To date, substantial effort has focused on technology development to reduce the costs of CO2 capture from coal-fired power plants. Here, we propose that near-term investment could focus on implementing CO2 capture on facilities that produce high-value chemicals/products. These facilities can absorb the expected impact of the marginal increase in the cost of production on the price of their product, due to the addition of CO2 capture, better than coal-fired power plants can. A financially viable demonstration of a large-scale CCUS network requires offsetting the costs of CO2 capture by using the CO2 as an input to the production of market-viable products. As a result, we demonstrate this alternative development path with the example of an integrated CCUS system where CO2 is captured from ethylene producers and used for enhanced oil recovery in the U.S. Gulf Coast region.

  16. Continuous stacking computational approach based automated microscope slide scanner

    NASA Astrophysics Data System (ADS)

    Murali, Swetha; Adhikari, Jayesh Vasudeva; Jagannadh, Veerendra Kalyan; Gorthi, Sai Siva

    2018-02-01

    Cost-effective and automated acquisition of whole slide images is a bottleneck for wide-scale deployment of digital pathology. In this article, a computation augmented approach for the development of an automated microscope slide scanner is presented. The realization of a prototype device built using inexpensive off-the-shelf optical components and motors is detailed. The applicability of the developed prototype to clinical diagnostic testing is demonstrated by generating good quality digital images of malaria-infected blood smears. Further, the acquired slide images have been processed to identify and count the number of malaria-infected red blood cells and thereby perform quantitative parasitemia level estimation. The presented prototype would enable cost-effective deployment of slide-based cyto-diagnostic testing in endemic areas.
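The quantitative parasitemia estimate mentioned above reduces to the fraction of red blood cells classified as infected. A minimal sketch, assuming per-cell classification counts are already available (the image-processing step that produces them is not shown, and the counts below are invented):

```python
# Parasitemia level as the percentage of infected red blood cells.
# Counts are illustrative; detection/classification is assumed done.
def parasitemia_percent(infected_count, total_rbc_count):
    if total_rbc_count == 0:
        raise ValueError("no cells counted")
    return 100.0 * infected_count / total_rbc_count

print(parasitemia_percent(12, 4000))  # 0.3
```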

  17. Hardiness as a predictor of mental health and well-being of Australian army reservists on and after stability operations.

    PubMed

    Orme, Geoffrey J; Kehoe, E James

    2014-04-01

    This study tested whether cognitive hardiness moderates the adverse effects of deployment-related stressors on health and well-being of soldiers on short-tour (4-7 months), peacekeeping operations. Australian Army reservists (N = 448) were surveyed at the start, end, and up to 24 months after serving as peacekeepers in Timor-Leste or the Solomon Islands. They retained sound mental health throughout (Kessler 10, Post-Traumatic Checklist-Civilian, Depression Anxiety Stress Scale 42). Ratings of either traumatic or nontraumatic stress were low. Despite range restrictions, scores on the Cognitive Hardiness Scale moderated the relationship between deployment stressors and a composite measure of psychological distress. Scatterplots revealed an asymmetric pattern for hardiness scores and measures of psychological distress. When hardiness scores were low, psychological distress scores were widely dispersed. However, when hardiness scores were higher, psychological distress scores became concentrated at a uniformly low level. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.

  18. Performance Evaluation of Wearable Sensor Systems: A Case Study in Moderate-Scale Deployment in Hospital Environment.

    PubMed

    Sun, Wen; Ge, Yu; Zhang, Zhiqiang; Wong, Wai-Choong

    2015-09-25

    A wearable sensor system enables continuous and remote health monitoring and is widely considered the next generation of healthcare technology. The performance, and in particular the packet error rate (PER), of a wearable sensor system may deteriorate due to a number of factors, particularly interference from other wearable sensor systems in the vicinity. In this paper, we systematically evaluate the performance of a wearable sensor system in terms of PER in the presence of such interference. The factors that affect performance, such as density, traffic load, and transmission power, are all considered in a realistic moderate-scale hospital deployment case. Simulation results show that with a 20% duty cycle, only 68.5% of data transmissions achieve the targeted reliability requirement (PER less than 0.05), even in the hospital's off-peak period. We then suggest some interference mitigation schemes based on the performance evaluation results of the case study.
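The reliability figure quoted above is the share of transmissions meeting the PER target. A minimal sketch of that metric, with invented per-link packet counts (the study's simulation traces are not reproduced here):

```python
# Share of sensor links meeting a PER target. Per-link
# (errored_packets, total_packets) tuples below are invented.
def reliability_share(link_stats, per_target=0.05):
    """Fraction of links whose packet error rate is below per_target."""
    ok = sum(1 for errs, total in link_stats if errs / total < per_target)
    return ok / len(link_stats)

links = [(2, 100), (7, 100), (1, 50), (30, 200)]
print(reliability_share(links))  # 0.5
```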

  19. Envisioning a Low-Cost Solar Future: Exploring the Potential Impact of Achieving the SunShot 2030 Targets for Photovoltaics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Wesley J; Frew, Bethany A; Gagnon, Pieter J

    In the context of recent dramatic solar energy cost reductions, the U.S. Department of Energy set new levelized cost of energy goals for photovoltaics (PV) to achieve by 2030 to enable significantly greater PV adoption: $0.03/kWh for utility-scale, $0.04/kWh for commercial, and $0.05/kWh for residential PV systems. We analyze the potential impacts of achieving these 'SunShot 2030' cost targets for the contiguous United States using the Regional Energy Deployment System (ReEDS) and Distributed Generation (dGen) capacity expansion models. We consider the impacts under a wide range of future conditions. We find that PV could provide 13%-18% of U.S. electricity demand in 2030 and 28%-64% of demand in 2050 if the SunShot 2030 goals are achieved, with PV deployment increasing in every state. The availability of low-cost storage has the largest impact on projected deployment, followed by natural gas prices and electricity demand. For comparison, PV deployed under a business-as-usual scenario could provide only 5% of generation in 2030 and 17% in 2050. We find that the high levels of PV deployment explored here lead to lower electricity prices and system costs, lower carbon dioxide emissions, lower water consumption, increased renewable energy curtailment, and increased storage deployment compared with the business-as-usual scenario.

  20. Household Energy Consumption Segmentation Using Hourly Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kwac, J; Flora, J; Rajagopal, R

    2014-01-01

    The increasing U.S. deployment of residential advanced metering infrastructure (AMI) has made hourly energy consumption data widely available. Using California smart meter data, we investigate a household electricity segmentation methodology that uses an encoding system with a pre-processed load shape dictionary. Structured approaches using features derived from the encoded data drive five sample program- and policy-relevant energy lifestyle segmentation strategies. We also ensure that the methodologies developed scale to large data sets.
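    A minimal sketch of the encoding step described above: each daily profile is normalised to a unit-sum "load shape" and matched to the nearest entry of a pre-processed shape dictionary, yielding a compact (shape index, daily total) code. The dictionary entries and function names here are invented for illustration; the paper's dictionary is learned from the smart meter data itself.

```python
import numpy as np

def encode_days(loads: np.ndarray, dictionary: np.ndarray):
    """Encode daily load profiles against a load-shape dictionary.

    loads:      (n_days, 24) hourly kWh readings
    dictionary: (n_shapes, 24) unit-sum reference shapes
    Returns (shape index, daily total) per day -- the compact encoding
    that downstream segmentation features are built from.
    """
    totals = loads.sum(axis=1)
    shapes = loads / totals[:, None]               # normalise to unit sum
    # Nearest dictionary entry by Euclidean distance
    dists = ((shapes[:, None, :] - dictionary[None, :, :]) ** 2).sum(axis=2)
    return dists.argmin(axis=1), totals

# Two toy dictionary shapes: flat usage vs. an evening peak
flat = np.full(24, 1 / 24)
evening = np.zeros(24); evening[18:22] = 0.25
dictionary = np.vstack([flat, evening])

day = np.zeros(24); day[18:22] = 3.0               # 12 kWh, all in the evening
codes, totals = encode_days(day[None, :], dictionary)
print(codes, totals)                               # shape index 1, 12 kWh total
```

    Separating shape from magnitude is the design choice that makes the encoding scale: segmentation can then cluster on shape frequencies without carrying the raw hourly series around.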

  1. Calibration procedure for Slocum glider deployed optical instruments.

    PubMed

    Cetinić, Ivona; Toro-Farmer, Gerardo; Ragan, Matthew; Oberg, Carl; Jones, Burton H

    2009-08-31

    Recent developments in the field of autonomous underwater vehicles allow wide use of these platforms in scientific experiments, monitoring campaigns and more. The vehicles are often equipped with sensors measuring temperature, conductivity, chlorophyll a (Chl a) fluorescence, colored dissolved organic matter (CDOM) fluorescence, phycoerythrin (PE) fluorescence and the spectral volume scattering function at 117 degrees, providing users with high-resolution, real-time data. However, calibration of these instruments can be problematic. Most in situ calibrations are performed by deploying complementary instrument packages or water samplers in the proximity of the glider. Laboratory calibrations of the mounted sensors are difficult due to the placement of the instruments within the body of the vehicle. For laboratory calibrations of the Slocum glider instruments we developed a small calibration chamber in which we can perform precise calibrations of the optical instruments aboard our glider, as well as sensors from other deployment platforms. These procedures enable us to obtain pre- and post-deployment calibrations for optical fluorescence instruments, which may differ due to biofouling and other physical damage that can occur during long-term glider deployments. We found that biofouling caused significant changes in the calibration scaling factors of fluorescent sensors, suggesting the need for consistent and repeated calibrations for gliders as proposed in this paper.
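    In the simplest case, the calibration scaling factor described above is the slope of a linear fit of raw sensor counts against known standard concentrations; comparing pre- and post-deployment slopes quantifies biofouling drift. The standards and counts below are hypothetical numbers, not the authors' chamber data.

```python
import numpy as np

def scale_factor(standards, counts):
    """Least-squares slope of raw sensor counts vs. known standard
    concentrations -- a linear calibration scaling factor."""
    slope, _intercept = np.polyfit(standards, counts, 1)
    return slope

# Hypothetical chamber calibrations of a Chl-a fluorometer (ug/L vs counts)
standards = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
pre_counts = standards * 50.0 + 3.0          # pre-deployment response
post_counts = standards * 41.0 + 3.0         # dimmer optics after biofouling

pre = scale_factor(standards, pre_counts)
post = scale_factor(standards, post_counts)
drift = (post - pre) / pre
print(f"scale factor drift: {drift:+.1%}")   # a large drift flags recalibration
```

    Bracketing each deployment with such fits is what lets biofouling-induced drift be corrected for in the recorded time series rather than discovered after the fact.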

  2. Deployment dynamics and control of large-scale flexible solar array system with deployable mast

    NASA Astrophysics Data System (ADS)

    Li, Hai-Quan; Liu, Xiao-Feng; Guo, Shao-Jing; Cai, Guo-Ping

    2016-10-01

    In this paper, deployment dynamics and control of large-scale flexible solar array system with deployable mast are investigated. The adopted solar array system is introduced firstly, including system configuration, deployable mast and solar arrays with several mechanisms. Then dynamic equation of the solar array system is established by the Jourdain velocity variation principle and a method for dynamics with topology changes is introduced. In addition, a PD controller with disturbance estimation is designed to eliminate the drift of spacecraft mainbody. Finally the validity of the dynamic model is verified through a comparison with ADAMS software and the deployment process and dynamic behavior of the system are studied in detail. Simulation results indicate that the proposed model is effective to describe the deployment dynamics of the large-scale flexible solar arrays and the proposed controller is practical to eliminate the drift of spacecraft mainbody.
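    The paper's controller can be illustrated on a single-axis rigid-body stand-in for the spacecraft mainbody: a PD law plus a simple constant-disturbance estimator standing in for the paper's disturbance estimation. The model, gains, and estimator form are all invented for this sketch, not taken from the paper.

```python
# Single-axis stand-in: J*theta'' = u + d, where d is the reaction torque
# from the deploying array.  PD control plus a constant-disturbance
# estimator (integral of the attitude error).  All values illustrative.
J, d_true = 1.0, 0.5
kp, kd, ki = 11.0, 6.0, 6.0            # closed-loop poles at -1, -2, -3
dt, theta, omega, d_hat = 0.01, 0.0, 0.0, 0.0

for _ in range(int(30.0 / dt)):        # 30 s of deployment, Euler steps
    u = -kp * theta - kd * omega - d_hat   # PD + disturbance feedforward
    alpha = (u + d_true) / J
    theta += omega * dt
    omega += alpha * dt
    d_hat += ki * theta * dt           # estimator integrates the drift

print(f"residual drift {theta:+.2e} rad, torque estimate {d_hat:.3f} N*m")
```

    Without the feedforward term the PD loop alone would leave a steady-state offset of d/kp; the estimator converges to the true disturbance torque and drives the mainbody drift to zero, mirroring the role of the paper's disturbance estimation.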

  3. Balancing Europe's wind power output through spatial deployment informed by weather regimes.

    PubMed

    Grams, Christian M; Beerli, Remo; Pfenninger, Stefan; Staffell, Iain; Wernli, Heini

    2017-08-01

    As wind and solar power provide a growing share of Europe's electricity [1], understanding and accommodating their variability on multiple timescales remains a critical problem. On weekly timescales, variability is related to long-lasting weather conditions, called weather regimes [2-5], which can cause lulls with a loss of wind power across neighbouring countries [6]. Here we show that weather regimes provide a meteorological explanation for multi-day fluctuations in Europe's wind power and can help guide new deployment pathways which minimise this variability. Mean generation during different regimes currently ranges from 22 GW to 44 GW and is expected to triple by 2030 with current planning strategies. However, balancing future wind capacity across regions with contrasting inter-regime behaviour - specifically deploying in the Balkans instead of the North Sea - would almost eliminate these output variations, maintain mean generation, and increase fleet-wide minimum output. Solar photovoltaics could balance low-wind regimes locally, but only by expanding current capacity tenfold. New deployment strategies based on an understanding of continent-scale wind patterns and pan-European collaboration could enable a high share of wind energy whilst minimising the negative impacts of output variability.

  4. Validation of Scales from the Deployment Risk and Resilience Inventory in a Sample of Operation Iraqi Freedom Veterans

    ERIC Educational Resources Information Center

    Vogt, Dawne S.; Proctor, Susan P.; King, Daniel W.; King, Lynda A.; Vasterling, Jennifer J.

    2008-01-01

    The Deployment Risk and Resilience Inventory (DRRI) is a suite of scales that can be used to assess deployment-related factors implicated in the health and well-being of military veterans. Although initial evidence for the reliability and validity of DRRI scales based on Gulf War veteran samples is encouraging, evidence with respect to a more…

  5. Newborn Sequencing in Genomic Medicine and Public Health

    PubMed Central

    Agrawal, Pankaj B.; Bailey, Donald B.; Beggs, Alan H.; Brenner, Steven E.; Brower, Amy M.; Cakici, Julie A.; Ceyhan-Birsoy, Ozge; Chan, Kee; Chen, Flavia; Currier, Robert J.; Dukhovny, Dmitry; Green, Robert C.; Harris-Wai, Julie; Holm, Ingrid A.; Iglesias, Brenda; Joseph, Galen; Kingsmore, Stephen F.; Koenig, Barbara A.; Kwok, Pui-Yan; Lantos, John; Leeder, Steven J.; Lewis, Megan A.; McGuire, Amy L.; Milko, Laura V.; Mooney, Sean D.; Parad, Richard B.; Pereira, Stacey; Petrikin, Joshua; Powell, Bradford C.; Powell, Cynthia M.; Puck, Jennifer M.; Rehm, Heidi L.; Risch, Neil; Roche, Myra; Shieh, Joseph T.; Veeraraghavan, Narayanan; Watson, Michael S.; Willig, Laurel; Yu, Timothy W.; Urv, Tiina; Wise, Anastasia L.

    2017-01-01

    The rapid development of genomic sequencing technologies has decreased the cost of genetic analysis to the extent that it seems plausible that genome-scale sequencing could have widespread availability in pediatric care. Genomic sequencing provides a powerful diagnostic modality for patients who manifest symptoms of monogenic disease and an opportunity to detect health conditions before their development. However, many technical, clinical, ethical, and societal challenges should be addressed before such technology is widely deployed in pediatric practice. This article provides an overview of the Newborn Sequencing in Genomic Medicine and Public Health Consortium, which is investigating the application of genome-scale sequencing in newborns for both diagnosis and screening. PMID:28096516

  6. SunShot 2030 for Photovoltaics (PV): Envisioning a Low-cost PV Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Wesley J.; Frew, Bethany A.; Gagnon, Pieter J.

    In this report we summarize the implications, impacts, and deployment potential of reaching the SunShot 2030 targets for the electricity system in the contiguous United States. We model 25 scenarios of the U.S. power sector using the Regional Energy Deployment Systems (ReEDS) and Distributed Generation (dGen) capacity expansion models. The scenarios cover a wide range of sensitivities to capture future uncertainties relating to fuel prices, retirements, renewable energy capital costs, and load growth. We give special attention to the potential for storage costs to also rapidly decline due to its large synergies with low-cost solar. The ReEDS and dGen models project utility- and distributed-scale power sector evolution, respectively, for the United States. Both models have been designed with special emphasis on capturing the unique traits of renewable energy, including variability and grid integration requirements. Across the suite of scenarios modeled, we find that reaching the SunShot 2030 target has the potential to lead to significant capacity additions of PV in the United States. By 2050, PV penetration levels are projected to reach 28-46 percent of total generation. If storage also sees significant reductions in cost, then the 2050 solar penetration levels could reach 41-64 percent. PV deployment is projected to occur in all of the lower 48 states, though the specific deployment level is scenario dependent. The growth in PV is projected to be dominated by utility-scale systems, but the actual mix between utility and distributed systems could ultimately vary depending on how policies, system costs, and rate structures evolve.

  7. White Paper on Dish Stirling Technology: Path Toward Commercial Deployment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andraka, Charles E.; Stechel, Ellen; Becker, Peter

    2016-07-01

    Dish Stirling energy systems have been developed for distributed and large-scale utility deployment. This report summarizes the state of the technology in a joint project between Stirling Energy Systems, Sandia National Laboratories, and the Department of Energy in 2011. It then lays out a feasible path to large-scale deployment, including development needs and anticipated cost-reduction paths toward a commercially viable product.

  8. Pathology and toxicology findings for search-and-rescue dogs deployed to the September 11, 2001, terrorist attack sites: initial five-year surveillance.

    PubMed

    Fitzgerald, Scott D; Rumbeiha, Wilson K; Emmett Braselton, W; Downend, Amanda B; Otto, Cynthia M

    2008-07-01

    A long-term surveillance study was conducted on 95 search-and-rescue (S&R) dogs deployed to the September 11, 2001, terrorist attack sites; an additional 55 nondeployed S&R dogs served as controls. After 5 years of surveillance, 32% of the deployed dogs and 24% of the nondeployed dogs have died. The mean age at the time of death in these 2 groups of dogs is not significantly different. Causes of death in both groups of dogs include inflammatory, degenerative, and proliferative conditions. No primary pulmonary tumors have been identified to date, nor has any significant level of toxicant been found in the tissues from these dogs using assays for general organic compounds and metals or, specifically, for polychlorinated biphenyls. However, significant numbers of both deployed and nondeployed dogs have evidence of inhaled matter as demonstrated by the presence of anthracotic pigments or refractile particulate matter in pulmonary tissue. Although S&R activities in response to the 9/11 terrorist attacks exposed dogs to a wide variety of potentially toxic compounds, to date, these dogs do not appear to suffer from higher mortality or increased pulmonary disease compared with nondeployed dogs. To the authors' knowledge, the current survey represents the first long-term and large-scale survey of the pathology and toxicology of S&R dogs deployed to a major disaster site.

  9. An Xdata Architecture for Federated Graph Models and Multi-tier Asymmetric Computing

    DTIC Science & Technology

    2014-01-01

    Wikipedia, a scale-free random graph (kron), Akamai trace route data, Bitcoin transaction data, and a Twitter follower network. We present results for...3x (SSSP on a random graph) and nearly 300x (Akamai and Bitcoin) over the CPU performance of a well-known and widely deployed CPU-based graph...provided better throughput for smaller frontiers such as roadmaps or the Bitcoin data set. In our work, we have focused on two-phase kernels, but it

  10. Wireless Technology Infrastructures for Authentication of Patients: PKI that Rings

    PubMed Central

    Sax, Ulrich; Kohane, Isaac; Mandl, Kenneth D.

    2005-01-01

    As the public interest in consumer-driven electronic health care applications rises, so do concerns about the privacy and security of these applications. Achieving a balance between providing the necessary security while promoting user acceptance is a major obstacle in large-scale deployment of applications such as personal health records (PHRs). Robust and reliable forms of authentication are needed for PHRs, as the record will often contain sensitive and protected health information, including the patient's own annotations. Since the health care industry per se is unlikely to succeed at single-handedly developing and deploying a large scale, national authentication infrastructure, it makes sense to leverage existing hardware, software, and networks. This report proposes a new model for authentication of users to health care information applications, leveraging wireless mobile devices. Cell phones are widely distributed, have high user acceptance, and offer advanced security protocols. The authors propose harnessing this technology for the strong authentication of individuals by creating a registration authority and an authentication service, and examine the problems and promise of such a system. PMID:15684133
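    The registration-authority / authentication-service split described above can be sketched as a shared-secret challenge-response: the registration authority provisions a secret onto the patient's phone, and the authentication service verifies an HMAC over a fresh challenge. This symmetric-key sketch is a simplification of the paper's proposal, which is PKI-based; all names below are illustrative.

```python
import hashlib
import hmac
import secrets

# Registration authority provisions a shared secret onto the patient's phone
phone_secret = secrets.token_bytes(32)
registry = {"patient42": phone_secret}        # authentication service's copy

def phone_respond(secret: bytes, challenge: bytes) -> str:
    # The phone signs the fresh challenge with its provisioned secret
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

def service_verify(user: str, challenge: bytes, response: str) -> bool:
    expected = hmac.new(registry[user], challenge, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)  # constant-time compare

challenge = secrets.token_bytes(16)           # fresh nonce per login attempt
response = phone_respond(phone_secret, challenge)
print(service_verify("patient42", challenge, response))   # -> True
```

    A fresh challenge per attempt prevents replay; in the asymmetric (PKI) variant the phone would instead sign the challenge with a private key, so the service never holds a secret that could impersonate the patient.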

  11. Wireless technology infrastructures for authentication of patients: PKI that rings.

    PubMed

    Sax, Ulrich; Kohane, Isaac; Mandl, Kenneth D

    2005-01-01

    As the public interest in consumer-driven electronic health care applications rises, so do concerns about the privacy and security of these applications. Achieving a balance between providing the necessary security while promoting user acceptance is a major obstacle in large-scale deployment of applications such as personal health records (PHRs). Robust and reliable forms of authentication are needed for PHRs, as the record will often contain sensitive and protected health information, including the patient's own annotations. Since the health care industry per se is unlikely to succeed at single-handedly developing and deploying a large scale, national authentication infrastructure, it makes sense to leverage existing hardware, software, and networks. This report proposes a new model for authentication of users to health care information applications, leveraging wireless mobile devices. Cell phones are widely distributed, have high user acceptance, and offer advanced security protocols. The authors propose harnessing this technology for the strong authentication of individuals by creating a registration authority and an authentication service, and examine the problems and promise of such a system.

  12. Wave resource variability: Impacts on wave power supply over regional to international scales

    NASA Astrophysics Data System (ADS)

    Smith, Helen; Fairley, Iain; Robertson, Bryson; Abusara, Mohammad; Masters, Ian

    2017-04-01

    The intermittent, irregular and variable nature of the wave energy resource has implications for the supply of wave-generated electricity into the grid. Intermittency of renewable power may lead to frequency and voltage fluctuations in the transmission and distribution networks. A matching supply of electricity must be planned to meet the predicted demand, leading to a need for gas-fired and back-up generating plants to supplement intermittent supplies, and potentially limiting the integration of intermittent power into the grid. Issues relating to resource intermittency and their mitigation through the development of spatially separated sites have been widely researched in the wind industry, but have received little attention to date in the less mature wave industry. This study analyses the wave resource over three different spatial scales to investigate the potential impacts of the temporal and spatial resource variability on the grid supply. The primary focus is the Southwest UK, a region already home to multiple existing and proposed wave energy test sites. Concurrent wave buoy data from six locations, supported by SWAN wave model hindcast data, are analysed to assess the correlation of the resource across the region and the variation in wave power with direction. Power matrices for theoretical nearshore and offshore devices are used to calculate the maximum step change in generated power across the region as the number of deployment sites is increased. The step change analysis is also applied across national and international spatial scales using output from the European Centre for Medium-range Weather Forecasting (ECMWF) ERA-Interim hindcast model. It is found that the deployment of multiple wave energy sites, whether on a regional, national or international scale, results in both a reduction in step changes in power and reduced times of zero generation, leading to an overall smoothing of the wave-generated electrical power. 
This has implications for the planning and siting of future wave energy arrays when the industry reaches the point of large-scale deployment.
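    The step-change metric underlying this analysis can be illustrated with synthetic site series: phase-shifted power profiles whose hour-to-hour changes partially cancel when aggregated. The sinusoidal "wave power" series and phase offsets below are purely illustrative, not the study's buoy or hindcast data.

```python
import numpy as np

hours = np.arange(24 * 14)                     # two weeks of hourly output

def site_power(phase):
    # Toy wave-power series: a swell cycle with a site-specific phase offset
    return 1.0 + np.sin(2 * np.pi * hours / 36 + phase)

def max_step_change(p):
    # Largest hour-to-hour change, normalised by mean output
    return np.abs(np.diff(p)).max() / p.mean()

single = max_step_change(site_power(0.0))
phases = [0.0, 0.9, 2.0, 3.1, 4.2, 5.4]        # six spatially separated sites
combined = max_step_change(sum(site_power(ph) for ph in phases))

print(f"single site: {single:.3f}, six spread sites: {combined:.3f}")
```

    Because the site series are out of phase, their fluctuations offset one another in the aggregate, so the normalised maximum step change of the combined fleet is well below that of any single site, which is the smoothing effect the study quantifies with real power matrices.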

  13. ENGINEERING DEVELOPMENT UNIT SOLAR SAIL

    NASA Image and Video Library

    2016-01-13

    TIFFANY LOCKETT OVERSEES THE HALF SCALE (36 SQUARE METERS) ENGINEERING DEVELOPMENT UNIT (EDU) SOLAR SAIL DEPLOYMENT DEMONSTRATION IN PREPARATION FOR FULL SCALE EDU (86 SQUARE METERS) DEPLOYMENT IN APRIL, 2016

  14. A controlled field pilot for testing near surface CO2 detection techniques and transport models

    USGS Publications Warehouse

    Spangler, L.H.; Dobeck, L.M.; Repasky, K.; Nehrir, A.; Humphries, S.; Keith, C.; Shaw, J.; Rouse, J.; Cunningham, A.; Benson, S.; Oldenburg, C.M.; Lewicki, J.L.; Wells, A.; Diehl, R.; Strazisar, B.; Fessenden, J.; Rahn, Thomas; Amonette, J.; Barr, J.; Pickles, W.; Jacobson, J.; Silver, E.; Male, E.; Rauch, H.; Gullickson, K.; Trautz, R.; Kharaka, Y.; Birkholzer, J.; Wielopolski, L.

    2009-01-01

    A field facility has been developed to allow controlled studies of near surface CO2 transport and detection technologies. The key component of the facility is a shallow, slotted horizontal well divided into six zones. The scale and fluxes were designed to address large scale CO2 storage projects and desired retention rates for those projects. A wide variety of detection techniques were deployed by collaborators from 6 national labs, 2 universities, EPRI, and the USGS. Additionally, modeling of CO2 transport and concentrations in the saturated soil and in the vadose zone was conducted. An overview of these results will be presented. © 2009 Elsevier Ltd. All rights reserved.

  15. Newborn Sequencing in Genomic Medicine and Public Health.

    PubMed

    Berg, Jonathan S; Agrawal, Pankaj B; Bailey, Donald B; Beggs, Alan H; Brenner, Steven E; Brower, Amy M; Cakici, Julie A; Ceyhan-Birsoy, Ozge; Chan, Kee; Chen, Flavia; Currier, Robert J; Dukhovny, Dmitry; Green, Robert C; Harris-Wai, Julie; Holm, Ingrid A; Iglesias, Brenda; Joseph, Galen; Kingsmore, Stephen F; Koenig, Barbara A; Kwok, Pui-Yan; Lantos, John; Leeder, Steven J; Lewis, Megan A; McGuire, Amy L; Milko, Laura V; Mooney, Sean D; Parad, Richard B; Pereira, Stacey; Petrikin, Joshua; Powell, Bradford C; Powell, Cynthia M; Puck, Jennifer M; Rehm, Heidi L; Risch, Neil; Roche, Myra; Shieh, Joseph T; Veeraraghavan, Narayanan; Watson, Michael S; Willig, Laurel; Yu, Timothy W; Urv, Tiina; Wise, Anastasia L

    2017-02-01

    The rapid development of genomic sequencing technologies has decreased the cost of genetic analysis to the extent that it seems plausible that genome-scale sequencing could have widespread availability in pediatric care. Genomic sequencing provides a powerful diagnostic modality for patients who manifest symptoms of monogenic disease and an opportunity to detect health conditions before their development. However, many technical, clinical, ethical, and societal challenges should be addressed before such technology is widely deployed in pediatric practice. This article provides an overview of the Newborn Sequencing in Genomic Medicine and Public Health Consortium, which is investigating the application of genome-scale sequencing in newborns for both diagnosis and screening. Copyright © 2017 by the American Academy of Pediatrics.

  16. How well could existing sensors detect the deployment of a solar radiation management (SRM) geoengineering effort?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurd, Alan J.

    2016-04-29

    While the stated reason for asking this question is “to understand better our ability to warn policy makers in the unlikely event of an unanticipated SRM geoengineering deployment or large-scale field experiment”, my colleagues and I felt that motives would be important context because the scale of any meaningful SRM deployment would be so large that covert deployment seems impossible. However, several motives emerged that suggest a less-than-global effort might be important.

  17. Application of Robotics in Decommissioning and Decontamination - 12536

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banford, Anthony; Kuo, Jeffrey A.; Bowen, R.A.

    Decommissioning and dismantling of nuclear facilities is a significant challenge worldwide and one which is growing in size as more plants reach the end of their operational lives. The strategy chosen for individual projects varies from the hands-on approach with significant manual intervention using traditional demolition equipment at one extreme to bespoke highly engineered robotic solutions at the other. The degree of manual intervention is limited by the hazards and risks involved, and in some plants is unacceptable. Robotic remote engineering is often viewed as more expensive and less reliable than manual approaches, with significant lead times and capital expenditure. However, advances in robotics and automation in other industries offer potential benefits for future decommissioning activities, with the high probability of reducing worker exposure and other safety risks as well as reducing the schedule and costs required to complete these activities. Some nuclear decommissioning tasks and facility environments are so hazardous that they can only be accomplished by exclusive use of robotic and remote intervention. Less hazardous tasks can be accomplished by manual intervention and the use of PPE. However, PPE greatly decreases worker productivity and still exposes the worker to both risk and dose, making remote operation preferable to achieve ALARP. Before remote operations can be widely accepted and deployed, there are economic and technological challenges that must be addressed. These challenges will require long-term investment commitments in order for technology to be:
    - Specifically developed for nuclear applications;
    - At a sufficient TRL for practical deployment;
    - Readily available as COTS products.
    Tremendous opportunities exist to reduce cost and schedule and improve safety in D and D activities through the use of robotic and/or tele-operated systems:
    - Increasing the level of remote intervention reduces the risk and dose to an operator. Better environmental information identifies hazards, which can be assessed, managed and mitigated.
    - Tele-autonomous control in a congested, unstructured environment is more reliable than a human operator. Advances in human-machine interfaces contribute to reliability and task optimization. Use of standardized dexterous manipulators and COTS, including standardized communication protocols, reduces project timescales.
    - The technologies identified, if developed to a sufficient TRL, would all contribute to cost reductions. Additionally, optimizing a project's position on a Remote Intervention Scale, a Bespoke Equipment Scale and a Tele-autonomy Scale would provide cost reductions from the start of a project.
    Of the technologies identified, tele-autonomy is arguably the most significant, because it would provide a fundamental positive change for robotic control in the nuclear industry. The challenge for technology developers is to develop versatile robotic technology that can be economically deployed to a wide range of future D and D projects and industrial sectors. The challenge for facility owners and project managers is to partner with the developers to provide accurate systems requirements and an open and receptive environment for testing and deployment. To facilitate this development and deployment effort, the NNL and DOE have initiated discussions to explore a collaborative R and D program that would accelerate development and support the optimum utilization of resources. (authors)

  18. Security Issues in Cross-Organizational Peer-to-Peer Applications and Some Solutions

    NASA Astrophysics Data System (ADS)

    Gupta, Ankur; Awasthi, Lalit K.

    Peer-to-Peer networks have been widely used for sharing millions of terabytes of content, for large-scale distributed computing and for a variety of other novel applications, due to their scalability and fault-tolerance. However, the scope of P2P networks has somehow been limited to individual computers connected to the internet. P2P networks are also notorious for blatant copyright violations and facilitating several kinds of security attacks. Businesses and large organizations have thus stayed away from deploying P2P applications citing security loopholes in P2P systems as the biggest reason for non-adoption. In theory P2P applications can help fulfill many organizational requirements such as collaboration and joint projects with other organizations, access to specialized computing infrastructure and finally accessing the specialized information/content and expert human knowledge available at other organizations. These potentially beneficial interactions necessitate that the research community attempt to alleviate the security shortcomings in P2P systems and ensure their acceptance and wide deployment. This research paper therefore examines the security issues prevalent in enabling cross-organizational P2P interactions and provides some technical insights into how some of these issues can be resolved.

  19. CMS Use of a Data Federation

    NASA Astrophysics Data System (ADS)

    Bloom, Kenneth; Cms Collaboration

    2014-06-01

    CMS is in the process of deploying an Xrootd-based infrastructure to facilitate a global data federation. The services of the federation are available to export data from half the physical capacity, and the majority of sites are configured to read data over the federation as a back-up. CMS began with a relatively modest set of use-cases: recovery of failed local file opens, debugging, and visualization. CMS is finding that the data federation can also be used to support small-scale analysis and load balancing. Looking forward, we see potential in using the federation to provide more flexibility in where workflows are executed, as the difference between local access and wide-area access is diminished by optimization and improved networking. In this presentation we discuss the application development work and the facility deployment work, the use-cases currently in production, and the potential for the technology moving forward.

  20. A three-dimensional actuated origami-inspired transformable metamaterial with multiple degrees of freedom

    NASA Astrophysics Data System (ADS)

    Overvelde, Johannes T. B.; de Jong, Twan A.; Shevchenko, Yanina; Becerra, Sergio A.; Whitesides, George M.; Weaver, James C.; Hoberman, Chuck; Bertoldi, Katia

    2016-03-01

    Reconfigurable devices, whose shape can be drastically altered, are central to expandable shelters, deployable space structures, reversible encapsulation systems and medical tools and robots. All these applications require structures whose shape can be actively controlled, both for deployment and to conform to the surrounding environment. While most current reconfigurable designs are application specific, here we present a mechanical metamaterial with tunable shape, volume and stiffness. Our approach exploits a simple modular origami-like design consisting of rigid faces and hinges, which are connected to form a periodic structure consisting of extruded cubes. We show both analytically and experimentally that the transformable metamaterial has three degrees of freedom, which can be actively deformed into numerous specific shapes through embedded actuation. The proposed metamaterial can be used to realize transformable structures with arbitrary architectures, highlighting a robust strategy for the design of reconfigurable devices over a wide range of length scales.

  1. Chip-based quantum key distribution

    NASA Astrophysics Data System (ADS)

    Sibson, P.; Erven, C.; Godfrey, M.; Miki, S.; Yamashita, T.; Fujiwara, M.; Sasaki, M.; Terai, H.; Tanner, M. G.; Natarajan, C. M.; Hadfield, R. H.; O'Brien, J. L.; Thompson, M. G.

    2017-02-01

    Improvement in secure transmission of information is an urgent need for governments, corporations and individuals. Quantum key distribution (QKD) promises security based on the laws of physics and has rapidly grown from proof-of-concept to robust demonstrations and deployment of commercial systems. Despite these advances, QKD has not been widely adopted, and large-scale deployment will likely require chip-based devices for improved performance, miniaturization and enhanced functionality. Here we report low error rate, GHz clocked QKD operation of an indium phosphide transmitter chip and a silicon oxynitride receiver chip--monolithically integrated devices using components and manufacturing processes from the telecommunications industry. We use the reconfigurability of these devices to demonstrate three prominent QKD protocols--BB84, Coherent One Way and Differential Phase Shift--with performance comparable to state-of-the-art. These devices, when combined with integrated single photon detectors, pave the way for successfully integrating QKD into future telecommunications networks.

  2. Chip-based quantum key distribution

    PubMed Central

    Sibson, P.; Erven, C.; Godfrey, M.; Miki, S.; Yamashita, T.; Fujiwara, M.; Sasaki, M.; Terai, H.; Tanner, M. G.; Natarajan, C. M.; Hadfield, R. H.; O'Brien, J. L.; Thompson, M. G.

    2017-01-01

    Improvement in secure transmission of information is an urgent need for governments, corporations and individuals. Quantum key distribution (QKD) promises security based on the laws of physics and has rapidly grown from proof-of-concept to robust demonstrations and deployment of commercial systems. Despite these advances, QKD has not been widely adopted, and large-scale deployment will likely require chip-based devices for improved performance, miniaturization and enhanced functionality. Here we report low error rate, GHz clocked QKD operation of an indium phosphide transmitter chip and a silicon oxynitride receiver chip—monolithically integrated devices using components and manufacturing processes from the telecommunications industry. We use the reconfigurability of these devices to demonstrate three prominent QKD protocols—BB84, Coherent One Way and Differential Phase Shift—with performance comparable to state-of-the-art. These devices, when combined with integrated single photon detectors, pave the way for successfully integrating QKD into future telecommunications networks. PMID:28181489

  3. TeleMed: Wide-area, secure, collaborative object computing with Java and CORBA for healthcare

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forslund, D.W.; George, J.E.; Gavrilov, E.M.

    1998-12-31

    Distributed computing is becoming commonplace in a variety of industries, with healthcare being a particularly important one for society. The authors describe the development and deployment of TeleMed in a few healthcare domains. TeleMed is a 100% Java distributed application built on CORBA and OMG standards, enabling collaboration on the treatment of chronically ill patients in a secure manner over the Internet. These standards enable other systems to interoperate with TeleMed and provide transparent access to high-performance distributed computing for the healthcare domain. The goal of wide-scale integration of electronic medical records is a grand-challenge-scale problem of global proportions with far-reaching social benefits.

  4. Near Earth Asteroid Scout Solar Sail Engineering Development Unit Test Suite

    NASA Technical Reports Server (NTRS)

    Lockett, Tiffany Russell; Few, Alexander; Wilson, Richard

    2017-01-01

    The Near Earth Asteroid (NEA) Scout project is a 6U reconnaissance mission to investigate a near-Earth asteroid utilizing an 86 m2 solar sail as the primary propulsion system. This will be the largest solar sail NASA has launched to date. NEA Scout is currently manifested on the maiden voyage of the Space Launch System in 2018. In development of the solar sail subsystem, design challenges were identified and investigated for packaging within a 6U form factor and deployment in cis-lunar space. Analysis captured the thermal, stress, and dynamic behavior of the stowed system and matured an integrated sail membrane model for deployed flight dynamics. Full-scale system testing on the ground is the optimal way to demonstrate system robustness, repeatability, and overall performance on a compressed flight schedule. To physically test the system, the team developed a flight-sized engineering development unit with design features as close to flight as possible. The test suite included ascent vent, random vibration, functional deployments, thermal vacuum, and full sail deployments. All of these tests contributed toward development of the final flight unit. This paper addresses several of the design challenges and lessons learned from the NEA Scout solar sail subsystem engineering development unit, from testing at the component level all the way to the integrated subsystem level. From the optical properties of the sail material to folding and spooling the single sail, the team has developed a robust deployment system for the solar sail. The team completed several deployments of the sail system in preparation for flight at half scale (4 m) and full scale (6.8 m): boom only, half-scale sail deployment, and full-scale sail deployment. This paper also addresses expected and received test results from ascent vent, random vibration, and deployment tests.

  5. Characterization of Vegetation using the UC Davis Remote Sensing Testbed

    NASA Astrophysics Data System (ADS)

    Falk, M.; Hart, Q. J.; Bowen, K. S.; Ustin, S. L.

    2006-12-01

    Remote sensing provides information about the dynamics of the terrestrial biosphere with continuous spatial and temporal coverage on many different scales. We present the design and construction of a suite of instrument modules and network infrastructure with size, weight and power constraints suitable for small-scale vehicles, anticipating vigorous growth in unmanned aerial vehicles (UAVs) and other mobile platforms. Our approach provides rapid deployment and low-cost acquisition of aerial imagery for applications requiring high spatial resolution and frequent revisits. The testbed supports a wide range of applications, encourages remote sensing solutions in new disciplines, and demonstrates the complete range of engineering knowledge required for the successful deployment of remote sensing instruments. The initial testbed is deployed on a Sig Kadet Senior remote-controlled plane. It includes an onboard computer with wireless radio, GPS, inertial measurement unit, 3-axis electronic compass and digital cameras. The onboard camera is either an RGB digital camera or a modified digital camera with red and NIR channels. Cameras were calibrated using selective light sources, an integrating sphere and a spectrometer, allowing for the computation of vegetation indices such as the NDVI. Field tests to date have investigated technical challenges in wireless communication bandwidth limits, automated image geolocation, and user interfaces, as well as image applications such as environmental landscape mapping focusing on Sudden Oak Death and invasive species detection, studies on the impact of bird colonies on tree canopies, and precision agriculture.
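
    With calibrated red and NIR channels, the NDVI computation the testbed enables is straightforward: NDVI = (NIR - Red) / (NIR + Red). A minimal sketch, using toy reflectance values rather than data from the testbed:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index from calibrated reflectance.
    Values lie in [-1, 1]; dense green vegetation reflects strongly in
    the NIR while absorbing red, pushing NDVI toward +1."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)   # eps avoids division by zero

# Toy 2x2 scene: top row vegetation (high NIR), bottom row bare soil
# (red and NIR nearly equal).
nir = np.array([[0.50, 0.45], [0.20, 0.22]])
red = np.array([[0.08, 0.10], [0.18, 0.20]])
print(ndvi(nir, red).round(2))
```

    The vegetation pixels come out near +0.7 while bare soil sits near zero, which is the contrast such index maps exploit for invasive species detection and precision agriculture.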

  6. Cloud Computing for Protein-Ligand Binding Site Comparison

    PubMed Central

    2013-01-01

    The proteome-wide analysis of protein-ligand binding sites and their interactions with ligands is important in structure-based drug design and in understanding ligand cross reactivity and toxicity. The well-known and commonly used software, SMAP, has been designed for 3D ligand binding site comparison and similarity searching of a structural proteome. SMAP can also predict drug side effects and reassign existing drugs to new indications. However, the computing scale of SMAP is limited. We have developed a high availability, high performance system that expands the comparison scale of SMAP. This cloud computing service, called Cloud-PLBS, combines the SMAP and Hadoop frameworks and is deployed on a virtual cloud computing platform. To handle the vast amount of experimental data on protein-ligand binding site pairs, Cloud-PLBS exploits the MapReduce paradigm as a management and parallelizing tool. Cloud-PLBS provides a web portal and scalability through which biologists can address a wide range of computer-intensive questions in biology and drug discovery. PMID:23762824

  7. Cloud computing for protein-ligand binding site comparison.

    PubMed

    Hung, Che-Lun; Hua, Guan-Jie

    2013-01-01

    The proteome-wide analysis of protein-ligand binding sites and their interactions with ligands is important in structure-based drug design and in understanding ligand cross reactivity and toxicity. The well-known and commonly used software, SMAP, has been designed for 3D ligand binding site comparison and similarity searching of a structural proteome. SMAP can also predict drug side effects and reassign existing drugs to new indications. However, the computing scale of SMAP is limited. We have developed a high availability, high performance system that expands the comparison scale of SMAP. This cloud computing service, called Cloud-PLBS, combines the SMAP and Hadoop frameworks and is deployed on a virtual cloud computing platform. To handle the vast amount of experimental data on protein-ligand binding site pairs, Cloud-PLBS exploits the MapReduce paradigm as a management and parallelizing tool. Cloud-PLBS provides a web portal and scalability through which biologists can address a wide range of computer-intensive questions in biology and drug discovery.
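
    The MapReduce decomposition used by Cloud-PLBS can be illustrated without Hadoop: binding-site pair comparisons are independent, so a map step scores each pair and a reduce step aggregates the results. The sketch below uses a hypothetical Jaccard overlap of residue sets as a stand-in for SMAP's actual 3D comparison; the site names and fingerprints are invented:

```python
from functools import reduce
from itertools import combinations

# Hypothetical binding-site fingerprints (stand-ins for SMAP's 3D data).
sites = {"siteA": {"ZN", "HIS", "GLU"},
         "siteB": {"ZN", "HIS", "ASP"},
         "siteC": {"MG", "LYS"}}

def mapper(pair):
    """Map step: emit one (pair, similarity) record per site pair.
    Jaccard overlap here is only a placeholder scoring function."""
    a, b = pair
    sa, sb = sites[a], sites[b]
    return ((a, b), len(sa & sb) / len(sa | sb))

def reducer(acc, record):
    """Reduce step: keep the highest-scoring pair seen so far."""
    return record if record[1] > acc[1] else acc

pairs = list(combinations(sites, 2))        # the "input splits"
mapped = map(mapper, pairs)                 # Hadoop runs these in parallel
best = reduce(reducer, mapped, (None, -1.0))
print(best)                                 # most similar site pair
```

    Because each map call touches only its own pair, the workload scales out across cluster nodes, which is the property Cloud-PLBS exploits for proteome-wide comparison.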

  8. ENGINEERING DEVELOPMENT UNIT SOLAR SAIL

    NASA Image and Video Library

    2016-01-13

    TIFFANY LOCKETT OVERSEES THE HALF SCALE (36 SQUARE METERS) ENGINEERING DEVELOPMENT UNIT (EDU) SOLAR SAIL DEPLOYMENT DEMONSTRATION IN PREPARATION FOR FULL SCALE EDU (86 SQUARE METERS) DEPLOYMENT IN APRIL, 2016. DETAILS OF RIPS AND HOLES IN SOLAR SAIL FABRIC.

  9. Fast Open-World Person Re-Identification.

    PubMed

    Zhu, Xiatian; Wu, Botong; Huang, Dongcheng; Zheng, Wei-Shi

    2018-05-01

    Existing person re-identification (re-id) methods typically assume that: 1) any probe person is guaranteed to appear in the gallery target population during deployment (i.e., closed-world) and 2) the probe set contains only a limited number of people (i.e., small search scale). Both assumptions are artificial and are violated in real-world applications, since the probe population in target people search can be extremely vast in practice due to the ambiguity of the probe search space boundary. It is therefore unrealistic to assume that every probe person is a target person, and large-scale search over person images is inherently demanded. In this paper, we introduce a new person re-id search setting, called large-scale open-world (LSOW) re-id, characterized by a huge probe image set and an open person population, and thus closer to practical deployments. Under LSOW, the under-studied problem of person re-id efficiency is essential in addition to the commonly studied re-id accuracy. We therefore develop a novel fast person re-id method, called Cross-view Identity Correlation and vErification (X-ICE) hashing, for joint learning of cross-view identity representation binarisation and discrimination in a unified manner. Extensive comparative experiments on three large-scale benchmarks validate the superiority and advantages of the proposed X-ICE method over a wide range of state-of-the-art hashing models, person re-id methods, and their combinations.
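
    The efficiency argument for hashing-based re-id such as X-ICE can be made concrete: once images are encoded as short binary codes, matching reduces to Hamming distance, i.e., one XOR and a popcount per comparison. A minimal sketch with random codes (not the learned X-ICE codes):

```python
import random

def hamming(a, b):
    """Hamming distance between equal-length binary codes, computed as
    the popcount of the XOR (a single cheap operation per machine word)."""
    return bin(a ^ b).count("1")

random.seed(0)
BITS = 64
# Hashed gallery: 100k random 64-bit codes standing in for image hashes.
gallery = [random.getrandbits(BITS) for _ in range(100_000)]
probe = gallery[42] ^ 0b1011   # probe = near-duplicate of gallery entry 42

# A linear scan over 100k codes stays fast because each comparison is one
# XOR + popcount; learned hashing places the same identity at small
# Hamming distance, so the nearest code is the right match.
best = min(range(len(gallery)), key=lambda i: hamming(probe, gallery[i]))
print(best, hamming(probe, gallery[best]))
```

    Scanning raw floating-point descriptors at this scale would cost a dot product per comparison; the binary-code scan above is what makes open-world search over vast probe populations tractable.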

  10. Ultralightweight Space Deployable Primary Reflector Demonstrator

    NASA Technical Reports Server (NTRS)

    Montgomery, Edward E., IV; Zeiders, Glenn W.; Smith, W. Scott (Technical Monitor)

    2002-01-01

    A concept has been developed and analyzed, and several generational prototypes built, for a gossamer-class deployable truss carrying many smaller, precisely figured solid mirror elements. Such an architecture will, for at least the next several decades, minimize the mass of a large primary mirror assembly while still providing the high image quality essential for planet-finding and cosmological astronomical missions. Primary mirror segments are mounted on ultralightweight thermally formed plastic panels that hold clusters of mirror segments in rigid arrays, whose tip/tilt and piston would be corrected over the scale of the plastic panels by the control segments. Prototype panels developed under this program are 45 cm wide and fabricated from commercially available Kapton sheets. A three-strut octahedral tensegrity is the basis for the overall support structure. Each fundamental module is composed of two such octahedrons, rotated oppositely about a common triangular face. Adjacent modules are joined at the nodes of the upper and lower triangles to form a deployable structure that could be made arbitrarily large. A seven-module dowel-and-wire prototype has been constructed. A deployment technique based on collapsing toggled struts with diagonal tensional elements allows an assembly of tensegrities to be fully collapsed and redeployed. The prototype designs will be described and results of a test program for measuring strength and deformation will be presented.

  11. Women at war: implications for mental health.

    PubMed

    Dutra, Lissa; Grubbs, Kathleen; Greene, Carolyn; Trego, Lori L; McCartin, Tamarin L; Kloezeman, Karen; Morland, Leslie

    2011-01-01

    Few studies have investigated the impact of deployment stressors on the mental health outcomes of women deployed to Iraq in support of Operation Iraqi Freedom. This pilot study examined exposure to combat experiences and military sexual harassment in a sample of 54 active duty women and assessed the impact of these stressors on post-deployment posttraumatic stress disorder (PTSD) symptoms and depressive symptoms. Within 3 months of returning from deployment to Iraq, participants completed (a) the Combat Experiences Scale and the Sexual Harassment Scale of the Deployment Risk and Resilience Inventory, (b) the Primary Care PTSD Screen, and (c) an abbreviated version of the Center for Epidemiological Studies-Depression scale. Approximately three quarters of the sample endorsed exposure to combat experiences, and more than half of the sample reported experiencing deployment-related sexual harassment, with nearly half of the sample endorsing both stressors. Approximately one third of the sample endorsed clinical or subclinical levels of PTSD symptoms, with 11% screening positive for PTSD and 9% to 14% of the sample endorsing depressive symptoms. Regression analyses revealed that combat experiences and sexual harassment jointly accounted for significant variance in post-deployment PTSD symptoms, whereas military sexual harassment was identified as the only unique significant predictor of these symptoms. Findings from the present study lend support to research demonstrating that military sexual trauma may be more highly associated with post-deployment PTSD symptoms than combat exposure among female service members and veterans.

  12. The “Wireless Sensor Networks for City-Wide Ambient Intelligence (WISE-WAI)” Project

    PubMed Central

    Casari, Paolo; Castellani, Angelo P.; Cenedese, Angelo; Lora, Claudio; Rossi, Michele; Schenato, Luca; Zorzi, Michele

    2009-01-01

    This paper gives a detailed technical overview of some of the activities carried out in the context of the “Wireless Sensor networks for city-Wide Ambient Intelligence (WISE-WAI)” project, funded by the Cassa di Risparmio di Padova e Rovigo Foundation, Italy. The main aim of the project is to demonstrate the feasibility of large-scale wireless sensor network deployments, whereby tiny objects integrating one or more environmental sensors (humidity, temperature, light intensity), a microcontroller and a wireless transceiver are deployed over a large area, which in this case involves the buildings of the Department of Information Engineering at the University of Padova. We will describe how the network is organized to provide full-scale automated functions, and which services and applications it is configured to provide. These applications include long-term environmental monitoring, alarm event detection and propagation, single-sensor interrogation, localization and tracking of objects, assisted navigation, as well as fast data dissemination services to be used, e.g., to rapidly re-program all sensors over-the-air. The organization of such a large testbed requires notable efforts in terms of communication protocols and strategies, whose design must pursue scalability, energy efficiency (while sensors are connected through USB cables for logging and debugging purposes, most of them will be battery-operated), as well as the capability to support applications with diverse requirements. These efforts, a description of a subset of the results obtained so far, and the final objectives to be met form the scope of the present paper. PMID:22408513

  13. The VolturnUS 1:8 Floating Wind Turbine: Design, Construction, Deployment, Testing, Retrieval, and Inspection of the First Grid-Connected Offshore Wind Turbine in US

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dagher, Habib; Viselli, Anthony; Goupee, Andrew

    Volume II of the Final Report for the DeepCwind Consortium National Research Program, funded by US Department of Energy Award Number DE-EE0003278.001, summarizes the design, construction, deployment, testing, numerical model validation, retrieval, and post-deployment inspection of the VolturnUS 1:8-scale floating wind turbine prototype deployed off Castine, Maine on June 2, 2013. The 1:8-scale VolturnUS design served as a de-risking exercise for a commercial multi-MW VolturnUS design. The American Bureau of Shipping Guide for Building and Classing Floating Offshore Wind Turbine Installations was used to design the prototype. The same analysis methods, design methods, construction techniques, deployment methods, mooring, and anchoring planned for full scale were used. A commercial 20 kW grid-connected turbine was used and was the first offshore wind turbine in the US.

  14. Screening for Anger and Sleep Difficulties.

    PubMed

    Steele, Nicole M; Fogarty, Gerard J

    2017-03-01

    Mental health screens are designed to detect individuals at risk of psychological disorders. In the military setting of this study, these disorders were post-traumatic stress disorder (PTSD) and alcohol use. This study extends the literature on deployment-related mental health screening by including measures of sleep difficulties and anger as predictors of postdeployment PTSD and alcohol abuse. Evidence that measures of anger and sleep difficulties contribute incremental validity to the prediction of postdeployment mental health problems, including substance abuse, would be helpful in designing interventions to assist the rehabilitation of returning personnel. A test battery containing the PTSD Checklist-Civilian (PCL-C) to screen for PTSD, the Kessler 10 to screen for psychological distress, a Sleep Difficulties scale, an exposure to trauma scale, and an anger scale was administered to 212 personnel nearing completion of a deployment to the Middle East. A second battery containing the PCL-C, the Kessler 10, and a measure of alcohol consumption (Alcohol Use Disorders Identification Test [AUDIT]) was administered to the same personnel 3 to 6 months after return to Australia. Hierarchical regression analyses assessed the predictive validity of measures of psychological distress (anxiety and depression), PTSD symptomatology, sleep disturbance, and anger in relation to postdeployment measures of PTSD symptomatology and alcohol use. Time 1 measures predicted 24.4% of the variance in postdeployment PCL-C scores and 13.1% of the variance in AUDIT scores, with the Sleep Difficulties scale contributing to the prediction of the PCL-C score and the anger scale helping to predict AUDIT scores. On the basis of these findings, we recommend the inclusion of improved measures of both anger and sleep difficulties in end-of-deployment mental health screens. 
A less behaviorally specific and more wide-ranging anger scale is recommended for future studies that aim to evaluate the role of anger in screening batteries. Our findings suggest that the Sleep Difficulties scale used in this study would be a worthwhile addition to mental health screening because it is moderately correlated with both Time 1 and Time 2 measures of PTSD symptomatology and psychological distress. Furthermore, there is minimal stigma associated with the experience of sleep difficulties. Reprint & Copyright © 2017 Association of Military Surgeons of the U.S.
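
    The hierarchical regression logic described above, entering baseline predictors first and then testing whether additional scales add explained variance, can be sketched with synthetic data. The variables and coefficients below are illustrative stand-ins, not the study's data or estimates:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 212                                  # sample size matching the study

# Synthetic stand-ins for Time 1 predictors and the Time 2 outcome.
distress = rng.normal(size=n)            # psychological distress
sleep = rng.normal(size=n)               # sleep difficulties
anger = rng.normal(size=n)               # anger scale
pcl_t2 = 0.4 * distress + 0.3 * sleep + 0.2 * anger + rng.normal(size=n)

def r_squared(predictors, y):
    """OLS R^2 with an intercept column prepended to the design matrix."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_base = r_squared([distress], pcl_t2)                 # step 1: distress only
r2_full = r_squared([distress, sleep, anger], pcl_t2)   # step 2: add sleep, anger
print(round(r2_base, 3), round(r2_full, 3), round(r2_full - r2_base, 3))
```

    The change in R^2 between steps (here printed last) is the incremental validity the study attributes to the sleep and anger measures; in practice its significance would be tested with an F-test rather than inspected directly.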

  15. Effects of a military cargo pod and tail fins on the aerodynamic characteristics of a large wide-body transport model

    NASA Technical Reports Server (NTRS)

    Jernell, L. S.; Croom, D. R.

    1979-01-01

    Wind tunnel tests were conducted on a 0.03-scale model of a large wide-body commercial aircraft to determine the effects on the static aerodynamic characteristics resulting from the attachment of a belly pod for the long-range deployment of outsize military equipment. The effectiveness of horizontal tip fins in augmenting directional stability was investigated. At a test Reynolds number of 1.08 × 10^6, the addition of the pod results in an increase in total drag of approximately 20 percent. Trim drag due to the pod is very small. Although the pod produces a significant decrease in directional stability, the addition of the tip fins restores some of the stability, particularly at the lower angles of attack.

  16. Optimizing and Validating a Brief Assessment for Identifying Children of Service Members at Risk for Psychological Health Problems Following Parent Deployment

    DTIC Science & Technology

    2015-07-01

    Measures administered prior to, during, and following deployment: Dyadic Adjustment Scale (marital functioning), Conflict-Tactics Scale, Family Adaptability and…

  17. A method exploiting direct communication between phasor measurement units for power system wide-area protection and control algorithms.

    PubMed

    Almas, Muhammad Shoaib; Vanfretti, Luigi

    2017-01-01

    Synchrophasor measurements from Phasor Measurement Units (PMUs) are the primary sensor data used to deploy Wide-Area Monitoring, Protection and Control (WAMPAC) systems. PMUs stream out synchrophasor measurements through the IEEE C37.118.2 protocol using TCP/IP or UDP/IP. The proposed method establishes direct communication between two PMUs, thus eliminating the requirement for an intermediate phasor data concentrator, data mediator and/or protocol parser, and thereby ensuring minimum communication latency, not counting communication link delays. This method allows synchrophasor measurements to be utilized internally in a PMU to deploy custom protection and control algorithms. These algorithms are deployed using protection logic equations, which are supported by all PMU vendors. Moreover, this method reduces overall equipment cost, as the algorithms execute internally in a PMU and therefore do not require any additional controller for their deployment. The proposed method can be utilized for fast prototyping of wide-area measurement-based protection and control applications. It is tested by coupling commercial PMUs as Hardware-in-the-Loop (HIL) with Opal-RT's eMEGAsim Real-Time Simulator (RTS). As an illustrative example, an anti-islanding protection application is deployed using the proposed method and its performance is assessed. The essential points of the method are:
    • Intermediate phasor data concentrators and protocol parsers are bypassed, as synchrophasors are communicated directly between the PMUs (minimizing communication delays).
    • The wide-area protection and control algorithm is deployed using logic equations in the client PMU, eliminating the requirement for an external hardware controller (cost curtailment).
    • PMU measurements can be exploited effortlessly in an environment familiar to protection engineers.
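
    The kind of protection logic equation the method embeds in the client PMU can be illustrated with a simple over/under-frequency anti-islanding check on a stream of synchrophasor frequency estimates. The thresholds and debounce count below are typical textbook values, not those used in the paper:

```python
# Illustrative anti-islanding check on PMU frequency estimates.
F_NOM = 50.0        # Hz, nominal system frequency
F_DEV = 0.5         # Hz, allowed deviation before a violation is counted
TRIP_COUNT = 3      # consecutive violations required to trip (debounce)

def islanding_trip(freq_stream):
    """Return the sample index at which the relay would trip, else None."""
    consecutive = 0
    for i, f in enumerate(freq_stream):
        consecutive = consecutive + 1 if abs(f - F_NOM) > F_DEV else 0
        if consecutive >= TRIP_COUNT:
            return i
    return None

healthy = [50.01, 49.98, 50.02, 50.00]
islanded = [50.0, 50.1, 50.7, 50.9, 51.2, 51.4]  # frequency drifts after islanding
print(islanding_trip(healthy))    # stays connected
print(islanding_trip(islanded))   # trips after 3 consecutive violations
```

    In the paper's setup this logic would live inside the client PMU as vendor logic equations operating on the directly received synchrophasors, with no external controller in the loop.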

  18. Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling (Final Report)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    William J. Schroeder

    2011-11-13

    This report contains the comprehensive summary of the work performed on the SBIR Phase II, Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling, at Kitware Inc. in collaboration with the Stanford Linear Accelerator Center (SLAC). The goal of the work was to develop collaborative visualization tools for large-scale data as illustrated in the figure below. The solutions we proposed address the typical problems faced by geographically- and organizationally-separated research and engineering teams, who produce large data (either through simulation or experimental measurement) and wish to work together to analyze and understand their data. Because the data is large, we expect that it cannot be easily transported to each team member's work site, and that the visualization server must reside near the data. Further, we also expect that each work site has heterogeneous resources: some with large computing clients, tiled (or large) displays and high bandwidth; other sites as simple as a team member on a laptop computer. Our solution is based on the open-source, widely used ParaView large-data visualization application. We extended this tool to support multiple collaborative clients who may locally visualize data, and then periodically rejoin and synchronize with the group to discuss their findings. Options for managing session control, adding annotation, and defining the visualization pipeline, among others, were incorporated. We also developed and deployed a Web visualization framework based on ParaView that enables the Web browser to act as a participating client in a collaborative session. The ParaView Web Visualization framework leverages various Web technologies including WebGL, JavaScript, Java and Flash to enable interactive 3D visualization over the web using ParaView as the visualization server. We steered the development of this technology by teaming with the SLAC National Accelerator Laboratory.
    SLAC has a computationally-intensive problem important to the nation's scientific progress, as described shortly. Further, SLAC researchers routinely generate massive amounts of data, and frequently collaborate with other researchers located around the world. Thus SLAC is an ideal teammate through which to develop, test and deploy this technology. The nature of the datasets generated by simulations performed at SLAC presented unique visualization challenges, especially when dealing with higher-order elements, that were addressed during this Phase II. During this Phase II, we have developed a strong platform for collaborative visualization based on ParaView. We have developed and deployed a ParaView Web Visualization framework that can be used for effective collaboration over the Web. Collaborating and visualizing over the Web presents the community with unique opportunities for sharing and accessing visualization and HPC resources that were hitherto either inaccessible or difficult to use. The technology we developed here will alleviate both these issues as it becomes widely deployed and adopted.

  19. Sensing Human Activity: GPS Tracking

    PubMed Central

    van der Spek, Stefan; van Schaick, Jeroen; de Bois, Peter; de Haan, Remco

    2009-01-01

    The enhancement of GPS technology enables the use of GPS devices not only as navigation and orientation tools, but also as instruments used to capture travelled routes: as sensors that measure activity on a city or regional scale. TU Delft developed a process and database architecture for collecting data on pedestrian movement in three European city centres, Norwich, Rouen and Koblenz, and in another experiment for collecting activity data of 13 families in Almere (The Netherlands) for one week. The question posed in this paper is: what is the value of GPS as ‘sensor technology’ for measuring the activities of people? The conclusion is that GPS offers a widely usable instrument to collect invaluable spatial-temporal data on different scales and in different settings, adding new layers of knowledge to urban studies, but the use of GPS technology and the deployment of GPS devices still offer significant challenges for future research. PMID:22574061

  20. Driver air bag effectiveness by severity of the crash.

    PubMed Central

    Segui-Gomez, M

    2000-01-01

    OBJECTIVES: This analysis provided effectiveness estimates of the driver-side air bag while controlling for severity of the crash and other potential confounders. METHODS: Data were from the National Automotive Sampling System (1993-1996). Injury severity was described on the basis of the Abbreviated Injury Scale, Injury Severity Score, Functional Capacity Index, and survival. Ordinal, linear, and logistic multivariate regression methods were used. RESULTS: Air bag deployment in frontal or near-frontal crashes decreases the probability of having severe and fatal injuries (e.g., Abbreviated Injury Scale score of 4-6), including those causing a long-lasting high degree of functional limitation. However, air bag deployment in low-severity crashes increases the probability that a driver (particularly a woman) will sustain injuries of Abbreviated Injury Scale level 1 to 3. Air bag deployment exerts a net injurious effect in low-severity crashes and a net protective effect in high-severity crashes. The level of crash severity at which air bags are protective is higher for female than for male drivers. CONCLUSIONS: Air bag improvement should minimize the injuries induced by their deployment. One possibility is to raise their deployment level so that they deploy only in more severe crashes. PMID:11029991

  1. Intra-urban spatial variability of surface ozone in Riverside, CA: viability and validation of low-cost sensors

    NASA Astrophysics Data System (ADS)

    Sadighi, Kira; Coffey, Evan; Polidori, Andrea; Feenstra, Brandon; Lv, Qin; Henze, Daven K.; Hannigan, Michael

    2018-03-01

    Sensor networks are being more widely used to characterize and understand compounds in the atmosphere like ozone (O3). This study employs a measurement tool, called the U-Pod, constructed at the University of Colorado Boulder, to investigate spatial and temporal variability of O3 in a 200 km2 area of Riverside County near Los Angeles, California. This tool contains low-cost sensors to collect ambient data at non-permanent locations. The U-Pods were calibrated using a pre-deployment field calibration technique; all the U-Pods were collocated with regulatory monitors. After collocation, the U-Pods were deployed in the area mentioned. A subset of pods was deployed at two local regulatory air quality monitoring stations providing validation for the collocation calibration method. Field validation of sensor O3 measurements to minute-resolution reference observations resulted in R2 and root mean squared errors (RMSEs) of 0.95-0.97 and 4.4-5.9 ppbv, respectively. Using the deployment data, ozone concentrations were observed to vary on this small spatial scale. In the analysis based on hourly binned data, the median R2 values between all possible U-Pod pairs varied from 0.52 to 0.86 for ozone during the deployment. The medians of absolute differences were calculated between all possible pod pairs, 21 pairs total. The median values of those median absolute differences for each hour of the day varied between 2.2 and 9.3 ppbv for the ozone deployment. Since median differences between U-Pod concentrations during deployment are larger than the respective root mean square error values, we can conclude that there is spatial variability in this criteria pollutant across the study area. This is important because it means that citizens may be exposed to more, or less, ozone than they would assume based on current regulatory monitoring.
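
    The two field-validation metrics reported above, R² and RMSE against a collocated reference monitor, can be computed directly from paired time series. The sketch below uses synthetic collocation data (not U-Pod measurements) and takes R² as the squared Pearson correlation:

```python
import numpy as np

def validation_stats(sensor, reference):
    """R^2 (squared Pearson correlation) and RMSE of sensor vs. reference,
    the two field-validation metrics reported for the ozone sensors."""
    sensor = np.asarray(sensor, dtype=float)
    reference = np.asarray(reference, dtype=float)
    r = np.corrcoef(sensor, reference)[0, 1]
    rmse = np.sqrt(np.mean((sensor - reference) ** 2))
    return r ** 2, rmse

# Synthetic collocation run: reference ozone (ppbv) plus ~5 ppbv sensor noise.
rng = np.random.default_rng(0)
ref = rng.uniform(10, 90, size=500)
sen = ref + rng.normal(0, 5, size=500)
r2, rmse = validation_stats(sen, ref)
print(round(r2, 2), round(rmse, 1))
```

    The study's spatial-variability argument follows the same comparison: when median absolute differences between pod pairs exceed the collocation RMSE, the differences reflect real gradients rather than sensor error.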

  2. 2014 U.S. Offshore Wind Market Report: Industry Trends, Technology Advancement, and Cost Reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Aaron; Stehly, Tyler; Walter Musial

    2015 has been an exciting year for the U.S. offshore wind market. After more than 15 years of development work, the U.S. has finally hit a crucial milestone; Deepwater Wind began construction on the 30 MW Block Island Wind Farm (BIWF) in April. A number of other promising projects, however, have run into economic, legal, and political headwinds, generating much speculation about the future of the industry. This slow, and somewhat painful, start to the industry is not without precedent; each country in northern Europe began with pilot-scale, proof-of-concept projects before eventually moving to larger commercial-scale installations. Now, after more than a decade of commercial experience, the European industry is set to achieve a new deployment record, with more than 4 GW expected to be commissioned in 2015, and with demonstrable progress towards industry-wide cost reduction goals. Deepwater Wind is leveraging 25 years of European deployment experience; the BIWF combines state-of-the-art technologies such as the Alstom 6 MW turbine with U.S. fabrication and installation competencies. The successful deployment of the BIWF will provide a concrete showcase that illustrates the potential of offshore wind to contribute to state, regional, and federal goals for clean, reliable power and lasting economic development. It is expected that this initial project will launch the U.S. industry into a phase of commercial development that will position offshore wind to contribute significantly to the electric systems in coastal states by 2030.

  3. Distributed Storage Healthcare — The Basis of a Planet-Wide Public Health Care Network

    PubMed Central

    Kakouros, Nikolaos

    2013-01-01

    Background: As health providers move towards higher levels of information technology (IT) integration, they become increasingly dependent on the availability of the electronic health record (EHR). Current solutions of individually managed storage by each healthcare provider focus on efforts to ensure data security, availability and redundancy. Such models, however, scale poorly to a future of a planet-wide public health-care network (PWPHN). Our aim was to review the research literature on distributed storage systems and propose methods that may aid the implementation of a PWPHN. Methods: A systematic review was carried out of the research dealing with distributed storage systems and EHR. A literature search was conducted on five electronic databases: PubMed/MEDLINE, CINAHL, EMBASE, Web of Science (ISI) and Google Scholar, and then expanded to include non-authoritative sources. Results: The English National Health Service Spine represents the most established country-wide PHN but is limited in deployment and remains underused. Other established distributed EHR efforts identified in the literature are more limited in scope. We discuss the currently available distributed file storage solutions and propose a schema for how one of these technologies can be used to deploy distributed storage of EHRs, with benefits in terms of enhanced fault tolerance and global availability within the PWPHN. We conclude that a PWPHN distributed health care record storage system is technically feasible over current Internet infrastructure. Nonetheless, the socioeconomic viability of PWPHN implementations remains to be determined. PMID:23459171
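One concrete (hypothetical) mechanism for the redundancy and global availability discussed above is consistent hashing: each record is assigned, by content hash, to k distinct storage nodes on a hash ring. The node names and replica count below are illustrative, not part of the paper's proposal:

```python
import hashlib
from bisect import bisect_right

def _h(s: str) -> int:
    """Stable integer hash of a string (SHA-256)."""
    return int(hashlib.sha256(s.encode()).hexdigest(), 16)

class HashRing:
    """Minimal consistent-hash ring for replica placement."""

    def __init__(self, nodes, vnodes=64):
        # Each physical node gets several virtual points on the ring
        # so load stays roughly even as nodes join or leave.
        self.ring = sorted((_h(f"{n}#{i}"), n)
                           for n in nodes for i in range(vnodes))
        self.keys = [k for k, _ in self.ring]

    def replicas(self, record_id: str, k: int = 3):
        """Return the k distinct nodes responsible for this record."""
        idx = bisect_right(self.keys, _h(record_id)) % len(self.ring)
        chosen = []
        while len(chosen) < k:
            node = self.ring[idx % len(self.ring)][1]
            if node not in chosen:
                chosen.append(node)
            idx += 1
        return chosen
```

Because placement is a pure function of the record identifier, any participant can locate a record's replicas without a central directory, which is what makes a planet-wide store plausible over ordinary Internet infrastructure.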

  4. Augmented Reality 2.0

    NASA Astrophysics Data System (ADS)

    Schmalstieg, Dieter; Langlotz, Tobias; Billinghurst, Mark

    Augmented Reality (AR) was first demonstrated in the 1960s, but only recently have technologies emerged that can be used to easily deploy AR applications to many users. Camera-equipped cell phones with significant processing power and graphics abilities provide an inexpensive and versatile platform for AR applications, while the social networking technology of Web 2.0 provides a large-scale infrastructure for collaboratively producing and distributing geo-referenced AR content. This combination of widely used mobile hardware and Web 2.0 software allows the development of a new type of AR platform that can be used on a global scale. In this paper we describe the Augmented Reality 2.0 concept and present existing work on mobile AR and web technologies that could be used to create AR 2.0 applications.

  5. Scaling up a CMS tier-3 site with campus resources and a 100 Gb/s network connection: what could go wrong?

    NASA Astrophysics Data System (ADS)

    Wolf, Matthias; Woodard, Anna; Li, Wenzhao; Hurtado Anampa, Kenyi; Tovar, Benjamin; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2017-10-01

    The University of Notre Dame (ND) CMS group operates a modest-sized Tier-3 site suitable for local, final-stage analysis of CMS data. However, through the ND Center for Research Computing (CRC), Notre Dame researchers have opportunistic access to roughly 25k CPU cores of computing and a 100 Gb/s WAN network link. To understand the limits of what might be possible in this scenario, we undertook to use these resources for a wide range of CMS computing tasks, from user analysis through large-scale Monte Carlo production (including both detector simulation and data reconstruction). We will discuss the challenges inherent in effectively utilizing CRC resources for these tasks and the solutions deployed to overcome them.

  6. Software augmented buildings: Exploiting existing infrastructure to improve energy efficiency and comfort in commercial buildings

    NASA Astrophysics Data System (ADS)

    Balaji, Bharathan

    Commercial buildings consume 19% of energy in the US as of 2010, and traditionally, their energy use has been optimized through improved equipment efficiency and retrofits. Beyond improved hardware and infrastructure, there exists tremendous potential to reduce energy use through better monitoring and operation. We present several applications that we developed and deployed to support our thesis that building energy use can be reduced through sensing, monitoring and optimization software that modulates use of building subsystems, including HVAC. We focus on HVAC systems as these constitute 48-55% of building energy use. Specifically, in the case of sensing, we describe an energy apportionment system that estimates real-time zonal HVAC power consumption by analyzing existing sensor information. With this energy breakdown, we can measure the effectiveness of optimization solutions and identify inefficiencies. Central to energy efficiency improvement is determining human occupancy in buildings, but this information is often unavailable or expensive to obtain through wide-scale sensor deployment. We present a system that infers room-level occupancy inexpensively by leveraging existing WiFi infrastructure. Occupancy information can be used not only to directly control HVAC but also to infer the state of the building for predictive control. Building energy use is strongly influenced by human behavior, and timely feedback mechanisms can encourage energy-saving behavior. Occupants interact with HVAC through thermostats, which have been shown to be inadequate for thermal comfort. Building managers are responsible for incorporating energy efficiency measures, but our interviews reveal that they struggle to maintain efficiency due to a lack of analytical tools and contextual information. We present software services that provide energy feedback to occupants and building managers, improve comfort with personalized control, and identify energy-wasting faults.
For wide-scale deployment, such energy-saving software needs to be portable across multiple buildings. However, buildings consist of heterogeneous equipment and use inconsistent naming schemas, and developers need extensive domain knowledge to map sensor information to a standard format. To enable portability, we present an active learning algorithm that automates mapping of building sensor metadata to a standard naming schema.
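A minimal sketch of such an active-learning metadata mapper follows, using uncertainty (margin) sampling over a nearest-centroid classifier on character trigrams. The point names, labels, and feature choice are illustrative, not the dissertation's actual algorithm:

```python
from collections import Counter, defaultdict
import math

def trigrams(name):
    """Character-trigram feature counts for a raw sensor point name."""
    s = f"^{name.lower()}$"
    return Counter(s[i:i + 3] for i in range(len(s) - 2))

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b.get(k, 0) for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class ActiveMapper:
    """Maps raw point names to a standard schema, querying an expert
    for labels only on the names it is least confident about."""

    def __init__(self):
        self.centroids = defaultdict(Counter)  # label -> summed trigrams

    def teach(self, name, label):
        """Incorporate one expert-provided (name, standard label) example."""
        self.centroids[label].update(trigrams(name))

    def predict(self, name):
        """Return (best_label, margin); a small margin means uncertain."""
        scores = sorted(((cosine(trigrams(name), c), lab)
                         for lab, c in self.centroids.items()), reverse=True)
        margin = scores[0][0] - (scores[1][0] if len(scores) > 1 else 0.0)
        return scores[0][1], margin

    def query(self, pool, budget=1):
        """Uncertainty sampling: pick the names to label next."""
        return sorted(pool, key=lambda n: self.predict(n)[1])[:budget]
```

The key design point is the `query` step: rather than labeling every point in every building, the expert labels only the names the classifier finds ambiguous, which is what makes portability across buildings affordable.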

  7. Large Scale eHealth Deployment in Europe: Insights from Concurrent Use of Standards.

    PubMed

    Eichelberg, Marco; Chronaki, Catherine

    2016-01-01

    Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem that includes both legacy systems and new systems reflecting technological trends and progress. No single standard covers all the needs of an eHealth project, and there is a multitude of overlapping and sometimes competing standards that can be employed to define document formats, terminology, and communication protocols, mirroring alternative technical approaches and schools of thought. eHealth projects need to answer an important question: how can alternative or inconsistently implemented standards and specifications be used to ensure practical interoperability and long-term sustainability in large-scale eHealth deployment? In the eStandards project, 19 European case studies reporting from R&D and large-scale eHealth deployment and policy projects were analyzed. Although this study is not exhaustive, by reflecting on the concepts, standards, and tools for concurrent use and on the successes, failures, and lessons learned, this paper offers practical insights into how eHealth deployment projects can make the most of the available eHealth standards and tools, and how standards and profile developing organizations can serve users while embracing sustainability and technical innovation.

  8. Solar Geoengineering and the Modulation of North Atlantic Tropical Cyclone Frequency

    NASA Astrophysics Data System (ADS)

    Jones, A. C.; Haywood, J. M.; Hawcroft, M.; Jones, A.; Dunstone, N. J.; Hodges, K.

    2017-12-01

    Solar geoengineering (SG) refers to a wide range of proposed methods for counteracting global warming by artificially reducing solar insolation at Earth's surface. The most widely known SG proposal is stratospheric aerosol injection (SAI) which has impacts analogous to those from large-scale volcanic eruptions. Observations following major volcanic eruptions indicate that aerosol enhancements confined to a single hemisphere effectively modulate North Atlantic tropical cyclone (TC) activity in the following years. Here we investigate the effects of both single-hemisphere and global SAI scenarios on North Atlantic TC activity using the HadGEM2-ES general circulation model (GCM). We show that a 5 Tg y-1 injection of sulphur dioxide (SO2) into the northern hemisphere (NH) stratosphere would produce a global-mean cooling of 1 K and simultaneously reduce TC activity (to 8 TCs y-1), while the same injection in the southern hemisphere (SH) would enhance TC activity (to 14 TCs y-1), relative to a recent historical period (1950-2000, 10 TCs y-1). Our results reemphasize the risks of regional geoengineering and should motivate policymakers to regulate large-scale unilateral geoengineering deployments.

  9. Implementation of a piezoelectric energy harvester in railway health monitoring

    NASA Astrophysics Data System (ADS)

    Li, Jingcheng; Jang, Shinae; Tang, Jiong

    2014-03-01

    With the development of wireless sensor technology, wireless sensor networks have shown great potential for railway health monitoring. However, supplying continuous power to the wireless sensor nodes is one of the critical issues in long-term full-scale deployment of wireless smart sensors. Several energy harvesting methods are available, including solar, vibration, and wind; among them, vibration-based energy harvesting using piezoelectric material has shown potential for converting ambient vibration energy to electric energy in railway health monitoring, even for underground subway systems. A piezoelectric energy harvester, however, has two major limitations: it generates only a small amount of energy, and its narrow-band natural frequency must closely match the excitation frequency. To overcome these problems, a wide-band piezoelectric energy harvester, which can generate more power over a broader frequency range, was designed and validated with experimental tests. It was then applied in a full-scale field test using an actual railway train. The power generation of the wide-band piezoelectric array was compared to that of a narrow-band, resonance-based piezoelectric energy harvester.
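The wide-band idea can be illustrated numerically: an array of resonators with staggered natural frequencies keeps output high over a much broader excitation band than a single tuned cantilever. The frequencies and quality factor below are illustrative, not the paper's design values:

```python
import numpy as np

def resonator_power(freq, f_n, q=50.0):
    """Normalized steady-state power of a base-excited resonator
    (a Lorentzian-type idealization of one piezo cantilever)."""
    r = freq / f_n
    return 1.0 / ((1 - r ** 2) ** 2 + (r / q) ** 2)

freqs = np.linspace(5, 15, 2001)           # excitation band, Hz (illustrative)
single = resonator_power(freqs, 10.0)      # narrow-band harvester tuned to 10 Hz
array = sum(resonator_power(freqs, fn)     # staggered array spans the band
            for fn in [8.0, 9.0, 10.0, 11.0, 12.0])

# Fraction of the band where output exceeds 10% of the single unit's peak:
thresh = 0.1 * single.max()
bw_single = (single > thresh).mean()
bw_array = (array > thresh).mean()
```

The staggered array trades peak power density for usable bandwidth, which is the relevant trade-off when train-induced excitation frequencies vary from site to site.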

  10. Farm Deployable Microbial Bioreactor for Fuel Ethanol Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okeke, Benedict

    Research was conducted to develop a farm- and field-deployable microbial bioreactor for bioethanol production from biomass. Experiments were conducted to select the most efficient microorganisms for conversion of plant fiber to sugars for fermentation to ethanol. Mixtures of biomass and surface soil samples were collected from selected sites in Alabama black belt counties (Macon, Sumter, Choctaw, Dallas, Montgomery, Lowndes) and other areas within the state of Alabama. Experiments were conducted to determine the effects of culture parameters on key biomass-saccharifying enzymes (cellulase, beta-glucosidase, xylanase and beta-xylosidase). A wide-scale sampling of locally grown fruits in Central Alabama was undertaken to isolate potential xylose-fermenting microorganisms. Yeast isolates were evaluated for xylose fermentation. Selected microorganisms were characterized by DNA-based methods. Factors affecting enzyme production and biomass saccharification were examined and optimized in the laboratory. Methods of biomass pretreatment were compared. Co-production of amylolytic enzymes with cellulolytic-xylanolytic enzymes was evaluated, and co-saccharification of a combination of biomass and starch-rich materials was examined. Simultaneous saccharification and fermentation, with and without pre-saccharification, was studied. Simultaneous saccharification and fermentation using whole culture broth and filtered culture broth were compared. A bioreactor system was designed and constructed to apply the laboratory results in scaled-up biomass saccharification.

  11. Deployable Soft Composite Structures.

    PubMed

    Wang, Wei; Rodrigue, Hugo; Ahn, Sung-Hoon

    2016-02-19

    Deployable structures composed of actuators based on smart materials can reconcile the inherently conflicting requirements of low mass, good shape adaptability, and high load-bearing capability. This work describes the fabrication of deployable structures using smart soft composite actuators that combine a soft matrix with variable stiffness properties and hinge-like movement through a rigid skeleton. The hinge actuator has the advantage of being simple to fabricate, inexpensive, lightweight and simple to actuate. This basic actuator can be used to form modules capable of different types of deformation, which can then be assembled into deployable structures. The design of deployable structures is based on three principles: design of basic hinge actuators, assembly of modules, and assembly of modules into large-scale deployable structures. Various deployable structures, such as a segmented triangular mast, a planar structure comprised of single-loop hexagonal modules, and a ring structure comprised of single-loop quadrilateral modules, were designed and fabricated to verify this approach. Finally, a prototype for a deployable mirror was developed by attaching a foldable reflective membrane to the designed ring structure, and its functionality was tested by using it to reflect sunlight onto a small-scale solar panel.

  12. Deployable Soft Composite Structures

    PubMed Central

    Wang, Wei; Rodrigue, Hugo; Ahn, Sung-Hoon

    2016-01-01

    Deployable structures composed of actuators based on smart materials can reconcile the inherently conflicting requirements of low mass, good shape adaptability, and high load-bearing capability. This work describes the fabrication of deployable structures using smart soft composite actuators that combine a soft matrix with variable stiffness properties and hinge-like movement through a rigid skeleton. The hinge actuator has the advantage of being simple to fabricate, inexpensive, lightweight and simple to actuate. This basic actuator can be used to form modules capable of different types of deformation, which can then be assembled into deployable structures. The design of deployable structures is based on three principles: design of basic hinge actuators, assembly of modules, and assembly of modules into large-scale deployable structures. Various deployable structures, such as a segmented triangular mast, a planar structure comprised of single-loop hexagonal modules, and a ring structure comprised of single-loop quadrilateral modules, were designed and fabricated to verify this approach. Finally, a prototype for a deployable mirror was developed by attaching a foldable reflective membrane to the designed ring structure, and its functionality was tested by using it to reflect sunlight onto a small-scale solar panel. PMID:26892762

  13. The Impact of Combat Deployment on Health Care Provider Burnout in a Military Emergency Department: A Cross-Sectional Professional Quality of Life Scale V Survey Study.

    PubMed

    Cragun, Joshua N; April, Michael D; Thaxton, Robert E

    2016-08-01

    Compassion fatigue is a problem for many health care providers manifesting as physical, mental, and spiritual exhaustion. Our objective was to evaluate the association between prior combat deployment and compassion fatigue among military emergency medicine providers. We conducted a nonexperimental cross-sectional survey of health care providers assigned to the San Antonio Military Medical Center, Department of Emergency Medicine. We used the Professional Quality of Life Scale V survey instrument that evaluates provider burnout, secondary traumatic stress, and compassion satisfaction. Outcomes included burnout, secondary traumatic stress, and compassion satisfaction raw scores. Scores were compared between providers based on previous combat deployments using two-tailed independent sample t tests and multiple regression models. Surveys were completed by 105 respondents: 42 nurses (20 previously deployed), 30 technicians (11 previously deployed), and 33 physicians (16 previously deployed). No statistically significant differences in burnout, secondary traumatic stress, or compassion satisfaction scores were detected between previously deployed providers versus providers not previously deployed. There was no association between previous combat deployment and emergency department provider burnout, secondary traumatic stress, or compassion satisfaction scores. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.
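The score comparison described above rests on the two-tailed independent-samples t test. A minimal sketch of the pooled-variance statistic follows; the sample values in the usage note are illustrative, not the study's survey scores:

```python
import math

def independent_t(a, b):
    """Two-tailed independent-samples t statistic with pooled variance,
    of the kind used to compare deployed vs. non-deployed provider scores.

    Returns (t statistic, degrees of freedom)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2
```

When group means are identical, t is exactly zero; the study's null finding corresponds to t values whose associated p exceeds the chosen significance level.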

  14. Risks of exposure to ionizing and millimeter-wave radiation from airport whole-body scanners.

    PubMed

    Moulder, John E

    2012-06-01

    Considerable public concern has been expressed around the world about the radiation risks posed by the backscatter (ionizing radiation) and millimeter-wave (nonionizing radiation) whole-body scanners that have been deployed at many airports. The backscatter and millimeter-wave scanners currently deployed in the U.S. almost certainly pose negligible radiation risks if used as intended, but their safety is difficult-to-impossible to prove using publicly accessible data. The scanners are widely disliked and often feared, which is a problem made worse by what appears to be a veil of secrecy that covers their specifications and dosimetry. Therefore, for these and future similar technologies to gain wide acceptance, more openness is needed, as is independent review and regulation. Publicly accessible, and preferably peer-reviewed evidence is needed that the deployed units (not just the prototypes) meet widely-accepted safety standards. It is also critical that risk-perception issues be handled more competently.

  15. Risk of adverse health outcomes associated with frequency and duration of deployment with the Australian Defence Force.

    PubMed

    Bleier, Jonathan; McFarlane, Alexander; McGuire, Annabel; Treloar, Susan; Waller, Michael; Dobson, Annette

    2011-02-01

    The operational tempo of the Australian Defence Force has increased over the last two decades. We examine the relationship between the health of personnel and the frequency and duration of their deployment. Self-reported health measures (number of symptoms, Kessler Psychological Distress Scale, and Post Traumatic Stress Disorder Checklist) were compared between people who had never deployed, those who had deployed only once, and those who had deployed at least twice with at least one deployment to East Timor and one deployment to Afghanistan or Iraq. Comparisons were also made between people who had deployed for up to one month and those who had deployed for longer periods. Frequency of deployment, but not duration of deployment, was associated with poorer health.

  16. Pilot-scale data provide enhanced estimates of the life cycle energy and emissions profile of algae biofuels produced via hydrothermal liquefaction.

    PubMed

    Liu, Xiaowei; Saydah, Benjamin; Eranki, Pragnya; Colosi, Lisa M; Greg Mitchell, B; Rhodes, James; Clarens, Andres F

    2013-11-01

    Life cycle assessment (LCA) has been used widely to estimate the environmental implications of deploying algae-to-energy systems, even though no full-scale facilities have yet been built. Here, data from a pilot-scale facility using hydrothermal liquefaction (HTL) are used to estimate the life cycle profiles at full scale. Three scenarios (lab-, pilot-, and full-scale) were defined to understand how development in the industry could impact its life cycle burdens. HTL-derived algae fuels were found to have lower greenhouse gas (GHG) emissions than petroleum fuels. Algae-derived gasoline had significantly lower GHG emissions than corn ethanol. Most algae-based fuels have an energy return on investment between 1 and 3, which is lower than that of petroleum fuels. Sensitivity analyses reveal several areas in which improvements by algae bioenergy companies (e.g., biocrude yields, nutrient recycling) and by supporting industries (e.g., CO2 supply chains) could reduce the burdens of the industry. Copyright © 2013 Elsevier Ltd. All rights reserved.
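The headline LCA metrics reduce to simple ratios over the life cycle inventory. A minimal sketch, with made-up stage values rather than the paper's inventory:

```python
def eroi(energy_out_mj, energy_in_mj):
    """Energy return on investment: energy delivered per unit invested."""
    return energy_out_mj / energy_in_mj

def ghg_intensity(stage_emissions_g, energy_out_mj):
    """Life cycle GHG intensity (g CO2e per MJ of fuel): the sum of
    stage emissions divided by delivered fuel energy."""
    return sum(stage_emissions_g.values()) / energy_out_mj

# Illustrative (made-up) stage inventory for an HTL pathway:
stages = {"cultivation": 900.0, "HTL conversion": 600.0,
          "upgrading": 400.0, "distribution": 100.0}
intensity = ghg_intensity(stages, 50.0)   # g CO2e/MJ for 50 MJ delivered
ratio = eroi(50.0, 30.0)                  # falls in the 1-3 range noted above
```

Sensitivity analysis in this framing amounts to perturbing one stage entry (e.g. lowering cultivation emissions via nutrient recycling) and recomputing the ratios.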

  17. Design and Analysis of a Formation Flying System for the Cross-Scale Mission Concept

    NASA Technical Reports Server (NTRS)

    Cornara, Stefania; Bastante, Juan C.; Jubineau, Franck

    2007-01-01

    The ESA-funded "Cross-Scale" Technology Reference Study has been carried out with the primary aim of identifying and analysing a mission concept for the investigation of fundamental space plasma processes that involve dynamical non-linear coupling across multiple length scales. To fulfill this scientific mission goal, a constellation of spacecraft is required, flying in loose formations around the Earth and simultaneously sampling three characteristic plasma scale distances, with at least two satellites per scale: electron kinetic (10 km), ion kinetic (100-2000 km), and magnetospheric fluid (3000-15000 km). The key Cross-Scale mission drivers identified are the number of spacecraft, the space segment configuration, the reference orbit design, the transfer and deployment strategy, the inter-satellite localization and synchronization process, and the mission operations. This paper presents a comprehensive overview of the mission design and analysis for the Cross-Scale concept and outlines a technically feasible mission architecture for a multi-dimensional investigation of space plasma phenomena. The main effort has been devoted to applying a thorough mission-level trade-off approach and to accomplishing an exhaustive analysis, so as to allow the characterization of a wide range of mission requirements and design solutions.

  18. Mechanism Design and Testing of a Self-Deploying Structure Using Flexible Composite Tape Springs

    NASA Technical Reports Server (NTRS)

    Footdale, Joseph N.; Murphey, Thomas W.

    2014-01-01

    The detailed mechanical design of a novel deployable support structure that positions and tensions a membrane optic for space imaging applications is presented. This is a complex three-dimensional deployment using freely deploying rollable composite tape spring booms that become load-bearing structural members at full deployment. The deployment tests successfully demonstrate a new architecture based on rolled and freely deployed composite tape spring members that achieve simultaneous deployment without mechanical synchronization. Proper design of the flexible component mounting interface and constraint systems, which were critical in achieving a functioning unit, is described. These flexible composite components have much potential for advancing the state of the art in deployable structures, but have yet to be widely adopted. This paper demonstrates the feasibility and advantages of implementing flexible composite components, including design details on how to integrate them with required traditional mechanisms.

  19. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
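A static check in the spirit of Hairball might walk a Scratch-like script tree looking for scripts that lack a triggering "hat" block and so can never run. The representation and block names below are illustrative, not Hairball's actual API:

```python
# Scripts are modeled as nested lists of block names; a script with no
# hat block (e.g. "whenGreenFlag") is dead code a student may not notice.

HAT = "whenGreenFlag"

def flatten(blocks):
    """Yield every block name in a (possibly nested) script."""
    for b in blocks:
        if isinstance(b, list):
            yield from flatten(b)
        else:
            yield b

def dead_scripts(project):
    """Return indices of scripts with no hat block (never triggered)."""
    return [i for i, script in enumerate(project)
            if HAT not in flatten(script)]
```

Checks of this kind can be rerun automatically as curricula change, which is what makes static analysis attractive for grading large numbers of block-based student programs.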

  20. Deployment and retraction of a cable-driven solar array: Testing and simulation

    NASA Technical Reports Server (NTRS)

    Kumar, P.; Pellegrino, S.

    1995-01-01

    The paper investigates three critical areas in cable-driven rigid-panel solar arrays: first, the variation of deployment and retraction cable tensions due to friction at the hinges; second, the change in deployment dynamics associated with different deployment histories; third, the relationship between the level of pre-tension in the closed contact loops and the synchronization of deployment. A small-scale model array has been made and tested, and its behavior has been compared to numerical simulations.

  1. Clean vehicles as an enabler for a clean electricity grid

    NASA Astrophysics Data System (ADS)

    Coignard, Jonathan; Saxena, Samveg; Greenblatt, Jeffery; Wang, Dai

    2018-05-01

    California has issued ambitious targets to decarbonize transportation through the deployment of electric vehicles (EVs), and to decarbonize the electricity grid through the expansion of both renewable generation and energy storage. These parallel efforts offer an untapped synergistic opportunity for clean transportation to be an enabler for a clean electricity grid. To quantify this potential, we forecast the hourly system-wide balancing problems arising out to 2025 as more renewables are deployed and load continues to grow. We then quantify the system-wide balancing benefits of EVs modulating the charging or discharging of their batteries to mitigate renewable intermittency, without compromising the mobility needs of drivers. Our results show that with its EV deployment target and only one-way control of EV charging, California can achieve much of the benefit of its Storage Mandate for mitigating renewable intermittency, at a small fraction of the cost. Moreover, EVs provide many times these benefits if two-way charging control becomes widely available. Thus, EVs support the state's renewable integration targets while avoiding much of the tremendous capital investment in stationary storage, which can instead be applied towards further deployment of clean vehicles.
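One-way charging control of the kind evaluated above can be sketched as a greedy scheduler that shifts aggregate charging into the hours of lowest net load (i.e., the hours with the most surplus renewables). All quantities below are illustrative, not the study's data:

```python
def schedule_charging(net_load, energy_needed, max_rate):
    """Greedy one-way ("V1G") charging schedule.

    net_load: hourly system load minus renewable generation (MW)
    energy_needed: total EV energy to deliver over the horizon (MWh)
    max_rate: aggregate charger limit per hour (MW)

    Fills the lowest-net-load hours first, so EV charging soaks up
    renewable surplus instead of adding to peak demand.
    """
    charge = [0.0] * len(net_load)
    for h in sorted(range(len(net_load)), key=lambda i: net_load[i]):
        if energy_needed <= 0:
            break
        charge[h] = min(max_rate, energy_needed)
        energy_needed -= charge[h]
    return charge
```

Two-way ("V2G") control would additionally discharge into the highest-net-load hours; the one-way case already flattens the charging side of the net-load curve, which is where much of the Storage Mandate benefit comes from.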

  2. Military women's attitudes toward menstruation and menstrual suppression in relation to the deployed environment: development and testing of the MWATMS-9 (short form).

    PubMed

    Trego, Lori L; Jordan, Patricia J

    2010-01-01

    To determine military women's attitudes toward menstruation and menstrual suppression with oral contraceptives in the deployed environment. A cross-sectional descriptive design with administration of the Menstrual Attitude Questionnaire (MAQ) and the 55-item Military Women's Attitudes Towards Menstrual Suppression Scale (MWATMS) to a convenience sample (n = 278) of women in the U.S. Army with deployment experience. The mean scores of the MAQ's five subscales ranged from 3.4 (+/-1.11) to 5.1 (+/-1.06), indicating neutral to moderate attitudes toward menstruation. Measurement development on the MWATMS produced a nine-item scale with three components: stress effects, benefits to self, and environmental barriers. Menstrual attitudes were generally neutral in this sample; however, military women favor menstrual suppression during deployment owing to the effects of stress during deployment, the benefits that suppression would provide, and the barriers to menstrual hygiene in the deployed environment. Women who perceived menstruation as bothersome and debilitating had positive attitudes toward menstrual suppression. These findings can contribute to appropriate predeployment women's health care and improve readiness for deployment in female soldiers. Providers should educate women on the risks and benefits of menstrual suppression methods and provide guidance on the impact that the deployed environment can have on their menstrual experiences.

  3. Systems biology and livestock production.

    PubMed

    Headon, D

    2013-12-01

    The mapping of complete sets of genes, transcripts and proteins from many organisms has prompted the development of new '-omic' technologies for collecting and analysing very large amounts of data. Now that the tools to generate and interrogate such complete data sets are widely used, much of the focus of biological research has begun to turn towards understanding systems as a whole, rather than studying their components in isolation. This very broadly defined systems approach is being deployed across a range of problems and scales of organisation, including many aspects of the animal sciences. Here I review selected examples of this systems approach as applied to poultry and livestock production, product quality and welfare.

  4. Texas Department of Transportation, intelligent transportation systems (ITS) deployment strategy

    DOT National Transportation Integrated Search

    1996-05-01

    The purpose of this document is to present an initial TxDOT-wide framework for the deployment : of Intelligent Transportation Systems (ITS) technologies, techniques and practices in support of the : principal agency mission of moving people and goods...

  5. Austin area-wide IVHS plan and IH-35 corridor deployment plan

    DOT National Transportation Integrated Search

    1998-02-01

    This report describes efforts towards developing an intelligent transportation systems (ITS) deployment plan for the Austin metropolitan area in Texas. ITS issues, strategies, and technologies are discussed. The focus of the study is to support the f...

  6. Economic Impact of Large-Scale Deployment of Offshore Marine and Hydrokinetic Technology in Oregon Coastal Counties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jimenez, T.; Tegen, S.; Beiter, P.

    To begin understanding the potential economic impacts of large-scale wave energy converter (WEC) technology, the Bureau of Ocean Energy Management (BOEM) commissioned the National Renewable Energy Laboratory (NREL) to conduct an economic impact analysis of large-scale WEC deployment for Oregon coastal counties. This report follows a previously published report by BOEM and NREL on the jobs and economic impacts of WEC technology for the entire state (Jimenez and Tegen 2015). As in Jimenez and Tegen (2015), this analysis examined two deployment scenarios in the 2026-2045 timeframe: the first scenario assumed 13,000 megawatts (MW) of WEC technology deployed during the analysis period, and the second assumed 18,000 MW of WEC technology deployed by 2045. Both scenarios require major technology and cost improvements in the WEC devices. The study considers very large-scale deployment so that readers can examine and discuss the potential of a successful, very large WEC industry. The 13,000-MW scenario is used as the basis for the county analysis, as it is the smaller of the two. Sensitivity studies examined the effects of a robust in-state WEC supply chain. The region of analysis comprises the seven coastal counties in Oregon—Clatsop, Coos, Curry, Douglas, Lane, Lincoln, and Tillamook—so estimates of jobs and other economic impacts are specific to this coastal county area.
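Input-output impact studies of this kind combine project spending with regional multipliers. A back-of-envelope sketch with placeholder multipliers (not NREL's JEDI model or its values):

```python
def economic_impact(spend_musd, jobs_per_musd, earnings_ratio, local_share):
    """Back-of-envelope input-output style impact estimate.

    spend_musd: project spending ($M)
    jobs_per_musd: jobs supported per $M of local spending (placeholder)
    earnings_ratio: earnings per job ($M per job-year, placeholder)
    local_share: fraction of spending captured by the in-region supply chain
    """
    local_spend = spend_musd * local_share
    jobs = local_spend * jobs_per_musd
    earnings = jobs * earnings_ratio
    return {"jobs": jobs, "earnings_musd": earnings}
```

The `local_share` term is where the report's supply-chain sensitivity enters: a more robust in-state WEC supply chain raises the fraction of spending, and therefore jobs and earnings, retained in the coastal counties.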

  7. Full-Scale Crash Test of a MD-500 Helicopter with Deployable Energy Absorbers

    NASA Technical Reports Server (NTRS)

    Kellas, Sotiris; Jackson, Karen E.; Littell, Justin D.

    2010-01-01

    A new externally deployable energy absorbing system was demonstrated during a full-scale crash test of an MD-500 helicopter. The deployable system is a honeycomb structure and utilizes composite materials in its construction. A set of two Deployable Energy Absorbers (DEAs) were fitted on the MD-500 helicopter for the full-scale crash demonstration. Four anthropomorphic dummy occupants were also used to assess human survivability. A demonstration test was performed at NASA Langley's Landing and Impact Research Facility (LandIR). The test involved impacting the helicopter on a concrete surface with combined forward and vertical velocity components of 40-ft/s and 26-ft/s, respectively. The objectives of the test were to evaluate the performance of the DEA concept under realistic crash conditions and to generate test data for validation of dynamic finite element simulations. Descriptions of this test as well as other component and full-scale tests leading to the helicopter test are discussed. Acceleration data from the anthropomorphic dummies showed that dynamic loads were successfully attenuated to within non-injurious levels. Moreover, the airframe itself survived the relatively severe impact and was retested to provide baseline data for comparison for cases with and without DEAs.

  8. Build It: Will They Come?

    NASA Astrophysics Data System (ADS)

    Corrie, Brian; Zimmerman, Todd

    Scientific research is fundamentally collaborative in nature, and many of today's complex scientific problems require domain expertise in a wide range of disciplines. In order to create research groups that can effectively explore such problems, research collaborations are often formed that involve colleagues at many institutions, sometimes spanning a country and often spanning the world. An increasingly common manifestation of such a collaboration is the collaboratory (Bos et al., 2007), a “…center without walls in which the nation's researchers can perform research without regard to geographical location — interacting with colleagues, accessing instrumentation, sharing data and computational resources, and accessing information from digital libraries.” In order to bring groups together on such a scale, a wide range of components need to be available to researchers, including distributed computer systems, remote instrumentation, data storage, collaboration tools, and the financial and human resources to operate and run such a system (National Research Council, 1993). Media Spaces, as both a technology and a social facilitator, have the potential to meet many of these needs. In this chapter, we focus on the use of scientific media spaces (SMS) as a tool for supporting collaboration in scientific research. In particular, we discuss the design, deployment, and use of a set of SMS environments deployed by WestGrid and one of its collaborating organizations, the Centre for Interdisciplinary Research in the Mathematical and Computational Sciences (IRMACS) over a 5-year period.

  9. The United States Department Of Agriculture Northeast Area-wide Tick Control Project: history and protocol.

    PubMed

    Pound, Joe Mathews; Miller, John Allen; George, John E; Fish, Durland

    2009-08-01

The Northeast Area-wide Tick Control Project (NEATCP) was funded by the United States Department of Agriculture (USDA) as a large-scale cooperative demonstration project of the USDA-Agricultural Research Service (ARS)-patented 4-Poster tick control technology (Pound et al. 1994), involving the USDA-ARS and a consortium of universities, state agencies, and a consulting firm at research locations in the five states of Connecticut (CT), Maryland (MD), New Jersey (NJ), New York (NY), and Rhode Island (RI). The stated objective of the project was "A community-based field trial of ARS-patented tick control technology designed to reduce the risk of Lyme disease in northeastern states." Here we relate the rationale and history of the technology, a chronological listing of events leading to implementation of the project, the original protocol for selecting treatment and control sites, and protocols for deployment of treatments, sampling, assays, data analyses, and estimates of efficacy.

  10. Supporting Knowledge Transfer in IS Deployment Projects

    NASA Astrophysics Data System (ADS)

    Schönström, Mikael

Deploying new information systems is an expensive and complex task, and it seldom results in successful usage where the system adds strategic value to the firm (e.g. Sharma et al. 2003). It has been argued that innovation diffusion is a knowledge integration problem (Newell et al. 2000). Knowledge about business processes, deployment processes, information systems and technology is needed in a large-scale deployment of a corporate IS. Such deployments can therefore, to a large extent, be regarded as a knowledge management (KM) problem. An effective deployment requires that knowledge about the system be effectively transferred to the target organization (Ko et al. 2005).

  11. Policies to Support Wind Power Deployment: Key Considerations and Good Practices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Sadie; Tegen, Suzanne; Baring-Gould, Ian

    2015-05-19

Policies have played an important role in scaling up wind deployment and increasing its economic viability while also supporting country-specific economic, social, and environmental development goals. Although wind power has become cost-competitive in several contexts, challenges to wind power deployment remain. Within the context of country-specific goals and challenges, policymakers are seeking ...

  12. Escape and evade control policies for ensuring the physical security of nonholonomic, ground-based, unattended mobile sensor nodes

    NASA Astrophysics Data System (ADS)

    Mascarenas, David; Stull, Christopher; Farrar, Charles

    2011-06-01

In order to realize the wide-scale deployment of high-endurance, unattended mobile sensing technologies, it is vital to ensure the self-preservation of the sensing assets. Deployed mobile sensor nodes face a variety of physical security threats, including theft, vandalism and physical damage. Unattended mobile sensor nodes must be able to respond to these threats with control policies that facilitate escape and evasion to a low-risk state. In this work the Precision Immobilization Technique (PIT) problem is considered. The PIT maneuver is a technique a pursuing, car-like vehicle can use to force a fleeing vehicle to turn abruptly, ninety degrees from its direction of travel. The abrupt change in direction generally causes the fleeing driver to lose control and stop. The PIT maneuver was originally developed by law enforcement to end vehicular pursuits in a manner that minimizes damage to the persons and property involved. It is easy to imagine that unattended autonomous convoys could be targets of this type of action by adversarial agents. This effort focused on developing control policies that unattended mobile sensor nodes could employ to escape, evade and recover from PIT-maneuver-like attacks. The development of these control policies involved both simulation and small-scale experimental testing. The goal of this work is to be a step toward ensuring the physical security of unattended sensor node assets.

  13. Psychophysiological response to virtual reality and subthreshold posttraumatic stress disorder symptoms in recently deployed military.

    PubMed

    Costanzo, Michelle E; Leaman, Suzanne; Jovanovic, Tanja; Norrholm, Seth D; Rizzo, Albert A; Taylor, Patricia; Roy, Michael J

    2014-01-01

Subthreshold posttraumatic stress disorder (PTSD) has garnered recent attention because of the significant distress and functional impairment associated with the symptoms, as well as the increased risk of progression to full PTSD. However, the clinical presentation of subthreshold PTSD can vary widely and therefore is not clearly defined, nor is there an evidence-based treatment approach. Thus, we aim to further the understanding of subthreshold PTSD symptoms by reporting the use of a virtual combat environment in eliciting distinctive psychophysiological responses associated with PTSD symptoms in a sample of subthreshold, recently deployed US service members. Heart rate, skin conductance, electromyography (startle), respiratory rate, and blood pressure were monitored during three unique combat-related virtual reality scenarios as a novel procedure to assess subthreshold symptoms in a sample of 78 service members. The Clinician-Administered PTSD Scale was administered, and linear regression analyses were used to investigate the relationship between symptom clusters and physiological variables. Among the range of psychophysiological measures studied, regression analysis revealed heart rate as most strongly associated with Clinician-Administered PTSD Scale-based measures of hyperarousal (R = 0.11, p = .035), reexperiencing (R = 0.24, p = .001), and global PTSD symptoms (R = 0.17, p = .003). Our findings support the use of a virtual reality environment in eliciting physiological responses associated with subthreshold PTSD symptoms.
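The core statistic in this record is a single-predictor least-squares fit of a physiological variable against a symptom score. The sketch below shows that computation on synthetic data; the sample size matches the abstract, but all values and variable names are illustrative assumptions, not the study's dataset:

```python
# Single-predictor OLS regression (heart rate vs. a PTSD symptom score),
# computed on synthetic data purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 78  # sample size matching the abstract
heart_rate = rng.normal(75.0, 8.0, n)                    # mean HR during a VR scenario (bpm)
symptom_score = 0.4 * heart_rate + rng.normal(0.0, 6.0, n)  # synthetic CAPS-like score

# Ordinary least squares via lstsq: score = b0 + b1 * heart_rate
X = np.column_stack([np.ones(n), heart_rate])
b0, b1 = np.linalg.lstsq(X, symptom_score, rcond=None)[0]

# Coefficient of determination for the fit
pred = b0 + b1 * heart_rate
ss_res = np.sum((symptom_score - pred) ** 2)
ss_tot = np.sum((symptom_score - symptom_score.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(f"slope={b1:.3f}, R^2={r_squared:.3f}")
```

In the actual study, one such regression would be run per symptom cluster (hyperarousal, reexperiencing, global symptoms) against each physiological measure.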

  14. Utility-Scale Photovoltaic Deployment Scenarios of the Western United States: Implications for Solar Energy Zones in Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frew, Bethany; Mai, Trieu; Krishnan, Venkat

    2016-12-01

In this study, we use the National Renewable Energy Laboratory's (NREL's) Regional Energy Deployment System (ReEDS) capacity expansion model to estimate utility-scale photovoltaic (UPV) deployment trends from present day through 2030. The analysis seeks to inform the U.S. Bureau of Land Management's (BLM's) planning activities related to UPV development on federal lands in Nevada as part of the Resource Management Plan (RMP) revision for the Las Vegas and Pahrump field offices. These planning activities include assessing the demand for new or expanded Solar Energy Zones (SEZs), per the process outlined in BLM's Western Solar Plan.

  15. Deployment simulation of a deployable reflector for earth science application

    NASA Astrophysics Data System (ADS)

    Wang, Xiaokai; Fang, Houfei; Cai, Bei; Ma, Xiaofei

    2015-10-01

A novel mission concept, NEXRAD-In-Space (NIS), has been developed for monitoring hurricanes, cyclones and other severe storms from geostationary orbit. It requires a space-deployable, 35-meter diameter Ka-band (35 GHz) reflector. NIS can measure hurricane precipitation intensity, dynamics and life cycle; this information is necessary for predicting the track, intensity, rain rate and hurricane-induced floods. To meet the requirements of the radar system, a Membrane Shell Reflector Segment (MSRS) reflector technology has been developed and several technologies have been evaluated. However, the deployment behavior of this large, high-precision reflector had not been investigated. As a preliminary study, a scaled tetrahedral truss reflector with a spring-driven deployment system was built and tested, and a deployment dynamics analysis of this scaled reflector was performed in ADAMS to understand its deployment behavior. Eliminating the redundant constraints in a reflector system with a large number of moving parts is a challenging issue; a primitive joint and flexible struts were introduced into the analytical model to effectively eliminate the model's over-constraints. Using a high-speed camera and a force transducer, a deployment experiment on a single-bay tetrahedral module was conducted. With the test results, an optimization was performed using the parameter optimization module of ADAMS to obtain the parameters of the analytical model; these parameters were then incorporated into the analytical model of the whole reflector. The analysis results show that the deployment of the reflector with a fixed boundary proceeds in three stages: a rapid deployment stage, a slow deployment stage and an impact stage. Insight into the distribution of force peaks in the reflector can support optimization of the structural design.
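The staged behaviour reported for the spring-driven deployment (fast motion, slowing, then a terminal impact at the latch) can be illustrated with a one-degree-of-freedom toy model: a torsional spring drives a single strut toward its deployed angle against viscous damping, and a stiff end-stop represents the latch impact. All parameter values below are assumptions for illustration; this is not the ADAMS multibody reflector model.

```python
# One-DOF toy analogue of a spring-driven deployment with an end-stop,
# integrated with semi-implicit (symplectic) Euler. Parameters are assumed.
import numpy as np

I = 0.5                  # strut inertia (kg*m^2), assumed
k_spring = 2.0           # torsional drive-spring stiffness (N*m/rad), assumed
c_damp = 0.8             # viscous damping (N*m*s/rad), assumed
theta_stop = np.pi / 2   # fully deployed angle (rad)
k_stop = 500.0           # stiff end-stop modelling the latch impact (N*m/rad)

theta, omega, dt = 0.0, 0.0, 1e-3
history = []
for _ in range(10_000):  # simulate 10 s
    torque = k_spring * (theta_stop - theta) - c_damp * omega
    if theta > theta_stop:                    # end-stop engages past the latch angle
        torque -= k_stop * (theta - theta_stop)
    omega += dt * torque / I                  # update velocity first (symplectic Euler)
    theta += dt * omega
    history.append(theta)

print(f"final angle: {history[-1]:.3f} rad (target {theta_stop:.3f})")
```

Plotting `history` against time shows the same qualitative three stages the abstract describes: rapid initial rotation, deceleration as the spring torque drops, and a brief oscillatory impact phase at the end-stop before settling.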

  16. Impact of Financial Structure on the Cost of Solar Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendelsohn, M.; Kreycik, C.; Bird, L.

    2012-03-01

To stimulate investment in renewable energy generation projects, the federal government developed a series of support structures that reduce taxes for eligible investors--the investment tax credit, the production tax credit, and accelerated depreciation. The nature of these tax incentives often requires an outside investor and a complex financial arrangement to allocate risk and reward among the parties. These financial arrangements are generally categorized as 'advanced financial structures.' Among renewable energy technologies, advanced financial structures were first widely deployed by the wind industry and are now being explored by the solar industry to support significant scale-up in project development. This report describes four of the most prevalent financial structures used by the renewable sector and evaluates the impact of financial structure on energy costs for utility-scale solar projects that use photovoltaic and concentrating solar power technologies.

  17. Very Low Head Turbine Deployment in Canada

    NASA Astrophysics Data System (ADS)

    Kemp, P.; Williams, C.; Sasseville, Remi; Anderson, N.

    2014-03-01

    The Very Low Head (VLH) turbine is a recent turbine technology developed in Europe for low head sites in the 1.4 - 4.2 m range. The VLH turbine is primarily targeted for installation at existing hydraulic structures to provide a low impact, low cost, yet highly efficient solution. Over 35 VLH turbines have been successfully installed in Europe and the first VLH deployment for North America is underway at Wasdell Falls in Ontario, Canada. Deployment opportunities abound in Canada with an estimated 80,000 existing structures within North America for possible low-head hydro development. There are several new considerations and challenges for the deployment of the VLH turbine technology in Canada in adapting to the hydraulic, environmental, electrical and social requirements. Several studies were completed to determine suitable approaches and design modifications to mitigate risk and confirm turbine performance. Diverse types of existing weirs and spillways pose certain hydraulic design challenges. Physical and numerical modelling of the VLH deployment alternatives provided for performance optimization. For this application, studies characterizing the influence of upstream obstacles using water tunnel model testing as well as full-scale prototype flow dynamics testing were completed. A Cold Climate Adaptation Package (CCA) was developed to allow year-round turbine operation in ice covered rivers. The CCA package facilitates turbine extraction and accommodates ice forces, frazil ice, ad-freezing and cold temperatures that are not present at the European sites. The Permanent Magnet Generator (PMG) presents some unique challenges in meeting Canadian utility interconnection requirements. Specific attention to the frequency driver control and protection requirements resulted in a driver design with greater over-voltage capability for the PMG as well as other key attributes. 
Environmental studies in Europe included fish-friendliness testing comprising multiple in-river live passage tests for a wide variety of fish species. The latest test results indicate fish passage survivability close to 100%. Further fish studies are planned in Canada later this year. Successful deployment must meet societal requirements to gain community acceptance and public approval. Aesthetic considerations include low noise, disguised control buildings and careful integration of the turbine into the low-profile existing structures. The resulting design was selected for deployment at existing historic National Park waterway structures. The integration of all of these design elements permits the successful deployment of the VLH turbine in Canada.

  18. Doppler lidar characterization of the boundary layer for aircraft mass-balance estimates of greenhouse gas emissions

    NASA Astrophysics Data System (ADS)

    Hardesty, R.; Brewer, A.; Banta, R. M.; Senff, C. J.; Sandberg, S. P.; Alvarez, R. J.; Weickmann, A. M.; Sweeney, C.; Karion, A.; Petron, G.; Frost, G. J.; Trainer, M.

    2012-12-01

Aircraft-based mass balance approaches are often used to estimate greenhouse gas emissions from distributed sources such as urban areas and oil and gas fields. A scanning Doppler lidar, which measures range-resolved wind and aerosol backscatter information, can provide important information on mixing and transport processes in the planetary boundary layer for these studies. As part of the Uintah Basin Winter Ozone Study, we deployed a high-resolution Doppler lidar to characterize winds and turbulence, atmospheric mixing, and mixing layer depth in the oil and gas fields near Vernal, Utah. The lidar observations showed the evolution of the horizontal wind field, vertical mixing and aerosol structure for each day of the 5-week deployment. This information was used in conjunction with airborne in situ observations of methane and carbon dioxide to compute methane fluxes and estimate basin-wide methane emissions. A similar experiment incorporating a lidar along with a radar wind profiler and instrumented aircraft was subsequently carried out in the vicinity of the Denver-Julesburg Basin in Colorado. Using examples from these two studies, we discuss the use of Doppler lidar in conjunction with other sources of wind information and boundary layer structure for mass-balance studies. Plans for a one-year deployment of a Doppler lidar as part of the Indianapolis Flux Experiment to estimate urban-scale greenhouse gas emissions are also presented.
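The mass-balance method referred to here amounts to integrating the concentration enhancement above background, multiplied by the wind component perpendicular to the flight transect, over the downwind crosswind-vertical plane. The sketch below shows that bookkeeping on a synthetic "curtain" of measurements; the grid, wind, and plume values are illustrative assumptions, not data from these campaigns:

```python
# Minimal mass-balance emission estimate through a downwind flight "curtain":
# flux = sum over cells of (C - C_background) * perpendicular wind * cell area,
# converted to moles using the molar density of air. All numbers are synthetic.
import numpy as np

nx, nz = 20, 10                   # crosswind x vertical cells sampled by the aircraft
dx, dz = 1000.0, 100.0            # cell size (m)
c_bg = 1.90e-6                    # background CH4 mole fraction (mol/mol)
c = np.full((nz, nx), c_bg)
c[:4, 8:12] += 0.10e-6            # an enhancement plume within the mixed layer
u_perp = np.full((nz, nx), 5.0)   # wind component normal to the transect (m/s)

n_air = 41.6                      # approximate molar density of near-surface air (mol/m^3)
flux_mol_s = np.sum((c - c_bg) * u_perp * n_air) * dx * dz   # mol/s through the curtain
kg_per_mol_ch4 = 0.016
print(f"CH4 emission estimate: {flux_mol_s * kg_per_mol_ch4:.2f} kg/s")
```

The lidar's role in the real studies is to supply the pieces this toy grid takes for granted: the mixing-layer depth that bounds the vertical integration and the wind profile that sets `u_perp` in each cell.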

  19. Social Acceptance of Wind Energy: Managing and Evaluating Its Market Impacts (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baring-Gould, I.

    2012-06-01

    As with any industrial-scale technology, wind power has impacts. As wind technology deployment becomes more widespread, a defined opposition will form as a result of fear of change and competing energy technologies. As the easy-to-deploy sites are developed, the costs of developing at sites with deployment barriers will increase, therefore increasing the total cost of power. This presentation provides an overview of wind development stakeholders and related stakeholder engagement questions, Energy Department activities that provide wind project deployment information, and the quantification of deployment barriers and costs in the continental United States.

  20. Characterizing Micro- and Macro-Scale Seismicity from Bayou Corne, Louisiana

    NASA Astrophysics Data System (ADS)

    Baig, A. M.; Urbancic, T.; Karimi, S.

    2013-12-01

The initiation of felt seismicity in Bayou Corne, Louisiana, coupled with other phenomena detected by residents of the nearby housing development, prompted a call to install a broadband seismic network to monitor subsurface deformation. The initial deployment was in place to characterize the deformation contemporaneous with the formation of a sinkhole located in close proximity to a salt dome. Seismic events generated during this period followed swarm-like behaviour, with moment magnitudes culminating around Mw 2.5. However, the seismic data recorded during this sequence suffer from poor signal-to-noise ratios, onsets that are very difficult to pick, and a significant amount of energy arriving later in the waveforms. Efforts to understand the complexity of these waveforms are ongoing, and involve the complexities inherent in recording in a highly attenuating swamp overlying a complex three-dimensional structure with the strong material property contrast of the salt dome. In order to understand the event character, as well as to locally lower the completeness threshold of the sequence, a downhole array of 15 Hz sensors was deployed in a newly drilled well around the salt dome. Although the deployment lasted a little over a month, over 1000 events were detected, down to moment magnitude Mw -3. Waveform quality tended to be excellent, with very distinct P and S wave arrivals observable across the array for most events. The highest-magnitude events were also seen on the surface network, providing the opportunity to observe the complexities introduced by site effects while overcoming the saturation effects on the higher-frequency downhole geophones. 
This hybrid downhole and surface array illustrates how a full picture of subsurface deformation is only made possible by combining high-frequency downhole instrumentation, to see the microseismicity, with a broadband array, to accurately characterize the source parameters of the larger-magnitude events. Our presentation focuses on investigating this deformation, characterizing the scaling behaviour and other source processes by taking advantage of the wide bandwidth afforded by the deployment.

  1. Scaling up: Taking the Academic Pathways of People Learning Engineering Survey (APPLES) National. Research Brief

    ERIC Educational Resources Information Center

    Donaldson, Krista M.; Chen, Helen L.; Toye, George; Clark, Mia; Sheppard, Sheri D.

    2008-01-01

    The Academic Pathways of People Learning Engineering Survey (APPLES) was deployed for a second time in spring 2008 to undergraduate engineering students at 21 US universities. The goal of the second deployment of APPLES was to corroborate and extend findings from the Academic Pathways Study (APS; 2003-2007) and the first deployment of APPLES…

  2. Experimental study of a quantum random-number generator based on two independent lasers

    NASA Astrophysics Data System (ADS)

    Sun, Shi-Hai; Xu, Feihu

    2017-12-01

    A quantum random-number generator (QRNG) can produce true randomness by utilizing the inherent probabilistic nature of quantum mechanics. Recently, the spontaneous-emission quantum phase noise of the laser has been widely deployed for quantum random-number generation, due to its high rate, its low cost, and the feasibility of chip-scale integration. Here, we perform a comprehensive experimental study of a phase-noise-based QRNG with two independent lasers, each of which operates in either continuous-wave (CW) or pulsed mode. We implement the QRNG by operating the two lasers in three configurations, namely, CW + CW, CW + pulsed, and pulsed + pulsed, and demonstrate their trade-offs, strengths, and weaknesses.
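The randomness source described here is the unpredictable phase difference between independent lasers, read out as an interference intensity. A heavily simplified simulation of that principle is sketched below: a uniformly random phase difference produces a beat intensity, which is binarized at its median to yield raw bits. This is a toy model only; the authors' measurement chain, sampling, and randomness extractor are not reproduced.

```python
# Toy phase-noise QRNG model: interference intensity of two sources with a
# uniformly random phase difference, binarized at the median into raw bits.
import numpy as np

rng = np.random.default_rng(1)   # classical PRNG standing in for quantum phase noise
n = 100_000
dphi = rng.uniform(0.0, 2.0 * np.pi, n)   # phase difference between the two lasers
intensity = 1.0 + np.cos(dphi)            # idealized beat-note intensity samples

raw_bits = (intensity > np.median(intensity)).astype(np.uint8)
bias = raw_bits.mean()                    # crude health check: should be near 1/2
print(f"raw-bit bias: {bias:.4f}")
```

In a real QRNG the raw stream would additionally pass through min-entropy estimation and a seeded randomness extractor before use; the median-threshold step here only illustrates the digitization stage.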

  3. cFE/CFS (Core Flight Executive/Core Flight System)

    NASA Technical Reports Server (NTRS)

    Wildermann, Charles P.

    2008-01-01

This viewgraph presentation describes in detail the requirements and goals of the Core Flight Executive (cFE) and the Core Flight System (CFS). The Core Flight Software System is a mission-independent, platform-independent flight software (FSW) environment integrating a reusable core flight executive (cFE). The CFS goals include: 1) Reduce time to deploy high-quality flight software; 2) Reduce project schedule and cost uncertainty; 3) Directly facilitate formalized software reuse; 4) Enable collaboration across organizations; 5) Simplify sustaining engineering (i.e., FSW maintenance); 6) Scale from small instruments to systems of systems; 7) Provide a platform for advanced concepts and prototyping; and 8) Establish common standards and tools across the branch and NASA-wide.

  4. Riometer based Neural Network Prediction of Kp

    NASA Astrophysics Data System (ADS)

    Arnason, K. M.; Spanswick, E.; Chaddock, D.; Tabrizi, A. F.; Behjat, L.

    2017-12-01

The Canadian Geospace Observatory Riometer Array is a network of 11 wide-beam riometers deployed across central and northern Canada. The geographic coverage of the network affords a near-continent-scale view of high-energy (>30 keV) electron precipitation at a very coarse spatial resolution. In this paper we present the first results from a neural-network-based analysis of riometer data. Trained on decades of riometer data, the neural network is tuned to predict a simple index of global geomagnetic activity (Kp) based solely on the information provided by high-energy electron precipitation over Canada. We present results from various training configurations and discuss the applicability of this technique for short-term prediction of geomagnetic activity.
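The mapping described, from 11 station readings to a single activity index, is a small regression problem, and the sketch below shows the shape of such a model: a one-hidden-layer feed-forward network trained by gradient descent. The data are synthetic and the architecture, features, and training setup are assumptions for illustration; the authors' actual network is not reproduced.

```python
# Tiny feed-forward regression network: 11 synthetic "riometer absorption"
# inputs -> one Kp-like output, trained with plain batch gradient descent.
import numpy as np

rng = np.random.default_rng(2)
n_stations, n_samples = 11, 2000
X = rng.uniform(0.0, 3.0, (n_samples, n_stations))              # absorption (dB), synthetic
kp = np.clip(2.0 * X.mean(axis=1) + rng.normal(0, 0.2, n_samples), 0, 9)  # synthetic target

# One hidden layer (8 units, tanh), linear output
W1 = rng.normal(0, 0.1, (n_stations, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.1, (8, 1));          b2 = np.zeros(1)
lr = 0.01
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)                   # hidden activations
    pred = (h @ W2 + b2).ravel()
    err = pred - kp                            # gradient of 0.5*mean squared error
    gW2 = h.T @ err[:, None] / n_samples; gb2 = err.mean(keepdims=True)
    gh = err[:, None] @ W2.T * (1 - h ** 2)    # backprop through tanh
    gW1 = X.T @ gh / n_samples;           gb1 = gh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

rmse = np.sqrt(np.mean((pred - kp) ** 2))
print(f"training RMSE (Kp units): {rmse:.2f}")
```

On the real problem the inputs would be time series of absorption rather than instantaneous station values, and held-out data would be needed to judge short-term predictive skill.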

  5. State fiscal implications of intelligent transportation systems/commercial vehicle operations deployment

    DOT National Transportation Integrated Search

    1998-01-01

    As states begin to consider full-scale deployment of intelligent transportation system (ITS) technologies to support commercial vehicle operations (CVO), Governors and state legislatures will need answers to the following questions: (1) What savings ...

  6. [Mental disorders in German soldiers after deployment - impact of personal values and resilience].

    PubMed

    Zimmermann, Peter; Firnkes, Susanne; Kowalski, Jens; Backus, Johannes; Alliger-Horn, Christina; Willmund, Gerd; Hellenthal, Andrea; Bauer, Amanda; Petermann, Franz; Maercker, Andreas

    2015-11-01

Soldiers are at increased risk of developing mental health disorders after military deployment. The impact of personal values on psychological symptomatology, based on an empirical working model, has not yet been studied in a military environment. 117 German Armed Forces soldiers completed the Portrait-Values-Questionnaire (PVQ), the Patient-Health-Questionnaire (PHQ) and the Resilience-Scale (RS-11) after their deployment to Afghanistan. In the regression analyses, the values hedonism, benevolence, tradition, self-direction and universalism had differential, significant impacts on depression, anxiety and somatoform symptoms of the PHQ. The RS-11 sum-scale values were negatively correlated with symptomatology. Personal values and resilience seem to be associated with psychological symptomatology in soldiers after military deployment. The results can contribute to the further development of both preventive and therapeutic approaches. © Georg Thieme Verlag KG Stuttgart · New York.

  7. Defense Science Board 1996 Summer Study Task Force On Tactics and Technology for 21st Century Military Superiority. Volume 2, Part 1. Supporting Materials

    DTIC Science & Technology

    1996-10-01

systems currently headed for deployment (BIDS is highlighted in the chart) to widely dispersed microsensors on micro, autonomous platforms. ... "Small, Rapidly Deployable Forces," Joe Polito, Dan Rondeau, Sandia National Laboratory. V.2. "Robotic Concepts for Small Rapidly Deployable Forces," V-7 ... Robert Palmquist, Jill Fahrenholtz, Richard Wheeler, Sandia National Laboratory. V.3. "Potential for Distributed Ground Sensors in Support of Small Unit ...

  8. Outlook and Challenges of Perovskite Solar Cells toward Terawatt-Scale Photovoltaic Module Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Kai; Kim, Donghoe; Whitaker, James B

Rapid development of perovskite solar cells (PSCs) during the past several years has made this photovoltaic (PV) technology a serious contender for potential large-scale deployment on the terawatt scale in the PV market. To successfully transition PSC technology from the laboratory to industry scale, substantial efforts need to focus on scalable fabrication of high-performance perovskite modules with minimum negative environmental impact. Here, we provide an overview of the current research and our perspective regarding PSC technology toward future large-scale manufacturing and deployment. Several key challenges discussed are (1) a scalable process for large-area perovskite module fabrication; (2) less hazardous chemical routes for PSC fabrication; and (3) suitable perovskite module designs for different applications.

  9. Ocean Thermal Energy Conversion (OTEC) Programmatic Environmental Analysis--Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Authors, Various

    1980-01-01

The programmatic environmental analysis is an initial assessment of Ocean Thermal Energy Conversion (OTEC) technology considering development, demonstration and commercialization. It is concluded that the OTEC development program should continue because the development, demonstration, and commercialization on a single-plant deployment basis should not present significant environmental impacts. However, several areas within the OTEC program require further investigation in order to assess the potential for environmental impacts from OTEC operation, particularly in large-scale deployments and in defining alternatives to closed-cycle biofouling control: (1) Larger-scale deployments of OTEC clusters or parks require further investigations in order to assess optimal platform siting distances necessary to minimize adverse environmental impacts. (2) The deployment and operation of the preoperational platform (OTEC-1) and future demonstration platforms must be carefully monitored to refine environmental assessment predictions, and to provide design modifications which may mitigate or reduce environmental impacts for larger-scale operations. These platforms will provide a valuable opportunity to fully evaluate the intake and discharge configurations, biofouling control methods, and both short-term and long-term environmental effects associated with platform operations. (3) Successful development of OTEC technology to use the maximal resource capabilities and to minimize environmental effects will require a concerted environmental management program, encompassing many different disciplines and environmental specialties. This volume contains these appendices: Appendix A -- Deployment Scenario; Appendix B -- OTEC Regional Characterization; and Appendix C -- Impact and Related Calculations.

  10. Security Techniques for Sensor Systems and the Internet of Things

    ERIC Educational Resources Information Center

    Midi, Daniele

    2016-01-01

    Sensor systems are becoming pervasive in many domains, and are recently being generalized by the Internet of Things (IoT). This wide deployment, however, presents significant security issues. We develop security techniques for sensor systems and IoT, addressing all security management phases. Prior to deployment, the nodes need to be hardened. We…

  11. Advanced Vehicle Control Systems (Avcs) For Maintenance Vehicle Applications, Contract No. Dtfh61-94-C-00131, Work Order 11, Prepared For: Department Of Transportation

    DOT National Transportation Integrated Search

    1996-12-20

It is widely believed that barriers to an automated highway system (AHS) deployment are due more to institutional, economic, and legal issues than technology limitations. In order to sustain and accelerate the AHS deployment process, it is desirabl...

  12. Implementation factors affecting the large-scale deployment of digital health and well-being technologies: A qualitative study of the initial phases of the 'Living-It-Up' programme.

    PubMed

    Agbakoba, Ruth; McGee-Lennon, Marilyn; Bouamrane, Matt-Mouley; Watson, Nicholas; Mair, Frances S

    2016-12-01

Little is known about the factors which facilitate or impede the large-scale deployment of health and well-being consumer technologies. The Living-It-Up project is a large-scale digital intervention led by NHS 24, aiming to transform health and well-being services delivery throughout Scotland. We conducted a qualitative study of the factors affecting the implementation and deployment of the Living-It-Up services. We collected a range of data during the initial phase of deployment, including semi-structured interviews (N = 6); participant observation sessions (N = 5) and meetings with key stakeholders (N = 3). We used Normalisation Process Theory as an explanatory framework to interpret the social processes at play during the initial phases of deployment. Initial findings illustrate that it is clear - and perhaps not surprising - that the size and diversity of the Living-It-Up consortium made implementation processes more complex within a 'multi-stakeholder' environment. To overcome these barriers, there is a need to clearly define roles, tasks and responsibilities among the consortium partners. Furthermore, varying levels of expectations and requirements, as well as diverse cultures and ways of working, must be effectively managed. Factors which facilitated implementation included extensive stakeholder engagement, such as co-design activities, which can contribute to increased 'buy-in' from users in the long term. 
An important lesson from the Living-It-Up initiative is that attempting to co-design innovative digital services while at the same time recruiting large numbers of users is likely to generate conflicting implementation priorities which hinder - or at least substantially slow down - the effective rollout of services at scale. The deployment of Living-It-Up services is ongoing, but our results to date suggest that, in order to be successful, the roll-out of digital health and well-being technologies at scale requires a delicate and pragmatic trade-off between co-design activities, the development of innovative services and the effort allocated to widespread marketing and recruitment initiatives. © The Author(s) 2015.

  13. Human Factors in the Large: Experiences from Denmark, Finland and Canada in Moving Towards Regional and National Evaluations of Health Information System Usability. Contribution of the IMIA Human Factors Working Group.

    PubMed

    Kushniruk, A; Kaipio, J; Nieminen, M; Hyppönen, H; Lääveri, T; Nohr, C; Kanstrup, A M; Berg Christiansen, M; Kuo, M-H; Borycki, E

    2014-08-15

    The objective of this paper is to explore approaches to understanding the usability of health information systems at regional and national levels. Several different methods are discussed in case studies from Denmark, Finland and Canada. They range from small scale qualitative studies involving usability testing of systems to larger scale national level questionnaire studies aimed at assessing the use and usability of health information systems by entire groups of health professionals. It was found that regional and national usability studies can complement smaller scale usability studies, and that they are needed in order to understand larger trends regarding system usability. Despite adoption of EHRs, many health professionals rate the usability of the systems as low. A range of usability issues have been noted when data is collected on a large scale through use of widely distributed questionnaires and websites designed to monitor user perceptions of usability. As health information systems are deployed on a widespread basis, studies that examine systems used regionally or nationally are required. In addition, collection of large scale data on the usability of specific IT products is needed in order to complement smaller scale studies of specific systems.

  14. Human Factors in the Large: Experiences from Denmark, Finland and Canada in Moving Towards Regional and National Evaluations of Health Information System Usability

    PubMed Central

    Kaipio, J.; Nieminen, M.; Hyppönen, H.; Lääveri, T.; Nohr, C.; Kanstrup, A. M.; Berg Christiansen, M.; Kuo, M.-H.; Borycki, E.

    2014-01-01

    Objectives: The objective of this paper is to explore approaches to understanding the usability of health information systems at regional and national levels. Methods: Several different methods are discussed in case studies from Denmark, Finland and Canada. They range from small scale qualitative studies involving usability testing of systems to larger scale national level questionnaire studies aimed at assessing the use and usability of health information systems by entire groups of health professionals. Results: It was found that regional and national usability studies can complement smaller scale usability studies, and that they are needed in order to understand larger trends regarding system usability. Despite adoption of EHRs, many health professionals rate the usability of the systems as low. A range of usability issues have been noted when data is collected on a large scale through use of widely distributed questionnaires and websites designed to monitor user perceptions of usability. Conclusion: As health information systems are deployed on a widespread basis, studies that examine systems used regionally or nationally are required. In addition, collection of large scale data on the usability of specific IT products is needed in order to complement smaller scale studies of specific systems. PMID:25123725

  15. A 1.2m Deployable, Transportable Space Surveillance Telescope Designed to Meet AF Space Situational Awareness Needs

    NASA Astrophysics Data System (ADS)

    McGraw, J.; Ackermann, M.

    Recent years have seen significant interest in optical-infrared (OIR) space surveillance capabilities to complement and supplement radar-based sensors. To address this legitimate need for OIR sensors, the Air Force Research Laboratory has been working on several projects intended to meet SSA requirements in practical, fieldable and affordable packages. In particular, while the PanStarrs system is primarily an astronomy project, their well-designed telescope(s) will have substantial SSA capability, but the system, based on four 1.8m apertures on the same mount, will be a fixed-location asset. For world-wide deployment, we are studying a smaller "PanStarrs derived" system which would be replicable and inexpensive. A fixed set of telescope arrays would provide substantial SSA search and monitor capability. These telescopes are also designed to be deployed in pairs in a standard cargo container package for theater SSA. With a 1.2m aperture and a 4.5° FOV, each telescope would have the same étendue as its big brother PanStarrs telescope, but with image quality optimized for space surveillance rather than astronomy. The telescope is even scaled to use production PanStarrs focal plane arrays. A single 1.2m system has almost the same search rate for dim targets as any other system in development. Two such telescopes working together will exceed the performance of any SSA asset either in production or on the drawing boards. Because they are small they can be designed to be replicable and inexpensive and thus could be abandoned in place should the political climate at their deployment sites change for the worse.
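The equal-étendue claim above can be checked with simple arithmetic: étendue scales as collecting area times field-of-view solid angle, roughly D² × FOV². A minimal sketch, using the abstract's 1.2 m / 4.5° and 1.8 m aperture figures; the 3.0° field of view assumed for the PanStarrs reference unit is NOT stated in the abstract:

```python
import math

def etendue(aperture_m, fov_deg):
    """Etendue ~ collecting area x field solid angle (small-angle approx)."""
    area = math.pi * (aperture_m / 2) ** 2              # m^2
    omega = math.pi * math.radians(fov_deg / 2) ** 2    # sr
    return area * omega

e_small = etendue(1.2, 4.5)  # proposed deployable telescope (from abstract)
e_ps1 = etendue(1.8, 3.0)    # assumed PanStarrs unit: 1.8 m, 3.0 deg (FOV assumed)
print(round(e_small / e_ps1, 3))  # -> 1.0
```

The ratio is exactly 1.0 because (1.2/1.8)² × (4.5/3.0)² = 1, consistent with the abstract's statement of matching étendue under the assumed 3.0° reference field.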

  16. Development of a School-Wide Behavior Program in a Public Middle School: An Illustration of Deployment-Focused Intervention Development, Stage 1

    ERIC Educational Resources Information Center

    Molina, Brooke S. G.; Smith, Bradley H.; Pelham, William E., Jr.

    2005-01-01

    School-wide behavior management systems can improve academic performance and behavior in middle schools, and they should have positive effects on students with ADHD. Unfortunately, evidence-based, school-wide behavior management systems have not been widely adopted because of problems with feasibility, acceptability, and sustainability. The…

  17. Modeling of In-stream Tidal Energy Development and its Potential Effects in Tacoma Narrows, Washington, USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Zhaoqing; Wang, Taiping; Copping, Andrea E.

    Understanding and providing proactive information on the potential for tidal energy projects to cause changes to the physical system and to key water quality constituents in tidal waters is a necessary and cost-effective means to avoid costly regulatory involvement and late stage surprises in the permitting process. This paper presents a modeling study for evaluating tidal energy extraction and its potential impacts on the marine environment at a real-world site - Tacoma Narrows of Puget Sound, Washington State, USA. An unstructured-grid coastal ocean model, fitted with a module that simulates tidal energy devices, was applied to simulate the tidal energy extracted by different turbine array configurations and the potential effects of the extraction at local and system-wide scales in Tacoma Narrows and South Puget Sound. Model results demonstrated the advantage of an unstructured-grid model for simulating the far-field effects of tidal energy extraction in a large model domain, as well as assessing the near-field effect using a fine grid resolution near the tidal turbines. The outcome shows that a realistic near-term deployment scenario extracts a very small fraction of the total tidal energy in the system and that system-wide environmental effects are not likely; however, near-field effects on the flow field and bed shear stress in the area of the tidal turbine farm are more likely. Model results also indicate that, from a practical standpoint, hydrodynamic or water quality effects are not likely to be the limiting factor for development of large commercial-scale tidal farms. Results indicate that very high numbers of turbines are required to significantly alter the tidal system; limitations on marine space or other environmental concerns are likely to be reached before reaching these deployment levels. These findings show that important information obtained from numerical modeling can be used to inform regulatory and policy processes for tidal energy development.
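For a rough sense of the energy scales behind such turbine-array scenarios, the kinetic power available to a single in-stream turbine follows P = ½ρACpu³. A sketch with entirely hypothetical values; the rotor diameter, power coefficient, and current speed below are illustrative, not taken from the Tacoma Narrows study:

```python
import math

RHO_SEAWATER = 1025.0  # kg/m^3, typical value

def turbine_power_kw(diameter_m, speed_ms, cp=0.35):
    """Extractable kinetic power: P = 0.5 * rho * A * Cp * u^3, in kW."""
    area = math.pi * (diameter_m / 2) ** 2
    return 0.5 * RHO_SEAWATER * area * cp * speed_ms ** 3 / 1e3

# Hypothetical 20 m rotor in a 3 m/s tidal current (illustrative numbers)
print(round(turbine_power_kw(20.0, 3.0), 1))  # -> 1521.5 kW
```

The cubic dependence on current speed is why narrow, swift channels like Tacoma Narrows attract tidal energy interest in the first place.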

  18. Understanding dynamic pattern and process across spatial scales in river systems using simultaneous deployments of in situ sensors

    NASA Astrophysics Data System (ADS)

    Wollheim, W. M.; Mulukutla, G.; Cook, C.; Carey, R. O.

    2014-12-01

    Biogeochemical conditions throughout aquatic landscapes are spatially varied and temporally dynamic due to interactions of upstream land use, climate, hydrologic responses, and internal aquatic processes. One of the key goals in aquatic ecosystem ecology is to parse the upstream influences of terrestrial and aquatic processes on local conditions, which becomes progressively more difficult as watershed size increases and as processes are altered by diverse human activities. Simultaneous deployments of high frequency, in situ aquatic sensors for multiple constituents (e.g. NO3-N, CDOM, turbidity, conductivity, D.O., water temperature, along with flow) offer a new approach for understanding patterns along the aquatic continuum. For this talk, we explore strategies for deployments within single watersheds to improve understanding of terrestrial and aquatic processes. We address applications regarding mobilization of non-point nutrient sources across temporal scales, interactions with land use and watershed size, and the importance of aquatic processes. We also explore ways in which simultaneous sensor deployments can be designed to improve parameterization and testing of river network biogeochemical models. We will provide several specific examples using conductivity, nitrate and carbon from ongoing sensor deployments in New England, USA. We expect that improved deployments of sensors and sensor networks will benefit the management of critical freshwater resources.

  19. Deployment Methods for an Origami-Inspired Rigid-Foldable Array

    NASA Technical Reports Server (NTRS)

    Zirbel, Shannon A.; Trease, Brian P.; Magleby, Spencer P.; Howell, Larry L.

    2014-01-01

    The purpose of this work is to evaluate several deployment methods for an origami-inspired solar array at two size scales: 25-meter array and CubeSat array. The array enables rigid panel deployment and introduces new concepts for actuating CubeSat deployables. The design for the array was inspired by the origami flasher model (Lang, 1997; Shafer, 2001). Figure 1 shows the array prototyped from Garolite and Kapton film at the CubeSat scale. Prior work demonstrated that rigid panels like solar cells could successfully be folded into the final stowed configuration without requiring the panels to flex (Zirbel, Lang, Thomson, & al., 2013). The design of the array is novel and enables efficient use of space. The array can be wrapped around the central bus of the spacecraft in the case of the large array, or can accommodate storage of a small instrument payload in the case of the CubeSat array. The radial symmetry of this array around the spacecraft is ideally suited for spacecraft that need to spin. This work focuses on several actuation methods for a one-time deployment of the array. The array is launched in its stowed configuration and it will be deployed when it is in space. Concepts for both passive and active actuation were considered.

  20. The military social health index: a partial multicultural validation.

    PubMed

    Van Breda, Adrian D

    2008-05-01

    Routine military deployments place great stress on military families. Before South African soldiers can be deployed, they undergo a comprehensive health assessment, which includes a social work assessment. The assessment focuses on the resilience of the family system to estimate how well the family will cope when exposed to the stress of deployments. This article reports on the development and validation of a new measuring tool, the Military Social Health Index, or MSHI. The MSHI is made up of four scales, each comprising 14 items, viz. social support, problem solving, stressor appraisal, and generalized resistance resources. An initial, large-scale, multicultural validation of the MSHI revealed strong levels of reliability (Cronbach α and standard error of measurement) and validity (factorial, construct, convergent, and discriminant).
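The reliability statistic cited, Cronbach's alpha, is computed directly from item-level scores as α = k/(k−1)·(1 − Σσᵢ²/σ²_total). A minimal sketch on toy data; the scores below are hypothetical, not MSHI data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha; items is a list of per-item score columns."""
    k = len(items)
    n = len(items[0])
    item_vars = []
    for col in items:
        m = sum(col) / n
        item_vars.append(sum((x - m) ** 2 for x in col) / (n - 1))
    # Total score per respondent, and its variance
    totals = [sum(row) for row in zip(*items)]
    mt = sum(totals) / n
    var_total = sum((t - mt) ** 2 for t in totals) / (n - 1)
    return k / (k - 1) * (1 - sum(item_vars) / var_total)

# Three toy items answered by four respondents (hypothetical scores)
items = [[4, 3, 5, 2], [4, 2, 5, 3], [5, 3, 4, 2]]
print(round(cronbach_alpha(items), 2))  # -> 0.89
```

Values above roughly 0.7-0.8 are conventionally read as acceptable scale reliability, which is the sense in which the abstract reports "strong levels".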

  1. Turbulence Measurements from a Moored Platform at Mid-Depth in a Swift Tidal Channel

    NASA Astrophysics Data System (ADS)

    Hay, Alex; Lueck, Rolf; Wolk, Fabian; McMillan, Justine

    2014-05-01

    Results are presented from a turbulence experiment with a 3-m long streamlined flotation body, instrumented with velocity shear probes, fast-response thermistors, a 1 MHz Acoustic Doppler Current Profiler (AD2CP), and an Acoustic Doppler Velocimeter (ADV). The system was deployed over seven tidal cycles at mid-depth in a 30-m deep tidal channel in the lower Bay of Fundy, Canada. Peak flow speeds exceeded 2 m s⁻¹, and while 10-min time scale average speeds were similar between ebb and flood, the variances were markedly higher during flood. Turbulent kinetic energy (TKE) dissipation rates measured with the shear probes exhibit a pronounced flood/ebb contrast: O(10⁻⁴) W kg⁻¹ peak values during flood, but lower by an order of magnitude during ebb. Dissipation rates follow u³ scaling over a wide range of flow speeds between 0.5 and 2.5 m s⁻¹. Below 0.5 m s⁻¹ an asymmetry in the mounting arrangement caused the flotation body to pitch upward, biasing the measured dissipation values high. The ADV on the platform registered mean speed - used to implement Taylor's hypothesis - which was corroborated with the platform-mounted ADCP. Additional ADCPs were also deployed on a nearby bottom pod, sampling at turbulence resolving rates - up to 8 Hz. Comparisons between the shear probe and acoustic estimates of the TKE spectrum and dissipation rate - at comparable depths - are presented.
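A u³ dependence like the one reported above is typically recovered from speed/dissipation pairs by fitting the slope of log ε against log u. A sketch on synthetic data; the coefficient, noise level, and speed range are hypothetical, chosen only to mirror the 0.5-2.5 m/s span in the abstract:

```python
import math, random

random.seed(0)
# Synthetic (speed, dissipation) pairs following eps = a * u**3 with
# lognormal scatter; 'a' and the noise level are hypothetical
a = 1e-5
speeds = [0.5 + 0.1 * i for i in range(21)]  # 0.5 .. 2.5 m/s
eps = [a * u ** 3 * math.exp(random.gauss(0, 0.1)) for u in speeds]

# Least-squares slope of log(eps) vs log(u) recovers the cubic exponent
xs = [math.log(u) for u in speeds]
ys = [math.log(e) for e in eps]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
print(round(slope, 2))  # exponent close to 3
```

A fitted exponent near 3 over a decade of speeds is the signature of boundary-layer-driven dissipation scaling described in the abstract.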

  2. Airborne Lidar Measurements of Surface Topography and Structure in Arctic-Boreal Ecosystems

    NASA Astrophysics Data System (ADS)

    Hofton, M. A.; Blair, J. B.; Rabine, D.; Cornejo, H.; Story, S.

    2017-12-01

    In June-July 2017, NASA's Land, Vegetation and Ice Sensor (LVIS) Facility was deployed to sites in northern Canada and Alaska as part of NASA's Arctic-Boreal Vulnerability Experiment (ABoVE) 2017 airborne campaign. ABoVE is a large-scale, multi-year study of environmental change and its implications for social-ecological systems, and involves multiple airborne sensors flying both field-based and larger scale sampling sites. During the 4-week deployment of LVIS-F, a total of 15 flights were flown over diverse science targets based out of multiple airports in Canada and Alaska. LVIS-F is NASA's high-altitude airborne lidar sensor, collecting a nominal 2 km wide swath of data from 10 km altitude above the ground. Footprints are contiguous both along and across track and, for ABoVE operations, were 6 m in diameter. Full waveform data are collected for every footprint and georeferenced to provide a true three-dimensional view of overflown terrain. Along with precise positioning and pointing information, the LVIS laser range and waveform data are processed to provide high-quality measurements of surface structure including ground elevation, canopy height and canopy volume metrics. Information on data coverage and examples of Level 1B and Level 2 data products at science target sites will be shown, along with initial results for data precision and accuracy. All ABoVE LVIS data products will be available to investigators via a NASA DAAC.

  3. Experiences with the ALICE Mesos infrastructure

    NASA Astrophysics Data System (ADS)

    Berzano, D.; Eulisse, G.; Grigoraş, C.; Napoli, K.

    2017-10-01

    Apache Mesos is a resource management system for large data centres, initially developed by UC Berkeley and now maintained under the Apache Foundation umbrella. It is widely used in industry by companies like Apple, Twitter, and Airbnb, and it is known to scale to 10,000s of nodes. Together with other tools of its ecosystem, such as Mesosphere Marathon or Metronome, it provides an end-to-end solution for datacenter operations and a unified way to exploit large distributed systems. We present the experience of the ALICE Experiment Offline & Computing in deploying and using in production the Apache Mesos ecosystem for a variety of tasks on a small 500-core cluster, using hybrid OpenStack and bare metal resources. We will initially introduce the architecture of our setup and its operation; we will then describe the tasks which are performed by it, including release building and QA, release validation, and simple Monte Carlo production. We will show how we developed Mesos-enabled components (called “Mesos Frameworks”) to carry out ALICE-specific needs. In particular, we will illustrate our effort to integrate Work Queue, a lightweight batch processing engine developed by the University of Notre Dame, which ALICE uses to orchestrate release validation. Finally, we will give an outlook on how to use Mesos as resource manager for DDS, a software deployment system developed by GSI which will be the foundation of the system deployment for ALICE's next generation Online-Offline (O2) system.

  4. Floating Offshore Wind in Oregon: Potential for Jobs and Economic Impacts in Oregon Coastal Counties from Two Future Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jimenez, Tony; Keyser, David; Tegen, Suzanne

    This analysis examines the employment and potential economic impacts of large-scale deployment of offshore wind technology off the coast of Oregon. This analysis examines impacts within the seven Oregon coastal counties: Clatsop, Tillamook, Lincoln, Lane, Douglas, Coos, and Curry. The impacts highlighted here can be used in county, state, and regional planning discussions and can be scaled to get a general sense of the economic development opportunities associated with other deployment scenarios.

  5. Intense deformation field at oceanic front inferred from directional sea surface roughness observations

    NASA Astrophysics Data System (ADS)

    Rascle, Nicolas; Molemaker, Jeroen; Marié, Louis; Nouguier, Frédéric; Chapron, Bertrand; Lund, Björn; Mouche, Alexis

    2017-06-01

    Fine-scale current gradients at the ocean surface can be observed by sea surface roughness. More specifically, directional surface roughness anomalies are related to the different horizontal current gradient components. This paper reports results from a dedicated experiment during the Lagrangian Submesoscale Experiment (LASER) drifter deployment. A very sharp front, 50 m wide, is detected simultaneously in drifter trajectories, sea surface temperature, and sea surface roughness. A new observational method is applied, using Sun glitter reflections during multiple airplane passes to reconstruct the multiangle roughness anomaly. This multiangle anomaly is consistent with wave-current interactions over a front, including both cross-front convergence and along-front shear with cyclonic vorticity. Qualitatively, results agree with drifters and X-band radar observations. Quantitatively, the sharpness of the roughness anomaly suggests intense current gradients, 0.3 m s⁻¹ over the 50 m wide front. This work opens new perspectives for monitoring intense oceanic fronts using drones or satellite constellations.

  6. DRAGON - 8U Nanosatellite Orbital Deployer

    NASA Technical Reports Server (NTRS)

    Dobrowolski, Marcin; Grygorczuk, Jerzy; Kedziora, Bartosz; Tokarz, Marta; Borys, Maciej

    2014-01-01

    The Space Research Centre of the Polish Academy of Sciences (SRC PAS), together with the Astronika company, has developed an Orbital Deployer called DRAGON for ejection of the Polish scientific nanosatellite BRITE-PL Heweliusz (Fig. 1). The device has three unique mechanisms, including an adapted and scaled lock-and-release mechanism from the ESA Rosetta mission MUPUS instrument. This paper discusses major design restrictions of the deployer, unique design features, and lessons learned from development through testing.

  7. ADEPT - A Mechanically Deployable Entry System Technology in Development at NASA

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj; Wercinski, Paul; Cassell, Alan; Smith, Brandon; Yount, Bryan

    2016-01-01

    The proposed presentation will give an overview of a mechanically deployable entry system concept development with a comprehensive summary of the ground tests and design development completed to-date, and current plans for a small-scale flight test in the near future.

  8. Systematic Benchmarking of Diagnostic Technologies for an Electrical Power System

    NASA Technical Reports Server (NTRS)

    Kurtoglu, Tolga; Jensen, David; Poll, Scott

    2009-01-01

    Automated health management is a critical functionality for complex aerospace systems. A wide variety of diagnostic algorithms have been developed to address this technical challenge. Unfortunately, the lack of support to perform large-scale V&V (verification and validation) of diagnostic technologies continues to create barriers to effective development and deployment of such algorithms for aerospace vehicles. In this paper, we describe a formal framework developed for benchmarking of diagnostic technologies. The diagnosed system is the Advanced Diagnostics and Prognostics Testbed (ADAPT), a real-world electrical power system (EPS), developed and maintained at the NASA Ames Research Center. The benchmarking approach provides a systematic, empirical basis to the testing of diagnostic software and is used to provide performance assessment for different diagnostic algorithms.
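A benchmarking framework of this kind ultimately reduces to scoring algorithm outputs against known injected faults. A minimal sketch of two common metrics, detection rate and false-alarm rate; the scenario encoding below is a hypothetical illustration, not the actual ADAPT benchmark format:

```python
# Sketch of scoring a diagnostic algorithm over benchmark scenarios.
# The 0/1 encoding is an assumed illustration, not the ADAPT format.
def score(injected, diagnosed):
    """Return (detection rate, false-alarm rate) across scenarios."""
    tp = sum(1 for i, d in zip(injected, diagnosed) if i and d)
    fp = sum(1 for i, d in zip(injected, diagnosed) if not i and d)
    n_fault = sum(injected)
    n_nominal = len(injected) - n_fault
    return tp / n_fault, fp / n_nominal

injected  = [1, 1, 0, 1, 0, 0, 1, 0]   # 1 = fault injected in scenario
diagnosed = [1, 0, 0, 1, 1, 0, 1, 0]   # 1 = algorithm reported a fault
det, fa = score(injected, diagnosed)
print(det, fa)  # -> 0.75 0.25
```

Real benchmarks of this kind typically add further metrics (isolation accuracy, detection latency), but they follow the same pattern of comparing outputs against ground-truth fault injections.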

  9. High-energy synchrotron x-ray techniques for studying irradiated materials

    DOE PAGES

    Park, Jun-Sang; Zhang, Xuan; Sharma, Hemant; ...

    2015-03-20

    High performance materials that can withstand radiation, heat, multiaxial stresses, and corrosive environments are necessary for the deployment of advanced nuclear energy systems. Nondestructive in situ experimental techniques utilizing high energy x-rays from synchrotron sources can be an attractive set of tools for engineers and scientists to investigate the structure–processing–property relationship systematically at smaller length scales and help build better material models. In this paper, two unique and interconnected experimental techniques, namely, simultaneous small-angle/wide-angle x-ray scattering (SAXS/WAXS) and far-field high-energy diffraction microscopy (FF-HEDM), are presented. Finally, the changes in material state as Fe-based alloys are heated to high temperatures or subjected to irradiation are examined using these techniques.
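The "high energy" qualifier matters because photon wavelength sets penetration depth into bulk metal samples; the standard conversion is λ[Å] = 12.398/E[keV]. A sketch; the 70 keV beam energy is an illustrative assumption, not a value stated in the paper:

```python
def wavelength_angstrom(energy_keV):
    """Photon wavelength from energy: lambda [A] = 12.398 / E [keV]."""
    return 12.398 / energy_keV

# A hypothetical 70 keV synchrotron beam: sub-0.2 A wavelength, hence
# the deep penetration that enables nondestructive bulk measurements
print(round(wavelength_angstrom(70.0), 3))  # -> 0.177
```

Short wavelengths also compress the diffraction pattern to small angles, which is what makes simultaneous far-field detector geometries such as FF-HEDM practical.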

  10. Advanced Deployable Shell-Based Composite Booms for Small Satellite Structural Applications Including Solar Sails

    NASA Technical Reports Server (NTRS)

    Fernandez, Juan M.

    2017-01-01

    State of the art deployable structures are mainly being designed for medium to large size satellites. The lack of reliable deployable structural systems for low cost, small volume, rideshare-class spacecraft severely constrains the potential for using small satellite platforms for affordable deep space science and exploration precursor missions that could be realized with solar sails. There is thus a need for reliable, lightweight, high packaging efficiency deployable booms that can serve as the supporting structure for a wide range of small satellite systems including solar sails for propulsion. The National Air and Space Administration (NASA) is currently investing in the development of a new class of advanced deployable shell-based composite booms to support future deep space small satellite missions using solar sails. The concepts are being designed to: meet the unique requirements of small satellites, maximize ground testability, permit the use of low-cost manufacturing processes that will benefit scalability, be scalable for use as elements of hierarchical structures (e.g. trusses), allow long duration storage, have high deployment reliability, and have controlled deployment behavior and predictable deployed dynamics. This paper will present the various rollable boom concepts that are being developed for 5-20 m class size deployable structures that include solar sails with the so-called High Strain Composites (HSC) materials. The deployable composite booms to be presented are being developed to expand the portfolio of available rollable booms for small satellites and maximize their length for a given packaged volume. Given that solar sails are a great example of volume and mass optimization, the booms were designed to comply with nominal solar sail system requirements for 6U CubeSats, which are a good compromise between those of smaller form factors (1U, 2U and 3U CubeSats) and larger ones (12 U and 27 U future CubeSats, and ESPA-class microsatellites). 
Solar sail missions for such composite boom systems are already under consideration and development at NASA, as well as mission studies that will benefit from planned scaled-up versions of the composite boom technologies to be introduced. The paper presents ongoing research and development of thin-shell rollable composite booms designed under the particular stringent and challenging system requirements of relatively large solar sails housed on small satellites. These requirements will be derived and listed. Several new boom concepts are proposed and other existing ones are improved upon using thin-ply composite materials to yield unprecedented compact deployable structures. Some of these booms are shown in Fig. 1. For every boom to be introduced the scalable fabrication process developed to keep the overall boom system cost down will be shown. Finally, the initial results of purposely designed boom structural characterization test methods with gravity off-loading will be presented to compare their structural performance under expected and general load cases.

  11. Dispersion and Cluster Scales in the Ocean

    NASA Astrophysics Data System (ADS)

    Kirwan, A. D., Jr.; Chang, H.; Huntley, H.; Carlson, D. F.; Mensa, J. A.; Poje, A. C.; Fox-Kemper, B.

    2017-12-01

    Ocean flow spatial scales range from centimeters to thousands of kilometers. Because of their large Reynolds number these flows are considered turbulent. However, because of rotation and stratification constraints they do not conform to classical turbulence scaling theory. Mesoscale and large-scale motions are well described by geostrophic or "2D turbulence" theory; however, extending this theory to submesoscales has proved to be problematic. One obvious reason is the difficulty in obtaining reliable data over many orders of magnitude of spatial scales in an ocean environment. The goal of this presentation is to provide a preliminary synopsis of two recent experiments that overcame these obstacles. The first experiment, the Grand LAgrangian Deployment (GLAD), was conducted during July 2012 in the eastern half of the Gulf of Mexico. Here approximately 300 GPS-tracked drifters were deployed with the primary goal to determine whether the relative dispersion of an initially densely clustered array was driven by processes acting at local pair separation scales or by straining imposed by mesoscale motions. The second experiment was a component of the LAgrangian Submesoscale Experiment (LASER) conducted during the winter of 2016. Here thousands of bamboo plates were tracked optically from an Aerostat. Together these two deployments provided an unprecedented data set on dispersion and clustering processes from 1 to 10⁶ m scales. Calculations of statistics such as two-point separations, structure functions, and scale-dependent relative diffusivities showed an inverse energy cascade as expected for scales above 10 km, and a forward energy cascade at scales below 10 km with a possible energy input at Langmuir circulation scales. We also find evidence from structure function calculations for surface flow convergence at scales less than 10 km that account for material clustering at the ocean surface.
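The two-point separation statistics mentioned above start from the relative dispersion: the mean squared pair separation across a drifter cluster, tracked as it grows in time. A minimal sketch of the single-time quantity on a toy four-drifter cluster; positions are hypothetical:

```python
def relative_dispersion(positions):
    """Mean squared pair separation <|x_i - x_j|^2> at one time.

    positions: list of (x, y) drifter coordinates in metres."""
    n = len(positions)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            dx = positions[i][0] - positions[j][0]
            dy = positions[i][1] - positions[j][1]
            total += dx * dx + dy * dy
            pairs += 1
    return total / pairs

# Toy cluster: four drifters on the corners of a 100 m square
cluster = [(0, 0), (100, 0), (0, 100), (100, 100)]
print(relative_dispersion(cluster))  # mean squared separation, m^2
```

Repeating this calculation at successive times, and binning by separation scale, yields the scale-dependent relative diffusivities used to diagnose the inverse and forward cascades described in the abstract.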

  12. Impact of Federal Tax Policy on Utility-Scale Solar Deployment Given Financing Interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mai, Trieu; Cole, Wesley; Krishnan, Venkat

    In this study, the authors conducted a literature review of approaches and assumptions used by other modeling teams and consultants with respect to solar project financing; developed and incorporated an ability to model the likely financing shift away from more expensive sources of capital and toward cheaper sources as the investment tax credit declines in the ReEDS model; and used the 'before and after' versions of the ReEDS model to isolate and analyze the deployment impact of the financing shift under a range of conditions. Using ReEDS scenarios with this improved capability, we find that this 'financing' shift would soften the blow of the ITC reversion; however, the overall impacts of such a shift in capital structure are estimated to be small and near-term utility-scale PV deployment is found to be much more sensitive to other factors that might drive down utility-scale PV prices.
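The financing shift described above can be illustrated with a simple after-tax weighted average cost of capital, WACC = f_debt·r_debt·(1−τ) + (1−f_debt)·r_equity. All rates and capital fractions below are hypothetical illustrations, not ReEDS inputs:

```python
def wacc(debt_frac, r_debt, r_equity, tax_rate=0.21):
    """After-tax weighted average cost of capital for a project."""
    return debt_frac * r_debt * (1 - tax_rate) + (1 - debt_frac) * r_equity

# Hypothetical capital structures: less debt while an investment tax
# credit favors (expensive) tax equity, more cheap debt after it declines
with_itc = wacc(debt_frac=0.4, r_debt=0.045, r_equity=0.10)
after_itc = wacc(debt_frac=0.6, r_debt=0.045, r_equity=0.10)
print(round(with_itc, 4), round(after_itc, 4))  # -> 0.0742 0.0613
```

The lower post-shift WACC is the mechanism by which cheaper capital "softens the blow" of the credit's decline, even though the study finds the net deployment effect to be small.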

  13. Seattle wide-area information for travelers (SWIFT) : deployment cost study

    DOT National Transportation Integrated Search

    1998-10-19

    The Seattle Wide-area Information For Travelers (SWIFT) project was a highly successful Intelligent Transportation System (ITS) Field Operational Test (FOT) that was conducted over a four-year period from 1993 to 1997. The purpose of the project was ...

  14. A statistical investigation into the stability of iris recognition in diverse population sets

    NASA Astrophysics Data System (ADS)

    Howard, John J.; Etter, Delores M.

    2014-05-01

    Iris recognition is increasingly being deployed on population-wide scales for important applications such as border security, social service administration, criminal identification and general population management. The error rates for this incredibly accurate form of biometric identification are established using well known, laboratory quality datasets. However, it has long been acknowledged in biometric theory that not all individuals have the same likelihood of being correctly serviced by a biometric system. Typically, techniques for identifying clients that are likely to experience a false non-match or a false match error are carried out on a per-subject basis. This research makes the novel hypothesis that certain ethnic groups are more or less likely to experience a biometric error. Through established statistical techniques, we demonstrate this hypothesis to be true and document the notable effect that the ethnicity of the client has on iris similarity scores. Understanding the expected impact of ethnic diversity on iris recognition accuracy is crucial to the future success of this technology as it is deployed in areas where the target population consists of clientele from a range of geographic backgrounds, such as border crossings and immigration check points.
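One established technique for testing whether two cohorts draw different similarity scores is a two-sample test on the score distributions. The abstract does not name its method, so the use of Welch's t statistic here is an assumption, and the scores below are synthetic:

```python
import math, random

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

random.seed(1)
# Synthetic genuine-match similarity scores for two hypothetical cohorts,
# one shifted lower to mimic a demographic effect (illustrative only)
cohort_a = [random.gauss(0.45, 0.05) for _ in range(200)]
cohort_b = [random.gauss(0.40, 0.05) for _ in range(200)]
print(round(welch_t(cohort_a, cohort_b), 1))  # large |t| flags a group effect
```

A large |t| (compared against the t distribution's critical value) would reject the hypothesis that the two cohorts share the same mean similarity score, which is the shape of the demographic effect the paper documents.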

  15. Field calibration of polyurethane foam (PUF) disk passive air samplers for PCBs and OC pesticides.

    PubMed

    Chaemfa, Chakra; Barber, Jonathan L; Gocht, Tilman; Harner, Tom; Holoubek, Ivan; Klanova, Jana; Jones, Kevin C

    2008-12-01

    Different passive air sampler (PAS) strategies have been developed for sampling in remote areas and for cost-effective simultaneous spatial mapping of POPs (persistent organic pollutants) over differing geographical scales. The polyurethane foam (PUF) disk-based PAS is probably the most widely used. In a PUF-based PAS, the PUF disk is generally mounted inside two stainless steel bowls to buffer the air flow to the disk and to shield it from precipitation and light. The field study described in this manuscript was conducted to compare the performance of three different sampler designs, to further calibrate the sampler against the conventional active sampler, and to derive more information on field-based uptake rates and equilibrium times of the samplers. Samplers were also deployed at different locations across the field site, and at different heights up a meteorological tower, to investigate the possible influence of sampler location. Samplers deployed <5 m above ground and not directly sheltered from the wind gave similar uptake rates. Small differences in dimensions between the three designs of passive sampler chamber had no discernible effect on accumulation rates, allowing comparison with previously published data.
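    The field calibration described above amounts to estimating an effective sampling rate from the mass accumulated on the PUF disk and the co-located active-sampler air concentration. A minimal sketch, using illustrative numbers rather than the paper's measurements:

```python
def effective_sampling_rate(m_pas_ng, c_air_ng_m3, days):
    """Field-calibrated uptake rate R (m^3/day): mass accumulated on the
    PUF disk divided by the active-sampler air concentration and the
    exposure time. Valid only in the linear (pre-equilibrium) uptake phase."""
    return m_pas_ng / (c_air_ng_m3 * days)

# illustrative numbers (not from the paper): 120 ng accumulated over a
# 30-day deployment at an active-sampler concentration of 1.0 ng/m^3
r = effective_sampling_rate(120.0, 1.0, 30.0)   # 4.0 m^3/day
```

    Rates of a few cubic metres per day are typical of PUF-disk samplers in the linear uptake phase, which is why deployment height and wind sheltering matter.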

  16. New set of solar arrays deployed on Hubble Space Telescope

    NASA Image and Video Library

    1993-12-09

    STS061-99-002 (2-13 Dec 1993) --- The new set of solar array panels deployed on the Hubble Space Telescope (HST) is backdropped against the blackness of space and a widely cloud-covered area on Earth. The 70mm frame was exposed by one of the Space Shuttle Endeavour's seven crew members on the aft flight deck.

  17. Low-Cost Sensor Units for Measuring Urban Air Quality

    NASA Astrophysics Data System (ADS)

    Popoola, O. A.; Mead, M.; Stewart, G.; Hodgson, T.; McLoed, M.; Baldovi, J.; Landshoff, P.; Hayes, M.; Calleja, M.; Jones, R.

    2010-12-01

    Measurements of selected key air quality gases (CO, NO and NO2) have been made with a range of miniature low-cost sensors based on electrochemical gas sensing technology, incorporating GPS and GPRS for position and communication, respectively. Two types of simple-to-operate sensor units have been designed to be deployed in relatively large numbers. Mobile handheld sensor units designed for operation by members of the public have been deployed on numerous occasions, including in Cambridge, London and Valencia. Static sensor units have also been designed for long-term autonomous deployment on existing street furniture. A study was recently completed in which 45 sensor units were deployed in the Cambridge area for a period of 3 months. Results from these studies indicate that air quality varies widely both spatially and temporally. The widely varying concentrations found suggest that the urban environment cannot be fully understood using limited static-site (AURN) networks and that a higher-resolution, more dispersed network is required to better define air quality in the urban environment. The results also suggest that higher spatial and temporal resolution measurements could improve knowledge of the levels of individual exposure in the urban environment.

  18. How much do electric drive vehicles matter to future U.S. emissions?

    PubMed

    Babaee, Samaneh; Nagpure, Ajay S; DeCarolis, Joseph F

    2014-01-01

    Hybrid, plug-in hybrid, and battery electric vehicles, known collectively as electric drive vehicles (EDVs), may represent a clean and affordable option to meet growing U.S. light-duty vehicle (LDV) demand. The goal of this study is 2-fold: identify the conditions under which EDVs achieve high LDV market penetration in the U.S. and quantify the associated change in CO2, SO2, and NOx emissions through midcentury. We employ the Integrated MARKAL-EFOM System (TIMES), a bottom-up energy system model, along with a U.S. data set developed for this analysis. To characterize EDV deployment through 2050, varying assumptions related to crude oil and natural gas prices, a CO2 policy, a federal renewable portfolio standard, and vehicle battery cost were combined to form 108 different scenarios. Across these scenarios, oil prices and battery cost have the biggest effect on EDV deployment. The model results do not demonstrate a clear and consistent trend toward lower system-wide emissions as EDV deployment increases. In addition to the trade-off between lower tailpipe and higher electric sector emissions associated with plug-in vehicles, the scenarios produce system-wide emissions effects that often mask the effect of EDV deployment.
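    The 108-scenario design can be reproduced combinatorially. The specific factor levels below are an assumed split (3 x 3 x 2 x 2 x 3) consistent with the assumption categories the abstract names, not the paper's published levels:

```python
from itertools import product

# hypothetical assumption levels; the paper's actual level definitions
# are not reproduced here
oil_price   = ["low", "reference", "high"]
gas_price   = ["low", "reference", "high"]
co2_policy  = ["none", "cap"]
federal_rps = ["none", "enacted"]
battery     = ["low", "reference", "high"]

# full factorial combination of assumptions: 3 * 3 * 2 * 2 * 3 = 108
scenarios = list(product(oil_price, gas_price, co2_policy, federal_rps, battery))
```

    A full factorial like this is the standard way to make every combination of assumptions explicit before running each one through the energy system model.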

  19. "Tactic": Traffic Aware Cloud for Tiered Infrastructure Consolidation

    ERIC Educational Resources Information Center

    Sangpetch, Akkarit

    2013-01-01

    Large-scale enterprise applications are deployed as distributed applications. These applications consist of many inter-connected components with heterogeneous roles and complex dependencies. Each component typically consumes 5-15% of the server capacity. Deploying each component as a separate virtual machine (VM) allows us to consolidate the…

  20. Tony Jimenez | NREL

    Science.gov Websites

    Tony's areas of work include pre-feasibility analysis; wind data analysis; the small wind turbine certification process; and analysis of the potential economic impact of large-scale MHK deployment. Tony is an engineer officer in the Army Reserve. He has deployed twice.

  1. Transforming a Liability Into An Asset-Creating a Market for CO2-based Products

    NASA Astrophysics Data System (ADS)

    David, B. J.

    2016-12-01

    This session will discuss converting CO2 from a liability into an asset. It will specifically discuss how at least 25 products can be created using CO2 as a feedstock and deployed in the market at large scale. Focus will be on products that can both achieve scale from a market standpoint and achieve climate significance in their use of CO2 as a feedstock. The session will describe the market drivers supporting and inhibiting commercial deployment of CO2-based products. It will list key barriers and risks in the various CO2-based product segments; these barriers and risks could occur across technology, policy, institutional, economic, and other dimensions. The means to mitigate each barrier, and the likelihood of such means being deployed, will be discussed.

  2. Quantifying Neighborhood-Scale Spatial Variations of Ozone at Open Space and Urban Sites in Boulder, Colorado Using Low-Cost Sensor Technology.

    PubMed

    Cheadle, Lucy; Deanes, Lauren; Sadighi, Kira; Gordon Casey, Joanna; Collier-Oxandale, Ashley; Hannigan, Michael

    2017-09-10

    Recent advances in air pollution sensors have led to a new wave of low-cost measurement systems that can be deployed in dense networks to capture small-scale spatio-temporal variations in ozone, a pollutant known to cause negative human health impacts. This study deployed a network of seven low-cost ozone metal oxide sensor systems (UPods) in both an open space and an urban location in Boulder, Colorado during June and July of 2015, to quantify ozone variations on spatial scales ranging from 12 m between UPods to 6.7 km between the open space and urban measurement sites, with a measurement uncertainty of ~5 ppb. The results showed spatial variability of ozone at both deployment sites, with the largest differences between UPod measurements occurring during the afternoons. The peak median hourly difference between UPods was 6 ppb at 1:00 p.m. at the open space site, and 11 ppb at 4:00 p.m. at the urban site. Overall, the urban ozone measurements were higher than the open space measurements. This study evaluates the effectiveness of using low-cost sensors to capture microscale spatial and temporal variation of ozone; additionally, it highlights the importance of field calibrations and measurement uncertainty quantification when deploying low-cost sensors.
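    The peak-median-hourly-difference statistic described above can be sketched as a group-by-hour computation. The readings below are synthetic, constructed only to illustrate the calculation, not UPod data from the study:

```python
from collections import defaultdict
from statistics import median

# synthetic co-located readings: (hour_of_day, upod_a_ppb, upod_b_ppb);
# the inter-sensor spread is made larger in the afternoon, as in the study
readings = []
for day in range(5):
    for hour in range(24):
        a = 40.0
        b = 40.0 + (6.0 if 12 <= hour <= 17 else 1.0)
        readings.append((hour, a, b))

# median absolute inter-sensor difference for each hour of day
diffs = defaultdict(list)
for hour, a, b in readings:
    diffs[hour].append(abs(a - b))
hourly_median = {h: median(v) for h, v in diffs.items()}

# hour with the largest median difference (an afternoon hour here)
peak_hour = max(hourly_median, key=hourly_median.get)
```

    Aggregating by hour of day is what lets a short deployment reveal a diurnal pattern in spatial variability rather than a single network-wide average.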

  3. OCEAN THERMAL ENERGY CONVERSION (OTEC) PROGRAMMATIC ENVIRONMENTAL ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sands, M. D.

    1980-01-01

    This programmatic environmental analysis is an initial assessment of OTEC technology considering development, demonstration and commercialization; it is concluded that the OTEC development program should continue because the development, demonstration, and commercialization on a single-plant deployment basis should not present significant environmental impacts. However, several areas within the OTEC program require further investigation in order to assess the potential for environmental impacts from OTEC operation, particularly in large-scale deployments and in defining alternatives to closed-cycle biofouling control: (1) Larger-scale deployments of OTEC clusters or parks require further investigations in order to assess optimal platform siting distances necessary to minimize adverse environmental impacts. (2) The deployment and operation of the preoperational platform (OTEC-1) and future demonstration platforms must be carefully monitored to refine environmental assessment predictions, and to provide design modifications which may mitigate or reduce environmental impacts for larger-scale operations. These platforms will provide a valuable opportunity to fully evaluate the intake and discharge configurations, biofouling control methods, and both short-term and long-term environmental effects associated with platform operations. (3) Successful development of OTEC technology to use the maximal resource capabilities and to minimize environmental effects will require a concerted environmental management program, encompassing many different disciplines and environmental specialties.

  4. Deployable System for Crash-Load Attenuation

    NASA Technical Reports Server (NTRS)

    Kellas, Sotiris; Jackson, Karen E.

    2007-01-01

    An externally deployable honeycomb structure is investigated with respect to crash energy management for light aircraft. The new concept utilizes an expandable honeycomb-like structure to absorb impact energy by crushing. Distinguished by flexible hinges between cell wall junctions that enable effortless deployment, the new energy absorber offers most of the desirable features of an external airbag system without the limitations of poor shear stability, system complexity, and timing sensitivity. Like conventional honeycomb, once expanded, the energy absorber is transformed into a crush efficient and stable cellular structure. Other advantages, afforded by the flexible hinge feature, include a variety of deployment options such as linear, radial, and/or hybrid deployment methods. Radial deployment is utilized when omnidirectional cushioning is required. Linear deployment offers better efficiency, which is preferred when the impact orientation is known in advance. Several energy absorbers utilizing different deployment modes could also be combined to optimize overall performance and/or improve system reliability as outlined in the paper. Results from a series of component and full scale demonstration tests are presented as well as typical deployment techniques and mechanisms. LS-DYNA analytical simulations of selected tests are also presented.

  5. Fine-grained dengue forecasting using telephone triage services

    PubMed Central

    Abdur Rehman, Nabeel; Kalyanaraman, Shankar; Ahmad, Talal; Pervaiz, Fahad; Saif, Umar; Subramanian, Lakshminarayanan

    2016-01-01

    Thousands of lives are lost every year in developing countries for failing to detect epidemics early because of the lack of real-time disease surveillance data. We present results from a large-scale deployment of a telephone triage service as a basis for dengue forecasting in Pakistan. Our system uses statistical analysis of dengue-related phone calls to accurately forecast suspected dengue cases 2 to 3 weeks ahead of time at a subcity level (correlation of up to 0.93). Our system has been operational at scale in Pakistan for the past 3 years and has received more than 300,000 phone calls. The predictions from our system are widely disseminated to public health officials and form a critical part of active government strategies for dengue containment. Our work is the first to demonstrate, with significant empirical evidence, that an accurate, location-specific disease forecasting system can be built using analysis of call volume data from a public health hotline. PMID:27419226
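    The core of the forecasting signal described above is a lagged correlation between call volume and suspected cases. A stdlib-only sketch with hypothetical weekly counts (the reported correlation of up to 0.93 comes from the authors' real data, not from this toy):

```python
# hypothetical weekly counts; call volume leads confirmed cases by 2 weeks
calls = [120, 150, 200, 340, 500, 620, 540, 380, 260, 180, 140, 110]
cases = [None, None] + [round(0.04 * c) for c in calls[:-2]]  # 2-week lag

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# correlate calls at week t with cases at week t + 2
r = pearson(calls[:-2], cases[2:])
```

    A high lagged correlation is what makes the hotline usable as a 2- to 3-week leading indicator for subcity-level dengue forecasts.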

  6. Limited impact of beach nourishment on macrofaunal recruitment/settlement in a site of community interest in coastal area of the Adriatic Sea (Mediterranean Sea).

    PubMed

    Danovaro, Roberto; Nepote, Ettore; Martire, Marco Lo; Ciotti, Claudia; De Grandis, Gianluca; Corinaldesi, Cinzia; Carugati, Laura; Cerrano, Carlo; Pica, Daniela; Di Camillo, Cristina Gioia; Dell'Anno, Antonio

    2018-03-01

    Beach nourishment is a widely utilized solution to counteract the erosion of shorelines, and there is an active discussion on its possible consequences for coastal marine assemblages. We investigated the impact caused by a small-scale beach nourishment carried out in the Western Adriatic Sea on macrofaunal recruitment and post-settlement events. Artificial substrates were deployed in proximity to nourished and non-manipulated beaches, and turbidity and sedimentation rates were measured. Our results indicate that sedimentation rates in the impacted site showed a different temporal change compared to the control sites, suggesting potential modifications due to the beach nourishment. The impact site was characterized by subtle changes in terms of polychaete abundance and community structure when compared to controls, possibly due to beach nourishment, although the role of other factors cannot be ruled out. We conclude that small-scale beach nourishments appear to be an eco-sustainable approach to counteract coastal erosion. Copyright © 2018. Published by Elsevier Ltd.

  7. Potentially modifiable pre-, peri-, and postdeployment characteristics associated with deployment-related posttraumatic stress disorder among ohio army national guard soldiers.

    PubMed

    Goldmann, Emily; Calabrese, Joseph R; Prescott, Marta R; Tamburrino, Marijo; Liberzon, Israel; Slembarski, Renee; Shirley, Edwin; Fine, Thomas; Goto, Toyomi; Wilson, Kimberly; Ganocy, Stephen; Chan, Philip; Serrano, Mary Beth; Sizemore, James; Galea, Sandro

    2012-02-01

    To evaluate potentially modifiable deployment characteristics (predeployment preparedness, unit support during deployment, and postdeployment support) that may be associated with deployment-related posttraumatic stress disorder (PTSD). We recruited a sample of 2616 Ohio Army National Guard (OHARNG) soldiers and conducted structured interviews to assess traumatic event exposure and PTSD related to the soldiers' most recent deployment, consistent with DSM-IV criteria. We assessed preparedness, unit support, and postdeployment support by using multimeasure scales adapted from the Deployment Risk and Resilience Survey. The prevalence of deployment-related PTSD was 9.6%. In adjusted logistic models, high levels of all three deployment characteristics (compared with low) were independently associated with lower odds of PTSD. When we evaluated the influence of combinations of deployment characteristics on the development of PTSD, we found that postdeployment support was an essential factor in the prevention of PTSD. Results show that factors throughout the life course of deployment, in particular postdeployment support, may influence the development of PTSD. These results suggest that the development of suitable postdeployment support opportunities may be centrally important in mitigating the psychological consequences of war. Copyright © 2012 Elsevier Inc. All rights reserved.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallarno, George; Rogers, James H; Maxwell, Don E

    The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large scale. The world's second fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage, since GPUs have only recently been deployed at large scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.

  9. 1. Air-Sea and Lateral Exchange Processes in East Indian Coastal Current off Sri Lanka 2. ASIRI: Remote Sensing of Atmospheric Waves and Instabilities (RAWI)

    DTIC Science & Technology

    2014-09-30

    second project, collaboration is sought with institutions in Seychelles and Singapore for atmospheric deployments. In all cases, the project expects to...suite of atmospheric instruments in the coasts of three IO island nations, Sri Lanka, Seychelles and Singapore to capture small-scale events pertinent...necessary for the deployments are being developed in Sri Lanka. The nature of the deployments in Seychelles and Singapore do not require additional

  10. Radiator on S0 truss after remote deployment

    NASA Image and Video Library

    2002-10-14

    STS112-E-05563 (14 October 2002) --- View of one of the radiators on the newly installed Starboard One (S1) Truss which was remotely deployed to verify the connections established on the first spacewalk for the STS-112 mission. Its extended length was 75 feet with each of the eight panels being 11 feet wide. The cooling systems will not formally be activated until next year.

  11. Intelligent Network Flow Optimization (INFLO) prototype : Seattle small-scale demonstration report.

    DOT National Transportation Integrated Search

    2015-05-01

    This report describes the performance and results of the INFLO Prototype Small-Scale Demonstration. The purpose of the Small-Scale Demonstration was to deploy the INFLO Prototype System to demonstrate its functionality and performance in an operation...

  12. Shape accuracy optimization for cable-rib tension deployable antenna structure with tensioned cables

    NASA Astrophysics Data System (ADS)

    Liu, Ruiwei; Guo, Hongwei; Liu, Rongqiang; Wang, Hongxiang; Tang, Dewei; Song, Xiaoke

    2017-11-01

    Shape accuracy is of substantial importance in deployable structures as the demand for large-scale deployable structures in various fields, especially in aerospace engineering, increases. The main purpose of this paper is to present a shape accuracy optimization method to find the optimal pretensions for the desired shape of a cable-rib tension deployable antenna structure with tensioned cables. First, an analysis model of the deployable structure is established using the finite element method. In this model, geometrical nonlinearity is considered for the cable element and beam element. Flexible deformations of the deployable structure under the action of the cable network and tensioned cables are subsequently analyzed separately. Moreover, the influence of the pretension of tensioned cables on natural frequencies is studied. Based on the results, a genetic algorithm is used to find a set of reasonable pretensions that minimizes structural deformation under a first-natural-frequency constraint. Finally, numerical simulations are presented to analyze the deployable structure under two kinds of constraints. Results show that the shape accuracy and natural frequencies of the deployable structure can be effectively improved by pretension optimization.
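    The pretension optimization step can be illustrated with a toy genetic algorithm. A quadratic surrogate deformation objective and a penalized frequency constraint stand in for the paper's finite element model, so every function and constant below is an assumption made for illustration only:

```python
import random

random.seed(42)

N_CABLES = 4
LOW, HIGH = 0.0, 10.0
TARGET = [2.0, 4.0, 6.0, 8.0]   # hypothetical pretensions giving the desired shape
F_MIN = 0.70                    # first-natural-frequency constraint (toy units)

def deformation(p):
    # surrogate for structural deformation: distance from the ideal pretension set
    return sum((x - t) ** 2 for x, t in zip(p, TARGET))

def frequency(p):
    # toy model: natural frequency rises with total pretension
    return 0.5 + 0.01 * sum(p)

def fitness(p):
    # penalize violations of the frequency constraint
    penalty = 0.0 if frequency(p) >= F_MIN else 1e3 * (F_MIN - frequency(p))
    return deformation(p) + penalty

def clip(x):
    return max(LOW, min(HIGH, x))

# genetic algorithm: elitism, mean crossover, Gaussian mutation
pop = [[random.uniform(LOW, HIGH) for _ in range(N_CABLES)] for _ in range(80)]
for _ in range(150):
    pop.sort(key=fitness)
    nxt = pop[:8]                          # keep the elites unchanged
    while len(nxt) < 80:
        a, b = random.sample(pop[:40], 2)  # parents from the better half
        child = [clip((x + y) / 2 + random.gauss(0, 0.3)) for x, y in zip(a, b)]
        nxt.append(child)
    pop = nxt

best = min(pop, key=fitness)
```

    The penalty term plays the role of the paper's frequency constraint: infeasible pretension sets are heavily disfavoured, so the search settles on low-deformation solutions that still satisfy the frequency bound.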

  13. Simulating mesoscale coastal evolution for decadal coastal management: A new framework integrating multiple, complementary modelling approaches

    NASA Astrophysics Data System (ADS)

    van Maanen, Barend; Nicholls, Robert J.; French, Jon R.; Barkwith, Andrew; Bonaldo, Davide; Burningham, Helene; Brad Murray, A.; Payo, Andres; Sutherland, James; Thornhill, Gillian; Townend, Ian H.; van der Wegen, Mick; Walkden, Mike J. A.

    2016-03-01

    Coastal and shoreline management increasingly needs to consider morphological change occurring at decadal to centennial timescales, especially that related to climate change and sea-level rise. This requires the development of morphological models operating at a mesoscale, defined by time and length scales of the order 10^1 to 10^2 years and 10^1 to 10^2 km. So-called 'reduced complexity' models that represent critical processes at scales not much smaller than the primary scale of interest, and are regulated by capturing the critical feedbacks that govern landform behaviour, are proving effective as a means of exploring emergent coastal behaviour at a landscape scale. Such models tend to be computationally efficient and are thus easily applied within a probabilistic framework. At the same time, reductionist models, built upon a more detailed description of hydrodynamic and sediment transport processes, are capable of application at increasingly broad spatial and temporal scales. More qualitative modelling approaches are also emerging that can guide the development and deployment of quantitative models, and these can be supplemented by varied data-driven modelling approaches that can achieve new explanatory insights from observational datasets. Such disparate approaches have hitherto been pursued largely in isolation by mutually exclusive modelling communities. Brought together, they have the potential to facilitate a step change in our ability to simulate the evolution of coastal morphology at scales that are most relevant to managing erosion and flood risk. Here, we advocate and outline a new integrated modelling framework that deploys coupled mesoscale reduced complexity models, reductionist coastal area models, data-driven approaches, and qualitative conceptual models.
Integration of these heterogeneous approaches gives rise to model compositions that can potentially resolve decadal- to centennial-scale behaviour of diverse coupled open coast, estuary and inner shelf settings. This vision is illustrated through an idealised composition of models for a ~ 70 km stretch of the Suffolk coast, eastern England. A key advantage of model linking is that it allows a wide range of real-world situations to be simulated from a small set of model components. However, this process involves more than just the development of software that allows for flexible model coupling. The compatibility of radically different modelling assumptions remains to be carefully assessed and testing as well as evaluating uncertainties of models in composition are areas that require further attention.

  14. Testing Starshade Manufacturing and Deployment Through NASA's Technology Development for Exoplanet Missions Program

    NASA Astrophysics Data System (ADS)

    Kasdin, N. J.; Shaklan, S.; Lisman, D.; Thomson, M.; Cady, E.; Lo, A.; Macintosh, B.

    2014-01-01

    An external occulter is a satellite employing a large screen, or starshade, that flies in formation with a spaceborne telescope to provide the starlight suppression needed for detecting and characterizing exoplanets. Among the advantages of using an occulter are the broad spectral band available for characterization and the removal of starlight before it enters the observatory, greatly relaxing the requirements on the telescope and instrument. In this poster we report on the results of our two Technology Development for Exoplanet Missions (TDEM) studies. In the first we examined the manufacturability and metrology of starshade petals, successfully constructing a full-size petal from flight-like materials and showing through precise edge shape measurements that an occulter made with petals consistent with the measured accuracy would achieve close to 10^-10 contrast. Our second TDEM tested the deployment precision of a roughly half-scale starshade. We demonstrated the deployment of an existing deployable truss outfitted with four sub-scale petals and a custom-designed central hub. We showed that the system can be deployed multiple times with a repeatable positioning accuracy of the petals better than the requirement of 1.0 mm. The combined results of these two TDEM projects have significantly advanced the readiness level of occulter technology and moved the community closer to a realizable mission.

  15. Coping with the challenges of early disaster response: 24 years of field hospital experience after earthquakes.

    PubMed

    Bar-On, Elhanan; Abargel, Avi; Peleg, Kobi; Kreiss, Yitshak

    2013-10-01

    To propose strategies and recommendations for future planning and deployment of field hospitals after earthquakes by comparing the experience of 4 field hospitals deployed by The Israel Defense Forces (IDF) Medical Corps in Armenia, Turkey, India and Haiti. Quantitative data regarding the earthquakes were collected from published sources; data regarding hospital activity were collected from IDF records; and qualitative information was obtained from structured interviews with key figures involved in the missions. The hospitals started operating between 89 and 262 hours after the earthquakes. Their sizes ranged from 25 to 72 beds, and their personnel numbered between 34 and 100. The number of patients treated varied from 1111 to 2400. The proportion of earthquake-related diagnoses ranged from 28% to 67% (P < .001), with hospitalization rates between 3% and 66% (P < .001) and surgical rates from 1% to 24% (P < .001). In spite of characteristic scenarios and injury patterns after earthquakes, patient caseload and treatment requirements varied widely. The variables affecting the patient profile most significantly were time until deployment, total number of injured, availability of adjacent medical facilities, and possibility of evacuation from the disaster area. When deploying a field hospital in the early phase after an earthquake, a wide variability in patient caseload should be anticipated. Customization is difficult due to the paucity of information. Therefore, early deployment necessitates full logistic self-sufficiency and operational versatility. Also, collaboration with local and international medical teams can greatly enhance treatment capabilities.

  16. ADOPT: Automotive Deployment Options Projection Tool | Transportation

    Science.gov Websites

    ADOPT projects future vehicle sales by combining high-selling powertrains and high-selling vehicle platforms into new model options. The page includes a screenshot of the ADOPT user interface with two simulation scenario options (low tech and high tech).

  17. Designing, Evaluating, and Deploying Automated Scoring Systems with Validity in Mind: Methodological Design Decisions

    ERIC Educational Resources Information Center

    Rupp, André A.

    2018-01-01

    This article discusses critical methodological design decisions for collecting, interpreting, and synthesizing empirical evidence during the design, deployment, and operational quality-control phases for automated scoring systems. The discussion is inspired by work on operational large-scale systems for automated essay scoring but many of the…

  18. Passive seismic monitoring of natural and induced earthquakes: case studies, future directions and socio-economic relevance

    USGS Publications Warehouse

    Bohnhoff, Marco; Dresen, Georg; Ellsworth, William L.; Ito, Hisao; Cloetingh, Sierd; Negendank, Jörg

    2010-01-01

    An important discovery in crustal mechanics has been that the Earth’s crust is commonly stressed close to failure, even in tectonically quiet areas. As a result, small natural or man-made perturbations to the local stress field may trigger earthquakes. To understand these processes, Passive Seismic Monitoring (PSM) with seismometer arrays is a widely used technique that has been successfully applied to study seismicity at different magnitude levels ranging from acoustic emissions generated in the laboratory under controlled conditions, to seismicity induced by hydraulic stimulations in geological reservoirs, and up to great earthquakes occurring along plate boundaries. In all these environments the appropriate deployment of seismic sensors, i.e., directly on the rock sample, at the earth’s surface or in boreholes close to the seismic sources allows for the detection and location of brittle failure processes at sufficiently low magnitude-detection threshold and with adequate spatial resolution for further analysis. One principal aim is to develop an improved understanding of the physical processes occurring at the seismic source and their relationship to the host geologic environment. In this paper we review selected case studies and future directions of PSM efforts across a wide range of scales and environments. These include induced failure within small rock samples, hydrocarbon reservoirs, and natural seismicity at convergent and transform plate boundaries. Each example represents a milestone with regard to bridging the gap between laboratory-scale experiments under controlled boundary conditions and large-scale field studies. The common motivation for all studies is to refine the understanding of how earthquakes nucleate, how they proceed and how they interact in space and time. This is of special relevance at the larger end of the magnitude scale, i.e., for large devastating earthquakes due to their severe socio-economic impact.

  19. Sleep quality of German soldiers before, during and after deployment in Afghanistan-a prospective study.

    PubMed

    Danker-Hopfe, Heidi; Sauter, Cornelia; Kowalski, Jens T; Kropp, Stefan; Ströhle, Andreas; Wesemann, Ulrich; Zimmermann, Peter L

    2017-06-01

    In this prospective study, subjective sleep quality and excessive daytime sleepiness prior to, during and after deployment of German soldiers in Afghanistan were examined. Sleep quality (Pittsburgh Sleep Quality Index; PSQI) and daytime sleepiness (Epworth Sleepiness Scale; ESS) were assessed in 118 soldiers of the German army, who were deployed in Afghanistan for 6 months (deployment group: DG), and in 146 soldiers of a non-deployed control group (CG) at baseline. Results of the longitudinal analysis are reported, based on assessments conducted prior to, during and after the deployment in the DG, and in the CG in parallel. Sleep quality and daytime sleepiness in the DG were already impaired during the predeployment training phase and remained at that level during the deployment phase, which clearly indicates the need for more attention to sleep in young soldiers, even at this early stage. The percentage of impaired sleepers decreased significantly after deployment. Programmes to teach techniques to improve sleep and reduce stress should be implemented prior to deployment to reduce sleep difficulties and excessive daytime sleepiness and subsequent psychiatric disorders. © 2017 European Sleep Research Society.

  20. Beyond Widgets -- Systems Incentive Programs for Utilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Regnier, Cindy; Mathew, Paul; Robinson, Alastair

    Utility incentive programs remain one of the most significant means of deploying commercialized, but underutilized building technologies to scale. However, these programs have been largely limited to component-based products (e.g., lamps, RTUs). While some utilities do provide 'custom' incentive programs with whole building and system level technical assistance, these programs require deeper levels of analysis, resulting in higher program costs. This results in custom programs being restricted to utilities with greater resources, and are typically applied mainly to large or energy-intensive facilities, leaving much of the market without cost effective access and incentives for these solutions. In addition, with increasingly stringent energy codes, cost effective component-based solutions that achieve significant savings are dwindling. Building systems (e.g., integrated façade, HVAC and/or lighting solutions) can deliver higher savings that translate into large sector-wide savings if deployed at the scale of these programs. However, systems application poses a number of challenges – baseline energy use must be defined and measured; the metrics for energy and performance must be defined and tested against; in addition, system savings must be validated under well understood conditions. This paper presents a sample of findings of a project to develop validated utility incentive program packages for three specific integrated building systems, in collaboration with Xcel Energy (CO, MN), ComEd, and a consortium of California Public Owned Utilities (CA POUs) (Northern California Power Agency (NCPA) and the Southern California Public Power Authority (SCPPA)). Furthermore, these program packages consist of system specifications, system performance, M&V protocols, streamlined assessment methods, market assessment and implementation guidance.

  1. Leveraging Oceanic and Surface Intensive Field Campaign Data Sets for Validation and Improvement of Recent Hyperspectral IR Satellite Data Products

    NASA Astrophysics Data System (ADS)

    Joseph, E.; Nalli, N. R.; Oyola, M. I.; Morris, V. R.; Sakai, R.

    2014-12-01

    An overview is given of research to validate and improve the retrieval of environmental data records (EDRs) from recently deployed hyperspectral IR satellite sensors such as the Suomi NPP Cross-track Infrared Microwave Sounder Suite (CrIMSS). The effort centers on several intensive surface field campaigns that are designed or leveraged for EDR validation. These data include ship-based observations of upper-air ozone; pressure, temperature, and relative humidity soundings; aerosol and cloud properties; and sea surface temperature. Similar intensive data from two land-based sites are also utilized. One site, the Howard University Beltsville site, is a single point location but has a comprehensive array of observations over an extended period of time. The other land site, presently being deployed by the University at Albany, is under development with limited upper-air soundings but will have regionally distributed surface-based microwave profiling of temperature and relative humidity on the scale of 10-50 km, along with other standard meteorological observations. Combined, these observations provide data that are unique in their wide range, covering a variety of meteorological conditions and atmospheric compositions over ocean and urban-suburban environments. With the distributed surface sites, the variability of atmospheric conditions is captured concurrently across a regional spatial scale. Specific examples are given of comparisons of correlative moisture and temperature EDRs from the satellite sensors and surface-based observations. An additional example is given of the use of these data to correct sea surface temperature (SST) retrieval biases in the hyperspectral IR satellite observations due to aerosol contamination.

  2. Evaluation of Environmental Effects of Wave Energy Convertor Arrays

    NASA Astrophysics Data System (ADS)

    Jones, C. A.

    2015-12-01

    Stakeholders and regulators in the U.S. are generally uncertain as to the potential environmental impacts posed by deployments of marine and hydrokinetic (MHK) devices, and in particular wave energy conversion (WEC) devices, in coastal waters. The first pilot-scale WEC deployments in the U.S. have had to absorb unsustainable costs and delays associated with permitting to get devices in the water. As such, there is an urgent industry need to streamline the technical activities and processes used to assess potential environmental impacts. To enable regulators and stakeholders to become more comfortable and confident with developing effective MHK environmental assessments, a better understanding of the potential environmental effects induced by arrays of WEC devices is needed. A key challenge in developing this understanding is that the assessment of the WEC effects must come prior to deployment. A typical approach in similar environmental assessments is to use numerical models to simulate the WEC devices and array layouts so that the appropriate environmental stressors and receptors can be identified and assessed. Sandia National Laboratories (SNL) and the U.S. Department of Energy are fulfilling the industry-wide need to develop "WEC-friendly" open-source numerical modeling tools capable of assessing potential changes to the physical environment caused by the operation of WEC arrays. Studies using these tools will advance the nation's general knowledge of the interrelationships among the number, size, efficiency, and configuration of MHK arrays and the subsequent effects these relationships may have on the deployment environment. By better understanding these relationships, industry, stakeholders, and regulators will be able to work together to optimize WEC deployments such that environmental impacts are minimized while power output is maximized. 
The present work outlines the initial effort in coupling the SNL WEC-friendly tools with the environmental assessment process. The development of the initial phases of a WEC case study in the offshore waters of Newport, Oregon will be presented. Examples of the quantitative evaluation of changes to important parameters that may constitute environmental stressors will also be presented.

  3. Toward Robust Climate Baselining: Objective Assessment of Climate Change Using Widely Distributed Miniaturized Sensors for Accurate World-Wide Geophysical Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teller, E; Leith, C; Canavan, G

    A gap-free, world-wide, ocean-, atmosphere-, and land-surface-spanning geophysical data set of three decades' duration, containing the full set of geophysical parameters characterizing global weather, is the scientific prerequisite for defining the climate; the generally accepted definition in the meteorological community is that climate is the 30-year running average of weather. Until such a tridecadal climate baseline exists, climate change discussions inevitably will have a semi-speculative, rather than purely scientific, character, as the baseline against which changes are referenced will be at least somewhat uncertain. The contemporary technology base provides the ways and means for commencing the development of such a measurement-intensive climate baseline, moreover with a program budget far less than the approximately $2.5B/year that the U.S. currently spends on "global change" studies. In particular, the recent advent of satellite-based global telephony enables real-time control of, and data return from, instrument packages of very modest scale, and Silicon Revolution-based advances in sensors, data processing, and storage permit 'intelligent' data-gathering payloads to be created with 10-gram-scale mass budgets. A geophysical measurement system implemented in such modern technology is a populous constellation of long-lived, highly miniaturized robotic weather stations deployed throughout the weather-generating portions of the Earth's atmosphere, throughout its oceans, and across its land surfaces. Leveraging these technological advances, the fully developed atmospheric weather station of this system has a projected weight of the order of 1 ounce, contains a satellite telephone, a GPS receiver, a full set of atmospheric sensing instruments, and a control computer - and has an operational life of the order of 1 year and a mass-production cost of the order of $20.
Such stations are effectively "intra-atmospheric satellites" but likely have serial-production unit costs only about twenty-billionths that of a contemporary NASA global-change satellite, whose entirely remote sensing capabilities they complement with entirely local sensing. It is thus feasible to deploy millions of them, and thereby to intensively monitor all aspects of the Earth's weather. Analogs of these atmospheric weather stations will be employed to provide comparable-quality reporting of the oceanic and land-surface geophysical parameters affecting weather. This definitive climate-baselining system could be in initial-prototype operation on a one-year time scale, and in intermediate-scale, proof-of-principle operation within three years, at a total cost of approximately $95M. Steady-state operating costs are estimated to be approximately $75M/year, or about 3% of the current U.S. "global change" program cost. Its data return would be of great value very quickly as simply the best weather information, and within a few years as the definitive climatic-variability reporting system. It would become the generator of a definitive climate baseline at a total present-value cost of approximately $0.9B.
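The cost figures quoted above are internally consistent, as a quick back-of-envelope check shows (all inputs below are numbers taken from the abstract itself):

```python
# Back-of-envelope check of the cost figures quoted in the abstract.
unit_cost = 20.0                    # $ per mass-produced weather station
cost_ratio = 20e-9                  # "twenty-billionths" of a satellite's cost

# Implied cost of one contemporary global-change satellite:
satellite_cost = unit_cost / cost_ratio
print(f"Implied satellite cost: ${satellite_cost/1e9:.0f}B")   # → $1B

# Production cost of a million-station constellation:
n_stations = 1_000_000
production_cost = n_stations * unit_cost
print(f"Production cost of {n_stations:,} stations: ${production_cost/1e6:.0f}M")

# Claimed steady-state operating cost vs. the quoted U.S. program budget:
operating = 75e6
program = 2.5e9
print(f"Operating cost as share of program: {operating/program:.0%}")  # → 3%
```

The implied $1B satellite and the 3% operating share both match the abstract's claims; the $20M production cost for a million stations sits comfortably inside the quoted $95M total for proof-of-principle operation.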

  4. Securing Mobile Networks in an Operational Setting

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.; Stewart, David H.; Bell, Terry L.; Paulsen, Phillip E.; Shell, Dan

    2004-01-01

    This paper describes a network demonstration and a three-month field trial of mobile networking using Mobile IPv4. The network was implemented as part of the US Coast Guard operational network, which is a ".mil" network and requires stringent levels of security. The initial demonstrations took place in November 2002, and a three-month field trial took place from July through September of 2003. The mobile network utilized encryptors capable of NSA-approved Type 1 algorithms, a mobile router from Cisco Systems, and 802.11 and satellite wireless links. This paper also describes a conceptual architecture for wide-scale deployment of secure mobile networking in operational environments where both private and public infrastructure are used. Additional issues presented include link costs, placement of encryptors, and running routing protocols over layer-3 encryption devices.

  5. Urban seismology - Northridge aftershocks recorded by multi-scale arrays of portable digital seismographs

    USGS Publications Warehouse

    Meremonte, M.; Frankel, A.; Cranswick, E.; Carver, D.; Worley, D.

    1996-01-01

    We deployed portable digital seismographs in the San Fernando Valley (SFV), the Los Angeles basin (LAB), and surrounding hills to record aftershocks of the 17 January 1994 Northridge, California, earthquake. The purpose of the deployment was to investigate factors relevant to seismic zonation in urban areas, such as site amplification, sedimentary basin effects, and the variability of ground motion over short baselines. We placed seismographs at 47 sites (not all concurrently) and recorded about 290 earthquakes with magnitudes up to 5.1 at five or more stations. We deployed widely spaced stations for profiles across the San Fernando Valley, as well as five dense arrays (apertures of 200 to 500 m) in areas of high damage, such as the collapsed Interstate 10 overpass, Sherman Oaks, and the collapsed parking garage at CalState Northridge. Aftershock data analysis indicates a correlation of site amplification with mainshock damage. We found several cases where the site amplification depended on the azimuth of the aftershock, possibly indicating focusing from basin structures. For the parking garage array, we found large ground-motion variabilities (a factor of 2) over 200-m distances for sites on the same mapped soil unit. Array analysis of the aftershock seismograms demonstrates that sizable arrivals after the direct S waves consist of surface waves traveling from the same azimuth as that of the epicenter. These surface waves increase the duration of motions and can have frequencies as high as about 4 Hz. For the events studied here, we do not observe large arrivals reflected from the southern edge of the San Fernando Valley.

  6. Estimation of velocity structure around a natural gas reservoir at Yufutsu, Japan, by microtremor survey

    NASA Astrophysics Data System (ADS)

    Shiraishi, H.; Asanuma, H.; Tezuka, K.

    2010-12-01

    Seismic reflection surveys are commonly used for exploration and time-lapse monitoring of oil/gas resources, and typically offer reliability and resolution adequate for commercial production. However, cost considerations sometimes preclude deployment of widely distributed arrays or repeated surveys for time-lapse monitoring or exploration of small-scale reservoirs. Technologies that can estimate structures and physical properties around a reservoir at limited cost would therefore be valuable. The microtremor survey method (MSM) can realize long-term, low-cost monitoring of a reservoir, because the technique is passive and requires as few as four monitoring stations. MSM has mainly been used for earthquake disaster prevention, because the S-wave velocity structure is estimated directly from the velocity dispersion of Rayleigh waves. The authors experimentally investigated the feasibility of MSM for exploration of oil/gas reservoirs. A field measurement was carried out around the natural gas reservoir at Yufutsu, Hokkaido, Japan. Four types of arrays, with radii of 30 m, 100 m, 300 m, and 600 m, were deployed in each area. Dispersion curves of the Rayleigh-wave velocity were estimated from the observed microtremors, and S-wave velocity structures were estimated by an inverse analysis of the dispersion curves with a genetic algorithm (GA). The estimated velocity structures showed good consistency with the one-dimensional velocity structure from previous reflection surveys down to 4-5 km. We also found from the field experiment that a 40-min record is sufficient to estimate the velocity structure even when the seismometers are deployed along roads with heavy traffic.
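The GA-based dispersion inversion described above can be illustrated with a deliberately simplified sketch. The forward model below uses the rough rule of thumb that Rayleigh-wave phase velocity at wavelength λ samples the S-wave velocity near depth λ/3, and the GA is replaced by a plain random search; the two-layer model, parameter ranges, and wavelengths are illustrative assumptions, not values from the Yufutsu survey.

```python
import random

# Toy inversion of a Rayleigh-wave dispersion curve for a two-layer S-wave
# model. Forward model: phase velocity at wavelength lam samples Vs near
# depth lam/3. Random search stands in for the genetic algorithm.

def forward(h, vs1, vs2, wavelengths):
    """Crude predicted phase velocities (m/s) for layer thickness h (m),
    upper/lower S-wave velocities vs1/vs2 (m/s)."""
    return [vs1 if lam / 3.0 < h else vs2 for lam in wavelengths]

def misfit(model, observed, wavelengths):
    """Sum-of-squares misfit between predicted and observed velocities."""
    pred = forward(*model, wavelengths)
    return sum((p - o) ** 2 for p, o in zip(pred, observed))

rng = random.Random(0)
wavelengths = [30, 60, 120, 240, 480, 960]          # m
true_model = (100.0, 400.0, 1500.0)                 # h, vs1, vs2
observed = forward(*true_model, wavelengths)        # synthetic "data"

# Random search: draw candidate models, keep the best-fitting one.
best = min(
    ((rng.uniform(20, 300), rng.uniform(200, 800), rng.uniform(800, 3000))
     for _ in range(5000)),
    key=lambda m: misfit(m, observed, wavelengths),
)
print("best model:", [round(x) for x in best])
```

A real GA would add selection, crossover, and mutation over generations, and the forward model would be a proper Rayleigh-wave dispersion solver, but the structure of the inverse problem — minimize dispersion-curve misfit over layered Vs models — is the same.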

  7. Using a Multibeam Echosounder to Monitor AN Artificial Reef

    NASA Astrophysics Data System (ADS)

    Tassetti, A. N.; Malaspina, S.; Fabi, G.

    2015-04-01

    Artificial reefs (ARs) have become popular technological interventions in shallow-water environments characterized by soft seabeds, serving a wide range of purposes from fisheries/environmental protection and enhancement to research and tourism. AR deployment has the potential to cause significant hydrographical and biological changes in the receiving environments and, in turn, ARs are strongly affected by the surrounding area in terms of spatial arrangement and structural integrity, as well as colonization by benthic communities and finfish. In this context, ARs require a systematic monitoring program that a multibeam echosounder (MBES) can provide better than other sampling methods such as visual dives and ROV inspections, which are not quantitative and are often influenced by water visibility and diver experience/skills. In this paper, several successive MBES surveys of the Senigallia scientifically planned AR (Northern Adriatic Sea) are presented, and state-of-the-art data processing and visualization techniques are used to draw post-deployment comparisons and quantify the evolution of the reef in terms of spatial arrangement and bulk volume. These multibeam surveys play a leading part in a general multi-year program, started simultaneously with the AR design and deployment and aimed at mapping how the reef structure quantitatively changes over time, as well as how it affects the sea-bottom morphology and the fishery resource. All the data, surveyed over the years using different sampling methods such as visual and instrumental echosounding observations and catch-rate surveys, provide a mechanistic and predictive understanding of how the Senigallia AR functions ecologically and physically across spatial and temporal scales during its design life.

  8. On-chip infrared sensors: redefining the benefits of scaling

    NASA Astrophysics Data System (ADS)

    Kita, Derek; Lin, Hongtao; Agarwal, Anu; Yadav, Anupama; Richardson, Kathleen; Luzinov, Igor; Gu, Tian; Hu, Juejun

    2017-03-01

    Infrared (IR) spectroscopy is widely recognized as a gold-standard technique for chemical and biological analysis. Traditional IR spectroscopy relies on fragile bench-top instruments located in dedicated laboratory settings, and is thus not suitable for emerging field-deployed applications such as in-line industrial process control, environmental monitoring, and point-of-care diagnosis. Recent strides in photonic integration technologies provide a promising route towards miniaturized, rugged platforms for IR spectroscopic analysis. It is therefore tempting to simply replace the bulky discrete optical elements used in conventional IR spectroscopy with their on-chip counterparts. This size down-scaling approach, however, cripples system performance, as both the sensitivity of spectroscopic sensors and the spectral resolution of spectrometers scale with optical path length. In light of this challenge, we discuss two novel photonic device designs uniquely capable of reaping performance benefits from microphotonic scaling. We leverage strong optical and thermal confinement in judiciously designed micro-cavities to circumvent the thermal diffusion and optical diffraction limits in conventional photothermal sensors and achieve a record 10^4 photothermal sensitivity enhancement. In the second example, an on-chip spectrometer design with Fellgett's advantage is analyzed. The design enables sub-nm spectral resolution on a millimeter-sized, fully packaged chip without moving parts.
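The claim that spectral resolution scales with optical path length can be checked against the textbook Fourier-transform relation δν ≈ 1/(2·OPDmax), where OPDmax is the maximum optical path difference; the path length and wavelength below are illustrative assumptions, not figures from the paper.

```python
# Back-of-envelope check of path-length scaling for a Fourier-transform
# spectrometer: resolution in wavenumbers is roughly 1 / (2 * max OPD).
opd_cm = 0.1                  # 1 mm maximum optical path difference
dnu = 1.0 / (2.0 * opd_cm)    # resolution in wavenumbers, cm^-1

wavelength_cm = 1.55e-4       # 1550 nm probe wavelength
dlam_nm = wavelength_cm**2 * dnu * 1e7   # convert delta-lambda from cm to nm
print(f"~{dlam_nm:.1f} nm resolution from a {opd_cm*10:.0f} mm path")  # ~1.2 nm
```

A millimeter-scale path difference thus yields roughly nanometer-scale resolution near 1550 nm, which is consistent with the order of magnitude of the sub-nm, millimeter-chip claim above.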

  9. Weatherization and Intergovernmental Programs Office FY 2017 Budget At-A-Glance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2016-03-01

    The Weatherization and Intergovernmental Programs (WIP) Office is part of EERE’s balanced research, development, demonstration, and deployment approach to accelerate America’s transition to a clean energy economy. WIP’s mission is to partner with state and local organizations to improve energy security and to significantly accelerate the deployment of clean energy technologies and practices by a wide range of government, community, and business stakeholders.

  10. fastBMA: scalable network inference and transitive reduction.

    PubMed

    Hung, Ling-Hong; Shi, Kaiyuan; Wu, Migao; Young, William Chad; Raftery, Adrian E; Yeung, Ka Yee

    2017-10-01

    Inferring genetic networks from genome-wide expression data is extremely demanding computationally. We have developed fastBMA, a distributed, parallel, and scalable implementation of Bayesian model averaging (BMA) for this purpose. fastBMA also includes a computationally efficient module for eliminating redundant indirect edges in the network by mapping the transitive reduction to an easily solved shortest-path problem. We evaluated the performance of fastBMA on synthetic data and on experimental genome-wide time series yeast and human datasets. When using a single CPU core, fastBMA is up to 100 times faster than the next fastest method, LASSO, with increased accuracy. It is a memory-efficient, parallel, and distributed application that scales to human genome-wide expression data. A 10,000-gene regulation network can be obtained in a matter of hours using a 32-core cloud cluster (2 nodes of 16 cores). fastBMA is a significant improvement over its predecessor ScanBMA. It is more accurate and orders of magnitude faster than other fast network inference methods such as the one based on LASSO. The improved scalability allows it to calculate networks from genome-scale data in a reasonable time frame. The transitive reduction method can improve accuracy in denser networks. fastBMA is available as code (M.I.T. license) from GitHub (https://github.com/lhhunghimself/fastBMA), as part of the updated networkBMA Bioconductor package (https://www.bioconductor.org/packages/release/bioc/html/networkBMA.html) and as ready-to-deploy Docker images (https://hub.docker.com/r/biodepot/fastbma/). © The Authors 2017. Published by Oxford University Press.
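The transitive-reduction step can be illustrated with a minimal unweighted sketch: an edge u→v is redundant when v remains reachable from u without using that edge. This is a toy stand-in for fastBMA's formulation, which instead solves a weighted shortest-path problem over the inferred network.

```python
from collections import deque

def transitive_reduction(edges):
    """Drop edge (u, v) when v is reachable from u without that edge.
    Assumes the edge list describes a DAG, as in a regulatory network."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)

    def reachable(src, dst, skip):
        # BFS from src to dst, ignoring the single edge `skip`.
        seen, queue = {src}, deque([src])
        while queue:
            node = queue.popleft()
            for nxt in adj.get(node, ()):
                if (node, nxt) == skip or nxt in seen:
                    continue
                if nxt == dst:
                    return True
                seen.add(nxt)
                queue.append(nxt)
        return False

    return [(u, v) for u, v in edges if not reachable(u, v, (u, v))]

# a→c duplicates the indirect path a→b→c, so it is dropped.
print(transitive_reduction([("a", "b"), ("b", "c"), ("a", "c")]))
# → [('a', 'b'), ('b', 'c')]
```

fastBMA's version additionally weighs the direct edge against the cost of the best indirect path, so only edges that are both indirect and weakly supported are removed.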

  11. Wind Turbines as Landscape Impediments to the Migratory Connectivity of Bats

    USGS Publications Warehouse

    Cryan, Paul M.

    2011-01-01

    Unprecedented numbers of migratory bats are found dead beneath industrial-scale wind turbines during late summer and autumn in both North America and Europe. Prior to the wide-scale deployment of wind turbines, fatal collisions of migratory bats with anthropogenic structures were rarely reported and likely occurred very infrequently. There are no other well-documented threats to populations of migratory tree bats that cause mortality of similar magnitude to that observed at wind turbines. Just three migratory species comprise the vast majority of bat kills at turbines in North America and there are indications that turbines may actually attract migrating individuals toward their blades. Although fatality of certain migratory species is consistent in occurrence across large geographic regions, fatality rates differ across sites for reasons mostly unknown. Cumulative fatality for turbines in North America might already range into the hundreds of thousands of bats per year. Research into the causes of bat fatalities at wind turbines can ascertain the scale of the problem and help identify solutions. None of the migratory bats known to be most affected by wind turbines are protected by conservation laws, nor is there a legal mandate driving research into the problem or implementation of potential solutions.

  12. Using archived ITS data to measure the operational benefits of a system-wide adaptive ramp metering system.

    DOT National Transportation Integrated Search

    2008-12-01

    A System-Wide Adaptive Ramp Metering (SWARM) system has been implemented in the Portland, Oregon metropolitan area, replacing the previous pre-timed ramp-metering system that had been in operation since 1981. SWARM has been deployed on six major corr...

  13. Prevalence of mental health symptoms in Dutch military personnel returning from deployment to Afghanistan: a 2-year longitudinal analysis.

    PubMed

    Reijnen, A; Rademaker, A R; Vermetten, E; Geuze, E

    2015-02-01

    Recent studies in troops deployed to Iraq and Afghanistan have shown that combat exposure and exposure to deployment-related stressors increase the risk for the development of mental health symptoms. The aim of this study is to assess the prevalence of mental health symptoms in a cohort of Dutch military personnel prior to and at multiple time points after deployment. Military personnel (n=994) completed various questionnaires at 5 time points, starting prior to deployment and following the same cohort at 1 and 6 months and 1 and 2 years after their return from Afghanistan. The prevalence of symptoms of fatigue, PTSD, hostility, depression, and anxiety was found to increase significantly after deployment compared with pre-deployment rates. As opposed to depressive symptoms and fatigue, the prevalence of PTSD was found to decrease after the 6-month assessment. The prevalence of sleeping problems and hostility remained relatively stable. The prevalence of mental health symptoms in military personnel thus increases after deployment; however, symptom progression over time appears to be specific to the various mental health symptoms. Comprehensive screening and monitoring for a wide range of mental health symptoms at multiple time points after deployment is essential for early detection and to provide opportunities for intervention. This project was funded by the Dutch Ministry of Defence. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  14. Gender differences in the risk and protective factors associated with PTSD: a prospective study of National Guard troops deployed to Iraq.

    PubMed

    Kline, Anna; Ciccone, Donald S; Weiner, Marc; Interian, Alejandro; St Hill, Lauren; Falca-Dodson, Maria; Black, Christopher M; Losonczy, Miklos

    2013-01-01

    This study examines gender differences in post-traumatic stress symptoms (PTSS) and PTSS risk/protective factors among soldiers deployed to Iraq. We pay special attention to two potentially modifiable military factors, military preparedness and unit cohesion, which may buffer the deleterious psychological effects of combat. Longitudinal data were collected on 922 New Jersey National Guard soldiers (91 women) deployed to Iraq in 2008. Anonymous surveys administered at pre- and post-deployment included the PTSD Checklist (PCL), the Unit Support Scale, and a preparedness scale adapted from the Iowa Gulf War Study. Bivariate analyses and hierarchical multiple regression were used to identify predictors of PTSS and their explanatory effects on the relationship between gender and PTSS. Women had a higher prevalence of probable post-deployment PTSD than men (18.7% vs. 8.7%; OR = 2.45; CI [1.37, 4.37]) and significantly higher post-deployment PTSS (33.73 vs. 27.37; p = .001). While there were no gender differences in combat exposure, women scored higher on pre-deployment PTSS (26.9 vs. 23.1; p ≤ .001) and lower on military preparedness (1.65 vs. 2.41; p ≤ .001) and unit cohesion (32.5 vs. 38.1; p ≤ .001). In a multivariate model, controlling for all PTSS risk/resilience factors reduced the gender difference as measured by the unstandardized Beta (B) by 45%, with 18% uniquely attributable to low cohesion and low preparedness. In the fully controlled model, gender remained a significant predictor of PTSS but the effect size was small (d = .26). Modifiable military institutional factors may account for much of the increased vulnerability of women soldiers to PTSD.
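The reported odds ratio can be roughly reproduced from the prevalences quoted above; the computation below uses only the rounded percentages, which is presumably why it lands slightly below the published OR of 2.45 (computed from the raw counts).

```python
# Recompute the gender odds ratio from the quoted prevalences of
# probable post-deployment PTSD: 18.7% in women vs. 8.7% in men.
p_women, p_men = 0.187, 0.087

odds_women = p_women / (1 - p_women)   # odds = p / (1 - p)
odds_men = p_men / (1 - p_men)
odds_ratio = odds_women / odds_men
print(f"OR ≈ {odds_ratio:.2f}")        # close to the reported 2.45
```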

  15. Sheath-Based Rollable Lenticular-Shaped and Low-Stiction Composite Boom

    NASA Technical Reports Server (NTRS)

    Fernandez, Juan M. (Inventor)

    2018-01-01

    Various embodiments provide rollable and deployable composite booms that may be used in a wide range of applications for both space and terrestrial structural solutions. Various embodiment composite booms may be bistable, i.e., having a stable strain-energy minimum in the coiled configuration as well as in the deployed configuration. In various embodiments, a boom may be fabricated by aligning two independent tape-springs front-to-front, encircled by a durable seamless polymer sleeve. The durable seamless polymer sleeve may allow the two tape-springs to slide past each other during the coiling/deployment process so as to reduce, e.g., minimize, shear and its derived problems.

  16. Assessment of the trade-offs and synergies between low-carbon power sector transition and land and water resources of the United Kingdom using the "ForeseerTM" approach

    NASA Astrophysics Data System (ADS)

    Konadu, D. D.; Sobral Mourao, Z.

    2016-12-01

    Transitioning to a low-carbon power system has been identified as one of the main strategies for achieving the GHG emissions reduction targets stipulated in the UK Climate Change Act (2008). However, the projected mix of technologies aimed at achieving the targeted level of decarbonisation has implications for sustainable levels of natural resource exploitation at different spatial and temporal scales. Critical among these are the impacts on land use (food production) and water resources, which are usually not adequately analysed and accounted for in developing long-term energy system transition strategies and scenarios. Given the importance of the UK power sector to meeting economy-wide emissions targets, the overall environmental consequences of the prescribed scenarios could significantly affect meeting long-term legislated GHG emission reduction targets. It is therefore imperative that synergies and trade-offs between the power system and these resources are comprehensively analysed. The current study employs an integrated energy and resource use accounting methodology, called ForeseerTM, to assess the land and water requirements for deploying the power sector technologies of the UK Committee on Climate Change (CCC) Carbon Budget scenarios. This is analysed under different scenarios of energy crop yield and electricity infrastructure location. The outputs are then compared with sustainable limits of resource exploitation to establish the environmental tractability of the scenarios. The results show that even if stringent environmental and land use restrictions are applied, all the projected bioenergy and ground-mounted solar PV can be deployed within the UK with no significant impacts on land use and food production. However, inland water resources would be significantly affected under high Carbon Capture and Storage (CCS) deployment without new nuclear capacity.
Overall, the output highlights that, contrary to the notion that CCS deployment is inevitable for delivering emissions reduction targets, a future without CCS poses the least overall environmental impact.

  17. Global economic consequences of deploying bioenergy with carbon capture and storage (BECCS)

    NASA Astrophysics Data System (ADS)

    Muratori, Matteo; Calvin, Katherine; Wise, Marshall; Kyle, Page; Edmonds, Jae

    2016-09-01

    Bioenergy with carbon capture and storage (BECCS) is considered a potential source of net negative carbon emissions and, if deployed at sufficient scale, could help reduce carbon dioxide emissions and concentrations. However, the viability and economic consequences of large-scale BECCS deployment are not fully understood. We use the Global Change Assessment Model (GCAM) integrated assessment model to explore the potential global and regional economic impacts of BECCS. As a negative-emissions technology, BECCS would entail a net subsidy in a policy environment in which carbon emissions are taxed. We show that by mid-century, in a world committed to limiting climate change to 2 °C, carbon tax revenues have peaked and are rapidly approaching the point where climate mitigation is a net burden on general tax revenues. Assuming that the required policy instruments are available to support BECCS deployment, we consider its effects on global trade patterns of fossil fuels, biomass, and agricultural products. We find that in a world committed to limiting climate change to 2 °C, the absence of CCS harms fossil-fuel exporting regions, while the presence of CCS, and BECCS in particular, allows greater continued use and export of fossil fuels. We also explore the relationship between carbon prices, food-crop prices and use of BECCS. We show that the carbon price and biomass and food crop prices are directly related. We also show that BECCS reduces the upward pressure on food crop prices by lowering carbon prices and lowering the total biomass demand in climate change mitigation scenarios. All of this notwithstanding, many challenges, both technical and institutional, remain to be addressed before BECCS can be deployed at scale.

  18. Development and Deployment of a Portable Water Isotope Analyzer for Accurate, Continuous and High-Frequency Oxygen and Hydrogen Isotope Measurements in Water Vapor and Liquid Water

    NASA Astrophysics Data System (ADS)

    Dong, Feng; Baer, Douglas

    2010-05-01

    Stable isotopes of water in liquid and vapor samples are powerful tracers to investigate the hydrological cycle and ecological processes. Therefore, continuous, in-situ, and accurate measurements of δ18O and δ2H are critical to advance the understanding of water cycle dynamics around the globe. Furthermore, the combination of meteorological techniques and high-frequency isotopic water measurements can provide detailed time-resolved information on the eco-physiological performance of plants and enable improved understanding of water fluxes at ecosystem scales. In this work, we present recent laboratory development and field deployment of a novel Water Vapor Isotope Analyzer (WVIA), based on cavity-enhanced laser absorption spectroscopy, capable of simultaneous in-situ measurements of δ18O, δ2H, and water mixing ratio with high precision and high frequency (up to a 10 Hz measurement rate). In addition, to ensure the accuracy of the water vapor isotope measurements, a novel Water Vapor Isotope Standard Source (WVISS), based on the instantaneous evaporation of micro-droplets of liquid water (with known isotope composition), has been developed to provide reference water vapor with a widely adjustable mixing ratio (500-30,000 ppmv) for real-time calibration of the WVIA. The comprehensive system that includes the WVIA and WVISS has been validated in extensive laboratory and field studies to be insensitive to ambient temperature changes (5-40 °C) and to changes in water mixing ratio over a wide range of mixing ratios. In addition, by operating in the dual-inlet mode, measurement drift has essentially been eliminated. The system (WVIA+WVISS) has also been deployed for long-term unattended continuous measurements in the field. In addition to water vapor isotope measurements, the WVISS may be combined with the WVIA to provide continuous isotopic measurements of liquid water samples at a rapid data rate.
The availability of these new field instruments provides new opportunities for detailed continuous measurements of the hydrological cycle and ecological systems.
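    The δ notation used in this record follows the standard per mil definition relative to a reference standard. A minimal sketch, assuming VSMOW as the reference (its 18O/16O ratio is a standard value; the sample ratio below is purely illustrative):

```python
# Standard delta notation for water isotopes, in per mil relative to a reference.
# The VSMOW 18O/16O ratio is a published standard value; the sample ratio is
# illustrative, not data from this instrument.
R_VSMOW_18O = 2005.20e-6  # 18O/16O isotope ratio of Vienna Standard Mean Ocean Water

def delta_per_mil(r_sample: float, r_standard: float) -> float:
    """delta = (R_sample / R_standard - 1) * 1000, in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

r_sample = 1985.1e-6  # hypothetical sample, depleted in 18O
d18O = delta_per_mil(r_sample, R_VSMOW_18O)
print(round(d18O, 2))  # negative: depleted relative to VSMOW
```

A calibration source such as the WVISS works by supplying vapor whose δ values are known, so that measured ratios can be anchored to this scale.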

  19. Knowledge Creation and Deployment in the Small, but Growing, Enterprise and the Psychological Contract

    ERIC Educational Resources Information Center

    Leach, Tony

    2010-01-01

    This paper contains an account of a small-scale investigation into the usefulness of the concepts of the learning organisation and organisational learning when seeking to describe the processes of knowledge creation and deployment within the small, but growing, enterprise (SME). A review of the literature reveals a concern that the relationship…

  20. Integrating personalized medical test contents with XML and XSL-FO.

    PubMed

    Toddenroth, Dennis; Dugas, Martin; Frankewitsch, Thomas

    2011-03-01

    In 2004 the adoption of a modular curriculum at the medical faculty in Muenster led to the introduction of centralized examinations based on multiple-choice questions (MCQs). We report on how the organizational challenges of realizing faculty-wide personalized tests were addressed by implementing a specialized software module to automatically generate test sheets from individual test registrations and MCQ contents. The key steps of the presented method for preparing personalized test sheets are (1) the compilation of relevant item contents and graphical media from a relational database with database queries, (2) the creation of Extensible Markup Language (XML) intermediates, and (3) the transformation into paginated documents. Using an open source print formatter, the software module consistently produced high-quality test sheets, while the blending of vectorized textual contents and pixel graphics resulted in efficient output file sizes. The module also permitted individual randomization of item sequences to prevent illicit collusion. The automatic generation of personalized MCQ test sheets is feasible using freely available open source software libraries, and can be efficiently deployed on a faculty-wide scale.
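    The XML-intermediate step of such a pipeline can be sketched as follows. This is an illustrative reconstruction, not the Muenster module itself; the element names, item data, and randomization seed are assumptions. The resulting XML would then be paginated by an XSL-FO print formatter such as Apache FOP.

```python
# Sketch: compile MCQ items into an XML intermediate for one personalized
# test sheet, with a per-student randomized item sequence to deter collusion.
# Element names and item data are hypothetical.
import random
import xml.etree.ElementTree as ET

def build_test_sheet(student_id: str, items: list, seed: int) -> ET.Element:
    """Build the XML intermediate for one personalized test sheet."""
    rng = random.Random(seed)          # per-student seed -> reproducible order
    order = list(range(len(items)))
    rng.shuffle(order)                 # individual randomization of item sequence
    sheet = ET.Element("testsheet", attrib={"student": student_id})
    for pos, idx in enumerate(order, start=1):
        item = ET.SubElement(sheet, "item", attrib={"pos": str(pos)})
        ET.SubElement(item, "stem").text = items[idx]["stem"]
        for choice in items[idx]["choices"]:
            ET.SubElement(item, "choice").text = choice
    return sheet

items = [{"stem": "Q1?", "choices": ["a", "b"]},
         {"stem": "Q2?", "choices": ["c", "d"]}]
xml_text = ET.tostring(build_test_sheet("s001", items, seed=42), encoding="unicode")
print(xml_text[:30])
```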

  1. Swarm Deployable Boom Assembly (DBA): Development of a Deployable Magnetometer Boom for the Swarm Spacecraft

    NASA Astrophysics Data System (ADS)

    McMahon, Paul; Jung, Hans-Juergen; Edwards, Jeff

    2013-09-01

    The Swarm programme consists of 3 magnetically clean satellites flying in close formation, designed to measure the Earth's magnetic field using 2 magnetometers mounted on a 4.3 m long deployable boom. Deployment is initiated by releasing 3 HDRMs; once released, the boom oscillates back and forth on a pair of pivots, similar to a restaurant kitchen door hinge, for around 120 seconds before coming to rest on 3 kinematic mounts that provide an accurate reference location in the deployed position. Motion of the boom is damped through a combination of friction, spring hysteresis and flexing of the 120+ cables crossing the hinge. Considerable development work and accurate numerical modelling of the hinge motion were required to predict performance across a wide temperature range and to ensure that during the first overshoot the boom did not damage itself, the harness or the spacecraft. Due to the magnetic cleanliness requirements of the spacecraft, no magnetic materials could be used in the design of the hardware.

  2. Methodology for fleet deployment decisions. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stremel, J.; Matousek, M.

    1995-01-01

    In today's more competitive energy market, selecting investment and operating plans for a generating system, specific plants, and major plant components is becoming increasingly critical and complex. As utilities consider off-system sales, the key factor for fleet deployment decisions is no longer simply minimizing revenue requirements. Rather, system-level value dominates. This is a measure that can be difficult to determine in the context of traditional decision-making methods. Selecting the best fleet deployment option requires the ability to account for multiple sources of value under uncertain conditions for multiple utility stakeholders. The objective of this paper was to develop and test an approach for assessing the system-wide value of alternative fleet deployment decisions. This was done, and the approach was tested at Consolidated Edison and at Central Illinois Public Service Company.

  3. Coverage-guaranteed sensor node deployment strategies for wireless sensor networks.

    PubMed

    Fan, Gaojuan; Wang, Ruchuan; Huang, Haiping; Sun, Lijuan; Sha, Chao

    2010-01-01

    Deployment quality and cost are two conflicting aspects in wireless sensor networks. Random deployment, where the monitored field is covered by randomly and uniformly deployed sensor nodes, is an appropriate approach for large-scale network applications. However, the success of such applications depends considerably on deployment quality, that is, using the minimum number of sensors to achieve a desired coverage. Currently, the number of sensors required to meet the desired coverage is determined by asymptotic analysis, which cannot guarantee the desired deployment quality due to coverage overestimation in real applications. In this paper, we first investigate this coverage overestimation and address the challenge of designing coverage-guaranteed deployment strategies. To overcome this problem, we propose two deployment strategies, namely, Expected-area Coverage Deployment (ECD) and BOundary Assistant Deployment (BOAD). The deployment quality of the two strategies is analyzed mathematically, and under this analysis a lower bound on the number of deployed sensor nodes is given that satisfies the desired deployment quality. We justify the correctness of our analysis through rigorous proof, and validate the effectiveness of the two strategies through extensive simulation experiments. The simulation results show that both strategies alleviate coverage overestimation significantly. In addition, we evaluate the two proposed strategies in the context of a target detection application. The comparison results demonstrate that if the target appears at the boundary of the monitored region in a given random deployment, the average intrusion distance of BOAD is considerably shorter than that of ECD with the same desired deployment quality. In contrast, ECD performs better in terms of average intrusion distance when the intruder enters from the interior of the monitored region.
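    The coverage question underlying both strategies can be illustrated with a simple Monte Carlo estimate of the area fraction covered by uniformly random sensors. This sketch is not the paper's ECD or BOAD analysis; the field size, radius, and sensor counts are illustrative assumptions.

```python
# Monte Carlo estimate of expected coverage: fraction of a unit square within
# sensing radius r of at least one of n uniformly random sensors.
# All parameter values are illustrative, not taken from the paper.
import random

def estimate_coverage(n_sensors: int, radius: float,
                      trials: int = 20000, seed: int = 1) -> float:
    rng = random.Random(seed)
    sensors = [(rng.random(), rng.random()) for _ in range(n_sensors)]
    r2 = radius * radius
    covered = 0
    for _ in range(trials):
        px, py = rng.random(), rng.random()
        # a sample point is covered if any sensor lies within the sensing radius
        if any((px - sx) ** 2 + (py - sy) ** 2 <= r2 for sx, sy in sensors):
            covered += 1
    return covered / trials

# More sensors improve expected coverage; boundary effects (sensing disks
# spilling outside the field) are one source of the overestimation the
# paper discusses.
lo, hi = estimate_coverage(20, 0.1), estimate_coverage(80, 0.1)
print(lo < hi)
```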

  4. Digimarc Discover on Google Glass

    NASA Astrophysics Data System (ADS)

    Rogers, Eliot; Rodriguez, Tony; Lord, John; Alattar, Adnan

    2015-03-01

    This paper reports on the implementation of the Digimarc® Discover platform on Google Glass, enabling the reading of a watermark embedded in printed material or audio. The embedded watermark typically contains a unique code that identifies the containing media or object and a synchronization signal that allows the watermark to be read robustly. The Digimarc Discover smartphone application can read the watermark from a small portion of a printed image presented at any orientation or reasonable distance. Likewise, Discover can read the recently introduced Digimarc Barcode to identify and manage consumer packaged goods in the retail channel. The Digimarc Barcode has several advantages over the traditional barcode and is expected to save the retail industry millions of dollars when deployed at scale. Discover can also read an audio watermark from ambient audio captured using a microphone. The Digimarc Discover platform has been widely deployed on the iPad, iPhone and many Android-based devices, but it has not yet been implemented on a head-worn wearable device, such as Google Glass. Implementing Discover on Google Glass is a challenging task due to the current hardware and software limitations of the device. This paper identifies the challenges encountered in porting Discover to the Google Glass and reports on the solutions created to deliver a prototype implementation.

  5. The stentable in vitro artery: an instrumented platform for endovascular device development and optimization.

    PubMed

    Antoine, Elizabeth E; Cornat, François P; Barakat, Abdul I

    2016-12-01

    Although vascular disease is a leading cause of mortality, in vitro tools for controlled, quantitative studies of vascular biological processes in an environment that reflects physiological complexity remain limited. We developed a novel in vitro artery that exhibits a number of unique features distinguishing it from tissue-engineered or organ-on-a-chip constructs, most notably that it allows deployment of endovascular devices including stents, quantitative real-time tracking of cellular responses and detailed measurement of flow velocity and lumenal shear stress using particle image velocimetry. The wall of the stentable in vitro artery consists of an annular collagen hydrogel containing smooth muscle cells (SMCs) and whose lumenal surface is lined with a monolayer of endothelial cells (ECs). The system has in vivo dimensions and physiological flow conditions and allows automated high-resolution live imaging of both SMCs and ECs. To demonstrate proof-of-concept, we imaged and quantified EC wound healing, SMC motility and altered shear stresses on the endothelium after deployment of a coronary stent. The stentable in vitro artery provides a unique platform suited for a broad array of research applications. Wide-scale adoption of this system promises to enhance our understanding of important biological events affecting endovascular device performance and to reduce dependence on animal studies. © 2016 The Author(s).

  6. Interventional magnetic resonance imaging-guided cell transplantation into the brain with radially branched deployment.

    PubMed

    Silvestrini, Matthew T; Yin, Dali; Martin, Alastair J; Coppes, Valerie G; Mann, Preeti; Larson, Paul S; Starr, Philip A; Zeng, Xianmin; Gupta, Nalin; Panter, S S; Desai, Tejal A; Lim, Daniel A

    2015-01-01

    Intracerebral cell transplantation is being pursued as a treatment for many neurological diseases, and effective cell delivery is critical for clinical success. To facilitate intracerebral cell transplantation at the scale and complexity of the human brain, we developed a platform technology that enables radially branched deployment (RBD) of cells to multiple target locations at variable radial distances and depths along the initial brain penetration tract with real-time interventional magnetic resonance image (iMRI) guidance. iMRI-guided RBD functioned as an "add-on" to standard neurosurgical and imaging workflows, and procedures were performed in a commonly available clinical MRI scanner. Multiple deposits of superparamagnetic iron oxide beads were safely delivered to the striatum of live swine, and distribution to the entire putamen was achieved via a single cannula insertion in human cadaveric heads. Human embryonic stem cell-derived dopaminergic neurons were biocompatible with the iMRI-guided RBD platform and successfully delivered with iMRI guidance into the swine striatum. Thus, iMRI-guided RBD overcomes some of the technical limitations inherent to the use of straight cannulas and standard stereotactic targeting. This platform technology could have a major impact on the clinical translation of a wide range of cell therapeutics for the treatment of many neurological diseases.

  7. Accommodating Thickness in Origami-Based Deployable Arrays

    NASA Technical Reports Server (NTRS)

    Zirbel, Shannon A.; Magleby, Spencer P.; Howell, Larry L.; Lang, Robert J.; Thomson, Mark W.; Sigel, Deborah A.; Walkemeyer, Phillip E.; Trease, Brian P.

    2013-01-01

    The purpose of this work is to create deployment systems with a large ratio of stowed-to-deployed diameter. Deployment from a compact form to a final flat state can be achieved through origami-inspired folding of panels. There are many models capable of this motion when folded in a material with negligible thickness; however, when the application requires the folding of thick, rigid panels, attention must be paid to the effect of material thickness not only on the final folded state, but also during the folding motion (i.e., the panels must not be required to flex to attain the final folded form). The objective is to develop new methods for deployment from a compact folded form to a large circular array (or other final form). This paper describes a mathematical model for modifying the pattern to accommodate material thickness in the context of the design, modeling, and testing of a deployable system inspired by an origami six-sided flasher model. The model is demonstrated in hardware as a 1/20th scale prototype of a deployable solar array for space applications. The resulting prototype has a ratio of stowed-to-deployed diameter of 9.2 (or 1.25 m deployed outer diameter to 0.136 m stowed outer diameter).
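    The quoted stowed-to-deployed ratio can be checked directly from the prototype dimensions given above:

```python
# Checking the quoted stowed-to-deployed diameter ratio of the 1/20th-scale
# flasher prototype, using the dimensions stated in the abstract.
deployed_d = 1.25  # m, deployed outer diameter
stowed_d = 0.136   # m, stowed outer diameter
ratio = deployed_d / stowed_d
print(round(ratio, 1))  # -> 9.2, matching the abstract
```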

  8. Posttraumatic Stress Symptoms Among National Guard Soldiers Deployed to Iraq: Associations with Parenting Behaviors and Couple Adjustment

    PubMed Central

    Gewirtz, Abigail H.; Polusny, Melissa A.; DeGarmo, David S.; Khaylis, Anna; Erbes, Christopher R.

    2011-01-01

    Objective This article reports findings from a one-year longitudinal study examining the impact of change in PTSD symptoms following combat deployment on National Guard soldiers’ perceived parenting, and couple adjustment one year following return from Iraq. Method Participants were 468 Army National Guard fathers from a Brigade Combat Team (mean age 36 years; median deployment length 16 months; 89% European American, 5% African American, 6% Hispanic American). Participants completed an in-theater survey one month before returning home from OIF deployment (Time 1), and again, one year post-deployment (Time 2). The PTSD Checklist-Military Version (PCL-M; Weathers, Litz, Herman, Huska, & Keane, 1993) was gathered at both times, and two items assessing social support were gathered at baseline only. At Time 2, participants also completed self-report measures of parenting (Alabama Parenting Questionnaire—Short Form; Elgar, Waschbusch, Dadds, & Sigvaldason, 2007), couple adjustment (Dyadic Adjustment Scale-7; Sharpley & Rogers, 1984; Spanier, 1976), parent-child relationship quality (4 items from the Social Adjustment Scale-Self Report; Weissman & Bothwell, 1976), alcohol use (Alcohol Use Disorders Identification Test; Babor, Higgins-Biddle, Saunders, & Monteiro, 2001), and items assessing injuries sustained while deployed. Results Structural equation modeling analyses showed that increases in PTSD symptoms were associated with poorer couple adjustment and greater perceived parenting challenges at Time 2 (both at p<.001). Furthermore, PTSD symptoms predicted parenting challenges independent of their impact on couple adjustment. Conclusions Findings highlight the importance of investigating and intervening to support parenting and couple adjustment among combat-affected National Guard families. PMID:20873896

  9. Laboratory Study on the Effect of Tidal Stream Turbines on Hydrodynamics and Sediment Dynamics

    NASA Astrophysics Data System (ADS)

    Amoudry, L.; Ramirez-Mendoza, R.; Peter, T.; McLelland, S.; Simmons, S.; Parsons, D. R.; Vybulkova, L.

    2016-02-01

    Tidal stream turbines (TST) are one potential technology for harnessing tidal energy, and the measurement and characterisation of their wakes is important both for environmental and development reasons. Indeed, wake recovery length is an important parameter for appropriate design of arrays, and wakes may result in altered dynamics both in the water column and at the seabed. We will report on laboratory scale experiments over a mobile sediment bed, which aim to quantify the detailed wake structure and its impact on sediment transport dynamics. A 0.2 m diameter model turbine was installed in a large-scale flume (16 m long, 1.6 m wide, 0.6 m deep) at the University of Hull's Total Environment Simulator and a steady current was driven over an artificial sediment bed using recirculating pumps. A high-resolution pulse-coherent acoustic Doppler profiler (Nortek Aquadopp HR) was used to measure vertical profiles of the three-dimensional mean current at different locations downstream of the model turbine. A three-dimensional Acoustic Ripple Profiler was used to map the bed and its evolution during the experiments. Acoustic backscatter systems were also deployed in two-dimensional arrays both along the flume and across the flume. These measurements revealed that the presence of the model turbine resulted in an expected reduction of the mean current and in changes in the vertical shear profiles. The bed mapping highlighted a horseshoe-shaped scour near the model turbine, and sediment deposition in the far wake region. The model turbine significantly influenced the suspension patterns, and generated significant asymmetry in the process, which was also evident from the other measurements (flow and sediment bed). These results highlight the effects induced by TSTs on near-bed hydrodynamics, suspension dynamics, and geomorphology, which may all have to be considered prior to large-scale deployments of arrays of TSTs in shelf seas.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhinefrank, Kenneth E.; Lenee-Bluhm, Pukha; Prudell, Joseph H.

    The most prudent path to a full-scale design, build and deployment of a wave energy conversion (WEC) system involves establishing validated numerical models through physical experiments in a methodical scaling program. This project provides essential additional rounds of wave tank testing at 1:33 scale and ocean/bay testing at 1:7 scale, necessary to validate the numerical modeling that is essential to a utility-scale WEC design and associated certification.
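    Wave-tank scaling programs of this kind conventionally relate model and full scale through Froude similitude. The relations below are the standard Froude scaling laws, not figures from this project:

```python
# Standard Froude similitude relations for wave-tank scaling (general laws,
# not project data): at geometric scale 1:s, time and wave period scale as
# s**0.5 and power scales as s**3.5.
import math

def froude_full_scale(model_period_s: float, model_power_w: float, s: float):
    """Scale model-test wave period and power up to full scale (factor s)."""
    return model_period_s * math.sqrt(s), model_power_w * s ** 3.5

# Illustrative values for a 1:33 tank test.
period, power = froude_full_scale(model_period_s=1.0, model_power_w=1.0, s=33.0)
print(round(period, 2))  # a 1 s model wave corresponds to ~5.74 s at full scale
```

This is why a stepped program (1:33 tank, then 1:7 bay) progressively de-risks the numerical models before a utility-scale build.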

  11. Research in Observations of Oceanic Air/Sea Interaction

    NASA Technical Reports Server (NTRS)

    Long, David G.; Arnold, David V.

    1995-01-01

    The primary purpose of this research has been: (1) to develop an innovative research radar scatterometer system capable of directly measuring both the radar backscatter and the small-scale and large-scale ocean wave field simultaneously, and (2) to deploy this instrument to collect data to support studies of air/sea interaction. The instrument has been successfully completed and deployed. The system deployment lasted for six months during 1995. Results to date suggest that the data are remarkably useful in air/sea interaction studies. While the data analysis is continuing, two journal and fifteen conference papers have been published. Six papers are currently in review, with two additional journal papers scheduled for publication. Three Master's theses on this research have been completed. A Ph.D. student is currently finalizing his dissertation, which should be completed by the end of the calendar year. We have received additional 'mainstream' funding from the NASA oceans branch to continue data analysis and instrument operations. We are actively pursuing results from the data and expect additional publications to follow. This final report briefly describes the instrument system we developed and results to date from the deployment. Additional detail is contained in the attached papers selected from the bibliography.

  12. Challenges in Obtaining Estimates of the Risk of Tuberculosis Infection During Overseas Deployment.

    PubMed

    Mancuso, James D; Geurts, Mia

    2015-12-01

    Estimates of the risk of tuberculosis (TB) infection resulting from overseas deployment among U.S. military service members have varied widely, and have been plagued by methodological problems. The purpose of this study was to estimate the incidence of TB infection in the U.S. military resulting from deployment. Three populations were examined: 1) a unit of 2,228 soldiers redeploying from Iraq in 2008, 2) a cohort of 1,978 soldiers followed up over 5 years after basic training at Fort Jackson in 2009, and 3) 6,062 participants in the 2011-2012 National Health and Nutrition Examination Survey (NHANES). The risk of TB infection in the deployed population was low (0.6%; 95% confidence interval [CI]: 0.1-2.3%) and was similar to that in the non-deployed population. The prevalence of latent TB infection (LTBI) in the U.S. population was not significantly different among deployed and non-deployed veterans and those with no military service. The limitations of these retrospective studies highlight the challenge in obtaining valid estimates of risk using retrospective data and the need for a more definitive study. Similar to civilian long-term travelers, risks for TB infection during deployment are focal in nature, and testing should be targeted to only those at increased risk. © The American Society of Tropical Medicine and Hygiene.

  13. Rapidly deployable emergency communication system

    DOEpatents

    Gladden, Charles A.; Parelman, Martin H.

    1979-01-01

    A highly versatile, highly portable emergency communication system which permits deployment in a very short time to cover both wide areas and distant isolated areas depending upon mission requirements. The system employs a plurality of lightweight, fully self-contained repeaters which are deployed within the mission area to provide communication between field teams, and between each field team and a mobile communication control center. Each repeater contains a microcomputer controller, the program for which may be changed from the control center by the transmission of digital data within the audible range (300-3,000 Hz). Repeaters are accessed by portable/mobile transceivers, other repeaters, and the control center through the transmission and recognition of digital data code words in the subaudible range.

  14. Environmental Co-Benefit Opportunities of Solar Energy

    NASA Astrophysics Data System (ADS)

    Hernandez, R. R.; Armstrong, A.; Burney, J. A.; Easter, S. B.; Hoffacker, M. K.; Moore, K. A.

    2015-12-01

    Solar energy reduces greenhouse gas emissions by an order of magnitude when substituted for fossil fuels. Nonetheless, the strategic deployment of solar energy, from single rooftop modules to utility-scale solar energy power plants, can confer additional environmental co-benefits beyond its immediate use as a low carbon energy source. In this study, we identify a diverse portfolio of environmental co-benefit opportunities of solar energy technologies resulting from synergistic innovations in land, food, energy, and water systems. For each opportunity, we provide a demonstrative, quantitative framework for environmental co-benefit valuation, including equations, models, or case studies for estimating the carbon dioxide equivalent (CO2-eq) emissions averted and cost savings ($US) achieved through environmental co-benefit opportunities of solar energy, as well as imminent research questions to improve the certainty of valuations. As land-energy-food-water nexus issues become increasingly exigent in the 21st century, we show that environmental co-benefit opportunities of solar energy are feasible in numerous environments and at a wide range of spatial scales, and are thereby able to contribute to local and regional environmental goals and to the mitigation of climate change.

  15. A Survey on Virtualization of Wireless Sensor Networks

    PubMed Central

    Islam, Md. Motaharul; Hassan, Mohammad Mehedi; Lee, Ga-Won; Huh, Eui-Nam

    2012-01-01

    Wireless Sensor Networks (WSNs) are gaining tremendous importance thanks to their broad range of commercial applications, such as smart home automation, health care and industrial automation. In these applications, multi-vendor and heterogeneous sensor nodes are deployed. Due to strict administrative control over the specific WSN domains, communication barriers, conflicting goals and the economic interests of different WSN sensor node vendors, it is difficult to introduce a large-scale federated WSN. By allowing heterogeneous sensor nodes in WSNs to coexist on a shared physical sensor substrate, virtualization in sensor networks may provide flexibility and cost-effective solutions, promote diversity, ensure security and increase manageability. This paper surveys the novel approach of using large-scale federated WSN resources in a sensor virtualization environment. Our focus in this paper is to introduce a few design goals, the challenges and opportunities of research in the field of sensor network virtualization, as well as to illustrate the current status of research in this field. This paper also presents a wide array of state-of-the-art projects related to sensor network virtualization. PMID:22438759

  16. A survey on virtualization of Wireless Sensor Networks.

    PubMed

    Islam, Md Motaharul; Hassan, Mohammad Mehedi; Lee, Ga-Won; Huh, Eui-Nam

    2012-01-01

    Wireless Sensor Networks (WSNs) are gaining tremendous importance thanks to their broad range of commercial applications, such as smart home automation, health care and industrial automation. In these applications, multi-vendor and heterogeneous sensor nodes are deployed. Due to strict administrative control over the specific WSN domains, communication barriers, conflicting goals and the economic interests of different WSN sensor node vendors, it is difficult to introduce a large-scale federated WSN. By allowing heterogeneous sensor nodes in WSNs to coexist on a shared physical sensor substrate, virtualization in sensor networks may provide flexibility and cost-effective solutions, promote diversity, ensure security and increase manageability. This paper surveys the novel approach of using large-scale federated WSN resources in a sensor virtualization environment. Our focus in this paper is to introduce a few design goals, the challenges and opportunities of research in the field of sensor network virtualization, as well as to illustrate the current status of research in this field. This paper also presents a wide array of state-of-the-art projects related to sensor network virtualization.

  17. Observations of thunderstorm-related 630 nm airglow depletions

    NASA Astrophysics Data System (ADS)

    Kendall, E. A.; Bhatt, A.

    2015-12-01

    The Midlatitude All-sky imaging Network for Geophysical Observations (MANGO) is an NSF-funded network of 630 nm all-sky imagers in the continental United States. MANGO will be used to observe the generation, propagation, and dissipation of medium- and large-scale wave activity in the subauroral, mid- and low-latitude thermosphere. This network is actively being deployed and will ultimately consist of nine all-sky imagers. These imagers form a network providing continuous coverage over the western United States, including California, Oregon, Washington, Utah, Arizona and Texas, extending south into Mexico. This network sees high levels of both medium- and large-scale wave activity. Apart from the widely reported northeast-to-southwest propagating wave fronts resulting from the so-called Perkins mechanism, this network observes wave fronts propagating to the west, north and northeast. At least three of these anomalous events have been associated with thunderstorm activity. Imager data have been correlated with both GPS data and data from the AIRS (Atmospheric Infrared Sounder) instrument on board NASA's Earth Observing System Aqua satellite. We will present a comprehensive analysis of these events and discuss the potential thunderstorm source mechanism.

  18. Integrated genome browser: visual analytics platform for genomics.

    PubMed

    Freese, Nowlan H; Norris, David C; Loraine, Ann E

    2016-07-15

    Genome browsers that support fast navigation through vast datasets and provide interactive visual analytics functions can help scientists achieve deeper insight into biological systems. Toward this end, we developed Integrated Genome Browser (IGB), a highly configurable, interactive and fast open source desktop genome browser. Here we describe multiple updates to IGB, including all-new capabilities to display and interact with data from high-throughput sequencing experiments. To demonstrate, we describe example visualizations and analyses of datasets from RNA-Seq, ChIP-Seq and bisulfite sequencing experiments. Understanding results from genome-scale experiments requires viewing the data in the context of reference genome annotations and other related datasets. To facilitate this, we enhanced IGB's ability to consume data from diverse sources, including Galaxy, Distributed Annotation and IGB-specific Quickload servers. To support future visualization needs as new genome-scale assays enter wide use, we transformed the IGB codebase into a modular, extensible platform for developers to create and deploy all-new visualizations of genomic data. IGB is open source and is freely available from http://bioviz.org/igb (contact: aloraine@uncc.edu). © The Author 2016. Published by Oxford University Press.

  19. Application-level regression testing framework using Jenkins

    DOE PAGES

    Budiardja, Reuben; Bouvet, Timothy; Arnold, Galen

    2017-09-26

    Monitoring and testing for regression of large-scale systems such as NCSA's Blue Waters supercomputer are challenging tasks. In this paper, we describe the solution we developed to perform those tasks. The goal was to find an automated solution for running user-level regression tests to evaluate system usability and performance. Jenkins, an automation server, was chosen for its versatility, large user base, and multitude of plugins, including plugins for collecting data and plotting test results over time. We also describe our Jenkins deployment, which launches and monitors jobs on a remote HPC system, performs authentication with a one-time password, and integrates with our LDAP server for authorization. We show some use cases and describe our best practices for successfully using Jenkins as a user-level, system-wide regression testing and monitoring framework for large supercomputer systems.

  20. Application-level regression testing framework using Jenkins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budiardja, Reuben; Bouvet, Timothy; Arnold, Galen

    Monitoring and testing for regression of large-scale systems such as NCSA's Blue Waters supercomputer are challenging tasks. In this paper, we describe the solution we developed to perform those tasks. The goal was to find an automated solution for running user-level regression tests to evaluate system usability and performance. Jenkins, an automation server, was chosen for its versatility, large user base, and multitude of plugins, including plugins for collecting data and plotting test results over time. We also describe our Jenkins deployment, which launches and monitors jobs on a remote HPC system, performs authentication with a one-time password, and integrates with our LDAP server for authorization. We show some use cases and describe our best practices for successfully using Jenkins as a user-level, system-wide regression testing and monitoring framework for large supercomputer systems.

  1. Biology-Inspired Distributed Consensus in Massively-Deployed Sensor Networks

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Lodding, Kenneth N.; Olariu, Stephan; Wilson, Larry; Xin, Chunsheng

    2005-01-01

    Promises of ubiquitous control of the physical environment by large-scale wireless sensor networks open avenues for new applications that are expected to redefine the way we live and work. Most recent research has concentrated on developing techniques for performing relatively simple tasks in small-scale sensor networks assuming some form of centralized control. The main contribution of this work is to propose a new way of looking at large-scale sensor networks, motivated by lessons learned from the way biological ecosystems are organized. Indeed, we believe that techniques used in small-scale sensor networks are not likely to scale to large networks; such large-scale networks must instead be viewed as an ecosystem in which the sensors/effectors are organisms whose autonomous actions, based on local information, combine in a communal way to produce global results. As an example of a useful function, we demonstrate that fully distributed consensus can be attained in a scalable fashion in massively deployed sensor networks where individual motes operate based on local information, making local decisions that are aggregated across the network to achieve globally meaningful effects.
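    The kind of fully distributed consensus described above can be sketched with iterated local averaging: each mote repeatedly replaces its value with the mean of its own and its neighbors' values, using only local information, and on a connected network all values converge to a common value. This is a generic consensus sketch, not the authors' protocol; the ring topology, uniform weights, and initial values are illustrative assumptions.

```python
# Distributed consensus by local averaging (a generic sketch, not the paper's
# algorithm). Each node uses only its own value and its neighbors' values.
def consensus_step(values, neighbors):
    return [
        (values[i] + sum(values[j] for j in neighbors[i])) / (1 + len(neighbors[i]))
        for i in range(len(values))
    ]

n = 8
neighbors = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}  # ring network
values = [float(i) for i in range(n)]  # initial local readings 0..7

for _ in range(200):
    values = consensus_step(values, neighbors)

spread = max(values) - min(values)  # disagreement remaining after 200 rounds
print(spread < 1e-3)
```

Because the averaging weights here are symmetric, the network-wide mean is preserved, so the common value the motes agree on is the global average of the initial readings.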

  2. Large deployable antenna program. Phase 1: Technology assessment and mission architecture

    NASA Technical Reports Server (NTRS)

    Rogers, Craig A.; Stutzman, Warren L.

    1991-01-01

    The program was initiated to investigate the availability of critical large deployable antenna technologies that would enable microwave remote sensing missions from geostationary orbit, as required for Mission to Planet Earth. Program goals for the large antenna were: 40-meter diameter, offset-fed paraboloid geometry, and surface precision of 0.1 mm rms. Phase 1 goals were: to review the state of the art for large, precise, wide-scanning radiometers operating at up to 60 GHz; to assess critical technologies necessary for selected concepts; to develop mission architectures for these concepts; and to evaluate generic technologies to support the large deployable reflectors necessary for these missions. Selected results of the study show that deployable reflectors using furlable segments are limited by the surface precision goal to 12 meters in diameter, that current launch vehicles can place only a 20-meter-class antenna in geostationary orbit, and that conceptual designs using stiff reflectors are possible with areal densities of 2.4 kg/sq m.

  3. Testing the Deployment Repeatability of a Precision Deployable Boom Prototype for the Proposed SWOT Karin Instrument

    NASA Technical Reports Server (NTRS)

    Agnes, Gregory S.; Waldman, Jeff; Hughes, Richard; Peterson, Lee D.

    2015-01-01

    NASA's proposed Surface Water Ocean Topography (SWOT) mission, scheduled to launch in 2020, would provide critical information about Earth's oceans, ocean circulation, fresh water storage, and river discharge. The mission concept calls for a dual-antenna Ka-band radar interferometer instrument, known as KaRIn, that would map the height of water globally along two 50 km wide swaths. The KaRIn antennas, which would be separated by 10 meters on either side of the spacecraft, would need to be precisely deployable in order to meet demanding pointing requirements. Consequently, an effort was undertaken to design, build, and test a prototype precision deployable mast for the KaRIn instrument. Each mast was 4.5 m long with a required dilatation stability of 2.5 microns over 3 minutes and a required minimum first mode of 7 Hz. Deployment repeatability had to be within +/- 7 arcsec in all three rotation directions. Overall mass could not exceed 41.5 kg, including any actuators and thermal blanketing. This set of requirements meant the boom had to be three times lighter and two orders of magnitude more precise than the existing state of the art for deployable booms.

  4. Field intercomparison of ammonia passive samplers: results and lessons learned.

    NASA Astrophysics Data System (ADS)

    Stephens, Amy; Leeson, Sarah; Jones, Matthew; van Dijk, Netty; Kentisbeer, John; Twigg, Marsailidh; Simmons, Ivan; Braban, Christine; Martin, Nick; Poskitt, Janet; Ferm, Martin; Seitler, Eva; Sacco, Paolo; Gates, Linda; Stolk, Ariën; Stoll, Jean-Marc; Tang, Sim

    2017-04-01

    Ammonia pollution contributes significantly to eutrophication and acidification of ecosystems, with resultant losses of biodiversity and ecosystem changes. Monitoring of ambient ammonia over wide spatial and long temporal scales is primarily done with low-cost diffusive samplers. Less frequently, surface flux measurements of ammonia can be made using passive samplers at plot scale. This paper presents a field intercomparison conducted within the MetNH3 project to assess the performance of passive samplers for ambient measurements of ammonia. Eight different designs of commercial passive samplers, housed in shelters provided by the manufacturer/laboratory, were exposed over an 8-week period at the Whim experimental field site in Scotland between August and October 2016. Whim Bog has a facility in place for controlled releases of ammonia (http://www.whimbog.ceh.ac.uk/). Automated conditional release from the line source occurs when the wind direction in the preceding minute is from the northeast (wind sector 180-215°) and wind speed is > 5 m s-1. The passive samplers were exposed at different distances from the release source (16, 32 and 60 m) and also at a background location. Most were exposed for 2 x 4-week periods and some for 4 x 2-week periods. At the 32 m position, an active denuder method (the CEH DELTA sampler) and a continuous, high-temporal-resolution wet chemistry ammonia instrument (AiRRmonia, Mechatronics, NL) were also deployed alongside the passive samplers to provide reference measurements of ammonia. Results are presented within the context of the MetNH3 CATFAC controlled laboratory exposure assessments and are discussed in terms of typical deployments of passive samplers and quality control. Measurements for policy evidence from both local and regional studies using passive samplers are also discussed.

  5. The Role of Environmental Forcing in Controlling Water Retention Gyres in Subsystems of Narragansett Bay

    NASA Astrophysics Data System (ADS)

    Balt, C.; Kincaid, C. R.; Ullman, D. S.

    2010-12-01

    Greenwich Bay and the Providence River represent two subsystems of the Narragansett Bay (RI) estuary with chronic water quality problems. Both underway and moored Acoustic Doppler Current Profiler (ADCP) observations have shown the presence of large-scale, subtidal gyres within these subsystems. Prior numerical models of Narragansett Bay, developed using the Regional Ocean Modeling System (ROMS), indicate that prevailing summer sea breeze conditions are favorable to the evolution of stable circulation gyres, which increase retention times within each subsystem. Fluid dynamics laboratory models of the Providence River, conducted in the Geophysical Fluid Dynamics Laboratory of the Research School of Earth Sciences (Australian National University), reproduce gyres that match first order features of the ADCP data. These laboratory models also reveal details of small-scale eddies along the edges of the retention gyre. We report results from spatially and temporally detailed current meter deployments (using SeaHorse Tilt Current Meters) in both subsystems, which reveal details on the growth and decay of gyres under various spring-summer forcing conditions. In particular, current meters were deployed during the severe flooding events in the Narragansett Bay watershed during March, 2010. A combination of current meter data and high-resolution ROMS modeling is used to show how gyres effectively limit subtidal exchange from the Providence River and Greenwich Bay and to understand the forcing conditions that favor efficient flushing. The residence times of stable gyres within these regions can be an order of magnitude larger than values predicted by fraction of water methods. ROMS modeling is employed to characterize gyre energy, stability, and flushing rates for a wide range of seasonal, wind and runoff scenarios.

  6. A new type of tri-axial accelerometers with high dynamic range MEMS for earthquake early warning

    NASA Astrophysics Data System (ADS)

    Peng, Chaoyong; Chen, Yang; Chen, Quansheng; Yang, Jiansi; Wang, Hongti; Zhu, Xiaoyi; Xu, Zhiqiang; Zheng, Yu

    2017-03-01

    Earthquake Early Warning Systems (EEWS) have shown their efficiency for earthquake damage mitigation. With the progress of low-cost Micro-Electro-Mechanical Systems (MEMS), many types of MEMS-based accelerometers have been developed and widely used in deploying large-scale, dense seismic networks for EEWS. However, the noise performance of these commercially available MEMS devices is still insufficient for weak seismic signals, leading to large scatter in the estimation of early-warning parameters. In this study, we developed a new type of tri-axial accelerometer for EEWS based on high-dynamic-range MEMS with a low noise level. It is a MEMS-integrated data logger with built-in seismological processing. The device is built on a custom-tailored Linux 2.6.27 operating system, and seismic events are detected automatically using the STA/LTA algorithm. When a seismic event is detected, peak ground parameters of all data components are calculated at an interval of 1 s, and τc-Pd values are evaluated using the initial 3 s of the P wave. These values are then organized as a trigger packet and actively sent to the processing center for combined event detection. The output data of all three components are calibrated to a sensitivity of 500 counts/(cm/s2). Several tests and a real field deployment were performed to characterize the performance of this device. The results show that the dynamic range reaches 98 dB for the vertical component and 99 dB for the horizontal components, and the majority of bias temperature coefficients are lower than 200 μg/°C. In addition, the results of event detection and the real field deployment demonstrate its capabilities for EEWS and rapid intensity reporting.
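    The STA/LTA trigger named in the abstract compares a short-term average of signal energy against a long-term average; the ratio spikes when a P wave arrives. A minimal sliding-window sketch follows (window lengths and the threshold are illustrative, not the device's settings):

```python
def sta_lta(samples, nsta, nlta):
    """Return the STA/LTA ratio at each sample (0 until the LTA window fills)."""
    ratios = [0.0] * len(samples)
    energy = [s * s for s in samples]  # simple envelope: squared amplitude
    for k in range(nlta, len(samples)):
        sta = sum(energy[k - nsta:k]) / nsta   # short-term average
        lta = sum(energy[k - nlta:k]) / nlta   # long-term average
        ratios[k] = sta / lta if lta > 0 else 0.0
    return ratios

# Synthetic trace: low-level noise, then an "event" of larger amplitude.
trace = [0.1] * 200 + [1.0] * 50
ratios = sta_lta(trace, nsta=10, nlta=100)
triggered = max(ratios) > 3.0  # trigger when the ratio exceeds a threshold
```

    In quiet noise the ratio stays near 1; at the synthetic onset it jumps well above the threshold, which is the event-detection signal a real device would act on.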

  7. Can the deployment of community health workers for the delivery of HIV services represent an effective and sustainable response to health workforce shortages? Results of a multicountry study.

    PubMed

    Celletti, Francesca; Wright, Anna; Palen, John; Frehywot, Seble; Markus, Anne; Greenberg, Alan; de Aguiar, Rafael Augusto Teixeira; Campos, Francisco; Buch, Eric; Samb, Badara

    2010-01-01

    In countries severely affected by HIV/AIDS, shortages of health workers present a major obstacle to scaling up HIV services. Adopting a task shifting approach for the deployment of community health workers (CHWs) represents one strategy for rapid expansion of the health workforce. This study aimed to evaluate the contribution of CHWs with a focus on identifying the critical elements of an enabling environment that can ensure they provide quality services in a manner that is sustainable. The method of work included a collection of primary data in five countries: Brazil, Ethiopia, Malawi, Namibia, and Uganda. The findings show that delegation of specific tasks to cadres of CHWs with limited training can increase access to HIV services, particularly in rural areas and among underserved communities, and can improve the quality of care for HIV. There is also evidence that CHWs can make a significant contribution to the delivery of a wide range of other health services. The findings also show that certain conditions must be observed if CHWs are to contribute to well-functioning and sustainable service delivery. These conditions involve adequate systems integration with significant attention to: political will and commitment; collaborative planning; definition of scope of practice; selection and educational requirements; registration, licensure and certification; recruitment and deployment; adequate and sustainable remuneration; mentoring and supervision including referral system; career path and continuous education; performance evaluation; supply of equipment and commodities. The study concludes that, where there is the necessary support, the potential contribution of CHWs can be optimized and represents a valuable addition to the urgent expansion of human resources for health, and to universal coverage of HIV services.

  8. Public key infrastructure for DOE security research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aiken, R.; Foster, I.; Johnston, W.E.

    This document summarizes the Department of Energy's Second Joint Energy Research/Defense Programs Security Research Workshop. The workshop built on the results of the first Joint Workshop, which reviewed security requirements represented in a range of mission-critical ER and DP applications, discussed commonalities and differences in ER/DP requirements and approaches, and identified an integrated common set of security research priorities. One significant conclusion of the first workshop was that progress on a broad spectrum of DOE-relevant security problems and applications could best be achieved through public-key cryptography based systems, and therefore depended upon the existence of a robust, broadly deployed public-key infrastructure. Hence, public-key infrastructure ("PKI") was adopted as a primary focus for the second workshop. The Second Joint Workshop covered a range of DOE security research and deployment efforts, as well as summaries of the state of the art in various areas relating to public-key technologies. Key findings were: that a broad range of DOE applications can benefit from security architectures and technologies built on a robust, flexible, widely deployed public-key infrastructure; that there exists a collection of specific requirements for missing or undeveloped PKI functionality, together with a preliminary assessment of how these requirements can be met; that, while commercial developments can be expected to provide many relevant security technologies, there are important capabilities that commercial developments will not address, due to the unique scale, performance, diversity, distributed nature, and sensitivity of DOE applications; and that DOE should encourage and support research activities intended to increase understanding of security technology requirements, and to develop critical components not forthcoming from other sources in a timely manner.

  9. Occupational differences in US Army suicide rates.

    PubMed

    Kessler, R C; Stein, M B; Bliese, P D; Bromet, E J; Chiu, W T; Cox, K L; Colpe, L J; Fullerton, C S; Gilman, S E; Gruber, M J; Heeringa, S G; Lewandowski-Romps, L; Millikan-Bell, A; Naifeh, J A; Nock, M K; Petukhova, M V; Rosellini, A J; Sampson, N A; Schoenbaum, M; Zaslavsky, A M; Ursano, R J

    2015-11-01

    Civilian suicide rates vary by occupation in ways related to occupational stress exposure. Comparable military research finds suicide rates elevated in combat arms occupations. However, no research has evaluated variation in this pattern by deployment history, the indicator of occupation stress widely considered responsible for the recent rise in the military suicide rate. The joint associations of Army occupation and deployment history in predicting suicides were analysed in an administrative dataset for the 729 337 male enlisted Regular Army soldiers in the US Army between 2004 and 2009. There were 496 suicides over the study period (22.4/100 000 person-years). Only two occupational categories, both in combat arms, had significantly elevated suicide rates: infantrymen (37.2/100 000 person-years) and combat engineers (38.2/100 000 person-years). However, the suicide rates in these two categories were significantly lower when currently deployed (30.6/100 000 person-years) than never deployed or previously deployed (41.2-39.1/100 000 person-years), whereas the suicide rate of other soldiers was significantly higher when currently deployed and previously deployed (20.2-22.4/100 000 person-years) than never deployed (14.5/100 000 person-years), resulting in the adjusted suicide rate of infantrymen and combat engineers being most elevated when never deployed [odds ratio (OR) 2.9, 95% confidence interval (CI) 2.1-4.1], less so when previously deployed (OR 1.6, 95% CI 1.1-2.1), and not at all when currently deployed (OR 1.2, 95% CI 0.8-1.8). Adjustment for a differential 'healthy warrior effect' cannot explain this variation in the relative suicide rates of never-deployed infantrymen and combat engineers by deployment status. Efforts are needed to elucidate the causal mechanisms underlying this interaction to guide preventive interventions for soldiers at high suicide risk.

  10. Occupational differences in US Army suicide rates

    PubMed Central

    Kessler, R. C.; Stein, M. B.; Bliese, P. D.; Bromet, E. J.; Chiu, W. T.; Cox, K. L.; Colpe, L. J.; Fullerton, C. S.; Gilman, S. E.; Gruber, M. J.; Heeringa, S. G.; Lewandowski-Romps, L.; Millikan-Bell, A.; Naifeh, J. A.; Nock, M. K.; Petukhova, M. V.; Rosellini, A. J.; Sampson, N. A.; Schoenbaum, M.; Zaslavsky, A. M.; Ursano, R. J.

    2016-01-01

    Background Civilian suicide rates vary by occupation in ways related to occupational stress exposure. Comparable military research finds suicide rates elevated in combat arms occupations. However, no research has evaluated variation in this pattern by deployment history, the indicator of occupation stress widely considered responsible for the recent rise in the military suicide rate. Method The joint associations of Army occupation and deployment history in predicting suicides were analysed in an administrative dataset for the 729 337 male enlisted Regular Army soldiers in the US Army between 2004 and 2009. Results There were 496 suicides over the study period (22.4/100 000 person-years). Only two occupational categories, both in combat arms, had significantly elevated suicide rates: infantrymen (37.2/100 000 person-years) and combat engineers (38.2/100 000 person-years). However, the suicide rates in these two categories were significantly lower when currently deployed (30.6/100 000 person-years) than never deployed or previously deployed (41.2–39.1/100 000 person-years), whereas the suicide rate of other soldiers was significantly higher when currently deployed and previously deployed (20.2–22.4/100 000 person-years) than never deployed (14.5/100 000 person-years), resulting in the adjusted suicide rate of infantrymen and combat engineers being most elevated when never deployed [odds ratio (OR) 2.9, 95% confidence interval (CI) 2.1–4.1], less so when previously deployed (OR 1.6, 95% CI 1.1–2.1), and not at all when currently deployed (OR 1.2, 95% CI 0.8–1.8). Adjustment for a differential ‘healthy warrior effect’ cannot explain this variation in the relative suicide rates of never-deployed infantrymen and combat engineers by deployment status. Conclusions Efforts are needed to elucidate the causal mechanisms underlying this interaction to guide preventive interventions for soldiers at high suicide risk. PMID:26190760

  11. Industrial Wireless Sensors: A User's Perspective on the Impact of Standards on Wide-spread Deployment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taft, Cyrus W.; Manges, Wayne W; Sorge, John N

    2012-01-01

    The role of wireless sensing technologies in industrial instrumentation will undoubtedly become more important in the years ahead. Deployment of such instrumentation in an industrial setting, with its heightened security and robustness criteria, hinges on user acceptance of verified performance as well as meeting cost requirements. Today, industrial users face many choices when specifying a wireless sensor network, including radio performance, battery life, interoperability, security, and standards compliance. The potential market for industrial wireless sensors is literally millions of wireless instruments, and it is imperative that accurate information for applying the technology to real-world applications be available to the end user so that they can make informed deployment decisions. The majority of industrial wireless automation designs now being deployed or considered for deployment are based on three different standards: the HART Communication Foundation's WirelessHART (IEC 62591), the International Society of Automation's ISA100.11a, and the offering from the Industrial Wireless Alliance of China known as WIA-PA (IEC 62601). Aside from these industrial automation standards, users must also be cognizant of the underlying wireless network standards IEEE 802.11, IEEE 802.15.4, and IEEE 802.15.3a and their interactions with the three principal industrial automation protocols mentioned previously. The crucial questions being asked by end users revolve around sensor network performance, interoperability, reliability, and security. This paper discusses potential wireless sensor applications in power plants, barriers to the acceptance of wireless technology, and concerns related to standards, and provides an end-user perspective on the issues affecting wide-spread deployment of wireless sensors. Finally, the authors conclude with a discussion of a recommended path forward, including how standards organizations can better facilitate end-user decision making and how end users can locate and use objective information for decision making.

  12. New seismic instrumentation packaged for all terrestrial environments (including the quietest observatories!).

    NASA Astrophysics Data System (ADS)

    Parker, Tim; Devanney, Peter; Bainbridge, Geoff; Townsend, Bruce

    2017-04-01

    The march to make every type of seismometer, weak- to strong-motion, reliable and economically deployable in any terrestrial environment continues with the availability of three new sensors and seismic systems, including ones with over 200 dB of dynamic range. Until recently there were probably 100 pier-type broadband sensors for every observatory-type pier, not the types of deployments geoscientists need to advance science and monitoring capability. Deeper boreholes are now recognized as the quieter environments for the best observatory-class instruments, and these same instruments can now be deployed in direct-burial environments, which is unprecedented. The experiences of facilities with large deployments of broadband seismometers in continental-scale rolling arrays prove the utility of packaging new sensors in corrosion-resistant casings and designing in the robustness needed to work reliably in temporary deployments. Integrating digitizers and other sensors decreases deployment complexity, decreases acquisition and deployment costs, and increases reliability and utility. We discuss the informed evolution of broadband pier instruments into modern integrated field tools that enable economic densification of monitoring arrays and support new ways to approach geoscience research in a field environment.

  13. Lunar surface structural concepts and construction studies

    NASA Technical Reports Server (NTRS)

    Mikulas, Martin

    1991-01-01

    The topics are presented in viewgraph form and include the following: lunar surface structures construction research areas; lunar crane related disciplines; shortcomings of typical mobile crane in lunar base applications; candidate crane cable suspension systems; NIST six-cable suspension crane; numerical example of natural frequency; the incorporation of two new features for improved performance of the counter-balanced actively-controlled lunar crane; lunar crane pendulum mechanics; simulation results; 1/6 scale lunar crane testbed using GE robot for global manipulation; basic deployable truss approaches; bi-pantograph elevator platform; comparison of elevator platforms; perspective of bi-pantograph beam; bi-pantograph synchronously deployable tower/beam; lunar module off-loading concept; module off-loader concept packaged; starburst deployable precision reflector; 3-ring reflector deployment scheme; cross-section of packaged starburst reflector; and focal point and thickness packaging considerations.

  14. Node Deployment with k-Connectivity in Sensor Networks for Crop Information Full Coverage Monitoring

    PubMed Central

    Liu, Naisen; Cao, Weixing; Zhu, Yan; Zhang, Jingchao; Pang, Fangrong; Ni, Jun

    2016-01-01

    Wireless sensor networks (WSNs) are suitable for the continuous monitoring of crop information in large-scale farmland. The information obtained is valuable for regulating crop growth and achieving high yields in precision agriculture (PA). In order to realize full-coverage, k-connected WSN deployment for monitoring crop growth information in large-scale farmland, and to ensure the accuracy of the monitored data, a new WSN deployment method using a genetic algorithm (GA) is proposed here. The fitness function of the GA was constructed from the following WSN deployment criteria: (1) nodes must be located in the corresponding plots; (2) the WSN must have k-connectivity; (3) the WSN must have no communication silos; (4) the minimum distance between a node and the plot boundary must be greater than a specific value to prevent each node from being affected by the farmland edge effect. Deployment experiments were performed on natural farmland and on irregular farmland divided based on spatial differences in soil nutrients. Results showed that both WSNs gave full coverage, there were no communication silos, and the minimum connectivity of nodes was equal to k. The deployment was tested for different values of k and of the node transmission distance (d). The results showed that, when d was set to 200 m, as k increased from 2 to 4 the minimum connectivity of nodes increased and remained equal to k. When k was set to 2, the average connectivity of all nodes increased linearly as d increased from 140 m to 250 m, while the minimum connectivity did not change. PMID:27941704
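    Criterion (2) above can be approximated cheaply inside a GA fitness function by checking minimum node degree: every node must have at least k neighbours within transmission distance d. (Degree ≥ k is a necessary but not sufficient condition for true k-connectivity; the coordinates and parameters below are illustrative, not from the paper.)

```python
import math

def degree_ok(nodes, d, k):
    """True when every node has at least k neighbours within range d."""
    for i, (xi, yi) in enumerate(nodes):
        links = sum(
            1 for j, (xj, yj) in enumerate(nodes)
            if j != i and math.hypot(xi - xj, yi - yj) <= d
        )
        if links < k:
            return False
    return True

# Four nodes on a 200 m square, radio range d = 250 m: each node reaches
# its two adjacent corners (200 m) but not the diagonal one (~283 m).
layout = [(0, 0), (200, 0), (0, 200), (200, 200)]
ok = degree_ok(layout, d=250.0, k=2)
```

    A GA would penalize candidate layouts failing this check (and the other criteria) in the fitness function, steering the population toward valid deployments.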

  15. Constraints on biomass energy deployment in mitigation pathways: the case of water scarcity

    NASA Astrophysics Data System (ADS)

    Séférian, Roland; Rocher, Matthias; Guivarch, Céline; Colin, Jeanne

    2018-05-01

    To limit global warming to well below 2 °C, most of the IPCC-WGIII stringent mitigation pathways feature a massive global-scale deployment of negative emissions technologies (NETs) before the end of the century. The global-scale deployment of NETs like Biomass Energy with Carbon Capture and Storage (BECCS) can be hampered by climate constraints that are not taken into account by the integrated assessment models (IAMs) used to produce those pathways. Among the various climate constraints, water scarcity appears to be a potential bottleneck for future land-based mitigation strategies and remains largely unexplored. Here, we assess climate constraints relating to water scarcity in response to the global deployment of BECCS. To this end, we confront results from an Earth system model (ESM) and an IAM under an array of 25 stringent mitigation pathways. These pathways are compatible with the Paris Agreement long-term temperature goal, with cumulative carbon emissions ranging from 230 Pg C to 300 Pg C from January 1st onwards. We show that all stylized mitigation pathways studied in this work limit warming below 2 °C or even 1.5 °C by 2100, but all exhibit a temperature overshoot exceeding 2 °C after 2050. According to the IAM, a subset of 17 emission pathways are feasible when evaluated in terms of socio-economic and technological constraints. The ESM, however, shows that water scarcity would limit the deployment of BECCS in all the mitigation pathways assessed in this work. Our findings suggest that the evolution of water resources under climate change can exert a significant constraint on BECCS deployment before 2050. By 2100, the BECCS water needs could represent more than 30% of the total precipitation in several regions such as Europe or Asia.

  16. Data-Driven Simulation-Enhanced Optimization of People-Based Print Production Service

    NASA Astrophysics Data System (ADS)

    Rai, Sudhendu

    This paper describes a systematic, six-step, data-driven, simulation-based methodology for optimizing people-based service systems on a large distributed scale that exhibit high variety and variability. The methodology is exemplified through its application within the printing services industry, where it has been successfully deployed by Xerox Corporation across small, mid-sized, and large print shops, generating over 250 million in profits across the customer value chain. Each step of the methodology is described in detail: co-development and testing of innovative concepts in partnership with customers; development of software and hardware tools to implement those concepts; establishment of work processes and practices for customer engagement and service implementation; creation of training and infrastructure for large-scale deployment; integration of the innovative offering within the framework of existing corporate offerings; and, lastly, monitoring and deployment of the financial and operational metrics for estimating the return on investment and continually renewing the offering.

  17. Innovative Escapement-Based Mechanism for Micro-Antenna Boom Deployment

    NASA Technical Reports Server (NTRS)

    Tokarz, Marta; Grygorczuk, Jerzy; Jarzynka, Stanislaw; Gut, Henryk

    2014-01-01

    This paper presents the prototype of a tubular boom antenna developed for the Polish BRITE-PL satellite by the Space Research Center of the Polish Academy of Sciences (CBK PAN). What is unique about our work is an original tubular boom antenna deployment mechanism that can be used widely as a basic solution for compact electrical antennas, booms deploying sensitive instruments, ultra-light planetary manipulators, etc. The invented electromagnetic driving unit provides a dual complementary action: it adds extra energy to the driving spring, making the system more reliable, and at the same time it moderates the deployment speed, acting as a kind of damper. That distinguishing feature suits the mechanism to applications wherever the dynamic nature of a spring drive, which introduces dangerous vibrations and induces severe local stress in the structure, needs to be mitigated. Moreover, the paper presents a product unique in Europe: a miniature beryllium-bronze tubular boom free of geometry and strain defects, which is essential for stiffness and fatigue resistance. Both the deployment mechanism and the technology of tubular boom manufacturing are protected by patent rights.

  18. Economies of Scale and Scope in E-Learning

    ERIC Educational Resources Information Center

    Morris, David

    2008-01-01

    Economies of scale are often cited in the higher education literature as being one of the drivers for the deployment of e-learning. They are variously used to support the notions that higher education is becoming more global, that national policy towards e-learning should promote scale efficiencies, that larger institutions will be better able to…

  19. The AppScale Cloud Platform

    PubMed Central

    Krintz, Chandra

    2013-01-01

    AppScale is an open source distributed software system that implements a cloud platform as a service (PaaS). AppScale makes cloud applications easy to deploy and scale over disparate cloud fabrics, implementing a set of APIs and architecture that also makes apps portable across the services they employ. AppScale is API-compatible with Google App Engine (GAE) and thus executes GAE applications on-premise or over other cloud infrastructures, without modification. PMID:23828721
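    Google App Engine applications of the kind AppScale executes are, at bottom, standard WSGI applications; a minimal sketch (the greeting and handler names are arbitrary, not part of AppScale's API) shows the portable interface involved:

```python
def app(environ, start_response):
    """Smallest useful WSGI handler: one plain-text response."""
    body = b"Hello from a portable cloud app"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

# Exercise the handler in-process with a stub start_response callable,
# using the same calling convention any WSGI server (or PaaS) would use.
captured = {}
def start_response(status, headers):
    captured["status"] = status

result = b"".join(app({}, start_response))
```

    Because the handler depends only on this interface, the same code can run under a local development server, GAE, or an AppScale deployment without modification.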

  20. 915-MHz Wind Profiler for Cloud Forecasting at Brookhaven National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, M.; Bartholomew, M. J.; Giangrande, S.

    When considering the amount of shortwave radiation incident on a photovoltaic solar array and, therefore, the amount and stability of the energy output from the system, clouds represent the greatest source of short-term (i.e., minutes-to-hours scale) variability through scattering and reflection of incoming solar radiation. Providing estimates of this short-term variability is important for determining and regulating the output from large solar arrays as they connect with the larger power infrastructure. In support of the installation of a 37-MW solar array on the grounds of Brookhaven National Laboratory (BNL), a study of the impacts of clouds on the output of the solar array has been undertaken. The study emphasis is on predicting the change in surface solar radiation resulting from the observed/forecast cloud field on a 5-minute time scale. At these time scales, advection of cloud elements over the solar array is of particular importance. As part of the BNL Aerosol Life Cycle Intensive Operational Period (IOP), a 915-MHz Radar Wind Profiler (RWP) was deployed to determine the profile of low-level horizontal winds and the depth of the planetary boundary layer. The initial cloud-forecasting mission of the 915-MHz RWP has been expanded to provide horizontal wind measurements for estimating and constraining cloud advection speeds. A secondary focus is on observing the dynamics and microphysics of precipitation during cold-season/winter storms on Long Island. In total, the profiler was deployed at BNL for 1 year, from May 2011 through May 2012.

  2. Assessing PTSD in the military: Validation of a scale distributed to Danish soldiers after deployment since 1998.

    PubMed

    Karstoft, Karen-Inge; Andersen, Søren B; Nielsen, Anni B S

    2017-06-01

    Since 1998, soldiers deployed to war zones with the Danish Defense (≈31,000) have been invited to fill out a questionnaire on post-mission reactions. This provides a unique data source for studying the psychological toll of war. Here, we validate a measure of PTSD-symptoms from the questionnaire. Soldiers from two cohorts deployed to Afghanistan with the International Security Assistance Force (ISAF) in 2009 (ISAF7, N = 334) and 2013 (ISAF15, N = 278) filled out a standard questionnaire (Psychological Reactions following International Missions, PRIM) concerning a range of post-deployment reactions including symptoms of PTSD (PRIM-PTSD). They also filled out a validated measure of PTSD-symptoms in DSM-IV, the PTSD-checklist (PCL). We tested reliability of PRIM-PTSD by estimating Cronbach's alpha, and tested validity by correlating items, clusters, and overall scale with corresponding items in the PCL. Furthermore, we conducted two confirmatory factor analytic models to test the factor structure of PRIM-PTSD, and tested measurement invariance of the selected model. Finally, we established a screening and a clinical cutoff score by application of ROC analysis. We found high internal consistency of the PRIM-PTSD (Cronbach's alpha = 0.88; both cohorts), strong item-item (0.48-0.83), item-cluster (0.43-0.72), cluster-cluster (0.71-0.82) and full-scale (0.86-0.88) correlations between PRIM-PTSD and PCL. The factor analyses showed adequate fit of a one-factor model, which was also found to display strong measurement invariance across cohorts. ROC curve analysis established cutoff scores for screening (sensitivity = 1, specificity = 0.93) and clinical use (sensitivity = 0.71, specificity = 0.98). In conclusion, we find that PRIM-PTSD is a valid measure for assessing PTSD-symptoms in Danish soldiers following deployment. © 2017 Scandinavian Psychological Associations and John Wiley & Sons Ltd.
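    The reliability statistic used above, Cronbach's alpha, is computed from the item variances and the variance of the total score. A generic sketch (not the authors' code), with respondents as rows and scale items as columns:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total-score
# variance). Uses population variances, one common convention.
def cronbach_alpha(rows):
    k = len(rows[0])                      # number of items
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([r[i] for r in rows]) for i in range(k)]
    total_var = var([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

    Perfectly correlated items give alpha = 1; values near the paper's 0.88 indicate high but not redundant internal consistency.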

  3. Forecasting Demand for KC-135 Sorties: Deploy to Dwell Impacts

    DTIC Science & Technology

    2013-06-01

    fighter movements from individual units are rampant (6 OSS/OSOS, 2013). However, TACC-directed missions in this category are scarce, if not nonexistent (6 OSS/OSOS, 2013). Recent TACC-tasked missions that appear to support CONUS fighter movements were training related: pre-deployment preparation and large-scale exercises directed by the Joint Staff (6 OSS/OSOS, 2013). Anecdotal evidence that AMC supports CONUS fighter movements was flawed

  4. Small-scale nuclear reactors for remote military operations: opportunities and challenges

    DTIC Science & Technology

    2015-08-25

    study – Report was published in March 2011  CNA study identified challenges to deploy small modular reactors (SMRs) at a base – Identified First-of...forward operating bases. The availability of deployable, cost-effective, regulated, and secure small modular reactors with a modest output electrical...defense committees on the challenges, operational requirements, constraints, cost, and life cycle analysis for a small modular reactor of less than 10

  5. A study of the variable impedance surface concept as a means for reducing noise from jet interaction with deployed lift-augmenting flaps

    NASA Technical Reports Server (NTRS)

    Hayden, R. E.; Kadman, Y.; Chanaud, R. C.

    1972-01-01

    The feasibility of quieting the externally-blown-flap (EBF) noise sources which are due to interaction of jet exhaust flow with deployed flaps was demonstrated on a 1/15-scale 3-flap EBF model. Sound field characteristics were measured and noise reduction fundamentals were reviewed in terms of source models. Tests of the 1/15-scale model showed broadband noise reductions of up to 20 dB resulting from the combination of variable-impedance flap treatment and mesh grids placed in the jet flow upstream of the flaps. Steady-state lift, drag, and pitching moment were measured with and without noise reduction treatment.

  6. Resolving uncertainties in the urban air quality, climate, and vegetation nexus through citizen science, satellite imagery, and atmospheric modeling

    NASA Astrophysics Data System (ADS)

    Jenerette, D.; Wang, J.; Chandler, M.; Ripplinger, J.; Koutzoukis, S.; Ge, C.; Castro Garcia, L.; Kucera, D.; Liu, X.

    2017-12-01

    Large uncertainties remain in identifying the distribution of urban air quality and temperature risks across neighborhood to regional scales. Nevertheless, many cities are actively expanding vegetation with an expectation to moderate both climate and air quality risks. We address these uncertainties through an integrated analysis of satellite data, atmospheric modeling, and in-situ environmental sensor networks maintained by citizen scientists. During the summer of 2017 we deployed neighborhood-scale networks of air temperature and ozone sensors through three campaigns across urbanized southern California. During each five-week campaign we deployed six sensor nodes that included an EPA federal equivalent method ozone sensor and a suite of meteorological sensors. Each node was further embedded in a network of 100 air temperature sensors that combined a randomized design developed by the research team and a design co-created by citizen scientists. Between 20 and 60 citizen scientists were recruited for each campaign, with local partners supporting outreach and training to ensure consistent deployment and data gathering. We observed substantial variation in both temperature and ozone concentrations at scales of less than 4 km, at the whole-city scale, and across the broader southern California region. At the whole-city scale, the average spatial variation in our ozone sensor network for the city of Long Beach alone was 26% of the mean, while the corresponding variation in air temperature was only 7% of the mean. These findings contrast with atmospheric model estimates of variation at the regional scale of 11% and 1%. Our results show the magnitude of fine-scale variation that is underestimated by current models and may also suggest scaling functions that can connect neighborhood and regional variation in both ozone and temperature risks in southern California. 
By engaging citizen science with high quality sensors, satellite data, and real-time forecasting, our results help identify magnitudes of climate and air quality risk variation across scales and can guide individual decisions and urban policies surrounding vegetation to moderate these risks.
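    The "variation as a percentage of the mean" metric quoted above (26% for ozone, 7% for temperature in Long Beach) is the coefficient of variation. A generic sketch with made-up sensor readings:

```python
# Coefficient of variation: standard deviation expressed as a
# percentage of the mean, used here to compare spatial variability
# across quantities with different units.
def cv_percent(values):
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return 100.0 * var ** 0.5 / mean

ozone_ppb = [38.0, 52.0, 45.0, 60.0, 41.0, 49.0]  # illustrative nodes
spread = cv_percent(ozone_ppb)
```

    Normalizing by the mean is what allows the ozone and temperature networks, with very different absolute scales, to be compared directly.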

  7. Emerging Technologies and Techniques for Wide Area Radiological Survey and Remediation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutton, M.; Zhao, P.

    2016-03-24

    Technologies to survey and decontaminate wide-area contamination and process the subsequent radioactive waste have been developed and implemented following the Chernobyl nuclear power plant release and the breach of a radiological source resulting in contamination in Goiania, Brazil. These civilian examples of radioactive material releases provided some of the first examples of urban radiological remediation. Many emerging technologies have recently been developed and demonstrated in Japan following the release of radioactive cesium isotopes (Cs-134 and Cs-137) from the Fukushima Dai-ichi nuclear power plant in 2011. Information on technologies reported by several Japanese government agencies, such as the Japan Atomic Energy Agency (JAEA), the Ministry of the Environment (MOE) and the National Institute for Environmental Science (NIES), together with academic institutions and industry are summarized and compared to recently developed, deployed and available technologies in the United States. The technologies and techniques presented in this report may be deployed in response to a wide area contamination event in the United States. In some cases, additional research and testing are needed to adequately validate the technology effectiveness over wide areas. Survey techniques can be deployed on the ground or from the air, allowing a range of coverage rates and sensitivities. Survey technologies also include those useful in measuring decontamination progress and mapping contamination. Decontamination technologies and techniques range from non-destructive (e.g., high pressure washing) and minimally destructive (plowing), to fully destructive (surface removal or demolition). Waste minimization techniques can greatly impact the long-term environmental consequences and cost following remediation efforts. Recommendations on technical improvements to address technology gaps are presented together with observations on remediation in Japan.

  8. Alternative Path Communication in Wide-Scale Cluster-Tree Wireless Sensor Networks Using Inactive Periods

    PubMed Central

    Leão, Erico; Montez, Carlos; Moraes, Ricardo; Portugal, Paulo; Vasques, Francisco

    2017-01-01

    The IEEE 802.15.4/ZigBee cluster-tree topology is a suitable technology to deploy wide-scale Wireless Sensor Networks (WSNs). These networks are usually designed to support convergecast traffic, where all communication paths go through the PAN (Personal Area Network) coordinator. Nevertheless, peer-to-peer communication relationships may be also required for different types of WSN applications. That is the typical case of sensor and actuator networks, where local control loops must be closed using a reduced number of communication hops. The use of communication schemes optimised just for the support of convergecast traffic may result in higher network congestion and in a potentially higher number of communication hops. Within this context, this paper proposes an Alternative-Route Definition (ARounD) communication scheme for WSNs. The underlying idea of ARounD is to setup alternative communication paths between specific source and destination nodes, avoiding congested cluster-tree paths. These alternative paths consider shorter inter-cluster paths, using a set of intermediate nodes to relay messages during their inactive periods in the cluster-tree network. Simulation results show that the ARounD communication scheme can significantly decrease the end-to-end communication delay, when compared to the use of standard cluster-tree communication schemes. Moreover, the ARounD communication scheme is able to reduce the network congestion around the PAN coordinator, enabling the reduction of the number of message drops due to queue overflows in the cluster-tree network. PMID:28481245
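    The routing idea can be illustrated with a toy graph: standard cluster-tree routing forces all traffic through the PAN coordinator, while a relay node that is idle during its cluster's inactive period offers a shorter inter-cluster path. A conceptual sketch only (plain BFS on made-up node names, not the paper's protocol):

```python
from collections import deque

def shortest_path(adj, src, dst):
    """BFS shortest path (fewest hops) in an undirected graph."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in adj[u]:
            if v not in prev:
                prev[v] = u
                q.append(v)
    return None

# Two clusters hanging off the PAN coordinator: tree routing between
# leaf nodes A2 and B2 must climb through the congested PAN.
tree = {'PAN': ['A1', 'B1'], 'A1': ['PAN', 'A2'], 'A2': ['A1'],
        'B1': ['PAN', 'B2'], 'B2': ['B1']}
# ARounD-style alternative: relay node R bridges the clusters directly.
with_relay = {'PAN': ['A1', 'B1'], 'A1': ['PAN', 'A2'],
              'A2': ['A1', 'R'], 'B1': ['PAN', 'B2'],
              'B2': ['B1', 'R'], 'R': ['A2', 'B2']}
```

    Here the tree path A2→A1→PAN→B1→B2 costs four hops through the coordinator, while the relay path A2→R→B2 costs two, which is the delay and congestion benefit the scheme exploits.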

  10. Phenotypes, genome wide markers and structured genetic populations; a means to understand economically important traits in beta vulgaris and to inform the process of germplasm enhancement

    USDA-ARS?s Scientific Manuscript database

    Although hybrid seed systems in beet have been widely adopted due to profitability and productivity, the population remains the operational unit of beet improvement and thus characterizing populations in terms of markers and phenotypes is critical for novel trait discovery and eventual deployment of...

  11. Photovoltaic Manufacturing Consortium (PVMC) – Enabling America’s Solar Revolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Metacarpa, David

    The U.S. Photovoltaic Manufacturing Consortium (US-PVMC) is an industry-led consortium which was created with the mission to accelerate the research, development, manufacturing, field testing, commercialization, and deployment of next-generation solar photovoltaic technologies. Formed as part of the U.S. Department of Energy's (DOE) SunShot initiative, and headquartered in New York State, PVMC is managed by the State University of New York Polytechnic Institute (SUNY Poly) at the Colleges of Nanoscale Science and Engineering. PVMC is a hybrid of industry-led consortium and manufacturing development facility, with capabilities for collaborative and proprietary industry engagement. Through its technology development programs, advanced manufacturing development facilities, system demonstrations, and reliability and testing capabilities, PVMC has demonstrated itself to be a recognized proving ground for innovative solar technologies and system designs. PVMC comprises multiple locations, with the core manufacturing and deployment support activities conducted at the Solar Energy Development Center (SEDC), and the core Si wafering and metrology technologies being headed out of the University of Central Florida. The SEDC provides a pilot line for proof-of-concept prototyping, offering critical opportunities to demonstrate emerging concepts in PV manufacturing, such as evaluations of innovative materials, system components, and PV system designs. The facility, located in Halfmoon NY, encompasses 40,000 square feet of dedicated PV development space. 
The infrastructure and capabilities housed at PVMC include PV system-level testing at the Prototype Demonstration Facility (PDF); manufacturing-scale cell and module fabrication at the Manufacturing Development Facility (MDF); and cell and module testing and reliability equipment on its PV pilot line, all integrated with a PV performance database and analytical characterizations for PVMC and its partners' test and commercial arrays. Additional development and deployment support are also housed at the SEDC, such as cost modeling and cost-model-based development activities for PV and thin-film modules, components, and system-level designs for reduced LCOE through lower installation hardware costs, labor reductions, lower soft costs, and reduced operations and maintenance costs. The consortium's activities started with infrastructure and capabilities build-out focused on CIGS thin-film photovoltaics, with a particular focus on flexible cell and module production. As the marketplace changed and partners' objectives shifted, the consortium shifted heavily towards deployment and market-pull activities, including balance of system, cost modeling, and installation cost reduction efforts, along with their impacts on performance and DER operational costs. The consortium consisted of a wide array of PV supply chain companies, from equipment and component suppliers through national developers and installers, with a particular focus on commercial-scale deployments (typically 25 to 2MW installations). With DOE funding ending after the fifth budget period, the advantages and disadvantages of such a consortium are detailed, along with potential avenues for self-sustainability.

  12. System-Integrated Finite Element Analysis of a Full-Scale Helicopter Crash Test with Deployable Energy Absorbers

    NASA Technical Reports Server (NTRS)

    Annett, Martin S.; Polanco, Michael A.

    2010-01-01

    A full-scale crash test of an MD-500 helicopter was conducted in December 2009 at NASA Langley's Landing and Impact Research facility (LandIR). The MD-500 helicopter was fitted with a composite honeycomb Deployable Energy Absorber (DEA) and tested under vertical and horizontal impact velocities of 26-ft/sec and 40-ft/sec, respectively. The objectives of the test were to evaluate the performance of the DEA concept under realistic crash conditions and to generate test data for validation of a system integrated finite element model. In preparation for the full-scale crash test, a series of sub-scale and MD-500 mass simulator tests was conducted to evaluate the impact performances of various components, including a new crush tube and the DEA blocks. Parameters defined within the system integrated finite element model were determined from these tests. The objective of this paper is to summarize the finite element models developed and analyses performed, beginning with pre-test predictions and continuing through post-test validation.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendelsohn, M.; Lowder, T.; Canavan, B.

    Over the last several years, solar energy technologies have been, or are in the process of being, deployed at unprecedented levels. A critical recent development, resulting from the massive scale of projects in progress or recently completed, is having the power sold directly to electric utilities. Such 'utility-scale' systems offer the opportunity to deploy solar technologies far faster than the traditional 'behind-the-meter' projects designed to offset retail load. Moreover, these systems have employed significant economies of scale during construction and operation, attracting financial capital, which in turn can reduce the delivered cost of power. This report is a summary of the current U.S. utility-scale solar state-of-the-market and development pipeline. Utility-scale solar energy systems are generally categorized as one of two basic designs: concentrating solar power (CSP) and photovoltaic (PV). CSP systems can be further delineated into four commercially available technologies: parabolic trough, central receiver (CR), parabolic dish, and linear Fresnel reflector. CSP systems can also be categorized as hybrid, which combine a solar-based system (generally parabolic trough, CR, or linear Fresnel) and a fossil fuel energy system to produce electric power or steam.

  14. Family Functioning Differences Across the Deployment Cycle in British Army Families: The Perceptions of Wives and Children.

    PubMed

    Pye, Rachel E; Simpson, Leanne K

    2017-09-01

    Military deployment can have an adverse effect on a soldier's family, though little research has looked at these effects in a British sample. We investigated the perceptions of wives of U.K.-serving soldiers regarding marital and family functioning across three stages of the deployment cycle: currently deployed, postdeployment, and predeployment, plus a nonmilitary comparison group. Uniquely, young (aged 3.5-11 years) children's perceptions of their family were also investigated, using the parent-child alliance (PCA) coding scheme of drawings of the family. Two hundred and twenty British military families of regular service personnel from the British Army's Royal Armoured Corps were sent survey packs distributed with a monthly welfare office newsletter. Wives were asked to complete a series of self-report items, and the youngest child in the family between the ages of 3.5 and 11 years was asked to draw a picture of their family. Complete data were available for 78 military families, and an additional 34 nonmilitary families were recruited via opportunity sampling. Results indicated wives of currently deployed and recently returned personnel were less satisfied with their family and its communication, and children's pictures indicated higher levels of dysfunctional parent-child alliance, whereas predeployment families responded similarly to nonmilitary families. Marital satisfaction was similar across all groups except predeployment families, who were significantly more satisfied. Nonmilitary and predeployment families showed balanced family functioning, and currently and recently deployed families demonstrated poor family functioning. In comparison to nonmilitary families, predeployment families showed a large "spike" in the rigidity subscale of the Family Adaptability and Cohesion Evaluation Scale IV. Wives' perceptions of family functioning, but not marital satisfaction, differed between the deployment groups. 
The results from the coded children's drawings correlated with the self-report measures from the wife/mother, indicating that children's drawings could be a useful approach when working with younger children in this area. It is tentatively suggested that the differences across deployment stage on family functioning could be mediated not only by communication difficulties between deployed personnel and their families, but also by its effect on the children in the family. Larger-scale longitudinal research is needed to investigate this further. Reprint & Copyright © 2017 Association of Military Surgeons of the U.S.

  15. Strategies Which Foster Broad Use and Deployment of Earth and Space Science Informal and Formal Education Resources

    NASA Technical Reports Server (NTRS)

    Meeson, Blanche W.; Gabrys, Robert; Ireton, M. Frank; Einaudi, Franco (Technical Monitor)

    2001-01-01

    Education projects supported by federal agencies and carried out by a wide range of organizations foster learning about Earth and space systems science in a wide array of venues. Across these agencies, a range of strategies is employed to ensure that effective materials are created for these diverse venues and deployed broadly, so that a large spectrum of the American public, adults and children alike, can learn about and become excited by Earth and space system science. This session will highlight some of those strategies and will cover representative examples to illustrate their effectiveness. Invited speakers from selected formal and informal educational efforts will anchor this session. Speakers with representative examples are encouraged to submit abstracts for the session to showcase the strategies which they use.

  16. Dual-Polarization Observations of Precipitation: State of the Art in Operational and Research Applications

    NASA Astrophysics Data System (ADS)

    Chandra, C. V.; Moisseev, D. N.; Baldini, L.; Bechini, R.; Cremonini, R.; Wolff, D. B.; Petersen, W. A.; Junyent, F.; Chen, H.; Beauchamp, R.

    2016-12-01

    Dual-polarization weather radars have been widely used for rainfall measurement applications and studies of the microphysical characteristics of precipitation. Ground-based, dual-polarization radar systems form the cornerstones of national severe weather warning and forecasting infrastructure in many developed countries. As a result of the improved performance of dual-polarization radars for these applications, large-scale dual-polarization upgrades are being planned for India and China. In addition to national forecast and warning operations, dual-polarization radars have also been used for satellite ground validation activities. The operational dual-polarization radars in the US are mostly S-band systems, whereas those in Europe are mostly C-band systems. In addition, a third class of systems is emerging in urban regions, where networks of X-band systems are being deployed operationally. There are successful networks planned or already deployed in big cities such as Dallas-Fort Worth, Tokyo, and Beijing, and these X-band networks are developing their own operational domain. In summary, a large infrastructure of user-specified products and dual operational/research applications is also emerging around these systems. This paper will discuss some of the innovative uses of the operational dual-polarization radar networks for research purposes, with references to calibration, hydrometeor classification, and quantitative precipitation estimation. Additional application to the study of precipitation processes will also be discussed.
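    Quantitative precipitation estimation, mentioned above, is commonly built on power-law relations between reflectivity and rain rate, such as the textbook Marshall-Palmer relation Z = 200 R^1.6 (Z in mm^6/m^3, R in mm/h). Coefficients vary with radar band, climate, and drop-size assumptions; this is a generic sketch, not any network's operational algorithm:

```python
# Invert the Z-R power law Z = a * R**b, with reflectivity given in dBZ
# (dBZ = 10 * log10(Z), Z in linear units of mm^6/m^3).
def rain_rate_mm_h(dbz, a=200.0, b=1.6):
    z_linear = 10.0 ** (dbz / 10.0)
    return (z_linear / a) ** (1.0 / b)
```

    With the default coefficients, 23 dBZ corresponds to roughly 1 mm/h of light rain; dual-polarization variables such as differential phase are used operationally to refine estimates like this one, particularly in heavy rain.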

  17. Development of the geoCamera, a System for Mapping Ice from a Ship

    NASA Astrophysics Data System (ADS)

    Arsenault, R.; Clemente-Colon, P.

    2012-12-01

    The geoCamera produces maps of the ice surrounding an ice-capable ship by combining images from one or more digital cameras with the ship's position and attitude data. Maps are produced along the ship's path, with the achievable width and resolution depending on camera mounting height as well as camera resolution and lens parameters. Our system has produced maps up to 2000 m wide at 1 m resolution. Once installed and calibrated, the system is designed to operate automatically, producing maps in near real-time and making them available to on-board users via existing information systems. The resulting small-scale maps complement existing satellite-based products as well as on-board observations. Development versions were temporarily deployed in Antarctica on the RV Nathaniel B. Palmer in 2010 and in the Arctic on the USCGC Healy in 2011. A permanent system was deployed during the summer of 2012 on the USCGC Healy. To make the system attractive to other ships of opportunity, design goals include using existing ship systems when practical, using low-cost commercial off-the-shelf components when additional hardware is necessary, automating the process to virtually eliminate adding to the workload of ship's technicians, and making the software components modular and flexible enough to allow more seamless integration with a ship's particular IT system.
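    The dependence of map width and resolution on mounting height and lens parameters can be approximated with simple geometry: for a camera at height h looking down at a shallow depression angle, a pixel subtending angle theta covers roughly (slant distance) × theta of ground across-track. A flat-earth, small-angle sketch only; the real geoCamera solves the full projection using ship attitude data, and the numbers below are illustrative:

```python
import math

# Approximate across-track ground footprint of one pixel for a camera
# at height_m, looking depression_deg below horizontal, where each
# pixel subtends pixel_angle_rad (pixel pitch / focal length).
def ground_pixel_size_m(height_m, depression_deg, pixel_angle_rad):
    slant = height_m / math.sin(math.radians(depression_deg))
    return slant * pixel_angle_rad

# A camera 20 m up the mast, looking 5 degrees below horizontal, with
# 0.5 mrad pixels, resolves on the order of 0.1 m across-track.
res = ground_pixel_size_m(20.0, 5.0, 0.0005)
```

    The same geometry explains why resolution degrades rapidly toward the horizon: as the depression angle shrinks, the slant distance, and with it the per-pixel footprint, grows.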

  18. Optical techniques for the determination of nitrate in environmental waters: Guidelines for instrument selection, operation, deployment, maintenance, quality assurance, and data reporting

    USGS Publications Warehouse

    Pellerin, Brian A.; Bergamaschi, Brian A.; Downing, Bryan D.; Saraceno, John Franco; Garrett, Jessica D.; Olsen, Lisa D.

    2013-01-01

    The recent commercial availability of in situ optical sensors, together with new techniques for data collection and analysis, provides the opportunity to monitor a wide range of water-quality constituents on time scales in which environmental conditions actually change. Of particular interest is the application of ultraviolet (UV) photometers for in situ determination of nitrate concentrations in rivers and streams. The variety of UV nitrate sensors currently available differ in several important ways related to instrument design that affect the accuracy of their nitrate concentration measurements in different types of natural waters. This report provides information about selection and use of UV nitrate sensors by the U.S. Geological Survey to facilitate the collection of high-quality data across studies, sites, and instrument types. For those in need of technical background and information about sensor selection, this report addresses the operating principles, key features and sensor design, sensor characterization techniques and typical interferences, and approaches for sensor deployment. For those needing information about maintaining sensor performance in the field, key sections in this report address maintenance and calibration protocols, quality-assurance techniques, and data formats and reporting. Although the focus of this report is UV nitrate sensors, many of the principles can be applied to other in situ optical sensors for water-quality studies.
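    The operating principle behind the UV photometers discussed above is the Beer-Lambert law, A = ε c l: absorbance is proportional to concentration times optical path length. A simplified single-wavelength sketch (real instruments fit multiple wavelengths and correct for interferences such as dissolved organic matter and bromide; the numbers are illustrative):

```python
# Concentration (mol/L) from measured absorbance via Beer-Lambert,
# A = epsilon * c * l, for a molar absorptivity epsilon (L/(mol*cm))
# and optical path length l (cm).
def nitrate_molar(absorbance, epsilon_l_per_mol_cm, path_cm):
    return absorbance / (epsilon_l_per_mol_cm * path_cm)
```

    The path-length term is one reason sensor design matters: instruments with different path lengths trade sensitivity at low concentrations against saturation in turbid or high-nitrate waters.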

  19. Implementation of a wireless sensor network for heart rate monitoring in a senior center.

    PubMed

    Huang, Jyh-How; Su, Tzu-Yao; Raknim, Paweeya; Lan, Kun-Chan

    2015-06-01

    Wearable sensor systems are widely used to monitor vital signs in hospitals and, in recent years, have also been used at home. In this article we present a system that includes a ring probe, sensor, radio, and receiver, designed for use as a long-term heart rate monitoring system in a senior center. The primary contribution of this article is the successful implementation of a cheap, large-scale wireless heart rate monitoring system that is stable and comfortable to use 24 h a day. We developed new finger-ring sensors for a comfortable continuous wearing experience and used dynamic power adjustment on the ring so the sensor can detect pulses at different strength levels. Our system has been deployed in a senior center since May 2012, and 63 seniors have used this system in this period. During the 54-h system observation period, 10 alarms were set off. Eight of them were due to abnormal heart rate, and two of them were due to loose probes. The monitoring system runs stably on the senior center's existing WiFi network and achieves 99.48% system availability. The managers and caregivers use our system as a reliable warning system for clinical deterioration. The results of the year-long deployment show that the wireless group heart rate monitoring system developed in this work is viable for use within a designated area.

  20. Compact Optical Atomic Clock Based on a Two-Photon Transition in Rubidium

    NASA Astrophysics Data System (ADS)

    Martin, Kyle W.; Phelps, Gretchen; Lemke, Nathan D.; Bigelow, Matthew S.; Stuhl, Benjamin; Wojcik, Michael; Holt, Michael; Coddington, Ian; Bishop, Michael W.; Burke, John H.

    2018-01-01

    Extralaboratory atomic clocks are necessary for a wide array of applications (e.g., satellite-based navigation and communication). Building upon existing vapor-cell and laser technologies, we describe an optical atomic clock, designed around a simple and manufacturable architecture, that utilizes the 778-nm two-photon transition in rubidium and yields fractional-frequency instabilities of 4 × 10⁻¹³/√τ (τ in seconds) for τ from 1 to 10 000 s. We present a complete stability budget for this system and explore the required conditions under which a fractional-frequency instability of 1 × 10⁻¹⁵ can be maintained on long time scales. We provide a precise characterization of the leading sensitivities to external processes, including magnetic fields and fluctuations of the vapor-cell temperature and 778-nm laser power. The system is constructed primarily from commercially available components, an attractive feature from the standpoint of the commercialization and deployment of optical frequency standards.
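
    The reported scaling is the characteristic white-frequency-noise signature σ_y(τ) = 4 × 10⁻¹³/√τ. A minimal sketch; extrapolating it beyond the measured 10 000 s, as done for `tau_needed` below, is only an assumption that ignores any long-term noise floor:

```python
import math

def fractional_instability(tau_s, coeff=4e-13):
    """White-frequency-noise scaling sigma_y(tau) = coeff / sqrt(tau),
    matching the 4e-13/sqrt(tau) reported for 1 s <= tau <= 10,000 s."""
    return coeff / math.sqrt(tau_s)

# Averaging time at which 1e-15 would be reached IF this scaling held
# indefinitely (an assumption, not a result from the paper):
tau_needed = (4e-13 / 1e-15) ** 2  # about 160,000 s
```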

  1. An earth imaging camera simulation using wide-scale construction of reflectance surfaces

    NASA Astrophysics Data System (ADS)

    Murthy, Kiran; Chau, Alexandra H.; Amin, Minesh B.; Robinson, M. Dirk

    2013-10-01

    Developing and testing advanced ground-based image processing systems for earth-observing remote sensing applications presents a unique challenge that requires advanced imagery simulation capabilities. This paper presents an earth-imaging multispectral framing camera simulation system called PayloadSim (PaySim) capable of generating terabytes of photorealistic simulated imagery. PaySim leverages previous work in 3-D scene-based image simulation, adding a novel method for automatically and efficiently constructing 3-D reflectance scenes by draping tiled orthorectified imagery over a geo-registered Digital Elevation Map (DEM). PaySim's modeling chain is presented in detail, with emphasis given to the techniques used to achieve computational efficiency. These techniques as well as cluster deployment of the simulator have enabled tuning and robust testing of image processing algorithms, and production of realistic sample data for customer-driven image product development. Examples of simulated imagery of Skybox's first imaging satellite are shown.
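
    The core step of draping orthorectified imagery over a DEM is sampling the elevation grid at fractional pixel positions. The sketch below is a generic bilinear-interpolation kernel for interior points of a small row-major grid, not PaySim's actual implementation:

```python
import math

def bilinear_sample(dem, x, y):
    """Bilinearly interpolate a regular-grid DEM at fractional pixel
    coordinates (x, y); dem is row-major, dem[row][col].

    Valid only for interior sample points (x, y strictly inside the
    grid); a real pipeline would also handle edges and nodata values.
    """
    x0, y0 = int(math.floor(x)), int(math.floor(y))
    dx, dy = x - x0, y - y0
    z00 = dem[y0][x0]
    z10 = dem[y0][x0 + 1]
    z01 = dem[y0 + 1][x0]
    z11 = dem[y0 + 1][x0 + 1]
    top = z00 * (1 - dx) + z10 * dx
    bottom = z01 * (1 - dx) + z11 * dx
    return top * (1 - dy) + bottom * dy

dem = [[0.0, 10.0],
       [20.0, 30.0]]
z = bilinear_sample(dem, 0.5, 0.5)  # midpoint of the four corners: 15.0
```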

  2. Non-aqueous homogenous biocatalytic conversion of polysaccharides in ionic liquids using chemically modified glucosidase.

    PubMed

    Brogan, Alex P S; Bui-Le, Liem; Hallett, Jason P

    2018-06-25

    The increasing requirement to produce platform chemicals and fuels from renewable sources means advances in biocatalysis are rapidly becoming a necessity. Biomass is widely used in nature as a source of energy and as chemical building blocks. However, recalcitrance towards traditional chemical processes and solvents provides a significant barrier to widespread utility. Here, by optimizing enzyme solubility in ionic liquids, we have discovered solvent-induced substrate promiscuity of glucosidase, demonstrating an unprecedented example of homogeneous enzyme bioprocessing of cellulose. Specifically, chemical modification of glucosidase for solubilization in ionic liquids can increase thermal stability to up to 137 °C, allowing for enzymatic activity 30 times greater than is possible in aqueous media. These results establish that through a synergistic combination of chemical biology (enzyme modification) and reaction engineering (solvent choice), the biocatalytic capability of enzymes can be intensified: a key step towards the full-scale deployment of industrial biocatalysis.

  3. Gimbals Drive and Control Electronics Design, Development and Testing of the LRO High Gain Antenna and Solar Array Systems

    NASA Technical Reports Server (NTRS)

    Chernyakov, Boris; Thakore, Kamal

    2010-01-01

    Launched June 18, 2009 on an Atlas V rocket, NASA's Lunar Reconnaissance Orbiter (LRO) is the first step in NASA's Vision for Space Exploration program and for a human return to the Moon. The spacecraft (SC) carries a wide variety of scientific instruments and provides an extraordinary opportunity to study the lunar landscape at resolutions and over time scales never achieved before. The spacecraft systems are designed to enable achievement of LRO's mission requirements. To that end, LRO's mechanical system employed two two-axis gimbal assemblies used to drive the deployment and articulation of the Solar Array System (SAS) and the High Gain Antenna System (HGAS). This paper describes the design, development, integration, and testing of Gimbal Control Electronics (GCE) and Actuators for both the HGAS and SAS systems, as well as flight testing during the on-orbit commissioning phase and lessons learned.

  4. Computational homogenisation for thermoviscoplasticity: application to thermally sprayed coatings

    NASA Astrophysics Data System (ADS)

    Berthelsen, Rolf; Denzer, Ralf; Oppermann, Philip; Menzel, Andreas

    2017-11-01

    Metal forming processes require wear-resistant tool surfaces in order to ensure a long life cycle of the expensive tools together with a constant high quality of the produced components. Thermal spraying is a widely applied technique for depositing wear-protection coatings. During these coating processes, heterogeneous coatings are deposited at high temperatures, followed by quenching, during which residual stresses occur that strongly influence the performance of the coated tools. The objective of this article is to discuss and apply a thermo-mechanically coupled simulation framework which captures the heterogeneity of the deposited coating material. To this end, a two-scale finite element framework for the solution of nonlinear thermo-mechanically coupled problems is elaborated and applied to the simulation of thermoviscoplastic material behaviour, including nonlinear thermal softening, in a geometrically linearised setting. The finite element framework and material model are demonstrated by means of numerical examples.

  5. Online Visualization and Analysis of Global Half-Hourly Infrared Satellite Data

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    Infrared (IR) images (approximately 11-micron channel) recorded by satellite sensors have been widely used in weather forecasting, research, and classroom education since the Nimbus program. Unlike visible images, IR imagery can reveal cloud features without sunlight illumination; therefore, it can be used to monitor weather phenomena day and night. With geostationary satellites deployed around the globe, it is possible to monitor weather events 24/7 at a temporal resolution that polar-orbiting satellites cannot achieve at the present time. When IR data from multiple geostationary satellites are merged to form a single product--also known as a merged product--it allows for observing weather on a global scale. Its high temporal resolution (e.g., every half hour) also makes it an ideal ancillary dataset for supporting other satellite missions, such as the Tropical Rainfall Measuring Mission (TRMM), by providing additional background information about weather system evolution.

  6. The IQ-wall and IQ-station -- harnessing our collective intelligence to realize the potential of ultra-resolution and immersive visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eric A. Wernert; William R. Sherman; Chris Eller

    2012-03-01

    We present a pair of open-recipe, affordably-priced, easy-to-integrate, and easy-to-use visualization systems. The IQ-wall is an ultra-resolution tiled display wall that scales up to 24 screens with a single PC. The IQ-station is a semi-immersive display system that utilizes commodity stereoscopic displays, lower-cost tracking systems, and touch overlays. These systems have been designed to support a wide range of research, education, creative activities, and information presentations. They were designed to work equally well as stand-alone installations or as part of a larger distributed visualization ecosystem. We detail the hardware and software components of these systems, describe our deployments and experiences in a variety of research lab and university environments, and share our insights for effective support and community development.

  7. Incorporating variability in simulations of seasonally forced phenology using integral projection models

    DOE PAGES

    Goodsman, Devin W.; Aukema, Brian H.; McDowell, Nate G.; ...

    2017-11-26

    Phenology models are becoming increasingly important tools to accurately predict how climate change will impact the life histories of organisms. We propose a class of integral projection phenology models derived from stochastic individual-based models of insect development and demography. Our derivation, which is based on the rate summation concept, produces integral projection models that capture the effect of phenotypic rate variability on insect phenology, but which are typically more computationally frugal than equivalent individual-based phenology models. We demonstrate our approach using a temperature-dependent model of the demography of the mountain pine beetle (Dendroctonus ponderosae Hopkins), an insect that kills mature pine trees. This work illustrates how a wide range of stochastic phenology models can be reformulated as integral projection models. Due to their computational efficiency, these integral projection models are suitable for deployment in large-scale simulations, such as studies of altered pest distributions under climate change.
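
    The rate-summation concept the derivation builds on can be sketched simply: development completes when accumulated temperature-dependent rates reach one. The linear rate function below is hypothetical, not the actual mountain pine beetle rate curve:

```python
def rate_summation_completion(daily_temps_c, rate_fn):
    """Accumulate temperature-dependent development rates (rate
    summation): development completes when the cumulative rate reaches
    1. Returns the 1-based day of completion, or None if incomplete."""
    total = 0.0
    for day, temp in enumerate(daily_temps_c, start=1):
        total += rate_fn(temp)
        if total >= 1.0:
            return day
    return None

# Hypothetical linear rate: zero below a 5 C threshold, then
# proportional to the excess (an illustrative degree-day-style curve).
linear_rate = lambda t: max(0.0, (t - 5.0) / 40.0)

# At a constant 15 C the rate is 0.25/day, so development takes 4 days.
day = rate_summation_completion([15.0] * 10, linear_rate)
```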

  8. Incorporating variability in simulations of seasonally forced phenology using integral projection models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodsman, Devin W.; Aukema, Brian H.; McDowell, Nate G.

    Phenology models are becoming increasingly important tools to accurately predict how climate change will impact the life histories of organisms. We propose a class of integral projection phenology models derived from stochastic individual-based models of insect development and demography. Our derivation, which is based on the rate summation concept, produces integral projection models that capture the effect of phenotypic rate variability on insect phenology, but which are typically more computationally frugal than equivalent individual-based phenology models. We demonstrate our approach using a temperature-dependent model of the demography of the mountain pine beetle (Dendroctonus ponderosae Hopkins), an insect that kills mature pine trees. This work illustrates how a wide range of stochastic phenology models can be reformulated as integral projection models. Due to their computational efficiency, these integral projection models are suitable for deployment in large-scale simulations, such as studies of altered pest distributions under climate change.

  9. Tracking state deployments of commercial vehicle information systems and networks : 1998 Florida state report

    DOT National Transportation Integrated Search

    2000-06-01

    Micro-scale design (MSD) is a term that has been coined recently by transportation and land use planners to describe the human-scale features of the built environment. This concept focuses on accessibility to desired activities rather than on mobilit...

  10. Development of a remote sensing network for time-sensitive detection of fine scale damage to transportation infrastructure : [final report].

    DOT National Transportation Integrated Search

    2015-09-23

    This research project aimed to develop a remote sensing system capable of rapidly identifying fine-scale damage to critical transportation infrastructure following hazard events. Such a system must be pre-planned for rapid deployment, automate proces...

  11. Charting the Emergence of Corporate Procurement of Utility-Scale PV |

    Science.gov Websites

    Jeffrey J. Cook. Though most large-scale solar photovoltaic (PV) deployment has been driven by utilities, corporate interest in renewables is growing as more companies recognize that solar PV can provide clean energy. [Figure: map of the United States highlighting states with utility-scale solar PV purchasing options.]

  12. Potential climatic impacts and reliability of large-scale offshore wind farms

    NASA Astrophysics Data System (ADS)

    Wang, Chien; Prinn, Ronald G.

    2011-04-01

    The vast availability of wind power has fueled substantial interest in this renewable energy source as a potential near-zero greenhouse gas emission technology for meeting future world energy needs while addressing the climate change issue. However, in order to provide even a fraction of the estimated future energy needs, a large-scale deployment of wind turbines (several million) is required. The consequent environmental impacts, and the inherent reliability of such a large-scale usage of intermittent wind power, would have to be carefully assessed, in addition to the need to lower the currently high unit costs of wind power. Our previous study (Wang and Prinn 2010, Atmos. Chem. Phys., 10, 2053) using a three-dimensional climate model suggested that a large deployment of wind turbines over land to meet about 10% of predicted world energy needs in 2100 could lead to a significant temperature increase in the lower atmosphere over the installed regions. A global-scale perturbation to the general circulation patterns as well as to the cloud and precipitation distribution was also predicted. In the study reported here, we conducted a set of six additional model simulations using an improved climate model to further address the potential environmental and intermittency issues of large-scale deployment of offshore wind turbines for differing installation areas and spatial densities. In contrast to the previous land installation results, the offshore wind turbine installations are found to cause a surface cooling over the installed offshore regions. This cooling is due principally to the enhanced latent heat flux from the sea surface to the lower atmosphere, driven by an increase in turbulent mixing caused by the wind turbines, which was not entirely offset by the concurrent reduction of mean wind kinetic energy.
We found that the perturbation of the large-scale deployment of offshore wind turbines to the global climate is relatively small compared to the case of land-based installations. However, the intermittency caused by the significant seasonal wind variations over several major offshore sites is substantial, and demands further options to ensure the reliability of large-scale offshore wind power. The method that we used to simulate the offshore wind turbine effect on the lower atmosphere involved simply increasing the ocean surface drag coefficient. While this method is consistent with several detailed fine-scale simulations of wind turbines, it still needs further study to ensure its validity. New field observations of actual wind turbine arrays are definitely required to provide ultimate validation of the model predictions presented here.
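
    The parameterization described, representing turbine arrays by an increased ocean surface drag coefficient, rests on the bulk momentum-flux formula τ = ρ C_d U². A minimal sketch with illustrative coefficient values; the enhancement factor below is hypothetical, not the study's calibrated value:

```python
def wind_stress(u10_m_s, drag_coeff, air_density=1.2):
    """Bulk-formula surface momentum flux tau = rho * C_d * U^2 (Pa).
    Raising C_d over installation regions is how the study represents
    offshore turbine arrays in the climate model."""
    return air_density * drag_coeff * u10_m_s ** 2

baseline = wind_stress(8.0, 1.5e-3)  # typical open-ocean C_d (illustrative)
enhanced = wind_stress(8.0, 7.5e-3)  # hypothetical 5x enhancement for a turbine array
ratio = enhanced / baseline          # stress scales linearly with C_d
```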

  13. Fuel savings and emissions reductions from light duty fuel cell vehicles

    NASA Astrophysics Data System (ADS)

    Mark, J.; Ohi, J. M.; Hudson, D. V., Jr.

    1994-04-01

    Fuel cell vehicles (FCVs) operate efficiently, emit few pollutants, and run on nonpetroleum fuels. Because of these characteristics, the large-scale deployment of FCVs has the potential to lessen U.S. dependence on foreign oil and improve air quality. This study characterizes the benefits of large-scale FCV deployment in the light duty vehicle market. Specifically, the study assesses the potential fuel savings and emissions reductions resulting from large-scale use of these FCVs and identifies the key parameters that affect the scope of the benefits from FCV use. The analysis scenario assumes that FCVs will compete with gasoline-powered light trucks and cars in the new vehicle market for replacement of retired vehicles and will compete for growth in the total market. Analysts concluded that the potential benefits from FCVs, measured in terms of consumer outlays for motor fuel and the value of reduced air emissions, are substantial.

  14. Negative emissions: Part 1—research landscape and synthesis

    NASA Astrophysics Data System (ADS)

    Minx, Jan C.; Lamb, William F.; Callaghan, Max W.; Fuss, Sabine; Hilaire, Jérôme; Creutzig, Felix; Amann, Thorben; Beringer, Tim; de Oliveira Garcia, Wagner; Hartmann, Jens; Khanna, Tarun; Lenzi, Dominic; Luderer, Gunnar; Nemet, Gregory F.; Rogelj, Joeri; Smith, Pete; Vicente, Jose Luis Vicente; Wilcox, Jennifer; del Mar Zamora Dominguez, Maria

    2018-06-01

    With the Paris Agreement’s ambition of limiting climate change to well below 2 °C, negative emission technologies (NETs) have moved into the limelight of discussions in climate science and policy. Despite several assessments, the current knowledge on NETs is still diffuse and incomplete, but also growing fast. Here, we synthesize a comprehensive body of NETs literature, using scientometric tools and performing an in-depth assessment of the quantitative and qualitative evidence therein. We clarify the role of NETs in climate change mitigation scenarios, their ethical implications, as well as the challenges involved in bringing the various NETs to the market and scaling them up in time. There are six major findings arising from our assessment: first, keeping warming below 1.5 °C requires the large-scale deployment of NETs, but this dependency can still be kept to a minimum for the 2 °C warming limit. Second, accounting for economic and biophysical limits, we identify relevant potentials for all NETs except ocean fertilization. Third, any single NET is unlikely to sustainably achieve the large NETs deployment observed in many 1.5 °C and 2 °C mitigation scenarios. Yet, portfolios of multiple NETs, each deployed at modest scales, could be invaluable for reaching the climate goals. Fourth, a substantial gap exists between the upscaling and rapid diffusion of NETs implied in scenarios and progress in actual innovation and deployment. If NETs are required at the scales currently discussed, the resulting urgency of implementation is currently neither reflected in science nor policy. Fifth, NETs face severe barriers to implementation and are only weakly incentivized so far. Finally, we identify distinct ethical discourses relevant for NETs, but highlight the need to root them firmly in the available evidence in order to render such discussions relevant in practice.

  15. On Efficient Deployment of Wireless Sensors for Coverage and Connectivity in Constrained 3D Space.

    PubMed

    Wu, Chase Q; Wang, Li

    2017-10-10

    Sensor networks have been used in a rapidly increasing number of applications in many fields. This work generalizes a sensor deployment problem to place a minimum set of wireless sensors at candidate locations in constrained 3D space to k-cover a given set of target objects. By exhausting the combinations of discreteness/continuousness constraints on either sensor locations or target objects, we formulate four classes of sensor deployment problems in 3D space: deploy sensors at Discrete/Continuous Locations (D/CL) to cover Discrete/Continuous Targets (D/CT). We begin with the design of an approximation algorithm for DLDT and then reduce DLCT, CLDT, and CLCT to DLDT by discretizing continuous sensor locations or target objects into a set of divisions without sacrificing sensing precision. Furthermore, we consider a connected version of each problem where the deployed sensors must form a connected network, and design an approximation algorithm to minimize the number of deployed sensors with a connectivity guarantee. For performance comparison, we design and implement an optimal solution and a genetic algorithm (GA)-based approach. Extensive simulation results show that the proposed deployment algorithms consistently outperform the GA-based heuristic, achieve close-to-optimal performance in small-scale problem instances, and achieve significantly better overall performance than the theoretical upper bound.
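
    A common approximation strategy for DLDT-style problems is a greedy cover heuristic. The sketch below is a generic greedy k-cover over discrete locations and targets, not the paper's specific approximation algorithm; the toy instance is purely illustrative:

```python
def greedy_k_cover(coverage, targets, k):
    """Greedy sketch of the DLDT setting: choose candidate locations
    (coverage[loc] = set of targets a sensor there senses) until every
    target is covered by at least k chosen sensors.

    Returns the chosen locations, or None if k-coverage is infeasible.
    """
    deficit = {t: k for t in targets}       # remaining coverage each target needs
    chosen = []
    remaining = dict(coverage)
    while any(d > 0 for d in deficit.values()):
        if not remaining:
            return None                     # ran out of candidate locations
        # Pick the location that removes the most residual deficit.
        best = max(remaining,
                   key=lambda loc: sum(1 for t in remaining[loc]
                                       if deficit.get(t, 0) > 0))
        gain = sum(1 for t in remaining[best] if deficit.get(t, 0) > 0)
        if gain == 0:
            return None                     # no location helps any longer
        for t in remaining[best]:
            if deficit.get(t, 0) > 0:
                deficit[t] -= 1
        chosen.append(best)
        del remaining[best]
    return chosen

# Toy instance: four candidate locations, three targets, 2-coverage.
cov = {"A": {1, 2}, "B": {2, 3}, "C": {1, 3}, "D": {1, 2, 3}}
picks = greedy_k_cover(cov, [1, 2, 3], k=2)
```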

  16. A Bayesian Framework for Reliability Analysis of Spacecraft Deployments

    NASA Technical Reports Server (NTRS)

    Evans, John W.; Gallo, Luis; Kaminsky, Mark

    2012-01-01

    Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a two stage sequential Bayesian framework for reliability estimation of spacecraft deployment was developed for this purpose. This process was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the Optical Telescope Element. Initially, detailed studies of NASA deployment history, "heritage information", were conducted, extending over 45 years of spacecraft launches. This information was then coupled to a non-informative prior and a binomial likelihood function to create a posterior distribution for deployments of various subsystems using Markov chain Monte Carlo sampling. Select distributions were then coupled to a subsequent analysis, using test data and anomaly occurrences on successive ground test deployments of scale model test articles of JWST hardware, to update the NASA heritage data. This allowed for a realistic prediction for the reliability of the complex Sunshield deployment, with credibility limits, within this two stage Bayesian framework.
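
    For a binomial likelihood the Beta prior is conjugate, so the posterior the study samples by Markov chain Monte Carlo has a closed form that is easy to sketch. The deployment counts below are illustrative, not the actual NASA heritage or JWST test data:

```python
def beta_binomial_posterior(successes, trials, alpha0=1.0, beta0=1.0):
    """Conjugate Beta-Binomial update: a Beta(alpha0, beta0) prior with
    s successes in n binomial trials yields Beta(alpha0+s, beta0+n-s).

    The default Beta(1, 1) is the flat non-informative prior; MCMC
    sampling, as used in the study, recovers this same posterior.
    """
    alpha = alpha0 + successes
    beta = beta0 + trials - successes
    mean = alpha / (alpha + beta)           # posterior mean reliability
    return alpha, beta, mean

# Illustrative numbers only: 48 successful deployments out of 50.
a, b, m = beta_binomial_posterior(48, 50)   # posterior mean 49/52
```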

  17. GIS-based suitability modeling and multi-criteria decision analysis for utility scale solar plants in four states in the Southeast U.S

    NASA Astrophysics Data System (ADS)

    Tisza, Kata

    Photovoltaic (PV) development shows significantly smaller growth in the Southeast U.S. than in the Southwest, mainly due to the low cost of fossil-fuel based energy production in the region and the lack of solar incentives. However, the Southeast has appropriate insolation conditions (4.0-6.0 kWh/m2/day) for photovoltaic deployment, and in the past decade the region has experienced the highest population growth in the entire country. These factors, combined with new renewable energy portfolio policies, could create an opportunity for PV to provide some of the energy that will be required to sustain this growth. The goal of the study was to investigate the potential for PV generation in the Southeast region by identifying suitable areas for utility-scale solar power plant deployment. Four states with currently low solar penetration were studied: Georgia, North Carolina, South Carolina and Tennessee. Feasible areas were assessed with Geographic Information Systems (GIS) software using solar, land use and population growth criteria combined with proximity to transmission lines and roads. After the GIS-based assessment of the areas, technological potential was calculated for each state. A multi-criteria decision analysis (MCDA) model was used to simulate the decision making method for a strategic PV installation. The model accounted for all criteria necessary to consider in a PV development and also included economic and policy criteria, which are thought to strongly influence the PV market. Three different scenarios were established, representing decision makers' theoretical preferences. Map layers created in the first part were used as the basis for the MCDA, and additional technical, economic and political/market criteria were added. A sensitivity analysis was conducted to test the model's robustness. Finally, weighted criteria were assigned to the GIS map layers, so that the different preference systems could be visualized. 
    As a result, lands suitable for a potential industrial-scale PV deployment were assessed. Moreover, a precise calculation of technical potential was conducted, with a capacity factor determined from the actual insolation of each specific feasible area. The results of the study showed that, for utility-scale PV deployment, a significant amount of feasible area is available, with good electricity-generation potential. Moreover, a stable MCDA model was established for supporting strategic decision making in a PV deployment. Also, changes of suitable lands for utility-scale PV installations were visualized in GIS for the state of Tennessee.
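
    The weighted aggregation step of such an MCDA can be sketched as a weighted sum over normalized criterion scores. The criterion names, weights, and site values below are hypothetical, not the study's actual GIS layers or preference scenarios:

```python
def weighted_score(site, weights):
    """Weighted-sum aggregation, the simplest MCDA scoring rule.
    Criterion values are assumed pre-normalized to [0, 1]."""
    return sum(weights[c] * site[c] for c in weights)

# Illustrative criteria and weights (one hypothetical preference scenario).
weights = {"insolation": 0.4, "grid_proximity": 0.3, "land_cost": 0.3}
sites = {
    "site1": {"insolation": 0.9, "grid_proximity": 0.5, "land_cost": 0.6},
    "site2": {"insolation": 0.7, "grid_proximity": 0.9, "land_cost": 0.8},
}
ranked = sorted(sites, key=lambda s: weighted_score(sites[s], weights),
                reverse=True)
```

    A sensitivity analysis like the study's amounts to re-running this ranking under perturbed weight vectors and checking whether the ordering changes.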

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hussain, Hameed; Malik, Saif Ur Rehman; Hameed, Abdul

    An efficient resource allocation is a fundamental requirement in high performance computing (HPC) systems. Many projects dedicated to large-scale distributed computing systems have designed and developed resource allocation mechanisms with a variety of architectures and services. In our study, a comprehensive survey describing resource allocation in various HPC systems is reported. The aim of the work is to aggregate under a joint framework the existing solutions for HPC, to provide a thorough analysis and characterization of the resource management and allocation strategies. Resource allocation mechanisms and strategies play a vital role in the performance improvement of all the HPC classifications. Therefore, a comprehensive discussion of widely used resource allocation strategies deployed in HPC environments is required, which is one of the motivations of this survey. Moreover, we have classified HPC systems into three broad categories, namely: (a) cluster, (b) grid, and (c) cloud systems, and define the characteristics of each class by extracting sets of common attributes. All of the aforementioned systems are cataloged into pure software and hybrid/hardware solutions. The system classification is used to identify approaches followed by the implementation of existing resource allocation strategies that are widely presented in the literature.

  19. The Effects of Prior Combat Experience on the Expression of Somatic and Affective Symptoms in Deploying Soldiers

    DTIC Science & Technology

    2006-01-01

    Journal of Psychosomatic Research. The effects of prior combat experience on the expression of somatic and affective symptoms in deploying soldiers. William... rates of somatic complaints compared with combat-naive soldiers. Methods: Self-reports of posttraumatic stress disorder (PTSD) and affective and somatic ... identical for the experienced and inexperienced groups, scores on the Affective and Somatic scales differed as a function of prior combat history. Previous

  20. Findings from the Supersonic Qualification Program of the Mars Science Laboratory Parachute System

    NASA Technical Reports Server (NTRS)

    Sengupta, Anita; Steltzner, Adam; Witkowski, Allen; Candler, Graham; Pantano, Carlos

    2009-01-01

    In 2012, the Mars Science Laboratory Mission (MSL) will deploy NASA's largest extra-terrestrial parachute, a technology integral to the safe landing of its advanced robotic explorer on the surface. The supersonic parachute system is a mortar-deployed 21.5 m disk-gap-band (DGB) parachute, identical in geometric scaling to the Viking era DGB parachutes of the 1970s. The MSL parachute deployment conditions are Mach 2.3 at a dynamic pressure of 750 Pa. The Viking Balloon Launched Decelerator Test (BLDT) successfully demonstrated a maximum of 700 Pa at Mach 2.2 for a 16.1 m DGB parachute in its AV4 flight. All previous Mars deployments have derived their supersonic qualification from the Viking BLDT test series, obviating the need for full-scale high-altitude supersonic testing. The qualification programs for the Mars Pathfinder, Mars Exploration Rover, and Phoenix Scout Missions were all limited to subsonic structural qualification, with supersonic performance and survivability bounded by the BLDT qualification. The MSL parachute, at the edge of the supersonic heritage deployment space and 33% larger than the Viking parachute, accepts a certain degree of risk without addressing the supersonic environment in which it will deploy. In addition, MSL will spend up to 10 seconds above Mach 1.5, an aerodynamic regime that is associated with a known parachute instability characterized by significant canopy projected-area fluctuation and dynamic drag variation. This aerodynamic instability, referred to as "area oscillations" by the parachute community, has drag performance, inflation stability, and structural implications, introducing risk to mission success if not quantified for the MSL parachute system. To minimize this risk, and as an alternative to a prohibitively expensive high-altitude test program, a multi-phase qualification program using computational simulation validated by subscale testing was developed and implemented for MSL. 
    The first phase consisted of supersonic wind tunnel testing of a 2%-of-full-scale rigid DGB parachute with entry vehicle to validate two high-fidelity computational fluid dynamics (CFD) tools. The computer codes utilized Large Eddy Simulation and Detached Eddy Simulation numerical approaches to accurately capture the turbulent wake of the entry vehicle and its coupling to the parachute bow shock. The second phase was the development of fluid structure interaction (FSI) computational tools to predict parachute response to the supersonic flow field. The FSI development included the integration of the CFD from the first phase with a finite element structural model of the parachute membrane and cable elements. In this phase, a 4%-of-full-scale supersonic flexible-parachute test program was conducted to provide validation data for the FSI code and an empirical dataset of the MSL parachute in a flight-like environment. The final phase is FSI simulations of the full-scale MSL parachute in a Mars-type deployment. Findings from this program will be presented in terms of code development and validation, empirical findings from the supersonic testing, and drag performance during supersonic operation.

  1. Post-deployment usability evaluation of a radiology workstation.

    PubMed

    Jorritsma, Wiard; Cnossen, Fokie; Dierckx, Rudi A; Oudkerk, Matthijs; Van Ooijen, Peter M A

    2016-01-01

    To determine the number, nature and severity of usability issues radiologists encounter while using a commercially available radiology workstation in clinical practice, and to assess how well the results of a pre-deployment usability evaluation of this workstation generalize to clinical practice. The usability evaluation consisted of semi-structured interviews and observations of twelve users using the workstation during their daily work. Usability issues and positive usability findings were documented. Each issue was given a severity rating and its root cause was determined. Results were compared to the results of a pre-deployment usability evaluation of the same workstation. Ninety-two usability issues were identified, ranging from issues that cause minor frustration or delay, to issues that cause significant delays, prevent users from completing tasks, or even pose a potential threat to patient safety. The results of the pre-deployment usability evaluation had limited generalizability to clinical practice. This study showed that radiologists encountered a large number and a wide variety of usability issues when using a commercially available radiology workstation in clinical practice. This underlines the need for effective usability engineering in radiology. Given the limitations of pre-deployment usability evaluation in radiology, which were confirmed by our finding that the results of a pre-deployment usability evaluation of this workstation had limited generalizability to clinical practice, it is vital that radiology workstation vendors devote significant resources to usability engineering efforts before deployment of their workstation, and to continue these efforts after the workstation is deployed in a hospital.

  2. A genome-wide association study of resistance to stripe rust (Puccinia striiformis f. sp. tritici) in a worldwide collection of hexaploid spring wheat (Triticum aestivum L.)

    USDA-ARS?s Scientific Manuscript database

    New races of Puccinia striiformis f. sp. tritici (Pst), the causal pathogen of wheat stripe rust, show high virulence to previously deployed resistance genes and are causing large yield losses worldwide. To identify new sources of resistance, we performed a genome-wide association study (GWAS) using...

  3. Experience with ActiveX control for simple channel access

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timossi, C.; Nishimura, H.; McDonald, J.

    2003-05-15

    Accelerator control system applications at Berkeley Lab's Advanced Light Source (ALS) are typically deployed on operator consoles running Microsoft Windows 2000 and utilize EPICS [2] channel access for data access. In an effort to accommodate the wide variety of Windows-based development tools and developers with little experience in network programming, ActiveX controls have been deployed on the operator stations. The use of ActiveX controls in the accelerator control environment has been presented previously [1]. Here we report on some of our experiences with the use and development of these controls.

  4. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valassi, A.; Bartoldus, R.

    The CORAL software is widely used at CERN by the LHC experiments to access the data they store on relational databases, such as Oracle. Two new components have recently been added to implement a model involving a middle tier 'CORAL server' deployed close to the database and a tree of 'CORAL server proxies', providing data caching and multiplexing, deployed close to the client. A first implementation of the two new components, released in the summer of 2009, is now deployed in the ATLAS online system to read the data needed by the High Level Trigger, allowing the configuration of a farm of several thousand processes. This paper reviews the architecture of the software, its development status and its usage in ATLAS.
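    The caching-and-multiplexing role of the proxy tier can be illustrated with a small sketch (the class and query names below are hypothetical, not the actual CORAL API): a proxy forwards a read-only query to the database once and serves every subsequent identical query from its cache, so thousands of client processes generate a single backend hit.

```python
class DatabaseServer:
    """Stand-in for the relational backend; counts real queries it receives."""
    def __init__(self, data):
        self.data = data
        self.query_count = 0

    def execute(self, query):
        self.query_count += 1
        return self.data[query]


class CachingProxy:
    """Caches read-only query results so many clients share one backend hit."""
    def __init__(self, backend):
        self.backend = backend
        self.cache = {}

    def execute(self, query):
        if query not in self.cache:          # cache miss: forward to the database
            self.cache[query] = self.backend.execute(query)
        return self.cache[query]             # cache hit: no backend traffic


db = DatabaseServer({"SELECT trigger_config": ("menu_v42",)})
proxy = CachingProxy(db)

# Many trigger processes issuing the same query hit the database only once.
results = [proxy.execute("SELECT trigger_config") for _ in range(1000)]
```

    In the real deployment the proxies additionally form a tree and multiplex many client connections over one database session; the sketch shows only the caching half of that design.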

  5. Negative emissions—Part 3: Innovation and upscaling

    NASA Astrophysics Data System (ADS)

    Nemet, Gregory F.; Callaghan, Max W.; Creutzig, Felix; Fuss, Sabine; Hartmann, Jens; Hilaire, Jérôme; Lamb, William F.; Minx, Jan C.; Rogers, Sophia; Smith, Pete

    2018-06-01

    We assess the literature on innovation and upscaling for negative emissions technologies (NETs) using a systematic and reproducible literature coding procedure. To structure our review, we employ the framework of sequential stages in the innovation process, with which we code each NETs article in innovation space. We find that while there is a growing body of innovation literature on NETs, 59% of the articles are focused on the earliest stages of the innovation process, ‘research and development’ (R&D). The subsequent stages of innovation are also represented in the literature, but at much lower levels of activity than R&D. Distinguishing between innovation stages that are related to the supply of the technology (R&D, demonstrations, scale up) and demand for the technology (demand pull, niche markets, public acceptance), we find an overwhelming emphasis (83%) on the supply side. BECCS articles have an above average share of demand-side articles while direct air carbon capture and storage has a very low share. Innovation in NETs has much to learn from successfully diffused technologies; appealing to heterogeneous users, managing policy risk, as well as understanding and addressing public concerns are all crucial yet not well represented in the extant literature. Results from integrated assessment models show that while NETs play a key role in the second half of the 21st century for 1.5 °C and 2 °C scenarios, the major period of new NETs deployment is between 2030 and 2050. Given that the broader innovation literature consistently finds long time periods involved in scaling up and deploying novel technologies, there is an urgency to developing NETs that is largely unappreciated. This challenge is exacerbated by the thousands to millions of actors that potentially need to adopt these technologies for them to achieve planetary scale. This urgency is reflected neither in the Paris Agreement nor in most of the literature we review here. 
If NETs are to be deployed at the levels required to meet 1.5 °C and 2 °C targets, then important post-R&D issues will need to be addressed in the literature, including incentives for early deployment, niche markets, scale-up, demand, and—particularly if deployment is to be hastened—public acceptance.

  6. Gigwa-Genotype investigator for genome-wide analyses.

    PubMed

    Sempéré, Guilhem; Philippe, Florian; Dereeper, Alexis; Ruiz, Manuel; Sarah, Gautier; Larmande, Pierre

    2016-06-06

    Exploring the structure of genomes and analyzing their evolution is essential to understanding the ecological adaptation of organisms. However, with the large amounts of data being produced by next-generation sequencing, computational challenges arise in terms of storage, search, sharing, analysis and visualization. This is particularly true with regards to studies of genomic variation, which are currently lacking scalable and user-friendly data exploration solutions. Here we present Gigwa, a web-based tool that provides an easy and intuitive way to explore large amounts of genotyping data by filtering it not only on the basis of variant features, including functional annotations, but also on genotype patterns. The data storage relies on MongoDB, which offers good scalability properties. Gigwa can handle multiple databases and may be deployed in either single- or multi-user mode. In addition, it provides a wide range of popular export formats. The Gigwa application is suitable for managing large amounts of genomic variation data. Its user-friendly web interface makes such processing widely accessible. It can either be simply deployed on a workstation or be used to provide a shared data portal for a given community of researchers.
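    The kind of compound filter Gigwa applies, combining a functional annotation condition with a genotype-pattern condition, can be sketched as a plain MongoDB query document (the collection schema and field names below are illustrative assumptions, not Gigwa's actual data model):

```python
# Hypothetical variant-collection schema; field names are illustrative only,
# not Gigwa's actual MongoDB data model.
def build_variant_filter(effect, min_maf, homozygous_in):
    """Compose a MongoDB-style query document combining a functional
    annotation filter with a genotype-pattern filter."""
    return {
        "annotation.effect": effect,              # e.g. "missense_variant"
        "maf": {"$gte": min_maf},                 # minor allele frequency floor
        # genotype pattern: listed samples must be homozygous for the alt allele
        "$and": [{"genotypes.%s" % s: "1/1"} for s in homozygous_in],
    }

q = build_variant_filter("missense_variant", 0.05, ["sampleA", "sampleB"])
# With a live MongoDB server this document would be passed to collection.find(q).
```

    Expressing the filter as a single query document lets the database index and evaluate it server-side, which is what gives a MongoDB-backed tool its scalability on large variant sets.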

  7. Unit cost analysis of training and deploying paid community health workers in three rural districts of Tanzania.

    PubMed

    Tani, Kassimu; Exavery, Amon; Baynes, Colin D; Pemba, Senga; Hingora, Ahmed; Manzi, Fatuma; Phillips, James F; Kanté, Almamy Malick

    2016-07-08

    Tanzania, like other African countries, faces significant health workforce shortages. With advisory support and partnership from Columbia University, the Ifakara Health Institute and the Tanzanian Training Centre for International Health (TTCIH) developed and implemented the Connect Project as a randomized cluster experimental trial of the childhood survival impact of recruiting, training, and deploying a new cadre of paid community health workers (CHW), named "Wawazesha wa afya ya Jamii" (WAJA). This paper presents an estimation of the cost of training and deploying WAJA in three rural districts of Tanzania. Costing data were collected by tracking project activity expenditure records and conducting in-depth interviews of TTCIH staff who have led the training and deployment of WAJA, as well as their counterparts at Public Clinical Training Centres who have responsibility for scaling up the WAJA training program. The trial is registered with the International Standard Randomized Controlled Trial Register (ISRCTN96819844). The Connect training cost was US$ 2,489.3 per WAJA, of which 40.1 % was for meals, 20.2 % for accommodation, 10.2 % for tuition fees and the remaining 29.5 % for other costs, including instruction and training facilities and field allowance. For a comparable training program, the estimated unit cost of scaling up this training via regional/district clinical training centres would be US$ 833.5 per WAJA. Of this unit cost, 50.3 % would be for meals, 27.4 % for training fees, 13.7 % for field allowances, and 9 % for accommodation and medical insurance. The annual running cost of WAJA in a village is estimated at US$ 1.16 per capita. Costs estimated by this study are likely to be sustainable on a large scale, particularly if existing regional/district institutions are utilized for this program.

  8. Effects of Deployment Investment on the Growth of the Biofuels Industry. 2016 Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vimmerstedt, Laura J.; Warner, Ethan S.; Stright, Dana

    This report updates the 2013 report of the same title. Some text originally published in that report is retained and indicated in gray. In support of the national goals for biofuel use in the United States, numerous technologies have been developed that convert biomass to biofuels. Some of these biomass-to-biofuel conversion technology pathways are operating at commercial scales, while others are in earlier stages of development. The advancement of a new pathway toward commercialization involves various types of progress, including yield improvements, process engineering, and financial performance. Actions of private investors and public programs can accelerate the demonstration and deployment of new conversion technology pathways. These investors (both private and public) will pursue a range of pilot, demonstration, and pioneer scale biorefinery investments; the most cost-effective set of investments for advancing the maturity of any given biomass-to-biofuel conversion technology pathway is unknown. In some cases, whether or not the pathway itself will ultimately be technically and financially successful is also unknown. This report presents results from the Biomass Scenario Model--a system dynamics model of the biomass-to-biofuels system--that estimate effects of investments in biorefineries at different maturity levels and operational scales. The report discusses challenges in estimating effects of such investments and explores the interaction between this deployment investment and a volumetric production incentive. Model results show that investments in demonstration and deployment have a substantial growth impact on the development of the biofuels industry. Results also show that other conditions, such as accompanying incentives, have major impacts on the effectiveness of such investments. Results from the 2013 report are compared to new results. This report does not advocate for or against investments, incentives, or policies, but analyzes simulations of their hypothetical effects.
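    The system-dynamics flavor of such a model can be sketched minimally (all rates and values below are illustrative placeholders, not Biomass Scenario Model parameters): industry capacity is a stock whose inflow, new construction, is reinforced by existing capacity and boosted by a deployment incentive, while retirements drain it.

```python
def simulate_capacity(years, base_build, subsidy_boost, retirement_rate):
    """Toy stock-and-flow model: capacity (the stock) accumulates new builds
    (inflow, scaled up by a deployment subsidy) and loses retirements (outflow).
    All parameter values are illustrative, not Biomass Scenario Model inputs."""
    capacity = 1.0                  # initial industry capacity (arbitrary units)
    history = []
    for _ in range(years):
        builds = base_build * capacity * (1.0 + subsidy_boost)  # reinforcing loop
        retirements = retirement_rate * capacity
        capacity += builds - retirements
        history.append(capacity)
    return history

with_subsidy = simulate_capacity(20, 0.10, 0.5, 0.03)
no_subsidy = simulate_capacity(20, 0.10, 0.0, 0.03)
```

    The compounding structure is the point: because new builds scale with installed capacity, an early deployment incentive shifts the whole growth trajectory, not just one year's additions.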

  9. A stowing and deployment strategy for large membrane space systems on the example of Gossamer-1

    NASA Astrophysics Data System (ADS)

    Seefeldt, Patric

    2017-09-01

    Deployment systems for innovative space applications such as solar sails require a technique for controlled and autonomous deployment in space. The deployment process has a strong impact on the design and sizing of the mechanism and structure. Taking the design implemented in the Gossamer-1 project of the German Aerospace Center (DLR) as an example, such a stowing and deployment process is analyzed. It is based on a combination of zig-zag folding and coiling of triangular sail segments spanned between crossed booms. The deployment geometry and the forces introduced by the mechanism are explored in order to reveal how the loads are transferred through the membranes to structural components such as the booms. The folding geometry and force progressions are described by compositions of an inverse trigonometric function with the corresponding trigonometric function itself. If these compositions are evaluated over several periods of the trigonometric function, a non-smooth oscillating curve results. Depending on the trigonometric function, such curves are often vividly described as zig-zag or sawtooth functions. The developed functions are applied to the Gossamer-1 design. The deployment geometry reveals a tendency for the loads to be transferred along the catheti of the sail segments and therefore mainly along the boom axes. The load introduced by the spool deployment mechanism is described; by combining the deployment geometry with that load, a prediction of the deployment load progression is achieved. The mathematical description of the stowing and deployment geometry, as well as of the forces exerted by the mechanism, provides an understanding of how exactly the membrane deploys and through which edges the deployment forces are transferred. The mathematical analysis also gives an impression of sensitive parameters that could be influenced by manufacturing tolerances or asymmetric deployment of the sail segments.
While the mathematical model was applied to the design of the Gossamer-1 hardware, it allows analysis of other geometries as well. This is of particular interest because Gossamer-1 investigated deployment technology at the relatively small scale of 5 m × 5 m, while the currently considered solar sail missions require sails about one order of magnitude larger.
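    The function compositions described above can be reproduced directly: composing arcsin with sin yields the non-smooth triangular ("zig-zag") wave, and arctan with tan yields a sawtooth. A minimal numerical check:

```python
import math

def zigzag(x):
    """arcsin(sin x): a triangle wave with period 2*pi and amplitude pi/2."""
    return math.asin(math.sin(x))

def sawtooth(x):
    """arctan(tan x): a sawtooth wave with period pi, jumping at pi/2 + k*pi."""
    return math.atan(math.tan(x))

# Evaluated over several periods the composed curve oscillates but is not
# smooth: zigzag rises linearly to pi/2, falls linearly to -pi/2, and repeats.
samples = [zigzag(0.0), zigzag(math.pi / 2), zigzag(math.pi), zigzag(3 * math.pi / 2)]
```

    Sampling one such composition at the quarter-period points shows the linear rise to the peak, the return through zero, and the descent to the trough, exactly the piecewise-linear behavior the abstract describes.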

  10. Economically Sustainable Scaling of Photovoltaics to Meet Climate Targets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Needleman, David Berney; Poindexter, Jeremy R.; Kurchin, Rachel C.

    To meet climate goals, photovoltaics (PV) deployment will have to grow rapidly over the next fifteen years. We identify two barriers to this growth: scale-up of manufacturing capacity and the cost of PV module production. We explore several technoeconomic approaches to overcoming these barriers and identify deep reductions in the capital intensity (capex) of PV module manufacturing and large increases in module efficiency as the most promising routes to rapid deployment. Given the lag inherent in rolling out new technology, we explore an approach where growth is fueled by debt or subsidies in the short term and technological advances in the medium term. Finally, we analyze the current capex structure of crystalline silicon PV module manufacturing to identify potential savings.
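    The capex lever can be made concrete with a toy cost model (all figures below are illustrative, not the paper's data): the capital contribution to module cost per watt is the capital cost per watt of annual production capacity spread over the depreciation period and factory utilization, so halving capex halves that contribution.

```python
def capex_per_watt(capex_usd_per_w_yr, depreciation_years, utilization):
    """Toy model of the capital contribution to module cost (USD/W).
    capex_usd_per_w_yr: capital cost per watt of annual production capacity.
    All parameter values used below are illustrative, not industry data."""
    return capex_usd_per_w_yr / (depreciation_years * utilization)

baseline = capex_per_watt(0.70, 7, 0.9)   # hypothetical current capital intensity
reduced  = capex_per_watt(0.35, 7, 0.9)   # capital intensity cut in half
```

    The model also shows why capex matters for scale-up, not just cost: every watt of new annual capacity must be financed up front, so lower capital intensity reduces the debt or subsidy needed to fuel the short-term growth phase the abstract describes.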

  11. Resource allocation for epidemic control in metapopulations.

    PubMed

    Ndeffo Mbah, Martial L; Gilligan, Christopher A

    2011-01-01

    Deployment of limited resources is an issue of major importance for decision-making in crisis events. This is especially true for large-scale outbreaks of infectious diseases. Little is known when it comes to identifying the most efficient way of deploying scarce resources for control when disease outbreaks occur in different but interconnected regions. The policy maker is frequently faced with the challenge of optimizing efficiency (e.g. minimizing the burden of infection) while accounting for social equity (e.g. equal opportunity for infected individuals to access treatment). For a large range of diseases described by a simple SIRS model, we consider strategies that should be used to minimize the discounted number of infected individuals during the course of an epidemic. We show that when faced with the dilemma of choosing between socially equitable and purely efficient strategies, the choice of the control strategy should be informed by key measurable epidemiological factors such as the basic reproductive number and the efficiency of the treatment measure. Our model provides new insights for policy makers in the optimal deployment of limited resources for control in the event of epidemic outbreaks at the landscape scale.
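    The setting can be sketched minimally (illustrative parameters and a simple Euler integration, not the authors' optimization): two coupled regions governed by SIRS dynamics share a fixed treatment budget, and the allocation split changes the integrated infection burden.

```python
def simulate_sirs(split, budget=0.1, beta=0.5, gamma=0.1, delta=0.05,
                  coupling=0.01, steps=2000, dt=0.1):
    """Two-region SIRS model (S -> I -> R -> S) sharing a treatment budget.
    `split` is the budget fraction allocated to region 0; treatment adds to
    the recovery rate. All parameter values are illustrative."""
    S = [0.9, 0.9]; I = [0.1, 0.1]; R = [0.0, 0.0]
    burden = 0.0
    for _ in range(steps):
        alloc = [split * budget, (1 - split) * budget]
        dS = [0.0, 0.0]; dI = [0.0, 0.0]; dR = [0.0, 0.0]
        for k in (0, 1):
            other = 1 - k
            force = beta * I[k] + coupling * I[other]  # within- + between-region
            dS[k] = -force * S[k] + delta * R[k]       # waning immunity: R -> S
            dI[k] = force * S[k] - (gamma + alloc[k]) * I[k]
            dR[k] = (gamma + alloc[k]) * I[k] - delta * R[k]
        for k in (0, 1):
            S[k] += dt * dS[k]; I[k] += dt * dI[k]; R[k] += dt * dR[k]
        burden += dt * (I[0] + I[1])                   # integrated infection burden
    return burden

equitable = simulate_sirs(0.5)   # equal split between the two regions
skewed = simulate_sirs(1.0)      # all resources to one region
```

    With symmetric regions and these parameters the equitable split outperforms the fully skewed one, but the abstract's point is that which strategy wins depends on measurable quantities such as the basic reproductive number and treatment efficiency, which is why such a model is needed at all.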

  12. Ephemerality of discrete methane vents in lake sediments

    USGS Publications Warehouse

    Scandella, Benjamin P.; Pillsbury, Liam; Weber, Thomas; Ruppel, Carolyn D.; Hemond, Harold F.; Juanes, Ruben

    2016-01-01

    Methane is a potent greenhouse gas whose emission from sediments in inland waters and shallow oceans may both contribute to global warming and be exacerbated by it. The fraction of methane emitted by sediments that bypasses dissolution in the water column and reaches the atmosphere as bubbles depends on the mode and spatiotemporal characteristics of venting from the sediments. Earlier studies have concluded that hot spots—persistent, high-flux vents—dominate the regional ebullitive flux from submerged sediments. Here the spatial structure, persistence, and variability in the intensity of methane venting are analyzed using a high-resolution multibeam sonar record acquired at the bottom of a lake during multiple deployments over a 9 month period. We confirm that ebullition is strongly episodic, with distinct regimes of high flux and low flux largely controlled by changes in hydrostatic pressure. Our analysis shows that the spatial pattern of ebullition becomes homogeneous at the sonar's resolution over time scales of hours (for high-flux periods) or days (for low-flux periods), demonstrating that vents are ephemeral rather than persistent, and suggesting that long-term, lake-wide ebullition dynamics may be modeled without resolving the fine-scale spatial structure of venting.

  13. Energy efficiency evaluation of tree-topology 10 gigabit ethernet passive optical network and ring-topology time- and wavelength-division-multiplexed passive optical network

    NASA Astrophysics Data System (ADS)

    Song, Jingjing; Yang, Chuanchuan; Zhang, Qingxiang; Ma, Zhuang; Huang, Xingang; Geng, Dan; Wang, Ziyu

    2015-09-01

    Higher capacity and larger scales have always been the top targets for the evolution of optical access networks, driven by the ever-increasing demand from end users. Energy efficiency began to attract wide attention only recently, but it is at least as important as capacity and scale, given the severe environmental issues, such as global warming and air pollution, that now confront us. Here, different from the conventional energy consumption analysis of tree-topology networks, we propose an effective energy consumption calculation method to compare the energy efficiency of the tree-topology 10 gigabit ethernet passive optical network (10G-EPON) and the ring-topology time- and wavelength-division-multiplexed passive optical network (TWDM-PON), two experimental networks deployed in China. Numerical results show that the ring-topology TWDM-PON networks with 2, 4, 8, and 16 wavelengths are more energy efficient than the tree-topology 10G-EPON, although the 10G-EPON consumes less energy in total. TWDM-PON with four wavelengths is the most energy-efficient network candidate and saves 58.7% more energy than 10G-EPON when fully loaded.
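    One simple way to frame such a comparison (the device powers and capacities below are invented placeholders, not the paper's measured values) is energy per delivered bit: total equipment power divided by aggregate delivered throughput. This also illustrates how a network can consume less total power yet be less energy efficient.

```python
def energy_per_bit(olt_power_w, onu_power_w, n_onus, capacity_gbps, load):
    """Joules per delivered bit for a PON: total power over delivered traffic.
    A network can draw less total power yet be *less* energy efficient if it
    delivers proportionally less traffic. All device powers are illustrative."""
    total_power = olt_power_w + n_onus * onu_power_w      # watts = joules/second
    delivered_bps = capacity_gbps * 1e9 * load
    return total_power / delivered_bps

# Hypothetical comparison: a lower-capacity tree PON vs a 4-wavelength ring PON.
tree_10g = energy_per_bit(50.0, 5.0, 32, 10.0, 1.0)
ring_4wl = energy_per_bit(80.0, 6.0, 32, 40.0, 1.0)
```

    In this toy case the ring network draws more total power but carries four times the traffic, so its energy per bit is lower, mirroring the abstract's finding that TWDM-PON is more energy efficient even though 10G-EPON consumes less energy.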

  14. Unraveling metamaterial properties in zigzag-base folded sheets.

    PubMed

    Eidini, Maryam; Paulino, Glaucio H

    2015-09-01

    Creating complex spatial objects from a flat sheet of material using origami folding techniques has attracted attention in science and engineering. In the present work, we use the geometric properties of partially folded zigzag strips to better describe the kinematics of known zigzag/herringbone-base folded sheet metamaterials such as Miura-ori. Inspired by the kinematics of a one-degree-of-freedom zigzag strip, we introduce a class of cellular folded mechanical metamaterials comprising different scales of zigzag strips. This class of patterns combines origami folding techniques with kirigami. Using analytical and numerical models, we study the key mechanical properties of the folded materials. We show that our class of patterns, by expanding on the design space of Miura-ori, is appropriate for a wide range of applications from mechanical metamaterials to deployable structures at small and large scales. We further show that, depending on the geometry, these materials exhibit either negative or positive in-plane Poisson's ratios. By introducing a class of zigzag-base materials in the current study, we unify the concept of in-plane Poisson's ratio for similar materials in the literature and extend it to the class of zigzag-base folded sheet materials.

  15. A Tale of Two Cities - HSI-DOAS Measurements of Air Quality

    NASA Astrophysics Data System (ADS)

    Graves, Rosemarie; Leigh, Roland; Anand, Jasdeep; McNally, Michael; Lawrence, James; Monks, Paul

    2013-04-01

    Differential Optical Absorption Spectroscopy (DOAS) is now commonly used for air quality measurement, primarily through measurements of nitrogen dioxide (NO2), both as a ground-based and a satellite technique. CityScan is a Hemispherical Scanning Imaging Differential Optical Absorption Spectrometer (HSI-DOAS) which has been optimised to measure concentrations of nitrogen dioxide. CityScan has a 95˚ field of view (FOV) between the zenith and 5˚ below the horizon. Across this FOV there are 128 resolved elements which are measured concurrently; the spectrometer is rotated azimuthally at 1˚ per second, providing full hemispherical coverage every 6 minutes. CityScan measures concentrations of nitrogen dioxide over specific lines of sight, and thanks to the extensive field of view of the instrument, these measurements are representative over city-wide scales. Nitrogen dioxide is an important air pollutant which is produced in all combustion processes and can reduce lung function, especially in sensitised individuals. These instruments aim to bridge the gap in spatial scales between point-source measurements and satellite measurements of air quality, offering additional information on the emissions, transport and chemistry of nitrogen dioxide. More information regarding the CityScan technique can be found at http://www.leos.le.ac.uk/aq/index.html. CityScan has been deployed in both London and Bologna, Italy during 2012. The London deployment took place as part of the large NERC-funded ClearfLo project in January and July/August; CityScan was deployed in Bologna in June as part of the large EU project PEGASOS. Analysis of both campaigns will be used to give unprecedented levels of spatial information to air quality measurements, while also showing the difference in air quality between a relatively isolated mega-city and a smaller city situated in a very polluted region, in this case the Po Valley.
Results from multiple CityScan instruments will be used in conjunction with data from ground based in-situ monitor networks to evaluate the ability of in-situ monitors to effectively assess the air quality in an urban environment. Trend analysis will also be shown to demonstrate any changes in the air quality in London during the time of the Olympic Games in comparison with a normal summer.
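    The core DOAS retrieval can be sketched as a linear least-squares problem (the cross-section shapes and column densities below are synthetic, invented for the demonstration): the measured differential optical depth is fit as a linear combination of trace-gas absorption cross-sections, yielding the slant column density of each absorber.

```python
import numpy as np

# Synthetic absorption cross-sections for two absorbers on a wavelength grid
# (invented shapes and magnitudes; real NO2 cross-sections come from
# laboratory reference databases).
wavelengths = np.linspace(425.0, 450.0, 100)           # nm
sigma_no2 = 1e-19 * (1.0 + np.sin(wavelengths))        # cm^2 / molecule
sigma_o4  = 5e-20 * (1.0 + np.cos(2 * wavelengths))

true_scd = np.array([3e16, 1e17])                      # slant columns, molec/cm^2

# Beer-Lambert: optical depth tau(lambda) = sum_i sigma_i(lambda) * SCD_i
tau = sigma_no2 * true_scd[0] + sigma_o4 * true_scd[1]

# DOAS fit: recover the slant columns by linear least squares.
A = np.column_stack([sigma_no2, sigma_o4])
retrieved, *_ = np.linalg.lstsq(A, tau, rcond=None)
```

    A real retrieval additionally removes broadband scattering with a polynomial and fits measured rather than synthetic spectra, but the linear-fit core is the same for every line of sight the instrument scans.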

  16. Genetics of Resistant Hypertension: the Missing Heritability and Opportunities.

    PubMed

    Teixeira, Samantha K; Pereira, Alexandre C; Krieger, Jose E

    2018-05-19

    Blood pressure regulation in humans has long been known to be a genetically determined trait. The identification of causal genetic modulators for this trait has, however, been largely unfulfilled. Despite the recent advances of genome-wide genetic studies, loci associated with hypertension or blood pressure still explain a very low percentage of the overall variation of blood pressure in the general population. This has precluded the translation of discoveries in the genetics of human hypertension to clinical use. Here, we propose the combined use of resistant hypertension as a trait for mapping genetic determinants in humans and the integration of new large-scale technologies to approach, in model systems, the multidimensional nature of the problem. New large-scale efforts in the genetic and genomic arenas are paving the way for an increased and granular understanding of the genetic determinants of hypertension. New technologies for whole-genome sequencing and large-scale forward genetic screens can help prioritize genes and gene pathways for downstream characterization, and large-scale population studies and guided pharmacological design can be used to drive discoveries to translational application through better risk stratification and new therapeutic approaches. Although significant challenges remain in the mapping and identification of genetic determinants of hypertension, new large-scale technological approaches have been proposed to overcome some of the shortcomings that have limited progress in the area for the last three decades. The incorporation of these technologies into hypertension research may significantly help in the understanding of inter-individual blood pressure variation and the deployment of new phenotyping and treatment approaches for the condition.

  17. An Eulerian-Lagrangian description for fluvial coarse sediment transport: theory and verification with low-cost inertial sensors.

    NASA Astrophysics Data System (ADS)

    Maniatis, Georgios

    2017-04-01

    Fluvial sediment transport is controlled by hydraulics, sediment properties and arrangement, and flow history across a range of time scales. Single-reference-frame descriptions (Eulerian or Lagrangian) yield useful results but restrict the theoretical understanding of the process, as differences between the two phases (liquid and solid) are not explicitly accounted for. Recently, affordable Inertial Measurement Units (IMUs) that can be embedded in coarse (100 mm diameter scale) natural or artificial particles became available. These sensors are subject to technical limitations when deployed for natural sediment transport. However, they give us the ability to measure for the first time the inertial dynamics (acceleration and angular velocity) of sediment grains moving under fluvial transport. Theoretically, the assumption of an ideal IMU, rigidly attached at the centre of mass of a sediment particle, can greatly simplify the derivation of a general Eulerian-Lagrangian (E-L) model. This approach accounts for the inertial characteristics of particles in a Lagrangian (particle-fixed) frame, and for the hydrodynamics in an independent Eulerian frame. Simplified versions of the E-L model have been evaluated in laboratory experiments using real IMUs [Maniatis et al. 2015]. Here, experimental results are presented relevant to the evaluation of the complete E-L model. Artificial particles were deployed in a series of laboratory and field experiments. The particles are equipped with an IMU capable of recording acceleration over a ± 400 g range and angular velocities over a ± 1200 rad/s range. The sampling frequency ranges from 50 to 200 Hz for the total IMU measurement. Two sets of laboratory experiments were conducted in a 0.9 m wide laboratory flume. The first is a set of entrainment threshold experiments using two artificial particles: a sphere of D=90 mm (A) and an ellipsoid with axes of 100, 70 and 30 mm (B).
For the second set of experiments, a spherical artificial enclosure of D=75 mm (C) was released to roll freely, over surfaces of different roughness, in a flow above the entrainment threshold. Finally, the coarser spherical and elliptical sensor assemblies (A and B) were deployed in a steep mountain stream during active sediment transport flow conditions. The results include the calculation of the inertial acceleration, the instantaneous particle velocity and the total kinetic energy of the mobile particle (including the rotational component, using gyroscope measurements). The comparison of the field deployments with the laboratory experiments suggests that the E-L model can be generalised from laboratory to natural conditions. Overall, the inertia of individual coarse particles is a statistically significant effect for all the modes of sediment transport (entrainment, translation, deposition) in both natural and laboratory regimes. Maniatis et al. 2015: "Calculating the Explicit Probability of Entrainment Based on Inertial Acceleration Measurements", J. Hydraulic Engineering, 04016097
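    The kinetic-energy bookkeeping described above reduces to a simple formula (the mass and sample values below are hypothetical, not the experiments' data): translational energy from the velocity obtained by integrating the accelerometer, plus rotational energy from the gyroscope reading.

```python
def total_kinetic_energy(mass_kg, radius_m, velocity_mps, omega_radps):
    """Total KE of a rolling sphere: 0.5*m*v^2 + 0.5*I*w^2, using the moment
    of inertia of a solid sphere, I = (2/5)*m*r^2. Input values below are
    hypothetical samples, not measurements from the flume experiments."""
    inertia = 0.4 * mass_kg * radius_m ** 2
    return 0.5 * mass_kg * velocity_mps ** 2 + 0.5 * inertia * omega_radps ** 2

# A D=90 mm sphere (r=0.045 m) with an assumed mass of 1.2 kg, rolling
# without slip:
v = 0.9             # m/s, e.g. from integrating the IMU acceleration
w = v / 0.045       # rad/s, the rolling constraint (what the gyroscope reads)
ke = total_kinetic_energy(1.2, 0.045, v, w)
```

    For rolling without slip the rotational term adds 40% to the translational energy, which is why including the gyroscope component matters when comparing energy budgets across transport modes.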

  18. Distributed Disdrometer and Rain Gauge Measurement Infrastructure Developed for GPM Ground Validation

    NASA Technical Reports Server (NTRS)

    Petersen, Walter A.; Bringi, V. N.; Gatlin, Patrick; Phillips, Dustin; Schwaller, Mathew; Tokay, Ali; Wingo, Mathew; Wolff, David

    2010-01-01

    Global Precipitation Mission (GPM) retrieval algorithm validation requires datasets characterizing the 4-D structure, variability, and correlation properties of hydrometeor particle size distributions (PSD) and accumulations over satellite fields of view (FOV; <10 km). Collection of this data provides a means to assess retrieval errors related to beam filling and algorithm PSD assumptions. Hence, GPM Ground Validation is developing a deployable network of precipitation gauges and disdrometers to provide fine-scale measurements of PSD and precipitation accumulation variability. These observations will be combined with dual-frequency, polarimetric, and profiling radar data in a bootstrapping fashion to extend validated PSD measurements to a large coverage domain. Accordingly, a total of 24 Parsivel disdrometers (PD), 5 3rd-generation 2D Video Disdrometers (2DVD), 70 tipping bucket rain gauges (TBRG), 9 weighing gauges, 7 Hot-Plate precipitation sensors (HP), and 3 Micro Rain Radars (MRR) have been procured. In liquid precipitation the suite of TBRG, PD and 2DVD instruments will quantify a broad spectrum of rain rate and PSD variability at sub-kilometer scales. In the envisioned network configuration 5 2DVDs will act as reference points for 16 collocated PD and TBRG measurements. We find that PD measurements provide similar measures of the rain PSD as observed with collocated 2DVDs (e.g., D0, Nw) for rain rates less than 15 mm/hr. For heavier rain rates we will rely on 2DVDs for PSD information. For snowfall we will combine point-redundant observations of SWER distributed over three or more locations within a FOV. Each location will contain at least one fenced weighing gauge, one HP, two PDs, and a 2DVD. MRRs will also be located at each site to extend the measurement to the column.
By collecting SWER measurements using different instrument types that employ different measurement techniques, our objective is to separate measurement uncertainty from natural variability in SWER and PSD. As demonstrated using C3VP polarimetric radar, gauge, and 2DVD/PD datasets, these measurements can be combined to bootstrap an area-wide SWER estimate via constrained modification of density-diameter and radar reflectivity-snowfall relationships. These data will be combined with snowpack, airborne microphysics, radar, radiometer, and tropospheric sounding data to refine GPM snowfall retrievals. The gauge and disdrometer instruments are being developed to operate autonomously when necessary, using solar power and wireless communications. These systems will be deployed in numerous field campaigns through 2016, including campaigns in Finland (2010), Oklahoma (2011), Canada (2012) and North Carolina (2013). GPM will also deploy 20 pairs of TBRGs within a 25 km² region along the Virginia coast under NASA NPOL radar coverage in order to quantify errors in point-area rainfall measurements.
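    One of the PSD parameters mentioned, the median volume diameter D0, can be computed directly from binned disdrometer counts (the bin edges and counts below are made up for the demonstration): it is the drop diameter below which half of the total liquid water volume resides, since volume per bin scales with N(D)·D³.

```python
def median_volume_diameter(diameters_mm, counts):
    """D0: the diameter splitting total drop volume in half. Volume per bin
    is proportional to N(D) * D^3. Returns the center of the bin in which
    the cumulative volume first reaches 50%. The input PSD is synthetic."""
    volumes = [n * d ** 3 for d, n in zip(diameters_mm, counts)]
    total = sum(volumes)
    cumulative = 0.0
    for d, v in zip(diameters_mm, volumes):
        cumulative += v
        if cumulative >= 0.5 * total:
            return d
    return diameters_mm[-1]

# Synthetic binned drop spectrum: many small drops, few large ones.
bins = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]     # bin-center diameters, mm
counts = [400, 150, 60, 20, 6, 1]         # drops counted per bin
d0 = median_volume_diameter(bins, counts)
```

    Because volume grows as the cube of diameter, the few large drops dominate the water content, which is why D0 sits well above the most numerous drop size and why agreement on D0 is a meaningful cross-check between the PD and 2DVD instruments.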

  19. CloudDOE: a user-friendly tool for deploying Hadoop clouds and analyzing high-throughput sequencing data with MapReduce.

    PubMed

    Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D T; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung

    2014-01-01

    Explosive growth of next-generation sequencing data has resulted in ultra-large-scale data sets and ensuing computational problems. Cloud computing provides an on-demand and scalable environment for large-scale data analysis. Using a MapReduce framework, data and workload can be distributed via a network to computers in the cloud to substantially reduce computational latency. Hadoop/MapReduce has been successfully adopted in bioinformatics for genome assembly, mapping reads to genomes, and finding single nucleotide polymorphisms. Major cloud providers offer Hadoop cloud services to their users. However, it remains technically challenging to deploy a Hadoop cloud for those who prefer to run MapReduce programs in a cluster without built-in Hadoop/MapReduce. We present CloudDOE, a platform-independent software package implemented in Java. CloudDOE encapsulates technical details behind a user-friendly graphical interface, thus liberating scientists from having to perform complicated operational procedures. Users are guided through the user interface to deploy a Hadoop cloud within in-house computing environments and to run applications specifically targeted for bioinformatics, including CloudBurst, CloudBrush, and CloudRS. One may also use CloudDOE on top of a public cloud. CloudDOE consists of three wizards, i.e., Deploy, Operate, and Extend wizards. Deploy wizard is designed to aid the system administrator to deploy a Hadoop cloud. It installs Java runtime environment version 1.6 and Hadoop version 0.20.203, and initiates the service automatically. Operate wizard allows the user to run a MapReduce application on the dashboard list. To extend the dashboard list, the administrator may install a new MapReduce application using Extend wizard. CloudDOE is a user-friendly tool for deploying a Hadoop cloud. Its smart wizards substantially reduce the complexity and costs of deployment, execution, enhancement, and management. 
Interested users may collaborate to improve the source code of CloudDOE to further incorporate more MapReduce bioinformatics tools into CloudDOE and support next-generation big data open source tools, e.g., Hadoop BigTop and Spark. CloudDOE is distributed under Apache License 2.0 and is freely available at http://clouddoe.iis.sinica.edu.tw/.

  20. CloudDOE: A User-Friendly Tool for Deploying Hadoop Clouds and Analyzing High-Throughput Sequencing Data with MapReduce

    PubMed Central

    Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D. T.; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung

    2014-01-01

    Background: Explosive growth of next-generation sequencing data has resulted in ultra-large-scale data sets and ensuing computational problems. Cloud computing provides an on-demand and scalable environment for large-scale data analysis. Using a MapReduce framework, data and workload can be distributed via a network to computers in the cloud to substantially reduce computational latency. Hadoop/MapReduce has been successfully adopted in bioinformatics for genome assembly, mapping reads to genomes, and finding single nucleotide polymorphisms. Major cloud providers offer Hadoop cloud services to their users. However, it remains technically challenging to deploy a Hadoop cloud for those who prefer to run MapReduce programs in a cluster without built-in Hadoop/MapReduce. Results: We present CloudDOE, a platform-independent software package implemented in Java. CloudDOE encapsulates technical details behind a user-friendly graphical interface, thus liberating scientists from having to perform complicated operational procedures. Users are guided through the user interface to deploy a Hadoop cloud within in-house computing environments and to run applications specifically targeted for bioinformatics, including CloudBurst, CloudBrush, and CloudRS. One may also use CloudDOE on top of a public cloud. CloudDOE consists of three wizards: Deploy, Operate, and Extend. The Deploy wizard aids the system administrator in deploying a Hadoop cloud; it installs Java runtime environment version 1.6 and Hadoop version 0.20.203, and initiates the service automatically. The Operate wizard allows the user to run a MapReduce application from the dashboard list. To extend the dashboard list, the administrator may install a new MapReduce application using the Extend wizard. Conclusions: CloudDOE is a user-friendly tool for deploying a Hadoop cloud. Its smart wizards substantially reduce the complexity and costs of deployment, execution, enhancement, and management.
Interested users may collaborate to improve the source code of CloudDOE to further incorporate more MapReduce bioinformatics tools into CloudDOE and support next-generation big data open source tools, e.g., Hadoop BigTop and Spark. Availability: CloudDOE is distributed under Apache License 2.0 and is freely available at http://clouddoe.iis.sinica.edu.tw/. PMID:24897343
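    The map/shuffle/reduce model that CloudDOE's bundled applications (CloudBurst, CloudRS, etc.) run on can be sketched in a few lines. The pure-Python runtime and the toy k-mer-counting job below are illustrative stand-ins, not CloudDOE or Hadoop code:

    ```python
    from collections import defaultdict

    def map_phase(records, map_fn):
        """Apply the user map function to each record, yielding (key, value) pairs."""
        for rec in records:
            yield from map_fn(rec)

    def shuffle(pairs):
        """Group intermediate values by key, as the MapReduce runtime would."""
        groups = defaultdict(list)
        for k, v in pairs:
            groups[k].append(v)
        return groups

    def reduce_phase(groups, reduce_fn):
        """Apply the user reduce function to each key group."""
        return {k: reduce_fn(k, vs) for k, vs in groups.items()}

    # Toy job: count k-mers across reads, the shape of many sequencing analyses.
    def kmer_mapper(read, k=3):
        for i in range(len(read) - k + 1):
            yield read[i:i+k], 1

    reads = ["GATTACA", "TTACAGA"]
    counts = reduce_phase(shuffle(map_phase(reads, kmer_mapper)),
                          lambda k, vs: sum(vs))
    print(counts["TTA"])  # -> 2 ("TTA" occurs once in each read)
    ```

    In Hadoop the shuffle is distributed across the cluster; here it is a single dictionary, but the user-supplied map and reduce functions have the same shape.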

  1. Towards a comprehensive climate impacts assessment of solar geoengineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Irvine, Peter J.; Kravitz, Ben; Lawrence, Mark G.

    Despite a growing literature on the climate response to solar geoengineering—proposals to cool the planet by increasing the planetary albedo—there has been little published on the impacts of solar geoengineering on natural and human systems such as agriculture, health, water resources, and ecosystems. An understanding of the impacts of different scenarios of solar geoengineering deployment will be crucial for informing decisions on whether and how to deploy it. Here we review the current state of knowledge about impacts of a solar-geoengineered climate and identify the major research gaps. We suggest that a thorough assessment of the climate impacts of a range of scenarios of solar geoengineering deployment is needed and can be built upon existing frameworks. However, solar geoengineering poses a novel challenge for climate impacts research as the manner of deployment could be tailored to pursue different objectives making possible a wide range of climate outcomes. We present a number of ideas for approaches to extend the survey of climate impacts beyond standard scenarios of solar geoengineering deployment to address this challenge. Reducing the impacts of climate change is the fundamental motivator for emissions reductions and for considering whether and how to deploy solar geoengineering. This means that the active engagement of the climate impacts research community will be important for improving the overall understanding of the opportunities, challenges, and risks presented by solar geoengineering.

  2. Towards a comprehensive climate impacts assessment of solar geoengineering

    DOE PAGES

    Irvine, Peter J.; Kravitz, Ben; Lawrence, Mark G.; ...

    2016-11-23

    Despite a growing literature on the climate response to solar geoengineering—proposals to cool the planet by increasing the planetary albedo—there has been little published on the impacts of solar geoengineering on natural and human systems such as agriculture, health, water resources, and ecosystems. An understanding of the impacts of different scenarios of solar geoengineering deployment will be crucial for informing decisions on whether and how to deploy it. Here we review the current state of knowledge about impacts of a solar-geoengineered climate and identify the major research gaps. We suggest that a thorough assessment of the climate impacts of a range of scenarios of solar geoengineering deployment is needed and can be built upon existing frameworks. However, solar geoengineering poses a novel challenge for climate impacts research as the manner of deployment could be tailored to pursue different objectives making possible a wide range of climate outcomes. We present a number of ideas for approaches to extend the survey of climate impacts beyond standard scenarios of solar geoengineering deployment to address this challenge. Reducing the impacts of climate change is the fundamental motivator for emissions reductions and for considering whether and how to deploy solar geoengineering. This means that the active engagement of the climate impacts research community will be important for improving the overall understanding of the opportunities, challenges, and risks presented by solar geoengineering.

  3. Towards a comprehensive climate impacts assessment of solar geoengineering

    NASA Astrophysics Data System (ADS)

    Irvine, Peter J.; Kravitz, Ben; Lawrence, Mark G.; Gerten, Dieter; Caminade, Cyril; Gosling, Simon N.; Hendy, Erica J.; Kassie, Belay T.; Kissling, W. Daniel; Muri, Helene; Oschlies, Andreas; Smith, Steven J.

    2017-01-01

    Despite a growing literature on the climate response to solar geoengineering—proposals to cool the planet by increasing the planetary albedo—there has been little published on the impacts of solar geoengineering on natural and human systems such as agriculture, health, water resources, and ecosystems. An understanding of the impacts of different scenarios of solar geoengineering deployment will be crucial for informing decisions on whether and how to deploy it. Here we review the current state of knowledge about impacts of a solar-geoengineered climate and identify the major research gaps. We suggest that a thorough assessment of the climate impacts of a range of scenarios of solar geoengineering deployment is needed and can be built upon existing frameworks. However, solar geoengineering poses a novel challenge for climate impacts research as the manner of deployment could be tailored to pursue different objectives making possible a wide range of climate outcomes. We present a number of ideas for approaches to extend the survey of climate impacts beyond standard scenarios of solar geoengineering deployment to address this challenge. Reducing the impacts of climate change is the fundamental motivator for emissions reductions and for considering whether and how to deploy solar geoengineering. This means that the active engagement of the climate impacts research community will be important for improving the overall understanding of the opportunities, challenges, and risks presented by solar geoengineering.

  4. Lessons Learned in the Flight Qualification of the S-NPP and NOAA-20 Solar Array Mechanisms

    NASA Technical Reports Server (NTRS)

    Helfrich, Daniel; Sexton, Adam

    2018-01-01

    Deployable solar arrays are the energy source used on almost all Earth-orbiting spacecraft, and their release and deployment are mission-critical; fully testing them on the ground is a challenging endeavor. The 8-meter-long deployable arrays flown on two sequential NASA weather satellites each consisted of three rigid panels almost 2 meters wide. These large panels were deployed by hinges incorporating stacked constant-force springs and eddy-current dampers, and were restrained through launch by a set of four releasable hold-downs using shape memory alloy release devices. The ground qualification testing of such unwieldy deployable solar arrays, whose design was optimized for orbital operations, proved to be quite challenging and provides numerous lessons learned. A paperwork review and follow-up inspection after hardware storage determined that there were negative torque margins and missing lubricant; this paper will explain how these unexpected issues were overcome. The paper will also provide details on how the hinge subassemblies, the fully assembled array, and mechanical ground support equipment were subsequently improved and qualified for a follow-on flight with considerably less difficulty. The solar arrays built by Ball Aerospace Corp. for the Suomi National Polar Partnership (S-NPP) satellite and the Joint Polar Satellite System (JPSS-1) satellite (now NOAA-20) were both successfully deployed on-orbit and are performing well.

  5. Lessons Learned in the Flight Qualification of the S-NPP and NOAA-20 Solar Array Mechanisms

    NASA Technical Reports Server (NTRS)

    Sexton, Adam; Helfrich, Dan

    2018-01-01

    Deployable solar arrays are the energy source used on almost all Earth-orbiting spacecraft, and their release and deployment are mission-critical; fully testing them on the ground is a challenging endeavor. The 8-meter-long deployable arrays flown on two sequential NASA weather satellites each consisted of three rigid panels almost 2 meters wide. These large panels were deployed by hinges incorporating stacked constant-force springs and eddy-current dampers, and were restrained through launch by a set of four releasable hold-downs using shape memory alloy release devices. The ground qualification testing of such unwieldy deployable solar arrays, whose design was optimized for orbital operations, proved to be quite challenging and provides numerous lessons learned. A paperwork review and follow-up inspection after hardware storage determined that there were negative torque margins and missing lubricant; this paper will explain how these unexpected issues were overcome. The paper will also provide details on how the hinge subassemblies, the fully assembled array, and mechanical ground support equipment were subsequently improved and qualified for a follow-on flight with considerably less difficulty. The solar arrays built by Ball Aerospace Corp. for the Suomi National Polar Partnership (S-NPP) satellite and the Joint Polar Satellite System (JPSS-1) satellite (now NOAA-20) were both successfully deployed on-orbit and are performing well.
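    The "negative torque margins" mentioned in these records refer to a standard sizing check for deployment mechanisms: the driving torque must exceed the worst-case sum of resisting torques. A minimal sketch, with hypothetical hinge numbers that are not values from the paper:

    ```python
    def torque_margin(available, resisting):
        """Static torque margin: (available / resisting) - 1.
        Deployment mechanisms are typically required to keep this positive
        (often with substantial margin) under worst-case conditions."""
        if resisting <= 0:
            raise ValueError("resisting torque must be positive")
        return available / resisting - 1.0

    # Hypothetical hinge budget (N*m): spring output vs. summed resistances
    # such as harness stiffness, damper drag, and friction. Degraded or
    # missing lubricant raises the friction term and can drive the margin
    # negative, which is what the stored-hardware review uncovered.
    spring = 4.0
    resisting = 1.2 + 0.6 + 0.7   # harness + damper + friction
    print(torque_margin(spring, resisting) > 0)   # healthy hinge -> True
    print(torque_margin(spring, 4.5) > 0)         # friction grows -> False
    ```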

  6. Community-aware charging station network design for electrified vehicles in urban areas : reducing congestion, emissions, improving accessibility, and promoting walking, bicycling, and use of public transportation.

    DOT National Transportation Integrated Search

    2016-08-31

    A major challenge for achieving large-scale adoption of EVs is an infrastructure that is accessible to communities. The societal benefits of large-scale adoption of EVs cannot be realized without adequate deployment of publicly accessible charging stati...

  7. WETS - Azura Half Scale Testing MOIS Documentation

    DOE Data Explorer

    Nelson, Eric

    2015-05-30

    This submission includes documentation on the Modular Ocean Instrumentation System (MOIS) installation on the Azura 1/2-scale wave energy converter at Marine Corps Base Hawaii (MCBH) in Kaneohe Bay. Data from the deployment will be uploaded over the course of the test. The instrumentation and data come from the NREL team participating in this testing.

  8. The Development and Validation of Scores on the Mathematics Information Processing Scale (MIPS).

    ERIC Educational Resources Information Center

    Bessant, Kenneth C.

    1997-01-01

    This study reports on the development and psychometric properties of a new 87-item Mathematics Information Processing Scale that explores learning strategies, metacognitive problem-solving skills, and attentional deployment. Results with 340 college students support the use of the instrument, for which factor analysis identified five theoretically…

  9. Automation and workflow considerations for embedding Digimarc Barcodes at scale

    NASA Astrophysics Data System (ADS)

    Rodriguez, Tony; Haaga, Don; Calhoon, Sean

    2015-03-01

    The Digimarc® Barcode is a digital watermark applied to packages and variable-data labels that carries the GS1-standard GTIN-14 data traditionally carried by a 1-D barcode. The Digimarc Barcode can be read with smartphones and imaging-based barcode readers commonly used in grocery and retail environments. Using smartphones, consumers can engage with products, and retailers can materially increase the speed of check-out, increasing store margins and providing a better experience for shoppers. Internal testing has shown an average 53% increase in scanning throughput, enabling hundreds of millions of dollars in cost savings [1] for retailers when deployed at scale. To get to scale, the process of embedding a digital watermark must be automated and integrated within existing workflows. Creating the tools and processes to do so represents a new challenge for the watermarking community. This paper presents a description and an analysis of the workflow implemented by Digimarc to deploy the Digimarc Barcode at scale. An overview of the tools created and the lessons learned during the introduction of the technology to the market is provided.

  10. Simulating the Response of a Composite Honeycomb Energy Absorber. Part 2; Full-Scale Impact Testing

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Annett, Martin S.; Jackson, Karen E.; Polanco, Michael A.

    2012-01-01

    NASA has sponsored research to evaluate an externally deployable composite honeycomb designed to attenuate loads in the event of a helicopter crash. The concept, designated the Deployable Energy Absorber (DEA), is an expandable Kevlar(Registered TradeMark) honeycomb. The DEA has a flexible hinge that allows the honeycomb to be stowed collapsed until needed during an emergency. Evaluation of the DEA began with material characterization of the Kevlar(Registered TradeMark)-129 fabric/epoxy and ended with a full-scale crash test of a retrofitted MD-500 helicopter. During each evaluation phase, finite element models of the test articles were developed and simulations were performed using the dynamic finite element code LS-DYNA(Registered TradeMark). The paper focuses on simulations of two full-scale impact tests involving the DEA: one with a mass simulator and one a full-scale crash of an instrumented MD-500 helicopter. Isotropic (MAT24) and composite (MAT58) material models, which were assigned to the DEA shell elements, were compared. Based on simulation results, the MAT58 model showed better agreement with test data.

  11. Ground Deployment Demonstration and Material Testing for Solar Sail

    NASA Astrophysics Data System (ADS)

    Huang, Xiaoqi; Cheng, Zhengai; Liu, Yufei; Wang, Li

    2016-07-01

    A solar sail is a kind of spacecraft that can achieve extremely high velocity using light pressure instead of chemical fuel. The large acceleration relies on its high area-to-mass ratio, so solar sails are designed in huge sizes and use ultra-thin, lightweight materials. For a 100-meter-class solar sail, two key points must be considered in the design process: the fold-deployment method, and material property changes in the space environment. To test and verify the fold-deployment technology, an 8 m × 8 m principle prototype was developed. Sail membranes folded in the manner of IKAROS, NanoSail-D, and a newly proposed L-shape folding pattern were tested on this prototype. Their deployment properties were investigated in detail and compared. In addition, the space-environment suitability of ultra-thin polyimide films as a candidate solar sail material was analyzed. The preliminary test results showed that membranes folded by all of the methods deployed well. Moreover, the sail membrane folded in the L-shape pattern deployed most rapidly and in the most organized manner of the three folding patterns tested. The mechanical properties of the polyimide showed no significant change after electron irradiation. As preliminary research on the key technologies of solar sail spacecraft, the results of this study provide an important basis for large-scale solar sail membrane selection and fold-deployment method design.
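    The dependence of acceleration on area-to-mass ratio can be made concrete with the ideal sail relation a = 2ηPA/m at 1 AU, where P ≈ 4.56 µN/m² is the solar radiation pressure on an absorbing surface and the factor 2 assumes specular reflection. The efficiency and mass figures below are illustrative assumptions, not values from the paper:

    ```python
    def characteristic_acceleration(area_m2, mass_kg, efficiency=0.85):
        """Ideal solar-sail acceleration at 1 AU: a = 2 * eta * P * A / m.
        P is the solar radiation pressure on a perfectly absorbing surface;
        the factor 2 assumes specular reflection, derated by an efficiency
        factor eta for wrinkles, billow, and imperfect reflectivity."""
        P = 4.56e-6  # N/m^2 at 1 AU
        return 2.0 * efficiency * P * area_m2 / mass_kg

    # 100 m x 100 m flight-scale sail vs. the 8 m x 8 m ground prototype
    # area, both with an assumed 500 kg spacecraft mass (illustrative).
    a_full = characteristic_acceleration(100 * 100, 500.0)
    a_proto = characteristic_acceleration(8 * 8, 500.0)
    print(a_full)                 # on the order of 1e-4 m/s^2
    print(a_full / a_proto)       # scales directly with sail area
    ```

    Tiny as these accelerations look, they act continuously, which is why area-to-mass ratio dominates the design.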

  12. Evaluation of a Technology-Based Adaptive Learning and Prevention Program for Stress Response-A Randomized Controlled Trial.

    PubMed

    Wesemann, Ulrich; Kowalski, Jens T; Jacobsen, Thomas; Beudt, Susan; Jacobs, Herbert; Fehr, Julia; Büchler, Jana; Zimmermann, Peter L

    2016-08-01

    To prevent deployment-related disorders, the Chaos Driven Situations Management Retrieval System (CHARLY), a computer-aided training platform with a biofeedback interface, has been developed. It photorealistically simulates critical situations for specific target and occupational groups. CHARLY was evaluated as a 1.5-day predeployment training method in comparison with the routine training. The evaluation was carried out for a matched random sample of N = 67 soldiers deployed in Afghanistan (International Security Assistance Force). Data collection took place before and after the prevention program and 4 to 6 weeks after deployment, covering mental state, post-traumatic stress disorder (PTSD) symptoms, knowledge of and attitude toward PTSD, and deployment-specific stressors. CHARLY was significantly superior to the control condition in terms of psychoeducation and attitude change. Regarding mental state, both groups showed a significant increase in stress after deployment, with a significantly lower increase in the CHARLY group. For PTSD-specific symptoms, CHARLY achieved a significant superiority. The fact that PTSD-specific scales showed significant differences at the end of deployment substantiates the validity of a specifically preventive effect of CHARLY. The study results tentatively indicate that highly standardized, computer-based primary prevention of mental disorders in soldiers on deployment might be superior to more personal and less standardized forms of prevention. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.

  13. Imaging Exoplanets with the Exo-S Starshade Mission: Key Enabling Technologies

    NASA Astrophysics Data System (ADS)

    Kasdin, N. Jeremy; Lisman, Doug; Shaklan, Stuart; Thomson, Mark; Webb, David; Cady, Eric; Exo-S Science; Technology Definition Team, Exoplanet Program Probe Study Design Team

    2015-01-01

    There is increasing interest in the use of a starshade, a spacecraft employing a large screen flying in formation with a space telescope, to provide the starlight suppression needed to detect and characterize exoplanets. In particular, Exo-S is a NASA study directed at designing a probe-scale exoplanet mission employing a starshade. In this poster we present the enabling technologies needed to make a starshade mission a reality: flight-like petals, a deployable truss to support the petals, optical edges, optical diffraction studies, and formation sensing and control. We show the status of each technology gap and summarize our progress over the past 5 years, with plans for the next 3 years, in demonstrating feasibility in all these areas. In particular, since no optical end-to-end test is possible, it is necessary both to show that a starshade can be built and deployed to the required accuracy and, via laboratory experiments at smaller scale, to validate the optical modeling upon which the accuracy requirements are based. We show our progress verifying key enabling technologies, including demonstrating that a starshade petal made from flight-like materials can be manufactured to the needed accuracy and that a central truss with attached petals can be deployed with the needed precision. We also summarize our sub-scale lab experiments demonstrating that we can achieve the contrast predicted by our optical models.

  14. Integrated deployment architecture for predictive real-time traffic routing incorporating human factors considerations.

    DOT National Transportation Integrated Search

    2014-05-01

    As Advanced Traveler Information Systems (ATIS) are being more widely accessed by drivers, understanding drivers' behavioral responses to real-time travel information through ATIS and its consequential benefits is important to the widespread deplo...

  15. Developing an area-wide system for coordinated ramp meter control.

    DOT National Transportation Integrated Search

    2008-12-01

    Ramp metering has been broadly accepted and deployed as an effective countermeasure against both recurrent and non-recurrent congestion on freeways. However, many current ramp metering algorithms tend to improve only freeway travels using local d...

  16. Localization Framework for Real-Time UAV Autonomous Landing: An On-Ground Deployed Visual Approach

    PubMed Central

    Kong, Weiwei; Hu, Tianjiang; Zhang, Daibing; Shen, Lincheng; Zhang, Jianwei

    2017-01-01

    One of the greatest challenges for fixed-wing unmanned aerial vehicles (UAVs) is safe landing. An on-ground deployed visual approach is developed in this paper, which is particularly suitable for landing in global navigation satellite system (GNSS)-denied environments. In application, the deployed guidance system makes full use of ground computing resources and feeds back the aircraft's real-time localization to its on-board autopilot. A separate long-baseline stereo architecture is proposed to provide an extendable baseline and a wide-angle field of view (FOV), in contrast to traditional fixed-baseline schemes. Furthermore, an accuracy evaluation of the new architecture is conducted by theoretical modeling and computational analysis. Dataset-driven experimental results demonstrate the feasibility and effectiveness of the developed approach. PMID:28629189

  17. Localization Framework for Real-Time UAV Autonomous Landing: An On-Ground Deployed Visual Approach.

    PubMed

    Kong, Weiwei; Hu, Tianjiang; Zhang, Daibing; Shen, Lincheng; Zhang, Jianwei

    2017-06-19

    One of the greatest challenges for fixed-wing unmanned aerial vehicles (UAVs) is safe landing. An on-ground deployed visual approach is developed in this paper, which is particularly suitable for landing in global navigation satellite system (GNSS)-denied environments. In application, the deployed guidance system makes full use of ground computing resources and feeds back the aircraft's real-time localization to its on-board autopilot. A separate long-baseline stereo architecture is proposed to provide an extendable baseline and a wide-angle field of view (FOV), in contrast to traditional fixed-baseline schemes. Furthermore, an accuracy evaluation of the new architecture is conducted by theoretical modeling and computational analysis. Dataset-driven experimental results demonstrate the feasibility and effectiveness of the developed approach.
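    Why an extendable baseline helps follows from the pinhole stereo relations Z = fB/d and, to first order, ΔZ ≈ Z²·Δd/(fB): at a given range, depth error shrinks in proportion to baseline length. The camera numbers below are illustrative, not the paper's hardware:

    ```python
    def depth_from_disparity(focal_px, baseline_m, disparity_px):
        """Pinhole stereo range: Z = f * B / d."""
        return focal_px * baseline_m / disparity_px

    def depth_error(focal_px, baseline_m, depth_m, disparity_err_px=1.0):
        """First-order depth uncertainty: dZ ~= Z^2 * dd / (f * B).
        At fixed range, a longer baseline B means smaller error."""
        return depth_m ** 2 * disparity_err_px / (focal_px * baseline_m)

    f = 1000.0                                  # focal length in pixels (assumed)
    Z = depth_from_disparity(f, 10.0, 20.0)     # 10 m baseline, 20 px disparity
    print(Z)                                    # -> 500.0 (meters)
    print(depth_error(f, 1.0, 500.0))           # short 1 m baseline -> 250.0 m
    print(depth_error(f, 10.0, 500.0))          # 10x baseline -> 25.0 m
    ```

    This is the core argument for a separate long-baseline ground rig over a fixed on-board stereo pair: the aircraft must be ranged accurately while still hundreds of meters out.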

  18. Framework for Deploying a Virtualized Computing Environment for Collaborative and Secure Data Analytics

    PubMed Central

    Meyer, Adrian; Green, Laura; Faulk, Ciearro; Galla, Stephen; Meyer, Anne-Marie

    2016-01-01

    Introduction: Large amounts of health data generated by a wide range of health care applications across a variety of systems have the potential to offer valuable insight into populations and health care systems, but robust and secure computing and analytic systems are required to leverage this information. Framework: We discuss our experiences deploying a Secure Data Analysis Platform (SeDAP), and provide a framework to plan, build, and deploy a virtual desktop infrastructure (VDI) to enable innovation and collaboration and to operate within academic funding structures. It outlines 6 core components: Security, Ease of Access, Performance, Cost, Tools, and Training. Conclusion: A platform like SeDAP does not succeed through technical excellence and performance alone. Its adoption depends on a collaborative environment in which researchers and users jointly plan and evaluate requirements across all aspects. PMID:27683665

  19. Securing the Global Airspace System Via Identity-Based Security

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    2015-01-01

    Current telecommunications systems have very good security architectures that include authentication and authorization as well as accounting. These three features enable an edge system to obtain access into a radio communication network, request specific Quality-of-Service (QoS) requirements, and ensure proper billing for service. Furthermore, the links are secure. Widely used telecommunication technologies are Long Term Evolution (LTE) and Worldwide Interoperability for Microwave Access (WiMAX). This paper provides a system-level view of network-centric operations for the global airspace system and the problems and issues with deploying new technologies into the system. The paper then focuses on applying the basic security architectures of commercial telecommunication systems and deployment of federated Authentication, Authorization and Accounting systems to provide a scalable, evolvable, reliable, and maintainable solution to enable a globally deployable identity-based secure airspace system.

  20. Supporting Collaborative Model and Data Service Development and Deployment with DevOps

    NASA Astrophysics Data System (ADS)

    David, O.

    2016-12-01

    Adopting DevOps practices for model service development and deployment enables a community to engage in service-oriented modeling and data management. The Cloud Services Integration Platform (CSIP), developed over the last 5 years at Colorado State University, provides for collaborative integration of environmental models into scalable model and data services as a micro-services platform with API and deployment infrastructure. Originally developed to support USDA natural resource applications, it proved suitable for a wider range of applications in the environmental modeling domain. As its scope and visibility extended, it became apparent that community integration and adequate workflow support through the full model development and application cycle drove successful outcomes. DevOps provides best practices, tools, and organizational structures to optimize the transition from model service development to deployment by minimizing (i) the operational burden and (ii) the turnaround time for modelers. We have developed and implemented a methodology to fully automate a suite of applications for application lifecycle management, version control, continuous integration, container management, and container scaling, enabling model and data service developers in various institutions to collaboratively build, run, deploy, test, and scale services within minutes. To date more than 160 model and data services are available for applications in hydrology (PRMS, Hydrotools, CFA, ESP), water and wind erosion prediction (WEPP, WEPS, RUSLE2), soil quality trends (SCI, STIR), water quality analysis (SWAT-CP, WQM, CFA, AgES-W), stream degradation assessment (SWAT-DEG), hydraulics (cross-section), and grazing management (GRAS). In addition, supporting data services include soil (SSURGO), ecological site (ESIS), climate (CLIGEN, WINDGEN), land management and crop rotations (LMOD), and pesticides (WQM), developed using this workflow automation and decentralized governance.
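    The automated build-test-deploy flow the abstract describes can be sketched as a stage runner that halts at the first failure, so a broken build or failing test never reaches deployment. This is a generic continuous-integration illustration, not the actual CSIP tooling:

    ```python
    def run_pipeline(stages):
        """Run named pipeline stages in order; stop at the first failure,
        mirroring a continuous-integration flow (build -> test -> deploy)."""
        results = []
        for name, step in stages:
            try:
                step()
                results.append((name, "ok"))
            except Exception as exc:
                results.append((name, f"failed: {exc}"))
                break  # later stages (e.g. deploy) never run after a failure
        return results

    # Hypothetical stages for packaging a model as a service; each lambda
    # stands in for a real action (containerize, run tests, push and scale).
    stages = [
        ("build",  lambda: None),
        ("test",   lambda: None),
        ("deploy", lambda: None),
    ]
    print(run_pipeline(stages))   # all three stages report "ok"
    ```

    In a real setup each stage would shell out to version control, a CI server, and a container orchestrator; the key design point is the same: stages are ordered and gated.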

  1. Scales of variability of bio-optical properties as observed from near-surface drifters

    NASA Technical Reports Server (NTRS)

    Abbott, Mark R.; Brink, Kenneth H.; Booth, C. R.; Blasco, Dolors; Swenson, Mark S.; Davis, Curtiss O.; Codispoti, L. A.

    1995-01-01

    A drifter equipped with bio-optical sensors and an automated water sampler was deployed in the California Current as part of the coastal transition zone program to study the biological, chemical, and physical dynamics of the meandering filaments. During deployments in 1987 and 1988, measurements were made of fluorescence, downwelling irradiance, upwelling radiance, and beam attenuation using several bio-optical sensors. Samples were collected by an automated sampler for later analysis of nutrients and phytoplankton species composition. Large-scale spatial and temporal changes in the bio-optical and biological properties of the region were driven by changes in phytoplankton species composition which, in turn, were associated with the meandering circulation. Variance spectra of the bio-optical parameters revealed fluctuations on both diel and semidiurnal scales, perhaps associated with solar variations and internal tides, respectively. Offshore, inertial-scale fluctuations were apparent in the variance spectra of temperature, fluorescence, and beam attenuation. Although calibration samples can help remove some of these variations, these results suggest that bio-optical data from unattended platforms such as moorings and drifters must be analyzed carefully. Characterization of the scales of phytoplankton variability must account for the scales of variability in the algorithms used to convert bio-optical measurements into biological quantities.
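    The diel and semidiurnal peaks reported here come from variance (power) spectra of the sensor time series. A minimal periodogram over a synthetic hourly record with a built-in 24-hour cycle (illustrative data, not the drifter measurements):

    ```python
    import cmath
    import math

    def power_spectrum(x):
        """Naive DFT power spectrum (O(n^2)); fine for short drifter records.
        Returns power at k = 1 .. n//2 cycles per record, mean removed."""
        n = len(x)
        mean = sum(x) / n
        x = [v - mean for v in x]
        spec = []
        for k in range(1, n // 2 + 1):
            c = sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
            spec.append(abs(c) ** 2 / n)
        return spec  # spec[k-1] is power at k cycles per record

    # Synthetic hourly "fluorescence" with a diel (24 h) cycle over 4 days.
    n = 96
    series = [1.0 + 0.5 * math.sin(2 * math.pi * t / 24) for t in range(n)]
    spec = power_spectrum(series)
    k_peak = max(range(len(spec)), key=lambda i: spec[i]) + 1
    print(n / k_peak)   # -> 24.0, the dominant period in hours
    ```

    On real drifter data the same peak-picking would be applied to fluorescence or beam attenuation, with the semidiurnal tide appearing as a second peak near 12 hours.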

  2. Scale Matters: An Action Plan for Realizing Sector-Wide"Zero-Energy" Performance Goals in Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Selkowitz, Stephen; Selkowitz, Stephen; Granderson, Jessica

    2008-06-16

    It is widely accepted that if the United States is to reduce greenhouse gas emissions it must aggressively address energy end use in the building sector. While there have been some notable but modest successes with mandatory and voluntary programs, there have also been puzzling failures to achieve expected savings. Collectively, these programs have not yet reached the majority of the building stock, nor have they yet routinely produced very large savings in individual buildings. Several trends that have the potential to change this are noteworthy: (1) the growing market interest in 'green buildings' and 'sustainable design', (2) the major professional societies (e.g. AIA, ASHRAE) have more aggressively adopted significant improvements in energy efficiency as strategic goals, e.g. targeting 'zero energy', carbon-neutral buildings by 2030. While this vision is widely accepted as desirable, unless there are significant changes to the way buildings are routinely designed, delivered and operated, zero energy buildings will remain a niche phenomenon rather than a sector-wide reality. Toward that end, a public/private coalition including the Alliance to Save Energy, LBNL, AIA, ASHRAE, USGBC and the World Business Council for Sustainable Development (WBCSD) are developing an 'action plan' for moving the U.S. commercial building sector towards zero energy performance. It addresses regional action in a national framework; integrated deployment, demonstration and R&D threads; and would focus on measurable, visible performance indicators. This paper outlines this action plan, focusing on the challenge, the key themes, and the strategies and actions leading to substantial reductions in GHG emissions by 2030.

  3. Developing eThread pipeline using SAGA-pilot abstraction for large-scale structural bioinformatics.

    PubMed

    Ragothaman, Anjani; Boddu, Sairam Chowdary; Kim, Nayong; Feinstein, Wei; Brylinski, Michal; Jha, Shantenu; Kim, Joohyun

    2014-01-01

    While most computational annotation approaches are sequence-based, threading methods are becoming increasingly attractive because the predicted structural information can uncover the underlying function. However, threading tools are generally compute-intensive, and the number of protein sequences from even small genomes such as prokaryotes is large, typically many thousands, prohibiting their application as a genome-wide structural systems biology tool. To improve its utility, we have developed a pipeline for eThread, a meta-threading protein structure modeling tool, that uses computational resources efficiently and effectively. We employ a pilot-based approach that supports seamless data and task-level parallelism and manages large variation in workload and computational requirements. Our scalable pipeline is deployed on Amazon EC2 and can efficiently select resources based upon task requirements. We present runtime analysis to characterize the computational complexity of eThread and the EC2 infrastructure. Based on these results, we suggest a pathway to an optimized solution with respect to metrics such as time-to-solution or cost-to-solution. Our eThread pipeline can scale to support a large number of sequences and is expected to be a viable solution for genome-scale structural bioinformatics and structure-based annotation, particularly amenable for small genomes such as prokaryotes. The developed pipeline is easily extensible to other types of distributed cyberinfrastructure.
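
    The pilot abstraction described above decouples resource acquisition from task execution: a pool of workers is claimed once, then many variable-sized threading tasks are streamed into it. A minimal sketch of that task-level pattern using only the Python standard library (this is not the actual SAGA-Pilot API; `thread_sequence` is a hypothetical stand-in for one compute-intensive eThread invocation):

```python
from concurrent.futures import ThreadPoolExecutor

def thread_sequence(seq):
    # Hypothetical stand-in for one eThread invocation; the real tool
    # meta-threads a protein sequence against structure template libraries.
    return seq, len(seq)

def run_pilot(sequences, workers=4):
    # Pilot pattern: acquire a worker pool once, then stream many
    # variable-sized tasks into it, amortizing scheduling overhead.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(thread_sequence, sequences))

print(run_pilot(["MKVL", "MKV", "MK"]))  # {'MKVL': 4, 'MKV': 3, 'MK': 2}
```

    In the real pipeline the per-task work is an external process on an EC2 node rather than an in-process function, but the scheduling structure is analogous.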

  4. Developing eThread Pipeline Using SAGA-Pilot Abstraction for Large-Scale Structural Bioinformatics

    PubMed Central

    Ragothaman, Anjani; Feinstein, Wei; Jha, Shantenu; Kim, Joohyun

    2014-01-01

    While most computational annotation approaches are sequence-based, threading methods are becoming increasingly attractive because the predicted structural information can uncover the underlying function. However, threading tools are generally compute-intensive, and the number of protein sequences from even small genomes such as prokaryotes is large, typically many thousands, prohibiting their application as a genome-wide structural systems biology tool. To improve its utility, we have developed a pipeline for eThread, a meta-threading protein structure modeling tool, that uses computational resources efficiently and effectively. We employ a pilot-based approach that supports seamless data and task-level parallelism and manages large variation in workload and computational requirements. Our scalable pipeline is deployed on Amazon EC2 and can efficiently select resources based upon task requirements. We present runtime analysis to characterize the computational complexity of eThread and the EC2 infrastructure. Based on these results, we suggest a pathway to an optimized solution with respect to metrics such as time-to-solution or cost-to-solution. Our eThread pipeline can scale to support a large number of sequences and is expected to be a viable solution for genome-scale structural bioinformatics and structure-based annotation, particularly amenable for small genomes such as prokaryotes. The developed pipeline is easily extensible to other types of distributed cyberinfrastructure. PMID:24995285

  5. Challenges for achieving safe and effective radical cure of Plasmodium vivax: a round table discussion of the APMEN Vivax Working Group.

    PubMed

    Thriemer, Kamala; Ley, Benedikt; Bobogare, Albino; Dysoley, Lek; Alam, Mohammad Shafiul; Pasaribu, Ayodhia P; Sattabongkot, Jetsumon; Jambert, Elodie; Domingo, Gonzalo J; Commons, Robert; Auburn, Sarah; Marfurt, Jutta; Devine, Angela; Aktaruzzaman, Mohammad M; Sohel, Nayeem; Namgay, Rinzin; Drukpa, Tobgyel; Sharma, Surender Nath; Sarawati, Elvieda; Samad, Iriani; Theodora, Minerva; Nambanya, Simone; Ounekham, Sonesay; Mudin, Rose Nanti Binti; Da Thakur, Garib; Makita, Leo Sora; Deray, Raffy; Lee, Sang-Eun; Boaz, Leonard; Danansuriya, Manjula N; Mudiyanselage, Santha D; Chinanonwait, Nipon; Kitchakarn, Suravadee; Nausien, Johnny; Naket, Esau; Duc, Thang Ngo; Do Manh, Ha; Hong, Young S; Cheng, Qin; Richards, Jack S; Kusriastuti, Rita; Satyagraha, Ari; Noviyanti, Rintis; Ding, Xavier C; Khan, Wasif Ali; Swe Phru, Ching; Guoding, Zhu; Qi, Gao; Kaneko, Akira; Miotto, Olivo; Nguitragool, Wang; Roobsoong, Wanlapa; Battle, Katherine; Howes, Rosalind E; Roca-Feltrer, Arantxa; Duparc, Stephan; Bhowmick, Ipsita Pal; Kenangalem, Enny; Bibit, Jo-Anne; Barry, Alyssa; Sintasath, David; Abeyasinghe, Rabindra; Sibley, Carol H; McCarthy, James; von Seidlein, Lorenz; Baird, J Kevin; Price, Ric N

    2017-04-05

    The delivery of safe and effective radical cure for Plasmodium vivax is one of the greatest challenges for achieving malaria elimination from the Asia-Pacific by 2030. During the annual meeting of the Asia Pacific Malaria Elimination Network Vivax Working Group in October 2016, a round table discussion was held to discuss the programmatic issues hindering the widespread use of primaquine (PQ) radical cure. Participants included 73 representatives from 16 partner countries and 33 institutional partners and other research institutes. In this meeting report, the key discussion points are presented and grouped into five themes: (i) current barriers to glucose-6-phosphate dehydrogenase (G6PD) deficiency testing prior to PQ radical cure, (ii) necessary properties of G6PD tests for wide scale deployment, (iii) the promotion of G6PD testing, (iv) improving adherence to PQ regimens and (v) the challenges for future tafenoquine (TQ) roll out. Robust point of care (PoC) G6PD tests are needed, which are suitable and cost-effective for clinical settings with limited infrastructure. An affordable and competitive test price is needed, accompanied by sustainable funding for the product with appropriate training of healthcare staff, and robust quality control and assurance processes. In the absence of quantitative PoC G6PD tests, G6PD status can be gauged with qualitative diagnostics; however, none of the available tests is currently sensitive enough to guide TQ treatment. TQ introduction will require overcoming additional challenges including the management of severely and intermediately G6PD deficient individuals. Robust strategies are needed to ensure that effective treatment practices can be deployed widely, and these should ensure that the caveats are outweighed by the benefits of radical cure for both the patients and the community. Widespread access to quality-controlled G6PD testing will be critical.

  6. The impact of retail electricity tariff evolution on solar photovoltaic deployment

    DOE PAGES

    Gagnon, Pieter; Cole, Wesley J.; Frew, Bethany; ...

    2017-11-10

    This analysis explores the impact that the evolution of retail electricity tariffs can have on the deployment of solar photovoltaics. It suggests that ignoring the evolution of tariffs resulted in up to a 36% higher prediction of the capacity of distributed PV in 2050, compared to scenarios that represented tariff evolution. Critically, the evolution of tariffs had a negligible impact on the total generation from PV (both utility-scale and distributed) in the scenarios that were examined.

  7. Heritage Adoption Lessons Learned: Cover Deployment and Latch Mechanism

    NASA Technical Reports Server (NTRS)

    Wincentsen, James

    2006-01-01

    Within JPL, there is a technology thrust need to develop a larger Cover Deployment and Latch Mechanism (CDLM) for future missions. The approach taken was to adopt and scale the CDLM design as used on the Galaxy Evolution Explorer (GALEX) project. The three separate mechanisms that comprise the CDLM will be discussed in this paper in addition to a focus on heritage adoption lessons learned and specific examples. These lessons learned will be valuable to any project considering the use of heritage designs.

  8. The impact of retail electricity tariff evolution on solar photovoltaic deployment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gagnon, Pieter; Cole, Wesley J.; Frew, Bethany

    This analysis explores the impact that the evolution of retail electricity tariffs can have on the deployment of solar photovoltaics. It suggests that ignoring the evolution of tariffs resulted in up to a 36% higher prediction of the capacity of distributed PV in 2050, compared to scenarios that represented tariff evolution. Critically, the evolution of tariffs had a negligible impact on the total generation from PV (both utility-scale and distributed) in the scenarios that were examined.

  9. Oak Ridge Leadership Computing Facility Position Paper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oral, H Sarp; Hill, Jason J; Thach, Kevin G

    This paper discusses the business, administration, reliability, and usability aspects of storage systems at the Oak Ridge Leadership Computing Facility (OLCF). The OLCF has developed key competencies in architecting and administration of large-scale Lustre deployments as well as HPSS archival systems. Additionally as these systems are architected, deployed, and expanded over time reliability and availability factors are a primary driver. This paper focuses on the implementation of the Spider parallel Lustre file system as well as the implementation of the HPSS archive at the OLCF.

  10. ATLAS I/O performance optimization in as-deployed environments

    NASA Astrophysics Data System (ADS)

    Maier, T.; Benjamin, D.; Bhimji, W.; Elmsheuser, J.; van Gemmeren, P.; Malon, D.; Krumnack, N.

    2015-12-01

    This paper provides an overview of an integrated program of work underway within the ATLAS experiment to optimise I/O performance for large-scale physics data analysis in a range of deployment environments. It proceeds to examine in greater detail one component of that work, the tuning of job-level I/O parameters in response to changes to the ATLAS event data model, and considers the implications of such tuning for a number of measures of I/O performance.

  11. Quality Function Deployment for Large Systems

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1992-01-01

    Quality Function Deployment (QFD) is typically applied to small subsystems. This paper describes efforts to extend QFD to large-scale systems. It links QFD to the systems engineering process, the concurrent engineering process, the robust design process, and the costing process. The effect is to generate a tightly linked project management process of high dimensionality which flushes out issues early to provide a high-quality, low-cost, and, hence, competitive product. A pre-QFD matrix linking customers to customer desires is described.
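
    At its core, a QFD matrix is a weighted rollup: customer-desire importance weights are propagated through a relationship matrix onto technical characteristics to rank where design effort should go. A minimal sketch with hypothetical desires, weights, and characteristics (the 9/3/1/0 strength scale is conventional QFD practice):

```python
# Minimal QFD relationship-matrix rollup. Desires, weights, and characteristics
# here are hypothetical; 9 = strong, 3 = moderate, 1 = weak, 0 = no relationship.
customer_weights = {"low cost": 5, "reliability": 4, "ease of use": 2}

relationships = {
    "part count":    {"low cost": 9, "reliability": 3, "ease of use": 1},
    "redundancy":    {"low cost": 1, "reliability": 9, "ease of use": 0},
    "interface std": {"low cost": 3, "reliability": 1, "ease of use": 9},
}

def technical_priorities(weights, rel):
    # Importance of each technical characteristic = sum over customer desires
    # of (desire weight x relationship strength); the ranking drives design focus.
    return {tc: sum(weights[d] * s for d, s in row.items())
            for tc, row in rel.items()}

scores = technical_priorities(customer_weights, relationships)
print(max(scores, key=scores.get))  # part count (score 59)
```

    Chaining such matrices (customers to desires, desires to characteristics, characteristics to parts) is what produces the "high dimensionality" linkage the abstract describes.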

  12. Fiber-Optic Sensing System: Overview, Development and Deployment in Flight at NASA

    NASA Technical Reports Server (NTRS)

    Chan, Hon Man; Parker, Allen R.; Piazza, Anthony; Richards, W. Lance

    2015-01-01

    An overview of the research and technological development of the fiber-optic sensing system (FOSS) at the National Aeronautics and Space Administration Armstrong Flight Research Center (NASA AFRC) is presented. Theory behind fiber Bragg grating (FBG) sensors, as well as interrogation technique based on optical frequency domain reflectometry (OFDR) is discussed. Assessment and validation of FOSS as an accurate measurement tool for structural health monitoring is realized in the laboratory environment as well as large-scale flight deployment.
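
    An FBG sensor reports strain through a shift of its reflected Bragg wavelength; to first order, delta_lambda / lambda_B = (1 - p_e) * strain, where p_e is the effective photo-elastic coefficient (roughly 0.22 for silica fiber, a typical textbook value assumed here, not a figure from this paper):

```python
def strain_from_shift(lambda_b_nm, delta_lambda_nm, p_e=0.22):
    # First-order FBG strain recovery: delta_lambda / lambda_B = (1 - p_e) * strain.
    # p_e ~ 0.22 is a typical photo-elastic coefficient for silica fiber (assumed).
    return delta_lambda_nm / (lambda_b_nm * (1.0 - p_e))

# Under this model a 1550 nm grating shifts about 1.2 pm per microstrain,
# so a 0.605 nm shift corresponds to roughly 500 microstrain:
eps = strain_from_shift(1550.0, 0.605)
print(f"{eps * 1e6:.0f} microstrain")  # 500 microstrain
```

    OFDR interrogation extends this idea to thousands of gratings along a single fiber by resolving each grating's spectrum as a function of position.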

  13. Chemical Warfare and Medical Response During World War I

    PubMed Central

    Fitzgerald, Gerard J.

    2008-01-01

    The first large-scale use of a traditional weapon of mass destruction (chemical, biological, or nuclear) involved the successful deployment of chemical weapons during World War I (1914–1918). Historians now refer to the Great War as the chemist’s war because of the scientific and engineering mobilization efforts by the major belligerents. The development, production, and deployment of war gases such as chlorine, phosgene, and mustard created a new and complex public health threat that endangered not only soldiers and civilians on the battlefield but also chemical workers on the home front involved in the large-scale manufacturing processes. The story of chemical weapons research and development during that war provides useful insights for current public health practitioners faced with a possible chemical weapons attack against civilian or military populations. PMID:18356568

  14. GLAD: a system for developing and deploying large-scale bioinformatics grid.

    PubMed

    Teo, Yong-Meng; Wang, Xianbing; Ng, Yew-Kwong

    2005-03-01

    Grid computing is used to solve large-scale bioinformatics problems involving gigabyte-scale databases by distributing the computation across multiple platforms. Until now, in developing bioinformatics grid applications it has been extremely tedious to design and implement the component algorithms and parallelization techniques for different classes of problems, and to access remotely located sequence database files of varying formats across the grid. In this study, we propose a grid programming toolkit, GLAD (Grid Life sciences Applications Developer), which facilitates the development and deployment of bioinformatics applications on a grid. GLAD has been developed using ALiCE (Adaptive scaLable Internet-based Computing Engine), a Java-based grid middleware that exploits task-based parallelism. Two benchmark bioinformatics applications, distributed sequence comparison and distributed progressive multiple sequence alignment, have been developed using GLAD.

  15. Development of the Military Women's Attitudes Toward Menstrual Suppression Scale: from construct definition to pilot testing.

    PubMed

    Trego, Lori L

    2009-01-01

    The Military Women's Attitudes Toward Menstrual Suppression scale (MWATMS) was created to measure attitudes toward menstrual suppression during deployment. The human health and social ecology theories were integrated to conceptualize an instrument that accounts for military-unique aspects of the environment on attitudes toward suppression. A three-step instrument development process was followed to develop the MWATMS. The instrument was pilot tested on a convenience sample of 206 military women with deployment experience. Reliability was tested with measures of internal consistency (alpha = .97); validity was tested with principal components analysis with varimax rotation. Four components accounted for 65% of variance: Benefits/Interest, Hygiene, Convenience, and Soldier/Stress. The pilot test of the MWATMS supported its reliability and validity. Further testing is warranted for validation of this instrument.

  16. Chemical warfare and medical response during World War I.

    PubMed

    Fitzgerald, Gerard J

    2008-04-01

    The first large-scale use of a traditional weapon of mass destruction (chemical, biological, or nuclear) involved the successful deployment of chemical weapons during World War I (1914-1918). Historians now refer to the Great War as the chemist's war because of the scientific and engineering mobilization efforts by the major belligerents. The development, production, and deployment of war gases such as chlorine, phosgene, and mustard created a new and complex public health threat that endangered not only soldiers and civilians on the battlefield but also chemical workers on the home front involved in the large-scale manufacturing processes. The story of chemical weapons research and development during that war provides useful insights for current public health practitioners faced with a possible chemical weapons attack against civilian or military populations.

  17. Qualification flight tests of the Viking decelerator system.

    NASA Technical Reports Server (NTRS)

    Moog, R. D.; Bendura, R. J.; Timmons, J. D.; Lau, R. A.

    1973-01-01

    The Balloon Launched Decelerator Test (BLDT) series conducted at White Sands Missile Range (WSMR) during July and August of 1972 flight qualified the NASA Viking '75 decelerator system at conditions bracketing those expected at Mars. This paper discusses the decelerator system design requirements, compares the test results with prior work, and discusses significant considerations leading to successful qualification in Earth's atmosphere. The Viking decelerator system consists of a single-stage, mortar-deployed, 53-foot nominal diameter disk-gap-band parachute. Full-scale parachutes were deployed behind a full-scale simulated Viking vehicle at Mach numbers from 0.47 to 2.18 and dynamic pressures from 6.9 to 14.6 psf. Analyses show that the system is qualified with sufficient margin to perform successfully for the Viking mission.

  18. Extendable retractable telescopic mast for deployable structures

    NASA Technical Reports Server (NTRS)

    Schmid, M.; Aguirre, M.

    1986-01-01

    The Extendable and Retractable Mast (ERM), which is presently being developed by Dornier under an ESA contract, will be used to deploy and retract large foldable structures. The design is based on a telescopic carbon-fiber structure with high stiffness, strength, and pointing accuracy. To verify the chosen design, a breadboard model of an ERM was built and tested under thermal vacuum (TV) conditions. As a follow-on development, it is planned to manufacture and test an Engineering Model Mast. The Engineering Model will be used to establish the basis for an ERM family covering a wide range of requirements.

  19. Parentage Analysis of Freedom Rootstock

    USDA-ARS?s Scientific Manuscript database

    The rootstock Freedom has been widely deployed for nematode resistance in California vineyards. As the offspring of two open-pollinated parents, its risk of V. vinifera-derived phylloxera susceptibility is unknown. In order to determine the progenitors of Freedom, genetic profiles of candidate paren...

  20. Sorption of Radionuclides to Building Materials and its ...

    EPA Pesticide Factsheets

    Journal article Urban contamination via a number of radiological release scenarios may require simple decontamination methods that can be deployed for wide-area decontamination. This paper investigates a number of factors of importance for developing such decontamination methods, focusing on cesium.

  1. A Lightweight, Precision-Deployable, Optical Bench for High Energy Astrophysics Missions

    NASA Astrophysics Data System (ADS)

    Danner, Rolf; Dailey, D.; Lillie, C.

    2011-09-01

    The small angle of total reflection for X-rays, forcing grazing incidence optics with large collecting areas to long focal lengths, has been a fundamental barrier to the advancement of high-energy astrophysics. Design teams around the world have long recognized that a significant increase in effective area beyond Chandra and XMM-Newton requires either a deployable optical bench or separate X-ray optics and instrument module on formation flying spacecraft. Here, we show that we have in hand the components for a lightweight, precision-deployable optical bench that, through its inherent design features, is the affordable path to the next generation of imaging high-energy astrophysics missions. We present our plans for a full-scale engineering model of a deployable optical bench for Explorer-class missions. We intend to use this test article to raise the technology readiness level (TRL) of the tensegrity truss for a lightweight, precision-deployable optical bench for high-energy astrophysics missions from TRL 3 to TRL 5 through a set of four well-defined technology milestones. The milestones cover the architecture's ability to deploy and control the focal point, characterize the deployed dynamics, determine long-term stability, and verify the stowed load capability. Our plan is based on detailed design and analysis work and the construction of a first prototype by our team. Building on our prior analysis and the high TRL of the architecture components we are ready to move on to the next step. The key elements to do this affordably are two existing, fully characterized, flight-quality, deployable booms. After integrating them into the test article, we will demonstrate that our architecture meets the deployment accuracy, adjustability, and stability requirements. The same test article can be used to further raise the TRL in the future.

  2. A prototype experiment for cooperative monitoring of nuclear reactors with cubic meter scale antineutrino detectors

    NASA Astrophysics Data System (ADS)

    Bernstein, A.; Allen, M.; Bowden, N.; Brennan, J.; Carr, D. J.; Estrada, J.; Hagmann, C.; Lund, J. C.; Madden, N. W.; Winant, C. D.

    2005-09-01

    Our Lawrence Livermore National Laboratory/Sandia National Laboratories collaboration has deployed a cubic-meter-scale antineutrino detector to demonstrate non-intrusive and automatic monitoring of the power levels and plutonium content of a nuclear reactor. Reactor monitoring of this kind is required for all non-nuclear weapons states under the Nuclear Nonproliferation Treaty (NPT), and is implemented by the International Atomic Energy Agency (IAEA). Since the antineutrino count rate and energy spectrum depend on the relative yields of fissioning isotopes in the reactor core, changes in isotopic composition can be observed without ever directly accessing the core. Data from a cubic meter scale antineutrino detector, coupled with the well-understood principles that govern the core's evolution in time, can be used to determine whether the reactor is being operated in an illegitimate way. Our group has deployed a detector at the San Onofre reactor site in California to demonstrate this concept. This paper describes the concept and shows preliminary results from 8 months of operation.
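
    The monitoring principle rests on 239Pu yielding a lower detectable antineutrino rate per fission than 235U, so at fixed thermal power the count rate falls as plutonium builds up in the core. An illustrative calculation with assumed, order-of-magnitude relative yields (real analyses use measured antineutrino spectra and interaction cross sections, not these round numbers):

```python
# Relative detectable antineutrino yield per fission (assumed, illustrative only;
# the key physics is that Pu-239 yields fewer detectable antineutrinos than U-235).
REL_YIELD = {"U235": 1.00, "Pu239": 0.60}

def relative_rate(fission_fractions):
    # At fixed thermal power the detected rate is, to this approximation,
    # a fission-fraction-weighted average of the per-isotope yields.
    return sum(REL_YIELD[iso] * f for iso, f in fission_fractions.items())

start_of_cycle = {"U235": 0.9, "Pu239": 0.1}   # fresh core, mostly U-235
end_of_cycle   = {"U235": 0.6, "Pu239": 0.4}   # plutonium built up
drop = 1 - relative_rate(end_of_cycle) / relative_rate(start_of_cycle)
print(f"rate drop over cycle: {drop:.1%}")  # 12.5% with these assumed numbers
```

    A rate drop of this scale over a fuel cycle is the burnup signature such a detector tracks; a core whose rate evolution departs from the declared operating plan would flag an anomaly.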

  3. Maintaining a Distributed File System by Collection and Analysis of Metrics

    NASA Technical Reports Server (NTRS)

    Bromberg, Daniel

    1997-01-01

    AFS (originally Andrew File System) is a widely deployed distributed file system product used by companies, universities, and laboratories worldwide. However, it is not trivial to operate: running an AFS cell is a formidable task. It requires a team of dedicated and experienced system administrators who must manage a user base numbering in the thousands, rather than the smaller range of 10 to 500 faced by the typical system administrator.

  4. Deploy Nalu/Kokkos algorithmic infrastructure with performance benchmarking.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Domino, Stefan P.; Ananthan, Shreyas; Knaus, Robert C.

    The former Nalu interior heterogeneous algorithm design, which was originally designed to manage matrix assembly operations over all elemental topology types, has been modified to operate over homogeneous collections of mesh entities. This newly templated kernel design allows for removal of workset variable resize operations that were formerly required at each loop over a Sierra ToolKit (STK) bucket (nominally, 512 entities in size). Extensive usage of the Standard Template Library (STL) std::vector has been removed in favor of intrinsic Kokkos memory views. In this milestone effort, the transition to Kokkos as the underlying infrastructure to support performance and portability on many-core architectures has been deployed for key matrix algorithmic kernels. A unit-test driven design effort has developed a homogeneous entity algorithm that employs a team-based thread parallelism construct. The STK Single Instruction Multiple Data (SIMD) infrastructure is used to interleave data for improved vectorization. The collective algorithm design, which allows for concurrent threading and SIMD management, has been deployed for the core low-Mach element-based algorithm. Several tests to ascertain SIMD performance on Intel KNL and Haswell architectures have been carried out. The performance test matrix includes evaluation of both low- and higher-order methods. The higher-order low-Mach methodology builds on polynomial promotion of the core low-order control volume finite element method (CVFEM). Performance testing of the Kokkos-view/SIMD design indicates low-order matrix assembly kernel speed-ups ranging between two and four times depending on mesh loading and node count. Better speedups are observed for higher-order meshes (currently only P=2 has been tested), especially on KNL. The increased workload per element on higher-order meshes benefits from the wide SIMD width on KNL machines. Combining multiple threads with SIMD on KNL achieves a 4.6x speedup over the baseline, with assembly timings faster than those observed on the Haswell architecture. The computational workload of higher-order meshes, therefore, seems ideally suited for the many-core architecture and justifies further exploration of higher-order methods on NGP platforms. A Trilinos/Tpetra-based multi-threaded GMRES preconditioned by symmetric Gauss-Seidel (SGS) represents the core solver infrastructure for the low-Mach advection/diffusion implicit solves. The threaded solver stack has been tested on small problems on NREL's Peregrine system using the newly developed and deployed Kokkos-view/SIMD kernels. Efforts are underway to deploy the Tpetra-based solver stack on the NERSC Cori system to benchmark its performance at scale on KNL machines.

  5. Time Series Analysis for Spatial Node Selection in Environment Monitoring Sensor Networks

    PubMed Central

    Bhandari, Siddhartha; Jurdak, Raja; Kusy, Branislav

    2017-01-01

    Wireless sensor networks are widely used in environmental monitoring. The number of sensor nodes to be deployed will vary depending on the desired spatio-temporal resolution. Selecting an optimal number, position, and sampling rate for an array of sensor nodes in environmental monitoring is a challenging question. Most current solutions are either theoretical or simulation-based, tackling the problem with random field theory, computational geometry, or computer simulations, which limits their specificity to a given sensor deployment. Using an empirical dataset from a mine rehabilitation monitoring sensor network, this work proposes a data-driven approach in which co-integrated time series analysis is used to select the number of sensors from a short-term deployment of a larger set of potential node positions. Analyses conducted on temperature time series show 75% of sensors are co-integrated. Using only 25% of the original nodes can generate a complete dataset within a 0.5 °C average error bound. Our data-driven approach to sensor position selection is applicable for spatio-temporal monitoring of spatially correlated environmental parameters to minimize deployment cost without compromising data resolution. PMID:29271880
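
    Formal co-integration testing (e.g. Engle-Granger) is normally done with a statistics library; as a rough illustration of the selection idea only, the sketch below regresses a candidate sensor on a reference sensor by least squares and declares it redundant when residuals stay within an error bound, analogous to the paper's 0.5 °C criterion. This is a simplified proxy for the authors' analysis, not their method, and all readings are hypothetical:

```python
def redundant(sensor, reference, tol=0.5):
    # Simplified stand-in for a co-integration check: fit reference -> sensor
    # by ordinary least squares and call the sensor redundant if every
    # residual stays within tol (cf. the paper's 0.5 degC error bound).
    n = len(sensor)
    mx, my = sum(reference) / n, sum(sensor) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(reference, sensor))
             / sum((x - mx) ** 2 for x in reference))
    intercept = my - slope * mx
    return all(abs(y - (slope * x + intercept)) <= tol
               for x, y in zip(reference, sensor))

ref    = [20.0, 22.0, 25.0, 23.0, 21.0]   # reference node (kept)
nearby = [20.4, 22.5, 25.3, 23.4, 21.5]   # tracks ref closely -> removable
far    = [20.0, 26.0, 19.0, 28.0, 20.0]   # independent signal -> keep
print(redundant(nearby, ref), redundant(far, ref))  # True False
```

    In the paper's setting this kind of pairwise relationship, established during a short-term dense deployment, justifies removing the redundant nodes and reconstructing their readings from the retained ones.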

  6. The Value of Wind Technology Innovation: Implications for the U.S. Power System, Wind Industry, Electricity Consumers, and Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mai, Trieu T; Lantz, Eric J; Mowers, Matthew

    Improvements to wind technologies have, in part, led to substantial deployment of U.S. wind power in recent years. The degree to which technology innovation will continue is highly uncertain adding to uncertainties in future wind deployment. We apply electric sector modeling to estimate the potential wind deployment opportunities across a range of technology advancement projections. The suite of projections considered span a wide range of possible cost and technology innovation trajectories, including those from a recent expert elicitation of wind energy experts, a projection based on the broader literature, and one reflecting estimates based on a U.S. DOE research initiative.more » In addition, we explore how these deployment pathways may impact the electricity system, electricity consumers, the environment, and the wind-related workforce. Overall, our analysis finds that wind technology innovation can have consequential implications for future wind power development throughout the United States, impact the broader electricity system, lower electric system and consumer costs, provide potential environmental benefits, and grow the U.S. wind workforce.« less

  7. Machine Learning for Flood Prediction in Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Kuhn, C.; Tellman, B.; Max, S. A.; Schwarz, B.

    2015-12-01

    With the increasing availability of high-resolution satellite imagery, dynamic flood mapping in near real time is becoming a reachable goal for decision-makers. This talk describes a newly developed framework for predicting biophysical flood vulnerability using public data, cloud computing and machine learning. Our objective is to define an approach to flood inundation modeling using statistical learning methods deployed in a cloud-based computing platform. Traditionally, static flood extent maps grounded in physically based hydrologic models can require hours of human expertise to construct, at significant financial cost. In addition, desktop modeling software and limited local server storage can impose limits on the size and resolution of input datasets. Data-driven, cloud-based processing holds promise for predictive watershed modeling at a wide range of spatio-temporal scales. However, these benefits come with constraints. In particular, parallel computing limits a modeler's ability to simulate the flow of water across a landscape, rendering traditional routing algorithms unusable in this platform. Our project pushes these limits by testing the performance of two machine learning algorithms, Support Vector Machine (SVM) and Random Forests, at predicting flood extent. Constructed in Google Earth Engine, the model mines a suite of publicly available satellite imagery layers to use as algorithm inputs. Results are cross-validated using MODIS-based flood maps created using the Dartmouth Flood Observatory detection algorithm. Model uncertainty highlights the difficulty of deploying unbalanced training data sets based on rare extreme events.
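
    The core of the approach, training a per-pixel classifier on stacked imagery-derived features against reference flood maps, can be sketched with scikit-learn (an assumed dependency; the features, labels, and flooding rule below are synthetic stand-ins, not the talk's actual inputs):

```python
# Synthetic stand-in for satellite-derived per-pixel features and
# MODIS-derived flood labels (hypothetical data, illustrative only).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 3))                        # e.g. [elevation, slope, rainfall]
y = ((X[:, 0] < 0.4) & (X[:, 2] > 0.5)).astype(int)   # toy flooding rule

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
score = clf.score(X, y)   # training accuracy on the toy data
print(score)
```

    For rare-event labels such as flood extent, the class imbalance the abstract notes is commonly mitigated with options like `class_weight='balanced'` on the classifier, or by stratified sampling of training pixels.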

  8. CBRAIN: a web-based, distributed computing platform for collaborative neuroimaging research

    PubMed Central

    Sherif, Tarek; Rioux, Pierre; Rousseau, Marc-Etienne; Kassis, Nicolas; Beck, Natacha; Adalat, Reza; Das, Samir; Glatard, Tristan; Evans, Alan C.

    2014-01-01

    The Canadian Brain Imaging Research Platform (CBRAIN) is a web-based collaborative research platform developed in response to the challenges raised by data-heavy, compute-intensive neuroimaging research. CBRAIN offers transparent access to remote data sources, distributed computing sites, and an array of processing and visualization tools within a controlled, secure environment. Its web interface is accessible through any modern browser and uses graphical interface idioms to reduce the technical expertise required to perform large-scale computational analyses. CBRAIN's flexible meta-scheduling has allowed the incorporation of a wide range of heterogeneous computing sites, currently including nine national research High Performance Computing (HPC) centers in Canada, one in Korea, one in Germany, and several local research servers. CBRAIN leverages remote computing cycles and facilitates resource interoperability in a transparent manner for the end-user. Compared with typical grid solutions available, our architecture was designed to be easily extendable and deployed on existing remote computing sites with no tool modification, administrative intervention, or special software/hardware configuration. As of October 2013, CBRAIN serves over 200 users spread across 53 cities in 17 countries. The platform is built as a generic framework that can accept data and analysis tools from any discipline. However, its current focus is primarily on neuroimaging research and studies of neurological diseases such as autism, Parkinson's and Alzheimer's diseases, and multiple sclerosis, as well as on normal brain structure and development. This technical report presents the CBRAIN platform, its current deployment and usage, and future directions. PMID:24904400

  9. Effects of sand fences on coastal dune vegetation distribution

    NASA Astrophysics Data System (ADS)

    Grafals-Soto, Rosana

    2012-04-01

    Sand fences are important human adjustments modifying the morphology of developed shores. The effects of sand fences on sediment transport and deposition in their initial stages have been well studied, but little is known about the effect of deteriorated sand fences that have become partially buried, low-scale barriers within the dune, potentially benefiting vegetation growth by protecting it from onshore stress. Data on vegetation, topography, and fence characteristics were gathered at three dune sites in Ocean City, New Jersey in September 2007 and March 2008 to evaluate the effect of fences within the dune on vegetation distribution. Variables include: distance landward of dune toe, degree of sheltering from onshore stressors, net change in surface elevation (deposition or erosion), vegetation diversity and density, presence of remnant fence, and distance landward of fence. Results for the studied environment reveal that 1) vegetation diversity and density do not increase near remnant fences, because most remnants are lower than average vegetation height and cannot provide shelter; but 2) vegetation distribution is related to topographic variables, such as degree of sheltering, that are most likely the result of sand accretion caused by fence deployment. Fence deployment that prioritizes the creation of topographically diverse dunes within a restricted space may increase the diversity and density of the vegetation, and the resilience and value of developed dunes. Managers should consider the benefits of using sand fences on appropriately wide beaches to create a protective dune that is also diverse, functional, and better able to adapt to change.

  10. CBRAIN: a web-based, distributed computing platform for collaborative neuroimaging research.

    PubMed

    Sherif, Tarek; Rioux, Pierre; Rousseau, Marc-Etienne; Kassis, Nicolas; Beck, Natacha; Adalat, Reza; Das, Samir; Glatard, Tristan; Evans, Alan C

    2014-01-01

    The Canadian Brain Imaging Research Platform (CBRAIN) is a web-based collaborative research platform developed in response to the challenges raised by data-heavy, compute-intensive neuroimaging research. CBRAIN offers transparent access to remote data sources, distributed computing sites, and an array of processing and visualization tools within a controlled, secure environment. Its web interface is accessible through any modern browser and uses graphical interface idioms to reduce the technical expertise required to perform large-scale computational analyses. CBRAIN's flexible meta-scheduling has allowed the incorporation of a wide range of heterogeneous computing sites, currently including nine national research High Performance Computing (HPC) centers in Canada, one in Korea, one in Germany, and several local research servers. CBRAIN leverages remote computing cycles and facilitates resource interoperability in a manner that is transparent to the end user. Compared with typical grid solutions available, our architecture was designed to be easily extendable and deployed on existing remote computing sites with no tool modification, administrative intervention, or special software/hardware configuration. As of October 2013, CBRAIN serves over 200 users spread across 53 cities in 17 countries. The platform is built as a generic framework that can accept data and analysis tools from any discipline. However, its current focus is primarily on neuroimaging research and studies of neurological diseases such as autism, Parkinson's disease, Alzheimer's disease, and multiple sclerosis, as well as on normal brain structure and development. This technical report presents the CBRAIN Platform, its current deployment and usage, and future directions.

  11. EOS developments

    NASA Astrophysics Data System (ADS)

    Sindrilaru, Elvin A.; Peters, Andreas J.; Adde, Geoffray M.; Duellmann, Dirk

    2017-10-01

    CERN has been developing and operating EOS as a disk storage solution successfully for over 6 years. The CERN deployment provides 135 PB and stores 1.2 billion replicas distributed over two computer centres. The deployment includes four LHC instances, a shared instance for smaller experiments and, since last year, an instance for individual user data as well. The user instance represents the backbone of the CERNBOX service for file sharing. New use cases like synchronisation and sharing, the planned migration to reduce AFS usage at CERN, and continuous growth have brought EOS to new challenges. Recent developments include the integration and evaluation of various technologies to make the transition from a single active in-memory namespace to a scale-out implementation distributed over many meta-data servers. The new architecture aims to separate the data from the application logic and user interface code, thus providing flexibility and scalability to the namespace component. Another important goal is to provide EOS as a CERN-wide mounted filesystem with strong authentication, making it a single storage repository accessible via various services and front-ends (the /eos initiative). This required new developments in the security infrastructure of the EOS FUSE implementation. Furthermore, there was a series of improvements targeting the end-user experience, like tighter consistency and latency optimisations. In collaboration with Seagate as an Openlab partner, EOS has a complete integration of the OpenKinetic object drive cluster as a high-throughput, high-availability, low-cost storage solution. This contribution will discuss these three main development projects and present new performance metrics.

  12. Open-Source Python Tools for Deploying Interactive GIS Dashboards for a Billion Datapoints on a Laptop

    NASA Astrophysics Data System (ADS)

    Steinberg, P. D.; Bednar, J. A.; Rudiger, P.; Stevens, J. L. R.; Ball, C. E.; Christensen, S. D.; Pothina, D.

    2017-12-01

    The rich variety of software libraries available in the Python scientific ecosystem provides a flexible and powerful alternative to traditional integrated GIS (geographic information system) programs. Each such library focuses on doing a certain set of general-purpose tasks well, and Python makes it relatively simple to glue the libraries together to solve a wide range of complex, open-ended problems in Earth science. However, choosing an appropriate set of libraries can be challenging, and it is difficult to predict how much "glue code" will be needed for any particular combination of libraries and tasks. Here we present a set of libraries that have been designed to work well together to build interactive analyses and visualizations of large geographic datasets, in standard web browsers. The resulting workflows run on ordinary laptops even for billions of data points, and easily scale up to larger compute clusters when available. The declarative top-level interface used in these libraries means that even complex, fully interactive applications can be built and deployed as web services using only a few dozen lines of code, making it simple to create and share custom interactive applications even for datasets too large for most traditional GIS systems. The libraries we will cover include GeoViews (HoloViews extended for geographic applications) for declaring visualizable/plottable objects, Bokeh for building visual web applications from GeoViews objects, Datashader for rendering arbitrarily large datasets faithfully as fixed-size images, Param for specifying user-modifiable parameters that model your domain, Xarray for computing with n-dimensional array data, Dask for flexibly dispatching computational tasks across processors, and Numba for compiling array-based Python code down to fast machine code. 
We will show how to use the resulting workflow with static datasets and with simulators such as GSSHA or AdH, allowing you to deploy flexible, high-performance web-based dashboards for your GIS data or simulations without needing major investments in code development or maintenance.
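    The aggregation step that lets these tools scale to a billion points, reducing any number of input points to a fixed-size image before rendering, can be sketched with plain NumPy (a simplified stand-in for Datashader's canvas aggregation; the function and parameter names below are illustrative, not Datashader's API):

```python
import numpy as np

def rasterize_points(x, y, width=300, height=200, x_range=None, y_range=None):
    """Bin scattered points into a fixed-size count grid (Datashader-style).

    However many input points there are, the output is always height x width,
    so downstream rendering cost is bounded by the image size, not the data.
    """
    x_range = x_range or (float(x.min()), float(x.max()))
    y_range = y_range or (float(y.min()), float(y.max()))
    counts, _, _ = np.histogram2d(y, x, bins=(height, width),
                                  range=(y_range, x_range))
    return counts

# One million synthetic points aggregated to a 300 x 200 grid.
rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)
y = rng.normal(size=1_000_000)
img = rasterize_points(x, y)
```

    In the real libraries, a per-pixel aggregate like this is then colormapped and served interactively through Bokeh, with Dask and Numba handling the out-of-core and compiled-speed aspects.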

  13. Estimating trans-seasonal variability in water column biomass for a highly migratory, deep diving predator.

    PubMed

    O'Toole, Malcolm D; Lea, Mary-Anne; Guinet, Christophe; Hindell, Mark A

    2014-01-01

    The deployment of animal-borne electronic tags is revolutionizing our understanding of how pelagic species respond to their environment by providing in situ oceanographic information such as temperature, salinity, and light measurements. These tags, deployed on pelagic animals, provide data that can be used to study the ecological context of their foraging behaviour and surrounding environment. Satellite-derived measures of ocean colour reveal temporal and spatial variability of surface chlorophyll-a (a useful proxy for phytoplankton distribution). However, this information can be patchy in space and time, resulting in poor correspondence with marine animal behaviour. Alternatively, light data collected by animal-borne tag sensors can be used to estimate chlorophyll-a distribution. Here, we use light level and depth data to generate a phytoplankton index that matches daily seal movements. Time-depth-light recorders (TDLRs) were deployed on 89 southern elephant seals (Mirounga leonina) over a period of 6 years (1999-2005). TDLR data were used to calculate integrated light attenuation of the top 250 m of the water column (LA(250)), which provided an index of phytoplankton density at the daily scale that was concurrent with the movement and behaviour of seals throughout their entire foraging trip. These index values were consistent with typical seasonal chl-a patterns as measured from 8-day Sea-viewing Wide Field-of-view Sensor (SeaWiFS) images. The availability of data recorded by the TDLRs was far greater than that of concurrent remotely sensed chl-a at higher latitudes and during winter months. Improving the spatial and temporal availability of phytoplankton information concurrent with animal behaviour has ecological implications for understanding the movement of deep diving predators in relation to lower trophic levels in the Southern Ocean.
Light attenuation profiles recorded by animal-borne electronic tags can be used more broadly and routinely to estimate lower trophic distribution at sea in relation to deep diving predator foraging behaviour.
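    The integrated light-attenuation index can be sketched from a single depth-light profile (a simplified stand-in for the LA(250) calculation; the study's actual tag processing may differ):

```python
import numpy as np

def la_250(depth_m, light_level, z_max=250.0):
    """Integrated light attenuation over the top z_max metres of the column.

    Attenuation at each depth is taken as the drop in log light level
    relative to the surface, integrated over depth (trapezoidal rule).
    """
    keep = depth_m <= z_max
    z = depth_m[keep]
    ln_e = np.log(light_level[keep])
    attenuation = ln_e[0] - ln_e                  # log-units lost vs. surface
    return float(np.sum((attenuation[1:] + attenuation[:-1]) / 2 * np.diff(z)))

# Synthetic profile: pure exponential decay with attenuation k = 0.05 per metre,
# for which the index evaluates analytically to k * z_max**2 / 2 = 1562.5.
z = np.linspace(0.0, 250.0, 251)
e = 1000.0 * np.exp(-0.05 * z)
index = la_250(z, e)
```

    Higher index values correspond to murkier, more particle-rich water, which is why the quantity can serve as a phytoplankton proxy.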

  14. Using a Network Model to Assess Risk of Forest Pest Spread via Recreational Travel

    PubMed Central

    Koch, Frank H.; Yemshanov, Denys; Haack, Robert A.; Magarey, Roger D.

    2014-01-01

    Long-distance dispersal pathways, which frequently relate to human activities, facilitate the spread of alien species. One pathway of concern in North America is the possible spread of forest pests in firewood carried by visitors to campgrounds or recreational facilities. We present a network model depicting the movement of campers and, by extension, potentially infested firewood. We constructed the model from US National Recreation Reservation Service data documenting more than seven million visitor reservations (including visitors from Canada) at campgrounds nationwide. This bi-directional model can be used to identify likely origin and destination locations for a camper-transported pest. To support broad-scale decision making, we used the model to generate summary maps for 48 US states and seven Canadian provinces that depict the most likely origins of campers traveling from outside the target state or province. The maps generally showed one of two basic spatial patterns of out-of-state (or out-of-province) origin risk. In the eastern United States, the riskiest out-of-state origin locations were usually found in a localized region restricted to portions of adjacent states. In the western United States, the riskiest out-of-state origin locations were typically associated with major urban areas located far from the state of interest. A few states and the Canadian provinces showed characteristics of both patterns. These model outputs can guide deployment of resources for surveillance, firewood inspections, or other activities. Significantly, the contrasting map patterns indicate that no single response strategy is appropriate for all states and provinces. If most out-of-state campers are traveling from distant areas, it may be effective to deploy resources at key points along major roads (e.g., interstate highways), since these locations could effectively represent bottlenecks of camper movement. 
If most campers are from nearby areas, they may have many feasible travel routes, so a more widely distributed deployment may be necessary. PMID:25007186
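    The bi-directional origin-destination aggregation at the heart of such a network model can be sketched in a few lines (a toy illustration with invented place names, not the authors' actual reservation-data pipeline):

```python
from collections import Counter

def out_of_state_origin_risk(reservations, target_state):
    """Rank out-of-state origin locations by number of trips into a target state.

    `reservations` is an iterable of (origin, destination) pairs, where each
    location is a (state, place) tuple -- a simplified stand-in for the
    reservation records used to build the camper-movement network.
    """
    risk = Counter()
    for origin, destination in reservations:
        if destination[0] == target_state and origin[0] != target_state:
            risk[origin] += 1
    return risk.most_common()

trips = [
    (("MI", "Detroit"), ("OH", "Camp A")),
    (("MI", "Lansing"), ("OH", "Camp A")),
    (("OH", "Columbus"), ("OH", "Camp B")),   # in-state trip: ignored
    (("IL", "Chicago"), ("OH", "Camp B")),
    (("MI", "Detroit"), ("OH", "Camp B")),
]
ranking = out_of_state_origin_risk(trips, "OH")
# ranking[0] is the most frequent out-of-state origin for the target state
```

    Swapping the roles of origin and destination gives the other direction of the bi-directional model, i.e. the likeliest destinations of campers leaving a given state.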

  15. Using a network model to assess risk of forest pest spread via recreational travel.

    PubMed

    Koch, Frank H; Yemshanov, Denys; Haack, Robert A; Magarey, Roger D

    2014-01-01

    Long-distance dispersal pathways, which frequently relate to human activities, facilitate the spread of alien species. One pathway of concern in North America is the possible spread of forest pests in firewood carried by visitors to campgrounds or recreational facilities. We present a network model depicting the movement of campers and, by extension, potentially infested firewood. We constructed the model from US National Recreation Reservation Service data documenting more than seven million visitor reservations (including visitors from Canada) at campgrounds nationwide. This bi-directional model can be used to identify likely origin and destination locations for a camper-transported pest. To support broad-scale decision making, we used the model to generate summary maps for 48 US states and seven Canadian provinces that depict the most likely origins of campers traveling from outside the target state or province. The maps generally showed one of two basic spatial patterns of out-of-state (or out-of-province) origin risk. In the eastern United States, the riskiest out-of-state origin locations were usually found in a localized region restricted to portions of adjacent states. In the western United States, the riskiest out-of-state origin locations were typically associated with major urban areas located far from the state of interest. A few states and the Canadian provinces showed characteristics of both patterns. These model outputs can guide deployment of resources for surveillance, firewood inspections, or other activities. Significantly, the contrasting map patterns indicate that no single response strategy is appropriate for all states and provinces. If most out-of-state campers are traveling from distant areas, it may be effective to deploy resources at key points along major roads (e.g., interstate highways), since these locations could effectively represent bottlenecks of camper movement. 
If most campers are from nearby areas, they may have many feasible travel routes, so a more widely distributed deployment may be necessary.

  16. Geochemical particle fluxes in the Southern Indian Ocean seasonal ice zone: Prydz Bay region, East Antarctica

    NASA Astrophysics Data System (ADS)

    Pilskaln, C. H.; Manganini, S. J.; Trull, T. W.; Armand, L.; Howard, W.; Asper, V. L.; Massom, R.

    2004-02-01

    Time-series sediment traps were deployed between December 1998 and January 2000 and from March 2000 to February 2001 at two offshore Prydz Bay sites within the seasonal ice zone (SIZ) of the Southern Indian Ocean, located between 62-63°S and 73-76°E, to quantify seasonal biogeochemical particle fluxes. Samples were obtained from traps placed at 1400, 2400, and 3400 m during the first deployment year (PZB-1) and from 3300 m in the second deployment year (PZB-2). All geochemical export fluxes were highly seasonal, with primary peaks occurring during the austral summer and relatively low fluxes prevailing through the winter months. Secondary flux peaks in mid-winter and in early spring were suggestive of small-scale sea-ice break-up events and the spring retreat of seasonal ice, respectively. Biogenic silica represented over 70% (by weight) of the collected trap material and provided an annual opal export of 18 g m⁻² to 1 km and 3-10 g m⁻² to 3 km. POC fluxes supplied an annual export of approximately 1 g m⁻², equal to the estimated ocean-wide average. Elevated particulate Corg/Cinorg and Sibio/Cinorg molar ratios indicate a productive, diatom-dominated system, although consistently small fluxes of planktonic foraminifera and pteropod shells document a heterotrophic source of carbonate to deeper waters in the SIZ. The observation of high Sibio/Corg ratios and the δ15N time-series data suggest enhanced rates of diatom-POC remineralization in the upper 1000 m relative to bioSiO2. The occurrence in this region of a pronounced temperature minimum, associated with a strong pycnocline and subsurface particle maximum at 50-100 m, may represent a zone where sinking, diatom-rich particulates temporarily accumulate and POC is remineralized.
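    The Sibio/Corg molar ratio discussed above can be illustrated with the reported round-number annual fluxes (a back-of-envelope sketch; the opal molar mass used here, 67.2 g/mol for SiO2·0.4H2O, is a common convention rather than a value taken from the study):

```python
OPAL_MOLAR_MASS = 67.2   # g/mol, assuming biogenic opal as SiO2·0.4H2O
C_MOLAR_MASS = 12.01     # g/mol for carbon

def si_to_c_molar_ratio(opal_flux_g_m2, poc_flux_g_m2):
    """Sibio/Corg molar ratio from annual mass fluxes (g per m^2 per year)."""
    return (opal_flux_g_m2 / OPAL_MOLAR_MASS) / (poc_flux_g_m2 / C_MOLAR_MASS)

# Roughly 18 g opal and 1 g POC per m^2 per year at the 1 km traps.
ratio = si_to_c_molar_ratio(18.0, 1.0)
```

    A ratio well above 1 mol Si per mol C is consistent with the diatom-dominated, heavily silicified export the abstract describes.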

  17. SoilSCAPE in-Situ Observations of Soil Moisture for SMAP Validation: Pushing the Envelopes of Spatial Coverage and Energy Efficiency in Sparse Wireless Sensor Networks (Invited)

    NASA Astrophysics Data System (ADS)

    Moghaddam, M.; Silva, A.; Clewley, D.; Akbar, R.; Entekhabi, D.

    2013-12-01

    Soil Moisture Sensing Controller and oPtimal Estimator (SoilSCAPE) is a wireless in-situ sensor network technology, developed under the support of the NASA ESTO/AIST program, for multi-scale validation of soil moisture retrievals from the Soil Moisture Active Passive (SMAP) mission. The SMAP sensor suite is expected to produce soil moisture retrievals at 3 km scale from the radar instrument, at 36 km from the radiometer, and at 10 km from the combination of the two sensors. To validate the retrieved soil moisture maps at any of these scales, it is necessary to perform in-situ observations at multiple scales (tens, hundreds, and thousands of meters), representative of the true spatial variability of soil moisture fields. The most recent SoilSCAPE network, deployed in California's Central Valley, has been designed, built, and deployed to accomplish this goal and is expected to become a core validation site for SMAP. The network consists of up to 150 sensor nodes, each comprising 3-4 soil moisture sensors at various depths, deployed over a spatial extent of 36 km by 36 km. The network contains multiple sub-networks, each having up to 30 nodes, whose locations are selected in part to maximize the land cover diversity within the 36 km cell. The network has achieved unprecedented energy efficiency, longevity, and spatial coverage using custom-designed hardware and software protocols. The network architecture utilizes a nested strategy, in which a number of end devices (EDs) communicate with a local coordinator (LC) using our recently developed hardware with ultra-efficient circuitry and a best-effort timeslot-allocation communication protocol. The LCs in turn communicate with the base station (BS) via text messages and a new compression scheme. The hardware and software technologies required to implement this latest deployment of the SoilSCAPE network will be presented in this paper, and several data sets resulting from the measurements will be shown.
The data are available publicly in near-real-time from the project web site, and are also available and searchable via an extensive set of metadata fields through the ORNL-DAAC.

  18. Relevant climate response tests for stratospheric aerosol injection: A combined ethical and scientific analysis

    NASA Astrophysics Data System (ADS)

    Lenferna, Georges Alexandre; Russotto, Rick D.; Tan, Amanda; Gardiner, Stephen M.; Ackerman, Thomas P.

    2017-06-01

    In this paper, we focus on stratospheric sulfate injection as a geoengineering scheme, and provide a combined scientific and ethical analysis of climate response tests, which are a subset of outdoor tests that would seek to impose detectable and attributable changes to climate variables on global or regional scales. We assess the current state of scientific understanding on the plausibility and scalability of climate response tests. Then, we delineate a minimal baseline against which to consider whether certain climate response tests would be relevant for a deployment scenario. Our analysis shows that some climate response tests, such as those attempting to detect changes in regional climate impacts, may not be deployable in time periods relevant to realistic geoengineering scenarios. This might pose significant challenges for justifying stratospheric sulfate aerosol injection deployment overall. We then survey some of the major ethical challenges that proposed climate response tests face. We consider what levels of confidence would be required to ethically justify approving a proposed test; whether the consequences of tests are subject to similar questions of justice, compensation, and informed consent as full-scale deployment; and whether questions of intent and hubris are morally relevant for climate response tests. We suggest further research into laboratory-based work and modeling may help to narrow the scientific uncertainties related to climate response tests, and help inform future ethical debate. However, even if such work is pursued, the ethical issues raised by proposed climate response tests are significant and manifold.

  19. Effects of Deployment Investment on the Growth of the Biofuels Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vimmerstedt, Laura J.; Bush, Brian W.

    2013-12-01

    In support of the national goals for biofuel use in the United States, numerous technologies have been developed that convert biomass to biofuels. Some of these biomass-to-biofuel conversion technology pathways are operating at commercial scales, while others are in earlier stages of development. The advancement of a new pathway toward commercialization involves various types of progress, including yield improvements, process engineering, and financial performance. Actions of private investors and public programs can accelerate the demonstration and deployment of new conversion technology pathways. These investors (both private and public) will pursue a range of pilot-, demonstration-, and pioneer-scale biorefinery investments; the most cost-effective set of investments for advancing the maturity of any given biomass-to-biofuel conversion technology pathway is unknown. In some cases, whether or not the pathway itself will ultimately be technically and financially successful is also unknown. This report presents results from the Biomass Scenario Model -- a system dynamics model of the biomass-to-biofuels system -- that estimate effects of investments in biorefineries at different maturity levels and operational scales. The report discusses challenges in estimating effects of such investments and explores the interaction between this deployment investment and a volumetric production incentive. Model results show that investments in demonstration and deployment have a substantial positive effect on the development of the biofuels industry. Results also show that other conditions, such as supportive policies, have major impacts on the effectiveness of such investments.

  20. Temporal and spatial variability of aeolian sand transport: Implications for field measurements

    NASA Astrophysics Data System (ADS)

    Ellis, Jean T.; Sherman, Douglas J.; Farrell, Eugene J.; Li, Bailiang

    2012-01-01

    Horizontal variability is often cited as one source of disparity between observed and predicted rates of aeolian mass flux, but few studies have quantified the magnitude of this variability. Two field projects were conducted to evaluate meter-scale spatial and temporal variability in the saltation field. In Shoalhaven Heads, NSW, Australia, a horizontal array of passive-style sand traps was deployed on a beach for 600 or 1200 s across a horizontal span of 0.80 m. In Jericoacoara, Brazil, traps spanning 4 m were deployed for 180 and 240 s. Five saltation sensors (miniphones) spaced 1 m apart were also deployed at Jericoacoara. Spatial variation in aeolian transport rates over small spatial and short temporal scales was substantial. The measured transport rates (Q) obtained from the passive traps ranged from 0.70 to 32.63 g/m/s. When considering all traps, the coefficient of variation (CoV) values ranged from 16.6% to 67.8%, and minimum and maximum range of variation coefficient (RVC) values were 106.1% to 152.5% and 75.1% to 90.8%, respectively. The miniphone Q and CoV averaged 47.1% and 4.1% for the 1260 s data series, which was subsequently sub-sampled at 60-630 s intervals to simulate shorter deployment times. A statistically significant (p < 0.002) inverse linear relationship was found between sample duration and CoV and between Q and CoV, the latter relationship also considering data from previous studies.
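    The CoV statistic reported above is simple to reproduce (a worked example with invented trap data; whether the study used the sample or population standard deviation is not stated, so the sample form is assumed here):

```python
import statistics

def coefficient_of_variation(samples):
    """Coefficient of variation (CoV) as a percentage: 100 * stdev / mean."""
    return 100.0 * statistics.stdev(samples) / statistics.mean(samples)

# Hypothetical transport rates (g/m/s) from five side-by-side traps.
q = [10.0, 14.0, 8.0, 12.0, 16.0]
cov = coefficient_of_variation(q)   # sample stdev ≈ 3.16, mean = 12
```

    A CoV near 26% for traps less than a metre apart would already be a substantial meter-scale spread, of the order the study reports.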

  1. Improving liquid bait programs for Argentine ant control: bait station density.

    PubMed

    Nelson, Erik H; Daane, Kent M

    2007-12-01

    Argentine ants, Linepithema humile (Mayr), have a positive effect on populations of mealybugs (Pseudococcus spp.) in California vineyards. Previous studies have shown reductions in both ant activity and mealybug numbers after liquid ant baits were deployed in vineyards at densities of 85-620 bait stations/ha. However, bait station densities may need to be <85 bait stations/ha before bait-based strategies for ant control are economically comparable to spray-based insecticide treatments, a condition that, if met, will encourage the commercial adoption of liquid baits for ant control. This research assessed the effectiveness of baits deployed at lower densities. Two field experiments were conducted in commercial vineyards. In experiment 1, baits were deployed at 54-225 bait stations/ha in 2005 and 2006. In experiment 2, baits were deployed at 34-205 bait stations/ha in 2006 only. In both experiments, ant activity and the density of mealybugs in grape fruit clusters at harvest time declined with increasing bait station density. In 2005 only, European fruit lecanium scale [Parthenolecanium corni (Bouché)] were also present in fruit clusters, and scale densities were negatively related to bait station density. The results indicate that the amount of ant and mealybug control achieved by an incremental increase in the number of bait stations per hectare is constant across a broad range of bait station densities. The results are discussed in the context of commercializing liquid ant baits to provide a more sustainable Argentine ant control strategy.

  2. An Overview of Communications Technology and Development Efforts for 2015 SBIR Phase I

    NASA Technical Reports Server (NTRS)

    Nguyen, Hung D.; Steele, Gynelle C.

    2017-01-01

    This report highlights innovative SBIR 2015 Phase I projects specifically addressing areas in Communications Technology and Development, which is one of six core competencies at NASA Glenn Research Center. Fifteen technologies are featured, spanning a wide spectrum of applications: novel solid-state lasers for space-based water vapor DIAL; wide-temperature, high-voltage, high-energy-density capacitors for aerospace exploration; an instrument for airborne measurement of carbonyl sulfide; a high-power tunable seed laser for a methane lidar transmitter; a ROC-rib deployable Ka-band antenna for nanosatellites; a SiC-based microcontroller for high-temperature in-situ instruments and systems; improved yield, performance, and reliability of high-actuator-count deformable mirrors; an embedded multifunctional optical sensor system; switching electronics for space-based telescopes with advanced AO systems; an integrated miniature DBR laser module for lidar instruments; and much more. Each article in this booklet describes an innovation and its technical objective, and highlights NASA, commercial, and industrial applications.

  3. GENERAL EARTHQUAKE-OBSERVATION SYSTEM (GEOS).

    USGS Publications Warehouse

    Borcherdt, R.D.; Fletcher, Joe B.; Jensen, E.G.; Maxwell, G.L.; VanSchaack, J.R.; Warrick, R.E.; Cranswick, E.; Johnston, M.J.S.; McClearn, R.

    1985-01-01

    Microprocessor technology has permitted the development of a General Earthquake-Observation System (GEOS) useful for most seismic applications. Central-processing-unit control via robust software of system functions that are isolated on hardware modules permits field adaptability of the system to a wide variety of active and passive seismic experiments and straightforward modification for incorporation of improvements in technology. Various laboratory tests and numerous deployments of a set of the systems in the field have confirmed design goals, including: wide linear dynamic range (16 bit/96 dB); broad bandwidth (36 hr to 600 Hz; greater than 36 hr available); selectable sensor-type (accelerometer, seismometer, dilatometer); selectable channels (1 to 6); selectable record mode (continuous, preset, trigger); large data capacity (1. 4 to 60 Mbytes); selectable time standard (WWVB, master, manual); automatic self-calibration; simple field operation; full capability to adapt system in the field to a wide variety of experiments; low power; portability; and modest costs. System design goals for a microcomputer-controlled system with modular software and hardware components as implemented on the GEOS are presented. The systems have been deployed for 15 experiments, including: studies of near-source strong motion; high-frequency microearthquakes; crustal structure; down-hole wave propagation; teleseismicity; and earth-tidal strains.

  4. 8760-Based Method for Representing Variable Generation Capacity Value in Capacity Expansion Models: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frew, Bethany A; Cole, Wesley J; Sun, Yinong

    Capacity expansion models (CEMs) are widely used to evaluate the least-cost portfolio of electricity generators, transmission, and storage needed to reliably serve demand over the evolution of many years or decades. Various CEM formulations are used to evaluate systems ranging in scale from states or utility service territories to national or multi-national systems. CEMs can be computationally complex, and to achieve acceptable solve times, key parameters are often estimated using simplified methods. In this paper, we focus on two of these key parameters associated with the integration of variable generation (VG) resources: capacity value and curtailment. We first discuss common modeling simplifications used in CEMs to estimate capacity value and curtailment, many of which are based on a representative subset of hours that can miss important tail events or which require assumptions about the load and resource distributions that may not match actual distributions. We then present an alternate approach that captures key elements of chronological operation over all hours of the year without the computationally intensive economic dispatch optimization typically employed within more detailed operational models. The updated methodology characterizes (1) the contribution of VG to system capacity during high load and net load hours, (2) the curtailment level of VG, and (3) the potential reductions in curtailments enabled through deployment of storage and more flexible operation of select thermal generators. We apply this alternate methodology to an existing CEM, the Regional Energy Deployment System (ReEDS). Results demonstrate that this alternate approach provides more accurate estimates of capacity value and curtailments by explicitly capturing system interactions across all hours of the year.
This approach could be applied more broadly to CEMs at many different scales where hourly resource and load data are available, greatly improving the representation of challenges associated with the integration of variable generation resources.
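The first two quantities the methodology characterizes, VG's capacity contribution during high net-load hours and its curtailment, can be sketched directly from 8760-hour profiles. The following is a minimal illustration, not the ReEDS implementation: the 100-hour peak window and the use of maximum output as a nameplate proxy are assumptions for the sketch.

```python
import numpy as np

def capacity_value_and_curtailment(load, vg, top_n=100):
    """Illustrative 8760-style calculation from hourly arrays (MW)."""
    net_load = load - vg
    # Capacity credit: average VG output during the top-N net-load hours,
    # expressed as a fraction of nameplate (max output used as a proxy here).
    peak_hours = np.argsort(net_load)[-top_n:]
    capacity_credit = vg[peak_hours].mean() / vg.max()
    # Curtailment: VG in excess of load (no must-run generation assumed)
    # cannot be absorbed without storage or flexible thermal operation.
    surplus = np.maximum(vg - load, 0.0)
    curtailment_fraction = surplus.sum() / vg.sum()
    return capacity_credit, curtailment_fraction
```

Because every hour of the year is examined chronologically, tail events that a representative-hour subset would miss are captured, which is the central argument of the abstract.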

  5. Analytical model for local scour prediction around hydrokinetic turbine foundations

    NASA Astrophysics Data System (ADS)

    Musa, M.; Heisel, M.; Hill, C.; Guala, M.

    2017-12-01

    Marine and hydrokinetic renewable energy is an emerging sustainable and secure technology that produces clean energy by harnessing water currents, mostly in tidal and fluvial waterways. Hydrokinetic turbines are typically anchored at the bottom of the channel, which can be erodible or non-erodible. Recent experiments demonstrated the interactions between operating turbines and an erodible bed with sediment transport, resulting in a remarkable localized erosion-deposition pattern significantly larger than those observed around static in-river structures such as bridge piers. Predicting local scour geometry at the base of hydrokinetic devices is extremely important for foundation design, installation, operation, and maintenance (IO&M), and for long-term structural integrity. An analytical modeling framework is proposed that applies the phenomenological theory of turbulence to the flow structures that promote the scouring process at the base of a turbine. The evolution of scour is directly linked to device operating conditions through the turbine drag force, which is inferred to locally dictate the energy dissipation rate in the scour region. The predictive model is validated using experimental data obtained at the University of Minnesota's St. Anthony Falls Laboratory (SAFL), covering two sediment mobility regimes (clear water and live bed), different turbine designs, hydraulic parameters, grain size distributions, and bedform types. The model is applied to a potential prototype-scale deployment in the lower Mississippi River, demonstrating its practical relevance and endorsing the feasibility of hydrokinetic power plants in large sandy rivers. Multi-turbine deployments are further studied experimentally by monitoring both local and non-local geomorphic effects introduced by a twelve-turbine staggered array model installed in a wide channel at SAFL. Local scour behind each turbine is well captured by the theoretical predictive model. 
However, multi-turbine configurations introduce subtle large-scale effects that deepen local scour within the first two rows of the array and develop spatially as a two-dimensional oscillation of the mean bed downstream of the entire array.
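The model's driving input is the turbine drag (thrust) force, which the abstract says is used to infer the local energy dissipation rate. A sketch of that force term under standard actuator-disk assumptions follows; the default values are illustrative, and the paper's calibrated scour closure is not reproduced here.

```python
import math

def turbine_drag_force(rho=1000.0, c_t=0.8, diameter=0.5, velocity=1.0):
    """Rotor thrust F = 0.5 * rho * C_T * A * U^2 (SI units).

    rho: water density, c_t: thrust coefficient, diameter: rotor diameter,
    velocity: incoming current speed. All defaults are illustrative.
    """
    area = math.pi * (diameter / 2.0) ** 2  # swept area of the rotor disk
    return 0.5 * rho * c_t * area * velocity ** 2
```

The quadratic dependence on velocity is what ties scour evolution to device operating conditions: a modest change in current speed changes the drag, and hence the dissipation available to mobilize sediment, disproportionately.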

  6. A return on investment study of the Hampton Roads Safety Service Patrol program.

    DOT National Transportation Integrated Search

    2007-01-01

    Safety Service Patrol (SSP) programs are widely used to help mitigate the effects of nonrecurring congestion on our nation's highways and have become an increasingly vital element of incident management programs. SSPs are typically deployed in areas ...

  7. Managing Investment in Teaching and Learning Technologies

    ERIC Educational Resources Information Center

    Coen, Michael; Nicol, David

    2007-01-01

    Information and communications technologies are radically changing the way that teaching and learning activities are organised and delivered within higher education (HE) institutions. A wide range of technologies is being deployed in quite complex and interactive ways, including virtual learning environments (VLEs), mobile communication…

  8. Curve identification for high friction surface treatment (HFST) installation recommendation : final report.

    DOT National Transportation Integrated Search

    2016-09-01

    The objectives of this study are to develop and deploy a means for cost-effectively extracting curve information using the widely available GPS and GIS data to support high friction surface treatment (HFST) installation recommendations (i.e., start a...
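A common way to extract curve information from GPS traces is the three-point circumradius: fit a circle through consecutive projected fixes and flag segments whose radius falls below a threshold. This is a hedged sketch of that generic technique, not necessarily the study's method.

```python
import math

def circumradius(p1, p2, p3):
    """Radius of the circle through three planar points (e.g. projected
    GPS fixes). A large radius indicates a nearly straight segment; a
    small radius indicates a sharp curve that may warrant HFST."""
    a = math.dist(p2, p3)
    b = math.dist(p1, p3)
    c = math.dist(p1, p2)
    s = (a + b + c) / 2.0
    # Heron's formula; clamp at zero to guard against rounding error.
    area = math.sqrt(max(s * (s - a) * (s - b) * (s - c), 0.0))
    if area == 0.0:
        return float("inf")  # collinear points: straight road
    return (a * b * c) / (4.0 * area)
```

Sliding this window along a GPS polyline yields a radius profile from which curve start and end points can be delimited.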

  9. Assessing Resource Assessment for MRE (Invited)

    NASA Astrophysics Data System (ADS)

    Hanson, H. P.; Bozec, A.; Duerr, A. S.; Rauchenstein, L. T.

    2010-12-01

    The Southeast National Marine Renewable Energy Center at Florida Atlantic University is concerned with marine renewable energy (MRE) recovery from the Florida Current using marine hydrokinetic technology and, in the future, from the thermocline in the Florida Straits via ocean thermal energy conversion. Although neither concept is new, technology improvements and the evolution of policy now warrant optimism for the future of these potentially rich resources. In moving toward commercial-scale deployments of energy-generating systems, an important first step is accurate and unembellished assessment of the resource itself. In short, we must ask: how much energy might be available? The answer to this deceptively simple question depends, of course, on the technology itself - system efficiencies, for example - but it also depends on a variety of other limiting factors such as deployment strategies, environmental considerations, and the overall economics of MRE in the context of competing energy resources. While it is universally agreed that MRE development must occur within a framework of environmental stewardship, it is nonetheless inevitable that there will be trade-offs between absolute environmental protection and realizing the benefits of MRE implementation. As with solar-energy and wind-power technologies, MRE technologies interact with the environment in which they are deployed. Ecological, societal, and even physical resource concerns all require investigation and, in some cases, mitigation. Moreover, the converse - how will the environment affect the equipment? - presents technical challenges that have confounded the seagoing community forever. Biofouling, for example, will affect system efficiency and create significant maintenance and operations issues. Because this will also affect the economics of MRE, nonlinear interactions among the limiting factors complicate the overall issue of resource assessment significantly. 
While MRE technology development is largely an engineering task, resource assessment falls more to the oceanography community. Current and temperature structure measurements, for example, are critical for these efforts. Once again, however, the picture is complicated by the nature of the endeavor: deploying complex equipment on scales of tens of meters into a medium that is traditionally measured on scales of tens of kilometers implies a scale mismatch that must be overcome. The challenge, then, is to develop assessments of the resource on larger scales - so that the potential of the resource may be understood - while characterizing it on very small scales to be able to understand how equipment will be affected. Meeting this challenge will require both funding and time, but it will also result in new oceanographic insight and understanding.

  10. LSST (Hoop/Column) Maypole Antenna Development Program, phase 1, part 1

    NASA Technical Reports Server (NTRS)

    Sullivan, M. R.

    1982-01-01

    The first of a two-phase program was performed to develop the technology necessary to evaluate, design, manufacture, package, transport, and deploy the hoop/column deployable antenna reflector by means of a ground-based program. The hoop/column concept consists of a cable-stiffened large-diameter hoop and central column structure that supports and contours a radio-frequency reflective mesh surface. Mission scenarios for communications, radiometry, and radio astronomy were studied. Data were provided to establish the technology drivers that resulted in the specification of a point design. The point design is a multiple-beam quad-aperture offset antenna system which provides four separate offset areas of illumination on a 100-meter-diameter symmetrical parent reflector. The periphery of the reflector is a hoop having 48 segments that articulate into a small stowed volume around a center extendable column. The hoop and column are structurally connected by graphite and quartz cables. The prominence of cables in the design resulted in the development of advanced cable technology. Design verification models were built of the hoop, column, and surface stowage subassemblies. Model designs were generated for a half-scale sector of the surface and a 1/6 scale of the complete deployable reflector.

  11. Floating Offshore Wind in California: Gross Potential for Jobs and Economic Impacts from Two Future Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Speer, Bethany; Keyser, David; Tegen, Suzanne

    Construction of the first offshore wind farm in the United States began in 2015, using fixed platform structures that are appropriate for shallow seafloors, like those located off of the East Coast and mid-Atlantic. However, floating platforms, which have yet to be deployed commercially, will likely need to anchor to the deeper seafloor if deployed off of the West Coast. To analyze the employment and economic potential for floating offshore wind along the West Coast, the Bureau of Ocean Energy Management (BOEM) has commissioned the National Renewable Energy Laboratory (NREL) to analyze two hypothetical, large-scale deployment scenarios for California: 16 GW of offshore wind by 2050 (Scenario A) and 10 GW of offshore wind by 2050 (Scenario B). The results of this analysis can be used to better understand the general scales of economic opportunities that could result from offshore wind development. Results show total state gross domestic product (GDP) impacts of $16.2 billion in Scenario B or $39.7 billion in Scenario A for construction; and $3.5 billion in Scenario B or $7.9 billion in Scenario A for the operations phases.

  12. A methodology towards virtualisation-based high performance simulation platform supporting multidisciplinary design of complex products

    NASA Astrophysics Data System (ADS)

    Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin

    2012-08-01

    Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise an HPS platform. This research is driven by issues concerning large-scale simulation resource deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of a virtualisation-based simulation platform (VSIM) is first proposed. The article then investigates and discusses key approaches in VSIM, including simulation resource modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction, and (3) achieve fault-tolerant simulation.

  13. Deployment of ERP Systems at Automotive Industries, Security Inspection (Case Study: IRAN KHODRO Automotive Company)

    NASA Astrophysics Data System (ADS)

    Ali, Hatamirad; Hasan, Mehrjerdi

    The automotive industry and its car production process are among the most complex and large-scale production processes. Today, information technology (IT) and ERP systems support a large portion of these production processes; without an integrated system such as ERP, the production and supply chain processes become tangled. ERP systems, the latest generation of MRP systems, simplify the production and sales processes of these industries and have been a major factor in their development. Many large-scale companies today are developing and deploying ERP systems, which facilitate many organizational processes and increase efficiency. Security is a very important part of an organization's ERP strategy: because ERP systems are integrated and extensive, security is more critical for them than for local legacy systems, and disregarding this point can play a decisive role in the success or failure of such systems. IRANKHODRO is the biggest automotive factory in the Middle East, with an annual production of over 600,000 cars. This paper presents the ERP security deployment experience at the IRANKHODRO Company, which recently moved a big step toward further development by launching ERP systems.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobler, Jeremy; Zaccheo, T. Scott; Blume, Nathan

    This report describes the development and testing of a novel system, the Greenhouse gas Laser Imaging Tomography Experiment (GreenLITE), for Monitoring, Reporting and Verification (MRV) of CO2 at Geological Carbon Storage (GCS) sites. The system consists of a pair of laser-based transceivers, a number of retroreflectors, and a set of cloud-based data processing, storage and dissemination tools, which enable 2-D mapping of the CO2 in near real time. A system was built, tested locally in New Haven, Indiana, and then deployed to the Zero Emissions Research and Technology (ZERT) facility in Bozeman, MT. Testing at ZERT demonstrated the ability of the GreenLITE system to identify and map small underground leaks, in the presence of other biological sources and with widely varying background concentrations. The system was then ruggedized and tested at the Harris test site in New Haven, IN, during winter time while exposed to temperatures as low as -15 °C. Additional testing was conducted using simulated concentration enhancements to validate the 2-D retrieval accuracy. This test resulted in high confidence in the ability of the reconstruction to identify sources to tens of meters resolution in this configuration. Finally, the system was deployed for a period of approximately 6 months to an active industrial site, the Illinois Basin – Decatur Project (IBDP), where >1M metric tons of CO2 had been injected into an underground sandstone basin. The main objective of this final deployment was to demonstrate autonomous operation over a wide range of environmental conditions with very little human interaction, and to demonstrate the feasibility of the system for long-term deployment in a GCS environment.

  15. Wheeling and Banking Strategies for Optimal Renewable Energy Deployment. International Experiences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heeter, Jenny; Vora, Ravi; Mathur, Shivani

    This paper defines the principles of wheeling (i.e., transmission) tariffs and renewable energy (RE) banking provisions and their role in RE deployment in countries with plans for large-scale RE. It reviews experiences to date in the United States, Mexico, and India and discusses key policy and regulatory considerations for devising more effective wheeling and/or banking provisions for countries with ambitious RE deployment targets. The paper addresses the challenges of competing needs of stakeholders, especially those of RE generators, distribution utilities, and transmission network owners and operators. The importance of wheeling and banking and their effectiveness for the financial viability of RE deployment is also explored. This paper aims to benefit policymakers and regulators as well as key renewable energy stakeholders. Key lessons for regulators include: creating long-term wheeling and banking policy certainty, considering incentivizing RE through discounted transmission access, and assessing the cost implications of such discounts, as well as expanding access to renewable energy customers.

  16. Deployment and Evaluation of the Helicopter In-Flight Tracking System (HITS)

    NASA Technical Reports Server (NTRS)

    Daskalakis, Anastasios; Martone, Patrick

    2004-01-01

    The Gulf of Mexico airspace has two major operating regions: low-altitude offshore (below 7,000 ft) and high-altitude oceanic (above 18,000 ft). Both regions suffer significant inefficiencies due to the lack of continuous surveillance during Instrument Flight Rules operations. Provision of surveillance in the offshore region is hindered by its low-altitude nature, which makes coverage by conventional radars economically infeasible. Significant portions of the oceanic sectors are inaccessible to shore-based sensors, as they are beyond line-of-sight. Two emerging surveillance technologies that are relatively low cost and can be deployed on offshore platforms were assessed: Wide Area Multilateration and Automatic Dependent Surveillance-Broadcast. Performance criteria were formulated using existing FAA specifications. Three configurations, representative of systems serving full-size and reduced-size domestic terminal areas and an en-route/oceanic region, were developed and deployed. These configurations were evaluated during nine flight test periods using fixed- and rotary-wing aircraft.

  17. Advanced Opto-Electronics (LIDAR and Microsensor Development)

    NASA Technical Reports Server (NTRS)

    Vanderbilt, Vern C. (Technical Monitor); Spangler, Lee H.

    2005-01-01

    Our overall intent in this aspect of the project was to establish a collaborative effort between several departments at Montana State University to develop optoelectronic technology that advances the state of the art in optical remote sensing of the environment. Our particular focus was on the development of small systems that can eventually be used in a wide variety of applications, including ground, air, and space deployments, possibly in sensor networks. Specific objectives were to: 1) build a field-deployable direct-detection lidar system for use in measurements of clouds, aerosols, fish, and vegetation; and 2) develop a breadboard prototype water vapor differential absorption lidar (DIAL) system based on highly stable, tunable diode laser technology developed previously at MSU. We accomplished both primary objectives of this project, developing a field-deployable direct-detection lidar and a breadboard prototype of a water vapor DIAL system. This paper summarizes each of these accomplishments.

  18. Enterprise Deployment Through PulseRider To Treat Anterior Communicating Artery Aneurysm Recurrence.

    PubMed

    Valente, Iacopo; Limbucci, Nicola; Nappini, Sergio; Rosi, Andrea; Laiso, Antonio; Mangiafico, Salvatore

    2018-02-01

    PulseRider (Pulsar Vascular, Los Gatos, California, USA) is a new endovascular device designed to treat wide-neck bifurcation intracranial aneurysms. Deployment of a stent through a PulseRider to treat an aneurysm's recurrence has never been described before. We report the case of a 55-year-old man who underwent coiling of an 8-mm anterior communicating artery aneurysm with assistance of a PulseRider neck reconstruction device. The 6-month digital subtraction angiography control showed aneurysm recurrence, so we deployed an Enterprise 2 closed-cell stent (Codman, Miami Lakes, Florida, USA) in the A1-A2 segment passing across the previously implanted PulseRider. Enterprise correctly expanded and allowed for adequate coiling of the aneurysm. An Enterprise stent can be safely opened through a PulseRider in order to treat aneurysm recurrence. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Injury risk to restrained children exposed to deployed first- and second-generation air bags in frontal crashes.

    PubMed

    Arbogast, Kristy B; Durbin, Dennis R; Kallan, Michael J; Elliott, Michael R; Winston, Flaura K

    2005-04-01

    To estimate the risk of serious nonfatal injuries in frontal crashes among belted children seated in the right front seat of vehicles in which second-generation passenger air bags deployed compared with that of belted children seated in the right front seat of vehicles in which first-generation passenger air bags deployed. We enrolled a probability sample of 1781 seat belt-restrained occupants aged 3 through 15 years seated in the right front seat, exposed to deployed passenger air bags in frontal crashes involving insured vehicles in 3 large US regions, between December 1, 1998, and November 30, 2002. A telephone interview was conducted with the driver of the vehicle using a previously validated instrument. The study sample was weighted according to each subject's probability of selection, with analyses conducted on the weighted sample. The main outcome measure was the risk of serious injury (injuries with an Abbreviated Injury Scale score of ≥2, and facial lacerations). The risk of serious injury for restrained children in the right front seat exposed to deployed second-generation passenger air bags was 9.9%, compared with 14.9% for similar children exposed to deployed first-generation passenger air bags (adjusted odds ratio, 0.59; 95% confidence interval, 0.36-0.97). This study provides evidence based on field data that the risk of injury to children exposed to deploying second-generation passenger air bags is reduced compared with earlier designs.
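For readers unfamiliar with the statistic, the relationship between the two reported risks and an odds ratio can be shown in a few lines. Note that the paper's 0.59 is adjusted for covariates on the weighted sample, so the crude value computed directly from the raw risks differs.

```python
def odds_ratio(risk_exposed, risk_reference):
    """Crude (unadjusted) odds ratio from two risk proportions."""
    odds_exposed = risk_exposed / (1.0 - risk_exposed)
    odds_reference = risk_reference / (1.0 - risk_reference)
    return odds_exposed / odds_reference

# Second- vs. first-generation air bags, using the risks reported above.
crude_or = odds_ratio(0.099, 0.149)  # crude value; the paper's 0.59 is adjusted
```

A crude odds ratio below 1 points in the same direction as the study's adjusted estimate: lower injury odds with second-generation air bags.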

  20. System Level Aerothermal Testing for the Adaptive Deployable Entry and Placement Technology (ADEPT)

    NASA Technical Reports Server (NTRS)

    Cassell, Alan; Gorbunov, Sergey; Yount, Bryan; Prabhu, Dinesh; de Jong, Maxim; Boghozian, Tane; Hui, Frank; Chen, Y.-K.; Kruger, Carl; Poteet, Carl

    2016-01-01

    The Adaptive Deployable Entry and Placement Technology (ADEPT), a mechanically deployable entry vehicle technology, has been under development at NASA since 2011. As part of the technical maturation of ADEPT, designs capable of delivering small payloads (10 kg) are being considered to rapidly mature sub-1-m deployed diameter designs. The unique capability of ADEPT for small payloads comes from its ability to stow within a slender volume and deploy to achieve a mass-efficient drag surface with a high heat rate capability. The low ballistic coefficient results in entry heating and mechanical loads that can be met by a revolutionary three-dimensionally woven carbon fabric supported by a deployable skeleton structure. This carbon fabric has test-proven capability as both primary structure and payload thermal protection system. In order to rapidly advance ADEPT's technical maturation, the project is developing test methods that enable thermostructural design requirement verification of ADEPT designs at the system level using ground test facilities. Results from these tests are also relevant to larger-class missions and help us define areas of focused component-level testing in order to mature material and thermal response design codes. The ability to ground test sub-1-m diameter ADEPT configurations at or near full scale provides significant value to the rapid maturation of this class of deployable entry vehicles. This paper will summarize arc jet test results, highlight design challenges, provide a summary of lessons learned and discuss future test approaches based upon this methodology.

  1. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.

  2. Distributed Large Data-Object Environments: End-to-End Performance Analysis of High Speed Distributed Storage Systems in Wide Area ATM Networks

    NASA Technical Reports Server (NTRS)

    Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary

    1996-01-01

    We have developed and deployed a distributed-parallel storage system (DPSS) in several high speed asynchronous transfer mode (ATM) wide area networks (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but is fairly unique in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, which is a large scale, high speed, ATM network and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high speed distributed applications. Finally, the DPSS is part of an overall architecture for using high speed, WAN's for enabling the routine, location independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.
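The abstract describes the DPSS as a network striped disk array in which applications are free to determine the data layout. The default placement logic of such a system can be sketched as a mapping from logical block to server; the function name and the round-robin default are illustrative assumptions, not the DPSS API.

```python
def block_location(block_index, n_servers, layout=None):
    """Map a logical block to (server, local offset) in a network
    striped storage array.

    By default, blocks are striped round-robin across servers; an
    application may instead supply its own `layout` callable, mirroring
    the DPSS design point that data layout is application-determined.
    """
    if layout is not None:
        return layout(block_index)
    return block_index % n_servers, block_index // n_servers
```

Striping lets independent servers serve adjacent blocks in parallel, which is what allows aggregate throughput to approach the capacity of a high-speed ATM WAN rather than that of a single disk.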

  3. Flexible session management in a distributed environment

    NASA Astrophysics Data System (ADS)

    Miller, Zach; Bradley, Dan; Tannenbaum, Todd; Sfiligoi, Igor

    2010-04-01

    Many secure communication libraries used by distributed systems, such as SSL, TLS, and Kerberos, fail to make a clear distinction between the authentication, session, and communication layers. In this paper we introduce CEDAR, the secure communication library used by the Condor High Throughput Computing software, and present the advantages to a distributed computing system resulting from CEDAR's separation of these layers. Regardless of the authentication method used, CEDAR establishes a secure session key, which has the flexibility to be used for multiple capabilities. We demonstrate how a layered approach to security sessions can avoid round-trips and latency inherent in network authentication. The creation of a distinct session management layer allows for optimizations to improve scalability by way of delegating sessions to other components in the system. This session delegation creates a chain of trust that reduces the overhead of establishing secure connections and enables centralized enforcement of system-wide security policies. Additionally, secure channels based upon UDP datagrams are often overlooked by existing libraries; we show how CEDAR's structure accommodates this as well. As an example of the utility of this work, we show how the use of delegated security sessions and other techniques inherent in CEDAR's architecture enables US CMS to meet their scalability requirements in deploying Condor over large-scale, wide-area grid systems.
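The core idea of a distinct session layer, authenticate once, then reuse or delegate the established session key so later connections skip the authentication round-trips, can be sketched as a small cache. This is an illustration in the spirit of CEDAR; the class, method names, and TTL policy are assumptions, not CEDAR's actual API.

```python
import time

class SessionCache:
    """Sketch of a layered session cache: one authentication per peer,
    after which the session key is reused (or handed to another
    component) without further network round-trips."""

    def __init__(self, ttl=3600):
        self.ttl = ttl
        self._sessions = {}  # peer -> (session_key, expiry_time)

    def get_key(self, peer, authenticate):
        entry = self._sessions.get(peer)
        if entry and entry[1] > time.time():
            return entry[0]            # cache hit: no round-trips
        key = authenticate(peer)       # expensive: full authentication
        self._sessions[peer] = (key, time.time() + self.ttl)
        return key

    def delegate(self, peer):
        """Hand an existing session to another component (e.g. a worker)
        so it can talk to `peer` without re-authenticating, forming the
        chain of trust described above."""
        return self._sessions.get(peer)
```

Centralized expiry (the TTL here) is also the natural hook for system-wide security policy enforcement, since every component's sessions flow through one layer.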

  4. Detectors for Tomorrow's Instruments

    NASA Technical Reports Server (NTRS)

    Moseley, Harvey

    2009-01-01

    Cryogenically cooled superconducting detectors have become essential tools for a wide range of measurement applications, ranging from quantum limited heterodyne detection in the millimeter range to direct searches for dark matter with superconducting phonon detectors operating at 20 mK. Superconducting detectors have several fundamental and practical advantages which have resulted in their rapid adoption by experimenters. Their excellent performance arises in part from reductions in noise resulting from their low operating temperatures, but unique superconducting properties provide a wide range of mechanisms for detection. For example, the steep dependence of resistance with temperature on the superconductor/normal transition provides a sensitive thermometer for calorimetric and bolometric applications. Parametric changes in the properties of superconducting resonators provides a mechanism for high sensitivity detection of submillimeter photons. From a practical point of view, the use of superconducting detectors has grown rapidly because many of these devices couple well to SQUID amplifiers, which are easily integrated with the detectors. These SQUID-based amplifiers and multiplexers have matured with the detectors; they are convenient to use, and have excellent noise performance. The first generation of fully integrated large scale superconducting detection systems are now being deployed. I will discuss the prospects for a new generation of instruments designed to take full advantage of the revolution in detector technology.

  5. A Review of the Environmental Impacts for Marine and Hydrokinetic Projects to Inform Regulatory Permitting: Summary Findings from the 2015 Workshop on Marine and Hydrokinetic Technologies, Washington, D.C.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baring-Gould, E. Ian; Christol, Corrie; LiVecchi, Al

    In 2014 and 2015, the U.S. Department of Energy initiated efforts to develop and implement technology- and application-focused marine and hydrokinetic (MHK) workshops to share the global experience and knowledge base on evolving MHK technologies, observed and not-observed impacts, monitoring and measurement methods, and regulatory needs. The resulting MHK Regulator Workshops engaged resource managers and other decision makers at key regulatory organizations, scientists, researchers, facilitators, and technical experts and provided an opportunity to examine the risks of single-device and small-scale deployments, explore what can be learned and observed from single devices and small-scale arrays, and consider requirements for projects at varying scales of deployment. Experts and stakeholders identified key remaining information gaps. Initial discussions focused on differentiating between monitoring required for single or small-scale deployments and MHK impact research that, although important, goes beyond what is feasible or should be needed to meet specific project regulatory requirements but is appropriate for broader research and development. Four areas of identified potential environmental impacts provided the focus for the workshop: acoustic output impacts, electromagnetic field (EMF) emissions, physical interactions, and environmental effects of MHK energy development on the physical environment. Discussions also focused on the regulatory process and experience, adaptive management, industry drivers, and lessons that can be learned from the wind energy industry. The discussion was set in the context of the types of MHK technologies that are currently proposed or planned in the United States. All presentations and the following discussions are summarized in this document.

  6. Base Station Placement Algorithm for Large-Scale LTE Heterogeneous Networks.

    PubMed

    Lee, Seungseob; Lee, SuKyoung; Kim, Kyungsoo; Kim, Yoon Hyuk

    2015-01-01

Data traffic demands in cellular networks today are increasing at an exponential rate, giving rise to the development of heterogeneous networks (HetNets), in which small cells complement traditional macro cells by extending coverage to indoor areas. However, the deployment of small cells as parts of HetNets poses a key network-planning challenge for operators. In particular, massive and unplanned deployment of base stations can cause high interference, severely degrading network performance. Although different mathematical modeling and optimization methods have been used to approach various problems related to this issue, most traditional network planning models are ill-equipped to deal with HetNet-specific characteristics due to their focus on classical cellular network designs. Furthermore, increased wireless data demands have driven mobile operators to roll out large-scale networks of small long term evolution (LTE) cells. Therefore, in this paper, we aim to derive an optimum network planning algorithm for large-scale LTE HetNets. Recently, attempts have been made to apply evolutionary algorithms (EAs) to the field of radio network planning, since they are characterized as global optimization methods. Yet, EA performance often deteriorates rapidly with the growth of search space dimensionality. To overcome this limitation when designing optimum network deployments for large-scale LTE HetNets, we attempt to decompose the problem and tackle its subcomponents individually. Particularly noting that some HetNet cells have strong correlations due to inter-cell interference, we propose a correlation grouping approach in which cells are grouped together according to their mutual interference.
Both the simulation and analytical results indicate that, in terms of system throughput, the proposed solution outperforms both a random-grouping-based EA and an EA that detects interacting variables by monitoring changes in the objective function.
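The correlation-grouping idea can be illustrated with a minimal sketch. The interference values and threshold below are hypothetical; the paper's actual grouping operates inside an EA over candidate base-station placements. Cells linked (directly or transitively) by above-threshold mutual interference end up in the same subproblem group.

```python
import numpy as np

def correlation_groups(interference, threshold):
    """Group cells whose mutual interference exceeds a threshold.

    interference: symmetric (n, n) matrix of pairwise interference levels.
    Returns a sorted list of groups (lists of cell indices), found by
    union-find over cells connected through above-threshold links.
    """
    n = interference.shape[0]
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if interference[i, j] >= threshold:
                parent[find(i)] = find(j)  # merge the two groups

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# Hypothetical 4-cell example: cells 0-1 and 2-3 interfere strongly.
I = np.array([[0.0, 0.9, 0.1, 0.0],
              [0.9, 0.0, 0.0, 0.1],
              [0.1, 0.0, 0.0, 0.8],
              [0.0, 0.1, 0.8, 0.0]])
print(correlation_groups(I, 0.5))  # → [[0, 1], [2, 3]]
```

Each group can then be optimized by its own EA subpopulation, which is the decomposition the abstract refers to.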

  7. Biomimetic Models for An Ecological Approach to Massively-Deployed Sensor Networks

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Lodding, Kenneth N.; Olariu, Stephan; Wilson, Larry; Xin, Chunsheng

    2005-01-01

    Promises of ubiquitous control of the physical environment by massively-deployed wireless sensor networks open avenues for new applications that will redefine the way we live and work. Due to small size and low cost of sensor devices, visionaries promise systems enabled by deployment of massive numbers of sensors ubiquitous throughout our environment working in concert. Recent research has concentrated on developing techniques for performing relatively simple tasks with minimal energy expense, assuming some form of centralized control. Unfortunately, centralized control is not conducive to parallel activities and does not scale to massive size networks. Execution of simple tasks in sparse networks will not lead to the sophisticated applications predicted. We propose a new way of looking at massively-deployed sensor networks, motivated by lessons learned from the way biological ecosystems are organized. We demonstrate that in such a model, fully distributed data aggregation can be performed in a scalable fashion in massively deployed sensor networks, where motes operate on local information, making local decisions that are aggregated across the network to achieve globally-meaningful effects. We show that such architectures may be used to facilitate communication and synchronization in a fault-tolerant manner, while balancing workload and required energy expenditure throughout the network.
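The local-decision aggregation described above can be illustrated with a classic gossip-averaging sketch (illustrative only, not the authors' exact protocol): each mote repeatedly averages its value with a randomly chosen peer, and the global mean emerges without any central coordinator.

```python
import random

def gossip_average(readings, rounds=200, seed=0):
    """Fully distributed averaging: each round, a random pair of motes
    exchanges values and both keep the pair's mean. No node ever sees
    the whole network, yet all values converge to the global average,
    which stays invariant because each exchange preserves the sum."""
    rng = random.Random(seed)
    values = list(readings)
    n = len(values)
    for _ in range(rounds):
        i, j = rng.sample(range(n), 2)   # two distinct motes
        m = (values[i] + values[j]) / 2.0
        values[i] = values[j] = m
    return values

vals = gossip_average([10.0, 20.0, 30.0, 40.0])
print(sum(vals) / len(vals))  # the global mean 25.0 is preserved
```

The same local-exchange pattern underlies scalable max/min computation and synchronization schemes in such ecosystem-inspired architectures.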

  8. Preliminary results from DIMES: Dispersion in the ACC

    NASA Astrophysics Data System (ADS)

    Balwada, D.; Speer, K.; LaCasce, J. H.; Owens, B.

    2012-04-01

The Diapycnal and Isopycnal Mixing Experiment in the Southern Ocean (DIMES) is a CLIVAR process study designed to study mixing in the Antarctic Circumpolar Current. The experiment includes tracer release, float, and small-scale turbulence components. This presentation will report on some results of the float component, from floats deployed across the ACC in the Southeast Pacific Ocean. These are the first subsurface Lagrangian trajectories from the ACC. Floats were deployed to follow approximately a constant density surface for a period of 1-3 years. To complement the experimental results, virtual floats were advected using AVISO data, and basic statistics were derived from both deployed and virtual float trajectories. Experimental design, initial results, comparison to virtual floats, and single-particle and relative dispersion calculations will be presented.
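The relative-dispersion statistic mentioned above can be sketched minimally, assuming float positions have already been interpolated onto common times (the arrays below are illustrative, not DIMES data):

```python
import numpy as np

def relative_dispersion(x, y):
    """Mean squared pair separation as a function of time.

    x, y: (n_floats, n_times) arrays of positions (e.g. km).
    Returns a (n_times,) array: the average over all float pairs of
    the squared separation distance at each time step.
    """
    n = x.shape[0]
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    d2 = np.zeros(x.shape[1])
    for i, j in pairs:
        d2 += (x[i] - x[j]) ** 2 + (y[i] - y[j]) ** 2
    return d2 / len(pairs)

# Two floats separating linearly in x: separations 0, 2, 4 units.
x = np.array([[0.0, 1.0, 2.0],
              [0.0, -1.0, -2.0]])
y = np.zeros_like(x)
print(relative_dispersion(x, y))  # squared separations: 0, 4, 16
```

The growth rate of this curve with time distinguishes dispersion regimes (e.g. exponential versus Richardson-like) in the deployed and virtual float trajectories.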

  9. FLUIDIC: Metal Air Recharged

    ScienceCinema

    Friesen, Cody

    2018-02-14

    Fluidic, with the help of ARPA-E funding, has developed and deployed the world's first proven high cycle life metal air battery. Metal air technology, often used in smaller scale devices like hearing aids, has the lowest cost per electron of any rechargeable battery storage in existence. Deploying these batteries for grid reliability is competitive with pumped hydro installations while having the advantages of a small footprint. Fluidic's battery technology allows utilities and other end users to store intermittent energy generated from solar and wind, as well as maintain reliable electrical delivery during power outages. The batteries are manufactured in the US and currently deployed to customers in emerging markets for cell tower reliability. As they continue to add customers, they've gained experience and real world data that will soon be leveraged for US grid reliability.

  10. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    PubMed

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research: GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  11. Personality Assessment Inventory profiles of deployed combat troops: an empirical investigation of normative performance.

    PubMed

    Morey, Leslie C; Lowmaster, Sara E; Coldren, Rodney L; Kelly, Mark P; Parish, Robert V; Russell, Michael L

    2011-06-01

    The present study examined the normative scores and psychometric properties of the Personality Assessment Inventory (PAI; Morey, 1991) within a non-treatment-seeking sample of soldiers deployed to combat zones in Iraq, compared with a sample of community adults matched with respect to age and gender. Results indicate the scores and properties of the PAI scales were generally quite similar in the Iraq and community samples, with modest differences emerging on only 3 subscales addressing antisocial behavior, issues with close relationships, and interpersonal vigilance. These results suggest that standard normative interpretation of PAI scales is appropriate even when the instrument is administered in a combat zone. In comparison with prior research, the results may suggest that documented mental health issues among combat veterans, when present, may be particularly likely to emerge postdeployment. 2011 APA, all rights reserved

  12. Concern over radiation exposure and psychological distress among rescue workers following the Great East Japan Earthquake.

    PubMed

    Matsuoka, Yutaka; Nishi, Daisuke; Nakaya, Naoki; Sone, Toshimasa; Noguchi, Hiroko; Hamazaki, Kei; Hamazaki, Tomohito; Koido, Yuichi

    2012-05-15

On March 11, 2011, the Great East Japan Earthquake and tsunami that followed caused severe damage along Japan's northeastern coastline and to the Fukushima Daiichi nuclear power plant. To date, there are few reports specifically examining psychological distress in rescue workers in Japan. Moreover, it is unclear to what extent concern over radiation exposure has caused psychological distress to such workers deployed in the disaster area. One month after the disaster, 424 of 1816 (24%) disaster medical assistance team workers deployed to the disaster area were assessed. Concern over radiation exposure was evaluated by a single self-reported question. General psychological distress was assessed with the Kessler 6 scale (K6), depressive symptoms with the Center for Epidemiologic Studies Depression Scale (CES-D), fear and sense of helplessness with the Peritraumatic Distress Inventory (PDI), and posttraumatic stress symptoms with the Impact of Event Scale-Revised (IES-R). Radiation exposure was a concern for 39 (9.2%) respondents. Concern over radiation exposure was significantly associated with higher scores on the K6, CES-D, PDI, and IES-R. After controlling for age, occupation, disaster operation experience, duration of time spent watching earthquake news, and past history of psychiatric illness, these associations remained significant in men, but did not remain significant in women for the CES-D and PDI scores. The findings suggest that concern over radiation exposure was strongly associated with psychological distress. Reliable, accurate information on radiation exposure might reduce deployment-related distress in disaster rescue workers.

  13. Intranets: Just Another Bandwagon?

    ERIC Educational Resources Information Center

    Lynch, Gary

    1997-01-01

    Discusses intranets--the deployment and use of Internet technologies such as the World Wide Web, electronic mail, and Transmission Control Protocol/Internet Protocol (TCP/IP) on a closed network. Considers the "hype," benefits, standards, implementation, and problems of intranets, and concludes that while intranets can be beneficial,…

  14. War Gaming Peace Operations

    ERIC Educational Resources Information Center

    Mason, Roger; Patterson, Eric

    2013-01-01

Today's military personnel fight against and work with a diverse variety of nonstate actors, from al-Qaeda terrorists to major nongovernmental organizations that provide vital humanitarian assistance. Furthermore, the nontraditional battle spaces where America and its allies have recently deployed (Kosovo, Afghanistan, Iraq) include a wide range of…

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldsmith, John E. M.; Brennan, James S.; Brubaker, Erik

    A wide range of NSC (Neutron Scatter Camera) activities were conducted under this lifecycle plan. This document outlines the highlights of those activities, broadly characterized as system improvements, laboratory measurements, and deployments, and presents sample results in these areas. Additional information can be found in the documents that reside in WebPMIS.

  16. Evolutionary Design of an X-Band Antenna for NASA's Space Technology 5 Mission

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Hornby, Gregory S.; Rodriguez-Arroyo, Adan; Linden, Derek S.; Kraus, William F.; Seufert, Stephen E.

    2003-01-01

We present an evolved X-band antenna design and flight prototype currently on schedule to be deployed on NASA's Space Technology 5 spacecraft in 2004. The mission consists of three small satellites that will take science measurements in Earth's magnetosphere. The antenna was evolved to meet a challenging set of mission requirements, most notably the combination of wide beamwidth for a circularly-polarized wave and wide bandwidth. Two genetic algorithms were used: one allowed branching in the antenna arms and the other did not. The highest performance antennas from both algorithms were fabricated and tested. A hand-designed antenna was produced by the contractor responsible for the design and build of the mission antennas. The hand-designed antenna is a quadrifilar helix, and we present performance data for comparison to the evolved antennas. As of this writing, one of our evolved antenna prototypes is undergoing flight qualification testing. If successful, the resulting antenna would represent the first evolved hardware in space, and the first deployed evolved antenna.
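The evolutionary loop behind such designs can be sketched at its simplest. This toy (1+1) scheme over bit strings is illustrative only: the actual work evolved branching wire geometries scored by electromagnetic simulation, not bit counts.

```python
import random

def evolve(fitness, length=20, generations=600, seed=1):
    """Minimal elitist (1+1) evolutionary loop over a bit-string
    genome: mutate each bit with probability 1/length, keep the
    child whenever it is no worse than the parent."""
    rng = random.Random(seed)
    best = [rng.randint(0, 1) for _ in range(length)]
    best_fit = fitness(best)
    for _ in range(generations):
        child = [1 - b if rng.random() < 1.0 / length else b
                 for b in best]
        f = fitness(child)
        if f >= best_fit:            # elitism: never lose ground
            best, best_fit = child, f
    return best, best_fit

# Toy fitness: the count of 1-bits stands in for simulated antenna
# gain; a real run would call an electromagnetic solver here.
genome, fit = evolve(sum)
print(fit)
```

The branching-versus-non-branching distinction in the abstract corresponds to different genome encodings feeding this same outer loop.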

  17. Technical Requirements For Reactors To Be Deployed Internationally For the Global Nuclear Energy Partnership

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ingersoll, Daniel T

    2007-01-01

Technical Requirements For Reactors To Be Deployed Internationally For the Global Nuclear Energy Partnership Robert Price U.S. Department of Energy, 1000 Independence Ave, SW, Washington, DC 20585, Daniel T. Ingersoll Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, TN 37831-6162, INTRODUCTION The Global Nuclear Energy Partnership (GNEP) seeks to create an international regime to support large-scale growth in the worldwide use of nuclear energy. Fully meeting the GNEP vision may require the deployment of thousands of reactors in scores of countries, many of which do not use nuclear energy currently. Some of these needs will be met by large-scale Generation III and III+ reactors (>1000 MWe) and Generation IV reactors when they are available. However, because many developing countries have small and immature electricity grids, the currently available Generation III(+) reactors may be unsuitable since they are too large, too expensive, and too complex. Therefore, GNEP envisions new types of reactors that must be developed for international deployment that are "right sized" for the developing countries and that are based on technologies, designs, and policies focused on reducing proliferation risk. The first step in developing such systems is the generation of technical requirements that will ensure that the systems meet both the GNEP policy goals and the power needs of the recipient countries. REQUIREMENTS Reactor systems deployed internationally within the GNEP context must meet a number of requirements similar to the safety, reliability, economics, and proliferation goals established for the DOE Generation IV program. Because of the emphasis on deployment to nonnuclear developing countries, the requirements will be weighted differently than with Generation IV, especially regarding safety and non-proliferation goals.
Also, the reactors should be sized for market conditions in developing countries where energy demand per capita, institutional maturity and industrial infrastructure vary considerably, and must utilize fuel that is compatible with the fuel recycle technologies being developed by GNEP. Arrangements are already underway to establish Working Groups jointly with Japan and Russia to develop requirements for reactor systems. Additional bilateral and multilateral arrangements are expected as GNEP progresses. These Working Groups will be instrumental in establishing an international consensus on reactor system requirements. GNEP CERTIFICATION After establishing an accepted set of requirements for new reactors that are deployed internationally, a mechanism is needed that allows capable countries to continue to market their reactor technologies and services while assuring that they are compatible with GNEP goals and technologies. This will help to preserve the current system of open, commercial competition while steering the international community to meet common policy goals. The proposed vehicle to achieve this is the concept of GNEP Certification. Using objective criteria derived from the technical requirements in several key areas such as safety, security, non-proliferation, and safeguards, reactor designs could be evaluated and then certified if they meet the criteria. This certification would ensure that reactor designs meet internationally approved standards and that the designs are compatible with GNEP assured fuel services. SUMMARY New "right sized" power reactor systems will need to be developed and deployed internationally to fully achieve the GNEP vision of an expanded use of nuclear energy world-wide. The technical requirements for these systems are being developed through national and international Working Groups. 
The process is expected to culminate in a new GNEP Certification process that enables commercial competition while ensuring that the policy goals of GNEP are adequately met.

  18. Understanding Emerging Impacts and Requirements Related to Utility-Scale Solar Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartmann, Heidi M.; Grippo, Mark A.; Heath, Garvin A.

    2016-09-01

    Utility-scale solar energy plays an important role in the nation’s strategy to address climate change threats through increased deployment of renewable energy technologies, and both the federal government and individual states have established specific goals for increased solar energy development. In order to achieve these goals, much attention is paid to making utility-scale solar energy cost-competitive with other conventional energy sources, while concurrently conducting solar development in an environmentally sound manner.

  19. Is Intelligent Speed Adaptation ready for deployment?

    PubMed

    Carsten, Oliver

    2012-09-01

    There have been 30 years of research on Intelligent Speed Adaptation (ISA), the in-vehicle system that is designed to promote compliance with speed limits. Extensive trials of ISA in real-world driving have shown that ISA can significantly reduce speeding, users have been found to have generally positive attitudes and at least some sections of the public have been shown to be willing to purchase ISA systems. Yet large-scale deployment of a system that could deliver huge accident reductions is still by no means guaranteed. Copyright © 2012. Published by Elsevier Ltd.

  20. A test-bed modeling study for wave resource assessment

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Neary, V. S.; Wang, T.; Gunawan, B.; Dallman, A.

    2016-02-01

Hindcasts from phase-averaged wave models are commonly used to estimate standard statistics used in wave energy resource assessments. However, the research community and wave energy converter industry lack a well-documented and consistent modeling approach for conducting these resource assessments at different phases of WEC project development and at different spatial scales, e.g., from small-scale pilot studies to large-scale commercial deployment. Therefore, it is necessary to evaluate current wave model codes, as well as limitations and knowledge gaps for predicting sea states, in order to establish best wave modeling practices and to identify future research needs to improve wave prediction for resource assessment. This paper presents the first phase of an ongoing modeling study to address these concerns. The modeling study is being conducted at a test-bed site off the Central Oregon Coast using two of the most widely used third-generation wave models, WaveWatchIII and SWAN. A nested-grid modeling approach, with domain dimension ranging from global to regional scales, was used to provide the wave spectral boundary condition to a local-scale model domain, which has a spatial dimension of around 60 km by 60 km and a grid resolution of 250 m - 300 m. Model results simulated by WaveWatchIII and SWAN in a structured-grid framework are compared to NOAA wave buoy data for six wave parameters, including omnidirectional wave power, significant wave height, energy period, spectral width, direction of maximum directionally resolved wave power, and directionality coefficient. Model performance and computational efficiency are evaluated, and the best practices for wave resource assessments are discussed, based on a set of standard error statistics and model run times.
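The "standard error statistics" used in such model-buoy comparisons can be sketched as follows; the wave heights below are illustrative values, and the study's exact metric set may differ.

```python
import numpy as np

def error_stats(model, obs):
    """Common verification statistics for comparing hindcast wave
    parameters (e.g. significant wave height) against buoy data."""
    model = np.asarray(model, float)
    obs = np.asarray(obs, float)
    bias = np.mean(model - obs)
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    si = rmse / np.mean(obs)               # scatter index
    r = np.corrcoef(model, obs)[0, 1]      # linear correlation
    return {"bias": bias, "rmse": rmse, "scatter_index": si, "r": r}

# Hypothetical significant wave heights (m): model vs. buoy.
stats = error_stats([1.1, 2.0, 2.9, 4.2], [1.0, 2.0, 3.0, 4.0])
print(stats["rmse"])
```

Computing the same table for each of the six wave parameters, per model, is what allows the WaveWatchIII/SWAN performance comparison described above.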

  1. High Fidelity Computational Analysis of CO2 Trapping at Pore Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Vinod

    2013-07-13

With an alarming rise in carbon dioxide (CO2) emissions from anthropogenic sources, CO2 sequestration has become an attractive option for mitigating those emissions. Popular storage media for CO2 include oil reservoirs, deep coal beds, and deep ocean beds, all of which have been used for long-term CO2 storage. Because CO2 lowers the viscosity and surface tension of reservoir fluids, it is also widely used for enhanced oil recovery. The sites for CO2 sequestration or enhanced oil recovery mostly consist of porous rocks. Limited knowledge of molecular mobility under confinement and of molecule-surface interactions between CO2 and natural porous media leaves the absorption kinetics and total absorption capacity for injected fluids largely unpredictable, and therefore constitutes a barrier to the deployment of this technology. It is thus important to understand the flow dynamics of CO2 through porous microstructures at the finest (pore) scale to accurately predict the storage potential and long-term dynamics of sequestered CO2. This report discusses a pore-network flow modeling approach based on a variational method and analyzes pore-scale simulation results for an idealized network and for CT-scanned images of Berea Sandstone. The variational method provides a promising way to study kinetic behavior and storage potential at the pore scale in the presence of other phases. The current study validates variational solutions for single- and two-phase Newtonian flow and single-phase non-Newtonian flow through angular pores for special geometries whose analytical and/or empirical solutions are known. The hydraulic conductance for single-phase flow through a triangular duct was also validated against empirical results derived from lubrication theory.
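The kind of analytic benchmark mentioned at the end can be sketched for an equilateral triangular duct using the classical laminar-flow result Q = sqrt(3) a^4 Δp / (320 μ L); treat this as a textbook benchmark for validation, not the report's own derivation.

```python
import math

def conductance_equilateral(a, mu, L):
    """Hydraulic conductance g = Q / Δp of laminar flow through an
    equilateral triangular duct of side a, fluid viscosity mu, and
    length L, from the classical result
    Q = sqrt(3) * a**4 * Δp / (320 * mu * L)."""
    return math.sqrt(3) * a**4 / (320.0 * mu * L)

# Conductance scales with the fourth power of the pore size, so
# doubling the side length multiplies the flow rate by ~16, which is
# why pore-scale geometry dominates network-level predictions.
g1 = conductance_equilateral(1e-6, 1e-3, 1e-4)  # 1 µm pore
g2 = conductance_equilateral(2e-6, 1e-3, 1e-4)  # 2 µm pore
print(g2 / g1)
```

In a pore-network model, one such conductance per throat feeds a linear system for the pressure field, from which network-scale flow is computed.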

  2. Characterization of Two Ton NaI Scintillator

    NASA Astrophysics Data System (ADS)

    Maier, Alleta; Coherent Collaboration

    2017-09-01

The COHERENT collaboration is dedicated to measuring Coherent Elastic Neutrino-Nucleus Scattering (CEνNS), an interaction predicted by the standard model that ultimately sets a background floor for dark matter detection. In pursuit of observing the predicted N² scaling, COHERENT is deploying two tons of NaI[Tl] detectors to observe CEνNS recoils of sodium nuclei. Before the two tons of NaI[Tl] scintillator are deployed, however, all crystals and PMTs must be characterized to understand the individual properties vital to precision in the measurement of CEνNS. This detector is also expected to allow COHERENT to observe charged-current and CEνNS interactions with 127I. A standard operating procedure (SOP) has been developed to characterize each detector based on seven properties relevant to precision in the measurement of CEνNS: energy scale, energy resolution, low-energy light yield non-linearity, decay-time energy dependence, position variance, time variance, and background levels. Crystals will be tested and characterized for these properties in the context of a ton-scale NaI[Tl] detector. Preliminary development of the SOP has provided greater understanding of the optimization methods needed for characterization of the ton-scale detector. TUNL, NSF, Duke University.
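The approximate N² scaling can be made concrete with the leading-order weak-charge weighting of the CEνNS cross-section (a standard-model textbook expression, not COHERENT's analysis code):

```python
def coherent_weight(N, Z, sin2_theta_w=0.231):
    """Leading-order CEvNS rate weighting: the cross-section scales
    as Q_w**2 / 4, with weak charge Q_w = N - (1 - 4 sin^2(theta_W)) Z.
    Since 1 - 4 sin^2(theta_W) is small, the proton term nearly
    vanishes, giving the approximate N**2 scaling."""
    return (N - (1 - 4 * sin2_theta_w) * Z) ** 2 / 4

# Sodium (N=12, Z=11) vs iodine (N=74, Z=53): the heavy nucleus
# dominates the coherent rate per atom, close to the naive
# (74/12)**2 ≈ 38 prediction.
w_na = coherent_weight(12, 11)
w_i = coherent_weight(74, 53)
print(w_i / w_na)
```

This is why a NaI[Tl] detector probes both a light (Na) and a heavy (I) target in one material, spanning a wide lever arm in N².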

  3. New Markets for Solar Photovoltaic Power Systems

    NASA Astrophysics Data System (ADS)

    Thomas, Chacko; Jennings, Philip; Singh, Dilawar

    2007-10-01

    Over the past five years solar photovoltaic (PV) power supply systems have matured and are now being deployed on a much larger scale. The traditional small-scale remote area power supply systems are still important and village electrification is also a large and growing market but large scale, grid-connected systems and building integrated systems are now being deployed in many countries. This growth has been aided by imaginative government policies in several countries and the overall result is a growth rate of over 40% per annum in the sales of PV systems. Optimistic forecasts are being made about the future of PV power as a major source of sustainable energy. Plans are now being formulated by the IEA for very large-scale PV installations of more than 100 MW peak output. The Australian Government has announced a subsidy for a large solar photovoltaic power station of 154 MW in Victoria, based on the concentrator technology developed in Australia. In Western Australia a proposal has been submitted to the State Government for a 2 MW photovoltaic power system to provide fringe of grid support at Perenjori. This paper outlines the technologies, designs, management and policies that underpin these exciting developments in solar PV power.

  4. Standing wave tube electro active polymer wave energy converter

    NASA Astrophysics Data System (ADS)

    Jean, Philippe; Wattez, Ambroise; Ardoise, Guillaume; Melis, C.; Van Kessel, R.; Fourmon, A.; Barrabino, E.; Heemskerk, J.; Queau, J. P.

    2012-04-01

Over the past 4 years SBM has developed a revolutionary Wave Energy Converter (WEC): the S3. Floating under the ocean surface, the S3 amplifies pressure waves similarly to a Rubens tube. Made entirely of elastomers, the system is flexible, environmentally friendly, and silent. Thanks to a multimodal resonant behavior, the S3 is capable of efficiently harvesting wave energy from a wide range of wave periods, naturally smoothing the irregularities of ocean wave amplitudes and periods. In the S3 system, Electro Active Polymer (EAP) generators are distributed along an elastomeric tube over several wavelengths; they convert wave-induced deformations directly into electricity. The output is high-voltage multiphase direct current with low ripple. Unlike other conventional WECs, the S3 requires no maintenance of moving parts. The conception and operating principle will eventually lead to a reduction of both CAPEX and OPEX. By integrating EAP generators into a small-scale S3, SBM achieved a world first: direct conversion of wave energy into electricity with a moored, flexible, submerged EAP WEC in a wave tank test. Through an extensive testing program on large-scale EAP generators, SBM identified challenges in scaling up to a utility-grid device. The French Government supports the consortium consisting of SBM, IFREMER and ECN in their efforts to deploy a full-scale prototype at the SEMREV test center in France in the 2014-2015 timeframe. SBM will be seeking strategic as well as financial partners to unleash the true potential of the S3 Standing Wave Tube Electro Active Polymer WEC.

  5. Note: Design and development of wireless controlled aerosol sampling network for large scale aerosol dispersion experiments.

    PubMed

    Gopalakrishnan, V; Subramanian, V; Baskaran, R; Venkatraman, B

    2015-07-01

A wireless, custom-built aerosol sampling network was designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems were used in a field measurement campaign in which sodium aerosol dispersion experiments were conducted as part of environmental impact studies related to a sodium-cooled fast reactor. The sampling network contains 40 aerosol sampling units, each with a custom-built sampling head and wireless control networking designed with a Programmable System on Chip (PSoC™) and Xbee Pro RF modules. The base station control is designed using the graphical programming language LabView. The sampling network is programmed to operate at a preset time, and the running status of the samplers in the network can be visualized from the base station. The system is designed so that it can be used for any other environmental sampling system deployed over a wide area and uneven terrain, where manual operation is difficult due to the requirement of simultaneous operation and status logging.

  6. Small unmanned aircraft systems for remote sensing and Earth science research

    NASA Astrophysics Data System (ADS)

    Hugenholtz, Chris H.; Moorman, Brian J.; Riddell, Kevin; Whitehead, Ken

    2012-06-01

    To understand and predict Earth-surface dynamics, scientists often rely on access to the latest remote sensing data. Over the past several decades, considerable progress has been made in the development of specialized Earth observation sensors for measuring a wide range of processes and features. Comparatively little progress has been made, however, in the development of new platforms upon which these sensors can be deployed. Conventional platforms are still almost exclusively restricted to piloted aircraft and satellites. For many Earth science research questions and applications these platforms do not yet have the resolution or operational flexibility to provide answers affordably. The most effective remote sensing data match the spatiotemporal scale of the process or feature of interest. An emerging technology comprising unmanned aircraft systems (UAS), also known as unmanned aerial vehicles (UAV), is poised to offer a viable alternative to conventional platforms for acquiring high-resolution remote sensing data with increased operational flexibility, lower cost, and greater versatility (Figure 1).

  7. Characterizing User Groups in Online Social Networks

    NASA Astrophysics Data System (ADS)

    Gyarmati, László; Trinh, Tuan Anh

The users’ role is crucial in the development, deployment and success of online social networks (OSNs). Despite this fact, little is known and even less has been published about user activities in operating OSNs. In this paper, we present a large-scale measurement analysis of user behaviour, in terms of time spent online, in some popular OSNs, namely Bebo, Flixster, MySpace, and Skyrock, and characterise user groups in OSNs. We used more than 200 PlanetLab [1] nodes for our measurement and monitored more than 3000 users for three weeks by repeatedly downloading their profile pages; more than 100 million pages were processed in total. The main findings of the paper are the following. Firstly, we create a measurement framework in order to observe user activity. Secondly, we present cumulative usage statistics of the different OSNs. Thirdly, we classify the monitored users into different groups and characterise the common properties of the members. Finally, we illustrate the wide applicability of our datasets by predicting the sign-out method of the OSN users.
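The user-grouping step can be sketched with illustrative thresholds; the paper's actual group boundaries are not given here, so the cutoffs and user names below are hypothetical.

```python
def classify_users(minutes_per_day):
    """Partition monitored users into activity groups by average
    daily time online (threshold values are illustrative only)."""
    groups = {"light": [], "regular": [], "heavy": []}
    for user, minutes in minutes_per_day.items():
        if minutes < 10:
            groups["light"].append(user)
        elif minutes < 60:
            groups["regular"].append(user)
        else:
            groups["heavy"].append(user)
    return groups

# Hypothetical averages derived from repeated profile-page polling.
sample = {"u1": 3, "u2": 45, "u3": 120, "u4": 8}
print(classify_users(sample))
```

Group-level statistics (e.g. session length distributions per group) then characterize the common properties of each group's members.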

  8. The newest findings on Red Lake (Dinaric karst of Croatia)

    NASA Astrophysics Data System (ADS)

    Andrić, Ivo; Jukić, Branimir

    2014-05-01

Red Lake in the Dinaric karst (Croatia) is one of the deepest karst lakes in the world. Even so, throughout the history of Red Lake research there have been many controversies in the conclusions and theories concerning its genesis, geomorphology, and hydrology. The goal of this work is to present the newest research results, obtained with the help of emerging technologies based on LiDAR and SoNAR methods. The measurements took place during September 2013. A new generation of equipment developed to advance geoscientific research was deployed during the field work, and the gathered data enabled analyses that led to a new understanding of the lake's morphology. Some of the results confirmed already known and well-documented features of Red Lake, whereas others disputed assumptions widely accepted in the scientific community and among the general public. This paper also lays the groundwork for further research in the field of karst hydrology and offers new insight on local and regional scales.

  9. Field measurement of penetrator seismic coupling in sediments and volcanic rocks

    NASA Technical Reports Server (NTRS)

    Nakamura, Y.; Latham, G. V.; Frohlich, C.

    1979-01-01

    Field experiments were conducted to determine experimentally how well a seismometer installed using a penetrator would be coupled to the ground. A dry lake bed and a lava bed were chosen as test sites to represent geological environments of two widely different material properties. At each site, two half-scale penetrators were fired into the ground, a three-component geophone assembly was mounted to the aft end of each penetrator, and dummy penetrators were fired at various distances to generate seismic signals. The recorded signals were digitized, and cross-spectral analyses were performed to compare the observed signals in terms of power spectral density ratio, coherence and phase difference. The analyses indicate that seismometers deployed by penetrators will be as well coupled to the ground as are seismometers installed by conventional methods for the frequency range of interest in earthquake seismology, although some minor differences were observed at frequencies near the upper limit of the frequency band.
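    The cross-spectral comparison described above can be sketched with standard signal-processing tools. The sketch below uses synthetic records and illustrative parameters (the sampling rate, noise level, and segment length are all assumptions, not the experiment's actual values):

```python
import numpy as np
from scipy.signal import coherence, csd, welch

fs = 100.0                      # sampling rate in Hz (illustrative)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)

# Two synthetic "seismometer" records: a shared ground signal plus
# independent instrument noise, standing in for the penetrator-mounted
# and conventionally installed sensors.
ground = rng.standard_normal(t.size)
rec_a = ground + 0.1 * rng.standard_normal(t.size)
rec_b = ground + 0.1 * rng.standard_normal(t.size)

f, psd_a = welch(rec_a, fs=fs, nperseg=1024)
_, psd_b = welch(rec_b, fs=fs, nperseg=1024)
psd_ratio = psd_a / psd_b       # power spectral density ratio

_, coh = coherence(rec_a, rec_b, fs=fs, nperseg=1024)  # 0..1 per frequency
_, pxy = csd(rec_a, rec_b, fs=fs, nperseg=1024)
phase = np.angle(pxy)           # phase difference per frequency

print(coh.mean())  # near 1 when the two installations are well coupled
```

    A mean magnitude-squared coherence near 1 across the band of interest is the signature of good coupling; deviations concentrated near the upper band edge would mirror the minor differences the authors report.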

  10. The Soil Moisture Active and Passive Mission (SMAP): Science and Applications

    NASA Technical Reports Server (NTRS)

    Entekhabi, Dara; O'Neill, Peggy; Njoku, Eni

    2009-01-01

    The Soil Moisture Active and Passive mission (SMAP) will provide global maps of soil moisture content and surface freeze/thaw state. Global measurements of these variables are critical for terrestrial water and carbon cycle applications. The SMAP observatory consists of two multipolarization L-band sensors, a radar and radiometer, that share a deployable-mesh reflector antenna. The combined observations from the two sensors will allow accurate estimation of soil moisture at hydrometeorological (10 km) and hydroclimatological (40 km) spatial scales. The rotating antenna configuration provides conical scans of the Earth surface at a constant look angle. The wide-swath (1000 km) measurements will allow global mapping of soil moisture and its freeze/thaw state with 2-3 days revisit. Freeze/thaw in boreal latitudes will be mapped using the radar at 3 km resolution with 1-2 days revisit. The synergy of active and passive observations enables measurements of soil moisture and freeze/thaw state with unprecedented resolution, sensitivity, area coverage and revisit.

  11. Virtual Network Configuration Management System for Data Center Operations and Management

    NASA Astrophysics Data System (ADS)

    Okita, Hideki; Yoshizawa, Masahiro; Uehara, Keitaro; Mizuno, Kazuhiko; Tarui, Toshiaki; Naono, Ken

Virtualization technologies are widely deployed in data centers to improve system utilization. However, they increase the workload for operators, who have to manage the structure of the virtual networks in data centers. We provide a virtual-network management system that automates the integration of virtual-network configurations. The proposed system collects configurations from server virtualization platforms and VLAN-supported switches, and integrates them according to a newly developed XML-based management information model for virtual-network configurations. Preliminary evaluations show that the proposed system helps operators by reducing, by about 40 percent, the time needed to acquire configurations from devices and to correct inconsistencies in the operators' configuration management database. Further, the evaluations show that the proposed system has excellent scalability: it takes less than 20 minutes to acquire the virtual-network configurations from a large-scale network that includes 300 virtual machines. These results imply that the proposed system is effective for improving the configuration management process for virtual networks in data centers.
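    As an illustration of the kind of integration such a system performs, the sketch below parses two hypothetical configuration documents (the element and attribute names are invented for illustration; the paper's actual XML model is not reproduced here) and flags VLAN assignments that disagree between the hypervisor and the switch:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified stand-in for an XML-based management
# information model: one document per source, listing VLANs per VM.
server_xml = """<config source="hypervisor">
  <vm name="vm01"><vlan id="100"/></vm>
  <vm name="vm02"><vlan id="200"/></vm>
</config>"""

switch_xml = """<config source="switch">
  <vm name="vm01"><vlan id="100"/></vm>
  <vm name="vm02"><vlan id="300"/></vm>
</config>"""

def vlan_map(doc: str) -> dict:
    """Map each VM name to its set of VLAN IDs."""
    root = ET.fromstring(doc)
    return {vm.get("name"): {v.get("id") for v in vm.findall("vlan")}
            for vm in root.findall("vm")}

server, switch = vlan_map(server_xml), vlan_map(switch_xml)

# Flag VMs whose VLAN assignment differs between the two sources --
# the kind of inconsistency a configuration management database check
# would surface for correction.
inconsistent = [vm for vm in server if server[vm] != switch.get(vm, set())]
print(inconsistent)  # ['vm02'] (VLAN 200 vs 300)
```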

  12. High-performance flat-panel solar thermoelectric generators with high thermal concentration

    NASA Astrophysics Data System (ADS)

    Kraemer, Daniel; Poudel, Bed; Feng, Hsien-Ping; Caylor, J. Christopher; Yu, Bo; Yan, Xiao; Ma, Yi; Wang, Xiaowei; Wang, Dezhi; Muto, Andrew; McEnaney, Kenneth; Chiesa, Matteo; Ren, Zhifeng; Chen, Gang

    2011-07-01

    The conversion of sunlight into electricity has been dominated by photovoltaic and solar thermal power generation. Photovoltaic cells are deployed widely, mostly as flat panels, whereas solar thermal electricity generation relying on optical concentrators and mechanical heat engines is only seen in large-scale power plants. Here we demonstrate a promising flat-panel solar thermal to electric power conversion technology based on the Seebeck effect and high thermal concentration, thus enabling wider applications. The developed solar thermoelectric generators (STEGs) achieved a peak efficiency of 4.6% under AM1.5G (1 kW m-2) conditions. The efficiency is 7-8 times higher than the previously reported best value for a flat-panel STEG, and is enabled by the use of high-performance nanostructured thermoelectric materials and spectrally-selective solar absorbers in an innovative design that exploits high thermal concentration in an evacuated environment. Our work opens up a promising new approach which has the potential to achieve cost-effective conversion of solar energy into electricity.

  13. Incorporating variability in simulations of seasonally forced phenology using integral projection models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodsman, Devin W.; Aukema, Brian H.; McDowell, Nate G.

Phenology models are becoming increasingly important tools for accurately predicting how climate change will affect the life histories of organisms. We propose a class of integral projection phenology models derived from stochastic individual-based models of insect development and demography. Our derivation, which is based on the rate-summation concept, produces integral projection models that capture the effect of phenotypic rate variability on insect phenology but are typically more computationally frugal than equivalent individual-based phenology models. We demonstrate our approach using a temperature-dependent model of the demography of the mountain pine beetle (Dendroctonus ponderosae Hopkins), an insect that kills mature pine trees. This work illustrates how a wide range of stochastic phenology models can be reformulated as integral projection models. Due to their computational efficiency, these integral projection models are suitable for deployment in large-scale simulations, such as studies of altered pest distributions under climate change.
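    The rate-summation concept the abstract names can be illustrated with a toy model: daily development rates (here a simple linear degree-day rule with invented parameters, not the published beetle model) accumulate until development reaches 1:

```python
import numpy as np

def development_rate(temp_c: np.ndarray) -> np.ndarray:
    """Toy linear degree-day rate: zero below a 5 deg C threshold.
    The threshold and the 400 degree-day requirement are illustrative."""
    return np.maximum(temp_c - 5.0, 0.0) / 400.0  # fraction of stage per day

def days_to_emergence(daily_temps: np.ndarray) -> int:
    """Rate summation: accumulate daily rates until development reaches 1."""
    cum = np.cumsum(development_rate(daily_temps))
    idx = np.searchsorted(cum, 1.0)  # first day on which cum >= 1
    return int(idx) + 1              # 1-based day of completion

warm = np.full(365, 20.0)   # constant 20 deg C
cool = np.full(365, 12.5)   # constant 12.5 deg C
print(days_to_emergence(warm), days_to_emergence(cool))  # 27 54
```

    Integral projection models, by contrast, propagate a whole distribution of development states per time step rather than simulating individuals one by one, which is where the computational savings come from.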

  14. Note: Design and development of wireless controlled aerosol sampling network for large scale aerosol dispersion experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gopalakrishnan, V.; Subramanian, V.; Baskaran, R.

    2015-07-15

A wireless-based, custom-built aerosol sampling network has been designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems were used in a field measurement campaign in which sodium aerosol dispersion experiments were conducted as part of environmental impact studies related to a sodium-cooled fast reactor. The sampling network contains 40 aerosol sampling units, each with a custom-built sampling head and wireless control networking designed with a Programmable System on Chip (PSoC™) and Xbee Pro RF modules. The base station control is designed using the graphical programming language LabVIEW. The sampling network is programmed to operate at a preset time, and the running status of the samplers in the network is visualized from the base station. The system is developed in such a way that it can be used for any other environmental sampling system deployed over a wide area and uneven terrain, where manual operation is difficult due to the requirement of simultaneous operation and status logging.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, G.N.; Ride, S.K.; Townsend, J.S.

It is widely believed that an arms control limit on nuclear-armed sea-launched cruise missiles would be nearly impossible to verify. Among the reasons usually given are: these weapons are small, built in nondistinctive industrial facilities, deployed on a variety of ships and submarines, and difficult to distinguish from their conventionally armed counterparts. In this article, it is argued that the covert production and deployment of nuclear-armed sea-launched cruise missiles would not be so straightforward. A specific arms control proposal is described, namely a total ban on nuclear-armed sea-launched cruise missiles. This proposal is used to illustrate how an effective verification scheme might be constructed. 9 refs., 6 figs.

  16. Dispelling myths about verification of sea-launched cruise missiles.

    PubMed

    Lewis, G N; Ride, S K; Townsend, J S

    1989-11-10

It is widely believed that an arms control limit on nuclear-armed sea-launched cruise missiles would be nearly impossible to verify. Among the reasons usually given are: these weapons are small, built in nondistinctive industrial facilities, deployed on a variety of ships and submarines, and difficult to distinguish from their conventionally armed counterparts. In this article, it is argued that the covert production and deployment of nuclear-armed sea-launched cruise missiles would not be so straightforward. A specific arms control proposal is described, namely a total ban on nuclear-armed sea-launched cruise missiles. This proposal is used to illustrate how an effective verification scheme might be constructed.

  17. Preparedness Evaluation of French Military Orthopedic Surgeons Before Deployment.

    PubMed

    Choufani, Camille; Barbier, Olivier; Mayet, Aurélie; Rigal, Sylvain; Mathieu, Laurent

    2018-06-13

A deployed military orthopedic surgeon is a trauma surgeon working in austere conditions. The first aim of this study was to analyze the current activity of French military orthopedic surgeons in the field and to identify how the combat zone differs from their daily practice. The second aim was to assess the adequacy of the preparedness they received before deployment and to identify additional needs that could be addressed in future training. An evaluation survey was sent to all French military orthopedic surgeons deployed in theaters of operations between 2004 and 2014. A 10-point visual analogue scale was used to evaluate their surgical activity abroad and prior training. A total of 55 surgeons, with a median of 7 deployments, were included in this study after they answered the survey. Debridement and external fixation were the most common orthopedic procedures. The practice of general surgery mostly concerned vascular and abdominal injuries as part of damage control procedures. Median scores were seven for surgical preparedness, five for physical readiness, and three for mental preparedness. There was a significant inverse relationship between the number of missions performed and the evaluation of surgical preparedness. The higher surgeons perceived their mental preparedness, the better they estimated their surgical preparedness. In the French Army, deployed orthopedic surgeons perform general surgical activity. Their initial training must be adapted to this constraint and enhanced by continuing medical education.

  18. First Results from UAS Deployed Ocean Sensor Systems during the 2013 MIZOPEX Campaign

    NASA Astrophysics Data System (ADS)

    Palo, S. E.; Weibel, D.; Lawrence, D.; LoDolce, G.; Bradley, A. C.; Adler, J.; Maslanik, J. A.; Walker, G.

    2013-12-01

The Marginal Ice Zone Observations and Processes Experiment (MIZOPEX) is an Arctic field campaign that occurred during summer 2013. The goals of the project are to understand how warming of the marginal ice zone affects sea ice melt and whether this warming has been over- or underestimated by satellite measurements. Achieving these goals requires calibrated physical measurements, both remote and in-situ, of the marginal ice zone over scales of square kilometers with a resolution of square meters. This was accomplished with a suite of unmanned aerial vehicles (UAVs) equipped with both remote sensing and in-situ instruments, air-deployed microbuoys, and ship-deployed buoys. In this talk we will present details about the air-deployed microbuoys (ADMB) and self-deployed surface sondes (SDSS) developed at the University of Colorado. Both the ADMB and SDSS share a common measurement suite with the capability to measure water temperature at three distinct depths and provide position information via GPS. The ADMB is 90 grams, 1.3 inches in diameter and 4.25 inches long, and is designed for deployment from the InSitu ScanEagle platform. The designed and experimentally verified operational lifetime is 10 days; however, this can be extended with additional batteries. While the ADMB are deployed from the ScanEagle, the SDSS are vectorable and can be remotely and precisely positioned. Lab performance results, calibration results and initial results from the ADMB and SDSS deployed during the MIZOPEX mission will be presented. These results include day-in-the-life tests, antenna pattern analysis, range tests, temperature measurement accuracy and initial scientific results from the campaign.

  19. Advantages and limitations of remotely operated sea floor drill rigs

    NASA Astrophysics Data System (ADS)

    Freudenthal, T.; Smith, D. J.; Wefer, G.

    2009-04-01

A variety of research targets in marine sciences, including the investigation of gas hydrates, slope stability, alteration of oceanic crust, ore formation and palaeoclimate, can be addressed by shallow drilling. However, drill ships are mostly used for deep drilling, both because the effort of building up a drill string from a drill ship to the deep sea floor is tremendous and because control of drill bit pressure from a moving platform and a vibrating drill string is poor, especially in the upper hundred meters. During the last decade a variety of remotely operated drill rigs have been developed that are deployed on the sea bed and operated from standard research vessels. These developments include the BMS (Benthic Multicoring System, developed by Williamson and Associates, operated by the Japanese Mining Agency), the PROD (Portable Remotely Operated Drill, developed and operated by Benthic Geotech), the Rockdrill 2 (developed and operated by the British Geological Survey) and the MeBo (German abbreviation for sea floor drill rig, developed and operated by MARUM, University of Bremen). These drill rigs reach drilling depths between 15 and 100 m. For shallow drilling, remotely operated drill rigs are a cost-effective alternative to the services of drill ships and have the major advantage that drilling operations are performed from a stable platform, independent of any ship movements due to waves, wind or currents. Sea floor drill rigs can be deployed both in shallow waters and in the deep sea. A careful site survey is required before deploying a sea floor drill rig: slope gradient, small-scale topography and soil strength are important factors when planning the deployment. The choice of drill bits and core catchers depends on the expected geology. The required drill tools are stored in one or two magazines on the drill rig. The MeBo is the only remotely operated drill rig worldwide that can use the wireline coring technique. This method is much faster than conventional drilling. It has the advantage that the drill string stays in the drilled hole during the entire drilling process, preventing the hole from collapsing while the inner core barrels containing the drilled core sections are hauled up inside the drill string on a wire.

  20. Multiyear ice transport and small scale sea ice deformation near the Alaska coast measured by air-deployable Ice Trackers

    NASA Astrophysics Data System (ADS)

    Mahoney, A. R.; Kasper, J.; Winsor, P.

    2015-12-01

    Highly complex patterns of ice motion and deformation were captured by fifteen satellite-telemetered GPS buoys (known as Ice Trackers) deployed near Barrow, Alaska, in spring 2015. Two pentagonal clusters of buoys were deployed on pack ice by helicopter in the Beaufort Sea between 20 and 80 km offshore. During deployment, ice motion in the study region was effectively zero, but two days later the buoys captured a rapid transport event in which multiyear ice from the Beaufort Sea was flushed into the Chukchi Sea. During this event, westward ice motion began in the Chukchi Sea and propagated eastward. This created new openings in the ice and led to rapid elongation of the clusters as the westernmost buoys accelerated away from their neighbors to the east. The buoys tracked ice velocities of over 1.5 ms-1, with fastest motion occurring closest to the coast indicating strong current shear. Three days later, ice motion reversed and the two clusters became intermingled, rendering divergence calculations based on the area enclosed by clusters invalid. The data show no detectable difference in velocity between first year and multiyear ice floes, but Lagrangian timeseries of SAR imagery centered on each buoy show that first year ice underwent significant small-scale deformation during the event. The five remaining buoys were deployed by local residents on prominent ridges embedded in the landfast ice within 16 km of Barrow in order to track the fate of such features after they detached from the coast. Break-up of the landfast ice took place over a period of several days and, although the buoys each initially followed a similar eastward trajectory around Point Barrow into the Beaufort Sea, they rapidly dispersed over an area more than 50 km across. 
With rapid environmental and socio-economic change in the Arctic, understanding the complexity of nearshore ice motion is increasingly important for predicting future changes in the ice and for tracking ice-related hazards and contaminants entrained in the ice. This work demonstrates the ability of low-cost, easily deployable Ice Trackers to generate data of both scientific and operational value.
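    The area-based divergence estimate mentioned above can be sketched with the shoelace formula. The buoy positions below are hypothetical, and, as the abstract notes, the estimate breaks down once clusters intermingle and the polygon self-intersects:

```python
import math

def polygon_area(points):
    """Shoelace formula for the area enclosed by an ordered ring of
    (x, y) buoy positions, e.g. in a local planar projection."""
    n = len(points)
    s = sum(points[i][0] * points[(i + 1) % n][1]
            - points[(i + 1) % n][0] * points[i][1] for i in range(n))
    return abs(s) / 2.0

# Hypothetical pentagonal cluster of buoys, before and after an
# elongation event that stretches the cluster east-west.
before = [(math.cos(2 * math.pi * k / 5), math.sin(2 * math.pi * k / 5))
          for k in range(5)]
after = [(3 * x, y) for x, y in before]

# Mean divergence over the interval scales as (1/A) dA/dt;
# here we just look at the area ratio.
print(polygon_area(after) / polygon_area(before))  # ~3: area grows with the stretch
```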

  1. The Impact of CCS Readiness on the Evolution of China's Electric Power Sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dahowski, Robert T.; Davidson, Casie L.; Yu, Sha

In this study, GCAM-China is exercised to examine the impact of CCS availability on the projected evolution of China's electric power sector under the Paris Increased Ambition policy scenario developed by Fawcett et al. based on the Intended Nationally Determined Contributions (INDCs) submitted under the COP-21 Paris Agreement. This policy scenario provides a backdrop for understanding China's electric generation mix over the coming century under several CCS availability scenarios: CCS is fully available for commercial-scale deployment by 2025; by 2050; by 2075; and CCS is unavailable for use in meeting the modelled mitigation targets through 2100. Without CCS available, the Chinese electric power sector turns to significant use of nuclear, wind, and solar to meet growing demands and emissions targets, at a cost. Should large-scale CCS deployment be delayed in China by 25 years, the modeled per-ton cost of climate change mitigation is projected to be roughly $420/tC (2010 US dollars) by 2050, relative to $360/tC in the case in which CCS is available to deploy by 2025, a 16% increase. Once CCS is available for commercial use, mitigation costs for the two cases converge, equilibrating by 2085. However, should CCS be entirely unavailable to deploy in China, the mitigation cost spread, compared to the 2025 case, doubles by 2075 ($580/tC and $1130/tC respectively), and triples by 2100 ($1050/tC vs. $3200/tC). However, while delays in CCS availability may have short-term impacts on China's overall per-ton cost of meeting the emissions reduction target evaluated here, as well as total mitigation costs, the carbon price is likely to approach the price path associated with the full CCS availability case within a decade of CCS deployment. Having CCS available before the end of the century, even under the delays examined here, could reduce the total amount of nuclear and renewable energy that must be deployed, reducing the overall cost of meeting the emissions mitigation targets.
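    A quick arithmetic spot-check of the cost figures quoted above (all dollar values taken from the abstract):

```python
# Per-ton mitigation costs quoted in the abstract (2010 US$/tC).
cost_2050 = {"ccs_2025": 360, "ccs_2050": 420}
cost_2075 = {"ccs_2025": 580, "no_ccs": 1130}
cost_2100 = {"ccs_2025": 1050, "no_ccs": 3200}

increase_2050 = cost_2050["ccs_2050"] / cost_2050["ccs_2025"] - 1
spread_2075 = cost_2075["no_ccs"] / cost_2075["ccs_2025"]
spread_2100 = cost_2100["no_ccs"] / cost_2100["ccs_2025"]

print(round(increase_2050 * 100))  # 17 (the abstract rounds this to "a 16% increase")
print(round(spread_2075, 2), round(spread_2100, 2))  # 1.95 3.05 -> roughly double and triple
```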

  2. Desalination

    EPA Science Inventory

    To cope with the rising demand for fresh water, desalination of brackish groundwater and seawater is increasingly being viewed as a pragmatic option for augmenting fresh water supplies. The large scale deployment of desalination is likely to demonstrably increase electricity use,...

  3. Distributed Adaptive Binary Quantization for Fast Nearest Neighbor Search.

    PubMed

    Xianglong Liu; Zhujin Li; Cheng Deng; Dacheng Tao

    2017-11-01

Hashing has proved to be an attractive technique for fast nearest neighbor search over big data. Compared with projection-based hashing methods, prototype-based ones are better able to generate discriminative binary codes for data with complex intrinsic structure. However, existing prototype-based methods, such as spherical hashing and K-means hashing, still suffer from ineffective coding that uses the complete set of binary codes in a hypercube. To address this problem, we propose an adaptive binary quantization (ABQ) method that learns a discriminative hash function with prototypes associated with small, unique binary codes. Our alternating optimization adaptively discovers the prototype set and a code set of varying size in an efficient way, and together they robustly approximate the data relations. Our method can be naturally generalized to the product space for long hash codes, and enjoys training time linear in the number of training data. We further devise a distributed framework for large-scale learning, which can significantly speed up the training of ABQ in the distributed environments that are now widely deployed. Extensive experiments on four large-scale (up to 80 million) data sets demonstrate that our method significantly outperforms state-of-the-art hashing methods, with relative performance gains of up to 58.84%.
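    A heavily simplified sketch of prototype-based binary quantization in the spirit of ABQ (not the paper's algorithm; the data, prototypes, and codes are toy choices): each point is quantized to the short binary code of its nearest prototype, and lookup proceeds by Hamming distance over codes:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data and a handful of prototypes, each assigned a small unique
# binary code -- a stand-in for the learned prototype/code sets.
data = rng.standard_normal((1000, 8))
prototypes = data[rng.choice(len(data), 4, replace=False)]
codes = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # one code per prototype

def encode(x):
    """Quantize each point to the code of its nearest prototype."""
    d = ((x[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    return codes[d.argmin(1)]

db_codes = encode(data)
query = data[:1]
q_code = encode(query)[0]

# Hamming-distance lookup: candidates are points whose code matches
# the query's bucket; a real system would then re-rank candidates.
hamming = (db_codes != q_code).sum(1)
candidates = np.flatnonzero(hamming == 0)
print(len(candidates) > 0)  # True: the query always lands in its own bucket
```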

  4. Large-scale deployment of seed treatments has driven rapid increase in use of neonicotinoid insecticides and preemptive pest management in US field crops.

    PubMed

    Douglas, Margaret R; Tooker, John F

    2015-04-21

    Neonicotinoids are the most widely used class of insecticides worldwide, but patterns of their use in the U.S. are poorly documented, constraining attempts to understand their role in pest management and potential nontarget effects. We synthesized publicly available data to estimate and interpret trends in neonicotinoid use since their introduction in 1994, with a special focus on seed treatments, a major use not captured by the national pesticide-use survey. Neonicotinoid use increased rapidly between 2003 and 2011, as seed-applied products were introduced in field crops, marking an unprecedented shift toward large-scale, preemptive insecticide use: 34-44% of soybeans and 79-100% of maize hectares were treated in 2011. This finding contradicts recent analyses, which concluded that insecticides are used today on fewer maize hectares than a decade or two ago. If current trends continue, neonicotinoid use will increase further through application to more hectares of soybean and other crop species and escalation of per-seed rates. Alternatively, our results, and other recent analyses, suggest that carefully targeted efforts could considerably reduce neonicotinoid use in field crops without yield declines or economic harm to farmers, reducing the potential for pest resistance, nontarget pest outbreaks, environmental contamination, and harm to wildlife, including pollinator species.

  5. Scaling up discovery of hidden diversity in fungi: impacts of barcoding approaches.

    PubMed

    Yahr, Rebecca; Schoch, Conrad L; Dentinger, Bryn T M

    2016-09-05

The fungal kingdom is a hyperdiverse group of multicellular eukaryotes with profound impacts on human society and ecosystem function. The challenge of documenting and describing fungal diversity is exacerbated by their typically cryptic nature, their ability to produce seemingly unrelated morphologies from a single individual and their similarity in appearance to distantly related taxa. This multiplicity of hurdles resulted in the early adoption of DNA-based comparisons to study fungal diversity, including linking curated DNA sequence data to expertly identified voucher specimens. DNA-barcoding approaches in fungi were first applied in specimen-based studies for identification and discovery of taxonomic diversity, but are now widely deployed for community characterization based on sequencing of environmental samples. Collectively, fungal barcoding approaches have yielded important advances across biological scales and research applications, from taxonomic, ecological, industrial and health perspectives. A major outstanding issue is the growing problem of 'sequences without names' that are somewhat uncoupled from the traditional framework of fungal classification based on morphology and preserved specimens. This review summarizes some of the most significant impacts of fungal barcoding, its limitations, and progress towards the challenge of effective utilization of the exponentially growing volume of data gathered from high-throughput sequencing technologies. This article is part of the themed issue 'From DNA barcodes to biomes'. © 2016 The Authors.

  6. BRepertoire: a user-friendly web server for analysing antibody repertoire data.

    PubMed

    Margreitter, Christian; Lu, Hui-Chun; Townsend, Catherine; Stewart, Alexander; Dunn-Walters, Deborah K; Fraternali, Franca

    2018-04-14

Antibody repertoire analysis by high throughput sequencing is now widely used, but a persisting challenge is enabling immunologists to explore their data to discover discriminating repertoire features for their own particular investigations. Computational methods are necessary for large-scale evaluation of antibody properties. We have developed BRepertoire, a suite of user-friendly web-based software tools for large-scale statistical analyses of repertoire data. The software is able to use data preprocessed by IMGT, and performs statistical and comparative analyses with versatile plotting options. BRepertoire has been designed to operate in various modes, for example analysing sequence-specific V(D)J gene usage, discerning physico-chemical properties of the CDR regions and clustering of clonotypes. Those analyses are performed on the fly by a number of R packages and are deployed via a Shiny web platform. The user can download the analysed data in different table formats and save the generated plots as image files ready for publication. We believe BRepertoire to be a versatile analytical tool that complements experimental studies of immune repertoires. To illustrate the server's functionality, we show use cases including differential gene usage in a vaccination dataset and analysis of CDR3H properties in old and young individuals. The server is accessible under http://mabra.biomed.kcl.ac.uk/BRepertoire.

  7. Small Cages with Insect Couples Provide a Simple Method for a Preliminary Assessment of Mating Disruption

    PubMed Central

    Briand, Françoise; Guerin, Patrick M.; Charmillot, Pierre-Joseph; Kehrli, Patrik

    2012-01-01

Mating disruption by sex pheromones is a sustainable, effective and widely used pest management scheme. A drawback of this technique is that its effectiveness is challenging to assess in the field (e.g., spatial scale, pest density). The aim of this work was to facilitate the evaluation of field-deployed pheromone dispensers. We tested the suitability of small insect field cages for a pre-evaluation of the impact of sex pheromones on mating using the grape moths Eupoecilia ambiguella and Lobesia botrana, two major pests in vineyards. Cages consisted of a cubic metal frame of 35 cm sides, which was covered with a mosquito net of 1500 μm mesh size. Cages were installed in the centre of pheromone-treated and untreated vineyards. In several trials, 1 to 20 couples of grape moths per cage were released for one to three nights. The proportion of mated females was 15 to 70% lower in pheromone-treated than in untreated vineyards. Overall, the exposure of eight couples for one night was adequate for comparing different control schemes. Small cages may therefore provide a fast and cheap method to compare the effectiveness of pheromone dispensers under standardised semi-field conditions and may help predict the value of setting-up large-scale field trials. PMID:22645483

  8. Symphony: A Framework for Accurate and Holistic WSN Simulation

    PubMed Central

    Riliskis, Laurynas; Osipov, Evgeny

    2015-01-01

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles. PMID:25723144
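    The clock emulator with skew models mentioned above can be illustrated with a minimal constant-skew clock. This is an assumption-laden sketch of the general idea, not Symphony's actual API:

```python
class SkewedClock:
    """Minimal constant-skew clock model of the kind a WSN simulator's
    clock emulator might offer (illustrative; names are invented)."""

    def __init__(self, skew_ppm: float, offset: float = 0.0):
        self.skew = skew_ppm * 1e-6   # fractional frequency error
        self.offset = offset          # initial offset in seconds

    def local_time(self, true_time: float) -> float:
        """Node-local time as a function of the simulator's true time."""
        return self.offset + true_time * (1.0 + self.skew)

node = SkewedClock(skew_ppm=40.0)          # e.g. a 40 ppm crystal error
drift = node.local_time(3600.0) - 3600.0   # drift after one simulated hour
print(drift)  # ~0.144 s of accumulated drift
```

    Plugging per-node skew (and, in richer models, random jitter or temperature-dependent drift) into timestamps is what lets a simulator reproduce the timing boundary effects real hardware exhibits.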

  9. Wavelength-Selective Photovoltaics for Power-generating Greenhouses

    NASA Astrophysics Data System (ADS)

    Carter, Sue; Loik, Michael; Shugar, David; Corrado, Carley; Wade, Catherine; Alers, Glenn

    2014-03-01

While photovoltaic (PV) technologies are being developed that have the potential to meet a cost target of $0.50/W per module, the cost of installation combined with competition over land resources could curtail the wide-scale deployment needed to generate the terawatts required to meet the world's electricity demands. To be cost effective, such large-scale power generation will almost certainly require PV solar farms to be installed in agricultural and desert areas, thereby competing with food production, crops for biofuels, or the biodiversity of desert ecosystems. This requirement has put the PV community at odds with both the environmental and agricultural groups it would hope to support through the reduction of greenhouse gas emissions. A possible solution to this challenge is the use of wavelength-selective solar collectors, based on luminescent solar concentrators, that transmit the wavelengths needed for plant growth while absorbing the remaining portions of the solar spectrum and converting them to power. Costs are reduced through simultaneous use of land for both food and power production, by replacing PV cells with inexpensive, long-lived luminescent materials as the solar absorber, and by integrating the panels directly into existing greenhouses or cold frames. Results on power generation and crop yields from year-long trials conducted at academic and commercial greenhouse growers in California will be presented.

  10. Persistent Identifiers for Field Deployments: A Missing Link in the Provenance Chain

    NASA Astrophysics Data System (ADS)

    Arko, R. A.; Ji, P.; Fils, D.; Shepherd, A.; Chandler, C. L.; Lehnert, K.

    2016-12-01

    Research in the geosciences is characterized by a wide range of complex and costly field deployments including oceanographic cruises, submersible dives, drilling expeditions, seismic networks, geodetic campaigns, moored arrays, aircraft flights, and satellite missions. Each deployment typically produces a mix of sensor and sample data, spanning a period from hours to decades, that ultimately yields a long tail of post-field products and publications. Publishing persistent, citable identifiers for field deployments will facilitate 1) preservation and reuse of the original field data, 2) reproducibility of the resulting publications, and 3) recognition for both the facilities that operate the platforms and the investigators who secure funding for the experiments. In the ocean domain, sharing unique identifiers for field deployments is a familiar practice. For example, the Biological and Chemical Oceanography Data Management Office (BCO-DMO) routinely links datasets to cruise identifiers published by the Rolling Deck to Repository (R2R) program. In recent years, facilities have started to publish formal/persistent identifiers, typically Digital Object Identifiers (DOIs), for field deployments including seismic networks, oceanographic cruises, and moored arrays. For example, the EarthChem Library (ECL) publishes a DOI for each dataset which, if it derived from an oceanographic research cruise on a US vessel, is linked to a DOI for the cruise published by R2R. Work is underway to create similar links for the IODP JOIDES Resolution Science Operator (JRSO) and the Continental Scientific Drilling Coordination Office (CSDCO). 
We present results and lessons learned including a draft schema for publishing field deployments as DataCite DOI records; current practice for linking these DOIs with related identifiers such as Open Researcher and Contributor IDs (ORCIDs), Open Funder Registry (OFR) codes, and International Geo Sample Numbers (IGSNs); and consideration of other identifier types for field deployments such as UUIDs and Handles.
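A deployment record of the kind described above might be assembled as follows. This is a sketch only: field names follow the public DataCite 4.x kernel rather than the draft schema the abstract mentions, and every identifier, name, and URL below is a hypothetical placeholder:

```python
import json

def deployment_record(doi, url, platform, start, end, creators, related=()):
    """Build a minimal DataCite-style metadata payload for a field
    deployment. `related` holds (identifier, identifierType, relationType)
    triples linking to ORCIDs, IGSNs, dataset DOIs, etc."""
    return {
        "data": {
            "type": "dois",
            "attributes": {
                "doi": doi,
                "url": url,
                "creators": [{"name": c} for c in creators],
                "titles": [{"title": f"{platform} deployment {start}/{end}"}],
                "publicationYear": int(end[:4]),
                "types": {"resourceTypeGeneral": "Other",
                          "resourceType": "Field deployment"},
                "dates": [{"date": f"{start}/{end}",
                           "dateType": "Collected"}],
                "relatedIdentifiers": [
                    {"relatedIdentifier": rid,
                     "relatedIdentifierType": rtype,
                     "relationType": rel}
                    for rid, rtype, rel in related
                ],
            },
        }
    }

# Hypothetical cruise linked to a dataset it produced.
record = deployment_record(
    "10.1234/example-cruise", "https://example.org/cruise/XX1234",
    "R/V Example", "2016-06-01", "2016-06-21", ["Doe, Jane"],
    related=[("10.1234/example-dataset", "DOI", "IsSourceOf")])
payload = json.dumps(record, indent=2)
```

The `relatedIdentifiers` array is what carries the provenance links between cruise DOI, dataset DOI, ORCIDs, and IGSNs discussed above.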

  11. Scientific Cluster Deployment and Recovery - Using puppet to simplify cluster management

    NASA Astrophysics Data System (ADS)

    Hendrix, Val; Benjamin, Doug; Yao, Yushu

    2012-12-01

    Deployment, maintenance and recovery of a scientific cluster, which has complex, specialized services, can be a time-consuming task requiring the assistance of Linux system administrators and network engineers as well as domain experts. Universities and small institutions that have only a part-time FTE, with limited time for and knowledge of the administration of such clusters, can be strained by such maintenance tasks. This work is the result of an effort to maintain a data analysis cluster (DAC) with minimal effort by a local system administrator. The realized benefit is that the scientist, who is also the local system administrator, is able to focus on the data analysis instead of the intricacies of managing a cluster. Our work provides a cluster deployment and recovery process (CDRP) based on the Puppet configuration engine, allowing a part-time FTE to easily deploy and recover entire clusters with minimal effort. Puppet is a configuration management system (CMS) widely used in computing centers for the automatic management of resources. Domain experts use Puppet's declarative language to define reusable modules for service configuration and deployment. Our CDRP has three actors: domain experts, a cluster designer and a cluster manager. The domain experts first write the Puppet modules for the cluster services. A cluster designer then defines a cluster: this includes creating cluster roles, mapping the services to those roles and determining the relationships between the services. Finally, a cluster manager acquires the resources (machines, networking), enters the cluster input parameters (hostnames, IP addresses) and automatically generates the deployment scripts that Puppet uses to configure each machine to act in its designated role. In the event of a machine failure, the originally generated deployment scripts, together with Puppet, can be used to easily reconfigure a new machine.
The cluster definition produced in our CDRP is an integral part of automating cluster deployment in a cloud environment. Our future cloud efforts will further build on this work.
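The three-actor CDRP flow can be sketched as plain data transformations. The service and role names below are hypothetical, and the real process emits Puppet manifests rather than Python dictionaries:

```python
# Domain experts: reusable service modules and their dependencies
# (names invented for illustration).
SERVICES = {
    "nfs_server": {"requires": []},
    "batch_head": {"requires": ["nfs_server"]},
    "worker":     {"requires": ["batch_head"]},
}

# Cluster designer: roles mapped to the services they run.
ROLES = {
    "head": ["nfs_server", "batch_head"],
    "node": ["worker"],
}

def generate_node_configs(hosts):
    """Cluster-manager step: bind concrete machines to roles and emit the
    per-host parameters a configuration run would consume."""
    configs = []
    for hostname, ip, role in hosts:
        services = ROLES[role]
        # Collect cross-service dependencies declared by the domain experts.
        deps = sorted({d for s in services for d in SERVICES[s]["requires"]})
        configs.append({"hostname": hostname, "ip": ip, "role": role,
                        "services": services, "depends_on": deps})
    return configs

cluster = generate_node_configs([
    ("dac-head", "10.0.0.1", "head"),
    ("dac-n01", "10.0.0.2", "node"),
])
```

Because the cluster definition is pure data, rerunning `generate_node_configs` with a replacement machine's parameters mirrors the recovery path described above.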

  12. U.S. Army Operation Enduring Freedom Deployment Injury Surveillance Summary 1 January-31 December 2013

    DTIC Science & Technology

    2016-10-01

    American football (10%). [Note: These data are not shown in the figure.] Figure 6. Distribution of Leading Causes of Air-Evacuated Non-Battle Injuries. [The remainder of this excerpt is fragmented table text tallying Type 1 and Type 2 TBI counts and percentages by body region, with columns for contusion/superficial, crush, burns, nerves, unspecified, system-wide & late effects, and post-concussive categories.]

  13. A revised approach to the ULDB design

    NASA Astrophysics Data System (ADS)

    Smith, M.; Cathey, H.

    The National Aeronautics and Space Administration Balloon Program has experienced problems in the scaling up of the proposed Ultra Long Duration Balloon. Full deployment of the balloon envelope has been the issue for the larger balloons. There are a number of factors that contribute to this phenomenon. Analytical treatments of the deployment issue are currently underway. It has also been acknowledged that the current fabrication approach using foreshortening is costly, labor intensive, and requires significant handling during production, thereby increasing the chances of inducing damage to the envelope. Raven Industries has proposed a new design and fabrication approach that should increase the probability of balloon deployment, does not require foreshortening, and will reduce handling, production labor, and the final balloon cost. This paper will present a description of the logic and approach used to develop this innovation. This development consists of a serial set of steps with decision points that build upon the results of the previous steps. The first steps include limited material development and testing. This will be followed by load testing of bi-axially reinforced cylinders to determine the effect of eliminating the foreshortening. This series of tests has the goal of measuring the strain in the material as it is bi-axially loaded in a condition that closely replicates the application in the full-scale balloon. Constant-lobe-radius, pumpkin-shaped test structures will be designed and analyzed. This matrix of model tests, in conjunction with the deployment analyses, will help develop a curve that should clearly present the deployment relationship for this kind of design. This will allow the "design space" for this type of balloon to be initially determined. The materials used, analyses, and ground testing results of both cylinders and small pumpkin structures will be presented. 
Following ground testing, a series of test flights, staged in increments of increasing suspended load and balloon volume, will be conducted. The first small-scale test flight has been proposed for early spring 2004. Results of this test flight of the new design and approach will be presented. Two additional domestic test flights, from Ft. Sumner, New Mexico, and Palestine, Texas, and one circumglobal test flight from Australia are planned as part of this development. Future plans for both ground testing and test flights will also be presented.

  14. A Framework for Optimizing the Placement of Tidal Turbines

    NASA Astrophysics Data System (ADS)

    Nelson, K. S.; Roberts, J.; Jones, C.; James, S. C.

    2013-12-01

    Power generation with marine hydrokinetic (MHK) current energy converters (CECs), often in the form of underwater turbines, is receiving growing global interest. Because of reasonable investment, maintenance, reliability, and environmental friendliness, this technology can contribute to national (and global) energy markets and is worthy of research investment. Furthermore, in remote areas, small-scale MHK energy from river, tidal, or ocean currents can provide a local power supply. However, little is known about the potential environmental effects of CEC operation in coastal embayments, estuaries, or rivers, or of the cumulative impacts of these devices on aquatic ecosystems over years or decades of operation. There is an urgent need for practical, accessible tools and peer-reviewed publications to help industry and regulators evaluate environmental impacts and mitigation measures, while establishing best siting and design practices. Sandia National Laboratories (SNL) and Sea Engineering, Inc. (SEI) have investigated the potential environmental impacts and performance of individual tidal energy converters (TECs) in Cobscook Bay, ME; TECs are a subset of CECs that are specifically deployed in tidal channels. Cobscook Bay is the first deployment location of Ocean Renewable Power Company's (ORPC) TidGen™ unit. One unit is currently in place with four more to follow. Together, SNL and SEI built a coarse-grid, regional-scale model that included Cobscook Bay and all other landward embayments using the modeling platform SNL-EFDC. Within SNL-EFDC, tidal turbines are represented using a unique set of momentum extraction, turbulence generation, and turbulence dissipation equations at TEC locations. The global model was then coupled to a local-scale model centered on the proposed TEC deployment locations. An optimization framework was developed that used the refined model to determine optimal device placement locations that maximize array performance. 
Within the framework, environmental effects are considered to minimize the possibility of altering flows to an extent that would affect fish-swimming behavior and sediment-transport trends. Simulation results were compared between model runs with the optimized array configuration and the originally proposed deployment locations; the optimized array showed a 17% increase in power generation. The developed framework can provide regulators and developers with a tool for assessing environmental impacts and device-performance parameters for the deployment of MHK devices. The more thoroughly this promising technology is understood, the more likely it is to become a viable source of alternative energy.
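The placement-optimization idea can be illustrated with a toy greedy search over a gridded flow field. The power and wake models below are crude stand-ins for the SNL-EFDC momentum-extraction formulation, with all coefficients invented for illustration:

```python
def array_power(positions, flow_speed, wake_loss=0.15, spacing=2):
    """Toy array power: each device yields k*U^3, derated when another
    device sits within `spacing` cells directly upstream (crude wake)."""
    k, total = 0.5, 0.0
    for (x, y) in positions:
        u = flow_speed[(x, y)]
        if any((x - dx, y) in positions for dx in range(1, spacing + 1)):
            u *= 1.0 - wake_loss          # shadowed by an upstream device
        total += k * u ** 3
    return total

def greedy_placement(flow_speed, n_devices):
    """Greedily add the device that most increases array power -- a
    stand-in for the coupled-model optimization described above."""
    placed = set()
    for _ in range(n_devices):
        best = max((c for c in flow_speed if c not in placed),
                   key=lambda c: array_power(placed | {c}, flow_speed))
        placed.add(best)
    return placed

# 4x2 grid of cells (x along the flow direction) with a fast spot at (0, 0).
flow = {(x, y): 2.0 for x in range(4) for y in range(2)}
flow[(0, 0)] = 3.0
sites = greedy_placement(flow, 2)
```

Even this toy version reproduces the qualitative behavior above: the search prefers fast cells and avoids placing one device directly in another's wake.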

  15. A Revised Approach to the ULDB Design

    NASA Technical Reports Server (NTRS)

    Smith, Michael; Cathey, H. M., Jr.

    2004-01-01

    The National Aeronautics and Space Administration Balloon Program has experienced problems in the scaling up of the proposed Ultra Long Duration Balloon. Full deployment of the balloon envelope has been the issue for the larger balloons. There are a number of factors that contribute to this phenomenon. Analytical treatments of the deployment issue are currently underway. It has also been acknowledged that the current fabrication approach using foreshortening is costly, labor intensive, and requires significant handling during production, thereby increasing the chances of inducing damage to the envelope. Raven Industries has proposed a new design and fabrication approach that should increase the probability of balloon deployment, does not require foreshortening, and will reduce handling, production labor, and the final balloon cost. This paper will present a description of the logic and approach used to develop this innovation. This development consists of a serial set of steps with decision points that build upon the results of the previous steps. The first steps include limited material development and testing. This will be followed by load testing of bi-axially reinforced cylinders to determine the effect of eliminating the foreshortening. This series of tests has the goal of measuring the strain in the material as it is bi-axially loaded in a condition that closely replicates the application in the full-scale balloon. Constant-lobe-radius, pumpkin-shaped test structures will be designed and analyzed. This matrix of model tests, in conjunction with the deployment analyses, will help develop a curve that should clearly present the deployment relationship for this kind of design. This will allow the "design space" for this type of balloon to be initially determined. The materials used, analyses, and ground testing results of both cylinders and small pumpkin structures will be presented. 
Following ground testing, a series of test flights, staged in increments of increasing suspended load and balloon volume, will be conducted. The first small-scale test flight has been proposed for early spring 2004. Results of this test flight of the new design and approach will be presented. Two additional domestic test flights, from Ft. Sumner, New Mexico, and Palestine, Texas, and one circumglobal test flight from Australia are planned as part of this development. Future plans for both ground testing and test flights will also be presented.

  16. Hybrid cloud: bridging of private and public cloud computing

    NASA Astrophysics Data System (ADS)

    Aryotejo, Guruh; Kristiyanto, Daniel Y.; Mufadhol

    2018-05-01

    Cloud computing has quickly emerged as a promising paradigm in recent years, especially for the business sector, and Information Technology (IT) startups widely use cloud service providers to grow their businesses. However, most businesses' awareness of data security issues remains low, since some Cloud Service Providers (CSPs) are able to decrypt customer data. The Hybrid Cloud Deployment Model (HCDM) is open source in character, which makes it one of the more secure cloud computing models, so HCDM may address these data security issues. The objective of this study is to design, deploy, and evaluate an HCDM as Infrastructure as a Service (IaaS). In the implementation, the Metal as a Service (MAAS) engine was used as the base for building the actual server and node, followed by installation of the vsftpd application to serve as an FTP server. For comparison, a public cloud was accessed through its public interface. The HCDM was designed and deployed successfully and, in addition to offering good security, transferred data significantly faster than the public cloud. To the best of our knowledge, the Hybrid Cloud Deployment Model is one of the more secure cloud computing models due to its open-source character. Furthermore, this study can serve as a base for future work on the Hybrid Cloud Deployment Model, which may be relevant to solving the security issues of IT-based startup companies, especially in Indonesia.

  17. Dynamics of anchor last deployment of submersible buoy system

    NASA Astrophysics Data System (ADS)

    Zheng, Zhongqiang; Xu, Jianpeng; Huang, Peng; Wang, Lei; Yang, Xiaoguang; Chang, Zongyu

    2016-02-01

    Submersible buoy systems are widely used for oceanographic research, ocean engineering and coastal defense. Severe sea environments have pronounced effects on the dynamics of submersible buoy systems: large tensions can occur and may cause cables to snap, especially during the deployment period. This paper studies the deployment dynamics of submersible buoy systems with numerical and experimental methods. By applying the lumped-mass approach, a three-dimensional multi-body model of a submersible buoy system is developed, considering the hydrodynamic force, tension force and impact force between components of the submersible buoy system and the seabed. A numerical integration method is used to solve the differential equations. The simulation output includes the tension force, trajectory, profile, dropping location and impact force of the submersible buoys. In addition, a deployment experiment with a simplified submersible buoy model was carried out, yielding the profile and the velocities of different nodes of the submersible buoy. Comparison of the two methods shows that the numerical model simulates the actual process and conditions of the experiment well: the simulation results agree with experimental results such as the gravity anchor's landing location and the velocities of different nodes of the submersible buoy. The results will help in understanding the conditions of submersible buoy deployment, operation and recovery, and can be used to guide the design and optimization of such systems.
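The lumped-mass approach can be sketched in two dimensions: nodes carry a net wet weight and quadratic drag, segments carry tension-only elastic forces, and the equations of motion are integrated with semi-implicit Euler. All parameters below are illustrative, not taken from the paper:

```python
import math

def simulate_anchor_last(n_nodes=5, dt=0.001, steps=2000):
    """Minimal 2-D lumped-mass sketch of a mooring line sinking anchor-last.
    Node 0 is the (heavier) anchor; the line starts stretched out
    horizontally at the surface. Returns final [x, z] node positions."""
    seg_len, ea = 2.0, 5.0e4          # unstretched length, axial stiffness
    wet_weight = 50.0                 # net downward force per node (N)
    drag = 20.0                       # lumped quadratic drag coefficient
    mass = 10.0                       # lumped node mass (kg)
    pos = [[i * seg_len, 0.0] for i in range(n_nodes)]
    vel = [[0.0, 0.0] for _ in range(n_nodes)]
    for _ in range(steps):
        # Gravity minus buoyancy; the anchor is 4x heavier than cable nodes.
        forces = [[0.0, -wet_weight * (4.0 if i == 0 else 1.0)]
                  for i in range(n_nodes)]
        for i in range(n_nodes - 1):  # tension-only elastic segments
            dx = pos[i + 1][0] - pos[i][0]
            dz = pos[i + 1][1] - pos[i][1]
            length = math.hypot(dx, dz)
            if length == 0.0:
                continue
            t = max(0.0, ea * (length - seg_len) / seg_len)  # no compression
            fx, fz = t * dx / length, t * dz / length
            forces[i][0] += fx; forces[i][1] += fz
            forces[i + 1][0] -= fx; forces[i + 1][1] -= fz
        for i in range(n_nodes):      # drag + semi-implicit Euler update
            for k in (0, 1):
                f = forces[i][k] - drag * vel[i][k] * abs(vel[i][k])
                vel[i][k] += f / mass * dt
                pos[i][k] += vel[i][k] * dt
    return pos

final = simulate_anchor_last()        # 2 s of simulated sinking
```

Updating velocity before position makes the integrator symplectic, which keeps the stiff cable segments stable at this time step.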

  18. Acute Assessment of Traumatic Brain Injury and Post-Traumatic Stress After Exposure to a Deployment-Related Explosive Blast.

    PubMed

    Baker, Monty T; Moring, John C; Hale, Willie J; Mintz, Jim; Young-McCaughan, Stacey; Bryant, Richard A; Broshek, Donna K; Barth, Jeffrey T; Villarreal, Robert; Lancaster, Cynthia L; Malach, Steffany L; Lara-Ruiz, Jose M; Isler, William; Peterson, Alan L

    2018-05-18

    Traumatic brain injury (TBI) and post-traumatic stress disorder (PTSD) are two of the signature injuries in military service members who have been exposed to explosive blasts during deployments to Iraq and Afghanistan. Acute stress disorder (ASD), which occurs within 2-30 d after trauma exposure, is a more immediate psychological reaction predictive of the later development of PTSD. Most previous studies have evaluated service members after their return from deployment, which is often months or years after the initial blast exposure. The current study is the first large study to collect psychological and neuropsychological data from active duty service members within a few days after blast exposure. Recruitment for blast-injured TBI patients occurred at the Air Force Theater Hospital, 332nd Air Expeditionary Wing, Joint Base Balad, Iraq. Patients were referred from across the combat theater and evaluated as part of routine clinical assessment of psychiatric and neuropsychological symptoms after exposure to an explosive blast. Four measures of neuropsychological functioning were used: the Military Acute Concussion Evaluation (MACE); the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS); the Headminder Cognitive Stability Index (CSI); and the Automated Neuropsychological Assessment Metrics, Version 4.0 (ANAM4). Three measures of combat exposure and psychological functioning were used: the Combat Experiences Scale (CES); the PTSD Checklist-Military Version (PCL-M); and the Acute Stress Disorder Scale (ASDS). Assessments were completed by a deployed clinical psychologist, clinical social worker, or mental health technician. A total of 894 patients were evaluated. Data from 93 patients were removed from the data set for analysis because they experienced a head injury due to an event that was not an explosive blast (n = 84) or they were only assessed for psychiatric symptoms (n = 9). 
This resulted in a total of 801 blast-exposed patients for data analysis. Because data were collected in-theater for the initial purpose of clinical evaluation, sample size varied widely between measures, from 565 patients who completed the MACE to 154 who completed the CES. Bivariate correlations revealed that the majority of psychological measures were significantly correlated with each other (ps ≤ 0.01), neuropsychological measures were correlated with each other (ps ≤ 0.05), and psychological and neuropsychological measures were also correlated with each other (ps ≤ 0.05). This paper provides one of the first descriptions of psychological and neuropsychological functioning (and their inter-correlation) within days after blast exposure in a large sample of military personnel. Furthermore, this report describes the methodology used to gather data for the acute assessment of TBI, PTSD, and ASD after exposure to an explosive blast in the combat theater. Future analyses will examine the common and unique symptoms of TBI and PTSD, which will be instrumental in developing new assessment approaches and intervention strategies.

  19. Rapid, Long-term Monitoring of CO2 Concentration and δ13CO2 at CCUS Sites Allows Discrimination of Leakage Patterns from Natural Background Values

    NASA Astrophysics Data System (ADS)

    Galfond, B.; Riemer, D. D.; Swart, P. K.

    2014-12-01

    In order for Carbon Capture Utilization and Storage (CCUS) to gain wide acceptance as a method for mitigating atmospheric CO2 concentrations, schemes must be devised to ensure that potential leakage is detected. New regulations from the US Environmental Protection Agency require monitoring and accounting for Class VI injection wells, which will remain a barrier to wide scale CCUS deployment until effective and efficient monitoring techniques have been developed and proven. Monitoring near-surface CO2 at injection sites to ensure safety and operational success requires high temporal resolution CO2 concentration and carbon isotopic (δ13C) measurements. The only technologies currently capable of this rapid measurement of δ13C are optical techniques such as Cavity Ringdown Spectroscopy (CRDS). We have developed a comprehensive remote monitoring approach using CRDS and a custom manifold system to obtain accurate rapid measurements from a large sample area over an extended study period. Our modified Picarro G1101-i CRDS allows for automated rapid and continuous field measurement of δ13CO2 and concentrations of relevant gas species. At our field site, where preparations have been underway for Enhanced Oil Recovery (EOR) operations, we have been able to measure biogenic effects on a diurnal scale, as well as variation due to precipitation and seasonality. Taking these background trends into account, our statistical treatment of real data has been used to improve signal-to-noise ratios by an order of magnitude over published models. Our system has proven field readiness for the monitoring of sites with even modest CO2 fluxes.
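One simple way to discriminate a leak signature from background, in the spirit of the statistical treatment described above, is to flag samples that are simultaneously high in CO2 and isotopically light relative to a rolling background window. The 3-sigma threshold, 24-sample window, and synthetic data below are illustrative assumptions:

```python
import math

def detect_leak(co2_ppm, delta13c, window=24, nsig=3.0):
    """Flag samples whose CO2 is anomalously high AND whose d13C is
    anomalously light relative to a rolling background window."""
    flags = []
    for i in range(len(co2_ppm)):
        bg_c = co2_ppm[max(0, i - window):i] or [co2_ppm[0]]
        bg_d = delta13c[max(0, i - window):i] or [delta13c[0]]
        mc = sum(bg_c) / len(bg_c)
        md = sum(bg_d) / len(bg_d)
        # Fall back to 1.0 when the window has zero variance.
        sc = math.sqrt(sum((x - mc) ** 2 for x in bg_c) / len(bg_c)) or 1.0
        sd = math.sqrt(sum((x - md) ** 2 for x in bg_d) / len(bg_d)) or 1.0
        flags.append(co2_ppm[i] > mc + nsig * sc
                     and delta13c[i] < md - nsig * sd)
    return flags

# Synthetic diurnal background with one injected leak sample:
# elevated CO2 carrying isotopically light (13C-depleted) carbon.
co2 = [400.0 + 5.0 * math.sin(i / 4.0) for i in range(48)]
d13 = [-8.0 + 0.2 * math.sin(i / 4.0) for i in range(48)]
co2[40] += 60.0
d13[40] -= 5.0
flags = detect_leak(co2, d13)
```

Requiring both criteria at once is what suppresses the diurnal biogenic swings, which move concentration and isotope ratio together rather than in opposite directions.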

  20. Medina® Embolization Device for the Treatment of Intracranial Aneurysms: Safety and Angiographic Effectiveness at 6 Months.

    PubMed

    Sourour, Nader-Antoine; Vande Perre, Saskia; Maria, Federico Di; Papagiannaki, Chrysanthi; Gabrieli, Joseph; Pistocchi, Silvia; Bartolini, Bruno; Degos, Vincent; Carpentier, Alexandre; Chiras, Jacques; Clarençon, Frédéric

    2018-02-01

    The Medina Embolization Device (MED) is a new-concept device that combines the design of a detachable coil with that of an intrasaccular flow disruption device. Our objective was to evaluate the feasibility, safety, and 6- to 9-mo effectiveness of this new device for the treatment of intracranial wide-necked aneurysms. Twelve patients (10 females, mean age = 56 yr) with 13 wide-necked intracranial aneurysms (3 ruptured; 10 unruptured) were treated by means of the MED from January 2015 to October 2015. In 15% of the cases, MEDs were used in a standalone fashion; in 85% of the cases, additional regular coils were used. An adjunctive compliant balloon was used in 4 of 13 cases (31%). Procedure-related complications were systematically recorded; the modified Rankin Scale was assessed at discharge and at 6- to 9-mo follow-up. Angiographic follow-up was performed with a mean delay of 5.5 ± 1.7 mo. The occlusion rate was evaluated postprocedure and at midterm follow-up using the Roy-Raymond scale. Deployment of the MED was feasible in all cases. No perforation was recorded. One thromboembolic complication was observed in a ruptured anterior communicating artery aneurysm, without any clinical consequence at follow-up. The Grade A occlusion rate was 61.5% postprocedure and 83% at 6-mo follow-up. Two cases (17%) of recanalization were documented angiographically. The MED is a new-generation device combining the design of a detachable coil with that of an intrasaccular flow disruption device. In our early experience, this device is safe and provides a satisfactory occlusion rate at 6-mo angiographic follow-up. Copyright © 2017 by the Congress of Neurological Surgeons

  1. EDITORIAL Wireless sensor networks: design for real-life deployment and deployment experiences Wireless sensor networks: design for real-life deployment and deployment experiences

    NASA Astrophysics Data System (ADS)

    Gaura, Elena; Roedig, Utz; Brusey, James

    2010-12-01

    Wireless sensor networks (WSNs) are among the most promising technologies of the new millennium. The opportunities afforded by being able to program networks of small, lightweight, low-power, computation- and bandwidth-limited nodes have attracted a large community of researchers and developers. However, the unique set of capabilities offered by the technology produces an exciting but complex design space, which is often difficult to negotiate in an application context. Deploying sensing systems in physical environments produces its own set of challenges, and can push systems into failure modes, thus revealing problems that can be difficult to discover or reproduce in simulation or the laboratory. Sustained efforts in the area of wireless networked sensing over the last 15 years have resulted in a large number of theoretical developments, substantial practical achievements, and a wealth of lessons for the future. It is clear that in order to bridge the gap between (on the one hand) visions of very large scale, autonomous, randomly deployed networks and (on the other) the actual performance of fielded systems, we need to view deployment as an essential component in the process of developing sensor networks: a process that includes hardware and software solutions that serve specific applications and end-user needs. Incorporating deployment into the design process reveals a new and different set of requirements and considerations, whose solutions require innovative thinking, multidisciplinary teams and strong involvement from end-user communities. This special feature uncovers and documents some of the hurdles encountered and solutions offered by experimental scientists when deploying and evaluating wireless sensor networks in situ, in a variety of well specified application scenarios. 
The papers specifically address issues of generic importance for WSN system designers: (i) data quality, (ii) communications availability and quality, (iii) alternative, low-energy sensing modalities and (iv) system solutions with high end-user added value and cost benefits. The common thread is deployment and deployment evaluation. In particular, satisfaction of application requirements, involvement of the end-user in the design and deployment process, satisfactory system performance and user acceptance are concerns addressed in many of the contributions. The contributions form a valuable set which helps to identify the priorities for research in this burgeoning area: Robust, reliable and efficient data collection in embedded wireless multi-hop networks is an essential element in creating a true deploy-and-forget user experience. Maintaining full connectivity within a WSN, in a real-world environment populated by other WSNs, WiFi networks or Bluetooth devices that constitute sources of interference, is a key element in any application, but more so for those that are safety-critical, such as disaster response. Awareness of the effects of the wireless channel, physical position and line-of-sight on received signal strength in real-world, outdoor environments will shape the design of many outdoor applications; thus, the quantification of such effects is valuable knowledge for designers. Sensor failure detection, scalability and commercialization are common challenges in many long-term monitoring applications; transferable solutions are evidenced here in the context of pollutant detection and water quality. Innovative, alternative thinking is often needed to achieve the desired long-lived networks when power-hungry sensors are foreseen as components; in some instances, the very problems of wireless technology, such as RF irregularity, can be transformed into advantages. 
The importance of an iterative design and evaluation methodology—from analysis to simulation to real-life deployment—should be well understood by all WSN developers. The value of this is highlighted in the context of a challenging WPAN video-surveillance application based on a novel Nomadic Access Mechanism. Cost benefits to be drawn from devising a WSN based solution to classic application areas such as surveillance are often a prime motivator for WSN designers; an example is offered here based on the use of intelligent agents for intrusion monitoring. Last but not least, the practicality and usability of the WSN solutions found for novel applications is key to their adoption. This is particularly true when the end-users of the developed technology are medical patients. The importance of feedback, elegant hardware encapsulation and extraction of meaning from data is presented in the context of novel orthopedic rehabilitation aids. Overall, this feature offers wide coverage of most issues encountered in the process of design, implementation and evaluation of deployable WSN systems. We trust that designers and developers of WSN systems will find much work of value, ranging from lessons learned, through solutions to known hurdles, to novel developments that enhance applications. Finally, we would like to thank all authors for their valuable contributions!

  2. Deployment-based lifetime optimization for linear wireless sensor networks considering both retransmission and discrete power control.

    PubMed

    Li, Ruiying; Ma, Wenting; Huang, Ning; Kang, Rui

    2017-01-01

    A sophisticated node deployment method can efficiently reduce the energy consumption of a Wireless Sensor Network (WSN) and prolong the corresponding network lifetime. Pioneers have proposed many deployment-based lifetime optimization methods for WSNs; however, in previous studies the retransmission mechanism is often neglected and the power control strategy is assumed to be continuous, even though retransmission and discrete power control are widely used in practice and have a large effect on network energy consumption. In this paper, both retransmission and discrete power control are considered together, and a more realistic energy-consumption-based network lifetime model for linear WSNs is provided. Using this model, we then propose a generic deployment-based optimization model that maximizes network lifetime under coverage, connectivity and transmission-rate-success constraints. The more accurate lifetime evaluation leads to a longer optimal network lifetime in realistic situations. To illustrate the effectiveness of our method, both one-tiered and two-tiered, uniformly and non-uniformly distributed linear WSNs are optimized in our case studies, and comparisons between our optimal results and those based on less accurate lifetime evaluations show the advantage of our method when investigating WSN lifetime optimization problems.
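The interaction between retransmission and discrete power control can be illustrated with a single-hop toy model: each discrete power level has a per-transmission energy and a link success probability, and the expected energy per packet follows from the retry limit. This is a stand-in for the paper's lifetime model, with all numbers invented:

```python
def best_power_level(levels, max_tries=4, required=0.99):
    """Pick the discrete transmit-power level minimizing expected energy
    per packet, subject to the post-retransmission delivery probability
    meeting `required`. `levels` holds (energy_per_tx_mJ, link_success_prob)
    pairs."""
    best = None
    for e_tx, p in levels:
        p_deliver = 1.0 - (1.0 - p) ** max_tries
        if p_deliver < required:
            continue                     # violates the delivery constraint
        expected_tries = p_deliver / p   # mean of the truncated geometric
        cost = e_tx * expected_tries     # expected energy per packet
        if best is None or cost < best[1]:
            best = ((e_tx, p), cost)
    return best

# Three hypothetical power levels, from cheap-but-lossy to costly-but-reliable.
choice = best_power_level([(0.10, 0.70), (0.15, 0.90), (0.25, 0.995)])
```

With retransmission accounted for, the cheapest level that still satisfies the delivery constraint wins here, despite its lossier link; assuming continuous power control or ignoring retries would change that answer, which is the abstract's point.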

  3. Evolution of grid-wide access to database resident information in ATLAS using Frontier

    NASA Astrophysics Data System (ADS)

    Barberis, D.; Bujor, F.; de Stefano, J.; Dewhurst, A. L.; Dykstra, D.; Front, D.; Gallas, E.; Gamboa, C. F.; Luehring, F.; Walker, R.

    2012-12-01

    The ATLAS experiment deployed Frontier technology worldwide during the initial year of LHC collision data taking to enable user analysis jobs running on the Worldwide LHC Computing Grid to access database-resident data. Since that time, the deployment model has evolved to optimize resources, improve performance, and streamline maintenance of Frontier and related infrastructure. In this presentation we focus on the specific changes and improvements undertaken, such as the optimization of cache and launchpad location, the use of RPMs for more uniform deployment of the underlying Frontier-related components, improvements in monitoring, optimization of fail-over, and an increasing use of a centrally managed database containing site-specific information (for configuration of services and monitoring). In addition, analysis of Frontier logs has given us a deeper understanding of problematic queries and of use cases. Use of the system has grown beyond user analysis and subsystem-specific tasks such as calibration and alignment, extending into production processing areas such as initial reconstruction and trigger reprocessing. With a more robust and tuned system, we are better equipped to satisfy the still-growing number of diverse clients and the demands of increasingly sophisticated processing and analysis.
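The benefit of Frontier's caching layer can be sketched with a minimal time-to-live cache sitting in front of a database backend. This illustrates the caching idea only and is not Frontier's actual implementation; the query text and TTL are invented:

```python
import time

class QueryCache:
    """Minimal TTL cache in the spirit of a Frontier/squid launchpad:
    repeated identical read-only queries are answered locally instead of
    each one hitting the central database."""

    def __init__(self, backend, ttl=300.0, clock=time.monotonic):
        self.backend = backend    # callable: sql -> result
        self.ttl = ttl            # seconds a cached answer stays fresh
        self.clock = clock
        self.store = {}           # sql -> (result, fetch_time)
        self.hits = self.misses = 0

    def query(self, sql):
        now = self.clock()
        entry = self.store.get(sql)
        if entry is not None and now - entry[1] < self.ttl:
            self.hits += 1
            return entry[0]       # fresh cached copy, no DB round trip
        self.misses += 1
        result = self.backend(sql)
        self.store[sql] = (result, now)
        return result

calls = []
def backend(sql):
    calls.append(sql)             # stands in for a central-DB round trip
    return f"rows:{sql}"

cache = QueryCache(backend)
r1 = cache.query("SELECT payload FROM align WHERE iov = 42")
r2 = cache.query("SELECT payload FROM align WHERE iov = 42")
```

Because many grid jobs issue identical conditions-data queries, even this naive scheme collapses most traffic before it reaches the central database, which is the load-shedding effect the deployment above relies on.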

  4. Strategies for lidar characterization of particulates from point and area sources

    NASA Astrophysics Data System (ADS)

    Wojcik, Michael D.; Moore, Kori D.; Martin, Randal S.; Hatfield, Jerry

    2010-10-01

    Use of ground-based remote sensing technologies such as scanning lidar (light detection and ranging) systems has gained traction in characterizing ambient aerosols due to key advantages such as a wide area of regard (10 km²), fast response time, high spatial resolution (<10 m), and high sensitivity. Energy Dynamics Laboratory and Utah State University, in conjunction with the USDA-ARS, have developed a three-wavelength scanning lidar system called Aglite that has been successfully deployed to characterize particle motion, concentration, and size distribution at both point and diffuse area sources in agricultural and industrial settings. A suite of mass-based and size-distribution point sensors is used to locally calibrate the lidar. Generating meaningful particle size distribution, mass concentration, and emission rate results from lidar data depends on strategic on-site deployment of these point sensors together with successful local meteorological measurements. Deployment strategies learned from five years of field use of this measurement system include the characterization of local meteorology and its predictability prior to deployment, the placement of point sensors to prevent contamination and overloading, the positioning of the lidar and beam plane to avoid hard-target interferences, and the usefulness of photographic and written observational data.

  5. An intelligent surveillance platform for large metropolitan areas with dense sensor deployment.

    PubMed

    Fernández, Jorge; Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M; Carro, Belén; Sánchez-Esguevillas, Antonio; Alonso-López, Jesus A; Smilansky, Zeev

    2013-06-07

    This paper presents an intelligent surveillance platform based on the use of large numbers of inexpensive sensors, designed and developed within the European Eureka Celtic project HuSIMS. With the aim of maximizing the number of deployable units while keeping monetary and resource/bandwidth costs at a minimum, the surveillance platform is based on inexpensive visual sensors which apply efficient motion detection and tracking algorithms to transform the video signal into a set of motion parameters. In order to automate the analysis of the myriad of data streams generated by the visual sensors, the platform's control center includes an alarm detection engine comprising three components that apply three different Artificial Intelligence strategies in parallel. These strategies are generic, domain-independent approaches able to operate in several domains (traffic surveillance, vandalism prevention, perimeter security, etc.). The architecture is completed with a versatile communication network which facilitates data collection from the visual sensors and the distribution of alarms and video streams towards the emergency teams. The resulting surveillance system is well suited for deployment in metropolitan areas, smart cities, and large facilities, mainly because cheap visual sensors and autonomous alarm detection facilitate dense sensor network deployments for wide and detailed coverage.
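
    As a rough sketch of the idea of reducing a video signal to motion parameters at the sensor, here is an assumed frame-differencing approach (an illustration, not the HuSIMS algorithm itself): two frames are compared pixelwise, and only a handful of numbers describing the moving region is emitted.

```python
import numpy as np

# Assumed sketch (not the HuSIMS algorithm): reduce two grayscale frames to
# a few motion parameters, so only parameters rather than video need to
# leave the sensor.

def motion_parameters(prev_frame, frame, threshold=30):
    """Return (active pixel fraction, centroid row, centroid col) of motion."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int)) > threshold
    if not diff.any():
        return 0.0, None, None
    rows, cols = np.nonzero(diff)
    return float(diff.mean()), float(rows.mean()), float(cols.mean())

# A bright block moves slightly between two synthetic 64x64 frames:
f0 = np.zeros((64, 64), dtype=np.uint8)
f1 = np.zeros((64, 64), dtype=np.uint8)
f0[10:20, 10:20] = 255
f1[12:22, 12:22] = 255
frac, row, col = motion_parameters(f0, f1)
```

    Streaming three numbers per frame instead of the frame itself is what keeps the per-unit bandwidth cost low enough for dense deployments.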

  6. Steering Concept of a 2-Blade Heliogyro Solar Sail Spacecraft

    NASA Technical Reports Server (NTRS)

    Wiwattananon, Peerawan; Bryant, Robert G.

    2017-01-01

    Solar sails can be classified into two groups based on their method of stabilization: 1) truss supported, and 2) centrifugally (spin) supported. The truss configuration requires masts or booms to deploy, support, and rigidize the sails, whereas the spin type uses the spacecraft's centrifugal force to deploy and stabilize the sails. The truss-supported sail has a scaling limitation: as the sail area grows, the sail becomes increasingly difficult to make and stow, and the masts and booms get heavier, occupy more volume, and carry increased risk during deployment. This major disadvantage limits the achievable sail area. The spin type comes in two configurations: 1) the spinning square/disk sail and 2) the heliogyro sail. The spinning square/disk architecture suffers the same sail-area limitation as the truss-supported sail.

  7. GROMACS 4.5: a high-throughput and highly parallel open source molecular simulation toolkit

    PubMed Central

    Pronk, Sander; Páll, Szilárd; Schulz, Roland; Larsson, Per; Bjelkmar, Pär; Apostolov, Rossen; Shirts, Michael R.; Smith, Jeremy C.; Kasson, Peter M.; van der Spoel, David; Hess, Berk; Lindahl, Erik

    2013-01-01

    Motivation: Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on a massive scale in clusters, web servers, distributed computing or cloud resources. Results: Here, we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including Windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations. Availability: GROMACS is open source and free software available from http://www.gromacs.org. Contact: erik.lindahl@scilifelab.se Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23407358

  8. 2D materials in electro-optic modulation: energy efficiency, electrostatics, mode overlap, material transfer and integration

    NASA Astrophysics Data System (ADS)

    Ma, Zhizhen; Hemnani, Rohit; Bartels, Ludwig; Agarwal, Ritesh; Sorger, Volker J.

    2018-02-01

    Here we discuss the physics of electro-optic modulators deploying 2D materials. We include a scaling-laws analysis and show how energy efficiency and speed change for three underlying cavity systems as a function of critical device length scaling. A key result is that the energy-per-bit of the modulator is proportional to the volume of the device, making submicron-scale modulators possible when a plasmonic optical mode is deployed. We then show how graphene's Pauli-blocking modulation mechanism is sensitive to the device operation temperature, whereby reducing the temperature enables a 10× improvement in modulator energy efficiency. Furthermore, we show how the high index tunability of graphene is able to compensate for the small optical overlap factor of 2D-material-based modulators, unlike classical silicon-based dispersion devices. Lastly, we demonstrate a novel method towards a 2D material printer suitable for cross-contamination-free and on-demand printing. The latter paves the way to integrating 2D materials seamlessly into taped-out photonic chips.
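
    The stated proportionality between energy-per-bit and device volume can be illustrated with a standard parallel-plate capacitor estimate (an assumption for illustration; the paper's cavity-specific analysis is more detailed): E_bit ≈ C·V_pp²/4, with C scaling linearly in device length.

```python
# Back-of-envelope sketch (assumed parallel-plate estimate, not the paper's
# model): for a capacitive modulator the switching energy per bit is roughly
# C * Vpp^2 / 4, and C scales with active device area, so shrinking the
# device length directly shrinks the energy per bit.

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def energy_per_bit(length_m, width_m, gap_m, eps_r, vpp):
    c = eps_r * EPS0 * length_m * width_m / gap_m  # parallel-plate capacitance
    return c * vpp ** 2 / 4.0

# All device parameters below are illustrative assumptions:
e_long = energy_per_bit(10e-6, 500e-9, 10e-9, 9.0, 1.0)   # 10 um device
e_short = energy_per_bit(5e-6, 500e-9, 10e-9, 9.0, 1.0)   # 5 um device
```

    In this model, halving the device length halves the energy per bit, which is the scaling argument for submicron plasmonic modulators.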

  9. Energy Management and Optimization Methods for Grid Energy Storage Systems

    DOE PAGES

    Byrne, Raymond H.; Nguyen, Tu A.; Copp, David A.; ...

    2017-08-24

    Today, the stability of the electric power grid is maintained through real-time balancing of generation and demand. Grid-scale energy storage systems are increasingly being deployed to provide grid operators the flexibility needed to maintain this balance. Energy storage also imparts resiliency and robustness to the grid infrastructure. Over the last few years, there has been a significant increase in the deployment of large-scale energy storage systems. This growth has been driven by improvements in the cost and performance of energy storage technologies and the need to accommodate distributed generation, as well as incentives and government mandates. Energy management systems (EMSs) and optimization methods are required to effectively and safely utilize energy storage as a flexible grid asset that can provide multiple grid services. The EMS needs to be able to accommodate a variety of use cases and regulatory environments. In this paper, we provide a brief history of grid-scale energy storage, an overview of EMS architectures, and a summary of the leading applications for storage. These serve as a foundation for a discussion of EMS optimization methods and design.
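
    As a minimal illustration of one storage application an EMS must schedule, energy arbitrage, here is a toy greedy schedule under assumed prices and a lossless battery; a real EMS would co-optimize multiple services under efficiency, degradation, and network constraints.

```python
# Toy arbitrage sketch (illustrative, not an actual EMS): charge during the
# cheapest hours and discharge during the most expensive ones, within the
# battery's energy capacity, assuming perfect round-trip efficiency.

def arbitrage_schedule(prices, capacity_mwh, power_mw):
    hours = sorted(range(len(prices)), key=lambda h: prices[h])
    n = int(capacity_mwh / power_mw)       # hours needed to fill/empty fully
    charge_hours = set(hours[:n])          # cheapest hours: charge
    discharge_hours = set(hours[-n:])      # priciest hours: discharge
    schedule = []
    for h in range(len(prices)):
        if h in charge_hours:
            schedule.append(-power_mw)     # negative = charging (buying)
        elif h in discharge_hours:
            schedule.append(power_mw)      # positive = discharging (selling)
        else:
            schedule.append(0.0)
    profit = sum(p * q for p, q in zip(prices, schedule))
    return schedule, profit

prices = [20, 18, 15, 17, 25, 40, 55, 45, 30, 22, 21, 19]  # $/MWh, synthetic
schedule, profit = arbitrage_schedule(prices, capacity_mwh=4, power_mw=2)
```

    The energy bought equals the energy sold, and the profit comes entirely from the price spread between the selected hours.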

  11. Outlooks for Wind Power in the United States: Drivers and Trends under a 2016 Policy Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mai, Trieu; Lantz, Eric; Ho, Jonathan

    Over the past decade, wind power has become one of the fastest growing electricity generation sources in the United States. Despite this growth, the U.S. wind industry continues to experience year-to-year fluctuations across the manufacturing and supply chain as a result of dynamic market conditions and changing policy landscapes. Moreover, with advancing wind technologies, ever-changing fossil fuel prices, and evolving energy policies, the long-term future for wind power is highly uncertain. In this report, we present multiple outlooks for wind power in the United States to explore the possibilities of future wind deployment. The future wind power outlooks presented rely on high-resolution wind resource data and advanced electric sector modeling capabilities to evaluate an array of potential scenarios of the U.S. electricity system. Scenario analysis is used to explore drivers, trends, and implications for wind power deployment over multiple periods through 2050. Specifically, we model 16 scenarios of wind deployment in the contiguous United States. These scenarios span a wide range of wind technology costs, natural gas prices, and future transmission expansion. We identify conditions with more consistent wind deployment after the production tax credit expires as well as drivers for more robust wind growth in the long run. Conversely, we highlight challenges to future wind deployment. We find that the degree to which wind technology costs decline can play an important role in future wind deployment, electric sector CO2 emissions, and lowering allowance prices for the Clean Power Plan.

  12. Out of the Shadows: The Health and Well-Being of Private Contractors Working in Conflict Environments.

    PubMed

    Dunigan, Molly; Farmer, Carrie M; Burns, Rachel M; Hawks, Alison; Setodji, Claude Messan

    2014-01-01

    Over the past decade, private contractors have been deployed extensively around the globe. In addition to supporting U.S. and allied forces in Iraq and Afghanistan, contractors have assisted foreign governments, nongovernmental organizations, and private businesses by providing a wide range of services, including base support and maintenance, logistical support, transportation, intelligence, communications, construction, and security. At the height of the conflicts in Iraq and Afghanistan, contractors outnumbered U.S. troops deployed to both theaters. Although these contractors are not supposed to engage in offensive combat, they may nonetheless be exposed to many of the stressors that are known to have physical and mental health implications for military personnel. RAND conducted an online survey of a sample of contractors who had deployed on contract to a theater of conflict at least once between early 2011 and early 2013. The survey collected demographic and employment information, along with details about respondents' deployment experience (including level of preparation for deployment, combat exposure, and living conditions), mental health (including probable posttraumatic stress disorder, depression, and alcohol misuse), physical health, and access to and use of health care. The goal was to describe the contractors' health and well-being and to explore differences across the sample by such factors as country of citizenship, job specialty, and length and frequency of contract deployment. The findings provide a foundation for future studies of contractor populations and serve to inform policy decisions affecting contractors, including efforts to reduce barriers to mental health treatment for this population.

  13. Relationship between Device Size and Body Weight in Dogs with Patent Ductus Arteriosus Undergoing Amplatz Canine Duct Occluder Deployment.

    PubMed

    Wesselowski, S; Saunders, A B; Gordon, S G

    2017-09-01

    Deployment of the Amplatz Canine Duct Occluder (ACDO) is the preferred method for minimally invasive occlusion of patent ductus arteriosus (PDA) in dogs, with appropriate device sizing crucial to successful closure. Dogs of any body weight can be affected by PDA. Our objectives were to describe the range of ACDO sizes deployed in dogs of various body weights, for improved procedural planning and inventory selection, and to investigate the correlation between minimal ductal diameter (MDD) and body weight. The study included 152 dogs undergoing ACDO deployment between 2008 and 2016. Body weight, age, breed, sex, MDD obtained by angiography (MDD-A), MDD obtained by transesophageal echocardiography (MDD-TEE), and ACDO size deployed were retrospectively evaluated. Correlation between body weight and ACDO size, MDD-A, and MDD-TEE was poor, with R-squared values of 0.4, 0.36, and 0.3, respectively. Femoral artery diameter placed inherent limitations on the use of larger device sizes in the smallest dogs, with no limitations on the wide range of device sizes required as patient size increased. The most commonly used ACDO devices were sizes 3 through 6, representing 57% of the devices deployed within the entire study population. Patent ductus arteriosus anatomy varies on an individual basis, with poor correlation between MDD and body weight. Weight-based assumptions about expected ACDO device size for a given patient are not recommended. Copyright © 2017 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
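
    The R-squared values quoted above are coefficients of determination for a simple linear fit. A minimal sketch of that computation, on synthetic data (not the study's):

```python
# Coefficient of determination (R-squared) for a simple linear regression,
# computed from the sums of squares. Data below are synthetic, for
# illustration only.

def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

weights_kg = [3.1, 4.0, 5.2, 8.5, 12.0, 20.3]  # synthetic body weights
device_size = [3, 4, 4, 5, 6, 6]               # synthetic ACDO sizes

r2 = r_squared(weights_kg, device_size)
```

    An R-squared near 0.4, as the study reports, means body weight explains well under half the variance in device size, hence the recommendation against weight-based sizing.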

  14. ADMS Evaluation Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2018-01-23

    Deploying an ADMS or looking to optimize its value? NREL offers a low-cost, low-risk evaluation platform for assessing ADMS performance. The National Renewable Energy Laboratory (NREL) has developed a vendor-neutral advanced distribution management system (ADMS) evaluation platform and is expanding its capabilities. The platform uses actual grid-scale hardware, large-scale distribution system models, and advanced visualization to simulate real-world conditions for the most accurate ADMS evaluation and experimentation.

  15. A New Multibeam Sonar Technique for Evaluating Fine-Scale Fish Behavior Near Hydroelectric Dam Guidance Structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Robert L.; Simmons, Mary Ann; Simmons, Carver S.

    2002-03-07

    This book chapter describes a Dual-Head Multibeam Sonar (DHMS) system developed by Battelle and deployed at two dam sites on the Snake and Columbia rivers in Washington State to evaluate the fine-scale (

  16. Pilot-Scale Demonstration of In-Situ Chemical Oxidation Involving Chlorinated Volatile Organic Compounds - Design and Deployment Guidelines (Parris Island, SC, U.S. Marine Corp Recruit Depot, Site 45 Pilot Study)

    EPA Science Inventory

    A pilot-scale in situ chemical oxidation (ISCO) demonstration, involving subsurface injections of sodium permanganate (NaMnO4), was performed at the US Marine Corp Recruit Depot (MCRD), site 45 (Parris Island (PI), SC). The ground water was originally contaminated with perchloroe...

  17. High-Density, High-Resolution, Low-Cost Air Quality Sensor Networks for Urban Air Monitoring

    NASA Astrophysics Data System (ADS)

    Mead, M. I.; Popoola, O. A.; Stewart, G.; Bright, V.; Kaye, P.; Saffell, J.

    2012-12-01

    To monitor air quality in highly granular environments such as urban areas, which are spatially heterogeneous with variable emission sources, measurements need to be made at appropriate spatial and temporal scales. Current routine air quality monitoring networks are generally composed either of sparse, expensive installations (incorporating, e.g., chemiluminescence instruments) or of higher-density, low-time-resolution systems (e.g., NO2 diffusion tubes). Either approach may fail to accurately capture important effects such as pollutant "hot spots" or to adequately capture spatial (or temporal) variability. As a result, analyses based on data from traditional low-spatial-resolution networks, such as personal exposure estimates, may be inaccurate. In this paper we present details of a sophisticated, low-cost, multi-species (gas phase, speciated PM, meteorology) air quality measurement network methodology incorporating GPS and GPRS, developed for high-resolution air quality measurements in urban areas. Sensor networks developed in the Centre for Atmospheric Science (University of Cambridge) incorporate electrochemical gas sensors configured for use in urban air quality studies operating at parts-per-billion (ppb) levels. It has been demonstrated that these sensors can measure key air quality gases such as CO, NO and NO2 at the low ppb mixing ratios present in the urban environment (estimated detection limits <4 ppb for CO and NO and <1 ppb for NO2; Mead et al., submitted Aug. 2012). Based on this work, a state-of-the-art multi-species instrument package for deployment in scalable sensor networks has been developed which has general applicability. This is currently being employed as part of a major 3-year UK program at London Heathrow airport (the Sensor Networks for Air Quality (SNAQ) Heathrow project).
The main project outcome is the creation of a calibrated, high spatial and temporal resolution data set for O3, NO, NO2, SO2, CO, CO2, total VOCs, size-speciated PM, temperature, relative humidity, and wind speed and direction. The network uses existing GPRS infrastructure for real-time transmission of data with low overheads in terms of cost, effort and installation. In this paper we present data from the SNAQ Heathrow project as well as previous deployments, showing measurement capability at the ppb level for NO, NO2 and CO. We show that variability can be observed and measured quantitatively with these sensor networks over widely differing time scales, from individual emission events, through diurnal variability associated with traffic and meteorological conditions, to longer-term synoptic weather conditions and seasonal behaviour. This work demonstrates a generic capability widely applicable to urban areas, airports and other complex emissions environments, making this sensor system methodology valuable for scientific, policy and regulatory issues. We conclude that the low-cost, high-density network philosophy has the potential to provide a more complete assessment of the high-granularity air quality structure generally observed in the environment and, when appropriately deployed, to offer a new paradigm in air quality quantification and monitoring.
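
    For context on the quoted detection limits, a common estimate (assumed here for illustration; not necessarily the authors' exact procedure) is three standard deviations of the zero-air baseline signal divided by the sensor sensitivity:

```python
import statistics

# Common 3-sigma detection-limit estimate (an assumed, generic procedure):
# LOD = 3 * stdev(baseline signal in zero air) / sensitivity.
# Baseline and sensitivity values below are synthetic.

def detection_limit_ppb(baseline_mv, sensitivity_mv_per_ppb):
    return 3 * statistics.stdev(baseline_mv) / sensitivity_mv_per_ppb

baseline = [0.8, 1.1, 0.9, 1.0, 1.2, 0.7, 1.0, 0.9]  # mV in zero air, synthetic
lod = detection_limit_ppb(baseline, sensitivity_mv_per_ppb=0.5)
```

    With ppb-scale limits of this kind, the electrochemical sensors can resolve the urban mixing ratios of CO, NO and NO2 discussed above.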

  18. Structural Analysis of NASA's ULDB using Photogrammetric Measurements

    NASA Astrophysics Data System (ADS)

    Young, Leyland; Garde, Gabriel; Cathey, Henry

    The National Aeronautics and Space Administration (NASA) Balloon Program Office (BPO) has been developing a super-pressure Ultra Long Duration Balloon (ULDB) for constant-altitude, longer-duration flights. The development of the ULDB has progressed in many areas that are significant to NASA's desired goals. However, a recurring anomaly called a cleft has prevented the balloon from properly deploying at float altitudes. Over the years, many hypotheses and speculations have been offered as to the cause of cleft formation, and significant changes were made to the ULDB design paradigm to address the clefting issue. It was hypothesized that the design philosophy of fore-shortening the tendons relative to the polyethylene film was causing the cleft formation, so the latest design concept removed the fore-shortening, producing a one-to-one matching of the tendons and film. Nevertheless, in 2006, a six-million-cubic-foot (MCF) balloon designed with zero fore-shortening clefted as it reached its float altitude. This 6 MCF cleft proved that the clefting phenomenon was not properly understood and that there was more to the problem than fore-shortening alone. Most analytical work on the ULDB clefting issue focused on pressurized stability. Several finite element analyses showed that the new design concept produces a stable balloon when pressurized; thus, pressurized stability was believed to be a sufficient indicator of whether a balloon would cleft. The 6 MCF balloon that clefted in 2006 ultimately showed that the pressurized stability analysis is subjective and not applicable for predicting cleft formation.
Moreover, the analytical pressurized stability analysis is conducted on a fully deployed balloon, whereas the clefting phenomenon occurs as part of the deployment process and is clearly seen during the final deployment stages. There is no doubt that an analytical tool will eventually be available to fully analyze the ULDB; at present, however, analytical efforts are ongoing but delayed by the complexity of modeling a balloon from the un-deployed to the deployed configuration. Thus, in the absence of an analytical tool, ULDB development was steered towards experimental work to understand the clefting phenomenon. This paper highlights the experimental analyses conducted on several scaled-model ULDBs using photogrammetry measurements. The experimental work began with two 48-gore, 4-meter-diameter scaled ULDBs having a 180-degree bulge angle and a 7.5-degree bulge angle, respectively. The 180-degree balloon inflation experiments showed cleft-like formations appearing at the onset of full deployment, whereas these formations were absent in the subsequent experiments with the 7.5-degree bulge angle balloon. This confirmed the hypothesis that "excess material" designed into the gore width to create a 180-degree bulge angle likely contributes to the clefting phenomenon. The ULDB project therefore built three 200-gore, 27-meter balloons, with 90-degree, 55-degree, and 1.8-degree bulge angles, to verify the excess-material hypothesis and to explore the limits of the deployment trade space. The photogrammetric analysis of these three 27-meter-diameter balloons provided valuable data on stresses, strains, and the deployment mechanics of a ULDB, demonstrating that excess material is a contributor to the clefting phenomenon.
Notably, the photogrammetry data showed significant benefits for the lower lobe angle designs; the lowest lobe angle balloon deployed better and had stresses and strains comparable to the other two designs. Another test was conducted on an 8-meter, 48-gore scaled-model ULDB to test the strain limits of the film. After

  19. Automatically assessing properties of dynamic cameras for camera selection and rapid deployment of video content analysis tasks in large-scale ad-hoc networks

    NASA Astrophysics Data System (ADS)

    den Hollander, Richard J. M.; Bouma, Henri; van Rest, Jeroen H. C.; ten Hove, Johan-Martijn; ter Haar, Frank B.; Burghouts, Gertjan J.

    2017-10-01

    Video analytics is essential for managing large quantities of raw data that are produced by video surveillance systems (VSS) for the prevention, repression and investigation of crime and terrorism. Analytics is highly sensitive to changes in the scene and in the optical chain, so a VSS with analytics needs careful configuration and prompt maintenance to avoid false alarms. However, there is a trend from static VSS consisting of fixed CCTV cameras towards more dynamic VSS deployments over public/private multi-organization networks, consisting of a wider variety of visual sensors, including pan-tilt-zoom (PTZ) cameras, body-worn cameras and cameras on moving platforms. This trend will lead to more dynamic scenes and more frequent changes in the optical chain, creating structural problems for analytics. If these problems are not adequately addressed, analytics will not be able to continue to meet end users' evolving needs. In this paper, we present a three-part solution for managing the performance of complex analytics deployments. The first part is a register containing metadata describing relevant properties of the optical chain, such as intrinsic and extrinsic calibration, and parameters of the scene such as lighting conditions or measures of scene complexity (e.g. number of people). A second part frequently assesses these parameters in the deployed VSS, stores changes in the register, and signals relevant changes in the setup to the VSS administrator. A third part uses the information in the register to dynamically configure analytics tasks based on VSS operator input. In order to support the feasibility of this solution, we give an overview of related state-of-the-art technologies for autocalibration (self-calibration), scene recognition and lighting estimation in relation to person detection. The presented solution allows for rapid and robust deployment of Video Content Analysis (VCA) tasks in large-scale ad-hoc networks.
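
    A hypothetical sketch of the first two components described above, a register of optical-chain and scene metadata plus a change check that signals relevant drift to the VSS administrator; all field names and the drift criterion are assumptions for illustration, not the paper's actual schema.

```python
from dataclasses import dataclass, field
import time

# Hypothetical camera-metadata register (field names and the lux-based drift
# criterion are illustrative assumptions, not the paper's design).

@dataclass
class CameraRecord:
    camera_id: str
    intrinsics: dict          # e.g. focal length, principal point
    extrinsics: dict          # e.g. position, pan/tilt/zoom state
    scene: dict               # e.g. lux level, crowd density estimate
    updated: float = field(default_factory=time.time)

class Register:
    def __init__(self, lux_tolerance=50.0):
        self.records = {}
        self.lux_tolerance = lux_tolerance

    def update(self, record):
        """Store the record; return True if the change warrants a signal."""
        old = self.records.get(record.camera_id)
        self.records[record.camera_id] = record
        if old is None:
            return True  # a newly seen camera is always worth signalling
        drift = abs(record.scene.get("lux", 0) - old.scene.get("lux", 0))
        return drift > self.lux_tolerance

reg = Register()
first = reg.update(CameraRecord("cam-01", {}, {}, {"lux": 400}))   # new camera
second = reg.update(CameraRecord("cam-01", {}, {}, {"lux": 380}))  # small drift
third = reg.update(CameraRecord("cam-01", {}, {}, {"lux": 90}))    # large drift
```

    The third component would then consume these records to reconfigure analytics tasks, e.g. switching a person detector to a low-light model when a large lux drop is signalled.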

  20. Next generation sensing platforms for extended deployments in large-scale, multidisciplinary, adaptive sampling and observational networks

    NASA Astrophysics Data System (ADS)

    Cross, J. N.; Meinig, C.; Mordy, C. W.; Lawrence-Slavas, N.; Cokelet, E. D.; Jenkins, R.; Tabisola, H. M.; Stabeno, P. J.

    2016-12-01

    New autonomous sensors have dramatically increased the resolution and accuracy of oceanographic data collection, enabling rapid sampling over extremely fine scales. Innovative new autonomous platforms like floats, gliders, drones, and crawling moorings leverage the full potential of these new sensors by extending spatiotemporal reach across varied environments. During 2015 and 2016, the Innovative Technology for Arctic Exploration Program at the Pacific Marine Environmental Laboratory tested several new types of fully autonomous platforms with increased speed, durability, and power and payload capacity, designed to deliver cutting-edge ecosystem assessment sensors to remote or inaccessible environments. The Expendable Ice-Tracking (EXIT) float developed by the NOAA Pacific Marine Environmental Laboratory (PMEL) is moored near the bottom during the ice-free season and released on an autonomous timer beneath the ice during the following winter. The float collects a rapid profile during ascent and continues to collect critical, poorly accessible under-ice data until melt, when the data are transmitted via satellite. The autonomous Oculus sub-surface glider developed by the University of Washington and PMEL has a large power and payload capacity and an enhanced buoyancy engine. This 'coastal truck' is designed for the rapid water column ascent required by optical imaging systems. The Saildrone is a solar- and wind-powered unmanned surface vessel (USV) developed by Saildrone, Inc. in partnership with PMEL. This large-payload (200 lbs), fast (1-7 kts), durable (46 kts winds) platform was equipped with 15 sensors designed for ecosystem assessment during 2016, including passive and active acoustic systems specially redesigned for autonomous vehicle deployments. The sensors deployed on these platforms achieved rigorous accuracy and precision standards.
These innovative platforms provide new sampling capabilities and cost efficiencies in high-resolution sensor deployment, including reconnaissance for annual fisheries and marine mammal surveys; better linkages between sustained observing platforms; and adaptive deployments that can easily target anomalies as they arise.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lukhanin, Gennadiy; Biery, Kurt; Foulkes, Stephen

    In the NOvA experiment, the Detector Controls System (DCS) provides a method for controlling and monitoring important detector hardware and environmental parameters. It is essential for operating the detector and is required to have access to roughly 370,000 independent programmable channels via more than 11,600 physical devices. In this paper, we demonstrate an application of Control System Studio (CSS), developed by Oak Ridge National Laboratory, for the NOvA experiment. The application of CSS for the DCS of the NOvA experiment has been divided into three phases: (1) user requirements and a concept prototype on a test stand, (2) small-scale deployment at the prototype Near Detector on the Surface, and (3) larger-scale deployment at the Far Detector. We also give an outline of the CSS integration with the NOvA online software and the alarm handling logic for the front-end electronics.

  2. Wireless Sensor Networks for Oceanographic Monitoring: A Systematic Review

    PubMed Central

    Albaladejo, Cristina; Sánchez, Pedro; Iborra, Andrés; Soto, Fulgencio; López, Juan A.; Torres, Roque

    2010-01-01

    Monitoring of the marine environment has come to be a field of scientific interest in the last ten years. The instruments used in this work have ranged from small-scale sensor networks to complex observation systems. Among small-scale networks, Wireless Sensor Networks (WSNs) are a highly attractive solution in that they are easy to deploy, operate and dismantle and are relatively inexpensive. The aim of this paper is to identify, appraise, select and synthesize all high quality research evidence relevant to the use of WSNs in oceanographic monitoring. The literature is systematically reviewed to offer an overview of the present state of this field of study and identify the principal resources that have been used to implement networks of this kind. Finally, this article details the challenges and difficulties that have to be overcome if these networks are to be successfully deployed. PMID:22163583

  3. Calcium-based multi-element chemistry for grid-scale electrochemical energy storage

    NASA Astrophysics Data System (ADS)

    Ouchi, Takanari; Kim, Hojong; Spatocco, Brian L.; Sadoway, Donald R.

    2016-03-01

    Calcium is an attractive material for the negative electrode in a rechargeable battery due to its low electronegativity (high cell voltage), double valence, earth abundance and low cost; however, the use of calcium has historically eluded researchers due to its high melting temperature, high reactivity and unfavorably high solubility in molten salts. Here we demonstrate a long-cycle-life calcium-metal-based rechargeable battery for grid-scale energy storage. By deploying a multi-cation binary electrolyte in concert with an alloyed negative electrode, calcium solubility in the electrolyte is suppressed and operating temperature is reduced. These chemical mitigation strategies also engage another element in energy storage reactions resulting in a multi-element battery. These initial results demonstrate how the synergistic effects of deploying multiple chemical mitigation strategies coupled with the relaxation of the requirement of a single itinerant ion can unlock calcium-based chemistries and produce a battery with enhanced performance.

  4. Calcium-based multi-element chemistry for grid-scale electrochemical energy storage

    PubMed Central

    Ouchi, Takanari; Kim, Hojong; Spatocco, Brian L.; Sadoway, Donald R.

    2016-01-01

    Calcium is an attractive material for the negative electrode in a rechargeable battery due to its low electronegativity (high cell voltage), double valence, earth abundance and low cost; however, the use of calcium has historically eluded researchers due to its high melting temperature, high reactivity and unfavorably high solubility in molten salts. Here we demonstrate a long-cycle-life calcium-metal-based rechargeable battery for grid-scale energy storage. By deploying a multi-cation binary electrolyte in concert with an alloyed negative electrode, calcium solubility in the electrolyte is suppressed and operating temperature is reduced. These chemical mitigation strategies also engage another element in energy storage reactions resulting in a multi-element battery. These initial results demonstrate how the synergistic effects of deploying multiple chemical mitigation strategies coupled with the relaxation of the requirement of a single itinerant ion can unlock calcium-based chemistries and produce a battery with enhanced performance. PMID:27001915

  5. Calcium-based multi-element chemistry for grid-scale electrochemical energy storage.

    PubMed

    Ouchi, Takanari; Kim, Hojong; Spatocco, Brian L; Sadoway, Donald R

    2016-03-22

    Calcium is an attractive material for the negative electrode in a rechargeable battery due to its low electronegativity (high cell voltage), double valence, earth abundance and low cost; however, the use of calcium has historically eluded researchers due to its high melting temperature, high reactivity and unfavorably high solubility in molten salts. Here we demonstrate a long-cycle-life calcium-metal-based rechargeable battery for grid-scale energy storage. By deploying a multi-cation binary electrolyte in concert with an alloyed negative electrode, calcium solubility in the electrolyte is suppressed and operating temperature is reduced. These chemical mitigation strategies also engage another element in energy storage reactions resulting in a multi-element battery. These initial results demonstrate how the synergistic effects of deploying multiple chemical mitigation strategies coupled with the relaxation of the requirement of a single itinerant ion can unlock calcium-based chemistries and produce a battery with enhanced performance.

  6. Measuring Large-Scale Social Networks with High Resolution

    PubMed Central

    Stopczynski, Arkadiusz; Sekara, Vedran; Sapiezynski, Piotr; Cuttone, Andrea; Madsen, Mette My; Larsen, Jakob Eg; Lehmann, Sune

    2014-01-01

    This paper describes the deployment of a large-scale study designed to measure human interactions across a variety of communication channels, with high temporal resolution and spanning multiple years—the Copenhagen Networks Study. Specifically, we collect data on face-to-face interactions, telecommunication, social networks, location, and background information (personality, demographics, health, politics) for a densely connected population of 1,000 individuals, using state-of-the-art smartphones as social sensors. Here we provide an overview of the related work and describe the motivation and research agenda driving the study. Additionally, the paper details the data types measured, the technical infrastructure in terms of both backend and phone software, and an outline of the deployment procedures. We document the participant privacy procedures and their underlying principles. The paper concludes with early results from data analysis, illustrating the importance of a multi-channel, high-resolution approach to data collection. PMID:24770359

  7. Integrating Puppet and Gitolite to provide a novel solution for scalable system management at the MPPMU Tier2 centre

    NASA Astrophysics Data System (ADS)

    Delle Fratte, C.; Kennedy, J. A.; Kluth, S.; Mazzaferro, L.

    2015-12-01

    In a grid computing infrastructure, tasks such as continuous upgrades, service installations, and software deployments are part of an administrator's daily work. In such an environment, tools to help with the management, provisioning, and monitoring of the deployed systems and services have become crucial. As experiments such as the LHC increase in scale, the computing infrastructure also becomes larger and more complex. Moreover, today's admins increasingly work within teams that share responsibilities and tasks. Such a scaled-up situation requires tools that not only simplify the workload on administrators but also enable them to work seamlessly in teams. In this paper we present our experience managing the Max Planck Institute Tier2 centre using Puppet and Gitolite cooperatively to support system administrators in their daily work. In addition to describing the Puppet-Gitolite system, best practices and customizations are also shown.

  8. A manipulator arm for zero-g simulations

    NASA Technical Reports Server (NTRS)

    Brodie, S. B.; Grant, C.; Lazar, J. J.

    1975-01-01

    A 12-ft counterbalanced Slave Manipulator Arm (SMA) was designed and fabricated to be used for resolving the questions of operational applications, capabilities, and limitations for such remote manned systems as the Payload Deployment and Retrieval Mechanism (PDRM) for the shuttle, the Free-Flying Teleoperator System, the Advanced Space Tug, and Planetary Rovers. As a developmental tool for the shuttle manipulator system (or PDRM), the SMA represents an approximate one-quarter scale working model for simulating and demonstrating payload handling, docking assistance, and satellite servicing. For the Free-Flying Teleoperator System and the Advanced Tug, the SMA provides a near full-scale developmental tool for satellite servicing, docking, and deployment/retrieval procedures, techniques, and support equipment requirements. For the Planetary Rovers, it provides an oversize developmental tool for sample handling and soil mechanics investigations. The design of the SMA was based on concepts developed for a 40-ft NASA technology arm to be used for zero-g shuttle manipulator simulations.

  9. Launching genomics into the cloud: deployment of Mercury, a next generation sequence analysis pipeline.

    PubMed

    Reid, Jeffrey G; Carroll, Andrew; Veeraraghavan, Narayanan; Dahdouli, Mahmoud; Sundquist, Andreas; English, Adam; Bainbridge, Matthew; White, Simon; Salerno, William; Buhay, Christian; Yu, Fuli; Muzny, Donna; Daly, Richard; Duyk, Geoff; Gibbs, Richard A; Boerwinkle, Eric

    2014-01-29

    Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples.

  10. Opportunities for Fundamental University-Based Research in Energy and Resource Recovery

    NASA Astrophysics Data System (ADS)

    Zoback, M. D.; Hitzman, M.; Tester, J. W.

    2012-12-01

    In this talk we present, from a university perspective, a few examples of fundamental research needs related to improved energy and resource recovery. One example of such a research need relates to the fact that it is not widely recognized that meeting domestic and worldwide energy needs with renewables such as wind and solar will be materials intensive. If widely deployed, the elements required by renewable technologies will be needed in significant quantities, and a shortage of these "energy critical elements" could significantly inhibit the adoption of otherwise game-changing energy technologies. It is imperative to better understand the geology, metallurgy, and mining engineering of critical mineral deposits if we are to sustainably develop these new technologies. Unfortunately, there is currently no consensus among federal and state agencies, the national and international mining industry, the public, and the U.S. academic community regarding the importance of economic geology in the context of securing sufficient energy critical elements to undertake large-scale renewable energy development. Another option for transitioning away from our current hydrocarbon-based energy system to non-carbon-based sources is geothermal energy - from both conventional hydrothermal resources and enhanced or engineered geothermal systems (EGS). Although geothermal energy is currently used for both electric and non-electric applications worldwide, from conventional hydrothermal resources and in ground-source heat pumps, most of the emphasis in the US has been on generating electricity. 
To this end, there is a need for research, development, and demonstration in five important areas: estimating the magnitude and distribution of recoverable geothermal resources; establishing requirements for extracting and utilizing energy from EGS reservoirs, including drilling, reservoir design, and stimulation; exploring end-use options for district heating, electricity generation, and co-generation; evaluating environmental impacts and tradeoffs (from dealing with water and land use to seismic risk); and projecting costs for EGS-supplied electricity as a function of funds invested in research, development, and deployment in evolving energy markets. Finally, the shale gas revolution that has been underway in North America for the past few years has been of unprecedented scale and importance. As such resources begin to be developed globally, there is a critical need for fundamental research on questions such as how shale properties affect the success of stimulation, the importance of seismic and aseismic deformation mechanisms during reservoir stimulation, the factors that affect ultimate recovery, and the development of methodologies that minimize the environmental impact of shale gas development.

  11. E-Government: Issues and Implications for Public Libraries

    ERIC Educational Resources Information Center

    Berryman, Jennifer

    2004-01-01

    Reviews the literature of e-government deployment world-wide, focussing on two possible roles for public libraries in e-government. The first is a continuation of their traditional role of information provision and managing library transactions electronically and the second, a move to handling government business transactions as well. Identifies…

  12. Strategic Deployment of Orthographic Knowledge in Phoneme Detection

    ERIC Educational Resources Information Center

    Cutler, Anne; Treiman, Rebecca; van Ooijen, Brit

    2010-01-01

    The phoneme detection task is widely used in spoken-word recognition research. Alphabetically literate participants, however, are more used to explicit representations of letters than of phonemes. The present study explored whether phoneme detection is sensitive to how target phonemes are, or may be, orthographically realized. Listeners detected…

  13. Schooling, the School Effectiveness Movement, and Educational Reform.

    ERIC Educational Resources Information Center

    Angus, Lawrence

    The widely accepted notion that the management of resources in schools involves merely strategic decisions about the deployment of finances, staff, and materials must be contested. The school effectiveness movement ignores the social and political context of schools and, through emphasis upon superficial managerial matters, teaches pupils to…

  14. High Bandwidth Communications Links Between Heterogeneous Autonomous Vehicles Using Sensor Network Modeling and Extremum Control Approaches

    DTIC Science & Technology

    2008-12-01

    In future network-centric warfare environments, teams of autonomous vehicles will be deployed in a cooperative manner to conduct wide-area...of data back to the command station, autonomous vehicles configured with high bandwidth communication system are positioned between the command

  15. Continuing Development of California State Packet Radio Project.

    ERIC Educational Resources Information Center

    Brownrigg, Edwin

    1992-01-01

    Provides background on the California State Library Packet Radio project, which will use packet radios to deploy a wireless, high-speed, wide-area network of 600 nodes, including 100 libraries, in the San Francisco Bay Area. Project goals and objectives, plan of operation, equipment, and evaluation plans are summarized. (MES)

  16. Computing in the Clouds

    ERIC Educational Resources Information Center

    Johnson, Doug

    2010-01-01

    Web-based applications offer teachers, students, and school districts a convenient way to accomplish a wide range of tasks, from accounting to word processing, for free. Cloud computing has the potential to offer staff and students better services at a lower cost than the technology deployment models they're using now. Saving money and improving…

  17. Genome wide association studies on yield components using a lentil genetic diversity panel

    USDA-ARS?s Scientific Manuscript database

    The cool season food legume research community is now at the threshold of deploying the cutting-edge molecular genetics and genomics tools that have led to significant and rapid expansion of gene discovery, knowledge of gene function (including tolerance to biotic and abiotic stresses) and genetic ...

  18. An Interdisciplinary Field Robotics Program for Undergraduate Computer Science and Engineering Education

    ERIC Educational Resources Information Center

    Kitts, Christopher; Quinn, Neil

    2004-01-01

    Santa Clara University's Robotic Systems Laboratory conducts an aggressive robotic development and operations program in which interdisciplinary teams of undergraduate students build and deploy a wide range of robotic systems, ranging from underwater vehicles to spacecraft. These year-long projects expose students to the breadth of and…

  19. Deploying the Win TR-20 computational engine as a web service

    USDA-ARS?s Scientific Manuscript database

    Despite its simplicity and limitations, the runoff curve number method remains a widely-used hydrologic modeling tool, and its use through the USDA Natural Resources Conservation Service (NRCS) computer application WinTR-20 is expected to continue for the foreseeable future. To facilitate timely up...

  20. Novel Sensor for the In Situ Measurement of Uranium Fluxes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hatfield, Kirk

    2015-02-10

    The goal of this project was to develop a sensor that incorporates the field-tested concepts of the passive flux meter to provide direct in situ measures of flux for uranium and groundwater in porous media. Measurable contaminant fluxes [J] are essentially the product of concentration [C] and groundwater flux or specific discharge [q]. The sensor measures [J] and [q] by changes in contaminant and tracer amounts respectively on a sorbent. By using measurement rather than inference from static parameters, the sensor can directly advance conceptual and computational models for field scale simulations. The sensor was deployed in conjunction with DOE in obtaining field-scale quantification of subsurface processes affecting uranium transport (e.g., advection) and transformation (e.g., uranium attenuation) at the Rifle IFRC Site in Rifle, Colorado. Project results have expanded our current understanding of how field-scale spatial variations in fluxes of uranium, groundwater and salient electron donors/acceptors are coupled to spatial variations in measured microbial biomass/community composition, effective field-scale uranium mass balances, attenuation, and stability. The coupling between uranium, various nutrients and microflora can be used to estimate field-scale rates of uranium attenuation and field-scale transitions in microbial communities. This research focuses on uranium (VI), but the sensor principles and design are applicable to field-scale fate and transport of other radionuclides. Laboratory studies focused on sorbent selection and calibration, along with sensor development and validation under controlled conditions. Field studies were conducted at the Rifle IFRC Site in Rifle, Colorado. These studies were closely coordinated with existing SBR (formerly ERSP) projects to complement data collection.
Small field tests were conducted during the first two years that focused on evaluating field-scale deployment procedures and validating sensor performance under controlled field conditions. In the third and fourth year a suite of larger field studies were conducted. For these studies, the uranium flux sensor was used with uranium speciation measurements and molecular-biological tools to characterize microbial community and active biomass at synonymous wells distributed in a large grid. These field efforts quantified spatial changes in uranium flux and field-scale rates of uranium attenuation (ambient and stimulated), uranium stability, and quantitatively assessed how fluxes and effective reaction rates were coupled to spatial variations in microbial community and active biomass. Analyses of data from these field experiments were used to generate estimates of Monod kinetic parameters that are 'effective' in nature and optimal for modeling uranium fate and transport at the field-scale. This project provided the opportunity to develop the first sensor that provides direct measures of both uranium (VI) and groundwater flux. A multidisciplinary team was assembled to include two geochemists, a microbiologist, and two quantitative contaminant hydrologists. Now that the project is complete, the sensor can be deployed at DOE sites to evaluate field-scale uranium attenuation, source behavior, the efficacy of remediation, and off-site risk. Because the sensor requires no power, it can be deployed at remote sites for periods of days to months. The fundamental science derived from this project can be used to advance the development of predictive models for various transport and attenuation processes in aquifers. Proper development of these models is critical for long-term stewardship of contaminated sites in the context of predicting uranium source behavior, remediation performance, and off-site risk.
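    The flux relationship stated in the record above, contaminant flux [J] as the product of concentration [C] and specific discharge [q], can be sketched in a few lines. All numeric values below are illustrative assumptions, not data from the project:

```python
# Illustrative sketch of the passive-flux-meter relationship J = C * q
# described above. All numbers are hypothetical, not project data.

def contaminant_flux(concentration_mg_per_l, specific_discharge_cm_per_d):
    """Contaminant mass flux J [mg/cm^2/d] from C [mg/L] and q [cm/d]."""
    c_mg_per_cm3 = concentration_mg_per_l / 1000.0  # 1 L = 1000 cm^3
    return c_mg_per_cm3 * specific_discharge_cm_per_d

# Hypothetical example: 0.5 mg/L dissolved uranium at a Darcy flux of 20 cm/d
j = contaminant_flux(0.5, 20.0)
print(f"J = {j:.4f} mg/cm^2/d")  # 0.01 mg per cm^2 per day
```

    With these assumed inputs the flux works out to 0.01 mg cm^-2 d^-1; the sensor itself infers both quantities from mass changes on the sorbent rather than from point measurements of C and q.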

  1. Policies to keep and expand the option of concentrating solar power for dispatchable renewable electricity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lilliestam, Johan; Barradi, Touria; Caldes, Natalia

    Concentrating solar power (CSP) is one of the few renewable electricity technologies that can offer dispatchable electricity at large scale. Thus, it may play an important role in the future, especially to balance fluctuating sources in increasingly renewables-based power systems. Today, its costs are higher than those of PV and wind power and, as most countries do not support CSP, deployment is slow. Unless the expansion gains pace and costs decrease, the industry may stagnate or collapse, and an important technology for climate change mitigation would be lost. Keeping CSP as a maturing technology for dispatchable renewable power thus requires measures to improve its short-term economic attractiveness and to continue reducing costs in the longer term. We suggest a set of three policy instruments - feed-in tariffs or auctions reflecting the value of dispatchable CSP, and not merely its cost; risk coverage support for innovative designs; and demonstration projects - to be deployed in regions where CSP has a potentially large role to play. This could provide the CSP industry with a balance of attractive profits and competitive pressure, the incentive to expand CSP while also reducing its costs, making it ready for broad-scale deployment when it is needed.

  2. Policies to keep and expand the option of concentrating solar power for dispatchable renewable electricity

    DOE PAGES

    Lilliestam, Johan; Barradi, Touria; Caldes, Natalia; ...

    2018-02-16

    Concentrating solar power (CSP) is one of the few renewable electricity technologies that can offer dispatchable electricity at large scale. Thus, it may play an important role in the future, especially to balance fluctuating sources in increasingly renewables-based power systems. Today, its costs are higher than those of PV and wind power and, as most countries do not support CSP, deployment is slow. Unless the expansion gains pace and costs decrease, the industry may stagnate or collapse, and an important technology for climate change mitigation would be lost. Keeping CSP as a maturing technology for dispatchable renewable power thus requires measures to improve its short-term economic attractiveness and to continue reducing costs in the longer term. We suggest a set of three policy instruments - feed-in tariffs or auctions reflecting the value of dispatchable CSP, and not merely its cost; risk coverage support for innovative designs; and demonstration projects - to be deployed in regions where CSP has a potentially large role to play. This could provide the CSP industry with a balance of attractive profits and competitive pressure, the incentive to expand CSP while also reducing its costs, making it ready for broad-scale deployment when it is needed.

  3. Do interoperable national information systems enhance availability of data to assess the effect of scale-up of HIV services on health workforce deployment in resource-limited countries?

    PubMed

    Oluoch, Tom; Muturi, David; Kiriinya, Rose; Waruru, Anthony; Lanyo, Kevin; Nguni, Robert; Ojwang, James; Waters, Keith P; Richards, Janise

    2015-01-01

    Sub-Saharan Africa (SSA) bears the heaviest burden of the HIV epidemic. Health workers play a critical role in the scale-up of HIV programs. SSA also has the weakest information and communication technology (ICT) infrastructure globally. Implementing interoperable national health information systems (HIS) is a challenge, even in developed countries. Countries in resource-limited settings have yet to demonstrate that interoperable systems can be achieved, and can improve quality of healthcare through enhanced data availability and use in the deployment of the health workforce. We established interoperable HIS integrating a Master Facility List (MFL), District Health Information Software (DHIS2), and Human Resources Information Systems (HRIS) through application programming interfaces (APIs). We abstracted data on HIV care, health worker deployment, and health facility geo-coordinates. Over 95% of data elements were exchanged between the MFL-DHIS and HRIS-DHIS interfaces. The correlations between the number of HIV-positive clients and the numbers of nurses and clinical officers in 2013 were R2 = 0.251 and R2 = 0.261, respectively. Wrong MFL codes, data-type mismatches, and hyphens in legacy data were the key causes of data transmission errors. The lack of information exchange standards for aggregate data made programming time-consuming.
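    The R2 values quoted above are coefficients of determination between facility-level client counts and staffing. A minimal sketch of that calculation, with invented per-facility numbers standing in for the study's data:

```python
# Sketch of the facility-level association reported above: R^2 between
# HIV-positive client counts and health-worker deployment. The data below
# are invented for illustration; only the method (squared Pearson's r) is real.

def r_squared(xs, ys):
    """Coefficient of determination for a simple linear association."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return (sxy * sxy) / (sxx * syy)  # square of Pearson's r

clients = [120, 340, 90, 450, 210, 600]   # hypothetical per-facility counts
nurses = [4, 9, 3, 10, 7, 12]             # hypothetical staffing levels
print(f"R^2 = {r_squared(clients, nurses):.3f}")
```

    In the study this statistic was computed per cadre (nurses, clinical officers) once the MFL, DHIS2, and HRIS records had been linked through the API layer.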

  4. Passive flux meter measurement of water and nutrient flux in saturated porous media: bench-scale laboratory tests.

    PubMed

    Cho, Jaehyun; Annable, Michael D; Jawitz, James W; Hatfield, Kirk

    2007-01-01

    The passive nutrient flux meter (PNFM) is introduced for simultaneous measurement of both water and nutrient flux through saturated porous media. The PNFM comprises a porous sorbent pre-equilibrated with a suite of alcohol tracers, which have different partitioning coefficients. Water flux was estimated based on the loss of loaded resident tracers during deployment, while nutrient flux was quantified based on the nutrient solute mass captured on the sorbent. An anionic resin, Lewatit 6328 A, was used as a permeable sorbent and phosphate (PO4(3-)) was the nutrient studied. The phosphate sorption capacity of the resin was measured in batch equilibration tests as 56 mg PO4(3-) g(-1), which was determined to be adequate capacity to retain PO4(3-) loads intercepted over typical PNFM deployment periods in most natural systems. The PNFM design was validated with bench-scale laboratory tests for a range of 9.8 to 28.3 cm d(-1) Darcy velocities and 6 to 43 h deployment durations. Nutrient and water fluxes measured by the PNFM averaged within 6 and 12% of the applied values, respectively, indicating that the PNFM shows promise as a tool for simultaneous measurement of water and nutrient fluxes.
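    The record above notes that the resin's measured phosphate capacity (56 mg PO4 per g) is adequate for typical deployment loads. That feasibility check is simple arithmetic; the flux, device geometry, and sorbent mass below are illustrative assumptions, not values from the study:

```python
# Capacity check implied by the PNFM record above: will the resin's phosphate
# capacity retain the load intercepted over a deployment? Flux, area, and
# sorbent mass are hypothetical; only the 56 mg/g capacity is from the study.

CAPACITY_MG_PER_G = 56.0  # batch-test phosphate sorption capacity of the resin

def intercepted_load_mg(nutrient_flux_mg_cm2_d, area_cm2, days):
    """Phosphate mass crossing the meter's section during deployment [mg]."""
    return nutrient_flux_mg_cm2_d * area_cm2 * days

# Hypothetical: 0.02 mg/cm^2/d across a 20 cm^2 device for a 48 h deployment,
# holding an assumed 50 g of resin
load = intercepted_load_mg(0.02, 20.0, 2.0)
capacity = CAPACITY_MG_PER_G * 50.0
print(f"load = {load:.2f} mg vs capacity = {capacity:.0f} mg")
```

    Under these assumptions the intercepted load is orders of magnitude below capacity, which is the basis for the claim that the resin can retain PO4 loads over typical deployment periods in most natural systems.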

  5. NEON's Mobile Deployment Platform: A research tool for integrating ecological processes across scales

    NASA Astrophysics Data System (ADS)

    Sanclements, M.

    2016-12-01

    Here we provide an update on construction of the five NEON Mobile Deployment Platforms (MDPs) as well as a description of the infrastructure and sensors available to researchers in the near future. Additionally, we include information (i.e. timelines and procedures) on requesting MDPs for PI-led projects. The MDPs will provide the means to observe stochastic or spatially important events, gradients, or quantities that cannot be reliably observed using fixed-location sampling (e.g. fires and floods). Due to the transient temporal and spatial nature of such events, the MDPs are designed to accommodate rapid deployment for time periods up to 1 year. Broadly, the MDPs comprise infrastructure and instrumentation capable of functioning individually or in conjunction with one another to support observations of ecological change, as well as education, training and outreach. More specifically, the MDPs include the capability to make tower-based measures of ecosystem exchange, radiation, and precipitation in conjunction with baseline soils data such as CO2 flux, and soil temperature and moisture. An aquatics module is also available with the MDP to facilitate research integrating terrestrial and aquatic processes. Ultimately, the NEON MDPs provide a tool for linking PI-led research to the continental-scale data sets collected by NEON.

  6. Deploying Solid Targets in Dense Plasma Focus Devices for Improved Neutron Yields

    NASA Astrophysics Data System (ADS)

    Podpaly, Y. A.; Chapman, S.; Povilus, A.; Falabella, S.; Link, A.; Shaw, B. H.; Cooper, C. M.; Higginson, D.; Holod, I.; Sipe, N.; Gall, B.; Schmidt, A. E.

    2017-10-01

    We report on recent progress in using solid targets in dense plasma focus (DPF) devices. DPFs have been observed to generate energetic ion beams during the pinch phase; these beams interact with the dense plasma in the pinch region as well as the background gas and are believed to be the primary neutron generation mechanism for a D2 gas fill. Targets can be placed in the beam path to enhance neutron yield and to shorten the neutron pulse if desired. In this work, we measure yields from placing titanium deuteride foils, deuterated polyethylene, and non-deuterated control targets in deuterium filled DPFs at both megajoule and kilojoule scales. Furthermore, we have deployed beryllium targets in a helium gas-filled, kilojoule scale DPF for use as a potential AmBe radiological source replacement. Neutron yield, neutron time of flight, and optical images are used to diagnose the effectiveness of target deployments relative to particle-in-cell simulation predictions. A discussion of target holder engineering for material compatibility and damage control will be shown as well. Prepared by LLNL under Contract DE-AC52-07NA27344. Supported by the Office of Defense Nuclear Nonproliferation Research and Development within U.S. DOE's National Nuclear Security Administration and the LLNL Institutional Computing Grand Challenge program.

  7. Verify Occulter Deployment Tolerances as Part of NASA's Technology Development for Exoplanet Missions

    NASA Astrophysics Data System (ADS)

    Kasdin, N. J.; Shaklan, S.; Lisman, D.; Thomson, M.; Webb, D.; Cady, E.; Marks, G. W.; Lo, A.

    2013-01-01

    In support of NASA's Exoplanet Exploration Program and the Technology Development for Exoplanet Missions (TDEM), we recently completed a 2-year study of the manufacturability and metrology of starshade petals. An external occulter is a satellite employing a large screen, or starshade, that flies in formation with a spaceborne telescope to provide the starlight suppression needed for detecting and characterizing exoplanets. Among the advantages of using an occulter are the broad band allowed for characterization and the removal of light before entering the observatory, greatly relaxing the requirements on the telescope and instrument. This poster presents the results of our successful first TDEM, which demonstrated that an occulter petal could be built and measured to an accuracy consistent with close to 10^-10 contrast. We also present the progress in our second TDEM to demonstrate the next critical technology milestone: precision deployment of the central truss and petals to the necessary accuracy. We have completed manufacture of four sub-scale petals and a central hub to fit with an existing deployable truss. We show the plans for repeated stow and deploy tests of the assembly and the metrology to confirm that each deployment repeatably meets the absolute positioning requirements of the petals (better than 1.0 mm).

  8. Posttraumatic stress disorder and associated risk factors in Canadian peacekeeping veterans with health-related disabilities.

    PubMed

    Richardson, J Don; Naifeh, James A; Elhai, Jon D

    2007-08-01

    This study investigates posttraumatic stress disorder (PTSD) and its associated risk factors in a random, national, Canadian sample of United Nations peacekeeping veterans with service-related disabilities. Participants included 1016 male veterans (age < 65 years) who served in the Canadian Forces from 1990 to 1999 and were selected from a larger random sample of 1968 veterans who voluntarily and anonymously completed a general health survey conducted by Veterans Affairs Canada in 1999. Survey instruments included the PTSD Checklist-Military Version (PCL-M), Center for Epidemiological Studies-Depression Scale (CES-D), and questionnaires regarding life events during the past year, current stressors, sociodemographic characteristics, and military history. We found that rates of probable PTSD (PCL-M score > 50) among veterans were 10.92% for veterans deployed once and 14.84% for those deployed more than once. The rates of probable clinical depression (CES-D score > 16) were 30.35% for veterans deployed once and 32.62% for those deployed more than once. We found that, in multivariate analyses, probable PTSD rates and PTSD severity were associated with younger age, single marital status, and deployment frequency. PTSD is an important health concern in the veteran population. Understanding such risk factors as younger age and unmarried status can help predict morbidity among trauma-exposed veterans.

  9. Net radiative forcing from widespread deployment of photovoltaics.

    PubMed

    Nemet, Gregory F

    2009-03-15

    If photovoltaics (PV) are to contribute significantly to stabilizing the climate, they will need to be deployed on the scale of multiple terawatts. Installation of that much PV would cover substantial portions of the Earth's surface with dark-colored, sunlight-absorbing panels, reducing the Earth's albedo. How much radiative forcing would result from this change in land use? How does this amount compare to the radiative forcing avoided by substituting PV for fossil fuels? This analysis uses a series of simple equations to compare the two effects and finds that substitution dominates; the avoided radiative forcing due to substitution of PV for fossil fuels is approximately 30 times larger than the forcing due to albedo modification. Sensitivity analysis, including discounting of future costs and benefits, identifies unfavorable yet plausible configurations in which the albedo effect substantially reduces the climatic benefits of PV. The value of PV as a climate mitigation option depends on how it is deployed, not just how much is deployed: the efficiency of PV systems and the carbon intensity of the substituted energy are particularly important.
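    The style of "simple equations" comparison the abstract describes can be sketched as follows. All constants here are illustrative assumptions, not the paper's values: a 1 TW average PV fleet, 15% module efficiency, 200 W/m² mean insolation, a 0.3 albedo reduction over the panel area, 0.5 kg CO2 avoided per kWh displaced, a 30-year lifetime, and the standard simplified logarithmic CO2 forcing expression.

```python
import math

# Back-of-envelope comparison of albedo forcing vs. avoided forcing from
# fossil-fuel substitution, using illustrative constants (NOT the paper's).

EARTH_AREA = 5.1e14       # m^2, Earth's surface area
PV_POWER = 1e12           # W, average electrical output of the fleet
EFFICIENCY = 0.15         # module efficiency
INSOLATION = 200.0        # W/m^2, annual-mean insolation on the panels
DELTA_ALBEDO = 0.3        # albedo reduction: panels vs. underlying surface
CO2_PER_KWH = 0.5         # kg CO2 avoided per kWh of displaced generation
LIFETIME_YEARS = 30
PPM_PER_GT_CO2 = 1 / 7.8  # approx. atmospheric ppm rise per Gt CO2
BASELINE_PPM = 400.0

# Albedo term: extra sunlight absorbed by darker panels, averaged globally.
panel_area = PV_POWER / (EFFICIENCY * INSOLATION)  # m^2 needed for the fleet
forcing_albedo = DELTA_ALBEDO * INSOLATION * panel_area / EARTH_AREA

# Substitution term: lifetime CO2 avoided, converted to forcing via the
# simplified expression dF = 5.35 * ln(C / C0).
kwh_per_year = PV_POWER * 8760 / 1000
gt_co2_avoided = kwh_per_year * CO2_PER_KWH * LIFETIME_YEARS / 1e12
ppm_avoided = gt_co2_avoided * PPM_PER_GT_CO2
forcing_avoided = 5.35 * math.log((BASELINE_PPM + ppm_avoided) / BASELINE_PPM)

print(f"albedo forcing:  {forcing_albedo:.4f} W/m^2")
print(f"avoided forcing: {forcing_avoided:.4f} W/m^2")
print(f"ratio:           {forcing_avoided / forcing_albedo:.0f}x")
```

    With these assumed constants the substitution term comes out tens of times larger than the albedo term, the same regime as the paper's ~30x result; the exact factor depends entirely on the chosen inputs.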

  10. Verifying occulter deployment tolerances as part of NASA's technology development for exoplanet missions

    NASA Astrophysics Data System (ADS)

    Kasdin, N. J.; Lisman, D.; Shaklan, S.; Thomson, M.; Webb, D.; Cady, E.; Marks, G. W.; Lo, A.

    2013-09-01

    An external occulter is a satellite employing a large screen, or starshade, that flies in formation with a spaceborne telescope to provide the starlight suppression needed for detecting and characterizing exoplanets. Among the advantages of using an occulter are the broadband allowed for characterization and the removal of light before entering the observatory, greatly relaxing the requirements on the telescope and instrument. In support of NASA's Exoplanet Exploration Program and the Technology Development for Exoplanet Missions (TDEM), we recently completed a 2-year study of the manufacturability and metrology of starshade petals. In this paper we review the results of that successful first TDEM, which demonstrated that an occulter petal could be built and measured to an accuracy consistent with close to 10^-10 contrast. We then present the results of our second TDEM to demonstrate the next critical technology milestone: precision deployment of the central truss and petals to the necessary accuracy. We show the deployment of an existing deployable truss outfitted with four sub-scale petals and a custom designed central hub.

  11. HF-induced airglow structure as a proxy for ionospheric irregularity detection

    NASA Astrophysics Data System (ADS)

    Kendall, E. A.

    2013-12-01

    The High Frequency Active Auroral Research Program (HAARP) heating facility allows scientists to test current theories of plasma physics to gain a better understanding of the underlying mechanisms at work in the lower ionosphere. One powerful technique for diagnosing radio frequency interactions in the ionosphere is to use ground-based optical instrumentation. High-frequency (HF), heater-induced artificial airglow observations can be used to diagnose electron energies and distributions in the heated region, illuminate natural and/or artificially induced ionospheric irregularities, determine ExB plasma drifts, and measure quenching rates by neutral species. Artificial airglow is caused by HF-accelerated electrons colliding with various atmospheric constituents, which in turn emit a photon. The most common emissions are 630.0 nm O(1D), 557.7 nm O(1S), and 427.8 nm N2+(1NG). Because more photons will be emitted in regions of higher electron energization, it may be possible to use airglow imaging to map artificial field-aligned irregularities at a particular altitude range in the ionosphere. Since fairly wide field-of-view imagers are typically deployed in airglow campaigns, it is not well-known what meter-scale features exist in the artificial airglow emissions. Rocket data show that heater-induced electron density variations, or irregularities, consist of bundles of ~10-m-wide magnetic field-aligned filaments with a mean depletion depth of 6% [Kelley et al., 1995]. These bundles themselves constitute small-scale structures with widths of 1.5 to 6 km. Telescopic imaging provides high resolution spatial coverage of ionospheric irregularities and goes hand in hand with other observing techniques such as GPS scintillation, radar, and ionosonde. 
Since airglow observations can presumably image ionospheric irregularities (electron density variations), they can be used to determine the spatial scale variation, the fill factor, and the lifetime characteristics of irregularities. Telescopic imaging of airglow is a technique capable of simultaneously determining the properties of ionospheric irregularities at decameter resolution over a range of several kilometers. The HAARP telescopic imager consists of two cameras, a set of optics for each camera, and a robotic mount that supports and orients the system. The camera and optics systems are identical except for the camera lenses: one has a wide-angle lens (~19 degrees) and the other has a telescopic lens (~3 degrees). The telescopic imager has a resolution of ~20 m in the F layer and ~10 m in the E layer, which allows the observation of decameter- and kilometer-scale features. Analysis of telescopic data from HAARP campaigns over the last five years will be presented.
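    The imager resolutions quoted above follow from simple geometry: the footprint of one pixel at a given layer altitude is roughly the field-of-view swath divided by the sensor width in pixels. The sketch below assumes a 512-pixel sensor (an assumption for illustration); the ~3° telescopic field of view and the E/F-layer altitudes are taken from the description above.

```python
import math

# Sketch of the ground-resolution estimate for a narrow-field airglow imager:
# pixel footprint ~ 2 * altitude * tan(FOV / 2) / n_pixels.

def pixel_footprint_m(fov_deg: float, altitude_km: float, n_pixels: int) -> float:
    """Approximate size of the patch imaged by one pixel at the given altitude."""
    swath_m = 2 * altitude_km * 1e3 * math.tan(math.radians(fov_deg) / 2)
    return swath_m / n_pixels

N_PIXELS = 512  # assumed sensor width, for illustration only

for layer, altitude_km in [("E layer", 100), ("F layer", 250)]:
    res = pixel_footprint_m(3.0, altitude_km, N_PIXELS)
    print(f"telescopic lens ({layer}, {altitude_km} km): ~{res:.0f} m/pixel")
```

    Under these assumptions the footprint works out to roughly 10 m in the E layer and a few tens of meters in the F layer, consistent with the decameter-scale resolution quoted for the HAARP telescopic imager.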

  12. Efficient and sustainable deployment of bioenergy with carbon capture and storage in mitigation pathways

    NASA Astrophysics Data System (ADS)

    Kato, E.; Moriyama, R.; Kurosawa, A.

    2016-12-01

    Bioenergy with Carbon Capture and Storage (BECCS) is a key component of mitigation strategies in future socio-economic scenarios that aim to keep the mean global temperature rise well below 2°C above pre-industrial levels, which would require net negative carbon emissions at the end of the 21st century. The Paris Agreement from COP21 likewise calls for "a balance between anthropogenic emissions by sources and removals by sinks of greenhouse gases in the second half of this century," which could require large-scale deployment of negative emissions technologies later in this century. Because of the additional requirement for land, developing sustainable low-carbon scenarios requires careful consideration of the land-use implications of large-scale BECCS. In this study, we present possible development strategies for low-carbon scenarios that consider the interaction of economically efficient deployment of bioenergy and/or BECCS technologies, biophysical limits on bioenergy productivity, and food production. In the evaluations, detailed bioenergy representations, including bioenergy feedstocks and conversion technologies with and without CCS, are implemented in the integrated assessment model GRAPE. To overcome a general discrepancy in yield projections between 'top-down' integrated assessment models and 'bottom-up' estimates, we applied yield changes of food and bioenergy crops consistent with process-based biophysical models: PRYSBI-2 (Process-Based Regional-Scale Yield Simulator with Bayesian Inference) for food crops, and SWAT (Soil and Water Assessment Tool) for bioenergy crops, under changing climate conditions. Using this framework, economically viable strategies for implementing sustainable BECCS are evaluated.

  13. Concern over radiation exposure and psychological distress among rescue workers following the Great East Japan Earthquake

    PubMed Central

    2012-01-01

    Background: On March 11, 2011, the Great East Japan Earthquake and the tsunami that followed caused severe damage along Japan's northeastern coastline and to the Fukushima Daiichi nuclear power plant. To date, there are few reports specifically examining psychological distress in rescue workers in Japan. Moreover, it is unclear to what extent concern over radiation exposure has caused psychological distress to such workers deployed in the disaster area. Methods: One month after the disaster, 424 of 1816 (24%) disaster medical assistance team workers deployed to the disaster area were assessed. Concern over radiation exposure was evaluated by a single self-reported question. General psychological distress was assessed with the Kessler 6 scale (K6), depressive symptoms with the Center for Epidemiologic Studies Depression Scale (CES-D), fear and sense of helplessness with the Peritraumatic Distress Inventory (PDI), and posttraumatic stress symptoms with the Impact of Event Scale-Revised (IES-R). Results: Radiation exposure was a concern for 39 (9.2%) respondents. Concern over radiation exposure was significantly associated with higher scores on the K6, CES-D, PDI, and IES-R. After controlling for age, occupation, disaster operation experience, duration of time spent watching earthquake news, and past history of psychiatric illness, these associations remained significant in men, but did not remain significant in women for the CES-D and PDI scores. Conclusion: The findings suggest that concern over radiation exposure was strongly associated with psychological distress. Reliable, accurate information on radiation exposure might reduce deployment-related distress in disaster rescue workers. PMID:22455604

  14. Role of C-arm VasoCT in the use of endovascular WEB flow disruption in intracranial aneurysm treatment.

    PubMed

    Caroff, J; Mihalea, C; Neki, H; Ruijters, D; Ikka, L; Benachour, N; Moret, J; Spelle, L

    2014-07-01

    The WEB aneurysm embolization system is still under evaluation but appears to be a promising technique for treating wide-neck bifurcation aneurysms. However, this device is barely visible using conventional DSA; thus, high-resolution contrast-enhanced flat panel detector CT (VasoCT) may be useful before detachment to assess the sizing and positioning of the WEB. The purpose of this study was to evaluate the usefulness of VasoCT during WEB procedures. From March 2012 to July 2013, twelve patients (10 women and 2 men; age range, 44-55 years) were treated for 13 intracranial aneurysms with the WEB device. DSA and VasoCT were used and compared to depict any protrusion of the device into parent arteries before detachment. Two neuroradiologists reviewed each VasoCT scan, and the quality was graded on a subjective quality scale. The mesh of the WEB was very well depicted in all cases, allowing a very good assessment of its deployment. Device protrusion was clearly detected with VasoCT in 5 cases, leading to WEB repositioning or size substitution. During follow-up, VasoCT also allowed good assessment of any residual blood flow inside the aneurysm or the WEB device. Unlike DSA, VasoCT is an excellent tool to assess WEB deployment and positioning. In our experience, it allowed a precise evaluation of WEB sizing and its relation to the parent vessel. Such information very likely enhances the ability to use this device safely, avoiding potential thromboembolic events in cases of protrusion into the parent arteries. © 2014 by American Journal of Neuroradiology.

  15. Mantle structure beneath eastern Africa: Evidence for a through-going mantle anomaly and its implications for the origin of Cenozoic tectonism in eastern Africa

    NASA Astrophysics Data System (ADS)

    Mulibo, G.; Tugume, F.; Julia, J.

    2012-12-01

    In this study, teleseismic earthquakes recorded on over 60 temporary AfricaArray seismic stations deployed in Uganda, Kenya, Tanzania and Zambia between 2007 and 2011 are used to invert P and S travel time residuals, together with travel time residuals from previous deployments, for a 3D image of mantle wave speeds and for examining relief on transition zone discontinuities using receiver function stacks. Tomographic images reveal a low wave speed anomaly (LWA) that dips to the SW beneath northern Zambia, extending to a depth of at least 900 km. The anomaly appears to be continuous across the transition zone, extending into the lower mantle. Receiver function stacks reveal an average transition zone thickness (TZT) across a wide region extending from central Zambia to the NE through Tanzania and into Kenya, which is ~30-40 km thinner than the global average. These results are not easily explained by models for the origin of the Cenozoic tectonism in eastern Africa that invoke a plume head or small scale convection either by edge flow or passive stretching of the lithosphere. However, the depth extent of the LWA coincident with a thin transition zone is consistent with a model invoking a through-going mantle anomaly beneath eastern Africa that links anomalous upper mantle to the African Superplume anomaly in the lower mantle beneath southern Africa. This finding indicates that geodynamic processes deep in the lower mantle are influencing surface dynamics across the Afro-Arabian rift system.

  16. Deep Seismic Structure of the Texas-Gulf of Mexico Passive Margin

    NASA Astrophysics Data System (ADS)

    Pulliam, J.; Gurrola, H.

    2013-12-01

    The Texas-Gulf of Mexico region has witnessed a wide range of tectonic processes, including deformation due to orogeny, continental collision, and rifting. Artifacts of these processes are likely to remain at lithospheric depths beneath the region but, until recently, the tools needed to examine structures at mantle depths were not available. With the passage of EarthScope's USArray stations and the completion of a targeted broadband deployment, new images of the region's lithosphere have emerged. These images reveal lithospheric-scale anomalies that correlate strongly with surface features, such as a large fast anomaly that corresponds to the southern extent of the Laurentia (or 'Great Plains') craton and a large slow anomaly associated with the Southern Oklahoma Aulacogen. Other features that would not have been expected based on surface tectonics include a slow layer that we interpret to be a shear zone at the base of the cratonic root and the transitional continental lithosphere, and a zone that is bounded at its top and bottom by discontinuities and high levels of seismic anisotropy. Additionally, a high-velocity body underlying the Gulf Coast Plains may mark delaminating lower crust; if so, it provides indirect evidence that active rifting best describes the process that led to the opening of the Gulf of Mexico. These new results are based on the analysis of 326 USArray broadband seismic stations and a 23-station broadband deployment across Texas' passive margin, from Matagorda Island, a barrier island in the Gulf of Mexico, to Johnson City, TX, on the relatively undisturbed Proterozoic crust of central Texas.

  17. Quantifying the Impact of BOReal Forest Fires on Tropospheric Oxidants Over the Atlantic Using Aircraft and Satellites (BORTAS) Experiment: Design, Execution, and Science Overview

    NASA Technical Reports Server (NTRS)

    Palmer, Paul I.; Parrington, Mark; Lee, James D.; Lewis, Alistair C.; Richard, Andrew R.; Bernath, Peter F.; Pawson, Steven; daSilva, Arlindo M.; Duck, Thomas J.; Waugh, David L.; et al.

    2013-01-01

    We describe the design and execution of the BORTAS (Quantifying the impact of BOReal forest fires on Tropospheric oxidants using Aircraft and Satellites) experiment, which has the overarching objective of understanding the chemical aging of airmasses that contain the emission products from seasonal boreal wildfires and how these airmasses subsequently impact downwind atmospheric composition. The central focus of the experiment was a two-week deployment of the UK BAe-146-301 Atmospheric Research Aircraft (ARA) over eastern Canada. The planned July 2010 deployment of the ARA was postponed by 12 months because of activities related to the dispersal of material emitted by the Eyjafjallajökull volcano. However, most other planned model and measurement activities, including ground-based measurements at the Dalhousie University Ground Station (DGS), enhanced ozonesonde launches, and measurements at the Pico Atmospheric Observatory in the Azores, went ahead and constituted phase A of the experiment. Phase B of BORTAS in July 2011 included the same measurements, but included the ARA, special satellite observations and a more comprehensive measurement suite at the DGS. Integrating these data helped us to describe pyrogenic plumes from wildfires on a wide spectrum of temporal and spatial scales. We interpret these data using a range of chemistry models, from a near-explicit gas-phase chemical mechanism to regional and global models of atmospheric transport and lumped chemistry. We also present an overview of some of the new science that has originated from this project.

  18. Deployment of Wind Turbines in the Built Environment: Risks, Lessons, and Recommended Practices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baring-Gould, Ian; Fields, Jason; Oteri, Frank

    Built-environment wind turbine (BEWT) projects are wind energy projects constructed on, in, or near buildings. These projects present an opportunity for distributed, low-carbon generation combined with highly visible statements on sustainability, but the BEWT niche of the wind industry is still developing and is less mature than the utility-scale wind or conventional ground-based distributed wind sectors. This poster investigates the current state of the BEWT industry by reviewing available literature on BEWT projects as well as interviewing project owners on their experiences deploying and operating the technology.

  19. Advanced Commercial Buildings Initiative Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Sydney G.

    The Southface Advanced Commercial Buildings Initiative has developed solutions to overcome market barriers to energy reductions in small commercial buildings by building on the success of four local and Southeast regional energy efficiency deployment programs. These programs address a variety of small commercial building types, efficiency levels, owners, facility manager skills and needs for financing. The deployment programs also reach critical private sector, utility, nonprofit and government submarkets, and have strong potential to be replicated at scale. During the grant period, 200 small commercial buildings participated in Southface-sponsored energy upgrade programs, saving 166,736,703 kBtu of source energy.

  20. Deploying Liquid Filaments and Suspensions with an Electrohydrodynamic Liquid Bridge

    NASA Astrophysics Data System (ADS)

    Saville, D. A.

    2005-11-01

    We show that a dynamic liquid bridge can be formed by deploying the filament issuing from a Taylor cone onto a surface, with the nozzle and surface held at different electric potentials. This configuration differs sharply from the familiar `electrospinning' configuration, in which the filament whips violently. Nevertheless, although the aspect ratio (length/diameter) exceeds the Plateau limit by more than two orders of magnitude, the bridge is stable. Here we report on the stability characteristics and show that such a bridge can be used to `print' sub-micron scale features on a moving surface with both clear fluids and suspensions.
