Efficient On-Demand Operations in Large-Scale Infrastructures
ERIC Educational Resources Information Center
Ko, Steven Y.
2009-01-01
In large-scale distributed infrastructures such as clouds, Grids, peer-to-peer systems, and wide-area testbeds, users and administrators typically desire to perform "on-demand operations" that deal with the most up-to-date state of the infrastructure. However, the scale and dynamism present in the operating environment make it challenging to…
Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy
2012-11-01
Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large-scale and 35 on a small-scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high work-load and a high risk of detection, and thus demand a higher level of organizational skills than for small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production will imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task-ideologically, technically, economically and personally. The many obstacles that small-scale growers face and the lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed. Copyright © 2012 Elsevier B.V. All rights reserved.
He, Xinhua; Hu, Wenfa
2014-01-01
This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in large-scale affected area of disasters. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but are coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources in different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model.
He, Xinhua
2014-01-01
This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in large-scale affected area of disasters. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but are coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources in different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model. PMID:24688367
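For readers unfamiliar with how a genetic algorithm can be applied to this kind of location-allocation problem, the sketch below evolves assignments of demand points to rescue servers against a response-time objective that combines travel time with an M/M/1-style waiting term. It is a minimal illustration, not the authors' model: the network size, travel times, demand and service rates, and GA settings are all assumed values.

```python
import random

# Illustrative data: 3 rescue centers (servers) and 6 demand points (all values assumed).
travel_time = [[2, 5, 4, 7, 3, 6],   # server 0 -> demand points
               [6, 2, 3, 4, 7, 5],
               [4, 6, 2, 3, 5, 2]]
demand_rate = [3, 2, 4, 1, 2, 3]      # rescue requests per hour at each demand point
service_rate = 12                     # each server can handle 12 requests per hour

def response_time(assign):
    """Travel time plus an M/M/1-style waiting term for each loaded server."""
    load = [0.0] * 3
    for point, server in enumerate(assign):
        load[server] += demand_rate[point]
    total = 0.0
    for point, server in enumerate(assign):
        if load[server] >= service_rate:          # infeasible: queue grows without bound
            return float("inf")
        wait = load[server] / (service_rate * (service_rate - load[server]))
        total += demand_rate[point] * (travel_time[server][point] + wait)
    return total

def genetic_search(pop_size=40, generations=200, mut_prob=0.1):
    pop = [[random.randrange(3) for _ in range(6)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=response_time)
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, 6)
            child = a[:cut] + b[cut:]                             # one-point crossover
            if random.random() < mut_prob:
                child[random.randrange(6)] = random.randrange(3)  # mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=response_time)

best = genetic_search()
print("best assignment:", best, "objective:", round(response_time(best), 3))
```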
Demand surge following earthquakes
Olsen, Anna H.
2012-01-01
Demand surge is understood to be a socio-economic phenomenon where repair costs for the same damage are higher after large- versus small-scale natural disasters. It has reportedly increased monetary losses by 20 to 50%. In previous work, a model for the increased costs of reconstruction labor and materials was developed for hurricanes in the Southeast United States. The model showed that labor cost increases, rather than the material component, drove the total repair cost increases, and this finding could be extended to earthquakes. A study of past large-scale disasters suggested that there may be additional explanations for demand surge. Two such explanations specific to earthquakes are the exclusion of insurance coverage for earthquake damage and possible concurrent causation of damage from an earthquake followed by fire or tsunami. Additional research into these aspects might provide a better explanation for increased monetary losses after large- vs. small-scale earthquakes.
Large Scale Traffic Simulations
DOT National Transportation Integrated Search
1997-01-01
Large scale microscopic (i.e. vehicle-based) traffic simulations pose high demands on computation speed in at least two application areas: (i) real-time traffic forecasting, and (ii) long-term planning applications (where repeated "looping" between t...
NASA Astrophysics Data System (ADS)
Tyralis, Hristos; Karakatsanis, Georgios; Tzouka, Katerina; Mamassis, Nikos
2015-04-01
The Greek electricity system is examined for the period 2002-2014. The demand load data are analysed at various time scales (hourly, daily, seasonal and annual) and they are related to the mean daily temperature and the gross domestic product (GDP) of Greece for the same time period. The prediction of energy demand, a product of the Greek Independent Power Transmission Operator, is also compared with the demand load. Interesting results about the change of the electricity demand scheme after the year 2010 are derived. This change is related to the decrease of the GDP, during the period 2010-2014. The results of the analysis will be used in the development of an energy forecasting system which will be a part of a framework for optimal planning of a large-scale hybrid renewable energy system in which hydropower plays the dominant role. Acknowledgement: This research was funded by the Greek General Secretariat for Research and Technology through the research project Combined REnewable Systems for Sustainable ENergy DevelOpment (CRESSENDO; grant number 5145)
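A hedged sketch of the kind of multi-scale demand analysis described above: hourly load is aggregated to daily means and related to temperature with a quadratic fit (heating load at low temperatures, cooling load at high ones). The data here are synthetic stand-ins, not the Greek operator's records.

```python
import numpy as np

# Synthetic stand-ins for 13 years of hourly load (MW) and daily mean temperature (deg C);
# magnitudes and shapes are assumptions, not the actual Greek system data.
n_days = 13 * 365
hourly_load = (5000 + 800 * np.sin(2 * np.pi * np.arange(n_days * 24) / 24)
               + np.random.normal(0, 100, n_days * 24))
daily_load = hourly_load.reshape(n_days, 24).mean(axis=1)      # hourly -> daily scale
daily_temp = 15 + 10 * np.sin(2 * np.pi * np.arange(n_days) / 365.25)

# A quadratic fit captures the U-shaped dependence of demand on temperature
# (heating demand at low T, cooling demand at high T).
coeffs = np.polyfit(daily_temp, daily_load, deg=2)
print("load(T) ~ %.2f T^2 + %.2f T + %.2f" % tuple(coeffs))
```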
NASA Astrophysics Data System (ADS)
Rajagopalan, K.; Chinnayakanahalli, K. J.; Stockle, C. O.; Nelson, R. L.; Kruger, C. E.; Brady, M. P.; Malek, K.; Dinesh, S. T.; Barber, M. E.; Hamlet, A. F.; Yorgey, G. G.; Adam, J. C.
2018-03-01
Adaptation to a changing climate is critical to address future global food and water security challenges. While these challenges are global, successful adaptation strategies are often generated at regional scales; therefore, regional-scale studies are critical to inform adaptation decision making. While climate change affects both water supply and demand, water demand is relatively understudied, especially at regional scales. The goal of this work is to address this gap, and characterize the direct impacts of near-term (for the 2030s) climate change and elevated CO2 levels on regional-scale crop yields and irrigation demands for the Columbia River basin (CRB). This question is addressed through a coupled crop-hydrology model that accounts for site-specific and crop-specific characteristics that control regional-scale response to climate change. The overall near-term outlook for agricultural production in the CRB is largely positive, with yield increases for most crops and small overall increases in irrigation demand. However, there are crop-specific and location-specific negative impacts as well, and the aggregate regional response of irrigation demands to climate change is highly sensitive to the spatial crop mix. Low-value pasture/hay varieties of crops—typically not considered in climate change assessments—play a significant role in determining the regional response of irrigation demands to climate change, and thus cannot be overlooked. While, the overall near-term outlook for agriculture in the region is largely positive, there may be potential for a negative outlook further into the future, and it is important to consider this in long-term planning.
Motor Vehicle Demand Models : Assessment of the State of the Art and Directions for Future Research
DOT National Transportation Integrated Search
1981-04-01
The report provides an assessment of the current state of motor vehicle demand modeling. It includes a detailed evaluation of one leading large-scale econometric vehicle demand model, which is tested for both logical consistency and forecasting accur...
Large-scale water projects in the developing world: Revisiting the past and looking to the future
NASA Astrophysics Data System (ADS)
Sivakumar, Bellie; Chen, Ji
2014-05-01
During the past half century or so, the developing world has been witnessing a significant increase in freshwater demands due to a combination of factors, including population growth, increased food demand, improved living standards, and water quality degradation. Since there exists significant variability in rainfall and river flow in both space and time, large-scale storage and distribution of water has become a key means to meet these increasing demands. In this regard, large dams and water transfer schemes (including river-linking schemes and virtual water trades) have been playing a key role. While the benefits of such large-scale projects in supplying water for domestic, irrigation, industrial, hydropower, recreational, and other uses both in the countries of their development and in other countries are undeniable, concerns about their negative impacts, such as high initial costs and damages to our ecosystems (e.g. river environment and species) and socio-economic fabric (e.g. relocation and socio-economic changes of affected people), have also been increasing in recent years. These have led to serious debates on the role of large-scale water projects in the developing world and on their future, but the often one-sided nature of such debates has inevitably failed to yield fruitful outcomes thus far. The present study aims to offer a far more balanced perspective on this issue. First, it recognizes and emphasizes the continuing need for additional large-scale water structures in the developing world in the future, due to the continuing increase in water demands, inefficiency in water use (especially in the agricultural sector), and absence of equivalent and reliable alternatives. Next, it reviews a few important success and failure stories of large-scale water projects in the developing world (and in the developed world), in an effort to arrive at a balanced view on the future role of such projects. Then, it discusses some major challenges in future water planning and management, with proper consideration of potential technological developments and new options. Finally, it highlights the urgent need for a broader framework that integrates the physical science-related aspects ("hard sciences") and the human science-related aspects ("soft sciences").
SCALE(ing)-UP Teaching: A Case Study of Student Motivation in an Undergraduate Course
ERIC Educational Resources Information Center
Chittum, Jessica R.; McConnell, Kathryne Drezek; Sible, Jill
2017-01-01
Teaching large classes is increasingly common; thus, demand for effective large-class pedagogy is rising. One method, titled "SCALE-UP" (Student-Centered Active Learning Environment for Undergraduate Programs), is intended for large classes and involves collaborative, active learning in a technology-rich and student-centered environment.…
Assuring Quality in Large-Scale Online Course Development
ERIC Educational Resources Information Center
Parscal, Tina; Riemer, Deborah
2010-01-01
Student demand for online education requires colleges and universities to rapidly expand the number of courses and programs offered online while maintaining high quality. This paper outlines two universities' respective processes to assure quality in large-scale online programs that integrate instructional design, eBook custom publishing, Quality…
Large-Scale 3D Printing: The Way Forward
NASA Astrophysics Data System (ADS)
Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid
2018-03-01
Research on small-scale 3D printing has rapidly evolved, where numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.
Causal Inferences with Large Scale Assessment Data: Using a Validity Framework
ERIC Educational Resources Information Center
Rutkowski, David; Delandshere, Ginette
2016-01-01
To answer the calls for stronger evidence by the policy community, educational researchers and their associated organizations increasingly demand more studies that can yield causal inferences. International large-scale assessments (ILSAs) have been targeted as a rich data source for causal research. It is in this context that we take up a…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Downing, M.; Langseth, D.; Stoffel, R.
1996-12-31
The purpose of this project was to track and monitor costs of planting, maintaining, and monitoring large-scale commercial plantings of hybrid poplar in Minnesota. These costs assist potential growers and purchasers of this resource in determining the ways in which supply and demand may be secured through developing markets.
Green and blue water demand from large-scale land acquisitions in Africa
Johansson, Emma Li; Fader, Marianela; Seaquist, Jonathan W.; Nicholas, Kimberly A.
2016-01-01
In the last decade, more than 22 million ha of land have been contracted to large-scale land acquisitions in Africa, leading to increased pressures, competition, and conflicts over freshwater resources. Currently, 3% of contracted land is in production, for which we model site-specific water demands to indicate where freshwater appropriation might pose high socioenvironmental challenges. We use the dynamic global vegetation model Lund–Potsdam–Jena managed Land to simulate green (precipitation stored in soils and consumed by plants through evapotranspiration) and blue (extracted from rivers, lakes, aquifers, and dams) water demand and crop yields for seven irrigation scenarios, and compare these data with two baseline scenarios of staple crops representing previous water demand. We find that most land acquisitions are planted with crops that demand large volumes of water (>9,000 m3⋅ha−1) like sugarcane, jatropha, and eucalyptus, and that staple crops have lower water requirements (<7,000 m3⋅ha−1). Blue water demand varies with irrigation system, crop choice, and climate. Even if the most efficient irrigation systems were implemented, 18% of the land acquisitions, totaling 91,000 ha, would still require more than 50% of water from blue water sources. These hotspots indicate areas at risk for transgressing regional constraints for freshwater use as a result of overconsumption of blue water, where socioenvironmental systems might face increased conflicts and tensions over water resources. PMID:27671634
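The hotspot criterion reported above (more than 50% of crop water demand met from blue water) can be checked per deal with a few lines; the deal names, areas, and per-hectare demands below are illustrative placeholders, not values from the study.

```python
# Hypothetical per-deal water demands (m3 per ha); values are illustrative only.
deals = [
    {"name": "deal_A", "area_ha": 12000, "green": 4200, "blue": 5200},
    {"name": "deal_B", "area_ha": 8000,  "green": 6500, "blue": 1800},
    {"name": "deal_C", "area_ha": 30000, "green": 3900, "blue": 4600},
]

hotspots = [d for d in deals
            if d["blue"] / (d["blue"] + d["green"]) > 0.5]   # >50% from blue water
hotspot_area = sum(d["area_ha"] for d in hotspots)
print("hotspot deals:", [d["name"] for d in hotspots], "total area:", hotspot_area, "ha")
```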
The biomechanical demands of manual scaling on the shoulders & neck of dental hygienists.
La Delfa, Nicholas J; Grondin, Diane E; Cox, Jocelyn; Potvin, Jim R; Howarth, Samuel J
2017-01-01
The purpose of this study was to evaluate the postural and muscular demands placed on the shoulders and neck of dental hygienists when performing a simulated manual scaling task. Nineteen healthy female dental hygienists performed 30-min of simulated manual scaling on a manikin head in a laboratory setting. Surface electromyography was used to monitor muscle activity from several neck and shoulder muscles, and neck and arm elevation kinematics were evaluated using motion capture. The simulated scaling task resulted in a large range of neck and arm elevation angles and excessive low-level muscular demands in the neck extensor and scapular stabilising muscles. The physical demands varied depending on the working position of the hygienists relative to the manikin head. These findings are valuable in guiding future ergonomics interventions aimed at reducing the physical exposures of dental hygiene work. Practitioner Summary: Given that this study evaluates the physical demands of manual scaling, a procedure that is fundamental to dental hygiene work, the findings are valuable to identify ergonomics interventions to reduce the prevalence of work-related injuries, disability and the potential for early retirement among this occupational group.
Hu, Michael Z.; Zhu, Ting
2015-12-04
This study reviews the experimental synthesis and engineering developments that focused on various green approaches and large-scale process production routes for quantum dots. Fundamental process engineering principles were illustrated. In relation to the small-scale hot injection method, our discussions focus on the non-injection route that could be scaled up with engineering stirred-tank reactors. In addition, applications that demand the use of quantum dots as "commodity" chemicals are discussed, including solar cells and solid-state lighting.
Energy demand analytics using coupled technological and economic models
Impacts of a range of policy scenarios on end-use energy demand are examined using a coupling of MARKAL, an energy system model with extensive supply and end-use technological detail, with Inforum LIFT, a large-scale model of the U.S. economy with inter-industry, government, and c...
DOT National Transportation Integrated Search
1999-12-01
This paper analyzes the freight demand characteristics that drive modal choice by means of a large scale, national, disaggregate revealed preference database for shippers in France in 1988, using a nested logit. Particular attention is given to priva...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramamurthy, Byravamurthy
2014-05-05
In this project, we developed scheduling frameworks for dynamic bandwidth demands of large-scale science applications. Apart from theoretical approaches such as Integer Linear Programming, Tabu Search and Genetic Algorithm heuristics, we utilized practical data from the ESnet OSCARS project (from our DOE lab partners) to conduct realistic simulations of our approaches. We have disseminated our work through conference paper presentations, journal papers and a book chapter. In this project we addressed the problem of scheduling lightpaths over optical wavelength division multiplexed (WDM) networks, and published several conference and journal papers on this topic. We also addressed the problem of joint allocation of computing, storage and networking resources in Grid/Cloud networks and proposed energy-efficient mechanisms for operating optical WDM networks.
NASA Astrophysics Data System (ADS)
Voisin, N.; Kintner-Meyer, M.; Skaggs, R.; Xie, Y.; Wu, D.; Nguyen, T. B.; Fu, T.; Zhou, T.
2016-12-01
Heat waves and droughts are projected to become more frequent and intense. We have seen in the past the effects of each of these extreme climate events on electricity demand and constrained electricity generation, challenging power system operations. Our aim here is to understand the compounding effects under historical conditions. We present a benchmark of Western US grid performance under 55 years of historical climate, including droughts, using 2010-level water demand and water management infrastructure, and 2010-level electricity grid infrastructure and operations. We leverage CMIP5 historical hydrology simulations and force a large-scale river routing-reservoir model with 2010-level sectoral water demands. The regulated flow at each water-dependent generating plant is processed to adjust the water-dependent electricity generation parameterization in a production cost model that represents 2010-level power system operations with the hourly energy demand of 2010. The resulting benchmark includes a risk distribution of several grid performance metrics (unserved energy, production cost, carbon emissions) as a function of inter-annual variability in regional water availability and predictability using large-scale climate oscillations. In the second part of the presentation, we describe an approach to map historical heat waves onto this benchmark grid performance using a building energy demand model. The impact of the heat waves, combined with the impact of droughts, is explored at multiple scales to understand the compounding effects. Vulnerabilities of the power generation and transmission systems are highlighted to guide future adaptation.
National Testing: Gains or Strains? School Leaders' Responses to Policy Demands
ERIC Educational Resources Information Center
Gunnulfsen, Ann Elisabeth; Møller, Jorunn
2017-01-01
Studies have shown that principals are essential in successfully implementing large-scale policy reforms in schools. However, the issue of how school leaders interpret and transform reforms is understudied. This article explores how twelve Norwegian school leaders respond to external demands in a new policy context emphasizing national test…
Vermeerbergen, Lander; Van Hootegem, Geert; Benders, Jos
2017-02-01
Ongoing shortages of care workers, together with an ageing population, make it of utmost importance to increase the quality of working life in nursing homes. Since the 1970s, normalised and small-scale nursing homes have been increasingly introduced to provide care in a family and homelike environment, potentially providing a richer work life for care workers as well as improved living conditions for residents. 'Normalised' refers to the opportunities given to residents to live in a manner as close as possible to the everyday life of persons not needing care. The study purpose is to provide a synthesis and overview of empirical research comparing the quality of working life - together with related work and health outcomes - of professional care workers in normalised small-scale nursing homes as compared to conventional large-scale ones. A systematic review of qualitative and quantitative studies. A systematic literature search (April 2015) was performed using the electronic databases Pubmed, Embase, PsycInfo, CINAHL and Web of Science. References and citations were tracked to identify additional, relevant studies. We identified 825 studies in the selected databases. After checking the inclusion and exclusion criteria, nine studies were selected for review. Two additional studies were selected after reference and citation tracking. Three studies were excluded after requesting more information on the research setting. The findings from the individual studies suggest that levels of job control and job demands (all but "time pressure") are higher in normalised small-scale homes than in conventional large-scale nursing homes. Additionally, some studies suggested that social support and work motivation are higher, while risks of burnout and mental strain are lower, in normalised small-scale nursing homes. Other studies found no differences or even opposing findings. The studies reviewed showed that these inconclusive findings can be attributed to care workers in some normalised small-scale homes experiencing isolation and too high job demands in their work roles. This systematic review suggests that normalised small-scale homes are a good starting point for creating a higher quality of working life in the nursing home sector. Higher job control enables care workers to manage higher job demands in normalised small-scale homes. However, some jobs would benefit from interventions to address care workers' perceptions of too low social support and of too high job demands. More research is needed to examine strategies to enhance these working life issues in normalised small-scale settings. Copyright © 2016 Elsevier Ltd. All rights reserved.
Visualization, documentation, analysis, and communication of large scale gene regulatory networks
Longabaugh, William J.R.; Davidson, Eric H.; Bolouri, Hamid
2009-01-01
Genetic regulatory networks (GRNs) are complex, large-scale, and spatially and temporally distributed. These characteristics impose challenging demands on computational GRN modeling tools, and there is a need for custom modeling tools. In this paper, we report on our ongoing development of BioTapestry, an open source, freely available computational tool designed specifically for GRN modeling. We also outline our future development plans, and give some examples of current applications of BioTapestry. PMID:18757046
Lakshmikanthan, P; Sivakumar Babu, G L
2017-03-01
The potential of bioreactor landfills to treat mechanically biologically treated municipal solid waste is analysed in this study. Developing countries like India and China have begun to investigate bioreactor landfills for municipal solid waste management. This article describes the impacts of leachate recirculation on waste stabilisation, landfill gas generation, leachate characteristics and long-term waste settlement. A small-scale and large-scale anaerobic cell were filled with mechanically biologically treated municipal solid waste collected from a landfill site at the outskirts of Bangalore, India. Leachate collected from the same landfill site was recirculated at the rate of 2-5 times a month on a regular basis for 370 days. The total quantity of gas generated was around 416 L in the large-scale reactor and 21 L in the small-scale reactor, respectively. Differential settlements ranging from 20%-26% were observed at two different locations in the large reactor, whereas 30% of settlement was observed in the small reactor. The biological oxygen demand/chemical oxygen demand (COD) ratio indicated that the waste in the large reactor was stabilised at the end of 1 year. The performance of the bioreactor with respect to the reactor size, temperature, landfill gas and leachate quality was analysed and it was found that the bioreactor landfill is efficient in the treatment and stabilising of mechanically biologically treated municipal solid waste.
Map Scale, Proportion, and Google[TM] Earth
ERIC Educational Resources Information Center
Roberge, Martin C.; Cooper, Linda L.
2010-01-01
Aerial imagery has a great capacity to engage and maintain student interest while providing a contextual setting to strengthen their ability to reason proportionally. Free, on-demand, high-resolution, large-scale aerial photography provides both a bird's eye view of the world and a new perspective on one's own community. This article presents an…
Romano Foti; Jorge A. Ramirez; Thomas C. Brown
2012-01-01
Comparison of projected future water demand and supply across the conterminous United States indicates that, due to improving efficiency in water use, expected increases in population and economic activity do not by themselves pose a serious threat of large-scale water shortages. However, climate change can increase water demand and decrease water supply to the extent...
GATECloud.net: a platform for large-scale, open-source text processing on the cloud.
Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina
2013-01-28
Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research--GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.
Integrating market processes into utility resource planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahn, E.P.
1992-11-01
Integrated resource planning has resulted in an abundance of alternatives for meeting existing and new demand for electricity services: (1) utility demand-side management (DSM) programs, (2) DSM bidding, (3) competitive bidding for private power supplies, (4) utility re-powering, and (5) new utility construction. Each alternative relies on a different degree of planning for implementation and, therefore, each alternative relies on markets to a greater or lesser degree. This paper shows how the interaction of planning processes and market forces results in resource allocations among the alternatives. The discussion focuses on three phenomena that are driving forces behind the 'unanticipated consequences' of contemporary integrated resource planning efforts. These forces are: (1) large-scale DSM efforts, (2) customer bypass, and (3) large-scale independent power projects. 22 refs., 3 figs., 2 tabs.
NASA Astrophysics Data System (ADS)
Semple, Lucas M.; Carriveau, Rupp; Ting, David S.-K.
2018-04-01
In the Ontario greenhouse sector the misalignment of available solar radiation during the summer months and large heating demand during the winter months makes solar thermal collector systems an unviable option without some form of seasonal energy storage. Information obtained from Ontario greenhouse operators has shown that over 20% of annual natural gas usage occurs during the summer months for greenhouse pre-heating prior to sunrise. A transient model of the greenhouse microclimate and indoor conditioning systems is carried out using TRNSYS software and validated with actual natural gas usage data. A large-scale solar thermal collector system is then incorporated and found to reduce the annual heating energy demand by approximately 35%. The inclusion of the collector system correlates to a reduction of about 120 tonnes of CO2 equivalent emissions per acre of greenhouse per year. System payback period is discussed considering the benefits of a future Ontario carbon tax.
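A back-of-envelope sketch of how a carbon price would shorten the payback period discussed above. Only the ~35% heating reduction and the 120 tonnes of CO2 equivalent per acre per year come from the abstract; the capital cost, gas savings, and carbon prices are assumed for illustration.

```python
# Illustrative numbers only (not from the study): per-acre collector system economics.
capital_cost = 400_000.0        # $ per acre of greenhouse, installed collector + storage
gas_savings = 18_000.0          # $/yr from the ~35% heating demand reduction (assumed value)
co2_avoided = 120.0             # tonnes CO2e per acre per year (from the abstract)

for carbon_price in (0.0, 20.0, 50.0):                 # $/tonne CO2e, assumed levels
    annual_benefit = gas_savings + carbon_price * co2_avoided
    print("carbon price $%d/t -> simple payback %.1f years"
          % (carbon_price, capital_cost / annual_benefit))
```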
A generic hydroeconomic model to assess future water scarcity
NASA Astrophysics Data System (ADS)
Neverre, Noémie; Dumas, Patrice
2015-04-01
We developed a generic hydroeconomic model able to confront future water supply and demand on a large scale, taking into account man-made reservoirs. The assessment is done at the scale of river basins, using only globally available data; the methodology can thus be generalized. On the supply side, we evaluate the impacts of climate change on water resources. The available quantity of water at each site is computed using the following information: runoff is taken from the outputs of CNRM climate model (Dubois et al., 2010), reservoirs are located using Aquastat, and the sub-basin flow-accumulation area of each reservoir is determined based on a Digital Elevation Model (HYDRO1k). On the demand side, agricultural and domestic demands are projected in terms of both quantity and economic value. For the agricultural sector, globally available data on irrigated areas and crops are combined in order to determine irrigated crops localization. Then, crops irrigation requirements are computed for the different stages of the growing season using Allen (1998) method with Hargreaves potential evapotranspiration. Irrigation water economic value is based on a yield comparison approach between rainfed and irrigated crops. Potential irrigated and rainfed yields are taken from LPJmL (Blondeau et al., 2007), or from FAOSTAT by making simple assumptions on yield ratios. For the domestic sector, we project the combined effects of demographic growth, economic development and water cost evolution on future demands. The method consists in building three-blocks inverse demand functions where volume limits of the blocks evolve with the level of GDP per capita. The value of water along the demand curve is determined from price-elasticity, price and demand data from the literature, using the point-expansion method, and from water costs data. Then projected demands are confronted to future water availability. Operating rules of the reservoirs and water allocation between demands are based on the maximization of water benefits, over time and space. A parameterisation-simulation-optimisation approach is used. This gives a projection of future water scarcity in the different locations and an estimation of the associated direct economic losses from unsatisfied demands. This generic hydroeconomic model can be easily applied to large-scale regions, in particular developing regions where little reliable data is available. We will present an application to Algeria, up to the 2050 horizon.
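A minimal sketch of the domestic-demand construction described above: a constant-elasticity demand curve is expanded around an observed (price, quantity) point and split into blocks whose volume limits grow with GDP per capita. The function names, elasticity, block sizes, and prices are assumptions for illustration, not the authors' calibration.

```python
def inverse_demand(q, q0, p0, elasticity):
    """Constant-elasticity demand expanded around an observed point (q0, p0):
    q = q0 * (p / p0) ** elasticity  ->  p(q) = p0 * (q / q0) ** (1 / elasticity)."""
    return p0 * (q / q0) ** (1.0 / elasticity)

def three_block_value(volume, gdp_per_capita, q0=100.0, p0=0.5, elasticity=-0.3):
    """Marginal value of domestic water by block; block limits grow with GDP per capita.
    All parameter values are illustrative assumptions."""
    b1 = 40.0 + 0.001 * gdp_per_capita          # basic-needs block (m3 per capita per year)
    b2 = b1 + 60.0 + 0.002 * gdp_per_capita     # comfort block upper limit
    if volume <= b1:
        return inverse_demand(b1, q0, p0, elasticity)   # high value, flat within first block
    elif volume <= b2:
        return inverse_demand(volume, q0, p0, elasticity)
    return 0.05 * p0                                    # low residual value beyond second block

print(three_block_value(30.0, gdp_per_capita=4000))
print(three_block_value(90.0, gdp_per_capita=4000))
```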
Simulating potential water grabbing from large-scale land acquisitions in Africa
NASA Astrophysics Data System (ADS)
Li Johansson, Emma; Fader, Marianela; Seaquist, Jonathan W.; Nicholas, Kimberly A.
2017-04-01
The potential high level of water appropriation in Africa by foreign companies might pose high socioenvironmental challenges, including overconsumption of water and conflicts and tensions over water resources allocation. We will present a study published recently in the Proceedings of the National Academy of Sciences of the USA, where we simulated green and blue water demand and crop yields of large-scale land acquisitions in several African countries. Green water refers to precipitation stored in soils and consumed by plants through evapotranspiration, while blue water is extracted from rivers, lakes, aquifers, and dams. We simulated seven irrigation scenarios, and compared these data with two baseline scenarios of staple crops representing previous water demand. The results indicate that the green and blue water use is 39% and 76-86% greater, respectively, for crops grown on acquired land compared with the baseline of common staple crops, showing that land acquisitions substantially increase water demands. We also found that most land acquisitions are planted with crops such as sugarcane, jatropha, and eucalyptus, which demand volumes of water >9,000 m3⋅ha−1. Even if the most efficient irrigation systems were implemented, 18% of the land acquisitions, totaling 91,000 ha, would still require more than 50% of water from blue water sources.
NASA Astrophysics Data System (ADS)
Nazemi, A.; Wheater, H. S.
2015-01-01
Human activities have caused various changes to the Earth system, and hence the interconnections between human activities and the Earth system should be recognized and reflected in models that simulate Earth system processes. One key anthropogenic activity is water resource management, which determines the dynamics of human-water interactions in time and space and controls human livelihoods and economy, including energy and food production. There are immediate needs to include water resource management in Earth system models. First, the extent of human water requirements is increasing rapidly at the global scale and it is crucial to analyze the possible imbalance between water demands and supply under various scenarios of climate change and across various temporal and spatial scales. Second, recent observations show that human-water interactions, manifested through water resource management, can substantially alter the terrestrial water cycle, affect land-atmospheric feedbacks and may further interact with climate and contribute to sea-level change. Due to the importance of water resource management in determining the future of the global water and climate cycles, the World Climate Research Program's Global Energy and Water Exchanges project (WRCP-GEWEX) has recently identified gaps in describing human-water interactions as one of the grand challenges in Earth system modeling (GEWEX, 2012). Here, we divide water resource management into two interdependent elements, related firstly to water demand and secondly to water supply and allocation. In this paper, we survey the current literature on how various components of water demand have been included in large-scale models, in particular land surface and global hydrological models. Issues of water supply and allocation are addressed in a companion paper. The available algorithms to represent the dominant demands are classified based on the demand type, mode of simulation and underlying modeling assumptions. We discuss the pros and cons of available algorithms, address various sources of uncertainty and highlight limitations in current applications. We conclude that current capability of large-scale models to represent human water demands is rather limited, particularly with respect to future projections and coupled land-atmospheric simulations. To fill these gaps, the available models, algorithms and data for representing various water demands should be systematically tested, intercompared and improved. In particular, human water demands should be considered in conjunction with water supply and allocation, particularly in the face of water scarcity and unknown future climate.
Large-scale alcohol production from corn, grain sorghum, and crop residues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turhollow, A.F. Jr.
1982-01-01
The potential impacts that large-scale alcohol production from corn, grain sorghum, and crop residues may have on US agriculture in the year 2000 are investigated. A one-land-group interregional linear-programming model is used. The objective function is to minimize the cost of production in the agricultural sector, given specified crop demands and constrained resources. The impacts of alcohol production levels ranging from zero to 12 billion gallons are considered at two projected levels of crop demand, two grain-to-alcohol conversion rates, and two milling methods (wet and dry). The impacts that large-scale fuel alcohol production has on US agriculture are small. The major impacts that occur are the substitution of milling by-products, DDG, gluten feed, and gluten meal, for soybean meal in livestock feed rations. Production of 12 billion gallons of alcohol is estimated to be equivalent to an 18 percent increase in crop exports. Improving the grain-to-alcohol conversion rate from 2.6 to 3.0 gallons per bushel reduces the overall cost of agricultural production by $989 billion when 12 billion gallons of alcohol are produced.
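A toy version of the cost-minimizing interregional LP described above, written with scipy; the two regions, two crops, and all costs, yields, demands, and land limits are invented for illustration and bear no relation to the study's data.

```python
from scipy.optimize import linprog

# Decision variables: hectares of (corn, sorghum) grown in regions 1 and 2.
# x = [corn_r1, sorghum_r1, corn_r2, sorghum_r2]; all numbers are illustrative.
cost = [300, 250, 320, 240]            # production cost, $/ha
yield_corn = [9.0, 0.0, 8.0, 0.0]      # tonnes/ha contributed to corn demand
yield_sorg = [0.0, 6.5, 0.0, 7.0]      # tonnes/ha contributed to sorghum demand

# Minimize total cost subject to meeting crop demands (>=) and regional land limits (<=).
A_ub = [[-y for y in yield_corn],       # -corn supply   <= -corn demand
        [-y for y in yield_sorg],       # -sorghum supply <= -sorghum demand
        [1, 1, 0, 0],                   # land used in region 1
        [0, 0, 1, 1]]                   # land used in region 2
b_ub = [-50_000, -30_000, 8_000, 7_000]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4, method="highs")
print("optimal hectares:", res.x, "total cost:", res.fun)
```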
Sensitivity analysis of key components in large-scale hydroeconomic models
NASA Astrophysics Data System (ADS)
Medellin-Azuara, J.; Connell, C. R.; Lund, J. R.; Howitt, R. E.
2008-12-01
This paper explores the likely impact of different estimation methods for key components of hydro-economic models, such as hydrology and economic costs or benefits, using the CALVIN hydro-economic optimization model for water supply in California. We perform our analysis using two climate scenarios: historical and warm-dry. The components compared were perturbed hydrology using six versus eighteen basins, highly elastic urban water demands, and different valuations of agricultural water scarcity. Results indicate that large-scale hydro-economic models are often rather robust to a variety of estimation methods for ancillary models and components. Increasing the level of detail in the hydrologic representation of this system might not greatly affect overall estimates of climate and its effects and adaptations for California's water supply. More price-responsive urban water demands will have a limited role in allocating water optimally among competing uses. Different estimation methods for the economic value of water and scarcity in agriculture may influence economically optimal water allocation; however, land conversion patterns may have a stronger influence on this allocation. Overall optimization results of large-scale hydro-economic models remain useful for a wide range of assumptions in eliciting promising water management alternatives.
To cope with the rising demand for fresh water, desalination of brackish groundwater and seawater is increasingly being viewed as a pragmatic option for augmenting fresh water supplies. The large scale deployment of desalination is likely to demonstrably increase electricity use,...
Job stress and burnout among urban and rural hospital physicians in Japan.
Saijo, Yasuaki; Chiba, Shigeru; Yoshioka, Eiji; Kawanishi, Yasuyuki; Nakagi, Yoshihiko; Ito, Toshihiro; Sugioka, Yoshihiko; Kitaoka-Higashiguchi, Kazuyo; Yoshida, Takahiko
2013-08-01
To elucidate the differences in job stress and burnout status of Japanese hospital physicians between large cities, small cities, and towns and villages. Cross-sectional study. Postal self-administered questionnaires were distributed to 2937 alumni of Asahikawa Medical University. Four hundred and twenty-two hospital physicians. The Brief Job Stress Questionnaire was used to evaluate job demand, job control and social support. The Japanese version of the Maslach Burnout Inventory-General Survey (MBI-GS) was used to evaluate burnout. An analysis of covariance was conducted on the mean scores on the Brief Job Stress Questionnaire and the MBI-GS scales after adjusting for sex, age and specialties. In adjusted analyses, the job demand score differed significantly among physicians in the three areas. In Bonferroni post-hoc tests, scores in large cities were significantly higher than those in small cities and in towns and villages. The job control score showed a significant difference and a marginally significant trend, with large cities associated with lower job control. There were significant differences in support from supervisors and from family/friends, and scores in large cities were significantly higher than those in small cities in the post-hoc tests. There was a significant effect on the exhaustion scale of the MBI-GS, with large cities associated with higher exhaustion, and scores in large cities were significantly higher than those in small cities. Urban hospital physicians had higher job demands, less job control and more burnout-related exhaustion, and rural hospital physicians had less social support. © 2013 The Authors. Australian Journal of Rural Health © National Rural Health Alliance Inc.
Water limited agriculture in Africa: Climate change sensitivity of large scale land investments
NASA Astrophysics Data System (ADS)
Rulli, M. C.; D'Odorico, P.; Chiarelli, D. D.; Davis, K. F.
2015-12-01
The past few decades have seen unprecedented changes in the global agricultural system with a dramatic increase in the rates of food production fueled by an escalating demand for food calories, as a result of demographic growth, dietary changes, and - more recently - new bioenergy policies. Food prices have become consistently higher and increasingly volatile with dramatic spikes in 2007-08 and 2010-11. The confluence of these factors has heightened demand for land and brought a wave of land investment to the developing world: some of the more affluent countries are trying to secure land rights in areas suitable for agriculture. According to some estimates, to date, roughly 38 million hectares have been acquired worldwide by large scale investors, 16 million of which in Africa. More than 85% of large scale land acquisitions in Africa are by foreign investors. Many land deals are motivated not only by the need for fertile land but for the water resources required for crop production. Despite some recent assessments of the water appropriation associated with large scale land investments, their impact on the water resources of the target countries under present conditions and climate change scenarios remains poorly understood. Here we investigate irrigation water requirements by various crops planted in the acquired land as an indicator of the pressure likely placed by land investors on ("blue") water resources of target regions in Africa and evaluate the sensitivity to climate changes scenarios.
Yoo, Sun K; Kim, Dong Keun; Kim, Jung C; Park, Youn Jung; Chang, Byung Chul
2008-01-01
With the increase in demand for high quality medical services, the need for an innovative hospital information system has become essential. An improved system has been implemented in all hospital units of the Yonsei University Health System. Interoperability between multi-units required appropriate hardware infrastructure and software architecture. This large-scale hospital information system encompassed PACS (Picture Archiving and Communications Systems), EMR (Electronic Medical Records) and ERP (Enterprise Resource Planning). It involved two tertiary hospitals and 50 community hospitals. The monthly data production rate by the integrated hospital information system is about 1.8 TByte and the total quantity of data produced so far is about 60 TByte. Large scale information exchange and sharing will be particularly useful for telemedicine applications.
Progressing Deployment of Solar Photovoltaic Installations in the United States
NASA Astrophysics Data System (ADS)
Kwan, Calvin Lee
2011-07-01
This dissertation evaluates the likelihood of solar PV playing a larger role in national and state level renewable energy portfolios. I examine the feasibility of large-scale solar PV arrays on college campuses, the financials associated with large-scale solar PV arrays and finally, the influence of environmental, economic, social and political variables on the distribution of residential solar PV arrays in the United States. Chapter two investigates the challenges and feasibility of college campuses adopting a net-zero energy policy. Using energy consumption data, local solar insolation data and projected campus growth, I present a method to identify the minimum sized solar PV array that is required for the City College campus of the Los Angeles Community College District to achieve net-zero energy status. I document how current energy demand can be reduced using strategic demand side management, with remaining energy demand being met using a solar PV array. Chapter three focuses on the financial feasibility of large-scale solar PV arrays, using the proposed City College campus array as an example. I document that even after demand side energy management initiatives and financial incentives, large-scale solar PV arrays continue to have ROIs greater than 25 years. I find that traditional financial evaluation methods are not suitable for environmental projects such as solar PV installations as externalities are not taken into account and therefore calls for development of alternative financial valuation methods. Chapter four investigates the influence of environmental, social, economic and political variables on the distribution of residential solar PV arrays across the United States using ZIP code level data from the 2000 US Census. Using data from the National Renewable Energy Laboratory's Open PV project, I document where residential solar PVs are currently located. A zero-inflated negative binomial model was run to evaluate the influence of selected variables. Using the same model, predicted residential solar PV shares were generated and illustrated using GIS software. The results of this model indicate that solar insolation, state energy deregulation and cost of electricity are statistically significant factors positively correlated with the adoption of residential solar PV arrays. With this information, policymakers at the towns and cities level can establish effective solar PV promoting policies and regulations for their respective locations.
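A hedged sketch of fitting a zero-inflated negative binomial count model to per-ZIP-code installation counts, as in chapter four; it uses statsmodels with synthetic covariates (insolation, electricity price), so the variable set, data, and settings are assumptions rather than the dissertation's specification, and the fit may need starting values or more iterations to converge on real data.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(0)
n = 2000
insolation = rng.normal(5.0, 0.8, n)        # kWh/m2/day (synthetic)
elec_price = rng.normal(0.14, 0.03, n)      # $/kWh (synthetic)

# Synthetic counts with excess zeros to mimic ZIP codes that have no installations at all.
mu = np.exp(-4 + 0.8 * insolation + 8 * elec_price)
counts = rng.poisson(mu) * (rng.random(n) > 0.4)

X = sm.add_constant(np.column_stack([insolation, elec_price]))
model = ZeroInflatedNegativeBinomialP(counts, X, exog_infl=np.ones((n, 1)))
result = model.fit(maxiter=500, disp=False)
print(result.summary())
```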
The latest developments and outlook for hydrogen liquefaction technology
NASA Astrophysics Data System (ADS)
Ohlig, K.; Decker, L.
2014-01-01
Liquefied hydrogen is presently used mainly for space applications and the semiconductor industry. While clean energy applications, e.g. in the automotive sector, currently contribute only a small share to this demand, their demand may see a significant boost in the next years, with the need for large-scale liquefaction plants exceeding current plant sizes by far. Hydrogen liquefaction for small-scale plants with a maximum capacity of 3 tons per day (tpd) is accomplished with a Brayton refrigeration cycle using helium as refrigerant. This technology is characterized by low investment costs but lower process efficiency and hence higher operating costs. For larger plants, a hydrogen Claude cycle is used, characterized by higher investment but lower operating costs. However, liquefaction plants meeting the potentially high demand in the clean energy sector will need further optimization with regard to energy efficiency and hence operating costs. The present paper gives an overview of the currently applied technologies, including their thermodynamic and technical background. Areas of improvement are identified to derive process concepts for future large-scale hydrogen liquefaction plants meeting the needs of clean energy applications with optimized energy efficiency and hence minimized operating costs. Compared to other studies in this field, this paper focuses on the application of new technology and innovative concepts which are either readily available or will require short qualification procedures, and will hence allow implementation in plants in the near future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, E.; Burton, E.; Duran, A.
Understanding the real-world power demand of modern automobiles is of critical importance to engineers using modeling and simulation to inform the intelligent design of increasingly efficient powertrains. Increased use of global positioning system (GPS) devices has made large-scale data collection of vehicle speed (and associated power demand) a reality. While the availability of real-world GPS data has improved the industry's understanding of in-use vehicle power demand, relatively little attention has been paid to the incremental power requirements imposed by road grade. This analysis quantifies the incremental efficiency impacts of real-world road grade by appending high-fidelity elevation profiles to GPS speed traces and performing a large simulation study. Employing a large real-world dataset from the National Renewable Energy Laboratory's Transportation Secure Data Center, vehicle powertrain simulations are performed with and without road grade under five vehicle models. Aggregate results of this study suggest that road grade could be responsible for 1% to 3% of fuel use in light-duty automobiles.
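The grade term can be made concrete with the standard road-load equation: tractive power is the sum of aerodynamic, rolling, grade, and inertial components, and the grade component m·g·sin(θ)·v is exactly what a flat-road simulation omits. The vehicle parameters below are generic light-duty assumptions, not the NREL vehicle models.

```python
import math

def road_load_power(speed_mps, accel_mps2, grade_rad,
                    mass_kg=1600.0, cd=0.30, frontal_area_m2=2.3,
                    c_rr=0.009, rho_air=1.2, g=9.81):
    """Tractive power (W) at the wheels for a generic light-duty vehicle.
    The grade term m*g*sin(theta)*v is what a flat-road simulation omits."""
    aero = 0.5 * rho_air * cd * frontal_area_m2 * speed_mps ** 3
    rolling = c_rr * mass_kg * g * math.cos(grade_rad) * speed_mps
    grade = mass_kg * g * math.sin(grade_rad) * speed_mps
    inertia = mass_kg * accel_mps2 * speed_mps
    return aero + rolling + grade + inertia

v = 25.0                                             # ~90 km/h steady cruise
flat = road_load_power(v, 0.0, 0.0)
uphill = road_load_power(v, 0.0, math.atan(0.03))    # 3% grade
print("flat: %.1f kW, 3%% grade: %.1f kW (+%.0f%%)" %
      (flat / 1e3, uphill / 1e3, 100 * (uphill - flat) / flat))
```

Over a full drive cycle the uphill and downhill segments partly cancel, which is why the aggregate fuel-use effect reported above is only a few percent even though instantaneous power on a grade can more than double.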
Trends in laser micromachining
NASA Astrophysics Data System (ADS)
Gaebler, Frank; van Nunen, Joris; Held, Andrew
2016-03-01
Laser micromachining is well established in industry. Depending on the application, lasers with pulse lengths from microseconds to femtoseconds and wavelengths from 1064nm and its harmonics up to 5μm or 10.6μm are used. Ultrafast laser machining using pico- or femtosecond pulses is gaining traction, as it offers very precise processing of materials with low thermal impact. Large-scale industrial ultrafast laser applications show that the market can be divided into various sub-segments. One set of applications demands low power around 10W and a compact footprint and is extremely sensitive to the laser price whilst still demanding 10ps or shorter laser pulses. A second set of applications is very power hungry and only becomes economically feasible for large-scale deployments at power levels in the 100+W class. There is also a growing demand for applications requiring fs-laser pulses. In our presentation we would like to describe these sub-segments using selected applications from the automotive and electronics industry, e.g. drilling of gas/diesel injection nozzles and dicing of LED substrates. We close the presentation with an outlook on micromachining applications, e.g. glass cutting and foil processing with unique new CO lasers emitting at a 5μm laser wavelength.
Globus | Informatics Technology for Cancer Research (ITCR)
Globus software services provide secure cancer research data transfer, synchronization, and sharing in distributed environments at large scale. These services can be integrated into applications and research data gateways, leveraging Globus identity management, single sign-on, search, and authorization capabilities. Globus Genomics integrates Globus with the Galaxy genomics workflow engine and Amazon Web Services to enable cancer genomics analysis that can elastically scale compute resources with demand.
Incorporating human-water dynamics in a hyper-resolution land surface model
NASA Astrophysics Data System (ADS)
Vergopolan, N.; Chaney, N.; Wanders, N.; Sheffield, J.; Wood, E. F.
2017-12-01
The increasing demand for water, energy, and food is leading to unsustainable groundwater and surface water exploitation. As a result, the human interactions with the environment, through alteration of land and water resources dynamics, need to be reflected in hydrologic and land surface models (LSMs). Advancements in representing human-water dynamics still leave challenges related to the lack of water use data, water allocation algorithms, and modeling scales. This leads to an over-simplistic representation of human water use in large-scale models, which in turn leads to an inability to capture extreme event signatures and to provide reliable information at stakeholder-level spatial scales. The emergence of hyper-resolution models allows one to address these challenges by simulating the hydrological processes and interactions with the human impacts at field scales. We integrated human-water dynamics into HydroBlocks - a hyper-resolution, field-scale resolving LSM. HydroBlocks explicitly solves the field-scale spatial heterogeneity of land surface processes through interacting hydrologic response units (HRUs); and its HRU-based model parallelization allows computationally efficient long-term simulations as well as ensemble predictions. The implemented human-water dynamics include groundwater and surface water abstraction to meet agricultural, domestic and industrial water demands. Furthermore, a supply-demand water allocation scheme based on relative costs helps to determine sectoral water use requirements and tradeoffs. A set of HydroBlocks simulations over the Midwest United States (daily, at 30-m spatial resolution for 30 years) is used to quantify the irrigation impacts on water availability. The model captures large reductions in total soil moisture and water table levels, as well as spatiotemporal changes in evapotranspiration and runoff peaks, with their intensity related to the adopted water management strategy. By incorporating human-water dynamics in a hyper-resolution LSM this work allows for progress on hydrological monitoring and predictions, as well as drought preparedness and water impact assessments at relevant decision-making scales.
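A highly simplified sketch of a relative-cost, priority-ordered supply-demand allocation of the kind mentioned above: each sector is served from the cheapest remaining source first and any shortfall is recorded as a deficit. The sources, costs, demands, and the allocation rule itself are placeholders and assumptions, not the HydroBlocks scheme.

```python
# Available supplies (mm over a grid cell) with an assumed relative cost of abstraction.
supplies = [{"source": "surface", "available": 40.0, "cost": 1.0},
            {"source": "groundwater", "available": 60.0, "cost": 2.5}]

# Sectoral demands (mm), served in priority order; all values illustrative.
demands = [("domestic", 15.0), ("industrial", 10.0), ("irrigation", 55.0)]

allocation = {}
for sector, need in demands:
    taken = {}
    for s in sorted(supplies, key=lambda s: s["cost"]):   # cheapest source first
        draw = min(need, s["available"])
        s["available"] -= draw
        need -= draw
        if draw > 0:
            taken[s["source"]] = draw
    allocation[sector] = {"served": taken, "deficit": need}

print(allocation)
```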
Sub-seasonal predictability of water scarcity at global and local scale
NASA Astrophysics Data System (ADS)
Wanders, N.; Wada, Y.; Wood, E. F.
2016-12-01
Forecasting the water demand and availability for agriculture and energy production has been neglected in previous research, partly because most large-scale hydrological models lack the skill to forecast human water demands at the sub-seasonal time scale. We study the potential of a sub-seasonal water scarcity forecasting system for improved water management decision making and improved estimates of water demand and availability. We have generated 32 years of global sub-seasonal multi-model water availability, demand, and scarcity forecasts. The quality of the forecasts is compared to a reference forecast derived from resampling historic weather observations. The newly developed system has been evaluated both at the global scale and in a real-time local application in the Sacramento Valley for the Trinity, Shasta, and Oroville reservoirs, where the water demand for agriculture and hydropower is high. On the global scale we find that the reference forecast shows high initial forecast skill (up to 8 months) for water scarcity in the eastern US, Central Asia, and Sub-Saharan Africa. Adding dynamical sub-seasonal forecasts results in a clear improvement for most regions in the world, increasing the forecasts' lead time by 2 or more months on average. The strongest improvements are found in the US, Brazil, Central Asia, and Australia. For the Sacramento Valley we can accurately predict anomalies in reservoir inflow, hydropower potential, and downstream irrigation water demand 6 months in advance. This allows us to forecast potential water scarcity in the Sacramento Valley and to adjust reservoir management to prevent deficits in energy or irrigation water availability. The newly developed forecast system shows that it is possible to reduce vulnerability to upcoming water scarcity events and to optimize the distribution of the available water between the agricultural and energy sectors half a year in advance.
A holistic approach for large-scale derived flood frequency analysis
NASA Astrophysics Data System (ADS)
Dung Nguyen, Viet; Apel, Heiko; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno
2017-04-01
Spatial consistency, which has usually been disregarded because of methodological difficulties, is increasingly demanded in regional flood hazard (and risk) assessments. This study aims at developing a holistic approach for consistently deriving flood frequency at large scales. A large-scale, two-component model has been established for simulating very long-term, multi-site synthetic meteorological fields and flood flows at many gauged and ungauged locations, hence reflecting the spatially inherent heterogeneity. The model has been applied to a region of nearly half a million km2 comprising Germany and parts of neighbouring countries. The model performance has been examined multi-objectively with a focus on extremes. By this continuous simulation approach, flood quantiles for the studied region have been derived successfully and provide useful input for a comprehensive flood risk study.
Differentiating unipolar and bipolar depression by alterations in large-scale brain networks.
Goya-Maldonado, Roberto; Brodmann, Katja; Keil, Maria; Trost, Sarah; Dechent, Peter; Gruber, Oliver
2016-02-01
Misdiagnosing bipolar depression can lead to very deleterious consequences from mistreatment. Although depressive symptoms may be expressed similarly in unipolar and bipolar disorder, changes in specific brain networks could be quite distinct and could therefore serve as informative markers for the differential diagnosis. We aimed to characterize specific alterations in candidate large-scale networks (frontoparietal, cingulo-opercular, and default mode) in symptomatic unipolar and bipolar patients using resting-state fMRI, a cognitively undemanding paradigm well suited to investigating patients. Networks were selected after independent component analysis and compared across 40 acutely depressed patients (20 unipolar, 20 bipolar) and 20 controls well matched for age, gender, and education level, and alterations were correlated with clinical parameters. Despite comparable symptoms, the patient groups were robustly differentiated by large-scale network alterations. Differences were driven in bipolar patients by increased functional connectivity in the frontoparietal network, a central executive and externally oriented network. Conversely, unipolar patients presented increased functional connectivity in the default mode network, an introspective and self-referential network, as well as reduced connectivity of the cingulo-opercular network to default mode regions, a network involved in detecting the need to switch between internally and externally oriented demands. These findings were mostly unaffected by current medication, comorbidity, and structural changes. Moreover, network alterations in unipolar patients were significantly correlated with the number of depressive episodes. Unipolar and bipolar groups displaying similar symptomatology could be clearly distinguished by characteristic changes in large-scale networks, encouraging further investigation of network fingerprints for clinical use. Hum Brain Mapp 37:808-818, 2016. © 2015 Wiley Periodicals, Inc.
Economically viable large-scale hydrogen liquefaction
NASA Astrophysics Data System (ADS)
Cardella, U.; Decker, L.; Klein, H.
2017-02-01
The demand for liquid hydrogen, driven particularly by clean energy applications, will rise in the near future. As industrial large-scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase to a multiple of today's typical sizes. The main goal is to reduce the total cost of ownership of these plants by increasing energy efficiency through innovative and simple process designs that are also optimized in capital expenditure. New concepts must ensure manageable plant complexity and flexible operability. In the phase of process development and selection, the dimensioning of key equipment for large-scale liquefiers, such as turbines, compressors, and heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects of hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview of the approach, challenges, and preliminary results in the development of efficient and economically viable concepts for large-scale hydrogen liquefaction.
NASA Astrophysics Data System (ADS)
Childers, J. T.; Uram, T. D.; LeCompte, T. J.; Papka, M. E.; Benjamin, D. P.
2017-01-01
As the LHC moves to higher energies and luminosity, the demand for computing resources increases accordingly and will soon outpace the growth of the Worldwide LHC Computing Grid. To meet this greater demand, event generation Monte Carlo was targeted for adaptation to run on Mira, the supercomputer at the Argonne Leadership Computing Facility. Alpgen is a Monte Carlo event generation application that is used by LHC experiments in the simulation of collisions that take place in the Large Hadron Collider. This paper details the process by which Alpgen was adapted from a single-processor serial application to a large-scale parallel application and the performance that was achieved.
Combined heat and power supply using Carnot engines
NASA Astrophysics Data System (ADS)
Horlock, J. H.
The Marshall Report on the thermodynamic and economic feasibility of introducing large-scale combined heat and electrical power generation (CHP) into the United Kingdom is summarized. Combinations of reversible power plants (Carnot engines) meeting a given demand for power and heat are analyzed. The Marshall Report states that fairly large-scale CHP plants are an attractive energy-saving option for areas with high heat-load densities. Analysis shows that, for given requirements, the total heat supply and the utilization factor are functions of heat output, reservoir supply temperature, the temperature of heat rejected to the reservoir, and an intermediate temperature for district heating.
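As a hedged illustration of the reversible-plant analysis summarized above (generic symbols, not the Marshall Report's notation): a Carnot engine that draws heat $Q_{in}$ from a supply reservoir at temperature $T_s$ and rejects heat for district heating at an intermediate temperature $T_d$ delivers

$$W = Q_{in}\left(1 - \frac{T_d}{T_s}\right), \qquad Q_d = Q_{in}\,\frac{T_d}{T_s}, \qquad \mathrm{EUF} = \frac{W + Q_d}{Q_{in}} = 1,$$

so the energy utilization factor of the ideal CHP scheme is unity; it drops below one only to the extent that heat is rejected to the environment at a temperature too low to be useful for district heating, which is why the utilization factor depends on the heat output and on the temperatures listed in the abstract.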
Expansion of Human Induced Pluripotent Stem Cells in Stirred Suspension Bioreactors.
Almutawaa, Walaa; Rohani, Leili; Rancourt, Derrick E
2016-01-01
Human induced pluripotent stem cells (hiPSCs) hold great promise as a cell source for therapeutic applications and regenerative medicine. Traditionally, hiPSCs are expanded in two-dimensional static culture as colonies in the presence or absence of feeder cells. However, this expansion procedure is associated with lack of reproducibility and low cell yields. To fulfill the large cell number demand for clinical use, robust large-scale production of these cells under defined conditions is needed. Herein, we describe a scalable, low-cost protocol for expanding hiPSCs as aggregates in a lab-scale bioreactor.
Aerodynamic flow deflector to increase large scale wind turbine power generation by 10%.
DOT National Transportation Integrated Search
2015-11-01
The innovation proposed in this paper has the potential both to address the efficiency demands of wind farm owners and to provide a disruptive design innovation to turbine manufacturers. The aerodynamic deflector technology was created to impr...
Violante, Ines R; Li, Lucia M; Carmichael, David W; Lorenz, Romy; Leech, Robert; Hampshire, Adam; Rothwell, John C; Sharp, David J
2017-03-14
Cognitive functions such as working memory (WM) are emergent properties of large-scale network interactions. Synchronisation of oscillatory activity might contribute to WM by enabling the coordination of long-range processes. However, causal evidence for the way oscillatory activity shapes network dynamics and behavior in humans is limited. Here we applied transcranial alternating current stimulation (tACS) to exogenously modulate oscillatory activity in a right frontoparietal network that supports WM. Externally induced synchronization improved performance when cognitive demands were high. Simultaneously collected fMRI data reveals tACS effects dependent on the relative phase of the stimulation and the internal cognitive processing state. Specifically, synchronous tACS during the verbal WM task increased parietal activity, which correlated with behavioral performance. Furthermore, functional connectivity results indicate that the relative phase of frontoparietal stimulation influences information flow within the WM network. Overall, our findings demonstrate a link between behavioral performance in a demanding WM task and large-scale brain synchronization.
Locating inefficient links in a large-scale transportation network
NASA Astrophysics Data System (ADS)
Sun, Li; Liu, Like; Xu, Zhongzhi; Jie, Yang; Wei, Dong; Wang, Pu
2015-02-01
Based on data from a geographical information system (GIS) and daily commuting origin-destination (OD) matrices, we estimated the distribution of traffic flow in the San Francisco road network and studied Braess's paradox in a large-scale transportation network with realistic travel demand. We measured the variation of total travel time ΔT when a road segment is closed, and found that |ΔT| follows a power-law distribution whether ΔT < 0 or ΔT > 0. This implies that most roads have a negligible effect on the efficiency of the road network, while the failure of a few crucial links would result in severe travel delays, and closure of a few inefficient links would counter-intuitively reduce travel costs considerably. Generating three theoretical networks, we discovered that the heterogeneously distributed travel demand may be the origin of the observed power-law distributions of |ΔT|. Finally, a genetic algorithm was used to pinpoint inefficient link clusters in the road network. We found that closing specific road clusters would further improve the transportation efficiency.
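The ΔT bookkeeping can be sketched as below, assuming fixed shortest-path routing of each origin-destination pair on a toy network invented for illustration; the study itself uses a congestion-aware traffic assignment, which is what allows closures to yield negative ΔT (Braess's paradox), whereas with fixed free-flow times ΔT is always non-negative.

```python
# Toy sketch: change in total travel time when one link is closed.
# Fixed shortest-path routing, no congestion -- illustration only.
import networkx as nx

def total_travel_time(G, od_demand):
    """Sum of (trips x shortest-path travel time) over all OD pairs."""
    return sum(trips * nx.shortest_path_length(G, o, d, weight="time")
               for (o, d), trips in od_demand.items())

def delta_T(G, od_demand, link):
    """Total travel time after closing `link` minus the baseline."""
    base = total_travel_time(G, od_demand)
    H = G.copy()
    H.remove_edge(*link)
    return total_travel_time(H, od_demand) - base

if __name__ == "__main__":
    G = nx.DiGraph()
    for u, v, t in [("A", "B", 4), ("B", "D", 4), ("A", "C", 10), ("C", "D", 10)]:
        G.add_edge(u, v, time=t)
    od = {("A", "D"): 100}             # 100 trips from A to D
    print(delta_T(G, od, ("A", "B")))  # 1200: closing a crucial link adds severe delay
```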
Violante, Ines R; Li, Lucia M; Carmichael, David W; Lorenz, Romy; Leech, Robert; Hampshire, Adam; Rothwell, John C; Sharp, David J
2017-01-01
Cognitive functions such as working memory (WM) are emergent properties of large-scale network interactions. Synchronisation of oscillatory activity might contribute to WM by enabling the coordination of long-range processes. However, causal evidence for the way oscillatory activity shapes network dynamics and behavior in humans is limited. Here we applied transcranial alternating current stimulation (tACS) to exogenously modulate oscillatory activity in a right frontoparietal network that supports WM. Externally induced synchronization improved performance when cognitive demands were high. Simultaneously collected fMRI data reveals tACS effects dependent on the relative phase of the stimulation and the internal cognitive processing state. Specifically, synchronous tACS during the verbal WM task increased parietal activity, which correlated with behavioral performance. Furthermore, functional connectivity results indicate that the relative phase of frontoparietal stimulation influences information flow within the WM network. Overall, our findings demonstrate a link between behavioral performance in a demanding WM task and large-scale brain synchronization. DOI: http://dx.doi.org/10.7554/eLife.22001.001 PMID:28288700
NASA Astrophysics Data System (ADS)
Kennedy, Scott Warren
A steady decline in the cost of wind turbines and increased experience in their successful operation have brought this technology to the forefront of viable alternatives for large-scale power generation. Methodologies for understanding the costs and benefits of large-scale wind power development, however, are currently limited. In this thesis, a new and widely applicable technique for estimating the social benefit of large-scale wind power production is presented. The social benefit is based upon wind power's energy and capacity services and the avoidance of environmental damages. The approach uses probabilistic modeling techniques to account for the stochastic interaction between wind power availability, electricity demand, and conventional generator dispatch. A method for including the spatial smoothing effect of geographically dispersed wind farms is also introduced. The model has been used to analyze potential offshore wind power development to the south of Long Island, NY. If natural gas combined cycle (NGCC) and integrated gasifier combined cycle (IGCC) are the alternative generation sources, wind power exhibits a negative social benefit due to its high capacity cost and the relatively low emissions of these advanced fossil-fuel technologies. Environmental benefits increase significantly if charges for CO2 emissions are included. Results also reveal a diminishing social benefit as wind power penetration increases. The dependence of wind power benefits on natural gas and coal prices is also discussed. In power systems with a high penetration of wind generated electricity, the intermittent availability of wind power may influence hourly spot prices. A price responsive electricity demand model is introduced that shows a small increase in wind power value when consumers react to hourly spot prices. The effectiveness of this mechanism depends heavily on estimates of the own- and cross-price elasticities of aggregate electricity demand. This work makes a valuable contribution by synthesizing information from research in power market economics, power system reliability, and environmental impact assessment, to develop a comprehensive methodology for analyzing wind power in the context of long-term energy planning.
The latest developments and outlook for hydrogen liquefaction technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ohlig, K.; Decker, L.
2014-01-29
Liquefied hydrogen is presently used mainly for space applications and the semiconductor industry. While clean energy applications, e.g. in the automotive sector, currently contribute only a small share of this demand, their demand may see a significant boost in the coming years, with the need for large-scale liquefaction plants far exceeding current plant sizes. Hydrogen liquefaction in small-scale plants with a maximum capacity of 3 tons per day (tpd) is accomplished with a Brayton refrigeration cycle using helium as refrigerant. This technology is characterized by low investment costs but lower process efficiency and hence higher operating costs. For larger plants, a hydrogen Claude cycle is used, characterized by higher investment but lower operating costs. However, liquefaction plants meeting the potentially high demand in the clean energy sector will need further optimization with regard to energy efficiency and hence operating costs. The present paper gives an overview of the currently applied technologies, including their thermodynamic and technical background. Areas of improvement are identified to derive process concepts for future large-scale hydrogen liquefaction plants that meet the needs of clean energy applications with optimized energy efficiency and hence minimized operating costs. Compared to other studies in this field, this paper focuses on the application of new technology and innovative concepts which are either readily available or will require only short qualification procedures, and which will hence allow implementation in plants in the near future.
Gray, Nicola; Lewis, Matthew R; Plumb, Robert S; Wilson, Ian D; Nicholson, Jeremy K
2015-06-05
A new generation of metabolic phenotyping centers is being created to meet the increasing demands of personalized healthcare, and this has resulted in a major requirement for economical, high-throughput metabonomic analysis by liquid chromatography-mass spectrometry (LC-MS). Meeting these new demands represents an emerging bioanalytical problem that must be solved if metabolic phenotyping is to be successfully applied to large clinical and epidemiological sample sets. Ultraperformance (UP)LC-MS-based metabolic phenotyping with 2.1 mm i.d. LC columns enables comprehensive metabolic phenotyping but, when employed for the analysis of thousands of samples, results in high solvent usage. Using UPLC-MS with 1 mm i.d. columns rather than the conventional 2.1 mm i.d. methodology shows that the resulting optimized microbore method provides equivalent or superior performance in terms of peak capacity, sensitivity, and robustness. On average, we also observed, when using the microbore-scale separation, a 2-3-fold increase in response over that obtained with the standard 2.1 mm scale method. When applied to the analysis of human urine, the 1 mm scale method showed no decline in performance over the course of 1000 analyses, illustrating that microbore UPLC-MS represents a viable alternative to conventional 2.1 mm i.d. formats for routine large-scale metabolic profiling studies while also delivering a 75% reduction in solvent usage. The modest increase in sensitivity provided by this methodology also offers the potential either to reduce sample consumption or to increase the number of metabolite features detected with confidence, owing to the increased signal-to-noise ratios obtained. Implementation of this miniaturized UPLC-MS method of metabolic phenotyping provides clear analytical, economic, and environmental benefits for large-scale metabolic profiling studies with similar or improved analytical performance compared to conventional UPLC-MS.
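The ~75% solvent saving quoted above follows, to first order, from scaling the mobile-phase flow rate with the column cross-sectional area at constant linear velocity; this back-of-the-envelope check uses only the column diameters given in the abstract:

$$\frac{F_{1.0}}{F_{2.1}} \approx \left(\frac{d_{1.0}}{d_{2.1}}\right)^{2} = \left(\frac{1.0\ \mathrm{mm}}{2.1\ \mathrm{mm}}\right)^{2} \approx 0.23,$$

i.e. roughly a 77% reduction in solvent consumption per analysis, broadly consistent with the reported ~75% once practical details such as gradient delay volume and re-equilibration are taken into account.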
Land Grabbing and the Commodification of Agricultural Land in Africa
NASA Astrophysics Data System (ADS)
D'Odorico, P.; Rulli, M. C.
2014-12-01
The increasing global demand for farmland products is placing unprecedented pressure on the global agricultural system. This demand can be met either through intensification or through expansion of agricultural production at the expense of other ecosystems. The ongoing escalation of large-scale land acquisitions in the developing world may contribute to both of these processes. Investments in agriculture have become a priority for a number of governments and corporations that are trying to expand their agricultural production while securing good profits. It is unclear, however, to what extent these investments drive the intensification or the expansion of agriculture. In the last decade, large-scale land acquisitions by external investors have increased at unprecedented rates. This global land rush was likely enhanced by recent food crises, when prices skyrocketed in response to crop failures, new bioenergy policies, and the increasing demand for agricultural products by a growing and increasingly affluent human population. Corporations recognized the potential for high-return investments in agricultural land, while governments started to enhance their food security by purchasing large tracts of land in foreign countries. It has been estimated that, to date, about 35.6 million ha of cropland - more than twice the agricultural land of Germany - have been acquired by foreign investors worldwide. As an effect of these land deals, local communities lose legal access to the land and its products. Here we investigate the effect of large-scale land acquisition on agricultural intensification and expansion in African countries. We discuss the extent to which these investments in agriculture may increase crop production and stress how this phenomenon can greatly affect local communities, their food security, their economic stability, and the long-term resilience of their livelihoods, regardless of whether the transfer of property rights is the result of an informed decision and the land was paid for at market value.
Aesthetic demand of French seniors: a large-scale study.
Wulfman, Claudine; Tezenas du Montcel, Sophie; Jonas, Pierre; Fattouh, Jalal; Rignon-Bret, Christophe
2010-12-01
The needs of seniors for oral health and aesthetics are growing, as are their aesthetic demands. This large-scale study aims to identify the demand for aesthetics in a population aged over 55 and the influence of age and gender. A 15-item questionnaire was placed on the web in partnership with a major magazine dedicated to seniors. It reflected practitioners' questions with regard to senior patients' expectations: assessment of aesthetic demand, the most commonly expressed complaints, the importance given to tooth colour, knowledge of available therapeutic treatments, and motivation levels for treatment. The survey generated 3868 responses, 61% from women; 77% of respondents declared being satisfied to very satisfied with their smile. Their highest priority for improving their smile was tooth alignment, followed by tooth shape, length, and shade. Although 60% of respondents were satisfied with their current tooth shade, 53% would prefer to have their teeth whitened. Aesthetic treatments were well known to seniors: over four-fifths of them had heard of dental implants and ceramic crowns. Two-thirds of those who wished to improve their smile were considering dental treatment. The high number of collected questionnaires confirms the strong interest shown by seniors, particularly women, in dental aesthetics. Baby-boomers seem more attentive to the appearance of their smile than their elders. However, the importance of appearance decreases with age, as it becomes less of a priority and attention focuses more on general health. © 2009 The Gerodontology Society and John Wiley & Sons A/S.
Modeling the Economic Feasibility of Large-Scale Net-Zero Water Management: A Case Study.
Guo, Tianjiao; Englehardt, James D; Fallon, Howard J
While municipal direct potable water reuse (DPR) has been recommended for consideration by the U.S. National Research Council, it is unclear how to size new closed-loop DPR plants, termed "net-zero water (NZW) plants", to minimize cost and energy demand assuming upgradient water distribution. Based on a recent model optimizing the economics of plant scale for generalized conditions, the authors evaluated the feasibility and optimal scale of NZW plants for treatment capacity expansion in Miami-Dade County, Florida. Local data on population distribution and topography were input to compare projected costs for NZW vs the current plan. Total cost was minimized at a scale of 49 NZW plants for the service population of 671,823. Total unit cost for NZW systems, which mineralize chemical oxygen demand to below normal detection limits, is projected at ~$10.83 / 1000 gal, approximately 13% above the current plan and less than rates reported for several significant U.S. cities.
Energy storage inherent in large tidal turbine farms
Vennell, Ross; Adcock, Thomas A. A.
2014-01-01
While wind farms have no inherent storage to supply power in calm conditions, this paper demonstrates that large tidal turbine farms in channels have short-term energy storage. This storage lies in the inertia of the oscillating flow and can be used to exceed the previously published upper limit for power production by currents in a tidal channel, while simultaneously maintaining stronger currents. Inertial storage exploits the ability of large farms to manipulate the phase of the oscillating currents by varying the farm's drag coefficient. This work shows that by optimizing how a large farm's drag coefficient varies during the tidal cycle it is possible to have some flexibility about when power is produced. This flexibility can be used in many ways, e.g. producing more power, or to better meet short predictable peaks in demand. This flexibility also allows trading total power production off against meeting peak demand, or mitigating the flow speed reduction owing to power extraction. The effectiveness of inertial storage is governed by the frictional time scale relative to either the duration of a half tidal cycle or the duration of a peak in power demand, thus has greater benefits in larger channels. PMID:24910516
Energy storage inherent in large tidal turbine farms.
Vennell, Ross; Adcock, Thomas A A
2014-06-08
While wind farms have no inherent storage to supply power in calm conditions, this paper demonstrates that large tidal turbine farms in channels have short-term energy storage. This storage lies in the inertia of the oscillating flow and can be used to exceed the previously published upper limit for power production by currents in a tidal channel, while simultaneously maintaining stronger currents. Inertial storage exploits the ability of large farms to manipulate the phase of the oscillating currents by varying the farm's drag coefficient. This work shows that by optimizing how a large farm's drag coefficient varies during the tidal cycle it is possible to have some flexibility about when power is produced. This flexibility can be used in many ways, e.g. producing more power, or to better meet short predictable peaks in demand. This flexibility also allows trading total power production off against meeting peak demand, or mitigating the flow speed reduction owing to power extraction. The effectiveness of inertial storage is governed by the frictional time scale relative to either the duration of a half tidal cycle or the duration of a peak in power demand, thus has greater benefits in larger channels.
Practices and Strategies of Distributed Knowledge Collaboration
ERIC Educational Resources Information Center
Kudaravalli, Srinivas
2010-01-01
Information Technology is enabling large-scale, distributed collaboration across many different kinds of boundaries. Researchers have used the label new organizational forms to describe such collaborations and suggested that they are better able to meet the demands of flexibility, speed and adaptability that characterize the knowledge economy.…
Measurement-Driven Characterization of the Mobile Environment
ERIC Educational Resources Information Center
Soroush, Hamed
2013-01-01
The concurrent deployment of high-quality wireless networks and large-scale cloud services offers the promise of secure ubiquitous access to seemingly limitless amount of content. However, as users' expectations have grown more demanding, the performance and connectivity failures endemic to the existing networking infrastructure have become more…
Residential solar-heating system
NASA Technical Reports Server (NTRS)
1978-01-01
A complete residential solar-heating and hot-water system, when installed in a highly insulated energy-saver home, can supply a large percentage of the total energy demand for space heating and domestic hot water. The system, which uses water-heating energy storage, can be scaled to meet the requirements of the building in which it is installed.
Impact of Addressing Accountability Demands in the United States
ERIC Educational Resources Information Center
Banta, Trudy W.
2010-01-01
Since 1970, quality assurance, or outcomes assessment, has provided guidance for improving pedagogy, curricula and student support programmes in the US. But evidence that student learning has improved remains elusive. Large-scale long-term studies are needed to demonstrate the effects of outcomes assessment on learning.
Electricity by intermittent sources: An analysis based on the German situation 2012
NASA Astrophysics Data System (ADS)
Wagner, Friedrich
2014-02-01
The 2012 data for the German load and for onshore wind, offshore wind, and photovoltaic energy production are used and scaled to the limit of supplying the annual demand (the 100% case). The reference mix of the renewable energy (RE) forms is selected such that the remaining back-up energy is minimised. For the 100% case, the installed RE power has to be about 3 times the present peak load. The back-up system can be reduced by 12% in this case. The surplus energy corresponds to 26% of the demand. The back-up system, and even more so the grid, must be able to cope with large power excursions. All components of the electricity supply system operate at low capacity factors. Large-scale storage can hardly be motivated by the effort to further reduce CO2 emissions. Demand-side management will intensify the present periods of high economic activity; its rigorous implementation will expand economic activity into the weekends. On the basis of a simple criterion, the increase in periods with negative electricity prices in Germany is assessed. It will be difficult with RE to meet the low CO2 emission factors which characterise those European countries that produce electricity mostly from nuclear and hydro power.
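The scaling exercise behind the 100% case can be sketched with hourly time series as below; the synthetic load and generation arrays stand in for the 2012 German data, and a single scale factor is used instead of the separate optimization of the onshore/offshore/PV mix described in the abstract.

```python
# Sketch of the "100% case": scale renewable output so annual generation equals
# annual demand, then tally back-up and surplus energy hour by hour.
# The load and generation series below are synthetic placeholders.
import numpy as np

def scale_to_100_percent(load, renewables):
    factor = load.sum() / renewables.sum()              # scale annual RE to annual demand
    scaled = factor * renewables
    backup = np.clip(load - scaled, 0.0, None).sum()    # energy the back-up must supply
    surplus = np.clip(scaled - load, 0.0, None).sum()   # energy exceeding instantaneous demand
    return factor, backup, surplus

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    hours = 8760
    load = 60.0 + 10.0 * rng.standard_normal(hours)          # GW, synthetic
    renewables = np.abs(20.0 * rng.standard_normal(hours))   # GW, synthetic
    factor, backup, surplus = scale_to_100_percent(load, renewables)
    print(f"scale factor {factor:.2f}, back-up {backup:.0f} GWh, surplus {surplus:.0f} GWh")
```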
SPIKY: a graphical user interface for monitoring spike train synchrony
Mulansky, Mario; Bozanic, Nebojsa
2015-01-01
Techniques for recording large-scale neuronal spiking activity are developing very fast. This leads to an increasing demand for algorithms capable of analyzing large amounts of experimental spike train data. One of the most crucial and demanding tasks is the identification of similarity patterns with a very high temporal resolution and across different spatial scales. To address this task, in recent years three time-resolved measures of spike train synchrony have been proposed, the ISI-distance, the SPIKE-distance, and event synchronization. The Matlab source codes for calculating and visualizing these measures have been made publicly available. However, due to the many different possible representations of the results the use of these codes is rather complicated and their application requires some basic knowledge of Matlab. Thus it became desirable to provide a more user-friendly and interactive interface. Here we address this need and present SPIKY, a graphical user interface that facilitates the application of time-resolved measures of spike train synchrony to both simulated and real data. SPIKY includes implementations of the ISI-distance, the SPIKE-distance, and the SPIKE-synchronization (an improved and simplified extension of event synchronization) that have been optimized with respect to computation speed and memory demand. It also comprises a spike train generator and an event detector that makes it capable of analyzing continuous data. Finally, the SPIKY package includes additional complementary programs aimed at the analysis of large numbers of datasets and the estimation of significance levels. PMID:25744888
SPIKY: a graphical user interface for monitoring spike train synchrony.
Kreuz, Thomas; Mulansky, Mario; Bozanic, Nebojsa
2015-05-01
Techniques for recording large-scale neuronal spiking activity are developing very fast. This leads to an increasing demand for algorithms capable of analyzing large amounts of experimental spike train data. One of the most crucial and demanding tasks is the identification of similarity patterns with a very high temporal resolution and across different spatial scales. To address this task, in recent years three time-resolved measures of spike train synchrony have been proposed, the ISI-distance, the SPIKE-distance, and event synchronization. The Matlab source codes for calculating and visualizing these measures have been made publicly available. However, due to the many different possible representations of the results the use of these codes is rather complicated and their application requires some basic knowledge of Matlab. Thus it became desirable to provide a more user-friendly and interactive interface. Here we address this need and present SPIKY, a graphical user interface that facilitates the application of time-resolved measures of spike train synchrony to both simulated and real data. SPIKY includes implementations of the ISI-distance, the SPIKE-distance, and the SPIKE-synchronization (an improved and simplified extension of event synchronization) that have been optimized with respect to computation speed and memory demand. It also comprises a spike train generator and an event detector that makes it capable of analyzing continuous data. Finally, the SPIKY package includes additional complementary programs aimed at the analysis of large numbers of datasets and the estimation of significance levels. Copyright © 2015 the American Physiological Society.
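For orientation, the ISI-distance mentioned in both entries above is, in broad strokes, the time average of a normalized difference between the instantaneous interspike intervals of two trains (generic notation; the SPIKE-distance and SPIKE-synchronization are defined analogously but differently, see the SPIKY documentation):

$$I(t) = \frac{\bigl|\,x^{(1)}_{\mathrm{ISI}}(t) - x^{(2)}_{\mathrm{ISI}}(t)\,\bigr|}{\max\bigl\{x^{(1)}_{\mathrm{ISI}}(t),\, x^{(2)}_{\mathrm{ISI}}(t)\bigr\}}, \qquad D_{I} = \frac{1}{T}\int_{0}^{T} I(t)\,\mathrm{d}t,$$

where $x^{(n)}_{\mathrm{ISI}}(t)$ is the length of the interspike interval of train $n$ that contains time $t$.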
Choi, BongKyoo; Kawakami, Norito; Chang, SeiJin; Koh, SangBaek; Bjorner, Jakob; Punnett, Laura; Karasek, Robert
2008-01-01
The five-item psychological demands scale of the Job Content Questionnaire (JCQ) has in practice been assumed to be one-dimensional. This study examined whether the scale has sufficient internal consistency and external validity to be treated as a single scale, using cross-national JCQ datasets from the United States, Korea, and Japan. The methods comprised exploratory factor analyses of 22 JCQ items, confirmatory factor analyses of the five psychological demands items, and correlation analyses with mental health indexes. Generally, the exploratory factor analyses displayed the predicted demand/control/support structure when three and four factors were extracted. However, at more detailed levels of exploratory and confirmatory factor analysis, the demands scale showed clear evidence of a multi-factor structure. The correlations of items and subscales of the demands scale with mental health indexes were similar to those of the full scale in the Korean and Japanese datasets, but not in the U.S. data. In 4 of 16 sub-samples of the U.S. data, several significant correlations of the components of the demands scale with job dissatisfaction and life dissatisfaction were obscured by the full scale. The multidimensionality of the psychological demands scale should be considered in psychometric analysis and interpretation, in occupational epidemiologic studies, and in future scale extensions.
Base Station Placement Algorithm for Large-Scale LTE Heterogeneous Networks.
Lee, Seungseob; Lee, SuKyoung; Kim, Kyungsoo; Kim, Yoon Hyuk
2015-01-01
Data traffic demands in cellular networks today are increasing at an exponential rate, giving rise to the development of heterogeneous networks (HetNets), in which small cells complement traditional macro cells by extending coverage to indoor areas. However, the deployment of small cells as part of HetNets creates a key challenge for operators' network planning. In particular, massive and unplanned deployment of base stations can cause high interference, severely degrading network performance. Although different mathematical modeling and optimization methods have been used to approach various problems related to this issue, most traditional network planning models are ill-equipped to deal with HetNet-specific characteristics because of their focus on classical cellular network designs. Furthermore, increased wireless data demands have driven mobile operators to roll out large-scale networks of small long term evolution (LTE) cells. Therefore, in this paper, we aim to derive an optimal network planning algorithm for large-scale LTE HetNets. Recently, attempts have been made to apply evolutionary algorithms (EAs) to the field of radio network planning, since they are characterized as global optimization methods. Yet, EA performance often deteriorates rapidly as the dimensionality of the search space grows. To overcome this limitation when designing optimal network deployments for large-scale LTE HetNets, we decompose the problem and tackle its subcomponents individually. In particular, noting that some HetNet cells have strong correlations due to inter-cell interference, we propose a correlation-grouping approach in which cells are grouped together according to their mutual interference. Both the simulation and analytical results indicate that the proposed solution outperforms, in terms of system throughput, a random-grouping-based EA as well as an EA that detects interacting variables by monitoring changes in the objective function.
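The correlation-grouping step can be sketched as forming connected components of an interference graph, with each resulting group then handed to its own evolutionary optimizer; the interference matrix and threshold below are illustrative assumptions, not values from the paper.

```python
# Sketch of correlation grouping: cells whose mutual interference exceeds a
# threshold end up in the same group (connected component of the graph).
import networkx as nx
import numpy as np

def group_cells(interference, threshold):
    """Return groups of cell indices that interfere strongly with one another."""
    n = interference.shape[0]
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if interference[i, j] >= threshold:
                G.add_edge(i, j)
    return [sorted(component) for component in nx.connected_components(G)]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    I = rng.random((6, 6))
    I = (I + I.T) / 2.0                    # symmetric pairwise interference estimates
    print(group_cells(I, threshold=0.7))   # each group is optimized by a separate EA
```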
1986-08-27
notions of large, medium-sized and small business and of the need to achieve a more just distribution of productive forces. This policy is necessary...element that could complement and raise the effectiveness of the large-scale production that is characteristic of the present phase of development of... productive forces in socialist society. To understand fully the significance of this party demand, we would have to recall that, in his speech at the
The Chandra Deep Wide-Field Survey: Completing the new generation of Chandra extragalactic surveys
NASA Astrophysics Data System (ADS)
Hickox, Ryan
2016-09-01
Chandra X-ray surveys have revolutionized our view of the growth of black holes across cosmic time. Recently, fundamental questions have emerged about the connection of AGN to their host large scale structures that clearly demand a wide, deep survey over a large area, comparable to the recent extensive Chandra surveys in smaller fields. We propose the Chandra Deep Wide-Field Survey (CDWFS) covering the central 6 sq. deg in the Bootes field, totaling 1.025 Ms (building on 550 ks from the HRC GTO program). CDWFS will efficiently probe a large cosmic volume, allowing us to carry out accurate new investigations of the connections between black holes and their large-scale structures, and will complete the next generation surveys that comprise a key part of Chandra's legacy.
Choi, Bongkyoo; Kurowski, Alicia; Bond, Meg; Baker, Dean; Clays, Els; De Bacquer, Dirk; Punnett, Laura
2012-01-01
The construct validity of the Job Content Questionnaire (JCQ) psychological demands scale in relation to physical demands has been inconsistent. This study aims to test, quantitatively and qualitatively, whether the scale's validity differs by occupation. Hierarchical clustering analyses of 10 JCQ psychological and physical demands items were conducted in 61 occupations from two datasets: one of non-faculty workers at a university in the United States (6 occupations, 208 workers in total) and the other of a Belgian working population (55 occupations, 13,039 workers in total). The psychological and physical demands items overlapped in 13 of the 61 occupation-stratified clustering analyses. Most of the overlaps occurred in physically demanding occupations and involved the two psychological demands items 'work fast' and 'work hard'. Generally, the scale reliability was low in such occupations. Additionally, interviews with eight university workers revealed that workers interpreted the two psychological demands items differently depending on the nature of their tasks. The scale's validity thus differed by occupation. The JCQ psychological job demands scale has been used worldwide in many studies as a measure of job demands. This study indicates that the 'work fast' and 'work hard' items of the scale need to be reworded so that they differentiate psychological from physical job demands, as intended.
Gleadall, Andrew; Pan, Jingzhe; Ding, Lifeng; Kruft, Marc-Anton; Curcó, David
2015-11-01
Molecular dynamics (MD) simulations are widely used to analyse materials at the atomic scale. However, MD has high computational demands, which may inhibit its use for simulations of structures involving large numbers of atoms such as amorphous polymer structures. An atomic-scale finite element method (AFEM) is presented in this study with significantly lower computational demands than MD. Due to the reduced computational demands, AFEM is suitable for the analysis of Young's modulus of amorphous polymer structures. This is of particular interest when studying the degradation of bioresorbable polymers, which is the topic of an accompanying paper. AFEM is derived from the inter-atomic potential energy functions of an MD force field. The nonlinear MD functions were adapted to enable static linear analysis. Finite element formulations were derived to represent interatomic potential energy functions between two, three and four atoms. Validation of the AFEM was conducted through its application to atomic structures for crystalline and amorphous poly(lactide). Copyright © 2015 Elsevier Ltd. All rights reserved.
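As an illustration of how element stiffness can be obtained from force-field terms, consider a harmonic bond-stretch potential between two atoms (a common MD functional form; the force field and linearization used in the paper may differ in detail):

$$U(r) = \tfrac{1}{2}\,k_b\,(r - r_0)^2, \qquad k_{\mathrm{element}} = \left.\frac{\partial^2 U}{\partial r^2}\right|_{r = r_0} = k_b,$$

so the two-atom bond maps onto a linear spring element of stiffness $k_b$ acting along the bond axis; analogous second derivatives of the angle and torsion terms yield the three- and four-atom element stiffness matrices used in the static linear analysis.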
DOE Office of Scientific and Technical Information (OSTI.GOV)
Childers, J. T.; Uram, T. D.; LeCompte, T. J.
As the LHC moves to higher energies and luminosity, the demand for computing resources increases accordingly and will soon outpace the growth of the Worldwide LHC Computing Grid. To meet this greater demand, event generation Monte Carlo was targeted for adaptation to run on Mira, the supercomputer at the Argonne Leadership Computing Facility. Alpgen is a Monte Carlo event generation application that is used by LHC experiments in the simulation of collisions that take place in the Large Hadron Collider. This paper details the process by which Alpgen was adapted from a single-processor serial application to a large-scale parallel application and the performance that was achieved.
Childers, J. T.; Uram, T. D.; LeCompte, T. J.; ...
2016-09-29
As the LHC moves to higher energies and luminosity, the demand for computing resources increases accordingly and will soon outpace the growth of the Worldwide LHC Computing Grid. To meet this greater demand, event generation Monte Carlo was targeted for adaptation to run on Mira, the supercomputer at the Argonne Leadership Computing Facility. Alpgen is a Monte Carlo event generation application that is used by LHC experiments in the simulation of collisions that take place in the Large Hadron Collider. This paper details the process by which Alpgen was adapted from a single-processor serial application to a large-scale parallel application and the performance that was achieved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Childers, J. T.; Uram, T. D.; LeCompte, T. J.
As the LHC moves to higher energies and luminosity, the demand for computing resources increases accordingly and will soon outpace the growth of the Worldwide LHC Computing Grid. To meet this greater demand, event generation Monte Carlo was targeted for adaptation to run on Mira, the supercomputer at the Argonne Leadership Computing Facility. Alpgen is a Monte Carlo event generation application that is used by LHC experiments in the simulation of collisions that take place in the Large Hadron Collider. This paper details the process by which Alpgen was adapted from a single-processor serial application to a large-scale parallel application and the performance that was achieved.
Quadratic integrand double-hybrid made spin-component-scaled
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brémond, Éric, E-mail: eric.bremond@iit.it; Savarese, Marika; Sancho-García, Juan C.
2016-03-28
We propose two analytical expressions aiming to rationalize the spin-component-scaled (SCS) and spin-opposite-scaled (SOS) schemes for double-hybrid exchange-correlation density-functionals. Their performances are extensively tested within the framework of the nonempirical quadratic integrand double-hybrid (QIDH) model on energetic properties included into the very large GMTKN30 benchmark database, and on structural properties of semirigid medium-sized organic compounds. The SOS variant is revealed as a less computationally demanding alternative to reach the accuracy of the original QIDH model without losing any theoretical background.
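Schematically, the SCS and SOS schemes rescale the opposite-spin and same-spin components of the second-order perturbative correlation entering the double hybrid (the coefficients are shown generically; the paper derives specific values for the QIDH model):

$$E_c^{\mathrm{PT2}} = c_{\mathrm{OS}}\,E_{\mathrm{OS}}^{\mathrm{PT2}} + c_{\mathrm{SS}}\,E_{\mathrm{SS}}^{\mathrm{PT2}}, \qquad \text{SOS:}\ c_{\mathrm{SS}} = 0,$$

and dropping the same-spin term is what makes the SOS variant less computationally demanding, since Laplace-transform or resolution-of-the-identity techniques can then reduce the formal scaling of the PT2 step.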
Utilizing Online Training for Child Sexual Abuse Prevention: Benefits and Limitations
ERIC Educational Resources Information Center
Paranal, Rechelle; Thomas, Kiona Washington; Derrick, Christina
2012-01-01
The prevalence of child sexual abuse demands innovative approaches to prevent further victimization. The online environment provides new opportunities to expand existing child sexual abuse prevention trainings that target adult gatekeepers and allow for large scale interventions that are fiscally viable. This article discusses the benefits and…
Language and Literacy Shifts in Refugee Populations.
ERIC Educational Resources Information Center
Long, Lynellyn D.
The large scale movements of refugees in many areas of the world are having dramatic impacts on indigenous cultures, languages, and literacies. Both anecdotal evidence and research suggest that the experience of uprooting and displacement creates an increased demand for literacy, new forms of literate expression, and more multilingual…
Stemming the Tide: Retaining and Supporting Science Teachers
ERIC Educational Resources Information Center
Pirkle, Sheila F.
2011-01-01
Chronically high rates of new and experienced science teacher attrition and the findings of new large-scale mentoring programs indicate that administrators should adopt new approaches. A science teacher's role encompasses demanding responsibilities, such as observing laboratory safety and OSHA mandates, as well as management of a business-like,…
Trade-offs between agricultural production and biodiversity for biofuel production
USDA-ARS?s Scientific Manuscript database
Growing energy demands and concerns for climate change have pushed forward the time line for biofuel production. However, the effect of large-scale biofuel production in the U.S. on the agricultural industry, primarily responsible for food production and livestock feed, and biodiversity levels of ma...
Detecting Item Drift in Large-Scale Testing
ERIC Educational Resources Information Center
Guo, Hongwen; Robin, Frederic; Dorans, Neil
2017-01-01
The early detection of item drift is an important issue for frequently administered testing programs because items are reused over time. Unfortunately, operational data tend to be very sparse and do not lend themselves to frequent monitoring analyses, particularly for on-demand testing. Building on existing residual analyses, the authors propose…
How will the changing industrial forest landscape affect forest sustainability?
Eric J. Gustafson; Craig Loehle
2008-01-01
Large-scale divestiture of commercial forestlands is occurring in the United States. Furthermore, increasing demand for cellulose for bioenergy may modify forest management practices widely enough to impact the spatial characteristics of forested landscapes. We used the HARVEST timber harvest simulator to investigate the potential consequences of divestiture and...
The TeraShake Computational Platform for Large-Scale Earthquake Simulations
NASA Astrophysics Data System (ADS)
Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas
Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for larger and more detailed scenarios. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component of the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform, which provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the parallel performance of the TS-AWP on TeraGrid supercomputers, as well as the phases of the TeraShake simulations, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
NASA Astrophysics Data System (ADS)
Jenkins, David R.; Basden, Alastair; Myers, Richard M.
2018-05-01
We propose a solution to the increased computational demands of Extremely Large Telescope (ELT) scale adaptive optics (AO) real-time control with the Intel Xeon Phi Knights Landing (KNL) Many Integrated Core (MIC) architecture. The computational demands of an AO real-time controller (RTC) scale with the fourth power of telescope diameter, and so the next-generation ELTs require orders of magnitude more processing power for the RTC pipeline than existing systems. The Xeon Phi contains a large number (≥64) of low-power x86 CPU cores and high-bandwidth memory integrated into a single socketed server CPU package. The increased parallelism and memory bandwidth are crucial to providing the performance needed to reconstruct wavefronts with the precision required for ELT-scale AO. Here, we demonstrate that the Xeon Phi KNL is capable of performing ELT-scale single-conjugate AO real-time control computation at over 1.0 kHz with less than 20 μs RMS jitter. We have also shown that, with a wavefront sensor camera attached, the KNL can process the real-time control loop at up to 966 Hz, the maximum frame rate of the camera, with jitter remaining below 20 μs RMS. Future studies will explore the use of a cluster of Xeon Phis for the real-time control of the MCAO and MOAO regimes of AO. We find that the Xeon Phi is highly suitable for ELT AO real-time control.
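The quoted fourth-power scaling follows from the matrix-vector multiply at the heart of a conventional AO reconstructor: both the number of wavefront-sensor measurements and the number of actuator commands grow with pupil area, i.e. with the square of the telescope diameter D (a generic order-of-magnitude argument, not the specific pipeline benchmarked here):

$$N_{\mathrm{slopes}} \propto D^{2}, \qquad N_{\mathrm{act}} \propto D^{2}, \qquad \mathrm{cost}_{\mathrm{MVM}} \propto N_{\mathrm{act}}\, N_{\mathrm{slopes}} \propto D^{4}.$$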
Fransson, Eleonor I; Nyberg, Solja T; Heikkilä, Katriina; Alfredsson, Lars; Bacquer, De Dirk; Batty, G David; Bonenfant, Sébastien; Casini, Annalisa; Clays, Els; Goldberg, Marcel; Kittel, France; Koskenvuo, Markku; Knutsson, Anders; Leineweber, Constanze; Magnusson Hanson, Linda L; Nordin, Maria; Singh-Manoux, Archana; Suominen, Sakari; Vahtera, Jussi; Westerholm, Peter; Westerlund, Hugo; Zins, Marie; Theorell, Töres; Kivimäki, Mika
2012-01-20
Job strain (i.e., high job demands combined with low job control) is a frequently used indicator of harmful work stress, but studies have often used partial versions of the complete multi-item job demands and control scales. Understanding whether the different instruments assess the same underlying concepts has crucial implications for the interpretation of findings across studies, harmonisation of multi-cohort data for pooled analyses, and design of future studies. As part of the 'IPD-Work' (Individual-participant-data meta-analysis in working populations) consortium, we compared different versions of the demands and control scales available in 17 European cohort studies. Six of the 17 studies had information on the complete scales and 11 on partial scales. Here, we analyse individual level data from 70 751 participants of the studies which had complete scales (5 demand items, 6 job control items). We found high Pearson correlation coefficients between complete scales of job demands and control relative to scales with at least three items (r > 0.90) and for partial scales with two items only (r = 0.76-0.88). In comparison with scores from the complete scales, the agreement between job strain definitions was very good when only one item was missing in either the demands or the control scale (kappa > 0.80); good for job strain assessed with three demand items and all six control items (kappa > 0.68) and moderate to good when items were missing from both scales (kappa = 0.54-0.76). The sensitivity was > 0.80 when only one item was missing from either scale, decreasing when several items were missing in one or both job strain subscales. Partial job demand and job control scales with at least half of the items of the complete scales, and job strain indices based on one complete and one partial scale, seemed to assess the same underlying concepts as the complete survey instruments.
2012-01-01
Background: Job strain (i.e., high job demands combined with low job control) is a frequently used indicator of harmful work stress, but studies have often used partial versions of the complete multi-item job demands and control scales. Understanding whether the different instruments assess the same underlying concepts has crucial implications for the interpretation of findings across studies, harmonisation of multi-cohort data for pooled analyses, and design of future studies. As part of the 'IPD-Work' (Individual-participant-data meta-analysis in working populations) consortium, we compared different versions of the demands and control scales available in 17 European cohort studies. Methods: Six of the 17 studies had information on the complete scales and 11 on partial scales. Here, we analyse individual level data from 70 751 participants of the studies which had complete scales (5 demand items, 6 job control items). Results: We found high Pearson correlation coefficients between complete scales of job demands and control relative to scales with at least three items (r > 0.90) and for partial scales with two items only (r = 0.76-0.88). In comparison with scores from the complete scales, the agreement between job strain definitions was very good when only one item was missing in either the demands or the control scale (kappa > 0.80); good for job strain assessed with three demand items and all six control items (kappa > 0.68) and moderate to good when items were missing from both scales (kappa = 0.54-0.76). The sensitivity was > 0.80 when only one item was missing from either scale, decreasing when several items were missing in one or both job strain subscales. Conclusions: Partial job demand and job control scales with at least half of the items of the complete scales, and job strain indices based on one complete and one partial scale, seemed to assess the same underlying concepts as the complete survey instruments. PMID:22264402
Zhao, Shanrong; Prenger, Kurt; Smith, Lance
2013-01-01
RNA-Seq is becoming a promising replacement for microarrays in transcriptome profiling and differential gene expression studies. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by applying it in practice to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost-effective, open-source tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and can be used out of the box to process Illumina RNA-Seq datasets. PMID:25937948
Integrated water and renewable energy management: the Acheloos-Peneios region case study
NASA Astrophysics Data System (ADS)
Koukouvinos, Antonios; Nikolopoulos, Dionysis; Efstratiadis, Andreas; Tegos, Aristotelis; Rozos, Evangelos; Papalexiou, Simon-Michael; Dimitriadis, Panayiotis; Markonis, Yiannis; Kossieris, Panayiotis; Tyralis, Christos; Karakatsanis, Georgios; Tzouka, Katerina; Christofides, Antonis; Karavokiros, George; Siskos, Alexandros; Mamassis, Nikos; Koutsoyiannis, Demetris
2015-04-01
Within the ongoing research project "Combined Renewable Systems for Sustainable Energy Development" (CRESSENDO), we have developed a novel stochastic simulation framework for optimal planning and management of large-scale hybrid renewable energy systems, in which hydropower plays the dominant role. The methodology and associated computer tools are tested in two major adjacent river basins in Greece (Acheloos, Peneios) extending over 15 500 km2 (12% of Greek territory). River Acheloos is characterized by very high runoff and holds ~40% of the installed hydropower capacity of Greece. On the other hand, the Thessaly plain drained by Peneios - a key agricultural region for the national economy - usually suffers from water scarcity and systematic environmental degradation. The two basins are interconnected through diversion projects, existing and planned, thus formulating a unique large-scale hydrosystem whose future has been the subject of a great controversy. The study area is viewed as a hypothetically closed, energy-autonomous, system, in order to evaluate the perspectives for sustainable development of its water and energy resources. In this context we seek an efficient configuration of the necessary hydraulic and renewable energy projects through integrated modelling of the water and energy balance. We investigate several scenarios of energy demand for domestic, industrial and agricultural use, assuming that part of the demand is fulfilled via wind and solar energy, while the excess or deficit of energy is regulated through large hydroelectric works that are equipped with pumping storage facilities. The overall goal is to examine under which conditions a fully renewable energy system can be technically and economically viable for such large spatial scale.
Role of optometry school in single day large scale school vision testing
Anuradha, N; Ramani, Krishnakumar
2015-01-01
Background: School vision testing aims at identification and management of refractive errors. Large-scale school vision testing using conventional methods is time-consuming and demands a lot of chair time from eye care professionals. A new strategy involving a school of optometry in single day large scale school vision testing is discussed. Aim: The aim was to describe a new approach to performing vision testing of school children on a large scale in a single day. Materials and Methods: A single day vision testing strategy was implemented wherein 123 members (20 teams comprising optometry students and headed by optometrists) conducted vision testing for children in 51 schools. School vision testing included basic vision screening, refraction, frame measurements, frame choice and referrals for other ocular problems. Results: A total of 12448 children were screened, among whom 420 (3.37%) were identified to have refractive errors. 28 (1.26%) children belonged to the primary, 163 (9.80%) to the middle, 129 (4.67%) to the secondary and 100 (1.73%) to the higher secondary levels of education, respectively. 265 (2.12%) children were referred for further evaluation. Conclusion: Single day large scale school vision testing can be adopted by schools of optometry to reach a higher number of children within a short span. PMID:25709271
Cognitive task demands, self-control demands and the mental well-being of office workers.
Bridger, Robert S; Brasher, Kate
2011-09-01
The cognitive task demands of office workers and the self-control demands of their work roles were measured in a sample of 196 employees in two different office layouts using a self-report questionnaire, which was circulated electronically. Multiple linear regression analysis revealed that both factors were associated with mental well-being, but not with physical well-being, while controlling for exposure to psychosocial stressors. The interaction between cognitive task demands and self-control demands had the strongest association with mental well-being, suggesting that the deleterious effect of one was greater when the other was present. An exploratory analysis revealed that the association was stronger for employees working in a large open-plan office than for those working in smaller offices with more privacy. Frustration of work goals was the cognitive task demand having the strongest negative impact on mental well-being. Methodological limitations and scale psychometrics (particularly the use of the NASA Task Load Index) are discussed. STATEMENT OF RELEVANCE: Modern office work has high mental demands and low physical demands and there is a need to design offices to prevent adverse psychological reactions. It is shown that cognitive task demands interact with self-control demands to degrade mental well-being. The association was stronger in an open-plan office.
Stephanie J. Wessell-Kelly; Deanna H. Olson
2013-01-01
Increasing global demands on forest resources are driving large-scale shifts toward plantation forestry. Simultaneously balancing resource extraction and ecological sustainability objectives in plantation forests requires the incorporation of innovative silvicultural strategies such as leave islands (green-tree retention clusters). Our primary research goal was to...
ERIC Educational Resources Information Center
Achtenhagen, Frank; Winther, Esther
2014-01-01
As a consequence of the large-scale assessment studies (TIMSS; PISA) in compulsory schooling, attention is now being given to the modelling and measurement of competencies in initial vocational education and training. This new output-led perspective of teaching/training and learning/working processes demands new approaches to research. Using the…
Rehabilitation of coastal wetland forests degraded through their conversion to shrimp farms
Peter R. Burbridge; Daniel C. Hellin
2000-01-01
International demand for shrimp has stimulated large-scale conversion of mangrove and other coastal wetlands into brackish water aquaculture ponds. Poor site selection, coupled with poor management and over-intensive development of individual sites, has led to nonsustainable production and often, wholesale abandonment of ponds. This has been followed by further...
USDA-ARS?s Scientific Manuscript database
With the increasing demand for alternative energy sources, perennial grasses are being evaluated for biomass production on large scales. Yet there is concern that some candidate species have the potential to escape cultivation and invade natural areas. Therefore, it is important that components of...
Linguistic Simplification of Mathematics Items: Effects for Language Minority Students in Germany
ERIC Educational Resources Information Center
Haag, Nicole; Heppt, Birgit; Roppelt, Alexander; Stanat, Petra
2015-01-01
In large-scale assessment studies, language minority students typically obtain lower test scores in mathematics than native speakers. Although this performance difference was related to the linguistic complexity of test items in some studies, other studies did not find linguistically demanding math items to be disproportionally more difficult for…
International Students' and Employers' Use of Rankings: A Cross-National Analysis
ERIC Educational Resources Information Center
Souto-Otero, Manuel; Enders, Jürgen
2017-01-01
The article examines, primarily based on large-scale survey data, the functionalist proposition that HE customers, students and employers, demand rankings to be able to adopt informed decisions on where to study and who to recruit respectively. This is contrasted to a Weberian "conflict" perspective on rankings in which positional…
Energy requirement for the production of silicon solar arrays
NASA Technical Reports Server (NTRS)
Lindmayer, J.; Wihl, M.; Scheinine, A.; Morrison, A.
1977-01-01
An assessment of potential changes and alternative technologies which could impact the photovoltaic manufacturing process is presented. Topics discussed include: a multiple wire saw, ribbon growth techniques, silicon casting, and a computer model for a large-scale solar power plant. Emphasis is placed on reducing the energy demands of the manufacturing process.
Multiscale socioeconomic assessment across large ecosystems: lessons from practice
Rebecca J. McLain; Ellen M. Donoghue; Jonathan Kusel; Lita Buttolph; Susan Charnley
2008-01-01
Implementation of ecosystem management projects has created a demand for socioeconomic assessments to predict or evaluate the impacts of ecosystem policies. Social scientists for these assessments face challenges that, although not unique to such projects, are more likely to arise than in smaller scale ones. This article summarizes lessons from our experiences with...
The Infrastructure of Accountability: Data Use and the Transformation of American Education
ERIC Educational Resources Information Center
Anagnostopoulos, Dorothea, Ed.; Rutledge, Stacey A., Ed.; Jacobsen, Rebecca, Ed.
2013-01-01
"The Infrastructure of Accountability" brings together leading and emerging scholars who set forth an ambitious conceptual framework for understanding the full impact of large-scale, performance-based accountability systems on education. Over the past 20 years, schools and school systems have been utterly reshaped by the demands of…
Perceived Energy for Parenting: A New Conceptualization and Scale
ERIC Educational Resources Information Center
Janisse, Heather C.; Barnett, Douglas; Nies, Mary A.
2009-01-01
Parenting may be the most physically and mentally demanding social role people encounter during their life. Personal resources are essential to child rearing, yet perceptions of parenting energy have been largely unexplored. This manuscript reports on the need for and development of a measure of perceived energy for parenting (PEP), as well as a…
Electronic Scientific Data & Literature Aggregation: A Review for Librarians
ERIC Educational Resources Information Center
Losoff, Barbara
2009-01-01
The advent of large-scale digital repositories, along with the need for sharing useful data world-wide, demands change to the current information structure. The merging of digital scientific data with scholarly literature has the potential to fulfill the Semantic Web design principles. This paper will identify factors leading to integration of…
ERIC Educational Resources Information Center
Lei, B.
2017-01-01
This article investigates the traumatic experience of teachers who experienced the 2008 earthquake in Sichuan, China. A survey measuring participants' personal experiences, professional demands, and psychological responses was distributed to 241 teachers in five selected schools. Although the status of schoolteachers' trauma in a postdisaster…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Jacob; Edgar, Thomas W.; Daily, Jeffrey A.
With an ever-evolving power grid, concerns regarding how to maintain system stability, efficiency, and reliability remain constant because of increasing uncertainties and decreasing rotating inertia. To alleviate some of these concerns, demand response represents a viable solution and is virtually an untapped resource in the current power grid. This work describes a hierarchical control framework that allows coordination between distributed energy resources and demand response. This control framework is composed of two control layers: a coordination layer that ensures aggregations of resources are coordinated to achieve system objectives and a device layer that controls individual resources to assure the predetermined power profile is tracked in real time. Large-scale simulations are executed to study the hierarchical control, requiring advancements in simulation capabilities. Technical advancements necessary to investigate and answer control interaction questions, including the Framework for Network Co-Simulation platform and Arion modeling capability, are detailed. Insights into the interdependencies of controls across a complex system and how they must be tuned, as well as validation of the effectiveness of the proposed control framework, are yielded using a large-scale integrated transmission system model coupled with multiple distribution systems.
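To make the two-layer idea above concrete, here is a minimal, hypothetical Python sketch (not the Framework for Network Co-Simulation or Arion tooling from the report): a coordination layer splits an aggregate demand-response target among devices in proportion to an assumed per-device flexibility, and a device layer tracks its share with a simple first-order response.

import numpy as np

def coordinate(target_kw, flexibility_kw):
    # Coordination layer: proportional allocation of the aggregate target.
    return target_kw * flexibility_kw / flexibility_kw.sum()

def device_step(current_kw, setpoint_kw, alpha=0.3):
    # Device layer: move a fraction alpha toward the setpoint each control step.
    return current_kw + alpha * (setpoint_kw - current_kw)

flexibility = np.array([5.0, 10.0, 15.0])        # kW each device can shed (assumed)
setpoints = coordinate(target_kw=12.0, flexibility_kw=flexibility)

power = np.zeros(3)                              # kW currently shed per device
for _ in range(20):                              # real-time tracking loop
    power = device_step(power, setpoints)
print("per-device shed (kW):", np.round(power, 2), "total (kW):", round(power.sum(), 2))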
NASA Astrophysics Data System (ADS)
Ruan, Dianbo; Kim, Myeong-Seong; Yang, Bin; Qin, Jun; Kim, Kwang-Bum; Lee, Sang-Hyun; Liu, Qiuxiang; Tan, Lei; Qiao, Zhijun
2017-10-01
To address the large-scale application demands of high energy density, high power density, and long cycle lifetime, 700-F hybrid capacitor pouch cells have been prepared, comprising ∼240-μm-thick activated carbon cathodes and ∼60-μm-thick Li4Ti5O12 anodes. Microspherical Li4Ti5O12 (M-LTO) synthesized by spray-drying features 200-400 nm primary particles and interconnected nanopore structures. M-LTO half-cells exhibit high specific capacities (175 mAh g-1), good rate capabilities (148 mAh g-1 at 20 C), and ultra-long cycling stabilities (90% specific capacity retention after 10,000 cycles). In addition, the obtained hybrid capacitors comprising activated carbon (AC) and M-LTO show excellent cell performance, achieving a maximum energy density of 51.65 Wh kg-1, a maximum power density of 2466 W kg-1, and ∼92% capacitance retention after 10,000 cycles, thus meeting the demands for large-scale applications such as trolleybuses.
Underreporting on the MMPI-2-RF in a high-demand police officer selection context: an illustration.
Detrick, Paul; Chibnall, John T
2014-09-01
Positive response distortion is common in the high-demand context of employment selection. This study examined positive response distortion, in the form of underreporting, on the Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF). Police officer job applicants completed the MMPI-2-RF under high-demand and low-demand conditions, once during the preemployment psychological evaluation and once without contingencies after completing the police academy. Demand-related score elevations were evident on the Uncommon Virtues (L-r) and Adjustment Validity (K-r) scales. Underreporting was evident on the Higher-Order scales Emotional/Internalizing Dysfunction and Behavioral/Externalizing Dysfunction; 5 of 9 Restructured Clinical scales; 6 of 9 Internalizing scales; 3 of 4 Externalizing scales; and 3 of 5 Personality Psychopathology 5 scales. Regression analyses indicated that L-r predicted demand-related underreporting on behavioral/externalizing scales, and K-r predicted underreporting on emotional/internalizing scales. Select scales of the MMPI-2-RF are differentially associated with different types of underreporting among police officer applicants. PsycINFO Database Record (c) 2014 APA, all rights reserved.
NASA Astrophysics Data System (ADS)
Bhave, Ajay; Dessai, Suraje; Conway, Declan; Stainforth, David
2016-04-01
Deep uncertainty in future climate change and socio-economic conditions necessitates the use of assess-risk-of-policy approaches over predict-then-act approaches for adaptation decision making. Robust Decision Making (RDM) approaches embody this principle and help evaluate the ability of adaptation options to satisfy stakeholder preferences under wide-ranging future conditions. This study involves the simultaneous application of two RDM approaches; qualitative and quantitative, in the Cauvery River Basin in Karnataka (population ~23 million), India. The study aims to (a) determine robust water resources adaptation options for the 2030s and 2050s and (b) compare the usefulness of a qualitative stakeholder-driven approach with a quantitative modelling approach. For developing a large set of future scenarios a combination of climate narratives and socio-economic narratives was used. Using structured expert elicitation with a group of climate experts in the Indian Summer Monsoon, climatic narratives were developed. Socio-economic narratives were developed to reflect potential future urban and agricultural water demand. In the qualitative RDM approach, a stakeholder workshop helped elicit key vulnerabilities, water resources adaptation options and performance criteria for evaluating options. During a second workshop, stakeholders discussed and evaluated adaptation options against the performance criteria for a large number of scenarios of climatic and socio-economic change in the basin. In the quantitative RDM approach, a Water Evaluation And Planning (WEAP) model was forced by precipitation and evapotranspiration data, coherent with the climatic narratives, together with water demand data based on socio-economic narratives. We find that compared to business-as-usual conditions options addressing urban water demand satisfy performance criteria across scenarios and provide co-benefits like energy savings and reduction in groundwater depletion, while options reducing agricultural water demand significantly affect downstream water availability. Water demand options demonstrate potential to improve environmental flow conditions and satisfy legal water supply requirements for downstream riparian states. On the other hand, currently planned large scale infrastructural projects demonstrate reduced value in certain scenarios, illustrating the impacts of lock-in effects of large scale infrastructure. From a methodological perspective, we find that while the stakeholder-driven approach revealed robust options in a resource-light manner and helped initiate much needed interaction amongst stakeholders, the modelling approach provides complementary quantitative information. The study reveals robust adaptation options for this important basin and provides a strong methodological basis for carrying out future studies that support adaptation decision making.
A distributed parallel storage architecture and its potential application within EOSDIS
NASA Technical Reports Server (NTRS)
Johnston, William E.; Tierney, Brian; Feuquay, Jay; Butzer, Tony
1994-01-01
We describe the architecture, implementation, use of a scalable, high performance, distributed-parallel data storage system developed in the ARPA funded MAGIC gigabit testbed. A collection of wide area distributed disk servers operate in parallel to provide logical block level access to large data sets. Operated primarily as a network-based cache, the architecture supports cooperation among independently owned resources to provide fast, large-scale, on-demand storage to support data handling, simulation, and computation.
Improving efficiency of polystyrene concrete production with composite binders
NASA Astrophysics Data System (ADS)
Lesovik, R. V.; Ageeva, M. S.; Lesovik, G. A.; Sopin, D. M.; Kazlitina, O. V.; Mitrokhina, A. A.
2018-03-01
According to leading marketing researchers, the construction market in Russia and the CIS will continue growing at a rapid rate; this applies not only to large-scale major construction but also to the construction of single-family houses and small-scale industrial facilities. As a result, there are increased requirements for thermal insulation of building enclosures and a significant demand for efficient walling materials with high thermal performance. These developments have led to higher requirements imposed on the equipment that produces such materials.
NASA Astrophysics Data System (ADS)
Efstratiadis, Andreas; Tsoukalas, Ioannis; Kossieris, Panayiotis; Karavokiros, George; Christofides, Antonis; Siskos, Alexandros; Mamassis, Nikos; Koutsoyiannis, Demetris
2015-04-01
Modelling of large-scale hybrid renewable energy systems (HRES) is a challenging task, for which several open computational issues exist. HRES comprise typical components of hydrosystems (reservoirs, boreholes, conveyance networks, hydropower stations, pumps, water demand nodes, etc.), which are dynamically linked with renewables (e.g., wind turbines, solar parks) and energy demand nodes. In such systems, apart from the well-known shortcomings of water resources modelling (nonlinear dynamics, unknown future inflows, large number of variables and constraints, conflicting criteria, etc.), additional complexities and uncertainties arise due to the introduction of energy components and associated fluxes. A major difficulty is the need for coupling two different temporal scales, given that in hydrosystem modeling, monthly simulation steps are typically adopted, yet for a faithful representation of the energy balance (i.e. energy production vs. demand) a much finer resolution (e.g. hourly) is required. Another drawback is the increase of control variables, constraints and objectives, due to the simultaneous modelling of the two parallel fluxes (i.e. water and energy) and their interactions. Finally, since the driving hydrometeorological processes of the integrated system are inherently uncertain, it is often essential to use synthetically generated input time series of large length, in order to assess the system performance in terms of reliability and risk, with satisfactory accuracy. To address these issues, we propose an effective and efficient modeling framework, key objectives of which are: (a) the substantial reduction of control variables, through parsimonious yet consistent parameterizations; (b) the substantial decrease of computational burden of simulation, by linearizing the combined water and energy allocation problem of each individual time step, and solving each local sub-problem through very fast linear network programming algorithms, and (c) the substantial decrease of the required number of function evaluations for detecting the optimal management policy, using an innovative, surrogate-assisted global optimization approach.
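As a rough illustration of objective (b) above, the following Python sketch solves a single-time-step water-allocation sub-problem as a small linear program with priority-like unit costs; all quantities are hypothetical, and scipy's linprog stands in for the specialized linear network programming solvers mentioned in the abstract.

import numpy as np
from scipy.optimize import linprog

# One time step of a toy allocation sub-problem: a single reservoir release is
# split between two demand nodes and spill; unit "costs" encode allocation
# priorities (all numbers are illustrative, not from the described framework).
release = 100.0                       # water available this step (hm3)
demands = np.array([70.0, 50.0])      # demand-node targets (hm3)
costs = np.array([1.0, 2.0, 10.0])    # supply node A, supply node B, spill

A_eq = np.array([[1.0, 1.0, 1.0]])    # the full release must go somewhere
b_eq = [release]
bounds = [(0, demands[0]), (0, demands[1]), (0, None)]

res = linprog(c=costs, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("allocation to A, B, spill (hm3):", np.round(res.x, 1))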
Performance evaluation of a full-scale innovative swine waste-to-energy system.
Xu, Jiele; Adair, Charles W; Deshusses, Marc A
2016-09-01
Intensive monitoring was carried out to evaluate the performance of a full-scale innovative swine waste-to-energy system at a commercial swine farm with 8640 head of swine. Detailed mass balances over each unit of the system showed that the system, which includes a 7600 m3 anaerobic digester, a 65-kW microturbine, and a 4200 m3 aeration basin, was able to remove up to 92% of the chemical oxygen demand (COD), 99% of the biological oxygen demand (BOD), 77% of the total nitrogen (TN), and 82% of the total phosphorus (TP) discharged into the system as fresh pig waste. The overall biogas yield based on the COD input was 64% of the maximum theoretical, a value that indicates that even greater environmental benefits could be obtained with process optimization. Overall, the characterization of the material fluxes in the system provides a greater understanding of the fate of organics and nutrients in large-scale animal waste management systems. Copyright © 2016 Elsevier Ltd. All rights reserved.
A depth-first search algorithm to compute elementary flux modes by linear programming.
Quek, Lake-Ee; Nielsen, Lars K
2014-07-30
The decomposition of complex metabolic networks into elementary flux modes (EFMs) provides a useful framework for exploring reaction interactions systematically. Generating a complete set of EFMs for large-scale models, however, is near impossible. Even for moderately-sized models (<400 reactions), existing approaches based on the Double Description method must iterate through a large number of combinatorial candidates, thus imposing an immense processor and memory demand. Based on an alternative elementarity test, we developed a depth-first search algorithm using linear programming (LP) to enumerate EFMs in an exhaustive fashion. Constraints can be introduced to directly generate a subset of EFMs satisfying the set of constraints. The depth-first search algorithm has a constant memory overhead. Using flux constraints, a large LP problem can be massively divided and parallelized into independent sub-jobs for deployment into computing clusters. Since the sub-jobs do not overlap, the approach scales to utilize all available computing nodes with minimal coordination overhead or memory limitations. The speed of the algorithm was comparable to efmtool, a mainstream Double Description method, when enumerating all EFMs; the attrition power gained from performing flux feasibility tests offsets the increased computational demand of running an LP solver. Unlike the Double Description method, the algorithm enables accelerated enumeration of all EFMs satisfying a set of constraints.
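To illustrate the general idea of depth-first enumeration pruned by linear-programming feasibility, consider the following Python example. It is a simplified sketch, not the paper's exact elementarity test, and it omits the minimality check that distinguishes true EFMs; the stoichiometric matrix S and all bounds are hypothetical.

import numpy as np
from scipy.optimize import linprog

def feasible(S, zero_set):
    # LP feasibility: is there a non-trivial flux v >= 0 with S v = 0 and
    # v_i = 0 for every reaction i forced to zero so far?
    n = S.shape[1]
    bounds = [(0.0, 0.0) if i in zero_set else (0.0, 1.0) for i in range(n)]
    res = linprog(c=np.zeros(n),
                  A_ub=-np.ones((1, n)), b_ub=[-1e-6],   # exclude the all-zero flux
                  A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=bounds, method="highs")
    return res.status == 0

def dfs(S, zero_set=frozenset(), i=0, out=None):
    # Depth-first search over "reaction i carries no flux" decisions,
    # pruned whenever the LP shows no steady-state flux remains.
    out = [] if out is None else out
    if not feasible(S, zero_set):
        return out
    if i == S.shape[1]:
        out.append(sorted(set(range(S.shape[1])) - zero_set))
        return out
    dfs(S, zero_set | {i}, i + 1, out)   # branch: force reaction i to zero
    dfs(S, zero_set, i + 1, out)         # branch: leave reaction i free
    return out

# toy network: 2 metabolites, 4 irreversible reactions (columns)
S = np.array([[1.0, -1.0, 0.0, 0.0],
              [0.0, 1.0, -1.0, -1.0]])
print(dfs(S))   # candidate support sets of feasible steady-state fluxes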
NASA Astrophysics Data System (ADS)
Wagener, T.
2017-12-01
Current societal problems and questions demand that we increasingly build hydrologic models for regional or even continental scale assessment of global change impacts. Such models offer new opportunities for scientific advancement, for example by enabling comparative hydrology or connectivity studies, and for improved support of water management decision, since we might better understand regional impacts on water resources from large scale phenomena such as droughts. On the other hand, we are faced with epistemic uncertainties when we move up in scale. The term epistemic uncertainty describes those uncertainties that are not well determined by historical observations. This lack of determination can be because the future is not like the past (e.g. due to climate change), because the historical data is unreliable (e.g. because it is imperfectly recorded from proxies or missing), or because it is scarce (either because measurements are not available at the right scale or there is no observation network available at all). In this talk I will explore: (1) how we might build a bridge between what we have learned about catchment scale processes and hydrologic model development and evaluation at larger scales. (2) How we can understand the impact of epistemic uncertainty in large scale hydrologic models. And (3) how we might utilize large scale hydrologic predictions to understand climate change impacts, e.g. on infectious disease risk.
NASA Astrophysics Data System (ADS)
Vogl, Raimund
2001-08-01
In 1997, a large PACS was first introduced at Innsbruck University Hospital in the context of a new traumatology centre. In the subsequent years, this initial PACS setting covering only one department was expanded to most of the hospital campus, with currently some 250 viewing stations attached. Constantly connecting new modalities and viewing stations created the demand for several redesigns of the original PACS configuration to cope with the increasing data load. We give an account of the changes necessary to develop a multi-hospital PACS and the considerations that led us there. Issues of personnel for running a large-scale PACS are discussed, and we give an outlook on the new information systems currently under development for archiving and communication of general medical imaging data and for simple telemedicine networking between several large university hospitals.
Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses
Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T
2014-01-01
Due to the coming deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600
Implicitly restarted Arnoldi/Lanczos methods for large scale eigenvalue calculations
NASA Technical Reports Server (NTRS)
Sorensen, Danny C.
1996-01-01
Eigenvalues and eigenfunctions of linear operators are important to many areas of applied mathematics. The ability to approximate these quantities numerically is becoming increasingly important in a wide variety of applications. This increasing demand has fueled interest in the development of new methods and software for the numerical solution of large-scale algebraic eigenvalue problems. In turn, the existence of these new methods and software, along with the dramatically increased computational capabilities now available, has enabled the solution of problems that would not even have been posed five or ten years ago. Until very recently, software for large-scale nonsymmetric problems was virtually non-existent. Fortunately, the situation is improving rapidly. The purpose of this article is to provide an overview of the numerical solution of large-scale algebraic eigenvalue problems. The focus will be on a class of methods called Krylov subspace projection methods. The well-known Lanczos method is the premier member of this class. The Arnoldi method generalizes the Lanczos method to the nonsymmetric case. A recently developed variant of the Arnoldi/Lanczos scheme called the Implicitly Restarted Arnoldi Method is presented here in some depth. This method is highlighted because of its suitability as a basis for software development.
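For readers who want to experiment with the Implicitly Restarted Arnoldi Method, SciPy's sparse eigensolver wraps ARPACK, which implements this scheme; the short Python example below (matrix and parameters chosen purely for illustration) computes a few eigenvalues of a large sparse nonsymmetric matrix without ever forming it densely.

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigs

# Large sparse nonsymmetric matrix: a 1-D convection-diffusion-like stencil.
n = 10000
main = 2.0 * np.ones(n)
lower = -1.2 * np.ones(n - 1)    # asymmetric off-diagonals make A nonsymmetric
upper = -0.8 * np.ones(n - 1)
A = sp.diags([lower, main, upper], offsets=[-1, 0, 1], format="csr")

# eigs calls ARPACK's Implicitly Restarted Arnoldi Method to find the
# 6 eigenvalues of largest magnitude (and their eigenvectors).
vals, vecs = eigs(A, k=6, which="LM")
print(np.sort_complex(vals))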
Food or Fuel: New Competition for the World's Cropland. Worldwatch Paper 35.
ERIC Educational Resources Information Center
Brown, Lester R.
The paper explores how continuously expanding world demand for food, feed, and fuel is generating pressure to restructure agricultural land use. In addition, problems related to transfer of agricultural crop land to energy crops are discussed. The technology of energy crops has developed to the point where large-scale commercial production of…
Beyond the Classroom: Creating and Implementing New Models for Teaching.
ERIC Educational Resources Information Center
Free, Coen; Moerman, Yvonne
In the Netherlands, educational innovation related to community colleges has mostly emphasized enlarging the scale of the colleges, resulting in large regional educational centers serving 10,000 students or more. New demand on the educational process has also resulted in a growing examination of the role of the teacher in the classroom. At King…
Microcopying wildland maps for distribution and scanner digitizing
Elliot L Amidon; Joyce E. Dye
1976-01-01
Maps for wildland resource inventory and management purposes typically show vegetation types, soils, and other areal information. For field work, maps must be large-scale. For safekeeping and compact storage, however, they can be reduced onto film, ready to be enlarged on demand by office viewers. By meeting certain simple requirements, film images are potential input...
Adapting Educational Measurement to the Demands of Test-Based Accountability
ERIC Educational Resources Information Center
Koretz, Daniel
2015-01-01
Accountability has become a primary function of large-scale testing in the United States. The pressure on educators to raise scores is vastly greater than it was several decades ago. Research has shown that high-stakes testing can generate behavioral responses that inflate scores, often severely. I argue that because of these responses, using…
ERIC Educational Resources Information Center
Filinger, Ronald H.; Hall, Paul W.
Because large scale individualized learning systems place excessive demands on conventional means of producing audiovisual software, electronic image generation has been investigated as an alternative. A prototype, experimental device, Scanimate-500, was designed and built by the Computer Image Corporation. It uses photographic, television, and…
Developing Server-Side Infrastructure for Large-Scale E-Learning of Web Technology
ERIC Educational Resources Information Center
Simpkins, Neil
2010-01-01
The growth of E-business has made experience in server-side technology an increasingly important area for educators. Server-side skills are in increasing demand and recognised to be of relatively greater value than comparable client-side aspects (Ehie, 2002). In response to this, many educational organisations have developed E-business courses,…
Basin of Mexico: A history of watershed mismanagement
Luis A. Bojorquez Tapia; Exequiel Ezcurra; Marisa Mazari-Hiriart; Salomon Diaz; Paola Gomez; Georgina Alcantar; Daniela Megarejo
2000-01-01
Mexico City Metropolitan Zone (MCMZ) is located within the Basin of Mexico. Because of its large population and demand for natural resources, several authors have questioned the viability of the city, especially in terms of water resources. These are reviewed at the regional and the local scales. It is concluded that a multi-basin management approach is necessary to...
Efficient parallelization of analytic bond-order potentials for large-scale atomistic simulations
NASA Astrophysics Data System (ADS)
Teijeiro, C.; Hammerschmidt, T.; Drautz, R.; Sutmann, G.
2016-07-01
Analytic bond-order potentials (BOPs) provide a way to compute atomistic properties with controllable accuracy. For large-scale computations of heterogeneous compounds at the atomistic level, both the computational efficiency and memory demand of BOP implementations have to be optimized. Since the evaluation of BOPs is a local operation within a finite environment, the parallelization concepts known from short-range interacting particle simulations can be applied to improve the performance of these simulations. In this work, several efficient parallelization methods for BOPs that use three-dimensional domain decomposition schemes are described. The schemes are implemented into the bond-order potential code BOPfox, and their performance is measured in a series of benchmarks. Systems of up to several millions of atoms are simulated on a high performance computing system, and parallel scaling is demonstrated for up to thousands of processors.
Optimal Multi-scale Demand-side Management for Continuous Power-Intensive Processes
NASA Astrophysics Data System (ADS)
Mitra, Sumit
With the advent of deregulation in electricity markets and an increasing share of intermittent power generation sources, the profitability of industrial consumers that operate power-intensive processes has become directly linked to the variability in energy prices. Thus, for industrial consumers that are able to adjust to the fluctuations, time-sensitive electricity prices (as part of so-called Demand-Side Management (DSM) in the smart grid) offer potential economical incentives. In this thesis, we introduce optimization models and decomposition strategies for the multi-scale Demand-Side Management of continuous power-intensive processes. On an operational level, we derive a mode formulation for scheduling under time-sensitive electricity prices. The formulation is applied to air separation plants and cement plants to minimize the operating cost. We also describe how a mode formulation can be used for industrial combined heat and power plants that are co-located at integrated chemical sites to increase operating profit by adjusting their steam and electricity production according to their inherent flexibility. Furthermore, a robust optimization formulation is developed to address the uncertainty in electricity prices by accounting for correlations and multiple ranges in the realization of the random variables. On a strategic level, we introduce a multi-scale model that provides an understanding of the value of flexibility of the current plant configuration and the value of additional flexibility in terms of retrofits for Demand-Side Management under product demand uncertainty. The integration of multiple time scales leads to large-scale two-stage stochastic programming problems, for which we need to apply decomposition strategies in order to obtain a good solution within a reasonable amount of time. Hence, we describe two decomposition schemes that can be applied to solve two-stage stochastic programming problems: First, a hybrid bi-level decomposition scheme with novel Lagrangean-type and subset-type cuts to strengthen the relaxation. Second, an enhanced cross-decomposition scheme that integrates Benders decomposition and Lagrangean decomposition on a scenario basis. To demonstrate the effectiveness of our developed methodology, we provide several industrial case studies throughout the thesis.
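As a toy illustration of scheduling a continuous process under time-sensitive electricity prices, the Python sketch below solves a single-day linear program with hypothetical hourly prices and plant limits; it is a continuous simplification for illustration only, not the mixed-integer mode formulation developed in the thesis.

import numpy as np
from scipy.optimize import linprog

# Hypothetical hourly electricity prices ($/MWh) and plant operating limits.
prices = np.array([60, 55, 40, 30, 28, 35, 50, 80, 95, 90, 70, 65], dtype=float)
hours = prices.size
p_min, p_max = 5.0, 20.0          # MW operating range of the plant
daily_output = 150.0              # MWh of production-equivalent energy required

# Choose hourly power levels that minimize electricity cost while still
# meeting the daily production requirement.
res = linprog(c=prices,
              A_eq=np.ones((1, hours)), b_eq=[daily_output],
              bounds=[(p_min, p_max)] * hours, method="highs")
print("hourly schedule (MW):", np.round(res.x, 1))
print("electricity cost ($):", round(res.fun, 2))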
GIS applications for military operations in coastal zones
Fleming, S.; Jordan, T.; Madden, M.; Usery, E.L.; Welch, R.
2009-01-01
In order to successfully support current and future US military operations in coastal zones, geospatial information must be rapidly integrated and analyzed to meet ongoing force structure evolution and new mission directives. Coastal zones in a military-operational environment are complex regions that include sea, land and air features that demand high-volume databases of extreme detail within relatively narrow geographic corridors. Static products in the form of analog maps at varying scales traditionally have been used by military commanders and their operational planners. The rapidly changing battlefield of 21st Century warfare, however, demands dynamic mapping solutions. Commercial geographic information system (GIS) software for military-specific applications is now being developed and employed with digital databases to provide customized digital maps of variable scale, content and symbolization tailored to unique demands of military units. Research conducted by the Center for Remote Sensing and Mapping Science at the University of Georgia demonstrated the utility of GIS-based analysis and digital map creation when developing large-scale (1:10,000) products from littoral warfare databases. The methodology employed-selection of data sources (including high resolution commercial images and Lidar), establishment of analysis/modeling parameters, conduct of vehicle mobility analysis, development of models and generation of products (such as a continuous sea-land DEM and geo-visualization of changing shorelines with tidal levels)-is discussed. Based on observations and identified needs from the National Geospatial-Intelligence Agency, formerly the National Imagery and Mapping Agency, and the Department of Defense, prototype GIS models for military operations in sea, land and air environments were created from multiple data sets of a study area at US Marine Corps Base Camp Lejeune, North Carolina. Results of these models, along with methodologies for developing large-scale littoral warfare databases, aid the National Geospatial-Intelligence Agency in meeting littoral warfare analysis, modeling and map generation requirements for US military organizations. © 2008 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS).
Economics of carbon dioxide capture and utilization-a supply and demand perspective.
Naims, Henriette
2016-11-01
Lately, the technical research on carbon dioxide capture and utilization (CCU) has achieved important breakthroughs. While single CO2-based innovations are entering the markets, the possible economic effects of a large-scale CO2 utilization still remain unclear to policy makers and the public. Hence, this paper reviews the literature on CCU and provides insights on the motivations and potential of making use of recovered CO2 emissions as a commodity in the industrial production of materials and fuels. By analyzing data on current global CO2 supply from industrial sources, best practice benchmark capture costs and the demand potential of CO2 utilization and storage scenarios with comparative statics, conclusions can be drawn on the role of different CO2 sources. For near-term scenarios the demand for the commodity CO2 can be covered from industrial processes that emit CO2 at a high purity and low benchmark capture cost of approximately 33 €/t. In the long-term, with synthetic fuel production and large-scale CO2 utilization, CO2 is likely to be available from a variety of processes at benchmark costs of approx. 65 €/t. Even if fossil-fired power generation is phased out, the CO2 emissions of current industrial processes would suffice for ambitious CCU demand scenarios. At current economic conditions, the business case for CO2 utilization is technology specific and depends on whether efficiency gains or substitution of volatile priced raw materials can be achieved. Overall, it is argued that CCU should be advanced complementary to mitigation technologies and can unfold its potential in creating local circular economy solutions.
NASA Astrophysics Data System (ADS)
Wada, Y.; Wisser, D.; Bierkens, M. F. P.
2013-02-01
To sustain growing food demand and increasing standard of living, global water withdrawal and consumptive water use have been increasing rapidly. To analyze the human perturbation on water resources consistently over a large scale, a number of macro-scale hydrological models (MHMs) have been developed over the recent decades. However, few models consider the feedback between water availability and water demand, and even fewer models explicitly incorporate water allocation from surface water and groundwater resources. Here, we integrate a global water demand model into a global water balance model, and simulate water withdrawal and consumptive water use over the period 1979-2010, considering water allocation from surface water and groundwater resources and explicitly taking into account feedbacks between supply and demand, using two re-analysis products: ERA-Interim and MERRA. We implement an irrigation water scheme, which works dynamically with daily surface and soil water balance, and include a newly available extensive reservoir data set. Simulated surface water and groundwater withdrawal show generally good agreement with available reported national and sub-national statistics. The results show a consistent increase in both surface water and groundwater use worldwide, but groundwater use has been increasing more rapidly than surface water use since the 1990s. Human impacts on terrestrial water storage (TWS) signals are evident, altering the seasonal and inter-annual variability. The alteration is particularly large over the heavily regulated basins such as the Colorado and the Columbia, and over the major irrigated basins such as the Mississippi, the Indus, and the Ganges. Including human water use generally improves the correlation of simulated TWS anomalies with those of the GRACE observations.
Exploring Cloud Computing for Large-scale Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Guang; Han, Binh; Yin, Jian
This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on-demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often just require a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high performance hardware with low latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a system biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.
Expansion of Human Mesenchymal Stem Cells in a Microcarrier Bioreactor.
Tsai, Ang-Chen; Ma, Teng
2016-01-01
Human mesenchymal stem cells (hMSCs) are considered a primary candidate in cell therapy owing to their self-renewability, high differentiation capabilities, and secretion of trophic factors. In clinical applications, a large quantity of therapeutically competent hMSCs is required, which cannot be produced in conventional petri dish culture. Bioreactors are scalable and have the capacity to meet this production demand. Microcarrier suspension culture in stirred-tank bioreactors is the most widely used method to expand anchorage-dependent cells at large scale. Stirred-tank bioreactors have the potential to scale up, and microcarriers provide a high surface-to-volume ratio. As a result, a spinner flask bioreactor with microcarriers has been commonly used in large-scale expansion of adherent cells. This chapter describes a detailed culture protocol for hMSC expansion in a 125 mL spinner flask using microcarriers (Cytodex I), and a procedure for cell seeding, expansion, metabolic sampling, and quantification and visualization using microculture tetrazolium (MTT) reagent.
Panoptes: web-based exploration of large scale genome variation data.
Vauterin, Paul; Jeffery, Ben; Miles, Alistair; Amato, Roberto; Hart, Lee; Wright, Ian; Kwiatkowski, Dominic
2017-10-15
The size and complexity of modern large-scale genome variation studies demand novel approaches for exploring and sharing the data. In order to unlock the potential of these data for a broad audience of scientists with various areas of expertise, a unified exploration framework is required that is accessible, coherent and user-friendly. Panoptes is an open-source software framework for collaborative visual exploration of large-scale genome variation data and associated metadata in a web browser. It relies on technology choices that allow it to operate in near real-time on very large datasets. It can be used to browse rich, hybrid content in a coherent way, and offers interactive visual analytics approaches to assist the exploration. We illustrate its application using genome variation data of Anopheles gambiae, Plasmodium falciparum and Plasmodium vivax. Freely available at https://github.com/cggh/panoptes, under the GNU Affero General Public License. paul.vauterin@gmail.com. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venteris, Erik R.; Skaggs, Richard; Wigmosta, Mark S.
Algae’s high productivity provides potential resource advantages over other fuel crops. However, demand for land, water, and nutrients must be minimized to avoid impacts on food production. We apply our national-scale, open-pond, growth and resource models to assess several biomass-to-fuel technological pathways based on Chlorella. We compare resource demands between hydrothermal liquefaction (HTL) and lipid extraction (LE) to meet 1.89E+10 and 7.95E+10 L yr-1 biofuel targets. We estimate nutrient demands where post-fuel biomass is consumed as co-products and recycling by anaerobic digestion (AD) or catalytic hydrothermal gasification (CHG). Sites are selected through prioritization based on fuel value relative to a set of site-specific resource costs. The highest priority sites are located along the Gulf of Mexico coast, but potential sites exist nationwide. We find that HTL reduces land and freshwater consumption by up to 46% and saline groundwater by around 70%. Without recycling, nitrogen (N) and phosphorus (P) demand is reduced 33%, but is large relative to current U.S. agricultural consumption. The most nutrient-efficient pathways are LE+CHG for N and HTL+CHG for P (by 42%). Resource gains for HTL+CHG are offset by a 344% increase in N consumption relative to LE+CHG (with potential for further recycling). Nutrient recycling is essential to effective use of alternative nutrient sources. Modeling of utilization availability and costs remains, but we find that for HTL+CHG at the 7.95E+10 L yr-1 production target, municipal sources can offset 17% of N and 40% of P demand and animal manures can generally meet demands.
NASA Astrophysics Data System (ADS)
Chiarelli, Davide Danilo; Davis, Kyle Frankel; Rulli, Maria Cristina; D'Odorico, Paolo
2016-08-01
Pressure on agricultural land has markedly increased since the start of the century, driven by demographic growth, changes in diet, increasing biofuel demand, and globalization. To better ensure access to adequate land and water resources, many investors and countries began leasing large areas of agricultural land in the global South, a phenomenon often termed "large-scale land acquisition" (LSLA). To date, this global land rush has resulted in the appropriation of 41 million hectares and about 490 km3 of freshwater resources, affecting rural livelihoods and local environments. It remains unclear to what extent land and water acquisitions contribute to the emergence of water-stress conditions in acquired areas, and how these demands for water may be impacted by climate change. Here we analyze 18 African countries - 20 Mha (or 80%) of LSLA for the continent - and estimate that under present climate 210 km3 year-1 of water would be appropriated if all acquired areas were actively under production. We also find that consumptive use of irrigation water is disproportionately contributed by water-intensive biofuel crops. Using the IPCC A1B scenario, we find only small changes in green (-1.6%) and blue (+2.0%) water demand in targeted areas. With a 3 °C temperature increase, crop yields are expected to decrease up to 20% with a consequent increase in the water footprint. When the effect of increasing atmospheric CO2 concentrations is accounted for, crop yields increase by as much as 40% with a decrease in water footprint up to 29%. The relative importance of CO2 fertilization and warming will therefore determine water appropriations and changes in water footprint under climate change scenarios.
O'Donnell, Andrew P.; Kurama, Yahya C.; Kalkan, Erol; Taflanidis, Alexandros A.
2017-01-01
This paper experimentally evaluates four methods to scale earthquake ground-motions within an ensemble of records to minimize the statistical dispersion and maximize the accuracy in the dynamic peak roof drift demand and peak inter-story drift demand estimates from response-history analyses of nonlinear building structures. The scaling methods that are investigated are based on: (1) ASCE/SEI 7–10 guidelines; (2) spectral acceleration at the fundamental (first mode) period of the structure, Sa(T1); (3) maximum incremental velocity, MIV; and (4) modal pushover analysis. A total of 720 shake-table tests of four small-scale nonlinear building frame specimens with different static and dynamic characteristics are conducted. The peak displacement demands from full suites of 36 near-fault ground-motion records as well as from smaller “unbiased” and “biased” design subsets (bins) of ground-motions are included. Out of the four scaling methods, ground-motions scaled to the median MIV of the ensemble resulted in the smallest dispersion in the peak roof and inter-story drift demands. Scaling based on MIV also provided the most accurate median demands as compared with the “benchmark” demands for structures with greater nonlinearity; however, this accuracy was reduced for structures exhibiting reduced nonlinearity. The modal pushover-based scaling (MPS) procedure was the only method to conservatively overestimate the median drift demands.
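As a rough illustration of MIV-based scaling, the Python sketch below computes the maximum incremental velocity of each record (the largest area under the acceleration trace between successive zero crossings) and scales every record to the ensemble median; the acceleration histories here are synthetic placeholders, not the near-fault records used in the study.

import numpy as np

def max_incremental_velocity(acc, dt):
    # Largest |area| under the acceleration trace between successive zero crossings.
    crossings = np.where(np.diff(np.sign(acc)) != 0)[0] + 1
    segments = np.split(acc, crossings)
    return max(abs(np.trapz(seg, dx=dt)) for seg in segments)

dt = 0.01                                      # sampling interval (s)
rng = np.random.default_rng(1)
# synthetic stand-ins for an ensemble of acceleration records (m/s^2)
records = [rng.normal(0.0, 1.0, 2000) * np.exp(-np.linspace(0, 3, 2000))
           for _ in range(5)]

mivs = np.array([max_incremental_velocity(a, dt) for a in records])
scale_factors = np.median(mivs) / mivs         # scale each record to the median MIV
print(np.round(scale_factors, 2))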
Forecasting Hourly Water Demands With Seasonal Autoregressive Models for Real-Time Application
NASA Astrophysics Data System (ADS)
Chen, Jinduan; Boccelli, Dominic L.
2018-02-01
Consumer water demands are not typically measured at temporal or spatial scales adequate to support real-time decision making, and recent approaches for estimating unobserved demands using observed hydraulic measurements are generally not capable of forecasting demands and uncertainty information. While time series modeling has shown promise for representing total system demands, these models have generally not been evaluated at spatial scales appropriate for representative real-time modeling. This study investigates the use of a double-seasonal time series model to capture daily and weekly autocorrelations to both total system demands and regional aggregated demands at a scale that would capture demand variability across a distribution system. Emphasis was placed on the ability to forecast demands and quantify uncertainties with results compared to traditional time series pattern-based demand models as well as nonseasonal and single-seasonal time series models. Additional research included the implementation of an adaptive-parameter estimation scheme to update the time series model when unobserved changes occurred in the system. For two case studies, results showed that (1) for the smaller-scale aggregated water demands, the log-transformed time series model resulted in improved forecasts, (2) the double-seasonal model outperformed other models in terms of forecasting errors, and (3) the adaptive adjustment of parameters during forecasting improved the accuracy of the generated prediction intervals. These results illustrate the capabilities of time series modeling to forecast both water demands and uncertainty estimates at spatial scales commensurate for real-time modeling applications and provide a foundation for developing a real-time integrated demand-hydraulic model.
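As an illustration of the general idea (a minimal sketch, not the authors' model or data): additive hour-of-day and hour-of-week profiles are removed from an hourly demand series, a simple AR(1) is fit to the residual, and a one-step forecast with a naive 95% prediction interval is returned. Everything below is synthetic.

```python
# Minimal double-seasonal forecasting sketch (hypothetical data, not the paper's model)
import numpy as np

def forecast_next_hour(demand, day=24, week=168):
    t = np.arange(len(demand))
    daily = np.array([demand[t % day == h].mean() for h in range(day)])       # hour-of-day profile
    resid1 = demand - daily[t % day]
    weekly = np.array([resid1[t % week == h].mean() for h in range(week)])    # hour-of-week profile
    resid = resid1 - weekly[t % week]
    phi = resid[:-1] @ resid[1:] / (resid[:-1] @ resid[:-1])                  # AR(1) on the residual
    sigma = np.std(resid[1:] - phi * resid[:-1])
    n = len(demand)
    point = phi * resid[-1] + daily[n % day] + weekly[n % week]
    return point, (point - 1.96 * sigma, point + 1.96 * sigma)                # naive 95% interval

rng = np.random.default_rng(0)
hours = np.arange(24 * 7 * 8)
series = 100 + 20 * np.sin(2 * np.pi * hours / 24) + 10 * np.sin(2 * np.pi * hours / 168) \
         + rng.normal(0, 3, hours.size)
print(forecast_next_hour(series))
```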
Human cells: new platform for recombinant therapeutic protein production.
Swiech, Kamilla; Picanço-Castro, Virgínia; Covas, Dimas Tadeu
2012-07-01
The demand for recombinant therapeutic proteins is significantly increasing. There is a constant need to improve the existing expression systems, and also to develop novel approaches to meet the demand for therapeutic proteins. Human cell lines have emerged as a new and powerful alternative for the production of human therapeutic proteins because this expression system is expected to produce recombinant proteins with post-translational modifications more similar to their natural counterpart and reduce the potential immunogenic reactions against nonhuman epitopes. Currently, little information about the cultivation of human cells for the production of biopharmaceuticals is available. These cells have shown efficient production at laboratory scale and represent an important tool for the pharmaceutical industry. This review presents the cell lines available for large-scale recombinant protein production and critically evaluates the advantages of this expression system in comparison with other expression systems for recombinant therapeutic protein production. Copyright © 2012 Elsevier Inc. All rights reserved.
Evergreen coniferous forests of the Pacific Northwest.
Waring, R H; Franklin, J F
1979-06-29
The massive, evergreen coniferous forests in the Pacific Northwest are unique among temperate forest regions of the world. The region's forests escaped decimation during Pleistocene glaciation; they are now dominated by a few broadly distributed and well-adapted conifers that grow to large size and great age. Large trees with evergreen needle- or scale-like leaves have distinct advantages under the current climatic regime. Photosynthesis and nutrient uptake and storage are possible during the relatively warm, wet fall and winter months. High evaporative demand during the warm, dry summer reduces photosynthesis. Deciduous hardwoods are repeatedly at a disadvantage in competing with conifers in the regional climate. Their photosynthesis is predominantly limited to the growing season when evaporative demand is high and water is often limiting. Most nutrients needed are also less available at this time. The large size attained by conifers provides a buffer against environmental stress (especially for nutrients and moisture). The long duration between destructive fires and storms permits conifers to outgrow hardwoods with more limited stature and life spans.
Approaches to advance scientific understanding of macrosystems ecology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levy, Ofir; Ball, Becky; Bond-Lamberty, Benjamin
Macrosystem ecological studies inherently investigate processes that interact across multiple spatial and temporal scales, requiring intensive sampling and massive amounts of data from diverse sources to incorporate complex cross-scale and hierarchical interactions. Inherent challenges associated with these characteristics include high computational demands, data standardization and assimilation, identification of important processes and scales without prior knowledge, and the need for large, cross-disciplinary research teams that conduct long-term studies. Therefore, macrosystem ecology studies must utilize a unique set of approaches that are capable of encompassing these methodological characteristics and associated challenges. Several case studies demonstrate innovative methods used in current macrosystem ecology studies.
On-demand high-capacity ride-sharing via dynamic trip-vehicle assignment
Alonso-Mora, Javier; Samaranayake, Samitha; Wallar, Alex; Frazzoli, Emilio; Rus, Daniela
2017-01-01
Ride-sharing services are transforming urban mobility by providing timely and convenient transportation to anybody, anywhere, and anytime. These services present enormous potential for positive societal impacts with respect to pollution, energy consumption, congestion, etc. Current mathematical models, however, do not fully address the potential of ride-sharing. Recently, a large-scale study highlighted some of the benefits of car pooling but was limited to static routes with two riders per vehicle (optimally) or three (with heuristics). We present a more general mathematical model for real-time high-capacity ride-sharing that (i) scales to large numbers of passengers and trips and (ii) dynamically generates optimal routes with respect to online demand and vehicle locations. The algorithm starts from a greedy assignment and improves it through a constrained optimization, quickly returning solutions of good quality and converging to the optimal assignment over time. We quantify experimentally the tradeoff between fleet size, capacity, waiting time, travel delay, and operational costs for low- to medium-capacity vehicles, such as taxis and van shuttles. The algorithm is validated with ∼3 million rides extracted from the New York City taxicab public dataset. Our experimental study considers ride-sharing with rider capacity of up to 10 simultaneous passengers per vehicle. The algorithm applies to fleets of autonomous vehicles and also incorporates rebalancing of idling vehicles to areas of high demand. This framework is general and can be used for many real-time multivehicle, multitask assignment problems. PMID:28049820
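By way of illustration only, a greedy request-to-vehicle assignment of the kind such algorithms start from, before any constrained-optimization improvement; the capacity, waiting-time limit, and Euclidean travel times below are hypothetical stand-ins for road-network travel times.

```python
# Greedy request-to-vehicle assignment sketch (simplified, illustrative only)
import math

def travel(a, b):
    return math.dist(a, b)          # Euclidean stand-in for road-network travel time

def greedy_assign(requests, vehicles, capacity=2, max_wait=8.0):
    """requests: list of (pickup, dropoff) points; vehicles: list of dicts
    with 'pos' and 'riders'. Returns {request_index: vehicle_index}."""
    assignment = {}
    for i, (pickup, dropoff) in enumerate(requests):
        best, best_cost = None, float("inf")
        for j, v in enumerate(vehicles):
            if v["riders"] >= capacity:
                continue            # vehicle already full
            wait = travel(v["pos"], pickup)
            if wait > max_wait:
                continue            # waiting-time constraint violated
            if wait < best_cost:
                best, best_cost = j, wait
        if best is not None:
            assignment[i] = best
            vehicles[best]["riders"] += 1
            vehicles[best]["pos"] = dropoff   # vehicle ends up at the drop-off point
    return assignment

vehicles = [{"pos": (0, 0), "riders": 0}, {"pos": (5, 5), "riders": 0}]
requests = [((1, 1), (3, 3)), ((6, 5), (9, 9)), ((2, 0), (0, 4))]
print(greedy_assign(requests, vehicles))
```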
NASA Astrophysics Data System (ADS)
Breshears, D. D.; Adams, H. D.; Eamus, D.; McDowell, N. G.; Law, D. J.; Will, R. E.; Williams, P.; Zou, C.
2013-12-01
Ecohydrology focuses on the interactions of water availability, ecosystem productivity, and biogeochemical cycles via ecological-hydrological connections. These connections can be particularly pronounced and socially relevant when there are large-scale rapid changes in vegetation. One such key change, vegetation mortality, can be triggered by drought and is projected to become more frequent and/or extensive in the future under changing climate. Recent research on drought-induced vegetation die-off has focused primarily on direct drought effects, such as soil moisture deficit, and, to a much lesser degree, the potential for warmer temperatures to exacerbate stress and accelerate mortality. However, temperature is tightly interrelated with atmospheric demand (vapor pressure deficit, VPD) but the latter has rarely been considered explicitly relative to die-off events. Here we highlight the importance of VPD in addition to soil moisture deficit and warmer temperature as an important driver of future die-off. Recent examples highlighting the importance of VPD include mortality patterns corresponding to VPD drivers, a strong dependence of forest growth on VPD, patterns of observed mortality along an environmental gradient, an experimentally-determined climate envelope for mortality, and a suite of modeling simulations segregating the drought effects of VPD from those of temperature. The vast bulk of evidence suggests that atmospheric demand needs to be considered in addition to temperature and soil moisture deficit in predicting risk of future vegetation die-off and associated ecohydrological transformations.
The cost of a large-scale hollow fibre MBR.
Verrecht, Bart; Maere, Thomas; Nopens, Ingmar; Brepols, Christoph; Judd, Simon
2010-10-01
A cost sensitivity analysis was carried out for a full-scale hollow fibre membrane bioreactor to quantify the effect of design choices and operational parameters on cost. Different options were subjected to a long term dynamic influent profile and evaluated using ASM1 for effluent quality, aeration requirements and sludge production. The results were used to calculate a net present value (NPV), incorporating both capital expenditure (capex), based on costs obtained from equipment manufacturers and full-scale plants, and operating expenditure (opex), accounting for energy demand, sludge production and chemical cleaning costs. Results show that the amount of contingency built in to cope with changes in feedwater flow has a large impact on NPV. Deviation from a constant daily flow increases NPV as mean plant utilisation decreases. Conversely, adding a buffer tank reduces NPV, since less membrane surface is required when average plant utilisation increases. Membrane cost and lifetime are decisive in determining NPV: an increased membrane replacement interval from 5 to 10 years reduces NPV by 19%. Operation at higher SRT increases the NPV, since the reduced costs for sludge treatment are offset by correspondingly higher aeration costs at higher MLSS levels, though the analysis is very sensitive to sludge treatment costs. A higher sustainable flux demands greater membrane aeration, but the subsequent opex increase is offset by the reduced membrane area and the corresponding lower capex. Copyright © 2010 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Döhlert, Peter; Weidauer, Maik; Peifer, Raphael; Kohl, Stephan; Enthaler, Stephan
2015-01-01
The straightforward large-scale synthesis and the ability to adjust the properties of polymers make polymers very attractive materials. Polymers have been used in numerous applications and an increased demand is foreseeable. However, a serious issue is the accumulation of enormous amounts of end-of-life polymers, which are currently recycled by…
Ultrahydrophobic Fluorinated Polyhedral Oligomeric Silsesquioxanes (F-POSS) (Preprint)
2007-01-25
Mabry, Joseph M.; Vij, Ashwani; Iacono, Scott T.; Viers, Brent D.
Recently...there exists a demand to construct ultrahydrophobic materials inspired by nature that are easy to prepare on a large scale. Polyhedral oligomeric
Large scale softwood planting operations in New Brunswick
M. K. Barteaux
1977-01-01
New Brunswick is presently planting 20,000 ac. per year and expects to be planting nearly 60,000 ac. per year by 1979. Total productive forest area is 15,000,000 acres. The program is directly related to projected industrial demand and other forest uses. Plantations are established by scarification and planting of paper pot containers and bare root stock.
Ralph J. Alig
2004-01-01
Over the past 25 years, renewable resource assessments have addressed demand, supply, and inventory of various renewable resources in increasingly sophisticated fashion, including simulation and optimization analyses of area changes in land uses (e.g., urbanization) and land covers (e.g., plantations vs. naturally regenerated forests). This synthesis reviews related...
SnoMAP: Pioneering the Path for Clinical Coding to Improve Patient Care.
Lawley, Michael; Truran, Donna; Hansen, David; Good, Norm; Staib, Andrew; Sullivan, Clair
2017-01-01
The increasing demand for healthcare and the static resources available necessitate data-driven improvements in healthcare at large scale. The SnoMAP tool was rapidly developed to provide an automated solution that transforms and maps clinician-entered data to provide data which is fit for both administrative and clinical purposes. Accuracy of data mapping was maintained.
Forest amount affects soybean productivity in Brazilian agricultural frontier
NASA Astrophysics Data System (ADS)
Rattis, L.; Brando, P. M.; Marques, E. Q.; Queiroz, N.; Silverio, D. V.; Macedo, M.; Coe, M. T.
2017-12-01
Over the past three decades, large tracts of tropical forests have been converted to crop and pasturelands across southern Amazonia, largely to meet the increasing worldwide demand for protein. As the world's population continues to grow and consume more protein per capita, forest conversion to grow more crops could be a potential solution to meet such demand. However, widespread deforestation is expected to negatively affect crop productivity via multiple pathways (e.g., thermal regulation, rainfall, local moisture, pest control, among others). To quantify how deforestation affects crop productivity, we modeled the relationship between forest amount and enhanced vegetation index (EVI—a proxy for crop productivity) during the soybean planting season across southern Amazonia. Our hypothesis that forest amount increases crop productivity received strong support. We found that the maximum MODIS-based EVI in soybean fields increased as a function of forest amount across seven spatial scales: 0.5 km, 1 km, 2 km, 5 km, 10 km, 15 km and 20 km. However, the strength of this relationship varied across years and with precipitation, but only at the local scale (e.g., 500 meters and 1 km radius). Our results highlight the importance of considering forests to design sustainable landscapes.
Parallel Optimization of 3D Cardiac Electrophysiological Model Using GPU
Xia, Yong; Zhang, Henggui
2015-01-01
Large-scale 3D virtual heart model simulations are highly demanding in computational resources. This imposes a big challenge to the traditional computation resources based on CPU environment, which already cannot meet the requirement of the whole computation demands or are not easily available due to expensive costs. GPU as a parallel computing environment therefore provides an alternative to solve the large-scale computational problems of whole heart modeling. In this study, using a 3D sheep atrial model as a test bed, we developed a GPU-based simulation algorithm to simulate the conduction of electrical excitation waves in the 3D atria. In the GPU algorithm, a multicellular tissue model was split into two components: one is the single cell model (ordinary differential equation) and the other is the diffusion term of the monodomain model (partial differential equation). Such a decoupling enabled realization of the GPU parallel algorithm. Furthermore, several optimization strategies were proposed based on the features of the virtual heart model, which enabled a 200-fold speedup as compared to a CPU implementation. In conclusion, an optimized GPU algorithm has been developed that provides an economic and powerful platform for 3D whole heart simulations. PMID:26581957
Parallel Optimization of 3D Cardiac Electrophysiological Model Using GPU.
Xia, Yong; Wang, Kuanquan; Zhang, Henggui
2015-01-01
Large-scale 3D virtual heart model simulations are highly demanding in computational resources. This imposes a big challenge to the traditional computation resources based on CPU environment, which already cannot meet the requirement of the whole computation demands or are not easily available due to expensive costs. GPU as a parallel computing environment therefore provides an alternative to solve the large-scale computational problems of whole heart modeling. In this study, using a 3D sheep atrial model as a test bed, we developed a GPU-based simulation algorithm to simulate the conduction of electrical excitation waves in the 3D atria. In the GPU algorithm, a multicellular tissue model was split into two components: one is the single cell model (ordinary differential equation) and the other is the diffusion term of the monodomain model (partial differential equation). Such a decoupling enabled realization of the GPU parallel algorithm. Furthermore, several optimization strategies were proposed based on the features of the virtual heart model, which enabled a 200-fold speedup as compared to a CPU implementation. In conclusion, an optimized GPU algorithm has been developed that provides an economic and powerful platform for 3D whole heart simulations.
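A serial sketch of the ODE/PDE operator splitting described above, with toy FitzHugh-Nagumo kinetics on a 1D fibre standing in for the sheep atrial cell model; in a GPU implementation each pointwise cell update and each diffusion stencil evaluation maps naturally to one thread.

```python
# Operator-splitting sketch: diffusion (PDE) step, then independent cell ODE step per point
import numpy as np

def step(v, w, D=0.1, dt=0.02, dx=1.0):
    # PDE part: explicit finite-difference diffusion of membrane voltage (periodic ends for simplicity)
    lap = (np.roll(v, 1) + np.roll(v, -1) - 2.0 * v) / dx**2
    v = v + dt * D * lap
    # ODE part: toy FitzHugh-Nagumo kinetics, independent per cell (embarrassingly parallel)
    dv = v - v**3 / 3.0 - w
    dw = 0.08 * (v + 0.7 - 0.8 * w)
    return v + dt * dv, w + dt * dw

v, w = np.full(200, -1.2), np.full(200, -0.6)
v[:10] = 1.0                          # stimulate one end of the fibre
for _ in range(2000):
    v, w = step(v, w)
print(round(float(v.max()), 3))       # an excitation wave has propagated along the fibre
```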
Kim, Sung-Han; Lee, Jae Seong; Hyun, Jung-Ho
2017-07-15
We investigated the environmental impact of a large-scale dyke on the sediment geochemistry, sulfate reduction rates (SRRs), sediment oxygen demand (SOD) and potential contribution of benthic nutrient flux (BNF) to primary production in the Yeongsan River estuary, Yellow Sea. The sediment near the dyke (YE1) with high organic carbon (Corg) content (>4%, dry wt.) was characterized by extremely high SOD (327 mmol m⁻² d⁻¹) and SRRs (91-140 mmol m⁻² d⁻¹). The sulfate reduction accounted for 73% of Corg oxidation, and was responsible for strikingly high concentrations of NH₄⁺ (7.7 mM), PO₄³⁻ (67 μM) and HS⁻ (487 μM) in pore water. The BNF at YE1 accounted for approximately 200% of N and P required for primary production in the water column. The results present one of the most extreme cases in which the construction of an artificial dyke may have profound impacts on the biogeochemical and ecological processes in coastal ecosystems. Copyright © 2017 Elsevier Ltd. All rights reserved.
Scaling earthquake ground motions for performance-based assessment of buildings
Huang, Y.-N.; Whittaker, A.S.; Luco, N.; Hamburger, R.O.
2011-01-01
The impact of alternate ground-motion scaling procedures on the distribution of displacement responses in simplified structural systems is investigated. Recommendations are provided for selecting and scaling ground motions for performance-based assessment of buildings. Four scaling methods are studied, namely, (1) geometric-mean scaling of pairs of ground motions, (2) spectrum matching of ground motions, (3) first-mode-period scaling to a target spectral acceleration, and (4) scaling of ground motions per the distribution of spectral demands. Data were developed by nonlinear response-history analysis of a large family of nonlinear single degree-of-freedom (SDOF) oscillators that could represent fixed-base and base-isolated structures. The advantages and disadvantages of each scaling method are discussed. The relationship between spectral shape and a ground-motion randomness parameter is presented. A scaling procedure that explicitly considers spectral shape is proposed. © 2011 American Society of Civil Engineers.
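As a worked illustration of two of the amplitude-scaling ideas above (all Sa values hypothetical, in g): first-mode-period scaling multiplies a record by the ratio of the target Sa(T1) to the record's own Sa(T1), while geometric-mean scaling applies a single factor to a pair of horizontal components based on the geometric mean of their Sa(T1) values.

```python
# Hedged sketch of two ground-motion scaling factors (values are hypothetical)
import math

def first_mode_factor(sa_record_T1, sa_target_T1):
    # Scale one record so its Sa(T1) matches the target spectral acceleration
    return sa_target_T1 / sa_record_T1

def geomean_factor(sa_comp1_T1, sa_comp2_T1, sa_target_T1):
    # Scale a pair of horizontal components by one common factor
    return sa_target_T1 / math.sqrt(sa_comp1_T1 * sa_comp2_T1)

print(first_mode_factor(0.45, 0.60))      # amplify a weak record (factor ~ 1.33)
print(geomean_factor(0.52, 0.38, 0.60))   # one factor for both components
```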
Purification for the XENONnT dark matter experiment
NASA Astrophysics Data System (ADS)
Brown, Ethan; Xenon Collaboration
2017-01-01
The XENON1T experiment uses 3.5 tons of liquid xenon in a cryogenic detector to search for dark matter. Its upgrade, XENONnT, will similarly house 7.5 tons of liquid xenon. Operation of these large detectors requires continual purification of the xenon in an external purifier, and the need for less than part per billion level oxygen in the xenon, coupled with the large quantity of xenon to be purified, places high demands on the rate of flow through this purification system. Building on the success of the XENON10 and XENON100 experiments, XENON1T circulates gaseous xenon through heated getters at a rate of up to 100 SLPM, pushing commercial pumps to their limits moving this large quantity of gas without interruption for several years. Two upgrades are considered for XENONnT. A custom high-capacity magnetic piston pump based on the one developed for the EXO200 experiment has been scaled up to support the high demands of this much larger experiment. Additionally, a liquid phase circulation and purification system that purifies the cryogenic liquid directly is being developed, which takes advantage of the much smaller volumetric flow demands of liquid relative to gas. The implementation of both upgrades will be presented. Supported by the National Science Foundation.
Space-time dependence between energy sources and climate related energy production
NASA Astrophysics Data System (ADS)
Engeland, Kolbjorn; Borga, Marco; Creutin, Jean-Dominique; Ramos, Maria-Helena; Tøfte, Lena; Warland, Geir
2014-05-01
The European Renewable Energy Directive adopted in 2009 focuses on achieving a 20% share of renewable energy in the EU overall energy mix by 2020. A major part of renewable energy production is related to climate, called "climate related energy" (CRE) production. CRE production systems (wind, solar, and hydropower) are characterized by a large degree of intermittency and variability on both short and long time scales due to the natural variability of climate variables. The main strategies to handle the variability of CRE production include energy-storage, -transport, -diversity and -information (smart grids). The first three strategies aim to smooth out the intermittency and variability of CRE production in time and space whereas the last strategy aims to provide a more optimal interaction between energy production and demand, i.e. to smooth out the residual load (the difference between demand and production). In order to increase the CRE share in the electricity system, it is essential to understand the space-time co-variability between the weather variables and CRE production under both current and future climates. This study presents a review of the literature that seeks to tackle these problems. It reveals that the majority of studies deals with either a single CRE source or with the combination of two CREs, mostly wind and solar. This may be due to the fact that the most advanced countries in terms of wind equipment also have very little hydropower potential (Denmark, Ireland or UK, for instance). Hydropower is characterized by both a large storage capacity and flexibility in electricity production, and has therefore a large potential for both balancing and storing energy from wind- and solar-power. Several studies look at how to better connect regions with a large share of hydropower (e.g., Scandinavia and the Alps) to regions with high shares of wind- and solar-power (e.g., green battery North-Sea net). Considering time scales, various studies consider wind and solar power production and their co-fluctuation at small time scales. The multi-scale nature of the variability is less studied, i.e., the potential adverse or favorable co-fluctuation at intermediate time scales involving water scarcity or abundance, is less present in the literature. Our review points out that it could be especially interesting to promote research on how the pronounced large-scale fluctuations in inflow to hydropower (intra-annual run-off) and smaller scale fluctuations in wind- and solar-power interact in an energy system. There is a need to better represent the profound difference between wind-, solar- and hydro-energy sources. On the one hand, they are all directly linked to the 2-D horizontal dynamics of meteorology. On the other hand, the branching structure of hydrological systems transforms this variability and governs the complex combination of natural inflows and reservoir storage. Finally, we note that CRE production is, in addition to weather, also influenced by the energy system and market, i.e., the energy transport and demand across scales as well as changes of market regulation. The CRE production system thus lies in this nexus between climate, energy systems and market regulations. The work presented is part of the FP7 project COMPLEX (Knowledge based climate mitigation systems for a low carbon economy; http://www.complex.ac.uk)
RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system
Jensen, Tue V.; Pinson, Pierre
2017-01-01
Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model, as well as information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset, open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation. PMID:29182600
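A hypothetical illustration of the "scaled according to the envisioned degree of renewable penetration" idea (the variable names and numbers are invented, not the dataset's actual schema): a single multiplicative factor is applied to the hourly wind and solar series so that, over the period, renewables cover a chosen share of demand.

```python
# Sketch of scaling renewable generation to a target penetration (synthetic hourly data)
import numpy as np

def scale_to_penetration(wind, solar, demand, target_share=0.6):
    factor = target_share * demand.sum() / (wind.sum() + solar.sum())
    return factor * wind, factor * solar

rng = np.random.default_rng(1)
demand = 1000 + 200 * rng.random(8760)
wind, solar = 300 * rng.random(8760), 150 * rng.random(8760)
w, s = scale_to_penetration(wind, solar, demand)
print(round((w.sum() + s.sum()) / demand.sum(), 3))   # ~0.6 by construction
```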
Pollution source localization in an urban water supply network based on dynamic water demand.
Yan, Xuesong; Zhu, Zhixin; Li, Tian
2017-10-27
Urban water supply networks are susceptible to intentional or accidental chemical and biological pollution, which poses a threat to the health of consumers. In recent years, drinking-water pollution incidents have occurred frequently, seriously endangering social stability and security. Real-time monitoring of water quality can be effectively implemented by placing sensors in the water supply network. However, locating the source of pollution from the detection data obtained by water quality sensors is a challenging problem. The difficulty lies in the limited number of sensors, the large number of water supply network nodes, and the dynamic user demand for water, which makes pollution source localization an uncertain, large-scale, and dynamic optimization problem. In this paper, we mainly study the dynamics of the pollution source localization problem. Previous studies of pollution source localization assume that hydraulic inputs (e.g., water demand of consumers) are known. However, because of the inherent variability of urban water demand, the problem is essentially a fluctuating dynamic problem of consumers' water demand. In this paper, the water demand is considered to be stochastic in nature and can be described using a Gaussian model or an autoregressive model. On this basis, an optimization algorithm is proposed based on these two dynamic water demand models to locate the pollution source. The objective of the proposed algorithm is to find the locations and concentrations of pollution sources that minimize the difference between the simulated and detected sensor values. Simulation experiments were conducted using two different sizes of urban water supply network data, and the experimental results were compared with those of the standard genetic algorithm.
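A minimal sketch, under stated assumptions, of the two stochastic demand models mentioned (independent Gaussian and first-order autoregressive) and of the least-squares misfit that the search would minimize; the hydraulic/water-quality simulator that would map a candidate source to simulated sensor readings is deliberately left out.

```python
# Stochastic nodal demand models and a least-squares misfit (illustrative only)
import numpy as np

def gaussian_demand(mean, std, hours, rng):
    # Independent Gaussian fluctuations around the mean hourly demand
    return rng.normal(mean, std, hours)

def ar1_demand(mean, phi, sigma, hours, rng):
    # First-order autoregressive fluctuations (temporally correlated demand)
    d = np.empty(hours)
    d[0] = mean
    for t in range(1, hours):
        d[t] = mean + phi * (d[t - 1] - mean) + rng.normal(0.0, sigma)
    return d

def misfit(simulated, observed):
    # Objective to minimize over candidate source locations and concentrations
    return float(np.sum((np.asarray(simulated) - np.asarray(observed)) ** 2))

rng = np.random.default_rng(2)
print(np.round(ar1_demand(50.0, 0.7, 2.0, 6, rng), 2))
print(misfit([1.2, 0.8, 0.0], [1.0, 0.9, 0.1]))
```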
A depth-first search algorithm to compute elementary flux modes by linear programming
2014-01-01
Background: The decomposition of complex metabolic networks into elementary flux modes (EFMs) provides a useful framework for exploring reaction interactions systematically. Generating a complete set of EFMs for large-scale models, however, is near impossible. Even for moderately-sized models (<400 reactions), existing approaches based on the Double Description method must iterate through a large number of combinatorial candidates, thus imposing an immense processor and memory demand. Results: Based on an alternative elementarity test, we developed a depth-first search algorithm using linear programming (LP) to enumerate EFMs in an exhaustive fashion. Constraints can be introduced to directly generate a subset of EFMs satisfying the set of constraints. The depth-first search algorithm has a constant memory overhead. Using flux constraints, a large LP problem can be massively divided and parallelized into independent sub-jobs for deployment into computing clusters. Since the sub-jobs do not overlap, the approach scales to utilize all available computing nodes with minimal coordination overhead or memory limitations. Conclusions: The speed of the algorithm was comparable to efmtool, a mainstream Double Description method, when enumerating all EFMs; the attrition power gained from performing flux feasibility tests offsets the increased computational demand of running an LP solver. Unlike the Double Description method, the algorithm enables accelerated enumeration of all EFMs satisfying a set of constraints. PMID:25074068
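A hedged sketch of the kind of LP feasibility test such a depth-first search can rely on (a generic steady-state flux feasibility check, not the paper's exact elementarity test): given a stoichiometric matrix S, it asks whether a non-negative flux vector with S v = 0 exists while forcing chosen reactions on or off.

```python
# Generic LP flux-feasibility check (assumes all reactions irreversible for brevity)
import numpy as np
from scipy.optimize import linprog

def feasible(S, forced_on, forced_off):
    n = S.shape[1]
    bounds = []
    for j in range(n):
        if j in forced_off:
            bounds.append((0.0, 0.0))     # reaction excluded from the candidate mode
        elif j in forced_on:
            bounds.append((1.0, None))    # reaction must carry flux
        else:
            bounds.append((0.0, None))
    res = linprog(c=np.zeros(n), A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=bounds, method="highs")
    return res.status == 0                # 0 = a feasible flux vector exists

# Toy linear pathway: r0 makes A, r1 converts A -> B, r2 removes B
S = np.array([[1, -1,  0],
              [0,  1, -1]])
print(feasible(S, forced_on={0}, forced_off=set()))   # True
print(feasible(S, forced_on={0}, forced_off={1}))     # False: flux through r1 is required
```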
On-demand microbicide products: design matters.
Patel, Sravan Kumar; Rohan, Lisa Cencia
2017-12-01
Sexual intercourse (vaginal and anal) is the predominant mode of human immunodeficiency virus (HIV) transmission. Topical microbicides used in an on-demand format (i.e., immediately before or after sex) can be part of an effective tool kit utilized to prevent sexual transmission of HIV. The effectiveness of prevention products is positively correlated with adherence, which is likely to depend on user acceptability of the product. The development of an efficacious and acceptable product is therefore paramount for the success of an on-demand product. Acceptability of on-demand products (e.g., gels, films, and tablets) and their attributes is influenced by a multitude of user-specific factors that span behavioral, lifestyle, socio-economic, and cultural aspects. In addition, physicochemical properties of the drug, anatomical and physiological aspects of anorectal and vaginal compartments, issues relating to large-scale production, and cost can impact product development. These factors together with user preferences determine the design space of an effective, acceptable, and feasible on-demand product. In this review, we summarize the interacting factors that together determine product choice and its target product profile.
Improving the local relevance of large scale water demand predictions: the way forward
NASA Astrophysics Data System (ADS)
Bernhard, Jeroen; Reynaud, Arnaud; de Roo, Ad
2016-04-01
Securing adequate availability of fresh water is of vital importance for socio-economic development of present and future Europe. Due to strong heterogeneity in climate conditions, some regions receive an abundant supply of water, whereas other areas almost completely depend on limited river discharge from upstream catchments. Furthermore, water demand differs greatly between regions due to differences in population density and the local presence of intensive water using industries and agriculture. This results in many situations all across Europe where competition between water users translates into relative scarcity and economic damage. Additionally, inter-related economic and demographic developments, as well as climate change, are expected to further increase the need for efficient management of our water resources in the future. Successful policy making for such complex problems requires a good understanding of the system and reliable forecasting of conditions. The extent and complexity of the water use system, however, stand in stark contrast to the poor state of available data and the lack of reliable predictions for this multi-disciplinary topic. Although the matching of available water to its demand is a European-wide problem, the amount of data with pan-European coverage is limited, usually at national resolution at best. This hinders researchers and policy makers because it renders large-scale water demand predictions of little local relevance, given the strongly heterogeneous regional nature of the problem. We present in our study a first attempt at European-wide water demand predictions based on consistent regional data and econometric methods for the household and industry sectors. We gathered data on water consumption, water prices and other relevant variables at the highest spatial detail available from national statistical offices and other organizational bodies. This database provides the most detailed up-to-date picture of present water use and water prices. Subsequently, econometric estimates allow us to make a monetary valuation of water and identify the dominant drivers of domestic and industrial water demand per country. Combined with socio-economic, demographic and climate scenarios, we made predictions for future Europe. Since this is a first attempt, we obtained mixed results between countries when it comes to data availability and therefore model uncertainty. For some countries we have been able to develop robust predictions based on vast amounts of data, while other countries proved more challenging. We do feel, however, that large-scale predictions based on regional data are the way forward to provide relevant scientific policy support. In order to improve on our work it is imperative to further expand our database of consistent regional data. We are looking forward to any kind of input and would be very interested in sharing our data to collaborate towards a better understanding of the water use system.
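By way of illustration, a log-log household demand regression of the type such econometric estimates rest on, fitted here by ordinary least squares on synthetic data; the actual drivers, functional form, and data sources differ by country and are exactly what the regional database is meant to pin down.

```python
# Illustrative log-log water demand regression on synthetic household data
import numpy as np

rng = np.random.default_rng(3)
n = 500
price = rng.uniform(1.0, 4.0, n)            # EUR per m3 (hypothetical)
income = rng.uniform(15_000, 60_000, n)     # EUR per household per year (hypothetical)
consumption = 80 * price**-0.3 * (income / 30_000)**0.2 * np.exp(rng.normal(0, 0.05, n))

# ln(consumption) = a + e_p * ln(price) + e_y * ln(income); e_p is the price elasticity
X = np.column_stack([np.ones(n), np.log(price), np.log(income)])
coef, *_ = np.linalg.lstsq(X, np.log(consumption), rcond=None)
print("price elasticity ~ %.2f, income elasticity ~ %.2f" % (coef[1], coef[2]))
```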
Power monitoring and control for large scale projects: SKA, a case study
NASA Astrophysics Data System (ADS)
Barbosa, Domingos; Barraca, João. Paulo; Maia, Dalmiro; Carvalho, Bruno; Vieira, Jorge; Swart, Paul; Le Roux, Gerhard; Natarajan, Swaminathan; van Ardenne, Arnold; Seca, Luis
2016-07-01
Large sensor-based science infrastructures for radio astronomy like the SKA will be among the most intensive data-driven projects in the world, facing very demanding computation, storage, and management requirements and, above all, power demands. The geographically wide distribution of the SKA and its associated processing requirements in the form of tailored High Performance Computing (HPC) facilities require a Greener approach towards the Information and Communications Technologies (ICT) adopted for the data processing to enable operational compliance with potentially strict power budgets. Addressing the reduction of electricity costs, improving system power monitoring, and managing the generation and use of electricity at system level are paramount to avoiding future inefficiencies and higher costs and to enabling fulfilment of Key Science Cases. Here we outline major characteristics and innovation approaches to address power efficiency and long-term power sustainability for radio astronomy projects, focusing on Green ICT for science and Smart power monitoring and control.
Networking for large-scale science: infrastructure, provisioning, transport and application mapping
NASA Astrophysics Data System (ADS)
Rao, Nageswara S.; Carter, Steven M.; Wu, Qishi; Wing, William R.; Zhu, Mengxia; Mezzacappa, Anthony; Veeraraghavan, Malathi; Blondin, John M.
2005-01-01
Large-scale science computations and experiments require unprecedented network capabilities in the form of large bandwidth and dynamically stable connections to support data transfers, interactive visualizations, and monitoring and steering operations. A number of component technologies dealing with the infrastructure, provisioning, transport and application mappings must be developed and/or optimized to achieve these capabilities. We present a brief account of the following technologies that contribute toward achieving these network capabilities: (a) DOE UltraScienceNet and NSF CHEETAH network testbeds that provide on-demand and scheduled dedicated network connections; (b) experimental results on transport protocols that achieve close to 100% utilization on dedicated 1 Gbps wide-area channels; (c) a scheme for optimally mapping a visualization pipeline onto a network to minimize the end-to-end delays; and (d) interconnect configuration and protocols that provide multiple Gbps flows from Cray X1 to external hosts.
Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola
2016-01-01
Predictive modelling in drug discovery is challenging to automate as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or when using computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems are lacking in the functionality needed to enable agile and flexible predictive modelling. We here present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster.
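A minimal plain-Luigi sketch of the dependency-driven workflow style that SciLuigi extends (the task names, file paths, and toy logic are illustrative and not taken from the paper; SciLuigi layers its own flow-based API on top of constructs like these).

```python
# Plain Luigi: a two-task chain where training only runs after data preparation
import luigi

class PrepareData(luigi.Task):
    def output(self):
        return luigi.LocalTarget("prepared.csv")
    def run(self):
        with self.output().open("w") as f:
            f.write("feature,label\n1.0,0\n2.0,1\n")

class TrainModel(luigi.Task):
    def requires(self):
        return PrepareData()                  # explicit dependency between tasks
    def output(self):
        return luigi.LocalTarget("model.txt")
    def run(self):
        rows = sum(1 for _ in self.input().open("r")) - 1
        with self.output().open("w") as f:
            f.write("trained on %d rows\n" % rows)

if __name__ == "__main__":
    luigi.build([TrainModel()], local_scheduler=True)
```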
Hybrid LES-RANS technique based on a one-equation near-wall model
NASA Astrophysics Data System (ADS)
Breuer, M.; Jaffrézic, B.; Arora, K.
2008-05-01
In order to reduce the high computational effort of wall-resolved large-eddy simulations (LES), the present paper suggests a hybrid LES-RANS approach which splits up the simulation into a near-wall RANS part and an outer LES part. Generally, RANS is adequate for attached boundary layers requiring reasonable CPU-time and memory, where LES can also be applied but demands extremely large resources. In contrast, RANS often fails in flows with massive separation or large-scale vortical structures. Here, LES is without a doubt the best choice. The basic concept of hybrid methods is to combine the advantages of both approaches yielding a prediction method, which, on the one hand, assures reliable results for complex turbulent flows, including large-scale flow phenomena and massive separation, but, on the other hand, consumes much fewer resources than LES, especially for high Reynolds number flows encountered in technical applications. In the present study, a non-zonal hybrid technique is considered (according to the meaning the authors attach to the terms zonal and non-zonal), which leads to an approach where the suitable simulation technique is chosen more or less automatically. For this purpose the hybrid approach proposed relies on a unique modeling concept. In the LES mode a subgrid-scale model based on a one-equation model for the subgrid-scale turbulent kinetic energy is applied, where the length scale is defined by the filter width. For the viscosity-affected near-wall RANS mode the one-equation model proposed by Rodi et al. (J Fluids Eng 115:196-205, 1993) is used, which is based on the wall-normal velocity fluctuations as the velocity scale and algebraic relations for the length scales. Although the idea of combined LES-RANS methods is not new, a variety of open questions still has to be answered. This includes, in particular, the demand for appropriate coupling techniques between LES and RANS, adaptive control mechanisms, and proper subgrid-scale and RANS models. Here, in addition to the study on the behavior of the suggested hybrid LES-RANS approach, special emphasis is put on the investigation of suitable interface criteria and the adjustment of the RANS model. To investigate these issues, two different test cases are considered. Besides the standard plane channel flow test case, the flow over a periodic arrangement of hills is studied in detail. This test case includes a pressure-induced flow separation and subsequent reattachment. In comparison with a wall-resolved LES prediction, encouraging results are achieved.
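For orientation, a generic one-equation subgrid-scale kinetic-energy model of the type the LES mode described above employs, with the length scale set by the filter width \Delta (the model constants and the exact near-wall formulation of Rodi et al. are not reproduced here):

\nu_t = C_k\,\Delta\,\sqrt{k_{\mathrm{sgs}}}, \qquad
\frac{\partial k_{\mathrm{sgs}}}{\partial t} + \bar{u}_j\,\frac{\partial k_{\mathrm{sgs}}}{\partial x_j}
= 2\,\nu_t\,\bar{S}_{ij}\bar{S}_{ij}
- C_\varepsilon\,\frac{k_{\mathrm{sgs}}^{3/2}}{\Delta}
+ \frac{\partial}{\partial x_j}\left[\left(\nu + \nu_t\right)\frac{\partial k_{\mathrm{sgs}}}{\partial x_j}\right].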
NASA Astrophysics Data System (ADS)
Brett, Gareth; Barnett, Matthew
2014-12-01
Liquid Air Energy Storage (LAES) provides large scale, long duration energy storage at the point of demand in the 5 MW/20 MWh to 100 MW/1,000 MWh range. LAES combines mature components from the industrial gas and electricity industries assembled in a novel process and is one of the few storage technologies that can be delivered at large scale, with no geographical constraints. The system uses no exotic materials or scarce resources and all major components have a proven lifetime of 25+ years. The system can also integrate low grade waste heat to increase power output. Founded in 2005, Highview Power Storage is a UK-based developer of LAES. The company has taken the concept from academic analysis, through laboratory testing, and in 2011 commissioned the world's first fully integrated system at pilot plant scale (300 kW/2.5 MWh) hosted at SSE's (Scottish & Southern Energy) 80 MW Biomass Plant in Greater London which was partly funded by a Department of Energy and Climate Change (DECC) grant. Highview is now working with commercial customers to deploy multi MW commercial reference plants in the UK and abroad.
Large-scale high-throughput computer-aided discovery of advanced materials using cloud computing
NASA Astrophysics Data System (ADS)
Bazhirov, Timur; Mohammadi, Mohammad; Ding, Kevin; Barabash, Sergey
Recent advances in cloud computing have made it possible to access large-scale computational resources completely on-demand in a rapid and efficient manner. When combined with high fidelity simulations, they serve as an alternative pathway to enable computational discovery and design of new materials through large-scale high-throughput screening. Here, we present a case study for a cloud platform implemented at Exabyte Inc. We perform calculations to screen lightweight ternary alloys for thermodynamic stability. Due to the lack of experimental data for most such systems, we rely on theoretical approaches based on first-principles pseudopotential density functional theory. We calculate the formation energies for a set of ternary compounds approximated by special quasirandom structures. During an example run we were able to scale to 10,656 CPUs within 7 minutes from the start, and obtain results for 296 compounds within 38 hours. The results indicate that the ultimate formation enthalpy of ternary systems can be negative for some of the lightweight alloys, including Li and Mg compounds. We conclude that compared to the traditional capital-intensive approach that requires investment in on-premises hardware resources, cloud computing is agile and cost-effective, yet scalable and delivers similar performance.
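The thermodynamic-stability screen rests on the standard formation enthalpy per atom with respect to the elemental reference phases; for a ternary compound A_x B_y C_z with DFT total energies E,

\Delta H_f = \frac{E(\mathrm{A}_x\mathrm{B}_y\mathrm{C}_z) - x\,E(\mathrm{A}) - y\,E(\mathrm{B}) - z\,E(\mathrm{C})}{x + y + z},

where a negative value indicates stability against decomposition into the elements; stability against competing binary and ternary phases additionally requires comparison with the convex hull of known compounds.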
From 5 Million to 20 Million a Year: The Challenge of Scale, Quality and Relevance in India's TVET
ERIC Educational Resources Information Center
Mehrotra, Santosh
2014-01-01
In the first decade of this century, India became one of the world's fastest growing large economies, and began to face serious skill-related shortages of workers. Its TVET system has not responded adequately to the growth in demand for semi-skilled and skilled workers. This article describes six sets of reforms that India's educational planners…
2015-01-01
Table 2: Segregation results in terms of STOI on a variety of novel noises (SNR = -2 dB): Babble-20, Cafeteria, Factory, Babble-100, Living Room, Cafe, Park. ...NOISEX-92 corpus [13], and a living room, a cafe and a park noise from the DEMAND corpus [12]. To put the performance of the noise-independent model in
ERIC Educational Resources Information Center
Harris, Diane; Pampaka, Maria
2016-01-01
Drawing on large-scale survey data and interviews with students during their first year at university, and case studies in their institutions, we explore the problems faced by students taking mathematically demanding courses, e.g. physics and engineering. These students are often taught mathematics as a service subject by lecturers of mathematics.…
Joseph O. Sexton; R. Douglas Ramsey; Dale L. Bartos
2006-01-01
Quaking aspen (Populus tremuloides Michx.) is the most widely distributed tree species in North America, but its presence is declining across much of the Western United States. Aspen decline is complex, but results largely from two factors widely divergent in temporal scale: (1) Holocene climatic drying of the region has led to water limitation of aspen seedling...
Spanish version of Bus Drivers' Job Demands Scale (BDJD-24).
Boada-Grau, Joan; Prizmic-Kuzmica, Aldo-Javier; González-Fernández, Marcos-David; Vigil-Colet, Andreu
2013-01-01
Karasek and Theorell's Job Demands-Control Model argues that adverse health-related outcomes, both psychological and physiological, arise from a combination of high job demand and a low level of job control. The objective was to adapt Meijman and Kompier's Bus Drivers' Job Demands Scale (BDJD-24), which enables us to assess the job demands of bus drivers, to Spanish. The final version of the Spanish adaptation was applied to a sample made up of 287 bus drivers living in Spain (80.1% men and 19.9% women), whose average age was 40.44 (SD= 11.78). The results yielded a three-factor structure for the scale used: Time Pressure, Safety, and Passengers. These findings confirm that the Spanish version replicates the factor structure of the original English scale. The reliability of the three subscales was acceptable, ranging from .75 to .84. Furthermore, the subscales were also related to different external correlates and to other scales and showed good convergent and criterion validity. The present instrument can be used to evaluate job demands of bus drivers, as its psychometrics are substantially sound.
Mastenbroek, N J J M; Demerouti, E; van Beukelen, P; Muijtjens, A M M; Scherpbier, A J J A; Jaarsma, A D C
2014-02-15
The Job Demands-Resources model (JD-R model) was used as the theoretical basis of a tailormade questionnaire to measure the psychosocial work environment and personal resources of recently graduated veterinary professionals. According to the JD-R model, two broad categories of work characteristics that determine employee wellbeing can be distinguished: job demands and job resources. Recently, the JD-R model has been expanded by integrating personal resource measures into the model. Three semistructured group interviews with veterinarians active in different work domains were conducted to identify relevant job demands, job resources and personal resources. These demands and resources were organised in themes (constructs). For measurement purposes, a set of questions ('a priori scale') was selected from the literature for each theme. The full set of a priori scales was included in a questionnaire that was administered to 1760 veterinary professionals. Exploratory factor analysis and reliability analysis were conducted to arrive at the final set of validated scales (final scales). 860 veterinarians (73 per cent females) participated. The final set of scales consisted of seven job demands scales (32 items), nine job resources scales (41 items), and six personal resources scales (26 items) which were considered to represent the most relevant potential predictors of work-related wellbeing in this occupational group. The procedure resulted in a tailormade questionnaire: the Veterinary Job Demands and Resources Questionnaire (Vet-DRQ). The use of valid theory and validated scales enhances opportunities for comparative national and international research.
Methane hydrates and the future of natural gas
Ruppel, Carolyn
2011-01-01
For decades, gas hydrates have been discussed as a potential resource, particularly for countries with limited access to conventional hydrocarbons or a strategic interest in establishing alternative, unconventional gas reserves. Methane has never been produced from gas hydrates at a commercial scale and, barring major changes in the economics of natural gas supply and demand, commercial production at a large scale is considered unlikely to commence within the next 15 years. Given the overall uncertainty still associated with gas hydrates as a potential resource, they have not been included in the EPPA model in MITEI’s Future of Natural Gas report. Still, gas hydrates remain a potentially large methane resource and must necessarily be included in any consideration of the natural gas supply beyond two decades from now.
Drivers and barriers to e-invoicing adoption in Greek large scale manufacturing industries
NASA Astrophysics Data System (ADS)
Marinagi, Catherine; Trivellas, Panagiotis; Reklitis, Panagiotis; Skourlas, Christos
2015-02-01
This paper attempts to investigate the drivers and barriers that large-scale Greek manufacturing industries experience in adopting electronic invoices (e-invoices), based on three case studies with organizations having an international presence in many countries. The study focuses on the drivers that may affect the increase of the adoption and use of e-invoicing, including the customers' demand for e-invoices, and sufficient know-how and adoption of e-invoicing in organizations. In addition, the study reveals important barriers that prevent the expansion of e-invoicing, such as suppliers' reluctance to implement e-invoicing, and IT infrastructure incompatibilities. Other issues examined by this study include the observed benefits from e-invoicing implementation, and the financial priorities of the organizations assumed to be supported by e-invoicing.
VanderKooi, S.P.; Thorsteinson, L.
2007-01-01
Water allocation among human and natural resource uses in the American West is challenging. Western rivers have been largely managed for hydropower, irrigation, drinking water, and navigation. Today land and water use practices have gained importance, particularly as aging dams are faced with re-licensing requirements and provisions of the Endangered Species and Clean Water Acts. Rising demand for scarce water heightens the need for scientific research to predict consequences of management actions on habitats, human resource use, and fish and wildlife. Climate change, introduction of invasive species, or restoration of fish passage can have large, landscape-scaled consequences - research must expand to encompass the appropriate scale and by applying multiple scientific disciplines to complex ecosystem challenges improve the adaptive management framework for decision-making.
He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z
2013-12-04
Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.
Large-scale bioenergy production: how to resolve sustainability trade-offs?
NASA Astrophysics Data System (ADS)
Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag
2018-02-01
Large-scale 2nd generation bioenergy deployment is a key element of 1.5 °C and 2 °C transformation pathways. However, large-scale bioenergy production might have negative sustainability implications and thus may conflict with the Sustainable Development Goal (SDG) agenda. Here, we carry out a multi-criteria sustainability assessment of large-scale bioenergy crop production throughout the 21st century (300 EJ in 2100) using a global land-use model. Our analysis indicates that large-scale bioenergy production without complementary measures results in negative effects on the following sustainability indicators: deforestation, CO2 emissions from land-use change, nitrogen losses, unsustainable water withdrawals and food prices. One of our main findings is that single-sector environmental protection measures next to large-scale bioenergy production are prone to involve trade-offs among these sustainability indicators—at least in the absence of more efficient land or water resource use. For instance, if bioenergy production is accompanied by forest protection, deforestation and associated emissions (SDGs 13 and 15) decline substantially whereas food prices (SDG 2) increase. However, our study also shows that this trade-off strongly depends on the development of future food demand. In contrast to environmental protection measures, we find that agricultural intensification lowers some side-effects of bioenergy production substantially (SDGs 13 and 15) without generating new trade-offs—at least among the sustainability indicators considered here. Moreover, our results indicate that a combination of forest and water protection schemes, improved fertilization efficiency, and agricultural intensification would reduce the side-effects of bioenergy production most comprehensively. However, although our study includes more sustainability indicators than previous studies on bioenergy side-effects, our study represents only a small subset of all indicators relevant for the SDG agenda. Based on this, we argue that the development of policies for regulating externalities of large-scale bioenergy production should rely on broad sustainability assessments to discover potential trade-offs with the SDG agenda before implementation.
Benchmark of Client and Server-Side Catchment Delineation Approaches on Web-Based Systems
NASA Astrophysics Data System (ADS)
Demir, I.; Sermet, M. Y.; Sit, M. A.
2016-12-01
Recent advances in internet and cyberinfrastructure technologies have provided the capability to acquire large scale spatial data from various gauges and sensor networks. The collection of environmental data increased demand for applications which are capable of managing and processing large-scale and high-resolution data sets. With the amount and resolution of data sets provided, one of the challenging tasks for organizing and customizing hydrological data sets is delineation of watersheds on demand. Watershed delineation is a process for creating a boundary that represents the contributing area for a specific control point or water outlet, with intent of characterization and analysis of portions of a study area. Although many GIS tools and software for watershed analysis are available on desktop systems, there is a need for web-based and client-side techniques for creating a dynamic and interactive environment for exploring hydrological data. In this project, we demonstrated several watershed delineation techniques on the web with various techniques implemented on the client-side using JavaScript and WebGL, and on the server-side using Python and C++. We also developed a client-side GPGPU (General Purpose Graphical Processing Unit) algorithm to analyze high-resolution terrain data for watershed delineation which allows parallelization using the GPU. The web-based real-time analysis of watershed segmentation can be helpful for decision-makers and interested stakeholders while eliminating the need to install complex software packages and deal with large-scale data sets. Utilization of client-side hardware resources also eliminates the need for servers due to its crowdsourcing nature. Our goal for future work is to improve other hydrologic analysis methods such as rain flow tracking by adapting the presented approaches.
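The abstract does not reproduce the delineation algorithm itself. As a rough, hypothetical illustration of the kind of computation involved (in Python rather than the authors' JavaScript/WebGL, C++ or GPGPU code), the sketch below marks every cell of a small D8 flow-direction grid that drains to a chosen outlet; the toy grid and the delineate helper are placeholders, not the project's implementation.

    # Minimal sketch of watershed delineation on a D8 flow-direction grid.
    # Illustrative only: the grid, the outlet and the helper are hypothetical
    # and do not reproduce the client/server implementations described above.
    from collections import deque

    # D8 code -> (row, col) offset of the cell that water flows INTO
    D8 = {1: (0, 1), 2: (1, 1), 4: (1, 0), 8: (1, -1),
          16: (0, -1), 32: (-1, -1), 64: (-1, 0), 128: (-1, 1)}

    def delineate(flow_dir, outlet):
        """Return the set of cells (row, col) draining to 'outlet'."""
        rows, cols = len(flow_dir), len(flow_dir[0])
        upstream = {(r, c): [] for r in range(rows) for c in range(cols)}
        for r in range(rows):               # invert the flow graph
            for c in range(cols):
                dr, dc = D8[flow_dir[r][c]]
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    upstream[(nr, nc)].append((r, c))
        basin, queue = {outlet}, deque([outlet])
        while queue:                        # breadth-first search upstream
            for up in upstream[queue.popleft()]:
                if up not in basin:
                    basin.add(up)
                    queue.append(up)
        return basin

    # Toy 3x3 grid in which every cell eventually drains to the lower-right.
    flow_dir = [[2, 4, 4],
                [1, 2, 4],
                [1, 1, 1]]
    print(sorted(delineate(flow_dir, (2, 2))))   # lists all nine cells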
Large-scale production of functional human lysozyme from marker-free transgenic cloned cows.
Lu, Dan; Liu, Shen; Ding, Fangrong; Wang, Haiping; Li, Jing; Li, Ling; Dai, Yunping; Li, Ning
2016-03-10
Human lysozyme is an important natural non-specific immune protein that is highly expressed in breast milk and participates in the immune response of infants against bacterial and viral infections. Considering the medicinal value and market demand for human lysozyme, an animal model for large-scale production of recombinant human lysozyme (rhLZ) is needed. In this study, we generated transgenic cloned cows with the marker-free vector pBAC-hLF-hLZ, which was shown to efficiently express rhLZ in cow milk. Seven transgenic cloned cows, identified by polymerase chain reaction, Southern blot, and western blot analyses, produced rhLZ in milk at concentrations of up to 3149.19 ± 24.80 mg/L. The purified rhLZ had a molecular weight and enzymatic activity similar to those of wild-type human lysozyme and possessed the same C-terminal and N-terminal amino acid sequences. Preliminary results on the milk yield and milk composition of a naturally lactating transgenic cloned cow (0906) were also obtained. These results provide a solid foundation for the large-scale production of rhLZ in the future.
Consumptive water use to feed humanity - curing a blind spot
NASA Astrophysics Data System (ADS)
Falkenmark, M.; Lannerstad, M.
2005-06-01
Since in large parts of the world it is getting difficult to meet growing water demands by mobilising more water, the discourse has turned its focus to demand management, governance and the necessary concern for aquatic ecosystems by reserving an "environmental flow" in the river. The latter calls for attention to river depletion which may be expected in response to changes in consumptive water use by both natural and anthropogenic systems. Basically, consumptive use has three faces: runoff generation influenced by land cover changes; consumptive use of water withdrawn; and evaporation from water systems (reservoirs, canals, river based cooling). After demonstrating the vulnerability to changes in consumptive use under savanna region conditions - representative of many poverty and hunger prone developing countries subject to attention in the Millennium Development Goal activities - the paper exemplifies: 1) changes in runoff generation in response to regional scale land cover changes; 2) consumptive use in large scale irrigation systems. It goes on to analyse the implications of seeing food as a human right by estimating the additional consumptive use requirements to produce food for the next two generations. Attention is paid to remaining degrees of freedom in terms of uncommitted water beyond an environmental flow reserve and to potential food trade consequences (so-called virtual water). The paper concludes that a human-right-to-food principle will have major consequences in terms of altered consumptive water use. It will therefore be essential for humanity to address river depletion to avoid loss of resilience of the life support system. This will demand a deep-going cooperation between hydrology, ecology and water governance.
Consumptive water use to feed humanity - curing a blind spot
NASA Astrophysics Data System (ADS)
Falkenmark, M.; Lannerstad, M.
2004-11-01
Since in large parts of the world it is getting difficult to meet growing water demands by mobilising more water, the discourse has turned its focus to demand management, governance and the necessary concern for aquatic ecosystems by reserving an "environmental flow" in the river. The latter calls for attention to river depletion which may be expected in response to changes in consumptive water use by both natural and anthropogenic systems. Basically, consumptive use has three faces: runoff generation influenced by land cover changes; consumptive use of water withdrawn; and evaporation from water systems (reservoirs, canals, river based cooling). After demonstrating the vulnerability to changes in consumptive use under savanna region conditions - representative of many poverty and hunger prone developing countries subject to attention in the Millennium Development Goal activities - the paper exemplifies 1) changes in runoff generation in response to regional scale land cover changes; 2) consumptive use in large scale irrigation systems. It goes on to analyse the implications of seeing food as a human right by estimating the additional consumptive use requirements to produce food for the next two generations. Attention is paid to remaining degrees of freedom in terms of uncommitted water beyond an environmental flow reserve and to potential food trade consequences (so-called virtual water). The paper concludes that a human-right-to-food principle will have major consequences in terms of altered consumptive water use. It will therefore be essential for humanity to address river depletion to avoid loss of resilience of the life support system. This will demand a deep-going cooperation between hydrology, ecology and water governance.
Discontinuities in stream nutrient uptake below lakes in mountain drainage networks
Arp, C.D.; Baker, M.A.
2007-01-01
In many watersheds, lakes and streams are hydrologically linked in spatial patterns that influence material transport and retention. We hypothesized that lakes affect stream nutrient cycling via modifications to stream hydrogeomorphology, source-waters, and biological communities. We tested this hypothesis in a lake district of the Sawtooth Mountains, Idaho. Uptake of NO3- and PO43- was compared among 25 reaches representing the following landscape positions: lake inlets and outlets, reaches >1-km downstream from lakes, and reference reaches with no nearby lakes. We quantified landscape-scale hydrographic and reach-scale hydrogeomorphic, source-water, and biological variables to characterize these landscape positions and analyze relationships to nutrient uptake. Nitrate uptake was undetectable at most lake outlets, whereas PO43- uptake was higher at outlets as compared to reference and lake inlet reaches. Patterns in nutrient demand farther downstream were similar to lake outlets with a gradual shift toward reference-reach functionality. Nitrate uptake was most correlated to sediment mobility and channel morphology, whereas PO43- uptake was most correlated to source-water characteristics. The best integrated predictor of these patterns in nutrient demand was % contributing area (the proportion of watershed area not routing through a lake). We estimate that NO3- and PO43- demand returned to 50% of pre-lake conditions within 1-4-km downstream of a small headwater lake and resetting of nutrient demand was slower downstream of a larger lake set lower in a watershed. Full resetting of these nutrient cycling processes was not reached within 20-km downstream, indicating that lakes can alter stream ecosystem functioning at large spatial scales throughout mountain watersheds. © 2007, by the American Society of Limnology and Oceanography, Inc.
Live immunization against East Coast fever--current status.
Di Giulio, Giuseppe; Lynen, Godelieve; Morzaria, Subhash; Oura, Chris; Bishop, Richard
2009-02-01
The infection-and-treatment method (ITM) for immunization of cattle against East Coast fever has historically been used only on a limited scale because of logistical and policy constraints. Recent large-scale deployment among pastoralists in Tanzania has stimulated demand. Concurrently, a suite of molecular tools, developed from the Theileria parva genome, has enabled improved quality control of the immunizing stabilate and post-immunization monitoring of the efficacy and biological impact of ITM in the field. This article outlines the current status of ITM immunization in the field, with associated developments in the molecular epidemiology of T. parva.
Interactive, graphical processing unitbased evaluation of evacuation scenarios at the state scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perumalla, Kalyan S; Aaby, Brandon G; Yoginath, Srikanth B
2011-01-01
In large-scale scenarios, transportation modeling and simulation is severely constrained by simulation time. For example, few real-time simulators scale to evacuation traffic scenarios at the level of an entire state, such as Louisiana (approximately 1 million links) or Florida (2.5 million links). New simulation approaches are needed to overcome severe computational demands of conventional (microscopic or mesoscopic) modeling techniques. Here, a new modeling and execution methodology is explored that holds the potential to provide a tradeoff among the level of behavioral detail, the scale of transportation network, and real-time execution capabilities. A novel, field-based modeling technique and its implementation on graphical processing units are presented. Although additional research with input from domain experts is needed for refining and validating the models, the techniques reported here afford interactive experience at very large scales of multi-million road segments. Illustrative experiments on a few state-scale networks are described based on an implementation of this approach in a software system called GARFIELD. Current modeling capabilities and implementation limitations are described, along with possible use cases and future research.
Large-scale multi-agent transportation simulations
NASA Astrophysics Data System (ADS)
Cetin, Nurhan; Nagel, Kai; Raney, Bryan; Voellmy, Andreas
2002-08-01
It is now possible to microsimulate the traffic of whole metropolitan areas with 10 million travelers or more, "micro" meaning that each traveler is resolved individually as a particle. In contrast to physics or chemistry, these particles have internal intelligence; for example, they know where they are going. This means that a transportation simulation project will have, besides the traffic microsimulation, modules which model this intelligent behavior. The most important modules are for route generation and for demand generation. Demand is generated by each individual in the simulation making a plan of activities such as sleeping, eating, working, shopping, etc. If activities are planned at different locations, they obviously generate demand for transportation. This however is not enough since those plans are influenced by congestion which initially is not known. This is solved via a relaxation method, which means iterating back and forth between the activities/routes generation and the traffic simulation.
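The relaxation idea sketched in the last sentence can be written down compactly. The loop below is an illustrative Python outline only; plan_activities, route and simulate_traffic are hypothetical stand-ins for the demand-generation, route-generation and microsimulation modules, not the software described in the abstract.

    # Illustrative outline of iterating between plan/route generation and
    # traffic simulation until plans and congestion are mutually consistent.
    def plan_activities(travelers, congestion):
        # Each traveler chooses activity locations and times given the
        # congestion expected from the previous iteration.
        return {t: ["home", "work", "shop", "home"] for t in travelers}

    def route(plans, congestion):
        # Assign a network path to every trip implied by the activity plans.
        return {t: ["link_%d" % i for i in range(3)] for t in plans}

    def simulate_traffic(routes):
        # The microsimulation returns per-link congestion (here: link loads).
        loads = {}
        for path in routes.values():
            for link in path:
                loads[link] = loads.get(link, 0) + 1
        return loads

    travelers = range(1000)
    congestion = {}                    # initially unknown, as noted in the text
    for iteration in range(10):        # relaxation: iterate back and forth
        plans = plan_activities(travelers, congestion)
        routes = route(plans, congestion)
        congestion = simulate_traffic(routes)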
NASA Astrophysics Data System (ADS)
Ordway, E.; Asner, G. P.; Naylor, R. L.; Nkongho, R.; Lambin, E.
2017-12-01
Rapid integration of global agricultural markets and subsequent cropland displacement in recent decades increased large-scale tropical deforestation in South America and Southeast Asia. Growing land scarcity and more stringent land use regulations in these regions could incentivize the offshoring of export-oriented commodity crop production to sub-Saharan Africa (SSA). We assess the effects of domestic- and export-oriented agricultural expansion on deforestation in SSA in recent decades at the global, regional and local scales. Using Cameroon as a case-study, we explore the influence of emerging oil palm expansion on deforestation in greater depth. We found that commodity crops are expanding in SSA, increasing pressure on tropical forests. Four Congo Basin countries, Sierra Leone, Liberia, and Cote d'Ivoire were most at risk in terms of exposure, vulnerability and pressures from agricultural expansion. These countries averaged the highest percent forest cover (58% ±17.9) and lowest proportions of potentially available cropland outside forest areas (1% ±0.9). Foreign investment in these countries was concentrated in oil palm production (81%), with a median investment area of 41,582 thousand ha. Based on remote sensing and field survey results, however, medium- and large-scale non-industrial producers are driving a substantial fraction of the oil palm expansion leading to deforestation in Cameroon. Additionally, unlike Southeast Asia, oil palm expansion in sub-Saharan Africa is associated primarily with domestic market demands. In contrast, cocoa, the fastest expanding export-oriented crop across SSA, accounted for 57% of global expansion in 2000-2013 at a rate of 132 thousand ha yr-1, yet only amounted to 0.9% of foreign land investment. Commodity crop expansion in SSA appears largely driven by small- and medium-scale farmers rather than industrial plantations. Findings highlight that, although most agricultural expansion was associated with domestic demand, there is evidence of a growing influence of distant markets on land-use change in SSA.
Potential climatic impacts and reliability of very large-scale wind farms
NASA Astrophysics Data System (ADS)
Wang, C.; Prinn, R. G.
2010-02-01
Meeting future world energy needs while addressing climate change requires large-scale deployment of low or zero greenhouse gas (GHG) emission technologies such as wind energy. The widespread availability of wind power has fueled substantial interest in this renewable energy source as one of the needed technologies. For very large-scale utilization of this resource, there are however potential environmental impacts, and also problems arising from its inherent intermittency, in addition to the present need to lower unit costs. To explore some of these issues, we use a three-dimensional climate model to simulate the potential climate effects associated with installation of wind-powered generators over vast areas of land or coastal ocean. Using wind turbines to meet 10% or more of global energy demand in 2100, could cause surface warming exceeding 1 °C over land installations. In contrast, surface cooling exceeding 1 °C is computed over ocean installations, but the validity of simulating the impacts of wind turbines by simply increasing the ocean surface drag needs further study. Significant warming or cooling remote from both the land and ocean installations, and alterations of the global distributions of rainfall and clouds also occur. These results are influenced by the competing effects of increases in roughness and decreases in wind speed on near-surface turbulent heat fluxes, the differing nature of land and ocean surface friction, and the dimensions of the installations parallel and perpendicular to the prevailing winds. These results are also dependent on the accuracy of the model used, and the realism of the methods applied to simulate wind turbines. Additional theory and new field observations will be required for their ultimate validation. Intermittency of wind power on daily, monthly and longer time scales as computed in these simulations and inferred from meteorological observations, poses a demand for one or more options to ensure reliability, including backup generation capacity, very long distance power transmission lines, and onsite energy storage, each with specific economic and/or technological challenges.
Potential climatic impacts and reliability of very large-scale wind farms
NASA Astrophysics Data System (ADS)
Wang, C.; Prinn, R. G.
2009-09-01
Meeting future world energy needs while addressing climate change requires large-scale deployment of low or zero greenhouse gas (GHG) emission technologies such as wind energy. The widespread availability of wind power has fueled legitimate interest in this renewable energy source as one of the needed technologies. For very large-scale utilization of this resource, there are however potential environmental impacts, and also problems arising from its inherent intermittency, in addition to the present need to lower unit costs. To explore some of these issues, we use a three-dimensional climate model to simulate the potential climate effects associated with installation of wind-powered generators over vast areas of land or coastal ocean. Using wind turbines to meet 10% or more of global energy demand in 2100, could cause surface warming exceeding 1°C over land installations. In contrast, surface cooling exceeding 1°C is computed over ocean installations, but the validity of simulating the impacts of wind turbines by simply increasing the ocean surface drag needs further study. Significant warming or cooling remote from both the land and ocean installations, and alterations of the global distributions of rainfall and clouds also occur. These results are influenced by the competing effects of increases in roughness and decreases in wind speed on near-surface turbulent heat fluxes, the differing nature of land and ocean surface friction, and the dimensions of the installations parallel and perpendicular to the prevailing winds. These results are also dependent on the accuracy of the model used, and the realism of the methods applied to simulate wind turbines. Additional theory and new field observations will be required for their ultimate validation. Intermittency of wind power on daily, monthly and longer time scales as computed in these simulations and inferred from meteorological observations, poses a demand for one or more options to ensure reliability, including backup generation capacity, very long distance power transmission lines, and onsite energy storage, each with specific economic and/or technological challenges.
SOURCE EXPLORER: Towards Web Browser Based Tools for Astronomical Source Visualization and Analysis
NASA Astrophysics Data System (ADS)
Young, M. D.; Hayashi, S.; Gopu, A.
2014-05-01
As a new generation of large format, high-resolution imagers comes online (ODI, DECAM, LSST, etc.) we are faced with the daunting prospect of astronomical images containing upwards of hundreds of thousands of identifiable sources. Visualizing and interacting with such large datasets using traditional astronomical tools appears to be unfeasible, and a new approach is required. We present here a method for the display and analysis of arbitrarily large source datasets using dynamically scaling levels of detail, enabling scientists to rapidly move from large-scale spatial overviews down to the level of individual sources and everything in-between. Based on the recognized standards of HTML5+JavaScript, we enable observers and archival users to interact with their images and sources from any modern computer without having to install specialized software. We demonstrate the ability to produce large-scale source lists from the images themselves, as well as overlaying data from publicly available sources (2MASS, GALEX, SDSS, etc.) or user-provided source lists. A high-availability cluster of computational nodes allows us to produce these source maps on demand and customized based on user input. User-generated source lists and maps are persistent across sessions and are available for further plotting, analysis, refinement, and culling.
Swimming in Light: A Large-Scale Computational Analysis of the Metabolism of Dinoroseobacter shibae
Rex, Rene; Bill, Nelli; Schmidt-Hohagen, Kerstin; Schomburg, Dietmar
2013-01-01
The Roseobacter clade is a ubiquitous group of marine α-proteobacteria. To gain insight into the versatile metabolism of this clade, we took a constraint-based approach and created a genome-scale metabolic model (iDsh827) of Dinoroseobacter shibae DFL12T. Our model is the first accounting for the energy demand of motility, the light-driven ATP generation and experimentally determined specific biomass composition. To cover a large variety of environmental conditions, as well as plasmid and single gene knock-out mutants, we simulated 391,560 different physiological states using flux balance analysis. We analyzed our results with regard to energy metabolism, validated them experimentally, and revealed a pronounced metabolic response to the availability of light. Furthermore, we introduced the energy demand of motility as an important parameter in genome-scale metabolic models. The results of our simulations also gave insight into the changing usage of the two degradation routes for dimethylsulfoniopropionate, an abundant compound in the ocean. A side product of dimethylsulfoniopropionate degradation is dimethyl sulfide, which seeds cloud formation and thus enhances the reflection of sunlight. By our exhaustive simulations, we were able to identify single-gene knock-out mutants, which show an increased production of dimethyl sulfide. In addition to the single-gene knock-out simulations we studied the effect of plasmid loss on the metabolism. Moreover, we explored the possible use of a functioning phosphofructokinase for D. shibae. PMID:24098096
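Flux balance analysis, the method used for the simulations above, amounts to solving a linear program: maximize an objective flux subject to steady-state mass balance S·v = 0 and flux bounds. The toy two-metabolite network below is purely illustrative and is not taken from the iDsh827 model.

    # Toy flux balance analysis: maximize a "biomass" flux subject to
    # steady-state mass balance S @ v = 0 and flux bounds.
    import numpy as np
    from scipy.optimize import linprog

    # Reactions: v0 uptake (-> A), v1 conversion (A -> B), v2 biomass (B ->)
    S = np.array([[1, -1,  0],     # metabolite A
                  [0,  1, -1]])    # metabolite B
    c = np.array([0, 0, -1])       # linprog minimizes, so negate biomass flux
    bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake limited to 10 units

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    print("maximum biomass flux:", -res.fun)   # 10.0, limited by the uptake bound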
Fujishiro, Kaori; Landsbergis, Paul A; Diez-Roux, Ana V; Stukovsky, Karen Hinckley; Shrager, Sandi; Baron, Sherry
2011-06-01
Immigrants have a different social context from those who stay in their home country or those who were born in the country where immigrants now live. Cultural theory of risk perception suggests that social context influences one's interpretation of questionnaire items. We examined psychometric properties of job control and job demand scales with US- and foreign-born workers who preferred English, Spanish, or Chinese (n = 3,114, mean age = 58.1). Across all groups, the job control scale had acceptable Cronbach's alpha (0.78-0.83) and equivalent factor loadings (ΔCFI < 0.01). Immigrants had low alpha (0.42-0.65) for the job demands scale regardless of language, education, or age of migration. Two job-demand items had different factor loadings across groups. Among immigrants, both scales had inconsistent associations with perceived job stress and self-rated health. For a better understanding of immigrants' job stress, the concept of job demands should be expanded and immigrants' expectations for job control explored.
Fujishiro, Kaori; Landsbergis, Paul; Roux, Ana V. Diez; Stukovsky, Karen Hinckley; Shrager, Sandi; Baron, Sherry
2014-01-01
Immigrants have a different social context from those who stay in their home country or those who were born in the country where immigrants now live. Cultural theory of risk perception suggests that social context influences one's interpretation of questionnaire items. We examined psychometric properties of job control and job demand scales with US- and foreign-born workers who preferred English, Spanish, or Chinese (n=3114, mean age=58.1). Across all groups, the job control scale had acceptable Cronbach's alpha (0.78–0.83) and equivalent factor loadings (ΔCFI<0.01). Immigrants had low alpha (0.42–0.65) for the job demands scale regardless of language, education, or age of migration. Two job-demand items had different factor loadings across groups. Among immigrants, both scales had inconsistent associations with perceived job stress and self-rated health. For a better understanding of immigrants' job stress, the concept of job demands should be expanded and immigrants' expectations for job control explored. PMID:20582720
Micro-Level Adaptation, Macro-Level Selection, and the Dynamics of Market Partitioning
García-Díaz, César; van Witteloostuijn, Arjen; Péli, Gábor
2015-01-01
This paper provides a micro-foundation for dual market structure formation through partitioning processes in marketplaces by developing a computational model of interacting economic agents. We propose an agent-based modeling approach, where firms are adaptive and profit-seeking agents entering into and exiting from the market according to their (lack of) profitability. Our firms are characterized by large and small sunk costs, respectively. They locate their offerings along a unimodal demand distribution over a one-dimensional product variety, with the distribution peak constituting the center and the tails standing for the peripheries. We found that large firms may first advance toward the most abundant demand spot, the market center, and release peripheral positions as predicted by extant dual market explanations. However, we also observed that large firms may then move back toward the market fringes to reduce competitive niche overlap in the center, triggering nonlinear resource occupation behavior. Novel results indicate that resource release dynamics depend on firm-level adaptive capabilities, and that a minimum scale of production for low sunk cost firms is key to the formation of the dual structure. PMID:26656107
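A stripped-down sketch of the kind of agent-based model described above is given below: firms with large or small sunk costs enter at random positions on a one-dimensional product space with a unimodal (Gaussian-shaped) demand peak, adapt their position toward higher profit, and exit when unprofitable. All parameter values are illustrative placeholders, not those of the authors' model.

    # Compact, hypothetical sketch of an agent-based dual-market model:
    # firms on a 1-D product space with a unimodal demand peak, entering
    # and exiting according to profitability. Parameters are illustrative.
    import random, math

    def demand(x):                      # unimodal demand over product variety [0, 1]
        return math.exp(-((x - 0.5) ** 2) / 0.02)

    def profit(firm, firms):
        # Revenue: demand at the firm's position, split among nearby rivals.
        rivals = [f for f in firms if abs(f["x"] - firm["x"]) < 0.1]
        return demand(firm["x"]) / max(len(rivals), 1) - firm["sunk_cost"]

    firms = []
    for step in range(200):
        # Entry: a new firm with large or small sunk costs at a random position.
        firms.append({"x": random.random(),
                      "sunk_cost": random.choice([0.05, 0.4])})
        # Adaptation: each firm nudges its position toward higher profit.
        for f in firms:
            trial_x = min(1.0, max(0.0, f["x"] + random.uniform(-0.05, 0.05)))
            if profit(dict(f, x=trial_x), firms) > profit(f, firms):
                f["x"] = trial_x
        # Exit: unprofitable firms leave the market.
        firms = [f for f in firms if profit(f, firms) > 0]

    center = [f for f in firms if abs(f["x"] - 0.5) < 0.1]
    print(len(firms), "firms survive;", len(center), "occupy the market center")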
Micro-Level Adaptation, Macro-Level Selection, and the Dynamics of Market Partitioning.
García-Díaz, César; van Witteloostuijn, Arjen; Péli, Gábor
2015-01-01
This paper provides a micro-foundation for dual market structure formation through partitioning processes in marketplaces by developing a computational model of interacting economic agents. We propose an agent-based modeling approach, where firms are adaptive and profit-seeking agents entering into and exiting from the market according to their (lack of) profitability. Our firms are characterized by large and small sunk costs, respectively. They locate their offerings along a unimodal demand distribution over a one-dimensional product variety, with the distribution peak constituting the center and the tails standing for the peripheries. We found that large firms may first advance toward the most abundant demand spot, the market center, and release peripheral positions as predicted by extant dual market explanations. However, we also observed that large firms may then move back toward the market fringes to reduce competitive niche overlap in the center, triggering nonlinear resource occupation behavior. Novel results indicate that resource release dynamics depend on firm-level adaptive capabilities, and that a minimum scale of production for low sunk cost firms is key to the formation of the dual structure.
Cereal area and nitrogen use efficiency are drivers of future nitrogen fertilizer consumption.
Dobermann, Achim; Cassman, Kenneth G
2005-09-01
At a global scale, cereal yields and fertilizer N consumption have increased in a near-linear fashion during the past 40 years and are highly correlated with one another. However, large differences exist in historical trends of N fertilizer usage and nitrogen use efficiency (NUE) among regions, countries, and crops. The reasons for these differences must be understood to estimate future N fertilizer requirements. Global nitrogen needs will depend on: (i) changes in cropped cereal area and the associated yield increases required to meet increasing cereal demand from population and income growth, and (ii) changes in NUE at the farm level. Our analysis indicates that the anticipated 38% increase in global cereal demand by 2025 can be met by a 30% increase in N use on cereals, provided that the steady decline in cereal harvest area is halted and the yield response to applied N can be increased by 20%. If losses of cereal cropping area continue at the rate of the past 20 years (-0.33% per year) and NUE cannot be increased substantially, a 60% increase in global N use on cereals would be required to meet cereal demand. Interventions to increase NUE and reduce N losses to the environment must be accomplished at the farm- or field-scale through a combination of improved technologies and carefully crafted local policies that contribute to the adoption of improved N management; uniform regional or national directives are unlikely to be effective at both sustaining yield increases and improving NUE. Examples from several countries show that increases in NUE at rates of 1% per year or more can be achieved if adequate investments are made in research and extension. Failure to arrest the decrease in cereal crop area and to improve NUE in the world's most important agricultural systems will likely cause severe damage to environmental services at local, regional, and global scales due to a large increase in reactive N load in the environment.
Cereal area and nitrogen use efficiency are drivers of future nitrogen fertilizer consumption.
Dobermann, Achim; Cassman, Kenneth G
2005-12-01
At a global scale, cereal yields and fertilizer N consumption have increased in a near-linear fashion during the past 40 years and are highly correlated with one another. However, large differences exist in historical trends of N fertilizer usage and nitrogen use efficiency (NUE) among regions, countries, and crops. The reasons for these differences must be understood to estimate future N fertilizer requirements. Global nitrogen needs will depend on: (i) changes in cropped cereal area and the associated yield increases required to meet increasing cereal demand from population and income growth, and (ii) changes in NUE at the farm level. Our analysis indicates that the anticipated 38% increase in global cereal demand by 2025 can be met by a 30% increase in N use on cereals, provided that the steady decline in cereal harvest area is halted and the yield response to applied N can be increased by 20%. If losses of cereal cropping area continue at the rate of the past 20 years (-0.33% per year) and NUE cannot be increased substantially, a 60% increase in global N use on cereals would be required to meet cereal demand. Interventions to increase NUE and reduce N losses to the environment must be accomplished at the farm- or field-scale through a combination of improved technologies and carefully crafted local policies that contribute to the adoption of improved N management; uniform regional or national directives are unlikely to be effective at both sustaining yield increases and improving NUE. Examples from several countries show that increases in NUE at rates of 1% per year or more can be achieved if adequate investments are made in research and extension. Failure to arrest the decrease in cereal crop area and to improve NUE in the world's most important agricultural systems will likely cause severe damage to environmental services at local, regional, and global scales due to a large increase in reactive N load in the environment.
William W. Oliver
2001-01-01
Conflicts over changing demands on our increasingly scarce stands of late successional ponderosa pine could be abated by increasing the proportion of stands with late successional attributes in the forest land base. However, we don't know whether these attributes can be developed through the management of younger stands. Nor do we know whether late successional...
Forecasting Demand for KC-135 Sorties: Deploy to Dwell Impacts
2013-06-01
fighter movements from individual units are rampant (6 OSS/OSOS, 2013). However, TACC directed missions in this category are scarce, if not non-existent (6 OSS/OSOS, 2013). Recent TACC tasked missions that appear to support CONUS fighter movements were training related: pre-deployment preparation...and large scale exercises directed by the Joint Staff (6 OSS/OSOS, 2013). Anecdotal evidence that AMC supports CONUS fighter movements was flawed
Plentern mit Kiefern--Ergebnisse aus den USA [Plentering with pines--results from the United States]
James M. Guldin; Don C. Bragg; Andreas Zingg
2017-01-01
Until now, scientifically reliable data on plentering of light-demanding tree species in Europe have been lacking. This gap is filled with long-term trials from the USA, among others with southern yellow pines. In the southern state of Arkansas, two plots of 16 hectares were installed in 1936, in the context of a large-scale trial of mixed loblolly pine (...
Data Intensive Scientific Workflows on a Federated Cloud: CRADA Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garzoglio, Gabriele
The Fermilab Scientific Computing Division and the KISTI Global Science Experimental Data Hub Center have built a prototypical large-scale infrastructure to handle scientific workflows of stakeholders to run on multiple cloud resources. The demonstrations have been in the areas of (a) Data-Intensive Scientific Workflows on Federated Clouds, (b) Interoperability and Federation of Cloud Resources, and (c) Virtual Infrastructure Automation to enable On-Demand Services.
The effects of physicochemical wastewater treatment operations on forward osmosis.
Hey, Tobias; Bajraktari, Niada; Vogel, Jörg; Hélix Nielsen, Claus; la Cour Jansen, Jes; Jönsson, Karin
2017-09-01
Raw municipal wastewater from a full-scale wastewater treatment plant was physicochemically pretreated in a large pilot-scale system comprising coagulation, flocculation, microsieve and microfiltration operated in various configurations. The produced microsieve filtrates and microfiltration permeates were then concentrated using forward osmosis (FO). Aquaporin Inside™ FO membranes were used for both the microsieve filtrate and microfiltration permeates, and Hydration Technologies Inc. thin-film composite membranes for the microfiltration permeate, using only NaCl as the draw solution. The FO performance was evaluated in terms of the water flux, water flux decline and solute rejections of biochemical oxygen demand, and total and soluble phosphorus. The obtained results were compared with the results of FO after only mechanical pretreatment. The FO permeates satisfied the Swedish discharge demands for small and medium-sized wastewater treatment plants. The study demonstrates that physicochemical pretreatment can improve the FO water flux by up to 20%. In contrast, the solute rejection decreases significantly compared to the FO-treated wastewater with mechanical pretreatment.
Study of multi-functional precision optical measuring system for large scale equipment
NASA Astrophysics Data System (ADS)
Jiang, Wei; Lao, Dabao; Zhou, Weihu; Zhang, Wenying; Jiang, Xingjian; Wang, Yongxi
2017-10-01
The effective application of high performance measurement technology can greatly improve large-scale equipment manufacturing capability. The measurement of geometric parameters such as size, attitude and position therefore requires a measurement system with high precision, multiple functions, portability and other characteristics. However, existing measuring instruments, such as the laser tracker, the total station and photogrammetry systems, mostly offer a single function, require station relocation and have other shortcomings. A laser tracker must work with a cooperative target and can hardly meet the requirements of measurement in extreme environments. The total station is mainly used for outdoor surveying and mapping and hardly achieves the accuracy demanded in industrial measurement. A photogrammetry system can achieve wide-range, multi-point measurement, but the measuring range is limited and the station must be moved repeatedly. This paper presents a non-contact opto-electronic measuring instrument that can not only work by scanning the measurement path but also measure a cooperative target by tracking. The system is based on several key technologies, such as absolute distance measurement, two-dimensional angle measurement, automatic target recognition and accurate aiming, precision control, assembly of complex mechanical systems and multi-functional 3D visualization software. Among them, the absolute distance measurement module ensures measurement with high accuracy, and the two-dimensional angle measuring module provides precision angle measurement. The system is suitable for non-contact measurement of large-scale equipment; it can ensure the quality and performance of large-scale equipment throughout the manufacturing process and improve the manufacturing capability of large-scale and high-end equipment.
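The combination of absolute distance measurement and two-dimensional angle measurement mentioned above is what lets such an instrument produce 3D coordinates: a range plus horizontal and vertical angles convert to Cartesian coordinates by the standard spherical relations. The snippet below is a generic textbook illustration, not the authors' instrument software, and the angle conventions are assumptions.

    # Standard range-plus-angles to Cartesian conversion, as used by laser
    # trackers and total stations. Generic illustration only; real instruments
    # differ in angle conventions and apply additional error corrections.
    import math

    def to_cartesian(distance, azimuth_deg, elevation_deg):
        """distance in metres, azimuth/elevation in degrees -> (x, y, z)."""
        az = math.radians(azimuth_deg)
        el = math.radians(elevation_deg)
        x = distance * math.cos(el) * math.cos(az)
        y = distance * math.cos(el) * math.sin(az)
        z = distance * math.sin(el)
        return x, y, z

    print(to_cartesian(25.0, 30.0, 10.0))   # a point about 25 m from the station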
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leng, Guoyong; Huang, Maoyi; Tang, Qiuhong
2013-09-16
Previous studies on irrigation impacts on land surface fluxes/states were mainly conducted as sensitivity experiments, with limited analysis of uncertainties from the input data and model irrigation schemes used. In this study, we calibrated and evaluated the performance of irrigation water use simulated by the Community Land Model version 4 (CLM4) against observations from agriculture census. We investigated the impacts of irrigation on land surface fluxes and states over the conterminous United States (CONUS) and explored possible directions of improvement. Specifically, we found large uncertainty in the irrigation area data from two widely used sources and CLM4 tended to produce unrealistically large temporal variations of irrigation demand for applications at the water resources region scale over CONUS. At seasonal to interannual time scales, the effects of irrigation on surface energy partitioning appeared to be large and persistent, and more pronounced in dry than wet years. Even with model calibration to yield overall good agreement with the irrigation amounts from the National Agricultural Statistics Service (NASS), differences between the two irrigation area datasets still dominate the differences in the interannual variability of land surface response to irrigation. Our results suggest that irrigation amount simulated by CLM4 can be improved by (1) calibrating model parameter values to account for regional differences in irrigation demand and (2) accurate representation of the spatial distribution and intensity of irrigated areas.
NASA Astrophysics Data System (ADS)
Aydemir, Ali; Popovski, Eftim; Bellstädt, Daniel; Fleiter, Tobias; Büchele, Richard
2017-11-01
Many earlier studies have assessed the DH generation mix without explicitly taking into account future changes in the building stock and heat demand. The approach of this study consists of three steps that combine stock modeling, energy demand forecasting, and simulation of different energy technologies. First, a detailed residential building stock model for Herten is constructed by using remote sensing together with a typology for the German building stock. Second, a bottom-up simulation model is used which calculates the thermal energy demand based on energy-related investments in buildings in order to forecast the thermal demand up to 2050. Third, solar thermal fields in combination with large-scale heat pumps are sized as an alternative to the current coal-fired CHPs. Finally, we assess the cost of heat and the CO2 reduction for these units in two scenarios which differ with regard to the DH expansion. It can be concluded that up to 2030 and 2050 a substantial reduction in buildings' heat demand is expected due to improved building insulation. The falling heat demand in the DH network substantially reduces the economic feasibility of new RES generation capacity. This reduction might be compensated by continuously connecting apartment buildings to the DH network until 2050.
Environmental status of livestock and poultry sectors in China under current transformation stage.
Qian, Yi; Song, Kaihui; Hu, Tao; Ying, Tianyu
2018-05-01
Intensive animal husbandry has aroused great environmental concerns in many developed countries. However, some developing countries are still experiencing environmental pollution from the livestock and poultry sectors. Driven by large demand, China has experienced a remarkable increase in dairy and meat production, especially in the transformation stage from conventional household breeding to large-scale industrial breeding. At the same time, a large amount of manure from the livestock and poultry sector is released into waterbodies and soil, causing eutrophication and soil degradation. This condition will be reinforced in large-scale cultivation where the amount of manure exceeds the soil nutrient capacity, if the manure is not treated or utilized properly. Our research aims to analyze whether the transformation of raising scale would be beneficial to the environment, and to present the latest status of the livestock and poultry sectors in China. The estimation of the pollutants generated and discharged from the livestock and poultry sector in China will facilitate the legislation of manure management. This paper analyzes the pollutants generated from the manure of the five principal commercial animals in different farming practices. The results show that fattening pigs contribute almost half of the pollutants released from manure. Moreover, beef cattle exert the largest environmental impact per unit of production, about 2-3 times that of pork and 5-20 times that of chicken. Animals raised in large-scale feedlots generate fewer pollutants than those raised in households. The shift towards industrial production of livestock and poultry is easier to manage from the environmental perspective, but adequate large-scale cultivation is encouraged. Regulation control, manure treatment and financial subsidies for manure treatment and utilization are recommended to achieve ecological agriculture in China. Copyright © 2017 Elsevier B.V. All rights reserved.
Community-based native seed production for restoration in Brazil - the role of science and policy.
Schmidt, I B; de Urzedo, D I; Piña-Rodrigues, F C M; Vieira, D L M; de Rezende, G M; Sampaio, A B; Junqueira, R G P
2018-05-20
Large-scale restoration programmes in the tropics require large volumes of high quality, genetically diverse and locally adapted seeds from a large number of species. However, scarcity of native seeds is a critical restriction to achieve restoration targets. In this paper, we analyse three successful community-based networks that supply native seeds and seedlings for Brazilian Amazon and Cerrado restoration projects. In addition, we propose directions to promote local participation, legal, technical and commercialisation issues for up-scaling the market of native seeds for restoration with high quality and social justice. We argue that effective community-based restoration arrangements should follow some principles: (i) seed production must be based on real market demand; (ii) non-governmental and governmental organisations have a key role in supporting local organisation, legal requirements and selling processes; (iii) local ecological knowledge and labour should be valued, enabling local communities to promote large-scale seed production; (iv) applied research can help develop appropriate techniques and solve technical issues. The case studies from Brazil and principles presented here can be useful for up-scaling restoration ecology efforts in many other parts of the world and especially in tropical countries where improving rural community income is a strategy for biodiversity conservation and restoration. © 2018 German Society for Plant Sciences and The Royal Botanical Society of the Netherlands.
Large-scale production of lentiviral vector in a closed system hollow fiber bioreactor
Sheu, Jonathan; Beltzer, Jim; Fury, Brian; Wilczek, Katarzyna; Tobin, Steve; Falconer, Danny; Nolta, Jan; Bauer, Gerhard
2015-01-01
Lentiviral vectors are widely used in the field of gene therapy as an effective method for permanent gene delivery. While current methods of producing small scale vector batches for research purposes depend largely on culture flasks, the emergence and popularity of lentiviral vectors in translational, preclinical and clinical research has demanded their production on a much larger scale, a task that can be difficult to manage with the numbers of producer cell culture flasks required for large volumes of vector. To generate a large scale, partially closed system method for the manufacturing of clinical grade lentiviral vector suitable for the generation of induced pluripotent stem cells (iPSCs), we developed a method employing a hollow fiber bioreactor traditionally used for cell expansion. We have demonstrated the growth, transfection, and vector-producing capability of 293T producer cells in this system. Vector particle RNA titers after subsequent vector concentration yielded values comparable to lentiviral iPSC induction vector batches produced using traditional culture methods in 225 cm2 flasks (T225s) and in 10-layer cell factories (CF10s), while yielding a volume nearly 145 times larger than the yield from a T225 flask and nearly three times larger than the yield from a CF10. Employing a closed system hollow fiber bioreactor for vector production offers the possibility of manufacturing large quantities of gene therapy vector while minimizing reagent usage, equipment footprint, and open system manipulation. PMID:26151065
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schiffmann, Florian; VandeVondele, Joost, E-mail: Joost.VandeVondele@mat.ethz.ch
2015-06-28
We present an improved preconditioning scheme for electronic structure calculations based on the orbital transformation method. First, a preconditioner is developed which includes information from the full Kohn-Sham matrix but avoids computationally demanding diagonalisation steps in its construction. This reduces the computational cost of its construction, eliminating a bottleneck in large scale simulations, while maintaining rapid convergence. In addition, a modified form of Hotelling's iterative inversion is introduced to replace the exact inversion of the preconditioner matrix. This method is highly effective during molecular dynamics (MD), as the solution obtained in earlier MD steps is a suitable initial guess. Filtering small elements during sparse matrix multiplication leads to linear scaling inversion, while retaining robustness, already for relatively small systems. For system sizes ranging from a few hundred to a few thousand atoms, which are typical for many practical applications, the improvements to the algorithm lead to a 2-5 fold speedup per MD step.
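Hotelling's iterative inversion referred to above is the classical Newton-Schulz update X_{k+1} = X_k(2I - A X_k). The dense NumPy sketch below only illustrates the bare iteration; the implementation described in the abstract works on sparse matrices with small-element filtering and MD-based initial guesses, which are not reproduced here.

    # Minimal dense illustration of Hotelling's (Newton-Schulz) iterative
    # matrix inversion, X_{k+1} = X_k (2I - A X_k).
    import numpy as np

    def hotelling_inverse(A, iterations=20):
        # Safe starting guess: X0 = A^T / (||A||_1 * ||A||_inf) guarantees convergence.
        X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
        I = np.eye(A.shape[0])
        for _ in range(iterations):
            X = X @ (2 * I - A @ X)
        return X

    A = np.array([[4.0, 1.0], [2.0, 3.0]])
    X = hotelling_inverse(A)
    print(np.allclose(X @ A, np.eye(2)))   # True once the iteration has converged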
Drivers and barriers to e-invoicing adoption in Greek large scale manufacturing industries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marinagi, Catherine, E-mail: marinagi@teihal.gr; Trivellas, Panagiotis, E-mail: ptrivel@yahoo.com; Reklitis, Panagiotis, E-mail: preklitis@yahoo.com
2015-02-09
This paper attempts to investigate the drivers and barriers that large-scale Greek manufacturing industries experience in adopting electronic invoices (e-invoices), based on three case studies with organizations having an international presence in many countries. The study focuses on the drivers that may affect the increase of the adoption and use of e-invoicing, including customers' demand for e-invoices, and sufficient know-how and adoption of e-invoicing in organizations. In addition, the study reveals important barriers that prevent the expansion of e-invoicing, such as suppliers' reluctance to implement e-invoicing, and IT infrastructure incompatibilities. Other issues examined by this study include the observed benefits from e-invoicing implementation, and the financial priorities of the organizations assumed to be supported by e-invoicing.
Global Detection of Live Virtual Machine Migration Based on Cellular Neural Networks
Xie, Kang; Yang, Yixian; Zhang, Ling; Jing, Maohua; Xin, Yang; Li, Zhongxian
2014-01-01
In order to meet the demands of operation monitoring of large scale, autoscaling, and heterogeneous virtual resources in the existing cloud computing, a new method of live virtual machine (VM) migration detection algorithm based on the cellular neural networks (CNNs), is presented. Through analyzing the detection process, the parameter relationship of CNN is mapped as an optimization problem, in which improved particle swarm optimization algorithm based on bubble sort is used to solve the problem. Experimental results demonstrate that the proposed method can display the VM migration processing intuitively. Compared with the best fit heuristic algorithm, this approach reduces the processing time, and emerging evidence has indicated that this new approach is affordable to parallelism and analog very large scale integration (VLSI) implementation allowing the VM migration detection to be performed better. PMID:24959631
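For illustration, a generic particle swarm optimization loop of the kind invoked above is sketched below on a toy objective. It shows only the basic PSO velocity and position update; the paper's improved variant based on bubble sort, and the actual CNN-parameter objective, are not reproduced.

    # Generic particle swarm optimization sketch on a toy objective.
    import random

    def objective(x):                       # toy function to minimize
        return sum(xi ** 2 for xi in x)

    dim, n_particles, iters = 3, 20, 100
    w, c1, c2 = 0.7, 1.5, 1.5               # inertia and acceleration coefficients

    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]             # personal bests
    gbest = min(pbest, key=objective)       # global best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=objective)

    print("best value found:", objective(gbest))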
Global detection of live virtual machine migration based on cellular neural networks.
Xie, Kang; Yang, Yixian; Zhang, Ling; Jing, Maohua; Xin, Yang; Li, Zhongxian
2014-01-01
In order to meet the demands of operation monitoring of large scale, autoscaling, and heterogeneous virtual resources in the existing cloud computing, a new method of live virtual machine (VM) migration detection algorithm based on the cellular neural networks (CNNs), is presented. Through analyzing the detection process, the parameter relationship of CNN is mapped as an optimization problem, in which improved particle swarm optimization algorithm based on bubble sort is used to solve the problem. Experimental results demonstrate that the proposed method can display the VM migration processing intuitively. Compared with the best fit heuristic algorithm, this approach reduces the processing time, and emerging evidence has indicated that this new approach is affordable to parallelism and analog very large scale integration (VLSI) implementation allowing the VM migration detection to be performed better.
Vacuum and the electron tube industry
NASA Astrophysics Data System (ADS)
Redhead, P. A.
2005-07-01
The electron tube industry started with the patenting of the thermionic diode by John Ambrose Fleming in 1904. The vacuum technology used by the infant tube industry was copied from the existing incandescent lamp industry. The growing demands for electron tubes for the military in the first world war led to major improvements in pumps and processing methods. By the 1920s, mass production methods were developing to satisfy the demands for receiving tubes by the burgeoning radio industry. Further expansion in the 1930s and 1940s resulted in improvements in automatic equipment for pumping vacuum tubes leading to the massive production rates of electron tubes in the second world war and the following two decades. The demand for radar during the war resulted in the development of techniques for large-scale production of microwave tubes and CRTs, the latter technology being put to good use later in TV picture tube production. The commercial introduction of the transistor ended the massive demand for receiving tubes. This review concentrates on the vacuum technology developed for receiving tube production.
NASA Astrophysics Data System (ADS)
Blokker, Mirjam; Agudelo-Vera, Claudia; Moerman, Andreas; van Thienen, Peter; Pieterse-Quirijns, Ilse
2017-04-01
Many researchers have developed drinking water demand models with various temporal and spatial scales. A limited number of models is available at a temporal scale of 1 s and a spatial scale of a single home. The reasons for building these models were described in the papers in which the models were introduced, along with a discussion on their potential applications. However, the predicted applications are seldom re-examined. SIMDEUM, a stochastic end-use model for drinking water demand, has often been applied in research and practice since it was developed. We are therefore re-examining its applications in this paper. SIMDEUM's original purpose was to calculate maximum demands in order to design self-cleaning networks. Yet, the model has been useful in many more applications. This paper gives an overview of the many fields of application for SIMDEUM and shows where this type of demand model is indispensable and where it has limited practical value. This overview also leads to an understanding of the requirements for demand models in various applications.
Diagnosing phosphorus limitations in natural terrestrial ecosystems in carbon cycle models
NASA Astrophysics Data System (ADS)
Sun, Yan; Peng, Shushi; Goll, Daniel S.; Ciais, Philippe; Guenet, Bertrand; Guimberteau, Matthieu; Hinsinger, Philippe; Janssens, Ivan A.; Peñuelas, Josep; Piao, Shilong; Poulter, Benjamin; Violette, Aurélie; Yang, Xiaojuan; Yin, Yi; Zeng, Hui
2017-07-01
Most of the Earth System Models (ESMs) project increases in net primary productivity (NPP) and terrestrial carbon (C) storage during the 21st century. Despite empirical evidence that limited availability of phosphorus (P) may limit the response of NPP to increasing atmospheric CO2, none of the ESMs used in the previous Intergovernmental Panel on Climate Change assessment accounted for P limitation. We diagnosed from ESM simulations the amount of P needed to support increases in carbon uptake by natural ecosystems using two approaches: the demand derived from (1) changes in C stocks and (2) changes in NPP. The C stock-based additional P demand was estimated to range between -31 and 193 Tg P and between -89 and 262 Tg P for Representative Concentration Pathway (RCP) 2.6 and RCP8.5, respectively, with negative values indicating a P surplus. The NPP-based demand, which takes ecosystem P recycling into account, results in a significantly higher P demand of 648-1606 Tg P for RCP2.6 and 924-2110 Tg P for RCP8.5. We found that the P demand is sensitive to the turnover of P in decomposing plant material, explaining the large differences between the NPP-based demand and C stock-based demand. The discrepancy between diagnosed P demand and actual P availability (potential P deficit) depends mainly on the assumptions about availability of the different soil P forms. Overall, future P limitation strongly depends on both soil P availability and P recycling on ecosystem scale.
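The bookkeeping behind the two diagnostics can be stated simply: the stock-based demand divides the change in each carbon pool by its C:P ratio, while the NPP-based demand divides cumulative additional NPP by the C:P ratio of new tissue and discounts the fraction of phosphorus recycled. The numbers below are hypothetical placeholders, not values or parameters from the study.

    # Illustrative bookkeeping for the two P-demand diagnostics described above.
    # All numbers (C:P ratios, stock changes, NPP, recycling fraction) are
    # hypothetical placeholders, not values from the study.
    delta_c_stocks = {"vegetation": 120.0, "soil": 40.0}    # Pg C gained by 2100
    c_to_p = {"vegetation": 500.0, "soil": 200.0}           # mass-based C:P ratios

    # Approach 1: P demand implied by changes in carbon stocks.
    p_demand_stock = sum(delta_c_stocks[pool] / c_to_p[pool]
                         for pool in delta_c_stocks)         # Pg P

    # Approach 2: P demand implied by cumulative extra NPP, with recycling.
    extra_npp = 900.0            # cumulative additional NPP over the century, Pg C
    c_to_p_new_tissue = 400.0
    recycled_fraction = 0.6      # share of P in litter that plants re-use
    p_demand_npp = extra_npp / c_to_p_new_tissue * (1 - recycled_fraction)

    print(p_demand_stock, p_demand_npp)   # the NPP-based demand comes out larger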
Diagnosing phosphorus limitations in natural terrestrial ecosystems in carbon cycle models.
Sun, Yan; Peng, Shushi; Goll, Daniel S; Ciais, Philippe; Guenet, Bertrand; Guimberteau, Matthieu; Hinsinger, Philippe; Janssens, Ivan A; Peñuelas, Josep; Piao, Shilong; Poulter, Benjamin; Violette, Aurélie; Yang, Xiaojuan; Yin, Yi; Zeng, Hui
2017-07-01
Most of the Earth System Models (ESMs) project increases in net primary productivity (NPP) and terrestrial carbon (C) storage during the 21st century. Despite empirical evidence that limited availability of phosphorus (P) may limit the response of NPP to increasing atmospheric CO2, none of the ESMs used in the previous Intergovernmental Panel on Climate Change assessment accounted for P limitation. We diagnosed from ESM simulations the amount of P needed to support increases in carbon uptake by natural ecosystems using two approaches: the demand derived from (1) changes in C stocks and (2) changes in NPP. The C stock-based additional P demand was estimated to range between -31 and 193 Tg P and between -89 and 262 Tg P for Representative Concentration Pathway (RCP) 2.6 and RCP8.5, respectively, with negative values indicating a P surplus. The NPP-based demand, which takes ecosystem P recycling into account, results in a significantly higher P demand of 648-1606 Tg P for RCP2.6 and 924-2110 Tg P for RCP8.5. We found that the P demand is sensitive to the turnover of P in decomposing plant material, explaining the large differences between the NPP-based demand and the C stock-based demand. The discrepancy between diagnosed P demand and actual P availability (potential P deficit) depends mainly on the assumptions about the availability of the different soil P forms. Overall, future P limitation strongly depends on both soil P availability and P recycling at the ecosystem scale.
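A back-of-the-envelope Python sketch of how the two diagnostics differ mechanically. All numbers (carbon gains, C:P stoichiometry, recycled fraction, time horizon) are assumptions chosen only to illustrate the calculation, not the study's values; with a low recycled fraction the NPP-based figure exceeds the stock-based one, qualitatively mirroring the abstract.

```python
# Illustrative only: assumed numbers, not the values diagnosed by the study.
delta_c_stock = 200.0   # Pg C gained in vegetation + soil over the century (assumed)
cp_ratio      = 500.0   # g C per g P in new biomass (assumed stoichiometry)

# (1) C stock-based demand: P locked up in the additional carbon stock.
p_demand_stock = delta_c_stock * 1e3 / cp_ratio            # Tg P (1 Pg = 1e3 Tg)

# (2) NPP-based demand: P needed to sustain the extra productivity every year,
# reduced by the fraction of plant P demand met by recycling from litter.
extra_npp     = 10.0    # Pg C per year of additional NPP (assumed)
recycled_frac = 0.5     # fraction of plant P demand met by recycling (assumed)
years         = 90

p_demand_npp = extra_npp * 1e3 / cp_ratio * (1 - recycled_frac) * years   # Tg P

print(f"C stock-based P demand: {p_demand_stock:.0f} Tg P")   # 400 Tg P
print(f"NPP-based P demand:     {p_demand_npp:.0f} Tg P")     # 900 Tg P
```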
Lakeside: Merging Urban Design with Scientific Analysis
Guzowski, Leah; Catlett, Charlie; Woodbury, Ed
2018-01-16
Researchers at the U.S. Department of Energy's Argonne National Laboratory and the University of Chicago are developing tools that merge urban design with scientific analysis to improve the decision-making process associated with large-scale urban developments. One such tool, called LakeSim, has been prototyped with an initial focus on consumer-driven energy and transportation demand, through a partnership with the Chicago-based architectural and engineering design firm Skidmore, Owings & Merrill, Clean Energy Trust and developer McCaffery Interests. LakeSim began with the need to answer practical questions about urban design and planning, requiring a better understanding about the long-term impact of design decisions on energy and transportation demand for a 600-acre development project on Chicago's South Side - the Chicago Lakeside Development project.
Prediction Models for Dynamic Demand Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aman, Saima; Frincu, Marc; Chelmis, Charalampos
2015-11-02
As Smart Grids move closer to dynamic curtailment programs, Demand Response (DR) events will become necessary not only on fixed time intervals and weekdays predetermined by static policies, but also during changing decision periods and weekends to react to real-time demand signals. Unique challenges arise in this context vis-a-vis demand prediction and curtailment estimation and the transformation of such tasks into an automated, efficient dynamic demand response (D2R) process. While existing work has concentrated on increasing the accuracy of prediction models for DR, there is a lack of studies on prediction models for D2R, which we address in this paper. Our first contribution is the formal definition of D2R and the description of its challenges and requirements. Our second contribution is a feasibility analysis of very-short-term prediction of electricity consumption for D2R over a diverse, large-scale dataset that includes both small residential customers and large buildings. Our third, and major, contribution is a set of insights into the predictability of electricity consumption in the context of D2R. Specifically, we focus on prediction models that can operate at a very small data granularity (here 15-min intervals), for both weekdays and weekends - all conditions that characterize scenarios for D2R. We find that short-term time series and simple averaging models used by Independent Service Operators and utilities achieve superior prediction accuracy. We also observe that workdays are more predictable than weekends and holidays, and that smaller customers have large variation in consumption and are less predictable than larger buildings. Key implications of our findings are that better models are required for small customers and for non-workdays, both of which are critical for D2R. Also, prediction models require just a few days' worth of data, indicating that small amounts of historical training data can be used to make reliable predictions, simplifying the big data challenge associated with D2R.
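For readers unfamiliar with the kind of simple averaging baseline the abstract refers to, the sketch below implements a time-of-day averaging predictor at 15-minute granularity on synthetic data. The function and variable names are ours and the data are made up; it only illustrates the mechanics of such a model.

```python
import numpy as np

def averaging_baseline(history, horizon=4):
    """
    Predict the next `horizon` 15-min intervals as the average consumption
    observed in the same time-of-day slots over the preceding days.
    `history` is a 1-D array of 15-min kWh readings (96 per day).
    """
    slots_per_day = 96
    days = len(history) // slots_per_day
    past = history[: days * slots_per_day].reshape(days, slots_per_day)
    next_slot = len(history) % slots_per_day
    preds = []
    for h in range(horizon):
        slot = (next_slot + h) % slots_per_day
        preds.append(past[:, slot].mean())
    return np.array(preds)

# Example with synthetic data: 10 full days of readings plus a partial day.
rng = np.random.default_rng(0)
base = np.tile(np.sin(np.linspace(0, 2 * np.pi, 96)) + 2.0, 11)[: 10 * 96 + 40]
readings = base + 0.1 * rng.standard_normal(base.size)
print(averaging_baseline(readings))   # next four 15-min forecasts (kWh)
```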
Assessing trade-offs in large marine protected areas.
Davies, Tammy E; Epstein, Graham; Aguilera, Stacy E; Brooks, Cassandra M; Cox, Michael; Evans, Louisa S; Maxwell, Sara M; Nenadovic, Mateja; Ban, Natalie C
2018-01-01
Large marine protected areas (LMPAs) are increasingly being established and have a high profile in marine conservation. LMPAs are expected to achieve multiple objectives, and because of their size are postulated to avoid trade-offs that are common in smaller MPAs. However, evaluations across multiple outcomes are lacking. We used a systematic approach to code several social and ecological outcomes of 12 LMPAs. We found evidence of three types of trade-offs: trade-offs between different ecological resources (supply trade-offs); trade-offs between ecological resource conditions and the well-being of resource users (supply-demand trade-offs); and trade-offs between the well-being outcomes of different resource users (demand trade-offs). We also found several divergent outcomes that were attributed to influences beyond the scope of the LMPA. We suggest that despite their size, trade-offs can develop in LMPAs and should be considered in planning and design. LMPAs may improve their performance across multiple social and ecological objectives if integrated with larger-scale conservation efforts.
State of the art and future perspectives of thermophilic anaerobic digestion.
Ahring, B K; Mladenovska, Z; Iranpour, R; Westermann, P
2002-01-01
The state of the art of thermophilic digestion is discussed. Thermophilic digestion is a well-established technology in Europe for the treatment of mixtures of waste in common large-scale biogas plants or for the treatment of the organic fraction of municipal solid waste. Due to a large number of failures over time with thermophilic digestion of sewage sludge, this process has lost its appeal in the USA. New demands on the sanitation of biosolids before land use will, however, bring attention back to the use of elevated temperatures during sludge stabilization. In the paper we show how a start-up strategy based on the actual activity of key microbes can be used to ensure a proper and fast transfer of mesophilic digesters into thermophilic operation. Extreme thermophilic temperatures of 65 degrees C or more may be necessary in the future to meet the demands for full sanitation of the waste material before final disposal. We show data on anaerobic digestion at extreme thermophilic temperatures.
Survey on the possibility of introducing new energy to regional development plans
NASA Astrophysics Data System (ADS)
1988-03-01
This report covers nationwide large-scale resort plans and at the same time studies the possibility of introducing new energy systems, mainly cogeneration, and their effects. Japanese industrial structure is rapidly moving toward information and service areas, and the development of resorts has become very active. With the increase of resort demands, much is expected of resort development as a means of regional promotion. Special features of energy consumption in resort facilities are that annual demand is large, that energy consumption fluctuates greatly, and that energy supply cost is high. These features are especially conspicuous in smaller facilities. Most suited for resort lodging facilities is a co-generation system, especially a diesel engine system. This system is expected to conserve energy; but to promote this system, it is necessary to revise the preferential tax treatment and Fire Service Act to meet the actual circumstances, and to develop a highly reliable system that can be operated unattended. An economical system in view of overall costs is also essential.
NASA Astrophysics Data System (ADS)
Abe, R.; Hamada, K.; Hirata, N.; Tamura, R.; Nishi, N.
2015-05-01
As with BIM-based quality management in the construction industry, there is a strong demand in the shipbuilding field for quality management of the manufacturing process of each member. Accurate measurement of the time series of three-dimensional deformation at each process step is strongly demanded. In this study, we focus on the shipbuilding field and examine a three-dimensional measurement method. In a shipyard, large equipment and components are intricately arranged in a limited space, so the placement of measuring equipment and targets is restricted. Moreover, the element to be measured is moved between processes, so establishing the reference points for time-series comparison requires careful consideration. This paper discusses a method for measuring welding deformation in time series using a total station. In particular, multiple measurement data obtained with this approach are used to evaluate the amount of deformation at each process step.
Assessing trade-offs in large marine protected areas
Aguilera, Stacy E.; Brooks, Cassandra M.; Cox, Michael; Evans, Louisa S.; Maxwell, Sara M.; Nenadovic, Mateja
2018-01-01
Large marine protected areas (LMPAs) are increasingly being established and have a high profile in marine conservation. LMPAs are expected to achieve multiple objectives, and because of their size are postulated to avoid trade-offs that are common in smaller MPAs. However, evaluations across multiple outcomes are lacking. We used a systematic approach to code several social and ecological outcomes of 12 LMPAs. We found evidence of three types of trade-offs: trade-offs between different ecological resources (supply trade-offs); trade-offs between ecological resource conditions and the well-being of resource users (supply-demand trade-offs); and trade-offs between the well-being outcomes of different resource users (demand trade-offs). We also found several divergent outcomes that were attributed to influences beyond the scope of the LMPA. We suggest that despite their size, trade-offs can develop in LMPAs and should be considered in planning and design. LMPAs may improve their performance across multiple social and ecological objectives if integrated with larger-scale conservation efforts. PMID:29668750
The water footprint of sweeteners and bio-ethanol.
Gerbens-Leenes, Winnie; Hoekstra, Arjen Y
2012-04-01
An increasing demand for food together with a growing demand for energy crops results in an increasing demand for, and competition over, water. Sugar cane, sugar beet and maize are not only essential food crops, but also important feedstocks for bio-ethanol. Crop growth requires water, a scarce resource. This study aims to assess the green, blue and grey water footprint (WF) of sweeteners and bio-ethanol from sugar cane, sugar beet and maize in the main producing countries. The WFs of sweeteners and bio-ethanol are mainly determined by the crop type that is used as a source and by agricultural practice and agro-climatic conditions; process water footprints are relatively small. The weighted global average WF of sugar cane is 209 m³/tonne; for sugar beet this is 133 m³/tonne and for maize 1222 m³/tonne. Large regional differences in WFs indicate that the WFs of crops for sweeteners and bio-ethanol can be improved. It is more favourable to use maize as a feedstock for sweeteners or bio-ethanol than sugar beet or sugar cane. The WF of sugar cane contributes to water stress in the Indus and Ganges basins. In the Ukraine, the large grey WF of sugar beet contributes to water pollution. In some western European countries, the blue WFs of sugar beet and maize claim a large share of the blue water available for agriculture. The allocation of the limited global water resources to bio-energy on a large scale will be at the cost of water allocation to food and nature. Copyright © 2011 Elsevier Ltd. All rights reserved.
Sharma, Hitt J; Patil, Vishwanath D; Lalwani, Sanjay K; Manglani, Mamta V; Ravichandran, Latha; Kapre, Subhash V; Jadhav, Suresh S; Parekh, Sameer S; Ashtagi, Girija; Malshe, Nandini; Palkar, Sonali; Wade, Minal; Arunprasath, T K; Kumar, Dinesh; Shewale, Sunil D
2012-01-11
Hib vaccine can be easily incorporated into the EPI vaccination schedule as the immunization schedule of Hib is similar to that of DTP vaccine. To meet the global demand for Hib vaccine, SIIL scaled up the Hib conjugate manufacturing process. This study was conducted in Indian infants to assess and compare the immunogenicity and safety of the DTwP-HB+Hib (Pentavac®) vaccine of SIIL manufactured at large scale with the 'same vaccine' manufactured at a smaller scale. 720 infants aged 6-8 weeks were randomized (2:1 ratio) to receive 0.5 ml of Pentavac® vaccine from two different lots, one produced by the scaled-up process and the other by the small-scale process. Serum samples obtained before and at one month after the 3rd dose of vaccine from both groups were tested for IgG antibody response by ELISA and compared to assess non-inferiority. Neither immunological interference nor increased reactogenicity was observed in either of the vaccine groups. All infants developed protective antibody titres to diphtheria, tetanus and Hib disease. For the hepatitis B antigen, one child from each group remained sero-negative. The response to pertussis was 88% in the large-scale group vis-à-vis 87% in the small-scale group. Non-inferiority was concluded for all five components of the vaccine. No serious adverse event was reported in the study. The scaled-up vaccine achieved a comparable response in terms of safety and immunogenicity to the small-scale vaccine and can therefore be easily incorporated into the routine childhood vaccination programme. Copyright © 2011 Elsevier Ltd. All rights reserved.
Wang, W J
2016-07-06
There is a large population at high risk for diabetes in China, and there has been a dramatic increase in the incidence of diabetes in the country over the past 30 years. Interventions targeting the individual risk factors of diabetes, such as an unhealthy diet, lack of physical activity, overweight, and obesity, can effectively prevent the disease. Evaluation of related knowledge, attitudes, and behaviors before and after intervention using appropriate scales can measure population demands and the effectiveness of interventions. Scientific soundness and practicability are basic requirements of scale development. The theoretical basis and measurement items of a scale should be consistent with the theory of behavior change and should measure the content of interventions in a standardized and detailed manner to produce good validity, reliability, and acceptability. The scale of knowledge, attitude, and behavior of lifestyle intervention in a diabetes high-risk population is a tool for demand evaluation and effect evaluation of lifestyle interventions that has good validity and reliability. Established by the National Center for Chronic and Noncommunicable Disease Control and Prevention, its use can help to decrease the Chinese population at high risk for diabetes through targeted and scientifically sound lifestyle interventions. Future development of intervention evaluation scales for use in high-risk populations should consider new factors and the characteristics of different populations, to develop new scales and modify or simplify existing ones, as well as to extend the measurement dimensions to barriers to and supporting environments for behavior change.
Gender and regional differences in perceived job stress across Europe.
de Smet, P; Sans, S; Dramaix, M; Boulenguez, C; de Backer, G; Ferrario, M; Cesana, G; Houtman, I; Isacsson, S O; Kittel, F; Ostergren, P O; Peres, I; Pelfrene, E; Romon, M; Rosengren, A; Wilhelmsen, L; Kornitzer, M
2005-10-01
Over the last 20 years stress at work has been found to be predictive of several conditions such as coronary heart disease, high blood pressure and non-specific sick leave. The Karasek demand/control/strain concept has been the most widely used in prospective epidemiological studies. To describe distribution in Karasek's demand/control (DC) dimensions as well as prevalence of strain in samples from different parts of Europe grouped into three regions (South, Middle, Sweden), adjusting for occupation. To describe gender differences in Karasek's DC dimensions along with strain prevalence and assess the regional stability of those differences in different occupational groups. The Job stress, Absenteeism and Coronary heart disease in Europe (JACE) study, a Concerted Action (Biomed I) of the European Union, is a multicentre prospective cohort epidemiological study: 38,019 subjects at work aged 35-59 years were surveyed at baseline. Standardised techniques were used for occupation coding (International Standardised Classification of Occupations) and for the DC model (Karasek scale): five items for the psychological demand and nine items for the control or decision latitude dimensions, respectively. A total of 34,972 subjects had a complete data set. There were important regional differences in the Karasek scales and in prevalence of strain even after adjustment for occupational class. Mean demand and control were higher in the Swedish centres when compared to two centres in Milano and Barcelona (Southern region) and values observed in four centres (Ghent, Brussels, Lille and Hoofddorp) in Middle Europe were closer to those observed in the Southern cities than to those obtained in the Swedish cities. Clerks (ISCO 4) and, more specifically, office clerks (ISCO 41) exhibited the smallest regional variation. In a multivariate model, the factor 'region' explained a small fraction of total variance. In the two Southern centres as well as in the four Middle European centres, men perceived marginally less job-demand as compared to women whereas the reverse was observed in the two Swedish centres. Differences were larger for control: men appeared to perceive more control at work than did women. In a multivariate model, gender explained a small fraction whereas occupational level explained a large fraction of the variance. In this standardised multicentre European study Karasek's DC model showed large gender and occupational differences whereas geographic region explained a small fraction of the total DC variance, notwithstanding large differences in labour market and working conditions as pointed out by the European Commission as recently as 2000.
Dynamics Behaviors of Scale-Free Networks with Elastic Demand
NASA Astrophysics Data System (ADS)
Li, Yan-Lai; Sun, Hui-Jun; Wu, Jian-Jun
Many real-world networks, such as transportation networks and Internet, have the scale-free properties. It is important to study the bearing capacity of such networks. Considering the elastic demand condition, we analyze load distributions and bearing capacities with different parameters through artificially created scale-free networks. The simulation results show that the load distribution follows a power-law form, which means some ordered pairs, playing the dominant role in the transportation network, have higher demand than other pairs. We found that, with the decrease of perceptual error, the total and average ordered pair demand will decrease and then stay in a steady state. However, with the increase of the network size, the average demand of each ordered pair will decrease, which is particularly interesting for the network design problem.
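A minimal sketch of the setup, assuming the networkx library is available: generate a Barabási-Albert scale-free topology and use betweenness centrality as a crude stand-in for node load. The paper's own load model, with elastic origin-destination demand and perceptual error, is not reproduced here.

```python
import collections
import networkx as nx

# Build a scale-free topology; parameters n and m are illustrative.
G = nx.barabasi_albert_graph(n=500, m=2, seed=1)

# The degree distribution has a heavy tail, the hallmark of scale-free networks.
degree_counts = collections.Counter(dict(G.degree()).values())
print("degree -> #nodes:", sorted(degree_counts.items())[:10])

# Betweenness centrality as a rough proxy for how much traffic a node carries.
load = nx.betweenness_centrality(G)
top = sorted(load.items(), key=lambda kv: kv[1], reverse=True)[:5]
print("highest-load nodes:", top)
```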
NASA Astrophysics Data System (ADS)
Vanclooster, Marnik
2010-05-01
The current societal demand for sustainable soil and water management is very large. The drivers of global and climate change exert many pressures on soil and water ecosystems, endangering appropriate ecosystem functioning. Unsaturated soil transport processes play a key role in soil-water system functioning as they control the fluxes of water and nutrients from the soil to plants (the pedo-biosphere link), the infiltration flux of precipitated water to groundwater and the evaporative flux, and hence the feedback from the soil to the climate system. Yet, unsaturated soil transport processes are difficult to quantify since they are affected by huge variability of the governing properties at different space-time scales and by the intrinsic non-linearity of the transport processes. The incompatibility between the scale at which processes can reasonably be characterized, the scale at which the theoretical processes can correctly be described, and the scale at which the soil and water system needs to be managed calls for further development of scaling procedures in unsaturated zone science. It also calls for a better integration of theoretical and modelling approaches to elucidate transport processes at the appropriate scales, compatible with the sustainable soil and water management objective. Moditoring science, i.e. the interdisciplinary research domain where modelling and monitoring science are linked, is currently evolving significantly in the unsaturated zone hydrology area. In this presentation, a review of current moditoring strategies/techniques will be given and illustrated for solving large-scale soil and water management problems. This will also allow the identification of research needs in the interdisciplinary domain of modelling and monitoring and improve the integration of unsaturated zone science in solving soil and water management issues. A focus will be given to examples of large-scale soil and water management problems in Europe.
NASA Astrophysics Data System (ADS)
Strzepek, K. M.; Kirshen, P.; Yohe, G.
2001-05-01
The fundamental theme of this research was to investigate tradeoffs in model resolution for modeling water resources in the context of national economic development and capital investment decisions. Based on a case study of China, the research team developed water resource models at relatively fine scales, then investigated how they can be aggregated to regional or national scales for use in national-level planning decisions or global-scale integrated assessment models of food and/or environmental change issues. The team developed regional water supply and water demand functions. Simplifying and aggregating the supply and demand functions allows reduced-form functions of the water sector to be included in large-scale national economic models. Water supply cost functions were developed for both surface water and groundwater supplies. Surface water: long time series of flows at the mouths of the 36 major river sub-basins in China are used in conjunction with different basin reservoir storage quantities to obtain storage-yield curves. These are then combined with reservoir and transmission cost data to obtain yield-cost (surface water supply cost) curves. The long time series of flows for each basin are obtained by fitting a simple abcd water balance model to each basin. The costs of reservoir storage were estimated using a methodology developed in the USA that relates marginal storage costs to existing storage, slope and geological conditions; the US cost functions were then adjusted to Chinese costs, and the costs of some actual dams in China were used to "ground-truth" the methodology. Groundwater: the purpose of the groundwater work is to estimate the recharge in each basin and the depths and water quality of aquifers. A byproduct of applying the abcd water balance model is the recharge. Depths and water quality of aquifers are taken from many separate reports on groundwater in different parts of China; no global or regional groundwater datasets could be found. Water demand curves: water use data are reported by political region, whereas water supply is reported and modeled by river basin, so water demands must be allocated to river basins. To accomplish this, China's 9 major river basins were divided into 36 hydro-economic regions and each county was allocated to one of these zones. The county-level water use data were aggregated into 5 major water use sectors: (1) industry; (2) urban municipal and vegetable gardens; (3) major irrigation; (4) energy; and (5) other agriculture (forestry, pasture, fishery). Sectoral demand functions that include price and income elasticity were developed for the 5 sectors in each of the 9 major river basins. The supply and demand curves were aggregated at a variety of geographical scales as well as levels of economic sectoral aggregation. Implications for investment and sustainable development policies were examined for the various aggregations using partial and general equilibrium modeling of the Chinese economy. These results and policy implications for China, as well as insights and recommendations for other developing countries, will be presented.
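The abcd water balance model mentioned above is compact enough to sketch. The following Python version uses one common formulation of the monthly model; the parameter values and forcing series are illustrative and are not calibrated to any Chinese basin.

```python
import numpy as np

def abcd_model(precip, pet, a=0.98, b=400.0, c=0.3, d=0.1, s0=100.0, g0=50.0):
    """
    One common formulation of the monthly 'abcd' water-balance model.
    Returns simulated streamflow and groundwater recharge series (mm/month).
    Parameters a, b, c, d and the initial storages are illustrative only.
    """
    s, g = s0, g0
    flow, recharge = [], []
    for p, e in zip(precip, pet):
        w = p + s                                    # available water
        y = (w + b) / (2 * a) - np.sqrt(((w + b) / (2 * a)) ** 2 - w * b / a)
        s = y * np.exp(-e / b)                       # end-of-month soil storage
        avail = w - y                                # water left for runoff + recharge
        r = c * avail                                # groundwater recharge
        g = (g + r) / (1 + d)                        # groundwater storage
        flow.append((1 - c) * avail + d * g)         # direct runoff + baseflow
        recharge.append(r)
    return np.array(flow), np.array(recharge)

precip = np.array([20, 35, 60, 90, 120, 150, 180, 160, 100, 60, 30, 20], float)
pet    = np.array([30, 40, 60, 80, 110, 130, 140, 130, 100, 70, 40, 30], float)
q, r = abcd_model(precip, pet)
print(q.round(1))
```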
An ultrasensitive strain sensor with a wide strain range based on graphene armour scales.
Yang, Yi-Fan; Tao, Lu-Qi; Pang, Yu; Tian, He; Ju, Zhen-Yi; Wu, Xiao-Ming; Yang, Yi; Ren, Tian-Ling
2018-06-12
An ultrasensitive strain sensor with a wide strain range based on graphene armour scales is demonstrated in this paper. The sensor shows an ultra-high gauge factor (GF, up to 1054) and a wide strain range (ε = 26%), both of which present an advantage compared to most other flexible sensors. Moreover, the sensor is developed by a simple fabrication process. Due to the excellent performance, this strain sensor can meet the demands of subtle, large and complex human motion monitoring, which indicates its tremendous application potential in health monitoring, mechanical control, real-time motion monitoring and so on.
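The gauge factor quoted above follows the standard definition GF = (ΔR/R0)/ε. The toy calculation below simply rearranges that definition with the quoted figures; note that a sensor's GF is generally reported at a particular strain and need not be constant over the full 26% range.

```python
def gauge_factor(delta_r_over_r0, strain):
    """Gauge factor: relative resistance change divided by applied strain."""
    return delta_r_over_r0 / strain

# For GF = 1054 at 26% strain, the relative resistance change would be
# roughly 1054 * 0.26 ≈ 274 (i.e. about 27,400%).
print(gauge_factor(delta_r_over_r0=274.0, strain=0.26))  # ≈ 1054
```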
NASA Astrophysics Data System (ADS)
Wada, Y.; Wisser, D.; Bierkens, M. F. P.
2014-01-01
To sustain growing food demand and increasing standard of living, global water withdrawal and consumptive water use have been increasing rapidly. To analyze the human perturbation on water resources consistently over large scales, a number of macro-scale hydrological models (MHMs) have been developed in recent decades. However, few models consider the interaction between terrestrial water fluxes, and human activities and associated water use, and even fewer models distinguish water use from surface water and groundwater resources. Here, we couple a global water demand model with a global hydrological model and dynamically simulate daily water withdrawal and consumptive water use over the period 1979-2010, using two re-analysis products: ERA-Interim and MERRA. We explicitly take into account the mutual feedback between supply and demand, and implement a newly developed water allocation scheme to distinguish surface water and groundwater use. Moreover, we include a new irrigation scheme, which works dynamically with a daily surface and soil water balance, and incorporate the newly available extensive Global Reservoir and Dams data set (GRanD). Simulated surface water and groundwater withdrawals generally show good agreement with reported national and subnational statistics. The results show a consistent increase in both surface water and groundwater use worldwide, with a more rapid increase in groundwater use since the 1990s. Human impacts on terrestrial water storage (TWS) signals are evident, altering the seasonal and interannual variability. This alteration is particularly large over heavily regulated basins such as the Colorado and the Columbia, and over the major irrigated basins such as the Mississippi, the Indus, and the Ganges. Including human water use and associated reservoir operations generally improves the correlation of simulated TWS anomalies with those of the GRACE observations.
NASA Astrophysics Data System (ADS)
Wada, Y.; Wisser, D.; Bierkens, M. F.
2014-12-01
To sustain growing food demand and increasing standard of living, global water withdrawal and consumptive water use have been increasing rapidly. To analyze the human perturbation on water resources consistently over large scales, a number of macro-scale hydrological models (MHMs) have been developed in recent decades. However, few models consider the interaction between terrestrial water fluxes, and human activities and associated water use, and even fewer models distinguish water use from surface water and groundwater resources. Here, we couple a global water demand model with a global hydrological model and dynamically simulate daily water withdrawal and consumptive water use over the period 1979-2010, using two re-analysis products: ERA-Interim and MERRA. We explicitly take into account the mutual feedback between supply and demand, and implement a newly developed water allocation scheme to distinguish surface water and groundwater use. Moreover, we include a new irrigation scheme, which works dynamically with a daily surface and soil water balance, and incorporate the newly available extensive global reservoir data set (GRanD). Simulated surface water and groundwater withdrawals generally show good agreement with reported national and sub-national statistics. The results show a consistent increase in both surface water and groundwater use worldwide, with a more rapid increase in groundwater use since the 1990s. Human impacts on terrestrial water storage (TWS) signals are evident, altering the seasonal and inter-annual variability. This alteration is particularly large over heavily regulated basins such as the Colorado and the Columbia, and over the major irrigated basins such as the Mississippi, the Indus, and the Ganges. Including human water use and associated reservoir operations generally improves the correlation of simulated TWS anomalies with those of the GRACE observations.
The cosmological principle is not in the sky
NASA Astrophysics Data System (ADS)
Park, Chan-Gyung; Hyun, Hwasu; Noh, Hyerim; Hwang, Jai-chan
2017-08-01
The homogeneity of matter distribution at large scales, known as the cosmological principle, is a central assumption in the standard cosmological model. The assumption is testable, however, and thus no longer needs to be taken as a principle. Here we perform a test for spatial homogeneity using the Sloan Digital Sky Survey Luminous Red Galaxies (LRG) sample by counting galaxies within a specified volume, with the radius scale varying up to 300 h-1 Mpc. We directly confront the large-scale structure data with the definition of spatial homogeneity by comparing the averages and dispersions of galaxy number counts with the ranges allowed for a homogeneous random distribution. The LRG sample shows significantly larger dispersions of number counts than the random catalogues up to the 300 h-1 Mpc scale, and even the average is located far outside the range allowed in the random distribution; the deviations are statistically impossible to realize in the random distribution. This implies that the cosmological principle does not hold even at such large scales. The same analysis of mock galaxies derived from an N-body simulation, however, suggests that the LRG sample is consistent with the current paradigm of cosmology; thus the simulation is also not homogeneous at that scale. We conclude that the cosmological principle is neither in the observed sky nor demanded to be there by the standard cosmological world model. This reveals the nature of the cosmological principle adopted in the modern cosmology paradigm and opens a new field of research in theoretical cosmology.
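A minimal counts-in-cells sketch of the test's mechanics, run on synthetic uniform points rather than the SDSS LRG catalogue (so by construction it should pass): count galaxies in equal-radius spheres and compare the mean and dispersion of the counts with the Poisson expectation for a homogeneous random distribution. Box size, sphere radius and point counts are assumed.

```python
import numpy as np

rng = np.random.default_rng(3)

box = 1000.0          # box side, h^-1 Mpc (assumed)
n_points = 20000      # synthetic "galaxies" (assumed)
radius = 150.0        # counting-sphere radius, h^-1 Mpc (assumed)
n_spheres = 200

galaxies = rng.uniform(0, box, size=(n_points, 3))           # stand-in catalogue
centres = rng.uniform(radius, box - radius, size=(n_spheres, 3))

def counts_in_spheres(points, centres, r):
    counts = []
    for c in centres:
        d2 = np.sum((points - c) ** 2, axis=1)
        counts.append(np.count_nonzero(d2 < r * r))
    return np.array(counts)

counts = counts_in_spheres(galaxies, centres, radius)
expected = n_points * (4 / 3) * np.pi * radius**3 / box**3   # Poisson mean
print(f"mean count {counts.mean():.1f} vs Poisson expectation {expected:.1f}")
print(f"dispersion {counts.std():.1f} vs Poisson sqrt(mean) {np.sqrt(expected):.1f}")
```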
Pair-barcode high-throughput sequencing for large-scale multiplexed sample analysis
2012-01-01
Background The multiplexing capacity becomes the major limitation of next-generation sequencing (NGS) in application to low-complexity samples. Physical space segregation allows limited multiplexing, while the existing barcode approach only permits simultaneous analysis of up to several dozen samples. Results Here we introduce pair-barcode sequencing (PBS), an economic and flexible barcoding technique that permits parallel analysis of large-scale multiplexed samples. In two pilot runs using a SOLiD sequencer (Applied Biosystems Inc.), 32 independent pair-barcoded miRNA libraries were simultaneously analyzed through the combination of 4 unique forward barcodes and 8 unique reverse barcodes. Over 174,000,000 reads were generated and about 64% of them were assigned to both of the barcodes. After mapping all reads to pre-miRNAs in miRBase, different miRNA expression patterns were captured from the two clinical groups. The strong correlation using different barcode pairs and the high consistency of miRNA expression in two independent runs demonstrate that the PBS approach is valid. Conclusions By employing the PBS approach in NGS, large-scale multiplexed pooled samples can be practically analyzed in parallel, so that high-throughput sequencing economically meets the requirements of samples with low sequencing-throughput demand. PMID:22276739
Pair-barcode high-throughput sequencing for large-scale multiplexed sample analysis.
Tu, Jing; Ge, Qinyu; Wang, Shengqin; Wang, Lei; Sun, Beili; Yang, Qi; Bai, Yunfei; Lu, Zuhong
2012-01-25
The multiplexing capacity becomes the major limitation of next-generation sequencing (NGS) in application to low-complexity samples. Physical space segregation allows limited multiplexing, while the existing barcode approach only permits simultaneous analysis of up to several dozen samples. Here we introduce pair-barcode sequencing (PBS), an economic and flexible barcoding technique that permits parallel analysis of large-scale multiplexed samples. In two pilot runs using a SOLiD sequencer (Applied Biosystems Inc.), 32 independent pair-barcoded miRNA libraries were simultaneously analyzed through the combination of 4 unique forward barcodes and 8 unique reverse barcodes. Over 174,000,000 reads were generated and about 64% of them were assigned to both of the barcodes. After mapping all reads to pre-miRNAs in miRBase, different miRNA expression patterns were captured from the two clinical groups. The strong correlation using different barcode pairs and the high consistency of miRNA expression in two independent runs demonstrate that the PBS approach is valid. By employing the PBS approach in NGS, large-scale multiplexed pooled samples can be practically analyzed in parallel, so that high-throughput sequencing economically meets the requirements of samples with low sequencing-throughput demand.
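The pairing arithmetic is simple: 4 forward × 8 reverse barcodes give 32 distinguishable libraries, as in the pilot runs. The sketch below shows how reads could be demultiplexed by barcode pair; the barcode sequences themselves are made up for illustration.

```python
from itertools import product

# Hypothetical barcode sequences (not the ones used in the study).
forward = ["ACGT", "TGCA", "GATC", "CTAG"]
reverse = ["AAGG", "CCTT", "GGAA", "TTCC", "AGAG", "TCTC", "GAGA", "CTCT"]

# 4 forward x 8 reverse barcodes -> 32 distinguishable libraries.
pair_to_library = {fw + "-" + rv: i
                   for i, (fw, rv) in enumerate(product(forward, reverse))}
print(len(pair_to_library))   # 32

def assign_read(fw_tag, rv_tag):
    """Return the library index for a read, or None if either tag is unknown."""
    return pair_to_library.get(fw_tag + "-" + rv_tag)

print(assign_read("TGCA", "GAGA"))
```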
Gibon, Thomas; Wood, Richard; Arvesen, Anders; Bergesen, Joseph D; Suh, Sangwon; Hertwich, Edgar G
2015-09-15
Climate change mitigation demands large-scale technological change on a global level and, if successfully implemented, will significantly affect how products and services are produced and consumed. In order to anticipate the life cycle environmental impacts of products under climate mitigation scenarios, we present the modeling framework of an integrated hybrid life cycle assessment model covering nine world regions. Life cycle assessment databases and multiregional input-output tables are adapted using forecasted changes in technology and resources up to 2050 under a 2 °C scenario. We call the result of this modeling "technology hybridized environmental-economic model with integrated scenarios" (THEMIS). As a case study, we apply THEMIS in an integrated environmental assessment of concentrating solar power. Life-cycle greenhouse gas emissions for this plant range from 33 to 95 g CO2 eq./kWh across different world regions in 2010, falling to 30-87 g CO2 eq./kWh in 2050. Using regional life cycle data yields insightful results. More generally, these results also highlight the need for systematic life cycle frameworks that capture the actual consequences and feedback effects of large-scale policies in the long term.
Estimating of Soil Texture Using Landsat Imagery: a Case Study in Thatta Tehsil, Sindh
NASA Astrophysics Data System (ADS)
Khalil, Zahid
2016-07-01
Soil texture is considered an important environmental factor for agricultural growth. It is an essential input for large-scale soil classification. Today, precise soil information at large scales is in great demand from various stakeholders, including soil scientists, environmental managers, land use planners and traditional agricultural users. The increasing demand for soil properties at fine spatial resolution has made traditional laboratory methods inadequate. In addition, soil analysis for precision agriculture systems is more expensive than traditional methods. In this regard, geospatial techniques can be used as an alternative for soil analysis. This study aims to examine the ability of geospatial techniques to identify the spatial patterns of soil attributes at fine scale. Around 28 soil samples were collected from different areas of Thatta Tehsil, Sindh, Pakistan for soil texture analysis. An ordinary least squares (OLS) regression analysis was used to relate the reflectance values of Landsat 8 OLI imagery with the soil variables. The analysis showed a significant relationship (p < 0.05) of bands 2 and 5 with silt % (R² = 0.52), and of bands 4 and 6 with clay % (R² = 0.40). The equations derived from the OLS analysis were then applied to the whole study area to derive the soil attributes. The USDA textural classification triangle was implemented to derive the soil texture map in a GIS environment. The outcome revealed that 'sandy loam' was the most abundant class, followed by loam, sandy clay loam and clay loam. This shows that geospatial techniques can be used efficiently for mapping the soil texture of a large area at fine scale. The technology helps decrease cost and time and increases the level of detail by reducing field work considerably.
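A sketch of the regression step only, assuming scikit-learn is available: fit an OLS model relating band reflectances to a lab-measured soil fraction, then apply the fitted equation pixel by pixel to the image. The reflectances and the "true" silt values below are synthetic, not the 28 Thatta samples.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n = 28
band2 = rng.uniform(0.05, 0.15, n)      # blue reflectance (assumed range)
band5 = rng.uniform(0.15, 0.35, n)      # NIR reflectance (assumed range)
silt = 40 - 120 * band2 + 60 * band5 + rng.normal(0, 3, n)   # synthetic lab values

X = np.column_stack([band2, band5])
model = LinearRegression().fit(X, silt)
print("R^2 on the sample:", round(model.score(X, silt), 2))

# The fitted equation would then be applied to every pixel of the Landsat scene
# to map silt %, and the same procedure repeated for clay % with bands 4 and 6.
```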
NASA Astrophysics Data System (ADS)
Manfredi, Sabato
2016-06-01
Large-scale dynamic systems are becoming highly pervasive, with applications ranging from systems biology, environment monitoring, and sensor networks to power systems. They are characterised by high dimensionality, complexity, and uncertainty in the node dynamics/interactions, and they require increasingly computationally demanding methods for their analysis and control design as the network size and the node system/interaction complexity increase. It is therefore a challenging problem to find scalable computational methods for the distributed control design of large-scale networks. In this paper, we investigate the robust distributed stabilisation problem of large-scale nonlinear multi-agent systems (briefly, MASs) composed of non-identical (heterogeneous) linear dynamical systems coupled by uncertain nonlinear time-varying interconnections. By employing Lyapunov stability theory and the linear matrix inequality (LMI) technique, new conditions are given for the distributed control design of large-scale MASs that can be easily solved by the MATLAB toolbox. The stabilisability of each node dynamic is a sufficient assumption to design a globally stabilising distributed control. The proposed approach improves some of the existing LMI-based results on MASs by both overcoming their computational limits and extending the applicative scenario to large-scale nonlinear heterogeneous MASs. Additionally, the proposed LMI conditions are further reduced in terms of computational requirements in the case of weakly heterogeneous MASs, which is a common scenario in real applications where the network nodes and links are affected by parameter uncertainties. One of the main advantages of the proposed approach is that it allows a move from a centralised towards a distributed computing architecture, so that the expensive computational workload spent solving LMIs may be shared among processors located at the networked nodes, thus increasing the scalability of the approach with the network size. Finally, a numerical example shows the applicability of the proposed method and its advantage in terms of computational complexity when compared with existing approaches.
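The paper solves its LMIs with the MATLAB toolbox; purely to illustrate what an LMI feasibility problem looks like, the sketch below poses a single-system Lyapunov inequality in Python with cvxpy (assuming an SDP-capable solver such as SCS is installed). It is not the paper's distributed multi-agent condition.

```python
import numpy as np
import cvxpy as cp

# Find P > 0 with A^T P + P A < 0, certifying stability of dx/dt = A x.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

P = cp.Variable((2, 2), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(2),
               A.T @ P + P @ A << -eps * np.eye(2)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print(prob.status, np.round(P.value, 3))
```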
NASA Astrophysics Data System (ADS)
Mujumdar, Pradeep P.
2014-05-01
Climate change results in regional hydrologic change. The three prominent signals of global climate change, viz., increase in global average temperatures, rise in sea levels and change in precipitation patterns convert into signals of regional hydrologic change in terms of modifications in water availability, evaporative water demand, hydrologic extremes of floods and droughts, water quality, salinity intrusion in coastal aquifers, groundwater recharge and other related phenomena. A major research focus in hydrologic sciences in recent years has been assessment of impacts of climate change at regional scales. An important research issue addressed in this context deals with responses of water fluxes on a catchment scale to the global climatic change. A commonly adopted methodology for assessing the regional hydrologic impacts of climate change is to use the climate projections provided by the General Circulation Models (GCMs) for specified emission scenarios in conjunction with the process-based hydrologic models to generate the corresponding hydrologic projections. The scaling problem arising because of the large spatial scales at which the GCMs operate compared to those required in distributed hydrologic models, and their inability to satisfactorily simulate the variables of interest to hydrology are addressed by downscaling the GCM simulations to hydrologic scales. Projections obtained with this procedure are burdened with a large uncertainty introduced by the choice of GCMs and emission scenarios, small samples of historical data against which the models are calibrated, downscaling methods used and other sources. Development of methodologies to quantify and reduce such uncertainties is a current area of research in hydrology. In this presentation, an overview of recent research carried out by the author's group on assessment of hydrologic impacts of climate change addressing scale issues and quantification of uncertainties is provided. Methodologies developed with conditional random fields, Dempster-Shafer theory, possibility theory, imprecise probabilities and non-stationary extreme value theory are discussed. Specific applications on uncertainty quantification in impacts on streamflows, evaporative water demands, river water quality and urban flooding are presented. A brief discussion on detection and attribution of hydrologic change at river basin scales, contribution of landuse change and likely alterations in return levels of hydrologic extremes is also provided.
Advances in polycrystalline thin-film photovoltaics for space applications
NASA Technical Reports Server (NTRS)
Lanning, Bruce R.; Armstrong, Joseph H.; Misra, Mohan S.
1994-01-01
Polycrystalline, thin-film photovoltaics represent one of the few (if not the only) renewable power sources with the potential to satisfy the demanding technical requirements for future space applications. The demand in space is for deployable, flexible arrays with high power-to-weight ratios and long-term stability (15-20 years). In addition, there is also the demand that these arrays be produced by scalable, low-cost, high-yield processes. An approach to significantly reduce costs and increase reliability is to interconnect individual cells in series via monolithic integration. Both CIS and CdTe semiconductor films are optimum absorber materials for thin-film n-p heterojunction solar cells, having band gaps between 0.9-1.5 eV and demonstrated small-area efficiencies, with cadmium sulfide window layers, above 16.5 percent. Both CIS and CdTe polycrystalline thin-film cells have been produced on a laboratory scale by a variety of physical and chemical deposition methods, including evaporation, sputtering, and electrodeposition. Translating the laboratory processes which yield these high-efficiency, small-area cells into the design of a manufacturing process capable of producing 1-sq ft modules, however, requires a quantitative understanding of each individual step in the process and its effect on overall module performance. With a proper quantification and understanding of material transport and reactivity for each individual step, a manufacturing process can be designed that is not 'reactor-specific' and can be controlled intelligently through the design parameters of the process. The objective of this paper is to present an overview of the current efforts at MMC to develop large-scale manufacturing processes for both CIS and CdTe thin-film polycrystalline modules. CIS cells/modules are fabricated in a 'substrate configuration' by physical vapor deposition techniques and CdTe cells/modules are fabricated in a 'superstrate configuration' by wet chemical methods. Both laser and mechanical scribing operations are used to monolithically integrate (series interconnect) the individual cells into modules. Results will be presented at the cell and module development levels with a brief description of the test methods used to qualify these devices for space applications. The approach and development efforts are directed towards large-scale manufacturability of established thin-film, polycrystalline processing methods for large-area modules, with less emphasis on maximizing small-area efficiencies.
Future aircraft networks and schedules
NASA Astrophysics Data System (ADS)
Shu, Yan
2011-07-01
Because of the importance of air transportation scheduling, the emergence of small aircraft and the vision of future fuel-efficient aircraft, this thesis has focused on the study of aircraft scheduling and network design involving multiple types of aircraft and flight services. It develops models and solution algorithms for the schedule design problem and analyzes the computational results. First, based on the current development of small aircraft and on-demand flight services, this thesis expands a business model for integrating on-demand flight services with the traditional scheduled flight services. This thesis proposes a three-step approach to the design of aircraft schedules and networks from scratch under the model. In the first step, both a frequency assignment model for scheduled flights that incorporates a passenger path choice model and a frequency assignment model for on-demand flights that incorporates a passenger mode choice model are created. In the second step, a rough fleet assignment model that determines a set of flight legs, each of which is assigned an aircraft type and a rough departure time, is constructed. In the third step, a timetable model that determines an exact departure time for each flight leg is developed. Based on the models proposed in the three steps, this thesis creates schedule design instances that involve almost all the major airports and markets in the United States. The instances of the frequency assignment model created in this thesis are large-scale non-convex mixed-integer programming problems, and this dissertation develops an overall network structure and proposes iterative algorithms for solving these instances. The instances of both the rough fleet assignment model and the timetable model created in this thesis are large-scale mixed-integer programming problems, and this dissertation develops subproblem schemes for solving these instances. Based on these solution algorithms, this dissertation also presents computational results of these large-scale instances. To validate the models and solution algorithms developed, this thesis also compares the daily flight schedules that it designs with the schedules of the existing airlines. Furthermore, it creates instances that represent different economic and fuel-price conditions and derives schedules under these different conditions. In addition, it discusses the implications of using new aircraft in future flight schedules. Finally, future research in three areas---model, computational method, and simulation for validation---is proposed.
ERIC Educational Resources Information Center
Suldo, Shannon M.; Dedrick, Robert F.; Shaunessy-Dedrick, Elizabeth; Fefer, Sarah A.; Ferron, John
2015-01-01
Successful coping with academic demands is important given the inverse relationship between stress and positive adjustment in adolescents. The Coping With Academic Demands Scale (CADS) is a new measure of coping appropriate for students pursuing advanced high school curricula, specifically Advanced Placement (AP) classes and the International…
a Model Study of Small-Scale World Map Generalization
NASA Astrophysics Data System (ADS)
Cheng, Y.; Yin, Y.; Li, C. M.; Wu, W.; Guo, P. P.; Ma, X. L.; Hu, F. M.
2018-04-01
With globalization and rapid development, every field is taking an increasing interest in physical geography and human economics. There is a surging demand all over the world for small-scale world maps in large formats. Further study of automated mapping technology, and especially the realization of small-scale production of large-format global maps, is a key problem that the cartographic field needs to solve. In light of this, this paper adopts an improved model (with map and data separated) for map generalization, which separates geographic data from mapping data and mainly comprises a cross-platform symbol library and an automatic map-making knowledge engine. In the cross-platform symbol library, the symbols and the physical symbols in the geographic information are configured at all scale levels. The automatic map-making knowledge engine consists of 97 types, 1086 subtypes, 21845 basic algorithms and over 2500 relevant functional modules. In order to evaluate the accuracy and visual effect of our model for topographic and thematic maps, we take small-scale world map generalization as an example. After the generalization process, combining and simplifying the scattered islands makes the map clearer at the 1:2.1 billion scale, and the map features are more complete and accurate. The model not only significantly enhances map generalization at various scales, but also achieves integration among map-making at various scales, suggesting that it provides a reference for cartographic generalization at various scales.
Eavesdropping on the Arctic: Automated bioacoustics reveal dynamics in songbird breeding phenology.
Oliver, Ruth Y; Ellis, Daniel P W; Chmura, Helen E; Krause, Jesse S; Pérez, Jonathan H; Sweet, Shannan K; Gough, Laura; Wingfield, John C; Boelman, Natalie T
2018-06-01
Bioacoustic networks could vastly expand the coverage of wildlife monitoring to complement satellite observations of climate and vegetation. This approach would enable global-scale understanding of how climate change influences phenomena such as migratory timing of avian species. The enormous data sets that autonomous recorders typically generate demand automated analyses that remain largely undeveloped. We devised automated signal processing and machine learning approaches to estimate dates on which songbird communities arrived at arctic breeding grounds. Acoustically estimated dates agreed well with those determined via traditional surveys and were strongly related to the landscape's snow-free dates. We found that environmental conditions heavily influenced daily variation in songbird vocal activity, especially before egg laying. Our novel approaches demonstrate that variation in avian migratory arrival can be detected autonomously. Large-scale deployment of this innovation in wildlife monitoring would enable the coverage necessary to assess and forecast changes in bird migration in the face of climate change.
Voltage collapse in complex power grids
Simpson-Porco, John W.; Dörfler, Florian; Bullo, Francesco
2016-01-01
A large-scale power grid's ability to transfer energy from producers to consumers is constrained by both the network structure and the nonlinear physics of power flow. Violations of these constraints have been observed to result in voltage collapse blackouts, where nodal voltages slowly decline before precipitously falling. However, methods to test for voltage collapse are dominantly simulation-based, offering little theoretical insight into how grid structure influences stability margins. For a simplified power flow model, here we derive a closed-form condition under which a power network is safe from voltage collapse. The condition combines the complex structure of the network with the reactive power demands of loads to produce a node-by-node measure of grid stress, a prediction of the largest nodal voltage deviation, and an estimate of the distance to collapse. We extensively test our predictions on large-scale systems, highlighting how our condition can be leveraged to increase grid stability margins. PMID:26887284
A high-throughput assay for quantifying appetite and digestive dynamics.
Jordi, Josua; Guggiana-Nilo, Drago; Soucy, Edward; Song, Erin Yue; Lei Wee, Caroline; Engert, Florian
2015-08-15
Food intake and digestion are vital functions, and their dysregulation is fundamental for many human diseases. Current methods do not support their dynamic quantification on large scales in unrestrained vertebrates. Here, we combine an infrared macroscope with fluorescently labeled food to quantify feeding behavior and intestinal nutrient metabolism with high temporal resolution, sensitivity, and throughput in naturally behaving zebrafish larvae. Using this method and rate-based modeling, we demonstrate that zebrafish larvae match nutrient intake to their bodily demand and that larvae adjust their digestion rate, according to the ingested meal size. Such adaptive feedback mechanisms make this model system amenable to identify potential chemical modulators. As proof of concept, we demonstrate that nicotine, l-lysine, ghrelin, and insulin have analogous impact on food intake as in mammals. Consequently, the method presented here will promote large-scale translational research of food intake and digestive function in a naturally behaving vertebrate. Copyright © 2015 the American Physiological Society.
A high-throughput assay for quantifying appetite and digestive dynamics
Guggiana-Nilo, Drago; Soucy, Edward; Song, Erin Yue; Lei Wee, Caroline; Engert, Florian
2015-01-01
Food intake and digestion are vital functions, and their dysregulation is fundamental for many human diseases. Current methods do not support their dynamic quantification on large scales in unrestrained vertebrates. Here, we combine an infrared macroscope with fluorescently labeled food to quantify feeding behavior and intestinal nutrient metabolism with high temporal resolution, sensitivity, and throughput in naturally behaving zebrafish larvae. Using this method and rate-based modeling, we demonstrate that zebrafish larvae match nutrient intake to their bodily demand and that larvae adjust their digestion rate, according to the ingested meal size. Such adaptive feedback mechanisms make this model system amenable to identify potential chemical modulators. As proof of concept, we demonstrate that nicotine, l-lysine, ghrelin, and insulin have analogous impact on food intake as in mammals. Consequently, the method presented here will promote large-scale translational research of food intake and digestive function in a naturally behaving vertebrate. PMID:26108871
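As a loose illustration of what rate-based modeling of intake and digestion can look like, the toy model below feeds the gut at a constant intake rate until the meal is consumed and clears it with first-order kinetics. This is our sketch, not the authors' fitted model; all rates and units are assumed.

```python
import numpy as np

def gut_content(t_end, meal_size, intake_rate, k_digest, dt=0.1):
    """
    Toy first-order rate model: intake at `intake_rate` (a.u./min) until the
    meal is consumed, clearance proportional to gut content with rate k_digest.
    """
    times = np.arange(0, t_end, dt)
    gut, eaten, series = 0.0, 0.0, []
    for _ in times:
        intake = intake_rate if eaten < meal_size else 0.0
        eaten += intake * dt
        gut += (intake - k_digest * gut) * dt
        series.append(gut)
    return times, np.array(series)

t, g = gut_content(t_end=120, meal_size=10.0, intake_rate=1.0, k_digest=0.05)
print(f"peak gut content: {g.max():.2f} a.u. at t = {t[g.argmax()]:.1f} min")
```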
Scalable parallel distance field construction for large-scale applications
Yu, Hongfeng; Xie, Jinrong; Ma, Kwan-Liu; ...
2015-10-01
Computing distance fields is fundamental to many scientific and engineering applications. Distance fields can be used to direct analysis and reduce data. In this paper, we present a highly scalable method for computing 3D distance fields on massively parallel distributed-memory machines. A new distributed spatial data structure, named parallel distance tree, is introduced to manage the level sets of data and facilitate surface tracking over time, resulting in significantly reduced computation and communication costs for calculating the distance to the surface of interest from any spatial location. Our method supports several data types and distance metrics from real-world applications. We demonstrate its efficiency and scalability on state-of-the-art supercomputers using both large-scale volume datasets and surface models. We also demonstrate in-situ distance field computation on dynamic turbulent flame surfaces for a petascale combustion simulation. In conclusion, our work greatly extends the usability of distance fields for demanding applications.
Possible implications of large scale radiation processing of food
NASA Astrophysics Data System (ADS)
Zagórski, Z. P.
Large-scale irradiation has been discussed in terms of the share of processing cost in the final value of the improved product. Another factor taken into account is the saturation of the market with the new product. In successful projects, the share of irradiation cost is low and the demand for the improved product is covered. The limited availability of radiation sources makes it difficult to achieve even modest market saturation with correctly irradiated food. Implementing food preservation by irradiation requires a deliberate selection of those kinds of food that meet all conditions, i.e., acceptance by regulatory bodies, real improvement of quality, and economy. The last condition favours the use of low-energy electron beams. These conditions for successful processing are best fulfilled in the group of dry foods, expensive spices in particular.
Scalable Parallel Distance Field Construction for Large-Scale Applications.
Yu, Hongfeng; Xie, Jinrong; Ma, Kwan-Liu; Kolla, Hemanth; Chen, Jacqueline H
2015-10-01
Computing distance fields is fundamental to many scientific and engineering applications. Distance fields can be used to direct analysis and reduce data. In this paper, we present a highly scalable method for computing 3D distance fields on massively parallel distributed-memory machines. A new distributed spatial data structure, named parallel distance tree, is introduced to manage the level sets of data and facilitate surface tracking over time, resulting in significantly reduced computation and communication costs for calculating the distance to the surface of interest from any spatial locations. Our method supports several data types and distance metrics from real-world applications. We demonstrate its efficiency and scalability on state-of-the-art supercomputers using both large-scale volume datasets and surface models. We also demonstrate in-situ distance field computation on dynamic turbulent flame surfaces for a petascale combustion simulation. Our work greatly extends the usability of distance fields for demanding applications.
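As an aside to the record above, the core operation (computing, for every grid point, the distance to an isosurface) can be illustrated serially. The following is a minimal sketch, assuming a regular 3D scalar grid and using SciPy's Euclidean distance transform; it does not reproduce the paper's parallel distance tree or its distributed-memory machinery.

```python
# Minimal serial sketch of a 3D distance field to a level set (not the paper's
# parallel distance tree); assumes a regular grid and SciPy's EDT.
import numpy as np
from scipy import ndimage

def distance_field_to_isosurface(volume, isovalue, spacing=(1.0, 1.0, 1.0)):
    """Signed distance from every voxel to the isosurface volume == isovalue."""
    inside = volume >= isovalue
    # Distance from voxels outside the surface to the nearest inside voxel,
    # and vice versa; combining both gives a signed field.
    dist_out = ndimage.distance_transform_edt(~inside, sampling=spacing)
    dist_in = ndimage.distance_transform_edt(inside, sampling=spacing)
    return np.where(inside, -dist_in, dist_out)

# Example: spherical level set on a 64^3 grid.
x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
field = distance_field_to_isosurface(x**2 + y**2 + z**2, isovalue=0.25)
print(field.shape, field.min(), field.max())
```

A production version would distribute the grid across ranks and track the evolving surface over time, which is what the parallel distance tree described in the abstract is designed for.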
The Practical Impact of Recent Computer Advances on the Analysis and Design of Large Scale Networks
1974-06-01
Capacity Considerations," ARPA Network Information Center, Stanford Research Institute. 10. Gitman , I., R. M. VanSlyke, H. Frank: "On Splitting...281-285. 12. Gitman , I., "On : ^e Capacity of Slotted ALOHA Networks and Some Design Problems", ARPANET Network Information Center, Stanford...sum of the average demands of that population." Gitman , Van Slyke, and Frank [3], have addressed the problem of splitting a channel between two
Assessment of the Study of Army Logistics 1981. Volume II. Analysis of Recommendations.
1983-02-01
conceived. This third generation equipment, because of its size, cost and processing characteristics, demands large scale integrated processing with a... generated by DS4. Three systems changes to SAILS ABX have been implemented which reduce the volume of supply status provided to the DS4 system. 15... generated by the wholesale system by 50 percent or nearly 1,000,000 transactions per month. Additional reductions will be generated by selected status
Zamosky, Lisa
2014-01-10
Access: It's one word that may ultimately reignite the expansion of retail medicine in 2014 and beyond. CVS Caremark has added 200 new clinics since 2011, with another 850 planned by 2017. While it's still too soon to predict a large-scale national expansion in clinic numbers, some experts believe their calling card--convenience--should be a consideration for every medical practice in the United States.
Theta, mental flexibility, and post-traumatic stress disorder: connecting in the parietal cortex.
Dunkley, Benjamin T; Sedge, Paul A; Doesburg, Sam M; Grodecki, Richard J; Jetly, Rakesh; Shek, Pang N; Taylor, Margot J; Pang, Elizabeth W
2015-01-01
Post-traumatic stress disorder (PTSD) is a mental health injury characterised by re-experiencing, avoidance, numbing and hyperarousal. Whilst the aetiology of the disorder is relatively well understood, there is debate about the prevalence of cognitive sequelae that manifest in PTSD. In particular, there are conflicting reports about deficits in executive function and mental flexibility. Even less is known about the neural changes that underlie such deficits. Here, we used magnetoencephalography to study differences in functional connectivity during a mental flexibility task in combat-related PTSD (all males, mean age = 37.4, n = 18) versus a military control (all males, mean age = 33.05, n = 19) group. We observed large-scale increases in theta connectivity in the PTSD group compared to controls. The PTSD group performance was compromised in the more attentionally-demanding task and this was characterised by 'late-stage' theta hyperconnectivity, concentrated in network connections involving right parietal cortex. Furthermore, we observed significant correlations with the connectivity strength in this region with a number of cognitive-behavioural outcomes, including measures of attention, depression and anxiety. These findings suggest atypical coordination of neural synchronisation in large scale networks contributes to deficits in mental flexibility for PTSD populations in timed, attentionally-demanding tasks, and this propensity toward network hyperconnectivity may play a more general role in the cognitive sequelae evident in this disorder.
Craig, Adam T; Joshua, Cynthia A; Sio, Alison R; Teobasi, Bobby; Dofai, Alfred; Dalipanda, Tenneth; Hardie, Kate; Kaldor, John; Kolbe, Anthony
2018-01-01
Between August 2016 and April 2017, Solomon Islands experienced the largest and longest-running dengue outbreak on record in the country, with 12,329 suspected cases, 877 hospitalisations and 16 deaths. We conducted a retrospective review of related data and documents and carried out key informant interviews to characterise the event and investigate the adaptability of syndromic surveillance for enhanced and expanded data collection during a public health emergency in a low-resource country setting. While the outbreak quickly consumed available public and clinical resources, we found that authorities were able to scale up the conventional national syndrome-based early warning surveillance system to support the increased information demands during the event, demonstrating the flexibility of the system and of syndromic surveillance more broadly. Challenges in scaling up included upskilling and assisting staff with no previous experience of the tasks required; managing large volumes of data; maintaining data quality for the duration of the outbreak; harmonising routine and enhanced surveillance data and maintaining surveillance for other diseases; producing information optimally useful for response planning; and managing staff fatigue. Solomon Islands, along with other countries of the region, remains vulnerable to outbreaks of dengue and other communicable diseases. Ensuring surveillance systems are robust and able to adapt to changing demands during emergencies should be a health protection priority.
A weather regime characterisation of Irish wind generation and electricity demand in winters 2009–11
NASA Astrophysics Data System (ADS)
Cradden, Lucy C.; McDermott, Frank
2018-05-01
Prolonged cold spells were experienced in Ireland in the winters of 2009–10 and 2010–11, and electricity demand was relatively high at these times, whilst wind generation capacity factors were low. Such situations can cause difficulties for an electricity system with a high dependence on wind energy. Studying the atmospheric conditions associated with these two winters offers insights into the large-scale drivers for cold, calm spells, and helps to evaluate if they are rare events over the long-term. The influence of particular atmospheric patterns on coincidental winter wind generation and weather-related electricity demand is investigated here, with a focus on blocking in the North Atlantic/European sector. The occurrences of such patterns in the 2009–10 and 2010–11 winters are examined, and 2010–11 in particular was found to be unusual in a long-term context. The results are discussed in terms of the relevance to long-term planning and investment in the electricity system.
Cunningham, James K; Liu, Lon-Mu
2008-04-01
Research is needed to help treatment programs plan for the impacts of drug suppression efforts. Studies to date indicate that heroin suppression may increase treatment demand. This study examines whether treatment demand was impacted by a major US methamphetamine suppression policy -- legislation regulating precursor chemicals. The precursors ephedrine and pseudoephedrine, in forms used by large-scale methamphetamine producers, were regulated in August 1995 and October 1997, respectively. ARIMA-intervention time-series analysis was used to examine the impact of each precursor's regulation on monthly voluntary methamphetamine treatment admissions (a measure of treatment demand), including first-time admissions and re-admissions, in California (1992-2004). Cocaine, heroin, and alcohol treatment admissions were used as quasi-control series. The 1995 regulation of ephedrine was found to be associated with a significant reduction in methamphetamine treatment admissions that lasted approximately 2 years. The 1997 regulation of pseudoephedrine was associated with a significant reduction that lasted approximately 4 years. First-time admissions declined more than re-admissions. Cocaine, heroin, and alcohol admissions were generally unaffected. While heroin suppression may be associated with increased treatment demand as suggested by research to date, this study indicates that methamphetamine precursor regulation was associated with decreases in treatment demand. A possible explanation is that, during times of suppression, heroin users may seek treatment to obtain substitute drugs (e.g., methadone), while methamphetamine users have no comparable incentive. Methamphetamine suppression may particularly impact treatment demand among newer users, as indicated by larger declines in first-time admissions.
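The ARIMA-intervention design described above can be sketched with statsmodels by adding step-function regressors at the regulation dates. The series below is synthetic and the model order is an illustrative placeholder, not the study's specification.

```python
# Hedged sketch of an ARIMA-intervention analysis: regress monthly admissions
# on step variables marking the 1995 and 1997 precursor regulations.
# The series and model order below are synthetic placeholders, not the study's.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

idx = pd.date_range("1992-01", "2004-12", freq="MS")
rng = np.random.default_rng(0)
admissions = pd.Series(1000 + rng.normal(0, 50, len(idx)), index=idx)

# Step interventions: 0 before the regulation month, 1 afterwards.
step_1995 = (idx >= "1995-08-01").astype(float)
step_1997 = (idx >= "1997-10-01").astype(float)
exog = np.column_stack([step_1995, step_1997])

model = SARIMAX(admissions, exog=exog, order=(1, 1, 1))
result = model.fit(disp=False)
print(result.params)  # the exog coefficients estimate the level shift after each regulation
```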
Role of the Freight Sector in Future Climate Change Mitigation Scenarios
Muratori, Matteo; Smith, Steven J.; Kyle, Page; ...
2017-02-27
The freight sector's role is examined using the Global Change Assessment Model (GCAM) for a range of climate change mitigation scenarios and future freight demand assumptions. Energy usage and CO2 emissions from freight have historically grown with a correlation to GDP, and there is limited evidence of near-term global decoupling of freight demand from GDP. Over the 21st century, greenhouse gas (GHG) emissions from freight are projected to grow faster than passenger transportation or other major end-use sectors, with the magnitude of growth dependent on the assumed extent of long-term decoupling. In climate change mitigation scenarios that apply a price to GHG emissions, mitigation of freight emissions (including the effects of demand elasticity, mode and technology shifting, and fuel substitution) is more limited than for other demand sectors. In such scenarios, shifting to less-emitting transportation modes and technologies is projected to play a relatively small role in reducing freight emissions in GCAM. By contrast, changes in the supply chain of liquid fuels that reduce the fuel carbon intensity, especially deriving from large-scale use of biofuels coupled to carbon capture and storage technologies, are responsible for the majority of freight emissions mitigation, followed by price-induced reduction in freight demand services.
Role of the Freight Sector in Future Climate Change Mitigation Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muratori, Matteo; Smith, Steven J.; Kyle, Page
The freight sector's role is examined using the Global Change Assessment Model (GCAM) for a range of climate change mitigation scenarios and future freight demand assumptions. Energy usage and CO2 emissions from freight have historically grown with a correlation to GDP, and there is limited evidence of near-term global decoupling of freight demand from GDP. Over the 21st century, greenhouse gas (GHG) emissions from freight are projected to grow faster than passenger transportation or other major end-use sectors, with the magnitude of growth dependent on the assumed extent of long-term decoupling. In climate change mitigation scenarios that apply a price to GHG emissions, mitigation of freight emissions (including the effects of demand elasticity, mode and technology shifting, and fuel substitution) is more limited than for other demand sectors. In such scenarios, shifting to less-emitting transportation modes and technologies is projected to play a relatively small role in reducing freight emissions in GCAM. By contrast, changes in the supply chain of liquid fuels that reduce the fuel carbon intensity, especially deriving from large-scale use of biofuels coupled to carbon capture and storage technologies, are responsible for the majority of freight emissions mitigation, followed by price-induced reduction in freight demand services.
Role of the Freight Sector in Future Climate Change Mitigation Scenarios.
Muratori, Matteo; Smith, Steven J; Kyle, Page; Link, Robert; Mignone, Bryan K; Kheshgi, Haroon S
2017-03-21
The freight sector's role is examined using the Global Change Assessment Model (GCAM) for a range of climate change mitigation scenarios and future freight demand assumptions. Energy usage and CO2 emissions from freight have historically grown with a correlation to GDP, and there is limited evidence of near-term global decoupling of freight demand from GDP. Over the 21st century, greenhouse gas (GHG) emissions from freight are projected to grow faster than passenger transportation or other major end-use sectors, with the magnitude of growth dependent on the assumed extent of long-term decoupling. In climate change mitigation scenarios that apply a price to GHG emissions, mitigation of freight emissions (including the effects of demand elasticity, mode and technology shifting, and fuel substitution) is more limited than for other demand sectors. In such scenarios, shifting to less-emitting transportation modes and technologies is projected to play a relatively small role in reducing freight emissions in GCAM. By contrast, changes in the supply chain of liquid fuels that reduce the fuel carbon intensity, especially deriving from large-scale use of biofuels coupled to carbon capture and storage technologies, are responsible for the majority of freight emissions mitigation, followed by price-induced reduction in freight demand services.
NASA Astrophysics Data System (ADS)
Neverre, Noémie; Dumas, Patrice
2014-05-01
The aim is to be able to assess future domestic water demands in a region with heterogeneous levels of economic development. This work offers an original combination of a quantitative projection of demands (similar to WaterGAP methodology) and an estimation of the marginal benefit of water. This method is applicable to different levels of economic development and usable for large-scale hydroeconomic modelling. The global method consists of building demand functions taking into account the impact of both the price of water and the level of equipment, proxied by economic development, on domestic water demand. Our basis is a 3-block inverse demand function: the first block consists of essential water requirements for food and hygiene; the second block matches intermediate needs; and the last block corresponds to additional water consumption, such as outdoor uses, which are the least valued. The volume of the first block is fixed to match recommended basic water requirements from the literature, but we assume that the volume limits of blocks 2 and 3 depend on the level of household equipment and therefore evolve with the level of GDP per capita (structural change), with a saturation. For blocks 1 and 2 we determine the value of water from elasticity, price and quantity data from the literature, using the point-extension method. For block 3, we use a hypothetical zero-cost demand and maximal demand with actual water costs to linearly interpolate the inverse demand function. These functions are calibrated on the 24 countries of the Mediterranean basin using data from SIMEDD, and are used for the projection and valuation of domestic water demands at the 2050 horizon. They enable projection of total water demand, as well as the respective shares of the different categories of demand (basic demand, intermediate demand and additional uses). These projections are performed under different combined scenarios of population, GDP and water costs.
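A minimal sketch of a 3-block inverse demand (marginal value) function of the kind described above; the block limits and values are illustrative placeholders rather than the calibrated SIMEDD figures.

```python
# Illustrative 3-block inverse demand function for domestic water: marginal
# value (price) as a function of consumed volume. Block limits and values are
# placeholders; the study calibrates them from literature and GDP data.
def marginal_value(q, q1=50.0, q2=120.0, q3=250.0, v1=5.0, v2=1.0, v3_max=0.3):
    """Marginal benefit of water (currency per m3) at consumption q (L/cap/day).

    Block 1 (0..q1): essential uses, valued at v1.
    Block 2 (q1..q2): intermediate uses, valued at v2.
    Block 3 (q2..q3): additional uses, value interpolated linearly from v3_max to 0.
    """
    if q <= q1:
        return v1
    if q <= q2:
        return v2
    if q <= q3:
        # Linear interpolation between the start of the block (q2) and a
        # hypothetical zero-cost demand (q3).
        return v3_max * (q3 - q) / (q3 - q2)
    return 0.0

for q in (30, 80, 150, 300):
    print(q, marginal_value(q))
```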
Diagnosing phosphorus limitations in natural terrestrial ecosystems in carbon cycle models
Sun, Yan; Peng, Shushi; Goll, Daniel S.; ...
2017-04-28
Most of the Earth System Models (ESMs) project increases in net primary productivity (NPP) and terrestrial carbon (C) storage during the 21st century. Despite empirical evidence that limited availability of phosphorus (P) may limit the response of NPP to increasing atmospheric CO2, none of the ESMs used in the previous Intergovernmental Panel on Climate Change assessment accounted for P limitation. We diagnosed from ESM simulations the amount of P needed to support increases in carbon uptake by natural ecosystems using two approaches: the demand derived from changes in C stocks and the demand derived from changes in NPP. The C stock-based additional P demand was estimated to range between -31 and 193 Tg P and between -89 and 262 Tg P for Representative Concentration Pathway (RCP) 2.6 and RCP8.5, respectively, with negative values indicating a P surplus. The NPP-based demand, which takes ecosystem P recycling into account, results in a significantly higher P demand of 648–1606 Tg P for RCP2.6 and 924–2110 Tg P for RCP8.5. We found that the P demand is sensitive to the turnover of P in decomposing plant material, explaining the large differences between the NPP-based demand and C stock-based demand. The discrepancy between diagnosed P demand and actual P availability (potential P deficit) depends mainly on the assumptions about availability of the different soil P forms. Altogether, future P limitation strongly depends on both soil P availability and P recycling on the ecosystem scale.
Ovretveit, John; Klazinga, Niek
2013-02-01
Both public and private health and social care services are facing increased and changing demands to improve quality and reduce costs. To enable local services to respond to these demands, governments and other organisations have established large scale improvement programmes. These usually seek to enable many services to make changes to apply proven improvements and to make use of quality improvement methods. The purpose of this paper is to provide an empirical description of how one organisation coordinated ten national improvement programmes between 2004 and 2010. It provides details which may be useful to others seeking to plan and implement such programmes, and also contributes to the understanding of knowledge translation and of network governance. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Mendel-GPU: haplotyping and genotype imputation on graphics processing units
Chen, Gary K.; Wang, Kai; Stram, Alex H.; Sobel, Eric M.; Lange, Kenneth
2012-01-01
Motivation: In modern sequencing studies, one can improve the confidence of genotype calls by phasing haplotypes using information from an external reference panel of fully typed unrelated individuals. However, the computational demands are so high that they prohibit researchers with limited computational resources from haplotyping large-scale sequence data. Results: Our graphics processing unit based software delivers haplotyping and imputation accuracies comparable to competing programs at a fraction of the computational cost and peak memory demand. Availability: Mendel-GPU, our OpenCL software, runs on Linux platforms and is portable across AMD and nVidia GPUs. Users can download both code and documentation at http://code.google.com/p/mendel-gpu/. Contact: gary.k.chen@usc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22954633
Contextualizing Embodied Resources in Global Food Trade
NASA Astrophysics Data System (ADS)
MacDonald, G. K.; Brauman, K. A.; Sun, S.; West, P. C.; Carlson, K. M.; Cassidy, E. S.; Gerber, J. S.; Ray, D. K.
2014-12-01
Trade in agricultural commodities has created increasingly complex linkages between resource use and food supplies across national borders. Understanding the degree to which food production and consumption relies on trade is vital to understanding how to sustainably meet growing food demands across scales. We use detailed bilateral trade statistics and data on agricultural management to examine the land use and water consumption embodied in agricultural trade, which we relate to basic nutritional indicators to show how trade contributes to food availability worldwide. Agricultural trade carries enough calories to provide >1.7 billion people a basic diet each year. We identify key commodities and producer-consumer relationships that disproportionately contribute to embodied resource use and flows of food nutrition at the global scale. For example, just 15 disproportionately large soybean trades comprised ~10% of the total harvested area embodied in export production. We conclude by framing these results in terms of the fraction of each country's food production and consumption that is linked to international trade. These findings help to characterize how countries allocate resources to domestic versus foreign food demand.
Final Project Report. Scalable fault tolerance runtime technology for petascale computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnamoorthy, Sriram; Sadayappan, P
With the massive number of components comprising the forthcoming petascale computer systems, hardware failures will be routinely encountered during execution of large-scale applications. Due to the multidisciplinary, multiresolution, and multiscale nature of scientific problems that drive the demand for high end systems, applications place increasingly differing demands on the system resources: disk, network, memory, and CPU. In addition to MPI, future applications are expected to use advanced programming models such as those developed under the DARPA HPCS program as well as existing global address space programming models such as Global Arrays, UPC, and Co-Array Fortran. While there has been a considerable amount of work in fault-tolerant MPI, with a number of strategies and extensions for fault tolerance proposed, virtually none of the advanced models proposed for emerging petascale systems is currently fault aware. To achieve fault tolerance, development of underlying runtime and OS technologies able to scale to the petascale level is needed. This project has evaluated a range of runtime techniques for fault tolerance for advanced programming models.
High performance cellular level agent-based simulation with FLAME for the GPU.
Richmond, Paul; Walker, Dawn; Coakley, Simon; Romano, Daniela
2010-05-01
Driven by the availability of experimental data and ability to simulate a biological scale which is of immediate interest, the cellular scale is fast emerging as an ideal candidate for middle-out modelling. As with 'bottom-up' simulation approaches, cellular level simulations demand a high degree of computational power, which in large-scale simulations can only be achieved through parallel computing. The flexible large-scale agent modelling environment (FLAME) is a template driven framework for agent-based modelling (ABM) on parallel architectures ideally suited to the simulation of cellular systems. It is available for both high performance computing clusters (www.flame.ac.uk) and GPU hardware (www.flamegpu.com) and uses a formal specification technique that acts as a universal modelling format. This not only creates an abstraction from the underlying hardware architectures, but avoids the steep learning curve associated with programming them. In benchmarking tests and simulations of advanced cellular systems, FLAME GPU has reported massive improvement in performance over more traditional ABM frameworks. This allows the time spent in the development and testing stages of modelling to be drastically reduced and creates the possibility of real-time visualisation for simple visual face-validation.
Simulation-optimization of large agro-hydrosystems using a decomposition approach
NASA Astrophysics Data System (ADS)
Schuetze, Niels; Grundmann, Jens
2014-05-01
In this contribution, a stochastic simulation-optimization framework for decision support for optimal planning and operation of water supply of large agro-hydrosystems is presented. It is based on a decomposition solution strategy which allows for (i) the usage of numerical process models together with efficient Monte Carlo simulations for a reliable estimation of higher quantiles of the minimum agricultural water demand for full and deficit irrigation strategies at small scale (farm level), and (ii) the utilization of the optimization results at small scale for solving water resources management problems at regional scale. As a secondary result of several simulation-optimization runs at the smaller scale, stochastic crop-water production functions (SCWPFs) for different crops are derived, which can be used as a basic tool for assessing the impact of climate variability on the risk to potential yield. In addition, microeconomic impacts of climate change and the vulnerability of the agro-ecological systems are evaluated. The developed methodology is demonstrated through its application to a real-world case study for the South Al-Batinah region in the Sultanate of Oman, where a coastal aquifer is affected by saltwater intrusion due to excessive groundwater withdrawal for irrigated agriculture.
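The Monte Carlo step in (i) amounts to estimating a high quantile of the minimum water demand over many stochastic farm-scale runs. A minimal sketch follows, with a random stand-in for the numerical process model.

```python
# Sketch of the Monte Carlo step: estimate a high quantile (e.g., the 90th
# percentile) of minimum irrigation water demand over many stochastic runs.
# The "crop model" here is a random stand-in for the numerical process model.
import numpy as np

def simulate_min_water_demand(rng):
    # Placeholder for one farm-scale simulation-optimization run (mm/season).
    return rng.gamma(shape=8.0, scale=50.0)

rng = np.random.default_rng(42)
demands = np.array([simulate_min_water_demand(rng) for _ in range(10_000)])
q90 = np.quantile(demands, 0.90)
print(f"90% quantile of minimum seasonal water demand: {q90:.1f} mm")
```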
Cannabis cultivation in Spain: A profile of plantations, growers and production systems.
Alvarez, Arturo; Gamella, Juan F; Parra, Iván
2016-11-01
The European market for cannabis derivatives is being transformed. The cultivation of cannabis within the EU and the shift of demand from hashish to domestic marihuana are key aspects of this transformation. Spain, formerly central to the trade of Moroccan hashish, is becoming a marihuana-producing country. The emergence of "import-substitution" has been researched in other EU countries, but thus far the Spanish case remains undocumented. This paper is based on analysis of data of 748 cannabis plantations seized by Spanish police in 2013. The sample comprises reports of seizures identified through a survey of online news and police reports. "Event-analysis" methods were applied to these sources. The analysis offers a typology of plantations, a profile of participants and the different production systems, and a model of regional distribution. Half of the plantations were small (less than 42 plants) and half contained between 100 and 1000 plants, with an average size of 261 plants. About three-quarters of plants were cultivated indoors using stolen electricity. 86% of all plants seized were from large-scale plantations (more than 220 plants). Most plantations were located along the Mediterranean coast, where population and tourism are concentrated. Over three-quarters of those indicted by police were Spanish (85%). Among the foreign owners of big plantations, Dutch nationals predominated. The number of seized plants by province was directly associated with the number of grow shops (β=0.962, p<0.001). The rise of large-scale cannabis plantations in the Spanish Mediterranean coast is increasingly replacing import of Moroccan hashish. Indoor cultivation supported by grow shops, that provide the technology and know-how, seem to be the dominant form of organization in this emerging industry. Large-scale plantations may have met most of the demand for marihuana in 2013. Copyright © 2016 Elsevier B.V. All rights reserved.
Ge, Yuan; Wang, Xiaochang; Zheng, Yucong; Dzakpasu, Mawuli; Zhao, Yaqian; Xiong, Jiaqing
2015-09-01
The choice of substrates with high adsorption capacity, yet readily available and economical, is vital for sustainable pollutant removal in constructed wetlands (CWs). Two identical large-scale demonstration horizontal subsurface flow (HSSF) CWs (surface area, 340 m²; depth, 0.6 m; HLR, 0.2 m/day) with gravel or slag substrates were evaluated for their potential use in remediating polluted urban river water in the prevailing climate of northwest China. Batch experiments to elucidate phosphorus adsorption mechanisms indicated a higher adsorption capacity of slag (3.15 g/kg) than gravel (0.81 g/kg), whereby circa 20 % more total phosphorus (TP) removal was recorded in HSSF-slag than HSSF-gravel. TP removal occurred predominantly via CaO-slag dissolution followed by Ca phosphate precipitation. Moreover, average removals of chemical oxygen demand and biochemical oxygen demand were approximately 10 % higher in HSSF-slag than HSSF-gravel. Nevertheless, TP adsorption by slag seemed to saturate quickly over the monitoring period, and the removal efficiency of the HSSF-slag approached that of the HSSF-gravel after 1-year continuous operation. In contrast, the two CWs achieved similar nitrogen removal during the 2-year monitoring period. Findings also indicated that gravel provided better support for the development of other wetland components such as biomass, whereby the biomass production and the amount of total nitrogen (TN; 43.1-59.0 g/m²) and TP (4.15-5.75 g/m²) assimilated by local Phragmites australis in HSSF-gravel were higher than that in HSSF-slag (41.2-52.0 g/m² and 3.96-4.07 g/m², respectively). Overall, comparable pollutant removal rates could be achieved in large-scale HSSF CWs with either gravel or slag as substrate and provide a possible solution for polluted urban river remediation in northern China.
Cork, Randy D.; Detmer, William M.; Friedman, Charles P.
1998-01-01
This paper describes details of four scales of a questionnaire—“Computers in Medical Care”—measuring attributes of computer use, self-reported computer knowledge, computer feature demand, and computer optimism of academic physicians. The reliability (i.e., precision, or degree to which the scale's result is reproducible) and validity (i.e., accuracy, or degree to which the scale actually measures what it is supposed to measure) of each scale were examined by analysis of the responses of 771 full-time academic physicians across four departments at five academic medical centers in the United States. The objectives of this paper were to define the psychometric properties of the scales as the basis for a future demonstration study and, pending the results of further validity studies, to provide the questionnaire and scales to the medical informatics community as a tool for measuring the attitudes of health care providers. Methodology: The dimensionality of each scale and degree of association of each item with the attribute of interest were determined by principal components factor analysis with orthogonal varimax rotation. Weakly associated items (factor loading <.40) were deleted. The reliability of each resultant scale was computed using Cronbach's alpha coefficient. Content validity was addressed during scale construction; construct validity was examined through factor analysis and by correlational analyses. Results: Attributes of computer use, computer knowledge, and computer optimism were unidimensional, with the corresponding scales having reliabilities of .79, .91, and .86, respectively. The computer-feature demand attribute differentiated into two dimensions: the first reflecting demand for high-level functionality with reliability of .81 and the second reflecting demand for usability with reliability of .69. There were significant positive correlations between computer use, computer knowledge, and computer optimism scale scores and respondents' hands-on computer use, computer training, and self-reported computer sophistication. In addition, items posited on the computer knowledge scale to be more difficult generated significantly lower scores. Conclusion: The four scales of the questionnaire appear to measure with adequate reliability five attributes of academic physicians' attitudes toward computers in medical care: computer use, self-reported computer knowledge, demand for computer functionality, demand for computer usability, and computer optimism. Results of initial validity studies are positive, but further validation of the scales is needed. The URL of a downloadable HTML copy of the questionnaire is provided. PMID:9524349
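The reliability figures quoted above are Cronbach's alpha coefficients. A minimal sketch of the computation on a synthetic respondents-by-items matrix (not the survey's data):

```python
# Minimal Cronbach's alpha computation on a respondents-by-items matrix of
# Likert-style responses. The data below are synthetic.
import numpy as np

def cronbach_alpha(items):
    """items: 2D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
trait = rng.normal(size=(200, 1))                          # latent attitude
responses = trait + rng.normal(scale=0.8, size=(200, 6))   # 6 correlated items
print(round(cronbach_alpha(responses), 2))
```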
NASA Astrophysics Data System (ADS)
Evans, J. D.; Tislin, D.
2017-12-01
Observations from the Joint Polar Satellite System (JPSS) support National Weather Service (NWS) forecasters, whose Advanced Weather Interactive Processing System (AWIPS) Data Delivery (DD) will access JPSS data products on demand from the National Environmental Satellite, Data, and Information Service (NESDIS) Product Distribution and Access (PDA) service. Based on the Open Geospatial Consortium (OGC) Web Coverage Service, this on-demand service promises broad interoperability and frugal use of data networks by serving only the data that a user needs. But the volume, velocity, and variety of JPSS data products impose several challenges to such a service. It must be efficient to handle large volumes of complex, frequently updated data, and to fulfill many concurrent requests. It must offer flexible data handling and delivery, to work with a diverse and changing collection of data, and to tailor its outputs into products that users need, with minimal coordination between provider and user communities. It must support 24x7 operation, with no pauses in incoming data or user demand; and it must scale to rapid changes in data volume, variety, and demand as new satellites launch, more products come online, and users rely increasingly on the service. We are addressing these challenges in order to build an efficient and effective on-demand JPSS data service. For example, on-demand subsetting by many users at once may overload a server's processing capacity or its disk bandwidth - unless alleviated by spatial indexing, geolocation transforms, or pre-tiling and caching. Filtering by variable (/ band / layer) may also alleviate network loads, and provide fine-grained variable selection; to that end we are investigating how best to provide random access into the variety of spatiotemporal JPSS data products. Finally, producing tailored products (derivatives, aggregations) can boost flexibility for end users; but some tailoring operations may impose significant server loads. Operating this service in a cloud computing environment allows cost-effective scaling during the development and early deployment phases - and perhaps beyond. We will discuss how NESDIS and NWS are assessing and addressing these challenges to provide timely and effective access to JPSS data products for weather forecasters throughout the country.
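An on-demand subsetting request of the kind discussed above can be expressed as an OGC WCS 2.0 GetCoverage call with spatial and range subsets. The endpoint URL and coverage identifier below are hypothetical; only the parameter names follow the WCS 2.0 KVP encoding.

```python
# Illustrative OGC WCS 2.0.1 GetCoverage request with spatial subsetting and a
# range (variable) subset. The endpoint and coverage identifier are hypothetical.
import requests

params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "GetCoverage",
    "coverageId": "viirs_sst_example",           # hypothetical JPSS product id
    "subset": ["Lat(35,45)", "Long(-110,-95)"],  # only the region of interest
    "rangeSubset": "sea_surface_temperature",    # only the variable needed
    "format": "application/x-netcdf",
}
resp = requests.get("https://pda.example.gov/wcs", params=params, timeout=60)
resp.raise_for_status()
open("subset.nc", "wb").write(resp.content)
```

Serving only the requested spatial extent and variable is what keeps network and processing loads manageable under many concurrent forecaster requests.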
Allometric Scaling of the Active Hematopoietic Stem Cell Pool across Mammals
Dingli, David; Pacheco, Jorge M.
2006-01-01
Background Many biological processes are characterized by allometric relations of the type Y = Y0·M^b between an observable Y and body mass M, which pervade at multiple levels of organization. As regards the hematopoietic stem cell pool, there is experimental evidence that the size of the hematopoietic stem cell pool is conserved in mammals. However, demands for blood cell formation vary across mammals and thus the size of the active stem cell compartment could vary across species. Methodology/Principal Findings Here we investigate the allometric scaling of the hematopoietic system in a large group of mammalian species using reticulocyte counts as a marker of the active stem cell pool. Our model predicts that the total number of active stem cells, in an adult mammal, scales with body mass with an exponent of ¾. Conclusion/Significance The scaling predicted here provides an intuitive justification of the Hayflick hypothesis and supports the current view of a small active stem cell pool supported by a large, quiescent reserve. The present scaling shows excellent agreement with the available (indirect) data for smaller mammals. The small size of the active stem cell pool enhances the role of stochastic effects in the overall dynamics of the hematopoietic system. PMID:17183646
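The allometric exponent b in Y = Y0·M^b is conventionally recovered by a log-log fit. A sketch on synthetic data standing in for reticulocyte-based estimates:

```python
# Sketch of recovering an allometric exponent b in Y = Y0 * M**b by a log-log
# least-squares fit. The "data" are synthetic points generated with b = 0.75,
# standing in for reticulocyte-based estimates of the active stem cell pool.
import numpy as np

rng = np.random.default_rng(7)
mass = np.logspace(-2, 3, 30)                                   # body mass, kg
y = 10.0 * mass**0.75 * np.exp(rng.normal(0, 0.1, mass.size))   # noisy Y = Y0 M^b

b, log_y0 = np.polyfit(np.log(mass), np.log(y), 1)
print(f"fitted exponent b = {b:.2f}, Y0 = {np.exp(log_y0):.1f}")
```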
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, A.; Han, T. Y.
Cuprous oxide is a p-type semiconducting material that has been highly researched for its interesting properties. Many small-scale syntheses have exhibited excellent control over size and morphology. As the demand for cuprous oxide grows, the synthesis method needs to evolve to facilitate large-scale production. This paper supplies a facile bulk synthesis method for Cu₂O: on average, a 1-liter reaction volume can produce 1 gram of particles. In order to study the shape and size control mechanisms on such a scale, the reaction volume was diminished to 250 mL, producing on average 0.3 grams of nanoparticles per batch. Well-shaped nanoparticles have been synthesized using an aqueous solution of CuCl₂, NaOH, SDS surfactant, and NH₂OH-HCl at mild temperatures. The time allotted between the addition of NaOH and NH₂OH-HCl was determined to be critical for Cu(OH)₂ production, an important precursor to the final product. The effects of stirring rate on a large scale were also analyzed during reagent addition and post reagent addition. A morphological change from rhombic dodecahedra to spheres occurred as the stirring speed was increased. The effects of NH₂OH-HCl concentration were also studied to control the etching effects on the final product.
Scale-Free Networks and Commercial Air Carrier Transportation in the United States
NASA Technical Reports Server (NTRS)
Conway, Sheila R.
2004-01-01
Network science, or the art of describing system structure, may be useful for the analysis and control of large, complex systems. For example, networks exhibiting scale-free structure have been found to be particularly well suited to deal with environmental uncertainty and large demand growth. The National Airspace System may be, at least in part, a scalable network. In fact, the hub-and-spoke structure of the commercial segment of the NAS is an often-cited example of an existing scale-free network. After reviewing the nature and attributes of scale-free networks, this assertion is put to the test: is commercial air carrier transportation in the United States well explained by this model? If so, are the positive attributes of these networks, e.g. those of efficiency, flexibility and robustness, fully realized, or could we effect substantial improvement? This paper first outlines attributes of various network types, then looks more closely at the common carrier air transportation network from perspectives of the traveler, the airlines, and Air Traffic Control (ATC). Network models are applied within each paradigm, including discussion of implied strengths and weaknesses of each model. Finally, known limitations of scalable networks are discussed. With an eye towards NAS operations, utilizing the strengths and avoiding the weaknesses of scale-free networks are addressed.
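A toy illustration of the scale-free model invoked above: a Barabási-Albert graph generated with networkx exhibits the heavy-tailed degree distribution (a few large hubs, many spokes) that the hub-and-spoke analogy relies on. The graph below is synthetic, not actual route data.

```python
# Toy illustration of a scale-free (Barabasi-Albert) network and its heavy-tailed
# degree distribution, as a stand-in for a hub-and-spoke air carrier route map.
import collections
import networkx as nx

g = nx.barabasi_albert_graph(n=500, m=2, seed=0)   # 500 "airports", preferential attachment
degrees = [d for _, d in g.degree()]
hist = collections.Counter(degrees)

# A few large hubs, many low-degree spokes.
for k in sorted(hist)[:5] + sorted(hist)[-3:]:
    print(f"degree {k:3d}: {hist[k]} nodes")
print("max degree (largest hub):", max(degrees))
```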
Development of mpi_EPIC model for global agroecosystem modeling
Kang, Shujiang; Wang, Dali; Nichols, Jeff A.; ...
2014-12-31
Models that address policy-maker concerns about multi-scale effects of food and bioenergy production systems are computationally demanding. We integrated the message passing interface algorithm into the process-based EPIC model to accelerate computation of ecosystem effects. Simulation performance was further enhanced by applying the Vampir framework. When this enhanced mpi_EPIC model was tested, total execution time for a global 30-year simulation of a switchgrass cropping system was shortened to less than 0.5 hours on a supercomputer. The results illustrate that mpi_EPIC using parallel design can balance simulation workloads and facilitate large-scale, high-resolution analysis of agricultural production systems, management alternatives and environmental effects.
Moderating diets to feed the future
NASA Astrophysics Data System (ADS)
Davis, Kyle F.; D'Odorico, Paolo; Rulli, Maria Cristina
2014-10-01
Population growth, dietary changes, and increasing biofuel use are placing unprecedented pressure on the global food system. While this demand likely cannot be met by expanding agricultural lands, much of the world's cropland can attain higher crop yields. Therefore, it is important to examine whether increasing crop productivity to the maximum attainable yield (i.e., yield gap closure) alone can substantially improve food security at global and national scales. Here we show that closing yield gaps through conventional technological development (i.e., fertilizers and irrigation) can potentially meet future global demand if diets are moderated and crop-based biofuel production is limited. In particular, we find that increases in dietary demand will be largely to blame should crop production fall short of demand. In converting projected diets to a globally adequate diet (3000 kcal/cap/d; 20% animal kcal) under current agrofuel use, we find that 1.8-2.6 billion additional people can be fed in 2030 and 2.1-3.1 billion additional people in 2050, depending on the extent to which yields can improve in those periods. Therefore, the simple combination of yield gap closure and moderating diets offers promise for feeding the world's population but only if long-term sustainability is the focus.
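The "additional people fed" figures rest on simple arithmetic: divide an assumed increase in available food calories by the globally adequate diet of 3000 kcal/cap/day. A sketch with a hypothetical surplus figure, not the study's estimate:

```python
# Back-of-envelope arithmetic behind "people fed" figures: convert an assumed
# increase in available food calories into additional people supplied with a
# 3000 kcal/cap/day diet. The surplus figure is a hypothetical placeholder.
ADEQUATE_DIET_KCAL_PER_DAY = 3000    # globally adequate diet used in the study
extra_kcal_per_year = 2.5e15         # hypothetical additional kcal/yr from yield gap closure

additional_people = extra_kcal_per_year / (ADEQUATE_DIET_KCAL_PER_DAY * 365)
print(f"{additional_people / 1e9:.1f} billion additional people fed")
```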
Communication architecture for large geostationary platforms
NASA Technical Reports Server (NTRS)
Bond, F. E.
1979-01-01
Large platforms have been proposed for supporting multipurpose communication payloads to exploit economy of scale, reduce congestion in the geostationary orbit, provide interconnectivity between diverse earth stations, and obtain significant frequency reuse with large multibeam antennas. This paper addresses a specific system design, starting with traffic projections in the next two decades and discussing tradeoffs and design approaches for major components, including antennas, transponders, and switches. Other issues explored are selection of frequency bands, modulation, multiple access, switching methods, and techniques for servicing areas with nonuniform traffic demands. Three major services are considered: a high-volume trunking system, a direct-to-user system, and a broadcast system for video distribution and similar functions. Estimates of payload weight and d.c. power requirements are presented. Other subjects treated are considerations of equipment layout for servicing by an orbit transfer vehicle, mechanical stability requirements for the large antennas, and reliability aspects of the large number of transponders employed.
The United States Air Force and U.S. National Security: A Historical Perspective 1947-1990
1991-01-01
there will remain areas in the world with potential demand for large scale protracted operations. To meet the needs of the joint force commander...Canada 5/78 Zaire AMNI MAC C-1418 airlifted U.S. Department of Energy personnel and equipment to the province of Alberta to aid in... motor vehicles and small arms ammunition to U.S. Embassy after drug traffickers threatened safety of U.S. personnel in Colombia. 1/85 Mali AMNI The
Naghdi, Mohammad Reza; Smail, Katia; Wang, Joy X; Wade, Fallou; Breaker, Ronald R; Perreault, Jonathan
2017-03-15
The discovery of noncoding RNAs (ncRNAs) and their importance for gene regulation led us to develop bioinformatics tools to pursue the discovery of novel ncRNAs. Finding ncRNAs de novo is challenging, first due to the difficulty of retrieving large numbers of sequences for given gene activities, and second due to exponential demands on calculation needed for comparative genomics on a large scale. Recently, several tools for the prediction of conserved RNA secondary structure were developed, but many of them are not designed to uncover new ncRNAs, or are too slow for conducting analyses on a large scale. Here we present various approaches using the database RiboGap as a primary tool for finding known ncRNAs and for uncovering simple sequence motifs with regulatory roles. This database also can be used to easily extract intergenic sequences of eubacteria and archaea to find conserved RNA structures upstream of given genes. We also show how to extend analysis further to choose the best candidate ncRNAs for experimental validation. Copyright © 2017 Elsevier Inc. All rights reserved.
Land grabbing: a preliminary quantification of economic impacts on rural livelihoods.
Davis, Kyle F; D'Odorico, Paolo; Rulli, Maria Cristina
2014-01-01
Global demands on agricultural land are increasing due to population growth, dietary changes and the use of biofuels. Their effect on food security is to reduce humans' ability to cope with the uncertainties of global climate change. In light of the 2008 food crisis, to secure reliable future access to sufficient agricultural land, many nations and corporations have begun purchasing large tracts of land in the global South, a phenomenon deemed "land grabbing" by popular media. Because land investors frequently export crops without providing adequate employment, this represents an effective income loss for local communities. We study 28 countries targeted by large-scale land acquisitions [comprising 87 % of reported cases and 27 million hectares (ha)] and estimate the effects of such investments on local communities' incomes. We find that this phenomenon can potentially affect the incomes of ~12 million people globally with implications for food security, poverty levels and urbanization. While it is important to note that our study incorporates a number of assumptions and limitations, it provides a much needed initial quantification of the economic impacts of large-scale land acquisitions on rural livelihoods.
Multi-granularity Bandwidth Allocation for Large-Scale WDM/TDM PON
NASA Astrophysics Data System (ADS)
Gao, Ziyue; Gan, Chaoqin; Ni, Cuiping; Shi, Qiongling
2017-12-01
WDM (wavelength-division multiplexing)/TDM (time-division multiplexing) PON (passive optical network) is being viewed as a promising solution for delivering multiple services and applications, such as high-definition video, video conferencing and data traffic. Considering the real-time transmission, QoS (quality of service) requirements and differentiated services model, a multi-granularity dynamic bandwidth allocation (DBA) in both the wavelength and time domains for large-scale hybrid WDM/TDM PON is proposed in this paper. The proposed scheme achieves load balancing by using bandwidth prediction. Based on the bandwidth prediction, the wavelength assignment can be realized fairly and effectively to satisfy the different demands of various classes. In particular, the allocation of residual bandwidth further augments the DBA and makes full use of bandwidth resources in the network. To further improve the network performance, two schemes named extending the cycle of one free wavelength (ECoFW) and large bandwidth shrinkage (LBS) are proposed, which can prevent transmission interruption when the user employs more than one wavelength. The simulation results show the effectiveness of the proposed scheme.
A prototype automatic phase compensation module
NASA Technical Reports Server (NTRS)
Terry, John D.
1992-01-01
The growing demands for high gain and accurate satellite communication systems will necessitate the utilization of large reflector systems. One area of concern of reflector-based satellite communication is large-scale surface deformations due to thermal effects. These distortions, when present, can degrade the performance of the reflector system appreciably. This performance degradation is manifested by a decrease in peak gain, an increase in sidelobe level, and pointing errors. It is essential to compensate for these distortion effects and to maintain the required system performance in the operating space environment. For this reason, the development of a technique to offset the degradation effects is highly desirable. Currently, most research is directed at developing better materials for the reflector. These materials have a lower coefficient of linear expansion, thereby reducing the surface errors. Alternatively, one can minimize the distortion effects of these large-scale errors by adaptive phased array compensation. Adaptive phased array techniques have been studied extensively at NASA and elsewhere. Presented in this paper is a prototype automatic phase compensation module designed and built at NASA Lewis Research Center, which is the first stage of development for an adaptive array compensation module.
Drive-by large-region acoustic noise-source mapping via sparse beamforming tomography.
Tuna, Cagdas; Zhao, Shengkui; Nguyen, Thi Ngoc Tho; Jones, Douglas L
2016-10-01
Environmental noise is a risk factor for human physical and mental health, demanding an efficient large-scale noise-monitoring scheme. The current technology, however, involves extensive sound pressure level (SPL) measurements at a dense grid of locations, making it impractical on a city-wide scale. This paper presents an alternative approach using a microphone array mounted on a moving vehicle to generate two-dimensional acoustic tomographic maps that yield the locations and SPLs of the noise-sources sparsely distributed in the neighborhood traveled by the vehicle. The far-field frequency-domain delay-and-sum beamforming output power values computed at multiple locations as the vehicle drives by are used as tomographic measurements. The proposed method is tested with acoustic data collected by driving an electric vehicle with a rooftop-mounted microphone array along a straight road next to a large open field, on which various pre-recorded noise-sources were produced by a loudspeaker at different locations. The accuracy of the tomographic imaging results demonstrates the promise of this approach for rapid, low-cost environmental noise-monitoring.
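The tomographic measurements above are far-field, frequency-domain delay-and-sum beamformer output powers. A minimal single-frequency sketch for a uniform linear array, with synthetic geometry and data rather than the paper's recordings:

```python
# Sketch of far-field, frequency-domain delay-and-sum beamforming power for a
# uniform linear array at one frequency bin. Geometry and the snapshot below
# are synthetic; this is not the paper's full tomographic pipeline.
import numpy as np

c = 343.0            # speed of sound, m/s
f = 1000.0           # analysis frequency, Hz
d = 0.05             # microphone spacing, m
n_mics = 8
mic_pos = np.arange(n_mics) * d

def das_power(snapshot, look_angles_deg):
    """Beamformer output power vs look angle for one frequency-domain snapshot."""
    angles = np.deg2rad(look_angles_deg)
    # Far-field steering vectors: per-angle phase delays across the array.
    delays = np.outer(np.sin(angles), mic_pos) / c           # (n_angles, n_mics)
    steering = np.exp(-2j * np.pi * f * delays) / n_mics
    return np.abs(steering @ snapshot) ** 2

# Synthetic snapshot: a unit plane wave arriving from 20 degrees, plus noise.
true_delay = mic_pos * np.sin(np.deg2rad(20.0)) / c
snapshot = np.exp(2j * np.pi * f * true_delay) \
    + 0.05 * np.random.default_rng(3).standard_normal(n_mics)

angles = np.linspace(-90, 90, 181)
power = das_power(snapshot, angles)
print("estimated arrival angle:", angles[np.argmax(power)], "degrees")
```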
Industrial biomanufacturing: The future of chemical production.
Clomburg, James M; Crumbley, Anna M; Gonzalez, Ramon
2017-01-06
The current model for industrial chemical manufacturing employs large-scale megafacilities that benefit from economies of unit scale. However, this strategy faces environmental, geographical, political, and economic challenges associated with energy and manufacturing demands. We review how exploiting biological processes for manufacturing (i.e., industrial biomanufacturing) addresses these concerns while also supporting and benefiting from economies of unit number. Key to this approach is the inherent small scale and capital efficiency of bioprocesses and the ability of engineered biocatalysts to produce designer products at high carbon and energy efficiency with adjustable output, at high selectivity, and under mild process conditions. The biological conversion of single-carbon compounds represents a test bed to establish this paradigm, enabling rapid, mobile, and widespread deployment, access to remote and distributed resources, and adaptation to new and changing markets. Copyright © 2017, American Association for the Advancement of Science.
NASA Astrophysics Data System (ADS)
Guojun, He; Lin, Guo; Zhicheng, Yu; Xiaojun, Zhu; Lei, Wang; Zhiqiang, Zhao
2017-03-01
In order to reduce the stochastic volatility of supply and demand and to maintain the stability of the electric power system after large-scale stochastic renewable energy sources are connected to the grid, their development and consumption should be promoted by market means. The bilateral contract transaction model of large users' direct power purchase conforms to the actual situation of our country. The trading pattern of large users' direct power purchase is analyzed in this paper, the characteristics of each type of power generation are summed up, and the centralized matching mode is mainly introduced. Through the establishment of a priority evaluation index system for power generation enterprises and the analysis of their priority based on fuzzy clustering, a method for ranking the priority of power generation enterprises in the trading pattern of large users' direct power purchase is put forward. This method offers suggestions for the trading mechanism of large users' direct power purchase, which helps to further promote direct power purchase by large users.
Azad, Ariful; Ouzounis, Christos A; Kyrpides, Nikos C; Buluç, Aydin
2018-01-01
Biological networks capture structural or functional properties of relevant entities such as molecules, proteins or genes. Characteristic examples are gene expression networks or protein–protein interaction networks, which hold information about functional affinities or structural similarities. Such networks have been expanding in size due to increasing scale and abundance of biological data. While various clustering algorithms have been proposed to find highly connected regions, Markov Clustering (MCL) has been one of the most successful approaches to cluster sequence similarity or expression networks. Despite its popularity, MCL’s scalability to cluster large datasets still remains a bottleneck due to high running times and memory demands. Here, we present High-performance MCL (HipMCL), a parallel implementation of the original MCL algorithm that can run on distributed-memory computers. We show that HipMCL can efficiently utilize 2000 compute nodes and cluster a network of ∼70 million nodes with ∼68 billion edges in ∼2.4 h. By exploiting distributed-memory environments, HipMCL clusters large-scale networks several orders of magnitude faster than MCL and enables clustering of even bigger networks. HipMCL is based on MPI and OpenMP and is freely available under a modified BSD license. PMID:29315405
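The MCL iteration underlying HipMCL alternates expansion (matrix squaring) with inflation (element-wise powering and renormalization). A serial toy sketch on a small dense matrix, omitting all of HipMCL's distributed-memory machinery:

```python
# Core Markov Clustering (MCL) iteration on a small dense matrix: expansion
# (matrix squaring) followed by inflation (element-wise power + renormalization).
# This serial toy omits HipMCL's sparse, distributed-memory implementation.
import numpy as np

def mcl(adj, inflation=2.0, n_iter=50, self_loops=1.0):
    m = adj.astype(float) + self_loops * np.eye(len(adj))
    m /= m.sum(axis=0)                        # column-stochastic transition matrix
    for _ in range(n_iter):
        m = m @ m                             # expansion
        m = m ** inflation                    # inflation
        m /= m.sum(axis=0)
    # Rows with remaining mass act as cluster "attractors"; group columns by them.
    clusters = {}
    for col in range(m.shape[1]):
        attractor = int(np.argmax(m[:, col]))
        clusters.setdefault(attractor, []).append(col)
    return list(clusters.values())

# Two obvious communities, {0,1,2} and {3,4,5}, joined by a single bridging edge.
adj = np.array([[0, 1, 1, 0, 0, 0],
                [1, 0, 1, 0, 0, 0],
                [1, 1, 0, 1, 0, 0],
                [0, 0, 1, 0, 1, 1],
                [0, 0, 0, 1, 0, 1],
                [0, 0, 0, 1, 1, 0]])
print(mcl(adj))
```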
Brawer, Peter A; Martielli, Richard; Pye, Patrice L; Manwaring, Jamie; Tierney, Anna
2010-06-01
The primary care health setting is in crisis. Increasing demand for services, with dwindling numbers of providers, has resulted in decreased access and decreased satisfaction for both patients and providers. Moreover, the overwhelming majority of primary care visits are for behavioral and mental health concerns rather than issues of a purely medical etiology. Integrated-collaborative models of health care delivery offer possible solutions to this crisis. The purpose of this article is to review the existing data available after 2 years of the St. Louis Initiative for Integrated Care Excellence; an example of integrated-collaborative care on a large scale model within a regional Veterans Affairs Health Care System. There is clear evidence that the SLI(2)CE initiative rather dramatically increased access to health care, and modified primary care practitioners' willingness to address mental health issues within the primary care setting. In addition, data suggests strong fidelity to a model of integrated-collaborative care which has been successful in the past. Integrated-collaborative care offers unique advantages to the traditional view and practice of medical care. Through careful implementation and practice, success is possible on a large scale model. PsycINFO Database Record (c) 2010 APA, all rights reserved.
Azad, Ariful; Pavlopoulos, Georgios A.; Ouzounis, Christos A.; ...
2018-01-05
Biological networks capture structural or functional properties of relevant entities such as molecules, proteins or genes. Characteristic examples are gene expression networks or protein–protein interaction networks, which hold information about functional affinities or structural similarities. Such networks have been expanding in size due to increasing scale and abundance of biological data. While various clustering algorithms have been proposed to find highly connected regions, Markov Clustering (MCL) has been one of the most successful approaches to cluster sequence similarity or expression networks. Despite its popularity, MCL’s scalability to cluster large datasets still remains a bottleneck due to high running times and memory demands. In this paper, we present High-performance MCL (HipMCL), a parallel implementation of the original MCL algorithm that can run on distributed-memory computers. We show that HipMCL can efficiently utilize 2000 compute nodes and cluster a network of ~70 million nodes with ~68 billion edges in ~2.4 h. By exploiting distributed-memory environments, HipMCL clusters large-scale networks several orders of magnitude faster than MCL and enables clustering of even bigger networks. Finally, HipMCL is based on MPI and OpenMP and is freely available under a modified BSD license.
Amp: A modular approach to machine learning in atomistic simulations
NASA Astrophysics Data System (ADS)
Khorshidi, Alireza; Peterson, Andrew A.
2016-10-01
Electronic structure calculations, such as those employing Kohn-Sham density functional theory or ab initio wavefunction theories, have allowed for atomistic-level understandings of a wide variety of phenomena and properties of matter at small scales. However, the computational cost of electronic structure methods drastically increases with length and time scales, which makes these methods difficult for long time-scale molecular dynamics simulations or large-sized systems. Machine-learning techniques can provide accurate potentials that can match the quality of electronic structure calculations, provided sufficient training data. These potentials can then be used to rapidly simulate large and long time-scale phenomena at similar quality to the parent electronic structure approach. Machine-learning potentials usually take a bias-free mathematical form and can be readily developed for a wide variety of systems. Electronic structure calculations have favorable properties (namely, that they are noiseless and that targeted training data can be produced on demand) that make them particularly well-suited for machine learning. This paper discusses our modular approach to atomistic machine learning through the development of the open-source Atomistic Machine-learning Package (Amp), which allows for representations of both the total and atom-centered potential energy surface, in both periodic and non-periodic systems. Potentials developed through the atom-centered approach are simultaneously applicable for systems of various sizes. Interpolation can be enhanced by introducing custom descriptors of the local environment. We demonstrate this in the current work for Gaussian-type, bispectrum, and Zernike-type descriptors. Amp has an intuitive and modular structure with an interface through the Python scripting language yet has parallelizable Fortran components for demanding tasks; it is designed to integrate closely with the widely used Atomic Simulation Environment (ASE), which makes it compatible with a wide variety of commercial and open-source electronic structure codes. We finally demonstrate that the neural network model inside Amp can accurately interpolate electronic structure energies as well as forces of thousands of multi-species atomic systems.
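To make the idea of an atom-centered, Gaussian-type descriptor concrete, here is a small Python sketch of a radial fingerprint with a smooth cutoff. The function name, η values and cutoff radius are illustrative assumptions; this is not Amp's actual implementation or API.

```python
import numpy as np

def radial_fingerprint(positions, center_index, etas=(0.5, 1.0, 2.0), r_cut=6.0):
    """Atom-centered descriptor: Gaussians of neighbour distances, damped by a cosine cutoff."""
    r_i = positions[center_index]
    fingerprint = []
    for eta in etas:
        total = 0.0
        for j, r_j in enumerate(positions):
            if j == center_index:
                continue
            d = np.linalg.norm(r_j - r_i)
            if d < r_cut:
                cutoff = 0.5 * (np.cos(np.pi * d / r_cut) + 1.0)   # smoothly goes to zero at r_cut
                total += np.exp(-eta * d**2 / r_cut**2) * cutoff
        fingerprint.append(total)
    return np.array(fingerprint)

# Three atoms on a line (coordinates in angstroms); fingerprint of the middle atom.
pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [3.0, 0.0, 0.0]])
print(radial_fingerprint(pos, center_index=1))
```

In a machine-learning potential of the kind described above, a per-atom vector like this (plus angular terms) would be fed into a regression model such as a neural network to predict atomic energy contributions.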
NASA Astrophysics Data System (ADS)
Katul, Gabriel G.; Oren, Ram; Manzoni, Stefano; Higgins, Chad; Parlange, Marc B.
2012-09-01
The role of evapotranspiration (ET) in the global, continental, regional, and local water cycles is reviewed. Elevated atmospheric CO2, air temperature, vapor pressure deficit (D), turbulent transport, radiative transfer, and reduced soil moisture all impact biotic and abiotic processes controlling ET that must be extrapolated to large scales. Suggesting a blueprint to achieve this link is the main compass of this review. Leaf-scale transpiration (fe) as governed by the plant biochemical demand for CO2 is first considered. When this biochemical demand is combined with mass transfer formulations, the problem remains mathematically intractable, requiring additional assumptions. A mathematical "closure" that assumes stomatal aperture is autonomously regulated so as to maximize the leaf carbon gain while minimizing water loss is proposed, which leads to analytical expressions for leaf-scale transpiration. This formulation predicts well the effects of elevated atmospheric CO2 and increases in D on fe. The case of soil moisture stress is then considered using extensive gas exchange measurements collected in drought studies. Upscaling the fe to the canopy is then discussed at multiple time scales. The impact of limited soil water availability within the rooting zone on the upscaled ET as well as some plant strategies to cope with prolonged soil moisture stress are briefly presented. Moving further up in direction and scale, the soil-plant system is then embedded within the atmospheric boundary layer, where the influence of soil moisture on rainfall is outlined. The review concludes by discussing outstanding challenges and how to tackle them by means of novel theoretical, numerical, and experimental approaches.
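The optimality closure described above can be caricatured with a toy calculation: choose the stomatal conductance g that maximises carbon gain A(g) minus a water cost proportional to transpiration E = gD. The saturating form of A and all parameter values below are hypothetical stand-ins, not the authors' analytical formulation.

```python
import numpy as np

def optimal_conductance(A_max=20.0, k=0.1, lam=2000.0, D=1.5):
    """Grid search for the conductance that maximises A(g) - lambda * E(g), with E = g * D."""
    g = np.linspace(1e-4, 1.0, 10000)       # candidate conductances (illustrative units)
    A = A_max * g / (g + k)                 # assumed saturating assimilation response
    E = g * D                               # transpiration grows with conductance and VPD
    gain = A - lam * 1e-3 * E               # lambda converts the water cost into carbon units
    return g[np.argmax(gain)]

# Higher vapour pressure deficit D pushes the optimum towards lower conductance.
for D in (1.0, 2.0, 3.0):
    print(D, round(optimal_conductance(D=D), 3))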
Forest gradient response in Sierran landscapes: the physical template
Urban, Dean L.; Miller, Carol; Halpin, Patrick N.; Stephenson, Nathan L.
2000-01-01
Vegetation pattern on landscapes is the manifestation of physical gradients, biotic response to these gradients, and disturbances. Here we focus on the physical template as it governs the distribution of mixed-conifer forests in California's Sierra Nevada. We extended a forest simulation model to examine montane environmental gradients, emphasizing factors affecting the water balance in these summer-dry landscapes. The model simulates the soil moisture regime in terms of the interaction of water supply and demand: supply depends on precipitation and water storage, while evapotranspirational demand varies with solar radiation and temperature. The forest cover itself can affect the water balance via canopy interception and evapotranspiration. We simulated Sierran forests as slope facets, defined as gridded stands of homogeneous topographic exposure, and verified simulated gradient response against sample quadrats distributed across Sequoia National Park. We then performed a modified sensitivity analysis of abiotic factors governing the physical gradient. Importantly, the model's sensitivity to temperature, precipitation, and soil depth varies considerably over the physical template, particularly relative to elevation. The physical drivers of the water balance have characteristic spatial scales that differ by orders of magnitude. Across large spatial extents, temperature and precipitation as defined by elevation primarily govern the location of the mixed conifer zone. If the analysis is constrained to elevations within the mixed-conifer zone, local topography comes into play as it influences drainage. Soil depth varies considerably at all measured scales, and is especially dominant at fine (within-stand) scales. Physical site variables can influence soil moisture deficit either by affecting water supply or water demand; these effects have qualitatively different implications for forest response. These results have clear implications about purely inferential approaches to gradient analysis, and bear strongly on our ability to use correlative approaches in assessing the potential responses of montane forests to anthropogenic climatic change.
Boithias, Laurie; Acuña, Vicenç; Vergoñós, Laura; Ziv, Guy; Marcé, Rafael; Sabater, Sergi
2014-02-01
Spatial differences in the supply and demand of ecosystem services such as water provisioning often imply that the demand for ecosystem services cannot be fulfilled at the local scale, but it can be fulfilled at larger scales (regional, continental). Differences in the supply:demand (S:D) ratio for a given service result in different values, and these differences might be assessed with monetary or non-monetary metrics. Water scarcity occurs where and when water resources are not enough to meet all the demands, and this affects equally the service of water provisioning and the ecosystem needs. In this study we assess the value of water in a Mediterranean basin under different global change (i.e. both climate and anthropogenic changes) and mitigation scenarios, with a non-monetary metric: the S:D ratio. We computed water balances across the Ebro basin (North-East Spain) with the spatially explicit InVEST model. We highlight the spatial and temporal mismatches existing across a single hydrological basin regarding water provisioning and its consumption, considering or not, the environmental demand (environmental flow). The study shows that water scarcity is commonly a local issue (sub-basin to region), but that all demands are met at the largest considered spatial scale (basin). This was not the case in the worst-case scenario (increasing demands and decreasing supply), as the S:D ratio at the basin scale was near 1, indicating that serious problems of water scarcity might occur in the near future even at the basin scale. The analysis of possible mitigation scenarios reveals that the impact of global change may be counteracted by the decrease of irrigated areas. Furthermore, the comparison between a non-monetary (S:D ratio) and a monetary (water price) valuation metrics reveals that the S:D ratio provides similar values and might be therefore used as a spatially explicit metric to valuate the ecosystem service water provisioning. © 2013.
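A minimal numeric illustration of the supply:demand bookkeeping discussed above, with made-up volumes: a sub-basin can show local scarcity (S:D < 1) while the basin as a whole still meets all demands.

```python
# Hypothetical annual water balances per sub-basin (hm3/yr); values are illustrative only.
subbasins = {
    "upper":  {"supply": 900.0, "demand": 300.0},
    "middle": {"supply": 400.0, "demand": 500.0},
    "lower":  {"supply": 200.0, "demand": 450.0},
}

for name, sb in subbasins.items():
    ratio = sb["supply"] / sb["demand"]
    flag = "  (local scarcity)" if ratio < 1 else ""
    print(f"{name:6s} S:D = {ratio:.2f}{flag}")

basin_supply = sum(sb["supply"] for sb in subbasins.values())
basin_demand = sum(sb["demand"] for sb in subbasins.values())
print(f"basin  S:D = {basin_supply / basin_demand:.2f}")   # > 1 here despite local deficits
```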
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeForest, Nicholas; Mendes, Goncalo; Stadler, Michael
2013-06-02
In much of the developed world, air-conditioning in buildings is the dominant driver of summer peak electricity demand. In the developing world a steadily increasing utilization of air-conditioning places additional strain on already-congested grids. This common thread represents a large and growing threat to the reliable delivery of electricity around the world, requiring capital-intensive expansion of capacity and draining available investment resources. Thermal energy storage (TES), in the form of ice or chilled water, may be one of the few technologies currently capable of mitigating this problem cost effectively and at scale. The installation of TES capacity allows a building to meet its on-peak air conditioning load without interruption using electricity purchased off-peak and operating with improved thermodynamic efficiency. In this way, TES has the potential to fundamentally alter consumption dynamics and reduce impacts of air conditioning. This investigation presents a simulation study of a large office building in four distinct geographical contexts: Miami, Lisbon, Shanghai, and Mumbai. The optimization tool DER-CAM (Distributed Energy Resources Customer Adoption Model) is applied to optimally size TES systems for each location. Summer load profiles are investigated to assess the effectiveness and consistency in reducing peak electricity demand. Additionally, annual energy requirements are used to determine system cost feasibility, payback periods and customer savings under local utility tariffs.
Climate mitigation and the future of tropical landscapes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomson, Allison M.; Calvin, Katherine V.; Chini, Louise Parsons
2010-11-16
Land use change to meet 21st Century demands for food, fuel, and fiber will occur in the context of both a changing climate as well as societal efforts to mitigate climate change. This changing natural and human environment will have large consequences for forest resources, terrestrial carbon storage and emissions, and food and energy crop production over the next century. Any climate change mitigation policies enacted will change the environment under which land-use decisions are made and alter global land use change patterns. Here we use the GCAM integrated assessment model to explore how climate mitigation policies that achieve a climate stabilization at 4.5 W m-2 radiative forcing in 2100 and value carbon in terrestrial ecosystems interact with future agricultural productivity and food and energy demands to influence land use in the tropics. The regional land use results are downscaled from GCAM regions to produce gridded maps of tropical land use change. We find that tropical forests are preserved only in cases where a climate mitigation policy that values terrestrial carbon is in place, and crop productivity growth continues throughout the century. Crop productivity growth is also necessary to avoid large scale deforestation globally and enable the production of bioenergy crops. The terrestrial carbon pricing assumptions in GCAM are effective at avoiding deforestation even when cropland must expand to meet future food demand.
NASA Astrophysics Data System (ADS)
Toth, Elena; Bragalli, Cristiana; Neri, Mattia
2017-04-01
In Mediterranean regions, inherently affected by water scarcity conditions, the gap between water availability and demand may further increase in the near future due to both climatic and anthropogenic drivers. In particular, the high degree of urbanization and the concentration of population and activities in coastal areas are often severely impacting water availability, including for the residential sector. It is therefore crucial to analyse the importance of both climatic and touristic factors as drivers of the water demand in such areas, to better understand and model the expected consumption and thereby improve water management policies and practices. The study presents an analysis of a large number of municipalities covering almost the whole Romagna region in Northern Italy, one of the most economically developed areas in Europe, characterized by an extremely profitable tourist industry, especially in the coastal cities. For this region it is therefore extremely important to assess the significance of the drivers that may influence the demand in the different periods of the year, that is, climatic factors (rainfall depths and occurrence, temperature averages and extremes), but also the presence of tourists, both in official tourist accommodation structures and in holiday homes (the latter being very difficult to estimate). Analyses of the Italian water industry at the seasonal or monthly time scale have so far been extremely limited in the literature because of the scarce availability of data on water demands, which are made public only as annual volumes. All the study municipalities are supplied by the same water company, which provided monthly consumption volume data at the main inlet points of the entire distribution network for a period of 7 years (2009-2015). For the same period, precipitation and temperature data have been collected and summarised in indexes representing monthly averages, days of occurrence and over-threshold values; in addition, information on tourist flows at the monthly scale has been collected and processed. These data have been validated and aggregated at the municipal or multi-municipal scale and are analysed, in particular with reference to a severe dry period that occurred in 2011-2012, in order to understand the demand pattern and the users' response to a water scarcity condition, examining the influence of the different climatic and anthropogenic (touristic) drivers on the water demand. Finally, a non-linear model based on a neural network architecture was implemented for each municipality to simulate the monthly water demand as a function of previous demands and of the identified climatic and touristic indexes: the outcomes of the models demonstrate the added value of including determinants based on both climatic and touristic data, and this value, as expected, is higher for the coastal municipalities, which have a stronger tourist vocation.
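As a sketch of the kind of non-linear monthly demand model described above, the snippet below fits a small neural network to synthetic data. The predictors (previous-month demand, temperature, rainy days, tourist presences), their values and the scikit-learn model are illustrative assumptions, not the study's actual model or data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 84                                            # seven years of monthly records, as in the study period
X = np.column_stack([
    rng.normal(100, 10, n),                       # demand in the previous month (10^3 m^3)
    rng.normal(18, 7, n),                         # mean monthly temperature (deg C)
    rng.integers(0, 15, n),                       # number of rainy days
    rng.gamma(2.0, 5e3, n),                       # tourist presences (overnight stays)
])
y = 0.6 * X[:, 0] + 1.5 * X[:, 1] - 0.8 * X[:, 2] + 2e-3 * X[:, 3] + rng.normal(0, 3, n)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))
model.fit(X[:-12], y[:-12])                       # train on the first six years
print("held-out R^2 on the last year:", round(model.score(X[-12:], y[-12:]), 2))
```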
Piezoelectric Polymers Actuators for Precise Shape Control of Large Scale Space Antennas
NASA Technical Reports Server (NTRS)
Chen, Qin; Natale, Don; Neese, Bret; Ren, Kailiang; Lin, Minren; Zhang, Q. M.; Pattom, Matthew; Wang, K. W.; Fang, Houfei; Im, Eastwood
2007-01-01
Extremely large, lightweight, in-space deployable active and passive microwave antennas are demanded by future space missions. This paper investigates the development of PVDF based piezopolymer actuators for controlling the surface accuracy of a membrane reflector. Uniaxially stretched PVDF films were poled using an electrodeless method which yielded high quality poled piezofilms required for this application. To further improve the piezoperformance of piezopolymers, several PVDF based copolymers were examined. It was found that one of them exhibits nearly three times improvement in the in-plane piezoresponse compared with PVDF and P(VDF-TrFE) piezopolymers. Preliminary experimental results indicate that these flexible actuators are very promising in controlling precisely the shape of the space reflectors.
Zhang, Tan; Chen, Ang
2017-01-01
Based on the job demands-resources model, the study developed and validated an instrument that measures physical education teachers' job demands-resources perception. Expert review established content validity with the average item rating of 3.6/5.0. Construct validity and reliability were determined with a teacher sample ( n = 397). Exploratory factor analysis established a five-dimension construct structure matching the theoretical construct deliberated in the literature. The composite reliability scores for the five dimensions range from .68 to .83. Validity coefficients (intraclass correlational coefficients) are .69 for job resources items and .82 for job demands items. Inter-scale correlational coefficients range from -.32 to .47. Confirmatory factor analysis confirmed the construct validity with high dimensional factor loadings (ranging from .47 to .84 for job resources scale and from .50 to .85 for job demands scale) and adequate model fit indexes (root mean square error of approximation = .06). The instrument provides a tool to measure physical education teachers' perception of their working environment.
NASA Astrophysics Data System (ADS)
Wang, B.; Bauer, S.; Pfeiffer, W. T.
2015-12-01
Large-scale energy storage will be required to mitigate offsets between electric energy demand and the fluctuating electric energy production from renewable sources like wind farms, if renewables dominate energy supply. Porous formations in the subsurface could provide the large storage capacities required if chemical energy carriers such as hydrogen gas produced during phases of energy surplus are stored. This work assesses the behavior of a porous media hydrogen storage operation through numerical scenario simulation of a synthetic, heterogeneous sandstone formation formed by an anticlinal structure. The structural model is parameterized using data available for the North German Basin as well as data given for formations with similar characteristics. Based on the geological setting at the storage site, a total of 15 facies distributions is generated and the hydrological parameters are assigned accordingly. Hydraulic parameters are spatially distributed according to the facies present and include permeability, porosity, relative permeability and capillary pressure. The storage is designed to supply energy in times of deficiency on the order of seven days, which represents the typical time span of weather conditions with no wind. It is found that, using five injection/extraction wells, 21.3 million sm³ of hydrogen gas can be stored and retrieved to supply 62,688 MWh of energy within 7 days. This requires a ratio of working to cushion gas of 0.59. The retrievable energy within this time represents the demand of about 450,000 people. Furthermore, it is found that for longer storage times larger gas volumes have to be used, and for higher delivery rates the number of wells additionally has to be increased. The formation investigated here thus seems to offer sufficient capacity and deliverability to be used for a large-scale hydrogen gas storage operation.
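A back-of-the-envelope check of the reported figures, assuming a lower heating value of roughly 3 kWh per standard cubic metre of hydrogen; that calorific value is an approximate textbook number and is not stated in the abstract.

```python
LHV_H2_KWH_PER_SM3 = 3.0                 # assumed lower heating value of hydrogen per sm^3
stored_sm3 = 21.3e6                      # working gas volume reported above

energy_mwh = stored_sm3 * LHV_H2_KWH_PER_SM3 / 1000.0
print(f"chemical energy of the working gas ~ {energy_mwh:,.0f} MWh")   # same order as the 62,688 MWh reported

per_person_per_day = 62_688e3 / 7 / 450_000       # kWh per person and day over the 7-day window
print(f"implied supply ~ {per_person_per_day:.1f} kWh per person per day")
```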
Amlung, Michael; McCarty, Kayleigh N.; Morris, David H.; Tsai, Chia-Lin; McCarthy, Denis M.
2015-01-01
Background and aims Although increases in subjective alcohol craving have been observed following moderate doses of alcohol (e.g., priming effects), the effects of alcohol consumption on behavioral economic demand for alcohol are largely unstudied. This study examined the effects of alcohol intoxication on alcohol demand and craving. Design A between-subjects design in which participants were randomly assigned to either an alcohol (n = 31), placebo (n = 29) or control (n = 25) condition. Setting A laboratory setting at the University of Missouri, USA. Participants Eighty-five young adult moderate drinkers were recruited from the University of Missouri and surrounding community. Measurements Change in demand for alcohol across time was measured using three single items: alcohol consumption at no cost (i.e., intensity), maximum price paid for a single drink (i.e., breakpoint), and total amount spent on alcohol (i.e., Omax). Alcohol demand at baseline was also assessed using an alcohol purchase task (APT). Craving was assessed using a single visual analog scale item. Findings In the alcohol group compared with the combined non-alcohol groups, intensity, breakpoint, and craving increased from baseline to the ascending limb and decreased thereafter (ps < 0.05; Omax p = 0.06). Change in craving following alcohol consumption was significantly associated with change in each of the demand indices (ps < 0.0001). Finally, the demand single items were associated with corresponding indices from the APT (ps < 0.01). Conclusions Alcohol demand increases following intoxication, in terms of both the maximum amount people are willing to pay for one drink and the number of drinks people would consume if drinks were free. Behavioral economic measures of alcohol value can complement subjective craving as measures of moment-to-moment fluctuations in drinking motivation following intoxication. PMID:25732875
Arrindell, W A; Bridges, K R; van der Ende, J; St Lawrence, J S; Gray-Shellberg, L; Harnish, R; Rogers, R; Sanderman, R
2001-12-01
The Scale for Interpersonal Behaviour (SIB), a multidimensional, self-report measure of state assertiveness, was administered to a nationwide sample of 2375 undergraduates enrolled at 11 colleges and universities across the USA. The SIB was developed in the Netherlands for the independent assessment of both distress associated with self-assertion in a variety of social situations and the likelihood of engaging in a specific assertive response. This is done with four factorially-derived, first-order dimensions: (i) Display of negative feelings (Negative assertion); (ii) Expression of and dealing with personal limitations; (iii) Initiating assertiveness; and (iv) Praising others and the ability to deal with compliments/praise of others (Positive assertion). The present study was designed to determine the cross-national invariance of the original Dutch factors and the construct validity of the corresponding dimensions. It also set out to develop norms for a nationwide sample of US students. The results provide further support for the reliability, factorial and construct validity of the SIB. Compared to their Dutch equivalents, US students had meaningfully higher distress in assertiveness scores on all SIB scales (medium to large effect sizes), whereas differences on the performance scales reflected small effect sizes. The cross-national differences in distress scores were hypothesized to have originated from the American culture being more socially demanding with respect to interpersonal competence than the Dutch, and from the perceived threats and related cognitive appraisals that are associated with such demands.
Li, Xiwen; Chen, Yuning; Yang, Qing; Wang, Yitao
2015-01-01
The usage of medicinal plants has increased rapidly along with the development of the traditional Chinese medicine industry. Higher market demand and the shortage of wild herbal resources make large-scale introduction and cultivation necessary. Herbal cultivation can ease the current contradiction between medicinal resource supply and demand, but it brings new problems such as pesticide residues and plant diseases and pests. Researchers have recently placed high hopes on the application of natural fostering, a new method that practically integrates herbal production and diversity protection and can solve the problems brought by artificial cultivation. However, no single mode can solve all problems existing in current herbal production. This study evaluated different production modes, including cultivation, natural fostering, and wild collection, to guide traditional Chinese medicine production for sustainable utilization of herbal resources. PMID:26074987
Advanced Continuous Flow Platform for On-Demand Pharmaceutical Manufacturing.
Zhang, Ping; Weeranoppanant, Nopphon; Thomas, Dale A; Tahara, Kohei; Stelzer, Torsten; Russell, Mary Grace; O'Mahony, Marcus; Myerson, Allan S; Lin, Hongkun; Kelly, Liam P; Jensen, Klavs F; Jamison, Timothy F; Dai, Chunhui; Cui, Yuqing; Briggs, Naomi; Beingessner, Rachel L; Adamo, Andrea
2018-02-21
As a demonstration of an alternative to the challenges faced with batch pharmaceutical manufacturing including the large production footprint and lengthy time-scale, we previously reported a refrigerator-sized continuous flow system for the on-demand production of essential medicines. Building on this technology, herein we report a second-generation, reconfigurable and 25 % smaller (by volume) continuous flow pharmaceutical manufacturing platform featuring advances in reaction and purification equipment. Consisting of two compact [0.7 (L)×0.5 (D)×1.3 m (H)] stand-alone units for synthesis and purification/formulation processes, the capabilities of this automated system are demonstrated with the synthesis of nicardipine hydrochloride and the production of concentrated liquid doses of ciprofloxacin hydrochloride, neostigmine methylsulfate and rufinamide that meet US Pharmacopeia standards. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Efficient methods and readily customizable libraries for managing complexity of large networks.
Dogrusoz, Ugur; Karacelik, Alper; Safarli, Ilkin; Balci, Hasan; Dervishi, Leonard; Siper, Metin Can
2018-01-01
One common problem in visualizing real-life networks, including biological pathways, is the large size of these networks. Oftentimes, users find themselves facing slow, non-scaling operations due to network size, if not a "hairball" network, hindering effective analysis. One extremely useful method for reducing the complexity of large networks is the use of hierarchical clustering and nesting, applying expand-collapse operations on demand during analysis. Another such method is hiding currently unnecessary details, to be gradually revealed later on demand. Major challenges when applying complexity reduction operations on large networks include efficiency and maintaining the user's mental map of the drawing. We developed specialized incremental layout methods for preserving a user's mental map while managing the complexity of large networks through expand-collapse and hide-show operations. We also developed open-source JavaScript libraries as plug-ins to the web-based graph visualization library named Cytoscape.js to implement these methods as complexity management operations. Through efficient specialized algorithms provided by these extensions, one can collapse or hide desired parts of a network, yielding potentially much smaller networks and making them more suitable for interactive visual analysis. This work fills an important gap by making efficient implementations of some already known complexity management techniques freely available to tool developers through a couple of open-source, customizable software libraries, and by introducing some heuristics which can be applied alongside such complexity management techniques to ensure that the user's mental map is preserved.
Breshears, David D.; Adams, Henry D.; Eamus, Derek; ...
2013-08-02
Drought-induced tree mortality, including large-scale die-off events and increases in background rates of mortality, is a global phenomenon (Allen et al., 2010) that can directly impact numerous earth system properties and ecosystem goods and services (Adams et al., 2010; Breshears et al., 2011; Anderegg et al., 2013). Tree mortality is particularly of concern because of the likelihood that it will increase in frequency and extent with climate change (McDowell et al., 2008, 2011; Adams et al., 2009; McDowell, 2011; Williams et al., 2013). Recent plant science advances related to drought have focused on understanding the physiological mechanisms that not only affect plant growth and associated carbon metabolism, but also the more challenging issue of predicting plant mortality thresholds (McDowell et al., 2013). Although some advances related to mechanisms of mortality have been made and have increased emphasis on interrelationships between carbon metabolism and plant hydraulics (McDowell et al., 2011), notably few studies have specifically evaluated effects of increasing atmospheric demand for moisture (i.e., vapour pressure deficit; VPD) on rates of tree death. In this opinion article we highlight the importance of considering the key risks of future large-scale tree die-off and other mortality events arising from increased VPD. Here we focus on mortality of trees, but our point about the importance of VPD is also relevant to other vascular plants.
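For readers unfamiliar with the quantity, vapour pressure deficit can be computed from air temperature and relative humidity; the helper below uses the common Tetens approximation for saturation vapour pressure (a standard formula, not specific to this article).

```python
import math

def vpd_kpa(temp_c, rel_humidity_pct):
    """Vapour pressure deficit (kPa) via the Tetens approximation of saturation vapour pressure."""
    e_sat = 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))  # saturation vapour pressure, kPa
    e_act = e_sat * rel_humidity_pct / 100.0                      # actual vapour pressure, kPa
    return e_sat - e_act

# Warming at constant relative humidity raises VPD, i.e. the atmospheric demand for moisture.
for t in (25.0, 30.0, 35.0):
    print(t, "degC ->", round(vpd_kpa(t, 40.0), 2), "kPa")
```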
A Vision for the Future of Environmental Research: Creating Environmental Intelligence Centers
NASA Astrophysics Data System (ADS)
Barron, E. J.
2002-12-01
The nature of the environmental issues facing our nation demands a capability that allows us to enhance economic vitality, maintain environmental quality, and limit threats to life and property through more fundamental understanding of the Earth. It is "advanced" knowledge of how the system may respond that gives environmental information most of its power and utility. This fact is evident in the demand for new forecasting products, involving air quality, energy demand, water quality and quantity, ultraviolet radiation, and human health indexes. As we demonstrate feasibility and benefit, society is likely to demand a growing number of new operational forecast products on prediction time scales of days to decades into the future. The driving forces that govern our environment are widely recognized, involving primarily weather and climate, patterns of land use and land cover, and resource use with its associated waste products. The importance of these driving forces has been demonstrated by a decade of research on greenhouse gas emissions, ozone depletion and deforestation, and through the birth of Earth System Science. But, there are also major challenges. We find the strongest intersection between human activity, environmental stresses, system interactions and human decision-making in regional analysis coupled to larger spatial scales. In addition, most regions are influenced by multiple-stresses. Multiple, cumulative, and interactive stresses are clearly the most difficult to understand and hence the most difficult to assess and to manage. Currently, we are incapable of addressing these issues in a truly integrated fashion at global scales. The lack of an ability to combine global and regional forcing and to assess the response of the system to multiple stresses at the spatial and temporal scales of interest to humans limits our ability to assess the impacts of specific human perturbations, to assess advantages and risks, and to enhance economic and societal well being in the context of global, national and regional stewardship. These societal needs lead to a vision that uses a regional framework as a stepping-stone to a comprehensive national or global capability. The development of a comprehensive regional framework depends on a new approach to environmental research - the creation of regional Environmental Intelligence Centers. A key objective is to bring a demanding level of discipline to "forecasting" in a broad arena of environmental issues. The regional vision described above is designed to address a broad range of current and future environmental issues by creating a capability based on integrating diverse observing systems, making data readily accessible, developing an increasingly comprehensive predictive capability at the spatial and temporal scales appropriate for examining societal issues, and creating a vigorous intersection with decision-makers. With demonstrated success over a few large-scale regions of the U.S., this strategy will very likely grow into a national capability that far exceeds current capabilities.
Evaluating the impacts of real-time pricing on the usage of wind generation
Sioshansi, Ramteen; Short, Walter
2009-02-13
One of the impediments to large-scale use of wind generation within power systems is its nondispatchability and variable and uncertain real-time availability. Operating constraints on conventional generators such as minimum generation points, forbidden zones, and ramping limits as well as system constraints such as power flow limits and ancillary service requirements may force a system operator to curtail wind generation in order to ensure feasibility. Furthermore, the pattern of wind availability and electricity demand may not allow wind generation to be fully utilized in all hours. One solution to these issues, which could reduce these inflexibilities, is the use of real-time pricing (RTP) tariffs which can both smooth-out the diurnal load pattern in order to reduce the impact of binding unit operating and system constraints on wind utilization, and allow demand to increase in response to the availability of costless wind generation. As a result, we use and analyze a detailed unit commitment model of the Texas power system with different estimates of demand elasticities to demonstrate the potential increases in wind generation from implementing RTP.
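A stylised illustration of how a real-time pricing tariff flattens a peaked load profile, using a constant-elasticity demand response. The load shape, prices and elasticity are invented for illustration and are unrelated to the paper's unit commitment model of the Texas system.

```python
import numpy as np

def rtp_demand(base_load, flat_price, rt_price, elasticity=-0.1):
    """Constant-elasticity response: load scales with (real-time price / flat price) ** elasticity."""
    return base_load * (rt_price / flat_price) ** elasticity

hours = np.arange(24)
base = 40 + 15 * np.exp(-((hours - 17) ** 2) / 18.0)                  # GW, evening-peaked load
price = 30 + 60 * (base - base.min()) / (base.max() - base.min())     # $/MWh, high when load is high

adjusted = rtp_demand(base, flat_price=price.mean(), rt_price=price)
print("peak load under a flat tariff:", round(float(base.max()), 1), "GW")
print("peak load under RTP          :", round(float(adjusted.max()), 1), "GW")
```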
Improving Design Efficiency for Large-Scale Heterogeneous Circuits
NASA Astrophysics Data System (ADS)
Gregerson, Anthony
Despite increases in logic density, many Big Data applications must still be partitioned across multiple computing devices in order to meet their strict performance requirements. Among the most demanding of these applications is high-energy physics (HEP), which uses complex computing systems consisting of thousands of FPGAs and ASICs to process the sensor data created by experiments at particle accelerators such as the Large Hadron Collider (LHC). Designing such computing systems is challenging due to the scale of the systems, the exceptionally high-throughput and low-latency performance constraints that necessitate application-specific hardware implementations, the requirement that algorithms are efficiently partitioned across many devices, and the possible need to update the implemented algorithms during the lifetime of the system. In this work, we describe our research to develop flexible architectures for implementing such large-scale circuits on FPGAs. In particular, this work is motivated by (but not limited in scope to) high-energy physics algorithms for the Compact Muon Solenoid (CMS) experiment at the LHC. To make efficient use of logic resources in multi-FPGA systems, we introduce Multi-Personality Partitioning, a novel form of the graph partitioning problem, and present partitioning algorithms that can significantly improve resource utilization on heterogeneous devices while also reducing inter-chip connections. To reduce the high communication costs of Big Data applications, we also introduce Information-Aware Partitioning, a partitioning method that analyzes the data content of application-specific circuits, characterizes their entropy, and selects circuit partitions that enable efficient compression of data between chips. We employ our information-aware partitioning method to improve the performance of the hardware validation platform for evaluating new algorithms for the CMS experiment. Together, these research efforts help to improve the efficiency and decrease the cost of developing the large-scale, heterogeneous circuits needed to enable large-scale applications in high-energy physics and other important areas.
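The information-aware idea of ranking candidate cuts by the entropy of the data they carry can be sketched in a few lines; the two example signals below are invented, and this shows only the entropy criterion, not the partitioning algorithm itself.

```python
import math
from collections import Counter

def empirical_entropy(samples):
    """Shannon entropy in bits per symbol of the values observed on a candidate cut signal."""
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# A low-entropy signal compresses well between chips; a near-uniform one does not.
mostly_idle_flags = [0] * 900 + [1] * 100
noisy_sensor_words = [i % 256 for i in range(1000)]
print("cut A:", round(empirical_entropy(mostly_idle_flags), 3), "bits/symbol")
print("cut B:", round(empirical_entropy(noisy_sensor_words), 3), "bits/symbol")
```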
Pushparajan, Charlotte; Claus, Juan Daniel; Marshall, Sean D G; Visnovsky, Gabriel
2017-12-01
The DSIR-HA-1179 coleopteran cell line has been identified as a susceptible and permissive host for the in vitro replication of the Oryctes nudivirus, which can be used as a biopesticide against the coconut rhinoceros beetle, pest of palms. The major challenge to in vitro large-scale Oryctes nudivirus production is ensuring process economy. This rests, among other requisites, on the use of low-cost culture media tailored to the nutritional and metabolic needs of the cell line, both in uninfected and infected cultures. The aim of the present study was to characterize the nutritional demands and the metabolic characteristics of the DSIR-HA-1179 cell line during growth and subsequent infection with Oryctes nudivirus in the TC-100 culture medium. Serum-supplementation of the culture medium was found to be critical for cell growth, and addition of 10% fetal bovine serum v/v led to a maximum viable cell density (16.8 × 10⁵ cells ml⁻¹) with a population doubling time of 4.2 d. Nutritional and metabolic characterization of the cell line revealed a trend of glucose and glutamine consumption but minimal uptake of other amino acids, negligible production of lactate and ammonia, and the accumulation of alanine, both before and after infection. The monitoring of virus production kinetics showed that the TC-100 culture medium was nutritionally sufficient to give a peak yield of 7.38 × 10⁷ TCID₅₀ ml⁻¹ of OrNV at the 6th day post-infection in attached cultures of DSIR-HA-1179 cells in 25 cm² T-flasks. Knowledge of the cell line's nutritional demands and virus production kinetics will aid in the formulation of a low-cost culture medium and better process design for large-scale OrNV production in future.
Eavesdropping on the Arctic: Automated bioacoustics reveal dynamics in songbird breeding phenology
Ellis, Daniel P. W.; Pérez, Jonathan H.; Wingfield, John C.; Boelman, Natalie T.
2018-01-01
Bioacoustic networks could vastly expand the coverage of wildlife monitoring to complement satellite observations of climate and vegetation. This approach would enable global-scale understanding of how climate change influences phenomena such as migratory timing of avian species. The enormous data sets that autonomous recorders typically generate demand automated analyses that remain largely undeveloped. We devised automated signal processing and machine learning approaches to estimate dates on which songbird communities arrived at arctic breeding grounds. Acoustically estimated dates agreed well with those determined via traditional surveys and were strongly related to the landscape’s snow-free dates. We found that environmental conditions heavily influenced daily variation in songbird vocal activity, especially before egg laying. Our novel approaches demonstrate that variation in avian migratory arrival can be detected autonomously. Large-scale deployment of this innovation in wildlife monitoring would enable the coverage necessary to assess and forecast changes in bird migration in the face of climate change. PMID:29938220
Static analysis techniques for semiautomatic synthesis of message passing software skeletons
Sottile, Matthew; Dagit, Jason; Zhang, Deli; ...
2015-06-29
The design of high-performance computing architectures demands performance analysis of large-scale parallel applications to derive various parameters concerning hardware design and software development. The process of performance analysis and benchmarking an application can be done in several ways with varying degrees of fidelity. One of the most cost-effective ways is to do a coarse-grained study of large-scale parallel applications through the use of program skeletons. The concept of a “program skeleton” that we discuss in this article is an abstracted program that is derived from a larger program where source code that is determined to be irrelevant is removed for the purposes of the skeleton. In this work, we develop a semiautomatic approach for extracting program skeletons based on compiler program analysis. Finally, we demonstrate correctness of our skeleton extraction process by comparing details from communication traces, as well as show the performance speedup of using skeletons by running simulations in the SST/macro simulator.
Computational catalyst screening: Scaling, bond-order and catalysis
Abild-Pedersen, Frank
2015-10-01
Here, the design of new and better heterogeneous catalysts needed to accommodate the growing demand for energy from renewable sources is an important challenge for coming generations. Most surface catalyzed processes involve a large number of complex reaction networks and the energetics ultimately defines the turn-over-frequency and the selectivity of the process. In order not to get lost in the large quantities of data, simplification schemes that still contain the key elements of the reaction are required. Adsorption and transition state scaling relations constitute such a scheme that not only maps the reaction-relevant information in terms of a few parameters but also provides an efficient way of screening for new materials in a continuous multi-dimensional energy space. As with all relations they impose certain restrictions on what can be achieved, and in this paper I show why these limitations exist and how we can change the behavior through an energy-resolved approach that still maintains the screening capabilities needed in computational catalysis.
Simulation of FRET dyes allows quantitative comparison against experimental data
NASA Astrophysics Data System (ADS)
Reinartz, Ines; Sinner, Claude; Nettels, Daniel; Stucki-Buchli, Brigitte; Stockmar, Florian; Panek, Pawel T.; Jacob, Christoph R.; Nienhaus, Gerd Ulrich; Schuler, Benjamin; Schug, Alexander
2018-03-01
Fully understanding biomolecular function requires detailed insight into the systems' structural dynamics. Powerful experimental techniques such as single molecule Förster Resonance Energy Transfer (FRET) provide access to such dynamic information yet have to be carefully interpreted. Molecular simulations can complement these experiments but typically face limits in accessing slow time scales and large or unstructured systems. Here, we introduce a coarse-grained simulation technique that tackles these challenges. While requiring only few parameters, we maintain full protein flexibility and include all heavy atoms of proteins, linkers, and dyes. We are able to sufficiently reduce computational demands to simulate large or heterogeneous structural dynamics and ensembles on slow time scales found in, e.g., protein folding. The simulations allow for calculating FRET efficiencies which quantitatively agree with experimentally determined values. By providing atomically resolved trajectories, this work supports the planning and microscopic interpretation of experiments. Overall, these results highlight how simulations and experiments can complement each other leading to new insights into biomolecular dynamics and function.
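The FRET efficiencies mentioned above follow the standard Förster relation between donor-acceptor distance and the Förster radius; a minimal sketch is given below, with an assumed Förster radius and made-up distances rather than values from the paper.

```python
def fret_efficiency(r, r0):
    """Förster transfer efficiency for donor-acceptor distance r and Förster radius r0 (same units)."""
    return 1.0 / (1.0 + (r / r0) ** 6)

# Ensemble-averaged efficiency over sampled inter-dye distances, e.g. from a trajectory.
distances_nm = [4.0, 4.5, 5.0, 5.5, 6.0]
r0_nm = 5.4                                  # assumed Förster radius for a typical dye pair
mean_e = sum(fret_efficiency(r, r0_nm) for r in distances_nm) / len(distances_nm)
print(round(mean_e, 3))
```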
Wearable Large-Scale Perovskite Solar-Power Source via Nanocellular Scaffold.
Hu, Xiaotian; Huang, Zengqi; Zhou, Xue; Li, Pengwei; Wang, Yang; Huang, Zhandong; Su, Meng; Ren, Wanjie; Li, Fengyu; Li, Mingzhu; Chen, Yiwang; Song, Yanlin
2017-11-01
Dramatic advances in perovskite solar cells (PSCs) and the blossoming of wearable electronics have triggered tremendous demands for flexible solar-power sources. However, the fracturing of functional crystalline films and transmittance wastage from flexible substrates are critical challenges to approaching the high-performance PSCs with flexural endurance. In this work, a nanocellular scaffold is introduced to architect a mechanics buffer layer and optics resonant cavity. The nanocellular scaffold releases mechanical stresses during flexural experiences and significantly improves the crystalline quality of the perovskite films. The nanocellular optics resonant cavity optimizes light harvesting and charge transportation of devices. More importantly, these flexible PSCs, which demonstrate excellent performance and mechanical stability, are practically fabricated in modules as a wearable solar-power source. A power conversion efficiency of 12.32% for a flexible large-scale device (polyethylene terephthalate substrate, indium tin oxide-free, 1.01 cm²) is achieved. This ingenious flexible structure will enable a new approach for development of wearable electronics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
The analysis of MAI in large scale MIMO-CDMA system
NASA Astrophysics Data System (ADS)
Berceanu, Madalina-Georgiana; Voicu, Carmen; Halunga, Simona
2016-12-01
Recently, technological development has driven rapid growth in the volume of data carried by cellular services, which also implies the need for higher data rates and lower latency. To meet users' demands, a series of new data processing techniques has been brought into discussion. In this paper, we consider MIMO technology, which uses multiple antennas at the receiver and transmitter ends. To study the performance obtained by this technology, we propose a MIMO-CDMA system in which image transmission is used instead of random data transmission, in order to take advantage of a larger range of quality indicators. In the simulations we increased the number of antennas and observed how the performance of the system changes; based on that, we compared a conventional MIMO and a Large Scale MIMO system in terms of BER and the MSSIM index, a metric that compares the quality of the image before transmission with that of the received one.
Legume abundance along successional and rainfall gradients in Neotropical forests.
Gei, Maga; Rozendaal, Danaë M A; Poorter, Lourens; Bongers, Frans; Sprent, Janet I; Garner, Mira D; Aide, T Mitchell; Andrade, José Luis; Balvanera, Patricia; Becknell, Justin M; Brancalion, Pedro H S; Cabral, George A L; César, Ricardo Gomes; Chazdon, Robin L; Cole, Rebecca J; Colletta, Gabriel Dalla; de Jong, Ben; Denslow, Julie S; Dent, Daisy H; DeWalt, Saara J; Dupuy, Juan Manuel; Durán, Sandra M; do Espírito Santo, Mário Marcos; Fernandes, G Wilson; Nunes, Yule Roberta Ferreira; Finegan, Bryan; Moser, Vanessa Granda; Hall, Jefferson S; Hernández-Stefanoni, José Luis; Junqueira, André B; Kennard, Deborah; Lebrija-Trejos, Edwin; Letcher, Susan G; Lohbeck, Madelon; Marín-Spiotta, Erika; Martínez-Ramos, Miguel; Meave, Jorge A; Menge, Duncan N L; Mora, Francisco; Muñoz, Rodrigo; Muscarella, Robert; Ochoa-Gaona, Susana; Orihuela-Belmonte, Edith; Ostertag, Rebecca; Peña-Claros, Marielos; Pérez-García, Eduardo A; Piotto, Daniel; Reich, Peter B; Reyes-García, Casandra; Rodríguez-Velázquez, Jorge; Romero-Pérez, I Eunice; Sanaphre-Villanueva, Lucía; Sanchez-Azofeifa, Arturo; Schwartz, Naomi B; de Almeida, Arlete Silva; Almeida-Cortez, Jarcilene S; Silver, Whendee; de Souza Moreno, Vanessa; Sullivan, Benjamin W; Swenson, Nathan G; Uriarte, Maria; van Breugel, Michiel; van der Wal, Hans; Veloso, Maria das Dores Magalhães; Vester, Hans F M; Vieira, Ima Célia Guimarães; Zimmerman, Jess K; Powers, Jennifer S
2018-05-28
The nutrient demands of regrowing tropical forests are partly satisfied by nitrogen-fixing legume trees, but our understanding of the abundance of those species is biased towards wet tropical regions. Here we show how the abundance of Leguminosae is affected by both recovery from disturbance and large-scale rainfall gradients through a synthesis of forest inventory plots from a network of 42 Neotropical forest chronosequences. During the first three decades of natural forest regeneration, legume basal area is twice as high in dry compared with wet secondary forests. The tremendous ecological success of legumes in recently disturbed, water-limited forests is likely to be related to both their reduced leaflet size and ability to fix N₂, which together enhance legume drought tolerance and water-use efficiency. Earth system models should incorporate these large-scale successional and climatic patterns of legume dominance to provide more accurate estimates of the maximum potential for natural nitrogen fixation across tropical forests.
Consolidated Laser-Induced Fluorescence Diagnostic Systems for the NASA Ames Arc Jet Facilities
NASA Technical Reports Server (NTRS)
Grinstead, Jay H.; Wilder, Michael C.; Porter, Barry J.; Brown, Jeffrey D.; Yeung, Dickson; Battazzo, Stephen J.; Brubaker, Timothy R.
2016-01-01
The spectroscopic diagnostic technique of two photon absorption laser-induced fluorescence (LIF) of atomic species for non-intrusive arc jet flow property measurement was first implemented at NASA Ames in the mid-1990s. In 2013-2014, NASA combined the agency's large-scale arc jet test capabilities at NASA Ames. Concurrent with that effort, the agency also sponsored a project to establish two comprehensive LIF diagnostic systems for the Aerodynamic Heating Facility (AHF) and Interaction Heating Facility (IHF) arc jets. The scope of the project enabled further engineering development of the existing IHF LIF system as well as the complete reconstruction of the AHF LIF system. The updated LIF systems are identical in design and capability. They represent the culmination of over 20 years of development experience in transitioning a specialized laboratory research tool into a measurement system for large-scale, high-demand test facilities. This paper will document the latest improvements of the LIF system design and demonstrations of the redeveloped AHF and IHF LIF systems.
Metadata and annotations for multi-scale electrophysiological data.
Bower, Mark R; Stead, Matt; Brinkmann, Benjamin H; Dufendach, Kevin; Worrell, Gregory A
2009-01-01
The increasing use of high-frequency (kHz), long-duration (days) intracranial monitoring from multiple electrodes during pre-surgical evaluation for epilepsy produces large amounts of data that are challenging to store and maintain. Descriptive metadata and clinical annotations of these large data sets also pose challenges to simple, often manual, methods of data analysis. The problems of reliable communication of metadata and annotations between programs, the maintenance of the meanings within that information over long time periods, and the flexibility to re-sort data for analysis place differing demands on data structures and algorithms. Solutions to these individual problem domains (communication, storage and analysis) can be configured to provide easy translation and clarity across the domains. The Multi-scale Annotation Format (MAF) provides an integrated metadata and annotation environment that maximizes code reuse, minimizes error probability and encourages future changes by reducing the tendency to over-fit information technology solutions to current problems. An example of a graphical utility for generating and evaluating metadata and annotations for "big data" files is presented.
Bioinspired greigite magnetic nanocrystals: chemical synthesis and biomedicine applications
Feng, Mei; Lu, Yang; Yang, Yuan; Zhang, Meng; Xu, Yun-Jun; Gao, Huai-Ling; Dong, Liang; Xu, Wei-Ping; Yu, Shu-Hong
2013-01-01
Greigite prepared at large scale with uniform dimensions is in significant demand for applications such as hyperthermia, photovoltaics, medicine and cell separation. However, the inhomogeneity and hydrophobicity of most as-prepared greigite crystals have limited their applications in biomedicine. Herein, we report a green chemical method utilizing β-cyclodextrin (β-CD) and polyethylene glycol (PEG) to synthesize, at large scale, bioinspired greigite (Fe₃S₄) magnetic nanocrystals (GMNCs) with a structure and magnetic properties similar to those of magnetosomes. β-CD and PEG are responsible for controlling the crystal phase and morphology, as well as for binding onto the surface of the nanocrystals and forming polymer layers. The GMNCs exhibit a transverse relaxivity of 94.8 mM⁻¹ s⁻¹, as high as that of iron oxide nanocrystals, and an entrapment efficiency of 58.7% for magnetically guided delivery of the chemotherapeutic drug doxorubicin. Moreover, enhanced chemotherapeutic treatment of mouse tumors was obtained via intravenous injection of doxorubicin-loaded GMNCs. PMID:24141204
NASA Astrophysics Data System (ADS)
Subramanian, A. C.; Lavers, D.; Matsueda, M.; Shukla, S.; Cayan, D. R.; Ralph, M.
2017-12-01
Atmospheric rivers (ARs) - elongated plumes of intense moisture transport - are a primary source of hydrological extremes, water resources and impactful weather along the West Coast of North America and Europe. There is strong demand in the water management, societal infrastructure and humanitarian sectors for reliable sub-seasonal forecasts, particularly of extreme events, such as floods and droughts so that actions to mitigate disastrous impacts can be taken with sufficient lead-time. Many recent studies have shown that ARs in the Pacific and the Atlantic are modulated by large-scale modes of climate variability. Leveraging the improved understanding of how these large-scale climate modes modulate the ARs in these two basins, we use the state-of-the-art multi-model forecast systems such as the North American Multi-Model Ensemble (NMME) and the Subseasonal-to-Seasonal (S2S) database to help inform and assess the probabilistic prediction of ARs and related extreme weather events over the North American and European West Coasts. We will present results from evaluating probabilistic forecasts of extreme precipitation and AR activity at the sub-seasonal scale. In particular, results from the comparison of two winters (2015-16 and 2016-17) will be shown, winters which defied canonical El Niño teleconnection patterns over North America and Europe. We further extend this study to analyze probabilistic forecast skill of AR events in these two basins and the variability in forecast skill during certain regimes of large-scale climate modes.
Fast Open-World Person Re-Identification.
Zhu, Xiatian; Wu, Botong; Huang, Dongcheng; Zheng, Wei-Shi
2018-05-01
Existing person re-identification (re-id) methods typically assume that: 1) any probe person is guaranteed to appear in the gallery target population during deployment (i.e., closed-world) and 2) the probe set contains only a limited number of people (i.e., small search scale). Both assumptions are artificial and breached in real-world applications, since the probe population in target people search can be extremely vast in practice due to the ambiguity of probe search space boundary. Therefore, it is unrealistic that any probe person is assumed as one target people, and a large-scale search in person images is inherently demanded. In this paper, we introduce a new person re-id search setting, called large scale open-world (LSOW) re-id, characterized by huge size probe images and open person population in search thus more close to practical deployments. Under LSOW, the under-studied problem of person re-id efficiency is essential in addition to that of commonly studied re-id accuracy. We, therefore, develop a novel fast person re-id method, called Cross-view Identity Correlation and vErification (X-ICE) hashing, for joint learning of cross-view identity representation binarisation and discrimination in a unified manner. Extensive comparative experiments on three large-scale benchmarks have been conducted to validate the superiority and advantages of the proposed X-ICE method over a wide range of the state-of-the-art hashing models, person re-id methods, and their combinations.
NASA Astrophysics Data System (ADS)
Lassonde, Sylvain; Boucher, Olivier; Breon, François-Marie; Tobin, Isabelle; Vautard, Robert
2016-04-01
The share of renewable energies in the mix of electricity production is increasing worldwide. This trend is driven by environmental and economic policies aiming at a reduction of greenhouse gas emissions and an improvement of energy security. It is expected to continue in the forthcoming years and decades. Electricity production from renewables is related to weather and climate factors such as the diurnal and seasonal cycles of sunlight and wind, but is also linked to variability on all time scales. The intermittency of renewable electricity production (solar, wind power) could eventually hinder its future deployment. Intermittency is indeed a challenge, as demand and supply of electricity need to be balanced at any time. This challenge can be addressed by the deployment of an overcapacity in power generation (from renewable and/or thermal sources), a large-scale energy storage system and/or improved management of the demand. The main goal of this study is to optimize a hypothetical renewable energy system at the French and European scales in order to investigate whether spatial diversity of the production (here electricity from wind energy) could be a response to the intermittency. We use the ECMWF (European Centre for Medium-Range Weather Forecasts) ERA-Interim meteorological reanalysis and meteorological fields from the Weather Research and Forecasting (WRF) model to estimate the potential for wind power generation. Electricity demand and production are provided by the French electricity network (RTE) at the scale of administrative regions for the years 2013 and 2014. First, we will show how the simulated production of wind power compares against the measured production at the national and regional scales. Several modelling and bias-correction methods for wind power production will be discussed. Second, we will present results from an optimization procedure that aims to minimize some measure of the intermittency of wind energy. For instance, we estimate the optimal distribution between French regions (with or without cross-border inputs) that minimizes the impact of low-production periods computed in a running-mean sense, and its sensitivity to the period considered. We will also assess which meteorological situations are the most problematic over the 35-year ERA-Interim climatology (1980-2015).
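As a rough illustration of the kind of optimization described, the sketch below (with synthetic capacity factors and variance as a stand-in intermittency metric, neither taken from the study) searches for regional wind shares that minimise the variability of aggregate production.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n_regions, n_hours = 6, 5000
# Hypothetical hourly capacity factors per region (stand-in for WRF/ERA-Interim output).
cf = np.clip(rng.normal(0.3, 0.15, size=(n_hours, n_regions)), 0, 1)

def intermittency(weights):
    """Variance of the weighted aggregate production (one possible metric)."""
    return np.var(cf @ weights)

w0 = np.full(n_regions, 1.0 / n_regions)
result = minimize(
    intermittency, w0, method="SLSQP",
    bounds=[(0, 1)] * n_regions,
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
)
print("optimal regional shares:", np.round(result.x, 3))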
Residential outdoor water use in Tucson, Arizona: Geospatial, demographic and temporal perspectives
NASA Astrophysics Data System (ADS)
Himmel, Alexander I.
Outdoor water use by single-family residences in the desert city of Tucson, Arizona is investigated as a multi-scaled coupled human-environment system, using remotely sensed images, GIS data, household water use records and survey responses. Like many desert cities, Tucson's municipal water system faces stresses at multiple spatial and temporal scales: rising demand, limited supplies, competition for distant resources and the likelihood of shortages due to regional climate change. Though the need for demand management is recognized, conflict between the long-term regional scale of the ecosystem that sustains Tucson's water supply and the short-term, local scale of the municipal utility results in a "lack of fit", shown here as the inability to reduce consumption to sustainable levels. While direct regulation of outdoor water use has not been successful, geographic research suggests that modification of the built environment, the focus of the three studies comprising this dissertation, holds promise as a demand management strategy. The first study is a spatial analysis of survey responses on outdoor water use practices during a drought. Next, the potential for substituting common amenities (irrigated landscapes and swimming pools) for private ones is investigated. Residential use was found to be sensitive to park proximity, greenness (proxied by the Normalized Difference Vegetation Index), size and presence of a park pool. Most small parks were net water savers; large parks offered the opportunity to substitute reclaimed water for potable supplies. The last study correlates long-term Landsat-based vegetation and water use trends and integrates these with a spatial analysis of kinetic temperatures. Findings indicate that despite reduced water use, Tucson became greener over the 1995 -- 2008 period. This effect is attributed to a pulse of vegetation establishment in response to a shift in the El Nino -- Southern Oscillation (ENSO) and Pacific Decadal Oscillation (PDO) around 1976 and to irrigation prior to the study period. I conclude that although the coupled human-environment system of Tucson's municipal water supply and use practices is complex, there are scale-dependent competitive advantages to be gained through thoughtful modification of the built environment.
The scaling structure of the global road network.
Strano, Emanuele; Giometto, Andrea; Shai, Saray; Bertuzzo, Enrico; Mucha, Peter J; Rinaldo, Andrea
2017-10-01
Because of increasing global urbanization and its immediate consequences, including changes in patterns of food demand, circulation and land use, the next century will witness a major increase in the extent of paved roads built worldwide. To model the effects of this increase, it is crucial to understand whether possible self-organized patterns are inherent in the global road network structure. Here, we use the largest updated database comprising all major roads on the Earth, together with global urban and cropland inventories, to suggest that road length distributions within croplands are indistinguishable from urban ones, once rescaled to account for the difference in mean road length. Such similarity extends to road length distributions within urban or agricultural domains of a given area. We find two distinct regimes for the scaling of the mean road length with the associated area, holding in general at small and at large values of the latter. In suitably large urban and cropland domains, we find that mean and total road lengths increase linearly with their domain area, differently from earlier suggestions. Scaling regimes suggest that simple and universal mechanisms regulate urban and cropland road expansion at the global scale. As such, our findings bear implications for global road infrastructure growth based on land-use change and for planning policies sustaining urban expansions.
Stochastic optimisation of water allocation on a global scale
NASA Astrophysics Data System (ADS)
Schmitz, Oliver; Straatsma, Menno; Karssenberg, Derek; Bierkens, Marc F. P.
2014-05-01
Climate change, increasing population and further economic developments are expected to increase water scarcity for many regions of the world. Optimal water management strategies are required to minimise the water gap between water supply and domestic, industrial and agricultural water demand. A crucial aspect of water allocation is the spatial scale of optimisation. Blue water supply peaks at the upstream parts of large catchments, whereas demands are often largest at the industrialised downstream parts. Two extremes exist in water allocation: (i) 'First come, first served,' which allows the upstream water demands to be fulfilled without consideration of downstream demands, and (ii) 'All for one, one for all,' which satisfies water allocation over the whole catchment. In practice, water treaties govern intermediate solutions. The objective of this study is to determine the effect of these two end members on water allocation optimisation with respect to water scarcity. We conduct this study on a global scale with the year 2100 as the temporal horizon. Water supply is calculated using the hydrological model PCR-GLOBWB, operating at a 5 arcminutes resolution and a daily time step. PCR-GLOBWB is forced with temperature and precipitation fields from the HadGEM2-ES global circulation model that participated in the latest Coupled Model Intercomparison Project (CMIP5). Water demands are calculated for representative concentration pathway 6.0 (RCP 6.0) and shared socio-economic pathway scenario 2 (SSP2). To enable fast computation of the optimisation, we developed a hydrologically correct network of 1800 basin segments with an average size of 100 000 square kilometres. The maximum number of nodes in a network was 140, for the Amazon Basin. Water demands and supplies are aggregated to cubic kilometres per month per segment. A new open-source implementation is developed for the stochastic optimisation of the water allocation. We apply a Genetic Algorithm for each segment to estimate the set of parameters that distributes the water supply at each node. We use the Python programming language and a flexible software architecture allowing us to straightforwardly (1) exchange the process description for the nodes such that different water allocation schemes can be tested, (2) exchange the objective function, (3) apply the optimisation either to the whole catchment or to different sub-levels, and (4) use multi-core CPUs concurrently, thereby reducing computation time. We demonstrate the application of the scientific workflow to the model outputs of PCR-GLOBWB and present first results on how water scarcity depends on the choice between the two extremes in water allocation.
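The genetic-algorithm step can be pictured with the following toy sketch (a hypothetical four-segment chain with made-up supplies and demands, not the actual PCR-GLOBWB coupling): candidate per-segment retention fractions are evolved so that total unmet demand is minimised.

import numpy as np

rng = np.random.default_rng(3)
supply = np.array([8.0, 3.0, 1.0, 0.5])   # km3/month generated within each segment
demand = np.array([1.0, 2.0, 4.0, 5.0])   # km3/month demanded in each segment

def unmet_demand(retain_frac):
    """Route water downstream; each segment keeps a fraction of what reaches it."""
    unmet, passed_down = 0.0, 0.0
    for s, d, f in zip(supply, demand, retain_frac):
        available = s + passed_down
        kept = f * available
        unmet += max(d - kept, 0.0)
        passed_down = available - kept
    return unmet

def genetic_algorithm(pop_size=60, generations=200, mutation=0.05):
    pop = rng.random((pop_size, len(supply)))
    for _ in range(generations):
        fitness = np.array([unmet_demand(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]                     # selection
        n_children = pop_size - len(parents)
        p1 = parents[rng.integers(len(parents), size=n_children)]
        p2 = parents[rng.integers(len(parents), size=n_children)]
        children = np.where(rng.random(p1.shape) < 0.5, p1, p2)                 # uniform crossover
        children = np.clip(children + rng.normal(0, mutation, p1.shape), 0, 1)  # mutation
        pop = np.vstack([parents, children])
    best = min(pop, key=unmet_demand)
    return best, unmet_demand(best)

best, score = genetic_algorithm()
print("retention fractions:", np.round(best, 2), "total unmet demand:", round(score, 2))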
Large Field Visualization with Demand-Driven Calculation
NASA Technical Reports Server (NTRS)
Moran, Patrick J.; Henze, Chris
1999-01-01
We present a system designed for the interactive definition and visualization of fields derived from large data sets: the Demand-Driven Visualizer (DDV). The system allows the user to write arbitrary expressions to define new fields, and then apply a variety of visualization techniques to the result. Expressions can include differential operators and numerous other built-in functions, all of which are evaluated at specific field locations completely on demand. The payoff of following a demand-driven design philosophy throughout becomes particularly evident when working with large time-series data, where the costs of eager evaluation alternatives can be prohibitive.
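A minimal sketch of the demand-driven idea, assuming a toy field API rather than the DDV implementation itself: a derived field is held as an expression and evaluated only at the points a visualization actually requests.

import numpy as np

class DerivedField:
    def __init__(self, expression):
        self.expression = expression          # callable of (x, y, z)

    def sample(self, points):
        """Evaluate the field only at the requested points (on demand)."""
        return np.array([self.expression(*p) for p in points])

# Hypothetical base fields and a user-defined derived quantity.
pressure = lambda x, y, z: np.exp(-(x**2 + y**2 + z**2))
speed = lambda x, y, z: np.sqrt(x**2 + y**2) + 0.1 * z
dynamic_head = DerivedField(lambda x, y, z: pressure(x, y, z) + 0.5 * speed(x, y, z) ** 2)

probe_points = [(0.0, 0.0, 0.0), (1.0, 2.0, 0.5)]   # e.g. points along a cutting plane
print(dynamic_head.sample(probe_points))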
Concept Overview & Preliminary Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruth, Mark
2017-07-12
'H2@Scale' is an opportunity for wide-scale use of hydrogen as an intermediate that carries energy from various production options to multiple uses. It is based on identifying and developing opportunities for low-cost hydrogen production and investigating opportunities for using that hydrogen across the electricity, industrial, and transportation sectors. One of the key production opportunities is the use of low-cost electricity that may be generated under high penetrations of variable renewable generators such as wind and solar photovoltaics. The technical potential demand for hydrogen across the sectors is 60 million metric tons per year. The U.S. has sufficient domestic renewable resources so that each could meet that demand, and could readily meet the demand using a portfolio of generation options. This presentation provides an overview of the concept and the technical potential demand and resources. It also motivates analysis and research on H2@Scale.
Centennial-scale reductions in nitrogen availability in temperate forests of the United States
McLauchlan, Kendra K.; Gerhart, Laci M.; Battles, John J.; Craine, Joseph M.; Elmore, Andrew J.; Higuera, Phil E.; Mack, Michelle M; McNeil, Brendan E.; Nelson, David M.; Pederson, Neil; Perakis, Steven
2017-01-01
Forests cover 30% of the terrestrial Earth surface and are a major component of the global carbon (C) cycle. Humans have doubled the amount of global reactive nitrogen (N), increasing deposition of N onto forests worldwide. However, other global changes—especially climate change and elevated atmospheric carbon dioxide concentrations—are increasing demand for N, the element limiting primary productivity in temperate forests, which could be reducing N availability. To determine the long-term, integrated effects of global changes on forest N cycling, we measured stable N isotopes in wood, a proxy for N supply relative to demand, on large spatial and temporal scales across the continental U.S.A. Here, we show that forest N availability has generally declined across much of the U.S. since at least 1850 C.E. with cool, wet forests demonstrating the greatest declines. Across sites, recent trajectories of N availability were independent of recent atmospheric N deposition rates, implying a minor role for modern N deposition on the trajectory of N status of North American forests. Our results demonstrate that current trends of global changes are likely to be consistent with forest oligotrophication into the foreseeable future, further constraining forest C fixation and potentially storage.
Revegetation in China’s Loess Plateau is approaching sustainable water resource limits
NASA Astrophysics Data System (ADS)
Feng, Xiaoming; Fu, Bojie; Piao, Shilong; Wang, Shuai; Ciais, Philippe; Zeng, Zhenzhong; Lü, Yihe; Zeng, Yuan; Li, Yue; Jiang, Xiaohui; Wu, Bingfang
2016-11-01
Revegetation of degraded ecosystems provides opportunities for carbon sequestration and bioenergy production. However, vegetation expansion in water-limited areas creates potentially conflicting demands for water between the ecosystem and humans. Current understanding of these competing demands is still limited. Here, we study the semi-arid Loess Plateau in China, where the 'Grain to Green' large-scale revegetation programme has been in operation since 1999. As expected, we found that the new planting has caused both net primary productivity (NPP) and evapotranspiration (ET) to increase. Also, the increase of ET has induced a significant (p < 0.001) decrease in the ratio of river runoff to annual precipitation across hydrological catchments. From currently revegetated areas and human water demand, we estimate a threshold NPP of 400 ± 5 g C m−2 yr−1 above which the population will suffer water shortages. NPP in this region is found to be already close to this limit. The threshold NPP could change by −36% in the worst case of climate drying and high human withdrawals, to +43% in the best case. Our results develop a new conceptual framework to determine the critical carbon sequestration that is sustainable in terms of both ecological and socio-economic resource demands in a coupled anthropogenic-biological system.
Schevernels, Hanne; Krebs, Ruth M.; Santens, Patrick; Woldorff, Marty G.; Boehler, C. Nico
2013-01-01
Recently, attempts have been made to disentangle the neural underpinnings of preparatory processes related to reward and attention. Functional magnetic resonance imaging (fMRI) research showed that neural activity related to the anticipation of reward and to attentional demands invokes neural activity patterns featuring large-scale overlap, along with some differences and interactions. Due to the limited temporal resolution of fMRI, however, the temporal dynamics of these processes remain unclear. Here, we report an event-related potentials (ERP) study in which cued attentional demands and reward prospect were combined in a factorial design. Results showed that reward prediction dominated early cue processing, as well as the early and later parts of the contingent negative variation (CNV) slow-wave ERP component that has been associated with task-preparation processes. Moreover these reward-related electrophysiological effects correlated across participants with response-time speeding on reward-prospect trials. In contrast, cued attentional demands affected only the later part of the CNV, with the highest amplitudes following cues predicting high-difficulty potential-reward targets, thus suggesting maximal task preparation when the task requires it and entails reward prospect. Consequently, we suggest that task-preparation processes triggered by reward can arise earlier, and potentially more directly, than strategic top-down aspects of preparation based on attentional demands. PMID:24064071
NASA Technical Reports Server (NTRS)
Smith, J. L.
1978-01-01
An analysis is given of the Low-Cost Silicon Solar Array Project plans for the industrialization of new production technologies expected to be forthcoming as a result of the project's technology development efforts. In particular, LSSA's mandate to ensure an annual production capability of 500 MW peak for the photovoltaic supply industry by 1986 is critically examined. The examination focuses on one of the concerns behind this goal -- timely development of industrial capacity to supply anticipated demand. Some of the conclusions include: (1) construction of small-scale pilot plants should be undertaken only for purposes of technology development; (2) large-scale demonstrations should be undertaken only when the technology is well in hand; (3) commercial-scale production should be left to the private sector; (4) the 500-MW annual output goal should be shifted to Program Headquarters.
Negative emissions—Part 3: Innovation and upscaling
NASA Astrophysics Data System (ADS)
Nemet, Gregory F.; Callaghan, Max W.; Creutzig, Felix; Fuss, Sabine; Hartmann, Jens; Hilaire, Jérôme; Lamb, William F.; Minx, Jan C.; Rogers, Sophia; Smith, Pete
2018-06-01
We assess the literature on innovation and upscaling for negative emissions technologies (NETs) using a systematic and reproducible literature coding procedure. To structure our review, we employ the framework of sequential stages in the innovation process, with which we code each NETs article in innovation space. We find that while there is a growing body of innovation literature on NETs, 59% of the articles are focused on the earliest stages of the innovation process, ‘research and development’ (R&D). The subsequent stages of innovation are also represented in the literature, but at much lower levels of activity than R&D. Distinguishing between innovation stages that are related to the supply of the technology (R&D, demonstrations, scale up) and demand for the technology (demand pull, niche markets, public acceptance), we find an overwhelming emphasis (83%) on the supply side. BECCS articles have an above average share of demand-side articles while direct air carbon capture and storage has a very low share. Innovation in NETs has much to learn from successfully diffused technologies; appealing to heterogeneous users, managing policy risk, as well as understanding and addressing public concerns are all crucial yet not well represented in the extant literature. Results from integrated assessment models show that while NETs play a key role in the second half of the 21st century for 1.5 °C and 2 °C scenarios, the major period of new NETs deployment is between 2030 and 2050. Given that the broader innovation literature consistently finds long time periods involved in scaling up and deploying novel technologies, there is an urgency to developing NETs that is largely unappreciated. This challenge is exacerbated by the thousands to millions of actors that potentially need to adopt these technologies for them to achieve planetary scale. This urgency is reflected neither in the Paris Agreement nor in most of the literature we review here. If NETs are to be deployed at the levels required to meet 1.5 °C and 2 °C targets, then important post-R&D issues will need to be addressed in the literature, including incentives for early deployment, niche markets, scale-up, demand, and—particularly if deployment is to be hastened—public acceptance.
NASA Astrophysics Data System (ADS)
Wang, H.; Asefa, T.
2017-12-01
A real-time decision support tool (DST) for a water supply system would consider system uncertainties, e.g., uncertain streamflow and demand, as well as operational constraints and infrastructure outages (e.g., a pump station shutdown, an offline reservoir due to maintenance). Such a DST is often used by water managers for resource allocation and delivery to customers. Although most seasonal DSTs used by water managers recognize these system uncertainties and operational constraints, most use only historical information or assume a deterministic outlook of the water supply system. This study presents a seasonal DST that incorporates rainfall/streamflow uncertainties, the seasonal demand outlook and system operational constraints. Large-scale climate information is captured through a rainfall simulator driven by a Bayesian non-homogeneous Markov chain Monte Carlo model that allows non-stationary transition probabilities contingent on the Nino 3.4 index. An ad-hoc seasonal demand forecasting model considers weather conditions explicitly and socio-economic factors implicitly. Latin Hypercube sampling is employed to effectively sample the probability density functions of flow and demand. Seasonal system operation is modelled as a mixed-integer optimization problem that aims at minimizing operational costs. It embeds the flexibility to modify operational rules at different components, e.g., surface water treatment plants, desalination facilities, and groundwater pumping stations. The proposed framework is illustrated at a wholesale water supplier in the Southeastern United States, Tampa Bay Water. The use of the tool is demonstrated in providing operational guidance for a typical drawdown and refill cycle of a regional reservoir. The DST provided: 1) a probabilistic outlook of reservoir storage and the chance of a successful refill by the end of the rainy season; 2) operational expectations for large infrastructure (e.g., high service pumps and booster stations) throughout the season. Other potential uses of such a DST are also discussed.
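The sampling step can be illustrated as follows (distribution choices and parameter values are assumptions for the sketch, not Tampa Bay Water's calibrated inputs): Latin Hypercube samples of seasonal inflow and demand that could then drive the mixed-integer operations model.

import numpy as np
from scipy.stats import qmc, lognorm, norm

sampler = qmc.LatinHypercube(d=2, seed=42)
u = sampler.random(n=500)                            # uniform [0,1)^2, stratified per dimension

inflow = lognorm.ppf(u[:, 0], s=0.5, scale=200.0)    # seasonal inflow, million gallons/day (illustrative)
demand = norm.ppf(u[:, 1], loc=180.0, scale=15.0)    # seasonal demand, million gallons/day (illustrative)

scenarios = np.column_stack([inflow, demand])        # scenario matrix fed to the operations model
print("scenario matrix shape:", scenarios.shape)
print("fraction of scenarios with demand exceeding inflow:", np.mean(demand > inflow))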
How to meet the increasing demands of water, food and energy in the future?
NASA Astrophysics Data System (ADS)
Shi, Haiyun; Chen, Ji; Sivakumar, Bellie; Peart, Mervyn
2017-04-01
Regarded as a driving force in water, food and energy demands, the world's population has been increasing rapidly since the beginning of the 20th century. According to the medium-growth projection scenario of the United Nations, the world's population will reach 9.5 billion by 2050. In response to the continuously growing population during this century, water, food and energy demands have also been increasing rapidly, and social problems (e.g., water, food, and energy shortages) will most likely occur, especially if no proper management strategies are adopted. How, then, can we meet the increasing demands for water, food and energy in the future? This study focuses on the sustainable development of population, water, food, energy and dams, and its significance can be summarized as follows. First, we reveal the close association between dams and social development through analysing the related data for the period 1960-2010, and argue that construction of additional large dams will have to be considered as one of the best available options to meet the increasing water, food and energy demands in the future. We project global water, food and energy consumption and dam development for the period 2010-2050, and the results show that, compared to 2010, total water, food and energy consumption in 2050 will increase by 20%, 34% and 37%, respectively. Moreover, it is projected that an additional 4,340 dams will be constructed by 2050 all over the world. Second, we analyse the current situation of global water scarcity based on the related data representing water resources availability (per capita available water resources), dam development (the number of dams), and the level of economic development (per capita gross domestic product). At the global scale, water scarcity exists in more than 70% of the countries around the world, including 43 countries suffering from economic water scarcity and 129 countries suffering from physical water scarcity. At the continental scale, most countries in Africa, south and west Asia, and central Europe are suffering from water scarcity. Third, with comprehensive consideration of population growth as the major driving force, water resources availability as the basic supporting factor, and topography as the important constraint, we address the question of future dam development and predict the locations of future large dams around the world. The results show that there will be 1,433 large dams built in the future, mainly in the Tibetan Plateau and the Yunnan-Guizhou Plateau in Asia, the East African Plateau and the western part of Africa, the Andes Mountains and the Brazilian Plateau region in South America, the Rocky Mountains in North America, the Alps in Europe, and the Murray-Darling Basin in Oceania. Taking into account the current situation of global water scarcity, these large dams are most likely to be constructed in countries that have abundant total available water resources or per capita available water resources, no matter whether they are experiencing "economic water scarcity" or have sufficient financial support.
Rapid Neocortical Dynamics: Cellular and Network Mechanisms
Haider, Bilal; McCormick, David A.
2011-01-01
The highly interconnected local and large-scale networks of the neocortical sheet rapidly and dynamically modulate their functional connectivity according to behavioral demands. This basic operating principle of the neocortex is mediated by the continuously changing flow of excitatory and inhibitory synaptic barrages that not only control participation of neurons in networks but also define the networks themselves. The rapid control of neuronal responsiveness via synaptic bombardment is a fundamental property of cortical dynamics that may provide the basis of diverse behaviors, including sensory perception, motor integration, working memory, and attention. PMID:19409263
Machine learning and computer vision approaches for phenotypic profiling.
Grys, Ben T; Lo, Dara S; Sahin, Nil; Kraus, Oren Z; Morris, Quaid; Boone, Charles; Andrews, Brenda J
2017-01-02
With recent advances in high-throughput, automated microscopy, there has been an increased demand for effective computational strategies to analyze large-scale, image-based data. To this end, computer vision approaches have been applied to cell segmentation and feature extraction, whereas machine-learning approaches have been developed to aid in phenotypic classification and clustering of data acquired from biological images. Here, we provide an overview of the commonly used computer vision and machine-learning methods for generating and categorizing phenotypic profiles, highlighting the general biological utility of each approach. © 2017 Grys et al.
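As a concrete, hedged illustration of the machine-learning half of such a pipeline, the sketch below classifies synthetic per-cell feature vectors with a standard scikit-learn workflow; real applications would substitute features extracted from segmented images.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n_cells, n_features = 2000, 50
X = rng.normal(size=(n_cells, n_features))           # stand-in for shape/texture/intensity features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)        # two synthetic phenotype classes

clf = make_pipeline(StandardScaler(), RandomForestClassifier(n_estimators=200, random_state=0))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))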
Making visible the unseen elements of nursing.
Allen, Davina
The traditional image of nurses as caregivers needs revision but this is challenging as much nursing work cannot easily be explained. This article summarises the main findings from a large-scale study of a relatively invisible, but everyday, element of nursing practice--"organising work". This has always been a component of nursing but has recently been seen as a distraction from patient care. More must be done to recognise and communicate its value and the demands it places on staff to shape education, professional development and how nurses are viewed.
NASA Technical Reports Server (NTRS)
1975-01-01
Energy consumption in the United States has risen in response both to increasing population and to increasing levels of affluence. Depletion of domestic energy reserves requires consumption modulation, production of fossil fuels, more efficient conversion techniques, and large-scale transitions to non-fossil fuel energy sources. Widening disparity between the wealthy and poor nations of the world contributes to trends that increase the likelihood of group action by the lesser developed countries to achieve political and economic goals. The formation of anticartel cartels is envisioned.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furuuchi, Kazuyuki; Sperling, Marcus, E-mail: kazuyuki.furuuchi@manipal.edu, E-mail: marcus.sperling@univie.ac.at
2017-05-01
We study quantum tunnelling in Dante's Inferno model of large field inflation. Such a tunnelling process, which will terminate inflation, becomes problematic if the tunnelling rate is rapid compared to the Hubble time scale at the time of inflation. Consequently, we constrain the parameter space of Dante's Inferno model by demanding a suppressed tunnelling rate during inflation. The constraints are derived and explicit numerical bounds are provided for representative examples. Our considerations are at the level of an effective field theory; hence, the presented constraints have to hold regardless of any UV completion.
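The abstract does not state the constraint explicitly; one conventional way to express a suppressed tunnelling rate during inflation (an assumed form, not quoted from the paper) is that the decay rate per unit four-volume be small compared with the Hubble scale:

\[
  \frac{\Gamma}{H^{4}} \ll 1 , \qquad \Gamma \sim A\, e^{-S_E} ,
\]

where S_E is the Euclidean bounce action, H the inflationary Hubble rate, and A a prefactor of mass dimension four.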
Wafer-scale growth of VO2 thin films using a combinatorial approach
Zhang, Hai-Tian; Zhang, Lei; Mukherjee, Debangshu; Zheng, Yuan-Xia; Haislmaier, Ryan C.; Alem, Nasim; Engel-Herbert, Roman
2015-01-01
Transition metal oxides offer functional properties beyond conventional semiconductors. Bridging the gap between the fundamental research frontier in oxide electronics and their realization in commercial devices demands a wafer-scale growth approach for high-quality transition metal oxide thin films. Such a method requires excellent control over the transition metal valence state to avoid performance deterioration, which has been proved challenging. Here we present a scalable growth approach that enables a precise valence state control. By creating an oxygen activity gradient across the wafer, a continuous valence state library is established to directly identify the optimal growth condition. Single-crystalline VO2 thin films have been grown on wafer scale, exhibiting more than four orders of magnitude change in resistivity across the metal-to-insulator transition. It is demonstrated that ‘electronic grade' transition metal oxide films can be realized on a large scale using a combinatorial growth approach, which can be extended to other multivalent oxide systems. PMID:26450653
Autonomous smart sensor network for full-scale structural health monitoring
NASA Astrophysics Data System (ADS)
Rice, Jennifer A.; Mechitov, Kirill A.; Spencer, B. F., Jr.; Agha, Gul A.
2010-04-01
The demands of aging infrastructure require effective methods for structural monitoring and maintenance. Wireless smart sensor networks offer the ability to enhance structural health monitoring (SHM) practices through the utilization of onboard computation to achieve distributed data management. Such an approach is scalable to the large number of sensor nodes required for high-fidelity modal analysis and damage detection. While smart sensor technology is not new, the number of full-scale SHM applications has been limited. This slow progress is due, in part, to the complex network management issues that arise when moving from a laboratory setting to a full-scale monitoring implementation. This paper presents flexible network management software that enables continuous and autonomous operation of wireless smart sensor networks for full-scale SHM applications. The software components combine sleep/wake cycling for enhanced power management with threshold detection for triggering network wide tasks, such as synchronized sensing or decentralized modal analysis, during periods of critical structural response.
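A highly simplified sketch of the sleep/wake-plus-threshold behaviour described (hypothetical threshold and timing, not the actual network management software): a node sleeps on a duty cycle, takes a brief low-power reading on each wake-up, and triggers network-wide synchronized sensing only when the response exceeds a threshold.

import random
import time

VIBRATION_THRESHOLD = 0.5    # g, hypothetical trigger level
CHECK_INTERVAL_S = 0.1       # shortened from minutes for this demo

def read_peak_acceleration():
    """Stand-in for a brief accelerometer burst read."""
    return abs(random.gauss(0.0, 0.2))

def trigger_network_sensing():
    print("threshold exceeded: waking network for synchronized sensing")

for _ in range(10):                    # duty-cycled wake/sleep loop
    peak = read_peak_acceleration()
    if peak > VIBRATION_THRESHOLD:
        trigger_network_sensing()
    time.sleep(CHECK_INTERVAL_S)       # node returns to low-power sleep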
Optimization and resilience of complex supply-demand networks
NASA Astrophysics Data System (ADS)
Zhang, Si-Ping; Huang, Zi-Gang; Dong, Jia-Qi; Eisenberg, Daniel; Seager, Thomas P.; Lai, Ying-Cheng
2015-06-01
Supply-demand processes take place on a large variety of real-world networked systems ranging from power grids and the internet to social networking and urban systems. In a modern infrastructure, supply-demand systems are constantly expanding, leading to a constant increase in the load requirement for resources and, consequently, to problems such as low efficiency, resource scarcity, and partial system failures. Under certain conditions, a global catastrophe on the scale of the whole system can occur through the dynamical process of cascading failures. We investigate optimization and resilience of time-varying supply-demand systems by constructing network models of such systems, where resources are transported from the supplier sites to users through various links. Here by optimization we mean minimization of the maximum load on links, and system resilience can be characterized using the cascading failure size of users who fail to connect with suppliers. We consider two representative classes of supply schemes: load-driven supply and fixed-fraction supply. Our findings are: (1) optimized systems are more robust, since relatively smaller cascading failures occur when triggered by external perturbation to the links; (2) a large fraction of links can be free of load if resources are directed to transport through the shortest paths; (3) redundant links can help to reroute traffic but may undesirably transmit failures and enlarge the failure size of the system; (4) the patterns of cascading failures depend strongly upon the capacity of links; (5) the specific location of the trigger determines the specific route of cascading failure, but has little effect on the final cascading size; (6) system expansion typically reduces the efficiency; and (7) when the locations of the suppliers are optimized over a long expanding period, fewer suppliers are required. These results hold for heterogeneous networks in general, providing insights into designing optimal and resilient complex supply-demand systems that expand constantly in time.
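The cascading-failure mechanism can be illustrated with a deliberately stripped-down model (a toy stand-in for the paper's network model, with made-up loads): each link's capacity is (1 + alpha) times its initial load, and a failed link's load is rerouted to a few surviving links, which may in turn overload and fail.

import numpy as np

def cascade_size(initial_load, alpha, trigger, k_reroute=5, seed=0):
    """Fail one link, reroute its load locally, and count how many links fail in total."""
    rng = np.random.default_rng(seed)
    load = np.asarray(initial_load, dtype=float).copy()
    capacity = (1.0 + alpha) * load          # alpha is the tolerance (spare capacity)
    alive = np.ones(len(load), dtype=bool)
    alive[trigger] = False
    shed = load[trigger]
    while shed > 0 and alive.any():
        targets = rng.choice(np.flatnonzero(alive),
                             size=min(k_reroute, int(alive.sum())), replace=False)
        load[targets] += shed / len(targets)             # reroute the shed load
        overloaded = alive & (load > capacity)
        shed = load[overloaded].sum()
        alive[overloaded] = False
    return int((~alive).sum())

rng = np.random.default_rng(5)
loads = rng.uniform(1.0, 3.0, size=200)
for alpha in (0.05, 0.2, 0.5):
    print(f"tolerance {alpha}: {cascade_size(loads, alpha, trigger=0)} of 200 links fail")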
Potential climatic impacts and reliability of large-scale offshore wind farms
NASA Astrophysics Data System (ADS)
Wang, Chien; Prinn, Ronald G.
2011-04-01
The vast availability of wind power has fueled substantial interest in this renewable energy source as a potential near-zero greenhouse gas emission technology for meeting future world energy needs while addressing the climate change issue. However, in order to provide even a fraction of the estimated future energy needs, a large-scale deployment of wind turbines (several million) is required. The consequent environmental impacts, and the inherent reliability of such a large-scale usage of intermittent wind power would have to be carefully assessed, in addition to the need to lower the high current unit wind power costs. Our previous study (Wang and Prinn 2010 Atmos. Chem. Phys. 10 2053) using a three-dimensional climate model suggested that a large deployment of wind turbines over land to meet about 10% of predicted world energy needs in 2100 could lead to a significant temperature increase in the lower atmosphere over the installed regions. A global-scale perturbation to the general circulation patterns as well as to the cloud and precipitation distribution was also predicted. In the later study reported here, we conducted a set of six additional model simulations using an improved climate model to further address the potential environmental and intermittency issues of large-scale deployment of offshore wind turbines for differing installation areas and spatial densities. In contrast to the previous land installation results, the offshore wind turbine installations are found to cause a surface cooling over the installed offshore regions. This cooling is due principally to the enhanced latent heat flux from the sea surface to lower atmosphere, driven by an increase in turbulent mixing caused by the wind turbines which was not entirely offset by the concurrent reduction of mean wind kinetic energy. We found that the perturbation of the large-scale deployment of offshore wind turbines to the global climate is relatively small compared to the case of land-based installations. However, the intermittency caused by the significant seasonal wind variations over several major offshore sites is substantial, and demands further options to ensure the reliability of large-scale offshore wind power. The method that we used to simulate the offshore wind turbine effect on the lower atmosphere involved simply increasing the ocean surface drag coefficient. While this method is consistent with several detailed fine-scale simulations of wind turbines, it still needs further study to ensure its validity. New field observations of actual wind turbine arrays are definitely required to provide ultimate validation of the model predictions presented here.
Supply-demand balance in outward-directed networks and Kleiber's law.
Painter, Page R
2005-11-10
Recent theories have attempted to derive the value of the exponent alpha in the allometric formula for scaling of basal metabolic rate from the properties of distribution network models for arteries and capillaries. It has recently been stated that a basic theorem relating the sum of nutrient currents to the specific nutrient uptake rate, together with a relationship claimed to be required in order to match nutrient supply to nutrient demand in 3-dimensional outward-directed networks, leads to Kleiber's law (b = 3/4). The validity of the supply-demand matching principle and the assumptions required to prove the basic theorem are assessed. The supply-demand principle is evaluated by examining the supply term and the demand term in outward-directed lattice models of nutrient and water distribution systems and by applying the principle to fractal-like models of mammalian arterial systems. Application of the supply-demand principle to bifurcating fractal-like networks that are outward-directed does not predict 3/4-power scaling, and evaluation of water distribution system models shows that the matching principle does not match supply to demand in such systems. Furthermore, proof of the basic theorem is shown to require that the covariance of nutrient uptake and current path length is 0, an assumption unlikely to be true in mammalian arterial systems. The supply-demand matching principle does not lead to a satisfactory explanation for the approximately 3/4-power scaling of mammalian basal metabolic rate.
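For reference, the allometric relation at issue, written in the usual form (the abstract calls the exponent alpha; Kleiber's law corresponds to the value below):

\[
  B = B_{0}\, M^{b} , \qquad b = \tfrac{3}{4} \ \text{(Kleiber's law)} ,
\]

where B is the basal metabolic rate, M the body mass and B_0 a normalisation constant; the paper's point is that the supply-demand matching principle does not by itself yield b = 3/4.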
NASA Astrophysics Data System (ADS)
Ho, Michelle; Lall, Upmanu; Sun, Xun; Cook, Edward
2017-04-01
Large-scale water storage infrastructure in the Conterminous United States (CONUS) provides a means of regulating the temporal variability in water supply, with storage capacities ranging from seasonal storage in the wetter east to multi-annual and decadal-scale storage in the drier west. Regional differences in water availability across the CONUS provide opportunities for optimizing water-dependent economic activities, such as food and energy production, through storage and transportation. However, the ability to sufficiently regulate water supplies into the future is compromised by inadequate monitoring of non-federally-owned dams, which make up around 97% of all dams. Furthermore, many of these dams are reaching or have exceeded their economic design life. Understanding the role of dams in the current and future landscape of water requirements in the CONUS is needed to prioritize dam safety remediation or identify where redundant dams may be removed. A national water assessment and planning process is needed for addressing water requirements, accounting for regional differences in water supply and demand, and the role of dams in such a landscape. Most dams in the CONUS were designed without knowledge of the devastating floods and prolonged droughts detected in multi-centennial paleoclimate records, consideration of projected climate change, or consideration of optimal operation across large-scale regions. As a step towards informing water supply across the CONUS, we present a paleoclimate reconstruction of annual streamflow across the CONUS over the past 555 years, using a spatially and temporally complete paleoclimate record of summer drought across the CONUS targeting a set of US Geological Survey streamflow sites. The spatial and temporal structures of national streamflow variability are analyzed using hierarchical clustering, principal component analysis, and wavelet analyses. The reconstructions show signals of contemporary droughts such as the Dust Bowl (1930s) and 1950s droughts. Decadal-scale variability was detected in the late 1900s in the western US; however, similar modes of temporal variability were rarely present prior to the 1950s. The 20th century featured longer wet spells and shorter dry spells compared with the preceding 450 years. Streamflow in the Pacific Northwest and Northeast is negatively correlated with the central US, suggesting the potential to mitigate some drought impacts by balancing economic activities and insurance pools across these regions during major droughts. The converging issues of a slowly growing US population, evolving demands for food, energy, and water, aging dams, and reduced water storage capacity through decommissioning and sedimentation highlight the pressing need for a national water assessment and a subsequent national water plan. There are many factors that need to be understood in order to appropriately assess dam and reservoir requirements across the CONUS and to improve water use and flood protection efficiency. In addition to historical and paleoclimate-informed surface water supply, factors requiring consideration in planning for future dam and reservoir infrastructure include: the role of conjunctive surface and groundwater storage and use; basin-scale operational strategies to balance sectoral water demand; projections of surface water supply; projections of regional water demands; impacts of water conservation; and the influence of water policy and financial instruments.
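As an illustration of the clustering step (synthetic series rather than the actual reconstructions), the sketch below groups streamflow sites by the similarity of their annual flow histories using hierarchical clustering on a correlation-based distance.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(11)
n_years = 555
common_west = rng.normal(size=n_years)           # shared "western" signal
common_east = rng.normal(size=n_years)           # shared "eastern" signal
flows = np.vstack(
    [common_west + rng.normal(scale=0.7, size=n_years) for _ in range(10)]
    + [common_east + rng.normal(scale=0.7, size=n_years) for _ in range(10)]
)

corr = np.corrcoef(flows)                        # site-by-site correlation matrix
distance = squareform(1.0 - corr, checks=False)  # condensed correlation distance
tree = linkage(distance, method="average")
clusters = fcluster(tree, t=2, criterion="maxclust")
print("cluster labels per site:", clusters)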
NASA Astrophysics Data System (ADS)
Neverre, Noémie; Dumas, Patrice; Nassopoulos, Hypatia
2016-04-01
Global changes are expected to exacerbate water scarcity issues in the Mediterranean region in the next decades. In this work, we investigate the impacts of reservoir operation rules based on an economic criterion. We examine whether they can help reduce the costs of water scarcity, and whether they become more relevant under future climatic and socioeconomic conditions. We develop an original hydroeconomic model able to compare future water supply and demand on a large scale, while representing river basin heterogeneity. On the demand side, we focus on the two main sectors of water use: the irrigation and domestic sectors. Demands are projected in terms of both quantity and economic value. Irrigation requirements are computed for 12 types of crops, at the 0.5° spatial resolution, under future climatic conditions (A1B scenario). The computation of the economic benefits of irrigation water is based on a yield comparison approach between rainfed and irrigated crops. For the domestic sector, we project the combined effects of demographic growth, economic development and water cost evolution on future demands. The economic value of domestic water is defined as the economic surplus. On the supply side, we evaluate the impacts of climate change on water inflows to the reservoirs. Operating rules of the reservoirs are set up using a parameterisation-simulation-optimisation approach. The objective is to maximise water benefits. We introduce prudential parametric rules in order to take into account spatial and temporal trade-offs. The methodology is applied to Algeria at the 2050 horizon. Overall, our results show that the supply-demand imbalance and its costs will increase in most basins under future climatic and socioeconomic conditions. Our results suggest that the benefits of operating rules based on economic criteria are not unequivocally increased with global changes: in some basins the positive impact of economic prioritisation is higher under future conditions, but in other basins it is higher under historical conditions. Global changes may be an incentive to use valuation in operating rules in some basins. In other basins, the benefits of reservoir management based on economic criteria are less pronounced; in this case, trade-offs could arise over whether or not to implement economically based operation policies. Given its generic nature and low data requirements, the framework developed could be implemented in other regions concerned with water scarcity and its cost, or extended to global coverage. Water policies at the country or regional level could be assessed.
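The parameterisation-simulation-optimisation loop can be sketched as follows (a toy single-reservoir system with made-up economics, not the Algerian application): a one-parameter hedging rule is simulated over an inflow series and the parameter value that minimises total scarcity cost is retained.

import numpy as np

rng = np.random.default_rng(13)
inflow = rng.gamma(shape=2.0, scale=30.0, size=360)   # monthly inflow, hm3 (illustrative)
demand = np.full(360, 55.0)                           # monthly demand, hm3 (illustrative)
CAPACITY = 500.0                                      # reservoir capacity, hm3
SCARCITY_COST = 2.0                                   # cost per hm3 of unmet demand

def simulate_cost(hedging_fraction):
    """Release at most a fraction of current storage each month; tally scarcity cost."""
    storage, cost = CAPACITY / 2, 0.0
    for q, d in zip(inflow, demand):
        storage = min(storage + q, CAPACITY)          # spills are lost
        release = min(d, hedging_fraction * storage)
        storage -= release
        cost += SCARCITY_COST * (d - release)
    return cost

fractions = np.linspace(0.05, 1.0, 96)                # parameterisation step
costs = [simulate_cost(f) for f in fractions]         # simulation step
best = fractions[int(np.argmin(costs))]               # optimisation step
print(f"best hedging fraction: {best:.2f}, total scarcity cost: {min(costs):.0f}")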
NASA Astrophysics Data System (ADS)
Troy, Tara J.; Ines, Amor V. M.; Lall, Upmanu; Robertson, Andrew W.
2013-04-01
Large-scale hydrologic models, such as the Variable Infiltration Capacity (VIC) model, are used for a variety of studies, from drought monitoring to projecting the potential impact of climate change on the hydrologic cycle decades in advance. The majority of these models simulates the natural hydrological cycle and neglects the effects of human activities such as irrigation, which can result in streamflow withdrawals and increased evapotranspiration. In some parts of the world, these activities do not significantly affect the hydrologic cycle, but this is not the case in south Asia where irrigated agriculture has a large water footprint. To address this gap, we incorporate a crop growth model and irrigation model into the VIC model in order to simulate the impacts of irrigated and rainfed agriculture on the hydrologic cycle over south Asia (Indus, Ganges, and Brahmaputra basin and peninsular India). The crop growth model responds to climate signals, including temperature and water stress, to simulate the growth of maize, wheat, rice, and millet. For the primarily rainfed maize crop, the crop growth model shows good correlation with observed All-India yields (0.7) with lower correlations for the irrigated wheat and rice crops (0.4). The difference in correlation is because irrigation provides a buffer against climate conditions, so that rainfed crop growth is more tied to climate than irrigated crop growth. The irrigation water demands induce hydrologic water stress in significant parts of the region, particularly in the Indus, with the streamflow unable to meet the irrigation demands. Although rainfall can vary significantly in south Asia, we find that water scarcity is largely chronic due to the irrigation demands rather than being intermittent due to climate variability.
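For orientation, irrigation water demand of the kind the coupled model computes can be approximated, very roughly, as crop water need minus effective rainfall; the sketch below uses illustrative FAO-56-style crop coefficients and climate values, not the study's VIC/crop-model parameterisation.

import numpy as np

kc = {"rice": 1.15, "wheat": 0.95, "maize": 1.05}     # mid-season crop coefficients (illustrative)
et0 = np.array([5.2, 5.6, 5.9, 5.4])                  # reference ET, mm/day, four months
rain = np.array([1.0, 3.5, 6.0, 4.0])                 # effective rainfall, mm/day
days_per_month = 30

def seasonal_irrigation_demand(crop):
    """Crop water need minus effective rain, never negative, summed over the season (mm)."""
    daily_deficit = np.maximum(kc[crop] * et0 - rain, 0.0)
    return daily_deficit.sum() * days_per_month

for crop in kc:
    print(f"{crop}: {seasonal_irrigation_demand(crop):.0f} mm per season")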
Bertrand, Jane T; Njeuhmeli, Emmanuel; Forsythe, Steven; Mattison, Sarah K; Mahler, Hally; Hankins, Catherine A
2011-01-01
This paper proposes an approach to estimating the costs of demand creation for voluntary medical male circumcision (VMMC) scale-up in 13 countries of eastern and southern Africa. It addresses two key questions: (1) what are the elements of a standardized package for demand creation? And (2) what challenges exist and must be taken into account in estimating the costs of demand creation? We conducted a key informant study on VMMC demand creation using purposive sampling to recruit seven people who provide technical assistance to government programs and manage budgets for VMMC demand creation. Key informants provided their views on the important elements of VMMC demand creation and the most effective funding allocations across different types of communication approaches (e.g., mass media, small media, outreach/mobilization). The key finding was the wide range of views, suggesting that a standard package of core demand creation elements would not be universally applicable. This underscored the importance of tailoring demand creation strategies and estimates to specific country contexts before estimating costs. The key informant interviews, supplemented by the researchers' field experience, identified these issues to be addressed in future costing exercises: variations in the cost of VMMC demand creation activities by country and program, decisions about the quality and comprehensiveness of programming, and lack of data on critical elements needed to "trigger the decision" among eligible men. Based on this study's findings, we propose a seven-step methodological approach to estimate the cost of VMMC scale-up in a priority country, based on our key assumptions. However, further work is needed to better understand core components of a demand creation package and how to cost them. Notwithstanding the methodological challenges, estimating the cost of demand creation remains an essential element in deriving estimates of the total costs for VMMC scale-up in eastern and southern Africa.
Phytoremediation of industrial mines wastewater using water hyacinth.
Saha, Priyanka; Shinde, Omkar; Sarkar, Supriya
2017-01-02
The wastewater at the Sukinda chromite mines (SCM) area of Orissa (India) showed high levels of toxic hexavalent chromium (Cr VI). Wastewater from chromium-contaminated mines poses a potential threat to the biotic community in the vicinity. The aim of the present investigation is to develop a suitable phytoremediation technology for the effective removal of toxic hexavalent chromium from mines wastewater. A water hyacinth species, Eichhornia crassipes, was chosen to remediate the problem of Cr (VI) pollution in wastewater. It was observed that this plant was able to remove 99.5% of the Cr (VI) from the processed water of SCM in 15 days. This aquatic plant not only removed hexavalent Cr, but was also capable of reducing total dissolved solids (TDS), biological oxygen demand (BOD), chemical oxygen demand (COD), and other constituents of the water. A large-scale experiment was also performed using 100 L of water from SCM, and the same removal efficiency was achieved.
Holden, Børge; Gitlesen, Jens Petter
2008-01-01
In addition to explaining challenging behaviour by way of behaviour analytic, functional analyses, challenging behaviour is increasingly explained by way of psychiatric symptomatology. According to some researchers, the two approaches complement each other, as psychiatric symptomatology may form a motivational basis for the individual's response to more immediate environmental challenges, like deprivation and aversive conditions. The most common example may be that depressive mood may render task demands aversive. Consequently, the person may show escape-motivated challenging behaviour in the presence of demands. The question becomes whether, or to what extent, relationships between psychiatric symptomatologies and particular functions of challenging behaviour exist. In the present, preliminary study, PAS-ADD checklist, a psychiatric screening instrument, and motivation assessment scale (MAS) were employed in order to investigate this issue. The results show that symptomatologies are largely unrelated to particular behavioural functions. Practical implications are discussed.
Making the Business Case for Regional and National Water Data Collection
NASA Astrophysics Data System (ADS)
Pinero, E.
2017-12-01
Water-related risks are an increasing concern for organizations that either depend on water use or are responsible for water services provision. Yet this concern does not always translate into a business case to support large scale water data collection. One reason is that water demand varies across sectors and physical settings. There is typically no single parameter or reason for which a given entity would be interested in national or even regional scale data. Even for public sector entities, water issues are local and their jurisdiction does not span regional scale coverage. Therefore, to make the case for adequate data collection, not only are technology and web platforms necessary, but one also needs a compelling business case. One way to make the case will involve raising awareness of the critical cross-cutting role of water such that sectors see the need for water data to support sustainability of other systems, such as energy, food, and resilience. Another factor will be understanding the full life cycle role of water, especially in the supply chain, and that there are many variables that drive water demand. Such an understanding will make clearer the need for more regional scale understanding. This will begin to address the apparent catch-22 that there is a need for data to understand the scope of the challenge, but until the scope of the challenge is understood, there is no compelling business case to collect data. Examples, such as the Alliance for Water Stewardship standard and the CEO Water Mandate Water Action Hub, will be discussed to illustrate recent innovations in making a case for efficient collection of watershed scale and regional data.
Affordable and accurate large-scale hybrid-functional calculations on GPU-accelerated supercomputers
NASA Astrophysics Data System (ADS)
Ratcliff, Laura E.; Degomme, A.; Flores-Livas, José A.; Goedecker, Stefan; Genovese, Luigi
2018-03-01
Performing high accuracy hybrid functional calculations for condensed matter systems containing a large number of atoms is at present computationally very demanding or even out of reach if high quality basis sets are used. We present a highly optimized multiple graphics processing unit implementation of the exact exchange operator which allows one to perform fast hybrid functional density-functional theory (DFT) calculations with systematic basis sets without additional approximations for up to a thousand atoms. With this method hybrid DFT calculations of high quality become accessible on state-of-the-art supercomputers within a time-to-solution that is of the same order of magnitude as traditional semilocal-GGA functionals. The method is implemented in a portable open-source library.
Development of novel IVD assays: a manufacturer's perspective.
Metcalfe, Thomas A
2010-01-01
IVD manufacturers are heavily reliant on novel IVD assays to fuel their growth and drive innovation within the industry. They represent a key part of the IVD industry's value proposition to customers and the healthcare industry in general, driving product differentiation, helping to create demand for new systems and generating incremental revenue. However, the discovery of novel biomarkers and their qualification for a specific clinical purpose is a high-risk undertaking, and the large, risky investments associated with doing this on a large scale are incompatible with IVD manufacturers' business models. This article describes the sources of novel IVD assays, the processes for discovering and qualifying novel assays and the reliance of IVD manufacturers on collaborations and in-licensing to source new IVD assays for their platforms.
Choudhury, Aziz Ahmed; Khanam, Rasheda; Moin, Syed Mamun Ibne; Ahmed, Salahuddin; Begum, Nazma; Shoma, Nurun Naher; Quaiyum, Md Abdul; Baqui, Abdullah H.
2017-01-01
Background According to the Bangladesh Demographic and Health Survey 2014, only approximately 37 percent of women deliver in a health facility. Among the eight administrative divisions of Bangladesh, the facility delivery rate is lowest in the Sylhet division (22.6 percent) where we assessed the effect of integrated supply- and demand-side interventions on the facility-based delivery rate. Methods Population-based cohort data of pregnant women from an ongoing maternal and newborn health improvement study being conducted in a population of ~120,000 in Sylhet district were used. The study required collection and processing of biological samples immediately after delivery. Therefore, the project assembled various strategies to increase institutional delivery rates. The supply-side intervention included capacity expansion of the health facilities through service provider refresher training, 24/7 service coverage, additions of drugs and supplies, and incentives to the providers. The demand-side component involved financial incentives to cover expenses, a provision of emergency transport, and referral support to a tertiary-level hospital. We conducted a before-and-after observational study to assess the impact of the intervention in a total of 1,861 deliveries between December 2014 and November 2016. Results Overall, implementation of the intervention package was associated with 52.6 percentage point increase in the proportions of facility-based deliveries from a baseline rate of 25.0 percent to 77.6 percent in 24 months. We observed lower rates of institutional deliveries when only supply-side interventions were implemented. The proportion rose to 47.1 percent and continued increasing when the project emphasized addressing the financial barriers to accessing obstetric care in a health facility. Conclusions An integrated supply- and demand-side intervention was associated with a substantial increase in institutional delivery. The package can be tailored to identify which combination of interventions may produce the optimum result and be scaled. Rigorous implementation research studies are needed to draw confident conclusions and to provide information about the costs, feasibility for scale-up and sustainability. PMID:29073229
From natural to artificial photosynthesis.
Barber, James; Tran, Phong D
2013-04-06
Demand for energy is projected to increase at least twofold by mid-century relative to the present global consumption because of predicted population and economic growth. This demand could be met, in principle, from fossil energy resources, particularly coal. However, the cumulative nature of carbon dioxide (CO2) emissions means that stabilizing atmospheric CO2 levels at just twice their pre-anthropogenic values by mid-century will be extremely challenging, requiring invention, development and deployment of schemes for carbon-neutral energy production on a scale commensurate with, or larger than, the entire present-day energy supply from all sources combined. Among renewable and exploitable energy resources, nuclear fusion energy and solar energy are by far the largest. However, in both cases, technological breakthroughs are required, with nuclear fusion being very difficult, if not impossible, on the scale required. On the other hand, 1 h of sunlight falling on our planet is equivalent to all the energy consumed by humans in an entire year. If solar energy is to be a major primary energy source, then it must be stored and despatched on demand to the end user. An especially attractive approach is to store solar energy in the form of chemical bonds, as occurs in natural photosynthesis. However, a technology is needed which has a year-round average conversion efficiency significantly higher than currently available from natural photosynthesis, so as to reduce land-area requirements and to be independent of food production. Therefore, the scientific challenge is to construct an 'artificial leaf' able to efficiently capture and convert solar energy and then store it in the form of chemical bonds of a high-energy-density fuel such as hydrogen, while at the same time producing oxygen from water. Realistically, the efficiency target for such a technology must be 10 per cent or better. Here, we review the molecular details of the energy-capturing reactions of natural photosynthesis, particularly the water-splitting reaction of photosystem II and the hydrogen-generating reaction of hydrogenases. We then describe how these two reactions are being mimicked in physico-chemical-based catalytic or electrocatalytic systems, with the challenge of creating a large-scale, robust and efficient artificial leaf technology.
(Un)certainty in climate change impacts on global energy consumption
NASA Astrophysics Data System (ADS)
van Ruijven, B. J.; De Cian, E.; Sue Wing, I.
2017-12-01
Climate change is expected to have an influence on the energy sector, especially on energy demand. For many locations, this change in energy demand is a balance between an increase of demand for space cooling and a decrease of space heating demand. We perform a large-scale uncertainty analysis to characterize climate change risk on energy consumption as driven by climate and socioeconomic uncertainty. We combine a dynamic econometric model [1] with multiple realizations of temperature projections from all 21 CMIP5 models (from the NASA Earth Exchange Global Daily Downscaled Projections [2]) under moderate (RCP4.5) and vigorous (RCP8.5) warming. Global spatial population projections for five SSPs are combined with GDP projections to construct scenarios for future energy demand driven by socioeconomic change. Between the climate models, we find a median global increase in climate-related energy demand of around 24% by 2050 under RCP8.5, with an interquartile range of 18-38%. Most climate models agree on increases in energy demand of more than 25% or 50% in tropical regions, the Southern USA and Southern China (see Figure). With respect to socioeconomic scenarios, we find wide variations between the SSPs for the number of people in low-income countries who are exposed to increases in energy demand. Figure: Number of models that agree on total climate-related energy consumption to increase or decrease by more than 0, 10, 25 or 50% by 2050 under RCP8.5 and SSP5 as a result of the CMIP5 ensemble of temperature projections. References: [1] De Cian, E. & Sue Wing, I. Global Energy Demand in a Warming Climate. (FEEM, 2016). [2] Thrasher, B., Maurer, E. P., McKellar, C. & Duffy, P. B. Technical Note: Bias correcting climate model simulated daily temperature extremes with quantile mapping. Hydrol Earth Syst Sci 16, 3309-3314 (2012).
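To make the ensemble statistics concrete, here is a minimal Python sketch that computes the median, interquartile range, and model-agreement fractions for a set of projected demand changes; the numbers are placeholders, not the study's CMIP5-derived values.

```python
import numpy as np

# Hypothetical percentage changes in climate-related energy demand by 2050
# under RCP8.5, one value per climate-model realization (illustrative only).
demand_change_pct = np.array([18.0, 21.5, 24.0, 26.3, 30.1, 38.0, 22.7,
                              19.4, 27.8, 33.2, 24.9, 20.8, 36.5, 23.1])

median = np.median(demand_change_pct)
q25, q75 = np.percentile(demand_change_pct, [25, 75])
print(f"median increase: {median:.1f}%")
print(f"interquartile range: {q25:.1f}% to {q75:.1f}%")

# Model agreement: fraction of models projecting an increase above a threshold,
# analogous to the agreement counts summarized in the figure caption.
for threshold in (0, 10, 25, 50):
    agree = np.mean(demand_change_pct > threshold)
    print(f"> {threshold}%: {agree:.0%} of models agree")
```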
Towards Portable Large-Scale Image Processing with High-Performance Computing.
Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A
2018-05-03
High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), the HPC facility at Vanderbilt University. The initial deployment was native (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with varying hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein, we describe recent innovations using containerization techniques with XNAT/DAX which are used to isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from system level to application level, (2) flexible and dynamic software development and expansion, and (3) scalable spider deployment compatible with HPC clusters and local workstations.
Jansen, Tessa; Zwaanswijk, Marieke; Hek, Karin; de Bakker, Dinny
2015-05-06
In the Netherlands, primary out-of-hours (OOH) care is provided by large scale General Practitioner (GP) cooperatives. GP cooperatives can be contacted by patients living in the area surrounding the GP cooperative (catchment area) at hours when the patient's own general practice is closed. The frequency of primary OOH care use substantially differs between GP cooperative catchment areas. To enable a better match between supply and demand of OOH services, understanding of the factors associated with primary OOH care use is essential. The present study evaluated the contribution of sociodemographic composition of the neighbourhood in explaining differences in primary OOH care use between GP cooperative catchment areas. Data about patients' contacts with primary OOH services (n = 1,668,047) were derived from routine electronic health records of 21 GP cooperatives participating in the NIVEL Primary Care Database in 2012. The study sample is representative for the Dutch population (for age and gender). Data were matched with sociodemographic characteristics (e.g. gender, age, low-income status, degree of urbanisation) on postcode level. Multilevel linear regression models included postcode level (first level), nested within GP cooperative catchment areas (second level). We investigated whether contacts in primary OOH care were associated with neighbourhood sociodemographic characteristics. The demand of primary OOH care was significantly higher in neighbourhoods with more women, low-income households, non-Western immigrants, neighbourhoods with a higher degree of urbanisation, and low neighbourhood socioeconomic status. Conversely, lower demand was associated with neighbourhoods with more 5 to 24 year old inhabitants. Sociodemographic neighbourhood characteristics explained a large part of the variation between GP cooperatives (R-squared ranging from 8% to 52%). Nevertheless, the multilevel models also showed that a considerable amount of variation in demand between GP cooperatives remained unexplained by sociodemographic characteristics, particularly regarding high-urgency contacts. Although part of the variation between GP cooperatives could not be attributed to neighbourhood characteristics, the sociodemographic composition of the neighbourhood is a fair predictor of the demand of primary OOH care. Accordingly, this study provides a useful starting point for an improved planning of the supply of primary OOH care.
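As an illustration of the kind of multilevel model described here, the sketch below fits a random-intercept linear model (postcodes nested within GP cooperatives) with statsmodels; the data and variable names are synthetic stand-ins, not the NIVEL records.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_coop, n_postcode = 21, 40  # cooperatives and postcodes per cooperative (illustrative)

# Synthetic postcode-level data nested within GP cooperative catchment areas.
df = pd.DataFrame({
    "gp_cooperative": np.repeat(np.arange(n_coop), n_postcode),
    "pct_low_income": rng.uniform(5, 40, n_coop * n_postcode),
    "urbanisation": rng.integers(1, 6, n_coop * n_postcode),
})
coop_effect = rng.normal(0, 10, n_coop)[df["gp_cooperative"]]
df["contact_rate"] = (100 + 1.5 * df["pct_low_income"] + 4 * df["urbanisation"]
                      + coop_effect + rng.normal(0, 8, len(df)))

# Random intercept per GP cooperative, fixed effects for neighbourhood characteristics.
model = smf.mixedlm("contact_rate ~ pct_low_income + urbanisation",
                    data=df, groups=df["gp_cooperative"])
print(model.fit().summary())
```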
NASA Astrophysics Data System (ADS)
Richey, A. S.; Richey, J. E.; Tan, A.; Liu, M.; Adam, J. C.; Sokolov, V.
2015-12-01
Central Asia presents a perfect case study to understand the dynamic, and often conflicting, linkages between food, energy, and water in natural systems. The destruction of the Aral Sea is a well-known environmental disaster, largely driven by increased irrigation demand on the rivers that feed the endorheic sea. Continued reliance on these rivers, the Amu Darya and Syr Darya, often place available water resources at odds between hydropower demands upstream and irrigation requirements downstream. A combination of tools is required to understand these linkages and how they may change in the future as a function of climate change and population growth. In addition, the region is geopolitically complex as the former Soviet basin states develop management strategies to sustainably manage shared resources. This complexity increases the importance of relying upon publically available information sources and tools. Preliminary work has shown potential for the Variable Infiltration Capacity (VIC) model to recreate the natural water balance in the Amu Darya and Syr Darya basins by comparing results to total terrestrial water storage changes observed from NASA's Gravity Recovery and Climate Experiment (GRACE) satellite mission. Modeled streamflow is well correlated to observed streamflow at upstream gauges prior to the large-scale expansion of irrigation and hydropower. However, current modeled results are unable to capture the human influence of water use on downstream flow. This study examines the utility of a crop simulation model, CropSyst, to represent irrigation demand and GRACE to improve modeled streamflow estimates in the Amu Darya and Syr Darya basins. Specifically we determine crop water demand with CropSyst utilizing available data on irrigation schemes and cropping patterns. We determine how this demand can be met either by surface water, modeled by VIC with a reservoir operation scheme, and/or by groundwater derived from GRACE. Finally, we assess how the inclusion of CropSyst and groundwater to model and meet irrigation demand improves modeled streamflow from VIC throughout the basins. The results of this work are integrated into a decision support platform to assist the basin states in understanding water availability and the impact of management decisions on available resources.
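A minimal sketch of the supply-demand accounting implied here: monthly crop water demand is met first from surface water, and any shortfall is attributed to groundwater. The function and the numbers are illustrative assumptions, not the VIC/CropSyst/GRACE implementation.

```python
def allocate_irrigation(demand_mm, surface_supply_mm):
    """Split a monthly irrigation demand between surface water and groundwater.

    Simplified accounting: available surface water (e.g., modelled streamflow
    diverted for irrigation) is used first; any remaining shortfall is assumed
    to come from groundwater, analogous to the storage changes GRACE observes.
    All values are in mm over the irrigated area; names are illustrative.
    """
    from_surface = min(demand_mm, surface_supply_mm)
    from_groundwater = demand_mm - from_surface
    return from_surface, from_groundwater

# Hypothetical growing-season months
monthly_demand = [40.0, 85.0, 120.0, 110.0, 60.0]   # crop water demand
monthly_surface = [55.0, 70.0, 90.0, 100.0, 80.0]   # available surface water

for month, (d, s) in enumerate(zip(monthly_demand, monthly_surface), start=1):
    sw, gw = allocate_irrigation(d, s)
    print(f"month {month}: surface {sw:.0f} mm, groundwater {gw:.0f} mm")
```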
Mixture model normalization for non-targeted gas chromatography/mass spectrometry metabolomics data.
Reisetter, Anna C; Muehlbauer, Michael J; Bain, James R; Nodzenski, Michael; Stevens, Robert D; Ilkayeva, Olga; Metzger, Boyd E; Newgard, Christopher B; Lowe, William L; Scholtens, Denise M
2017-02-02
Metabolomics offers a unique integrative perspective for health research, reflecting genetic and environmental contributions to disease-related phenotypes. Identifying robust associations in population-based or large-scale clinical studies demands large numbers of subjects and therefore sample batching for gas-chromatography/mass spectrometry (GC/MS) non-targeted assays. When run over weeks or months, technical noise due to batch and run-order threatens data interpretability. Application of existing normalization methods to metabolomics is challenged by unsatisfied modeling assumptions and, notably, failure to address batch-specific truncation of low abundance compounds. To curtail technical noise and make GC/MS metabolomics data amenable to analyses describing biologically relevant variability, we propose mixture model normalization (mixnorm) that accommodates truncated data and estimates per-metabolite batch and run-order effects using quality control samples. Mixnorm outperforms other approaches across many metrics, including improved correlation of non-targeted and targeted measurements and superior performance when metabolite detectability varies according to batch. For some metrics, particularly when truncation is less frequent for a metabolite, mean centering and median scaling demonstrate comparable performance to mixnorm. When quality control samples are systematically included in batches, mixnorm is uniquely suited to normalizing non-targeted GC/MS metabolomics data due to explicit accommodation of batch effects, run order and varying thresholds of detectability. Especially in large-scale studies, normalization is crucial for drawing accurate conclusions from non-targeted GC/MS metabolomics data.
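For orientation, the sketch below implements one of the simpler comparators mentioned in the abstract (per-batch median scaling anchored to quality control samples), not the mixnorm mixture model itself; the data layout is an assumption.

```python
import numpy as np
import pandas as pd

def median_scale_by_batch(abundance, batch, is_qc):
    """Per-metabolite, per-batch median scaling using quality control samples.

    `abundance` is a samples x metabolites DataFrame on the log scale;
    `batch` and `is_qc` are pandas Series aligned with its rows. Each batch
    is shifted so that its QC medians line up with the overall QC medians.
    This is a simplified baseline, not the mixnorm mixture model.
    """
    out = abundance.copy()
    overall_qc_medians = abundance.loc[is_qc].median()
    for b in np.unique(batch):
        in_batch = batch == b
        qc_medians = abundance.loc[in_batch & is_qc].median()
        out.loc[in_batch] = abundance.loc[in_batch] - qc_medians + overall_qc_medians
    return out
```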
Inkjet-Spray Hybrid Printing for 3D Freeform Fabrication of Multilayered Hydrogel Structures.
Yoon, Sejeong; Park, Ju An; Lee, Hwa-Rim; Yoon, Woong Hee; Hwang, Dong Soo; Jung, Sungjune
2018-04-30
Here, a new bioprinting process is reported that combines drop-on-demand inkjet printing with a spray-coating technique, enabling the high-resolution, high-speed, and freeform fabrication of large-scale cell-laden hydrogel structures. Hydrogel structures with various shapes and composed of different materials, including alginate, cellulose nanofiber, and fibrinogen, are fabricated using the inkjet-spray printing. To manufacture cell-friendly hydrogel structures with controllable stiffness, gelatine methacryloyl is saponified to stabilize jet formation and is subsequently mixed with sodium alginate to prepare blend inks. The hydrogels crosslinked from the blend inks are characterized by assessing physical properties, including the microstructure and mechanical stiffness, and cellular responses, including the cell viability, metabolic activity, and functionality of human dermal fibroblasts within the hydrogel. Cell-laden hydrogel structures are generated on a large scale, and collagen type I secretion and spreading of cells within the hydrogels are assessed. The results demonstrate that the inkjet-spray printing system will ensure the formation of cell-laden hydrogel structures with high shape fidelity in a rapid and reliable manner. Ultimately, the proposed printing technique and blend bioink are expected to be used to fabricate 3D laminated, large-scale tissue equivalents that potentially mimic the function of native tissues. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Hua, Wei-Bo; Guo, Xiao-Dong; Zheng, Zhuo; Wang, Yan-Jie; Zhong, Ben-He; Fang, Baizeng; Wang, Jia-Zhao; Chou, Shu-Lei; Liu, Heng
2015-02-01
Developing advanced electrode materials that deliver high energy at ultra-fast charge and discharge rates is crucial to meet the increasing large-scale market demand for high-power lithium ion batteries (LIBs). A three-dimensional (3D) nanoflower structure is successfully developed in the large-scale synthesis of LiNi1/3Co1/3Mn1/3O2 material for the first time. Fast co-precipitation is the key technique for preparing the nanoflower structure in our method. After heat treatment, the obtained LiNi1/3Co1/3Mn1/3O2 nanoflowers (NL333) present a pristine flower-like nano-architecture and provide fast pathways for the transport of Li-ions and electrons. As a cathode material in a LIB, the prepared NL333 electrode demonstrates an outstanding high-rate capability. In particular, in a narrow voltage range of 2.7-4.3 V, the discharge capacity at an ultra-fast charge-discharge rate (20C) is up to 126 mAh g-1, which reaches 78% of that at 0.2C and is much higher than that (i.e., 44.17%) of the traditional bulk LiNi1/3Co1/3Mn1/3O2.
A study of residence time distribution using radiotracer technique in the large scale plant facility
NASA Astrophysics Data System (ADS)
Wetchagarun, S.; Tippayakul, C.; Petchrak, A.; Sukrod, K.; Khoonkamjorn, P.
2017-06-01
As the demand for troubleshooting of large industrial plants increases, radiotracer techniques, which have the capability to provide fast, online and effective detection of plant problems, have been continually developed. One of the good potential applications of radiotracers for troubleshooting in a process plant is the analysis of the Residence Time Distribution (RTD). In this paper, a study of RTD in a large scale plant facility using the radiotracer technique is presented. The objective of this work is to gain experience on RTD analysis using the radiotracer technique in a “larger than laboratory” scale plant setup which is comparable to real industrial applications. The experiment was carried out at the sedimentation tank in the water treatment facility of the Thailand Institute of Nuclear Technology (Public Organization). Br-82 was selected for this work due to its chemical properties, its suitable half-life and its on-site availability. NH4Br in the form of aqueous solution was injected into the system as the radiotracer. Six NaI detectors were placed along the pipelines and at the tank in order to determine the RTD of the system. The RTD and the Mean Residence Time (MRT) of the tank were analysed and calculated from the measured data. The experience and knowledge attained from this study are important for extending this technique to industrial facilities in the future.
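The mean residence time follows from the first moment of the normalized tracer response curve; a minimal sketch with a synthetic detector response (not the Br-82 measurements) is given below.

```python
import numpy as np

# Hypothetical detector response (counts vs. time) from a pulse injection of
# tracer; illustrative values only.
t = np.linspace(0, 300, 61)                     # time, minutes
c = np.exp(-(t - 90.0) ** 2 / (2 * 35.0 ** 2))  # synthetic response curve

# Normalize to obtain the residence time distribution E(t),
# then compute the mean residence time as its first moment.
e = c / np.trapz(c, t)
mrt = np.trapz(t * e, t)
print(f"mean residence time ≈ {mrt:.1f} min")
```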
Horiguchi, Hiromasa; Yasunaga, Hideo; Hashimoto, Hideki; Ohe, Kazuhiko
2012-12-22
Secondary use of large scale administrative data is increasingly popular in health services and clinical research, where a user-friendly tool for data management is in great demand. MapReduce technology such as Hadoop is a promising tool for this purpose, though its use has been limited by the lack of user-friendly functions for transforming large scale data into wide table format, where each subject is represented by one row, for use in health services and clinical research. Since the original specification of Pig provides very few functions for column field management, we have developed a novel system called GroupFilterFormat to handle the definition of field and data content based on a Pig Latin script. We have also developed, as an open-source project, several user-defined functions to transform the table format using GroupFilterFormat and to deal with processing that considers date conditions. Having prepared dummy discharge summary data for 2.3 million inpatients and medical activity log data for 950 million events, we used the Elastic Compute Cloud environment provided by Amazon Inc. to execute processing speed and scaling benchmarks. In the speed benchmark test, the response time was significantly reduced and a linear relationship was observed between the quantity of data and processing time in both a small and a very large dataset. The scaling benchmark test showed clear scalability. In our system, doubling the number of nodes resulted in a 47% decrease in processing time. Our newly developed system is widely accessible as an open resource. This system is very simple and easy to use for researchers who are accustomed to using declarative command syntax for commercial statistical software and Structured Query Language. Although our system needs further sophistication to allow more flexibility in scripts and to improve efficiency in data processing, it shows promise in facilitating the application of MapReduce technology to efficient data processing with large scale administrative data in health services and clinical research.
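The long-to-wide transformation at the heart of GroupFilterFormat can be illustrated, on a toy scale, with a pandas pivot; this is only an analogy for the Pig Latin/Hadoop processing described above, with made-up columns.

```python
import pandas as pd

# Hypothetical medical activity log in long format: one row per event.
log = pd.DataFrame({
    "patient_id": [1, 1, 2, 2, 2],
    "item":       ["drug_A", "lab_X", "drug_A", "drug_B", "lab_X"],
    "value":      [2.0, 5.1, 1.0, 3.0, 4.7],
})

# Wide table: one row per subject, one column per item, as expected by
# conventional statistical software. A pandas pivot is only a small-scale
# analogue of what GroupFilterFormat does with Pig Latin on Hadoop.
wide = log.pivot_table(index="patient_id", columns="item",
                       values="value", aggfunc="sum").reset_index()
print(wide)
```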
Impact of Large Scale Energy Efficiency Programs On Consumer Tariffs and Utility Finances in India
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abhyankar, Nikit; Phadke, Amol
2011-01-20
Large-scale EE programs would modestly increase tariffs but reduce consumers' electricity bills significantly. However, the primary benefit of EE programs is a significant reduction in power shortages, which might make these programs politically acceptable even if tariffs increase. To increase political support, utilities could pursue programs that would result in minimal tariff increases. This can be achieved in four ways: (a) focus only on low-cost programs (such as replacing electric water heaters with gas water heaters); (b) sell power conserved through the EE program to the market at a price higher than the cost of peak power purchase; (c) focus on programs where a partial utility subsidy of incremental capital cost might work and (d) increase the number of participant consumers by offering a basket of EE programs to fit all consumer subcategories and tariff tiers. Large scale EE programs can result in consistently negative cash flows and significantly erode the utility's overall profitability. In case the utility is facing shortages, the cash flow is very sensitive to the marginal tariff of the unmet demand. This will have an important bearing on the choice of EE programs in Indian states where low-paying rural and agricultural consumers form the majority of the unmet demand. These findings clearly call for a flexible, sustainable solution to the cash-flow management issue. One option is to include a mechanism like FAC in the utility incentive mechanism. Another sustainable solution might be to have the net program cost and revenue loss built into utility's revenue requirement and thus into consumer tariffs up front. However, the latter approach requires institutionalization of EE as a resource. The utility incentive mechanisms would be able to address the utility disincentive of forgone long-run return but have a minor impact on consumer benefits. Fundamentally, providing incentives for EE programs to make them comparable to supply-side investments is a way of moving the electricity sector toward a model focused on providing energy services rather than providing electricity.
A new fast scanning system for the measurement of large angle tracks in nuclear emulsions
NASA Astrophysics Data System (ADS)
Alexandrov, A.; Buonaura, A.; Consiglio, L.; D'Ambrosio, N.; De Lellis, G.; Di Crescenzo, A.; Di Marco, N.; Galati, G.; Lauria, A.; Montesi, M. C.; Pupilli, F.; Shchedrina, T.; Tioukov, V.; Vladymyrov, M.
2015-11-01
Nuclear emulsions have been widely used in particle physics to identify new particles through the observation of their decays thanks to their unique spatial resolution. Nevertheless, before the advent of automatic scanning systems, the emulsion analysis was very demanding in terms of well trained manpower. Due to this reason, they were gradually replaced by electronic detectors, until the '90s, when automatic microscopes started to be developed in Japan and in Europe. Automatic scanning was essential to conceive large scale emulsion-based neutrino experiments like CHORUS, DONUT and OPERA. Standard scanning systems have been initially designed to recognize tracks within a limited angular acceptance (θ ≲ 30°), where θ is the track angle with respect to a line perpendicular to the emulsion plane. In this paper we describe the implementation of a novel fast automatic scanning system aimed at extending the track recognition to the full angular range and improving the present scanning speed. Indeed, nuclear emulsions do not have any intrinsic limit to detect particle direction. Such improvement opens new perspectives to use nuclear emulsions in several fields in addition to large scale neutrino experiments, like muon radiography, medical applications and dark matter directional detection.
NASA Astrophysics Data System (ADS)
Tian, Zhang; Yanfeng, Gong
2017-05-01
In order to resolve the mismatch between the demand for and the geographic distribution of primary energy resources, Ultra High Voltage (UHV) power grids should be developed rapidly to support the development of energy bases and the integration of large-scale renewable energy. This paper reviews the latest research progress in AC/DC transmission technologies and summarizes the characteristics of AC/DC power grids, concluding that China's power grids have entered a new period of large-scale hybrid UHV AC/DC operation in which the characteristics of “strong DC and weak AC” become increasingly prominent. Possible problems in the operation of AC/DC power grids are discussed, and the interactions between AC and DC grids are studied in depth. In response to these problems, a preliminary scheme is summarized as follows: strengthening backbone structures, enhancing AC/DC transmission technologies, improving protection measures for grid integration of clean energy, and taking actions to solve voltage and frequency stability problems. This work is valuable for adapting hybrid UHV AC/DC power grids to the operating mode of large power grids, thus guaranteeing the security and stability of the power system.
Grabiner, Mark D; Marone, Jane R; Wyatt, Marilynn; Sessoms, Pinata; Kaufman, Kenton R
2018-06-01
The fractal scaling evident in the step-to-step fluctuations of stepping-related time series reflects, to some degree, neuromotor noise. The primary purpose of this study was to determine the extent to which the fractal scaling of step width, step width, and step width variability are affected by performance of an attention-demanding task. We hypothesized that the attention-demanding task would shift the structure of the step width time series toward white, uncorrelated noise. Subjects performed two 10-min treadmill walking trials, a control trial of undisturbed walking and a trial during which they performed a mental arithmetic/texting task. Motion capture data were converted to step width time series, the fractal scaling of which was determined from their power spectra. Fractal scaling decreased by 22% during the texting condition (p < 0.001), supporting the hypothesized shift toward white uncorrelated noise. Step width and step width variability increased by 19% and 5%, respectively (p < 0.001). However, a stepwise discriminant analysis to which all three variables were input revealed that the control and dual task conditions were discriminated only by step width fractal scaling. The change of the fractal scaling of step width is consistent with increased cognitive demand and suggests a transition in the characteristics of the signal noise. This may reflect an important advance toward the understanding of the manner in which neuromotor noise contributes to some types of falls. However, further investigation of the repeatability of the results, the sensitivity of the results to progressive increases in cognitive load imposed by attention-demanding tasks, and the extent to which the results can be generalized to the gait of older adults seems warranted. Copyright © 2018 Elsevier B.V. All rights reserved.
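A minimal sketch of how a spectral scaling exponent can be estimated from a step-width series (a Welch power spectrum plus a log-log slope); the series here is synthetic white noise, and the exact estimator used in the study may differ.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
step_width = rng.normal(0.0, 1.0, 1024)  # placeholder step-width time series (arbitrary units)

# Power spectrum of the step-to-step fluctuations.
freq, power = welch(step_width, nperseg=256)
keep = freq > 0

# Scaling exponent beta from the slope of log(power) vs log(frequency):
# white noise gives beta ≈ 0, correlated (pink) noise gives beta ≈ 1.
beta = -np.polyfit(np.log(freq[keep]), np.log(power[keep]), 1)[0]
print(f"spectral exponent beta ≈ {beta:.2f}")
```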
Global impacts of energy demand on the freshwater resources of nations
Holland, Robert Alan; Scott, Kate A.; Flörke, Martina; Brown, Gareth; Ewers, Robert M.; Farmer, Elizabeth; Kapos, Valerie; Muggeridge, Ann; Taylor, Gail; Barrett, John; Eigenbrod, Felix
2015-01-01
The growing geographic disconnect between consumption of goods, the extraction and processing of resources, and the environmental impacts associated with production activities makes it crucial to factor global trade into sustainability assessments. Using an empirically validated environmentally extended global trade model, we examine the relationship between two key resources underpinning economies and human well-being—energy and freshwater. A comparison of three energy sectors (petroleum, gas, and electricity) reveals that freshwater consumption associated with gas and electricity production is largely confined within the territorial boundaries where demand originates. This finding contrasts with petroleum, which exhibits a varying ratio of territorial to international freshwater consumption, depending on the origin of demand. For example, although the United States and China have similar demand associated with the petroleum sector, international freshwater consumption is three times higher for the former than the latter. Based on mapping patterns of freshwater consumption associated with energy sectors at subnational scales, our analysis also reveals concordance between pressure on freshwater resources associated with energy production and freshwater scarcity in a number of river basins globally. These energy-driven pressures on freshwater resources in areas distant from the origin of energy demand complicate the design of policy to ensure security of fresh water and energy supply. Although much of the debate around energy is focused on greenhouse gas emissions, our findings highlight the need to consider the full range of consequences of energy production when designing policy. PMID:26627262
Big Data Analytics for Demand Response: Clustering Over Space and Time
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chelmis, Charalampos; Kolte, Jahanvi; Prasanna, Viktor K.
The pervasive deployment of advanced sensing infrastructure in Cyber-Physical systems, such as the Smart Grid, has resulted in an unprecedented data explosion. Such data exhibit both large volumes and high velocity characteristics, two of the three pillars of Big Data, and have a time-series notion as datasets in this context typically consist of successive measurements made over a time interval. Time-series data can be valuable for data mining and analytics tasks such as identifying the “right” customers among a diverse population, to target for Demand Response programs. However, time series are challenging to mine due to their high dimensionality. In this paper, we motivate this problem using a real application from the smart grid domain. We explore novel representations of time-series data for Big Data analytics, and propose a clustering technique for determining natural segmentation of customers and identification of temporal consumption patterns. Our method is generalizable to large-scale, real-world scenarios, without making any assumptions about the data. We evaluate our technique using real datasets from smart meters, totaling ~ 18,200,000 data points, and show the efficacy of our technique in efficiently detecting the optimal number of clusters.
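As a toy illustration of clustering consumption shapes, the sketch below applies k-means to synthetic daily load profiles; the time-series representations and cluster-selection procedure of the paper are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Placeholder matrix of daily load profiles: one row per customer-day,
# 24 hourly smart-meter readings per row (illustrative, not the paper's data).
profiles = rng.gamma(shape=2.0, scale=1.5, size=(500, 24))

X = StandardScaler().fit_transform(profiles)

# Cluster by consumption shape; in practice the number of clusters would be
# chosen with an internal validity index rather than fixed a priori.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
print(np.bincount(kmeans.labels_))
```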
Physicians workforce: legal immigrants will extend baby boom demands.
2005-10-15
The baby boom generation will place large demands on the Medicare program and the U.S. health care system. These demands may be extended by a large legal immigrant population that will become Medicare-eligible soon after the baby boom generation does. The U.S. health care system should be prepared for sustained stress from this aging population.
Efficient population-scale variant analysis and prioritization with VAPr.
Birmingham, Amanda; Mark, Adam M; Mazzaferro, Carlo; Xu, Guorong; Fisch, Kathleen M
2018-04-06
With the growing availability of population-scale whole-exome and whole-genome sequencing, demand for reproducible, scalable variant analysis has spread within genomic research communities. To address this need, we introduce the Python package VAPr (Variant Analysis and Prioritization). VAPr leverages existing annotation tools ANNOVAR and MyVariant.info with MongoDB-based flexible storage and filtering functionality. It offers biologists and bioinformatics generalists easy-to-use and scalable analysis and prioritization of genomic variants from large cohort studies. VAPr is developed in Python and is available for free use and extension under the MIT License. An install package is available on PyPi at https://pypi.python.org/pypi/VAPr, while source code and extensive documentation are on GitHub at https://github.com/ucsd-ccbb/VAPr. kfisch@ucsd.edu.
Bridging the Gap Between the iLEAPS and GEWEX Land-Surface Modeling Communities
NASA Technical Reports Server (NTRS)
Bonan, Gordon; Santanello, Joseph A., Jr.
2013-01-01
Models of Earth's weather and climate require fluxes of momentum, energy, and moisture across the land-atmosphere interface to solve the equations of atmospheric physics and dynamics. Just as atmospheric models can, and do, differ between weather and climate applications, mostly related to issues of scale, resolved or parameterised physics, and computational requirements, so too can the land models that provide the required surface fluxes differ between weather and climate models. Here, however, the issue is less one of scale-dependent parameterisations. Computational demands can influence other minor land model differences, especially with respect to initialisation, data assimilation, and forecast skill. However, the distinction among land models (and their development and application) is largely driven by the different science and research needs of the weather and climate communities.
Climate Change and Macro-Economic Cycles in Pre-Industrial Europe
Pei, Qing; Zhang, David D.; Lee, Harry F.; Li, Guodong
2014-01-01
Climate change has been proven to be the ultimate cause of social crisis in pre-industrial Europe at a large scale. However, detailed analyses on climate change and macro-economic cycles in the pre-industrial era remain lacking, especially within different temporal scales. Therefore, fine-grained, paleo-climate, and economic data were employed with statistical methods to quantitatively assess the relations between climate change and agrarian economy in Europe during AD 1500 to 1800. In the study, the Butterworth filter was adopted to filter the data series into a long-term trend (low-frequency) and short-term fluctuations (high-frequency). Granger Causality Analysis was conducted to scrutinize the associations between climate change and macro-economic cycle at different frequency bands. Based on quantitative results, climate change can only show significant effects on the macro-economic cycle within the long-term. In terms of the short-term effects, society can relieve the influences from climate variations by social adaptation methods and self-adjustment mechanism. On a large spatial scale, temperature holds higher importance for the European agrarian economy than precipitation. By examining the supply-demand mechanism in the grain market, population during the study period acted as the producer in the long term, whereas as the consumer in the short term. These findings merely reflect the general interactions between climate change and macro-economic cycles at the large spatial region with a long-term study period. The findings neither illustrate individual incidents that can temporarily distort the agrarian economy nor explain some specific cases. In the study, the scale thinking in the analysis is raised as an essential methodological issue for the first time to interpret the associations between climatic impact and macro-economy in the past agrarian society within different temporal scales. PMID:24516601
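A minimal sketch of the filtering step described here: a low-pass Butterworth filter separates a synthetic annual series into a long-term trend and short-term fluctuations. The cutoff period and the data are illustrative assumptions, not the study's series.

```python
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(2)
years = np.arange(1500, 1801)
# Placeholder annual series (e.g., a temperature or grain-price index):
# noise around a slow oscillation, standing in for the paleo/economic data.
series = np.sin(2 * np.pi * (years - 1500) / 120.0) + rng.normal(0, 0.4, years.size)

# Low-pass Butterworth filter: periods longer than the cutoff form the
# long-term trend; the residual gives the short-term fluctuations.
cutoff_period = 40.0  # years (assumed)
b, a = butter(N=4, Wn=1.0 / cutoff_period, btype="low", fs=1.0)
trend = filtfilt(b, a, series)
fluctuations = series - trend
```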
Spin determination at the Large Hadron Collider
NASA Astrophysics Data System (ADS)
Yavin, Itay
The quantum field theory describing the Electroweak sector demands some new physics at the TeV scale in order to unitarize the scattering of longitudinal W bosons. If this new physics takes the form of a scalar Higgs boson, then it is hard to understand the huge hierarchy of scales between the Electroweak scale ˜ TeV and the Planck scale ˜ 10^19 GeV. This is known as the Naturalness problem. Normally, in order to solve this problem, new particles, in addition to the Higgs boson, are required to be present in the spectrum below a few TeV. If such particles are indeed discovered at the Large Hadron Collider, it will become important to determine their spin. Several classes of models for physics beyond the Electroweak scale exist. Determining the spin of any such newly discovered particle could prove to be the only means of distinguishing between these different models. In the first part of this thesis, we present a thorough discussion regarding such a measurement. We survey the different potentially useful channels for spin determination and a detailed analysis of the most promising channel is performed. The Littlest Higgs model offers a way to solve the Hierarchy problem by introducing heavy partners to Standard Model particles with the same spin and quantum numbers. However, this model is only good up to ˜ 10 TeV. In the second part of this thesis we present an extension of this model into a strongly coupled theory above ˜ 10 TeV. We use the celebrated AdS/CFT correspondence to calculate properties of the low-energy physics in terms of high-energy parameters. We comment on some of the tensions inherent to such a construction involving a large-N CFT (or equivalently, an AdS space).
NASA Astrophysics Data System (ADS)
Ordway, E.; Lambin, E.; Asner, G. P.
2015-12-01
The changing structure of demand for commodities associated with food security and energy has had a startling impact on land use change in tropical forests in recent decades. Yet, the composition of conversion in the Congo basin remains a major uncertainty, particularly with regards to the scale of drivers of change. Owing to rapid expansion of production globally and longstanding historical production locally in the Congo basin, oil palm offers a lens through which to evaluate local land use decisions across a spectrum of small- to large-scales of production as well as interactions with regional and global supply chains. We examined the effect of global commodity crop expansion on land use change in Southwest Cameroon using a mixed-methods approach to integrate remote sensing, field surveys and socioeconomic data. Southwest Cameroon (2.5 Mha) has a long history of large- and small-scale agriculture, ranging from mixed crop subsistence agriculture to large monocrop plantations of oil palm, cocoa, and rubber. Trends and spatial patterns of forest conversion and agricultural transitions were analyzed from 2000-2015 using satellite imagery. We used economic, demographic and field survey datasets to assess how regional and global market factors and local commodity crop decisions affect land use patterns. Our results show that oil palm is a major commodity crop expanding in this region, and that conversion is occurring primarily through expansion by medium-scale producers and local elites. Results also indicate that global and regional supply chain dynamics influence local land use decision making. This research contributes new information on land use patterns and dynamics in the Congo basin, an understudied region. More specifically, results from this research contribute information on recent trends of oil palm expansion in Cameroon that will be used in national land use planning strategies.
Radiologic image communication and archive service: a secure, scalable, shared approach
NASA Astrophysics Data System (ADS)
Fellingham, Linda L.; Kohli, Jagdish C.
1995-11-01
The Radiologic Image Communication and Archive (RICA) service is designed to provide a shared archive for medical images to the widest possible audience of customers. Images are acquired from a number of different modalities, each available from many different vendors. Images are acquired digitally from those modalities which support direct digital output and by digitizing films for projection x-ray exams. The RICA Central Archive receives standard DICOM 3.0 messages and data streams from the medical imaging devices at customer institutions over the public telecommunication network. RICA represents a completely scalable resource. The user pays only for what he is using today with the full assurance that as the volume of image data that he wishes to send to the archive increases, the capacity will be there to accept it. To provide this seamless scalability imposes several requirements on the RICA architecture: (1) RICA must support the full array of transport services. (2) The Archive Interface must scale cost-effectively to support local networks that range from the very small (one x-ray digitizer in a medical clinic) to the very large and complex (a large hospital with several CTs, MRs, Nuclear medicine devices, ultrasound machines, CRs, and x-ray digitizers). (3) The Archive Server must scale cost-effectively to support rapidly increasing demands for service providing storage for and access to millions of patients and hundreds of millions of images. The architecture must support the incorporation of improved technology as it becomes available to maintain performance and remain cost-effective as demand rises.
Chaney, John M; Gamwell, Kaitlyn L; Baraldi, Amanda N; Ramsey, Rachelle R; Cushing, Christopher C; Mullins, Alexandria J; Gillaspy, Stephen R; Jarvis, James N; Mullins, Larry L
2016-10-01
Examine caregiver demand and general parent distress as mediators in the parent illness uncertainty-child depressive symptom association in youth with juvenile rheumatic diseases. Children and adolescents completed the Child Depression Inventory; caregivers completed the Parent Perceptions of Uncertainty Scale, the Care for My Child with Rheumatic Disease Scale, and the Brief Symptom Inventory. The pediatric rheumatologist provided ratings of clinical disease status. Analyses revealed significant direct associations between illness uncertainty and caregiver demand, and between caregiver demand and both parent distress and child depressive symptoms. Results also revealed significant parent uncertainty → caregiver demand → parent distress and parent uncertainty → caregiver demand → child depressive symptom indirect paths. Results highlight the role of illness appraisals in adjustment to juvenile rheumatic diseases, and provide preliminary evidence that parent appraisals of illness uncertainty impact parent distress and child depressive symptoms indirectly through increased perceptions of caregiver demand. © The Author 2016. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
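A bare-bones product-of-coefficients sketch of the indirect path reported here, run on synthetic data; the study's actual mediation analysis and inference procedure (e.g., bootstrapped confidence intervals) are not reproduced.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300  # hypothetical number of families

# Synthetic scores standing in for the study's measures (illustrative only).
uncertainty = rng.normal(0, 1, n)
caregiver_demand = 0.5 * uncertainty + rng.normal(0, 1, n)                      # path a
child_cdi = 0.4 * caregiver_demand + 0.1 * uncertainty + rng.normal(0, 1, n)    # path b + direct
df = pd.DataFrame({"uncertainty": uncertainty,
                   "caregiver_demand": caregiver_demand,
                   "child_cdi": child_cdi})

# Indirect path: uncertainty -> caregiver demand -> child depressive symptoms.
a = smf.ols("caregiver_demand ~ uncertainty", df).fit().params["uncertainty"]
b = smf.ols("child_cdi ~ caregiver_demand + uncertainty", df).fit().params["caregiver_demand"]
print(f"indirect effect (a*b) ≈ {a * b:.3f}")
```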
The global land rush and climate change
NASA Astrophysics Data System (ADS)
Davis, Kyle Frankel; Rulli, Maria Cristina; D'Odorico, Paolo
2015-08-01
Climate change poses a serious global challenge in the face of rapidly increasing human demand for energy and food. A recent phenomenon in which climate change may play an important role is the acquisition of large tracts of land in the developing world by governments and corporations. In the target countries, where land is relatively inexpensive, the potential to increase crop yields is generally high and property rights are often poorly defined. By acquiring land, investors can realize large profits and countries can substantially alter the land and water resources under their control, thereby changing their outlook for meeting future demand. While the drivers, actors, and impacts involved with land deals have received substantial attention in the literature, we propose that climate change plays an important yet underappreciated role, both through its direct effects on agricultural production and through its influence on mitigative or adaptive policy decisions. Drawing from various literature sources as well as a new global database on reported land deals, we trace the evolution of the global land rush and highlight prominent examples in which the role of climate change is evident. We find that climate change—both historical and anticipated—interacts substantially with drivers of land acquisitions, having important implications for the resilience of communities in targeted areas. As a result of this synthesis, we ultimately contend that considerations of climate change should be integrated into future policy decisions relating to the large-scale land acquisitions.
Deforestation risk due to commodity crop expansion in sub-Saharan Africa
NASA Astrophysics Data System (ADS)
Ordway, Elsa M.; Asner, Gregory P.; Lambin, Eric F.
2017-04-01
Rapid integration of global agricultural markets and subsequent cropland displacement in recent decades increased large-scale tropical deforestation in South America and Southeast Asia. Growing land scarcity and more stringent land use regulations in these regions could incentivize the offshoring of export-oriented commodity crops to sub-Saharan Africa (SSA). We assess the effects of domestic- and export-oriented agricultural expansion on deforestation in SSA in recent decades. Analyses were conducted at the global, regional and local scales. We found that commodity crops are expanding in SSA, increasing pressure on tropical forests. Four Congo Basin countries, Sierra Leone, Liberia, and Côte d’Ivoire were most at risk in terms of exposure, vulnerability and pressures from agricultural expansion. These countries averaged the highest percent forest cover (58% ± 17.93) and lowest proportions of potentially available cropland outside forest areas (1% ± 0.89). Foreign investment in these countries was concentrated in oil palm production (81%), with a median investment area of 41 582 thousand ha. Cocoa, the fastest expanding export-oriented crop across SSA, accounted for 57% of global expansion in 2000-2013 at a rate of 132 thousand ha yr-1. However, cocoa only amounted to 0.89% of foreign land investment. Commodity crop expansion in SSA appears largely driven by small- and medium-scale farmers rather than industrial plantations. Land-use changes associated with large-scale investments remain to be observed in many countries. Although domestic demand for commodity crops was associated with most agricultural expansion, we provide evidence of a growing influence of distant markets on land-use change in SSA.
Integrity of Bolted Angle Connections Subjected to Simulated Column Removal
Weigand, Jonathan M.; Berman, Jeffrey W.
2016-01-01
Large-scale tests of steel gravity framing systems (SGFSs) have shown that the connections are critical to system integrity when a column suffers damage that compromises its ability to carry gravity loads. When supporting columns were removed, the SGFSs redistributed gravity loads through the development of an alternate load path in a sustained tensile configuration resulting from large vertical deflections. The ability of the system to sustain such an alternate load path depends on the capacity of the gravity connections to remain intact after undergoing large rotation and axial extension demands, for which they were not designed. This study experimentally evaluates the performance of steel bolted angle connections subjected to loading consistent with an interior column removal. The characteristic connection behaviors are described and the performance of multiple connection configurations is compared in terms of their peak resistances and deformation capacities. PMID:27110059
HIGH-EFFICIENCY AUTONOMOUS LASER ADAPTIVE OPTICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baranec, Christoph; Riddle, Reed; Tendulkar, Shriharsh
2014-07-20
As new large-scale astronomical surveys greatly increase the number of objects targeted and discoveries made, the requirement for efficient follow-up observations is crucial. Adaptive optics imaging, which compensates for the image-blurring effects of Earth's turbulent atmosphere, is essential for these surveys, but the scarcity, complexity and high demand of current systems limit their availability for following up large numbers of targets. To address this need, we have engineered and implemented Robo-AO, a fully autonomous laser adaptive optics and imaging system that routinely images over 200 objects per night with an acuity 10 times sharper at visible wavelengths than typically possible from the ground. By greatly improving the angular resolution, sensitivity, and efficiency of 1-3 m class telescopes, we have eliminated a major obstacle in the follow-up of the discoveries from current and future large astronomical surveys.
ERIC Educational Resources Information Center
Gido, Eric O.; Sibiko, Kenneth W.; Ayuya, Oscar I.; Mwangi, Joseph K.
2015-01-01
Purpose: The objective of the study was to determine the level and determinants of demand for extension services among small-scale maize farmers in Kenya. Design/methodology/approach: Based on an exploratory research design, primary data were collected from a sample of 352 households through face-to-face interviews. Focus group discussions were…
Optimal crop selection and water allocation under limited water supply in irrigation
NASA Astrophysics Data System (ADS)
Stange, Peter; Grießbach, Ulrike; Schütze, Niels
2015-04-01
Due to climate change, extreme weather conditions such as droughts may have an increasing impact on irrigated agriculture. To cope with limited water resources in irrigation systems, a new decision support framework is developed which focuses on an integrated management of both irrigation water supply and demand at the same time. For modeling the regional water demand, local (and site-specific) water demand functions are used which are derived from the optimized agronomic response at farm scale. To account for climate variability the agronomic response is represented by stochastic crop water production functions (SCWPF). These functions take into account different soil types, crops and stochastically generated climate scenarios. The SCWPFs are used to compute the water demand considering different conditions, e.g., variable and fixed costs. This generic approach enables the consideration both of multiple crops at farm scale and of the aggregated response to water pricing at a regional scale for full and deficit irrigation systems. Within the SAPHIR (SAxonian Platform for High Performance IRrigation) project a prototype of a decision support system is developed which helps to evaluate combined water supply and demand management policies.
Sanne, Bjarte; Mykletun, Arnstein; Dahl, Alv A; Moen, Bente E; Tell, Grethe S
2005-09-01
To test the strain/iso-strain, interaction and buffer hypotheses of the Job Demand-Control-Support model in relation to anxiety and depression. Five thousand five hundred and sixty-two workers with valid Demand-Control-Support Questionnaire (DCSQ) scores were examined with the sub-scales of the Hospital Anxiety and Depression Scale as outcomes. Multiple statistical methods were applied. The strain and iso-strain hypotheses were confirmed. Generally, additive and non-interaction effects were found between psychological demands, control and social support. The buffer hypotheses were refuted. Results from analyses testing different interaction operationalizations were complementary. High demands, low control and low support individually, but particularly combined, are risk factors for anxiety and depression. Support is the DCSQ index most strongly associated with anxiety and depression in women. Assessment of psychosocial work environment may identify workers at risk, and serve as a basis for job-redesign.
Power control and management of the grid containing large-scale wind power systems
NASA Astrophysics Data System (ADS)
Aula, Fadhil Toufick
The ever-increasing demand for electricity has driven many countries toward the installation of new generation facilities. However, concerns such as environmental pollution and global warming, the need for clean energy sources, the high costs associated with installing new conventional power plants, and fossil fuel depletion have created strong interest in finding alternatives to conventional fossil fuels for generating electricity. Wind energy is one of the most rapidly growing renewable power sources, and wind power generation has been increasingly demanded as an alternative to conventional fossil fuels. However, wind power fluctuates due to variation of wind speed. Therefore, large-scale integration of wind energy conversion systems is a threat to the stability and reliability of utility grids containing these systems. They disturb the balance between power generation and consumption, affect the quality of the electricity, and complicate load sharing and load distribution management and planning. Overall, wind power systems do not help in providing services such as operating and regulating reserves to the power grid. In order to resolve these issues, research has been conducted in utilizing weather forecasting data to improve the performance of the wind power system, reduce the influence of the fluctuations, and plan power management of the grid containing large-scale wind power systems which consist of doubly-fed induction generator-based energy conversion systems. The aims of this dissertation are to provide new methods for: smoothing the output power of the wind power systems and reducing the influence of their fluctuations, power managing and planning of a grid containing these systems and other conventional power plants, and providing a new structure for implementing the latest microprocessor technology for controlling and managing the operation of the wind power system. In this research, in order to reduce and smooth the fluctuations, two methods are presented. The first method is based on a de-loaded technique while the other method is based on utilizing multiple storage facilities. The de-loaded technique is based on the power characteristics of a wind turbine and estimation of the generated power according to weather forecasting data. The technique provides a reference power by which the wind power system will operate and generate smooth power. In contrast, utilizing storage facilities allows the wind power system to operate under its maximum power point tracking strategy. Two types of energy storage are considered in this research, a battery energy storage system (BESS) and a pumped-hydropower storage system (PHSS), to suppress the output fluctuations and to support the wind power system in following the system load demands. Furthermore, this method provides the ability to store energy when there is a surplus of generated power and to reuse it when there is a shortage of power generation from wind power systems. Both methods are new in terms of how they utilize these techniques and wind speed data. A microprocessor embedded system using an Intel® Atom™ processor is presented for controlling the wind power system and for providing remote communication to enhance the operation of the individual wind power systems in a wind farm. The embedded system helps the wind power system to respond to and follow the commands of the central control of the power system.
Moreover, it enhances the performance of the wind power system through self-managing, self-functioning, and self-correcting. Finally, a method of system power management and planning is modeled and studied for a grid containing large-scale wind power systems. The method is based on a new technique that constructs a new load demand curve (NLDC) by merging the estimated power generated by wind power systems with the load forecast. To summarize, the methods and results presented in this dissertation enhance the operation of large-scale wind power systems and reduce their drawbacks on the operation of the power grid.
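The two building blocks described above lend themselves to a compact illustration. The sketch below is a minimal, hypothetical example rather than the dissertation's actual algorithms: it constructs an NLDC by subtracting estimated wind generation from the load forecast, and smooths a fluctuating wind trace with a simple storage model that tracks a moving-average reference; the array values, storage capacity, and half-full initial state of charge are all assumptions.

```python
import numpy as np

def net_load_demand_curve(load_forecast, wind_forecast):
    """Construct a net load curve by subtracting estimated wind
    generation (from weather forecasts) from the forecast load."""
    return np.asarray(load_forecast) - np.asarray(wind_forecast)

def smooth_with_storage(wind_power, capacity_mwh, p_max_mw, dt_h=1.0, window=6):
    """Charge/discharge a simple battery model so the combined output
    tracks a moving-average reference of the raw wind power."""
    wind_power = np.asarray(wind_power, dtype=float)
    reference = np.convolve(wind_power, np.ones(window) / window, mode="same")
    soc = capacity_mwh / 2.0              # assume the store starts half full
    combined = np.empty_like(wind_power)
    for i, (p_wind, p_ref) in enumerate(zip(wind_power, reference)):
        p_batt = np.clip(p_ref - p_wind, -p_max_mw, p_max_mw)   # >0: discharge
        p_batt = np.clip(p_batt, -(capacity_mwh - soc) / dt_h, soc / dt_h)
        soc -= p_batt * dt_h
        combined[i] = p_wind + p_batt
    return combined

# Hypothetical hourly data (MW): fluctuating wind and an evening-peaking load.
hours = np.arange(24)
wind = 80 + 30 * np.sin(hours / 3.0) + np.random.default_rng(0).normal(0, 10, 24)
load = 200 + 60 * np.sin((hours - 14) / 24 * 2 * np.pi)
print(net_load_demand_curve(load, wind))
print(smooth_with_storage(wind, capacity_mwh=40, p_max_mw=20))
```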
NASA Astrophysics Data System (ADS)
Panagopoulos, Yiannis; Gassman, Philip W.; Jha, Manoj K.; Kling, Catherine L.; Campbell, Todd; Srinivasan, Raghavan; White, Michael; Arnold, Jeffrey G.
2015-05-01
Nonpoint source pollution from agriculture is the main source of nitrogen and phosphorus in the stream systems of the Corn Belt region in the Midwestern US. This region is comprised of two large river basins, the intensely row-cropped Upper Mississippi River Basin (UMRB) and Ohio-Tennessee River Basin (OTRB), which are considered the key contributing areas for the Northern Gulf of Mexico hypoxic zone according to the US Environmental Protection Agency. Thus, in this area it is of utmost importance to ensure that intensive agriculture for food, feed and biofuel production can coexist with a healthy water environment. To address these objectives within a river basin management context, an integrated modeling system has been constructed with the hydrologic Soil and Water Assessment Tool (SWAT) model, capable of estimating river basin responses to alternative cropping and/or management strategies. To improve modeling performance compared to previous studies and provide a spatially detailed basis for scenario development, this SWAT Corn Belt application incorporates a greatly refined subwatershed structure based on 12-digit hydrologic units or 'subwatersheds' as defined by the US Geological Survey. The model setup, calibration and validation are time-demanding and challenging tasks for these large systems, given the scale-intensive data requirements and the need to ensure the reliability of flow and pollutant load predictions at multiple locations. Thus, the objectives of this study are both to comprehensively describe this large-scale modeling approach, providing estimates of pollution and crop production in the region, and to present strengths and weaknesses of integrated modeling at such a large scale along with how it can be improved on the basis of the current modeling structure and results. The predictions were based on a semi-automatic hydrologic calibration approach for large-scale and spatially detailed modeling studies, with the use of the Sequential Uncertainty Fitting algorithm (SUFI-2) and the SWAT-CUP interface, followed by a manual water quality calibration on a monthly basis. The refined modeling approach developed in this study led to successful predictions across most parts of the Corn Belt region and can be used for testing pollution mitigation measures and agricultural economic scenarios, providing useful information to policy makers and recommendations on similar efforts at the regional scale.
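The abstract does not name the goodness-of-fit criterion used during calibration; the Nash-Sutcliffe efficiency (NSE) is a common choice in SWAT/SUFI-2 studies, so the snippet below is only a hedged illustration of how such a monthly calibration metric is computed, using made-up streamflow values.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the
    observed mean, and negative values are worse than the mean."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical monthly streamflow (m^3/s) at one calibration gauge.
obs = np.array([120.0, 95.0, 210.0, 340.0, 280.0, 150.0])
sim = np.array([110.0, 100.0, 190.0, 360.0, 300.0, 140.0])
print(f"NSE = {nash_sutcliffe(obs, sim):.3f}")
```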
Chen, Liang-Hua; Cai, Feng; Zhang, Dan-Ju; Zhang, Li; Zhu, Peng; Gao, Shun
2017-07-01
The pharmacological importance of recombinant human stem cell factor (rhSCF) has increased the demand to establish effective and large-scale production and purification processes. A good source of bioactive recombinant protein with capability of being scaled-up without losing activity has always been a challenge. The objectives of the study were the rapid and efficient pilot-scale expression and purification of rhSCF. The gene encoding stem cell factor (SCF) was cloned into pBV220 and transformed into Escherichia coli. The recombinant SCF was expressed and isolated using a procedure consisting of isolation of inclusion bodies (IBs), denaturation, and refolding followed by chromatographic steps toward purification. The yield of rhSCF reached 835.6 g/20 L, and the expression levels of rhSCF were about 33.9% of the total E. coli protein content. rhSCF was purified by isolation of IBs, denaturation, and refolding, followed by SP-Sepharose chromatography, Source 30 reversed-phase chromatography, and Q-Sepharose chromatography. This procedure was developed to isolate 5.5 g of rhSCF (99.5% purity) with specific activity at 0.96 × 10⁶ IU/mg, endotoxin levels of pyrogen at 1.0 EU/mg, and bacterial DNA at 10 ng/mg. Pilot-scale fermentations and purifications were set up for the production of rhSCF that can be upscaled for industry. © 2016 International Union of Biochemistry and Molecular Biology, Inc.
NASA Astrophysics Data System (ADS)
Berry, J. A.; Wolf, A.; Vygodskaya, N. N.
2004-12-01
Measurements of energy and water balance over Boreal forest ecosystems have generally shown very large ratios of sensible heat flux to latent heat flux (Bowen ratio) - especially on fine summer days. This strong control on evaporation at the plant scale can restrict precipitation and affect hydrometeorology at the regional scale. The large Bowen ratio is, in part, explained by the low maximum stomatal conductance of Boreal forest tree species and is probably related to their very low photosynthetic capacity. However, midday conductance can be much lower than expected on this basis and reflects the additional effect of a dynamic feedback system between stomatal conductance and the properties of the atmospheric boundary layer. Low stomatal conductance leads to a large sensible heat flux which, in turn, leads to a deeper, warmer and dryer atmospheric boundary layer and to a greater evaporative demand on the plant, causing the stomata to close still more. Predicting the response of this non-linear system presents a major challenge. Physiological studies conducted in the Canadian Boreal forest show very large differences in the tendency of species to experience midday stomatal closure. Jack pine was found to be quite susceptible, while black spruce was the most resistant to midday stomatal closure. These species had very similar photosynthetic capacity (Vmax) and Ball-Berry stomatal sensitivity coefficients. Jack pine was, however, more sensitive to inhibition of photosynthesis by elevated temperatures and, as a consequence, stomata closed as temperature and the vapor pressure deficit increased during midday. In contrast, black spruce was much less affected. These differences could have profound implications for simulating regional scale hydrometeorology over large areas dominated by monospecific stands in the NEESPI domain.
NASA Astrophysics Data System (ADS)
Turner, Sean W. D.; Marlow, David; Ekström, Marie; Rhodes, Bruce G.; Kularathna, Udaya; Jeffrey, Paul J.
2014-04-01
Despite a decade of research into climate change impacts on water resources, the scientific community has delivered relatively few practical methodological developments for integrating uncertainty into water resources system design. This paper presents an application of the "decision scaling" methodology for assessing climate change impacts on water resources system performance and asks how such an approach might inform planning decisions. The decision scaling method reverses the conventional ethos of climate impact assessment by first establishing the climate conditions that would compel planners to intervene. Climate model projections are introduced at the end of the process to characterize climate risk in such a way that avoids the process of propagating those projections through hydrological models. Here we simulated 1000 multisite synthetic monthly streamflow traces in a model of the Melbourne bulk supply system to test the sensitivity of system performance to variations in streamflow statistics. An empirical relation was derived to convert decision-critical flow statistics to climatic units, against which 138 alternative climate projections were plotted and compared. We defined the decision threshold in terms of a system yield metric constrained by multiple performance criteria. Our approach allows for fast and simple incorporation of demand forecast uncertainty and demonstrates the reach of the decision scaling method through successful execution in a large and complex water resources system. Scope for wider application in urban water resources planning is discussed.
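A toy sketch of the decision-scaling stress test described above: system performance is evaluated across a grid of hypothetical perturbations to mean inflow and inflow variability, and the combinations that breach a required yield mark the decision-critical climate conditions. The surrogate yield model, baseline numbers, and threshold below are invented for illustration; the actual study used 1000 synthetic streamflow traces and a full supply-system simulation.

```python
import numpy as np

def system_yield(mean_flow, cv_flow, demand):
    """Toy surrogate for the simulated yield of a supply system as a
    function of mean annual inflow and its coefficient of variation."""
    return mean_flow * (1.0 - 0.5 * cv_flow) - 0.1 * demand

def decision_threshold(base_mean, base_cv, demand, required_yield,
                       mean_changes, cv_changes):
    """Scan a grid of climate perturbations and flag the combinations
    whose yield falls below the level that would trigger intervention."""
    failures = []
    for dm in mean_changes:
        for dcv in cv_changes:
            y = system_yield(base_mean * (1 + dm), base_cv + dcv, demand)
            if y < required_yield:
                failures.append((dm, dcv, y))
    return failures

# Hypothetical baseline: mean inflow 500 GL/yr, CV 0.3, demand 400 GL/yr.
flags = decision_threshold(500.0, 0.3, 400.0, required_yield=380.0,
                           mean_changes=np.linspace(-0.3, 0.1, 9),
                           cv_changes=np.linspace(0.0, 0.2, 5))
for dm, dcv, y in flags[:5]:
    print(f"mean change {dm:+.2f}, CV change {dcv:+.2f} -> yield {y:.0f} GL/yr")
```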
Priorities for toxic wastewater management in Pakistan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rahman, A.
1996-12-31
This study assesses the number of industries in Pakistan, the total discharge of wastewater, the biological oxygen demand (BOD) load, and the toxicity of the wastewater. The industrial sector is a major contributor to water pollution, with high levels of BOD, heavy metals, and toxic compounds. Only 30 industries have installed water pollution control equipment, and most are working at a very low operational level. Priority industrial sectors for pollution control are medium- to large-scale textile industries and small-scale tanneries and electroplating industries. Each day the textile industries discharge about 85,000 m³ of wastewater with a high BOD, while the electroplating industries discharge about 23,000 m³ of highly toxic and hazardous wastewater. Various in-plant modifications can reduce wastewater discharges. Economic incentives, like tax rebates, subsidies, and soft loans, could be an option for motivating medium- to large-scale industries to control water pollution. Central treatment plants may be constructed for treating wastewater generated by small-scale industries. The estimated costs for the treatment of textile and electroplating wastewater are given. The legislative structure in Pakistan is insufficient for control of industrial pollution; not only do existing laws need revision, but more laws and regulations are needed to improve the state of affairs, and enforcement agencies need to be strengthened. 15 refs., 1 fig., 9 tabs.
Zheng, Yucong; Wang, Xiaochang C; Dzakpasu, Mawuli; Ge, Yuan; Zhao, Yaqian; Xiong, Jiaqing
2016-01-01
Hybrid constructed wetland (HCW) systems have been used to treat various wastewaters across the world. However, large-scale applications of HCWs are scarce, particularly for on-site improvement of the water quality of highly polluted urban rivers in semi-arid regions. In this study, a large pilot-scale HCW system was constructed to improve the water quality of the Zaohe River in Xi'an, China. With a total area of about 8000 m², the pilot HCW system, composed of different configurations of surface and subsurface flow wetlands, was operated for 2 years at an average inflow volume rate of 362 m³/day. Local Phragmites australis and Typha orientalis from the riverbank were planted in the HCW system. Findings indicate a higher treatment efficiency for organics and suspended solids than nutrients. The inflow concentrations of 5-day biochemical oxygen demand (BOD5), chemical oxygen demand (COD), suspended solids (SS), total nitrogen (TN), NH3-N, and total phosphorus (TP) were 125.6, 350.9, 334.2, 38.5, 27.2, and 3.9 mg/L, respectively. Average removal efficiencies of 94.4, 74.5, 92.0, 56.3, 57.5, and 69.2%, respectively, were recorded. However, the pollutant removal rates were highly seasonal especially for nitrogen. Higher removals were recorded for all pollutants in the autumn while significantly lower removals were recorded in the winter. Plant uptake and assimilation accounted for circa 19-29 and 16-23% of the TN and TP removal, respectively. Moreover, P. australis demonstrated a higher nutrient uptake ability and competitive potential. Overall, the high efficiency of the pilot HCW for improving the water quality of such a highly polluted urban river provided practical evidence of the applicability of the HCW technology for protecting urban water environments.
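Removal efficiency here is the usual (Cin - Cout)/Cin expressed as a percentage. The short snippet below simply back-calculates the effluent concentrations implied by the reported inflow values and removal percentages; the resulting outflow numbers are derived for illustration, not reported in the study.

```python
# Reported average inflow concentrations (mg/L) and removal percentages.
inflow = {"BOD5": 125.6, "COD": 350.9, "SS": 334.2,
          "TN": 38.5, "NH3-N": 27.2, "TP": 3.9}
removal_pct = {"BOD5": 94.4, "COD": 74.5, "SS": 92.0,
               "TN": 56.3, "NH3-N": 57.5, "TP": 69.2}

for pollutant, c_in in inflow.items():
    # Effluent concentration implied by removal = (Cin - Cout) / Cin.
    c_out = c_in * (1.0 - removal_pct[pollutant] / 100.0)
    print(f"{pollutant}: {c_in} mg/L in -> {c_out:.1f} mg/L out "
          f"({removal_pct[pollutant]}% removal)")
```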
Science and Strategic - Climate Implications
NASA Astrophysics Data System (ADS)
Tindall, J. A.; Moran, E. H.
2008-12-01
Energy of weather systems greatly exceeds energy produced and used by humans. Variation in this energy causes climate variability potentially resulting in local, national, and/or global catastrophes beyond our ability to deter the loss of life and economic destabilization. Large scale natural disasters routinely result in shortages of water, disruption of energy supplies, and destruction of infrastructure. The resulting unforeseen and disastrous events occurring beyond national emergency preparation, as related to climate variability, could incite civil unrest due to dwindling and/or inaccessible resources necessary for survival. Lack of these necessary resources in impacted countries often leads to wars. Climate change coupled with population growth, which exposes more of the population to potential risks associated with climate and environmental change, demands faster technological response. Understanding climate/associated environmental changes, the relation to human activity and behavior, and including this in national and international emergency/security management plans would alleviate shortcomings in our present and future technological status. The scale of environmental change will determine the potential magnitude of civil unrest at the local, national, and/or global level along with security issues at each level. Commonly, security issues related to possible civil unrest owing to temporal environmental change are not part of a short and/or long-term strategy, yet recent large-scale disasters are reminders that system failures (as in hurricane Katrina) include acknowledged breaches to individual, community, and infrastructure security. Without advance planning and management concerning environmental change, oncoming and climate related events will intensify the level of devastation and human catastrophe. Depending upon the magnitude and period of catastrophic events and/or environmental changes, destabilization of agricultural systems, energy supplies, and other lines of commodities often results in severely unbalanced supply and demand ratios, which eventually affect the entire global community. National economies potentially risk destabilization, which is especially important since economics plays a major role in strategic planning. This presentation will address these issues and the role that science can play in human sustainability and local, national, and international security.
NASA Astrophysics Data System (ADS)
Destouni, G.
2008-12-01
Excess nutrient and pollutant releases from various point and diffuse sources at and below the land surface, associated with land use, industry and households, pose serious eutrophication and pollution risks to inland and coastal water ecosystems worldwide. These risks must be assessed, for instance according to the EU Water Framework Directive (WFD). The WFD demands economically efficient, basin-scale water management for achieving and maintaining good physico-chemical and ecological status in all the inland and coastal waters of EU member states. This paper synthesizes a series of hydro-biogeochemical and linked economic efficiency studies of basin-scale waterborne nutrient and pollutant flows, the development over the last decades up to the current levels of these flows, the main monitoring and modelling uncertainties associated with their quantification, and the effectiveness and economic efficiency of different possible strategies for abating them in order to meet WFD requirements and other environmental goals on local, national and international levels under climate and other regional change. The studies include different Swedish and Baltic Sea drainage basins. Main findings include quantification of near-coastal monitoring gaps and long-term nutrient and pollutant memory in the subsurface (soil-groundwater-sediment) water systems of drainage basins. The former may significantly mask nutrient and pollutant loads to the sea while the latter may continue to uphold large loads to inland and coastal waters a long time after source mitigation. A methodology is presented for finding a rational trade-off between the two resource-demanding options to reduce, or accept and explicitly account for, the uncertainties implied by these monitoring gaps and long-term nutrient-pollution memories and time lags, and other knowledge, data and model uncertainties that limit the effectiveness and efficiency of water pollution and eutrophication management.
A Vision for Co-optimized T&D System Interaction with Renewables and Demand Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Lindsay; Zéphyr, Luckny; Cardell, Judith B.
The evolution of the power system to the reliable, efficient and sustainable system of the future will involve development of both demand- and supply-side technology and operations. The use of demand response to counterbalance the intermittency of renewable generation brings the consumer into the spotlight. Though individual consumers are interconnected at the low-voltage distribution system, these resources are typically modeled as variables at the transmission network level. In this paper, a vision for co-optimized interaction of distribution systems, or microgrids, with the high-voltage transmission system is described. In this framework, microgrids encompass consumers, distributed renewables and storage. The energy management system of the microgrid can also sell (buy) excess (necessary) energy from the transmission system. Preliminary work explores price mechanisms to manage the microgrid and its interactions with the transmission system. Wholesale market operations are addressed through the development of scalable stochastic optimization methods that provide the ability to co-optimize interactions between the transmission and distribution systems. Modeling challenges of the co-optimization are addressed via solution methods for large-scale stochastic optimization, including decomposition and stochastic dual dynamic programming.
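As a rough illustration of the kind of two-stage decision this co-optimization addresses, the sketch below picks a forward purchase at the transmission interface by minimizing expected cost over a handful of renewable-output scenarios. It is a hypothetical toy (scenario values, prices, and the grid-search solution method are all assumptions), not the stochastic dual dynamic programming approach referenced in the abstract.

```python
import numpy as np

def expected_cost(contract_mwh, load_mwh, renewable_scenarios, probs,
                  buy_price, sell_price, spot_price):
    """Expected cost of committing `contract_mwh` of transmission-level
    purchases ahead of time, with imbalances settled after the fact."""
    cost = contract_mwh * buy_price
    for prob, renewable in zip(probs, renewable_scenarios):
        imbalance = load_mwh - renewable - contract_mwh   # >0: must buy more
        if imbalance > 0:
            cost += prob * imbalance * spot_price         # recourse purchase
        else:
            cost += prob * imbalance * sell_price         # sell the surplus
    return cost

# Hypothetical microgrid: 100 MWh load, three equally plausible wind scenarios.
scenarios = np.array([20.0, 50.0, 80.0])
probs = np.array([0.3, 0.4, 0.3])
candidates = np.arange(0.0, 101.0, 5.0)
costs = [expected_cost(c, 100.0, scenarios, probs,
                       buy_price=40.0, sell_price=25.0, spot_price=70.0)
         for c in candidates]
best = candidates[int(np.argmin(costs))]
print(f"best forward purchase: {best} MWh, expected cost {min(costs):.0f} $")
```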
A Vision for Co-optimized T&D System Interaction with Renewables and Demand Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, C. Lindsay; Zéphyr, Luckny; Liu, Jialin
The evolution of the power system to the reliable, efficient and sustainable system of the future will involve development of both demand- and supply-side technology and operations. The use of demand response to counterbalance the intermittency of renewable generation brings the consumer into the spotlight. Though individual consumers are interconnected at the low-voltage distribution system, these resources are typically modeled as variables at the transmission network level. In this paper, a vision for co-optimized interaction of distribution systems, or microgrids, with the high-voltage transmission system is described. In this framework, microgrids encompass consumers, distributed renewables and storage. The energy management system of the microgrid can also sell (buy) excess (necessary) energy from the transmission system. Preliminary work explores price mechanisms to manage the microgrid and its interactions with the transmission system. Wholesale market operations are addressed through the development of scalable stochastic optimization methods that provide the ability to co-optimize interactions between the transmission and distribution systems. Modeling challenges of the co-optimization are addressed via solution methods for large-scale stochastic optimization, including decomposition and stochastic dual dynamic programming.
Challenges in Managing Trustworthy Large-scale Digital Science
NASA Astrophysics Data System (ADS)
Evans, B. J. K.
2017-12-01
The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories - model outputs including coupled models and ensembles, data products that have been processed to a level of usability, and increasingly heuristically driven data analysis. These data products are increasingly the ones that are usable by the broad communities, and far exceed the raw instrument data outputs. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support reliable management of the information across distributed resources. Users necessarily rely on these underlying "black boxes" so that they can be productive in producing new scientific outcomes. The software for these systems depends on computational infrastructure, interconnected software systems, and information capture systems. This ranges from the fundamentals of the reliability of the compute hardware, system software stacks and libraries, and the model software. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of the approach and robustness of the methods over full reproducibility. Furthermore, with large volume data management, it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access updated products and the reliability with which previous outcomes remain relevant and can be updated with new information. We will discuss these challenges and some of the approaches underway that are being used to address these issues.
Hydropower and sustainability: resilience and vulnerability in China's powersheds.
McNally, Amy; Magee, Darrin; Wolf, Aaron T
2009-07-01
Large dams represent a whole complex of social, economic and ecological processes, perhaps more than any other large infrastructure project. Today, countries with rapidly developing economies are constructing new dams to provide energy and flood control to growing populations in riparian and distant urban communities. If the system is lacking institutional capacity to absorb these physical and institutional changes there is potential for conflict, thereby threatening human security. In this paper, we propose analyzing sustainability (political, socioeconomic, and ecological) in terms of resilience versus vulnerability, framed within the spatial abstraction of a powershed. The powershed framework facilitates multi-scalar and transboundary analysis while remaining focused on the questions of resilience and vulnerability relating to hydropower dams. Focusing on examples from China, this paper describes the complex nature of dams using the sustainability and powershed frameworks. We then analyze the roles of institutions in China to understand the relationships between power, human security and the socio-ecological system. To inform the study of conflicts over dams, China is a particularly useful case study because we can examine what happens at the international, national and local scales. The powershed perspective allows us to examine resilience and vulnerability across political boundaries from a dynamic, process-defined analytical scale while remaining focused on a host of questions relating to hydro-development that invoke drivers and impacts on national and sub-national scales. The ability to disaggregate the effects of hydropower dam construction from political boundaries allows for a deeper analysis of resilience and vulnerability. From our analysis we find that reforms in China's hydropower sector since 1996 have been motivated by the need to create stability at the national scale rather than resilient solutions to China's growing demand for energy and water resource control at the local and international scales. Some measures that improved economic development through the market economy and a combination of dam construction and institutional reform may indeed improve hydro-political resilience at a single scale. However, if China does not address large-scale hydropower construction's potential to create multi-scale geopolitical tensions, it may be vulnerable to conflict - though not necessarily violent - in domestic and international political arenas. We conclude with a look toward a resilient basin institution for the Nu/Salween River, the site of a proposed large-scale hydropower development effort in China and Myanmar.
Association between Emotional Symptoms and Job Demands in an Asian Electronics Factory.
Huang, Wei-Lieh; Guo, Yue Leon; Chen, Pau-Chung; Wang, Jui; Chu, Po-Ching
2017-09-19
Various work-related issues including mental health have been described for the electronics industry. Although East Asian countries play important roles in the electronics industry, the association between job demands and emotional symptoms has been rarely examined. The present study recruited 603 workers from either office or clean room environments in an electronics factory in Taiwan. Their personal factors, work-related factors, and emotional symptoms were assessed by a self-administered questionnaire. The symptoms of depression and hostility were reported in 24.88% and 24.38% of the subjects, respectively, while 14.93% reported both. A multivariate analysis showed that, overall, women workers were more likely to have emotional symptoms than male workers (odds ratio (OR) = 1.50, 95% CI = 1.02-2.18). Among clean room workers, working under high pressure (OR = 1.84, 95% CI = 1.05-3.21), conflicting demands (OR = 2.15, 95% CI = 1.30-3.57), and social isolation at work (OR = 2.99, 95% CI = 1.23-7.30) were associated with emotional symptoms. The findings suggest that in the Asian electronics industry, for women, working under high pressure, conflicting demands, and social isolation at work are risk factors for emotional symptoms, especially for clean room workers. Further large-scale, longitudinal studies are necessary to confirm and prevent the mental health problems in this fast-evolving, highly competitive industry.
Association between Emotional Symptoms and Job Demands in an Asian Electronics Factory
Huang, Wei-Lieh; Guo, Yue Leon; Chen, Pau-Chung; Wang, Jui; Chu, Po-Ching
2017-01-01
Various work-related issues including mental health have been described for the electronics industry. Although East Asian countries play important roles in the electronics industry, the association between job demands and emotional symptoms has been rarely examined. The present study recruited 603 workers from either office or clean room environments in an electronics factory in Taiwan. Their personal factors, work-related factors, and emotional symptoms were assessed by a self-administered questionnaire. The symptoms of depression and hostility were reported in 24.88% and 24.38% of the subjects, respectively, while 14.93% reported both. A multivariate analysis showed that, overall, women workers were more likely to have emotional symptoms than male workers (odds ratio (OR) = 1.50, 95% CI = 1.02–2.18). Among clean room workers, working under high pressure (OR = 1.84, 95% CI = 1.05–3.21), conflicting demands (OR = 2.15, 95% CI = 1.30–3.57), and social isolation at work (OR = 2.99, 95% CI = 1.23–7.30) were associated with emotional symptoms. The findings suggest that in the Asian electronics industry, for women, working under high pressure, conflicting demands, and social isolation at work are risk factors for emotional symptoms, especially for clean room workers. Further large-scale, longitudinal studies are necessary to confirm and prevent the mental health problems in this fast-evolving, highly competitive industry. PMID:28925986
Efficacy of adaptation measures to future water scarcity on a global scale
NASA Astrophysics Data System (ADS)
Yoshikawa, S.; Kanae, S.
2015-12-01
Water supply sources for all sectors are critically important for agricultural and industrial productivity. The current rapid increase in water use is considered unsustainable and threatens human life. In our previous study (Yoshikawa et al., 2014 in HESS), we estimated the time-varying dependence of water requirements on water supply sources during past and future periods using the global water resources model H08. The sources of water requirements were specified using four categories: rivers, large reservoirs, medium-size reservoirs, and non-local non-renewable blue water (NNBW). We also estimated ΔNNBW, which is defined as an increase in NNBW from the past to the future. The results indicate that further development of water supply sources is required to sustain future water use. Coping with the water scarcity represented by ΔNNBW calls for adaptation measures. To address adaptation measures, we need to set adaptation options, which can be divided into 'Supply enhancement' and 'Demand management'. Supply enhancement includes increased storage, groundwater development, inter-basin transfer, desalination and re-use of urban waste water. Demand management is defined as a set of actions controlling water demand by reducing water loss, increasing water productivity, and water re-allocation. In this study, we focus on estimating future water demand while taking into account several adaptation measures using the H08 model.
Standardized Sample Preparation Using a Drop-on-Demand Printing Platform
2013-05-07
Recently, drop-on-demand inkjet printing technology has emerged as an effective approach to produce test materials for standardized sample preparation. In contrast to samples in which most of the material is concentrated along the edges, samples prepared using drop-on-demand inkjet technology demonstrate excellent uniformity, providing a successful and robust methodology for energetic sample preparation. Keywords: drop-on-demand; inkjet printing; sample preparation.
Daley, Monica A; Birn-Jeffery, Aleksandra
2018-05-22
Birds provide an interesting opportunity to study the relationships between body size, limb morphology and bipedal locomotor function. Birds are ecologically diverse and span a large range of body size and limb proportions, yet all use their hindlimbs for bipedal terrestrial locomotion, for at least some part of their life history. Here, we review the scaling of avian striding bipedal gaits to explore how body mass and leg morphology influence walking and running. We collate literature data from 21 species, spanning a 2500× range in body mass from painted quail to ostriches. Using dynamic similarity theory to interpret scaling trends, we find evidence for independent effects of body mass, leg length and leg posture on gait. We find no evidence for scaling of duty factor with body size, suggesting that vertical forces scale with dynamic similarity. However, at dynamically similar speeds, large birds use relatively shorter stride lengths and higher stride frequencies compared with small birds. We also find that birds with long legs for their mass, such as the white stork and red-legged seriema, use longer strides and lower swing frequencies, consistent with the influence of high limb inertia on gait. We discuss the observed scaling of avian bipedal gait in relation to mechanical demands for force, work and power relative to muscle actuator capacity, muscle activation costs related to leg cycling frequency, and considerations of stability and agility. Many opportunities remain for future work to investigate how morphology influences gait dynamics among birds specialized for different habitats and locomotor behaviors. © 2018. Published by The Company of Biologists Ltd.
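For readers unfamiliar with dynamic similarity theory, the comparison of "dynamically similar speeds" across a 2500× mass range is conventionally made with the Froude number and with gait variables made dimensionless by leg length; the relations below are the standard textbook definitions, not equations taken from this paper.

```latex
Fr = \frac{v^{2}}{g\,\ell}, \qquad
\hat{\lambda} = \frac{\lambda}{\ell}, \qquad
\hat{f} = f\,\sqrt{\frac{\ell}{g}}
```

Here v is forward speed, ℓ a characteristic leg length (e.g. hip height), g gravitational acceleration, λ stride length and f stride frequency; gaits compared at equal Fr are expected to show matching dimensionless stride lengths and frequencies if they are dynamically similar.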
NASA Astrophysics Data System (ADS)
Taneja, Jayant Kumar
Electricity is an indispensable commodity to modern society, yet it is delivered via a grid architecture that remains largely unchanged over the past century. A host of factors are conspiring to topple this dated yet venerated design: developments in renewable electricity generation technology, policies to reduce greenhouse gas emissions, and advances in information technology for managing energy systems. Modern electric grids are emerging as complex distributed systems in which a portfolio of power generation resources, often incorporating fluctuating renewable resources such as wind and solar, must be managed dynamically to meet uncontrolled, time-varying demand. Uncertainty in both supply and demand makes control of modern electric grids fundamentally more challenging, and growing portfolios of renewables exacerbate the challenge. We study three electricity grids: the state of California, the province of Ontario, and the country of Germany. To understand the effects of increasing renewables, we develop a methodology to scale renewables penetration. Analyzing these grids yields key insights about rigid limits to renewables penetration and their implications in meeting long-term emissions targets. We argue that to achieve deep penetration of renewables, the operational model of the grid must be inverted, changing the paradigm from load-following supplies to supply-following loads. To alleviate the challenge of supply-demand matching on deeply renewable grids, we first examine well-known techniques, including altering management of existing supply resources, employing utility-scale energy storage, targeting energy efficiency improvements, and exercising basic demand-side management. Then, we create several instantiations of supply-following loads -- including refrigerators, heating and cooling systems, and laptop computers -- by employing a combination of sensor networks, advanced control techniques, and enhanced energy storage. We examine the capacity of each load for supply-following and study the behaviors of populations of these loads, assessing their potential at various levels of deployment throughout the California electricity grid. Using combinations of supply-following strategies, we can reduce peak natural gas generation by 19% on a model of the California grid with 60% renewables. We then assess remaining variability on this deeply renewable grid incorporating supply-following loads, characterizing additional capabilities needed to ensure supply-demand matching in future sustainable electricity grids.
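A minimal sketch of what a "supply-following load" can look like at the appliance level: a refrigerator controller that pre-cools when renewable supply is plentiful and coasts when it is scarce, while staying inside its temperature deadband. The thresholds, deadband, and thermal response below are assumptions for illustration, not parameters from the dissertation.

```python
def supply_following_fridge(temp_c, renewable_fraction, setpoint=4.0, deadband=2.0):
    """Return True if the compressor should run this time step.

    Pre-cool toward the bottom of the deadband when renewable supply is
    plentiful, coast toward the top when it is scarce, and never leave
    the food-safe temperature band."""
    low, high = setpoint - deadband / 2.0, setpoint + deadband / 2.0
    if temp_c >= high:
        return True                      # too warm: must cool regardless of supply
    if temp_c <= low:
        return False                     # fully pre-cooled: let it coast
    return renewable_fraction > 0.5      # inside the deadband: follow supply

# Hypothetical trace: hourly renewable share of generation.
temp = 4.5
for renew in [0.8, 0.7, 0.3, 0.2, 0.6]:
    on = supply_following_fridge(temp, renew)
    temp += -0.4 if on else 0.3          # crude thermal response (degC per step)
    print(f"supply={renew:.1f} compressor={'on' if on else 'off'} temp={temp:.1f}C")
```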
Initial Low-Reynolds Number Iced Aerodynamic Performance for CRM Wing
NASA Technical Reports Server (NTRS)
Woodard, Brian; Diebold, Jeff; Broeren, Andy; Potapczuk, Mark; Lee, Sam; Bragg, Michael
2015-01-01
NASA, FAA, ONERA, and other partner organizations have embarked on a significant, collaborative research effort to address the technical challenges associated with icing on large scale, three-dimensional swept wings. These are extremely complex phenomena important to the design, certification and safe operation of small and large transport aircraft. There is increasing demand to balance trade-offs in aircraft efficiency, cost and noise that tend to compete directly with allowable performance degradations over an increasing range of icing conditions. Computational fluid dynamics codes have reached a level of maturity that they are being proposed by manufacturers for use in certification of aircraft for flight in icing. However, sufficient high-quality data to evaluate their performance on iced swept wings are not currently available in the public domain and significant knowledge gaps remain.
NASA Astrophysics Data System (ADS)
Dörner, Ralf; Lok, Benjamin; Broll, Wolfgang
Backed by a large consumer market, entertainment and education applications have spurred developments in the fields of real-time rendering and interactive computer graphics. Relying on Computer Graphics methodologies, Virtual Reality and Augmented Reality benefited indirectly from this; however, there is no large scale demand for VR and AR in gaming and learning. What are the shortcomings of current VR/AR technology that prevent a widespread use in these application areas? What advances in VR/AR will be necessary? And what might future “VR-enhanced” gaming and learning look like? Which role can and will Virtual Humans play? Concerning these questions, this article analyzes the current situation and provides an outlook on future developments. The focus is on social gaming and learning.
Segregated Systems of Human Brain Networks.
Wig, Gagan S
2017-12-01
The organization of the brain network enables its function. Evaluation of this organization has revealed that large-scale brain networks consist of multiple segregated subnetworks of interacting brain areas. Descriptions of resting-state network architecture have provided clues for understanding the functional significance of these segregated subnetworks, many of which correspond to distinct brain systems. The present report synthesizes accumulating evidence to reveal how maintaining segregated brain systems renders the human brain network functionally specialized, adaptable to task demands, and largely resilient following focal brain damage. The organizational properties that support system segregation are harmonious with the properties that promote integration across the network, but confer unique and important features to the brain network that are central to its function and behavior. Copyright © 2017 Elsevier Ltd. All rights reserved.
Gold nanoparticles for high-throughput genotyping of long-range haplotypes
NASA Astrophysics Data System (ADS)
Chen, Peng; Pan, Dun; Fan, Chunhai; Chen, Jianhua; Huang, Ke; Wang, Dongfang; Zhang, Honglu; Li, You; Feng, Guoyin; Liang, Peiji; He, Lin; Shi, Yongyong
2011-10-01
Completion of the Human Genome Project and the HapMap Project has led to increasing demands for mapping complex traits in humans to understand the aetiology of diseases. Identifying variations in the DNA sequence, which affect how we develop disease and respond to pathogens and drugs, is important for this purpose, but it is difficult to identify these variations in large sample sets. Here we show that through a combination of capillary sequencing and polymerase chain reaction assisted by gold nanoparticles, it is possible to identify several DNA variations that are associated with age-related macular degeneration and psoriasis on significant regions of human genomic DNA. Our method is accurate and promising for large-scale and high-throughput genetic analysis of susceptibility towards disease and drug resistance.
NASA Astrophysics Data System (ADS)
Higashino, Satoru; Kobayashi, Shoei; Yamagami, Tamotsu
2007-06-01
High data transfer rates have been demanded of data storage devices along with increasing storage capacity. In order to increase the transfer rate, high-speed data processing techniques in read-channel devices are required. Generally, parallel architecture is utilized for high-speed digital processing. We have developed a new architecture of Interpolated Timing Recovery (ITR) to achieve a high data transfer rate and a wide capture range in read-channel devices for information storage channels. It facilitates parallel implementation on large-scale integration (LSI) devices.
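The abstract gives no implementation details, so the sketch below only illustrates the basic idea behind interpolated timing recovery: samples taken at a fixed ADC clock are resampled at the symbol instants produced by the timing loop, here with a simple linear interpolator and a made-up waveform and symbol rate.

```python
import numpy as np

def interpolate_at(samples, positions):
    """Linearly interpolate fixed-rate ADC samples at fractional sample
    positions (base index plus fractional phase) from the timing loop."""
    out = []
    for pos in positions:
        k = int(np.floor(pos))
        mu = pos - k
        out.append((1.0 - mu) * samples[k] + mu * samples[k + 1])
    return np.array(out)

# Hypothetical read-back waveform sampled at the fixed ADC clock.
t = np.arange(32)
adc = np.sin(2 * np.pi * t / 8.0)
# Recovered symbol instants at a slightly different rate (1.03 samples per symbol).
instants = np.arange(0.0, 30.0, 1.03)
print(interpolate_at(adc, instants)[:5])
```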
The Need for Optical Means as an Alternative for Electronic Computing
NASA Technical Reports Server (NTRS)
Adbeldayem, Hossin; Frazier, Donald; Witherow, William; Paley, Steve; Penn, Benjamin; Bank, Curtis; Whitaker, Ann F. (Technical Monitor)
2001-01-01
Demand for faster computers is growing rapidly to keep pace with the fast-growing Internet, space communication, and robotics industries. Unfortunately, Very Large Scale Integration technology is approaching its fundamental limits, beyond which devices will be unreliable. Optical interconnections and optical integrated circuits are strongly believed to provide the way out of the extreme limitations imposed on the growth of speed and complexity of present-day computations by conventional electronics. This paper demonstrates two ultra-fast, all-optical logic gates and a high-density storage medium, which are essential components in building the future optical computer.
Multi-resolution integrated modeling for basin-scale water resources management and policy analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, Hoshin V.; Brookshire, David S.; Springer, E. P.
Approximately one-third of the land surface of the Earth is considered to be arid or semi-arid with an annual average of less than 12-14 inches of rainfall. The availability of water in such regions is, of course, particularly sensitive to climate variability, while the demand for water is growing explosively with population. The competition for available water is exerting considerable pressure on water resources management. Policy and decision makers in the southwestern U.S. increasingly have to cope with over-stressed rivers and aquifers as population and water demands grow. Other factors such as endangered species and Native American water rights further complicate the management problems. Further, as groundwater tables are drawn down due to pumping in excess of natural recharge, considerable (potentially irreversible) environmental impacts begin to be felt as, for example, rivers run dry for significant portions of the year, riparian habitats disappear (with consequent effects on the bio-diversity of the region), aquifers compact resulting in large scale subsidence, and water quality begins to suffer. The current drought (1999-2002) in the southwestern U.S. is raising new concerns about how to sustain the combination of agricultural, urban and in-stream uses of water that underlie the socio-economic and ecological structure in the region. The water stressed nature of arid and semi-arid environments means that competing water uses of various kinds vie for access to a highly limited resource. If basin-scale water sustainability is to be achieved, managers must somehow achieve a balance between supply and demand throughout the basin, not just for the surface water or stream. The need to move water around a basin such as the Rio Grande or Colorado River to achieve this balance has created the stimulus for water transfers and water markets, and for accurate hydrologic information to sustain such institutions [Matthews et al. 2002; Brookshire et al 2003; Krause, Chermak Brookshire, 2003].
Demand driven decision support for efficient water resources allocation in irrigated agriculture
NASA Astrophysics Data System (ADS)
Schuetze, Niels; Grießbach, Ulrike Ulrike; Röhm, Patric; Stange, Peter; Wagner, Michael; Seidel, Sabine; Werisch, Stefan; Barfus, Klemens
2014-05-01
Due to climate change, extreme weather conditions, such as longer dry spells in the summer months, may have an increasing impact on agriculture in Saxony (Eastern Germany). For this reason, and additionally because of declining amounts of rainfall during the growing season, the use of irrigation will become more important in Eastern Germany in the future. To cope with this higher demand for water, a new decision support framework is developed which focuses on an integrated management of both irrigation water supply and demand. For modeling the regional water demand, local (and site-specific) water demand functions are used which are derived from the optimized agronomic response at farm scale. To account for climate variability the agronomic response is represented by stochastic crop water production functions (SCWPF), which provide the estimated yield subject to the minimum amount of irrigation water. These functions take into account the different soil types, crops and stochastically generated climate scenarios. By applying mathematical interpolation and optimization techniques, the SCWPFs are used to compute the water demand considering different constraints, for instance variable and fixed costs or the producer price. This generic approach enables the computation both for multiple crops at farm scale and for the aggregated response to water pricing at a regional scale for full and deficit irrigation systems. Within the SAPHIR (SAxonian Platform for High Performance Irrigation) project a prototype of a decision support system is developed which helps to evaluate combined water supply and demand management policies for an effective and efficient utilization of water in order to meet future demands. The prototype is implemented as a web-based decision support system and it is based on a service-oriented geo-database architecture.
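A hedged sketch of how a site-specific water demand function can be read off a crop water production function: for each water price, the profit-maximizing irrigation amount is selected from a sampled yield-versus-water curve. The exponential yield curve, prices, and grid below are invented for illustration and are not the SAPHIR functions themselves.

```python
import numpy as np

def farm_water_demand(price_crop, cost_water, water_grid, yield_curve):
    """Pick the irrigation amount that maximizes farm profit, given a
    (stochastic-average) crop water production function on a grid."""
    profit = price_crop * np.asarray(yield_curve) - cost_water * np.asarray(water_grid)
    i_best = int(np.argmax(profit))
    return water_grid[i_best], profit[i_best]

# Hypothetical production function: yield (t/ha) vs. irrigation (mm),
# flattening out as the crop water requirement is met.
water = np.linspace(0, 300, 31)
yields = 8.0 * (1.0 - np.exp(-water / 120.0))

# Demand curve: optimal water use for a range of water costs (EUR per mm and ha).
for cost in [1.0, 2.5, 5.0, 10.0]:
    w_opt, _ = farm_water_demand(price_crop=150.0, cost_water=cost,
                                 water_grid=water, yield_curve=yields)
    print(f"water cost {cost:.1f} -> demand {w_opt:.0f} mm")
```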
Life histories of hosts and pathogens predict patterns in tropical fungal plant diseases.
García-Guzmán, Graciela; Heil, Martin
2014-03-01
Plant pathogens affect the fitness of their hosts and maintain biodiversity. However, we lack theories to predict the type and intensity of infections in wild plants. Here we demonstrate using fungal pathogens of tropical plants that an examination of the life histories of hosts and pathogens can reveal general patterns in their interactions. Fungal infections were more commonly reported for light-demanding than for shade-tolerant species and for evergreen rather than for deciduous hosts. Both patterns are consistent with classical defence theory, which predicts lower resistance in fast-growing species and suggests that the deciduous habit can reduce enemy populations. In our literature survey, necrotrophs were found mainly to infect shade-tolerant woody species whereas biotrophs dominated in light-demanding herbaceous hosts. Far-red signalling and its inhibitory effects on jasmonic acid signalling are likely to explain this phenomenon. Multiple changes between the necrotrophic and the symptomless endophytic lifestyle at the ecological and evolutionary scale indicate that endophytes should be considered when trying to understand large-scale patterns in the fungal infections of plants. Combining knowledge about the molecular mechanisms of pathogen resistance with classical defence theory enables the formulation of testable predictions concerning general patterns in the infections of wild plants by fungal pathogens. © 2013 The Authors. New Phytologist © 2013 New Phytologist Trust.
Menzies, Kevin
2014-08-13
The growth in simulation capability over the past 20 years has led to remarkable changes in the design process for gas turbines. The availability of relatively cheap computational power coupled to improvements in numerical methods and physical modelling in simulation codes have enabled the development of aircraft propulsion systems that are more powerful and yet more efficient than ever before. However, the design challenges are correspondingly greater, especially to reduce environmental impact. The simulation requirements to achieve a reduced environmental impact are described along with the implications of continued growth in available computational power. It is concluded that achieving the environmental goals will demand large-scale multi-disciplinary simulations requiring significantly increased computational power, to enable optimization of the airframe and propulsion system over the entire operational envelope. However even with massive parallelization, the limits imposed by communications latency will constrain the time required to achieve a solution, and therefore the position of such large-scale calculations in the industrial design process. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
European large-scale farmland investments and the land-water-energy-food nexus
NASA Astrophysics Data System (ADS)
Siciliano, Giuseppina; Rulli, Maria Cristina; D'Odorico, Paolo
2017-12-01
The escalating human demand for food, water, energy, fibres and minerals has resulted in increasing commercial pressures on land and water resources, which are partly reflected by the recent increase in transnational land investments. Studies have shown that many of the land-water issues associated with land acquisitions are directly related to the areas of energy and food production. This paper explores the land-water-energy-food nexus in relation to large-scale farmland investments pursued by investors from European countries. The analysis is based on a "resource assessment approach" which evaluates the linkages between land acquisitions for agricultural (including both energy and food production) and forestry purposes, and the availability of land and water in the target countries. To that end, the water appropriated by agricultural and forestry productions is quantitatively assessed and its impact on water resource availability is analysed. The analysis is meant to provide useful information to investors from EU countries and policy makers on aspects of resource acquisition, scarcity, and access to promote responsible land investments in the target countries.
Towards a large-scale scalable adaptive heart model using shallow tree meshes
NASA Astrophysics Data System (ADS)
Krause, Dorian; Dickopf, Thomas; Potse, Mark; Krause, Rolf
2015-10-01
Electrophysiological heart models are sophisticated computational tools that place high demands on the computing hardware due to the high spatial resolution required to capture the steep depolarization front. To address this challenge, we present a novel adaptive scheme for resolving the depolarization front accurately using adaptivity in space. Our adaptive scheme is based on locally structured meshes. These tensor meshes in space are organized in a parallel forest of trees, which allows us to resolve complicated geometries and to realize high variations in the local mesh sizes with a minimal memory footprint in the adaptive scheme. We discuss both a non-conforming mortar element approximation and a conforming finite element space and present an efficient technique for the assembly of the respective stiffness matrices using matrix representations of the inclusion operators into the product space on the so-called shallow tree meshes. We analyzed the parallel performance and scalability for a two-dimensional ventricle slice as well as for a full large-scale heart model. Our results demonstrate that the method has good performance and high accuracy.
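As an illustration of the flag-and-refine spatial adaptivity described in the abstract above, the following minimal Python sketch refines a quadtree around a steep front. It is not the authors' shallow-tree mortar scheme; the field, thresholds and refinement depth are assumptions chosen only for illustration.

import numpy as np

class Cell:
    def __init__(self, x0, y0, size, level):
        self.x0, self.y0, self.size, self.level = x0, y0, size, level
        self.children = []

def front_indicator(x, y, front_x=0.3, width=0.02):
    # Hypothetical stand-in for a transmembrane potential with a steep front at x = front_x.
    return np.tanh((x - front_x) / width)

def needs_refinement(cell, max_level=6, tol=0.5):
    # Flag a cell if the field varies strongly across it, i.e. the front passes through it.
    left = front_indicator(cell.x0, cell.y0)
    right = front_indicator(cell.x0 + cell.size, cell.y0)
    return cell.level < max_level and abs(right - left) > tol

def refine(cell):
    if needs_refinement(cell):
        h = cell.size / 2
        for dx in (0.0, h):
            for dy in (0.0, h):
                child = Cell(cell.x0 + dx, cell.y0 + dy, h, cell.level + 1)
                cell.children.append(child)
                refine(child)

def leaves(cell):
    if not cell.children:
        yield cell
    else:
        for c in cell.children:
            yield from leaves(c)

root = Cell(0.0, 0.0, 1.0, 0)
refine(root)
print("leaf cells after refinement:", sum(1 for _ in leaves(root)))

The sketch only shows the tree-refinement idea; it omits the forest-of-trees partitioning, the mortar coupling and the stiffness-matrix assembly discussed in the paper.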
Huang, Chun; Zhang, Jin; Young, Neil P; Snaith, Henry J; Grant, Patrick S
2016-05-10
Supercapacitors are in demand for short-term electrical charge and discharge applications. Unlike conventional supercapacitors, solid-state versions have no liquid electrolyte and do not require robust, rigid packaging for containment. Consequently they can be thinner, lighter and more flexible. However, solid-state supercapacitors suffer from lower power density and where new materials have been developed to improve performance, there remains a gap between promising laboratory results that usually require nano-structured materials and fine-scale processing approaches, and current manufacturing technology that operates at large scale. We demonstrate a new, scalable capability to produce discrete, multi-layered electrodes with a different material and/or morphology in each layer, and where each layer plays a different, critical role in enhancing the dynamics of charge/discharge. This layered structure allows efficient utilisation of each material and enables conservative use of hard-to-obtain materials. The layered electrode shows amongst the highest combinations of energy and power densities for solid-state supercapacitors. Our functional design and spray manufacturing approach to heterogeneous electrodes provide a new way forward for improved energy storage devices.
Analysis on the dynamic error for optoelectronic scanning coordinate measurement network
NASA Astrophysics Data System (ADS)
Shi, Shendong; Yang, Linghui; Lin, Jiarui; Guo, Siyang; Ren, Yongjie
2018-01-01
Large-scale dynamic three-dimensional coordinate measurement techniques are in strong demand in equipment manufacturing. Noted for its advantages of high accuracy, scale expandability and multitask parallel measurement, the optoelectronic scanning measurement network has attracted close attention. It is widely used in the joining of large components, spacecraft rendezvous and docking simulation, digital shipbuilding and automated guided vehicle navigation. At present, most research on optoelectronic scanning measurement networks focuses on static measurement capability, and research on dynamic accuracy is insufficient. Limited by the measurement principle, the dynamic error is non-negligible and restricts application. The workshop measurement and positioning system is a representative system that can, in theory, realize dynamic measurement. In this paper we examine the sources of dynamic error and divide them into two parts: phase error and synchronization error. A dynamic error model is constructed and, based on this model, simulations of the dynamic error are carried out. The dynamic error is quantified, its volatility and periodicity are identified, and its characteristics are shown in detail. These results lay the foundation for further accuracy improvement.
Fast propagation of electromagnetic fields through graded-index media.
Zhong, Huiying; Zhang, Site; Shi, Rui; Hellmann, Christian; Wyrowski, Frank
2018-04-01
Graded-index (GRIN) media are widely used for modeling different situations: some components are designed considering GRIN modulation, e.g., multi-mode fibers, optical lenses, or acousto-optical modulators; on the other hand, there are other components where the refractive-index variation is undesired due to, e.g., stress or heating; and finally, some effects in nature are characterized by a GRIN variation, like turbulence in air or biological tissues. Modeling electromagnetic fields propagating in GRIN media is then of high importance for optical simulation and design. Though ray tracing can be used to evaluate some basic effects in GRIN media, the field properties are not considered and evaluated. The general physical optics techniques, like finite element method or finite difference time domain, can be used to calculate fields in GRIN media, but they need great numerical effort or may even be impractical for large-scale components. Therefore, there still exists a demand for a fast physical optics model of field propagation through GRIN media on a large scale, which will be explored in this paper.
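One common fast, field-based alternative to rigorous methods for weakly varying index profiles is paraxial split-step (beam propagation) modeling. The sketch below is a minimal one-dimensional split-step Fourier propagation through an assumed parabolic GRIN profile; it is not the specific fast physical-optics operator proposed in the paper, and the wavelength, profile coefficients and step sizes are illustrative assumptions.

import numpy as np

wavelength = 1.0e-6                          # vacuum wavelength (assumed, 1 um)
k0 = 2 * np.pi / wavelength
n0, n2 = 1.5, 2.0e6                          # assumed profile n(x) = n0 - 0.5*n2*x**2
N, window = 1024, 200e-6                     # samples and transverse window (m)
x = np.linspace(-window / 2, window / 2, N, endpoint=False)
dx = x[1] - x[0]
kx = 2 * np.pi * np.fft.fftfreq(N, d=dx)

field = np.exp(-(x / 20e-6) ** 2)            # Gaussian input beam
dz, steps = 5e-6, 400                        # 2 mm total propagation

n_profile = n0 - 0.5 * n2 * x ** 2
refract = np.exp(1j * k0 * (n_profile - n0) * dz)        # index phase, real space
diffract = np.exp(-1j * kx ** 2 / (2 * k0 * n0) * dz)    # diffraction, Fourier space

for _ in range(steps):
    field = np.fft.ifft(np.fft.fft(field) * diffract)
    field = field * refract

intensity = np.abs(field) ** 2
rms_radius = np.sqrt(np.sum(intensity * x ** 2) / np.sum(intensity))
print(f"output beam RMS radius: {1e6 * rms_radius:.1f} um")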
Enhancing Solar Cell Efficiencies through 1-D Nanostructures
2009-01-01
The current global energy problem can be attributed to insufficient fossil fuel supplies and excessive greenhouse gas emissions resulting from increasing fossil fuel consumption. The huge demand for clean energy potentially can be met by solar-to-electricity conversions. The large-scale use of solar energy is not occurring due to the high cost and inadequate efficiencies of existing solar cells. Nanostructured materials have offered new opportunities to design more efficient solar cells, particularly one-dimensional (1-D) nanomaterials for enhancing solar cell efficiencies. These 1-D nanostructures, including nanotubes, nanowires, and nanorods, offer significant opportunities to improve efficiencies of solar cells by facilitating photon absorption, electron transport, and electron collection; however, tremendous challenges must be conquered before the large-scale commercialization of such cells. This review specifically focuses on the use of 1-D nanostructures for enhancing solar cell efficiencies. Other nanostructured solar cells or solar cells based on bulk materials are not covered in this review. Major topics addressed include dye-sensitized solar cells, quantum-dot-sensitized solar cells, and p-n junction solar cells.
Engineering design for a large scale renewable energy network installation in an urban environment
NASA Astrophysics Data System (ADS)
Mansouri Kouhestani, F.; Byrne, J. M.; Hazendonk, P.; Spencer, L.; Brown, M. B.
2016-12-01
Humanity's current avid consumption of resources cannot be maintained, and the use of renewable energy is a significant approach towards a sustainable energy future. Alberta is the largest greenhouse gas-producing province in Canada (per capita), and climate change is expected to impact Alberta with warmer temperatures, intense floods, and earlier snow melting. However, as one of the sunniest and windiest places in Canada, Alberta is poised to become one of Canada's leading provinces in utilizing renewable energy. This research has four main objectives. First, to determine the feasibility of implementing solar and wind energy systems at the University of Lethbridge campus. Second, to quantify rooftop and parking lot solar photovoltaic potential for the city of Lethbridge. Third, to determine the available rooftop area for PV deployment in a large-scale region (the Province of Alberta). Fourth, to investigate different strategies for correlating solar PV array production with electricity demand in the province of Alberta. The proposed work addresses the need to reduce Alberta's fossil fuel pollution, which drives climate change and degrades air, water and land resources.
Computational biomedicine: a challenge for the twenty-first century.
Coveney, Peter V; Shublaq, Nour W
2012-01-01
With the relentless increase of computer power and the widespread availability of digital patient-specific medical data, we are now entering an era when it is becoming possible to develop predictive models of human disease and pathology, which can be used to support and enhance clinical decision-making. The approach amounts to a grand challenge to computational science insofar as we need to be able to provide seamless yet secure access to large-scale heterogeneous personal healthcare data in a facile way, typically through complex workflows (some parts of which may need to be run on high-performance computers) that are integrated into clinical decision support software. In this paper, we review the state of the art in terms of case studies drawn from neurovascular pathologies and HIV/AIDS. These studies are representative of a large number of projects currently being performed within the Virtual Physiological Human initiative. They make demands of information technology at many scales, from the desktop to national and international infrastructures for data storage and processing, linked by high performance networks.
Optimization of Industrial Ozone Generation with Pulsed Power
NASA Astrophysics Data System (ADS)
Lopez, Jose; Guerrero, Daniel; Freilich, Alfred; Ramoino, Luca; Seton Hall University Team; Degremont Technologies-Ozonia Team
2013-09-01
Ozone (O3) is widely used for applications ranging from various industrial chemical synthesis processes to large-scale water treatment. The consequent surge in world-wide demand has brought about the requirement for ozone generation at the rate of several hundred grams per kilowatt hour (g/kWh). For many years, ozone has been generated by means of dielectric barrier discharges (DBD), where a high-energy electric field between two electrodes separated by a dielectric and a gap containing pure oxygen or air produces various microplasmas. The resultant microplasmas provide sufficient energy to dissociate the oxygen molecules while allowing the proper energetic channels for the formation of ozone. This presentation will review the current power schemes used for large-scale ozone generation and explore the use of high-voltage nanosecond pulses with reduced electric fields. The microplasmas created in a high reduced electric field are expected to be more efficient for ozone generation. This is confirmed by the current results of this work, which show that the efficiency of ozone generation increases by over eight times when the rise time and pulse duration are shortened. Department of Physics, South Orange, NJ, USA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denholm, Paul; Clark, Kara; O'Connell, Matt
Increasing the use of grid-flexibility options (improved grid management, demand response, and energy storage) could enable 25% or higher penetration of PV at low costs (see Denholm et al. 2016). The large-scale integration of solar into electric-power systems complicates the calculation of the value of solar. In fact a comprehensive examination reveals that the value of solar technologies—or any other power-system technology or operating strategy—can only be understood in the context of the generation system as a whole. This is well illustrated by analysis of curtailment at high PV penetrations within the bulk power and transmission systems. As the deployment of PV increases, it is possible that, during some sunny midday periods, the limited flexibility of conventional generators would force system operators to reduce (curtail) PV output in order to maintain the crucial balance between electric supply and demand. As a result, PV’s value and cost competitiveness would degrade. For example, for utility-scale PV with a baseline SunShot LCOE of 6¢/kWh, increasing the annual energy demand met by solar energy from 10% to 20% would increase the marginal LCOE of PV from 6¢/kWh to almost 11¢/kWh in a California grid system with limited flexibility. However, this loss of value could be stemmed by increasing system flexibility via enhanced control of variable-generation resources, added energy storage, and the ability to motivate more electricity consumers to shift consumption to lower-demand periods. The combination of these measures would minimize solar curtailment and keep PV cost-competitive at penetrations at least as high as 25%. Efficient deployment of the grid-flexibility options needed to maintain solar’s value will require various innovations, from the development of communication, control, and energy storage technologies to the implementation of new market rules and operating procedures.
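The mechanism described above can be shown with a back-of-envelope calculation: curtailment raises the effective cost of delivered PV energy because the same cost is spread over less accepted energy. In the Python sketch below, the 6 cents/kWh baseline comes from the summary, while the curtailment fractions are hypothetical placeholders (the "limited flexibility" value is chosen so that the result reproduces the roughly 11 cents/kWh quoted above), not values taken from the study.

baseline_lcoe = 0.06          # $/kWh for PV energy as produced (no curtailment)
marginal_curtailment = {      # assumed share of the next unit of PV output that is curtailed
    "10% solar share": 0.05,
    "20% solar share, limited flexibility": 0.45,
    "20% solar share, enhanced flexibility": 0.10,
}

for scenario, curtailed in marginal_curtailment.items():
    # Cost of each delivered kWh rises as 1 / (1 - curtailed fraction).
    effective_lcoe = baseline_lcoe / (1.0 - curtailed)
    print(f"{scenario}: marginal LCOE of delivered energy = {100 * effective_lcoe:.1f} cents/kWh")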
Wade, Victoria A; Taylor, Alan D; Kidd, Michael R; Carati, Colin
2016-05-16
This study was a component of the Flinders Telehealth in the Home project, which tested adding home telehealth to existing rehabilitation, palliative care and geriatric outreach services. Due to the known difficulty of transitioning telehealth projects to services, a qualitative study was conducted to produce a preferred implementation approach for sustainable and large-scale operations, and a process model that offers practical advice for achieving this goal. Initially, semi-structured interviews were conducted with senior clinicians, health service managers and policy makers, and a thematic analysis of the interview transcripts was undertaken to identify the range of options for ongoing operations, plus the factors affecting sustainability. Subsequently, the interviewees and other decision makers attended a deliberative forum in which participants were asked to select a preferred model for future implementation. Finally, all data from the study was synthesised by the researchers to produce a process model. Nineteen interviews with senior clinicians, managers, and service development staff were conducted, finding strong support for home telehealth but a wide diversity of views on governance, models of clinical care, technical infrastructure operations, and data management. The deliberative forum worked through these options and recommended a collaborative consortium approach for large-scale implementation. The process model proposes that the key factor for large-scale implementation is leadership support, which is enabled by 1) showing solutions to the problems of service demand, budgetary pressure and the relationship between hospital and primary care, 2) demonstrating how home telehealth aligns with health service policies, and 3) achieving clinician acceptance through providing evidence of benefit and developing new models of clinical care. Two key actions to enable change were marketing telehealth to patients, clinicians and policy-makers, and building a community of practice. The implementation of home telehealth services is still in an early stage. Change agents and a community of practice can contribute by marketing telehealth, demonstrating policy alignment and providing potential solutions for difficult health services problems. This should assist health leaders to move from trials to large-scale services.
Empirical evidence on the demand for carve-outs in employment group mental health coverage.
Salkever, David S.; Shinogle, Judith A.
2000-06-01
BACKGROUND AND AIMS OF THE STUDY: The use of specialized behavioral health companies to manage mental/health benefits has become widespread in recent years. Recent studies have reported on the cost and utilization impacts of behavioral health carve-outs. Yet little previous research has examined the factors which lead employer-based health plans to adopt a carve-out strategy for mental health benefits. The examination of these factors is the main focus of our study. Our empirical analysis is also intended to explore several hypotheses (moral hazard, adverse selection, economies of scale and alternate utilization management strategies) that have recently been advanced to explain the popularity of carve-outs. METHODS: The data for this study are from a survey of employers who have long-term disability contracts with one large insurer. The analysis uses data from 248 employers who offer mental health benefits combined with local market information (e.g. health care price proxies, state tax rates etc), state regulations (mental health and substance abuse mandate and parity laws) and employee characteristics. Two different measures of carve-out use were used as dependent variables in the analysis: (1) the fraction of health plans offered by the employer that contained carve-out provisions and (2) a dichotomous indicator for those employers who included a carve-out arrangement in all the health plans they offered. RESULTS: Our results tended to support the general cost-control hypothesis that factors associated with higher use and/or costs of mental health services increase the demand for carve-outs. Our results gave less consistent support to the argument that carve-outs are demanded to control adverse selection, though only a few variables provided a direct test of this hypothesis. The role of economies of scale (i.e., group size) and the effectiveness of alternative strategies for managing moral hazard costs (i.e., HMOs) were confirmed by our results. DISCUSSION: We considered a number of different hypotheses concerning employers' demands for mental health carve-outs and found varying degrees of support for these hypotheses in our data. Our results tended to support the general cost-control hypothesis that factors associated with higher use and/or costs of mental health services increase the demand for carve-outs. LIMITATIONS: Our database includes a small number of relatively large employers and is not representative of employers nationally. Our selection criteria, concerning size and the requirement that some employees are covered by LTD insurance, probably resulted in a study sample that offers richer benefits than do employers nationally. Our employers also report a higher percentage of salaried employees relative to the national data. Another deficiency in the current study is the lack of detailed information on the socio-demographic and behavioral characteristics of covered employees. Finally, the cross-sectional nature of our analysis raises concerns about susceptibility of our findings to omitted variables bias. IMPLICATIONS FOR FURTHER RESEARCH: Research with more information on covered employee characteristics will allow for a stronger test of the general hypothesis that factors associated with a higher demand for services are also associated with a higher demand for carve-outs. 
Also, future analyses that capture the experience of states that have recently passed mandate and parity laws, and that use pooled data to control for omitted variables bias, will provide more definitive evidence on the relationship between these laws and carve-out demand.
Voltage Imaging of Waking Mouse Cortex Reveals Emergence of Critical Neuronal Dynamics
Scott, Gregory; Fagerholm, Erik D.; Mutoh, Hiroki; Leech, Robert; Sharp, David J.; Shew, Woodrow L.
2014-01-01
Complex cognitive processes require neuronal activity to be coordinated across multiple scales, ranging from local microcircuits to cortex-wide networks. However, multiscale cortical dynamics are not well understood because few experimental approaches have provided sufficient support for hypotheses involving multiscale interactions. To address these limitations, we used, in experiments involving mice, genetically encoded voltage indicator imaging, which measures cortex-wide electrical activity at high spatiotemporal resolution. Here we show that, as mice recovered from anesthesia, scale-invariant spatiotemporal patterns of neuronal activity gradually emerge. We show for the first time that this scale-invariant activity spans four orders of magnitude in awake mice. In contrast, we found that the cortical dynamics of anesthetized mice were not scale invariant. Our results bridge empirical evidence from disparate scales and support theoretical predictions that the awake cortex operates in a dynamical regime known as criticality. The criticality hypothesis predicts that small-scale cortical dynamics are governed by the same principles as those governing larger-scale dynamics. Importantly, these scale-invariant principles also optimize certain aspects of information processing. Our results suggest that during the emergence from anesthesia, criticality arises as information processing demands increase. We expect that, as measurement tools advance toward larger scales and greater resolution, the multiscale framework offered by criticality will continue to provide quantitative predictions and insight on how neurons, microcircuits, and large-scale networks are dynamically coordinated in the brain. PMID:25505314
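A common way to probe the scale invariance invoked by the criticality hypothesis is to segment a population activity signal into "avalanches" and check whether their sizes follow an approximate power law. The Python sketch below does this on synthetic data; it is a generic stand-in, not the authors' analysis pipeline, and the threshold and synthetic signal are assumptions.

import numpy as np

rng = np.random.default_rng(0)
# Synthetic heavy-tailed activity trace standing in for a cortex-wide voltage signal.
activity = rng.pareto(1.5, size=200_000)

threshold = np.median(activity)
above = activity > threshold

# An avalanche is a contiguous run of supra-threshold samples; its size is the summed activity.
sizes, current = [], 0.0
for value, flag in zip(activity, above):
    if flag:
        current += value
    elif current > 0:
        sizes.append(current)
        current = 0.0
sizes = np.array(sizes)

# Crude scale-invariance check: slope of the log-log histogram of avalanche sizes.
bins = np.logspace(np.log10(sizes.min()), np.log10(sizes.max()), 30)
hist, edges = np.histogram(sizes, bins=bins)
centers = np.sqrt(edges[:-1] * edges[1:])
mask = hist > 0
slope, intercept = np.polyfit(np.log10(centers[mask]), np.log10(hist[mask]), 1)
print(f"{len(sizes)} avalanches, log-log histogram slope ~ {slope:.2f}")

A roughly straight log-log histogram (constant slope) over several decades would be consistent with the scale-invariant, critical-like dynamics described above; rigorous analyses use maximum-likelihood power-law fits rather than this histogram slope.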
NASA Astrophysics Data System (ADS)
Du, Xinxin; O'Brien, Lucy; Riedel-Kruse, Ingmar
Many adult organs grow or shrink to accommodate fluctuating levels of physiological demand. Specifically, the intestine of the fruit fly (the midgut) expands four-fold in the number of mature cells and, proportionally, the number of stem cells when the fly eats. However, the cellular behaviors that give rise to this stem cell scaling are not well understood. Here we present a biophysical model of the adult fly midgut. A set of differential equations can recapitulate the physiological kinetics of cells during midgut growth and shrinkage as long as the rate of stem cell fate commitment depends on the stem cell number density in the tissue. To elucidate the source of this dependence, we model the tissue in a 2D simulation with soft spheres, where stem cells choose fate commitment through Delta-Notch pathway interactions with other stem cells, a known process in fly midguts. We find that as long as stem cells exhibit a large enough amplitude of random motion through the tissue (`stem cell motility'), and explore a large enough `territory' in their lifetime, stem cell scaling can occur. These model observations are confirmed through in vivo live-imaging, where we indeed see that stem cells are motile in the fly midgut.
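To make the key assumption concrete, the toy Python sketch below integrates a two-variable ODE in which the stem cell commitment rate grows with stem cell density, so that stem and mature cell numbers settle at feeding-dependent steady states. All parameter values and functional forms are hypothetical illustrations, not those of the presented model.

import numpy as np

def step(S, M, feeding, dt=0.01,
         div_rate=1.0,       # stem cell division rate in the fed state (assumed)
         commit_gain=0.02,   # commitment rate per stem cell grows with stem cell density
         death_rate=0.05):   # mature cell turnover (assumed)
    div = div_rate * (1.0 if feeding else 0.2) * S
    commit = commit_gain * S * S          # density-dependent fate commitment
    dS = div - commit
    dM = commit - death_rate * M
    return S + dt * dS, M + dt * dM

S, M = 50.0, 500.0
history = []
for t in np.arange(0.0, 200.0, 0.01):
    feeding = t < 100.0                   # fed for the first half, fasting afterwards
    S, M = step(S, M, feeding)
    history.append((t, S, M))

print(f"end of fed phase:     S ~ {history[9999][1]:.0f}, M ~ {history[9999][2]:.0f}")
print(f"end of fasting phase: S ~ {history[-1][1]:.0f}, M ~ {history[-1][2]:.0f}")

In this caricature the stem cell number settles where division balances density-dependent commitment, and the mature cell number follows; the paper's full model and 2D Delta-Notch simulation are considerably richer.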
Bridging the Science/Policy Gap through Boundary Chain Partnerships and Communities of Practice
NASA Astrophysics Data System (ADS)
Kalafatis, S.
2014-12-01
Generating the capacity to facilitate the informed usage of climate change science by decision makers on a large scale is fast becoming an area of great concern. While research demonstrates that sustained interactions between producers of such information and potential users can overcome barriers to information usage, it also demonstrates the high resource demand of these efforts. Our social science work at Great Lakes Integrated Sciences and Assessments (GLISA) sheds light on scaling up the usability of climate science through two research areas. The first focuses on partnerships with other boundary organizations that GLISA has leveraged - the "boundary chains" approach. These partnerships reduce the transaction costs involved with outreach and have enhanced the scope of GLISA's climate service efforts to encompass new users such as First Nations groups in Wisconsin and Michigan and underserved neighborhoods in St. Paul, Minnesota. The second research area looks at the development of information usability across the regional scale of the eight Great Lakes states. It has identified the critical role that communities of practice are playing in making information usable to large groups of users who work in similar contexts and have similar information needs. Both these research areas demonstrate the emerging potential of flexible knowledge networks to enhance society's ability to prepare for the impacts of climate change.
Unintended consequences of increasing block tariffs pricing policy in urban water
NASA Astrophysics Data System (ADS)
Dahan, Momi; Nisan, Udi
2007-03-01
We exploit a unique data set to estimate the degree of economies of scale in water consumption, controlling for the standard demand factors. We found a linear Engel curve in water consumption: each additional household member consumes the same water quantity regardless of household size, except for a single-person household. Our evidence suggests that the increasing block tariffs (IBT) structure, which is indifferent to household size, has unintended consequences. Large households, which are also likely to be poor given the negative correlation between income and household size, are charged a higher price for water. The degree of economies of scale found here erodes the effectiveness of IBT price structure as a way to introduce an equity consideration. This implication is important in view of the global trend toward the use of IBT.
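The unintended consequence described above can be illustrated with a small billing calculation: under a size-blind increasing block tariff, larger households cross into more expensive blocks even when each member uses the same amount of water. The block limits, prices and per-capita consumption in this Python sketch are made-up illustrations, not figures from the study.

blocks = [(8, 1.0), (15, 2.0), (float("inf"), 4.0)]   # (upper limit m3/month, price per m3), assumed
per_capita_use = 4.0                                   # m3 per person per month (assumed)

def monthly_bill(volume):
    bill, lower = 0.0, 0.0
    for upper, price in blocks:
        used_in_block = max(0.0, min(volume, upper) - lower)
        bill += used_in_block * price
        lower = upper
        if volume <= upper:
            break
    return bill

for members in range(1, 7):
    volume = members * per_capita_use
    bill = monthly_bill(volume)
    print(f"{members}-person household: {volume:4.0f} m3, average price {bill / volume:.2f} per m3")

With these assumed numbers the average price per cubic metre rises steadily with household size, which is exactly the equity problem the abstract points to.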
Implementation of AN Unmanned Aerial Vehicle System for Large Scale Mapping
NASA Astrophysics Data System (ADS)
Mah, S. B.; Cryderman, C. S.
2015-08-01
Unmanned Aerial Vehicles (UAVs), digital cameras, powerful personal computers, and software have made it possible for geomatics professionals to capture aerial photographs and generate digital terrain models and orthophotographs without using full scale aircraft or hiring mapping professionals. This has been made possible by the availability of miniaturized computers and sensors, and software which has been driven, in part, by the demand for this technology in consumer items such as smartphones. The other force that is in play is the increasing number of Do-It-Yourself (DIY) people who are building UAVs as a hobby or for professional use. Building a UAV system for mapping is an alternative to purchasing a turnkey system. This paper describes factors to be considered when building a UAV mapping system, the choices made, and the test results of a project using this completed system.
Electrochemical micro/nano-machining: principles and practices.
Zhan, Dongping; Han, Lianhuan; Zhang, Jie; He, Quanfeng; Tian, Zhao-Wu; Tian, Zhong-Qun
2017-03-06
Micro/nano-machining (MNM) is becoming the cutting-edge of high-tech manufacturing because of the increasing industrial demand for supersmooth surfaces and functional three-dimensional micro/nano-structures (3D-MNS) in ultra-large scale integrated circuits, microelectromechanical systems, miniaturized total analysis systems, precision optics, and so on. Taking advantage of no tool wear, no surface stress, environmental friendliness, simple operation, and low cost, electrochemical micro/nano-machining (EC-MNM) has an irreplaceable role in MNM. This comprehensive review presents the state-of-art of EC-MNM techniques for direct writing, surface planarization and polishing, and 3D-MNS fabrications. The key point of EC-MNM is to confine electrochemical reactions at the micro/nano-meter scale. This review will bring together various solutions to "confined reaction" ranging from electrochemical principles through technical characteristics to relevant applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, H.-W.; Chang, N.-B., E-mail: nchang@mail.ucf.ed; Chen, J.-C.
2010-07-15
Owing to limited land resources, incinerators are considered in many countries, such as Japan and Germany, to be the major technology for a waste management scheme capable of dealing with the increasing demand for municipal and industrial solid waste treatment in urban regions. The evaluation of these municipal incinerators in terms of secondary pollution potential, cost-effectiveness, and operational efficiency has become a new focus in the highly interdisciplinary area of production economics, systems analysis, and waste management. This paper aims to demonstrate the application of data envelopment analysis (DEA) - a production economics tool - to evaluate performance-based efficiencies of 19 large-scale municipal incinerators in Taiwan with different operational conditions. A 4-year operational data set from 2002 to 2005 was collected in support of DEA modeling using Monte Carlo simulation to outline the possibility distributions of operational efficiency of these incinerators. Uncertainty analysis using the Monte Carlo simulation provides a balance between simplifications of our analysis and the soundness of capturing the essential random features that complicate solid waste management systems. To cope with future challenges, efforts in the DEA modeling, systems analysis, and prediction of the performance of large-scale municipal solid waste incinerators under normal operation and special conditions were directed toward generating a compromise assessment procedure. Our research findings will eventually lead to the identification of the optimal management strategies for promoting the quality of solid waste incineration, not only in Taiwan, but also elsewhere in the world.
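For readers unfamiliar with DEA, the Python sketch below solves a standard input-oriented CCR (constant returns to scale) efficiency problem with scipy on made-up data for a handful of incinerators (inputs: annual cost and staff; outputs: waste treated and electricity sold). It is a generic textbook formulation, not the paper's exact model or data, and the Monte Carlo uncertainty layer described above is omitted.

import numpy as np
from scipy.optimize import linprog

# Rows = decision-making units (incinerators); all figures are hypothetical.
X = np.array([[100.0, 50.0], [120.0, 60.0], [ 90.0, 55.0], [150.0, 80.0]])  # inputs
Y = np.array([[300.0, 20.0], [330.0, 22.0], [310.0, 25.0], [360.0, 24.0]])  # outputs

n, m = X.shape
_, s = Y.shape

def ccr_efficiency(o):
    # Variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.concatenate(([1.0], np.zeros(n)))
    # Input constraints:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack((-X[o].reshape(m, 1), X.T))
    b_in = np.zeros(m)
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack((np.zeros((s, 1)), -Y.T))
    b_out = -Y[o]
    res = linprog(c,
                  A_ub=np.vstack((A_in, A_out)),
                  b_ub=np.concatenate((b_in, b_out)),
                  bounds=[(0, None)] * (1 + n))
    return res.fun

for o in range(n):
    print(f"incinerator {o}: CCR efficiency = {ccr_efficiency(o):.3f}")

An efficiency of 1.0 places the unit on the empirical frontier; values below 1.0 indicate the proportional input reduction that would be needed to reach it.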
Islam, Md Ashraful; Kim, Jung Han; Schropp, Anthony; Kalita, Hirokjyoti; Choudhary, Nitin; Weitzman, Dylan; Khondaker, Saiful I; Oh, Kyu Hwan; Roy, Tania; Chung, Hee-Suk; Jung, Yeonwoong
2017-10-11
Two-dimensional (2D) transition metal dichalcogenides (TMDs) such as molybdenum or tungsten disulfides (MoS2 or WS2) exhibit extremely large in-plane strain limits and unusual optical/electrical properties, offering unprecedented opportunities for flexible electronics/optoelectronics in new form factors. In order for them to be technologically viable building-blocks for such emerging technologies, it is critical to grow/integrate them onto flexible or arbitrary-shaped substrates on a large wafer-scale compatible with the prevailing microelectronics processes. However, conventional approaches to assemble them on such unconventional substrates via mechanical exfoliations or coevaporation chemical growths have been limited to small-area transfers of 2D TMD layers with uncontrolled spatial homogeneity. Moreover, additional processes involving a prolonged exposure to strong chemical etchants have been required for the separation of as-grown 2D layers, which is detrimental to their material properties. Herein, we report a viable strategy to universally combine the centimeter-scale growth of various 2D TMD layers and their direct assemblies on mechanically deformable substrates. By exploring the water-assisted debonding of gold (Au) interfaced with silicon dioxide (SiO2), we demonstrate the direct growth, transfer, and integration of 2D TMD layers and heterostructures such as 2D MoS2 and 2D MoS2/WS2 vertical stacks on centimeter-scale plastic and metal foil substrates. We identify the dual function of the Au layer as a growth substrate as well as a sacrificial layer which facilitates 2D layer transfer. Furthermore, we demonstrate the versatility of this integration approach by fabricating centimeter-scale 2D MoS2/single-walled carbon nanotube (SWNT) vertical heterojunctions which exhibit current rectification and photoresponse. This study opens a pathway to explore large-scale 2D TMD van der Waals layers as device building blocks for emerging mechanically deformable electronics/optoelectronics.
Using Microsensor Technology to Quantify Match Demands in Collegiate Women's Volleyball.
Vlantes, Travis G; Readdy, Tucker
2017-12-01
Vlantes, TG and Readdy, T. Using microsensor technology to quantify match demands in collegiate women's volleyball. J Strength Cond Res 31(12): 3266-3278, 2017-The purpose of this study was to quantify internal and external load demands of women's NCAA Division I collegiate volleyball competitions using microsensor technology and session rating of perceived exertion (S-RPE). Eleven collegiate volleyball players wore microsensor technology (Optimeye S5; Catapult Sports, Chicago, IL, USA) during 15 matches played throughout the 2016 season. Parameters examined include player load (PL), high impact PL, percentage of HI PL, explosive efforts (EEs), and jumps. Session rating of perceived exertion was collected 20 minutes postmatch using a modified Borg scale. The relationship between internal and external load was explored, comparing S-RPE data with the microsensor metrics (PL, HI PL, % HI PL, EEs, and jumps). The setter had the greatest mean PL and highest number of jumps of all positions in a 5-1 system, playing all 6 rotations. Playing 4 sets yielded a mean PL increase of 25.1% over 3 sets, whereas playing 5 sets showed a 31.0% increase in PL. A multivariate analysis of variance revealed significant differences (p < 0.01) across all position groups when examining % HI PL and jumps. Cohen's d analysis revealed large (≥0.8) effect sizes for these differences. Defensive specialists recorded the greatest mean S-RPE values over all 15 matches (886 ± 384.6). Establishing positional load demands allows coaches, trainers, and strength and conditioning professionals to implement training programs for position-specific demands, creating consistent peak performance, and reducing injury risk.
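The effect-size statistic cited above (Cohen's d with a pooled standard deviation) can be computed as in the short Python sketch below. The jump counts used here are hypothetical examples, not data from the study.

import numpy as np

def cohens_d(group_a, group_b):
    a, b = np.asarray(group_a, float), np.asarray(group_b, float)
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

setter_jumps = [95, 102, 88, 110, 99, 105]   # jumps per match (illustrative values)
libero_jumps = [12, 9, 15, 11, 8, 14]

d = cohens_d(setter_jumps, libero_jumps)
print(f"Cohen's d = {d:.2f}  (|d| >= 0.8 is conventionally interpreted as a large effect)")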
Noren, Shawn R.; Udevitz, Mark S.; Jay, Chadwick V.
2014-01-01
Decreases in sea ice have altered habitat use and activity patterns of female Pacific walruses Odobenus rosmarus divergens and could affect their energetic demands, reproductive success, and population status. However, a lack of physiological data from walruses has hampered efforts to develop the bioenergetics models required for fully understanding potential population-level impacts. We analyzed long-term longitudinal data sets of caloric consumption and body mass from nine female Pacific walruses housed at six aquaria using a hierarchical Bayesian approach to quantify relative energetic demands for maintenance, growth, pregnancy, and lactation. By examining body mass fluctuations in response to food consumption, the model explicitly uncoupled caloric demand from caloric intake. This is important for pinnipeds because they sequester and deplete large quantities of lipids throughout their lifetimes. Model outputs were scaled to account for activity levels typical of free-ranging Pacific walruses, averaging 83% of the time active in water and 17% of the time hauled-out resting. Estimated caloric requirements ranged from 26,900 kcal d−1 for 2-yr-olds to 93,370 kcal d−1 for simultaneously lactating and pregnant walruses. Daily consumption requirements were higher for pregnancy than lactation, reflecting energetic demands of increasing body size and lipid deposition during pregnancy. Although walruses forage during lactation, fat sequestered during pregnancy sustained 27% of caloric requirements during the first month of lactation, suggesting that walruses use a mixed strategy of capital and income breeding. Ultimately, this model will aid in our understanding of the energetic and population consequences of sea ice loss.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wan, Wenhua; Zhao, Jianshi; Li, Hong-Yi
Hydrological drought is a substantial negative deviation from normal hydrologic conditions and is influenced by climate and human activities such as water management. By perturbing the streamflow regime, climate change and water management may significantly alter drought characteristics in the future. Here we utilize a high-resolution integrated modeling framework that represents water management in terms of both local surface water extraction and reservoir regulation, and use the Standardized Streamflow Index (SSI) to quantify hydrological drought. We explore the impacts of water management on hydrological drought over the contiguous US in a warming climate with and without emissions mitigation. Despite the uncertainty of climate change impacts, local surface water extraction consistently intensifies drought that dominates at the regional to national scale. However, reservoir regulation alleviates drought by enhancing summer flow downstream of reservoirs. The relative dominance of drought intensification or relief is largely determined by the water demand, with drought intensification dominating in regions with intense water demand such as the Great Plains and California, while drought relief dominates in regions with low water demand. At the national level, water management increases the spatial extent of extreme drought despite some alleviations of moderate to severe drought. In an emissions mitigation scenario with increased irrigation demand for bioenergy production, water management intensifies drought more than the business-as-usual scenario at the national level, so the impacts of emissions mitigation must be evaluated by considering its benefit in reducing warming and evapotranspiration against its effects on increasing water demand and intensifying drought.
NASA Astrophysics Data System (ADS)
Wan, Wenhua; Zhao, Jianshi; Li, Hong-Yi; Mishra, Ashok; Ruby Leung, L.; Hejazi, Mohamad; Wang, Wei; Lu, Hui; Deng, Zhiqun; Demissisie, Yonas; Wang, Hao
2017-11-01
Hydrological drought is a substantial negative deviation from normal hydrologic conditions and is influenced by climate and human activities such as water management. By perturbing the streamflow regime, climate change and water management may significantly alter drought characteristics in the future. Here we utilize a high-resolution integrated modeling framework that represents water management in terms of both local surface water extraction and reservoir regulation and use the Standardized Streamflow Index to quantify hydrological drought. We explore the impacts of water management on hydrological drought over the contiguous U.S. in a warming climate with and without emissions mitigation. Despite the uncertainty of climate change impacts, local surface water extraction consistently intensifies drought that dominates at the regional to national scale. However, reservoir regulation alleviates drought by enhancing summer flow downstream of reservoirs. The relative dominance of drought intensification or relief is largely determined by the water demand, with drought intensification dominating in regions with intense water demand such as the Great Plains and California, while drought relief dominates in regions with low water demand. At the national level, water management increases the spatial extent of extreme drought despite some alleviations of moderate to severe drought. In an emissions mitigation scenario with increased irrigation demand for bioenergy production, water management intensifies drought more than the business-as-usual scenario at the national level, so the impacts of emissions mitigation must be evaluated by considering its benefit in reducing warming and evapotranspiration against its effects on increasing water demand and intensifying drought.
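As background on the drought metric used above, the Standardized Streamflow Index is typically obtained by fitting a distribution to each calendar month's flows and mapping the fitted cumulative probabilities through the inverse standard normal. The Python sketch below does this on synthetic data; the gamma distribution, the fixed zero location and the -1.5 severity threshold are common conventions assumed here, and may differ from the study's exact choices.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
years, months = 40, 12
# Synthetic monthly streamflow with a seasonal cycle (m3/s), standing in for model output.
seasonal = 50 + 30 * np.sin(2 * np.pi * np.arange(months) / 12)
flow = rng.gamma(shape=2.0, scale=seasonal / 2.0, size=(years, months))

ssi = np.empty_like(flow)
for m in range(months):
    sample = flow[:, m]
    shape, loc, scale = stats.gamma.fit(sample, floc=0)       # fit per calendar month
    cdf = stats.gamma.cdf(sample, shape, loc=loc, scale=scale)
    ssi[:, m] = stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))  # standardized index

drought_months = int((ssi < -1.5).sum())
print(f"months classified as severe-or-worse drought (SSI < -1.5): {drought_months}")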
Higher Education and MOOCS in India and the Global South
ERIC Educational Resources Information Center
Alcorn, Brandon; Christensen, Gayle; Kapur, Devesh
2015-01-01
Demographic surges and economic growth have created an exploding demand for higher education in the Global South--a demand that low- and middle-income countries cannot realistically meet with traditional institutions alone. In India, the demand increasingly is being met by online education. Recently, MOOCs--with their potential to scale up rapidly…
Climate-wise choices in a world of oil abundance
NASA Astrophysics Data System (ADS)
Brandt, Adam R.; Masnadi, Mohammad S.; Englander, Jacob G.; Koomey, Jonathan; Gordon, Deborah
2018-04-01
Constrained oil supply has given way to abundance at a time when strong action on climate change is wavering. Recent innovation has pushed US oil production to all-time heights and driven oil prices lower. At the same time, attention to climate policy is wavering due to geopolitical upheaval. Nevertheless, climate-wise choices in the oil sector remain a priority, given oil’s large role in modern economies. Here we use a set of open-source models along with a detailed dataset comprising 75 global crude oils (~25% of global production) to estimate the effects of carbon intensity and oil demand on decadal scale oil-sector emissions. We find that oil resources are abundant relative to all projections of 21st century demand, due to large light-tight oil (LTO) and heavy oil/bitumen (HOB) resources. We then investigate the ‘barrel forward’ emissions from producing, refining, and consuming all products from a barrel of crude. These oil resources have diverse life-cycle-greenhouse gas (LC-GHG) emissions impacts, and median per-barrel emissions for unconventional resources vary significantly. Median HOB life cycle emissions are 1.5 times those of median LTO emissions, exceeding them by 200 kgCO2eq./bbl. We show that reducing oil LC-GHGs is a mitigation opportunity worth 10–50 gigatonnes CO2 eq. cumulatively by 2050. We discuss means to reduce oil sector LC-GHGs. Results point to the need for policymakers to address both oil supply and oil demand when considering options to reduce LC-GHGs.
Kvist, Tarja; Mäntynen, Raija; Partanen, Pirjo; Turunen, Hannele; Miettinen, Merja; Vehviläinen-Julkunen, Katri
2012-01-01
This paper describes the development of the Kuopio University Hospital Job Satisfaction Scale (KUHJSS) and the results of the survey. The scale was developed through a systematic literature review, and its validity and reliability were assessed using several psychometric properties including expert evaluation (n = 5), a pilot survey (n = 172), and exploratory factor analysis. The final version of KUHJSS included 37 items. A large sample psychometric evaluation was made by nursing staff (n = 2708). The exploratory factor analysis revealed seven factors with modest internal consistency (0.64–0.92). The staff reported relatively high job satisfaction. The greatest satisfaction was derived from motivating factors associated with the work; the least, from the job's demands. Respondents who considered their working units to provide an excellent quality of care reported the highest job satisfaction in every subarea (P < .0001). The KUHJSS proved to be a reliable and valid tool for measuring job satisfaction in hospital care. PMID:23133750
NASA Astrophysics Data System (ADS)
Pizette, Patrick; Govender, Nicolin; Wilke, Daniel N.; Abriak, Nor-Edine
2017-06-01
The use of the Discrete Element Method (DEM) for industrial civil engineering applications is currently limited due to the computational demands when large numbers of particles are considered. The graphics processing unit (GPU), with its highly parallelized hardware architecture, shows potential to enable solution of civil engineering problems using discrete granular approaches. We demonstrate in this study the practical utility of a validated GPU-enabled DEM modeling environment to simulate industrial-scale granular problems. As an illustration, the flow discharge of storage silos using 8 and 17 million particles is considered. DEM simulations have been performed to investigate the influence of particle size (equivalent size for the 20/40-mesh gravel) and induced shear stress for two hopper shapes. The preliminary results indicate that the shape of the hopper significantly influences the discharge rates for the same material. Specifically, this work shows that GPU-enabled DEM modeling environments can model industrial-scale problems on a single portable computer within a day for 30 seconds of process time.
CD-ROM technology at the EROS data center
Madigan, Michael E.; Weinheimer, Mary C.
1993-01-01
The vast amount of digital spatial data often required by a single user has created a demand for media alternatives to 1/2" magnetic tape. One such medium that has been recently adopted at the U.S. Geological Survey's EROS Data Center is the compact disc (CD). CD's are a versatile, dynamic, and low-cost method for providing a variety of data on a single media device and are compatible with various computer platforms. CD drives are available for personal computers, UNIX workstations, and mainframe systems, either directly connected, or through a network. This medium furnishes a quick method of reproducing and distributing large amounts of data on a single CD. Several data sets are already available on CD's, including collections of historical Landsat multispectral scanner data and biweekly composites of Advanced Very High Resolution Radiometer data for the conterminous United States. The EROS Data Center intends to provide even more data sets on CD's. Plans include specific data sets on a customized disc to fulfill individual requests, and mass production of unique data sets for large-scale distribution. Requests for a single compact disc-read only memory (CD-ROM) containing a large volume of data either for archiving or for one-time distribution can be addressed with a CD-write once (CD-WO) unit. Mass production and large-scale distribution will require CD-ROM replication and mastering.
Open Source Tools for Assessment of Global Water Availability, Demands, and Scarcity
NASA Astrophysics Data System (ADS)
Li, X.; Vernon, C. R.; Hejazi, M. I.; Link, R. P.; Liu, Y.; Feng, L.; Huang, Z.; Liu, L.
2017-12-01
Water availability and water demands are essential factors for estimating water scarcity conditions. To reproduce historical observations and to quantify future changes in water availability and water demand, two open source tools have been developed by the JGCRI (Joint Global Change Research Institute): Xanthos and GCAM-STWD. Xanthos is a gridded global hydrologic model, designed to quantify and analyze water availability in 235 river basins. Xanthos uses runoff generation and river routing modules to simulate both historical and future estimates of total runoff and streamflows on a monthly time step at a spatial resolution of 0.5 degrees. GCAM-STWD is a spatiotemporal water disaggregation model used with the Global Change Assessment Model (GCAM) to spatially downscale global water demands for six major end-use sectors (irrigation, domestic, electricity generation, mining, and manufacturing) from the region scale to the scale of 0.5 degrees. GCAM-STWD then temporally downscales the gridded annual global water demands to monthly results. These two tools, written in Python, can be integrated to assess global, regional or basin-scale water scarcity or water stress. Both of the tools are extensible to ensure flexibility and promote contribution from researchers who utilize GCAM and study global water use and supply.
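The kind of scarcity assessment these two tools enable can be sketched as a withdrawal-to-availability ratio on a half-degree grid, as in the Python snippet below. The arrays are random placeholders and the 0.4 threshold is a commonly used "high stress" convention; this is the shape of the calculation only, not Xanthos or GCAM-STWD code.

import numpy as np

rng = np.random.default_rng(7)
nlat, nlon = 360, 720                                   # 0.5-degree global grid

runoff = rng.gamma(2.0, 50.0, size=(nlat, nlon))        # mm/yr, placeholder availability
demand = rng.gamma(1.5, 20.0, size=(nlat, nlon))        # mm/yr, placeholder withdrawals

with np.errstate(divide="ignore", invalid="ignore"):
    wta = np.where(runoff > 0, demand / runoff, np.inf)  # withdrawal-to-availability ratio

stressed = wta > 0.4                                     # assumed high-stress threshold
print(f"fraction of grid cells flagged as water-stressed: {stressed.mean():.2%}")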
OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics
NASA Astrophysics Data System (ADS)
Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.
2014-12-01
OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures to provide efficient Web-based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data, higher order data products, as well as user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus the ones that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high performance disk storage (SSD) for the hot areas and less expensive slower disk for the cold ones, thereby optimizing price to performance. From a compute perspective, OT is looking at cloud-based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute-intensive workloads like parallel computation of hydrologic routing on high resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. With a growing user base and maturing scientific user community comes new requests for algorithms and processing capabilities. To address this demand, OT is developing an extensible service-based architecture for integrating community-developed software. This "pluggable" approach to Web service deployment will enable new processing and analysis tools to run collocated with OT-hosted data.
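The hot/cold tiering idea described above reduces, at its simplest, to counting recent accesses per dataset tile and assigning frequently hit tiles to fast storage. The Python sketch below illustrates that logic; the log records, identifiers and thresholds are invented for illustration and are not OT's actual log format or policy.

from collections import Counter

# (tile_id, unix_timestamp) access records, e.g. parsed from web server logs (hypothetical).
access_log = [
    ("tile_042", 1_700_000_000), ("tile_042", 1_700_000_600),
    ("tile_042", 1_700_003_000), ("tile_017", 1_700_001_200),
    ("tile_099", 1_690_000_000),            # old access, outside the counting window
]

window_start = 1_699_000_000                # only count accesses within the recent window
hot_threshold = 2                           # accesses required to be considered "hot"

recent_counts = Counter(tile for tile, ts in access_log if ts >= window_start)

all_tiles = {tile for tile, _ in access_log}
for tile in sorted(all_tiles):
    tier = "ssd" if recent_counts[tile] >= hot_threshold else "hdd"
    print(f"{tile}: {recent_counts[tile]} recent accesses -> {tier}")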
The Work-Health-Check (WHC): a brief new tool for assessing psychosocial stress in the workplace.
Gadinger, M C; Schilling, O; Litaker, D; Fischer, J E
2012-01-01
Brief, psychometrically robust questionnaires assessing work-related psychosocial stressors are lacking. The purpose of the study is to evaluate the psychometric properties of a brief new questionnaire for assessing sources of work-related psychosocial stress. Participants were managers and blue- and white-collar workers (n=628 at measurement point one, n=459 at measurement point two) sampled from an online panel of a German marketing research institute. We either developed or identified appropriate items from existing questionnaires for ten scales, which are conceptually grounded in work stress models and reflect either work-related demands or resources. Factorial structure was evaluated by confirmatory factor analyses (CFA). Scale reliability was assessed by Cronbach's Alpha and test-retest reliability; correlations with work-related efforts demonstrated convergent and discriminant validity for the demand and resource scales, respectively. Scale correlations with health indicators tested criterion validity. All scales had satisfactory reliability (Cronbach's Alpha: 0.74-0.93, retest reliabilities: 0.66-0.81). CFA supported the anticipated factorial structure. Significant correlations between job-related efforts and demand scales (mean r=0.44) and non-significant correlations with the resource scales (mean r=0.07) suggested good convergent and discriminant validity, respectively. Scale correlations with health indicators demonstrated good criterion validity. The WHC appears to be a brief, psychometrically robust instrument for assessing work-related psychosocial stressors.
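For reference, the internal-consistency statistic reported above (Cronbach's alpha) is computed from a respondents-by-items score matrix as in the Python sketch below. The response data here are synthetic, not WHC data.

import numpy as np

def cronbach_alpha(scores):
    # scores: respondents x items
    scores = np.asarray(scores, float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(3)
true_score = rng.normal(size=(200, 1))                    # latent construct
items = true_score + 0.8 * rng.normal(size=(200, 5))      # five noisy items per respondent

print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")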
Climate variability and the European agricultural production
NASA Astrophysics Data System (ADS)
Guimarães Nobre, Gabriela; Hunink, Johannes E.; Baruth, Bettina; Aerts, Jeroen C. J. H.; Ward, Philip J.
2017-04-01
By 2050, the global demand for maize, wheat and other major crops is expected to grow sharply. To meet this challenge, agricultural systems have to increase their production substantially. However, the expanding world population, coupled with a decline of arable land per person, and the variability in global climate, are obstacles to meeting the increasing demand. Creating a resilient agricultural system requires the incorporation of preparedness measures against weather-related events, which can trigger disruptive risks such as droughts. This study examines the influence of large-scale climate variability on agricultural production, applying a robust decision-making tool named fast-and-frugal trees (FFT). We created FFTs using a dataset of crop production and indices of climate variability: the El Niño Southern Oscillation (SOI) and the North Atlantic Oscillation (NAO). Our main goal is to predict the occurrence of below-average crop production, using these two indices at different lead times. Initial results indicated that SOI and NAO have strong links with low European sugar beet production. For some areas, the FFTs were able to detect below-average productivity events six months before harvesting with a hit rate and positive predictive value higher than 70%. We found that shorter lead times, such as three months before harvesting, have the highest predictive skill. Additionally, we observed that the responses of low production events to the phases of the NAO and SOI vary spatially and seasonally. By clarifying the relationship between large-scale climate variability and European drought-related agricultural impacts, this study reflects on how this information could potentially improve the management of the agricultural sector by coupling the findings with a seasonal forecasting system of crop production.
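A fast-and-frugal tree is simply a short, ordered sequence of binary cues in which every node offers an immediate exit. The Python sketch below shows the structure of such a tree for predicting below-average production from the two climate indices; the cue order, thresholds and exits are invented for illustration, whereas in the study the trees are fit to crop-production data at several lead times.

def fft_below_average(soi, nao):
    # Cue 1: strongly negative SOI (El Nino-like conditions) -> exit: predict low production.
    if soi < -1.0:
        return True
    # Cue 2: strongly positive NAO -> exit: predict normal-or-better production.
    if nao > 1.0:
        return False
    # Final cue: mildly negative SOI decides the remaining cases.
    return soi < -0.25

examples = [(-1.4, 0.2), (0.3, 1.5), (-0.5, -0.3), (0.8, 0.1)]
for soi, nao in examples:
    flag = fft_below_average(soi, nao)
    print(f"SOI={soi:+.1f}, NAO={nao:+.1f} -> below-average production predicted: {flag}")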
Suzuki, Tomoko; Miyaki, Koichi; Tsutsumi, Akizumi; Hashimoto, Hideki; Kawakami, Norito; Takahashi, Masaya; Shimazu, Akihito; Inoue, Akiomi; Kurioka, Sumiko; Kakehashi, Masayuki; Sasaki, Yasuharu; Shimbo, Takuro
2013-09-05
This study examined the association between traditional Japanese dietary pattern and depressive symptoms in Japanese workers, employing large-scale samples, considering socioeconomic status (SES) and job stress factors. A cross-sectional study of 2266 Japanese employees aged 21-65 years from all areas of Japan was conducted as part of the Japanese Study of Health, Occupation and Psychosocial factors related Equity (J-HOPE). Habitual diet was assessed by FFQ (BDHQ). The depression degree and job stress factors (job demand, job control, and worksite support) were measured by K6 and Job Content Questionnaire. Participants with high scores for the balanced Japanese dietary pattern were significantly less likely to show probable mood/anxiety disorders (K6≥9) with multivariate adjustment including SES and job stress factors (odds ratio=0.66 [0.51-0.86], trend P=0.002). Other dietary patterns were not associated with depressive symptoms. Even after stratification by job stress factors, the Japanese dietary pattern was consistently protective against depressive symptoms. Furthermore, a highly significant difference between the first and third tertiles of the dietary pattern was observed in participants with active strain (high demand and high control) with low worksite supports (8.5 vs. 5.2, P=0.011). Female participant sample was relatively small. Japanese dietary pattern consistently related to low depressive symptoms in this large-scale cohort of Japanese workers, even after adjusting for SES and job stress factors. The protective impact is especially strong for workers with active strain and low support. Making better use of traditional dietary patterns may facilitate reducing social disparities in mental health. Copyright © 2013 Elsevier B.V. All rights reserved.
Energy efficiency design strategies for buildings with grid-connected photovoltaic systems
NASA Astrophysics Data System (ADS)
Yimprayoon, Chanikarn
The building sector in the United States represents more than 40% of the nation's energy consumption. Energy efficiency design strategies and renewable energy are keys to reduce building energy demand. Grid-connected photovoltaic (PV) systems installed on buildings have been the fastest growing market in the PV industry. This growth poses challenges for buildings qualified to serve in this market sector. Electricity produced from solar energy is intermittent. Matching building electricity demand with PV output can increase PV system efficiency. Through experimental methods and case studies, computer simulations were used to investigate the priorities of energy efficiency design strategies that decreased electricity demand while producing load profiles matching with unique output profiles from PV. Three building types (residential, commercial, and industrial) of varying sizes and use patterns located in 16 climate zones were modeled according to ASHRAE 90.1 requirements. Buildings were analyzed individually and as a group. Complying with ASHRAE energy standards can reduce annual electricity consumption at least 13%. With energy efficiency design strategies, the reduction could reach up to 65%, making it possible for PV systems to meet reduced demands in residential and industrial buildings. The peak electricity demand reduction could be up to 71% with integration of strategies and PV. Reducing lighting power density was the best single strategy with high overall performances. Combined strategies such as zero energy building are also recommended. Electricity consumption reductions are the sum of the reductions from strategies and PV output. However, peak electricity reductions were less than their sum because they reduced peak at different times. The potential of grid stress reduction is significant. Investment incentives from government and utilities are necessary. The PV system sizes on net metering interconnection should not be limited by legislation existing in some states. Data from this study provides insight of impacts from applying energy efficiency design strategies in buildings with grid-connected PV systems. With the current transition from traditional electric grids to future smart grids, this information plus large database of various building conditions allow possible investigations needed by governments or utilities in large scale communities for implementing various measures and policies.
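The abstract's observation that peak reductions from efficiency measures and PV are not additive can be illustrated with a toy hourly profile: PV output peaks near midday while the building's demand peaks in the late afternoon, so the two measures shave the peak at different hours. The 24-hour profiles in the Python sketch below are invented for illustration, not simulation results from the study.

import numpy as np

hours = np.arange(24)
base_load = 40 + 30 * np.exp(-((hours - 17) ** 2) / 8.0)             # kW, peaks around 5 pm (assumed)
efficiency_savings = 0.20 * base_load                                  # assumed 20% cut in every hour
pv_output = 25 * np.clip(np.sin(np.pi * (hours - 6) / 12), 0, None)    # kW, peaks at noon (assumed)

peak_base = base_load.max()
peak_eff = (base_load - efficiency_savings).max()
peak_pv = (base_load - pv_output).max()
peak_both = (base_load - efficiency_savings - pv_output).max()

print(f"baseline peak:            {peak_base:.1f} kW")
print(f"efficiency only:          {peak_base - peak_eff:.1f} kW reduction")
print(f"PV only:                  {peak_base - peak_pv:.1f} kW reduction")
print(f"efficiency + PV together: {peak_base - peak_both:.1f} kW reduction")
print(f"sum of individual cuts:   {(peak_base - peak_eff) + (peak_base - peak_pv):.1f} kW")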
Brackish groundwater in the United States
Stanton, Jennifer S.; Anning, David W.; Brown, Craig J.; Moore, Richard B.; McGuire, Virginia L.; Qi, Sharon L.; Harris, Alta C.; Dennehy, Kevin F.; McMahon, Peter B.; Degnan, James R.; Böhlke, John Karl
2017-04-05
For some parts of the Nation, large-scale development of groundwater has caused decreases in the amount of groundwater that is present in aquifer storage and that discharges to surface-water bodies. Water supply in some areas, particularly in arid and semiarid regions, is not adequate to meet demand, and severe drought is affecting large parts of the United States. Future water demand is projected to heighten the current stress on groundwater resources. This combination of factors has led to concerns about the availability of freshwater to meet domestic, agricultural, industrial, mining, and environmental needs. To ensure the water security of the Nation, currently [2016] untapped water sources may need to be developed. Brackish groundwater is an unconventional water source that may offer a partial solution to current and future water demands. In support of the national census of water resources, the U.S. Geological Survey completed the national brackish groundwater assessment to better understand the occurrence and characteristics of brackish groundwater in the United States as a potential water resource. Analyses completed as part of this assessment relied on previously collected data from multiple sources; no new data were collected. Compiled data included readily available information about groundwater chemistry, horizontal and vertical extents and hydrogeologic characteristics of principal aquifers (regionally extensive aquifers or aquifer systems that have the potential to be used as a source of potable water), and groundwater use. Although these data were obtained from a wide variety of sources, the compiled data are biased toward shallow and fresh groundwater resources; data representing groundwater that is at great depths and is saline were not as readily available. One of the most important contributions of this assessment is the creation of a database containing chemical characteristics and aquifer information for the known areas with brackish groundwater in the United States. Previously published digital data relating to brackish groundwater resources were limited to a small number of State- and regional-level studies. Data sources for this assessment ranged from single publications to large datasets and from local studies to national assessments. Geochemical data included concentrations of dissolved solids, major ions, trace elements, nutrients, and radionuclides as well as physical properties of the water (pH, temperature, and specific conductance). Additionally, the database provides selected well information (location, yield, depth, and contributing aquifer) necessary for evaluating the water resource. The assessment was divided into national-, regional-, and aquifer-scale analyses. National-scale analyses included evaluation of the three-dimensional distribution of observed dissolved-solids concentrations in groundwater, the three-dimensional probability of brackish groundwater occurrence, and the geochemical characteristics of saline (greater than or equal to 1,000 mg/L of dissolved solids) groundwater resources. Regional-scale analyses included a summary of the percentage of observed grid cell volume in the region that was occupied by brackish groundwater within the mixture of air, water, and rock for multiple depth intervals.
Aquifer-scale analyses focused primarily on four regions that contained the largest amounts of observed brackish groundwater and included a generalized description of hydrogeologic characteristics from previously published work; the distribution of dissolved-solids concentrations; considerations for developing brackish groundwater resources, including a summary of other chemical characteristics that may limit the use of brackish groundwater and the ability of sampled wells producing brackish groundwater to yield useful amounts of water; and the amount of saline groundwater being used in 2010.
NASA Astrophysics Data System (ADS)
Du, Shihong; Zhang, Fangli; Zhang, Xiuyuan
2015-07-01
While most existing studies have focused on extracting geometric information on buildings, only a few have concentrated on semantic information. This lack of semantic information leaves many demands related to resolving environmental and social issues unmet. This study presents an approach to semantically classify buildings into much finer categories than those of existing studies by learning a random forest (RF) classifier from a large number of imbalanced samples with high-dimensional features. First, a two-level segmentation mechanism combining GIS and VHR image produces single image objects at a large scale and intra-object components at a small scale. Second, a semi-supervised method chooses a large number of unbiased samples by considering the spatial proximity and intra-cluster similarity of buildings. Third, two important improvements in the RF classifier are made: a voting-distribution ranked rule for reducing the influence of imbalanced samples on classification accuracy and a feature importance measurement for evaluating each feature's contribution to the recognition of each category. Fourth, the semantic classification of urban buildings is practically conducted in Beijing city, and the results demonstrate that the proposed approach is effective and accurate. The seven categories used in the study are finer than those in existing work and more helpful for studying many environmental and social problems.
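The abstract describes the authors' improvements (a voting-distribution ranked rule and a per-category feature importance measure) only at a high level. As a minimal sketch of the general idea, training a random forest on imbalanced samples with a crude class-weight counterbalance and inspecting feature importances might look as follows; the feature matrix, labels, and parameters are hypothetical placeholders, and the authors' specific ranked-voting rule is not reproduced.

```python
# Illustrative sketch only: a class-weighted random forest over imbalanced
# building samples plus mean-decrease-in-impurity feature importances.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 40))            # placeholder spectral/geometric/GIS features
y = rng.integers(0, 7, size=5000)          # seven semantic building categories (placeholder)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

rf = RandomForestClassifier(
    n_estimators=500,
    class_weight="balanced",               # crude counterweight to imbalanced samples
    random_state=0,
)
rf.fit(X_tr, y_tr)

# One simple proxy for each feature's contribution to recognition.
ranked = sorted(zip(rf.feature_importances_, range(X.shape[1])), reverse=True)
print("top features:", ranked[:5])
print("held-out accuracy:", rf.score(X_te, y_te))
```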
Whispering - The hidden side of auditory communication.
Frühholz, Sascha; Trost, Wiebke; Grandjean, Didier
2016-11-15
Whispering is a unique expression mode that is specific to auditory communication. Individuals switch their vocalization mode to whispering especially when affected by inner emotions in certain social contexts, such as in intimate relationships or intimidating social interactions. Although this context-dependent whispering is adaptive, whispered voices are acoustically far less rich than phonated voices and thus impose higher hearing and neural auditory decoding demands for recognizing their socio-affective value by listeners. The neural dynamics underlying this recognition especially from whispered voices are largely unknown. Here we show that whispered voices in humans are considerably impoverished as quantified by an entropy measure of spectral acoustic information, and this missing information needs large-scale neural compensation in terms of auditory and cognitive processing. Notably, recognizing the socio-affective information from voices was slightly more difficult from whispered voices, probably based on missing tonal information. While phonated voices elicited extended activity in auditory regions for decoding of relevant tonal and time information and the valence of voices, whispered voices elicited activity in a complex auditory-frontal brain network. Our data suggest that a large-scale multidirectional brain network compensates for the impoverished sound quality of socially meaningful environmental signals to support their accurate recognition and valence attribution. Copyright © 2016 Elsevier Inc. All rights reserved.
Large Scale Skill in Regional Climate Modeling and the Lateral Boundary Condition Scheme
NASA Astrophysics Data System (ADS)
Veljović, K.; Rajković, B.; Mesinger, F.
2009-04-01
Several points are made concerning the somewhat controversial issue of regional climate modeling: should a regional climate model (RCM) be expected to maintain the large scale skill of the driver global model that is supplying its lateral boundary condition (LBC)? Given that this is normally desired, is it able to do so without help via the fairly popular large scale nudging? Specifically, without such nudging, will the RCM kinetic energy necessarily decrease with time compared to that of the driver model or analysis data as suggested by a study using the Regional Atmospheric Modeling System (RAMS)? Finally, can the lateral boundary condition scheme make a difference: is the almost universally used but somewhat costly relaxation scheme necessary for a desirable RCM performance? Experiments are made to explore these questions running the Eta model in two versions differing in the lateral boundary scheme used. One of these schemes is the traditional relaxation scheme, and the other the Eta model scheme in which information is used at the outermost boundary only, and not all variables are prescribed at the outflow boundary. Forecast lateral boundary conditions are used, and results are verified against the analyses. Thus, skill of the two RCM forecasts can be and is compared not only against each other but also against that of the driver global forecast. A novel verification method is used in the manner of customary precipitation verification in that forecast spatial wind speed distribution is verified against analyses by calculating bias adjusted equitable threat scores and bias scores for wind speeds greater than chosen wind speed thresholds. In this way, focusing on a high wind speed value in the upper troposphere, we suggest that verification of large scale features can be done in a manner that may be more physically meaningful than verifications via spectral decomposition, which are a standard RCM verification method. The results we have at this point are somewhat limited in view of the integrations having been done only for 10-day forecasts. Even so, one should note that they are among very few done using forecast as opposed to reanalysis or analysis global driving data. Our results suggest that (1) when running the Eta as an RCM, no significant loss of large-scale kinetic energy with time seems to take place; (2) no disadvantage from using the Eta LBC scheme compared to the relaxation scheme is seen, while enjoying the advantage of the scheme being significantly less demanding than the relaxation scheme, given that it needs driver model fields at the outermost domain boundary only; and (3) the Eta RCM skill in forecasting large scales, with no large scale nudging, seems to be just about the same as that of the driver model, or, in the terminology of Castro et al., the Eta RCM does not lose "value of the large scale" which exists in the larger global analyses used for the initial condition and for verification.
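The verification statistics mentioned above can be computed from a simple 2x2 contingency table of threshold exceedances. The sketch below, assuming hypothetical gridded forecast and analysis wind-speed fields, computes the standard equitable threat score and frequency bias; the bias adjustment of the ETS actually used in the study (e.g. following Mesinger) is not reproduced here.

```python
import numpy as np

def ets_and_bias(forecast, analysis, threshold):
    """Standard equitable threat score and frequency bias for exceedance of a threshold.

    forecast, analysis: wind-speed arrays on the same grid (hypothetical inputs).
    """
    f = forecast >= threshold
    a = analysis >= threshold
    hits = np.sum(f & a)
    false_alarms = np.sum(f & ~a)
    misses = np.sum(~f & a)
    total = f.size
    hits_random = (hits + misses) * (hits + false_alarms) / total   # chance hits
    ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
    bias = (hits + false_alarms) / (hits + misses)
    return ets, bias

# Synthetic upper-tropospheric wind fields, placeholder data only.
rng = np.random.default_rng(1)
fc = rng.gamma(shape=9.0, scale=5.0, size=(181, 360))
an = fc + rng.normal(scale=4.0, size=fc.shape)
print(ets_and_bias(fc, an, threshold=45.0))
```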
Du, Jian; Song, Wenxia; Zhang, Xiu; Zhao, Jian; Liu, Guodong; Qu, Yinbo
2018-04-23
High dosage of enzyme is required to achieve effective lignocellulose hydrolysis, especially at high-solid loadings, which is a significant barrier to large-scale bioconversion of lignocellulose. Here, we screened four chemical additives and three accessory proteins for their effects on the enzymatic hydrolysis of various lignocellulosic materials. The effects were found to be highly dependent on the composition and solid loadings of substrates. For xylan-extracted lignin-rich corncob residue, the enhancing effect of PEG 6000 was most pronounced and negligibly affected by solid content, which reduced more than half of enzyme demand at 20% dry matter (DM). Lytic polysaccharide monooxygenase enhanced the hydrolysis of ammonium sulfite wheat straw pulp, and its addition reduced about half of protein demand at the solid loading of 20% DM. Supplementation of the additives in the hydrolysis of pure cellulose and complex lignocellulosic materials revealed that their effects are tightly linked to pretreatment strategies.
Dynamics of assembly production flow
NASA Astrophysics Data System (ADS)
Ezaki, Takahiro; Yanagisawa, Daichi; Nishinari, Katsuhiro
2015-06-01
Despite recent developments in management theory, maintaining a manufacturing schedule remains difficult because of production delays and fluctuations in demand and supply of materials. The dynamic response of manufacturing systems to such disruptions has rarely been studied. To capture these responses, we investigate a process that models the assembly of parts into end products. The complete assembly process is represented by a directed tree, where the smallest parts are injected at leaves and the end products are removed at the root. A discrete assembly process, represented by a node on the network, integrates parts, which are then sent to the next downstream node as a single part. The model exhibits some intriguing phenomena, including overstock cascade, phase transition in terms of demand and supply fluctuations, nonmonotonic distribution of stockout in the network, and the formation of a stockout path and stockout chains. Surprisingly, these rich phenomena result solely from the nature of distributed assembly processes. From a physical perspective, these phenomena provide insight into delay dynamics and inventory distributions in large-scale manufacturing systems.
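The structure described (parts injected at leaves, assembled at internal nodes, end products removed at the root) can be illustrated with a very small discrete-time simulation. The tree, buffer capacity, and supply/demand probabilities below are hypothetical toy values, not the paper's model configuration or parameters.

```python
import random

# Toy assembly tree: each internal node needs one part from every child to
# assemble one unit; leaves receive external supply, the root serves demand.
children = {"root": ["A", "B"], "A": ["a1", "a2"], "B": [], "a1": [], "a2": []}
assembly_order = ["A", "root"]               # internal nodes, processed bottom-up
buffers = {node: 0 for node in children}
CAPACITY = 5                                 # finite buffers cause upstream blocking

def step(p_supply=0.8, p_demand=0.7):
    """One discrete time step; returns 1 if an end product was shipped."""
    for node, kids in children.items():      # external injection at leaves
        if not kids and random.random() < p_supply and buffers[node] < CAPACITY:
            buffers[node] += 1
    for node in assembly_order:              # assemble only where all inputs exist
        kids = children[node]
        if all(buffers[k] > 0 for k in kids) and buffers[node] < CAPACITY:
            for k in kids:
                buffers[k] -= 1
            buffers[node] += 1
    if random.random() < p_demand and buffers["root"] > 0:   # fluctuating demand
        buffers["root"] -= 1
        return 1
    return 0

random.seed(0)
throughput = sum(step() for _ in range(1000)) / 1000
print("mean throughput:", throughput, "final buffers:", buffers)
```

Varying p_supply and p_demand in such a toy model is one way to see how stockouts propagate along branches of the tree, in the spirit of the phenomena the abstract reports.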
Recovery of magnetite from low grade banded magnetite quartzite (BMQ) ore
NASA Astrophysics Data System (ADS)
Tripathy, Alok; Bagchi, Subhankar; Rao, Danda Srinivas; Nayak, Bijaya Ketana; Rout, Prashanta Kumar; Biswal, Surendra Kumar
2018-04-01
There has been a steady increase in iron ore demand in the last few decades. This growing demand could be met by the use of low grade iron ore after beneficiation. Banded iron formations (BIF) are one of the resources of such low grade iron ores. Banded magnetite quartzite (BMQ) is one such BIF and a source of iron phase mineral in the form of magnetite. In the present study a low grade BMQ ore containing around 25.47% Fe was beneficiated for recovery of magnetite. XRD study shows that quartz, magnetite, hematite, and goethite are the major mineral phases present in the low grade BMQ sample. Unit operations such as crushing, scrubbing, grinding, and magnetic separations were used for recovering magnetite. Based on the large scale beneficiation studies, a process flowsheet was developed for enrichment of magnetite. It was found that with the help of the developed process flowsheet it is possible to enrich the Fe value up to 65.14% in the concentrate with a yield of 24.59%.
Phakthongsuk, Pitchaya
2009-04-01
To test the construct validity of the Thai version of the job content questionnaire (TJCQ). The present descriptive study recruited 10415 participants from all occupations according to the International Standard Classification of Occupations. The instrument consisted of the 48-item job content questionnaire. Eight items newly developed by the authors from in-depth interviews were added. Exploratory factor analysis showed six factor models of work hazards, decision latitude, psychological demand, social support, physical demand, and job security. However, supervisor and co-worker support were not distinguished as two separate factors, and some items were distributed differently across the extracted factors. Confirmatory factor analysis supported the construct of six latent factors, although the overall fit was moderately acceptable. Cronbach's alpha coefficients higher than 0.7 supported the internal consistency of the TJCQ scales, except for job security (0.55). These findings suggest that the TJCQ is valid and reliable for assessing job stress among Thai populations.
Current issues relating to psychosocial job strain and cardiovascular disease research.
Theorell, T; Karasek, R A
1996-01-01
The authors comment on recent reviews of cardiovascular job strain research by P. L. Schnall and P. A. Landsbergis (1994), and by T. S. Kristensen (1995), which conclude that job strain as defined by the demand-control model (the combination of contributions of low job decision latitudes and high psychological job demands) is confirmed as a risk factor for cardiovascular mortality in a large majority of studies. Lack of social support at work appears to further increase risk. Several still-unresolved research questions are examined in light of recent studies: (a) methodological issues related to use of occupational aggregate estimations and occupational career aggregate assessments, use of standard scales for job analysis and recall bias issues in self-reporting; (b) confounding factors and differential strengths of association by subgroups in job strain-cardiovascular disease analyses with respect to social class, gender, and working hours; and (c) review of results of monitoring job strain-blood pressure associations and associated methodological issues.
Honjo, Keita; Shiraki, Hiroto; Ashina, Shuichi
2018-01-01
After the severe nuclear disaster in Fukushima, which was triggered by the Great East Japan earthquake in March 2011, nuclear power plants in Japan were temporarily shut down for mandatory inspections. To prevent large-scale blackouts, the Japanese government requested companies and households to reduce electricity consumption in summer and winter. It is reported that the domestic electricity demand had a structural decrease because of the electricity conservation effect (ECE). However, quantitative analysis of the ECE is not sufficient, and especially time variation of the ECE remains unclear. Understanding the ECE is important because Japan's NDC (nationally determined contribution) assumes the reduction of CO2 emissions through aggressive energy conservation. In this study, we develop a time series model of monthly electricity demand in Japan and estimate time variation of the ECE. Moreover, we evaluate the impact of electricity conservation on CO2 emissions from power plants. The dynamic linear model is used to separate the ECE from the effects of other irrelevant factors (e.g. air temperature, economic production, and electricity price). Our result clearly shows that consumers' electricity conservation behavior after the earthquake was not temporary but became established as a habit. Between March 2011 and March 2016, the ECE on industrial electricity demand ranged from 3.9% to 5.4%, and the ECE on residential electricity demand ranged from 1.6% to 7.6%. The ECE on the total electricity demand was estimated at 3.2%-6.0%. We found a seasonal pattern that the residential ECE in summer is higher than that in winter. The emissions increase from the shutdown of nuclear power plants was mitigated by electricity conservation. The emissions reduction effect was estimated at 0.82 MtCO2-2.26 MtCO2 (-4.5% on average compared to the zero-ECE case). The time-varying ECE is necessary for predicting Japan's electricity demand and CO2 emissions after the earthquake.
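The abstract's dynamic linear model separates a slowly varying conservation effect from covariates such as temperature and economic activity. A minimal structural time-series sketch of that idea is shown below, assuming statsmodels and entirely synthetic monthly data; the covariates, data, and state-space specification are illustrative placeholders, not the authors' model.

```python
# Sketch: monthly demand = time-varying level (absorbs the conservation effect)
#         + regression on temperature and activity covariates + noise.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.structural import UnobservedComponents

rng = np.random.default_rng(0)
n = 120                                            # ten years of synthetic monthly data
idx = pd.period_range("2006-01", periods=n, freq="M")
temp = 10 + 10 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(0, 1, n)
iip = 100 + rng.normal(0, 2, n).cumsum() * 0.1     # stand-in for industrial production
shift = np.where(np.arange(n) > 62, -3.0, 0.0)     # structural drop partway through
demand = 80 + shift + 0.8 * np.abs(temp - 18) + 0.05 * iip + rng.normal(0, 1, n)

exog = pd.DataFrame({"cooling_heating": np.abs(temp - 18), "iip": iip}, index=idx)
model = UnobservedComponents(
    pd.Series(demand, index=idx),
    level="local level",                           # random-walk level component
    exog=exog,
)
res = model.fit(disp=False)
print(res.summary())
# In this toy setup, a persistent downward shift of the smoothed level after the
# break month would be read as the conservation effect.
```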
The economics and environmental impacts of large-scale wind power in a carbon constrained world
NASA Astrophysics Data System (ADS)
Decarolis, Joseph Frank
Serious climate change mitigation aimed at stabilizing atmospheric concentrations of CO2 will require a radical shift to a decarbonized energy supply. The electric power sector will be a primary target for deep reductions in CO2 emissions because electric power plants are among the largest and most manageable point sources of emissions. With respect to new capacity, wind power is currently one of the most inexpensive ways to produce electricity without CO2 emissions and it may have a significant role to play in a carbon constrained world. Yet most research in the wind industry remains focused on near term issues, while energy system models that focus on century-long time horizons undervalue wind by imposing exogenous limits on growth. This thesis fills a critical gap in the literature by taking a closer look at the cost and environmental impacts of large-scale wind. Estimates of the average cost of wind generation---now roughly 4¢/kWh---do not address the costs arising from the spatial distribution and intermittency of wind. This thesis develops a theoretical framework for assessing the intermittency cost of wind. In addition, an economic characterization of a wind system is provided in which long-distance electricity transmission, storage, and gas turbines are used to supplement variable wind power output to meet a time-varying load. With somewhat optimistic assumptions about the cost of wind turbines, the use of wind to serve 50% of demand adds ~1--2¢/kWh to the cost of electricity, a cost comparable to that of other large-scale low carbon technologies. This thesis also explores the environmental impacts posed by large-scale wind. Though avian mortality and noise caused controversy in the early years of wind development, improved technology and exhaustive siting assessments have minimized their impact. The aesthetic valuation of wind farms can be improved significantly with better design, siting, construction, and maintenance procedures, but opposition may increase as wind is developed on a large scale. Finally, this thesis summarizes collaborative work utilizing general circulation models to determine whether wind turbines have an impact on climate. The results suggest that the climatic impact is non-negligible at continental scales, but further research is warranted.
NASA Astrophysics Data System (ADS)
Moore, T. S.; Sanderman, J.; Baldock, J.; Plante, A. F.
2016-12-01
National-scale inventories typically include soil organic carbon (SOC) content, but not chemical composition or biogeochemical stability. Australia's Soil Carbon Research Programme (SCaRP) represents a national inventory of SOC content and composition in agricultural systems. The program used physical fractionation followed by 13C nuclear magnetic resonance (NMR) spectroscopy. While these techniques are highly effective, they are typically too expensive and time consuming for use in large-scale SOC monitoring. We seek to understand whether thermal analysis is a viable alternative. Coupled differential scanning calorimetry (DSC) and evolved gas analysis (CO2- and H2O-EGA) yield valuable data on SOC composition and stability via ramped combustion. The technique requires little training to use, and does not require fractionation or other sample pre-treatment. We analyzed 300 agricultural samples collected by SCaRP, divided into four fractions: whole soil, coarse particulates (POM), untreated mineral-associated (HUM), and hydrofluoric acid (HF)-treated HUM. All samples were analyzed by DSC-EGA, but only the POM and HF-HUM fractions were analyzed by NMR. Multivariate statistical analyses were used to explore natural clustering in SOC composition and stability based on DSC-EGA data. A partial least-squares regression (PLSR) model was used to explore correlations among the NMR and DSC-EGA data. Correlations demonstrated regions of combustion attributable to specific functional groups, which may relate to SOC stability. We are increasingly challenged with developing an efficient technique to assess SOC composition and stability at large spatial and temporal scales. Correlations between NMR and DSC-EGA may demonstrate the viability of using thermal analysis in lieu of more demanding methods in future large-scale surveys, and may provide data that go beyond chemical composition to better approach quantification of biogeochemical stability.
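The partial least-squares step that links thermal predictors to NMR-derived composition can be sketched as below, assuming scikit-learn; the feature matrices, shapes, and response names are hypothetical placeholders, and the preprocessing actually used with the SCaRP data is not reproduced.

```python
# Sketch of a PLSR linking thermal-analysis predictors to NMR response variables.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X_thermal = rng.normal(size=(300, 60))   # e.g. binned DSC heat flow plus CO2/H2O-EGA traces
Y_nmr = X_thermal[:, :4] @ rng.normal(size=(4, 3)) + rng.normal(scale=0.1, size=(300, 3))
# Y columns might stand for, say, alkyl, O-alkyl and aryl C fractions from 13C NMR.

pls = PLSRegression(n_components=5)
r2_folds = cross_val_score(pls, X_thermal, Y_nmr, cv=5)   # estimator's R^2 per fold
print("cross-validated R^2 per fold:", np.round(r2_folds, 2))
```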
Hotz, Christine; Loechl, Cornelia; de Brauw, Alan; Eozenou, Patrick; Gilligan, Daniel; Moursi, Mourad; Munhaua, Bernardino; van Jaarsveld, Paul; Carriquiry, Alicia; Meenakshi, J V
2012-07-14
β-Carotene-rich orange sweet potato (OSP) has been shown to improve vitamin A status of infants and young children in controlled efficacy trials and in a small-scale effectiveness study with intensive exposure to project inputs. However, the potential of this important food crop to reduce the risk of vitamin A deficiency in deficient populations will depend on the ability to distribute OSP vines and promote its household production and consumption on a large scale. In rural Mozambique, we conducted a randomised, controlled effectiveness study of a large-scale intervention to promote household-level OSP production and consumption using integrated agricultural, demand creation/behaviour change and marketing components. The following two intervention models were compared: a low-intensity (1 year) and a high-intensity (nearly 3 years) training model. The primary nutrition outcomes were OSP and vitamin A intakes by children 6-35 months and 3-5·5 years of age, and women. The intervention resulted in significant net increases in OSP intakes (model 1: 46, 48 and 97 g/d) and vitamin A intakes (model 1: 263, 254 and 492 μg retinol activity equivalents/d) among the younger children, older children and women, respectively. OSP accounted for 47-60 % of all sweet potato consumed and, among reference children, provided 80 % of total vitamin A intakes. A similar magnitude of impact was observed for both models, suggesting that group-level trainings in nutrition and agriculture could be limited to the first project year without compromising impact. Introduction of OSP to rural, sweet potato-producing communities in Mozambique is an effective way to improve vitamin A intakes.
Karim, Md Rezaul; Michel, Audrey; Zappa, Achille; Baranov, Pavel; Sahay, Ratnesh; Rebholz-Schuhmann, Dietrich
2017-04-16
Data workflow systems (DWFSs) enable bioinformatics researchers to combine components for data access and data analytics, and to share the final data analytics approach with their collaborators. Increasingly, such systems have to cope with large-scale data, such as full genomes (about 200 GB each), public fact repositories (about 100 TB of data) and 3D imaging data at even larger scales. As moving the data becomes cumbersome, the DWFS needs to embed its processes into a cloud infrastructure, where the data are already hosted. As the standardized public data play an increasingly important role, the DWFS needs to comply with Semantic Web technologies. This advancement to DWFS would reduce overhead costs and accelerate the progress in bioinformatics research based on large-scale data and public resources, as researchers would require less specialized IT knowledge for the implementation. Furthermore, the high data growth rates in bioinformatics research drive the demand for parallel and distributed computing, which then imposes a need for scalability and high-throughput capabilities onto the DWFS. As a result, requirements for data sharing and access to public knowledge bases suggest that compliance of the DWFS with Semantic Web standards is necessary. In this article, we will analyze the existing DWFS with regard to their capabilities toward public open data use as well as large-scale computational and human interface requirements. We untangle the parameters for selecting a preferable solution for bioinformatics research with particular consideration to using cloud services and Semantic Web technologies. Our analysis leads to research guidelines and recommendations toward the development of future DWFS for the bioinformatics research community. © The Author 2017. Published by Oxford University Press.
Social desirability scales as indicators of self-enhancement and impression management.
Parmač Kovačić, Maja; Galić, Zvonimir; Jerneić, Željko
2014-01-01
This article presents 2 studies testing Paulhus's (2002) assumption that unconscious self-enhancement and conscious impression management represent separate processes of socially desirable responding (SDR) that can be observed within 2 content domains (egoistic and moralistic bias). In Study 1, we devised egoistic and moralistic SDR scales intended to measure self-enhancement in honest responding and impression management under demands for positive self-presentation. In Study 2, we correlated scores on these scales with external indicators of self-enhancement and impression management. In honest responding, both SDR scales most strongly correlated with self-enhancement indicators, whereas under demands for positive self-presentation they correlated more strongly with external measures of impression management.
Depressed mood and self-esteem in young Asian, black, and white women in America.
Woods, N F; Lentz, M; Mitchell, E; Oakley, L D
1994-01-01
During the last two decades, investigators have explored the relationship between women's life conditions and their mental health. Some have related women's socially disadvantaged status, or their socialization to a traditional feminine role, to depression and low self-esteem. Others have emphasized the consequences of women's roles, or the balance of social demands and resources, on their well-being. More recently, feminist scholars have proposed a developmental account of depression. We tested a model comparing the effects of personal resources, social demands and resources, socialization, and women's roles, on self-esteem and depressed mood in young adult Asian, Black, and White women in America. Women who resided in middle-income and racially mixed neighborhoods were interviewed in their homes. Personal resources were indicated by education and income and social resources by unconflicted network size as measured by Barrera's (1981) Arizona Social Support Interview Schedule. Social demands were assessed by conflicted network size as measured by the Barrera scale and by the Positive Life Events and Negative Life Events scales from Norbeck's (1984) revision of the Sarason Life Events Scale. Women's roles included employment, parenting, and partnership with an adult (e.g., marriage). Self-esteem was assessed with the Rosenberg Self Esteem Scale (Rosenberg, 1965) and depressed mood with the Center for Epidemiologic Studies Depression scale (Radloff, 1977). Although models for Asian, Black, and White women differed, social network and social demands as well as personal resources were common to each group as predictors of self-esteem and depression.
NASA Astrophysics Data System (ADS)
Wolfsberg, A.; Hagood, M.; Pasqualini, D.; Wood, T.; Wilson, C.; Witkowski, M.; Levitt, D.; Pawar, R.; Keating, G.; Ziock, H.
2008-12-01
The United States is increasingly dependent on imported oil and gas; commodities for which other nations are competing and for which future supply may be inadequate to support our transportation fuel needs. Therefore, a renewed interest in 'harder-to-get' unconventional fuels has emerged in both industry and government with directed focus on world class hydrocarbon resources within a corridor extending from Canada southward through the Rocky Mountain States. Within this Western Energy Corridor, co-located with significant conventional hydrocarbon and renewable energy resources, lie some of the world's richest unconventional hydrocarbon resources in oil shales, oil sands and coal for coal-to-liquid conversion. However, development of these resources poses substantial environmental concerns as well as increasing competition for limited resources of water and habitat. With large-scale energy development in the predominantly rural region, local communities, infrastructures, and economies will face increasing demands for roads, electricity, law enforcement, labor, and other support services. The Western Energy Corridor Initiative (WECI) seeks to develop an integrated assessment of the impacts of unconventional fuel development, the interrelationships of planned energy developments in different basins, and the resultant demands placed on the region. This initial WECI study focuses on two of the most important current issues for industry, regulators, and stakeholders -- the assessment of carbon and water resources issues, impacts, and management strategies. Through scenario analyses using coupled systems and process level models, this study investigates the viability of integrated development of multiple energy resources in a carbon neutral and environmentally acceptable manner, and the interrelationships of various energy resource development plans. The modeling framework is designed to extend to include infrastructure, employment, training, fiscal and economic demands placed on the region as a result of various development and climate change scenarios. The multi-scale modeling approach involves a systems dynamics (SD) modeling framework linked with more detailed models such as one for basin-scale hydrology investigating the spatial relationships of water rights and requirements, reservoir locations, and climate change impacts (the details of the SD model and the hydrologic model are presented in other contributions by Pasqualini et al. and Wilson et al.). A link to a CO2 sequestration performance assessment model is also being built to enable analysis of alternative carbon management options. With these evolving capabilities, our analyses consider interdependent demands and impacts placed on the region for various development scenarios.
From natural to artificial photosynthesis
Barber, James; Tran, Phong D.
2013-01-01
Demand for energy is projected to increase at least twofold by mid-century relative to the present global consumption because of predicted population and economic growth. This demand could be met, in principle, from fossil energy resources, particularly coal. However, the cumulative nature of carbon dioxide (CO2) emissions demands that stabilizing the atmospheric CO2 levels to just twice their pre-anthropogenic values by mid-century will be extremely challenging, requiring invention, development and deployment of schemes for carbon-neutral energy production on a scale commensurate with, or larger than, the entire present-day energy supply from all sources combined. Among renewable and exploitable energy resources, nuclear fusion energy or solar energy are by far the largest. However, in both cases, technological breakthroughs are required with nuclear fusion being very difficult, if not impossible on the scale required. On the other hand, 1 h of sunlight falling on our planet is equivalent to all the energy consumed by humans in an entire year. If solar energy is to be a major primary energy source, then it must be stored and despatched on demand to the end user. An especially attractive approach is to store solar energy in the form of chemical bonds as occurs in natural photosynthesis. However, a technology is needed which has a year-round average conversion efficiency significantly higher than currently available by natural photosynthesis so as to reduce land-area requirements and to be independent of food production. Therefore, the scientific challenge is to construct an ‘artificial leaf’ able to efficiently capture and convert solar energy and then store it in the form of chemical bonds of a high-energy density fuel such as hydrogen while at the same time producing oxygen from water. Realistically, the efficiency target for such a technology must be 10 per cent or better. Here, we review the molecular details of the energy capturing reactions of natural photosynthesis, particularly the water-splitting reaction of photosystem II and the hydrogen-generating reaction of hydrogenases. We then follow on to describe how these two reactions are being mimicked in physico-chemical-based catalytic or electrocatalytic systems with the challenge of creating a large-scale robust and efficient artificial leaf technology. PMID:23365193
ERIC Educational Resources Information Center
Landmann, Mareike
2013-01-01
Universities in Germany show an increasing need for specific information on professional demands encountered and addressed by graduates training to become teachers. To provide information on demands and abilities in teaching graduates, a specialised teacher module was developed in the framework of the German Cooperation Project for Graduate Tracer…
Cramer, Duncan
2003-01-01
This study is an examination of the extent to which satisfaction with a main current romantic relationship is associated with negative conflict, demand for approval, self-esteem, and the 3 facilitative conditions of unconditional regard, empathy, and congruence. One or more of these conditions have been proposed as important determinants of relationship satisfaction by various relationship-enhancement approaches such as behavioral marital therapy and cognitive-behavioral marital therapy. College students (86 women and 58 men) completed S. S. Hendrick's (1988) Relationship Satisfaction Scale, a measure of negative conflict formulated by the author, R. G. Jones's (1969) Demand for Approval Scale (modified for a particular relationship), M. Rosenberg's (1965) Self-Esteem Scale, and a shortened modified version of G. T. Barrett-Lennard's (1964) Relationship Inventory. Relationship satisfaction was most strongly related to the level of regard and empathy, which is consistent with approaches to relationships that emphasize empathy training.
NASA Astrophysics Data System (ADS)
Bhattarai, N.; Jain, M.
2016-12-01
Expected changes in temperature and precipitation patterns in the rice-wheat belt of Northern India have implications for balancing crop water demand and available water resources. Because the impacts of water scarcity and reduced crop production are realized at a local scale, water-saving interventions are most effective when implemented locally. However, a paucity of fine-scale studies on the relationship between variations in climate and crop water demand has limited our ability to effectively implement such interventions. In an effort to better understand the responses of irrigated crops to changing climate in Northern India at finer-scales, we propose a remote sensing based semi-empirical approach. First, we employ a multi-model surface energy balance (SEB) approach to map seasonal evapotranspiration (ET)/water use (1995-2015) at 30 to 100 m resolution from space and investigate how seasonal and inter-annual variations in temperature and precipitation are associated with regional surface-energy budgets. Second, using remote estimates of ET and other biophysical variables, such as vegetation indices, land surface temperature, and albedo, we will explain the possible relationships between climate change and seasonal water demands of crops. Our estimates of high/moderate resolution (30 to 100 m) seasonal ET maps can make clear distinctions between impacts of climate variations on crop water demand at field, plot, and regional scales in Northern India. Finally, by improving our ability to identify targeted area for water-saving interventions, this study supports agricultural resiliency of Northern India in the face of climate change.
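Surface energy balance approaches of the kind mentioned typically recover latent heat flux, and hence ET, as the residual of the other energy-balance terms. A minimal sketch of that bookkeeping follows; the per-pixel flux values, constants, and the simple daily scaling are illustrative assumptions, and a real SEB model would estimate sensible heat flux from surface temperature and aerodynamic resistance rather than take it as given.

```python
import numpy as np

LAMBDA_V = 2.45e6          # latent heat of vaporization, J kg-1 (approximate)
RHO_WATER = 1000.0         # kg m-3

def daily_et_mm(rn, g, h, seconds=86400):
    """ET (mm/day) from the energy-balance residual LE = Rn - G - H.

    rn, g, h: net radiation, soil heat flux, sensible heat flux in W m-2
    (daily-average placeholder fluxes for illustration only).
    """
    le = rn - g - h                                 # latent heat flux, W m-2
    et_m = le * seconds / (LAMBDA_V * RHO_WATER)    # depth of water, m per day
    return np.maximum(et_m, 0.0) * 1000.0           # mm per day, negatives clipped

# Hypothetical pixels: an irrigated field vs. fallow land (placeholder values).
rn = np.array([420.0, 380.0]); g = np.array([40.0, 60.0]); h = np.array([90.0, 250.0])
print(daily_et_mm(rn, g, h))   # higher ET over the irrigated pixel
```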
Necromass as a source of energy to microorganisms in marine sediments.
NASA Astrophysics Data System (ADS)
Bradley, J.; Amend, J.; LaRowe, D.
2017-12-01
Marine sediments constitute one of the largest, most energy-limited biospheres on Earth. Despite increasing exploration and interest in characterizing microbial communities in marine sediments, the production and role of microbial dead matter (necromass) have largely been overlooked. Necromass is produced on a global scale, yet its significance as a power source to heterotrophic microorganisms remains unknown. We developed a physical, bioenergetic and geochemical model to quantify the total power supply from necromass oxidation and the total power demand of living microorganisms in marine sediments. This model is first applied to sediments from the oligotrophic South Pacific Gyre (SPG), where organic carbon and biomass concentrations are extremely low, yet microorganisms persist for millions of years in some of the lowest energy states on Earth. We show that necromass does not supply sufficient power to support the total demands of the living community (<39%) at SPG. Application of our model on a global scale, however, shows that necromass produced and subsequently oxidized can provide sufficient power to satisfy the maintenance demands of microorganisms in marine sediments for up to 60,000 years following burial. Our model assumes that all counted cells are viable. Yet, if only a fraction of counted cells are alive, the role of necromass as an electron donor in fueling microbial metabolisms is even greater. This new insight requires a reassessment of carbon fluxes in the deep biosphere. By extension, we also demonstrate a mechanism for microbial communities to persist by oxidizing necromass over geological timescales, and thereby endure unfavorable, low-energy settings that might be analogous to conditions on early Earth and on other planetary bodies.
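At its core, the comparison reported here is a power budget: the rate at which necromass oxidation releases usable Gibbs energy versus the summed maintenance power demand of the counted cells. The sketch below shows only that bookkeeping; every number is an order-of-magnitude placeholder chosen for illustration and is not a parameter or result from the study.

```python
# Toy power budget per cm^3 of sediment. All values are placeholder assumptions.
SECONDS_PER_YEAR = 3.15e7

cells_per_cm3 = 1e4                        # assumed cell abundance in energy-poor sediment
maintenance_W_per_cell = 1e-19             # assumed per-cell maintenance power, W

necromass_oxidized_gC_per_cm3_yr = 3e-13   # assumed necromass oxidation rate, g C per year
energy_J_per_gC = 4e4                      # assumed usable Gibbs energy per g C oxidized

demand_W = cells_per_cm3 * maintenance_W_per_cell
supply_W = necromass_oxidized_gC_per_cm3_yr * energy_J_per_gC / SECONDS_PER_YEAR

print(f"maintenance demand: {demand_W:.2e} W/cm^3")
print(f"necromass supply:   {supply_W:.2e} W/cm^3")
print(f"supply covers {supply_W / demand_W:.0%} of demand")
```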
Reconstructing European forest management from 1600 to 2010
NASA Astrophysics Data System (ADS)
McGrath, M. J.; Luyssaert, S.; Meyfroidt, P.; Kaplan, J. O.; Buergi, M.; Chen, Y.; Erb, K.; Gimmi, U.; McInerney, D.; Naudts, K.; Otto, J.; Pasztor, F.; Ryder, J.; Schelhaas, M.-J.; Valade, A.
2015-04-01
European forest use for fuel, timber and food dates back to pre-Roman times. Century-scale ecological processes and their legacy effects require accounting for forest management when studying today's forest carbon sink. Forest management reconstructions that are used to drive land surface models are one way to quantify the impact of both historical and today's large scale application of forest management on today's forest-related carbon sink and surface climate. In this study we reconstruct European forest management from 1600 to 2010 making use of diverse approaches, data sources and assumptions. Between 1600 and 1828, a demand-supply approach was used in which wood supply was reconstructed based on estimates of historical annual wood increment and land cover reconstructions. For the same period demand estimates accounted for the fuelwood needed in households, wood used in food processing, charcoal used in metal smelting and salt production, timber for construction and population estimates. Comparing estimated demand and supply resulted in a spatially explicit reconstruction of the share of forests under coppice, high stand management and forest left unmanaged. For the reconstruction between 1829 and 2010 a supply-driven back-casting method was used. The method used age reconstructions from the years 1950 to 2010 as its starting point. Our reconstruction reproduces the most important changes in forest management between 1600 and 2010: (1) an increase of 593 000 km2 in conifers at the expense of deciduous forest (decreasing by 538 000 km2), (2) a 612 000 km2 decrease in unmanaged forest, (3) a 152 000 km2 decrease in coppice management, (4) a 818 000 km2 increase in high stand management, and (5) the rise and fall of litter raking which at its peak in 1853 removed 50 Tg dry litter per year.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hale, Elaine
Demand response may be a valuable flexible resource for low-carbon electric power grids. However, there are as many types of possible demand response as there are ways to use electricity, making demand response difficult to study at scale in realistic settings. This talk reviews our state of knowledge regarding the potential value of demand response in several example systems as a function of increasing levels of wind and solar power, sometimes drawing on the analogy between demand response and storage. Overall, we find demand response to be promising, but its potential value is very system dependent. Furthermore, demand response, likemore » storage, can easily saturate ancillary service markets.« less
Evolving network simulation study. From regular lattice to scale free network
NASA Astrophysics Data System (ADS)
Makowiec, D.
2005-12-01
The Watts-Strogatz algorithm for transforming the square lattice into a small-world network is modified by introducing preferential rewiring constrained by a connectivity demand. The evolution of the network proceeds in two steps: sequential preferential rewiring of edges controlled by p, and updating of the information about the changes made. The evolving system self-organizes into stationary states, and a topological transition in the graph structure is observed with respect to p. A leafy phase, in which the graph consists of multiply connected vertices (a graph skeleton) with many leaves attached to each skeleton vertex, emerges when p is small enough to mimic asynchronous evolution. A tangling phase, in which the edges of the graph circulate frequently among low-degree vertices, occurs when p is large. There exist conditions at which the resulting stationary network ensemble provides networks whose degree distribution exhibits power-law decay over a large interval of degrees.
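The abstract gives the procedure only in outline. The sketch below, assuming networkx, shows one plausible reading: edges of a periodic square lattice are rewired one end at a time toward targets drawn preferentially by degree, moves that would disconnect the graph are rejected (the connectivity demand), and the role of p as the probability of refreshing the degree information after each move is an interpretation, not the authors' exact algorithm.

```python
# Rough sketch of degree-preferential rewiring on a square lattice with a
# connectivity constraint; parameters and the role of p are assumptions.
import random
import networkx as nx

def evolve(n=10, steps=2000, p=0.1, seed=0):
    random.seed(seed)
    G = nx.convert_node_labels_to_integers(nx.grid_2d_graph(n, n, periodic=True))
    degree_snapshot = dict(G.degree())          # information used for preferential choice
    for _ in range(steps):
        u, v = random.choice(list(G.edges()))   # edge whose v-end gets rewired
        nodes, weights = zip(*degree_snapshot.items())
        w = random.choices(nodes, weights=weights, k=1)[0]   # degree-preferential target
        if w in (u, v) or G.has_edge(u, w):
            continue
        G.remove_edge(u, v)
        G.add_edge(u, w)
        if not nx.is_connected(G):              # connectivity demand: reject the move
            G.remove_edge(u, w)
            G.add_edge(u, v)
        elif random.random() < p:               # occasionally refresh degree information
            degree_snapshot = dict(G.degree())
    return G

G = evolve()
print(nx.degree_histogram(G))                   # inspect the emerging degree distribution
```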
Opportunities for condensed matter research at the NHMFL
NASA Astrophysics Data System (ADS)
Crow, Jack E.
2004-03-01
Magnetic fields have long been recognized as critical for science and technology. During the last 20 years, research in high magnetic fields has advanced the world's understanding of a host of materials science issues and led to new states of matter, e.g., the quantum and fractional quantum Hall Effects for which the scientists were awarded Nobel prizes. The demands of science have driven a continuing appetite for higher and more specialized magnetic fields and new capabilities have developed both at the NHMFL and in many other laboratories across the world. In this presentation, a short overview of large-scale worldwide facilities with an emphasis on those available at the NHMFL will be presented along with some scientific and technological drivers that have been the underpinnings for the large investments needed to build and support these facilities.
NASA Astrophysics Data System (ADS)
Marconi, S.; Conti, E.; Christiansen, J.; Placidi, P.
2018-05-01
The operating conditions of the High Luminosity upgrade of the Large Hadron Collider are very demanding for the design of next generation hybrid pixel readout chips in terms of particle rate, radiation level and data bandwidth. To this purpose, the RD53 Collaboration has developed for the ATLAS and CMS experiments a dedicated simulation and verification environment using industry-consolidated tools and methodologies, such as SystemVerilog and the Universal Verification Methodology (UVM). This paper presents how the so-called VEPIX53 environment has first guided the design of digital architectures, optimized for processing and buffering very high particle rates, and secondly how it has been reused for the functional verification of the first large scale demonstrator chip designed by the collaboration, which has recently been submitted.
NASA,FAA,ONERA Swept-Wing Icing and Aerodynamics: Summary of Research and Current Status
NASA Technical Reports Server (NTRS)
Broeren, Andy
2015-01-01
NASA, FAA, ONERA, and other partner organizations have embarked on a significant, collaborative research effort to address the technical challenges associated with icing on large scale, three-dimensional swept wings. These are extremely complex phenomena important to the design, certification and safe operation of small and large transport aircraft. There is increasing demand to balance trade-offs in aircraft efficiency, cost and noise that tend to compete directly with allowable performance degradations over an increasing range of icing conditions. Computational fluid dynamics codes have reached a level of maturity that they are being proposed by manufacturers for use in certification of aircraft for flight in icing. However, sufficient high-quality data to evaluate their performance on iced swept wings are not currently available in the public domain and significant knowledge gaps remain.
Castillo-Cagigal, Manuel; Matallanas, Eduardo; Gutiérrez, Alvaro; Monasterio-Huelin, Félix; Caamaño-Martín, Estefaná; Masa-Bote, Daniel; Jiménez-Leube, Javier
2011-01-01
In this paper we present a heterogeneous collaborative sensor network for electrical management in the residential sector. Improving demand-side management is very important in distributed energy generation applications. Sensing and control are the foundations of the "Smart Grid" which is the future of large-scale energy management. The system presented in this paper has been developed on a self-sufficient solar house called "MagicBox" equipped with grid connection, PV generation, lead-acid batteries, controllable appliances and smart metering. Therefore, there is a large number of energy variables to be monitored that allow us to precisely manage the energy performance of the house by means of collaborative sensors. The experimental results, performed on a real house, demonstrate the feasibility of the proposed collaborative system to reduce the consumption of electrical power and to increase energy efficiency.
NASA Astrophysics Data System (ADS)
Rossi, Mauro; Torri, Dino; Santi, Elisa; Bacaro, Giovanni; Marchesini, Ivan
2014-05-01
Landslide phenomena and erosion processes are widespread and every year cause extensive damage to the environment and a substantial reduction of ecosystem services. These processes are in competition with each other, and their complex interaction controls the evolution of landscapes. Landslide phenomena and erosion processes can be strongly influenced by land use, vegetation, soil characteristics and anthropic actions. Such phenomena are mainly modelled separately, using empirical and physically based approaches. The former rely upon the identification of simple empirical laws relating the occurrence of instability processes to some of their potential causes. The latter are based on physical descriptions of the processes and, depending on the degree of complexity, they can integrate different variables characterizing the process and its triggers. Those models often couple a hydrological model with an erosion or a landslide model. The spatial modeling schemes are heterogeneous, but mostly raster (i.e. matrices of data) or conceptual (i.e. cascading planes and channels) descriptions of the terrain are used. The two model types are generally designed and applied at different scales. Empirical models, being less demanding in terms of input data, cannot explicitly consider the real triggering mechanisms of the processes and are commonly exploited to assess the potential occurrence of instability phenomena over large areas (small scale assessment). Physically based models are highly demanding in terms of input data, which are difficult to obtain over large areas without large uncertainty, and their applicability is often limited to small catchments or single slopes (large scale assessment). Moreover, those models, even if physically based, are simplified descriptions of the instability processes and can neglect significant aspects of the real triggering mechanisms; for instance, the influence of vegetation has been considered only partially. Although a variety of modelling approaches have been proposed in the literature to model landslide and erosion processes separately, only a few attempts have been made to model both jointly, mostly by integrating pre-existing models. To overcome this limitation we developed a new model called LANDPLANER (LANDscape, Plants, LANdslide and ERosion), specifically designed to describe the dynamic response of slopes (or basins) under different changing scenarios including: (i) changes of meteorological factors, (ii) changes of vegetation or land use, and (iii) changes of slope morphology. The model was applied in different study areas in order to check its basic assumptions and to test its general operability and applicability. Results show reasonable model behavior and confirm its easy applicability in real cases.
[Effect of occupational stress on mental health].
Yu, Shan-fa; Zhang, Rui; Ma, Liang-qing; Gu, Gui-zhen; Yang, Yan; Li, Kui-rong
2003-02-01
To study the effect of job psychological demands and job control on mental health and their interaction. 93 male freight train dispatchers were evaluated by using the revised Job Demand-Control Scale and 7 strain scales. Stepwise regression analysis, univariate ANOVA, Kruskal-Wallis H and median test methods were used in the statistical analysis. Kruskal-Wallis H and median test analysis revealed differences in mental health scores among decision latitude groups (mean rank 55.57, 47.95, 48.42, 33.50, P < 0.05); the differences in scores of mental health (37.45, 40.01, 58.35), job satisfaction (53.18, 46.91, 32.43), daily life strains (33.00, 44.96, 56.12) and depression (36.45, 42.25, 53.61) among job time demand groups (P < 0.05) were all statistically significant. ANOVA showed that job time demands and decision latitude had interaction effects on physical complaints (R(2) = 0.24), state-anxiety (R(2) = 0.26), and daytime fatigue (R(2) = 0.28) (P < 0.05). Regression analysis revealed a significant job time demands and job decision latitude interaction effect as well as significant main effects of some independent variables on different job strains (R(2) > 0.05). Job time demands and job decision latitude have direct and interactive effects on psychosomatic health: the greater the time demands, the greater the psychological strain, and the effect of job time demands is greater than that of job decision latitude.
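The demand-control interaction tested above corresponds to including a product term in the regression of each strain outcome on job time demands and decision latitude. A minimal sketch with an entirely hypothetical data frame follows; variable names and coefficients are placeholders, not the study's data.

```python
# Sketch of testing a demands x decision-latitude interaction on a strain outcome.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 93
df = pd.DataFrame({
    "time_demands": rng.normal(0, 1, n),
    "decision_latitude": rng.normal(0, 1, n),
})
df["fatigue"] = (0.5 * df.time_demands - 0.3 * df.decision_latitude
                 - 0.4 * df.time_demands * df.decision_latitude
                 + rng.normal(0, 1, n))

# 'a * b' in the formula expands to both main effects plus the interaction term.
fit = smf.ols("fatigue ~ time_demands * decision_latitude", data=df).fit()
print(fit.summary().tables[1])
print("model R^2:", round(fit.rsquared, 3))
```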
NASA Astrophysics Data System (ADS)
Newman, J. P.; Dandy, G. C.; Maier, H. R.
2014-10-01
In many regions, conventional water supplies are unable to meet projected consumer demand. Consequently, interest has arisen in integrated urban water systems, which involve the reclamation or harvesting of alternative, localized water sources. However, this makes the planning and design of water infrastructure more difficult, as multiple objectives need to be considered, water sources need to be selected from a number of alternatives, and end uses of these sources need to be specified. In addition, the scale at which each treatment, collection, and distribution network should operate needs to be investigated. In order to deal with this complexity, a framework for planning and designing water infrastructure taking into account integrated urban water management principles is presented in this paper and applied to a rural greenfield development. Various options for water supply, and the scale at which they operate were investigated in order to determine the life-cycle trade-offs between water savings, cost, and GHG emissions as calculated from models calibrated using Australian data. The decision space includes the choice of water sources, storage tanks, treatment facilities, and pipes for water conveyance. For each water system analyzed, infrastructure components were sized using multiobjective genetic algorithms. The results indicate that local water sources are competitive in terms of cost and GHG emissions, and can reduce demand on the potable system by as much as 54%. Economies of scale in treatment dominated the diseconomies of scale in collection and distribution of water. Therefore, water systems that connect large clusters of households tend to be more cost efficient and have lower GHG emissions. In addition, water systems that recycle wastewater tended to perform better than systems that captured roof-runoff. Through these results, the framework was shown to be effective at identifying near optimal trade-offs between competing objectives, thereby enabling informed decisions to be made when planning water systems for greenfield developments.
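At the core of the multiobjective sizing step described above is the idea of retaining only non-dominated designs across competing objectives such as cost, GHG emissions, and potable-water demand. A bare-bones Pareto filter illustrating that selection is sketched below with hypothetical candidate designs and objective values; a full genetic-algorithm search as used in the study would add crossover, mutation, and crowding-based selection on top of this dominance test.

```python
# Bare-bones Pareto (non-dominated) filter over candidate water-system designs.
# Objective values are hypothetical placeholders, all to be minimized.
from typing import Dict, List, Tuple

Objectives = Tuple[float, float, float]   # (cost $M, GHG ktCO2e, potable demand ML/yr)

def dominates(a: Objectives, b: Objectives) -> bool:
    """True if design a is at least as good on every objective and better on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs: Dict[str, Objectives]) -> List[str]:
    return [name for name, obj in designs.items()
            if not any(dominates(other, obj)
                       for other_name, other in designs.items() if other_name != name)]

candidates = {
    "potable only":             (18.0, 9.5, 210.0),
    "cluster wastewater reuse": (24.0, 8.1, 110.0),
    "household rain tanks":     (27.0, 9.0, 160.0),
    "large recycled scheme":    (23.0, 8.3, 100.0),
}
print(pareto_front(candidates))   # the non-dominated trade-off set
```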
Energy-Water-Land-Climate Nexus: Modeling Impacts from the Asset to Regional Scale
NASA Astrophysics Data System (ADS)
Tidwell, V. C.; Bennett, K. E.; Middleton, R. S.; Behery, S.; Macknick, J.; Corning-Padilla, A.; Brinkman, G.; Meng, M.
2016-12-01
A critical challenge for the energy-water-land nexus is understanding and modeling the connection between the natural system—including changes in climate, land use/cover, and streamflow—and the engineered system including water for energy, agriculture, and society. Equally important is understanding the linkage across scales; that is, how impacts at the asset level aggregate to influence behavior at the local to regional scale. Toward this need, a case study was conducted featuring multi-sector and multi-scale modeling centered on the San Juan River basin (a watershed that accounts for one-tenth of the Colorado River drainage area). Simulations were driven by statistically downscaled climate data from three global climate models (emission scenario RCP 8.5) and planned growth in regional water demand. The Variable Infiltration Capacity (VIC) hydrologic model was fitted with a custom vegetation mortality sub-model and used to estimate tributary inflows to the San Juan River and estimate reservoir evaporation. San Juan River operations, including releases from Navajo Reservoir, were subsequently modeled using RiverWare to estimate impacts on water deliveries out to the year 2100. Major water demands included two large coal-fired power plants, a local electric utility, river-side irrigation, the Navajo Indian Irrigation Project and instream flows managed for endangered aquatic species. Also tracked were basin exports, including water (downstream flows to the Colorado River and interbasin transfers to the Rio Grande) and interstate electric power transmission. Implications for the larger western electric grid were assessed using PLEXOS, a sub-hourly dispatch, electric production-cost model. Results highlight asset-level interactions at the energy-water-land nexus driven by climate and population dynamics; specifically, growing vulnerabilities to shorted water deliveries. Analyses also explored linkages across geographic scales from the San Juan to the larger Colorado River and Rio Grande basins as well as the western power grid.