Sample records for future large-scale studies

  1. Architectural Optimization of Digital Libraries

    NASA Technical Reports Server (NTRS)

    Biser, Aileen O.

    1998-01-01

    This work investigates performance and scaling issues relevant to large scale distributed digital libraries. Presently, performance and scaling studies focus on specific implementations of production or prototype digital libraries. Although useful information is gained that gives designers and other researchers insights into performance and scaling issues, the broader issues relevant to very large scale distributed libraries are not addressed. Specifically, no current studies look at the extreme or worst case possibilities in digital library implementations. A survey of digital library research issues is presented. Scaling and performance issues are mentioned frequently in the digital library literature but are generally not the focus of much of the current research. In this thesis a model for a Generic Distributed Digital Library (GDDL) and nine cases of typical user activities are defined. This model is used to facilitate some basic analysis of scaling issues. Specifically, it supports the calculation of Internet traffic generated for different configurations of the study parameters and an estimate of the future bandwidth needed for a large scale distributed digital library implementation. This analysis demonstrates the potential impact a future distributed digital library implementation would have on the Internet traffic load and raises questions concerning the architecture decisions being made for future distributed digital library designs.
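
    Bandwidth estimates of the kind the thesis describes reduce to simple arithmetic over assumed usage parameters. The Python sketch below shows the mechanics with placeholder values that are not the GDDL study's actual parameters.

      # Hypothetical back-of-envelope traffic estimate for a distributed digital
      # library: all parameter values are illustrative placeholders, not the
      # actual GDDL study parameters.
      users = 5_000_000            # user population served
      searches_per_user_day = 4    # average searches per user per day
      results_per_search_kb = 50   # metadata returned per search (kB)
      docs_per_user_day = 1        # full documents retrieved per user per day
      doc_size_mb = 2.0            # average document size (MB)

      kb = 1024
      mb = 1024 * kb

      daily_bytes = users * (
          searches_per_user_day * results_per_search_kb * kb
          + docs_per_user_day * doc_size_mb * mb
      )

      # Convert the daily volume to an average sustained bandwidth requirement.
      avg_bandwidth_gbps = daily_bytes * 8 / (24 * 3600) / 1e9
      print(f"daily volume: {daily_bytes / 1e12:.2f} TB")
      print(f"average sustained bandwidth: {avg_bandwidth_gbps:.2f} Gbit/s")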

  2. Data Integration: Charting a Path Forward to 2035

    DTIC Science & Technology

    2011-02-14

    New York, NY: Gotham Books, 2004. Seligman, Len. Mitre Corporation, e-mail interview, 6 Dec 2010. Singer, P.W. Wired for War: The Robotics...articles.aspx (accessed 4 Dec 2010). Ultra-Large-Scale Systems: The Software Challenge of the Future. Study lead Linda Northrup. Pittsburgh, PA: Carnegie...Virtualization?" 1. 41 Ultra-Large-Scale Systems: The Software Challenge of the Future. Study lead Linda Northrup. Pittsburgh, PA: Carnegie Mellon Software

  3. The Need for Large-Scale, Longitudinal Empirical Studies in Middle Level Education Research

    ERIC Educational Resources Information Center

    Mertens, Steven B.; Caskey, Micki M.; Flowers, Nancy

    2016-01-01

    This essay describes and discusses the ongoing need for large-scale, longitudinal, empirical research studies focused on middle grades education. After a statement of the problem and concerns, the essay describes and critiques several prior middle grades efforts and research studies. Recommendations for future research efforts to inform policy…

  4. Recent and future liquid metal experiments on homogeneous dynamo action and magnetic instabilities

    NASA Astrophysics Data System (ADS)

    Stefani, Frank; Gerbeth, Gunter; Giesecke, Andre; Gundrum, Thomas; Kirillov, Oleg; Seilmayer, Martin; Gellert, Marcus; Rüdiger, Günther; Gailitis, Agris

    2011-10-01

    The present status of the Riga dynamo experiment is summarized and the prospects for its future exploitation are evaluated. We further discuss the plans for a large-scale precession-driven dynamo experiment to be set up in the framework of the new installation DRESDYN (DREsden Sodium facility for dynamo and thermohydraulic studies) at Helmholtz-Zentrum Dresden-Rossendorf. We report recent investigations of the magnetorotational instability and the Tayler instability and sketch the plans for another large-scale liquid sodium facility devoted to the combined study of both effects.

  5. Copy of Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adalsteinsson, Helgi; Armstrong, Robert C.; Chiang, Ken

    2008-10-01

    We report on the work done in the late-start LDRD "Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet." We describe the creation of a research platform that emulates many thousands of machines to be used for the study of large-scale Internet behavior. We describe a proof-of-concept simple attack we performed in this environment. We describe the successful capture of a Storm bot and, from the study of the bot and further literature search, establish large-scale aspects we seek to understand via emulation of Storm on our research platform in possible follow-on work. Finally, we discuss possible future work.

  6. Large-scale water projects in the developing world: Revisiting the past and looking to the future

    NASA Astrophysics Data System (ADS)

    Sivakumar, Bellie; Chen, Ji

    2014-05-01

    During the past half a century or so, the developing world has been witnessing a significant increase in freshwater demands due to a combination of factors, including population growth, increased food demand, improved living standards, and water quality degradation. Since there exists significant variability in rainfall and river flow in both space and time, large-scale storage and distribution of water has become a key means to meet these increasing demands. In this regard, large dams and water transfer schemes (including river-linking schemes and virtual water trades) have been playing a key role. While the benefits of such large-scale projects in supplying water for domestic, irrigation, industrial, hydropower, recreational, and other uses both in the countries of their development and in other countries are undeniable, concerns about their negative impacts, such as high initial costs and damage to ecosystems (e.g. river environment and species) and socio-economic fabric (e.g. relocation and socio-economic changes of affected people), have also been increasing in recent years. These have led to serious debates on the role of large-scale water projects in the developing world and on their future, but the often one-sided nature of such debates has inevitably failed to yield fruitful outcomes thus far. The present study aims to offer a far more balanced perspective on this issue. First, it recognizes and emphasizes the need for additional large-scale water structures in the developing world in the future, due to the continuing increase in water demands, inefficiency in water use (especially in the agricultural sector), and the absence of equivalent and reliable alternatives. Next, it reviews a few important success and failure stories of large-scale water projects in the developing world (and in the developed world), in an effort to arrive at a balanced view on the future role of such projects. Then, it discusses some major challenges in future water planning and management, with proper consideration to potential technological developments and new options. Finally, it highlights the urgent need for a broader framework that integrates the physical science-related aspects ("hard sciences") and the human science-related aspects ("soft sciences").

  7. Large-scale educational telecommunications systems for the US: An analysis of educational needs and technological opportunities

    NASA Technical Reports Server (NTRS)

    Morgan, R. P.; Singh, J. P.; Rothenberg, D.; Robinson, B. E.

    1975-01-01

    The needs to be served, the subsectors in which the system might be used, the technology employed, and the prospects for future utilization of an educational telecommunications delivery system are described and analyzed. Educational subsectors are analyzed with emphasis on the current status and trends within each subsector. Issues which affect future development, and prospects for future use of media, technology, and large-scale electronic delivery within each subsector are included. Information on technology utilization is presented. Educational telecommunications services are identified and grouped into categories: public television and radio, instructional television, computer aided instruction, computer resource sharing, and information resource sharing. Technology based services, their current utilization, and factors which affect future development are stressed. The role of communications satellites in providing these services is discussed. Efforts to analyze and estimate future utilization of large-scale educational telecommunications are summarized. Factors which affect future utilization are identified. Conclusions are presented.

  8. Among-provenance variability of gas exchange and growth in response to long-term elevated CO2 exposure

    Treesearch

    James L.J. Houpis; Paul D. Anderson; James C. Pushnik; David J. Anschel

    1999-01-01

    Genetic variability can have profound effects on the interpretation of results from elevated CO2 studies, and on future forest management decisions. Information on which varieties are best suited to future atmospheric conditions is needed to develop future forest management practices. A large-scale screening study of the effects of elevated CO2…

  9. Russian Policy on Methane Emissions in the Oil and Gas Sector: A Case Study in Opportunities and Challenges in Reducing Short-Lived Forcers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Meredydd; Roshchanka, Volha

    2014-08-04

    This paper uses Russian policy in the oil and gas sector as a case study in assessing options and challenges for scaling-up emission reductions. We examine the challenges to achieving large-scale emission reductions, successes that companies have achieved to date, how Russia has sought to influence methane emissions through its environmental fine system, and options for helping companies achieve large-scale emission reductions in the future through simpler and clearer incentives.

  10. Estimating ecosystem service changes as a precursor to modeling

    EPA Science Inventory

    EPA's Future Midwestern Landscapes Study will project changes in ecosystem services (ES) for alternative future policy scenarios in the Midwestern U.S. Doing so for detailed landscapes over large spatial scales will require serial application of economic and ecological models. W...

  11. COLAcode: COmoving Lagrangian Acceleration code

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin V.

    2016-02-01

    COLAcode is a serial particle-mesh-based N-body code illustrating the COLA (COmoving Lagrangian Acceleration) method; it solves for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). It differs from standard N-body codes by trading accuracy at small scales to gain computational speed without sacrificing accuracy at large scales. This is useful for generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing; such catalogs are needed to perform detailed error analysis for ongoing and future surveys of LSS.
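
    The core idea of the COLA frame can be written compactly. The following is a schematic of the standard decomposition described in the COLA literature; the notation is generic and not taken from COLAcode's source:

      % Particle positions are split into an analytic LPT part and a residual;
      % only the residual is integrated numerically, so relatively large
      % timesteps remain accurate on large scales.
      x(\tau) = x_{\mathrm{LPT}}(\tau) + \delta x(\tau), \qquad
      \frac{d^{2}\,\delta x}{d\tau^{2}} = -\nabla\Phi[x] - \frac{d^{2} x_{\mathrm{LPT}}}{d\tau^{2}}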

  12. Survey of decentralized control methods. [for large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Athans, M.

    1975-01-01

    An overview is presented of the types of problems that are being considered by control theorists in the area of dynamic large scale systems with emphasis on decentralized control strategies. Approaches that deal directly with decentralized decision making for large scale systems are discussed. It is shown that future advances in decentralized system theory are intimately connected with advances in the stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools associated with the latter are summarized, and recommendations concerning future research are presented.

  13. Characterising large-scale structure with the REFLEX II cluster survey

    NASA Astrophysics Data System (ADS)

    Chon, Gayoung

    2016-10-01

    We study the large-scale structure with superclusters from the REFLEX X-ray cluster survey together with cosmological N-body simulations. It is important to construct superclusters with criteria such that they are homogeneous in their properties. We lay out our theoretical concept considering future evolution of superclusters in their definition, and show that the X-ray luminosity and halo mass functions of clusters in superclusters are found to be top-heavy, different from those of clusters in the field. We also show a promising aspect of using superclusters to study the local cluster bias and mass scaling relation with simulations.

  14. Survey on large scale system control methods

    NASA Technical Reports Server (NTRS)

    Mercadal, Mathieu

    1987-01-01

    The problems inherent to large-scale systems such as power networks, communication networks, and economic or ecological systems were studied. The increase in size and flexibility of future spacecraft has put those dynamical systems into the category of large-scale systems, and tools specific to this class of large systems are being sought to design control systems that can guarantee more stability and better performance. Among several survey papers, reference was found to a thorough investigation of decentralized control methods. Especially helpful was the classification made of the different existing approaches to dealing with large-scale systems. A very similar classification is used here, even though the papers surveyed are somewhat different from the ones reviewed in other papers. Special attention is given to the applicability of the existing methods to controlling large mechanical systems like large space structures. Some recent developments are added to this survey.

  15. Large-Angular-Scale Clustering as a Clue to the Source of UHECRs

    NASA Astrophysics Data System (ADS)

    Berlind, Andreas A.; Farrar, Glennys R.

    We explore what can be learned about the sources of UHECRs from their large-angular-scale clustering (referred to as their "bias" by the cosmology community). Exploiting the clustering on large scales has the advantage over small-scale correlations of being insensitive to uncertainties in source direction from magnetic smearing or measurement error. In a Cold Dark Matter cosmology, the amplitude of large-scale clustering depends on the mass of the system, with more massive systems such as galaxy clusters clustering more strongly than less massive systems such as ordinary galaxies or AGN. Therefore, studying the large-scale clustering of UHECRs can help determine a mass scale for their sources, given the assumption that their redshift depth is as expected from the GZK cutoff. We investigate the constraining power of a given UHECR sample as a function of its cutoff energy and number of events. We show that current and future samples should be able to distinguish between the cases of their sources being galaxy clusters, ordinary galaxies, or sources that are uncorrelated with the large-scale structure of the universe.
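
    As a generic illustration of how large-angular-scale clustering can be quantified from an event catalog (not the authors' actual pipeline), the Python sketch below estimates the angular correlation function with a simple DD/RR estimator against an isotropic random catalog.

      import numpy as np

      def unit_vectors(ra_deg, dec_deg):
          """Convert RA/Dec in degrees to Cartesian unit vectors."""
          ra, dec = np.radians(ra_deg), np.radians(dec_deg)
          return np.column_stack([np.cos(dec) * np.cos(ra),
                                  np.cos(dec) * np.sin(ra),
                                  np.sin(dec)])

      def pair_angles(vec):
          """All pairwise separation angles (degrees) within one catalog."""
          cosang = np.clip(vec @ vec.T, -1.0, 1.0)
          iu = np.triu_indices(len(vec), k=1)
          return np.degrees(np.arccos(cosang[iu]))

      def w_theta(data_radec, rand_radec, bins):
          """Simple DD/RR - 1 estimate of the angular correlation function."""
          dd, _ = np.histogram(pair_angles(unit_vectors(*data_radec)), bins=bins)
          rr, _ = np.histogram(pair_angles(unit_vectors(*rand_radec)), bins=bins)
          nd, nr = len(data_radec[0]), len(rand_radec[0])
          norm = (nr * (nr - 1)) / (nd * (nd - 1))  # account for catalog sizes
          return dd / rr * norm - 1.0

      # Toy usage: 500 "events" vs. 2000 isotropic randoms, 5-90 degree bins.
      rng = np.random.default_rng(0)
      def isotropic(n):
          ra = rng.uniform(0, 360, n)
          dec = np.degrees(np.arcsin(rng.uniform(-1, 1, n)))
          return ra, dec
      bins = np.linspace(5, 90, 18)
      print(w_theta(isotropic(500), isotropic(2000), bins))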

  16. Coal resources, reserves and peak coal production in the United States

    USGS Publications Warehouse

    Milici, Robert C.; Flores, Romeo M.; Stricker, Gary D.

    2013-01-01

    In spite of the United States' large endowment of coal resources, recent studies have indicated that its coal production is destined to reach a maximum and begin an irreversible decline sometime during the middle of the current century. However, studies and assessments illustrating the coal reserve data essential for making accurate forecasts of United States coal production have not been compiled on a national basis. As a result, there is a great deal of uncertainty in the accuracy of the production forecasts. A very large percentage of the coal mined in the United States comes from a few large-scale mines (mega-mines) in the Powder River Basin of Wyoming and Montana. Reported reserves at these mines do not account for future potential reserves or for future development of technology that may convert coal currently classified as resources into reserves. In order to maintain United States coal production at or near current levels for an extended period of time, existing mines will eventually have to increase their recoverable reserves and/or new large-scale mines will have to be opened elsewhere. Accordingly, in order to facilitate energy planning for the United States, this paper suggests that probabilistic assessments of the remaining coal reserves in the country would improve long-range forecasts of coal production. As in the United States coal assessment projects currently being conducted, a major priority of probabilistic assessments would be to identify the numbers and sizes of remaining large blocks of coal capable of supporting large-scale mining operations for extended periods of time and to conduct economic evaluations of those resources.
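
    The probabilistic assessment the paper advocates can be illustrated with a small Monte Carlo sketch; every number and distribution below is hypothetical, chosen only to show the mechanics, not actual U.S. coal data.

      import numpy as np

      # Illustrative probabilistic reserve assessment: sample uncertain in-place
      # tonnage, recovery, and economic fractions per block, then report the
      # distribution of the national total.
      rng = np.random.default_rng(42)
      n_trials = 100_000
      n_blocks = 12                                  # remaining large coal blocks (hypothetical)

      in_place = rng.lognormal(mean=np.log(2.0), sigma=0.5,
                               size=(n_trials, n_blocks))        # billion short tons per block
      recovery = rng.uniform(0.4, 0.8, size=(n_trials, n_blocks)) # mining recovery factor
      economic = rng.uniform(0.3, 0.9, size=(n_trials, n_blocks)) # fraction economic to mine

      reserves = (in_place * recovery * economic).sum(axis=1)

      p90, p50, p10 = np.percentile(reserves, [10, 50, 90])
      print(f"P90 {p90:.1f}, P50 {p50:.1f}, P10 {p10:.1f} billion short tons")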

  17. Control-Structure-Interaction (CSI) technologies and trends to future NASA missions

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Control-structure-interaction (CSI) issues which are relevant for future NASA missions are reviewed. This was achieved by: (1) reviewing large space structures (LSS) technologies to provide a background and survey of the current state of the art (SOA); (2) analytically studying a focus mission to identify opportunities where CSI technology may be applied to enhance or enable future NASA spacecraft; and (3) expanding a portion of the focus mission, the large antenna, to provide in-depth trade studies, scaling laws, and methodologies which may be applied to other NASA missions. Several sections are presented. Section 1 defines CSI issues and presents an overview of the relevant modeling and control issues for LSS. Section 2 presents the results of the three phases of the CSI study. Section 2.1 gives the results of a CSI study conducted with the Geostationary Platform (Geoplat) as the focus mission. Section 2.2 contains an overview of the CSI control design methodology available in the technical community. Included is a survey of the CSI ground-based experiments which were conducted to verify theoretical performance predictions. Section 2.3 presents and demonstrates a new CSI scaling law methodology for assessing potential CSI with large antenna systems.

  18. Smooth Sailing or Stormy Seas? Atlantic Canadian Physical Educators on the State and Future of Physical Education

    ERIC Educational Resources Information Center

    Robinson, Daniel B.; Randall, Lynn

    2016-01-01

    This article summarizes results from a recently completed study that focused upon the current state and possible future of physical education within Canada's four Atlantic provinces. Data from both large-scale surveys and eight follow-up focus group interviews are shared as they relate to the state and future of physical education, possible…

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rizzi, Silvio; Hereld, Mark; Insley, Joseph

    In this work we perform in-situ visualization of molecular dynamics simulations, which can help scientists to visualize simulation output on-the-fly, without incurring storage overheads. We present a case study to couple LAMMPS, the large-scale molecular dynamics simulation code, with vl3, our parallel framework for large-scale visualization and analysis. Our motivation is to identify effective approaches for covisualization and exploration of large-scale atomistic simulations at interactive frame rates. We propose a system of coupled libraries and describe its architecture, with an implementation that runs on GPU-based clusters. We present the results of strong and weak scalability experiments, as well as future research avenues based on our results.
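
    The co-visualization approach described above follows the general in-situ pattern: the simulation hands each timestep's in-memory particle state to the visualization library instead of writing it to disk. The Python sketch below shows that pattern generically; the class and function names are hypothetical and it does not use the actual LAMMPS or vl3 APIs.

      import numpy as np

      class InSituVisualizer:
          """Stand-in for a visualization backend such as vl3 (hypothetical API)."""
          def render(self, step, positions):
              # A real backend would rasterize or volume-render here; we just report.
              center = positions.mean(axis=0)
              print(f"step {step}: {len(positions)} atoms, center of mass {center}")

      def run_simulation(n_atoms=1000, n_steps=5, viz_every=1, viz=None):
          """Toy MD loop that calls the visualizer in situ, avoiding disk I/O."""
          rng = np.random.default_rng(1)
          pos = rng.random((n_atoms, 3))
          vel = 0.01 * rng.standard_normal((n_atoms, 3))
          for step in range(n_steps):
              pos += vel                   # placeholder for a real force/integration step
              if viz is not None and step % viz_every == 0:
                  viz.render(step, pos)    # hand the in-memory state to the viz library

      run_simulation(viz=InSituVisualizer())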

  20. Investigating the Impact of Surface Heterogeneity on the Convective Boundary Layer Over Urban Areas Through Coupled Large-Eddy Simulation and Remote Sensing

    NASA Technical Reports Server (NTRS)

    Dominguez, Anthony; Kleissl, Jan P.; Luvall, Jeffrey C.

    2011-01-01

    Large-eddy Simulation (LES) was used to study convective boundary layer (CBL) flow through suburban regions with both large and small scale heterogeneities in surface temperature. Constant remotely sensed surface temperatures were applied at the surface boundary at resolutions of 10 m, 90 m, 200 m, and 1 km. Increasing the surface resolution from 1 km to 200 m had the most significant impact on the mean and turbulent flow characteristics as the larger scale heterogeneities became resolved. While previous studies concluded that scales of heterogeneity much smaller than the CBL inversion height have little impact on the CBL characteristics, we found that further increasing the surface resolution (resolving smaller scale heterogeneities) results in an increase in mean surface heat flux, thermal blending height, and potential temperature profile. The results of this study will help to better inform sub-grid parameterization for meso-scale meteorological models. The simulation tool developed through this study (combining LES and high resolution remotely sensed surface conditions) is a significant step towards future studies on the micro-scale meteorology in urban areas.

  1. State of the Art Methodology for the Design and Analysis of Future Large Scale Evaluations: A Selective Examination.

    ERIC Educational Resources Information Center

    Burstein, Leigh

    Two specific methods of analysis in large-scale evaluations are considered: structural equation modeling and selection modeling/analysis of non-equivalent control group designs. Their utility in large-scale educational program evaluation is discussed. The examination of these methodological developments indicates how people (evaluators,…

  2. Geospatial intelligence and visual classification of environmentally observed species in the Future Internet

    NASA Astrophysics Data System (ADS)

    Arbab-Zavar, B.; Chakravarthy, A.; Sabeur, Z. A.

    2012-04-01

    The rapid development of advanced smart communication tools with good quality and resolution video cameras, audio and GPS devices in the last few years shall lead to profound impacts on the way future environmental observations are conducted and accessed by communities. The resulting large scale interconnections of these "Future Internet Things" form a large environmental sensing network which will generate large volumes of quality environmental observations at highly localised spatial scales. This enablement in environmental sensing at local scales will be of great importance for contributing to the study of fauna and flora in the near future, particularly on the effect of climate change on biodiversity in various regions of Europe and beyond. The Future Internet could also potentially become the de facto information space to provide participative real-time sensing by communities and improve our situational awareness of the effect of climate on local environments. In the ENVIROFI (2011-2013) Usage Area project in the FP7 FI-PPP programme, a set of requirements for specific (and generic) enablers is achieved with the potential establishment of participating community observatories of the future. In particular, the specific enablement of interest concerns the building of future interoperable services for the intelligent management of environmental data with tagged contextual geo-spatial information generated by multiple operators in communities (using smart phones). The classification of observed species in the resulting images is achieved with structured data pre-processing, semantic enrichment using contextual geospatial information, and high level fusion with controlled uncertainty estimations. The returned identification of species is further improved using future ground truth corrections and learning by the specific enablers.

  3. A Life-Cycle Model of Human Social Groups Produces a U-Shaped Distribution in Group Size.

    PubMed

    Salali, Gul Deniz; Whitehouse, Harvey; Hochberg, Michael E

    2015-01-01

    One of the central puzzles in the study of sociocultural evolution is how and why transitions from small-scale human groups to large-scale, hierarchically more complex ones occurred. Here we develop a spatially explicit agent-based model as a first step towards understanding the ecological dynamics of small and large-scale human groups. By analogy with the interactions between single-celled and multicellular organisms, we build a theory of group lifecycles as an emergent property of single cell demographic and expansion behaviours. We find that once the transition from small-scale to large-scale groups occurs, a few large-scale groups continue expanding while small-scale groups gradually become scarcer, and large-scale groups become larger in size and fewer in number over time. Demographic and expansion behaviours of groups are largely influenced by the distribution and availability of resources. Our results conform to a pattern of human political change in which religions and nation states come to be represented by a few large units and many smaller ones. Future enhancements of the model should include decision-making rules and probabilities of fragmentation for large-scale societies. We suggest that the synthesis of population ecology and social evolution will generate increasingly plausible models of human group dynamics.

  4. A Life-Cycle Model of Human Social Groups Produces a U-Shaped Distribution in Group Size

    PubMed Central

    Salali, Gul Deniz; Whitehouse, Harvey; Hochberg, Michael E.

    2015-01-01

    One of the central puzzles in the study of sociocultural evolution is how and why transitions from small-scale human groups to large-scale, hierarchically more complex ones occurred. Here we develop a spatially explicit agent-based model as a first step towards understanding the ecological dynamics of small and large-scale human groups. By analogy with the interactions between single-celled and multicellular organisms, we build a theory of group lifecycles as an emergent property of single cell demographic and expansion behaviours. We find that once the transition from small-scale to large-scale groups occurs, a few large-scale groups continue expanding while small-scale groups gradually become scarcer, and large-scale groups become larger in size and fewer in number over time. Demographic and expansion behaviours of groups are largely influenced by the distribution and availability of resources. Our results conform to a pattern of human political change in which religions and nation states come to be represented by a few large units and many smaller ones. Future enhancements of the model should include decision-making rules and probabilities of fragmentation for large-scale societies. We suggest that the synthesis of population ecology and social evolution will generate increasingly plausible models of human group dynamics. PMID:26381745

  5. Transcriptome characterization and SSR discovery in large-scale loach Paramisgurnus dabryanus (Cobitidae, Cypriniformes).

    PubMed

    Li, Caijuan; Ling, Qufei; Ge, Chen; Ye, Zhuqing; Han, Xiaofei

    2015-02-25

    The large-scale loach (Paramisgurnus dabryanus, Cypriniformes) is a bottom-dwelling freshwater species of fish found mainly in eastern Asia. The natural germplasm resources of this important aquaculture species have recently been threatened due to overfishing and artificial propagation. The objective of this study is to obtain the first functional genomic resource and candidate molecular markers for future conservation and breeding research. Illumina paired-end sequencing generated over one hundred million reads that resulted in 71,887 assembled transcripts, with an average length of 1,465 bp. 42,093 (58.56%) protein-coding sequences were predicted, and 43,837 transcripts had significant matches to the NCBI nonredundant protein (Nr) database. 29,389 and 14,419 transcripts were assigned into gene ontology (GO) categories and Eukaryotic Orthologous Groups (KOG), respectively. 22,102 (31.14%) transcripts were mapped to 302 KEGG pathways. In addition, 15,106 candidate SSR markers were identified, with 11,037 pairs of PCR primers designed. 400 randomly selected SSR primer pairs were validated, of which 364 (91%) produced PCR products. Further testing with 41 loci and 20 large-scale loach specimens collected from the four largest lakes in China showed that 36 (87.8%) loci were polymorphic. The transcriptomic profile and SSR repertoire obtained in this study will facilitate population genetic studies and selective breeding of large-scale loach in the future. Copyright © 2015. Published by Elsevier B.V.
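
    Microsatellite (SSR) discovery of the kind reported above is usually done with dedicated tools (e.g. MISA); the regex sketch below only illustrates the basic idea of scanning assembled transcripts for short tandem repeats, with illustrative minimum-repeat thresholds that may differ from those used in the study.

      import re

      # Minimum repeat counts per motif length, loosely following common SSR
      # search settings (illustrative; the study's actual thresholds may differ).
      MIN_REPEATS = {1: 10, 2: 6, 3: 5, 4: 5, 5: 4, 6: 4}

      def find_ssrs(seq):
          """Return (motif, repeat_count, start) for simple tandem repeats."""
          hits = []
          for motif_len, min_rep in MIN_REPEATS.items():
              pattern = re.compile(r"(([ACGT]{%d})\2{%d,})" % (motif_len, min_rep - 1))
              for m in pattern.finditer(seq.upper()):
                  motif = m.group(2)
                  if len(set(motif)) == 1 and motif_len > 1:
                      continue  # skip mononucleotide runs matched by longer motifs
                  hits.append((motif, len(m.group(1)) // motif_len, m.start()))
          return hits

      # Toy transcript containing an (AG)8 and an (ATC)6 repeat.
      transcript = "ATGC" + "AG" * 8 + "TTACG" + "ATC" * 6 + "GGTA"
      print(find_ssrs(transcript))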

  6. Review of Dynamic Modeling and Simulation of Large Scale Belt Conveyor System

    NASA Astrophysics Data System (ADS)

    He, Qing; Li, Hong

    Belt conveyors are among the most important devices for transporting bulk solid material over long distances. Dynamic analysis is the key to deciding whether a design is technically rational, safe and reliable in operation, and economically feasible. It is very important to study dynamic properties in order to improve efficiency and productivity and to guarantee safe, reliable and stable conveyor operation. The dynamic research on and applications of large scale belt conveyors are discussed. The main research topics and the state of the art of dynamic research on belt conveyors are analyzed. The main future work focuses on dynamic analysis, modeling and simulation of the main components and the whole system, and on nonlinear modeling, simulation and vibration analysis of large scale conveyor systems.

  7. Next Generation Analytic Tools for Large Scale Genetic Epidemiology Studies of Complex Diseases

    PubMed Central

    Mechanic, Leah E.; Chen, Huann-Sheng; Amos, Christopher I.; Chatterjee, Nilanjan; Cox, Nancy J.; Divi, Rao L.; Fan, Ruzong; Harris, Emily L.; Jacobs, Kevin; Kraft, Peter; Leal, Suzanne M.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Province, Michael A.; Ramos, Erin M.; Ritchie, Marylyn D.; Roeder, Kathryn; Schaid, Daniel J.; Stephens, Matthew; Thomas, Duncan C.; Weinberg, Clarice R.; Witte, John S.; Zhang, Shunpu; Zöllner, Sebastian; Feuer, Eric J.; Gillanders, Elizabeth M.

    2012-01-01

    Over the past several years, genome-wide association studies (GWAS) have succeeded in identifying hundreds of genetic markers associated with common diseases. However, most of these markers confer relatively small increments of risk and explain only a small proportion of familial clustering. To identify obstacles to future progress in genetic epidemiology research and provide recommendations to NIH for overcoming these barriers, the National Cancer Institute sponsored a workshop entitled “Next Generation Analytic Tools for Large-Scale Genetic Epidemiology Studies of Complex Diseases” on September 15–16, 2010. The goal of the workshop was to facilitate discussions on (1) statistical strategies and methods to efficiently identify genetic and environmental factors contributing to the risk of complex disease; and (2) how to develop, apply, and evaluate these strategies for the design, analysis, and interpretation of large-scale complex disease association studies in order to guide NIH in setting the future agenda in this area of research. The workshop was organized as a series of short presentations covering scientific (gene-gene and gene-environment interaction, complex phenotypes, and rare variants and next generation sequencing) and methodological (simulation modeling and computational resources and data management) topic areas. Specific needs to advance the field were identified during each session and are summarized. PMID:22147673

  8. Scale-Up: Improving Large Enrollment Physics Courses

    NASA Astrophysics Data System (ADS)

    Beichner, Robert

    1999-11-01

    The Student-Centered Activities for Large Enrollment University Physics (SCALE-UP) project is working to establish a learning environment that will promote increased conceptual understanding, improved problem-solving performance, and greater student satisfaction, while still maintaining class sizes of approximately 100. We are also addressing the new ABET engineering accreditation requirements for inquiry-based learning along with communication and team-oriented skills development. Results of studies of our latest classroom design, plans for future classroom space, and the current iteration of instructional materials will be discussed.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallarno, George; Rogers, James H; Maxwell, Don E

    The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large scale. The world's second-fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage since GPUs have only recently been deployed at large scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.

  10. Very Large Scale Integration (VLSI).

    ERIC Educational Resources Information Center

    Yeaman, Andrew R. J.

    Very Large Scale Integration (VLSI), the state-of-the-art production techniques for computer chips, promises such powerful, inexpensive computing that, in the future, people will be able to communicate with computer devices in natural language or even speech. However, before full-scale VLSI implementation can occur, certain salient factors must be…

  11. Projected Future Vegetation Changes for the Northwest United States and Southwest Canada at a Fine Spatial Resolution Using a Dynamic Global Vegetation Model.

    PubMed

    Shafer, Sarah L; Bartlein, Patrick J; Gray, Elizabeth M; Pelltier, Richard T

    2015-01-01

    Future climate change may significantly alter the distributions of many plant taxa. The effects of climate change may be particularly large in mountainous regions where climate can vary significantly with elevation. Understanding potential future vegetation changes in these regions requires methods that can resolve vegetation responses to climate change at fine spatial resolutions. We used LPJ, a dynamic global vegetation model, to assess potential future vegetation changes for a large topographically complex area of the northwest United States and southwest Canada (38.0-58.0°N latitude by 136.6-103.0°W longitude). LPJ is a process-based vegetation model that mechanistically simulates the effect of changing climate and atmospheric CO2 concentrations on vegetation. It was developed and has been mostly applied at spatial resolutions of 10-minutes or coarser. In this study, we used LPJ at a 30-second (~1-km) spatial resolution to simulate potential vegetation changes for 2070-2099. LPJ was run using downscaled future climate simulations from five coupled atmosphere-ocean general circulation models (CCSM3, CGCM3.1(T47), GISS-ER, MIROC3.2(medres), UKMO-HadCM3) produced using the A2 greenhouse gases emissions scenario. Under projected future climate and atmospheric CO2 concentrations, the simulated vegetation changes result in the contraction of alpine, shrub-steppe, and xeric shrub vegetation across the study area and the expansion of woodland and forest vegetation. Large areas of maritime cool forest and cold forest are simulated to persist under projected future conditions. The fine spatial-scale vegetation simulations resolve patterns of vegetation change that are not visible at coarser resolutions and these fine-scale patterns are particularly important for understanding potential future vegetation changes in topographically complex areas.

  12. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    NASA Astrophysics Data System (ADS)

    Caha, J.; Kačmařík, M.

    2017-11-01

    This article demonstrates the utilization of large scale surface models with fine spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large scale data for visibility analyses at the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on classic Boolean visibility, which is usually determined within GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. The case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and the global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, an extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is shown. The case study proved that large scale models are an appropriate data source for visibility analyses at the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
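
    A minimal Python sketch of the line-of-sight test underlying a Boolean viewshed, together with the "angle difference above the local horizon" quantity mentioned above; this is generic grid logic on a toy surface, not the workflow or data of the case study.

      import numpy as np

      def vertical_angle(dem, cell_size, obs, tgt, obs_height=1.6):
          """Elevation angle (radians) from the observer's eye to a target cell."""
          (r0, c0), (r1, c1) = obs, tgt
          dist = np.hypot(r1 - r0, c1 - c0) * cell_size
          rise = dem[r1, c1] - (dem[r0, c0] + obs_height)
          return np.arctan2(rise, dist)

      def angle_above_horizon(dem, cell_size, obs, tgt, samples=200):
          """Target angle minus the highest intervening horizon angle.
          Positive means visible; the magnitude is the angle difference above
          the local horizon used by extended viewsheds."""
          (r0, c0), (r1, c1) = obs, tgt
          ts = np.linspace(0.0, 1.0, samples, endpoint=False)[1:]
          horizon = -np.pi / 2
          for t in ts:  # walk the terrain profile between observer and target
              rr = int(round(r0 + t * (r1 - r0)))
              cc = int(round(c0 + t * (c1 - c0)))
              horizon = max(horizon, vertical_angle(dem, cell_size, obs, (rr, cc)))
          return vertical_angle(dem, cell_size, obs, tgt) - horizon

      # Toy surface: a 12 m ridge between the observer and a far cell.
      dem = np.zeros((50, 50)); dem[:, 25] = 12.0
      print(angle_above_horizon(dem, cell_size=1.0, obs=(25, 5), tgt=(25, 45)))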

  13. Omega-3 polyunsaturated fatty acids for cardiovascular diseases: present, past and future.

    PubMed

    Watanabe, Yasuhiro; Tatsuno, Ichiro

    2017-08-01

    Large-scale epidemiological studies on Greenlandic, Canadian and Alaskan Eskimos have examined the health benefits of omega-3 fatty acids consumed as part of the diet, and found statistically significant relative reduction in cardiovascular risk in people consuming omega-3 fatty acids. Areas covered: This article reviews studies on omega-3 fatty acids during the last 50 years, and identifies issues relevant to future studies on cardiovascular (CV) risk. Expert commentary: Although a meta-analysis of large-scale prospective cohort studies and randomized studies reported that fish and fish oil consumption reduced coronary heart disease-related mortality and sudden cardiac death, omega-3 fatty acids have not yet been shown to be effective in secondary prevention trials on patients with multiple cardiovascular disease (CVD) risk factors. The ongoing long-term CV interventional outcome studies investigate high-dose, prescription-strength omega-3 fatty acids. The results are expected to clarify the potential role of omega-3 fatty acids in reducing CV risk. The anti-inflammatory properties of omega-3 fatty acids are also important. Future clinical trials should also focus on the role of these anti-inflammatory mediators in human arteriosclerotic diseases as well as inflammatory diseases.

  14. Large scale in vivo recordings to study neuronal biophysics.

    PubMed

    Giocomo, Lisa M

    2015-06-01

    Over the last several years, technological advances have enabled researchers to more readily observe single-cell membrane biophysics in awake, behaving animals. Studies utilizing these technologies have provided important insights into the mechanisms generating functional neural codes in both sensory and non-sensory cortical circuits. Crucial for a deeper understanding of how membrane biophysics control circuit dynamics however, is a continued effort to move toward large scale studies of membrane biophysics, in terms of the numbers of neurons and ion channels examined. Future work faces a number of theoretical and technical challenges on this front but recent technological developments hold great promise for a larger scale understanding of how membrane biophysics contribute to circuit coding and computation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. A feasibility study of large-scale photobiological hydrogen production utilizing mariculture-raised cyanobacteria.

    PubMed

    Sakurai, Hidehiro; Masukawa, Hajime; Kitashima, Masaharu; Inoue, Kazuhito

    2010-01-01

    In order to decrease CO2 emissions from the burning of fossil fuels, the development of new renewable energy sources sufficiently large in quantity is essential. To meet this need, we propose large-scale H2 production on the sea surface utilizing cyanobacteria. Although many of the relevant technologies are in the early stage of development, this chapter briefly examines the feasibility of such H2 production, in order to illustrate that under certain conditions large-scale photobiological H2 production can be viable. Assuming that solar energy is converted to H2 at 1.2% efficiency, the future cost of H2 can be estimated to be about 11 (pipelines) and 26.4 (compression and marine transportation) cents per kWh, respectively.
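
    The assumed 1.2% solar-to-H2 conversion efficiency translates into areal yields roughly as in the sketch below; the insolation and hydrogen heating-value figures are typical reference values, not the chapter's exact inputs.

      # Rough areal yield implied by a 1.2% solar-to-H2 conversion efficiency.
      # Insolation and H2 energy content are typical reference values, not the
      # figures used in the original feasibility study.
      insolation_w_m2 = 200.0          # annual-mean solar irradiance at the surface
      efficiency = 0.012               # fraction of solar energy converted to H2 (1.2%)
      hours_per_year = 8760
      h2_hhv_kwh_per_kg = 39.4         # higher heating value of hydrogen

      h2_energy_kwh_m2_yr = insolation_w_m2 * efficiency * hours_per_year / 1000
      h2_mass_kg_m2_yr = h2_energy_kwh_m2_yr / h2_hhv_kwh_per_kg

      print(f"{h2_energy_kwh_m2_yr:.1f} kWh of H2 per m^2 per year")
      print(f"{h2_mass_kg_m2_yr:.2f} kg of H2 per m^2 per year")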

  16. Parameterizing a Large-scale Water Balance Model in Regions with Sparse Data: The Tigris-Euphrates River Basins as an Example

    NASA Astrophysics Data System (ADS)

    Flint, A. L.; Flint, L. E.

    2010-12-01

    The characterization of hydrologic response to current and future climates is of increasing importance to many countries around the world that rely heavily on changing and uncertain water supplies. Large-scale models that can calculate a spatially distributed water balance and elucidate groundwater recharge and surface water flows for large river basins provide a basis for estimates of changes due to future climate projections. Unfortunately many regions in the world have very sparse data for parameterization or calibration of hydrologic models. For this study, the Tigris and Euphrates River basins were used for the development of a regional water balance model at 180-m spatial scale, using the Basin Characterization Model, to estimate historical changes in groundwater recharge and surface water flows in the countries of Turkey, Syria, Iraq, Iran, and Saudi Arabia. Necessary input parameters include precipitation, air temperature, potential evapotranspiration (PET), soil properties and thickness, and estimates of bulk permeability from geologic units. Data necessary for calibration include snow cover, reservoir volumes (from satellite data and historic, pre-reservoir elevation data) and streamflow measurements. Global datasets for precipitation, air temperature, and PET were available at very large spatial scales (50 km) through world-scale databases, along with finer-scale WorldClim climate data, and required downscaling to fine scales for model input. Soils data were available through world-scale soil maps but required parameterization on the basis of textural data to estimate soil hydrologic properties. Soil depth was interpreted from geomorphologic interpretation and maps of quaternary deposits, and geologic materials were categorized from generalized geologic maps of each country. Estimates of bedrock permeability were made on the basis of literature and data from drillers' logs and adjusted during calibration of the model to streamflow measurements where available. Results of historical water balance calculations throughout the Tigris and Euphrates River basins will be shown along with details of processing input data to provide spatial continuity and downscaling. Basic water availability analysis for recharge and runoff is readily obtained from a deterministic solar radiation energy balance model, a global potential evapotranspiration model, and global estimates of precipitation and air temperature. Future climate estimates can be readily applied to the same water and energy balance models to evaluate future water availability for countries around the globe.
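
    The water-balance logic such a model builds on can be sketched as a simple monthly bucket model; the Python illustration below uses hypothetical parameters and is a simplification, not the Basin Characterization Model itself.

      def monthly_water_balance(precip, pet, soil_capacity_mm=150.0, k_bedrock=0.3):
          """Toy monthly bucket model returning (AET, recharge, runoff) per month.
          Rain refills soil storage, actual ET is limited by available water, and
          water exceeding soil capacity splits between recharge (scaled by a bulk
          bedrock-permeability factor) and surface runoff."""
          storage, out = 0.0, []
          for p, e in zip(precip, pet):
              water = storage + p
              aet = min(e, water)               # actual ET limited by available water
              water -= aet
              excess = max(0.0, water - soil_capacity_mm)
              storage = water - excess          # soil holds what it can
              recharge = k_bedrock * excess     # fraction entering bedrock/groundwater
              runoff = excess - recharge        # remainder leaves as surface flow
              out.append((aet, recharge, runoff))
          return out

      # Hypothetical year of monthly precipitation and PET (mm).
      precip = [90, 80, 60, 40, 20, 5, 2, 2, 10, 30, 60, 85]
      pet    = [20, 25, 45, 70, 110, 150, 170, 160, 110, 60, 30, 20]
      for month, (aet, rech, run) in enumerate(monthly_water_balance(precip, pet), 1):
          print(f"month {month:2d}: AET {aet:6.1f}  recharge {rech:5.1f}  runoff {run:5.1f}")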

  17. Monitoring Million Trees LA: Tree performance during the early years and future benefits

    Treesearch

    E. Gregory McPherson

    2014-01-01

    Million Trees LA (MTLA) is one of several large-scale mayoral tree planting initiatives striving to create more livable cities through urban forestry. This study combined field sampling of tree survival and growth with numerical modeling of future benefits to assess performance of MTLA plantings. From 2006 to 2010 MTLA planted a diverse mix of 91,786 trees....

  18. Analysis of the ability of large-scale reanalysis data to define Siberian fire danger in preparation for future fire prediction

    NASA Astrophysics Data System (ADS)

    Soja, Amber; Westberg, David; Stackhouse, Paul, Jr.; McRae, Douglas; Jin, Ji-Zhong; Sukhinin, Anatoly

    2010-05-01

    Fire is the dominant disturbance that precipitates ecosystem change in boreal regions, and fire is largely under the control of weather and climate. Fire frequency, fire severity, area burned and fire season length are predicted to increase in boreal regions under current climate change scenarios. Therefore, changes in fire regimes have the potential to compel ecological change, moving ecosystems more quickly towards equilibrium with a new climate. The ultimate goal of this research is to assess the viability of large-scale (1°) data for defining fire weather danger and fire regimes, so that large-scale data can be confidently used to predict future fire regimes, using fire weather data like that available from current Intergovernmental Panel on Climate Change (IPCC) climate change scenarios. In this talk, we intend to: (1) evaluate Fire Weather Indices (FWI) derived using reanalysis and interpolated station data; (2) discuss the advantages and disadvantages of using these distinct data sources; and (3) highlight established relationships between large-scale fire weather data, area burned, active fires and ecosystems burned. Specifically, the Canadian Forestry Service (CFS) Fire Weather Index (FWI) will be derived using: (1) NASA Goddard Earth Observing System version 4 (GEOS-4) large-scale reanalysis and NASA Global Precipitation Climatology Project (GPCP) data; and (2) National Climatic Data Center (NCDC) surface station-interpolated data. Requirements of the FWI are local noon surface-level air temperature, relative humidity, wind speed, and daily (noon-noon) rainfall. GEOS-4 reanalysis and NCDC station-interpolated fire weather indices are generally consistent spatially, temporally and quantitatively. Additionally, increased fire activity coincides with increased FWI ratings in both data products. Relationships have been established between large-scale FWI and area burned, fire frequency, and ecosystem types, and these can be used to estimate historic and future fire regimes.
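
    Comparing an FWI field derived from reanalysis with one derived from station-interpolated data reduces, at its simplest, to grid-cell statistics such as those sketched below; synthetic fields stand in for the real products.

      import numpy as np

      def compare_fields(a, b):
          """Bias, RMSE and Pearson correlation between two co-registered grids."""
          a, b = a.ravel(), b.ravel()
          mask = np.isfinite(a) & np.isfinite(b)        # ignore missing cells
          a, b = a[mask], b[mask]
          bias = np.mean(a - b)
          rmse = np.sqrt(np.mean((a - b) ** 2))
          corr = np.corrcoef(a, b)[0, 1]
          return bias, rmse, corr

      # Synthetic stand-ins for two 1-degree FWI grids (not real data).
      rng = np.random.default_rng(7)
      fwi_station = rng.gamma(shape=2.0, scale=8.0, size=(40, 80))
      fwi_reanalysis = fwi_station + rng.normal(0.0, 4.0, size=(40, 80)) + 1.5

      bias, rmse, corr = compare_fields(fwi_reanalysis, fwi_station)
      print(f"bias {bias:+.2f}  RMSE {rmse:.2f}  r {corr:.2f}")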

  19. Future changes in large-scale transport and stratosphere-troposphere exchange

    NASA Astrophysics Data System (ADS)

    Abalos, M.; Randel, W. J.; Kinnison, D. E.; Garcia, R. R.

    2017-12-01

    Future changes in large-scale transport are investigated in long-term (1955-2099) simulations of the Community Earth System Model - Whole Atmosphere Community Climate Model (CESM-WACCM) under an RCP6.0 climate change scenario. We examine artificial passive tracers in order to isolate transport changes from future changes in emissions and chemical processes. The model suggests enhanced stratosphere-troposphere exchange in both directions (STE), with decreasing tropospheric and increasing stratospheric tracer concentrations in the troposphere. Changes in the different transport processes are evaluated using the Transformed Eulerian Mean continuity equation, including parameterized convective transport. Dynamical changes associated with the rise of the tropopause height are shown to play a crucial role on future transport trends.

  20. Large-scale velocities and primordial non-Gaussianity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmidt, Fabian

    2010-09-15

    We study the peculiar velocities of density peaks in the presence of primordial non-Gaussianity. Rare, high-density peaks in the initial density field can be identified with tracers such as galaxies and clusters in the evolved matter distribution. The distribution of relative velocities of peaks is derived in the large-scale limit using two different approaches based on a local biasing scheme. Both approaches agree, and show that halos still stream with the dark matter locally as well as statistically, i.e. they do not acquire a velocity bias. Nonetheless, even a moderate degree of (not necessarily local) non-Gaussianity induces a significant skewness (~0.1-0.2) in the relative velocity distribution, making it a potentially interesting probe of non-Gaussianity on intermediate to large scales. We also study two-point correlations in redshift space. The well-known Kaiser formula is still a good approximation on large scales, if the Gaussian halo bias is replaced with its (scale-dependent) non-Gaussian generalization. However, there are additional terms not encompassed by this simple formula which become relevant on smaller scales (k ≳ 0.01 h/Mpc). Depending on the allowed level of non-Gaussianity, these could be of relevance for future large spectroscopic surveys.
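
    One commonly quoted form of the "simple formula" referred to above is sketched below; conventions, in particular the normalization of the growth factor D(z), vary across the literature, so this is a schematic rather than the paper's exact expression.

      % Linear-theory Kaiser power spectrum with the Gaussian bias replaced by
      % its scale-dependent generalization for local-type non-Gaussianity.
      P_s(k,\mu) = \left[b(k) + f\mu^{2}\right]^{2} P_m(k), \qquad
      b(k) = b_G + \frac{3 f_{\mathrm{NL}} (b_G - 1)\,\delta_c\,\Omega_m H_0^{2}}{c^{2} k^{2} T(k) D(z)}, \qquad \delta_c \simeq 1.686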

  1. A Review of Biological Agent Sampling Methods and ...

    EPA Pesticide Factsheets

    This study was conducted to evaluate current sampling and analytical capabilities, from a time and resource perspective, for a large-scale biological contamination incident. The analysis will be useful for strategically directing future research investment.

  2. The relationship of large fire occurrence with drought and fire danger indices in the western USA, 1984-2008: The role of temporal scale

    Treesearch

    Karin L. Riley; John T. Abatzoglou; Isaac C. Grenfell; Anna E. Klene; Faith Ann Heinsch

    2013-01-01

    The relationship between large fire occurrence and drought has important implications for fire prediction under current and future climates. This study’s primary objective was to evaluate correlations between drought and fire-danger-rating indices representing short- and long-term drought, to determine which had the strongest relationships with large fire occurrence...

  3. Defining biotypes for depression and anxiety based on large-scale circuit dysfunction: A theoretical review of the evidence and future directions for clinical translation

    PubMed Central

    Williams, Leanne M

    2016-01-01

    Complex emotional, cognitive and self-reflective functions rely on the activation and connectivity of large-scale neural circuits. These circuits offer a relevant scale of focus for conceptualizing a taxonomy for depression and anxiety based on specific profiles (or biotypes) of neural circuit dysfunction. Here, this theoretical review first outlines the current consensus as to what constitutes the organization of large-scale circuits in the human brain identified using parcellation and meta-analysis. The focus is on neural circuits implicated in resting reflection (“default mode”), detection of “salience”, affective processing (“threat” and “reward”), “attention” and “cognitive control”. Next, the current evidence regarding which types of dysfunction in these circuits characterize depression and anxiety disorders is reviewed, with an emphasis on published meta-analyses and reviews of circuit dysfunctions that have been identified in at least two well-powered case:control studies. Grounded in the review of these topics, a conceptual framework is proposed for considering neural circuit-defined “biotypes”. In this framework, biotypes are defined by profiles of extent of dysfunction on each large-scale circuit. The clinical implications of a biotype approach for guiding classification and treatment of depression and anxiety are considered. Future research directions will develop the validity and clinical utility of a neural circuit biotype model that spans diagnostic categories and helps to translate neuroscience into clinical practice in the real world. PMID:27653321

  4. Large scale systems : a study of computer organizations for air traffic control applications.

    DOT National Transportation Integrated Search

    1971-06-01

    Based on current sizing estimates and tracking algorithms, some computer organizations applicable to future air traffic control computing systems are described and assessed. Hardware and software problem areas are defined and solutions are outlined.

  5. An outdoor test facility for the large-scale production of microalgae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, D.A.; Weissman, J.; Goebel, R.

    The goal of the US Department of Energy/Solar Energy Research Institute's Aquatic Species Program is to develop the technology base to produce liquid fuels from microalgae. This technology is being initially developed for the desert Southwest. As part of this program an outdoor test facility has been designed and constructed in Roswell, New Mexico. The site has a large existing infrastructure, a suitable climate, and abundant saline groundwater. This facility will be used to evaluate productivity of microalgae strains and conduct large-scale experiments to increase biomass productivity while decreasing production costs. Six 3-m² fiberglass raceways were constructed. Several microalgae strains were screened for growth, one of which had a short-term productivity rate of greater than 50 g dry wt m⁻² d⁻¹. Two large-scale, 0.1-ha raceways have also been built. These are being used to evaluate the performance trade-offs between low-cost earthen liners and higher cost plastic liners. A series of hydraulic measurements is also being carried out to evaluate future improved pond designs. Future plans include a 0.5-ha pond, which will be built in approximately 2 years to test a scaled-up system. This unique facility will be available to other researchers and industry for studies on microalgae productivity. 6 refs., 9 figs., 1 tab.

  6. Locally Downscaled and Spatially Customizable Climate Data for Historical and Future Periods for North America

    PubMed Central

    Wang, Tongli; Hamann, Andreas; Spittlehouse, Dave; Carroll, Carlos

    2016-01-01

    Large volumes of gridded climate data have become available in recent years including interpolated historical data from weather stations and future predictions from general circulation models. These datasets, however, are at various spatial resolutions that need to be converted to scales meaningful for applications such as climate change risk and impact assessments or sample-based ecological research. Extracting climate data for specific locations from large datasets is not a trivial task and typically requires advanced GIS and data management skills. In this study, we developed a software package, ClimateNA, that facilitates this task and provides a user-friendly interface suitable for resource managers and decision makers as well as scientists. The software locally downscales historical and future monthly climate data layers into scale-free point estimates of climate values for the entire North American continent. The software also calculates a large number of biologically relevant climate variables that are usually derived from daily weather data. ClimateNA covers 1) 104 years of historical data (1901–2014) in monthly, annual, decadal and 30-year time steps; 2) three paleoclimatic periods (Last Glacial Maximum, Mid Holocene and Last Millennium); 3) three future periods (2020s, 2050s and 2080s); and 4) annual time-series of model projections for 2011–2100. Multiple general circulation models (GCMs) were included for both paleo and future periods, and two representative concentration pathways (RCP4.5 and 8.5) were chosen for future climate data. PMID:27275583

  7. Locally Downscaled and Spatially Customizable Climate Data for Historical and Future Periods for North America.

    PubMed

    Wang, Tongli; Hamann, Andreas; Spittlehouse, Dave; Carroll, Carlos

    2016-01-01

    Large volumes of gridded climate data have become available in recent years including interpolated historical data from weather stations and future predictions from general circulation models. These datasets, however, are at various spatial resolutions that need to be converted to scales meaningful for applications such as climate change risk and impact assessments or sample-based ecological research. Extracting climate data for specific locations from large datasets is not a trivial task and typically requires advanced GIS and data management skills. In this study, we developed a software package, ClimateNA, that facilitates this task and provides a user-friendly interface suitable for resource managers and decision makers as well as scientists. The software locally downscales historical and future monthly climate data layers into scale-free point estimates of climate values for the entire North American continent. The software also calculates a large number of biologically relevant climate variables that are usually derived from daily weather data. ClimateNA covers 1) 104 years of historical data (1901-2014) in monthly, annual, decadal and 30-year time steps; 2) three paleoclimatic periods (Last Glacial Maximum, Mid Holocene and Last Millennium); 3) three future periods (2020s, 2050s and 2080s); and 4) annual time-series of model projections for 2011-2100. Multiple general circulation models (GCMs) were included for both paleo and future periods, and two representative concentration pathways (RCP4.5 and 8.5) were chosen for future climate data.
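
    The "scale-free point estimate" idea can be illustrated with the generic two-step pattern of interpolating a gridded value to a point and then applying an elevation adjustment. The Python sketch below uses bilinear interpolation and a fixed lapse rate; it is a simplified illustration, not ClimateNA's actual algorithm.

      import numpy as np

      def bilinear(grid, x, y):
          """Bilinear interpolation of a 2-D grid at fractional indices (x, y)."""
          x0, y0 = int(np.floor(x)), int(np.floor(y))
          dx, dy = x - x0, y - y0
          return ((1 - dx) * (1 - dy) * grid[y0, x0] + dx * (1 - dy) * grid[y0, x0 + 1]
                  + (1 - dx) * dy * grid[y0 + 1, x0] + dx * dy * grid[y0 + 1, x0 + 1])

      def downscale_temperature(t_grid, elev_grid, x, y, point_elev, lapse=-6.5):
          """Interpolate gridded temperature to a point, then correct for the
          difference between the point's true elevation and the interpolated grid
          elevation using a constant lapse rate (degC per km). Simplified sketch;
          ClimateNA applies its own, more elaborate adjustments."""
          t_interp = bilinear(t_grid, x, y)
          e_interp = bilinear(elev_grid, x, y)
          return t_interp + lapse * (point_elev - e_interp) / 1000.0

      # Toy 2x2 grids of July mean temperature (degC) and elevation (m).
      t_grid = np.array([[18.0, 17.5], [16.0, 15.5]])
      elev_grid = np.array([[400.0, 600.0], [900.0, 1100.0]])
      print(downscale_temperature(t_grid, elev_grid, x=0.3, y=0.6, point_elev=1350.0))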

  8. Large-Scale Atmospheric Circulation Patterns Associated with Temperature Extremes as a Basis for Model Evaluation: Methodological Overview and Results

    NASA Astrophysics Data System (ADS)

    Loikith, P. C.; Broccoli, A. J.; Waliser, D. E.; Lintner, B. R.; Neelin, J. D.

    2015-12-01

    Anomalous large-scale circulation patterns often play a key role in the occurrence of temperature extremes. For example, large-scale circulation can drive horizontal temperature advection or influence local processes that lead to extreme temperatures, such as by inhibiting moderating sea breezes, promoting downslope adiabatic warming, and affecting the development of cloud cover. Additionally, large-scale circulation can influence the shape of temperature distribution tails, with important implications for the magnitude of future changes in extremes. As a result of the prominent role these patterns play in the occurrence and character of extremes, the way in which temperature extremes change in the future will be highly influenced by whether and how these patterns change. It is therefore critical to identify and understand the key patterns associated with extremes at local to regional scales in the current climate and to use this foundation as a target for climate model validation. This presentation provides an overview of recent and ongoing work aimed at developing and applying novel approaches to identifying and describing the large-scale circulation patterns associated with temperature extremes in observations and using this foundation to evaluate state-of-the-art global and regional climate models. Emphasis is given to anomalies in sea level pressure and 500 hPa geopotential height over North America using several methods to identify circulation patterns, including self-organizing maps and composite analysis. Overall, evaluation results suggest that models are able to reproduce observed patterns associated with temperature extremes with reasonable fidelity in many cases. Model skill is often highest when and where synoptic-scale processes are the dominant mechanisms for extremes, and lower where sub-grid scale processes (such as those related to topography) are important. Where model skill in reproducing these patterns is high, it can be inferred that extremes are being simulated for plausible physical reasons, boosting confidence in future projections of temperature extremes. Conversely, where model skill is identified to be lower, caution should be exercised in interpreting future projections.
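
    A minimal sketch of the composite-analysis step mentioned above, under assumed array shapes (a daily local temperature series and gridded 500 hPa height anomalies); it is illustrative only and not the presenters' code.

        import numpy as np

        def extreme_day_composite(tas_point, z500_anom, pct=95.0):
            """Average 500 hPa height anomalies (time, lat, lon) over the days on which
            the local temperature series tas_point (time,) exceeds its pct-th percentile."""
            hot = tas_point > np.percentile(tas_point, pct)
            return z500_anom[hot].mean(axis=0), int(hot.sum())

        # Synthetic example: 1000 days over a 10 x 12 grid
        rng = np.random.default_rng(0)
        tas = rng.normal(size=1000)
        z500 = rng.normal(size=(1000, 10, 12))
        composite, n_days = extreme_day_composite(tas, z500)
        print(composite.shape, n_days)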

  9. Uncertainties in Past and Future Global Water Availability

    NASA Astrophysics Data System (ADS)

    Sheffield, J.; Kam, J.

    2014-12-01

    Understanding how water availability changes on inter-annual to decadal time scales and how it may change in the future under climate change is a key part of understanding future stresses on water and food security. Historic evaluations of water availability on regional to global scales are generally based on large-scale model simulations with their associated uncertainties, in particular for long-term changes. Uncertainties are due to model errors and missing processes, parameter uncertainty, and errors in meteorological forcing data. Recent multi-model inter-comparisons and impact studies have highlighted large differences for past reconstructions, due to different simplifying assumptions in the models or the inclusion of physical processes such as CO2 fertilization. Modeling of direct anthropogenic factors such as water and land management also carries large uncertainties in their physical representation and from lack of socio-economic data. Furthermore, there is little understanding of the impact of uncertainties in the meteorological forcings that underpin these historic simulations. Similarly, future changes in water availability are highly uncertain due to climate model diversity, natural variability and scenario uncertainty, each of which dominates at different time scales. In particular, natural climate variability is expected to dominate any externally forced signal over the next several decades. We present results from multi-land surface model simulations of the historic global availability of water in the context of natural variability (droughts) and long-term changes (drying). The simulations take into account the impact of uncertainties in the meteorological forcings and the incorporation of water management in the form of reservoirs and irrigation. The results indicate that model uncertainty is important for short-term drought events, and forcing uncertainty is particularly important for long-term changes, especially uncertainty in precipitation due to reduced gauge density in recent years. We also discuss uncertainties in future projections from these models as driven by bias-corrected and downscaled CMIP5 climate projections, in the context of the balance between climate model robustness and climate model diversity.

  10. Large-scale assessment of present day and future groundwater recharge and its sensitivity to climate variability in Europe's karst regions

    NASA Astrophysics Data System (ADS)

    Hartmann, A. J.; Gleeson, T. P.; Wagener, T.; Wada, Y.

    2016-12-01

    Karst aquifers in Europe are an important source of fresh water contributing up to half of the total drinking water supply in some countries. Karstic groundwater recharge is one of the most important components of the water balance of karst systems as it feeds the karst aquifers. Presently available large-scale hydrological models do not consider karst heterogeneity adequately. Projections of current and potential future groundwater recharge of Europe's karst aquifers are therefore unclear. In this study we compare simulations of present (1991-2010) and future (2080-2099) recharge using two different models to simulate groundwater recharge processes. One model includes karst processes (subsurface heterogeneity, lateral flow and concentrated recharge), while the other is based on the conceptual understanding of common hydrological systems (homogeneous subsurface, saturation excess overland flow). Both models are driven by the five bias-corrected GCMs of the ISI-MIP project (RCP8.5). To further assess sensitivity of groundwater recharge to climate variability, we calculate the elasticity of recharge rates to annual precipitation, temperature and average intensity of rainfall events, defined as the median change of recharge corresponding to the median change of these climate variables within the present and future periods, respectively. Our model comparison shows that karst regions over Europe have enhanced recharge rates with greater inter-annual variability compared to those with more homogeneous subsurface properties. Furthermore, the heterogeneous representation shows stronger elasticity concerning climate variability than the homogeneous subsurface representation. This difference tends to increase towards the future. Our results suggest that water management in regions with a heterogeneous subsurface can expect higher water availability than estimated by most of the current large-scale simulations, while measures should be taken to prepare for increasingly variable groundwater recharge rates.
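
    One plausible reading of the elasticity metric described above, sketched with hypothetical annual series; the nonparametric estimator below (in the spirit of Sankarasubramanian et al.) is an assumption and may differ from the study's exact formulation.

        import numpy as np

        def recharge_elasticity(recharge, climate_var):
            """Median of the ratio of relative recharge anomalies to relative anomalies of a
            climate variable (e.g. annual precipitation). Inputs are annual series of equal
            length; this is an assumed formulation, not necessarily the study's own."""
            r = np.asarray(recharge, float)
            p = np.asarray(climate_var, float)
            dr = (r - r.mean()) / r.mean()
            dp = (p - p.mean()) / p.mean()
            mask = dp != 0.0  # avoid division by zero in years exactly at the mean
            return np.median(dr[mask] / dp[mask])

        # Hypothetical 20-year annual recharge (mm) and precipitation (mm)
        rng = np.random.default_rng(1)
        precip = rng.normal(900.0, 150.0, size=20)
        recharge = 0.35 * precip + rng.normal(0.0, 20.0, size=20)
        print(round(recharge_elasticity(recharge, precip), 2))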

  11. Fire Whirls

    NASA Astrophysics Data System (ADS)

    Tohidi, Ali; Gollner, Michael J.; Xiao, Huahua

    2018-01-01

    Fire whirls present a powerful intensification of combustion, long studied in the fire research community because of the dangers they present during large urban and wildland fires. However, their destructive power has hidden many features of their formation, growth, and propagation. Therefore, most of what is known about fire whirls comes from scale modeling experiments in the laboratory. Both the methods of formation, which are dominated by wind and geometry, and the inner structure of the whirl, including velocity and temperature fields, have been studied at this scale. Quasi-steady fire whirls directly over a fuel source form the bulk of current experimental knowledge, although many other cases exist in nature. The structure of fire whirls has yet to be reliably measured at large scales; however, scaling laws have been relatively successful in modeling the conditions for formation from small to large scales. This review surveys the state of knowledge concerning the fluid dynamics of fire whirls, including the conditions for their formation, their structure, and the mechanisms that control their unique state. We highlight recent discoveries and survey potential avenues for future research, including using the properties of fire whirls for efficient remediation and energy generation.

  12. Potential climatic impacts and reliability of large-scale offshore wind farms

    NASA Astrophysics Data System (ADS)

    Wang, Chien; Prinn, Ronald G.

    2011-04-01

    The vast availability of wind power has fueled substantial interest in this renewable energy source as a potential near-zero greenhouse gas emission technology for meeting future world energy needs while addressing the climate change issue. However, in order to provide even a fraction of the estimated future energy needs, a large-scale deployment of wind turbines (several million) is required. The consequent environmental impacts, and the inherent reliability of such a large-scale usage of intermittent wind power would have to be carefully assessed, in addition to the need to lower the high current unit wind power costs. Our previous study (Wang and Prinn 2010 Atmos. Chem. Phys. 10 2053) using a three-dimensional climate model suggested that a large deployment of wind turbines over land to meet about 10% of predicted world energy needs in 2100 could lead to a significant temperature increase in the lower atmosphere over the installed regions. A global-scale perturbation to the general circulation patterns as well as to the cloud and precipitation distribution was also predicted. In the later study reported here, we conducted a set of six additional model simulations using an improved climate model to further address the potential environmental and intermittency issues of large-scale deployment of offshore wind turbines for differing installation areas and spatial densities. In contrast to the previous land installation results, the offshore wind turbine installations are found to cause a surface cooling over the installed offshore regions. This cooling is due principally to the enhanced latent heat flux from the sea surface to lower atmosphere, driven by an increase in turbulent mixing caused by the wind turbines which was not entirely offset by the concurrent reduction of mean wind kinetic energy. We found that the perturbation of the large-scale deployment of offshore wind turbines to the global climate is relatively small compared to the case of land-based installations. However, the intermittency caused by the significant seasonal wind variations over several major offshore sites is substantial, and demands further options to ensure the reliability of large-scale offshore wind power. The method that we used to simulate the offshore wind turbine effect on the lower atmosphere involved simply increasing the ocean surface drag coefficient. While this method is consistent with several detailed fine-scale simulations of wind turbines, it still needs further study to ensure its validity. New field observations of actual wind turbine arrays are definitely required to provide ultimate validation of the model predictions presented here.
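
    The modelling device described above, raising the ocean surface drag coefficient, acts through the bulk surface-stress relation tau = rho_air * C_d * U^2; the numbers in the sketch below are illustrative and are not the values used in the cited simulations.

        # Bulk aerodynamic surface stress: tau = rho_air * C_d * U**2 (N m^-2).
        # Raising C_d over the turbine region increases the momentum extracted from the
        # mean wind and the turbulent mixing near the surface; values are illustrative.
        rho_air = 1.2          # air density, kg m^-3
        wind_speed = 8.0       # 10-m wind speed, m s^-1
        cd_baseline = 1.5e-3   # typical open-ocean drag coefficient
        cd_enhanced = 6.0e-3   # assumed enhanced value representing a turbine array

        for label, cd in [("baseline ocean", cd_baseline), ("offshore wind farm", cd_enhanced)]:
            tau = rho_air * cd * wind_speed ** 2
            print(f"{label}: tau = {tau:.3f} N/m^2")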

  13. Possible future effects of large-scale algae cultivation for biofuels on coastal eutrophication in Europe.

    PubMed

    Blaas, Harry; Kroeze, Carolien

    2014-10-15

    Biodiesel is increasingly considered as an alternative for fossil diesel. Biodiesel can be produced from rapeseed, palm, sunflower, soybean and algae. In this study, the consequences of large-scale production of biodiesel from micro-algae for eutrophication in four large European seas are analysed. To this end, scenarios for the year 2050 are analysed, assuming that in the 27 countries of the European Union fossil diesel will be replaced by biodiesel from algae. Estimates are made for the required fertiliser inputs to algae parks, and how this may increase concentrations of nitrogen and phosphorus in coastal waters, potentially leading to eutrophication. The Global NEWS (Nutrient Export from WaterSheds) model has been used to estimate the transport of nitrogen and phosphorus to the European coastal waters. The results indicate that the amount of nitrogen and phosphorus in the coastal waters may increase considerably in the future as a result of large-scale production of algae for the production of biodiesel, even in scenarios assuming effective waste water treatment and recycling of waste water in algae production. To ensure sustainable production of biodiesel from micro-algae, it is important to develop cultivation systems with low nutrient losses to the environment. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Channel optimization of high-intensity laser beams in millimeter-scale plasmas.

    PubMed

    Ceurvorst, L; Savin, A; Ratan, N; Kasim, M F; Sadler, J; Norreys, P A; Habara, H; Tanaka, K A; Zhang, S; Wei, M S; Ivancic, S; Froula, D H; Theobald, W

    2018-04-01

    Channeling experiments were performed at the OMEGA EP facility using relativistic intensity (>10^{18}W/cm^{2}) kilojoule laser pulses through large density scale length (∼390-570 μm) laser-produced plasmas, demonstrating the effects of the pulse's focal location and intensity as well as the plasma's temperature on the resulting channel formation. The results show deeper channeling when focused into hot plasmas and at lower densities, as expected. However, contrary to previous large-scale particle-in-cell studies, the results also indicate deeper penetration by short (10 ps), intense pulses compared to their longer-duration equivalents. This new observation has many implications for future laser-plasma research in the relativistic regime.

  15. Channel optimization of high-intensity laser beams in millimeter-scale plasmas

    NASA Astrophysics Data System (ADS)

    Ceurvorst, L.; Savin, A.; Ratan, N.; Kasim, M. F.; Sadler, J.; Norreys, P. A.; Habara, H.; Tanaka, K. A.; Zhang, S.; Wei, M. S.; Ivancic, S.; Froula, D. H.; Theobald, W.

    2018-04-01

    Channeling experiments were performed at the OMEGA EP facility using relativistic intensity (>10^18 W/cm^2) kilojoule laser pulses through large density scale length (∼390–570 μm) laser-produced plasmas, demonstrating the effects of the pulse's focal location and intensity as well as the plasma's temperature on the resulting channel formation. The results show deeper channeling when focused into hot plasmas and at lower densities, as expected. However, contrary to previous large-scale particle-in-cell studies, the results also indicate deeper penetration by short (10 ps), intense pulses compared to their longer-duration equivalents. This new observation has many implications for future laser-plasma research in the relativistic regime.

  16. Using large-scale diagnostic quantities to investigate change in East Coast Lows

    NASA Astrophysics Data System (ADS)

    Ji, Fei; Evans, Jason P.; Argueso, Daniel; Fita, Lluis; Di Luca, Alejandro

    2015-11-01

    East Coast Lows (ECLs) are intense low-pressure systems that affect the eastern seaboard of Australia. They have attracted research interest for both their destructive nature and water-supplying capability. Estimating how ECLs will change in the future is important for emergency response as well as water management strategies for the coastal communities on the east coast of Australia. In this study, ECLs were identified using two large-scale diagnostic quantities: isentropic potential vorticity (IPV) and geostrophic vorticity (GV), which were calculated from outputs of historical and future regional climate simulations from the NSW/ACT regional climate modelling (NARCliM) project. The diagnostic results for the historical period were evaluated against a subjective ECL event database. Future simulations using a high emission scenario were examined to estimate changes in frequency, duration, and intensity of ECLs. The use of a relatively high resolution regional climate model makes this the first study to examine future changes in ECLs while resolving the full range of ECL sizes, which can be as small as 100-200 km in diameter. The results indicate that it is likely that there will be fewer ECLs, with weaker intensity, in the future. There could also be a seasonal shift in ECLs from cool months to warm months. These changes have the potential to significantly impact water security on the east coast of Australia.
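
    For orientation only: geostrophic vorticity, one of the two diagnostics named above, can be obtained from a geopotential-height field as zeta_g = (g / f) * laplacian(Z). The finite-difference sketch below assumes a regular grid in metres and is not the NARCliM detection code; the Coriolis parameter is a representative assumed value.

        import numpy as np

        G = 9.81          # gravitational acceleration, m s^-2
        F_COR = -8.0e-5   # Coriolis parameter, s^-1 (representative of ~33 deg S; assumed)

        def geostrophic_vorticity(z, dx, dy):
            """zeta_g = (g / f) * laplacian(Z) for a geopotential-height field z (m) on a
            regular grid with spacings dx, dy (m); second-order central differences with
            periodic wrap at the edges (np.roll) purely for brevity."""
            d2x = (np.roll(z, -1, axis=1) - 2.0 * z + np.roll(z, 1, axis=1)) / dx ** 2
            d2y = (np.roll(z, -1, axis=0) - 2.0 * z + np.roll(z, 1, axis=0)) / dy ** 2
            return (G / F_COR) * (d2x + d2y)

        # Synthetic closed low: laplacian(Z) > 0 at the centre, so with f < 0 (Southern
        # Hemisphere) the geostrophic vorticity there is negative, i.e. cyclonic.
        y, x = np.mgrid[0:50, 0:50]
        z = 5700.0 - 80.0 * np.exp(-((x - 25) ** 2 + (y - 25) ** 2) / 60.0)
        print(geostrophic_vorticity(z, dx=50e3, dy=50e3)[25, 25])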

  17. Towards Productive Critique of Large-Scale Comparisons in Education

    ERIC Educational Resources Information Center

    Gorur, Radhika

    2017-01-01

    International large-scale assessments and comparisons (ILSAs) in education have become significant policy phenomena. How a country fares in these assessments has come to signify not only how a nation's education system is performing, but also its future prospects in a global economic "race". These assessments provoke passionate arguments…

  18. The role of Natural Flood Management in managing floods in large scale basins during extreme events

    NASA Astrophysics Data System (ADS)

    Quinn, Paul; Owen, Gareth; ODonnell, Greg; Nicholson, Alex; Hetherington, David

    2016-04-01

    There is a strong evidence database showing the negative impacts of land use intensification and soil degradation in NW European river basins on hydrological response and on flood impact downstream. However, the ability to target zones of high runoff production and the extent to which we can manage flood risk using nature-based flood management solutions are less well known. A move to planting more trees and having less intensely farmed landscapes is part of natural flood management (NFM) solutions, and these methods suggest that flood risk can be managed in alternative and more holistic ways. So which local NFM methods should be used, where in a large scale basin should they be deployed, and how does flow propagate to any point downstream? Generally, how much intervention is needed, and will it compromise food production systems? If we are observing record levels of rainfall and flow, for example during Storm Desmond in Dec 2015 in the North West of England, what other flood management options are really needed to complement our traditional defences in large basins for the future? In this paper we will show examples of NFM interventions in the UK that have had an impact at local scale sites. We will demonstrate the impact of interventions at local, sub-catchment (meso-scale) and finally at the large scale. These tools include observations, process based models and more generalised Flood Impact Models. Issues of synchronisation and the design level of protection will be debated. By reworking observed rainfall and discharge (runoff) for observed extreme events in the River Eden and River Tyne, during Storm Desmond, we will show how much flood protection is needed in large scale basins. The research will thus pose a number of key questions as to how floods may have to be managed in large scale basins in the future. We will seek to support a method of catchment systems engineering that holds water back across the whole landscape as a major opportunity to manage water in large scale basins in the future. The broader benefits of engineering landscapes to hold water for pollution control, sediment loss and drought minimisation will also be shown.

  19. Evaluation of nucleus segmentation in digital pathology images through large scale image synthesis

    NASA Astrophysics Data System (ADS)

    Zhou, Naiyun; Yu, Xiaxia; Zhao, Tianhao; Wen, Si; Wang, Fusheng; Zhu, Wei; Kurc, Tahsin; Tannenbaum, Allen; Saltz, Joel; Gao, Yi

    2017-03-01

    Digital histopathology images with more than 1 Gigapixel are drawing more and more attention in clinical, biomedical research, and computer vision fields. Among the multiple observable features spanning multiple scales in the pathology images, nuclear morphology is one of the central criteria for diagnosis and grading. As a result it is also the most studied target in image computing. A large number of research papers have been devoted to the problem of extracting nuclei from digital pathology images, which is the foundation of any further correlation study. However, the validation and evaluation of nucleus extraction have not yet been formulated rigorously and systematically. Some studies report human-verified segmentations with thousands of nuclei, whereas a single whole slide image may contain up to a million. The main obstacle lies in the difficulty of obtaining such a large number of validated nuclei, which is essentially an impossible task for a pathologist. We propose a systematic validation and evaluation approach based on large scale image synthesis. This could facilitate more quantitatively validated studies for the current and future histopathology image analysis field.
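
    The abstract does not name an evaluation metric; as one generic example of how synthesized ground-truth masks enable quantitative scoring, a Dice overlap between a synthetic mask and a segmentation result could be computed as follows (purely illustrative).

        import numpy as np

        def dice_coefficient(mask_pred, mask_true):
            """Dice overlap between two binary nucleus masks (1 = nucleus pixel)."""
            pred = mask_pred.astype(bool)
            true = mask_true.astype(bool)
            intersection = np.logical_and(pred, true).sum()
            denom = pred.sum() + true.sum()
            return 2.0 * intersection / denom if denom else 1.0

        # Toy example: synthesized ground truth vs. a slightly shifted segmentation
        truth = np.zeros((64, 64), dtype=np.uint8)
        truth[20:40, 20:40] = 1
        pred = np.zeros_like(truth)
        pred[22:42, 22:42] = 1
        print(round(dice_coefficient(pred, truth), 3))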

  20. Projected future vegetation changes for the northwest United States and southwest Canada at a fine spatial resolution using a dynamic global vegetation model.

    USGS Publications Warehouse

    Shafer, Sarah; Bartlein, Patrick J.; Gray, Elizabeth M.; Pelltier, Richard T.

    2015-01-01

    Future climate change may significantly alter the distributions of many plant taxa. The effects of climate change may be particularly large in mountainous regions where climate can vary significantly with elevation. Understanding potential future vegetation changes in these regions requires methods that can resolve vegetation responses to climate change at fine spatial resolutions. We used LPJ, a dynamic global vegetation model, to assess potential future vegetation changes for a large topographically complex area of the northwest United States and southwest Canada (38.0–58.0°N latitude by 136.6–103.0°W longitude). LPJ is a process-based vegetation model that mechanistically simulates the effect of changing climate and atmospheric CO2 concentrations on vegetation. It was developed and has been mostly applied at spatial resolutions of 10-minutes or coarser. In this study, we used LPJ at a 30-second (~1-km) spatial resolution to simulate potential vegetation changes for 2070–2099. LPJ was run using downscaled future climate simulations from five coupled atmosphere-ocean general circulation models (CCSM3, CGCM3.1(T47), GISS-ER, MIROC3.2(medres), UKMO-HadCM3) produced using the A2 greenhouse gases emissions scenario. Under projected future climate and atmospheric CO2 concentrations, the simulated vegetation changes result in the contraction of alpine, shrub-steppe, and xeric shrub vegetation across the study area and the expansion of woodland and forest vegetation. Large areas of maritime cool forest and cold forest are simulated to persist under projected future conditions. The fine spatial-scale vegetation simulations resolve patterns of vegetation change that are not visible at coarser resolutions and these fine-scale patterns are particularly important for understanding potential future vegetation changes in topographically complex areas.

  1. Projected Future Vegetation Changes for the Northwest United States and Southwest Canada at a Fine Spatial Resolution Using a Dynamic Global Vegetation Model

    PubMed Central

    Shafer, Sarah L.; Bartlein, Patrick J.; Gray, Elizabeth M.; Pelltier, Richard T.

    2015-01-01

    Future climate change may significantly alter the distributions of many plant taxa. The effects of climate change may be particularly large in mountainous regions where climate can vary significantly with elevation. Understanding potential future vegetation changes in these regions requires methods that can resolve vegetation responses to climate change at fine spatial resolutions. We used LPJ, a dynamic global vegetation model, to assess potential future vegetation changes for a large topographically complex area of the northwest United States and southwest Canada (38.0–58.0°N latitude by 136.6–103.0°W longitude). LPJ is a process-based vegetation model that mechanistically simulates the effect of changing climate and atmospheric CO2 concentrations on vegetation. It was developed and has been mostly applied at spatial resolutions of 10-minutes or coarser. In this study, we used LPJ at a 30-second (~1-km) spatial resolution to simulate potential vegetation changes for 2070–2099. LPJ was run using downscaled future climate simulations from five coupled atmosphere-ocean general circulation models (CCSM3, CGCM3.1(T47), GISS-ER, MIROC3.2(medres), UKMO-HadCM3) produced using the A2 greenhouse gases emissions scenario. Under projected future climate and atmospheric CO2 concentrations, the simulated vegetation changes result in the contraction of alpine, shrub-steppe, and xeric shrub vegetation across the study area and the expansion of woodland and forest vegetation. Large areas of maritime cool forest and cold forest are simulated to persist under projected future conditions. The fine spatial-scale vegetation simulations resolve patterns of vegetation change that are not visible at coarser resolutions and these fine-scale patterns are particularly important for understanding potential future vegetation changes in topographically complex areas. PMID:26488750

  2. Scale dependence of halo bispectrum from non-Gaussian initial conditions in cosmological N-body simulations

    NASA Astrophysics Data System (ADS)

    Nishimichi, Takahiro; Taruya, Atsushi; Koyama, Kazuya; Sabiu, Cristiano

    2010-07-01

    We study the halo bispectrum from non-Gaussian initial conditions. Based on a set of large N-body simulations starting from initial density fields with local type non-Gaussianity, we find that the halo bispectrum exhibits a strong dependence on the shape and scale of Fourier space triangles near squeezed configurations at large scales. The amplitude of the halo bispectrum roughly scales as fNL^2. The resultant scaling on the triangular shape is consistent with that predicted by Jeong & Komatsu based on perturbation theory. We systematically investigate this dependence with varying redshifts and halo mass thresholds. It is shown that the fNL dependence of the halo bispectrum is stronger for more massive haloes at higher redshifts. This feature can be a useful discriminator of inflation scenarios in future deep and wide galaxy redshift surveys.

  3. Impact of oceanic-scale interactions on the seasonal modulation of ocean dynamics by the atmosphere.

    PubMed

    Sasaki, Hideharu; Klein, Patrice; Qiu, Bo; Sasai, Yoshikazu

    2014-12-15

    Ocean eddies (with a size of 100-300 km), ubiquitous in satellite observations, are known to represent about 80% of the total ocean kinetic energy. Recent studies have pointed out the unexpected role of smaller oceanic structures (with 1-50 km scales) in generating and sustaining these eddies. The interpretation proposed so far invokes the internal instability resulting from the large-scale interaction between upper and interior oceanic layers. Here we show, using a new high-resolution simulation of the realistic North Pacific Ocean, that ocean eddies are instead sustained by a different process that involves small-scale mixed-layer instabilities set up by large-scale atmospheric forcing in winter. This leads to a seasonal evolution of the eddy kinetic energy in a very large part of this ocean, with an amplitude varying by a factor almost equal to 2. Perspectives in terms of the impacts on climate dynamics and future satellite observational systems are briefly discussed.

  4. Impact of oceanic-scale interactions on the seasonal modulation of ocean dynamics by the atmosphere

    PubMed Central

    Sasaki, Hideharu; Klein, Patrice; Qiu, Bo; Sasai, Yoshikazu

    2014-01-01

    Ocean eddies (with a size of 100–300 km), ubiquitous in satellite observations, are known to represent about 80% of the total ocean kinetic energy. Recent studies have pointed out the unexpected role of smaller oceanic structures (with 1–50 km scales) in generating and sustaining these eddies. The interpretation proposed so far invokes the internal instability resulting from the large-scale interaction between upper and interior oceanic layers. Here we show, using a new high-resolution simulation of the realistic North Pacific Ocean, that ocean eddies are instead sustained by a different process that involves small-scale mixed-layer instabilities set up by large-scale atmospheric forcing in winter. This leads to a seasonal evolution of the eddy kinetic energy in a very large part of this ocean, with an amplitude varying by a factor almost equal to 2. Perspectives in terms of the impacts on climate dynamics and future satellite observational systems are briefly discussed. PMID:25501039

  5. Spatially distributed potential evapotranspiration modeling and climate projections.

    PubMed

    Gharbia, Salem S; Smullen, Trevor; Gill, Laurence; Johnston, Paul; Pilla, Francesco

    2018-08-15

    Evapotranspiration integrates energy and mass transfer between the Earth's surface and atmosphere and is the most active mechanism linking the atmosphere, hydrosphere, lithosphere and biosphere. This study focuses on the fine resolution modeling and projection of spatially distributed potential evapotranspiration at the large catchment scale in response to climate change. Six potential evapotranspiration algorithms, systematically selected based on structured criteria and data availability, were applied and then validated against long-term mean monthly data for the Shannon River catchment at a 50 m² cell size. The best-validated algorithm was then applied to evaluate the possible effect of future climate change on potential evapotranspiration rates. Spatially distributed potential evapotranspiration projections have been modeled based on climate change projections from multi-GCM ensembles for three future time intervals (2020, 2050 and 2080) using a range of different Representative Concentration Pathways, producing four scenarios for each time interval. Finally, seasonal results have been compared to baseline results to evaluate the impact of climate change on potential evapotranspiration and therefore on the catchment dynamical water balance. The results present evidence that the modeled climate change scenarios would have a significant impact on future potential evapotranspiration rates. All the simulated scenarios predicted an increase in potential evapotranspiration for each modeled future time interval, which would significantly affect the dynamical catchment water balance. This study addresses the gap in the literature of using GIS-based algorithms to model fine-scale spatially distributed potential evapotranspiration on large catchment systems based on climatological observations and simulations in different climatological zones. Providing fine-scale potential evapotranspiration data is crucial for assessing the dynamical catchment water balance and setting up management scenarios for water abstractions. This study illustrates a transferable systematic method to design GIS-based algorithms to simulate spatially distributed potential evapotranspiration on large catchment systems. Copyright © 2018 Elsevier B.V. All rights reserved.
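
    The abstract does not list the six algorithms that were screened; as one example of a temperature-based formula that can be evaluated cell-by-cell on a raster, the Hargreaves-Samani equation is sketched below (whether it was among those tested is not stated here; all input values are invented).

        import numpy as np

        def hargreaves_pet(tmean, tmax, tmin, ra):
            """Hargreaves-Samani reference evapotranspiration (mm/day).
            tmean/tmax/tmin in degC, ra = extraterrestrial radiation expressed as
            equivalent evaporation (mm/day). Works element-wise on raster arrays."""
            return 0.0023 * ra * (tmean + 17.8) * np.sqrt(np.maximum(tmax - tmin, 0.0))

        # Toy 2 x 2 'raster' for one month (illustrative values only)
        tmean = np.array([[14.0, 15.5], [13.0, 16.0]])
        tmax = tmean + 5.0
        tmin = tmean - 5.0
        ra = np.full_like(tmean, 15.0)  # mm/day equivalent
        print(hargreaves_pet(tmean, tmax, tmin, ra))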

  6. ATLAS and LHC computing on CRAY

    NASA Astrophysics Data System (ADS)

    Sciacca, F. G.; Haug, S.; ATLAS Collaboration

    2017-10-01

    Access and exploitation of large scale computing resources, such as those offered by general purpose HPC centres, are one important measure for ATLAS and the other Large Hadron Collider experiments in order to meet the challenge posed by the full exploitation of future data within the constraints of flat budgets. We report on the effort of moving the Swiss WLCG T2 computing, serving ATLAS, CMS and LHCb, from a dedicated cluster to the large Cray systems at the Swiss National Supercomputing Centre CSCS. These systems not only offer very efficient hardware, cooling and highly competent operators, but also have large backfill potential due to their size and multidisciplinary usage, and potential gains due to economies of scale. Technical solutions, performance, expected return and future plans are discussed.

  7. Improving Future Ecosystem Benefits through Earth Observations: the H2020 Project ECOPOTENTIAL

    NASA Astrophysics Data System (ADS)

    Provenzale, Antonello; Beierkuhnlein, Carl; Ziv, Guy

    2016-04-01

    Terrestrial and marine ecosystems provide essential goods and services to human societies. In recent decades, however, anthropogenic pressures have caused serious threats to ecosystem integrity, functions and processes, potentially leading to the loss of essential ecosystem services. ECOPOTENTIAL is a large European-funded H2020 project which focuses its activities on a targeted set of internationally recognised protected areas in Europe, European Territories and beyond, blending Earth Observations from remote sensing and field measurements, data analysis and modelling of current and future ecosystem conditions and services. The definition of future scenarios is based on climate and land-use change projections, addressing the issue of uncertainties and uncertainty propagation across the modelling chain. The ECOPOTENTIAL project addresses cross-scale geosphere-biosphere interactions and landscape-ecosystem dynamics at regional to continental scales, using geostatistical methods and the emerging approaches in Macrosystem Ecology and Earth Critical Zone studies, addressing long-term and large-scale environmental and ecological challenges. The project started its activities in 2015, by defining a set of storylines which make it possible to tackle some of the most crucial issues in the assessment of present conditions and the estimate of the future state of selected ecosystem services. In this contribution, we focus on some of the main storylines of the project and discuss the general approach, focusing on the interplay of data and models and on the estimate of projection uncertainties.

  8. Recent developments in VSD imaging of small neuronal networks

    PubMed Central

    Hill, Evan S.; Bruno, Angela M.

    2014-01-01

    Voltage-sensitive dye (VSD) imaging is a powerful technique that can provide, in single experiments, a large-scale view of network activity unobtainable with traditional sharp electrode recording methods. Here we review recent work using VSDs to study small networks and highlight several results from this approach. Topics covered include circuit mapping, network multifunctionality, the network basis of decision making, and the presence of variably participating neurons in networks. Analytical tools being developed and applied to large-scale VSD imaging data sets are discussed, and the future prospects for this exciting field are considered. PMID:25225295

  9. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system

    PubMed Central

    Jensen, Tue V.; Pinson, Pierre

    2017-01-01

    Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model with information on generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation. PMID:29182600
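
    A sketch of the scaling-to-penetration idea mentioned above; proportional scaling of the renewable series to a target share of annual demand energy is an assumption, not the dataset's prescribed procedure, and all series are synthetic.

        import numpy as np

        def scale_to_penetration(renewable_mw, demand_mw, target_share):
            """Scale an hourly renewable generation series so its annual energy equals
            target_share of annual demand energy (proportional scaling is an assumption)."""
            factor = target_share * demand_mw.sum() / renewable_mw.sum()
            return factor * renewable_mw, factor

        # Hypothetical hourly series for one year
        rng = np.random.default_rng(2)
        hours = 8760
        demand = 45000.0 + 10000.0 * rng.random(hours)   # MW
        wind_solar = 12000.0 * rng.random(hours)          # MW
        scaled, k = scale_to_penetration(wind_solar, demand, target_share=0.6)
        print(round(k, 2), round(scaled.sum() / demand.sum(), 2))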

  10. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system.

    PubMed

    Jensen, Tue V; Pinson, Pierre

    2017-11-28

    Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model with information on generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation.

  11. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system

    NASA Astrophysics Data System (ADS)

    Jensen, Tue V.; Pinson, Pierre

    2017-11-01

    Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model with information on generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation.

  12. The Segmented Aperture Interferometric Nulling Testbed (SAINT) I: Overview and Air-side System Description

    NASA Technical Reports Server (NTRS)

    Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter, III; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina; hide

    2016-01-01

    This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify the end-to-end system performance for high-contrast starlight suppression. This pathfinder system will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.

  13. Nonreciprocity in the dynamics of coupled oscillators with nonlinearity, asymmetry, and scale hierarchy

    NASA Astrophysics Data System (ADS)

    Moore, Keegan J.; Bunyan, Jonathan; Tawfick, Sameh; Gendelman, Oleg V.; Li, Shuangbao; Leamy, Michael; Vakakis, Alexander F.

    2018-01-01

    In linear time-invariant dynamical and acoustical systems, reciprocity holds by the Onsager-Casimir principle of microscopic reversibility, and this can be broken only by odd external biases, nonlinearities, or time-dependent properties. A concept is proposed in this work for breaking dynamic reciprocity based on irreversible nonlinear energy transfers from large to small scales in a system with nonlinear hierarchical internal structure, asymmetry, and intentional strong stiffness nonlinearity. The resulting nonreciprocal large-to-small scale energy transfers mimic analogous nonlinear energy transfer cascades that occur in nature (e.g., in turbulent flows), and are caused by the strong frequency-energy dependence of the essentially nonlinear small-scale components of the system considered. The theoretical part of this work is mainly based on action-angle transformations, followed by direct numerical simulations of the resulting system of nonlinear coupled oscillators. The experimental part considers a system with two scales—a linear large-scale oscillator coupled to a small scale by a nonlinear spring—and validates the theoretical findings demonstrating nonreciprocal large-to-small scale energy transfer. The proposed study promotes a paradigm for designing nonreciprocal acoustic materials harnessing strong nonlinearity, which in a future application will be implemented in designing lattices incorporating nonlinear hierarchical internal structures, asymmetry, and scale mixing.
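
    A minimal numerical sketch of the kind of two-scale system described above: a linear large-scale oscillator coupled to a much lighter attachment through a purely cubic (essentially nonlinear) spring. All parameter values are invented for illustration and do not reproduce the paper's experiments.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Large-scale linear oscillator (mass M, stiffness K) coupled to a light attachment
        # (mass m) through a purely cubic spring k3, with light damping c on the attachment.
        M, K, m, k3, c = 1.0, 1.0, 0.05, 5.0, 0.002

        def rhs(t, s):
            x, xd, y, yd = s
            coupling = k3 * (x - y) ** 3      # nonlinear coupling force
            return [xd, (-K * x - coupling) / M, yd, (coupling - c * yd) / m]

        # Impulsive excitation applied to the large scale only
        sol = solve_ivp(rhs, (0.0, 200.0), [0.0, 1.0, 0.0, 0.0], max_step=0.05)
        x, y = sol.y[0], sol.y[2]
        print(f"late-time amplitude ratio |y|/|x|: "
              f"{np.abs(y[-500:]).max() / np.abs(x[-500:]).max():.2f}")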

  14. Evaluating cloud processes in large-scale models: Of idealized case studies, parameterization testbeds and single-column modelling on climate time-scales

    NASA Astrophysics Data System (ADS)

    Neggers, Roel

    2016-04-01

    Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process-level". This means that parameterized physics are studied in isolation from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach are demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A common thread in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach), and iii) process-level evaluation at climate time-scales. The advantages and disadvantages of each approach will be identified and discussed, and some thoughts about possible future developments will be given.

  15. Manufacturing Process Developments for Regeneratively-Cooled Channel Wall Rocket Nozzles

    NASA Technical Reports Server (NTRS)

    Gradl, Paul; Brandsmeier, Will

    2016-01-01

    Regeneratively cooled channel wall nozzles incorporate a series of integral coolant channels to contain the coolant, maintain adequate wall temperatures, and expand hot gas providing engine thrust and specific impulse. NASA has been evaluating manufacturing techniques targeting large scale channel wall nozzles to support affordability of current and future liquid rocket engine nozzles and thrust chamber assemblies. The development of these large scale manufacturing techniques focuses on liner formation, channel slotting with advanced abrasive water-jet milling techniques, and closeout of the coolant channels to replace or augment other cost reduction techniques being evaluated for nozzles. NASA is developing a series of channel closeout techniques including large scale additive manufacturing laser deposition and explosively bonded closeouts. A series of subscale nozzles was completed to evaluate these processes. Fabrication of mechanical test and metallography samples, in addition to subscale hardware, has focused on Inconel 625, 300-series stainless, and aluminum alloys, as well as other candidate materials. Evaluations of these techniques are demonstrating potential for significant cost reductions for large scale nozzles and chambers. Hot fire testing using these techniques is planned in the future.

  16. Large-scale bioenergy production: how to resolve sustainability trade-offs?

    NASA Astrophysics Data System (ADS)

    Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag

    2018-02-01

    Large-scale 2nd generation bioenergy deployment is a key element of 1.5 °C and 2 °C transformation pathways. However, large-scale bioenergy production might have negative sustainability implications and thus may conflict with the Sustainable Development Goal (SDG) agenda. Here, we carry out a multi-criteria sustainability assessment of large-scale bioenergy crop production throughout the 21st century (300 EJ in 2100) using a global land-use model. Our analysis indicates that large-scale bioenergy production without complementary measures results in negative effects on the following sustainability indicators: deforestation, CO2 emissions from land-use change, nitrogen losses, unsustainable water withdrawals and food prices. One of our main findings is that single-sector environmental protection measures next to large-scale bioenergy production are prone to involve trade-offs among these sustainability indicators—at least in the absence of more efficient land or water resource use. For instance, if bioenergy production is accompanied by forest protection, deforestation and associated emissions (SDGs 13 and 15) decline substantially whereas food prices (SDG 2) increase. However, our study also shows that this trade-off strongly depends on the development of future food demand. In contrast to environmental protection measures, we find that agricultural intensification lowers some side-effects of bioenergy production substantially (SDGs 13 and 15) without generating new trade-offs—at least among the sustainability indicators considered here. Moreover, our results indicate that a combination of forest and water protection schemes, improved fertilization efficiency, and agricultural intensification would reduce the side-effects of bioenergy production most comprehensively. However, although our study includes more sustainability indicators than previous studies on bioenergy side-effects, our study represents only a small subset of all indicators relevant for the SDG agenda. Based on this, we argue that the development of policies for regulating externalities of large-scale bioenergy production should rely on broad sustainability assessments to discover potential trade-offs with the SDG agenda before implementation.

  17. Stream Discharge and Evapotranspiration Responses to Climate Change and Their Associated Uncertainties in a Large Semi-Arid Basin

    NASA Astrophysics Data System (ADS)

    Bassam, S.; Ren, J.

    2017-12-01

    Predicting future water availability in watersheds is very important for proper water resources management, especially in semi-arid regions with scarce water resources. Hydrological models have been considered powerful tools for predicting future hydrological conditions in watershed systems over the past two decades. Streamflow and evapotranspiration are two key components of watershed water balance estimation: the former is the most commonly used indicator of the overall water budget, and the latter is the second-largest component of the water budget (the largest outflow from the system). One of the main concerns in watershed scale hydrological modeling is the uncertainty associated with model predictions, which could arise from errors in model parameters and input meteorological data, or errors in model representation of the physics of hydrological processes. Understanding and quantifying these uncertainties are vital to water resources managers for proper decision making based on model predictions. In this study, we evaluated the impacts of different climate change scenarios on the future stream discharge and evapotranspiration, and their associated uncertainties, throughout a large semi-arid basin using a stochastically-calibrated, physically-based, semi-distributed hydrological model. The results of this study could provide valuable insights into applying hydrological models in large scale watersheds, understanding the associated sensitivity and uncertainties in model parameters, and estimating the corresponding impacts on the hydrological process variables of interest under different climate change scenarios.

  18. Modelling high Reynolds number wall–turbulence interactions in laboratory experiments using large-scale free-stream turbulence

    PubMed Central

    Dogan, Eda; Hearst, R. Jason

    2017-01-01

    A turbulent boundary layer subjected to free-stream turbulence is investigated in order to ascertain the scale interactions that dominate the near-wall region. The results are discussed in relation to a canonical high Reynolds number turbulent boundary layer because previous studies have reported considerable similarities between these two flows. Measurements were acquired simultaneously from four hot wires mounted to a rake which was traversed through the boundary layer. Particular focus is given to two main features of both canonical high Reynolds number boundary layers and boundary layers subjected to free-stream turbulence: (i) the footprint of the large scales in the logarithmic region on the near-wall small scales, specifically the modulating interaction between these scales, and (ii) the phase difference in amplitude modulation. The potential for a turbulent boundary layer subjected to free-stream turbulence to ‘simulate’ high Reynolds number wall–turbulence interactions is discussed. The results of this study have encouraging implications for future investigations of the fundamental scale interactions that take place in high Reynolds number flows as it demonstrates that these can be achieved at typical laboratory scales. This article is part of the themed issue ‘Toward the development of high-fidelity models of wall turbulence at large Reynolds number’. PMID:28167584
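
    For context only (not the authors' exact procedure), a single-point amplitude-modulation coefficient in the spirit of Mathis et al. can be estimated by correlating the large-scale signal with the low-pass-filtered envelope of the small-scale signal; the cutoff frequency, filter design and synthetic signal below are assumptions.

        import numpy as np
        from scipy.signal import hilbert, butter, filtfilt

        def amplitude_modulation_coefficient(u, fs, cutoff_hz):
            """Split a velocity signal u (sampled at fs Hz) into large and small scales at
            cutoff_hz, then correlate the large scale with the low-pass-filtered envelope
            of the small scale (single-point amplitude-modulation estimate)."""
            b, a = butter(4, cutoff_hz / (0.5 * fs), btype="low")
            u_large = filtfilt(b, a, u - u.mean())
            u_small = (u - u.mean()) - u_large
            envelope = np.abs(hilbert(u_small))
            env_large = filtfilt(b, a, envelope - envelope.mean())
            return np.corrcoef(u_large, env_large)[0, 1]

        # Synthetic test signal: a slow carrier whose amplitude modulates fast content
        fs = 10_000.0
        t = np.arange(0.0, 2.0, 1.0 / fs)
        slow = np.sin(2 * np.pi * 5.0 * t)
        fast = (1.0 + 0.5 * slow) * np.sin(2 * np.pi * 800.0 * t)
        print(round(amplitude_modulation_coefficient(slow + 0.3 * fast, fs, cutoff_hz=50.0), 2))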

  19. Modelling high Reynolds number wall-turbulence interactions in laboratory experiments using large-scale free-stream turbulence.

    PubMed

    Dogan, Eda; Hearst, R Jason; Ganapathisubramani, Bharathram

    2017-03-13

    A turbulent boundary layer subjected to free-stream turbulence is investigated in order to ascertain the scale interactions that dominate the near-wall region. The results are discussed in relation to a canonical high Reynolds number turbulent boundary layer because previous studies have reported considerable similarities between these two flows. Measurements were acquired simultaneously from four hot wires mounted to a rake which was traversed through the boundary layer. Particular focus is given to two main features of both canonical high Reynolds number boundary layers and boundary layers subjected to free-stream turbulence: (i) the footprint of the large scales in the logarithmic region on the near-wall small scales, specifically the modulating interaction between these scales, and (ii) the phase difference in amplitude modulation. The potential for a turbulent boundary layer subjected to free-stream turbulence to 'simulate' high Reynolds number wall-turbulence interactions is discussed. The results of this study have encouraging implications for future investigations of the fundamental scale interactions that take place in high Reynolds number flows as it demonstrates that these can be achieved at typical laboratory scales. This article is part of the themed issue 'Toward the development of high-fidelity models of wall turbulence at large Reynolds number'. © 2017 The Author(s).

  20. Projecting future impacts of hurricanes on the carbon balance of eastern U.S. forests

    NASA Astrophysics Data System (ADS)

    Fisk, J. P.; Hurtt, G. C.; Chambers, J. Q.; Zeng, H.; Dolan, K.; Flanagan, S.; Rourke, O.; Negron Juarez, R. I.

    2011-12-01

    In U.S. Atlantic coastal areas, hurricanes are a principal agent of catastrophic wind damage, with dramatic impacts on the structure and functioning of forests. Substantial recent progress has been made to estimate the biomass loss and resulting carbon emissions caused by hurricanes impacting the U.S. Additionally, efforts to evaluate the net effects of hurricanes on the regional carbon balance have demonstrated the importance of viewing large disturbance events in the broader context of recovery from a mosaic of past events. Viewed over sufficiently long time scales and large spatial scales, regrowth from previous storms may largely offset new emissions; however, changes in number, strength or spatial distribution of extreme disturbance events will result in changes to the equilibrium state of the ecosystem and have the potential to result in a lasting carbon source or sink. Many recent studies have linked climate change to changes in the frequency and intensity of hurricanes. In this study, we use a mechanistic ecosystem model, the Ecosystem Demography (ED) model, driven by scenarios of future hurricane activity based on historic activity and future climate projections, to evaluate how changes in hurricane frequency, intensity and spatial distribution could affect regional carbon storage and flux over the coming century. We find a non-linear response where increased storm activity reduces standing biomass stocks reducing the impacts of future events. This effect is highly dependent on the spatial pattern and repeat interval of future hurricane activity. Developing this kind of predictive modeling capability that tracks disturbance events and recovery is key to our understanding and ability to predict the carbon balance of forests.

  1. Evolution of Precipitation Extremes in Three Large Ensembles of Climate Simulations - Impact of Spatial and Temporal Resolutions

    NASA Astrophysics Data System (ADS)

    Martel, J. L.; Brissette, F.; Mailhot, A.; Wood, R. R.; Ludwig, R.; Frigon, A.; Leduc, M.; Turcotte, R.

    2017-12-01

    Recent studies indicate that the frequency and intensity of extreme precipitation will increase in future climate due to global warming. In this study, we compare annual maximum precipitation series from three large ensembles of climate simulations at various spatial and temporal resolutions. The first two are at the global scale: the Canadian Earth System Model (CanESM2) 50-member large ensemble (CanESM2-LE) at a 2.8° resolution and the Community Earth System Model (CESM1) 40-member large ensemble (CESM1-LE) at a 1° resolution. The third ensemble is at the regional scale over both Eastern North America and Europe: the Canadian Regional Climate Model (CRCM5) 50-member large ensemble (CRCM5-LE) at a 0.11° resolution, driven at its boundaries by the CanESM2-LE. The CRCM5-LE is a new ensemble produced within the ClimEx project (http://www.climex-project.org), a Québec-Bavaria collaboration. Using these three large ensembles, changes in extreme precipitation over the historical (1980-2010) and future (2070-2100) periods are investigated. This results in 1500 (30 years x 50 members for CanESM2-LE and CRCM5-LE) and 1200 (30 years x 40 members for CESM1-LE) simulated years over both the historical and future periods. Using these large datasets, the empirical daily (and sub-daily for CRCM5-LE) extreme precipitation quantiles for large return periods ranging from 2 to 100 years are computed. Results indicate that daily extreme precipitation will generally increase over most land grid points of both domains according to the three large ensembles. In the CRCM5-LE, the increase in sub-daily extreme precipitation is even larger than that for daily extreme precipitation. Considering that many public infrastructures have lifespans exceeding 75 years, the increase in extremes has important implications for the service levels of water infrastructure and public safety.
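
    A sketch of the empirical-quantile step described above: pool the annual maxima across members and years, then read off return levels with a plotting position (the Weibull choice and the synthetic maxima below are assumptions).

        import numpy as np

        def empirical_return_levels(annual_maxima, return_periods):
            """Empirical return levels from pooled annual maxima (e.g. 30 years x 50
            members = 1500 values) using the Weibull plotting position T = (n+1)/rank."""
            x = np.sort(np.asarray(annual_maxima, float))          # ascending
            n = x.size
            exceed_prob = 1.0 - np.arange(1, n + 1) / (n + 1.0)    # P(X > x_i)
            t = 1.0 / exceed_prob                                   # return period, years
            return np.interp(return_periods, t, x)

        # Hypothetical pooled maxima: 30 years x 50 members of daily precipitation (mm)
        rng = np.random.default_rng(3)
        maxima = rng.gumbel(loc=40.0, scale=10.0, size=30 * 50)
        print(np.round(empirical_return_levels(maxima, [2, 20, 100]), 1))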

  2. Sensitivity of tree ring growth to local and large-scale climate variability in a region of Southeastern Brazil

    NASA Astrophysics Data System (ADS)

    Venegas-González, Alejandro; Chagas, Matheus Peres; Anholetto Júnior, Claudio Roberto; Alvares, Clayton Alcarde; Roig, Fidel Alejandro; Tomazello Filho, Mario

    2016-01-01

    We explored the relationship between tree growth in two tropical species and local and large-scale climate variability in Southeastern Brazil. Tree ring width chronologies of Tectona grandis (teak) and Pinus caribaea (Caribbean pine) trees were compared with local climate indices (Water Requirement Satisfaction Index—WRSI, Standardized Precipitation Index—SPI, and Palmer Drought Severity Index—PDSI) and with large-scale climate indices describing equatorial Pacific sea surface temperature (Trans-Niño Index-TNI and Niño-3.4-N3.4) and atmospheric circulation variations in the Southern Hemisphere (Antarctic Oscillation-AAO). Teak trees showed positive correlations with the three local indices in the current summer and fall. A significant correlation between the WRSI index and Caribbean pine was observed in the dry season preceding tree ring formation. The influence of large-scale climate patterns was observed only for TNI and AAO: positive TNI values were associated with reduced radial growth of teak in the months preceding the growing season, while the positive phase of the AAO was associated with increased radial growth of teak from December to February of the previous growing season and with reduced radial growth of Caribbean pine from March to May of the current growing season. This new dendroclimatological study in Southeastern Brazil sheds light on local and large-scale climate influences on tree growth in recent decades and contributes to future climate change studies.
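
    A minimal sketch of the kind of screening analysis described above, correlating a standardized ring-width chronology against monthly values of a climate index, is given below. The synthetic data, function names and significance threshold are illustrative assumptions, not the study's actual workflow.

    ```python
    import numpy as np
    from scipy import stats

    def monthly_correlations(ring_width_index, monthly_climate):
        """Correlate a tree-ring chronology with monthly values of a climate index.

        ring_width_index : 1-D array, one standardized value per year (n years)
        monthly_climate  : 2-D array, shape (n years, 12), e.g. monthly SPI, PDSI,
                           TNI or AAO values for each year of the chronology
        Returns (month, r, p) tuples, the usual screening step before interpreting
        growth responses to local or large-scale climate variability.
        """
        results = []
        for month in range(monthly_climate.shape[1]):
            r, p = stats.pearsonr(ring_width_index, monthly_climate[:, month])
            results.append((month + 1, r, p))
        return results

    # Illustrative use with synthetic series standing in for a teak chronology
    rng = np.random.default_rng(42)
    years = 40
    climate = rng.normal(size=(years, 12))
    rwi = 0.6 * climate[:, 1] + rng.normal(scale=0.8, size=years)  # growth tied to Feb
    for month, r, p in monthly_correlations(rwi, climate):
        flag = "*" if p < 0.05 else ""
        print(f"month {month:2d}: r = {r:+.2f} {flag}")
    ```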

  3. Contribution of the infrasound technology to characterize large scale atmospheric disturbances and impact on infrasound monitoring

    NASA Astrophysics Data System (ADS)

    Blanc, Elisabeth; Le Pichon, Alexis; Ceranna, Lars; Pilger, Christoph; Charlton Perez, Andrew; Smets, Pieter

    2016-04-01

    The International Monitoring System (IMS) developed for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) provides a unique global description of atmospheric disturbances generating infrasound, such as extreme events (e.g. meteors, volcanoes, earthquakes, and severe weather) or human activity (e.g. explosions and supersonic airplanes). The analysis of the detected signals, recorded at global scale and over nearly 15 years at some stations, demonstrates that large-scale atmospheric disturbances strongly affect infrasound propagation. Their time scales vary from several tens of minutes to hours and days. Their effects are on average well resolved by current model predictions; however, an accurate spatial and temporal description is lacking in both weather and climate models. This study reviews recent results using the infrasound technology to characterize these large-scale disturbances, including (i) wind fluctuations induced by gravity waves, which generate infrasound partial reflections and modify the infrasound waveguide, (ii) convection from thunderstorms and mountain waves, which generate gravity waves, (iii) stratospheric warming events, which yield wind inversions in the stratosphere, and (iv) planetary waves, which control the global atmospheric circulation. Improved knowledge of these disturbances and their assimilation into future models is an important objective of the ARISE (Atmospheric dynamics Research InfraStructure in Europe) project. This is essential in the context of the future verification of the CTBT, as enhanced atmospheric models are necessary to assess the IMS network performance at higher resolution, reduce source location errors, and improve characterization methods.

  4. Large underground, liquid based detectors for astro-particle physics in Europe: scientific case and prospects

    NASA Astrophysics Data System (ADS)

    Autiero, D.; Äystö, J.; Badertscher, A.; Bezrukov, L.; Bouchez, J.; Bueno, A.; Busto, J.; Campagne, J.-E.; Cavata, Ch; Chaussard, L.; de Bellefon, A.; Déclais, Y.; Dumarchez, J.; Ebert, J.; Enqvist, T.; Ereditato, A.; von Feilitzsch, F.; Fileviez Perez, P.; Göger-Neff, M.; Gninenko, S.; Gruber, W.; Hagner, C.; Hess, M.; Hochmuth, K. A.; Kisiel, J.; Knecht, L.; Kreslo, I.; Kudryavtsev, V. A.; Kuusiniemi, P.; Lachenmaier, T.; Laffranchi, M.; Lefievre, B.; Lightfoot, P. K.; Lindner, M.; Maalampi, J.; Maltoni, M.; Marchionni, A.; Marrodán Undagoitia, T.; Marteau, J.; Meregaglia, A.; Messina, M.; Mezzetto, M.; Mirizzi, A.; Mosca, L.; Moser, U.; Müller, A.; Natterer, G.; Oberauer, L.; Otiougova, P.; Patzak, T.; Peltoniemi, J.; Potzel, W.; Pistillo, C.; Raffelt, G. G.; Rondio, E.; Roos, M.; Rossi, B.; Rubbia, A.; Savvinov, N.; Schwetz, T.; Sobczyk, J.; Spooner, N. J. C.; Stefan, D.; Tonazzo, A.; Trzaska, W.; Ulbricht, J.; Volpe, C.; Winter, J.; Wurm, M.; Zalewska, A.; Zimmermann, R.

    2007-11-01

    This document reports on a series of experimental and theoretical studies conducted to assess the astro-particle physics potential of three future large-scale particle detectors proposed in Europe as next-generation underground observatories. The proposed apparatuses employ three different and, to some extent, complementary detection techniques: GLACIER (liquid argon TPC), LENA (liquid scintillator) and MEMPHYS (water Cherenkov), all based on the use of large masses of liquids as active detection media. The results of these studies are presented along with a critical discussion of the performance attainable by the three proposed approaches coupled to existing or planned underground laboratories, in relation to open and outstanding physics issues such as the search for matter instability, the detection of astrophysical neutrinos and geo-neutrinos, and the possible use of these detectors with future high-intensity neutrino beams.

  5. Channel optimization of high-intensity laser beams in millimeter-scale plasmas

    DOE PAGES

    Ceurvorst, L.; Savin, A.; Ratan, N.; ...

    2018-04-20

    Channeling experiments were performed at the OMEGA EP facility using relativistic-intensity (>10¹⁸ W/cm²) kilojoule laser pulses through large density scale length (~390-570 μm) laser-produced plasmas, demonstrating the effects of the pulse's focal location and intensity as well as the plasma's temperature on the resulting channel formation. The results show deeper channeling when focused into hot plasmas and at lower densities, as expected. However, contrary to previous large-scale particle-in-cell studies, the results also indicate deeper penetration by short (10 ps), intense pulses compared to their longer-duration equivalents. To conclude, this new observation has many implications for future laser-plasma research in the relativistic regime.

  6. Channel optimization of high-intensity laser beams in millimeter-scale plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ceurvorst, L.; Savin, A.; Ratan, N.

    Channeling experiments were performed at the OMEGA EP facility using relativistic-intensity (>10¹⁸ W/cm²) kilojoule laser pulses through large density scale length (~390-570 μm) laser-produced plasmas, demonstrating the effects of the pulse's focal location and intensity as well as the plasma's temperature on the resulting channel formation. The results show deeper channeling when focused into hot plasmas and at lower densities, as expected. However, contrary to previous large-scale particle-in-cell studies, the results also indicate deeper penetration by short (10 ps), intense pulses compared to their longer-duration equivalents. To conclude, this new observation has many implications for future laser-plasma research in the relativistic regime.

  7. Computational Models of Consumer Confidence from Large-Scale Online Attention Data: Crowd-Sourcing Econometrics

    PubMed Central

    2015-01-01

    Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting. PMID:25826692

  8. Computational models of consumer confidence from large-scale online attention data: crowd-sourcing econometrics.

    PubMed

    Dong, Xianlei; Bollen, Johan

    2015-01-01

    Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting.
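
    The general idea of relating online search volumes to a survey-based confidence measure can be sketched as below. The weighted z-score aggregation and the synthetic series are illustrative assumptions only; they are not the published construction of the C3I index.

    ```python
    import numpy as np

    def behavioral_index(search_volumes, weights=None):
        """Aggregate z-scored search-volume series into one behavioral index.

        search_volumes : 2-D array, shape (n months, n query terms), e.g. monthly
                         Google Trends volumes for a basket of economic queries
        weights        : optional per-term weights (negative for 'pessimistic'
                         terms such as unemployment-related queries)
        """
        z = (search_volumes - search_volumes.mean(axis=0)) / search_volumes.std(axis=0)
        if weights is None:
            weights = np.ones(z.shape[1]) / z.shape[1]
        return z @ weights

    # Compare the index against a (here synthetic) survey-based confidence series
    rng = np.random.default_rng(7)
    months, terms = 60, 5
    trends = rng.normal(size=(months, terms)).cumsum(axis=0)      # synthetic volumes
    survey_cci = behavioral_index(trends) + rng.normal(scale=0.5, size=months)
    idx = behavioral_index(trends)
    corr = np.corrcoef(idx, survey_cci)[0, 1]
    print(f"correlation with survey confidence: {corr:.2f}")
    ```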

  9. Advances and trends in computational structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1986-01-01

    Recent developments in computational structural mechanics are reviewed with reference to computational needs for future structures technology, advances in computational models for material behavior, discrete element technology, assessment and control of numerical simulations of structural response, hybrid analysis, and techniques for large-scale optimization. Research areas in computational structural mechanics which have high potential for meeting future technological needs are identified. These include prediction and analysis of the failure of structural components made of new materials, development of computational strategies and solution methodologies for large-scale structural calculations, and assessment of reliability and adaptive improvement of response predictions.

  10. Imaging detectors and electronics—a view of the future

    NASA Astrophysics Data System (ADS)

    Spieler, Helmuth

    2004-09-01

    Imaging sensors and readout electronics have made tremendous strides in the past two decades. The application of modern semiconductor fabrication techniques and the introduction of customized monolithic integrated circuits have made large-scale imaging systems routine in high-energy physics. This technology is now finding its way into other areas, such as space missions, synchrotron light sources, and medical imaging. I review current developments and discuss the promise and limits of new technologies. Several detector systems are described as examples of future trends. The discussion emphasizes semiconductor detector systems, but I also include recent developments for large-scale superconducting detector arrays.

  11. Mathematics Teacher Education Quality in TEDS-M: Globalizing the Views of Future Teachers and Teacher Educators

    ERIC Educational Resources Information Center

    Hsieh, Feng-Jui; Law, Chiu-Keung; Shy, Haw-Yaw; Wang, Ting-Ying; Hsieh, Chia-Jui; Tang, Shu-Jyh

    2011-01-01

    The Teacher Education and Development Study in Mathematics, sponsored by the International Association for the Evaluation of Educational Achievement, is the first data-based study about mathematics teacher education with large-scale samples; this article is based on its data but develops a stand-alone conceptual framework to investigate the…

  12. Turning of COGS moves forward findings for hormonally mediated cancers.

    PubMed

    Sakoda, Lori C; Jorgenson, Eric; Witte, John S

    2013-04-01

    The large-scale Collaborative Oncological Gene-environment Study (COGS) presents new findings that further characterize the genetic bases of breast, ovarian and prostate cancers. We summarize and provide insights into this collection of papers from COGS and discuss the implications of the results and future directions for such efforts.

  13. The genetic architecture of coronary artery disease: current knowledge and future opportunities

    USDA-ARS?s Scientific Manuscript database

    Recent Findings Large-scale studies in human populations, coupled with rapid advances in genetic technologies over the last decade, have clearly established the association of common genetic variation with risk of CAD. However, the effect sizes of the susceptibility alleles are for the most part mod...

  14. The Oral Health Burden in the United States: A Summary of Recent Epidemiological Studies.

    ERIC Educational Resources Information Center

    Caplan, Daniel J.; Weintraub, Jane A.

    1993-01-01

    This article reviews recent large-scale epidemiological surveys of oral health in the United States, outlines risk factors for oral disease, and makes recommendations for future surveys. Discussion is limited to dental caries, periodontal diseases, tooth loss, edentulism, oral cancer, and orofacial clefts. (Author/MSE)

  15. Using complexity theory to develop a student-directed interprofessional learning activity for 1220 healthcare students.

    PubMed

    Jorm, Christine; Nisbet, Gillian; Roberts, Chris; Gordon, Christopher; Gentilcore, Stacey; Chen, Timothy F

    2016-08-08

    More and better interprofessional practice is predicated to be necessary to deliver good care to the patients of the future. However, universities struggle to create authentic learning activities that enable students to experience the dynamic interprofessional interactions common in healthcare and that can accommodate large interprofessional student cohorts. We investigated a large-scale mandatory interprofessional learning (IPL) activity for health professional students designed to promote social learning. A mixed methods research approach determined feasibility, acceptability and the extent to which student IPL outcomes were met. We developed an IPL activity founded in complexity theory to prepare students for future practice by engaging them in a self-directed (self-organised) learning activity with a diverse team, whose assessable products would be emergent creations. Complicated but authentic clinical cases (n = 12) were developed to challenge student teams (n = 5 or 6). Assessment consisted of a written management plan (academically marked) and a five-minute video (peer marked) designed to assess creative collaboration as well as provide evidence of integrated collective knowledge; the cohesive patient-centred management plan. All students (including the disciplines of diagnostic radiology, exercise physiology, medicine, nursing, occupational therapy, pharmacy, physiotherapy and speech pathology), completed all tasks successfully. Of the 26 % of students who completed the evaluation survey, 70 % agreed or strongly agreed that the IPL activity was worthwhile, and 87 % agreed or strongly agreed that their case study was relevant. Thematic analysis found overarching themes of engagement and collaboration-in-action suggesting that the IPL activity enabled students to achieve the intended learning objectives. Students recognised the contribution of others and described negotiation, collaboration and creation of new collective knowledge after working together on the complicated patient case studies. The novel video assessment was challenging to many students and contextual issues limited engagement for some disciplines. We demonstrated the feasibility and acceptability of a large scale IPL activity where design of cases, format and assessment tasks was founded in complexity theory. This theoretically based design enabled students to achieve complex IPL outcomes relevant to future practice. Future research could establish the psychometric properties of assessments of student performance in large-scale IPL events.

  16. A modeling approach to assess coastal management effects on benthic habitat quality: A case study on coastal defense and navigability

    NASA Astrophysics Data System (ADS)

    Cozzoli, Francesco; Smolders, Sven; Eelkema, Menno; Ysebaert, Tom; Escaravage, Vincent; Temmerman, Stijn; Meire, Patrick; Herman, Peter M. J.; Bouma, Tjeerd J.

    2017-01-01

    The natural coastal hydrodynamics and morphology worldwide are altered by human interventions such as embankments, shipping and dredging, which may have consequences for ecosystem functionality. Ensuring long-term ecological sustainability requires the capability to predict the long-term, large-scale ecological effects of altered hydromorphology. As empirical data sets at the relevant scales are missing, there is a need to integrate ecological modeling with physical modeling. This paper presents a case study showing the long-term, large-scale macrozoobenthic community response to two contrasting human alterations of the hydromorphological habitat: deepening of estuarine channels to enhance navigability (Westerschelde) vs. realization of a storm surge barrier to enhance coastal safety (Oosterschelde). A multidisciplinary integration of empirical data and modeling of estuarine morphology, hydrodynamics and benthic ecology was used to reconstruct the hydrological evolution and the resulting long-term (50 years), large-scale ecological trends for both estuaries. Our model indicated that hydrodynamic alterations following the deepening of the Westerschelde had negative implications for benthic life, while the realization of the Oosterschelde storm surge barrier produced mixed, habitat-dependent responses, including an unexpected improvement in environmental quality. Our analysis illustrates long-term trends in the natural community caused by opposing management strategies. The divergent human pressures on the Oosterschelde and Westerschelde are examples of what could happen in the near future to many coastal ecosystems worldwide. The comparative analysis of the two basins is a valuable source of information for understanding (and communicating) the future ecological consequences of human coastal development.

  17. Large-scale academic achievement testing of deaf and hard-of-hearing students: past, present, and future.

    PubMed

    Qi, Sen; Mitchell, Ross E

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using the Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the validity and reliability of using the Stanford for this special student population still require extensive scrutiny. Recent shifts in the educational policy environment, which require that schools enable all children to achieve proficiency through accountability testing, warrant a close examination of the adequacy and relevance of the current large-scale testing of deaf and hard-of-hearing students. This study has three objectives: (a) it will summarize the historical data over the last three decades to indicate trends in academic achievement for this special population, (b) it will analyze the current federal laws and regulations related to educational testing and special education, thereby identifying gaps between policy and practice in the field, especially the limitations of current testing programs in assessing what deaf and hard-of-hearing students know, and (c) it will offer some insights and suggestions for future testing programs for deaf and hard-of-hearing students.

  18. The future of management: The NASA paradigm

    NASA Technical Reports Server (NTRS)

    Harris, Philip R.

    1992-01-01

    Prototypes of 21st-century management, especially for large-scale enterprises, may well be found within the aerospace industry. The space era inaugurated a number of projects of such scope and magnitude that another type of management had to be created to ensure successful achievement. The challenges will be not just technological and managerial, but also human and cultural in dimension. Futurists, students of management, and those concerned with technological administration would do well to review the literature of emerging space management for its wider implications. NASA offers a paradigm, or demonstrated model, of future trends in the field of management at large. More research is needed on issues of leadership for Earth-based projects in space and for space-based programs with managers on site. There is a need to recognize that large-scale technical enterprises, such as those undertaken in space, require a new form of management. NASA and other responsible agencies are urged to study excellence in space macromanagement, including the necessary multidisciplinary skills. Two recommended targets are the application of general living systems theory and macromanagement concepts for space stations in the 1990s.

  19. Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patchett, John M; Ahrens, James P; Lo, Li - Ta

    2010-10-15

    Extremely large-scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on the scalability of rendering algorithms and architectures envisioned for exascale datasets. To understand the tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software-based ray tracing, software-based rasterization and hardware-accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU- and CPU-based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find that software-based ray tracing offers a viable approach for scalable rendering of the projected future massive data sizes.

  20. Development of out-of-hours primary care by general practitioners (GPs) in The Netherlands: from small-call rotations to large-scale GP cooperatives.

    PubMed

    van Uden, Caro J T; Giesen, Paul H J; Metsemakers, Job F M; Grol, Richard P T M

    2006-09-01

    Over the last 10 years, care outside office hours by primary care physicians in The Netherlands has experienced a radical change. While Dutch general practitioners (GPs) formerly performed these services in small-call rotations, care is nowadays delivered by large-scale GP cooperatives. We searched the literature for relevant studies on the effect of the out-of-hours care reorganization in The Netherlands. We identified research that included before- and after-intervention studies, descriptive studies, and surveys. These studies focused on the consequences of reorganizing several aspects of out-of-hours care, such as patient and GP satisfaction, patient characteristics, utilization of care, and costs. Various studies showed that the reorganization has successfully addressed many of the critical issues that Dutch GPs were confronted with in delivering these services. GPs' job satisfaction has increased, and patients seem to be satisfied with current out-of-hours care. Several aspects of out-of-hours care are discussed, such as telephone triage, self-referrals, and future expectations, which should receive extra attention from researchers and health policy makers in the near future.

  1. Projections of Water Stress Based on an Ensemble of Socioeconomic Growth and Climate Change Scenarios: A Case Study in Asia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fant, Charles; Schlosser, C. Adam; Gao, Xiang

    The sustainability of future water resources is of paramount importance and is affected by many factors, including population, wealth and climate. Inherent in current methods to estimate these factors in the future is the uncertainty of their prediction. In this study, we integrate a large ensemble of scenarios—internally consistent across economics, emissions, climate, and population—to develop a risk portfolio of water stress over a large portion of Asia that includes China, India, and Mainland Southeast Asia in a future with unconstrained emissions. We isolate the effects of socioeconomic growth from the effects of climate change in order to identify the primary drivers of stress on water resources. We find that water needs related to socioeconomic changes, which are currently small, are likely to increase considerably in the future, often overshadowing the effect of climate change on levels of water stress. As a result, there is a high risk of severe water stress in densely populated watersheds by 2050, compared to recent history. There is strong evidence to suggest that, in the absence of autonomous adaptation or societal response, a much larger portion of the region's population will live in water-stressed regions in the near future. Lastly, tools and studies such as these can effectively investigate large-scale system sensitivities and can be useful in engaging and informing decision makers.

  2. Identification and Characterization of Genomic Amplifications in Ovarian Serous Carcinoma

    DTIC Science & Technology

    2009-07-01

    oncogenes, Rsf1 and Notch3, which were up-regulated in both genomic DNA and transcript levels in ovarian cancer. In a large-scale FISH analysis, Rsf1...associated with worse disease outcome, suggesting that Rsf1 could be potentially used as a prognostic marker in the future (Appendix #1). For the...over-expressed in a recurrent carcinoma. Although the follow-up study in a larger-scale sample size did not demonstrate clear amplification in NAC1

  3. Projections of Water Stress Based on an Ensemble of Socioeconomic Growth and Climate Change Scenarios: A Case Study in Asia.

    PubMed

    Fant, Charles; Schlosser, C Adam; Gao, Xiang; Strzepek, Kenneth; Reilly, John

    2016-01-01

    The sustainability of future water resources is of paramount importance and is affected by many factors, including population, wealth and climate. Inherent in current methods to estimate these factors in the future is the uncertainty of their prediction. In this study, we integrate a large ensemble of scenarios--internally consistent across economics, emissions, climate, and population--to develop a risk portfolio of water stress over a large portion of Asia that includes China, India, and Mainland Southeast Asia in a future with unconstrained emissions. We isolate the effects of socioeconomic growth from the effects of climate change in order to identify the primary drivers of stress on water resources. We find that water needs related to socioeconomic changes, which are currently small, are likely to increase considerably in the future, often overshadowing the effect of climate change on levels of water stress. As a result, there is a high risk of severe water stress in densely populated watersheds by 2050, compared to recent history. There is strong evidence to suggest that, in the absence of autonomous adaptation or societal response, a much larger portion of the region's population will live in water-stressed regions in the near future. Tools and studies such as these can effectively investigate large-scale system sensitivities and can be useful in engaging and informing decision makers.

  4. Projections of Water Stress Based on an Ensemble of Socioeconomic Growth and Climate Change Scenarios: A Case Study in Asia

    PubMed Central

    Fant, Charles; Schlosser, C. Adam; Gao, Xiang; Strzepek, Kenneth; Reilly, John

    2016-01-01

    The sustainability of future water resources is of paramount importance and is affected by many factors, including population, wealth and climate. Inherent in current methods to estimate these factors in the future is the uncertainty of their prediction. In this study, we integrate a large ensemble of scenarios—internally consistent across economics, emissions, climate, and population—to develop a risk portfolio of water stress over a large portion of Asia that includes China, India, and Mainland Southeast Asia in a future with unconstrained emissions. We isolate the effects of socioeconomic growth from the effects of climate change in order to identify the primary drivers of stress on water resources. We find that water needs related to socioeconomic changes, which are currently small, are likely to increase considerably in the future, often overshadowing the effect of climate change on levels of water stress. As a result, there is a high risk of severe water stress in densely populated watersheds by 2050, compared to recent history. There is strong evidence to suggest that, in the absence of autonomous adaptation or societal response, a much larger portion of the region’s population will live in water-stressed regions in the near future. Tools and studies such as these can effectively investigate large-scale system sensitivities and can be useful in engaging and informing decision makers. PMID:27028871

  5. Large-Scale Academic Achievement Testing of Deaf and Hard-of-Hearing Students: Past, Present, and Future

    ERIC Educational Resources Information Center

    Qi, Sen; Mitchell, Ross E.

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the…

  6. Reliability of plasma polar metabolite concentrations in a large-scale cohort study using capillary electrophoresis-mass spectrometry.

    PubMed

    Harada, Sei; Hirayama, Akiyoshi; Chan, Queenie; Kurihara, Ayako; Fukai, Kota; Iida, Miho; Kato, Suzuka; Sugiyama, Daisuke; Kuwabara, Kazuyo; Takeuchi, Ayano; Akiyama, Miki; Okamura, Tomonori; Ebbels, Timothy M D; Elliott, Paul; Tomita, Masaru; Sato, Asako; Suzuki, Chizuru; Sugimoto, Masahiro; Soga, Tomoyoshi; Takebayashi, Toru

    2018-01-01

    Cohort studies with metabolomics data are becoming more widespread; however, large-scale studies involving tens of thousands of participants are still limited, especially in Asian populations. Therefore, we started the Tsuruoka Metabolomics Cohort Study enrolling 11,002 community-dwelling adults in Japan, and using capillary electrophoresis-mass spectrometry (CE-MS) and liquid chromatography-mass spectrometry. The CE-MS method is highly amenable to absolute quantification of polar metabolites; however, its reliability for large-scale measurement is unclear. The aim of this study is to examine the reproducibility and validity of large-scale CE-MS measurements. In addition, the study presents absolute concentrations of polar metabolites in human plasma, which can be used in the future as reference ranges in a Japanese population. Metabolomic profiling of 8,413 fasting plasma samples was completed using CE-MS, and 94 polar metabolites were structurally identified and quantified. Quality control (QC) samples were injected every ten samples and assessed throughout the analysis. Inter- and intra-batch coefficients of variation of QC and participant samples, and technical intraclass correlation coefficients, were estimated. Passing-Bablok regression of plasma concentrations by CE-MS on serum concentrations by standard clinical chemistry assays was conducted for creatinine and uric acid. In QC samples, the coefficient of variation was less than 20% for 64 metabolites, and less than 30% for 80 metabolites out of the 94 metabolites. The inter-batch coefficient of variation was less than 20% for 81 metabolites. The estimated technical intraclass correlation coefficient was above 0.75 for 67 metabolites. The slope of the Passing-Bablok regression was estimated as 0.97 (95% confidence interval: 0.95, 0.98) for creatinine and 0.95 (0.92, 0.96) for uric acid. Compared to published data from other large cohort measurement platforms, the reproducibility of metabolites common to the platforms was similar to or better than that in the other studies. These results show that our CE-MS platform is suitable for conducting large-scale epidemiological studies.
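
    The reproducibility metrics reported above (overall and intra-batch coefficients of variation from repeated QC injections) can be computed along the following lines. This is a hedged sketch with synthetic concentrations; the array layout and batch structure are assumptions, not the study's pipeline.

    ```python
    import numpy as np

    def qc_coefficients_of_variation(qc_values, batch_ids):
        """Reproducibility metrics for one metabolite from repeated QC injections.

        qc_values : 1-D array of QC concentrations for a single metabolite,
                    in injection order (one QC every ten study samples)
        batch_ids : 1-D array giving the analytical batch of each QC injection
        Returns the overall CV (%) and the mean within-batch (intra-batch) CV (%).
        """
        qc = np.asarray(qc_values, dtype=float)
        batches = np.asarray(batch_ids)
        overall_cv = 100.0 * qc.std(ddof=1) / qc.mean()
        intra = []
        for b in np.unique(batches):
            vals = qc[batches == b]
            if vals.size > 1:
                intra.append(100.0 * vals.std(ddof=1) / vals.mean())
        return overall_cv, float(np.mean(intra))

    # Illustrative use: 8 batches x 12 QC injections of one metabolite
    rng = np.random.default_rng(3)
    batch = np.repeat(np.arange(8), 12)
    conc = 50.0 + rng.normal(scale=4.0, size=batch.size) + 2.0 * (batch % 2)
    overall, intra = qc_coefficients_of_variation(conc, batch)
    print(f"overall CV {overall:.1f}%, mean intra-batch CV {intra:.1f}%")
    ```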

  7. Resources for Functional Genomics Studies in Drosophila melanogaster

    PubMed Central

    Mohr, Stephanie E.; Hu, Yanhui; Kim, Kevin; Housden, Benjamin E.; Perrimon, Norbert

    2014-01-01

    Drosophila melanogaster has become a system of choice for functional genomic studies. Many resources, including online databases and software tools, are now available to support design or identification of relevant fly stocks and reagents or analysis and mining of existing functional genomic, transcriptomic, proteomic, etc. datasets. These include large community collections of fly stocks and plasmid clones, “meta” information sites like FlyBase and FlyMine, and an increasing number of more specialized reagents, databases, and online tools. Here, we introduce key resources useful to plan large-scale functional genomics studies in Drosophila and to analyze, integrate, and mine the results of those studies in ways that facilitate identification of highest-confidence results and generation of new hypotheses. We also discuss ways in which existing resources can be used and might be improved and suggest a few areas of future development that would further support large- and small-scale studies in Drosophila and facilitate use of Drosophila information by the research community more generally. PMID:24653003

  8. Efficiency and economics of large scale hydrogen liquefaction. [for future generation aircraft requirements

    NASA Technical Reports Server (NTRS)

    Baker, C. R.

    1975-01-01

    Liquid hydrogen is being considered as a substitute for conventional hydrocarbon-based fuels for future generations of commercial jet aircraft. Its acceptance will depend, in part, upon the technology and cost of liquefaction. The process and economic requirements for providing a sufficient quantity of liquid hydrogen to service a major airport are described. The design is supported by thermodynamic studies which determine the effect of process arrangement and operating parameters on the process efficiency and work of liquefaction.

  9. Appropriate experimental ecosystem warming methods by ecosystem, objective, and practicality

    Treesearch

    E.L. Aronson; S.G. McNulty

    2009-01-01

    The temperature of the Earth is rising, and is highly likely to continue to do so for the foreseeable future. The study of the effects of sustained heating on the ecosystems of the world is necessary so that we might predict and respond to coming changes on both large and small spatial scales. To this end, ecosystem warming studies have...

  10. The Future of Wind Energy in California: Future Projections in Variable-Resolution CESM

    NASA Astrophysics Data System (ADS)

    Wang, M.; Ullrich, P. A.; Millstein, D.; Collier, C.

    2017-12-01

    This study focuses on wind energy characterization and future projections at five primary wind turbine sites in California. Historical (1980-2000) and mid-century (2030-2050) simulations were produced using the Variable-Resolution Community Earth System Model (VR-CESM) to analyze the trends and variations in wind energy under climate change. Datasets from Det Norske Veritas Germanischer Lloyd (DNV GL), MERRA-2, CFSR, NARR, as well as surface observational data, were used for model validation and comparison. Significant seasonal wind speed changes under RCP8.5 were detected at several wind farm sites. Large-scale patterns were then investigated to analyze the synoptic-scale impact on localized wind change. The agglomerative clustering method was applied to analyze and group different wind patterns. The associated meteorological background of each cluster was investigated to analyze the drivers of different wind patterns. This study improves the characterization of uncertainty around the magnitude and variability in space and time of California's wind resources in the near future, and also enhances understanding of the physical mechanisms related to the trends in wind resource variability.
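
    A minimal sketch of the agglomerative clustering step mentioned above, grouping daily large-scale wind-anomaly fields into recurrent patterns with Ward linkage, is shown below. The synthetic data, the choice of Ward linkage and the number of clusters are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    def cluster_circulation_patterns(daily_fields, n_clusters=4):
        """Group daily large-scale wind fields into recurrent patterns.

        daily_fields : 2-D array, shape (n days, n grid points), e.g. flattened
                       850-hPa wind anomaly maps for each day of the record
        Ward agglomerative clustering is one common choice for this step.
        """
        z = linkage(daily_fields, method="ward")
        labels = fcluster(z, t=n_clusters, criterion="maxclust")
        return labels

    # Illustrative use with synthetic anomaly maps
    rng = np.random.default_rng(11)
    days, gridpoints = 300, 50
    fields = rng.normal(size=(days, gridpoints))
    fields[:100] += 1.5            # crude stand-in for a recurrent regime
    labels = cluster_circulation_patterns(fields, n_clusters=4)
    for k in range(1, 5):
        print(f"cluster {k}: {np.sum(labels == k)} days")
    ```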

  11. Mechanisation of large-scale agricultural fields in developing countries - a review.

    PubMed

    Onwude, Daniel I; Abdulstter, Rafia; Gomes, Chandima; Hashim, Norhashila

    2016-09-01

    Mechanisation of large-scale agricultural fields often requires the application of modern technologies such as mechanical power, automation, control and robotics. These technologies are generally associated with relatively well developed economies. The application of these technologies in some developing countries in Africa and Asia is limited by factors such as technology compatibility with the environment, availability of resources to facilitate the technology adoption, cost of technology purchase, government policies, adequacy of technology and appropriateness in addressing the needs of the population. As a result, many of the available resources have been used inadequately by farmers, who continue to rely mostly on conventional means of agricultural production, using traditional tools and equipment in most cases. This has led to low productivity and high cost of production, among others. Therefore, this paper attempts to evaluate the application of present-day technology and its limitations to the advancement of large-scale mechanisation in developing countries of Africa and Asia. Particular emphasis is given to a general understanding of the various levels of mechanisation, present-day technology, its management and application to large-scale agricultural fields. This review also emphasizes a future outlook that would enable gradual, evolutionary and sustainable technological change. The study concludes that large-scale agricultural farm mechanisation for sustainable food production in Africa and Asia must be anchored on a coherent strategy based on the actual needs and priorities of the large-scale farmers. © 2016 Society of Chemical Industry.

  12. Outlook and Challenges of Perovskite Solar Cells toward Terawatt-Scale Photovoltaic Module Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Kai; Kim, Donghoe; Whitaker, James B

    Rapid development of perovskite solar cells (PSCs) during the past several years has made this photovoltaic (PV) technology a serious contender for potential large-scale deployment on the terawatt scale in the PV market. To successfully transition PSC technology from the laboratory to industry scale, substantial efforts need to focus on scalable fabrication of high-performance perovskite modules with minimum negative environmental impact. Here, we provide an overview of the current research and our perspective regarding PSC technology toward future large-scale manufacturing and deployment. Several key challenges discussed are (1) a scalable process for large-area perovskite module fabrication; (2) less hazardous chemical routes for PSC fabrication; and (3) suitable perovskite module designs for different applications.

  13. Study and design of cryogenic propellant acquisition systems. Volume 1: Design studies

    NASA Technical Reports Server (NTRS)

    Burge, G. W.; Blackmon, J. B.

    1973-01-01

    An in-depth study and selection of practical propellant surface tension acquisition system designs for two specific future cryogenic space vehicles, an advanced cryogenic space shuttle auxiliary propulsion system and an advanced space propulsion module is reported. A supporting laboratory scale experimental program was also conducted to provide design information critical to concept finalization and selection. Designs using localized pressure isolated surface tension screen devices were selected for each application and preliminary designs were generated. Based on these designs, large scale acquisition prototype hardware was designed and fabricated to be compatible with available NASA-MSFC feed system hardware.

  14. Cosmological measurements with general relativistic galaxy correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raccanelli, Alvise; Montanari, Francesco; Durrer, Ruth

    We investigate the cosmological dependence and the constraining power of large-scale galaxy correlations, including all redshift-space distortion, wide-angle, lensing and gravitational potential effects on linear scales. We analyze the cosmological information present in the lensing convergence and in the gravitational potential terms describing the so-called "relativistic effects", and we find that, while smaller than the information contained in intrinsic galaxy clustering, it is not negligible. We investigate how neglecting these terms biases cosmological measurements performed by future spectroscopic and photometric large-scale surveys such as SKA and Euclid. We perform a Fisher analysis using the CLASS code, modified to include scale-dependent galaxy bias and redshift-dependent magnification and evolution bias. Our results show that neglecting relativistic terms, especially the lensing convergence, introduces an error in the forecasted precision of cosmological parameter measurements of the order of a few tens of percent, in particular when measuring the matter content of the Universe and primordial non-Gaussianity parameters. The analysis suggests a possible substantial systematic error in cosmological parameter constraints. Therefore, we argue that radial correlations and integrated relativistic terms need to be taken into account when forecasting the constraining power of future large-scale galaxy number count surveys.
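
    A generic Fisher forecast of the kind referred to above can be sketched as follows. This is not the modified CLASS pipeline; the toy two-parameter model, the numerical-derivative step and the diagonal data covariance are illustrative assumptions.

    ```python
    import numpy as np

    def fisher_matrix(model, theta0, sigma, step=1e-3):
        """Generic Fisher forecast for independent Gaussian observables.

        model  : callable mapping a parameter vector to the predicted data
                 vector (e.g. binned galaxy power spectra / number counts)
        theta0 : fiducial parameter values
        sigma  : 1-sigma uncertainty of each data point
        F_ij = sum_k (dO_k/dtheta_i)(dO_k/dtheta_j) / sigma_k^2
        """
        theta0 = np.asarray(theta0, dtype=float)
        n = theta0.size
        derivs = []
        for i in range(n):
            dp = np.zeros(n)
            dp[i] = step * max(abs(theta0[i]), 1.0)
            # central finite difference of the data vector w.r.t. parameter i
            derivs.append((model(theta0 + dp) - model(theta0 - dp)) / (2 * dp[i]))
        derivs = np.array(derivs)                      # shape (n params, n data)
        return derivs @ np.diag(1.0 / np.asarray(sigma) ** 2) @ derivs.T

    # Toy model standing in for a binned power spectrum: amplitude and tilt
    k = np.logspace(-2, -1, 20)
    toy = lambda th: th[0] * (k / 0.05) ** th[1]
    F = fisher_matrix(toy, theta0=[1.0, 0.96], sigma=0.05 * toy([1.0, 0.96]))
    errors = np.sqrt(np.diag(np.linalg.inv(F)))        # marginalized 1-sigma errors
    print(errors)
    ```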

  15. A new resource for developing and strengthening large-scale community health worker programs.

    PubMed

    Perry, Henry; Crigler, Lauren; Lewin, Simon; Glenton, Claire; LeBan, Karen; Hodgins, Steve

    2017-01-12

    Large-scale community health worker programs are now growing in importance around the world in response to the resurgence of interest in, and growing evidence of the importance of, community-based primary health care for improving the health of populations in resource-constrained, high-mortality settings. These programs, because of their scale and operational challenges, merit special consideration by the global health community, national policy-makers, and program implementers. A new online resource is now available to assist in that effort: Developing and Strengthening Community Health Worker Programs at Scale: A Reference Guide and Case Studies for Program Managers and Policymakers (http://www.mchip.net/CHWReferenceGuide). This CHW Reference Guide is the product of 27 different collaborators who, collectively, have a formidable breadth and depth of experience and knowledge about CHW programming around the world. It provides a thoughtful discussion of the many operational issues that large-scale CHW programs need to address as they undergo the process of development, expansion or strengthening. Detailed case studies of 12 national CHW programs are included in the Appendix, the most current and complete set of case studies currently available. Future articles in this journal will highlight many of the themes in the CHW Reference Guide and provide an update on recent advances and experiences. These articles will serve, we hope, to (1) increase awareness of the CHW Reference Guide and its usefulness and (2) connect a broader audience to the critical importance of strengthening large-scale CHW programs for the health benefits that they can bring to underserved populations around the world.

  16. Imprint of thawing scalar fields on the large scale galaxy overdensity

    NASA Astrophysics Data System (ADS)

    Dinda, Bikash R.; Sen, Anjan A.

    2018-04-01

    We investigate the observed galaxy power spectrum for the thawing class of scalar field models, taking into account various general relativistic corrections that occur on very large scales. We consider the full general relativistic perturbation equations for the matter as well as the dark energy fluid. We form a single autonomous system of equations containing both the background and the perturbed equations of motion, which we subsequently solve for different scalar field potentials. First, we study the percentage deviation from the ΛCDM model in different cosmological parameters as well as in the observed galaxy power spectra on different scales, for scalar field models with various choices of potential. Interestingly, the difference in background expansion results in an enhancement of power relative to ΛCDM on small scales, whereas the inclusion of general relativistic (GR) corrections results in a suppression of power relative to ΛCDM on large scales. This can be useful for distinguishing scalar field models from ΛCDM with future optical/radio surveys. We also compare the observed galaxy power spectra for tracking and thawing types of scalar field using particular choices of the scalar field potentials. We show that thawing and tracking models can have large differences in the observed galaxy power spectra on large scales and at smaller redshifts due to different GR effects. On smaller scales and at larger redshifts, the difference is small and is mainly due to the difference in background expansion.

  17. The future of primordial features with large-scale structure surveys

    NASA Astrophysics Data System (ADS)

    Chen, Xingang; Dvorkin, Cora; Huang, Zhiqi; Namjoo, Mohammad Hossein; Verde, Licia

    2016-11-01

    Primordial features are one of the most important extensions of the Standard Model of cosmology, providing a wealth of information on the primordial Universe, ranging from discrimination between inflation and alternative scenarios, new particle detection, to fine structures in the inflationary potential. We study the prospects of future large-scale structure (LSS) surveys on the detection and constraints of these features. We classify primordial feature models into several classes, and for each class we present a simple template of power spectrum that encodes the essential physics. We study how well the most ambitious LSS surveys proposed to date, including both spectroscopic and photometric surveys, will be able to improve the constraints with respect to the current Planck data. We find that these LSS surveys will significantly improve the experimental sensitivity on features signals that are oscillatory in scales, due to the 3D information. For a broad range of models, these surveys will be able to reduce the errors of the amplitudes of the features by a factor of 5 or more, including several interesting candidates identified in the recent Planck data. Therefore, LSS surveys offer an impressive opportunity for primordial feature discovery in the next decade or two. We also compare the advantages of both types of surveys.
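
    For orientation, schematic feature templates of the kind referred to above are often written as small oscillatory corrections to a smooth spectrum: oscillations linear in k for sharp features and linear in ln k for resonant features. The forms below are generic illustrations from the literature, not necessarily the exact templates used in this paper.

    ```latex
    % Schematic primordial feature templates (illustrative only)
    \[
      \frac{\Delta P(k)}{P_0(k)} \;\simeq\;
      \begin{cases}
        A \sin\!\left(\omega k + \phi\right), & \text{sharp feature,}\\[4pt]
        A \sin\!\left(\omega \ln (k/k_*) + \phi\right), & \text{resonant feature,}
      \end{cases}
    \]
    % P_0(k): smooth (featureless) power spectrum; A: feature amplitude;
    % \omega: feature frequency; \phi: phase; k_*: arbitrary pivot scale.
    ```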

  18. The future of primordial features with large-scale structure surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xingang; Namjoo, Mohammad Hossein; Dvorkin, Cora

    2016-11-01

    Primordial features are one of the most important extensions of the Standard Model of cosmology, providing a wealth of information on the primordial Universe, ranging from discrimination between inflation and alternative scenarios, new particle detection, to fine structures in the inflationary potential. We study the prospects of future large-scale structure (LSS) surveys on the detection and constraints of these features. We classify primordial feature models into several classes, and for each class we present a simple template of power spectrum that encodes the essential physics. We study how well the most ambitious LSS surveys proposed to date, including both spectroscopic and photometric surveys, will be able to improve the constraints with respect to the current Planck data. We find that these LSS surveys will significantly improve the experimental sensitivity on features signals that are oscillatory in scales, due to the 3D information. For a broad range of models, these surveys will be able to reduce the errors of the amplitudes of the features by a factor of 5 or more, including several interesting candidates identified in the recent Planck data. Therefore, LSS surveys offer an impressive opportunity for primordial feature discovery in the next decade or two. We also compare the advantages of both types of surveys.

  19. TEDS-M Encyclopedia: A Guide to Teacher Education Context, Structure, and Quality Assurance in 17 Countries. Findings from the IEA Teacher Education and Development Study in Mathematics (TEDS-M)

    ERIC Educational Resources Information Center

    Schwille, John, Ed.; Ingvarson, Lawrence, Ed.; Holdgreve-Resendez, Richard, Ed.

    2013-01-01

    The IEA Teacher Education and Development Study in Mathematics (TEDS-M) is the first large-scale international study of the preparation of primary and lower-secondary teachers. The study investigated the pedagogical and subject-specific knowledge that future primary and lower secondary school teachers acquire during their mathematics teacher…

  20. Simulating the impact of the large-scale circulation on the 2-m temperature and precipitation climatology

    NASA Astrophysics Data System (ADS)

    Bowden, Jared H.; Nolte, Christopher G.; Otte, Tanya L.

    2013-04-01

    The impact of the simulated large-scale atmospheric circulation on the regional climate is examined using the Weather Research and Forecasting (WRF) model as a regional climate model. The purpose is to understand the potential need for interior grid nudging for dynamical downscaling of global climate model (GCM) output for air quality applications under a changing climate. In this study we downscale the NCEP-Department of Energy Atmospheric Model Intercomparison Project (AMIP-II) Reanalysis using three continuous 20-year WRF simulations: one simulation without interior grid nudging and two using different interior grid nudging methods. The biases in 2-m temperature and precipitation for the simulation without interior grid nudging are unreasonably large with respect to the North American Regional Reanalysis (NARR) over the eastern half of the contiguous United States (CONUS) during the summer when air quality concerns are most relevant. This study examines how these differences arise from errors in predicting the large-scale atmospheric circulation. It is demonstrated that the Bermuda high, which strongly influences the regional climate for much of the eastern half of the CONUS during the summer, is poorly simulated without interior grid nudging. In particular, two summers when the Bermuda high was west (1993) and east (2003) of its climatological position are chosen to illustrate problems in the large-scale atmospheric circulation anomalies. For both summers, WRF without interior grid nudging fails to simulate the placement of the upper-level anticyclonic (1993) and cyclonic (2003) circulation anomalies. The displacement of the large-scale circulation impacts the lower atmosphere moisture transport and precipitable water, affecting the convective environment and precipitation. Using interior grid nudging improves the large-scale circulation aloft and moisture transport/precipitable water anomalies, thereby improving the simulated 2-m temperature and precipitation. The results demonstrate that constraining the RCM to the large-scale features in the driving fields improves the overall accuracy of the simulated regional climate, and suggest that in the absence of such a constraint, the RCM will likely misrepresent important large-scale shifts in the atmospheric circulation under a future climate.

  1. Projections of the Ganges-Brahmaputra precipitation: downscaled from GCM predictors

    USGS Publications Warehouse

    Pervez, Md Shahriar; Henebry, Geoffrey M.

    2014-01-01

    Downscaling Global Climate Model (GCM) projections of future climate is critical for impact studies. Downscaling enables the use of GCM experiments for regional-scale impact studies by generating regionally specific forecasts connecting global-scale predictions and regional-scale dynamics. We employed the Statistical Downscaling Model (SDSM) to downscale 21st-century precipitation for two data-sparse, hydrologically challenging river basins in South Asia—the Ganges and the Brahmaputra. We used predictors from the Canadian Centre for Climate Modelling and Analysis coupled GCM, version 3.1 (CGCM3.1), in downscaling the precipitation. Downscaling was performed on the basis of established relationships between historical Global Summary of the Day observed precipitation records from 43 stations and National Centers for Environmental Prediction (NCEP) re-analysis large-scale atmospheric predictors. Although the selection of predictors was challenging during the set-up of SDSM, they were found to be indicative of important physical forcings in the basins. The precipitation of both basins was largely influenced by geopotential height: the Ganges precipitation was modulated by the U component of the wind and specific humidity at the 500 and 1000 hPa pressure levels, whereas the Brahmaputra precipitation was modulated by the V component of the wind at the 850 and 1000 hPa pressure levels. The evaluation of the SDSM performance indicated that model accuracy in reproducing precipitation at the monthly scale was acceptable, but at the daily scale the model inadequately simulated some daily extreme precipitation events. Therefore, while the downscaled precipitation may not be a suitable input for analyzing future extreme flooding or drought events, it could be adequate for analysis of future freshwater availability. Analysis of the CGCM3.1 downscaled precipitation projection with respect to observed precipitation reveals that the precipitation regime in each basin may be significantly impacted by climate change. Precipitation during and after the monsoon is likely to increase in both basins under the A1B and A2 emission scenarios, whereas the pre-monsoon precipitation is likely to decrease. Peak monsoon precipitation is likely to shift from July to August, which may impact the livelihoods of large rural populations linked to subsistence agriculture in the basins. Uncertainty analysis indicated that the uncertainty in the downscaled precipitation was less than the uncertainty in the original CGCM3.1 precipitation; hence, the CGCM3.1 downscaled precipitation was a better input for regional hydrological impact studies. However, downscaled precipitation from multiple GCMs is suggested for comprehensive impact studies.
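
    The deterministic core of a regression-based downscaling link, relating station precipitation to large-scale re-analysis predictors and then applying the fitted relationship to GCM predictors, can be sketched as below. This is not the SDSM software (which adds stochastic wet-day and variance-inflation components); the predictor choices and synthetic data are assumptions.

    ```python
    import numpy as np

    def fit_downscaling_model(predictors, station_precip):
        """Least-squares link between large-scale predictors and station precipitation.

        predictors     : 2-D array, shape (n days, n predictors), e.g. re-analysis
                         geopotential height, wind components and specific humidity
                         interpolated to the station's grid cell
        station_precip : 1-D array of observed daily precipitation at the station
        Returns the regression coefficients (intercept first).
        """
        X = np.column_stack([np.ones(len(predictors)), predictors])
        coeffs, *_ = np.linalg.lstsq(X, station_precip, rcond=None)
        return coeffs

    def downscale(coeffs, gcm_predictors):
        """Apply the fitted link to GCM predictors for a future period."""
        X = np.column_stack([np.ones(len(gcm_predictors)), gcm_predictors])
        return np.clip(X @ coeffs, 0.0, None)   # precipitation cannot be negative

    # Illustrative use with synthetic predictor fields
    rng = np.random.default_rng(5)
    obs_pred = rng.normal(size=(3650, 4))
    obs_precip = np.clip(2.0 + obs_pred @ [1.0, 0.5, -0.3, 0.8]
                         + rng.normal(scale=1.0, size=3650), 0.0, None)
    beta = fit_downscaling_model(obs_pred, obs_precip)
    future = downscale(beta, rng.normal(loc=0.2, size=(3650, 4)))
    print(f"mean observed {obs_precip.mean():.2f} mm/day, "
          f"mean downscaled future {future.mean():.2f} mm/day")
    ```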

  2. Using stable isotopes to assess surface water source dynamics and hydrological connectivity in a high-latitude wetland and permafrost influenced landscape

    NASA Astrophysics Data System (ADS)

    Ala-aho, P.; Soulsby, C.; Pokrovsky, O. S.; Kirpotin, S. N.; Karlsson, J.; Serikova, S.; Vorobyev, S. N.; Manasypov, R. M.; Loiko, S.; Tetzlaff, D.

    2018-01-01

    Climate change is expected to alter hydrological and biogeochemical processes in high-latitude inland waters. A critical question for understanding contemporary and future responses to environmental change is how the spatio-temporal dynamics of runoff generation processes will be affected. We sampled stable water isotopes in soils, lakes and rivers on an unprecedented spatio-temporal scale along a 1700 km transect over three years in the Western Siberia Lowlands. Our findings suggest that snowmelt mixes with, and displaces, large volumes of water stored in the organic soils and lakes to generate runoff during the thaw season. Furthermore, we saw a persistent hydrological connection between water bodies and the landscape across permafrost regions. Our findings help to bridge the understanding between small and large scale hydrological studies in high-latitude systems. These isotope data provide a means to conceptualise hydrological connectivity in permafrost and wetland influenced regions, which is needed for an improved understanding of future biogeochemical changes.
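
    The source partitioning described above is often based on a simple two-component isotope mixing model. The sketch below is a minimal, illustrative version with made-up δ18O end-member values; it is not the authors' analysis.

    ```python
    def snowmelt_fraction(delta_sample, delta_snow, delta_stored):
        """Two-component isotope mixing: fraction of runoff derived from snowmelt.

        f_snow = (delta_sample - delta_stored) / (delta_snow - delta_stored)
        """
        return (delta_sample - delta_stored) / (delta_snow - delta_stored)

    # Illustrative d18O values (per mil): snowmelt is strongly depleted relative to stored water.
    print(snowmelt_fraction(delta_sample=-16.0, delta_snow=-22.0, delta_stored=-12.0))  # 0.4
    ```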

  3. Regional variability of the frequency distribution of daily precipitation and the synoptic characteristics of heavy precipitation events in present and future climate simulations

    NASA Astrophysics Data System (ADS)

    DeAngelis, Anthony M.

    Changes in the characteristics of daily precipitation in response to global warming may have serious impacts on human life and property. An analysis of precipitation in climate models is performed to evaluate how well the models simulate the present climate and how precipitation may change in the future. Models participating in phase 3 and 5 of the Coupled Model Intercomparison Project (CMIP3 and CMIP5) have substantial biases in their simulation of heavy precipitation intensity over parts of North America during the 20th century. Despite these biases, the large-scale atmospheric circulation accompanying heavy precipitation is either simulated realistically or the strength of the circulation is overestimated. The biases are not related to the large-scale flow in a simple way, pointing toward the importance of other model deficiencies, such as coarse horizontal resolution and convective parameterizations, for the accurate simulation of intense precipitation. Although the models may not sufficiently simulate the intensity of precipitation, their realistic portrayal of the large-scale circulation suggests that projections of future precipitation may be reliable. In the CMIP5 ensemble, the distribution of daily precipitation is projected to undergo substantial changes in response to future atmospheric warming. The regional distribution of these changes was investigated, revealing that dry days and days with heavy-extreme precipitation are projected to increase at the expense of light-moderate precipitation over much of the middle and low latitudes. Such projections have serious implications for future impacts from flood and drought events. In other places, changes in the daily precipitation distribution are characterized by a shift toward either wetter or drier conditions in the future, with heavy-extreme precipitation projected to increase in all but the driest subtropical subsidence regions. Further analysis shows that increases in heavy precipitation in midlatitudes are largely explained by thermodynamics, including increases in atmospheric water vapor. However, in low latitudes and northern high latitudes, changes in vertical velocity accompanying heavy precipitation are also important. The strength of the large-scale atmospheric circulation is projected to change in accordance with vertical velocity in many places, though the circulation patterns, and therefore physical mechanisms that generate heavy precipitation, may remain the same.

  4. Varying the forcing scale in low Prandtl number dynamos

    NASA Astrophysics Data System (ADS)

    Brandenburg, A.; Haugen, N. E. L.; Li, Xiang-Yu; Subramanian, K.

    2018-06-01

    Small-scale dynamos are expected to operate in all astrophysical fluids that are turbulent and electrically conducting, for example the interstellar medium, stellar interiors, and accretion disks, where they may also be affected by, or compete with, large-scale dynamos. However, the possibility of small-scale dynamos being excited at small and intermediate ratios of viscosity to magnetic diffusivity (the magnetic Prandtl number) has been debated, and the possibility that they depend on the large-scale forcing wavenumber has been raised. Here we show, using four values of the forcing wavenumber, that the small-scale dynamo does not depend on the scale separation between the size of the simulation domain and the integral scale of the turbulence, i.e., the forcing scale. Moreover, the spectral bottleneck in turbulence, which has been implicated in raising the excitation conditions of small-scale dynamos, is found to be invariant under changing the forcing wavenumber. However, when forcing at the lowest few wavenumbers, the effective forcing wavenumber that enters in the definition of the magnetic Reynolds number is found to be about twice the minimum wavenumber of the domain. Our work is relevant to future studies of small-scale dynamos, of which several applications are being discussed.
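
    For reference, the non-dimensional numbers mentioned in this abstract are conventionally defined in terms of the viscosity ν, magnetic diffusivity η, rms velocity u_rms, and forcing wavenumber k_f (exact conventions vary between papers):

    ```latex
    \mathrm{Pm} = \frac{\nu}{\eta}, \qquad
    \mathrm{Re} = \frac{u_{\mathrm{rms}}}{\nu\, k_{\mathrm{f}}}, \qquad
    \mathrm{Rm} = \frac{u_{\mathrm{rms}}}{\eta\, k_{\mathrm{f}}} = \mathrm{Pm}\,\mathrm{Re}
    ```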

  5. Potential Impacts of Offshore Wind Farms on North Sea Stratification

    PubMed Central

    Carpenter, Jeffrey R.; Merckelbach, Lucas; Callies, Ulrich; Clark, Suzanna; Gaslikova, Lidia; Baschek, Burkard

    2016-01-01

    Advances in offshore wind farm (OWF) technology have recently led to their construction in coastal waters that are deep enough to be seasonally stratified. As tidal currents move past the OWF foundation structures they generate a turbulent wake that will contribute to a mixing of the stratified water column. In this study we show that the mixing generated in this way may have a significant impact on the large-scale stratification of the German Bight region of the North Sea. This region is chosen as the focus of this study since the planning of OWFs is particularly widespread. Using a combination of idealised modelling and in situ measurements, we provide order-of-magnitude estimates of two important time scales that are key to understanding the impacts of OWFs: (i) a mixing time scale, describing how long a complete mixing of the stratification takes, and (ii) an advective time scale, quantifying for how long a water parcel is expected to undergo enhanced wind farm mixing. The results are especially sensitive to both the drag coefficient and type of foundation structure, as well as the evolution of the pycnocline under enhanced mixing conditions—both of which are not well known. With these limitations in mind, the results show that OWFs could impact the large-scale stratification, but only when they occupy extensive shelf regions. They are expected to have very little impact on large-scale stratification at the current capacity in the North Sea, but the impact could be significant in future large-scale development scenarios. PMID:27513754
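
    The two time scales can be illustrated with a back-of-the-envelope calculation. The sketch below is not the authors' model: it assumes a hypothetical two-layer water column, one monopile per square kilometre, and an assumed mixing efficiency, then divides the potential-energy deficit of the stratification by the portion of tidal drag power available for mixing, and the farm length by a residual current speed.

    ```python
    g, rho0 = 9.81, 1025.0           # gravity (m s^-2), seawater density (kg m^-3)

    # Mixing time scale: energy needed to homogenise a two-layer column divided by
    # the fraction of turbine-wake drag power that goes into mixing.
    d_rho, h1, h2 = 1.0, 20.0, 20.0  # density jump (kg m^-3) and layer thicknesses (m)
    pe_deficit = 0.5 * g * d_rho * h1 * h2           # J m^-2 to fully mix the column

    pile_diam, depth, u_tide = 6.0, h1 + h2, 0.5     # monopile diameter (m), depth (m), tidal speed (m s^-1)
    spacing, c_d, mix_eff = 1000.0, 1.0, 0.1         # turbine spacing (m), drag coefficient, mixing efficiency
    drag_power = 0.5 * rho0 * c_d * pile_diam * depth * u_tide**3   # W per foundation
    mixing_power = mix_eff * drag_power / spacing**2                # W per m^2 of sea surface
    t_mix_days = pe_deficit / mixing_power / 86400.0

    # Advective time scale: how long a water parcel stays inside the farm.
    farm_length, u_residual = 20e3, 0.05             # farm extent (m), residual current (m s^-1)
    t_adv_days = farm_length / u_residual / 86400.0

    print(f"mixing time ~ {t_mix_days:.0f} days, advective time ~ {t_adv_days:.1f} days")
    ```

    With these assumed numbers the mixing time comes out at roughly two weeks and the residence time at a few days; the point of the study is that both scales are highly sensitive to the drag coefficient, foundation type, and pycnocline evolution.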

  6. Potential Impacts of Offshore Wind Farms on North Sea Stratification.

    PubMed

    Carpenter, Jeffrey R; Merckelbach, Lucas; Callies, Ulrich; Clark, Suzanna; Gaslikova, Lidia; Baschek, Burkard

    2016-01-01

    Advances in offshore wind farm (OWF) technology have recently led to their construction in coastal waters that are deep enough to be seasonally stratified. As tidal currents move past the OWF foundation structures they generate a turbulent wake that will contribute to a mixing of the stratified water column. In this study we show that the mixing generated in this way may have a significant impact on the large-scale stratification of the German Bight region of the North Sea. This region is chosen as the focus of this study since the planning of OWFs is particularly widespread. Using a combination of idealised modelling and in situ measurements, we provide order-of-magnitude estimates of two important time scales that are key to understanding the impacts of OWFs: (i) a mixing time scale, describing how long a complete mixing of the stratification takes, and (ii) an advective time scale, quantifying for how long a water parcel is expected to undergo enhanced wind farm mixing. The results are especially sensitive to both the drag coefficient and type of foundation structure, as well as the evolution of the pycnocline under enhanced mixing conditions-both of which are not well known. With these limitations in mind, the results show that OWFs could impact the large-scale stratification, but only when they occupy extensive shelf regions. They are expected to have very little impact on large-scale stratification at the current capacity in the North Sea, but the impact could be significant in future large-scale development scenarios.

  7. From catchment scale hydrologic processes to numerical models and robust predictions of climate change impacts at regional scales

    NASA Astrophysics Data System (ADS)

    Wagener, T.

    2017-12-01

    Current societal problems and questions demand that we increasingly build hydrologic models for regional or even continental scale assessment of global change impacts. Such models offer new opportunities for scientific advancement, for example by enabling comparative hydrology or connectivity studies, and for improved support of water management decisions, since we might better understand regional impacts on water resources from large scale phenomena such as droughts. On the other hand, we are faced with epistemic uncertainties when we move up in scale. The term epistemic uncertainty describes those uncertainties that are not well determined by historical observations. This lack of determination can be because the future is not like the past (e.g. due to climate change), because the historical data is unreliable (e.g. because it is imperfectly recorded from proxies or missing), or because it is scarce (either because measurements are not available at the right scale or there is no observation network available at all). In this talk I will explore: (1) how we might build a bridge between what we have learned about catchment-scale processes and hydrologic model development and evaluation at larger scales; (2) how we can understand the impact of epistemic uncertainty in large-scale hydrologic models; and (3) how we might utilize large-scale hydrologic predictions to understand climate change impacts, e.g. on infectious disease risk.

  8. Large-scale dynamo growth rates from numerical simulations and implications for mean-field theories

    NASA Astrophysics Data System (ADS)

    Park, Kiwan; Blackman, Eric G.; Subramanian, Kandaswamy

    2013-05-01

    Understanding large-scale magnetic field growth in turbulent plasmas in the magnetohydrodynamic limit is a goal of magnetic dynamo theory. In particular, assessing how well large-scale helical field growth and saturation in simulations match those predicted by existing theories is important for progress. Using numerical simulations of isotropically forced turbulence without large-scale shear, we focus on several additional aspects of this comparison: (1) Leading mean-field dynamo theories which break the field into large and small scales predict that large-scale helical field growth rates are determined by the difference between kinetic helicity and current helicity with no dependence on the nonhelical energy in small-scale magnetic fields. Our simulations show that the growth rate of the large-scale field from fully helical forcing is indeed unaffected by the presence or absence of small-scale magnetic fields amplified in a precursor nonhelical dynamo. However, because the precursor nonhelical dynamo in our simulations produced fields that were strongly subequipartition with respect to the kinetic energy, we cannot yet rule out the potential influence of stronger nonhelical small-scale fields. (2) We have identified two features in our simulations which cannot be explained by the most minimalist versions of two-scale mean-field theory: (i) fully helical small-scale forcing produces significant nonhelical large-scale magnetic energy and (ii) the saturation of the large-scale field growth is time delayed with respect to what minimalist theory predicts. We comment on desirable generalizations to the theory in this context and future desired work.
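
    For orientation, the "minimalist" two-scale mean-field picture referred to here predicts, for a large-scale field at wavenumber k, a kinematic growth rate set by an α effect built from the residual (kinetic minus current) helicity together with turbulent diffusion; prefactors and sign conventions vary between formulations, so the following is only the schematic form:

    ```latex
    \gamma(k) \simeq |\alpha|\,k - (\eta + \eta_t)\,k^{2},
    \qquad
    \alpha \simeq -\frac{\tau}{3}\left(\langle\boldsymbol{\omega}\cdot\boldsymbol{u}\rangle
      - \frac{\langle\boldsymbol{j}\cdot\boldsymbol{b}\rangle}{\bar{\rho}}\right)
    ```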

  9. Large-scale dynamo growth rates from numerical simulations and implications for mean-field theories.

    PubMed

    Park, Kiwan; Blackman, Eric G; Subramanian, Kandaswamy

    2013-05-01

    Understanding large-scale magnetic field growth in turbulent plasmas in the magnetohydrodynamic limit is a goal of magnetic dynamo theory. In particular, assessing how well large-scale helical field growth and saturation in simulations match those predicted by existing theories is important for progress. Using numerical simulations of isotropically forced turbulence without large-scale shear, we focus on several additional aspects of this comparison: (1) Leading mean-field dynamo theories which break the field into large and small scales predict that large-scale helical field growth rates are determined by the difference between kinetic helicity and current helicity with no dependence on the nonhelical energy in small-scale magnetic fields. Our simulations show that the growth rate of the large-scale field from fully helical forcing is indeed unaffected by the presence or absence of small-scale magnetic fields amplified in a precursor nonhelical dynamo. However, because the precursor nonhelical dynamo in our simulations produced fields that were strongly subequipartition with respect to the kinetic energy, we cannot yet rule out the potential influence of stronger nonhelical small-scale fields. (2) We have identified two features in our simulations which cannot be explained by the most minimalist versions of two-scale mean-field theory: (i) fully helical small-scale forcing produces significant nonhelical large-scale magnetic energy and (ii) the saturation of the large-scale field growth is time delayed with respect to what minimalist theory predicts. We comment on desirable generalizations to the theory in this context and future desired work.

  10. Establishment of a National Wind Energy Center at University of Houston

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Su Su

    The DOE-supported project objectives are to establish a national wind energy center (NWEC) at the University of Houston and conduct research to address critical science and engineering issues for the development of future large MW-scale wind energy production systems, especially offshore wind turbines. The goals of the project are to: (1) establish a sound scientific/technical knowledge base of solutions to critical science and engineering issues for developing future MW-scale large wind energy production systems, (2) develop a state-of-the-art wind rotor blade research facility at the University of Houston, and (3) through multi-disciplinary research, introduce technology innovations in advanced wind-turbine materials, processing/manufacturing technology, design and simulation, and testing and reliability assessment methods related to future wind turbine systems for cost-effective production of offshore wind energy. To achieve the goals of the project, the following technical tasks were planned and executed during the period from April 15, 2010 to October 31, 2014 at the University of Houston: (1) basic research on large offshore wind turbine systems; (2) applied research on innovative wind turbine rotors for large offshore wind energy systems; (3) integration of offshore wind-turbine design, advanced materials, and manufacturing technologies; (4) integrity and reliability of large offshore wind turbine blades and scaled model testing; (5) education and training of graduate and undergraduate students and post-doctoral researchers; and (6) development of a national offshore wind turbine blade research facility. The research program addresses both basic science and engineering of current and future large wind turbine systems, especially offshore wind turbines, for MW-scale power generation. The results of the research advance current understanding of many important scientific issues and provide technical information for developing future large wind turbines with advanced design, composite materials, integrated manufacturing, and structural reliability and integrity. The educational program has trained many graduate and undergraduate students and post-doctoral researchers in the critical science and engineering of wind energy production systems through graduate-level courses, research, and participation in various projects in the center's large multi-disciplinary research program. These students and researchers are now employed by the wind industry, national labs, and universities to support the US and international wind energy industry. The national offshore wind turbine blade research facility developed in the project has been used to support the technical and training tasks planned in the program, and it is a national asset available for use by domestic and international researchers in the wind energy arena.

  11. GAIA: A WINDOW TO LARGE-SCALE MOTIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nusser, Adi; Branchini, Enzo; Davis, Marc, E-mail: adi@physics.technion.ac.il, E-mail: branchin@fis.uniroma3.it, E-mail: mdavis@berkeley.edu

    2012-08-10

    Using redshifts as a proxy for galaxy distances, estimates of the two-dimensional (2D) transverse peculiar velocities of distant galaxies could be obtained from future measurements of proper motions. We provide the mathematical framework for analyzing 2D transverse motions and show that they offer several advantages over traditional probes of large-scale motions. They are completely independent of any intrinsic relations between galaxy properties; hence, they are essentially free of selection biases. They are free from homogeneous and inhomogeneous Malmquist biases that typically plague distance indicator catalogs. They provide additional information to traditional probes that yield line-of-sight peculiar velocities only. Further, because of their 2D nature, fundamental questions regarding vorticity of large-scale flows can be addressed. Gaia, for example, is expected to provide proper motions of at least bright galaxies with high central surface brightness, making proper motions a likely contender for traditional probes based on current and future distance indicator measurements.

  12. Methods, caveats and the future of large-scale microelectrode recordings in the non-human primate

    PubMed Central

    Dotson, Nicholas M.; Goodell, Baldwin; Salazar, Rodrigo F.; Hoffman, Steven J.; Gray, Charles M.

    2015-01-01

    Cognitive processes play out on massive brain-wide networks, which produce widely distributed patterns of activity. Capturing these activity patterns requires tools that are able to simultaneously measure activity from many distributed sites with high spatiotemporal resolution. Unfortunately, current techniques with adequate coverage do not provide the requisite spatiotemporal resolution. Large-scale microelectrode recording devices, with dozens to hundreds of microelectrodes capable of simultaneously recording from nearly as many cortical and subcortical areas, provide a potential way to minimize these tradeoffs. However, placing hundreds of microelectrodes into a behaving animal is a highly risky and technically challenging endeavor that has only been pursued by a few groups. Recording activity from multiple electrodes simultaneously also introduces several statistical and conceptual dilemmas, such as the multiple comparisons problem and the uncontrolled stimulus response problem. In this perspective article, we discuss some of the techniques that we, and others, have developed for collecting and analyzing large-scale data sets, and address the future of this emerging field. PMID:26578906

  13. Quarter Scale RLV Multi-Lobe LH2 Tank Test Program

    NASA Technical Reports Server (NTRS)

    Blum, Celia; Puissegur, Dennis; Tidwell, Zeb; Webber, Carol

    1998-01-01

    Thirty cryogenic pressure cycles have been completed on the Lockheed Martin Michoud Space Systems quarter scale RLV composite multi-lobe liquid hydrogen propellant tank assembly, completing the initial phases of testing and demonstrating technologies key to the success of large scale composite cryogenic tankage for X33, RLV, and other future launch vehicles.

  14. Epilepsy Genetics—Past, Present, and Future

    PubMed Central

    Poduri, Annapurna; Lowenstein, Daniel

    2014-01-01

    Human epilepsy is a common and heterogeneous condition in which genetics play an important etiological role. We begin by reviewing the past history of epilepsy genetics, a field that has traditionally included studies of pedigrees with epilepsy caused by defects in ion channels and neurotransmitters. We highlight important recent discoveries that have expanded the field beyond the realm of channels and neurotransmitters and that have challenged the notion that single genes produce single disorders. Finally, we project toward an exciting future for epilepsy genetics as large-scale collaborative phenotyping studies come face to face with new technologies in genomic medicine. PMID:21277190

  15. [Japanese epidemiologic investigation for non-steroidal anti-inflammatory drugs-induced ulcers].

    PubMed

    Miyake, Kazumasa; Sakamoto, Choitsu

    2011-06-01

    This review summarizes epidemiologic investigations of non-steroidal anti-inflammatory drug (NSAID)-induced ulcers, with a focus on the Japanese evidence. In Japan, national health insurance does not cover procedures that prevent or lower the risk for NSAID-induced ulcers. In NSAID treatment of patients with risk factors, it is desirable to administer antiulcer agents. However, in Japan, there are no large-scale studies on the efficacy of co-medication such as proton pump inhibitors, prostaglandin analogs (misoprostol) or histamine-H2 receptor antagonists, or on the effectiveness of H. pylori eradication or selective COX-2 antagonists. In the future, large-scale clinical studies should be conducted to accumulate high-quality evidence, including cost-effectiveness and overall safety such as cardiovascular events, because Japanese patients differ from Westerners in several genetic or acquired factors.

  16. Consideration of future safety consequences: a new predictor of employee safety.

    PubMed

    Probst, Tahira M; Graso, Maja; Estrada, Armando X; Greer, Sarah

    2013-06-01

    Compliance with safety behaviors is often associated with longer term benefits, but may require some short-term sacrifices. This study examines the extent to which consideration of future safety consequences (CFSC) predicts employee safety outcomes. Two field studies were conducted to evaluate the reliability and validity of the newly developed Consideration of Future Safety Consequences (CFSC) scale. Surveys containing the CFSC scale and other measures of safety attitudes, behaviors, and outcomes were administered during working hours to a sample of 128 pulp and paper mill employees; after revising the CFSC scale based on these initial results, follow-up survey data were collected in a second sample of 212 copper miners. In Study I, CFSC was predictive of employee safety knowledge and motivation, compliance, safety citizenship behaviors, accident reporting attitudes and behaviors, and workplace injuries - even after accounting for conscientiousness and demographic variables. Moreover, the effects of CFSC on the variables generally appear to be direct, as opposed to mediated by safety knowledge or motivation. These findings were largely replicated in Study II. CFSC appears to be an important personality construct that may predict those individuals who are more likely to comply with safety rules and have more positive safety outcomes. Future research should examine the longitudinal stability of CFSC to determine the extent to which this construct is a stable trait, rather than a safety attitude amenable to change over time or following an intervention. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Large-scale retrieval for medical image analytics: A comprehensive review.

    PubMed

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of handling such huge amounts of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, and searching. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which could further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
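
    A toy version of the retrieval pipeline named here (feature representation, indexing, searching) can be written in a few lines. The sketch below uses random vectors as stand-ins for learned image descriptors and a brute-force cosine search in place of the approximate indexes (hashing, quantization, graph search) that real large-scale systems rely on; all names and sizes are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Feature representation: assume each image has already been mapped to a
    # d-dimensional descriptor (e.g. by a pretrained network); random stand-ins here.
    n_images, d = 10_000, 256
    features = rng.standard_normal((n_images, d)).astype(np.float32)

    # Indexing: L2-normalise once so a dot product equals cosine similarity.
    features /= np.linalg.norm(features, axis=1, keepdims=True)

    def search(query_vec, k=10):
        """Return indices of the k most similar images (brute-force cosine search)."""
        q = query_vec / np.linalg.norm(query_vec)
        scores = features @ q
        top = np.argpartition(-scores, k)[:k]
        return top[np.argsort(-scores[top])]

    print(search(rng.standard_normal(d).astype(np.float32)))
    ```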

  18. An Assessment of Potential Mining Impacts on Salmon ...

    EPA Pesticide Factsheets

    The Bristol Bay watershed in southwestern Alaska supports the largest sockeye salmon fishery in the world, is home to 25 federally recognized tribal governments, and contains large mineral resources. The potential for large-scale mining activities in the watershed has raised concerns about the impact of mining on the sustainability of Bristol Bay’s world-class commercial, recreational and subsistence fisheries and the future of Alaska Native tribes in the watershed who have maintained a salmon-based culture and subsistence-based way of life for at least 4,000 years. The purpose of this assessment is to provide a characterization of the biological and mineral resources of the Bristol Bay watershed, increase understanding of the potential impacts of large-scale mining on the region’s fish resources, and inform future government decisions related to protecting and maintaining the chemical, physical, and biological integrity of the watershed. It will also serve as a technical resource for the public, tribes, and governments who must consider how best to address the challenges of mining and ecological protection in the Bristol Bay watershed. The purpose of this assessment is to understand how future large-scale mining may affect water quality and the Bristol Bay salmon fisheries, which include the largest wild sockeye salmon fishery in the world. Bristol Bay, Alaska, is home to a salmon fishery that is of significant economic and subsistence value to the people.

  19. An Alternative to the Search for Single Polymorphisms: Toward Molecular Personality Scales for the Five-Factor Model

    PubMed Central

    McCrae, Robert R.; Scally, Matthew; Terracciano, Antonio; Abecasis, Gonçalo R.; Costa, Paul T.

    2011-01-01

    There is growing evidence that personality traits are affected by many genes, all of which have very small effects. As an alternative to the largely-unsuccessful search for individual polymorphisms associated with personality traits, we identified large sets of potentially related single nucleotide polymorphisms (SNPs) and summed them to form molecular personality scales (MPSs) with from 4 to 2,497 SNPs. Scales were derived from two-thirds of a large (N = 3,972) sample of individuals from Sardinia who completed the Revised NEO Personality Inventory and were assessed in a genome-wide association scan. When MPSs were correlated with the phenotype in the remaining third of the sample, very small but significant associations were found for four of the five personality factors when the longest scales were examined. These data suggest that MPSs for Neuroticism, Openness to Experience, Agreeableness, and Conscientiousness (but not Extraversion) contain genetic information that can be refined in future studies, and the procedures described here should be applicable to other quantitative traits. PMID:21114353
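
    The scale-building procedure described here is essentially a polygenic-score-style calculation: derive per-SNP weights in a training subsample, sum weighted allele counts into a score, and correlate that score with the phenotype in held-out subjects. The sketch below reproduces that logic on synthetic data; the effect sizes, sample split, and SNP counts are illustrative only.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_subjects, n_snps = 3972, 2497
    genotypes = rng.integers(0, 3, size=(n_subjects, n_snps))      # 0/1/2 minor-allele counts
    phenotype = genotypes[:, :50] @ rng.normal(0, 0.05, 50) + rng.standard_normal(n_subjects)

    # Split: two-thirds for deriving the scale, one-third for validation.
    train = rng.random(n_subjects) < 2 / 3
    g_train, g_test = genotypes[train], genotypes[~train]
    p_train, p_test = phenotype[train], phenotype[~train]

    # Derive per-SNP association weights in the training set...
    centered = g_train - g_train.mean(axis=0)
    weights = centered.T @ (p_train - p_train.mean()) / len(p_train)

    # ...and sum the weighted allele counts into a molecular scale score.
    scores_test = (g_test - g_train.mean(axis=0)) @ weights

    # Validation: correlate the scale with the phenotype in the held-out third.
    r = np.corrcoef(scores_test, p_test)[0, 1]
    print(f"held-out correlation: {r:.3f}")
    ```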

  20. Keeping on Track: Performance Profiles of Low Performers in Academic Educational Tracks

    ERIC Educational Resources Information Center

    Reed, Helen C.; van Wesel, Floryt; Ouwehand, Carolijn; Jolles, Jelle

    2015-01-01

    In countries with high differentiation between academic and vocational education, an individual's future prospects are strongly determined by the educational track to which he or she is assigned. This large-scale, cross-sectional study focuses on low-performing students in academic tracks who face being moved to a vocational track. If more is…

  1. The Educational Predicament Confronting Taiwan's Gifted Programs: An Evaluation of Current Practices and Future Challenges

    ERIC Educational Resources Information Center

    Kao, Chen-yao

    2012-01-01

    This study examines the current problems affecting Taiwan's gifted education through a large-scale gifted program evaluation. Fifty-one gifted classes at 15 elementary schools and 62 gifted classes at 18 junior high schools were evaluated. The primary activities included in this biennial evaluation were document review, observation of…

  2. Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR

    NASA Astrophysics Data System (ADS)

    Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.

    2017-12-01

    Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional to local scales. While global climate models are generally capable of simulating mean climate at global-to-regional scales with reasonable skill, resiliency and adaptation decisions are made at local scales, where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool by which to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluation of the ability of climate models to capture key patterns associated with extreme precipitation over Portland and to better interpret projections of future climate at impact-relevant scales.
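
    The first step of such a synoptic-climatology analysis, compositing a large-scale field over extreme-precipitation days, is straightforward to sketch; the self-organizing-maps step that follows it is omitted here. The example below uses synthetic stand-ins for the 500-hPa height anomalies and the daily precipitation series, so the threshold and grid sizes are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_days, ny, nx = 4000, 40, 60

    # Synthetic stand-ins: daily 500-hPa height anomalies (m) on a grid and
    # area-average daily precipitation (mm) at the location of interest.
    z500_anom = rng.standard_normal((n_days, ny, nx)) * 50.0
    precip = rng.gamma(shape=0.5, scale=4.0, size=n_days)

    # Define extreme-precipitation days as those at or above the 95th percentile.
    extreme_days = precip >= np.percentile(precip, 95)

    # Composite: average the large-scale field over the extreme days only.
    composite = z500_anom[extreme_days].mean(axis=0)
    print(extreme_days.sum(), composite.shape)
    ```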

  3. Experience with specifications applicable to certification. [of photovoltaic modules for large-scale application

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1982-01-01

    The Jet Propulsion Laboratory has developed a number of photovoltaic test and measurement specifications to guide the development of modules toward the requirements of future large-scale applications. Experience with these specifications and the extensive module measurement and testing that has accompanied their use is examined. Conclusions are drawn relative to three aspects of product certification: performance measurement, endurance testing and safety evaluation.

  4. Tritium

    DTIC Science & Technology

    2011-11-01

    fusion energy-production processes of the particular type of reactor using a lithium (Li) blanket or related alloys such as the Pb-17Li eutectic. As such, tritium breeding is intimately connected with energy production, thermal management, radioactivity management, materials properties, and mechanical structures of any plausible future large-scale fusion power reactor. JASON is asked to examine the current state of scientific knowledge and engineering practice on the physical and chemical bases for large-scale tritium

  5. Projections of water stress based on an ensemble of socioeconomic growth and climate change scenarios: A case study in Asia

    DOE PAGES

    Fant, Charles; Schlosser, C. Adam; Gao, Xiang; ...

    2016-03-30

    The sustainability of future water resources is of paramount importance and is affected by many factors, including population, wealth and climate. Inherent in current methods to estimate these factors in the future is the uncertainty of their prediction. In this study, we integrate a large ensemble of scenarios—internally consistent across economics, emissions, climate, and population—to develop a risk portfolio of water stress over a large portion of Asia that includes China, India, and Mainland Southeast Asia in a future with unconstrained emissions. We isolate the effects of socioeconomic growth from the effects of climate change in order to identify the primary drivers of stress on water resources. We find that water needs related to socioeconomic changes, which are currently small, are likely to increase considerably in the future, often overshadowing the effect of climate change on levels of water stress. As a result, there is a high risk of severe water stress in densely populated watersheds by 2050, compared to recent history. There is strong evidence to suggest that, in the absence of autonomous adaptation or societal response, a much larger portion of the region’s population will live in water-stressed regions in the near future. Lastly, tools and studies such as these can effectively investigate large-scale system sensitivities and can be useful in engaging and informing decision makers.

  6. Webinar July 28: H2@Scale - A Potential Opportunity | News | NREL

    Science.gov Websites

    role of hydrogen at the grid scale and the efforts of a large, national lab team assembled to evaluate the potential of hydrogen to play a critical role in our energy future. Presenters will share facts

  7. Passive seismic monitoring of natural and induced earthquakes: case studies, future directions and socio-economic relevance

    USGS Publications Warehouse

    Bohnhoff, Marco; Dresen, Georg; Ellsworth, William L.; Ito, Hisao; Cloetingh, Sierd; Negendank, Jörg

    2010-01-01

    An important discovery in crustal mechanics has been that the Earth’s crust is commonly stressed close to failure, even in tectonically quiet areas. As a result, small natural or man-made perturbations to the local stress field may trigger earthquakes. To understand these processes, Passive Seismic Monitoring (PSM) with seismometer arrays is a widely used technique that has been successfully applied to study seismicity at different magnitude levels ranging from acoustic emissions generated in the laboratory under controlled conditions, to seismicity induced by hydraulic stimulations in geological reservoirs, and up to great earthquakes occurring along plate boundaries. In all these environments the appropriate deployment of seismic sensors, i.e., directly on the rock sample, at the earth’s surface or in boreholes close to the seismic sources allows for the detection and location of brittle failure processes at sufficiently low magnitude-detection threshold and with adequate spatial resolution for further analysis. One principal aim is to develop an improved understanding of the physical processes occurring at the seismic source and their relationship to the host geologic environment. In this paper we review selected case studies and future directions of PSM efforts across a wide range of scales and environments. These include induced failure within small rock samples, hydrocarbon reservoirs, and natural seismicity at convergent and transform plate boundaries. Each example represents a milestone with regard to bridging the gap between laboratory-scale experiments under controlled boundary conditions and large-scale field studies. The common motivation for all studies is to refine the understanding of how earthquakes nucleate, how they proceed and how they interact in space and time. This is of special relevance at the larger end of the magnitude scale, i.e., for large devastating earthquakes due to their severe socio-economic impact.

  8. Fabrication of the HIAD Large-Scale Demonstration Assembly and Upcoming Mission Applications

    NASA Technical Reports Server (NTRS)

    Swanson, G. T.; Johnson, R. K.; Hughes, S. J.; DiNonno, J. M.; Cheatwood, F. M.

    2017-01-01

    Over a decade of work has been conducted in the development of NASA's Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This effort has included multiple ground test campaigns and flight tests culminating in the HIAD project's second-generation (Gen-2) deployable aeroshell system and associated analytical tools. NASA's HIAD project team has developed, fabricated, and tested inflatable structures (IS) integrated with a flexible thermal protection system (F-TPS), ranging in diameter from 3-6m, with cone angles of 60 and 70 deg. In 2015, United Launch Alliance (ULA) announced that they will use a HIAD (10-12m) as part of their Sensible, Modular, Autonomous Return Technology (SMART) for their upcoming Vulcan rocket. ULA expects SMART reusability, coupled with other advancements for Vulcan, will substantially reduce the cost of access to space. The first booster engine recovery via HIAD is scheduled for 2024. To meet this near-term need, as well as future NASA applications, the HIAD team is investigating taking the technology to the 10-15m diameter scale. In the last year, many significant development and fabrication efforts have been accomplished, culminating in the construction of a large-scale inflatable structure demonstration assembly. This assembly incorporated the first three tori for a 12m Mars Human-Scale Pathfinder HIAD conceptual design that was constructed with the current state-of-the-art material set. Numerous design trades and torus fabrication demonstrations preceded this effort. In 2016, three large-scale tori (0.61m cross-section) and six subscale tori (0.25m cross-section) were manufactured to demonstrate fabrication techniques using the newest candidate material sets. These tori were tested to evaluate durability and load capacity. This work led to the selection of the inflatable structure's third-generation (Gen-3) structural liner. In late 2016, the three tori required for the large-scale demonstration assembly were fabricated, and then integrated in early 2017. The design includes provisions to add the remaining four tori necessary to complete the assembly of the 12m Human-Scale Pathfinder HIAD in the event future project funding becomes available. This presentation will discuss the HIAD large-scale demonstration assembly design and fabrication performed in the last year, including the precursor tori development and the partial-stack fabrication. Potential near-term and future 10-15m HIAD applications will also be discussed.

  9. Fabrication of the HIAD Large-Scale Demonstration Assembly

    NASA Technical Reports Server (NTRS)

    Swanson, G. T.; Johnson, R. K.; Hughes, S. J.; DiNonno, J. M.; Cheatwood, F. M.

    2017-01-01

    Over a decade of work has been conducted in the development of NASA's Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This effort has included multiple ground test campaigns and flight tests culminating in the HIAD project's second-generation (Gen-2) deployable aeroshell system and associated analytical tools. NASA's HIAD project team has developed, fabricated, and tested inflatable structures (IS) integrated with a flexible thermal protection system (F-TPS), ranging in diameter from 3-6m, with cone angles of 60 and 70 deg. In 2015, United Launch Alliance (ULA) announced that they will use a HIAD (10-12m) as part of their Sensible, Modular, Autonomous Return Technology (SMART) for their upcoming Vulcan rocket. ULA expects SMART reusability, coupled with other advancements for Vulcan, will substantially reduce the cost of access to space. The first booster engine recovery via HIAD is scheduled for 2024. To meet this near-term need, as well as future NASA applications, the HIAD team is investigating taking the technology to the 10-15m diameter scale. In the last year, many significant development and fabrication efforts have been accomplished, culminating in the construction of a large-scale inflatable structure demonstration assembly. This assembly incorporated the first three tori for a 12m Mars Human-Scale Pathfinder HIAD conceptual design that was constructed with the current state-of-the-art material set. Numerous design trades and torus fabrication demonstrations preceded this effort. In 2016, three large-scale tori (0.61m cross-section) and six subscale tori (0.25m cross-section) were manufactured to demonstrate fabrication techniques using the newest candidate material sets. These tori were tested to evaluate durability and load capacity. This work led to the selection of the inflatable structure's third-generation (Gen-3) structural liner. In late 2016, the three tori required for the large-scale demonstration assembly were fabricated, and then integrated in early 2017. The design includes provisions to add the remaining four tori necessary to complete the assembly of the 12m Human-Scale Pathfinder HIAD in the event future project funding becomes available. This presentation will discuss the HIAD large-scale demonstration assembly design and fabrication performed in the last year, including the precursor tori development and the partial-stack fabrication. Potential near-term and future 10-15m HIAD applications will also be discussed.

  10. Characterization of Sound Radiation by Unresolved Scales of Motion in Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Rubinstein, Robert; Zhou, Ye

    1999-01-01

    Evaluation of the sound sources in a high Reynolds number turbulent flow requires time-accurate resolution of an extremely large number of scales of motion. Direct numerical simulations will therefore remain infeasible for the foreseeable future: although current large eddy simulation methods can resolve the largest scales of motion accurately, they must leave some scales of motion unresolved. A priori studies show that acoustic power can be underestimated significantly if the contribution of these unresolved scales is simply neglected. In this paper, the problem of evaluating the sound radiation properties of the unresolved, subgrid-scale motions is approached in the spirit of the simplest subgrid stress models: the unresolved velocity field is treated as isotropic turbulence with statistical descriptors evaluated from the resolved field. The theory of isotropic turbulence is applied to derive formulas for the total power and the power spectral density of the sound radiated by a filtered velocity field. These quantities are compared with the corresponding quantities for the unfiltered field for a range of filter widths and Reynolds numbers.

  11. Analyzing the cosmic variance limit of remote dipole measurements of the cosmic microwave background using the large-scale kinetic Sunyaev Zel'dovich effect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terrana, Alexandra; Johnson, Matthew C.; Harris, Mary-Jean, E-mail: aterrana@perimeterinstitute.ca, E-mail: mharris8@perimeterinstitute.ca, E-mail: mjohnson@perimeterinstitute.ca

    Due to cosmic variance we cannot learn any more about large-scale inhomogeneities from the primary cosmic microwave background (CMB) alone. More information on large scales is essential for resolving large angular scale anomalies in the CMB. Here we consider cross correlating the large-scale kinetic Sunyaev Zel'dovich (kSZ) effect and probes of large-scale structure, a technique known as kSZ tomography. The statistically anisotropic component of the cross correlation encodes the CMB dipole as seen by free electrons throughout the observable Universe, providing information about long wavelength inhomogeneities. We compute the large angular scale power asymmetry, constructing the appropriate transfer functions, and estimate the cosmic variance limited signal to noise for a variety of redshift bin configurations. The signal to noise is significant over a large range of power multipoles and numbers of bins. We present a simple mode counting argument indicating that kSZ tomography can be used to estimate more modes than the primary CMB on comparable scales. A basic forecast indicates that a first detection could be made with next-generation CMB experiments and galaxy surveys. This paper motivates a more systematic investigation of how close to the cosmic variance limit it will be possible to get with future observations.
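
    For context, the kSZ signal that this cross correlation is built on is the temperature anisotropy imprinted by Thomson scattering off free electrons moving with peculiar velocity v_e along the line of sight n̂, with σ_T the Thomson cross section and n_e the electron number density; this is the standard expression, not a result specific to this paper:

    ```latex
    \frac{\Delta T_{\mathrm{kSZ}}}{T_{\mathrm{CMB}}}(\hat{\boldsymbol{n}})
      = -\,\sigma_T \int \mathrm{d}l\; n_e(\hat{\boldsymbol{n}}, l)\,
        \frac{\boldsymbol{v}_e\cdot\hat{\boldsymbol{n}}}{c}
    ```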

  12. Application of stakeholder-based and modelling approaches for supporting robust adaptation decision making under future climatic uncertainty and changing urban-agricultural water demand

    NASA Astrophysics Data System (ADS)

    Bhave, Ajay; Dessai, Suraje; Conway, Declan; Stainforth, David

    2016-04-01

    Deep uncertainty in future climate change and socio-economic conditions necessitates the use of assess-risk-of-policy approaches over predict-then-act approaches for adaptation decision making. Robust Decision Making (RDM) approaches embody this principle and help evaluate the ability of adaptation options to satisfy stakeholder preferences under wide-ranging future conditions. This study involves the simultaneous application of two RDM approaches; qualitative and quantitative, in the Cauvery River Basin in Karnataka (population ~23 million), India. The study aims to (a) determine robust water resources adaptation options for the 2030s and 2050s and (b) compare the usefulness of a qualitative stakeholder-driven approach with a quantitative modelling approach. For developing a large set of future scenarios a combination of climate narratives and socio-economic narratives was used. Using structured expert elicitation with a group of climate experts in the Indian Summer Monsoon, climatic narratives were developed. Socio-economic narratives were developed to reflect potential future urban and agricultural water demand. In the qualitative RDM approach, a stakeholder workshop helped elicit key vulnerabilities, water resources adaptation options and performance criteria for evaluating options. During a second workshop, stakeholders discussed and evaluated adaptation options against the performance criteria for a large number of scenarios of climatic and socio-economic change in the basin. In the quantitative RDM approach, a Water Evaluation And Planning (WEAP) model was forced by precipitation and evapotranspiration data, coherent with the climatic narratives, together with water demand data based on socio-economic narratives. We find that compared to business-as-usual conditions options addressing urban water demand satisfy performance criteria across scenarios and provide co-benefits like energy savings and reduction in groundwater depletion, while options reducing agricultural water demand significantly affect downstream water availability. Water demand options demonstrate potential to improve environmental flow conditions and satisfy legal water supply requirements for downstream riparian states. On the other hand, currently planned large scale infrastructural projects demonstrate reduced value in certain scenarios, illustrating the impacts of lock-in effects of large scale infrastructure. From a methodological perspective, we find that while the stakeholder-driven approach revealed robust options in a resource-light manner and helped initiate much needed interaction amongst stakeholders, the modelling approach provides complementary quantitative information. The study reveals robust adaptation options for this important basin and provides a strong methodological basis for carrying out future studies that support adaptation decision making.

  13. Spatial synchrony of local populations has increased in association with the recent Northern Hemisphere climate trend.

    PubMed

    Post, Eric; Forchhammer, Mads C

    2004-06-22

    According to ecological theory, populations whose dynamics are entrained by environmental correlation face increased extinction risk as environmental conditions become more synchronized spatially. This prediction is highly relevant to the study of ecological consequences of climate change. Recent empirical studies have indicated, for example, that large-scale climate synchronizes trophic interactions and population dynamics over broad spatial scales in freshwater and terrestrial systems. Here, we present an analysis of century-scale, spatially replicated data on local weather and the population dynamics of caribou in Greenland. Our results indicate that spatial autocorrelation in local weather has increased with large-scale climatic warming. This increase in spatial synchrony of environmental conditions has been matched, in turn, by an increase in the spatial synchrony of local caribou populations toward the end of the 20th century. Our results indicate that spatial synchrony in environmental conditions and the populations influenced by them are highly variable through time and can increase with climatic warming. We suggest that if future warming can increase population synchrony, it may also increase extinction risk.
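
    A common way to quantify the spatial synchrony discussed here is the mean pairwise correlation among local time series, compared between time windows. The sketch below does this on synthetic data in which a shared signal strengthens halfway through the record; it illustrates the metric only and is not the authors' century-scale analysis.

    ```python
    import numpy as np

    def mean_pairwise_correlation(series):
        """Mean of the off-diagonal correlations among local time series (rows = sites)."""
        c = np.corrcoef(series)
        return c[np.triu_indices_from(c, k=1)].mean()

    rng = np.random.default_rng(3)
    n_sites, n_years = 8, 100
    shared = rng.standard_normal(n_years)                 # a common (climate-like) signal
    weight = np.where(np.arange(n_years) < 50, 0.2, 0.7)  # the shared signal strengthens over time
    series = rng.standard_normal((n_sites, n_years)) + weight * shared

    early = mean_pairwise_correlation(series[:, :50])
    late = mean_pairwise_correlation(series[:, 50:])
    print(f"synchrony early: {early:.2f}, late: {late:.2f}")
    ```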

  14. CONSORT to community: translation of an RCT to a large-scale community intervention and learnings from evaluation of the upscaled program.

    PubMed

    Moores, Carly Jane; Miller, Jacqueline; Perry, Rebecca Anne; Chan, Lily Lai Hang; Daniels, Lynne Allison; Vidgen, Helen Anna; Magarey, Anthea Margaret

    2017-11-29

    Translation encompasses the continuum from clinical efficacy to widespread adoption within the healthcare service and ultimately routine clinical practice. The Parenting, Eating and Activity for Child Health (PEACH™) program has previously demonstrated clinical effectiveness in the management of child obesity, and has been recently implemented as a large-scale community intervention in Queensland, Australia. This paper aims to describe the translation of the evaluation framework from a randomised controlled trial (RCT) to a large-scale community intervention (PEACH™ QLD). Tensions between the RCT paradigm and implementation research will be discussed along with lived evaluation challenges, responses to overcome these, and key learnings for future evaluation conducted at scale. The translation of evaluation from the PEACH™ RCT to the large-scale community intervention PEACH™ QLD is described. While the CONSORT Statement was used to report findings from two previous RCTs, the REAIM framework was more suitable for the evaluation of upscaled delivery of the PEACH™ program. Evaluation of PEACH™ QLD was undertaken during the project delivery period from 2013 to 2016. Experiential learnings from conducting the evaluation of PEACH™ QLD according to the described evaluation framework are presented for the purposes of informing the future evaluation of upscaled programs. Evaluation changes in response to real-time changes in the delivery of the PEACH™ QLD Project were necessary at stages during the project term. Key evaluation challenges encountered included the collection of complete evaluation data from a diverse and geographically dispersed workforce and the systematic collection of process evaluation data in real time to support program changes during the project. Evaluation of large-scale community interventions in the real world is challenging and divergent from RCTs, which are rigorously evaluated within a more tightly controlled clinical research setting. Constructs explored in an RCT are inadequate in describing the enablers and barriers of upscaled community program implementation. Methods for data collection, analysis and reporting also require consideration. We present a number of experiential reflections and suggestions for the successful evaluation of future upscaled community programs, which are scarcely reported in the literature. PEACH™ QLD was retrospectively registered with the Australian New Zealand Clinical Trials Registry on 28 February 2017 (ACTRN12617000315314).

  15. Hope for the Forests? Habitat Resiliency Illustrated in the Face of Climate Change Using Fine-Scale Modeling

    NASA Astrophysics Data System (ADS)

    Flint, L. E.; Flint, A. L.; Weiss, S. B.; Micheli, E. R.

    2010-12-01

    In the face of rapid climate change, fine-scale predictions of landscape change are of extreme interest to land managers who endeavor to develop long-term adaptive strategies for maintaining biodiversity and ecosystem services. Global climate model (GCM) outputs, which generally focus on estimated increases in air temperature, are increasingly applied to species habitat distribution models. For sensitive species subject to climate change, habitat models predict significant migration (either northward or towards higher elevations), or complete extinction. Current studies typically rely on large spatial scale GCM projections (> 10 km) of changes in precipitation and air temperature: at this scale, these models necessarily neglect subtleties of topographic shading, geomorphic expression of the landscape, and fine-scale differences in soil properties - data that is readily available at meaningful local scales. Recent advances in modeling take advantage of available soils, geology, and topographic data to construct watershed-scale scenarios using GCM inputs and result in improved correlations of vegetation distribution with temperature. For this study, future climate projections were downscaled to 270-m and applied to a physically-based hydrologic model to calculate future changes in recharge, runoff, and climatic water deficit (CWD) for basins draining into the northern San Francisco Bay. CWD was analyzed for mapped vegetation types to evaluate the range of CWD for historic time periods in comparison to future time periods. For several forest communities (including blue oak woodlands, montane hardwoods, Douglas-fir, and coast redwood) existing landscape area exhibiting suitable CWD diminishes by up to 80 percent in the next century, with a trend towards increased CWD throughout the region. However, no forest community loses all suitable habitat, with islands of potential habitat primarily remaining on north-facing slopes and deeper soils. Creation of new suitable habitat is also predicted throughout the region. Results have direct application to management issues of habitat connectivity, forest land protection and acquisition, and active management solutions such as transplanting or assisted migration. Although this analysis considers only one driver of forest habitat distribution, consideration of hydrologic derivatives at a fine scale explains current forest community distributions and provides a far more informed perspective on potential future forest distributions. Results demonstrate the utility of fine-scale modeling and provide landscape managers and conservation agencies valuable management tools in fine-scale future forest scenarios and a framework for evaluating forest resiliency in a changing climate.
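
    Climatic water deficit, the central metric here, is commonly computed as the accumulated amount by which potential evapotranspiration (PET) exceeds actual evapotranspiration (AET). The sketch below uses illustrative monthly values for a single grid cell; in the study itself PET and AET come from a fine-scale (270-m) water-balance model.

    ```python
    import numpy as np

    # Illustrative monthly values (mm) for one grid cell; real applications derive
    # PET from temperature/radiation and AET from a soil-water balance at fine scale.
    pet = np.array([20, 25, 45, 70, 100, 130, 150, 140, 100, 60, 30, 20], dtype=float)
    aet = np.array([20, 25, 45, 65,  80,  70,  40,  30,  50, 55, 30, 20], dtype=float)

    cwd_annual = np.sum(pet - aet)   # climatic water deficit: unmet evaporative demand
    print(f"annual CWD: {cwd_annual:.0f} mm")
    ```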

  16. From J. J. Thomson to FAIR, what do we learn from Large-Scale Mass and Half-Life Measurements of Bare and Few-Electron Ions?

    NASA Astrophysics Data System (ADS)

    Münzenberg, Gottfried; Geissel, Hans; Litvinov, Yuri A.

    2010-04-01

    This contribution is based on the combination of the talks "What can we learn from large-scale mass measurements," "Present and future experiments with stored exotic nuclei at relativistic energies," and "Beta decay of highly-charged ions." Studying the nuclear mass surface gives information on the evolution of nuclear structure such as nuclear shells, the onset of deformation and the drip-lines. Previously, most of the masses far off stability have been obtained from decay data. Modern methods allow direct mass measurements. They are much more sensitive, down to single atoms, give access to short-lived species and have high accuracy. Large-scale explorations of the nuclear mass surface are ideally performed with the combination of the in-flight FRagment Separator FRS and the Experimental Storage Ring ESR. After a brief historic introduction, selected examples such as the evolution of shell closures far off stability and the proton-neutron interaction will be discussed in the framework of our data. Recently, the experiments have been extended and led to the discovery of new heavy neutron-rich isotopes along with their mass and lifetime measurements. Storage rings operated at relativistic energies are a unique tool to study the radioactive decay of bare or few-electron atomic nuclei. New features observed in the analysis of stored circulating mother and daughter ions, including oscillations in the decay curves of hydrogen-like nuclei, will be addressed. Future experiments with NUSTAR at FAIR will further extend our knowledge to the borderlines of nuclear existence.

  17. Applications for Micrographics in Large Scale Information Systems of the Future. Volume I: Part I. Summary. Part II. Five-Year Plan for DDC Micrographic Development Actions.

    ERIC Educational Resources Information Center

    Information Dynamics Corp., Reading, MA.

    A study intended to provide the Defense Documentation Center (DDC) with a five-year plan for the development of improved and new microfiche products, services, and production capabilities is summarized in this report. In addition, the major findings, conclusions, and recommendations developed during the study are noted. The results of the research…

  18. A study of the efficiency of hydrogen liquefaction. [jet aircraft applications

    NASA Technical Reports Server (NTRS)

    Baker, C. R.; Shaner, R. L.

    1976-01-01

    The search for an environmentally acceptable fuel to eventually replace petroleum-based fuels for long-range jet aircraft has singled out liquid hydrogen as an outstanding candidate. Hydrogen liquefaction is discussed, along with the effect of several operating parameters on process efficiency. A feasible large-scale commercial hydrogen liquefaction facility based on the results of the efficiency study is described. Potential future improvements in hydrogen liquefaction are noted.

  19. Towards a global water scarcity risk assessment framework: using scenarios and risk distributions

    NASA Astrophysics Data System (ADS)

    Veldkamp, Ted; Wada, Yoshihide; Aerts, Jeroen; Ward, Philip

    2016-04-01

    Over the past decades, changing hydro-climatic and socioeconomic conditions have led to increased water scarcity problems. A large number of studies have shown that these water scarcity conditions will worsen in the near future. Despite numerous calls for risk-based assessments of water scarcity, a framework that includes UNISDR's definition of risk does not yet exist at the global scale. This study provides a first step towards such a risk-based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change projections and socioeconomic scenarios. Our study highlights that water scarcity risk increases under all future scenarios, affecting up to >56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweighs the impacts of climate change at global and regional scales. Using a risk-based method to assess water scarcity in terms of Expected Annual Exposed Population, we show the results to be less sensitive than traditional water scarcity assessments to the use of fixed thresholds to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, where deviations increase up to 50% of estimated risk levels. Covering hazard, exposure, and vulnerability, risk-based methods are well suited to assess water scarcity adaptation. Completing the presented risk framework therefore offers water managers a promising perspective for increasing water security in a well-informed and adaptive manner.
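
    As a rough illustration of the risk-based approach sketched in the abstract, the following snippet fits a Gamma distribution to a synthetic series of annual per-capita water availability and multiplies the probability of falling below a scarcity threshold by the exposed population, giving a toy Expected Annual Exposed Population. The 1000 m³/cap/yr Falkenmark threshold and all data are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy import stats

def expected_annual_exposed_population(availability, population, threshold=1000.0):
    """Toy risk calculation in the spirit of the abstract: fit a Gamma
    distribution to annual per-capita water availability (m^3/cap/yr) for one
    region, then multiply the probability of falling below a scarcity
    threshold by the exposed population."""
    shape, loc, scale = stats.gamma.fit(availability, floc=0)
    p_scarce = stats.gamma.cdf(threshold, shape, loc=loc, scale=scale)
    return p_scarce * population

rng = np.random.default_rng(1)
annual_availability = rng.gamma(shape=4.0, scale=400.0, size=30)   # 30 synthetic years
print(expected_annual_exposed_population(annual_availability, population=5e6))
```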

  20. Production of black holes and their angular momentum distribution in models with split fermions

    NASA Astrophysics Data System (ADS)

    Dai, De-Chang; Starkman, Glenn D.; Stojkovic, Dejan

    2006-05-01

    In models with TeV-scale gravity it is expected that mini black holes will be produced in near-future accelerators. On the other hand, TeV-scale gravity is plagued with many problems like fast proton decay, unacceptably large n-n¯ oscillations, flavor changing neutral currents, large mixing between leptons, etc. Most of these problems can be solved if different fermions are localized at different points in the extra dimensions. We study the cross section for the production of black holes and their angular momentum distribution in these models with “split” fermions. We find that, for a fixed value of the fundamental mass scale, the total production cross section is reduced compared with models where all the fermions are localized at the same point in the extra dimensions. Fermion splitting also implies that the bulk component of the black hole angular momentum must be taken into account in studies of the black hole decay via Hawking radiation.

  1. Thermodynamic and dynamic contributions to future changes in summer precipitation over Northeast Asia and Korea: a multi-RCM study

    NASA Astrophysics Data System (ADS)

    Lee, Donghyun; Min, Seung-Ki; Jin, Jonghun; Lee, Ji-Woo; Cha, Dong-Hyun; Suh, Myoung-Seok; Ahn, Joong-Bae; Hong, Song-You; Kang, Hyun-Suk; Joh, Minsu

    2017-12-01

    This study examines future changes in precipitation over Northeast Asia and Korea using five regional climate model (RCM) simulations driven by a single global climate model (GCM) under two representative concentration pathway (RCP) emission scenarios. Focusing on the summer season (June-July-August), when heavy rains dominate in this region, future changes in precipitation and associated variables including temperature, moisture, and winds are analyzed by comparing future conditions (2071-2100) with the present climate (1981-2005). Physical mechanisms are examined by analyzing the moisture flux convergence at the 850 hPa level, which is found to have a close relationship to precipitation, and by assessing the contributions of the thermodynamic effect (TH, moisture increase due to warming) and the dynamic effect (DY, atmospheric circulation change) to changes in the moisture flux convergence. Overall background warming and moistening are projected over Northeast Asia with good inter-RCM agreement, indicating the dominant influence of the driving GCM. The RCMs also consistently project increases in the frequency of heavy rains and the intensification of extreme precipitation over South Korea. Analysis of the moisture flux convergence reveals competing impacts between TH and DY. The TH effect contributes to the overall increases in mean precipitation over Northeast Asia and in extreme precipitation over South Korea, irrespective of models and scenarios. However, the DY effect is found to induce local-scale precipitation decreases over the central part of the Korean Peninsula, with large inter-RCM and inter-scenario differences. Composite analysis of daily anomaly synoptic patterns indicates that extreme precipitation events are mainly associated with the southwest-to-northeast evolution of large-scale low-pressure systems in both present and future climates.
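
    A common way to write the TH/DY separation described above is shown here only as a generic sketch (the paper's exact formulation may differ): the change in 850-hPa moisture flux convergence between the present (p) and future (f) climates is split into a moisture term, a circulation term and a nonlinear residual.

```latex
% Sketch of a TH/DY decomposition of the change in 850-hPa moisture flux
% convergence; \Delta(\cdot) = (\cdot)_f - (\cdot)_p.
\Delta\!\left[-\nabla\cdot(q\,\mathbf{v})\right]
  \approx \underbrace{-\nabla\cdot\!\left(\Delta q\,\mathbf{v}_p\right)}_{\text{TH: moistening, fixed winds}}
  \;\underbrace{-\,\nabla\cdot\!\left(q_p\,\Delta\mathbf{v}\right)}_{\text{DY: circulation change}}
  \;\underbrace{-\,\nabla\cdot\!\left(\Delta q\,\Delta\mathbf{v}\right)}_{\text{nonlinear residual}}
```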

  2. Advancing flood risk analysis by integrating adaptive behaviour in large-scale flood risk assessments

    NASA Astrophysics Data System (ADS)

    Haer, T.; Botzen, W.; Aerts, J.

    2016-12-01

    In the last four decades the global population living in the 1/100-year flood zone has doubled from approximately 500 million to a little less than 1 billion people. Urbanization in low-lying, flood-prone cities further increases the exposed assets, such as buildings and infrastructure. Moreover, climate change will further exacerbate flood risk in the future. Accurate flood risk assessments are important to inform policy-makers and society about current and future flood risk levels. However, these assessments suffer from a major flaw in the way they estimate flood vulnerability and the adaptive behaviour of individuals and governments. Current flood risk projections commonly either assume that vulnerability remains constant or try to mimic vulnerability by incorporating an external scenario. Such a static approach leads to a misrepresentation of future flood risk, as humans respond adaptively to flood events, flood risk communication, and incentives to reduce risk. In our study, we integrate adaptive behaviour in a large-scale European flood risk framework through an agent-based modelling approach. This allows for the inclusion of heterogeneous agents, which dynamically respond to each other and to a changing environment. We integrate state-of-the-art flood risk maps based on climate scenarios (RCPs) and socio-economic scenarios (SSPs) with government and household agents, which behave autonomously based on (micro-)economic behaviour rules. We show for the first time that excluding adaptive behaviour leads to a major misrepresentation of future flood risk. The methodology is applied to flood risk, but has similar implications for other research in the field of natural hazards. While more research is needed, this multi-disciplinary study advances our understanding of how future flood risk will develop.
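
    To make the agent-based idea concrete, here is a deliberately minimal household-adaptation loop. It is not the authors' model: the decision rule (protect when discounted expected avoided damage exceeds the measure cost), the rising hazard, and every parameter are illustrative assumptions, but it shows how endogenous adaptation bends the risk trajectory relative to a static-vulnerability baseline.

```python
import numpy as np

rng = np.random.default_rng(2)

# Minimal agent-based sketch: households weigh discounted expected flood
# damage against the cost of a protection measure that reduces damage;
# flood probability rises slowly over time as a stand-in for climate change.
n_households, years = 1000, 50
value = rng.uniform(1e5, 5e5, n_households)      # asset value per household
adapted = np.zeros(n_households, dtype=bool)
measure_cost, damage_frac, reduction = 8_000.0, 0.25, 0.7
horizon, discount = 20, 0.04

risk = np.zeros(years)
for t in range(years):
    p_flood = 0.01 * (1 + 0.02 * t)              # slowly increasing hazard
    annuity = (1 - (1 + discount) ** -horizon) / discount
    expected_damage = p_flood * damage_frac * value * annuity
    # boundedly rational adoption: protect if expected avoided damage > cost
    adopt = (~adapted) & (expected_damage * reduction > measure_cost)
    adapted |= adopt
    eff_damage = damage_frac * value * np.where(adapted, 1 - reduction, 1.0)
    risk[t] = p_flood * eff_damage.sum()         # expected annual damage

baseline = 0.01 * (1 + 0.02 * (years - 1)) * damage_frac * value.sum()
print(f"final adaptation uptake: {adapted.mean():.0%}, "
      f"risk vs. static-vulnerability baseline: {risk[-1] / baseline:.2f}x")
```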

  3. Modelling the large-scale redshift-space 3-point correlation function of galaxies

    NASA Astrophysics Data System (ADS)

    Slepian, Zachary; Eisenstein, Daniel J.

    2017-08-01

    We present a configuration-space model of the large-scale galaxy 3-point correlation function (3PCF) based on leading-order perturbation theory and including redshift-space distortions (RSD). This model should be useful in extracting distance-scale information from the 3PCF via the baryon acoustic oscillation method. We include the first redshift-space treatment of biasing by the baryon-dark matter relative velocity. Overall, on large scales the effect of RSD is primarily a renormalization of the 3PCF that is roughly independent of both physical scale and triangle opening angle; for our adopted Ωm and bias values, the rescaling is a factor of ˜1.8. We also present an efficient scheme for computing 3PCF predictions from our model, important for allowing fast exploration of the space of cosmological parameters in future analyses.

  4. Technical instrumentation R&D for ILD SiW ECAL large scale device

    NASA Astrophysics Data System (ADS)

    Balagura, V.

    2018-03-01

    Calorimeters with silicon detectors have many unique features and are proposed for several world-leading experiments. We describe the R&D program of the large scale detector element with up to 12 000 readout channels for the International Large Detector (ILD) at the future e+e− ILC collider. The program is focused on the readout front-end electronics embedded inside the calorimeter. The first part, with 2 000 channels and two small silicon sensors, has already been constructed; the full prototype is planned for the beginning of 2018.

  5. Models projecting the fate of fish populations under climate change need to be based on valid physiological mechanisms.

    PubMed

    Lefevre, Sjannie; McKenzie, David J; Nilsson, Göran E

    2017-09-01

    Some recent modelling papers projecting smaller fish sizes and catches in a warmer future are based on erroneous assumptions regarding (i) the scaling of gills with body mass and (ii) the energetic cost of 'maintenance'. Assumption (i) posits that insurmountable geometric constraints prevent respiratory surface areas from growing as fast as body volume. It is argued that these constraints explain allometric scaling of energy metabolism, whereby larger fishes have relatively lower mass-specific metabolic rates. Assumption (ii) concludes that when fishes reach a certain size, basal oxygen demands will not be met, because of assumption (i). We here demonstrate unequivocally, by applying accepted physiological principles with reference to the existing literature, that these assumptions are not valid. Gills are folded surfaces, where the scaling of surface area to volume is not constrained by spherical geometry. The gill surface area can, in fact, increase linearly in proportion to gill volume and body mass. We cite the large body of evidence demonstrating that respiratory surface areas in fishes reflect metabolic needs, not vice versa, which explains the large interspecific variation in scaling of gill surface areas. Finally, we point out that future studies basing their predictions on models should incorporate factors for scaling of metabolic rate and for temperature effects on metabolism, which agree with measured values, and should account for interspecific variation in scaling and temperature effects. It is possible that some fishes will become smaller in the future, but to make reliable predictions the underlying mechanisms need to be identified and sought elsewhere than in geometric constraints on gill surface area. Furthermore, to ensure that useful information is conveyed to the public and policymakers about the possible effects of climate change, it is necessary to improve communication and congruity between fish physiologists and fisheries scientists. © 2017 John Wiley & Sons Ltd.
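
    The geometric point at the heart of the argument can be stated in one line. As an illustration not taken from the paper, consider a gill built from N stacked lamellae, each of area a, separated by a constant inter-lamellar spacing t:

```latex
% Toy folded-surface geometry: total respiratory area and occupied volume.
A = N a, \qquad V \approx N a t
\quad\Longrightarrow\quad
A \approx \frac{V}{t} \propto V \propto M_b \quad (t\ \text{constant})
```

    In other words, a folded respiratory surface can grow in direct proportion to body mass M_b (scaling exponent 1), rather than with the 2/3 exponent that a smooth, sphere-like surface would impose.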

  6. Translational bioinformatics in the cloud: an affordable alternative

    PubMed Central

    2010-01-01

    With the continued exponential expansion of publicly available genomic data and access to low-cost, high-throughput molecular technologies for profiling patient populations, computational technologies and informatics are becoming vital considerations in genomic medicine. Although cloud computing technology is being heralded as a key enabling technology for the future of genomic research, available case studies are limited to applications in the domain of high-throughput sequence data analysis. The goal of this study was to evaluate the computational and economic characteristics of cloud computing in performing a large-scale data integration and analysis representative of research problems in genomic medicine. We find that the cloud-based analysis compares favorably in both performance and cost in comparison to a local computational cluster, suggesting that cloud computing technologies might be a viable resource for facilitating large-scale translational research in genomic medicine. PMID:20691073

  7. Study of Travelling Interplanetary Phenomena Report

    NASA Astrophysics Data System (ADS)

    Dryer, Murray

    1987-09-01

    Scientific progress on the topic of energy, mass, and momentum transport from the Sun into the heliosphere is contingent upon interdisciplinary and international cooperative efforts on the part of many workers. Summarized here is a report of some highlights of research carried out during the SMY/SMA by the STIP (Study of Travelling Interplanetary Phenomena) Project that included solar and interplanetary scientists around the world. These highlights are concerned with coronal mass ejections from solar flares or erupting prominences (sometimes together); their large-scale consequences in interplanetary space (such as shocks and magnetic 'bubbles'); and energetic particles and their relationship to these large-scale structures. It is concluded that future progress is contingent upon similar international programs assisted by real-time (or near-real-time) warnings of solar activity by cooperating agencies along the lines experienced during the SMY/SMA.

  8. THE IMPACT OF WINTER NH3 EMISSION REDUCTIONS ON INORGANIC PARTICULATE MATTER UNDER PRESENT AND FUTURE REGULATED CONDITIONS

    EPA Science Inventory

    Recent regulation by the US Environmental Protection Agency requires large-scale emission reductions of NOx and SO2. This study estimates the impact of these changes on the sensitivity of PM2.5 to NH3 emission reductions and the reduce...

  9. Comparing Future Teachers' Beliefs across Countries: Approximate Measurement Invariance with Bayesian Elastic Constraints for Local Item Dependence and Differential Item Functioning

    ERIC Educational Resources Information Center

    Braeken, Johan; Blömeke, Sigrid

    2016-01-01

    Using data from the international Teacher Education and Development Study: Learning to Teach Mathematics (TEDS-M), the measurement equivalence of teachers' beliefs across countries is investigated for the case of "mathematics-as-a-fixed-ability". Measurement equivalence is a crucial topic in all international large-scale assessments and…

  10. The Future of Stellar Populations Studies in the Milky Way and the Local Group

    NASA Astrophysics Data System (ADS)

    Majewski, Steven R.

    2010-04-01

    The last decade has seen enormous progress in understanding the structure of the Milky Way and neighboring galaxies via the production of large-scale digital surveys of the sky like 2MASS and SDSS, as well as specialized, counterpart imaging surveys of other Local Group systems. Apart from providing snapshots of galaxy structure, these “cartographic” surveys lend insights into the formation and evolution of galaxies when supplemented with additional data (e.g., spectroscopy, astrometry) and when referenced to theoretical models and simulations of galaxy evolution. These increasingly sophisticated simulations are making ever more specific predictions about the detailed chemistry and dynamics of stellar populations in galaxies. To fully exploit, test and constrain these theoretical ventures demands similar commitments of observational effort as have been applied to the previous imaging surveys, to fill out other dimensions of parameter space with statistically significant intensity. Fortunately, the future of large-scale stellar population studies is bright, with a number of grand projects on the horizon that collectively will contribute a breathtaking volume of information on individual stars in Local Group galaxies. These projects include: (1) additional imaging surveys, such as Pan-STARRS, SkyMapper and LSST, which, apart from providing deep, multicolor imaging, yield time series data useful for revealing variable stars (including critical standard candles, like RR Lyrae variables) and creating large-scale, deep proper motion catalogs; (2) higher accuracy, space-based astrometric missions, such as Gaia and SIM-Lite, which stand to provide critical, high precision dynamical data on stars in the Milky Way and its satellites; and (3) large-scale spectroscopic surveys provided by RAVE, APOGEE, HERMES, LAMOST, and the Gaia spectrometer, which will yield not only enormous numbers of stellar radial velocities, but extremely comprehensive views of the chemistry of stellar populations. Meanwhile, previously dust-obscured regions of the Milky Way will continue to be systematically exposed via large infrared surveys underway or on the way, such as the various GLIMPSE surveys from Spitzer's IRAC instrument, UKIDSS, APOGEE, JASMINE and WISE.

  11. : “Developing Regional Modeling Techniques Applicable for Simulating Future Climate Conditions in the Carolinas”

    EPA Science Inventory

    Global climate models (GCMs) are currently used to obtain information about future changes in the large-scale climate. However, such simulations are typically done at coarse spatial resolutions, with model grid boxes on the order of 100 km on a horizontal side. Therefore, techniq...

  12. Importance of Anthropogenic Aerosols for Climate Prediction: a Study on East Asian Sulfate Aerosols

    NASA Astrophysics Data System (ADS)

    Bartlett, R. E.; Bollasina, M. A.

    2017-12-01

    Climate prediction is vital to ensure that we are able to adapt to our changing climate. Understandably, the main focus for such prediction is greenhouse gas forcing, as this will be the main anthropogenic driver of long-term global climate change; however, other forcings could still be important. Atmospheric aerosols represent one such forcing, especially in regions with high present-day aerosol loading such as Asia; yet, uncertainties in their future emissions are under-sampled by commonly used climate forcing projections, such as the Representative Concentration Pathways (RCPs). Globally, anthropogenic aerosols exert a net cooling, but their effects show large variation at regional scales. Studies have shown that aerosols impact locally upon temperature, precipitation and hydroclimate, and also upon larger-scale atmospheric circulation (for example, the Asian monsoon), with implications for climate remote from aerosol sources. We investigate how future climate could evolve differently given the same greenhouse gas forcing pathway but differing aerosol emissions. Specifically, we use climate modelling experiments (using HadGEM2-ES) of two scenarios based upon RCP2.6 greenhouse gas forcing but with large differences in sulfur dioxide emissions over East Asia. Results show that increased sulfate aerosols (associated with increased sulfur dioxide) lead to large regional cooling through aerosol-radiation and aerosol-cloud interactions. Focussing on dynamical mechanisms, we explore the consequences of this cooling for the Asian summer and winter monsoons. In addition to local temperature and precipitation changes, we find significant changes to large-scale atmospheric circulation. Wave-like responses to upper-level atmospheric changes propagate across the northern hemisphere with far-reaching effects on surface climate, for example, cooling over Europe. Within the tropics, we find alterations to the zonal circulation (notably, shifts in the Pacific Walker cell) and to monsoon systems outside of Asia. These results indicate that anthropogenic aerosols have significant climate impacts against a background of greenhouse gas-induced climate change, and thus represent a key source of uncertainty in near-term climate projection that should be seriously considered in future climate assessments.

  13. North American extreme temperature events and related large scale meteorological patterns: A review of statistical methods, dynamics, modeling, and trends

    DOE PAGES

    Grotjahn, Richard; Black, Robert; Leung, Ruby; ...

    2015-05-22

    This paper reviews research approaches and open questions regarding data, statistical analyses, dynamics, modeling efforts, and trends in relation to temperature extremes. Our specific focus is upon extreme events of short duration (roughly less than 5 days) that affect parts of North America. These events are associated with large-scale meteorological patterns (LSMPs). Methods used to define extreme event statistics and to identify and connect LSMPs to extreme temperatures are presented. Recent advances in statistical techniques can connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. A wide array of LSMPs, ranging from synoptic to planetary scale phenomena, have been implicated as contributors to extreme temperature events. Current knowledge about the physical nature of these contributions and the dynamical mechanisms leading to the implicated LSMPs is incomplete. There is a pressing need for (a) systematic study of the physics of LSMP life cycles and (b) comprehensive model assessment of LSMP-extreme temperature event linkages and LSMP behavior. Generally, climate models capture the observed heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency, underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Climate models have been used to investigate past changes and project future trends in extreme temperatures. Overall, modeling studies have identified important mechanisms such as the effects of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs more specifically to understand the role of LSMPs in past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated, so more research is needed to understand the limitations of climate models and improve model skill in simulating extreme temperatures and their associated LSMPs. The paper concludes with unresolved issues and research questions.

  14. Performance of the first Japanese large-scale facility for radon inhalation experiments with small animals.

    PubMed

    Ishimori, Yuu; Mitsunobu, Fumihiro; Yamaoka, Kiyonori; Tanaka, Hiroshi; Kataoka, Takahiro; Sakoda, Akihiro

    2011-07-01

    A radon test facility for small animals was developed in order to increase the statistical validity of differences in the biological response in various radon environments. This paper describes the performance of that facility, the first large-scale facility of its kind in Japan. The facility has the capacity to conduct approximately 150 mouse-scale tests at the same time. The apparatus for exposing small animals to radon has six animal chamber groups with five independent cages each. Different radon concentrations are available in each animal chamber group. Because the first target of this study is to examine the in vivo behaviour of radon and its effects, the major functions for controlling radon and eliminating thoron were examined experimentally. Additionally, radon progeny concentrations and their particle size distributions in the cages were also examined experimentally, to be considered in future projects.

  15. Computational aerodynamics development and outlook /Dryden Lecture in Research for 1979/

    NASA Technical Reports Server (NTRS)

    Chapman, D. R.

    1979-01-01

    Some past developments and current examples of computational aerodynamics are briefly reviewed. An assessment is made of the requirements on future computer memory and speed imposed by advanced numerical simulations, giving emphasis to the Reynolds averaged Navier-Stokes equations and to turbulent eddy simulations. Experimental scales of turbulence structure are used to determine the mesh spacings required to adequately resolve turbulent energy and shear. Assessment also is made of the changing market environment for developing future large computers, and of the projections of micro-electronics memory and logic technology that affect future computer capability. From the two assessments, estimates are formed of the future time scale in which various advanced types of aerodynamic flow simulations could become feasible. Areas of research judged especially relevant to future developments are noted.

  16. Impacts of elevated temperature on ant species, communities and ecological roles at two temperate forests in Eastern North America

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunn, Robert

    2014-04-01

    Over the course of five years we have established a long-term array of warming chambers at Duke and Harvard Forest that simulate future conditions with regard to temperature. In these chambers, we have studied ants, other animal taxa, fungi, bacteria and plants and their responses to the treatments. We have coupled these studies with lab experiments, large-scale observations, and models to contextualize our results. Finally, we have developed integrative models of the future distribution of species and their consequences as a result of warming in eastern North America and more generally.

  17. Extreme weather: Subtropical floods and tropical cyclones

    NASA Astrophysics Data System (ADS)

    Shaevitz, Daniel A.

    Extreme weather events have a large effect on society. As such, it is important to understand these events and to project how they may change in a future, warmer climate. The aim of this thesis is to develop a deeper understanding of two types of extreme weather events: subtropical floods and tropical cyclones (TCs). In the subtropics, the latitude is high enough that quasi-geostrophic dynamics are at least qualitatively relevant, while low enough that moisture may be abundant and convection strong. Extratropical extreme precipitation events are usually associated with large-scale flow disturbances, strong ascent, and large latent heat release. In the first part of this thesis, I examine the possible triggering of convection by the large-scale dynamics and investigate the coupling between the two. Specifically, two examples of extreme precipitation events in the subtropics are analyzed: the 2010 and 2014 floods of India and Pakistan and the 2015 flood of Texas and Oklahoma. I invert the quasi-geostrophic omega equation to decompose the large-scale vertical motion profile into components due to synoptic forcing and diabatic heating. Additionally, I present model results from within the Column Quasi-Geostrophic framework. A single-column model and a cloud-resolving model are forced with the large-scale forcings (other than large-scale vertical motion) computed from the quasi-geostrophic omega equation with input data from a reanalysis data set, and the large-scale vertical motion is diagnosed interactively with the simulated convection. It is found that convection was triggered primarily by mechanically forced orographic ascent over the Himalayas during the India/Pakistan floods and by upper-level potential vorticity disturbances during the Texas/Oklahoma flood. Furthermore, a climate attribution analysis was conducted for the Texas/Oklahoma flood; it is found that anthropogenic climate change was responsible for only a small portion of the rainfall during the event, but that the intensity of such an event may be greatly increased if it occurs in a future climate. In the second part of this thesis, I examine the ability of high-resolution global atmospheric models to simulate TCs. Specifically, I present an intercomparison of several models' ability to simulate the global characteristics of TCs in the current climate. This is a necessary first step before using these models to project future changes in TCs. Overall, the models were able to reproduce the geographic distribution of TCs reasonably well, with some of the models performing remarkably well. The intensity of TCs varied widely between the models, with some of this difference being due to model resolution.
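
    For reference, the standard textbook form of the quasi-geostrophic omega equation being inverted here is given below (following, e.g., Holton; the thesis' exact formulation and boundary conditions may differ). Inverting the equation with one right-hand-side term retained at a time attributes the vertical motion to synoptic forcing or to diabatic heating.

```latex
% QG omega equation: sigma is the static stability, f_0 the Coriolis
% parameter, V_g the geostrophic wind, zeta_g the geostrophic relative
% vorticity, Phi the geopotential, J the diabatic heating rate, kappa = R/c_p.
% The first two right-hand-side terms are the "synoptic" forcing
% (differential vorticity advection and the Laplacian of thermal advection);
% the last is the diabatic forcing.
\left(\sigma\nabla^{2} + f_0^{2}\,\frac{\partial^{2}}{\partial p^{2}}\right)\omega
  = f_0\,\frac{\partial}{\partial p}\!\left[\mathbf{V}_g\cdot\nabla\left(\zeta_g + f\right)\right]
  + \nabla^{2}\!\left[\mathbf{V}_g\cdot\nabla\!\left(-\frac{\partial\Phi}{\partial p}\right)\right]
  - \frac{\kappa}{p}\,\nabla^{2} J
```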

  18. Climate change impacts on risks of groundwater pollution by herbicides: a regional scale assessment

    NASA Astrophysics Data System (ADS)

    Steffens, Karin; Moeys, Julien; Lindström, Bodil; Kreuger, Jenny; Lewan, Elisabet; Jarvis, Nick

    2014-05-01

    Groundwater contributes nearly half of the Swedish drinking water supply, which therefore needs to be protected both under present and future climate conditions. Pesticides are sometimes found in Swedish groundwater in concentrations exceeding the EU-drinking water limit and thus constitute a threat. The aim of this study was to assess the present and future risks of groundwater pollution at the regional scale by currently approved herbicides. We identified representative combinations of major crop types and their specific herbicide usage (product, dose and application timing) based on long-term monitoring data from two agricultural catchments in the South-West of Sweden. All these combinations were simulated with the regional version of the pesticide fate model MACRO (called MACRO-SE) for the periods 1970-1999 and 2070-2099 for a major crop production region in South West Sweden. To represent the uncertainty in future climate data, we applied a five-member ensemble based on different climate model projections downscaled with the RCA3-model (Swedish Meteorological and Hydrological Institute). In addition to the direct impacts of changes in the climate, the risks of herbicide leaching in the future will also be affected by likely changes in weed pressure and land use and management practices (e.g. changes in crop rotations and application timings). To assess the relative importance of such factors we performed a preliminary sensitivity analysis which provided us with a hierarchical structure for constructing future herbicide use scenarios for the regional scale model runs. The regional scale analysis gave average concentrations of herbicides leaching to groundwater for a large number of combinations of soils, crops and compounds. The results showed that future scenarios for herbicide use (more autumn-sown crops, more frequent multiple applications on one crop, and a shift from grassland to arable crops such as maize) imply significantly greater risks of herbicide leaching to groundwater in a changing climate, and that these indirect effects outweigh the direct effects of changes in climate driving variables. Due to the large uncertainties in climate change impact assessments, drawing firm conclusions is not possible, but this type of analysis provides indications of likely future concerns and can be used as an early-warning tool to inform the general public, responsible public authorities and decision makers.

  19. III. FROM SMALL TO BIG: METHODS FOR INCORPORATING LARGE SCALE DATA INTO DEVELOPMENTAL SCIENCE.

    PubMed

    Davis-Kean, Pamela E; Jager, Justin

    2017-06-01

    For decades, developmental science has been based primarily on relatively small-scale data collections with children and families. Part of the reason for the dominance of this type of data collection is the complexity of collecting cognitive and social data on infants and small children. These small data sets are limited in both the power to detect differences and the demographic diversity to generalize clearly and broadly. Thus, in this chapter we will discuss the value of using existing large-scale data sets to test the complex questions of child development and how to develop future large-scale data sets that are both representative and can answer the important questions of developmental scientists. © 2017 The Society for Research in Child Development, Inc.

  20. Transmission Infrastructure | Energy Analysis | NREL

    Science.gov Websites

    This NREL Energy Analysis page addresses transmission infrastructure planning and expansion to enable large-scale deployment of renewable energy in the future, including aggregating geothermal with other complementary generating technologies in renewable energy zones, and notes the roles of FERC, NERC and the regional entities, transmission providers, generating companies, and utilities.

  1. A Combined Ethical and Scientific Analysis of Large-scale Tests of Solar Climate Engineering

    NASA Astrophysics Data System (ADS)

    Ackerman, T. P.

    2017-12-01

    Our research group recently published an analysis of the combined ethical and scientific issues surrounding large-scale testing of stratospheric aerosol injection (SAI; Lenferna et al., 2017, Earth's Future). We are expanding this study in two directions. The first is extending this same analysis to other geoengineering techniques, particularly marine cloud brightening (MCB). MCB has substantial differences to SAI in this context because MCB can be tested over significantly smaller areas of the planet and, following injection, has a much shorter lifetime of weeks as opposed to years for SAI. We examine issues such as the role of intent, the lesser of two evils, and the nature of consent. In addition, several groups are currently considering climate engineering governance tools such as a code of ethics and a registry. We examine how these tools might influence climate engineering research programs and, specifically, large-scale testing. The second direction of expansion is asking whether ethical and scientific issues associated with large-scale testing are so significant that they effectively preclude moving ahead with climate engineering research and testing. Some previous authors have suggested that no research should take place until these issues are resolved. We think this position is too draconian and consider a more nuanced version of this argument. We note, however, that there are serious questions regarding the ability of the scientific research community to move to the point of carrying out large-scale tests.

  2. Air-quality in the mid-21st century for the city of Paris under two climate scenarios; from regional to local scale

    NASA Astrophysics Data System (ADS)

    Markakis, K.; Valari, M.; Colette, A.; Sanchez, O.; Perrussel, O.; Honore, C.; Vautard, R.; Klimont, Z.; Rao, S.

    2014-01-01

    Ozone and PM2.5 concentrations over the city of Paris are modeled with the CHIMERE air-quality model at 4 km × 4 km horizontal resolution for two future emission scenarios. A high-resolution (1 km × 1 km) emission projection to 2020 for the greater Paris region was developed by local experts (AIRPARIF) and is further extended to the year 2050 based on regional-scale emission projections developed by the Global Energy Assessment. Model evaluation is performed based on a 10-year control simulation. Ozone is in very good agreement with measurements, while PM2.5 is underestimated by 20% over the urban area, mainly due to a large wet bias in wintertime precipitation. A significant increase of maximum ozone relative to present-time levels over Paris is modeled under the "business as usual" scenario (+7 ppb), while a more optimistic mitigation scenario leads to a moderate ozone decrease (-3.5 ppb) in year 2050. These results are substantially different from previous regional-scale projections, where 2050 ozone is found to decrease under both future scenarios. A sensitivity analysis showed that this difference arises because ozone formation over Paris in the current urban-scale study is driven by VOC-limited chemistry, whereas at the regional scale ozone formation occurs under NOx-sensitive conditions. This explains why the sharp NOx reductions implemented in the future scenarios have a different effect on ozone projections at different scales. In rural areas, projections at both scales yield similar results, showing that the longer time-scale processes of emission transport and ozone formation are less sensitive to model resolution. PM2.5 concentrations decrease by 78% and 89% under the "business as usual" and "mitigation" scenarios, respectively, compared to the present-time period. The reduction is much more prominent over the urban part of the domain due to the effective reductions of road transport and residential emissions, resulting in the smoothing of the large urban increment modelled in the control simulation.

  3. Climate change impact assessment on food security in Indonesia

    NASA Astrophysics Data System (ADS)

    Ettema, Janneke; Aldrian, Edvin; de Bie, Kees; Jetten, Victor; Mannaerts, Chris

    2013-04-01

    As Indonesia is the world's fourth most populous country, food security is a persistent challenge. The potential impact of future climate change on the agricultural sector needs to be addressed in order to allow early implementation of mitigation strategies. The complex island topography and local sea-land-air interactions cannot adequately be represented in large-scale general circulation models (GCMs) nor resolved by TRMM observations. Downscaling is needed. Using meteorological observations and a simple statistical downscaling tool, local future projections are derived from state-of-the-art, large-scale GCM scenarios provided by the CMIP5 project. To support the agricultural sector, information on rainfall and temperature variability in particular is essential. Agricultural production forecasts are influenced by several rainfall and temperature factors, such as rainy and dry season onset, offset and length, but also by daily and monthly minimum and maximum temperatures and rainfall amounts. A simple and an advanced crop model will be used to address the sensitivity of different crops to temperature and rainfall variability, both present-day and future. Java Island is chosen as the case study area: it is the fourth largest island in Indonesia but contains more than half of the nation's population and dominates it politically and economically. The objective is to identify regions at agricultural risk due to changing patterns in precipitation and temperature.
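
    The abstract does not name its "simple statistical downscaling tool", so purely as an illustration of the idea, the sketch below applies empirical quantile mapping: a transfer function built from station observations and the historical GCM series is applied to the future GCM series. All data and parameters here are synthetic assumptions.

```python
import numpy as np

def quantile_map(obs, gcm_hist, gcm_fut, n_q=100):
    """Empirical quantile mapping, one common flavour of simple statistical
    downscaling: correct GCM values so the historical GCM distribution
    matches station observations, then apply the same transfer function
    to the future GCM series."""
    q = np.linspace(0.01, 0.99, n_q)
    obs_q = np.quantile(obs, q)
    gcm_q = np.quantile(gcm_hist, q)
    # map each future GCM value through the empirical transfer function
    return np.interp(gcm_fut, gcm_q, obs_q)

rng = np.random.default_rng(3)
station_rain = rng.gamma(2.0, 8.0, 5000)   # observed daily rainfall (mm)
gcm_hist = rng.gamma(2.0, 5.0, 5000)       # biased coarse-GCM rainfall
gcm_future = rng.gamma(2.2, 5.5, 5000)     # future scenario, same bias
print("raw future mean:", gcm_future.mean().round(1),
      "downscaled mean:", quantile_map(station_rain, gcm_hist, gcm_future).mean().round(1))
```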

  4. Ocean Research Enabled by Underwater Gliders.

    PubMed

    Rudnick, Daniel L

    2016-01-01

    Underwater gliders are autonomous underwater vehicles that profile vertically by changing their buoyancy and use wings to move horizontally. Gliders are useful for sustained observation at relatively fine horizontal scales, especially to connect the coastal and open ocean. In this review, research topics are grouped by time and length scales. Large-scale topics addressed include the eastern and western boundary currents and the regional effects of climate variability. The accessibility of horizontal length scales of order 1 km allows investigation of mesoscale and submesoscale features such as fronts and eddies. Because the submesoscales dominate vertical fluxes in the ocean, gliders have found application in studies of biogeochemical processes. At the finest scales, gliders have been used to measure internal waves and turbulent dissipation. The review summarizes gliders' achievements to date and assesses their future in ocean observation.

  5. Understanding adherence to treatment and physical activity in children with hemophilia: The role of psychosocial factors.

    PubMed

    Bérubé, Sarah; Cloutier-Bergeron, Audrey; Amesse, Claudine; Sultan, Serge

    2017-02-01

    The objective of this study was to identify psychosocial factors that explain the intentions of children and adolescents with hemophilia to adhere to recommendations for self-care. Twenty-four patients with hemophilia A and B, aged 6-18 years, and their parents completed a survey. Measures assessed factors from the theory of planned behavior, physical activity, and medical treatment adherence. The results indicate that past behaviors, attitudes, and subjective norms explained a large proportion of the intention to engage in future nonrecommended physical activity. This study supports the need to investigate motivational factors underlying behaviors in larger-scale studies and identifies targets for future interventions.

  6. Solving large scale structure in ten easy steps with COLA

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J.

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10⁹ M⊙/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10¹¹ M⊙/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.
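
    Schematically, the frame change behind COLA can be written as follows (cosmological prefactors and the choice of time variable are omitted, so this is only a sketch of the idea rather than the paper's exact equations): the trajectory is split into an analytic LPT part plus a residual, and the particle-mesh integrator evolves only the residual.

```latex
% Split the particle trajectory into an LPT piece and a residual ...
\mathbf{x}(t) = \mathbf{x}_{\mathrm{LPT}}(t) + \delta\mathbf{x}(t),
\qquad
\partial_t^{2}\,\delta\mathbf{x} = -\nabla\Phi \;-\; \partial_t^{2}\,\mathbf{x}_{\mathrm{LPT}}
% ... so large scales are carried by (2)LPT even with very few timesteps,
% while the particle-mesh force -\nabla\Phi corrects the small scales.
```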

  7. Future of applied watershed science at regional scales

    Treesearch

    Lee Benda; Daniel Miller; Steve Lanigan; Gordon Reeves

    2009-01-01

    Resource managers must deal increasingly with land use and conservation plans applied at large spatial scales (watersheds, landscapes, states, regions) involving multiple interacting federal agencies and stakeholders. Access to a geographically focused and application-oriented database would allow users in different locations and with different concerns to quickly...

  8. A mesostructured Y zeolite as a superior FCC catalyst--lab to refinery.

    PubMed

    García-Martínez, Javier; Li, Kunhao; Krishnaiah, Gautham

    2012-12-18

    A mesostructured Y zeolite was prepared by a surfactant-templated process at the commercial scale and tested in a refinery, showing superior hydrothermal stability and catalytic cracking selectivity, which demonstrates, for the first time, the promising future of mesoporous zeolites in large scale industrial applications.

  9. Global-scale hydrological response to future glacier mass loss

    NASA Astrophysics Data System (ADS)

    Huss, Matthias; Hock, Regine

    2018-01-01

    Worldwide glacier retreat and associated future runoff changes raise major concerns over the sustainability of global water resources [1-4], but global-scale assessments of glacier decline and the resulting hydrological consequences are scarce [5,6]. Here we compute global glacier runoff changes for 56 large-scale glacierized drainage basins to 2100 and analyse the glacial impact on streamflow. In roughly half of the investigated basins, the modelled annual glacier runoff continues to rise until a maximum ('peak water') is reached, beyond which runoff steadily declines. In the remaining basins, this tipping point has already been passed. Peak water occurs later in basins with larger glaciers and higher ice-cover fractions. Typically, future glacier runoff increases in early summer but decreases in late summer. Although most of the 56 basins have less than 2% ice coverage, by 2100 one-third of them might experience runoff decreases greater than 10% due to glacier mass loss in at least one month of the melt season, with the largest reductions in central Asia and the Andes. We conclude that, even in large-scale basins with minimal ice-cover fraction, the downstream hydrological effects of continued glacier wastage can be substantial, but the magnitudes vary greatly among basins and throughout the melt season.

  10. Relative contributions of set-asides and tree retention to the long-term availability of key forest biodiversity structures at the landscape scale.

    PubMed

    Roberge, Jean-Michel; Lämås, Tomas; Lundmark, Tomas; Ranius, Thomas; Felton, Adam; Nordin, Annika

    2015-05-01

    Over previous decades new environmental measures have been implemented in forestry. In Fennoscandia, forest management practices were modified to set aside conservation areas and to retain trees at final felling. In this study we simulated the long-term effects of set-aside establishment and tree retention practices on the future availability of large trees and dead wood, two forest structures of documented importance to biodiversity conservation. Using a forest decision support system (Heureka), we projected the amounts of these structures over 200 years in two managed north Swedish landscapes, under management scenarios with and without set-asides and tree retention. In line with common best practice, we simulated set-asides covering 5% of the productive area with priority to older stands, as well as ∼5% green-tree retention (solitary trees and forest patches) including high-stump creation at final felling. We found that only tree retention contributed to substantial increases in the future density of large (DBH ≥35 cm) deciduous trees, while both measures made significant contributions to the availability of large conifers. It took more than half a century to observe stronger increases in the densities of large deciduous trees as an effect of tree retention. The mean landscape-scale volumes of hard dead wood fluctuated widely, but the conservation measures yielded values which were, on average over the entire simulation period, about 2.5 times as high as for scenarios without these measures. While the density of large conifers increased with time in the landscape initially dominated by younger forest, best practice conservation measures did not avert a long-term decrease in large conifer density in the landscape initially comprised of more old forest. Our results highlight the needs to adopt a long temporal perspective and to consider initial landscape conditions when evaluating the large-scale effects of conservation measures on forest biodiversity. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Seemingly unrelated intervention time series models for effectiveness evaluation of large scale environmental remediation.

    PubMed

    Ip, Ryan H L; Li, W K; Leung, Kenneth M Y

    2013-09-15

    Large-scale environmental remediation projects applied to sea water always involve large capital investments. Rigorous effectiveness evaluations of such projects are therefore necessary and essential for policy review and future planning. This study aims at investigating the effectiveness of environmental remediation using three different Seemingly Unrelated Regression (SUR) time series models with intervention effects: Model (1) assuming no correlation within or across variables, Model (2) assuming no correlation across variables but allowing correlations within a variable across different sites, and Model (3) allowing all possible correlations among variables (i.e., an unrestricted model). The results suggested that the unrestricted SUR model is the most reliable one, consistently having the smallest variations of the estimated model parameters. We discussed our results with reference to marine water quality management in Hong Kong while bringing managerial issues into consideration. Copyright © 2013 Elsevier Ltd. All rights reserved.
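
    The unrestricted SUR idea can be illustrated with a minimal two-step feasible GLS estimator. This sketch is not the authors' intervention time-series specification: it uses a plain level-shift dummy, synthetic data for two correlated monitoring sites, and numpy only.

```python
import numpy as np

def sur_fgls(ys, Xs):
    """Two-step feasible GLS for a Seemingly Unrelated Regression system.
    ys: list of (T,) response vectors; Xs: list of (T, k_i) design matrices."""
    T, m = len(ys[0]), len(ys)
    # step 1: equation-by-equation OLS residuals -> cross-equation covariance
    resid = np.column_stack([y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
                             for y, X in zip(ys, Xs)])
    sigma = resid.T @ resid / T                       # (m, m)
    # step 2: GLS on the stacked system with Omega = Sigma kron I_T
    X_big = np.zeros((m * T, sum(X.shape[1] for X in Xs)))
    col = 0
    for i, X in enumerate(Xs):
        X_big[i * T:(i + 1) * T, col:col + X.shape[1]] = X
        col += X.shape[1]
    y_big = np.concatenate(ys)
    W = np.kron(np.linalg.inv(sigma), np.eye(T))      # Omega^{-1}
    beta = np.linalg.solve(X_big.T @ W @ X_big, X_big.T @ W @ y_big)
    return beta, sigma

# toy data: two monitoring sites with a level shift after remediation at t0
rng = np.random.default_rng(4)
T, t0 = 120, 60
step = (np.arange(T) >= t0).astype(float)             # intervention dummy
X = np.column_stack([np.ones(T), step])
e = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=T)
y1 = 5.0 - 1.2 * step + e[:, 0]                       # pollutant at site 1
y2 = 4.0 - 0.8 * step + e[:, 1]                       # pollutant at site 2
beta, sigma = sur_fgls([y1, y2], [X, X])
print("estimated intervention effects:", beta[1].round(2), beta[3].round(2))
```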

  12. Visualization, documentation, analysis, and communication of large scale gene regulatory networks

    PubMed Central

    Longabaugh, William J.R.; Davidson, Eric H.; Bolouri, Hamid

    2009-01-01

    Genetic regulatory networks (GRNs) are complex, large-scale, and spatially and temporally distributed. These characteristics impose challenging demands on computational GRN modeling tools, and there is a need for custom modeling tools. In this paper, we report on our ongoing development of BioTapestry, an open source, freely available computational tool designed specifically for GRN modeling. We also outline our future development plans, and give some examples of current applications of BioTapestry. PMID:18757046

  13. The numerics of hydrostatic structured-grid coastal ocean models: State of the art and future perspectives

    NASA Astrophysics Data System (ADS)

    Klingbeil, Knut; Lemarié, Florian; Debreu, Laurent; Burchard, Hans

    2018-05-01

    The state of the art of the numerics of hydrostatic structured-grid coastal ocean models is reviewed here. First, some fundamental differences in the hydrodynamics of the coastal ocean, such as the large surface elevation variation compared to the mean water depth, are contrasted against large-scale ocean dynamics. Then the hydrodynamic equations as they are used in coastal ocean models as well as in large-scale ocean models are presented, including parameterisations for turbulent transports. As steps towards discretisation, coordinate transformations and spatial discretisations based on a finite-volume approach are discussed, with focus on the specific requirements of coastal ocean models. As in large-scale ocean models, splitting of internal and external modes is essential also for coastal ocean models, but specific care is needed when drying and flooding of intertidal flats is included. As one obvious characteristic of coastal ocean models, open boundaries occur and need to be treated in a way that correct model forcing from outside is transmitted to the model domain without reflecting waves from the inside. New developments in two-way nesting are also presented. Single processes such as internal inertia-gravity waves, advection and turbulence closure models are discussed with focus on the coastal scales. An overview of existing hydrostatic structured-grid coastal ocean models is given, including their extensions towards non-hydrostatic models. Finally, an outlook on future perspectives is made.

  14. Investigating large-scale secondary circulations within impact crater topographies in a refractive index-matched facility

    NASA Astrophysics Data System (ADS)

    Blois, Gianluca; Kim, Taehoon; Bristow, Nathan; Day, Mackenzie; Kocurek, Gary; Anderson, William; Christensen, Kenneth

    2017-11-01

    Impact craters, common large-scale topographic features on the surface of Mars, are circular depressions delimited by a sharp ridge. A variety of crater fill morphologies exist, suggesting that complex intracrater circulations affect their evolution. Some large craters (diameter >10 km), particularly at mid latitudes on Mars, exhibit a central mound surrounded by a circular moat. Foremost among these examples is Gale crater, landing site of NASA's Curiosity rover, since large-scale climatic processes early in the history of Mars are preserved in the stratigraphic record of the inner mound. Investigating the intracrater flow produced by large-scale winds aloft over Mars craters is key to a number of important scientific issues, including ongoing research on Mars paleo-environmental reconstruction and the planning of future missions (these results must be viewed in conjunction with the effects of radial katabatic flows, the importance of which is already established in preceding studies). In this work we consider a number of crater shapes inspired by Gale morphology, including idealized craters. Access to the flow field within such geometrically complex topography is achieved herein using a refractive index matched approach. Instantaneous velocity maps, using both planar and volumetric PIV techniques, are presented to elucidate complex three-dimensional flow within the crater. In addition, first- and second-order statistics will be discussed in the context of wind-driven (aeolian) excavation of crater fill.

  15. An effective online data monitoring and saving strategy for large-scale climate simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xian, Xiaochen; Archibald, Rick; Mayer, Benjamin

    Large-scale climate simulation models have been developed and widely used to generate historical data and study future climate scenarios. These simulation models often have to run for a couple of months to understand the changes in the global climate over the course of decades. This long-duration simulation process creates a huge amount of data with both high temporal and spatial resolution information; however, how to effectively monitor and record the climate changes based on these large-scale simulation results that are continuously produced in real time still remains to be resolved. Due to the slow process of writing data to disk, the current practice is to save a snapshot of the simulation results at a constant, slow rate although the data generation process runs at a very high speed. This study proposes an effective online data monitoring and saving strategy over the temporal and spatial domains with the consideration of practical storage and memory capacity constraints. Finally, our proposed method is able to intelligently select and record the most informative extreme values in the raw data generated from real-time simulations in the context of better monitoring climate changes.

  16. An effective online data monitoring and saving strategy for large-scale climate simulations

    DOE PAGES

    Xian, Xiaochen; Archibald, Rick; Mayer, Benjamin; ...

    2018-01-22

    Large-scale climate simulation models have been developed and widely used to generate historical data and study future climate scenarios. These simulation models often have to run for a couple of months to understand the changes in the global climate over the course of decades. This long-duration simulation process creates a huge amount of data with both high temporal and spatial resolution information; however, how to effectively monitor and record the climate changes based on these large-scale simulation results that are continuously produced in real time still remains to be resolved. Due to the slow process of writing data to disk, the current practice is to save a snapshot of the simulation results at a constant, slow rate although the data generation process runs at a very high speed. This study proposes an effective online data monitoring and saving strategy over the temporal and spatial domains with the consideration of practical storage and memory capacity constraints. Finally, our proposed method is able to intelligently select and record the most informative extreme values in the raw data generated from real-time simulations in the context of better monitoring climate changes.

  17. Polymer Dynamics from Synthetic to Biological Macromolecules

    NASA Astrophysics Data System (ADS)

    Richter, D.; Niedzwiedz, K.; Monkenbusch, M.; Wischnewski, A.; Biehl, R.; Hoffmann, B.; Merkel, R.

    2008-02-01

    High resolution neutron scattering, together with a meticulous choice of the contrast conditions, allows access to the large-scale dynamics of soft materials, including biological molecules, in space and time. In this contribution we present two examples: one from the world of synthetic polymers, the other from biomolecules. First, we address the peculiar dynamics of miscible polymer blends with very different component glass transition temperatures. Poly(methyl methacrylate) (PMMA) and poly(ethylene oxide) (PEO) are perfectly miscible but their glass transition temperatures differ by 200 K. We present quasielastic neutron scattering investigations on the dynamics of the fast component in the range from ångströms to nanometers over a time frame of five orders of magnitude. All data may be consistently described in terms of a Rouse model with random friction, reflecting the random environment imposed by the nearly frozen PMMA matrix on the fast, mobile PEO. In the second part we touch on new developments relating to the large-scale internal dynamics of proteins probed by neutron spin echo. We report results of pioneering studies which show the feasibility of such experiments on large-scale protein motion and will most likely initiate further studies in the future.

  18. Current challenges in quantifying preferential flow through the vadose zone

    NASA Astrophysics Data System (ADS)

    Koestel, John; Larsbo, Mats; Jarvis, Nick

    2017-04-01

    In this presentation, we give an overview of current challenges in quantifying preferential flow through the vadose zone. A review of the literature suggests that current generation models do not fully reflect the present state of process understanding and empirical knowledge of preferential flow. We believe that the development of improved models will be stimulated by the increasingly widespread application of novel imaging technologies as well as future advances in computational power and numerical techniques. One of the main challenges in this respect is to bridge the large gap between the scales at which preferential flow occurs (pore to Darcy scales) and the scale of interest for management (fields, catchments, regions). Studies at the pore scale are being supported by the development of 3-D non-invasive imaging and numerical simulation techniques. These studies are leading to a better understanding of how macropore network topology and initial/boundary conditions control key state variables like matric potential and thus the strength of preferential flow. Extrapolation of this knowledge to larger scales would require support from theoretical frameworks such as key concepts from percolation and network theory, since we lack measurement technologies to quantify macropore networks at these large scales. Linked hydro-geophysical measurement techniques that produce highly spatially and temporally resolved data enable investigation of the larger-scale heterogeneities that can generate preferential flow patterns at pedon, hillslope and field scales. At larger regional and global scales, improved methods of data-mining and analyses of large datasets (machine learning) may help in parameterizing models as well as lead to new insights into the relationships between soil susceptibility to preferential flow and site attributes (climate, land uses, soil types).

  19. Urban Growth Scenarios of a Future MEGA City: Case Study Ahmedabad

    NASA Astrophysics Data System (ADS)

    Lehner, A.; Kraus, V.; Steinnocher, K.

    2016-06-01

    The study of urban areas and their development focuses on cities, their physical and demographic expansion and the tensions and impacts that go along with urban growth. Especially in developing countries and emerging national economies like India, consistent and up-to-date information and other planning-relevant data are all too often unavailable. With its Smart Cities Mission, the Indian government places great importance on the future development of Indian urban areas and acknowledges the large-scale rural-to-urban migration. The potentials of urban remote sensing and its contribution to urban planning are discussed and related to the Indian Smart Cities Mission. A case study is presented showing urban remote sensing based information products for the city of Ahmedabad. Resulting urban growth scenarios are presented, hotspots identified and future action alternatives proposed.

  20. The Mekong's future flows under multiple driving factors: How future climate change, hydropower developments and irrigation expansion drive hydrological changes?

    NASA Astrophysics Data System (ADS)

    Hoang, L. P.; van Vliet, M. T. H.; Lauri, H.; Kummu, M.; Koponen, J.; Supit, I.; Leemans, R.; Kabat, P.; Ludwig, F.

    2016-12-01

    The Mekong River's flows and water resources are in many ways essential for sustaining economic growth, the food security of about 70 million people and biodiversity in one of the world's most ecologically productive wetland systems. The river's hydrological cycle, however, is increasingly perturbed by climate change, large-scale hydropower developments and rapid irrigated land expansion. This study presents an integrated impact assessment to characterize and quantify future hydrological changes induced by these driving factors, both separately and combined. We integrated a crop simulation module and a hydropower dam module into a distributed hydrological model (VMod) and simulated the Mekong's hydrology under multiple climate change and development scenarios. Our results show that the Mekong's hydrological regime will experience substantial changes caused by the considered factors. In terms of magnitude, hydropower dam developments exhibit the largest impacts on river flows, with projected higher flows (up to +35%) during the dry season and lower flows (up to -44%) during the wet season. Annual flow changes caused by the dams, however, are relatively marginal. In contrast, climate change is projected to increase the Mekong's annual flows (up to +16%), while irrigated land expansion results in annual flow reductions (-1% to -3%). Combining the impacts of these three drivers, we found that river flow changes, especially those at the monthly scale, differ substantially from changes under the individual driving factors. This is explained by large differences in the magnitudes, and contrasts in the directions, of the individual drivers' impacts. We argue that the Mekong's future flows are likely to be driven by multiple factors and thus advocate integrated assessment approaches and tools that properly account for these factors and their interplay.

  1. Precision medicine in the age of big data: The present and future role of large-scale unbiased sequencing in drug discovery and development.

    PubMed

    Vicini, P; Fields, O; Lai, E; Litwack, E D; Martin, A-M; Morgan, T M; Pacanowski, M A; Papaluca, M; Perez, O D; Ringel, M S; Robson, M; Sakul, H; Vockley, J; Zaks, T; Dolsten, M; Søgaard, M

    2016-02-01

    High throughput molecular and functional profiling of patients is a key driver of precision medicine. DNA and RNA characterization has been enabled at unprecedented cost and scale through rapid, disruptive progress in sequencing technology, but challenges persist in data management and interpretation. We analyze the state of the art of large-scale unbiased sequencing (LUS) in drug discovery and development, including technology, application, ethical, regulatory, policy and commercial considerations, and discuss issues of LUS implementation in clinical and regulatory practice. © 2015 American Society for Clinical Pharmacology and Therapeutics.

  2. An alternative to the search for single polymorphisms: toward molecular personality scales for the five-factor model.

    PubMed

    McCrae, Robert R; Scally, Matthew; Terracciano, Antonio; Abecasis, Gonçalo R; Costa, Paul T

    2010-12-01

    There is growing evidence that personality traits are affected by many genes, all of which have very small effects. As an alternative to the largely unsuccessful search for individual polymorphisms associated with personality traits, the authors identified large sets of potentially related single nucleotide polymorphisms (SNPs) and summed them to form molecular personality scales (MPSs) comprising from 4 to 2,497 SNPs. Scales were derived from two thirds of a large (N = 3,972) sample of individuals from Sardinia who completed the Revised NEO Personality Inventory (P. T. Costa, Jr., & R. R. McCrae, 1992) and were assessed in a genomewide association scan. When MPSs were correlated with the phenotype in the remaining one third of the sample, very small but significant associations were found for 4 of the 5 personality factors when the longest scales were examined. These data suggest that MPSs for Neuroticism, Openness to Experience, Agreeableness, and Conscientiousness (but not Extraversion) contain genetic information that can be refined in future studies, and the procedures described here should be applicable to other quantitative traits. PsycINFO Database Record (c) 2010 APA, all rights reserved.
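
    A simplified, synthetic-data sketch of the scale-construction idea (sign-weighted sums of SNP allele counts scored in a training split and validated in a held-out third) is given below. The simulated effect sizes, the choice to keep 500 SNPs and all data are assumptions made for illustration, not the study's actual procedure.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 3972, 2497                               # sample size and SNP count echo the abstract
    genotypes = rng.integers(0, 3, size=(n, p))     # 0/1/2 minor-allele counts (synthetic)
    true_w = rng.normal(0, 0.02, p)                 # many tiny true effects
    phenotype = genotypes @ true_w + rng.normal(0, 1, n)

    train = np.arange(2 * n // 3)                   # two-thirds derivation sample
    test = np.arange(2 * n // 3, n)                 # held-out third

    # score each SNP in the training split, then sum signed allele counts into a scale
    r = np.array([np.corrcoef(genotypes[train, j], phenotype[train])[0, 1] for j in range(p)])
    selected = np.argsort(-np.abs(r))[:500]         # keep the 500 most associated SNPs (assumed)
    mps = (genotypes[:, selected] * np.sign(r[selected])).sum(axis=1)   # molecular personality scale

    # validate the scale against the phenotype in the held-out third
    print(np.corrcoef(mps[test], phenotype[test])[0, 1])
    ```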

  3. Generation Expansion Planning With Large Amounts of Wind Power via Decision-Dependent Stochastic Programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhan, Yiduo; Zheng, Qipeng P.; Wang, Jianhui

    Power generation expansion planning needs to deal with future uncertainties carefully, given that the invested generation assets will be in operation for a long time. Many stochastic programming models have been proposed to tackle this challenge. However, most previous works assume predetermined future uncertainties (i.e., fixed random outcomes with given probabilities). In several recent studies of generation assets' planning (e.g., thermal versus renewable), new findings show that the investment decisions could affect the future uncertainties as well. To this end, this paper proposes a multistage decision-dependent stochastic optimization model for long-term large-scale generation expansion planning, where large amounts of wind power are involved. In the decision-dependent model, the future uncertainties are not only affecting but also affected by the current decisions. In particular, the probability distribution function is determined by not only input parameters but also decision variables. To deal with the nonlinear constraints in our model, a quasi-exact solution approach is then introduced to reformulate the multistage stochastic investment model to a mixed-integer linear programming model. The wind penetration, investment decisions, and the optimality of the decision-dependent model are evaluated in a series of multistage case studies. The results show that the proposed decision-dependent model provides effective optimization solutions for long-term generation expansion planning.
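
    To illustrate the notion of decision-dependent uncertainty named above, here is a toy two-stage sketch in which the scenario probabilities depend on the first-stage wind investment. All numbers and the probability model are invented for illustration and are unrelated to the paper's quasi-exact MILP reformulation.

    ```python
    import numpy as np

    # Toy decision-dependent two-stage problem: more early wind build-out raises the
    # probability of the favourable second-stage scenario.
    demand = 100.0                      # energy to be served in the second stage [arbitrary units]
    wind_capex = 1.2                    # first-stage cost per unit of wind capacity (assumed)
    cost_fav, cost_unfav = 2.0, 3.5     # second-stage unit cost of residual supply (assumed)

    def expected_cost(x):
        p_fav = min(0.2 + 0.004 * x, 0.9)          # decision-dependent probability
        residual = max(demand - 0.35 * x, 0.0)     # demand left after (de-rated) wind output
        return wind_capex * x + residual * (p_fav * cost_fav + (1.0 - p_fav) * cost_unfav)

    candidates = np.linspace(0.0, 250.0, 251)
    x_best = min(candidates, key=expected_cost)
    print(x_best, expected_cost(x_best))
    ```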

  4. Probing the largest cosmological scales with the correlation between the cosmic microwave background and peculiar velocities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fosalba, Pablo; Dore, Olivier

    2007-11-15

    Cross correlation between the cosmic microwave background (CMB) and large-scale structure is a powerful probe of dark energy and gravity on the largest physical scales. We introduce a novel estimator, the CMB-velocity correlation, that has most of its power on large scales and that, at low redshift, delivers up to a factor of 2 higher signal-to-noise ratio than the recently detected CMB-dark matter density correlation expected from the integrated Sachs-Wolfe effect. We propose to use a combination of peculiar velocities measured from supernovae type Ia and kinetic Sunyaev-Zeldovich cluster surveys to reveal this signal and forecast dark energy constraints that can be achieved with future surveys. We stress that low redshift peculiar velocity measurements should be exploited with complementary deeper large-scale structure surveys for precision cosmology.

  5. Nicholas Metropolis Award for Outstanding Doctoral Thesis Work in Computational Physics Talk: Understanding Nano-scale Electronic Systems via Large-scale Computation

    NASA Astrophysics Data System (ADS)

    Cao, Chao

    2009-03-01

    Nano-scale physical phenomena and processes, especially those in electronics, have drawn great attention in the past decade. Experiments have shown that electronic and transport properties of functionalized carbon nanotubes are sensitive to adsorption of gas molecules such as H2, NO2, and NH3. Similar measurements have also been performed to study adsorption of proteins on other semiconductor nano-wires. These experiments suggest that nano-scale systems can be useful for making future chemical and biological sensors. Aiming to understand the physical mechanisms underlying and governing property changes at the nano-scale, we start by investigating, via first-principles methods, the electronic structure of Pd-CNT before and after hydrogen adsorption, and continue with coherent electronic transport using non-equilibrium Green's function techniques combined with density functional theory. Once our results are fully analyzed, they can be used to interpret and understand experimental data, with a few difficult issues still to be addressed. Finally, we discuss a newly developed multi-scale computing architecture, OPAL, that coordinates the simultaneous execution of multiple codes. Inspired by the capabilities of this computing framework, we present a scenario of future modeling and simulation of multi-scale, multi-physical processes.

  6. Impacts of Changing Climatic Drivers and Land use features on Future Stormwater Runoff in the Northwest Florida Basin: A Large-Scale Hydrologic Modeling Assessment

    NASA Astrophysics Data System (ADS)

    Khan, M.; Abdul-Aziz, O. I.

    2017-12-01

    Potential changes in climatic drivers and land cover features can significantly influence the stormwater budget in the Northwest Florida Basin. We investigated the hydro-climatic and land use sensitivities of stormwater runoff by developing a large-scale, process-based rainfall-runoff model for the basin using the EPA Storm Water Management Model (SWMM 5.1). Climatic and hydrologic variables, as well as land use/cover features, were incorporated into the model to account for the key processes of coastal hydrology and its dynamic interactions with groundwater and sea levels. We calibrated and validated the model against historical daily streamflow observations during 2009-2012 at four major rivers in the basin. Downscaled climatic drivers (precipitation, temperature, solar radiation) projected by twenty GCMs-RCMs under CMIP5, along with projected future land use/cover features, were also incorporated into the model. The basin storm runoff was then simulated for the historical (2000s = 1976-2005) and two future periods (2050s = 2030-2059, and 2080s = 2070-2099). Comparative evaluation of the historical and future scenarios leads to important guidelines for stormwater management in Northwest Florida and similar regions under a changing climate and environment.
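
    The abstract does not state which goodness-of-fit measure was used for calibration and validation against daily streamflow; a common illustrative choice is the Nash-Sutcliffe efficiency, sketched below with made-up values.

    ```python
    import numpy as np

    def nash_sutcliffe(sim, obs):
        """Nash-Sutcliffe efficiency, a common goodness-of-fit measure for daily streamflow
        calibration (an illustrative metric choice, not necessarily the one used in the study)."""
        sim, obs = np.asarray(sim, float), np.asarray(obs, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    # made-up simulated and observed daily flows [m^3/s]
    print(nash_sutcliffe([120, 95, 80, 150, 110], [118, 100, 85, 140, 115]))   # ~0.89
    ```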

  7. Wetlands as large-scale nature-based solutions: status and future challenges for research and management

    NASA Astrophysics Data System (ADS)

    Thorslund, Josefin; Jarsjö, Jerker; Destouni, Georgia

    2017-04-01

    Wetlands are often considered as nature-based solutions that can provide a multitude of services of great social, economic and environmental value to humankind. These services may include recreation, greenhouse gas sequestration, contaminant retention, coastal protection, groundwater level and soil moisture regulation, flood regulation and biodiversity support. Changes in land use, water use and climate can all impact wetland functions and occur at scales extending well beyond the local scale of an individual wetland. In practical applications, however, management decisions usually regard and focus on individual wetland sites and local conditions. To understand the potential usefulness and services of wetlands as larger-scale nature-based solutions, e.g. for mitigating negative impacts from large-scale change pressures, one needs to understand the combined function of multiple wetlands at the relevant large scales. Here we systematically investigate whether and to what extent research so far has addressed the large-scale dynamics of landscape systems with multiple wetlands, which are likely to be relevant for understanding impacts of regional to global change. Our investigation regards key changes and impacts of relevance for nature-based solutions, such as large-scale nutrient and pollution retention, flow regulation and coastal protection. Although such large-scale knowledge is still limited, evidence suggests that the aggregated functions and effects of multiple wetlands in the landscape can differ considerably from those observed at individual wetlands. Such scale differences may have important implications for wetland function-effect predictability and management under large-scale change pressures and impacts, such as those of climate change.

  8. Mapping the universe in three dimensions

    PubMed Central

    Haynes, Martha P.

    1996-01-01

    The determination of the three-dimensional layout of galaxies is critical to our understanding of the evolution of galaxies and the structures in which they lie, to our determination of the fundamental parameters of cosmology, and to our understanding of both the past and future histories of the universe at large. The mapping of the large scale structure in the universe via the determination of galaxy red shifts (Doppler shifts) is a rapidly growing industry thanks to technological developments in detectors and spectrometers at radio and optical wavelengths. First-order application of the red shift-distance relation (Hubble’s law) allows the analysis of the large-scale distribution of galaxies on scales of hundreds of megaparsecs. Locally, the large-scale structure is very complex but the overall topology is not yet clear. Comparison of the observed red shifts with ones expected on the basis of other distance estimates allows mapping of the gravitational field and the underlying total density distribution. The next decade holds great promise for our understanding of the character of large-scale structure and its origin. PMID:11607714

  9. Mapping the universe in three dimensions.

    PubMed

    Haynes, M P

    1996-12-10

    The determination of the three-dimensional layout of galaxies is critical to our understanding of the evolution of galaxies and the structures in which they lie, to our determination of the fundamental parameters of cosmology, and to our understanding of both the past and future histories of the universe at large. The mapping of the large scale structure in the universe via the determination of galaxy red shifts (Doppler shifts) is a rapidly growing industry thanks to technological developments in detectors and spectrometers at radio and optical wavelengths. First-order application of the red shift-distance relation (Hubble's law) allows the analysis of the large-scale distribution of galaxies on scales of hundreds of megaparsecs. Locally, the large-scale structure is very complex but the overall topology is not yet clear. Comparison of the observed red shifts with ones expected on the basis of other distance estimates allows mapping of the gravitational field and the underlying total density distribution. The next decade holds great promise for our understanding of the character of large-scale structure and its origin.
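
    As a small worked example of the first-order red shift-distance relation mentioned in both records above, the snippet below applies d ≈ cz/H0; the Hubble constant value is an assumed round figure chosen only for illustration.

    ```python
    # First-order red shift-distance relation (Hubble's law), valid for small z: d ≈ c z / H0.
    C_KM_S = 299_792.458        # speed of light [km/s]
    H0 = 70.0                   # Hubble constant [km/s/Mpc], assumed round value

    def hubble_distance_mpc(z):
        return C_KM_S * z / H0

    for z in (0.01, 0.03, 0.1):
        print(f"z = {z:5.2f}  ->  d ~ {hubble_distance_mpc(z):6.0f} Mpc")
    # z = 0.1 gives roughly 430 Mpc, i.e. the "hundreds of megaparsecs" scale noted above
    ```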

  10. Genetic Diversity and Ecological Niche Modelling of Wild Barley: Refugia, Large-Scale Post-LGM Range Expansion and Limited Mid-Future Climate Threats?

    PubMed Central

    Russell, Joanne; van Zonneveld, Maarten; Dawson, Ian K.; Booth, Allan; Waugh, Robbie; Steffenson, Brian

    2014-01-01

    Describing genetic diversity in wild barley (Hordeum vulgare ssp. spontaneum) in geographic and environmental space in the context of current, past and potential future climates is important for conservation and for breeding the domesticated crop (Hordeum vulgare ssp. vulgare). Spatial genetic diversity in wild barley was revealed by both nuclear- (2,505 SNP, 24 nSSR) and chloroplast-derived (5 cpSSR) markers in 256 widely-sampled geo-referenced accessions. Results were compared with MaxEnt-modelled geographic distributions under current, past (Last Glacial Maximum, LGM) and mid-term future (anthropogenic scenario A2, the 2080s) climates. Comparisons suggest large-scale post-LGM range expansion in Central Asia and relatively small, but statistically significant, reductions in range-wide genetic diversity under future climate. Our analyses support the utility of ecological niche modelling for locating genetic diversity hotspots and determining priority geographic areas for wild barley conservation under anthropogenic climate change. Similar research on other cereal crop progenitors could play an important role in tailoring conservation and crop improvement strategies to support future human food security. PMID:24505252

  11. Past and future changes in streamflow in the U.S. Midwest: Bridging across time scales

    NASA Astrophysics Data System (ADS)

    Villarini, G.; Slater, L. J.; Salvi, K. A.

    2017-12-01

    Streamflows have increased notably across the U.S. Midwest over the past century, principally due to changes in precipitation and land use / land cover. Improving our understanding of the physical drivers that are responsible for the observed changes in discharge may enhance our capability of predicting and projecting these changes, and may have large implications for water resources management over this area. This study will highlight our efforts towards the statistical attribution of changes in discharge across the U.S. Midwest, with analyses performed at the seasonal scale from low to high flows. The main drivers of changing streamflows that we focus on are: urbanization, agricultural land cover, basin-averaged temperature, basin-averaged precipitation, and antecedent soil moisture. Building on the insights from this attribution, we will examine the potential predictability of streamflow across different time scales, with lead times ranging from seasonal to decadal, and discuss a potential path forward for engineering design for future conditions.
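
    As a hedged sketch of the kind of statistical attribution described above, the snippet below regresses a synthetic seasonal streamflow series on candidate drivers by ordinary least squares; the data, the driver coefficients and the choice of plain OLS are illustrative assumptions, not the study's actual method.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    years = 80
    precip = rng.gamma(8, 30, years)           # seasonal precipitation [mm]
    temp = rng.normal(12, 1.5, years)          # seasonal mean temperature [deg C]
    ag_cover = np.linspace(0.55, 0.70, years)  # agricultural land fraction
    urban = np.linspace(0.02, 0.10, years)     # urban land fraction
    soilm = rng.normal(0.3, 0.05, years)       # antecedent soil moisture [-]
    # synthetic "observed" seasonal streamflow built from the drivers plus noise
    flow = 0.6 * precip + 120 * urban + 80 * soilm - 3 * temp + rng.normal(0, 10, years)

    # ordinary least squares attribution of flow to the candidate drivers
    X = np.column_stack([np.ones(years), precip, temp, ag_cover, urban, soilm])
    beta, *_ = np.linalg.lstsq(X, flow, rcond=None)
    for name, b in zip(["intercept", "precip", "temp", "ag", "urban", "soilm"], beta):
        print(f"{name:9s} {b:8.3f}")
    ```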

  12. Astrophysical Applications of Gravitational Lensing

    NASA Astrophysics Data System (ADS)

    Mediavilla, Evencio; Muñoz, Jose A.; Garzón, Francisco; Mahoney, Terence J.

    2016-10-01

    Contributors; Participants; Preface; Acknowledgements; 1. Lensing basics Sherry H. Suyu; 2. Exoplanet microlensing Andrew Gould; 3. Case studies of microlensing Veronica Motta and Emilio Falco; 4. Microlensing of quasars and AGN Joachim Wambsganss; 5. DM in clusters and large-scale structure Peter Schneider; 6. The future of strong lensing Chris Fassnacht; 7. Methods for strong lens modelling Charles Keeton; 8. Tutorial on inverse ray shooting Jorge Jimenez-Vicente.

  13. Impact of seasonal synoptic weather types on local PM10 concentrations in Bavaria/Germany: recent conditions and future projections

    NASA Astrophysics Data System (ADS)

    Weitnauer, Claudia; Beck, Christoph; Jacobeit, Jucundus

    2015-04-01

    It is well known that local concentrations of PM10 (particulate matter with a diameter of less than 10 μm) vary with the seasons in Europe. These concentrations are influenced on the one hand by the amount of natural and anthropogenic emissions and on the other hand by large-scale and local meteorological conditions. In Bavaria (part of southern Germany), the target region of the present study, PM10 concentrations are particularly high in winter. One reason for this is increased particle emissions due to domestic heating and traffic load in December, January and February. As several studies in other European regions have indicated, a distinct effect of the large-scale synoptic weather situation in winter on local PM10 concentrations should be considered as another reason. The main task of this study is to describe the impact of large-scale meteorological conditions on local particle concentrations using seasonal synoptic weather types that are optimized with respect to daily mean PM10 data at 16 Bavarian cities and classified from daily gridded NCEP/NCAR reanalysis data (2.5° x 2.5° horizontal resolution) over a Central European spatial domain for the recent period 1980-2011. The weather types are related to monthly PM10 indices using different transfer techniques such as direct synoptic downscaling, multiple regression, generalized linear models and random forests. The PM10 indices are determined by averaging daily to monthly data (PMmean) or by counting the daily exceedances of a particular threshold (> 50 μg/m3, PMe50). The generated transfer models are evaluated in calibration and validation periods using several forecast skill measures, for example the mean squared skill score (MSSS) or the Heidke Skill Score (HSS). The sufficiently performing models are then applied to weather types derived from future climate change simulations of the global climate model ECHAM6 for the IPCC scenarios RCP 4.5 and RCP 8.5 in order to estimate future climate-change induced modifications of local PM10 concentrations in Bavaria.
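
    For readers unfamiliar with the skill measures named above, the snippet below computes a Heidke Skill Score from a 2x2 contingency table of threshold exceedances (in the spirit of the PMe50 index) and a mean squared skill score against a climatological reference; the example numbers are invented.

    ```python
    import numpy as np

    def heidke_skill_score(hits, false_alarms, misses, correct_negs):
        """Heidke Skill Score for a 2x2 contingency table, e.g. predicted vs. observed
        monthly exceedances of the 50 ug/m3 PM10 threshold."""
        a, b, c, d = hits, false_alarms, misses, correct_negs
        return 2.0 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))

    def mean_squared_skill_score(forecast, observed):
        """MSSS relative to the observed climatological mean (illustrative reference choice)."""
        forecast, observed = np.asarray(forecast, float), np.asarray(observed, float)
        mse = np.mean((forecast - observed) ** 2)
        mse_clim = np.mean((observed.mean() - observed) ** 2)
        return 1.0 - mse / mse_clim

    print(heidke_skill_score(hits=14, false_alarms=4, misses=6, correct_negs=72))   # ~0.67
    print(mean_squared_skill_score([32, 45, 51, 28], [30, 50, 48, 31]))             # ~0.86
    ```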

  14. Feasibility analysis of large length-scale thermocapillary flow experiment for the International Space Station

    NASA Astrophysics Data System (ADS)

    Alberts, Samantha J.

    The investigation of microgravity fluid dynamics emerged out of necessity with the advent of space exploration. In particular, capillary research took a leap forward in the 1960s with regard to liquid settling and interfacial dynamics. Due to inherent temperature variations in large spacecraft liquid systems, such as fuel tanks, forces develop on gas-liquid interfaces which induce thermocapillary flows. To date, thermocapillary flows have been studied in small, idealized research geometries, usually under terrestrial conditions. The 1 to 3 m lengths in current and future large tanks and hardware are designed based on hardware rather than research, which leaves spaceflight systems designers without the technological tools to effectively create safe and efficient designs. This thesis focused on the design and feasibility of a large length-scale thermocapillary flow experiment, which utilizes temperature variations to drive a flow. The design of a helical channel geometry ranging from 1 to 2.5 m in length permits a large length-scale thermocapillary flow experiment to fit in a seemingly small International Space Station (ISS) facility such as the Fluids Integrated Rack (FIR). An initial investigation determined that the proposed experiment produces measurable data while adhering to the FIR facility limitations. The computational portion of this thesis focused on the investigation of functional geometries of fuel tanks and depots using Surface Evolver. This work outlines the design of a large length-scale thermocapillary flow experiment for the ISS FIR. The results from this work improve the understanding of thermocapillary flows and thus improve technological tools for predicting heat and mass transfer in large length-scale thermocapillary flows. Without the tools to understand the thermocapillary flows in these systems, engineers are forced to design larger, heavier vehicles to assure safety and mission success.

  15. The potential of text mining in data integration and network biology for plant research: a case study on Arabidopsis.

    PubMed

    Van Landeghem, Sofie; De Bodt, Stefanie; Drebert, Zuzanna J; Inzé, Dirk; Van de Peer, Yves

    2013-03-01

    Despite the availability of various data repositories for plant research, a wealth of information currently remains hidden within the biomolecular literature. Text mining provides the necessary means to retrieve these data through automated processing of texts. However, only recently has advanced text mining methodology been implemented with sufficient computational power to process texts at a large scale. In this study, we assess the potential of large-scale text mining for plant biology research in general and for network biology in particular, using a state-of-the-art text mining system applied to all PubMed abstracts and PubMed Central full texts. We present an extensive evaluation of the textual data for Arabidopsis thaliana, assessing the overall accuracy of this new resource for usage in plant network analyses. Furthermore, we combine text mining information with both protein-protein and regulatory interactions from experimental databases. Clusters of tightly connected genes are delineated from the resulting network, illustrating how such an integrative approach is essential to grasp the current knowledge available for Arabidopsis and to uncover gene information through guilt by association. All large-scale data sets, as well as the manually curated textual data, are made publicly available, thereby stimulating the application of text mining data in future plant biology studies.

  16. Environmental impact assessment and environmental audit in large-scale public infrastructure construction: the case of the Qinghai-Tibet Railway.

    PubMed

    He, Guizhen; Zhang, Lei; Lu, Yonglong

    2009-09-01

    Large-scale public infrastructure projects have featured in China's modernization course since the early 1980s. During the early stages of China's rapid economic development, public attention focused on the economic and social impact of high-profile construction projects. In recent years, however, we have seen a shift in public concern toward the environmental and ecological effects of such projects, and today governments are required to provide valid environmental impact assessments prior to allowing large-scale construction. The official requirement for the monitoring of environmental conditions has led to an increased number of debates in recent years regarding the effectiveness of Environmental Impact Assessments (EIAs) and Governmental Environmental Audits (GEAs) as environmental safeguards in instances of large-scale construction. Although EIA and GEA are conducted by different institutions and have different goals and enforcement potential, these two practices can be closely related in terms of methodology. This article cites the construction of the Qinghai-Tibet Railway as an instance in which EIA and GEA offer complementary approaches to environmental impact management. This study concludes that the GEA approach can serve as an effective follow-up to the EIA and establishes that the EIA lays a base for conducting future GEAs. The relationship that emerges through a study of the Railway's construction calls for more deliberate institutional arrangements and cooperation if the two practices are to be used in concert to optimal effect.

  17. A multi-scale comparison of trait linkages to environmental and spatial variables in fish communities across a large freshwater lake.

    PubMed

    Strecker, Angela L; Casselman, John M; Fortin, Marie-Josée; Jackson, Donald A; Ridgway, Mark S; Abrams, Peter A; Shuter, Brian J

    2011-07-01

    Species present in communities are affected by the prevailing environmental conditions, and the traits that these species display may be sensitive indicators of community responses to environmental change. However, interpretation of community responses may be confounded by environmental variation at different spatial scales. Using a hierarchical approach, we assessed the spatial and temporal variation of traits in coastal fish communities in Lake Huron over a 5-year time period (2001-2005) in response to biotic and abiotic environmental factors. The association of environmental and spatial variables with trophic, life-history, and thermal traits at two spatial scales (regional basin-scale, local site-scale) was quantified using multivariate statistics and variation partitioning. We defined these two scales (regional, local) on which to measure variation and then applied this measurement framework identically in all 5 study years. With this framework, we found that there was no change in the spatial scales of fish community traits over the course of the study, although there were small inter-annual shifts in the importance of regional basin- and local site-scale variables in determining community trait composition (e.g., life-history, trophic, and thermal). The overriding effects of regional-scale variables may be related to inter-annual variation in average summer temperature. Additionally, drivers of fish community traits were highly variable among study years, with some years dominated by environmental variation and others dominated by spatially structured variation. The influence of spatial factors on trait composition was dynamic, which suggests that spatial patterns in fish communities over large landscapes are transient. Air temperature and vegetation were significant variables in most years, underscoring the importance of future climate change and shoreline development as drivers of fish community structure. Overall, a trait-based hierarchical framework may be a useful conservation tool, as it highlights the multi-scaled interactive effect of variables over a large landscape.

  18. Composites for Exploration Upper Stage

    NASA Technical Reports Server (NTRS)

    Fikes, J. C.; Jackson, J. R.; Richardson, S. W.; Thomas, A. D.; Mann, T. O.; Miller, S. G.

    2016-01-01

    The Composites for Exploration Upper Stage (CEUS) was a 3-year, level III project within the Technology Demonstration Missions program of the NASA Space Technology Mission Directorate. Studies have shown that composites provide important programmatic enhancements, including reduced weight to increase capability and accelerated expansion of exploration and science mission objectives. The CEUS project was focused on technologies that best advanced innovation, infusion, and broad applications for the inclusion of composites on future large human-rated launch vehicles and spacecraft. The benefits included near- and far-term opportunities for infusion (NASA, industry/commercial, Department of Defense), demonstrated critical technologies and technically implementable evolvable innovations, and sustained Agency experience. The initial scope of the project was to advance technologies for large composite structures applicable to the Space Launch System (SLS) Exploration Upper Stage (EUS) by focusing on the affordability and technical performance of the EUS forward and aft skirts. The project was tasked to develop and demonstrate critical composite technologies with a focus on full-scale materials, design, manufacturing, and test using NASA in-house capabilities. This would have demonstrated a major advancement in confidence and matured the large-scale composite technology to a Technology Readiness Level 6. This project would, therefore, have bridged the gap for providing composite application to SLS upgrades, enabling future exploration missions.

  19. An integrated model for assessing both crop productivity and agricultural water resources at a large scale

    NASA Astrophysics Data System (ADS)

    Okada, M.; Sakurai, G.; Iizumi, T.; Yokozawa, M.

    2012-12-01

    Agricultural production utilizes regional resources (e.g. river water and ground water) as well as local resources (e.g. temperature, rainfall, solar energy). Future climate change and increasing demand due to population growth and economic development would substantially affect the availability of water resources for agricultural production. While many studies have assessed the impacts of climate change on agriculture, few studies dynamically account for changes in both water resources and crop production. This study proposes an integrated model for assessing both crop productivity and agricultural water resources at a large scale. In addition, the irrigation management response to subseasonal variability in weather and crop growth varies for each region and each crop; to deal with such variations, we used the Markov Chain Monte Carlo technique to quantify region-specific parameters associated with crop growth and irrigation water estimation. We coupled a large-scale crop model (Sakurai et al. 2012) with a global water resources model, H08 (Hanasaki et al. 2008). The integrated model consists of five sub-models for the following processes: land surface, crop growth, river routing, reservoir operation, and anthropogenic water withdrawal. The land surface sub-model is based on a watershed hydrology model, SWAT (Neitsch et al. 2009). Surface and subsurface runoff simulated by the land surface sub-model are input to the river routing sub-model of the H08 model. A part of the regional water resources available for agriculture, simulated by the H08 model, is input as irrigation water to the land surface sub-model. The timing and amount of irrigation water are simulated at a daily time step. The integrated model reproduced the observed streamflow in an individual watershed and accurately reproduced the trends and interannual variations of crop yields. To demonstrate the usefulness of the integrated model, we compared two types of impact assessment of climate change on crop productivity in a watershed: the first was carried out with the large-scale crop model alone, the second with the integrated model coupling the large-scale crop model and the H08 model. The former projected that changes in temperature and precipitation due to future climate change would increase water stress in crops; the latter, however, projected that the increasing amount of agricultural water resources in the watershed would supply sufficient water for irrigation and consequently reduce the water stress. The integrated model demonstrates the importance of taking into account water circulation in the watershed when predicting regional crop production.
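
    A minimal sketch of the Markov Chain Monte Carlo idea mentioned above: a simple Metropolis sampler calibrating a single region-specific parameter of a toy crop-yield response to water supply. The yield model, prior, noise level and data are all assumptions for illustration, not the paper's formulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    water = rng.uniform(0.4, 1.0, 30)                 # relative seasonal water supply (synthetic)
    k_true = 1.8                                      # "true" water-stress sensitivity
    obs_yield = 6.0 * (1 - np.exp(-k_true * water)) + rng.normal(0, 0.2, 30)   # yields [t/ha]

    def log_post(k):
        """Log posterior for k: flat prior on (0, 10) plus a Gaussian likelihood (sigma = 0.2)."""
        if not 0.0 < k < 10.0:
            return -np.inf
        pred = 6.0 * (1 - np.exp(-k * water))
        return -0.5 * np.sum((obs_yield - pred) ** 2) / 0.2 ** 2

    k, lp, chain = 1.0, log_post(1.0), []
    for _ in range(20_000):
        prop = k + rng.normal(0, 0.1)                 # random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:      # Metropolis accept/reject
            k, lp = prop, lp_prop
        chain.append(k)

    burned = chain[5000:]                             # discard burn-in
    print(np.mean(burned), np.percentile(burned, [2.5, 97.5]))
    ```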

  20. The suite of small-angle neutron scattering instruments at Oak Ridge National Laboratory

    DOE PAGES

    Heller, William T.; Cuneo, Matthew J.; Debeer-Schmitt, Lisa M.; ...

    2018-02-21

    Oak Ridge National Laboratory is home to the High Flux Isotope Reactor (HFIR), a high-flux research reactor, and the Spallation Neutron Source (SNS), the world's most intense source of pulsed neutron beams. The unique co-localization of these two sources provided an opportunity to develop a suite of complementary small-angle neutron scattering instruments for studies of large-scale structures: the GP-SANS and Bio-SANS instruments at the HFIR and the EQ-SANS and TOF-USANS instruments at the SNS. This article provides an overview of the capabilities of the suite of instruments, with specific emphasis on how they complement each other. A description of the plans for future developments, including greater integration of the suite into a single point of entry for neutron scattering studies of large-scale structures, is also provided.

  1. Molecular clouds and the large-scale structure of the galaxy

    NASA Technical Reports Server (NTRS)

    Thaddeus, Patrick; Stacy, J. Gregory

    1990-01-01

    The application of molecular radio astronomy to the study of the large-scale structure of the Galaxy is reviewed, and the distribution and characteristic properties of the Galactic population of Giant Molecular Clouds (GMCs), derived primarily from analysis of the Columbia CO survey, and their relation to tracers of Population I and major spiral features are described. The properties of the local molecular interstellar gas are summarized. The CO observing programs currently underway with the Center for Astrophysics 1.2 m radio telescope are described, with an emphasis on projects relevant to future comparison with high-energy gamma-ray observations. Several areas are discussed in which high-energy gamma-ray observations by the EGRET (Energetic Gamma-Ray Experiment Telescope) experiment aboard the Gamma Ray Observatory will directly complement radio studies of the Milky Way, with the prospect of significant progress on fundamental issues related to the structure and content of the Galaxy.

  2. The suite of small-angle neutron scattering instruments at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heller, William T.; Cuneo, Matthew J.; Debeer-Schmitt, Lisa M.

    Oak Ridge National Laboratory is home to the High Flux Isotope Reactor (HFIR), a high-flux research reactor, and the Spallation Neutron Source (SNS), the world's most intense source of pulsed neutron beams. The unique co-localization of these two sources provided an opportunity to develop a suite of complementary small-angle neutron scattering instruments for studies of large-scale structures: the GP-SANS and Bio-SANS instruments at the HFIR and the EQ-SANS and TOF-USANS instruments at the SNS. This article provides an overview of the capabilities of the suite of instruments, with specific emphasis on how they complement each other. A description of the plans for future developments, including greater integration of the suite into a single point of entry for neutron scattering studies of large-scale structures, is also provided.

  3. A novel bonding method for large scale poly(methyl methacrylate) micro- and nanofluidic chip fabrication

    NASA Astrophysics Data System (ADS)

    Qu, Xingtian; Li, Jinlai; Yin, Zhifu

    2018-04-01

    Micro- and nanofluidic chips are of increasing significance for biological and medical applications. Future advances in micro- and nanofluidics and their utilization in commercial applications depend on the development and fabrication of low cost and high fidelity large-scale plastic micro- and nanofluidic chips. However, the majority of present fabrication methods suffer from a low bonding rate of the chip during the thermal bonding process due to air trapping between the substrate and the cover plate. In the present work, a novel bonding technique based on Ar plasma and water treatment is proposed to fully bond large-scale micro- and nanofluidic chips. The influence of the Ar plasma parameters on the water contact angle and the effect of the bonding conditions on the bonding rate and bonding strength of the chip were studied. Fluorescence tests demonstrate that a 5 × 5 cm2 poly(methyl methacrylate) chip with 180 nm wide and 180 nm deep nanochannels can be fabricated without any blockage or leakage by the newly developed method.

  4. A Large Scale Wind Tunnel for the Study of High Reynolds Number Turbulent Boundary Layer Physics

    NASA Astrophysics Data System (ADS)

    Priyadarshana, Paththage; Klewicki, Joseph; Wosnik, Martin; White, Chris

    2008-11-01

    Progress and the basic features of the University of New Hampshire's very large multi-disciplinary wind tunnel are reported. The refinement of the overall design has been greatly aided through consultations with an external advisory group. The facility test section is 73 m long, 6 m wide, and 2.5 m nominally high, and the maximum free stream velocity is 30 m/s. A very large tunnel with relatively low velocities makes the small-scale turbulent motions resolvable by existing measurement systems. The maximum Reynolds number is estimated at δ+ = δuτ/ν ≈ 50000, where δ is the boundary layer thickness and uτ is the friction velocity. The effects of scale separation on the generation of the Reynolds stress gradient appearing in the mean momentum equation are briefly discussed to justify the need to attain δ+ in excess of about 40000. Lastly, plans for future utilization of the facility as a community-wide resource are outlined. This project is supported through the NSF-EPSCoR RII Program, grant number EPS0701730.
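
    A quick back-of-the-envelope check of the quoted Reynolds number scale, using assumed (not reported) values for the boundary layer thickness and friction velocity:

    ```python
    # delta+ = delta * u_tau / nu; the thickness and friction velocity below are plausible
    # assumed values chosen only to illustrate the quoted ~50,000 figure.
    nu = 1.5e-5        # kinematic viscosity of air [m^2/s]
    delta = 0.8        # boundary layer thickness [m] (assumed)
    u_tau = 0.9        # friction velocity [m/s] (assumed, ~3% of the 30 m/s free stream)
    delta_plus = delta * u_tau / nu
    print(f"delta+ ~ {delta_plus:.0f}")   # ~48,000, consistent with the quoted estimate
    ```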

  5. Large-scale production of functional human lysozyme from marker-free transgenic cloned cows.

    PubMed

    Lu, Dan; Liu, Shen; Ding, Fangrong; Wang, Haiping; Li, Jing; Li, Ling; Dai, Yunping; Li, Ning

    2016-03-10

    Human lysozyme is an important natural non-specific immune protein that is highly expressed in breast milk and participates in the immune response of infants against bacterial and viral infections. Considering the medicinal value of and market demand for human lysozyme, an animal model for large-scale production of recombinant human lysozyme (rhLZ) is needed. In this study, we generated transgenic cloned cows with the marker-free vector pBAC-hLF-hLZ, which was shown to efficiently express rhLZ in cow milk. Seven transgenic cloned cows, identified by polymerase chain reaction, Southern blot, and western blot analyses, produced rhLZ in milk at concentrations of up to 3149.19 ± 24.80 mg/L. The purified rhLZ had a molecular weight and enzymatic activity similar to those of wild-type human lysozyme and possessed the same C-terminal and N-terminal amino acid sequences. Preliminary results on milk yield and milk composition from a naturally lactating transgenic cloned cow (0906) were also obtained. These results provide a solid foundation for the large-scale production of rhLZ in the future.

  6. Modeling CMB lensing cross correlations with CLEFT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Modi, Chirag; White, Martin; Vlah, Zvonimir, E-mail: modichirag@berkeley.edu, E-mail: mwhite@berkeley.edu, E-mail: zvlah@stanford.edu

    2017-08-01

    A new generation of surveys will soon map large fractions of sky to ever greater depths and their science goals can be enhanced by exploiting cross correlations between them. In this paper we study cross correlations between the lensing of the CMB and biased tracers of large-scale structure at high z. We motivate the need for more sophisticated bias models for modeling increasingly biased tracers at these redshifts and propose the use of perturbation theories, specifically Convolution Lagrangian Effective Field Theory (CLEFT). Since such signals reside at large scales and redshifts, they can be well described by perturbative approaches. We compare our model with the current approach of using scale independent bias coupled with fitting functions for non-linear matter power spectra, showing that the latter will not be sufficient for upcoming surveys. We illustrate our ideas by estimating σ8 from the auto- and cross-spectra of mock surveys, finding that CLEFT returns accurate and unbiased results at high z. We discuss uncertainties due to the redshift distribution of the tracers, and several avenues for future development.
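
    As a toy stand-in for the σ8 estimation described above (and not the paper's CLEFT-based analysis), the snippet below fits a single amplitude A = σ8/σ8,fid to a synthetic cross-spectrum by inverse-variance weighted least squares; the template shape, error bars and fiducial value are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    ell = np.arange(100, 1000, 50)
    template = 1e-7 * (ell / 100.0) ** -1.3           # fiducial C_ell template (arbitrary shape)
    sigma8_fid, sigma8_true = 0.80, 0.77              # fiducial and "true" values (assumed)
    errors = 0.1 * template                           # per-bandpower uncertainties (assumed)
    measured = (sigma8_true / sigma8_fid) ** 2 * template + rng.normal(0, 1, ell.size) * errors

    # weighted least squares for the single amplitude parameter A^2
    w = 1.0 / errors ** 2
    A2 = np.sum(w * measured * template) / np.sum(w * template ** 2)
    A2_err = 1.0 / np.sqrt(np.sum(w * template ** 2))
    sigma8_hat = sigma8_fid * np.sqrt(A2)
    sigma8_err = sigma8_fid * 0.5 / np.sqrt(A2) * A2_err   # linear error propagation
    print(f"sigma8 = {sigma8_hat:.3f} +/- {sigma8_err:.3f}")
    ```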

  7. Private sector, for-profit health providers in low and middle income countries: can they reach the poor at scale?

    PubMed

    Tung, Elizabeth; Bennett, Sara

    2014-06-24

    The bottom of the pyramid concept suggests that profit can be made in providing goods and services to poor people, when high volume is combined with low margins. To date there has been very limited empirical evidence from the health sector concerning the scope and potential for such bottom of the pyramid models. This paper analyzes private for-profit (PFP) providers currently offering services to the poor on a large scale, and assesses the future prospects of bottom of the pyramid models in health. We searched published and grey literature and databases to identify PFP companies that provided more than 40,000 outpatient visits per year, or that covered 15% or more of a particular type of service in their country. For each included provider, we searched for additional information on location, target market, business model and performance, including quality of care. Only 10 large-scale PFP providers were identified. The majority of these were in South Asia and most provided specialized services such as eye care. The characteristics of the business models of these firms were found to be similar to those of non-profit providers studied by other analysts (such as Bhattacharya 2010). They pursued social rather than traditional marketing, partnerships with government, low cost/high volume services and cross-subsidization between different market segments. There was a lack of reliable data concerning these providers. There is very limited evidence to support the notion that large-scale bottom of the pyramid models in health offer good prospects for extending services to the poor in the future. In order to be successful, PFP providers often require partnerships with government or support from social health insurance schemes. Nonetheless, more reliable and independent data on such schemes are needed.

  8. Private sector, for-profit health providers in low and middle income countries: can they reach the poor at scale?

    PubMed Central

    2014-01-01

    Background The bottom of the pyramid concept suggests that profit can be made in providing goods and services to poor people, when high volume is combined with low margins. To date there has been very limited empirical evidence from the health sector concerning the scope and potential for such bottom of the pyramid models. This paper analyzes private for-profit (PFP) providers currently offering services to the poor on a large scale, and assesses the future prospects of bottom of the pyramid models in health. Methods We searched published and grey literature and databases to identify PFP companies that provided more than 40,000 outpatient visits per year, or that covered 15% or more of a particular type of service in their country. For each included provider, we searched for additional information on location, target market, business model and performance, including quality of care. Results Only 10 large-scale PFP providers were identified. The majority of these were in South Asia and most provided specialized services such as eye care. The characteristics of the business models of these firms were found to be similar to those of non-profit providers studied by other analysts (such as Bhattacharya 2010). They pursued social rather than traditional marketing, partnerships with government, low cost/high volume services and cross-subsidization between different market segments. There was a lack of reliable data concerning these providers. Conclusions There is very limited evidence to support the notion that large-scale bottom of the pyramid models in health offer good prospects for extending services to the poor in the future. In order to be successful, PFP providers often require partnerships with government or support from social health insurance schemes. Nonetheless, more reliable and independent data on such schemes are needed. PMID:24961496

  9. Air quality in the mid-21st century for the city of Paris under two climate scenarios; from the regional to local scale

    NASA Astrophysics Data System (ADS)

    Markakis, K.; Valari, M.; Colette, A.; Sanchez, O.; Perrussel, O.; Honore, C.; Vautard, R.; Klimont, Z.; Rao, S.

    2014-07-01

    Ozone and PM2.5 concentrations over the city of Paris are modeled with the CHIMERE air-quality model at 4 km × 4 km horizontal resolution for two future emission scenarios. A high-resolution (1 km × 1 km) emission projection until 2020 for the greater Paris region is developed by local experts (AIRPARIF) and is further extended to year 2050 based on regional-scale emission projections developed by the Global Energy Assessment. Model evaluation is performed based on a 10-year control simulation. Ozone is in very good agreement with measurements while PM2.5 is underestimated by 20% over the urban area mainly due to a large wet bias in wintertime precipitation. A significant increase of maximum ozone relative to present-day levels over Paris is modeled under the "business-as-usual" scenario (+7 ppb) while a more optimistic "mitigation" scenario leads to a moderate ozone decrease (-3.5 ppb) in year 2050. These results are substantially different to previous regional-scale projections where 2050 ozone is found to decrease under both future scenarios. A sensitivity analysis showed that this difference is due to the fact that ozone formation over Paris at the current urban-scale study is driven by volatile organic compound (VOC)-limited chemistry, whereas at the regional-scale ozone formation occurs under NOx-sensitive conditions. This explains why the sharp NOx reductions implemented in the future scenarios have a different effect on ozone projections at different scales. In rural areas, projections at both scales yield similar results showing that the longer timescale processes of emission transport and ozone formation are less sensitive to model resolution. PM2.5 concentrations decrease by 78% and 89% under business-as-usual and mitigation scenarios, respectively, compared to the present-day period. The reduction is much more prominent over the urban part of the domain due to the effective reductions of road transport and residential emissions resulting in the smoothing of the large urban increment modeled in the control simulation.

  10. Mapping the Heavens: Probing Cosmology with Large Surveys

    ScienceCinema

    Frieman, Joshua [Fermilab]

    2017-12-09

    This talk will provide an overview of recent and on-going sky surveys, focusing on their implications for cosmology. I will place particular emphasis on the Sloan Digital Sky Survey, the most ambitious mapping of the Universe yet undertaken, showing a virtual fly-through of the survey that reveals the large-scale structure of the galaxy distribution. Recent measurements of this large-scale structure, in combination with observations of the cosmic microwave background, have provided independent evidence for a Universe dominated by dark matter and dark energy as well as insights into how galaxies and larger-scale structures formed. Future planned surveys will build on these foundations to probe the history of the cosmic expansion--and thereby the dark energy--with greater precision.

  11. A worldwide analysis of the impact of forest cover change on annual runoff across multiple spatial scales

    NASA Astrophysics Data System (ADS)

    Zhang, M.; Liu, S.

    2017-12-01

    Despite extensive studies on hydrological responses to forest cover change in small watersheds, the hydrological responses to forest change and associated mechanisms across multiple spatial scales have not been fully understood. This review thus examined about 312 watersheds worldwide to provide a generalized framework to evaluate hydrological responses to forest cover change and to identify the contribution of spatial scale, climate, forest type and hydrological regime in determining the intensity of forest change related hydrological responses in small (<1000 km2) and large watersheds (≥1000 km2). Key findings include: 1) the increase in annual runoff associated with forest cover loss is statistically significant at multiple spatial scales whereas the effect of forest cover gain is statistically inconsistent; 2) the sensitivity of annual runoff to forest cover change tends to attenuate as watershed size increases only in large watersheds; 3) annual runoff is more sensitive to forest cover change in water-limited watersheds than in energy-limited watersheds across all spatial scales; and 4) small mixed forest-dominated watersheds or large snow-dominated watersheds are more hydrologically resilient to forest cover change. These findings improve the understanding of hydrological response to forest cover change at different spatial scales and provide a scientific underpinning to future watershed management in the context of climate change and increasing anthropogenic disturbances.

  12. Methane hydrates and the future of natural gas

    USGS Publications Warehouse

    Ruppel, Carolyn

    2011-01-01

    For decades, gas hydrates have been discussed as a potential resource, particularly for countries with limited access to conventional hydrocarbons or a strategic interest in establishing alternative, unconventional gas reserves. Methane has never been produced from gas hydrates at a commercial scale and, barring major changes in the economics of natural gas supply and demand, commercial production at a large scale is considered unlikely to commence within the next 15 years. Given the overall uncertainty still associated with gas hydrates as a potential resource, they have not been included in the EPPA model in MITEI’s Future of Natural Gas report. Still, gas hydrates remain a potentially large methane resource and must necessarily be included in any consideration of the natural gas supply beyond two decades from now.

  13. A large-scale simulation of climate change effects on flood regime - A case study for the Alabama-Coosa-Tallapoosa River Basin

    NASA Astrophysics Data System (ADS)

    Dullo, T. T.; Gangrade, S.; Marshall, R.; Islam, S. R.; Ghafoor, S. K.; Kao, S. C.; Kalyanapu, A. J.

    2017-12-01

    The damage and cost of flooding are continuously increasing due to climate change and variability, which compels the development and advancement of global flood hazard models. However, due to computational expense, evaluation of large-scale, high-resolution flood regimes remains a challenge. The objective of this research is to use a coupled modeling framework that consists of a dynamically downscaled suite of eleven Coupled Model Intercomparison Project Phase 5 (CMIP5) climate models, a distributed hydrologic model called DHSVM, and a computationally efficient two-dimensional hydraulic model called Flood2D-GPU to study the impacts of climate change on flood regime in the Alabama-Coosa-Tallapoosa (ACT) River Basin. Downscaled meteorological forcings for 40 years in the historical period (1966-2005) and 40 years in the future period (2011-2050) were used as inputs to drive the calibrated DHSVM to generate annual maximum flood hydrographs. These flood hydrographs, along with 30-m resolution digital elevation and estimated surface roughness, were then used by Flood2D-GPU to estimate high-resolution flood depth, velocity, duration, and regime. Preliminary results for the Conasauga River basin (an upper subbasin within the ACT) indicate that seven of the eleven climate projections show an average increase of 25 km2 in flooded area between the historical and future projections. Future work will focus on illustrating the effects of climate change on flood duration and area for the entire ACT basin.

  14. Microfilament-Eruption Mechanism for Solar Spicules

    NASA Technical Reports Server (NTRS)

    Sterling, Alphonse C.; Moore, Ronald L.

    2017-01-01

    Recent studies indicate that solar coronal jets result from eruption of small-scale filaments, or "minifilaments" (Sterling et al. 2015, Nature, 523, 437; Panesar et al. ApJL, 832L, 7). In many aspects, these coronal jets appear to be small-scale versions of long-recognized large-scale solar eruptions that are often accompanied by eruption of a large-scale filament and that produce solar flares and coronal mass ejections (CMEs). In coronal jets, a jet-base bright point (JBP) that is often observed to accompany the jet and that sits on the magnetic neutral line from which the minifilament erupts, corresponds to the solar flare of larger-scale eruptions that occurs at the neutral line from which the large-scale filament erupts. Large-scale eruptions are relatively uncommon (approximately 1 per day) and occur with relatively large-scale erupting filaments (approximately 10^5 kilometers long). Coronal jets are more common (approximately 100s per day), but occur from erupting minifilaments of smaller size (approximately 10^4 kilometers long). It is known that solar spicules are much more frequent (many millions per day) than coronal jets. Just as coronal jets are small-scale versions of large-scale eruptions, here we suggest that solar spicules might in turn be small-scale versions of coronal jets; we postulate that the spicules are produced by eruptions of "microfilaments" of length comparable to the width of observed spicules (approximately 300 kilometers). A plot of the estimated number of the three respective phenomena (flares/CMEs, coronal jets, and spicules) occurring on the Sun at a given time, against the average sizes of erupting filaments, minifilaments, and the putative microfilaments, results in a size distribution that can be fitted with a power-law within the estimated uncertainties. The counterparts of the flares of large-scale eruptions and the JBPs of jets might be weak, pervasive, transient brightenings observed in Hinode/CaII images, and the production of spicules by microfilament eruptions might explain why spicules spin, as do coronal jets. The expected small-scale neutral lines from which the microfilaments would be expected to erupt would be difficult to detect reliably with current instrumentation, but might be apparent with instrumentation of the near future. A full report on this work appears in Sterling and Moore 2016, ApJL, 829, L9.

  15. Some ecological guidelines for large-scale biomass plantations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, W.; Cook, J.H.; Beyea, J.

    1993-12-31

    The National Audubon Society sees biomass as an appropriate and necessary source of energy to help replace fossil fuels in the near future, but is concerned that large-scale biomass plantations could displace significant natural vegetation and wildlife habitat, and reduce national and global biodiversity. We support the development of an industry large enough to provide significant portions of our energy budget, but we see a critical need to ensure that plantations are designed and sited in ways that minimize ecological disruption, or even provide environmental benefits. We have been studying the habitat value of intensively managed short-rotation tree plantations. Our results show that these plantations support large populations of some birds, but not all of the species using the surrounding landscape, and indicate that their value as habitat can be increased greatly by including small areas of mature trees within them. We believe short-rotation plantations can benefit regional biodiversity if they can be deployed as buffers for natural forests, or as corridors connecting forest tracts. To realize these benefits, and to avoid habitat degradation, regional biomass plantation complexes (e.g., the plantations supplying all the fuel for a powerplant) need to be planned, sited, and developed as large-scale units in the context of the regional landscape mosaic.

  16. Extending the Shared Socioeconomic Pathways for sub-national impacts, adaptation, and vulnerability studies

    DOE PAGES

    Absar, Syeda Mariya; Preston, Benjamin L.

    2015-05-25

    The exploration of alternative socioeconomic futures is an important aspect of understanding the potential consequences of climate change. While socioeconomic scenarios are common and, at times essential, tools for the impact, adaptation and vulnerability and integrated assessment modeling research communities, their approaches to scenario development have historically been quite distinct. However, increasing convergence of impact, adaptation and vulnerability and integrated assessment modeling research in terms of scales of analysis suggests there may be value in the development of a common framework for socioeconomic scenarios. The Shared Socioeconomic Pathways (SSPs) represent an opportunity for the development of such a common framework. However, the scales at which these global storylines have been developed are largely incommensurate with the sub-national scales at which impact, adaptation and vulnerability, and increasingly integrated assessment modeling, studies are conducted. Our objective for this study was to develop sub-national and sectoral extensions of the global SSP storylines in order to identify future socioeconomic challenges for adaptation for the U.S. Southeast. A set of nested qualitative socioeconomic storyline elements, integrated storylines, and accompanying quantitative indicators were developed through an application of the Factor-Actor-Sector framework. Finally, in addition to revealing challenges and opportunities associated with the use of the SSPs as a basis for more refined scenario development, this study generated sub-national storyline elements and storylines that can subsequently be used to explore the implications of alternative subnational socioeconomic futures for the assessment of climate change impacts and adaptation.

  17. Future projection of Indian summer monsoon variability under climate change scenario: An assessment from CMIP5 climate models

    NASA Astrophysics Data System (ADS)

    Sharmila, S.; Joseph, S.; Sahai, A. K.; Abhilash, S.; Chattopadhyay, R.

    2015-01-01

    In this study, the impact of enhanced anthropogenic greenhouse gas emissions on possible future changes in different aspects of daily-to-interannual variability of the Indian summer monsoon (ISM) is systematically assessed using 20 coupled models participating in the Coupled Model Intercomparison Project Phase 5. The historical (1951-1999) and future (2051-2099) simulations under the strongest Representative Concentration Pathway have been analyzed for this purpose. A few reliable models are selected based on their competence in simulating the basic features of present-climate ISM variability. The robust and consistent projections across the selected models suggest substantial changes in ISM variability by the end of the 21st century, indicating strong sensitivity of the ISM to global warming. On the seasonal scale, the all-India summer monsoon mean rainfall is likely to increase moderately in the future, primarily governed by enhanced thermodynamic conditions due to atmospheric warming, but slightly offset by a weakened large-scale monsoon circulation. It is projected that the rainfall magnitude will increase over the core monsoon zone in the future climate, along with lengthening of the season due to late withdrawal. On interannual timescales, it is speculated that the severity and frequency of both strong monsoon (SM) and weak monsoon (WM) years might increase noticeably in the future climate. Substantial changes in the daily variability of the ISM are also projected, largely associated with an increase in heavy rainfall events and a decrease in both low rain-rate events and the number of wet days during the future monsoon. On the subseasonal scale, the model projections depict considerable amplification of higher-frequency (below 30-day) components, although the dominant northward-propagating 30-70 day mode of monsoon intraseasonal oscillations (ISOs) may not change appreciably in a warmer climate. It is speculated that the enhanced high-frequency mode of monsoon ISOs due to increased GHG-induced warming may notably modulate ISM rainfall in the future climate. Both extreme wet and dry episodes are likely to intensify and extend regionally in the future climate, with an enhanced propensity of short active and long break spells. The SM (WM) could also be wetter (drier) in the future due to the increase in longer active (break) spells. However, future changes in the spatial pattern during active/break phases of SM and WM are geographically inconsistent among the models. The results point to the growing climate-related vulnerability over the Indian subcontinent, and further suggest the need for sound adaptation measures and better policy making in the future.

  18. Impacts of Near-Term Climate Change on Irrigation Demands and Crop Yields in the Columbia River Basin

    NASA Astrophysics Data System (ADS)

    Rajagopalan, K.; Chinnayakanahalli, K. J.; Stockle, C. O.; Nelson, R. L.; Kruger, C. E.; Brady, M. P.; Malek, K.; Dinesh, S. T.; Barber, M. E.; Hamlet, A. F.; Yorgey, G. G.; Adam, J. C.

    2018-03-01

    Adaptation to a changing climate is critical to address future global food and water security challenges. While these challenges are global, successful adaptation strategies are often generated at regional scales; therefore, regional-scale studies are critical to inform adaptation decision making. While climate change affects both water supply and demand, water demand is relatively understudied, especially at regional scales. The goal of this work is to address this gap, and characterize the direct impacts of near-term (for the 2030s) climate change and elevated CO2 levels on regional-scale crop yields and irrigation demands for the Columbia River basin (CRB). This question is addressed through a coupled crop-hydrology model that accounts for site-specific and crop-specific characteristics that control regional-scale response to climate change. The overall near-term outlook for agricultural production in the CRB is largely positive, with yield increases for most crops and small overall increases in irrigation demand. However, there are crop-specific and location-specific negative impacts as well, and the aggregate regional response of irrigation demands to climate change is highly sensitive to the spatial crop mix. Low-value pasture/hay varieties of crops—typically not considered in climate change assessments—play a significant role in determining the regional response of irrigation demands to climate change, and thus cannot be overlooked. While the overall near-term outlook for agriculture in the region is largely positive, there may be potential for a negative outlook further into the future, and it is important to consider this in long-term planning.

  19. Multilevel landscape utilization of the Siberian flying squirrel: Scale effects on species habitat use.

    PubMed

    Remm, Jaanus; Hanski, Ilpo K; Tuominen, Sakari; Selonen, Vesa

    2017-10-01

    Animals use and select habitat at multiple hierarchical levels and at different spatial scales within each level. Still, there is little knowledge on the scale effects at different spatial levels of species occupancy patterns. The objective of this study was to examine nonlinear effects and optimal-scale landscape characteristics that affect occupancy of the Siberian flying squirrel, Pteromys volans, in South- and Mid-Finland. We used presence-absence data (n = 10,032 plots of 9 ha) and a novel approach to separate the effects on site-, landscape-, and regional-level occupancy patterns. Our main results were: landscape variables predicted the placement of population patches at least twice as well as they predicted the occupancy of particular sites; the optimal cover of preferred habitat for landscape-level abundance is surprisingly low (10% within a 4-km buffer); and landscape metrics exert different effects on species occupancy and abundance in high versus low population density regions of our study area. We conclude that knowledge of regional variation in landscape utilization will be essential for successful conservation of the species. The results also support the view that large-scale landscape variables have high predictive power in explaining species abundance. Our study demonstrates the complex response of species occurrence at different levels of population configuration to landscape structure. The study also highlights the need for data at large spatial scales to increase the precision of biodiversity mapping and prediction of future trends.
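
    The kind of occupancy analysis summarized above can be illustrated with a minimal presence-absence model. The sketch below is a generic illustration only, not the authors' method or data: it fits a logistic regression with a quadratic term for preferred-habitat cover within a (hypothetical) buffer, which can reproduce a low optimal cover of the sort reported; all data and coefficients are synthetic.

    ```python
    # Illustrative sketch (not the study's code): logistic presence-absence model
    # with a quadratic habitat-cover term, allowing an intermediate optimum.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 1000
    habitat_cover = rng.uniform(0, 0.5, n)          # fraction of preferred habitat in buffer
    # Synthetic occupancy with an optimum near ~10% cover (assumed coefficients)
    logit_p = -2 + 24 * habitat_cover - 120 * habitat_cover**2
    presence = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    # Quadratic term captures the non-linear (optimum-type) response to cover
    X = sm.add_constant(np.column_stack([habitat_cover, habitat_cover**2]))
    model = sm.GLM(presence, X, family=sm.families.Binomial()).fit()
    cover_opt = -model.params[1] / (2 * model.params[2])   # cover at maximum occupancy
    print(f"Estimated optimal habitat cover: {cover_opt:.2f}")
    ```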

  20. Regional modeling of large wildfires under current and potential future climates in Colorado and Wyoming, USA

    USGS Publications Warehouse

    West, Amanda; Kumar, Sunil; Jarnevich, Catherine S.

    2016-01-01

    Regional analysis of large wildfire potential given climate change scenarios is crucial to understanding areas most at risk in the future, yet wildfire models are not often developed and tested at this spatial scale. We fit three historical climate suitability models for large wildfires (i.e. ≥ 400 ha) in Colorado and Wyoming using topography and decadal climate averages corresponding to wildfire occurrence at the same temporal scale. The historical models classified points of known large wildfire occurrence with high accuracies. Using a novel approach in wildfire modeling, we applied the historical models to independent climate and wildfire datasets, and the resulting sensitivities were 0.75, 0.81, and 0.83 for Maxent, Generalized Linear, and Multivariate Adaptive Regression Splines models, respectively. We projected the historical models into future climate space using data from 15 global circulation models and two representative concentration pathway scenarios. Maps from these geospatial analyses can be used to evaluate the changing spatial distribution of climate suitability of large wildfires in these states. April relative humidity was the most important covariate in all models, providing insight into the climate space of large wildfires in this region. These methods incorporate monthly and seasonal climate averages at a spatial resolution relevant to land management (i.e. 1 km2) and provide a tool that can be modified for other regions of North America, or adapted for other parts of the world.
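
    As a side note on the validation step, the sketch below shows one simple way to compute model sensitivity on an independent set of known occurrence points (the fraction of occurrences classified as climatically suitable). It is an assumed, minimal illustration with synthetic scores, not the authors' evaluation code, and the suitability threshold is arbitrary.

    ```python
    # Hedged illustration: sensitivity (true-positive rate) on independent
    # wildfire occurrence points; threshold and scores are assumptions.
    import numpy as np

    def sensitivity(suitability, threshold=0.5):
        """Fraction of known occurrence points classified as suitable."""
        return (suitability >= threshold).mean()

    rng = np.random.default_rng(1)
    scores = rng.beta(4, 2, size=500)     # synthetic suitability at occurrence points
    print(f"Sensitivity: {sensitivity(scores):.2f}")
    ```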

  1. Psychometric Properties of the Perceived Wellness Culture and Environment Support Scale.

    PubMed

    Melnyk, Bernadette Mazurek; Szalacha, Laura A; Amaya, Megan

    2018-05-01

    This study reports on the psychometric properties of the 11-item Perceived Wellness Culture and Environment Support Scale (PWCESS) and its relationship with employee healthy lifestyle beliefs and behaviors. Faculty and staff (N = 3959) at a large public university in the midwestern United States completed the PWCESS along with healthy lifestyle beliefs and behaviors scales. Data were randomly split into two halves: the first half was used to explore the PWCESS's validity and reliability, and the second half to confirm the findings. Principal components analysis indicated a unidimensional construct. The PWCESS was positively related to healthy lifestyle beliefs and behaviors, supporting the scale's validity. Confirmatory factor analysis supported the unidimensional construct (Cronbach's α = .92). Strong evidence supports the validity and reliability of the PWCESS. Future use of this scale could guide workplace intervention strategies to improve organizational wellness culture and employee health outcomes.
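
    For readers unfamiliar with the reliability statistic reported above, the minimal sketch below computes Cronbach's alpha for an 11-item scale and a rough unidimensionality indicator from the item correlation matrix. The item data are simulated; this is not the study's analysis script.

    ```python
    # Minimal sketch with simulated item data: Cronbach's alpha and the share of
    # variance carried by the first principal component of the item correlations.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: (n_respondents, n_items) matrix of item scores."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    rng = np.random.default_rng(42)
    latent = rng.normal(size=(500, 1))
    items = latent + 0.6 * rng.normal(size=(500, 11))    # 11 correlated items
    print(f"alpha = {cronbach_alpha(items):.2f}")

    evals = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))
    print(f"first-component variance share: {evals[-1] / evals.sum():.2f}")
    ```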

  2. Quantifying Livestock Heat Stress Impacts in the Sahel

    NASA Astrophysics Data System (ADS)

    Broman, D.; Rajagopalan, B.; Hopson, T. M.

    2014-12-01

    Livestock heat stress, especially in regions of the developing world with limited adaptive capacity, has a largely unquantified impact on food supply. Heat stress is dominated by ambient air temperature, but relative humidity, wind speed, and solar radiation also contribute; it can reduce livestock growth, milk production, and reproduction rates, and increase mortality. Indices like the thermal-humidity index (THI) are used to quantify the heat stress experienced from climate variables. Livestock experience differing impacts at different index critical thresholds that are empirically determined and specific to species and breed. Several studies have highlighted the limited knowledge of the critical thresholds of heat stress in native livestock breeds, as well as of the current and future impact of heat stress. As adaptation and mitigation strategies to climate change depend on a solid quantitative foundation, this knowledge gap has limited such efforts. To address this gap, we have investigated heat stress impacts in the pastoral system of Sub-Saharan West Africa. We used a stochastic weather generator to quantify both the historical and future variability of heat stress. This approach models temperature, relative humidity, and precipitation, the climate variables controlling heat stress. Incorporating large-scale climate as covariates into this framework provides a better historical fit and allows us to include future CMIP5 GCM projections to examine the climate change impacts on heat stress. Health and production data allow us to examine the influence of this variability on livestock directly, and are considered in conjunction with the confounding impacts of fodder and water access. This understanding provides useful information to decision makers looking to mitigate the impacts of climate change and can provide useful seasonal forecasts of heat stress risk. A comparison of the current and future heat stress conditions based on climate variables for West Africa will be presented. An assessment of current and future risk was obtained by linking climatic heat stress to cattle health and production. Seasonal forecasts of heat stress are also provided by modeling the heat stress climate variables using persistent large-scale climate features.
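
    As an illustration of the index mentioned above, the sketch below computes one widely used temperature-humidity index (THI) formulation from air temperature and relative humidity and maps it onto example stress categories. Both the formula variant and the thresholds are assumptions for illustration; the abstract does not specify which variant or breed-specific thresholds the study used.

    ```python
    # Hedged sketch: one common THI formulation for livestock heat stress.
    # Index variant and category thresholds are assumed, not from the study.
    def thi(temp_c: float, rel_humidity: float) -> float:
        """THI from air temperature (deg C) and relative humidity (0-1 fraction)."""
        return 0.8 * temp_c + rel_humidity * (temp_c - 14.4) + 46.4

    def stress_category(index: float) -> str:
        # Illustrative cattle thresholds (assumed)
        if index < 72:
            return "no stress"
        elif index < 79:
            return "mild"
        elif index < 89:
            return "severe"
        return "emergency"

    value = thi(35.0, 0.60)
    print(f"THI = {value:.1f} -> {stress_category(value)}")
    ```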

  3. Characteristic mega-basin water storage behavior using GRACE.

    PubMed

    Reager, J T; Famiglietti, James S

    2013-06-01

    A long-standing challenge for hydrologists has been a lack of observational data on global-scale basin hydrological behavior. With observations from NASA's Gravity Recovery and Climate Experiment (GRACE) mission, hydrologists are now able to study terrestrial water storage for large river basins (>200,000 km2), with monthly time resolution. Here we provide results of a time series model of basin-averaged GRACE terrestrial water storage anomaly and Global Precipitation Climatology Project precipitation for the world's largest basins. We address the short (10 year) length of the GRACE record by adopting a parametric spectral method to calculate frequency-domain transfer functions of storage response to precipitation forcing and then generalize these transfer functions based on large-scale basin characteristics, such as percent forest cover and basin temperature. Among the parameters tested, results show that temperature, soil water-holding capacity, and percent forest cover are important controls on relative storage variability, while basin area and mean terrain slope are less important. The derived empirical relationships were accurate (0.54 ≤ Ef ≤ 0.84) in modeling global-scale water storage anomaly time series for the study basins using only precipitation, average basin temperature, and two land-surface variables, offering the potential for synthesis of basin storage time series beyond the GRACE observational period. Such an approach could be applied toward gap filling between current and future GRACE missions and for predicting basin storage given predictions of future precipitation.

  4. Characteristic mega-basin water storage behavior using GRACE

    PubMed Central

    Reager, J T; Famiglietti, James S

    2013-01-01

    A long-standing challenge for hydrologists has been a lack of observational data on global-scale basin hydrological behavior. With observations from NASA’s Gravity Recovery and Climate Experiment (GRACE) mission, hydrologists are now able to study terrestrial water storage for large river basins (>200,000 km2), with monthly time resolution. Here we provide results of a time series model of basin-averaged GRACE terrestrial water storage anomaly and Global Precipitation Climatology Project precipitation for the world’s largest basins. We address the short (10 year) length of the GRACE record by adopting a parametric spectral method to calculate frequency-domain transfer functions of storage response to precipitation forcing and then generalize these transfer functions based on large-scale basin characteristics, such as percent forest cover and basin temperature. Among the parameters tested, results show that temperature, soil water-holding capacity, and percent forest cover are important controls on relative storage variability, while basin area and mean terrain slope are less important. The derived empirical relationships were accurate (0.54 ≤ Ef ≤ 0.84) in modeling global-scale water storage anomaly time series for the study basins using only precipitation, average basin temperature, and two land-surface variables, offering the potential for synthesis of basin storage time series beyond the GRACE observational period. Such an approach could be applied toward gap filling between current and future GRACE missions and for predicting basin storage given predictions of future precipitation. PMID:24563556
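
    The transfer-function idea described in the two records above can be sketched with a non-parametric cross-spectral estimate, which here stands in for the paper's parametric spectral method. Everything below (the synthetic forcing, the filter used to fake a storage response, and the Welch/CSD settings) is assumed for illustration only.

    ```python
    # Illustrative sketch (assumed data): frequency-domain transfer function of
    # storage response to precipitation forcing via cross-spectral estimates.
    import numpy as np
    from scipy import signal

    rng = np.random.default_rng(0)
    months = 120                                   # ~10 years of monthly data
    precip = rng.gamma(2.0, 50.0, months)          # synthetic monthly precipitation
    # synthetic storage anomaly: smoothed, lagged response to precipitation
    storage = signal.lfilter([0.3, 0.25, 0.2, 0.15], [1.0], precip - precip.mean())

    f, Pxy = signal.csd(precip, storage, fs=12, nperseg=60)   # cross-spectrum (cycles/yr)
    _, Pxx = signal.welch(precip, fs=12, nperseg=60)          # forcing spectrum
    H = Pxy / Pxx                                             # transfer function estimate
    print(np.abs(H[:5]))                                      # gain at lowest frequencies
    ```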

  5. Superior Red Blood Cell Generation from Human Pluripotent Stem Cells Through a Novel Microcarrier-Based Embryoid Body Platform.

    PubMed

    Sivalingam, Jaichandran; Lam, Alan Tin-Lun; Chen, Hong Yu; Yang, Bin Xia; Chen, Allen Kuan-Liang; Reuveny, Shaul; Loh, Yuin-Han; Oh, Steve Kah-Weng

    2016-08-01

    In vitro generation of red blood cells (RBCs) from human embryonic stem cells and human induced pluripotent stem cells appears to be a promising alternative approach to circumvent shortages in donor-derived blood supplies for clinical applications. Conventional methods for hematopoietic differentiation of human pluripotent stem cells (hPSC) rely on embryoid body (EB) formation and/or coculture with xenogeneic cell lines. However, most current methods for hPSC expansion and EB formation are not amenable to scale-up to levels required for large-scale RBC generation. Moreover, differentiation methods that rely on xenogeneic cell lines would face obstacles for future clinical translation. In this study, we report the development of a serum-free and chemically defined microcarrier-based suspension culture platform for scalable hPSC expansion and EB formation. Improved survival and better quality EBs generated with the microcarrier-based method resulted in significantly improved mesoderm induction and, when combined with hematopoietic differentiation, resulted in at least a 6-fold improvement in hematopoietic precursor expansion, potentially culminating in an 80-fold improvement in the yield of RBC generation compared to a conventional EB-based differentiation method. In addition, we report efficient terminal maturation and generation of mature enucleated RBCs using a coculture system that comprised primary human mesenchymal stromal cells. The microcarrier-based platform could prove to be an appealing strategy for future scale-up of hPSC culture, EB generation, and large-scale generation of RBCs under defined and xeno-free conditions.

  6. Forecasting landscape effects of Mississippi River diversions on elevation and accretion in Louisiana deltaic wetlands under future environmental uncertainty scenarios

    USGS Publications Warehouse

    Wang, Hongqing; Steyer, Gregory D.; Couvillion, Brady R.; Rybczyk, John M.; Beck, Holly J.; Sleavin, William J.; Meselhe, Ehab A.; Allison, Mead A.; Boustany, Ronald G.; Fischenich, Craig J.; Rivera-Monroy, Victor H.

    2014-01-01

    Large sediment diversions are proposed and expected to build new wetlands to alleviate the extensive wetland loss (5,000 km2) affecting coastal Louisiana over the last 78 years. Current assessment and prediction of the impacts of sediment diversions have focused on the capture and dispersal of both water and sediment on the adjacent river side and in the immediate outfall marsh area. However, little is known about the effects of sediment diversions on existing wetland surface elevation and vertical accretion dynamics in the receiving basin at the landscape scale. In this study, we used a spatial wetland surface elevation model developed in support of Louisiana's 2012 Coastal Master Plan to examine such landscape-scale effects of sediment diversions. Multiple sediment diversion projects were incorporated in the model to simulate surface elevation and vertical accretion over the next 50 years (2010-2060) under two environmental (moderate and less optimistic) scenarios. Specifically, we examined landscape-scale surface elevation and vertical accretion trends under diversions with different geographical locations, diverted discharge rates, and geomorphic characteristics of the receiving basin. Model results indicate that small diversions (< 283 m3 s-1) tend to have a limited effect in reducing landscape-scale elevation loss (< 3%) compared to a future without action (FWOA) condition. Large sediment diversions (> 1,500 m3 s-1) are required to achieve landscape-level benefits that promote surface elevation gain via vertical accretion and keep pace with rising sea level.

  7. Unravelling connections between river flow and large-scale climate: experiences from Europe

    NASA Astrophysics Data System (ADS)

    Hannah, D. M.; Kingston, D. G.; Lavers, D.; Stagge, J. H.; Tallaksen, L. M.

    2016-12-01

    The United Nations has identified better knowledge of large-scale water cycle processes as essential for socio-economic development and global water-food-energy security. In this context, and given the ever-growing concerns about climate change/variability and human impacts on hydrology, there is an urgent research need: (a) to quantify space-time variability in regional river flow, and (b) to improve hydroclimatological understanding of climate-flow connections as a basis for identifying current and future water-related issues. In this paper, we draw together studies undertaken at the pan-European scale: (1) to evaluate current methods for assessing space-time dynamics for different streamflow metrics (annual regimes, low flows and high flows) and for linking flow variability to atmospheric drivers (circulation indices, air-masses, gridded climate fields and vapour flux); and (2) to propose a plan for future research connecting streamflow and atmospheric conditions in Europe and elsewhere. We believe this research makes a useful, unique contribution to the literature through a systematic inter-comparison of different streamflow metrics and atmospheric descriptors. In our findings, we highlight the need to consider appropriate atmospheric descriptors (dependent on the target flow metric and region of interest) and to develop analytical techniques that best characterise connections in the ocean-atmosphere-land surface process chain. We stress the need to consider not only atmospheric interactions, but also the role of river basin-scale terrestrial hydrological processes in modifying the climate signal response of river flows.

  8. Effects of climate variability on global scale flood risk

    NASA Astrophysics Data System (ADS)

    Ward, P.; Dettinger, M. D.; Kummu, M.; Jongman, B.; Sperna Weiland, F.; Winsemius, H.

    2013-12-01

    In this contribution we demonstrate the influence of climate variability on flood risk. Globally, flooding is one of the worst natural hazards in terms of economic damages; Munich Re estimates global losses in the last decade to be in excess of $240 billion. As a result, scientifically sound estimates of flood risk at the largest scales are increasingly needed by industry (including multinational companies and the insurance industry) and policy communities. Several assessments of global-scale flood risk under current conditions have recently become available, and this year has seen the first studies assessing how flood risk may change in the future due to global change. However, the influence of climate variability on flood risk has as yet hardly been studied, despite the fact that: (a) in other fields (drought, hurricane damage, food production) this variability is as important for policy and practice as long-term change; and (b) climate variability has a strong influence on peak river flows around the world. To address this issue, this contribution illustrates the influence of ENSO-driven climate variability on flood risk, at both the globally aggregated scale and the scale of countries and large river basins. Although it exerts significant and widespread influences on flood peak discharges in many parts of the world, we show that ENSO does not have a statistically significant influence on flood risk once aggregated to global totals. At the scale of individual countries, though, strong relationships exist over large parts of the Earth's surface. For example, we find particularly strong anomalies of flood risk in El Niño or La Niña years (compared to all years) in southern Africa, parts of western Africa, Australia, parts of Central Eurasia (especially for El Niño), the western USA (especially for La Niña), and parts of South America. These findings have large implications for both decadal climate-risk projections and long-term future climate change research. We carried out the research by simulating daily river discharge using a global hydrological model (PCR-GLOBWB), forced with gridded climate reanalysis time-series. From this, we derived peak annual flood volumes for large-scale river basins globally. These were used to force a global inundation model (dynRout) to map inundation extent and depth for return periods between 2 and 1000 years, under El Niño conditions, neutral conditions, and La Niña conditions. These flood hazard maps were combined with global datasets on socioeconomic variables such as population and income to represent the socioeconomic exposure to flooding, and with depth-damage curves to represent vulnerability.
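
    One standard building block of such risk calculations is the estimation of return-period flood magnitudes from annual peak flows (e.g. separately for El Niño, neutral, and La Niña years). The sketch below fits a Gumbel distribution to synthetic annual maxima and evaluates return levels; the distribution choice and the data are assumptions for illustration, not the study's procedure.

    ```python
    # Hedged sketch: return-period flood levels from annual peak flows,
    # assuming a Gumbel distribution and synthetic data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    annual_peaks = rng.gumbel(loc=1000.0, scale=300.0, size=40)   # synthetic peak flows

    loc, scale = stats.gumbel_r.fit(annual_peaks)
    return_periods = np.array([2, 10, 100, 1000])
    # Flow exceeded on average once every T years: quantile at 1 - 1/T
    return_levels = stats.gumbel_r.ppf(1 - 1 / return_periods, loc=loc, scale=scale)
    for T, q in zip(return_periods, return_levels):
        print(f"{T:5d}-yr flood: {q:8.1f} m^3/s")
    ```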

  9. Role of absorbing aerosols on hot extremes in India in a GCM

    NASA Astrophysics Data System (ADS)

    Mondal, A.; Sah, N.; Venkataraman, C.; Patil, N.

    2017-12-01

    Temperature extremes and heat waves in North-Central India during the summer months of March through June are known for causing significant impact in terms of human health, productivity and mortality. While greenhouse gas-induced global warming is generally believed to intensify the magnitude and frequency of such extremes, aerosols are usually associated with an overall cooling, by virtue of their dominant radiation-scattering nature, in most world regions. Recently, the large-scale atmospheric conditions leading to heat waves and extreme temperatures have been analysed for the North-Central Indian region. However, the role of absorbing aerosols, including black carbon and dust, in mediating hot extremes in the region is still not well understood. In this study, we use 30-year simulations from a chemistry-coupled atmosphere-only General Circulation Model (GCM), ECHAM6-HAM2, forced with evolving aerosol emissions in an interactive aerosol module, along with observed sea surface temperatures, to examine large-scale and mesoscale conditions during hot extremes in India. The model is first validated with observed gridded temperature and reanalysis data, and is found to represent realistically the observed variations in temperature in the North-Central region and the concurrent large-scale atmospheric conditions during high temperature extremes. During these extreme events, changes in near-surface properties include a reduction in single scattering albedo and an enhancement in the short-wave solar heating rate, compared to climatological conditions. This is accompanied by positive anomalies of black carbon and dust aerosol optical depths. We conclude that the large-scale atmospheric conditions such as the presence of anticyclones and clear skies, conducive to heat waves and high temperature extremes, are exacerbated by absorbing aerosols in North-Central India. Future air quality regulations are expected to reduce sulfate particles and their masking of GHG warming. It is concurrently important to mitigate emissions of warming black carbon particles, to manage future climate change-induced hot extremes.

  10. Utilizing Multi-Ensemble of Downscaled CMIP5 GCMs to Investigate Trends and Spatial and Temporal Extent of Drought in Willamette Basin

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, A.; Beal, B.; Moradkhani, H.

    2015-12-01

    Changing climate and potential future increases in global temperature are likely to have impacts on drought characteristics and the hydrologic cycle. In this study, we analyze changes in the temporal and spatial extent of meteorological and hydrological droughts in the future, and their trends. Three statistically downscaled datasets from NASA Earth Exchange Global Daily Downscaled Projections (NEX-GDDP), Multivariate Adaptive Constructed Analogs (MACA), and Bias Correction and Spatial Disaggregation (BCSD-PSU), each consisting of 10 CMIP5 Global Climate Models (GCM), are utilized for the RCP4.5 and RCP8.5 scenarios. Further, the Precipitation Runoff Modeling System (PRMS) hydrologic model is used to simulate streamflow from GCM inputs and assess hydrological drought characteristics. The Standardized Precipitation Index (SPI) and the Streamflow Drought Index (SDI) are the two indices used to investigate meteorological and hydrological drought, respectively. The study is conducted for the Willamette Basin, which has a drainage area of 29,700 km2 and accommodates more than 3 million inhabitants and 25 dams. The analysis is carried out at the annual time scale as well as for three future periods: near future (2010-2039), intermediate future (2040-2069), and far future (2070-2099). Large uncertainty is found across GCM predictions. Results reveal that meteorological drought events are expected to increase in the near future. Severe to extreme drought with large areal coverage and several years of occurrence is predicted around year 2030, with the likelihood of exceptional drought for both drought types. SPI generally shows positive trends, while SDI indicates negative trends in most cases.
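
    A minimal sketch of the SPI computation referenced above is given below: precipitation totals are fitted with a gamma distribution and the resulting cumulative probabilities are mapped to standard-normal z-scores. The accumulation window, the handling of zero precipitation, and the synthetic data are simplifying assumptions, not the study's implementation.

    ```python
    # Minimal SPI sketch (assumed implementation): gamma fit of accumulated
    # precipitation, then transform of cumulative probabilities to z-scores.
    import numpy as np
    from scipy import stats

    def spi(precip: np.ndarray) -> np.ndarray:
        """Standardized Precipitation Index for accumulated precipitation totals."""
        a, loc, scale = stats.gamma.fit(precip, floc=0)     # location fixed at zero
        cdf = stats.gamma.cdf(precip, a, loc=loc, scale=scale)
        return stats.norm.ppf(cdf)                          # standard-normal variate

    rng = np.random.default_rng(3)
    annual_precip = rng.gamma(4.0, 250.0, size=90)           # synthetic annual totals, mm
    print(np.round(spi(annual_precip)[:10], 2))
    ```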

  11. Solving large scale structure in ten easy steps with COLA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J., E-mail: stassev@cfa.harvard.edu, E-mail: matiasz@ias.edu, E-mail: deisenstein@cfa.harvard.edu

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small-scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10^9 M_sun/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10^11 M_sun/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.
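
    For orientation, the toy sketch below computes first-order LPT (Zel'dovich) displacements on a small periodic grid, i.e. the kind of large-scale solution that COLA uses as its comoving frame. It is not the COLA code itself, and the white-noise density field, grid size, and growth factor are placeholder assumptions.

    ```python
    # Toy illustration only (not the COLA code): Zel'dovich (first-order LPT)
    # displacements of particles from a periodic grid, given a toy density field.
    import numpy as np

    n, boxsize = 32, 100.0                      # grid cells per side, Mpc/h
    k = 2 * np.pi * np.fft.fftfreq(n, d=boxsize / n)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0                           # avoid division by zero at k = 0

    rng = np.random.default_rng(0)
    delta = rng.normal(0, 0.05, (n, n, n))      # toy linear overdensity field
    delta_k = np.fft.fftn(delta)

    # Zel'dovich displacement field: psi_k = i k delta_k / k^2 (so div psi = -delta)
    psi_x = np.real(np.fft.ifftn(1j * kx * delta_k / k2))
    psi_y = np.real(np.fft.ifftn(1j * ky * delta_k / k2))
    psi_z = np.real(np.fft.ifftn(1j * kz * delta_k / k2))

    D = 1.0                                     # linear growth factor (toy value)
    grid = np.indices((n, n, n)) * (boxsize / n)
    positions = np.stack([grid[0] + D * psi_x,
                          grid[1] + D * psi_y,
                          grid[2] + D * psi_z]) % boxsize
    print(positions.shape)
    ```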

  12. Wafer scale fabrication of carbon nanotube thin film transistors with high yield

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Boyuan; Liang, Xuelei, E-mail: liangxl@pku.edu.cn, E-mail: ssxie@iphy.ac.cn; Yan, Qiuping

    Carbon nanotube thin film transistors (CNT-TFTs) are promising candidates for future high performance and low cost macro-electronics. However, most of the reported CNT-TFTs are fabricated in small quantities on a relatively small size substrate. The yield of large scale fabrication and the performance uniformity of devices on large size substrates should be improved before CNT-TFTs reach real products. In this paper, 25 200 devices, with various geometries (channel width and channel length), were fabricated on 4-in. rigid and flexible substrates. Almost 100% device yield was obtained on a rigid substrate with high output current (>8 μA/μm), high on/off current ratio (>10^5), and high mobility (>30 cm^2/V·s). More importantly, uniform performance over the 4-in. area was achieved, and the fabrication process can be scaled up. The results give us more confidence in the real application of CNT-TFT technology in the near future.

  13. Holography--An Update.

    ERIC Educational Resources Information Center

    Jacobs, D. J.

    1988-01-01

    This article describes the basic physics of several types of holograms and discusses different recording materials in use. Current and possible future applications of holograms are described as well as their large-scale production. (Author)

  14. Renewable Electricity Futures. Operational Analysis of the Western Interconnection at Very High Renewable Penetrations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brinkman, Gregory

    2015-09-01

    The Renewable Electricity Futures Study (RE Futures)--an analysis of the costs and grid impacts of integrating large amounts of renewable electricity generation into the U.S. power system--examined renewable energy resources, technical issues regarding the integration of these resources into the grid, and the costs associated with high renewable penetration scenarios. These scenarios included up to 90% of annual generation from renewable sources, although most of the analysis was focused on 80% penetration scenarios. Hourly production cost modeling was performed to understand the operational impacts of high penetrations. One of the conclusions of RE Futures was that further work was necessary to understand whether the operation of the system was possible at sub-hourly time scales and during transient events. This study aimed to address part of this by modeling the operation of the power system at sub-hourly time scales using newer methodologies and updated data sets for transmission and generation infrastructure. The goal of this work was to perform a detailed, sub-hourly analysis of very high penetration scenarios for a single interconnection (the Western Interconnection). It focused on operational impacts, and it helps verify that the operational results from the capacity expansion models are useful. The primary conclusion of this study is that sub-hourly operation of the grid is possible with renewable generation levels between 80% and 90%.

  15. Data-based discharge extrapolation: estimating annual discharge for a partially gauged large river basin from its small sub-basins

    NASA Astrophysics Data System (ADS)

    Gong, L.

    2013-12-01

    Large-scale hydrological models and land surface models are by far the only tools for assessing future water resources in climate change impact studies. Those models estimate discharge with large uncertainties, due to the complex interaction between climate and hydrology, the limited quality and availability of data, as well as model uncertainties. A new, purely data-based scale-extrapolation method is proposed to estimate water resources for a large basin solely from selected small sub-basins, which are typically two orders of magnitude smaller than the large basin. Those small sub-basins contain sufficient information, not only on climate and land surface, but also on hydrological characteristics, for the large basin. In the Baltic Sea drainage basin, the best discharge estimation for the gauged area was achieved with sub-basins that cover 2-4% of the gauged area. There exist multiple sets of sub-basins that resemble the climate and hydrology of the basin equally well. Those multiple sets estimate annual discharge for the gauged area consistently well, with a 5% average error. The scale-extrapolation method is completely data-based; therefore it does not force any modelling error into the prediction. The multiple predictions are expected to bracket the inherent variations and uncertainties of the climate and hydrology of the basin. The method can be applied in both un-gauged basins and un-gauged periods with uncertainty estimation.
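
    The core of the scale-extrapolation idea can be sketched in a few lines: take the area-weighted specific discharge of a representative set of gauged sub-basins and apply it to the whole basin area. The numbers and the simple area weighting below are assumptions for illustration; the paper's actual selection of sub-basin sets is more involved.

    ```python
    # Hedged sketch of the scale-extrapolation idea with made-up numbers.
    import numpy as np

    # specific discharge (mm/yr) and area (km^2) of selected gauged sub-basins
    sub_q = np.array([310.0, 285.0, 330.0, 295.0])
    sub_area = np.array([450.0, 900.0, 600.0, 750.0])        # ~2-4% of the large basin
    large_basin_area = 9.0e4                                  # km^2, hypothetical

    q_specific = np.average(sub_q, weights=sub_area)          # area-weighted mm/yr
    annual_volume_km3 = q_specific * 1e-6 * large_basin_area  # mm * km^2 -> km^3
    print(f"estimated annual discharge: {annual_volume_km3:.1f} km^3/yr")
    ```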

  16. Experimental performance evaluation of software defined networking (SDN) based data communication networks for large scale flexi-grid optical networks.

    PubMed

    Zhao, Yongli; He, Ruiying; Chen, Haoran; Zhang, Jie; Ji, Yuefeng; Zheng, Haomian; Lin, Yi; Wang, Xinbo

    2014-04-21

    Software defined networking (SDN) has become a focus in the current information and communication technology area because of its flexibility and programmability. It has been introduced into various network scenarios, such as datacenter networks, carrier networks, and wireless networks. Optical transport networks are also regarded as an important application scenario for SDN, which is adopted as the enabling technology of data communication networks (DCN) instead of generalized multi-protocol label switching (GMPLS). However, the practical performance of SDN-based DCN for large scale optical networks, which is very important for the technology selection in future optical network deployment, has not been evaluated up to now. In this paper we have built a large scale flexi-grid optical network testbed with 1000 virtual optical transport nodes to evaluate the performance of SDN-based DCN, including network scalability, DCN bandwidth limitation, and restoration time. A series of network performance parameters, including blocking probability, bandwidth utilization, average lightpath provisioning time, and failure restoration time, have been demonstrated under various network environments, such as with different traffic loads and different DCN bandwidths. The demonstration in this work can be taken as a proof point for future network deployment.
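
    As a side illustration of one of the performance parameters listed above, the sketch below evaluates the classical Erlang-B formula, which is often used as a first-order analytical reference for blocking probability versus offered load; it is not the testbed measurement procedure described in the paper, and the load and channel counts are arbitrary.

    ```python
    # Side illustration (not the testbed procedure): Erlang-B blocking probability
    # for Poisson traffic offered to a fixed number of channels.
    def erlang_b(offered_load_erlangs: float, channels: int) -> float:
        """Blocking probability via the standard Erlang-B recursion."""
        b = 1.0
        for m in range(1, channels + 1):
            b = offered_load_erlangs * b / (m + offered_load_erlangs * b)
        return b

    for load in (20, 40, 60):
        print(f"load={load} Erl, 50 channels -> blocking={erlang_b(load, 50):.4f}")
    ```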

  17. Fuel mapping for the future

    Treesearch

    C.W. Woodall; G.R. Holden; J.S. Vissage

    2004-01-01

    The large wildland fires that raged during the 2000 and 2002 fire seasons highlighted the need for a nationwide strategic assessment of forest fuels. The lack of a nationally consistent and comprehensive inventory of forest fuels has hindered large-scale assessments, which are essential for effective fuel hazard management and for monitoring reduction treatments. Data from the USDA...

  18. Economically viable large-scale hydrogen liquefaction

    NASA Astrophysics Data System (ADS)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized for capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview of the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  19. MRS3D: 3D Spherical Wavelet Transform on the Sphere

    NASA Astrophysics Data System (ADS)

    Lanusse, F.; Rassat, A.; Starck, J.-L.

    2011-12-01

    Future cosmological surveys will provide 3D large scale structure maps with large sky coverage, for which a 3D Spherical Fourier-Bessel (SFB) analysis is natural. Wavelets are particularly well-suited to the analysis and denoising of cosmological data, but a spherical 3D isotropic wavelet transform does not currently exist to analyse spherical 3D data. We present a new fast Discrete Spherical Fourier-Bessel Transform (DSFBT) based on both a discrete Bessel Transform and the HEALPix angular pixelisation scheme. We tested the 3D wavelet transform and, as a toy application, applied a denoising algorithm in wavelet space to the Virgo large-box cosmological simulations and found we can successfully remove noise without much loss of the large scale structure. The new spherical 3D isotropic wavelet transform, called MRS3D, is ideally suited to analysing and denoising future 3D spherical cosmological surveys; it uses a novel discrete spherical Fourier-Bessel Transform. MRS3D is based on two packages, IDL and HEALPix, and can be used only if both packages have been installed.
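
    The wavelet-space denoising idea can be illustrated with a generic one-dimensional analogue (this is not the MRS3D package or its spherical Fourier-Bessel transform): decompose a noisy signal with PyWavelets, threshold the detail coefficients, and reconstruct. The wavelet family, threshold rule, and signal are assumptions for illustration.

    ```python
    # Generic 1D analogue only: wavelet-domain hard thresholding of a noisy signal.
    import numpy as np
    import pywt

    rng = np.random.default_rng(0)
    x = np.linspace(0, 4 * np.pi, 1024)
    clean = np.sin(x) + 0.5 * np.sin(3 * x)
    noisy = clean + rng.normal(0, 0.3, x.size)

    coeffs = pywt.wavedec(noisy, "db4", level=5)              # forward transform
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745             # noise estimate (MAD)
    thresh = sigma * np.sqrt(2 * np.log(noisy.size))           # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="hard") for c in coeffs[1:]]
    denoised = pywt.waverec(coeffs, "db4")[: noisy.size]

    print(f"residual std before: {np.std(noisy - clean):.3f}, after: {np.std(denoised - clean):.3f}")
    ```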

  20. Numerical simulations of the convective flame in white dwarfs

    NASA Technical Reports Server (NTRS)

    Livne, Eli

    1993-01-01

    A first step toward better understanding of the mechanism driving convective flames in exploding white dwarfs is presented. The propagation of the convective flame is examined using a two-dimensional implicit hydrodynamical code. The large scales of the instability are captured by the grid, while the scales that are smaller than the grid resolution are approximated by a mixing-length approximation. It is found that large-scale perturbations (of the order of the pressure scale height) do grow significantly during the expansion, leading to a very nonspherical burning front. The combustion rate is strongly enhanced (compared to the unperturbed case) during the first second, but later the expansion of the star suppresses the flame speed, leading to only partial incineration of the nuclear fuel. Our results imply that large-scale perturbations by themselves are not enough to explain the mechanism by which convective flames are driven, and a study of the whole spectrum of relevant perturbations is needed. The implications of these preliminary results for future simulations, in the context of current models for Type Ia supernovae, are discussed.

  1. Spatial Temporal Mathematics at Scale: An Innovative and Fully Developed Paradigm to Boost Math Achievement among All Learners

    ERIC Educational Resources Information Center

    Rutherford, Teomara; Kibrick, Melissa; Burchinal, Margaret; Richland, Lindsey; Conley, AnneMarie; Osborne, Keara; Schneider, Stephanie; Duran, Lauren; Coulson, Andrew; Antenore, Fran; Daniels, Abby; Martinez, Michael E.

    2010-01-01

    This paper describes the background, methodology, preliminary findings, and anticipated future directions of a large-scale multi-year randomized field experiment addressing the efficacy of ST Math [Spatial-Temporal Math], a fully-developed math curriculum that uses interactive animated software. ST Math's unique approach minimizes the use of…

  2. Measuring suicidality using the personality assessment inventory: a convergent validity study with federal inmates.

    PubMed

    Patry, Marc W; Magaletta, Philip R

    2015-02-01

    Although numerous studies have examined the psychometric properties and clinical utility of the Personality Assessment Inventory in correctional contexts, only two studies to date have specifically focused on suicide ideation. This article examines the convergent validity of the Suicide Ideation Scale and the Suicide Potential Index on the Personality Assessment Inventory in a large, nontreatment sample of male and female federal inmates (N = 1,120). The data indicated robust validity support for both the Suicide Ideation Scale and Suicide Potential Index, which were each correlated with a broad group of validity indices representing multiple assessment modalities. Recommendations for future research to build upon these findings through replication and extension are made. © The Author(s) 2014.

  3. Differences in flood hazard projections in Europe – their causes and consequences for decision making

    USGS Publications Warehouse

    Kundzewicz, Z. W.; Krysanova, V.; Dankers, R.; Hirabayashi, Y.; Kanae, S.; Hattermann, F. F.; Huang, S.; Milly, Paul C.D.; Stoffel, M.; Driessen, P.P.J.; Matczak, P.; Quevauviller, P.; Schellnhuber, H.-J.

    2017-01-01

    This paper interprets differences in flood hazard projections over Europe and identifies likely sources of discrepancy. Further, it discusses potential implications of these differences for flood risk reduction and adaptation to climate change. The discrepancy in flood hazard projections calls for caution, especially among decision makers in charge of water resources management, flood risk reduction, and climate change adaptation at regional to local scales. Because it is naïve to expect availability of trustworthy quantitative projections of future flood hazard, in order to reduce flood risk one should focus attention on mapping current and future risks and vulnerability hotspots and on improving the situation there. Although an intercomparison of flood hazard projections is carried out in this paper and differences are identified and interpreted, it does not seem possible to recommend which large-scale studies may be considered most credible in particular areas of Europe.

  4. Future potential distribution of the emerging amphibian chytrid fungus under anthropogenic climate change.

    PubMed

    Rödder, Dennis; Kielgast, Jos; Lötters, Stefan

    2010-11-01

    Anthropogenic climate change poses a major threat to global biodiversity with a potential to alter biological interactions at all spatial scales. Amphibians are the most threatened vertebrates and have been subject to increasing conservation attention over the past decade. A particular concern is the pandemic emergence of the parasitic chytrid fungus Batrachochytrium dendrobatidis, which has been identified as the cause of extremely rapid large-scale declines and species extinctions. Experimental and observational studies have demonstrated that the host-pathogen system is strongly influenced by climatic parameters and thereby potentially affected by climate change. Herein we project a species distribution model of the pathogen onto future climatic scenarios generated by the IPCC to examine their potential implications on the pandemic. Results suggest that predicted anthropogenic climate change may reduce the geographic range of B. dendrobatidis and its potential influence on amphibian biodiversity.

  5. Future HEP Accelerators: The US Perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhat, Pushpalatha; Shiltsev, Vladimir

    2015-11-02

    Accelerator technology has advanced tremendously since the introduction of accelerators in the 1930s, and particle accelerators have become indispensable instruments in high energy physics (HEP) research to probe Nature at smaller and smaller distances. At present, accelerator facilities can be classified into Energy Frontier colliders that enable direct discoveries and studies of high mass scale particles and Intensity Frontier accelerators for exploration of extremely rare processes, usually at relatively low energies. The near term strategies of the global energy frontier particle physics community are centered on fully exploiting the physics potential of the Large Hadron Collider (LHC) at CERN through its high-luminosity upgrade (HL-LHC), while the intensity frontier HEP research is focused on studies of neutrinos at the MW-scale beam power accelerator facilities, such as Fermilab Main Injector with the planned PIP-II SRF linac project. A number of next generation accelerator facilities have been proposed and are currently under consideration for the medium- and long-term future programs of accelerator-based HEP research. In this paper, we briefly review the post-LHC energy frontier options, both for lepton and hadron colliders in various regions of the world, as well as possible future intensity frontier accelerator facilities.

  6. Drive to miniaturization: integrated optical networks on mobile platforms

    NASA Astrophysics Data System (ADS)

    Salour, Michael M.; Batayneh, Marwan; Figueroa, Luis

    2011-11-01

    With rapid growth of the Internet, bandwidth demand for data traffic is continuing to explode. In addition, emerging and future applications are becoming more and more network centric. With the proliferation of data communication platforms and data-intensive applications (e.g. cloud computing), high-bandwidth content such as video clips dominating the Internet, and social networking tools, a networking technology that can scale the Internet's capability (particularly its bandwidth) by two to three orders of magnitude is highly desirable. As the limits of Moore's law are approached, optical mesh networks based on wavelength-division multiplexing (WDM) have the ability to satisfy the large- and scalable-bandwidth requirements of our future backbone telecommunication networks. In addition, this trend is also affecting other special-purpose systems in applications such as mobile platforms, automobiles, aircraft, ships, tanks, and micro unmanned air vehicles (UAVs), which are becoming independent systems roaming the sky while sensing data, processing, making decisions, and even communicating and networking with other heterogeneous systems. Recently, WDM optical technology has seen advances in transmission speeds, switching technologies, routing protocols, and control systems. Such advances have made WDM optical technology an appealing choice for the design of future Internet architectures. Along these lines, scientists across the entire spectrum of network architectures, from the physical layer to applications, have been working on developing devices and communication protocols that can take full advantage of the rapid advances in WDM technology. Nevertheless, the focus has always been on large-scale telecommunication networks that span hundreds and even thousands of miles. Given these advances, we investigate the vision and applicability of integrating the traditionally large-scale WDM optical networks into miniaturized mobile platforms such as UAVs. We explain the benefits of WDM optical technology for these applications. We also describe some of the limitations of WDM optical networks as the size of a vehicle gets smaller, such as in micro-UAVs, and study the miniaturization and communication system limitations in such environments.

  7. Characteristics of Tornado-Like Vortices Simulated in a Large-Scale Ward-Type Simulator

    NASA Astrophysics Data System (ADS)

    Tang, Zhuo; Feng, Changda; Wu, Liang; Zuo, Delong; James, Darryl L.

    2018-02-01

    Tornado-like vortices are simulated in a large-scale Ward-type simulator to further advance the understanding of such flows, and to facilitate future studies of tornado wind loading on structures. Measurements of the velocity fields near the simulator floor and the resulting floor surface pressures are interpreted to reveal the mean and fluctuating characteristics of the flow as well as the characteristics of the static-pressure deficit. We focus on the manner in which the swirl ratio and the radial Reynolds number affect these characteristics. The transition of the tornado-like flow from a single-celled vortex to a dual-celled vortex with increasing swirl ratio and the impact of this transition on the flow field and the surface-pressure deficit are closely examined. The mean characteristics of the surface-pressure deficit caused by tornado-like vortices simulated at a number of swirl ratios compare well with the corresponding characteristics recorded during full-scale tornadoes.
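
    For readers unfamiliar with the governing parameters, the following is a minimal sketch of the two nondimensional numbers the study varies, using one common convention for Ward-type simulators (S = r0*Gamma/(2*Q*h) and Re_r = Q/(2*pi*nu*h)); the definitions and the numerical values below are illustrative assumptions, not the settings used in this study:

        import math

        def swirl_ratio(r0, circulation, flow_rate, inflow_depth):
            """S = r0*Gamma / (2*Q*h); equivalent to tan(theta)/(2a) with aspect ratio a = h/r0."""
            return r0 * circulation / (2.0 * flow_rate * inflow_depth)

        def radial_reynolds_number(flow_rate, inflow_depth, nu=1.5e-5):
            """Re_r = Q / (2*pi*nu*h), with nu the kinematic viscosity of air."""
            return flow_rate / (2.0 * math.pi * nu * inflow_depth)

        r0, h, Q, Gamma = 0.5, 0.3, 1.2, 0.9   # m, m, m^3/s, m^2/s (hypothetical values)
        print(f"S = {swirl_ratio(r0, Gamma, Q, h):.2f}, Re_r = {radial_reynolds_number(Q, h):.2e}")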

  8. A Dynamic Evaluation Of A Model And An Estimate Of The Air Quality And Regional Climate Impacts Of Enhanced Solar Power Generation

    NASA Astrophysics Data System (ADS)

    Millstein, D.; Brown, N. J.; Zhai, P.; Menon, S.

    2012-12-01

    We use the WRF/Chem model (Weather Research and Forecasting model with chemistry) and pollutant emissions based on the EPA National Emission Inventories from 2005 and 2008 to model regional climate and air quality over the continental United States. Additionally, 2030 emission scenarios are developed to investigate the effects of future enhancements to solar power generation. Modeling covered 6 summer and 6 winter weeks each year. We model feedback between aerosols and meteorology and thus capture direct and indirect aerosol effects. The grid resolution is 25 km, with no nesting. Between 2005 and 2008 significant emission reductions were reported in the National Emission Inventory. The 2008 weekday emissions over the continental U.S. of SO2 and NO were reduced from 2005 values by 28% and 16%, respectively. Emission reductions of this magnitude are similar in scale to the potential emission reductions from various energy policy initiatives. By evaluating modeled and observed air quality changes from 2005 to 2008, we analyze how well the model represents the effects of historical emission changes. We also gain insight into how well the model might predict the effects of future emission changes. In addition to direct comparisons of model outputs to ground and satellite observations, we compare observed differences between 2005 and 2008 to corresponding modeled differences. Modeling was extended to future scenarios (2030) to simulate air quality and regional climate effects of large-scale adoption of solar power. The year 2030 was selected to allow time for the development of solar generation infrastructure. The 2030 emission scenario was scaled, with separate factors for different economic sectors, from the 2008 National Emissions Inventory. The changes to emissions caused by the introduction of large-scale solar power (here assumed to be 10% of total energy generation) are based on results from a parallel project that used an electricity grid model applied over multiple regions across the country. The regional climate and air quality effects of future large-scale solar power adoption are analyzed in the context of uncertainty quantified by the dynamic evaluation of the historical (2005 and 2008) WRF/Chem simulations.
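
    A minimal sketch of the sector-wise scaling step described above, in which a future inventory is built by applying separate factors to each economic sector of the 2008 baseline; the sector names and factors here are hypothetical, not the values used in the study:

        # Hypothetical 2008 baseline emissions (tons/day) by economic sector.
        baseline_2008 = {"electric_generation": 120.0, "on_road_mobile": 300.0, "industrial": 180.0}

        # Hypothetical 2030 scale factors; e.g. solar displacing part of fossil generation
        # would lower the electric-generation factor relative to the other sectors.
        scale_factors_2030 = {"electric_generation": 0.55, "on_road_mobile": 0.80, "industrial": 0.95}

        scenario_2030 = {sector: baseline_2008[sector] * scale_factors_2030[sector]
                         for sector in baseline_2008}
        for sector, value in scenario_2030.items():
            print(f"{sector:20s} {value:8.1f} tons/day")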

  9. Enhancing Performance of Large-Area Organic Solar Cells with Thick Film via Ternary Strategy.

    PubMed

    Zhang, Jianqi; Zhao, Yifan; Fang, Jin; Yuan, Liu; Xia, Benzheng; Wang, Guodong; Wang, Zaiyu; Zhang, Yajie; Ma, Wei; Yan, Wei; Su, Wenming; Wei, Zhixiang

    2017-06-01

    Large-scale fabrication of organic solar cells requires an active layer with high thickness tolerability and the use of environment-friendly solvents. High-performance thick films can be achieved via the ternary strategy studied herein. The ternary system consists of one polymer donor, one small molecule donor, and one fullerene acceptor. The small molecule enhances the crystallinity and face-on orientation of the active layer, leading to improved thickness tolerability compared with that of a polymer-fullerene binary system. An active layer with 270 nm thickness exhibits an average power conversion efficiency (PCE) of 10.78%, while the PCE is less than 8% at this thickness for the binary system. Furthermore, large-area devices are successfully fabricated using polyethylene terephthalate (PET)/silver grid or indium tin oxide (ITO)-based transparent flexible substrates. The product shows a high PCE of 8.28% with an area of 1.25 cm2 for a single cell and 5.18% for a 20 cm2 module. This study demonstrates that ternary organic solar cells exhibit great potential for large-scale fabrication and future applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Achieving Agility and Stability in Large-Scale Software Development

    DTIC Science & Technology

    2013-01-16

    A temporary team is assigned to prepare layers and frameworks (presentation layer, domain layer, data access layer) for future feature teams. Software Engineering Institute, Carnegie Mellon University, http://www.sei.cmu.edu/training/elearning

  11. A laser-sheet flow visualization technique for the large wind tunnels of the National Full-Scale Aerodynamics Complex

    NASA Technical Reports Server (NTRS)

    Reinath, M. S.; Ross, J. C.

    1990-01-01

    A flow visualization technique for the large wind tunnels of the National Full Scale Aerodynamics Complex (NFAC) is described. The technique uses a laser sheet generated by the NFAC Long Range Laser Velocimeter (LRLV) to illuminate a smoke-like tracer in the flow. The LRLV optical system is modified slightly, and a scanned mirror is added to generate the sheet. These modifications are described, in addition to the results of an initial performance test conducted in the 80- by 120-Foot Wind Tunnel. During this test, flow visualization was performed in the wake region behind a truck as part of a vehicle drag reduction study. The problems encountered during the test are discussed, in addition to the recommended improvements needed to enhance the performance of the technique for future applications.

  12. Experience in managing a large-scale rescreening of Papanicolaou smears and the pros and cons of measuring proficiency with visual and written examinations.

    PubMed

    Rube, I F

    1989-01-01

    Experiences in a large-scale interlaboratory rescreening of Papanicolaou smears are detailed, and the pros and cons of measuring proficiency in cytology are discussed. Despite the additional work of the rescreening project and some psychological and technical problems, it proved to be a useful measure of the laboratory's performance as a whole. One problem to be avoided in future similar studies is the creation of too many diagnostic categories. Individual testing and certification have been shown to be accurate predictors of proficiency. For cytology, such tests require a strong visual component to test interpretation and judgment skills, such as by the use of glass slides or photomicrographs. The potential of interactive videodisc technology for facilitating cytopathologic teaching and assessment is discussed.

  13. Land grabbing: a preliminary quantification of economic impacts on rural livelihoods.

    PubMed

    Davis, Kyle F; D'Odorico, Paolo; Rulli, Maria Cristina

    2014-01-01

    Global demands on agricultural land are increasing due to population growth, dietary changes and the use of biofuels. Their effect on food security is to reduce humans' ability to cope with the uncertainties of global climate change. In light of the 2008 food crisis, to secure reliable future access to sufficient agricultural land, many nations and corporations have begun purchasing large tracts of land in the global South, a phenomenon deemed "land grabbing" by popular media. Because land investors frequently export crops without providing adequate employment, this represents an effective income loss for local communities. We study 28 countries targeted by large-scale land acquisitions [comprising 87 % of reported cases and 27 million hectares (ha)] and estimate the effects of such investments on local communities' incomes. We find that this phenomenon can potentially affect the incomes of ~12 million people globally with implications for food security, poverty levels and urbanization. While it is important to note that our study incorporates a number of assumptions and limitations, it provides a much needed initial quantification of the economic impacts of large-scale land acquisitions on rural livelihoods.
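
    As a back-of-the-envelope illustration of the kind of scaling such a quantification involves (the country labels and numbers below are hypothetical and are not the paper's data or method), a count of potentially affected people can be built from the acquired area and the rural population supported per hectare of cropland in each target country:

        # Hypothetical inputs: hectares acquired and rural people supported per hectare.
        acquired_area_ha = {"country_A": 1.5e6, "country_B": 0.8e6}
        rural_pop_per_ha = {"country_A": 0.6, "country_B": 0.4}

        affected = {c: acquired_area_ha[c] * rural_pop_per_ha[c] for c in acquired_area_ha}
        print(affected, "total:", f"{sum(affected.values()):,.0f}", "people")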

  14. Calculation of the coherent synchrotron radiation impedance from a wiggler

    NASA Astrophysics Data System (ADS)

    Wu, Juhao; Raubenheimer, Tor O.; Stupakov, Gennady V.

    2003-04-01

    Most studies of coherent synchrotron radiation (CSR) have considered only the radiation from independent dipole magnets. However, in the damping rings of future linear colliders, a large fraction of the radiation power will be emitted in damping wigglers. In this paper, the longitudinal wakefield and impedance due to CSR in a wiggler are derived in the limit of a large wiggler parameter K. After an appropriate scaling, the results can be expressed in terms of universal functions, which are independent of K. Analytical asymptotic results are obtained for the wakefield in the limit of large and small distances, and for the impedance in the limit of small and high frequencies.

  15. A High Resolution View of Galactic Centers: Arp 220 and M31

    NASA Astrophysics Data System (ADS)

    Lockhart, Kelly E.

    The centers of galaxies are small in size and yet incredibly complex. They play host to supermassive black holes and nuclear star clusters (NSCs) and are subject to large gas inflows, nuclear starbursts, and active galactic nucleus (AGN) activity. They can also be the launching site for large-scale galactic outflows. However, though these systems are quite important to galactic evolution, observations are difficult due to their small size. Using high spatial resolution narrowband imaging with HST/WFC3 of Arp 220, a late-stage galaxy merger, I discover an ionized gas bubble feature (r = 600 pc) just off the nucleus. The bubble is aligned with both the western nucleus and with the large-scale galactic outflow. Using energetics arguments, I link the bubble with a young, obscured AGN or with an intense nuclear starburst. Given its alignment along the large-scale outflow axis, I argue that the bubble presents evidence for a link between the galactic center and the large-scale outflow. I also present new observations of the NSC in M31, the closest large spiral galaxy to our own. Using the OSIRIS near-infrared integral field spectrograph (IFS) on Keck, I map the kinematics of the old stellar population in the eccentric disk of the NSC. I compare the observations to models to derive a precession speed of the disk of 0 ± 5 km s^-1 pc^-1, and hence confirm that winds from the old stellar population may be the source of gas needed to form the young stellar population in the NSC. Studies of galactic centers are dependent on high spatial resolution observations. In particular, IFSs are ideal instruments for these studies as they provide two-dimensional spectroscopy of the field of view, enabling 2D kinematic studies. I report on work to characterize and improve the data reduction pipeline of the OSIRIS IFS, and discuss implications for future generations of IFS instrumentation.

  16. Genome resequencing in Populus: Revealing large-scale genome variation and implications on specialized-trait genomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muchero, Wellington; Labbe, Jessy L; Priya, Ranjan

    2014-01-01

    To date, Populus ranks among a few plant species with a complete genome sequence and other highly developed genomic resources. With the first genome sequence among all tree species, Populus has been adopted as a suitable model organism for genomic studies in trees. However, far from being just a model species, Populus is a key renewable economic resource that plays a significant role in providing raw materials for the biofuel and pulp and paper industries. Therefore, aside from leading frontiers of basic tree molecular biology and ecological research, Populus leads frontiers in addressing global economic challenges related to fuel and fiber production. The latter fact suggests that research aimed at improving the quality and quantity of Populus as a raw material will likely drive the pursuit of more targeted and deeper research in order to unlock the economic potential tied up in the molecular biology processes that drive this tree species. Advances in genome sequence-driven technologies, such as resequencing of individual genotypes, which in turn facilitate large-scale SNP discovery and identification of large-scale polymorphisms, are key determinants of future success in these initiatives. In this treatise we discuss the implications of genome sequence-enabled technologies for Populus genomic and genetic studies of complex and specialized traits.

  17. The latest developments and outlook for hydrogen liquefaction technology

    NASA Astrophysics Data System (ADS)

    Ohlig, K.; Decker, L.

    2014-01-01

    Liquefied hydrogen is presently used mainly for space applications and the semiconductor industry. While clean energy applications, e.g. in the automotive sector, currently contribute only a small share of this demand, that share may see a significant boost in the coming years, with a need for large-scale liquefaction plants far exceeding current plant sizes. Hydrogen liquefaction in small-scale plants, with a maximum capacity of 3 tons per day (tpd), is accomplished with a Brayton refrigeration cycle using helium as the refrigerant. This technology is characterized by low investment costs but lower process efficiency and hence higher operating costs. For larger plants, a hydrogen Claude cycle is used, characterized by higher investment but lower operating costs. However, liquefaction plants meeting the potentially high demand in the clean energy sector will need further optimization with regard to energy efficiency and hence operating costs. The present paper gives an overview of the currently applied technologies, including their thermodynamic and technical background. Areas of improvement are identified to derive process concepts for future large-scale hydrogen liquefaction plants meeting the needs of clean energy applications with optimized energy efficiency and hence minimized operating costs. Compared with other studies in this field, this paper focuses on the application of new technology and innovative concepts that are either readily available or will require only short qualification procedures, and that will therefore allow implementation in plants in the near future.

  18. Sustainability of groundwater supplies in the Northern Atlantic Coastal Plain aquifer system

    USGS Publications Warehouse

    Masterson, John P.; Pope, Jason P.

    2016-08-31

    The U.S. Geological Survey (USGS) is conducting large-scale multidisciplinary regional studies of groundwater availability as part of its ongoing assessments of the principal aquifers of the Nation. These regional studies are intended to provide citizens, communities, and natural resource managers with knowledge of the status of the Nation’s groundwater resources and how changes in land use, water use, and climate have affected and are likely to affect those resources now and in the future.

  19. Large scale structures in the kinetic gravity braiding model that can be unbraided

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimura, Rampei; Yamamoto, Kazuhiro, E-mail: rampei@theo.phys.sci.hiroshima-u.ac.jp, E-mail: kazuhiro@hiroshima-u.ac.jp

    2011-04-01

    We study cosmological consequences of a kinetic gravity braiding model, which is proposed as an alternative to the dark energy model. The kinetic braiding model we study is characterized by a parameter n, which corresponds to the original galileon cosmological model for n = 1. We find that the background expansion of the universe of the kinetic braiding model is the same as the Dvali-Turner's model, which reduces to that of the standard cold dark matter model with a cosmological constant (ΛCDM model) for n equal to infinity. We also find that the evolution of the linear cosmological perturbation in the kinetic braiding model reduces to that of the ΛCDM model for n = ∞. Then, we focus our study on the growth history of the linear density perturbation as well as the spherical collapse in the nonlinear regime of the density perturbations, which might be important in order to distinguish between the kinetic braiding model and the ΛCDM model when n is finite. The theoretical prediction for the large scale structure is confronted with the multipole power spectrum of the luminous red galaxy sample of the Sloan Digital Sky survey. We also discuss future prospects of constraining the kinetic braiding model using a future redshift survey like the WFMOS/SuMIRe PFS survey as well as the cluster redshift distribution in the South Pole Telescope survey.

  20. Cosmology with cosmic shear observations: a review.

    PubMed

    Kilbinger, Martin

    2015-07-01

    Cosmic shear is the distortion of images of distant galaxies due to weak gravitational lensing by the large-scale structure in the Universe. Such images are coherently deformed by the tidal field of matter inhomogeneities along the line of sight. By measuring galaxy shape correlations, we can study the properties and evolution of structure on large scales as well as the geometry of the Universe. Thus, cosmic shear has become a powerful probe into the nature of dark matter and the origin of the current accelerated expansion of the Universe. Over the last years, cosmic shear has evolved into a reliable and robust cosmological probe, providing measurements of the expansion history of the Universe and the growth of its structure. We review here the principles of weak gravitational lensing and show how cosmic shear is interpreted in a cosmological context. Then we give an overview of weak-lensing measurements, and present the main observational cosmic-shear results since it was discovered 15 years ago, as well as the implications for cosmology. We then conclude with an outlook on the various future surveys and missions, for which cosmic shear is one of the main science drivers, and discuss promising new weak cosmological lensing techniques for future observations.
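
    To make the core statistic concrete: cosmic-shear analyses typically estimate the shear two-point correlation functions xi_plus and xi_minus from products of the tangential and cross ellipticity components of galaxy pairs binned by angular separation. The following is a brute-force, flat-sky sketch with unit weights (real pipelines use optimized tree codes and per-galaxy weighting); the function and variable names are illustrative, not taken from any specific survey pipeline:

        import numpy as np

        def xi_pm(x, y, e1, e2, theta_edges):
            """Brute-force estimator of xi_plus / xi_minus from galaxy positions and ellipticities."""
            xi_p = np.zeros(len(theta_edges) - 1)
            xi_m = np.zeros_like(xi_p)
            counts = np.zeros_like(xi_p)
            n = len(x)
            for i in range(n):
                for j in range(i + 1, n):
                    dx, dy = x[j] - x[i], y[j] - y[i]
                    k = np.searchsorted(theta_edges, np.hypot(dx, dy)) - 1
                    if k < 0 or k >= len(xi_p):
                        continue
                    phi = np.arctan2(dy, dx)                  # position angle of the pair
                    c2, s2 = np.cos(2 * phi), np.sin(2 * phi)
                    # Rotate each ellipticity into the frame defined by the separation vector.
                    et_i, ex_i = -(e1[i] * c2 + e2[i] * s2), e1[i] * s2 - e2[i] * c2
                    et_j, ex_j = -(e1[j] * c2 + e2[j] * s2), e1[j] * s2 - e2[j] * c2
                    xi_p[k] += et_i * et_j + ex_i * ex_j      # xi_plus accumulates tt + xx
                    xi_m[k] += et_i * et_j - ex_i * ex_j      # xi_minus accumulates tt - xx
                    counts[k] += 1
            good = counts > 0
            xi_p[good] /= counts[good]
            xi_m[good] /= counts[good]
            return xi_p, xi_m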

  1. Defense Acquisitions: Future Aerostat and Airship Investment Decisions Drive Oversight and Coordination Needs

    DTIC Science & Technology

    2012-10-01

    earlier, LEMV experienced schedule delays of at least 10 months, largely rooted in technical, design, and engineering problems in scaling up the airship ...had informal coordination with the Blue Devil Block 2 effort in the past. For example, originally both airships had several diesel engines ...

  2. Natural reproduction in certain cutover pine-fir stands in California

    Treesearch

    H.A. Fowells; G.H. Schubert

    1951-01-01

    Natural reproduction must provide future crops of timber on most of the forest land being placed under management in California. Relatively few acres will be planted or seeded in the near future because planting costs are high, facilities for undertaking large-scale planting are inadequate, and direct seeding has not yet proved satisfactory. In the pine region it is...

  3. Assessing Lebanon's wildfire potential in association with current and future climatic conditions

    Treesearch

    George H. Mitri; Mireille G. Jazi; David McWethy

    2015-01-01

    The increasing occurrence and extent of large-scale wildfires in the Mediterranean have been linked to extended periods of warm and dry weather. We set out to assess Lebanon's wildfire potential in association with current and future climatic conditions. The Keetch-Byram Drought Index (KBDI) was the primary climate variable used in our evaluation of climate/fire...

  4. The scientific targets of the SCOPE mission

    NASA Astrophysics Data System (ADS)

    Fujimoto, M.; Saito, Y.; Tsuda, Y.; Shinohara, I.; Kojima, H.

    The future Japanese magnetospheric mission "SCOPE" is now under study (planned to be launched in 2012). The main purpose of this mission is to investigate the dynamic behavior of plasmas in the Earth's magnetosphere from the viewpoint of cross-scale coupling. Dynamical collisionless space plasma phenomena, even when large scale as a whole, are characterized by coupling over various time and spatial scales. The best example is the magnetic reconnection process, which is a large-scale energy conversion process but has a small key region at the heart of its engine. Inside the key region, electron-scale dynamics plays the key role in liberating the frozen-in constraint, by which reconnection is allowed to proceed. The SCOPE mission is composed of one large mother satellite and four small daughter satellites. The mother spacecraft will be equipped with an electron detector with 10 msec time resolution, so that scales down to the electron scale will be resolved. Three of the four daughter satellites surround the mother satellite three-dimensionally, with mutual distances between several km and several thousand km that are varied during the mission. Plasma measurements on these spacecraft will have 1 sec resolution and will provide information on meso-scale plasma structure. The fourth daughter satellite stays near the mother satellite at a distance of less than 100 km. By correlation between the two plasma wave instruments on the daughter and the mother spacecraft, the propagation of the waves and information on the electron-scale dynamics will be obtained. By this strategy, both meso- and micro-scale information on the dynamics is obtained, which will enable us to investigate the physics of the space plasma from the cross-scale coupling point of view.

  5. Future Research in Health Information Technology: A Review.

    PubMed

    Hemmat, Morteza; Ayatollahi, Haleh; Maleki, Mohammad Reza; Saghafi, Fatemeh

    2017-01-01

    Currently, information technology is considered an important tool to improve healthcare services. To adopt the right technologies, policy makers should have adequate information about present and future advances. This study aimed to review and compare studies with a focus on the future of health information technology. This review study was completed in 2015. The databases used were Scopus, Web of Science, ProQuest, Ovid Medline, and PubMed. Keyword searches were used to identify papers and materials published between 2000 and 2015. Initially, 407 papers were obtained, and they were reduced to 11 papers at the final stage. The selected papers were described and compared in terms of the country of origin, objective, methodology, and time horizon. The papers were divided into two groups: those forecasting the future of health information technology (seven papers) and those providing health information technology foresight (four papers). The results showed that papers related to forecasting the future of health information technology were mostly a literature review, and the time horizon was up to 10 years in most of these studies. In the health information technology foresight group, most of the studies used a combination of techniques, such as scenario building and Delphi methods, and had long-term objectives. To make the most of an investment and to improve planning and successful implementation of health information technology, a strategic plan for the future needs to be set. To achieve this aim, methods such as forecasting the future of health information technology and offering health information technology foresight can be applied. The forecasting method is used when the objectives are not very large, and the foresight approach is recommended when large-scale objectives are set to be achieved. In the field of health information technology, the results of foresight studies can help to establish realistic long-term expectations of the future of health information technology.

  6. Future Research in Health Information Technology: A Review

    PubMed Central

    Hemmat, Morteza; Ayatollahi, Haleh; Maleki, Mohammad Reza; Saghafi, Fatemeh

    2017-01-01

    Introduction: Currently, information technology is considered an important tool to improve healthcare services. To adopt the right technologies, policy makers should have adequate information about present and future advances. This study aimed to review and compare studies with a focus on the future of health information technology. Method: This review study was completed in 2015. The databases used were Scopus, Web of Science, ProQuest, Ovid Medline, and PubMed. Keyword searches were used to identify papers and materials published between 2000 and 2015. Initially, 407 papers were obtained, and they were reduced to 11 papers at the final stage. The selected papers were described and compared in terms of the country of origin, objective, methodology, and time horizon. Results: The papers were divided into two groups: those forecasting the future of health information technology (seven papers) and those providing health information technology foresight (four papers). The results showed that papers related to forecasting the future of health information technology were mostly a literature review, and the time horizon was up to 10 years in most of these studies. In the health information technology foresight group, most of the studies used a combination of techniques, such as scenario building and Delphi methods, and had long-term objectives. Conclusion: To make the most of an investment and to improve planning and successful implementation of health information technology, a strategic plan for the future needs to be set. To achieve this aim, methods such as forecasting the future of health information technology and offering health information technology foresight can be applied. The forecasting method is used when the objectives are not very large, and the foresight approach is recommended when large-scale objectives are set to be achieved. In the field of health information technology, the results of foresight studies can help to establish realistic long-term expectations of the future of health information technology. PMID:28566991

  7. Ongoing Use of Data and Specimens from NCI Sponsored Cancer Prevention Clinical Trials in the Community Clinical Oncology Program

    PubMed Central

    Minasian, Lori; Tangen, Catherine M.; Wickerham, D. Lawrence

    2015-01-01

    Large cancer prevention trials provide opportunities to collect a wide array of data and biospecimens at study entry and longitudinally, for a healthy, aging population without cancer. This provides an opportunity to use pre-diagnostic data and specimens to evaluate hypotheses about the initial development of cancer. This paper reports on strides made by, and future possibilities for, the use of accessible biorepositories developed from precisely annotated samples obtained through large-scale National Cancer Institute (NCI)-sponsored cancer prevention clinical trials conducted by the NCI Cooperative Groups. These large cancer prevention studies, which have enrolled over 80,000 volunteers, continue to contribute to our understanding of cancer development more than 10 years after they were closed. PMID:26433556

  8. Correlates of the MMPI-2-RF in a college setting.

    PubMed

    Forbey, Johnathan D; Lee, Tayla T C; Handel, Richard W

    2010-12-01

    The current study examined empirical correlates of scores on Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF; A. Tellegen & Y. S. Ben-Porath, 2008; Y. S. Ben-Porath & A. Tellegen, 2008) scales in a college setting. The MMPI-2-RF and six criterion measures (assessing anger, assertiveness, sex roles, cognitive failures, social avoidance, and social fear) were administered to 846 college students (n_men = 264, n_women = 582) to examine the convergent and discriminant validity of scores on the MMPI-2-RF Specific Problems and Interest scales. Results demonstrated evidence of generally good convergent score validity for the selected MMPI-2-RF scales, reflected in large effect size correlations with criterion measure scores. Further, MMPI-2-RF scale scores demonstrated adequate discriminant validity, reflected in relatively low comparative median correlations between scores on MMPI-2-RF substantive scale sets and criterion measures. Limitations and future directions are discussed.

  9. Validation of the Short Form of the Academic Procrastination Scale.

    PubMed

    Yockey, Ronald D

    2016-02-01

    The factor structure, internal consistency reliability, and convergent validity of the five-item Academic Procrastination Scale-Short Form were investigated in an ethnically diverse sample of college students. The results provided support for the Academic Procrastination Scale-Short Form as a unidimensional measure of academic procrastination, which possessed good internal consistency reliability in this sample of 282 students. The scale also demonstrated good convergent validity, with moderate to large correlations with both the Procrastination Assessment Scale-Students and the Tuckman Procrastination Scale. Implications of the results are discussed and recommendations for future work are provided.
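
    For context, internal consistency of a short scale like this is typically summarized with Cronbach's alpha. A minimal sketch of the statistic on simulated item responses (the data below are synthetic, not the study's), assuming five items and 282 respondents:

        import numpy as np

        def cronbach_alpha(items):
            """items: array of shape (n_respondents, n_items).
            alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

        rng = np.random.default_rng(0)
        latent = rng.normal(size=(282, 1))                          # shared trait
        responses = latent + rng.normal(scale=0.8, size=(282, 5))   # five correlated items
        print(f"alpha = {cronbach_alpha(responses):.2f}")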

  10. Projection Effects of Large-scale Structures on Weak-lensing Peak Abundances

    NASA Astrophysics Data System (ADS)

    Yuan, Shuo; Liu, Xiangkun; Pan, Chuzhong; Wang, Qiao; Fan, Zuhui

    2018-04-01

    High peaks in weak lensing (WL) maps originate dominantly from the lensing effects of single massive halos. Their abundance is therefore closely related to the halo mass function and thus a powerful cosmological probe. However, besides individual massive halos, large-scale structures (LSS) along lines of sight also contribute to the peak signals. In this paper, with ray-tracing simulations, we investigate the LSS projection effects. We show that for current surveys with a large shape noise, the stochastic LSS effects are subdominant. For future WL surveys with source galaxies having a median redshift z_med ∼ 1 or higher, however, they are significant. For the cosmological constraints derived from observed WL high-peak counts, severe biases can occur if the LSS effects are not taken into account properly. We extend the model of Fan et al. by incorporating the LSS projection effects into the theoretical considerations. By comparing with simulation results, we demonstrate the good performance of the improved model and its applicability in cosmological studies.

  11. Soil carbon management in large-scale Earth system modelling: implications for crop yields and nitrogen leaching

    NASA Astrophysics Data System (ADS)

    Olin, S.; Lindeskog, M.; Pugh, T. A. M.; Schurgers, G.; Wårlind, D.; Mishurov, M.; Zaehle, S.; Stocker, B. D.; Smith, B.; Arneth, A.

    2015-11-01

    Croplands are vital ecosystems for human well-being and provide important ecosystem services such as crop yields, retention of nitrogen, and carbon storage. At large (regional to global) scales, assessments of how these different services will vary in space and time, especially in response to cropland management, are scarce. We explore cropland management alternatives and the effect these can have on future C and N pools and fluxes using the land-use-enabled dynamic vegetation model LPJ-GUESS (Lund-Potsdam-Jena General Ecosystem Simulator). Simulated crop production, cropland carbon storage, carbon sequestration and nitrogen leaching from croplands are evaluated and discussed. Compared to the version of LPJ-GUESS that does not include land-use dynamics, estimates of soil carbon stocks and nitrogen leaching from terrestrial to aquatic ecosystems were improved. Our model experiments allow us to investigate trade-offs between these ecosystem services that can be provided from agricultural fields. These trade-offs are evaluated for current land use and climate and further explored for future conditions within the two future climate change scenarios, RCP (Representative Concentration Pathway) 2.6 and 8.5. Our results show that the potential for carbon sequestration due to typical cropland management practices such as no-till management and cover crops proposed in previous studies is not realised, globally or over larger climatic regions. Our results highlight important considerations to be made when modelling C-N interactions in agricultural ecosystems under future environmental change and the effects these have on terrestrial biogeochemical cycles.

  12. The Potential of Text Mining in Data Integration and Network Biology for Plant Research: A Case Study on Arabidopsis

    PubMed Central

    Van Landeghem, Sofie; De Bodt, Stefanie; Drebert, Zuzanna J.; Inzé, Dirk; Van de Peer, Yves

    2013-01-01

    Despite the availability of various data repositories for plant research, a wealth of information currently remains hidden within the biomolecular literature. Text mining provides the necessary means to retrieve these data through automated processing of texts. However, only recently has advanced text mining methodology been implemented with sufficient computational power to process texts at a large scale. In this study, we assess the potential of large-scale text mining for plant biology research in general and for network biology in particular using a state-of-the-art text mining system applied to all PubMed abstracts and PubMed Central full texts. We present extensive evaluation of the textual data for Arabidopsis thaliana, assessing the overall accuracy of this new resource for usage in plant network analyses. Furthermore, we combine text mining information with both protein–protein and regulatory interactions from experimental databases. Clusters of tightly connected genes are delineated from the resulting network, illustrating how such an integrative approach is essential to grasp the current knowledge available for Arabidopsis and to uncover gene information through guilt by association. All large-scale data sets, as well as the manually curated textual data, are made publicly available, hereby stimulating the application of text mining data in future plant biology studies. PMID:23532071

  13. A study of residence time distribution using radiotracer technique in the large scale plant facility

    NASA Astrophysics Data System (ADS)

    Wetchagarun, S.; Tippayakul, C.; Petchrak, A.; Sukrod, K.; Khoonkamjorn, P.

    2017-06-01

    As the demand for troubleshooting of large industrial plants increases, radiotracer techniques, which have capability to provide fast, online and effective detections to plant problems, have been continually developed. One of the good potential applications of the radiotracer for troubleshooting in a process plant is the analysis of Residence Time Distribution (RTD). In this paper, the study of RTD in a large scale plant facility using radiotracer technique was presented. The objective of this work is to gain experience on the RTD analysis using radiotracer technique in a “larger than laboratory” scale plant setup which can be comparable to the real industrial application. The experiment was carried out at the sedimentation tank in the water treatment facility of Thailand Institute of Nuclear Technology (Public Organization). Br-82 was selected to use in this work due to its chemical property, its suitable half-life and its on-site availability. NH4Br in the form of aqueous solution was injected into the system as the radiotracer. Six NaI detectors were placed along the pipelines and at the tank in order to determine the RTD of the system. The RTD and the Mean Residence Time (MRT) of the tank was analysed and calculated from the measured data. The experience and knowledge attained from this study is important for extending this technique to be applied to industrial facilities in the future.
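
    The quantity extracted from such a tracer experiment is the residence time distribution E(t), obtained by normalizing the measured detector response C(t), together with its first moment, the mean residence time. A minimal sketch on a synthetic tracer curve (not the experimental data from this study):

        import numpy as np

        def rtd_and_mrt(t, c):
            """E(t) = C(t) / integral(C dt);  MRT = integral(t * E(t) dt)."""
            e = c / np.trapz(c, t)
            return e, np.trapz(t * e, t)

        t = np.linspace(0, 600, 601)            # seconds
        c = (t / 120.0) * np.exp(-t / 120.0)    # synthetic detector response
        e, mrt = rtd_and_mrt(t, c)
        print(f"mean residence time ≈ {mrt:.0f} s")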

  14. Numerical Modeling of Propellant Boil-Off in a Cryogenic Storage Tank

    NASA Technical Reports Server (NTRS)

    Majumdar, A. K.; Steadman, T. E.; Maroney, J. L.; Sass, J. P.; Fesmire, J. E.

    2007-01-01

    A numerical model to predict boil-off of stored propellant in large spherical cryogenic tanks has been developed. Accurate prediction of tank boil-off rates for different thermal insulation systems was the goal of this collaboration effort. The Generalized Fluid System Simulation Program, integrating flow analysis and conjugate heat transfer for solving complex fluid system problems, was used to create the model. Calculation of tank boil-off rate requires simultaneous simulation of heat transfer processes among liquid propellant, vapor ullage space, and tank structure. The reference tank for the boil-off model was the 850,000 gallon liquid hydrogen tank at Launch Complex 39B (LC-39B) at Kennedy Space Center, which is under study for future infrastructure improvements to support the Constellation program. The methodology employed in the numerical model was validated using a sub-scale model and tank. Experimental test data from a 1/15th scale version of the LC-39B tank using both liquid hydrogen and liquid nitrogen were used to anchor the analytical predictions of the sub-scale model. Favorable correlations between sub-scale model and experimental test data have provided confidence in full-scale tank boil-off predictions. These methods are now being used in the preliminary design for other cases, including future launch vehicles.
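
    As a back-of-the-envelope check of what such a model computes, a steady heat leak through the tank insulation maps to a boil-off mass flow via the latent heat of vaporization, m_dot = Q_leak / h_fg. The property values and the heat-leak figure below are approximate, illustrative assumptions, not outputs of the GFSSP model:

        H_FG_LH2 = 446e3   # J/kg, approximate latent heat of vaporization of liquid hydrogen
        RHO_LH2 = 70.8     # kg/m^3, approximate liquid hydrogen density

        def boiloff_rate(q_leak_watts):
            """Steady-state boil-off mass flow (kg/s) for a given heat leak (W)."""
            return q_leak_watts / H_FG_LH2

        q_leak = 20e3                   # W, hypothetical total heat leak into the tank
        m_dot = boiloff_rate(q_leak)
        print(f"{m_dot * 86400:.0f} kg/day  (~{m_dot * 86400 / RHO_LH2:.0f} m^3 of liquid per day)")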

  15. Achieving Agility and Stability in Large-Scale Software Development

    DTIC Science & Technology

    2013-01-16

    A temporary team is assigned to prepare layers and frameworks (presentation layer, domain layer, data access layer) for future feature teams. Software Engineering Institute, Carnegie Mellon University, http://www.sei.cmu.edu/training/elearning

  16. Synoptic circulation and temperature pattern during severe wildland fires

    Treesearch

    Warren E. Heilman

    1996-01-01

    Large-scale changes in the atmosphere associated with a globally changed climate and changes in climatic variability may have important regional impacts on the frequency and severity of wildland fires in the future.

  17. Men and Arms in the Middle East: The Human Factor in Military Modernization

    DTIC Science & Technology

    1979-06-01

    countries under study supports their abilities to wield military power effectively, their large-scale reliance on importation of military technologies...statistics, and on quality from area experts. In many cases, we were unable to arrive at numerical estimates of the sources of supply. Likely future...government agencies); on-the-job training (as in the case of counterpart programs); and the direct importation of both military and civilian labor

  18. What Are Student Inservice Teachers Talking about in Their Online Communities of Practice? Investigating Student Inservice Teachers' Experiences in a Double-Layered CoP

    ERIC Educational Resources Information Center

    Lee, Kyungmee; Brett, Clare

    2013-01-01

    This qualitative case study is the first phase of a large-scale design-based research project to implement a theoretically derived double-layered CoP model within real-world teacher development practices. The main goal of this first iteration is to evaluate the courses and test and refine the CoP model for future implementations. This paper…

  19. Multi-scale correlations in different futures markets

    NASA Astrophysics Data System (ADS)

    Bartolozzi, M.; Mellen, C.; di Matteo, T.; Aste, T.

    2007-07-01

    In the present work we investigate the multiscale nature of the correlations for high frequency data (1 min) in different futures markets over a period of two years, starting on the 1st of January 2003 and ending on the 31st of December 2004. In particular, by using the concept of local Hurst exponent, we point out how the behaviour of this parameter, usually considered as a benchmark for persistency/antipersistency recognition in time series, is largely time-scale dependent in the market context. These findings are a direct consequence of the intrinsic complexity of a system where trading strategies are scale-adaptive. Moreover, our analysis points out different regimes in the dynamical behaviour of the market indices under consideration.
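
    A minimal sketch of one way a local Hurst exponent can be estimated, using rescaled-range (R/S) analysis over a sliding window; the convention, window sizes, and the simulated return series below are illustrative assumptions and not the estimator or data used in the paper:

        import numpy as np

        def rs_hurst(x, scales=(8, 16, 32, 64)):
            """Fit log(R/S) against log(scale); the slope estimates the Hurst exponent."""
            rs = []
            for s in scales:
                chunks = x[: len(x) // s * s].reshape(-1, s)
                dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
                r = dev.max(axis=1) - dev.min(axis=1)          # range of cumulative deviations
                sd = chunks.std(axis=1, ddof=1)
                rs.append(np.mean(r / sd))
            slope, _ = np.polyfit(np.log(scales), np.log(rs), 1)
            return slope

        rng = np.random.default_rng(1)
        returns = rng.normal(size=5000)          # uncorrelated returns should give H near 0.5
        window = 512
        local_h = [rs_hurst(returns[i:i + window]) for i in range(0, len(returns) - window, window)]
        print(np.round(local_h, 2))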

  20. Electricity's Future: The Shift to Efficiency and Small-Scale Power. Worldwatch Paper 61.

    ERIC Educational Resources Information Center

    Flavin, Christopher

    Electricity, which has largely supplanted oil as the most controversial energy issue of the 1980s, is at the center of some of the world's bitterest economic and environmental controversies. Soaring costs, high interest rates, and environmental damage caused by large power plants have wreaked havoc on the once booming electricity industry.…

  1. Constraints and consequences of reducing small scale structure via large dark matter-neutrino interactions

    DOE PAGES

    Bertoni, Bridget; Ipek, Seyda; McKeen, David; ...

    2015-04-30

    Here, cold dark matter explains a wide range of data on cosmological scales. However, there has been a steady accumulation of evidence for discrepancies between simulations and observations at scales smaller than galaxy clusters. One promising way to affect structure formation on small scales is a relatively strong coupling of dark matter to neutrinos. We construct an experimentally viable, simple, renormalizable model with new interactions between neutrinos and dark matter and provide the first discussion of how these new dark matter-neutrino interactions affect neutrino phenomenology. We show that addressing the small scale structure problems requires asymmetric dark matter with a mass that is tens of MeV. Generating a sufficiently large dark matter-neutrino coupling requires a new heavy neutrino with a mass around 100 MeV. The heavy neutrino is mostly sterile but has a substantial τ neutrino component, while the three nearly massless neutrinos are partly sterile. This model can be tested by future astrophysical, particle physics, and neutrino oscillation data. Promising signatures of this model include alterations to the neutrino energy spectrum and flavor content observed from a future nearby supernova, anomalous matter effects in neutrino oscillations, and a component of the τ neutrino with mass around 100 MeV.

  2. Mapping the integrated Sachs-Wolfe effect

    NASA Astrophysics Data System (ADS)

    Manzotti, A.; Dodelson, S.

    2014-12-01

    On large scales, the anisotropies in the cosmic microwave background (CMB) reflect not only the primordial density field but also the energy gain when photons traverse decaying gravitational potentials of large scale structure, what is called the integrated Sachs-Wolfe (ISW) effect. Decomposing the anisotropy signal into a primordial piece and an ISW component, the main secondary effect on large scales, is more urgent than ever as cosmologists strive to understand the Universe on those scales. We present a likelihood technique for extracting the ISW signal combining measurements of the CMB, the distribution of galaxies, and maps of gravitational lensing. We test this technique with simulated data showing that we can successfully reconstruct the ISW map using all the data sets together. Then we present the ISW map obtained from a combination of real data: the NRAO VLA sky survey (NVSS) galaxy survey, temperature anisotropies, and lensing maps made by the Planck satellite. This map shows that, with the data sets used and assuming linear physics, there is no evidence, from the reconstructed ISW signal in the Cold Spot region, for an entirely ISW origin of this large scale anomaly in the CMB. However, a large-scale structure origin from low redshift voids outside the NVSS redshift range is still possible. Finally, we show that future surveys, thanks to better large-scale lensing reconstruction, will be able to improve the reconstruction signal-to-noise, which currently comes mainly from galaxy surveys.

  3. Anticipatory Traumatic Reaction: Outcomes Arising From Secondary Exposure to Disasters and Large-Scale Threats.

    PubMed

    Hopwood, Tanya L; Schutte, Nicola S; Loi, Natasha M

    2017-09-01

    Two studies, with a total of 707 participants, developed and examined the reliability and validity of a measure for anticipatory traumatic reaction (ATR), a novel construct describing a form of distress that may occur in response to threat-related media reports and discussions. Exploratory and confirmatory factor analysis resulted in a scale comprising three subscales: feelings related to future threat; preparatory thoughts and actions; and disruption to daily activities. Internal consistency was .93 for the overall ATR scale. The ATR scale demonstrated convergent validity through associations with negative affect, depression, anxiety, stress, neuroticism, and repetitive negative thinking. The scale showed discriminant validity in relationships to Big Five characteristics. The ATR scale had some overlap with a measure of posttraumatic stress disorder, but also showed substantial separate variance. This research provides preliminary evidence for the novel construct of ATR as well as a measure of the construct. The ATR scale will allow researchers to further investigate anticipatory traumatic reaction in the fields of trauma, clinical practice, and social psychology.

  4. An imperative need for global change research in tropical forests.

    PubMed

    Zhou, Xuhui; Fu, Yuling; Zhou, Lingyan; Li, Bo; Luo, Yiqi

    2013-09-01

    Tropical forests play a crucial role in regulating regional and global climate dynamics, and model projections suggest that rapid climate change may result in forest dieback or savannization. However, these predictions are largely based on results from leaf-level studies. How tropical forests respond and feedback to climate change is largely unknown at the ecosystem level. Several complementary approaches have been used to evaluate the effects of climate change on tropical forests, but the results are conflicting, largely due to confounding effects of multiple factors. Although altered precipitation and nitrogen deposition experiments have been conducted in tropical forests, large-scale warming and elevated carbon dioxide (CO2) manipulations are completely lacking, leaving many hypotheses and model predictions untested. Ecosystem-scale experiments to manipulate temperature and CO2 concentration individually or in combination are thus urgently needed to examine their main and interactive effects on tropical forests. Such experiments will provide indispensable data and help gain essential knowledge on biogeochemical, hydrological and biophysical responses and feedbacks of tropical forests to climate change. These datasets can also inform regional and global models for predicting future states of tropical forests and climate systems. The success of such large-scale experiments in natural tropical forests will require an international framework to coordinate collaboration so as to meet the challenges in cost, technological infrastructure and scientific endeavor.

  5. Probing features in the primordial perturbation spectrum with large-scale structure data

    NASA Astrophysics Data System (ADS)

    L'Huillier, Benjamin; Shafieloo, Arman; Hazra, Dhiraj Kumar; Smoot, George F.; Starobinsky, Alexei A.

    2018-06-01

    The form of the primordial power spectrum (PPS) of cosmological scalar (matter density) perturbations is not yet constrained satisfactorily in spite of the tremendous amount of information from the Cosmic Microwave Background (CMB) data. While a smooth power-law-like form of the PPS is consistent with the CMB data, some PPSs with small non-smooth features at large scales can also fit the CMB temperature and polarization data with similar statistical evidence. Future CMB surveys cannot help distinguish all such models due to the cosmic variance at large angular scales. In this paper, we study how well we can differentiate between such featured forms of the PPS not otherwise distinguishable using CMB data. We ran 15 N-body DESI-like simulations of these models to explore this approach. After showing that statistics such as the halo mass function and the two-point correlation function are not able to distinguish these models in a DESI-like survey, we advocate avoiding such a reduction of the dimensionality of the problem, and demonstrate that a simple three-dimensional count-in-cell density field can be much more effective for the purpose of model distinction.
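
    A minimal sketch of the count-in-cell density field referred to above: particles are binned onto a regular three-dimensional grid and converted to the density contrast delta = n/<n> - 1. The grid size, box size, and random particle positions below are illustrative placeholders rather than the simulation settings of the paper:

        import numpy as np

        def count_in_cells(positions, box_size, n_cells):
            """positions: (N, 3) coordinates in [0, box_size). Returns the density contrast grid."""
            edges = np.linspace(0.0, box_size, n_cells + 1)
            counts, _ = np.histogramdd(positions, bins=(edges, edges, edges))
            return counts / counts.mean() - 1.0

        rng = np.random.default_rng(42)
        particles = rng.uniform(0.0, 1000.0, size=(100_000, 3))    # hypothetical Mpc/h box
        delta = count_in_cells(particles, box_size=1000.0, n_cells=64)
        print("density contrast range:", delta.min(), "to", delta.max())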

  6. Spatial Fluctuations in the Diffuse Cosmic X-Ray Background. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Shafer, R. A.

    1983-01-01

    The bright, essentially isotropic, X-ray sky flux above 2 keV yields information on the universe at large distances. However, a definitive understanding of the origin of the flux is lacking. Some fraction of the total flux is contributed by active galactic nuclei and clusters of galaxies, but less than one percent of the total is contributed by sources resolved in the approximately 3 keV band, which is the band where the sky flux is directly observed. Parametric models of AGN (quasar) luminosity function evolution are examined. Most constraints are set by the total sky flux. The acceptability of particular models hinges on assumptions currently not directly testable. The comparison with the Einstein Observatory 1 to keV low flux source counts is hampered by spectral uncertainties. A tentative measurement of a large scale dipole anisotropy is consistent with the velocity and direction derived from the dipole in the microwave background. The impact of the X-ray anisotropy limits for other scales on studies of large-scale structure in the universe is sketched. Models of the origins of the X-ray sky flux are reviewed, and future observational programs outlined.

  7. Using Unplanned Fires to Help Suppressing Future Large Fires in Mediterranean Forests

    PubMed Central

    Regos, Adrián; Aquilué, Núria; Retana, Javier; De Cáceres, Miquel; Brotons, Lluís

    2014-01-01

    Despite the huge resources invested in fire suppression, the impact of wildfires has considerably increased across the Mediterranean region since the second half of the 20th century. Modulating fire suppression efforts in mild weather conditions is an appealing but hotly-debated strategy to use unplanned fires and associated fuel reduction to create opportunities for suppression of large fires in future adverse weather conditions. Using a spatially-explicit fire–succession model developed for Catalonia (Spain), we assessed this opportunistic policy by using two fire suppression strategies that reproduce how firefighters in extreme weather conditions exploit previous fire scars as firefighting opportunities. We designed scenarios by combining different levels of fire suppression efficiency and climatic severity for a 50-year period (2000–2050). An opportunistic fire suppression policy induced large-scale changes in fire regimes and decreased the area burnt under extreme climate conditions, but only accounted for up to 18–22% of the area to be burnt in reference scenarios. The area suppressed in adverse years tended to increase in scenarios with increasing amounts of area burnt during years dominated by mild weather. Climate change had counterintuitive effects on opportunistic fire suppression strategies. Climate warming increased the incidence of large fires under uncontrolled conditions but also indirectly increased opportunities for enhanced fire suppression. Therefore, to shift fire suppression opportunities from adverse to mild years, we would require a disproportionately large amount of area burnt in mild years. We conclude that the strategic planning of fire suppression resources has the potential to become an important cost-effective fuel-reduction strategy at large spatial scale. We do however suggest that this strategy should probably be accompanied by other fuel-reduction treatments applied at broad scales if large-scale changes in fire regimes are to be achieved, especially in the wider context of climate change. PMID:24727853

  8. Using unplanned fires to help suppressing future large fires in Mediterranean forests.

    PubMed

    Regos, Adrián; Aquilué, Núria; Retana, Javier; De Cáceres, Miquel; Brotons, Lluís

    2014-01-01

    Despite the huge resources invested in fire suppression, the impact of wildfires has considerably increased across the Mediterranean region since the second half of the 20th century. Modulating fire suppression efforts in mild weather conditions is an appealing but hotly-debated strategy to use unplanned fires and associated fuel reduction to create opportunities for suppression of large fires in future adverse weather conditions. Using a spatially-explicit fire-succession model developed for Catalonia (Spain), we assessed this opportunistic policy by using two fire suppression strategies that reproduce how firefighters in extreme weather conditions exploit previous fire scars as firefighting opportunities. We designed scenarios by combining different levels of fire suppression efficiency and climatic severity for a 50-year period (2000-2050). An opportunistic fire suppression policy induced large-scale changes in fire regimes and decreased the area burnt under extreme climate conditions, but only accounted for up to 18-22% of the area to be burnt in reference scenarios. The area suppressed in adverse years tended to increase in scenarios with increasing amounts of area burnt during years dominated by mild weather. Climate change had counterintuitive effects on opportunistic fire suppression strategies. Climate warming increased the incidence of large fires under uncontrolled conditions but also indirectly increased opportunities for enhanced fire suppression. Therefore, to shift fire suppression opportunities from adverse to mild years, we would require a disproportionately large amount of area burnt in mild years. We conclude that the strategic planning of fire suppression resources has the potential to become an important cost-effective fuel-reduction strategy at large spatial scale. We do however suggest that this strategy should probably be accompanied by other fuel-reduction treatments applied at broad scales if large-scale changes in fire regimes are to be achieved, especially in the wider context of climate change.

  9. Transitioning a home telehealth project into a sustainable, large-scale service: a qualitative study.

    PubMed

    Wade, Victoria A; Taylor, Alan D; Kidd, Michael R; Carati, Colin

    2016-05-16

    This study was a component of the Flinders Telehealth in the Home project, which tested adding home telehealth to existing rehabilitation, palliative care and geriatric outreach services. Due to the known difficulty of transitioning telehealth projects into ongoing services, a qualitative study was conducted to produce a preferred implementation approach for sustainable and large-scale operations, and a process model that offers practical advice for achieving this goal. Initially, semi-structured interviews were conducted with senior clinicians, health service managers and policy makers, and a thematic analysis of the interview transcripts was undertaken to identify the range of options for ongoing operations, plus the factors affecting sustainability. Subsequently, the interviewees and other decision makers attended a deliberative forum in which participants were asked to select a preferred model for future implementation. Finally, all data from the study were synthesised by the researchers to produce a process model. Nineteen interviews with senior clinicians, managers, and service development staff were conducted; these found strong support for home telehealth but a wide diversity of views on governance, models of clinical care, technical infrastructure operations, and data management. The deliberative forum worked through these options and recommended a collaborative consortium approach for large-scale implementation. The process model proposes that the key factor for large-scale implementation is leadership support, which is enabled by 1) showing solutions to the problems of service demand, budgetary pressure and the relationship between hospital and primary care, 2) demonstrating how home telehealth aligns with health service policies, and 3) achieving clinician acceptance through providing evidence of benefit and developing new models of clinical care. Two key actions to enable change were marketing telehealth to patients, clinicians and policy-makers, and building a community of practice. The implementation of home telehealth services is still in an early stage. Change agents and a community of practice can contribute by marketing telehealth, demonstrating policy alignment and providing potential solutions for difficult health services problems. This should assist health leaders to move from trials to large-scale services.

  10. Optimizing Implementation of Obesity Prevention Programs: A Qualitative Investigation Within a Large-Scale Randomized Controlled Trial.

    PubMed

    Kozica, Samantha L; Teede, Helena J; Harrison, Cheryce L; Klein, Ruth; Lombard, Catherine B

    2016-01-01

    The prevalence of obesity in rural and remote areas is elevated in comparison to urban populations, highlighting the need for interventions targeting obesity prevention in these settings. Implementing evidence-based obesity prevention programs is challenging. This study aimed to investigate factors influencing the implementation of obesity prevention programs, including adoption, program delivery, community uptake, and continuation, specifically within rural settings. Nested within a large-scale randomized controlled trial, a qualitative exploratory approach was adopted, with purposive sampling techniques utilized, to recruit stakeholders from 41 small rural towns in Australia. In-depth semistructured interviews were conducted with clinical health professionals, health service managers, and local government employees. Open coding was completed independently by 2 investigators and thematic analysis undertaken. In-depth interviews revealed that obesity prevention programs were valued by the rural workforce. Program implementation is influenced by interrelated factors in two domains: (1) contextual factors and (2) organizational capacity. Key recommendations to manage the challenges of implementing evidence-based programs focused on reducing program delivery costs, aided by the provision of a suite of implementation and evaluation resources. Informing the scale-up of future prevention programs, stakeholders highlighted the need to build local rural capacity through developing supportive university partnerships, generating local program ownership and promoting active feedback to all program partners. We demonstrate that the rural workforce places a high value on obesity prevention programs. Our results inform the future scale-up of obesity prevention programs, providing an improved understanding of strategies to optimize implementation of evidence-based prevention programs. © 2015 National Rural Health Association.

  11. Identifying and Mitigating Potential Nutrient and Sediment Hot Spots under a Future Scenario in the Missouri River Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, May; Zhang, Zhonglong

    Using the Soil and Water Assessment Tool (SWAT) for large-scale watershed modeling could be useful for evaluating the quality of the water in regions that are dominated by nonpoint sources in order to identify potential “hot spots” for which mitigating strategies could be further developed. An analysis of water quality under future scenarios in which changes in land use would be made to accommodate increased biofuel production was developed for the Missouri River Basin (MoRB) based on a SWAT model application. The analysis covered major agricultural crops and biofuel feedstock in the MoRB, including pasture land, hay, corn, soybeans, wheat, and switchgrass. The analysis examined, at multiple temporal and spatial scales, how nitrate, organic nitrogen, and total nitrogen; phosphorus, organic phosphorus, inorganic phosphorus, and total phosphorus; suspended sediments; and water flow (water yield) would respond to the shifts in land use that would occur under proposed future scenarios. The analysis was conducted at three geospatial scales: (1) large tributary basin scale (two: Upper MoRB and Lower MoRB); (2) regional watershed scale (seven: Upper Missouri River, Middle Missouri River, Middle Lower Missouri River, Lower Missouri River, Yellowstone River, Platte River, and Kansas River); and (3) eight-digit hydrologic unit (HUC-8) subbasin scale (307 subbasins). Results showed that subbasin-level variations were substantial. Nitrogen loadings decreased across the entire Upper MoRB, and they increased in several subbasins in the Lower MoRB. Most nitrate reductions occurred in lateral flow. Also at the subbasin level, phosphorus in organic, sediment, and soluble forms was reduced by 35%, 45%, and 65%, respectively. Suspended sediments increased in 68% of the subbasins. The water yield decreased in 62% of the subbasins. In the Kansas River watershed, the water quality improved significantly with regard to every nitrogen and phosphorus compound. The improvement was clearly attributable to the conversion of a large amount of land to switchgrass. The Middle Lower Missouri River and Lower Missouri River were identified as hot regions. Further analysis identified four subbasins (10240002, 10230007, 10290402, and 10300200) as being the most vulnerable in terms of sediment, nitrogen, and phosphorus loadings. Overall, results suggest that increasing the amount of switchgrass acreage in the hot spots should be considered to mitigate the nutrient loads. The study provides an analytical method to support stakeholders in making informed decisions that balance biofuel production and water sustainability.

  12. Large-scale, high-performance and cloud-enabled multi-model analytics experiments in the context of the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.

    2017-12-01

    The increasing model resolution in the development of comprehensive Earth System Models is rapidly leading to very large volumes of climate simulation output that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets, in the context of a large-scale, international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC) and one more site hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments. At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Gang

    Mid-latitude extreme weather events are responsible for a large part of climate-related damage. Yet large uncertainties remain in climate model projections of heat waves, droughts, and heavy rain/snow events on regional scales, limiting our ability to effectively use these projections for climate adaptation and mitigation. These uncertainties can be attributed to both the lack of spatial resolution in the models and the lack of a dynamical understanding of these extremes. The approach of this project is to relate the fine-scale features to the large scales in current climate simulations, seasonal re-forecasts, and climate change projections in a very wide range of models, including the atmospheric and coupled models of ECMWF over a range of horizontal resolutions (125 to 10 km), the aqua-planet configuration of the Model for Prediction Across Scales and High Order Method Modeling Environments (resolutions ranging from 240 km – 7.5 km) with various physics suites, and selected CMIP5 model simulations. The large-scale circulation will be quantified both on the basis of the well-tested preferred circulation regime approach and on very recently developed measures, the finite-amplitude Wave Activity (FAWA) and its spectrum. The fine-scale structures related to extremes will be diagnosed following the latest approaches in the literature. The goal is to use the large-scale measures as indicators of the probability of occurrence of the finer-scale structures, and hence extreme events. These indicators will then be applied to the CMIP5 models and time-slice projections of a future climate.

  14. The Effects of Run-of-River Hydroelectric Power Schemes on Fish Community Composition in Temperate Streams and Rivers

    PubMed Central

    2016-01-01

    The potential environmental impacts of large-scale storage hydroelectric power (HEP) schemes have been well-documented in the literature. In Europe, awareness of these potential impacts and limited opportunities for politically acceptable medium- to large-scale schemes have caused attention to focus on smaller-scale HEP schemes, particularly run-of-river (ROR) schemes, to contribute to meeting renewable energy targets. Run-of-river HEP schemes are often presumed to be less environmentally damaging than large-scale storage HEP schemes. However, there is currently a lack of peer-reviewed studies on their physical and ecological impact. The aim of this article was to investigate the effects of ROR HEP schemes on communities of fish in temperate streams and rivers, using a Before-After, Control-Impact (BACI) study design. The study makes use of routine environmental surveillance data collected as part of long-term national and international monitoring programmes at 23 systematically-selected ROR HEP schemes and 23 systematically-selected paired control sites. Six area-normalised metrics of fish community composition were analysed using a linear mixed effects model (number of species, number of fish, number of Atlantic salmon—Salmo salar, number of >1 year old Atlantic salmon, number of brown trout—Salmo trutta, and number of >1 year old brown trout). The analyses showed that there was a statistically significant effect (p<0.05) of ROR HEP construction and operation on the number of species. However, no statistically significant effects were detected on the other five metrics of community composition. The implications of these findings are discussed in this article and recommendations are made for best-practice study design for future fish community impact studies. PMID:27191717

  15. The Effects of Run-of-River Hydroelectric Power Schemes on Fish Community Composition in Temperate Streams and Rivers.

    PubMed

    Bilotta, Gary S; Burnside, Niall G; Gray, Jeremy C; Orr, Harriet G

    2016-01-01

    The potential environmental impacts of large-scale storage hydroelectric power (HEP) schemes have been well-documented in the literature. In Europe, awareness of these potential impacts and limited opportunities for politically acceptable medium- to large-scale schemes have caused attention to focus on smaller-scale HEP schemes, particularly run-of-river (ROR) schemes, to contribute to meeting renewable energy targets. Run-of-river HEP schemes are often presumed to be less environmentally damaging than large-scale storage HEP schemes. However, there is currently a lack of peer-reviewed studies on their physical and ecological impact. The aim of this article was to investigate the effects of ROR HEP schemes on communities of fish in temperate streams and rivers, using a Before-After, Control-Impact (BACI) study design. The study makes use of routine environmental surveillance data collected as part of long-term national and international monitoring programmes at 23 systematically-selected ROR HEP schemes and 23 systematically-selected paired control sites. Six area-normalised metrics of fish community composition were analysed using a linear mixed effects model (number of species, number of fish, number of Atlantic salmon-Salmo salar, number of >1 year old Atlantic salmon, number of brown trout-Salmo trutta, and number of >1 year old brown trout). The analyses showed that there was a statistically significant effect (p<0.05) of ROR HEP construction and operation on the number of species. However, no statistically significant effects were detected on the other five metrics of community composition. The implications of these findings are discussed in this article and recommendations are made for best-practice study design for future fish community impact studies.
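
    A minimal sketch of the kind of Before-After, Control-Impact analysis described above, assuming a linear mixed-effects model in which the period-by-treatment interaction captures the effect of scheme construction and operation and each scheme/control pair contributes a random intercept. The data generated here are synthetic and all parameter values are illustrative; this is not the authors' analysis.

      # Minimal BACI sketch with a linear mixed-effects model: the period x treatment
      # interaction is the effect of interest, with a random intercept per site pair.
      # Synthetic counts, not the surveillance data used in the study.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(3)
      pairs = np.repeat(np.arange(23), 4)                      # 23 scheme/control pairs
      period = np.tile(["before", "before", "after", "after"], 23)
      treatment = np.tile(["control", "impact", "control", "impact"], 23)
      # synthetic area-normalised species counts with a small pair-level random effect
      y = (5 + rng.normal(scale=1.0, size=23)[pairs]
             + np.where((period == "after") & (treatment == "impact"), -0.8, 0.0)
             + rng.normal(scale=0.7, size=92))

      df = pd.DataFrame({"species": y, "period": period, "treatment": treatment, "pair": pairs})
      model = smf.mixedlm("species ~ period * treatment", df, groups=df["pair"]).fit()
      print(model.summary())   # the period:treatment term is the BACI effect of interest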

  16. Power from Ocean Waves.

    ERIC Educational Resources Information Center

    Newman, J. N.

    1979-01-01

    Discussed is the utilization of surface ocean waves as a potential source of power. Simple and large-scale wave power devices and conversion systems are described. Alternative utilizations, environmental impacts, and future prospects of this alternative energy source are detailed. (BT)

  17. Human3.6M: Large Scale Datasets and Predictive Methods for 3D Human Sensing in Natural Environments.

    PubMed

    Ionescu, Catalin; Papava, Dragos; Olaru, Vlad; Sminchisescu, Cristian

    2014-07-01

    We introduce a new dataset, Human3.6M, of 3.6 Million accurate 3D Human poses, acquired by recording the performance of 5 female and 6 male subjects, under 4 different viewpoints, for training realistic human sensing systems and for evaluating the next generation of human pose estimation models and algorithms. Besides increasing the size of the datasets in the current state-of-the-art by several orders of magnitude, we also aim to complement such datasets with a diverse set of motions and poses encountered as part of typical human activities (taking photos, talking on the phone, posing, greeting, eating, etc.), with additional synchronized image, human motion capture, and time of flight (depth) data, and with accurate 3D body scans of all the subject actors involved. We also provide controlled mixed reality evaluation scenarios where 3D human models are animated using motion capture and inserted using correct 3D geometry, in complex real environments, viewed with moving cameras, and under occlusion. Finally, we provide a set of large-scale statistical models and detailed evaluation baselines for the dataset illustrating its diversity and the scope for improvement by future work in the research community. Our experiments show that our best large-scale model can leverage our full training set to obtain a 20% improvement in performance compared to a training set of the scale of the largest existing public dataset for this problem. Yet the potential for improvement by leveraging higher-capacity, more complex models with our large dataset is substantially greater and should stimulate future research. The dataset together with code for the associated large-scale learning models, features, visualization tools, as well as the evaluation server, is available online at http://vision.imar.ro/human3.6m.

  18. Large Field of View PIV Measurements of Air Entrainment by SLS SMAT Water Sound Suppression System

    NASA Astrophysics Data System (ADS)

    Stegmeir, Matthew; Pothos, Stamatios; Bissell, Dan

    2015-11-01

    Water-based sound suppression systems have been used to reduce the acoustic impact of space vehicle launches. Water flows at a high rate during launch in order to suppress Engine Generated Acoustics and other potentially damaging sources of noise. For the Space Shuttle, peak flow rates exceeded 900,000 gallons per minute. Such large water flow rates have the potential to induce substantial entrainment of the surrounding air, affecting the launch conditions and generating airflow around the launch vehicle. Validation testing is necessary to quantify this impact for future space launch systems. In this study, PIV measurements were performed to map the flow field above the SMAT sub-scale launch vehicle and scaled launch stand. Air entrainment effects generated by a water-based sound suppression system were studied. Mean and fluctuating fluid velocities were mapped up to 1 m above the test stand deck and compared to simulation results. Measurements were performed in collaboration with NASA MSFC.

  19. Chameleon dark energy models with characteristic signatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gannouji, Radouane (Department of Physics, Faculty of Science, Tokyo University of Science, 1-3 Kagurazaka, Shinjuku-ku, Tokyo 162-8601); Moraes, Bruno

    2010-12-15

    In chameleon dark energy models, local gravity constraints tend to rule out parameters in which observable cosmological signatures can be found. We study viable chameleon potentials consistent with a number of recent observational and experimental bounds. A novel chameleon field potential, motivated by f(R) gravity, is constructed where observable cosmological signatures are present both at the background evolution and in the growth rate of the perturbations. We study the evolution of matter density perturbations on low redshifts for this potential and show that the growth index today γ₀ can have significant dispersion on scales relevant for large-scale structures. The values of γ₀ can be even smaller than 0.2, with large variations of γ on very low redshifts for the model parameters constrained by local gravity tests. This gives a possibility to clearly distinguish these chameleon models from the Λ-cold-dark-matter (ΛCDM) model in future high-precision observations.

  20. Data integration in the era of omics: current and future challenges

    PubMed Central

    2014-01-01

    Integrating heterogeneous and large omics data constitutes not only a conceptual challenge but also a practical hurdle in the daily analysis of omics data. With the rise of novel omics technologies and through large-scale consortia projects, biological systems are being further investigated at an unprecedented scale, generating heterogeneous and often large data sets. These data sets encourage researchers to develop novel data integration methodologies. In this introduction we review the definition and characterize current efforts on data integration in the life sciences. We have used a web-survey to assess current research projects on data integration to tap into the views, needs and challenges as currently perceived by parts of the research community. PMID:25032990

  1. Application of Large-Scale Database-Based Online Modeling to Plant State Long-Term Estimation

    NASA Astrophysics Data System (ADS)

    Ogawa, Masatoshi; Ogai, Harutoshi

    Recently, attention has been drawn to local modeling techniques based on a new idea called “Just-In-Time (JIT) modeling”. To apply JIT modeling online to a large database, “Large-scale database-based Online Modeling (LOM)” has been proposed. LOM is a technique that makes the retrieval of neighboring data more efficient by using both “stepwise selection” and quantization. In order to predict the long-term state of the plant without using future data of manipulated variables, an Extended Sequential Prediction method of LOM (ESP-LOM) has been proposed. In this paper, the LOM and the ESP-LOM are introduced.
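
    To illustrate the general just-in-time idea behind LOM (though not the LOM or ESP-LOM algorithms themselves, whose stepwise selection and quantization steps are omitted here), the following sketch retrieves the nearest stored samples for a query point and fits a local linear model on them; all data and parameter choices are hypothetical.

      # Minimal just-in-time (local) modeling sketch: for each query, retrieve nearby
      # samples from a stored database and fit a local linear model on the fly.
      import numpy as np

      def jit_predict(X_db, y_db, x_query, k=50):
          # Euclidean distance from the query to every stored sample
          d = np.linalg.norm(X_db - x_query, axis=1)
          idx = np.argsort(d)[:k]                              # k nearest neighbours
          Xk = np.column_stack([np.ones(k), X_db[idx]])        # local design matrix with intercept
          coef, *_ = np.linalg.lstsq(Xk, y_db[idx], rcond=None)  # local least squares
          return np.concatenate(([1.0], x_query)) @ coef

      # toy usage with synthetic plant-like data
      rng = np.random.default_rng(0)
      X = rng.normal(size=(5000, 3))                           # stored operating conditions
      y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(scale=0.05, size=5000)
      print(jit_predict(X, y, np.array([0.2, -0.1, 0.4])))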

  2. FutureGen 2.0 Oxy-combustion Large Scale Test – Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenison, LaVesta; Flanigan, Thomas; Hagerty, Gregg

    The primary objectives of the FutureGen 2.0 CO2 Oxy-Combustion Large Scale Test Project were to site, permit, design, construct, and commission an oxy-combustion boiler, gas quality control system, air separation unit, and CO2 compression and purification unit, together with the necessary supporting and interconnection utilities. The project was to demonstrate at commercial scale (168 MWe gross) the capability to cleanly produce electricity through coal combustion at a retrofitted, existing coal-fired power plant; thereby resulting in near-zero emissions of all commonly regulated air emissions, as well as 90% CO2 capture in steady-state operations. The project was to be fully integrated in terms of project management, capacity, capabilities, technical scope, cost, and schedule with the companion FutureGen 2.0 CO2 Pipeline and Storage Project, a separate but complementary project whose objective was to safely transport, permanently store and monitor the CO2 captured by the Oxy-combustion Power Plant Project. The FutureGen 2.0 Oxy-Combustion Large Scale Test Project successfully achieved all technical objectives inclusive of front-end engineering and design, and advanced design required to accurately estimate and contract for the construction, commissioning, and start-up of a commercial-scale "ready to build" power plant using oxy-combustion technology, including full integration with the companion CO2 Pipeline and Storage project. Ultimately the project did not proceed to construction due to insufficient time to complete necessary EPC contract negotiations and commercial financing prior to expiration of federal co-funding, which triggered a DOE decision to close out its participation in the project. Through the work that was completed, valuable technical, commercial, and programmatic lessons were learned. This project has significantly advanced the development of near-zero emission technology and will be helpful in plotting the course of, and successfully executing, future large demonstration projects. This Final Scientific and Technical Report describes the technology and engineering basis of the project, inclusive of process systems, performance, effluents and emissions, and controls. Further, the project cost estimate, schedule, and permitting requirements are presented, along with a project risk and opportunity assessment. Lessons learned related to these elements are summarized in this report. Companion Oxy-Combustion reports further document the accomplishments and learnings of the project, including: A.01 Project Management Report, which describes what was done to coordinate the various participants, and to track their performance with regard to schedule and budget; B.02 Lessons Learned - Technology Integration, Value Improvements, and Program Management, which describes the innovations and conclusions that we arrived upon during the development of the project, and makes recommendations for improvement of future projects of a similar nature; B.03 Project Economics, which details the capital and operation costs and their basis, and also illustrates the cost of power produced by the plant with certain sensitivities; B.04 Power Plant, Pipeline, and Injection Site Interfaces, which details the interfaces between the two FutureGen projects; and B.05 Contractual Mechanisms for Design, Construction, and Operation, which describes the major EPC and Operations contracts required to execute the project.

  3. Searching for modified growth patterns with tomographic surveys

    NASA Astrophysics Data System (ADS)

    Zhao, Gong-Bo; Pogosian, Levon; Silvestri, Alessandra; Zylberberg, Joel

    2009-04-01

    In alternative theories of gravity, designed to produce cosmic acceleration at the current epoch, the growth of large scale structure can be modified. We study the potential of upcoming and future tomographic surveys such as Dark Energy Survey (DES) and Large Synoptic Survey Telescope (LSST), with the aid of cosmic microwave background (CMB) and supernovae data, to detect departures from the growth of cosmic structure expected within general relativity. We employ parametric forms to quantify the potential time- and scale-dependent variation of the effective gravitational constant and the differences between the two Newtonian potentials. We then apply the Fisher matrix technique to forecast the errors on the modified growth parameters from galaxy clustering, weak lensing, CMB, and their cross correlations across multiple photometric redshift bins. We find that even with conservative assumptions about the data, DES will produce nontrivial constraints on modified growth and that LSST will do significantly better.
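
    The Fisher-matrix technique mentioned above can be summarized in a few lines: numerical derivatives of a model observable with respect to the parameters are contracted with the inverse data covariance, and the inverse Fisher matrix gives the forecast parameter errors. The toy power-law model, fiducial values, and covariance below are placeholders, not the multi-probe DES/LSST setup of the paper.

      # Schematic Fisher-matrix forecast: central-difference derivatives of a model
      # observable, weighted by the inverse data covariance; parameter errors come
      # from the inverse Fisher matrix. Model and numbers are illustrative only.
      import numpy as np

      def model(params, k):
          amp, slope = params                       # hypothetical 2-parameter model
          return amp * k**slope

      def fisher_matrix(params, k, cov, step=1e-4):
          derivs = []
          for i in range(len(params)):
              p_hi, p_lo = np.array(params, float), np.array(params, float)
              p_hi[i] += step
              p_lo[i] -= step
              derivs.append((model(p_hi, k) - model(p_lo, k)) / (2 * step))
          cinv = np.linalg.inv(cov)
          return np.array([[d_i @ cinv @ d_j for d_j in derivs] for d_i in derivs])

      k = np.linspace(0.02, 0.2, 20)
      cov = np.diag((0.05 * model([1.0, -1.5], k))**2)   # assumed 5% diagonal errors
      F = fisher_matrix([1.0, -1.5], k, cov)
      errors = np.sqrt(np.diag(np.linalg.inv(F)))        # marginalized 1-sigma forecasts
      print(errors)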

  4. General relativistic description of the observed galaxy power spectrum: Do we understand what we measure?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Jaiyul

    2010-10-15

    We extend the general relativistic description of galaxy clustering developed in Yoo, Fitzpatrick, and Zaldarriaga (2009). For the first time we provide a fully general relativistic description of the observed matter power spectrum and the observed galaxy power spectrum with the linear bias ansatz. It is significantly different from the standard Newtonian description on large scales, and especially its measurements on large scales can be misinterpreted as the detection of the primordial non-Gaussianity even in the absence thereof. The key difference in the observed galaxy power spectrum arises from the real-space matter fluctuation defined as the matter fluctuation at the hypersurface of the observed redshift. As opposed to the standard description, the shape of the observed galaxy power spectrum evolves in redshift, providing additional cosmological information. While the systematic errors in the standard Newtonian description are negligible in the current galaxy surveys at low redshift, a correct general relativistic description is essential for understanding the galaxy power spectrum measurements on large scales in future surveys with redshift depth z ≥ 3. We discuss ways to improve the detection significance in the current galaxy surveys and comment on applications of our general relativistic formalism in future surveys.

  5. Large-scale Cortical Network Properties Predict Future Sound-to-Word Learning Success

    PubMed Central

    Sheppard, John Patrick; Wang, Ji-Ping; Wong, Patrick C. M.

    2013-01-01

    The human brain possesses a remarkable capacity to interpret and recall novel sounds as spoken language. These linguistic abilities arise from complex processing spanning a widely distributed cortical network and are characterized by marked individual variation. Recently, graph theoretical analysis has facilitated the exploration of how such aspects of large-scale brain functional organization may underlie cognitive performance. Brain functional networks are known to possess small-world topologies characterized by efficient global and local information transfer, but whether these properties relate to language learning abilities remains unknown. Here we applied graph theory to construct large-scale cortical functional networks from cerebral hemodynamic (fMRI) responses acquired during an auditory pitch discrimination task and found that such network properties were associated with participants’ future success in learning words of an artificial spoken language. Successful learners possessed networks with reduced local efficiency but increased global efficiency relative to less successful learners and had a more cost-efficient network organization. Regionally, successful and less successful learners exhibited differences in these network properties spanning bilateral prefrontal, parietal, and right temporal cortex, overlapping a core network of auditory language areas. These results suggest that efficient cortical network organization is associated with sound-to-word learning abilities among healthy, younger adults. PMID:22360625
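
    As a rough illustration of the graph-theoretical measures referred to above, the sketch below thresholds a (synthetic) correlation matrix into a binary graph and computes its global and local efficiency with networkx; the data, threshold, and network size are assumptions, not the fMRI pipeline of the study.

      # Minimal sketch: build a binary graph from a synthetic correlation matrix by
      # thresholding, then compute global and local efficiency. Not the study's pipeline.
      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(1)
      ts = rng.normal(size=(90, 200))              # 90 hypothetical regions x 200 time points
      corr = np.corrcoef(ts)                       # functional connectivity matrix
      np.fill_diagonal(corr, 0.0)

      threshold = np.quantile(np.abs(corr), 0.9)   # keep the strongest 10% of edges
      adj = (np.abs(corr) >= threshold).astype(int)
      G = nx.from_numpy_array(adj)

      print("global efficiency:", nx.global_efficiency(G))
      print("local efficiency:", nx.local_efficiency(G))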

  6. Validation of the RAGE Hydrocode for Impacts into Volatile-Rich Targets

    NASA Astrophysics Data System (ADS)

    Plesko, C. S.; Asphaug, E.; Coker, R. F.; Wohletz, K. H.; Korycansky, D. G.; Gisler, G. R.

    2007-12-01

    In preparation for a detailed study of large-scale impacts into the Martian surface, we have validated the RAGE hydrocode (Gittings et al., in press, CSD) against a suite of experiments and statistical models. We present comparisons of hydrocode models to centimeter-scale gas gun impacts (Nakazawa et al. 2002), an underground nuclear test (Perret, 1971), and crater scaling laws (Holsapple 1993, O'Keefe and Ahrens 1993). We have also conducted model convergence and uncertainty analyses which will be presented. Results to date are encouraging for our current model goals, and indicate areas where the hydrocode may be extended in the future. This validation work is focused on questions related to the specific problem of large impacts into volatile-rich targets. The overall goal of this effort is to be able to realistically model large-scale Noachian, and possibly post-Noachian, impacts on Mars not so much to model the crater morphology as to understand the evolution of target volatiles in the post-impact regime, to explore how large craters might set the stage for post-impact hydro-geologic evolution both locally (in the crater subsurface) and globally, due to the redistribution of volatiles from the surface and subsurface into the atmosphere. This work is performed under the auspices of IGPP and the DOE at LANL under contracts W-7405-ENG-36 and DE-AC52-06NA25396. Effort by DK and EA is sponsored by NASA's Mars Fundamental Research Program.

  7. Large scale synthesis of α-Si3N4 nanowires through a kinetically favored chemical vapour deposition process

    NASA Astrophysics Data System (ADS)

    Liu, Haitao; Huang, Zhaohui; Zhang, Xiaoguang; Fang, Minghao; Liu, Yan-gai; Wu, Xiaowen; Min, Xin

    2018-01-01

    Understanding the kinetic barrier and driving force for crystal nucleation and growth is decisive for the synthesis of nanowires with controllable yield and morphology. In this research, we developed an effective reaction system to synthesize very large scale α-Si3N4 nanowires (hundreds of milligrams) and carried out a comparative study to characterize the kinetic influence of gas precursor supersaturation and liquid metal catalyst. The phase composition, morphology, microstructure and photoluminescence properties of the as-synthesized products were characterized by X-ray diffraction, Fourier-transform infrared spectroscopy, field emission scanning electron microscopy, transmission electron microscopy and room temperature photoluminescence measurement. The yield of the products relates not only to the reaction temperature (thermodynamic condition) but also to the distribution of gas precursors (kinetic condition). As revealed in this research, by controlling the gas diffusion process, the yield of the nanowire products could be greatly improved. The experimental results indicate that the supersaturation is the dominant factor in the as-designed system rather than the catalyst. With excellent non-flammability and high thermal stability, the large-scale α-Si3N4 products would have potential applications in improving the strength of high-temperature ceramic composites. The photoluminescence spectrum of the α-Si3N4 shows a blue shift that could be valuable for future applications in blue-green emitting devices. These large-scale products form the basis for all such applications.

  8. Properties and spatial distribution of galaxy superclusters

    NASA Astrophysics Data System (ADS)

    Liivamägi, Lauri Juhan

    2017-01-01

    Astronomy is a science that can offer plenty of unforgettable imagery, and the large-scale distribution of galaxies is no exception. Among the first features the viewer's eye is likely to be drawn to are large concentrations of galaxies - galaxy superclusters, contrasting with the seemingly empty regions beside them. Superclusters can extend from tens to over a hundred megaparsecs; they contain from hundreds to thousands of galaxies, and many galaxy groups and clusters. Unlike galaxy clusters, superclusters are clearly unrelaxed systems, not gravitationally bound as crossing times exceed the age of the universe, and show little to no radial symmetry. Superclusters, as part of the large-scale structure, are sensitive to the initial power spectrum and the following evolution. They are massive enough to leave an imprint on the cosmic microwave background radiation. Superclusters can also provide a unique environment for their constituent galaxies and galaxy clusters. In this study we used two different observational and one simulated galaxy samples to create several catalogues of structures that, we think, correspond to what are generally considered galaxy superclusters. Superclusters were delineated as continuous over-dense regions in galaxy luminosity density fields. When calculating density fields, several corrections were applied to remove small-scale redshift distortions and distance-dependent selection effects. The resulting catalogues of objects display robust statistical properties, showing that flux-limited galaxy samples can be used to create nearly volume-limited catalogues of superstructures. Generally, large superclusters can be regarded as massive, often branching filamentary structures that are mainly characterised by their length. Smaller superclusters, on the other hand, can display a variety of shapes. The spatial distribution of superclusters shows large-scale variations, with high-density concentrations often found in semi-regularly spaced groups. Future studies are needed to quantify the relations between superclusters and finer details of the galaxy distribution. Supercluster catalogues from this thesis have already been used in numerous other studies.

  9. Drought forecasting in Luanhe River basin involving climatic indices

    NASA Astrophysics Data System (ADS)

    Ren, Weinan; Wang, Yixuan; Li, Jianzhu; Feng, Ping; Smith, Ronald J.

    2017-11-01

    Drought is regarded as one of the most severe natural disasters globally. This is especially the case in Tianjin City, Northern China, where drought can affect economic development and people's livelihoods. Drought forecasting, the basis of drought management, is an important mitigation strategy. In this paper, we develop a probabilistic forecasting model that forecasts transition probabilities from a current Standardized Precipitation Index (SPI) value to a future SPI class, based on the conditional distribution of a multivariate normal distribution so that two large-scale climatic indices can be incorporated at the same time, and apply the forecasting model to 26 rain gauges in the Luanhe River basin in North China. The establishment of the model and the derivation of the SPI are based on the hypothesis that aggregated monthly precipitation is normally distributed. Pearson correlation and Shapiro-Wilk normality tests are used to select the appropriate SPI time scale and large-scale climatic indices. Findings indicated that longer-term aggregated monthly precipitation, in general, was more likely to be considered normally distributed and that forecasting models should be applied to each gauge individually rather than to the whole basin. Taking Liying Gauge as an example, we illustrate the impact of the SPI time scale and lead time on transition probabilities. Then, the controlling climatic indices of every gauge are selected by the Pearson correlation test, and the multivariate normality of the SPI, the corresponding climatic indices for the current month, and the SPI 1, 2, and 3 months later is demonstrated using the Shapiro-Wilk normality test. Subsequently, we illustrate the impact of large-scale oceanic-atmospheric circulation patterns on transition probabilities. Finally, we use a score method to evaluate and compare the performance of the three forecasting models and compare them with two traditional models which forecast transition probabilities from a current to a future SPI class. The results show that the three proposed models outperform the two traditional models and that involving large-scale climatic indices can improve the forecasting accuracy.
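
    A minimal sketch of the conditional-distribution idea described above: assuming a joint multivariate normal model for the future SPI, the current SPI and two climatic indices, the future SPI conditional on the observed values is again normal, and the probability of each SPI class follows from the normal CDF. The covariance matrix, observed values, and class boundaries below are illustrative assumptions, not fitted values from the Luanhe River basin.

      # Conditional-normal transition forecast sketch: condition a joint Gaussian model
      # of (future SPI, current SPI, two climatic indices) on the observed values and
      # integrate the resulting normal over each SPI class. All numbers are illustrative.
      import numpy as np
      from scipy.stats import norm

      def conditional_normal(mu, Sigma, x_obs):
          # variable 0 is the future SPI; variables 1.. are the conditioning variables
          s11, s12, s22 = Sigma[0, 0], Sigma[0, 1:], Sigma[1:, 1:]
          w = np.linalg.solve(s22, x_obs - mu[1:])
          mean = mu[0] + s12 @ w
          var = s11 - s12 @ np.linalg.solve(s22, s12)
          return mean, np.sqrt(var)

      # joint model of (SPI 3 months ahead, current SPI, two climatic indices); all are
      # standardized, so the means are zero and the covariance equals the correlation
      mu = np.zeros(4)
      Sigma = np.array([[1.0, 0.5, 0.3, 0.2],
                        [0.5, 1.0, 0.1, 0.1],
                        [0.3, 0.1, 1.0, 0.0],
                        [0.2, 0.1, 0.0, 1.0]])

      mean, sd = conditional_normal(mu, Sigma, np.array([-1.2, 0.8, -0.4]))
      classes = {"extreme drought": (-np.inf, -2.0), "severe drought": (-2.0, -1.5),
                 "moderate drought": (-1.5, -1.0), "near normal or wet": (-1.0, np.inf)}
      for name, (lo, hi) in classes.items():
          p = norm.cdf(hi, mean, sd) - norm.cdf(lo, mean, sd)
          print(f"P({name}) = {p:.2f}")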

  10. Large-Scale medical image analytics: Recent methodologies, applications and Future directions.

    PubMed

    Zhang, Shaoting; Metaxas, Dimitris

    2016-10-01

    Despite the ever-increasing amount and complexity of annotated medical image data, the development of large-scale medical image analysis algorithms has not kept pace with the need for methods that bridge the semantic gap between images and diagnoses. The goal of this position paper is to discuss and explore innovative and large-scale data science techniques in medical image analytics, which will benefit clinical decision-making and facilitate efficient medical data management. Particularly, we advocate that the scale of image retrieval systems should be significantly increased, to the point at which interactive systems can be effective for knowledge discovery in potentially large databases of medical images. For clinical relevance, such systems should return results in real-time, incorporate expert feedback, and be able to cope with the size, quality, and variety of the medical images and their associated metadata for a particular domain. The design, development, and testing of such a framework can significantly impact interactive mining in medical image databases that are growing rapidly in size and complexity and enable novel methods of analysis at much larger scales in an efficient, integrated fashion. Copyright © 2016. Published by Elsevier B.V.

  11. US National Large-scale City Orthoimage Standard Initiative

    USGS Publications Warehouse

    Zhou, G.; Song, C.; Benjamin, S.; Schickler, W.

    2003-01-01

    The early procedures and algorithms for national digital orthophoto generation in the National Digital Orthophoto Program (NDOP) were based on earlier USGS mapping operations, such as field control, aerotriangulation (derived in the early 1920's), the quarter-quadrangle-centered (3.75 minutes of longitude and latitude in geographic extent) 1:40,000 aerial photographs, and 2.5D digital elevation models. However, large-scale city orthophotos produced using these early procedures have disclosed many shortcomings, e.g., ghost images, occlusion, and shadow. Thus, providing the technical basis (algorithms, procedures) and experience needed for large-scale city digital orthophoto creation is essential for the near-future national large-scale digital orthophoto deployment and the revision of the Standards for National Large-scale City Digital Orthophoto in the National Digital Orthophoto Program (NDOP). This paper will report our initial research results as follows: (1) High-precision 3D city DSM generation through LIDAR data processing, (2) Spatial objects/features extraction through surface material information and high-accuracy 3D DSM data, (3) 3D city model development, (4) Algorithm development for generation of DTM-based orthophotos and DBM-based orthophotos, (5) True orthophoto generation by merging DBM-based and DTM-based orthophotos, and (6) Automatic mosaicking by optimizing and combining imagery from many perspectives.

  12. Review of Aerosol–Cloud Interactions: Mechanisms, Significance, and Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fan, Jiwen; Wang, Yuan; Rosenfeld, Daniel

    2016-11-01

    Over the past decade, the number of studies that investigate aerosol-cloud interactions has increased considerably. Although tremendous progress has been made to improve our understanding of basic physical mechanisms of aerosol-cloud interactions and reduce their uncertainties in climate forcing, we still have a poor understanding of (1) some of the mechanisms that interact with each other over multiple spatial and temporal scales, (2) the feedback between microphysical and dynamical processes and between local-scale processes and large-scale circulations, and (3) the significance of cloud-aerosol interactions on weather systems as well as regional and global climate. This review focuses on recent theoretical studies and important mechanisms of aerosol-cloud interactions, and discusses the significance of aerosol impacts on radiative forcing and precipitation extremes associated with different cloud systems. Although significant understanding has been gained about aerosol impacts on the main cloud types, there are still many unknowns, especially those associated with various deep convective systems. Therefore, large efforts are needed to advance our understanding. Future directions should focus on obtaining concurrent measurements of aerosol properties and cloud microphysical and dynamic properties over a range of temporal and spatial scales, collected over typical climate regimes, together with closure studies, as well as improving understanding and parameterizations of cloud microphysics such as ice nucleation, mixed-phase properties, and hydrometeor size and fall speed.

  13. A follow-up study of hygiene in catering premises at large-scale events in the United Kingdom.

    PubMed

    Willis, C; Elviss, N; McLauchlin, J

    2015-01-01

    To investigate food hygiene practices at large events by assessing the microbiological quality of ready-to-eat food, drinking water, food preparation surfaces, cleaning cloths and wristbands worn by food handlers for event security purposes. Over a 7-month period, 1662 samples were collected at 153 events and examined for microbiological contamination. Eight per cent of food samples were of an unsatisfactory quality. A further one per cent contained potentially hazardous levels of human pathogenic bacteria. Twenty-seven per cent of water samples, 32% of swabs and 56% of cloths were also unsatisfactory. These results represented an improvement in hygiene compared to a previous study carried out 12 months previously. A fifth of food handler wristbands were contaminated with Enterobacteriaceae, Escherichia coli and/or coagulase-positive staphylococci, with those bands made from fabric being more frequently contaminated than those made from plastic or other materials. This study provides evidence that food hygiene at large-scale events may have improved. However, there is still a need for continued efforts to maintain an ongoing improvement in cleaning regimes and food hygiene management. This study was part of an ongoing focus on large events in the lead-up to the London 2012 Olympics. Lessons learnt here will be important in the planning of future large events. © 2014 Crown copyright. © 2014 Society for Applied Microbiology. This article is published with the permission of the Controller of HMSO and Queen's Printer for Scotland.

  14. Temporal and Spatial Variation in Peatland Carbon Cycling and Implications for Interpreting Responses of an Ecosystem-Scale Warming Experiment

    Treesearch

    Natalie A. Griffiths; Paul J. Hanson; Daniel M. Ricciuto; Colleen M. Iversen; Anna M. Jensen; Avni Malhotra; Karis J. McFarlane; Richard J. Norby; Khachik Sargsyan; Stephen D. Sebestyen; Xiaoying Shi; Anthony P. Walker; Eric J. Ward; Jeffrey M. Warren; David J. Weston

    2017-01-01

    We are conducting a large-scale, long-term climate change response experiment in an ombrotrophic peat bog in Minnesota to evaluate the effects of warming and elevated CO2 on ecosystem processes using empirical and modeling approaches. To better frame future assessments of peatland responses to climate change, we characterized and compared spatial...

  15. Adolescent health and adult labor market outcomes.

    PubMed

    Lundborg, Petter; Nilsson, Anton; Rooth, Dan-Olof

    2014-09-01

    Whereas a large literature has shown the importance of early life health for adult socioeconomic outcomes, there is little evidence on the importance of adolescent health. We contribute to the literature by studying the impact of adolescent health status on adult labor market outcomes using a unique and large-scale dataset covering almost the entire population of Swedish males. We show that most types of major conditions have long-run effects on future outcomes, and that the strongest effects result from mental conditions. Including sibling fixed effects or twin pair fixed effects reduces the magnitudes of the estimates, but they remain substantial. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Large-scale quarantine following biological terrorism in the United States: scientific examination, logistic and legal limits, and possible consequences.

    PubMed

    Barbera, J; Macintyre, A; Gostin, L; Inglesby, T; O'Toole, T; DeAtley, C; Tonat, K; Layton, M

    2001-12-05

    Concern for potential bioterrorist attacks causing mass casualties has increased recently. Particular attention has been paid to scenarios in which a biological agent capable of person-to-person transmission, such as smallpox, is intentionally released among civilians. Multiple public health interventions are possible to effect disease containment in this context. One disease control measure that has been regularly proposed in various settings is the imposition of large-scale or geographic quarantine on the potentially exposed population. Although large-scale quarantine has not been implemented in recent US history, it has been used on a small scale in biological hoaxes, and it has been invoked in federally sponsored bioterrorism exercises. This article reviews the scientific principles that are relevant to the likely effectiveness of quarantine, the logistic barriers to its implementation, legal issues that a large-scale quarantine raises, and possible adverse consequences that might result from quarantine action. Imposition of large-scale quarantine (compulsory sequestration of groups of possibly exposed persons, or human confinement within certain geographic areas to prevent spread of contagious disease) should not be considered a primary public health strategy in most imaginable circumstances. In the majority of contexts, other less extreme public health actions are likely to be more effective and create fewer unintended adverse consequences than quarantine. Actions and areas for future research, policy development, and response planning efforts are provided.

  17. Natural and drought scenarios in an east central Amazon forest: Fidelity of the Community Land Model 3.5 with three biogeochemical models

    NASA Astrophysics Data System (ADS)

    Sakaguchi, Koichi; Zeng, Xubin; Christoffersen, Bradley J.; Restrepo-Coupe, Natalia; Saleska, Scott R.; Brando, Paulo M.

    2011-03-01

    Recent development of general circulation models involves biogeochemical cycles: flows of carbon and other chemical species that circulate through the Earth system. Such models are valuable tools for future projections of climate, but still bear large uncertainties in the model simulations. One of the regions with especially high uncertainty is the Amazon forest where large-scale dieback associated with the changing climate is predicted by several models. In order to better understand the capability and weakness of global-scale land-biogeochemical models in simulating a tropical ecosystem under the present day as well as significantly drier climates, we analyzed the off-line simulations for an east central Amazon forest by the Community Land Model version 3.5 of the National Center for Atmospheric Research and its three independent biogeochemical submodels (CASA', CN, and DGVM). Intense field measurements carried out under Large Scale Biosphere-Atmosphere Experiment in Amazonia, including forest response to drought from a throughfall exclusion experiment, are utilized to evaluate the whole spectrum of biogeophysical and biogeochemical aspects of the models. Our analysis shows reasonable correspondence in momentum and energy turbulent fluxes, but it highlights three processes that are not in agreement with observations: (1) inconsistent seasonality in carbon fluxes, (2) biased biomass size and allocation, and (3) overestimation of vegetation stress to short-term drought but underestimation of biomass loss from long-term drought. Without resolving these issues the modeled feedbacks from the biosphere in future climate projections would be questionable. We suggest possible directions for model improvements and also emphasize the necessity of more studies using a variety of in situ data for both driving and evaluating land-biogeochemical models.

  18. Regional assessment of the hydropower potential of rivers in West Africa

    NASA Astrophysics Data System (ADS)

    Kling, Harald; Stanzel, Philipp; Fuchs, Martin

    2016-04-01

    The 15 countries of the Economic Community of West African States (ECOWAS) face a constant shortage of energy supply, which limits sustained economic growth. Currently there are about 50 operational hydropower plants and about 40 more are under construction or refurbishment. The potential for future hydropower development - especially for small-scale plants in rural areas - is assumed to be large, but exact data are missing. This study supports the energy initiatives of the "ECOWAS Centre for Renewable Energy and Energy Efficiency" (ECREEE) by assessing the hydropower potential of all rivers in West Africa. For more than 500,000 river reaches the hydropower potential was computed from channel slope and mean annual discharge. In large areas there is a lack of discharge observations. Therefore, an annual water balance model was used to simulate discharge. The model domain covers 5 million km², including e.g. the Niger, Volta, and Senegal River basins. The model was calibrated with observed data of 410 gauges, using precipitation and potential evapotranspiration data as inputs. Historic variations of observed annual discharge between 1950 and 2010 are simulated well by the model. As hydropower plants are investments with a lifetime of several decades, we also assessed possible changes in future discharge due to climate change. To this end, the water balance model was driven with bias-corrected climate projections of 15 Regional Climate Models for two emission scenarios of the CORDEX-Africa ensemble. The simulation results for the river network were up-scaled to sub-areas and national summaries. This information gives a regional quantification of the hydropower potential, expected climate change impacts, as well as a regional classification for general suitability (or non-suitability) of hydropower plant size - from small-scale to large projects.
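
    The reach-level computation described above (hydropower potential from channel slope and mean annual discharge) is, in its simplest textbook form, the gross potential P = ρ g Q ΔH with the head ΔH taken as slope times reach length; the sketch below implements that formula with made-up numbers and is not necessarily the exact procedure used in the study.

      # Gross theoretical hydropower potential of a river reach: P = rho * g * Q * dH,
      # with the head dH approximated as channel slope times reach length.
      RHO = 1000.0   # water density, kg/m^3
      G = 9.81       # gravitational acceleration, m/s^2

      def reach_potential_mw(discharge_m3s, slope, reach_length_m):
          head_m = slope * reach_length_m                  # elevation drop along the reach
          return RHO * G * discharge_m3s * head_m / 1e6    # watts -> megawatts

      # example: 150 m^3/s mean annual discharge, 0.1% slope, 5 km reach
      print(reach_potential_mw(150.0, 0.001, 5000.0))      # ~7.4 MW gross potential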

  19. Limnological surveys of the Great Lakes--early and recent

    USGS Publications Warehouse

    Smith, Stanford H.

    1957-01-01

    Early explorations on the Great Lakes were concerned largely with things easily collected or observed—common organisms, water levels, surface temperatures … Even when more scientific studies were undertaken, they were at first scattered and small-scale. Effective surveys became possible only through inter-agency cooperation which permits a pooling of facilities, staff, and equipment. Expansion of limnological research on the Great Lakes has been rapid in later years and the outlook for the future is good.

  20. What Works? Common Practices in High Functioning Afterschool Programs across the Nation in Math, Reading, Science, Arts, Technology, and Homework--A Study by the National Partnership. The Afterschool Program Assessment Guide. CRESST Report 768

    ERIC Educational Resources Information Center

    Huang, Denise; Cho, Jamie; Mostafavi, Sima; Nam, Hannah H.; Oh, Christine; Harven, Aletha; Leon, Seth

    2010-01-01

    In an effort to identify and incorporate exemplary practices into existing and future afterschool programs, the U.S. Department of Education commissioned a large-scale evaluation of the 21st Century Community Learning Center (CCLC) program. The purpose of this evaluation project was to develop resources and professional development that addresses…

  1. Transient and diffusion analysis of HgCdTe

    NASA Technical Reports Server (NTRS)

    Clayton, J. C.

    1982-01-01

    Solute redistribution during directional solidification of HgCdTe is addressed. Both one-dimensional and two-dimensional models for solute redistribution are treated and model results compared to experiment. The central problem studied is the cause of radial inhomogeneities found in directionally solidified HgCdTe. A large scale gravity-driven interface instability, termed shape instability, is postulated to be the cause of radial inhomogeneities. Recommendations for future work, along with appropriate computer programs, are included.

  2. Scaling and criticality in a stochastic multi-agent model of a financial market

    NASA Astrophysics Data System (ADS)

    Lux, Thomas; Marchesi, Michele

    1999-02-01

    Financial prices have been found to exhibit some universal characteristics that resemble the scaling laws characterizing physical systems in which large numbers of units interact. This raises the question of whether scaling in finance emerges in a similar way - from the interactions of a large ensemble of market participants. However, such an explanation is in contradiction to the prevalent `efficient market hypothesis' in economics, which assumes that the movements of financial prices are an immediate and unbiased reflection of incoming news about future earning prospects. Within this hypothesis, scaling in price changes would simply reflect similar scaling in the `input' signals that influence them. Here we describe a multi-agent model of financial markets which supports the idea that scaling arises from mutual interactions of participants. Although the `news arrival process' in our model lacks both power-law scaling and any temporal dependence in volatility, we find that it generates such behaviour as a result of interactions between agents.

  3. Quartified leptonic color, bound states, and future electron–positron collider

    DOE PAGES

    Kownacki, Corey; Ma, Ernest; Pollard, Nicholas; ...

    2017-04-04

    The [SU(3)]⁴ quartification model of Babu, Ma, and Willenbrock (BMW), proposed in 2003, predicts a confining leptonic color SU(2) gauge symmetry, which becomes strong at the keV scale. Also, it predicts the existence of three families of half-charged leptons (hemions) below the TeV scale. These hemions are confined to form bound states which are not so easy to discover at the Large Hadron Collider (LHC). But, just as J/ψ and Υ appeared as sharp resonances in e⁻e⁺ colliders of the 20th century, the corresponding ‘hemionium’ states are expected at a future e⁻e⁺ collider of the 21st century.

  4. FROM FINANCE TO COSMOLOGY: THE COPULA OF LARGE-SCALE STRUCTURE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scherrer, Robert J.; Berlind, Andreas A.; Mao, Qingqing

    2010-01-01

    Any multivariate distribution can be uniquely decomposed into marginal (one-point) distributions, and a function called the copula, which contains all of the information on correlations between the distributions. The copula provides an important new methodology for analyzing the density field in large-scale structure. We derive the empirical two-point copula for the evolved dark matter density field. We find that this empirical copula is well approximated by a Gaussian copula. We consider the possibility that the full n-point copula is also Gaussian and describe some of the consequences of this hypothesis. Future directions for investigation are discussed.
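
    A minimal sketch of the copula construction discussed above: each marginal is mapped to the unit interval through its empirical rank (the probability integral transform), and a Gaussian copula is then characterized by the correlation of the normal scores. The correlated lognormal pair below is a synthetic stand-in for density values at two points, not the evolved dark matter field of the paper.

      # Empirical two-point copula sketch: probability integral transform of each
      # marginal, then fit a Gaussian copula via the normal-scores correlation.
      import numpy as np
      from scipy.stats import norm, rankdata

      rng = np.random.default_rng(2)
      # correlated lognormal pair standing in for densities at two points
      z = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=20000)
      delta1, delta2 = np.exp(z[:, 0]), np.exp(z[:, 1])

      # probability integral transform: u, v are the marginal CDF values (the copula inputs)
      u = rankdata(delta1) / (len(delta1) + 1)
      v = rankdata(delta2) / (len(delta2) + 1)

      # a Gaussian copula is fully specified by the correlation of the normal scores
      rho = np.corrcoef(norm.ppf(u), norm.ppf(v))[0, 1]
      print("normal-scores correlation of the fitted Gaussian copula:", round(rho, 3))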

  5. Assessment of climate change impacts on runoff in China using climate elasticity and multiple CMIP5 GCMs

    NASA Astrophysics Data System (ADS)

    Wu, C.; Hu, B. X.; Wang, P.; Xu, K.

    2017-12-01

    The occurrence of climate warming is unequivocal and is expected to alter the temporal-spatial patterns of regional water resources. Based on long-term (1960-2012) water budget data and climate projections from 28 Global Climate Models (GCMs) of the Coupled Model Intercomparison Project Phase 5 (CMIP5), this study investigated the responses of runoff (R) to future climate variability in China at both grid and catchment scales using the Budyko-based elasticity method. Results indicate a large spatial variation in precipitation (P) elasticity (from 1.2 to 3.3) and potential evaporation (PET) elasticity (from -2.3 to -0.2) across China. The P elasticity is larger in northeast and western China than in southern China, while the opposite holds for the PET elasticity. Climate projections suggest that there is large uncertainty among the GCM simulations, but most project a consistent change in P (or PET) over China at the mean annual scale. During the future period of 2071-2100, the mean annual P will likely increase in most parts of China, particularly the western regions, while the mean annual PET will likely increase across the whole of China, especially the southern regions, due to future increases in temperature. Moreover, larger increases are projected for higher emission scenarios. Compared with the baseline 1971-2000, the arid regions and humid regions of China will likely become wetter and drier, respectively, in the period 2071-2100.
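
    The climate-elasticity approach mentioned above relates fractional runoff change to fractional changes in precipitation and potential evaporation, dR/R ≈ εP·dP/P + εPET·dPET/PET. The sketch below evaluates this relation for illustrative elasticities and projected changes taken loosely from the ranges quoted in the abstract; it is not a Budyko-based derivation of the elasticities themselves.

      # Climate-elasticity estimate of runoff change: dR/R = eps_P*dP/P + eps_PET*dPET/PET.
      # Elasticities and projected changes are illustrative, not results for any catchment.
      def runoff_change(eps_p, eps_pet, dp_frac, dpet_frac):
          """Fractional change in runoff from fractional changes in P and PET."""
          return eps_p * dp_frac + eps_pet * dpet_frac

      eps_p, eps_pet = 2.0, -1.0        # assumed P and PET elasticities of runoff
      dp_frac, dpet_frac = 0.05, 0.08   # assumed +5% precipitation, +8% potential evaporation

      print(f"projected runoff change: {100 * runoff_change(eps_p, eps_pet, dp_frac, dpet_frac):.1f}%")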

  6. Fire modeling in the Brazilian arc of deforestation through nested coupling of atmosphere, dynamic vegetation, LUCC and fire spread models

    NASA Astrophysics Data System (ADS)

    Tourigny, E.; Nobre, C.; Cardoso, M. F.

    2012-12-01

    Deforestation of tropical forests for logging and agriculture, associated with slash-and-burn practices, is a major source of CO2 emissions, both immediate due to biomass burning and future due to the elimination of a potential CO2 sink. Feedbacks between climate change and LUCC (Land-Use and Land-Cover Change) can potentially increase the loss of tropical forests and the rate of CO2 emissions, through mechanisms such as land and soil degradation and increases in wildfire occurrence and severity. However, the processes of fires in tropical forests (including ignition, spread and consequences) and their climatic feedbacks are poorly understood and need further research. As the processes of LUCC and associated fires occur at local scales, linking them to large-scale atmospheric processes requires a means of up-scaling higher-resolution processes to lower resolutions. Our approach is to couple models which operate at various spatial and temporal scales: a Global Climate Model (GCM), a Dynamic Global Vegetation Model (DGVM) and local-scale LUCC and fire spread models. The climate model resolves large-scale atmospheric processes and forcings, which are imposed on the surface DGVM and fed back to the climate. Higher-resolution processes such as deforestation, land use management and associated (as well as natural) fires are resolved at the local level. A dynamic tiling scheme makes it possible to represent local-scale heterogeneity while maintaining the computational efficiency of the land surface model, compared to traditional landscape models. Fire behavior is modeled at the regional scale (~500 m) to represent the detailed landscape using a semi-empirical fire spread model. This relatively coarse scale (compared to other fire spread models) is necessary due to the paucity of detailed land-cover information and fire history (particularly in the tropics and developing countries). This work presents initial results of a spatially explicit fire spread model coupled to the IBIS DGVM. Our area of study comprises selected regions in and near the Brazilian "arc of deforestation". For model training and evaluation, several areas have been mapped using high-resolution imagery from the Landsat TM/ETM+ sensors (Figure 1). These high-resolution reference data are used for local-scale simulations and also to evaluate the accuracy of the global MCD45 burned area product, which will be used in future studies covering the entire "arc of deforestation". (Figure 1 caption: area of study along the arc of deforestation and cerrado, showing the Landsat scenes used and burned area (2010) from the MCD45 product.)

  7. Areas of Unsolved Problems in Caribbean Active Tectonics

    NASA Astrophysics Data System (ADS)

    Mann, P.

    2015-12-01

    I review some unsolved problems in Caribbean active tectonics. At the regional and plate scale: 1) confirm the existence of intraplate deformation zones of the central Caribbean plate that are within the margin of error of ongoing GPS measurements; 2) carry out field studies to evaluate block models versus models for distributed fault shear on the densely populated islands of Jamaica, Hispaniola, Puerto Rico, and the Virgin Islands; 3) carry out paleoseismological research of key plate boundary faults that may have accumulated large strains but have not been previously studied in detail; 4) determine the age of onset and far-field effects of the Cocos ridge and the Central America forearc sliver; 5) investigate the origin and earthquake potential of obliquely-sheared rift basins along the northern coast of Venezuela; 6) determine the age of onset and regional active tectonic effects of the Panama-South America collision, including the continued activation of the Maracaibo block; and 7) validate long-term rates on active subduction zones with improving tomographic maps of subducted slabs. At the individual fault scale: 1) determine the mode of termination of large and active strike-slip faults and the application of the STEP model (Septentrional, Polochic, El Pilar, Bocono, Santa Marta-Bucaramanga); 2) improve the understanding of the earthquake potential of the Enriquillo-Plantain Garden fault zone given "off-fault" events such as the 2010 Haiti earthquake (how widespread is this behavior?); 3) estimate the size of future tsunamis from studies of historic or prehistoric slump scars and mass transport deposits (what potential runups can be predicted from this information?); and 4) devise ways to keep rapidly growing circum-Caribbean urban populations better informed and safer in the face of inevitable future large earthquakes.

  8. Higher-level simulations of turbulent flows

    NASA Technical Reports Server (NTRS)

    Ferziger, J. H.

    1981-01-01

    The fundamentals of large eddy simulation are considered and the approaches to it are compared. Subgrid scale models and the development of models for the Reynolds-averaged equations are discussed as well as the use of full simulation in testing these models. Numerical methods used in simulating large eddies, the simulation of homogeneous flows, and results from full and large scale eddy simulations of such flows are examined. Free shear flows are considered with emphasis on the mixing layer and wake simulation. Wall-bounded flow (channel flow) and recent work on the boundary layer are also discussed. Applications of large eddy simulation and full simulation in meteorological and environmental contexts are included along with a look at the direction in which work is proceeding and what can be expected from higher-level simulation in the future.

  9. Basin-Scale Hydrologic Impacts of CO2 Storage: Regulatory and Capacity Implications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birkholzer, J.T.; Zhou, Q.

    Industrial-scale injection of CO2 into saline sedimentary basins will cause large-scale fluid pressurization and migration of native brines, which may affect valuable groundwater resources overlying the deep sequestration reservoirs. In this paper, we discuss how such basin-scale hydrologic impacts can (1) affect regulation of CO2 storage projects and (2) reduce current storage capacity estimates. Our assessment arises from a hypothetical future carbon sequestration scenario in the Illinois Basin, which involves twenty individual CO2 storage projects in a core injection area suitable for long-term storage. Each project is assumed to inject five million tonnes of CO2 per year for 50 years. A regional-scale three-dimensional simulation model was developed for the Illinois Basin that captures both the local-scale CO2-brine flow processes and the large-scale groundwater flow patterns in response to CO2 storage. The far-field pressure buildup predicted for this selected sequestration scenario suggests that (1) the area that needs to be characterized in a permitting process may comprise a very large region within the basin if reservoir pressurization is considered, and (2) permits cannot be granted on a single-site basis alone because the near- and far-field hydrologic response may be affected by interference between individual sites. Our results also support recent studies in that environmental concerns related to near-field and far-field pressure buildup may be a limiting factor on CO2 storage capacity. In other words, estimates of storage capacity, if solely based on the effective pore volume available for safe trapping of CO2, may have to be revised based on assessments of pressure perturbations and their potential impact on caprock integrity and groundwater resources, respectively. We finally discuss some of the challenges in making reliable predictions of large-scale hydrologic impacts related to CO2 sequestration projects.

  10. Recent progress and future direction of cancer epidemiological research in Japan.

    PubMed

    Sobue, Tomotaka

    2015-06-01

    In 2006, the Cancer Control Act was approved, and a Basic Plan to Promote the Cancer Control Program at the national level was developed in 2007. Cancer research is recognized as a fundamental component providing evidence for the cancer control program. Cancer epidemiology plays a central role in connecting research and policy, since it directly deals with data from humans. Research in cancer epidemiology in Japan has made substantial progress in the fields of descriptive studies, cohort studies, intervention studies and activities for summarizing evidence. In the future, promoting high-quality large-scale intervention studies, individual-level linkage studies, simulation models and studies of the elderly population will be of great importance, but at the same time research should be promoted in a well-balanced fashion, not placing too much emphasis on one particular research field. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. Characterizing Temperature Variability and Associated Large Scale Meteorological Patterns Across South America

    NASA Astrophysics Data System (ADS)

    Detzer, J.; Loikith, P. C.; Mechoso, C. R.; Barkhordarian, A.; Lee, H.

    2017-12-01

    South America's climate varies considerably owing to its large geographic range and diverse topographical features. Spanning the tropics to the mid-latitudes and from high peaks to tropical rainforest, the continent experiences an array of climate and weather patterns. Due to this considerable spatial extent, assessing temperature variability at the continent scale is particularly challenging. It is well documented in the literature that temperatures have been increasing across portions of South America in recent decades, and while there have been many studies that have focused on precipitation variability and change, temperature has received less scientific attention. Therefore, a more thorough understanding of the drivers of temperature variability is critical for interpreting future change. First, k-means cluster analysis is used to identify four primary modes of temperature variability across the continent, stratified by season. Next, composites of large scale meteorological patterns (LSMPs) are calculated for months assigned to each cluster. Initial results suggest that LSMPs, defined using meteorological variables such as sea level pressure (SLP), geopotential height, and wind, are able to identify synoptic scale mechanisms important for driving temperature variability at the monthly scale. Some LSMPs indicate a relationship with known recurrent modes of climate variability. For example, composites of geopotential height suggest that the Southern Annular Mode is an important, but not necessarily dominant, component of temperature variability over southern South America. This work will be extended to assess the drivers of temperature extremes across South America.
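
    The clustering step can be illustrated with a short sketch (hypothetical array shapes, not the study's data): k-means assigns each month's temperature-anomaly map to one of four continental modes, after which the large-scale meteorological patterns are composited over the months in each cluster.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        n_months, n_gridcells = 480, 2500        # e.g. 40 years x 12 months, flattened grid
        anomalies = rng.standard_normal((n_months, n_gridcells))  # stand-in anomaly maps

        km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(anomalies)
        labels = km.labels_                      # variability mode assigned to each month

        # Composite a hypothetical SLP field over the months falling in each cluster.
        slp = rng.standard_normal((n_months, n_gridcells))
        composites = np.stack([slp[labels == k].mean(axis=0) for k in range(4)])
        print(composites.shape)                  # (4, n_gridcells): one LSMP per mode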

  12. Statistical downscaling of sub-daily (6-hour) temperature in Romania, by means of artificial neural networks

    NASA Astrophysics Data System (ADS)

    Birsan, Marius-Victor; Dumitrescu, Alexandru; Cǎrbunaru, Felicia

    2016-04-01

    The role of statistical downscaling is to model the relationship between large-scale atmospheric circulation and climatic variables on a regional and sub-regional scale, making use of the projections of future circulation generated by General Circulation Models (GCMs) in order to capture the effects of climate change on smaller areas. The study presents a statistical downscaling model based on a neural-network approach, using multi-layer perceptron networks. Sub-daily temperature data series with full records from 81 meteorological stations across Romania are used as predictands. As the large-scale predictor, NCEP/NCAR air temperature data at 850 hPa over the domain 20-30E / 40-50N were used, at a spatial resolution of 2.5×2.5 degrees. The period 1961-1990 was used for calibration, while validation was performed over the 1991-2010 interval. Further, in order to estimate future changes in air temperature for 2021-2050 and 2071-2100, air temperature data at 850 hPa corresponding to the IPCC A1B scenario were extracted from the CNCM33 model (Meteo-France) and used as the predictor. This work was carried out within the research project "Changes in climate extremes and associated impact in hydrological events in Romania" (CLIMHYDEX), code PN II-ID-2011-2-0073, financed by the Romanian Executive Agency for Higher Education Research, Development and Innovation Funding (UEFISCDI).
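
    A compact sketch of the downscaling step (hypothetical array shapes and a single station; scikit-learn's multi-layer perceptron stands in for whatever implementation the study used): the coarse 850 hPa temperature field is the predictor, the 6-hourly station temperature is the predictand, and the calibration/validation split mimics the 1961-1990 / 1991-2010 division.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)
        n_times, n_gridpoints = 20000, 5 * 5     # stand-in for a 2.5-degree grid over 20-30E / 40-50N
        X = rng.standard_normal((n_times, n_gridpoints))          # large-scale 850 hPa temperature
        y = X.mean(axis=1) * 8.0 + rng.normal(0.0, 1.5, n_times)  # stand-in station temperature

        split = 15000                            # calibration period vs validation period
        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(20,), max_iter=500,
                                           random_state=0))
        model.fit(X[:split], y[:split])
        print("validation R^2:", model.score(X[split:], y[split:]))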

  13. The latest developments and outlook for hydrogen liquefaction technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohlig, K.; Decker, L.

    2014-01-29

    Liquefied hydrogen is presently used mainly for space applications and the semiconductor industry. While clean energy applications, e.g. in the automotive sector, currently contribute only a small share of this demand, their demand may see a significant boost in the coming years, with a need for large-scale liquefaction plants far exceeding current plant sizes. Hydrogen liquefaction in small-scale plants with a maximum capacity of 3 tons per day (tpd) is accomplished with a Brayton refrigeration cycle using helium as the refrigerant. This technology is characterized by low investment costs but lower process efficiency and hence higher operating costs. For larger plants, a hydrogen Claude cycle is used, characterized by higher investment but lower operating costs. However, liquefaction plants meeting the potentially high demand in the clean energy sector will need further optimization with regard to energy efficiency and hence operating costs. The present paper gives an overview of the currently applied technologies, including their thermodynamic and technical background. Areas of improvement are identified to derive process concepts for future large-scale hydrogen liquefaction plants meeting the needs of clean energy applications with optimized energy efficiency and hence minimized operating costs. Compared to other studies in this field, this paper focuses on the application of new technology and innovative concepts which are either readily available or will require only short qualification procedures, and which will hence allow implementation in plants in the near future.

  14. The Joint Statistics of California Temperature and Precipitation as a Function of the Large-scale State of the Climate

    NASA Astrophysics Data System (ADS)

    OBrien, J. P.; O'Brien, T. A.

    2015-12-01

    Single climatic extremes have a strong and disproportionate effect on society and the natural environment. However, the joint occurrence of two or more concurrent extremes has the potential to negatively impact these areas of life in ways far greater than any single event could. California, USA, home to nearly 40 million people and the largest agricultural producer in the United States, is currently experiencing an extreme drought, which has persisted for several years. While drought is commonly thought of only in terms of precipitation deficits, above-average temperatures co-occurring with precipitation deficits greatly exacerbate drought conditions. The 2014 calendar year in California was characterized both by extremely low precipitation and extremely high temperatures, which significantly deepened the already extreme drought conditions, leading to severe water shortages and wildfires. While many studies have shown the 2014 temperature and precipitation anomalies to be statistical outliers, none have demonstrated a connection with large-scale, long-term climate trends, which would provide useful relationships for predicting the future trajectory of California climate and water resources. We focus on understanding non-stationarity in the joint distribution of California temperature and precipitation anomalies in terms of large-scale, low-frequency trends in climate such as global mean temperature rise and oscillatory indices such as ENSO and the Pacific Decadal Oscillation, among others. We consider temperature and precipitation data from the seven distinct climate divisions in California and employ a novel, high-fidelity kernel density estimation method to directly infer the multivariate distribution of temperature and precipitation anomalies conditioned on the large-scale state of the climate. We show that the joint distributions and associated statistics of temperature and precipitation are non-stationary and vary regionally in California. Further, we show that recurrence intervals of extreme concurrent events vary as a function of time and of teleconnections. This research has implications for predicting and forecasting future temperature and precipitation anomalies, which is critically important for city, water, and agricultural planning in California.
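
    The conditioning idea can be sketched as follows (ordinary Gaussian KDE from scipy stands in for the paper's high-fidelity estimator, and the data are made up): estimate the joint density of temperature and precipitation anomalies separately for months in two large-scale states and compare the likelihood of a concurrent hot-and-dry extreme under each.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(11)
        n = 600
        index = rng.standard_normal(n)                   # stand-in large-scale climate index
        temp = 0.5 * index + rng.standard_normal(n)      # stand-in temperature anomalies
        precip = -0.4 * index + rng.standard_normal(n)   # stand-in precipitation anomalies

        high, low = index > 0.5, index < -0.5
        kde_high = gaussian_kde(np.vstack([temp[high], precip[high]]))
        kde_low = gaussian_kde(np.vstack([temp[low], precip[low]]))

        # Density ratio at a hot-and-dry point (+2 sigma T, -2 sigma P): a ratio well
        # above 1 means such concurrent extremes are more likely in the "high" state.
        point = np.array([[2.0], [-2.0]])
        print(kde_high(point)[0] / kde_low(point)[0])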

  15. Large prospective birth cohort studies on environmental contaminants and child health - goals, challenges, limitations and needs.

    PubMed

    Luo, Zhong-Cheng; Liu, Jian-Meng; Fraser, William D

    2010-02-01

    The adverse health effects of environmental contaminants (ECs) are a rising public health concern and a major threat to sustainable socioeconomic development. Developing fetuses and growing children are particularly vulnerable to the adverse effects of ECs. However, assessing the health impact of ECs presents a major challenge, given that multiple outcomes may arise from one exposure and multiple exposures may result in one outcome, and given the complex interactions between ECs, nutrients and genetic factors and the dynamic temporal changes in EC exposures over the life course. Large-scale prospective birth cohort studies collecting extensive data and specimens starting from the prenatal or pre-conception period, although costly, hold promise as a means to more clearly quantify the health effects of ECs and to unravel the complex interactions between ECs, nutrients and genotypes. A number of such large-scale studies have been launched in some developed countries. We present an overview of the "why", "what" and "how" behind these efforts, with the objective of uncovering major unidentified limitations and needs. Three major limitations were identified: (1) limited data and bio-specimens regarding early-life EC exposure assessments in some birth cohort studies; (2) heavy participant burdens in some birth cohort studies, which may bias participant recruitment and risk substantial loss to follow-up and protocol deviations limiting the quality of data and specimen collection, with an overall potential bias towards the null effect; (3) lack of concerted efforts in building comparable birth cohorts across countries to take advantage of natural "experiments" (large differences in EC exposure levels between countries) for more in-depth assessments of dose-response relationships, threshold exposure levels, and positive and negative effect modifiers. Addressing these concerns in current or future large-scale birth cohort studies may help to produce better evidence on the health effects of ECs.

  16. The effects of clinical aromatherapy for anxiety and depression in the high risk postpartum woman - a pilot study.

    PubMed

    Conrad, Pam; Adams, Cindy

    2012-08-01

    The aim of this study was to determine whether aromatherapy improves anxiety and/or depression in high-risk postpartum women and to provide a complementary therapy tool for healthcare practitioners. The pilot study was observational with repeated measures, conducted in a private consultation room in the women's center of a large Indianapolis hospital, with 28 women 0-18 months postpartum. The treatment participants were randomized to either an inhalation group or an aromatherapy hand 'M' technique group. Treatment consisted of 15-minute sessions, twice a week for four consecutive weeks. An essential oil blend of rose otto and Lavandula angustifolia at 2% dilution was used in all treatments. The non-randomized control group, comprised of volunteers, was instructed to avoid aromatherapy use during the 4-week study period. Allopathic medical treatment continued for all participants. All subjects completed the Edinburgh Postnatal Depression Scale (EPDS) and Generalized Anxiety Disorder Scale (GAD-7) at the beginning of the study. The scales were then repeated at the midway point (two weeks) and at the end of all treatments (four weeks). Analysis of variance (ANOVA) was used to determine differences in EPDS and/or GAD-7 scores between the aromatherapy and control groups at baseline, midpoint and end of study. No significant differences were found between the aromatherapy and control groups at baseline. The midpoint and final scores indicated that the aromatherapy group had significantly greater improvements than the control group on both the EPDS and GAD-7. There were no adverse effects reported. The pilot study indicates positive findings with minimal risk for the use of aromatherapy as a complementary therapy for both anxiety and depression in postpartum women. Future large-scale research in aromatherapy with this population is recommended. Copyright © 2012 Elsevier Ltd. All rights reserved.
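
    For readers who want to reproduce the style of analysis, the snippet below runs a one-way ANOVA of the kind described, comparing hypothetical end-of-study EPDS scores between an aromatherapy and a control group; the numbers are made up and are not the study's data.

        from scipy.stats import f_oneway

        epds_aromatherapy = [9, 7, 11, 6, 8, 10, 7, 9, 5, 8]      # hypothetical final scores
        epds_control = [13, 12, 15, 11, 14, 12, 16, 13, 12, 14]   # hypothetical final scores
        stat, p = f_oneway(epds_aromatherapy, epds_control)
        print(f"F = {stat:.2f}, p = {p:.4f}")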

  17. Artificial intelligence in diagnosis of obstructive lung disease: current status and future potential.

    PubMed

    Das, Nilakash; Topalovic, Marko; Janssens, Wim

    2018-03-01

    The application of artificial intelligence to the diagnosis of obstructive lung diseases is an exciting development. Artificial intelligence algorithms work by finding patterns in data obtained from diagnostic tests, which can be used to predict clinical outcomes or to detect obstructive phenotypes. The purpose of this review is to describe the latest trends and to discuss the future potential of artificial intelligence in the diagnosis of obstructive lung diseases. Machine learning has been successfully used in the automated interpretation of pulmonary function tests for the differential diagnosis of obstructive lung diseases. Deep learning models such as convolutional neural networks are state-of-the-art for obstructive pattern recognition in computed tomography. Machine learning has also been applied in other diagnostic approaches such as the forced oscillation test, breath analysis, lung sound analysis and telemedicine, with promising results in small-scale studies. Overall, the application of artificial intelligence has produced encouraging results in the diagnosis of obstructive lung diseases. However, large-scale studies are still required to validate current findings and to boost adoption by the medical community.

  18. bigSCale: an analytical framework for big-scale single-cell data.

    PubMed

    Iacono, Giovanni; Mereu, Elisabetta; Guillaumet-Adkins, Amy; Corominas, Roser; Cuscó, Ivon; Rodríguez-Esteban, Gustavo; Gut, Marta; Pérez-Jurado, Luis Alberto; Gut, Ivo; Heyn, Holger

    2018-06-01

    Single-cell RNA sequencing (scRNA-seq) has significantly deepened our insights into complex tissues, with the latest techniques capable of processing tens of thousands of cells simultaneously. Analyzing increasing numbers of cells, however, generates extremely large data sets, extending processing time and challenging computing resources. Current scRNA-seq analysis tools are not designed to interrogate large data sets and often lack sensitivity to identify marker genes. With bigSCale, we provide a scalable analytical framework to analyze millions of cells, which addresses the challenges associated with large data sets. To handle the noise and sparsity of scRNA-seq data, bigSCale uses large sample sizes to estimate an accurate numerical model of noise. The framework further includes modules for differential expression analysis, cell clustering, and marker identification. A directed convolution strategy allows processing of extremely large data sets, while preserving transcript information from individual cells. We evaluated the performance of bigSCale using both a biological model of aberrant gene expression in patient-derived neuronal progenitor cells and simulated data sets, which underlines the speed and accuracy in differential expression analysis. To test its applicability for large data sets, we applied bigSCale to assess 1.3 million cells from the mouse developing forebrain. Its directed down-sampling strategy accumulates information from single cells into index cell transcriptomes, thereby defining cellular clusters with improved resolution. Accordingly, index cell clusters identified rare populations, such as reelin (Reln)-positive Cajal-Retzius neurons, for which we report previously unrecognized heterogeneity associated with distinct differentiation stages, spatial organization, and cellular function. Together, bigSCale presents a solution to address future challenges of large single-cell data sets. © 2018 Iacono et al.; Published by Cold Spring Harbor Laboratory Press.

  19. Enhanced recharge rates and altered recharge sensitivity to climate variability through subsurface heterogeneity

    NASA Astrophysics Data System (ADS)

    Hartmann, Andreas; Gleeson, Tom; Wada, Yoshihide; Wagener, Thorsten

    2017-04-01

    Karst aquifers in Europe are an important source of fresh water, contributing up to half of the total drinking water supply in some countries. Karstic groundwater recharge is one of the most important components of the water balance of karst systems as it feeds the karst aquifers. Presently available large-scale hydrological models do not consider karst heterogeneity adequately. Projections of current and potential future groundwater recharge of Europe's karst aquifers are therefore unclear. In this study we compare simulations of present (1991-2010) and future (2080-2099) recharge using two different models of groundwater recharge processes. One model includes karst processes (subsurface heterogeneity, lateral flow and concentrated recharge), while the other is based on the conceptual understanding of common hydrological systems (homogeneous subsurface, saturation-excess overland flow). Both models are driven by the five bias-corrected GCMs of the ISI-MIP project (RCP8.5). To further assess the sensitivity of groundwater recharge to climate variability, we calculate the elasticity of recharge rates to annual precipitation, temperature and average intensity of rainfall events, i.e. the median change of recharge that corresponds to the median change of these climate variables within the present and future time periods, respectively. Our model comparison shows that karst regions over Europe have enhanced recharge rates with greater inter-annual variability compared to regions with more homogeneous subsurface properties. Furthermore, the heterogeneous representation shows stronger elasticity with respect to climate variability than the homogeneous subsurface representation. This difference tends to increase towards the future. Our results suggest that water management in regions with a heterogeneous subsurface can expect higher water availability than estimated by most current large-scale simulations, while measures should be taken to prepare for increasingly variable groundwater recharge rates.

  20. Predicting future spatial distribution of SOC across entire France

    NASA Astrophysics Data System (ADS)

    Meersmans, Jeroen; Van Rompaey, Anton; Quine, Tim; Martin, Manuel; Pagé, Christian; Arrouays, Dominique

    2013-04-01

    Soil organic carbon (SOC) is widely recognized as a key factor controlling soil quality and as a crucial and active component of the global C-cycle. Hence, there is growing interest in monitoring and modeling the spatial and temporal behavior of this pool. So far, considerable effort has been made to map SOC at national scales for current and/or past situations. Apart from some coarse predictions, however, detailed spatial SOC predictions for the future are still lacking. In this study we aim to predict the future spatial evolution of SOC driven by climate and land use change for France up to the year 2100. To this end, we combined 1) an existing model predicting SOC as a function of soil type, climate, land use and management (Meersmans et al 2012) with 2) eight different IPCC spatially explicit climate change predictions (conducted by CERFACS) and 3) land use change scenario predictions. We created business-as-usual land use change scenarios by extrapolating observed trends and calibrating logistic regression models, incorporating a large set of physical and socio-economic factors, at the regional level, in combination with a multi-objective land allocation (MOLA) procedure. The resulting detailed projections of future SOC evolution across all regions of France allow us to identify regions that are most likely to be characterized by a significant gain or loss of SOC, and the degree to which land use decisions control the scale of loss and gain. This methodology and the resulting maps can therefore be considered powerful tools to aid decision making concerning appropriate soil management, in order to enlarge SOC storage and reduce soil-related CO2 fluxes.

  1. Near-Earth Object Interception Using Nuclear Thermal Rocket Propulsion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    X-L. Zhang; E. Ball; L. Kochmanski

    Planetary defense has drawn wide study: despite the low probability of a large-scale impact, its consequences would be disastrous. The study presented here evaluates available protection strategies to identify bottlenecks limiting the scale of near-Earth object that could be deflected, using cutting-edge and near-future technologies. It discusses the use of a nuclear thermal rocket (NTR) as a propulsion device for delivery of thermonuclear payloads to deflect or destroy a long-period comet on a collision course with Earth. A ‘worst plausible scenario’ for the available warning time (10 months) and comet approach trajectory are determined, and empirical data are used to make an estimate of the payload necessary to deflect such a comet. Optimizing the tradeoff between early interception and large deflection payload establishes the ideal trajectory for an interception mission to follow. The study also examines the potential for multiple rocket launch dates. Comparison of propulsion technologies for this mission shows that NTR outperforms other options substantially. The discussion concludes with an estimate of the comet size (5 km) that could be deflected using NTR propulsion, given current launch capabilities.

  2. Characterizing the Response of Composite Panels to a Pyroshock Induced Environment Using Design of Experiments Methodology

    NASA Technical Reports Server (NTRS)

    Parsons, David S.; Ordway, David; Johnson, Kenneth

    2013-01-01

    This experimental study seeks to quantify the impact various composite parameters have on the structural response of a composite structure in a pyroshock environment. The prediction of an aerospace structure's response to pyroshock induced loading is largely dependent on empirical databases created from collections of development and flight test data. While there is significant structural response data due to pyroshock induced loading for metallic structures, there is much less data available for composite structures. One challenge of developing a composite pyroshock response database as well as empirical prediction methods for composite structures is the large number of parameters associated with composite materials. This experimental study uses data from a test series planned using design of experiments (DOE) methods. Statistical analysis methods are then used to identify which composite material parameters most greatly influence a flat composite panel's structural response to pyroshock induced loading. The parameters considered are panel thickness, type of ply, ply orientation, and pyroshock level induced into the panel. The results of this test will aid in future large scale testing by eliminating insignificant parameters as well as aid in the development of empirical scaling methods for composite structures' response to pyroshock induced loading.

  4. 78 FR 4865 - Endangered and Threatened Wildlife and Plants; Recovery Plan for the Columbia Basin Distinct...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-23

    ... activities and future updates to the recovery plan. Large-scale loss and fragmentation of native shrub steppe... threats to the population and potentially suitable shrub-steppe habitats in the Columbia Basin; (2...

  5. Mars' Magnetic Atmosphere: Ionospheric Currents, Lightning (or Not), E and M Subsurface Sounding, and Future Missions

    NASA Technical Reports Server (NTRS)

    Espley, J. R.; Connerney, J. E. P.

    2014-01-01

    Mars' ionosphere has no obvious magnetic signs of large-scale, dust-produced lightning. However, there are numerous interesting ionospheric currents (some associated with crustal magnetic fields) which would allow for E&M subsurface sounding.

  6. The International Symposium on Applied Military Psychology (20th) Held on 25-29 June 1984 in Brussels, Belgium.

    DTIC Science & Technology

    1984-12-07

    and organization of psychological services, adjustment to military life and stress, organizational diagnosis and intervention, evaluation of new programs, and new emphases in large-scale research programs for the future.

  7. Agricultural Geophysics: Past, present, and future

    USDA-ARS?s Scientific Manuscript database

    Geophysical methods are becoming an increasingly valuable tool for agricultural applications. Agricultural geophysics investigations are commonly (although certainly not always) focused on delineating small- and/or large-scale objects/features within the soil profile (~ 0 to 2 m depth) over very lar...

  8. Probing Higgs self-coupling of a classically scale invariant model in e+e- → Zhh: Evaluation at physical point

    NASA Astrophysics Data System (ADS)

    Fujitani, Y.; Sumino, Y.

    2018-04-01

    A classically scale invariant extension of the standard model predicts large anomalous Higgs self-interactions. We compute missing contributions in previous studies for probing the Higgs triple coupling of a minimal model using the process e+e- → Zhh. Employing a proper order counting, we compute the total and differential cross sections at the leading order, which incorporate the one-loop corrections between zero external momenta and their physical values. Discovery/exclusion potential of a future e+e- collider for this model is estimated. We also find a unique feature in the momentum dependence of the Higgs triple vertex for this class of models.

  9. Experimental Evaluation of Suitability of Selected Multi-Criteria Decision-Making Methods for Large-Scale Agent-Based Simulations.

    PubMed

    Tučník, Petr; Bureš, Vladimír

    2016-01-01

    Multi-criteria decision-making (MCDM) can be formally implemented by various methods. This study compares the suitability of four selected MCDM methods, namely WPM, TOPSIS, VIKOR, and PROMETHEE, for future applications in agent-based computational economic (ACE) models of larger scale (i.e., over 10 000 agents in one geographical region). These four MCDM methods were selected according to their appropriateness for computational processing in ACE applications. Tests of the selected methods were conducted on four hardware configurations. For each method, 100 tests were performed, which represented one testing iteration. With four testing iterations conducted on each hardware configuration, and with all configurations tested separately with the server parameter deactivated and activated, altogether 12,800 data points were collected and subsequently analyzed. An illustrative decision-making scenario that allows mutual comparison of all the selected decision-making methods was used. Our test results suggest that although all methods are convenient and can be used in practice, the VIKOR method completed the tests with the best results and can thus be recommended as the most suitable for simulations of large-scale agent-based models.
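
    To make the comparison concrete, here is a minimal implementation of one of the four methods (TOPSIS) on a hypothetical decision matrix; the scenario and weights are invented for illustration and are not those used in the study.

        import numpy as np

        def topsis(matrix, weights, benefit):
            """matrix: alternatives x criteria; benefit[j] is True if larger is better."""
            m = np.asarray(matrix, float)
            v = m / np.linalg.norm(m, axis=0) * np.asarray(weights, float)  # weighted, normalised
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
            d_pos = np.linalg.norm(v - ideal, axis=1)
            d_neg = np.linalg.norm(v - anti, axis=1)
            return d_neg / (d_pos + d_neg)          # higher score = better alternative

        # Three hypothetical alternatives scored on cost (minimise) and two benefits.
        scores = topsis([[250, 6, 7], [200, 7, 5], [300, 9, 8]],
                        weights=[0.4, 0.3, 0.3],
                        benefit=[False, True, True])
        print(scores, "best alternative:", scores.argmax())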

  10. Future aircraft networks and schedules

    NASA Astrophysics Data System (ADS)

    Shu, Yan

    2011-07-01

    Because of the importance of air transportation scheduling, the emergence of small aircraft and the vision of future fuel-efficient aircraft, this thesis has focused on the study of aircraft scheduling and network design involving multiple types of aircraft and flight services. It develops models and solution algorithms for the schedule design problem and analyzes the computational results. First, based on the current development of small aircraft and on-demand flight services, this thesis expands a business model for integrating on-demand flight services with the traditional scheduled flight services. This thesis proposes a three-step approach to the design of aircraft schedules and networks from scratch under the model. In the first step, both a frequency assignment model for scheduled flights that incorporates a passenger path choice model and a frequency assignment model for on-demand flights that incorporates a passenger mode choice model are created. In the second step, a rough fleet assignment model that determines a set of flight legs, each of which is assigned an aircraft type and a rough departure time, is constructed. In the third step, a timetable model that determines an exact departure time for each flight leg is developed. Based on the models proposed in the three steps, this thesis creates schedule design instances that involve almost all the major airports and markets in the United States. The instances of the frequency assignment model created in this thesis are large-scale non-convex mixed-integer programming problems, and this dissertation develops an overall network structure and proposes iterative algorithms for solving these instances. The instances of both the rough fleet assignment model and the timetable model created in this thesis are large-scale mixed-integer programming problems, and this dissertation develops subproblem schemes for solving these instances. Based on these solution algorithms, this dissertation also presents computational results for these large-scale instances. To validate the models and solution algorithms developed, this thesis also compares the daily flight schedules that it designs with the schedules of the existing airlines. Furthermore, it creates instances that represent different economic and fuel-price conditions and derives schedules under these different conditions. In addition, it discusses the implications of using new aircraft in future flight schedules. Finally, future research in three areas (model, computational method, and simulation for validation) is proposed.

  11. Developing Present-day Proxy Cases Based on NARVAL Data for Investigating Low Level Cloud Responses to Future Climate Change.

    NASA Astrophysics Data System (ADS)

    Reilly, Stephanie

    2017-04-01

    The energy budget of the entire global climate is significantly influenced by the presence of boundary layer clouds. The main aim of the High Definition Clouds and Precipitation for Advancing Climate Prediction (HD(CP)2) project is to improve climate model predictions by means of process studies of clouds and precipitation. This study makes use of observed elevated moisture layers as a proxy of future changes in tropospheric humidity. The associated impact on radiative transfer triggers fast responses in boundary layer clouds, providing a framework for investigating this phenomenon. The investigation will be carried out using data gathered during the Next-generation Aircraft Remote-sensing for VALidation (NARVAL) South campaigns. Observational data will be combined with ECMWF reanalysis data to derive the large scale forcings for the Large Eddy Simulations (LES). Simulations will be generated for a range of elevated moisture layers, spanning a multi-dimensional phase space in depth, amplitude, elevation, and cloudiness. The NARVAL locations will function as anchor-points. The results of the large eddy simulations and the observations will be studied and compared in an attempt to determine how simulated boundary layer clouds react to changes in radiative transfer from the free troposphere. Preliminary LES results will be presented and discussed.

  12. Developing closed life support systems for large space habitats

    NASA Technical Reports Server (NTRS)

    Phillips, J. M.; Harlan, A. D.; Krumhar, K. C.

    1978-01-01

    In anticipation of possible large-scale, long-duration space missions which may be conducted in the future, NASA has begun to investigate the research and technology development requirements to create life support systems for large space habitats. An analysis suggests the feasibility of a regeneration of food in missions which exceed four years duration. Regeneration of food in space may be justified for missions of shorter duration when large crews must be supported at remote sites such as lunar bases and space manufacturing facilities. It is thought that biological components consisting principally of traditional crop and livestock species will prove to be the most acceptable means of closing the food cycle. A description is presented of the preliminary results of a study of potential biological components for large space habitats. Attention is given to controlled ecosystems, Russian life support system research, controlled-environment agriculture, and the social aspects of the life-support system.

  13. The magnetic shear-current effect: Generation of large-scale magnetic fields by the small-scale dynamo

    DOE PAGES

    Squire, J.; Bhattacharjee, A.

    2016-03-14

    A novel large-scale dynamo mechanism, the magnetic shear-current effect, is discussed and explored. Here, the effect relies on the interaction of magnetic fluctuations with a mean shear flow, meaning the saturated state of the small-scale dynamo can drive a large-scale dynamo – in some sense the inverse of dynamo quenching. The dynamo is non-helical, with the mean-field α coefficient zero, and is caused by the interaction between an off-diagonal component of the turbulent resistivity and the stretching of the large-scale field by shear flow. Following up on previous numerical and analytic work, this paper presents further details of the numerical evidence for the effect, as well as an heuristic description of how magnetic fluctuations can interact with shear flow to produce the required electromotive force. The pressure response of the fluid is fundamental to this mechanism, which helps explain why the magnetic effect is stronger than its kinematic cousin, and the basic idea is related to the well-known lack of turbulent resistivity quenching by magnetic fluctuations. As well as being interesting for its applications to general high Reynolds number astrophysical turbulence, where strong small-scale magnetic fluctuations are expected to be prevalent, the magnetic shear-current effect is a likely candidate for large-scale dynamo in the unstratified regions of ionized accretion disks. Evidence for this is discussed, as well as future research directions and the challenges involved with understanding details of the effect in astrophysically relevant regimes.

  14. Climate change adaptation and Integrated Water Resource Management in the water sector

    NASA Astrophysics Data System (ADS)

    Ludwig, Fulco; van Slobbe, Erik; Cofino, Wim

    2014-10-01

    Integrated Water Resources Management (IWRM) was introduced in the 1980s to better optimise water use between different water-demanding sectors. However, since its introduction water systems have become more complicated due to changes in the global water cycle as a result of climate change. The realization that climate change will have a significant impact on water availability and flood risks has driven research and policy making on adaptation. This paper discusses the main similarities and differences between climate change adaptation and IWRM. The main difference between the two is the focus of IWRM on current and historic issues, compared to the (long-term) future focus of adaptation. One of the main problems in implementing climate change adaptation is the large uncertainty in future projections. Two completely different approaches to adaptation have been developed in response to these large uncertainties. The first is a top-down approach based on large-scale biophysical impact analyses, focusing on quantifying and minimizing uncertainty by using a large range of scenarios and different climate and impact models. The main problem with this approach is the propagation of uncertainties within the modelling chain. The opposite is the bottom-up approach, which essentially ignores uncertainty and focuses on reducing vulnerabilities, often at the local scale, by developing resilient water systems. Both approaches, however, are unsuitable for integration into water management. The bottom-up approach focuses too much on socio-economic vulnerability and too little on developing (technical) solutions. The top-down approach often results in an "explosion" of uncertainty and therefore complicates decision making. A more promising direction for adaptation would be a risk-based approach. Future research should further develop and test an approach which starts with developing adaptation strategies based on current and future risks. These strategies should then be evaluated using a range of future scenarios in order to develop robust adaptation measures and strategies.

  15. Ecosystem water imbalances created during ecological restoration by afforestation in China, and lessons for other developing countries.

    PubMed

    Cao, Shixiong; Zhang, Junze; Chen, Li; Zhao, Tingyang

    2016-12-01

    Land degradation is a global environmental problem that jeopardizes human safety and socioeconomic development. To alleviate severe soil erosion and desertification due to deforestation and overgrazing, China has implemented historically unprecedented large-scale afforestation. However, few studies have accounted for the resulting imbalance between water supply (primarily precipitation) and water consumption (evapotranspiration), which will affect ecosystem health and socioeconomic development. We compared the water balance results between restoration by means of afforestation and restoration using the potential natural vegetation, to guide future ecological restoration planning and environmental policy development. Based on estimates of water consumption from seven evapotranspiration models, we discuss the consequences for water security using data obtained since 1952 under China's large-scale afforestation program. The models estimated that afforestation will increase water consumption by 559-2354 m3/ha annually compared with natural vegetation. Although afforestation is a potentially important approach for environmental restoration, China's current policy has not been tailored to local precipitation conditions, and will therefore have exacerbated water shortages and decreased the ability to achieve environmental policy goals. Our analysis shows how, both in China and around the world, future ecological restoration planning must account for the water balance to ensure effective and sustainable environmental restoration policy. Copyright © 2016. Published by Elsevier Ltd.

  16. Adapting wheat to uncertain future

    NASA Astrophysics Data System (ADS)

    Semenov, Mikhail; Stratonovitch, Pierre

    2015-04-01

    This study describes the integration of climate change projections from the Coupled Model Intercomparison Project Phase 5 (CMIP5) multi-model ensemble with the LARS-WG weather generator, which offers an attractive option for downscaling large-scale climate projections from global climate models (GCMs) to local-scale climate scenarios for impact assessments. A subset of 18 GCMs from the CMIP5 ensemble and 2 RCPs, RCP4.5 and RCP8.5, were integrated with LARS-WG. Climate sensitivity indexes for temperature and precipitation were computed for all GCMs and for 21 regions of the world. For computationally demanding impact assessments, where it is not practical to explore all possible combinations of GCM × RCP, climate sensitivity indexes can be used to select a subset of GCMs from CMIP5 with contrasting climate sensitivity. This makes it possible to quantify the uncertainty in impacts resulting from the CMIP5 ensemble while conducting fewer simulation experiments. As an example, the in silico design of a wheat ideotype optimised for future climate scenarios in Europe is described. Two contrasting GCMs were selected for the analysis, the "hot" HadGEM2-ES and the "cool" GISS-E2-R-CC, along with the 2 RCPs. Despite the large uncertainty in climate projections, several wheat traits were identified as beneficial for high-yielding wheat ideotypes that could be used as targets for wheat improvement by breeders.
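
    The selection idea can be sketched in a few lines (hypothetical numbers, not the paper's indexes): compute a simple per-GCM sensitivity index, e.g. the change in regional mean temperature and precipitation between a future and a baseline period, and pick the most contrasting pair to bracket the ensemble.

        import numpy as np

        gcms = ["GCM-A", "GCM-B", "GCM-C", "GCM-D"]
        dT = np.array([4.8, 2.1, 3.5, 2.9])      # future minus baseline temperature, K
        dP = np.array([-8.0, 5.0, -2.0, 1.0])    # relative precipitation change, %

        # Standardise the two indexes and pick the pair of GCMs furthest apart in the
        # (dT, dP) plane as the contrasting ("hot/dry" vs "cool/wet") subset.
        z = np.column_stack([(dT - dT.mean()) / dT.std(), (dP - dP.mean()) / dP.std()])
        dist = np.linalg.norm(z[:, None, :] - z[None, :, :], axis=-1)
        i, j = np.unravel_index(dist.argmax(), dist.shape)
        print("contrasting pair:", gcms[i], gcms[j])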

  17. Is biosorption suitable for decontamination of metal-bearing wastewaters? A critical review on the state-of-the-art of biosorption processes and future directions.

    PubMed

    Vijayaraghavan, K; Balasubramanian, R

    2015-09-01

    For the past few decades, biosorption has been widely investigated for the removal of different contaminants from aqueous media. A number of biomasses of different types have been identified as possessing good biosorption capacity. Insights into biosorption mechanisms have been provided by various researchers in order to develop a fundamental scientific understanding of the biosorption process. However, biosorption has not been widely employed in large-scale commercial applications. The key factors that affect the growth and evolution of biosorption as a practical technology for the decontamination of wastewaters include: (1) lack of investigations on multi-component solutions and wastewaters with complex matrix effects; (2) incomplete understanding of the physico-chemical characteristics of biomasses of different types; (3) lack of studies to improve the performance of biosorbents through surface functionalization; and (4) non-integration of biosorption in wastewater/water treatment plants. This critical review aims to identify and discuss the practical limitations of biosorption and to provide future research directions to make biosorption a technologically viable process, with emphasis on the selection and modification of biomasses to suit desired treatment applications, identification of appropriate operation modes for large-scale applications of biosorption, and techno-economic evaluation of overall biosorption processes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Gut Microbiota Dynamics during Dietary Shift in Eastern African Cichlid Fishes

    PubMed Central

    Baldo, Laura; Riera, Joan Lluís; Tooming-Klunderud, Ave; Albà, M. Mar; Salzburger, Walter

    2015-01-01

    The gut microbiota structure reflects both a host phylogenetic history and a signature of adaptation to the host's ecological (mainly trophic) niches. African cichlid fishes, with their array of closely related species that underwent a rapid dietary niche radiation, offer a particularly interesting system to explore the relative contribution of these two factors in nature. Here we surveyed the host intra- and interspecific natural variation of the gut microbiota of five cichlid species from the monophyletic tribe Perissodini of Lake Tanganyika, whose members transitioned from being zooplanktivorous to feeding primarily on fish scales. The outgroup riverine species Astatotilapia burtoni, largely omnivorous, was also included in the study. Fusobacteria, Firmicutes and Proteobacteria represented the dominant components in the gut microbiota of all 30 specimens analysed according to two distinct 16S rRNA markers. All members of the Perissodini tribe showed a homogeneous pattern of microbial alpha and beta diversities, with no significant qualitative differences, despite changes in diet. The recent diet shift between zooplankton- and scale-eaters is simply reflected in a significant enrichment of Clostridium taxa in scale-eaters, where they might be involved in scale metabolism. Comparison with the omnivorous species A. burtoni suggests that, with increased host phylogenetic distance and/or increasing herbivory, the gut microbiota begins differentiating also at the qualitative level. The cichlids show the presence of a large conserved core of taxa and a small set of core OTUs (average 13–15%), remarkably stable also in captivity, and putatively favoured by both restricted microbial transmission among related hosts (putatively enhanced by mouthbrooding behavior) and common host constraints. This study sets the basis for a future large-scale investigation of the gut microbiota of cichlids and its adaptation in the process of the host adaptive radiation. PMID:25978452

  19. Physical Modeling of Flow Over Gale Crater, Mars: Laboratory Measurements of Basin Secondary Circulations

    NASA Astrophysics Data System (ADS)

    Bristow, N.; Blois, G.; Kim, T.; Anderson, W.; Day, M. D.; Kocurek, G.; Christensen, K. T.

    2017-12-01

    Impact craters, common large-scale topographic features on the surface of Mars, are circular depressions delimited by a sharp ridge. A variety of crater fill morphologies exist, suggesting that complex intracrater circulations affect their evolution. Some large craters (diameter > 10 km), particularly at mid latitudes on Mars, exhibit a central mound surrounded by a circular moat. Foremost among these examples is Gale crater, landing site of NASA's Curiosity rover, since large-scale climatic processes early in the history of Mars are preserved in the stratigraphic record of the inner mound. Investigating the intracrater flow produced by large-scale winds aloft over Mars craters is key to a number of important scientific issues, including ongoing research on Mars paleo-environmental reconstruction and the planning of future missions (these results must be viewed in conjunction with the effects of radial katabatic flows, the importance of which is already established in preceding studies). In this work we consider a number of crater shapes inspired by Gale's morphology, including idealized craters. Access to the flow field within such geometrically complex topography is achieved herein using a refractive index matched approach. Instantaneous velocity maps, obtained using both planar and volumetric PIV techniques, are presented to elucidate the complex three-dimensional flow within the crater. In addition, first- and second-order statistics are discussed in the context of wind-driven (aeolian) excavation of crater fill.

  20. [Privacy and public benefit in using large scale health databases].

    PubMed

    Yamamoto, Ryuichi

    2014-01-01

    In Japan, large-scale health databases have been constructed within a few years, such as the national claims and health checkup database (NDB) and the Japanese Sentinel project. However, there are legal issues in striking an adequate balance between privacy and public benefit when using such databases. The NDB operates under the act on health care for elderly persons, but this act says nothing about using the database for general public benefit. Researchers who use the database are therefore forced to devote much attention to anonymization and information security, which may hinder the research work itself. The Japanese Sentinel project is a national project for detecting adverse drug reactions using large-scale distributed clinical databases of large hospitals. Although patients give broad consent in advance for such uses in the public good, the use of insufficiently anonymized data is still under discussion. Generally speaking, researchers conducting studies for public benefit will not infringe patients' privacy, but vague and complex legislative requirements concerning personal data protection may hinder the research. Medical science does not progress without using clinical information; adequate legislation that is simple and clear for both researchers and patients is therefore strongly required. In Japan, a specific act for balancing privacy and public benefit is now under discussion. The author recommends that researchers, including those in the field of pharmacology, pay attention to, participate in the discussion of, and make suggestions for such acts and regulations.

  1. Large-Ensemble modeling of past and future variations of the Antarctic Ice Sheet with a coupled ice-Earth-sea level model

    NASA Astrophysics Data System (ADS)

    Pollard, David; DeConto, Robert; Gomez, Natalya

    2016-04-01

    To date, most modeling of the Antarctic Ice Sheet's response to future warming has been calibrated using recent and modern observations. As an alternate approach, we apply a hybrid 3-D ice sheet-shelf model to the last deglacial retreat of Antarctica, making use of geologic data of the last ~20,000 years to test the model against the large-scale variations during this period. The ice model is coupled to a global Earth-sea level model to improve modeling of the bedrock response and to capture ocean-ice gravitational interactions. Following several recent ice-sheet studies, we use Large Ensemble (LE) statistical methods, performing sets of 625 runs from 30,000 years to present with systematically varying model parameters. Objective scores for each run are calculated using modern data and past reconstructed grounding lines, relative sea level records, cosmogenic elevation-age data and uplift rates. The LE results are analyzed to calibrate 4 particularly uncertain model parameters that concern marginal ice processes and interaction with the ocean. LEs are extended into the future with climates following RCP scenarios. An additional scoring criterion tests the model's ability to reproduce estimated sea-level high stands in the warm mid-Pliocene, for which drastic retreat mechanisms of hydrofracturing and ice-cliff failure are needed in the model. The LE analysis provides future sea-level-rise envelopes with well-defined parametric uncertainty bounds. Sensitivities of future LE results to Pliocene sea-level estimates, coupling to the Earth-sea level model, and vertical profiles of Earth properties will be presented.
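    The Large Ensemble workflow described above amounts to running the ice-sheet model over a grid of uncertain parameter values and scoring each run against observations; the sketch below illustrates the bookkeeping only, with a stand-in model, hypothetical parameter names, and illustrative scoring weights (five values of four parameters gives the 625-member ensemble quoted above).

```python
# Sketch of a Large-Ensemble parameter calibration (hypothetical parameter
# names and misfit function; not the actual ice-sheet model).
import itertools
import numpy as np

# Four uncertain parameters, each sampled at five values -> 5**4 = 625 runs.
param_grid = {
    "calving_coeff":   np.linspace(0.5, 2.0, 5),
    "basal_melt_fac":  np.linspace(0.5, 4.0, 5),
    "cliff_fail_rate": np.linspace(0.0, 12.0, 5),        # km/yr
    "bed_relax_time":  np.linspace(1000.0, 5000.0, 5),   # yr
}

def run_model(params):
    """Stand-in for one 30 kyr ice-sheet simulation; returns mock outputs."""
    rng = np.random.default_rng(abs(hash(tuple(params.values()))) % 2**32)
    return {"grounding_line_km": 1500 + 100 * rng.standard_normal(),
            "rsl_misfit_m": abs(rng.standard_normal()) * 5}

def score(outputs, obs_grounding_line_km=1450.0):
    """Aggregate misfit against modern/paleo constraints (illustrative weights)."""
    return (abs(outputs["grounding_line_km"] - obs_grounding_line_km) / 50.0
            + outputs["rsl_misfit_m"] / 2.0)

runs = []
for values in itertools.product(*param_grid.values()):
    params = dict(zip(param_grid.keys(), values))
    runs.append((score(run_model(params)), params))

best_score, best_params = min(runs, key=lambda r: r[0])
print(len(runs), "runs; best score", round(best_score, 2), best_params)
```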

  2. Radiation-Hardened Wafer Scale Integration

    DTIC Science & Technology

    1989-10-25

    unlimited. LEXINGTON MASSACHUSETTS EXECUTIVE SUMMARY A focal plane processor (FPP) for a large array of LWIR photodetectors on a space platform must...It seems certain that large, scanning LWIR arrays will once again be of interest in the future, though their specifications will differ from those... nonuniformity and defects in the ZMR material, but films of good quality produced by this technique are now available commercially from Kopin Corporation. Such

  3. Aftermath of the MOOC Wars: Can Commercial Vendors Support Creative Higher Education?

    ERIC Educational Resources Information Center

    Newfield, Christopher

    2016-01-01

    The large-scale massive open online course (xMOOC) rose to prominence in 2012-13 on the promise that its outcomes would be better and cheaper than those of face-to-face university instruction. By late 2013, xMOOC educational claims had been largely discredited, though policy interest in ed-tech carried on. What can we learn about the future of…

  4. A global fit of the MSSM with GAMBIT

    NASA Astrophysics Data System (ADS)

    Athron, Peter; Balázs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Edsjö, Joakim; Farmer, Ben; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Serra, Nicola; Weniger, Christoph; White, Martin

    2017-12-01

    We study the seven-dimensional Minimal Supersymmetric Standard Model (MSSM7) with the new GAMBIT software framework, with all parameters defined at the weak scale. Our analysis significantly extends previous weak-scale, phenomenological MSSM fits, by adding more and newer experimental analyses, improving the accuracy and detail of theoretical predictions, including dominant uncertainties from the Standard Model, the Galactic dark matter halo and the quark content of the nucleon, and employing novel and highly-efficient statistical sampling methods to scan the parameter space. We find regions of the MSSM7 that exhibit co-annihilation of neutralinos with charginos, stops and sbottoms, as well as models that undergo resonant annihilation via both light and heavy Higgs funnels. We find high-likelihood models with light charginos, stops and sbottoms that have the potential to be within the future reach of the LHC. Large parts of our preferred parameter regions will also be accessible to the next generation of direct and indirect dark matter searches, making prospects for discovery in the near future rather good.

  5. Future sensitivity to new physics in Bd, Bs, and K mixings

    NASA Astrophysics Data System (ADS)

    Charles, Jérôme; Descotes-Genon, Sébastien; Ligeti, Zoltan; Monteil, Stéphane; Papucci, Michele; Trabelsi, Karim

    2014-02-01

    We estimate, in a large class of scenarios, the sensitivity to new physics in Bd and Bs mixings achievable with 50 ab-1 of Belle II and 50 fb-1 of LHCb data. We find that current limits on new physics contributions in both Bd,s systems can be improved by a factor of ˜5 for all values of the CP-violating phases, corresponding to over a factor of 2 increase in the scale of new physics probed. Assuming the same suppressions by Cabibbo-Kobayashi-Maskawa matrix elements as those of the standard model box diagrams, the scale probed will be about 20 TeV for tree-level new physics contributions, and about 2 TeV for new physics arising at one loop. We also explore the future sensitivity to new physics in K mixing. Implications for generic new physics and for various specific scenarios, such as minimal flavor violation, light third-generation dominated flavor violation, or U(2) flavor models are studied.
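    The quoted link between a factor of ~5 improvement in the mixing limits and roughly a factor of 2 in the new-physics scale follows from the usual effective-operator scaling; a schematic version, with a generic coupling C and scale Λ, is:

```latex
% Schematic effective-operator scaling behind the quoted numbers: a
% Delta F = 2 operator contributes to mixing as M_12^NP ~ C / Lambda^2,
% so tightening the bound on |M_12^NP| by a factor of 5 raises the scale
% probed by sqrt(5) ~ 2.2 at fixed coupling C.
M_{12}^{\mathrm{NP}} \;\propto\; \frac{C}{\Lambda^{2}}
\qquad\Longrightarrow\qquad
\frac{\Lambda_{\mathrm{new}}}{\Lambda_{\mathrm{old}}}
= \sqrt{\frac{|M_{12}^{\mathrm{NP}}|_{\mathrm{old}}}{|M_{12}^{\mathrm{NP}}|_{\mathrm{new}}}}
\approx \sqrt{5} \approx 2.2
```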

  6. Aqueous Two-Phase Systems at Large Scale: Challenges and Opportunities.

    PubMed

    Torres-Acosta, Mario A; Mayolo-Deloisa, Karla; González-Valdez, José; Rito-Palomares, Marco

    2018-06-07

    Aqueous two-phase systems (ATPS) have proved to be an efficient and integrative operation to enhance recovery of industrially relevant bioproducts. Since the discovery of ATPS, a variety of works have been published regarding their scaling from 10 to 1000 L. Although ATPS have achieved high recovery and purity yields, there is still a gap between their bench-scale use and potential industrial applications. In this context, this review paper critically analyzes ATPS scale-up strategies to enhance the potential for industrial adoption. In particular, large-scale operation considerations, different phase-separation procedures, the available optimization techniques (univariate, response surface methodology, and genetic algorithms) to maximize recovery and purity, and economic modeling to predict large-scale costs are discussed. ATPS intensification to increase the amount of sample processed per system, the development of recycling strategies, and the creation of highly efficient predictive models are still areas of great significance that can be further exploited with the use of high-throughput techniques. Moreover, the development of novel ATPS can maximize their specificity, increasing the possibilities for future industrial adoption. This review work attempts to present the areas of opportunity to increase ATPS attractiveness at industrial levels. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
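    As a minimal illustration of the response-surface methodology mentioned above, one can fit a quadratic surface to screening data and read off the predicted optimum; the sketch below uses invented recovery data over two hypothetical composition variables (PEG and salt concentration).

```python
# Minimal response-surface sketch for ATPS optimization (illustrative data:
# recovery as a function of PEG and phosphate concentrations, % w/w).
import numpy as np

# Hypothetical screening experiments: (PEG %, salt %, recovery %).
data = np.array([
    [10, 10, 62], [10, 14, 70], [10, 18, 66],
    [14, 10, 74], [14, 14, 88], [14, 18, 79],
    [18, 10, 68], [18, 14, 81], [18, 18, 71],
], dtype=float)
peg, salt, recovery = data.T

# Full quadratic model: 1, x1, x2, x1*x2, x1^2, x2^2.
X = np.column_stack([np.ones_like(peg), peg, salt, peg * salt, peg**2, salt**2])
coef, *_ = np.linalg.lstsq(X, recovery, rcond=None)

# Evaluate the fitted surface on a grid and report the predicted optimum.
pg, sg = np.meshgrid(np.linspace(10, 18, 81), np.linspace(10, 18, 81))
pred = (coef[0] + coef[1] * pg + coef[2] * sg
        + coef[3] * pg * sg + coef[4] * pg**2 + coef[5] * sg**2)
i, j = np.unravel_index(np.argmax(pred), pred.shape)
print(f"Predicted optimum: PEG={pg[i, j]:.1f}%, salt={sg[i, j]:.1f}%, "
      f"recovery≈{pred[i, j]:.1f}%")
```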

  7. Social Gaming and Learning Applications: A Driving Force for the Future of Virtual and Augmented Reality?

    NASA Astrophysics Data System (ADS)

    Dörner, Ralf; Lok, Benjamin; Broll, Wolfgang

    Backed by a large consumer market, entertainment and education applications have spurred developments in the fields of real-time rendering and interactive computer graphics. Relying on Computer Graphics methodologies, Virtual Reality and Augmented Reality benefited indirectly from this; however, there is no large scale demand for VR and AR in gaming and learning. What are the shortcomings of current VR/AR technology that prevent a widespread use in these application areas? What advances in VR/AR will be necessary? And what might future “VR-enhanced” gaming and learning look like? Which role can and will Virtual Humans play? Concerning these questions, this article analyzes the current situation and provides an outlook on future developments. The focus is on social gaming and learning.

  8. LAGUNA DESIGN STUDY, Underground infrastructures and engineering

    NASA Astrophysics Data System (ADS)

    Nuijten, Guido Alexander

    2011-07-01

    The European Commission awarded the LAGUNA project a grant of 1.7 million euro in 2008 for a Design Study from the Seventh Framework Programme of research and technology development (FP7-INFRASTRUCTURES-2007-1). The purpose of this two-year work is to study the feasibility of the considered experiments and to prepare a conceptual design of the required underground infrastructure. It is due to deliver a report that allows the funding agencies to decide on the realization of the experiment and to select the site and the technology. The result of this work is the first step towards fulfilling the goals of LAGUNA; the work will continue with EU funding to study the possibilities more thoroughly. The LAGUNA project is included in the future plans prepared by European funding organizations (Astroparticle Physics in Europe). It is recommended that a new large European infrastructure be put forward as a future international multi-purpose facility for improved studies of proton decay and of low-energy neutrinos of astrophysical origin. The three detection techniques being studied for such large detectors in Europe, Water-Cherenkov (like MEMPHYS), liquid scintillator (like LENA) and liquid argon (like GLACIER), are evaluated in the context of a common design study, which should also address the underground infrastructure and the possibility of an eventual detection of future accelerator neutrino beams. The design study is also to take into account worldwide efforts and to converge, on a time scale of 2010, to a common proposal.

  9. Global geomorphology: Report of Working Group Number 1

    NASA Technical Reports Server (NTRS)

    Douglas, I.

    1985-01-01

    Remote sensing was considered invaluable for seeing landforms in their regional context and in relationship to each other. Sequential images, such as those available from LANDSAT orbits, provide a means of detecting landform change and the operation of large-scale processes, such as major floods in semiarid regions. The use of remote sensing falls into two broad stages: (1) the characterization or accurate description of the features of the Earth's surface; and (2) the study of landform evolution. Recommendations for future research are made.

  10. The Segmented Aperture Interferometric Nulling Testbed (SAINT) I: overview and air-side system description

    NASA Astrophysics Data System (ADS)

    Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter; Ballard, Marlin; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina; Shiri, Ron

    2016-07-01

    This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify end-to-end high-contrast starlight suppression performance. This pathfinder testbed will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.

  11. Regional hydro-climatic impacts of contemporary Amazonian deforestation

    NASA Astrophysics Data System (ADS)

    Khanna, Jaya

    More than 17% of the Amazon rainforest has been cleared in the past three decades, triggering important climatological and societal impacts. This thesis is devoted to identifying and explaining the regional hydroclimatic impacts of this change, employing multidecadal satellite observations and numerical simulations to provide an integrated perspective on this topic. The climatological nature of this study motivated the implementation and application of a cloud detection technique to a new geostationary satellite dataset. The resulting sub-daily, high spatial resolution, multidecadal time series facilitated the detection of trends and variability in deforestation-triggered cloud cover changes. The analysis was complemented by satellite precipitation, reanalysis and ground-based datasets, and attribution with the variable-resolution Ocean-Land-Atmosphere Model. Contemporary Amazonian deforestation affects spatial scales of hundreds of kilometers. But, unlike the well-studied impacts of few-kilometer-scale deforestation, the climatic response to contemporary, large-scale deforestation is neither well observed nor well understood. Employing satellite datasets, this thesis shows a transition in the regional hydroclimate accompanying increasing scales of deforestation, with downwind deforested regions receiving 25% more and upwind deforested regions receiving 25% less precipitation than the deforested-area mean. Simulations robustly reproduce these shifts when forced with increasing deforestation alone, suggesting a negligible role of large-scale decadal climate variability in causing the shifts. Furthermore, deforestation-induced surface roughness variations are found necessary to reproduce the observed spatial patterns in recent times, illustrating the strong scale-sensitivity of the climatic response to Amazonian deforestation. This phenomenon, inconsequential during the wet season, is found to substantially affect the regional hydroclimate in the local dry and parts of the transition seasons, hence occurring in atmospheric conditions otherwise less conducive to thermal convection. Evidence of this phenomenon is found at the two large-scale deforested areas considered in this thesis. Hence, the 'dynamical' mechanism, which affects the seasons most important for regional ecology, emerges as an impactful convective triggering mechanism. The phenomenon studied in this thesis provides context for thinking about the climate of a future, more patchily forested Amazonia, by articulating relationships between climate and spatial scales of deforestation.

  12. A Systematic Review of Biomarkers and Risk of Incident Type 2 Diabetes: An Overview of Epidemiological, Prediction and Aetiological Research Literature

    PubMed Central

    Sahlqvist, Anna-Stina; Lotta, Luca; Brosnan, Julia M.; Vollenweider, Peter; Giabbanelli, Philippe; Nunez, Derek J.; Waterworth, Dawn; Scott, Robert A.; Langenberg, Claudia; Wareham, Nicholas J.

    2016-01-01

    Background Blood-based or urinary biomarkers may play a role in quantifying the future risk of type 2 diabetes (T2D) and in understanding possible aetiological pathways to disease. However, no systematic review has been conducted that has identified and provided an overview of available biomarkers for incident T2D. We aimed to systematically review the associations of biomarkers with risk of developing T2D, to highlight evidence gaps in the existing literature regarding the predictive and aetiological value of these biomarkers, and to direct future research in this field. Methods and Findings We systematically searched the PubMed MEDLINE (January 2000 until March 2015) and Embase (until January 2016) databases for observational studies of biomarkers and incident T2D according to the 2009 PRISMA guidelines. We also searched for available meta-analyses, Mendelian randomisation and prediction research for the identified biomarkers. We reviewed 3910 titles (705 abstracts) and 164 full papers and included 139 papers from 69 cohort studies that described the prospective relationships between 167 blood-based or urinary biomarkers and incident T2D. Only 35 biomarkers were reported in large-scale studies with more than 1000 T2D cases, and thus the evidence for association was inconclusive for the majority of biomarkers. Fourteen biomarkers have been investigated using Mendelian randomisation approaches. Only for one biomarker was there strong observational evidence of association and evidence from genetic association studies that was compatible with an underlying causal association. In an additional search for T2D prediction research, we found that only half of the biomarkers had been examined for predictive utility, with formal evidence of predictive value for only a minority of them. Most biomarkers did not enhance the strength of prediction, but the strongest evidence for prediction was for biomarkers that quantify measures of glycaemia. Conclusions This study presents an extensive review of the current state of the literature to inform the strategy for future interrogation of existing and newly described biomarkers for T2D. Many biomarkers have been reported to be associated with the risk of developing T2D. The evidence of their value in adding to the understanding of causal pathways to disease is so far very limited. The utility of most biomarkers remains largely unknown in clinical prediction. Future research should focus on providing good genetic instruments across consortia for possible biomarkers in Mendelian randomisation, prioritising biomarkers for measurement in large-scale cohort studies and examining predictive utility of biomarkers for a given context. PMID:27788146

  13. Watershed Dynamics, with focus on connectivity index and management of water related impacts on road infrastructure

    NASA Astrophysics Data System (ADS)

    Kalantari, Z.

    2015-12-01

    In Sweden, spatially explicit approaches have been applied in various disciplines, such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling on a small scale, whereas just a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects on a larger scale. This study was built on a conceptual framework for looking at the SedInConnect model, topography, land use, soil data and other PCDs, and climate change in an integrated way to pave the way for more integrated policy making. The aim of the present study was to develop a methodology for predicting the spatial probability of flooding on a general large scale. This framework can provide an effective tool to inform a broad range of watershed planning activities within a region. Regional planners, decision-makers and others can utilize this tool to identify the most vulnerable points in a watershed and along roads, and to plan interventions and actions that alter the impacts of high flows and other extreme weather events on road construction. The application of the model over a large scale can give a realistic spatial characterization of sediment connectivity for the optimal management of debris flow to road structures. The ability of the model to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to extract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features on the landscape. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.

  14. Genetic influences on schizophrenia and subcortical brain volumes: large-scale proof-of-concept and roadmap for future studies

    PubMed Central

    Anttila, Verneri; Hibar, Derrek P; van Hulzen, Kimm J E; Arias-Vasquez, Alejandro; Smoller, Jordan W; Nichols, Thomas E; Neale, Michael C; McIntosh, Andrew M; Lee, Phil; McMahon, Francis J; Meyer-Lindenberg, Andreas; Mattheisen, Manuel; Andreassen, Ole A; Gruber, Oliver; Sachdev, Perminder S; Roiz-Santiañez, Roberto; Saykin, Andrew J; Ehrlich, Stefan; Mather, Karen A; Turner, Jessica A; Schwarz, Emanuel; Thalamuthu, Anbupalam; Shugart, Yin Yao; Ho, Yvonne YW; Martin, Nicholas G; Wright, Margaret J

    2016-01-01

    Schizophrenia is a devastating psychiatric illness with high heritability. Brain structure and function differ, on average, between schizophrenia cases and healthy individuals. As common genetic associations are emerging for both schizophrenia and brain imaging phenotypes, we can now use genome-wide data to investigate genetic overlap. Here we integrated results from common variant studies of schizophrenia (33,636 cases, 43,008 controls) and volumes of several (mainly subcortical) brain structures (11,840 subjects). We did not find evidence of genetic overlap between schizophrenia risk and subcortical volume measures either at the level of common variant genetic architecture or for single genetic markers. The current study provides proof-of-concept (albeit based on a limited set of structural brain measures), and defines a roadmap for future studies investigating the genetic covariance between structural/functional brain phenotypes and risk for psychiatric disorders. PMID:26854805

  15. Statistical techniques for detecting the intergalactic magnetic field from large samples of extragalactic Faraday rotation data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akahori, Takuya; Gaensler, B. M.; Ryu, Dongsu, E-mail: akahori@physics.usyd.edu.au, E-mail: bryan.gaensler@sydney.edu.au, E-mail: ryu@sirius.unist.ac.kr

    2014-08-01

    Rotation measure (RM) grids of extragalactic radio sources have been widely used for studying cosmic magnetism. However, their potential for exploring the intergalactic magnetic field (IGMF) in filaments of galaxies is unclear, since other Faraday-rotation media such as the radio source itself, intervening galaxies, and the interstellar medium of our Galaxy are all significant contributors. We study statistical techniques for discriminating the Faraday rotation of filaments from other sources of Faraday rotation in future large-scale surveys of radio polarization. We consider a 30° × 30° field of view toward the south Galactic pole, while varying the number of sources detected in both present and future observations. We select sources located at high redshifts and toward which depolarization and optical absorption systems are not observed so as to reduce the RM contributions from the sources and intervening galaxies. It is found that a high-pass filter can satisfactorily reduce the RM contribution from the Galaxy since the angular scale of this component toward high Galactic latitudes would be much larger than that expected for the IGMF. Present observations do not yet provide a sufficient source density to be able to estimate the RM of filaments. However, from the proposed approach with forthcoming surveys, we predict significant residuals of RM that should be ascribable to filaments. The predicted structure of the IGMF down to scales of 0.°1 should be observable with data from the Square Kilometre Array, if we achieve selections of sources toward which sightlines do not contain intervening galaxies and RM errors are less than a few rad m⁻².
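    One way to read the high-pass-filter step described above is to subtract a heavily smoothed (large angular scale) version of the RM map from the raw grid, so that only small-scale structure of the kind expected from filaments survives; the toy example below does this on a synthetic 30° × 30° grid (all amplitudes and smoothing scales are illustrative).

```python
# Toy high-pass filtering of an RM grid: remove the smooth Galactic foreground
# (large angular scales) and keep small-scale residuals (illustrative numbers).
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
npix = 300                      # 30 deg x 30 deg field at 0.1 deg per pixel
y, x = np.mgrid[0:npix, 0:npix]

galactic = 15.0 * np.sin(2 * np.pi * x / npix) * np.cos(2 * np.pi * y / npix)   # smooth, degree-scale
filaments = gaussian_filter(rng.standard_normal((npix, npix)), sigma=2) * 5.0   # ~0.2 deg structure
noise = rng.normal(scale=3.0, size=(npix, npix))                                # per-source RM errors
rm_observed = galactic + filaments + noise

# High-pass filter: subtract a heavily smoothed version of the observed map.
rm_smooth = gaussian_filter(rm_observed, sigma=30)    # ~3 deg smoothing kernel
rm_highpass = rm_observed - rm_smooth

resid = rm_highpass - filaments
print("rms of non-filament signal before filtering:",
      np.std(rm_observed - filaments).round(2), "rad/m^2")
print("rms of non-filament signal after filtering: ",
      np.std(resid).round(2), "rad/m^2")
```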

  16. Meta-analysis on Macropore Flow Velocity in Soils

    NASA Astrophysics Data System (ADS)

    Liu, D.; Gao, M.; Li, H. Y.; Chen, X.; Leung, L. R.

    2017-12-01

    Macropore flow is ubiquitous in soils and an important hydrologic process that is not well explained by traditional hydrologic theories. Macropore Flow Velocity (MFV) is an important parameter used to describe macropore flow and quantify its effects on runoff generation and solute transport. However, the dominant factors controlling MFV are still poorly understood and the typical ranges of MFV measured in the field are not clearly defined. To address these issues, we conducted a meta-analysis based on a database created from 246 experiments on MFV collected from 76 journal articles. For a fair comparison, a conceptually unified definition of MFV is introduced to convert the MFV measured with different approaches and at various scales, including soil core, field, trench or hillslope scales. The potential controlling factors of MFV considered include scale, travel distance, hydrologic conditions, site factors, macropore morphologies, soil texture, and land use. The results show that MFV is about 2–3 orders of magnitude larger than the corresponding values of saturated hydraulic conductivity. MFV is much larger at the trench and hillslope scale than at the field profile and soil core scales and shows a significant positive correlation with travel distance. Generally, higher irrigation intensity tends to trigger faster MFV, especially at the field profile scale, where MFV and irrigation intensity show a significant positive correlation. At the trench and hillslope scale, the presence of large macropores (diameter > 10 mm) is a key factor determining MFV. The geometric mean of MFV for sites with large macropores was found to be about 8 times larger than for those without large macropores. For sites with large macropores, MFV increases with macropore diameter. However, no noticeable difference in MFV has been observed among different soil textures and land uses. Comparing the existing equations used to describe MFV, the Poiseuille equation significantly overestimated the observed values, while Manning-type equations generate reasonable values. The insights from this study will shed light on future field campaigns and modeling of macropore flow.
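    For reference, the two families of equations compared above take the following standard forms (laminar Poiseuille flow in a water-filled cylindrical macropore versus a Manning-type relation; the comparison is schematic and the symbols are the conventional ones):

```latex
% Poiseuille: mean velocity of laminar flow in a circular macropore of
% radius r under hydraulic gradient i = dh/dl (water density rho,
% dynamic viscosity mu, gravity g):
v_{\mathrm{Poiseuille}} = \frac{\rho\, g\, r^{2}}{8\,\mu}\, i

% Manning-type relation (hydraulic radius R = r/2 for a full circular pore,
% roughness coefficient n, hydraulic gradient i):
v_{\mathrm{Manning}} = \frac{1}{n}\, R^{2/3}\, i^{1/2}

% Because v_Poiseuille grows with r^2 while v_Manning grows only with r^{2/3},
% the Poiseuille estimate overshoots observed macropore velocities increasingly
% fast as pore size grows, consistent with the meta-analysis above.
```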

  17. North American Extreme Temperature Events and Related Large Scale Meteorological Patterns: A Review of Statistical Methods, Dynamics, Modeling, and Trends

    NASA Technical Reports Server (NTRS)

    Grotjahn, Richard; Black, Robert; Leung, Ruby; Wehner, Michael F.; Barlow, Mathew; Bosilovich, Michael G.; Gershunov, Alexander; Gutowski, William J., Jr.; Gyakum, John R.; Katz, Richard W.; hide

    2015-01-01

    The objective of this paper is to review statistical methods, dynamics, modeling efforts, and trends related to temperature extremes, with a focus upon extreme events of short duration that affect parts of North America. These events are associated with large-scale meteorological patterns (LSMPs). The statistics, dynamics, and modeling sections of this paper are written to be autonomous and so can be read separately. Methods to define extreme event statistics and to identify and connect LSMPs to extreme temperature events are presented. Recent advances in statistical techniques connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. Various LSMPs, ranging from synoptic to planetary scale structures, are associated with extreme temperature events. Current knowledge about the synoptics and the dynamical mechanisms leading to the associated LSMPs is incomplete. Systematic studies of the physics of LSMP life cycles, comprehensive model assessment of LSMP-extreme temperature event linkages, and LSMP properties are needed. Generally, climate models capture observed properties of heat waves and cold air outbreaks with some fidelity. However they overestimate warm wave frequency and underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Modeling studies have identified the impact of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs to more specifically understand the role of LSMPs on past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated. The paper concludes with unresolved issues and research questions.

  18. About the bears and the bees: Adaptive responses to asymmetric warfare

    NASA Astrophysics Data System (ADS)

    Ryan, Alex

    Conventional military forces are organised to generate large-scale effects against similarly structured adversaries. Asymmetric warfare is a 'game' between a conventional military force and a weaker adversary that is unable to match the scale of effects of the conventional force. In asymmetric warfare, an insurgent's strategy can be understood using a multi-scale perspective: by generating and exploiting fine-scale complexity, insurgents prevent the conventional force from acting at the scale they are designed for. This paper presents a complex systems approach to the problem of asymmetric warfare, which shows how future force structures can be designed to adapt to environmental complexity at multiple scales and achieve full spectrum dominance.

  20. Back to the Consideration of Future Consequences Scale: time to reconsider?

    PubMed

    Rappange, David R; Brouwer, Werner B F; van Exel, N Job A

    2009-10-01

    The Consideration of Future Consequences (CFC) Scale is a measure of the extent to which individuals consider and are influenced by the distant outcomes of current behavior. In this study, the authors conducted factor analysis to investigate the factor structure of the 12-item CFC Scale. The authors found evidence for a multiple factor solution including one completely present-oriented factor consisting of all 7 present-oriented items, and one or two future-oriented factors consisting of the remaining future-oriented items. Further evidence indicated that the present-oriented factor and the 12-item CFC Scale perform similarly in terms of internal consistency and convergent validity. The structure and content of the future-oriented factor(s) is unclear. From the findings, the authors raise questions regarding the construct validity of the CFC Scale, the interpretation of its results, and the usefulness of the CFC scale in its current form in applied research.

  1. Flood events across the North Atlantic region - past development and future perspectives

    NASA Astrophysics Data System (ADS)

    Matti, Bettina; Dieppois, Bastien; Lawler, Damian; Dahlke, Helen E.; Lyon, Steve W.

    2016-04-01

    Flood events have a large impact on humans, both socially and economically. An increase in winter and spring flooding across much of northern Europe in recent years opened up the question of changing underlying hydro-climatic drivers of flood events. Predicting the manifestation of such changes is difficult due to the natural variability and fluctuations in northern hydrological systems caused by large-scale atmospheric circulations, especially under altered climate conditions. Improving knowledge on the complexity of these hydrological systems and their interactions with climate is essential to be able to determine drivers of flood events and to predict changes in these drivers under altered climate conditions. This is particularly true for the North Atlantic region, where both physical catchment properties and large-scale atmospheric circulations have a profound influence on floods. This study explores changes in streamflow across North Atlantic region catchments. An emphasis is placed on high-flow events, namely the timing and magnitude of past flood events, and selected flood percentiles were tested for stationarity by applying a flood frequency analysis. The issue of non-stationarity of flood return periods is important when linking streamflow to large-scale atmospheric circulations. Natural fluctuations in these circulations are found to have a strong influence on the outcome, causing natural variability in streamflow records. Long time series and a multi-temporal approach allow for determining drivers of floods and linking streamflow to large-scale atmospheric circulations. Exploring changes in selected hydrological signatures, consistency was found across much of the North Atlantic region, suggesting a shift in flow regime. The lack of an overall regional pattern suggests that how catchments respond to changes in climatic drivers is strongly influenced by their physical characteristics. A better understanding of hydrological response to climate drivers is essential, for example, for forecasting purposes.
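    A minimal sketch of the flood-frequency step described above is to fit a GEV distribution to annual maximum flows and compare return levels between sub-periods as a crude stationarity check; the series below is synthetic and the split-sample comparison is illustrative only.

```python
# Sketch of a flood-frequency / stationarity check: fit a GEV distribution to
# annual maximum flows in two sub-periods and compare the 100-year return level.
# The streamflow series is synthetic; a real analysis would use gauged records.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
years = np.arange(1950, 2016)
# Synthetic annual maxima with a mild upward shift in the later decades.
annual_max = genextreme.rvs(c=-0.1, loc=200, scale=60, size=years.size, random_state=rng)
annual_max[years >= 1983] *= 1.15

def return_level(sample, return_period=100.0):
    """Fit a GEV and return the flow exceeded on average once per return_period years."""
    c, loc, scale = genextreme.fit(sample)
    return genextreme.ppf(1.0 - 1.0 / return_period, c, loc=loc, scale=scale)

early = annual_max[years < 1983]
late = annual_max[years >= 1983]
print("100-yr return level, 1950-1982:", round(return_level(early), 1), "m^3/s")
print("100-yr return level, 1983-2015:", round(return_level(late), 1), "m^3/s")
```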

  2. Assessing Impacts of Global Warming on Tropical Cyclone Tracks

    NASA Technical Reports Server (NTRS)

    Wu, Li-Guang; Wang, Bin

    2003-01-01

    A new approach is proposed to assess the possible impacts of the global climate change on tropical cyclone (TC) tracks in the western North Pacific (WNP) basin. The idea is based on the premise that the future change of TC track characteristics is primarily determined by changes in large-scale environmental steering flows. It is demonstrated that the main characteristics of the current climatology of TC tracks can be derived from the climatological mean velocity field of TC motion by using a trajectory model. The climatological mean velocity of TC motion, which is composed of the large-scale steering and beta drift, is determined on each grid of the basin. The mean beta drift is estimated from the best track data, and the mean large-scale steering flow is computed from the NCEP/NCAR reanalysis for the current climate state. The derived mean beta drift agrees well with the results of previous observational and numerical studies in terms of its direction and magnitude. The outputs of experiments A2 and B2 of the Geophysical Fluid Dynamics Laboratory (GFDL) R30 climate model suggest that the subtropical high will be persistently weak over the western part of the WNP or shift eastward during July-September in response to the future climate change. By assuming that the mean beta drift in the future climate state is unchanged, the change in the general circulation by 2059 will decrease the TC activities in the WNP, but favor a northward shift of typical TC tracks. As a result, the storm activities in the South China Sea will decrease by about 12%, while the Japan region will experience an increase of TCs by 12-15%. During the period of 2000-2029, the tropical storms that affect the China region will increase by 5-6%, but return to the current level during 2030-2059. It is also suggested that, during the period of 2030-2059 tropical storms will more frequently affect Japan and the middle latitude region of China given that the formation locations remain the same as in the current climate state.
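    The trajectory-model idea in the abstract can be illustrated with a toy integration in which a storm is advected by a gridded climatological-mean motion field (steering flow plus beta drift); the grids, drift vector, and time step below are placeholders rather than the NCEP/NCAR or GFDL fields used in the study.

```python
# Toy trajectory model: integrate TC position through a climatological-mean
# motion field (steering flow + beta drift). Fields and numbers are placeholders.
import numpy as np

# Idealized steering flow on a 1-degree grid over the WNP (m/s): easterlies in
# the tropics turning poleward/westerly toward the subtropical high edge.
lons = np.arange(100.0, 181.0)          # 100E-180E
lats = np.arange(0.0, 51.0)             # 0N-50N
LON, LAT = np.meshgrid(lons, lats)
u_steer = -6.0 + 0.25 * LAT             # easterly in tropics, westerly poleward
v_steer = 1.0 + 0.04 * (LON - 140.0)

beta_drift = (-1.0, 2.0)                # (u, v) m/s, north-westward, assumed constant

def motion_at(lon, lat):
    """Nearest-grid-point climatological TC motion = steering + beta drift."""
    j = int(round(lat - lats[0])); i = int(round(lon - lons[0]))
    j = min(max(j, 0), len(lats) - 1); i = min(max(i, 0), len(lons) - 1)
    return u_steer[j, i] + beta_drift[0], v_steer[j, i] + beta_drift[1]

# Integrate a track from a typical genesis point with a 6-hour time step.
lon, lat, dt = 150.0, 12.0, 6 * 3600.0
track = [(lon, lat)]
for _ in range(24):                      # 6 days
    u, v = motion_at(lon, lat)
    lat += v * dt / 111.0e3
    lon += u * dt / (111.0e3 * np.cos(np.radians(lat)))
    track.append((round(lon, 1), round(lat, 1)))
print(track[::4])
```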

  3. Potential and prospective implementation of carbon nanotubes on next generation aircraft and space vehicles: A review of current and expected applications in aerospace sciences

    NASA Astrophysics Data System (ADS)

    Gohardani, Omid; Elola, Maialen Chapartegui; Elizetxea, Cristina

    2014-10-01

    Carbon nanotubes have instigated the interest of many different scientific fields since their authenticated introduction, more than two decades ago. Particularly in aerospace applications, the potential implementations of these advanced materials have been predicted to have a large impact on future aircraft and space vehicles, mainly due to their distinct features, which include superior mechanical, thermal and electrical properties. This article provides the very first consolidated review of the imminent prospects of utilizing carbon nanotubes and nanoparticles in aerospace sciences, based on their recent implementations and predicted future applications. Explicitly, expected carbon nanotube employment in aeronautics and astronautics are identified for commercial aircraft, military aircraft, rotorcraft, unmanned aerial vehicles, satellites, and space launch vehicles. Attention is devoted to future utilization of carbon nanotubes, which may comprise hydrogen storage encapsulation, composite material implementation, lightning protection for aircraft, aircraft icing mitigation, reduced weight of airframes/satellites, and alleviation of challenges related to future space launch. This study further sheds light onto recent actualized implementations of carbon nanotubes in aerospace applications, as well as current and prospective challenges related to their usage in aerospace sciences, encompassing health and safety hazards, large scale manufacturing, achievement of optimum properties, recycling, and environmental impacts.

  4. Nation-wide assessment of climate change impacts on crops in the Philippines and Peru as part of multi-disciplinary modelling framework

    NASA Astrophysics Data System (ADS)

    Fujisawa, Mariko; Kanamaru, Hideki

    2016-04-01

    Agriculture is vulnerable to environmental changes, and climate change has been recognized as one of the most devastating factors. In many developing countries, however, few studies have focused on nation-wide assessment of crop yield and crop suitability in the future, and hence there is a large pressure on science to provide policy makers with solid predictions for major crops in the countries in support of climate risk management policies and programmes. FAO has developed the tool MOSAICC (Modelling System for Agricultural Impacts of Climate Change) where statistical climate downscaling is combined with crop yield projections under climate change scenarios. Three steps are required to get the results: 1. The historical meteorological data such as temperature and precipitation for about 30 years were collected, and future climates were statistically downscaled to the local scale, 2. The historical crop yield data were collected and regression functions were made to estimate the yield by using observed climatic data and water balance during the growing period for each crop, and 3. The yield changes in the future were estimated by using the future climate data, produced by the first step, as an input to the yield regression functions. The yield was first simulated at sub-national scale and aggregated to national scale, which is intended to provide national policies with adaptation options. The methodology considers future changes in characteristics of extreme weather events as the climate projections are on daily scale while crop simulations are on 10-daily scale. Yields were simulated with two greenhouse gas concentration pathways (RCPs) for three GCMs per crop to account for uncertainties in projections. The crop assessment constitutes a larger multi-disciplinary assessment of climate change impacts on agriculture and vulnerability of livelihoods in terms of food security (e.g. water resources, agriculture market, household-level food security from socio-economic perspective). In our presentation we will show the cases of Peru and the Philippines, and discuss the implications for agriculture policies and risk management.
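    Steps 2 and 3 of the workflow described above (a statistical yield function driven by growing-season climate, re-evaluated with downscaled future climate) can be sketched as follows; all yield data, predictors, and coefficients are invented for illustration.

```python
# Sketch of a statistical crop-yield function: regress historical yields on
# growing-season climate, then apply the fit to downscaled future climate.
# All numbers are invented for illustration.
import numpy as np

# Historical record (~30 years): growing-season mean temperature (degC),
# precipitation (mm), and observed yield (t/ha).
rng = np.random.default_rng(1)
temp = 24.0 + rng.normal(0, 1.0, 30)
precip = 900.0 + rng.normal(0, 120.0, 30)
yield_obs = (3.5 - 0.15 * (temp - 24.0) + 0.001 * (precip - 900.0)
             + rng.normal(0, 0.2, 30))

# Fit yield = b0 + b1*T + b2*P by ordinary least squares.
X = np.column_stack([np.ones_like(temp), temp, precip])
b, *_ = np.linalg.lstsq(X, yield_obs, rcond=None)

# Apply the regression to a downscaled future climate (e.g. +2 degC, -5% precip).
temp_future = temp + 2.0
precip_future = precip * 0.95
yield_future = b[0] + b[1] * temp_future + b[2] * precip_future
print("mean yield, historical:      %.2f t/ha" % yield_obs.mean())
print("mean yield, future scenario: %.2f t/ha" % yield_future.mean())
```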

  5. On the Uses of Full-Scale Schlieren Flow Visualization

    NASA Astrophysics Data System (ADS)

    Settles, G. S.; Miller, J. D.; Dodson-Dreibelbis, L. J.

    2000-11-01

    A lens-and-grid-type schlieren system using a very large grid as a light source was described at earlier APS/DFD meetings. With a field-of-view of 2.3x2.9 m (7.5x9.5 feet), it is the largest indoor schlieren system in the world. Still and video examples of several full-scale airflows and heat-transfer problems visualized thus far will be shown. These include: heating and ventilation airflows, flows due to appliances and equipment, the thermal plumes of people, the aerodynamics of an explosive trace detection portal, gas leak detection, shock wave motion associated with aviation security problems, and heat transfer from live crops. Planned future projects include visualizing fume-hood and grocery display freezer airflows and studying the dispersion of insect repellent plumes at full scale.

  6. Using the Personality Assessment Inventory Antisocial and Borderline Features Scales to Predict Behavior Change.

    PubMed

    Penson, Brittany N; Ruchensky, Jared R; Morey, Leslie C; Edens, John F

    2016-11-01

    A substantial amount of research has examined the developmental trajectory of antisocial behavior and, in particular, the relationship between antisocial behavior and maladaptive personality traits. However, research typically has not controlled for previous behavior (e.g., past violence) when examining the utility of personality measures, such as self-report scales of antisocial and borderline traits, in predicting future behavior (e.g., subsequent violence). Examination of the potential interactive effects of measures of both antisocial and borderline traits also is relatively rare in longitudinal research predicting adverse outcomes. The current study utilizes a large sample of youthful offenders ( N = 1,354) from the Pathways to Desistance project to examine the separate effects of the Personality Assessment Inventory Antisocial Features (ANT) and Borderline Features (BOR) scales in predicting future offending behavior as well as trends in other negative outcomes (e.g., substance abuse, violence, employment difficulties) over a 1-year follow-up period. In addition, an ANT × BOR interaction term was created to explore the predictive effects of secondary psychopathy. ANT and BOR both explained unique variance in the prediction of various negative outcomes even after controlling for past indicators of those same behaviors during the preceding year.
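    The ANT × BOR analysis described above amounts to regressing a follow-up outcome on both scales, their product, and a baseline-behavior covariate; the sketch below uses simulated scores and placeholder variable names rather than the Pathways to Desistance measures.

```python
# Sketch of the ANT x BOR interaction analysis: predict a follow-up outcome
# from both scales, their interaction, and a baseline behaviour covariate.
# Simulated data; variable names are placeholders for the study's measures.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 1354
df = pd.DataFrame({
    "ANT": rng.normal(60, 10, n),          # Antisocial Features scores
    "BOR": rng.normal(58, 10, n),          # Borderline Features scores
    "past_offending": rng.poisson(2, n),   # baseline behaviour control
})
# Simulated outcome with main effects plus a small ANT x BOR interaction.
df["offending_1yr"] = (0.03 * df["ANT"] + 0.02 * df["BOR"]
                       + 0.0015 * df["ANT"] * df["BOR"]
                       + 0.4 * df["past_offending"]
                       + rng.normal(0, 2, n))

# 'ANT * BOR' expands to ANT + BOR + ANT:BOR, so both main effects and the
# interaction are estimated while controlling for past behaviour.
model = smf.ols("offending_1yr ~ ANT * BOR + past_offending", data=df).fit()
print(model.params.round(4))
```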

  7. The asymmetric impact of global warming on US drought types and distributions in a large ensemble of 97 hydro-climatic simulations.

    PubMed

    Huang, Shengzhi; Leng, Guoyong; Huang, Qiang; Xie, Yangyang; Liu, Saiyan; Meng, Erhao; Li, Pei

    2017-07-19

    Projection of future drought often involves large uncertainties from climate models, emission scenarios as well as drought definitions. In this study, we investigate changes in future droughts in the conterminous United States based on 97 1/8-degree hydro-climate model projections. Instead of focusing on a specific drought type, we investigate changes in meteorological, agricultural, and hydrological drought as well as their concurrence. Agricultural and hydrological droughts are projected to become more frequent with increase in global mean temperature, while less meteorological drought is expected. Changes in drought intensity scale linearly with global temperature rises under the RCP8.5 scenario, indicating that future drought severity could potentially be derived for a given amount of global warming under this scenario. The changing pattern of concurrent droughts generally follows that of agricultural and hydrological droughts. Under the 1.5 °C warming target advocated in the recent Paris Agreement, several hot-spot regions experiencing the most severe droughts are identified. Extreme droughts show similar patterns but with much larger magnitude than the climatology. This study highlights the distinct response of droughts of various types to global warming and the asymmetric impact of global warming on drought distribution, resulting in a much stronger influence on extreme drought than on mean drought.

  8. Lessons Learned From Large-Scale Evapotranspiration and Root Zone Soil Moisture Mapping Using Ground Measurements (meteorological, LAS, EC) and Remote Sensing (METRIC)

    NASA Astrophysics Data System (ADS)

    Hendrickx, J. M. H.; Allen, R. G.; Myint, S. W.; Ogden, F. L.

    2015-12-01

    Large scale mapping of evapotranspiration and root zone soil moisture is only possible when satellite images are used. The spatial resolution of this imagery typically depends on its temporal resolution or the satellite overpass time. For example, the Landsat satellite acquires images at 30 m resolution every 16 days while the MODIS satellite acquires images at 250 m resolution every day. In this study we deal with optical/thermal imagery that is impacted by cloudiness contrary to radar imagery that penetrates through clouds. Due to cloudiness, the temporal resolution of Landsat drops from 16 days to about one clear sky Landsat image per month in the southwestern USA and about one every ten years in the humid tropics of Panama. Only by launching additional satellites can the temporal resolution be improved. Since this is too costly, an alternative is found by using ground measurements with high temporal resolution (from minutes to days) but poor spatial resolution. The challenge for large-scale evapotranspiration and root zone soil moisture mapping is to construct a layer stack consisting of N time layers covering the period of interest each containing M pixels covering the region of interest. We will present examples of the Phoenix Active Management Area in AZ (14,600 km2), Green River Basin in WY (44,000 km2), the Kishwaukee Watershed in IL (3,150 km2), the area covered by Landsat Path 28/Row 35 in OK (30,000 km2) and the Agua Salud Watershed in Panama (200 km2). In these regions we used Landsat or MODIS imagery for mapping evapotranspiration and root zone soil moisture by the algorithm Mapping EvapoTranspiration at high Resolution with Internalized Calibration (METRIC) together with meteorological measurements and sometimes either Large Aperture Scintillometers (LAS) or Eddy Covariance (EC). We conclude with lessons learned for future large-scale hydrological studies.
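    The layer-stack bookkeeping described above (N acquisition dates by M pixels, with cloud-masked dates gap-filled in time) can be sketched with a simple array structure; the ET values and the cloud mask below are synthetic placeholders.

```python
# Sketch of an ET layer stack: N dates x (rows x cols) of per-pixel ET, with
# cloud-masked dates gap-filled by linear interpolation in time.
# ET values and the cloud mask are synthetic placeholders.
import numpy as np

n_dates, rows, cols = 12, 100, 100           # e.g. ~monthly clear-sky scenes
dates = np.arange(n_dates)

rng = np.random.default_rng(3)
et_stack = (3.0 + 2.0 * np.sin(2 * np.pi * dates / 12.0)[:, None, None]
            + rng.normal(0, 0.3, (n_dates, rows, cols)))   # mm/day
cloudy = rng.random((n_dates, rows, cols)) < 0.2           # ~20% of pixels masked
et_stack[cloudy] = np.nan

def gap_fill_pixel(series, t):
    """Linearly interpolate the cloud-masked (NaN) dates of one pixel's series."""
    good = ~np.isnan(series)
    return np.interp(t, t[good], series[good])

filled = np.empty_like(et_stack)
for r in range(rows):
    for c in range(cols):
        filled[:, r, c] = gap_fill_pixel(et_stack[:, r, c], dates)

annual_total = filled.sum(axis=0) * 30.0     # rough mm/yr from ~monthly means
print("mean annual ET over the domain: %.0f mm" % annual_total.mean())
```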

  9. BECCS capability of dedicated bioenergy crops under a future land-use scenario targeting net negative carbon emissions

    NASA Astrophysics Data System (ADS)

    Kato, E.; Yamagata, Y.

    2014-12-01

    Bioenergy with Carbon Capture and Storage (BECCS) is a key component of mitigation strategies in future socio-economic scenarios that aim to keep mean global temperature rise below 2°C above pre-industrial, which would require net negative carbon emissions in the end of the 21st century. Because of the additional need for land, developing sustainable low-carbon scenarios requires careful consideration of the land-use implications of deploying large-scale BECCS. We evaluated the feasibility of the large-scale BECCS in RCP2.6, which is a scenario with net negative emissions aiming to keep the 2°C temperature target, with a top-down analysis of required yields and a bottom-up evaluation of BECCS potential using a process-based global crop model. Land-use change carbon emissions related to the land expansion were examined using a global terrestrial biogeochemical cycle model. Our analysis reveals that first-generation bioenergy crops would not meet the required BECCS of the RCP2.6 scenario even with a high fertilizer and irrigation application. Using second-generation bioenergy crops can marginally fulfill the required BECCS only if a technology of full post-process combustion CO2 capture is deployed with a high fertilizer application in the crop production. If such an assumed technological improvement does not occur in the future, more than doubling the area for bioenergy production for BECCS around 2050 assumed in RCP2.6 would be required, however, such scenarios implicitly induce large-scale land-use changes that would cancel half of the assumed CO2 sequestration by BECCS. Otherwise a conflict of land-use with food production is inevitable.

  10. BECCS capability of dedicated bioenergy crops under a future land-use scenario targeting net negative carbon emissions

    NASA Astrophysics Data System (ADS)

    Kato, Etsushi; Yamagata, Yoshiki

    2014-09-01

    Bioenergy with Carbon Capture and Storage (BECCS) is a key component of mitigation strategies in future socioeconomic scenarios that aim to keep mean global temperature rise below 2°C above preindustrial, which would require net negative carbon emissions in the end of the 21st century. Because of the additional need for land, developing sustainable low-carbon scenarios requires careful consideration of the land-use implications of deploying large scale BECCS. We evaluated the feasibility of the large-scale BECCS in RCP2.6, which is a scenario with net negative emissions aiming to keep the 2°C temperature target, with a top-down analysis of required yields and a bottom-up evaluation of BECCS potential using a process-based global crop model. Land-use change carbon emissions related to the land expansion were examined using a global terrestrial biogeochemical cycle model. Our analysis reveals that first-generation bioenergy crops would not meet the required BECCS of the RCP2.6 scenario even with a high-fertilizer and irrigation application. Using second-generation bioenergy crops can marginally fulfill the required BECCS only if a technology of full postprocess combustion CO2 capture is deployed with a high-fertilizer application in the crop production. If such an assumed technological improvement does not occur in the future, more than doubling the area for bioenergy production for BECCS around 2050 assumed in RCP2.6 would be required; however, such scenarios implicitly induce large-scale land-use changes that would cancel half of the assumed CO2 sequestration by BECCS. Otherwise, a conflict of land use with food production is inevitable.

  11. Wavelength-tunable filter utilizing non-cyclic arrayed waveguide grating to create colorless, directionless, contentionless ROADMs

    NASA Astrophysics Data System (ADS)

    Niwa, Masaki; Takashina, Shoichi; Mori, Yojiro; Hasegawa, Hiroshi; Sato, Ken-ichi; Watanabe, Toshio

    2015-01-01

    With the continuous increase in Internet traffic, reconfigurable optical add-drop multiplexers (ROADMs) have been widely adopted in the core and metro core networks. Current ROADMs, however, allow only static operation. To realize future dynamic optical-network services, and to minimize any human intervention in network operation, the optical signal add/drop part should have colorless/directionless/contentionless (C/D/C) capabilities. This is possible with matrix switches or a combination of splitter-switches and optical tunable filters. The scale of the matrix switch increases with the square of the number of supported channels, and hence, the matrix-switch-based architecture is not suitable for creating future large-scale ROADMs. In contrast, the numbers of splitter ports, switches, and tunable filters increase linearly with the number of supported channels, and hence the tunable-filter-based architecture will support all future traffic. So far, we have succeeded in fabricating a compact tunable filter that consists of multi-stage cyclic arrayed-waveguide gratings (AWGs) and switches by using planar-lightwave-circuit (PLC) technologies. However, this multistage configuration suffers from large insertion loss and filter narrowing. Moreover, power-consuming temperature control is necessary since it is difficult to make cyclic AWGs athermal. We propose here novel tunable-filter architecture that sandwiches a single-stage non-cyclic athermal AWG having flatter-topped passbands between small-scale switches. With this configuration, the optical tunable filter attains low insertion loss, large passband bandwidths, low power consumption, compactness, and high cost-effectiveness. A prototype is monolithically fabricated with PLC technologies and its excellent performance is experimentally confirmed utilizing 80-channel 30-GBaud dual-polarization quadrature phase-shift-keying (QPSK) signals.
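    The scaling contrast stated above can be made concrete with a rough component count; the bookkeeping below is schematic (N denotes the number of add/drop channels) and is not an inventory of the prototype hardware.

```latex
% Let N be the number of add/drop channels the ROADM must support.
% A non-blocking matrix switch that can route any of the N incoming channels
% to any of the N transponders needs a crosspoint count of roughly
N_{\mathrm{crosspoints}} \sim N \times N = N^{2},
% whereas the broadcast-and-select design needs about one splitter port, one
% small switch, and one tunable filter per channel,
N_{\mathrm{components}} \sim \mathcal{O}(N),
% i.e. quadratic versus linear growth in channel count, as contrasted above.
```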

  12. Spatially Resolved Spectroscopy of Narrow-line Seyfert 1 Host Galaxies

    NASA Astrophysics Data System (ADS)

    Scharwächter, J.; Husemann, B.; Busch, G.; Komossa, S.; Dopita, M. A.

    2017-10-01

    We present optical integral field spectroscopy for five z < 0.062 narrow-line Seyfert 1 (NLS1) galaxies, probing their host galaxies at ≳2–3 kpc scales. Emission lines from the active galactic nucleus (AGN) and the large-scale host galaxy are analyzed separately, based on an AGN-host decomposition technique. The host galaxy gas kinematics indicates large-scale gas rotation in all five sources. At the probed scales of ≳2–3 kpc, the host galaxy gas is found to be predominantly ionized by star formation without any evidence of a strong AGN contribution. None of the five objects shows specific star formation rates (SFRs) exceeding the main sequence of low-redshift star-forming galaxies. The specific SFRs for MCG-05-01-013 and WPVS 007 are roughly consistent with the main sequence, while ESO 399-IG20, MS 22549-3712, and TON S180 show lower specific SFRs, intermediate to the main sequence and the red quiescent galaxies. The host galaxy metallicities, derived for the two sources with sufficient data quality (ESO 399-IG20 and MCG-05-01-013), indicate central oxygen abundances just below the low-redshift mass-metallicity relation. Based on this initial case study, we outline a comparison of AGN and host galaxy parameters as a starting point for future extended NLS1 studies with similar methods.

  13. A techno-economic & environmental analysis of a novel technology utilizing an internal combustion engine as a compact, inexpensive micro-reformer for a distributed gas-to-liquids system

    NASA Astrophysics Data System (ADS)

    Browne, Joshua B.

    Anthropogenic greenhouse gas emissions (GHG) contribute to global warming, and must be mitigated. With GHG mitigation as an overarching goal, this research aims to study the potential for newfound and abundant sources of natural gas to play a role as part of a GHG mitigation strategy. However, recent work suggests that methane leakage in the current natural gas system may inhibit end-use natural gas as a robust mitigation strategy, but that natural gas as a feedstock for other forms of energy, such as electricity generation or liquid fuels, may support natural-gas based mitigation efforts. Flaring of uneconomic natural gas, or outright loss of natural gas to the atmosphere results in greenhouse gas emissions that could be avoided and which today are very large in aggregate. A central part of this study is to look at a new technology for converting natural gas into methanol at a unit scale that is matched to the size of individual natural gas wells. The goal is to convert stranded or otherwise flared natural gas into a commercially valuable product and thereby avoid any unnecessary emission to the atmosphere. A major part of this study is to contribute to the development of a novel approach for converting natural gas into methanol and to assess the environmental impact (for better or for worse) of this new technology. This Ph. D. research contributes to the development of such a system and provides a comprehensive techno-economic and environmental assessment of this technology. Recognizing the distributed nature of methane leakage associated with the natural gas system, this work is also intended to advance previous research at the Lenfest Center for Sustainable Energy that aims to show that small, modular energy systems can be made economic. This thesis contributes to and analyzes the development of a small-scale gas-to-liquids (GTL) system aimed at addressing flared natural gas from gas and oil wells. This thesis includes system engineering around a design that converts natural gas to synthesis gas (syngas) in a reciprocating internal combustion engine and then converts the syngas into methanol in a small-scale reactor. With methanol as the product, this research aims to show that such a system can not only address current and future natural gas flaring regulation, but eventually can compete economically with historically large-scale, centralized methanol production infrastructure. If successful, such systems could contribute to a shift away from large, multi-billion dollar capital cost chemical plants towards smaller systems with shorter lifetimes that may decrease the time to transition to more sustainable forms of energy and chemical conversion technologies. This research also quantifies the potential for such a system to contribute to mitigating GHG emissions, not only by addressing flared gas in the near-term, but also supporting future natural gas infrastructure ideas that may help to redefine the way the current natural gas pipeline system is used. The introduction of new, small-scale, distributed energy and chemical conversion systems located closer to the point of extraction may contribute to reducing methane leakage throughout the natural gas distribution system by reducing the reliance and risks associated with the aging natural gas pipeline infrastructure. The outcome of this thesis will result in several areas for future work. 
From an economic perspective, factors that contribute to overall system cost, such as operation and maintenance (O&M) costs and the capital cost multiplier (referred to as the Lang Factor for large-scale petrochemical plants), are not yet known for novel systems such as the technology presented here. From a technical perspective, commercialization of small-scale, distributed chemical conversion systems may create demand for economical compression and air-separation technologies at this scale, which do not currently exist. Further, new business cases may arise aimed at utilizing small, remote sources of methane, such as biogas from agricultural and municipal waste. Finally, while methanol was selected as the end-product for this thesis, future applications of this technology may consider converting methane to hydrogen, ammonia, or ethylene, for example, challenging the orthodoxy in the chemical industry that "bigger is better."
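    As a toy illustration of the Lang Factor idea mentioned above, the sketch below scales a purchased-equipment cost by a capital cost multiplier to approximate total installed cost; the factor value and equipment items are hypothetical placeholders, not figures from this thesis.

        # Hedged sketch: Lang-factor style capital cost estimate.
        # The factor and equipment costs are illustrative assumptions only.
        def total_capital_cost(equipment_costs, lang_factor=3.6):
            """Installed cost approximated as Lang factor x purchased-equipment cost."""
            return lang_factor * sum(equipment_costs.values())

        equipment = {  # purchased-equipment costs in USD (hypothetical)
            "engine_reformer": 50_000,
            "methanol_reactor": 80_000,
            "compressor": 40_000,
        }
        print(f"Estimated installed cost: ${total_capital_cost(equipment):,.0f}")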

  14. Beyond the plane-parallel and Newtonian approach: wide-angle redshift distortions and convergence in general relativity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertacca, Daniele; Maartens, Roy; Raccanelli, Alvise

    We extend previous analyses of wide-angle correlations in the galaxy power spectrum in redshift space to include all general relativistic effects. These general relativistic corrections to the standard approach become important on large scales and at high redshifts, and they lead to new terms in the wide-angle correlations. We show that, in principle, the new terms can produce corrections of nearly 10% relative to the usual Newtonian approximation on Gpc scales. General relativistic corrections will be important for future large-volume surveys such as SKA and Euclid, although cosmic variance will make them challenging to observe.

  15. Culture and cognition in health systems change.

    PubMed

    Evans, Jenna M; Baker, G Ross; Berta, Whitney; Barnsley, Jan

    2015-01-01

    Large-scale change involves modifying not only the structures and functions of multiple organizations, but also the mindsets and behaviours of diverse stakeholders. This paper focuses on the latter: the informal, less visible, and often neglected psychological and social factors implicated in change efforts. The purpose of this paper is to differentiate between the concepts of organizational culture and mental models, to argue for the value of applying a shared mental models (SMM) framework to large-scale change, and to suggest directions for future research. The authors provide an overview of SMM theory and use it to explore the dynamic relationship between culture and cognition. The contributions and limitations of the theory for change efforts are also discussed. Culture and cognition are complementary perspectives, providing insight into two different levels of the change process. SMM theory draws attention to important questions that add value to existing perspectives on large-scale change. The authors outline these questions for future research and argue that research and practice in this domain may be best served by focusing less on the potentially narrow goal of "achieving consensus" and more on identifying, understanding, and managing cognitive convergences and divergences as part of broader research and change management programmes. Drawing from both cultural and cognitive paradigms can provide researchers with a more complete picture of the processes by which coordinated action is achieved in complex change initiatives in the healthcare domain.

  16. Integrated water and renewable energy management: the Acheloos-Peneios region case study

    NASA Astrophysics Data System (ADS)

    Koukouvinos, Antonios; Nikolopoulos, Dionysis; Efstratiadis, Andreas; Tegos, Aristotelis; Rozos, Evangelos; Papalexiou, Simon-Michael; Dimitriadis, Panayiotis; Markonis, Yiannis; Kossieris, Panayiotis; Tyralis, Christos; Karakatsanis, Georgios; Tzouka, Katerina; Christofides, Antonis; Karavokiros, George; Siskos, Alexandros; Mamassis, Nikos; Koutsoyiannis, Demetris

    2015-04-01

    Within the ongoing research project "Combined Renewable Systems for Sustainable Energy Development" (CRESSENDO), we have developed a novel stochastic simulation framework for optimal planning and management of large-scale hybrid renewable energy systems, in which hydropower plays the dominant role. The methodology and associated computer tools are tested in two major adjacent river basins in Greece (Acheloos, Peneios) extending over 15 500 km2 (12% of Greek territory). River Acheloos is characterized by very high runoff and holds ~40% of the installed hydropower capacity of Greece. On the other hand, the Thessaly plain drained by Peneios - a key agricultural region for the national economy - usually suffers from water scarcity and systematic environmental degradation. The two basins are interconnected through diversion projects, existing and planned, thus forming a unique large-scale hydrosystem whose future has been the subject of great controversy. The study area is viewed as a hypothetically closed, energy-autonomous system, in order to evaluate the prospects for sustainable development of its water and energy resources. In this context we seek an efficient configuration of the necessary hydraulic and renewable energy projects through integrated modelling of the water and energy balance. We investigate several scenarios of energy demand for domestic, industrial and agricultural use, assuming that part of the demand is fulfilled via wind and solar energy, while the excess or deficit of energy is regulated through large hydroelectric works that are equipped with pumped-storage facilities. The overall goal is to examine under which conditions a fully renewable energy system can be technically and economically viable at such a large spatial scale.
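    As a rough illustration of the regulation rule described above (surplus wind and solar energy is pumped into storage, deficits are covered by releasing it through turbines), the sketch below steps through a hypothetical hourly energy balance; the efficiencies, capacity and time series are illustrative assumptions, not values from the CRESSENDO framework.

        # Hedged sketch of an hourly energy balance with pumped storage.
        # All numbers are hypothetical placeholders.
        def simulate(demand, wind_solar, reservoir=0.0, capacity=1000.0,
                     pump_eff=0.75, turbine_eff=0.90):
            hydro_output = []
            for d, g in zip(demand, wind_solar):
                surplus = g - d
                if surplus >= 0:
                    # pump excess renewable energy into the reservoir
                    reservoir = min(capacity, reservoir + pump_eff * surplus)
                    hydro_output.append(0.0)
                else:
                    # cover the deficit (as far as storage allows) from the reservoir
                    release = min(-surplus / turbine_eff, reservoir)
                    reservoir -= release
                    hydro_output.append(turbine_eff * release)
            return hydro_output, reservoir

        hydro, storage_left = simulate(demand=[100, 120, 90], wind_solar=[80, 150, 60])
        print(hydro, storage_left)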

  17. A pilot study to understand feasibility and acceptability of stool and cord blood sample collection for a large-scale longitudinal birth cohort.

    PubMed

    Bailey, S R; Townsend, C L; Dent, H; Mallet, C; Tsaliki, E; Riley, E M; Noursadeghi, M; Lawley, T D; Rodger, A J; Brocklehurst, P; Field, N

    2017-12-28

    Few data are available to guide biological sample collection around the time of birth for large-scale birth cohorts. We are designing a large UK birth cohort to investigate the role of infection and the developing immune system in determining future health and disease. We undertook a pilot to develop methodology for the main study, gain practical experience of collecting samples, and understand the acceptability of sample collection to women in late pregnancy. Between February and July 2014, we piloted the feasibility and acceptability of collecting maternal stool, baby stool and cord blood samples from participants recruited at prolonged pregnancy and planned pre-labour caesarean section clinics at University College London Hospital. Participating women were asked to complete acceptability questionnaires. Overall, 265 women were approached and 171 (65%) participated, with ≥1 sample collected from 113 women or their babies (66%). Women had a mean age of 34 years, were primarily of white ethnicity (130/166, 78%), and half were nulliparous (86/169, 51%). Women undergoing planned pre-labour caesarean section were more likely than those who delivered vaginally to provide ≥1 sample (98% vs 54%), but less likely to provide maternal stool (10% vs 43%). Pre-sample questionnaires were completed by 110/171 women (64%). Most women reported feeling comfortable with samples being collected from their baby (<10% uncomfortable), but were less comfortable about their own stool (19% uncomfortable) or a vaginal swab (24% uncomfortable). It is possible to collect a range of biological samples from women around the time of delivery, and this was acceptable for most women. These data inform study design and protocol development for large-scale birth cohorts.

  18. Detonation failure characterization of non-ideal explosives

    NASA Astrophysics Data System (ADS)

    Janesheski, Robert S.; Groven, Lori J.; Son, Steven

    2012-03-01

    Non-ideal explosives are currently poorly characterized, which limits efforts to model them. Current characterization requires large-scale testing to obtain steady detonation-wave data for analysis, owing to the relatively thick reaction zones. A microwave interferometer applied to small-scale, confined, transient experiments is being implemented to allow time-resolved characterization of a failing detonation. The microwave interferometer measures the position of a failing detonation wave in a tube that is initiated with a booster charge. Experiments have been performed with ammonium nitrate and various fuel compositions (diesel fuel and mineral oil). It was observed that the failure dynamics are influenced by factors such as chemical composition and confiner thickness. Future work is planned to calibrate models to these small-scale experiments and eventually validate the models with available large-scale experiments. The experiment is shown to be repeatable, shows dependence on reactive properties, and can be performed with little required material.
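    As a sketch of how interferometer records of this kind are commonly reduced to position histories, the snippet below assumes the standard relation that each 2π of unwrapped phase of the reflected signal corresponds to half a guided wavelength of front travel; the wavelength and phase history are hypothetical placeholders, not data from these experiments.

        # Hedged sketch: convert unwrapped interferometer phase to front position.
        # Guided wavelength and phase history are illustrative assumptions.
        import numpy as np

        def front_position(phase_rad, guided_wavelength_m):
            """Half a guided wavelength of travel per 2*pi of reflected phase."""
            return (guided_wavelength_m / 2.0) * phase_rad / (2.0 * np.pi)

        t = np.linspace(0.0, 20e-6, 200)                            # 20 microseconds
        raw_phase = np.angle(np.exp(1j * (2e6 * t - 3e10 * t**2)))  # placeholder, decelerating
        phase = np.unwrap(raw_phase)
        x = front_position(phase, guided_wavelength_m=0.04)         # ~4 cm guided wavelength
        velocity = np.gradient(x, t)                                # slows as the wave fails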

  19. To the horizon and beyond: Weak lensing of the CMB and binary inspirals into horizonless objects

    NASA Astrophysics Data System (ADS)

    Kesden, Michael

    This thesis examines two predictions of general relativity: weak lensing and gravitational waves. The cosmic microwave background (CMB) is gravitationally lensed by the large-scale structure between the observer and the last-scattering surface. This weak lensing induces non-Gaussian correlations that can be used to construct estimators for the deflection field. The error and bias of these estimators are derived and used to analyze the viability of lensing reconstruction for future CMB experiments. Weak lensing also affects the one-point probability distribution function of the CMB. The skewness and kurtosis induced by lensing and the Sunyaev-Zel'dovich (SZ) effect are calculated as functions of the angular smoothing scale of the map. While these functions offer the advantage of easy computability, only the skewness from lensing-SZ correlations can potentially be detected, even in the limit of the largest amplitude fluctuations allowed by observation. Lensing estimators are also essential to constrain inflation, the favored explanation for large-scale isotropy and the origin of primordial perturbations. B-mode polarization is considered to be a "smoking-gun" signature of inflation, and lensing estimators can be used to recover primordial B-modes from lensing-induced contamination. The ability of future CMB experiments to constrain inflation is assessed as a function of survey size and instrumental sensitivity. A final application of lensing estimators is to constrain a possible cutoff in primordial density perturbations on near-horizon scales. The paucity of independent modes on such scales limits the statistical certainty of such a constraint. Measurements of the deflection field can be used to constrain at the 3σ level the existence of a cutoff large enough to account for current CMB observations. A final chapter of this thesis considers an independent topic: the gravitational-wave (GW) signature of a binary inspiral into a horizonless object. If the supermassive objects at galactic centers lack the horizons of traditional black holes, inspiraling objects could emit GWs after passing within their surfaces. The GWs produced by such an inspiral are calculated, revealing distinctive features potentially observable by future GW observatories.

  20. Synoptic-scale circulation patterns during summer derived from tree rings in mid-latitude Asia

    NASA Astrophysics Data System (ADS)

    Seim, Andrea; Schultz, Johannes A.; Leland, Caroline; Davi, Nicole; Byambasuren, Oyunsanaa; Liang, Eryuan; Wang, Xiaochun; Beck, Christoph; Linderholm, Hans W.; Pederson, Neil

    2017-09-01

    Understanding past and recent climate and atmospheric circulation variability is vital for regions that are affected by climate extremes. In mid-latitude Asia, however, the synoptic climatology is complex and not yet fully understood. The aim of this study was to investigate dominant synoptic-scale circulation patterns during the summer season using a multi-species tree-ring width (TRW) network comprising 78 sites from mid-latitude Asia. For each TRW chronology, we calculated an atmospheric circulation tree-ring index (ACTI), based on 1000 hPa geopotential height data, to directly link tree growth to 13 summertime weather types and their associated local climate conditions for the period 1871-1993. Using the ACTI, three groups of similarly responding tree-ring sites can be associated with distinct large-scale atmospheric circulation patterns: 1. growth of drought sensitive trees is positively affected by a cyclone over northern Russia; 2. temperature sensitive trees show positive associations to a cyclone over northwestern Russia and an anticyclone over Mongolia; 3. trees at two high elevation sites show positive relations to a zonal cyclone extending from mid-latitude Eurasia to the West Pacific. The identified synoptic-scale circulation patterns showed spatiotemporal variability in their intensity and position, causing temporally varying climate conditions in mid-latitude Asia. Our results highlight that for regions with less pronounced atmospheric action centers during summer such as the occurrence of large-scale cyclones and anticyclones, synoptic-scale circulation patterns can be extracted and linked to the Northern Hemisphere circulation system. Thus, we provide a new and solid envelope for climate studies covering the past to the future.

  1. New Markets for Solar Photovoltaic Power Systems

    NASA Astrophysics Data System (ADS)

    Thomas, Chacko; Jennings, Philip; Singh, Dilawar

    2007-10-01

    Over the past five years solar photovoltaic (PV) power supply systems have matured and are now being deployed on a much larger scale. Traditional small-scale remote-area power supply systems are still important, and village electrification is also a large and growing market, but large-scale, grid-connected systems and building-integrated systems are now being deployed in many countries. This growth has been aided by imaginative government policies in several countries and the overall result is a growth rate of over 40% per annum in the sales of PV systems. Optimistic forecasts are being made about the future of PV power as a major source of sustainable energy. Plans are now being formulated by the IEA for very large-scale PV installations of more than 100 MW peak output. The Australian Government has announced a subsidy for a large solar photovoltaic power station of 154 MW in Victoria, based on the concentrator technology developed in Australia. In Western Australia a proposal has been submitted to the State Government for a 2 MW photovoltaic power system to provide fringe-of-grid support at Perenjori. This paper outlines the technologies, designs, management and policies that underpin these exciting developments in solar PV power.

  2. Radiography with cosmic-ray and compact accelerator muons; Exploring inner-structure of large-scale objects and landforms

    PubMed Central

    NAGAMINE, Kanetada

    2016-01-01

    Cosmic-ray muons (CRM) arriving from the sky at the surface of the earth are now used for radiography to explore the inner structure of large-scale objects and landforms ranging in thickness from the meter to the kilometer scale, such as volcanic mountains, blast furnaces, and nuclear reactors. At the same time, by using muons produced by compact accelerators (CAM), advanced radiography can be realized for objects with a thickness in the sub-millimeter to meter range, with additional exploration capabilities such as element identification and bio-chemical analysis. In the present report, principles, methods and specific research examples of CRM transmission radiography are summarized, after which principles, methods and prospects for future CAM radiography are described. PMID:27725469

  3. Radiography with cosmic-ray and compact accelerator muons; Exploring inner-structure of large-scale objects and landforms.

    PubMed

    Nagamine, Kanetada

    2016-01-01

    Cosmic-ray muons (CRM) arriving from the sky at the surface of the earth are now used for radiography to explore the inner structure of large-scale objects and landforms ranging in thickness from the meter to the kilometer scale, such as volcanic mountains, blast furnaces, and nuclear reactors. At the same time, by using muons produced by compact accelerators (CAM), advanced radiography can be realized for objects with a thickness in the sub-millimeter to meter range, with additional exploration capabilities such as element identification and bio-chemical analysis. In the present report, principles, methods and specific research examples of CRM transmission radiography are summarized, after which principles, methods and prospects for future CAM radiography are described.

  4. Cost-effectiveness comparison of response strategies to a large-scale anthrax attack on the chicago metropolitan area: impact of timing and surge capacity.

    PubMed

    Kyriacou, Demetrios N; Dobrez, Debra; Parada, Jorge P; Steinberg, Justin M; Kahn, Adam; Bennett, Charles L; Schmitt, Brian P

    2012-09-01

    Rapid public health response to a large-scale anthrax attack would reduce overall morbidity and mortality. However, there is uncertainty about the optimal cost-effective response strategy based on timing of intervention, public health resources, and critical care facilities. We conducted a decision analytic study to compare response strategies to a theoretical large-scale anthrax attack on the Chicago metropolitan area beginning either Day 2 or Day 5 after the attack. These strategies correspond to the policy options set forth by the Anthrax Modeling Working Group for population-wide responses to a large-scale anthrax attack: (1) postattack antibiotic prophylaxis, (2) postattack antibiotic prophylaxis and vaccination, (3) preattack vaccination with postattack antibiotic prophylaxis, and (4) preattack vaccination with postattack antibiotic prophylaxis and vaccination. Outcomes were measured in costs, lives saved, quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios (ICERs). We estimated that postattack antibiotic prophylaxis of all 1,390,000 anthrax-exposed people beginning on Day 2 after attack would result in 205,835 infected victims, 35,049 fulminant victims, and 28,612 deaths. Only 6,437 (18.5%) of the fulminant victims could be saved with the existing critical care facilities in the Chicago metropolitan area. Mortality would increase to 69,136 if the response strategy began on Day 5. Including postattack vaccination with antibiotic prophylaxis of all exposed people reduces mortality and is cost-effective for both Day 2 (ICER=$182/QALY) and Day 5 (ICER=$1,088/QALY) response strategies. Increasing ICU bed availability significantly reduces mortality for all response strategies. We conclude that postattack antibiotic prophylaxis and vaccination of all exposed people is the optimal cost-effective response strategy for a large-scale anthrax attack. Our findings support the US government's plan to provide antibiotic prophylaxis and vaccination for all exposed people within 48 hours of the recognition of a large-scale anthrax attack. Future policies should consider expanding critical care capacity to allow for the rescue of more victims.
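    For readers unfamiliar with the metric, the incremental cost-effectiveness ratio (ICER) quoted above is simply the cost difference between two strategies divided by their QALY difference; the sketch below computes it for hypothetical totals, not the study's inputs.

        # Hedged sketch of an ICER calculation with placeholder strategy totals.
        def icer(cost_new, qaly_new, cost_base, qaly_base):
            return (cost_new - cost_base) / (qaly_new - qaly_base)

        # hypothetical totals, chosen only to be of a plausible magnitude
        print(f"ICER = ${icer(1.10e9, 9.95e6, 1.00e9, 9.40e6):,.0f} per QALY")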

  5. Cost-Effectiveness Comparison of Response Strategies to a Large-Scale Anthrax Attack on the Chicago Metropolitan Area: Impact of Timing and Surge Capacity

    PubMed Central

    Dobrez, Debra; Parada, Jorge P.; Steinberg, Justin M.; Kahn, Adam; Bennett, Charles L.; Schmitt, Brian P.

    2012-01-01

    Rapid public health response to a large-scale anthrax attack would reduce overall morbidity and mortality. However, there is uncertainty about the optimal cost-effective response strategy based on timing of intervention, public health resources, and critical care facilities. We conducted a decision analytic study to compare response strategies to a theoretical large-scale anthrax attack on the Chicago metropolitan area beginning either Day 2 or Day 5 after the attack. These strategies correspond to the policy options set forth by the Anthrax Modeling Working Group for population-wide responses to a large-scale anthrax attack: (1) postattack antibiotic prophylaxis, (2) postattack antibiotic prophylaxis and vaccination, (3) preattack vaccination with postattack antibiotic prophylaxis, and (4) preattack vaccination with postattack antibiotic prophylaxis and vaccination. Outcomes were measured in costs, lives saved, quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios (ICERs). We estimated that postattack antibiotic prophylaxis of all 1,390,000 anthrax-exposed people beginning on Day 2 after attack would result in 205,835 infected victims, 35,049 fulminant victims, and 28,612 deaths. Only 6,437 (18.5%) of the fulminant victims could be saved with the existing critical care facilities in the Chicago metropolitan area. Mortality would increase to 69,136 if the response strategy began on Day 5. Including postattack vaccination with antibiotic prophylaxis of all exposed people reduces mortality and is cost-effective for both Day 2 (ICER=$182/QALY) and Day 5 (ICER=$1,088/QALY) response strategies. Increasing ICU bed availability significantly reduces mortality for all response strategies. We conclude that postattack antibiotic prophylaxis and vaccination of all exposed people is the optimal cost-effective response strategy for a large-scale anthrax attack. Our findings support the US government's plan to provide antibiotic prophylaxis and vaccination for all exposed people within 48 hours of the recognition of a large-scale anthrax attack. Future policies should consider expanding critical care capacity to allow for the rescue of more victims. PMID:22845046

  6. [Research progress on hydrological scaling].

    PubMed

    Liu, Jianmei; Pei, Tiefan

    2003-12-01

    With the development of hydrology and the growing impact of human activity on the environment, the scale issue has become a great challenge for many hydrologists, owing to the stochasticity and complexity of hydrological phenomena and natural catchments. Increasing attention has been paid to scaling, that is, inferring large-scale (or small-scale) hydrological characteristics from catchments known at another scale, but the problem has not yet been solved successfully. The first part of this paper introduces some concepts relating to hydrological scale, the scale issue and scaling. The key problem is the spatial heterogeneity of catchments and the temporal and spatial variability of hydrological fluxes. The third part puts forward three approaches to scaling: distributed modeling, fractal theory, and statistical self-similarity analysis. Existing problems and future research directions are proposed in the last part.
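    As a minimal illustration of the statistical self-similarity approach mentioned above, the sketch below fits a power law y = c * A**theta between a hydrological quantity and catchment area by linear regression in log-log space; the data are hypothetical.

        # Hedged sketch: fit a scaling (power-law) relation by log-log regression.
        # The catchment areas and peak flows are hypothetical placeholders.
        import numpy as np

        area = np.array([10.0, 50.0, 200.0, 1000.0, 5000.0])       # km^2
        peak_flow = np.array([12.0, 40.0, 110.0, 380.0, 1200.0])   # m^3/s

        theta, log_c = np.polyfit(np.log(area), np.log(peak_flow), 1)
        print(f"scaling exponent theta = {theta:.2f}, prefactor c = {np.exp(log_c):.2f}")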

  7. FY10 Report on Multi-scale Simulation of Solvent Extraction Processes: Molecular-scale and Continuum-scale Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wardle, Kent E.; Frey, Kurt; Pereira, Candido

    2014-02-02

    This task is aimed at predictive modeling of solvent extraction processes in typical extraction equipment through multiple simulation methods at various scales of resolution. We have conducted detailed continuum fluid dynamics simulation on the process unit level as well as simulations of the molecular-level physical interactions which govern extraction chemistry. Through combination of information gained through simulations at each of these two tiers, along with advanced techniques such as the Lattice Boltzmann Method (LBM) which can bridge these two scales, we can develop the tools to work towards predictive simulation for solvent extraction on the equipment scale (Figure 1). The goal of such a tool, along with enabling optimized design and operation of extraction units, would be to allow prediction of stage extraction efficiency under specified conditions. Simulation efforts on each of the two scales will be described below. As the initial application of FELBM in the work performed during FY10 has been on annular mixing, it will be discussed in the context of the continuum scale. In the future, however, it is anticipated that the real value of FELBM will be in its use as a tool for sub-grid model development through highly refined DNS-like multiphase simulations, facilitating exploration and development of droplet models, including breakup and coalescence, which will be needed for the large-scale simulations where droplet-level physics cannot be resolved. In this area, it can have a significant advantage over traditional CFD methods, as its high computational efficiency allows exploration of significantly greater physical detail, especially as computational resources increase in the future.

  8. Who Really Wants an Ambitious Large-Scale Restoration of the Seine Estuary? A Strategic Analysis of a Science-Policy Interface Locked in a Stalemate.

    PubMed

    Coreau, Audrey; Narcy, Jean-Baptiste; Lumbroso, Sarah

    2018-05-01

    The development of ecosystem knowledge is an essential condition for effective environmental management, but using available knowledge to solve environmental controversies is still difficult in "real" situations. This paper explores the conditions under which ecological knowledge could contribute to the environmental strategies and actions of stakeholders at the science-policy interface. Ecological restoration of the Seine estuary is an example of an environmental issue whose overall management has run into difficulties despite the production of a large amount of knowledge by a dedicated organization, GIP Seine Aval. Thanks to an action-research project, based on a futures study, we analyze the reasons for these difficulties and help the GIP Seine Aval adopt a robust strategy to overcome them. According to our results, most local stakeholders involved in the large-scale restoration project emphasize the need for a clear divide between knowledge production and environmental action. This kind of divide may be strategic in a context where the robustness of environmental decisions depends strongly on the mobilization of "neutral" scientific knowledge. But in our case study, this rather blocks action because some powerful stakeholders continuously ask for more knowledge before taking action. The construction and analysis of possible future scenarios has led to the identification of three alternative strategies to counter this stalemate: (1) to circumvent difficulties by creating indirect links between knowledge and actions; (2) to use knowledge to sustain advocacy for the interests of each and every stakeholder; (3) to involve citizens in decisions about knowledge production and use, so that environmental issues weigh more heavily on the local political agenda.

  9. Production of black holes in TeV-scale gravity

    NASA Astrophysics Data System (ADS)

    Ringwald, A.

    2003-07-01

    Copious production of microscopic black holes is one of the least model-dependent predictions of TeV-scale gravity scenarios. We review the arguments behind this assertion and discuss opportunities to track the striking associated signatures in the near future. These include searches at neutrino telescopes, such as AMANDA and RICE, at cosmic ray air shower facilities, such as the Pierre Auger Observatory, and at colliders, such as the Large Hadron Collider.

  10. Japan’s Nuclear Future: Policy Debate, Prospects, and U.S. Interests

    DTIC Science & Technology

    2008-05-09

    raised in particular over the construction of an industrial-scale reprocessing facility in Japan. Additionally, fast breeder reactors also produce more ... Nuclear Fuel Cycle Engineering Laboratories. A fast breeder reactor is a fast neutron reactor that produces more plutonium than it consumes, which can ... Japan Nuclear Fuel Limited (JNFL) has built and is currently running active testing on a large-scale commercial reprocessing plant at Rokkasho-mura

  11. Overview of Accelerator Applications in Energy

    NASA Astrophysics Data System (ADS)

    Garnett, Robert W.; Sheffield, Richard L.

    An overview of the application of accelerators and accelerator technology in energy is presented. Applications span a broad range of cost, size, and complexity and include large-scale systems requiring high-power or high-energy accelerators to drive subcritical reactors for energy production or waste transmutation, as well as small-scale industrial systems used to improve oil and gas exploration and production. The enabling accelerator technologies will also be reviewed and future directions discussed.

  12. Solar paint: From synthesis to printing

    DOE PAGES

    Zhou, Xiaojing; Belcher, Warwick; Dastoor, Paul

    2014-11-13

    Water-based polymer nanoparticle dispersions (solar paint) offer the prospect of addressing two of the main challenges associated with printing large area organic photovoltaic devices; namely, how to control the nanoscale architecture of the active layer and eliminate the need for hazardous organic solvents during device fabrication. We review progress in the field of nanoparticulate organic photovoltaic (NPOPV) devices and future prospects for large-scale manufacturing of solar cells based on this technology.

  13. Solar paint: From synthesis to printing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Xiaojing; Belcher, Warwick; Dastoor, Paul

    Water-based polymer nanoparticle dispersions (solar paint) offer the prospect of addressing two of the main challenges associated with printing large area organic photovoltaic devices; namely, how to control the nanoscale architecture of the active layer and eliminate the need for hazardous organic solvents during device fabrication. We review progress in the field of nanoparticulate organic photovoltaic (NPOPV) devices and future prospects for large-scale manufacturing of solar cells based on this technology.

  14. Long time existence from interior gluing

    NASA Astrophysics Data System (ADS)

    Chruściel, Piotr T.

    2017-07-01

    We prove completeness-to-the-future of null hypersurfaces emanating outwards from large spheres, in vacuum space-times evolving from general asymptotically flat data with well-defined energy-momentum. The proof uses scaling and a gluing construction to reduce the problem to Bieri’s stability theorem.

  15. Ordering Unstructured Meshes for Sparse Matrix Computations on Leading Parallel Systems

    NASA Technical Reports Server (NTRS)

    Oliker, Leonid; Li, Xiaoye; Heber, Gerd; Biswas, Rupak

    2000-01-01

    The ability of computers to solve hitherto intractable problems and simulate complex processes using mathematical models makes them an indispensable part of modern science and engineering. Computer simulations of large-scale realistic applications usually require solving a set of non-linear partial differential equations (PDEs) over a finite region. For example, one thrust area in the DOE Grand Challenge projects is to design future accelerators such as the Spallation Neutron Source (SNS). Our colleagues at SLAC need to model complex RFQ cavities with large aspect ratios. Unstructured grids are currently used to resolve the small features in a large computational domain; dynamic mesh adaptation will be added in the future for additional efficiency. The PDEs for electromagnetics are discretized by the finite element method (FEM), which leads to a generalized eigenvalue problem Kx = λMx, where K and M are the stiffness and mass matrices, and are very sparse. In a typical cavity model, the number of degrees of freedom is about one million. For such large eigenproblems, direct solution techniques quickly reach the memory limits. Instead, the most widely-used methods are Krylov subspace methods, such as Lanczos or Jacobi-Davidson. In all the Krylov-based algorithms, sparse matrix-vector multiplication (SPMV) must be performed repeatedly. Therefore, the efficiency of SPMV usually determines the eigensolver speed. SPMV is also one of the most heavily used kernels in large-scale numerical simulations.
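    To make the SPMV kernel concrete, the sketch below performs a sparse matrix-vector product in compressed sparse row (CSR) storage, a layout commonly used for such kernels; the small matrix is a toy example, and production eigensolvers would of course call an optimized library routine.

        # Hedged sketch: sparse matrix-vector multiply (SPMV) in CSR format.
        import numpy as np

        def csr_spmv(values, col_idx, row_ptr, x):
            y = np.zeros(len(row_ptr) - 1)
            for i in range(len(y)):
                for k in range(row_ptr[i], row_ptr[i + 1]):
                    y[i] += values[k] * x[col_idx[k]]
            return y

        # toy 3x3 matrix [[4, 0, 1], [0, 3, 0], [2, 0, 5]]
        values = np.array([4.0, 1.0, 3.0, 2.0, 5.0])
        col_idx = np.array([0, 2, 1, 0, 2])
        row_ptr = np.array([0, 2, 3, 5])
        print(csr_spmv(values, col_idx, row_ptr, np.array([1.0, 1.0, 1.0])))  # [5. 3. 7.]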

  16. Macroweather Predictions and Climate Projections using Scaling and Historical Observations

    NASA Astrophysics Data System (ADS)

    Hébert, R.; Lovejoy, S.; Del Rio Amador, L.

    2017-12-01

    There are two fundamental time scales that are pertinent to decadal forecasts and multidecadal projections. The first is the lifetime of planetary scale structures, about 10 days (equal to the deterministic predictability limit), and the second is - in the anthropocene - the scale at which the forced anthropogenic variability exceeds the internal variability (around 16 - 18 years). These two time scales define three regimes of variability: weather, macroweather and climate, which are respectively characterized by increasing, decreasing and then increasing variability with scale. We discuss how macroweather temperature variability can be skilfully predicted to its theoretical stochastic predictability limits by exploiting its long-range memory with the Stochastic Seasonal and Interannual Prediction System (StocSIPS). At multi-decadal timescales, the temperature response to forcing is approximately linear and this can be exploited to make projections with a Green's function, or Climate Response Function (CRF). To make the problem tractable, we exploit the temporal scaling symmetry and restrict our attention to global mean forcing and temperature response using a scaling CRF characterized by the scaling exponent H and an inner scale of linearity τ. An aerosol linear scaling factor α and a non-linear volcanic damping exponent ν were introduced to account for the large uncertainty in these forcings. We estimate the model and forcing parameters by Bayesian inference using historical data and these allow us to analytically calculate a median (and likely 66% range) for the transient climate response, and for the equilibrium climate sensitivity: 1.6K ([1.5,1.8]K) and 2.4K ([1.9,3.4]K) respectively. Aerosol forcing typically has large uncertainty and we find a modern (2005) forcing very likely range (90%) of [-1.0, -0.3] Wm-2 with median at -0.7 Wm-2. Projecting to 2100, we find that to keep the warming below 1.5 K, future emissions must undergo cuts similar to Representative Concentration Pathway (RCP) 2.6, for which the probability of remaining under 1.5 K is 48%. RCP 4.5 and RCP 8.5-like futures overshoot with very high probability. This underscores that over the next century, the state of the environment will be strongly influenced by past, present and future economic policies.
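    The sketch below illustrates the general idea of projecting temperature by convolving a forcing series with a scaling climate response function characterized by an exponent H and inner scale τ; the kernel form, parameter values and forcing ramp are illustrative assumptions, not the calibrated model of this study.

        # Hedged sketch: convolve forcing with a power-law response kernel.
        # Kernel form, H, tau, sensitivity and forcing are illustrative only.
        import numpy as np

        def scaling_crf(t_years, H=0.4, tau=2.0):
            """One plausible truncated power-law response kernel."""
            return (1.0 + t_years / tau) ** (H - 1.0)

        years = np.arange(150)
        forcing = 0.03 * years                    # hypothetical ramp, W m^-2
        kernel = scaling_crf(years)
        kernel /= kernel.sum()                    # normalize; sensitivity set separately
        sensitivity = 0.5                         # K per W m^-2, hypothetical
        response = sensitivity * np.convolve(forcing, kernel)[: len(years)]
        print(f"warming after 150 years: {response[-1]:.2f} K")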

  17. Contributions of changes in climatology and perturbation and the resulting nonlinearity to regional climate change.

    PubMed

    Adachi, Sachiho A; Nishizawa, Seiya; Yoshida, Ryuji; Yamaura, Tsuyoshi; Ando, Kazuto; Yashiro, Hisashi; Kajikawa, Yoshiyuki; Tomita, Hirofumi

    2017-12-20

    Future changes in large-scale climatology and perturbation may have different impacts on regional climate change. It is important to understand the impacts of climatology and perturbation in terms of both thermodynamic and dynamic changes. Although many studies have investigated the influence of climatology changes on regional climate, the significance of perturbation changes is still debated. The nonlinear effect of these two changes is also unknown. We propose a systematic procedure that extracts the influences of three factors: changes in climatology, changes in perturbation and the resulting nonlinear effect. We then demonstrate the usefulness of the procedure, applying it to future changes in precipitation. All three factors have the same degree of influence, especially for extreme rainfall events. Thus, regional climate assessments should consider not only the climatology change but also the perturbation change and their nonlinearity. This procedure can advance interpretations of future regional climates.
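    The sketch below shows one way such a three-factor decomposition can be written down, in the spirit of a factor-separation analysis; the four-experiment design and the toy fields are assumptions for illustration and are not necessarily the authors' exact procedure.

        # Hedged sketch: factor separation into climatology change, perturbation
        # change and their nonlinear interaction, from four hypothetical runs:
        #   r_cc: current climatology + current perturbation (control)
        #   r_fc: future climatology  + current perturbation
        #   r_cf: current climatology + future perturbation
        #   r_ff: future climatology  + future perturbation
        import numpy as np

        def decompose(r_cc, r_fc, r_cf, r_ff):
            clim_effect = r_fc - r_cc
            pert_effect = r_cf - r_cc
            nonlinear = r_ff - r_fc - r_cf + r_cc
            return clim_effect, pert_effect, nonlinear, r_ff - r_cc

        r_cc = np.array([[3.0, 4.0], [5.0, 2.0]])   # toy precipitation fields, mm/day
        r_fc = np.array([[3.3, 4.4], [5.2, 2.1]])
        r_cf = np.array([[3.4, 4.3], [5.3, 2.2]])
        r_ff = np.array([[3.8, 4.9], [5.6, 2.4]])
        print(decompose(r_cc, r_fc, r_cf, r_ff))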

  18. Multi-time-scale hydroclimate dynamics of a regional watershed and links to large-scale atmospheric circulation: Application to the Seine river catchment, France

    NASA Astrophysics Data System (ADS)

    Massei, N.; Dieppois, B.; Hannah, D. M.; Lavers, D. A.; Fossa, M.; Laignel, B.; Debret, M.

    2017-03-01

    In the present context of global changes, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability or provide future scenarios of water resources. In the aim of a better understanding of hydrological changes, it is of crucial importance to determine how and to what extent trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating correlation between large and local scales, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea level pressure (SLP) in order to gain additional insights on the atmospheric patterns associated with the regional hydrology. We hypothesized that: (i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and (ii) definition of those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the links between large and local scales were not necessarily constant according to time-scale (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach, which integrated discrete wavelet multiresolution analysis for reconstructing monthly regional hydrometeorological processes (predictand: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector). This approach basically consisted in three steps: 1 - decomposing large-scale climate and hydrological signals (SLP field, precipitation or streamflow) using discrete wavelet multiresolution analysis, 2 - generating a statistical downscaling model per time-scale, 3 - summing up all scale-dependent models in order to obtain a final reconstruction of the predictand. The results obtained revealed a significant improvement of the reconstructions for both precipitation and streamflow when using the multiresolution ESD model instead of basic ESD. In particular, the multiresolution ESD model handled very well the significant changes in variance through time observed in either precipitation or streamflow. For instance, the post-1980 period, which had been characterized by particularly high amplitudes in interannual-to-interdecadal variability associated with alternating flood and extremely low-flow/drought periods (e.g., winter/spring 2001, summer 2003), could not be reconstructed without integrating wavelet multiresolution analysis into the model. In accordance with previous studies, the wavelet components detected in SLP, precipitation and streamflow on interannual to interdecadal time-scales could be interpreted in terms of influence of the Gulf-Stream oceanic front on atmospheric circulation.
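    The three-step procedure described above can be sketched as follows, assuming PyWavelets for the discrete wavelet decomposition, a single large-scale index standing in for the SLP field, and simple per-scale linear regressions; all series, the wavelet choice and the decomposition level are illustrative assumptions.

        # Hedged sketch of multiresolution empirical statistical downscaling:
        # decompose predictor and predictand by scale, fit one linear model per
        # scale, then sum the per-scale predictions. Data are synthetic.
        import numpy as np
        import pywt

        def mra_components(x, wavelet="db4", level=4):
            """Per-scale reconstructions of x that sum back (approximately) to x."""
            coeffs = pywt.wavedec(x, wavelet, level=level)
            comps = []
            for i in range(len(coeffs)):
                kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
                comps.append(pywt.waverec(kept, wavelet)[: len(x)])
            return comps

        rng = np.random.default_rng(0)
        n = 512
        slp_index = rng.standard_normal(n)                      # stand-in predictor
        streamflow = 0.6 * slp_index + rng.standard_normal(n)   # stand-in predictand

        prediction = np.zeros(n)
        for xs, ys in zip(mra_components(slp_index), mra_components(streamflow)):
            slope, intercept = np.polyfit(xs, ys, 1)            # one model per scale
            prediction += slope * xs + intercept

        print("correlation:", np.corrcoef(prediction, streamflow)[0, 1])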

  19. Using Hybrid Techniques for Generating Watershed-scale Flood Models in an Integrated Modeling Framework

    NASA Astrophysics Data System (ADS)

    Saksena, S.; Merwade, V.; Singhofen, P.

    2017-12-01

    There is an increasing global trend towards developing large-scale flood models that account for spatial heterogeneity at watershed scales to drive future flood risk planning. Integrated surface water-groundwater modeling procedures can elucidate all the hydrologic processes taking part during a flood event to provide accurate flood outputs. Even though the advantages of using integrated modeling are widely acknowledged, the complexity of integrated process representation, computation time and number of input parameters required have deterred its application to flood inundation mapping, especially for large watersheds. This study presents a faster approach for creating watershed-scale flood models using a hybrid design that breaks down the watershed into multiple regions of variable spatial resolution by prioritizing higher-order streams. The methodology involves creating a hybrid model for the Upper Wabash River Basin in Indiana using Interconnected Channel and Pond Routing (ICPR) and comparing the performance with a fully-integrated 2D hydrodynamic model. The hybrid approach involves simplification procedures such as 1D channel-2D floodplain coupling; hydrologic basin (HUC-12) integration with 2D groundwater for rainfall-runoff routing; and varying spatial resolution of 2D overland flow based on stream order. The results for a 50-year return period storm event show that the hybrid model's performance (NSE = 0.87) is similar to that of the 2D integrated model (NSE = 0.88), while the computational time is reduced by half. The results suggest that significant computational efficiency can be obtained while maintaining model accuracy for large-scale flood models by using hybrid approaches for model creation.
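    For reference, the Nash-Sutcliffe efficiency (NSE) used above to compare the two models can be computed as sketched below; the discharge series are hypothetical.

        # Hedged sketch: Nash-Sutcliffe efficiency for a simulated discharge series.
        import numpy as np

        def nse(simulated, observed):
            observed = np.asarray(observed, dtype=float)
            simulated = np.asarray(simulated, dtype=float)
            return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
                (observed - observed.mean()) ** 2)

        obs = [120.0, 150.0, 300.0, 220.0, 180.0]   # observed discharge, m^3/s (toy)
        sim = [110.0, 160.0, 280.0, 230.0, 170.0]   # simulated discharge, m^3/s (toy)
        print(f"NSE = {nse(sim, obs):.2f}")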

  20. Safety leadership and systems thinking: application and evaluation of a Risk Management Framework in the mining industry.

    PubMed

    Donovan, Sarah-Louise; Salmon, Paul M; Lenné, Michael G; Horberry, Tim

    2017-10-01

    Safety leadership is an important factor in supporting safety in high-risk industries. This article contends that applying systems-thinking methods to examine safety leadership can support improved learning from incidents. A case study analysis was undertaken of a large-scale mining landslide incident in which no injuries or fatalities were incurred. A multi-method approach was adopted, in which the Critical Decision Method, Rasmussen's Risk Management Framework and Accimap method were applied to examine the safety leadership decisions and actions which enabled the safe outcome. The approach enabled Rasmussen's predictions regarding safety and performance to be examined in the safety leadership context, with findings demonstrating the distribution of safety leadership across leader and system levels, and the presence of vertical integration as key to supporting the successful safety outcome. In doing so, the findings also demonstrate the usefulness of applying systems-thinking methods to examine and learn from incidents in terms of what 'went right'. The implications, including future research directions, are discussed. Practitioner Summary: This paper presents a case study analysis, in which systems-thinking methods are applied to the examination of safety leadership decisions and actions during a large-scale mining landslide incident. The findings establish safety leadership as a systems phenomenon, and furthermore, demonstrate the usefulness of applying systems-thinking methods to learn from incidents in terms of what 'went right'. Implications, including future research directions, are discussed.

  1. Improved Large-Scale Inundation Modelling by 1D-2D Coupling and Consideration of Hydrologic and Hydrodynamic Processes - a Case Study in the Amazon

    NASA Astrophysics Data System (ADS)

    Hoch, J. M.; Bierkens, M. F.; Van Beek, R.; Winsemius, H.; Haag, A.

    2015-12-01

    Understanding the dynamics of fluvial floods is paramount to accurate flood hazard and risk modeling. Currently, economic losses due to flooding constitute about one third of all damage resulting from natural hazards. Given future projections of climate change, the anticipated increase in the World's population and the associated implications, sound knowledge of flood hazard and related risk is crucial. Fluvial floods are cross-border phenomena that need to be addressed accordingly. Yet, only few studies model floods at the large-scale which is preferable to tiling the output of small-scale models. Most models cannot realistically model flood wave propagation due to a lack of either detailed channel and floodplain geometry or the absence of hydrologic processes. This study aims to develop a large-scale modeling tool that accounts for both hydrologic and hydrodynamic processes, to find and understand possible sources of errors and improvements and to assess how the added hydrodynamics affect flood wave propagation. Flood wave propagation is simulated by DELFT3D-FM (FM), a hydrodynamic model using a flexible mesh to schematize the study area. It is coupled to PCR-GLOBWB (PCR), a macro-scale hydrological model, that has its own simpler 1D routing scheme (DynRout) which has already been used for global inundation modeling and flood risk assessments (GLOFRIS; Winsemius et al., 2013). A number of model set-ups are compared and benchmarked for the simulation period 1986-1996: (0) PCR with DynRout; (1) using a FM 2D flexible mesh forced with PCR output and (2) as in (1) but discriminating between 1D channels and 2D floodplains, and, for comparison, (3) and (4) the same set-ups as (1) and (2) but forced with observed GRDC discharge values. Outputs are subsequently validated against observed GRDC data at Óbidos and flood extent maps from the Dartmouth Flood Observatory. The present research constitutes a first step into a globally applicable approach to fully couple hydrologic with hydrodynamic computations while discriminating between 1D-channels and 2D-floodplains. Such a fully-fledged set-up would be able to provide higher-order flood hazard information, e.g. time to flooding and flood duration, ultimately leading to improved flood risk assessment and management at the large scale.

  2. Control networks and hubs.

    PubMed

    Gratton, Caterina; Sun, Haoxin; Petersen, Steven E

    2018-03-01

    Executive control functions are associated with frontal, parietal, cingulate, and insular brain regions that interact through distributed large-scale networks. Here, we discuss how fMRI functional connectivity can shed light on the organization of control networks and how they interact with other parts of the brain. In the first section of our review, we present convergent evidence from fMRI functional connectivity, activation, and lesion studies that there are multiple dissociable control networks in the brain with distinct functional properties. In the second section, we discuss how graph theoretical concepts can help illuminate the mechanisms by which control networks interact with other brain regions to carry out goal-directed functions, focusing on the role of specialized hub regions for mediating cross-network interactions. Again, we use a combination of functional connectivity, lesion, and task activation studies to bolster this claim. We conclude that a large-scale network perspective provides important neurobiological constraints on the neural underpinnings of executive control, which will guide future basic and translational research into executive function and its disruption in disease. © 2017 Society for Psychophysiological Research.

  3. Monitoring Ephemeral Streams Using Airborne Very High Resolution Multispectral Remote Sensing in Arid Environments

    NASA Astrophysics Data System (ADS)

    Hamada, Y.; O'Connor, B. L.

    2012-12-01

    Development in arid environments often results in the loss and degradation of the ephemeral streams that provide habitat and critical ecosystem functions such as water delivery, sediment transport, and groundwater recharge. Quantification of these ecosystem functions is challenging because of the episodic nature of runoff events in desert landscapes and the large spatial scale of watersheds that potentially can be impacted by large-scale development. Low-impact development guidelines and regulatory protection of ephemeral streams are often lacking due to the difficulty of accurately mapping and quantifying the critical functions of ephemeral streams at scales larger than individual reaches. Renewable energy development in arid regions has the potential to disturb ephemeral streams at the watershed scale, and it is necessary to develop environmental monitoring applications for ephemeral streams to help inform land management and regulatory actions aimed at protecting and mitigating for impacts related to large-scale land disturbances. This study focuses on developing remote sensing methodologies to identify and monitor impacts on ephemeral streams resulting from the land disturbance associated with utility-scale solar energy development in the desert southwest of the United States. Airborne very high resolution (VHR) multispectral imagery is used to produce stereoscopic, three-dimensional landscape models that can be used to (1) identify and map ephemeral stream channel networks, and (2) support analyses and models of hydrologic and sediment transport processes that pertain to the critical functionality of ephemeral streams. Spectral and statistical analyses are being developed to extract information about ephemeral channel location and extent, micro-topography, riparian vegetation, and soil moisture characteristics. This presentation will demonstrate initial results and provide a framework for future work associated with this project, for developing the field measurements necessary to verify remote sensing landscape models, and for generating hydrologic models and analyses.

  4. Future Tense and Economic Decisions: Controlling for Cultural Evolution

    PubMed Central

    Roberts, Seán G.; Winters, James; Chen, Keith

    2015-01-01

    A previous study by Chen demonstrates a correlation between languages that grammatically mark future events and their speakers' propensity to save, even after controlling for numerous economic and demographic factors. The implication is that languages which grammatically distinguish the present and the future may bias their speakers to distinguish them psychologically, leading to less future-oriented decision making. However, Chen's original analysis assumed languages are independent. This neglects the fact that languages are related, causing correlations to appear stronger than is warranted (Galton's problem). In this paper, we test the robustness of Chen's correlations to corrections for the geographic and historical relatedness of languages. While the question seems simple, the answer is complex. In general, the statistical correlation between the two variables is weaker when controlling for relatedness. When applying the strictest tests for relatedness, and when data is not aggregated across individuals, the correlation is not significant. However, the correlation did remain reasonably robust under a number of tests. We argue that any claims of synchronic patterns between cultural variables should be tested for spurious correlations, with the kinds of approaches used in this paper. However, experiments or case-studies would be more fruitful avenues for future research on this specific topic, rather than further large-scale cross-cultural correlational studies. PMID:26186527

  5. Future Tense and Economic Decisions: Controlling for Cultural Evolution.

    PubMed

    Roberts, Seán G; Winters, James; Chen, Keith

    2015-01-01

    A previous study by Chen demonstrates a correlation between languages that grammatically mark future events and their speakers' propensity to save, even after controlling for numerous economic and demographic factors. The implication is that languages which grammatically distinguish the present and the future may bias their speakers to distinguish them psychologically, leading to less future-oriented decision making. However, Chen's original analysis assumed languages are independent. This neglects the fact that languages are related, causing correlations to appear stronger than is warranted (Galton's problem). In this paper, we test the robustness of Chen's correlations to corrections for the geographic and historical relatedness of languages. While the question seems simple, the answer is complex. In general, the statistical correlation between the two variables is weaker when controlling for relatedness. When applying the strictest tests for relatedness, and when data is not aggregated across individuals, the correlation is not significant. However, the correlation did remain reasonably robust under a number of tests. We argue that any claims of synchronic patterns between cultural variables should be tested for spurious correlations, with the kinds of approaches used in this paper. However, experiments or case-studies would be more fruitful avenues for future research on this specific topic, rather than further large-scale cross-cultural correlational studies.

  6. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    PubMed

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large-scale and 35 on a small-scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high work-load and a high risk of detection, and thus demand a higher level of organizational skills than for small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production will imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task-ideologically, technically, economically and personally. The many obstacles that small-scale growers face and the lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed. Copyright © 2012 Elsevier B.V. All rights reserved.

  7. Validity of the Brunel Mood Scale for use With Malaysian Athletes.

    PubMed

    Lan, Mohamad Faizal; Lane, Andrew M; Roy, Jolly; Hanin, Nik Azma

    2012-01-01

    The aim of the present study was to investigate the factorial validity of the Brunel Mood Scale for use with Malaysian athletes. Athletes (N = 1485) competing at the Malaysian Games completed the Brunel Mood Scale (BRUMS). Confirmatory Factor Analysis (CFA) results indicated a Comparative Fit Index (CFI) of .90, and the Root Mean Squared Error of Approximation (RMSEA) was 0.05. The CFI was below the 0.95 criterion for acceptability, and the RMSEA value was within the limits for acceptability suggested by Hu and Bentler (1999). We suggest that the results provide some support for the validity of the BRUMS for use with Malaysian athletes. Given the large sample size used in the present study, descriptive statistics could be used as normative data for Malaysian athletes. Key points: Findings from the present study lend support to the validity of the BRUMS for use with Malaysian athletes. Given the size of the sample used in the present study, we suggest descriptive data be used as the normative data for researchers using the scale with Malaysian athletes. It is suggested that future research investigate the effects of cultural differences on emotional states experienced by athletes before, during and post-competition.
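    For readers unfamiliar with these indices, the sketch below shows one common way RMSEA and CFI are computed from the chi-square statistics of the tested and baseline models; the chi-square values and degrees of freedom are hypothetical placeholders chosen only to yield values of roughly the magnitude quoted above.

        # Hedged sketch of common RMSEA and CFI formulas; inputs are placeholders.
        import math

        def rmsea(chi2, df, n):
            return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

        def cfi(chi2_model, df_model, chi2_null, df_null):
            d_model = max(chi2_model - df_model, 0.0)
            d_null = max(chi2_null - df_null, 0.0)
            return 1.0 - d_model / max(d_model, d_null)

        n = 1485
        print(f"RMSEA = {rmsea(chi2=1220.0, df=237, n=n):.3f}")
        print(f"CFI   = {cfi(1220.0, 237, 10070.0, 276):.2f}")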

  8. Validity of the Brunel Mood Scale for use With Malaysian Athletes

    PubMed Central

    Lan, Mohamad Faizal; Lane, Andrew M.; Roy, Jolly; Hanin, Nik Azma

    2012-01-01

    The aim of the present study was to investigate the factorial validity of the Brunel Mood Scale for use with Malaysian athletes. Athletes (N = 1485) competing at the Malaysian Games completed the Brunel Mood Scale (BRUMS). Confirmatory factor analysis (CFA) results indicated a Comparative Fit Index (CFI) of 0.90, and the Root Mean Square Error of Approximation (RMSEA) was 0.05. The CFI was below the 0.95 criterion for acceptability, whereas the RMSEA value was within the limits for acceptability suggested by Hu and Bentler (1999). We suggest that the results provide some support for the validity of the BRUMS for use with Malaysian athletes. Given the large sample size used in the present study, the descriptive statistics could be used as normative data for Malaysian athletes. Key points: findings from the present study lend support to the validity of the BRUMS for use with Malaysian athletes; given the size of the sample used in the present study, we suggest the descriptive data be used as normative data for researchers using the scale with Malaysian athletes; and it is suggested that future research investigate the effects of cultural differences on emotional states experienced by athletes before, during and after competition. PMID:24149128

  9. Improving our fundamental understanding of the role of aerosol-cloud interactions in the climate system.

    PubMed

    Seinfeld, John H; Bretherton, Christopher; Carslaw, Kenneth S; Coe, Hugh; DeMott, Paul J; Dunlea, Edward J; Feingold, Graham; Ghan, Steven; Guenther, Alex B; Kahn, Ralph; Kraucunas, Ian; Kreidenweis, Sonia M; Molina, Mario J; Nenes, Athanasios; Penner, Joyce E; Prather, Kimberly A; Ramanathan, V; Ramaswamy, Venkatachalam; Rasch, Philip J; Ravishankara, A R; Rosenfeld, Daniel; Stephens, Graeme; Wood, Robert

    2016-05-24

    The effect of an increase in atmospheric aerosol concentrations on the distribution and radiative properties of Earth's clouds is the most uncertain component of the overall global radiative forcing from preindustrial time. General circulation models (GCMs) are the tool for predicting future climate, but the treatment of aerosols, clouds, and aerosol-cloud radiative effects carries large uncertainties that directly affect GCM predictions, such as climate sensitivity. Predictions are hampered by the large range of scales of interaction between various components that need to be captured. Observation systems (remote sensing, in situ) are increasingly being used to constrain predictions, but significant challenges exist, to some extent because of the large range of scales and the fact that the various measuring systems tend to address different scales. Fine-scale models represent clouds, aerosols, and aerosol-cloud interactions with high fidelity but do not include interactions with the larger scale and are therefore limited from a climatic point of view. We suggest strategies for improving estimates of aerosol-cloud relationships in climate models, for new remote sensing and in situ measurements, and for quantifying and reducing model uncertainty.

  10. Improving Our Fundamental Understanding of the Role of Aerosol Cloud Interactions in the Climate System

    NASA Technical Reports Server (NTRS)

    Seinfeld, John H.; Bretherton, Christopher; Carslaw, Kenneth S.; Coe, Hugh; DeMott, Paul J.; Dunlea, Edward J.; Feingold, Graham; Ghan, Steven; Guenther, Alex B.; Kahn, Ralph; et al.

    2016-01-01

    The effect of an increase in atmospheric aerosol concentrations on the distribution and radiative properties of Earth's clouds is the most uncertain component of the overall global radiative forcing from preindustrial time. General circulation models (GCMs) are the tool for predicting future climate, but the treatment of aerosols, clouds, and aerosol-cloud radiative effects carries large uncertainties that directly affect GCM predictions, such as climate sensitivity. Predictions are hampered by the large range of scales of interaction between various components that need to be captured. Observation systems (remote sensing, in situ) are increasingly being used to constrain predictions, but significant challenges exist, to some extent because of the large range of scales and the fact that the various measuring systems tend to address different scales. Fine-scale models represent clouds, aerosols, and aerosol-cloud interactions with high fidelity but do not include interactions with the larger scale and are therefore limited from a climatic point of view. We suggest strategies for improving estimates of aerosol-cloud relationships in climate models, for new remote sensing and in situ measurements, and for quantifying and reducing model uncertainty.

  11. Support for solar energy: Examining sense of place and utility-scale development in California

    DOE PAGES

    Carlisle, Juliet E.; Kane, Stephanie L.; Solan, David; ...

    2014-08-20

    As solar costs have declined, PV systems have experienced considerable growth since 2003, especially in China, Japan, Germany, and the U.S. Thus, a more nuanced understanding of a particular public's attitudes toward utility-scale solar development, as it arrives in a market and region, is warranted and will likely be instructive for other areas in the world where this type of development will occur in the near future. Using data collected from a 2013 telephone survey (N = 594) from the six Southern Californian counties selected based on existing and proposed solar developments and available suitable land, we examine public attitudes toward solar energy and construction of large-scale solar facilities, testing whether attitudes toward such developments are the result of sense of place and attachment to place. Overall, we have mixed results. Place attachment and sense of place fail to produce significant effects except in terms of perceived positive benefits. That is, respondents interpret the change resulting from large-scale solar development in a positive way insofar as perceived positive economic impacts are positively related to support for nearby large-scale construction.

  12. Improving our fundamental understanding of the role of aerosol-cloud interactions in the climate system

    DOE PAGES

    Seinfeld, John H.; Bretherton, Christopher; Carslaw, Kenneth S.; ...

    2016-05-24

    The effect of an increase in atmospheric aerosol concentrations on the distribution and radiative properties of Earth’s clouds is the most uncertain component of the overall global radiative forcing from pre-industrial time. General Circulation Models (GCMs) are the tool for predicting future climate, but the treatment of aerosols, clouds, and aerosol-cloud radiative effects carries large uncertainties that directly affect GCM predictions, such as climate sensitivity. Predictions are hampered by the large range of scales of interaction between various components that need to be captured. Observation systems (remote sensing, in situ) are increasingly being used to constrain predictions but significant challenges exist, to some extent because of the large range of scales and the fact that the various measuring systems tend to address different scales. Fine-scale models represent clouds, aerosols, and aerosol-cloud interactions with high fidelity but do not include interactions with the larger scale and are therefore limited from a climatic point of view. Lastly, we suggest strategies for improving estimates of aerosol-cloud relationships in climate models, for new remote sensing and in situ measurements, and for quantifying and reducing model uncertainty.

  13. Support for solar energy: Examining sense of place and utility-scale development in California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Juliet E. Carlisle; Stephanie L. Kane; David Solan

    2015-07-01

    As solar costs have declined, PV systems have experienced considerable growth since 2003, especially in China, Japan, Germany, and the U.S. Thus, a more nuanced understanding of a particular public's attitudes toward utility-scale solar development, as it arrives in a market and region, is warranted and will likely be instructive for other areas in the world where this type of development will occur in the near future. Using data collected from a 2013 telephone survey (N = 594) from the six Southern Californian counties selected based on existing and proposed solar developments and available suitable land, we examine public attitudes toward solar energy and construction of large-scale solar facilities, testing whether attitudes toward such developments are the result of sense of place and attachment to place. Overall, we have mixed results. Place attachment and sense of place fail to produce significant effects except in terms of perceived positive benefits. That is, respondents interpret the change resulting from large-scale solar development in a positive way insofar as perceived positive economic impacts are positively related to support for nearby large-scale construction.

  14. Improving our fundamental understanding of the role of aerosol−cloud interactions in the climate system

    PubMed Central

    Seinfeld, John H.; Bretherton, Christopher; Carslaw, Kenneth S.; Coe, Hugh; DeMott, Paul J.; Dunlea, Edward J.; Feingold, Graham; Ghan, Steven; Guenther, Alex B.; Kraucunas, Ian; Molina, Mario J.; Nenes, Athanasios; Penner, Joyce E.; Prather, Kimberly A.; Ramanathan, V.; Ramaswamy, Venkatachalam; Rasch, Philip J.; Ravishankara, A. R.; Rosenfeld, Daniel; Stephens, Graeme; Wood, Robert

    2016-01-01

    The effect of an increase in atmospheric aerosol concentrations on the distribution and radiative properties of Earth’s clouds is the most uncertain component of the overall global radiative forcing from preindustrial time. General circulation models (GCMs) are the tool for predicting future climate, but the treatment of aerosols, clouds, and aerosol−cloud radiative effects carries large uncertainties that directly affect GCM predictions, such as climate sensitivity. Predictions are hampered by the large range of scales of interaction between various components that need to be captured. Observation systems (remote sensing, in situ) are increasingly being used to constrain predictions, but significant challenges exist, to some extent because of the large range of scales and the fact that the various measuring systems tend to address different scales. Fine-scale models represent clouds, aerosols, and aerosol−cloud interactions with high fidelity but do not include interactions with the larger scale and are therefore limited from a climatic point of view. We suggest strategies for improving estimates of aerosol−cloud relationships in climate models, for new remote sensing and in situ measurements, and for quantifying and reducing model uncertainty. PMID:27222566

  15. A multi-topographical-instrument analysis: the breast implant texture measurement

    NASA Astrophysics Data System (ADS)

    Garabédian, Charles; Delille, Rémi; Deltombe, Raphaël; Anselme, Karine; Atlan, Michael; Bigerelle, Maxence

    2017-06-01

    Capsular contracture is a major complication after implant-based breast augmentation. To address this tissue reaction, most manufacturers texture the outer breast implant surfaces with calibrated salt grains. However, the analysis of these surfaces on sub-micron scales has been under-studied. This scale range is of interest for understanding the fate of silicone particles potentially released from the implant surface and the aetiology of newly reported complications, such as Anaplastic Large Cell Lymphoma. The surface measurements were accomplished by tomography and by two optical devices based on interferometry and on focus variation. The robustness of the measurements was investigated from the tissue scale to the cellular scale. The macroscopic pore-based structure of the textured implant surfaces is consistently measured by the three instruments. However, the multi-scale analyses start to diverge in a scale range between 50 µm and 500 µm, characteristic of a finer secondary roughness regardless of the pore shape. The focus variation and the micro-tomography would fail to capture this roughness regime because of a focus-related optical artefact and a step-shaped artefact, respectively.

  16. Global change technology architecture trade study

    NASA Technical Reports Server (NTRS)

    Garrett, L. Bernard (Editor); Hypes, Warren D. (Editor); Wright, Robert L. (Editor)

    1991-01-01

    Described here is an architecture trade study conducted by the Langley Research Center to develop a representative mix of advanced space science instrumentation, spacecraft, and mission orbits to assist in the technology selection processes. The analyses concentrated on the highest priority classes of global change measurements, which are the global climate changes. Issues addressed in the tradeoffs include assessments of the economies of scale of large platforms with multiple instruments relative to smaller spacecraft; the influences of current and possible future launch vehicles on payload sizes and on-orbit assembly decisions; and the respective roles of low-Earth versus geostationary Earth orbiting systems.

  17. Extreme air pollution events in Hokkaido, Japan, traced back to early snowmelt and large-scale wildfires over East Eurasia: Case studies.

    PubMed

    Yasunari, Teppei J; Kim, Kyu-Myong; da Silva, Arlindo M; Hayasaki, Masamitsu; Akiyama, Masayuki; Murao, Naoto

    2018-04-25

    To identify the unusual climate conditions and their connections to air pollution in a remote area due to wildfires, we examine three anomalous large-scale wildfires in May 2003, April 2008, and July 2014 over East Eurasia, as well as how products of those wildfires reached an urban city, Sapporo, in the northern part of Japan (Hokkaido), significantly affecting the air quality. NASA's MERRA-2 (the Modern-Era Retrospective analysis for Research and Applications, Version 2) aerosol re-analysis data closely reproduced the PM2.5 variations in Sapporo for the case of smoke arrival in July 2014. Results show that all three cases featured unusually early snowmelt in East Eurasia, accompanied by warmer and drier surface conditions in the months leading up to the fires, inducing long-lasting soil dryness and producing climate and environmental conditions conducive to active wildfires. Due to prevailing anomalous synoptic-scale atmospheric motions, smoke from those fires eventually reached a remote area, Hokkaido, and worsened the air quality in Sapporo. In future studies, continuous monitoring of the timing of Eurasian snowmelt and of the air quality from the source regions to remote regions, coupled with analysis of atmospheric and surface conditions, may be essential for more accurately predicting the effects of wildfires on air quality.

  18. Performance Characterization of Global Address Space Applications: A Case Study with NWChem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammond, Jeffrey R.; Krishnamoorthy, Sriram; Shende, Sameer

    The use of global address space languages and one-sided communication for complex applications is gaining attention in the parallel computing community. However, the lack of good evaluative methods to observe multiple levels of performance makes it difficult to isolate the cause of performance deficiencies and to understand the fundamental limitations of system and application design for future improvement. NWChem is a popular computational chemistry package which depends on the Global Arrays/ARMCI suite for partitioned global address space functionality to deliver high-end molecular modeling capabilities. A workload characterization methodology was developed to support NWChem performance engineering on large-scale parallel platforms. The research involved both the integration of performance instrumentation and measurement in the NWChem software, as well as the analysis of one-sided communication performance in the context of NWChem workloads. Scaling studies were conducted for NWChem on Blue Gene/P and on two large-scale clusters using different generations of Infiniband interconnects and x86 processors. The performance analysis and results show how subtle changes in the runtime parameters related to the communication subsystem could have a significant impact on performance behavior. The tool has successfully identified several algorithmic bottlenecks which are already being tackled by computational chemists to improve NWChem performance.

  19. Biodiversity conservation in Swedish forests: ways forward for a 30-year-old multi-scaled approach.

    PubMed

    Gustafsson, Lena; Perhans, Karin

    2010-12-01

    A multi-scaled model for biodiversity conservation in forests was introduced in Sweden 30 years ago, which makes it a pioneer example of an integrated ecosystem approach. Trees are set aside for biodiversity purposes at multiple scale levels varying from individual trees to areas of thousands of hectares, with landowner responsibility at the lowest level and with increasing state involvement at higher levels. Ecological theory supports the multi-scaled approach, and retention efforts at every harvest occasion stimulate landowners' interest in conservation. We argue that the model has large advantages but that in a future with intensified forestry and global warming, development based on more progressive thinking is necessary to maintain and increase biodiversity. Suggestions for the future include joint planning for several forest owners, consideration of cost-effectiveness, accepting opportunistic work models, adjusting retention levels to stand and landscape composition, introduction of temporary reserves, creation of "receiver habitats" for species escaping climate change, and protection of young forests.

  20. The ecological future of the North American bison: Conceiving long-term, large-scale conservation of a species

    USGS Publications Warehouse

    Sanderson, E.W.; Redford, Kent; Weber, Bill; Aune, K.; Baldes, Dick; Berger, J.; Carter, Dave; Curtin, C.; Derr, James N.; Dobrott, S.J.; Fearn, Eva; Fleener, Craig; Forrest, Steven C.; Gerlach, Craig; Gates, C. Cormack; Gross, J.E.; Gogan, P.; Grassel, Shaun M.; Hilty, Jodi A.; Jensen, Marv; Kunkel, Kyran; Lammers, Duane; List, R.; Minkowski, Karen; Olson, Tom; Pague, Chris; Robertson, Paul B.; Stephenson, Bob

    2008-01-01

    Many wide-ranging mammal species have experienced significant declines over the last 200 years; restoring these species will require long-term, large-scale recovery efforts. We highlight 5 attributes of a recent range-wide vision-setting exercise for ecological recovery of the North American bison (Bison bison) that are broadly applicable to other species and restoration targets. The result of the exercise, the “Vermejo Statement” on bison restoration, is explicitly (1) large scale, (2) long term, (3) inclusive, (4) fulfilling of different values, and (5) ambitious. It reads, in part, “Over the next century, the ecological recovery of the North American bison will occur when multiple large herds move freely across extensive landscapes within all major habitats of their historic range, interacting in ecologically significant ways with the fullest possible set of other native species, and inspiring, sustaining and connecting human cultures.” We refined the vision into a scorecard that illustrates how individual bison herds can contribute to the vision. We also developed a set of maps and analyzed the current and potential future distributions of bison on the basis of expert assessment. Although more than 500,000 bison exist in North America today, we estimated they occupy <1% of their historical range and in no place express the full range of ecological and social values of previous times. By formulating an inclusive, affirmative, and specific vision through consultation with a wide range of stakeholders, we hope to provide a foundation for conservation of bison, and other wide-ranging species, over the next 100 years.

  1. Research and Development of Wires and Cables for High-Field Accelerator Magnets

    DOE PAGES

    Barzi, Emanuela; Zlobin, Alexander V.

    2016-02-18

    The latest strategic plans for High Energy Physics endorse steadfast superconducting magnet technology R&D for future Energy Frontier Facilities. This includes 10 to 16 T Nb3Sn accelerator magnets for the luminosity upgrades of the Large Hadron Collider and eventually for a future 100 TeV scale proton-proton (pp) collider. This paper describes the multi-decade R&D investment in the Nb3Sn superconductor technology, which was crucial to produce the first reproducible 10 to 12 T accelerator-quality dipoles and quadrupoles, as well as their scale-up. We also indicate prospective research areas in superconducting Nb3Sn wires and cables to achieve the next goals for superconducting accelerator magnets. Emphasis is on increasing performance and decreasing costs while pushing the Nb3Sn technology to its limits for future pp colliders.

  2. Large scale seismic vulnerability and risk evaluation of a masonry churches sample in the historical centre of Naples

    NASA Astrophysics Data System (ADS)

    Formisano, Antonio; Ciccone, Giuseppe; Mele, Annalisa

    2017-11-01

    This paper investigates the seismic vulnerability and risk of fifteen masonry churches located in the historical centre of Naples. The analysis method used is derived from a procedure already implemented by the University of Basilicata on the churches of Matera. In order to evaluate the seismic vulnerability and hazard indexes of the selected churches in the study area, appropriate technical survey forms are used. The data obtained from applying this procedure allow both the plotting of vulnerability maps and the provision of seismic risk indicators for all churches. The comparison among the resulting indexes allows the health state of the inspected churches to be evaluated, so as to establish a priority scale for future retrofitting interventions.

  3. The Variance of Intraclass Correlations in Three- and Four-Level Models

    ERIC Educational Resources Information Center

    Hedges, Larry V.; Hedberg, E. C.; Kuyper, Arend M.

    2012-01-01

    Intraclass correlations are used to summarize the variance decomposition in populations with multilevel hierarchical structure. There has recently been considerable interest in estimating intraclass correlations from surveys or designed experiments to provide design parameters for planning future large-scale randomized experiments. The large…
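
    The design parameters referred to here are the intraclass correlations obtained from the variance components of a multilevel model. A minimal sketch, using made-up variance components rather than the authors' estimates, is given below for a three-level hierarchy (students within classrooms within schools).

    ```python
    def three_level_iccs(sigma2_student, tau2_classroom, tau2_school):
        """Intraclass correlations for a three-level hierarchy
        (students nested in classrooms nested in schools)."""
        total = sigma2_student + tau2_classroom + tau2_school
        icc_school = tau2_school / total          # share of variance between schools
        icc_classroom = tau2_classroom / total    # share between classrooms within schools
        return icc_school, icc_classroom

    # Illustrative variance components (not estimates from the article)
    icc_school, icc_class = three_level_iccs(sigma2_student=0.70,
                                             tau2_classroom=0.10,
                                             tau2_school=0.20)
    print(f"school-level ICC = {icc_school:.2f}, classroom-level ICC = {icc_class:.2f}")
    ```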

  4. The 80 megawatt wind power project at Kahuku Point, Hawaii

    NASA Technical Reports Server (NTRS)

    Laessig, R. R.

    1982-01-01

    Windfarms Ltd. is developing the two largest wind energy projects in the world. Designed to produce 80 megawatts at Kahuku Point, Hawaii and 350 megawatts in Solano County, California, these projects will be the prototypes for future large-scale wind energy installations throughout the world.

  5. The Variance of Intraclass Correlations in Three and Four Level

    ERIC Educational Resources Information Center

    Hedges, Larry V.; Hedberg, Eric C.; Kuyper, Arend M.

    2012-01-01

    Intraclass correlations are used to summarize the variance decomposition in populations with multilevel hierarchical structure. There has recently been considerable interest in estimating intraclass correlations from surveys or designed experiments to provide design parameters for planning future large-scale randomized experiments. The large…

  6. New Roles for Occupational Instructors.

    ERIC Educational Resources Information Center

    Campbell, Dale F.

    Changes in the future role of occupational instructors which will be brought about by advances in educational technology are illustrated by the description of the Advanced Instructional System (AIS), a complex approach to occupational training which permits large-scale application of individualized instruction through the use of computer-assisted…

  7. Active microwave remote sensing of oceans, chapter 3

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A rationale is developed for the use of active microwave sensing in future aerospace applications programs for the remote sensing of the world's oceans, lakes, and polar regions. Summaries pertaining to applications, local phenomena, and large-scale phenomena are given along with a discussion of orbital errors.

  8. Motor Vehicle Demand Models : Assessment of the State of the Art and Directions for Future Research

    DOT National Transportation Integrated Search

    1981-04-01

    The report provides an assessment of the current state of motor vehicle demand modeling. It includes a detailed evaluation of one leading large-scale econometric vehicle demand model, which is tested for both logical consistency and forecasting accur...

  9. An Overview of Science Education in the Caribbean: Research, Policy and Practice.

    ERIC Educational Resources Information Center

    Sweeney, Aldrin E.

    2003-01-01

    Analyzes science education in the Caribbean and provides examples of science education policy and practice. Emphasizes large-scale national efforts in Barbados, Bermuda, and Jamaica. Discusses and provides recommendations for future directions in science education in these countries. (Contains 88 references.) (Author/NB)

  10. The periodic dynamics of the irregular heterogeneous celestial bodies

    NASA Astrophysics Data System (ADS)

    Lan, Lei; Yang, Mo; Baoyin, Hexi; Li, Junfeng

    2017-02-01

    In this paper, we develop a methodology to study the periodic dynamics of irregular heterogeneous celestial bodies. Heterogeneous bodies are not scarce in space. It has been found that bodies such as 4 Vesta, 624 Hektor, 87 Sylvia, 16 Psyche and 25143 Itokawa may all have varied internal structures. They can be divided into large-scale and small-scale cases. The varied internal structures of large-scale bodies always result from internal pressure gradients, which lead to compactness differences in the inner material. However, the heterogeneity of a small-scale body is always reflected by the different densities of different areas, which may originate from formation by the collision of multiple objects. We propose a modeling procedure for heterogeneous bodies derived from the conventional polyhedral method and then compare its dynamical characteristics with those of the homogeneous case. It is found that zero-velocity curves, positions of equilibrium points, types of bifurcations in the continuation of the orbital family and the stabilities of periodic orbits near the heterogeneous body are different from those in the homogeneous case. The suborbicular orbits near the equatorial plane are potential parking orbits for a future mission, so we discuss the switching of the orbital stability of the family because it has fundamental significance for orbit maintenance and operations around actual asteroids.

  11. Early warning of climate tipping points

    NASA Astrophysics Data System (ADS)

    Lenton, Timothy M.

    2011-07-01

    A climate 'tipping point' occurs when a small change in forcing triggers a strongly nonlinear response in the internal dynamics of part of the climate system, qualitatively changing its future state. Human-induced climate change could push several large-scale 'tipping elements' past a tipping point. Candidates include irreversible melt of the Greenland ice sheet, dieback of the Amazon rainforest and shift of the West African monsoon. Recent assessments give an increased probability of future tipping events, and the corresponding impacts are estimated to be large, making them significant risks. Recent work shows that early warning of an approaching climate tipping point is possible in principle, and could have considerable value in reducing the risk that they pose.

  12. A large scale test of the gaming-enhancement hypothesis.

    PubMed

    Przybylski, Andrew K; Wang, John C

    2016-01-01

    A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people's gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses was compared. Results provided no substantive evidence supporting the idea that having a preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support of the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.
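
    The article's Bayesian analysis is not specified in the abstract; as a rough illustration of weighing a null against an alternative hypothesis with a Bayes factor, the sketch below uses the common BIC approximation on simulated data. The sample size, predictor and outcome are invented for the example, and the approximation is not necessarily the authors' actual method.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 1847
    gaming_hours = rng.gamma(shape=2.0, scale=1.5, size=n)   # simulated predictor
    reasoning = rng.normal(loc=50, scale=10, size=n)          # simulated outcome, unrelated

    # Null model: intercept only; alternative: intercept + gaming hours
    bic_null = sm.OLS(reasoning, np.ones((n, 1))).fit().bic
    bic_alt = sm.OLS(reasoning, sm.add_constant(gaming_hours)).fit().bic

    # Common BIC approximation to the Bayes factor in favour of the null
    bf01 = np.exp((bic_alt - bic_null) / 2.0)
    print(f"BF01 approx. {bf01:.1f} (values > 1 favour the null hypothesis)")
    ```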

  13. Large-scale shrimp farming in coastal wetlands of Venezuela, South America: Causes and consequences of land-use conflicts

    NASA Astrophysics Data System (ADS)

    Sebastiani, Mirady; González, Sara Elena; Castillo, María Mercedes; Alvizu, Pablo; Oliveira, María Albertina; Pérez, Jorge; Quilici, Antonio; Rada, Martín; Yáber, María Carolina; Lentino, Miguel

    1994-09-01

    In Venezuela, large-scale shrimp farming began in the 1980s. By 1987, the Ministry of Environment and Natural Resources (MARNR) had received 14 proposals for approval. A developer illegally started the construction of ponds at the Píritu Lagoon in the State of Anzoátegui before the authorization process was completed. This action triggered a land-use conflict. This study identifies the causes for public protest and determines the consequences of this conflict for land-use management. The results show that public protest was based on the impacts of the partial construction of ponds. These impacts were related to direct removal of wetlands, interruption of natural patterns of surface flows, and alteration of feeding grounds of some bird species with migratory status. Consequences were identified in relation to the role that nongovernmental organizations (NGOs) play in land-use conflicts and the actions that MARNR could take in the future to prevent and solve similar situations.

  14. A Review of Control Strategy of the Large-scale of Electric Vehicles Charging and Discharging Behavior

    NASA Astrophysics Data System (ADS)

    Kong, Lingyu; Han, Jiming; Xiong, Wenting; Wang, Hao; Shen, Yaqi; Li, Ying

    2017-05-01

    Large-scale access of electric vehicles to the grid will pose major challenges to its safe operation, so it is important to control the charging and discharging of electric vehicles. This paper first outlines the impact of electric vehicle charging behaviour on the grid in terms of power quality and network losses. It then reviews and summarizes control strategies for electric vehicle charging and discharging, grouped into direct and indirect control: direct control regulates charging behaviour by controlling the charging and discharging power of the vehicles, whereas indirect control does so through the price of charging and discharging. For the convenience of the reader, the paper also outlines a complete research approach for studying such control strategies, taking into account their adaptability and the possibility of failure. Finally, suggestions for key areas of future research are put forward.

  15. Possible roles for fronto-striatal circuits in reading disorder

    PubMed Central

    Hancock, Roeland; Richlan, Fabio; Hoeft, Fumiko

    2016-01-01

    Several studies have reported hyperactivation in frontal and striatal regions in individuals with reading disorder (RD) during reading-related tasks. Hyperactivation in these regions is typically interpreted as a form of neural compensation and related to articulatory processing. Fronto-striatal hyperactivation in RD can, however, also arise from fundamental impairments in reading-related processes, such as phonological processing and implicit sequence learning relevant to early language acquisition. We review current evidence for the compensation hypothesis in RD and apply large-scale reverse inference to investigate anatomical overlap between hyperactivation regions and neural systems for articulation, phonological processing, and implicit sequence learning. We found anatomical convergence between hyperactivation regions and regions supporting articulation, consistent with the proposed compensatory role of these regions, and low convergence with phonological and implicit sequence learning regions. Although the application of large-scale reverse inference to decode function in a clinical population should be interpreted cautiously, our findings suggest future lines of research that may clarify the functional significance of hyperactivation in RD. PMID:27826071

  16. OpenSoC Fabric

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-08-21

    Recent advancements in technology scaling have shown a trend towards greater integration, with large-scale chips containing thousands of processors connected to memories and other I/O devices using non-trivial network topologies. Software simulation proves insufficient to study the tradeoffs in such complex systems due to slow execution time, whereas hardware RTL development is too time-consuming. We present OpenSoC Fabric, an on-chip network generation infrastructure which aims to provide a parameterizable and powerful on-chip network generator for evaluating future high performance computing architectures based on SoC technology. OpenSoC Fabric leverages a new hardware DSL, Chisel, which contains powerful abstractions provided by its base language, Scala, and generates both software (C++) and hardware (Verilog) models from a single code base. The OpenSoC Fabric infrastructure is modeled after existing state-of-the-art simulators, offers a large and powerful collection of configuration options, and follows object-oriented design and functional programming to make functionality extension as easy as possible.

  17. Assessing large-scale wildlife responses to human infrastructure development

    PubMed Central

    Torres, Aurora; Jaeger, Jochen A. G.; Alonso, Juan Carlos

    2016-01-01

    Habitat loss and deterioration represent the main threats to wildlife species, and are closely linked to the expansion of roads and human settlements. Unfortunately, large-scale effects of these structures remain generally overlooked. Here, we analyzed the European transportation infrastructure network and found that 50% of the continent is within 1.5 km of transportation infrastructure. We present a method for assessing the impacts from infrastructure on wildlife, based on functional response curves describing density reductions in birds and mammals (e.g., road-effect zones), and apply it to Spain as a case study. The imprint of infrastructure extends over most of the country (55.5% in the case of birds and 97.9% for mammals), with moderate declines predicted for birds (22.6% of individuals) and severe declines predicted for mammals (46.6%). Despite certain limitations, we suggest the approach proposed is widely applicable to the evaluation of effects of planned infrastructure developments under multiple scenarios, and propose an internationally coordinated strategy to update and improve it in the future. PMID:27402749

  18. Assessing large-scale wildlife responses to human infrastructure development.

    PubMed

    Torres, Aurora; Jaeger, Jochen A G; Alonso, Juan Carlos

    2016-07-26

    Habitat loss and deterioration represent the main threats to wildlife species, and are closely linked to the expansion of roads and human settlements. Unfortunately, large-scale effects of these structures remain generally overlooked. Here, we analyzed the European transportation infrastructure network and found that 50% of the continent is within 1.5 km of transportation infrastructure. We present a method for assessing the impacts from infrastructure on wildlife, based on functional response curves describing density reductions in birds and mammals (e.g., road-effect zones), and apply it to Spain as a case study. The imprint of infrastructure extends over most of the country (55.5% in the case of birds and 97.9% for mammals), with moderate declines predicted for birds (22.6% of individuals) and severe declines predicted for mammals (46.6%). Despite certain limitations, we suggest the approach proposed is widely applicable to the evaluation of effects of planned infrastructure developments under multiple scenarios, and propose an internationally coordinated strategy to update and improve it in the future.

  19. LES versus DNS: A comparative study

    NASA Technical Reports Server (NTRS)

    Shtilman, L.; Chasnov, J. R.

    1992-01-01

    We have performed Direct Numerical Simulations (DNS) and Large Eddy Simulations (LES) of forced isotropic turbulence at moderate Reynolds numbers. The subgrid-scale model used in the LES is based on an eddy viscosity which instantaneously adjusts the energy spectrum of the LES to that of the DNS. The statistics of the large scales of the DNS (the filtered DNS field, or fDNS) are compared to those of the LES. We present results for the transfer spectra, the skewness and flatness factors of the velocity components, the PDFs of the angle between the vorticity and the eigenvectors of the rate of strain, and of that between the vorticity and the vorticity stretching tensor. The above LES statistics are found to be in good agreement with those measured in the fDNS field. We further observe that in all the numerical measurements, the trend was for the LES field to be more Gaussian than the fDNS field. Future research on this point is planned.
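
    For readers unfamiliar with the quantities compared here, the short sketch below computes the skewness and flatness factors of a single velocity component; the Gaussian random field used as input is only a stand-in, so the printed values are illustrative rather than turbulence statistics.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    u = rng.standard_normal(64 ** 3)             # stand-in for one velocity component

    skewness = stats.skew(u)
    flatness = stats.kurtosis(u, fisher=False)   # flatness factor; 3.0 for a Gaussian field
    print(f"skewness = {skewness:.3f}, flatness = {flatness:.3f}")
    ```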

  20. Asymmetric response of tropical cyclone activity to global warming over the North Atlantic and western North Pacific from CMIP5 model projections

    NASA Astrophysics Data System (ADS)

    Park, Doo-Sun R.; Ho, Chang-Hoi; Chan, Johnny C. L.; Ha, Kyung-Ja; Kim, Hyeong-Seog; Kim, Jinwon; Kim, Joo-Hong

    2017-01-01

    Recent improvements in the theoretical understanding of the relationship between tropical cyclones (TCs) and their large-scale environments have resulted in significant improvements in the skill for forecasting TC activity at daily and seasonal time-scales. However, future changes in TC activity under a warmer climate remain uncertain, particularly in terms of TC genesis locations and subsequent pathways. Applying a track-pattern-based statistical model to 22 Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs for the historical period and the future period corresponding to the Representative Concentration Pathway 8.5 emissions scenario, this study shows that in future climate conditions, TC passage frequency will decrease over the North Atlantic, particularly in the Gulf of Mexico, but will increase over the western North Pacific, especially of TCs that hit Korea and Japan. Unlike previous studies based on fine-resolution models, an ensemble mean of CMIP5 models projects an increase in TC activity in the western North Pacific, owing to enhanced subtropical deep convection and favorable dynamic conditions therein in conjunction with the expansion of the tropics, and vice versa for the North Atlantic. Our results suggest that North America will experience fewer TC landfalls, while northeast Asia will experience more TCs than in the present-day climate.

  1. Characterizing the EPODE logic model: unravelling the past and informing the future.

    PubMed

    Van Koperen, T M; Jebb, S A; Summerbell, C D; Visscher, T L S; Romon, M; Borys, J M; Seidell, J C

    2013-02-01

    EPODE ('Ensemble Prévenons l'Obésité Des Enfants' or 'Together let's Prevent Childhood Obesity') is a large-scale, centrally coordinated, capacity-building approach for communities to implement effective and sustainable strategies to prevent childhood obesity. Since 2004, EPODE has been implemented in over 500 communities in six countries. Although based on emergent practice and scientific knowledge, EPODE, like many community programs, lacks a logic model depicting the key elements of the approach. The objective of this study is to gain insight into the dynamics and key elements of EPODE and to represent these in a schematic logic model. EPODE's process manuals and documents were collected and interviews were held with professionals involved in the planning and delivery of EPODE. The retrieved data were coded, themed and placed in a four-level logic model. With input from international experts, this model was scaled down to a concise logic model covering four critical components: political commitment, public and private partnerships, social marketing and evaluation. The EPODE logic model presented here can be used as a reference for future and follow-up research; to support future implementation of EPODE in communities; as a tool in the engagement of stakeholders; and to guide the construction of a locally tailored evaluation plan. © 2012 The Authors. obesity reviews © 2012 International Association for the Study of Obesity.

  2. Asymmetric response of tropical cyclone activity to global warming over the North Atlantic and western North Pacific from CMIP5 model projections.

    PubMed

    Park, Doo-Sun R; Ho, Chang-Hoi; Chan, Johnny C L; Ha, Kyung-Ja; Kim, Hyeong-Seog; Kim, Jinwon; Kim, Joo-Hong

    2017-01-30

    Recent improvements in the theoretical understanding of the relationship between tropical cyclones (TCs) and their large-scale environments have resulted in significant improvements in the skill for forecasting TC activity at daily and seasonal time-scales. However, future changes in TC activity under a warmer climate remain uncertain, particularly in terms of TC genesis locations and subsequent pathways. Applying a track-pattern-based statistical model to 22 Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs for the historical period and the future period corresponding to the Representative Concentration Pathway 8.5 emissions scenario, this study shows that in future climate conditions, TC passage frequency will decrease over the North Atlantic, particularly in the Gulf of Mexico, but will increase over the western North Pacific, especially of TCs that hit Korea and Japan. Unlike previous studies based on fine-resolution models, an ensemble mean of CMIP5 models projects an increase in TC activity in the western North Pacific, owing to enhanced subtropical deep convection and favorable dynamic conditions therein in conjunction with the expansion of the tropics, and vice versa for the North Atlantic. Our results suggest that North America will experience fewer TC landfalls, while northeast Asia will experience more TCs than in the present-day climate.

  3. Parallel computing for probabilistic fatigue analysis

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Lua, Yuan J.; Smith, Mark D.

    1993-01-01

    This paper presents the results of Phase I research to investigate the most effective parallel processing software strategies and hardware configurations for probabilistic structural analysis. We investigate the efficiency of both shared- and distributed-memory architectures via a probabilistic fatigue life analysis problem. We also present a parallel programming approach, the virtual shared-memory paradigm, that is applicable across both types of hardware. Using this approach, problems can be solved on a variety of parallel configurations, including networks of single- or multiprocessor workstations. We conclude that it is possible to effectively parallelize probabilistic fatigue analysis codes; however, special strategies will be needed to achieve large-scale parallelism, to keep large numbers of processors busy, and to treat problems with the large memory requirements encountered in practice. We also conclude that distributed-memory architecture is preferable to shared-memory architecture for achieving large-scale parallelism; however, in the future, the currently emerging hybrid-memory architectures will likely be optimal.
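
    As a loose illustration of the kind of task parallelism discussed (not the report's virtual shared-memory implementation), the sketch below runs a toy Monte Carlo fatigue-life calculation across worker processes with Python's multiprocessing module; the Basquin-type S-N parameters and distributions are hypothetical.

    ```python
    import numpy as np
    from multiprocessing import Pool

    def fatigue_life_sample(seed):
        """Draw random material and loading parameters and return cycles to
        failure from a simple Basquin-type S-N relation (hypothetical values)."""
        rng = np.random.default_rng(seed)
        stress_amplitude = rng.normal(300.0, 30.0)        # MPa
        fatigue_strength_coeff = rng.normal(900.0, 50.0)  # MPa
        basquin_exponent = -0.085
        return (stress_amplitude / fatigue_strength_coeff) ** (1.0 / basquin_exponent)

    if __name__ == "__main__":
        # Each sample is independent, so the work maps naturally onto a pool of workers
        with Pool(processes=4) as pool:
            lives = np.array(pool.map(fatigue_life_sample, range(10000)))
        print(f"probability of failure before 1e6 cycles: {(lives < 1e6).mean():.3f}")
    ```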

  4. k-neighborhood Decentralization: A Comprehensive Solution to Index the UMLS for Large Scale Knowledge Discovery

    PubMed Central

    Xiang, Yang; Lu, Kewei; James, Stephen L.; Borlawsky, Tara B.; Huang, Kun; Payne, Philip R.O.

    2011-01-01

    The Unified Medical Language System (UMLS) is the largest thesaurus in the biomedical informatics domain. Previous works have shown that knowledge constructs comprised of transitively-associated UMLS concepts are effective for discovering potentially novel biomedical hypotheses. However, the extremely large size of the UMLS becomes a major challenge for these applications. To address this problem, we designed a k-neighborhood Decentralization Labeling Scheme (kDLS) for the UMLS, and the corresponding method to effectively evaluate the kDLS indexing results. kDLS provides a comprehensive solution for indexing the UMLS for very efficient large scale knowledge discovery. We demonstrated that it is highly effective to use kDLS paths to prioritize disease-gene relations across the whole genome, with extremely high fold-enrichment values. To our knowledge, this is the first indexing scheme capable of supporting efficient large scale knowledge discovery on the UMLS as a whole. Our expectation is that kDLS will become a vital engine for retrieving information and generating hypotheses from the UMLS for future medical informatics applications. PMID:22154838
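
    The abstract does not spell out the labeling scheme, so the sketch below is only one plausible reading of a k-neighborhood label: each node stores the set of nodes reachable within k hops, and two concepts whose labels share an entry are linked by a path of length at most 2k. The toy graph and identifiers are invented, and the code is not the authors' kDLS implementation.

    ```python
    from collections import deque

    def k_neighborhood_labels(graph, k):
        """For each node, record every node reachable within k hops (BFS).
        graph: dict mapping a node to an iterable of neighbouring nodes."""
        labels = {}
        for source in graph:
            seen = {source: 0}
            queue = deque([source])
            while queue:
                node = queue.popleft()
                if seen[node] == k:
                    continue          # do not expand beyond k hops
                for neighbour in graph.get(node, ()):
                    if neighbour not in seen:
                        seen[neighbour] = seen[node] + 1
                        queue.append(neighbour)
            labels[source] = set(seen)
        return labels

    def linked_within_2k(labels, a, b):
        """Two concepts are linked by a path of length <= 2k when their
        k-neighbourhood labels share at least one node."""
        return bool(labels[a] & labels[b])

    # Toy concept graph (hypothetical identifiers, not real UMLS CUIs)
    toy = {"C1": ["C2"], "C2": ["C1", "C3"], "C3": ["C2", "C4"], "C4": ["C3"]}
    labels = k_neighborhood_labels(toy, k=2)
    print(linked_within_2k(labels, "C1", "C4"))   # True: C1-C2-C3-C4 is within 4 hops
    ```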

  5. k-Neighborhood decentralization: a comprehensive solution to index the UMLS for large scale knowledge discovery.

    PubMed

    Xiang, Yang; Lu, Kewei; James, Stephen L; Borlawsky, Tara B; Huang, Kun; Payne, Philip R O

    2012-04-01

    The Unified Medical Language System (UMLS) is the largest thesaurus in the biomedical informatics domain. Previous works have shown that knowledge constructs comprised of transitively-associated UMLS concepts are effective for discovering potentially novel biomedical hypotheses. However, the extremely large size of the UMLS becomes a major challenge for these applications. To address this problem, we designed a k-neighborhood Decentralization Labeling Scheme (kDLS) for the UMLS, and the corresponding method to effectively evaluate the kDLS indexing results. kDLS provides a comprehensive solution for indexing the UMLS for very efficient large scale knowledge discovery. We demonstrated that it is highly effective to use kDLS paths to prioritize disease-gene relations across the whole genome, with extremely high fold-enrichment values. To our knowledge, this is the first indexing scheme capable of supporting efficient large scale knowledge discovery on the UMLS as a whole. Our expectation is that kDLS will become a vital engine for retrieving information and generating hypotheses from the UMLS for future medical informatics applications. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. A phylogeny and revised classification of Squamata, including 4161 species of lizards and snakes

    PubMed Central

    2013-01-01

    Background The extant squamates (>9400 known species of lizards and snakes) are one of the most diverse and conspicuous radiations of terrestrial vertebrates, but no studies have attempted to reconstruct a phylogeny for the group with large-scale taxon sampling. Such an estimate is invaluable for comparative evolutionary studies, and to address their classification. Here, we present the first large-scale phylogenetic estimate for Squamata. Results The estimated phylogeny contains 4161 species, representing all currently recognized families and subfamilies. The analysis is based on up to 12896 base pairs of sequence data per species (average = 2497 bp) from 12 genes, including seven nuclear loci (BDNF, c-mos, NT3, PDC, R35, RAG-1, and RAG-2), and five mitochondrial genes (12S, 16S, cytochrome b, ND2, and ND4). The tree provides important confirmation for recent estimates of higher-level squamate phylogeny based on molecular data (but with more limited taxon sampling), estimates that are very different from previous morphology-based hypotheses. The tree also includes many relationships that differ from previous molecular estimates and many that differ from traditional taxonomy. Conclusions We present a new large-scale phylogeny of squamate reptiles that should be a valuable resource for future comparative studies. We also present a revised classification of squamates at the family and subfamily level to bring the taxonomy more in line with the new phylogenetic hypothesis. This classification includes new, resurrected, and modified subfamilies within gymnophthalmid and scincid lizards, and boid, colubrid, and lamprophiid snakes. PMID:23627680

  7. Measurement equivalence of seven selected items of posttraumatic growth between black and white adult survivors of Hurricane Katrina.

    PubMed

    Rhodes, Alison M; Tran, Thanh V

    2013-02-01

    This study examined the equivalence or comparability of the measurement properties of seven selected items measuring posttraumatic growth among self-identified Black (n = 270) and White (n = 707) adult survivors of Hurricane Katrina, using data from the Baseline Survey of the Hurricane Katrina Community Advisory Group Study. Internal consistency reliability was equally good for both groups (Cronbach's alphas = .79), as were correlations between individual scale items and their respective overall scale. Confirmatory factor analysis of a congeneric measurement model of seven selected items of posttraumatic growth showed adequate measures of fit for both groups. The results showed only small variation in magnitude of factor loadings and measurement errors between the two samples. Tests of measurement invariance showed mixed results, but overall indicated that factor loading, error variance, and factor variance were similar between the two samples. These seven selected items can be useful for future large-scale surveys of posttraumatic growth.
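
    For reference, Cronbach's alpha for a multi-item scale is the ratio-based quantity sketched below; the simulated seven-item responses are invented for illustration and do not reproduce the reported alpha of .79.

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents x n_items) response matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()   # sum of item variances
        total_variance = items.sum(axis=1).var(ddof=1)     # variance of the scale total
        return (k / (k - 1)) * (1.0 - item_variances / total_variance)

    # Simulated responses to seven Likert-type items (illustration only)
    rng = np.random.default_rng(1)
    shared = rng.normal(size=(500, 1))                     # common factor across items
    responses = np.clip(np.rint(3.5 + shared + rng.normal(scale=0.8, size=(500, 7))), 1, 6)
    print(f"alpha = {cronbach_alpha(responses):.2f}")
    ```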

  8. Factorial validity and measurement equivalence of the Client Assessment of Treatment Scale for psychiatric inpatient care - a study in three European countries.

    PubMed

    Richardson, Michelle; Katsakou, Christina; Torres-González, Francisco; Onchev, George; Kallert, Thomas; Priebe, Stefan

    2011-06-30

    Patients' views of inpatient care need to be assessed for research and routine evaluation. For this a valid instrument is required. The Client Assessment of Treatment Scale (CAT) has been used in large scale international studies, but its psychometric properties have not been well established. The structural validity of the CAT was tested among involuntary inpatients with psychosis. Data from locations in three separate European countries (England, Spain and Bulgaria) were collected. The factorial validity was initially tested using single sample confirmatory factor analyses in each country. Subsequent multi-sample analyses were used to test for invariance of the factor loadings, and factor variances across the countries. Results provide good initial support for the factorial validity and invariance of the CAT scores. Future research is needed to cross-validate these findings and to generalise them to other countries, treatment settings, and patient populations. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Non-linear scaling of oxygen consumption and heart rate in a very large cockroach species (Gromphadorhina portentosa): correlated changes with body size and temperature.

    PubMed

    Streicher, Jeffrey W; Cox, Christian L; Birchard, Geoffrey F

    2012-04-01

    Although well documented in vertebrates, correlated changes between metabolic rate and cardiovascular function of insects have rarely been described. Using the very large cockroach species Gromphadorhina portentosa, we examined oxygen consumption and heart rate across a range of body sizes and temperatures. Metabolic rate scaled positively and heart rate negatively with body size, but neither scaled linearly. The response of these two variables to temperature was similar. This correlated response to endogenous (body mass) and exogenous (temperature) variables is likely explained by a mutual dependence on similar metabolic substrate use and/or coupled regulatory pathways. The intraspecific scaling for oxygen consumption rate showed an apparent plateauing at body masses greater than about 3 g. An examination of cuticle mass across all instars revealed isometric scaling with no evidence of an ontogenetic shift towards proportionally larger cuticles. Published oxygen consumption rates of other Blattodea species were also examined and, as in our intraspecific examination of G. portentosa, the scaling relationship was found to be non-linear with a decreasing slope at larger body masses. The decreasing slope at very large body masses in both intraspecific and interspecific comparisons may have important implications for future investigations of the relationship between oxygen transport and maximum body size in insects.
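
    One common way to check for the non-linear (plateauing) scaling described here is to compare linear and quadratic fits on log-log axes; the sketch below does this on simulated mass and oxygen-consumption data, so the coefficients are illustrative rather than the study's estimates.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    mass = np.exp(rng.uniform(np.log(0.2), np.log(8.0), size=80))        # body mass, g
    # Simulated oxygen consumption whose log-log slope flattens at large mass
    log_vo2 = 0.8 * np.log(mass) - 0.05 * np.log(mass) ** 2 + rng.normal(scale=0.1, size=80)

    x = np.log(mass)
    # Linear (single power law) versus quadratic (curvature) fits on log-log axes
    lin_coef = np.polyfit(x, log_vo2, 1)
    quad_coef = np.polyfit(x, log_vo2, 2)
    rss_lin = np.sum((log_vo2 - np.polyval(lin_coef, x)) ** 2)
    rss_quad = np.sum((log_vo2 - np.polyval(quad_coef, x)) ** 2)
    print(f"residual SS: linear {rss_lin:.3f} vs quadratic {rss_quad:.3f}")
    print("negative quadratic term (slope decreasing with mass):", quad_coef[0] < 0)
    ```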

  10. Combinatorial Approach for Large-scale Identification of Linked Peptides from Tandem Mass Spectrometry Spectra*

    PubMed Central

    Wang, Jian; Anania, Veronica G.; Knott, Jeff; Rush, John; Lill, Jennie R.; Bourne, Philip E.; Bandeira, Nuno

    2014-01-01

    The combination of chemical cross-linking and mass spectrometry has recently been shown to constitute a powerful tool for studying protein–protein interactions and elucidating the structure of large protein complexes. However, computational methods for interpreting the complex MS/MS spectra from linked peptides are still in their infancy, making the high-throughput application of this approach largely impractical. Because of the lack of large annotated datasets, most current approaches do not capture the specific fragmentation patterns of linked peptides and therefore are not optimal for the identification of cross-linked peptides. Here we propose a generic approach to address this problem and demonstrate it using disulfide-bridged peptide libraries to (i) efficiently generate large mass spectral reference data for linked peptides at a low cost and (ii) automatically train an algorithm that can efficiently and accurately identify linked peptides from MS/MS spectra. We show that using this approach we were able to identify thousands of MS/MS spectra from disulfide-bridged peptides through comparison with proteome-scale sequence databases and significantly improve the sensitivity of cross-linked peptide identification. This allowed us to identify 60% more direct pairwise interactions between the protein subunits in the 20S proteasome complex than existing tools on cross-linking studies of the proteasome complexes. The basic framework of this approach and the MS/MS reference dataset generated should be valuable resources for the future development of new tools for the identification of linked peptides. PMID:24493012

  11. The Effect of a State Department of Education Teacher Mentor Initiative on Science Achievement

    NASA Astrophysics Data System (ADS)

    Pruitt, Stephen L.; Wallace, Carolyn S.

    2012-06-01

    This study investigated the effectiveness of a southern state's department of education program to improve science achievement through embedded professional development of science teachers in the lowest performing schools. The Science Mentor Program provided content and inquiry-based coaching by teacher leaders to science teachers in their own classrooms. The study analyzed the mean scale scores for the science portion of the state's high school graduation test for the years 2004 through 2007 to determine whether schools receiving the intervention scored significantly higher than comparison schools receiving no intervention. The results showed that all schools achieved significant improvement of scale scores between 2004 and 2007, but there were no significant performance differences between intervention and comparison schools, nor were there any significant differences between various subgroups in intervention and comparison schools. However, one subgroup, economically disadvantaged (ED) students, from high-level intervention schools closed the achievement gap with ED students from no-intervention schools across the period of the study. The study provides important information to guide future research on and design of large-scale professional development programs to foster inquiry-based science.

  12. Consistency of Aquarius version-4 sea surface salinity with Argo products on various spatial and temporal scales

    NASA Astrophysics Data System (ADS)

    Lee, Tong

    2017-04-01

    Understanding the accuracy of satellite-derived sea surface salinity (SSS) measurements in depicting temporal changes, and how that accuracy depends on spatiotemporal scale, is important for capability assessment, future mission design, and applications to oceanic phenomena of different spatiotemporal scales. This study quantifies the consistency of Aquarius Version-4 monthly gridded SSS (released in late 2015) with two widely used Argo monthly gridded near-surface salinity products. The analysis focuses on their consistency in depicting temporal changes (both seasonal and non-seasonal) on various spatial scales: 1°x1°, 3°x3°, and 10°x10°. Globally averaged standard deviation (STD) values of the Aquarius-Argo salinity differences on these three spatial scales are 0.16, 0.14, and 0.09 psu, compared with 0.10, 0.09, and 0.04 psu between the two Argo products. Aquarius SSS compares better with Argo data on non-seasonal (e.g., interannual and intraseasonal) time scales than on seasonal time scales. The seasonal Aquarius-Argo SSS differences are mostly concentrated at high latitudes, and the Aquarius team is making active efforts to further reduce these high-latitude seasonal biases. The consistency between Aquarius and Argo salinity is similar to that between the two Argo products in the tropics and subtropics for non-seasonal signals, and in the tropics for seasonal signals. Therefore, the representativeness errors of the Argo products at various spatial scales (related to sampling and gridding) need to be taken into account when estimating the uncertainty of Aquarius SSS. The globally averaged uncertainty of large-scale (10°x10°) non-seasonal Aquarius SSS is approximately 0.04 psu. These estimates reflect the significant improvements of Aquarius Version-4 SSS over previous versions. They can be used as baseline requirements for future ocean salinity missions from space, and their spatial distribution is also useful for the assimilation of Aquarius SSS.
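
    As an illustration of the scale-dependent comparison described above, the following sketch block-averages two 1-degree monthly SSS fields onto 3-degree and 10-degree boxes and computes the globally averaged STD of their difference; the arrays are placeholder random fields, not the Aquarius or Argo products:

        # Minimal sketch, assuming two gridded monthly SSS fields of shape
        # (months, 180, 360) on a 1-degree grid.
        import numpy as np

        def coarsen(field, factor):
            """Block-average a (time, lat, lon) field onto factor x factor degree boxes."""
            t, ny, nx = field.shape
            return field.reshape(t, ny // factor, factor, nx // factor, factor).mean(axis=(2, 4))

        rng = np.random.default_rng(0)
        aquarius = rng.normal(0.0, 0.2, (36, 180, 360))          # placeholder data (psu)
        argo = aquarius + rng.normal(0.0, 0.1, aquarius.shape)   # placeholder data (psu)

        for factor in (1, 3, 10):
            diff = coarsen(aquarius, factor) - coarsen(argo, factor)
            print(f"{factor} deg boxes: global-mean STD = {diff.std(axis=0).mean():.3f} psu")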

  13. Cosmological consistency tests of gravity theory and cosmic acceleration

    NASA Astrophysics Data System (ADS)

    Ishak-Boushaki, Mustapha B.

    2017-01-01

    Testing general relativity at cosmological scales and probing the cause of cosmic acceleration are among the important objectives targeted by incoming and future astronomical surveys and experiments. I present our recent results on consistency tests that can provide insights into the underlying gravity theory and cosmic acceleration using cosmological data sets. We use statistical measures, the rate of cosmic expansion, the growth rate of large-scale structure, and the physical consistency of these probes with one another.

  14. Self-Organized Evolution of Sandy Coastline Shapes: Connections with Shoreline Erosion Problems

    NASA Astrophysics Data System (ADS)

    Murray, A. B.; Ashton, A.

    2002-12-01

    Landward movement of the shoreline severely impacts property owners and communities where structures and infrastructure are built near the coast. While sea level rise will increase the average rate of coastal erosion, even a slight gradient in wave-driven alongshore sediment flux will locally overwhelm that effect, causing either shoreline accretion or enhanced erosion. Recent analysis shows that because of the nonlinear relationship between alongshore sediment flux and the angle between deep-water wave crests and local shoreline orientation, in some wave climates a straight coastline is unstable (Ashton et al., Nature, 2001). When deep-water waves approach from angles greater than the one that maximizes alongshore flux, sediment flux diverges in concave-seaward shoreline segments, causing erosion. Similarly, convex regions such as the crests of perturbations on an otherwise straight shoreline will experience accretion; perturbations will grow. When waves approach from smaller angles, the sign of the relationship between shoreline curvature and shoreline change is reversed, but any deviation from a perfectly straight coastline will still result in alongshore-inhomogeneous shoreline change. A numerical model designed to explore the long-term effects of this instability operating over a spatially extended alongshore domain has shown that as perturbations grow to finite amplitude and interact with each other, large-scale coastline structures can emerge. The character of the local and non-local interactions, and the resulting emergent structures, depends on the wave climate. The 100-km scale capes and cuspate forelands that form much of the coast of the Carolinas, USA, provide one possible natural example. Our modeling suggests that on such a shoreline, continued interactions between large-scale structures will cause continued large-scale change in coastline shape. Consequently, some coastline segments will tend to experience accentuated erosion. Communities established in these areas face discouraging future prospects. Attempts can be made to arrest the shoreline retreat on large scales, for example through large beach nourishment projects or policies that allow pervasive hard stabilization (e.g., seawalls, jetties) along a coastline segment. However, even if such attempts are successful for a significant period of time, the pinning in place of some parts of an otherwise dynamic system will change the large-scale evolution of the coastline, altering the future erosion/accretion experienced at other, perhaps distant, locations. Simple properties of alongshore sediment transport could also be relevant to alongshore-inhomogeneous shoreline change (including erosion 'hot spots') on shorter time scales and smaller spatial scales. We are comparing predictions arising from the modeling, and from analysis of alongshore transport as a function of shoreline orientation, to recent observations of shoreline change ranging across spatial scales from 100s of meters to 10s of kilometers, and time scales from days to decades (List and Farris, Coastal Sediments, 1999; Tebbens et al., PNAS, 2002). Considering that many other processes and factors can also influence shoreline change, initial results show a surprising degree of correlation between observations and predictions.
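
    The instability mechanism can be illustrated with a simplified deep-water alongshore-flux relation of the form Q proportional to cos^(6/5)(phi)*sin(phi), where phi is the angle between deep-water wave crests and the local shoreline (the exact coefficients vary between formulations); the angle that maximizes Q separates the diffusive low-angle regime from the unstable high-angle regime described above. A minimal sketch:

        # Minimal sketch: relative alongshore flux as a function of the deep-water
        # wave angle phi, under a simplified cos**(6/5) * sin flux relation.
        import numpy as np

        phi = np.radians(np.linspace(0.0, 89.0, 900))
        q = np.cos(phi) ** (6.0 / 5.0) * np.sin(phi)       # relative flux (arbitrary units)

        phi_max = np.degrees(phi[np.argmax(q)])
        print(f"flux-maximizing deep-water angle ~ {phi_max:.1f} degrees")
        # Waves approaching at angles > phi_max make a straight shoreline unstable
        # (bumps accrete and grow); at angles < phi_max perturbations are smoothed.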

  15. Application of Landscape Mosaic Technology to Complement Coral Reef Resource Mapping and Monitoring

    DTIC Science & Technology

    2010-10-01

    irregular shapes pose a challenge for divers trying to delimit live tissue boundaries. Future improvements in the 3D representation of benthic mosaics...benthic habitats can be especially challenging when the spatial extent of injuries exceeds tens of square meters. These large injuries are often too...the impacts of severe physical disturbance on coral reefs can be especially challenging when large-scale modifications to the reef structure takes

  16. The Burn Wound Microenvironment

    PubMed Central

    Rose, Lloyd F.; Chan, Rodney K.

    2016-01-01

    Significance: While the survival rate of the severely burned patient has improved significantly, relatively little progress has been made in treatment or prevention of burn-induced long-term sequelae, such as contraction and fibrosis. Recent Advances: Our knowledge of the molecular pathways involved in burn wounds has increased dramatically, and technological advances now allow large-scale genomic studies, providing a global view of wound healing processes. Critical Issues: Translating findings from a large number of in vitro and preclinical animal studies into clinical practice represents a gap in our understanding, and the failures of a number of clinical trials suggest that targeting single pathways or cytokines may not be the best approach. Significant opportunities for improvement exist. Future Directions: Study of the underlying molecular influences of burn wound healing progression will undoubtedly continue as an active research focus. Increasing our knowledge of these processes will identify additional therapeutic targets, supporting informed clinical studies that translate into clinical relevance and practice. PMID:26989577

  17. Investigation of multilayer domains in large-scale CVD monolayer graphene by optical imaging

    NASA Astrophysics Data System (ADS)

    Yu, Yuanfang; Li, Zhenzhen; Wang, Wenhui; Guo, Xitao; Jiang, Jie; Nan, Haiyan; Ni, Zhenhua

    2017-03-01

    CVD graphene is a promising candidate for optoelectronic applications due to its high quality and high yield. However, multilayer domains inevitably form at nucleation centers during growth. Here, we propose an optical imaging technique to precisely identify multilayer domains, and the ratio of their coverage, in large-scale CVD monolayer graphene. We also show that stacking disorder in twisted bilayer graphene, as well as impurities on the graphene surface, can be distinguished by optical imaging. Finally, we investigated the effects of bilayer domains on the optical and electrical properties of CVD graphene and found that the carrier mobility of CVD graphene is seriously limited by scattering from bilayer domains. Our results could be useful for guiding future optoelectronic applications of large-scale CVD graphene. Project supported by the National Natural Science Foundation of China (Nos. 61422503, 61376104), the Open Research Funds of Key Laboratory of MEMS of Ministry of Education (SEU, China), and the Fundamental Research Funds for the Central Universities.
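
    The paper's exact image-analysis procedure is not described in this record; as a rough illustration of estimating multilayer coverage from optical contrast (bilayer regions typically appear darker than monolayer regions on SiO2/Si), a minimal thresholding sketch with a placeholder image follows:

        # Rough illustration only: estimate multilayer coverage by thresholding a
        # grayscale optical image; the image, patch, and threshold are placeholders.
        import numpy as np

        rng = np.random.default_rng(1)
        image = rng.normal(0.70, 0.02, (512, 512))     # placeholder monolayer background
        image[200:260, 100:180] = 0.55                 # darker "bilayer" patch

        bilayer_mask = image < 0.62                    # assumed contrast threshold
        coverage = bilayer_mask.mean() * 100.0
        print(f"multilayer coverage ~ {coverage:.1f}% of the imaged area")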

  18. Statistical Analysis of Large-Scale Structure of Universe

    NASA Astrophysics Data System (ADS)

    Tugay, A. V.

    While galaxy cluster catalogs were compiled many decades ago, other structural elements of the cosmic web have been detected with confidence only in the most recent work. For example, extragalactic filaments have been traced in recent years through velocity fields and the SDSS galaxy distribution. The large-scale structure of the Universe could also be mapped in the future using ATHENA observations in X-rays and SKA observations in the radio band. Until detailed observations become available for most of the volume of the Universe, integral statistical parameters can be used to describe it. Methods such as the galaxy correlation function, the power spectrum, statistical moments, and peak statistics are commonly used for this purpose. The parameters of the power spectrum and of other statistics are important for constraining models of dark matter, dark energy, inflation, and brane cosmology. In the present work we describe the growth of large-scale density fluctuations in the one- and three-dimensional cases using Fourier harmonics of the hydrodynamical parameters. As a result, we obtain a power-law relation for the matter power spectrum.
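
    As an illustration of one of the integral statistics mentioned above, the following sketch estimates an isotropic matter power spectrum P(k) from a three-dimensional density-contrast field by FFT and spherical binning in |k|; the field is a placeholder Gaussian random field and the normalization follows one common convention:

        # Minimal sketch: P(k) estimate from a 3-D density-contrast field delta(x).
        import numpy as np

        n, box = 64, 100.0                                         # grid points per side, box size
        delta = np.random.default_rng(2).normal(size=(n, n, n))    # placeholder field

        dk = np.fft.fftn(delta)
        pk3d = box**3 * np.abs(dk / n**3) ** 2                     # one common normalization

        k1d = 2.0 * np.pi * np.fft.fftfreq(n, d=box / n)
        kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing="ij")
        kmag = np.sqrt(kx**2 + ky**2 + kz**2)

        edges = np.linspace(kmag[kmag > 0].min(), kmag.max(), 21)
        idx = np.digitize(kmag.ravel(), edges)
        pk = [pk3d.ravel()[idx == i].mean() for i in range(1, len(edges)) if np.any(idx == i)]
        print(np.round(pk[:5], 3))                                 # binned P(k) at the smallest k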

  19. Genetic overlap between type 2 diabetes and major depressive disorder identified by bioinformatics analysis.

    PubMed

    Ji, Hong-Fang; Zhuang, Qi-Shuai; Shen, Liang

    2016-04-05

    Our study investigated the shared genetic etiology underlying type 2 diabetes (T2D) and major depressive disorder (MDD) by analyzing large-scale genome-wide association study statistics. A total of 496 shared SNPs associated with both T2D and MDD were identified at p-values ≤ 1.0E-07. Functional enrichment analysis showed that the enriched pathways pertained to immune responses (Fc gamma R-mediated phagocytosis, T cell and B cell receptor signaling), cell signaling (MAPK, Wnt signaling), lipid metabolism, and cancer-associated pathways. The findings have potential implications for future interventional studies of the two diseases.
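
    The screening step described (intersecting two sets of GWAS associations at p-value ≤ 1.0E-07) amounts to a join on SNP identifier; the column names and toy records in the sketch below are hypothetical, not the study's data:

        # Minimal sketch: keep SNPs passing the p-value cut in both traits.
        import pandas as pd

        P_CUT = 1.0e-7
        t2d = pd.DataFrame({"SNP": ["rs1", "rs2", "rs3"], "P": [5e-9, 2e-6, 8e-8]})   # hypothetical
        mdd = pd.DataFrame({"SNP": ["rs1", "rs3", "rs4"], "P": [3e-8, 4e-7, 1e-9]})   # hypothetical

        shared = t2d.merge(mdd, on="SNP", suffixes=("_t2d", "_mdd"))
        shared = shared[(shared["P_t2d"] <= P_CUT) & (shared["P_mdd"] <= P_CUT)]
        print(f"{len(shared)} SNPs associated with both traits at p <= {P_CUT:g}")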

  20. On the Possibilities of Predicting Geomagnetic Secular Variation with Geodynamo Modeling

    NASA Technical Reports Server (NTRS)

    Kuang, Wei-Jia; Tangborn, Andrew; Sabaka, Terrance

    2004-01-01

    We use our MoSST core dynamics model and the geomagnetic field at the core-mantle boundary (CMB), downward-continued from surface observations, to investigate the possibilities of geomagnetic data assimilation, so that model results and current geomagnetic observations can be used to predict geomagnetic secular variation in the future. As a first attempt, we apply a data-insertion technique to examine the evolution of the model solution after it is modified by geomagnetic input. Our study demonstrates that, with a single data insertion, the large-scale poloidal magnetic field obtained from subsequent numerical simulation evolves similarly to the observed geomagnetic variation, regardless of the initial choice of the model solution (so long as it is a well-developed numerical solution). The model solution diverges on time scales of order 60 years, similar to the time scales of the torsional oscillations in the Earth's core. Our numerical test shows that geomagnetic data assimilation is promising with our MoSST model.
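
    The MoSST implementation is not shown in this record; as a schematic of the direct-insertion idea (replacing the observationally resolved, large-scale part of the model field and leaving the rest of the state unchanged before restarting the forward integration), a minimal sketch with placeholder coefficients follows:

        # Schematic only: overwrite low-degree field coefficients of the model state
        # with observation-derived values; all arrays below are placeholders.
        import numpy as np

        L_OBS = 13                                 # highest degree resolved by observations (assumed)
        degree = np.arange(1, 31)                  # model spherical-harmonic degrees 1..30
        model_coeffs = np.random.default_rng(3).normal(size=degree.size)   # placeholder model state
        obs_coeffs = np.random.default_rng(4).normal(size=degree.size)     # placeholder observations

        inserted = np.where(degree <= L_OBS, obs_coeffs, model_coeffs)
        # The modified field would then serve as the initial condition for the next
        # forward integration of the dynamo model.
        print(inserted[:5])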
