Sample records for building large scale

  1. Mixing Metaphors: Building Infrastructure for Large Scale School Turnaround

    ERIC Educational Resources Information Center

    Peurach, Donald J.; Neumerski, Christine M.

    2015-01-01

    The purpose of this analysis is to increase understanding of the possibilities and challenges of building educational infrastructure--the basic, foundational structures, systems, and resources--to support large-scale school turnaround. Building educational infrastructure often exceeds the capacity of schools, districts, and state education…

  2. What Will the Neighbors Think? Building Large-Scale Science Projects Around the World

    ScienceCinema

    Jones, Craig; Mrotzek, Christian; Toge, Nobu; Sarno, Doug

    2017-12-22

    Public participation is an essential ingredient for turning the International Linear Collider into a reality. Wherever the proposed particle accelerator is sited in the world, its neighbors -- in any country -- will have something to say about hosting a 35-kilometer-long collider in their backyards. When it comes to building large-scale physics projects, almost every laboratory has a story to tell. Three case studies from Japan, Germany and the US will be presented to examine how community relations are handled in different parts of the world. How do particle physics laboratories interact with their local communities? How do neighbors react to building large-scale projects in each region? How can the lessons learned from past experiences help in building the next big project? These and other questions will be discussed to engage the audience in an active dialogue about how a large-scale project like the ILC can be a good neighbor.

  3. A comprehensive study on urban true orthorectification

    USGS Publications Warehouse

    Zhou, G.; Chen, W.; Kelmelis, J.A.; Zhang, Dongxiao

    2005-01-01

    To provide advanced technical bases (algorithms and procedures) and the experience needed for national large-scale digital orthophoto generation and for revision of the Standards for National Large-Scale City Digital Orthophoto in the National Digital Orthophoto Program (NDOP), this paper presents a comprehensive study of the theories, algorithms, and methods of large-scale urban orthoimage generation. The procedures of orthorectification for digital terrain model (DTM)-based and digital building model (DBM)-based orthoimage generation, and their merging for true orthoimage generation, are discussed in detail. A method of compensating for building occlusions using photogrammetric geometry is developed. The data structure needed to model urban buildings for accurately generating urban orthoimages is presented. Shadow detection and removal, the optimization of seamlines for automatic mosaicking, and the radiometric balancing of neighboring images are discussed. Street visibility, including the relationship between flight height, building height, street width, and the relative location of the street to the imaging center, is analyzed for complete true orthoimage generation. The experimental results demonstrated that our method can effectively and correctly orthorectify the displacements caused by terrain and buildings in large-scale urban aerial images. © 2005 IEEE.
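
The building displacements this abstract sets out to correct follow the classic relief-displacement relation of photogrammetry (a textbook formula, not the authors' specific algorithm): a roof point at radial distance r from the nadir, on a building of height h imaged from flying height H, is shifted radially by d = r·h/H at ground scale.

```python
# Relief-displacement sketch (standard photogrammetry, not the paper's code).
# The displaced strip behind a building is exactly the occluded street area
# that true-orthoimage generation must fill from a neighboring image.

def relief_displacement(r: float, h: float, H: float) -> float:
    """Radial displacement at ground scale (same units as r) for a roof
    point at radial distance r, building height h, flying height H."""
    if H <= h:
        raise ValueError("flying height must exceed building height")
    return r * h / H

# Example: a 30 m building, 400 m from nadir, flown at 1200 m
# is displaced by 400 * 30 / 1200 = 10 m at ground scale.
print(relief_displacement(400.0, 30.0, 1200.0))
```

This is why the abstract ties street visibility to flight height, building height, and street width: when d exceeds the street width behind the building, the street is fully occluded in that image.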

  4. Building the Foundations for a Large-Scale, Cross-Sector Collaboration for a Sustainable and Permanent Return to the Lunar Surface

    NASA Astrophysics Data System (ADS)

    Kapoglou, A.

    2017-10-01

    This presentation will describe how to build the foundations needed for a large scale, cross-industry collaboration to enable a sustainable and permanent return to the Moon based on system leadership, cross-sector partnership, and inclusive business.

  5. Single-trabecula building block for large-scale finite element models of cancellous bone.

    PubMed

    Dagan, D; Be'ery, M; Gefen, A

    2004-07-01

    Recent development of high-resolution imaging of cancellous bone allows finite element (FE) analysis of bone tissue stresses and strains in individual trabeculae. However, specimen-specific stress/strain analyses can include effects of anatomical variations and local damage that can bias the interpretation of the results from individual specimens with respect to large populations. This study developed a standard (generic) 'building-block' of a trabecula for large-scale FE models. Being parametric and based on statistics of dimensions of ovine trabeculae, this building block can be scaled for trabecular thickness and length and be used in commercial or custom-made FE codes to construct generic, large-scale FE models of bone, using less computer power than that currently required to reproduce the accurate micro-architecture of trabecular bone. Orthogonal lattices constructed with this building block, after it was scaled to trabeculae of the human proximal femur, provided apparent elastic moduli of approximately 150 MPa, in good agreement with experimental data for the stiffness of cancellous bone from this site. Likewise, lattices with thinner, osteoporotic-like trabeculae could predict a reduction of approximately 30% in the apparent elastic modulus, as reported in experimental studies of osteoporotic femora. Based on these comparisons, it is concluded that the single-trabecula element developed in the present study is well-suited for representing cancellous bone in large-scale generic FE simulations.
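
The thickness-to-stiffness relationship behind the osteoporotic comparison can be illustrated with the standard Gibson-Ashby open-cell scaling (a textbook cellular-solids relation, not the paper's FE formulation): for a strut lattice, relative density scales as (t/l)² and apparent modulus as the square of relative density, so modulus scales roughly as (t/l)⁴.

```python
# Illustrative cellular-solids scaling (Gibson-Ashby open-cell model; the
# exponent and the thickness ratio below are assumptions for illustration,
# not outputs of the paper's single-trabecula FE element).

def apparent_modulus_ratio(t_ratio: float) -> float:
    """Ratio of apparent elastic moduli for two lattices differing only in
    trabecular thickness t (same strut length l): E2/E1 = (t2/t1)**4."""
    return t_ratio ** 4

# Thinning trabeculae by ~8.5% predicts roughly the ~30% apparent-modulus
# loss the abstract reports for osteoporotic-like lattices.
print(1.0 - apparent_modulus_ratio(0.915))
```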

  6. Analysis on the restriction factors of the green building scale promotion based on DEMATEL

    NASA Astrophysics Data System (ADS)

    Wenxia, Hong; Zhenyao, Jiang; Zhao, Yang

    2017-03-01

    To promote the large-scale development of green building in China, the DEMATEL method was used to classify the factors influencing green building development into three parts: the green building market, green technology, and the macro economy. Through the DEMATEL model, the interaction mechanism of each part was analyzed. The degree of mutual influence of each barrier factor affecting green building promotion was quantitatively analyzed, and the key factors for the development of green building in China were determined. In addition, implementation strategies for promoting green building at scale in China were put forward. This research provides reference and practical value for policy-making on green building promotion.
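
The DEMATEL computation itself is compact: normalize a direct-influence matrix, then accumulate all indirect influence paths through the total-relation matrix T = D(I − D)⁻¹. The sketch below uses the textbook formulation with a small hypothetical influence matrix, not the paper's survey data.

```python
import numpy as np

# Minimal DEMATEL sketch (textbook formulation). The 3x3 direct-influence
# matrix A is hypothetical; rows/columns stand for market, technology, economy.
A = np.array([[0, 3, 2],
              [2, 0, 1],
              [3, 2, 0]], dtype=float)

# Normalize by the largest row sum, then compute the total-relation matrix,
# which sums direct influence plus every chain of indirect influence.
D = A / A.sum(axis=1).max()
T = D @ np.linalg.inv(np.eye(3) - D)

r = T.sum(axis=1)      # influence each factor exerts on the system
c = T.sum(axis=0)      # influence each factor receives
prominence = r + c     # overall importance ("key factor" ranking)
relation = r - c       # > 0: net cause; < 0: net effect
print(prominence, relation)
```

Factors with high prominence and positive relation are the causal levers a promotion policy would target first.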

  7. Construction of large scale switch matrix by interconnecting integrated optical switch chips with EDFAs

    NASA Astrophysics Data System (ADS)

    Liao, Mingle; Wu, Baojian; Hou, Jianhong; Qiu, Kun

    2018-03-01

    Large-scale optical switches are essential components in optical communication networks. We aim to build a large-scale optical switch matrix by interconnecting silicon-based optical switch chips in a 3-stage Clos structure, where EDFAs are needed to compensate for the insertion loss of the chips. The optical signal-to-noise ratio (OSNR) performance of the resulting large-scale optical switch matrix is investigated for TE-mode light, and the experimental results agree with the theoretical analysis. We build a 64 × 64 switch matrix from 16 × 16 optical switch chips; the OSNR and receiver sensitivity can be improved by 0.6 dB and 0.2 dB, respectively, by optimizing the gain configuration of the EDFAs.
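
The chip count for a 3-stage Clos fabric follows directly from its parameters. The sketch below is back-of-envelope arithmetic for the generic Clos construction (an illustration of the structure, not the authors' design tool): splitting N ports as N = n·r gives r ingress chips, m middle chips, and r egress chips, with m = n for a rearrangeably non-blocking fabric and m = 2n − 1 for a strictly non-blocking one.

```python
# Generic 3-stage Clos sizing sketch (illustrative, not the paper's tool).
# Every signal traverses three chips, so three insertion losses accumulate,
# which is what motivates the inter-stage EDFAs in the abstract above.

def clos_chip_count(N: int, n: int, rearrangeable: bool = True) -> int:
    """Number of switch elements in a 3-stage Clos fabric for N ports,
    grouping n ports per ingress/egress chip."""
    r = N // n                       # ingress chips (and egress chips)
    m = n if rearrangeable else 2 * n - 1  # middle-stage chips
    return r + m + r

# 64x64 with 16-port groupings: 4 ingress + 16 middle + 4 egress = 24 elements.
print(clos_chip_count(64, 16))
```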

  8. Advances in Multi-Sensor Scanning and Visualization of Complex Plants: the Utmost Case of a Reactor Building

    NASA Astrophysics Data System (ADS)

    Hullo, J.-F.; Thibault, G.; Boucheny, C.

    2015-02-01

    In a context of increased maintenance operations and workers' generational renewal, a nuclear owner and operator like Electricité de France (EDF) is interested in the scaling up of tools and methods of "as-built virtual reality" for larger buildings and wider audiences. However, acquisition and sharing of as-built data on a large scale (large and complex multi-floored buildings) challenge current scientific and technical capacities. In this paper, we first present a state of the art of scanning tools and methods for industrial plants with very complex architecture. Then, we introduce the inner characteristics of the multi-sensor scanning and visualization of the interior of the most complex building of a power plant: a nuclear reactor building. We introduce several developments that made possible a first complete survey of such a large building, from acquisition, processing and fusion of multiple data sources (3D laser scans, total-station survey, RGB panoramic, 2D floor plans, 3D CAD as-built models). In addition, we present the concepts of a smart application developed for the painless exploration of the whole dataset. The goal of this application is to help professionals, unfamiliar with the manipulation of such datasets, to take into account spatial constraints induced by the building complexity while preparing maintenance operations. Finally, we discuss the main feedback from this large experiment, the remaining issues for the generalization of such large-scale surveys and the future technical and scientific challenges in the field of industrial "virtual reality".

  9. a Method for the Seamlines Network Automatic Selection Based on Building Vector

    NASA Astrophysics Data System (ADS)

    Li, P.; Dong, Y.; Hu, Y.; Li, X.; Tan, P.

    2018-04-01

    In order to improve the efficiency of large-scale orthophoto production for cities, this paper presents a method for the automatic selection of the seamlines network in large-scale orthophotos based on building vectors. First, a simple model of each building is built by combining the building's vector, its height, and the DEM, and the imaging area of the building on a single DOM is obtained. Then, the initial Voronoi network of the survey area is automatically generated from the nadir (bottom) points of all images. Finally, the final seamlines network is obtained by automatically optimizing all nodes and seamlines in the network based on the imaging areas of the buildings. The experimental results show that the proposed method not only routes the seamlines network around buildings quickly, but also retains the minimum-projection-distortion property of the Voronoi network, effectively solving the problem of automatic seamline network selection in orthophoto mosaicking.
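
The initial step described above, partitioning the survey area into a Voronoi network around the image nadir points, can be sketched on a small grid: assign each ground cell to its nearest nadir point, and the boundaries between assignments are the initial seamlines. The coordinates below are hypothetical, and this nearest-point labelling is a stand-in for the paper's actual Voronoi construction.

```python
import numpy as np

# Discrete Voronoi sketch: label each ground cell of a 10x10 area with the
# index of its nearest image nadir point (hypothetical points). Cell-label
# boundaries form the initial seamline network before building-aware
# refinement pushes seams around building footprints.
nadir_points = np.array([[2.0, 2.0], [8.0, 2.0], [5.0, 8.0]])

ys, xs = np.mgrid[0:10, 0:10]
grid = np.stack([xs, ys], axis=-1).astype(float)           # grid[y, x] = (x, y)
d = np.linalg.norm(grid[:, :, None, :] - nadir_points, axis=-1)
owner = d.argmin(axis=-1)                                  # Voronoi label per cell
print(owner)
```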

  10. Large-Scale Assessment, Rationality, and Scientific Management: The Case of No Child Left Behind

    ERIC Educational Resources Information Center

    Roach, Andrew T.; Frank, Jennifer

    2007-01-01

    This article examines the ways in which NCLB and the movement towards large-scale assessment systems are based on Weber's concept of formal rationality and tradition of scientific management. Building on these ideas, the authors use Ritzer's McDonaldization thesis to examine some of the core features of large-scale assessment and accountability…

  11. Development of Residential Prototype Building Models and Analysis System for Large-Scale Energy Efficiency Studies Using EnergyPlus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendon, Vrushali V.; Taylor, Zachary T.

    Recent advances in residential building energy efficiency and codes have resulted in increased interest in detailed residential building energy models using the latest energy simulation software. One of the challenges of developing residential building models to characterize new residential building stock is allowing enough flexibility to address variability in house features such as geometry, configuration, and HVAC systems. Researchers solved this problem in a novel way by creating a simulation structure capable of creating fully functional EnergyPlus batch runs using a completely scalable residential EnergyPlus template system. This system was used to create a set of thirty-two residential prototype building models covering single- and multifamily buildings, four common foundation types, and four common heating system types found in the United States (US). A weighting scheme with detailed state-wise and national weighting factors was designed to supplement the residential prototype models. The complete set is designed to represent the majority of new residential construction stock. The entire structure consists of a system of utility programs developed around the core EnergyPlus simulation engine to automate the creation and management of large-scale simulation studies with minimal human effort. The simulation structure and the residential prototype building models have been used for numerous large-scale studies, one of which is briefly discussed in this paper.

  12. Demonstration of reduced-order urban scale building energy models

    DOE PAGES

    Heidarinejad, Mohammad; Mattise, Nicholas; Dahlhausen, Matthew; ...

    2017-09-08

    The aim of this study is to demonstrate a developed framework to rapidly create urban-scale reduced-order building energy models using a systematic summary of the simplifications required for the representation of building exteriors and thermal zones. These urban-scale reduced-order models rely on the contribution of influential variables to the internal, external, and system thermal loads. The OpenStudio Application Programming Interface (API) serves as a tool to automate the process of model creation and demonstrate the developed framework. The results of this study show that the accuracy of the developed reduced-order building energy models varies only up to 10% with the selection of different thermal zones. In addition, to assess the complexity of the developed reduced-order building energy models, this study develops a novel framework to quantify the complexity of building energy models. Consequently, this study empowers building energy modelers to systematically quantify their building energy models in order to report model complexity alongside model accuracy. An exhaustive analysis of four university campuses suggests that urban neighborhood buildings lend themselves to simplified typical shapes. Specifically, building energy modelers can utilize the developed typical shapes to represent more than 80% of the U.S. buildings documented in the CBECS database. One of the main benefits of this framework is the opportunity for different models, including airflow and solar radiation models, to share the same exterior representation, allowing a unified exchange of data. Altogether, the results of this study have implications for the large-scale modeling of buildings in support of urban energy consumption analyses or the assessment of a large number of alternative solutions in support of retrofit decision-making in the building industry.

  13. Demonstration of reduced-order urban scale building energy models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heidarinejad, Mohammad; Mattise, Nicholas; Dahlhausen, Matthew

    The aim of this study is to demonstrate a developed framework to rapidly create urban-scale reduced-order building energy models using a systematic summary of the simplifications required for the representation of building exteriors and thermal zones. These urban-scale reduced-order models rely on the contribution of influential variables to the internal, external, and system thermal loads. The OpenStudio Application Programming Interface (API) serves as a tool to automate the process of model creation and demonstrate the developed framework. The results of this study show that the accuracy of the developed reduced-order building energy models varies only up to 10% with the selection of different thermal zones. In addition, to assess the complexity of the developed reduced-order building energy models, this study develops a novel framework to quantify the complexity of building energy models. Consequently, this study empowers building energy modelers to systematically quantify their building energy models in order to report model complexity alongside model accuracy. An exhaustive analysis of four university campuses suggests that urban neighborhood buildings lend themselves to simplified typical shapes. Specifically, building energy modelers can utilize the developed typical shapes to represent more than 80% of the U.S. buildings documented in the CBECS database. One of the main benefits of this framework is the opportunity for different models, including airflow and solar radiation models, to share the same exterior representation, allowing a unified exchange of data. Altogether, the results of this study have implications for the large-scale modeling of buildings in support of urban energy consumption analyses or the assessment of a large number of alternative solutions in support of retrofit decision-making in the building industry.

  14. Pretest predictions for the response of a 1:8-scale steel LWR containment building model to static overpressurization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clauss, D.B.

    The analyses used to predict the response of a 1:8-scale model of a steel LWR containment building to static overpressurization are described and results are presented. Finite strain, large displacement, and nonlinear material properties were accounted for using finite element methods. Three-dimensional models were needed to analyze the penetrations, which included operable equipment hatches, personnel lock representations, and a constrained pipe. It was concluded that the scale model would fail due to leakage caused by large deformations of the equipment hatch sleeves. 13 refs., 34 figs., 1 tab.

  15. Project Management Life Cycle Models to Improve Management in High-rise Construction

    NASA Astrophysics Data System (ADS)

    Burmistrov, Andrey; Siniavina, Maria; Iliashenko, Oksana

    2018-03-01

    The paper describes a possibility to improve project management in high-rise building construction through the use of various Project Management Life Cycle Models (PMLC models) based on traditional and agile project management approaches. Moreover, the paper describes how splitting a large-scale project into a "project chain" improves the manageability of large-scale building projects and increases the efficiency of the activities of all participants in such projects.

  16. Building to Scale: An Analysis of Web-Based Services in CIC (Big Ten) Libraries.

    ERIC Educational Resources Information Center

    Dewey, Barbara I.

    Advancing library services in large universities requires creative approaches for "building to scale." This is the case for the Committee on Institutional Cooperation (CIC, Big Ten) libraries, whose home institutions serve thousands of students, faculty, staff, and others. Developing virtual Web-based services is an increasingly viable…

  17. Scalable methodology for large scale building energy improvement: Relevance of calibration in model-based retrofit analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heo, Yeonsook; Augenbroe, Godfried; Graziano, Diane

    2015-05-01

    The increasing interest in retrofitting existing buildings is motivated by the need to make a major contribution to enhancing building energy efficiency and reducing energy consumption and CO2 emissions by the built environment. This paper examines the relevance of calibration in model-based analysis to support decision-making for energy and carbon efficiency retrofits of individual buildings and portfolios of buildings. The authors formulate a set of real retrofit decision-making situations and evaluate the role of calibration by using a case study that compares predictions and decisions from an uncalibrated model with those of a calibrated model. The case study illustrates both the mechanics and outcomes of a practical alternative to the expert- and time-intensive application of dynamic energy simulation models for large-scale retrofit decision-making under uncertainty.

  18. Ry Horsey | NREL

    Science.gov Websites

    Ry Horsey, Software Developer - Commercial Buildings Energy Modeling. Ry works in the field of commercial building energy modeling and is particularly interested in tools to support large-scale commercial building energy modeling. This work has led to contributions to…

  19. Ignition and flame-growth modeling on realistic building and landscape objects in changing environments

    Treesearch

    Mark A. Dietenberger

    2010-01-01

    Effective mitigation of external fires on structures can be achieved flexibly, economically, and aesthetically by (1) preventing large-area ignition on structures by avoiding close proximity of burning vegetation; and (2) stopping flame travel from firebrands landing on combustible building objects. Using bench-scale and mid-scale fire tests to obtain flammability...

  20. Ignition and flame travel on realistic building and landscape objects in changing environments

    Treesearch

    Mark A. Dietenberger

    2007-01-01

    Effective mitigation of external fires on structures can be achieved flexibly, economically, and aesthetically by (1) preventing large-area ignition on structures from close proximity of burning vegetations and (2) stopping flame travel from firebrands landing on combustible building objects. In using bench-scale and mid-scale fire tests to obtain fire growth...

  1. Large-scale self-assembly of uniform submicron silver sulfide material driven by precise pressure control

    NASA Astrophysics Data System (ADS)

    Qi, Juanjuan; Chen, Ke; Zhang, Shuhao; Yang, Yun; Guo, Lin; Yang, Shihe

    2017-03-01

    The controllable self-assembly of nanosized building blocks into larger specific structures can provide an efficient method of synthesizing novel materials with excellent properties. The self-assembly of nanocrystals by assisted means is becoming an extremely active area of research, because it provides a method of producing large-scale advanced functional materials with potential applications in the areas of energy, electronics, optics, and biology. In this study, we applied an efficient strategy, namely, the use of 'pressure control', to the assembly of silver sulfide (Ag2S) nanospheres with a diameter of approximately 33 nm into large-scale, uniform Ag2S sub-microspheres with a size of about 0.33 μm. More importantly, this strategy realizes the online control of the overall reaction system, including the pressure, reaction time, and temperature, and could also be used to easily fabricate other functional materials on an industrial scale. Moreover, the thermodynamic and kinetic parameters for the thermal decomposition of silver diethyldithiocarbamate (Ag(DDTC)) are also investigated to explore the formation mechanism of the Ag2S nanosized building blocks, which can be assembled into uniform sub-micron-scale architectures. As a method of producing sub-micron Ag2S particles by means of the pressure-controlled self-assembly of nanoparticles, we foresee this strategy being an efficient and universally applicable option for constructing other new building blocks and assembling novel and large functional micromaterials on an industrial scale.

  2. Developing building-damage scales for lahars: application to Merapi volcano, Indonesia

    NASA Astrophysics Data System (ADS)

    Jenkins, Susanna F.; Phillips, Jeremy C.; Price, Rebecca; Feloy, Kate; Baxter, Peter J.; Hadmoko, Danang Sri; de Bélizal, Edouard

    2015-09-01

    Lahar damage to buildings can include burial by sediment and/or failure of walls, infiltration into the building, and subsequent damage to contents. The extent to which a building is damaged is dictated by the dynamic characteristics of the lahar, i.e. the velocity, depth, sediment concentration and grain size, as well as the structural characteristics and setting of the building in question. The focus of this paper is on quantifying how buildings may respond to impact by lahars. We consider the potential for lahar damage to buildings on Merapi volcano, Indonesia, as a result of the voluminous deposits produced during the large (VEI 4) eruption in 2010. A building-damage scale has been developed that categorises likely lahar damage levels and, through theoretical calculations of expected building resistance to impact, approximate ranges of impact pressures. We found that most weak masonry buildings on Merapi would be destroyed by dilute lahars with relatively low velocities (ca. 3 m/s) and pressures (ca. 5 kPa); however, the majority of stronger rubble-stone buildings may be expected to withstand higher velocities (to 6 m/s) and pressures (to 20 kPa). We applied this preliminary damage scale to a large lahar in the Putih River on 9 January 2011, which inundated and caused extensive building damage in the village of Gempol, 16 km southwest of Merapi. The scale was applied remotely, through the use of public satellite images, and through field studies to categorise damage and estimate impact pressures and velocities within the village. Results were compared with those calculated independently from Manning's equation for flow velocity and depth within Gempol village, using an estimate of flow velocity at one upstream site as input. The results of this calculation showed reasonable agreement with an average channel velocity derived from travel-time observations. The calculated distribution of flow velocities across the area of damaged buildings was consistent with building damage as classified by the new damage scale. The complementary results, even given the basic nature of the tools and data, suggest that the damage scale provides a valid representation of the failure mode that is consistent with estimates of the flow conditions. The use of open-source simplified tools and data in producing these consistent findings is very promising.
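
The Manning's-equation step mentioned above is standard open-channel hydraulics and is easy to sketch; the channel numbers below are hypothetical placeholders, not the Gempol survey data, and the simple stagnation-style pressure p = ρv² is one common estimator of flow impact pressure, not necessarily the authors' exact formulation.

```python
# Open-channel velocity and impact-pressure sketch (textbook relations;
# the roughness, geometry, and density values are illustrative assumptions).

def manning_velocity(n: float, R: float, S: float) -> float:
    """Mean flow velocity (m/s): v = (1/n) * R**(2/3) * S**(1/2),
    with Manning roughness n, hydraulic radius R (m), slope S (m/m)."""
    return (1.0 / n) * R ** (2.0 / 3.0) * S ** 0.5

def dynamic_pressure(rho: float, v: float) -> float:
    """Stagnation-style impact pressure (Pa): p = rho * v**2."""
    return rho * v ** 2

v = manning_velocity(n=0.05, R=1.0, S=0.02)   # rough natural channel
p = dynamic_pressure(rho=1500.0, v=v)         # assumed lahar bulk density
print(v, p / 1000.0)                          # velocity (m/s), pressure (kPa)
```

With these assumed inputs the velocity comes out near 2.8 m/s and the pressure near 12 kPa, i.e. in the range the abstract associates with destruction of weak masonry but survival of stronger rubble-stone buildings.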

  3. Why build a virtual brain? Large-scale neural simulations as jump start for cognitive computing

    NASA Astrophysics Data System (ADS)

    Colombo, Matteo

    2017-03-01

    Despite the impressive amount of financial resources recently invested in carrying out large-scale brain simulations, it is controversial what the pay-offs are of pursuing this project. One idea is that from designing, building, and running a large-scale neural simulation, scientists acquire knowledge about the computational performance of the simulating system, rather than about the neurobiological system represented in the simulation. It has been claimed that this knowledge may usher in a new era of neuromorphic, cognitive computing systems. This study elucidates this claim and argues that the main challenge this era is facing is not the lack of biological realism. The challenge lies in identifying general neurocomputational principles for the design of artificial systems, which could display the robust flexibility characteristic of biological intelligence.

  4. Finite difference and Runge-Kutta methods for solving vibration problems

    NASA Astrophysics Data System (ADS)

    Lintang Renganis Radityani, Scolastika; Mungkasi, Sudi

    2017-11-01

    The vibration of a multi-storey building can be modelled as a system of second-order ordinary differential equations. If the number of floors of a building is large, the result is a large-scale system of second-order ordinary differential equations. Such a large-scale system is difficult to solve, and even when it can be solved, the solution may not be accurate. Therefore, in this paper, we seek accurate methods for solving vibration problems. We compare the performance of numerical finite difference and Runge-Kutta methods for solving large-scale systems of second-order ordinary differential equations. The finite difference methods include the forward and central differences. The Runge-Kutta methods include the Euler and Heun methods. Our results show that the central finite difference and Heun methods produce more accurate solutions than the forward finite difference and Euler methods do.
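
The accuracy gap the abstract reports can be reproduced on the smallest possible instance of the problem: a single undamped oscillator x'' = −x (one "floor"), integrated with an Euler step and a Heun (predictor-corrector) step and compared against the exact solution x(t) = cos(t). This is a generic illustration of the first-order vs second-order methods named above, not the paper's own test case.

```python
import math

# One-floor vibration model x'' = -x, rewritten as x' = v, v' = -x.

def step_euler(x, v, h):
    """Forward Euler step (first-order accurate)."""
    return x + h * v, v - h * x

def step_heun(x, v, h):
    """Heun step: Euler predictor + trapezoidal corrector (second-order)."""
    xp, vp = x + h * v, v - h * x
    return x + 0.5 * h * (v + vp), v - 0.5 * h * (x + xp)

def integrate(step, h=0.01, t_end=10.0):
    x, v = 1.0, 0.0            # initial displacement 1, at rest
    for _ in range(int(t_end / h)):
        x, v = step(x, v, h)
    return x

err_euler = abs(integrate(step_euler) - math.cos(10.0))
err_heun = abs(integrate(step_heun) - math.cos(10.0))
print(err_euler, err_heun)
```

Euler visibly inflates the oscillation amplitude over 10 seconds, while Heun stays within a small fraction of a percent, mirroring the paper's conclusion that the higher-order methods are the accurate choice for large-scale vibration systems.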

  5. Layer-by-layer assembly of two-dimensional materials into wafer-scale heterostructures

    NASA Astrophysics Data System (ADS)

    Kang, Kibum; Lee, Kan-Heng; Han, Yimo; Gao, Hui; Xie, Saien; Muller, David A.; Park, Jiwoong

    2017-10-01

    High-performance semiconductor films with vertical compositions that are designed to atomic-scale precision provide the foundation for modern integrated circuitry and novel materials discovery. One approach to realizing such films is sequential layer-by-layer assembly, whereby atomically thin two-dimensional building blocks are vertically stacked, and held together by van der Waals interactions. With this approach, graphene and transition-metal dichalcogenides--which represent one- and three-atom-thick two-dimensional building blocks, respectively--have been used to realize previously inaccessible heterostructures with interesting physical properties. However, no large-scale assembly method exists at present that maintains the intrinsic properties of these two-dimensional building blocks while producing pristine interlayer interfaces, thus limiting the layer-by-layer assembly method to small-scale proof-of-concept demonstrations. Here we report the generation of wafer-scale semiconductor films with a very high level of spatial uniformity and pristine interfaces. The vertical composition and properties of these films are designed at the atomic scale using layer-by-layer assembly of two-dimensional building blocks under vacuum. We fabricate several large-scale, high-quality heterostructure films and devices, including superlattice films with vertical compositions designed layer-by-layer, batch-fabricated tunnel device arrays with resistances that can be tuned over four orders of magnitude, band-engineered heterostructure tunnel diodes, and millimetre-scale ultrathin membranes and windows. The stacked films are detachable, suspendable and compatible with water or plastic surfaces, which will enable their integration with advanced optical and mechanical systems.

  6. Layer-by-layer assembly of two-dimensional materials into wafer-scale heterostructures.

    PubMed

    Kang, Kibum; Lee, Kan-Heng; Han, Yimo; Gao, Hui; Xie, Saien; Muller, David A; Park, Jiwoong

    2017-10-12

    High-performance semiconductor films with vertical compositions that are designed to atomic-scale precision provide the foundation for modern integrated circuitry and novel materials discovery. One approach to realizing such films is sequential layer-by-layer assembly, whereby atomically thin two-dimensional building blocks are vertically stacked, and held together by van der Waals interactions. With this approach, graphene and transition-metal dichalcogenides-which represent one- and three-atom-thick two-dimensional building blocks, respectively-have been used to realize previously inaccessible heterostructures with interesting physical properties. However, no large-scale assembly method exists at present that maintains the intrinsic properties of these two-dimensional building blocks while producing pristine interlayer interfaces, thus limiting the layer-by-layer assembly method to small-scale proof-of-concept demonstrations. Here we report the generation of wafer-scale semiconductor films with a very high level of spatial uniformity and pristine interfaces. The vertical composition and properties of these films are designed at the atomic scale using layer-by-layer assembly of two-dimensional building blocks under vacuum. We fabricate several large-scale, high-quality heterostructure films and devices, including superlattice films with vertical compositions designed layer-by-layer, batch-fabricated tunnel device arrays with resistances that can be tuned over four orders of magnitude, band-engineered heterostructure tunnel diodes, and millimetre-scale ultrathin membranes and windows. The stacked films are detachable, suspendable and compatible with water or plastic surfaces, which will enable their integration with advanced optical and mechanical systems.

  7. Danny Studer | NREL

    Science.gov Websites

    Daniel.Studer@nrel.gov | 303-275-4368 Daniel joined NREL in 2009. As a member of the Commercial Buildings group, he uses EnergyPlus to identify large-scale opportunities for reducing and optimizing commercial building energy consumption. Recently, Daniel led NREL's commercial building workforce development efforts and he is leading

  8. Robopedia: Leveraging Sensorpedia for Web-Enabled Robot Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Resseguie, David R

    There is a growing interest in building Internet-scale sensor networks that integrate sensors from around the world into a single unified system. In contrast, robotics application development has primarily focused on building specialized systems. These specialized systems take scalability and reliability into consideration, but generally neglect exploring the key components required to build a large-scale system. Integrating robotic applications with Internet-scale sensor networks will unify specialized robotics applications and provide answers to large-scale implementation concerns. We focus on utilizing Internet-scale sensor network technology to construct a framework for unifying robotic systems. Our framework web-enables a surveillance robot's sensor observations and provides a web interface to the robot's actuators. This lets robots seamlessly integrate into web applications. In addition, the framework eliminates most prerequisite robotics knowledge, allowing for the creation of general web-based robotics applications. The framework also provides mechanisms to create applications that can interface with any robot. Frameworks such as this one are key to solving large-scale mobile robotics implementation problems. We provide an overview of previous Internet-scale sensor networks, Sensorpedia (an ad-hoc Internet-scale sensor network), our framework for integrating robots with Sensorpedia, and two applications which illustrate our framework's ability to support general web-based robotic control, and offer experimental results that illustrate our framework's scalability, feasibility, and resource requirements.
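    The pattern described above, a uniform web layer over a robot's sensors and actuators, can be sketched as follows. This is an illustrative sketch only; the class and feed names are hypothetical and are not part of the Robopedia or Sensorpedia APIs.

```python
# Hypothetical adapter in the spirit described above: sensor observations
# are published as named feeds and actuators are driven through a uniform
# command interface, so a web front end needs no robot-specific knowledge.
class WebRobotAdapter:
    def __init__(self):
        self._sensors = {}    # feed name -> callable returning the latest observation
        self._actuators = {}  # actuator name -> callable accepting a command dict

    def publish_sensor(self, name, read_fn):
        self._sensors[name] = read_fn

    def register_actuator(self, name, command_fn):
        self._actuators[name] = command_fn

    def get_observation(self, name):
        # What a GET on /sensors/<name> would return from the web layer.
        return {"feed": name, "value": self._sensors[name]()}

    def send_command(self, name, command):
        # What a POST to /actuators/<name> would invoke.
        return self._actuators[name](command)
```

    Any robot that exposes callables for its sensors and actuators can be mounted behind the same web interface, which is what makes such a framework robot-agnostic.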

  9. Urban area thermal monitoring: Liepaja case study using satellite and aerial thermal data

    NASA Astrophysics Data System (ADS)

    Gulbe, Linda; Caune, Vairis; Korats, Gundars

    2017-12-01

    The aim of this study is to explore large-scale (60 m/pixel) and small-scale (individual building level) temperature distribution patterns from thermal remote sensing data and to conclude what kind of information could be extracted from thermal remote sensing on a regular basis. The Landsat program provides frequent large-scale thermal images useful for analysis of city temperature patterns. During the study, the correlation of temperature patterns with vegetation content (based on NDVI) and with building coverage (based on OpenStreetMap data) was examined. Landsat-based temperature patterns were independent of the season, negatively correlated with vegetation content and positively correlated with building coverage. Small-scale analysis included spatial and raster descriptor analysis for polygons corresponding to the roofs of individual buildings, for evaluating roof insulation. Remote sensing and spatial descriptors are poorly related to heat consumption data; however, the median and entropy of thermal aerial data can help to identify poorly insulated roofs. Automated quantitative roof analysis has high potential for acquiring city-wide information about roof insulation, but its quality is limited by reference data quality; information on building types and roof materials would be crucial for further studies.
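    The two roof descriptors highlighted above, the median and entropy of a roof's thermal pixel values, can be computed as sketched below. The fixed-width binning used for the entropy term is an assumption for illustration, not the study's exact procedure.

```python
# Illustrative per-roof descriptors from thermal pixel values: the median
# indicates overall roof temperature, while the histogram entropy captures
# how unevenly heat is distributed (patchy insulation -> higher entropy).
import math
from statistics import median

def roof_descriptors(pixels, n_bins=16):
    lo, hi = min(pixels), max(pixels)
    width = (hi - lo) / n_bins or 1.0   # guard against a constant-valued roof
    counts = [0] * n_bins
    for p in pixels:
        counts[min(int((p - lo) / width), n_bins - 1)] += 1
    total = len(pixels)
    entropy = -sum((c / total) * math.log2(c / total) for c in counts if c)
    return median(pixels), entropy
```

    A uniformly warm roof yields entropy near zero, while a roof with hot and cold patches yields a higher value, which is the kind of signal used to flag poor insulation.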

  10. Multistage Security Mechanism For Hybrid, Large-Scale Wireless Sensor Networks

    DTIC Science & Technology

    2007-06-01

    sensor network. Building on research in the areas of wireless sensor networks (WSN) and mobile ad hoc networks (MANET), this thesis proposes an... A wide area network consisting of ballistic missile defense satellites and terrestrial nodes can be viewed as a hybrid, large-scale mobile wireless

  11. Automated Decomposition of Model-based Learning Problems

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Millar, Bill

    1996-01-01

    A new generation of sensor-rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance, these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large-scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.

  12. Should we trust build-up/wash-off water quality models at the scale of urban catchments?

    PubMed

    Bonhomme, Céline; Petrucci, Guido

    2017-01-01

    Models of runoff water quality at the scale of an urban catchment usually rely on build-up/wash-off formulations obtained through small-scale experiments. Often, the physical interpretation of the model parameters, valid at the small scale, is transposed to large-scale applications. Testing different levels of spatial variability, the parameter distributions of a water quality model are obtained in this paper through a Markov chain Monte Carlo algorithm and analyzed. The simulated variable is the total suspended solids concentration at the outlet of a periurban catchment in the Paris region (2.3 km²), for which high-frequency turbidity measurements are available. This application suggests that build-up/wash-off models applied at the catchment scale do not maintain their physical meaning, but should be considered as "black-box" models. Copyright © 2016 Elsevier Ltd. All rights reserved.
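    For context, the build-up/wash-off formulation that such catchment models typically transpose from small-scale experiments can be sketched as below. The exponential forms are a standard choice, but every coefficient value here is an illustrative placeholder, not a calibrated distribution from this paper.

```python
# Sketch of an exponential build-up / wash-off model: the surface stock of
# solids grows between storms, and a rainfall-dependent fraction is washed
# off at each time step. All coefficients are illustrative placeholders.
import math

def simulate_tss(rain_mm_h, dt_h=1.0, b0=10.0,
                 k_acc=2.0, b_max=50.0, k_wash=0.05, w_exp=1.2):
    """Return the solids load washed off at each time step (kg/ha)."""
    b = b0                                   # current surface stock (kg/ha)
    loads = []
    for q in rain_mm_h:                      # q: rainfall intensity (mm/h)
        b = min(b_max, b + k_acc * dt_h * (1 - b / b_max))   # dry build-up
        washed = b * (1 - math.exp(-k_wash * (q ** w_exp) * dt_h))
        b -= washed
        loads.append(washed)
    return loads
```

    The paper's point is precisely that when parameters like these are inferred at the catchment scale, their small-scale physical interpretation no longer holds.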

  13. A cloud-based framework for large-scale traditional Chinese medical record retrieval.

    PubMed

    Liu, Lijun; Liu, Li; Fu, Xiaodong; Huang, Qingsong; Zhang, Xianwen; Zhang, Yin

    2018-01-01

    Electronic medical records are increasingly common in medical practice, and their secondary use has become increasingly important. Secondary use relies on the ability to retrieve complete information about desired patient populations, and effectively and accurately retrieving relevant records from large-scale medical big data is becoming a major challenge. We therefore propose an efficient and robust cloud-based framework for large-scale Traditional Chinese Medical Records (TCMRs) retrieval. First, we propose a parallel index building method and build a distributed search cluster; the former improves the performance of index building, and the latter provides highly concurrent online TCMRs retrieval. Second, a real-time multi-indexing model ensures that the latest relevant TCMRs are indexed and retrieved in real time, while a semantics-based query expansion method and a multi-factor ranking model improve retrieval quality. Third, a template-based visualization method displays medical reports via a friendly web interface, enhancing availability and universality. In conclusion, compared with current medical record retrieval systems, our system offers advantages that improve the secondary use of large-scale traditional Chinese medical records in a cloud environment, integrates more easily with existing clinical systems, and can be used in various scenarios. Copyright © 2017. Published by Elsevier Inc.
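    The multi-factor ranking step can be illustrated as a weighted combination of per-record factors. The specific factors and weights below are assumptions chosen for illustration, not the ranking model proposed in the paper.

```python
# Hypothetical multi-factor ranking: a text-relevance score is blended
# with record recency and completeness, and records are sorted by the
# combined score. Factor names and weights are illustrative only.
def rank_records(records, weights=(0.6, 0.3, 0.1)):
    w_rel, w_rec, w_comp = weights

    def score(r):
        return (w_rel * r["relevance"] +
                w_rec * r["recency"] +
                w_comp * r["completeness"])

    return sorted(records, key=score, reverse=True)
```

    Tuning the weights trades pure text relevance against freshness, which matters when the newest records must surface in real time.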

  14. ENCAPSULATING WASTE DISPOSAL METHODS - PHASE I

    EPA Science Inventory

    The release of chemical and biological agents in a large-scale urban environment would be devastating. The amount of waste generated during such an event would be comparable to a tornado ripping through a town. Building materials, furniture, office materials, building ins...

  15. Implementing Large-Scale Instructional Technology in Kenya: Changing Instructional Practice and Developing Accountability in a National Education System

    ERIC Educational Resources Information Center

    Piper, Benjamin; Oyanga, Arbogast; Mejia, Jessica; Pouezevara, Sarah

    2017-01-01

    Previous large-scale education technology interventions have shown only modest impacts on student achievement. Building on results from an earlier randomized controlled trial of three different applications of information and communication technologies (ICTs) on primary education in Kenya, the Tusome Early Grade Reading Activity developed the…

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Torcellini, P.; Pless, S.; Lobato, C.

    Ongoing work at the National Renewable Energy Laboratory indicates that net-zero energy building (NZEB) status is both achievable and repeatable today. This paper presents a definition framework for classifying NZEBs and a real-life example that demonstrates how a large-scale office building can cost-effectively achieve net-zero energy.

  17. Large Scale Software Building with CMake in ATLAS

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.

  18. Managing large-scale workflow execution from resource provisioning to provenance tracking: The CyberShake example

    USGS Publications Warehouse

    Deelman, E.; Callaghan, S.; Field, E.; Francoeur, H.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T.H.; Kesselman, C.; Maechling, P.; Mehringer, J.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.

    2006-01-01

    This paper discusses the process of building an environment where large-scale, complex scientific analyses can be scheduled onto a heterogeneous collection of computational and storage resources. The example application is the Southern California Earthquake Center (SCEC) CyberShake project, an analysis designed to compute probabilistic seismic hazard curves for sites in the Los Angeles area. We explain which software tools were used to build the system and describe their functionality and interactions. We show the results of running the CyberShake analysis, which included over 250,000 jobs, using resources available through SCEC and the TeraGrid. © 2006 IEEE.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Torcellini, P.; Pless, S.; Lobato, C.

    Until recently, large-scale, cost-effective net-zero energy buildings (NZEBs) were thought to lie decades in the future. However, ongoing work at the National Renewable Energy Laboratory (NREL) indicates that NZEB status is both achievable and repeatable today. This paper presents a definition framework for classifying NZEBs and a real-life example that demonstrates how a large-scale office building can cost-effectively achieve net-zero energy. The vision of NZEBs is compelling. In theory, these highly energy-efficient buildings will produce, during a typical year, enough renewable energy to offset the energy they consume from the grid. The NREL NZEB definition framework classifies NZEBs according to the criteria being used to judge net-zero status and the way renewable energy is supplied to achieve that status. We use the new U.S. Department of Energy/NREL 220,000-ft² Research Support Facilities (RSF) building to illustrate why a clear picture of NZEB definitions is important and how the framework provides a methodology for creating a cost-effective NZEB. The RSF, scheduled to open in June 2010, includes contractual commitments to deliver a Leadership in Energy and Environmental Design (LEED) Platinum rating, an energy use intensity of 25 kBtu/ft² (half that of a typical LEED Platinum office building), and net-zero energy status. We discuss the analysis methods and cost tradeoffs that were applied throughout the design and build phases to meet these commitments and maintain construction costs at $259/ft², as well as ways to achieve large-scale, replicable NZEB performance. Many passive and renewable energy strategies are utilized, including full daylighting, high-performance lighting, natural ventilation through operable windows, thermal mass, transpired solar collectors, radiant heating and cooling, and workstation configurations that allow for maximum daylighting.
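    The headline figures quoted above can be checked with simple arithmetic (using the standard conversion 1 kBtu ≈ 0.293 kWh):

```python
# Back-of-envelope check on the RSF figures quoted in the abstract.
floor_area_ft2 = 220_000          # building size (ft²)
eui_kbtu_per_ft2 = 25             # contractual energy use intensity
cost_per_ft2 = 259                # construction cost ($/ft²)

annual_site_energy_kbtu = floor_area_ft2 * eui_kbtu_per_ft2   # 5.5 million kBtu/yr
annual_site_energy_mwh = annual_site_energy_kbtu * 0.000293071  # roughly 1,612 MWh/yr
construction_cost_usd = floor_area_ft2 * cost_per_ft2         # about $57 million
```

    That annual total is the amount of on-site renewable generation the building must supply, in a typical year, to reach net-zero status.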

  20. Building simulation: Ten challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Langevin, Jared; Sun, Kaiyu

    Buildings consume more than one-third of the world’s primary energy. Reducing energy use and greenhouse-gas emissions in the buildings sector through energy conservation and efficiency improvements constitutes a key strategy for achieving global energy and environmental goals. Building performance simulation has been increasingly used as a tool for designing, operating and retrofitting buildings to save energy and utility costs. However, opportunities remain for researchers, software developers, practitioners and policymakers to maximize the value of building performance simulation in the design and operation of low energy buildings and communities that leverage interdisciplinary approaches to integrate humans, buildings, and the power grid at a large scale. This paper presents ten challenges that highlight some of the most important issues in building performance simulation, covering the full building life cycle and a wide range of modeling scales. In conclusion, the formulation and discussion of each challenge aims to provide insights into the state of the art and future research opportunities for each topic, and to inspire new questions from young researchers in this field.

  1. Building simulation: Ten challenges

    DOE PAGES

    Hong, Tianzhen; Langevin, Jared; Sun, Kaiyu

    2018-04-12

    Buildings consume more than one-third of the world’s primary energy. Reducing energy use and greenhouse-gas emissions in the buildings sector through energy conservation and efficiency improvements constitutes a key strategy for achieving global energy and environmental goals. Building performance simulation has been increasingly used as a tool for designing, operating and retrofitting buildings to save energy and utility costs. However, opportunities remain for researchers, software developers, practitioners and policymakers to maximize the value of building performance simulation in the design and operation of low energy buildings and communities that leverage interdisciplinary approaches to integrate humans, buildings, and the power grid at a large scale. This paper presents ten challenges that highlight some of the most important issues in building performance simulation, covering the full building life cycle and a wide range of modeling scales. In conclusion, the formulation and discussion of each challenge aims to provide insights into the state of the art and future research opportunities for each topic, and to inspire new questions from young researchers in this field.

  2. Semantic classification of urban buildings combining VHR image and GIS data: An improved random forest approach

    NASA Astrophysics Data System (ADS)

    Du, Shihong; Zhang, Fangli; Zhang, Xiuyuan

    2015-07-01

    While most existing studies have focused on extracting geometric information on buildings, only a few have concentrated on semantic information, and the lack of semantic information cannot satisfy many demands on resolving environmental and social issues. This study presents an approach to semantically classify buildings into much finer categories than those of existing studies by learning a random forest (RF) classifier from a large number of imbalanced samples with high-dimensional features. First, a two-level segmentation mechanism combining GIS and VHR image data produces single image objects at a large scale and intra-object components at a small scale. Second, a semi-supervised method chooses a large number of unbiased samples by considering the spatial proximity and intra-cluster similarity of buildings. Third, two important improvements are made to the RF classifier: a voting-distribution ranked rule for reducing the influence of imbalanced samples on classification accuracy, and a feature importance measurement for evaluating each feature's contribution to the recognition of each category. Fourth, the semantic classification of urban buildings is practically conducted in Beijing city, and the results demonstrate that the proposed approach is effective and accurate. The seven categories used in the study are finer than those in existing work and more helpful for studying many environmental and social problems.
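    One plausible reading of a vote-rebalancing rule like the paper's voting-distribution ranked rule is sketched below: raw tree votes are normalised by each class's share of the training data, so that under-represented building categories are not drowned out by majority classes. This is an illustration of the general idea under stated assumptions, not the paper's exact formulation.

```python
# Rebalance random-forest votes by training-class frequency so that a rare
# category can win when it receives disproportionately many votes.
def rebalanced_prediction(votes, class_freq):
    """votes: {class: tree votes}; class_freq: {class: training share}."""
    adjusted = {c: votes.get(c, 0) / class_freq[c] for c in class_freq}
    return max(adjusted, key=adjusted.get)
```

    For example, with votes of 60 for 'residential' and 40 for 'industrial' but training shares of 0.8 and 0.2, rebalancing gives 75 versus 200, so the rare class wins.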

  3. Large Scale Metal Additive Techniques Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nycz, Andrzej; Adediran, Adeola I; Noakes, Mark W

    2016-01-01

    In recent years, additive manufacturing has made long strides toward becoming a mainstream production technology. Particularly strong progress has been made in large-scale polymer deposition. However, large-scale metal additive manufacturing has not yet reached parity with its polymer counterpart. This paper is a review study of metal additive techniques in the context of building large structures. Current commercial devices are capable of printing metal parts on the order of several cubic feet, compared to hundreds of cubic feet on the polymer side. In order to follow the polymer progress path, several factors are considered: potential to scale, economy, environmental friendliness, material properties, feedstock availability, robustness of the process, quality and accuracy, potential for defects, and post-processing, as well as potential applications. This paper focuses on the current state of the art of large-scale metal additive technology with a focus on expanding the geometric limits.

  4. From catchment scale hydrologic processes to numerical models and robust predictions of climate change impacts at regional scales

    NASA Astrophysics Data System (ADS)

    Wagener, T.

    2017-12-01

    Current societal problems and questions demand that we increasingly build hydrologic models for regional or even continental scale assessment of global change impacts. Such models offer new opportunities for scientific advancement, for example by enabling comparative hydrology or connectivity studies, and for improved support of water management decisions, since we might better understand regional impacts on water resources from large-scale phenomena such as droughts. On the other hand, we are faced with epistemic uncertainties when we move up in scale. The term epistemic uncertainty describes those uncertainties that are not well determined by historical observations. This lack of determination can arise because the future is not like the past (e.g. due to climate change), because the historical data are unreliable (e.g. imperfectly recorded from proxies or missing), or because they are scarce (either because measurements are not available at the right scale or because there is no observation network at all). In this talk I will explore: (1) how we might build a bridge between what we have learned about catchment-scale processes and hydrologic model development and evaluation at larger scales; (2) how we can understand the impact of epistemic uncertainty in large-scale hydrologic models; and (3) how we might utilize large-scale hydrologic predictions to understand climate change impacts, e.g. on infectious disease risk.

  5. Get It Together

    ERIC Educational Resources Information Center

    Coffey, Dave

    2006-01-01

    The scale of the mechanical and plumbing systems required to support a large, multi-building academic health sciences/research center entails a lot of ductwork. Getting mechanical systems installed and running while carrying out activities from other building disciplines requires a great deal of coordinated effort. A university and its…

  6. First Large-Scale Proteogenomic Study of Breast Cancer Provides Insight into Potential Therapeutic Targets | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    News Release: May 25, 2016 — Building on data from The Cancer Genome Atlas (TCGA) project, a multi-institutional team of scientists has completed the first large-scale “proteogenomic” study of breast cancer, linking DNA mutations to protein signaling and helping pinpoint the genes that drive cancer.

  7. Building rooftop classification using random forests for large-scale PV deployment

    NASA Astrophysics Data System (ADS)

    Assouline, Dan; Mohajeri, Nahid; Scartezzini, Jean-Louis

    2017-10-01

    Large-scale solar photovoltaic (PV) deployment on existing building rooftops has proven to be one of the most efficient and viable sources of renewable energy in urban areas. As it usually requires a potential analysis over the area of interest, a crucial step is to estimate the geometric characteristics of the building rooftops. In this paper, we introduce a multi-layer machine learning methodology to classify 6 roof types, 9 aspect (azimuth) classes and 5 slope (tilt) classes for all building rooftops in Switzerland, using GIS processing. We train Random Forests (RF), an ensemble learning algorithm, to build the classifiers. We use (2 × 2) m² LiDAR data (considering buildings and vegetation) to extract several rooftop features, and generalised footprint polygon data to localize buildings. The roof classifier is trained and tested with 1252 labeled roofs from three different urban areas, namely Baden, Luzern, and Winterthur. The results for roof type classification show an average accuracy of 67%. The aspect and slope classifiers are trained and tested with 11449 labeled roofs in the Zurich periphery area. The results for aspect and slope classification show different accuracies depending on the classes: while some classes are well identified, other under-represented classes remain challenging to detect.
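    The slope (tilt) and aspect (azimuth) targets of such a classifier can be derived from a roof plane's LiDAR-estimated gradients, as sketched below. The class boundaries here are illustrative assumptions, not the 9 aspect and 5 slope bins used in the paper.

```python
# Derive a roof's slope and aspect from its plane gradients (dz/dx, dz/dy),
# then bin the slope into 5 classes. Bin edges are assumptions.
import math

def roof_slope_aspect(dzdx, dzdy):
    slope_deg = math.degrees(math.atan(math.hypot(dzdx, dzdy)))
    aspect_deg = (math.degrees(math.atan2(dzdx, dzdy)) + 360.0) % 360.0
    return slope_deg, aspect_deg

def slope_class(slope_deg, edges=(5.0, 15.0, 30.0, 45.0)):
    return sum(slope_deg >= e for e in edges)   # 0 (flat) .. 4 (very steep)
```

    For PV potential, slope and aspect together determine how much irradiation a roof intercepts, which is why they are estimated before any yield calculation.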

  8. The Seasonal Predictability of Extreme Wind Events in the Southwest United States

    NASA Astrophysics Data System (ADS)

    Seastrand, Simona Renee

    Extreme wind events are a common phenomenon in the Southwest United States. Entities such as the United States Air Force (USAF) find the Southwest appealing for many reasons, primarily for an expansive, unpopulated, and electronically unpolluted space for large-scale training and testing. However, wind events can cause hazards for the USAF: surface wind gusts can impact the take-off and landing of all aircraft, tip the airframes of large wing-surface aircraft during maneuvers close to the ground, and even impact weapons systems. This dissertation comprises three sections intended to further our knowledge and understanding of wind events in the Southwest. The first section builds a climatology of wind events for seven locations in the Southwest during the twelve 3-month seasons of the year, and further examines the wind events in relation to terrain and the large-scale flow of the atmosphere. The second section builds upon the first by taking the wind events and generating mid-level composites for each of the twelve 3-month seasons. In the third section, teleconnections identified as consistent with the large-scale circulation in the second section were used as predictor variables to build a Poisson regression model for each of the twelve 3-month seasons. The purpose of this research is to increase our understanding of the climatology of extreme wind events, to increase our understanding of how the large-scale circulation influences them, and to create a model that enhances their predictability in the Southwest. Knowledge from this work will help protect personnel and property associated with not only the USAF, but all those in the Southwest.
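    The third section's model form, a Poisson regression of seasonal event counts on teleconnection indices, can be sketched as below. The coefficient values are placeholders for illustration, not fitted values from the dissertation.

```python
# Poisson regression with a log link: the expected seasonal count of
# extreme wind events as a function of two teleconnection indices.
# All coefficients are illustrative placeholders.
import math

def expected_events(enso, pna, b0=0.5, b_enso=-0.3, b_pna=0.4):
    return math.exp(b0 + b_enso * enso + b_pna * pna)
```

    A positive coefficient means the index raises the expected event count, and the log link guarantees the predicted count stays positive.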

  9. Experimental feasibility study of the application of magnetic suspension techniques to large-scale aerodynamic test facilities

    NASA Technical Reports Server (NTRS)

    Zapata, R. N.; Humphris, R. R.; Henderson, K. C.

    1974-01-01

    Based on the premises that (1) magnetic suspension techniques can play a useful role in large-scale aerodynamic testing and (2) superconductor technology offers the only practical hope for building large-scale magnetic suspensions, an all-superconductor three-component magnetic suspension and balance facility was built as a prototype and was tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities have been made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.

  10. Experimental feasibility study of the application of magnetic suspension techniques to large-scale aerodynamic test facilities [cryogenic transonic wind tunnel]

    NASA Technical Reports Server (NTRS)

    Zapata, R. N.; Humphris, R. R.; Henderson, K. C.

    1975-01-01

    Based on the premises that magnetic suspension techniques can play a useful role in large-scale aerodynamic testing, and that superconductor technology offers the only practical hope for building large-scale magnetic suspensions, an all-superconductor three-component magnetic suspension and balance facility was built as a prototype and tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities at Langley Research Center were made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.

  11. Exploring Cloud Computing for Large-scale Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Guang; Han, Binh; Yin, Jian

    This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications, which often just require a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high-performance hardware with low-latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high-performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a systems biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.

  12. Assessment of automatic ligand building in ARP/wARP.

    PubMed

    Evrard, Guillaume X; Langer, Gerrit G; Perrakis, Anastassis; Lamzin, Victor S

    2007-01-01

    The efficiency of the ligand-building module of ARP/wARP version 6.1 has been assessed through extensive tests on a large variety of protein-ligand complexes from the PDB, as available from the Uppsala Electron Density Server. Ligand building in ARP/wARP involves two main steps: automatic identification of the location of the ligand and the actual construction of its atomic model. The first step is most successful for large ligands. The second step, ligand construction, is more powerful with X-ray data at high resolution and ligands of small to medium size. Both steps are successful for ligands with low to moderate atomic displacement parameters. The results highlight the strengths and weaknesses of both the method of ligand building and the large-scale validation procedure and help to identify means of further improvement.

  13. Multisite Studies and Scaling up in Educational Research

    ERIC Educational Resources Information Center

    Harwell, Michael

    2012-01-01

    A scale-up study in education typically expands the sample of students, schools, districts, and/or practices or materials used in smaller studies in ways that build in heterogeneity. Yet surprisingly little is known about the factors that promote successful scaling up efforts in education, in large part due to the absence of empirically supported…

  14. Incorporating residual temperature and specific humidity in predicting weather-dependent warm-season electricity consumption

    NASA Astrophysics Data System (ADS)

    Guan, Huade; Beecham, Simon; Xu, Hanqiu; Ingleton, Greg

    2017-02-01

    Climate warming and increasing variability challenge the electricity supply in warm seasons. A good quantitative representation of the relationship between warm-season electricity consumption and weather conditions provides necessary information for long-term electricity planning and short-term electricity management. In this study, an extended version of cooling degree days (ECDD) is proposed for better characterisation of this relationship. The ECDD includes temperature, residual temperature and specific humidity effects. The residual temperature is introduced for the first time to reflect the effect of building thermal inertia on electricity consumption. The study is based on the electricity consumption data of four multiple-street city blocks and three office buildings. It is found that the residual temperature effect is about 20% of the current-day temperature effect at the block scale, and increases, with large variation, at the building scale. Investigation of this residual temperature effect provides insight into the influence of building designs and structures on electricity consumption. The specific humidity effect appears to be more important at the building scale than at the block scale. A building with high energy performance does not necessarily have low specific humidity dependence. The new ECDD better reflects the weather dependence of electricity consumption than the conventional CDD method.
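    The ECDD idea can be sketched as follows: the ordinary cooling-degree term is augmented with a lagged residual-temperature term for thermal inertia and a specific-humidity excess term. The weights and base values below are illustrative assumptions, not the paper's fitted coefficients.

```python
# Sketch of extended cooling degree days (ECDD): daily temperature excess
# plus a fraction of the previous day's excess (thermal inertia) plus a
# specific-humidity excess term. All constants are placeholders.
def ecdd(temps_c, specific_humidity, base_t=18.0, base_q=0.008,
         w_resid=0.2, w_hum=500.0):
    total, prev_excess = 0.0, 0.0
    for t, q in zip(temps_c, specific_humidity):
        excess = max(t - base_t, 0.0)              # ordinary CDD term
        total += (excess
                  + w_resid * prev_excess          # residual temperature
                  + w_hum * max(q - base_q, 0.0))  # humidity effect
        prev_excess = excess
    return total
```

    The w_resid value of 0.2 mirrors the roughly 20% block-scale inertia effect reported above; at the building scale the study found it varies much more.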

  15. School Mental Health: The Impact of State and Local Capacity-Building Training

    ERIC Educational Resources Information Center

    Stephan, Sharon; Paternite, Carl; Grimm, Lindsey; Hurwitz, Laura

    2014-01-01

    Despite a growing number of collaborative partnerships between schools and community-based organizations to expand school mental health (SMH) service capacity in the United States, there have been relatively few systematic initiatives focused on key strategies for large-scale SMH capacity building with state and local education systems. Based on a…

  16. Scaling earthquake ground motions for performance-based assessment of buildings

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.; Hamburger, R.O.

    2011-01-01

    The impact of alternate ground-motion scaling procedures on the distribution of displacement responses in simplified structural systems is investigated. Recommendations are provided for selecting and scaling ground motions for performance-based assessment of buildings. Four scaling methods are studied, namely, (1) geometric-mean scaling of pairs of ground motions, (2) spectrum matching of ground motions, (3) first-mode-period scaling to a target spectral acceleration, and (4) scaling of ground motions per the distribution of spectral demands. Data were developed by nonlinear response-history analysis of a large family of nonlinear single-degree-of-freedom (SDOF) oscillators that could represent fixed-base and base-isolated structures. The advantages and disadvantages of each scaling method are discussed. The relationship between spectral shape and a ground-motion randomness parameter is presented. A scaling procedure that explicitly considers spectral shape is proposed. © 2011 American Society of Civil Engineers.
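    Method (1), geometric-mean scaling, amplitude-scales both horizontal components of a record pair by one factor so that the geometric mean of their spectral accelerations matches a target value at a chosen period. A minimal sketch, not the authors' implementation; the spectral-acceleration values in the test are placeholders, not real records.

```python
import math

def geomean(a, b):
    """Geometric mean of two positive spectral accelerations."""
    return math.sqrt(a * b)

def pair_scale_factor(sa_comp1, sa_comp2, sa_target):
    """Single amplitude factor applied to both horizontal components so the
    pair's geometric-mean spectral acceleration matches the target."""
    return sa_target / geomean(sa_comp1, sa_comp2)
```

    Because geomean(a*f, b*f) = f*geomean(a, b), applying the factor to both components reproduces the target exactly at the anchor period.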

  17. Humidity Distributions in Multilayered Walls of High-rise Buildings

    NASA Astrophysics Data System (ADS)

    Gamayunova, Olga; Musorina, Tatiana; Ishkov, Alexander

    2018-03-01

    The limitation of free territories in large cities is the main reason for the active development of high-rise construction. Given the large-scale projects of high-rise buildings in recent years in Russia and abroad and their huge energy consumption, one of the fundamental principles in design and reconstruction is the use of energy-efficient technologies. The main heat loss in buildings occurs through enclosing structures. However, a heat-resistant wall is not always energy-efficient and dry at the same time (waterlogging may occur). Temperature and humidity distributions in multilayer walls were studied in this paper, and their interrelation with other thermophysical characteristics was analyzed.

  18. Evaluation of Sampling Methods for Bacillus Spore ...

    EPA Pesticide Factsheets

    Journal Article Following a wide area release of biological materials, mapping the extent of contamination is essential for orderly response and decontamination operations. HVAC filters process large volumes of air and therefore collect highly representative particulate samples in buildings. HVAC filter extraction may have great utility in rapidly estimating the extent of building contamination following a large-scale incident. However, until now, no studies have been conducted comparing the two most appropriate sampling approaches for HVAC filter materials: direct extraction and vacuum-based sampling.

  19. Experimental seismic behavior of a full-scale four-story soft-story wood-frame building with retrofits II: shake table test results

    Treesearch

    John W. van de Lindt; Pouria Bahmani; Gary Mochizuki; Steven E. Pryor; Mikhail Gershfeld; Jingjing Tian; Michael D. Symans; Douglas Rammer

    2016-01-01

    Soft-story wood-frame buildings have been recognized as a disaster preparedness problem for decades. The majority of these buildings were constructed from the 1920s to the 1960s and are prone to collapse during moderate to large earthquakes due to a characteristic deficiency in strength and stiffness in their first story. In order to propose and validate retrofit...

  20. A Life-Cycle Model of Human Social Groups Produces a U-Shaped Distribution in Group Size.

    PubMed

    Salali, Gul Deniz; Whitehouse, Harvey; Hochberg, Michael E

    2015-01-01

    One of the central puzzles in the study of sociocultural evolution is how and why transitions from small-scale human groups to large-scale, hierarchically more complex ones occurred. Here we develop a spatially explicit agent-based model as a first step towards understanding the ecological dynamics of small and large-scale human groups. By analogy with the interactions between single-celled and multicellular organisms, we build a theory of group lifecycles as an emergent property of single cell demographic and expansion behaviours. We find that once the transition from small-scale to large-scale groups occurs, a few large-scale groups continue expanding while small-scale groups gradually become scarcer, and large-scale groups become larger in size and fewer in number over time. Demographic and expansion behaviours of groups are largely influenced by the distribution and availability of resources. Our results conform to a pattern of human political change in which religions and nation states come to be represented by a few large units and many smaller ones. Future enhancements of the model should include decision-making rules and probabilities of fragmentation for large-scale societies. We suggest that the synthesis of population ecology and social evolution will generate increasingly plausible models of human group dynamics.

  1. A Life-Cycle Model of Human Social Groups Produces a U-Shaped Distribution in Group Size

    PubMed Central

    Salali, Gul Deniz; Whitehouse, Harvey; Hochberg, Michael E.

    2015-01-01

    One of the central puzzles in the study of sociocultural evolution is how and why transitions from small-scale human groups to large-scale, hierarchically more complex ones occurred. Here we develop a spatially explicit agent-based model as a first step towards understanding the ecological dynamics of small and large-scale human groups. By analogy with the interactions between single-celled and multicellular organisms, we build a theory of group lifecycles as an emergent property of single cell demographic and expansion behaviours. We find that once the transition from small-scale to large-scale groups occurs, a few large-scale groups continue expanding while small-scale groups gradually become scarcer, and large-scale groups become larger in size and fewer in number over time. Demographic and expansion behaviours of groups are largely influenced by the distribution and availability of resources. Our results conform to a pattern of human political change in which religions and nation states come to be represented by a few large units and many smaller ones. Future enhancements of the model should include decision-making rules and probabilities of fragmentation for large-scale societies. We suggest that the synthesis of population ecology and social evolution will generate increasingly plausible models of human group dynamics. PMID:26381745

  2. Transaction-Based Building Controls Framework, Volume 1: Reference Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Somasundaram, Sriram; Pratt, Robert G.; Akyol, Bora A.

    This document proposes a framework concept to achieve the objectives of raising buildings’ efficiency and energy savings potential, benefiting building owners and operators. We call it a transaction-based framework, wherein mutually beneficial and cost-effective market-based transactions can be enabled between multiple players across different domains. Transaction-based building controls are one part of the transactional energy framework. While these controls realize benefits by enabling automatic, market-based intra-building efficiency optimizations, the transactional energy framework provides similar benefits using the same market-based structure, yet on a larger scale and beyond just buildings, to society at large.

  3. Building generalized tree mass/volume component models for improved estimation of forest stocks and utilization potential

    Treesearch

    David W. MacFarlane

    2015-01-01

    Accurately assessing forest biomass potential is contingent upon having accurate tree biomass models to translate data from forest inventories. Building generality into these models is especially important when they are to be applied over large spatial domains, such as regional, national and international scales. Here, new, generalized whole-tree mass / volume...

  4. Building Capacity for Assessment in PISA for Development Countries. PISA for Development Brief 14

    ERIC Educational Resources Information Center

    OECD Publishing, 2017

    2017-01-01

    This article explains how the PISA for Development (PISA-D) initiative aims to make PISA more accessible to middle- and low-income countries. A key component of PISA-D is building capacity in the participating countries for managing large-scale student learning assessments and using the results to support…

  5. Integrating Delta Building Physics & Economics: Optimizing the Scale of Engineered Avulsions in the Mississippi River Delta

    NASA Astrophysics Data System (ADS)

    Kenney, M. A.; Mohrig, D.; Hobbs, B. F.; Parker, G.

    2011-12-01

    Land loss in the Mississippi River Delta caused by subsidence and erosion has resulted in habitat loss, interference with human activities, and increased exposure of New Orleans and other settled areas to storm surge risks. Prior to dam and levee building and oil and gas production in the 20th century, the long term rates of land building roughly balanced land loss through subsidence. Now, however, sediment is being deposited at dramatically lower rates in shallow areas in and adjacent to the Delta, with much of the remaining sediment borne by the Mississippi being lost to the deep areas of the Gulf of Mexico. A few projects have been built in order to divert sediment from the river to areas where land can be built, and many more are under consideration as part of State of Louisiana and Federal planning processes. Most are small scale, although there have been some proposals for large engineered avulsions that would divert a significant fraction of the remaining available sediment (W. Kim, et al. 2009, EOS). However, there is debate over whether small or large diversions are the economically optimal and socially most acceptable size for such land building projects. From an economic point of view, the optimal size involves tradeoffs between scale economies in civil work construction, the relationship between depth of diversion and sediment concentration in river water, effects on navigation, and possible diminishing returns to land building at a single location as the edge of built land progresses into deeper waters. Because land building efforts could potentially involve billions of dollars of investment, it is important to gain as much benefit as possible from those expenditures. We present the results of a general analysis of scale economies in land building from engineered avulsions. The analysis addresses the question: how many projects of what size should be built at what time in order to maximize the amount of land built by a particular time?
The analysis integrates three models: 1. coarse sediment diversion as a function of the width, depth, and timing of water diversions (using our field measurements of sediment concentration as a function of depth), 2. land building as a function of the location, water, and amount of sediment diverted, accounting for bathymetry, subsidence, and other factors, and 3. cost of building and operating the necessary civil works. Our statistical analysis of past diversions indicates the existence of scale economies in width and diseconomies of scale in depth. The analysis explores general relationships between size, cost, and land building, and does not consider specific actual project proposals or locations. Sensitivity to assumptions about fine sediment capture, accumulation rates for organic material, and other inputs will be discussed.

  6. Research on Spatial and Temporal Distribution of Color Steel Building Based on Multi-Source High-Resolution Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Yang, S. W.; Ma, J. J.; Wang, J. M.

    2018-04-01

    As representative vulnerable regions of a city, densely distributed areas of temporary color steel buildings are a major target for the control of fire risk, illegal construction, environmental supervision, urbanization quality and the enhancement of a city's image. The domestic and foreign literature mainly focuses on fire risk and violation monitoring; however, owing to the special characteristics of temporary color steel buildings, research on their temporal and spatial distribution and their influence on urban spatial form has not been reported. Therefore, this paper first extracts information on color steel buildings at large scale from high-resolution images. Second, the color steel buildings were classified, and the spatial and temporal distribution and aggregation characteristics of small (temporary) and large (factory, warehouse, etc.) buildings were studied respectively. Third, the coupling relationship between the spatial distribution of color steel buildings and the spatial pattern of urban space was analysed. The results show a good coupling relationship between color steel buildings and urban spatial form: different types of color steel buildings reflect the pattern of regional differentiation of urban space and the phased pattern of urban development.

  7. Combining Flux Balance and Energy Balance Analysis for Large-Scale Metabolic Network: Biochemical Circuit Theory for Analysis of Large-Scale Metabolic Networks

    NASA Technical Reports Server (NTRS)

    Beard, Daniel A.; Liang, Shou-Dan; Qian, Hong; Biegel, Bryan (Technical Monitor)

    2001-01-01

    Predicting behavior of large-scale biochemical metabolic networks represents one of the greatest challenges of bioinformatics and computational biology. Approaches, such as flux balance analysis (FBA), that account for the known stoichiometry of the reaction network while avoiding implementation of detailed reaction kinetics are perhaps the most promising tools for the analysis of large complex networks. As a step towards building a complete theory of biochemical circuit analysis, we introduce energy balance analysis (EBA), which complements the FBA approach by introducing fundamental constraints based on the first and second laws of thermodynamics. Fluxes obtained with EBA are thermodynamically feasible and provide valuable insight into the activation and suppression of biochemical pathways.
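    To illustrate the two ideas side by side (a sketch, not the authors' formulation): real FBA solves a linear program over an arbitrary stoichiometric matrix, but for an unbranched reaction chain the steady-state condition S·v = 0 forces all fluxes to be equal, so the maximum flux has a closed form; the EBA-style second-law constraint is that each flux runs "downhill" against its reaction free-energy change.

```python
# Flux-balance idea for an unbranched chain: at steady state S·v = 0 makes
# all fluxes equal, so the achievable flux is set by the tightest bound.
def max_chain_flux(upper_bounds):
    """Maximum steady-state flux through an unbranched reaction chain."""
    return min(upper_bounds)

# Energy-balance-style feasibility sketch: a nonzero flux and its reaction
# free-energy change must have opposite signs (second law).
def thermodynamically_feasible(fluxes, delta_g):
    return all(v * g <= 0 for v, g in zip(fluxes, delta_g))
```

    EBA's contribution in this picture is that a flux vector passing the stoichiometric check can still be rejected on thermodynamic grounds.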

  8. Large-Scale Cooperative Task Distribution on Peer-to-Peer Networks

    DTIC Science & Technology

    2012-01-01

    … of agents, and each agent attempts to form a coalition with its most profitable partner. The second algorithm builds upon the Shapley formula [37] … clusters at the second layer. These Category Layer clusters each represent a single resource, and agents join one or more clusters based on their …

  9. Intelligent Facades for High Performance Green Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyson, Anna

    Progress Towards Net-Zero and Net-Positive-Energy Commercial Buildings and Urban Districts Through Intelligent Building Envelope Strategies. Previous research and development of intelligent facade systems has been limited in its contribution towards national goals for achieving on-site net-zero buildings, because this R&D has failed to couple the many qualitative requirements of building envelopes, such as the provision of daylighting, access to exterior views, and satisfying aesthetic and cultural characteristics, with the quantitative metrics of energy harvesting, storage and redistribution. To achieve energy self-sufficiency from on-site solar resources, building envelopes can and must address this gamut of concerns simultaneously. With this project, we have undertaken a high-performance building-integrated combined-heat-and-power concentrating photovoltaic system with high-temperature thermal capture, storage and transport towards multiple applications (BICPV/T). The critical contribution we are offering, the Integrated Concentrating Solar Façade (ICSF), is conceived to improve daylighting quality for improved occupant health and to mitigate solar heat gain while maximally capturing and transferring on-site solar energy. The ICSF accomplishes this multi-functionality by intercepting only the direct-normal component of solar energy (which is responsible for elevated cooling loads), thereby transforming a previously problematic source of energy into a high-quality resource that can be applied to building demands such as heating, cooling, dehumidification, domestic hot water, and possibly further augmentation of electrical generation through organic Rankine cycles.
With the ICSF technology, our team is addressing the global challenge in transitioning commercial and residential building stock towards on-site clean energy self-sufficiency, by fully integrating innovative environmental control systems strategies within an intelligent and responsively dynamic building envelope. The advantage of being able to use the entire solar spectrum for active and passive benefits, along with the potential savings of avoiding transmission losses through direct current (DC) transfer to all building systems directly from the site of solar conversion, gives the system a compounded economic viability within the commercial and institutional building markets. With a team that spans multiple stakeholders across disparate industries, from CPV to A&E partners that are responsible for the design and development of District and Regional Scale Urban Development, this project demonstrates that integrating utility-scale high efficiency CPV installations with urban and suburban environments is both viable and desirable within the marketplace. The historical schism between utility-scale CPV and BIPV has been one of differing scale and cultures. There is no technical reason why utility-scale CPV cannot be located within urban embedded district-scale sites of energy harvesting. New models for leasing large areas of district-scale roofs and facades are emerging, such that the model for utility-scale energy harvesting can be reconciled to commercial and public-scale building sites and campuses. This consortium is designed to unite utility-scale solar harvesting into building applications for smart grid development.

  10. Capacity Building: Data- and Research-Informed Development of Schools and Teaching Practices in Denmark and Norway

    ERIC Educational Resources Information Center

    Qvortrup, Lars

    2016-01-01

    Based on experiences from a number of large scale data- and research-informed school development projects in Denmark and Norway, led by the author, three hypotheses are discussed: that an effective way of linking research and practice is achieved (1) using a capacity building approach, that is, to collaborate in the practical school context…

  11. Using a framework to implement large-scale innovation in medical education with the intent of achieving sustainability.

    PubMed

    Hudson, Judith N; Farmer, Elizabeth A; Weston, Kathryn M; Bushnell, John A

    2015-01-16

    Particularly when undertaken on a large scale, implementing innovation in higher education poses many challenges. Sustaining the innovation requires early adoption of a coherent implementation strategy. Using an example from clinical education, this article describes a process used to implement a large-scale innovation with the intent of achieving sustainability. Desire to improve the effectiveness of undergraduate medical education has led to growing support for a longitudinal integrated clerkship (LIC) model. This involves a move away from the traditional clerkship of 'block rotations' with frequent changes in disciplines, to a focus upon clerkships with longer duration and opportunity for students to build sustained relationships with supervisors, mentors, colleagues and patients. A growing number of medical schools have adopted the LIC model for a small percentage of their students. At a time when increasing medical school numbers and class sizes are leading to competition for clinical supervisors, it is, however, a daunting challenge to provide a longitudinal clerkship for an entire medical school class. This challenge is presented to illustrate the strategy used to implement sustainable large-scale innovation. A strategy to implement and build a sustainable longitudinal integrated community-based clerkship experience for all students was derived from a framework arising from Roberto and Levesque's research in business. The framework's four core processes: chartering, learning, mobilising and realigning, provided guidance in preparing and rolling out the 'whole of class' innovation. Roberto and Levesque's framework proved useful for identifying the foundations of the implementation strategy, with special emphasis on the relationship building required to implement such an ambitious initiative. Although this was innovation in a new school, it required change within the school, the wider university and the health community.
Challenges encountered included some resistance to moving away from traditional hospital-centred education, initial student concern, resource limitations, workforce shortage and potential burnout of the innovators. Large-scale innovations in medical education may productively draw upon research from other disciplines for guidance on how to lay the foundations for successfully achieving sustainability.

  12. Review of optimization techniques of polygeneration systems for building applications

    NASA Astrophysics Data System (ADS)

    Rong, A.; Su, Y.; Lahdelma, R.

    2016-08-01

    Polygeneration means simultaneous production of two or more energy products in a single integrated process. Polygeneration is an energy-efficient technology and plays an important role in the transition to future low-carbon energy systems. It can find wide applications in utilities and in different types of industrial and building sectors. This paper mainly focuses on polygeneration applications in the building sector. The scales of polygeneration systems in the building sector range from the micro level, for a single home, to the large level, for residential districts. The development of polygeneration microgrids is also related to building applications. The paper aims to give a comprehensive review of optimization techniques for designing, synthesizing and operating different types of polygeneration systems for building applications.

  13. Robust Control of Multivariable and Large Scale Systems.

    DTIC Science & Technology

    1986-03-14

    AD-A175… Robust Control of Multivariable and Large Scale Systems (U). Honeywell Systems and Research Center, Minneapolis, MN; J. C. Doyle et al. Performing organization address: 3660 Marshall Street NE. Monitoring organization: Air Force Office of Scientific Research, Building 410.

  14. DEEP: A Database of Energy Efficiency Performance to Accelerate Energy Retrofitting of Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoon Lee, Sang; Hong, Tianzhen; Sawaya, Geof

    The paper presents a method and process to establish a database of energy efficiency performance (DEEP) to enable quick and accurate assessment of energy retrofit of commercial buildings. DEEP was compiled from the results of about 35 million EnergyPlus simulations. DEEP provides energy savings for screening and evaluation of retrofit measures targeting small and medium-sized office and retail buildings in California. The prototype building models are developed for a comprehensive assessment of building energy performance based on DOE commercial reference buildings and the California DEER prototype buildings. The prototype buildings represent seven building types across six vintages of construction and 16 California climate zones. DEEP uses these prototypes to evaluate the energy performance of about 100 energy conservation measures covering envelope, lighting, heating, ventilation, air-conditioning, plug loads, and domestic hot water. DEEP consists of the energy simulation results for individual retrofit measures as well as packages of measures, to account for interactive effects between multiple measures. The large-scale EnergyPlus simulations are being conducted on the supercomputers at the National Energy Research Scientific Computing Center of Lawrence Berkeley National Laboratory. The pre-simulation database is part of an on-going project to develop a web-based retrofit toolkit for small and medium-sized commercial buildings in California, which provides real-time energy retrofit feedback by querying DEEP with recommended measures, estimated energy savings and financial payback period based on users’ decision criteria of maximizing energy savings, energy cost savings, carbon reduction, or payback of investment.
The pre-simulated database and associated comprehensive measure analysis enhance the ability to perform assessments of retrofits that reduce energy use in small and medium buildings, whose owners typically do not have the resources to conduct a costly building energy audit. DEEP will be migrated into DEnCity, DOE’s Energy City, which integrates large-scale energy data into a multi-purpose, open, and dynamic database leveraging diverse sources of existing simulation data.
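    The toolkit pattern the abstract describes is a real-time lookup against pre-simulated results rather than running new simulations per query. A hedged sketch of that pattern; the keys, savings figures, and the `best_measure` helper are all hypothetical illustrations, not DEEP data or its API.

```python
# Hypothetical pre-simulated savings keyed by
# (building type, vintage, climate zone, measure); numbers are made up.
deep = {
    ("small_office", "2005", "CZ03", "led_lighting"): {"kwh_saved": 1200.0, "cost": 800.0},
    ("small_office", "2005", "CZ03", "window_film"): {"kwh_saved": 300.0, "cost": 900.0},
}

def best_measure(building_type, vintage, climate_zone, rate_per_kwh):
    """Return the (measure, simple payback in years) with the shortest payback,
    one of the decision criteria the abstract lists."""
    candidates = [
        (m, v["cost"] / (v["kwh_saved"] * rate_per_kwh))
        for (bt, vin, cz, m), v in deep.items()
        if (bt, vin, cz) == (building_type, vintage, climate_zone)
    ]
    return min(candidates, key=lambda x: x[1])
```

    Because every simulation is precomputed, a query reduces to filtering and ranking, which is what makes real-time feedback feasible.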

  15. Linking Research, Education and Public Engagement in Geoscience: Leadership and Strategic Partnerships

    NASA Astrophysics Data System (ADS)

    Spellman, K.

    2017-12-01

    A changing climate has impacted Alaska communities at unprecedented rates, and the need for efficient and effective climate change learning in the Boreal and Arctic regions is urgent. Learning programs that can both increase personal understanding and connection to climate change science and also inform large scale scientific research about climate change are an attractive option for building community adaptive capacity at multiple scales. Citizen science has emerged as a powerful tool for facilitating learning across scales, and for building partnerships across natural sciences research, education, and outreach disciplines. As an early career scientist and interdisciplinary researcher, citizen science has become the centerpiece of my work and has provided some of the most rewarding moments of my career. I will discuss my early career journey building a research and leadership portfolio integrating climate change research, learning research, and public outreach through citizen science. I will share key experiences from graduate student to early career PI that cultivated my leadership skills and ability to build partnerships necessary to create citizen science programs that emphasize synergy between climate change research and education.

  16. A Pile of Legos.

    ERIC Educational Resources Information Center

    DePino, Andrew, Jr.

    1994-01-01

    Describes the relationships a high school built with neighborhood industry, a national laboratory, a national museum, and a large university while trying to build a scale model of the original atomic pile. Provides suggestions for teachers. (MVL)

  17. Lightweight computational steering of very large scale molecular dynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beazley, D.M.; Lomdahl, P.S.

    1996-09-01

    We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy-to-use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages.

  18. Building and measuring a high performance network architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kramer, William T.C.; Toole, Timothy; Fisher, Chuck

    2001-04-20

    Once a year, the SC conferences present a unique opportunity to create and build one of the most complex and highest performance networks in the world. At SC2000, large-scale and complex local and wide area networking connections were demonstrated, including large-scale distributed applications running on different architectures. This project was designed to use the unique opportunity presented at SC2000 to create a testbed network environment and then use that network to demonstrate and evaluate high performance computational and communication applications. This testbed was designed to incorporate many interoperable systems and services and was designed for measurement from the very beginning. The end results were key insights into how to use novel high performance networking technologies, and a body of measurements that will give insight into the networks of the future.

  19. Large Scale Composite Manufacturing for Heavy Lift Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Stavana, Jacob; Cohen, Leslie J.; Houseal, Keth; Pelham, Larry; Lort, Richard; Zimmerman, Thomas; Sutter, James; Western, Mike; Harper, Robert; Stuart, Michael

    2012-01-01

    Risk reduction for the large scale composite manufacturing is an important goal to produce light weight components for heavy lift launch vehicles. NASA and an industry team successfully employed a building block approach using low-cost Automated Tape Layup (ATL) of autoclave and Out-of-Autoclave (OoA) prepregs. Several large, curved sandwich panels were fabricated at HITCO Carbon Composites. The aluminum honeycomb core sandwich panels are segments of a 1/16th arc from a 10 meter cylindrical barrel. Lessons learned highlight the manufacturing challenges required to produce light weight composite structures such as fairings for heavy lift launch vehicles.

  20. Highly crystalline covalent organic frameworks from flexible building blocks.

    PubMed

    Xu, Liqian; Ding, San-Yuan; Liu, Junmin; Sun, Junliang; Wang, Wei; Zheng, Qi-Yu

    2016-03-28

    Two novel 2D covalent organic frameworks (TPT-COF-1 and TPT-COF-2) were synthesized from the flexible 2,4,6-triaryloxy-1,3,5-triazine building blocks on a gram scale, which show high crystallinity and large surface area. The controllable formation of highly ordered frameworks is mainly attributed to the self-assembly Piedfort unit of 2,4,6-triaryloxy-1,3,5-triazine.

  1. ResStock Analysis Tool | Buildings | NREL

    Science.gov Websites

    Energy and Cost Savings for U.S. Homes. Contact Eric Wilson to learn how ResStock can benefit your approach to large-scale residential energy analysis by combining large public and private data sources. ResStock analysis has uncovered $49 billion in potential annual utility bill savings through cost-effective energy efficiency.

  2. Towards large-scale mapping of urban three-dimensional structure using Landsat imagery and global elevation datasets

    NASA Astrophysics Data System (ADS)

    Wang, P.; Huang, C.

    2017-12-01

    The three-dimensional (3D) structure of buildings and infrastructure is fundamental to understanding and modelling the impacts and challenges of urbanization in terms of energy use, carbon emissions, and earthquake vulnerability. However, spatially detailed maps of urban 3D structure have been scarce, particularly in fast-changing developing countries. We present here a novel methodology to map the volume of buildings and infrastructure at 30-meter resolution using a synergy of Landsat imagery and openly available global digital surface models (DSMs), including the Shuttle Radar Topography Mission (SRTM), the ASTER Global Digital Elevation Map (GDEM), ALOS World 3D - 30m (AW3D30), and the recently released global DSM from the TanDEM-X mission. Our method builds on the concept of an object-based height profile to extract height metrics from the DSMs and uses a machine learning algorithm to predict height and volume from those metrics. We tested this algorithm across the whole of England and assessed our results against Lidar measurements in 25 English cities. Our initial assessments achieved an RMSE of 1.4 m (R2 = 0.72) for building height and an RMSE of 1208.7 m3 (R2 = 0.69) for building volume, demonstrating the potential for large-scale application and fully automated mapping of urban structure.
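    The object-based height-profile idea in the abstract above can be illustrated with a minimal sketch. All names, the metric set, and the median-based predictor here are illustrative assumptions, not the authors' actual pipeline; the real work feeds such metrics into a trained machine learning model.

```python
import math

def height_metrics(heights):
    """Summarize an object's normalized height samples (DSM minus terrain)
    into simple profile metrics: mean, max, and quartiles."""
    hs = sorted(heights)
    n = len(hs)
    q = lambda p: hs[min(n - 1, int(p * n))]
    return {"mean": sum(hs) / n, "max": hs[-1],
            "p25": q(0.25), "p50": q(0.50), "p75": q(0.75)}

def rmse(predicted, observed):
    """Root-mean-square error, the accuracy measure quoted in the abstract."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed))
                     / len(predicted))

# Toy example: two building objects with DSM-derived height samples (meters).
obj_a = height_metrics([9.8, 10.1, 10.3, 9.9, 10.0])
obj_b = height_metrics([3.9, 4.1, 4.0, 4.2, 3.8])

# A hypothetical predictor might use the median metric as the height estimate,
# evaluated here against (invented) Lidar reference heights.
pred = [obj_a["p50"], obj_b["p50"]]
truth = [10.2, 4.0]
print(round(rmse(pred, truth), 3))  # -> 0.141
```

    In the actual study the metrics would be computed per 30-meter cell or per building object and regressed against Lidar-measured height.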

  3. Evaluation of Hydrogel Technologies for the Decontamination ...

    EPA Pesticide Factsheets

    This research effort was developed to evaluate intermediate-level (between bench-scale and large-scale or wide-area implementation) decontamination procedures, materials, technologies, and techniques used to remove radioactive material from different surfaces. In the event of a wide-area radiological incident, application of this technology would primarily be intended for decontamination of high-value buildings, important infrastructure, and landmarks.

  4. Building spatially-explicit model predictions for ecological condition of streams in the Pacific Northwest: An assessment of landscape variables, models, endpoints and prediction scale

    EPA Science Inventory

    While large-scale, randomized surveys estimate the percentage of a region’s streams in poor ecological condition, identifying particular stream reaches or watersheds in poor condition is an equally important goal for monitoring and management. We built predictive models of strea...

  5. AsterAnts: A Concept for Large-Scale Meteoroid Return and Processing using the International Space Station

    NASA Technical Reports Server (NTRS)

    Globus, Al; Biegel, Bryan A.; Traugott, Steve

    2004-01-01

    AsterAnts is a concept calling for a fleet of solar-sail-powered spacecraft to retrieve large numbers of small (1/2-1 meter diameter) Near Earth Objects (NEOs) for orbital processing. AsterAnts could use the International Space Station (ISS) for NEO processing, solar sail construction, and testing of NEO capture hardware. Solar sails constructed on orbit are expected to have substantially better performance than their ground-built counterparts [Wright 1992]. Furthermore, solar sails may be used to hold geosynchronous communication satellites out-of-plane [Forward 1981], increasing the total number of slots by at least a factor of three and potentially generating $2 billion worth of orbital real estate over North America alone. NEOs are believed to contain large quantities of water, carbon, other life-support materials, and metals. Thus, with proper processing, NEO materials could in principle be used to resupply the ISS, produce rocket propellant, manufacture tools, and build additional ISS working space. Unlike proposals that require massive facilities, such as lunar bases, before returning any extraterrestrial material, AsterAnts requires nothing larger than a typical interplanetary mission. Furthermore, AsterAnts could be scaled up to deliver large amounts of material by building many copies of the same spacecraft, thereby achieving manufacturing economies of scale. Because AsterAnts would capture NEOs whole, NEO composition details, which are generally poorly characterized, are relatively unimportant, and no complex extraction equipment is necessary. In combination with a materials processing facility at the ISS, AsterAnts might inaugurate an era of large-scale orbital construction using extraterrestrial materials.

  6. Large Composite Structures Processing Technologies for Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Clinton, R. G., Jr.; Vickers, J. H.; McMahon, W. M.; Hulcher, A. B.; Johnston, N. J.; Cano, R. J.; Belvin, H. L.; McIver, K.; Franklin, W.; Sidwell, D.

    2001-01-01

    Significant efforts have been devoted to establishing the technology foundation needed to progress to large-scale composite structure fabrication. We are not capable today of fabricating many of the composite structures envisioned for the second generation reusable launch vehicle (RLV). Conventional 'aerospace' manufacturing and processing methodologies (fiber placement, autoclave, tooling) will require substantial investment and lead time to scale up. Out-of-autoclave process techniques will require aggressive efforts to mature the selected technologies and to scale up. Focused composite processing technology development and demonstration programs utilizing the building block approach are required to enable the envisioned second generation RLV large composite structure applications. Government/industry partnerships have demonstrated success in this area and represent the best combination of skills and capabilities to achieve this goal.

  7. Mapping the Heavens: Probing Cosmology with Large Surveys

    ScienceCinema

    Frieman, Joshua [Fermilab

    2017-12-09

    This talk will provide an overview of recent and on-going sky surveys, focusing on their implications for cosmology. I will place particular emphasis on the Sloan Digital Sky Survey, the most ambitious mapping of the Universe yet undertaken, showing a virtual fly-through of the survey that reveals the large-scale structure of the galaxy distribution. Recent measurements of this large-scale structure, in combination with observations of the cosmic microwave background, have provided independent evidence for a Universe dominated by dark matter and dark energy as well as insights into how galaxies and larger-scale structures formed. Future planned surveys will build on these foundations to probe the history of the cosmic expansion--and thereby the dark energy--with greater precision.

  8. Guided growth of large-scale, horizontally aligned arrays of single-walled carbon nanotubes and their use in thin-film transistors.

    PubMed

    Kocabas, Coskun; Hur, Seung-Hyun; Gaur, Anshu; Meitl, Matthew A; Shim, Moonsub; Rogers, John A

    2005-11-01

    A convenient process for generating large-scale, horizontally aligned arrays of pristine, single-walled carbon nanotubes (SWNTs) is described. The approach uses guided growth, by chemical vapor deposition (CVD), of SWNTs on miscut single-crystal quartz substrates. Studies of the growth reveal important relationships between the density and alignment of the tubes, the CVD conditions, and the morphology of the quartz. Electrodes and dielectrics patterned on top of these arrays yield thin-film transistors that use the SWNTs as effective thin-film semiconductors. The ability to build high-performance devices of this type suggests significant promise for large-scale aligned arrays of SWNTs in electronics, sensors, and other applications.

  9. Mobility Data Analytics Center.

    DOT National Transportation Integrated Search

    2016-01-01

    Mobility Data Analytics Center aims at building a centralized data engine to efficiently manipulate large-scale data for smart decision making. Integrating and learning from the massive data are the key to the data engine. The ultimate goal of underst...

  10. An Integrative Structural Health Monitoring System for the Local/Global Responses of a Large-Scale Irregular Building under Construction

    PubMed Central

    Park, Hyo Seon; Shin, Yunah; Choi, Se Woon; Kim, Yousok

    2013-01-01

    In this study, a practical and integrative structural health monitoring (SHM) system was developed and applied to a large-scale irregular building under construction, where many challenging issues exist. In the proposed sensor network, customized energy-efficient wireless sensing units (sensor nodes, repeater nodes, and master nodes) were employed, and comprehensive communications from the sensor node to the remote monitoring server were conducted wirelessly. The long-term (13-month) monitoring results recorded from a large number of sensors (75 vibrating wire strain gauges, 10 inclinometers, and three laser displacement sensors) indicated that the construction event exhibiting the largest influence on structural behavior was the removal of bents that were temporarily installed to support the free end of the cantilevered members during their construction. The safety of each member could be confirmed based on the quantitative evaluation of each response. Furthermore, it was confirmed that the relation between these responses (i.e., deflection, strain, and inclination) can provide information about the global behavior of structures induced by specific events. Analysis of the measurement results demonstrates that the proposed sensor network system is capable of automatic and real-time monitoring and can be applied and utilized for both the safety evaluation and precise implementation of buildings under construction. PMID:23860317

  11. Energy Conservation: Heating Navy Hangars

    DTIC Science & Technology

    1984-07-01

    [Nomenclature fragment] ... temperature, °F; Tf: inside air temperature 1 foot above the floor, °F; T.: inside design temperature, °F; To: hot water temperature setpoint, °F; TON: chiller ... systems capable of optimizing energy usage base-wide. An add-on to an existing large-scale EMCS is probably the first preference, followed by single... the building comfort conditions are met during hours of building occupancy. 2. Optimized Start/Stop turns on equipment at the latest possible time and

  12. Layer by Layer Growth of 2D Quantum Superlattices (NBIT III)

    DTIC Science & Technology

    2017-02-28

    building quantum superlattices using 2D materials as the building blocks. Specifically, we develop methods that allow i) large-scale growth of aligned... superlattices and heterostructures, iii) lateral and clean patterning of 2D materials for atomically thin circuitry, and iv) novel physical properties... high precision and flexibility beyond conventional methods. Moreover, it provides solutions to the current major barriers for 2D materials (e.g

  13. Building Community-Engaged Health Research and Discovery Infrastructure on the South Side of Chicago: Science in Service to Community Priorities

    PubMed Central

    Lindau, Stacy Tessler; Makelarski, Jennifer A.; Chin, Marshall H.; Desautels, Shane; Johnson, Daniel; Johnson, Waldo E.; Miller, Doriane; Peters, Susan; Robinson, Connie; Schneider, John; Thicklin, Florence; Watson, Natalie P.; Wolfe, Marcus; Whitaker, Eric

    2011-01-01

    Objective To describe the roles community members can and should play in building sustainable, large-scale community health research infrastructure, and the asset-based strategy used for this purpose by Chicago's South Side Health and Vitality Studies. The Studies are a family of research efforts aiming to produce actionable knowledge to inform health policy, programming, and investments for the region. Methods Community and university collaborators, using a consensus-based approach, developed shared theoretical perspectives, guiding principles, and a model for collaboration in 2008, which were used to inform an asset-based operational strategy. Ongoing community engagement and relationship-building support the infrastructure and research activities of the Studies. Results Key steps in the asset-based strategy include: 1) continuous community engagement and relationship building, 2) identifying community priorities, 3) identifying community assets, 4) leveraging assets, 5) conducting research, 6) sharing knowledge, and 7) informing action. Examples of community member roles, and how these are informed by the Studies' guiding principles, are provided. Conclusions Community and university collaborators, with shared vision and principles, can effectively work together to plan innovative, large-scale community-based research that serves community needs and priorities. Sustainable, effective models are needed to realize NIH's mandate for meaningful translation of biomedical discovery into improved population health. PMID:21236295

  14. Collective backscattering of gyrotron radiation by small-scale plasma density fluctuations in large helical device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kharchev, Nikolay; Batanov, German; Petrov, Alexandr

    2008-10-15

    A version of the collective backscattering diagnostic using gyrotron radiation for small-scale turbulence is described. The diagnostic is used to measure small-scale (k_s ≈ 34 cm^-1) plasma density fluctuations in large helical device experiments on the electron cyclotron heating of plasma with a 200 kW, 82.7 GHz heating gyrotron. A good signal-to-noise ratio was obtained during the plasma production phase, while contamination by stray light increased during the plasma build-up phase. The effect of the stray radiation was investigated. The available quasioptical system of the heating system was utilized for this purpose.

  15. Use of Analogy in Learning Physics: The Role of Representations

    ERIC Educational Resources Information Center

    Podolefsky, Noah S.; Finkelstein, Noah D.

    2006-01-01

    Previous studies have demonstrated that analogies can promote student learning in physics and can be productively taught to students to support their learning, under certain conditions. We build on these studies to explore the use of analogy by students in a large introductory college physics course. In the first large-scale study of its kind, we…

  16. Environmental assessment: The Eden project

    NASA Astrophysics Data System (ADS)

    Roza, Christodoulaki

    Non-domestic buildings account for about one-sixth of the U.K.'s entire CO2 emissions and one-third of the building-related ones [2]. Their proportion of energy consumption, particularly electricity, has also been growing [2]. New buildings are not necessarily better, with energy use often proving to be much higher than their designers anticipated [2]. Annual CO2 emissions of two and sometimes three times design expectations are far from unusual, leaving a massive credibility gap [2]. These and other global environmental and human health concerns have motivated an increasing number of designers, developers, and building users to pursue more environmentally sustainable design and construction strategies [5]. However, these buildings can be difficult to evaluate, since they are large in scale, complex in materials and function, and temporally dynamic due to the limited service life of building components and changing user requirements [5]. All of these factors make environmental assessment of the buildings challenging. Previous Post Occupancy Review of Buildings and their Engineering (PROBE) building investigations have uncovered serious shortcomings in facilities management, or at least mismatches between a building's management needs and the ability of the occupiers to provide the right level of management [1]. Consequently, large differences between energy performance expectations and outcomes can occur virtually unnoticed, while designers continue to repeat flawed descriptions [2]. This investigation attempts to evaluate the building's operation and to help achieve demonstrable improvements in terms of energy efficiency and occupant satisfaction. The scope of this study is to evaluate the actual environmental performance of a building notable for its advanced design. The Education Resource Centre at the Eden Project was selected to compare design expectations with post-occupancy performance.
This report contains a small-scale survey of user satisfaction with the chosen building, an analysis of the building's energy use, and information about the physical and managerial circumstances of its operation [24]. The author has attempted to zoom in on specific issues, such as energy performance and lighting consumption. Both successes and failures are reported, providing owners, designers, and end users with valuable, real-world information.

  17. Development of EnergyPlus Utility to Batch Simulate Building Energy Performance on a National Scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valencia, Jayson F.; Dirks, James A.

    2008-08-29

    EnergyPlus is a simulation program that requires a large number of details to fully define and model a building. Hundreds or even thousands of lines in a text file are needed to run the EnergyPlus simulation depending on the size of the building. To manually create these files is a time consuming process that would not be practical when trying to create input files for thousands of buildings needed to simulate national building energy performance. To streamline the process needed to create the input files for EnergyPlus, two methods were created to work in conjunction with the National Renewable Energy Laboratory (NREL) Preprocessor; this reduced the hundreds of inputs needed to define a building in EnergyPlus to a small set of high-level parameters. The first method uses Java routines to perform all of the preprocessing on a Windows machine while the second method carries out all of the preprocessing on the Linux cluster by using an in-house built utility called Generalized Parametrics (GPARM). A comma delimited (CSV) input file is created to define the high-level parameters for any number of buildings. Each method then takes this CSV file and uses the data entered for each parameter to populate an extensible markup language (XML) file used by the NREL Preprocessor to automatically prepare EnergyPlus input data files (idf) using automatic building routines and macro templates. Using a Linux utility called "make", the idf files can then be automatically run through the Linux cluster and the desired data from each building can be aggregated into one table to be analyzed. Creating a large number of EnergyPlus input files results in the ability to batch simulate building energy performance and scale the result to national energy consumption estimates.
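    The CSV-to-input-file expansion described above can be sketched in a few lines. The parameter names, template fields, and file contents here are invented for illustration; the actual NREL Preprocessor and GPARM tooling generate full idf files via XML and macro templates.

```python
import csv
import io
from string import Template

# Hypothetical high-level parameters; a real EnergyPlus idf has hundreds of
# fields, and "Floor Area" is not a field of the real Building object.
IDF_TEMPLATE = Template(
    "Building,\n"
    "  $name,         !- Name\n"
    "  $orientation,  !- North Axis {deg}\n"
    "  $area;         !- (illustrative) Floor Area {m2}\n"
)

def idf_files_from_csv(csv_text):
    """Expand each CSV row of high-level parameters into one idf-style text,
    keyed by output filename."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return {row["name"] + ".idf": IDF_TEMPLATE.substitute(row) for row in rows}

csv_text = "name,orientation,area\nOfficeA,0,5000\nOfficeB,90,12000\n"
files = idf_files_from_csv(csv_text)
print(sorted(files))  # -> ['OfficeA.idf', 'OfficeB.idf']
```

    In the paper's workflow the generated files would then be dispatched across a Linux cluster with `make` and the simulation outputs aggregated into one table.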

  18. Intervention for First Graders with Limited Number Knowledge: Large-Scale Replication of a Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Gersten, Russell; Rolfhus, Eric; Clarke, Ben; Decker, Lauren E.; Wilkins, Chuck; Dimino, Joseph

    2015-01-01

    Replication studies are extremely rare in education. This randomized controlled trial (RCT) is a scale-up replication of Fuchs et al., which in a sample of 139 found a statistically significant positive impact for Number Rockets, a small-group intervention for at-risk first graders that focused on building understanding of number operations. The…

  19. On Feature Extraction from Large Scale Linear LiDAR Data

    NASA Astrophysics Data System (ADS)

    Acharjee, Partha Pratim

    Airborne light detection and ranging (LiDAR) can generate co-registered elevation and intensity maps over large terrain. The co-registered 3D map and intensity information can be used efficiently for different feature extraction applications. In this dissertation, we developed two feature extraction algorithms and demonstrated their use in practical applications. One of the developed algorithms can map still and flowing waterbody features, and the other can extract building features and estimate solar potential on rooftops and facades. Remote sensing capabilities, the distinguishing characteristics of laser returns from water surfaces, and specific data collection procedures give LiDAR data an edge in this application domain. Furthermore, water surface mapping solutions must work on extremely large datasets, from a thousand square miles to hundreds of thousands of square miles. National and state-wide map generation and updating, and hydro-flattening of LiDAR data for many other applications, are two leading needs of water surface mapping. These call for as much automation as possible. Researchers have developed many semi-automated algorithms relying on multiple tools and human intervention. This work describes a consolidated algorithm and toolbox developed for large-scale, automated water surface mapping. Geometric features, such as the flatness of the water surface and the high elevation change at the water-land interface, and optical properties, such as dropouts caused by specular reflection and bimodal intensity distributions, were some of the linear LiDAR features exploited for water surface mapping. Large-scale data handling capabilities are incorporated through automated and intelligent windowing, by resolving boundary issues, and by integrating all results into a single output. The whole algorithm is implemented as an ArcGIS toolbox using Python libraries.
Testing and validation were performed on large datasets to determine the effectiveness of the toolbox, and results are presented. Significant power demand is located in urban areas, where, theoretically, a large amount of building surface area is also available for solar panel installation. Therefore, property owners and power generation companies can benefit from a citywide solar potential map, which can provide the estimated annual solar energy available at a given location. An efficient solar potential measurement is a prerequisite for an effective solar energy system in an urban area. In addition, calculating solar potential from rooftops and building facades could open up a wide variety of options for solar panel installation. However, complex urban scenes make it hard to estimate solar potential, partly because of shadows cast by the buildings. LiDAR-based 3D city models could be the right technology for solar potential mapping. However, most current LiDAR-based local solar potential assessment algorithms address only rooftop potential, whereas building facades can contribute a significant amount of viable surface area for solar panel installation. In this paper, we introduce a new algorithm to calculate the solar potential of both rooftops and building facades. The solar potential received by the rooftops and facades over the year is also investigated in the test area.
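    One of the water-mapping cues named in the abstract above, a bimodal intensity distribution, can be exploited with a simple threshold search. This sketch (a minimal stand-in with invented data, not the dissertation's ArcGIS toolbox) picks the split that minimizes total within-class variance, in the spirit of Otsu's method:

```python
def bimodal_threshold(values):
    """Return the value t that minimizes the summed within-class variance
    when splitting `values` into (<= t) and (> t); assumes a bimodal sample."""
    def var(xs):
        if len(xs) < 2:
            return 0.0
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    best_t, best_score = None, float("inf")
    for t in sorted(set(values))[:-1]:      # candidate thresholds
        low = [x for x in values if x <= t]
        high = [x for x in values if x > t]
        score = len(low) * var(low) + len(high) * var(high)
        if score < best_score:
            best_t, best_score = t, score
    return best_t

# Toy LiDAR intensities: low returns over water (specular dropouts), high over land.
intensities = [2, 3, 2, 4, 3, 40, 42, 41, 39, 43]
t = bimodal_threshold(intensities)
water = [i for i in intensities if i <= t]
print(t, len(water))  # -> 4 5
```

    A production pipeline would apply such a threshold per window and then combine it with the geometric cues (surface flatness, elevation change at the water-land interface).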

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cutler, Dylan; Frank, Stephen; Slovensky, Michelle

    Rich, well-organized building performance and energy consumption data enable a host of analytic capabilities for building owners and operators, from basic energy benchmarking to detailed fault detection and system optimization. Unfortunately, data integration for building control systems is challenging and costly in any setting. Large portfolios of buildings--campuses, cities, and corporate portfolios--experience these integration challenges most acutely. These large portfolios often have a wide array of control systems, including multiple vendors and nonstandard communication protocols. They typically have complex information technology (IT) networks and cybersecurity requirements and may integrate distributed energy resources into their infrastructure. Although the challenges are significant, the integration of control system data has the potential to provide proportionally greater value for these organizations through portfolio-scale analytics, comprehensive demand management, and asset performance visibility. As a large research campus, the National Renewable Energy Laboratory (NREL) experiences significant data integration challenges. To meet them, NREL has developed an architecture for effective data collection, integration, and analysis, providing a comprehensive view of data integration based on functional layers. The architecture is being evaluated on the NREL campus through deployment of three pilot implementations.

  1. The influence of cosmic rays on the stability and large-scale dynamics of the interstellar medium

    NASA Astrophysics Data System (ADS)

    Kuznetsov, V. D.

    1986-06-01

    The diffusion-convection formulation is used to study the influence of galactic cosmic rays on the stability and dynamics of the interstellar medium which is supposedly kept in equilibrium by the gravitational field of stars. It is shown that the influence of cosmic rays on the growth rate of MHD instability depends largely on a dimensionless parameter expressing the ratio of the characteristic acoustic time scale to the cosmic-ray diffusion time. If this parameter is small, the cosmic rays will decelerate the build-up of instabilities, thereby stabilizing the system; in contrast, if the parameter is large, the system will be destabilized.

  2. Implementation of AN Unmanned Aerial Vehicle System for Large Scale Mapping

    NASA Astrophysics Data System (ADS)

    Mah, S. B.; Cryderman, C. S.

    2015-08-01

    Unmanned Aerial Vehicles (UAVs), digital cameras, powerful personal computers, and software have made it possible for geomatics professionals to capture aerial photographs and generate digital terrain models and orthophotographs without using full scale aircraft or hiring mapping professionals. This has been made possible by the availability of miniaturized computers and sensors, and software which has been driven, in part, by the demand for this technology in consumer items such as smartphones. The other force that is in play is the increasing number of Do-It-Yourself (DIY) people who are building UAVs as a hobby or for professional use. Building a UAV system for mapping is an alternative to purchasing a turnkey system. This paper describes factors to be considered when building a UAV mapping system, the choices made, and the test results of a project using this completed system.

  3. A Protocol for Generating and Exchanging (Genome-Scale) Metabolic Resource Allocation Models.

    PubMed

    Reimers, Alexandra-M; Lindhorst, Henning; Waldherr, Steffen

    2017-09-06

    In this article, we present a protocol for generating a complete (genome-scale) metabolic resource allocation model, as well as a proposal for how to represent such models in the systems biology markup language (SBML). Such models are used to investigate enzyme levels and achievable growth rates in large-scale metabolic networks. Although the idea of metabolic resource allocation studies has been present in the field of systems biology for some years, no guidelines for generating such a model have been published up to now. This paper presents step-by-step instructions for building a (dynamic) resource allocation model, starting with prerequisites such as a genome-scale metabolic reconstruction, through building protein and noncatalytic biomass synthesis reactions and assigning turnover rates for each reaction. In addition, we explain how one can use SBML level 3 in combination with the flux balance constraints and our resource allocation modeling annotation to represent such models.
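    The core idea behind resource allocation models described above, that enzyme levels drawn from a shared proteome budget cap the achievable growth rate, can be shown with a toy example. Everything here (the two-enzyme pathway, the k_cat values, the 1:1 flux coupling) is an invented illustration, not the article's SBML protocol:

```python
# Toy resource allocation: two enzymes share a fixed proteome budget; each
# reaction's flux is capped by k_cat * enzyme level, and "growth" is limited
# by the slower of the two coupled pathway steps.
def achievable_growth(budget, kcat1, kcat2, steps=10_000):
    """Enumerate budget splits and return the best achievable growth rate."""
    best = 0.0
    for i in range(steps + 1):
        e1 = budget * i / steps                # enzyme 1 allocation
        e2 = budget - e1                       # enzyme 2 gets the rest
        growth = min(kcat1 * e1, kcat2 * e2)   # pathway limited by slower step
        best = max(best, growth)
    return best

# The optimum puts more budget on the slower enzyme; analytically the maximum
# is budget * kcat1 * kcat2 / (kcat1 + kcat2).
g = achievable_growth(budget=1.0, kcat1=2.0, kcat2=6.0)
print(round(g, 3))  # -> 1.5  (= 1.0 * 2*6 / (2+6))
```

    A genome-scale version replaces this enumeration with a linear (or dynamic) optimization over thousands of reactions, which is what the SBML representation in the article is designed to exchange.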

  4. Building continental-scale 3D subsurface layers in the Digital Crust project: constrained interpolation and uncertainty estimation.

    NASA Astrophysics Data System (ADS)

    Yulaeva, E.; Fan, Y.; Moosdorf, N.; Richard, S. M.; Bristol, S.; Peters, S. E.; Zaslavsky, I.; Ingebritsen, S.

    2015-12-01

    The Digital Crust EarthCube building block creates a framework for integrating disparate 3D/4D information from multiple sources into a comprehensive model of the structure and composition of the Earth's upper crust, and for demonstrating the utility of this model in several research scenarios. One such scenario is the estimation of various crustal properties related to fluid dynamics (e.g., permeability and porosity) at each node of an arbitrary unstructured 3D grid to support continental-scale numerical models of fluid flow and transport. Starting from Macrostrat, an existing 4D database of 33,903 chronostratigraphic units, and employing GeoDeepDive, a software system for extracting structured information from unstructured documents, we construct 3D gridded fields of sediment/rock porosity, permeability, and geochemistry for large sedimentary basins of North America, which will be used to improve our understanding of large-scale fluid flow, chemical weathering rates, and geochemical fluxes into the ocean. In this talk, we discuss the methods, data gaps (particularly in geologically complex terrain), and various physical and geological constraints on interpolation and uncertainty estimation.

  5. OUT Success Stories: Solar Hot Water Technology

    DOE R&D Accomplishments Database

    Clyne, R.

    2000-08-01

    Solar hot water technology has made great strides in the past two decades. Every home, commercial building, and industrial facility requires hot water. DOE has helped to develop reliable and durable solar hot water systems. For industrial applications, the growth potential lies in large-scale systems using flat-plate and trough-type collectors. Flat-plate collectors are commonly used in residential hot water systems and can be integrated into the architectural design of the building.

  6. Pseudo-dynamic tests on masonry residential buildings seismically retrofitted by precast steel reinforced concrete walls

    NASA Astrophysics Data System (ADS)

    Li, Wenfeng; Wang, Tao; Chen, Xi; Zhong, Xiang; Pan, Peng

    2017-07-01

    A retrofitting technology using precast steel reinforced concrete (PSRC) panels is developed to improve the seismic performance of old masonry buildings. The PSRC panels are built up as an external PSRC wall system surrounding the existing masonry building. The PSRC walls are well connected to the existing masonry building, which provides enough confinement to effectively improve the ductility, strength, and stiffness of old masonry structures. The PSRC panels are prefabricated in a factory, significantly reducing in-situ work and the associated construction time. To demonstrate the feasibility and mechanical effectiveness of the proposed retrofitting system, a full-scale five-story specimen was constructed. The retrofitting process was completed within five weeks with very limited indoor operation. The specimen was then tested in the lateral direction, in which it could potentially suffer significant damage in a large earthquake. The technical feasibility, construction workability, and seismic performance were thoroughly demonstrated by the full-scale specimen construction and pseudo-dynamic tests.

  7. Statistical Ensemble of Large Eddy Simulations

    NASA Technical Reports Server (NTRS)

    Carati, Daniele; Rogers, Michael M.; Wray, Alan A.; Mansour, Nagi N. (Technical Monitor)

    2001-01-01

    A statistical ensemble of large eddy simulations (LES) is run simultaneously for the same flow. The information provided by the different large scale velocity fields is used to propose an ensemble averaged version of the dynamic model. This produces local model parameters that only depend on the statistical properties of the flow. An important property of the ensemble averaged dynamic procedure is that it does not require any spatial averaging and can thus be used in fully inhomogeneous flows. Also, the ensemble of LES's provides statistics of the large scale velocity that can be used for building new models for the subgrid-scale stress tensor. The ensemble averaged dynamic procedure has been implemented with various models for three flows: decaying isotropic turbulence, forced isotropic turbulence, and the time developing plane wake. It is found that the results are almost independent of the number of LES's in the statistical ensemble provided that the ensemble contains at least 16 realizations.
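    In the standard least-squares Germano closure (standard dynamic-model notation, not reproduced from this abstract), the ensemble-averaged procedure replaces the usual spatial average with an average over the simultaneous LES realizations, so the model coefficient

```latex
% L_ij: Germano-identity resolved stress; M_ij: model tensor difference;
% \langle\,\cdot\,\rangle_N: average over the N LES realizations at a fixed point
C_d(\mathbf{x}, t) = \frac{\langle L_{ij} M_{ij} \rangle_N}{\langle M_{kl} M_{kl} \rangle_N}
```

    remains a local function of space and time, which is why, as the abstract notes, the procedure requires no spatial averaging and applies to fully inhomogeneous flows.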

  8. Towards Building a High Performance Spatial Query System for Large Scale Medical Imaging Data.

    PubMed

    Aji, Ablimit; Wang, Fusheng; Saltz, Joel H

    2012-11-06

    Support of high performance queries on large volumes of scientific spatial data is becoming increasingly important in many applications. This growth is driven by not only geospatial problems in numerous fields, but also emerging scientific applications that are increasingly data- and compute-intensive. For example, digital pathology imaging has become an emerging field during the past decade, where examination of high resolution images of human tissue specimens enables more effective diagnosis, prediction and treatment of diseases. Systematic analysis of large-scale pathology images generates tremendous amounts of spatially derived quantifications of micro-anatomic objects, such as nuclei, blood vessels, and tissue regions. Analytical pathology imaging provides high potential to support image based computer aided diagnosis. One major requirement for this is effective querying of such enormous amount of data with fast response, which is faced with two major challenges: the "big data" challenge and the high computation complexity. In this paper, we present our work towards building a high performance spatial query system for querying massive spatial data on MapReduce. Our framework takes an on demand index building approach for processing spatial queries and a partition-merge approach for building parallel spatial query pipelines, which fits nicely with the computing model of MapReduce. We demonstrate our framework on supporting multi-way spatial joins for algorithm evaluation and nearest neighbor queries for microanatomic objects. To reduce query response time, we propose cost based query optimization to mitigate the effect of data skew. Our experiments show that the framework can efficiently support complex analytical spatial queries on MapReduce.
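    The partition-merge idea behind such a pipeline is easy to sketch outside MapReduce. The toy below is a hedged illustration (the function names and the uniform-grid partitioner are ours, not the paper's): bounding boxes are bucketed into grid cells in a "map" step, candidate pairs are intersected within each cell, and a final "merge" step deduplicates pairs discovered in more than one cell.

```python
from collections import defaultdict

def partition(objects, cell):
    """Map step: assign each (id, (x1, y1, x2, y2)) bounding box to every
    grid cell it overlaps; an object spanning several cells is replicated."""
    grid = defaultdict(list)
    for oid, (x1, y1, x2, y2) in objects:
        for cx in range(int(x1 // cell), int(x2 // cell) + 1):
            for cy in range(int(y1 // cell), int(y2 // cell) + 1):
                grid[(cx, cy)].append((oid, (x1, y1, x2, y2)))
    return grid

def overlaps(a, b):
    """Axis-aligned bounding-box intersection test."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def spatial_join(set_a, set_b, cell=10.0):
    """Partition-merge join: intersect within each shared cell, then merge,
    deduplicating pairs that were found in several cells."""
    ga, gb = partition(set_a, cell), partition(set_b, cell)
    out = set()
    for key in ga.keys() & gb.keys():  # only cells populated by both sets
        for ia, ra in ga[key]:
            for ib, rb in gb[key]:
                if overlaps(ra, rb):
                    out.add((ia, ib))
    return out
```

    In the real system each cell would be a MapReduce partition processed by its own reducer with an on-demand spatial index; the cost-based optimization the paper describes would additionally split or coalesce cells to counter data skew.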

  9. Beyond Widgets -- Systems Incentive Programs for Utilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Regnier, Cindy; Mathew, Paul; Robinson, Alastair

    Utility incentive programs remain one of the most significant means of deploying commercialized but underutilized building technologies at scale. However, these programs have been largely limited to component-based products (e.g., lamps, RTUs). While some utilities do provide ‘custom’ incentive programs with whole-building and system-level technical assistance, these programs require deeper levels of analysis, resulting in higher program costs. As a result, custom programs are restricted to utilities with greater resources and are typically applied mainly to large or energy-intensive facilities, leaving much of the market without cost-effective access to and incentives for these solutions. In addition, with increasingly stringent energy codes, cost-effective component-based solutions that achieve significant savings are dwindling. Building systems (e.g., integrated façade, HVAC, and/or lighting solutions) can deliver higher savings that translate into large sector-wide savings if deployed at the scale of these programs. However, systems application poses a number of challenges: baseline energy use must be defined and measured; the metrics for energy and performance must be defined and tested against; and system savings must be validated under well-understood conditions. This paper presents a sample of findings from a project to develop validated utility incentive program packages for three specific integrated building systems, in collaboration with Xcel Energy (CO, MN), ComEd, and a consortium of California Public Owned Utilities (CA POUs) (Northern California Power Agency (NCPA) and the Southern California Public Power Authority (SCPPA)). These program packages consist of system specifications, system performance, M&V protocols, streamlined assessment methods, market assessment, and implementation guidance.

  10. Towards Building a High Performance Spatial Query System for Large Scale Medical Imaging Data

    PubMed Central

    Aji, Ablimit; Wang, Fusheng; Saltz, Joel H.

    2013-01-01

    Support of high performance queries on large volumes of scientific spatial data is becoming increasingly important in many applications. This growth is driven by not only geospatial problems in numerous fields, but also emerging scientific applications that are increasingly data- and compute-intensive. For example, digital pathology imaging has become an emerging field during the past decade, where examination of high resolution images of human tissue specimens enables more effective diagnosis, prediction and treatment of diseases. Systematic analysis of large-scale pathology images generates tremendous amounts of spatially derived quantifications of micro-anatomic objects, such as nuclei, blood vessels, and tissue regions. Analytical pathology imaging provides high potential to support image based computer aided diagnosis. One major requirement for this is effective querying of such enormous amount of data with fast response, which is faced with two major challenges: the “big data” challenge and the high computation complexity. In this paper, we present our work towards building a high performance spatial query system for querying massive spatial data on MapReduce. Our framework takes an on demand index building approach for processing spatial queries and a partition-merge approach for building parallel spatial query pipelines, which fits nicely with the computing model of MapReduce. We demonstrate our framework on supporting multi-way spatial joins for algorithm evaluation and nearest neighbor queries for microanatomic objects. To reduce query response time, we propose cost based query optimization to mitigate the effect of data skew. Our experiments show that the framework can efficiently support complex analytical spatial queries on MapReduce. PMID:24501719

  11. The influence of micro-topography and external bioerosion on coral-reef-building organisms: recruitment, community composition and carbonate production over time

    NASA Astrophysics Data System (ADS)

    Mallela, Jennie

    2018-03-01

    The continued health and function of tropical coral reefs is highly dependent on the ability of reef-building organisms to build large, complex, three-dimensional structures that continue to accrete and evolve over time. The recent deterioration of reef health globally, including loss of coral cover, has resulted in significant declines in architectural complexity at a large, reef-scape scale. Interestingly, the fine-scale role of micro-structure in initiating and facilitating future reef development and calcium carbonate production has largely been overlooked. In this study, experimental substrates with and without micro-ridges were deployed in the lagoon at One Tree Island for 34 months. This study assessed how the presence or absence of micro-ridges promoted recruitment by key reef-building sclerobionts (corals and encrusters) and their subsequent development at micro (mm) and macro (cm) scales. Experimental plates were examined after 11 and 34 months to assess whether long-term successional and calcification processes on different micro-topographies led to convergent or divergent communities over time. Sclerobionts were most prevalent in micro-grooves when they were available. Interestingly, in shallow lagoon reef sites characterised by shoals of small parrotfish and low urchin abundance, flat substrates were also successfully recruited to. Mean rates of carbonate production were 374 ± 154 (SD) g CaCO3 m^-2 yr^-1 within the lagoon. Substrates with micro-ridges were characterised by significantly greater rates of carbonate production than smooth substrates. The orientation of the substrate and period of immersion also significantly impacted rates of carbonate production, with CaCO3 on cryptic tiles increasing by 28% between 11 and 34 months. In contrast, rates on exposed tiles declined by 35% over the same time. 
In conclusion, even at sites characterised by small-sized parrotfish and low urchin density, micro-topography is an important settlement niche clearly favouring sclerobiont early life-history processes and subsequent carbonate production.

  12. Towards building high performance medical image management system for clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Fusheng; Lee, Rubao; Zhang, Xiaodong; Saltz, Joel

    2011-03-01

    Medical image based biomarkers are being established for therapeutic cancer clinical trials, where image assessment is among the essential tasks. Large scale image assessment is often performed by a large group of experts by retrieving images from a centralized image repository to workstations to markup and annotate images. In such an environment, it is critical to provide a high performance image management system that supports efficient concurrent image retrievals in a distributed environment. There are several major challenges: high throughput of large scale image data over the Internet from the server for multiple concurrent client users, efficient communication protocols for transporting data, and effective management of versioning of data for audit trails. We study the major bottlenecks for such a system, then propose and evaluate a solution that uses a hybrid image storage with solid state drives and hard disk drives, RESTful Web Services based protocols for exchanging image data, and a database based versioning scheme for efficient archiving of image revision history. Our experiments show promising results of our methods, and our work provides a guideline for building enterprise level high performance medical image management systems.

  13. Highly multiplexed targeted proteomics using precise control of peptide retention time.

    PubMed

    Gallien, Sebastien; Peterman, Scott; Kiyonami, Reiko; Souady, Jamal; Duriez, Elodie; Schoen, Alan; Domon, Bruno

    2012-04-01

    Large-scale proteomics applications using SRM analysis on triple quadrupole mass spectrometers present new challenges to LC-MS/MS experimental design. Despite the automation of building large-scale LC-SRM methods, the increased number of targeted peptides can compromise the balance between sensitivity and selectivity. To accommodate large target numbers, time-scheduled SRM transition acquisition is performed. Previously published results have demonstrated that incorporation of a well-characterized set of synthetic peptides enables chromatographic characterization of the elution profile for most endogenous peptides. We have extended this application of peptide trainer kits beyond building SRM methods to real-time elution profile characterization that enables automated adjustment of the scheduled detection windows. Incorporation of dynamic retention time adjustment better facilitates targeted assays lasting several days without the need for constant supervision. This paper provides an overview of how the dynamic retention correction approach identifies and corrects for commonly observed LC variations. This adjustment dramatically improves robustness in targeted discovery experiments as well as routine quantification experiments. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
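    A minimal sketch of such a dynamic retention-time correction (function names and the linear drift model are our illustrative assumptions, not the published implementation): fit a linear drift from the reference peptides' predicted versus observed retention times, then re-centre each scheduled detection window accordingly.

```python
import statistics

def rt_correction(predicted, observed):
    """Fit a simple linear drift model, observed ~ a*predicted + b, by
    least squares from reference-peptide retention times (in minutes)."""
    mx, my = statistics.fmean(predicted), statistics.fmean(observed)
    a = sum((p - mx) * (o - my) for p, o in zip(predicted, observed)) / \
        sum((p - mx) ** 2 for p in predicted)
    return a, my - a * mx

def adjust_window(scheduled_rt, width, a, b):
    """Re-centre one scheduled SRM detection window using the fitted drift."""
    centre = a * scheduled_rt + b
    return (centre - width / 2, centre + width / 2)
```

    With a handful of trainer peptides observed eluting consistently late, every downstream window shifts by the fitted offset, so narrow scheduling windows stay usable over multi-day runs without manual re-calibration.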

  14. Scaling NASA Applications to 1024 CPUs on Origin 3K

    NASA Technical Reports Server (NTRS)

    Taft, Jim

    2002-01-01

    The long and highly successful joint SGI-NASA research effort in ever larger SSI systems was to a large degree the result of the successful development of the MLP scalable parallel programming paradigm at ARC: 1) MLP scaling in real production codes justified ever larger systems at NAS; 2) MLP scaling on the 256p Origin 2000 gave SGI impetus to productize 256p; 3) MLP scaling on 512p gave SGI courage to build the 1024p O3K; and 4) the history of MLP success resulted in an IBM Star Cluster based MLP effort.

  15. Building brains for bodies

    NASA Technical Reports Server (NTRS)

    Brooks, Rodney Allen; Stein, Lynn Andrea

    1994-01-01

    We describe a project to capitalize on newly available levels of computational resources in order to understand human cognition. We will build an integrated physical system including vision, sound input and output, and dextrous manipulation, all controlled by a continuously operating large scale parallel MIMD computer. The resulting system will learn to 'think' by building on its bodily experiences to accomplish progressively more abstract tasks. Past experience suggests that in attempting to build such an integrated system we will have to fundamentally change the way artificial intelligence, cognitive science, linguistics, and philosophy think about the organization of intelligence. We expect to be able to better reconcile the theories that will be developed with current work in neuroscience.

  16. Scaling predictive modeling in drug development with cloud computing.

    PubMed

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets with increased time for analysis is hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compare with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.

  17. Cognitive Mapping Based on Conjunctive Representations of Space and Movement

    PubMed Central

    Zeng, Taiping; Si, Bailu

    2017-01-01

    It is a challenge to build robust simultaneous localization and mapping (SLAM) system in dynamical large-scale environments. Inspired by recent findings in the entorhinal–hippocampal neuronal circuits, we propose a cognitive mapping model that includes continuous attractor networks of head-direction cells and conjunctive grid cells to integrate velocity information by conjunctive encodings of space and movement. Visual inputs from the local view cells in the model provide feedback cues to correct drifting errors of the attractors caused by the noisy velocity inputs. We demonstrate the mapping performance of the proposed cognitive mapping model on an open-source dataset of 66 km car journey in a 3 km × 1.6 km urban area. Experimental results show that the proposed model is robust in building a coherent semi-metric topological map of the entire urban area using a monocular camera, even though the image inputs contain various changes caused by different light conditions and terrains. The results in this study could inspire both neuroscience and robotic research to better understand the neural computational mechanisms of spatial cognition and to build robust robotic navigation systems in large-scale environments. PMID:29213234

  18. Seismic isolation of buildings using composite foundations based on metamaterials

    NASA Astrophysics Data System (ADS)

    Casablanca, O.; Ventura, G.; Garescı, F.; Azzerboni, B.; Chiaia, B.; Chiappini, M.; Finocchio, G.

    2018-05-01

    Metamaterials can be engineered to interact with waves in entirely new ways, finding application on the nanoscale in various fields such as optics and acoustics. In addition, acoustic metamaterials can be used in large-scale experiments for filtering and manipulating seismic waves (seismic metamaterials). Here, we propose seismic isolation based on a device that combines some properties of seismic metamaterials (e.g., periodic mass-in-mass systems) with those of a standard foundation positioned right below the building for isolation purposes. The concepts on which this solution is based are local resonance and a dual-stiffness structure that preserves large (small) rigidity for compression (shear) effects. In other words, this paper introduces a different approach to seismic isolation by using certain principles of seismic metamaterials. The experimental demonstrator tested on the laboratory scale exhibits a spectral bandgap that begins at 4.5 Hz. Within the bandgap, it filters more than 50% of the seismic energy via an internal dissipation process. Our results open a path toward the seismic resilience of buildings and critical infrastructure against shear seismic waves, achieving higher efficiency compared to traditional seismic insulators and passive energy-dissipation systems.
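    For orientation, the lower edge of a local-resonance bandgap in a mass-in-mass unit cell sits near the inner resonator's natural frequency, f = (1/2π)√(k/m). The stiffness and mass values used below are invented to reproduce a ~4.5 Hz onset of the same order as the laboratory demonstrator; they are not parameters reported in the paper.

```python
import math

def resonance_hz(k_inner, m_inner):
    """Natural frequency (Hz) of the inner resonator of a mass-in-mass
    unit cell; the local-resonance bandgap opens near this frequency.
    k_inner in N/m, m_inner in kg."""
    return math.sqrt(k_inner / m_inner) / (2 * math.pi)
```

    With an illustrative inner stiffness of 79 944 N/m and inner mass of 100 kg, the resonator sits at about 4.5 Hz, i.e. tuning the resonator shifts where the bandgap begins.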

  19. Interfacial growth of large-area single-layer metal-organic framework nanosheets

    PubMed Central

    Makiura, Rie; Konovalov, Oleg

    2013-01-01

    The air/liquid interface is an excellent platform to assemble two-dimensional (2D) sheets of materials by enhancing spontaneous organizational features of the building components and encouraging large length scale in-plane growth. We have grown 2D molecularly-thin crystalline metal-organic-framework (MOF) nanosheets composed of porphyrin building units and metal-ion joints (NAFS-13) under operationally simple ambient conditions at the air/liquid interface. In-situ synchrotron X-ray diffraction studies of the formation process performed directly at the interface were employed to optimize the NAFS-13 growth protocol, leading to the development of a post-injection method (post-injection of the metal connectors into the water subphase on whose surface the molecular building blocks are pre-oriented), which allowed us to achieve the formation of large-surface area morphologically-uniform preferentially-oriented single-layer nanosheets. The growth of such large-size high-quality sheets is of interest for the understanding of the fundamental physical/chemical properties associated with ultra-thin sheet-shaped materials and the realization of their use in applications. PMID:23974345

  20. Facilitating Cohort Discovery by Enhancing Ontology Exploration, Query Management and Query Sharing for Large Clinical Data Repositories.

    PubMed

    Tao, Shiqiang; Cui, Licong; Wu, Xi; Zhang, Guo-Qiang

    2017-01-01

    To help researchers better access clinical data, we developed a prototype query engine called DataSphere for exploring large-scale integrated clinical data repositories. DataSphere expedites data importing using a NoSQL data management system and dynamically renders its user interface for concept-based querying tasks. DataSphere provides an interactive query-building interface together with query translation and optimization strategies, which enable users to build and execute queries effectively and efficiently. We successfully loaded a dataset of one million patients for University of Kentucky (UK) Healthcare into DataSphere with more than 300 million clinical data records. We evaluated DataSphere by comparing it with an instance of i2b2 deployed at UK Healthcare, demonstrating that DataSphere provides enhanced user experience for both query building and execution.

  1. Facilitating Cohort Discovery by Enhancing Ontology Exploration, Query Management and Query Sharing for Large Clinical Data Repositories

    PubMed Central

    Tao, Shiqiang; Cui, Licong; Wu, Xi; Zhang, Guo-Qiang

    2017-01-01

    To help researchers better access clinical data, we developed a prototype query engine called DataSphere for exploring large-scale integrated clinical data repositories. DataSphere expedites data importing using a NoSQL data management system and dynamically renders its user interface for concept-based querying tasks. DataSphere provides an interactive query-building interface together with query translation and optimization strategies, which enable users to build and execute queries effectively and efficiently. We successfully loaded a dataset of one million patients for University of Kentucky (UK) Healthcare into DataSphere with more than 300 million clinical data records. We evaluated DataSphere by comparing it with an instance of i2b2 deployed at UK Healthcare, demonstrating that DataSphere provides enhanced user experience for both query building and execution. PMID:29854239

  2. A comparison of wake characteristics of model and prototype buildings in transverse winds

    NASA Technical Reports Server (NTRS)

    Logan, E., Jr.; Phataraphruk, P.; Chang, J.

    1978-01-01

    Previously measured mean velocity and turbulence intensity profiles in the wake of a 26.8-m long building 3.2 m high and transverse to the wind direction in an atmospheric boundary layer several hundred meters thick were compared with profiles at corresponding stations downstream of a 1/50-scale model on the floor of a large meteorological wind tunnel in a boundary layer 0.61 m in thickness. The validity of using model wake data to predict full scale data was determined. Preliminary results are presented which indicate that disparities result from differences in relative depth of logarithmic layers, surface roughness, and the proximity of upstream obstacles.

  3. Quantification of fossil fuel CO2 emissions on the building/street scale for a large U.S. city.

    PubMed

    Gurney, Kevin R; Razlivanov, Igor; Song, Yang; Zhou, Yuyu; Benes, Bedrich; Abdul-Massih, Michel

    2012-11-06

    In order to advance the scientific understanding of carbon exchange with the land surface, build an effective carbon monitoring system, and contribute to quantitatively based U.S. climate change policy interests, fine spatial and temporal quantification of fossil fuel CO2 emissions, the primary greenhouse gas, is essential. Called the "Hestia Project", this research effort is the first to use bottom-up methods to quantify all fossil fuel CO2 emissions down to the scale of individual buildings, road segments, and industrial/electricity production facilities on an hourly basis for an entire urban landscape. Here, we describe the methods used to quantify the on-site fossil fuel CO2 emissions across the city of Indianapolis, IN. This effort combines a series of data sets and simulation tools such as a building energy simulation model, traffic data, power production reporting, and local air pollution reporting. The system is general enough to be applied to any large U.S. city and holds tremendous potential as a key component of a carbon-monitoring system in addition to enabling efficient greenhouse gas mitigation and planning. We compare the natural gas component of our fossil fuel CO2 emissions estimate to consumption data provided by the local gas utility. At the zip code level, we achieve a bias-adjusted Pearson r correlation value of 0.92 (p < 0.001).

  4. Information Power Grid Posters

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi

    2003-01-01

    This document is a summary of the accomplishments of the Information Power Grid (IPG). Grids are an emerging technology that provide seamless and uniform access to the geographically dispersed computational, data storage, networking, instrument, and software resources needed for solving large-scale scientific and engineering problems. The goal of the NASA IPG is to use NASA's remotely located computing and data system resources to build distributed systems that can address problems that are too large or complex for a single site. The accomplishments outlined in this poster presentation are: access to distributed data, IPG heterogeneous computing, integration of a large-scale computing node into a distributed environment, remote access to high data rate instruments, and an exploratory grid environment.

  5. Creating a three level building classification using topographic and address-based data for Manchester

    NASA Astrophysics Data System (ADS)

    Hussain, M.; Chen, D.

    2014-11-01

    Buildings, the basic units of an urban landscape, host most of its socio-economic activities and play an important role in the creation of urban land-use patterns. The spatial arrangement of different building types creates varied urban land-use clusters which can provide an insight into the relationships between social, economic, and living spaces. The classification of such urban clusters can help in policy-making and resource management. In many countries, including the UK, no national-level cadastral database containing information on individual building types exists in the public domain. In this paper, we present a framework for inferring the functional types of buildings based on the analysis of their form (e.g. geometrical properties, such as area and perimeter, and layout) and spatial relationships from a large topographic and address-based GIS database. Machine learning algorithms along with exploratory spatial analysis techniques are used to create the classification rules. The classification is extended to two further levels based on the functions (use) of buildings derived from address-based data. The developed methodology was applied to the Manchester metropolitan area using the Ordnance Survey's MasterMap®, a large-scale topographic and address-based dataset available for the UK.
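    As a flavour of what form-based classification rules can look like, here is a deliberately simplified, hand-written sketch (the thresholds and class names are invented for illustration; the paper derives its rules with machine learning from MasterMap data): footprint area, a compactness ratio, and wall adjacency separate a few common dwelling types.

```python
import math

def classify_building(area_m2, perimeter_m, shared_walls):
    """Toy form-based classifier.  Compactness is the isoperimetric
    ratio 4*pi*A/P^2 (1.0 for a circle, lower for elongated shapes);
    shared_walls counts party walls with adjacent buildings."""
    compactness = 4 * math.pi * area_m2 / perimeter_m ** 2
    if area_m2 > 1000:            # very large footprints: likely non-residential
        return "non-residential"
    if shared_walls >= 2:         # attached on both sides
        return "terraced"
    if shared_walls == 1:         # attached on one side
        return "semi-detached"
    return "detached" if compactness > 0.5 else "irregular-detached"
```

    A learned model would replace these hand-set thresholds with splits fitted to labelled examples, and the address-based data would then refine each form class into functional sub-classes.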

  6. Building energy governance in Shanghai

    NASA Astrophysics Data System (ADS)

    Kung, YiHsiu Michelle

    With Asia's surging economies and urbanization, the region is adding to its built environment at an unprecedented rate, especially those population centers in China and India. With numerous existing buildings, plus a new building boom, construction in these major Asian cities has caused momentous sustainability challenges. This dissertation focuses on China's leading city, Shanghai, to explore and assess its existing commercial building energy policies and practices. Research estimates that Shanghai's commercial buildings might become a key challenge with regard to energy use and CO2 emissions as compared to other major Asian cities. Relevant building energy policy instruments at national and local levels for commercial buildings are reviewed. In addition, two benchmarks are established to further assess building energy policies in Shanghai. The first benchmark is based on the synthesis of relevant criteria and policy instruments as recommended by professional organizations, while the second practical benchmark is drawn from an analysis of three global cities: New York, London and Tokyo. Moreover, two large-scale commercial building sites - Shanghai IKEA and Plaza 66 - are selected for investigation and assessment of their efforts on building energy saving measures. Detailed building energy savings, CO2 reductions, and management cost reductions based on data availability and calculations are presented with the co-benefits approach. The research additionally analyzes different interventions and factors that facilitate or constrain the implementation process of building energy saving measures in each case. Furthermore, a multi-scale analytical framework is employed to investigate relevant stakeholders that shape Shanghai's commercial building energy governance. Research findings and policy recommendations are offered at the close of this dissertation. 
Findings and policy recommendations are intended to facilitate commercial building energy governance in Shanghai and other rapidly growing second-tier or third-tier cities in China, and to further contribute to the general body of knowledge on Asia's urban building sustainability.

  8. The World's Largest Photovoltaic Concentrator System.

    ERIC Educational Resources Information Center

    Smith, Harry V.

    1982-01-01

    The Mississippi County Community College large-scale energy experiment, featuring the emerging high technology of solar electricity, is described. The project includes a building designed for solar electricity and a power plant consisting of a total energy photovoltaic system, and features two experimental developments. (MLW)

  9. Evaluation of wind-induced internal pressure in low-rise buildings: A multi-scale experimental and numerical approach

    NASA Astrophysics Data System (ADS)

    Tecle, Amanuel Sebhatu

    Hurricanes are among the most destructive and costly natural hazards to the built environment, and their impact on low-rise buildings in particular is unacceptably high. The major objective of this research was to perform a parametric evaluation of internal pressure (IP) for wind-resistant design of low-rise buildings and wind-driven natural ventilation applications. For this purpose, a multi-scale experimental approach, i.e. full-scale at the Wall of Wind (WoW) and small-scale at the Boundary Layer Wind Tunnel (BLWT), combined with a Computational Fluid Dynamics (CFD) approach, was adopted. This provided new capability to assess wind pressures realistically on internal volumes ranging from the small spaces formed between roof tiles and their deck, to attics, to room partitions. Effects of sudden breaching, existing dominant openings on building envelopes, and compartmentalization of the building interior on the IP were systematically investigated. Results of this research indicated: (i) for sudden breaching of dominant openings, the transient overshooting response was lower than the subsequent steady-state peak IP, and internal volume correction for low-wind-speed testing facilities was necessary. For example, a building without volume correction experienced a response four times faster and exhibited 30--40% lower mean and peak IP; (ii) for existing openings, vent openings uniformly distributed along the roof alleviated, whereas one-sided openings aggravated, the IP; (iii) larger dominant openings exhibited a higher IP on the building envelope, and an off-center opening on the wall exhibited 30--40% higher IP than center-located openings; (iv) compartmentalization amplified the intensity of the IP; and (v) significant underneath pressure was measured for field tiles, warranting its consideration during net pressure evaluations. 
The part of the study aimed at wind-driven natural ventilation indicated: (i) the IP due to cross ventilation was 1.5 to 2.5 times higher for A_inlet/A_outlet > 1 compared to cases where A_inlet/A_outlet < 1, which in effect reduced the mixing of air inside the building and hence the ventilation effectiveness; (ii) the presence of multi-room partitioning increased the pressure differential and consequently the air exchange rate. Overall, good agreement was found between the observed large-scale, small-scale, and CFD-based IP responses. Comparisons with ASCE 7-10 consistently demonstrated that the code underestimates peak positive and suction IP.

  10. Flood management: prediction of microbial contamination in large-scale floods in urban environments.

    PubMed

    Taylor, Jonathon; Lai, Ka Man; Davies, Mike; Clifton, David; Ridley, Ian; Biddulph, Phillip

    2011-07-01

    With a changing climate and increased urbanisation, the occurrence and the impact of flooding is expected to increase significantly. Floods can bring pathogens into homes and cause lingering damp and microbial growth in buildings, with the level of growth and persistence dependent on the volume and chemical and biological content of the flood water, the properties of the contaminating microbes, and the surrounding environmental conditions, including the restoration time and methods, the heat and moisture transport properties of the envelope design, and the ability of the construction material to sustain the microbial growth. The public health risk will depend on the interaction of these complex processes and the vulnerability and susceptibility of occupants in the affected areas. After the 2007 floods in the UK, the Pitt review noted that there is lack of relevant scientific evidence and consistency with regard to the management and treatment of flooded homes, which not only put the local population at risk but also caused unnecessary delays in the restoration effort. Understanding the drying behaviour of flooded buildings in the UK building stock under different scenarios, and the ability of microbial contaminants to grow, persist, and produce toxins within these buildings can help inform recovery efforts. To contribute to future flood management, this paper proposes the use of building simulations and biological models to predict the risk of microbial contamination in typical UK buildings. We review the state of the art with regard to biological contamination following flooding, relevant building simulation, simulation-linked microbial modelling, and current practical considerations in flood remediation. Using the city of London as an example, a methodology is proposed that uses GIS as a platform to integrate drying models and microbial risk models with the local building stock and flood models. 
The integrated tool will help local governments, health authorities, insurance companies and residents to better understand, prepare for and manage a large-scale flood in urban environments. Copyright © 2011 Elsevier Ltd. All rights reserved.
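    The coupling of a drying model with a microbial-growth criterion can be illustrated with a deliberately simple sketch. All coefficients below are invented for illustration; they are not taken from the paper or from any building physics dataset.

```python
# Illustrative coupling of a drying curve with a mould-growth threshold.
# Wall moisture decays exponentially toward equilibrium after the flood,
# and mould is assumed able to grow while moisture exceeds a critical level.
import math

def moisture(t_days, m0=0.95, m_eq=0.60, tau=30.0):
    """Exponential drying from initial moisture m0 toward equilibrium m_eq."""
    return m_eq + (m0 - m_eq) * math.exp(-t_days / tau)

def days_at_mould_risk(critical=0.80, horizon=365):
    """Count the days the wall stays above the critical moisture level."""
    return sum(1 for t in range(horizon) if moisture(t) > critical)

print(days_at_mould_risk())  # 17 days above the threshold for these values
```

    In a GIS-integrated tool of the kind proposed, such a curve would be evaluated per building, with parameters drawn from the stock model and restoration scenario.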

  11. Enhancement of global flood damage assessments using building material based vulnerability curves

    NASA Astrophysics Data System (ADS)

    Englhardt, Johanna; de Ruiter, Marleen; de Moel, Hans; Aerts, Jeroen

    2017-04-01

    This study discusses the development of an enhanced approach for flood damage and risk assessments using vulnerability curves that are based on building material information. The approach draws upon common practices in earthquake vulnerability assessments, and is an alternative to the land-use or building occupancy approach in flood risk assessment models. The approach is of particular importance for studies where there is a large variation in building material, such as large-scale studies or studies in developing countries. A case study of Ethiopia is used to demonstrate the impact of the different methodological approaches on direct damage assessments due to flooding. Generally, flood damage assessments use damage curves for different land-use or occupancy types (i.e. urban or residential and commercial classes). However, these categories do not necessarily relate directly to the vulnerability to damage by flood waters. For this, the construction type and building material may be more important, as they are in earthquake risk assessments. For this study, we use building material classification data of the PAGER1 project to define new building-material-based vulnerability classes for flood damage. This approach will be compared to the widely applied land-use-based vulnerability curves such as those used by De Moel et al. (2011). The case of Ethiopia demonstrates and compares the feasibility of this novel flood vulnerability method on a country level, which holds the potential to be scaled up to a global level. The study shows that flood vulnerability based on building material also allows for better differentiation between flood damage in urban and rural settings, opening the door to better links to poverty studies when such exposure data is available.
Furthermore, this new approach paves the way to enhanced multi-risk assessments, as the method enables the comparison of vulnerability across different natural hazard types that also use material-based vulnerability curves. Finally, this approach allows for more accuracy in estimating losses as a result of direct damages. 1 http://earthquake.usgs.gov/data/pager/
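    As a rough illustration of the material-based approach, direct damage can be estimated by applying a per-material depth-damage curve to each building. The material classes and damage slopes below are hypothetical placeholders, not values from the PAGER project.

```python
# Hypothetical material-based flood vulnerability sketch. The material
# classes and damage slopes are invented placeholders, not PAGER values.

def damage_fraction(material, depth_m):
    """Linear depth-damage curve per material class, capped at total loss."""
    slopes = {"mud/adobe": 0.50, "unreinforced_masonry": 0.30, "concrete": 0.15}
    return min(1.0, slopes[material] * depth_m)

def expected_damage(buildings, depth_m):
    """Sum direct damage over (material, replacement_value) pairs."""
    return sum(value * damage_fraction(mat, depth_m) for mat, value in buildings)

stock = [("mud/adobe", 10_000), ("concrete", 50_000)]
print(expected_damage(stock, 1.0))  # 12500.0 at 1 m flood depth
```

    A land-use-based assessment would instead apply one curve to the whole "residential" class, losing the distinction between the two buildings above.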

  12. Extragalactic Astrophysics

    NASA Astrophysics Data System (ADS)

    Webb, James R.

    2016-09-01

    This book is intended to be a course about the creation and evolution of the universe at large, including the basic macroscopic building blocks (galaxies) and the overall large-scale structure. This text covers a broad range of topics for a graduate-level class in a physics department where students' available credit hours for astrophysics classes are limited. The sections cover galactic structure, external galaxies, galaxy clustering, active galaxies, general relativity and cosmology.

  13. Interagency Collaborative Team Model for Capacity Building to Scale-Up Evidence-Based Practice

    PubMed Central

    Hurlburt, Michael; Aarons, Gregory A; Fettes, Danielle; Willging, Cathleen; Gunderson, Lara; Chaffin, Mark J

    2015-01-01

    Background: System-wide scale up of evidence-based practice (EBP) is a complex process. Yet, few strategic approaches exist to support EBP implementation and sustainment across a service system. Building on the Exploration, Preparation, Implementation, and Sustainment (EPIS) implementation framework, we developed and are testing the Interagency Collaborative Team (ICT) process model to implement an evidence-based child neglect intervention (i.e., SafeCare®) within a large children’s service system. The ICT model emphasizes the role of local agency collaborations in creating structural supports for successful implementation. Methods: We describe the ICT model and present preliminary qualitative results from use of the implementation model in one large scale EBP implementation. Qualitative interviews were conducted to assess challenges in building system, organization, and home visitor collaboration and capacity to implement the EBP. Data collection and analysis centered on EBP implementation issues, as well as the experiences of home visitors under the ICT model. Results: Six notable issues relating to implementation process emerged from participant interviews, including: (a) initial commitment and collaboration among stakeholders, (b) leadership, (c) communication, (d) practice fit with local context, (e) ongoing negotiation and problem solving, and (f) early successes. These issues highlight strengths and areas for development in the ICT model. Conclusions: Use of the ICT model led to sustained and widespread use of SafeCare in one large county. Although some aspects of the implementation model may benefit from enhancement, qualitative findings suggest that the ICT process generates strong structural supports for implementation and creates conditions in which tensions between EBP structure and local contextual variations can be resolved in ways that support the expansion and maintenance of an EBP while preserving potential for public health benefit. PMID:27512239

  14. Interagency Collaborative Team Model for Capacity Building to Scale-Up Evidence-Based Practice.

    PubMed

    Hurlburt, Michael; Aarons, Gregory A; Fettes, Danielle; Willging, Cathleen; Gunderson, Lara; Chaffin, Mark J

    2014-04-01

    System-wide scale up of evidence-based practice (EBP) is a complex process. Yet, few strategic approaches exist to support EBP implementation and sustainment across a service system. Building on the Exploration, Preparation, Implementation, and Sustainment (EPIS) implementation framework, we developed and are testing the Interagency Collaborative Team (ICT) process model to implement an evidence-based child neglect intervention (i.e., SafeCare®) within a large children's service system. The ICT model emphasizes the role of local agency collaborations in creating structural supports for successful implementation. We describe the ICT model and present preliminary qualitative results from use of the implementation model in one large scale EBP implementation. Qualitative interviews were conducted to assess challenges in building system, organization, and home visitor collaboration and capacity to implement the EBP. Data collection and analysis centered on EBP implementation issues, as well as the experiences of home visitors under the ICT model. Six notable issues relating to implementation process emerged from participant interviews, including: (a) initial commitment and collaboration among stakeholders, (b) leadership, (c) communication, (d) practice fit with local context, (e) ongoing negotiation and problem solving, and (f) early successes. These issues highlight strengths and areas for development in the ICT model. Use of the ICT model led to sustained and widespread use of SafeCare in one large county. Although some aspects of the implementation model may benefit from enhancement, qualitative findings suggest that the ICT process generates strong structural supports for implementation and creates conditions in which tensions between EBP structure and local contextual variations can be resolved in ways that support the expansion and maintenance of an EBP while preserving potential for public health benefit.

  15. Zero Launch Mass 3D printer

    NASA Image and Video Library

    2018-05-01

    Packing light is the idea behind the Zero Launch Mass 3-D Printer. Instead of loading up on heavy building supplies, a large-scale 3-D printer capable of using recycled plastic waste and dirt at the destination as construction material would save mass and money when launching robotic precursor missions to build infrastructure on the Moon or Mars in preparation for human habitation. To make this a reality, Nathan Gelino, a research engineer with NASA’s Swamp Works at Kennedy Space Center, measured the temperature of a test specimen from the 3-D printer Tuesday as an early step in characterizing printed material strength properties. Material temperature plays a large role in the strength of bonds between layers.

  16. To Make Archives Available Online: Transcending Boundaries or Building Walls?

    ERIC Educational Resources Information Center

    Hansen, Lars-Erik; Sundqvist, Anneli

    2012-01-01

    The development of information technology and the rise of the Internet have rendered a large-scale digitization and dissemination of originally analog information objects. On the Web sites "Lararnas Historia" ("History of Teachers" www.lararhistoria.se) and "Ingenjorshistoria" ("History of Engineers"…

  17. Applied Distributed Model Predictive Control for Energy Efficient Buildings and Ramp Metering

    NASA Astrophysics Data System (ADS)

    Koehler, Sarah Muraoka

    Industrial large-scale control problems present an interesting algorithmic design challenge. A number of controllers must cooperate in real-time on a network of embedded hardware with limited computing power in order to maximize system efficiency while respecting constraints and despite communication delays. Model predictive control (MPC) can automatically synthesize a centralized controller which optimizes an objective function subject to a system model, constraints, and predictions of disturbance. Unfortunately, the computations required by model predictive controllers for large-scale systems often limit its industrial implementation only to medium-scale slow processes. Distributed model predictive control (DMPC) enters the picture as a way to decentralize a large-scale model predictive control problem. The main idea of DMPC is to split the computations required by the MPC problem amongst distributed processors that can compute in parallel and communicate iteratively to find a solution. Some popularly proposed solutions are distributed optimization algorithms such as dual decomposition and the alternating direction method of multipliers (ADMM). However, these algorithms ignore two practical challenges: substantial communication delays present in control systems and also problem non-convexity. This thesis presents two novel and practically effective DMPC algorithms. The first DMPC algorithm is based on a primal-dual active-set method which achieves fast convergence, making it suitable for large-scale control applications which have a large communication delay across its communication network. In particular, this algorithm is suited for MPC problems with a quadratic cost, linear dynamics, forecasted demand, and box constraints. We measure the performance of this algorithm and show that it significantly outperforms both dual decomposition and ADMM in the presence of communication delay. 
The second DMPC algorithm is based on an inexact interior point method which is suited for nonlinear optimization problems. The parallel computation of the algorithm exploits iterative linear algebra methods for the main linear algebra computations in the algorithm. We show that the splitting of the algorithm is flexible and can thus be applied to various distributed platform configurations. The two proposed algorithms are applied to two main energy and transportation control problems. The first application is energy efficient building control. Buildings represent 40% of energy consumption in the United States. Thus, it is significant to improve the energy efficiency of buildings. The goal is to minimize energy consumption subject to the physics of the building (e.g. heat transfer laws), the constraints of the actuators as well as the desired operating constraints (thermal comfort of the occupants), and heat load on the system. In this thesis, we describe the control systems of forced air building systems in practice. We discuss the "Trim and Respond" algorithm which is a distributed control algorithm that is used in practice, and show that it performs similarly to a one-step explicit DMPC algorithm. Then, we apply the novel distributed primal-dual active-set method and provide extensive numerical results for the building MPC problem. The second main application is the control of ramp metering signals to optimize traffic flow through a freeway system. This application is particularly important since urban congestion has more than doubled in the past few decades. The ramp metering problem is to maximize freeway throughput subject to freeway dynamics (derived from mass conservation), actuation constraints, freeway capacity constraints, and predicted traffic demand. In this thesis, we develop a hybrid model predictive controller for ramp metering that is guaranteed to be persistently feasible and stable. 
This contrasts with previous work on MPC for ramp metering, where such guarantees are absent. We apply a smoothing method to the hybrid model predictive controller and use the inexact interior point method to solve this nonlinear, non-convex ramp metering problem.
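    The structure of the MPC problems described above (quadratic cost, linear dynamics, box constraints on the input) can be sketched with a toy scalar example. The solver below is plain projected gradient descent, a stand-in for illustration only, not the primal-dual active-set or interior point methods developed in the thesis.

```python
# Toy MPC problem: quadratic cost, scalar linear dynamics, box-constrained
# input, solved by projected gradient descent with a finite-difference
# gradient (illustration only; not the thesis's distributed algorithms).
import numpy as np

a, b = 0.9, 0.5           # dynamics x_{k+1} = a*x_k + b*u_k
N = 10                    # prediction horizon
x0 = 5.0                  # initial state
u_min, u_max = -1.0, 1.0  # box constraints on the input

def rollout(u):
    """Simulate the state trajectory for an input sequence u."""
    x, xs = x0, []
    for uk in u:
        x = a * x + b * uk
        xs.append(x)
    return np.array(xs)

def cost(u):
    """Quadratic cost: penalize state deviation and control effort."""
    return float(np.sum(rollout(u) ** 2) + 0.1 * np.sum(u ** 2))

u = np.zeros(N)
for _ in range(1000):               # projected gradient descent
    base = cost(u)
    g = np.zeros(N)
    for i in range(N):              # finite-difference gradient
        du = u.copy()
        du[i] += 1e-6
        g[i] = (cost(du) - base) / 1e-6
    u = np.clip(u - 0.02 * g, u_min, u_max)  # project onto the box

print(round(u[0], 2))  # -1.0: the first input saturates at its lower bound
```

    In a DMPC setting, the gradient step would be split across agents that each own a subset of the inputs and exchange coupling information over the network.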

  18. The Chandra Deep Wide-Field Survey: Completing the new generation of Chandra extragalactic surveys

    NASA Astrophysics Data System (ADS)

    Hickox, Ryan

    2016-09-01

    Chandra X-ray surveys have revolutionized our view of the growth of black holes across cosmic time. Recently, fundamental questions have emerged about the connection of AGN to their host large scale structures that clearly demand a wide, deep survey over a large area, comparable to the recent extensive Chandra surveys in smaller fields. We propose the Chandra Deep Wide-Field Survey (CDWFS) covering the central 6 sq. deg in the Bootes field, totaling 1.025 Ms (building on 550 ks from the HRC GTO program). CDWFS will efficiently probe a large cosmic volume, allowing us to carry out accurate new investigations of the connections between black holes and their large-scale structures, and will complete the next generation surveys that comprise a key part of Chandra's legacy.

  19. The importance of building construction materials relative to other factors affecting structure survival during wildfire

    USGS Publications Warehouse

    Syphard, Alexandra D.; Brennan, Teresa J.; Keeley, Jon E.

    2017-01-01

    Structure loss to wildfire is a serious problem in wildland-urban interface areas across the world. Laboratory experiments suggest that fire-resistant building construction and design could be important for reducing structure destruction, but these need to be evaluated under real wildfire conditions, especially relative to other factors. Using empirical data from destroyed and surviving structures from large wildfires in southern California, we evaluated the relative importance of building construction and structure age compared to other local and landscape-scale variables associated with structure survival. The local-scale analysis showed that window preparation was especially important but, in general, creating defensible space adjacent to the home was as important as building construction. At the landscape scale, structure density and structure age were the two most important factors affecting structure survival, but there was a significant interaction between them. That is, young structure age was most important in higher-density areas where structure survival overall was more likely. On the other hand, newer-construction structures were less likely to survive wildfires at lower density. Here, appropriate defensible space near the structure and accessibility to major roads were important factors. In conclusion, community safety is a multivariate problem that will require a comprehensive solution involving land use planning, fire-safe construction, and property maintenance.

  20. Spatial Information in Support of 3D Flood Damage Assessment of Buildings at Micro Level: A Review

    NASA Astrophysics Data System (ADS)

    Amirebrahimi, S.; Rajabifard, A.; Sabri, S.; Mendis, P.

    2016-10-01

    Floods, as the most common and costliest natural disasters around the globe, have adverse impacts on buildings, which are considered major contributors to the overall economic damage. With emphasis on risk management methods for reducing the risks to structures and people, estimating damage from potential flood events becomes an important task for identifying and implementing the optimal flood risk-reduction solutions. While traditional Flood Damage Assessment (FDA) methods focus on simple representation of buildings for large-scale damage assessment purposes, recent emphasis on buildings' flood resilience has resulted in the development of a sophisticated method that allows for a detailed and effective damage evaluation at the scale of the building and its components. In pursuit of finding the suitable spatial information model to satisfy the needs of implementing such frameworks, this article explores the technical developments for an effective representation of buildings, floods and other required information within the built environment. The search begins with the Geospatial domain and investigates the state of the art and relevant developments from a data point of view in this area. It is further extended to other relevant disciplines in the Architecture, Engineering and Construction domain (AEC/FM) and finally, even some overlapping areas between these domains are considered and explored.

  1. An investigation of rooftop STOL port aerodynamics

    NASA Technical Reports Server (NTRS)

    Blanton, J. N.; Parker, H. M.

    1972-01-01

    An investigation into aerodynamic problems associated with large building rooftop STOLports was performed. Initially, a qualitative flow visualization study indicated two essential problems: (1) the establishment of smooth, steady, attached flow over the rooftop, and (2) the generation of an acceptable crosswind profile once (1) has been achieved. This study indicated that (1) could be achieved by attaching circular-arc rounded-edge extensions to the upper edges of the building, and that crosswind profiles could be modified by the addition of porous vertical fences to the lateral edges of the rooftop. Important fence parameters associated with crosswind alteration were found to be solidity, fence element number, and spacing. Large-scale building-induced velocity fluctuations were discovered for most configurations tested, and a possible explanation for their occurrence was postulated. Finally, a simple equation relating fence solidity to the resulting velocity profile was developed and tested for non-uniform single-element fences with 30 percent maximum solidity.

  2. Adiabatic quantum-flux-parametron cell library adopting minimalist design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takeuchi, Naoki, E-mail: takeuchi-naoki-kx@ynu.jp; Yamanashi, Yuki; Yoshikawa, Nobuyuki

    We herein build an adiabatic quantum-flux-parametron (AQFP) cell library adopting minimalist design and a symmetric layout. In the proposed minimalist design, every logic cell is designed by arraying four types of building block cells: buffer, NOT, constant, and branch cells. Therefore, minimalist design enables us to effectively build and customize an AQFP cell library. The symmetric layout reduces unwanted parasitic magnetic coupling and ensures a large mutual inductance in an output transformer, which enables very long wiring between logic cells. We design and fabricate several logic circuits using the minimal AQFP cell library so as to test logic cells in the library. Moreover, we experimentally investigate the maximum wiring length between logic cells. Finally, we present an experimental demonstration of an 8-bit carry look-ahead adder designed using the minimal AQFP cell library and demonstrate that the proposed cell library is sufficiently robust to realize large-scale digital circuits.

  3. Adiabatic quantum-flux-parametron cell library adopting minimalist design

    NASA Astrophysics Data System (ADS)

    Takeuchi, Naoki; Yamanashi, Yuki; Yoshikawa, Nobuyuki

    2015-05-01

    We herein build an adiabatic quantum-flux-parametron (AQFP) cell library adopting minimalist design and a symmetric layout. In the proposed minimalist design, every logic cell is designed by arraying four types of building block cells: buffer, NOT, constant, and branch cells. Therefore, minimalist design enables us to effectively build and customize an AQFP cell library. The symmetric layout reduces unwanted parasitic magnetic coupling and ensures a large mutual inductance in an output transformer, which enables very long wiring between logic cells. We design and fabricate several logic circuits using the minimal AQFP cell library so as to test logic cells in the library. Moreover, we experimentally investigate the maximum wiring length between logic cells. Finally, we present an experimental demonstration of an 8-bit carry look-ahead adder designed using the minimal AQFP cell library and demonstrate that the proposed cell library is sufficiently robust to realize large-scale digital circuits.

  4. Scaling up digital circuit computation with DNA strand displacement cascades.

    PubMed

    Qian, Lulu; Winfree, Erik

    2011-06-03

    To construct sophisticated biochemical circuits from scratch, one needs to understand how simple the building blocks can be and how robustly such circuits can scale up. Using a simple DNA reaction mechanism based on a reversible strand displacement process, we experimentally demonstrated several digital logic circuits, culminating in a four-bit square-root circuit that comprises 130 DNA strands. These multilayer circuits include thresholding and catalysis within every logical operation to perform digital signal restoration, which enables fast and reliable function in large circuits with roughly constant switching time and linear signal propagation delays. The design naturally incorporates other crucial elements for large-scale circuitry, such as general debugging tools, parallel circuit preparation, and an abstraction hierarchy supported by an automated circuit compiler.
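    The digital signal restoration described above can be caricatured with idealized logic levels: sub-threshold leak is cleaned to 0, and above-threshold signal is catalytically amplified back to a full logic 1. The threshold value below is arbitrary, and real seesaw gates realize this behaviour with DNA reactions rather than a step function.

```python
# Idealized caricature of digital signal restoration in strand-displacement
# logic: thresholding suppresses leak, catalysis restores full levels.
def restore(x, threshold=0.3):
    """Clean a noisy analog level to an ideal digital 0 or 1."""
    return 0.0 if x < threshold else 1.0

def AND(a_level, b_level):
    """Two-input AND computed on restored (noise-cleaned) signal levels."""
    return restore(min(restore(a_level), restore(b_level)))

print(AND(0.8, 0.9), AND(0.8, 0.1))  # 1.0 0.0
```

    Because every gate re-restores its inputs, small per-layer errors do not accumulate, which is what allows circuits of 130 strands to function reliably.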

  5. Agent based models for testing city evacuation strategies under a flood event as strategy to reduce flood risk

    NASA Astrophysics Data System (ADS)

    Medina, Neiler; Sanchez, Arlex; Nokolic, Igor; Vojinovic, Zoran

    2016-04-01

    This research explores the use of Agent Based Models (ABM) and their potential to test large-scale evacuation strategies in coastal cities at risk from flood events due to extreme hydro-meteorological events, with the final purpose of disaster risk reduction by decreasing humans' exposure to the hazard. The first part of the paper covers the theory used to build the models, such as complex adaptive systems (CAS) and the principles and uses of ABM in this field, and outlines the pros and cons of using ABM to test city evacuation strategies at medium and large scale. The second part of the paper focuses on the central theory used to build the ABM, specifically the psychological and behavioural model as well as the framework used in this research, the PECS reference model, and closes with the main attributes or characteristics of human beings used to describe the agents. The third part of the paper shows the methodology used to build and implement the ABM using Repast Simphony, an open-source agent-based modelling and simulation platform. The preliminary results of the first implementation, in a region of the island of Sint Maarten, a Dutch Caribbean island, are presented and discussed in the fourth section of the paper. The results obtained so far are promising for further development of the model and its implementation and testing in a full-scale city.
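    The basic mechanics of such a model can be sketched minimally. The 1-D street, fixed walking speed, and uniform random reaction delay below are illustrative assumptions, not the paper's PECS-based behavioural model.

```python
# Minimal agent-based evacuation sketch: agents react to a flood warning
# after a random delay, then walk toward a single shelter.
import random

random.seed(1)

SHELTER = 100.0   # shelter position along a 1-D street (metres, assumed)
SPEED = 1.4       # walking speed per time step (metres, assumed)

class Agent:
    def __init__(self, pos):
        self.pos = pos
        self.delay = random.randint(0, 20)  # steps before reacting to warning
        self.safe = False

    def step(self, t):
        if self.safe or t < self.delay:
            return                      # not yet reacting, or already safe
        remaining = SHELTER - self.pos
        if remaining <= SPEED:          # final step reaches the shelter
            self.pos, self.safe = SHELTER, True
        else:
            self.pos += SPEED

agents = [Agent(random.uniform(0, 90)) for _ in range(50)]
for t in range(200):
    for ag in agents:
        ag.step(t)

print(sum(ag.safe for ag in agents))  # 50: everyone reaches shelter in time
```

    Evacuation strategies are then compared by varying warning timing, routing, and behavioural parameters and measuring how many agents remain exposed when the flood arrives.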

  6. A Framework for Daylighting Optimization in Whole Buildings with OpenStudio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2016-08-12

    We present a toolkit and workflow for leveraging the OpenStudio (Guglielmetti et al. 2010) platform to perform daylighting analysis and optimization in a whole building energy modeling (BEM) context. We have re-implemented OpenStudio's integrated Radiance and EnergyPlus functionality as an OpenStudio Measure. The OpenStudio Radiance Measure works within the OpenStudio Application and Parametric Analysis Tool, as well as the OpenStudio Server large scale analysis framework, allowing a rigorous daylighting simulation to be performed on a single building model or potentially an entire population of programmatically generated models. The Radiance simulation results can automatically inform the broader building energy model, and provide dynamic daylight metrics as a basis for decision. Through introduction and example, this paper illustrates the utility of the OpenStudio building energy modeling platform to leverage existing simulation tools for integrated building energy performance simulation, daylighting analysis, and reportage.

  7. Line segment extraction for large scale unorganized point clouds

    NASA Astrophysics Data System (ADS)

    Lin, Yangbin; Wang, Cheng; Cheng, Jun; Chen, Bili; Jia, Fukai; Chen, Zhonggui; Li, Jonathan

    2015-04-01

    Line segment detection in images is already a well-investigated topic, although it has received considerably less attention in 3D point clouds. Benefiting from current LiDAR devices, large-scale point clouds are becoming increasingly common. Most human-made objects have flat surfaces. Line segments that occur where pairs of planes intersect give important information regarding the geometric content of point clouds, which is especially useful for automatic building reconstruction and segmentation. This paper proposes a novel method that is capable of accurately extracting plane intersection line segments from large-scale raw scan points. The 3D line-support region, namely, a point set near a straight linear structure, is extracted simultaneously. The 3D line-support region is fitted by our Line-Segment-Half-Planes (LSHP) structure, which provides a geometric constraint for a line segment, making the line segment more reliable and accurate. We demonstrate our method on the point clouds of large-scale, complex, real-world scenes acquired by LiDAR devices. We also demonstrate the application of 3D line-support regions and their LSHP structures on urban scene abstraction.
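    The geometric core of the method, the line along which two fitted planes meet, can be sketched as follows. Plane fitting from raw points and the LSHP structure itself are omitted; each plane is simply given as (n, d) with n · x = d.

```python
# Intersection line of two planes: the direction is the cross product of
# the normals; a point on the line is found by solving a small 3x3 system.
import numpy as np

def plane_intersection(n1, d1, n2, d2):
    """Return (point, unit_direction) of the line where two planes meet."""
    n1, n2 = np.asarray(n1, float), np.asarray(n2, float)
    direction = np.cross(n1, n2)
    if np.linalg.norm(direction) < 1e-9:
        raise ValueError("planes are parallel or identical")
    # Pick the unique point on the line closest to the origin by adding
    # the constraint direction . x = 0 to the two plane equations.
    A = np.vstack([n1, n2, direction])
    b = np.array([d1, d2, 0.0])
    point = np.linalg.solve(A, b)
    return point, direction / np.linalg.norm(direction)

# A floor (z = 0) meeting a wall (x = 2): the line x = 2, z = 0 along y.
p, d = plane_intersection([0, 0, 1], 0.0, [1, 0, 0], 2.0)
print(p, d)  # point (2, 0, 0), direction (0, 1, 0)
```

    In the full method, scan points near this infinite line are gathered into the 3D line-support region, and the line is clipped to a finite segment bounded by those points.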

  8. Talking About The Smokes: a large-scale, community-based participatory research project.

    PubMed

    Couzos, Sophia; Nicholson, Anna K; Hunt, Jennifer M; Davey, Maureen E; May, Josephine K; Bennet, Pele T; Westphal, Darren W; Thomas, David P

    2015-06-01

    To describe the Talking About The Smokes (TATS) project according to the World Health Organization guiding principles for conducting community-based participatory research (PR) involving indigenous peoples, to assist others planning large-scale PR projects. The TATS project was initiated in Australia in 2010 as part of the International Tobacco Control Policy Evaluation Project, and surveyed a representative sample of 2522 Aboriginal and Torres Strait Islander adults to assess the impact of tobacco control policies. The PR process of the TATS project, which aimed to build partnerships to create equitable conditions for knowledge production, was mapped and summarised onto a framework adapted from the WHO principles. The processes described include consultation and approval, partnerships and research agreements, communication, funding, ethics and consent, and the data and benefits of the research. The TATS project involved baseline and follow-up surveys conducted in 34 Aboriginal community-controlled health services and one Torres Strait community. Consistent with the WHO PR principles, the TATS project built on community priorities and strengths through strategic partnerships from project inception, and demonstrated the value of research agreements and trusting relationships to foster shared decision making, capacity building and a commitment to Indigenous data ownership. Community-based PR methodology, by definition, needs adaptation to local settings and priorities. The TATS project demonstrates that large-scale research can be participatory, with strong Indigenous community engagement and benefits.

  9. mySyntenyPortal: an application package to construct websites for synteny block analysis.

    PubMed

    Lee, Jongin; Lee, Daehwan; Sim, Mikang; Kwon, Daehong; Kim, Juyeon; Ko, Younhee; Kim, Jaebum

    2018-06-05

    Advances in sequencing technologies have facilitated large-scale comparative genomics based on whole genome sequencing. Constructing and investigating conserved genomic regions among multiple species (called synteny blocks) are essential in comparative genomics. However, they require significant amounts of computational resources and time in addition to bioinformatics skills. Many web interfaces have been developed to make such tasks easier. However, these web interfaces cannot be customized for users who want to use their own set of genome sequences or definition of synteny blocks. To resolve this limitation, we present mySyntenyPortal, a stand-alone application package to construct websites for synteny block analyses by using users' own genome data. mySyntenyPortal provides both command line and web-based interfaces to build and manage websites for large-scale comparative genomic analyses. The websites can also be easily published and accessed by other users. To demonstrate the usability of mySyntenyPortal, we present an example study for building websites to compare genomes of three mammalian species (human, mouse, and cow) and show how they can be easily utilized to identify potential genes affected by genome rearrangements. mySyntenyPortal will contribute to extended comparative genomic analyses based on large-scale whole genome sequences by providing unique functionality to support the easy creation of interactive websites for synteny block analyses from users' own genome data.

  10. A state-based national network for effective wildlife conservation

    USGS Publications Warehouse

    Meretsky, Vicky J.; Maguire, Lynn A.; Davis, Frank W.; Stoms, David M.; Scott, J. Michael; Figg, Dennis; Goble, Dale D.; Griffith, Brad; Henke, Scott E.; Vaughn, Jacqueline; Yaffee, Steven L.

    2012-01-01

    State wildlife conservation programs provide a strong foundation for biodiversity conservation in the United States, building on state wildlife action plans. However, states may miss the species that are at the most risk at rangewide scales, and threats such as novel diseases and climate change increasingly act at regional and national levels. Regional collaborations among states and their partners have had impressive successes, and several federal programs now incorporate state priorities. However, regional collaborations are uneven across the country, and no national counterpart exists to support efforts at that scale. A national conservation-support program could fill this gap and could work across the conservation community to identify large-scale conservation needs and support efforts to meet them. By providing important information-sharing and capacity-building services, such a program would advance collaborative conservation among the states and their partners, thus increasing both the effectiveness and the efficiency of conservation in the United States.

  11. Investigating the Potential of Deep Neural Networks for Large-Scale Classification of Very High Resolution Satellite Images

    NASA Astrophysics Data System (ADS)

    Postadjian, T.; Le Bris, A.; Sahbi, H.; Mallet, C.

    2017-05-01

    Semantic classification is a core remote sensing task as it provides the fundamental input for land-cover map generation. The very recent literature has shown the superior performance of deep convolutional neural networks (DCNN) for many classification tasks including the automatic analysis of Very High Spatial Resolution (VHR) geospatial images. Most of the recent initiatives have focused on very high discrimination capacity combined with accurate object boundary retrieval. Therefore, current architectures are perfectly tailored for urban areas over restricted areas but not designed for large-scale purposes. This paper presents an end-to-end automatic processing chain, based on DCNNs, that aims at performing large-scale classification of VHR satellite images (here SPOT 6/7). Since this work assesses, through various experiments, the potential of DCNNs for country-scale VHR land-cover map generation, a simple yet effective architecture is proposed, efficiently discriminating the main classes of interest (namely buildings, roads, water, crops, vegetated areas) by exploiting existing VHR land-cover maps for training.
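    The per-pixel classification idea can be sketched with a single (untrained) convolutional layer in plain NumPy. The real system is a much deeper DCNN trained on existing land-cover maps; the band and class counts here are arbitrary.

```python
# One untrained convolutional layer with a per-pixel softmax, as a
# stand-in for a deep semantic-classification network.
import numpy as np

rng = np.random.default_rng(0)
C_IN, C_OUT, K = 4, 5, 3   # 4 spectral bands, 5 land-cover classes, 3x3 kernels
W = rng.normal(0.0, 0.1, (C_OUT, C_IN, K, K))  # random, untrained weights

def conv_softmax(img):
    """img: (C_IN, H, W) -> per-pixel class probabilities (C_OUT, H-2, W-2)."""
    _, H, Wd = img.shape
    out = np.zeros((C_OUT, H - K + 1, Wd - K + 1))
    for co in range(C_OUT):
        for i in range(H - K + 1):
            for j in range(Wd - K + 1):
                out[co, i, j] = np.sum(W[co] * img[:, i:i + K, j:j + K])
    e = np.exp(out - out.max(axis=0))  # numerically stable softmax over classes
    return e / e.sum(axis=0)

probs = conv_softmax(rng.normal(size=(C_IN, 8, 8)))
print(probs.shape)  # (5, 6, 6): probabilities sum to 1 at every pixel
```

    Country-scale inference then amounts to sliding such a network over tiled SPOT 6/7 imagery and taking the argmax class per pixel.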

  12. Residential solar-heating system

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Complete residential solar-heating and hot-water system, when installed in highly-insulated energy-saver home, can supply large percentage of total energy demand for space heating and domestic hot water. System which uses water-heating energy storage can be scaled to meet requirements of building in which it is installed.

  13. Preliminary design, analysis, and costing of a dynamic scale model of the NASA space station

    NASA Technical Reports Server (NTRS)

    Gronet, M. J.; Pinson, E. D.; Voqui, H. L.; Crawley, E. F.; Everman, M. R.

    1987-01-01

    The difficulty of testing the next generation of large flexible space structures on the ground places an emphasis on other means for validating predicted on-orbit dynamic behavior. Scale model technology represents one way of verifying analytical predictions with ground test data. This study investigates the preliminary design, scaling and cost trades for a Space Station dynamic scale model. The scaling of nonlinear joint behavior is studied from theoretical and practical points of view. Suspension system interaction trades are conducted for the ISS Dual Keel Configuration and Build-Up Stages suspended in the proposed NASA/LaRC Large Spacecraft Laboratory. Key issues addressed are scaling laws, replication vs. simulation of components, manufacturing, suspension interactions, joint behavior, damping, articulation capability, and cost. These issues are the subject of parametric trades versus the scale model factor. The results of these detailed analyses are used to recommend scale factors for four different scale model options, each with varying degrees of replication. Potential problems in constructing and testing the scale model are identified, and recommendations for further study are outlined.

  14. Posttest analysis of a 1:6-scale reinforced concrete reactor containment building

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weatherby, J.R.

    In an experiment conducted at Sandia National Laboratories, a 1:6-scale model of a reinforced concrete light water reactor containment building was pressurized with nitrogen gas to more than three times its design pressure. The pressurization produced one large tear and several smaller tears in the steel liner plate that functioned as the primary pneumatic seal for the structure. The data collected from the overpressurization test have been used to evaluate and further refine methods of structural analysis that can be used to predict the performance of containment buildings under conditions produced by a severe accident. This report describes posttest finite element analyses of the 1:6-scale model tests and compares pretest predictions of the structural response to the experimental results. Strains and displacements calculated in axisymmetric finite element analyses of the 1:6-scale model are compared to strains and displacements measured in the experiment. Detailed analyses of the liner plate are also described in the report. The region of the liner surrounding the large tear was analyzed using two different two-dimensional finite element models. The results from these analyses indicate that the primary mechanisms that initiated the tear can be captured in a two-dimensional finite element model. Furthermore, the analyses show that the studs used to anchor the liner to the concrete wall played an important role in initiating the liner tear. Three-dimensional finite element analyses of liner plates loaded by studs are also presented. Results from the three-dimensional analyses are compared to results from two-dimensional analyses of the same problems. 12 refs., 56 figs., 1 tab.

  15. Scaling and Optimization of Magnetic Refrigeration for Commercial Building HVAC Systems Greater than 175 kW in Capacity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdelaziz, Omar; West, David L; Mallow, Anne M

    Heating, ventilation, air-conditioning and refrigeration (HVACR) account for approximately one-third of building energy consumption. Magnetic refrigeration presents an opportunity for significant energy savings and emissions reduction in serving building heating, cooling, and refrigeration loads. In this paper, we examine the magnet and magnetocaloric (MCE) material requirements for scaling magnetic refrigeration systems to commercial building cooling applications. Scaling relationships governing the resources required for magnetic refrigeration systems have been developed. As system refrigeration capacity increases, superconducting magnet systems become more applicable, and a comparison is presented of system requirements for permanent-magnet and superconducting (SC) magnetization systems. Included in this analysis is an investigation of the ability of superconducting-magnet-based systems to overcome the parasitic power penalty of the cryocooler used to keep SC windings at cryogenic temperatures. The scaling relationships were used to develop an initial specification for an SC magnet-based active magnetic regeneration (AMR) system, and an optimized superconducting magnet was designed to support this system. In this analysis, we show that the SC magnet system, consisting of two 0.38 m3 regenerators, is capable of producing 285 kW of cooling power with a temperature span of 28 K. A system COP of 4.02, including cryocooler and fan losses, illustrates that an SC magnet-based system can operate with efficiency comparable to traditional systems while delivering large cooling powers of 285.4 kW (81.2 tons).

  16. Inferring cortical function in the mouse visual system through large-scale systems neuroscience.

    PubMed

    Hawrylycz, Michael; Anastassiou, Costas; Arkhipov, Anton; Berg, Jim; Buice, Michael; Cain, Nicholas; Gouwens, Nathan W; Gratiy, Sergey; Iyer, Ramakrishnan; Lee, Jung Hoon; Mihalas, Stefan; Mitelut, Catalin; Olsen, Shawn; Reid, R Clay; Teeter, Corinne; de Vries, Saskia; Waters, Jack; Zeng, Hongkui; Koch, Christof

    2016-07-05

    The scientific mission of Project MindScope is to understand neocortex, the part of the mammalian brain that gives rise to perception, memory, intelligence, and consciousness. We seek to quantitatively evaluate the hypothesis that neocortex is a relatively homogeneous tissue, with smaller functional modules that perform a common computational function replicated across regions. We here focus on the mouse as a mammalian model organism with genetics, physiology, and behavior that can be readily studied and manipulated in the laboratory. We seek to describe the operation of cortical circuitry at the computational level by comprehensively cataloging and characterizing its cellular building blocks along with their dynamics and their cell type-specific connectivities. The project is also building large-scale experimental platforms (i.e., brain observatories) to record the activity of large populations of cortical neurons in behaving mice subject to visual stimuli. A primary goal is to understand the series of operations from visual input in the retina to behavior by observing and modeling the physical transformations of signals in the corticothalamic system. We here focus on the contribution that computer modeling and theory make to this long-term effort.

  17. The study of integration about measurable image and 4D production

    NASA Astrophysics Data System (ADS)

    Zhang, Chunsen; Hu, Pingbo; Niu, Weiyun

    2008-12-01

    In this paper, we create geospatial data for three-dimensional (3D) modeling by combining digital photogrammetry and digital close-range photogrammetry. For the large-scale geographical background, we establish a 3D landscape model from the DEM and DOM produced by digital photogrammetry, which uses aerial image data to generate the "4D" products (DOM: Digital Orthophoto Map, DEM: Digital Elevation Model, DLG: Digital Line Graphic, and DRG: Digital Raster Graphic). For the buildings and other artificial features that users are interested in, we achieve realistic 3D reconstruction of the features using digital close-range photogrammetry, through the following steps: data collection with non-metric cameras, camera calibration, feature extraction, image matching, and further processing. Finally, we combine the 3D background of these large geographic data with locally measured real images, realizing the integration of measurable real imagery and the 4D products. The article discusses the complete workflow and technology, achieving 3D reconstruction and the integration of the large-scale 3D landscape with the metric building models.

  18. The Plant Phenology Ontology: A New Informatics Resource for Large-Scale Integration of Plant Phenology Data.

    PubMed

    Stucky, Brian J; Guralnick, Rob; Deck, John; Denny, Ellen G; Bolmgren, Kjell; Walls, Ramona

    2018-01-01

    Plant phenology - the timing of plant life-cycle events, such as flowering or leafing out - plays a fundamental role in the functioning of terrestrial ecosystems, including human agricultural systems. Because plant phenology is often linked with climatic variables, there is widespread interest in developing a deeper understanding of global plant phenology patterns and trends. Although phenology data from around the world are currently available, truly global analyses of plant phenology have so far been difficult because the organizations producing large-scale phenology data are using non-standardized terminologies and metrics during data collection and data processing. To address this problem, we have developed the Plant Phenology Ontology (PPO). The PPO provides the standardized vocabulary and semantic framework that is needed for large-scale integration of heterogeneous plant phenology data. Here, we describe the PPO, and we also report preliminary results of using the PPO and a new data processing pipeline to build a large dataset of phenology information from North America and Europe.
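The integration step described above can be sketched as a term crosswalk from source-specific labels to a shared vocabulary; the network names and phenophase labels below are hypothetical illustrations, not actual PPO terms, and the real ontology adds formal semantics on top of such mappings.

```python
# Hypothetical crosswalk from source-specific phenophase labels to a
# shared vocabulary (the PPO plays this role, with formal semantics).
CROSSWALK = {
    ("usa-npn", "open flowers"): "flower present",
    ("pep725", "beginning of flowering"): "flower present",
    ("usa-npn", "leaves"): "leaf present",
}

def standardize(records):
    """Map (network, phenophase, date) records onto shared terms,
    dropping observations with no known mapping."""
    out = []
    for network, phase, date in records:
        term = CROSSWALK.get((network, phase.lower()))
        if term:
            out.append((term, date))
    return out

obs = [("usa-npn", "Open flowers", "2016-04-02"),
       ("pep725", "Beginning of flowering", "2016-04-05"),
       ("usa-npn", "Fruit drop", "2016-09-01")]
merged = standardize(obs)
# Two heterogeneous flowering records now share one term; the unmapped
# "Fruit drop" observation is excluded rather than mis-integrated.
```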

  19. Building America Case Study: Performance of a Hot-Dry Climate Whole House Retrofit, Stockton, California (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ARBI

    2014-09-01

    The Stockton house retrofit is a deep retrofit of a two-story Tudor-style single-family home in the hot-dry climate of Stockton, CA. The home is representative of the deep retrofit option among the scaled home energy upgrade packages offered to targeted neighborhoods under the pilot Large-Scale Retrofit Program (LSRP) administered by the Alliance for Residential Building Innovation (ARBI). Deep retrofit packages expand on the standard package by adding HVAC, water heater, and window upgrades to the ducting, attic and floor insulation, domestic hot water insulation, envelope sealing, lighting, and ventilation upgrades. Site energy savings with the deep retrofit were 23% compared to the pre-retrofit case, 15% higher than the savings estimated for the standard retrofit package. Energy savings were largely a result of the water heater upgrade and a combination of the envelope sealing, insulation, and HVAC upgrades. The HVAC system was of higher efficiency than the building code standard. Overall, the financed retrofit would have been more cost effective had a less expensive HVAC system been selected and the barriers to wall insulation been remedied. The homeowner experienced improved comfort throughout the monitored period and was satisfied with the resulting utility bill savings.

  20. Incorporation of spatial interactions in location networks to identify critical geo-referenced routes for assessing disease control measures on a large-scale campus.

    PubMed

    Wen, Tzai-Hung; Chin, Wei Chien Benny

    2015-04-14

    Respiratory diseases mainly spread through interpersonal contact. Class suspension is the most direct strategy to prevent the spread of disease through elementary or secondary schools by blocking the contact network. However, as university students usually attend courses in different buildings, the daily contact patterns on a university campus are complicated, and once disease clusters have occurred, suspending classes is far from an efficient strategy to control disease spread. The purpose of this study is to propose a methodological framework for generating campus location networks from a routine administration database, analyzing the community structure of the network, and identifying the critical links and nodes for blocking respiratory disease transmission. The data comes from the student enrollment records of a major comprehensive university in Taiwan. We combined the social network analysis and spatial interaction model to establish a geo-referenced community structure among the classroom buildings. We also identified the critical links among the communities that were acting as contact bridges and explored the changes in the location network after the sequential removal of the high-risk buildings. Instead of conducting a questionnaire survey, the study established a standard procedure for constructing a location network on a large-scale campus from a routine curriculum database. We also present how a location network structure at a campus could function to target the high-risk buildings as the bridges connecting communities for blocking disease transmission.
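A minimal sketch of the network-construction idea, assuming enrollment records reduce to (student, building) pairs; the toy records, weighting, and removal step below are illustrative, not the authors' actual procedure:

```python
from collections import defaultdict
from itertools import combinations

def build_location_network(enrollments):
    """Build a weighted building-to-building contact network from
    (student, building) records: two buildings are linked whenever at
    least one student attends courses in both."""
    visits = defaultdict(set)
    for student, building in enrollments:
        visits[student].add(building)
    edges = defaultdict(int)
    for buildings in visits.values():
        for a, b in combinations(sorted(buildings), 2):
            edges[(a, b)] += 1  # edge weight = number of shared students
    return dict(edges)

def remove_building(edges, building):
    """Simulate closing one high-risk building by dropping its links."""
    return {e: w for e, w in edges.items() if building not in e}

# Toy (student_id, building) records.
records = [(1, "A"), (1, "B"), (2, "B"), (2, "C"),
           (3, "A"), (3, "C"), (4, "B"), (4, "C")]
net = build_location_network(records)
reduced = remove_building(net, "B")  # closing "B" removes its bridges
```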

  1. LiDAR Change Detection Using Building Models

    NASA Astrophysics Data System (ADS)

    Kim, Angela M.; Runyon, Scott C.; Jalobeanu, Andre; Esterline, Chelsea H.; Kruse, Fred A.

    2014-06-01

    Terrestrial LiDAR scans of building models collected with a FARO Focus3D and a RIEGL VZ-400 were used to investigate point-to-point and model-to-model LiDAR change detection. LiDAR data were scaled, decimated, and georegistered to mimic real world airborne collects. Two physical building models were used to explore various aspects of the change detection process. The first model was a 1:250-scale representation of the Naval Postgraduate School campus in Monterey, CA, constructed from Lego blocks and scanned in a laboratory setting using both the FARO and RIEGL. The second model at 1:8-scale consisted of large cardboard boxes placed outdoors and scanned from rooftops of adjacent buildings using the RIEGL. A point-to-point change detection scheme was applied directly to the point-cloud datasets. In the model-to-model change detection scheme, changes were detected by comparing Digital Surface Models (DSMs). The use of physical models allowed analysis of effects of changes in scanner and scanning geometry, and performance of the change detection methods on different types of changes, including building collapse or subsidence, construction, and shifts in location. Results indicate that at low false-alarm rates, the point-to-point method slightly outperforms the model-to-model method. The point-to-point method is less sensitive to misregistration errors in the data. Best results are obtained when the baseline and change datasets are collected using the same LiDAR system and collection geometry.
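The model-to-model scheme can be sketched as DSM differencing; the gridding rule and change threshold below are illustrative assumptions, not the parameters used in the study:

```python
def rasterize_dsm(points, cell=1.0):
    """Grid a point cloud into a Digital Surface Model: keep the
    highest z value falling in each (x, y) cell."""
    dsm = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        if key not in dsm or z > dsm[key]:
            dsm[key] = z
    return dsm

def dsm_change(before, after, threshold=0.5):
    """Flag cells whose surface height changed by more than `threshold`
    between the baseline and change epochs."""
    cells = set(before) | set(after)
    return {c for c in cells
            if abs(after.get(c, 0.0) - before.get(c, 0.0)) > threshold}

# A small "building" present in epoch 1 is demolished by epoch 2.
epoch1 = [(0.5, 0.5, 1.0), (1.5, 0.5, 1.0), (5.5, 5.5, 0.1)]
epoch2 = [(0.5, 0.5, 0.0), (1.5, 0.5, 0.0), (5.5, 5.5, 0.1)]
changed = dsm_change(rasterize_dsm(epoch1), rasterize_dsm(epoch2))
```

A point-to-point scheme would instead compare each point against its nearest neighbors in the other cloud, which is why it degrades more gracefully under misregistration.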

  2. Gossip-Based Broadcast

    NASA Astrophysics Data System (ADS)

    Leitão, João; Pereira, José; Rodrigues, Luís

    Gossip, or epidemic, protocols have emerged as a powerful strategy to implement highly scalable and resilient reliable broadcast primitives on large scale peer-to-peer networks. Epidemic protocols are scalable because they distribute the load among all nodes in the system and resilient because they have an intrinsic level of redundancy that masks node and network failures. This chapter provides an introduction to gossip-based broadcast on large-scale unstructured peer-to-peer overlay networks: it surveys the main results in the field, discusses techniques to build and maintain the overlays that support efficient dissemination strategies, and provides an in-depth discussion and experimental evaluation of two concrete protocols, named HyParView and Plumtree.

  3. Hierarchical Genetic Analysis of German Cockroach (Blattella germanica) Populations from within Buildings to across Continents

    PubMed Central

    Vargo, Edward L.; Crissman, Jonathan R.; Booth, Warren; Santangelo, Richard G.; Mukha, Dmitry V.; Schal, Coby

    2014-01-01

    Understanding the population structure of species that disperse primarily by human transport is essential to predicting and controlling human-mediated spread of invasive species. The German cockroach (Blattella germanica) is a widespread urban invader that can actively disperse within buildings but is spread solely by human-mediated dispersal over longer distances; however, its population structure is poorly understood. Using microsatellite markers we investigated population structure at several spatial scales, from populations within single apartment buildings to populations from several cities across the U.S. and Eurasia. Both traditional measures of genetic differentiation and Bayesian clustering methods revealed increasing levels of genetic differentiation at greater geographic scales. Our results are consistent with active dispersal of cockroaches largely limited to movement within a building. Their low levels of genetic differentiation, yet limited active spread between buildings, suggests a greater likelihood of human-mediated dispersal at more local scales (within a city) than at larger spatial scales (within and between continents). About half the populations from across the U.S. clustered together with other U.S. populations, and isolation by distance was evident across the U.S. Levels of genetic differentiation among Eurasian cities were greater than those in the U.S. and greater than those between the U.S. and Eurasia, but no clear pattern of structure at the continent level was detected. MtDNA sequence variation was low and failed to reveal any geographical structure. The weak genetic structure detected here is likely due to a combination of historical admixture among populations and periodic population bottlenecks and founder events, but more extensive studies are needed to determine whether signatures of global movement may be present in this species. PMID:25020136

  4. Potential utilization of the NASA/George C. Marshall Space Flight Center in earthquake engineering research

    NASA Technical Reports Server (NTRS)

    Scholl, R. E. (Editor)

    1979-01-01

    Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.

  5. Bin recycling strategy for improving the histogram precision on GPU

    NASA Astrophysics Data System (ADS)

    Cárdenas-Montes, Miguel; Rodríguez-Vázquez, Juan José; Vega-Rodríguez, Miguel A.

    2016-07-01

    A histogram is an easily comprehensible way to present data and analyses. In the current scientific context, with access to large volumes of data, the processing time for building histograms has dramatically increased. For this reason, parallel construction is necessary to alleviate the impact of processing time on analysis activities. In this scenario, GPU computing is becoming widely used to reduce the processing time of histogram construction to affordable levels. Along with the pressure on processing time, implementations are stressed on bin-count accuracy: accuracy issues arising from the particularities of an implementation are not usually taken into consideration when building histograms over very large data sets. In this work, a bin recycling strategy to create an accuracy-aware implementation for building histograms on GPU is presented. To evaluate the approach, the strategy was applied to the computation of the three-point angular correlation function, a relevant function in cosmology for the study of the Large Scale Structure of the Universe. As a consequence of this study, a high-accuracy implementation for histogram construction on GPU is proposed.
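A serial Python sketch of the bin-recycling idea, assuming narrow per-block counters that are flushed ("recycled") into a wide global accumulator before they can overflow; the paper's actual GPU implementation details may differ:

```python
import random

def histogram_with_recycling(data, nbins, lo, hi, counter_max=255):
    """Accuracy-aware histogram sketch: accumulate into small 'shared'
    counters (as a GPU thread block would) and flush each bin into a
    wide, exact global accumulator before the narrow counter overflows."""
    width = (hi - lo) / nbins
    shared = [0] * nbins       # narrow, fast counters (e.g., 8-bit)
    global_hist = [0] * nbins  # wide, exact accumulator
    for x in data:
        b = min(int((x - lo) / width), nbins - 1)
        if shared[b] == counter_max:  # would overflow: recycle the bin
            global_hist[b] += shared[b]
            shared[b] = 0
        shared[b] += 1
    for b in range(nbins):            # final flush of all partial counts
        global_hist[b] += shared[b]
    return global_hist

random.seed(0)
data = [random.random() for _ in range(100_000)]
hist = histogram_with_recycling(data, nbins=4, lo=0.0, hi=1.0)
assert sum(hist) == len(data)  # no counts lost to counter overflow
```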

  6. DEVELOPMENT OF A SCALABLE, LOW-COST, ULTRANANOCRYSTALLINE DIAMOND ELECTROCHEMICAL PROCESS FOR THE DESTRUCTION OF CONTAMINANTS OF EMERGING CONCERN (CECS) - PHASE II

    EPA Science Inventory

    This Small Business Innovation Research (SBIR) Phase II project will employ the large-scale, highly reliable boron-doped ultrananocrystalline diamond (BD-UNCD®) electrodes developed during the Phase I project to build and test an Electrochemical Anodic Oxidation process (EAOP)...

  7. The Developmental Evaluation of School Improvement Networks

    ERIC Educational Resources Information Center

    Peurach, Donald J.; Glazer, Joshua L.; Winchell Lenhoff, Sarah

    2016-01-01

    The national education reform agenda has rapidly expanded to include attention to continuous improvement research in education. The purpose of this analysis is to propose a new approach to "developmental evaluation" aimed at building a foundation for continuous improvement in large-scale school improvement networks, on the argument that…

  8. Architectural and Mobility Management Designs in Internet-Based Infrastructure Wireless Mesh Networks

    ERIC Educational Resources Information Center

    Zhao, Weiyi

    2011-01-01

    Wireless mesh networks (WMNs) have recently emerged to be a cost-effective solution to support large-scale wireless Internet access. They have numerous applications, such as broadband Internet access, building automation, and intelligent transportation systems. One research challenge for Internet-based WMNs is to design efficient mobility…

  9. Building software tools to help contextualize and interpret monitoring data

    USDA-ARS?s Scientific Manuscript database

    Even modest monitoring efforts at landscape scales produce large volumes of data. These are most useful if they can be interpreted relative to land potential or other similar sites. However, for many ecological systems reference conditions may not be defined or are poorly described, which hinders und...

  10. Three Conceptual Replication Studies in Group Theory

    ERIC Educational Resources Information Center

    Melhuish, Kathleen

    2018-01-01

    Many studies in mathematics education research occur with a nonrepresentative sample and are never replicated. To challenge this paradigm, I designed a large-scale study evaluating student conceptions in group theory that surveyed a national, representative sample of students. By replicating questions previously used to build theory around student…

  11. In Search of the Eco-Teacher: Public School Edition

    ERIC Educational Resources Information Center

    Blenkinsop, Sean

    2014-01-01

    This paper uses an innovative building-less Canadian public elementary school and its accompanying large-scale research project to consider the characteristics that might be required of a teacher interested in working in an emergent, environmental, place- and community-based experiential public school setting. The six characteristics considered…

  12. Detecting Item Drift in Large-Scale Testing

    ERIC Educational Resources Information Center

    Guo, Hongwen; Robin, Frederic; Dorans, Neil

    2017-01-01

    The early detection of item drift is an important issue for frequently administered testing programs because items are reused over time. Unfortunately, operational data tend to be very sparse and do not lend themselves to frequent monitoring analyses, particularly for on-demand testing. Building on existing residual analyses, the authors propose…

  13. Forced Imbibition in Porous Media: A Fourfold Scenario

    NASA Astrophysics Data System (ADS)

    Odier, Céleste; Levaché, Bertrand; Santanach-Carreras, Enric; Bartolo, Denis

    2017-11-01

    We establish a comprehensive description of the patterns formed when a wetting liquid displaces a viscous fluid confined in a porous medium. Building on model microfluidic experiments, we evidence four imbibition scenarios all yielding different large-scale morphologies. Combining high-resolution imaging and confocal microscopy, we show that they originate from two liquid-entrainment transitions and a Rayleigh-Plateau instability at the pore scale. Finally, we demonstrate and explain the long-time coarsening of the resulting patterns.

  14. Microfluidic large-scale integration: the evolution of design rules for biological automation.

    PubMed

    Melin, Jessica; Quake, Stephen R

    2007-01-01

    Microfluidic large-scale integration (mLSI) refers to the development of microfluidic chips with thousands of integrated micromechanical valves and control components. This technology is utilized in many areas of biology and chemistry and is a candidate to replace today's conventional automation paradigm, which consists of fluid-handling robots. We review the basic development of mLSI and then discuss design principles of mLSI to assess the capabilities and limitations of the current state of the art and to facilitate the application of mLSI to areas of biology. Many design and practical issues, including economies of scale, parallelization strategies, multiplexing, and multistep biochemical processing, are discussed. Several microfluidic components used as building blocks to create effective, complex, and highly integrated microfluidic networks are also highlighted.

  15. A dynamical systems approach to studying midlatitude weather extremes

    NASA Astrophysics Data System (ADS)

    Messori, Gabriele; Caballero, Rodrigo; Faranda, Davide

    2017-04-01

    Extreme weather occurrences carry enormous social and economic costs and routinely garner widespread scientific and media coverage. The ability to predict these events is therefore a topic of crucial importance. Here we propose a novel predictability pathway for extreme events, by building upon recent advances in dynamical systems theory. We show that simple dynamical systems metrics can be used to identify sets of large-scale atmospheric flow patterns with similar spatial structure and temporal evolution on time scales of several days to a week. In regions where these patterns favor extreme weather, they afford a particularly good predictability of the extremes. We specifically test this technique on the atmospheric circulation in the North Atlantic region, where it provides predictability of large-scale wintertime surface temperature extremes in Europe up to 1 week in advance.
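The analog-selection step underlying such dynamical-systems metrics can be sketched as a nearest-neighbor search over archived flow patterns (an illustrative simplification, not the authors' exact metric):

```python
import math

def nearest_analogs(archive, target, k=3):
    """Indices of the k archived patterns closest to the target in
    Euclidean distance, as a proxy for flow-pattern similarity."""
    ranked = sorted(range(len(archive)),
                    key=lambda i: math.dist(archive[i], target))
    return ranked[:k]

# Toy two-gridpoint "flow patterns": find past states resembling today's.
archive = [(0.0, 0.0), (1.0, 1.1), (5.0, 5.0), (1.2, 0.9)]
today = (1.0, 1.0)
analogs = nearest_analogs(archive, today, k=2)
```

How tightly such analogs cluster, and how quickly they disperse over the following days, is what quantifies the local predictability of the current pattern.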

  16. Building Units Design and Scale Chemistry

    NASA Astrophysics Data System (ADS)

    Férey, Gérard

    2000-06-01

    The concept of a building unit (BU) is used in two ways: the first is as an a posteriori tool for the description of structures, which can be used to imagine new topologies originating from the description; the second, restricted to routes leading to the solid from solution, starts from the reality of these building units in solution to design new solids obtained by the tuned precipitation of these BUs with proper counterions. The room-temperature and hydrothermal routes are examined. The existence of BUs of different sizes with close topologies, revealed by numerous examples, leads us to define the notion of "scale chemistry," which concerns the construction of solids from various BUs, either organic, hybrid, or inorganic, and the consequences this has for the corresponding frameworks and the voids they generate. Not only the framework is important: applications of the existence of large cavities are also discussed. The paper ends with a discussion of the new trends which arise from this topological concept.

  17. Large-scale modular biofiltration system for effective odor removal in a composting facility.

    PubMed

    Lin, Yueh-Hsien; Chen, Yu-Pei; Ho, Kuo-Ling; Lee, Tsung-Yih; Tseng, Ching-Ping

    2013-01-01

    Several different foul odors, such as nitrogen-containing compounds, sulfur-containing compounds, and short-chain fatty acids, are commonly emitted from composting facilities. In this study, an experimental laboratory-scale bioreactor was scaled up to build a large-scale modular biofiltration system that can process 34 m³ min⁻¹ of waste gases. This modular reactor system proved effective in eliminating odors, with a 97% removal efficiency for 96 ppm ammonia, a 98% removal efficiency for 220 ppm amines, and 100% removal of other odorous substances. The operational parameters indicate that this modular biofiltration system offers long-term operational stability. Specifically, a low pressure drop (<45 mmH₂O m⁻¹) was observed, indicating that the packing carrier in the bioreactor units does not require frequent replacement. Thus, this modular biofiltration system can be used in field applications to eliminate various odors within a compact working volume.
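The reported removal efficiencies follow the standard inlet/outlet definition; a quick check (the outlet concentration below is back-computed for illustration, not a measured value):

```python
def removal_efficiency(c_in, c_out):
    """Fraction of the inlet concentration eliminated across the filter."""
    return (c_in - c_out) / c_in

# A 97% efficiency on 96 ppm inlet ammonia implies an outlet near 2.9 ppm.
eff = removal_efficiency(96.0, 2.88)
```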

  18. Large-scale building scenes reconstruction from close-range images based on line and plane feature

    NASA Astrophysics Data System (ADS)

    Ding, Yi; Zhang, Jianqing

    2007-11-01

    Automatically generating 3D models of buildings and other man-made structures from images has become a topic of increasing importance; such models may be used in applications such as virtual reality, the entertainment industry, and urban planning. In this paper we address the main problems of, and available solutions for, the generation of 3D models from terrestrial images. We first generate a coarse planar model of the principal scene planes and then reconstruct windows to refine the building models. There are several points of novelty: first, we reconstruct the coarse wire-frame model using line-segment matching under an epipolar geometry constraint; second, we detect the positions of all windows in the image and reconstruct the windows by establishing corner-point correspondences between images, then add the windows to the coarse model to refine the building models. The strategy is illustrated on an image triple of a college building.

  19. Shake Table Testing of an Elevator System in a Full-Scale Five-Story Building

    PubMed Central

    Wang, Xiang; Hutchinson, Tara C.; Astroza, Rodrigo; Conte, Joel P.; Restrepo, José I.; Hoehler, Matthew S.; Ribeiro, Waldir

    2016-01-01

    This paper investigates the seismic performance of a functional traction elevator as part of a full-scale five-story building shake table test program. The test building was subjected to a suite of earthquake input motions of increasing intensity, first while the building was isolated at its base, and subsequently while it was fixed to the shake table platen. In addition, low-amplitude white noise base excitation tests were conducted while the elevator system was placed in three different configurations, namely, by varying the vertical location of its cabin and counterweight, to study the acceleration amplifications of the elevator components due to dynamic excitations. During the earthquake tests, detailed observation of the physical damage and operability of the elevator as well as its measured response are reported. Although the cabin and counterweight sustained large accelerations due to impact during these tests, the use of well-restrained guide shoes demonstrated its effectiveness in preventing the cabin and counterweight from derailment during high-intensity earthquake shaking. However, differential displacements induced by the building imposed undesirable distortion of the elevator components and their surrounding support structure, which caused damage and inoperability of the elevator doors. It is recommended that these aspects be explicitly considered in elevator seismic design. PMID:28242957

  1. Centimeter-Scale 2D van der Waals Vertical Heterostructures Integrated on Deformable Substrates Enabled by Gold Sacrificial Layer-Assisted Growth.

    PubMed

    Islam, Md Ashraful; Kim, Jung Han; Schropp, Anthony; Kalita, Hirokjyoti; Choudhary, Nitin; Weitzman, Dylan; Khondaker, Saiful I; Oh, Kyu Hwan; Roy, Tania; Chung, Hee-Suk; Jung, Yeonwoong

    2017-10-11

    Two-dimensional (2D) transition metal dichalcogenides (TMDs) such as molybdenum or tungsten disulfides (MoS 2 or WS 2 ) exhibit extremely large in-plane strain limits and unusual optical/electrical properties, offering unprecedented opportunities for flexible electronics/optoelectronics in new form factors. In order for them to be technologically viable building-blocks for such emerging technologies, it is critically demanded to grow/integrate them onto flexible or arbitrary-shaped substrates on a large wafer-scale compatible with the prevailing microelectronics processes. However, conventional approaches to assemble them on such unconventional substrates via mechanical exfoliations or coevaporation chemical growths have been limited to small-area transfers of 2D TMD layers with uncontrolled spatial homogeneity. Moreover, additional processes involving a prolonged exposure to strong chemical etchants have been required for the separation of as-grown 2D layers, which is detrimental to their material properties. Herein, we report a viable strategy to universally combine the centimeter-scale growth of various 2D TMD layers and their direct assemblies on mechanically deformable substrates. By exploring the water-assisted debonding of gold (Au) interfaced with silicon dioxide (SiO 2 ), we demonstrate the direct growth, transfer, and integration of 2D TMD layers and heterostructures such as 2D MoS 2 and 2D MoS 2 /WS 2 vertical stacks on centimeter-scale plastic and metal foil substrates. We identify the dual function of the Au layer as a growth substrate as well as a sacrificial layer which facilitates 2D layer transfer. Furthermore, we demonstrate the versatility of this integration approach by fabricating centimeter-scale 2D MoS 2 /single walled carbon nanotube (SWNT) vertical heterojunctions which exhibit current rectification and photoresponse. This study opens a pathway to explore large-scale 2D TMD van der Waals layers as device building blocks for emerging mechanically deformable electronics/optoelectronics.

  2. Invention, design and performance of coconut agrowaste fiberboards for ecologically efficacious buildings

    NASA Astrophysics Data System (ADS)

    Lokko, Mae-ling Jovenes

    As global quantities of waste by-products from food production as well as the range of their applications increase, researchers are realizing critical opportunities to transform the burden of underutilized wastes into ecological profits. Within the tropical hot-humid region, where half the world's current and projected future population growth is concentrated, there is a dire demand for building materials to meet ambitious development schemes and rising housing deficits. However, the building sector has largely overlooked the potential of local agricultural wastes to serve as alternatives to energy-intensive, imported building technologies. Industrial ecologists have recently investigated the use of agrowaste biocomposites to replace conventional wood products that use harmful urea-formaldehyde, phenolic and isocyanate resins. Furthermore, developments in the performance of building material systems with respect to cost, energy, air quality management and construction innovation have evolved metrics about what constitutes material 'upcycling' within the building life cycle. While these developments have largely been focused on technical and cost performance, much less attention has been paid to addressing deeply-seated social and cultural barriers to adoption that have sedimented over decades of importation. This dissertation evaluates the development of coconut agricultural building material systems in four phases: (i) non-toxic, low-energy production of medium-to-high-density boards (500-1200 kg/m3) from coconut fibers and emerging biobinders; (ii) characterization and evaluation of the hygrothermal performance of coconut agricultural building materials; (iii) scaled-up design development of coconut modular building material systems; and (iv) development of a value translation framework for the bottom-up distribution of value to stakeholders within the upcycling framework. This integrated design methodology is significant for developing ecological thinking around agrowaste building materials, influencing social and cultural acceptability, and creating value translation frameworks that sufficiently characterize the composite value proposition of upcycled building systems.

  3. Linking native and invader traits explains native spider population responses to plant invasion

    Treesearch

    Jennifer N. Smith; Douglas J. Emlen; Dean E. Pearson

    2016-01-01

    Theoretically, the functional traits of native species should determine how natives respond to invader-driven changes. To explore this idea, we simulated a large-scale plant invasion using dead spotted knapweed (Centaurea stoebe) stems to determine if native spiders' web-building behaviors could explain differences in spider population responses to...

  4. Designing Large-Scale Multisite and Cluster-Randomized Studies of Professional Development

    ERIC Educational Resources Information Center

    Kelcey, Ben; Spybrook, Jessaca; Phelps, Geoffrey; Jones, Nathan; Zhang, Jiaqi

    2017-01-01

    We develop a theoretical and empirical basis for the design of teacher professional development studies. We build on previous work by (a) developing estimates of intraclass correlation coefficients for teacher outcomes using two- and three-level data structures, (b) developing estimates of the variance explained by covariates, and (c) modifying…

  5. Improving Teacher Practice: Teachers' Perspectives on Capacity-Building Initiatives in Literacy

    ERIC Educational Resources Information Center

    Mattos, Joseph C.

    2011-01-01

    Educational research over the past 15 years shows that schools and school districts have, on a large scale, failed to translate reform goals into improved teacher practice and student learning. Although classroom teachers are central to successful school reform, research has rarely examined how teachers experience reform initiatives and how that…

  6. The Water Turbine: An Integrative STEM Education Context

    ERIC Educational Resources Information Center

    Grubbs, Michael E.; Deck, Anita

    2015-01-01

    Water turbines have long been used to make work easier for humans while minimizing energy consumption. They are not only used in small- and large-scale operations, but also provide a great context for Integrative STEM education. Students can begin to understand the technological processes available by designing, building, and testing different…

  7. Building Effective Green Energy Programs in Community Colleges

    ERIC Educational Resources Information Center

    Bozell, Maureen R.; Liston, Cynthia D.

    2010-01-01

    Community colleges across the country are engaged in large-scale federal and state initiatives to train low-income individuals for the nascent field that's become known as "green jobs." Many green economy advocates believe that green jobs training can be part of career pathways that help move unemployed and disconnected individuals--who are often…

  8. How Today's Undergraduate Students See Themselves as Tomorrow's Socially Responsible Leaders

    ERIC Educational Resources Information Center

    Ricketts, Kristina G.; Bruce, Jacklyn A.; Ewing, John C.

    2008-01-01

    A new generation of leaders is needed not only to build local partnerships in today's communities, but to assume all positions of leadership. Undergraduate students within a College of Agricultural Sciences at a large land grant university were given the Socially Responsible Leadership Scale (SLRS) to determine their self-perception of leadership…

  9. Unsettling Ourselves: Some Thoughts on Non-Native Participation in Decolonization Work

    ERIC Educational Resources Information Center

    Soltys, Matt

    2011-01-01

    The ecological impact of colonialism is inextricable from empire building, industrialism, large-scale deforestation, and agriculture. Not long ago one could safely drink from nearly every lake, river, stream, and spring, and one could hunt animals as a part of intact ecosystems. Today's world is very different. Colonization alters the reality…

  10. From Networked Learning to Operational Practice: Constructing and Transferring Superintendent Knowledge in a Regional Instructional Rounds Network

    ERIC Educational Resources Information Center

    Travis, Timothy J.

    2015-01-01

    Instructional rounds are an emerging network structure with processes and protocols designed to develop superintendents' knowledge and skills in leading large-scale improvement, to enable superintendents to build an infrastructure that supports the work of improvement, to assist superintendents in distributing leadership throughout their district,…

  11. Solar energy use in U.S. agriculture. Overview and policy issues

    USDA-ARS?s Scientific Manuscript database

    Using solar energy on farms for livestock watering, electric fence charging, and building lighting is not new in the United States, but in the past five years it has increasingly been used for large-scale irrigation, heating water in dairies, and running motors/appliances in farm houses and b...

  12. Virtual Environments Supporting Learning and Communication in Special Needs Education

    ERIC Educational Resources Information Center

    Cobb, Sue V. G.

    2007-01-01

    Virtual reality (VR) describes a set of technologies that allow users to explore and experience 3-dimensional computer-generated "worlds" or "environments." These virtual environments can contain representations of real or imaginary objects on a small or large scale (from modeling of molecular structures to buildings, streets, and scenery of a…

  13. The Building of Multimedia Communications Network based on Session Initiation Protocol

    NASA Astrophysics Data System (ADS)

    Yuexiao, Han; Yanfu, Zhang

    In this paper, we present a novel design for a distributed multimedia communications network. We introduce the distribution strategy, the flow procedure, and the particular structure, and we analyze the scalability, stability, robustness, extensibility, and transmission delay of this architecture. Finally, the results show that our framework is suitable for very large-scale communications.

  14. Building the Case for Large Scale Behavioral Education Adoptions

    ERIC Educational Resources Information Center

    Layng, Zachary R.; Layng, T. V. Joe

    2012-01-01

    Behaviorally-designed educational programs are often based on a research tradition that is not widely understood by potential users of the programs. Though the data may be sound and the prediction of outcomes for individual learners quite good, those advocating adoption of behaviorally-designed educational programs may need to do more in order to…

  15. Influence of small-scale disturbances by kangaroo rats on Chihuahuan Desert ants

    Treesearch

    R. L. Schooley; B. T. Bestelmeyer; J. F. Kelly

    2000-01-01

    Banner-tailed kangaroo rats (Dipodomys spectabilis) are prominent ecosystem engineers that build large mounds that influence the spatial structuring of fungi, plants, and some ground-dwelling animals. Ants are diverse and functionally important components of arid ecosystems; some species are also ecosystem engineers. We investigated the effects of...

  16. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications

    PubMed Central

    Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.

    2018-01-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069
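    SOCRAT itself is a JavaScript web platform; purely as a language-neutral illustration of the module-based architecture the abstract describes (decoupled modules registered with, and invoked through, a central mediator), a minimal sketch in Python:

```python
class ModuleRegistry:
    """Central mediator: modules register under a name and are invoked
    through the registry, so data management, visualization, and analysis
    components stay decoupled and individually replaceable."""
    def __init__(self):
        self._modules = {}

    def register(self, name, handler):
        self._modules[name] = handler

    def invoke(self, name, payload):
        return self._modules[name](payload)

app = ModuleRegistry()
app.register("data", lambda raw: [float(v) for v in raw.split(",")])
app.register("analysis", lambda xs: sum(xs) / len(xs))

values = app.invoke("data", "1,2,3,4")   # raw input -> internal representation
print(app.invoke("analysis", values))    # 2.5
```

    Swapping the "analysis" handler for another function changes behavior without touching the data module, which is the re-use and extension property the platform's multi-level modularity aims at.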

  18. Rigging dark haloes: why is hierarchical galaxy formation consistent with the inside-out build-up of thin discs?

    NASA Astrophysics Data System (ADS)

    Pichon, C.; Pogosyan, D.; Kimm, T.; Slyz, A.; Devriendt, J.; Dubois, Y.

    2011-12-01

    State-of-the-art hydrodynamical simulations show that gas inflow through the virial sphere of dark matter haloes is focused (i.e. has a preferred inflow direction), consistent (i.e. its orientation is steady in time) and amplified (i.e. the amplitude of its advected specific angular momentum increases with time). We explain this to be a consequence of the dynamics of the cosmic web within the neighbourhood of the halo, which produces steady, angular momentum rich, filamentary inflow of cold gas. On large scales, the dynamics within neighbouring patches drives matter out of the surrounding voids, into walls and filaments before it finally gets accreted on to virialized dark matter haloes. As these walls/filaments constitute the boundaries of asymmetric voids, they acquire a net transverse motion, which explains the angular momentum rich nature of the later infall which comes from further away. We conjecture that this large-scale driven consistency explains why cold flows are so efficient at building up high-redshift thin discs inside out.

  19. Age distribution of human gene families shows significant roles of both large- and small-scale duplications in vertebrate evolution.

    PubMed

    Gu, Xun; Wang, Yufeng; Gu, Jianying

    2002-06-01

    The classical (two-round) hypothesis of vertebrate genome duplication proposes two successive whole-genome duplication(s) (polyploidizations) predating the origin of fishes, a view now being seriously challenged. As the debate largely concerns the relative merits of the 'big-bang mode' theory (large-scale duplication) and the 'continuous mode' theory (constant creation by small-scale duplications), we tested whether a significant proportion of paralogous genes in the contemporary human genome was indeed generated in the early stage of vertebrate evolution. After an extensive search of major databases, we dated 1,739 gene duplication events from the phylogenetic analysis of 749 vertebrate gene families. We found a pattern characterized by two waves (I, II) and an ancient component. Wave I represents a recent gene family expansion by tandem or segmental duplications, whereas wave II, a rapid paralogous gene increase in the early stage of vertebrate evolution, supports the idea of genome duplication(s) (the big-bang mode). Further analysis indicated that large- and small-scale gene duplications both make a significant contribution during the early stage of vertebrate evolution to build the current hierarchy of the human proteome.
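    Duplication events in such studies are commonly dated with a molecular clock on synonymous distances; as an illustrative sketch (the Ks value and clock rate below are hypothetical, and the paper's own dating came from phylogenetic analysis of 749 gene families):

```python
def duplication_age_years(ks, rate_per_site_per_year):
    """Molecular-clock dating: synonymous substitutions accumulate on both
    paralog lineages after a duplication, so T = Ks / (2 * rate)."""
    return ks / (2.0 * rate_per_site_per_year)

# Hypothetical paralog pair: Ks = 1.2 subst./site, clock rate 1.5e-9 /site/year
age = duplication_age_years(1.2, 1.5e-9)
print(age / 1e6)   # ~400 million years, i.e. the early stage of vertebrate evolution
```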

  20. New Markets for Solar Photovoltaic Power Systems

    NASA Astrophysics Data System (ADS)

    Thomas, Chacko; Jennings, Philip; Singh, Dilawar

    2007-10-01

    Over the past five years solar photovoltaic (PV) power supply systems have matured and are now being deployed on a much larger scale. The traditional small-scale remote area power supply systems are still important and village electrification is also a large and growing market but large scale, grid-connected systems and building integrated systems are now being deployed in many countries. This growth has been aided by imaginative government policies in several countries and the overall result is a growth rate of over 40% per annum in the sales of PV systems. Optimistic forecasts are being made about the future of PV power as a major source of sustainable energy. Plans are now being formulated by the IEA for very large-scale PV installations of more than 100 MW peak output. The Australian Government has announced a subsidy for a large solar photovoltaic power station of 154 MW in Victoria, based on the concentrator technology developed in Australia. In Western Australia a proposal has been submitted to the State Government for a 2 MW photovoltaic power system to provide fringe of grid support at Perenjori. This paper outlines the technologies, designs, management and policies that underpin these exciting developments in solar PV power.

  1. Out Flying the Eagle: China’s Drive for Domestic Economic Innovation and Its Impact on U.S.-China Relations

    DTIC Science & Technology

    2014-03-01

    wind turbines from General Electric. China recognizes the issues with IPR but it is something that will take time to fix. It will be a significant... Large aircraft; Large-scale oil and gas exploration; Manned space, including lunar exploration; Next-generation broadband wireless... circuits, and building an innovation system for China’s integrated circuit (IC) manufacturing industry. 3. New generation broadband wireless mobile

  2. Towards physics responsible for large-scale Lyman-α forest bias parameters

    DOE PAGES

    Agnieszka M. Cieplak; Slosar, Anze

    2016-03-08

    Using a series of carefully constructed numerical experiments based on hydrodynamic cosmological SPH simulations, we attempt to build an intuition for the relevant physics behind the large scale density (b δ) and velocity gradient (b η) biases of the Lyman-α forest. Starting with the fluctuating Gunn-Peterson approximation applied to the smoothed total density field in real-space, and progressing through redshift-space with no thermal broadening, redshift-space with thermal broadening and hydrodynamically simulated baryon fields, we investigate how approximations found in the literature fare. We find that Seljak's 2012 analytical formulae for these bias parameters work surprisingly well in the limit of no thermal broadening and linear redshift-space distortions. We also show that his b η formula is exact in the limit of no thermal broadening. Since introduction of thermal broadening significantly affects its value, we speculate that a combination of large-scale measurements of b η and the small scale flux PDF might be a sensitive probe of the thermal state of the IGM. Lastly, we find that large-scale biases derived from the smoothed total matter field are within 10–20% of those based on hydrodynamical quantities, in line with other measurements in the literature.
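    The fluctuating Gunn-Peterson approximation the abstract starts from maps density to flux as F = exp(-A (1+δ)^β). A toy sketch of the linear flux response to a density perturbation, with hypothetical amplitude A and slope β (the paper's actual biases additionally involve smoothing, redshift-space distortions, and thermal broadening, none of which are modeled here):

```python
import numpy as np

A, beta = 0.3, 1.6   # hypothetical FGPA amplitude and power-law slope

def flux(delta):
    """Fluctuating Gunn-Peterson approximation: F = exp(-A * (1 + delta)**beta)."""
    return np.exp(-A * (1.0 + delta) ** beta)

# Linear response (1/F) dF/ddelta at delta = 0, which equals -A*beta
# analytically -- a toy stand-in for the density bias b_delta.
eps = 1e-6
b_numeric = (flux(eps) - flux(-eps)) / (2.0 * eps * flux(0.0))
print(b_numeric)   # close to -A * beta = -0.48
```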

  4. Indoor Air Quality of Residential Building Before and After Renovation

    NASA Astrophysics Data System (ADS)

    Sánka, Imrich; Földváry, Veronika

    2017-06-01

    This study investigates the impact of energy renovation on the indoor air quality of an apartment building during the heating season. The study was performed in one residential building before and after its renovation. An evaluation of the indoor air quality was performed using objective measurements and a subjective survey. The concentration of CO2 was measured in the bedrooms, and sampling of total volatile organic compounds (TVOC) was performed in the living rooms of the selected apartments. Higher concentrations of CO2 and TVOC were observed in the residential building after its renovation; in some cases the concentrations exceeded the recommended maximum limits, especially after the energy-saving measures were implemented on the building. The average air exchange rate was noticeably higher before the renovation of the building. The current study indicates that large-scale renovations may reduce the quality of the indoor environment in many apartments, especially in the winter season.
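    Air exchange rates of the kind reported here are often estimated with the tracer-gas (CO2) decay method; a minimal sketch with hypothetical concentrations (the abstract does not state how its rates were obtained):

```python
import math

def air_changes_per_hour(c_start, c_end, c_outdoor, hours):
    """Tracer-gas decay estimate of the air exchange rate:
    ACH = ln((C0 - Cout) / (Ct - Cout)) / t."""
    return math.log((c_start - c_outdoor) / (c_end - c_outdoor)) / hours

# Hypothetical bedroom: CO2 decays from 1400 ppm to 800 ppm over 2 h,
# with an outdoor background of 400 ppm.
ach = air_changes_per_hour(1400.0, 800.0, 400.0, 2.0)
print(round(ach, 3))   # 0.458 air changes per hour
```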

  5. Improving efficiency of polystyrene concrete production with composite binders

    NASA Astrophysics Data System (ADS)

    Lesovik, R. V.; Ageeva, M. S.; Lesovik, G. A.; Sopin, D. M.; Kazlitina, O. V.; Mitrokhina, A. A.

    2018-03-01

    According to leading marketing researchers, the construction market in Russia and the CIS will continue growing at a rapid rate; this applies not only to large-scale major construction, but also to the construction of single-family houses and small-scale industrial facilities. As a result, there are increased requirements for the heat insulation of building enclosures and a significant demand for efficient walling materials with high thermal performance. These developments have led to higher requirements imposed on the equipment that produces such materials.

  6. Evaluating trade-offs of a large, infrequent sediment diversion for restoration of a forested wetland in the Mississippi delta

    NASA Astrophysics Data System (ADS)

    Rutherford, Jeffrey S.; Day, John W.; D'Elia, Christopher F.; Wiegman, Adrian R. H.; Willson, Clinton S.; Caffey, Rex H.; Shaffer, Gary P.; Lane, Robert R.; Batker, David

    2018-04-01

    Flood control levees cut off the supply of sediment to Mississippi delta coastal wetlands, and contribute to putting much of the delta on a trajectory for continued submergence in the 21st century. River sediment diversions have been proposed as a method to provide a sustainable supply of sediment to the delta, but the frequency and magnitude of these diversions needs further assessment. Previous studies suggested operating river sediment diversions based on the size and frequency of natural crevasse events, which were large (>5000 m3/s) and infrequent (active < once a year) in the last naturally active delta. This study builds on these previous works by quantitatively assessing tradeoffs for a large, infrequent diversion into the forested wetlands of the Maurepas swamp. Land building was estimated for several diversion sizes and years inactive using a delta progradation model. A benefit-cost analysis (BCA) combined model land building results with an ecosystem service valuation and estimated costs. Results demonstrated that land building is proportional to diversion size and inversely proportional to years inactive. Because benefits were assumed to scale linearly with land gain, and costs increase with diversion size, there are disadvantages to operating large diversions less often, compared to smaller diversions more often for the immediate project area. Literature suggests that infrequent operation would provide additional gains (through increased benefits and reduced ecosystem service costs) to the broader Lake Maurepas-Pontchartrain-Borgne ecosystem. Future research should incorporate these additional effects into this type of BCA, to see if this changes the outcome for large, infrequent diversions.
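    The abstract's scaling claim (land building proportional to diversion size and inversely proportional to years inactive) and the benefit-cost comparison can be caricatured as follows; every constant here is hypothetical, not taken from the study:

```python
def land_built(k, discharge_m3s, years_inactive):
    """Toy scaling from the abstract: land building proportional to diversion
    size and inversely proportional to years inactive (k is a hypothetical
    calibration constant in km^2 per (m^3/s))."""
    return k * discharge_m3s / years_inactive

def benefit_cost_ratio(land_km2, benefit_per_km2, cost):
    """BCA core: ecosystem-service benefits of built land over project cost."""
    return land_km2 * benefit_per_km2 / cost

# Hypothetical 5000 m^3/s diversion operated every other year
area = land_built(k=0.01, discharge_m3s=5000.0, years_inactive=2.0)
print(area)                                     # 25.0 km^2
print(benefit_cost_ratio(area, 4.0e6, 5.0e7))   # 2.0
```

    Under this toy model, halving the operating frequency halves the land built while the capital cost is unchanged, which is the trade-off the study weighs against the ecosystem-wide gains of infrequent operation.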

  7. Four-center bubbled BPS solutions with a Gibbons-Hawking base

    NASA Astrophysics Data System (ADS)

    Heidmann, Pierre

    2017-10-01

    We construct four-center bubbled BPS solutions with a Gibbons-Hawking base space. We give a systematic procedure to build scaling solutions: starting from three-supertube configurations and using generalized spectral flows and gauge transformations to extend to solutions with four Gibbons-Hawking centers. This allows us to construct very large families of smooth horizonless solutions that have the same charges and angular momentum as supersymmetric black holes with a macroscopically large horizon area. Our construction reveals that all scaling solutions with four Gibbons-Hawking centers have an angular momentum at around 99% of the cosmic censorship bound. We give both an analytical and a numerical explanation for this unexpected feature.

  8. Bottom-up production of meta-atoms for optical magnetism in visible and NIR light

    NASA Astrophysics Data System (ADS)

    Barois, Philippe; Ponsinet, Virginie; Baron, Alexandre; Richetti, Philippe

    2018-02-01

    Many unusual optical properties of metamaterials arise from the magnetic response of engineered structures of sub-wavelength size (meta-atoms) exposed to light. The top-down approach, whereby engineered nanostructures of well-defined morphology are engraved on a surface, has proved successful for the generation of strong optical magnetism. However, it faces the limitations of high cost and small active area in visible light, where nanometre resolution is needed. The bottom-up approach, whereby the fabrication of metamaterials of large volume or large area results from the combination of nanochemistry and self-assembly techniques, may constitute a cost-effective alternative. This approach nevertheless requires the large-scale production of functional building-blocks (meta-atoms) bearing a strong magnetic optical response. We propose in this paper a few tracks that lead to the large-scale synthesis of magnetic metamaterials operating in visible or near-IR light.

  9. Angular Momentum Transport in Thin Magnetically Arrested Disks

    NASA Astrophysics Data System (ADS)

    Marshall, Megan D.; Avara, Mark J.; McKinney, Jonathan C.

    2018-05-01

    In accretion disks with large-scale ordered magnetic fields, the magnetorotational instability (MRI) is marginally suppressed, so other processes may drive angular momentum transport leading to accretion. Accretion could then be driven by large-scale magnetic fields via magnetic braking, and large-scale magnetic flux can build up onto the black hole and within the disk, leading to a magnetically-arrested disk (MAD). Such a MAD state is unstable to the magnetic Rayleigh-Taylor (RT) instability, which itself leads to vigorous turbulence and the emergence of low-density highly-magnetized bubbles. This instability was studied in a thin (ratio of half-height H to radius R, H/R ≈ 0.1) MAD simulation, where it has a more dramatic effect on the dynamics of the disk than for thicker disks. Large amounts of flux are pushed off the black hole into the disk, leading to temporary decreases in stress, then this flux is reprocessed as the stress increases again. Throughout this process, we find that the dominant component of the stress is due to turbulent magnetic fields, despite the suppression of the axisymmetric MRI and the dominant presence of large-scale magnetic fields. This suggests that the magnetic RT instability plays a significant role in driving angular momentum transport in MADs.

  10. Integrating High-Resolution Datasets to Target Mitigation Efforts for Improving Air Quality and Public Health in Urban Neighborhoods

    PubMed Central

    Shandas, Vivek; Voelkel, Jackson; Rao, Meenakshi; George, Linda

    2016-01-01

    Reducing exposure to degraded air quality is essential for building healthy cities. Although air quality and population vary at fine spatial scales, current regulatory and public health frameworks assess human exposures using county- or city-scales. We build on a spatial analysis technique, dasymetric mapping, for allocating urban populations that, together with emerging fine-scale measurements of air pollution, addresses three objectives: (1) evaluate the role of spatial scale in estimating exposure; (2) identify urban communities that are disproportionately burdened by poor air quality; and (3) estimate reduction in mobile sources of pollutants due to local tree-planting efforts using nitrogen dioxide. Our results show a maximum value of 197% difference between cadastrally-informed dasymetric system (CIDS) and standard estimations of population exposure to degraded air quality for small spatial extent analyses, and a lack of substantial difference for large spatial extent analyses. These results provide the foundation for improving policies for managing air quality, and targeting mitigation efforts to address challenges of environmental justice. PMID:27527205
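    The dasymetric step described above can be illustrated with a minimal sketch: population reported for a coarse zone is reallocated to its parcels in proportion to an ancillary cadastral weight such as residential floor area. The function name and the data below are invented for illustration, not taken from the study.

```python
def dasymetric_allocate(zone_population, parcel_weights):
    """Split a zone's population across parcels in proportion to weights."""
    total = sum(parcel_weights.values())
    if total == 0:
        raise ValueError("zone has no inhabitable area")
    return {pid: zone_population * w / total
            for pid, w in parcel_weights.items()}

# Example: 1000 people allocated over three parcels by residential floor area.
alloc = dasymetric_allocate(1000, {"p1": 500.0, "p2": 300.0, "p3": 200.0})
```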

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Sha; Evans, Meredydd; Shi, Qing

    China will account for about half of all new construction globally in the coming decade. Its floorspace doubled from 1996 to 2011, and Chinese rural buildings alone have as much floorspace as all U.S. residential buildings. Building energy consumption has also grown, increasing by over 40% since 1990. To curb building energy demand, the Chinese government has launched a series of policies and programs. Combined, this growth in buildings and renovations, along with the policies to promote green buildings, is creating a large market for energy efficiency products and services. This report assesses the impact of China’s policies on building energy efficiency and on the market for energy efficiency in the future. The first chapter of this report introduces the trends in China, drawing on both historical analysis and detailed modeling of the drivers behind changes in floorspace and building energy demand, such as economic and population growth, urbanization, and policy. The analysis describes the trends by region, building type, and energy service. The second chapter discusses China’s policies to promote green buildings. China began developing building energy codes in the 1980s. Over time, the central government has increased the stringency of the code requirements and the extent of enforcement. The codes are mandatory in all new buildings and major renovations in China’s cities, and they have been a driving force behind the expansion of China’s markets for insulation, efficient windows, and other green building materials. China also has several other important policies to encourage efficient buildings, including the Three-Star Rating System (somewhat akin to LEED), financial incentives tied to efficiency, appliance standards, a phase-out of incandescent bulbs and promotion of efficient lighting, and several policies to encourage retrofits in existing buildings. In the third chapter, we take “deep dives” into the trends affecting key building components. 
This chapter examines insulation in walls and roofs; efficient windows and doors; heating, air conditioning, and controls; and lighting. These markets have seen significant growth because of the strength of the construction sector, but also because of the specific policies that require and promote efficient building components. At the same time, as requirements have become more stringent, there has been fierce competition, and quality has at times suffered, which in turn has created additional challenges. Next we examine existing buildings in chapter four. China has many Soviet-style, inefficient buildings built before stringent requirements for efficiency were widely enforced. As a result, there are several specific market opportunities related to retrofits. These fall into three categories. First, China now has a code for retrofitting residential buildings in the north. Local governments have targets for the number of buildings they must retrofit each year, and they help finance the changes. The requirements focus on insulation, windows, and heat distribution. Second, the Chinese government recently decided to increase the scale of its retrofits of government and state-owned buildings. It hopes to achieve large-scale changes through energy service contracts, which creates an opportunity for energy service companies. Third, there is a small but growing trend to apply energy service contracts to large commercial and residential buildings. This report assesses the impacts of China’s policies on building energy efficiency. By examining the existing literature and interviewing stakeholders from the public, academic, and private sectors, the report seeks to offer in-depth insights into the opportunities and barriers for major market segments related to building energy efficiency. The report also discusses trends in building energy use, policies promoting building energy efficiency, and energy performance contracting for public building retrofits.

  12. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    NASA Astrophysics Data System (ADS)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

    Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be available at regional to national levels and cover long time periods. Because of the large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain in developing operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat imagery (~16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive, combined with other satellite, climate, and weather data, is creating previously unimagined opportunities for assessing ET model behavior and uncertainty, and ultimately for more robust operational monitoring and assessment of water use at field scales.
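    As a rough illustration of the SSEBop idea (a strongly simplified sketch under assumed inputs, not the operational USGS code), the ET fraction scales the land-surface temperature between a hot/dry bound (no ET) and a cold/wet bound (full ET), then multiplies a reference ET. All values below are invented.

```python
def ssebop_eta(ts, th, tc, eto):
    """Simplified SSEBop-style estimate: ET fraction times reference ET.

    ts: land-surface temperature (K); th/tc: hot-dry and cold-wet bounds (K);
    eto: reference ET (mm/day). Assumed form ETf = (th - ts) / (th - tc).
    """
    etf = (th - ts) / (th - tc)    # ET fraction from surface temperature
    etf = max(0.0, min(1.0, etf))  # clamp to the physical range [0, 1]
    return etf * eto

# A cool, well-watered pixel evaporates near the reference rate.
eta = ssebop_eta(ts=302.0, th=320.0, tc=300.0, eto=6.0)
```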

  13. Space Weather Research at the National Science Foundation

    NASA Astrophysics Data System (ADS)

    Moretto, T.

    2015-12-01

    There is growing recognition that the space environment can have substantial, deleterious impacts on society. Consequently, research enabling specification and forecasting of hazardous space effects has become of great importance and urgency. This research requires studying the entire Sun-Earth system to understand the coupling of regions all the way from the source of disturbances in the solar atmosphere to the Earth's upper atmosphere. The traditional, region-based structure of research programs in solar and space physics is ill-suited to fully support the change in research directions that the problem of space weather dictates. On the observational side, dense, distributed networks of observations are required to capture the full large-scale dynamics of the space environment. However, the cost of implementing these is typically prohibitive, especially for measurements in space. Thus, by necessity, the implementation of such new capabilities needs to build on creative and unconventional solutions. A particularly powerful idea is the utilization of new developments in data engineering and informatics research (big data). These new technologies make it possible to build systems that can collect and process huge amounts of noisy and inaccurate data and extract from them useful information. The shift in emphasis towards system-level science for geospace also necessitates the development of large-scale and multi-scale models. The development of large-scale models capable of capturing the global dynamics of the Earth's space environment requires investment in research team efforts that go beyond what can typically be funded under the traditional grants programs. This calls for effective interdisciplinary collaboration and efficient leveraging of resources both nationally and internationally. This presentation will provide an overview of current and planned initiatives, programs, and activities at the National Science Foundation pertaining to space weather research.

  14. A VLSI decomposition of the deBruijn graph

    NASA Technical Reports Server (NTRS)

    Collins, O.; Dolinar, S.; Mceliece, R.; Pollara, F.

    1990-01-01

    A new Viterbi decoder for convolutional codes with constraint lengths up to 15, called the Big Viterbi Decoder, is under development for the Deep Space Network. It will be demonstrated by decoding data from the Galileo spacecraft, which has a rate 1/4, constraint-length 15 convolutional encoder on board. Here, the mathematical theory underlying the design of the very-large-scale-integrated (VLSI) chips that are being used to build this decoder is explained. The deBruijn graph B sub n describes the topology of a fully parallel, rate 1/v, constraint length n+2 Viterbi decoder, and it is shown that B sub n can be built by appropriately wiring together (i.e., connecting together with extra edges) many isomorphic copies of a fixed graph called a B sub n building block. The efficiency of such a building block is defined as the fraction of the edges in B sub n that are present in the copies of the building block. It is shown, among other things, that for any alpha less than 1, there exists a graph G which is a B sub n building block of efficiency greater than alpha for all sufficiently large n. These results are illustrated by describing a special hierarchical family of deBruijn building blocks, which has led to the design of the gate-array chips being used in the Big Viterbi Decoder.
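    Under one common convention, the binary de Bruijn graph B sub n has the 2**n shift-register states as vertices, and each state has edges to the two states obtained by shifting in a 0 or a 1. A minimal sketch of that construction (illustrative only, not the chip-partitioning scheme of the paper):

```python
def debruijn_edges(n):
    """Directed edges of the binary de Bruijn graph B_n on 2**n vertices."""
    mask = 2 ** n - 1
    edges = []
    for v in range(2 ** n):
        for bit in (0, 1):
            w = ((v << 1) | bit) & mask  # shift register: drop MSB, append bit
            edges.append((v, w))
    return edges

edges = debruijn_edges(3)  # B_3: 8 vertices, 16 directed edges
```

Partitioning these edges among isomorphic copies of a fixed building-block graph, as the paper describes, amounts to covering as large a fraction of this edge list as possible.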

  15. Building Inventory Database on the Urban Scale Using GIS for Earthquake Risk Assessment

    NASA Astrophysics Data System (ADS)

    Kaplan, O.; Avdan, U.; Guney, Y.; Helvaci, C.

    2016-12-01

    The majority of existing buildings in most developing countries are not safe against earthquakes. Before a devastating earthquake, existing buildings need to be assessed and the vulnerable ones must be identified. Determining the seismic performance of existing buildings, which usually involves collecting the attributes of existing buildings, performing the analysis and the necessary queries, and producing the result maps, is a hard and complicated procedure that can be simplified with a Geographic Information System (GIS). The aim of this study is to produce a building inventory database using GIS for assessing the earthquake risk of existing buildings. In this paper, a building inventory database for 310 buildings located in Eskisehir, Turkey, was produced in order to assess the earthquake risk of the buildings. The results from this study show that 26% of the buildings have high earthquake risk, 33% have medium earthquake risk, and 41% have low earthquake risk. The produced building inventory database can be very useful, especially for governments, in determining seismically vulnerable buildings in large existing building stocks. With the help of such methods, identifying the buildings that may collapse and cause loss of life and property during a possible future earthquake will be quick, cheap, and reliable.

  16. A study of the viability of exploiting memory content similarity to improve resilience to memory errors

    DOE PAGES

    Levy, Scott; Ferreira, Kurt B.; Bridges, Patrick G.; ...

    2014-12-09

    Building the next generation of extreme-scale distributed systems will require overcoming several challenges related to system resilience. As the number of processors in these systems grows, the failure rate increases proportionally. One of the most common sources of failure in large-scale systems is memory. In this paper, we propose a novel runtime for transparently exploiting memory content similarity to improve system resilience by reducing the rate at which memory errors lead to node failure. We evaluate the viability of this approach by examining memory snapshots collected from eight high-performance computing (HPC) applications and two important HPC operating systems. Based on the characteristics of the similarity uncovered, we conclude that our proposed approach shows promise for addressing system resilience in large-scale systems.
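    The core idea can be sketched as follows (an illustration of content similarity detection, not the authors' runtime): hash fixed-size memory pages and group identical ones, since duplicated content offers redundancy that could be exploited to recover from an error. The page contents below are invented.

```python
import hashlib

def find_duplicate_pages(pages):
    """Map content hash -> indices of pages with byte-identical contents."""
    groups = {}
    for i, page in enumerate(pages):
        digest = hashlib.sha256(page).hexdigest()
        groups.setdefault(digest, []).append(i)
    # Keep only groups with actual duplication.
    return {h: idxs for h, idxs in groups.items() if len(idxs) > 1}

# Zero-filled pages are a common source of duplication in real snapshots.
pages = [b"\x00" * 4096, b"\x00" * 4096, b"heap data", b"\x00" * 4096]
dups = find_duplicate_pages(pages)
```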

  17. Engineering survey planning for the alignment of a particle accelerator: part II. Design of a reference network and measurement strategy

    NASA Astrophysics Data System (ADS)

    Junqueira Leão, Rodrigo; Raffaelo Baldo, Crhistian; Collucci da Costa Reis, Maria Luisa; Alves Trabanco, Jorge Luiz

    2018-03-01

    The building blocks of particle accelerators are magnets responsible for keeping beams of charged particles at a desired trajectory. Magnets are commonly grouped in support structures named girders, which are mounted on vertical and horizontal stages. The performance of this type of machine is highly dependent on the relative alignment between its main components. The length of particle accelerators ranges from small machines to large-scale national or international facilities, with typical lengths of hundreds of meters to a few kilometers. This relatively large volume together with micrometric positioning tolerances make the alignment activity a classical large-scale dimensional metrology problem. The alignment concept relies on networks of fixed monuments installed on the building structure to which all accelerator components are referred. In this work, the Sirius accelerator is taken as a case study, and an alignment network is optimized via computational methods in terms of geometry, densification, and surveying procedure. Laser trackers are employed to guide the installation and measure the girders’ positions, using the optimized network as a reference and applying the metric developed in part I of this paper. Simulations demonstrate the feasibility of aligning the 220 girders of the Sirius synchrotron to better than 0.080 mm, at a coverage probability of 95%.

  18. Building 3D structures of vanadium pentoxide nanosheets and application as electrodes in supercapacitors.

    PubMed

    Zhu, Jixin; Cao, Liujun; Wu, Yingsi; Gong, Yongji; Liu, Zheng; Hoster, Harry E; Zhang, Yunhuai; Zhang, Shengtao; Yang, Shubin; Yan, Qingyu; Ajayan, Pulickel M; Vajtai, Robert

    2013-01-01

    Various two-dimensional (2D) materials have recently attracted great attention owing to their unique properties and wide application potential in electronics, catalysis, energy storage, and conversion. However, large-scale production of ultrathin sheets and functional nanosheets remains a scientific and engineering challenge. Here we demonstrate an efficient approach for large-scale production of V2O5 nanosheets having a thickness of 4 nm and their utilization as building blocks for constructing 3D architectures via a freeze-drying process. The resulting highly flexible V2O5 structures possess a surface area of 133 m(2) g(-1), ultrathin walls, and multilevel pores. Such unique features are favorable for providing easy access of the electrolyte to the structure when they are used as a supercapacitor electrode, and they also provide a large electroactive surface that is advantageous in energy storage applications. As a consequence, a high specific capacitance of 451 F g(-1) is achieved in a neutral aqueous Na2SO4 electrolyte when the 3D architectures are utilized for energy storage. Remarkably, the capacitance retention after 4000 cycles is more than 90%, and the energy density is up to 107 W·h·kg(-1) at a high power density of 9.4 kW kg(-1).

  19. Time-sliced perturbation theory for large scale structure I: general formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blas, Diego; Garny, Mathias; Sibiryakov, Sergey

    2016-07-01

    We present a new analytic approach to describe large scale structure formation in the mildly non-linear regime. The central object of the method is the time-dependent probability distribution function generating correlators of the cosmological observables at a given moment of time. Expanding the distribution function around the Gaussian weight, we formulate a perturbative technique to calculate non-linear corrections to cosmological correlators, similar to the diagrammatic expansion in a three-dimensional Euclidean quantum field theory, with time playing the role of an external parameter. For the physically relevant case of cold dark matter in an Einstein-de Sitter universe, the time evolution of the distribution function can be found exactly and is encapsulated by a time-dependent coupling constant controlling the perturbative expansion. We show that all building blocks of the expansion are free from spurious infrared enhanced contributions that plague the standard cosmological perturbation theory. This paves the way towards the systematic resummation of infrared effects in large scale structure formation. We also argue that the approach proposed here provides a natural framework to account for the influence of short-scale dynamics on larger scales along the lines of effective field theory.

  20. Data for Room Fire Model Comparisons

    PubMed Central

    Peacock, Richard D.; Davis, Sanford; Babrauskas, Vytenis

    1991-01-01

    With the development of models to predict fire growth and spread in buildings, there has been a concomitant evolution in the measurement and analysis of experimental data in real-scale fires. This report presents the types of analyses that can be used to examine large-scale room fire test data to prepare the data for comparison with zone-based fire models. Five sets of experimental data which can be used to test the limits of a typical two-zone fire model are detailed. A standard set of nomenclature describing the geometry of the building and the quantities measured in each experiment is presented. Availability of ancillary data (such as smaller-scale test results) is included. These descriptions, along with the data (available in computer-readable form), should allow comparisons between the experiment and model predictions. The base of experimental data ranges in complexity from one-room tests with individual furniture items to a series of tests conducted in a multiple-story hotel equipped with a zoned smoke control system. PMID:28184121

  2. Use of Machine Learning Algorithms to Propose a New Methodology to Conduct, Critique and Validate Urban Scale Building Energy Modeling

    NASA Astrophysics Data System (ADS)

    Pathak, Maharshi

    City administrators and real-estate developers have been setting rather aggressive energy efficiency targets. This, in turn, has led building science research groups across the globe to focus on urban-scale building performance studies and the level of abstraction associated with such simulations. The increasing maturity of stakeholders towards energy efficiency and creating comfortable working environments has led researchers to develop methodologies and tools for addressing policy-driven interventions, whether for urban-level energy systems, buildings' operational optimization, or retrofit guidelines. Typically, these large-scale simulations are carried out by grouping buildings based on their design similarities, i.e. standardization of the buildings. Such an approach does not necessarily produce working inputs that make decision-making effective. To address this, a novel approach is proposed in the present study. The principal objective of this study is to propose, define, and evaluate a methodology that utilizes machine learning algorithms to define representative building archetypes for Stock-level Building Energy Modeling (SBEM) based on an operational parameter database. The study uses Phoenix-climate-based CBECS-2012 survey microdata for analysis and validation. Using the database, parameter correlations are studied to understand the relation between input parameters and energy performance. Contrary to precedent, the study establishes that energy performance is better explained by non-linear models. The non-linear behavior is captured by advanced learning algorithms. Based on these algorithms, the buildings under study are grouped into meaningful clusters. 
The cluster "medoid" (the actual building that best represents the cluster, analogous to a centroid restricted to cluster members) is established statistically to identify the level of abstraction that is acceptable for whole-building energy simulations and, subsequently, for retrofit decision-making. Further, the methodology is validated by conducting Monte-Carlo simulations on 13 key input simulation parameters. The sensitivity analysis of these 13 parameters is used to identify the optimum retrofits. From the sample analysis, the envelope parameters are found to be the most sensitive with respect to the EUI of the building, and thus retrofit packages should be directed at maximizing the energy-use reduction.
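    The medoid selection step can be sketched as follows: within a cluster, the medoid is the member minimizing total distance to all other members, so it can stand in for the cluster in whole-building simulation. The Euclidean metric and the toy feature vectors (floor area, EUI) are assumptions made for illustration.

```python
def medoid(cluster):
    """Return the cluster member minimizing total distance to the rest."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(cluster, key=lambda c: sum(dist(c, o) for o in cluster))

# Invented feature vectors: (floor area in m^2, EUI in kWh/m^2/yr).
buildings = [(1000.0, 55.0), (1100.0, 60.0), (5000.0, 90.0)]
rep = medoid(buildings)  # the representative archetype for this cluster
```

Unlike a centroid, the medoid is always a real building, so its full input deck can be simulated directly.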

  3. Achieving Actionable Results from Available Inputs: Metamodels Take Building Energy Simulations One Step Further

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horsey, Henry; Fleming, Katherine; Ball, Brian

    Modeling commercial building energy usage can be a difficult and time-consuming task. The increasing prevalence of optimization algorithms provides one path for reducing the time and difficulty. Many use cases remain, however, where information regarding whole-building energy usage is valuable, but the time and expertise required to run and post-process a large number of building energy simulations is intractable. A relatively underutilized option to accurately estimate building energy consumption in real time is to pre-compute large datasets of potential building energy models, and use the set of results to quickly and efficiently provide highly accurate data. This process is called metamodeling. In this paper, two case studies are presented demonstrating successful applications of metamodeling using the open-source OpenStudio Analysis Framework. The first case study involves the U.S. Department of Energy's Asset Score Tool, specifically the Preview Asset Score Tool, which is designed to give nontechnical users a near-instantaneous estimated range of expected results based on building system-level inputs. The second case study involves estimating the potential demand response (DR) capabilities of retail buildings in Colorado. The metamodel developed in this second application not only allows for estimation of a single building's expected performance, but also can be combined with public data to estimate the aggregate DR potential across various geographic (county and state) scales. In both case studies, the unique advantages of pre-computation allow building energy models to take the place of top-down actuarial evaluations. This paper ends by exploring the benefits of using metamodels and then examines the cost-effectiveness of this approach.
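    A minimal sketch of the metamodeling workflow: run an expensive model once over a parameter grid, store the results, and answer later queries instantly from the stored table. The toy "simulation", the parameter names, and the nearest-grid lookup are all invented for illustration; real metamodels typically fit a statistical surrogate to the pre-computed dataset.

```python
def expensive_simulation(insulation_r, window_u):
    """Stand-in for a slow whole-building simulation; returns a toy EUI."""
    return 100.0 - 3.0 * insulation_r + 20.0 * window_u

# One-time pre-computation over a parameter grid.
grid = [(r, u) for r in range(1, 11) for u in (0.5, 1.0, 1.5, 2.0)]
table = {p: expensive_simulation(*p) for p in grid}

def metamodel(insulation_r, window_u):
    """Answer a query from the pre-computed table (nearest grid point)."""
    key = min(table, key=lambda p: (p[0] - insulation_r) ** 2
                                   + (p[1] - window_u) ** 2)
    return table[key]

est = metamodel(4.2, 1.1)  # served from the table; no simulation is run
```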

  4. Downscaling modelling system for multi-scale air quality forecasting

    NASA Astrophysics Data System (ADS)

    Nuterman, R.; Baklanov, A.; Mahura, A.; Amstrup, B.; Weismann, J.

    2010-09-01

    Urban modelling for real meteorological situations generally considers only a small part of the urban area in a micro-meteorological model, while urban heterogeneities outside the modelling domain affect micro-scale processes. It is therefore important to build a chain of models of different scales, with higher-resolution models nested into larger-scale, lower-resolution models. Usually, the up-scaled city- or meso-scale models use parameterisations of urban effects or statistical descriptions of the urban morphology, whereas the micro-scale (street canyon) models are obstacle-resolved and consider a detailed geometry of the buildings and the urban canopy. The developed system consists of meso-, urban- and street-scale models. First, it is the Numerical Weather Prediction (HIgh Resolution Limited Area Model) model combined with an Atmospheric Chemistry Transport model (the Comprehensive Air quality Model with extensions). Several levels of urban parameterisation are considered, chosen depending on the selected scales and resolutions. For the regional scale, the urban parameterisation is based on the roughness and flux corrections approach; for the urban scale, on a building-effects parameterisation. Modern methods of computational fluid dynamics allow solving environmental problems connected with atmospheric transport of pollutants within the urban canopy in the presence of penetrable (vegetation) and impenetrable (building) obstacles. For local- and micro-scale nesting, the Micro-scale Model for Urban Environment is applied. This is a comprehensive obstacle-resolved urban wind-flow and dispersion model based on the Reynolds-averaged Navier-Stokes approach and several turbulence closures, i.e. a linear k–ε eddy-viscosity model, a non-linear k–ε eddy-viscosity model, and a Reynolds stress model. Boundary and initial conditions for the micro-scale model are taken from the up-scaled models with corresponding mass-conserving interpolation. 
At the boundaries, a Dirichlet-type condition is chosen to provide values interpolated from the coarse to the fine grid. When the roughness approach is replaced by the obstacle-resolved one in the nested model, the interpolation procedure increases the computational time (due to additional iterations) for meteorological/chemical fields inside the urban sub-layer. In such situations, the perturbation approach can be applied as a possible alternative. Here, the main meteorological variables and chemical species are considered as a sum of two components: background (large-scale) values, described by the coarse-resolution model, and perturbation (micro-scale) features, obtained from the nested fine-resolution model.
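    The background-plus-perturbation decomposition in the last paragraph can be sketched in one dimension: a coarse-model profile is interpolated onto the fine grid and the nested model contributes only the micro-scale deviation. All numbers below are invented.

```python
def linear_interp(coarse, factor):
    """Interpolate a coarse 1-D profile onto a grid `factor` times finer."""
    fine = []
    for i in range(len(coarse) - 1):
        for j in range(factor):
            t = j / factor
            fine.append((1 - t) * coarse[i] + t * coarse[i + 1])
    fine.append(coarse[-1])
    return fine

# Coarse-model temperature values (K) become the large-scale background...
background = linear_interp([280.0, 284.0], 4)
# ...and the nested model supplies the micro-scale perturbation on top.
perturbation = [0.0, 0.3, -0.2, 0.1, 0.0]
field = [b + p for b, p in zip(background, perturbation)]
```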

  5. Basic numerical competences in large-scale assessment data: Structure and long-term relevance.

    PubMed

    Hirsch, Stefa; Lambert, Katharina; Coppens, Karien; Moeller, Korbinian

    2018-03-01

    Basic numerical competences are seen as building blocks for later numerical and mathematical achievement. The current study aimed at investigating the structure of early numeracy reflected by different basic numerical competences in kindergarten and its predictive value for mathematical achievement 6 years later, using data from large-scale assessment. This allowed analyses based on considerably large sample sizes (N > 1700). A confirmatory factor analysis indicated that a model differentiating five basic numerical competences at the end of kindergarten fitted the data better than a one-factor model of early numeracy representing a comprehensive number sense. In addition, these basic numerical competences were observed to reliably predict performance in a curricular mathematics test in Grade 6, even after controlling for influences of general cognitive ability. Thus, our results indicated a differentiated view of early numeracy considering basic numerical competences in kindergarten reflected in large-scale assessment data. Consideration of different basic numerical competences allows for evaluating their specific predictive value not only for later mathematical achievement but also for mathematical learning difficulties. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Large-Scale Advanced Prop-Fan (LAP)

    NASA Technical Reports Server (NTRS)

    Degeorge, C. L.

    1988-01-01

    In recent years, considerable attention has been directed toward improving aircraft fuel efficiency. Analytical studies and research with wind tunnel models have demonstrated that the high inherent efficiency of low speed turboprop propulsion systems may now be extended to the Mach .8 flight regime of today's commercial airliners. This can be accomplished with a propeller employing a large number of thin, highly swept blades. The term Prop-Fan has been coined to describe such a propulsion system. In 1983 the NASA-Lewis Research Center contracted with Hamilton Standard to design, build, and test a near-full-scale Prop-Fan, designated the Large-Scale Advanced Prop-Fan (LAP). This report provides a detailed description of the LAP program. The assumptions and analytical procedures used in the design of Prop-Fan system components are discussed in detail. The manufacturing techniques used in the fabrication of the Prop-Fan are presented. Each of the tests run during the course of the program is also discussed, and the major conclusions derived from them are stated.

  7. Exact-Differential Large-Scale Traffic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanai, Masatoshi; Suzumura, Toyotaro; Theodoropoulos, Georgios

    2015-01-01

    Analyzing large-scale traffic by simulation requires repeated execution with various patterns of scenarios or parameters. Such repeated execution introduces substantial redundancy, because the change from one scenario to the next is very minor in most cases, for example blocking only one road or changing the speed limit of several roads. In this paper, we propose a new redundancy-reduction technique, called exact-differential simulation, which makes it possible to simulate only the changed scenarios in later executions while keeping exactly the same results as a whole simulation. The paper consists of two main efforts: (i) a key idea and algorithm for the exact-differential simulation, and (ii) a method to build large-scale traffic simulation on top of the exact-differential simulation. In experiments with a Tokyo traffic simulation, the exact-differential simulation improves elapsed time over whole simulation by 7.26 times on average, and by 2.26 times even in the worst case.
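    The redundancy being targeted can be illustrated with a toy memoization sketch (not the paper's algorithm): between scenarios, only roads whose parameters changed are recomputed, while unchanged roads are served from a cache. All names and numbers are invented.

```python
cache = {}

def road_travel_time(road_id, length_km, speed_kmh):
    """Memoized stand-in for an expensive per-road computation (minutes)."""
    key = (road_id, length_km, speed_kmh)
    if key not in cache:
        cache[key] = length_km / speed_kmh * 60.0
    return cache[key]

# Base scenario: two roads, computed from scratch.
base = {"A": (10.0, 50.0), "B": (5.0, 30.0)}
t1 = sum(road_travel_time(r, *p) for r, p in base.items())

# Changed scenario: only road B's speed limit differs, so only B is
# recomputed; road A is served from the cache.
changed = {"A": (10.0, 50.0), "B": (5.0, 60.0)}
t2 = sum(road_travel_time(r, *p) for r, p in changed.items())
```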

  8. Exploring network operations for data and information networks

    NASA Astrophysics Data System (ADS)

    Yao, Bing; Su, Jing; Ma, Fei; Wang, Xiaomin; Zhao, Xiyang; Yao, Ming

    2017-01-01

    Barabási and Albert, in 1999, formulated scale-free models based on several real networks: the World-Wide Web, the Internet, metabolic and protein networks, and language and sexual networks. Scale-free networks not only appear all around us but also exhibit high quality. As is known, high-quality information networks can transfer data feasibly and efficiently; clearly, their topological structures are very important for data safety. We build up network operations for constructing large-scale dynamic networks from smaller-scale network models having good properties and high quality. We focus on the simplest operators for formulating complex operations, and we are interested in how closely the resulting operations approach desired network properties.
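    The notion of operators that compose small network models into larger ones can be illustrated with two classical graph operations, disjoint union and join, on adjacency-set dictionaries. These particular operators and the code are illustrative assumptions, not the operations defined in the paper:

    ```python
    # Graphs are dicts mapping integer node -> set of neighbors.
    def disjoint_union(g1, g2):
        """Relabel g2's nodes past g1's largest label and take the union."""
        off = max(g1) + 1
        out = {u: set(vs) for u, vs in g1.items()}
        out.update({u + off: {v + off for v in vs} for u, vs in g2.items()})
        return out

    def join(g1, g2):
        """Disjoint union plus every cross edge (the graph join operator)."""
        off = max(g1) + 1
        g = disjoint_union(g1, g2)
        for u in g1:
            for v in g2:
                g[u].add(v + off)
                g[v + off].add(u)
        return g

    triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
    big = join(triangle, triangle)   # 6 nodes; 3 + 3 internal + 9 cross edges
    assert len(big) == 6
    assert sum(len(vs) for vs in big.values()) // 2 == 15
    ```

    Iterating such operators grows large networks whose properties (degrees, connectivity) follow mechanically from the small building blocks, which is the compositional idea the abstract describes.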

  9. Coordination and Control of Flexible Building Loads for Renewable Integration; Demonstrations using VOLTTRON

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hao, He; Liu, Guopeng; Huang, Sen

    Renewable energy resources such as wind and solar power have a high degree of uncertainty. Large-scale integration of these variable generation sources into the grid is a major challenge for power system operators. Buildings, in which we live and work, consume about 75% of the total electricity in the United States. They also have a large capacity for power flexibility due to their massive thermal capacitance. Therefore, they present a great opportunity to help the grid manage power balance. In this report, we study coordination and control of flexible building loads for renewable integration. We first present the motivation and background and conduct a literature review on building-to-grid integration. We also compile a catalog of flexible building loads that have great potential for renewable integration and discuss their characteristics. We next collect solar generation data from a photovoltaic panel on the Pacific Northwest National Laboratory campus and conduct data analysis to study its characteristics. We find that solar generation output has strong uncertainty, and that the uncertainty occurs at almost all time scales. Additional data from other sources are also used to verify our study. We propose two transactive coordination strategies to manage flexible building loads for renewable integration. We prove the theories that support the two transactive coordination strategies and discuss their pros and cons. In this report, we select three types of flexible building loads (air-handling units, rooftop units, and a population of water heaters) for which we demonstrate control of the flexible load to track a dispatch signal (e.g., renewable generation fluctuation) using experiment, simulation, or hardware-in-the-loop study. More specifically, we present the system description, model identification, controller design, test bed setup, and experimental results for each demonstration. We show that coordination and control of flexible loads has great potential to integrate variable generation sources. The flexible loads can successfully track a power dispatch signal from the coordinator while having little impact on the quality of service to end users.

  10. NREL Research Team Wins R&D 100 Award

    Science.gov Websites

    Researchers at the National Renewable Energy Laboratory (NREL) and First Solar have been selected to receive a 2003 R&D 100 award from R&D Magazine for developing a new process for depositing semiconductor layers onto photovoltaic (PV) modules: high-performance PV modules for large-scale solar power plants, commercial and residential buildings, and off-grid applications.

  11. Designing, Building, and Connecting Networks to Support Distributed Collaborative Empirical Writing Research

    ERIC Educational Resources Information Center

    Brunk-Chavez, Beth; Pigg, Stacey; Moore, Jessie; Rosinski, Paula; Grabill, Jeffrey T.

    2018-01-01

    To speak to diverse audiences about how people learn to write and how writing works inside and outside the academy, we must conduct research across geographical, institutional, and cultural contexts as well as research that enables comparison when appropriate. Large-scale empirical research is useful for both of these moves; however, we must…

  12. Multicultural Adolescents between Tradition and Postmodernity: Dialogical Self Theory and the Paradox of Localization and Globalization

    ERIC Educational Resources Information Center

    van Meijl, Toon

    2012-01-01

    This chapter builds on Dialogical Self Theory to investigate the identity development of adolescents growing up in multicultural societies. Their cultural identity is not only compounded by the rapid cultural changes associated with globalization, but also by the paradoxical revival of cultural traditions which the large-scale compression of…

  13. Designing Professional Learning for Effecting Change: Partnerships for Local and System Networks

    ERIC Educational Resources Information Center

    Wyatt-Smith, Claire; Bridges, Susan; Hedemann, Maree; Neville, Mary

    2008-01-01

    This paper presents (i) a purpose-built conceptual model for professional learning and (ii) a leadership framework designed to support a large-scale project involving diverse sites across the state of Queensland, Australia. The project had as its focus teacher-capacity building and ways to improve literacy and numeracy outcomes for students at…

  14. A Year of Progress in School-to-Career System Building. The Benchmark Communities Initiative.

    ERIC Educational Resources Information Center

    Martinez, Martha I.; And Others

    This document examines the first year of Jobs for the Future's Benchmark Communities Initiative (BCI), a 5-year effort to achieve the following: large-scale systemic restructuring of K-16 educational systems; involvement of significant numbers of employers in work and learning partnerships; and development of the infrastructure necessary to…

  15. WikiTextbooks: Designing Your Course around a Collaborative Writing Project

    ERIC Educational Resources Information Center

    Katz, Brian P.; Thoren, Elizabeth

    2014-01-01

    We have used wiki technology to support large-scale, collaborative writing projects in which the students build reference texts (called WikiTextbooks). The goal of this paper is to prepare readers to adapt this idea for their own courses. We give examples of the implementation of WikiTextbooks in a variety of courses, including lecture and…

  16. From Zero Energy Buildings to Zero Energy Districts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polly, Ben; Kutscher, Chuck; Macumber, Dan

    Some U.S. cities are planning advanced districts that have goals for zero energy, water, waste, and/or greenhouse gas emissions. From an energy perspective, zero energy districts present unique opportunities to cost-effectively achieve high levels of energy efficiency and renewable energy penetration across a collection of buildings that may be infeasible at the individual building scale. These high levels of performance are accomplished through district energy systems that harness renewable and wasted energy at large scales and flexible building loads that coordinate with variable renewable energy supply. Unfortunately, stakeholders face a lack of documented processes, tools, and best practices to assist them in achieving zero energy districts. The National Renewable Energy Laboratory (NREL) is partnering on two new district projects in Denver: the National Western Center and the Sun Valley Neighborhood. We are working closely with project stakeholders in their zero energy master planning efforts to develop the resources needed to resolve barriers and create replicable processes to support future zero energy district efforts across the United States. Initial results of these efforts include the identification and description of key zero energy district design principles (maximizing building efficiency, solar potential, renewable thermal energy, and load control), economic drivers, and master planning principles. The work has also resulted in NREL making initial enhancements to the U.S. Department of Energy's open source building energy modeling platform (OpenStudio and EnergyPlus) with the long-term goal of supporting the design and optimization of energy districts.

  17. Assessing the Performance of Large Scale Green Roofs and Their Impact on the Urban Microclimate

    NASA Astrophysics Data System (ADS)

    Smalls-Mantey, L.; Foti, R.; Montalto, F. A.

    2015-12-01

    In ultra-urban environments, green roofs offer a feasible solution for adding green infrastructure (GI) in neighborhoods where space is limited. Green roofs offer the typical advantages of urban GI, such as stormwater reduction and management, while providing direct benefits to the buildings on which they are installed through thermal protection and mitigation of temperature fluctuations. At 6.8 acres, the Jacob K. Javits Convention Center (JJCC) in New York City hosts the second largest green roof in the United States. Since its installation in August 2013, the Sustainable Water Resource (SWRE) Laboratory at Drexel University has monitored the climate on and around the green roof by means of four weather stations situated at various roof and ground locations. Using two years of fine-scale climatic data collected at the JJCC, this study explores the energy balance of a large-scale green roof system. Temperature, radiation, evapotranspiration, and wind profiles pre- and post-installation of the JJCC green roof were analyzed and compared across monitored locations, with the goal of identifying the impact of the green roof on the building and the urban micro-climate. Our findings indicate that the presence of the green roof not only altered the climatic conditions above the JJCC, but also had a measurable impact on the climatic profile of the areas immediately surrounding it. Furthermore, as a result of the mitigation of roof temperature fluctuations and of the cooling provided during warmer months, an improvement in the building's thermal efficiency was also observed. Such findings support the installation of GI as an effective practice in urban settings, important to the discussion of key issues including energy conservation measures, carbon emission reductions, and the mitigation of urban heat islands.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kung, Feitau; Frank, Stephen; Scheib, Jennifer

    A zero energy building (ZEB), also known as a net zero energy or zero net energy building, is a building that exports as much renewable energy as the total energy it imports from other sources on an annual basis (DOE 2015). Large-scale and commercially viable ZEBs are now in the marketplace, and they are expected to become a larger share of the commercial building footprint as government and private-sector policies continue to promote the development of buildings that produce more on-site energy than they use. However, the load profiles of ZEBs are currently perceived by electric utilities to be unfavorable and unpredictable. As shown in Figure ES-1, ZEB load profiles can have abrupt changes in magnitude, at times switching rapidly between exporting and importing electricity. This is a challenge for utilities, which are responsible for constantly balancing electricity supply and demand across the grid. Addressing these concerns will require new strategies and tools.
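    The annual-balance definition quoted above can be checked mechanically. The function and the hourly numbers below are made up for illustration; they are not from the report:

    ```python
    # A building meets the ZEB definition for a year if its renewable
    # exports meet or exceed its imports over that year (DOE 2015, as
    # summarized in the abstract).
    def is_zero_energy(imports_kwh, exports_kwh):
        """Annual net-energy balance test over per-interval readings."""
        return sum(exports_kwh) >= sum(imports_kwh)

    imports = [5.0, 8.0, 2.0, 0.0]   # e.g. night/winter grid purchases
    exports = [0.0, 1.0, 6.0, 9.0]   # e.g. midday PV surplus
    assert is_zero_energy(imports, exports)       # 16 kWh out vs 15 kWh in
    ```

    Note that the interval-by-interval profile (importing 8 kWh one hour, exporting 9 kWh another) is exactly the volatility the abstract says utilities find hard to manage, even when the annual sum nets to zero.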

  19. A biological decontamination process for small, privately owned buildings.

    PubMed

    Krauter, Paula; Tucker, Mark

    2011-09-01

    An urban wide-area recovery and restoration effort following a large-scale biological release will require extensive resources and tax the capabilities of government authorities. Further, the number of private decontamination contractors available may not be sufficient to respond to the needs. These resource limitations could create the need for decontamination by the building owner/occupant. This article provides owners/occupants with a simple method to decontaminate a building or area following a wide-area release of Bacillus anthracis using liquid sporicidal decontamination materials, such as pH-amended bleach or activated peroxide; simple application devices; and high-efficiency particulate air-filtered vacuums. Owner/occupant decontamination would be recommended only after those charged with overseeing decontamination (the Unified Command/Incident Command) identify buildings and areas appropriate for owner/occupant decontamination, based on modeling and environmental sampling, and conduct health and safety training for cleanup workers.

  20. Producing lasting amphiphobic building surfaces with self-cleaning properties

    NASA Astrophysics Data System (ADS)

    Facio, Dario S.; Carrascosa, Luis A. M.; Mosquera, María J.

    2017-06-01

    Nowadays, producing building surfaces that prevent water and oil uptake and present self-cleaning activity is still a challenge. In this study, amphiphobic (superhydrophobic and oleophobic) building surfaces were successfully produced. A simple and low-cost process was developed, which is applicable to large-scale building surfaces, according to the following procedure: (1) spraying a SiO2 nanocomposite, which produces a closely-packed, uniform nanoparticle topography; and (2) functionalizing the previous coating with a fluorinated alkoxysilane, producing high hydrophobicity and oleophobicity. The formation of a Cassie-Baxter regime, in which air pockets could be trapped between the aggregates of particles, was confirmed by topographic study. The building surface demonstrated an excellent self-cleaning performance. Finally, the surface presented lasting superhydrophobicity with high stability against successive attachment/detachment force cycles. This high durability can be explained by the effective grafting of the silica nanocomposite coating skeleton to the substrate, and by the additional fluorinated coating produced by condensation reactions.

  1. Large eddy simulation on buoyant gas diffusion near building

    NASA Astrophysics Data System (ADS)

    Tominaga, Yoshihide; Murakami, Shuzo; Mochida, Akashi

    1992-12-01

    Large eddy simulations of the turbulent diffusion of buoyant gases near a building model are carried out for three cases in which the densimetric Froude number (Frd) was specified as -8.6, zero, and 8.6, respectively. The accuracy of these simulations is examined by comparing the numerically predicted results with wind tunnel experiments. Two types of sub-grid scale models, the standard Smagorinsky model (type 1) and a modified Smagorinsky model (type 2), are compared. The former does not take account of the production of subgrid energy by buoyancy forces, while the latter incorporates this effect. The latter model (type 2) gives more accurate results than the standard Smagorinsky model (type 1) in terms of the distributions of k, ⟨C⟩ and ⟨C′²⟩.

  2. DebugIT for patient safety - improving the treatment with antibiotics through multimedia data mining of heterogeneous clinical data.

    PubMed

    Lovis, Christian; Colaert, Dirk; Stroetmann, Veli N

    2008-01-01

    The concepts and architecture underlying a large-scale integrating project funded within the 7th EU Framework Programme (FP7) are discussed. The main objective of the project is to build a tool that will have a significant impact on the monitoring and control of infectious diseases and antimicrobial resistance in Europe. This will be realized by building a technical and semantic infrastructure able to share heterogeneous clinical data sets from different hospitals in different countries, with different languages and legislations; to analyze large amounts of this clinical data with advanced multimedia data mining; and finally to apply the obtained knowledge to clinical decisions and outcome monitoring. There are numerous challenges in this project at all levels (technical, semantic, legal, and ethical) that will have to be addressed.

  3. Supervised Outlier Detection in Large-Scale Mvs Point Clouds for 3d City Modeling Applications

    NASA Astrophysics Data System (ADS)

    Stucker, C.; Richard, A.; Wegner, J. D.; Schindler, K.

    2018-05-01

    We propose to use a discriminative classifier for outlier detection in large-scale point clouds of cities generated via multi-view stereo (MVS) from densely acquired images. What makes outlier removal hard are the varying distributions of inliers and outliers across a scene. Heuristic outlier removal using a specific feature that encodes point distribution often delivers unsatisfying results: although most outliers can be identified correctly (high recall), many inliers are erroneously removed, too (low precision). This aggravates 3D object reconstruction due to missing data. We thus propose to discriminatively learn class-specific distributions directly from the data to achieve high precision. We apply a standard Random Forest classifier that infers a binary label (inlier or outlier) for each 3D point in the raw, unfiltered point cloud and test two approaches for training. In the first, non-semantic approach, features are extracted without considering the semantic interpretation of the 3D points. The trained model approximates the average distribution of inliers and outliers across all semantic classes. Second, semantic interpretation is incorporated into the learning process, i.e., we train separate inlier/outlier classifiers per semantic class (building facades, roofs, ground, vegetation, fields, and water). The performance of learned filtering is evaluated on several large SfM point clouds of cities. We find that the results confirm our underlying assumption that discriminatively learning inlier-outlier distributions improves precision over global heuristics by up to approximately 12 percentage points. Moreover, semantically informed filtering that models class-specific distributions further improves precision by up to approximately 10 percentage points, removing very isolated building, roof, and water points while preserving inliers on building facades and vegetation.
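    The precision argument (class-specific models beat one global rule) can be illustrated with a deliberately simplified stand-in for the paper's Random Forest: a per-class threshold on a single density-like feature. The feature, the data, and the threshold rule are all invented for illustration:

    ```python
    from statistics import mean

    # Per-class model: a threshold halfway between the mean inlier and
    # mean outlier score, learned from labeled training points.
    def fit_threshold(scores, labels):
        inl = mean(s for s, l in zip(scores, labels) if l == "inlier")
        out = mean(s for s, l in zip(scores, labels) if l == "outlier")
        return (inl + out) / 2

    def classify(score, thr):
        return "inlier" if score >= thr else "outlier"

    # Facades are sparsely sampled, so their inliers score low on density.
    train = {
        "facade": ([0.2, 0.3, 0.05, 0.02], ["inlier", "inlier", "outlier", "outlier"]),
        "roof":   ([0.9, 0.8, 0.3, 0.2],   ["inlier", "inlier", "outlier", "outlier"]),
    }
    per_class = {c: fit_threshold(*d) for c, d in train.items()}
    assert classify(0.25, per_class["facade"]) == "inlier"   # kept

    # One global threshold, dominated by the dense roof class,
    # wrongly discards the same facade point (the low-precision failure).
    global_thr = fit_threshold(
        [s for sc, _ in train.values() for s in sc],
        [l for _, lb in train.values() for l in lb],
    )
    assert classify(0.25, global_thr) == "outlier"
    ```

    The paper's actual classifier learns far richer class-specific feature distributions, but the mechanism by which per-class training recovers facade inliers is the same.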

  4. Attribution of Large-Scale Climate Patterns to Seasonal Peak-Flow and Prospects for Prediction Globally

    NASA Astrophysics Data System (ADS)

    Lee, Donghoon; Ward, Philip; Block, Paul

    2018-02-01

    Flood-related fatalities and impacts on society surpass those from all other natural disasters globally. While the inclusion of large-scale climate drivers in streamflow (or high-flow) prediction has been widely studied, an explicit link to global-scale, long-lead prediction is lacking; establishing it can improve understanding of potential flood propensity. Here we attribute seasonal peak-flow to large-scale climate patterns, including the El Niño Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO), North Atlantic Oscillation (NAO), and Atlantic Multidecadal Oscillation (AMO), using streamflow station observations and simulations from PCR-GLOBWB, a global-scale hydrologic model. Statistically significantly correlated climate patterns and streamflow autocorrelation are subsequently applied as predictors to build a global-scale, season-ahead prediction model, with prediction performance evaluated by the mean squared error skill score (MSESS) and the categorical Gerrity skill score (GSS). Globally, fair-to-good prediction skill (20% ≤ MSESS and 0.2 ≤ GSS) is evident for a number of locations (28% of stations and 29% of land area), most notably in data-poor regions (e.g., West and Central Africa). The persistence of such relevant climate patterns can improve understanding of the propensity for floods at the seasonal scale. The prediction approach developed here lays the groundwork for further improving local-scale seasonal peak-flow prediction by identifying relevant global-scale climate patterns. This is especially attractive for regions with limited observations and/or little capacity to develop flood early warning systems.
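    The MSESS threshold quoted above measures skill relative to a climatology (long-term mean) reference forecast. The formula below is the standard definition of that skill score; the observation and forecast numbers are made up for illustration:

    ```python
    # MSESS = 1 - MSE(forecast) / MSE(climatology), where the climatology
    # forecast is simply the mean of the observations. MSESS > 0 means the
    # model beats climatology; 1.0 is a perfect forecast.
    def msess(obs, pred):
        clim = sum(obs) / len(obs)
        mse_pred = sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)
        mse_clim = sum((o - clim) ** 2 for o in obs) / len(obs)
        return 1.0 - mse_pred / mse_clim

    obs  = [10.0, 20.0, 30.0, 40.0]   # hypothetical seasonal peak flows
    pred = [12.0, 18.0, 33.0, 38.0]   # hypothetical season-ahead forecasts
    skill = msess(obs, pred)
    assert skill > 0.2                # "fair-to-good" by the 20% threshold
    ```

    The categorical Gerrity skill score used alongside MSESS in the paper works on tercile categories rather than raw values and is not sketched here.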

  5. Large Prefabricated Concrete Panels Collective Dwellings from the 1970s: Context and Improvements

    NASA Astrophysics Data System (ADS)

    Muntean, Daniel M.; Ungureanu, Viorel; Petran, Ioan; Georgescu, Mircea

    2017-10-01

    The period between the 1960s and 1970s had a significant impact in Romania on the urban development of major cities. Because of the vast expansion of industry, the urban population increased massively, owing to the large number of workers coming from rural areas. This intense process led to a shortage of homes on the housing market. In order to rapidly build new homes, standard residential project types were erected using large prefabricated concrete panels. By using repetitive patterns, such buildings were built in a short amount of time throughout the entire country. Nowadays, these buildings represent 1.8% of the built environment and accommodate more than half of a city's population. Even though these units have reached only half their intended life span, they fail to satisfy present living standards and consume huge amounts of energy for heating, cooling, ventilation, and lighting. Because these buildings are based on standardised projects and were built on such a large scale, a system that brings them up to current standards will not only benefit the buildings but also significantly improve the quality of life within them. With the transition of existing power grids to a "smart grid", such units can become micro power plants in future electricity networks, thus contributing to micro-generation and energy storage. Considering the EU 20-20-20 commitments, finding alternative and innovative strategies for further improving these buildings through locally adapted measures can be seen as one of the most pressing issues of today. This research offers a possible retrofitting scenario for these buildings towards a sustainable future. The building envelope is upgraded using a modular insulation system with integrated solar cells. Renewable energy systems for cooling and ventilation are integrated in order to provide flexibility of the indoor climate. Due to their small floor area, the space within the apartments is redesigned for a more efficient use of space and improved natural lighting. Active core modules are placed on top of the unused attics and a solar panel array is introduced. Furthermore, accessibility issues are addressed by facilitating access for disabled people and implementing an elevator system that these buildings currently lack.

  6. Large-scale road safety programmes in low- and middle-income countries: an opportunity to generate evidence.

    PubMed

    Hyder, Adnan A; Allen, Katharine A; Peters, David H; Chandran, Aruna; Bishai, David

    2013-01-01

    The growing burden of road traffic injuries, which kill over 1.2 million people yearly, falls mostly on low- and middle-income countries (LMICs). Despite this, evidence generation on the effectiveness of road safety interventions in LMIC settings remains scarce. This paper explores a scientific approach for evaluating road safety programmes in LMICs and introduces one such multi-country road safety initiative, the Road Safety in 10 Countries Project (RS-10). By building on existing evaluation frameworks, we develop a scientific approach for evaluating large-scale road safety programmes in LMIC settings. This approach also draws on '13 lessons' of large-scale programme evaluation: defining the evaluation scope; selecting study sites; maintaining objectivity; developing an impact model; utilising multiple data sources; using multiple analytic techniques; maximising external validity; ensuring an appropriate time frame; recognizing the importance of flexibility and a stepwise approach; continuous monitoring; providing feedback to implementers and policy-makers; promoting the uptake of evaluation results; and understanding evaluation costs. The use of relatively new approaches for the evaluation of real-world programmes allows for the production of relevant knowledge. The RS-10 project affords an important opportunity to scientifically test these approaches on a real-world, large-scale road safety evaluation and generate new knowledge for the field of road safety.

  7. Exploring Entrainment Patterns of Human Emotion in Social Media

    PubMed Central

    Luo, Chuan; Zhang, Zhu

    2016-01-01

    Emotion entrainment, which is generally defined as the synchronous convergence of human emotions, performs many important social functions. However, the specific mechanisms of emotion entrainment beyond in-person interactions, and how human emotions evolve under different entrainment patterns in large-scale social communities, remain unknown. In this paper, we aim to examine massive emotion entrainment patterns and understand the underlying mechanisms in the context of social media. As modeling emotion dynamics on a large scale is often challenging, we elaborate a pragmatic framework to characterize and quantify the entrainment phenomenon. By applying this framework to datasets from two large-scale social media platforms, we find that the emotions of online users entrain through social networks. We further uncover that online users often form their relations via dual entrainment, while maintaining them through single entrainment. Remarkably, the emotions of online users are more convergent in nonreciprocal entrainment. Building on these findings, we develop an entrainment-augmented model for emotion prediction. Experimental results suggest that entrainment patterns inform emotion proximity in dyads, and encoding their associations promotes emotion prediction. This work can further help us to understand the underlying dynamic process of large-scale online interactions and make more reasonable decisions regarding emergency situations, epidemic diseases, and political campaigns in cyberspace. PMID:26953692

  8. Exploring Entrainment Patterns of Human Emotion in Social Media.

    PubMed

    He, Saike; Zheng, Xiaolong; Zeng, Daniel; Luo, Chuan; Zhang, Zhu

    2016-01-01

    Emotion entrainment, which is generally defined as the synchronous convergence of human emotions, performs many important social functions. However, the specific mechanisms of emotion entrainment beyond in-person interactions, and how human emotions evolve under different entrainment patterns in large-scale social communities, remain unknown. In this paper, we aim to examine massive emotion entrainment patterns and understand the underlying mechanisms in the context of social media. As modeling emotion dynamics on a large scale is often challenging, we elaborate a pragmatic framework to characterize and quantify the entrainment phenomenon. By applying this framework to datasets from two large-scale social media platforms, we find that the emotions of online users entrain through social networks. We further uncover that online users often form their relations via dual entrainment, while maintaining them through single entrainment. Remarkably, the emotions of online users are more convergent in nonreciprocal entrainment. Building on these findings, we develop an entrainment-augmented model for emotion prediction. Experimental results suggest that entrainment patterns inform emotion proximity in dyads, and encoding their associations promotes emotion prediction. This work can further help us to understand the underlying dynamic process of large-scale online interactions and make more reasonable decisions regarding emergency situations, epidemic diseases, and political campaigns in cyberspace.

  9. Seismic Behavior and Retrofit of Concrete Columns of Old R.C. Buildings Reinforced With Plain Bars

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marefat, M. S.; Arani, K. Karbasi; Shirazi, S. M. Hassanzadeh

    2008-07-08

    Seismic rehabilitation of old buildings has been a major challenge in recent years. The first step in seismic rehabilitation is evaluation of the existing capacity and seismic behaviour. To investigate the seismic behaviour of the RC members of a real old building in Iran, designed and constructed by European engineers in 1940, three half-scale column specimens reinforced with plain bars were tested. The tests indicate significant differences between the responses of specimens reinforced with plain bars and those reinforced with deformed bars. A regular pattern of cracking and relatively brittle behaviour were observed, while a relatively large residual strength appeared after a sudden drop in initial strength and stiffness due to slip of the longitudinal bars.

  10. Ultrasonic humidification for telecommunications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Longo, F.

    1994-03-01

    This article examines two installations which demonstrate that ultrasonic humidification is an excellent option for large-scale commercial installations. Many existing telephone switching centers constructed 20 to 30 years ago were equipped with electro-mechanical switching equipment that was not sensitive to humidity. Today's sophisticated solid-state telecommunications equipment requires specific levels of relative humidity to operate properly. Over the last several years, Einhorn Yaffee Prescott (formerly Rose Beaton + Rose) designed two of the largest ultrasonic humidification systems at telecommunications buildings located in Cheshire, Conn., and White Plains, N.Y. The Cheshire project was a retrofit to the existing system in a 1960s building; the White Plains project involved an upgrade to a totally new air handling system, including an ultrasonic humidification component, in a 1950s building.

  11. Browndye: A Software Package for Brownian Dynamics

    PubMed Central

    McCammon, J. Andrew

    2010-01-01

    A new software package, Browndye, is presented for simulating the diffusional encounter of two large biological molecules. It can be used to estimate second-order rate constants and encounter probabilities, and to explore reaction trajectories. Browndye builds upon previous knowledge and algorithms from software packages such as UHBD, SDA, and Macrodox, while implementing algorithms that scale to larger systems. PMID:21132109

  12. Applications of the Theory of Distributed and Real Time Systems to the Development of Large-Scale Timing Based Systems.

    DTIC Science & Technology

    1996-04-01

    Members of MIT's Theory of Distributed Systems group have continued their work on modelling, designing, verifying and analyzing distributed and real-time systems. The focus is on the study of 'building-blocks' for the construction of reliable and efficient systems. Our work falls into three…

  13. e-Tutor: A Multilingual Open Educational Resource for Faculty Development to Teach Online

    ERIC Educational Resources Information Center

    Rapp, Christian; Gülbahar, Yasemin; Adnan, Muge

    2016-01-01

    The situation in Ukraine poses severe problems for the higher education system and for students in Eastern Ukraine. Many students and academicians have been compelled to leave their university buildings and move westwards. Hence, they are forced to replace face-to-face teaching with distance learning, often on a large scale, but within a short…

  14. Building an Effective and Affordable K-12 Geoscience Outreach Program from the Ground Up: A Simple Model for Universities

    ERIC Educational Resources Information Center

    Dahl, Robyn Mieko; Droser, Mary L.

    2016-01-01

    University earth science departments seeking to establish meaningful geoscience outreach programs often pursue large-scale, grant-funded programs. Although this type of outreach is highly successful, it is also extremely costly, and grant funding can be difficult to secure. Here, we present the Geoscience Education Outreach Program (GEOP), a…

  15. The Cell Collective: Toward an open and collaborative approach to systems biology

    PubMed Central

    2012-01-01

    Background Despite decades of new discoveries in biomedical research, the overwhelming complexity of cells has been a significant barrier to a fundamental understanding of how cells work as a whole. As such, the holistic study of biochemical pathways requires computer modeling. Due to the complexity of cells, it is not feasible for one person or group to model the cell in its entirety. Results The Cell Collective is a platform that allows the world-wide scientific community to create these models collectively. Its interface enables users to build and use models without specifying any mathematical equations or computer code - addressing one of the major hurdles with computational research. In addition, this platform allows scientists to simulate and analyze the models in real-time on the web, including the ability to simulate loss/gain of function and test what-if scenarios in real time. Conclusions The Cell Collective is a web-based platform that enables laboratory scientists from across the globe to collaboratively build large-scale models of various biological processes, and simulate/analyze them in real time. In this manuscript, we show examples of its application to a large-scale model of signal transduction. PMID:22871178

  16. Adaptive Texture Synthesis for Large Scale City Modeling

    NASA Astrophysics Data System (ADS)

    Despine, G.; Colleu, T.

    2015-02-01

    Large-scale city models textured with aerial images are well suited for bird's-eye navigation, but generally the image resolution does not allow pedestrian navigation. One solution to this problem is to use high-resolution terrestrial photos, but this requires a huge amount of manual work to remove occlusions. Another solution is to synthesize generic textures with a set of procedural rules and elementary patterns like bricks, roof tiles, doors and windows. This solution may give realistic textures, but with no correlation to the ground truth. Instead of using pure procedural modelling, we present a method to extract information from aerial images and adapt the texture synthesis to each building. We describe a workflow allowing the user to drive the information extraction and to select the appropriate texture patterns. We also emphasize the importance of organizing knowledge about elementary patterns in a texture catalogue, which allows physical information and semantic attributes to be attached and selection requests to be executed. Roofs are processed according to the detected building material. Façades are first described in terms of principal colours, then opening positions are detected and some window features are computed. These features allow selecting the most appropriate patterns from the texture catalogue. We tested this workflow on two samples with 20 cm and 5 cm resolution images. The roof texture synthesis and opening detection were successfully conducted on hundreds of buildings. The window characterization is still sensitive to the distortions inherent to the projection of aerial images onto the façades.

  17. Decision tree analysis of factors influencing rainfall-related building damage

    NASA Astrophysics Data System (ADS)

    Spekkers, M. H.; Kok, M.; Clemens, F. H. L. R.; ten Veldhuis, J. A. E.

    2014-04-01

    Flood damage prediction models are essential building blocks in flood risk assessments. Little research has been dedicated so far to damage of small-scale urban floods caused by heavy rainfall, while there is a need for reliable damage models for this flood type among insurers and water authorities. The aim of this paper is to investigate a wide range of damage-influencing factors and their relationships with rainfall-related damage, using decision tree analysis. For this, district-aggregated claim data from private property insurance companies in the Netherlands were analysed, for the period of 1998-2011. The databases include claims of water-related damage, for example, damages related to rainwater intrusion through roofs and pluvial flood water entering buildings at the ground floor. Response variables being modelled are average claim size and claim frequency, per district per day. The set of predictors includes rainfall-related variables derived from weather radar images, topographic variables from a digital terrain model, building-related variables and socioeconomic indicators of households. Analyses were made separately for property and content damage claim data. Results of decision tree analysis show that claim frequency is most strongly associated with maximum hourly rainfall intensity, followed by real estate value, ground floor area, household income, season (property data only), building age (property data only), ownership structure (content data only) and fraction of low-rise buildings (content data only). It was not possible to develop statistically acceptable trees for average claim size, which suggests that variability in average claim size is related to explanatory variables that cannot be defined at the district scale. Cross-validation results show that decision trees were able to predict 22-26% of variance in claim frequency, which is considerably better compared to results from global multiple regression models (11-18% of variance explained). 
Still, a large part of the variance in claim frequency is left unexplained, which is likely to be caused by variations in data at subdistrict scale and missing explanatory variables.
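    As a rough illustration of the tree-based analysis described above, the sketch below fits a regression tree to synthetic district-day data and scores it by cross-validation. All variable definitions, distributions, and coefficients are invented for illustration; they are not the paper's insurance claim databases.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n = 500  # synthetic district-day records
    X = np.column_stack([
        rng.gamma(2.0, 5.0, n),      # max hourly rainfall intensity (mm/h), assumed
        rng.normal(250.0, 50.0, n),  # real estate value (arbitrary units), assumed
        rng.normal(90.0, 20.0, n),   # ground floor area (m^2), assumed
    ])
    # Synthetic response: claim frequency driven mainly by rainfall intensity
    y = 0.02 * X[:, 0] + 0.001 * X[:, 2] + rng.normal(0.0, 0.1, n)

    # A shallow tree with a minimum leaf size guards against overfitting,
    # mirroring the pruning typically applied in decision tree analysis
    tree = DecisionTreeRegressor(max_depth=4, min_samples_leaf=20, random_state=0)
    scores = cross_val_score(tree, X, y, cv=5, scoring="r2")
    ```

    On this synthetic data the cross-validated R² is dominated by the rainfall term, echoing the paper's finding that maximum hourly rainfall intensity is the strongest predictor of claim frequency.
    
    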

  18. Gray-Box Approach for Thermal Modelling of Buildings for Applications in District Heating and Cooling Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saurav, Kumar; Chandan, Vikas

    District-heating-and-cooling (DHC) systems are a proven energy solution that has been deployed for many years in a growing number of urban areas worldwide. They comprise a variety of technologies that seek to develop synergies between the production and supply of heat, cooling, domestic hot water and electricity. Although the benefits of DHC systems are significant and have been widely acclaimed, the full potential of modern DHC systems remains largely untapped. There are several opportunities for the development of energy-efficient DHC systems, which will enable the effective exploitation of alternative renewable resources, waste heat recovery, etc., in order to increase the overall efficiency and facilitate the transition towards the next generation of DHC systems. This motivated the need for modelling these complex systems. Large-scale modelling of DHC networks is challenging, as they have several components such as buildings, pipes, valves, and heating sources interacting with each other. In this paper, we focus on building modelling. In particular, we present a gray-box methodology for thermal modelling of buildings. Gray-box modelling is a hybrid of data-driven and physics-based models in which the coefficients of the equations from physics-based models are learned from data. This approach allows us to capture the dynamics of the buildings more effectively than a pure data-driven approach. Additionally, it results in simpler models than pure physics-based models. We first develop the individual components of the building, such as temperature evolution and the flow controller. These individual models are then integrated into the complete gray-box model of the building. The model is validated using data collected from one of the buildings at Luleå, a city on the coast of northern Sweden.
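    A minimal sketch of the gray-box idea, assuming a single-zone 1R1C thermal model (one resistance R, one capacitance C — a deliberate simplification; the paper's building components are richer): simulate the physics equation dT = (dt/C)·((T_out − T)/R + Q), then recover its coefficients from noisy data by least squares. All numbers are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    dt = 900.0                       # 15-minute timestep (s)
    R_true, C_true = 0.005, 2.0e6    # assumed thermal resistance (K/W) and capacitance (J/K)
    steps = 400
    T_out = 5.0 + 3.0 * np.sin(np.linspace(0, 6 * np.pi, steps))  # outdoor temperature (°C)
    Q = 4000.0 * (rng.random(steps) > 0.5)                        # heating power on/off (W)

    # Forward-simulate the physics model to generate "measured" indoor temperature
    T = np.empty(steps)
    T[0] = 20.0
    for k in range(steps - 1):
        T[k + 1] = T[k] + dt / C_true * ((T_out[k] - T[k]) / R_true + Q[k])
    T_meas = T + rng.normal(0.0, 0.02, steps)  # sensor noise

    # Gray-box step: regress the temperature increment on the two
    # physics-derived features, then back out R and C
    dT = np.diff(T_meas)
    A = np.column_stack([T_out[:-1] - T_meas[:-1], Q[:-1]])
    a, b = np.linalg.lstsq(A, dT, rcond=None)[0]   # a = dt/(R*C), b = dt/C
    C_hat = dt / b
    R_hat = dt / (a * C_hat)
    ```

    The learned coefficients keep their physical meaning (a time constant R·C, a heat capacity C), which is exactly what distinguishes a gray-box model from a black-box fit.
    
    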

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawrence, Thomas M.; Boudreau, Marie-Claude; Helsen, Lieve

    Recent advances in information and communications technology (ICT) have initiated development of a smart electrical grid and smart buildings. Buildings consume a large portion of the total electricity production worldwide, and to fully develop a smart grid they must be integrated with that grid. Buildings can now be 'prosumers' on the grid (both producers and consumers), and the continued growth of distributed renewable energy generation is raising new challenges in terms of grid stability over various time scales. Buildings can contribute to grid stability by managing their overall electrical demand in response to current conditions. Facility managers must balance demand response requests by grid operators with energy needed to maintain smooth building operations. For example, maintaining thermal comfort within an occupied building requires energy and, thus an optimized solution balancing energy use with indoor environmental quality (adequate thermal comfort, lighting, etc.) is needed. Successful integration of buildings and their systems with the grid also requires interoperable data exchange. However, the adoption and integration of newer control and communication technologies into buildings can be problematic with older legacy HVAC and building control systems. Public policy and economic structures have not kept up with the technical developments that have given rise to the budding smart grid, and further developments are needed in both technical and non-technical areas.

  20. Evaluation of Penalized and Nonpenalized Methods for Disease Prediction with Large-Scale Genetic Data.

    PubMed

    Won, Sungho; Choi, Hosik; Park, Suyeon; Lee, Juyoung; Park, Changyi; Kwon, Sunghoon

    2015-01-01

    Owing to recent improvements in genotyping technology, large-scale genetic data can be utilized to identify disease susceptibility loci, and this success has substantially improved our understanding of complex diseases. However, in spite of these successes, most of the genetic effects for many complex diseases were found to be very small, which has been a big hurdle to building disease prediction models. Recently, many statistical methods based on penalized regressions have been proposed to tackle the so-called "large P and small N" problem. Penalized regressions, including the least absolute shrinkage and selection operator (LASSO) and ridge regression, limit the parameter space, and this constraint enables the estimation of effects for a very large number of SNPs. Various extensions have been suggested, and, in this report, we compare their accuracy by applying them to several complex diseases. Our results show that penalized regressions are usually robust and provide better accuracy than the existing methods, at least for the diseases under consideration.
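    A hedged sketch of the "large P, small N" setting using scikit-learn's LASSO and ridge implementations on synthetic genotype data; the genotype model (binomial 0/1/2 dosages), sparsity pattern, and penalty strengths below are all illustrative assumptions, not the paper's cohorts or tuning.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso, Ridge
    from sklearn.metrics import r2_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n, p = 200, 1000                 # far more SNPs than samples: "large P, small N"
    # Genotype dosages 0/1/2 with minor allele frequency 0.3 (assumed)
    X = rng.binomial(2, 0.3, size=(n, p)).astype(float)
    beta = np.zeros(p)
    beta[:10] = 0.5                  # only 10 causal variants, small effects
    y = X @ beta + rng.normal(0.0, 1.0, n)

    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
    # L1 penalty yields a sparse model; L2 shrinks all effects without selection
    lasso = Lasso(alpha=0.05, max_iter=10000).fit(Xtr, ytr)
    ridge = Ridge(alpha=10.0).fit(Xtr, ytr)
    lasso_r2 = r2_score(yte, lasso.predict(Xte))
    ridge_r2 = r2_score(yte, ridge.predict(Xte))
    n_selected = int(np.sum(lasso.coef_ != 0))
    ```

    With a sparse true signal, the L1 penalty both constrains the parameter space enough to make estimation feasible with p >> n and recovers a short list of candidate variants; the ridge fit remains dense.
    
    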

  1. Falcon: Visual analysis of large, irregularly sampled, and multivariate time series data in additive manufacturing

    DOE PAGES

    Steed, Chad A.; Halsey, William; Dehoff, Ryan; ...

    2017-02-16

    Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.

  2. Falcon: Visual analysis of large, irregularly sampled, and multivariate time series data in additive manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A.; Halsey, William; Dehoff, Ryan

    Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.

  3. The use of impact force as a scale parameter for the impact response of composite laminates

    NASA Technical Reports Server (NTRS)

    Jackson, Wade C.; Poe, C. C., Jr.

    1992-01-01

    The building block approach is currently used to design composite structures. With this approach, the data from coupon tests are scaled up to determine the design of a structure. Current standard impact tests and methods of relating test data to other structures are not generally understood and are often used improperly. A methodology is outlined for using impact force as a scale parameter for delamination damage for impacts of simple plates. Dynamic analyses were used to define ranges of plate parameters and impact parameters where quasi-static analyses are valid. These ranges include most low velocity impacts where the mass of the impacter is large and the size of the specimen is small. For large mass impacts of moderately thick (0.35 to 0.70 cm) laminates, the maximum extent of delamination damage increased with increasing impact force and decreasing specimen thickness. For large mass impact tests at a given kinetic energy, impact force and hence delamination size depends on specimen size, specimen thickness, boundary conditions, and indenter size and shape. If damage is reported in terms of impact force instead of kinetic energy, large mass test results can be applied directly to other plates of the same size.

  4. The use of impact force as a scale parameter for the impact response of composite laminates

    NASA Technical Reports Server (NTRS)

    Jackson, Wade C.; Poe, C. C., Jr.

    1992-01-01

    The building block approach is currently used to design composite structures. With this approach, the data from coupon tests are scaled up to determine the design of a structure. Current standard impact tests and methods of relating test data to other structures are not generally understood and are often used improperly. A methodology is outlined for using impact force as a scale parameter for delamination damage for impacts of simple plates. Dynamic analyses were used to define ranges of plate parameters and impact parameters where quasi-static analyses are valid. These ranges include most low-velocity impacts where the mass of the impacter is large, and the size of the specimen is small. For large-mass impacts of moderately thick (0.35-0.70 cm) laminates, the maximum extent of delamination damage increased with increasing impact force and decreasing specimen thickness. For large-mass impact tests at a given kinetic energy, impact force and hence delamination size depends on specimen size, specimen thickness, boundary conditions, and indenter size and shape. If damage is reported in terms of impact force instead of kinetic energy, large-mass test results can be applied directly to other plates of the same thickness.

  5. Safety Performance of Exterior Wall Insulation Material Based on Large Security Concept

    NASA Astrophysics Data System (ADS)

    Zuo, Q. L.; Wang, Y. J.; Li, J. S.

    2018-05-01

    In order to evaluate the fire spread characteristics of building insulation materials under corner fire, an experiment is carried out with a small-scale fire spread test system. The change rules of parameters such as the average flame height, the average flame temperature and the flame shape are analyzed. The variations of the fire spread characteristic parameters of the building insulation materials are investigated. The results show that the average temperature of Expanded Polystyrene (EPS) boards of different thicknesses follows a decrease-rise-decrease-increase pattern. During the combustion process, the fire on the 4 cm thick board spreads faster.

  6. Colorado State Capitol Geothermal project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shepherd, Lance

    Colorado State Capitol Geothermal Project - The final report is redacted due to space constraints. This project was an innovative large-scale ground-source heat pump (GSHP) project at the Colorado State Capitol in Denver, Colorado. The project employed two large wells on the property: one for pulling water from the aquifer, and another for returning the water to the aquifer after performing the heat exchange. The two wells can work in either direction. Heat extracted from or added to the water via a heat exchanger is used to perform space conditioning in the building.

  7. Computer-generated forces in distributed interactive simulation

    NASA Astrophysics Data System (ADS)

    Petty, Mikel D.

    1995-04-01

    Distributed Interactive Simulation (DIS) is an architecture for building large-scale simulation models from a set of independent simulator nodes communicating via a common network protocol. DIS is most often used to create a simulated battlefield for military training. Computer Generated Forces (CGF) systems control large numbers of autonomous battlefield entities in a DIS simulation using computer equipment and software rather than humans in simulators. CGF entities serve as both enemy forces and supplemental friendly forces in a DIS exercise. Research into various aspects of CGF systems is ongoing. Several CGF systems have been implemented.

  8. Large Fluvial Fans: Aspects of the Attribute Array

    NASA Technical Reports Server (NTRS)

    Wilkinson, Justin M.

    2015-01-01

    In arguing for a strict definition of the alluvial fan (coarse-grained, with radii less than 10 km, in mountain-front settings), Blair and McPherson (1994) proposed that there is no meaningful difference between large fluvial fans (LFF) and floodplains, because the building blocks of both are channel-levee-overbank deposits. Sediment bodies at the LFF scale (greater than 100 km long, fan-shaped in planform) are relatively unstudied, although more than 160 are now identified globally. The following perspectives suggest that the significance of LFF needs to be reconsidered.

  9. Shared versus distributed memory multiprocessors

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.

    1991-01-01

    The question of whether multiprocessors should have shared or distributed memory has attracted a great deal of attention. Some researchers argue strongly for building distributed memory machines, while others argue just as strongly for programming shared memory multiprocessors. A great deal of research is underway on both types of parallel systems. Special emphasis is placed on systems with a very large number of processors for computation-intensive tasks, and research and implementation trends are considered. It appears that the two types of systems will likely converge to a common form for large scale multiprocessors.

  10. The Structure of Coronal Loops

    NASA Technical Reports Server (NTRS)

    Antiochos, Spiro K.

    2009-01-01

    It is widely believed that the simple coronal loops observed by XUV imagers, such as EIT, TRACE, or XRT, actually have a complex internal structure consisting of many (perhaps hundreds) of unresolved, interwoven "strands". According to the nanoflare model, photospheric motions tangle the strands, causing them to reconnect and release the energy required to produce the observed loop plasma. Although the strands themselves are unresolved by present-generation imagers, there is compelling evidence for their existence and for the nanoflare model from analysis of loop intensities and temporal evolution. A problem with this scenario is that, although reconnection can eliminate some of the strand tangles, it cannot destroy helicity, which should eventually build up to observable scales. We consider, therefore, the injection and evolution of helicity by the nanoflare process and its implications for the observed structure of loops and the large-scale corona. We argue that helicity does survive and build up to observable levels, but on spatial and temporal scales larger than those of coronal loops. We discuss the implications of these results for coronal loops and the corona in general.

  11. Large-area, lightweight and thick biomimetic composites with superior material properties via fast, economic, and green pathways.

    PubMed

    Walther, Andreas; Bjurhager, Ingela; Malho, Jani-Markus; Pere, Jaakko; Ruokolainen, Janne; Berglund, Lars A; Ikkala, Olli

    2010-08-11

    Although remarkable success has been achieved to mimic the mechanically excellent structure of nacre in laboratory-scale models, it remains difficult to foresee mainstream applications due to time-consuming sequential depositions or energy-intensive processes. Here, we introduce a surprisingly simple and rapid methodology for large-area, lightweight, and thick nacre-mimetic films and laminates with superior material properties. Nanoclay sheets with soft polymer coatings are used as ideal building blocks with intrinsic hard/soft character. They are forced to rapidly self-assemble into aligned nacre-mimetic films via paper-making, doctor-blading or simple painting, giving rise to strong and thick films with tensile modulus of 45 GPa and strength of 250 MPa, that is, partly exceeding nacre. The concepts are environmentally friendly, energy-efficient, and economic and are ready for scale-up via continuous roll-to-roll processes. Excellent gas barrier properties, optical translucency, and extraordinary shape-persistent fire-resistance are demonstrated. We foresee advanced large-scale biomimetic materials, relevant for lightweight sustainable construction and energy-efficient transportation.

  12. Scaling and allometry in the building geometries of Greater London

    NASA Astrophysics Data System (ADS)

    Batty, M.; Carvalho, R.; Hudson-Smith, A.; Milton, R.; Smith, D.; Steadman, P.

    2008-06-01

    Many aggregate distributions of urban activities such as city sizes reveal scaling but hardly any work exists on the properties of spatial distributions within individual cities, notwithstanding considerable knowledge about their fractal structure. We redress this here by examining scaling relationships in a world city using data on the geometric properties of individual buildings. We first summarise how power laws can be used to approximate the size distributions of buildings, in analogy to city-size distributions which have been widely studied as rank-size and lognormal distributions following Zipf [ Human Behavior and the Principle of Least Effort (Addison-Wesley, Cambridge, 1949)] and Gibrat [ Les Inégalités Économiques (Librarie du Recueil Sirey, Paris, 1931)]. We then extend this analysis to allometric relationships between buildings in terms of their different geometric size properties. We present some preliminary analysis of building heights from the Emporis database which suggests very strong scaling in world cities. The database for Greater London is then introduced, from which we extract 3.6 million buildings whose scaling properties we explore. We examine key allometric relationships between these different properties illustrating how building shape changes according to size, and we extend this analysis to the classification of buildings according to land use types. We conclude with an analysis of two-point correlation functions of building geometries which supports our non-spatial analysis of scaling.
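    The rank-size (Zipf-style) analysis described above can be illustrated with synthetic data: for a Pareto-distributed size variable with tail exponent α, a plot of log size against log rank has slope approximately −1/α. The building heights below are simulated under that assumption; they are not the Emporis or Greater London data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    alpha = 2.0                                    # assumed Pareto tail exponent
    # numpy's pareto() samples the Lomax form; (x + 1) * scale gives a
    # classical Pareto with minimum size 10 (synthetic building heights, m)
    heights = (rng.pareto(alpha, 5000) + 1.0) * 10.0

    # Zipf plot: sort sizes in descending order and regress log size on log rank
    ranked = np.sort(heights)[::-1]
    ranks = np.arange(1, len(ranked) + 1)
    slope, intercept = np.polyfit(np.log(ranks), np.log(ranked), 1)
    # For a Pareto tail, the fitted slope should be close to -1/alpha = -0.5
    ```

    The same log-log regression applied to real building heights is one simple way to check for the "very strong scaling" the abstract reports, though maximum-likelihood tail estimators are more robust than a plain least-squares fit on the Zipf plot.
    
    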

  13. Energy savings potential from improved building controls for the US commercial building sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fernandez, Nick; Katipamula, Srinivas; Wang, Weimin

    The U.S. Department of Energy’s (DOE’s) Building Technologies Office (BTO) sponsored a study to determine the potential national savings achievable in the commercial building sector through widespread deployment of best practice controls, elimination of system and component faults, and use of better sensing. Detailed characterization of potential savings was one source of input to set research, development, and deployment (RD&D) goals in the field of building sensors and controls. DOE’s building energy simulation software, EnergyPlus, was employed to estimate the potential savings from 34 measures in 9 building types and across 16 climates representing almost 57% of commercial building sector energy consumption. In addition to estimating savings from individual measures, three packages of measures were created to estimate savings from the packages. These packages represented an 1) efficient building, 2) typical building, and 3) inefficient building. To scale the results from individual measures or a package to the national scale, building weights by building type and climate locations from the Energy Information Administration’s 2012 Commercial Building Energy Consumption Survey (CBECS) were used. The results showed significant potential for energy savings across all building types and climates. The total site potential savings from individual measures by building type and climate location ranged between 0% and 25%. The total site potential savings by building type aggregated across all climates (using the CBECS building weights) for each measure varied between 0% and 16%. The total site potential savings aggregated across all building types and climates for each measure varied between 0% and 11%. Some individual measures had negative savings because correcting underlying operational problems (e.g., inadequate ventilation) resulted in increased energy consumption. 
When combined into packages, the overall national savings potential is estimated to be 29%; seven of the nine building types were in the range of 23 to 29% and two exceeded 40%. The total potential national site savings for each building type ranged between 95×10⁶ GJ (0.09 Quadrillion British thermal units [Quads]; Large Hotels) and 222×10⁶ GJ (0.21 Quads; Large Office, Hospital Administrative areas, and College/University), resulting in total site savings of 1,393×10⁶ GJ (1.32 Quads) when the three packages are applied to the U.S. commercial buildings stock. Using the source (or primary) energy conversion factors of 1.05 for natural gas and 3.14 for electricity resulted in an approximate potential primary energy savings of 2,912×10⁶ GJ (2.76 Quads), which would be 15% of the sector’s 2015 use of approximately 18,991×10⁶ GJ (18 Quads). Extrapolating the results to other building types not analyzed as part of this study, the primary energy savings could be in the range of 4,220×10⁶ GJ to 5,275×10⁶ GJ (4 to 5 Quads). If this savings potential is realized, it would be equivalent to not combusting 180 to 230 million tons of coal or reducing the energy impacts, at today’s energy intensities, of the per capita consumption of 12 to 15 million people in the U.S. To realize most of this potential savings, many gaps can be addressed through RD&D, as recommended in this paper.

  14. Novel approach for extinguishing large-scale coal fires using gas-liquid foams in open pit mines.

    PubMed

    Lu, Xinxiao; Wang, Deming; Qin, Botao; Tian, Fuchao; Shi, Guangyi; Dong, Shuaijun

    2015-12-01

    Coal fires are a serious threat to workers' safety and safe production in open pit mines. Coal fire sources are hidden and numerous, and large-area cavities are prevalent in the coal seam after the coal has burned, making conventional extinguishment technology difficult to apply. Foams are considered an efficient means of fire extinguishment in these large-scale workplaces. A novel foam preparation method is introduced, and an original design of a cavitation jet device is proposed to add foaming agent stably. Jet cavitation occurs when the water flow rate and pressure ratio reach specified values. Through a self-built foaming system, high-performance foams are produced and then infused into the blast drilling holes at a large flow rate. Without complicated operation, this system is found to be very suitable for extinguishing large-scale coal fires. Field application shows that foam generation adopting the proposed key technology achieves a good fire extinguishment effect. The temperature reduction using foams is 6-7 times higher than with water, and CO concentration is reduced from 9.43 to 0.092‰ in the drilling hole. The coal fires were controlled successfully in open pit mines, ensuring normal production as well as the security of personnel and equipment.

  15. Large-scale experiments for the vulnerability analysis of buildings impacted and intruded by fluviatile torrential hazard processes

    NASA Astrophysics Data System (ADS)

    Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus

    2016-04-01

    In European mountain regions, losses due to torrential hazards are still considerable high despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria severely revealed that due to technical and economic reasons, an overall protection of settlements in the alpine environment against torrential hazards is not feasible. On the side of the hazard process, events with unpredictable intensities may represent overload scenarios for existent protection structures in the torrent catchments. They bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards and their comparison with reinstatement costs are determined by the use of empirical functions. Hence, relations of process intensities and the extent of losses, gathered by the analysis of historic hazard events and the information of object-specific restoration values, are used. This approach does not represent a physics-based and integral concept since relevant and often crucial processes, as the intrusion of the fluid-sediment-mixture into elements at risk, are not considered. Based on these findings, our work is targeted at extending these findings and models of present risk research in the context of an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are experimentally modelled. Material intrusion processes are thereby explicitly considered. Dynamic impacts are gathered quantitatively and spatially distributed by the use of a large set of force transducers. The experimental tests are accomplished with artificial, vertical and skewed plates, including also openings for material intrusion. 
Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach torrent in Tyrol (Austria), are analysed in detail. Several buildings are entirely reconstructed within the physical model at a scale of 1:30, including basement, first floor and all relevant openings in the building envelope. The results of the experimental modelling provide the data basis for further physics-based vulnerability analysis. Hence, the applied vulnerability analysis concept significantly extends the methods presently used in flood risk assessment. The results of the study are of basic importance for practical application, as they provide extensive information to support hazard zone mapping and management, as well as the planning of local technical protection measures.
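    The 1:30 physical model mentioned in this record invites a quick similitude check. Below is a minimal sketch, assuming Froude similarity (the usual choice for free-surface hydraulic models; the abstract itself does not state the scaling law), of the resulting prototype-to-model scale factors:

```python
# Hypothetical illustration: Froude-similitude scale factors for a 1:30
# hydraulic model. Froude scaling is an assumption here, not confirmed
# by the abstract.
import math

LAMBDA = 30.0  # geometric scale factor (prototype / model)

velocity_ratio = math.sqrt(LAMBDA)   # v_p / v_m = lambda^(1/2)
time_ratio = math.sqrt(LAMBDA)       # t_p / t_m = lambda^(1/2)
discharge_ratio = LAMBDA ** 2.5      # Q_p / Q_m = lambda^(5/2)
force_ratio = LAMBDA ** 3            # F_p / F_m = lambda^3 (same fluid)

print(f"velocity ratio:  {velocity_ratio:.2f}")
print(f"time ratio:      {time_ratio:.2f}")
print(f"discharge ratio: {discharge_ratio:.0f}")
print(f"force ratio:     {force_ratio:.0f}")
```

    Under these assumptions, a model discharge of 1 l/s corresponds to roughly 4.9 m³/s at prototype scale.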

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katkov, Ivan Yu.; Sil'chenko, Olga K.; Afanasiev, Victor L., E-mail: katkov.ivan@gmail.com, E-mail: olga@sai.msu.su, E-mail: vafan@sao.ru

    We have obtained and analyzed long-slit spectral data for the lenticular galaxy IC 719. In this gas-rich S0 galaxy, the large-scale gaseous disk counterrotates with respect to the global stellar disk. Moreover, in the IC 719 disk we have detected a secondary stellar component corotating with the ionized gas. By using emission-line intensity ratios, we have proven that the gas is excited by young stars and thus claim current star formation, most intense in a ring-like zone at a radius of 10'' (1.4 kpc). The oxygen abundance of the gas in the star-forming ring is about half the solar abundance. Since the stellar disk remains dynamically cool, we conclude that smooth prolonged accretion of external gas from a neighboring galaxy provides the current building of the thin large-scale stellar disk.

  17. A combinatorial code for pattern formation in Drosophila oogenesis.

    PubMed

    Yakoby, Nir; Bristow, Christopher A; Gong, Danielle; Schafer, Xenia; Lembong, Jessica; Zartman, Jeremiah J; Halfon, Marc S; Schüpbach, Trudi; Shvartsman, Stanislav Y

    2008-11-01

    Two-dimensional patterning of the follicular epithelium in Drosophila oogenesis is required for the formation of three-dimensional eggshell structures. Our analysis of a large number of published gene expression patterns in the follicle cells suggests that they follow a simple combinatorial code based on six spatial building blocks and the operations of union, difference, intersection, and addition. The building blocks are related to the distribution of inductive signals, provided by the highly conserved epidermal growth factor receptor and bone morphogenetic protein signaling pathways. We demonstrate the validity of the code by testing it against a set of patterns obtained in a large-scale transcriptional profiling experiment. Using the proposed code, we distinguish 36 distinct patterns for 81 genes expressed in the follicular epithelium and characterize their joint dynamics over four stages of oogenesis. The proposed combinatorial framework allows systematic analysis of the diversity and dynamics of two-dimensional transcriptional patterns and guides future studies of gene regulation.
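    The set-theoretic flavor of the proposed code can be illustrated in a few lines. Here is a toy sketch, with invented domain names and coordinates (not the paper's actual six building blocks), showing how spatial expression domains combine under union, difference, and intersection:

```python
# Toy sketch: annotation domains as sets of follicle-cell positions on a
# 10x10 grid, combined by set operations. All domains are illustrative
# inventions, not the building blocks identified in the paper.
grid = {(x, y) for x in range(10) for y in range(10)}

midline = {(x, y) for (x, y) in grid if 4 <= x <= 5}   # 20 cells
anterior = {(x, y) for (x, y) in grid if y >= 7}       # 30 cells
dorsal = {(x, y) for (x, y) in grid if x <= 4}         # 50 cells

pattern_a = anterior | midline   # union
pattern_b = anterior - midline   # difference
pattern_c = anterior & dorsal    # intersection

print(len(pattern_a), len(pattern_b), len(pattern_c))
```

    A small, fixed vocabulary of domains plus a handful of operations is enough to enumerate a large space of distinct two-dimensional patterns, which is the intuition behind the 36 patterns the authors distinguish.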

  18. Seismic isolation of small modular reactors using metamaterials

    NASA Astrophysics Data System (ADS)

    Witarto, Witarto; Wang, S. J.; Yang, C. Y.; Nie, Xin; Mo, Y. L.; Chang, K. C.; Tang, Yu; Kassawara, Robert

    2018-04-01

    Adapting metamaterials from micro- and nanometer scales to metastructures at much larger scales offers a new alternative for seismic isolation systems. These new isolation systems, known as periodic foundations, function both as a structural foundation supporting the gravitational weight of the superstructure and as a seismic isolator shielding the superstructure from incoming seismic waves. Here we describe the application of periodic foundations to the seismic protection of nuclear power plants, in particular small modular reactors (SMRs). For this purpose, a large-scale shake table test on a one-dimensional (1D) periodic foundation supporting an SMR building model was conducted. The 1D periodic foundation was designed and fabricated using reinforced concrete and synthetic rubber (polyurethane) materials. The 1D periodic foundation structural system was tested under various input waves, including white noise, stepped-sine and seismic waves, in the horizontal and vertical directions as well as in the torsional mode. The shake table test results show that the 1D periodic foundation can reduce the acceleration response (transmissibility) of the SMR building by up to 90%. In addition, the periodic foundation-isolated structure also exhibited smaller displacement than the non-isolated SMR building. This study indicates that the challenges faced in developing metastructures can be overcome and that periodic foundations can be applied to isolate the vibration response of engineering structures.
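    For intuition about the reported reduction in transmissibility of up to 90%, the classical single-degree-of-freedom isolator formula gives a textbook baseline; note this is not the band-gap mechanism that periodic foundations exploit, and the frequency ratio and damping ratio below are assumed illustration values:

```python
# Textbook 1-DOF isolator transmissibility, for comparison only; periodic
# foundations work by frequency band gaps, not by this mechanism.
import math

def transmissibility(r, zeta):
    """Steady-state transmissibility of a linear 1-DOF isolator.
    r = forcing frequency / natural frequency, zeta = damping ratio."""
    num = 1 + (2 * zeta * r) ** 2
    den = (1 - r ** 2) ** 2 + (2 * zeta * r) ** 2
    return math.sqrt(num / den)

# Isolation (T < 1) only occurs above r = sqrt(2); e.g. r = 5:
print(transmissibility(r=5.0, zeta=0.05))  # well into the isolation range
```

    At r = 5 with 5% damping the classical isolator already transmits under 5% of the input, comparable in magnitude to the 90% reduction reported for the periodic foundation.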

  19. Challenging Common Sense: Cases of School Reform for Learning Community under an International Cooperation Project in Bac Giang Province, Vietnam

    ERIC Educational Resources Information Center

    Saito, Eisuke; Tsukui, Atsushi

    2008-01-01

    This paper aims to discuss the challenges in the process of building a learning community in Vietnamese primary schools. Five lessons emerge from the cases. First, changing teachers' beliefs is time-consuming. Second, because of the reluctance of teachers to change, large-scale delivery of the educational project should be critically revisited…

  20. A Feasibility Study of Sustainable Distributed Generation Technologies to Improve the electrical System on the Duck Valley Reservation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herman Atkins, Shoshone-Paiute; Mark Hannifan, New West Technologies

    A range of sustainable energy options were assessed for feasibility in addressing chronic electric grid reliability problems at Duck Valley IR. Wind power and building energy efficiency were determined to have the most merit, with the Duck Valley Tribes now well positioned to pursue large scale wind power development for on- and off-reservation sales.

  1. Building on and Honoring Forty Years of PBL Scholarship from Howard Barrows: A Scientometric, Large-Scale Data, and Visualization-Based Analysis

    ERIC Educational Resources Information Center

    Xian, Hanjun; Madhavan, Krishna

    2013-01-01

    Over the past forty years, Howard Barrows' contributions to PBL research have influenced and guided educational research and practice in a diversity of domains. It is necessary to make visible to all PBL scholars what has been accomplished, what is perceived as significant, and what is the scope of applicability for Barrows' groundbreaking…

  2. Driving Solar Innovations from Laboratory to Marketplace - Continuum

    Science.gov Websites

    … military-funded core technologies would someday lead to the Internet. Or that solar photovoltaics (PV) … more than a dozen start-up thin-film PV companies. This ultimately led to the creation of First Solar … build a large-scale solar PV module plant in Colorado. As it has matured, CdTe technology has achieved …

  3. Building for the Future by Expatiating the Past: High Drama from the Summit of China's Learning Mountain

    ERIC Educational Resources Information Center

    Boshier, Roger; Huang, Yan

    2006-01-01

    As part of a large-scale learning initiative, the Chinese Communist Party has declared Lushan to be a "learning mountain". There have been people learning at Lushan Mountain for 2000 years. In 1959 there was a Central Committee meeting at Lushan, where Mao Zedong purged his widely respected comrade Peng Dehuai for daring to say people…

  4. Fire development and wall endurance in sandwich and wood-frame structures

    Treesearch

    Carlton A. Holmes; Herbert W. Eickner; John J. Brenden; Curtis C. Peters; Robert H. White

    1980-01-01

    Large-scale fire tests were conducted on seven 16- by 24-foot structures. Four of these structures were of sandwich construction with cores of plastic or paper honeycomb, and three were of wood-frame construction. The walls were loaded to a computer design loading, and the fire endurance was determined under a fire exposure from a typical building contents loading of 4-1/2...

  5. A Holistic Redundancy- and Incentive-Based Framework to Improve Content Availability in Peer-to-Peer Networks

    ERIC Educational Resources Information Center

    Herrera-Ruiz, Octavio

    2012-01-01

    Peer-to-Peer (P2P) technology has emerged as an important alternative to the traditional client-server communication paradigm to build large-scale distributed systems. P2P enables the creation, dissemination and access to information at low cost and without the need of dedicated coordinating entities. However, existing P2P systems fail to provide…

  6. PETSc Users Manual Revision 3.7

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balay, Satish; Abhyankar, S.; Adams, M.

    This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication.

  7. Nonvolatile Array Of Synapses For Neural Network

    NASA Technical Reports Server (NTRS)

    Tawel, Raoul

    1993-01-01

    Elements of array programmed with help of ultraviolet light. A 32 x 32 very-large-scale integrated-circuit array of electronic synapses serves as building-block chip for analog neural-network computer. Synaptic weights stored in nonvolatile manner. Makes information content of array invulnerable to loss of power, and, by eliminating need for circuitry to refresh volatile synaptic memory, makes architecture simpler and more compact.

  8. PETSc Users Manual Revision 3.8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balay, S.; Abhyankar, S.; Adams, M.

    This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication.

  9. Improved Calibration Shows Images True Colors

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Innovative Imaging and Research, located at Stennis Space Center, used a single SBIR contract with the center to build a large-scale integrating sphere, capable of calibrating a whole array of cameras simultaneously, at a fraction of the usual cost for such a device. Through the use of LEDs, the company also made the sphere far more efficient than existing products and able to mimic sunlight.

  10. Statistical Literacy in Data Revolution Era: Building Blocks and Instructional Dilemmas

    ERIC Educational Resources Information Center

    Prodromou, Theodosia; Dunne, Tim

    2017-01-01

    The data revolution has given citizens access to enormous large-scale open databases. In order to take into account the full complexity of data, we have to change the way we think in terms of the nature of data and its availability, the ways in which it is displayed and used, and the skills that are required for its interpretation. Substantial…

  11. High-rate, roll-to-roll nanomanufacturing of flexible systems

    NASA Astrophysics Data System (ADS)

    Cooper, Khershed P.; Wachter, Ralph F.

    2012-10-01

    Since the National Nanotechnology Initiative was first announced in 2000, nanotechnology has developed an impressive catalog of nano-scale structures with building blocks such as nanoparticles, nanotubes, nanorods, nanopillars, and quantum dots. Similarly, there are accompanying materials processes such as atomic layer deposition, pulsed laser deposition, nanoprinting, nanoimprinting, transfer printing, nanolithography and nanopatterning. One of the challenges of nanomanufacturing is scaling up these processes reliably and affordably. Roll-to-roll manufacturing is a means for scaling up, for increasing throughput. It is high-speed production using a continuous, moving platform such as a web or a flexible substrate. The adoption of roll-to-roll methods in nanomanufacturing is novel. The goal is to build structures and devices with nano-scale features and specific functionality. The substrate could be a polymer, metal foil, silk, cloth or paper. The materials to build the structures and multi-level devices could be organic, inorganic or biological. Processing could be solution-based, e.g., ink-jet printing, or vacuum-based, e.g., chemical vapor deposition. Products could be electronics, optoelectronics, membranes, catalysts, microfluidics, lab-on-film, filters, etc. By this means, processing of large and conformal areas is achievable. High throughput translates into low cost, which is the attraction of roll-to-roll nanomanufacturing. There are technical challenges requiring fundamental scientific advances in materials and process development and in manufacturing and system integration, where achieving nano-scale feature size, resolution and accuracy at high speeds can be major hurdles.
We will give an overview of roll-to-roll nanomanufacturing with emphasis on the need to understand the material, process and system complexities, the need for instrumentation, measurement, and process control and describe the concept of cyber-enabled nanomanufacturing for reliable and predictable production.

  12. Direct Georeferencing of Uav Data Based on Simple Building Structures

    NASA Astrophysics Data System (ADS)

    Tampubolon, W.; Reinhardt, W.

    2016-06-01

    Unmanned Aerial Vehicle (UAV) data acquisition is more flexible than the more complex traditional airborne data acquisition. This advantage positions UAV platforms as an alternative acquisition method in many applications, including Large Scale Topographical Mapping (LSTM). LSTM, i.e. mapping at scales of 1:10,000 or larger, is one of a number of prominent priority tasks to be solved in an accelerated way, especially in developing countries such as Indonesia. As one component of fundamental geospatial data sets, large-scale topographical maps are mandatory in order to enable detailed spatial planning. However, the accuracy of the products derived from UAV data is normally not sufficient for LSTM, which needs robust georeferencing; this in turn requires additional costly efforts such as the incorporation of a sophisticated GPS Inertial Navigation System (INS) or Inertial Measurement Unit (IMU) on the platform and/or Ground Control Point (GCP) data on the ground. To reduce the costs and the weight on the UAV, alternative solutions have to be found. This paper outlines a direct georeferencing method for UAV data that derives image orientation parameters from simple building structures, and presents results of an investigation on the achievable accuracy in an LSTM application. In this case, the image orientation determination has been performed through sequential images without any input from INS/IMU equipment. The simple building structures play a significant role in that their geometrical characteristics are exploited; examples are the orthogonality of a building's walls/rooftop and local knowledge of the building orientation in the field. In addition, we include the Structure from Motion (SfM) approach in order to reduce the number of GCPs required, especially for absolute orientation.
The SfM technique applied to the UAV data and simple building structures additionally presents an effective, low-cost tool for the LSTM application. Our results show that image orientation calculations from building structures substantially improve the accuracy of the direct georeferencing procedure, adjusted also by the GCPs. To obtain three-dimensional (3D) point clouds in a local coordinate system, an extraction procedure has been performed using Agisoft PhotoScan. Subsequently, a Digital Surface Model (DSM) generated from the acquired data is the main output for LSTM, which has to be assessed against standard field and conventional mapping workflows. For appraisal, our DSM is compared directly with similar DSMs obtained by conventional airborne data acquisition using a Leica RCD-30 metric camera as well as a Trimble Phase One (P65+) camera. The comparison reveals that our approach can achieve meter-level accuracy in both the planimetric and vertical dimensions.

  13. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies indicating that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technology infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will come in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  14. The value of cows in reference populations for genomic selection of new functional traits.

    PubMed

    Buch, L H; Kargo, M; Berg, P; Lassen, J; Sørensen, A C

    2012-06-01

    Today, almost all reference populations consist of progeny tested bulls. However, older progeny tested bulls do not have reliable estimated breeding values (EBV) for new traits. Thus, to be able to select for these new traits, it is necessary to build a reference population. We used a deterministic prediction model to test the hypothesis that the value of cows in reference populations depends on the availability of phenotypic records. To test the hypothesis, we investigated different strategies of building a reference population for a new functional trait over a 10-year period. The trait was either recorded on a large scale (30 000 cows per year) or on a small scale (2000 cows per year). For large-scale recording, we compared four scenarios where the reference population consisted of 30 sires; 30 sires and 170 test bulls; 30 sires and 2000 cows; or 30 sires, 2000 cows and 170 test bulls in the first year with measurements of the new functional trait. In addition to varying the make-up of the reference population, we also varied the heritability of the trait (h2 = 0.05 v. 0.15). The results showed that a reference population of test bulls, cows and sires results in the highest accuracy of the direct genomic values (DGV) for a new functional trait, regardless of its heritability. For small-scale recording, we compared two scenarios where the reference population consisted of the 2000 cows with phenotypic records or the 30 sires of these cows in the first year with measurements of the new functional trait. The results showed that a reference population of cows results in the highest accuracy of the DGV whether the heritability is 0.05 or 0.15, because variation is lost when phenotypic data on cows are summarized in EBV of their sires. 
The main conclusions from this study are: (i) the fewer the phenotypic records, the larger the effect of including cows in the reference population; (ii) for small-scale recording, the accuracy of the DGV will continue to increase for several years, whereas the increases in the accuracy of the DGV quickly decrease with large-scale recording; (iii) it is possible to achieve accuracies of the DGV that enable selection for new functional traits recorded on a large scale within 3 years from commencement of recording; and (iv) a higher heritability benefits a reference population of cows more than a reference population of bulls.
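    The trade-offs described above (record count versus heritability) can be sketched with a Daetwyler-type deterministic approximation for the accuracy of direct genomic values; this is not necessarily the prediction model the authors used, and the number of independent chromosome segments M_e below is an assumed value:

```python
# Illustrative sketch only: a Daetwyler-type approximation of DGV accuracy,
# r = sqrt(N h^2 / (N h^2 + M_e)). The choice M_e = 1000 is an assumption,
# not a value taken from the study.
import math

def dgv_accuracy(n_records, h2, m_e=1000.0):
    """Approximate accuracy of direct genomic values from N phenotypic
    records with heritability h2 and M_e independent chromosome segments."""
    nh2 = n_records * h2
    return math.sqrt(nh2 / (nh2 + m_e))

# Small-scale vs large-scale recording, low vs higher heritability:
for n in (2000, 30000):
    for h2 in (0.05, 0.15):
        print(n, h2, round(dgv_accuracy(n, h2), 3))
```

    Even this crude formula reproduces the qualitative pattern of the abstract: accuracy rises steeply with record numbers when few records exist, and a higher heritability partly compensates for small-scale recording.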

  15. Approaches for advancing scientific understanding of macrosystems

    USGS Publications Warehouse

    Levy, Ofir; Ball, Becky A.; Bond-Lamberty, Ben; Cheruvelil, Kendra S.; Finley, Andrew O.; Lottig, Noah R.; Surangi W. Punyasena,; Xiao, Jingfeng; Zhou, Jizhong; Buckley, Lauren B.; Filstrup, Christopher T.; Keitt, Tim H.; Kellner, James R.; Knapp, Alan K.; Richardson, Andrew D.; Tcheng, David; Toomey, Michael; Vargas, Rodrigo; Voordeckers, James W.; Wagner, Tyler; Williams, John W.

    2014-01-01

    The emergence of macrosystems ecology (MSE), which focuses on regional- to continental-scale ecological patterns and processes, builds upon a history of long-term and broad-scale studies in ecology. Scientists face the difficulty of integrating the many elements that make up macrosystems, which consist of hierarchical processes at interacting spatial and temporal scales. Researchers must also identify the most relevant scales and variables to be considered, the required data resources, and the appropriate study design to provide the proper inferences. The large volumes of multi-thematic data often associated with macrosystem studies typically require validation, standardization, and assimilation. Finally, analytical approaches need to describe how cross-scale and hierarchical dynamics and interactions relate to macroscale phenomena. Here, we elaborate on some key methodological challenges of MSE research and discuss existing and novel approaches to meet them.

  16. Turbulence Measurements and Computations for the Prediction of Broadband Noise in High Bypass Ratio Fans

    NASA Technical Reports Server (NTRS)

    Devenport, William J.; Ragab, Saad A.

    2000-01-01

    Work was performed under this grant with a view to providing the experimental and computational results needed to improve the prediction of broadband stator noise in large bypass ratio aircraft engines. The central hypothesis of our study was that a large fraction of this noise was generated by the fan tip leakage vortices; more specifically, that these vortices are a significant component of the fan wake turbulence and contain turbulent eddies of a type that can produce significant broadband noise. To test this hypothesis we originally proposed experimental work and computations with the following objectives: (1) to build a large-scale two-dimensional cascade with a tip gap and a stationary endwall that, as far as possible, simulates the fan tip geometry, (2) to build a moving endwall for use with the large-scale cascade, (3) to measure, in detail, the turbulence structure and spectrum generated by the blade wake and tip leakage vortex, for both endwall configurations, (4) to use CFD to compute the flow and turbulence distributions for both the experimental configurations and the ADP fan, (5) to provide the experimental and CFD results for the cascades, and the physical understanding gained from their study, as a basis for improving the broadband noise prediction method. In large part these objectives have been achieved. The most important achievements and findings of our experimental and computational efforts are summarized below. The bibliography at the end of this report includes a list of all publications produced to date under this project. Note that this list is necessarily incomplete, as the task of publication (particularly in journal papers) continues.

  17. Primordial perturbations with pre-inflationary bounce

    NASA Astrophysics Data System (ADS)

    Cai, Yong; Wang, Yu-Tong; Zhao, Jin-Yun; Piao, Yun-Song

    2018-05-01

    Based on the effective field theory (EFT) of nonsingular cosmologies, we build a stable model, free of ghost and gradient instabilities, of bounce-inflation (inflation preceded by a cosmological bounce). We perform a full simulation of the evolution of the scalar perturbation and find that the perturbation spectrum has a large-scale suppression (as expected), consistent with the power deficit of the cosmic microwave background (CMB) TT-spectrum at low multipoles; unexpectedly, it also shows a marked valley. The depth of the valley is related to the physics around the bounce scale and is model-dependent.

  18. Bridges to sustainable tropical health

    PubMed Central

    Singer, Burton H.; de Castro, Marcia Caldas

    2007-01-01

    Ensuring sustainable health in the tropics will require bridge building between communities that currently have a limited track record of interaction. It will also require new organizational innovation if many of the negative health consequences of large-scale economic development projects are to be equitably mitigated, if not prevented. We focus attention on three specific contexts: (i) forging linkages between the engineering and health communities to implement clean water and sanitation on a broad scale, to prevent the reworming of people by diverse intestinal parasites after the current deworming-only programs; (ii) building integrated human and animal disease surveillance infrastructure and technical capacity in tropical countries on the reporting and scientific evidence requirements of the sanitary and phytosanitary agreement under the World Trade Organization; and (iii) developing an independent and equitable organizational structure for health impact assessments as well as monitoring and mitigation of health consequences of economic development projects. Effective global disease surveillance and timely early warning of new outbreaks will require a far closer integration of veterinary and human medicine than heretofore. Many of the necessary surveillance components exist within separate animal- and human-oriented organizations. The challenge is to build the necessary bridges between them. PMID:17913894

  19. Identifying unproven cancer treatments on the health web: addressing accuracy, generalizability and scalability.

    PubMed

    Aphinyanaphongs, Yin; Fu, Lawrence D; Aliferis, Constantin F

    2013-01-01

    Building machine learning models that identify unproven cancer treatments on the Health Web is a promising approach for dealing with the dissemination of false and dangerous information to vulnerable health consumers. Aside from the obvious requirement of accuracy, two issues are of practical importance in deploying these models in real world applications. (a) Generalizability: The models must generalize to all treatments (not just the ones used in the training of the models). (b) Scalability: The models can be applied efficiently to billions of documents on the Health Web. First, we provide methods and related empirical data demonstrating strong accuracy and generalizability. Second, by combining the MapReduce distributed architecture and high dimensionality compression via Markov Boundary feature selection, we show how to scale the application of the models to WWW-scale corpora. The present work provides evidence that (a) a very small subset of unproven cancer treatments is sufficient to build a model to identify unproven treatments on the web; (b) unproven treatments use distinct language to market their claims and this language is learnable; (c) through distributed parallelization and state of the art feature selection, it is possible to prepare the corpora and build and apply models with large scalability.
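    A minimal, stdlib-only sketch of the underlying classification task, with invented toy documents; the paper's actual models were trained on large curated corpora and used Markov-boundary feature selection and MapReduce, none of which is reproduced here:

```python
# Toy sketch: a bag-of-words Naive Bayes classifier separating
# "unproven-treatment" language (label 1) from scientific language (label 0).
# All documents and vocabulary are illustrative inventions.
import math
from collections import Counter

train = [
    ("miracle cure eliminates tumors naturally without chemotherapy", 1),
    ("secret remedy doctors suppress cures cancer", 1),
    ("randomized controlled trial of adjuvant chemotherapy outcomes", 0),
    ("phase three study reports progression free survival results", 0),
]

counts = {0: Counter(), 1: Counter()}
for text, label in train:
    counts[label].update(text.split())
vocab = set(counts[0]) | set(counts[1])

def predict(text):
    scores = {}
    for label in (0, 1):
        total = sum(counts[label].values())
        # log prior (balanced classes) + Laplace-smoothed log likelihoods
        score = math.log(0.5)
        for word in text.split():
            score += math.log((counts[label][word] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

print(predict("secret miracle remedy cures cancer"))  # classifies as 1
```

    The point mirrors finding (b) of the abstract: the marketing language of unproven treatments is distinctive enough that even a very simple learner can pick it up on toy data.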

  20. Multilevel Hierarchical Kernel Spectral Clustering for Real-Life Large Scale Complex Networks

    PubMed Central

    Mall, Raghvendra; Langone, Rocco; Suykens, Johan A. K.

    2014-01-01

    Kernel spectral clustering (KSC) corresponds to a weighted kernel principal component analysis problem in a constrained optimization framework. The primal formulation leads to an eigen-decomposition of a centered Laplacian matrix at the dual level. The dual formulation makes it possible to build a model on a representative subgraph of the large-scale network in the training phase; the model parameters are estimated in the validation stage. The KSC model has a powerful out-of-sample extension property which enables cluster affiliation for unseen nodes of the big data network. In this paper we exploit the structure of the projections in the eigenspace during the validation stage to automatically determine a set of increasing distance thresholds. We use these distance thresholds in the test phase to obtain multiple levels of hierarchy for the large-scale network. The hierarchical structure in the network is determined in a bottom-up fashion. We empirically showcase that real-world networks have multilevel hierarchical organization which cannot be detected efficiently by several state-of-the-art large-scale hierarchical community detection techniques such as the Louvain, OSLOM and Infomap methods. We show that a major advantage of our proposed approach is the ability to locate good quality clusters at both the finer and coarser levels of hierarchy using internal cluster quality metrics on 7 real-life networks. PMID:24949877
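    For readers unfamiliar with the underlying idea, here is a simplified sketch of plain spectral clustering on a toy graph; the paper's KSC method adds a kernel-based primal-dual formulation, out-of-sample extension, and hierarchical distance thresholds, none of which are reproduced here:

```python
# Simplified sketch (NumPy only): spectral bipartition of a small graph
# via the Fiedler vector of the graph Laplacian. The toy graph is an
# illustrative invention, not data from the paper.
import numpy as np

# Adjacency matrix: two 3-node cliques joined by one weak bridge edge.
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    A[i, j] = A[j, i] = 1.0
A[2, 3] = A[3, 2] = 0.1  # weak bridge

d = A.sum(axis=1)
L = np.diag(d) - A                      # unnormalized graph Laplacian
eigvals, eigvecs = np.linalg.eigh(L)

# The Fiedler vector (second-smallest eigenvector) splits the two cliques.
fiedler = eigvecs[:, 1]
clusters = (fiedler > 0).astype(int)
print(clusters)
```

    The sign pattern of the Fiedler vector separates the two cliques; KSC replaces this global eigen-decomposition with a kernel model trained on a small representative subgraph, which is what makes it scale to large networks.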

  1. The build up of the correlation between halo spin and the large-scale structure

    NASA Astrophysics Data System (ADS)

    Wang, Peng; Kang, Xi

    2018-01-01

    Both simulations and observations have confirmed that the spin of haloes/galaxies is correlated with the large-scale structure (LSS), with a mass dependence such that the spin of low-mass haloes/galaxies tends to be parallel with the LSS, while that of massive haloes/galaxies tends to be perpendicular to the LSS. It is still unclear how this mass dependence is built up over time. We use N-body simulations to trace the evolution of the halo spin-LSS correlation and find that at early times the spin of all halo progenitors is parallel with the LSS. As time goes on, mass collapse around massive haloes becomes more isotropic; in particular, recent mass accretion along the slowest-collapsing direction is significant and brings the halo spin to be perpendicular to the LSS. Adopting the fractional anisotropy (FA) parameter to describe the degree of anisotropy of the large-scale environment, we find that the spin-LSS correlation is a strong function of the environment, such that a higher FA (more anisotropic environment) leads to an aligned signal, and a lower anisotropy leads to a misaligned signal. In general, our results show that the spin-LSS correlation is a combined consequence of mass flow and halo growth within the cosmic web. Our predicted environmental dependence between spin and large-scale structure can be further tested using galaxy surveys.
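    The fractional anisotropy (FA) parameter mentioned in this record has a standard definition over the three eigenvalues of a local tensor; the form below is the common one borrowed from diffusion-tensor imaging, and the paper's exact convention may differ:

```python
# Standard fractional anisotropy of three eigenvalues (an assumption about
# the paper's convention): 0 for an isotropic environment, 1 for a
# maximally anisotropic one.
import math

def fractional_anisotropy(l1, l2, l3):
    num = (l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return math.sqrt(num / (2 * den))

print(fractional_anisotropy(1.0, 1.0, 1.0))  # isotropic -> 0.0
print(fractional_anisotropy(1.0, 0.0, 0.0))  # maximally anisotropic -> 1.0
```

    Under this definition, a filamentary environment (one dominant eigenvalue) yields FA near 1 and, per the abstract, an aligned spin signal, while a near-isotropic environment yields FA near 0 and a misaligned signal.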

  2. 5. Credit BG. This interior view shows the weigh room, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. Credit BG. This interior view shows the weigh room, looking west (240°): Electric lighting and scale read-outs (boxes with circular windows on the wall) are fitted with explosion-proof enclosures; these enclosures prevent malfunctioning electrical parts from sparking and starting fires or explosions. One marble table and scale have been removed at the extreme left of the view. Two remaining scales handle small and large quantities of propellants and additives. Marble tables do not absorb chemicals or conduct electricity; their mass also prevents vibration from upsetting the scales. The floor has an electrically conductive coating to dissipate static electric charges, thus preventing sparks which might ignite propellants. - Jet Propulsion Laboratory Edwards Facility, Weigh & Control Building, Edwards Air Force Base, Boron, Kern County, CA

  3. Cumulative metal leaching from utilisation of secondary building materials in river engineering.

    PubMed

    Leuven, R S E W; Willems, F H G

    2004-01-01

    The present paper estimates the utilisation of bulky wastes (minestone, steel slag, phosphorus slag and demolition waste) in hydraulic engineering structures in Dutch parts of the rivers Rhine, Meuse and Scheldt over the period 1980-2025. Although they offer several economic, technical and environmental benefits, these secondary building materials contain various metals that may leach into river water. A leaching model was used to predict annual emissions of arsenic, cadmium, copper, chromium, lead, mercury, nickel and zinc. Under the current utilisation and model assumptions, the contribution of secondary building materials to metal pollution in Dutch surface waters is expected to be relatively low compared to other sources (less than 0.1% and 0.2% in the years 2000 and 2025, respectively). However, continued and widespread large-scale applications of secondary building materials will increase pollutant leaching and may require further cuts to be made in emissions from other sources to meet emission reduction targets and water quality standards. It is recommended to validate available leaching models under various field conditions. Complete registration of secondary building materials will be required to improve input data for leaching models.
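
    The paper's leaching model is not reproduced in the record. As an illustrative sketch only, a generic first-order release model E(t) = E_inf(1 - e^(-kt)) gives annual emissions as successive differences of the cumulative release; e_inf and k below are hypothetical parameters, not values from the study.

```python
import math

def annual_emissions(e_inf, k, years):
    """Per-year metal emissions for a first-order cumulative release
    model E(t) = e_inf * (1 - exp(-k * t)); emissions decline as the
    leachable pool is depleted."""
    cum = [e_inf * (1.0 - math.exp(-k * t)) for t in range(years + 1)]
    return [cum[t] - cum[t - 1] for t in range(1, years + 1)]
```

Summing the per-year emissions over a planning horizon recovers the cumulative release, which is how long-term loads from material cohorts applied in different years would be aggregated.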

  4. A compressed sensing method with analytical results for lidar feature classification

    NASA Astrophysics Data System (ADS)

    Allen, Josef D.; Yuan, Jiangbo; Liu, Xiuwen; Rahmes, Mark

    2011-04-01

    We present an innovative way to autonomously classify LiDAR points into bare earth, building, vegetation, and other categories. One desirable product of LiDAR data is the automatic classification of the points in the scene. Our algorithm automatically classifies scene points using compressed sensing methods via Orthogonal Matching Pursuit algorithms, utilizing a generalized K-Means clustering algorithm, to extract buildings and foliage from a Digital Surface Model (DSM). This technology reduces manual editing while being cost-effective for large-scale automated global scene modeling. Quantitative analyses are provided using Receiver Operating Characteristic (ROC) curves to show the probability of detection and false alarm for building versus vegetation classification. Histograms are shown with sample-size metrics. Our inpainting algorithms then fill the voids where buildings and vegetation were removed, utilizing Computational Fluid Dynamics (CFD) techniques and Partial Differential Equations (PDE) to create an accurate Digital Terrain Model (DTM) [6]. Inpainting preserves building height contour consistency and edge sharpness of identified inpainted regions. Qualitative results illustrate other benefits, such as terrain inpainting's unique ability to minimize or eliminate undesirable terrain data artifacts.
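
    The record's pipeline pairs Orthogonal Matching Pursuit (OMP) with K-Means-derived dictionaries; the authors' exact implementation is not given. As an illustrative sketch, a minimal NumPy OMP that greedily selects dictionary atoms and re-fits the coefficients by least squares might look like:

```python
import numpy as np

def omp(D, y, n_nonzero):
    """Greedy sparse coding: at each step pick the dictionary atom
    (column of D) most correlated with the current residual, then
    re-fit all selected atoms jointly by least squares."""
    residual = y.astype(float).copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x[support] = coef
    return x
```

In a LiDAR setting, y would be a feature vector for a point or patch, and the sparse code x (which class-specific atoms are active) would drive the building-versus-vegetation decision.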

  5. Numerical Study of Rotating Turbulence with External Forcing

    NASA Technical Reports Server (NTRS)

    Yeung, P. K.; Zhou, Ye

    1998-01-01

    Direct numerical simulations at 256^3 resolution have been carried out to study the response of isotropic turbulence to the concurrent effects of solid-body rotation and numerical forcing at the large scales. Because energy transfer to the smaller scales is weakened by rotation, energy input from forcing gradually builds up at the large scales, causing the overall kinetic energy to increase. At intermediate wavenumbers the energy spectrum undergoes a transition from a limited k^(-5/3) inertial range to the k^(-2) scaling recently predicted in the literature. Although the Reynolds stress tensor remains approximately isotropic and three-component, evidence for anisotropy and quasi-two-dimensionality in length scales and spectra in different velocity components and directions is strong. The small scales are found to deviate from local isotropy, primarily as a result of anisotropic transfer to the high wavenumbers. To understand the spectral dynamics of this flow we study the detailed behavior of nonlinear triadic interactions in wavenumber space. Spectral transfer in the velocity component parallel to the axis of rotation is qualitatively similar to that in non-rotating turbulence; however, the perpendicular component is characterized by a greatly suppressed energy cascade at high wavenumbers and a local reverse transfer at the largest scales. The broader implications of this work are briefly addressed.
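
    The reported transition from k^(-5/3) to k^(-2) scaling can be diagnosed by fitting a power-law exponent to the spectrum over a chosen wavenumber band. A minimal sketch, assuming arrays of wavenumbers k and spectral energies E(k):

```python
import numpy as np

def spectral_slope(k, E):
    """Power-law exponent alpha in E(k) ~ k**alpha, from a linear
    least-squares fit in log-log space over the given band."""
    alpha, _ = np.polyfit(np.log(k), np.log(E), 1)
    return float(alpha)
```

Fitting separate bands of the simulated spectrum would distinguish an inertial-range slope near -5/3 from the rotation-modified slope near -2.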

  6. Experiments in engagement: Designing public engagement with science and technology for capacity building.

    PubMed

    Selin, Cynthia; Rawlings, Kelly Campbell; de Ridder-Vignone, Kathryn; Sadowski, Jathan; Altamirano Allende, Carlo; Gano, Gretchen; Davies, Sarah R; Guston, David H

    2017-08-01

    Public engagement with science and technology is now widely used in science policy and communication. Touted as a means of enhancing democratic discussion of science and technology, analysis of public engagement with science and technology has shown that it is often weakly tied to scientific governance. In this article, we suggest that the notion of capacity building might be a way of reframing the democratic potential of public engagement with science and technology activities. Drawing on literatures from public policy and administration, we outline how public engagement with science and technology might build citizen capacity, before using the notion of capacity building to develop five principles for the design of public engagement with science and technology. We demonstrate the use of these principles through a discussion of the development and realization of the pilot for a large-scale public engagement with science and technology activity, the Futurescape City Tours, which was carried out in Arizona in 2012.

  7. Results from the Phoenix Urban Heat Island (UHI) experiment: effects at the local, neighbourhood and urban scales

    NASA Astrophysics Data System (ADS)

    di Sabatino, S.; Leo, L. S.; Hedquist, B. C.; Carter, W.; Fernando, H. J. S.

    2009-04-01

    This paper reports on the analysis of results from a large urban heat island (UHI) experiment performed in Phoenix (AZ) in April 2008. From 1960 to 2000, the city of Phoenix experienced a minimum-temperature rise of 0.47 °C per decade, one of the highest rates in the world for a city of this size (Golden, 2004). Contemporaneously, the city has expanded rapidly, and large portions of the land and desert vegetation have been replaced by buildings, asphalt and concrete (Brazel et al., 2007; Emmanuel and Fernando, 2007). Moreover, model predictions show that minimum air temperatures for the Phoenix metropolitan area in future years might exceed 38 °C. To support general conclusions about, and mitigation strategies for, the UHI phenomenon in Phoenix and other cities in hot arid climates, an intensive experiment was conducted on 4-5 April 2008 to collect surface and ambient temperatures within various landscapes in Central Phoenix. Inter alia, infrared thermography (IRT) was used for UHI mapping. The aim was to investigate UHI modifications within the city of Phoenix at three spatial scales, i.e. the local (Central Business District, CBD), the neighborhood and the city scales. This was achieved by combining IRT measurements taken at ground level by mobile equipment (automobile-mounted and pedicab) and at high elevation by a helicopter. At the local scale, detailed thermographic images of about twenty building façades and several street canyons were collected. In total, about two thousand images were taken during the 24-hour campaign. Image analysis provides detailed information on building surface and pavement temperatures at fine resolution (Hedquist et al., 2009; Di Sabatino et al., 2009). This unique dataset allows several investigations of the dependence of local air temperature on albedo, building thermal inertia, building shape and orientation, and sky view factors. 
In addition, the mosaic of building façade temperatures is being analyzed in terms of local buoyancy fluxes, and possible wind flow modifications by such thermally driven flows will be elucidated. The results are of consequence for understanding the microclimate of large cities, in order to derive urbanization schemes for numerical models and to set up suitable heat mitigation strategies. REFERENCES Brazel, A.J., Gober, P., Lee, S., Grossman-Clarke, S., Zehnder, J., Hedquist, B. and Comparri, E. 2007: Dynamics and determinants of urban heat island change (1990-2004) within Phoenix, Arizona, USA. Climate Research 33, 171-182. Di Sabatino, S., Hedquist, B.C., Carter, W., Leo, L.S. and Fernando, H.J.S. 2009: Phoenix urban heat island experiment: effects of built elements. Proceedings of the Eighth Symposium on the Urban Environment, Phoenix, Arizona. Emmanuel, R. and Fernando, H.J.S. 2007: Effects of urban form and thermal properties in urban heat island mitigation in hot humid and hot arid climates: the cases of Colombo, Sri Lanka and Phoenix, USA. Climate Research 34, 241-251. Golden, J.S. 2004: The built environment induced urban heat island in rapidly urbanizing arid regions: a sustainable urban engineering complexity. Environmental Sciences 1(4): 321-349. Hedquist, B.C., Brazel, A.J., Di Sabatino, S., Carter, W. and Fernando, H.J.S. 2009: Phoenix urban heat island experiment: micrometeorological aspects. Proceedings of the Eighth Symposium on the Urban Environment, Phoenix, Arizona.

  8. Mortality during a Large-Scale Heat Wave by Place, Demographic Group, Internal and External Causes of Death, and Building Climate Zone.

    PubMed

    Joe, Lauren; Hoshiko, Sumi; Dobraca, Dina; Jackson, Rebecca; Smorodinsky, Svetlana; Smith, Daniel; Harnly, Martha

    2016-03-09

    Mortality increases during periods of elevated heat. Identification of vulnerable subgroups by demographics, causes of death, and geographic regions, including deaths occurring at home, is needed to inform public health prevention efforts. We calculated mortality relative risks (RRs) and excess deaths associated with a large-scale California heat wave in 2006, comparing deaths during the heat wave with reference days. For total (all-place) and at-home mortality, we examined risks by demographic factors, internal and external causes of death, and building climate zones. During the heat wave, 582 excess deaths occurred, a 5% increase over expected (RR = 1.05, 95% confidence interval (CI) 1.03-1.08). Sixty-six percent of excess deaths occurred at home (RR = 1.12, CI 1.07-1.16). Total mortality risk was higher among those aged 35-44 years than among those aged ≥65, and among Hispanics than among whites. Deaths from external causes increased more sharply (RR = 1.18, CI 1.10-1.27) than deaths from internal causes (RR = 1.04, CI 1.02-1.07). Geographically, risk varied by building climate zone; the highest risks of at-home death occurred in the northernmost coastal zone (RR = 1.58, CI 1.01-2.48) and the southernmost zone of California's Central Valley (RR = 1.43, CI 1.21-1.68). Heat wave mortality risk varied across subpopulations, and some patterns of vulnerability differed from those previously identified. Public health efforts should also address at-home mortality, non-elderly adults, external causes, and at-risk geographic regions.
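
    The analysis rests on relative risks with confidence intervals. As a sketch of the standard cumulative-incidence estimator with a Wald interval on the log scale (the study's exact method, comparing heat-wave days with reference days, may differ; the counts below are hypothetical, not the study's data):

```python
import math

def relative_risk(a, n1, b, n0):
    """Relative risk with a 95% Wald CI: a deaths out of n1 in the
    exposed group, b deaths out of n0 in the reference group."""
    rr = (a / n1) / (b / n0)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n0)  # SE of log(RR)
    half = 1.96 * se
    return rr, rr * math.exp(-half), rr * math.exp(half)
```

An interval whose lower bound exceeds 1 (as for the zones cited above) indicates a statistically significant excess risk.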

  9. Energy efficiency design strategies for buildings with grid-connected photovoltaic systems

    NASA Astrophysics Data System (ADS)

    Yimprayoon, Chanikarn

    The building sector in the United States represents more than 40% of the nation's energy consumption. Energy efficiency design strategies and renewable energy are keys to reducing building energy demand. Grid-connected photovoltaic (PV) systems installed on buildings have been the fastest growing market in the PV industry. This growth poses challenges for buildings seeking to serve this market sector. Electricity produced from solar energy is intermittent. Matching building electricity demand with PV output can increase PV system efficiency. Through experimental methods and case studies, computer simulations were used to investigate the priorities of energy efficiency design strategies that decrease electricity demand while producing load profiles that match the unique output profiles of PV. Three building types (residential, commercial, and industrial) of varying sizes and use patterns located in 16 climate zones were modeled according to ASHRAE 90.1 requirements. Buildings were analyzed individually and as a group. Complying with ASHRAE energy standards can reduce annual electricity consumption by at least 13%. With energy efficiency design strategies, the reduction could reach up to 65%, making it possible for PV systems to meet the reduced demands of residential and industrial buildings. Peak electricity demand reduction could be up to 71% with the integration of strategies and PV. Reducing lighting power density was the best single strategy, with high overall performance. Combined strategies, such as zero-energy building design, are also recommended. Electricity consumption reductions are the sum of the reductions from strategies and PV output. However, peak electricity reductions were less than their sum because the strategies reduced peaks at different times. The potential for grid stress reduction is significant. Investment incentives from governments and utilities are necessary. 
PV system sizes for net-metering interconnection should not be limited by the legislation existing in some states. Data from this study provide insight into the impacts of applying energy efficiency design strategies in buildings with grid-connected PV systems. With the current transition from traditional electric grids to future smart grids, this information, together with a large database of various building conditions, enables the investigations needed by governments or utilities to implement measures and policies in large-scale communities.
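
    Matching demand with PV output can be quantified with simple profile metrics. A sketch, assuming hourly demand and PV output arrays in the same units; the two metrics below (self-consumption and peak-demand reduction) are illustrative definitions, not the dissertation's:

```python
import numpy as np

def match_metrics(demand, pv):
    """Self-consumption (fraction of PV generation used on site)
    and peak-demand reduction for matched hourly profiles."""
    demand = np.asarray(demand, float)
    pv = np.asarray(pv, float)
    used = np.minimum(demand, pv)          # PV consumed on site
    net = demand - used                    # residual grid demand
    self_consumption = used.sum() / pv.sum()
    peak_reduction = 1.0 - net.max() / demand.max()
    return self_consumption, peak_reduction
```

As the abstract notes, efficiency strategies and PV can each cut the peak at different hours, so the combined peak reduction is generally less than the sum of the individual reductions.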

  10. What's exposed? Mapping elements at risk from space

    NASA Astrophysics Data System (ADS)

    Taubenböck, Hannes; Klotz, Martin; Geiß, Christian

    2014-05-01

    The world has suffered severe natural disasters over the last decade. The earthquake in Haiti in 2010 and typhoon Haiyan, which hit the Philippines in 2013, are among the most prominent examples of recent years. Especially in developing countries, knowledge of the amount, location or type of exposed elements or people is often not available. (Geo-)data are mostly inaccurate, generalized, not up to date or not available at all. Thus, fast and effective disaster management is often delayed until the necessary geo-data allow an assessment of affected people, buildings, infrastructure and their respective locations. In the last decade, Earth observation data and methods have developed a product portfolio ranging from low-resolution land cover datasets to high-resolution, spatially accurate building inventories used to classify elements at risk or even to assess population densities indirectly. This presentation gives an overview of currently available products and EO-based capabilities from the global to the local scale. On the global to regional scale, remote sensing derived geo-products help to approximate the inventory of elements at risk in its spatial extent and abundance through mapping and modelling approaches based on land cover or related spatial attributes such as night-time illumination or fractions of impervious surfaces. The capabilities and limitations for mapping physical exposure are discussed in detail using the example of DLR's 'Global Urban Footprint' initiative. On the local scale, the potential of remote sensing lies particularly in the generation of spatially and thematically accurate building inventories for detailed analysis of the building stock's physical exposure. Even vulnerability-related indicators can be derived. Indicators such as building footprint, height, shape characteristics, roof materials, location, and construction age and structure type have already been combined with civil engineering approaches to assess building stability for large areas. 
Last-generation optical sensors in particular, often in combination with digital surface models, feature very high geometric resolutions and are perceived as advantageous for operational applications, especially for small to medium-scale urban areas. With regard to user-oriented product generation in the FP-7 project SENSUM, a multi-scale and multi-source reference database has been set up to systematically screen available products, global to local, with regard to data availability in data-rich and data-poor countries. Thus, the higher-ranking goal of this presentation is to provide a systematic overview of EO-based datasets and their individual capabilities and limitations with respect to spatial, temporal and thematic detail, to support decision-making before, during and after natural disasters.

  11. Tracking a head-mounted display in a room-sized environment with head-mounted cameras

    NASA Astrophysics Data System (ADS)

    Wang, Jih-Fang; Azuma, Ronald T.; Bishop, Gary; Chi, Vernon; Eyles, John; Fuchs, Henry

    1990-10-01

    This paper presents our efforts to accurately track a Head-Mounted Display (HMD) in a large environment. We review our current benchtop prototype (introduced in [WCF90]), then describe our plans for building the full-scale system. Both systems use an inside-out optical tracking scheme, where lateral-effect photodiodes mounted on the user's helmet view flashing infrared beacons placed in the environment. Church's method uses the measured 2D image positions and the known 3D beacon locations to recover the 3D position and orientation of the helmet in real time. We discuss the implementation and performance of the benchtop prototype. The full-scale system design includes ceiling panels that hold the infrared beacons and a new sensor arrangement of two photodiodes with holographic lenses. In the full-scale system, the user can walk almost anywhere under the grid of ceiling panels, making the working volume nearly as large as the room.

  12. Energy harvesting: small scale energy production from ambient sources

    NASA Astrophysics Data System (ADS)

    Yeatman, Eric M.

    2009-03-01

    Energy harvesting - the collection of otherwise unexploited energy in the local environment - is attracting increasing attention for the powering of electronic devices. While the power levels that can be reached are typically modest (microwatts to milliwatts), the key motivation is to avoid the need for battery replacement or recharging in portable or inaccessible devices. Wireless sensor networks are a particularly important application: the availability of essentially maintenance free sensor nodes, as enabled by energy harvesting, will greatly increase the feasibility of large scale networks, in the paradigm often known as pervasive sensing. Such pervasive sensing networks, used to monitor buildings, structures, outdoor environments or the human body, offer significant benefits for large scale energy efficiency, health and safety, and many other areas. Sources of energy for harvesting include light, temperature differences, and ambient motion, and a wide range of miniature energy harvesters based on these sources have been proposed or demonstrated. This paper reviews the principles and practice in miniature energy harvesters, and discusses trends, suitable applications, and possible future developments.

  13. Building up the spin - orbit alignment of interacting galaxy pairs

    NASA Astrophysics Data System (ADS)

    Moon, Jun-Sung; Yoon, Suk-Jin

    2018-01-01

    Galaxies are not just randomly distributed throughout space. Instead, they are in alignment over a wide range of scales, from the cosmic web down to pairs of galaxies. Motivated by recent findings that the spin and orbital angular momentum vectors of galaxy pairs tend to be parallel, we investigate the spin-orbit orientation in close pairs using the Illustris cosmological simulation. We find that since z ~ 1, the parallel alignment has become progressively stronger with time through repetitive encounters. Pair interactions are preferentially prograde at z = 0 (at over 5 sigma significance). The prograde fraction at z = 0 is larger for pairs influenced more heavily by each other during their evolution. We find no correlation between the spin-orbit orientation and the surrounding large-scale structure. Our results favor the scenario in which the alignment in close pairs is built up later by tidal interactions rather than imprinted by primordial torquing from the large-scale structure.
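
    The prograde/retrograde classification reduces to the sign of the dot product between a galaxy's spin vector and the pair's orbital angular momentum. A minimal sketch, assuming both inputs are 3-D angular-momentum vectors:

```python
import numpy as np

def is_prograde(spin, orbital_L):
    """True when the spin and orbital angular-momentum vectors
    point into the same hemisphere (positive dot product), i.e.
    the galaxy rotates in the same sense as the pair's orbit."""
    return float(np.dot(spin, orbital_L)) > 0.0
```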

  14. Automated 3D structure composition for large RNAs

    PubMed Central

    Popenda, Mariusz; Szachniuk, Marta; Antczak, Maciej; Purzycka, Katarzyna J.; Lukasiak, Piotr; Bartol, Natalia; Blazewicz, Jacek; Adamiak, Ryszard W.

    2012-01-01

    Understanding the numerous functions that RNAs play in living cells depends critically on knowledge of their three-dimensional structure. Due to the difficulties in experimentally assessing the structures of large RNAs, there is currently great demand for new high-resolution structure prediction methods. We present a novel method for the fully automated prediction of RNA 3D structures from a user-defined secondary structure. The concept is founded on a machine translation system. The translation engine operates on the RNA FRABASE database, tailored into a dictionary relating RNA secondary structure and tertiary structure elements. The translation algorithm is very fast: an initial 3D structure is composed within seconds on a single processor. The method ensures the prediction of large RNA 3D structures of high quality. Our approach needs neither structural templates nor the RNA sequence alignment required for comparative methods. This enables the building of as-yet-unresolved native as well as artificial RNA structures. The method is implemented in a publicly available, user-friendly server, RNAComposer, which works in an interactive mode and a batch mode. The batch mode is designed for large-scale modelling and accepts atomic distance restraints. Presently, the server is set to build RNA structures of up to 500 residues. PMID:22539264

  15. Control of Smart Building Using Advanced SCADA

    NASA Astrophysics Data System (ADS)

    Samuel, Vivin Thomas

    Complete control of a building requires a proper SCADA implementation and an optimization strategy, together with a well-designed channel between the communication protocol and the SCADA system for better communication and efficiency. This paper concentrates mainly on the interface between the communication protocol and the SCADA implementation, from which an optimization and energy-savings approach is derived for large-scale industrial buildings. The communication channel is used to control the building remotely from a distant location. Temperature values and equipment power ratings are considered so that threshold values can be set for implementing the fault detection and diagnostics (FDD) technique. Building management systems have become vital for maintaining any building and for safety. Smart buildings comprise various distinct features, including complete automation systems, office building controls, and data center controls. ELCs communicate the building's load values to a remote server over an Ethernet communication channel. Driven by demand fluctuation and peak voltage, loads operate differently, increasing the consumption rate and thus the annual consumption bill. In modern buildings, saving energy and reducing the consumption bill are essential for long, reliable operation. Equipment is monitored regularly, and the optimization strategy is implemented for cost reduction through the automation system, reducing the annual cost and increasing load lifetime.
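
    The threshold-based FDD step described above can be sketched as a simple rule check. The equipment names, readings, and limits below are hypothetical, purely for illustration:

```python
def fdd_alerts(readings, limits):
    """Rule-based fault detection: flag any equipment whose latest
    temperature or power reading exceeds its configured threshold.

    readings: {name: (temperature, power)}
    limits:   {name: (max_temperature, max_power)}
    """
    alerts = []
    for name, (temp, power) in readings.items():
        t_max, p_max = limits[name]
        if temp > t_max:
            alerts.append((name, "temperature"))
        if power > p_max:
            alerts.append((name, "power"))
    return alerts
```

In a SCADA deployment, the readings would arrive over the communication channel (e.g. from ELCs via Ethernet) and the alerts would drive operator notifications or automated load control.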

  16. Expedient Encapsulation: Protective Structural Coatings

    DTIC Science & Technology

    2004-11-16

    strippable coating applied to the interior of a portable ‘office’ shed. The barrier polymer was assessed for its ability to significantly diminish the...proof-of-concept study that will assess the application of two select coatings on a large-scale (approximately 8’ x 8’ x 8’) shelter interior...available coating applied to a portable storage building. The coating selected for use in the study was developed for radiological surface

  17. JPRS Report, Soviet Union, Economic Affairs

    DTIC Science & Technology

    1988-10-18

    ["Commodities—The Mirror of Cost Accounting"] [Text] A number of large-scale decisions directed toward increasing the production of high-quality...suitable in the sphere of scientific research and experimental design work. It is known, for example, that the number of blueprints, specifications, or...the situation, Yu. Kozyrev, deputy chief of the Department for Problems of the Machine Building Complex of the USSR State Committee for Science and

  18. A Holistic Management Architecture for Large-Scale Adaptive Networks

    DTIC Science & Technology

    2007-09-01

    transmission and processing overhead required for management. The challenges of building models to describe dynamic systems are well-known to the field of...increases the challenge of finding a simple approach to assessing the state of the network. Moreover, the performance state of one network link may be...challenging. These obstacles indicate the need for a less comprehensive-analytical, more systemic-holistic approach to managing networks. This approach might

  19. A Composite Theoretical Model Showing Potential Hidden Costs of Online Distance Education at Historically Black Colleges and Universities: With Implications for Building Cost-Resistant Courses and Programs

    ERIC Educational Resources Information Center

    Arroyo, Andrew T.

    2014-01-01

    Growing numbers of historically Black colleges and universities (HBCUs) are entering the arena of online distance education. Some are seeking to grow large-scale programs that can compete for market share with historically White institutions and for-profit schools. This theoretical essay develops a composite model to assist HBCU administrators in…

  20. Flexible Dye-Sensitized Solar Cell Based on Vertical ZnO Nanowire Arrays

    PubMed Central

    2011-01-01

    Flexible dye-sensitized solar cells are fabricated using vertically aligned ZnO nanowire arrays that are transferred onto ITO-coated poly(ethylene terephthalate) substrates using a simple peel-off process. The solar cells demonstrate an energy conversion efficiency of 0.44% with good bending tolerance. This technique paves a new route for building large-scale cost-effective flexible photovoltaic and optoelectronic devices. PMID:27502660

  1. Compact wavelength-selective optical switch based on digital optical phase conjugation.

    PubMed

    Li, Zhiyang; Claver, Havyarimana

    2013-11-15

    In this Letter, we show that digital optical phase conjugation might be utilized to construct a new kind of wavelength-selective switch. When incorporated with a multimode interferometer, these switches have wide bandwidth, high tolerance of fabrication error, and low polarization dependence. They might help to build large-scale multiwavelength non-blocking switching systems, or even to fabricate an optical cross-connect or routing system on a chip.

  2. Occupancy mapping and surface reconstruction using local Gaussian processes with Kinect sensors.

    PubMed

    Kim, Soohwan; Kim, Jonghyuk

    2013-10-01

    Although RGB-D sensors have been successfully applied to visual SLAM and surface reconstruction, most applications aim at visualization. In this paper, we propose a novel method of building continuous occupancy maps and reconstructing surfaces in a single framework, for both navigation and visualization. In particular, we apply a Bayesian nonparametric approach, Gaussian process classification, to occupancy mapping. However, it suffers from a high computational complexity of O(n^3) + O(n^2 m), where n and m are the numbers of training and test data, respectively, limiting its use for large-scale mapping with the huge training sets that are common with high-resolution RGB-D sensors. Therefore, we partition both training and test data with a coarse-to-fine clustering method and apply Gaussian processes to each local cluster. In addition, we treat the Gaussian processes as implicit functions and extract iso-surfaces from the resulting scalar fields (the continuous occupancy maps) using marching cubes. In this way, we are able to build two types of map representation within a single Gaussian process framework. Experimental results with 2-D simulated data show that the accuracy of our approximate method is comparable to that of previous work, while the computational time is dramatically reduced. We also demonstrate our method on 3-D real data to show its feasibility in large-scale environments.
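
    The partition-then-local-GP idea can be sketched in NumPy. For brevity this sketch uses 1-D inputs, equal-width partitions instead of the paper's coarse-to-fine clustering, and GP regression rather than classification; it only illustrates how independent solves on small local kernel matrices replace one large n x n solve:

```python
import numpy as np

def rbf(a, b, ls=0.1):
    """Squared-exponential kernel between 1-D input vectors."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def local_gp_predict(x_train, y_train, x_test, n_clusters=4, noise=1e-6):
    """Split the input range into equal-width clusters and run an
    independent GP regression in each, so every linear solve is on
    a small local kernel matrix. Assumes every cluster containing
    test points also contains training points."""
    x_train, y_train = np.asarray(x_train, float), np.asarray(y_train, float)
    x_test = np.asarray(x_test, float)
    edges = np.linspace(x_train.min(), x_train.max(), n_clusters + 1)
    b_tr = np.clip(np.digitize(x_train, edges) - 1, 0, n_clusters - 1)
    b_te = np.clip(np.digitize(x_test, edges) - 1, 0, n_clusters - 1)
    y_pred = np.empty_like(x_test)
    for i in range(n_clusters):
        tr, te = b_tr == i, b_te == i
        if not te.any():
            continue
        K = rbf(x_train[tr], x_train[tr]) + noise * np.eye(int(tr.sum()))
        alpha = np.linalg.solve(K, y_train[tr])      # small local solve
        y_pred[te] = rbf(x_test[te], x_train[tr]) @ alpha
    return y_pred
```

Each cluster of size m costs O(m^3) instead of the full O(n^3), which is the computational saving the paper exploits.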

  3. Geo-information for sustainable urban development of Greater Dhaka City, Bangladesh

    NASA Astrophysics Data System (ADS)

    Günther, Andreas; Asaduzzaman, Atm; Bahls, Rebecca; Ludwig, Rüdiger; Ashraful Kamal, Mohammad; Nahar Faruqa, Nurun

    2015-04-01

    Greater Dhaka City (including Dhaka and five adjacent municipal areas) is one of the fastest developing urban regions in the world. Densely build-up areas in the developed metropolitan area of Dhaka City are subject to extensive restructuring as common six-storied buildings are replaced by higher and heavier constructions. Additional stories are built on existing houses, frequently exceeding the allowable bearing pressure on the subsoil as supported by the foundations. In turn, newly developing areas are projected in marshy areas modified by extensive, largely unengineered landfills. In many areas, these terrains bear unfavorable building ground conditions, and reliable geospatial information is a major prerequisite for risk-sensitive urban planning. Within a collaborative technical cooperation project between Bangladesh and Germany, BGR supports GSB in the provision of geo-information for the Capital Development authority (RAJUK). For general urban planning, RAJUK successively develops a detailed area plan (DAP) at scale 1 : 50000 for the whole Greater Dhaka City area. Geospatial information have not been considered in the present DAP. Within the project, GSB prepared a detailed geomorphologic map matching the DAP both in areal extent and scale. The geomorphological setting can be used as an important spatial proxy for the characterization of the subsurface since highly segmented, elevated terraces consisting of consolidated sandy Pliocene deposits overlain by stiff Plio-Pleistocene sediments are sharply bordered by low lying-areas. The floodplain and marsh areas are consisting of thick, mechanically weak Holocene fluvial sandy-silty sediments that are sometimes alternated by organic layers. 
A first expert-based engineering geological reclassification of the geomorphological map, resulting in five building ground suitability classes, is strongly supported by spatial analysis of extensive archive borehole information consisting of depth-continuous standard penetration test (SPT) observations, engineering geological sample analyses and lithological profiles. The database compiled within the project currently contains more than 1,600 locations. Joining the spatial geomorphological information with the borehole data allows a specific characterization of the building ground classes in terms of bearing capacities for different foundation designs, earthquake-induced subsoil liquefaction potentials and depth-to-engineering-rock-head considerations. First-order hazard and cost scenarios for several general types of projected settlements can already be broadly evaluated with the data presented at small scale (the DAP scale). However, detailed building ground surveys have to be performed at larger spatial scales (1:10,000 to 1:5,000) in areas assigned for new settlements. These involve regularly spaced borehole observations, 3-D modeling of the subsurface and geophysical logging. Within the project, specific representative pilot areas in different geomorphological settings are defined where detailed geospatial building ground investigations are conducted, providing a robust basis for sustainable urban planning related to natural and technological hazards and their associated risks.

  4. Visualizing the history of living spaces.

    PubMed

    Ivanov, Yuri; Wren, Christopher; Sorokin, Alexander; Kaur, Ishwinder

    2007-01-01

    The technology available to building designers now makes it possible to monitor buildings on a very large scale. Video cameras and motion sensors are commonplace in practically every office space, and are slowly making their way into living spaces. The application of such technologies, in particular video cameras, while improving security, also violates privacy. On the other hand, motion sensors, while being privacy-conscious, typically do not provide enough information for a human operator to maintain the same degree of awareness about the space that can be achieved by using video cameras. We propose a novel approach in which we use a large number of simple motion sensors and a small set of video cameras to monitor a large office space. In our system we deployed 215 motion sensors and six video cameras to monitor the 3,000-square-meter office space occupied by 80 people for a period of about one year. The main problem in operating such systems is finding a way to present this highly multidimensional data, which includes both spatial and temporal components, to a human operator to allow browsing and searching recorded data in an efficient and intuitive way. In this paper we present our experiences and the solutions that we have developed in the course of our work on the system. We consider this work to be the first step in helping designers and managers of building systems gain access to information about occupants' behavior in the context of an entire building in a way that is only minimally intrusive to the occupants' privacy.

  5. Large Scale Simulation Platform for NODES Validation Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sotorrio, P.; Qin, Y.; Min, L.

    2017-04-27

    This report summarizes the Large Scale (LS) simulation platform created for the Eaton NODES project. The simulation environment consists of both a wholesale market simulator and a distribution simulator; it includes the CAISO wholesale market model and a PG&E footprint of 25-75 feeders to validate scalability under a scenario of 33% RPS in California with an additional 17% of DERs coming from distribution and customers. The simulator can generate hourly unit commitment, 5-minute economic dispatch, and 4-second AGC regulation signals. It is also capable of simulating more than 10k individual controllable devices. Simulated DERs include water heaters, EVs, residential and light commercial HVAC/buildings, and residential-level battery storage. Feeder-level voltage regulators and capacitor banks are also simulated for feeder-level real and reactive power management and Volt/Var control.

  6. A large-scale examination of the nature and efficacy of teachers' practices to engage parents: assessment, parental contact, and student-level impact.

    PubMed

    Seitsinger, Anne M; Felner, Robert D; Brand, Stephen; Burns, Amy

    2008-08-01

    As schools move forward with comprehensive school reform, parents' roles have shifted and been redefined. Parent-teacher communication is critical to student success, yet how schools and teachers contact parents is the subject of few studies. Evaluations of school-change efforts require reliable and useful measures of teachers' practices in communicating with parents. The structure of teacher-parent-contact practices was examined using data from multiple, longitudinal cohorts of schools and teachers from a large-scale project and found to be a reliable and stable measure of parent contact across building levels and localities. Teacher/school practices in contacting parents were found to be significantly related to parent reports of school contact performance and student academic adjustment and achievement. Implications for school improvement efforts are discussed.

  7. Large-scale synthesis of a novel tris(8-hydroxyquinoline) aluminum nanostructure.

    PubMed

    Tian, Xike; Fei, Jinbo; Pi, Zhenbang; Yang, Chao; Xiao, Zhidong; Zhang, Lide

    2006-08-01

    A novel tris(8-hydroxyquinoline) aluminum (AlQ3) nanostructure was prepared on a large scale at low cost by low-temperature physical vapor deposition (PVD). The morphologies, chemical bonding, and photoluminescence of the AlQ3 nanostructure were investigated by environmental scanning electron microscopy (ESEM), Fourier transform infrared (FT-IR) spectroscopy, and photoluminescence (PL) spectra, respectively. The AlQ3 nanostructure was composed of microspheres with nanowire clusters growing on their surfaces; the diameters of the microspheres and nanowires were about 5 μm and 80 nm, respectively. FT-IR results indicated that the AlQ3 molecule had strong thermal stability under the research conditions. The growth mechanism of the novel nanostructure is discussed. This organic nanostructure is believed to be attractive for building field-emission devices and other optical devices.

  8. Power Supply for Variable Frequency Induction Heating Using MERS Soft-Switching High Frequency Inverter

    NASA Astrophysics Data System (ADS)

    Isobe, Takanori; Kitahara, Tadayuki; Fukutani, Kazuhiko; Shimada, Ryuichi

    Variable frequency induction heating has great potential for industrial heating applications due to the possibility of achieving heating distribution control; however, large-scale induction heating with variable frequency has not yet been introduced into practical use. This paper proposes a high-frequency soft-switching inverter for induction heating that can achieve variable frequency operation. One challenge of variable frequency induction heating is the increased power electronics ratings it requires. This paper shows that the converter's current-source-type dc-link configuration and soft-switching characteristics make it possible to build a large-scale system with variable frequency capability. A 90-kVA, 150-1000 Hz variable frequency experimental power supply for steel strip induction heating was developed. Experiments confirmed the feasibility of variable frequency induction heating with the proposed converter and the advantages of variable frequency operation.

  9. A Prominence Puzzle Explained?

    NASA Astrophysics Data System (ADS)

    Yeates, A. R.; Mackay, D. H.; van Ballegooijen, A. A.

    2009-02-01

    Long-standing observations reveal a global organisation of the magnetic field direction in solar prominences (aka filaments), large clouds of cool dense plasma suspended in the Sun's hot corona. However, theorists have thus far been unable to explain the origin of this hemispheric pattern. In particular, simple shearing by large-scale surface motions would appear to lead to the wrong magnetic field direction. To explain the observations, we have developed a new model of the global magnetic field evolution in the solar corona over six months. For the first time our model can follow the build-up of magnetic helicity and shear on a global scale, driven by flux emergence and surface motions. The model is successful in predicting the correct magnetic field direction in the vast majority of prominences tested, and has enabled us to determine the key physical mechanisms behind the mysterious hemispheric pattern.

  10. Building-Resolved CFD Simulations for Greenhouse Gas Transport and Dispersion over Washington DC / Baltimore

    NASA Astrophysics Data System (ADS)

    Prasad, K.; Lopez-Coto, I.; Ghosh, S.; Mueller, K.; Whetstone, J. R.

    2015-12-01

    The North-East Corridor project aims to use a top-down inversion methodology to quantify sources of greenhouse gas (GHG) emissions over urban domains such as Washington DC / Baltimore with high spatial and temporal resolution. Atmospheric transport of tracer gases from an emission source to a tower-mounted receptor is usually simulated using the Weather Research and Forecasting (WRF) model. For such simulations, WRF employs a parameterized turbulence model and does not resolve the fine-scale dynamics generated by the flow around the buildings and communities comprising a large city. The NIST Fire Dynamics Simulator (FDS) is a computational fluid dynamics model that utilizes large eddy simulation methods to model flow around buildings at length scales much smaller than is practical with WRF. FDS has the potential to evaluate the impact of complex urban topography on near-field dispersion and mixing that are difficult to simulate with a mesoscale atmospheric model. Such capabilities may be important in determining urban GHG emissions using atmospheric measurements. A methodology has been developed to run FDS as a sub-grid-scale model within a WRF simulation. The coupling is based on nudging the FDS flow field towards that computed by WRF, and is currently limited to one-way coupling performed in an off-line mode. Using the coupled WRF / FDS model, NIST will investigate the effects of the urban canopy at horizontal resolutions of 10-20 m in a domain of 12 x 12 km. The coupled WRF-FDS simulations will be used to calculate the dispersion of tracer gases in the North-East Corridor and to evaluate the upwind areas that contribute to tower observations, referred to in the inversion community as influence functions. Results of this study will provide guidance regarding the importance of explicit simulations of urban atmospheric turbulence in obtaining accurate estimates of greenhouse gas emissions and transport.
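    The nudging step described above amounts to Newtonian relaxation: the fine-scale field is pulled toward the coarse (WRF) field on a relaxation timescale tau. The function below is a toy one-line illustration under that assumption, not the NIST coupling code; the name and parameters are illustrative.

    ```python
    import numpy as np

    def nudge(u_fine, u_coarse, dt, tau):
        # Newtonian relaxation (nudging): add a tendency proportional to the
        # difference between the coarse target field and the fine-scale field.
        # dt/tau controls how strongly the fine field is pulled per time step.
        return np.asarray(u_fine) + (np.asarray(u_coarse) - np.asarray(u_fine)) * (dt / tau)
    ```

    Iterating this drives the fine field toward the coarse one without instantaneously overwriting resolved detail; tau trades fidelity to the mesoscale driver against freedom for the building-resolved dynamics.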

  11. Analysis of 3d Building Models Accuracy Based on the Airborne Laser Scanning Point Clouds

    NASA Astrophysics Data System (ADS)

    Ostrowski, W.; Pilarska, M.; Charyton, J.; Bakuła, K.

    2018-05-01

    Creating 3D building models at large scale is becoming more popular and finds many applications. Nowadays, the broad term "3D building models" can be applied to several types of products: the well-known CityGML solid models (available at a few Levels of Detail), which are mainly generated from Airborne Laser Scanning (ALS) data, as well as 3D mesh models that can be created from both nadir and oblique aerial images. City authorities and national mapping agencies are interested in obtaining 3D building models. Apart from the completeness of the models, the accuracy aspect is also important. The final accuracy of a building model depends on various factors (accuracy of the source data, complexity of the roof shapes, etc.). In this paper, a methodology for inspecting datasets containing 3D models is presented. The proposed approach checks every building in the dataset against ALS point clouds, testing both accuracy and level of detail. Analysis of statistical parameters of the normal heights between the reference point cloud and the tested planes, combined with segmentation of the point cloud, provides a tool that can indicate which buildings, and which roof planes, do not fulfill the requirements of model accuracy and detail correctness. The proposed method was tested on two datasets: a solid model and a mesh model.
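    The per-plane check described above can be sketched as point-to-plane residual statistics. This is an illustrative toy only: the plane representation (unit normal n and offset d) and the 0.15 m RMSE tolerance are assumptions for the demo, not the paper's parameters.

    ```python
    import numpy as np

    def plane_residuals(points, plane):
        # Signed point-to-plane distances for a plane given as (n, d) with
        # |n| = 1, i.e. the point set {x : n.x + d = 0}.
        n, d = np.asarray(plane[0], float), float(plane[1])
        return points @ n + d

    def roof_plane_passes(points, plane, rmse_tol=0.15):
        # Flag a modeled roof plane as acceptable if the RMSE of the ALS
        # points assigned to it stays under the (assumed) tolerance in meters.
        r = plane_residuals(points, plane)
        return float(np.sqrt((r ** 2).mean())) <= rmse_tol
    ```

    Running this per segmented roof plane, and aggregating per building, yields the kind of pass/fail accuracy report the methodology produces.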

  12. Assessing high shares of renewable energies in district heating systems - a case study for the city of Herten

    NASA Astrophysics Data System (ADS)

    Aydemir, Ali; Popovski, Eftim; Bellstädt, Daniel; Fleiter, Tobias; Büchele, Richard

    2017-11-01

    Many earlier studies have assessed the district heating (DH) generation mix without explicitly taking into account future changes in the building stock and heat demand. The approach of this study consists of three steps that combine stock modeling, energy demand forecasting, and simulation of different energy technologies. First, a detailed residential building stock model for Herten is constructed by using remote sensing together with a typology for the German building stock. Second, a bottom-up simulation model is used which calculates the thermal energy demand based on energy-related investments in buildings in order to forecast the thermal demand up to 2050. Third, solar thermal fields in combination with large-scale heat pumps are sized as an alternative to the current coal-fired CHPs. We finally assess the cost of heat and CO2 reduction for these units in two scenarios which differ with regard to the DH expansion. It can be concluded that up to 2030 and 2050 a substantial reduction in buildings' heat demand due to improved building insulation is expected. The falling heat demand in the DH network substantially reduces the economic feasibility of new RES generation capacity. This reduction might be compensated by continuously connecting apartment buildings to the DH network until 2050.

  13. Performance/price estimates for cortex-scale hardware: a design space exploration.

    PubMed

    Zaveri, Mazad S; Hammerstrom, Dan

    2011-04-01

    In this paper, we revisit the concept of virtualization. Virtualization is useful for understanding and investigating the performance/price and other trade-offs related to the hardware design space. Moreover, it is perhaps the most important aspect of a hardware design space exploration. Such a design space exploration is a necessary part of the study of hardware architectures for large-scale computational models for intelligent computing, including AI, Bayesian, bio-inspired and neural models. A methodical exploration is needed to identify potentially interesting regions in the design space, and to assess the relative performance/price points of these implementations. As an example, in this paper we investigate the performance/price of (digital and mixed-signal) CMOS and hypothetical CMOL (nanogrid) technology based hardware implementations of human cortex-scale spiking neural systems. Through this analysis, and the resulting performance/price points, we demonstrate, in general, the importance of virtualization, and of doing these kinds of design space explorations. The specific results suggest that hybrid nanotechnology such as CMOL is a promising candidate to implement very large-scale spiking neural systems, providing a more efficient utilization of the density and storage benefits of emerging nano-scale technologies. In general, we believe that the study of such hypothetical designs/architectures will guide the neuromorphic hardware community towards building large-scale systems, and help guide research trends in intelligent computing, and computer engineering. Copyright © 2010 Elsevier Ltd. All rights reserved.

  14. Refrigeration Playbook. Heat Reclaim; Optimizing Heat Rejection and Refrigeration Heat Reclaim for Supermarket Energy Conservation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reis, Chuck; Nelson, Eric; Armer, James

    The purpose of this playbook and accompanying spreadsheets is to generalize the detailed CBP analysis and to put tools in the hands of experienced refrigeration designers to evaluate multiple applications of refrigeration waste heat reclaim across the United States. Supermarkets with large portfolios of similar buildings can use these tools to assess the impact of large-scale implementation of heat reclaim systems. In addition, the playbook provides best practices for implementing heat reclaim systems to achieve the best long-term performance possible. It includes guidance on operations and maintenance as well as measurement and verification.

  15. Three-dimensional crossbar arrays of self-rectifying Si/SiO 2/Si memristors

    DOE PAGES

    Li, Can; Han, Lili; Jiang, Hao; ...

    2017-06-05

    Memristors are promising building blocks for next-generation memory, unconventional computing systems and beyond. The materials currently common in memristors are not necessarily compatible with the silicon-dominant complementary metal-oxide-semiconductor (CMOS) technology. Furthermore, external selector devices or circuits are usually required for large memristor arrays to function properly, resulting in increased circuit complexity. Here we demonstrate fully CMOS-compatible, all-silicon-based and self-rectifying memristors that negate the need for external selectors in large arrays. The device consists of p- and n-type doped single-crystalline silicon electrodes and a thin chemically produced silicon oxide switching layer. It exhibits repeatable resistance switching behavior with a high rectifying ratio (10⁵), a high ON/OFF conductance ratio (10⁴) and attractive retention at 300 °C. We further build a 5-layer 3-dimensional (3D) crossbar array of 100 nm memristors by stacking fluid-supported silicon membranes. The CMOS compatibility and self-rectifying behavior open up opportunities for mass production of memristor arrays and 3D hybrid circuits on full-wafer-scale silicon and flexible substrates without increasing circuit complexity.

  16. Cortical circuitry implementing graphical models.

    PubMed

    Litvak, Shai; Ullman, Shimon

    2009-11-01

    In this letter, we develop and simulate a large-scale network of spiking neurons that approximates the inference computations performed by graphical models. Unlike previous related schemes, which used sum and product operations in either the log or linear domains, the current model uses an inference scheme based on the sum and maximization operations in the log domain. Simulations show that using these operations, a large-scale circuit, which combines populations of spiking neurons as basic building blocks, is capable of finding close approximations to the full mathematical computations performed by graphical models within a few hundred milliseconds. The circuit is general in the sense that it can be wired for any graph structure, it supports multistate variables, and it uses standard leaky integrate-and-fire neuronal units. Following previous work, which proposed relations between graphical models and the large-scale cortical anatomy, we focus on the cortical microcircuitry and propose how anatomical and physiological aspects of the local circuitry may map onto elements of the graphical model implementation. We discuss in particular the roles of three major types of inhibitory neurons (small fast-spiking basket cells, large layer 2/3 basket cells, and double-bouquet neurons), subpopulations of strongly interconnected neurons with their unique connectivity patterns in different cortical layers, and the possible role of minicolumns in the realization of the population-based maximum operation.

  17. Structural Element Testing in Support of the Design of the NASA Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Kellas, Sotiris; Jackson, Wade C.; Thesken, John C.; Schleicher, Eric; Wagner, Perry; Kirsch, Michael T.

    2012-01-01

    In January 2007, the NASA Administrator and Associate Administrator for the Exploration Systems Mission Directorate chartered the NASA Engineering and Safety Center (NESC) to design, build, and test a full-scale Composite Crew Module (CCM). For the design and manufacturing of the CCM, the team adopted the building block approach, in which design and manufacturing risks were mitigated through manufacturing trials and structural testing at various levels of complexity. Following NASA's Structural Design Verification Requirements, a further objective was the verification of design analysis methods and the provision of design data for critical structural features. Test articles increasing in complexity, from basic material characterization coupons through structural feature elements and large structural components to full-scale structures, were evaluated. This paper discusses only four element tests, three of which include joints and one of which includes a tapering honeycomb core detail. For each test series, specimen details, instrumentation, test results, a brief analysis description, test-analysis correlation and conclusions are included.

  18. Host population genetic structure and zooxanthellae diversity of two reef-building coral species along the Florida Reef Tract and wider Caribbean

    NASA Astrophysics Data System (ADS)

    Baums, I. B.; Johnson, M. E.; Devlin-Durante, M. K.; Miller, M. W.

    2010-12-01

    In preparation for a large-scale coral restoration project, we surveyed host population genetic structure and symbiont diversity of two reef-building corals in four reef zones along the Florida reef tract (FRT). There was no evidence for coral population subdivision along the FRT in Acropora cervicornis or Montastraea faveolata based on microsatellite markers. However, in A. cervicornis, significant genetic differentiation was apparent when extending the analysis to broader scales (Caribbean). Clade diversity of the zooxanthellae differed along the FRT. A. cervicornis harbored mostly clade A with clade D zooxanthellae being prominent in colonies growing inshore and in the mid-channel zones that experience greater temperature fluctuations and receive significant nutrient and sediment input. M. faveolata harbored a more diverse array of symbionts, and variation in symbiont diversity among four habitat zones was more subtle but still significant. Implications of these results are discussed for ongoing restoration and conservation work.

  19. Performance-Based Seismic Retrofit of Soft-Story Woodframe Buildings Using Energy-Dissipation Systems

    NASA Astrophysics Data System (ADS)

    Tian, Jingjing

    Low-rise woodframe buildings with disproportionately flexible ground stories represent a significant percentage of the building stock in seismically vulnerable communities in the Western United States. These structures have a readily identifiable structural weakness at the ground level due to an asymmetric distribution of large openings in the perimeter wall lines and to a lack of interior partition walls, resulting in a soft story condition that makes the structure highly susceptible to severe damage or collapse under design-level earthquakes. The conventional approach to retrofitting such structures is to increase the ground story stiffness. An alternate approach is to increase the energy dissipation capacity of the structure via the incorporation of supplemental energy dissipation devices (dampers), thereby relieving the energy dissipation demands on the framing system. Such a retrofit approach is consistent with a Performance-Based Seismic Retrofit (PBSR) philosophy through which multiple performance levels may be targeted. The effectiveness of such a retrofit is presented via examination of the seismic response of a full-scale four-story building that was tested on the outdoor shake table at NEES-UCSD and a full-scale three-story building that was tested using slow pseudo-dynamic hybrid testing at NEES-UB. In addition, a Direct Displacement Design (DDD) methodology was developed as an improvement over current DDD methods by considering torsion, with or without the implementation of damping devices, in an attempt to avoid the computational expense of nonlinear time-history analysis (NLTHA) and thus facilitating widespread application of PBSR in engineering practice.

  20. What scaling means in wind engineering: Complementary role of the reduced scale approach in a BLWT and the full scale testing in a large climatic wind tunnel

    NASA Astrophysics Data System (ADS)

    Flamand, Olivier

    2017-12-01

    Wind engineering problems are commonly studied by wind tunnel experiments at a reduced scale. This introduces several limitations and calls for careful planning of the tests and interpretation of the experimental results. The talk first revisits the similitude laws and discusses how they are actually applied in wind engineering. It also reminds readers why different scaling laws govern different wind engineering problems. Secondly, the paper focuses on ways to simplify a detailed structure (bridge, building, platform) when fabricating the downscaled models for the tests, illustrated by several examples from recent engineering projects. Finally, under the most severe weather conditions, manmade structures and equipment should remain operational. What “recreating the climate” means and aims to achieve will be illustrated through common practice in climatic wind tunnel modelling.

  1. Dynamical systems proxies of atmospheric predictability and mid-latitude extremes

    NASA Astrophysics Data System (ADS)

    Messori, Gabriele; Faranda, Davide; Caballero, Rodrigo; Yiou, Pascal

    2017-04-01

    Extreme weather occurrences carry enormous social and economic costs and routinely garner widespread scientific and media coverage. Many extremes (e.g., storms, heatwaves, cold spells, heavy precipitation) are tied to specific patterns of midlatitude atmospheric circulation. The ability to identify these patterns and use them to enhance the predictability of the extremes is therefore a topic of crucial societal and economic value. We propose a novel predictability pathway for extreme events, building upon recent advances in dynamical systems theory. We use two simple dynamical systems metrics - local dimension and persistence - to identify sets of similar large-scale atmospheric flow patterns which present a coherent temporal evolution. When these patterns correspond to weather extremes, they therefore afford particularly good forward predictability. We specifically test this technique on European winter temperatures, whose variability largely depends on the atmospheric circulation in the North Atlantic region. We find that our dynamical systems approach provides predictability of large-scale temperature extremes up to one week in advance.
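    The "local dimension" metric mentioned above is typically estimated via extreme value theory: exceedances of -log(distance to a reference state) above a high quantile are asymptotically exponential with mean 1/d. The sketch below is a toy illustration of that estimator (not the authors' code); the quantile level and the uniform test cloud are assumptions for the demo.

    ```python
    import numpy as np

    def local_dimension(traj, point, q=0.98):
        # EVT estimate of the local dimension at `point`: take -log of the
        # distances from every state in `traj` to `point`; exceedances above
        # a high quantile u are ~exponential with mean 1/d, so d is the
        # reciprocal of the mean exceedance.
        g = -np.log(np.linalg.norm(traj - point, axis=1))
        u = np.quantile(g, q)
        exceedances = g[g > u] - u
        return 1.0 / exceedances.mean()
    ```

    For a uniform cloud of points in the plane, the estimate recovers a dimension near 2; applied to reanalysis fields, the same quantity flags atmospheric flow patterns that are unusually predictable or unpredictable.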

  2. Thermal Texture Selection and Correction for Building Facade Inspection Based on Thermal Radiant Characteristics

    NASA Astrophysics Data System (ADS)

    Lin, D.; Jarzabek-Rychard, M.; Schneider, D.; Maas, H.-G.

    2018-05-01

    An automatic building façade thermal texture mapping approach, using uncooled thermal camera data, is proposed in this paper. First, a shutter-less radiometric thermal camera calibration method is implemented to remove the large offset deviations caused by the changing ambient environment. Then, a 3D façade model is generated from an RGB image sequence using structure-from-motion (SfM) techniques. Subsequently, for each triangle in the 3D model, the optimal texture is selected by taking into consideration the local image scale, object incident angle, image viewing angle as well as occlusions. Afterwards, the selected textures can be further corrected using thermal radiant characteristics. Finally, a Gaussian filter outperforms the voted-texture strategy at smoothing seams, helping for instance to reduce the false alarm rate in façade thermal leakage detection. Our approach is evaluated on a building row façade located in Dresden, Germany.

  3. Integrating complexity into data-driven multi-hazard supply chain network strategies

    USGS Publications Warehouse

    Long, Suzanna K.; Shoberg, Thomas G.; Ramachandran, Varun; Corns, Steven M.; Carlo, Hector J.

    2013-01-01

    Major strategies in the wake of a large-scale disaster have focused on short-term emergency response solutions. Few consider medium-to-long-term restoration strategies that reconnect urban areas to the national supply chain networks (SCN) and their supporting infrastructure. To re-establish this connectivity, the relationships within the SCN must be defined and formulated as a model of a complex adaptive system (CAS). A CAS model is a representation of a system that consists of large numbers of inter-connections, demonstrates non-linear behaviors and emergent properties, and responds to stimuli from its environment. CAS modeling is an effective method of managing the complexities associated with SCN restoration after large-scale disasters. In order to populate the data space, large data sets are required; currently, access to these data is hampered by proprietary restrictions. The aim of this paper is to identify the data required to build a SCN restoration model, examine the inherent problems associated with these data, and understand the complexity that arises from integrating them.

  4. In-situ device integration of large-area patterned organic nanowire arrays for high-performance optical sensors

    PubMed Central

    Wu, Yiming; Zhang, Xiujuan; Pan, Huanhuan; Deng, Wei; Zhang, Xiaohong; Zhang, Xiwei; Jie, Jiansheng

    2013-01-01

    Single-crystalline organic nanowires (NWs) are important building blocks for future low-cost and efficient nano-optoelectronic devices due to their extraordinary properties. However, it remains a critical challenge to achieve large-scale organic NW array assembly and device integration. Herein, we demonstrate a feasible one-step method for large-area patterned growth of cross-aligned single-crystalline organic NW arrays and their in-situ device integration for optical image sensors. The integrated image sensor circuitry contained a 10 × 10 pixel array in an area of 1.3 × 1.3 mm², showing high spatial resolution, excellent stability and reproducibility. More importantly, 100% of the pixels successfully operated at a high response speed and relatively small pixel-to-pixel variation. The high yield and high spatial resolution of the operational pixels, along with the high integration level of the device, clearly demonstrate the great potential of the one-step organic NW array growth and device construction approach for large-scale optoelectronic device integration. PMID:24287887

  5. Mantis: A Fast, Small, and Exact Large-Scale Sequence-Search Index.

    PubMed

    Pandey, Prashant; Almodaresi, Fatemeh; Bender, Michael A; Ferdman, Michael; Johnson, Rob; Patro, Rob

    2018-06-18

    Sequence-level searches on large collections of RNA sequencing experiments, such as the NCBI Sequence Read Archive (SRA), would enable one to ask many questions about the expression or variation of a given transcript in a population. Existing approaches, such as the sequence Bloom tree, suffer from fundamental limitations of the Bloom filter, resulting in slow build and query times, less-than-optimal space usage, and potentially large numbers of false positives. This paper introduces Mantis, a space-efficient system that uses new data structures to index thousands of raw-read experiments and facilitates large-scale sequence searches. In our evaluation, index construction with Mantis is 6× faster and yields a 20% smaller index than the state-of-the-art split sequence Bloom tree (SSBT). For queries, Mantis is 6-108× faster than SSBT and has no false positives or negatives. For example, Mantis was able to search for all 200,400 known human transcripts in an index of 2,652 RNA sequencing experiments in 82 min; SSBT took close to 4 days. Copyright © 2018 Elsevier Inc. All rights reserved.
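    The kind of query Mantis answers, which experiments contain (most of) a transcript's k-mers, can be illustrated with a toy exact inverted index. This sketch omits everything that makes Mantis space-efficient (counting quotient filters, color classes); the class name, k, and threshold theta are illustrative assumptions.

    ```python
    from collections import defaultdict

    def kmers(seq, k):
        # All overlapping substrings of length k.
        return (seq[i:i + k] for i in range(len(seq) - k + 1))

    class ToyKmerIndex:
        # Minimal exact k-mer -> {experiment id} inverted index. Because it
        # stores exact sets rather than Bloom filters, queries have no false
        # positives or negatives -- the property the abstract highlights.
        def __init__(self, k=4):
            self.k = k
            self.table = defaultdict(set)

        def add_experiment(self, exp_id, reads):
            for read in reads:
                for km in kmers(read, self.k):
                    self.table[km].add(exp_id)

        def query(self, transcript, theta=0.8):
            # Report experiments containing at least a fraction theta of the
            # transcript's k-mers.
            kms = list(kmers(transcript, self.k))
            counts = defaultdict(int)
            for km in kms:
                for e in self.table.get(km, ()):
                    counts[e] += 1
            return {e for e, c in counts.items() if c >= theta * len(kms)}
    ```

    A plain hash table like this would never scale to thousands of sequencing runs; Mantis's contribution is achieving the same exact semantics in compressed space.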

  6. Energy Efficiency Potential in the U.S. Single-Family Housing Stock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, Eric J.; Christensen, Craig B.; Horowitz, Scott G.

    Typical approaches for assessing energy efficiency potential in buildings use a limited number of prototypes and therefore suffer from inadequate resolution when pass-fail cost-effectiveness tests are applied, which can significantly underestimate or overestimate the economic potential of energy efficiency technologies. This analysis applies a new approach to large-scale residential energy analysis, combining the use of large public and private data sources, statistical sampling, detailed building simulations, and high-performance computing to achieve unprecedented granularity, and therefore accuracy, in modeling the diversity of the single-family housing stock. The result is a comprehensive set of maps, tables, and figures showing the technical and economic potential of 50-plus residential energy efficiency upgrades and packages for each state. Policymakers, program designers, and manufacturers can use these results to identify upgrades with the highest potential for cost-effective savings in a particular state or region, as well as to help identify customer segments for targeted marketing and deployment. The primary finding of this analysis is that there is significant technical and economic potential to save electricity and on-site fuel use in the single-family housing stock. However, the economic potential is very sensitive to the cost-effectiveness criteria used for analysis. Additionally, the savings of particular energy efficiency upgrades are situation-specific within the housing stock (depending on climate, building vintage, heating fuel type, building physical characteristics, etc.).

  7. Role of Hydrodynamic and Mineralogical Heterogeneities on Reactive Transport Processes.

    NASA Astrophysics Data System (ADS)

    Luquot, L.; Garcia-Rios, M.; soler Sagarra, J.; Gouze, P.; Martinez-Perez, L.; Carrera, J.

    2017-12-01

    Predicting reactive transport at large scale, i.e., Darcy and field scale, is still challenging considering the number of heterogeneities that may be present from the nm- to the pore-scale. It is well documented that conventional continuum-scale approaches oversimplify and/or ignore many important aspects of rock structure, chemical reactions, fluid displacement and transport, which, as a consequence, results in uncertainties when applied to field-scale operations. The changes in flow and reactive transport across the different spatial and temporal scales are of central concern in many geological applications such as groundwater systems, geo-energy, rock building heritage and geological storage. In this presentation, we will discuss laboratory and numerical results on how local heterogeneities (structural, hydrodynamic and mineralogical) can affect the localization and the rate of reaction processes. Different flow-through laboratory experiments using various rock samples will be presented, ranging from simple monomineralic rocks such as limestone to more complex rocks composed of different minerals with a large range of kinetic reactions. A new numerical approach based on multi-rate water mixing will be presented and applied to one of the laboratory experiments in order to analyze and distinguish the effects of the mineralogy distribution and the hydrodynamic heterogeneity on the total reaction rate.

  8. Variance in Dominant Grain Size Across the Mississippi River Delta

    NASA Astrophysics Data System (ADS)

    Miller, K. L.; Chamberlain, E. L.; Esposito, C. R.; Wagner, R. W.; Mohrig, D. C.

    2016-02-01

    Proposals to restore coastal Louisiana often center on Mississippi River diversion projects wherein water and sediment are routed into wetlands and shallow waters in an effort to build land. Successful design and implementation of diversions will include consideration of behavior and characteristics of sediment, both in the river and in the receiving basin. The Mississippi River sediment load is primarily mud (roughly 75%), with the remainder being very-fine to medium sand or organic detritus. The dominance of muds leads many to suggest that diversions should focus on capturing the mud fraction despite the smaller size and longer settling times required for these particles compared to sand; others believe that sand should be the focus. We present a systematic analysis of the texture of land-building sediment in the Mississippi Delta using borehole data from various depositional environments representing a range of spatial scales, system ages, and fluvial and basin characteristics. We include subdelta-scale data from the incipient Wax Lake Delta and from the distal plain of the abandoned Lafourche subdelta, as well as crevasse-scale data from modern Cubit's Gap and the Attakapas splay, an inland Lafourche crevasse. Comparison of these sites demonstrates a large variance in the volumetric mud to sand ratios across the system. We consider the differences to be emblematic of the various forcings on each lobe as it formed and suggest that the most efficient building block for a diversion is a function of the receiving basin and is not uniform across the entire delta.

  9. Advancing research opportunities and promoting pathways in graduate education: a systemic approach to BUILD training at California State University, Long Beach (CSULB).

    PubMed

    Urizar, Guido G; Henriques, Laura; Chun, Chi-Ah; Buonora, Paul; Vu, Kim-Phuong L; Galvez, Gino; Kingsford, Laura

    2017-01-01

    First-generation college graduates, racial and ethnic minorities, people with disabilities, and those from disadvantaged backgrounds are gravely underrepresented in the health research workforce representing behavioral health sciences and biomedical sciences and engineering (BHS/BSE). Furthermore, relative to their peers, very few students from these underrepresented groups (URGs) earn scientific bachelor's degrees with even fewer earning doctorate degrees. Therefore, programs that engage and retain URGs in health-related research careers early on in their career path are imperative to promote the diversity of well-trained research scientists who have the ability to address the nation's complex health challenges in an interdisciplinary way. The purpose of this paper is to describe the challenges, lessons learned, and sustainability of implementing a large-scale, multidisciplinary research infrastructure at California State University, Long Beach (CSULB) - a minority-serving institution - through federal funding received by the National Institutes of Health (NIH) Building Infrastructure Leading to Diversity (BUILD) Initiative. The CSULB BUILD initiative consists of developing a research infrastructure designed to engage and retain URGs on the research career path by providing them with the research training and skills needed to make them highly competitive for doctoral programs and entry into the research workforce. This initiative unites many research disciplines using basic, applied, and translational approaches to offer insights and develop technologies addressing prominent community and national health issues from a multidisciplinary perspective. Additionally, this initiative brings together local (e.g., high school, community college, doctoral research institutions) and national (e.g., National Research Mentoring Network) collaborative partners to alter how we identify, develop, and implement resources to enhance student and faculty research. 
Finally, this initiative establishes a student research training program that engages URGs earlier in their academic development, is larger and multidisciplinary in scope, and is responsive to the life contexts and promotes the cultural capital that URGs bring to their career path. Although there have been many challenges to planning for and developing CSULB BUILD's large-scale, multidisciplinary research infrastructure, there have been many lessons learned in the process that could aid other campuses in the development and sustainability of similar research programs.

  10. Low-carbon building assessment and multi-scale input-output analysis

    NASA Astrophysics Data System (ADS)

    Chen, G. Q.; Chen, H.; Chen, Z. M.; Zhang, Bo; Shao, L.; Guo, S.; Zhou, S. Y.; Jiang, M. M.

    2011-01-01

    This paper presents a low-carbon building evaluation framework consisting of detailed carbon emission accounting procedures for the building life cycle across nine stages: building construction, fitment, outdoor facility construction, transportation, operation, waste treatment, property management, demolition, and disposal. The procedures are supported by integrated carbon intensity databases based on multi-scale input-output analysis, which are essential for low-carbon planning, procurement and supply chain design, and logistics management.
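
    The input-output analysis behind such carbon intensity databases follows the standard Leontief relation: embodied intensities ε satisfy ε = f + εA, i.e. ε = f(I − A)⁻¹, where f holds direct emission intensities and A the technical coefficients. A minimal two-sector sketch with entirely hypothetical numbers (not the paper's data):

```python
# Hypothetical 2-sector economy: embodied carbon intensities eps solve
# eps = f + eps*A, i.e. eps (I - A) = f  (Leontief input-output relation).
f = [0.8, 0.3]            # direct emission intensities per unit output (assumed)
A = [[0.1, 0.3],          # A[i][j]: input from sector i per unit output of sector j
     [0.2, 0.1]]

# Invert (I - A) analytically for the 2x2 case.
a, b = 1 - A[0][0], -A[0][1]
c, d = -A[1][0], 1 - A[1][1]
det = a * d - b * c
inv = [[d / det, -b / det],
       [-c / det, a / det]]

# eps = f (I - A)^-1  (row vector times matrix)
eps = [f[0] * inv[0][0] + f[1] * inv[1][0],
       f[0] * inv[0][1] + f[1] * inv[1][1]]
```

    Each embodied intensity exceeds its direct intensity because supply-chain (indirect) emissions are folded in, which is the point of using input-output intensities rather than direct emission factors in life-cycle accounting.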

  11. Development, implementation and evaluation of a clinical research engagement and leadership capacity building program in a large Australian health care service.

    PubMed

    Misso, Marie L; Ilic, Dragan; Haines, Terry P; Hutchinson, Alison M; East, Christine E; Teede, Helena J

    2016-01-14

    Health professionals need to be integrated more effectively in clinical research to ensure that research addresses clinical needs and provides practical solutions at the coal face of care. In light of limited evidence on how best to achieve this, evaluation of strategies to introduce, adapt and sustain evidence-based practices across different populations and settings is required. This project aims to address this gap through the co-design, development, implementation, evaluation, refinement and ultimately scale-up of a clinical research engagement and leadership capacity building program in a clinical setting with little to no co-ordinated approach to clinical research engagement and education. The protocol is based on principles of research capacity building and on a six-step framework, which have previously led to successful implementation and long-term sustainability. A mixed methods study design will be used. Methods will include: (1) a review of the literature about strategies that engage health professionals in research through capacity building and/or education in research methods; (2) a review of existing local research education and support elements; (3) a needs assessment in the local clinical setting, including an online cross-sectional survey and semi-structured interviews; (4) co-design and development of an educational and support program; (5) implementation of the program in the clinical environment; and (6) pre- and post-implementation evaluation and ultimately program scale-up. The evaluation focuses on participants' research activity; on their knowledge, attitudes and preferences regarding clinical research, evidence-based practice and leadership; and, post-implementation, on their satisfaction with the program. The investigators will evaluate the feasibility and effect of the program according to capacity building measures and will revise where appropriate prior to scale-up. 
It is anticipated that this clinical research engagement and leadership capacity building program will enable and enhance clinically relevant research to be led and conducted by health professionals in the health setting. This approach will also encourage identification of areas of clinical uncertainty and need that can be addressed through clinical research within the health setting.

  12. Research on Optimal Observation Scale for Damaged Buildings after Earthquake Based on Optimal Feature Space

    NASA Astrophysics Data System (ADS)

    Chen, J.; Chen, W.; Dou, A.; Li, W.; Sun, Y.

    2018-04-01

    A new information extraction method for damaged buildings, rooted in an optimal feature space, is put forward on the basis of the traditional object-oriented method. In this new method, the ESP (estimate of scale parameter) tool is used to optimize the segmentation of the image. Then the distance matrix and minimum separation distance of all kinds of surface features are calculated through sample selection to find the optimal feature space, which is finally applied to extract the image of damaged buildings after an earthquake. The overall extraction accuracy reaches 83.1 % and the kappa coefficient 0.813. The new method greatly improves extraction accuracy and efficiency compared with the traditional object-oriented method, and shows good potential for wider application in information extraction of damaged buildings. In addition, the new method can be applied to images of damaged buildings at different resolutions, and the optimal observation scale of damaged buildings can then be sought through accuracy evaluation. The results suggest that the optimal observation scale of damaged buildings is between 1 m and 1.2 m, which provides a reference for future information extraction of damaged buildings.
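
    The minimum-separation-distance criterion described above can be sketched generically: for a candidate feature space, compute the smallest distance between any two class-mean feature vectors, and prefer feature spaces that maximize it. The class names and feature values below are hypothetical, for illustration only:

```python
import math

# Hypothetical class-mean feature vectors (brightness, texture) obtained
# from training samples for three surface types.
class_means = {
    "intact_building":  (0.70, 0.20),
    "damaged_building": (0.55, 0.60),
    "bare_soil":        (0.30, 0.25),
}

def min_separation(means):
    """Smallest Euclidean distance between any two class means; a larger
    value indicates a more separable (better) feature space."""
    labels = list(means)
    return min(
        math.dist(means[a], means[b])
        for i, a in enumerate(labels) for b in labels[i + 1:]
    )

sep = min_separation(class_means)
```

    Comparing this score across candidate feature subsets (and across segmentation scales) is one simple way to operationalize an "optimal feature space" search.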

  13. Evaluation of three common green building materials for ozone removal, and primary and secondary emissions of aldehydes

    NASA Astrophysics Data System (ADS)

    Gall, Elliott; Darling, Erin; Siegel, Jeffrey A.; Morrison, Glenn C.; Corsi, Richard L.

    2013-10-01

    Ozone reactions that occur on material surfaces can lead to elevated concentrations of oxidized products in the occupied space of buildings. However, there is little information on the impact of materials at full scale, especially for green building materials. Experiments were completed in a 68 m³ climate-controlled test chamber with three certified green building materials that can cover large areas in buildings: (1) recycled carpet, (2) perlite-based ceiling tile and (3) low-VOC paint and primer on recycled drywall. Ozone deposition velocity and primary and secondary emission rates of C₁ to C₁₀ saturated carbonyls were determined for two chamber mixing conditions and three values of relative humidity. A direct comparison was made between ozone deposition velocities and carbonyl yields observed for the same materials analyzed in small (10 L) chambers. Total primary carbonyl emission rates from carpet, ceiling tile and painted drywall ranged from 27 to 120 μg m⁻² h⁻¹, 13 to 40 μg m⁻² h⁻¹, and 3.9 to 42 μg m⁻² h⁻¹, respectively. Ozone deposition velocities to these three materials averaged 6.1 m h⁻¹, 2.3 m h⁻¹ and 0.32 m h⁻¹, respectively. Total secondary carbonyl emissions from these materials ranged from 70 to 276 μg m⁻² h⁻¹, 0 to 12 μg m⁻² h⁻¹, and 0 to 30 μg m⁻² h⁻¹, respectively. Carbonyl emissions were determined with a transient approximation and were found to be in general agreement with those found in the literature. These results suggest that care should be taken when selecting green building materials due to potentially large differences in primary and secondary emissions.
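
    Deposition velocity in a well-mixed chamber is conventionally recovered from a steady-state ozone mass balance, Q·C_in = Q·C_out + v_d·A·C_out. A sketch with hypothetical values (not the paper's measurements):

```python
# Steady-state ozone mass balance for a well-mixed chamber:
#   Q*C_in = Q*C_out + v_d * A * C_out   =>   v_d = Q*(C_in - C_out) / (A * C_out)
# All numbers below are illustrative assumptions, not experimental data.
Q = 68.0 * 1.0              # airflow, m3/h (68 m3 chamber at 1 air change per hour)
A = 18.0                    # exposed material surface area, m2 (assumed)
C_in, C_out = 100.0, 40.0   # inlet and well-mixed chamber ozone, ppb (assumed)

v_d = Q * (C_in - C_out) / (A * C_out)   # deposition velocity, m/h
```

    With these assumed inputs v_d comes out near 5.7 m/h, the same order as the carpet value reported in the abstract; a less reactive surface (smaller C_in − C_out) yields a proportionally smaller v_d.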

  14. Conformal Ablative Thermal Protection System for Small and Large Scale Missions: Approaching TRL 6 for Planetary and Human Exploration Missions and TRL 9 for Small Probe Missions

    NASA Technical Reports Server (NTRS)

    Beck, R. A. S.; Gasch, M. J.; Milos, F. S.; Stackpoole, M. M.; Smith, B. P.; Switzer, M. R.; Venkatapathy, E.; Wilder, M. C.; Boghhozian, T.; Chavez-Garcia, J. F.

    2015-01-01

    In 2011, NASA's Aeronautics Research Mission Directorate (ARMD) funded an effort to develop an ablative thermal protection system (TPS) material that would have improved properties when compared to Phenolic Impregnated Carbon Ablator (PICA) and AVCOAT. Their goal was a conformal material, processed with a flexible reinforcement, that would result in similar or better thermal characteristics and higher strain-to-failure characteristics, allowing easier integration on flight aeroshells than then-current rigid ablative TPS materials. In 2012, NASA's Space Technology Mission Directorate (STMD) began funding the maturation of the best formulation of the game-changing conformal ablator, C-PICA. Progress has been reported at IPPW over the past three years, describing C-PICA with a density and recession rates similar to PICA, but with a higher strain-to-failure, which allows for direct bonding and no gap fillers, and, even more important, with thermal characteristics resulting in half the temperature rise of PICA. Overall, C-PICA should be able to replace PICA with a thinner, lighter weight, less complicated design. These characteristics should be particularly attractive for use as backshell TPS on high-energy planetary entry vehicles. At the end of this year, the material should be ready for missions to consider including in their designs; in fact, NASA's Science Mission Directorate (SMD) is considering incentivizing the use of C-PICA in the next Discovery proposal call. This year, both scale-up of the material to large (1-m) pieces and the design and build of small probe heatshields for flight tests will be completed. NASA, with an industry partner, will build a 1-m-long manufacturing demonstration unit (MDU) with a shape based on a mid-L/D lifting body. 
In addition, in an effort to "fly as you test and test as you fly," NASA, with a second industry partner, will build a small probe to test in the Interactive Heating Facility (IHF) arc jet and, using nearly the same design, build the aeroshell and TPS, with instrumentation, for a small probe flight test article due to fly in 2017. At the end of the year, C-PICA will be at TRL 5+, and with the flight data in 2017, it will be at TRL 9 for mission needs with C-PICA at a small scale (12-inch diameter). The scale-up and small probe efforts will be described in this presentation.

  15. Skinfold Measurements and the Percentage of Body Fat Differences Between Black and White Male Soldiers

    DTIC Science & Technology

    1984-03-16

    cooperation, making it impractical for large-scale studies. Today, the hydrostatic method is used primarily in laboratory studies by exercise physiologists...inappropriate equations could cause serious errors and result in adverse advice being given to the athlete concerning dietary habits and/or exercise ...football players. Their results showed that when black and white football players are matched somatotypically (type of body build), there is no significant

  16. Existing Whole-House Solutions Case Study: Pilot Demonstration of Phased Retrofits in Florida Homes - Central and South Florida Homes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2014-08-01

    In this pilot project, the Building America Partnership for Improved Residential Construction and Florida Power and Light are collaborating to retrofit a large number of homes using a phased approach to both simple and deep retrofits. This project will provide the information necessary to significantly reduce energy use through larger community-scale projects in collaboration with utilities, program administrators and other market leader stakeholders.

  17. Merging Surface Reconstructions of Terrestrial and Airborne LIDAR Range Data

    DTIC Science & Technology

    2009-05-19

    Mangan and R. Whitaker. Partitioning 3D surface meshes using watershed segmentation . IEEE Trans. on Visualization and Computer Graphics, 5(4), pp...Jain, and A. Zakhor. Data Processing Algorithms for Generating Textured 3D Building Facade Meshes from Laser Scans and Camera Images. International...acquired set of overlapping range images into a single mesh [2,9,10]. However, due to the volume of data involved in large scale urban modeling, data

  18. U.S. Strategic Interest in the Middle East and Implications for the Army

    DTIC Science & Technology

    2017-01-01

    three of which evolved into civil wars, have altered the landscape. These have resulted in occasional opportunities, such as the effort to build a...the Arab world, though experience indicates that large-scale intervention in such conflicts is likely to produce disappointing results . Army...the Iranian regime to rally people around the flag. As a result , the United States should be cautious in demonstrating support for such progress

  19. III-V Semiconductor Optical Micro-Ring Resonators

    NASA Astrophysics Data System (ADS)

    Grover, Rohit; Absil, Philippe P.; Ibrahim, Tarek A.; Ho, Ping-Tong

    2004-05-01

    We describe the theory of optical ring resonators, and our work on GaAs-AlGaAs and GaInAsP-InP optical micro-ring resonators. These devices are promising building blocks for future all-optical signal processing and photonic logic circuits. Their versatility allows the fabrication of ultra-compact multiplexers/demultiplexers, optical channel dropping filters, lasers, amplifiers, and logic gates (to name a few), which will enable large-scale monolithic integration for optics.

  20. Rotation-invariant convolutional neural networks for galaxy morphology prediction

    NASA Astrophysics Data System (ADS)

    Dieleman, Sander; Willett, Kyle W.; Dambre, Joni

    2015-06-01

    Measuring the morphological parameters of galaxies is a key requirement for studying their formation and evolution. Surveys such as the Sloan Digital Sky Survey have resulted in the availability of very large collections of images, which have permitted population-wide analyses of galaxy morphology. Morphological analysis has traditionally been carried out mostly via visual inspection by trained experts, which is time consuming and does not scale to large (≳10⁴) numbers of images. Although attempts have been made to build automated classification systems, these have not been able to achieve the desired level of accuracy. The Galaxy Zoo project successfully applied a crowdsourcing strategy, inviting online users to classify images by answering a series of questions. Unfortunately, even this approach does not scale well enough to keep up with the increasing availability of galaxy images. We present a deep neural network model for galaxy morphology classification which exploits translational and rotational symmetry. It was developed in the context of the Galaxy Challenge, an international competition to build the best model for morphology classification based on annotated images from the Galaxy Zoo project. For images with high agreement among the Galaxy Zoo participants, our model is able to reproduce their consensus with near-perfect accuracy (>99 per cent) for most questions. Confident model predictions are highly accurate, which makes the model suitable for filtering large collections of images and forwarding challenging images to experts for manual annotation. This approach greatly reduces the experts' workload without affecting accuracy. The application of these algorithms to larger sets of training data will be critical for analysing results from future surveys such as the Large Synoptic Survey Telescope.
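
    The rotational-symmetry idea can be illustrated independently of the paper's network: averaging a predictor's outputs over the four 90° rotations of an input makes the combined prediction invariant to those rotations. A toy sketch, where the "classifier" is just a stand-in function (all names and values hypothetical):

```python
def rot90(img):
    """Rotate a 2D list-of-lists image 90 degrees counterclockwise."""
    return [list(row) for row in zip(*img)][::-1]

def rotation_averaged(predict, img):
    """Average predict() over the four 90-degree rotations of img,
    yielding an output invariant to those rotations."""
    views = [img]
    for _ in range(3):
        views.append(rot90(views[-1]))
    scores = [predict(v) for v in views]
    return sum(scores) / len(scores)

# Toy stand-in "classifier": mean of the top row of pixels,
# which on its own is NOT rotation invariant.
predict = lambda img: sum(img[0]) / len(img[0])

img = [[1, 2],
       [3, 4]]
```

    Here `rotation_averaged(predict, img)` and `rotation_averaged(predict, rot90(img))` agree exactly, because both average over the same cycle of four views; the paper's network bakes a comparable symmetry into its architecture rather than averaging at inference time.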

  1. Concurrent heterogeneous neural model simulation on real-time neuromimetic hardware.

    PubMed

    Rast, Alexander; Galluppi, Francesco; Davies, Sergio; Plana, Luis; Patterson, Cameron; Sharp, Thomas; Lester, David; Furber, Steve

    2011-11-01

    Dedicated hardware is becoming increasingly essential to simulate emerging very-large-scale neural models. Equally, however, it needs to be able to support multiple models of the neural dynamics, possibly operating simultaneously within the same system. This may be necessary either to simulate large models with heterogeneous neural types, or to simplify simulation and analysis of detailed, complex models in a large simulation by isolating the new model to a small subpopulation of a larger overall network. The SpiNNaker neuromimetic chip is a dedicated neural processor able to support such heterogeneous simulations. Implementing these models on-chip uses an integrated library-based tool chain incorporating the emerging PyNN interface that allows a modeller to input a high-level description and use an automated process to generate an on-chip simulation. Simulations using both LIF and Izhikevich models demonstrate the ability of the SpiNNaker system to generate and simulate heterogeneous networks on-chip, while illustrating, through the network-scale effects of wavefront synchronisation and burst gating, methods that can provide effective behavioural abstractions for large-scale hardware modelling. SpiNNaker's asynchronous virtual architecture permits greater scope for model exploration, with scalable levels of functional and temporal abstraction, than conventional (or neuromorphic) computing platforms. The complete system illustrates a potential path to understanding the neural model of computation, by building (and breaking) neural models at various scales, connecting the blocks, then comparing them against the biology: computational cognitive neuroscience. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. From Pleistocene to Holocene: the prehistory of southwest Asia in evolutionary context.

    PubMed

    Watkins, Trevor

    2017-08-14

    In this paper I seek to show how cultural niche construction theory offers the potential to extend the human evolutionary story beyond the Pleistocene, through the Neolithic, towards the kind of very large-scale societies in which we live today. The study of the human past has been compartmentalised, each compartment using different analytical vocabularies, so that their accounts are written in mutually incompatible languages. In recent years social, cognitive and cultural evolutionary theories, building on a growing body of archaeological evidence, have made substantial sense of the social and cultural evolution of the genus Homo. However, specialists in this field of studies have found it difficult to extend their kind of analysis into the Holocene human world. Within southwest Asia the three or four millennia of the Neolithic period at the beginning of the Holocene represents a pivotal point, which saw the transformation of human society in the emergence of the first large-scale, permanent communities, the domestication of plants and animals, and the establishment of effective farming economies. Following the Neolithic, the pace of human social, economic and cultural evolution continued to increase. By 5000 years ago, in parts of southwest Asia and northeast Africa there were very large-scale urban societies, and the first large-scale states (kingdoms). An extension of cultural niche construction theory enables us to extend the evolutionary narrative of the Pleistocene into the Holocene, opening the way to developing a single, long-term, evolutionary account of human history.

  3. Computational Methodologies for Real-Space Structural Refinement of Large Macromolecular Complexes

    PubMed Central

    Goh, Boon Chong; Hadden, Jodi A.; Bernardi, Rafael C.; Singharoy, Abhishek; McGreevy, Ryan; Rudack, Till; Cassidy, C. Keith; Schulten, Klaus

    2017-01-01

    The rise of the computer as a powerful tool for model building and refinement has revolutionized the field of structure determination for large biomolecular systems. Despite the wide availability of robust experimental methods capable of resolving structural details across a range of spatiotemporal resolutions, computational hybrid methods have the unique ability to integrate the diverse data from multimodal techniques such as X-ray crystallography and electron microscopy into consistent, fully atomistic structures. Here, commonly employed strategies for computational real-space structural refinement are reviewed, and their specific applications are illustrated for several large macromolecular complexes: ribosome, virus capsids, chemosensory array, and photosynthetic chromatophore. The increasingly important role of computational methods in large-scale structural refinement, along with current and future challenges, is discussed. PMID:27145875

  4. Cumulative effects of restoration efforts on ecological characteristics of an open water area within the Upper Mississippi River

    USGS Publications Warehouse

    Gray, B.R.; Shi, W.; Houser, J.N.; Rogala, J.T.; Guan, Z.; Cochran-Biederman, J. L.

    2011-01-01

    Ecological restoration efforts in large rivers generally aim to ameliorate ecological effects associated with large-scale modification of those rivers. This study examined whether the effects of restoration efforts, specifically those of island construction, within a largely open-water restoration area of the Upper Mississippi River (UMR) might be seen at the spatial scale of that 3476 ha area. The cumulative effects of island construction, when observed over multiple years, were postulated to have made the restoration area increasingly similar to a positive reference area (a proximate area comprising contiguous backwater areas) and increasingly different from two negative reference areas. The negative reference areas represented the Mississippi River main channel in an area proximate to the restoration area and an open-water area in a related Mississippi River reach that has seen relatively little restoration effort. Inferences on the effects of restoration were made by comparing constrained and unconstrained models of summer chlorophyll a (CHL), summer inorganic suspended solids (ISS) and counts of benthic mayfly larvae. Constrained models forced trends in means, or in both means and sampling variances, to become increasingly similar over time to those in the positive reference area and increasingly dissimilar to those in the negative reference areas. Trends were estimated over 12- (mayflies) or 14-year sampling periods, and were evaluated using model information criteria. Based on these methods, restoration effects were observed for CHL and mayflies, while evidence in favour of restoration effects on ISS was equivocal. These findings suggest that the cumulative effects of island building at relatively large spatial scales within large rivers may be estimated using data from large-scale surveillance monitoring programs. Published in 2010 by John Wiley & Sons, Ltd.

  5. Understanding and Controlling Sialylation in a CHO Fc-Fusion Process

    PubMed Central

    Lewis, Amanda M.; Croughan, William D.; Aranibar, Nelly; Lee, Alison G.; Warrack, Bethanne; Abu-Absi, Nicholas R.; Patel, Rutva; Drew, Barry; Borys, Michael C.; Reily, Michael D.; Li, Zheng Jian

    2016-01-01

    A Chinese hamster ovary (CHO) bioprocess, where the product is a sialylated Fc-fusion protein, was operated at pilot and manufacturing scale and significant variation of sialylation level was observed. In order to more tightly control glycosylation profiles, we sought to identify the cause of variability. Untargeted metabolomics and transcriptomics methods were applied to select samples from the large-scale runs. Lower sialylation was correlated with elevated mannose levels, a shift in glucose metabolism, and increased oxidative stress response. Using a 5-L scale model operated with a reduced dissolved oxygen set point, we were able to reproduce the phenotypic profiles observed at manufacturing scale, including lower sialylation, higher lactate and lower ammonia levels. Targeted transcriptomics and metabolomics confirmed that reduced oxygen levels resulted in increased mannose levels, a shift towards glycolysis, and increased oxidative stress response similar to the manufacturing scale. Finally, we propose a biological mechanism linking large-scale operation and sialylation variation. Oxidative stress results from gas transfer limitations at large scale and the presence of oxygen dead-zones, inducing upregulation of glycolysis and mannose biosynthesis, and downregulation of hexosamine biosynthesis and acetyl-CoA formation. The lower flux through the hexosamine pathway and reduced intracellular pools of acetyl-CoA led to reduced formation of N-acetylglucosamine and N-acetylneuraminic acid, both key building blocks of N-glycan structures. This study reports for the first time a link between oxidative stress and mammalian protein sialylation. In this study, process, analytical, metabolomic, and transcriptomic data at manufacturing, pilot, and laboratory scales were taken together to develop a systems-level understanding of the process and identify oxygen limitation as the root cause of glycosylation variability. PMID:27310468

  6. Data Sharing in DHT Based P2P Systems

    NASA Astrophysics Data System (ADS)

    Roncancio, Claudia; Del Pilar Villamil, María; Labbé, Cyril; Serrano-Alvarado, Patricia

The evolution of peer-to-peer (P2P) systems triggered the building of large scale distributed applications. The main application domain is data sharing across a very large number of highly autonomous participants. Building such data sharing systems is particularly challenging because of the “extreme” characteristics of P2P infrastructures: massive distribution, high churn rate, no global control, potentially untrusted participants... This article focuses on declarative querying support, query optimization and data privacy on a major class of P2P systems: those based on Distributed Hash Tables (P2P DHT). The usual approaches and the algorithms used by classic distributed systems and databases for providing data privacy and querying services are not well suited to P2P DHT systems. A considerable amount of work was required to adapt them to the new challenges such systems present. This paper describes the most important solutions found. It also identifies important future research trends in data management in P2P DHT systems.
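The key-based routing that underlies DHT-based data sharing can be illustrated with a toy consistent-hashing ring. This is a minimal sketch, not the querying or privacy machinery the article surveys; the node names and identifier-space size are invented for illustration:

```python
import hashlib
from bisect import bisect_left

def hash_key(key: str, space: int = 2**16) -> int:
    """Map any string key into the DHT identifier space."""
    return int(hashlib.sha1(key.encode()).hexdigest(), 16) % space

class ToyDHT:
    """Toy consistent-hashing ring: a data key is owned by the first
    node whose identifier is >= the key's hash, wrapping around."""
    def __init__(self, node_names):
        self.ring = sorted((hash_key(n), n) for n in node_names)
        self.ids = [i for i, _ in self.ring]

    def lookup(self, data_key: str) -> str:
        pos = bisect_left(self.ids, hash_key(data_key)) % len(self.ring)
        return self.ring[pos][1]

dht = ToyDHT(["peer-A", "peer-B", "peer-C", "peer-D"])
owner = dht.lookup("article:42")  # the same key always routes to the same peer
```

Because placement depends only on the hash, any peer can locate a key's owner without global coordination, which is what makes declarative querying over DHTs both possible and challenging (range queries do not map naturally onto hashed keys).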

  7. Modeling Atmospheric Transport for Greenhouse Gas Observations within the Urban Dome

    NASA Astrophysics Data System (ADS)

    Nehrkorn, T.; Sargent, M. R.; Wofsy, S. C.

    2016-12-01

Observations of CO2, CH4, and other greenhouse gases (GHGs) within the urban dome of major cities generally show large enhancements over background values, and large sensitivity to surface fluxes (as measured by the footprints computed by Lagrangian Particle Dispersion Models, LPDMs) within the urban dome. However, their use in top-down inversion studies to constrain urban emission estimates is complicated by difficulties in proper modeling of the atmospheric transport. We are conducting experiments with the Weather Research and Forecast model (WRF) coupled to the STILT LPDM to improve model simulation of atmospheric transport on spatial scales of a few km in urban domains, because errors in transport on short time/space scales are amplified by the patchiness of GHG emissions and may engender systematic errors of simulated concentrations. We are evaluating the quality of the meteorological simulations from model configurations with different resolutions and PBL packages, using both standard and non-standard (Lidar PBL height and ACARS aircraft profile) observations. To take into account the effect of building-scale eddies for observations located on top of buildings, we are modifying the basic STILT algorithm for the computation of footprints by replacing the nominal receptor height by an effective sampling height. In addition, the footprint computations for near-field emissions make use of the vertical particle spread within the LPDM to arrive at a more appropriate estimate of mixing heights in the immediate vicinity of receptors. We present the effect of these and similar modifications on simulated concentrations and their level of agreement with observed values.

  8. Handheld low-temperature plasma probe for portable "point-and-shoot" ambient ionization mass spectrometry.

    PubMed

    Wiley, Joshua S; Shelley, Jacob T; Cooks, R Graham

    2013-07-16

    We describe a handheld, wireless low-temperature plasma (LTP) ambient ionization source and its performance on a benchtop and a miniature mass spectrometer. The source, which is inexpensive to build and operate, is battery-powered and utilizes miniature helium cylinders or air as the discharge gas. Comparison of a conventional, large-scale LTP source against the handheld LTP source, which uses less helium and power than the large-scale version, revealed that the handheld source had similar or slightly better analytical performance. Another advantage of the handheld LTP source is the ability to quickly interrogate a gaseous, liquid, or solid sample without requiring any setup time. A small, 7.4-V Li-polymer battery is able to sustain plasma for 2 h continuously, while the miniature helium cylinder supplies gas flow for approximately 8 continuous hours. Long-distance ion transfer was achieved for distances up to 1 m.

  9. Inherent polarization entanglement generated from a monolithic semiconductor chip

    PubMed Central

    Horn, Rolf T.; Kolenderski, Piotr; Kang, Dongpeng; Abolghasem, Payam; Scarcella, Carmelo; Frera, Adriano Della; Tosi, Alberto; Helt, Lukas G.; Zhukovsky, Sergei V.; Sipe, J. E.; Weihs, Gregor; Helmy, Amr S.; Jennewein, Thomas

    2013-01-01

Creating miniature chip scale implementations of optical quantum information protocols is a dream for many in the quantum optics community. This is largely because of the promise of stability and scalability. Here we present a monolithically integratable chip architecture upon which is built a photonic device primitive called a Bragg reflection waveguide (BRW). Implemented in gallium arsenide, we show that, via the process of spontaneous parametric down conversion, the BRW is capable of directly producing polarization entangled photons without additional path difference compensation, spectral filtering or post-selection. After splitting the twin-photons immediately after they emerge from the chip, we perform a variety of correlation tests on the photon pairs and show non-classical behaviour in their polarization. Combined with the BRW's versatile architecture, our results signify the BRW design as a serious contender on which to build large-scale implementations of optical quantum processing devices. PMID:23896982

  10. Big Data Analytics for Genomic Medicine

    PubMed Central

    He, Karen Y.; Ge, Dongliang; He, Max M.

    2017-01-01

Genomic medicine attempts to build individualized strategies for diagnostic or therapeutic decision-making by utilizing patients’ genomic information. Big Data analytics uncovers hidden patterns, unknown correlations, and other insights through examining various large-scale data sets. While integration and manipulation of diverse genomic data and comprehensive electronic health records (EHRs) on a Big Data infrastructure exhibit challenges, they also provide a feasible opportunity to develop an efficient and effective approach to identify clinically actionable genetic variants for individualized diagnosis and therapy. In this paper, we review the challenges of manipulating large-scale next-generation sequencing (NGS) data and diverse clinical data derived from the EHRs for genomic medicine. We introduce possible solutions for different challenges in manipulating, managing, and analyzing genomic and clinical data to implement genomic medicine. Additionally, we present a practical Big Data toolset for identifying clinically actionable genetic variants using high-throughput NGS data and EHRs. PMID:28212287

  11. Interprofessional Education and Practice Guide No. 7: Development, implementation, and evaluation of a large-scale required interprofessional education foundational programme.

    PubMed

    Shrader, Sarah; Hodgkins, Renee; Laverentz, Delois; Zaudke, Jana; Waxman, Michael; Johnston, Kristy; Jernigan, Stephen

    2016-09-01

    Health profession educators and administrators are interested in how to develop an effective and sustainable interprofessional education (IPE) programme. We describe the approach used at the University of Kansas Medical Centre, Kansas City, United States. This approach is a foundational programme with multiple large-scale, half-day events each year. The programme is threaded with common curricular components that build in complexity over time and assures that each learner is exposed to IPE. In this guide, lessons learned and general principles related to the development of IPE programming are discussed. Important areas that educators should consider include curriculum development, engaging leadership, overcoming scheduling barriers, providing faculty development, piloting the programming, planning for logistical coordination, intentionally pairing IP facilitators, anticipating IP conflict, setting clear expectations for learners, publicising the programme, debriefing with faculty, planning for programme evaluation, and developing a scholarship and dissemination plan.

  12. Big Data Analytics for Genomic Medicine.

    PubMed

    He, Karen Y; Ge, Dongliang; He, Max M

    2017-02-15

Genomic medicine attempts to build individualized strategies for diagnostic or therapeutic decision-making by utilizing patients' genomic information. Big Data analytics uncovers hidden patterns, unknown correlations, and other insights through examining various large-scale data sets. While integration and manipulation of diverse genomic data and comprehensive electronic health records (EHRs) on a Big Data infrastructure exhibit challenges, they also provide a feasible opportunity to develop an efficient and effective approach to identify clinically actionable genetic variants for individualized diagnosis and therapy. In this paper, we review the challenges of manipulating large-scale next-generation sequencing (NGS) data and diverse clinical data derived from the EHRs for genomic medicine. We introduce possible solutions for different challenges in manipulating, managing, and analyzing genomic and clinical data to implement genomic medicine. Additionally, we present a practical Big Data toolset for identifying clinically actionable genetic variants using high-throughput NGS data and EHRs.

  13. Fine-Scale Population Estimation by 3D Reconstruction of Urban Residential Buildings

    PubMed Central

    Wang, Shixin; Tian, Ye; Zhou, Yi; Liu, Wenliang; Lin, Chenxi

    2016-01-01

    Fine-scale population estimation is essential in emergency response and epidemiological applications as well as urban planning and management. However, representing populations in heterogeneous urban regions with a finer resolution is a challenge. This study aims to obtain fine-scale population distribution based on 3D reconstruction of urban residential buildings with morphological operations using optical high-resolution (HR) images from the Chinese No. 3 Resources Satellite (ZY-3). Specifically, the research area was first divided into three categories when dasymetric mapping was taken into consideration. The results demonstrate that the morphological building index (MBI) yielded better results than built-up presence index (PanTex) in building detection, and the morphological shadow index (MSI) outperformed color invariant indices (CIIT) in shadow extraction and height retrieval. Building extraction and height retrieval were then combined to reconstruct 3D models and to estimate population. Final results show that this approach is effective in fine-scale population estimation, with a mean relative error of 16.46% and an overall Relative Total Absolute Error (RATE) of 0.158. This study gives significant insights into fine-scale population estimation in complicated urban landscapes, when detailed 3D information of buildings is unavailable. PMID:27775670
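The core of the approach above is dasymetric disaggregation: census-zone population is redistributed to buildings in proportion to the residential volume recovered from 3D reconstruction. A minimal sketch of that weighting step, with footprint areas, heights, and the zone total all invented for illustration (not figures from the study):

```python
# Hypothetical figures: building footprint areas (m^2) and heights (m)
# for one census zone with a known total population.
buildings = [
    {"id": 1, "area": 400.0, "height": 15.0},  # ~5 floors
    {"id": 2, "area": 250.0, "height": 30.0},  # ~10 floors
    {"id": 3, "area": 600.0, "height": 9.0},   # ~3 floors
]
zone_population = 1200

# 3D reconstruction gives a volume proxy; the zone's population is
# disaggregated proportionally to each building's residential volume.
volumes = {b["id"]: b["area"] * b["height"] for b in buildings}
total_volume = sum(volumes.values())
population = {bid: zone_population * v / total_volume
              for bid, v in volumes.items()}
```

The per-building estimates sum exactly back to the zone total, which is the defining property of dasymetric mapping; the study's refinements (classifying regions, separating residential from non-residential volume) adjust which volumes enter this weighting.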

  14. Development of a novel multi-layer MRE isolator for suppression of building vibrations under seismic events

    NASA Astrophysics Data System (ADS)

    Yang, Jian; Sun, Shuaishuai; Tian, Tongfei; Li, Weihua; Du, Haiping; Alici, Gursel; Nakano, Masami

    2016-03-01

    Protecting civil engineering structures from uncontrollable events such as earthquakes while maintaining their structural integrity and serviceability is very important; this paper describes the performance of a stiffness softening magnetorheological elastomer (MRE) isolator in a scaled three storey building. In order to construct a closed-loop system, a scaled three storey building was designed and built according to the scaling laws, and then four MRE isolator prototypes were fabricated and utilised to isolate the building from the motion induced by a scaled El Centro earthquake. Fuzzy logic was used to output the current signals to the isolators, based on the real-time responses of the building floors, and then a simulation was used to evaluate the feasibility of this closed loop control system before carrying out an experimental test. The simulation and experimental results showed that the stiffness softening MRE isolator controlled by fuzzy logic could suppress structural vibration well.

  15. Global- to Micro-Scale Evolution of the Pinatubo Aerosol: Using Composite Data Sets to Build the Picture and Assess Consistency of Different Measurements

    NASA Technical Reports Server (NTRS)

    Russell, P. B.; Pueschel, R. F.; Livingston, J. M.; Bergstrom, R.; Lawless, James G. (Technical Monitor)

    1994-01-01

This paper brings together experimental evidence required to build realistic models of the global evolution of physical, chemical, and optical properties of the aerosol resulting from the 1991 Pinatubo volcanic eruption. Such models are needed to compute the effects of the aerosol on atmospheric chemistry, dynamics, radiation, and temperature. Whereas there is now a large and growing body of post-Pinatubo measurements by a variety of techniques, some results are in conflict, and a self-consistent, unified picture is needed, along with an assessment of remaining uncertainties. This paper examines data from photometers, radiometers, impactors, optical counters/sizers, and lidars operated on the ground, aircraft, balloons, and spacecraft.

  16. Health policy in Asia and the Pacific: Navigating local needs and global challenges

    PubMed Central

    Lee, Kelley

    2014-01-01

    Asia and the Pacific are undergoing a remarkable economic transformation which is occurring at an exceptional pace. There is clear evidence of an equally rapid epidemiological transition in the region. This paper sets out the policy challenges of building healthy societies in the context of rapid economic change. The region’s location at the crossroads of contemporary globalization, resulting in intensified population mobility, large-scale trade and investment, and pressures to take collective action on shared problems, adds to the complexity of this task. The paper argues that health is integral to building stable and sustainable societies, and that there are opportunities to develop more holistic approaches that bring together hitherto separate policy spheres. PMID:24592312

  17. Navajo-Hopi Land Commission Renewable Energy Development Project (NREP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Thomas Benally, Deputy Director

    2012-05-15

The Navajo Hopi Land Commission Office (NHLCO), a Navajo Nation executive branch agency, has conducted activities to determine the capacity-building, institution-building, outreach, and management activities needed to initiate the development of large-scale renewable energy - 100 megawatt (MW) or larger - generating projects on land in Northwestern New Mexico in the first year of a multi-year program. The Navajo Hopi Land Commission Renewable Energy Development Project (NREP) is a one year program that will develop and market a strategic business plan; form multi-agency and public-private project partnerships; compile site-specific solar, wind and infrastructure data; and develop and use project communication and marketing tools to support outreach efforts targeting the public, vendors, investors and government audiences.

  18. Mortality during a Large-Scale Heat Wave by Place, Demographic Group, Internal and External Causes of Death, and Building Climate Zone

    PubMed Central

    Joe, Lauren; Hoshiko, Sumi; Dobraca, Dina; Jackson, Rebecca; Smorodinsky, Svetlana; Smith, Daniel; Harnly, Martha

    2016-01-01

    Mortality increases during periods of elevated heat. Identification of vulnerable subgroups by demographics, causes of death, and geographic regions, including deaths occurring at home, is needed to inform public health prevention efforts. We calculated mortality relative risks (RRs) and excess deaths associated with a large-scale California heat wave in 2006, comparing deaths during the heat wave with reference days. For total (all-place) and at-home mortality, we examined risks by demographic factors, internal and external causes of death, and building climate zones. During the heat wave, 582 excess deaths occurred, a 5% increase over expected (RR = 1.05, 95% confidence interval (CI) 1.03–1.08). Sixty-six percent of excess deaths were at home (RR = 1.12, CI 1.07–1.16). Total mortality risk was higher among those aged 35–44 years than ≥65, and among Hispanics than whites. Deaths from external causes increased more sharply (RR = 1.18, CI 1.10–1.27) than from internal causes (RR = 1.04, CI 1.02–1.07). Geographically, risk varied by building climate zone; the highest risks of at-home death occurred in the northernmost coastal zone (RR = 1.58, CI 1.01–2.48) and the southernmost zone of California’s Central Valley (RR = 1.43, CI 1.21–1.68). Heat wave mortality risk varied across subpopulations, and some patterns of vulnerability differed from those previously identified. Public health efforts should also address at-home mortality, non-elderly adults, external causes, and at-risk geographic regions. PMID:27005646
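The relative risks reported above follow from simple count comparisons. A minimal sketch, using hypothetical observed and expected death totals chosen only to be consistent with the abstract's 5% increase and 582 excess deaths (the study's actual totals are not given here), with an approximate Poisson-based confidence interval:

```python
import math

# Hypothetical totals consistent with the abstract: a 5% increase
# corresponding to 582 excess deaths implies roughly these counts.
observed = 12222   # deaths during the heat wave (illustrative)
expected = 11640   # deaths expected from reference days (illustrative)

rr = observed / expected          # relative risk, here 1.05
excess = observed - expected      # excess deaths, here 582

# Approximate 95% CI on log(RR), treating both counts as Poisson.
se = math.sqrt(1 / observed + 1 / expected)
ci = (rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se))
```

With counts this large the interval is narrow, matching the pattern of the reported CI (1.03-1.08); subgroup RRs (e.g., at-home deaths, climate zones) come from the same computation restricted to each stratum.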

  19. Indirect effects of overfishing on Caribbean reefs: sponges overgrow reef-building corals

    PubMed Central

    Loh, Tse-Lynn; McMurray, Steven E.; Henkel, Timothy P.; Vicente, Jan

    2015-01-01

    Consumer-mediated indirect effects at the community level are difficult to demonstrate empirically. Here, we show an explicit indirect effect of overfishing on competition between sponges and reef-building corals from surveys of 69 sites across the Caribbean. Leveraging the large-scale, long-term removal of sponge predators, we selected overfished sites where intensive methods, primarily fish-trapping, have been employed for decades or more, and compared them to sites in remote or marine protected areas (MPAs) with variable levels of enforcement. Sponge-eating fishes (angelfishes and parrotfishes) were counted at each site, and the benthos surveyed, with coral colonies scored for interaction with sponges. Overfished sites had >3 fold more overgrowth of corals by sponges, and mean coral contact with sponges was 25.6%, compared with 12.0% at less-fished sites. Greater contact with corals by sponges at overfished sites was mostly by sponge species palatable to sponge predators. Palatable species have faster rates of growth or reproduction than defended sponge species, which instead make metabolically expensive chemical defenses. These results validate the top-down conceptual model of sponge community ecology for Caribbean reefs, as well as provide an unambiguous justification for MPAs to protect threatened reef-building corals. An unanticipated outcome of the benthic survey component of this study was that overfished sites had lower mean macroalgal cover (23.1% vs. 38.1% for less-fished sites), a result that is contrary to prevailing assumptions about seaweed control by herbivorous fishes. Because we did not quantify herbivores for this study, we interpret this result with caution, but suggest that additional large-scale studies comparing intensively overfished and MPA sites are warranted to examine the relative impacts of herbivorous fishes and urchins on Caribbean reefs. PMID:25945305

  20. A Facile Molten-Salt Route for Large-Scale Synthesis of NiFe2O4 Nanoplates with Enhanced Lithium Storage Capability.

    PubMed

    Huang, Gang; Du, Xinchuan; Zhang, Feifei; Yin, Dongming; Wang, Limin

    2015-09-28

Binary metal oxides have been deemed a promising class of electrode materials for high-performance lithium ion batteries owing to their higher conductivity and electrochemical activity than the corresponding monometal oxides. Here, NiFe2O4 nanoplates consisting of nanosized building blocks have been successfully fabricated by a facile, large-scale NaCl and KCl molten-salt route, and the changes in the morphology of NiFe2O4 as a function of the molten-salt amount have been systematically investigated. The results indicate that the molten-salt amount mainly influences the diameter and thickness of the NiFe2O4 nanoplates as well as the morphology of the nanosized building blocks. Cyclic voltammetry (CV) and galvanostatic charge-discharge measurements have been conducted to evaluate the lithium storage properties of the NiFe2O4 nanoplates prepared with a Ni(NO3)2/Fe(NO3)3/KCl/NaCl molar ratio of 1:2:20:60. A high reversible capacity of 888 mAh g(-1) is delivered over 100 cycles at a current density of 100 mA g(-1). Even at a current density of 5000 mA g(-1), the discharge capacity could still reach 173 mAh g(-1). Such excellent electrochemical performance of the NiFe2O4 nanoplates is attributed to the short Li(+) diffusion distance of the nanosized building blocks and the synergetic effect of the Ni(2+) and Fe(3+) ions. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Indirect effects of overfishing on Caribbean reefs: sponges overgrow reef-building corals.

    PubMed

    Loh, Tse-Lynn; McMurray, Steven E; Henkel, Timothy P; Vicente, Jan; Pawlik, Joseph R

    2015-01-01

    Consumer-mediated indirect effects at the community level are difficult to demonstrate empirically. Here, we show an explicit indirect effect of overfishing on competition between sponges and reef-building corals from surveys of 69 sites across the Caribbean. Leveraging the large-scale, long-term removal of sponge predators, we selected overfished sites where intensive methods, primarily fish-trapping, have been employed for decades or more, and compared them to sites in remote or marine protected areas (MPAs) with variable levels of enforcement. Sponge-eating fishes (angelfishes and parrotfishes) were counted at each site, and the benthos surveyed, with coral colonies scored for interaction with sponges. Overfished sites had >3 fold more overgrowth of corals by sponges, and mean coral contact with sponges was 25.6%, compared with 12.0% at less-fished sites. Greater contact with corals by sponges at overfished sites was mostly by sponge species palatable to sponge predators. Palatable species have faster rates of growth or reproduction than defended sponge species, which instead make metabolically expensive chemical defenses. These results validate the top-down conceptual model of sponge community ecology for Caribbean reefs, as well as provide an unambiguous justification for MPAs to protect threatened reef-building corals. An unanticipated outcome of the benthic survey component of this study was that overfished sites had lower mean macroalgal cover (23.1% vs. 38.1% for less-fished sites), a result that is contrary to prevailing assumptions about seaweed control by herbivorous fishes. Because we did not quantify herbivores for this study, we interpret this result with caution, but suggest that additional large-scale studies comparing intensively overfished and MPA sites are warranted to examine the relative impacts of herbivorous fishes and urchins on Caribbean reefs.

  2. Space and time scales in human-landscape systems.

    PubMed

    Kondolf, G Mathias; Podolak, Kristen

    2014-01-01

Exploring spatial and temporal scales provides a way to understand human alteration of landscape processes and human responses to these processes. We address three topics relevant to human-landscape systems: (1) scales of human impacts on geomorphic processes, (2) spatial and temporal scales in river restoration, and (3) time scales of natural disasters and behavioral and institutional responses. Studies showing dramatic recent change in sediment yields from uplands to the ocean via rivers illustrate the increasingly vast spatial extent and quick rate of human landscape change in the last two millennia, but especially in the second half of the twentieth century. Recent river restoration efforts are typically small in spatial and temporal scale compared to the historical human changes to ecosystem processes, but the cumulative effectiveness of multiple small restoration projects in achieving large ecosystem goals has yet to be demonstrated. The mismatch between infrequent natural disasters and individual risk perception, media coverage, and institutional response to natural disasters results in unpreparedness and unsustainable land use and building practices.

  3. A GLOBAL GALACTIC DYNAMO WITH A CORONA CONSTRAINED BY RELATIVE HELICITY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prasad, A.; Mangalam, A., E-mail: avijeet@iiap.res.in, E-mail: mangalam@iiap.res.in

We present a model for a global axisymmetric turbulent dynamo operating in a galaxy with a corona that treats the parameters of turbulence driven by supernovae and by magneto-rotational instability under a common formalism. The nonlinear quenching of the dynamo is alleviated by the inclusion of small-scale advective and diffusive magnetic helicity fluxes, which allow the gauge-invariant magnetic helicity to be transferred outside the disk and consequently to build up a corona during the course of dynamo action. The time-dependent dynamo equations are expressed in a separable form and solved through an eigenvector expansion constructed using the steady-state solutions of the dynamo equation. The parametric evolution of the dynamo solution allows us to estimate the final structure of the global magnetic field and the saturated value of the turbulence parameter α_m, even before solving the dynamical equations for evolution of magnetic fields in the disk and the corona, along with α-quenching. We then solve these equations simultaneously to study the saturation of the large-scale magnetic field, its dependence on the small-scale magnetic helicity fluxes, and the corresponding evolution of the force-free field in the corona. The quadrupolar large-scale magnetic field in the disk is found to reach equipartition strength within a timescale of 1 Gyr. The large-scale magnetic field in the corona obtained is much weaker than the field inside the disk and has only a weak impact on the dynamo operation.

  4. Large-scale Estimates of Leaf Area Index from Active Remote Sensing Laser Altimetry

    NASA Astrophysics Data System (ADS)

    Hopkinson, C.; Mahoney, C.

    2016-12-01

Leaf area index (LAI) is a key parameter that describes the spatial distribution of foliage within forest canopies, which in turn controls numerous relationships between the ground, canopy, and atmosphere. LAI retrieval has been demonstrated successfully with in-situ digital hemispherical photography (DHP) and airborne laser scanning (ALS) data; however, field and ALS acquisitions are often spatially limited (100's of km2) and costly. Large-scale (>1000's of km2) retrievals have been demonstrated by optical sensors; however, accuracies remain uncertain due to those sensors' inability to penetrate the canopy. The spaceborne Geoscience Laser Altimeter System (GLAS) provides a possible solution for retrieving large-scale derivations whilst simultaneously penetrating the canopy. LAI retrieved by multiple DHP from 6 Australian sites, representing a cross-section of Australian ecosystems, were employed to model ALS LAI, which in turn were used to infer LAI from GLAS data at 5 other sites. An optimally filtered GLAS dataset was then employed in conjunction with a host of supplementary data to build a Random Forest (RF) model to infer predictions (and uncertainties) of LAI at a 250 m resolution across the forested regions of Australia. Predictions were validated against ALS-based LAI from 20 sites (R2=0.64, RMSE=1.1 m2m-2); MODIS-based LAI were also assessed against these sites (R2=0.30, RMSE=1.78 m2m-2) to demonstrate the strength of GLAS-based predictions. The large-scale nature of the current predictions was also leveraged to demonstrate large-scale relationships of LAI with other environmental characteristics, such as canopy height, elevation, and slope. Such wide-scale quantification of LAI is key to the assessment and modification of forest management strategies across Australia. This work also assists Australia's Terrestrial Ecosystem Research Network in fulfilling its government-issued mandates.
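The R² and RMSE figures quoted for the validation are standard goodness-of-fit statistics over observed/predicted pairs. A minimal sketch with invented LAI pairs (not the study's data, which are not reproduced here):

```python
import math

# Hypothetical validation pairs: ALS-derived LAI (observed) versus
# GLAS/random-forest predictions, in m^2 m^-2.
observed  = [1.2, 2.5, 3.1, 4.0, 2.2, 3.6, 1.8, 2.9]
predicted = [1.5, 2.2, 3.4, 3.5, 2.6, 3.2, 2.1, 2.7]

n = len(observed)
mean_obs = sum(observed) / n
ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
ss_tot = sum((o - mean_obs) ** 2 for o in observed)

r2 = 1 - ss_res / ss_tot       # coefficient of determination
rmse = math.sqrt(ss_res / n)   # root-mean-square error, same units as LAI
```

Comparing two predictors against the same validation sites with these two statistics, as the abstract does for GLAS-based versus MODIS-based LAI, makes the accuracy gap directly interpretable in LAI units.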

  5. Review of the Need for a Large-scale Test Facility for Research on the Effects of Extreme Winds on Structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. G. Little

    1999-03-01

The Idaho National Engineering and Environmental Laboratory (INEEL), through the US Department of Energy (DOE), has proposed that a large-scale wind test facility (LSWTF) be constructed to study, in full-scale, the behavior of low-rise structures under simulated extreme wind conditions. To determine the need for, and potential benefits of, such a facility, the Idaho Operations Office of the DOE requested that the National Research Council (NRC) perform an independent assessment of the role and potential value of an LSWTF in the overall context of wind engineering research. The NRC established the Committee to Review the Need for a Large-scale Test Facility for Research on the Effects of Extreme Winds on Structures, under the auspices of the Board on Infrastructure and the Constructed Environment, to perform this assessment. This report conveys the results of the committee's deliberations as well as its findings and recommendations. Data developed at large scale would enhance the understanding of how structures, particularly light-frame structures, are affected by extreme winds (e.g., hurricanes, tornadoes, severe thunderstorms, and other events). With a large-scale wind test facility, full-sized structures, such as site-built or manufactured housing and small commercial or industrial buildings, could be tested under a range of wind conditions in a controlled, repeatable environment. At this time, the US has no facility specifically constructed for this purpose. During the course of this study, the committee was confronted by three difficult questions: (1) does the lack of a facility equate to a need for the facility? (2) is need alone sufficient justification for the construction of a facility? and (3) would the benefits derived from information produced in an LSWTF justify the costs of producing that information? The committee's evaluation of the need and justification for an LSWTF was shaped by these realities.

  6. Rocket University at KSC

    NASA Technical Reports Server (NTRS)

    Sullivan, Steven J.

    2014-01-01

    "Rocket University" is an exciting new initiative at Kennedy Space Center led by NASA's Engineering and Technology Directorate. This hands-on experience has been established to develop, refine & maintain targeted flight engineering skills to enable the Agency and KSC strategic goals. Through "RocketU", KSC is developing a nimble, rapid flight engineering life cycle systems knowledge base. Ongoing activities in RocketU develop and test new technologies and potential customer systems through small scale vehicles, build and maintain flight experience through balloon and small-scale rocket missions, and enable a revolving fresh perspective of engineers with hands on expertise back into the large scale NASA programs, providing a more experienced multi-disciplined set of systems engineers. This overview will define the Program, highlight aspects of the training curriculum, and identify recent accomplishments and activities.

  7. Industrial technology for the economic and viable encapsulation for large-scale solar panels (technologie industrielle d'encapsulation economique et fiable pour panneaux solaires de grandes dimensions). Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anguet, J.; Salles, Y.

The aim of the work is to apply the laminated glass technology used in buildings and car windscreens to the encapsulation of solar panels so as to form a glass-polyvinylbutyral-glass 'sandwich'. Based on small-scale experimental panels, the following studies were made: (1) adhesion techniques; (2) structure studies to find the most suitable means for maintaining the mechanical stability of the cells; (3) types of connections for the solar panels and (4) climatic tests and humidity resistance. Mechanical and climatic tests with the mini-modules gave encouraging results, whereupon larger scale models were designed. The results obtained with these confirmed those obtained with the mini-modules.

  8. A New Method of Building Scale-Model Houses

    Treesearch

    Richard N. Malcolm

    1978-01-01

    Scale-model houses are used to display new architectural and construction designs. Some scale-model houses will not withstand the abuse of shipping and handling. This report describes how to build a solid-core model house which is rigid, lightweight, and sturdy.

  9. DNA-encoded chemistry: enabling the deeper sampling of chemical space.

    PubMed

    Goodnow, Robert A; Dumelin, Christoph E; Keefe, Anthony D

    2017-02-01

    DNA-encoded chemical library technologies are increasingly being adopted in drug discovery for hit and lead generation. DNA-encoded chemistry enables the exploration of chemical spaces four to five orders of magnitude more deeply than is achievable by traditional high-throughput screening methods. Operation of this technology requires developing a range of capabilities including aqueous synthetic chemistry, building block acquisition, oligonucleotide conjugation, large-scale molecular biological transformations, selection methodologies, PCR, sequencing, sequence data analysis and the analysis of large chemistry spaces. This Review provides an overview of the development and applications of DNA-encoded chemistry, highlighting the challenges and future directions for the use of this technology.

  10. Assessment of the State-of-the-Art in the Design and Manufacturing of Large Composite Structure

    NASA Technical Reports Server (NTRS)

    Harris, C. E.

    2001-01-01

    This viewgraph presentation gives an assessment of the state-of-the-art in the design and manufacturing of large composite structures, including details on the use of continuous fiber reinforced polymer matrix composites (CFRP) in commercial and military aircraft and in space launch vehicles. Project risk mitigation plans must include a building-block test approach to structural design development, manufacturing process scale-up development tests, and pre-flight ground tests to verify structural integrity. The potential benefits of composite structures justify NASA's investment in developing the technology. Advanced composite structures technology is an enabler of virtually every Aero-Space Technology Enterprise Goal.

  11. The social welfare function and individual responsibility: some theoretical issues and empirical evidence.

    PubMed

    Dolan, Paul; Tsuchiya, Aki

    2009-01-01

    The literature on income distribution has attempted to evaluate different degrees of inequality using a social welfare function (SWF) approach. However, it has largely ignored the source of such inequalities, and has thus failed to consider different degrees of inequity. The literature on egalitarianism has addressed issues of equity, largely in relation to individual responsibility. This paper builds upon these two literatures, and introduces individual responsibility into the SWF. Results from a small-scale study of people's preferences in relation to the distribution of health benefits are presented to illustrate how the parameter values of a SWF might be determined.

  12. The Hyper Suprime-Cam software pipeline

    NASA Astrophysics Data System (ADS)

    Bosch, James; Armstrong, Robert; Bickerton, Steven; Furusawa, Hisanori; Ikeda, Hiroyuki; Koike, Michitaro; Lupton, Robert; Mineo, Sogo; Price, Paul; Takata, Tadafumi; Tanaka, Masayuki; Yasuda, Naoki; AlSayyad, Yusra; Becker, Andrew C.; Coulton, William; Coupon, Jean; Garmilla, Jose; Huang, Song; Krughoff, K. Simon; Lang, Dustin; Leauthaud, Alexie; Lim, Kian-Tat; Lust, Nate B.; MacArthur, Lauren A.; Mandelbaum, Rachel; Miyatake, Hironao; Miyazaki, Satoshi; Murata, Ryoma; More, Surhud; Okura, Yuki; Owen, Russell; Swinbank, John D.; Strauss, Michael A.; Yamada, Yoshihiko; Yamanoi, Hitomi

    2018-01-01

    In this paper, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.

  13. Cross-Scale Molecular Analysis of Chemical Heterogeneity in Shale Rocks

    DOE PAGES

    Hao, Zhao; Bechtel, Hans A.; Kneafsey, Timothy; ...

    2018-02-07

    The organic and mineralogical heterogeneity in shale at micrometer and nanometer spatial scales contributes to the quality of gas reserves, gas flow mechanisms and gas production. Here, we demonstrate two molecular imaging approaches based on infrared spectroscopy to obtain mineral and kerogen information at these mesoscale spatial resolutions in large-sized shale rock samples. The first method is a modified microscopic attenuated total reflectance measurement that utilizes a large germanium hemisphere combined with a focal plane array detector to rapidly capture chemical images of shale rock surfaces spanning hundreds of micrometers with micrometer spatial resolution. The second method, synchrotron infrared nano-spectroscopy, utilizes a metallic atomic force microscope tip to obtain chemical images of micrometer dimensions but with nanometer spatial resolution. This chemically "deconvoluted" imaging at the nano-pore scale is then used to build a machine learning model to generate a molecular distribution map across scales with a spatial span of 1000 times, which enables high-throughput geochemical characterization in greater detail across the nano-pore and micro-grain scales and allows us to identify co-localization of mineral phases with chemically distinct organics and even with gas-phase sorbents. Finally, this characterization is fundamental to understanding mineral and organic compositions affecting the behavior of shales.
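
The cross-scale step described above (train a model on co-registered coarse and fine measurements, then apply it across the full field of view) can be sketched with a toy linear model. The band count, weights, and the linear form itself are illustrative assumptions, not the authors' actual model:

```python
import numpy as np

# Toy stand-in for the cross-scale step: learn a map from micrometer-scale
# spectral features to a nanoscale-derived kerogen fraction, then apply it
# to every pixel of a large map. Data, band count, and model are illustrative.
rng = np.random.default_rng(1)
n_train, n_bands = 50, 6
X = rng.random((n_train, n_bands))          # coarse spectral features (training pixels)
true_w = np.array([0.5, 0.0, 0.3, 0.0, 0.2, 0.0])
y = X @ true_w                              # nanoscale "ground truth" at the same pixels

w, *_ = np.linalg.lstsq(X, y, rcond=None)   # fit the cross-scale model

X_map = rng.random((1000, n_bands))         # coarse features for the full field of view
kerogen_map = X_map @ w                     # predicted molecular distribution map
print(kerogen_map.shape)                    # (1000,)
```

Because the synthetic labels are noise-free and the design matrix has full column rank, least squares recovers the generating weights exactly; a real workflow would use a nonlinear learner with validation against held-out nanoscale measurements.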

  15. Obscuring and Feeding Supermassive Black Holes with Evolving Nuclear Star Clusters

    NASA Astrophysics Data System (ADS)

    Schartmann, M.; Burkert, A.; Krause, M.; Camenzind, M.; Meisenheimer, K.; Davies, R. I.

    2010-05-01

    Recently, high-resolution observations made with the help of the near-infrared adaptive optics integral field spectrograph SINFONI at the VLT proved the existence of massive and young nuclear star clusters in the centers of a sample of Seyfert galaxies. With the help of high-resolution hydrodynamical simulations with the pluto code, we follow the evolution of such clusters, especially focusing on mass and energy feedback from young stars. This leads to a filamentary inflow of gas on large scales (tens of parsecs), whereas a turbulent and very dense disk builds up on the parsec scale. Here we concentrate on the long-term evolution of the nuclear disk in NGC 1068 with the help of an effective viscous disk model, using the mass input from the large-scale simulations and accounting for star formation in the disk. This two-stage modeling enables us to connect the tens-of-parsecs scale region (observable with SINFONI) with the parsec-scale environment (MIDI observations). At the current age of the nuclear star cluster, our simulations predict disk sizes of the order 0.8 to 0.9 pc, gas masses of order 10^6 M⊙, and mass transfer rates through the inner boundary of order 0.025 M⊙ yr^-1, in good agreement with values derived from observations.

  16. Classification of event location using matched filters via on-floor accelerometers

    NASA Astrophysics Data System (ADS)

    Woolard, Americo G.; Malladi, V. V. N. Sriram; Alajlouni, Sa'ed; Tarazaga, Pablo A.

    2017-04-01

    Recent years have shown prolific advancements in smart infrastructures, allowing buildings of the modern world to interact with their occupants. One of the sought-after attributes of smart buildings is the ability to provide unobtrusive, indoor localization of occupants. The ability to locate occupants indoors can provide a broad range of benefits in areas such as security, emergency response, and resource management. Recent research has shown promising results in occupant building localization, although there is still significant room for improvement. This study presents a passive, small-scale localization system using accelerometers placed around the edges of a small area in an active building environment. The area is discretized into a grid of small squares, and vibration measurements are processed using a pattern matching approach that estimates the location of the source. Vibration measurements are produced with ball-drops, hammer-strikes, and footsteps as the sources of the floor excitation. The developed approach uses matched filters based on a reference data set, and the location is classified using a nearest-neighbor search. This approach detects the appropriate location of impact-like sources, i.e., the ball-drops and hammer-strikes, with 100% accuracy. However, this accuracy reduces to 56% for footsteps, with the average localization results being within 0.6 m (α = 0.05) of the true source location. While requiring a reference data set can make this method difficult to implement on a large scale, it may be used to provide accurate localization abilities in areas where training data is readily obtainable. This exploratory work seeks to examine the feasibility of the matched filter and nearest-neighbor search approach for footstep and event localization in a small, instrumented area within a multi-story building.
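
The matched-filter and nearest-neighbor scheme can be sketched as follows; the templates, grid labels, and signal shapes are hypothetical stand-ins for the paper's reference data set, and the score here is a zero-lag normalized correlation:

```python
import numpy as np

def matched_filter_score(signal, template):
    """Zero-lag normalized correlation between a measurement and a template."""
    s = signal - signal.mean()
    t = template - template.mean()
    s /= np.linalg.norm(s) + 1e-12
    t /= np.linalg.norm(t) + 1e-12
    return float(np.dot(s, t))

def classify_location(signal, reference_set):
    """Nearest-neighbor search: return the grid label whose template matches best."""
    return max(reference_set, key=lambda lbl: matched_filter_score(signal, reference_set[lbl]))

# Hypothetical reference set: one impulse-like template per grid square,
# each arriving at a different delay.
n = 256
refs = {}
for i, delay in enumerate((40, 120, 200)):
    tpl = np.zeros(n)
    tpl[delay:delay + 20] = np.hanning(20)
    refs[f"square_{i}"] = tpl

# A noisy measurement whose true source is square_1.
rng = np.random.default_rng(0)
measurement = refs["square_1"] + 0.1 * rng.standard_normal(n)
print(classify_location(measurement, refs))  # square_1
```

Real footsteps vary far more than these synthetic impulses, which is consistent with the paper's drop in accuracy from impact-like sources to footsteps.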

  17. Fine-scale flight strategies of gulls in urban airflows indicate risk and reward in city living

    PubMed Central

    Shepard, Emily L. C.

    2016-01-01

    Birds modulate their flight paths in relation to regional and global airflows in order to reduce their travel costs. Birds should also respond to fine-scale airflows, although the incidence and value of this remains largely unknown. We resolved the three-dimensional trajectories of gulls flying along a built-up coastline, and used computational fluid dynamic models to examine how gulls reacted to airflows around buildings. Birds systematically altered their flight trajectories with wind conditions to exploit updraughts over features as small as a row of low-rise buildings. This provides the first evidence that human activities can change patterns of space-use in flying birds by altering the profitability of the airscape. At finer scales still, gulls varied their position to select a narrow range of updraught values, rather than exploiting the strongest updraughts available, and their precise positions were consistent with a strategy to increase their velocity control in gusty conditions. Ultimately, strategies such as these could help unmanned aerial vehicles negotiate complex airflows. Overall, airflows around fine-scale features have profound implications for flight control and energy use, and consideration of this could lead to a paradigm-shift in the way ecologists view the urban environment. This article is part of the themed issue ‘Moving in a moving medium: new perspectives on flight’. PMID:27528784

  18. Scout: An Impact Analysis Tool for Building Energy-Efficiency Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, Chioke; Langevin, Jared; Roth, Amir

    Evaluating the national impacts of candidate U.S. building energy-efficiency technologies has historically been difficult for organizations with large energy efficiency portfolios. In particular, normalizing results from technology-specific impact studies is time-consuming when those studies do not use comparable assumptions about the underlying building stock. To equitably evaluate its technology research, development, and deployment portfolio, the U.S. Department of Energy's Building Technologies Office has developed Scout, a software tool that quantitatively assesses the energy and CO2 impacts of building energy-efficiency measures on the national building stock. Scout efficiency measures improve upon the unit performance and/or lifetime operational costs of an equipment stock baseline that is determined from the U.S. Energy Information Administration Annual Energy Outlook (AEO). Scout measures are characterized by a market entry and exit year, unit performance level, cost, and lifetime. To evaluate measures on a consistent basis, Scout uses EnergyPlus simulation on prototype building models to translate measure performance specifications to whole-building energy savings; these savings impacts are then extended to a national scale using floor area weighting factors. Scout represents evolution in the building stock over time using AEO projections for new construction, retrofit, and equipment replacements, and competes technologies within market segments under multiple adoption scenarios. Scout and its efficiency measures are open-source, as is the EnergyPlus whole building simulation framework that is used to evaluate measure performance. The program is currently under active development and will be formally released once an initial set of measures has been analyzed and reviewed.
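
Conceptually, the final scaling step (per-unit-area simulated savings extended to the national stock with floor-area weights) is a weighted sum. A minimal sketch with made-up building types and figures, not AEO data or Scout's actual accounting:

```python
# Scale prototype-building savings (kWh per m2) to a national estimate using
# floor-area weights, as Scout does conceptually. All figures are illustrative.
savings_per_m2 = {"office": 12.0, "retail": 8.0, "school": 10.0}   # kWh/m2/yr from simulation
national_floor_area_m2 = {"office": 1.5e9, "retail": 1.0e9, "school": 0.5e9}

national_savings_kwh = sum(
    savings_per_m2[b] * national_floor_area_m2[b] for b in savings_per_m2
)
print(f"{national_savings_kwh / 1e9:.1f} TWh/yr")  # 31.0 TWh/yr
```

The real tool layers market-entry years, equipment turnover, and adoption scenarios on top of this sum; the weighting step itself is no more than the product above.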

  19. Energy Constraints for Building Large-Scale Systems

    DTIC Science & Technology

    2016-03-17

    Large-scale systems are power (and energy) constrained in their communication [1]. The human cortex consumes about 20W of power, of which only a fraction (<25%) is devoted to communication. Neurobiological systems use a similar approach in the fact that over 90% of neurons in cortex project locally to nearby neurons (i.e., the nearest 1000 pyramidal neurons).

  20. Photovoltaic-Thermal New Technology Demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dean, Jesse; McNutt, Peter; Lisell, Lars

    Photovoltaic-thermal (PV-T) hybrid solar systems offer increased electricity production by cooling the PV panel, and using the removed thermal energy to heat water - all in the same footprint as a standard PV system. GPG's assessment of the nation's first large-scale PV-T system installed at the Thomas P. O'Neill, Jr. Federal Building in Boston, MA, provided numerous lessons learned in system design, and identified a target market of locations with high utility costs and electric hot water backup.

  1. Public health nutrition capacity: assuring the quality of workforce preparation for scaling up nutrition programmes.

    PubMed

    Shrimpton, Roger; du Plessis, Lisanne M; Delisle, Hélène; Blaney, Sonia; Atwood, Stephen J; Sanders, David; Margetts, Barrie; Hughes, Roger

    2016-08-01

    To describe why and how capacity-building systems for scaling up nutrition programmes should be constructed in low- and middle-income countries (LMIC). Position paper with task force recommendations based on literature review and joint experience of global nutrition programmes, public health nutrition (PHN) workforce size, organization, and pre-service and in-service training. The review is global but the recommendations are made for LMIC scaling up multisectoral nutrition programmes. The multitude of PHN workers, be they in the health, agriculture, education, social welfare, or water and sanitation sector, as well as the community workers who ensure outreach and coverage of nutrition-specific and -sensitive interventions. Overnutrition and undernutrition problems affect at least half of the global population, especially those in LMIC. Programme guidance exists for undernutrition and overnutrition, and priority for scaling up multisectoral programmes for tackling undernutrition in LMIC is growing. Guidance on how to organize and scale up such programmes is scarce however, and estimates of existing PHN workforce numbers - although poor - suggest they are also inadequate. Pre-service nutrition training for a PHN workforce is mostly clinical and/or food science oriented and in-service nutrition training is largely restricted to infant and young child nutrition. Unless increased priority and funding is given to building capacity for scaling up nutrition programmes in LMIC, maternal and child undernutrition rates are likely to remain high and nutrition-related non-communicable diseases to escalate. A hybrid distance learning model for PHN workforce managers' in-service training is urgently needed in LMIC.

  2. Compiling a national resistivity atlas of Denmark based on airborne and ground-based transient electromagnetic data

    NASA Astrophysics Data System (ADS)

    Barfod, Adrian A. S.; Møller, Ingelise; Christiansen, Anders V.

    2016-11-01

    We present a large-scale study of the petrophysical relationship of resistivities obtained from densely sampled ground-based and airborne transient electromagnetic surveys and lithological information from boreholes. The overriding aim of this study is to develop a framework for examining the resistivity-lithology relationship in a statistical manner and apply this framework to gain a better description of the large-scale resistivity structures of the subsurface. In Denmark very large and extensive datasets are available through the national geophysical and borehole databases, GERDA and JUPITER respectively. In a 10 by 10 km grid, these data are compiled into histograms of resistivity versus lithology. To do this, the geophysical data are interpolated to the position of the boreholes, which allows for a lithological categorization of the interpolated resistivity values, yielding different histograms for a set of desired lithological categories. By applying the proposed algorithm to all available boreholes and airborne and ground-based transient electromagnetic data we build nation-wide maps of the resistivity-lithology relationships in Denmark. The presented Resistivity Atlas reveals varying patterns in the large-scale resistivity-lithology relations, reflecting geological details such as available source material for tills. The resistivity maps also reveal a clear ambiguity in the resistivity values for different lithologies. The Resistivity Atlas is highly useful when geophysical data are to be used for geological or hydrological modeling.
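
The compilation described above (interpolate gridded resistivity to each borehole position, then bin the values by the borehole's lithology) can be sketched like this; the inverse-distance interpolation, lithology labels, and resistivity values are illustrative assumptions rather than the actual GERDA/JUPITER processing:

```python
import numpy as np

def idw_interpolate(xy_obs, values, xy_target, power=2.0):
    """Inverse-distance-weighted interpolation of resistivity to one borehole position."""
    d = np.linalg.norm(xy_obs - xy_target, axis=1)
    if np.any(d < 1e-9):                       # borehole coincides with a sounding
        return values[np.argmin(d)]
    w = 1.0 / d**power
    return np.sum(w * values) / np.sum(w)

def resistivity_histograms(boreholes, xy_obs, rho_obs, bins):
    """One resistivity histogram per lithological category."""
    samples = {}
    for xy, lith in boreholes:
        rho = idw_interpolate(xy_obs, rho_obs, np.asarray(xy))
        samples.setdefault(lith, []).append(rho)
    return {lith: np.histogram(v, bins=bins)[0] for lith, v in samples.items()}

# Hypothetical survey: clay-dominated area has low resistivity, sand high.
xy_obs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0], [6.0, 5.0]])
rho_obs = np.array([20.0, 25.0, 22.0, 150.0, 140.0])    # ohm-m
boreholes = [((0.5, 0.5), "clay"), ((5.5, 5.0), "sand")]
bins = [0, 50, 100, 200]                                # ohm-m bin edges
print(resistivity_histograms(boreholes, xy_obs, rho_obs, bins))
```

Aggregating such histograms per 10 by 10 km cell, as the paper does nation-wide, yields the resistivity-lithology distributions that make up the atlas.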

  3. Newspaper coverage of controversies about large-scale swine facilities in rural communities in Illinois.

    PubMed

    Reisner, A E

    2005-11-01

    The building and expansion of large-scale swine facilities has created considerable controversy in many neighboring communities, but to date, no systematic analysis has been done of the types of claims made during these conflicts. This study examined how local newspapers in one state covered the transition from the dominance of smaller, diversified swine operations to large, single-purpose pig production facilities. To look at publicly made statements concerning large-scale swine facilities (LSSF), the study collected all articles related to LSSF from 22 daily Illinois newspapers over a 3-yr period (a total of 1,737 articles). The most frequent sets of claims used by proponents of LSSF were that the environment was not harmed, that state regulations were sufficiently strict, and that the state economically needed this type of agriculture. The most frequent claims made by opponents were that LSSF harmed the environment and neighboring communities and that stricter regulations were needed. Proponents' claims were primarily defensive and, to some degree, underplayed the advantages of LSSF. Pro- and anti-LSSF groups were talking at cross-purposes, to some degree. Even across similar themes, those in favor of LSSF and those opposed were addressing different sets of concerns. The newspaper claims did not indicate any effective alliances forming between local anti-LSSF groups and national environmental or animal rights groups.

  4. LARGE BUILDING RADON MANUAL

    EPA Science Inventory

    The report summarizes information on how building systems -- especially the heating, ventilating, and air-conditioning (HVAC) system -- influence radon entry into large buildings and can be used to mitigate radon problems. It addresses the fundamentals of large building HVAC systems...

  5. Changing vessel routes could significantly reduce the cost of future offshore wind projects.

    PubMed

    Samoteskul, Kateryna; Firestone, Jeremy; Corbett, James; Callahan, John

    2014-08-01

    With the recent emphasis on offshore wind energy, Coastal and Marine Spatial Planning (CMSP) has become one of the main frameworks used to plan and manage the increasingly complex web of ocean and coastal uses. As wind development becomes more prevalent, existing users of the ocean space, such as commercial shippers, will be compelled to share their historically open-access waters with these projects. Here, we demonstrate the utility of using cost-effectiveness analysis (CEA) to support siting decisions within a CMSP framework. In this study, we assume that large-scale offshore wind development will take place in the US Mid-Atlantic within the next decades. We then evaluate whether building projects nearshore or far from shore would be more cost-effective. Building projects nearshore is assumed to require rerouting of the commercial vessel traffic traveling between the US Mid-Atlantic ports by an average of 18.5 km per trip. We focus on less than 1500 transits by large deep-draft vessels. We estimate that over 29 years of the study, commercial shippers would incur an additional $0.2 billion (in 2012$) in direct and indirect costs. Building wind projects closer to shore where vessels used to transit would generate approximately $13.4 billion (in 2012$) in savings. Considering the large cost savings, modifying areas where vessels transit needs to be included in the portfolio of policies used to support the growth of the offshore wind industry in the US. Copyright © 2014 Elsevier Ltd. All rights reserved.
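
The structure of the rerouting cost is a simple product of extra distance, traffic volume, study horizon, and unit cost. A back-of-envelope sketch; the annual trip count and the per-kilometre vessel cost below are assumptions, not the paper's CEA inputs:

```python
# Back-of-envelope structure of the rerouting cost: extra distance per trip,
# times traffic, times study horizon, times a unit steaming cost.
extra_km_per_trip = 18.5   # average rerouting detour reported in the study
trips_per_year = 1500      # assumed: reading "less than 1500 transits" as an annual upper bound
years = 29                 # study horizon
cost_per_km = 250.0        # hypothetical $/km for a large deep-draft vessel

total_cost = extra_km_per_trip * trips_per_year * years * cost_per_km
print(f"${total_cost / 1e9:.2f} billion")  # $0.20 billion
```

With these assumed unit costs the product lands near the study's $0.2 billion figure, but that is a consequence of the chosen inputs, not a reproduction of the paper's detailed direct- and indirect-cost analysis.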

  6. Neuromorphic Hardware Architecture Using the Neural Engineering Framework for Pattern Recognition.

    PubMed

    Wang, Runchun; Thakur, Chetan Singh; Cohen, Gregory; Hamilton, Tara Julia; Tapson, Jonathan; van Schaik, Andre

    2017-06-01

    We present a hardware architecture that uses the neural engineering framework (NEF) to implement large-scale neural networks on field programmable gate arrays (FPGAs) for performing massively parallel real-time pattern recognition. NEF is a framework that is capable of synthesising large-scale cognitive systems from subnetworks and we have previously presented an FPGA implementation of the NEF that successfully performs nonlinear mathematical computations. That work was developed based on a compact digital neural core, which consists of 64 neurons that are instantiated by a single physical neuron using a time-multiplexing approach. We have now scaled this approach up to build a pattern recognition system by combining identical neural cores together. As a proof of concept, we have developed a handwritten digit recognition system using the MNIST database and achieved a recognition rate of 96.55%. The system is implemented on a state-of-the-art FPGA and can process 5.12 million digits per second. The architecture and hardware optimisations presented offer high-speed and resource-efficient means for performing high-speed, neuromorphic, and massively parallel pattern recognition and classification tasks.
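
The time-multiplexing idea (one physical neuron circuit sequentially updating 64 virtual neuron states held in memory) can be illustrated in software with a toy leaky integrate-and-fire update; the neuron model and parameters here are illustrative, not the paper's NEF implementation:

```python
import numpy as np

def time_multiplexed_step(v, inputs, leak=0.9, threshold=1.0):
    """One pass of a single update rule over 64 virtual neuron states.

    On the FPGA, one physical neuron circuit iterates over a state memory;
    here the Python loop plays that role. Toy leaky integrate-and-fire
    dynamics, not the NEF neuron model.
    """
    spikes = np.zeros(v.size, dtype=bool)
    for i in range(v.size):              # sequential reuse of one update unit
        v[i] = leak * v[i] + inputs[i]
        if v[i] >= threshold:
            spikes[i] = True
            v[i] = 0.0                   # reset after a spike
    return v, spikes

v = np.zeros(64)                         # state memory for 64 virtual neurons
inputs = np.zeros(64)
inputs[::8] = 0.5                        # drive every 8th virtual neuron
for _ in range(3):
    v, spikes = time_multiplexed_step(v, inputs)
print(int(spikes.sum()), "virtual neurons spiked on the final pass")  # 8
```

Combining many such cores, each multiplexing its own block of virtual neurons, is the scaling strategy the paper uses to reach network sizes suitable for MNIST-scale pattern recognition.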

  7. Barriers to Building Energy Efficiency (BEE) promotion: A transaction costs perspective

    NASA Astrophysics Data System (ADS)

    Qian Kun, Queena

    Worldwide, buildings account for a surprisingly high 40% of global energy consumption, and the resulting carbon footprint significantly exceeds that of all forms of transportation combined. Large and attractive opportunities exist to reduce buildings' energy use at lower costs and higher returns than in other sectors. This thesis analyzes the concerns of the market stakeholders, mainly real estate developers and end-users, in terms of transaction costs as they make decisions about investing in Building Energy Efficiency (BEE). It provides a detailed analysis of the current situation and future prospects for BEE adoption by the market's stakeholders. It delineates the market and lays out the economic and institutional barriers to the large-scale deployment of energy-efficient building techniques. The aim of this research is to investigate the barriers raised by transaction costs that hinder market stakeholders from investing in BEE. It explains interactions among stakeholders in general and in the specific case of Hong Kong as they consider transaction costs. It focuses on the influence of transaction costs on the decision-making of the stakeholders during the entire process of real estate development. The objectives are: 1) To establish an analytical framework for understanding the barriers to BEE investment with consideration of transaction costs; 2) To build a theoretical game model of decision making among the BEE market stakeholders; 3) To study the empirical data from questionnaire surveys of building designers and from focused interviews with real estate developers in Hong Kong; 4) To triangulate the study's empirical findings with those of the theoretical model and analytical framework. The study shows that a coherent institutional framework needs to be established to ensure that the design and implementation of BEE policies acknowledge the concerns of market stakeholders by taking transaction costs into consideration.
Regulatory and incentive options should be integrated into BEE policies to minimize efficiency gaps and to realize a sizeable increase in the number of energy-efficient buildings in the next decades. Specifically, the analysis shows that a thorough understanding of the transaction costs borne by particular stakeholders could improve the energy efficiency of buildings, even without improvements in currently available technology.

  8. Large-Scale Spatial Distribution Patterns of Gastropod Assemblages in Rocky Shores

    PubMed Central

    Miloslavich, Patricia; Cruz-Motta, Juan José; Klein, Eduardo; Iken, Katrin; Weinberger, Vanessa; Konar, Brenda; Trott, Tom; Pohle, Gerhard; Bigatti, Gregorio; Benedetti-Cecchi, Lisandro; Shirayama, Yoshihisa; Mead, Angela; Palomo, Gabriela; Ortiz, Manuel; Gobin, Judith; Sardi, Adriana; Díaz, Juan Manuel; Knowlton, Ann; Wong, Melisa; Peralta, Ana C.

    2013-01-01

    Gastropod assemblages from nearshore rocky habitats were studied over large spatial scales to (1) describe broad-scale patterns in assemblage composition, including patterns by feeding modes, (2) identify latitudinal pattern of biodiversity, i.e., richness and abundance of gastropods and/or regional hotspots, and (3) identify potential environmental and anthropogenic drivers of these assemblages. Gastropods were sampled from 45 sites distributed within 12 Large Marine Ecosystem regions (LME) following the NaGISA (Natural Geography in Shore Areas) standard protocol (www.nagisa.coml.org). A total of 393 gastropod taxa from 87 families were collected. Eight of these families (9.2%) appeared in four or more different LMEs. Among these, the Littorinidae was the most widely distributed (8 LMEs) followed by the Trochidae and the Columbellidae (6 LMEs). In all regions, assemblages were dominated by few species, the most diverse and abundant of which were herbivores. No latitudinal gradients were evident in relation to species richness or densities among sampling sites. Highest diversity was found in the Mediterranean and in the Gulf of Alaska, while highest densities were found at different latitudes and represented by few species within one genus (e.g. Afrolittorina in the Agulhas Current, Littorina in the Scotian Shelf, and Lacuna in the Gulf of Alaska). No significant correlation was found between species composition and environmental variables (r≤0.355, p>0.05). Contributing variables to this low correlation included invasive species, inorganic pollution, SST anomalies, and chlorophyll-a anomalies. Despite data limitations in this study which restrict conclusions in a global context, this work represents the first effort to sample gastropod biodiversity on rocky shores using a standardized protocol across a wide scale. Our results will generate more work to build global databases allowing for large-scale diversity comparisons of rocky intertidal assemblages. 
PMID:23967204

  9. Multidimensional model to assess the readiness of Saudi Arabia to implement evidence based child maltreatment prevention programs at a large scale.

    PubMed

    Almuneef, Maha A; Qayad, Mohamed; Noor, Ismail K; Al-Eissa, Majid A; Albuhairan, Fadia S; Inam, Sarah; Mikton, Christopher

    2014-03-01

    There has been increased awareness of child maltreatment in Saudi Arabia recently. This study assessed the readiness for implementing large-scale evidence-based child maltreatment prevention programs in Saudi Arabia. Key informants, who were key decision makers and senior managers in the field of child maltreatment, were invited to participate in the study. A multidimensional tool, developed by WHO and collaborators from several middle- and low-income countries, was used to assess 10 dimensions of readiness. A group of experts also gave an objective assessment of the 10 dimensions, and key informants' and experts' scores were compared. On a scale of 100, the key informants gave a readiness score of 43% for Saudi Arabia to implement large-scale, evidence-based CM prevention programs, and experts gave an overall readiness score of 40%. Both the key informants and experts agreed that four of the dimensions (attitudes toward child maltreatment prevention, institutional links and resources, material resources, and human and technical resources) each had low readiness scores (<5) and that three dimensions (knowledge of child maltreatment prevention, scientific data on child maltreatment prevention, and will to address the child maltreatment problem) each had high readiness scores (≥5). There was significant disagreement between key informants and experts on the remaining three dimensions. Overall, Saudi Arabia has a moderate/fair readiness to implement large-scale child maltreatment prevention programs. Capacity building; strengthening of material resources; and improving institutional links, collaborations, and attitudes toward the child maltreatment problem are required to improve the country's readiness to implement such programs. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. A design-build-test cycle using modeling and experiments reveals interdependencies between upper glycolysis and xylose uptake in recombinant S. cerevisiae and improves predictive capabilities of large-scale kinetic models.

    PubMed

    Miskovic, Ljubisa; Alff-Tuomala, Susanne; Soh, Keng Cher; Barth, Dorothee; Salusjärvi, Laura; Pitkänen, Juha-Pekka; Ruohonen, Laura; Penttilä, Merja; Hatzimanikatis, Vassily

    2017-01-01

    Recent advancements in omics measurement technologies have led to an ever-increasing amount of available experimental data that necessitate systems-oriented methodologies for efficient and systematic integration of data into consistent large-scale kinetic models. These models can help us to uncover new insights into cellular physiology and also to assist in the rational design of bioreactor or fermentation processes. The Optimization and Risk Analysis of Complex Living Entities (ORACLE) framework for the construction of large-scale kinetic models can be used as guidance for formulating alternative metabolic engineering strategies. We used ORACLE in a metabolic engineering problem: improvement of the xylose uptake rate during mixed glucose-xylose consumption in a recombinant Saccharomyces cerevisiae strain. Using the data from bioreactor fermentations, we characterized network flux and concentration profiles representing possible physiological states of the analyzed strain. We then identified enzymes that could lead to improved flux through the xylose transporters (XTR). For some of the identified enzymes, including hexokinase (HXK), we could not deduce whether their control over XTR was positive or negative. We thus performed a follow-up experiment, and we found that HXK2 deletion improves the xylose uptake rate. The data from the performed experiments were then used to prune the kinetic models, and the predictions of the pruned population of kinetic models were in agreement with the experimental data collected on the HXK2-deficient S. cerevisiae strain. We present a design-build-test cycle composed of modeling efforts and experiments with a glucose-xylose co-utilizing recombinant S. cerevisiae and its HXK2-deficient mutant that allowed us to uncover interdependencies between upper glycolysis and the xylose uptake pathway. Through this cycle, we also obtained kinetic models with improved prediction capabilities.
The present study demonstrates the potential of integrated "modeling and experiments" systems biology approaches that can be applied for diverse applications ranging from biotechnology to drug discovery.
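The pruning step described above can be sketched in a few lines. This is an illustrative toy, not the ORACLE code: it assumes each sampled kinetic model yields one flux control coefficient of HXK over the xylose transport flux, and discards models whose sign disagrees with the HXK2-deletion result (reducing HXK activity improved xylose uptake, implying negative control).

```python
import random

# Hypothetical sketch of model pruning: ORACLE-style workflows sample many
# kinetic parameter sets, each yielding a control coefficient C of HXK over
# the xylose transporter flux. The HXK2-deletion experiment implies C < 0,
# so models predicting C > 0 are discarded.

random.seed(0)

def sample_control_coefficient():
    """Stand-in for computing the coefficient from one sampled kinetic model."""
    return random.uniform(-1.0, 1.0)

population = [sample_control_coefficient() for _ in range(10000)]
pruned = [c for c in population if c < 0]  # keep models consistent with experiment

print(f"kept {len(pruned)} of {len(population)} models")
```

In the actual framework the coefficients would come from sampled kinetic parameter sets and metabolic control analysis, not a uniform draw; only the filtering logic is the point here.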

  11. Building-related health impacts in European and Chinese cities: a scalable assessment method.

    PubMed

    Tuomisto, Jouni T; Niittynen, Marjo; Pärjälä, Erkki; Asikainen, Arja; Perez, Laura; Trüeb, Stephan; Jantunen, Matti; Künzli, Nino; Sabel, Clive E

    2015-12-14

    Public health is often affected by societal decisions that are not primarily about health. Climate change mitigation requires intensive action to minimise greenhouse gas emissions in the future. Many of these actions take place in cities because of their traffic, buildings, and energy consumption. Aside from their long-term global impacts, active climate mitigation policies will also have short-term local impacts on public health, both positive and negative. Our main objective was to develop a generic open impact model to estimate the health impacts of emissions due to heat and power consumption of buildings. In addition, the model should be usable for policy comparisons by non-health experts at city level with city-specific data; it should give guidance on particular climate mitigation questions while increasing understanding of the related health impacts; and it should follow the building stock in time, make comparisons between scenarios, propagate uncertainties, and scale to different levels of detail. We tested the functionality of the model in two case cities, Kuopio and Basel, estimating the health and climate impacts of two actual policies planned or implemented in those cities. The assessed policies were replacement of peat with wood chips in co-generation of district heat and power, and improved energy efficiency of buildings achieved by renovations. Health impacts were not large in the two cities, but clear differences in implementation and predictability between the two tested policies were seen. Renovation policies can improve the energy efficiency of buildings and reduce greenhouse gas emissions significantly, but this requires systematic policy sustained for decades. In contrast, fuel changes in large district heating facilities may have rapid and large impacts on emissions. However, the life cycle impacts of different fuels remain somewhat of an open question.
In conclusion, we were able to develop a practical model for city-level assessments that promotes evidence-based policy in general and health aspects in particular. Although all data and code are freely available, implementing the current model version in a new city requires some modelling skills.

  12. Using ambient vibration measurements for risk assessment at an urban scale: from numerical proof of concept to Beirut case study (Lebanon)

    NASA Astrophysics Data System (ADS)

    Salameh, Christelle; Bard, Pierre-Yves; Guillier, Bertrand; Harb, Jacques; Cornou, Cécile; Gérard, Jocelyne; Almakari, Michelle

    2017-04-01

    Post-seismic investigations repeatedly indicate that structures with frequencies close to the foundation soil frequency exhibit significantly heavier damage (Caracas 1967; Mexico 1985; Pujili, Ecuador 1996; L'Aquila 2009). Yet the modal frequencies of soils and buildings in a region are rarely considered together in current seismic risk analyses, even though past earthquakes have demonstrated that coinciding soil and building frequencies lead to greater damage. The present paper thus focuses on a comprehensive numerical analysis of the effect of coincidence between site and building frequencies. A total of 887 realistic soil profiles are coupled with a set of 141 single-degree-of-freedom elastoplastic oscillators, and their combined response is computed for both linear and nonlinear soil behaviors, for a large number (60) of synthetic input signals with various PGA levels and frequency contents. The associated damage is quantified on the basis of the maximum displacement as compared to both yield and ultimate post-elastic displacements, according to the RISK-UE project recommendations (Lagomarsino and Giovinazzi in Bull Earthq Eng 4(4):415-443, 2006), and compared with the damage obtained for a similar building located on rock. The correlation between this soil/rock damage increment and a number of simplified mechanical and loading parameters is then analyzed using a neural network approach. The results emphasize the key role played by the building/soil frequency ratio even when both soil and building behave nonlinearly; other important parameters are the PGA level, the soil/rock velocity contrast, and the building ductility. A numerical investigation based on simulation of ambient noise for the whole set of 887 profiles also indicates that the amplitude of the H/V ratio may be considered a satisfactory proxy for site amplification when applied to measurements at urban scale.
A straightforward implementation of this method, using ambient vibration measurements both at ground level and within buildings, is illustrated with an example application for the city of Beirut (Lebanon).
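The frequency-coincidence effect at the heart of this record can be illustrated with the steady-state amplification of a damped single-degree-of-freedom oscillator. This is a textbook illustration, not the paper's elastoplastic model: with r the building-to-soil frequency ratio and zeta the damping ratio, response peaks sharply near r = 1.

```python
import math

# Steady-state dynamic amplification of a damped SDOF oscillator driven at
# frequency ratio r = f_building / f_soil. At 5% damping the peak near
# resonance (r = 1) is about 10x, illustrating why frequency coincidence
# drives the damage increment studied in the paper.

def amplification(r, zeta=0.05):
    return 1.0 / math.sqrt((1 - r**2) ** 2 + (2 * zeta * r) ** 2)

for r in (0.5, 1.0, 2.0):
    print(r, round(amplification(r), 2))
```

The sharp peak at r = 1 is the linear-theory analogue of the soil/building frequency coincidence the paper quantifies with nonlinear oscillators and neural network analysis.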

  13. Building capacity in biodiversity monitoring at the global scale

    USGS Publications Warehouse

    Schmeller, Dirk S.; Bohm, Monika; Arvanitidis, Christos; Barber-Meyer, Shannon; Brummitt, Neil; Chandler, Mark; Chatzinikolaou, Eva; Costello, Mark J.; Ding, Hui; García-Moreno, Jaime; Gill, Michael J.; Haase, Peter; Jones, Miranda; Juillard, Romain; Magnusson, William E.; Martin, Corinne S.; McGeoch, Melodie A.; Mihoub, Jean-Baptiste; Pettorelli, Nathalie; Proença, Vânia; Peng, Cui; Regan, Eugenie; Schmiedel, Ute; Simsika, John P.; Weatherdon, Lauren; Waterman, Carly; Xu, Haigen; Belnap, Jayne

    2017-01-01

    Human-driven global change is causing ongoing declines in biodiversity worldwide. To address these declines, decision-makers need accurate assessments of the status of, and pressures on, biodiversity. These assessments are heavily constrained, however, by incomplete and uneven spatial, temporal and taxonomic coverage. For instance, data from regions such as Europe and North America are currently used overwhelmingly for large-scale biodiversity assessments because suitable data are less available from other, more biodiversity-rich regions. These data-poor regions, however, are often those experiencing the strongest threats to biodiversity. There is therefore an urgent need to fill the existing gaps in global biodiversity monitoring. Here, we review current knowledge on best practice in capacity building for biodiversity monitoring and provide an overview of existing means to improve biodiversity data collection, considering the different types of biodiversity monitoring data. Our review comprises insights from work in Africa, South America, the Polar Regions and Europe; in government-funded, volunteer and citizen-based monitoring; and in terrestrial, freshwater and marine ecosystems. The key steps to effectively building capacity in biodiversity monitoring are: identifying monitoring questions and aims; identifying the key components, functions, and processes to monitor; identifying the most suitable monitoring methods for these elements; carrying out monitoring activities; managing the resultant data; and interpreting monitoring data. Additionally, biodiversity monitoring should use multiple approaches, including extensive and intensive monitoring through volunteers and professional scientists, while also harnessing new technologies. Finally, we call on the scientific community to share biodiversity monitoring data, knowledge and tools to ensure the accessibility, interoperability, and reporting of biodiversity data at a global scale.

  14. xHMMER3x2: Utilizing HMMER3's speed and HMMER2's sensitivity and specificity in the glocal alignment mode for improved large-scale protein domain annotation.

    PubMed

    Yap, Choon-Kong; Eisenhaber, Birgit; Eisenhaber, Frank; Wong, Wing-Cheong

    2016-11-29

    While the local-mode HMMER3 is notable for its massive speed improvement, the slower glocal-mode HMMER2 is more exact for domain annotation because it enforces full domain-to-sequence alignments. Since a unit of domain necessarily implies a unit of function, local-mode HMMER3 alone remains insufficient for precise function annotation tasks. In addition, the incomparable E-values produced for the same domain model by different HMMER builds create difficulty when checking domain annotation consistency on a large-scale basis. In this work, the speed of HMMER3 and the glocal-mode alignment of HMMER2 are combined within the xHMMER3x2 framework to tackle the large-scale domain annotation task. Briefly, HMMER3 is used for initial domain detection so that HMMER2 can subsequently perform the glocal-mode, sequence-to-full-domain alignments for the detected HMMER3 hits. An E-value calibration procedure is required to ensure that the search space of HMMER2 is sufficiently replicated by HMMER3. We find that this is straightforwardly possible for ~80% of the models in the Pfam domain library (release 29). For the remaining ~20% of HMMER3 domain models, however, the respective HMMER2 counterparts are more sensitive; HMMER3 searches alone are insufficient to ensure sensitivity, and a HMMER2-based search needs to be initiated. When tested on the set of UniProt human sequences, xHMMER3x2 can be configured to be between 7× and 201× faster than HMMER2, with domain-detection sensitivity descending from 99.8% to 95.7% with respect to HMMER2 alone; HMMER3's sensitivity was 95.7%. At the extremes, xHMMER3x2 reduces to either the slow glocal-mode HMMER2 or the fast HMMER3 with glocal-mode alignment. Finally, the E-value to false-positive rate (FPR) mapping by xHMMER3x2 allows E-values of different model builds to be compared, so that any annotation discrepancies in a large-scale annotation exercise can be flagged for further examination by dissectHMMER.
The xHMMER3x2 workflow allows large-scale domain annotation speed to be drastically improved over HMMER2 without compromising domain-detection sensitivity or sequence-to-domain alignment completeness. The xHMMER3x2 code and its webserver (for Pfam releases 27, 28 and 29) are freely available at http://xhmmer3x2.bii.a-star.edu.sg/ . Reviewed by Thomas Dandekar, L. Aravind, Oliviero Carugo and Shamil Sunyaev. For the full reviews, please go to the Reviewers' comments section.
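The E-value calibration idea can be sketched as follows. This is an illustrative toy, not the xHMMER3x2 implementation: for one domain model, it searches for a relaxed HMMER3 cutoff whose hit set covers every HMMER2 glocal hit, and returns None for models (the ~20% case) where no cutoff suffices and HMMER2 must be run directly. The sequence names and E-values are invented.

```python
# Toy calibration: choose the smallest HMMER3 E-value cutoff whose hit set
# contains all HMMER2 glocal hits, so that running HMMER3 first cannot lose
# hits that HMMER2 would have found.

def calibrate_cutoff(h3_evalues, h2_hits, h3_by_seq):
    """Return the smallest HMMER3 E-value cutoff covering every HMMER2
    glocal hit, or None if HMMER3 misses some hit entirely."""
    for cutoff in sorted(h3_evalues):
        covered = {s for s, e in h3_by_seq.items() if e <= cutoff}
        if h2_hits <= covered:
            return cutoff
    return None

h3_by_seq = {"seqA": 1e-30, "seqB": 1e-5, "seqC": 0.2}  # invented HMMER3 E-values
h2_hits = {"seqA", "seqB"}                              # invented HMMER2 glocal hits
cutoff = calibrate_cutoff(set(h3_by_seq.values()), h2_hits, h3_by_seq)
print(cutoff)  # → 1e-05
```

In the real pipeline the calibration is done per model against large sequence sets, and the resulting cutoffs decide whether the fast HMMER3-first path or a direct HMMER2 search is used.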

  15. Coupling of Large Eddy Simulations with Meteorological Models to simulate Methane Leaks from Natural Gas Storage Facilities

    NASA Astrophysics Data System (ADS)

    Prasad, K.

    2017-12-01

    Atmospheric transport is usually simulated with weather models, e.g., the Weather Research and Forecasting (WRF) model, which employs a parameterized turbulence model and does not resolve the fine-scale dynamics generated by the flow around the buildings and features comprising a large city. The NIST Fire Dynamics Simulator (FDS) is a computational fluid dynamics model that uses large eddy simulation methods to model flow around buildings at length scales much smaller than is practical with models like WRF. FDS has the potential to evaluate the impact of complex topography on near-field dispersion and mixing that is difficult to simulate with a mesoscale atmospheric model. A methodology has been developed to couple the FDS model with WRF mesoscale transport models. The coupling is based on nudging the FDS flow field toward that computed by WRF, and is currently limited to one-way coupling performed in an off-line mode. This approach allows the FDS model to operate as a sub-grid-scale model within a WRF simulation. To test and validate the coupled FDS-WRF model, the methane leak from the Aliso Canyon underground storage facility was simulated. Large eddy simulations were performed over the complex topography of several natural gas storage facilities, including Aliso Canyon, Honor Rancho, and MacDonald Island, at 10 m horizontal and vertical resolution. The goals of these simulations included improving and validating transport models as well as testing leak hypotheses. Forward simulation results were compared with aircraft- and tower-based in-situ measurements as well as methane plumes observed using the NASA Airborne Visible InfraRed Imaging Spectrometer (AVIRIS) and the next-generation instrument AVIRIS-NG. Comparison of simulation results with measurement data demonstrates the capability of the coupled FDS-WRF models to accurately simulate the transport and dispersion of methane plumes over urban domains.
Simulated integrated methane enhancements will be presented and compared with results obtained from spectrometer data to estimate the temporally evolving methane flux during the Aliso Canyon blowout.
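The one-way coupling described, nudging the FDS flow field toward the WRF solution, can be sketched for a single velocity component. This is a minimal sketch of the nudging idea only; the values and the relaxation time tau are illustrative, not taken from the actual coupled code.

```python
# Newtonian relaxation (nudging): the LES velocity is relaxed toward the
# mesoscale (WRF) value with time scale tau, so the large scales follow WRF
# while the LES resolves building-scale flow on top.

def nudge(u_fds, u_wrf, dt, tau):
    return u_fds + (dt / tau) * (u_wrf - u_fds)

u, u_wrf = 0.0, 10.0   # m/s, illustrative
dt, tau = 1.0, 60.0    # s,   illustrative
for _ in range(600):   # ten minutes of nudging
    u = nudge(u, u_wrf, dt, tau)
print(round(u, 2))     # → 10.0
```

With dt much smaller than tau the update approximates exponential relaxation, u(t) → u_wrf with e-folding time tau, which is why the LES mean flow converges to the mesoscale driver.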

  16. Response of high-rise and base-isolated buildings to a hypothetical Mw 7.0 blind thrust earthquake

    USGS Publications Warehouse

    Heaton, T.H.; Hall, J.F.; Wald, D.J.; Halling, M.W.

    1995-01-01

    High-rise flexible-frame buildings are commonly considered to be resistant to shaking from the largest earthquakes. In addition, base isolation has become increasingly popular for critical buildings that should still function after an earthquake. How will these two types of buildings perform if a large earthquake occurs beneath a metropolitan area? To answer this question, we simulated the near-source ground motions of a Mw 7.0 thrust earthquake and then mathematically modeled the response of a 20-story steel-frame building and a 3-story base-isolated building. The synthesized ground motions were characterized by large displacement pulses (up to 2 meters) and large ground velocities. These ground motions caused large deformation and possible collapse of the frame building, and they required exceptional measures in the design of the base-isolated building if it was to remain functional.
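A minimal sketch, not the authors' 20-story frame model, of why meter-scale near-source pulses are demanding for long-period structures: an elastic single-degree-of-freedom oscillator with a 3 s period is driven by a one-cycle sine ground-acceleration pulse and integrated with explicit central differences. All parameters are illustrative.

```python
import math

# Elastic SDOF response to a one-cycle sine acceleration pulse, integrated
# with central differences: u'' + 2*zeta*wn*u' + wn^2*u = -ag(t).
T, zeta = 3.0, 0.05                 # flexible, high-rise-like period (s), damping
wn = 2 * math.pi / T
dt, n = 0.005, 4000                 # 20 s of response
tp, a0 = 2.0, 5.0                   # pulse period (s) and amplitude (m/s^2)

def ag(t):                          # one-cycle sine ground-acceleration pulse
    return a0 * math.sin(2 * math.pi * t / tp) if t < tp else 0.0

u_prev, u, peak = 0.0, 0.0, 0.0
for i in range(n):
    t = i * dt
    acc = -ag(t) - 2 * zeta * wn * (u - u_prev) / dt - wn**2 * u
    u_prev, u = u, 2 * u - u_prev + acc * dt**2
    peak = max(peak, abs(u))
print(round(peak, 2))               # peak relative displacement (m)
```

For long-period structures the response scales with the ground displacement pulse rather than the peak acceleration, which is the regime the synthesized 2 m displacement pulses probe.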

  17. CMOL/CMOS hardware architectures and performance/price for Bayesian memory - The building block of intelligent systems

    NASA Astrophysics Data System (ADS)

    Zaveri, Mazad Shaheriar

    The semiconductor/computer industry has been following Moore's law for several decades and has reaped the benefits in speed and density of the resultant scaling. Transistor density has reached almost one billion per chip, and transistor delays are in picoseconds. However, scaling has slowed down, and the semiconductor industry is now facing several challenges. Hybrid CMOS/nano technologies, such as CMOL, are considered an interim solution to some of the challenges. Another potential architectural solution is specialized architectures for applications/models in the intelligent computing domain, one aspect of which includes abstract computational models inspired by the neuro/cognitive sciences. Consequently, in this dissertation, we focus on the hardware implementations of Bayesian Memory (BM), which is a (Bayesian) Biologically Inspired Computational Model (BICM). This model is a simplified version of George and Hawkins' model of the visual cortex, which includes an inference framework based on Judea Pearl's belief propagation. We then present a "hardware design space exploration" methodology for implementing and analyzing the (digital and mixed-signal) hardware for the BM. This methodology involves: analyzing the computational/operational cost and the related micro-architecture, exploring candidate hardware components, proposing various custom hardware architectures using both traditional CMOS and the hybrid nanotechnology CMOL, and investigating the baseline performance/price of these architectures. The results suggest that CMOL is a promising candidate for implementing a BM.
Such implementations can utilize the very high density storage/computation benefits of these new nano-scale technologies much more efficiently; for example, the throughput per 858 mm² (TPM) obtained for CMOL-based architectures is 32 to 40 times better than the TPM for a CMOS-based multiprocessor/multi-FPGA system, and almost 2000 times better than the TPM for a PC implementation. We later use this methodology to investigate hardware implementations of a cortex-scale spiking neural system, an approximate neural equivalent of a BICM-based cortex-scale system. The results of this investigation also suggest that CMOL is a promising candidate for implementing such large-scale neuromorphic systems. In general, the assessment of such hypothetical baseline hardware architectures provides prospects for building large-scale (mammalian cortex-scale) implementations of neuromorphic/Bayesian/intelligent systems using state-of-the-art and beyond state-of-the-art silicon structures.

  18. Current Issues in Cosmology

    NASA Astrophysics Data System (ADS)

    Pecker, Jean-Claude; Narlikar, Jayant

    2011-09-01

    Part I. Observational Facts Relating to Discrete Sources: 1. The state of cosmology G. Burbidge; 2. The redshifts of galaxies and QSOs E. M. Burbidge and G. Burbidge; 3. Accretion discs in quasars J. Sulentic; Part II. Observational Facts Relating to Background Radiation: 4. CMB observations and consequences F. Bouchet; 5. Abundances of light nuclei K. Olive; 6. Evidence for an accelerating universe or lack of A. Blanchard; Part III. Standard Cosmology: 7. Cosmology, an overview of the standard model F. Bernardeau; 8. What are the building blocks of our universe? K. C. Wali; Part IV. Large-Scale Structure: 9. Observations of large-scale structure V. de Lapparent; 10. Reconstruction of large-scale peculiar velocity fields R. Mohayaee, B. Tully and U. Frisch; Part V. Alternative Cosmologies: 11. The quasi-steady state cosmology J. V. Narlikar; 12. Evidence for iron whiskers in the universe N. C. Wickramasinghe; 13. Alternatives to dark matter: MOND + Mach D. Roscoe; 14. Anthropic principle in cosmology B. Carter; Part VI. Evidence for Anomalous Redshifts: 15. Anomalous redshifts H. C. Arp; 16. Redshifts of galaxies and QSOs: the problem of redshift periodicities G. Burbidge; 17. Statistics of redshift periodicities W. Napier; 18. Local abnormal redshifts J.-C. Pecker; 19. Gravitational lensing and anomalous redshifts J. Surdej, J.-F. Claeskens and D. Sluse; Panel discussion; General discussion; Concluding remarks.

  19. Explaining Large-Scale Policy Change in the Turkish Health Care System: Ideas, Institutions, and Political Actors.

    PubMed

    Agartan, Tuba I

    2015-10-01

    Explaining policy change has been one of the major concerns of the health care politics and policy development literature. This article aims to explain the specific dynamics of the large-scale reforms introduced within the framework of the Health Transformation Program in Turkey. It argues that the confluence of the three streams - problem, policy, and politics - with the exceptional political will of the Justice and Development Party's (JDP) leaders opened up a window of opportunity for large-scale policy change. The article also underscores the contribution of recent ideational perspectives, which help explain why political actors in Turkey would focus on health care reform, given the number of issues waiting to be addressed in the policy agenda. Examining how political actors framed problems and policies deepens our understanding of the content of the reform initiatives as well as the construction of the need to reform. The article builds on the insights of both the ideational and institutionalist perspectives when it argues that the interests, aspirations, and fears of the JDP, alongside the peculiar characteristics of the institutional context, have shaped its priorities and determination to carry out this reform initiative. Copyright © 2015 by Duke University Press.

  20. DEEP: Database of Energy Efficiency Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Piette, Mary; Lee, Sang Hoon

    A database of energy efficiency performance (DEEP) is a presimulated database enabling quick and accurate assessment of energy retrofits of commercial buildings. DEEP was compiled from the results of about 10 million EnergyPlus simulations. DEEP provides energy savings estimates for screening and evaluating retrofit measures targeting small and medium-sized office and retail buildings in California. The prototype building models were developed for a comprehensive assessment of building energy performance, based on the DOE commercial reference buildings and the California DEER prototype buildings. The prototype buildings represent seven building types across six construction vintages and 16 California climate zones. DEEP uses these prototypes to evaluate the energy performance of about 100 energy conservation measures covering envelope, lighting, heating, ventilation, air conditioning, plug loads, and domestic hot water. DEEP contains the energy simulation results for individual retrofit measures as well as packages of measures, to account for interactive effects between multiple measures. The large-scale EnergyPlus simulations were conducted on the supercomputers at the National Energy Research Scientific Computing Center (NERSC) of Lawrence Berkeley National Laboratory. The presimulated database is part of a CEC PIER project to develop a web-based retrofit toolkit for small and medium-sized commercial buildings in California, which provides real-time energy retrofit feedback by querying DEEP for recommended measures, estimated energy savings, and financial payback period based on users' decision criteria: maximizing energy savings, energy cost savings, carbon reduction, or payback of investment.
The presimulated database and the associated comprehensive measure analysis enhance the ability to assess retrofits that reduce energy use in small and medium buildings, whose owners typically do not have the resources to conduct costly building energy audits.
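A hypothetical sketch of how a retrofit toolkit might query a presimulated database like DEEP: because results are precomputed per building type, vintage, climate zone, and measure, a query reduces to a lookup plus ranking rather than a new EnergyPlus run. The keys, savings figures, and electricity price below are invented for illustration.

```python
# Invented sample of presimulated results keyed by
# (building type, vintage, climate zone, measure).
deep = {
    ("small_office", "1980s", "CZ3", "LED lighting"):  {"kwh_saved": 12000, "cost": 8000},
    ("small_office", "1980s", "CZ3", "window film"):   {"kwh_saved": 3000,  "cost": 4000},
    ("small_office", "1980s", "CZ3", "HVAC controls"): {"kwh_saved": 9000,  "cost": 5000},
}

def rank_measures(building_type, vintage, zone, price_per_kwh=0.18):
    """Rank measures for one prototype by annual energy savings; also
    report a simple payback in years (cost / annual cost savings)."""
    rows = [
        (m, r["kwh_saved"], r["cost"] / (r["kwh_saved"] * price_per_kwh))
        for (b, v, z, m), r in deep.items()
        if (b, v, z) == (building_type, vintage, zone)
    ]
    return sorted(rows, key=lambda row: -row[1])

for measure, kwh, payback in rank_measures("small_office", "1980s", "CZ3"):
    print(f"{measure}: {kwh} kWh/yr, payback {payback:.1f} yr")
```

Swapping the sort key (e.g. ascending payback instead of descending savings) mirrors the toolkit's different decision criteria.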

  1. A demonstration of a low cost approach to security at shipping facilities and ports

    NASA Astrophysics Data System (ADS)

    Huck, Robert C.; Al Akkoumi, Mouhammad K.; Herath, Ruchira W.; Sluss, James J., Jr.; Radhakrishnan, Sridhar; Landers, Thomas L.

    2010-04-01

    Government funding for security at shipping facilities and ports is limited, so there is a need for low-cost, scalable security systems. With over 20 million sea, truck, and rail containers entering the United States every year, these facilities pose a large security risk. Securing these facilities and monitoring the variety of traffic that enters and leaves them is a major task. To accomplish this, the authors have developed and fielded a low-cost, fully distributed, building-block approach to port security at the inland Port of Catoosa in Oklahoma. Based on prior work in the design and fielding of an intelligent transportation system in the United States, functional building blocks (e.g., network, camera, sensor, display, and operator console blocks) can be assembled, mixed and matched, and scaled to provide a comprehensive security system. The following functions are demonstrated, and their scaling analyzed: barge tracking, credential checking, container inventory, vehicle tracking, and situational awareness. The concept behind this research is "any operator on any console can control any device at any time."

  2. Methyl chloride via oxyhydrochlorination of methane: A building block for chemicals and fuels from natural gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benson, R.L.; Brown, S.S.D.; Ferguson, S.P.

    1995-12-31

    The objectives of this program are to (a) develop a process for converting natural gas to methyl chloride via an oxyhydrochlorination route using highly selective, stable catalysts in a fixed bed, (b) design a reactor capable of removing the large amount of heat generated in the process so as to control the reaction, (c) develop a recovery system capable of removing the methyl chloride from the product stream and (d) determine the economics and commercial viability of the process. The general approach has been as follows: (a) design and build a laboratory-scale reactor, (b) define and synthesize suitable OHC catalysts for evaluation, (c) select a first-generation OHC catalyst for Process Development Unit (PDU) trials, (d) design, construct and start up the PDU, (e) evaluate the packed-bed reactor design, (f) optimize the process, in particular the product recovery operations, (g) determine the economics of the process, (h) complete the preliminary engineering design for Phase II and (i) make the scale-up decision and formulate a business plan for Phase II. Conclusions regarding process development and catalyst development are presented.

  3. Really Large Scale Computer Graphic Projection Using Lasers and Laser Substitutes

    NASA Astrophysics Data System (ADS)

    Rother, Paul

    1989-07-01

    This paper reflects on past laser projects that displayed vector-scanned computer graphic images onto very large and irregular surfaces. Since the availability of microprocessors and high-powered visible lasers, very large scale computer graphics projection has become a reality. Because they do not depend on a focusing lens, lasers easily project onto distant and irregular surfaces and have been used for amusement parks, theatrical performances, concert performances, industrial trade shows, and dance clubs. Lasers have been used to project onto mountains, buildings, 360° globes, clouds of smoke, and water. These methods have proven successful in installations at the Epcot theme park in Florida; Stone Mountain Park in Georgia; the 1984 Olympics in Los Angeles; hundreds of corporate trade shows; and thousands of musical performances. Using new ColorRay™ technology, costly and fragile lasers are no longer necessary: fiber optic technology can duplicate the functionality of lasers for new and exciting projection possibilities. ColorRay™ technology has enjoyed worldwide recognition in conjunction with Pink Floyd's and George Michael's world tours.

  4. Research on large-scale wind farm modeling

    NASA Astrophysics Data System (ADS)

    Ma, Longfei; Zhang, Baoqun; Gong, Cheng; Jiao, Ran; Shi, Rui; Chi, Zhongjun; Ding, Yifeng

    2017-01-01

    Because of the intermittent and fluctuating nature of wind energy, a large-scale wind farm connected to the grid affects the power system quite differently from a traditional power plant. It is therefore necessary to establish an effective wind farm model to simulate and analyze the influence wind farms have on the grid, as well as the transient characteristics of the wind turbines when the grid is at fault. This first requires an effective wind turbine generator (WTG) model. As the doubly-fed VSCF wind turbine is currently the mainstream turbine type, this article first reviews the research progress on doubly-fed VSCF wind turbines and then describes the detailed process of building the model. It then surveys common wind farm modeling methods and points out the problems encountered. As WAMS is widely used in the power system, online parameter identification of the wind farm model from measured wind farm output characteristics becomes possible; the article focuses on interpreting this new idea of identification-based modeling of large wind farms, which can be realized by two concrete methods.
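The identification-based modeling idea can be illustrated with a deliberately simple example: fitting a first-order power-response model P[k+1] = a·P[k] + b·v[k] to measured output by ordinary least squares. Real WAMS-based identification would use richer turbine models and recursive updates; the data, model, and parameter values here are invented.

```python
# Generate toy "measurements" from known parameters, then recover them by
# ordinary least squares on P[k+1] = a*P[k] + b*v[k] (2x2 normal equations).
a_true, b_true = 0.8, 2.0
v = [5, 6, 7, 6, 5, 8, 9, 7, 6, 5]        # invented wind-speed series
P = [10.0]                                # invented initial power output
for k in range(len(v) - 1):
    P.append(a_true * P[k] + b_true * v[k])

n = len(v) - 1
Sxx = sum(P[k] ** 2 for k in range(n))
Sxv = sum(P[k] * v[k] for k in range(n))
Svv = sum(v[k] ** 2 for k in range(n))
Sxy = sum(P[k] * P[k + 1] for k in range(n))
Svy = sum(v[k] * P[k + 1] for k in range(n))
det = Sxx * Svv - Sxv ** 2
a_hat = (Sxy * Svv - Svy * Sxv) / det
b_hat = (Svy * Sxx - Sxy * Sxv) / det
print(round(a_hat, 3), round(b_hat, 3))   # → 0.8 2.0
```

With noiseless data the fit recovers the generating parameters exactly; with real WAMS measurements one would add noise handling and a recursive (online) formulation.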

  5. Pursuing scale and quality in STI interventions with sex workers: initial results from Avahan India AIDS Initiative

    PubMed Central

    Steen, R; Mogasale, V; Wi, T; Singh, A K; Das, A; Daly, C; George, B; Neilsen, G; Loo, V; Dallabetta, G

    2006-01-01

    Background: Migration, population mobility, and sex work continue to drive sexually transmitted epidemics in India. Yet interventions targeting high-incidence networks are rarely implemented at sufficient scale to have impact. The India AIDS Initiative (Avahan), funded by the Bill and Melinda Gates Foundation, is scaling up interventions with sex workers (SWs) and other high-risk populations in India's six highest HIV-prevalence states. Methods: Avahan resources are channelled through state-level partners (SLPs) to local non-governmental organisations (NGOs) who organise outreach, community mobilisation, and dedicated clinics for SWs. These clinics provide services for sexually transmitted infections (STIs) including condom promotion, syndromic case management, regular check-ups, and treatment of asymptomatic infections. SWs take an active role in service delivery. STI capacity-building support functions on three levels. A central capacity-building team developed guidelines and standards, trains state-level STI coordinators, monitors outcomes, and conducts operations research. Standards are documented in an Avahan-wide manual. State-level STI coordinators train NGO clinic staff and supervise clinics based on these standards and related quality-monitoring tools. Clinic and outreach staff report on indicators that guide additional capacity-building inputs. Results: In 2 years, clinics with community outreach for SWs have been established in 274 settings covering 77 districts. Mapping and size estimation have identified 187 000 SWs. In a subset of four large states covered by six SLPs (183 000 estimated SWs, 65 districts), 128 326 (70%) of the SWs have been contacted through peer outreach and 74 265 (41%) have attended a clinic at least once. A total of 127 630 clinic visits have been reported, an increasing proportion of them for recommended routine check-ups. Supervision and monitoring facilitate standardisation of services across sites.
Conclusion Targeted HIV/STI interventions can be brought to scale and standardised given adequate capacity building support. Intervention coverage, service utilisation, and quality are key parameters that should be monitored and progressively improved with active involvement of SWs themselves. PMID:17012513

  6. A Fresh Look at Weather Impact on Peak Electricity Demand and Energy Use of Buildings Using 30-Year Actual Weather Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Chang, Wen-Kuei; Lin, Hung-Wen

    Buildings consume more than one third of the world's total primary energy. Weather plays a unique and significant role as it directly affects the thermal loads and thus energy performance of buildings. The traditional simulated energy performance using Typical Meteorological Year (TMY) weather data represents the building performance for a typical year, but not necessarily the average or typical long-term performance as buildings with different energy systems and designs respond differently to weather changes. Furthermore, the single-year TMY simulations do not provide a range of results that capture yearly variations due to changing weather, which is important for building energy management, and for performing risk assessments of energy efficiency investments. This paper employs large-scale building simulation (a total of 3162 runs) to study the weather impact on peak electricity demand and energy use with the 30-year (1980 to 2009) Actual Meteorological Year (AMY) weather data for three types of office buildings at two design efficiency levels, across all 17 ASHRAE climate zones. The simulated results using the AMY data are compared to those from the TMY3 data to determine and analyze the differences. 
Besides further demonstration, as done by other studies, that actual weather has a significant impact on both the peak electricity demand and energy use of buildings, the main findings from the current study include: 1) annual weather variation has a greater impact on the peak electricity demand than it does on energy use in buildings; 2) the simulated energy use using the TMY3 weather data is not necessarily representative of the average energy use over a long period, and the TMY3 results can be significantly higher or lower than those from the AMY data; 3) the weather impact is greater for buildings in colder climates than warmer climates; 4) the weather impact on the medium-sized office building was the greatest, followed by the large office and then the small office; and 5) simulated energy savings and peak demand reduction by energy conservation measures using the TMY3 weather data can be significantly underestimated or overestimated. It is crucial to run multi-decade simulations with AMY weather data to fully assess the impact of weather on the long-term performance of buildings, and to evaluate the energy savings potential of energy conservation measures for new and existing buildings from a life cycle perspective.
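
The bias check described above (a single TMY run versus the 30-year AMY spread) reduces to simple summary statistics once the per-year simulation outputs are available. A minimal, hypothetical Python sketch; the function and field names are illustrative, not taken from the study:

```python
def weather_spread(amy_results, tmy_result):
    """Summarize multi-year AMY simulation output against the single TMY run.

    amy_results: list of annual values (e.g., site energy use in kWh),
    one per actual weather year; tmy_result: the single TMY-based value.
    All names here are illustrative placeholders, not the study's outputs.
    """
    mean_amy = sum(amy_results) / len(amy_results)
    return {
        "amy_min": min(amy_results),
        "amy_max": max(amy_results),
        "amy_mean": mean_amy,
        # positive: TMY overestimates the long-term average; negative: underestimates
        "tmy_bias_pct": 100.0 * (tmy_result - mean_amy) / mean_amy,
    }
```

A TMY bias of several percent in either direction, as reported above, would show up directly in the `tmy_bias_pct` field.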

  7. Scale Matters: An Action Plan for Realizing Sector-Wide"Zero-Energy" Performance Goals in Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Selkowitz, Stephen; Granderson, Jessica

    2008-06-16

    It is widely accepted that if the United States is to reduce greenhouse gas emissions it must aggressively address energy end use in the building sector. While there have been some notable but modest successes with mandatory and voluntary programs, there have also been puzzling failures to achieve expected savings. Collectively, these programs have not yet reached the majority of the building stock, nor have they yet routinely produced very large savings in individual buildings. Several trends that have the potential to change this are noteworthy: (1) the growing market interest in 'green buildings' and 'sustainable design', (2) the major professional societies (e.g. AIA, ASHRAE) have more aggressively adopted significant improvements in energy efficiency as strategic goals, e.g. targeting 'zero energy', carbon-neutral buildings by 2030. While this vision is widely accepted as desirable, unless there are significant changes to the way buildings are routinely designed, delivered and operated, zero energy buildings will remain a niche phenomenon rather than a sector-wide reality. Toward that end, a public/private coalition including the Alliance to Save Energy, LBNL, AIA, ASHRAE, USGBC and the World Business Council for Sustainable Development (WBCSD) is developing an 'action plan' for moving the U.S. commercial building sector towards zero energy performance. It addresses regional action in a national framework; integrated deployment, demonstration and R&D threads; and would focus on measurable, visible performance indicators. This paper outlines this action plan, focusing on the challenge, the key themes, and the strategies and actions leading to substantial reductions in GHG emissions by 2030.

  8. Building the team for team science

    USGS Publications Warehouse

    Read, Emily K.; O'Rourke, M.; Hong, G. S.; Hanson, P. C.; Winslow, Luke A.; Crowley, S.; Brewer, C. A.; Weathers, K. C.

    2016-01-01

    The ability to effectively exchange information and develop trusting, collaborative relationships across disciplinary boundaries is essential for 21st century scientists charged with solving complex and large-scale societal and environmental challenges, yet these communication skills are rarely taught. Here, we describe an adaptable training program designed to increase the capacity of scientists to engage in information exchange and relationship development in team science settings. A pilot of the program, developed by a leader in ecological network science, the Global Lake Ecological Observatory Network (GLEON), indicates that the training program resulted in improvement in early career scientists’ confidence in team-based network science collaborations within and outside of the program. Fellows in the program navigated human-network challenges, expanded communication skills, and improved their ability to build professional relationships, all in the context of producing collaborative scientific outcomes. Here, we describe the rationale for key communication training elements and provide evidence that such training is effective in building essential team science skills.

  9. Evaluation of a micro-scale wind model's performance over realistic building clusters using wind tunnel experiments

    NASA Astrophysics Data System (ADS)

    Zhang, Ning; Du, Yunsong; Miao, Shiguang; Fang, Xiaoyi

    2016-08-01

    The simulation performance over complex building clusters of a wind simulation model (Wind Information Field Fast Analysis model, WIFFA) in a micro-scale air pollutant dispersion model system (Urban Microscale Air Pollution dispersion Simulation model, UMAPS) is evaluated using various wind tunnel experimental data including the CEDVAL (Compilation of Experimental Data for Validation of Micro-Scale Dispersion Models) wind tunnel experiment data and the NJU-FZ experiment data (Nanjing University-Fang Zhuang neighborhood wind tunnel experiment data). The results show that the wind model can reproduce the vortexes triggered by urban buildings well, and the flow patterns in urban street canyons and building clusters can also be represented. Due to the complex shapes of buildings and their distributions, the simulation deviations/discrepancies from the measurements are usually caused by the simplification of the building shapes and the determination of the key zone sizes. The computational efficiencies of different cases are also discussed in this paper. The model has a high computational efficiency compared to traditional numerical models that solve the Navier-Stokes equations, and can produce very high-resolution (1-5 m) wind fields of a complex neighborhood-scale urban building canopy (~1 km × 1 km) in less than 3 min when run on a personal computer.

  10. Flexible and Stretchable Energy Storage: Recent Advances and Future Perspectives.

    PubMed

    Liu, Wei; Song, Min-Sang; Kong, Biao; Cui, Yi

    2017-01-01

    Energy-storage technologies such as lithium-ion batteries and supercapacitors have become fundamental building blocks in modern society. Recently, the emerging direction toward the ever-growing market of flexible and wearable electronics has nourished progress in building multifunctional energy-storage systems that can be bent, folded, crumpled, and stretched while maintaining their electrochemical functions under deformation. Here, recent progress and well-developed strategies in research designed to accomplish flexible and stretchable lithium-ion batteries and supercapacitors are reviewed. The challenges of developing novel materials and configurations with tailored features, and of designing simple and large-scale manufacturing methods that can be widely utilized, are considered. Furthermore, the perspectives and opportunities for this emerging field of materials science and engineering are also discussed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Burning down the brewery: establishing and evacuating an ancient imperial colony at Cerro Baul, Peru.

    PubMed

    Moseley, Michael E; Nash, Donna J; Williams, Patrick Ryan; DeFrance, Susan D; Miranda, Ana; Ruales, Mario

    2005-11-29

    Before the Inca reigned, two empires held sway over the central Andes from anno Domini 600 to 1000: the Wari empire to the north ruled much of Peru, and Tiwanaku to the south reigned in Bolivia. Face-to-face contact came when both colonized the Moquegua Valley sierra in southern Peru. The state-sponsored Wari incursion, described here, entailed large-scale agrarian reclamation to sustain the occupation of two hills and the adjacent high mesa of Cerro Baúl. Monumental buildings were erected atop the mesa to serve an embassy-like delegation of nobles and attendant personnel that endured for centuries. Final evacuation of the Baúl enclave was accompanied by elaborate ceremonies with brewing, drinking, feasting, vessel smashing, and building burning.

  12. Burning down the brewery: Establishing and evacuating an ancient imperial colony at Cerro Baúl, Peru

    PubMed Central

    Moseley, Michael E.; Nash, Donna J.; Williams, Patrick Ryan; deFrance, Susan D.; Miranda, Ana; Ruales, Mario

    2005-01-01

    Before the Inca reigned, two empires held sway over the central Andes from anno Domini 600 to 1000: the Wari empire to the north ruled much of Peru, and Tiwanaku to the south reigned in Bolivia. Face-to-face contact came when both colonized the Moquegua Valley sierra in southern Peru. The state-sponsored Wari incursion, described here, entailed large-scale agrarian reclamation to sustain the occupation of two hills and the adjacent high mesa of Cerro Baúl. Monumental buildings were erected atop the mesa to serve an embassy-like delegation of nobles and attendant personnel that endured for centuries. Final evacuation of the Baúl enclave was accompanied by elaborate ceremonies with brewing, drinking, feasting, vessel smashing, and building burning. PMID:16293691

  13. A generalized analog implementation of piecewise linear neuron models using CCII building blocks.

    PubMed

    Soleimani, Hamid; Ahmadi, Arash; Bavandpour, Mohammad; Sharifipoor, Ozra

    2014-03-01

    This paper presents a set of reconfigurable analog implementations of piecewise linear spiking neuron models using second generation current conveyor (CCII) building blocks. With the same topology and circuit elements, without W/L modification which is impossible after circuit fabrication, these circuits can produce different behaviors, similar to the biological neurons, both for a single neuron as well as a network of neurons just by tuning reference current and voltage sources. The models are investigated, in terms of analog implementation feasibility and costs, targeting large scale hardware implementations. Results show that, in order to gain the best performance, area and accuracy; these models can be compromised. Simulation results are presented for different neuron behaviors with CMOS 350 nm technology. Copyright © 2013 Elsevier Ltd. All rights reserved.
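
The paper's contribution is an analog CCII circuit, but the piecewise linear dynamics it reproduces can be sketched in software. Below is a generic, hypothetical piecewise linear integrate-and-fire neuron, not the authors' model; all parameter names and values are illustrative, and tuning the input current plays the role of the reference current and voltage sources mentioned above:

```python
def simulate_pwl_neuron(i_ext, v_rest=0.0, v_thresh=1.0, v_reset=0.2,
                        slope_sub=0.1, slope_supra=0.5, knee=0.6,
                        dt=0.01, steps=5000):
    """Generic piecewise linear spiking neuron (illustrative, hypothetical).

    The membrane derivative uses one slope below the 'knee' voltage and a
    steeper slope above it; crossing v_thresh emits a spike and resets v.
    Returns the number of spikes over the simulated window.
    """
    v, spikes = v_rest, 0
    for _ in range(steps):
        slope = slope_sub if v < knee else slope_supra  # piecewise linear branch
        v += dt * (slope * (v - v_rest) + i_ext)        # forward-Euler step
        if v >= v_thresh:
            spikes += 1
            v = v_reset
    return spikes
```

Varying `i_ext` (the analog of a tunable reference current) moves the neuron between quiescence and tonic spiking without changing the "circuit" topology.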

  14. OpenSim: A Flexible Distributed Neural Network Simulator with Automatic Interactive Graphics.

    PubMed

    Jarosch, Andreas; Leber, Jean Francois

    1997-06-01

    An object-oriented simulator called OpenSim is presented that achieves a high degree of flexibility by relying on a small set of building blocks. The state variables and algorithms put in this framework can easily be accessed through a command shell. This allows one to distribute a large-scale simulation over several workstations and to generate the interactive graphics automatically. OpenSim opens new possibilities for cooperation among Neural Network researchers. Copyright 1997 Elsevier Science Ltd.

  15. Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sussman, Alan

    2014-10-21

    This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.

  16. Technology Innovations from NASA's Next Generation Launch Technology Program

    NASA Technical Reports Server (NTRS)

    Cook, Stephen A.; Morris, Charles E. K., Jr.; Tyson, Richard W.

    2004-01-01

    NASA's Next Generation Launch Technology Program has been on the cutting edge of technology, improving the safety, affordability, and reliability of future space-launch-transportation systems. The array of projects focused on propulsion, airframe, and other vehicle systems. Achievements range from building miniature fuel/oxygen sensors to hot-firings of major rocket-engine systems as well as extreme thermo-mechanical testing of large-scale structures. Results to date have significantly advanced technology readiness for future space-launch systems using either airbreathing or rocket propulsion.

  17. Manganese-Mediated Coupling Reaction of Vinylarenes and Aliphatic Alcohols

    PubMed Central

    Zhang, Wei; Wang, Nai-Xing; Bai, Cui-Bing; Wang, Yan-Jing; Lan, Xing-Wang; Xing, Yalan; Li, Yi-He; Wen, Jia-Long

    2015-01-01

    Alcohols and alkenes are among the most abundant and commonly used organic building blocks in large-scale chemical synthesis. Herein, we report the first novel and operationally simple coupling reaction of vinylarenes and aliphatic alcohols catalyzed by manganese in the presence of TBHP (tert-butyl hydroperoxide). This coupling reaction provides the oxyalkylated products of vinylarenes with good regioselectivity and complies with the principles of step economy. A possible reaction mechanism has also been proposed. PMID:26470633

  18. A Short History of War: The Evolution of Warfare and Weapons. Professional Readings in Military Strategy Number 5

    DTIC Science & Technology

    1992-06-30

    first employed on large-scale public works projects - building dikes, irrigation systems, the pyramids, and ziggurats of ancient Sumer - it was but a... original Sumerian word for the southern part of Iraq, the site of Sumer with its capital at the city of Ur. If the river is followed northward from... The first historical evidence of soldiers wearing helmets is also provided on the stele. From the bodies of soldiers found in the Death Pits of Ur

  19. Renewable Energy Zone (REZ) Transmission Planning Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Nathan

    A REZ is a geographical area that enables the development of profitable, cost-effective, grid-connected renewable energy (RE). The REZ Transmission Planning Process is a proactive approach to planning, approving, and building the transmission infrastructure that connects REZs to the power system. It helps increase the share of solar, wind, and other RE resources in the power system while maintaining reliability and economics, and it focuses on large-scale wind and solar resources that can be developed in sufficient quantities to warrant transmission system expansion and upgrades.

  20. ΛGR Centennial: Cosmic Web in Dark Energy Background

    NASA Astrophysics Data System (ADS)

    Chernin, A. D.

    The basic building blocks of the Cosmic Web are groups and clusters of galaxies, super-clusters (pancakes) and filaments embedded in the universal dark energy background. The background produces antigravity, and the antigravity effect is strong in groups, clusters and superclusters. Antigravity is very weak in filaments where matter (dark matter and baryons) produces gravity dominating in the filament internal dynamics. Gravity-antigravity interplay on the large scales is a grandiose phenomenon predicted by ΛGR theory and seen in modern observations of the Cosmic Web.

  1. Transparent building-integrated PV modules. Phase 1: Comprehensive report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-09-28

    This Comprehensive Report encompasses the activities that have been undertaken by Kiss + Cathcart, Architects, in conjunction with Energy Photovoltaics, Incorporated (EPV), to develop a flexible patterning system for thin-film photovoltaic (PV) modules for building applications. There are two basic methods for increasing transparency/light transmission by means of patterning the PV film: widening existing scribe lines, or scribing a second series of lines perpendicular to the first. These methods can yield essentially any degree of light transmission, but both result in visible patterns of light and dark on the panel surface. A third proposed method is to burn a grid of dots through the films, independent of the normal cell scribing. This method has the potential to produce a light-transmitting panel with no visible pattern. Ornamental patterns at larger scales can be created using combinations of these techniques. Kiss + Cathcart, Architects, in conjunction with EPV are currently developing a complementary process for the large-scale lamination of thin-film PVs, which enables building integrated (BIPV) modules to be produced in sizes up to 48 in. x 96 in. Flexible laser patterning will be used for three main purposes, all intended to broaden the appeal of the product to the building sector: to create semitransparent thin-film modules for skylights and, in some applications, for vision glazing; to create patterns for ornamental effects (this application is similar to fritted glass, which is used for shading, visual screening, graphics, and other purposes); and to allow BIPV modules to be fabricated in various sizes and shapes with maximum control over electrical characteristics.
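
The first two patterning methods reduce to simple geometry: the clear-area fraction produced by one or two perpendicular sets of removed lines sets the light transmission. A hedged illustration of that geometry (pure arithmetic, not a measured quantity from the report):

```python
def grid_transmission(fx, fy=0.0):
    """Approximate optical transmission of a patterned thin-film PV panel.

    fx: clear-area fraction from (widened) scribe lines in one direction,
    i.e. line width divided by line pitch; fy: fraction from an optional
    perpendicular second set of lines. Illustrative geometry only.
    """
    if not (0.0 <= fx <= 1.0 and 0.0 <= fy <= 1.0):
        raise ValueError("fractions must lie in [0, 1]")
    # clear area is the union of the two line sets: fx + fy minus their overlap
    return fx + fy - fx * fy
```

For example, 10% clear lines in each of two perpendicular directions yield roughly 19% transmission, since the crossing regions are counted only once.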

  2. FLORIDA LARGE BUILDING STUDY - POLK COUNTY ADMINISTRATION BUILDING

    EPA Science Inventory

    The report describes an extensive characterization and parameter assessment study of a single, large building in Bartow, FL, with the purpose of assessing the impact on radon entry of design, construction, and operating features of the building, particularly the mechanical subsys...

  3. A regional analysis of elements at risk exposed to mountain hazards in the Eastern European Alps

    NASA Astrophysics Data System (ADS)

    Fuchs, Sven; Zischg, Andreas

    2014-05-01

    We present a method to quantify the number and value of buildings exposed to torrents and snow avalanches in the Austrian Alps, as well as the number of exposed people. Based on a unique population and building register dataset, a relational SQL database was developed that allows, in combination with GIS data, a rule-based nation-wide automated analysis. Furthermore, possibilities and challenges are discussed with respect to the use of such data in vulnerability assessment and with respect to resilience measures. We comprehensively address the challenge of data accuracy, scale and uncertainties. From the total of approximately 2.4 million buildings with a clearly attributable geographical location, around 120,000 are exposed to torrent processes (5 %) and snow avalanches (0.4 %); exposure was defined here as location within the digitally available hazard maps of the Austrian Torrent and Avalanche Control Service. Around 5 % of the population (360,000 out of 8.5 million inhabitants), based on those people being compulsorily listed in the population register, are located in these areas. The analysis according to the building category resulted in 2.05 million residential buildings in Austria (85 %), 93,000 of which (4.5 %) are exposed to these hazards. In contrast, 37,300 buildings (1.6 %) throughout the country belong to the category of accommodation facilities, 5,600 of which are exposed (15 %). Out of the 140,500 commercial buildings, 8,000 (5 %) are exposed. A considerable spatial variation was detectable within the communities and Federal States. In general, an above-average exposure of buildings to torrent processes and snow avalanches was detectable in communities located in the Federal State of Salzburg, Styria and Vorarlberg (torrents), and Tyrol and Vorarlberg (snow avalanches). 
In the alpine part of Austria, the share of exposed accommodation buildings was two times (Salzburg) and three times (Vorarlberg) higher than the regional average of exposed buildings, and the share of agricultural buildings was around 50 % lower than on the national level. A significantly higher share of people is exposed in Salzburg (torrents) and Tyrol and Vorarlberg (snow avalanches); nevertheless, there is a need for a further in-depth local analysis. The results clearly indicate that an assessment using nation-wide data on buildings and population has advantages in vulnerability assessment compared to traditional approaches. However, the data has some limits if information on the large scale of individual catchments is needed, which restricts the application when an increase in resilience towards mountain hazards is targeted.
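
The nation-wide, rule-based exposure counting described above is, at its core, a spatial join between building locations and hazard-map zones. A toy Python sketch with axis-aligned rectangles standing in for hazard polygons (a real analysis would query GIS geometries against the SQL register database):

```python
def exposure_share(buildings, hazard_zones):
    """Count buildings located inside any hazard zone.

    buildings: list of (x, y) point locations from a building register;
    hazard_zones: list of (xmin, ymin, xmax, ymax) rectangles approximating
    hazard-map polygons. Returns (exposed_count, share_in_percent).
    Illustrative simplification; not the authors' implementation.
    """
    def exposed(pt):
        x, y = pt
        return any(xmin <= x <= xmax and ymin <= y <= ymax
                   for xmin, ymin, xmax, ymax in hazard_zones)

    n_exposed = sum(1 for b in buildings if exposed(b))
    return n_exposed, 100.0 * n_exposed / len(buildings)
```

The reported figures (e.g., roughly 5 % of 2.4 million buildings exposed to torrent processes) are exactly this kind of count-and-share statistic, computed per building category and administrative unit.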

  4. Saliency image of feature building for image quality assessment

    NASA Astrophysics Data System (ADS)

    Ju, Xinuo; Sun, Jiyin; Wang, Peng

    2011-11-01

    The purpose and method of image quality assessment are quite different for automatic target recognition (ATR) and traditional applications. Local invariant feature detectors, mainly including corner detectors, blob detectors, and region detectors, are widely applied for ATR. A saliency model of features is proposed in this paper to evaluate the feasibility of ATR. The first step consists of computing the first-order derivatives in the horizontal and vertical orientations, and computing DoG maps at different scales. Next, saliency images of features are built based on the auto-correlation matrix at each scale. Then, the saliency images of features at different scales are amalgamated. Experiments were performed on a large test set, including infrared images and optical images, and the results showed that the salient regions computed by this model were consistent with the real feature regions computed by most local invariant feature extraction algorithms.
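
The multi-scale DoG (difference-of-Gaussians) step of such a saliency pipeline can be sketched with NumPy; this is a generic reconstruction under stated assumptions, not the authors' code:

```python
import numpy as np

def gaussian_kernel(sigma):
    """Normalized 1-D Gaussian kernel truncated at 3 sigma."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def gaussian_blur(img, sigma):
    # separable Gaussian: filter each row, then each column
    k = gaussian_kernel(sigma)
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

def dog_map(img, sigma_fine, sigma_coarse):
    # difference-of-Gaussians response for one scale pair; a multi-scale
    # pipeline would evaluate several pairs and combine (amalgamate) them
    return gaussian_blur(img, sigma_fine) - gaussian_blur(img, sigma_coarse)
```

The first-order derivative maps mentioned in the abstract could be obtained analogously, e.g. with `np.gradient(img)`, before forming the per-scale saliency images.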

  5. Subsurface information for risk-sensitive urban spatial planning in Dhaka Metropolitan City, Bangladesh

    NASA Astrophysics Data System (ADS)

    Günther, Andreas; Aziz Patwary, Mohammad Abdul; Bahls, Rebecca; Asaduzzaman, Atm; Ludwig, Rüdiger; Ashraful Kamal, Mohammad; Nahar Faruqa, Nurun; Jabeen, Sarwat

    2016-04-01

    Dhaka Metropolitan City (including Dhaka and five adjacent municipal areas) is one of the fastest developing urban regions in the world. Densely built-up areas in the developed metropolitan area of Dhaka City are subject to extensive restructuring as common six- or lower-storied buildings are replaced by higher and heavier constructions. Additional stories are built on existing houses, frequently exceeding the allowable bearing pressure on the subsoil as supported by the foundations. In turn, newly developing city areas are projected in marshy terrains modified by extensive, largely unengineered landfills. In most areas, these terrains bear unfavorable building ground conditions within 30 meters. Within a collaborative technical cooperation project between Bangladesh and Germany, BGR supports GSB in the provision of geo-information for the Capital Development Authority (RAJUK). For general urban planning, RAJUK successively develops a detailed area plan (DAP) at a scale of 1:50,000 for the whole Dhaka Metropolitan City area (approx. 1700 km2). Geo-information has not been considered in the present DAP. Within the project, geospatial information in the form of a geomorphic map, a digital terrain model, and a 3-D subsurface model covering the whole city area has been generated at a scale of 1:50,000. An extensive engineering geological database consisting of more than 2200 borehole records with associated Standard Penetration Testing (SPT) and lab data has been compiled. With the field testing (SPT) and engineering geological lab data, the 3-D subsurface model can be parameterized to derive important spatial subsurface information for urban planning, such as bearing capacity evaluations for different foundation designs or soil liquefaction potential assessments for specific earthquake scenarios. 
In conjunction with inundation potential evaluations for different flooding scenarios, comprehensive building ground suitability information can be derived to support risk-sensitive urban planning in the Dhaka Metropolitan City area at the DAP scale.

  6. Large-area zinc oxide nanorod arrays templated by nanoimprint lithography: control of morphologies and optical properties

    NASA Astrophysics Data System (ADS)

    Zhang, Chen; Huang, Xiaohu; Liu, Hongfei; Chua, Soo Jin; Ross, Caroline A.

    2016-12-01

    Vertically aligned, highly ordered, large area arrays of nanostructures are important building blocks for multifunctional devices. Here, ZnO nanorod arrays are selectively synthesized on Si substrates by a solution method within patterns created by nanoimprint lithography. The growth modes of two dimensional nucleation-driven wedding cakes and screw dislocation-driven spirals are inferred to determine the top end morphologies of the nanorods. Sub-bandgap photoluminescence of the nanorods is greatly enhanced by the manipulation of the hydrogen donors via a post-growth thermal treatment. Lasing behavior is facilitated in the nanorods with faceted top ends formed from the wedding-cake growth mode. This work demonstrates the control of morphologies of oxide nanostructures on a large scale and the optimization of the optical performance.

  7. Evaluation of modal pushover-based scaling of one component of ground motion: Tall buildings

    USGS Publications Warehouse

    Kalkan, Erol; Chopra, Anil K.

    2012-01-01

    Nonlinear response history analysis (RHA) is now increasingly used for performance-based seismic design of tall buildings. Required for nonlinear RHAs is a set of ground motions selected and scaled appropriately so that analysis results would be accurate (unbiased) and efficient (having relatively small dispersion). This paper evaluates accuracy and efficiency of recently developed modal pushover–based scaling (MPS) method to scale ground motions for tall buildings. The procedure presented explicitly considers structural strength and is based on the standard intensity measure (IM) of spectral acceleration in a form convenient for evaluating existing structures or proposed designs for new structures. Based on results presented for two actual buildings (19 and 52 stories, respectively), it is demonstrated that the MPS procedure provided a highly accurate estimate of the engineering demand parameters (EDPs), accompanied by significantly reduced record-to-record variability of the responses. In addition, the MPS procedure is shown to be superior to the scaling procedure specified in the ASCE/SEI 7-05 document.
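
The intensity-measure step common to such ground-motion scaling procedures, matching a record's elastic spectral acceleration at the building's first-mode period to a target value, can be sketched as follows. The full MPS method additionally checks the inelastic deformation of a first-mode system, which is omitted here; names are illustrative:

```python
def amplitude_scale_factor(record_sa_at_t1, target_sa_at_t1):
    """Linear amplitude-scaling factor for a ground-motion record.

    Scaling the record by this factor makes its elastic spectral
    acceleration at the first-mode period T1 equal to the target intensity
    measure. This shows only the IM-matching step, not the pushover-based
    deformation check that MPS adds on top of it.
    """
    if record_sa_at_t1 <= 0:
        raise ValueError("spectral acceleration must be positive")
    return target_sa_at_t1 / record_sa_at_t1
```

For instance, a record with Sa(T1) = 0.2 g scaled toward a 0.5 g target receives a factor of 2.5 applied uniformly to its acceleration history.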

  8. NASA Goddard Earth Sciences Graduate Student Program. [FIRE CIRRUS-II examination of coupling between an upper tropospheric cloud system and synoptic-scale dynamics

    NASA Technical Reports Server (NTRS)

    Ackerman, Thomas P.

    1994-01-01

    The evolution of synoptic-scale dynamics associated with a middle and upper tropospheric cloud event that occurred on 26 November 1991 is examined. The case under consideration occurred during the FIRE CIRRUS-II Intensive Field Observing Period held in Coffeyville, KS during Nov. and Dec., 1991. Using data from the wind profiler demonstration network and a temporally and spatially augmented radiosonde array, emphasis is given to explaining the evolution of the kinematically-derived ageostrophic vertical circulations and correlating the circulation with the forcing of an extensively sampled cloud field. This is facilitated by decomposing the horizontal divergence into its component parts through a natural coordinate representation of the flow. Ageostrophic vertical circulations are inferred and compared to the circulation forcing arising from geostrophic confluence and shearing deformation derived from the Sawyer-Eliassen Equation. It is found that a thermodynamically indirect vertical circulation existed in association with a jet streak exit region. The circulation was displaced to the cyclonic side of the jet axis due to the orientation of the jet exit between a deepening diffluent trough and building ridge. The cloud line formed in the ascending branch of the vertical circulation with the most concentrated cloud development occurring in conjunction with the maximum large-scale vertical motion. The relationship between the large scale dynamics and the parameterization of middle and upper tropospheric clouds in large-scale models is discussed and an example of ice water contents derived from a parameterization forced by the diagnosed vertical motions and observed water vapor contents is presented.

  9. Testing of a Stitched Composite Large-Scale Multi-Bay Pressure Box

    NASA Technical Reports Server (NTRS)

    Jegley, Dawn; Rouse, Marshall; Przekop, Adam; Lovejoy, Andrew

    2016-01-01

    NASA has created the Environmentally Responsible Aviation (ERA) Project to develop technologies to reduce aviation's impact on the environment. A critical aspect of this pursuit is the development of a lighter, more robust airframe to enable the introduction of unconventional aircraft configurations. NASA and The Boeing Company have worked together to develop a structural concept that is lightweight and an advancement beyond state-of-the-art composite structures. The Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) is an integrally stiffened panel design where elements are stitched together. The PRSEUS concept is designed to maintain residual load carrying capabilities under a variety of damage scenarios. A series of building block tests were evaluated to explore the fundamental assumptions related to the capability and advantages of PRSEUS panels. The final step in the building block series is an 80%-scale pressure box representing a portion of the center section of a Hybrid Wing Body (HWB) transport aircraft. The testing of this article under maneuver load and internal pressure load conditions is the subject of this paper. The experimental evaluation of this article, along with the other building block tests and the accompanying analyses, has demonstrated the viability of a PRSEUS center body for the HWB vehicle. Additionally, much of the development effort is also applicable to traditional tube-and-wing aircraft, advanced aircraft configurations, and other structures where weight and through-the-thickness strength are design considerations.

  10. Ten-year monitoring of high-rise building columns using long-gauge fiber optic sensors

    NASA Astrophysics Data System (ADS)

    Glisic, B.; Inaudi, D.; Lau, J. M.; Fong, C. C.

    2013-05-01

    A large-scale lifetime building monitoring program was implemented in Singapore in 2001. The aims of this unique program were to increase safety, verify performance, control quality, increase knowledge, optimize maintenance costs, and evaluate the condition of structures after a hazardous event. The first instrumented building, which has now been monitored for more than ten years, is presented in this paper. Long-gauge fiber optic strain sensors were embedded in the fresh concrete of ground-level columns, so monitoring started at the birth of both the construction material and the structure. Measurement sessions were performed during construction, upon completion of each new story and the roof, and after construction, i.e., in service. Based on the results, it was possible to follow and evaluate the long-term behavior of the building through every stage of its life. The monitoring results were analyzed at the local (column) and global (building) levels. Over-dimensioning of one column was identified. Differential settlement of the foundations was detected and localized, and its magnitude estimated. A post-tremor analysis was performed. The real long-term behavior of the concrete columns was assessed. Finally, the long-term performance of the monitoring system was evaluated. The monitoring method, the monitoring system, the rich results gathered over approximately ten years, the data analysis algorithms, and the conclusions on the structural behavior and health condition of the building are presented in this paper.

  11. FAST MAGNETIC FIELD AMPLIFICATION IN THE EARLY UNIVERSE: GROWTH OF COLLISIONLESS PLASMA INSTABILITIES IN TURBULENT MEDIA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Falceta-Gonçalves, D.; Kowal, G.

    2015-07-20

    In this work we report on a numerical study of cosmic magnetic field amplification due to collisionless plasma instabilities. The collisionless magnetohydrodynamic equations derived here account for the pressure anisotropy that leads, under specific conditions, to the firehose and mirror instabilities. We study the time evolution of seed fields in turbulence under the influence of such instabilities. An approximate analytical time evolution of the magnetic field is provided, and the numerical simulations and analytical predictions are compared. We found that (i) amplification of the magnetic field was efficient in firehose-unstable turbulent regimes, but not in the mirror-unstable models; (ii) the growth rate of the magnetic energy density is much faster than that of the turbulent dynamo; and (iii) the efficient amplification occurs at small scales. The analytical prediction for the correlation between the growth timescales and pressure anisotropy is confirmed by the numerical simulations. These results reinforce the idea that pressure anisotropies, driven naturally in a turbulent collisionless medium such as the intergalactic medium, could efficiently amplify the magnetic field in the early universe (post-recombination era), prior to the collapse of the first large-scale gravitational structures. This mechanism, though fast for small-scale fields (∼kpc scales), is unable to provide relatively strong magnetic fields at large scales. Other mechanisms not accounted for here (e.g., collisional turbulence once the instabilities are quenched, velocity shear, or gravitationally induced inflows of gas into galaxies and clusters) could operate afterward to build up large-scale coherent field structures over the long time evolution.

  12. Managing carbon emissions in China through building energy efficiency.

    PubMed

    Li, Jun; Colombier, Michel

    2009-06-01

    This paper analyses the role of building energy efficiency (BEE) in climate change mitigation in China. It provides an analysis of the current situation and future prospects for the adoption of BEE technologies in Chinese cities, and outlines the economic and institutional barriers to large-scale deployment of sustainable, low-carbon, and even carbon-free construction techniques. Based on a comprehensive overview of energy demand characteristics and development trends driven by economic and demographic growth, different policy tools for cost-effective CO2 emission reduction in the Chinese construction sector are described. We propose a comprehensive approach combining building design and construction, urban planning, and the building material industries in order to drastically improve BEE during this period of rapid urban development. A coherent institutional framework needs to be established to ensure the implementation of efficiency policies. Regulatory and incentive options should be integrated into BEE policy portfolios to minimise the efficiency gap and realise sizeable carbon emission cuts in the coming decades. We analyse several policies and instruments in detail and formulate relevant policy proposals for fostering low-carbon construction technology in China. Specifically, our analysis shows that improving building energy efficiency can generate considerable carbon emission reduction credits at a competitive price under the CDM framework.

  13. Establishing a National 3d Geo-Data Model for Building Data Compliant to Citygml: Case of Turkey

    NASA Astrophysics Data System (ADS)

    Ates Aydar, S.; Stoter, J.; Ledoux, H.; Demir Ozbek, E.; Yomralioglu, T.

    2016-06-01

    This paper presents the generation of the 3D national building geo-data model of Turkey, which is compatible with the international OGC CityGML Encoding Standard. We prepare an Application Domain Extension (ADE) named CityGML-TRKBIS.BI, produced by extending the existing thematic modules of CityGML according to TRKBIS needs. All thematic data groups in the TRKBIS geo-data model have been remodelled in order to generate the national large-scale 3D geo-data model for Turkey. Specific attention has been paid to data groups, such as the building data model, whose class structure differs from the related CityGML data themes. The current 2D geo-information model for the building data theme of Turkey (TRKBIS.BI) was established based on the INSPIRE specifications for buildings (Core 2D and Extended 2D profiles), ISO/TC 211 standards, and OGC web services. The new version of TRKBIS.BI, established according to the semantic and geometric rules of CityGML, will represent 2D, 2.5D, and 3D objects. After a short overview of the generic approach, this paper describes the extension of the CityGML building data theme according to TRKBIS.BI in several steps. First, the building models of both standards were compared with respect to their data structure, classes, and attributes. Second, the CityGML building model was extended with respect to TRKBIS needs and the CityGML-TRKBIS Building ADE was established in UML. This study provides new insights into 3D applications in Turkey. The generated 3D geo-data model for the building thematic class will be used as a common exchange format that meets 2D, 2.5D, and 3D implementation needs at the national level.
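    The ADE mechanism described above can be sketched in miniature: national attributes travel alongside standard CityGML building properties in their own XML namespace, which is what keeps the extended model readable by generic CityGML tools. In the sketch below, the CityGML 2.0 namespace URIs are the standard ones, but the trkbis namespace, its element names, and the attribute values are hypothetical placeholders, not the actual CityGML-TRKBIS.BI schema.

```python
import xml.etree.ElementTree as ET

# Standard CityGML 2.0 namespaces, plus a placeholder ADE namespace
# (the real CityGML-TRKBIS.BI namespace and elements are not reproduced here).
NS = {
    "core": "http://www.opengis.net/citygml/2.0",
    "bldg": "http://www.opengis.net/citygml/building/2.0",
    "trkbis": "http://example.org/trkbis/1.0",
}
for prefix, uri in NS.items():
    ET.register_namespace(prefix, uri)

model = ET.Element(f"{{{NS['core']}}}CityModel")
member = ET.SubElement(model, f"{{{NS['core']}}}cityObjectMember")
bldg = ET.SubElement(member, f"{{{NS['bldg']}}}Building")

# A standard CityGML building property...
ET.SubElement(bldg, f"{{{NS['bldg']}}}measuredHeight", {"uom": "m"}).text = "12.5"
# ...and a national ADE attribute riding along in its own namespace.
ET.SubElement(bldg, f"{{{NS['trkbis']}}}cadastralParcelNo").text = "34-101-7"

print(ET.tostring(model, encoding="unicode"))
```

    A parser that knows only CityGML can still read the Building element and ignore (or preserve) the trkbis content, which is the interoperability property the ADE approach is designed to provide.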

  14. Building-in-Briefcase: A Rapidly-Deployable Environmental Sensor Suite for the Smart Building.

    PubMed

    Weekly, Kevin; Jin, Ming; Zou, Han; Hsu, Christopher; Soyza, Chris; Bayen, Alexandre; Spanos, Costas

    2018-04-29

    A building’s environment has a profound influence on occupant comfort and health. Continuous monitoring of building occupancy and environment is essential to fault detection, intelligent control, and building commissioning. Though many solutions for environmental measurement based on wireless sensor networks exist, they are not easily accessible to households and building owners, who may lack the time or technical expertise needed to set up a system and get a quick, detailed overview of environmental conditions. Building-in-Briefcase (BiB) is a portable sensor network platform that is trivially easy to deploy in any building environment. Once the sensors are distributed, the environmental data is collected and communicated to the BiB router via the Transmission Control Protocol/Internet Protocol (TCP/IP) over WiFi, and the router then forwards the data securely over the internet to the central database through a 3G radio. The user, with minimal effort, can access the aggregated data and visualize the trends in real time on the BiB web portal. Paramount to the adoption and continued operation of an indoor sensing platform is battery lifetime. This design has achieved a multi-year lifespan through careful selection of components, an efficient binary communications protocol, and data compression. Our BiB sensor collects a rich set of environmental parameters and is expandable to measure others, such as CO2. This paper describes the power characteristics of BiB sensors and their occupancy estimation and activity recognition functionality. We have demonstrated large-scale deployment of BiB throughout Singapore. Our vision is that monitoring thousands of buildings through BiB would provide ample opportunities for research and for identifying ways to improve the building environment and energy efficiency.
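    Much of the power saving attributed above to an efficient binary communications protocol comes from sending fixed-size packed records instead of self-describing text. The record layout below is a hypothetical illustration, not BiB's actual wire format: a 4-byte epoch timestamp, temperature in hundredths of a degree Celsius, relative humidity in hundredths of a percent, and light level in lux.

```python
import json
import struct

# Hypothetical 10-byte record: little-endian uint32 timestamp, int16
# temperature (0.01 degC), uint16 humidity (0.01 %RH), uint16 light (lux).
RECORD = struct.Struct("<IhHH")

def pack_reading(ts, temp_c, rh_pct, lux):
    return RECORD.pack(ts, round(temp_c * 100), round(rh_pct * 100), lux)

def unpack_reading(buf):
    ts, temp, rh, lux = RECORD.unpack(buf)
    return ts, temp / 100, rh / 100, lux

binary = pack_reading(1514764800, 23.45, 51.20, 310)
text = json.dumps({"ts": 1514764800, "temp": 23.45, "rh": 51.2, "lux": 310}).encode()
print(len(binary), len(text))  # the packed record is several times smaller
```

    Fewer bytes per reading means shorter radio transmissions, and radio time typically dominates the power budget of a duty-cycled sensor node.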

  15. Robotically Assembled Aerospace Structures: Digital Material Assembly using a Gantry-Type Assembler

    NASA Technical Reports Server (NTRS)

    Trinh, Greenfield; Copplestone, Grace; O'Connor, Molly; Hu, Steven; Nowak, Sebastian; Cheung, Kenneth; Jenett, Benjamin; Cellucci, Daniel

    2017-01-01

    This paper evaluates the development of automated assembly techniques for discrete lattice structures using a multi-axis gantry-type CNC machine. These lattices are made of discrete components called digital materials. We present the development of a specialized end effector that works in conjunction with the CNC machine to assemble these lattices. With this configuration we are able to place voxels at a rate of 1.5 per minute. The scalability of digital material structures due to incremental modular assembly is one of their key traits and an important metric of interest. We investigate the build times of a 5x5 beam structure at scales of 1 meter (325 parts), 10 meters (3,250 parts), and 30 meters (9,750 parts). With the current configuration, a single end effector performing serial assembly with a globally fixed feed station at the edge of the build volume, the build time increases according to a scaling law of n^4, where n is the build scale. Build times can be reduced significantly by integrating feed systems into the gantry itself, resulting in a scaling law of n^3. A completely serial assembly process will encounter time limitations as build scale increases. Automated assembly for digital materials can produce high-performance structures from discrete parts, and techniques such as built-in feed systems, parallelization, and optimization of the fastening process will yield much higher throughput.
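    The two scaling laws can be reproduced with a toy model (the placement rate is the paper's 1.5 voxels per minute; the part density and travel-time constant below are illustrative assumptions, not the paper's values): part count for a 3D lattice grows as n^3, and a fixed feed station adds a per-voxel round trip whose duration grows with the build scale n.

```python
# Toy serial-assembly model: with a fixed feed station, time ~ n^3 * n = n^4;
# with feed stock carried on the gantry, time ~ n^3.
def build_time(n, parts_per_unit_volume=325, t_place=1 / 1.5,
               travel_per_unit_scale=0.2, feed_onboard=False):
    """Total build time in minutes for a structure of scale n (arbitrary units)."""
    parts = parts_per_unit_volume * n ** 3            # parts grow as n^3
    travel = 0.0 if feed_onboard else travel_per_unit_scale * n
    return parts * (t_place + travel)

for n in (1, 2, 4):
    fixed, onboard = build_time(n), build_time(n, feed_onboard=True)
    print(n, round(fixed), round(onboard), round(fixed / onboard, 1))
```

    The ratio of the two build times grows linearly with n, which is exactly the difference between an n^4 and an n^3 law.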
  17. Load Diffusion in Composite Structures

    NASA Technical Reports Server (NTRS)

    Horgan, Cornelius O.; Simmonds, J. G.

    2000-01-01

    This research has been concerned with load diffusion in composite structures. Fundamental solid mechanics studies were carried out to provide a basis for assessing the complicated modeling necessary for the large-scale structures used by NASA. An understanding of the fundamental mechanisms of load diffusion in composite subcomponents is essential in developing primary composite structures. Analytical models of load diffusion behavior are extremely valuable in building an intuitive base for developing refined modeling strategies and assessing results from finite element analyses. The decay behavior of stresses and other field quantities provides a significant aid towards this process. The results are also amenable to parameter studies over a large parameter space and should be useful in structural tailoring studies.

  18. The Hyper Suprime-Cam software pipeline

    DOE PAGES

    Bosch, James; Armstrong, Robert; Bickerton, Steven; ...

    2017-10-12

    In this article, we describe the optical imaging data processing pipeline developed for the Subaru Telescope’s Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope’s Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.

  20. Development of building energy asset rating using stock modelling in the USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Na; Goel, Supriya; Makhmalbaf, Atefe

    2016-01-29

    The US Building Energy Asset Score helps building stakeholders quickly gain insight into the efficiency of building systems (envelope, electrical, and mechanical systems). A robust, easy-to-understand 10-point scoring system was developed to facilitate an unbiased comparison of similar building types across the country. The Asset Score does not rely on a database or specific building baselines to establish a rating. Rather, distributions of energy use intensity (EUI) for various building use types were constructed using Latin hypercube sampling and converted to a series of stepped linear scales to score buildings. A score is calculated based on the modelled source EUI after adjusting for climate. A web-based scoring tool, which incorporates an analytical engine and a simulation engine, was developed to standardize energy modelling and reduce implementation cost. This paper discusses the methodology used to perform several hundred thousand building simulation runs and develop the scoring scales.
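    The stepped-scale idea can be illustrated with a small sketch (hypothetical numbers; the Asset Score's actual bins and EUI distributions are not reproduced here): simulate an EUI distribution, take its decile breakpoints as the steps, and score a building by how many breakpoints it beats, so that a lower source EUI earns a higher score.

```python
import random

random.seed(1)
# Simulated source-EUI distribution for one building use type (illustrative).
euis = sorted(random.gauss(200, 40) for _ in range(100_000))

# Decile breakpoints of the distribution define the steps of the 10-point scale.
steps = [euis[int(len(euis) * k / 10)] for k in range(1, 10)]

def asset_score(eui):
    """Score 1..10: one point per decile breakpoint the building beats."""
    return 10 - sum(eui > s for s in steps)

print(asset_score(120), asset_score(300))
```

    A percentile-based scale of this kind needs no database lookup at scoring time, which matches the abstract's point that the Asset Score does not rely on building-specific baselines.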

  1. Assessment of codes, by-laws and regulations relating to air wells in building design

    NASA Astrophysics Data System (ADS)

    Fadzil, Sharifah Fairuz Syed; Karamazaman, Nazli

    2017-10-01

    Codes and by-laws concerning air well design (for buildings and lavatories) in Malaysia have been established in the Malaysian Uniform Building By-Laws (UBBL), number 40 (1) and (2), since the 1980s. Wells exist to fulfill ventilation and daylighting requirements. The minimum well areas by building storey height are compared between the UBBL and Singapore's well requirements from the Building and Construction Authority (BCA). A visual and graphical representation (with schematic building and well diagrams drawn to scale) of the minimum well sizes and dimensions is given. It can be seen that if the minimum required well size is used for buildings above 8 storeys high, the result is a thin well that is not proportionate to the building height. A proposed dimension, graphed and translated into graphics (three-dimensional buildings drawn to scale), is given for use in the UBBL and creates a much better well proportion.

  2. Statistical Learning Theory for High Dimensional Prediction: Application to Criterion-Keyed Scale Development

    PubMed Central

    Chapman, Benjamin P.; Weiss, Alexander; Duberstein, Paul

    2016-01-01

    Statistical learning theory (SLT) is the statistical formulation of machine learning theory, a body of analytic methods common in “big data” problems. Regression-based SLT algorithms seek to maximize predictive accuracy for some outcome, given a large pool of potential predictors, without overfitting the sample. Research goals in psychology may sometimes call for high-dimensional regression. One example is criterion-keyed scale construction, where a scale with maximal predictive validity must be built from a large item pool. Using this as a working example, we first introduce a core principle of SLT methods: minimization of expected prediction error (EPE). Minimizing EPE is fundamentally different from maximizing the within-sample likelihood, and hinges on building a predictive model of sufficient complexity to predict the outcome well, without the undue complexity that leads to overfitting. We describe how such models are built and refined via cross-validation. We then illustrate how three common SLT algorithms (Supervised Principal Components, Regularization, and Boosting) can be used to construct a criterion-keyed scale predicting all-cause mortality, using a large personality item pool within a population cohort. Each algorithm illustrates a different approach to minimizing EPE. Finally, we consider broader applications of SLT predictive algorithms, both as supportive analytic tools for conventional methods and as primary analytic tools in discovery-phase research. We conclude that, despite their differences from the classic null-hypothesis testing approach (or perhaps because of them), SLT methods may hold value as a statistically rigorous approach to exploratory regression. PMID:27454257
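    The core EPE idea can be shown in a few lines (an illustrative numpy sketch, not the article's code): with many predictors and few observations, a nearly unpenalized ridge fit hugs the training data, while k-fold cross-validation picks the penalty with the best out-of-fold error.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 80, 60                          # many predictors relative to sample size
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = 1.0                         # only 5 predictors carry real signal
y = X @ beta + rng.normal(size=n)

def ridge_fit(X, y, lam):
    """Closed-form ridge solution (X'X + lam*I)^-1 X'y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def cv_error(lam, k=5):
    """Mean out-of-fold squared error: an estimate of expected prediction error."""
    folds = np.array_split(np.arange(n), k)
    errs = []
    for fold in folds:
        train = np.setdiff1d(np.arange(n), fold)
        w = ridge_fit(X[train], y[train], lam)
        errs.append(np.mean((y[fold] - X[fold] @ w) ** 2))
    return float(np.mean(errs))

lams = [0.01, 0.1, 1.0, 10.0, 100.0]
best = min(lams, key=cv_error)         # minimize estimated EPE, not training fit
print(best)
```

    The selected penalty minimizes estimated out-of-sample error; maximizing the within-sample likelihood would always prefer the smallest penalty.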

  3. Adaptive latitudinal variation in Common Blackbird Turdus merula nest characteristics

    PubMed Central

    Mainwaring, Mark C; Deeming, D Charles; Jones, Chris I; Hartley, Ian R

    2014-01-01

    Nest construction is taxonomically widespread, yet our understanding of adaptive intraspecific variation in nest design remains poor. Nest characteristics are expected to vary adaptively in response to predictable variation in spring temperatures over large spatial scales, yet such variation in nest design remains largely overlooked, particularly amongst open-cup-nesting birds. Here, we systematically examined the effects of latitudinal variation in spring temperatures and precipitation on the morphology, volume, composition, and insulatory properties of open-cup-nesting Common Blackbirds’ Turdus merula nests to test the hypothesis that birds living in cooler environments at more northerly latitudes would build better insulated nests than conspecifics living in warmer environments at more southerly latitudes. As spring temperatures increased with decreasing latitude, the external diameter of nests decreased. However, as nest wall thickness also decreased, there was no variation in the diameter of the internal nest cups. Only the mass of dry grasses within nests decreased with warmer temperatures at lower latitudes. The insulatory properties of nests declined with warmer temperatures at lower latitudes and nests containing greater amounts of dry grasses had higher insulatory properties. The insulatory properties of nests decreased with warmer temperatures at lower latitudes, via changes in morphology (wall thickness) and composition (dry grasses). Meanwhile, spring precipitation did not vary with latitude, and none of the nest characteristics varied with spring precipitation. This suggests that Common Blackbirds nesting at higher latitudes were building nests with thicker walls in order to counteract the cooler temperatures. We have provided evidence that the nest construction behavior of open-cup-nesting birds systematically varies in response to large-scale spatial variation in spring temperatures. PMID:24683466

  4. Adaptive latitudinal variation in Common Blackbird Turdus merula nest characteristics.

    PubMed

    Mainwaring, Mark C; Deeming, D Charles; Jones, Chris I; Hartley, Ian R

    2014-03-01

    Nest construction is taxonomically widespread, yet our understanding of adaptive intraspecific variation in nest design remains poor. Nest characteristics are expected to vary adaptively in response to predictable variation in spring temperatures over large spatial scales, yet such variation in nest design remains largely overlooked, particularly amongst open-cup-nesting birds. Here, we systematically examined the effects of latitudinal variation in spring temperatures and precipitation on the morphology, volume, composition, and insulatory properties of open-cup-nesting Common Blackbirds' Turdus merula nests to test the hypothesis that birds living in cooler environments at more northerly latitudes would build better insulated nests than conspecifics living in warmer environments at more southerly latitudes. As spring temperatures increased with decreasing latitude, the external diameter of nests decreased. However, as nest wall thickness also decreased, there was no variation in the diameter of the internal nest cups. Only the mass of dry grasses within nests decreased with warmer temperatures at lower latitudes. The insulatory properties of nests declined with warmer temperatures at lower latitudes and nests containing greater amounts of dry grasses had higher insulatory properties. The insulatory properties of nests decreased with warmer temperatures at lower latitudes, via changes in morphology (wall thickness) and composition (dry grasses). Meanwhile, spring precipitation did not vary with latitude, and none of the nest characteristics varied with spring precipitation. This suggests that Common Blackbirds nesting at higher latitudes were building nests with thicker walls in order to counteract the cooler temperatures. We have provided evidence that the nest construction behavior of open-cup-nesting birds systematically varies in response to large-scale spatial variation in spring temperatures.

  5. 1. VIEW LOOKING NORTHWEST AT BUILDING 701. BUILDING 701 WAS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VIEW LOOKING NORTHWEST AT BUILDING 701. BUILDING 701 WAS USED TO DESIGN, BUILD, AND EVALUATE BENCH-SCALE TECHNOLOGIES USED IN ROCKY FLATS WASTE TREATMENT PROCESSES. (1/98) - Rocky Flats Plant, Design Laboratory, Northwest quadrant of Plant, between buildings 776-777 & 771, Golden, Jefferson County, CO

  6. Establishing a Scale for Assessing the Social Validity of Skill Building Interventions for Young Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Berger, Natalie I.; Manston, Lauren; Ingersoll, Brooke

    2016-01-01

    This study evaluated the psychometric properties of the Scale of Treatment Perceptions (STP), a measure of treatment acceptability targeting skill-building interventions for Autism Spectrum Disorder (ASD). This scale utilizes a strength-based approach to intervention assessment, and was established by modifying the Behavior Intervention Rating…

  7. Existence and control of Legionella bacteria in building water systems: A review.

    PubMed

    Springston, John P; Yocavitch, Liana

    2017-02-01

    Legionellae are waterborne bacteria which are capable of causing potentially fatal Legionnaires' disease (LD), as well as Pontiac Fever. Public concern about Legionella exploded following the 1976 outbreak at the American Legion conference in Philadelphia, where 221 attendees contracted pneumonia and 34 died. Since that time, a variety of different control methods and strategies have been developed and implemented in an effort to eradicate Legionella from building water systems. Despite these efforts, the incidence of LD has been steadily increasing in the U.S. for more than a decade. Public health and occupational hygiene professionals have maintained an active debate regarding best practices for management and control of Legionella. Professional opinion remains divided with respect to the relative merits of performing routine sampling for Legionella, vs. the passive, reactive approach that has been largely embraced by public health officials and facility owners. Given the potential risks and ramifications associated with waiting to assess systems for Legionella until after disease has been identified and confirmed, a proactive approach of periodic testing for Legionella, along with proper water treatment, is the best approach to avoiding large-scale disease outbreaks.

  8. End Effects and Load Diffusion in Composite Structures

    NASA Technical Reports Server (NTRS)

    Horgan, Cornelius O.; Ambur, D. (Technical Monitor); Nemeth, M. P. (Technical Monitor)

    2002-01-01

    The research carried out here builds on our previous NASA-supported research on the general topic of edge effects and load diffusion in composite structures. Further fundamental solid mechanics studies were carried out to provide a basis for assessing the complicated modeling necessary for the large-scale structures used by NASA. An understanding of the fundamental mechanisms of load diffusion in composite subcomponents is essential in developing primary composite structures. Specific problems recently considered focused on end effects in sandwich structures and in functionally graded materials. Both linear and nonlinear (geometric and material) problems have been addressed. Our goal is the development of readily applicable design formulas for the decay lengths in terms of non-dimensional material and geometric parameters. Analytical models of load diffusion behavior are extremely valuable in building an intuitive base for developing refined modeling strategies and assessing results from finite element analyses. The decay behavior of stresses and other field quantities provides a significant aid towards this process. The analysis is also amenable to parameter studies over a large parameter space and should be useful in structural tailoring studies.
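    The kind of design formula sought above can be indicated schematically (notation assumed here for illustration, not quoted from the report): self-equilibrated end loads on a composite strip decay exponentially with axial distance, with a decay length that grows with the ratio of extensional to shear stiffness.

```latex
% Illustrative Saint-Venant decay estimate for a strip of half-width h:
\[
  \sigma(x) \;\sim\; \sigma_0 \, e^{-x/\lambda},
  \qquad
  \lambda \;\approx\; c\, h \,\sqrt{\frac{E_L}{G_{LT}}},
\]
% where E_L is the longitudinal modulus, G_{LT} the in-plane shear modulus,
% and c an O(1) constant. For strongly anisotropic composites E_L/G_{LT} >> 1,
% so end effects penetrate much farther than in isotropic materials.
```

    A non-dimensional formula of this shape is what makes the decay behavior usable directly in structural tailoring studies.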

  9. Open | SpeedShop: An Open Source Infrastructure for Parallel Performance Analysis

    DOE PAGES

    Schulz, Martin; Galarowicz, Jim; Maghrak, Don; ...

    2008-01-01

    Over the last decades, a large number of performance tools have been developed to analyze and optimize high-performance applications. Their acceptance by end users, however, has been slow: each tool alone is often limited in scope and comes with widely varying interfaces and workflow constraints, requiring different changes in the often complex build and execution infrastructure of the target application. We started the Open | SpeedShop project about 3 years ago to overcome these limitations and provide efficient, easy to apply, and integrated performance analysis for parallel systems. Open | SpeedShop has two different faces: it provides an interoperable tool set covering the most common analysis steps as well as a comprehensive plugin infrastructure for building new tools. In both cases, the tools can be deployed to large-scale parallel applications using DPCL/Dyninst for distributed binary instrumentation. Further, all tools developed within or on top of Open | SpeedShop are accessible through multiple fully equivalent interfaces, including an easy-to-use GUI as well as an interactive command line interface, reducing the usage threshold for those tools.

  10. Effect of a constant-level lighting control system on small offices with windows. Final technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edgar, L.

    To reduce energy consumption stemming from lighting, some of the fixtures in Army office buildings have been delamped and building energy managers have instituted the policy of turning lights off when not in use. Even with these measures, lighting is still one of the largest consumers of electricity. The current problem is to find ways to reduce the energy consumption of lighting systems when they are in use. The objective of this research was to provide information on the performance and energy savings potential of constant-level lighting (CLL) controls. Based on a review of product information, researchers selected the Conservolite Plus 20 for testing and installed it in 10 office spaces. After 4 months of operation, a survey of the office occupants revealed that they were satisfied with the CLL system. Although electrical cost savings were realized, the payback period varied greatly, depending on the cost of replacing old or inoperable lamps and ballasts. Before large-scale installation of CLL systems, it is recommended that the power factor and harmonic distortion be monitored at a large facility.
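    The payback observation above reduces to simple arithmetic; the sketch below uses hypothetical costs (the report's actual figures are not reproduced) to show how lamp and ballast replacement stretches the payback period.

```python
def simple_payback_years(install_cost, repair_cost, annual_kwh_saved,
                         price_per_kwh=0.08):
    """Years to recover up-front cost from annual electricity savings."""
    annual_savings = annual_kwh_saved * price_per_kwh
    return (install_cost + repair_cost) / annual_savings

# Same controller and savings; the second site must also replace old
# lamps and ballasts before installation (all numbers hypothetical).
print(simple_payback_years(1200, 0, 3000))    # 5.0 years
print(simple_payback_years(1200, 800, 3000))  # noticeably longer
```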

  11. Damage Assessment of a Full-Scale Six-Story wood-frame Building Following Triaxial shake Table Tests

    Treesearch

    John W. van de Lindt; Rakesh Gupta; Shiling Pei; Kazuki Tachibana; Yasuhiro Araki; Douglas Rammer; Hiroshi Isoda

    2012-01-01

    In the summer of 2009, a full-scale midrise wood-frame building was tested under a series of simulated earthquakes on the world's largest shake table in Miki City, Japan. The objective of this series of tests was to validate a performance-based seismic design approach by qualitatively and quantitatively examining the building's seismic performance in terms of...

  12. A CCD experimental platform for large telescope in Antarctica based on FPGA

    NASA Astrophysics Data System (ADS)

    Zhu, Yuhua; Qi, Yongjun

    2014-07-01

    The CCD detector is one of the important components of astronomical telescopes. For a large telescope in Antarctica, a CCD detector system with large format, high sensitivity, and low noise is indispensable. Because the site is extremely cold and unattended, system maintenance and software and hardware upgrades are difficult. This paper introduces a general CCD controller experimental platform based on a field-programmable gate array (FPGA), which is, in effect, a large-scale field-reconfigurable array. Taking advantage of the ease of modifying such a system, the driving circuit, digital signal processing module, network communication interface, control algorithm validation, and remote reconfiguration module can all be realized. With the concept of integrated hardware and software, the paper discusses the key technologies for building a scientific CCD system suited to the special working environment in Antarctica, focusing on remote reconfiguration of the controller via the network, and then offers a feasible hardware and software solution.

  13. Method for detecting moment connection fracture using high-frequency transients in recorded accelerations

    USGS Publications Warehouse

    Rodgers, J.E.; Çelebi, M.

    2011-01-01

    The 1994 Northridge earthquake caused brittle fractures in steel moment frame building connections, despite causing little visible building damage in most cases. Future strong earthquakes are likely to cause similar damage to the many un-retrofitted pre-Northridge buildings in the western US and elsewhere. Without obvious permanent building deformation, costly intrusive inspections are currently the only way to determine if major fracture damage that compromises building safety has occurred. Building instrumentation has the potential to provide engineers and owners with timely information on fracture occurrence. Structural dynamics theory predicts, and scale model experiments have demonstrated, that sudden, large changes in structure properties caused by moment connection fractures will cause transient dynamic response. A method is proposed for detecting the building-wide level of connection fracture damage, based on observing high-frequency, fracture-induced transient dynamic responses in strong motion accelerograms. High-frequency transients are short (<1 s), sudden-onset waveforms with frequency content above 25 Hz that are visually apparent in recorded accelerations. Strong motion data and damage information from intrusive inspections collected from 24 sparsely instrumented buildings following the 1994 Northridge earthquake are used to evaluate the proposed method. The method's overall success rate for this data set is 67%, but this rate varies significantly with damage level. The method performs reasonably well in detecting significant fracture damage and in identifying cases with no damage, but fails in cases with few fractures. Combining the method with other damage indicators and removing records with excessive noise improves the ability to detect the level of damage. © 2010 Elsevier B.V. All rights reserved.
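
The core detection idea, flagging short, sudden-onset waveforms with content above 25 Hz in recorded accelerations, can be sketched as a simple threshold detector. This is an illustrative stand-in, not the authors' implementation: a crude moving-average high-pass replaces a proper digital filter, and the 5-sigma threshold is an invented parameter.

```python
# Schematic sketch (not the paper's method) of transient detection: isolate
# high-frequency content in an acceleration trace, then flag samples whose
# residual exceeds a noise-based threshold.

def highpass(signal, window):
    """Crudely remove low frequencies by subtracting a local moving average."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(signal[i] - sum(signal[lo:hi]) / (hi - lo))
    return out

def detect_transients(accel, fs, cutoff_hz=25.0, k=5.0):
    """Return sample indices whose high-frequency residual exceeds k sigma."""
    window = max(3, int(fs / cutoff_hz))   # averaging span ~ one cutoff period
    resid = highpass(accel, window)
    sigma = (sum(r * r for r in resid) / len(resid)) ** 0.5
    return [i for i, r in enumerate(resid) if abs(r) > k * sigma]
```

A real application would use a proper high-pass filter and, as the paper notes, would still need to combine the detector with other damage indicators and noise screening.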

  14. A High-Granularity Approach to Modeling Energy Consumption and Savings Potential in the U.S. Residential Building Stock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Building simulations are increasingly used in various applications related to energy efficient buildings. For individual buildings, applications include: design of new buildings, prediction of retrofit savings, ratings, performance path code compliance and qualification for incentives. Beyond individual building applications, larger scale applications (across the stock of buildings at various scales: national, regional and state) include: codes and standards development, utility program design, regional/state planning, and technology assessments. For these sorts of applications, a set of representative buildings is typically simulated to predict performance of the entire population of buildings. Focusing on the U.S. single-family residential building stock, this paper will describe how multiple data sources for building characteristics are combined into a highly granular database that preserves the important interdependencies of the characteristics. We will present the sampling technique used to generate a representative set of thousands (up to hundreds of thousands) of building models. We will also present results of detailed calibrations against building stock consumption data.
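
One way to preserve interdependencies between building characteristics when generating representative models is to sample later characteristics conditionally on earlier ones. The sketch below is a hypothetical illustration of that general idea, not the paper's technique; the probability tables (vintage, and wall insulation conditioned on vintage) are invented.

```python
# Illustrative conditional sampling: insulation level depends on vintage, so
# the joint distribution's interdependencies survive in the sampled stock.
# All probabilities are invented placeholders, not data from the paper.
import random

P_VINTAGE = {"pre-1960": 0.3, "1960-1990": 0.4, "post-1990": 0.3}
P_INSULATION_GIVEN_VINTAGE = {
    "pre-1960":  {"none": 0.6, "R-11": 0.3, "R-19": 0.1},
    "1960-1990": {"none": 0.2, "R-11": 0.5, "R-19": 0.3},
    "post-1990": {"none": 0.0, "R-11": 0.3, "R-19": 0.7},
}

def draw(dist, rng):
    """Sample one key from a {value: probability} table."""
    r, acc = rng.random(), 0.0
    for key, p in dist.items():
        acc += p
        if r < acc:
            return key
    return key  # guard against floating-point round-off

def sample_building(rng):
    vintage = draw(P_VINTAGE, rng)
    insulation = draw(P_INSULATION_GIVEN_VINTAGE[vintage], rng)
    return {"vintage": vintage, "insulation": insulation}

rng = random.Random(0)
stock = [sample_building(rng) for _ in range(10000)]
```

Because insulation is drawn conditionally, implausible combinations (e.g. uninsulated post-1990 homes, given the zero probability above) never appear in the sampled stock.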

  15. Characterization and provenance of the building stones from Pompeii's archaeological site (southern Italy)

    NASA Astrophysics Data System (ADS)

    Balassone, G.; Kastenmeier, P.; di Maio, G.; Mormone, A.; Joachimski, M.

    2009-04-01

    Pompeii is one of the most famous and complex archaeological sites in the world, with a uniquely favorable state of preservation. Although many studies have been devoted to archaeological aspects of this ancient city, large-scale and detailed studies aimed at characterizing the mineralogy, petrography and isotope geochemistry of the building stones are still lacking. The scope of the present research is to fill this gap by defining the provenance of the stony materials used in the ancient constructions of the city of Pompeii and the possible trade routes. This work is part of a large-scale survey carried out by the Deutsches Archäologisches Institut of Berlin, with the purpose of reconstructing the sources of raw materials of various archaeological sites of the Sarno Plain (e.g. the Longola-Poggiomarino settlement, Nuceria, Stabiae, etc.) and consequently also the paleo-environments of this area during the Holocene (Seiler, 2006, 2008; Kastenmeier and Seiler, 2007). We sampled all the lithotypes with different macroscopic characteristics from various buildings according to location, age (time span VI century B.C. - I century A.D.) and utilization; the architectural buildings considered for this study are mainly represented by public and religious buildings, houses and funerary monuments. As possible source areas, representative lithotypes have also been sampled from ancient pits and outcrops surrounding Pompeii. A set of 80 samples was collected by means of micro-drilling for mineralogical, petrographic and geochemical analyses, comprising optical microscopy, X-ray diffraction, inductively coupled plasma mass spectrometry, X-ray fluorescence and C-O isotope geochemistry. Minero-petrographic and XRD studies of the Pompeii rock samples have shown that at least ten different lithologies occur as building stones, belonging to basaltic to tephritic lavas, pyroclasts (tuffs, scoriae, etc.) and sedimentary rocks (limestones, travertines). 
Preliminary results on source localities indicate a local provenance for a set of volcanic rock samples, whereas the possible source areas of the sedimentary lithotypes appear more complex. New minero-petrographic data for samples from the surrounding outcrops are presented and compared with the related Pompeii building stones. References: Seiler F. (2006) - Current research projects. In: Aktuelle Forschungsprojekte, Deutsches Archäologisches Institut Zentrale, 34-35. Seiler F. (2008) - Rekonstruktion der antiken Kulturlandschaften des Sarno-Beckens. Ein multidisziplinäres Kooperationsprojekt mit Partnern aus Naturwissenschaften und Altertumswissenschaften in Deutschland, Italien und England. In: P. G. Guzzo - M. P. Guidobaldi (Eds), Nuove ricerche archeologiche nell'area vesuviana (scavi 2003-2006). Convegno Internazionale, Roma 1-3 febbraio, 485-490. Kastenmeier P., Seiler F. (2007) - La ricostruzione dei paleo-paesaggi nella piana del Sarno. Quaderni Autorità di Bacino del Sarno. Studi, documentazione e ricerca, 1, 24-26.

  16. Development of an audio-based virtual gaming environment to assist with navigation skills in the blind.

    PubMed

    Connors, Erin C; Yazzolino, Lindsay A; Sánchez, Jaime; Merabet, Lotfi B

    2013-03-27

    Audio-based Environment Simulator (AbES) is virtual environment software designed to improve real world navigation skills in the blind. Using only audio based cues and set within the context of a video game metaphor, users gather relevant spatial information regarding a building's layout. This allows the user to develop an accurate spatial cognitive map of a large-scale three-dimensional space that can be manipulated for the purposes of a real indoor navigation task. After game play, participants are then assessed on their ability to navigate within the target physical building represented in the game. Preliminary results suggest that early blind users were able to acquire relevant information regarding the spatial layout of a previously unfamiliar building as indexed by their performance on a series of navigation tasks. These tasks included path finding through the virtual and physical building, as well as a series of drop off tasks. We find that the immersive and highly interactive nature of the AbES software appears to greatly engage the blind user to actively explore the virtual environment. Applications of this approach may extend to larger populations of visually impaired individuals.

  17. Gypsum: a review of its role in the deterioration of building materials

    NASA Astrophysics Data System (ADS)

    Charola, A. Elena; Pühringer, Josef; Steiger, Michael

    2007-03-01

    The deterioration of buildings and monuments by gypsum is the result of crystallization cycles of this salt. Although gypsum can dehydrate to a hemihydrate, the mineral bassanite, and to an anhydrate, the mineral anhydrite, this reaction occurs in nature on a geological time scale and therefore it is unlikely to occur when gypsum is found on and in building materials. The CaSO4-H2O system appears deceptively simple, however there are still discrepancies between the experimental and thermodynamically calculated data. The reason for the latter can be attributed to the slow crystallization kinetics of anhydrite. Apart from this, the large numbers of studies carried out on this system have focused on industrially important metastable phases, such as the hemihydrate and soluble anhydrite. The paper presents a review of the studies dealing with the phase equilibria of the CaSO4-H2O system as well as the influence of other salts on the solubility of gypsum. It tries to glean out the relevant information that will serve to explain the deterioration observed on building materials by the crystallization of gypsum and thus allows developing improved conservation methods.
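
The hydration states named above (gypsum, the hemihydrate bassanite, and anhydrite) are related by the standard dehydration reactions of the CaSO4-H2O system:

```latex
\begin{align*}
\mathrm{CaSO_4\cdot 2H_2O} &\longrightarrow \mathrm{CaSO_4\cdot \tfrac{1}{2}H_2O} + \tfrac{3}{2}\,\mathrm{H_2O} && \text{(gypsum $\to$ hemihydrate, bassanite)}\\
\mathrm{CaSO_4\cdot \tfrac{1}{2}H_2O} &\longrightarrow \mathrm{CaSO_4} + \tfrac{1}{2}\,\mathrm{H_2O} && \text{(hemihydrate $\to$ anhydrite)}
\end{align*}
```

As the review notes, these dehydrations proceed only on geological time scales under ambient conditions, so the cycles that damage building materials involve the crystallization of gypsum itself rather than these phase changes.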

  18. Culture’s building blocks: investigating cultural evolution in a LEGO construction task

    PubMed Central

    McGraw, John J.; Wallot, Sebastian; Mitkidis, Panagiotis; Roepstorff, Andreas

    2014-01-01

    One of the most essential but theoretically vexing issues regarding the notion of culture is that of cultural evolution and transmission: how a group’s accumulated solutions to invariant challenges develop and persevere over time. But at the moment, the notion of applying evolutionary theory to culture remains little more than a suggestive trope. Whereas the modern synthesis of evolutionary theory has provided an encompassing scientific framework for the selection and transmission of biological adaptations, a convincing theory of cultural evolution has yet to emerge. One of the greatest challenges for theorists is identifying the appropriate time scales and units of analysis in order to reduce the intractably large and complex phenomenon of “culture” into its component “building blocks.” In this paper, we present a model for scientifically investigating cultural processes by analyzing the ways people develop conventions in a series of LEGO construction tasks. The data revealed a surprising pattern in the selection of building bricks as well as features of car design across consecutive building sessions. Our findings support a novel methodology for studying the development and transmission of culture through the microcosm of interactive LEGO design and assembly. PMID:25309482

  19. Culture's building blocks: investigating cultural evolution in a LEGO construction task.

    PubMed

    McGraw, John J; Wallot, Sebastian; Mitkidis, Panagiotis; Roepstorff, Andreas

    2014-01-01

    One of the most essential but theoretically vexing issues regarding the notion of culture is that of cultural evolution and transmission: how a group's accumulated solutions to invariant challenges develop and persevere over time. But at the moment, the notion of applying evolutionary theory to culture remains little more than a suggestive trope. Whereas the modern synthesis of evolutionary theory has provided an encompassing scientific framework for the selection and transmission of biological adaptations, a convincing theory of cultural evolution has yet to emerge. One of the greatest challenges for theorists is identifying the appropriate time scales and units of analysis in order to reduce the intractably large and complex phenomenon of "culture" into its component "building blocks." In this paper, we present a model for scientifically investigating cultural processes by analyzing the ways people develop conventions in a series of LEGO construction tasks. The data revealed a surprising pattern in the selection of building bricks as well as features of car design across consecutive building sessions. Our findings support a novel methodology for studying the development and transmission of culture through the microcosm of interactive LEGO design and assembly.

  20. Evaluating Trade-offs of a Large, Infrequent Diversion for Restoration of a Forested Wetland and Associated Ecosystem Services in the Mississippi delta

    NASA Astrophysics Data System (ADS)

    Day, J.; Rutherford, J.; Weigman, A.; D'Elia, C.

    2017-12-01

    Flood control levees have eliminated the supply of sediment to Mississippi delta coastal wetlands, putting the delta on a trajectory for submergence in the 21st century. River diversions have been proposed as a method to provide a sustainable supply of sediment to the delta. Diversions can be operated to mimic the size and frequency of natural crevasse events, which were large (>5000 m3/s) and infrequent (active less than once a year). This study assesses trade-offs for a large, infrequent diversion into the forested wetlands of the Maurepas Swamp using a two-dimensional model that simulates land building for several diversion sizes and operating frequencies. A cost-benefit analysis (CBA) was conducted by combining model results with an ecosystem service valuation (ESV) and estimated costs. Land building is proportional to diversion size and inversely proportional to years inactive. Because benefits are assumed to scale linearly with land gain, and costs increase with diversion size, there are disadvantages to operating large diversions less often, compared with smaller diversions operated more often. However, infrequent operation would provide additional ecosystem-service benefits to the broader Lake Pontchartrain ecosystem by minimizing long-term changes to water quality and salinity, reducing inundation time, and allowing greater consolidation of soils between diversion pulses. Compared to diversions, marsh creation costs increase over time due to sea level rise and energy costs.
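
The stated trade-off (land building proportional to diversion size and inversely proportional to years inactive, with benefits scaling with land gain and costs with size) can be sketched as a toy benefit-cost comparison. All coefficients below are hypothetical placeholders, not the study's calibrated values.

```python
# Toy benefit-cost sketch of the diversion trade-off (invented coefficients).

def land_gain(size_m3s, years_inactive, k=1.0):
    """Relative land built (arbitrary units): proportional to size,
    inversely proportional to years the diversion sits inactive."""
    return k * size_m3s / max(1.0, years_inactive)

def benefit_cost_ratio(size_m3s, years_inactive,
                       value_per_unit_land=2.0, cost_per_m3s=1.5):
    benefit = value_per_unit_land * land_gain(size_m3s, years_inactive)
    cost = cost_per_m3s * size_m3s       # cost grows with diversion size
    return benefit / cost

# A large diversion operated infrequently vs. a smaller one operated yearly:
large_infrequent = benefit_cost_ratio(5000.0, years_inactive=5.0)
small_frequent = benefit_cost_ratio(1000.0, years_inactive=1.0)
```

Under these placeholder numbers the smaller, more frequent diversion achieves the higher benefit-cost ratio, reproducing the qualitative disadvantage the study identifies, while the unmodeled water-quality and soil-consolidation benefits would push back toward infrequent operation.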

  1. Water, Carbon, and Nutrient Cycling Following Insect-induced Tree Mortality: How Well Do Plot-scale Observations Predict Ecosystem-Scale Response?

    NASA Astrophysics Data System (ADS)

    Brooks, P. D.; Barnard, H. R.; Biederman, J. A.; Borkhuu, B.; Edburg, S. L.; Ewers, B. E.; Gochis, D. J.; Gutmann, E. D.; Harpold, A. A.; Hicke, J. A.; Pendall, E.; Reed, D. E.; Somor, A. J.; Troch, P. A.

    2011-12-01

    Widespread tree mortality caused by insect infestations and drought has impacted millions of hectares across western North America in recent years. Although previous work on post-disturbance responses (e.g. experimental manipulations, fire, and logging) provides insight into how water and biogeochemical cycles may respond to insect infestations and drought, we find that the unique nature of these drivers of tree mortality complicates extrapolation to larger scales. Building from previous work on forest disturbance, we present a conceptual model of how temporal changes in forest structure impact the individual components of energy balance, hydrologic partitioning, and biogeochemical cycling and the interactions among them. We evaluate and refine this model using integrated observations and process modeling on multiple scales including plot, stand, flux tower footprint, hillslope, and catchment to identify scaling relationships and emergent patterns in hydrological and biogeochemical responses. Our initial results suggest that changes in forest structure at point or plot scales largely have predictable effects on energy, water, and biogeochemical cycles that are well captured by land surface, hydrological, and biogeochemical models. However, observations from flux towers and nested catchments suggest that both the hydrological and biogeochemical effects observed at tree and plot scales may be attenuated or exacerbated at larger scales. Compensatory processes are associated with attenuation (e.g. as transpiration decreases, evaporation and sublimation increase), whereas both attenuation and exacerbation may result from nonlinear scaling behavior across transitions in topography and ecosystem structure that affect the redistribution of energy, water, and solutes. 
Consequently, the effects of widespread tree mortality on ecosystem services of water supply and carbon sequestration will likely depend on how spatial patterns in mortality severity across the landscape affect large-scale hydrological partitioning.

  2. NORTHEAST FACADE AND ONE-STORY WING, VIEW FACING SOUTH-SOUTHWEST (with scale ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    NORTHEAST FACADE AND ONE-STORY WING, VIEW FACING SOUTH-SOUTHWEST (with scale stick). - Naval Air Station Barbers Point, Control Tower & Aviation Operations Building, Near intersection of runways between Hangar 110 & Building 115, Ewa, Honolulu County, HI

  3. Large-Scale Urban Decontamination; Developments, Historical Examples and Lessons Learned

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rick Demmer

    2007-02-01

    Recent terrorist threats and actual events have led to a renewed interest in the technical field of large-scale, urban environment decontamination. One of the driving forces for this interest is the real potential for the cleanup and removal of radioactive dispersal device (RDD or “dirty bomb”) residues. In response, the U.S. Government has spent many millions of dollars investigating RDD contamination and novel decontamination methodologies. Interest in chemical and biological (CB) cleanup has also been piqued by the threat of terrorist action such as the anthrax attack at the Hart Senate Office Building and by catastrophic natural events such as Hurricane Katrina. The efficiency of cleanup response will be improved with these new developments and a better understanding of the “old reliable” methodologies. Perhaps the most interesting area of investigation for large-area decontamination is that of the RDD. While an RDD is primarily an economic and psychological weapon, the need to clean up and return valuable or culturally significant resources to the public is nonetheless valid. Several private companies, universities and National Laboratories are currently developing novel RDD cleanup technologies. Because of their longstanding association with radioactive facilities, the U.S. Department of Energy National Laboratories are at the forefront in developing and testing new RDD decontamination methods. However, such cleanup technologies are likely to be fairly task-specific, since many different contamination mechanisms, substrates and environmental conditions will make actual application more complicated. Some major efforts have also been made to model potential contamination, to evaluate both old and new decontamination techniques, and to assess their readiness for use. Non-radioactive CB threats each have unique decontamination challenges, and recent events have provided some examples. The U.S. Environmental Protection Agency (EPA), as lead agency for these emergency cleanup responses, has a sound approach for decontamination decision-making that has been applied several times. The anthrax contamination at the U.S. Hart Senate Office Building and numerous U.S. Post Office facilities are examples of employing novel technical responses. Decontamination of the Hart Office Building required development of a new approach for high-level decontamination of biological contamination as well as techniques for evaluating technology effectiveness. The World Trade Center destruction also demonstrated the need for, and successful implementation of, appropriate cleanup methodologies. There are a number of significant lessons to be gained from a look at previous large-scale cleanup projects. Too often we are quick to apply a costly “package and dispose” method when sound technological cleaning approaches are available. Understanding historical perspectives, advanced planning and constant technology improvement are essential to successful decontamination.

  4. WAKE ISLAND AIRFIELD TERMINAL, BUILDING 1502 LOOKING EAST WITH PHOTO ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    WAKE ISLAND AIRFIELD TERMINAL, BUILDING 1502 LOOKING EAST WITH PHOTO SCALE CENTERED ON BUILDING (12/30/2008) - Wake Island Airfield, Terminal Building, West Side of Wake Avenue, Wake Island, Wake Island, UM

  5. Understanding continental megathrust earthquake potential through geological mountain building processes: an example in Nepal Himalaya

    NASA Astrophysics Data System (ADS)

    Zhang, Huai; Zhang, Zhen; Wang, Liangshu; Leroy, Yves; Shi, Yaolin

    2017-04-01

    How to reconcile continental megathrust earthquake characteristics, for instance mapping large-great earthquake sequences onto the geological mountain building process, or partitioning seismic and aseismic slip, is fundamental and remains unclear. Here, we address these issues by focusing on a typical continental collisional belt, the great Nepal Himalaya. We first show that refined Nepal Himalaya thrusting sequences, with an accurately defined large-earthquake cycle scale, provide new geodynamic hints on long-term earthquake potential, in association with either the seismic-aseismic slip partition (up to the interpretation of the binary interseismic coupling pattern on the Main Himalayan Thrust, MHT) or the classification of large and great earthquakes via seismic cycle patterns on the MHT. Subsequently, sequential limit analysis is adopted to retrieve the detailed thrusting sequences of the Nepal Himalaya mountain wedge. Our model results exhibit an apparent thrusting concentration phenomenon, with four thrusting clusters, termed thrusting 'families', that facilitate the development of the corresponding sub-structural regions. Within the hinterland thrusting family, the total aseismic shortening and its spatio-temporal release pattern are revealed by mapping projection. In the other three families, mapping projection delivers long-term large (M<8) and great (M>8) earthquake recurrence information, including total lifespans, frequencies and large-great earthquake alternation, by identifying rupture distances along the MHT. In addition, this partition is general for continental-continental collisional orogenic belts with an identified interseismic coupling pattern, but it is not applicable in a continental-oceanic megathrust context.

  6. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  7. a Voxel-Based Metadata Structure for Change Detection in Point Clouds of Large-Scale Urban Areas

    NASA Astrophysics Data System (ADS)

    Gehrung, J.; Hebel, M.; Arens, M.; Stilla, U.

    2018-05-01

    Mobile laser scanning has the potential not only to create detailed representations of urban environments, but also to determine changes down to a very detailed level. An environment representation for change detection in large-scale urban environments based on point clouds has drawbacks in terms of memory scalability. Volumes, however, are a promising building block for memory-efficient change detection methods. The challenge of working with 3D occupancy grids is that the usual raycasting-based methods applied for their generation lead to artifacts caused by the traversal of unfavorably discretized space. These artifacts can distort the state of voxels in close proximity to planar structures. In this work we propose a raycasting approach that utilizes knowledge about planar surfaces to completely prevent this kind of artifact. To demonstrate the capabilities of our approach, a method for the iterative volumetric approximation of point clouds that speeds up the raycasting by 36 percent is also proposed.
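
For orientation, a minimal sketch of the baseline the paper improves on: updating a voxel occupancy grid by casting a ray from the sensor to each measured point, marking traversed voxels free and the end voxel occupied. This is a generic illustration, not the authors' code; the fixed-step sampling used here (instead of exact voxel traversal) is precisely the kind of discretization that can mislabel voxels near planar surfaces.

```python
# Naive occupancy-grid raycast (illustrative baseline, not the paper's method).

def voxel_of(point, size):
    """Map a 3D point to its integer voxel index."""
    return tuple(int(c // size) for c in point)

def cast_ray(origin, hit, voxel_size, grid):
    """Mark voxels between origin and hit 'free', the end voxel 'occupied'."""
    ox, oy, oz = origin
    dx, dy, dz = (hit[0] - ox, hit[1] - oy, hit[2] - oz)
    length = (dx * dx + dy * dy + dz * dz) ** 0.5
    steps = max(1, int(length / (voxel_size * 0.5)))  # ~2 samples per voxel
    for i in range(steps):
        t = i / steps
        v = voxel_of((ox + t * dx, oy + t * dy, oz + t * dz), voxel_size)
        grid.setdefault(v, "free")     # never downgrade an occupied voxel
    grid[voxel_of(hit, voxel_size)] = "occupied"  # end voxel wins
    return grid

grid = cast_ray((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 0.25, {})
```

An exact traversal (and, as proposed above, knowledge of planar surfaces) avoids the free/occupied misclassification this sampling scheme can produce at grazing angles.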

  8. DENA: A Configurable Microarchitecture and Design Flow for Biomedical DNA-Based Logic Design.

    PubMed

    Beiki, Zohre; Jahanian, Ali

    2017-10-01

    DNA is known as the building block for storing the codes of life and transferring genetic features through the generations. However, DNA strands can also be used for a new type of computation that opens fascinating horizons in computational medicine. Significant contributions have addressed the design of DNA-based logic gates for medical and computational applications, but serious challenges remain in designing medium- and large-scale DNA circuits. In this paper, a new microarchitecture and corresponding design flow are proposed to facilitate the design of multistage, large-scale DNA logic systems. The feasibility and efficiency of the proposed microarchitecture are evaluated by implementing a full adder, and its cascadability is then demonstrated by implementing a multistage 8-bit adder. Simulation results show the salient features of the proposed design style and microarchitecture in terms of scalability, implementation cost, and signal integrity of the DNA-based logic system compared to traditional approaches.

  9. Ecosystem experiment reveals benefits of natural and simulated beaver dams to a threatened population of steelhead (Oncorhynchus mykiss)

    PubMed Central

    Bouwes, Nicolaas; Weber, Nicholas; Jordan, Chris E.; Saunders, W. Carl; Tattam, Ian A.; Volk, Carol; Wheaton, Joseph M.; Pollock, Michael M.

    2016-01-01

    Beaver have been referred to as ecosystem engineers because of the large impacts their dam building activities have on the landscape; however, the benefits they may provide to fluvial fish species have been debated. We conducted a watershed-scale experiment to test how increasing beaver dam and colony persistence in a highly degraded incised stream affects the freshwater production of steelhead (Oncorhynchus mykiss). Following the installation of beaver dam analogs (BDAs), we observed significant increases in the density, survival, and production of juvenile steelhead without impacting upstream and downstream migrations. The steelhead response occurred as the quantity and complexity of their habitat increased. This study is the first large-scale experiment to quantify the benefits of beavers and BDAs to a fish population and its habitat. Beaver mediated restoration may be a viable and efficient strategy to recover ecosystem function of previously incised streams and to increase the production of imperiled fish populations. PMID:27373190

  10. A unified model explains commonness and rarity on coral reefs.

    PubMed

    Connolly, Sean R; Hughes, Terry P; Bellwood, David R

    2017-04-01

    Abundance patterns in ecological communities have important implications for biodiversity maintenance and ecosystem functioning. However, ecological theory has been largely unsuccessful at capturing multiple macroecological abundance patterns simultaneously. Here, we propose a parsimonious model that unifies widespread ecological relationships involving local aggregation, species-abundance distributions, and species associations, and we test this model against the metacommunity structure of reef-building corals and coral reef fishes across the western and central Pacific. For both corals and fishes, the unified model simultaneously captures extremely well local species-abundance distributions, interspecific variation in the strength of spatial aggregation, patterns of community similarity, species accumulation, and regional species richness, performing far better than alternative models also examined here and in previous work on coral reefs. Our approach contributes to the development of synthetic theory for large-scale patterns of community structure in nature, and to addressing ongoing challenges in biodiversity conservation at macroecological scales. © 2017 The Authors. Ecology Letters published by CNRS and John Wiley & Sons Ltd.

  11. LSSGalPy: Interactive Visualization of the Large-scale Environment Around Galaxies

    NASA Astrophysics Data System (ADS)

    Argudo-Fernández, M.; Duarte Puertas, S.; Ruiz, J. E.; Sabater, J.; Verley, S.; Bergond, G.

    2017-05-01

    New tools are needed to handle the growth of data in astrophysics delivered by recent and upcoming surveys. We aim to build open-source, light, flexible, and interactive software designed to visualize extensive three-dimensional (3D) tabular data. Entirely written in the Python language, we have developed interactive tools to browse and visualize the positions of galaxies in the universe and their positions with respect to its large-scale structures (LSS). Motivated by a previous study, we created two codes using Mollweide projection and wedge diagram visualizations, where survey galaxies can be overplotted on the LSS of the universe. These are interactive representations where the visualizations can be controlled by widgets. We have released these open-source codes that have been designed to be easily re-used and customized by the scientific community to fulfill their needs. The codes are adaptable to other kinds of 3D tabular data and are robust enough to handle several million objects.
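
The Mollweide view mentioned above rests on the standard equal-area Mollweide projection. As a hedged sketch of that underlying math (the textbook formula, not LSSGalPy's own code), the auxiliary angle θ is found by Newton iteration on 2θ + sin 2θ = π sin φ:

```python
# Standard Mollweide projection of sky coordinates (illustrative, pure math).
import math

def mollweide(lon, lat, radius=1.0):
    """Project longitude/latitude (radians) to Mollweide map x, y."""
    if abs(abs(lat) - math.pi / 2) < 1e-12:
        theta = math.copysign(math.pi / 2, lat)      # poles: closed form
    else:
        theta = lat
        for _ in range(50):                          # Newton iteration for
            f = 2 * theta + math.sin(2 * theta) - math.pi * math.sin(lat)
            theta -= f / (2 + 2 * math.cos(2 * theta))
            if abs(f) < 1e-12:
                break
    x = radius * (2 * math.sqrt(2) / math.pi) * lon * math.cos(theta)
    y = radius * math.sqrt(2) * math.sin(theta)
    return x, y
```

Plotting libraries such as matplotlib provide this projection built in; the point here is only the coordinate transform that puts every galaxy of a survey onto the equal-area sky map.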

  12. Collaborative mining and interpretation of large-scale data for biomedical research insights.

    PubMed

    Tsiliki, Georgia; Karacapilidis, Nikos; Christodoulou, Spyros; Tzagarakis, Manolis

    2014-01-01

    Biomedical research becomes increasingly interdisciplinary and collaborative in nature. Researchers need to efficiently and effectively collaborate and make decisions by meaningfully assembling, mining and analyzing available large-scale volumes of complex multi-faceted data residing in different sources. In line with related research directives revealing that, in spite of the recent advances in data mining and computational analysis, humans can easily detect patterns which computer algorithms may have difficulty in finding, this paper reports on the practical use of an innovative web-based collaboration support platform in a biomedical research context. Arguing that dealing with data-intensive and cognitively complex settings is not a technical problem alone, the proposed platform adopts a hybrid approach that builds on the synergy between machine and human intelligence to facilitate the underlying sense-making and decision making processes. User experience shows that the platform enables more informed and quicker decisions, by displaying the aggregated information according to their needs, while also exploiting the associated human intelligence.

  13. Collaborative Mining and Interpretation of Large-Scale Data for Biomedical Research Insights

    PubMed Central

    Tsiliki, Georgia; Karacapilidis, Nikos; Christodoulou, Spyros; Tzagarakis, Manolis

    2014-01-01

    Biomedical research is becoming increasingly interdisciplinary and collaborative in nature. Researchers need to collaborate efficiently and effectively, and to make decisions by meaningfully assembling, mining and analyzing the large volumes of complex, multi-faceted data residing in different sources. In line with related research directives revealing that, in spite of recent advances in data mining and computational analysis, humans can easily detect patterns that computer algorithms may have difficulty finding, this paper reports on the practical use of an innovative web-based collaboration support platform in a biomedical research context. Arguing that dealing with data-intensive and cognitively complex settings is not a technical problem alone, the proposed platform adopts a hybrid approach that builds on the synergy between machine and human intelligence to facilitate the underlying sense-making and decision-making processes. User experience shows that the platform enables more informed and quicker decisions by displaying aggregated information according to users' needs, while also exploiting the associated human intelligence. PMID:25268270

  14. The trispectrum in the Effective Field Theory of Large Scale Structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertolini, Daniele; Schutz, Katelin; Solon, Mikhail P.

    2016-06-01

    We compute the connected four-point correlation function (the trispectrum in Fourier space) of cosmological density perturbations at one-loop order in Standard Perturbation Theory (SPT) and the Effective Field Theory of Large Scale Structure (EFT of LSS). This paper is a companion to our earlier work on the non-Gaussian covariance of the matter power spectrum, which corresponds to a particular wavenumber configuration of the trispectrum. In the present calculation, we highlight and clarify some of the subtle aspects of the EFT framework that arise at third order in perturbation theory for general wavenumber configurations of the trispectrum. We consistently incorporate vorticity and non-locality in time into the EFT counterterms and lay out a complete basis of building blocks for the stress tensor. We show predictions for the one-loop SPT trispectrum and the EFT contributions, focusing on configurations which have particular relevance for using LSS to constrain primordial non-Gaussianity.
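For orientation, the trispectrum referred to here is the connected four-point function of the density contrast in Fourier space. A standard definition (conventions differ by factors of $(2\pi)^3$) is:

```latex
\left\langle \delta(\mathbf{k}_1)\,\delta(\mathbf{k}_2)\,\delta(\mathbf{k}_3)\,\delta(\mathbf{k}_4)\right\rangle_c
  = (2\pi)^3\,\delta_D\!\left(\mathbf{k}_1+\mathbf{k}_2+\mathbf{k}_3+\mathbf{k}_4\right)\,
    T(\mathbf{k}_1,\mathbf{k}_2,\mathbf{k}_3,\mathbf{k}_4)
```

The power-spectrum covariance mentioned in the abstract corresponds to the particular configuration $T(\mathbf{k},-\mathbf{k},\mathbf{k}',-\mathbf{k}')$.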

  15. Graph 500 on OpenSHMEM: Using a Practical Survey of Past Work to Motivate Novel Algorithmic Developments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grossman, Max; Pritchard Jr., Howard Porter; Budimlic, Zoran

    2016-12-22

    Graph500 [14] is an effort to offer a standardized benchmark across large-scale distributed platforms which captures the behavior of common communication-bound graph algorithms. Graph500 differs from other large-scale benchmarking efforts (such as HPL [6] or HPGMG [7]) primarily in the irregularity of its computation and data access patterns. The core computational kernel of Graph500 is a breadth-first search (BFS) implemented on an undirected graph. The output of Graph500 is a spanning tree of the input graph, usually represented by a predecessor mapping for every node in the graph. The Graph500 benchmark defines several pre-defined input sizes for implementers to test against. This report summarizes an investigation into implementing the Graph500 benchmark on OpenSHMEM, and focuses on first building a strong and practical understanding of the strengths and limitations of past work before proposing and developing novel extensions.
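The BFS kernel and its predecessor-map output can be sketched as follows. This is a plain single-process illustration of the Graph500 kernel, not the OpenSHMEM-distributed version the report investigates.

```python
# Sketch of the Graph500 computational kernel: a level-synchronous BFS on an
# undirected graph returning the predecessor (parent) map that defines a
# spanning tree. Single-process illustration only; the report's subject is
# the distributed OpenSHMEM implementation.
from collections import deque

def bfs_predecessor_map(adj, root):
    """adj: dict mapping vertex -> iterable of neighbours (undirected)."""
    parent = {root: root}          # convention: the root is its own parent
    frontier = deque([root])
    while frontier:
        u = frontier.popleft()
        for v in adj[u]:
            if v not in parent:    # first visit claims v as a tree child of u
                parent[v] = u
                frontier.append(v)
    return parent

# Tiny example graph (edge lists are symmetric, i.e. undirected):
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
parents = bfs_predecessor_map(adj, 0)
```

Validating that the returned map really encodes a spanning tree of the reachable component (every parent edge exists, depths are consistent) is itself part of the Graph500 specification.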

  16. Enabling Large-Scale Design, Synthesis and Validation of Small Molecule Protein-Protein Antagonists

    PubMed Central

    Koes, David; Khoury, Kareem; Huang, Yijun; Wang, Wei; Bista, Michal; Popowicz, Grzegorz M.; Wolf, Siglinde; Holak, Tad A.; Dömling, Alexander; Camacho, Carlos J.

    2012-01-01

    Although there is no shortage of potential drug targets, there are only a handful of known low-molecular-weight inhibitors of protein-protein interactions (PPIs). One problem is that current efforts are dominated by low-yield high-throughput screening, whose rigid framework is not suitable for the diverse chemotypes present in PPIs. Here, we developed a novel pharmacophore-based interactive screening technology that builds on the role that anchor residues, or deeply buried hot spots, play in PPIs, and redesigns these entry points with anchor-biased virtual multicomponent reactions, delivering tens of millions of readily synthesizable novel compounds. Application of this approach to the MDM2/p53 cancer target led to high hit rates, resulting in a large and diverse set of confirmed inhibitors, and co-crystal structures validate the designed compounds. Our unique open-access technology promises to expand chemical space and the exploration of the human interactome by leveraging in-house small-scale assays and user-friendly chemistry to rationally design ligands for PPIs with known structure. PMID:22427896

  17. Feature hashing for fast image retrieval

    NASA Astrophysics Data System (ADS)

    Yan, Lingyu; Fu, Jiarun; Zhang, Hongxin; Yuan, Lu; Xu, Hui

    2018-03-01

    Current research on content-based image retrieval mainly focuses on robust feature extraction. However, due to the exponential growth of online images, it is necessary to consider searching among large-scale image collections, which is very time-consuming and scales poorly. Hence, much attention must be paid to the efficiency of image retrieval. In this paper, we propose a feature-hashing method for image retrieval which not only generates a compact fingerprint for image representation, but also prevents large semantic loss during the hashing process. To generate the fingerprint, an objective function of semantic loss is constructed and minimized, which combines the influence of both the neighborhood structure of the feature data and the mapping error. Since machine-learning-based hashing effectively preserves the neighborhood structure of the data, it yields visual words with strong discriminability. Furthermore, the generated binary codes make building the image representation low-complexity, so the method is efficient and scalable to large-scale databases. Experimental results show the good performance of our approach.
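The idea of compact binary fingerprints searched by Hamming distance can be sketched as below. The paper learns its projection by minimizing a semantic loss; here a random projection stands in purely for illustration, and all names and sizes are assumptions.

```python
# Minimal sketch of hashing real-valued features to compact binary codes and
# retrieving by Hamming distance. A random projection stands in for the
# learned, semantic-loss-minimizing projection described in the abstract.
import numpy as np

rng = np.random.default_rng(42)

def hash_features(X, proj):
    """Map feature rows of X to binary codes via sign(X @ proj)."""
    return (X @ proj > 0).astype(np.uint8)

def hamming_search(codes, query_code, k=3):
    """Return indices of the k database codes nearest in Hamming distance."""
    dists = np.count_nonzero(codes != query_code, axis=1)
    return np.argsort(dists, kind="stable")[:k]

d, bits = 128, 32
proj = rng.standard_normal((d, bits))         # stand-in for a learned projection
database = rng.standard_normal((1000, d))     # mock feature vectors
codes = hash_features(database, proj)         # 32-bit fingerprints

query = database[7] + 0.01 * rng.standard_normal(d)   # near-duplicate of item 7
top = hamming_search(codes, hash_features(query[None, :], proj)[0])
```

Because nearby feature vectors rarely flip a projection's sign, the near-duplicate query hashes to (almost) the same fingerprint as its source image, so the 32-bit codes support fast approximate retrieval at a fraction of the storage of the raw features.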

  18. Efficiency of thin magnetically arrested discs around black holes

    NASA Astrophysics Data System (ADS)

    Avara, Mark J.; McKinney, Jonathan C.; Reynolds, Christopher S.

    2016-10-01

    The radiative and jet efficiencies of thin magnetized accretion discs around black holes (BHs) are affected by BH spin and the presence of a magnetic field that, when strong, could lead to large deviations from Novikov-Thorne (NT) thin disc theory. To seek the maximum deviations, we perform general relativistic magnetohydrodynamic simulations of radiatively efficient thin (half-height H to radius R of H/R ≈ 0.10) discs around moderately rotating BHs with a/M = 0.5. First, our simulations, each evolved for more than 70 000 rg/c (gravitational radius rg and speed of light c), show that large-scale magnetic field readily accretes inward even through our thin disc and builds up to the magnetically arrested disc (MAD) state. Second, our simulations of thin MADs show the disc achieves a radiative efficiency of ηr ≈ 15 per cent (after estimating photon capture), which is about twice the NT value of ηr ˜ 8 per cent for a/M = 0.5 and gives the same luminosity as an NT disc with a/M ≈ 0.9. Compared to prior simulations with ≲10 per cent deviations, our result of an ≈80 per cent deviation sets a new benchmark. Building on prior work, we are now able to complete an important scaling law which suggests that observed jet quenching in the high-soft state in BH X-ray binaries is consistent with an ever-present MAD state with a weak yet sustained jet.
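The NT baseline of ~8 per cent for a/M = 0.5 quoted above can be checked from the standard thin-disc formulae. The sketch below is textbook material (Bardeen-Press-Teukolsky ISCO radius, prograde orbits), not the simulation code used in the paper.

```python
# Sketch: Novikov-Thorne radiative efficiency eta = 1 - E_isco for a thin
# disc around a Kerr black hole, using the standard Bardeen-Press-Teukolsky
# ISCO radius (prograde orbits; geometrized units G = c = M = 1).
import math

def r_isco(a):
    """ISCO radius in units of M for dimensionless spin 0 <= a <= 1."""
    z1 = 1 + (1 - a**2) ** (1/3) * ((1 + a) ** (1/3) + (1 - a) ** (1/3))
    z2 = math.sqrt(3 * a**2 + z1**2)
    return 3 + z2 - math.sqrt((3 - z1) * (3 + z1 + 2 * z2))

def nt_efficiency(a):
    """Fraction of rest-mass energy radiated by an NT disc: 1 - E_isco."""
    r = r_isco(a)
    e_isco = (r**1.5 - 2 * r**0.5 + a) / (
        r**0.75 * math.sqrt(r**1.5 - 3 * r**0.5 + 2 * a))
    return 1 - e_isco
```

For a = 0 this gives the familiar 5.7 per cent (r_isco = 6M), and for a = 0.5 about 8.2 per cent, consistent with the ηr ˜ 8 per cent NT value the abstract compares against.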

  19. Effects of seed bank disturbance on the fine-scale genetic structure of populations of the rare shrub Grevillea macleayana.

    PubMed

    England, P R; Whelan, R J; Ayre, D J

    2003-11-01

    Dispersal in most plants is mediated by the movement of seeds and pollen, which move genes across the landscape differently. Grevillea macleayana is a rare, fire-dependent Australian shrub with large seeds lacking adaptations for dispersal; yet it produces inflorescences adapted to pollination by highly mobile vertebrates (e.g. birds). Interpreting fine-scale genetic structure in the light of these two processes is confounded by the recent imposition of anthropogenic disturbances with potentially contrasting genetic consequences: (1) the unusual foraging behaviour of exotic honeybees and (2) widespread disturbance of the soil-stored seed bank by road building and quarrying. To test for evidence of fine-scale genetic structure within G. macleayana populations and to test the prediction that such structure might be masked by disturbance of the seed bank, we sampled two sites in undisturbed habitat and compared their genetic structure with two sites that had been strongly affected by road building, using a test for spatial autocorrelation of genotypes. High selfing levels inferred from genotypes at all four sites imply that pollen dispersal is limited. Consistent with this, we observed substantial spatial clustering of genes at 10 m or less in the two undisturbed populations and argue that this reflects the predicted effects of both high selfing levels and limited seed dispersal. In contrast, at the two sites disturbed by road building, spatial autocorrelation was weak. This suggests there has been mixing of the seed bank, counteracting the naturally low dispersal and the elevated selfing due to honeybees. Pollination between near neighbours with reduced relatedness potentially has fitness consequences for G. macleayana in disturbed sites.
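The kind of spatial-autocorrelation test used here can be illustrated with Moran's I on a generic spatial variable. This is a generic statistic sketch with an illustrative binary distance-threshold weighting, not the authors' genotype-specific analysis.

```python
# Sketch of Moran's I, a standard spatial autocorrelation statistic: values
# near +1 indicate clustering of similar values among spatial neighbours,
# values near 0 indicate no spatial structure. Binary neighbour weights
# within a distance threshold are an illustrative choice.
import numpy as np

def morans_i(coords, values, threshold):
    coords = np.asarray(coords, dtype=float)
    x = np.asarray(values, dtype=float)
    x = x - x.mean()
    n = len(x)
    # Pairwise distances; neighbours are pairs closer than the threshold.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = (d <= threshold).astype(float)
    np.fill_diagonal(w, 0.0)          # a point is not its own neighbour
    W = w.sum()
    return (n / W) * (x @ w @ x) / (x @ x)

# Two tight clusters with contrasting values -> strong positive autocorrelation
coords = [(0, 0), (0, 1), (10, 0), (10, 1)]
values = [1.0, 1.0, -1.0, -1.0]
I = morans_i(coords, values, threshold=2.0)
```

In the study's terms, strong positive autocorrelation at short distances (as in the undisturbed sites) means near neighbours carry similar genotypes; the weak autocorrelation at the road-building sites indicates that clustering has been erased by seed-bank mixing.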

  20. Dust in the wind: challenges for urban aerodynamics

    NASA Astrophysics Data System (ADS)

    Boris, Jay P.

    2007-04-01

    The fluid dynamics of airflow through a city controls the transport and dispersion of airborne contaminants. This is urban aerodynamics, not meteorology. The average flow, large-scale fluctuations and turbulence are closely coupled to the building geometry. Buildings create large "rooster-tail" wakes; there are systematic fountain flows up the backs of tall buildings; and dust in the wind can move perpendicular to or even against the locally prevailing wind. Requirements for better prediction accuracy demand time-dependent, three-dimensional CFD computations that include solar heating and buoyancy, complete landscape and building geometry specification including foliage, and realistic wind fluctuations. This fundamental prediction capability is necessary to assess urban visibility and line-of-sight sensor performance in street canyons and rugged terrain. Computing urban aerodynamics accurately is clearly a time-dependent High Performance Computing (HPC) problem. In an emergency, on the other hand, prediction technology to assess crisis information, sensor performance, and obscured line-of-sight propagation in the face of industrial spills, transportation accidents, or terrorist attacks has very tight time requirements that suggest simple approximations, which tend to produce inaccurate results. In the past we have had to choose one or the other: a fast, inaccurate model or a slow, accurate model. Using new fluid-dynamic principles, an urban-oriented emergency assessment system called CT-Analyst® was invented that solves this dilemma. It produces HPC-quality results for airborne contaminant scenarios nearly instantly and has unique new capabilities suited to sensor optimization. This presentation treats the design and use of CT-Analyst and discusses the developments needed for widespread use with advanced sensor and communication systems.
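To make "transport and dispersion" concrete, the minimal sketch below evolves a contaminant puff with a 1D explicit advection-diffusion scheme. It is vastly simpler than the time-dependent 3D urban CFD the abstract calls for, and all parameters are illustrative.

```python
# Minimal 1D advection-diffusion sketch of contaminant transport: explicit
# upwind advection plus central-difference diffusion on a periodic domain.
# Illustrative only; real urban dispersion needs 3D geometry-resolving CFD.
import numpy as np

def step(c, u, D, dt, dx):
    """One explicit step; stable for u*dt/dx <= 1 and D*dt/dx**2 <= 0.5."""
    adv = -u * dt / dx * (c - np.roll(c, 1))                      # upwind, u > 0
    dif = D * dt / dx**2 * (np.roll(c, -1) - 2 * c + np.roll(c, 1))
    return c + adv + dif

N, dx, dt, u, D = 100, 1.0, 0.5, 1.0, 0.1
x = np.arange(N) * dx
c = np.exp(-0.5 * ((x - 20.0) / 3.0) ** 2)    # Gaussian puff released at x = 20

mass0 = c.sum()
for _ in range(40):                           # advance to t = 20
    c = step(c, u, D, dt, dx)
```

After 40 steps the puff has drifted downstream by roughly u·t = 20 grid units while spreading, and the periodic scheme conserves total contaminant mass exactly, the basic behaviors any urban dispersion model must reproduce.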
